Web Site Backup

GadgetBoy

In my spare time, I operate a local website. I really have not backed up the site in a while. Is there a way, through HomeSeer, to automate the backup regularly and log/display backup information? I guess it would be via ftp of some sort. That way I can forget about it and not get the occasional gut-wrench when I realize I am not truly protected.

Thanks in advance.

Jim
 
Does your host generate backups on a nightly basis? If so, you could pull them down with a vbs script fairly easily, but if you have to download the files one by one, it can get messy (binary vs. ascii transfer modes, etc.).
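Something along these lines would do it - just a minimal sketch, assuming the host drops a single nightly backup file at a known path. The host name, login, and file names here are all made up, so substitute your own:

    ' Minimal sketch: pull one nightly backup file with the stock
    ' Windows ftp client. Host, login, and file names are placeholders.
    Const FTP_CMDS = "C:\backup\getbackup.ftp"
    Dim fso, sh, f
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set sh  = CreateObject("WScript.Shell")

    ' Build a command file that ftp.exe will replay via -s:
    Set f = fso.CreateTextFile(FTP_CMDS, True)
    f.WriteLine "open ftp.example.com"
    f.WriteLine "myuser"        ' fed to the login prompt
    f.WriteLine "mypassword"    ' fed to the password prompt
    f.WriteLine "binary"        ' don't let ascii mode mangle the archive
    f.WriteLine "get sitebackup.tar.gz C:\backup\sitebackup.tar.gz"
    f.WriteLine "bye"
    f.Close

    ' -i turns off prompting, -s: replays the command file; wait for it
    sh.Run "ftp -i -s:" & FTP_CMDS, 0, True

Kick that off from a recurring HomeSeer event (or the Windows Task Scheduler) and you can pretty much forget about it.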
 
If your site is on a Linux or Unix server, you might be able to tar the entire site up on the server, then ftp that one file overnight via a VBS script. For Windows hosting, zip it all up instead. You need to run the archive operation on the remote server, though.
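If you have shell access, you could kick off the archive step first with something like this - a sketch only, assuming SSH access to the host and PuTTY's plink.exe on the local box (user, host, password, and paths are made up):

    ' Sketch: run tar on the server first so there is one file to pull.
    ' Assumes SSH access and PuTTY's plink.exe; names are placeholders.
    Dim sh
    Set sh = CreateObject("WScript.Shell")
    sh.Run "plink -ssh -pw mypassword user@example.com " & _
           """tar czf backup/site.tar.gz public_html""", 0, True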

You just have to jump through hoops to make sure ftp got the whole file, unless your ISP runs a good ftp server and you have a good client that supports integrity checking after the transfer (not many batch-capable ones do). Why can't they implement the ZMODEM protocol within FTP??? hehehe
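The stock ftp client won't tell you much, but you can at least bolt a crude sanity check onto the end of the script. This only proves the file arrived and isn't empty, not that it's intact - the path is the same made-up one as above:

    ' Crude post-transfer check: confirm the file showed up and has
    ' some size to it. Not a real integrity check. Path is made up.
    Dim fso2, dl
    Set fso2 = CreateObject("Scripting.FileSystemObject")
    If fso2.FileExists("C:\backup\sitebackup.tar.gz") Then
        Set dl = fso2.GetFile("C:\backup\sitebackup.tar.gz")
        If dl.Size > 0 Then
            WScript.Echo "Backup pulled OK: " & dl.Size & " bytes"
        Else
            WScript.Echo "WARNING: zero-byte download"
        End If
    Else
        WScript.Echo "WARNING: backup file never arrived"
    End If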
 
I use GoDaddy to host it. They are very good price-wise and support-wise... I just don't feel like sitting on hold right now - lol. I sent them an e-mail; I can wait for a response on the backup info. I thought there might be an easy way to do it myself.

Jim
 
If they already back it up for you (which likely means it's a single file), it's pretty easy to pull that file down on a nightly basis, so let us know how they respond, or look through your control panel for backup options.
 
Oh well...

01/23/2005 01:36 AM

Thank you for contacting Customer Support.

We do keep backups of our customers' sites in case there is a problem and we need to restore this information to our servers.
Unfortunately, we do not offer these backups to the public for download.

Please let us know if we can help you in any other way.
 
How did the data get there in the first place? If you uploaded the data, then you should have a local copy, no?
 
It would probably be wise to keep an entire copy of the web site on your local system and back it up. Keep them in sync with publishing tools or ftp. That way you always have a second off-site copy. You could still do the ftp thing, but it may take a while.

I used to create a flag or semaphore file in the remote dir that was the last file in the list (alphabetically). That way, when the ftp mass transfer (mget) stopped, I could check to see whether I got that file. If I did, that meant I got all the others as well and the transfer was successful. If it didn't make it, the ftp transfer failed somewhere. It's possible to test for that in a batch file.

Apart from counting the files, this was the best way, under batch control, to be sure the ftp transfer was successful. This was back when the ftp client in Windows was much younger, too, with minimal scripting. Of course, with VB scripting you can do a lot more, like compare file sizes or run multiple ftp transfers with multiple sessions, but that's a lot of work. You might be able to use something like Ipswitch's WS_FTP Pro to do this more readily, as it has much better scripting capabilities, but it is $55.
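In VBS the flag-file check boils down to something like this - a sketch assuming a mirror.ftp command file (built the same way as the one earlier in the thread) that logs in and does an mget * into C:\backup\site, with zzz-complete.flg being a made-up flag name that sorts last:

    ' Flag-file check: zzz-complete.flg sorts last alphabetically, so
    ' if it came down, everything before it did too. Names are made up.
    Const FLAG = "C:\backup\site\zzz-complete.flg"
    Dim fso, sh
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set sh  = CreateObject("WScript.Shell")

    ' Clear any stale flag from the previous run, then mirror the site
    If fso.FileExists(FLAG) Then fso.DeleteFile FLAG
    sh.Run "ftp -i -s:C:\backup\mirror.ftp", 0, True

    If fso.FileExists(FLAG) Then
        WScript.Echo "Transfer looks complete - flag file made it"
    Else
        WScript.Echo "Transfer FAILED somewhere - flag file missing"
    End If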
 