I've never been a big fan of the 'local tar via crontab' approach.
What about using something like BackupPC? It's much smarter in the way
it uses disk space (identical files are pooled across backups), it can
use rsync, and it works on- or off-site. We use it all the time. Also,
you can set up pre- and post-jobs for things like running mysqldump.
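
For example, the pre/post hooks in BackupPC's config.pl can handle the
database dump automatically before each run. A minimal sketch, where
db-dump.sh and db-cleanup.sh are hypothetical wrappers you'd write
around mysqldump:

    # In config.pl (or a per-host override). BackupPC expands $sshPath
    # and $host itself; the two scripts are hypothetical wrappers that
    # dump the databases into the backed-up tree and clean up after.
    $Conf{DumpPreUserCmd}  = '$sshPath -q -x root@$host /usr/local/bin/db-dump.sh';
    $Conf{DumpPostUserCmd} = '$sshPath -q -x root@$host /usr/local/bin/db-cleanup.sh';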
--
Puryear Information Technology, LLC
Baton Rouge, LA * 225-706-8414
http://www.puryear-it.com

Author, "Best Practices for Managing Linux and UNIX Servers"
http://www.puryear-it.com/pubs/linux-unix-best-practices

Identity Management, LDAP, and Linux Integration

Chris Jones wrote:
> I have a client that needs to back up their Linux web servers, so
> I'm thinking of recommending an additional server. Set it up as an NFS
> server, and let the other servers mount it.
>
> Write a bash script to essentially:
> - use mysqldump to dump the databases to files
> - tar/gz the web folder, email folders, and probably /etc to a file
>   on the NFS share
> - put the date into the filenames it generates, and have it delete
>   backups that are over, say, 14 days old
>
> And then put the script into cron to run daily, every 6 hours, or
> whatever...
>
> Is this a good solution? Does anybody know a better way? Can this be
> done on a live system, without having to take everything offline first?
>
> Eventually they might want to do offsite backup and have hot spare
> servers in a data center somewhere that they could use for disaster
> recovery. I'm thinking rsync would be perfect if this need arises.
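
If you do go the script route, here's a rough sketch of what's described
above (the mount point, paths, and MySQL credentials are made up). On the
live-system question: tar is fine on a live filesystem for mostly-static
files, and mysqldump's --single-transaction gives a consistent dump of
InnoDB tables without taking anything offline:

    #!/bin/bash
    # Nightly backup: dump MySQL, tar the web/mail/etc trees to NFS,
    # and prune archives older than 14 days. All paths are examples.
    set -e

    DEST=/mnt/backup                 # NFS mount point
    STAMP=$(date +%Y%m%d-%H%M)       # date goes into every filename

    # Dump all databases to one file on the share. --single-transaction
    # keeps the dump consistent for InnoDB without locking the server.
    # (Putting the password on the command line is just for brevity.)
    mysqldump --all-databases --single-transaction -u backup -pSECRET \
        > "$DEST/mysql-$STAMP.sql"

    # Archive the web root, mail spools, and /etc from the live system.
    tar czf "$DEST/files-$STAMP.tar.gz" /var/www /var/mail /etc

    # Delete backups more than 14 days old.
    find "$DEST" -name 'mysql-*.sql'    -mtime +14 -delete
    find "$DEST" -name 'files-*.tar.gz' -mtime +14 -delete

Drop that in /etc/cron.daily (or a crontab entry for the every-6-hours
variant) and it covers the whole list above.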
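
And when the offsite need does come up, pushing that same backup
directory to a remote box is a one-liner (offsite.example.com and the
paths are placeholders):

    # Mirror the backup share offsite over SSH; --delete removes files
    # that the 14-day pruning already cleaned up locally.
    rsync -az --delete /mnt/backup/ backup@offsite.example.com:/srv/backups/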