Re: [Nolug] Backing up linux

From: Dustin Puryear <dustin_at_puryear-it.com>
Date: Fri, 15 Feb 2008 09:20:49 -0600
Message-ID: <47B5ADD1.6020802@puryear-it.com>

Ah. The real question is: How often does that 100GB change?

--
Puryear Information Technology, LLC
Baton Rouge, LA * 225-706-8414
http://www.puryear-it.com
Author, "Best Practices for Managing Linux and UNIX Servers"
   http://www.puryear-it.com/pubs/linux-unix-best-practices
Identity Management, LDAP, and Linux Integration

Chris Jones wrote:
> I found out late last night that the amount of data is fairly 
> significant, so I'm thinking rsync would be the better option, even over 
> LAN.  It's over 100GB of data, so doing a full backup of all of that 
> nightly would put a lot of stress on the hardware.  It might be better to 
> run rsync regularly, and maybe have the backup server archive it 
> periodically with tar/gz.  I'll also check out that BackupPC software; it 
> looks really nice, especially the web interface for managing it, which 
> makes it feel almost like a commercial product such as BackupExec.
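
For illustration, a rough sketch of the kind of rsync-then-archive setup described above; the host name "webserver" and all paths here are placeholders, and the options shown are just one common combination, not anything specified in the thread:

    #!/bin/bash
    # On the backup server: pull the web server's data over SSH.
    # -a preserves ownership/permissions/times, -z compresses in transit,
    # --delete keeps the local mirror in sync with the source.
    rsync -az --delete -e ssh \
        webserver:/var/www/ /backups/webserver/www/

    # Periodically roll the mirror into a dated tar/gz archive.
    tar czf /backups/archives/webserver-$(date +%Y%m%d).tar.gz \
        -C /backups/webserver www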
> 
> On Fri, Feb 15, 2008 at 7:53 AM, Dustin Puryear <dustin@puryear-it.com> wrote:
> 
>     I've never been a big fan of the 'local tar via crontab' approach.
>     What about using something like BackupPC? It's much smarter in the way
>     it uses disk space, can use rsync, and works on- or off-site. We use it
>     all the time. Also, you can set up pre- and post-jobs for things like
>     running mysqldump.
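
As a sketch of the pre-job idea: BackupPC can be told to run a command before each dump (if I recall the option name correctly, $Conf{DumpPreUserCmd}), and that command could simply invoke a script like the one below on the client. The paths and mysqldump options here are placeholders, and it assumes MySQL credentials are already available (e.g. via a .my.cnf):

    #!/bin/bash
    # Dump all MySQL databases to a file that the following BackupPC run
    # will pick up along with the rest of the filesystem.
    umask 077
    mysqldump --all-databases --single-transaction \
        > /var/backups/mysql-all.sql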
> 
>     --
>     Puryear Information Technology, LLC
>     Baton Rouge, LA * 225-706-8414
>     http://www.puryear-it.com
> 
>     Author, "Best Practices for Managing Linux and UNIX Servers"
>       http://www.puryear-it.com/pubs/linux-unix-best-practices
> 
>     Identity Management, LDAP, and Linux Integration
> 
> 
>     Chris Jones wrote:
>      > I have a client that needs to back up their Linux web servers, so
>      > I'm thinking of recommending an additional server.  Set it up as an
>      > NFS server, and let the other servers mount it.
>      >
>      > Write a bash script to essentially:
>      > use mysqldump to dump the databases to files
>      > tar/gz the web folder, email folders, and probably /etc to a file
>      > on the NFS
>      > put the date into the filenames it generates, and have it delete
>      > backups that are over, say, 14 days old
>      >
>      > And then put the script into cron to run daily, every 6 hours, or
>      > whatever...
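
To make that concrete, here is a rough sketch of such a script; every path, the NFS mount point, the retention window, and the cron schedule are assumptions for illustration, not details from the original message:

    #!/bin/bash
    # Nightly backup: dump MySQL, then tar the important trees to the NFS mount.
    set -e

    BACKUP_DIR=/mnt/backup              # NFS export from the backup server
    STAMP=$(date +%Y%m%d-%H%M)

    # 1. Dump all databases to a dated file.
    mysqldump --all-databases > "$BACKUP_DIR/mysql-$STAMP.sql"

    # 2. Archive the web root, mail spools, and /etc with the date in the name.
    tar czf "$BACKUP_DIR/files-$STAMP.tar.gz" /var/www /var/mail /etc

    # 3. Delete backups older than 14 days.
    find "$BACKUP_DIR" -maxdepth 1 -type f -mtime +14 -delete

    # Crontab entry to run it nightly at 2:30 AM:
    # 30 2 * * * /usr/local/sbin/nightly-backup.sh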
>      >
>      >
>      >
>      > Is this a good solution?  Does anybody know a better way?  Can this
>      > be done on a live system, without having to take everything offline
>      > first?
>      >
>      > Eventually they might want to do offsite backups and have hot spare
>      > servers in a data center somewhere that they could use for disaster
>      > recovery; I'm thinking rsync would be perfect if that need arises.
> 
> 
> 
> 
> -- 
> Chris Jones
> http://www.industrialarmy.com
___________________
Nolug mailing list
nolug@nolug.org
Received on 02/15/08
