Am I approaching it the right way, then? I've done something just like this
in the past; I just want to make sure that's the proper way to do it. Also,
you can tar/gz and mysqldump the server while it's live, right?
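To be concrete, something along these lines is what I have in mind (paths are
placeholders, and note that `--single-transaction` only guarantees a
consistent snapshot for InnoDB tables; MyISAM would still need locking):

```shell
# Dump while the server stays online; no table locks for InnoDB.
mysqldump --single-transaction --all-databases > /tmp/all-dbs.sql

# tar also works on a live filesystem, though files that change while
# the archive is being written can come out inconsistent. Dumping the
# databases to files first, then tarring everything, avoids the worst
# of that.
tar czf /tmp/site-backup.tar.gz /var/www
```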
On Thu, Feb 14, 2008 at 7:50 PM, Joey Kelly <joey@joeykelly.net> wrote:
> On Thursday 14 February 2008 07:23:43 pm Chris Jones wrote:
> > I have a client that's needing to back up their Linux web servers, so
> > I'm thinking of recommending an additional server: set it up as an NFS
> > server, and let the other servers mount it.
> >
> > Write a bash script to essentially:
> > - use mysqldump to dump the databases to files
> > - tar/gz the web folder, email folders, and probably /etc to a file
> >   on the NFS share
> > - put the date into the filenames it generates, and have it delete
> >   backups that are over, say, 14 days old
> >
> > And then put the script into cron to run daily, every 6 hours, or
> > whatever...
> >
> >
> >
> > Is this a good solution? Does anybody know a better way? Can this be
> > done on a live system, without having to take everything offline first?
> >
> > Eventually they might want to do offsite backup and have hot spare
> > servers in a data center somewhere that they could use for disaster
> > recovery; I'm thinking rsync would be perfect if this need arises.
>
> That's the kind of stuff I do. I'm sure there are comprehensive packages
> out there to do everything for you, but I'm more of a homegrown man.
>
> --
> Joey Kelly
> < Minister of the Gospel | Linux Consultant >
> http://joeykelly.net
>
--
Chris Jones
http://www.industrialarmy.com
___________________
Nolug mailing list
nolug@nolug.org

Received on 02/14/08
This archive was generated by hypermail 2.2.0 : 12/19/08 EST