Comments
rsnapshot is what you are looking for!
http://rsnapshot.org/
mysqldump tar rsync@remotehost
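Spelled out, that three-tool combo might look something like this minimal sketch. The database name, paths, and remote host are all placeholders, not from the post:

```shell
#!/bin/sh
# Hypothetical sketch of the mysqldump + tar + rsync combo.
# DB_NAME, BACKUP_DIR, and backup@remotehost are placeholders.
DB_NAME="mydb"
BACKUP_DIR="/var/backups"
STAMP=$(date +%Y-%m-%d)

# Dump the database, then bundle it with the web root into one archive
mysqldump --single-transaction "$DB_NAME" > "$BACKUP_DIR/$DB_NAME-$STAMP.sql"
tar -czf "$BACKUP_DIR/site-$STAMP.tar.gz" /var/www "$BACKUP_DIR/$DB_NAME-$STAMP.sql"

# Ship the archive to the backup box over SSH
rsync -az "$BACKUP_DIR/site-$STAMP.tar.gz" backup@remotehost:/backups/
```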
Thanks buddy! However, I'm looking to back up all the files, not only the databases.
I set a cron job for rsync to run, or use whatever function the software has built in.
node > backup server > backup cluster
I have my own backup script on cron, which backs up all the sites at
/home/*/sites/*
and all the databases as their own tar.gz files, and then sends them to a backup server via rsync.
Can you share that script?
if ftp is not a problem
http://www.cyberciti.biz/tips/how-to-backup-mysql-databases-web-server-files-to-a-ftp-server-automatically.html
That works 100%.
Sure. It's my first bash script though, which means I knew nothing about bash when I wrote it (and still don't know much). Some bits and pieces are from TuxLite's backup script and various Stack Overflow answers.
http://paste.ee/p/rsLag#HxO2EHULHSVBgRFCm1wCTQZhUqPz6tJn
It creates a folder structure like this:
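For anyone who can't reach the paste, a stripped-down sketch of that kind of script might look like the following. Only the /home/*/sites/* layout comes from the post above; the destination path, host, and database filtering are assumptions, and this is not the poster's actual script:

```shell
#!/bin/bash
# Rough sketch only -- not the actual script from the paste link.
# Assumes sites live at /home/*/sites/* as described above.
DEST="/backup/$(date +%Y-%m-%d)"
mkdir -p "$DEST/sites" "$DEST/databases"

# Archive every site directory into its own tarball
for site in /home/*/sites/*; do
    [ -d "$site" ] || continue
    tar -czf "$DEST/sites/$(basename "$site").tar.gz" \
        -C "$(dirname "$site")" "$(basename "$site")"
done

# Dump each database into its own compressed file,
# skipping the MySQL system schemas
for db in $(mysql -N -e 'SHOW DATABASES' |
            grep -Ev '^(information_schema|performance_schema|mysql|sys)$'); do
    mysqldump --single-transaction "$db" | gzip > "$DEST/databases/$db.sql.gz"
done

# Push the whole dated folder to the backup server
rsync -az "$DEST" backup@backuphost:/backups/
```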
https://cdp.me/
Not sure it offers a ton of control, at a quick glance.
I, personally, have started using Attic very recently and it works pretty nicely so far.
My fall back would be something like rsync though to manage the storage of it if I didn't use Attic.
How does it compare to duply/duplicity?
Right now I'm just using Tarsnap and then rsyncing my website files and mysql databases (backed up with automysqlbackup) to a backup vps.
I push content to them rather than have anything original on there. Installation/reinstallation taken care of with bash scripts.
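A nightly Tarsnap run of that sort could be sketched roughly as below. The archive name, paths, and 30-archive retention are placeholders, not the poster's actual setup:

```shell
# Hypothetical nightly Tarsnap run; archive name and paths are placeholders.
# Date-stamped names keep each night's archive distinct.
tarsnap -c -f "web-$(date +%Y-%m-%d)" /var/www /etc

# Prune old archives, keeping only the most recent 30
# (date-stamped names sort chronologically)
tarsnap --list-archives | sort | head -n -30 | while read -r old; do
    tarsnap -d -f "$old"
done
```

Because Tarsnap deduplicates at the block level, the daily archives only cost storage for blocks that actually changed.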
Attic looks interesting
+1 for attic
I am doing a daily rsync (push) of those boxes to a mirror, then Attic from there to a second mirror.
This way I always have a fresh daily backup to use directly, or if needed I can get older backup data out of the attic.
Deduplication in Attic works like a charm and keeps the overall size low, especially if you have multiple VPSes with a comparable setup...
I just rsync all the backups VestaCP created to another server. Set a cron to run daily.
attic to a remote server (via SSH). If further redundancy is required, I rsync the attic repository to Google Cloud or S3.
Just wondering, but do you have SSH compression enabled? I set the SSH client config on my servers not to use it and get even better speeds during the backups.
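One way to turn compression off for a single transfer, rather than in the client config, is to pass the option through rsync's `-e` flag. The host and paths here are placeholders:

```shell
# Run rsync over SSH with SSH-level compression explicitly off.
# rsync's own -z flag can still be added if compression is wanted
# at the rsync layer instead.
rsync -a -e "ssh -o Compression=no" /var/backups/ backup@backuphost:/backups/
```

Disabling compression tends to help on fast links where the CPU, not bandwidth, is the bottleneck; on slow links the opposite can be true.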
crontab, rsync and mysqldump
I wish you could write a brief tutorial for that.
Sure. This is for rsync over SSH, and you will need rsync installed on the backup server as well. The --delete flag isn't really needed in this case, since VestaCP backups are all dated. You will need to set up an SSH key first and run the command once to add the fingerprint to your cache before crontab will work.
The first line opens the crontab editor.
Put the second line (after you edit it) into the crontab and it will run at 9PM (or change the 21 to whenever).
I think you will need root to run this crontab; not sure. I think I got permission issues with the admin user.
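The actual two lines aren't quoted in the post, but based on the description they would look something like this. The backup path and remote host are assumptions:

```shell
# First line: open the crontab editor
crontab -e

# Second line (goes inside the crontab): push VestaCP's dated backups
# to the remote box every day at 9PM. Paths and host are placeholders;
# --delete could be appended but, as noted above, isn't really needed.
0 21 * * * rsync -az /home/backup/ backup@backuphost:/vesta-backups/
```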
Depending on whether you have enough spare memory, you could use CrashPlan. Point it home by port forwarding from your desktop so that you can run it headless on the server. The free version lets you back up to another computer that you control, and it runs automatically every day on the folders you've specified.
I use the paid CrashPlan for a production file server whose data I really care about being backed up right away, as it's also where I have my Seafile server for my home PC and keep some older datasets archived. It's gotten to 2TB at this point, though that's over the course of many months. Other things can live with a one-day lag.
Alternatively, if you're just archiving old data that you don't expect to use very often, there's Amazon Glacier, which I'd used before CrashPlan (when the data were smaller it cost peanuts). That said, I think it's $10/month/TB more or less, so maybe I'm just being cheap.
VestaCP does have a remote backup option. Why don't you use that? It takes a backup every day.
I use duply (http://duply.net/) to S3 (versioning enabled) with auto-rotation to Glacier after 30 days. Duply will use duplicity (http://duplicity.nongnu.org/) as backend. It will generate compressed incremental backups encrypted with PGP and will upload directly to any S3 compatible host or other storage backend supported by duplicity, maintaining a local copy of the backup if you want.
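A minimal duply workflow along those lines might look like the commands below. The profile name is a placeholder, and the S3 target, GPG key, and retention settings all live in the profile's conf file rather than on the command line:

```shell
# Create a profile skeleton, then edit ~/.duply/mysite/conf to set
# GPG_KEY, SOURCE, and an S3 TARGET (bucket name is up to you)
duply mysite create

# Run a backup: full the first time, incremental afterwards,
# per the MAX_FULLBKP_AGE setting in the profile
duply mysite backup

# Delete backup chains older than the configured MAX_AGE
duply mysite purge --force
```

The Glacier rotation itself is handled by an S3 lifecycle rule on the bucket, not by duply.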
Duplicity can upload to:
For my personal boxes, right now I am using Duplicity with Delimiter's ObjSpace, which works great. I have about 30 VPSes and 4 dedicated servers backing up this way.
Duplicity only, or with a script in front of it? Is ObjSpace prepaid?
Install Proxmox with ZFS option, then send full and incremental ZFS snapshots periodically to a remote storage server running FreeBSD.
You could get insane read and write speeds with ZFS and large amounts of RAM. Even better with compression turned on.
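The snapshot replication described above can be sketched as follows. The pool and dataset names and the host are placeholders:

```shell
# Hypothetical pool/dataset names; "backuphost" is the remote FreeBSD box.
# Take a snapshot and send the full stream the first time:
zfs snapshot tank/vms@2015-06-01
zfs send tank/vms@2015-06-01 | ssh backuphost zfs receive backup/vms

# Later snapshots go incrementally with -i,
# transferring only blocks changed since the previous snapshot:
zfs snapshot tank/vms@2015-06-02
zfs send -i tank/vms@2015-06-01 tank/vms@2015-06-02 | ssh backuphost zfs receive backup/vms
```

Because the increments are block-level diffs, this stays fast even for large datasets with small daily churn.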
I rsync all my specific files to a storage VPS which sends it off to DropBox.
However, I've been thinking of switching that out (removing storage) in favour of rclone which is a fantastic little tool.
That said I'm only backing up a few GB unlike some users here...
rsync, mysqldump and a cron job
Actually Duply/Duplicity. I did write Delimiter an article on how to set it up, but I can see it's not on their wiki yet.
It's prepaid packages, which makes things a lot more predictable.