Best way to backup files and MySQL DB?
I want to automatically tar /var/www/ and back up the DB (I can write a script to do these), but where's the best place to store them? Dropbox/Box/Google Drive, or a cheap VPS? Or something like OVH/AWS backup? I think I also saw a provider on here offering backup services for around $3/yr, but I forgot who.
The tar.gz file will be about 50 MB or less, so 1 GB should do for one backup a day (to be replaced every 2 weeks).
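A minimal nightly script along these lines would cover both parts; paths, the database name, and the retention period are placeholders, and mysqldump is assumed to read credentials from ~/.my.cnf:

```shell
#!/bin/sh
# Hypothetical nightly backup: dump the DB, tar the web root together
# with the dump, keep two weeks of archives. All names are examples.
BACKUP_DIR=/root/backups
DATE=$(date +%F)
mkdir -p "$BACKUP_DIR"

# Dump the database (credentials assumed to be in ~/.my.cnf)
mysqldump --single-transaction mydb > "$BACKUP_DIR/mydb-$DATE.sql"

# Archive the web root and the dump in one tar.gz
tar -czf "$BACKUP_DIR/site-$DATE.tar.gz" /var/www "$BACKUP_DIR/mydb-$DATE.sql"
rm "$BACKUP_DIR/mydb-$DATE.sql"

# Drop archives older than 14 days
find "$BACKUP_DIR" -name 'site-*.tar.gz' -mtime +14 -delete
```

Run it from cron once a day and rsync or upload the resulting tar.gz wherever you end up storing it.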
Comments
This is what I'm using on several CentOS boxes. It takes about 43 seconds to back up a 2.x GB database, and growing.
mysqlhotcopy command -- one line, below
rsync to remote server -- one line, below
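The two one-liners seem to have been eaten by the forum; something along these lines would match the description (hostnames, paths, and credentials are placeholders, and mysqlhotcopy only works for MyISAM/ARCHIVE tables):

```shell
# Fast file-level copy of MyISAM tables to a local backup directory
mysqlhotcopy mydb /root/backups/mysql --user=root --password=secret

# Push the copy to another VPS over SSH
rsync -az --delete /root/backups/ backupuser@backup.example.com:/backups/
```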
Thanks. Do you just back up to another VPS?
@hostnoob Yes. It can be any other VPS.
The next command backs up to Amazon S3. It uses this tool: http://s3tools.org/s3cmd
Back up to Amazon S3 -- one line, below
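That one-liner is also missing; with s3cmd it would plausibly look like this (bucket name is a placeholder, and you need to run "s3cmd --configure" once first to store your AWS keys):

```shell
# Mirror the local backup directory into an S3 bucket
s3cmd sync /root/backups/ s3://my-backup-bucket/backups/
```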
This is how we back up a forum with hundreds of thousands of members and nearly a million posts, with 2 million attachment files and local avatars.
We host attachment and avatar files on GlusterFS, a distributed file system with native support for replication and geo-replication, so if the cluster breaks we can switch to the backup copy immediately.
The rest of the forum is mostly log files and source code. That can easily be snapshotted with an rsync command to a temporary directory, then archived with either a full tar or an incremental tar (tutorials for that can be found elsewhere).
The database runs with master-slave replication to an offshore VPS. Each time I want to dump the database, I issue a STOP SLAVE command and then do a mysqldump and tar.
We host our backups on Backupsy for $5/mo, which is perfect for a backup at this scale. If you are looking for something much smaller, you could think about splitting it up and sending it to your mailbox (Gmail offers a lot of space).
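The dump-from-the-slave step above could be sketched like this (run on the slave; paths are placeholders, and pausing only the SQL thread is one common choice so relay logs keep downloading during the dump):

```shell
# Pause replication so the dump is internally consistent
mysql -e 'STOP SLAVE SQL_THREAD;'

# Dump everything, then resume replication
mysqldump --all-databases > /root/backups/all-$(date +%F).sql
mysql -e 'START SLAVE SQL_THREAD;'

# Compress the dump for transfer to the backup box
tar -czf /root/backups/all-$(date +%F).tar.gz -C /root/backups all-$(date +%F).sql
```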
Is this commercially run forum software or custom-made? I'm curious about how your application handles distributed data for users, if you don't mind sharing.
Check glusterfs. Basically you can mount the file system as a directory and this won't break anything. Just make sure you don't do a lot of lookup operations on the mount point, such as executing PHP scripts with a lot of includes.
As in scandir-type handles that read the drive? I assume so, since it's really just a masked remote drive.
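Mounting a GlusterFS volume really is just a mount command; server and volume names here are placeholders:

```shell
# Mount a GlusterFS volume as an ordinary directory
mount -t glusterfs gluster1.example.com:/webfiles /mnt/webfiles

# Or make it persistent via /etc/fstab:
# gluster1.example.com:/webfiles  /mnt/webfiles  glusterfs  defaults,_netdev  0 0
```

Once mounted, anything reading /mnt/webfiles goes over the network, which is why heavy lookup workloads (like PHP include chains) are slow there.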
Thanks for the information. I'd like to incorporate this into mine, no matter how low scale it is. I'd probably only be serving user pictures and documents from here.
It may take some time to understand its concepts before making good use of it. Make sure you get used to it before using it in a production service.
Glad if this helps.
For incremental backups you could use duplicity.
http://blog.bokhorst.biz/6507/computers-and-internet/how-to-setup-a-vps-as-web-server/#backup
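Basic duplicity usage is a single command per direction; the host and paths below are placeholders, and duplicity encrypts with GnuPG by default:

```shell
# Incremental, encrypted backup of the web root to a remote host over SFTP
duplicity /var/www sftp://backupuser@backup.example.com//backups/www

# Restore the latest backup into a local directory
duplicity restore sftp://backupuser@backup.example.com//backups/www /tmp/restore
```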
I recommend automysqlbackup and rdiff-backup.
Very worth another look: Attic backup: https://attic-backup.org/
It uses deduplication, so even if you back up multiple servers they all benefit from it, and the daily increments are very small...
               Original size    Compressed size    Deduplicated size
This archive:        9.47 GB            6.15 GB            143.02 MB
All archives:      421.26 GB          295.21 GB             22.29 GB
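The basic Attic workflow is a one-time repository init plus one archive per run; the repository path and archive naming below are placeholders:

```shell
# One-time: initialize a deduplicating repository
attic init /backups/attic.repo

# Each run creates a new archive; unchanged chunks are stored only once
attic create /backups/attic.repo::www-$(date +%F) /var/www

# Inspect what you have
attic list /backups/attic.repo
```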
Oh BTW you can also use Git/SVN for incremental backup.
tar does incremental backups with the -g option.
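For example, with GNU tar (snapshot file and paths are arbitrary):

```shell
# First run with a fresh snapshot file produces a full backup and
# records file metadata in www.snar
tar -czg /root/backups/www.snar -f /root/backups/www-full.tar.gz /var/www

# Later runs with the same snapshot file archive only what changed
tar -czg /root/backups/www.snar -f /root/backups/www-incr-$(date +%F).tar.gz /var/www
```

Keep the .snar file between runs; restoring means extracting the full archive first, then each incremental in order.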