how do you guys handle backup?

ykbaek Member
edited May 2013 in General

I am going to get a VPS or dedicated server,
but I will have to manage all the backups myself.
I am running a forum site with about 400-500 new posts a day,
so I guess I need a backup on a daily basis.
However, backing up to the same server would be pointless, right?

A managed backup system is quite expensive,
so I guess I need to do it myself.

So do I need to buy two different services and run one as my web server
and the other as a backup server? Do providers allow that?

Comments

  • jarjar Patron Provider, Top Host, Veteran

    I run a script that exports SQL, zips the content directory, and pushes it to another server, which then stores daily snapshots and uploads them to Dropbox.
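
    A minimal sketch of that kind of script, assuming placeholder database, path and host names (this is not jarjar's actual script; the Dropbox step is left to the receiving server):

        #!/bin/bash
        # Dump the forum database, archive the content directory and push
        # both to a remote backup host. Every name below is a placeholder.
        set -euo pipefail

        STAMP=$(date +%Y%m%d)
        BACKUP_DIR=/var/backups/site
        mkdir -p "$BACKUP_DIR"

        # Export the database (credentials are read from ~/.my.cnf)
        mysqldump forum_db | gzip > "$BACKUP_DIR/forum_db.$STAMP.sql.gz"

        # Archive the web content
        tar czf "$BACKUP_DIR/content.$STAMP.tar.gz" /var/www/forum

        # Push everything to the backup server, which keeps the daily snapshots
        rsync -a "$BACKUP_DIR/" backupuser@backup.example.com:/backups/site/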

  • Awmusic12635 Member, Host Rep
    edited May 2013

    Depending on your budget, check out Bacula. It can back up everything on your server, and you can even have it set up to back up your MySQL databases as well (see the sketch below).

    It supports compression and incremental backups.

    We use this for our VPS services, running it every 6 hours, though a lot can change in six hours, so in your case you may want to run it more often.

    You could set it to back up offsite, say to a storage/backup VPS or another location where you have a good bit of storage.

    -Alex
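
    One common way to get MySQL into a Bacula FileSet (not necessarily how Alex has it set up) is a small dump script run as a ClientRunBeforeJob, so the dump files land in a directory Bacula already backs up:

        #!/bin/bash
        # Hypothetical ClientRunBeforeJob script: dump every database into a
        # directory that is already listed in the Bacula FileSet.
        set -euo pipefail

        DUMP_DIR=/var/backups/mysql
        mkdir -p "$DUMP_DIR"

        for db in $(mysql -N -e 'SHOW DATABASES' | grep -Ev '^(information_schema|performance_schema)$'); do
            mysqldump --single-transaction "$db" | gzip > "$DUMP_DIR/$db.sql.gz"
        done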

  • Raymii Member

    I use Duplicity to incrementally back up my servers and web applications to backup VPSes: https://raymii.org/s/tutorials/Website-and-database-backup-with-Duplicity.html
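
    For reference, a typical Duplicity run against a backup VPS over SFTP looks roughly like this (host and paths are placeholders, not taken from the linked tutorial):

        # Duplicity encrypts with GPG by default and reads the passphrase from $PASSPHRASE.
        # Take a full backup if the last full one is older than a week, otherwise incremental.
        duplicity --full-if-older-than 7D /var/www \
            sftp://backupuser@backup.example.com//backups/www

        # Throw away backup chains older than two months
        duplicity remove-older-than 2M --force \
            sftp://backupuser@backup.example.com//backups/www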

  • taronyu Member

    @Raymii

    Does Duplicity also work with big backups? We have over 1 TB of data that we need to back up every night, so incremental backups are much better for us.

  • Duply is a frontend for duplicity and a lot easier to use.
    apt-get install duply ncftp

    I've been using it for years.
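
    Roughly how duply is used day to day, assuming a profile called www (the TARGET below is a placeholder):

        # Create a profile skeleton under ~/.duply/www/
        duply www create

        # Edit ~/.duply/www/conf and set at least SOURCE, TARGET and GPG_PW,
        # e.g. TARGET='ftp://user:pass@backup.example.com/www'

        # Run a backup (duply decides between full and incremental)
        duply www backup

        # See what is stored on the target
        duply www status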

  • mikho Member, Host Rep

    I've spent most of my day restoring a broken RAID set because a customer wasn't using a proper backup solution.

    Well, I know what I will be doing this weekend. :(

  • trexos Member

    I've been looking for a simple script which backs up a given folder in RAR or another format that supports more than 4 GB. I just need to back up some game servers. Does anybody have a script like that? :D

  • 5n1p Member

    Have you tried this?
    http://bash.cyberciti.biz/backup/wizard-ftp-script.php

    Of course, don't use real details there; when you download the script you can change the user and pass. You will also need to install ncftp: http://www.ncftp.com/ (an example upload command is sketched below).
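
    If you end up adapting the generated script, the upload step boils down to an ncftpput call along these lines (host and credentials are placeholders):

        # -m creates the remote directory if it does not exist yet
        ncftpput -u ftpuser -p 'ftppass' -m ftp.backup.example.com /backups/site \
            /var/backups/site-$(date +%Y%m%d).tar.gz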

  • trexos Member

    Wow, nice. Exactly what I need! It needs MySQL for the backups, right? Sorry, I'm only on my mobile :/

  • 5n1p Member
    edited May 2013

    It needs your MySQL user and pass if you use MySQL, so it can make a backup together with all the other folders you specify in one archive file and push it to the other server via FTP.
    That is, if I understood your question :)
    Remember, if you have more than one MySQL database, use your root database user and pass so it can access all of your databases and back them up. If you don't use MySQL, just delete that part from the script.
    EDIT: Now I see you need it for a game server, so I guess you don't need the MySQL part of the script; open it with Notepad and comment out or delete that part.

  • marcm Member

    what kind of backup?

  • I use my own bash-based script, Backup Master. It can back up prespecified folders and all (or selected) MySQL databases. It creates compressed archives and then optionally syncs them to a backup server via SFTP, or uploads them via plain FTP. At the end of the run it summarizes the operation and sends an email to a specified address. Works well for my 7 WordPress sites.

    It's WIP.
    https://bitbucket.org/droidzone/backupmaster
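
    The sync-and-notify part of a workflow like that can be as simple as the following (placeholders throughout; this is not the actual Backup Master code, which lives in the repository above):

        # Upload today's archives with sftp in batch mode (key-based auth assumed)
        echo "put /var/backups/site/*.tar.gz /backups/site/" | \
            sftp -b - backupuser@backup.example.com

        # Mail a one-line summary (requires a working mail/mailx setup)
        echo "Backup finished at $(date)" | \
            mail -s "Backup summary for $(hostname)" admin@example.com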

  • @ykbaek I honestly grabbed a box from @KuJoe; good deal hidden on Dragon.

  • SplitIce Member, Host Rep

    My technique is probably unique. It was developed because of the large number of servers I have to manage and because they share a large amount of common data.

    For SQL:
    I have a single SQL server, so it is a special case.
    1. An LVM snapshot of the VM, transferred to the backup server every day.
    2. Percona XtraBackup taking a full backup every 6 hours with an incremental backup every half hour, transferred directly to the backup server (see the sketch below). I plan to have this performed on a slave eventually.

    For files:
    I use Unison to keep a single directory in sync between all servers and one backup server.
    Every 5 minutes it synchronizes all servers (10 of them, some located in Africa and Oceania).

    Backup server:
    Every hour a backup of the directory is made on the backup server (tar.bz2) and written to the hard drive.

    Local server:
    Every morning at 10am it connects to the backup server and downloads the latest copy of the files and the SQL. It takes until about 4pm to complete (Australia :()

    Future plans:
    - BitTorrent Sync instead of relying on a central point of failure for synchronisation. I also use the shared directory for distributing binaries and configuration files out to servers, so it's important that it's available at all times.
    - Integration with MEGA. I'm interested to know what people think of this - better than relying on my home internet connection?
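
    For the XtraBackup part, the full-plus-incremental cycle above corresponds roughly to the following (directories are placeholders; older installs used the innobackupex wrapper instead):

        # Every 6 hours: take a fresh full backup (clear or rotate the target dir first)
        xtrabackup --backup --target-dir=/backups/mysql/full

        # Every 30 minutes in between: incremental against the last full backup
        xtrabackup --backup \
            --target-dir=/backups/mysql/inc-$(date +%H%M) \
            --incremental-basedir=/backups/mysql/full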

  • @SplitIce I almost dove into something very similar to MEGA. The concept is definitely the way to go in the future.
    Just couldn't get that client to commit. Meh, they'll learn when everything goes catastrophically wrong. Which never happens in this business, right! If you can do it, brother, it's good.

    If and when I am able to, I'll tout the combination for cloud storage that I found; it's about mainstream anyway.

  • trexos Member

    Yesterday I created my own bash script for backups; it creates a tar.gz archive and uploads it to a webspace account. Works perfectly :)

  • I have a script running every 4 hours that takes database backups and uploads them to DreamHost cloud files, which has free inbound bandwidth and only costs 3 cents per GB stored, so it's the cheapest option.

    As for files, I have a script running every 24 hours which does a full backup of files and databases and again uploads to DreamHost cloud files.

    I use s3cmd to upload to DreamHost cloud files, so it is all done automatically, hassle-free and very cheaply (see the sketch below). I know relying on one backup service is not good in case of their failure, so I'm looking at adding Dropbox or SugarSync backups since I have more space there.
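
    With s3cmd already configured for the S3-compatible endpoint in ~/.s3cfg, the database half of such a cron job can be as small as this (bucket and database names are placeholders):

        #!/bin/bash
        set -euo pipefail
        STAMP=$(date +%Y%m%d-%H%M)

        # Dump, compress and ship to object storage, then clean up locally
        mysqldump forum_db | gzip > "/tmp/forum_db.$STAMP.sql.gz"
        s3cmd put "/tmp/forum_db.$STAMP.sql.gz" "s3://my-backup-bucket/db/"
        rm "/tmp/forum_db.$STAMP.sql.gz"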

  • Raymii Member

    @taronyu said: Does Duplicity also work with big backups? We have over 1 TB of data that we need to back up every night, so incremental backups are much better for us.

    Yes. At one of my clients it replaced bacula and handles a nightly SAN incremental backup of about 4 TB.

    @SplitIce said: I use Unison to keep a single directory in sync between all servers and one backup server.

    Remember, sync is not backup, just as RAID is not backup.

    @trexos said: Yesterday I created my own bash script for backups; it creates a tar.gz archive and uploads it to a webspace account. Works perfectly :)

    Duplicity also does that, probably better, with incremental backup support and encryption, and it saves you a lot of script-writing work...

  • SplitIce Member, Host Rep

    @Raymii said: @SplitIce said: I use Unison to keep a single directory in sync between all servers and one backup server.

    Remember, sync is not backup, just as RAID is not backup.

    If you read the whole thing (it is a little TL;DR) you will see that it's synced to a backup server where the actual backup work is done. Sync is done before backup to protect against machine-level issues such as hard drive failure (it's a backup with at most 5 minutes of loss).

    @natestamm said: @SplitIce I almost dove into something very similar to MEGA. The concept is definitely the way to go in the future.

    Just couldn't get that client to commit. Meh, they'll learn when everything goes catastrophically wrong. Which never happens in this business, right! If you can do it, brother, it's good.

    If and when I am able to, I'll tout the combination for cloud storage that I found; it's about mainstream anyway.

    I'd be really interested to hear if people are using MEGA, with or without encryption. I have never felt right about sending sensitive data to third parties.

  • emgemg Veteran

    Allow me to point out that having a great backup method is only half the solution. In addition, you must verify that you can restore a failed system from your backups.

    I have seen many instances where the backups were performed religiously, yet when the inevitable failure occurred, the restoration process was far from smooth, and sometimes there were significant data losses.
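
    A cheap way to act on that advice, reusing the Duplicity setup discussed above (host, paths and the test file are placeholders), is a periodic scripted restore drill:

        # Restore the latest backup into a fresh scratch directory
        DEST=/tmp/restore-test-$(date +%Y%m%d)
        duplicity restore sftp://backupuser@backup.example.com//backups/www "$DEST"

        # Sanity-check for a file you know should be there (index.php is just an example)
        test -f "$DEST/index.php" && echo "restore looks OK"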

  • Fatboy Member
    edited May 2013

    I run a script that tars up the home directories (where the websites sit in their own directories), dumps all databases and then tars up the /etc/apache2/sites-available directory and the Apache config. It then grabs all the SSL certs and tars them up as well.

    When it's done, it tars everything together into a file called -.tar.gz

    A remote server then pulls the backups to a central server elsewhere once a week.

    The script is fairly simple (well, I wrote it, so it has to be!) and keeps backups tidy for me.

    Probably not the most efficient or best way, but hey, it works :D
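
    A stripped-down sketch of that sort of script, with placeholder paths and archive names (not Fatboy's actual script):

        #!/bin/bash
        set -euo pipefail
        STAMP=$(date +%Y%m%d)
        WORK=/var/backups/work
        mkdir -p "$WORK"

        # Websites, databases (credentials from ~/.my.cnf), Apache config and
        # SSL certs, each in its own archive
        tar czf "$WORK/home.$STAMP.tar.gz" /home
        mysqldump --all-databases | gzip > "$WORK/databases.$STAMP.sql.gz"
        tar czf "$WORK/apache.$STAMP.tar.gz" /etc/apache2/sites-available /etc/apache2/apache2.conf
        tar czf "$WORK/ssl.$STAMP.tar.gz" /etc/ssl/private /etc/ssl/certs

        # Bundle everything into one archive for the remote server to pull
        tar czf "/var/backups/$(hostname).$STAMP.tar.gz" -C "$WORK" .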

  • Oliver Member, Host Rep
    edited May 2013

    If you want to do incremental backups of home directories to S3, here's something I recently put together. I handle MySQL separately, and this does not delete/remove old stuff, but it's still a starting point to work with further...

    I just put stuff in the Oregon S3 location since it's cheapest. Note that I use a custom .s3cfg file whose IAM credentials are limited to write-only permission. This means that if the system is compromised, nobody can remove the backups even though they have those IAM credentials to AWS. Some other methods mentioned here are fine, but if someone really bad breaks in, they can read your FTP passwords from your backup scripts, connect to the FTP server where your backups are stored and delete them as well.

    Cronjob the following to run from /home:

    find . -maxdepth 1 -name "*" -type d -not -path "." -exec ./backuper.sh '{}' \;
    

    Put backuper.sh in your /home directory:

    #!/bin/bash
    # backuper.sh: tar and compress one home directory, push it to S3 with a
    # write-only key, then remove the local archive.
    YMD=`date +%Y%m%d`
    YM=`date +%Y%m`
    tar cf "$1.$YMD.tar" "$1"
    # pigz is a parallel gzip; plain gzip works too
    pigz -v "$1.$YMD.tar"
    # --rr stores the object with reduced redundancy to save a little on storage
    s3cmd --no-progress -c ~/.s3cfg_writeonly --multipart-chunk-size-mb=20 --rr put "$1.$YMD.tar.gz" s3://YOURS3BUCKET/backups/$YM/
    rm -v "$1.$YMD.tar.gz"
    

    This leaves you with a tar.gz file in the bucket for each home directory, in the format user.20130511.tar.gz for the 11th of May this year.

    Feedback/improvements/ideas welcome :-)
