How do you back up your LEB?

I have several LEBs with VestaCP. From time to time I wget the backups off to a remote backup server. Can this be automated?

Comments

  • rsnapshot is what you are looking for!
    http://rsnapshot.org/
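
    For flavor, the relevant lines of an rsnapshot setup might look like this; the paths, retention counts, and host are placeholders, and note that rsnapshot requires literal tabs between fields (shown as spaces here):

    # /etc/rsnapshot.conf -- key lines only; fields must be tab-separated
    snapshot_root   /backup/snapshots/
    retain          daily    7
    retain          weekly   4
    # pull /home from the LEB over ssh
    backup          root@leb.example.com:/home/    leb.example.com/

    # crontab entry to rotate the daily snapshots at 03:00
    0 3 * * * /usr/bin/rsnapshot daily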

  • mysqldump tar rsync@remotehost
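
    Unpacked, that one-liner workflow could be scripted roughly like this (hostnames and paths are placeholders):

    #!/bin/bash
    # dump the databases, tar the files, ship both off-box with rsync
    mysqldump --all-databases | gzip > /backup/all-databases.sql.gz
    tar -czf /backup/home-$(date +%F).tar.gz /home
    rsync -av /backup/ backupuser@remotehost:/backup/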

  • @GM2015 said:
    mysqldump tar rsync@remotehost

    Thanks buddy; however, I'm looking to back up all the files, not only the databases.

  • wych Member
    edited September 2015

    I set a cron job for rsync to run, or use whatever software's built-in function.

    node > backup server > backup cluster

  • drazilox Member
    edited September 2015

    I have my own backup script on cron, which backs up all the sites at /home/*/sites/* and all the databases as their own tar.gz files, and then sends them to a backup server via rsync.

  • @drazilox said:
    I have my own backup script on cron, which backs up all the sites at /home/*/sites/* and all the databases as their own tar.gz files, and then sends them to a backup server via rsync.

    Can you share that script?

  • Ndha said: Can you share that script?

    Sure. It is my first bash script though, which means I didn't know anything about bash when I wrote it, and I still don't know much. Some bits and pieces are from tuxlite's backup script and various Stack Overflow answers.

    http://paste.ee/p/rsLag#HxO2EHULHSVBgRFCm1wCTQZhUqPz6tJn

    It creates a folder structure like this:

    ├── 2015-09-23
    │   ├── databases
    │   │   └── dbname-1442982601.sql.gz
    │   └── domains
    │       ├── user-www.example.com-1442982601.tar.gz
    │       ├── user-www.example2.com-1442982601.tar.gz
    │       ├── user-www.example3.com-1442982601.tar.gz
    │       └── user-www.example4.com-1442982601.tar.gz
    └── 2015-09-24
        ├── databases
        │   └── dbname-1443069001.sql.gz
        └── domains
            ├── user-www.example.com-1443069001.tar.gz
            ├── user-www.example2.com-1443069001.tar.gz
            ├── user-www.example3.com-1443069001.tar.gz
            └── user-www.example4.com-1443069001.tar.gz
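
    In case the paste link dies, here is a rough sketch of a script producing that layout; the usernames, paths, and database selection are guesses for illustration, not the actual script:

    #!/bin/bash
    # nightly: per-database gzipped dumps plus per-site tarballs, then rsync off-box
    STAMP=$(date +%s)                # unix timestamp, e.g. 1442982601
    DEST=/backup/$(date +%F)         # dated folder, e.g. /backup/2015-09-23
    mkdir -p "$DEST/databases" "$DEST/domains"

    # one compressed dump per database, skipping the system schemas
    for DB in $(mysql -N -e 'SHOW DATABASES' | grep -vE '^(information_schema|performance_schema|mysql)$'); do
        mysqldump "$DB" | gzip > "$DEST/databases/$DB-$STAMP.sql.gz"
    done

    # one tarball per site under /home/*/sites/*
    for SITE in /home/*/sites/*; do
        USER=$(echo "$SITE" | cut -d/ -f3)    # the account owning the site
        tar -czf "$DEST/domains/$USER-$(basename "$SITE")-$STAMP.tar.gz" "$SITE"
    done

    rsync -a /backup/ backupuser@backupserver:/backup/
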
  • At a quick glance, I'm not sure that offers a ton of control.

    I, personally, have started using Attic very recently and it works pretty nicely so far.

    My fallback would be something like rsync, though, to manage the storage if I didn't use Attic.

  • @Amitz said:
    rsnapshot is what you are looking for!
    http://rsnapshot.org/

    How does it compare to duply/duplicity?

  • Right now I'm just using Tarsnap and then rsyncing my website files and MySQL databases (backed up with automysqlbackup) to a backup VPS.
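
    For reference, the Tarsnap half of that is a single cron-able command (the archive name and paths here are arbitrary examples):

    # create a dated, deduplicated, encrypted archive of the web root and DB dumps
    tarsnap -c -f "web-$(date +%Y-%m-%d)" /var/www /var/lib/automysqlbackup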

  • I push content to them rather than have anything original on there. Installation/reinstallation is taken care of with bash scripts.

  • @MattM said:
    My fallback would be something like rsync, though, to manage the storage if I didn't use Attic.

    Attic looks interesting

  • +1 for attic

    I do a daily rsync (push) of those boxes to a mirror, then attic from there to a second mirror.
    This way I always have a fresh daily backup to use directly, or, if needed, I can pull older backup data out of the attic.
    Deduplication in attic works like a charm and keeps the overall size low, especially if you have multiple VPSes with comparable setups...
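
    For anyone curious, the attic half of a setup like that is only a few commands (repository path and retention counts are examples):

    # one-time: create the deduplicated repository on the second mirror
    attic init /backup/repo.attic

    # daily: archive the mirror; unchanged chunks are deduplicated away
    attic create /backup/repo.attic::$(date +%F) /mirror

    # thin out old archives to keep the repository small
    attic prune /backup/repo.attic --keep-daily=7 --keep-weekly=4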

  • I just rsync all the backups VestaCP created to another server. Set a cron job to run daily.

  • attic to a remote server (via SSH). If further redundancy is required, I rsync the attic repository to Google Cloud or S3.

  • @zeitgeist said:
    attic to a remote server (via SSH). If further redundancy is required, I rsync the attic repository to Google Cloud or S3.

    Just wondering, but do you have SSH compression enabled? I set the SSH client config on my servers not to use it and get even better speeds during backups.
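
    For example, turning it off for the backup host alone looks like this (the host alias is a placeholder):

    # ~/.ssh/config
    Host backupserver
        Compression no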

  • crontab, rsync and mysqldump

  • @TheOnlyDK said:
    I just rsync all the backups VestaCP created to another server. Set a cron job to run daily.

    I wish you could write a brief tutorial for that.

  • @Mridul said:
    I wish you could write a brief tutorial for that.

    Sure. This is for rsync over SSH, and you will need rsync installed on the backup server as well. The --delete flag isn't really needed in this case since VestaCP backups are all dated. You will need to set up an SSH key first and run the command once to add the fingerprint to known_hosts before the crontab will work (key setup is sketched below the command).

    The first line opens the crontab editor.
    Put the second line (after you edit it) into the crontab and it will run at 9 PM (or change the 21 to whenever).
    I think you will need root to run this crontab; not sure. I think I got permission issues with the admin user.

    crontab -e
    0 21 * * * rsync -av --exclude 'backup' --delete -e ssh /home/backup/ backupuser@backupserveraddress:/backup/
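
    The key setup mentioned above is roughly this (run as the same user the cron job runs under):

    ssh-keygen -t rsa -b 4096    # accept the defaults; leave the passphrase empty for cron
    ssh-copy-id backupuser@backupserveraddress
    # then run the rsync once interactively so the host key lands in known_hosts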
    
  • curmudgeon Member
    edited September 2015

    Depending on whether you have enough spare memory, you could use CrashPlan. Reach it by port forwarding from your desktop so that you can run it headless on the server. The free version lets you back up to another computer that you control, and it runs automatically every day on the folders you've specified.

    I use the paid CrashPlan for a production file server whose data I really care about having backed up right away, as it's also where I run my Seafile server for my home PC and keep some old datasets archived. It's gotten to 2TB at this point, though that's over the course of many months. Other things can live with a one-day lag.

    Alternatively, if you're just archiving old data that you don't expect to use very often, there's Amazon Glacier, which I used before CrashPlan (when there was less data it cost peanuts). That said, I think it's roughly $10/month/TB, so maybe I'm just being cheap.

  • VestaCP does have a remote backup option. Why don't you use that? It takes a backup every day.
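
    If I remember right, that option is configured through Vesta's CLI; something along these lines, though the exact arguments may differ between versions, and the host and credentials here are placeholders:

    # register a remote FTP host; Vesta then uploads its daily backups there
    v-add-backup-host ftp backup.example.com backupuser password /vesta-backups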

  • I use duply (http://duply.net/) to S3 (versioning enabled) with auto-rotation to Glacier after 30 days. Duply uses duplicity (http://duplicity.nongnu.org/) as its backend. It generates compressed incremental backups encrypted with PGP and uploads directly to any S3-compatible host or other storage backend supported by duplicity, keeping a local copy of the backup if you want. (A sample profile follows the backend list below.)

    Duplicity can upload to:

    Azure
    Cloud Files (Rackspace)
    Copy cloud storage
    Dropbox
    Local file path
    FISH (Files transferred over Shell protocol) over ssh
    FTP
    Google Docs
    Google Cloud Storage
    HSI
    hubiC
    IMAP email storage
    Mega cloud storage
    OneDrive Backend
    Par2 Wrapper Backend
    Rsync via daemon
    Rsync over ssh (only key auth)
    S3 storage (Amazon)
    SCP/SFTP access
    Swift (Openstack)
    Tahoe-LAFS
    WebDAV
    pydrive
    multi (will upload to multiple storage backends)
    
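    As mentioned above, a minimal duply profile for that S3 setup might look like this; the bucket, key IDs, and retention are placeholders, and duply generates the file skeleton itself:

    # ~/.duply/mysite/conf -- created with `duply mysite create`, then edited
    GPG_KEY='DEADBEEF'           # PGP key that encrypts the backups
    GPG_PW='passphrase'
    TARGET='s3://s3.amazonaws.com/my-bucket/mysite'
    TARGET_USER='AWS_ACCESS_KEY_ID'
    TARGET_PASS='AWS_SECRET_ACCESS_KEY'
    SOURCE='/home'
    MAX_AGE=1M                   # `duply mysite purge` can then drop backups older than a month

    # then cron something like: duply mysite backup
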
  • For my personal boxes, right now I am using Duplicity with Delimiter's ObjSpace, which works great. I have about 30 VPSes and 4 dedicated servers backing up this way.

  • @MarkTurner said:
    For my personal boxes, right now I am using Duplicity with Delimiter's ObjSpace, which works great. I have about 30 VPSes and 4 dedicated servers backing up this way.

    Duplicity only, or with a script in front of it? Is ObjSpace prepaid?

  • howardsl2 Member
    edited September 2015

    Install Proxmox with the ZFS option, then send full and incremental ZFS snapshots periodically to a remote storage server running FreeBSD.

    You could get insane read and write speeds with ZFS and large amounts of RAM. Even better with compression turned on.
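
    The snapshot shipping itself is just zfs send piped over SSH (the pool and dataset names here are examples):

    # take a dated snapshot
    zfs snapshot tank/vmdata@2015-09-24

    # full send to the FreeBSD storage box
    zfs send tank/vmdata@2015-09-24 | ssh backup@storagebox zfs receive backup/vmdata

    # later: incrementally send only the blocks changed since the previous snapshot
    zfs send -i tank/vmdata@2015-09-24 tank/vmdata@2015-09-25 | \
        ssh backup@storagebox zfs receive backup/vmdata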

  • ATHK Member
    edited September 2015

    I rsync all my specific files to a storage VPS, which sends them off to Dropbox.

    However, I've been thinking of switching that out (removing the storage VPS) in favour of rclone, which is a fantastic little tool.

    That said, I'm only backing up a few GB, unlike some users here...
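
    For the curious, the rclone workflow is about as short as it gets (the remote name is whatever you chose in the config step):

    rclone config                               # one-time interactive setup of the Dropbox remote
    rclone sync /home/backup dropbox:backup     # mirror the local backup dir to Dropbox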

  • rsync, mysqldump and a cron job

  • EkaatyLinux said: Duplicity only, or with a script in front of it? Is ObjSpace prepaid?

    Actually Duply/Duplicity. I wrote Delimiter an article on how to set it up, but I can see it's not on their wiki yet.

    It's prepaid packages, which makes things a lot more predictable.
