Backup solutions: an alternative for BackupPC

imok Member

Hi

I've been using BackupPC to take backups of all my servers for a while and it's been working flawlessly. The problem is it can start the process while a server is too busy (high CPU load or intensive IO), and that can contribute to downtime in production. It happened to me last week.

I want to know if there is a solution that monitors server health before starting incremental backups.

Thank you!

Comments

  • Zerpy Member
    edited May 2018

    You can set the time frame in which it should perform backups, and you can renice and ionice your process to something super low priority.
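
    For example, something along these lines (a minimal sketch; the actual rsync invocation depends on your transfer setup):

        # run rsync at the lowest CPU priority and in the "idle" IO class,
        # so it only gets disk time when nothing else wants it
        nice -n 19 ionice -c3 rsync -a /var/www/ backup@backuphost:/backups/www/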

    Now, if your server goes down because of rsync running, then I'd say you have bigger issues with your machine and should look into that instead.

    Also, you could in fact execute a script before BackupPC starts to check the server load - BackupPC supports pre and post scripts.
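
    A minimal sketch of such a pre-script, assuming it is wired up via BackupPC's $Conf{DumpPreUserCmd} with $Conf{UserCmdCheckStatus} enabled so a non-zero exit aborts the dump (check your BackupPC version's docs for the exact config names):

        #!/bin/sh
        # skip the backup if the 1-minute load average is at or above a threshold
        MAX_LOAD=4
        LOAD=$(cut -d' ' -f1 /proc/loadavg | cut -d. -f1)
        if [ "$LOAD" -ge "$MAX_LOAD" ]; then
            echo "load $LOAD >= $MAX_LOAD, skipping backup" >&2
            exit 1   # non-zero exit tells BackupPC to abort this run
        fi
        exit 0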

  • imok Member

    Thanks, I will look for information about pre scripts.

    I'm taking backups of some shared accounts with low IO performance, and there was a coincidence of really high traffic, a cleared cache and the backup running. A total mess.

  • eva2000 Veteran
    edited May 2018

    Zerpy said: you can renice and ionice your process to something super low priority.

    +1, that is a really important part when doing backups. Also, schedule/run backups at the lowest CPU/disk utilisation times of day - you can find those by checking past sar stats output.
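
    For example, with sysstat's collected history (the /var/log/sa path and file naming vary by distro):

        # CPU utilisation samples for a past day of the month (here the 15th)
        sar -u -f /var/log/sa/sa15
        # run queue / load average history for the same day
        sar -q -f /var/log/sa/sa15
        # block IO activity history
        sar -b -f /var/log/sa/sa15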

    Also, if possible you can break up your backups into smaller segments with separate cron scheduled times, i.e. if you have 4 sites with 4 sets of files + databases to back up, you can break them down into 4 separate backup schedules, one for each site's files + databases, scheduled around the respective site's off-peak times. You may also want to customise the order in which files/data are backed up. I usually write my backup scripts so that data sets are backed up from smallest to largest, giving me the highest chance and percentage of completion for the total data sets :)
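
    A rough crontab sketch of that idea, with hypothetical site names, off-peak times and a backup-site.sh wrapper standing in for your own script:

        # stagger per-site backups around each site's off-peak window,
        # smallest data set first, largest last
        30 2 * * * /root/backup-site.sh site1
        30 3 * * * /root/backup-site.sh site2
        30 4 * * * /root/backup-site.sh site3
        30 5 * * * /root/backup-site.sh site4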

    That's what I do in my Centmin Mod premium users' dbbackup.sh script, which also utilises nice/ionice https://community.centminmod.com/threads/dbbackup-sh-quick-mysql-database-backups-for-centmin-mod-stack.4573/ :)

    Also play with compression methods for your backups to find which is optimal for your needs. Never used BackupPC so no idea if you have control over that? I did some benchmark comparisons at https://community.centminmod.com/threads/compression-comparison-benchmarks-zstd-vs-brotli-vs-pigz-vs-bzip2-vs-xz-etc.12764/.
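
    If your tool lets you pick the compressor, it's easy to benchmark a representative data set yourself, e.g.:

        # compare wall time and output size across a few compressors
        time tar -cf - /var/www | gzip -6 > /tmp/test-gzip.tar.gz
        time tar -cf - /var/www | pigz -6 > /tmp/test-pigz.tar.gz
        time tar -cf - /var/www | zstd -3 > /tmp/test-zstd.tar.zst
        ls -lh /tmp/test-*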

    Sometimes you want to process backups as fast as possible, even if they use more CPU and memory resources, so they don't drag out the backup process. That way you can also implement a site-wide maintenance mode during backup times to prevent site usage load and backup load from conflicting.

    I.e. when the backup process starts, switch the site to maintenance mode for 5 minutes while backups are being done and then switch the site back on. Great for when you have 100s or 1000s of gigabytes of data to back up and don't have the $$$ budget to scale out a 100% uptime config/cluster. Just enable maintenance mode and do backups as fast as you can, utilising all CPU cores where possible with multi-threaded file and database backup tools, i.e. Percona XtraBackup, mydumper, or mysqldump's multi-threaded equivalent mysqlpump https://dev.mysql.com/doc/refman/8.0/en/mysqlpump.html, and multi-threaded compression, i.e. pigz, pbzip2, pxz, zstd etc.
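
    A rough sketch of such a wrapper, where the sitestatus stub stands in for whatever maintenance toggle your stack provides, and mydumper/pigz are the multi-threaded tools mentioned above:

        #!/bin/sh
        # hypothetical maintenance toggle - replace with your stack's command
        sitestatus() { echo "switching site to: $1"; }

        sitestatus maintenance    # take the site offline for the backup window
        # multi-threaded database dump + multi-threaded file compression
        mydumper --threads 4 --outputdir /backup/db
        tar -cf - /var/www | pigz -9 > /backup/www.tar.gz
        sitestatus live           # switch the site back on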

    I'd rather backups take less than 5 minutes (with/without a site maintenance mode) and use all CPU cores than take 60 minutes and drag out the process with potential impact on normal site usage/operations.

    My Centmin Mod users get inbuilt global site maintenance modes they can use for such tasks via 2 SSH commands https://community.centminmod.com/threads/sitestatus-maintenance-mode.5599/

    imok said: The problem is it can start the process while a server is too busy (high CPU load or intensive IO), and that can contribute to downtime in production. It happened to me last week.

    Optimising backup scripts can only get you so far, though. When spec'ing a server you usually do it to meet the highest peak usage loads you anticipate - most folks account for their web app/script usage but overlook tasks like backups and the restore process when they make those calculations. I usually have set criteria when I choose server specs, such as how fast they can back up and restore XX amount of data in XX amount of time. I'd rather choose the server that restores my data in 30 minutes or less than a server that takes 24+ hrs to restore the data :)
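
    For a rough sense of the numbers: restoring 500GB at a sustained 300MB/s works out to 500,000MB / 300MB/s ≈ 1,667 seconds, i.e. under 30 minutes, while the same 500GB at 6MB/s is ≈ 83,000 seconds, i.e. roughly 23 hours.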

  • imok Member

    BackupPC only starts backups X times a day; I can black out some days or hours but nothing more, and with a news website you don't know when breaking news might take the server down. I'm moving this site off shared hosting, but it's still good to have a solution for future reference and for other sites with increasing traffic.

    @eva2000 Thank you very much! A lot to learn.

  • Zerpy Member

    @imok said:
    BackupPC only starts backups X times a day; I can black out some days or hours but nothing more, and with a news website you don't know when breaking news might take the server down. I'm moving this site off shared hosting, but it's still good to have a solution for future reference and for other sites with increasing traffic.

    Depending on the website, you can often cache most stuff on the site anyway - I also run news-sites on a shared hosting platform, and it's pretty predictable when the traffic will be there, and when it won't.

    Depending on the reach of the site, it should be obvious when visitors are sleeping, and those are the times when you'd back up things like that anyway.

    Can I ask why you don't use a solution such as JetBackup or similar that actually allows you to configure things like this? You can create individual jobs for individual accounts, choose whether files and/or databases should be backed up - it gives you so much control that you'll surely be able to make it work, and it also has a feature to set a max load if I remember correctly (cba to log into one of my servers).

  • imok Member

    Nice necro for my own thread.

    Zerpy said: Can I ask why you don't use a solution such as JetBackup or similar

    Because I don't use cPanel.

    Does anyone know an alternative that allows customers to access their backups?

    @eva2000 have you tested dbbackup.sh with S3 alternatives like Minio?

  • Duplicati.com is free, supports a large number of protocols and cloud providers, and is great.

  • There is a new player:

    https://github.com/gilbertchen/duplicacy

    Deduplication
    Compression
    Sync with cloud storage out of the box

    I am using it on a different project and I am super happy. Very, very simple to use, and with cloud storage support you will have backups outside your VPS in minutes.
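
    Getting started looks roughly like this (snapshot id, host and paths are placeholders; see the duplicacy docs for the supported storage backends):

        cd /var/www/mysite
        # initialise the repository against remote storage (sftp as one example)
        duplicacy init mysite sftp://backup@backuphost/backups/mysite
        # run a backup - deduplication and compression happen automatically
        duplicacy backup -stats
        # list snapshots and restore revision 1
        duplicacy list
        duplicacy restore -r 1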

  • mfs Banned, Member


    License Types:
    CLI License
    $20 starting

    and the licensing situation is unvetted enough to start with

    Thanks, but no thanks.

    I'll stick to my beloved borg
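
    For reference, a minimal borg workflow looks something like this (repo location is a placeholder):

        # one-time: create an encrypted, deduplicating repository
        borg init --encryption=repokey backup@backuphost:/backups/borg
        # compressed, deduplicated archive of /var/www, named by timestamp
        borg create --stats --compression zstd backup@backuphost:/backups/borg::www-{now} /var/www
        # thin out old archives, keeping 7 daily and 4 weekly
        borg prune --keep-daily 7 --keep-weekly 4 backup@backuphost:/backups/borg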
