Synology CloudSync + B2: So Far, So Good

raindog308 Administrator, Veteran

My home files live on a Linux NAS (or Dropbox). Previously I had an rclone script backing them up to B2. This weekend I switched to rsyncing to a Synology and then using its CloudSync to back up to B2.

So far, working well! Some tips:

  • you really, really, really want to use the Synology's rsync service as opposed to mounting NFS and running rsync against that. Over NFS, rsync just does a comparison of timestamp/size. When running against a real rsync service, it does the block-level compare. Or so I've read...rsync on Synology is pretty easy to set up.

  • my policy is rsync local and then set the bucket to a 30-day retention. Thus, I have versioned backups.

  • I didn't do the advanced comparison thing...my little DS215Js are working hard enough.

  • break out each bucket into a separate B2 config so you can see where each one is at. If you have a bunch (I have about 25), create them all first and then configure the Synology. Setting up each one in the Synology was the major work.

  • I'm only doing local-to-remote sync. By default, I think the comparison runs every 60 seconds, which is nuts. I ramped all of those up to 600 seconds, but I may move to scheduled syncs for volumes where the change rate isn't high.
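
For the first tip, the difference shows up in the target syntax: a double colon (or rsync://) talks to the Synology's rsync daemon, so rsync's delta-transfer runs on both ends, while with an NFS mount rsync treats the destination as local and (its default for local paths, --whole-file) copies changed files in full over the wire. A sketch, where the host syno.local, user backupuser, and module name backup are all made up:

```shell
# against the Synology's rsync service (double colon = rsync daemon);
# for changed files, only changed blocks cross the network
rsync -a --delete /home/files/ backupuser@syno.local::backup/files/

# against an NFS mount; rsync sees two "local" paths, so any
# changed file is rewritten in full across the NFS link
rsync -a --delete /home/files/ /mnt/syno-nfs/files/
```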

I wish CloudSync showed a percentage complete...there's really no way to know how close a volume is to finishing its B2 sync except by comparing the size reported by the B2 API against what's on your source. CloudSync will tell you how many files remain to be synced, but that's it.
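
That manual check looks something like this (bucket name and share path are made up; the b2 CLI's get-bucket has a --showSize flag that reports a totalSize across all stored versions):

```shell
# bytes on the source side
du -sb /volume1/photos | cut -f1

# bytes B2 holds so far (includes old versions, so it can overshoot)
b2 get-bucket --showSize my-photos-bucket | grep totalSize
```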

I have ~500K files and 2.7TB, so with a 10Mbps upstream it'll take a bit...fortunately, I have my old rclone buckets, so as each new bucket completes I'm nuking the old. And I have a local backup now, which is nice.
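
Back-of-the-envelope, that timeline checks out - a sketch in shell arithmetic, using decimal units and ignoring protocol overhead:

```shell
# 2.7 TB at 10 Mbps upstream, ideal throughput
bytes=$((2700 * 1000 * 1000 * 1000))   # 2.7 TB in bytes
seconds=$(( bytes * 8 / 10000000 ))    # 10 Mbps = 1e7 bits/sec
days=$(( seconds / 86400 ))
echo "~${days} days"                   # -> ~25 days before overhead
```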

Thanks to @KuJoe for originally talking about this setup on LET many moons ago...took me a while to get around to it but it's working great so far.

Thanked by: MasonR, dedicados

Comments

  • raindog308 said: Tagged: redundant-tentacle-hentai

    tks

  • dragon2611 Member
    edited May 2018

    I did try using the MinIO B2 gateway with Hyper Backup on a Synology as a test, but it seemed to get tied in knots trying to download a 0-byte file it had created.

    It was also a bit of a pain to set up, as it requires MinIO to be configured for DNS-style bucket names. (Hyper Backup doesn't support path-style.)

    Not sure if that's a Synology bug or a MinIO B2 gateway bug.

  • Jun Member

    I have a similar setup to yours. I use a Raspberry Pi 3 with Arch Linux ARM to back up 5TB of redundant-tentacle-hentai: rsync from my computer to the Pi, then b2 sync (Backblaze's first-party CLI) from the Pi to B2. So far, the Pi 3 is strong enough to provide stable NFS to my laptop and run scheduled one-way backups and two-way syncs. I'm very happy with the current setup.
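
    The B2 leg of that can be sketched with two commands (bucket name and paths are made up; --keepDays is the b2 sync option that retains overwritten/deleted versions):

    ```shell
    # one-time auth with a key from the B2 console
    b2 authorize-account <applicationKeyId> <applicationKey>

    # one-way sync from the Pi's disk to the bucket, keeping
    # old file versions around for 30 days
    b2 sync --keepDays 30 --threads 4 /mnt/backup b2://my-backup-bucket/backup
    ```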

    Thanked by: raindog308
  • raindog308 Administrator, Veteran

    @Jun said:
    I have a similar setup to yours. I use a Raspberry Pi 3 with Arch Linux ARM to back up 5TB of redundant-tentacle-hentai: rsync from my computer to the Pi, then b2 sync (Backblaze's first-party CLI) from the Pi to B2. So far, the Pi 3 is strong enough to provide stable NFS to my laptop and run scheduled one-way backups and two-way syncs. I'm very happy with the current setup.

    Do you have 5TB attached to the Pi?

  • Jun Member

    @raindog308 said:

    @Jun said:
    I have a similar setup to yours. I use a Raspberry Pi 3 with Arch Linux ARM to back up 5TB of redundant-tentacle-hentai: rsync from my computer to the Pi, then b2 sync (Backblaze's first-party CLI) from the Pi to B2. So far, the Pi 3 is strong enough to provide stable NFS to my laptop and run scheduled one-way backups and two-way syncs. I'm very happy with the current setup.

    Do you have 5TB attached to the Pi?

    Yes, 5TB on a 3.5" HDD attached to the Pi. Since I don't access the HDD frequently, I keep it in sleep mode most of the time. It serves as an automated local backup and as NFS for my laptop. My only problem is that USB is a bottleneck (20MB/sec), but that's sufficient for B2 backup or 4K redundant-tentacle-hentai streaming over NFS, so I'm happy with it.

    Thanked by: raindog308
  • KuJoe Member, Host Rep

    This thread reminded me to check on Synology's backup service and I was thrilled to see it's out of beta! The pricing is awesome, so I picked up a free trial to test it out. I'm currently using Synology's Hyper Backup to back up to Amazon Drive and Synology's C2 service, because it handles versioning and client-side encryption (C2 includes versioning free with the service, so now I don't need a 2TB plan for my 500GB of data).

    I recommend you check it out: https://c2.synology.com/en-us

  • raindog308 Administrator, Veteran
    edited May 2018

    @KuJoe said:
    This thread reminded me to check on Synology's backup service and I was thrilled to see it's out of beta! The pricing is awesome

    And that's my concern...unlimited for 70EUR/year.

    No one has ever been able to make that work.

    I realize it's not actually unlimited since it has to source from a Synology and Synology doesn't sell unlimited appliances...however, I have 12TB of Synology at home. I'm only backing up 3TB but...would they really take all 12TB for 7EUR/mo?

    My other concern is vendor lock-in. I realize I already have that with Synology to some extent, but at least with B2 on the backend I can get my stuff regardless of whether Synology goes out of business or has some bug that nukes my local arrays.

    Still...it looks promising. Strange that their only DC is in Frankfurt.

    After another night of working with the Synology CloudSync client, I have three further complaints:

    (1) really wish it was more verbose. I keep having to check Backblaze to see how far things have progressed because there's so little info on the Synology end.

    (2) I started rsyncing 89,000 files in a directory and when 50,000 or so were done, it told me it was syncing 111,000 files. The CloudSync status has never made any sense, which is disappointing.

    (3) No way to set up any kind of prioritization. Everything tries to go at max speed. I have ~25 shared folders to put to the cloud and it's disappointing I have to manually pause and unpause them to get the initial sync done in a sane order.

    I figure about 27 days left of uploading...which is another form of vendor lock-in :-)

    My main issue is what to do with the extra 16GB of RAM I had to add to my local NAS because Crashplan was such a pig...

  • raindog308 Administrator, Veteran

    Never mind...read too quickly. It's 70EUR/TB/year, not 70EUR/unlimited.

    So that's $320/year or so for me for 3TB. B2 will charge about half of that.

  • Neoon Community Contributor, Veteran

    @dedicados said:

    raindog308 said: Tagged: redundant-tentacle-hentai

    Thanked by: dragon1993
  • KuJoe Member, Host Rep

    raindog308 said: unlimited for 70EUR/year.

    I don't see an unlimited plan. It's $82.88 per TB per year after 1TB, which isn't bad. Considering it's basically a service for Synology clients only, you need to take the cost of the hardware into account too, so they're really getting $XXX out of you up front plus a monthly/yearly cost. I'm sure CrashPlan would still be a viable option if they charged a $200 setup fee for each user. :)

    Also I misunderstood the "unlimited versions" part. The service takes a backup of your data once per day and keeps 11 copies with a max retention of 30 days. Not bad for an off-site backup of a backup.

    raindog308 said: My other concern is vendor lock-in.

    Agreed. I currently back up my data to multiple providers, and I just built a custom NAS, so I'm not exclusively relying on Synology locally either.

  • sidewinder Member
    edited May 2018

    Doesn't Duplicati have a build that works on Synology?

    Anyone ever tried that? I've never had to restore from Duplicati, but as a backup tool it's pretty damn rad - the only complaint I have is the vague error and log messages.

    If your Synology device dies, how easy is it to restore the files it backs up?

  • raindog308 Administrator, Veteran

    @sidewinder said:
    Doesn't Duplicati have a build that works on Synology?

    Yes, but I haven't tried it.

    If your Synology device dies, how easy is it to restore the files it backs up?

    I think it depends on what the scenario is.

    Backing up only to the Synology...I think you'd need another Synology, though I've heard (haven't tested) that the drives are generic Linux RAID and could be popped into a generic x86 box.

    Backing up to a Synology that then goes on to B2...if the Synology weren't around, I'd probably use rclone or B2's own CLI to restore the files.

    I've occasionally retrieved individual files or subdirectories from B2 and it was pretty straightforward.
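
    That non-Synology restore path is short - bucket and file names here are made up:

    ```shell
    # grab one file back with the B2 CLI
    b2 download-file-by-name my-backup-bucket photos/2018/img_0042.jpg ./img_0042.jpg

    # or pull a whole subdirectory with rclone (remote "b2" set up via rclone config)
    rclone copy b2:my-backup-bucket/photos/2018 ./photos-2018 --progress
    ```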

  • raindog308 Administrator, Veteran

    A few days later, I'm a bit less sold on Synology's CloudSync.

    • logging is really poor. It won't tell you the percentage done - only the number of files to go, and that number is wildly out of sync with reality. E.g., I have a directory with 89,000 files. Synology reported 111,000+ to sync, and as it was syncing I'd see the count drop and then increase (!?!?) until it was done - e.g., it'd say 45K to go, then 46K, then 49K, then 45K, then 44K, etc.

    • it seems to resync a ton. E.g., my "rsync from the server to Synology" job always produces one new file, because I generate a fresh file listing and put the output in the directory's base before rsyncing. However, on a directory where nothing else has changed, I'll see Synology saying it's backing up 300+ files.

    • there's no way to prioritize one share over another. What I see is that usually only one shared folder is uploading at a time.

    • I've seen some weird files appear with the name "case conflict". No idea what those are.

    I may go back to rclone. This experience has improved my home backups overall, because now I'm backing up to a local NAS as well, but I think rclone does a better job on the upload side because you get more control and more visibility. However, it also means more work, custom scripts, etc. - the usual tradeoff.
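
    The extra control is mostly flags like these (all real rclone options; the share path, bucket, and log path are made up):

    ```shell
    # cron-scheduled one-way sync with throttling and visible progress
    rclone sync /volume1/photos b2:my-photos-bucket \
        --bwlimit 8M \
        --transfers 4 \
        --stats 30s --stats-one-line \
        --log-file /var/log/rclone-photos.log
    ```

    Keeping old versions is then handled by the bucket's lifecycle rules rather than by rclone itself.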

  • Look at Restic - it may run on a Synology NAS depending on the model (the ARM version worked on an RS816).
    Borg is also in the SynoCommunity repo.

  • ThyTe Member

    @dragon2611 said:
    I did try using the MinIO B2 gateway with Hyper Backup on a Synology as a test, but it seemed to get tied in knots trying to download a 0-byte file it had created.

    It was also a bit of a pain to set up, as it requires MinIO to be configured for DNS-style bucket names. (Hyper Backup doesn't support path-style.)

    Not sure if that's a Synology bug or a MinIO B2 gateway bug.

    Could you please tell me exactly how you did it?

    I have a cloud server with 8TB, and I want to enable MinIO on it and have Synology connect via S3 to make backups ...

  • @ThyTe said:

    Could you please tell me exactly how you did it?

    I have a cloud server with 8TB, and I want to enable MinIO on it and have Synology connect via S3 to make backups ...

    No necro! You should have PM'd him instead. And he only did it as a test (meaning he didn't stick with it), it was a PITA, and it's nearly 2 years later. Bro... have you heard of Google?

    Thanked by: angstrom
  • jh Member

    I tried it with our QNAP NAS, and although it works well, the speed is such that it's pretty useless for disaster recovery. For small files it would be a good option IMHO.
