Backblaze B2 Drops Download Price By 60% - Page 2
Comments

  • This thing about inodes is interesting. I've noticed trouble scp'ing around lots of small files so I sometimes turn not-so-active directories into tarballs. My backup/storage stuff has turned into a mass of ad hockery but maybe I'll try to consolidate some of it as plans expire.
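    A minimal sketch of that tarball approach (the directory name and remote host are hypothetical stand-ins):

```shell
# Bundle a not-so-active directory into one tarball so scp moves a single
# large file instead of thousands of small ones (and their inodes).
set -e
dir="photos-2016"                      # hypothetical directory name
mkdir -p "$dir" && touch "$dir/a.jpg"  # stand-in contents for the demo
tar czf "$dir.tar.gz" "$dir"
tar tzf "$dir.tar.gz"                  # verify the archive lists its contents
# then e.g.: scp "$dir.tar.gz" user@backup-host:/backups/
```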

  • @willie said:
    Hetzner Storagebox is 39.90 euro/m for 10TB with no bandwidth charges.

    Where do you see there are no bandwidth charges? I have stayed away from them because it has a cap and it's only "free" if it's hetzner<->hetzner.

  • jarjar Patron Provider, Top Host, Veteran

    @willie said:
    This thing about inodes is interesting. I've noticed trouble scp'ing around lots of small files so I sometimes turn not-so-active directories into tarballs. My backup/storage stuff has turned into a mass of ad hockery but maybe I'll try to consolidate some of it as plans expire.

    Things get really bad when you hit inode limits and they're the defaults Linux uses on creation of the file system ;)

  • ihadp Member

    @jarland said:

    ihadp said: You can purchase gsuite for $10/Month (1 User) which grants you all of the gsuite features including unlimited gdrive (storage & bandwidth).

    Yeah... fear not, I won't be the one to make them crack down on that. I need more reliable long term, changing my plans involves a lot of work, so it needs to be the lasting and scalable solution, not just a make-do with something that works for now. I'm talking about MXroute backups here, after all.

    Fair enough.

    Good luck!

    Thanked by: jar
  • Hadriel said: Where do you see there are no bandwidth charges? I have stayed away from them because it has a cap and it's only "free" if it's hetzner<->hetzner.

    Sorry. Yeah there are caps but they're high enough to get all the data out multiple times over. We were talking about backups so that's essentially free bandwidth.

    These aren't VPS, they're scp/sftp only, so you wouldn't want to use them as public-facing servers. I'm not sure what good unlimited bandwidth would do in that situation, if it was available.

  • I'd love to use this to back up my computers. Anyone got a method for how they do it? I looked at Cyberduck, which might work, but something like Arq might be better. Happy with rsync too.

  • saf31 Member

    10GB of storage and 1GB of daily transfer are free. I use this with XCloner (https://wordpress.org/plugins/xcloner-backup-and-restore/) to back up my WordPress site. So far I've had no issues.

  • I'd love B2 more if it could create folders inside a bucket.

  • @sibaper said:
    I'd love B2 more if it could create folders inside a bucket.

    You can...

  • sanvit said: You can...

    Using the b2 client? How?

  • @sibaper said:

    sanvit said: You can...

    Using the b2 client? How?

    I've only used the web interface and Cyberduck, but I'm pretty sure rclone can also do the job :) Never used the b2 client...

  • sibaper said: Using the b2 client? How?

    You're going about that the wrong way; B2 does not create folders and files itself, it stores whatever you put in it. Create the folder on your local system, upload the folder with b2_upload_file, and then you have a folder within your bucket.

  • jarland said: Has anyone had any success in getting this to work on Linux? I have not. I can't even get rclone to work with it, and it's not like its setup is complicated.

    Give HashBackup a try: hashbackup.com/

    Thanked by: jar, Pertti
  • sanvit said: rclone can also do the job

    nope :)

    Damian said: B2 does not create folders and files itself

    With the current b2 client, yes, but you can create a folder from the B2 web interface.

    Damian said: You're going about that the wrong way; B2 does not create folders and files itself, it stores whatever you put in it. Create the folder on your local system, upload the folder with b2_upload_file, and then you have a folder within your bucket.

    B2 won't let me create a folder from the b2 client, but you can create one via the web. And you can't upload an empty folder using the b2 client.

    I did some backup testing by uploading whole folders to B2 (full backups, not incremental). What I want is to create a folder named DD-MM-YYYY every time I push a backup. With the current b2 client I must create these folders manually via the web and then push my backup.

    I found a workaround to speed up the process without using the B2 web interface: auto-create a folder in /tmp every day, put a few files with random strings in it, sync that folder first, then sync all the files/folders.
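    That daily-dated staging step might look something like this (the bucket name and the `b2 sync` invocation are hypothetical and assume the official B2 CLI is installed and authorized):

```shell
set -e
prefix=$(date +%d-%m-%Y)               # folder name in DD-MM-YYYY form
staging="/tmp/b2-staging/$prefix"
mkdir -p "$staging"
# random filler file so the "folder" is not empty
head -c 16 /dev/urandom | od -An -tx1 | tr -d ' \n' > "$staging/seed.txt"
echo "staged: $staging"
# b2 sync "$staging" "b2://my-bucket/$prefix"   # hypothetical bucket; run after `b2 authorize-account`
```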

  • I will admit that I'm not sure what the point of creating an empty folder is, since you'd normally be uploading something that already exists. Folders do not exist as a construct in their system anyway (see https://www.backblaze.com/b2/docs/files.html and scroll down to "Folders (There Are No Folders)" for more information), and trying to create a folder in advance of storing something in it likely only saves milliseconds.
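    To make that concrete: a "folder" in B2 is nothing more than a prefix in a flat file name, and clients reconstruct the folder view by grouping keys on "/". A quick sketch (sample key names are made up):

```shell
# B2 stores flat file names; anything before a "/" is rendered as a folder.
# Grouping keys by their first path segment reproduces the "folder" view.
printf '%s\n' \
  "01-09-2017/db.sql" \
  "01-09-2017/site/index.html" \
  "02-09-2017/db.sql" \
  | cut -d/ -f1 | sort -u
```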

  • jarjar Patron Provider, Top Host, Veteran

    @Damian said:
    I will admit that I'm not sure what the point of creating an empty folder is, since you'd normally be uploading something that already exists. Folders do not exist as a construct in their system anyway (see https://www.backblaze.com/b2/docs/files.html and scroll down to "Folders (There Are No Folders)" for more information), and trying to create a folder in advance of storing something in it likely only saves milliseconds.

    Object storage in general is still a very abstract concept to many. The entire product is intended to be considered totally independent from how we traditionally view file system structure. It seems vague and hipsterish until you need it, but damn is it nice when you need it.

    That's why it's important when using object storage to drop your habits and approach it as an entirely new concept for file storage. It was never meant to replace a block mount or a Dropbox folder.

  • Still sounds like something that can be tossed on a dedi or several, with apparently better margins than most LET stuff sold around here. Don't need IP subnets, limited abuse potential compared with VPS, get to charge for outbound bw, what's not to like?

  • @hampered said:
    I'd love to use this to back up my computers. Anyone got a method for how they do it? I looked at Cyberduck, which might work, but something like Arq might be better. Happy with rsync too.

    Arq is pretty decent for backing up your own machine.

    Duplicati isn't bad either, but I've found Arq to be pretty much set-it-and-forget-it for the most part.

  • Hubic doesn't seem to be mentioned much anymore. Not the best client, but Swift Explorer and hubicfuse make it pretty useful. 10TB for €50/yr.

    I've been using it to back up a Samba server and various Windows PCs for a few years now. Curious to see how long it will take to pull everything back down to the Samba server via hubicfuse when a HDD inevitably fails (set it up as RAID 0 instead of RAID 1 - whoopsie).

    Thanked by: ljseals
  • @squibs said:
    Hubic doesn't seem to be mentioned much anymore. Not the best client, but Swift Explorer and hubicfuse make it pretty useful. 10TB for €50/yr.

    I've been using it to back up a Samba server and various Windows PCs for a few years now. Curious to see how long it will take to pull everything back down to the Samba server via hubicfuse when a HDD inevitably fails (set it up as RAID 0 instead of RAID 1 - whoopsie).

    Last time I checked out Hubic, they had a stupid 10Mbit/s speed limit.

  • Yeah, Oles has a rant in French about it here (06/02/2016, 10h32, last post on the page):

    https://forum.ovh.com/showthread.php/108343-hubiC-PCS-et-PCA

    He says Hubic was supposed to be a consumer product but people were using it to back up servers, so they're putting the slowdown in place and decreasing the price of regular OVH cloud storage to 0.4c/GB (or maybe that was just for migrations from Hubic). It's a year later and they didn't do the price drop, but instead they introduced Cloud Archive at 0.2c/GB + upload and download fees, so maybe that was a change in plan.

  • jarjar Patron Provider, Top Host, Veteran

    I still get invoices in French from Hubic. I can't log in. I gave up when they couldn't charge my debit card. Getting support seemed like more effort than my interest in the service at the time.

  • jarland said: I still get invoices in French from Hubic. I can't log in. I gave up when they couldn't charge my debit card.

    Your service is likely still alive. As far as I know, status.ovh.com/?do=details&id=12814 is still active and has been this way for a very long time. In fact, it looks like the last time I paid them was April 2016.

    Considering how the French work, they'll likely try to bill years in arrears despite it being their unsolved problem, but we'll cross that bridge when we get to it.

    Thanked by: jar
  • The thing is, if they didn't want Hubic customers to use it to back up the data on a fileserver, they should have specified this. I see nothing here that says it's a no-no: https://hubic.com/en/contracts/Contrat_hubiC_2014.pdf

    In fact I never considered that it might be an issue for them, nor do I understand why it is an issue for them.

    I'm using less than 2TB currently, and only adding incremental updates - probably ~10-20MB/day. The day will come when a drive fails locally and I'll use a lot of bandwidth resyncing over several days - I'm not too bothered about the network speeds.

    Given that I could be hammering the entire 10TB, rotating stuff in and out, and sharing links with the general public, I would have thought that would be a far more disruptive use of the service. I don't see that backing up a fileserver is any more objectionable than backing up a large directory on a personal PC.

  • They said in the ToS that it was 10Mbit and I guess they expected people to believe it. But in fact for a while you could use it at full speed, like 1Gbit, so people used it for server backup. I'd guess they were willing to tolerate some amount of that, but it got out of hand. So they slowed it down to 10Mbit (I think actually 20Mbit), which is what they advertised to begin with. You're still allowed to back up servers AFAIK, as long as you don't mind the slow speed.

  • I'm happy with the slow, cos of the cheap, so long as the data is good if/when I need to retrieve it.

  • squibs said: I'm happy with the slow, cos of the cheap, so long as the data is good if/when I need to retrieve it.

    Good luck when restoration time comes.

  • @sibaper said:

    squibs said: I'm happy with the slow, cos of the cheap, so long as the data is good if/when I need to retrieve it.

    Good luck when restoration time comes.

    Why? You think it will be slow, or you think the data will be corrupt?

  • Because of the slow speed.
    I used them for 6 months (paid), but I couldn't predict when they'd be fast or slow; their speed is a mystery.

  • Speed doesn't bother me much - if I need a resync after a failed disk, and I need a specific file while I'm waiting for it to complete, I can grab it with Swift Explorer.
