
    My impressions on hubiC

    Hello everyone!

    I'd like to share my first impressions (and I stress the word FIRST) of the cloud/backup service offered by OVH: hubiC. I'm posting them to find out what you all think of the service and whether you agree with my impressions.

    When, early this month, they reduced the price from 10€ to 5€ per month for 10TB of online storage, I simply couldn't resist. I decided to try the service for the first time and went for a single month (instead of the annual subscription), just to try it out.

    Indeed, I needed a cloud/backup service to upload all my data and I figured 10TB would suffice.

    Let's start with where I use it: I have two dedis (one with 100Mbps unmetered and one with 500Mbps unmetered). Both run Ubuntu Server x64.


    hubiC is used mainly for two purposes: cloud and/or backup. There is a subtle but important difference: when you use the cloud, you rely on file synchronization. You can see the files on all your devices and on the website. When you upload something from one computer, all the other computers will also download and store that file locally.
    The backup system is a pure and simple copy (upload) of your file(s)/folder(s) into hubiC. You don't need to synchronize them, and by default the backed-up files will not be downloaded to other computers sharing the same hubiC account.

    Whether you need a simple backup of your files or something more is up to you. Many of us here need a lot of space to store data (OS snapshots, etc.), and backup is the best and easiest solution for that.


    Now let's get to the first impressions. I've been using the service for about 3 days. I've used both the cloud and the backup (though not the latter very much).

    Tools I used: hubiCfuse and the official hubiC application for Linux.

    hubiCfuse: extremely unreliable. The idea is great: you can mount your cloud space and see it as an external hard drive, so you don't have to download files locally. Unfortunately, operations (renaming, deleting, moving, etc.) are extremely slow. Copying files is totally unreliable: I copied 80GB and it ended up "segmenting" 40GB somewhere, while the other 40GB weren't even complete. I read online that this is more of an issue with the platform hubiC uses than with the application itself.
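
    For anyone who wants to reproduce my test, this is roughly how hubiCfuse is set up; the settings-file keys and mount options below follow the hubicfuse README of the time, so treat them as assumptions to double-check against the current version.

    ```
    # ~/.hubicfuse holds the API credentials created in the hubiC web console
    cat > ~/.hubicfuse <<'EOF'
    client_id=<your_api_client_id>
    client_secret=<your_api_client_secret>
    refresh_token=<your_refresh_token>
    EOF
    chmod 600 ~/.hubicfuse

    # Mount the cloud space so it behaves like an external drive
    mkdir -p /mnt/hubic
    hubicfuse /mnt/hubic -o noauto_cache,sync_read,allow_other
    ```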

    hubiC for Linux: this certainly works better and more stably. It has lots of commands you can use for both backups and cloud sync. So far I've only used it for the backup option. Perhaps the only downside I've seen is the huge amount of resources the Mono framework needs to run hubiC backups.


    For now I won't provide speed benchmarks or anything like that; it's too soon and I need to get to know the service better. However, in a week or two I'll certainly share a more detailed experience.

    Conclusions so far: don't use hubiC for the cloud. It's quite buggy and slow. Files aren't always uploaded correctly and you'll often get errors. If, however, you need a huge online backup space, then perhaps hubiC is what you're looking for.

    Please share your experience :)

    Thank you!


    Comments

    • rm_ Member
      edited April 2015

      Issam2204 said: Unfortunately, operations (renaming, deleting, moving, etc.) are extremely slow

      Renaming and moving are basically the same operation, and both cause a complete re-upload of the renamed or moved files. Just don't rename or move anything. Deletions should be reasonably fast (okay, maybe 1-2 seconds per file, but then again, don't store and/or delete thousands of small files at a time).

      Copying files is totally unreliable: I copied 80GB and it ended up "segmenting" 40GB somewhere, while the other 40GB weren't even complete.

      80GB in one file? It does not support files larger than 5GB without segmenting, and segmenting in the current hubicfuse does not work properly.
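
      For context: this segmenting is OpenStack Swift's standard large-object mechanism, not something hubicfuse invented. The client splits the file into chunks in a separate segments container and writes a manifest object that stitches them back together. With the plain python-swiftclient CLI it looks like the sketch below; hubiC's non-standard authentication means you can't point this at hubiC directly, it's only to illustrate the mechanism.

      ```
      # Upload as a Dynamic Large Object: 1GB pieces land in backups_segments,
      # plus a manifest object in backups that represents the whole file
      swift upload --segment-size 1073741824 backups big-disk-image.img

      # The individual chunks are visible here:
      swift list backups_segments
      ```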

    • Thank you rm_! With this extra info I know more about hubiCfuse. Do you use the official tool for backups? What are your impressions? Is it reliable?

    • Bochi Member

      I'm using hubiC (annual subscription) for backup purposes in combination with duply/duplicity, and it works great!
      I'm currently using about 300GB, maxing out the network port on my Kimsufi most of the time and seeing about 300-500Mbit/s from DO. Many people complain about the speeds, but in my experience it was slow only once or twice...

    • My interest was also for backup purposes. When I upload a file it takes forever, and not because the line is slow, but because the status keeps telling me "The backup X is created. The files will be sent now." In your experience, how long do you have to wait for it to actually back up files? In this case I'm uploading a 10GB file.

    • Bochi Member

      @Issam2204 said:
      In your experience, how long do you have to wait for it to actually back up files? In this case I'm uploading a 10GB file.

      Not sure, are you talking about the Linux client provided by hubiC itself? Can't give you any numbers on that, sorry.
      The numbers I gave come from my duplicity setup with chunk sizes between 50 and 500MB.
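
      For reference, a duplicity run with an explicit chunk size looks roughly like this. The cf+hubic:// URL scheme and the ~/.hubic_credentials file are what the duplicity docs describe for the direct hubiC backend, so treat both as assumptions to verify on your version; through hubic2swiftgate you'd use a swift:// target instead.

      ```
      # Full backup cut into 250MB volumes (the default volume size is much smaller);
      # the hubic backend reads credentials from ~/.hubic_credentials
      duplicity full --volsize 250 /home/user/data cf+hubic://backups

      # Later runs only upload changed data
      duplicity incremental --volsize 250 /home/user/data cf+hubic://backups
      ```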

    • Neoon Member

      My ticket for a lost payment has been pending for 9 days. Pretty neat support there. Useless bullshit.

    • @Bochi said:
      The numbers I gave come from my duplicity setup with chunk sizes between 50 and 500MB.

      Yes, I'm talking about the Linux client. But you got me interested in this duplicity. I'll look up more info on it :) Thanks!

      @Infinity580
      I'm sorry to hear that. Fortunately my payment went through smoothly.

    • Bochi Member

      @Issam2204 said:
      But you got me interested in this duplicity. I'll look up more info on it :)

      I'm using a setup with duply as a wrapper for duplicity over hubic2swiftgate (https://github.com/oderwat/hubic2swiftgate), but the latest version of duplicity should come with (direct) hubic support anyway.
      Let me know if you need to know anything else. ;)
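
      If it helps, the skeleton of my setup is below. The variable names are standard duply conf; the gateway URL and credentials are placeholders for whatever your hubic2swiftgate instance exposes.

      ```
      # Create a profile, then edit ~/.duply/hubic/conf
      duply hubic create

      # Relevant lines from ~/.duply/hubic/conf:
      GPG_KEY='ABC123DE'        # your GPG key id, or 'disabled' for no encryption
      GPG_PW='gpg-passphrase'
      TARGET='swift://hubic_backups'
      SOURCE='/home/user/data'
      MAX_AGE=2M                # purge backup chains older than two months

      # duplicity's swift backend takes credentials from the environment
      # (values here assume a local hubic2swiftgate)
      export SWIFT_AUTHURL='http://localhost:8080/auth/v1.0'
      export SWIFT_USERNAME='hubic'
      export SWIFT_PASSWORD='secret'

      duply hubic backup
      ```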

    • @Bochi

      Thanks! But there are too many technical terms there hahaha

      So what should I use? hubic2swiftgate or duplicity? :)

    • Bochi Member
      edited April 2015

      Oh, then you need to read about the whole thing a bit more first. ;)
      hubic2swiftgate is just there to work around the authentication mechanism hubiC implements instead of the one OpenStack's Swift normally has, but as far as I know they (duplicity) have already implemented direct hubiC support through pyrax.

    • I'll certainly read more about it. Thank you Bochi!

    • I installed duplicity and I must say it rocks! It's fast, secure, solid and very powerful. But my needs are a bit different: I would like to back up my files as they are; I'd prefer to avoid compressing them and segmenting them into tar archives.

      My question is: is it currently possible to upload a folder with lots of files (< 10k) to hubiC via the command line and keep the files as they are? Something that doesn't do segmentation or hang like the official app.

      Perhaps it's just me being unable to use the official application correctly. :(

    • Bochi Member
      edited April 2015

      As for the encryption and compression part, you can disable these features in duplicity via command-line params (example below), but I don't see such an option for the packing.
      But why don't you just set the folder you want to upload as your synchronization folder in the hubic command-line application?
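
      Concretely, the params I mean are these two; check them against your duplicity man page, but both exist in recent versions:

      ```
      # Volumes are still created, but their contents are neither encrypted nor gzipped
      duplicity full --no-encryption --no-compression \
          /home/user/data cf+hubic://backups
      ```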

    • Thanks Bochi!

      I prefer to avoid synchronization because when I delete the files from that folder they will also be deleted from the cloud. :(

    • Issam2204 Member
      edited April 2015

      For example, right now I'm using the official tool to synchronize a folder where I put a Debian DVD. I'm also downloading a 512MB backup I made yesterday with duplicity:

      ```
      $ hubic status
      State: Busy
      Up: 102.4 KB/s (0 B/s) Down: 96 KB/s (0 B/s)

      Account: X
      Synchronized directory: /home/issam/hubiC/
      Usage: 970.29 MB/10 TB

      Queue:
      Uploads: 0 (0 B) + 1 running
      Downloads: 17 (425.6 MB) + 3 running
      Misc: 0 + 0 running

      Running operations:
      Upload for /home/issam/hubiC/Testing/debian-7.8.0-i386-DVD-1.iso (48.47 MB/3.72 GB)
      Download for /home/issam/hubiC/duplicity-full.20150417T224645Z.vol1.difftar.gpg (12.84 MB/25.05 MB)
      Download for /home/issam/hubiC/duplicity-full.20150417T224645Z.vol12.difftar.gpg (18.72 MB/25.03 MB)
      Download for /home/issam/hubiC/duplicity-full.20150417T224645Z.vol2.difftar.gpg (1 MB/25.03 MB)

      Last events:
      [4/18/2015 6:02:10 AM|Info] Click on this icon to access your hubiC.
      ```

      As you can see from the Up/Down rates, hubiC is very slow :(

    • Nomad Member

      With hubicfuse from Online.net I was uploading at around 30MB/s. hubiC is actually not that slow, but the official client sucks.

      I don't know about Duplicity though.


      ...
      ...

    • @Issam2204 said:
      My interest was also for backup purposes. When I upload a file it takes forever, and not because the line is slow, but because the status keeps telling me "The backup X is created. The files will be sent now." In your experience, how long do you have to wait for it to actually back up files? In this case I'm uploading a 10GB file.

      For 7GB, around 90 minutes, sometimes faster.

      I'm using this script for the Busy status issue: http://pastebin.com/vyqsTRch

      This was last night's (GMT+7) network traffic on my backup server (RA 2G Sandbox).

      Enjoy...


      "The quieter you become, the more you are able to listen" dgprasetya.com

    • Neoon Member
      edited April 2015

      A little update: this is what I got today. A bit weird, sales working on weekends, but okay.

      Hello,

      We have just finalized your credit note ********** relating to your invoice **********. Its amount is -4.96 EUR.

      You will receive the refund of this amount separately.

      The PDF of the credit note ********** can be viewed on

      Please note that you can also find all of your billing in your hubiC space,
      at the following address:
      https://hubic.com/home

      The hubiC team

      Seems like a refund; they only needed 17 days for it. Let's see if it lands back on my CC on Monday/Tuesday.

    • Glad to hear it, Infinity580 :)

    • @dgprasetya

      Thanks for your answer. Are you saying that it's normal for backups to take this long? Or are you trying to show me that hubiC is not slow? I'm a bit confused :)

    • hubicfuse has been working OK, but like others have said it's quirky most of the time. My only working "trick", if you want to call it that, is to cp files one at a time.

      Anything other than that and it freezes or uploads an incomplete file. I've managed 900GB like that.

      FXPing the file from the same server as the mount seems to work as well.
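
      In script form, that trick plus a crude size check to catch incomplete uploads could look like this (plain shell, nothing hubiC-specific; paths are examples):

      ```
      #!/bin/sh
      SRC=/data/to-backup
      DST=/mnt/hubic/default/backup

      # Copy one file at a time into the hubicfuse mount and verify byte counts
      for f in "$SRC"/*; do
          name=$(basename "$f")
          cp "$f" "$DST/$name"
          if [ "$(stat -c%s "$f")" != "$(stat -c%s "$DST/$name")" ]; then
              echo "INCOMPLETE: $name" >&2
          fi
      done
      ```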

    • This is quite frustrating for me, especially because I would like to upload files bigger than 4GB. I understand that everyone has different needs, so hubiC could be perfect for someone and absolutely useless for someone else. However, what astonishes me is: how is it possible that there isn't a piece of software that:

      1) would allow backup/upload of files/folders to hubiC

      2) is not capped by file size

      3) doesn't segment files

      4) doesn't compress files (just leave them as they are).

      5) doesn't use huge CPU resources like the official app (because of the Mono framework) or duplicity (because of compression and segmentation).

      I know I'm not asking for something easy, but so far I haven't found anything that perfectly suits my usage :( It's quite a shame because I really needed this backup space.

    • @Issam2204 said:
      dgprasetya

      Thanks for your answer. Are you saying that it's normal for backups to take this long? Or are you trying to show me that hubiC is not slow? I'm a bit confused :)

      I think that is normal, not slow. For 10GB it's sometimes less than 60 minutes.


    • But how come? Isn't the backup with the official tool a simple copy (upload) of the files to your hubiC account? As far as I know there is no compression or encryption. Why does it take so much time to upload, then?

    • Strangely, today (using the official application) the upload (backup) isn't that "slow": around 2 MB/s.

    • hubicfuse is fast; I just tested it right now from a DO droplet in the new Frankfurt 1 zone on a Debian 7:

      ```
      # dd bs=1M count=1024 if=/dev/zero of=/mnt/hubic/default/Documents/test2
      1024+0 records in
      1024+0 records out
      1073741824 bytes (1.1 GB) copied, 116.445 s, 9.2 MB/s
      ```

      So I guess I'll be using it for backups too.

    • hubiCfuse is the fastest of all the solutions you can find online. I guess my real problem was with hubiC and its 5GB-per-file limit. I don't like this file segmenting business. It has already messed up my space by taking 400MB that I can't find anywhere.

    • marrco Member
      edited April 2015

      @Issam2204 they removed that limit a few months ago. Just use hubicfuse and it won't eat 400MB of space.

    • https://github.com/TurboGit/hubicfuse/issues/54

      It looks like segmenting is the issue here. I know nothing about cloud storage, but is segmenting something all cloud providers do? Or is it something inherent to the client (web browser/app) used?

      Because with Google Drive I can upload files of up to 5TB from the web browser, that's crazy! While hubiC is restricted to 1GB.

    • Have you ever uploaded 5TB from a browser without it crashing?

    • This is from OVH (Canada) to Yomura using s3fs (note the direction):

      ```
      dd bs=1M count=1024 if=/dev/zero of=/mnt/test
      1024+0 records in
      1024+0 records out
      1073741824 bytes (1.1 GB) copied, 31.1327 s, 34.5 MB/s
      ```
    • rm_ Member

      @marrco @MarkTurner what happened to the good old conv=fdatasync?
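
      Point being: without it, dd can end up measuring the page cache rather than what actually reached the far end. The honest version of the benchmark:

      ```
      # conv=fdatasync forces written data to be flushed before dd reports
      # throughput, so the figure reflects the real transfer, not cached writes
      dd bs=1M count=1024 if=/dev/zero of=/mnt/test conv=fdatasync
      ```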

    • Hey all, if you want to use hubiC as a drive without problems, I wrote an article. I think I have managed to overcome all the problems. It took over 2 weeks of learning, setting up, tuning and fixing...
      http://riku.titanix.net/?p=629
      I can upload, download and stream Full HD (without downloading the whole file) at the full speed of my 100/50M connection. I'm about to use it on my media PC as storage, as well as for backups. :)

    • jar Provider
      edited December 2015

      @riku said:
      Hey all, if you want to use hubiC as a drive without problems, I wrote an article. I think I have managed to overcome all the problems. It took over 2 weeks of learning, setting up, tuning and fixing...
      http://riku.titanix.net/?p=629
      I can upload, download and stream Full HD (without downloading the whole file) at the full speed of my 100/50M connection. I'm about to use it on my media PC as storage, as well as for backups. :)

      Do you actually plan on putting any instructions on that page (like maybe how you overcame "all problems"), or are you spamming your blog link in hopes that someone will click the affiliate link inside?

    • No need to click the link, I have it full already; I can't get more space from clicks, the maximum is 5 users... I posted the link because it's complicated and I explained it there, to avoid putting a big post here. And yes, I will explain once I have tested this more, but for now it already seems to work a lot better than it does for most users here.

    • You're quite the writer, amigo.

      J.K. Rowling should retire tomorrow.

      riku said: No need to click the link, I have it full already; I can't get more space from clicks, the maximum is 5 users... I posted the link because it's complicated and I explained it there, to avoid putting a big post here. And yes, I will explain once I have tested this more, but for now it already seems to work a lot better than it does for most users here.


    • I'm trying out S3QL with hubiC (and the PHP Swift gateway to make login work). Upload, all the way from NZ, is absolutely fine: I can do about 37Mbit/s. Download, on the other hand, is abysmally slow. I've tried mounting my S3QL filesystem from a Scaleway ARM server and even there I'm only downloading at 3 or 4Mbit/s. Does this match others' experiences?
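
      For anyone wanting to reproduce the setup, it's roughly the following; the gateway URL, container and credentials are placeholders for whatever your local Swift gateway exposes.

      ```
      # S3QL credentials for its swift backend, pointed at the local gateway
      mkdir -p ~/.s3ql
      cat > ~/.s3ql/authinfo2 <<'EOF'
      [hubic]
      storage-url: swift://localhost:8080/s3ql_data
      backend-login: hubic_user
      backend-password: secret
      EOF
      chmod 600 ~/.s3ql/authinfo2

      # One-time filesystem creation, then mount
      mkfs.s3ql swift://localhost:8080/s3ql_data
      mount.s3ql swift://localhost:8080/s3ql_data /mnt/s3ql
      ```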

      @deadlyllama said:
      I'm trying out S3QL with hubiC (and the PHP Swift gateway to make login work). Upload, all the way from NZ, is absolutely fine: I can do about 37Mbit/s. Download, on the other hand, is abysmally slow. I've tried mounting my S3QL filesystem from a Scaleway ARM server and even there I'm only downloading at 3 or 4Mbit/s. Does this match others' experiences?

      Maybe it's Scaleway; it was slow for me too.

    • vfuse Member, Provider

      I was not able to set up hubiC as a mounted drive; I'm using Amazon Cloud Drive now with acd_cli + encfs for encryption and it works great. 200-900Mbit upload/download. Works great with Plex :)
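
      In outline the pipeline is the following; paths are examples, and encfs reverse mode means only ciphertext ever leaves the box (no second on-disk copy of the data):

      ```
      # Expose an encrypted, on-the-fly view of the plain files
      encfs --reverse /data/plain /data/encrypted-view

      # Refresh the local node cache, then push the ciphertext to Amazon Cloud Drive
      acd_cli sync
      acd_cli upload /data/encrypted-view /backups
      ```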


    • Bumping.

      Does anyone know how to make it so the mounted hubic dir is owned by non-root?

      Adding chown_enable=true to the config doesn't seem to help.

    • GM2015 Member
      edited February 2016

      I've had too many issues with hubicfuse on Ubuntu last year.

      Files were not uploaded properly, got corrupted, and so on.

      I only use the free plan, so I've just dumped some encrypted ISOs there and left it as is.

      Also, thanks to @domainbop, look at this:

      https://forum.ovh.com/showthread.php/108343-hubiC-PCS-et-PCA

      https://vpsboard.com/topic/2423-hubic-by-ovh-25-gb-of-free-storage-in-the-cloud33/#comment-104732


      Nihim said: Bumping.

      Does anyone know how to make it so the mounted hubic dir is owned by non-root?

      Adding chown_enable=true to the config doesn't seem to help.


    • Nihim said: Does anyone know how to make it so the mounted hubic dir is owned by non-root?

      I've always created a dedicated user to do the mounting, added that user to the fuse group, and mounted using the allow_other option. The hubic dir ends up owned by that user, not root, and all users can write to the hubic store. I create a dedicated user to make it trivial to traffic-shape the upload.
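
      In command form, that setup is approximately this (user name and mount point are examples; allow_other for non-root mounts also needs user_allow_other enabled in /etc/fuse.conf):

      ```
      # Dedicated user, member of the fuse group
      useradd -m hubicuser
      usermod -aG fuse hubicuser

      # Mount as that user; allow_other lets every local user write to the store
      sudo -u hubicuser hubicfuse /mnt/hubic -o allow_other
      ```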

      GM2015 said: Files were not uploaded properly, got corrupted, and so on.

      I've only ever experienced problems when there was a lot of concurrent activity, and that was usually the mount point disappearing or hanging, not file corruption. Recently I've always used rsync to transfer files from a buffer area on a local server, so the transfer is sequential. I also strictly avoid large files (> 4GB) and the segmentation they trigger, and I have never experienced file corruption from a CentOS 6 system.
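
      The transfer step is then just rsync into the mount, something like the line below (paths are examples; --inplace avoids the write-then-rename step rsync normally does, since a rename means a full re-upload on hubiC):

      ```
      # Sequential, one file at a time, no temp-file rename on the hubic side
      rsync -rtv --inplace --whole-file /srv/buffer/ /mnt/hubic/default/backup/
      ```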
