Ramnode is good for big data jobs

I bought the $5/month storage VM from them; it includes 315 GB disk, 2 cores, and 1 GB memory.
The disk size suits our requirements, and its I/O is not bad either, even though the disks are HDDs.
The two vCPU cores are important to us, since we run big data statistics on the VM (using R and Python).
Their LA node's network speed to Asia is decent: I downloaded 60 GB of data from Hong Kong in about 3 hours (roughly 45 Mbps sustained).
So, if you have big data tasks to run in a VM, I'd suggest RamNode.
(I didn't get any discount from them, and I'm not an affiliate either.)
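
For anyone curious how we fit this into 1 GB of RAM, here is a rough sketch of the kind of chunked processing we do in Python (the file name and column names below are made-up placeholders, not our real pipeline):

    # Rough sketch: stream a gzip-compressed, tab-separated text file in chunks
    # so peak memory stays well under 1 GB. File and columns are placeholders.
    import pandas as pd

    totals = {}
    reader = pd.read_csv(
        "counts.tsv.gz",       # placeholder input file (compressed text data)
        sep="\t",
        compression="gzip",
        chunksize=100_000,     # process ~100k rows at a time
    )
    for chunk in reader:
        # aggregate incrementally instead of loading the whole file into memory
        for key, value in zip(chunk["gene"], chunk["count"]):
            totals[key] = totals.get(key, 0) + value

    print(len(totals), "keys aggregated")

Because each chunk is discarded after it is aggregated, memory use stays roughly constant no matter how large the input file is.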

Thanked by 1Nick_A

Comments

  • SirFoxy Member
    edited October 2020

    I'm not using RamNode for what you are, rather for my own consulting brand, but agreed: I'd say they're the highest-quality homegrown LET brand. I pay them $4 per month and get more than a 100x return, so it's a no-brainer for me.

    Thanked by 2crpatel Nick_A
  • good deals

  • Stability is very good. I have run a VPS with them since 2014; over the years there have been a couple of maintenance windows (fewer than I can count on one hand), all done in the expected time. I never had to reinstall (I will in the next 20 days, as CentOS 6 is EOL), and I only contacted support this year to upgrade, since my 256/256 plan was getting too small for the 64-bit OS I plan to install.

    Thanked by 1Zshen
  • Big data is 315GB and 1GB RAM? I really expected 10+TB and 256+GB RAM minimum for "big data".

  • @TimboJones said:
    Big data is 315GB and 1GB RAM? I really expected 10+TB and 256+GB RAM minimum for "big data".

    I am doing bio, which is text based. For each project, the compressed text data is xx GB, not that big.

    Thanked by 1vpsguy
  • @sportingdan said:

    @TimboJones said:
    Big data is 315GB and 1GB RAM? I really expected 10+TB and 256+GB RAM minimum for "big data".

    Ah, I've seen this before somewhere else.

  • fkj Member

    @jpeng said:

    @TimboJones said:
    Big data is 315GB and 1GB RAM? I really expected 10+TB and 256+GB RAM minimum for "big data".

    I am doing bio, which is text based. For each project, the compressed text data is xx GB, not that big.

    To be honest, on this forum, when it comes to "big data", I'd expect it to mean something like finishing processing xx GB of data in xx seconds.

  • @jpeng said:

    @TimboJones said:
    Big data is 315GB and 1GB RAM? I really expected 10+TB and 256+GB RAM minimum for "big data".

    I am doing bio, which is text based. For each project, the compressed text data is xx GB, not that big.

    Exactly.

  • Could you please show me the Geekbench 5 score of your VPS? A score > 800 is not bad.

  • Nick_A Member, Top Host, Host Rep

    Thanks for your business!
