How can file hosting sites provide 50 TB of storage per uploader? Are they cheating?

I talked with some affiliate person from some file host. He said, "We provide 50 TB of storage per uploader." So what kind of server do you think they use? Is it just cheating/trolling? How can they survive? Or what kind of cheap storage host offers that much storage?


Comments

  • ask them again

    Thanked by: ehab, Pwner, doghouch
  • mksh Member
    edited February 2018

    Distributed storage and overselling, or just the latter.
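
    For the overselling half, the trick is pure arithmetic: provision for what uploaders actually store, not for what they were promised. A minimal sketch with made-up numbers (every figure here is an assumption, not data from any real host):

    ```python
    # Hypothetical overselling math: promise big, provision for actual usage.
    PROMISED_TB_PER_USER = 50      # the headline offer
    ACTUAL_TB_PER_USER = 0.2       # assumed average real usage per uploader
    USERS = 10_000
    HEADROOM = 1.5                 # safety margin for growth spikes

    promised_tb = USERS * PROMISED_TB_PER_USER
    provisioned_tb = USERS * ACTUAL_TB_PER_USER * HEADROOM

    print(f"promised:    {promised_tb:,} TB")        # 500,000 TB on paper
    print(f"provisioned: {provisioned_tb:,.0f} TB")  # 3,000 TB of real disks
    print(f"oversell:    {promised_tb / provisioned_tb:.0f}x")
    ```

    As long as average usage stays low, the promise is nearly free; the model only breaks if everyone actually uploads 50 TB.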

  • @sibaper said:
    ask them again

    So you can't answer?

  • Ympker Member
    edited February 2018

    @filelover said:

    @sibaper said:
    ask them again

    So you can't answer?

    How should we know an unnamed file host's practices?

    1fichier, according to Reddit sources, suspends accounts exceeding 30 TB.

    Thanked by: uptime
  • I can answer. They make extensive use of an advanced technique known as "extreme deduplification". Ultimately every terabyte is reduced to a gigabyte. Subsequently, each gigabyte is reduced to a megabyte, and then further down to a kilobyte. And so on.

    Ultimately all data is represented by a single superbit, which is able to simulate the entire 50 TB per uploader by oscillating values of zero and one at appropriate intervals, but very quickly - much faster than the eye can see! So we never realize the illusion ....

  • @filelover said:

    @sibaper said:
    ask them again

    So you can't answer?

    Why not ask that affiliate person from that file host?

    I mean, it would be difficult for @sibaper or for anyone here to answer your question. You haven't even told us which file host.

  • @sibaper they are cheating... hide your files, hide your dirs.

  • filelover said: some affiliate person from some file host

    filelover said: So you can't answer?

    What's your deficiency?

    Thanked by: MCHPhil, lazyt
  • They don't answer because it's their secret.
    I am asking this from the server side. What kind of servers provide this for cheap?

  • @filelover said:
    They don't answer because it's their secret.
    I am asking this from the server side. What kind of servers provide this for cheap?

    Define cheap?

  • uptime Member
    edited February 2018

    Also (forgot to mention) they are using blockchain technology to leverage a synergistic distributed economy of scale. They can then pass the savings along to you!

  • @uptime said:
    Also (forgot to mention) they are using blockchain technology to leverage a synergistic distributed economy of scale. They can then pass the savings along to you!

    True, but let's not forget about the advantages of multidimensional quantum disks.

    Thanked by: uptime, MikePT
  • FHR Member, Host Rep

    @uptime Cloud-powered distributed replicated blockchain technology with DataMind™ deep learning assisted deduplication technology is really the way to go these days.

  • #disks!

    Thanked by: that_guy, mksh, kkrajk
  • raindog308 Administrator, Veteran

    FHR said: assisted deduplication technology

    "Hi, we replaced the poorly encoded 'Game of Thrones Complete Series' files in your account with an existing H.264-encoded set that is already in 7 million other accounts."

  • uptime Member
    edited February 2018

    @FHR said:
    @uptime Cloud-powered distributed replicated blockchain technology with DataMind™ deep learning assisted deduplication technology is really the way to go these days.

    Hmmm .. perhaps you be making some kind of jokey-joke ...? But there is possibly a connection to be made from compression to AI: https://en.m.wikipedia.org/wiki/Hutter_Prize

    The Hutter Prize is a cash prize funded by Marcus Hutter which rewards data compression improvements on a specific 100 MB English text file. Specifically, the prize awards 500 euros for each one percent improvement (with 50,000 euros total funding)

    [...]

    The goal of the Hutter Prize is to encourage research in artificial intelligence (AI). The organizers believe that text compression and AI are equivalent problems. Hutter proved that the optimal behavior of a goal seeking agent in an unknown but computable environment is to guess at each step that the environment is probably controlled by one of the shortest programs consistent with all interaction so far. However, there is no general solution because Kolmogorov complexity is not computable.

    More about this, from Herr Doktor Hutter's mouth, as it were: http://prize.hutter1.net/

    (And the 100 MB file to compress is actually a sample of wikipedia, aka the sum of all human knowledge!)

    TL;DR - "brevity is the soul of wit"

  • @filelover said:
    I talked with some affiliate person from some file host. He said, "We provide 50 TB of storage per uploader." So what kind of server do you think they use? Is it just cheating/trolling? How can they survive? Or what kind of cheap storage host offers that much storage?

    I will provide you with unlimited, infinite storage if you pay me $0.01 per kb for bandwidth. That's 8e+9 kb per TB, i.e. about $80 million per TB transferred.
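
    A quick sanity check on that joke offer, taking "kb" as kilobits and 1 TB as 10^12 bytes:

    ```python
    # Joke pricing: $0.01 per kilobit of bandwidth.
    PRICE_PER_KB = 0.01            # dollars per kilobit
    KB_PER_TB = 1e12 * 8 / 1e3     # 10^12 bytes = 8e12 bits = 8e9 kilobits

    print(f"{KB_PER_TB:.0e} kb per TB")                # 8e+09
    print(f"${PRICE_PER_KB * KB_PER_TB:,.0f} per TB")  # $80,000,000
    ```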

  • @uptime said:

    John Oliver can confirm:

    Thanked by: uptime, jvnadr
  • Neoon Community Contributor, Veteran
    edited February 2018

    50 TB, that's a lot. Imagine all these use cases.

  • @raindog308 said:

    FHR said: assisted deduplication technology

    "Hi, we replaced the poorly encoded 'Game of Thrones Complete Series' files in your account with an existing H.264-encoded set that is already in 7 million other accounts."

    I don’t think that anyone has 50 terabytes worth of GoT.

  • wieners!

  • raindog308 Administrator, Veteran

    uptime said: Hmmm .. perhaps you be making some kind of jokey-joke ...? But there is possibly a connection to be made from compression to AI: https://en.m.wikipedia.org/wiki/Hutter_Prize

    uptime said: The goal of the Hutter Prize is to encourage research in artificial intelligence (AI). The organizers believe that text compression and AI are equivalent problems.

    I've participated in the Hutter Prize (no, never won), and really, it has virtually nothing to do with AI. It's just a compression contest.

    It's not even a general compression contest. It's specifically for the first 100 MB of the English version of Wikipedia as of the date the prize started. If you can compress that further than anyone else, you win, even if the code you deliver is not generally applicable.

    I think the organizers are completely wrong. It's the same fallacy once held by people working on chess engines: that by figuring out how to play chess, computers would develop AI. Nope... they just develop really specific algorithms (backed by tons of speed) for chess which cannot be generalized. Same thing here.

    Thanked by: uptime
  • raindog308 said: by figuring out how to play chess, computers would develop AI. Nope... they just develop really specific algorithms (backed by tons of speed) for chess which cannot be generalized.

    That changed recently:

    https://www.technologyreview.com/s/609736/alpha-zeros-alien-chess-shows-the-power-and-the-peculiarity-of-ai/

    Thanked by: uptime
  • raindog308 Administrator, Veteran

    willie said: That changed recently:

    I understand your point, but I disagree. I interpreted the work you linked as taking a general game-solving system and applying it to chess. The original hope for chess programming was that by studying chess (or, in this case, Go), machines would branch out into many areas of AI.

    In this case, studying a specific game (Go) led to a system that could study other generally similar games. Yes, I know the two games are different, but they're still pieces on a board. It's not like by studying Go, the system could then turn around and play Dungeons and Dragons.

    Thanked by: uptime
  • FHR Member, Host Rep

    @raindog308 said:
    It's not like by studying Go, the system could then turn around and play Dungeons and Dragons.

    This limitation also applies to humans though. I won't know how to play Eve by studying chess.

    Thanked by: uptime
  • vovler Member
    edited February 2018

    If you are looking for a genuinely big storage solution, take a look at Incero's 192 TB servers:

    That's about $2.50 per TB, and if you get VIP status, it drops to about $2.

    Thanked by: uptime, mksh, jetchirag
  • uptime Member
    edited February 2018

    @vovler

    That's about $2.50 per TB, and if you get VIP status, it drops to about $2.

    So ... at $2/TB, the (promised) 50 TB per uploader would need about $100 in monthly revenue from each to pay for these monster servers. That's a lot of pop-unders etc. (or maybe they are fibbing a bit with regard to the 50 TB per uploader, I dunno ....)

    Back to our regularly scheduled deprogramming:

    @raindog308 - very cool to run into someone who has actually participated in the Hutter Prize. (Noting that the long-time winner seems to be an expert in compression via more traditional maths rather than esoteric AI.)

    My own reach-to-grasp ratio for this stuff is about as over-extended as most other humans', but I am a fan of the work of Juergen Schmidhuber (known for developing the LSTM "Long Short-Term Memory" neural network method for time-series data analysis). I believe that Schmidhuber advised Hutter's post-doctoral work - in any case later proposing an interesting paradigm named OOPS (for "Optimal Ordered Problem Solver") in a paper purporting to demonstrate a possible pathway to "strong" AI by learning to solve a 30-disk Towers of Hanoi puzzle: https://arxiv.org/abs/cs/0207097 ...

    I think the compression-as-intelligence argument rings true in terms of philosophy relating to algorithmic information theory - but we are currently so far back in what will eventually be considered the dark ages (and possibly also the weirdest timeline) in the pre-history of strong AI so as to make the Hutter compression challenge painfully sparse in terms of yielding much insight as to the deeper nature of intelligence - so far. We are still very much in the early days though.

    @willie - thinking a bit about recent emergence of super-human game playing performance by algorithmic approaches (ie for the successful Alpha Go effort, using Monte Carlo tree search combined with deep learning amplifying data gleaned from playing human experts - by then having the AI play more against itself) ... I would like to make some kind of analogy with equivalence of NP-complete problems, whereby a solution to the traveling salesman problem transforms into graph isomorphism and satisfiability solvers, etc. ... but I don't really know what I'm talking about so may want to simply leave it at that. For now ... Until such time when it becomes necessary to derail this wonderful thread once again ~;^)

  • mksh Member
    edited February 2018

    @vovler said:
    If you are looking for a genuinely big storage solution, take a look at Incero's 192 TB servers:

    That's about $2.50 per TB, and if you get VIP status, it drops to about $2.

    Great price! Too bad that, even without knowing OP's definition of cheap, I have a feeling it's likely still fucking expensive to him.

    Edit:

    @uptime said:
    So ... at $2/TB, the (promised) 50 TB per uploader would need about $100 in monthly revenue from each to pay for these monster servers. That's a lot of pop-unders etc. (or maybe they are fibbing a bit with regard to the 50 TB per uploader, I dunno ....)

    Tbh I don't think OP is interested in math or facts. Looks like he has already abandoned the thread since it wasn't instantly raining signup links for $7 50 TB servers.

    Thanked by: uptime
  • @mksh said:

    Tbh I don't think OP is interested in math or facts. Looks like he has already abandoned the thread since it wasn't instantly raining signup links for $7 50 TB servers.

    At least no deluge of aff-links for file hosters (yet). Anyway, here's to OP, bless their file-loving heart!

    Thanked by: mksh
  • Hey, I'm a little bit late, but even I could provide petabytes of cloud storage for €0.01/GB/month.

    How? Well, I'm running a Seafile Professional server (seafile.com) with OVH object storage as the data backend.
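
    For anyone curious what that wiring looks like: Seafile Pro can keep its commit, fs, and block data in an S3-compatible object store, configured in seafile.conf. The sketch below is illustrative only; the endpoint, buckets, and credentials are placeholders, and exact option names can vary by Seafile version, so check the Seafile Pro docs:

    ```ini
    [commit_object_backend]
    name = s3
    bucket = my-seafile-commits
    key_id = YOUR_ACCESS_KEY
    key = YOUR_SECRET_KEY
    host = s3.example.com          # your provider's S3-compatible endpoint
    path_style_request = true

    [fs_object_backend]
    name = s3
    bucket = my-seafile-fs
    key_id = YOUR_ACCESS_KEY
    key = YOUR_SECRET_KEY
    host = s3.example.com
    path_style_request = true

    [block_backend]
    name = s3
    bucket = my-seafile-blocks
    key_id = YOUR_ACCESS_KEY
    key = YOUR_SECRET_KEY
    host = s3.example.com
    path_style_request = true
    ```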
