How to handle Big Data with 10 to 50 million users!

It's not a small question, I think. How do big apps/websites with millions of users handle their traffic?

Example: a web app company expects 1 million users within 3 months of launching, and 1 to 10 million within a year.

So, the question is: what is the best way for them to set up their servers? How do they handle bandwidth? How do they manage and troubleshoot the servers? Will the datacenter company help? What is the ideal solution? Any ideas?

TIA


Comments

  • You need 10,000 servers... or a supercomputer

  • DhruboHost Member
    edited February 2016

    @spidervenom said:
    You need 10,000 servers... or a supercomputer

    Everyone knows a single server can't handle this kind of load. I just need to know the procedure for purchasing the servers, whether we should colocate, and who can help in that case.

  • @DhruboHost said:
    So, the question is: what is the best way for them to set up their servers? How do they handle bandwidth? How do they manage and troubleshoot the servers? Will the datacenter company help? What is the ideal solution? Any ideas?

    Point them to a hosting company that knows what they're doing and can actually support the traffic required.

  • Run nginx on a really powerful server to load-balance requests between servers, use Anycast IPs to route users to the nearest server, then have dedicated machines for every single layer: application/website, database, HTTP, etc.

    Also, don't host everything in the same DC and let the entire network slow down because everyone is trying to use your app/website; get your own ASN, build up your own network, etc...

    There's one app that is hosted mostly on DigitalOcean SG, and it's slow af because everyone is trying to use it, which leads to major frustration among users, lol.

    Thanked by 1: netomx
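A minimal sketch of the load-balancing layer described above, as an nginx reverse-proxy config; the backend IPs and ports are hypothetical, and this fragment covers only the HTTP load-balancing piece (Anycast and per-layer machines are separate network/architecture concerns):

```nginx
# Hypothetical upstream pool; swap in your real backend IPs/ports.
upstream app_backends {
    least_conn;                     # route each request to the least-busy backend
    server 10.0.0.11:8080;
    server 10.0.0.12:8080;
    server 10.0.0.13:8080 backup;   # only used if the other backends are down
}

server {
    listen 80;
    location / {
        proxy_pass http://app_backends;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```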
  • Right! For your main server, use a quad Xeon E7-series (80 cores), 1TB RAM, 200× 500GB SSDs in RAID, and a 100 Gbps port

    @theroyalstudent said:
    Run nginx on a really powerful server to load-balance requests between servers, use Anycast IPs to route users to the nearest server, then have dedicated machines for every single layer: application/website, database, HTTP, etc.

    Also, don't host everything in the same DC and let the entire network slow down because everyone is trying to use your app/website; get your own ASN, build up your own network, etc...

    There's one app that is hosted mostly on DigitalOcean SG, and it's slow af because everyone is trying to use it, which leads to major frustration among users, lol.

  • gbshouse Member, Host Rep

    In fact it's not complicated. The key is layer separation and horizontal scalability. You have a UI layer (web servers plus load balancing), a shared cache layer, a data layer (databases with shards/partitions plus load balancing), a storage layer, and a management layer. If designed correctly, you can start with 10-12 servers and, if necessary, scale each layer horizontally. Later on you can add a multi-DC setup (replicate data and sometimes storage). Very large setups have private big pipes (10-40G) between DCs. The interesting part is that you can use free software to limit the costs (nginx, haproxy, redis, glusterfs, mysql).
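The shard/partition idea in the data layer above can be illustrated with a toy routing function; the shard names and key scheme below are hypothetical:

```python
import hashlib

# Hypothetical shard list; in practice each entry would be a DB connection string.
SHARDS = ["db-shard-0", "db-shard-1", "db-shard-2", "db-shard-3"]

def shard_for(user_id: str) -> str:
    """Map a user id to a shard with a stable hash (md5), so the same user
    always lands on the same database. Python's built-in hash() is randomized
    per process, so it is not suitable here."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

print(shard_for("user-12345"))  # deterministic: same id, same shard every run
```

Real deployments usually prefer consistent hashing or a directory service so that adding a shard doesn't remap most keys, but the routing idea is the same.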

  • +1 to everything @gbshouse listed.
    And I would add: get a good CDN to reduce load and bandwidth needs on the servers.

    Thanked by 1: netomx
  • shovenose Member, Host Rep

    @theroyalstudent said:
    Run nginx on a really powerful server to load-balance requests between servers, use Anycast IPs to route users to the nearest server, then have dedicated machines for every single layer: application/website, database, HTTP, etc.

    Also, don't host everything in the same DC and let the entire network slow down because everyone is trying to use your app/website; get your own ASN, build up your own network, etc...

    There's one app that is hosted mostly on DigitalOcean SG, and it's slow af because everyone is trying to use it, which leads to major frustration among users, lol.

    What App?

  • @shovenose said:
    What App?

    PMed.

  • @gbshouse said:
    In fact it's not complicated. The key is layer separation and horizontal scalability. You have a UI layer (web servers plus load balancing), a shared cache layer, a data layer (databases with shards/partitions plus load balancing), a storage layer, and a management layer. If designed correctly, you can start with 10-12 servers and, if necessary, scale each layer horizontally. Later on you can add a multi-DC setup (replicate data and sometimes storage). Very large setups have private big pipes (10-40G) between DCs. The interesting part is that you can use free software to limit the costs (nginx, haproxy, redis, glusterfs, mysql)

    Great information! I'm from Asia, so how do I contact a DC? How should we proceed? Will any DC help with a setup like this, or do we have to do it ourselves? Any ideas?

  • @DhruboHost said:
    Great information! I'm from Asia, so how do I contact a DC? How should we proceed? Will any DC help with a setup like this, or do we have to do it ourselves? Any ideas?

    Ask Equinix, they should know what to do.

  • WebProject Host Rep, Veteran

    spidervenom said: You need 10,000 servers... or a supercomputer

    It will definitely cost more than $7; with cluster computing, all 10,000 servers act as one server.

  • @Jonchun said:

    aka not on @TinyTunnel_Tom's whopping 300 bytes/second uplink

  • @doghouch said:
    aka not on @TinyTunnel_Tom's whopping 300 bytes/second uplink

    pm me i should be able to do something for you

  • @Jonchun said:
    pm me i should be able to do something for you

    oh ya man can you do $100/mo for that? i really need it pls thx so much man

  • @doghouch said:
    oh ya man can you do $100/mo for that? i really need it pls thx so much man

    even better. how does $10/year sound?

    Thanked by 1: netomx
  • @DhruboHost said:
    It's not a small question, I think. How do big apps/websites with millions of users handle their traffic?

    Example: a web app company expects 1 million users within 3 months of launching, and 1 to 10 million within a year.

    So, the question is: what is the best way for them to set up their servers? How do they handle bandwidth? How do they manage and troubleshoot the servers? Will the datacenter company help? What is the ideal solution? Any ideas?

    TIA

    Facebook, Google, etc. have their own data centers.

  • you need a satellite

  • theroyalstudent said: Run nginx on a really powerful server to load-balance requests between servers, use Anycast IPs to route users to the nearest server, then have dedicated machines for every single layer: application/website, database, HTTP, etc.

    I wouldn't recommend reinventing the wheel when there are existing tools like Cloudflare/Incapsula. Only do it if you know what you are doing and only if there are no other fitting alternatives. Layering apps across servers without proper architecture can easily turn into a nightmare; for example, syncing a DB across multiple servers in real time.

  • I would use AWS or Azure for something that big. Don't trust this volume of data to a host with an SLA that can't help you when you are about to lose terabytes of user data. And, please, make sure that you have the knowledge to admin something that big. DB sync across multiple servers in near real time IS a nightmare :D

  • Hosting in a number of different locations is the best approach, so users reach the nearest server depending on where they are. This also keeps the app quicker, as not every user is going to the same server location. Colocation would be a good option if you know what you're doing; otherwise, use dedicated servers with a number of different providers, or with the same provider in different locations.

  • @DhruboHost said:
    It's not a small question, I think. How do big apps/websites with millions of users handle their traffic?

    Example: a web app company expects 1 million users within 3 months of launching, and 1 to 10 million within a year.

    In my experience these expectations are completely unrealistic, every time. Also, "X million users" has absolutely no meaning. How many of these are active? How many requests per second? How much of this is dynamically generated, and how much is static files? What language are you using? nginx/openresty or erlang/elixir/Phoenix will easily do tens of thousands of requests a second. Heck, per http://www.phoenixframework.org/blog/the-road-to-2-million-websocket-connections they had two million clients on a 40-core, 128GB box, which is not even particularly big.

    It's extremely rare to have a website that actually needs multiple servers if it is coded the right way. You will probably have at least two for HA, or perhaps four (web frontend + database, times two for HA), but really, beyond that... Yes, there are a few apps needing more. No, you are not one of those.

  • AnthonySmith Member, Patron Provider

    This really is not complicated at all, but based on what you clearly do not know, the very first thing you should do is hire a consultant.

    I am sure you know many things I don't know, everyone has their strengths, but hire a consultant.

    With numbers like that, you could do it expensively and badly and never achieve the right result, or pay a consultant once or twice and get a great result for years to come, on budget.

  • There is no simple answer that covers all situations. Having N million users is not meaningful information. Everything is domain specific: what will those users be doing on the site?

    Is it media-heavy, request-heavy, or a simple news-type site? All of these factors affect scalability. In general, the bottleneck is typically the database, so getting fast disk IO is important.

    But it typically boils down to how many requests you need per second, not how many users you have in total (these are two completely different things).

    But, again, there are no canned answers here. Some apps would require something like SAP HANA for fast transactions, and other apps would do fine with a MySQL cluster or a Mongo replica set.
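To make the users-vs-requests distinction above concrete, here is a back-of-envelope calculation; every number in it is an illustrative assumption, not a measurement:

```python
def peak_rps(total_users, active_frac=0.10, reqs_per_active_per_day=50, peak_factor=5):
    """Rough peak requests/second from a headline user count.
    active_frac: share of users active on a given day (assumed).
    reqs_per_active_per_day: dynamic requests per active user (assumed).
    peak_factor: how much busier the peak hour is than the daily average (assumed)."""
    daily_requests = total_users * active_frac * reqs_per_active_per_day
    avg_rps = daily_requests / 86_400  # seconds in a day
    return avg_rps * peak_factor

# "10 million users" shrinks to a few thousand requests/second under these assumptions:
print(round(peak_rps(10_000_000)))
```

The point is that the headline user count only matters once you pin down activity and request patterns; change any one assumption and the answer moves by an order of magnitude.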

  • nik Member, Host Rep

    We've scaled a Ruby on Rails app with PostgreSQL to handle half a million unique visitors over the course of 2 days. But this depends completely on how your app is developed, what language is used, which database, and so on.

    We developed the app ourselves, so we knew what we had to deal with, but there will be bottlenecks everywhere that you don't know about beforehand, so you need a hoster who knows what he is doing and works closely with your developers. You can't just order 2 servers and be done with it. This is not only about hardware but also about software optimisation.

  • Well, this is a necro of an old thread, but completely aside from servers, you need a lot of other ops and support resources to handle a user load of that size. As everyone says, a lot depends on what the site is doing.

  • When I need to handle 50 million users - I come to LET to learn how to do it for $7

  • I think they can't go wrong with LiquidWeb! Great guys and service :)

  • I love reading these suggestions.
