vpsbenchmarks: real-world VPS provider performance
This is not a review; it is a comparison of the performance of small VPS plans at some of the most popular providers.
Our approach at vpsbenchmarks.com is different: we don't run one-time benchmark tests that push the CPU and disk to their limits for a few minutes. Instead, we continuously measure the response time and system metrics of a live website running on those VMs.
The website serves between 100,000 and 200,000 pages per day. It's a Rails application backed by Postgres, Redis, and memcached. It changes little and moves to a new provider/plan every one or two weeks. At all times, metrics are collected and reported in detail at vpsbenchmarks.com.
Some highlights:
- Compare the performance of selected plans
- Compare plan characteristics and prices
- Review individual trial metrics
- Review all plans that were tested at a particular provider
So far, we have tested Linode, DigitalOcean, Google Compute Engine, Amazon EC2, Microsoft Azure, Vpsdime, Ramnode and Vultr.
We hope vpsbenchmarks will help developers and administrators make an informed decision when choosing a cloud solution.
Comments
So, response time is the time for the VM to render the site, not how fast it was displayed on the client side?
Yes, it's the time spent inside the VM only.
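For illustration, server-side-only response time like this can be captured with a small Rack middleware. This is a hypothetical sketch, not vpsbenchmarks' actual collection code; the `ServerTiming` class, the `samples` array, and the `X-Runtime-Ms` header name are all made up for the example:

```ruby
# Hypothetical Rack middleware that records only the time spent inside the
# VM handling each request (excludes network latency to the client).
class ServerTiming
  def initialize(app, samples)
    @app = app
    @samples = samples # collected durations, in milliseconds
  end

  def call(env)
    start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    status, headers, body = @app.call(env)
    elapsed_ms = (Process.clock_gettime(Process::CLOCK_MONOTONIC) - start) * 1000.0
    @samples << elapsed_ms
    headers["X-Runtime-Ms"] = format("%.2f", elapsed_ms) # expose for debugging
    [status, headers, body]
  end
end

# Usage with a dummy downstream app (no real server needed):
samples = []
app = ->(env) { [200, {}, ["ok"]] }
timed = ServerTiming.new(app, samples)
status, headers, _body = timed.call({})
```

A periodic job could then aggregate `samples` (e.g. averages and percentiles) and ship them to the metrics store.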
Good idea.
Purely performance only? What about reliability, service, and support?
We write posts summarizing the trials where we include interactions with support. If there was any interruption or degradation of the service, it's reported there as well.
This is nice; the only problem is that these hosts are already so well known, which means there is already a lot of information out there. On the other hand, there are some hosts that have been around for over a year with no reviews or benchmarks. Do you also plan to test less popular hosts (I guess there are too many =P), or are the popular ones your focus?
That's a valid point. But there are only 52 weeks in a year, so we have to choose providers carefully. I think we can only test about 20 plans per year if we want to test each one at least twice. We can't pick totally unknown ones, or nobody will care about the data. So more providers will be tested than the current list, but new providers will have to be above some popularity threshold to get in.
At this pace, what real purpose will this exercise serve? You know things keep changing ever more rapidly.