Review guidelines on LET
Do you think there should be review guidelines on LowEndTalk?
I haven't been a part of the community for long, but I spend my fair share of time lurking on LET and I often check out provider reviews. Lately I've seen some reviews in which the reviewer will simply indicate they are satisfied with their provider's services, but they'll hardly share more than that.
Reviews obviously influence a lot of people when it comes to choosing a provider. That's why we need them to be as accurate as possible. Instead of saying "X or Y provider is great", I believe all reviewers ought to show real test data (i.e. a proper benchmark). Don't get me wrong - a lot of you do that and I am thankful for this, but there are also quite a few people who skip this essential part.
My humble opinion is that the implementation of review guidelines on LET would avoid a lot of bad fuss and give us, LET members, an accurate depiction of a provider's service - instead of a mere impression of "bad" or "good".
I've made a poll with two questions - "Should there be review guidelines on LET?" and "What kind of guidelines should be set?"
- Should there be review guidelines on LET? (34 votes)
  - Yes: 70.59%
  - No: 29.41%
- What kind of review guidelines should there be on LET? (34 votes)
  - Requirement of test data (benchmarks, etc.): 41.18%
  - Avoiding both excessive bashing and praising of a provider: 29.41%
  - Other (comment): 29.41%
Comments
It should generally be mandatory to back up criticism, be it good or bad. This applies to technical details (such as benchmarks) but also to other aspects of professionalism, such as the type of support, responsiveness, and overall conduct.
It should be a client's own choice how he wants to construct the review of a provider.
I'd argue that a benchmark is a sort of criticism on its own - the provider will be positively highlighted if the benchmark is good, and negatively if it isn't. But, most importantly, it's an objective (and not subjective) test - that's what matters here.
I understand - although a review is meant to be read by others, after all. If not, what's the point of sharing it?
The issue is that if a review does not contain enough objective and accurate information, it doesn't really qualify as a (thorough) review but only as an overall impression of a provider's service.
You are correct to some extent, but a review is still a client's personal opinion of a provider, so they should have full freedom in writing it.
A good benchmark is just a one-sided view. Things such as support response time, quality of responses, reputation, uptime, etc. should also be taken into consideration when writing a review.
Going beyond a simple opinion is always better if you intend others to get valuable information from your review; that's my point. Of course, the reviewer's general view on the provider is important, I'm not denying it - but the problem is that, too often, it is not backed up by objective data.
For example, a reviewer can very well praise a certain provider without showing a benchmark. A benchmark is not a form of 'absolute truth', but it is a good indicator of a provider's reliability - so without one, how can LET members have any idea about the actual performance of the said provider's hardware and network? There's no other way (at least that I know of) than to analyze benchmarks.
They should be, yes. I was also thinking about including those in the poll but I'd rather have people comment here and tell what they'd want out of review guidelines (if they are in favour of having those).
Not to mention that some "reviewers" post a benchmark and wrap it into an awesome "review" about the high quality of a service after one week of usage.
I am not sure if structured guidelines for every aspect of the forum would do much good, as people often can't follow even the few lines we have now; however, I completely understand the reason for this thread. Most "reviews" don't deserve to be called "reviews" at all.
Blind praising of a host because of an initial kind response from a support guy and awesome benchmark numbers on an empty node in the first week of usage does not tell much about the real quality of a service. One must actually use the service for some time to share an experience worth considering for readers and potential future clients.
Well, there goes another idea for a guideline then - a minimum usage time of a provider's service before putting out a review. I couldn't agree more that shiny new nodes will easily make a provider appear to be a great one.
I'm iffy about relying on benchmarks. I used to think that the better the numbers, the better the host, but I've come to realize that benchmarks can change at the drop of a hat. Just to give one example, an I/O of 300+ one day could be 40 the next depending on so many factors: time of day, a lone abuser stressing the node, people filling the node and running dd at the same time because the provider just posted another ad, etc. A benchmark really just provides a fleeting snapshot that never truly tells the whole story. I would love to hear about reliable tests that users can run in order to provide objective data, but I feel like benchmarks can be almost as unreliable as a vague "they're fantastic!!", they just have a lot of numbers in them.
Maybe it's because I'm not very knowledgeable about the technical side of things, but I have always valued the intangible part of a service more highly than the shiny hardware and fancy network blends. I actually think the "unhelpful" reviews are useful in their own way, because they allow us to see how the hosts in question handle the situation, especially when the review is a negative one. Needless to say, how they conduct themselves in public - to the customer as well as other forum members - can have a huge influence on whether or not I would even consider buying from them in the future.
My biggest question is, how strictly should these guidelines be adhered to and enforced if implemented? What if LET posts up guidelines and people still write their reviews their own way? What if a review was written and followed the guidelines to a T, but people replying to add their own experiences (whether to refute or agree with the review) only gave a brief line or two about how they had a great or shitty experience with that host? If someone is praising the amazing support of a certain provider in their properly written review according to forum guidelines, I might want to chime in just to agree that yes, I also received quick and friendly support from them, but that doesn't necessarily mean I have the time or inclination to write a full blown review of my own. However, people might find my response not very helpful either because I haven't focused on anything else besides good support. Where should the line be drawn?
@K2Bytes --- "It should be a client's own choice how he wants to construct the review of a provider."
As a provider, this is what I want to see ... good ol' constructive criticism.
YES, we have had some negative comments etc., but SOME are aimed directly at STAFF. This is not only disrespectful but ABUSIVE in most cases.
While I respect the openness of these forums compared to, say, WHT, the abusive behaviour permitted towards the staff of companies who freely advertise here on LET/LEB
(which we all appreciate, this being a corporate-sponsored forum) is not only allowed but a massive point of disrespect towards the hosts' peers, etc.
I just think some get paid when they don't post benchmarks, like the guy who is still running around the forums with a 20% off code while working for the company (facepunch). So post proof that you are a customer, and post benchmarks.
Well that's just you, I'm sure many others don't think the same way.
Benchmarks are no more, no less than statistics, and one fundamental aspect of statistics is the size of a sample - it always plays a massive role in determining a statistical study's reliability. Therefore, yes, one benchmark carried out at a time T shows only one specific side of the overall picture (i.e. the provider's reliability).
However, if you carry out multiple benchmarks at different times (T+6h, T+1d, T+1w, T+1mo...) then you end up getting an accurate statistical depiction of the real situation. That kind of correlates with what @Spirit suggested: reviews shouldn't be done after one single week, because things do change quickly in the hosting business.
As a result, the problem does not lie within benchmarking itself; it is a fairly good, and objective, approach to measuring the performance of a server. The problem is to benchmark over an extended period of time.
Now, I'm not suggesting that every reviewer should set up a cron job to run dd and murder the host node. However, running benchmarks once every now and then, in a non-abusive manner (e.g. a couple times a week maybe), will get us what we need over the long term - again, that is an accurate depiction of the performance of the server.
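The "multiple samples over time" idea can be sketched in a few lines of Python. All the numbers below are made up for illustration; in practice each sample would come from an occasional, light dd run or similar:

```python
import statistics

# Hypothetical throughput samples (MB/s) collected at T, T+6h, T+1d, ...
# The values are illustrative, not real measurements from any provider.
samples = [312, 298, 41, 275, 260, 305, 55, 290]

mean = statistics.mean(samples)
stdev = statistics.stdev(samples)

# A single reading can sit far from the long-run average; the spread
# (stdev, min, max) is what tells you how stable the node really is.
print(f"mean={mean:.1f} MB/s, stdev={stdev:.1f}, "
      f"min={min(samples)} MB/s, max={max(samples)} MB/s")
```

The point of summarizing the series rather than quoting one run is exactly the sample-size argument above: the two dips (41 and 55) would make a single unlucky benchmark look catastrophic, while the full series shows they are outliers.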
I do agree with you on this point - the hardware is one thing, the provider's support and public behaviour is another. As for the latter, we've had a terrible example this weekend (everyone knows who I'm talking about - they've been banned for a week). Having people at LET tell us how they are being supported by their providers is, of course, essential.
But the major limit of those "unhelpful reviews" is that they are incomplete, and are not very reliable either since it is more or less an impression that the provider has made on the client. I could definitely say that a provider offers great support, and that would attract some new clients from LET to this provider - but I didn't mention the reliability of their hardware, so those new clients have no idea whether they should expect a stable environment when they get their server(s).
Anyhow, support and public behaviour both make for another criterion/guideline, as I pointed out in one of my posts above, and they are essential too.
The subject of this thread is the guidelines that should be applied to the reviewer who initially writes the review, not the commenters. The latter have not decided to engage themselves in doing a provider review, so there would be no point asking them to follow strict guidelines for adding further input to an already existing review. At most, this would be counter-productive, and they would stop commenting because it would be too much of a hassle.
Of course, if a commenter happens to go against a positive review of a popular and generally liked provider in the community, then they would have to provide solid proof - at least equivalent to what the reviewer gave in the first place (e.g. benchmarks, answers to support tickets, etc.). Then again, they're not required to do so - what will most likely happen is that their opinion (and not 'review') will be ignored by the community.
To conclude, I'd say that the line needs to be drawn between the reviewer and the commenters; the former decides to carry out a review, which needs to be informative and accurate, whilst the latter voice their praise or concerns with their own proof.
By the way, thanks for your constructive post, @hellogoodbye
WTF...
"You can only praise so much or bash so much. We have a detailed formula that will tell you if you cross out of this approved zone..."
Just reviews that included facts as opposed to rants would be appreciated by me.
Sorry, that wasn't what I meant. My point is that unjustified bashing or praising of a provider isn't that uncommon, and that's something I'm not really willing to see.
Since you say you're new, telling us what you're "willing to see" sounds rather imperious.
Regardless, your idea is unenforceable and will lead to just a different flavor of flamefest ("seems excessive" - "no it's not" - "let me cut/paste the guidelines" - "can a moderator look at this" - blah)
Frankly if you're looking for well-balanced emotion-free reviews with detailed benchmarks, quality insights, and reasoned judgment...you're in the wrong place. This is LET...you might prefer vpsboard or WHT.
Is there a problem with reviews here? Some are good, some are not. The reader can easily see the difference.
The main issue I see is that many 'reviews' are actually unqualified drivel, often brought about by a response to a support ticket. The longer-term members see through this stuff but newbies won't, meaning they might be put off certain hosts for no reason other than the review writer is an idiot who can't secure a container properly.
A review should be about the user's experience. I ignore almost every review on here, mainly because all you get is people posting a benchmark of an idle VPS that has no purpose other than being a "good deal".
I see people posting "I have had the VPS for 3 months, here are the DD results"
Well, think about it. In the past 3 months there were 131,487 minutes; your dd takes, say, at worst 30 seconds to run, so you are basing performance on about 0.0004% of the time it has been running. Not helpful.
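The fraction of uptime that a single 30-second dd actually samples is easy to check:

```python
# Three months of uptime, in minutes (the figure quoted above)
minutes = 131_487
dd_seconds = 30  # assumed worst-case duration of a single dd run

# Fraction of the VPS's lifetime that one dd run observes
fraction = dd_seconds / (minutes * 60)
print(f"{fraction:.6%}")  # roughly 0.0004% of the time the VPS has run
```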
A proper review needs to be based on real use and experience. If, for example, you tell me that you have been using the VPS as a web server hosting 3 websites, using a DB, and you show me your uptime along with commentary demonstrating you have been happy with the performance throughout that time, it will mean so much more to me than ServerBear crap. Performance over time is more useful than a moment in time.
/2c
In addition to W1V_Lee's points above, support should be given a thorough work-out as part of any review; support is as much a part of the service as container performance.
This is why we have some basic guidelines in place, although they do not prohibit creative freedom of the review process. Basically, we just want a thorough and detailed account of the service on vpsBoard. I'm very happy with the quality of reviews there.
It helps that you don't have as many nutters over there too.
Unfortunately, it's the nutters that supply most of the entertainment.
We've got our crazies over there too unfortunately. But they're members with dual-citizenship, so they're here too. :P
It should also be noted that some of the reviews are 'sponsored' in the sense that we find unique providers, or those that don't have many reviews, and I have a trusted member of the community review them in detail. The provider doesn't know they're being reviewed, nothing like that. The reviewer tests the service out with real-world testing, not just idling it and running a benchmark every week or anything... the services are most certainly used, and the reviews are updated, mostly at quarterly intervals.
Also, because it's not 'lowend' over there, there are non-lowend host reviews there too though not as many as I'd like.
We all just let wlanboy do the reviews over there because... well, it doesn't seem that anyone else cares. He does a good enough job and is compensated for it anyway.
Any guidelines will make many not bother to write; I'm not saying it's a bad thing, just pointing it out.
Also people are typically more inclined to write bad things; good things are "to be expected".
I don't see a problem with calling a provider good or bad. However, sometimes there is a strong title like "stay away from XXX", but when you read the post, you clearly understand the problem is on the client's side.
You sir have restored some of my faith in humanity.
I'm figuring out over the long term that discouraging people who are not willing to put in a little bit of effort - i.e. taking screenshots and nicely formatting their review - is, in the end, a good thing.
Again, my point is that there's hardly any way one can make up their own mind about a provider if they can't see what it's really worth. The eyes of the reviewer add a subjective perspective to the review, so the only way to have at least some objectivity is to have factual data.
This kind of bashing pops up very often on LET, and the poster will almost always classify it as a review... when it's clearly not.
I would not impose any requirements or restrictions on reviews. Each reviewer brings their own unique perspective to LET. Imposing specific review requirements (such as benchmarks) will introduce bias into the reviews. I believe that LET members are capable and intelligent. Trust them to judge the reviewer and the quality of a review for themselves. They will get it right most of the time. Furthermore, LET members do not seem to be shy about jumping in to express their own contrary opinions and expose the truth as they see it.
Ok, here's my take on things.
You be an asshole, people be assholes to you.
If you treat a client like shit, that client will shit on your face.
If you make a false review, people will shove it up your ass and make you cry mommy.
So yeah, I don't think there should be any hard criteria, guidelines, or otherwise on reviews.
EDIT: Ok, maybe a no-clickbait rule.