Compressible test file over HTTPS?
I remember there was drama some time ago about a provider having his highly compressible test file served over https, which led to unrealistically high transfer speeds being observed by some clients.
My question is - is this common practice? I am asking because I noticed today another provider doing the same thing and got curious. Please note that I do not want to name the provider - it might be just an oversight and not something intentional. No need for more drama.
Comments
I've heard of this but it confuses me. Singlehop I think it was? Admittedly creative compression isn't my area of expertise. Personally I just wget cachefly's file for mine, not even sure why.
@jarland you can only wget the cachefly file after you already have a VPS with the said provider. And it only tests the speed towards the VPS, not from the VPS.
I was talking about the test IP / test file published by providers in their offers, so potential customers could test the upload network speed of that provider.
I mean for our test file on our test server. Just in case anyone would ever accuse me of it, it'd be news to me, and I'd be mad at cachefly
Just try to gzip it and see if it gets smaller. And if your testfile url is not https - it doesn't matter anyway.
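To put the "just gzip it" suggestion in concrete terms, here's a minimal sketch. It uses /dev/zero as a stand-in for a highly compressible test file (swap in the provider's actual file); `head -c` with a size suffix assumes GNU coreutils.

```shell
# Generate a stand-in "compressible" test file (1 MiB of zeros).
head -c 1M /dev/zero > zeros.test
# Compress a copy without touching the original.
gzip -c zeros.test > zeros.test.gz
# Compare sizes: the .gz should be a tiny fraction of the original.
wc -c zeros.test zeros.test.gz
```

If the .gz comes out orders of magnitude smaller, the file is useless for speed tests behind anything that compresses the stream.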
We use @Telephone looking glass without modifications.
Good to know
Everyone loves @telephone LG
Cause it's easy.. I do the same thing
Just checked, cachefly's 100mb.test is highly compressible. So if you are serving it over https it could be seen as kind of cheating. Hopefully most people are using plain http to host their test files.
Can't you also compress content via GZIP or so over HTTP?
@gsrdgrdghd only if the client supports it? And I doubt wget does by default. Whereas AFAIK with https/SSL the compression happens transparently at the transport layer, so the download client isn't even aware of it.
We're using http only.
Yep, that's the nature of sparse files (mostly zero-filled files).
This also brought up a good point for my LG. I've made a note to add an alternative way to create test files:
head -c 100M /dev/urandom > 100MB.test
Using head (and urandom), you can effectively make test files that won't be compressed, giving a true representation if served behind SSL or gzip.
The only disadvantage is the time it takes to create said file (a 100 MB test file takes roughly 30 seconds to create).
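A quick sketch verifying the point above, scaled down to 1 MiB so it runs fast: data from /dev/urandom barely compresses at all, so a urandom-backed test file gives honest numbers even behind SSL or gzip.

```shell
# Build a small random test file (GNU head assumed for the size suffix).
head -c 1M /dev/urandom > random.test
# Compress a copy of it.
gzip -c random.test > random.test.gz
# The sizes should be nearly identical - random data doesn't deflate.
wc -c random.test random.test.gz
```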
Thanks guys
It was SingleHop. They got flamed fairly hard for it. http://www.webhostingtalk.com/showthread.php?t=1083020
They were able to produce speeds above their actual available bandwidth lmfao
Gzipped files would transfer faster over http, not https; https adds an encryption layer between the two ends and as such would actually be slower, with higher CPU usage on the server.
But like telephone said, if you randomize the data, the webserver can't gzip the content down to a much smaller size the way it can with a simple repeated pattern.
Good to know. For me it's:
apt-get install lighttpd
cd /var/www
wget whatever/100mb.test
"Hey guys here's the test file"
It looks like someone just tried this trick on a LEB advertised host: http://www.lowendbox.com/blog/hosthink-6-95month-1gb-openvz-vps-in-turkey/
Nice to see that even the small ADSL line I have at home is capable of downloading a 100MB file in just a few seconds using google chrome!
I noticed that, downloaded in literally 2 seconds.
Did an additional test, and they are sending out a gzipped version of their highly compressible (fake) test file:
So no surprise you can download the 100MB file in a few seconds, even from a slow home dsl line. The real transfer is just a few MB. I guess LEB admins/posters should check for those tricks too. And ban the host doing the http to https redirect and serving a gzipped version of a compressible test file.
Interesting, I would not have thought of this.
Except now it would be pointless given that we all know to avoid it.
@marrco Jesus, that file downloaded in less than a second on a 30Mbps cable connection here
@rds100 Ouch! What a find.
Imagine that, a major CDN playing a game, trickery.
I was lazy in not testing their file sooner. Long suspected they were doing something wrong to pull off speeds I sometimes saw (like exceeding my actual download bandwidth by 2X).
So I just pulled the Cachefly 100mb.test file. Original size in bytes: 104857600
Then gzipped it the standard run-of-the-mill way:
Resulting file size: 4587886
Meaning it compressed down to about 4.4% of its original size. The other 95%+ is empty space or duplicated data.
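For anyone checking the arithmetic, the two byte counts above work out like this:

```shell
# 4587886 compressed bytes out of 104857600 original bytes.
awk -v o=104857600 -v c=4587886 'BEGIN { printf "%.1f%%\n", 100 * c / o }'
# prints 4.4%
```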
Cachefly should be avoided for this sort of fraud. A CDN should be doing what it can to optimize files, compressing them when they enter the CDN or at least before they get shipped to end users. A file like this should never be used for a speed test, and it is idiotic for it to be served by a so-called CDN.