Clientside performance no priority for Dutch websites
Making a website fast goes far beyond minimizing the total size (in kilobytes the client has to download) of an average page, but reducing overhead is still a good start. If any page, and especially the frontpage of a website, where a first-time visitor with an empty browser cache is most likely to 'enter', is already more than a megabyte in size, then you're absolutely doing something wrong. Even these days, when most people are on broadband connections, downloading a megabyte still takes a measurable amount of time. And bandwidth is not the only factor in the total time needed to download all resources for a page: the number of items to download also matters (browsers can only download a limited number of items from a given host in parallel), as do DNS lookups, round trips, and TCP and HTTP overhead. Reducing the total size and number of required resources therefore already improves the user's experience, especially for those who are not on a broadband connection.
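To make that reasoning concrete, here is a deliberately simplified back-of-the-envelope model (my own illustration, not how any real browser works): it assumes a fixed per-host parallel connection limit and one round trip of latency per 'wave' of requests, and ignores TCP slow start, pipelining and DNS lookups.

```python
# Simplified, illustrative model of page load time (not a real browser):
# assumes a fixed per-host parallel connection limit and one RTT of
# latency per "wave" of requests; ignores TCP slow start and DNS.
import math

def estimated_load_time(num_requests, total_kb, bandwidth_kbps, rtt_ms, parallel=6):
    """Estimate seconds to fetch all resources from a single host."""
    transfer = total_kb * 8 / bandwidth_kbps      # pure download time (s)
    rounds = math.ceil(num_requests / parallel)   # request "waves"
    latency = rounds * rtt_ms / 1000              # latency cost (s)
    return transfer + latency

# A 1 MB frontpage spread over 120 requests on 8 Mbit/s with 50 ms RTT:
# roughly half the time is latency, not bandwidth.
print(round(estimated_load_time(120, 1024, 8000, 50), 2))  # prints 2.02
```

Even in this crude model, halving the number of requests saves about as much time as halving the page size, which is why request count shows up so prominently in the tables below.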
Even when the download time for all resources may seem small, bear in mind that the client, depending on the type of resource, also needs to parse, compile, execute and/or render those items, which in total may take even more time than the download itself. That makes two reasons to reduce the 'footprint' of your site, especially when that can be done without any negative side effect. Which brings me to the first example I came across recently where clientside performance obviously is not being taken seriously, at one of our fellow ICT news sites:
Webwereld recently featured an article with the most 'hilarious' emails they received over the past year. One of those emails read (translated): "Congratulations on your new site, but can I have the old site back? This new site is terribly slow; it's like I'm on a 14K4 modem again..." (Webwereld did a frontend redesign in 2009).
Now I don't see anything hilarious in that email; complaints from users about your site being slow should be taken seriously. Even if it is just a perception of reduced performance (which is not uncommon after a complete redesign), you should be able to explain it. In this case, however, and with my knowledge of their site before the redesign, the complaining user is probably right: the site's performance did suffer from the redesign, and there are various causes, all of which could be avoided or improved.
Secondly, almost all of the images they are using in their interface can be reduced to much smaller file sizes. Using the 'Smush.it' service I could, for instance, reduce http://webwereld.nl/images/inputFieldBg.jpg from 22.2K to 829 bytes, and that goes for most images Webwereld is using for its interface. This is most certainly an oversight.
Easy win: reduce the file size of static images using Smush.it
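Part of why such savings are often 'free' is that a lot of image weight is pure metadata. The sketch below (pure stdlib Python; `strip_jpeg_metadata` is my own hypothetical helper, not how Smush.it actually works internally) losslessly drops EXIF/comment segments from a JPEG; real optimizers like Smush.it additionally re-optimize the compressed data itself.

```python
# Minimal, illustrative lossless JPEG "optimizer" in pure stdlib Python:
# it strips metadata segments (APPn such as EXIF/XMP, and COM comments),
# one of the things image optimizers do. Sketch only: it assumes a
# well-formed JPEG and no restart markers before SOS.

def strip_jpeg_metadata(data: bytes) -> bytes:
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data):
        assert data[i] == 0xFF, "expected a marker"
        marker = data[i + 1]
        if marker == 0xDA:            # SOS: copy the rest (entropy-coded data)
            out += data[i:]
            break
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        segment = data[i:i + 2 + seg_len]
        # Drop APP1..APP15 (EXIF, XMP, ...) and COM (comment) segments.
        if not (0xE1 <= marker <= 0xEF or marker == 0xFE):
            out += segment
        i += 2 + seg_len
    return bytes(out)

# Synthetic example: SOI + COM segment + (degenerate) SOS + data + EOI
jpeg = (b"\xff\xd8"
        b"\xff\xfe\x00\x07hello"      # COM segment, length 7 (2 + 5 bytes)
        b"\xff\xda\x00\x02"           # SOS header (minimal, for illustration)
        b"\x12\x34\xff\xd9")          # "scan data" + EOI
print(len(jpeg), "->", len(strip_jpeg_metadata(jpeg)))  # prints "19 -> 10"
```

On real-world interface images the metadata can easily dwarf the pixel data, which is how a 22.2K background image ends up being 829 bytes of actual content.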
Thirdly, the number of 'tags' for advertising certainly increased after their redesign. It's no secret that 2009 was a bad year for advertising on websites, so trying to increase advertisement income by creating 'special' banner positions is one way of attracting more advertisers to your site. However, increasing the number of banner tags will not get you any more banners than are actually being sold. At Tweakers.net we try to match the number of banner tags created to the number of banners that are actually sold; 9 out of 10 views on our frontpage don't get any banner tags at all.
Update: Coen van Eenbergen from Techzine notes that Techzine is no longer using this type of advertising because their users massively disapproved of it. A nice detail is that Webwereld has also experimented with this type of advertising in the past. Tweakers.net did something similar back in 2004, but that was an April Fools' joke...
So how are other major Dutch websites doing in terms of clientside performance? To determine that, I decided to run the YSlow tool on the most popular websites in the Netherlands (which of course includes Tweakers.net):
|Site||YSlow grade (points)||Frontpage size (empty cache)||Frontpage size (primed cache)||# requests (empty/primed)|
|anwb.nl||C (72)||520.2K||19.9K||27 / 26|
|monsterboard.nl||E (58)||577.7K||126.8K||71 / 24|
|hyves.nl||D (68)||1068.5K||89.5K||120 / 9|
|youtube.nl||C (78)||210.7K||88.5K||27 / 20|
|ing.nl||D (68)||188.0K||18.5K||40 / 40|
|zylom.com||E (57)||1085.3K||57.6K||157 / 36|
|smscity.com||D (69)||569.0K||20.7K||45 / 2|
|vananaarbeter.nl||D (64)||212.6K||13.4K||32 / 32|
|nu.nl||D (69)||413.2K||80.3K||67 / 66|
|scholieren.com||C (77)||214.8K||6.6K||45 / 8|
|google.nl||A (97)||30.9K||4.5K||4 / 1|
|ibood.nl||B (80)||146.1K||2.2K||19 / 18|
|psv.nl||E (56)||1463.5K||162.3K||93 / 91|
|funda.nl||B (80)||89.3K||5.9K||14 / 13|
|inpakkenenwegwezen.nl||D (65)||428.2K||26.9K||82 / 78|
|uitzendinggemist.nl||D (60)||502.8K||88.8K||139 / 67|
|kika.nl||C (77)||599.4K||8.5K||19 / 18|
|receptenweb.nl||C (71)||464.5K||22.5K||34 / 33|
|relatieplanet.nl||C (70)||165.1K||82.5K||40 / 40|
|tweakers.net/pricewatch||B (80)||295.4K||19.4K||69 / 1|
|buienradar.nl||E (58)||1182.6K||410.4K||71 / 63|
and I'll add some other sites as well (including our own frontpage and my blog's index):
|Site||YSlow grade (points)||Frontpage size (empty cache)||Frontpage size (primed cache)||# requests (empty/primed)|
|tweakers.net||B (83)||563.0K||26.9K||54 / 3|
|geenstijl.nl||D (68)||1409.9K||256.5K||50 / 49|
|telegraaf.nl||E (54)||1355.6K||245.3K||181 / 59|
|fok.nl||C (71)||1241.6K||48.4K||51 / 14|
|webwereld.nl||D (62)||547.5K||150.4K||94 / 36|
|crisp.tweakblogs.net||A (99)||28.8K||3.9K||14 / 1|
It should be noted that a small total size does not necessarily translate to a higher ranking; in fact, there are a number of background images on the frontpage of Tweakers.net that could easily be reduced in file size (I already spanked our newsposters for that). It's the low total number of required requests, and the low number of requests on return visits, that make our site score better than most of the rest. To be honest, the Tweakers.net measurements did not contain any banner tags, but since this is true for 90% of all visits to our frontpage I think that is the most representative case. All other figures are based on a first hit on those pages, so they may be somewhat skewed (although I do think they are mostly representative).
The mystery of primed cache # requests
Some of you may have noticed that for some sites the number of primed cache requests almost equals the number of initial requests, but with a much smaller total download size. This is due to the caching mechanism used for resources that only have an ETag header without a (far-future) Expires header. ETag is a mechanism that still requires the user agent to regularly check for updates of the resource, whereas Expires tells the user agent that it doesn't have to check at all and can just take the resource from cache until it expires.
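The difference can be sketched in a few lines of Python (a simplified model of the decision a user agent makes; `must_revalidate` is a hypothetical helper of mine, and real browsers also honor Cache-Control: max-age, which takes precedence over Expires):

```python
# Simplified sketch of the cache decision described above: a resource with
# only an ETag must still be revalidated (one conditional request per view,
# usually answered with a tiny 304), while an unexpired Expires header lets
# the user agent serve straight from cache with no request at all.
# Real user agents also honor Cache-Control: max-age; omitted here.
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone

def must_revalidate(headers: dict, now: datetime) -> bool:
    """Return True if the user agent has to ask the server again."""
    expires = headers.get("Expires")
    if expires:
        return now >= parsedate_to_datetime(expires)
    return True  # ETag alone (or no caching headers) still costs a round trip

now = datetime(2010, 1, 4, tzinfo=timezone.utc)
print(must_revalidate({"ETag": '"abc123"'}, now))                          # True
print(must_revalidate({"Expires": "Fri, 01 Jan 2021 00:00:00 GMT"}, now))  # False
```

This is exactly why an ETag-only site shows nearly the same request count on a primed cache: every resource still costs a round trip, even though the responses are mostly empty.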
Considering that it is possible to achieve a B grade for a large, dynamic website such as Tweakers.net, anything less than a C grade should be deemed sub-optimal. Most sites could improve their clientside performance with just a few simple measures. It's a pity (maybe even a shame) that most large Dutch websites aren't paying attention to this; in my opinion it should be a basic part of the development cycle of any (large) website.
Definitely a 'must read' in my honest opinion...
@creep - yupz * i-chat thinks so too
[Comment edited on Monday 4 January 2010 09:09]
"Nice blog, but I have to mention that the yellow is a bit too much on my frontend."
I thought it nicely complements the background of my blogpage.
"I thought it nicely complements the background of my blogpage"
It's kinda contradicting (is that the right word?) your post about 'front-end design'... though one could argue that, it being the background, it's also probably part of the back-end, even though I'd have to slap that person on the head for such a remark.
Very good post!
"I'm a developer, not a designer"
It shouldn't have to be that obvious. I'm also surprised that big websites aren't aware of this; I thought most of these sites were built by competent front-enders. Front-end optimization isn't exactly rocket science, and any front-end developer should already know how to optimize code and images.
What DID surprise me, however, is that it is 'this bad'; I never expected any E ratings. I guess I'm pretty wrong here...
Even though I'm not into programming myself, so I'm not really the guy to point fingers at others (I guess), I do think this is REALLY BAD..
My site currently has a grade B.
Optimization is not possible at this moment, but it is very well planned for the future.
The customer panel has grade C.
The admin panel has grade B.
I'm planning to go for a grade A,
but that will take some time.
Overall performance score 79
Ruleset applied: Default_NoCDN
Empty cache: HTTP Requests - 15, Total Weight - 131.4K
Primed cache: HTTP Requests - 15, Total Weight - 5.4K
Graded F for gzip, ETags and Expires.
"Grade A Overall performance score 99"
Nearly all my pages have grade A ((90-99) / 100). You could also give PageSpeed a try; Google Sitemaps/Webmaster Tools rates your site and gives suggestions. Consider using both tools for the best performance. Ads can (as you said) significantly reduce performance.
My pages are not compressed, because I run a load balancer/proxy in front of my backend webservers; this way Varnish does not have to store two versions of each page (gzip and non-gzip). Currently I optimize my pages by removing all line breaks. For dynamic content within cached pages I use ESI.
As you can see, you need more than clientside optimization; serverside deserves the same attention.
Too lame to install Firefox (which can be done in under a minute), admitting you "refuse" to install it (for God knows why), and still having the balls to ask someone else to test your site... Even though we're fresh into a new year, you're at the lonely top of my lame list.
[Comment edited on Monday 4 January 2010 17:05]
Since I'm one of the owners of Techzine.nl I thought I should respond here.
The in-text ads mentioned above were running on Techzine.nl for a short period of time as a test; since our users massively disapproved of these ads, we took them offline within a few days. Not mainly because of the speed, but more for the user experience.
I also did some research on these ads, and the in-text advertisements are indeed not very fast to load, as crisp said. But what crisp missed, and probably blamed the in-text ads supplier for, is some code that is being loaded by Webads.
The Webads banner code is very complex and uses 5 subdomains of the domain webads.nl to get its information from. These servers also don't use any form of compression. Then there are advertisers and media companies that prefer their own tracking systems on top of the Webads one, which slows performance down even further.
I already mentioned this to the people of Webads who are looking into the issue. I also heard that other publishers have complained about this bannercode.
For the non dutch people reading this blog:
Webads is one of the bigger premium advertising networks in the Netherlands and provides advertisements to a few hundred websites.
So besides Techzine.nl, many, many other websites perform slower than they should; this would improve if the advertising agencies invested more time and money in their systems and in their agreements with advertisers concerning the tracking and serving of banners.
Since we are a small publisher, like many others, we don't have the resources to do the advertising sales ourselves. You need to partner up with a party like Webads, and then you are forced to put this badly written and badly performing code on your website.
So as the owner of a website you cannot always resolve these kinds of issues, or be blamed for them. Of course we chose to work with Webads, but in our opinion there aren't any good alternatives on the Dutch market at the moment.
I would also like to say that the part written under the header 'Techzine', about videos and downloads being automatically started/downloaded, is not related to Techzine. I guess crisp meant this in general.
That difference should have a huge impact on the YSlow score. The non-optimized code of advertising companies is a pain in the neck. If you are in a situation where you can combine the loading code with your own code, that's a huge advantage.
When testing Techzine without the banner code, the site by itself easily scores a B grade (similar to Tweakers); we tested that, of course. But further optimization could still be done here and there.
Getting advertising companies to optimize their code remains a struggle...
Firefox is too lame to be installed, that's why. I did install Firefox in a virtual machine after reading MMal's reply, to plan improvements.
"Crisp meant this in general speaking I guess."
That's right; I have updated the heading to reflect that better. It wasn't primarily about Techzine; I just took the in-text advertising as an example of how bad advertising implementations can be, which you also seem to agree with. A good thing you got rid of it.
We do, however, load the banner tags after the content, and we also combine tags for multiple formats in order to reduce the overall number of tags that we need to include in the page (and the total number of ads that can be displayed simultaneously).
[Comment edited on Monday 4 January 2010 20:32]
If I check my own site, with an image sprite of 1.34 MB (minus 700 kB because of the background), with an empty cache, the render time for the browser is about 1.288s according to YSlow, but if I check with Firebug (Net panel) the loading time is around 1.45s. (ISP: Ziggo | AiE).
Now my question is: the image sprite is transparent; how can you optimize/shrink a transparent image without deleting the transparent part? I've read lots of documents from Google (Page Speed) and Yahoo (YSlow), but I can't find a solution to my problem.
The only way I know to shrink the sprite is to remove the background from it and load the background as a separate image file, but then the background is fetched with an extra request.
Let me know! ;-)
My advice would be to make several sprite maps: group those images that are normally used together. An added advantage is that smaller sprite maps can be optimized better because of the reduced number of colours in the combined image (so you can use a palettized PNG format, even with transparency).
And yes, you should probably leave the (700KB - ouch...) background out of your sprite map.
We will further look into this matter in the next few weeks.
Just know that WebKit doesn't accept .js.gz or .css.gz files; WebKit only accepts .jgz files. Why WebKit does this, I have no idea...
I personally think lack of knowledge is the cause of this: whether you are schooled or self-taught, there is no mention of performance, or even of how browsers work in general. You are only taught how to write HTML and CSS and make the website work. Reflow, repaint and caching are probably not in their vocabulary.
Here at Hotels.nl we take performance really seriously, from optimizing parallel downloads to custom cache engines and custom preload patterns. Not only from a performance perspective but also from a business perspective, optimizing can actually save you money: if you do not cache your resources properly, you are only wasting bandwidth.