Clientside performance no priority for Dutch websites

By crisp on Monday 4 January 2010 02:10 - Comments (28)
Categories: Tweakers.net, Webdevelopment, Views: 10.474

As a senior developer at Tweakers.net specialized in frontend development, I take clientside performance very seriously. Even if your backend code is optimized to the bone, a slowly rendering frontend can still spoil the whole experience for your visitors, and a bad first impression will make them go elsewhere. A couple of recent articles on other Dutch ICT-centered news sites made me wonder whether they take clientside performance just as seriously.

Making a website fast goes far beyond minimizing the total size (in kilobytes the client has to download) of an average page, but reducing overhead is still a good start. If any page, and especially the frontpage of a website where a first-time visitor with an empty browser cache is most likely to 'enter', is already more than a megabyte in size, then you're absolutely doing something wrong. Even these days, when most people are on broadband connections, downloading a megabyte still takes a measurable amount of time. And bandwidth is not the only factor that determines the total time needed to download all resources for a page: the number of items that have to be downloaded matters as well (browsers can only download a limited number of items from a given host in parallel), as do DNS lookups, roundtrips, and TCP and HTTP overhead. Reducing the total size of the required resources therefore already improves the user experience, especially for those who are not on a broadband connection.

Even when the download time for all resources seems small, keep in mind that the client also needs to parse, compile, execute and/or render those items, depending on the type of resource, which in total may take even more time than the download itself. That makes two reasons to reduce the 'footprint' of your site, especially when that can be done without any negative side effect. Which brings me to the first example I came across recently where clientside performance is obviously not being taken seriously by one of our colleague ICT news sites:

Webwereld.nl

Webwereld recently featured an article with the most 'hilarious' emails they received over the past year. One of those emails read (translated): "Congratulations on your new site, but can I have the old site back? This new site is terribly slow; it's like I'm on a 14K4 modem again..." (Webwereld did a frontend redesign in 2009).

Now I don't see anything hilarious in that email; complaints from users about your site being slow should be taken seriously. Even if it is just a perception of reduced performance (which is not uncommon among users after a complete redesign), you should be able to explain it. In this case, however, and with my knowledge of their site before the redesign, the complaining user is probably right: the site's performance did suffer from the redesign, and there are several causes, all of which could be avoided or improved.

First of all, Webwereld seems to use two different JavaScript frameworks, both of which are included on almost every page: Prototype + script.aculo.us (a total of 8 requests, loaded from ajax.googleapis.com) and jQuery (loaded from googlecode.com, not HTTP-compressed). And it's not even the fact that they are using those libraries, but rather that they hardly use any of their features; they could easily have written their own scripts in such a way that they didn't need any library at all.
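
To give an idea of what I mean (just a minimal sketch; the element IDs are made up and not taken from Webwereld's actual markup), the kind of small interface behaviour these libraries are typically pulled in for can be written in a few lines of plain JavaScript:

```javascript
// Minimal sketch - hypothetical element IDs, not Webwereld's actual markup.
// Toggling a menu with plain DOM APIs instead of loading a full framework.
function toggleMenu(id) {
  var menu = document.getElementById(id);
  if (!menu) return;
  menu.style.display = (menu.style.display === 'none') ? '' : 'none';
}

var button = document.getElementById('menu-button');
if (button) {
  if (button.addEventListener) {
    button.addEventListener('click', function () { toggleMenu('main-menu'); }, false);
  } else if (button.attachEvent) {
    // Older IE versions (still relevant in 2010) use attachEvent instead.
    button.attachEvent('onclick', function () { toggleMenu('main-menu'); });
  }
}
```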

Secondly, almost all of the images they use in their interface can be reduced to much smaller file sizes. Using the Smush.it service I could, for instance, reduce http://webwereld.nl/images/inputFieldBg.jpg from 22.2K to 829 bytes, and the same goes for most images Webwereld uses for its interface. This is most certainly an oversight.

http://tweakers.net/ext/f/qqoHsJEpeu9sqRhyFIZ99IPS/full.png
Easy win: reduce filesize of static images using Smush.it

Thirdly, the number of 'tags' for advertising has certainly increased since their redesign. It's no secret that 2009 was a bad year for advertising on websites, so trying to increase advertising income by creating 'special' banner positions is one way of attracting more advertisers to your site. However, increasing the number of bannertags will not get you any more banners than are actually being sold. At Tweakers.net we try to match the number of bannertags created to the number of banners actually sold; 9 out of 10 views on our frontpage don't get any bannertags at all.

Advertising

Advertising actually has a huge impact on clientside performance. It's not only the number of bannertags, which are mostly inline JavaScript-based and thus 'lock' page rendering by the browser; it's also the obtrusiveness and other characteristics that degrade the user experience, such as huge downloads (sometimes complete movies that start playing without interaction), poor response times from adservers, and a large number of requests and redirects. A prime example of the latter is the implementation of in-text advertising on Techzine.nl, which needs 12 HTTP requests and a total of 79KB (263KB uncompressed) worth of resources just to underline a couple of 'keywords' in an article which, when hovered over, present a (notably slow) popup with advertised links. It's not surprising that almost half of their userbase, judging by the poll with the article, declared they would "turn the 'feature' off" - which fortunately is an option given to Techzine's registered members.
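
As a rough sketch of a friendlier approach (the ad-server URL is hypothetical, and this is not the actual code of any of the sites mentioned), bannertags can be injected asynchronously after the content instead of through render-blocking inline scripts:

```javascript
// Rough sketch - the ad-server URL is hypothetical.
// Injecting a bannertag script after the content has loaded, instead of using a
// render-blocking inline <script> in the middle of the page.
function loadAdScript(src) {
  var script = document.createElement('script');
  script.src = src;
  script.async = true; // don't block parsing/rendering while the ad code downloads
  document.getElementsByTagName('head')[0].appendChild(script);
}

// Defer ad loading until the main content has been rendered.
window.onload = function () {
  loadAdScript('http://ads.example.com/bannertag.js');
};
```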

Update: Coen van Eenbergen from Techzine notes that Techzine is no longer using this type of advertising because their users massively disapproved of it. A nice detail is that Webwereld has also experimented with this type of advertising in the past. Tweakers.net did something similar back in 2004, but that was an April fools' joke...

YSlow

So how are other major Dutch websites doing in terms of clientside performance? To determine that, I decided to run the YSlow tool on the most popular websites in the Netherlands (which of course include Tweakers.net):

Site                    | YSlow grade (points) | Frontpage size (empty cache) | Frontpage size (primed cache) | # requests (empty/primed)
anwb.nl                 | C (72) | 520.2K  | 19.9K  | 27 / 26
monsterboard.nl         | E (58) | 577.7K  | 126.8K | 71 / 24
hyves.nl                | D (68) | 1068.5K | 89.5K  | 120 / 9
youtube.nl              | C (78) | 210.7K  | 88.5K  | 27 / 20
ing.nl                  | D (68) | 188.0K  | 18.5K  | 40 / 40
zylom.com               | E (57) | 1085.3K | 57.6K  | 157 / 36
smscity.com             | D (69) | 569.0K  | 20.7K  | 45 / 2
vananaarbeter.nl        | D (64) | 212.6K  | 13.4K  | 32 / 32
nu.nl                   | D (69) | 413.2K  | 80.3K  | 67 / 66
scholieren.com          | C (77) | 214.8K  | 6.6K   | 45 / 8
google.nl               | A (97) | 30.9K   | 4.5K   | 4 / 1
ibood.nl                | B (80) | 146.1K  | 2.2K   | 19 / 18
psv.nl                  | E (56) | 1463.5K | 162.3K | 93 / 91
funda.nl                | B (80) | 89.3K   | 5.9K   | 14 / 13
inpakkenenwegwezen.nl   | D (65) | 428.2K  | 26.9K  | 82 / 78
uitzendinggemist.nl     | D (60) | 502.8K  | 88.8K  | 139 / 67
kika.nl                 | C (77) | 599.4K  | 8.5K   | 19 / 18
receptenweb.nl          | C (71) | 464.5K  | 22.5K  | 34 / 33
relatieplanet.nl        | C (70) | 165.1K  | 82.5K  | 40 / 40
tweakers.net/pricewatch | B (80) | 295.4K  | 19.4K  | 69 / 1
buienradar.nl           | E (58) | 1182.6K | 410.4K | 71 / 63

and I'll add some other sites as well (including our own frontpage and my blog's index :P):

Site                    | YSlow grade (points) | Frontpage size (empty cache) | Frontpage size (primed cache) | # requests (empty/primed)
tweakers.net            | B (83) | 563.0K  | 26.9K  | 54 / 3
geenstijl.nl            | D (68) | 1409.9K | 256.5K | 50 / 49
telegraaf.nl            | E (54) | 1355.6K | 245.3K | 181 / 59
fok.nl                  | C (71) | 1241.6K | 48.4K  | 51 / 14
webwereld.nl            | D (62) | 547.5K  | 150.4K | 94 / 36
crisp.tweakblogs.net    | A (99) | 28.8K   | 3.9K   | 14 / 1

Notes

It should be noted that a small total size does not necessarily translate into a higher ranking; in fact, there are a number of background images on the frontpage of Tweakers.net that could easily be reduced to a smaller file size (I already spanked our newsposters for that). It's the total number of necessary requests, and the number of requests on return visits, that make our site better than most of the rest. I must be honest here and tell you that the Tweakers.net examples did not contain any bannertags, but since this is true for 90% of all visits to our frontpage I think it is the most representative. As for all the other figures: these are based on the first hit on those pages, so they may be somewhat skewed (although I do think they are mostly representative).

The mystery of primed cache # requests

Some of you may have noticed that for some sites the number of primed cache requests almost equals the number of initial requests, but with a much smaller total download size. This is due to the caching mechanism used for resources that only have an ETag header without a (far-future) Expires header. ETag is a mechanism that still requires the user agent to regularly check for updates of the resource, whereas Expires tells the user agent that it doesn't have to check and can just take the resource from cache until it expires.
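
A minimal sketch of the difference (assuming a simple Node.js server; the /static/ prefix and the one-year lifetime are just illustrative):

```javascript
// Minimal sketch (Node.js) - the /static/ prefix and one-year lifetime are illustrative.
// With a far-future Expires/Cache-Control header the browser can reuse its cached copy
// without asking the server again; with only an ETag it still sends a conditional
// request (If-None-Match) on every visit, even if the answer is just "304 Not Modified".
var http = require('http');

http.createServer(function (req, res) {
  if (req.url.indexOf('/static/') === 0) {
    res.setHeader('Expires', new Date(Date.now() + 365 * 24 * 3600 * 1000).toUTCString());
    res.setHeader('Cache-Control', 'public, max-age=31536000');
  }
  res.end('...'); // actual file serving omitted
}).listen(8080);
```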

Conclusion

Considering that it is possible to achieve a B-grade for a large dynamic website such as Tweakers.net, anything less than a C-grade should be deemed sub-optimal. Most sites could improve their clientside performance with just a few simple measures. It's a pity (even a shame, maybe) that most large Dutch websites aren't paying attention to this; in my opinion it should be a basic part of the development cycle of any (large) website.


Comments


By Tweakers user i-chat, Monday 4 January 2010 08:30

nice 'article', it's btw almost a shame that it's posted here rather than on the FP,
definitely a 'must read' in my honest opinion...


@creep - yupz * i-chat thinks so too :S

[Comment edited on Monday 4 January 2010 09:09]


By Tweakers user Creep, Monday 4 January 2010 09:02

Nice blog, but I have to mention that the yellow is a bit too much on my frontend.

By Tweakers user -RetroX-, Monday 4 January 2010 09:20

Very informative. Would love to see if any improvements have been made in 6 months or so.

By Tweakers user crisp, Monday 4 January 2010 09:33

> Nice blog, but I have to mention that the yellow is a bit too much on my frontend.
I thought it nicely complements the background of my blogpage :P

By Tweakers user i-chat, Monday 4 January 2010 09:49

> I thought it nicely complements the background of my blogpage :P
it's kinda contradicting (is that the right word?) - your post about 'front-end design'
though one could argue that, as it's the background, it's also probably part of the back-end.

even though I'd have to slap that person on the head for such a remark.... :P

By Tweakers user crisp, Monday 4 January 2010 09:56

I'm a developer, not a designer ;)

By Tweakers user Ahrnuld, Monday 4 January 2010 10:17

You're also a blogger using a custom layout :) which makes you the designer of this page.

Very good post!

By Tweakers user Blaise, Monday 4 January 2010 11:01

> I'm a developer, not a designer ;)
It shouldn't have to be that obvious ;)

I'm also surprised big websites aren't aware of this. I thought most of these sites were built by competent front-enders, because front-end optimization isn't really rocket science and any front-end developer should already know how to optimize code and images.

By Tweakers user i-chat, Monday 4 January 2010 11:28

well to be honest I worked with a few 'programmers' (not online/web-based) so it didn't really surprise me,

What DID surprise me however is that it is 'this bad'; I never expected any E-ratings, I guess I'm pretty wrong here...

even though I'm not into programming myself, so I'm not really the one to point fingers at others (I guess), I do think this is REALLY BAD..

By Tweakers user 90710, Monday 4 January 2010 12:02

Can you please test www.microse.nl for this? I simply refuse to use Firefox.

By Tweakers user s.stok, Monday 4 January 2010 12:14

You have a Grade C.

My site currently has a grade B.
Optimization is not possible at this moment, but it is very well planned for the future.

The customer panel has grade C.
The admin panel grade B.

I'm planning to go for a grade A :9
But that will take some time.

By Tweakers user MMaI, Monday 4 January 2010 12:15

Grade C
Overall performance score 79
Ruleset applied: Default_ NoCDN
URL: http://microse.nl/

Empty Cache
HTTP Requests - 15
Total Weight - 131.4K

Primed Cache
HTTP Requests - 15
Total Weight - 5.4K

Graded F for gzip, Etags, Expires

By Tweakers user rutgerlak, Monday 4 January 2010 13:23

Client:
"Grade A Overall performance score 99"

Nearly all my pages have grade A (90-99 out of 100). You could also give PageSpeed a try. Google Sitemaps/Webmaster Tools rates your site and gives suggestions. Consider using both tools for best performance. Ads can (as you said) significantly reduce performance.

Server:
My pages are not compressed because I run a loadbalancer/proxy in front of my backend webservers, so Varnish does not have to store 2 versions of each page (gzip and non-gzip). Currently I optimize my pages by removing all linebreaks. For dynamic content within cached pages I use ESI.

As you can see, you need more than clientside optimization; serverside deserves the same attention.

By Tweakers user RobIII, Monday 4 January 2010 17:03

Nice article crisp, as always d:)b
> Can you please test www.microse.nl for this? I simply refuse to use Firefox.
:D Too lame to install Firefox (which can be done in under a minute), admitting you "refuse" to install it (for God-knows-why), and still having the balls to ask someone else to test your site :X |:( Even though we're fresh into a new year, you're at the lonely top of my lame list.

[Comment edited on Monday 4 January 2010 17:05]


By Tweakers user Blokker_1999, Monday 4 January 2010 17:32

Especially when you are writing websites you should test them in the most popular browsers even if you don't like those browsers.

By Coen van Eenbergen, Monday 4 January 2010 18:08

This article isn't up to date at the moment.

Since I'm one of the owners of Techzine.nl I thought I should respond here.

The in-text ads mentioned above were running on Techzine.nl for a short period of time as a test; since our users massively disapproved of these ads we took them offline within a few days. Not mainly because of the speed, but more for the user experience.

I also did some research on these ads, and the in-text advertisements are indeed not very fast to load, as crisp said. But what crisp missed, and probably blamed the in-text ads supplier for, is some code that is being loaded by Webads.

The Webads bannercode is very complex and pulls its information from 5 subdomains of the domain webads.nl. These servers also don't use any form of compression. Then there are advertisers and media companies that prefer their own tracking systems on top of the Webads one, which leads to even slower performance.

I already mentioned this to the people of Webads who are looking into the issue. I also heard that other publishers have complained about this bannercode.

For the non-Dutch people reading this blog:
Webads is one of the bigger premium advertising networks in the Netherlands and provides advertisements to a few hundred websites.

So besides Techzine.nl, many, many other websites have slower performance than they would if the advertising agencies invested more time and money in their systems and in agreements with advertisers concerning the tracking and serving of banners.

Since we are a small publisher, like many others, we don't have the resources to do the advertising sales ourselves. You need to partner with a party like Webads, and you are forced to put this badly written and badly performing code on your website.

So as an owner of a website you cannot always resolve these kinds of issues or be blamed for them. Of course we chose to work with Webads, but in our opinion there aren't any good alternatives on the Dutch market at the moment.

I would also like to say that the part written under the header Techzine, about videos and downloads being started/downloaded automatically, is not related to Techzine. Crisp meant this in general, I guess.

By Tweakers user Civil, Monday 4 January 2010 18:16

I see that Coen has already written a comment here. As I'm also one of the developers of Techzine, I would like to add a short comment.

As you mentioned in your article, you have tested Tweakers.net without ads. I think - but don't know exactly - that Tweakers hosts the JavaScript needed for banner loading on its own domain (including all optimizations) in the main JavaScript files, instead of serving it from the servers of advertising companies like Webads (mostly non-optimized).

That difference should have a huge impact on the YSlow score. The non-optimized code of advertising companies is a pain in the neck. If you are in a situation where you can combine the loading code with your own code, it's a huge advantage.

When testing Techzine without the bannercode, the site by itself easily scores a B grade (similar to Tweakers); we tested that, of course. But further optimization could be done here and there.

The struggle to get advertising companies to optimize their code is not always easy...

By Tweakers user 90710, Monday 4 January 2010 19:22

@RobIII:
Firefox is too lame to be installed, that's why. I installed Firefox in a Virtual Machine after reading MMaI's reply, in order to plan improvements.

By Tweakers user crisp, Monday 4 January 2010 20:31

> Crisp meant this in general, I guess.
That's right; I have updated the heading to reflect that better. It wasn't primarily about Techzine; I just took the in-text advertising as an example of how bad advertising implementations can be - which you also seem to agree with :) A good thing you got rid of it.
> I think - but don't know exactly - that Tweakers hosts the JavaScript needed for banner loading on its own domain (including all optimizations) in the main JavaScript files [...]
No, we don't. We simply don't put any bannertags on 90% of our index pages (frontpage, forum and pricewatch). With bannertags our index also drops to a C-grade, indeed mainly because of unoptimized flash-loading code served from different domains, so it is not so different from your situation (besides, we also used Webads before).

We do, however, load the bannertags after the content, and we also combine tags for multiple formats in order to reduce the overall number of tags we need to include in the page (and the total number of ads that can be displayed simultaneously).

[Comment edited on Monday 4 January 2010 20:32]


By Tweakers user momania, Monday 4 January 2010 20:54

Interesting stuff as always d:)b

By Tweakers user Manuel, Tuesday 5 January 2010 00:18

Crisp, nice item you posted here, but I still have some questions for you; answer them if you want.

If I check my own site, with an image sprite of 1.34 MB (- 700 kB because of the background), and visit with an empty cache, then the render time for the browser is about 1.288s according to YSlow, but if I check with Firebug (Net Panel) the loading time is around 1.45s. (ISP: Ziggo | AiE).

Now my question is: the image sprite is transparent; how can you optimize / shrink a transparent image without removing the transparent part? I've read lots of documents from Google (Page Speed) and also from Yahoo (YSlow) but I can't find a solution to my problem.

The only way I know to shrink the sprite is to remove the background from the sprite and then load the background as a separate image file. But then the background will be fetched with an extra request.

Let me know! ;-)

By Tweakers user crisp, Tuesday 5 January 2010 09:54

@Manuel: putting all of your background images in one huge spritemap is not very efficient; not only do your visitors have to download this 1.34MB file on the first hit, whereas it is unlikely that every sprite in that spritemap is actually used on a single page, but it is also quite memory-costly for the browser, and having to clip portions of such a large image at several positions on the page actually takes more rendering time than clipping smaller images.

My advice would be to make several spritemaps: group the images that are normally used together. An added advantage is that smaller spritemaps can be optimized better because of the reduced number of colours in the combined image (so you can use a palettized PNG format - even with transparency).

And yes, you should probably leave the (700KB - ouch...) background out of your spritemap.

By Tweakers user himlims_, Wednesday 6 January 2010 14:11

some things remain abracadabra to me, but your vision is always clear ... once again an amusing / interesting blog post

By Tom Sanders, Thursday 7 January 2010 09:57

very interesting post, and thank you for pointing out these issues.
We will look further into this matter in the next few weeks.

Tom Sanders
Webwereld

By Tweakers user crisp, Friday 8 January 2010 02:35

Hi Tom; good to see that you take my advice seriously. Please note that the only reason I pointed out Webwereld was the fact that you seemed to dismiss the (imo valid) criticism of one of your users. In any other case I would probably have targeted fok.nl (who are also using 2 JavaScript frameworks simultaneously), hyves.nl or telegraaf.nl ;)

By Tweakers user Manuel, Sunday 10 January 2010 15:40

@crisp: Thanks for your tip; I've split the images and made some tiny adjustments. I also gzipped the JS (and CSS) files and there is a huge difference compared to the version without these adjustments.

The only thing is that WebKit doesn't accept .js.gz or .css.gz files; WebKit only accepts .jgz files. Why WebKit does this, I have no idea...

By Arnout, Thursday 11 March 2010 10:24

Clientside performance might not be a priority for most Dutch websites, but that doesn't mean there aren't websites focused on front-end performance. Most of these websites go to great lengths to provide their users the best and fastest web experience possible.

I personally think a lack of knowledge is the cause of this: whether you are schooled or self-taught, there is no mention of performance or even of how browsers work in general. You are only taught how to write HTML and CSS and make the website work. Reflow, repaint and caching are probably not in their vocabulary.

Here at Hotels.nl we take performance really seriously, from optimizing parallel downloads to custom cache engines and custom preload patterns. Not only from a performance perspective but also from a business perspective: optimizing can actually save you money. If you do not cache your resources properly, you are only wasting bandwidth.

