Wednesday, May 16, 2007

14 rules for fast web pages

Steve Souders of Yahoo's "Exceptional Performance Team" gave an insanely great presentation at Web 2.0 about optimizing website performance by focusing on front-end issues. Unfortunately I didn't get to see it in person, but the Web 2.0 talks have just been put up, and the ppt is fascinating and absolutely a must-read for anyone involved in web products.

His work has been serialized on the Yahoo user interface blog, and will also be published in an upcoming O'Reilly title (est. publish date: Sep '07).

We have so much of this wrong at Topix right now that it makes me want to cry, but you can bet I've already emailed this ppt to my eng team. :) Even if you're pure mgmt or product marketing, you need to be aware of these issues and how they directly affect user experience. We've seen a direct correlation between site speed and traffic.

This is a big presentation with a lot of data in it (a whole book's worth, apparently), but halfway through he boils it down into 14 rules for faster front-end performance:

  1. Make fewer HTTP requests
  2. Use a CDN
  3. Add an Expires header
  4. Gzip components
  5. Put CSS at the top
  6. Move JS to the bottom
  7. Avoid CSS expressions
  8. Make JS and CSS external
  9. Reduce DNS lookups
  10. Minify JS
  11. Avoid redirects
  12. Remove duplicate scripts
  13. Turn off ETags
  14. Make AJAX cacheable and small
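
To make a few of these concrete, here is a minimal sketch of rules 3, 4, and 13 (far-future Expires, gzipped components, no ETags) in a TypeScript/Node.js handler. This is my own illustration, not code from the talk, and the payload and port are made up:

```typescript
import * as http from "http";
import * as zlib from "zlib";

const page = "<html><body>hello</body></html>"; // hypothetical component

http.createServer((req, res) => {
  res.setHeader("Content-Type", "text/html");

  // Rule 3: a far-future Expires header lets the browser cache aggressively.
  const oneYear = 365 * 24 * 3600 * 1000;
  res.setHeader("Expires", new Date(Date.now() + oneYear).toUTCString());

  // Rule 13: emit no ETag. Node's core http module adds none by default;
  // if your framework or web server does, turn it off there.

  // Rule 4: gzip the component when the client advertises support.
  const accepts = String(req.headers["accept-encoding"] || "");
  if (accepts.indexOf("gzip") !== -1) {
    res.setHeader("Content-Encoding", "gzip");
    res.end(zlib.gzipSync(page));
  } else {
    res.end(page);
  }
}).listen(8080);
```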

The full talk has details on what all of these mean in practice. The final slide of the deck is a set of references and resources, which I've pulled out here for clickability:

book: http://www.oreilly.com/catalog/9780596514211/
examples: http://stevesouders.com/examples/
image maps: http://www.w3.org/TR/html401/struct/objects.html#h-13.6
CSS sprites: http://alistapart.com/articles/sprites
inline images: http://tools.ietf.org/html/rfc2397
jsmin: http://crockford.com/javascript/jsmin
dojo compressor: http://dojotoolkit.org/docs/shrinksafe
HTTP status codes: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
IBM Page Detailer: http://alphaworks.ibm.com/tech/pagedetailer
Fasterfox: http://fasterfox.mozdev.org/
LiveHTTPHeaders: http://livehttpheaders.mozdev.org/
Firebug: http://getfirebug.com/
YUIBlog, performance research part 1: http://yuiblog.com/blog/2006/11/28/performance-research-part-1/
YUIBlog, performance research part 2: http://yuiblog.com/blog/2007/01/04/performance-research-part-2/
YUIBlog, performance research part 3: http://yuiblog.com/blog/2007/03/01/performance-research-part-3/
YUIBlog, performance research part 4: http://yuiblog.com/blog/2007/04/11/performance-research-part-4/
YDN, high performance sites: http://developer.yahoo.net/blog/archives/2007/03/high_performanc.html
YDN, rule 1 (make fewer HTTP requests): http://developer.yahoo.net/blog/archives/2007/04/rule_1_make_few.html

5 Ways People Screw Up AJAX

I had noticed that not many articles existed on the negative aspects of AJAX implementations, so I came up with this top-5 list of things people screw up when using AJAX.

1. No back button!:
One of the most annoying things to a user is the inability to go backwards. They may visit a site, perform a few searches, and want to go back two searches. Some AJAX sites make the simple task of going back extremely difficult: hitting back dumps the user all the way out to the page they first clicked through to reach the feature, wiping out their history within the app.
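
One common workaround is to mirror each AJAX state into the URL's fragment identifier, so the back button steps through search states instead of leaving the site. A rough sketch of my own, where runSearch() is a hypothetical stand-in for the app's real AJAX call:

```typescript
declare function runSearch(query: string): void; // hypothetical app function

function doSearch(query: string): void {
  runSearch(query);
  // Setting the fragment pushes a new browser history entry.
  location.hash = "q=" + encodeURIComponent(query);
}

// Poll for hash changes; the onhashchange event isn't widely supported
// yet, so polling is the standard trick.
let lastHash = location.hash;
setInterval(() => {
  if (location.hash !== lastHash) {
    lastHash = location.hash;
    const m = lastHash.match(/^#q=(.+)$/);
    if (m) runSearch(decodeURIComponent(m[1])); // restore the older state
  }
}, 100);
```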

2. No more links:
As item 4 below also notes, if people can't find your site or a specific section of it, you'll lose traffic. Poor implementations that fetch all content dynamically via AJAX requests never give the user a URL they can forward along or bookmark.
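
The fragment trick from item 1 restores linkability too: a bookmarked or forwarded URL like /search#q=pizza (a made-up example) can rebuild the state it names when the page loads. Again a sketch, reusing the hypothetical runSearch():

```typescript
declare function runSearch(query: string): void; // hypothetical app function

// On page load, check for a bookmarked or forwarded fragment URL and
// rebuild the state it names, instead of showing the blank initial page.
window.onload = () => {
  const m = location.hash.match(/^#q=(.+)$/);
  if (m) runSearch(decodeURIComponent(m[1]));
};
```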

3. Overcomplication when it isn't needed:
As with any new technology, people get excited and things end up more complicated than they really need to be. Do you really need to ajaxify your contact form?

4. Removing site indexability:
Depending on how your dynamic content is implemented, web spiders may have a hard time finding all of the content available on your site. This can happen when content lives in a DB that is only reachable via AJAX and web service calls. If a crawler can't obtain your content, how are users supposed to find it?
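
The usual fix is progressive enhancement: put real, crawlable hrefs in the markup and let script hijack the click for JS-capable browsers, so spiders (which don't run JS) still reach everything. A sketch, where loadIntoPage() is a hypothetical XHR helper and the "ajax" class name is made up:

```typescript
declare function loadIntoPage(url: string): void; // hypothetical XHR helper

// Markup keeps plain links a spider can follow, e.g.
//   <a class="ajax" href="/forums/local-news">local news</a>
// For JS-capable browsers we cancel the navigation and fetch the same
// URL via XHR instead.
const anchors = document.getElementsByTagName("a");
for (let i = 0; i < anchors.length; i++) {
  const a = anchors[i];
  if (a.className === "ajax") {
    a.onclick = () => {
      loadIntoPage(a.href); // same URL the crawler sees
      return false;         // cancel the normal page load
    };
  }
}
```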

5. Web server connections increase:
One of the advantages of AJAX is that its responses are tiny compared to the full pages of classic web browsing. While this may save some bandwidth, it can also exhaust your web server's connection limit, requiring a retuning of the server or, worst case, more hardware when implemented poorly. I'm not saying this is the case for most AJAX implementations by any means; however, more requests (whether from polling or direct user actions) means more connections per user on average, which, depending on your user base, can really add up.
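
To put rough numbers on it: 10,000 concurrent users polling every 5 seconds is 2,000 requests per second before anyone clicks anything. One common mitigation is to back the polling interval off while the user is idle. A sketch, where checkForUpdates() and all the intervals are made up:

```typescript
declare function checkForUpdates(): void; // hypothetical AJAX poll

let delay = 5000;          // start by polling every 5 seconds
const MAX_DELAY = 60000;   // never poll slower than once a minute

function poll(): void {
  checkForUpdates();
  delay = Math.min(delay * 2, MAX_DELAY); // back off while idle
  setTimeout(poll, delay);
}

// On user activity, reset the interval; the faster rate takes effect
// after the currently scheduled poll fires.
document.onmousemove = document.onkeypress = () => { delay = 5000; };

setTimeout(poll, delay);
```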