So you think you know website optimisation???

Steve Souders has put up the test for his Stanford class on website optimisation.

There are some really challenging questions in there.

Why were ETags introduced in HTTP/1.1?
What are four techniques for reducing cookie weight?
List five techniques for making selector matching faster.
Why do ETags (with the default Apache and IIS syntax) degrade performance with regard to proxy caching?
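On that last question: Apache's default ETag is derived from a file's inode, modification time and size, and the inode differs between machines, so the same file served from two servers in a cluster gets different ETags and conditional requests miss in proxy caches. A minimal sketch of the usual fixes in Apache config (my illustration, not Steve's answer key):

```apacheconf
# httpd.conf — a sketch, assuming a multi-server setup behind a proxy cache.
# Apache's default is equivalent to "FileETag INode MTime Size";
# the inode differs per server, so identical files get different ETags.

# Option 1: build the ETag from modification time and size only,
# so every server in the cluster emits the same validator.
FileETag MTime Size

# Option 2: disable ETags entirely and rely on Last-Modified / Expires.
# (Header unset ETag requires mod_headers.)
# FileETag None
# Header unset ETag
```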

Kudos to Steve for pushing the boundaries of our field. It's time that optimisation was baked in by default and every web developer knew the fundamentals! That starts with teaching the students of the future.

8 Responses to “So you think you know website optimisation???”

  1. Rob Holmes Says:

    I’m not claiming to even know the fundamentals – but I happened across a site which provided the best automated grading I’ve seen (http://website.grader.com). An example report for nzpost.co.nz is here: http://website.grader.com/wsgid/2163592/default.aspx. They also grade Press Releases and Twitter profiles??

  2. john Says:

    Hi Rob,

    This is an SEO report, not website optimisation (which is about making a page fast!)

  3. john Says:

    Test comment

  4. Shoaib Says:

    What book am I supposed to read before sitting that test?

  5. john Says:

    You’re supposed to know this already….

  6. Shoaib Says:

    Yeah... but the exam itself specifically references some book. Ah, there it is:

    “List rules 12-14 as well as the first four chapters of High Performance Web Sites Volume 2 that were made available:”

    “High Performance Web Sites Volume 2”. My guess: http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309/ref=sr_1_1?ie=UTF8&qid=1231999764&sr=8-1

  7. Rob Holmes Says:

    I went to the TradeMe Rockstars presentation last night, where TradeMe again emphasised the importance of speed. I happened across an article today which reports Google’s findings when fine-tuning their search results (a report from a talk at the 2006 Web 2.0 conference). They found a half-second slow-down equated to a pretty incredible 20% drop in traffic. A similar experiment at Amazon found that just a 100 ms increase in load time resulted in substantial drops in revenue.

    There are also some good comments left on the article, including:

    “IBM did studies on interactive interfaces in the 80s. The results showed that as the return time neared two seconds, the time the user spent staring at the screen increased. The curve turned up dramatically below two seconds. Below one second, someone had to pry them off the keyboard. And that was with non-rich terminal applications.”

    “Research from the 80s and the PLATO Project showed that response time really needs to be about .25 second. For something where users expect a bit of work by the computer, stretching that to .4 I guess is OK. But, I would shoot for .25sec.”

    Articles:
    http://glinden.blogspot.com/2006/11/marissa-mayer-at-web-20.html
    http://www.uiandus.com/2009/02/05/theories/amazon-milliseconds-means-money/

  8. Jan Holmes Says:

    I can’t get to your Google talk address.
    You really do the background research, don’t you?

    You are in the right place; universities would be far too slow for this kind of stuff!!