Archive for May, 2008

Xlinks Digest – 31 / 05 / 2008

Saturday, May 31st, 2008

Xlinks is a digest of interesting links that have been found by the ProjectX staff.

    How to pick up a VC – Guy Kawasaki
    Added on 05/31/2008 at 07:03PM

    Google releases Google Earth Plugin for Browser: 3 months ahead of when I predicted
    Added on 05/31/2008 at 05:14PM

    New iPhone rumoured to have GPS
    Added on 05/31/2008 at 03:13PM

    Prototip2 – Create beautiful tooltips with ease
    Added on 05/31/2008 at 03:09PM

    ‘Help me Obi Wan Kenobi’ – Musion make Holograms a reality
    Added on 05/31/2008 at 11:45AM

    Findability Orphan of the Web Design Industry
    Added on 05/30/2008 at 12:45PM

    Zebra Striping – does it help?
    Added on 05/30/2008 at 12:28PM

    Improve Page performance with lazy loading
    Added on 05/30/2008 at 12:17PM

    Bill Gates – Windows 95 was a nice milestone: Sheesh. That was a long, long time ago
    Added on 05/30/2008 at 08:54AM

    Maps and iPhone
    Added on 05/29/2008 at 04:38PM

    Times machine – Browse NY Times archives from 1865
    Added on 05/29/2008 at 04:25PM

    Difference between a fifty-year-old carpenter and a novice

    A novice needs multiple projects to learn how to build iteratively without mucking up the whole project
    Added on 05/29/2008 at 12:04PM

    Heroku – Rails hosting management system: WOW
    Added on 05/28/2008 at 11:44PM

    Earthmine – 3D maps are coming
    Added on 05/28/2008 at 11:20PM

    The art of not finishing
    Added on 05/28/2008 at 10:47PM

    OK / Cancel or Cancel / OK
    Added on 05/28/2008 at 10:29PM

    Cool big scary metal statues
    Added on 05/28/2008 at 07:45PM

    Yodeler – YUI extensions
    Added on 05/28/2008 at 10:00AM

    Google Ajax Libraries API
    Added on 05/28/2008 at 09:30AM

    Buses as mobile sensing platforms
    Added on 05/27/2008 at 01:41PM

The future of the mobile web is small webpages

Saturday, May 31st, 2008

Nat Torkington posted about this presentation by Cloudfour on their research into the mobile web – Going fast on the Mobile Web. It contains some fantastic insights into how the iPhone is changing the way people use the mobile web.

It also outlines their research into how mobile browsers handle compression and caching, details 8 tips for delivering content on the mobile web, and includes tips specific to the iPhone.

Pair that with Steve Souders' research on iPhone caching:

The iPhone cache experiment suggests an additional performance rule specific for developing web sites for the iPhone:

Reduce the size of each component to 25 Kbytes or less for optimal caching behavior.

Given that the wireless network speed on iPhone is limited and the browser cache is cleared across power cycle, it is even more important to make fewer HTTP requests to achieve good performance than in the desktop world…

Also, the maximum cache limit of all components is 475 – 500 KB. Minify all the JavaScript, CSS and HTML.
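Souders' 25 KB rule is easy to enforce in a build step. Here is a minimal sketch; the component list and URLs are made up for illustration:

```javascript
// Flag page components that exceed the iPhone's per-component
// cache limit of roughly 25 KB identified in Souders' experiment.
var IPHONE_CACHE_LIMIT = 25 * 1024; // bytes

function oversizedComponents(components) {
  // components: array of { url, bytes }
  return components
    .filter(function (c) { return c.bytes > IPHONE_CACHE_LIMIT; })
    .map(function (c) { return c.url; });
}

// Example: only app.js blows the budget.
var tooBig = oversizedComponents([
  { url: "/css/site.css", bytes: 18 * 1024 },
  { url: "/js/app.js",    bytes: 96 * 1024 },
  { url: "/img/logo.png", bytes:  4 * 1024 }
]);
// tooBig → ["/js/app.js"]
```

Run against your real asset list, anything it returns is a candidate for splitting or minifying further.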

The future of the mobile internet is small webpages.

Finally, here is the slide deck – it's well worth a read.

What happens if you have good broadband?

Friday, May 30th, 2008

Om Malik has just written a post about broadband usage in London. In London there are a number of providers, and they have quality services for Wi-Fi, broadband and 3G. Here’s the full report from Ofcom.

In London:


* 40 percent of people watch TV or video content online.
* 20 percent make VoIP calls.
* 32 percent are using their mobile phones to access the Internet.
* 19 percent listen to audio content on their mobiles.

The evidence is mounting: better broadband and mobile services will result in more usage, and more opportunities for companies and content providers to create new revenue streams. Let’s hope things will change for the better this decade!

Speed up your JavaScript – run Prototype / jQuery from Google

Wednesday, May 28th, 2008

Google has just released the AJAX Libraries API – an API that allows developers to load a number of popular JavaScript libraries from Google's cloud.

You can use the Google AJAX Libraries API to reference:


  • Prototype
  • jQuery
  • MooTools
  • script.aculo.us
  • Dojo

Why is this a good idea?

  1. It's fast – it's hosted on Google's CDN.
  2. The libraries have already been optimised, i.e. minified and delivered with compression.
  3. It enables better caching of key libraries from one site to another. E.g. you visit site A, which uses MooTools, and the browser downloads the JavaScript; you then visit site B, which also uses MooTools, and the browser fetches it from the cache.

How do I use it?

It's really easy to use. Either call a script directly, e.g.

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>

or use the Google loader to load the scripts:

<script src="http://www.google.com/jsapi"></script>
<script>
// Load jQuery
google.load("jquery", "1");
</script>
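The direct-load URLs follow one predictable pattern (library name, then version, then file), so a tiny helper can build them. The helper itself is hypothetical – it is not part of Google's API:

```javascript
// Build a direct-load URL for a library hosted on the AJAX Libraries
// API CDN. Pattern: http://ajax.googleapis.com/ajax/libs/<lib>/<version>/<file>
function cdnUrl(lib, version, file) {
  return "http://ajax.googleapis.com/ajax/libs/" +
    [lib, version, file].join("/");
}

var jqueryUrl = cdnUrl("jquery", "1.2.6", "jquery.min.js");
// → "http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"
```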

A couple of requests to Google

The AJAX Libraries API is a fantastic idea, but it could be better. We have a couple of requests 🙂.

  1. Can you host more of the legacy versions of the libraries? (We use an old version of Prototype; the recent versions are a little bloated for what we need.)
  2. It would be great if you could provide stats on how many sites in a particular country are using the libraries. Then we could see whether other sites are using the same libraries, and whether we get the benefit of caching 🙂
  3. Properly minify the scripts! Thanks to Eric for pointing it out.
  4. Fix the website – the FAQ and Groups links go to the wrong place!

Xlinks Digest – 27 / 05 / 2008

Tuesday, May 27th, 2008

Here is a digest of interesting links as discovered by the ProjectX staff.

    This Is Funny Only if You Know Unix
    Added on 05/27/2008 at 12:00PM

    OMeta – Object oriented Language for pattern matching
    Added on 05/27/2008 at 11:57AM

    Rolling with Rails 2.1 – part 2
    Added on 05/27/2008 at 11:55AM

    Rolling with Rails 2.1 – part 1
    Added on 05/27/2008 at 11:53AM

    21 ruby tricks
    Added on 05/27/2008 at 11:52AM

    Javascript information visualisation toolkit
    Added on 05/27/2008 at 11:52AM

    Super Mario Kart in Javascript
    Added on 05/27/2008 at 11:49AM

    Faster Wireless Networks
    Added on 05/26/2008 at 09:55PM

    Web users getting more selfish
    Added on 05/26/2008 at 02:24PM

    Advice to Twitter – Think ahead
    Added on 05/23/2008 at 03:30PM

    2nd gen OLPC is an e-book
    Added on 05/22/2008 at 09:39AM

    Top 10 worst entry level tech jobs in US
    Added on 05/22/2008 at 09:36AM

    Hash Functions – Theoretical basics mixing and evaluations of Algorithms
    Added on 05/22/2008 at 09:26AM

    Seth Godin – How to read a business book
    Added on 05/22/2008 at 09:22AM

    Geohashing comic – it's easy really
    Added on 05/22/2008 at 09:20AM

    A selection of awesome business cards
    Added on 05/22/2008 at 08:37AM

    Doloto – splitting up ajax calls
    Added on 05/22/2008 at 08:33AM

    How to deploy a skunkworks application
    Added on 05/21/2008 at 09:17AM

    PC World – Top 50 Technology visionaries
    Added on 05/21/2008 at 08:51AM

    Evolutionary algorithms now surpass human designers
    Added on 05/21/2008 at 07:33AM

Speed up NZ internet – Lesson One: Maximise Parallel downloading

Tuesday, May 27th, 2008


What is parallel downloading?

In order to explain parallel downloading, we first have to illustrate how a webpage is rendered by a browser. When you type in a URL and press enter, the browser looks up the DNS entry for the site, connects to the webserver and requests the base HTML of the page. The browser downloads that file and then starts to parse the HTML. It queues up the components of the page it needs to load and then finally begins to render the page.

If the webpage and web server are set up properly, the browser will seek to download the various components in parallel, i.e. it can load multiple components at the same time. Unfortunately, JavaScript and CSS files can block other components from being downloaded until they themselves are fully loaded. This blocking behaviour can have a huge impact on a user's experience of the page, and is illustrated below.

Why do I want my site to load in parallel?

The major objective of website optimisation is to minimise the time spent downloading a webpage. One way to achieve this is to load as many components as possible in parallel (without compromising performance!). The HTTP/1.1 spec recommends that browsers download no more than two files in parallel from a single hostname.

As we move more towards the broadband era, we want to promote more parallel downloading to speed up the delivery of our web content.


Below I have included the download profiles, captured in IE 6.5, of two New Zealand websites to illustrate why parallel downloading is important.

Good – Parallel downloading

Here you can see the download profile of the Trade Me homepage. The images load in chunks of 4 at a time. This illustrates how much you can speed up the loading of your page components.

Bad – Sequential downloading

Here you can see the download profile of the Sky City Cinema homepage. Notice the stepping behaviour of the JavaScript files: they load in single file, one after the other – NOT in parallel. You can see from this behaviour that the loading of the script files blocks the rest of the content, such as images, from loading. This explains the delay before the user starts to see the rest of the page.

How do I get my site to load in parallel?

  1. Analyse your current downloading pattern. Use Firebug or SiteTimer to show you how your page is loading. Look for what is loading in parallel and what is loading sequentially.
  2. Minimise the number of blocking files, i.e. JavaScript and CSS. Where possible, combine files and use web compression to speed up loading of the site.
  3. Split your page components across multiple domains, e.g. one for images, one for ad components. TIP: Create a domain alias for your site, e.g. one for all your images. The browser treats this as a new domain even though it's referencing the same webserver.
    Example: For ZoomIn maps we use 4 domains for the map images, which means on a broadband connection the browser can load up to 8 images at a time.
  4. Note: YSlow research shows that a total of 2-4 domains per page is optimal.
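The domain-alias tip above can be sketched as a deterministic shard picker: hash each image path to one of N hostname aliases, so the same image always maps to the same alias and stays cacheable across pages. The img0–img3.example.com hostnames are made-up examples:

```javascript
// Deterministically spread image requests across N hostname aliases
// so the browser can open more connections in parallel.
var SHARDS = ["img0.example.com", "img1.example.com",
              "img2.example.com", "img3.example.com"];

function shardHost(path) {
  // Cheap string hash: running sum of character codes, mod shard count.
  var hash = 0;
  for (var i = 0; i < path.length; i++) {
    hash = (hash + path.charCodeAt(i)) % SHARDS.length;
  }
  return SHARDS[hash];
}

function shardedUrl(path) {
  return "http://" + shardHost(path) + path;
}
```

Because the mapping is a pure function of the path, every page on the site emits the same hostname for a given image, which is what keeps the browser cache effective.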


NZ homepage 'frontend' vs 'backend' stats

Monday, May 26th, 2008

I’ve just completed a mini audit of the total size of frontend vs backend requests for the top 75 NZ homepages.

The average backend request size is 27.32K (i.e. the size of the page's base HTML), which equates to an average of 10.65% of total page size. In other words, 10.65% of a page is HTML and the other 89.35% is made up of the rest of the content.
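As a quick sanity check, here is the arithmetic behind those averages:

```javascript
// Back out the average total page size from the audit's figures:
// the base HTML averages 27.32 KB and makes up 10.65% of the page.
var backendKB = 27.32;
var backendShare = 0.1065;

var totalKB = backendKB / backendShare; // ≈ 256.5 KB average page
var frontendKB = totalKB - backendKB;   // ≈ 229.2 KB of CSS, JS and images
```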

What does this mean and why is it important?

A backend request is the request for the base HTML of the page. The frontend requests are all subsequent requests that load the CSS, images, JavaScript etc. of the page. As you can see in the above download profile of the ZoomIn homepage, the base HTML loads first and the other components of the page (i.e. CSS, JavaScript and images) load after it.

This behaviour is a really important aspect of web performance: the faster you can load the backend HTML, the faster the rest of your site starts loading and, more importantly, the faster your users can start using the site.

Google to penalise adwords ranking for slow loading webpages

Thursday, May 22nd, 2008

Why does web performance matter?

Google has announced that it will start adding page load times into the formula for ad ranking. They say that it may have an effect on your keyword score.


Here’s what Google says about why they are making the change.

Why are we doing this?

Two reasons: first, users have the best experience when they don’t have to wait a long time for landing pages to load. Interstitial pages, multiple redirects, excessively slow servers, and other things that can increase load times only keep users from getting what they want: information about your business. Second, users are more likely to abandon landing pages that load slowly, which can hurt your conversion rate.

There is more information at the AdWords Help Center.

NZ Government Home page web performance audit

Thursday, May 22nd, 2008

This is Part II of my audit of New Zealand homepages (Part I looked at the top 75 homepages in New Zealand).

I have conducted an audit of 320 Government websites, looking at their web performance. The audit used YSlow to analyse the download performance of each Government website's homepage. The list of government websites was taken from the New Zealand Government website. Full results are available from this Google Spreadsheet. The audit was conducted on Tuesday night, 20th May 2008.

Audit Findings

Homepage Size

Graph of NZ Government Home page sizes

(Click on the graph to view values)

  • The average homepage size was 216.85K (compared to 304.5K for the top 75 NZ websites). The largest homepage was a staggering 2442K and the smallest was 8.3K.

Web Compression

Graph of Web compression usage on Government webpages

  • 84.1% of Government homepages are not compressed, 11.6% compress some of their page elements and only 4.4% (14 sites) fully compress their pages.
  • Some sites compress their CSS and JS files but NOT their base HTML.
  • A lot of sites include compressed external JavaScript, e.g. Google Analytics (which was discounted as being part of the site). This highlights that all external JavaScript should be compressed.
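The compression bucketing used in this audit can be reproduced from response headers alone. A minimal sketch, assuming Node-style lower-cased header names:

```javascript
// Classify a response as compressed or not from its headers,
// mirroring the audit's "None" vs compressed buckets.
function compressionStatus(headers) {
  var enc = (headers["content-encoding"] || "").toLowerCase();
  return (enc.indexOf("gzip") !== -1 || enc.indexOf("deflate") !== -1)
    ? "Compressed"
    : "None";
}
```

Point it at the headers of the base HTML and of each component to tell "fully", "some" and "no" compression apart.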

Other findings

  • The worst site had 151 files on its homepage; the best had just 3.
  • The worst offender had over 21 JavaScript files, and one site used 40 CSS files.
  • Some sites have duplicate versions of scripts running – one site had 4 Google Analytics scripts running.


Top 10 best Government homepages ranked by YSlow

Ranking | Total Size (KB) | Total Files | JS Files | CSS Files | Etags Rating | Web Compression | Est. Download Time (s, modem @ 6 KB/s)
92 | 23.3 | 3 | 0 | 0 | C | None | 3.88
87 | 65.5 | 4 | 0 | 1 | D | None | 10.92
87 | 165.5 | 24 | 3 | 2 | A | YES | 27.58
87 | 18.1 | 4 | 0 | 1 | D | None | 3.02
87 | 32.7 | 4 | 0 | 1 | D | None | 5.45
84 | 100.8 | 14 | 0 | 2 | B | None | 16.80
82 | 32.9 | 9 | 2 | 3 | A | None | 5.48
82 | 20.2 | 5 | 0 | 1 | F | None | 3.37
81 | 22.4 | 6 | 0 | 2 | A | None | 3.73
80 | 32.7 | 5 | 1 | 0 | A | None | 5.45

Top 10 worst Government homepages ranked by YSlow

Ranking | Total Size (KB) | Total Files | JS Files | CSS Files | Etags Rating | Web Compression | Est. Download Time (s, modem @ 6 KB/s)
31 | 420 | 77 | 21 | 15 | F | None | 70.00
34 | 181.2 | 52 | 6 | 9 | F | None | 30.20
35 | 298.8 | 88 | 3 | 40 | F | None | 49.80
35 | 393.6 | 46 | 16 | 12 | F | None | 65.60
37 | 172.4 | 42 | 4 | 14 | F | None | 28.73
38 | 198.8 | 48 | 10 | 5 | F | None | 33.13
38 | 310.2 | 47 | 13 | 5 | F | None | 51.70
38 | 567.1 | 74 | 5 | 10 | F | None | 94.52
39 | 147.8 | 69 | 9 | 16 | A | Some | 24.63
39 | 325.8 | 37 | 12 | 6 | F | None | 54.30

Top 10 smallest Government homepages by size

Ranking | Total Size (KB) | Total Files | JS Files | CSS Files | Etags Rating | Web Compression | Est. Download Time (s, modem @ 6 KB/s)
79 | 8.3 | 7 | 0 | 1 | F | None | 1.38
87 | 18.1 | 4 | 0 | 1 | D | None | 3.02
82 | 20.2 | 5 | 0 | 1 | F | None | 3.37
76 | 20.9 | 7 | 0 | 2 | F | None | 3.48
81 | 22.4 | 6 | 0 | 2 | A | None | 3.73
92 | 23.3 | 3 | 0 | 0 | C | None | 3.88
68 | 23.9 | 31 | 0 | 0 | F | None | 3.98
80 | 32.7 | 5 | 1 | 0 | A | None | 5.45
87 | 32.7 | 4 | 0 | 1 | D | None | 5.45
82 | 32.9 | 9 | 2 | 3 | A | None | 5.48

Top 10 largest Government homepages by size

Ranking | Total Size (KB) | Total Files | JS Files | CSS Files | Etags Rating | Web Compression | Est. Download Time (s, modem @ 6 KB/s)
63 | 2442 | 20 | 1 | 1 | F | None | 407.00
53 | 2353 | 44 | 3 | 5 | F | None | 392.17
64 | 1906 | 151 | 0 | 1 | F | None | 317.67
56 | 1530 | 32 | 3 | 4 | F | Some | 255.00
64 | 1209.6 | 12 | 2 | 1 | F | None | 201.60
66 | 1009.7 | 40 | 4 | 2 | A | Some | 168.28
55 | 971.3 | 42 | 1 | 2 | F | None | 161.88
66 | 886.2 | 68 | 2 | 2 | F | YES | 147.70
43 | 799.7 | 36 | 10 | 5 | F | None | 133.28
47 | 762.5 | 40 | 5 | 5 | F | None | 127.08

NZ Govt Homepage web performance summary

Tuesday, May 20th, 2008

I’ve just completed an audit of the 320 government websites as listed in the Government service directory.


The key findings are:

  • The average homepage size for NZ Govt sites is 216.85K
  • Largest homepage is 2442K
  • Smallest homepage is 23.3K
  • 84.3% of Govt sites use NO web compression, 11.5% have some web compression and only 4.7% use full web compression
  • Some sites are compressing CSS and JavaScript but not the HTML
  • Bad design – one site uses 40 CSS files and another site uses 21 JavaScript files
  • A number of sites have duplicate copies of some JavaScript files

I’ll be posting the full report tomorrow.
