Steve Souders, an amazing performance expert, published a post called "Prebrowsing".
In the post he describes some really simple techniques you can use to make your site quicker. These techniques rely on knowing which page the user will browse to next: you 'hint' this to the browser, and the browser starts downloading the needed resources earlier, so the next page navigation appears much quicker to the user.
The post presents three ways to do this. They all use the 'link' tag, each with a different 'rel' value.
The first technique is 'dns-prefetch'. This is really easy to add and can improve performance on your site. Don't expect a major improvement, though; in my experience, the DNS resolution itself usually doesn't take more than ~150ms.
I wrote about this too, in this blog post: Prefetching dns lookups
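For reference, this is all the markup it takes. The hostname here is just a placeholder for whatever cross-domain host your pages reference:

```html
<!-- Hint the browser to resolve this hostname early.
     "static.example.com" is only a placeholder. -->
<link rel="dns-prefetch" href="//static.example.com">
```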
The other two techniques shown are 'prefetch' and 'prerender'.
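Again for reference, here is roughly what these hints look like; the URLs are placeholders for whatever page you expect the user to visit next:

```html
<!-- Download a resource the next page will need, at low priority. -->
<link rel="prefetch" href="/js/next-page.js">

<!-- Fully render the expected next page in the background (Chrome). -->
<link rel="prerender" href="https://example.com/signin">
```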
Since these are really easy to add, I immediately added them to my site once I read about them.
A little information about the site I'm working on: the anonymous homepage isn't served over SSL. From this page, most users sign in or register, and both of these actions redirect the user to an SSL-secured page. Since the protocol on those pages is https, the browser treats them as a different domain (as it should) and gains nothing from the resources it has already cached. This forces the user to wait a long time on these pages while the client downloads the same resources again, this time over a secure connection.
So I thought it would be perfect to have the browser prerender (or prefetch) the sign-in page or the register page. I have a WebPageTest script that measures the performance of the sign-in page after the user has been at the anonymous homepage, and this test improved by a LOT. This was great! Only a day later I realized that the anonymous homepage itself had become much slower... :/
I guess this is because the browser devotes some of its resources to prerendering the next page, which hurts the performance of the current one. Looking at multiple tests of the same page, I couldn't find any single point of failure, except that each resource on the page took just a little longer to download. Another annoyance is that you can't see what's happening with the prerendered page in utilities like WebPageTest, so all you see is the effect on the current page.
After reading a little more on the subject, I found more cons to this technique. First, it still isn't supported in all browsers, not even FF or Opera. Also, Chrome can only prerender one page across all processes, which means I can't do this for two pages, and I don't know how the browser reacts if another open site also requests to prerender a page. You also can't see the progress the browser makes on prerendering. And what happens if the user navigates to the next page before the prerender has finished? Will some of the resources already be cached? I don't know, and honestly I don't think it's worth testing yet how all the browsers act in these scenarios.
I think we need to wait a little longer for these two techniques to mature a bit...
What is the best solution?
Well, as with every performance improvement, I don't believe there is a 'best solution'; there are no 'silver bullets'.
However, the best solution so far for the site I'm working on is to preload the resources ourselves. This means using javascript to have the browser download resources we know the user will need throughout the site on the first page they land on, so that on subsequent pages the client has much less to download. A sketch of this appears after the list of pros below.
What are the pros of this technique?
1. I have much more control over it. I can detect which browser the user has and use the appropriate technique, so it works for all users.
2. I can trigger it after the 'page load' event. This way I know it won't block or slow down any other work the client is doing for the current page.
3. I can do this for as many resources as I need: css, js, images, and even fonts if I want. Basically, anything goes.
4. Downloading resources doesn't limit me to guessing the one page the user will head to next. On most sites many resources are shared among different pages, so this gives me a bigger win.
5. I don't care about other tabs the user has open that aren't my site. :)
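To make this concrete, here is a minimal sketch of the idea. The resource URLs are placeholders, and it assumes the resources are served with cache-friendly headers (otherwise fetching them ahead of time buys you nothing). Images are warmed through an Image object; for same-origin css/js a plain XMLHttpRequest works, since the response lands in the browser's cache without being parsed or executed:

```javascript
// Resources we know the user will need on subsequent pages.
// These URLs are placeholders - substitute your site's real ones.
var imagesToPreload = ['/img/sprite.png', '/img/logo.png'];
var filesToPreload  = ['/css/signin.css', '/js/signin.js'];

function preloadResources() {
  // Images: setting an Image's src warms the cache without
  // adding anything to the document.
  for (var i = 0; i < imagesToPreload.length; i++) {
    new Image().src = imagesToPreload[i];
  }

  // Same-origin css/js: a plain GET puts the response in the HTTP
  // cache, but nothing is parsed or executed on this page.
  for (var j = 0; j < filesToPreload.length; j++) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', filesToPreload[j], true);
    xhr.send();
  }
}

// Pro #2: only start after the 'load' event, so we don't compete
// with the current page's own downloads.
if (window.addEventListener) {
  window.addEventListener('load', preloadResources, false);
} else {
  window.attachEvent('onload', preloadResources); // old IE
}
```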
Of course, the drawback is that, as opposed to the 'prerender' technique, the browser still has to download the html, parse and execute the js/css files, and finally render the page.
Unfortunately, doing this correctly isn't that easy. I will write about how to do this in detail in the next post (I promise!).
I want to sum up for now so this post doesn't get too long.
In conclusion, there are many techniques out there, and different ones fit different scenarios. Don't implement a technique just because it's easy or because someone else told you it works. Some of them might not be a good fit for your site, and some might even cause damage. Steve Souders' blog is a great and amazing fountain of information on performance, but I learned the hard way that every performance improvement I make needs to be properly analyzed and tested before implementing.
Some great resources on the subject:
- Prebrowsing by Steve Souders
- High performance networking in Google Chrome by Ilya Grigorik
- Controlling DNS prefetching by MDN