Recently I was tasked with implementing a new site design for a hunt club. The design could only be described as beautiful. However, initial testing showed some performance issues. In this guide I will go through the steps I followed to optimize the site, as well as some additional techniques you can use.
Note: This is not an exhaustive guide, but I will keep updating it as I become aware of new techniques.
Note: There comes a point of diminishing returns when you are only shaving milliseconds off the load time. It's up to you as the developer to weigh development effort against potential performance improvements.
When I implemented the design, I followed the designer's specifications to the letter. I used the exact images given to me, at the resolution supplied. This did not work out so well. On my localhost dev environment the site loaded smoothly and looked awesome. However, when we uploaded it to the hosting machine and tried viewing it over our bandwidth-starved office network, the performance was less than optimal.
This highlights an important point: if you want to really test a site's performance, do it from the worst network you can find. If you know someone with dial-up, try that. In this case the site loaded faster on my home computer than it did in the office.
There are many free tools for analysing site performance. I use the "Page Speed" plugin for Firefox. Whatever tool you use, you will be presented with a dizzying array of suggestions. Some of these will have little real-world impact, while others can have a significant effect. It is really a case of trial and error to learn which steps are vital.
Note: Be aware that Page Speed uses percentages, which can be misleading. For example, reducing a CSS file by 40 percent may sound like a lot until you see that the original size was 2 KB.
My first step when optimizing a site is to look at every image that gets loaded and check for repeating patterns. A repeating pattern means the image can be cropped down and tiled, which can significantly improve page load times.
For example, on the Hunt Club site the background image I was supplied was a high-resolution JPG at 1980x1152px, a huge 4.5 MB file. Of course I did not use that file unaltered even on localhost, but it highlights the scale of the potential problem.
A quick look at it revealed that while the height could not be reduced, as it contained a gradient, the width could be cropped to 41px because the pattern repeated horizontally. I then set the background CSS to repeat the image along the x-axis.
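As a rough sketch, the CSS for that looked something like the following (the file name and path are placeholders, not the actual site's):

    /* Tile the narrow 41px-wide strip across the full page width.
       "bg-strip.png" stands in for the cropped background image. */
    body {
        background-image: url("images/bg-strip.png");
        background-repeat: repeat-x;
    }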
I always work with PNGs, as they are probably the most compact of the three common browser-compatible image formats. So the first thing I did after cropping the background was save it as a PNG. Even with PNGs, though, there is significant room for improvement.
When you save a PNG from Photoshop you can choose from several web formats, the most common of which are PNG-8 and PNG-24. Both produce a PNG file, but PNG-8 is much more compact because it reduces the image to a 256-colour palette. You can reduce a PNG-8 to an even smaller palette; with the background image I was able to get it down to 64 colours.
A simple and handy tool to use is pngcrush. It strips unnecessary data (such as author name, date created, etc.) from PNGs, reducing them to the bare minimum dataset. This, combined with the step above, reduced the background image from 4.5 megabytes to 16 kilobytes (or 0.016 megabytes). Nice.
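For reference, a typical pngcrush run looks something like this; the file names are made up, and the -rem alla option (which strips those ancillary data chunks) is just one common invocation, so check the documentation for your version:

    # Strip ancillary chunks and write an optimized copy of the file.
    pngcrush -rem alla bg-strip.png bg-strip-crushed.png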
If you have an image with graduated transparency (a drop shadow, for example), it will have been saved as PNG-24, because PNG-8 handles semi-transparency badly. If you must have the graduated shadow then that is what you are stuck with, but you should check every image on your site to see what can be reduced. For example, on the Hunt Club site I was able to convert three 1000x470px PNGs to PNG-8 with minimal loss of quality, reducing each one from 1 megabyte to 254 KB.
Sprites are a technique of combining several smaller images into one larger image, then selectively displaying a portion of that image for different elements. This has two benefits: the browser makes one HTTP request instead of several, and images such as rollover states are already downloaded before they are needed.
There are many useful tutorials on sprites on the web. If I write one myself I will add a link here.
Note: You would only combine images into a sprite if they are guaranteed to always be displayed together. A good use of sprites is a menu with rollover effects.
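To give a rough idea (the class names, file name and sizes here are made up, not the Hunt Club markup), a menu rollover built on a sprite might look like this:

    /* Hypothetical sprite: two 100x30px button states stacked into one 100x60px image. */
    .nav-button {
        display: block;
        width: 100px;
        height: 30px;
        background-image: url("images/menu-sprite.png");
        background-position: 0 0;      /* top half of the sprite = normal state */
    }
    .nav-button:hover {
        background-position: 0 -30px;  /* shift up to show the bottom half = rollover state */
    }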
This does not have any effect on download speed, but specifying the image dimensions in the img tag reduces the work the browser has to do to work out where to place each element. The result is a smoother page load with less redrawing required on the browser's part.
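For example (the file name and dimensions are purely illustrative):

    <!-- width and height let the browser reserve the space before the image arrives. -->
    <img src="images/logo.png" alt="Hunt Club logo" width="200" height="80" />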
Many common JavaScript libraries ship with a "min" version: a minified file in which all the extra whitespace has been stripped out. Stripping out whitespace can significantly reduce the size of a JavaScript file, and you can find many tools on the net that will minify your own files for you.
Note: You can also minify CSS files.
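To illustrate what minification actually does, here is a contrived rule (not one from the site) before and after:

    /* Before minification: */
    .header {
        color: #336699;
        margin: 0 auto;
    }

    /* After minification (whitespace stripped, colour value shortened): */
    .header{color:#369;margin:0 auto}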
Going back to the way a browser downloads resources, it is worth considering the priority with which those resources are downloaded. JavaScript, for example, should not start running until the page has completely loaded, so it should be the last thing the browser fetches. By moving your JavaScript to the end of the page you ensure that the visual elements are loaded before the scripts, giving the impression of a faster page load (although in reality the page takes the same time to load completely).
Note: While you should move scripts to the bottom of the page, they should still sit before the closing body and html tags.
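In other words, the end of the page ends up looking roughly like this (the script name is a placeholder):

    <p>...the rest of the page content...</p>

    <!-- Scripts go last, just inside the closing body tag. -->
    <script type="text/javascript" src="js/site.js"></script>
    </body>
    </html>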
Going back to the JavaScript libraries I mentioned: it is not widely known, but Google hosts minified copies of the most common JavaScript libraries. By linking to one of these copies instead of hosting the file yourself, you reduce your own bandwidth usage, and if a user has visited another site that uses the same Google-hosted file, it will already be cached in their browser and won't have to be downloaded at all!
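For example, linking to Google's hosted copy of jQuery looks something like this (the version number is only an example; use whichever release your site actually needs):

    <script type="text/javascript"
        src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>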
As with JavaScript, if you combine all your CSS into one file and minify it, you can significantly reduce its size and thus the time required to load it.
The one caveat to this technique: if you have a large CSS file for a public site and another large CSS file for an internal application, it makes sense to keep them separate, as you will probably have a lot more public visitors who will never need the internal CSS rules.
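In practice that just means the public pages reference one combined stylesheet and only the internal application pages pull in the second (the file names are placeholders):

    <!-- Every page loads the combined, minified public stylesheet. -->
    <link rel="stylesheet" type="text/css" href="css/public.min.css" />

    <!-- Only the internal application pages load this one as well. -->
    <link rel="stylesheet" type="text/css" href="css/internal.min.css" />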
Gzip compression is a process by which the web server takes a page (after the scripting language, whether PHP, ASP, JSP or CF, has finished its work) and compresses the output into the gzip format. The compressed output is then sent to the client browser, which reverses the compression.
Note: Images do not benefit much from gzip compression, as they are already compressed; only the HTML, CSS and JS output really benefits.
On the face of it this seems like a lot of work, and yes, it does add load both to the server, which has to compress the output, and to the client, which has to decompress it. However, the payoff comes in two ways.
By compressing the page you can achieve some impressive size reductions; I have seen compression of up to 80 percent, which can drastically reduce bandwidth bills.
Provided that neither the server nor the client machine is excessively stressed, the time saved by sending a smaller page over the network will exceed the time required to compress and decompress it.
Note: Unpatched editions of IE 6 have been noted to have problems with gzip compression. I personally do not worry about people who use unpatched editions of IE 6, nor will I ever. It is not the responsibility of any developer to bend over backwards for stupidity.
Update 15-jan-2011: New blog entry with instructions on enabling gzip compression on IIS 6.
Update 17-jul-2012: New blog entry with instructions on enabling gzip compression on Apache HTTPD.
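The linked entries walk through the setup in detail; on Apache the core of it is a mod_deflate rule roughly like the one below (this assumes mod_deflate is loaded, and the MIME type list is only an example):

    # Compress text-based output before it is sent to the browser.
    AddOutputFilterByType DEFLATE text/html text/css application/javascript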
As I mentioned before, browsers can only open so many connections at once. Well, I left something important out: they can only open so many connections at once to a given domain. If you host some of your resources on a different domain from your main site, the user's browser will be able to open extra connections to that domain to download those resources. This is why you will often see in your browser that when you open www.somesite.com it is downloading images from downloads.somesite.com (a subdomain is considered a separate domain from its parent).
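In practice that just means pointing some of your asset URLs at the other domain, for example (reusing the made-up domain above):

    <!-- Assets served from a subdomain let the browser open extra parallel connections. -->
    <img src="http://downloads.somesite.com/images/header.png" alt="Header banner"
        width="960" height="150" />
    <link rel="stylesheet" type="text/css"
        href="http://downloads.somesite.com/css/site.min.css" />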
After all is said and done, the most important thing you can do is run your HTML and CSS through a validator. Valid code will always run faster, as it reduces the work the web browser needs to do for error correction and redrawing.
One important part of validating your code is specifying the doctype and encoding. I personally use a strict doctype as often as I can. It is more work to write fully valid code, but I have found it requires the least work for cross-browser compatibility and makes for clean, fast-rendering HTML.
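For example, an XHTML 1.0 Strict page starts like this (the doctype and namespace are standard boilerplate, not anything specific to the Hunt Club site):

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />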