Common Misconceptions about Web Performance Optimisation

What would you do to improve the performance of a web application?

This is a question I’ve posed to a fair number of Java developers in the past week. A common theme quickly emerged and remained consistent across the responses: everyone I spoke to immediately began listing improvements to the server-side processing involved in a typical page request. Many of the suggestions were perfectly valid and often considered “best practice” (as much as I can’t stand that term), but few were likely to have a noticeable impact for the user. Some even mentioned caching, but only thought to apply it internally within the server (e.g. a database cache), not to the content of the response itself.
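To make that last point concrete, here is a minimal sketch of response-level caching: a tiny server, built on the JDK’s own com.sun.net.httpserver, that sets a Cache-Control header so browsers and proxies can reuse the response without re-requesting it. The resource name and max-age value are illustrative, not a recommendation for any particular application.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CacheHeaderDemo {
    public static void main(String[] args) throws Exception {
        // Minimal server exposing one "static" resource on an ephemeral port
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/logo.png", exchange -> {
            byte[] body = "fake-image-bytes".getBytes();
            // Allow browsers and intermediaries to reuse this response
            // for up to a year instead of hitting the server again.
            exchange.getResponseHeaders().set(
                    "Cache-Control", "public, max-age=31536000, immutable");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        int port = server.getAddress().getPort();

        // Fetch the resource and print the caching directive the client sees
        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> resp = client.send(
                HttpRequest.newBuilder(
                        URI.create("http://localhost:" + port + "/logo.png")).build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.headers().firstValue("Cache-Control").orElse("(none)"));
        server.stop(0);
    }
}
```

With a header like this in place, repeat visits can skip the request entirely; no amount of server-side tuning can beat a request that never arrives.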

It seems that the natural assumption of many Java developers is to look for optimisations in their code before trying to understand the entire request-response lifecycle and all the actors involved. According to Steve Souders, “80-90% of the time spent by users waiting for pages to load is spent on the frontend”. If that comes as a shock to you, then I strongly recommend that you read High Performance Web Sites. It is an excellent reminder of the obvious, but often forgotten, reality that your website is accessed by people, using browsers, across networks and domains, via intermediaries (proxies, etc.), over a common protocol (HTTP).

Digging into each of those elements (people, browsers, networks and domains, intermediaries, and HTTP itself) will give you a much better appreciation of the components that act in concert to deliver a web experience to the user. If this is the first you’ve heard of web performance optimisation and you want to know more, I recommend the following sites:

Steve Souders’ 14 Rules for Faster-Loading Websites (from his book, High Performance Web Sites)

Yahoo!’s Best Practices for Speeding Up Your Website
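One of Souders’ rules is to gzip text components (HTML, CSS, JavaScript), which shrinks what travels over the network rather than speeding up the server. A quick JDK-only sketch of why it pays off, using deliberately repetitive markup as a stand-in for a real page:

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPOutputStream;

public class GzipDemo {
    public static void main(String[] args) throws Exception {
        // Markup is highly repetitive, so it compresses extremely well
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 200; i++) {
            sb.append("<div class=\"row\">hello</div>\n");
        }
        byte[] raw = sb.toString().getBytes();

        // Gzip the "page body" into an in-memory buffer
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(raw);
        }
        byte[] compressed = buf.toByteArray();

        System.out.println("raw=" + raw.length + " gzip=" + compressed.length);
        System.out.println("under a tenth of the size: "
                + (compressed.length < raw.length / 10));
    }
}
```

In practice you would enable compression in the web server or a servlet filter rather than by hand, but the ratio is the point: fewer bytes on the wire means less time the user spends waiting, regardless of how fast the backend is.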

So next time someone asks you that question, please mention a few optimisations beyond server-side processing.