I’ve done some informal testing on my own website and the results were impressive. Before this script was added to my website you needed to download 8 JavaScript files for the Prototype and Scriptaculous libraries, 168 KB in total. After installing this script you only need to download a single file of 37 KB, which takes around 400 ms.

The idea is that you have one directory for CSS files and one directory for JavaScript files on your server.

You then rewrite the URLs that point to these directories to a small script that intercepts the requests for those files.
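A minimal sketch of what that rewriting could look like, assuming Apache with mod_rewrite; the directory names and the script name combine.php are placeholders for whatever your own setup uses:

    RewriteEngine On
    # Hand every request for a file in the css or javascript
    # directory to the combining script instead of serving it directly
    RewriteRule ^css/(.*\.css)$ combine.php?type=css&files=$1 [L]
    RewriteRule ^javascript/(.*\.js)$ combine.php?type=javascript&files=$1 [L]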

Even on a fast internet connection it took more than 8 seconds to load a basically empty page.

The server generated the page in about 350 ms, so that wasn’t the problem.

If many different files need to be loaded, the browser will not make optimal use of the bandwidth it has access to.

It will request a few files from the server and wait until those have been retrieved before requesting the rest.

Unfortunately, I noticed a nasty side effect of the combination of these two methods.

If you combine many files, the resulting files can become quite large.

Unfortunately, if you do this manually you are going to run into maintenance problems: after editing one of the original source files you will have to recombine it with the other files and re-compress the result.

Instead of going for the easy but hard-to-maintain solution, I decided to automate the process. Thanks to a small PHP script and some clever URL rewriting, I now have an easy-to-maintain method to speed up the loading of pages that use many or large CSS and JavaScript files.

It will then concatenate the requested files, compress them, and send the result to the browser as a single file.
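A minimal sketch of such a script, assuming the type and files query parameters set by the rewrite rules above; combine.php is a hypothetical name, and a real version would also want to cache the compressed result instead of rebuilding it on every request:

    <?php
    // combine.php (hypothetical name): concatenates the requested
    // files and sends them to the browser as a single response
    $type  = ($_GET['type'] === 'css') ? 'css' : 'javascript';
    $base  = realpath($type);               // the css/ or javascript/ directory
    $files = explode(',', $_GET['files']);  // allows comma-separated file lists

    $contents = '';
    foreach ($files as $file) {
        // Resolve the path and make sure it stays inside the expected
        // directory, so the script cannot be used to read arbitrary files
        $path = realpath($base . '/' . $file);
        if ($path !== false && strpos($path, $base) === 0) {
            $contents .= file_get_contents($path) . "\n";
        }
    }

    header('Content-Type: ' . ($type === 'css' ? 'text/css' : 'application/javascript'));

    // Compress the combined output with gzip if the browser accepts it
    if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'] ?? '', 'gzip') !== false) {
        header('Content-Encoding: gzip');
        echo gzencode($contents);
    } else {
        echo $contents;
    }

With this in place a page can load several files with a single request, for example by pointing one script tag at /javascript/prototype.js,scriptaculous.js.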