Friday, October 24, 2008

Building a high-performance site

Recently, I was involved with building a web 2.0 site where the most important criterion was performance! The pages had to load super fast (average response time < 2 sec).

Caching: The biggest bottleneck in any website is database access, so you need to cache data aggressively. Also, unless you are building a site that requires real-time data, you can live with showing stale data to the user (we make sure that the update pages always pull real-time data, so the user never sees stale data there). Caching alone (esp. lookup data) can improve the performance of your site many-fold, and at times this alone can be enough to get to decent response times. The only caveat with caching is in a distributed environment; unfortunately, in ASP.Net 2.0 there is no built-in way of having distributed cache objects, i.e. every node in the cluster has its own copy of the cached data, which can be out of sync with the other nodes. You can avoid this by using a third-party cache provider like memcached (I haven't personally used it, so I'm not sure how well it gels with ASP.Net), by upgrading to .NET 3.5 and using Velocity (Microsoft's distributed cache, still in CTP), or by making your load balancing sticky, i.e. a user always hits the same node on the farm for his session.
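As a minimal sketch of the cache-aside pattern described above, using the built-in ASP.Net cache (the names GetLookupData and LoadLookupDataFromDb are hypothetical stand-ins for your own data-access code):

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class LookupCache
{
    public static List<string> GetLookupData()
    {
        const string key = "lookup-data";

        // Check the ASP.Net cache first; only hit the database on a miss.
        List<string> cached = HttpRuntime.Cache[key] as List<string>;
        if (cached != null)
            return cached; // cache hit: no database round-trip

        List<string> data = LoadLookupDataFromDb(); // assumed DAL call

        // Absolute 10-minute expiration; serving slightly stale lookup
        // data is acceptable per the discussion above.
        HttpRuntime.Cache.Insert(key, data, null,
            DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);

        return data;
    }

    private static List<string> LoadLookupDataFromDb()
    {
        // Placeholder for the real database access.
        return new List<string>();
    }
}
```

Note that each node in the farm still fills its own copy of this cache independently, which is exactly the out-of-sync caveat mentioned above.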
Web.config debug attribute: Make sure you set debug="false" in the production web.config; you don't want debug symbols loaded in the production environment.
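For reference, the relevant web.config fragment looks like this:

```xml
<configuration>
  <system.web>
    <!-- debug="true" disables batch compilation, loads debug symbols,
         and hurts caching of framework resources; never ship it. -->
    <compilation debug="false" />
  </system.web>
</configuration>
```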
Ajax: Use Ajax where appropriate to reduce the perceived response time; but do keep in mind that one of the biggest issues with ASP.Net Ajax (aka Atlas) is that you can't use a CDN to serve ScriptResource.axd, which means all the JavaScript files are served by your own servers (and there are quite a few of them, around 9!). This means that if your servers are located in the US and a user is accessing your site from China, he has to make 9 requests to your US servers just to download the JavaScript files, which greatly increases download time.
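One partial workaround, assuming you deploy a copy of the framework script files to a static host closer to your users, is the ScriptManager's ScriptPath property, which serves the Microsoft Ajax scripts as static files instead of through ScriptResource.axd (the host name below is made up):

```aspx
<%-- Serve the framework scripts as plain static files from a
     cacheable host instead of the ScriptResource.axd handler. --%>
<asp:ScriptManager ID="ScriptManager1" runat="server"
    ScriptPath="http://static.example.com/ajax" />
```

Static files can at least be far-future cached and pushed out to an edge host, even if the handler itself cannot be.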
Offline Tools: There are a lot of things that can be calculated/processed asynchronously; you can easily offload these to an offline tool and reduce the load on the web servers (for instance, calculating the tag cloud can easily be done once a day or once a week by an offline tool).
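A sketch of such an offline job, run as a console app from a scheduled task (LoadAllTags and SaveTagCloud are illustrative placeholders for your own data access):

```csharp
using System;
using System.Collections.Generic;

class TagCloudJob
{
    static void Main()
    {
        // LoadAllTags() stands in for the real database read.
        IEnumerable<string> tags = LoadAllTags();

        // Count occurrences per tag; these counts become the cloud weights.
        Dictionary<string, int> weights = new Dictionary<string, int>();
        foreach (string tag in tags)
        {
            int count;
            weights.TryGetValue(tag, out count);
            weights[tag] = count + 1;
        }

        // Persist the result somewhere cheap for the site to read
        // (a table, a flat file, or the cache).
        SaveTagCloud(weights);
    }

    static IEnumerable<string> LoadAllTags()
    {
        return new string[0]; // placeholder
    }

    static void SaveTagCloud(Dictionary<string, int> weights)
    {
        foreach (KeyValuePair<string, int> kv in weights)
            Console.WriteLine(kv.Key + ": " + kv.Value);
    }
}
```

The web servers never pay for this computation; they only read the precomputed result.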
Lazy DB Writes: Keep in mind that DB writes are the slowest operations, so it might be a good idea to do some DB operations in a batch; the batch data can be written to a flat file, an MQ, or even a distributed cache, from where it can be picked up by the offline tool.
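As a sketch of the MQ variant, the page hands the record to an MSMQ queue and returns immediately, leaving the offline tool to drain and batch the writes (the queue path is an assumption, and the queue must already exist):

```csharp
using System.Messaging;

public static class DeferredWrites
{
    // Hypothetical private queue on the local machine.
    private const string QueuePath = @".\private$\pending-writes";

    public static void Enqueue(string record)
    {
        using (MessageQueue queue = new MessageQueue(QueuePath))
        {
            // Fire-and-forget: the offline tool reads these messages
            // later and writes them to the database in one batch.
            queue.Send(record);
        }
    }
}
```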
Page Compression: Since there is no straightforward way of gzipping pages in IIS 6, people often don't even think of compressing their pages; yet compression can drastically reduce your page sizes. Page compression can be enabled in IIS 6 through its metabase settings.
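The metabase switches involved look roughly like this, run from the IIS AdminScripts directory (paths can differ per install; back up the metabase first):

```shell
cd %SystemDrive%\Inetpub\AdminScripts

rem Enable compression globally, then for the gzip scheme.
cscript adsutil.vbs set W3SVC/Filters/Compression/Parameters/HcDoStaticCompression true
cscript adsutil.vbs set W3SVC/Filters/Compression/Parameters/HcDoDynamicCompression true
cscript adsutil.vbs set W3SVC/Filters/Compression/gzip/HcDoStaticCompression true
cscript adsutil.vbs set W3SVC/Filters/Compression/gzip/HcDoDynamicCompression true

iisreset
```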
Optimize Images and JS: Images should be optimized for the web, and JavaScript files should be minified.
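For minification, one option is the YUI Compressor command-line tool (the jar file name below depends on the version you download):

```shell
java -jar yuicompressor.jar --type js site.js -o site.min.js
```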

That's all we have done so far to achieve pretty decent response times across our site; what do you think I missed?
