Web Site Performance - It's not just your code by Matt Wrock

I'm a software developer. By default, I tend to focus on optimizing my code and architecture patterns in order to tune performance. However, there are factors that lie well outside of your application code and structure that can have a profound impact on web site performance. This post will focus on two of them: the use of a CDN and IIS compression settings.

Over the last month, my team and I have contracted with a CDN and tweaked our compression settings, resulting in a 40% improvement in average page load time outside of our immediate geographical region!

Using a CDN for static content

A CDN (content delivery network) is a company that has data centers distributed over the globe. Some of the dominant players are Akamai, CDNetworks and Limelight. These providers have many data centers dispersed over the United States as well as several scattered internationally. Chances are they have servers located much closer to your users than your own servers are. You may not think this is a big deal. What's a couple hundred milliseconds to load an image? Well, if your site has a lot of images as well as CSS and JavaScript (jQuery alone packs a hefty download size), the boost can be significant. Not only will average load time improve, but you will be less susceptible to general internet peaks and jitters that can potentially cause significant and annoying load delays.

One of the disadvantages of using a CDN is cost. You will need to work with a commercial CDN if you want images and other static content that you have authored to be hosted via CDN. However, several popular frameworks are freely hosted on the CDNs of large internet companies. For example, just this week Scott Guthrie announced that Microsoft will be hosting jQuery and the ASP.NET AJAX libraries. You can also use Google's CDN to serve jQuery. I use Yahoo's YUI libraries, and they can be served from Yahoo. Another cost-related consideration is that the bandwidth you would otherwise pay to serve this content yourself is now absorbed by the CDN.

One other disadvantage I often hear related to CDNs is that the control and availability of the content are now at the mercy of the CDN. Personally, I feel much more comfortable relying on Google's vast network resources than on my own.

Migrating to a CDN for hosting your own media is fairly transparent and painless, especially if you already host all of this content on its own domain or subdomain (e.g. images.coolsite.com). The CDN will assign you a DNS CNAME, and you will add that to your domain's zone file. Now all requests for static content will go to the CDN's servers. If they do not have the content, they will fetch it from your server, and all subsequent requests will be served from the CDN's cache. You can specify the cache expiration, and you should also be able to manually flush the cache if you need to.
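To make the DNS step concrete, here is a sketch of what the zone file change might look like. The CNAME target shown is hypothetical; your CDN will supply the actual hostname:

```
; hypothetical zone file entry: point the static-content host at the CDN
images.coolsite.com.    3600    IN    CNAME    static.coolsite.cdnprovider.net.
```

Once the record propagates, browsers resolve images.coolsite.com to the CDN's nearest edge server rather than to your origin.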

One other perk that most commercial CDNs provide is detailed reporting on bandwidth, along with geographical data showing where your content is being requested from.

Our servers are based in California's Silicon Valley, and we realized a 30% performance boost in the midwestern and eastern United States. It should also be noted that our pages are very image light, so a site with lots of rich media has even more to gain.

Compress Everything

I had always been compressing static content but not dynamic content. I think I was scared off by the warnings the IIS documentation gives regarding high CPU usage. Yes, compression does incur CPU overhead, but with today's CPU specs, chances are it is not significant enough to keep you from turning on this feature. Our servers tend to run at about 5 to 10% CPU. After turning on dynamic content compression, I saw no noticeable CPU increase, but I did see a 10% improvement in page load performance. All of this is free and took me just a few minutes to configure. In fact, it is better than free: you will save on bandwidth.

I did some poking around the web for best practices and found that it is worthwhile to tweak the default compression levels in IIS. Here is a good blog article that goes into detail on this topic. To turn on static and dynamic compression at these levels on IIS 7, I issued the following commands:

C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:urlCompression -doStaticCompression:true -doDynamicCompression:true
C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:httpCompression -[name='gzip'].staticCompressionLevel:9 -[name='gzip'].dynamicCompressionLevel:4


Tools for analyzing page load performance

Here are a few nice tools I use to observe and analyze web site performance:

  1. Firebug. A must-have Firefox plugin that tells you how long each resource takes to load.
  2. An external monitoring service. I use Gomez. This will not only tell you how long it takes for your site to load, but it can monitor from around the globe and provide very rich and detailed reporting. I have several alerts configured that page me if my site is taking too long to load or is broken. "Broken" can mean 500 errors, "server too busy" errors, non-responsive servers due to server outages or bad DNS, or even a failed pattern match of strings expected on the page.
  3. YSlow. This is a Firefox plugin from Yahoo that works with Firebug and analyzes several key indicators on your site. It examines your headers, caching, JavaScript, style sheet usage and much more, and then gives you a detailed list of possible improvements you can make.
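Alongside these tools, a quick command-line probe with curl's -w timing variables can give you raw numbers for a single request. The sketch below uses a local file:// URL as a stand-in so it runs anywhere; in practice you would point it at your own page:

```shell
#!/bin/sh
# Print name-lookup, connect, and total times for one request.
# file:///etc/hosts is a stand-in target; substitute your real URL
# (e.g. https://yoursite.com/) to measure actual network timings.
timings=$(curl -s -o /dev/null \
  -w 'dns=%{time_namelookup} connect=%{time_connect} total=%{time_total}' \
  'file:///etc/hosts')
echo "$timings"
```

Against a real URL, the gap between the connect time and the total time is a quick proxy for how much server processing and payload size are costing you versus raw network latency.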


So if you feel that you have done all the code tweaking you can to get the most performance from your site, think again and take a look at these tools to see how the outside world is experiencing your content.

Colocating ASP.NET MVC and Web Forms in the same Web Application by Matt Wrock

My team and I are migrating a large web forms/ADO web application over to the MVC framework using nHibernate and Domain Driven Design principles. After dabbling in MVC and nHibernate (although I have used MVC in Java in a past life) and reading Eric Evans' book, I've been chomping at the bit to implement these frameworks in our less-than-clean production code.

As I mentioned above, this is a large application, and I am not about to suspend our feature backlog while we spend months converting web forms to controllers/views. Instead, I have decided to keep the current collection of web forms pages that we have now and "fold in" controllers and views as we build out new functionality. Any minor enhancements to existing web forms will remain in the current web form structure, but any change requiring heavy lifting will warrant a conversion to MVC. I should note that our "model" is thankfully already abstracted into separate class libraries.

In order to accomplish this happy coexistence of web forms and MVC, we have done the following:

  1. Created a brand new MVC project for the views.
  2. Created a new project for controllers to keep the controllers and views separate (we are using the Sharp Architecture framework).
  3. Added a "WebForms" directory to the MVC Views project.
  4. Copied the web forms tree from our legacy web application project to the new WebForms folder.
  5. Made minor changes to relative URL references.


This has worked out really well for us. It has allowed us to keep a large and long-lived application functioning as is, while providing a framework for future functionality that is a much better enforcer of separation of concerns and far more unit-test friendly.

We are taking a similar migration strategy with our model layer. We have several class libraries of domain classes riddled with ADO code (yuck). Again, we are continuing to use these classes and extend them with small changes. However, large changes and new domain objects are following a DDD approach using nHibernate. There has been a definite learning curve with nHibernate and Fluent nHibernate, but I can't tell you how much cleaner the code looks without all of the database plumbing. You see only what you need to see: your business logic. Not to mention that we no longer have to create countless stored procedures for simple CRUD functionality.

I don't think this architecture is suitable for every application. It would most likely be a poor choice for our external web application that serves over ten million page views a day. But it is ideal for a large administrative CRUD application supporting a complex domain model -- a perfect candidate for DDD.