Installing an HttpModule via Dropping in the bin without Editing Web.config by Matt Wrock

Last week another team wanted to use an HttpModule I had written. The problem was that they had no access to the hosting platform of their content. Their DLLs are consumed via MEF by a rather large infrastructure affectionately known as MTPS, or what the world knows as the MSDN Library.

Unfortunately, the functionality provided by this DLL requires that it be run as an HttpModule. So I remembered reading a post by David Ebbo that described how to load an HttpModule dynamically at run time. The topic suddenly seemed relevant to this issue, so I revisited his blog and applied his instructions to my assembly and wow…sure enough, just drop my dll in the bin and BAM! it loads and runs. If you would like to see a very small example app that does this, please see my code sample on the MSDN Samples Gallery.

Technically, the code involved to accomplish this couldn't be more trivial. Here is a quick walkthrough.

Specify a PreApplicationStartMethod

In the AssemblyInfo.cs of the assembly containing the HttpModule, the attribute below must be included:

[assembly: PreApplicationStartMethod(typeof(Loader), "LoadModule")]

This line points to a method inside the assembly that should be invoked during pre-application startup. This event occurs even before the Application Start event in the ASP.NET lifecycle. The official MSDN Library reference for this attribute can be found here. Of course, this assumes that there is a class named Loader either inside the assembly or referenceable from it, and that it contains a method called "LoadModule."

Dynamically Register the Module

Inside the LoadModule method, you will use the Microsoft.Web.Infrastructure assembly to actually register the module. The Microsoft.Web.Infrastructure assembly is not included in the .net BCL, but you can easily download and install it via Nuget. Here is a look at the entire Loader class:

using Microsoft.Web.Infrastructure.DynamicModuleHelper;

namespace HttpModule
{
    public class Loader
    {
        public static void LoadModule()
        {
            DynamicModuleUtility.RegisterModule(typeof(DynamicModule));
        }
    }
}
As you can see, there is not a whole lot to this. A single line tells the runtime the type of the HttpModule to be registered. Here the HttpModule is a class called DynamicModule. The type that you pass to RegisterModule must derive from IHttpModule.
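For reference, here is a minimal sketch of what such a module might look like; the BeginRequest handler and the response header it appends are purely illustrative and not part of the sample above:

using System;
using System.Web;

namespace HttpModule
{
    // Illustrative only: a bare-bones IHttpModule that the Loader above could register.
    public class DynamicModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            // Hook whichever pipeline event you need; BeginRequest is shown as an example.
            context.BeginRequest += (sender, e) =>
                ((HttpApplication)sender).Context.Response.AppendHeader("X-Dynamic-Module", "loaded");
        }

        public void Dispose() { }
    }
}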

Is this a good practice?

At first glance this seems really cool. It removes one setup step that consumers of the HttpModule would normally have to go through to install a third party module. It appears that this would provide a more friction free installation story for any third party module. What can be bad about that?

Well, maybe I'm overthinking this, but my first concern here is discoverability. What if the module has a heavy-hitting impact on the markup of the page? I envision a new developer or team inheriting such an application and wonder just how long it will take for them to find where this "alteration" is coming from, especially in a sophisticated application with perhaps 20 referenced DLLs and lots of internal code. Or perhaps a team member drops in such a module and forgets to tell the team she put it there. I'm thinking that at some point in this story some negative energy will be exerted. Perhaps even a tear shed?

As an author myself of an open source HttpModule that can transform the markup of a site, this is of particular interest to me. My RequestReduce project is a module that combines and minifies CSS and Javascript and sprites and optimizes CSS background images on the fly. Since the default behavior of the module simply requires that it be registered in the user's web.config, employing the technique used in this sample would give the module a truly plug-and-play installation story, which sounds very awesome and I am tempted to add it. But I worry.

I'd be interested in others' feedback on this matter. Do you think it is a good idea in general practice? Do you think the perhaps limited discoverability poses a true risk? In the end, do you think consumers would be more or less attracted to such an installation story?

Released RequestReduce 1.5 Supporting Custom Minifiers and a Critical Performance Fix for v1.4 by Matt Wrock

I just released RequestReduce v1.5 on Nuget and on http://www.RequestReduce.com.

If you are currently on v1.4, you will definitely want to upgrade to this new version. There was a misconfiguration in v1.4 causing a Regex to recompile on every call. The impact will vary depending on the number of script and css links on your site, but it could well be in the hundreds of milliseconds.

Enough about that. The key feature that v1.5 adds is a small API allowing you to do the following:

  • Log RequestReduce exceptions thrown from its processing thread to your own error logging implementation, including very simple support for Elmah logging.
  • Plug in a custom minifier to replace RequestReduce's default Microsoft Ajax Minifier. You can also use this if you want to override the settings that RequestReduce uses.
  • Filter certain CSS, Javascript or image resources, or even entire pages or areas of your site, out of RequestReduce processing. The API allows you to evaluate any property of the HttpRequest to decide whether a resource should be filtered.

Here are the details on how to use the API:

Logging RequestReduce Errors

Especially if you already have an error logging mechanism, it is advisable that you log any errors encountered by the RequestReduce reduction processing. This will aid in bug reporting and general troubleshooting. To do this, you simply need to provide an Action delegate that will take charge of logging the passed Exception. Here is an example of how to use this if you use Elmah for error logging:

RequestReduce.Api.Registry.CaptureError(x => ErrorLog.GetDefault(null).Log(new Error(x)));
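If you are not using Elmah, any delegate that accepts the exception will do. As a minimal sketch, assuming you just want the error written to the trace log:

RequestReduce.Api.Registry.CaptureError(x => System.Diagnostics.Trace.TraceError(x.ToString()));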

 

Injecting your own CSS or Javascript Minifier

RequestReduce attempts to follow a decoupled architecture which allows developers to swap out certain parts with their own behavior. To override RequestReduce's use of the Microsoft Ajax minifier library, you simply create a class that derives from IMinifier. There is not much to IMinifier:

public interface IMinifier
{
    string Minify<T>(string unMinifiedContent) where T : IResourceType;
}

Here is RequestReduce's implementation:

public class Minifier : IMinifier
{
    private readonly Microsoft.Ajax.Utilities.Minifier minifier = new Microsoft.Ajax.Utilities.Minifier();
    private readonly CodeSettings settings = new CodeSettings { EvalTreatment = EvalTreatment.MakeAllSafe };

    public string Minify<T>(string unMinifiedContent) where T : IResourceType
    {
        if (typeof(T) == typeof(CssResource))
            return minifier.MinifyStyleSheet(unMinifiedContent);
        if (typeof(T) == typeof(JavaScriptResource))
            return minifier.MinifyJavaScript(unMinifiedContent, settings);

        throw new ArgumentException("Cannot Minify Resources of unknown type", "unMinifiedContent");
    }
}

It's not difficult to imagine how you would change this implementation to use something like the YUI Compressor for .Net. Let's say you had a YuiMinifier class that you want RequestReduce to use instead of its own minifier. It might look something like this:

public class YuiMinifier : IMinifier
{
    public string Minify<T>(string unMinifiedContent) where T : IResourceType
    {
        if (typeof(T) == typeof(CssResource))
            return CssCompressor.Compress(unMinifiedContent);
        if (typeof(T) == typeof(JavaScriptResource))
            return JavaScriptCompressor.Compress(unMinifiedContent);

        throw new ArgumentException("Cannot Minify Resources of unknown type", "unMinifiedContent");
    }
}

You would just need to add the following code to your startup code:

RequestReduce.Api.Registry.RegisterMinifier<YuiMinifier>();

That's it. Now your minification code will be used.

Filtering Resources from the Reduction Processing

If you would like to have certain CSS or javascript resources filtered from the reduction processing or if you would like to exclude certain images or even entire requests from all response transformations, then use the AddFilter method:

public static void AddFilter(IFilter filter)
 
RequestReduce provides four concrete types deriving from IFilter:
 
public class CssFilter : Filter<CssJsFilterContext>
public class JavascriptFilter : Filter<CssJsFilterContext>
public class SpriteFilter : Filter<SpriteFilterContext>
public class PageFilter : Filter<PageFilterContext>

All of these derive from:

public class Filter<T> : IFilter where T : class, IFilterContext

The IFilter implementations really don't do anything on their own other than pass an IFilterContext, which exposes various properties of the request. Each IFilter implementation provides a single constructor that takes a Predicate<T>, where T is an IFilterContext that RequestReduce populates and provides to the predicate delegate.

There are three different implementations of IFilterContext available:

public class CssJsFilterContext : IFilterContext
{
    public HttpRequestBase HttpRequest { private set; get; }
    public string FilteredUrl { private set; get; } // The CSS or JS url to be processed
    public string FilteredTag { private set; get; } // The entire HTML tag being processed
}

public class PageFilterContext : IFilterContext
{
    public HttpRequestBase HttpRequest { private set; get; }
}

public class SpriteFilterContext : IFilterContext
{
    // BackgroundImageClass has several properties that include all the CSS background attributes
    public BackgroundImageClass BackgroundImage { private set; get; }
}

For each item that RequestReduce processes, it will provide the appropriate IFilterContext (page, CSS, javascript or sprite) to your predicate. If you return true, RequestReduce will exclude that resource from processing. Here is an example of how to have RequestReduce ignore any image larger than 100px x 100px:

RequestReduce.Api.Registry.AddFilter(new SpriteFilter(x => x.Width > 100 && x.Height > 100));
 
Make sure to add these filters inside your App start or some other pre-application, one-time execution block.
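For reference, here is a rough sketch of wiring up a couple of additional filters from Application_Start; the /legacy/ stylesheet path and /admin area are made-up examples, and the filter types are assumed to live alongside Registry in the RequestReduce.Api namespace:

using System;
using System.Web;
using RequestReduce.Api;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Skip any stylesheet whose url contains a (hypothetical) /legacy/ folder.
        Registry.AddFilter(new CssFilter(x => x.FilteredUrl.Contains("/legacy/")));

        // Leave a (hypothetical) admin area of the site entirely untouched.
        Registry.AddFilter(new PageFilter(x => x.HttpRequest.RawUrl.StartsWith("/admin", StringComparison.OrdinalIgnoreCase)));
    }
}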

Microsoft’s MSDN and Technet Forums and Search adopt RequestReduce for Web Performance Optimization by Matt Wrock

This week RequestReduce was launched on Microsoft’s MSDN and Technet web sites, specifically within the Forums and Search applications. So if you ever land on a Forums page within these sites or conduct a search within the MSDN or Technet environment, RequestReduce is crunching and merging the CSS and javascript as well as spriting their background images, resulting in about a 25-30% reduction in the number of HTTP requests made. RequestReduce already services the MSDN and Technet gallery sites, which include Code Samples and the Visual Studio Gallery. All of these sites combined represent several million page views a day and prove that RequestReduce can perform under considerable load.

This migration process did surface some “edge case” bugs in the Sql Server cache flushing logic that would likely only surface in high traffic environments. The Sql Server cache is an add-on assembly that synchronizes the RequestReduce-generated scripts, CSS and images across multiple servers, ensuring that users do not see different or broken content when moving from one server to another and guaranteeing a consistent user experience throughout a session.

The discovery of these bugs provided the bulk of the code that went into this week’s 1.4 Release of RequestReduce. The release was largely a fix and polish release but here are some highlights:

Dashboard Page

Troubleshooting issues in the Forums/Search migration inspired me to create a dashboard page to assist in catching and isolating some tough-to-catch bugs. The page is currently extremely bare bones and rather cryptic. It displays what css and javascript has been processed, what is queued for processing and what is currently in the middle of being processed. It also provides links for flushing individual merged css/js files from the cache. The page can be accessed from:

http://<your site>/<RequestReduce virtual directory>/dashboard

Its greatest benefit is in determining what RequestReduce is currently up to, if anything, and how far along it is in processing all queued content. Expect this page to get much more informative and maybe prettier in coming releases.

Performance Optimization

The javascript processing introduced in version 1.3 added a lot of work to the response filter, which has to iterate over every character of the response stream sent to the browser. This is the single piece of code that needs the most perf optimization since it is the only request-blocking activity within RequestReduce. Its speed has a direct impact on the Time to First Byte of the hosting, RequestReduce-enabled web page.

So I changed some List<T> implementations to arrays and removed some unnecessary lines of code. Granted, these are micro optimizations, but even 50 – 100ms of time filtering a page is far too long and this process has to be extremely optimized. It's also a bear to read, write and change, which I plan to address in a future release. Sure, I could eliminate several lines of code and overly complicated logic by using regular expressions, but that would actually degrade performance significantly.

Speaking of regular expressions: I make extensive use of them in the background processing of content, especially in CSS parsing for sprites and imports. So I eliminated a lot of my static Regex usage and replaced it with instance member Regex's in a singleton class. .net will cache the most recently used 15 static regular expressions in an application, and I do not want to infringe upon that cache, especially when I have about eight of my own. Using a singleton allows me to manage my own cache.
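As a sketch of that pattern (the expressions here are placeholders, not RequestReduce's actual regular expressions):

using System.Text.RegularExpressions;

// Pre-built Regex instances live on a singleton so they are compiled once
// and never compete for the 15-entry cache behind the static Regex.* helpers.
public sealed class RegexCache
{
    public static readonly RegexCache Instance = new RegexCache();

    public readonly Regex ImportPattern =
        new Regex(@"@import\s+url", RegexOptions.IgnoreCase | RegexOptions.Compiled);
    public readonly Regex BackgroundImagePattern =
        new Regex(@"background(-image)?\s*:", RegexOptions.IgnoreCase | RegexOptions.Compiled);

    private RegexCache() { }
}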

Support for CSS @Media

A CSS media type can be applied to an entire stylesheet, instructing the browser to use its styles only for specific media; print and screen are the most popular and obvious. RequestReduce had been oblivious to the media attribute of the link tag and the @import directive, which caused some CSS to break on certain sites, especially if the last CSS on a page targeted print media and stripped out a lot of styling; the screen experience ended up having those styles stripped out as well. No longer. RequestReduce now wraps the contents of any <link/> or @import inside @media braces to preserve the media constraints they specify.
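Conceptually, the wrapping amounts to something like this (a simplified sketch, not the actual RequestReduce code):

// Simplified sketch: enclose the fetched CSS in an @media block matching the
// media list declared on the <link/> tag or @import directive.
static string WrapInMedia(string css, string mediaList)
{
    if (string.IsNullOrEmpty(mediaList) || mediaList.Trim().ToLowerInvariant() == "all")
        return css;

    return string.Format("@media {0} {{{1}}}", mediaList, css);
}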

Support for ScriptResource.axd scripts and all other Compressed content

RequestReduce already supported the minification and merging of WebResource.axd files. However, it always assumed that the content it was requesting was uncompressed text, since it did not specify compression in its Accept-Encoding request headers. Well, it turns out that ScriptResource.axd handlers violate HTTP by sending gzipped content regardless of the value in this header. That was a show stopper for an upcoming large RequestReduce adoption that includes pages with 30 to 40 CSS and script files, including tons of ScriptResource.axd files. RequestReduce now checks whether a response is compressed with gzip or deflate and decompresses it accordingly, leaving the recompression up to IIS.
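The detection step amounts to checking the Content-Encoding header of the fetched response and unwrapping the body before minification; roughly (a sketch, not RequestReduce's actual fetcher code):

using System.IO;
using System.IO.Compression;
using System.Net;

// Rough sketch: inspect Content-Encoding and decompress gzip or deflate
// bodies before handing the text to the minifier.
static Stream DecompressIfNeeded(WebResponse response)
{
    var encoding = (response.Headers["Content-Encoding"] ?? string.Empty).ToLowerInvariant();
    var body = response.GetResponseStream();

    if (encoding.Contains("gzip"))
        return new GZipStream(body, CompressionMode.Decompress);
    if (encoding.Contains("deflate"))
        return new DeflateStream(body, CompressionMode.Decompress);

    return body;
}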

This really makes RequestReduce an attractive solution for sites mired in asp.net 2.0/asp.net ajax constructs that had good intentions of encapsulating scripts but come with horrible perf implications.

Released: RequestReduce 1.3 Now Runs on .net 3.5 and a Preview of Support for Third Party Javascripts in your Minification and Merge Process by Matt Wrock

Welcome .net 3.5 Users!

Earlier (MUCH earlier) this morning I released version 1.3 of RequestReduce. While not as exciting as the previous javascript crunching and merging release, I am very happy that a broader base of users will now have access to this resource. It can now be installed on any server running .net 3.5. Huge thanks to my co-worker Matt Hawley (@matthawley) for this pull request.

As part of this effort, Matt ripped out the Sql synchronization code and moved it into a new assembly, RequestReduce.SqlServer, which is now available as a separate nuget package. This code uses Entity Framework Code First, which is limited to .net 4.0. So if you need this functionality (most do not), you must be running .net 4 (I’m also open to pull requests using a different data access strategy – not at all married to EF). If you are interested in learning more about synchronizing RequestReduce across multiple servers using SqlServer, see this wiki.

What else is in this release?

  1. Fixed handling of javascript inside IE conditional comments. RequestReduce will ignore these just as it does with CSS links inside conditional comments.
  2. Added a configuration setting to turn off image spriting: imageSpritingDisabled
  3. Added some optimizations to the Response filter

Troubleshooting Guide

In addition to these features, I have added a troubleshooting wiki to the github site. I have gotten a couple issues reported from people who were unable to get RequestReduce running at first due to issues in their environment. This guide should allow most users to at least begin troubleshooting their situation. However, if for any reason you find yourself spending a lot of time investigating this, please do not hesitate to file an issue and I will get on it usually within 24 hours. If most cannot successfully use RequestReduce from a typical nuget install, I have failed.

What’s Next?

The two key features up next for RequestReduce are:

  1. Support for the CSS Media attribute. RequestReduce has no awareness of this attribute and ignores it. While most do not use this, it can have breaking consequences right now with RequestReduce.
  2. Automatic content refreshing. This is a feature I’m particularly excited about because it will not only make changed content refresh more quickly, it will allow users to have many third party scripts included in the crunch and merge process. Right now RequestReduce ignores many third party scripts that have expiring headers. With this feature, RequestReduce will occasionally send a HEAD request to the original script in the background to see if the content has changed; if it has, RequestReduce will refetch and reduce it (see the sketch below).
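As a sketch of how that freshness check might look (the method and the way the last known ETag is stored are hypothetical, not RequestReduce's implementation):

using System.Net;

// Hypothetical sketch: issue a HEAD request and compare the returned ETag
// with the value recorded when the script was last reduced.
static bool ScriptHasChanged(string scriptUrl, string lastKnownETag)
{
    var request = (HttpWebRequest)WebRequest.Create(scriptUrl);
    request.Method = "HEAD";

    using (var response = (HttpWebResponse)request.GetResponse())
    {
        var currentETag = response.Headers[HttpResponseHeader.ETag];
        return currentETag != lastKnownETag;
    }
}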

This Release (1.3) is now available at http://www.RequestReduce.com and on Nuget. Don’t forget: if there are features in RequestReduce you would like to see added, please suggest them on the github issues page. I received two requests last week and both were added to this release.

Released: RequestReduce 1.2 with Javascript merge and minify by Matt Wrock

With this week’s release of RequestReduce 1.2, RequestReduce is even more effective in reducing the HTTP requests made by a browser when visiting your website. Most significantly, this release adds the minification and merging of external javascript resources. A big thanks goes to Matt Manela for providing the initial javascript reduction code! Other notable features in this release include:

  • Automatic expansion of CSS @Import. All imports will be downloaded and injected into the calling CSS and then minified and merged with the rest of the page’s head CSS.
  • Wider support for PNG lossless compression. RequestReduce now issues the -fix parameter to optipng, which will attempt to fix PNGs with extraneous data.
  • Ignore 404 and 500 failures of individual resources. In other words, if one script or stylesheet fails to crunch, the rest should still succeed.
  • Several bug fixes to the overall response filtering.

RequestReduce 1.2 is available via Nuget and for download at http://www.RequestReduce.com. I will discuss the new features in some detail but first…

What makes RequestReduce the best static resource cruncher available?

OK. Glad I got that off my chest. Now for the details of this release’s features:

Javascript Crunching: What exactly is crunched

RequestReduce will minify all external javascripts that have a valid javascript mime type, do not have a Cache-Control header set to No-Cache or No-Store, and do not have an Expires or max-age of less than a week. Furthermore, you can explicitly tell RequestReduce which scripts you would like it to skip by supplying a comma-separated list of url substrings. For example, on the Microsoft properties I work on, I have this set to /tiny_mce since the way that tiny_mce loads its plugins does not play nice when the scripts are not in the same directory where tiny_mce expects them to be.

By default, if no urls are provided in this list, RequestReduce will ignore the Google and Microsoft jQuery CDNs. It does this because it is likely that browsers will not have to download this content, since most will have already cached those resources while visiting a Google or Microsoft property in the past.

You can also use more complicated or customized rules via code. RequestReduce exposes a Func for approving the processing of urls. You can hook into this by registering a lambda or delegate to this Func in your Global.asax startup. Here is an example:

var javascriptResource = RRContainer.Current.GetInstance<JavaScriptResource>();
javascriptResource.TagValidator = (tag, url) =>
{
    if (url.EndsWith(".myjss") || tag.Contains("data-caching=dont"))
        return false;
    return true;
};

The first input parameter is the full script tag element and the second (url) is the javascript src.

Important considerations in the placement of your javascripts

In a perfect world, I would crunch all your scripts together and dynamically inject them into the DOM to load asynchronously. In fact, in that perfect world, I could even defer the loading of the scripts until after document complete. But alas, the world is not perfect. RequestReduce knows absolutely nothing about your code and takes the safest approach to ensure that your page does not break. Therefore, RequestReduce will make sure that any dependencies your markup has on your scripts, or that your scripts have on other scripts, are not broken.

RequestReduce will only merge blocks of scripts that can be reduced contiguously. For example, assume you have the following scripts on a page:

<html>
    <head>
        <link href="http://localhost:8877/Styles/style1.css" rel="Stylesheet" type="text/css" />
        <link href="/Styles/style2.css" rel="Stylesheet" type="text/css" />
        <script src="http://localhost:8877/Scripts/script1.js" type="text/javascript"></script>
        <script src="/Scripts/script2.js" type="text/javascript"></script>
    </head>
    <body>
        <div class="RatingStar">&nbsp;</div>
        <div class="Up">&nbsp;</div>
        <div class="Rss">&nbsp;</div>
        <div class="Search">&nbsp;</div>
        <script src="http://localhost:8877/Scripts/script3.js" type="text/javascript"></script>
        <script src="http://www.google-analytics.com/ga.js" type="text/javascript"></script>
        <script src="/Scripts/script4.js" type="text/javascript"></script>
    </body>
</html>

You will end up with four scripts:

  1. The scripts in the <head/> will be merged together and remain in the head.
  2. Script 3 will be minified but will not be merged since it lies just below common markup and above a near-future expiring script (Google Analytics).
  3. The Google Analytics script remains entirely untouched since it is a near-future expiring script.
  4. Script 4 will be minified but not merged since it is not adjacent to any other non-near-future expiring script.

Although scripts three and four do not benefit from merging, they are at least minified and served with a far-future expiration and a valid ETag.

With all this in mind, you want to make sure that you group together scripts that are not near-future expiring and can remain together without causing issues with load order dependencies. Also make sure that you do not have any setting that causes all of your javascript to be served with a near-future expiration unless you have a particularly good reason to do so.

I intend to roll out more functionality here to allow site owners to provide hints via data attributes in the script tags indicating where scripts can be loaded asynchronously or deferred. I also intend to use some different approaches in the ways I discover changing content on the origin server and may be able to merge and minify near-future expiring content. The fact is that most near-future expiring content does not change often, but its owners want clients to check with the server frequently just in case there is a change.

Plugging in alternative minifiers and settings

RequestReduce attempts to follow a decoupled architecture which allows developers to swap out certain parts with their own behavior. To override RequestReduce's use of the Microsoft Ajax minifier library, you simply create a class that derives from IMinifier. There is not much to IMinifier:

public interface IMinifier
{
    string Minify<T>(string unMinifiedContent) where T : IResourceType;
}

Here is RequestReduce's implementation:

public class Minifier : IMinifier
{
    private readonly Microsoft.Ajax.Utilities.Minifier minifier = new Microsoft.Ajax.Utilities.Minifier();
    private readonly CodeSettings settings = new CodeSettings { EvalTreatment = EvalTreatment.MakeAllSafe };

    public string Minify<T>(string unMinifiedContent) where T : IResourceType
    {
        if (typeof(T) == typeof(CssResource))
            return minifier.MinifyStyleSheet(unMinifiedContent);
        if (typeof(T) == typeof(JavaScriptResource))
            return minifier.MinifyJavaScript(unMinifiedContent, settings);

        throw new ArgumentException("Cannot Minify Resources of unknown type", "unMinifiedContent");
    }
}

It's not difficult to imagine how you would change this implementation to use something like the YUI Compressor for .Net. Let's say you had a YuiMinifier class that you want RequestReduce to use instead of its own minifier. You would just need to add the following code to your startup code:

 
RRContainer.Current.Configure(x => x.For<IMinifier>().Use<YuiMinifier>());

That's it. Now your minification code will be used.

Suddenly CSS @import is not quite so evil

Just about anyone worth their salt in web development will tell you never to use CSS @import. Except maybe the internet latency monster. He loves CSS @import but will quickly bite off your toes when he sees you using it. Suffice it to say that the internet latency monster is a meanie.

RequestReduce can now identify @import usages and auto expand them so that no extra latency is incurred. I still do not think it is a good idea to use @import but now it lacks its sting.

So if you have not started using RequestReduce already, I strongly encourage you to do so. I would also greatly appreciate any feedback you have on what features you would like to see added or bugs you are running into. I’d also like to hear your success stories. You can submit all feature requests and bug reports to the github page at https://github.com/mwrock/RequestReduce/issues?sort=comments&direction=desc&state=open.