I have been working in the area of Web services for some time, and I am on a project where I needed to distribute some extremely large datasets to client applications (Windows Forms). Imagine an interface where you need to request information about a set of entities: you send an array of identifiers to a service and get back an array of result sets, one for each identifier passed in. We had just such an application, and it worked quite nicely in that we built some ASP.NET 2.0 Web services to push the SOAP messages out to the client in just this scenario. It worked well because we were only sending in 30 identifiers at a time, but then we really put it to the test by sending 1,000 - 5,000 identifiers in a single request. This in turn gave us back tremendously large SOAP messages, and in many cases our Windows Forms application was simply timing out. How to approach this, we wondered? There are many possible approaches, and we had to pick one and go with it, as we had to support this scenario for our clients.
Our options included:
- Using the Windows Communication Foundation and building services that pushed binary objects around instead of using SOAP.
- Using .NET Remoting instead and using the binary over HTTP that it offers.
- Using an HTTP compression class technique along with SOAP Extensions to compress the SOAP message before it went out across the wire.
- Using the HTTP Compression feature found on the Windows Server 2003 box with our standard ASP.NET Web services.
- Using WSE 3.0 MTOM capabilities to encapsulate our messages into binary objects to send across the wire.
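All of these compression-based options bank on the same fact: a large SOAP payload is verbose, repetitive XML, and that kind of text compresses extremely well. As a rough illustration (a sketch in Python rather than .NET, with a made-up payload standing in for one of our result sets), gzip shrinks a repetitive XML message dramatically:

```python
import gzip

# Hypothetical, repetitive SOAP-style payload -- one row repeated many
# times, much like a large result set coming back from our service.
row = "<Entity><Id>12345</Id><Name>Sample</Name><Value>42.0</Value></Entity>"
soap_body = "<soap:Envelope>" + row * 1000 + "</soap:Envelope>"

raw = soap_body.encode("utf-8")
compressed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
# Repetitive XML like this typically compresses to a small fraction
# of its original size, which is exactly the win we were chasing.
```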
There are a lot of options, but we quickly narrowed them down. First off, we could not use WCF yet, as we were unable to put it on the client at this point in time: we are supporting too many operating systems below the required Windows XP SP2. .NET Remoting ... maybe, but we really wanted to stick with the SOAP model for later routing purposes. SOAP Extensions - a possibility, but not the easiest to implement, and they would require us to put a decompression class on the client. HTTP Compression - that's a possibility. WSE 3.0 - another interesting option, but it would require us to put WSE 3.0 on the client - something we weren't too interested in doing.
So, we narrowed our tests down to .NET Remoting and HTTP Compression on Windows Server 2003. We built a test application that first compared .NET Remoting against standard ASP.NET 2.0 Web services with no compression enabled. The results were interesting. When a single identifier was sent in and a tiny SOAP message came back, there really wasn't much difference between the two. When sending in 1,000 identifiers, however, .NET Remoting considerably outperformed ASP.NET 2.0 Web services. This was because the large message was sent as binary over the wire and the costly XML serialization and deserialization was avoided. In the end, .NET Remoting shrank the response time to less than HALF that of the ASP.NET 2.0 Web services approach.
Then we turned on HTTP Compression on the Windows Server 2003 box. Instructions for this can be found at http://support.microsoft.com/?id=322603, but you will want to apply them to .asmx pages. You will discover, though, that enabling compression through IIS is not the only step required. For the HTTP message to be compressed on the server, the client has to ask for it by sending an Accept-Encoding: gzip header with the request. By default, proxy classes don't do this for you. Instead, you force the client to make this type of request by using the new EnableDecompression property in .NET 2.0.
// Fully qualify the proxy type so the declaration matches the constructor.
Fundamental.Fundamentals_1_0 proxy = new Fundamental.Fundamentals_1_0();
// Ask the server for a gzip-compressed response and transparently decompress it.
proxy.EnableDecompression = true;
After you instantiate the proxy class, you simply set the EnableDecompression property to true and you are good to go - the proper HTTP header will now be included. After enabling this on our client and sending in the 1,000 identifiers, we were amazed that we could just about replicate the .NET Remoting performance, and in some cases match it. This showed us that we could deliver our large SOAP messages over HTTP with the least possible impact on the client (in terms of new software installs or code changes to the client app).
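For the curious, what EnableDecompression does under the covers is conceptually simple: it advertises gzip support on the request, and gunzips the response body when the server says it compressed it. Here is a rough sketch of that handshake in Python (stdlib only, with a made-up envelope and a simulated response rather than a live call, so none of the names below come from the real service):

```python
import gzip

def build_request_headers():
    # The client must advertise that it can handle a compressed response;
    # this is essentially the header EnableDecompression adds for you.
    return {"Accept-Encoding": "gzip"}

def read_response(body: bytes, headers: dict) -> str:
    # If the server honored Accept-Encoding, Content-Encoding says so
    # and the body arrives gzipped; otherwise it is plain text.
    if headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)
    return body.decode("utf-8")

# Simulated server response: a (made-up) SOAP envelope, gzipped on the wire.
envelope = "<soap:Envelope><soap:Body>...</soap:Body></soap:Envelope>"
wire_body = gzip.compress(envelope.encode("utf-8"))
print(read_response(wire_body, {"Content-Encoding": "gzip"}))
```

The nice part, as noted above, is that in .NET 2.0 all of this plumbing is one property on the generated proxy.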
Other scenarios may of course warrant different approaches - I'm just describing what worked for us in this case. It was a good lesson.