Wednesday, July 11, 2007

Web service timeout in Visual Studio 2005 debug mode

We use a web service to standardize US addresses (convert Av to Ave., LA to Los Angeles, etc.). The web service has been around for a long time and is used by a lot of applications written in .NET 1.1. But today, when calling it from Visual Studio 2005, I noticed something odd: it threw a WebException with a Timeout status every time I invoked it while debugging. It worked fine when the application ran normally, just not under the debugger. That's when a colleague asked me to add the following lines to my machine.config file, after which everything worked fine.

<!-- Goes inside the root <configuration> element of machine.config.
     The Remote.Disable switch turns off the debugging data that the
     Visual Studio debugger attaches to outgoing web service requests. -->
<system.diagnostics>
  <switches>
    <add name="Remote.Disable" value="1"/>
  </switches>
</system.diagnostics>



Symptoms: A web service method runs fine when invoked normally, but throws an exception when it is invoked from Visual Studio 2005 in debug mode. The same method runs fine when invoked from Visual Studio 2003 in debug mode.


Cause: It seems that the Visual Studio 2005 debugger adds some data to each outgoing request, which it uses for debugging. However, this extra data breaks the well-formedness of the request.
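
For reference, here is a minimal sketch of how the failure shows up on the client side. The class name TimeoutCheck and the service URL are placeholders, and a plain HttpWebRequest stands in for the proxy class that "Add Web Reference" would generate; the point is only that the timeout surfaces as a WebException whose Status is WebExceptionStatus.Timeout, which is what distinguishes this problem from an ordinary service error.

using System;
using System.Net;

class TimeoutCheck
{
    // Placeholder URL; substitute the address of the real .asmx service.
    const string ServiceUrl = "http://localhost/AddressService/Standardize.asmx";

    static void Main()
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(ServiceUrl);
        request.Timeout = 30000; // milliseconds

        try
        {
            WebResponse response = request.GetResponse();
            Console.WriteLine("Service responded: " + ((HttpWebResponse)response).StatusCode);
            response.Close();
        }
        catch (WebException ex)
        {
            // Under the VS 2005 debugger the call failed with this status;
            // outside the debugger the same request completed normally.
            if (ex.Status == WebExceptionStatus.Timeout)
                Console.WriteLine("Request timed out: " + ex.Message);
            else
                Console.WriteLine("Other failure: " + ex.Status);
        }
    }
}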

3 comments:

  1. Thank you, this solves the problem in VS 2008 as well. And once it starts happening in 2008 it affects VS 2003 too: we reverted back to our old development environment and the same issue was happening there. That was really starting to drive me insane, as we've been using that environment without a problem for a good few years now!

    So thank you again.

  2. Many thanks, you just saved me days of debugging.

