Almost every time exceptions are mentioned in mailing lists and newsgroups, people say they're really expensive, and should be avoided in almost all situations. To give an idea of just how expensive some people believe they are, in one article someone asked whether the roughly 200 exceptions an hour his web application was throwing were likely to be harming its performance. Various people replied that they would indeed be causing a problem. Let's examine that claim, shall we?
The True Cost of Exceptions
Here's a short program which just throws exceptions and catches them, to see how fast it can do so:
using System;

public class Test
{
    const int Iterations = 5000000;

    static void Main()
    {
        DateTime start = DateTime.UtcNow;
        for (int i = 0; i < Iterations; i++)
        {
            try
            {
                throw new ApplicationException();
            }
            catch (ApplicationException)
            {
            }
        }
        DateTime end = DateTime.UtcNow;
        long millis = (long) (end - start).TotalMilliseconds;
        Console.WriteLine ("Total time taken: {0}", end - start);
        Console.WriteLine ("Exceptions per millisecond: {0}", Iterations / millis);
    }
}
Now, the above isn't geared towards absolute accuracy - it uses DateTime.UtcNow to measure time, just for convenience - but if you give it enough iterations to make the test run for a fair time (half a minute or so), any inaccuracies due to the low-resolution timer and the JIT compiler are likely to be lost in the noise. The main thing is to see roughly how expensive exceptions are. Here are the results on my laptop, using .NET 1.1, running outside the debugger (see later for why running outside the debugger matters):
Total time taken: 00:00:42.0312500
Exceptions per millisecond: 118
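Incidentally, if you're on .NET 2.0 or later, the System.Diagnostics.Stopwatch class gives a higher-resolution timer than DateTime.UtcNow. A sketch of the same loop using it (with a smaller iteration count here, just to keep the run short) might look like this:

```csharp
using System;
using System.Diagnostics;

public class StopwatchTest
{
    const int Iterations = 100000;

    public static void Main()
    {
        // Stopwatch uses the high-resolution performance counter
        // where one is available, so short runs are measured more
        // accurately than with DateTime.UtcNow.
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            try
            {
                throw new ApplicationException();
            }
            catch (ApplicationException)
            {
            }
        }
        sw.Stop();
        Console.WriteLine("Total time taken: {0}ms", sw.ElapsedMilliseconds);
        if (sw.ElapsedMilliseconds > 0)
        {
            Console.WriteLine("Exceptions per millisecond: {0}",
                              Iterations / sw.ElapsedMilliseconds);
        }
    }
}
```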
Now, that doesn't involve any significant depth of stack, and indeed if you change the test to recurse until it reaches a certain stack depth, it does become significantly slower - recursing to a depth of 20 takes the results down to about 42 exceptions per millisecond. Also, running with .NET 2.0 beta 2 gives fairly different results - even the test above only manages to throw about 40 exceptions per millisecond. However, those differences are only a factor of three or so - not enough to change the overall picture, which is that exception performance is clearly pretty good.
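The recursive variant mentioned above might look something like the sketch below (the Depth constant and the ThrowAtDepth helper are my names, not from the original test):

```csharp
using System;

public class RecursionTest
{
    const int Iterations = 100000;
    const int Depth = 20;

    public static void Main()
    {
        DateTime start = DateTime.UtcNow;
        for (int i = 0; i < Iterations; i++)
        {
            try
            {
                ThrowAtDepth(Depth);
            }
            catch (ApplicationException)
            {
            }
        }
        DateTime end = DateTime.UtcNow;
        Console.WriteLine("Total time taken: {0}", end - start);
    }

    // Recurse to the given depth before throwing, so the exception
    // has a deeper stack to unwind before being caught in Main.
    static void ThrowAtDepth(int depth)
    {
        if (depth <= 0)
        {
            throw new ApplicationException();
        }
        ThrowAtDepth(depth - 1);
    }
}
```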
Let's look back at the example from the newsgroups - 200 exceptions being thrown in an hour. Even assuming a server which was 10 times slower than my laptop (which seems unlikely) and assuming a fairly deep stack, those 200 exceptions would still only take about 50ms. That's less than 0.002% of the hour. In other words, those exceptions weren't significant at all when it came to performance.
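The arithmetic behind that estimate is worth making explicit. Assuming a pessimistic 4 exceptions per millisecond (the ~42/ms measured with a deep stack, divided by 10 for a slower server), the cost of 200 exceptions works out like this:

```csharp
using System;

public class CostEstimate
{
    // Milliseconds spent throwing 'exceptionsPerHour' exceptions at a
    // rate of 'exceptionsPerMs' exceptions per millisecond.
    public static double CostMs(double exceptionsPerHour, double exceptionsPerMs)
    {
        return exceptionsPerHour / exceptionsPerMs;
    }

    public static void Main()
    {
        double costMs = CostMs(200, 4.0);                     // 50ms per hour
        double percentOfHour = costMs / (60.0 * 60.0 * 1000.0) * 100;
        Console.WriteLine("Cost: {0}ms, about {1:0.0000}% of an hour",
                          costMs, percentOfHour);
    }
}
```

That comes to roughly 0.0014% of the hour, comfortably under the 0.002% quoted above.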