Naturally, separating IIS and SQL Server onto different machines is the first step. SQL Server really wants a whole machine to itself.
Secondly, it is important to analyze the application as it runs. Never try to optimize your application's performance without real usage data, because you will probably just spend time optimizing code paths that are rarely executed. One method I have used with success in the past is to create a System.Diagnostics.Stopwatch in Application_BeginRequest in Global.asax and save it in a context variable:
var sw = new Stopwatch();
sw.Start();
HttpContext.Current.Items["stopwatch"] = sw;
In Application_EndRequest, you retrieve the still-running stopwatch:
var sw = (Stopwatch)HttpContext.Current.Items["stopwatch"];
sw.Stop();
TimeSpan ts = sw.Elapsed;
Then write to a log table how long the request took to process. Also log the URL (with or without query-string parameters) and anything else that will help you analyze performance.
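Putting the pieces together, a minimal Global.asax.cs sketch might look like the following. The RequestLog table, its columns, and the "LoggingDb" connection-string name are illustrative assumptions on my part, not a prescribed schema:

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Start a stopwatch for this request and stash it in the
        // per-request Items collection.
        var sw = new Stopwatch();
        sw.Start();
        HttpContext.Current.Items["stopwatch"] = sw;
    }

    protected void Application_EndRequest(object sender, EventArgs e)
    {
        var sw = HttpContext.Current.Items["stopwatch"] as Stopwatch;
        if (sw == null) return;
        sw.Stop();

        var ctx = HttpContext.Current;
        // User ID only for authenticated users; NULL otherwise.
        string user = (ctx.User != null && ctx.User.Identity.IsAuthenticated)
            ? ctx.User.Identity.Name
            : null;

        // "RequestLog" table and "LoggingDb" connection string are assumed names.
        using (var conn = new SqlConnection(
            ConfigurationManager.ConnectionStrings["LoggingDb"].ConnectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO RequestLog (Url, DurationMs, ClientIp, UserName, LoggedAt) " +
            "VALUES (@url, @ms, @ip, @user, GETUTCDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@url", ctx.Request.RawUrl);
            cmd.Parameters.AddWithValue("@ms", sw.Elapsed.TotalMilliseconds);
            cmd.Parameters.AddWithValue("@ip", ctx.Request.UserHostAddress);
            cmd.Parameters.AddWithValue("@user", (object)user ?? DBNull.Value);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}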
Then you can analyze your application and find which operations take the longest, which are called most often, and so on. A page that is requested frequently and consistently takes a long time to complete should be the first target for optimization, using whatever tools you have for this, both .NET and SQL profilers.
Other things I usually log are the client IP address and, for registered users, the user ID. That also gives me an invaluable debugging tool when errors occur.
The reason to put this in a table rather than writing to a log file is that you can use SQL to filter, group, calculate average times, and so on.
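For example, two queries against the assumed RequestLog schema from the sketch above (again illustrative, not a fixed schema) show the kind of analysis a flat log file makes hard:

-- Slowest pages on average.
SELECT TOP 20 Url, COUNT(*) AS Hits, AVG(DurationMs) AS AvgMs, MAX(DurationMs) AS WorstMs
FROM RequestLog
GROUP BY Url
ORDER BY AvgMs DESC;

-- Pages consuming the most total time (frequency times duration);
-- these are usually the best optimization targets.
SELECT TOP 20 Url, COUNT(*) AS Hits, SUM(DurationMs) AS TotalMs
FROM RequestLog
GROUP BY Url
ORDER BY TotalMs DESC;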
Pete