I am writing a stored procedure that, when finished, will be used to scan staging tables for dummy data, column by column.
The first step in the exercise is simply to scan the table; that is what the code below does. The problem is that, as a SQLCLR stored procedure, this code takes 5 minutes 45 seconds to run.
    using (SqlConnection sqlConnection = new SqlConnection("context connection=true"))
    {
        sqlConnection.Open();

        string sqlText = string.Format("select * from {0}", source_table.Value);
        int count = 0;

        using (SqlCommand sqlCommand = new SqlCommand(sqlText, sqlConnection))
        using (SqlDataReader reader = sqlCommand.ExecuteReader())
        {
            // Count the rows by reading them one at a time.
            while (reader.Read())
                count++;

            // Send the row count back to the caller as a one-column result set.
            SqlDataRecord record = new SqlDataRecord(new SqlMetaData("rowcount", SqlDbType.Int));
            SqlContext.Pipe.SendResultsStart(record);
            record.SetInt32(0, count);
            SqlContext.Pipe.SendResultsRow(record);
            SqlContext.Pipe.SendResultsEnd();
        }
    }
However, the same code (with a different connection string, of course) completes in the console application in about 44 seconds, which is closer to what I expected from the client side.
What am I missing on the stored procedure side that makes it run so slowly?
Please note: I fully understand that if all I needed were the number of rows, I should use a COUNT(*) aggregate; that is not the purpose of this exercise.
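For reference, the COUNT(*) alternative mentioned above might look like the sketch below inside the same SQLCLR procedure. This is only an illustration under the same assumptions as the original code (the same source_table parameter and the context connection); it is not the point of the question.

    // Sketch only: let the server do the counting instead of the CLR read loop.
    // Assumes the same source_table parameter and SQLCLR context connection.
    using (SqlConnection sqlConnection = new SqlConnection("context connection=true"))
    {
        sqlConnection.Open();

        string sqlText = string.Format("select count(*) from {0}", source_table.Value);
        using (SqlCommand sqlCommand = new SqlCommand(sqlText, sqlConnection))
        {
            // ExecuteScalar returns the single value of the single result row.
            int count = (int)sqlCommand.ExecuteScalar();
            SqlContext.Pipe.Send(count.ToString());
        }
    }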