Why does moving our .NET / SQL Server website to a new host cause the connection pool limit to be exceeded?

We recently moved our company website to a new host. It is an ASP.NET site with C# code that connects to a Microsoft SQL Server database.

Since the site moved to the new server, it has been exceeding the connection pool limit (which is not explicitly set, so I believe the default size of 100 applies). Inspecting open processes in SQL Server Management Studio revealed that every database call seems to leave a connection open, and indeed, I cannot find any explicitly closed connections in the code at all.
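
For reference, the same per-application counts that Management Studio shows can also be queried from sys.dm_exec_sessions. A minimal sketch (the connection string is a placeholder, not our actual Defaults.ConnStr):

    using System;
    using System.Data.SqlClient;

    class OpenConnectionReport
    {
        static void Main()
        {
            // Placeholder connection string; substitute the real one.
            const string connStr = "Server=.;Database=master;Integrated Security=true";

            // Counts active sessions per client application, roughly what
            // Activity Monitor / sp_who2 shows in Management Studio.
            const string sql =
                "SELECT program_name, COUNT(*) AS open_sessions " +
                "FROM sys.dm_exec_sessions " +
                "WHERE is_user_process = 1 " +
                "GROUP BY program_name";

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0}: {1}", reader[0], reader[1]);
                }
            }
        }
    }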

Database connections are performed as follows:

DSLibrary.DataProviders.SqlProvider db = new DSLibrary.DataProviders.SqlProvider(Defaults.ConnStr); 

There is very little documentation for this DSLibrary, and I assume it is a library written by the original website developer. None of the members of its classes explicitly close the connection, and the objects are not wrapped in using blocks that would close it automatically.
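
To illustrate the failure mode (this is not the site's code, just a sketch against a placeholder connection string): connections that are never closed are never returned to the pool, so once 100 are outstanding the next Open() fails.

    using System.Data.SqlClient;

    class PoolExhaustionDemo
    {
        static void Main()
        {
            // Max Pool Size is not set, so the default of 100 applies.
            const string connStr = "Server=.;Database=master;Integrated Security=true;Connect Timeout=5";

            for (int i = 0; i < 150; i++)
            {
                // Opened but never closed or disposed: the connection is not
                // returned to the pool. Around the 101st iteration, Open() throws
                // an InvalidOperationException saying the pool's max size was reached.
                var conn = new SqlConnection(connStr);
                conn.Open();
            }
        }
    }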

My question is twofold:

  • Why did we never encounter this problem while the site was hosted elsewhere for almost 3 years? Is there some automatic cleanup of idle connections on the SQL Server side that I have not configured on the new host?
  • Would I be better off rewriting every connection and stored-procedure call to explicitly open and close the database connection (see the sketch after this list)?
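
For the second point, this is the pattern I am considering rewriting the data access to (the table name is made up; the point is the using blocks):

    using System.Data.SqlClient;

    static class ExplicitCloseExample
    {
        // Hypothetical replacement for one DSLibrary call. The using blocks
        // guarantee Dispose (and therefore Close) runs even on exceptions,
        // so the connection goes straight back to the pool.
        public static int GetItemCount(string connStr)
        {
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Items", conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }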

UPDATE: The maximum number of concurrent connections property (Server Properties → Connections) is set to 0.

If I run the website in debug mode on my development machine, connecting remotely to the production database, the connections seem to close properly. Does this indicate that the problem is related to how IIS is configured?

UPDATE 2: Configuring the application pool to recycle its worker processes after 30 requests stopped the site from exceeding the maximum number of connections, but it is currently breaking some of the (session-based) functionality: the recently-viewed items list resets very quickly, and trying to change anything through the CE is impossible, since you are logged out as soon as the worker process is recycled...

+3
3 answers

Most likely your code is leaking connections everywhere.

My guess is that your old host had the application pool configured to recycle quite often, based either on memory usage or on request count. The new host most likely has the default recycling settings.

My recommendation is first to configure the application pool to recycle much more often. Then fix the code, either by refactoring DSLibrary (which I'm guessing is homegrown) or by simply replacing it with using statements wherever you have a database connection.
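
For example, if SqlProvider holds a SqlConnection internally (I am guessing at its shape, since the library is undocumented), refactoring it along these lines would let callers put it in a using block:

    using System;
    using System.Data.SqlClient;

    // Hypothetical sketch of a disposable provider; the real DSLibrary
    // internals are unknown.
    public class DisposableSqlProvider : IDisposable
    {
        private readonly SqlConnection _connection;

        public DisposableSqlProvider(string connStr)
        {
            _connection = new SqlConnection(connStr);
            _connection.Open();
        }

        public object ExecuteScalar(string sql)
        {
            using (var cmd = new SqlCommand(sql, _connection))
                return cmd.ExecuteScalar();
        }

        public void Dispose()
        {
            // Closes the connection and returns it to the pool.
            _connection.Dispose();
        }
    }

    // Usage:  using (var db = new DisposableSqlProvider(Defaults.ConnStr)) { ... }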

Update
One more thing: change the session state settings to use SQL Server as the backing store so that you do not lose all the session information when the application pool recycles. This will buy you more time.

+3

Have you checked the maximum number of concurrent connections property (Server Properties → Connections)?
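
If you would rather check it from code than through Management Studio, the same setting is exposed in sys.configurations (a quick sketch with a placeholder connection string):

    using System;
    using System.Data.SqlClient;

    class MaxConnectionsCheck
    {
        static void Main()
        {
            // Placeholder connection string.
            const string connStr = "Server=.;Database=master;Integrated Security=true";

            // 'user connections' is the value shown under
            // Server Properties -> Connections; 0 means unlimited.
            const string sql =
                "SELECT value_in_use FROM sys.configurations WHERE name = 'user connections'";

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                Console.WriteLine("Max concurrent connections: " + cmd.ExecuteScalar());
            }
        }
    }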

0

From what you are saying, you do not seem to wrap the database connection in a using statement, so you leak a connection every time. The runtime only reclaims such objects whenever the garbage collector gets around to them unless you explicitly dispose of them - that is the whole point of using. You should always use using (or otherwise ensure that Dispose is invoked) on anything that represents a resource outside of your program: files, database connections, and so on.
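
If you cannot use a using block in some spot, the rough equivalent the compiler generates is a try/finally, and that is the guarantee you need:

    using System.Data.SqlClient;

    static class ManualDisposeExample
    {
        static void Query(string connStr)
        {
            // Roughly what a using statement expands to.
            SqlConnection conn = new SqlConnection(connStr);
            try
            {
                conn.Open();
                // ... execute commands ...
            }
            finally
            {
                // Runs even when an exception is thrown, so the connection
                // is always closed and returned to the pool.
                conn.Dispose();
            }
        }
    }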

0

Source: https://habr.com/ru/post/1313642/

