ASP.NET Reference

I am using a background thread in my ASP.NET web service application. The job of this thread is to query the database at a certain interval and refresh the data in the cache. The data table has about 500,000 rows. When I watch the process in Task Manager, the web development server initially consumes about 300,000 K, then it grows to 500,000 K, sometimes to more than 1,000,000 K, and sometimes it drops back to 500,000-600,000 K. Since I am doing this on my local machine, the data in the database does not change. Can someone tell me what I am doing wrong in this code:

    protected void Application_Start(object sender, EventArgs e)
    {
        Thread obj = new Thread(new ThreadStart(AddDataInCache));
        obj.IsBackground = true;
        obj.Start();
    }

    private void AddDataInCache()
    {
        Int32 iCount = 0;
        while (true)
        {
            MyCollection _myCollection = new MyCollection();
            DataTable dtReferences = null;
            DataTable dtMainData = null;
            try
            {
                dtMainData = _myCollection.GetAllDataForCaching(ref dtReferences);
                HttpRuntime.Cache.Insert("DATA_ALL_CACHING", dtMainData, null,
                    Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
                    CacheItemPriority.Default, null);
                HttpRuntime.Cache.Insert("DATA_REFERENCES_CACHING", dtReferences, null,
                    Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
                    CacheItemPriority.NotRemovable, null);
            }
            catch (Exception ex)
            {
            }
            finally
            {
                if (_myCollection != null)
                    _myCollection = null;
            }
            iCount++;
            Thread.Sleep(18000);
        }
    }

In GetAllDataForCaching I get a SqlDataReader from my data access layer like this:

    public DataTable GetAllDataForCaching(ref DataTable dReferenceTable)
    {
        DataTable dtReturn = new DataTable();
        SqlDataReader dReader = null;
        try
        {
            dReader = SqlHelper.ExecuteReader(CommandType.StoredProcedure, "[GetDataForCaching]", null);
            if (dReader != null && dReader.HasRows)
            {
                dtReturn.Load(dReader);
                dReferenceTable = new DataTable();
                if (dReader.HasRows)
                {
                    DataTable dtSchema = dReader.GetSchemaTable();
                    List<DataColumn> listCols = new List<DataColumn>();
                    if (dtSchema != null)
                    {
                        foreach (DataRow drow in dtSchema.Rows)
                        {
                            string columnName = System.Convert.ToString(drow["ColumnName"]);
                            DataColumn column = new DataColumn(columnName, (Type)(drow["DataType"]));
                            column.Unique = (bool)drow["IsUnique"];
                            column.AllowDBNull = (bool)drow["AllowDBNull"];
                            column.AutoIncrement = (bool)drow["IsAutoIncrement"];
                            listCols.Add(column);
                            dReferenceTable.Columns.Add(column);
                        }
                    }
                    while (dReader.Read())
                    {
                        DataRow dataRow = dReferenceTable.NewRow();
                        for (int i = 0; i < listCols.Count; i++)
                        {
                            dataRow[((DataColumn)listCols[i])] = dReader[i];
                        }
                        dReferenceTable.Rows.Add(dataRow);
                    }
                }
            }
        }
        finally
        {
            if (dReader != null)
            {
                if (dReader.IsClosed == false)
                    dReader.Close();
                dReader = null;
            }
        }
        return dtReturn;
    }

I am using Visual Studio 2008.

+1
5 answers

I fixed this by putting the following code just before Thread.Sleep(18000);

    GC.Collect();
    GC.WaitForPendingFinalizers();

It keeps the memory under control now.

+1

I will start by looking at this part of the question:

... is that sometimes Cache returns null ...

This is probably because it takes the background thread some time to populate the cache. When Application_Start fires, you start the background thread, and then Application_Start returns. The application can then move on to other work, for example processing a page.

If, while processing a page, something tries to read the cache before the first iteration of AddDataInCache has completed, the cache will return null.

As for the memory consumption, I don't immediately see how you could improve the situation other than by reducing the number of rows in the cached data tables.

The first time AddDataInCache runs, the cache is empty to begin with. Your GetAllDataForCaching then creates two DataTables and populates them with data. This forces the process to allocate memory to hold the data in those DataTables.

On the second and subsequent iterations of AddDataInCache, the cache already contains all the data retrieved on the previous run. Again you create two new DataTables and fill them with data. This requires enough memory to hold both the existing data in the cache and the new data in the DataTables created by the second run. Then, once the second run has finished loading the data, you replace the existing cache entries with the new data.

At this point the data cached by the first run becomes eligible for garbage collection. But that does not mean the memory is reclaimed immediately. The memory is reclaimed when the garbage collector runs and notices that those DataTables are no longer reachable.

Note that the items cached by the first run only become eligible for garbage collection if no live objects hold a reference to them. Make sure any references you obtain from the cache are short-lived.
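This eligibility rule can be observed directly with WeakReference. The sketch below is only a demonstration (it forces a collection with GC.Collect, which production code should not do); GcDemo and Allocate are made-up names for the illustration.

```csharp
using System;
using System.Runtime.CompilerServices;

public static class GcDemo
{
    // NoInlining keeps the JIT from extending the allocated object's
    // lifetime into the caller's stack frame.
    [MethodImpl(MethodImplOptions.NoInlining)]
    private static WeakReference Allocate()
    {
        return new WeakReference(new byte[1024]);
    }

    public static bool CollectedWhenUnreferenced()
    {
        WeakReference weak = Allocate();
        // No strong reference to the byte array remains, so it is
        // eligible for collection; force a collection to observe it.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
        return !weak.IsAlive;
    }
}
```

Had a strong reference to the array still been held anywhere, IsAlive would stay true, which is exactly how a lingering reference to an old cached DataTable keeps its memory pinned.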

And while all of this is happening, your background thread happily carries on refreshing the cache. So it is possible that a third cache update begins before the garbage collector has freed the memory for the DataTables retrieved by the first run, driving memory consumption even higher.

So to reduce memory consumption, I think you simply have to reduce the amount of data stored in the cache (fewer rows, fewer columns). It may also help to increase the interval between cache updates.

Finally, make sure you do not keep old versions of the cached objects alive by referencing them from long-running requests or processes.

+3

You would be much better off with a timer than a sleeping thread. Timers are more efficient in terms of both memory and CPU.
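A minimal sketch of the timer-based approach (CacheRefresher, Start, and RefreshCache are illustrative names, not from the original code): System.Threading.Timer invokes its callback on a thread-pool thread at a fixed interval, so no dedicated thread sits blocked in Thread.Sleep between refreshes.

```csharp
using System;
using System.Threading;

public static class CacheRefresher
{
    // Keep a reference to the timer so it is not garbage-collected.
    private static Timer _timer;
    public static int RefreshCount;

    // Call once, e.g. from Application_Start.
    public static void Start(TimeSpan interval)
    {
        // First callback fires immediately (TimeSpan.Zero), then
        // repeats every `interval` on a thread-pool thread.
        _timer = new Timer(RefreshCache, null, TimeSpan.Zero, interval);
    }

    private static void RefreshCache(object state)
    {
        // The real refresh work (querying the database and calling
        // HttpRuntime.Cache.Insert) would go here.
        Interlocked.Increment(ref RefreshCount);
    }

    public static void Stop()
    {
        if (_timer != null)
        {
            _timer.Dispose();
            _timer = null;
        }
    }
}
```

Note that if nothing holds a reference to a System.Threading.Timer it can be collected and its callbacks silently stop; storing it in a static field, as above, avoids that pitfall.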

+2

I agree with Peter, and I recommend using System.Threading.Timer. You may find the following link useful:

http://blogs.msdn.com/b/tmarq/archive/2007/07/21/an-ounce-of-prevention-using-system-threading-timer-in-an-asp-net-application.aspx

+2

First, you should use using blocks (see IDisposable) when working with database connections, commands, readers, and so on.
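As a self-contained illustration of why this matters (TrackingResource and UsingDemo are made-up names for the demo): the using statement calls Dispose even when the block exits through an exception, which is exactly the guarantee you want for readers and connections.

```csharp
using System;

// A made-up resource that records whether Dispose was called.
public sealed class TrackingResource : IDisposable
{
    public bool Disposed;
    public void Dispose() { Disposed = true; }
}

public static class UsingDemo
{
    public static bool DisposeRunsOnException()
    {
        TrackingResource resource = new TrackingResource();
        try
        {
            using (resource)
            {
                throw new InvalidOperationException("simulated failure");
            }
        }
        catch (InvalidOperationException)
        {
            // The exception escapes the using block, but Dispose has
            // already run by the time we reach this handler.
        }
        return resource.Disposed;
    }
}
```

The try/finally in the question's GetAllDataForCaching achieves the same effect by hand, but a using block is shorter and also disposes the reader rather than merely closing it.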

Secondly, the web cache may be cleared by an application pool recycle or an IIS reset. That is why you cannot rely on your items staying in the cache "forever". This is a safe way to get the data:

    private DataTable GetDataWithReferences(out DataTable dtReferences)
    {
        dtReferences = (DataTable)HttpRuntime.Cache["DATA_REFERENCES_CACHING"];
        DataTable dtMainData = (DataTable)HttpRuntime.Cache["DATA_ALL_CACHING"];
        if (null == dtMainData)
        {
            dtMainData = _myCollection.GetAllDataForCaching(/* ref - why? */ out dtReferences);
            // cache insert
        }
        return dtMainData;
    }
0
