MemoryCache does not obey memory limits in configuration

I am working with the .NET 4.0 MemoryCache class in an application and trying to limit the maximum cache size, but in my tests it does not appear that the cache is actually honoring the limits.

I am using the settings that, according to MSDN, should limit the cache size:

  • CacheMemoryLimitMegabytes : "The maximum memory size, in megabytes, that an instance of an object can grow to."
  • PhysicalMemoryLimitPercentage : "The percentage of physical memory that the cache can use, expressed as an integer value from 1 to 100. The default value is zero, which indicates that MemoryCache instances manage their own memory based on the amount of memory that is installed on the computer." (This is not entirely correct: any value below 4 is ignored and replaced with 4.)

I understand that these values are approximate and not hard limits, since the thread that purges the cache fires every x seconds and also depends on the polling interval and other undocumented variables. However, even taking these variances into account, I see wildly inconsistent cache sizes when the first item is evicted from the cache after setting CacheMemoryLimitMegabytes and PhysicalMemoryLimitPercentage together or individually in a test app. To be sure, I ran each test 10 times and calculated the average figure.

These are the results of testing the code below on a 32-bit Windows 7 PC with 3 GB of RAM. The cache size is taken after the first call to CacheItemRemoved() in each test. (I am aware that the actual cache size will be larger than this.)

    MemLimitMB   MemLimitPct   AVG Cache MB on first expiry
    1            NA            84
    2            NA            84
    3            NA            84
    6            NA            84
    NA           1             84
    NA           4             84
    NA           10            84
    10           20            81
    10           30            81
    10           39            82
    10           40            79
    10           49            146
    10           50            152
    10           60            212
    10           70            332
    10           80            429
    10           100           535
    100          39            81
    500          39            79
    900          39            83
    1900         39            84
    900          41            81
    900          46            84
    900          49            ~1.8 GB in Task Manager, no memory errors
    200          49            156
    100          49            153
    2000         60            214
    5            60            78
    6            60            76
    7            100           82
    10           100           541

Here is a test application:

    using System;
    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.Linq;
    using System.Runtime.Caching;
    using System.Text;

    namespace FinalCacheTest
    {
        internal class Cache
        {
            private Object Statlock = new object();
            private int ItemCount;
            private long size;
            private MemoryCache MemCache;
            private CacheItemPolicy CIPOL = new CacheItemPolicy();

            public Cache(long CacheSize)
            {
                CIPOL.RemovedCallback = new CacheEntryRemovedCallback(CacheItemRemoved);
                NameValueCollection CacheSettings = new NameValueCollection(3);
                CacheSettings.Add("CacheMemoryLimitMegabytes", Convert.ToString(CacheSize));
                CacheSettings.Add("physicalMemoryLimitPercentage", Convert.ToString(49)); // set % here
                CacheSettings.Add("pollingInterval", Convert.ToString("00:00:10"));
                MemCache = new MemoryCache("TestCache", CacheSettings);
            }

            public void AddItem(string Name, string Value)
            {
                CacheItem CI = new CacheItem(Name, Value);
                MemCache.Add(CI, CIPOL);

                lock (Statlock)
                {
                    ItemCount++;
                    size = size + (Name.Length + Value.Length * 2);
                }
            }

            public void CacheItemRemoved(CacheEntryRemovedArguments Args)
            {
                Console.WriteLine("Cache contains {0} items. Size is {1} bytes", ItemCount, size);

                lock (Statlock)
                {
                    ItemCount--;
                    size = size - 108;
                }

                Console.ReadKey();
            }
        }
    }

    namespace FinalCacheTest
    {
        internal class Program
        {
            private static void Main(string[] args)
            {
                int MaxAdds = 5000000;
                Cache MyCache = new Cache(1); // set CacheMemoryLimitMegabytes

                for (int i = 0; i < MaxAdds; i++)
                {
                    MyCache.AddItem(Guid.NewGuid().ToString(), Guid.NewGuid().ToString());
                }

                Console.WriteLine("Finished Adding Items to Cache");
            }
        }
    }

Why is MemoryCache not subject to established memory limits?

+79
c# caching memorycache
7 answers

Wow, so I just spent far too much time digging around in the CLR with Reflector, but I think I finally have a good handle on what is going on here.

The settings are being read correctly, but there appears to be a deep-seated problem in the CLR itself that looks like it renders the memory limit setting essentially useless.

The following code is reflected out of the System.Runtime.Caching DLL, from the CacheMemoryMonitor class (there is a similar class that monitors physical memory and handles the other setting, but this is the more important one):

    protected override int GetCurrentPressure()
    {
        int num = GC.CollectionCount(2);
        SRef ref2 = this._sizedRef;
        if ((num != this._gen2Count) && (ref2 != null))
        {
            this._gen2Count = num;
            this._idx ^= 1;
            this._cacheSizeSampleTimes[this._idx] = DateTime.UtcNow;
            this._cacheSizeSamples[this._idx] = ref2.ApproximateSize;
            IMemoryCacheManager manager = s_memoryCacheManager;
            if (manager != null)
            {
                manager.UpdateCacheSize(this._cacheSizeSamples[this._idx], this._memoryCache);
            }
        }
        if (this._memoryLimit <= 0L)
        {
            return 0;
        }
        long num2 = this._cacheSizeSamples[this._idx];
        if (num2 > this._memoryLimit)
        {
            num2 = this._memoryLimit;
        }
        return (int) ((num2 * 100L) / this._memoryLimit);
    }

The first thing you might notice is that it doesn't even try to look at the size of the cache until after a Gen2 garbage collection, instead just falling back on the existing stored size value in cacheSizeSamples. So you will never be able to hit your target dead on, but if the rest worked we would at least get a size measurement before we got into real trouble.
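One practical consequence, if the reflected code above is accurate, is that in a test you can coax the monitor into taking a fresh sample by forcing a full collection yourself: GC.Collect() with no arguments collects all generations, which advances the counter that GC.CollectionCount(2) returns. A minimal sketch:

```csharp
using System;

internal static class Gen2Probe
{
    private static void Main()
    {
        // The monitor only takes a new size sample when this counter changes.
        int before = GC.CollectionCount(2);

        // A forced full collection includes gen 2, so the counter advances
        // and the next polling pass will sample a fresh cache size.
        GC.Collect();
        GC.WaitForPendingFinalizers();

        int after = GC.CollectionCount(2);
        Console.WriteLine(after > before); // True
    }
}
```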

So, assuming a Gen2 GC has occurred, we run into problem 2, which is that ref2.ApproximateSize does a horrible job of actually approximating the size of the cache. Slogging through CLR junk I found that this is a System.SizedReference, and this is what it does to get the value (the IntPtr is a handle to the MemoryCache object itself):

    [SecurityCritical]
    [MethodImpl(MethodImplOptions.InternalCall)]
    private static extern long GetApproximateSizeOfSizedRef(IntPtr h);

I assume that the extern declaration means that at this point it dives into unmanaged Windows land, and I have no idea how to begin figuring out what it does there. From what I have observed, though, it does an awful job of trying to approximate the size of the overall thing.

The third noteworthy thing is the call to manager.UpdateCacheSize, which sounds like it should do something. Unfortunately, in any normal sample of how this should work, s_memoryCacheManager will always be null. The field is set from the public static member ObjectCache.Host. This is exposed for the user to mess with if he so chooses, and I was actually able to make this thing sort of work as intended by slapping together my own IMemoryCacheManager implementation, setting it on ObjectCache.Host, and then running the sample. At that point, though, it seems like you might as well just make your own cache implementation and not even bother with all this, especially since I have no idea whether setting your own class on ObjectCache.Host (it is static, so it affects every one of these that might be out there in the process) in order to measure the cache could mess up other things.
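For anyone wanting to experiment with that route, a minimal sketch of such a host might look like the following. The class name and logging body are my own invention; only IServiceProvider, IMemoryCacheManager (from System.Runtime.Caching.Hosting), and ObjectCache.Host come from the framework, and this is a sketch rather than a production-ready host:

```csharp
using System;
using System.Runtime.Caching;
using System.Runtime.Caching.Hosting;

// Hypothetical host that just logs the sizes the cache monitor reports.
internal class LoggingCacheHost : IServiceProvider, IMemoryCacheManager
{
    public void UpdateCacheSize(long size, MemoryCache cache)
    {
        // Called from the monitor's polling pass with the sampled size.
        Console.WriteLine("Cache '{0}' reported size: {1} bytes", cache.Name, size);
    }

    public void ReleaseCache(MemoryCache cache)
    {
        // Called when a cache instance is disposed; nothing to clean up here.
    }

    public object GetService(Type serviceType)
    {
        // The caching infrastructure asks the host for IMemoryCacheManager.
        return serviceType == typeof(IMemoryCacheManager) ? this : null;
    }
}

// Usage: ObjectCache.Host can only be assigned once per process, and must be
// set before the caches you want monitored are created:
//
//     ObjectCache.Host = new LoggingCacheHost();
//     var cache = new MemoryCache("Monitored");
```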

I have to believe that at least part of this (if not a couple of parts) is simply a bug. It would be nice to hear from someone at MS what the deal was with this thing.

TL;DR version of this giant answer: assume that CacheMemoryLimitMegabytes is completely broken at this point in time. You can set it to 10 MB, then proceed to fill the cache up to ~2 GB and blow an out-of-memory exception with no item eviction ever being triggered.

+94

I know this answer is crazy late, but better late than never. I wanted to let you know that I wrote a version of MemoryCache that resolves the Gen 2 collection issue automatically: it trims whenever the polling interval indicates memory pressure. If you are experiencing this problem, give it a go!

http://www.nuget.org/packages/SharpMemoryCache

You can also find it on GitHub if you are interested in how I solved it. The code is fairly simple.

https://github.com/haneytron/sharpmemorycache

+28
Apr 6 '14 at 21:21

I also ran into this problem. I am caching objects that are churned through my process dozens of times per second.

I have found that the following configuration and usage frees items every 5 seconds most of the time.

App.config:

Note cacheMemoryLimitMegabytes. When this was set to zero, the purging routine would not fire in a reasonable amount of time.

    <system.runtime.caching>
      <memoryCache>
        <namedCaches>
          <add name="Default"
               cacheMemoryLimitMegabytes="20"
               physicalMemoryLimitPercentage="0"
               pollingInterval="00:00:05" />
        </namedCaches>
      </memoryCache>
    </system.runtime.caching>

Adding to the cache:

    MemoryCache.Default.Add(someKeyValue, objectToCache, new CacheItemPolicy
    {
        AbsoluteExpiration = DateTime.Now.AddSeconds(5),
        RemovedCallback = cacheItemRemoved
    });

Confirm cache removal works:

    void cacheItemRemoved(CacheEntryRemovedArguments arguments)
    {
        System.Diagnostics.Debug.WriteLine("Item removed from cache: {0} at {1}",
            arguments.CacheItem.Key, DateTime.Now.ToString());
    }
+4
Aug 12 '15 at 18:18

I (fortunately) stumbled across this useful post yesterday when first attempting to use MemoryCache. I thought it would be a simple case of setting the values and using the class, but I ran into similar issues to those outlined above. To see what was going on, I extracted the source using ILSpy, then set up a test and stepped through the code. My test code was very similar to the code above, so I won't post it. From my tests I noticed that the measurement of the cache size was never particularly accurate (as mentioned above), and given the current implementation it would never work reliably. However, the physical memory measurement was fine, and if the physical memory were measured on every poll, it seemed to me the code would work reliably. So I removed the gen 2 garbage collection check within MemoryCacheStatistics; under normal conditions no memory measurements will be taken unless there has been another gen 2 garbage collection since the last measurement.

In a test scenario this obviously makes a big difference, since the cache is being hit constantly and objects never get the chance to reach gen 2. I think we will use the modified build of this DLL for our project and move to the official MS build when .NET 4.5 comes out (which, according to the Connect article mentioned above, should contain the fix). Logically I can see why the gen 2 check was put in place, but in practice I am not sure it makes much sense. If the memory reaches 90% (or whatever limit it has been set to), then it should not matter whether a gen 2 collection has occurred or not; items should be evicted regardless.

I left my test code running for about 15 minutes with physicalMemoryLimitPercentage set to 65%. I saw the memory usage stay between 65-68% during the test and saw items being evicted correctly. In my test I set pollingInterval to 5 seconds, physicalMemoryLimitPercentage to 65, and cacheMemoryLimitMegabytes to 0 (its default).

Following the advice above, an implementation of IMemoryCacheManager could be used to evict things from the cache. It would, however, suffer from the gen 2 check issue mentioned; although, depending on the scenario, this may not be a problem in production code and may work well enough for people.

+3
Nov 24 '11 at 9:44

If you use the following modified class and monitor the memory via Task Manager, the cache does in fact get trimmed:

    internal class Cache
    {
        private Object Statlock = new object();
        private int ItemCount;
        private long size;
        private MemoryCache MemCache;
        private CacheItemPolicy CIPOL = new CacheItemPolicy();

        public Cache(double CacheSize)
        {
            NameValueCollection CacheSettings = new NameValueCollection(3);
            CacheSettings.Add("cacheMemoryLimitMegabytes", Convert.ToString(CacheSize));
            CacheSettings.Add("pollingInterval", Convert.ToString("00:00:01"));
            MemCache = new MemoryCache("TestCache", CacheSettings);
        }

        public void AddItem(string Name, string Value)
        {
            CacheItem CI = new CacheItem(Name, Value);
            MemCache.Add(CI, CIPOL);
            Console.WriteLine(MemCache.GetCount());
        }
    }
+2

I did some testing with the example from @Canacourse and the modification from @woany, and I think there are some critical calls that block the clearing of the memory cache.

    public void CacheItemRemoved(CacheEntryRemovedArguments Args)
    {
        // this WriteLine() will block the thread of
        // the MemoryCache long enough to slow it down,
        // and it will never catch up the amount of memory
        // beyond the limit
        Console.WriteLine("...");

        // ...

        // this ReadKey() will block the thread of
        // the MemoryCache completely, till you press any key
        Console.ReadKey();
    }

But why does the @woany modification appear to keep the memory at the same level? First, the RemovedCallback is not set, and there is no console output or waiting for input that could block the thread of the memory cache.

Secondly...

    public void AddItem(string Name, string Value)
    {
        // ...

        // this WriteLine will block the main thread long enough,
        // so that the thread of the MemoryCache can do its work more frequently
        Console.WriteLine("...");
    }

A Thread.Sleep(1) after every ~1000th AddItem() would have the same effect.

Well, it's not a very deep investigation of the problem, but it looks as if the thread of the MemoryCache does not get enough CPU time for cleanup while many new items are being added.
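Applied to the original test loop, the workaround might look like this. The cache call is elided; the point is only the periodic yield, and the counter exists just to make the pattern visible:

```csharp
using System;
using System.Threading;

internal static class ThrottledLoader
{
    private static void Main()
    {
        int sleeps = 0;
        for (int i = 1; i <= 5000; i++)
        {
            // MyCache.AddItem(...) would go here in the real test.

            if (i % 1000 == 0)
            {
                // Yield briefly so background threads (such as the
                // MemoryCache trim thread) get a chance to run.
                Thread.Sleep(1);
                sleeps++;
            }
        }
        Console.WriteLine(sleeps); // 5
    }
}
```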

+2
Nov 11 '13 at 14:20

It turned out that this is not a bug; all you have to do is set the polling time span to enforce the limits. It seems that if you leave polling unset, it will never fire. I just tested it, and no wrappers or any additional code were needed:

    private static readonly NameValueCollection Collection = new NameValueCollection
    {
        {"CacheMemoryLimitMegabytes", "20"},
        {"PollingInterval", TimeSpan.FromMilliseconds(60000).ToString()} // this will check the limits every 60 seconds
    };

Set the "PollingInterval" value based on how fast the cache grows; if it grows too fast, increase the frequency of the polling checks, otherwise keep the checks fairly infrequent to avoid overhead.
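For completeness, a cache built from that collection might be wired up as follows (the cache name and the sample key/value are arbitrary; the settings are applied per instance through the MemoryCache constructor):

```csharp
using System;
using System.Collections.Specialized;
using System.Runtime.Caching;

internal static class BoundedCacheDemo
{
    private static readonly NameValueCollection Collection = new NameValueCollection
    {
        {"CacheMemoryLimitMegabytes", "20"},
        {"PollingInterval", TimeSpan.FromMilliseconds(60000).ToString()} // check limits every 60 seconds
    };

    private static void Main()
    {
        // The NameValueCollection is passed to the constructor, so these
        // limits apply only to this instance, not to MemoryCache.Default.
        var cache = new MemoryCache("BoundedCache", Collection);
        cache.Add("key", "value", new CacheItemPolicy());
        Console.WriteLine(cache.GetCount()); // 1
    }
}
```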

0
Dec 07 '18 at 14:12


