.NET MemoryCache: how does it enforce its memory limit?

.NET MemoryCache is a cache of C# objects. Some of those objects may have a complex structure, and some may hold unsafe references. Is there some C# magic behind PhysicalMemoryLimit, or does it just estimate the approximate size of each object?

I suspect the latter. But if I put the same object into the cache several times (for example, to track missing items), will its size be counted once, or once for every cache entry that holds that instance?
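To make the scenario concrete, here is a minimal sketch of storing one instance under two keys (the key names and sizes are just illustrative, not from the question):

    using System;
    using System.Runtime.Caching;

    class SameInstanceTwice
    {
        static void Main()
        {
            var cache = MemoryCache.Default;
            var policy = new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(5) };

            var item = new byte[10 * 1024 * 1024]; // one ~10 MB object
            cache.Set("primary-key", item, policy);
            cache.Set("alias-key", item, policy);  // same instance, second cache entry

            // The question: does the cache's memory accounting see ~10 MB or ~20 MB?
            Console.WriteLine(cache.GetCount()); // 2 entries either way
        }
    }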

+5
4 answers

.NET MemoryCache is similar to the ASP.NET Cache class. If we look at the ASP.NET Cache, we see the CacheItemRemovedCallback delegate, which is invoked when an item is removed from the cache.

The callback receives a CacheItemRemovedReason. Looking at the possible reasons, we see that an item can be removed from the cache because the system removed it to free memory. So while PhysicalMemoryLimit gives the percentage of physical memory that the cache can use, I think they leave it to the system to trim the cache once it reaches that limit.

If you put an item into the cache with the Add method, it is added as a new CacheItem, so it will be counted once per entry. If you use the AddOrGetExisting method, the cache checks whether the item is already there; if so, it uses the existing instance rather than adding a new one, so the object is counted only once.
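A minimal sketch of that difference (key names and values are illustrative, not from the answer):

    using System;
    using System.Runtime.Caching;

    class AddVersusAddOrGetExisting
    {
        static void Main()
        {
            var cache = MemoryCache.Default;
            var policy = new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(5) };
            var shared = new byte[1024];

            // Add creates a new entry per key, even if the value is the same instance.
            cache.Add("key-a", shared, policy);
            cache.Add("key-b", shared, policy); // second entry, same object reference

            // AddOrGetExisting keeps whatever is already stored under the key and
            // returns it; null means there was no existing entry.
            var existing = cache.AddOrGetExisting("key-a", new byte[1024], policy);
            Console.WriteLine(ReferenceEquals(existing, shared)); // True: original instance kept
        }
    }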

Hopefully this points you in the right direction.

+7

Reading the documentation, it seems that the cache does not try to calculate the size of the objects it caches. That makes sense, because this is not something that can be done from within the process itself for arbitrary types (you can do it for fixed-size structures or arrays of fixed-size structures, but that is about it); a little searching will confirm this for you. However, the cache does know how much RAM is available on the machine; you can get this yourself from new Microsoft.VisualBasic.Devices.ComputerInfo().AvailablePhysicalMemory. So, apparently, the cache does two things:

  • It keeps track of when each object was last used.
  • It polls memory statistics at a set interval.

Then, on each poll, the amount of available memory is either within the acceptable limit or it is not. If it is, the cache does nothing. If it is not, the cache starts evicting items, least recently used first, and keeps evicting until memory is back within the acceptable limit.
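As a rough illustration of that check (this is not the cache's actual implementation; the 90% threshold below is just an example), the machine-wide figures mentioned above can be read like this:

    // Requires a reference to the Microsoft.VisualBasic assembly.
    using System;
    using Microsoft.VisualBasic.Devices;

    class MemoryPollSketch
    {
        static void Main()
        {
            var info = new ComputerInfo();
            ulong total = info.TotalPhysicalMemory;
            ulong available = info.AvailablePhysicalMemory;

            double usedPercent = 100.0 * (total - available) / total;
            Console.WriteLine("Physical memory in use: " + usedPercent.ToString("F1") + "%");

            // A cache limited to 90% of physical memory would start evicting entries,
            // least recently used first, once this figure crosses that threshold.
            if (usedPercent > 90)
                Console.WriteLine("Over the limit: start evicting least-recently-used entries.");
        }
    }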

If you think about it, that is about all you can do with the information available to the cache.

This strategy is fine, but it obviously breaks down if you have other objects that hold references to items in the cache, because removing an item from the cache then does not make it eligible for garbage collection. That is what the removal callback is for: it gives you a place to run cleanup so that no other references to the object remain.
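A minimal sketch of that cleanup pattern, assuming some external dictionary (here called externalIndex, which is purely illustrative) holds extra references to cached objects:

    using System;
    using System.Collections.Concurrent;
    using System.Runtime.Caching;

    class RemovedCallbackCleanup
    {
        static readonly ConcurrentDictionary<string, object> externalIndex =
            new ConcurrentDictionary<string, object>();

        static void Main()
        {
            var cache = MemoryCache.Default;
            var policy = new CacheItemPolicy
            {
                RemovedCallback = args =>
                {
                    // args.RemovedReason is CacheEntryRemovedReason.Evicted when the
                    // cache trimmed itself under memory pressure. Drop the outside
                    // reference so the object really becomes collectible.
                    object ignored;
                    externalIndex.TryRemove(args.CacheItem.Key, out ignored);
                }
            };

            var payload = new byte[1024];
            cache.Add("item-1", payload, policy);
            externalIndex["item-1"] = payload;
        }
    }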

+2

Here is the source. The answer to your second question becomes obvious if you look at the implementation of the Add method (referencesource.microsoft.com), which calls AddOrGetExisting.
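A quick way to see that relationship in behaviour (a sketch; the key and values are illustrative):

    using System;
    using System.Runtime.Caching;

    class AddDelegatesToAddOrGetExisting
    {
        static void Main()
        {
            var cache = MemoryCache.Default;
            var policy = new CacheItemPolicy();

            // Add behaves like "AddOrGetExisting(...) == null": it returns true only
            // when nothing was already cached under the key.
            Console.WriteLine(cache.Add("k", "first", policy));  // True  - new entry
            Console.WriteLine(cache.Add("k", "second", policy)); // False - key already present
            Console.WriteLine(cache.Get("k"));                   // "first" - existing value kept
        }
    }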

I do not know about the sizing, but I think your assumption that there is no magic is right. If you are interested, you can also study the source in detail.

+1

You cannot easily set the PhysicalMemoryLimit or CacheMemoryLimit properties of the MemoryCache class, because they have no setters (at least in .NET 4.0).
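Since those properties are read-only, the limits have to be supplied when the cache is constructed (or in app.config). A minimal sketch with example values:

    using System;
    using System.Collections.Specialized;
    using System.Runtime.Caching;

    class ConfiguredCache
    {
        static void Main()
        {
            // These are the configuration keys MemoryCache recognizes; the values
            // below are just example numbers.
            var config = new NameValueCollection
            {
                { "cacheMemoryLimitMegabytes", "100" },    // absolute cap for this cache
                { "physicalMemoryLimitPercentage", "50" }, // % of machine RAM it may use
                { "pollingInterval", "00:01:00" }          // how often the limits are checked
            };

            var cache = new MemoryCache("configuredCache", config);
            Console.WriteLine(cache.CacheMemoryLimit);    // in bytes
            Console.WriteLine(cache.PhysicalMemoryLimit); // as a percentage
            Console.WriteLine(cache.PollingInterval);     // as a TimeSpan
        }
    }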

I agree with the other answers that AddOrGetExisting should be used if you need only one cached instance of the object. Otherwise, you can cache the additional items under different keys.

-2
