Wow. So I just spent entirely too much time digging around in the CLR with a reflector, but I think I finally have a good grasp of what's going on here.
The settings are read correctly, but the CLR itself seems to have a deep-seated problem that renders the memory-limit setting essentially useless.
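For context, here is the setting in question as it would normally be configured (a standard System.Runtime.Caching config fragment; the specific values are just illustrative):

```xml
<configuration>
  <system.runtime.caching>
    <memoryCache>
      <namedCaches>
        <!-- cacheMemoryLimitMegabytes is the setting discussed below;
             physicalMemoryLimitPercentage is the "other setting" handled
             by the sibling monitor class. -->
        <add name="Default"
             cacheMemoryLimitMegabytes="10"
             physicalMemoryLimitPercentage="0"
             pollingInterval="00:00:05" />
      </namedCaches>
    </memoryCache>
  </system.runtime.caching>
</configuration>
```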
The following code is reflected out of the System.Runtime.Caching DLL, from the CacheMemoryMonitor class (there is a similar class that monitors physical memory and handles the other setting, but this is the more important one):
    protected override int GetCurrentPressure()
    {
        int num = GC.CollectionCount(2);
        SRef ref2 = this._sizedRef;
        if ((num != this._gen2Count) && (ref2 != null))
        {
            this._gen2Count = num;
            this._idx ^= 1;
            this._cacheSizeSampleTimes[this._idx] = DateTime.UtcNow;
            this._cacheSizeSamples[this._idx] = ref2.ApproximateSize;
            IMemoryCacheManager manager = s_memoryCacheManager;
            if (manager != null)
            {
                manager.UpdateCacheSize(this._cacheSizeSamples[this._idx], this._memoryCache);
            }
        }
        if (this._memoryLimit <= 0L)
        {
            return 0;
        }
        long num2 = this._cacheSizeSamples[this._idx];
        if (num2 > this._memoryLimit)
        {
            num2 = this._memoryLimit;
        }
        return (int) ((num2 * 100L) / this._memoryLimit);
    }
The first thing you might notice is that it doesn't even try to look at the size of the cache until after a Gen2 garbage collection has occurred; until then it just falls back on the existing stored size value in cacheSizeSamples. So you'll never be able to hit the target exactly, but if the rest worked we would at least get a size measurement before we got into real trouble.
So, assuming a Gen2 GC has happened, we run into problem 2, which is that ref2.ApproximateSize does a horrible job of actually approximating the size of the cache. Slogging through CLR junk I found that this is a System.SizedReference, and this is what it does to get the value (the IntPtr is a handle to the MemoryCache object itself):
    [SecurityCritical]
    [MethodImpl(MethodImplOptions.InternalCall)]
    private static extern long GetApproximateSizeOfSizedRef(IntPtr h);
I assume the extern declaration means that at this point it dives into unmanaged Windows land, and I have no idea how to begin figuring out what it does there. From what I've observed, though, it does a horrible job of trying to approximate the size of the overall thing.
The third notable thing is the call to manager.UpdateCacheSize, which sounds like it should actually do something. Unfortunately, in any normal sample of how this is supposed to work, s_memoryCacheManager will always be null. The field is populated from the public static member ObjectCache.Host. This is exposed for the user to mess with if they so choose, and I was in fact able to make this thing sort of work the way it's supposed to by slopping together my own IMemoryCacheManager implementation, setting it on ObjectCache.Host, and then running the sample. At that point, though, it seems like you might as well just write your own cache implementation and not bother with any of this, especially since I have no idea whether setting your own class as ObjectCache.Host (it's static, so it affects every MemoryCache that might be out there in the process) in order to measure the cache could mess up other things.
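For reference, this is roughly what I mean by slopping together an IMemoryCacheManager implementation. A hedged sketch: ObjectCache.Host and the System.Runtime.Caching.Hosting interfaces are real .NET Framework API, but the class name and the logging body are mine.

```csharp
using System;
using System.Runtime.Caching;
using System.Runtime.Caching.Hosting;

// ObjectCache.Host is an IServiceProvider; MemoryCache asks it for an
// IMemoryCacheManager, which then receives the UpdateCacheSize calls
// seen in the reflected code above.
class MyCacheHost : IServiceProvider, IMemoryCacheManager
{
    public object GetService(Type serviceType)
    {
        // Hand back ourselves when the cache asks for a memory manager.
        return serviceType == typeof(IMemoryCacheManager) ? this : null;
    }

    public void UpdateCacheSize(long size, MemoryCache cache)
    {
        // Invoked from CacheMemoryMonitor after each Gen2 GC sample.
        Console.WriteLine("Cache '{0}' approx size: {1} bytes", cache.Name, size);
    }

    public void ReleaseCache(MemoryCache cache)
    {
        // Invoked when the cache is disposed; nothing to clean up here.
    }
}

// Must be assigned before any MemoryCache is created, and it is
// process-global, which is exactly the caveat mentioned above:
//   ObjectCache.Host = new MyCacheHost();
```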
I have to believe that at least part of this (if not a couple of parts) is simply a bug. It would be nice to hear from someone at MS what the deal is with this thing.
TL;DR version of this giant answer: assume that CacheMemoryLimitMegabytes is completely busted at this point in time. You can set it to 10 MB, then proceed to fill the cache up to ~2 GB and blow an out-of-memory exception with no item eviction ever being triggered.
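If you want to see this failure mode end to end, here is a minimal repro sketch using the standard System.Runtime.Caching API. How far the cache actually grows before anything happens will vary by machine and GC behavior, so I'm deliberately not promising specific output.

```csharp
using System;
using System.Collections.Specialized;
using System.Runtime.Caching;

class LimitIgnoredRepro
{
    static void Main()
    {
        var config = new NameValueCollection
        {
            { "cacheMemoryLimitMegabytes", "10" },  // supposedly a 10 MB cap
            { "pollingInterval", "00:00:01" }       // check pressure every second
        };
        var cache = new MemoryCache("repro", config);

        long added = 0;
        for (int i = 0; i < 500; i++)
        {
            // 1 MB per entry; on an afflicted runtime this sails far past
            // the 10 MB limit without any evictions kicking in.
            cache.Add("key" + i, new byte[1024 * 1024], new CacheItemPolicy());
            added += 1024 * 1024;
        }

        Console.WriteLine("Added {0} MB, cache still holds {1} items",
                          added / (1024 * 1024), cache.GetCount());
    }
}
```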