Is type T caching possible?

Is it possible to avoid dropping T down to Object when items are placed in the cache?

WeakReference requires the use of objects, and System.Runtime.Caching.MemoryCache is locked to the object type.

Do custom dictionaries/collections cause problems for the Garbage Collector, or do you need to run your own cleanup (on a separate thread)?

Is it possible to have the best of both worlds?


I know I already accepted an answer, but a generic WeakReference is now possible! It looks like it made it into .NET 4.

http://msdn.microsoft.com/en-us/library/gg712911(v=VS.96).aspx


The old feature request for it:

http://connect.microsoft.com/VisualStudio/feedback/details/98270/make-a-generic-form-of-weakreference-weakreference-t-where-t-class
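
For the record, on the desktop framework the generic WeakReference<T> ultimately shipped with .NET 4.5. A minimal usage sketch:

    using System;

    class WeakRefDemo
    {
        static void Main()
        {
            var data = new byte[1024];                  // strong reference keeps it alive
            var weak = new WeakReference<byte[]>(data);

            // TryGetTarget replaces the object-typed Target property of the
            // old non-generic WeakReference, so no cast is needed.
            byte[] target;
            if (weak.TryGetTarget(out target))
                Console.WriteLine("Still alive: " + target.Length + " bytes");
        }
    }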

2 answers

There is nothing to prevent you from creating a generic wrapper around MemoryCache - perhaps with a constraint requiring reference types:

    using System.Runtime.Caching;

    public class Cache<T> where T : class
    {
        // Note: MemoryCache has no parameterless constructor; it needs a
        // name (or use MemoryCache.Default).
        private readonly MemoryCache cache = new MemoryCache(typeof(T).Name);

        public T this[string key]
        {
            get { return (T) cache[key]; }
            set { cache[key] = value; }
        }

        // etc
    }

Obviously, it's only worth delegating the parts of MemoryCache that you're actually interested in.
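
A quick usage sketch for the wrapper above (Person is a hypothetical type, used only for illustration):

    // Person is a hypothetical type used only for illustration.
    public class Person
    {
        public string Name { get; set; }
    }

    public static class CacheDemo
    {
        public static void Main()
        {
            var people = new Cache<Person>();
            people["alice"] = new Person { Name = "Alice" };

            // The generic indexer hands back a Person directly;
            // no (Person) cast is needed at the call site.
            Person alice = people["alice"];
            System.Console.WriteLine(alice.Name);
        }
    }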


So you basically want to inject a cache provider that only returns certain types? Isn't that against the whole point of OOP?

The idea of the object type is that everything is an object, so by using a cache that caches instances of object, you can cache anything.

By creating a cache that only caches objects of a certain predefined type, you limit the functionality of your cache.

Nothing prevents you from implementing a custom cache provider with a generic constraint so that it only caches objects of certain types, and this will theoretically save you about 2 ticks (not even milliseconds) per lookup.

One way to look at this:

Which is more important to me?

  • Good OOP based on best practices.
  • About 20 milliseconds over the lifetime of my cache provider.

One more thing: .NET already optimizes boxing and unboxing to the extreme. At the end of the day, when you "cache" something, you are just putting it somewhere you can get it back quickly, keeping a pointer to its location for that later lookup.
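
To make the boxing point concrete: storing a reference type as object just copies a reference, and only value types pay for a box. A purely illustrative sketch:

    using System;

    class BoxingDemo
    {
        static void Main()
        {
            // Reference types: assigning to object copies a reference, no box.
            string s = "cached value";
            object o1 = s;
            Console.WriteLine(ReferenceEquals(s, o1)); // True - same instance

            // Value types: assigning to object allocates a box on the heap.
            int n = 42;
            object o2 = n;        // boxing
            int back = (int) o2;  // unboxing
            Console.WriteLine(back);
        }
    }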

I have seen solutions that stream 4 GB XML files through a business process using objects that are destroyed and recreated on every call. The point was that the process flow mattered, not the initialization and setup cost, so the overhead made sense there.

How important is the time lost to that cast? I'd be interested to hear more about a scenario that demands that kind of speed.

As a side note: another thing I've noticed with newer technologies such as LINQ and Entity Framework is that it is the result of the query that is important to cache when the query takes a long time, not so much any side effects on the result.

That means (for example): if I were caching a basic "default instance" of an object that takes a complex set of entity queries to create, I would cache the resulting object rather than the queries.
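
A minimal sketch of that idea, assuming System.Runtime.Caching is available; GetDefaults and its Enumerable pipeline stand in for a hypothetical expensive LINQ / Entity Framework query:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Runtime.Caching;

    static class QueryCacheDemo
    {
        static readonly ObjectCache Cache = MemoryCache.Default;

        // GetDefaults stands in for an expensive LINQ / Entity Framework
        // query; the point is to cache the materialized result, not the query.
        public static List<int> GetDefaults()
        {
            var cached = Cache["defaults"] as List<int>;
            if (cached != null)
                return cached;

            List<int> result = Enumerable.Range(1, 100)
                                         .Where(n => n % 2 == 0)
                                         .ToList(); // run once, materialize

            Cache.Set("defaults", result, DateTimeOffset.Now.AddMinutes(10));
            return result;
        }
    }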

Since Microsoft has already done the groundwork here, I would ask: what am I caching, and why?

