If you are willing to restrict memoization to functions that take string input, you can reuse the functionality in System.Runtime.Caching.
Being part of the base class library, it should be reliably solid (you would hope...), but the string-key restriction is quite heavy handed, and you would want to benchmark it against your current implementation if performance matters.
open System
open System.Runtime.Caching

/// Wraps the cached function together with the underlying cache,
/// so the caller can still dispose of the MemoryCache.
type Cached<'a>(func : string -> 'a, cache : IDisposable) =
    member x.Func : string -> 'a = func
    interface IDisposable with
        member x.Dispose () = cache.Dispose ()

let cache timespan (func : string -> 'a) =
    let cache = new MemoryCache(typeof<'a>.FullName)
    let newFunc parameter =
        match cache.Get(parameter) with
        | null ->
            // Cache miss: evaluate, then store with an absolute expiration.
            let result = func parameter
            let ci = CacheItem(parameter, result :> obj)
            let cip = CacheItemPolicy()
            cip.AbsoluteExpiration <- DateTimeOffset(DateTime.UtcNow + timespan)
            cip.SlidingExpiration <- TimeSpan.Zero
            cache.Add(ci, cip) |> ignore
            result
        | result ->
            result :?> 'a
    new Cached<'a>(newFunc, cache)

let cacheAsync timespan (func : string -> Async<'a>) =
    let cache = new MemoryCache(typeof<'a>.FullName)
    let newFunc parameter =
        match cache.Get(parameter) with
        | null ->
            async {
                // Cache miss: run the async computation, then store the result.
                let! result = func parameter
                let ci = CacheItem(parameter, result :> obj)
                let cip = CacheItemPolicy()
                cip.AbsoluteExpiration <- DateTimeOffset(DateTime.UtcNow + timespan)
                cip.SlidingExpiration <- TimeSpan.Zero
                cache.Add(ci, cip) |> ignore
                return result
            }
        | result ->
            async { return result :?> 'a }
    new Cached<Async<'a>>(newFunc, cache)
Usage:

let getStuff =
    let cached = cacheAsync (TimeSpan(0, 0, 5)) uncachedGetStuff
    cached.Func
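If the cached wrapper only needs a bounded lifetime, a use binding will dispose of the underlying MemoryCache for you. A minimal sketch, assuming a hypothetical fetchOverHttp : string -> Async<string>:

let runBatch keys =
    async {
        // Dispose the underlying MemoryCache once the batch completes.
        use cached = cacheAsync (TimeSpan(0, 5, 0)) fetchOverHttp
        let! results =
            keys
            |> List.map cached.Func
            |> Async.Parallel
        return results
    }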
If you are never interested in accessing the underlying cache directly, you could obviously just return a new function with the same signature as the old one (sketched below), but since the cache is IDisposable, that seemed unwise.
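For completeness, here is a minimal sketch of that simpler shape; the name cacheForever and the decision to simply leak the MemoryCache are my assumptions, not part of the code above:

// Sketch only: same caching approach, but returning a plain string -> 'a
// and deliberately never disposing the underlying MemoryCache.
let cacheForever timespan (func : string -> 'a) : string -> 'a =
    let cache = new MemoryCache(typeof<'a>.FullName)
    fun (parameter : string) ->
        match cache.Get(parameter) with
        | null ->
            let result = func parameter
            let cip = CacheItemPolicy()
            cip.AbsoluteExpiration <- DateTimeOffset(DateTime.UtcNow + timespan)
            cache.Add(CacheItem(parameter, box result), cip) |> ignore
            result
        | hit -> hit :?> 'a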
In many ways I think I prefer your solution, but when I ran into a similar problem I had the perverse thought that I really ought to use the built-in functionality if I could.
mavnn