Time-to-live memoization in F#

I'm not sure I've got this right; is there a better way, or an existing library that already solves this problem?

In particular, I'm not sure whether the CAS needs a memory fence ... I think it doesn't, but it seems better to ask.

I also tried a version with an agent and a mutable dictionary, but my intuition that it would be slower was confirmed, and the implementation was more complicated.

    module CAS =
        open System.Threading

        let create (value: 'T) =
            let cell = ref value
            let get () = !cell
            let rec swap f =
                let before = get ()
                let newValue = f before
                match Interlocked.CompareExchange<'T>(cell, newValue, before) with
                | result when obj.ReferenceEquals(before, result) -> newValue
                | _ -> swap f
            get, swap

    module Memoization =
        let timeToLive milis f =
            let get, swap = CAS.create Map.empty
            let evict key =
                async {
                    do! Async.Sleep milis
                    swap (Map.remove key) |> ignore
                } |> Async.Start
            fun key ->
                let data = get ()
                match data.TryFind key with
                | Some v -> v
                | None ->
                    let v = f key
                    swap (Map.add key v) |> ignore
                    evict key
                    v
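
For illustration, here is a minimal usage sketch of the memoizer above; slowSquare and the 1000 ms TTL are made-up examples, not part of the original code:

    // Hypothetical example: wrap a slow function with a 1-second TTL per key
    let slowSquare (x : int) =
        System.Threading.Thread.Sleep 200   // simulate an expensive call
        x * x

    let cachedSquare = Memoization.timeToLive 1000 slowSquare

    cachedSquare 4 |> printfn "%d"   // first call computes the value (slow)
    cachedSquare 4 |> printfn "%d"   // served from the Map until eviction ~1s later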
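
For reference, a rough sketch of what the agent-plus-mutable-dictionary variant mentioned above might look like (AgentMemoization, Msg and the message names are illustrative, not taken from the original attempt):

    open System.Collections.Generic

    module AgentMemoization =
        // Messages handled by the agent that owns the mutable dictionary
        type private Msg<'k, 'v> =
            | Get of 'k * AsyncReplyChannel<'v option>
            | Put of 'k * 'v
            | Evict of 'k

        let timeToLive (millis : int) (f : 'k -> 'v) =
            let agent = MailboxProcessor.Start(fun inbox ->
                let cache = Dictionary<'k, 'v>()
                let rec loop () = async {
                    let! msg = inbox.Receive()
                    match msg with
                    | Get (key, reply) ->
                        match cache.TryGetValue key with
                        | true, v  -> reply.Reply (Some v)
                        | false, _ -> reply.Reply None
                    | Put (key, value) -> cache.[key] <- value
                    | Evict key -> cache.Remove key |> ignore
                    return! loop () }
                loop ())
            fun key ->
                match agent.PostAndReply (fun rc -> Get (key, rc)) with
                | Some v -> v
                | None ->
                    let v = f key
                    agent.Post (Put (key, v))
                    async {
                        do! Async.Sleep millis
                        agent.Post (Evict key)
                    } |> Async.Start
                    v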
1 answer

If you are willing to restrict what you memoize to functions that take string input, you can reuse the functionality in System.Runtime.Caching.

This should be reliable enough, being part of the core library (you would hope...), but the string-only restriction is pretty heavy, and you would need to benchmark it against your current implementation if you want to do a performance comparison.

    open System
    open System.Runtime.Caching

    type Cached<'a>(func : string -> 'a, cache : IDisposable) =
        member x.Func : string -> 'a = func
        interface IDisposable with
            member x.Dispose () = cache.Dispose ()

    let cache timespan (func : string -> 'a) =
        let cache = new MemoryCache(typeof<'a>.FullName)
        let newFunc parameter =
            match cache.Get(parameter) with
            | null ->
                let result = func parameter
                let ci = CacheItem(parameter, result :> obj)
                let cip = CacheItemPolicy()
                cip.AbsoluteExpiration <- DateTimeOffset(DateTime.UtcNow + timespan)
                cip.SlidingExpiration <- TimeSpan.Zero
                cache.Add(ci, cip) |> ignore
                result
            | result -> (result :?> 'a)
        new Cached<'a>(newFunc, cache)

    let cacheAsync timespan (func : string -> Async<'a>) =
        let cache = new MemoryCache(typeof<'a>.FullName)
        let newFunc parameter =
            match cache.Get(parameter) with
            | null ->
                async {
                    let! result = func parameter
                    let ci = CacheItem(parameter, result :> obj)
                    let cip = CacheItemPolicy()
                    cip.AbsoluteExpiration <- DateTimeOffset(DateTime.UtcNow + timespan)
                    cip.SlidingExpiration <- TimeSpan.Zero
                    cache.Add(ci, cip) |> ignore
                    return result
                }
            | result -> async { return (result :?> 'a) }
        new Cached<Async<'a>>(newFunc, cache)

Usage:

    let getStuff =
        let cached = cacheAsync (TimeSpan(0, 0, 5)) uncachedGetStuff
        // deal with the fact that the cache is IDisposable here
        // however is appropriate...
        cached.Func

If you are never interested in accessing the underlying cache directly, you could obviously just return a new function with the same signature as the old one, but given that the cache is IDisposable, that seemed unwise.
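
For comparison, a minimal sketch of that simpler shape, under the assumption that you are happy to let the MemoryCache live for the lifetime of the returned closure (cachePlain is an illustrative name, not part of the answer):

    // Hypothetical variant: same signature in and out; the MemoryCache behind
    // the closure is never explicitly disposed, which is the trade-off noted above.
    let cachePlain timespan (func : string -> 'a) : string -> 'a =
        let cached = cache timespan func
        cached.Func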

In many ways I think I prefer your solution, but when I ran into a similar problem I had the perverse feeling that I really ought to use the built-in machinery if I could.
