
An easy to use thread safe in-memory caching service with a simple developer friendly API for c#

Home Page: https://nuget.org/packages/LazyCache

License: MIT License


lazycache's Introduction

Lazy Cache


Lazy Cache is a simple in-memory caching service. It has a developer-friendly, generics-based API, and provides a thread-safe cache implementation that guarantees your cacheable delegates execute only once (it's lazy!). Under the hood it leverages Microsoft.Extensions.Caching and Lazy to provide performance and reliability in heavy-load scenarios.

Download

LazyCache is available on NuGet. To install LazyCache, run the following command in the Package Manager Console:

PM> Install-Package LazyCache

Quick start

See the quick start wiki

Sample code

// Create our cache service using the defaults (Dependency injection ready).
// By default it uses a single shared cache under the hood so cache is shared out of the box (but you can configure this)
IAppCache cache = new CachingService();

// Declare (but don't execute) a func/delegate whose result we want to cache
Func<ComplexObjects> complexObjectFactory = () => methodThatTakesTimeOrResources();

// Get our ComplexObjects from the cache, or build them in the factory func 
// and cache the results for next time under the given key
ComplexObjects cachedResults = cache.GetOrAdd("uniqueKey", complexObjectFactory);

As you can see, the magic happens in the GetOrAdd() method, which gives you an atomic and tidy way to add caching to your code. It leverages a factory delegate (Func) and generics to make it easy to add cached method calls to your app.

It means you avoid the usual "check the cache, execute the factory function, add the results to the cache" pattern, it saves you writing the double-checked locking cache pattern, and it means you can be a lazy developer!
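For contrast, the "check, execute, add" boilerplate that GetOrAdd removes looks something like the sketch below (written against Microsoft.Extensions.Caching.Memory directly; `ComplexObjects` and the factory method are stand-ins from the sample above):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

public class ComplexObjects { }

public static class ManualCachingExample
{
    private static readonly MemoryCache Cache = new MemoryCache(new MemoryCacheOptions());

    private static ComplexObjects MethodThatTakesTimeOrResources() => new ComplexObjects();

    public static ComplexObjects GetComplexObjects()
    {
        // 1. Check the cache
        if (Cache.TryGetValue("uniqueKey", out ComplexObjects cached))
            return cached;

        // 2. Execute the factory function. Note this naive version is NOT
        // atomic: two threads that miss simultaneously both run the factory,
        // which is exactly what LazyCache's Lazy-based GetOrAdd prevents.
        var built = MethodThatTakesTimeOrResources();

        // 3. Add the results to the cache
        Cache.Set("uniqueKey", built, TimeSpan.FromMinutes(20));
        return built;
    }
}
```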

What should I use it for?

LazyCache suits caching database calls, complex object-graph building routines, and web service calls that should be cached for performance. It allows items to be cached for long or short periods, but defaults to 20 minutes.

.NET Framework and .NET Core support

The latest version targets netstandard 2.0. See .NET Standard implementation support.

For .NET Core 2, .NET Framework net461 or above, or netstandard 2+, use LazyCache 2 or above.

For .NET Framework versions without netstandard 2 support, such as net45, net451 and net46, use LazyCache 0.7 - 1.x.

For .NET Framework 4.0, use LazyCache 0.6.

Features

  • Simple API with familiar sliding or absolute expiration
  • Guaranteed single evaluation of your factory delegate whose results you want to cache
  • Strongly typed generics based API. No need to cast your cached objects every time you retrieve them
  • Stops you inadvertently caching an exception by removing Lazys that evaluate to an exception
  • Thread safe, concurrency ready
  • Async compatible - lazy single evaluation of async delegates using GetOrAddAsync()
  • Interface based API and built in MockCache to support test driven development and dependency injection
  • Leverages a provider model on top of IMemoryCache under the hood and can be extended with your own implementation
  • Good test coverage
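A sketch of the async variant mentioned above, assuming the `GetOrAddAsync` overload that takes a key, an async factory, and an absolute expiry:

```csharp
using System;
using System.Threading.Tasks;
using LazyCache;

public static class AsyncExample
{
    public static async Task<string> GetReportAsync(IAppCache cache)
    {
        // Concurrent callers using the same key share one underlying Task,
        // so the delegate still runs only once.
        Func<Task<string>> slowLookup = async () =>
        {
            await Task.Delay(1000); // stand-in for a database or web service call
            return "expensive result";
        };

        return await cache.GetOrAddAsync(
            "report-key", slowLookup, DateTimeOffset.Now.AddMinutes(5));
    }
}
```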

Documentation

Sample Application

See CacheDatabaseQueriesApiSample for an example of how to use LazyCache to cache the results of an Entity Framework query in a Web API controller. Watch how the cache saves trips to the database, and how results are returned to the client far quicker from the in-memory cache.

Contributing

If you have an idea or want to fix an issue please open an issue on Github to discuss it and it will be considered.

If you have code to share you should submit a pull request: fork the repo, create a branch on your fork with your changes, and when you are happy create a pull request from your branch into LazyCache master for review. See https://help.github.com/en/articles/creating-a-pull-request-from-a-fork.

LazyCache is narrow in focus and well established, so it is unlikely to accept massive changes out of nowhere, but come and talk about it on GitHub and we can all collaborate on something that works for everyone. It is also quite extensible, so you may be able to extend it in your project or add a companion library if necessary.

lazycache's People

Contributors

alastairtree, allanrodriguez, doolali, gamblen, jjxtra, jnyrup, johnl404, regisbsb, rmandvikar, rudithus-interview, smurariu, svengeance, talmroth


lazycache's Issues

GetOrAdd(...) fails after GetOrAddAsync(...)

Please, consider the following failing sample unit test:

        [Test]
        public async Task GetOrAddFollowinGetOrAddAsyncTryOut()
        {
            Func<Task<ComplexTestObject>> fetchAsync = () => Task.FromResult(testObject);
            await sut.GetOrAddAsync(TestKey, fetchAsync);
            var actualAsync = await sut.GetAsync<ComplexTestObject>(TestKey);
            Assert.IsNotNull(actualAsync);
            Assert.That(actualAsync, Is.EqualTo(testObject));

            Func<ComplexTestObject> fetchSync = () => testObject;
            sut.GetOrAdd(TestKey, fetchSync);
            var actualSync = sut.Get<ComplexTestObject>(TestKey);
            Assert.IsNotNull(actualSync); // actualSync == null
        }

Note:

  • Fails on the last assertion because actualSync is null.
  • If GetOrAdd is called before GetOrAddAsync, the test succeeds.

I assume that the private method UnwrapLazy is not considering the new AsyncLazy enhancements from September. It would be great if you could fix that, as I would like to use this library in a mixed sync/async environment.

Thanks for sharing this library, great work.

Best regards, BadDonkey

Allow an item factory to return its own DateTimeOffset

Hi,

Something like the following:

T GetOrAdd<T>(string key, Func<Tuple<T, DateTimeOffset>> addItemFactory, DateTimeOffset absoluteExpiration)

Would be useful for caching resources that return an expiry - e.g. auth tokens.

Do you think this'd fit within the bounds of the project or is it better to simply extend it?

cheers,

Scott.
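One way to prototype the suggestion above without changing the core API is an extension method over the overload that exposes the cache entry (a sketch: `GetOrAddWithExpiry` and the tuple-returning factory are illustrative names, and it assumes LazyCache 2.x's `GetOrAdd<T>(string, Func<ICacheEntry, T>)` overload):

```csharp
using System;
using LazyCache;

public static class AppCacheExtensions
{
    // Hypothetical helper: the factory supplies both the value and its own
    // absolute expiry (e.g. taken from an auth token response).
    public static T GetOrAddWithExpiry<T>(
        this IAppCache cache, string key,
        Func<(T Value, DateTimeOffset Expires)> factory)
    {
        return cache.GetOrAdd(key, entry =>
        {
            var (value, expires) = factory();
            entry.AbsoluteExpiration = expires; // expiry chosen by the factory
            return value;
        });
    }
}
```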

Consider not deprecating default constructor?

For uses without dependency injection, such as an F# webjob I'm working on now, please consider not removing the default constructor, or have another way of building the CachingService. I'm not using dependency injection since that isn't as common among functional applications and instead need a single, long-lived instance of the CachingService in my app.

Invalidate/Clear a cache instance

I can remove a cache item, but how do I clear the whole cache? I need this after a database deployment, in an update scenario where everything has to be loaded from the database afresh and the cache has to be invalidated.

Cheers
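One common workaround, pending a built-in Clear(), is the expiration-token trick from Microsoft.Extensions.Caching.Memory: tie every entry to a shared CancellationTokenSource and cancel it to evict everything at once. A sketch against MemoryCache directly (`ResettableCache` is an illustrative name):

```csharp
using System;
using System.Threading;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

public class ResettableCache
{
    private CancellationTokenSource _resetToken = new CancellationTokenSource();
    private readonly MemoryCache _cache = new MemoryCache(new MemoryCacheOptions());

    public T GetOrCreate<T>(string key, Func<T> factory) =>
        _cache.GetOrCreate(key, entry =>
        {
            // Every entry expires when the shared token is cancelled.
            entry.AddExpirationToken(new CancellationChangeToken(_resetToken.Token));
            return factory();
        });

    public void Clear()
    {
        // Swap in a fresh token, then cancel the old one to evict all entries.
        var old = Interlocked.Exchange(ref _resetToken, new CancellationTokenSource());
        old.Cancel();
        old.Dispose();
    }
}
```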

OnCacheExpiration event

Is there any way to configure an event to be run when the cache expires?

My scenario is the following: I have a web api that serves a list of banners. I want to increase their 'views' property every time I display every banner. Instead of updating the repository every time, it's better to update that property in the cached list and update the repository only when the cache expires.
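For the scenario above, a post-eviction callback may fit: LazyCache 2.x accepts MemoryCacheEntryOptions, which supports RegisterPostEvictionCallback. A sketch (`LoadBanners` is a hypothetical loader; note these callbacks fire lazily, typically on a later cache access, so the timing is approximate):

```csharp
using System;
using System.Collections.Generic;
using LazyCache;
using Microsoft.Extensions.Caching.Memory;

public static class BannerCacheExample
{
    // Hypothetical stand-in for the repository call.
    static List<string> LoadBanners() => new List<string> { "banner1", "banner2" };

    public static List<string> GetBanners(IAppCache cache)
    {
        var options = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
        };

        // Runs when the entry is evicted - e.g. flush the accumulated
        // view counters back to the repository at that point.
        options.RegisterPostEvictionCallback((key, value, reason, state) =>
            Console.WriteLine($"'{key}' evicted ({reason}); persist counters here."));

        return cache.GetOrAdd("banners", () => LoadBanners(), options);
    }
}
```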

I've forked and retargeted for multiple frameworks, net45, net 46, and netstandard2.0

Would you like me to contribute to your library?

I've forked your project and upgraded the solution to support the multiple-framework targeting available in Visual Studio 2017. This means it can no longer be opened in VS 2015 (that's the downside). On the upside, I've upgraded it to the latest NuGet packaging spec, and it now produces targeted packages: the original net45, a native net46 library, and a new native netstandard2.0 library.

The original net45 (and net46) builds use exactly the same code as the branch I pulled down.

Note that netstandard2.0 no longer supports the System.Runtime.Caching library; its replacement is Microsoft.Extensions.Caching.Memory.

So I had to put in some compiler directives that alias the object names so the code looks the same and points to the correct implementation for netstandard2.0. This approach lets users of your library use it on netstandard2.0 (and, most importantly, ASP.NET Core) with no modification in 95% of all cases, but it sacrifices the clarity that would result if users were "forced" to use the new specific class names in Microsoft.Extensions.Caching.Memory.

However, we are currently using your library and we need to port to ASP.NET Core ASAP, so I wanted our code to be up and running as quickly as possible. I went with the aliasing route, and it just drops in and works in our stuff.

Anyway, I wanted to give you the chance to receive these changes and incorporate them if you are interested.

Double check lock in GetOrAdd

What do you think about using a double-checked locking approach in CachingService.GetOrAdd, like this: 1. try to get the value from the cache; 2. if the key is missed, acquire a lock; 3. get or create inside the lock?
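A sketch of the double-checked approach being proposed (this is not LazyCache's actual implementation, which achieves single evaluation with Lazy<T> instead):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

public class DoubleCheckedCache
{
    private readonly MemoryCache _cache = new MemoryCache(new MemoryCacheOptions());
    private readonly object _gate = new object();

    public T GetOrAdd<T>(string key, Func<T> factory)
    {
        if (_cache.TryGetValue(key, out T hit))   // 1. lock-free read
            return hit;

        lock (_gate)                               // 2. acquire lock on a miss
        {
            if (_cache.TryGetValue(key, out hit))  // 3. re-check inside the lock
                return hit;

            var value = factory();
            _cache.Set(key, value, TimeSpan.FromMinutes(20));
            return value;
        }
    }
}
```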

Add restricted sliding expiration policy

Thank you for creating this package, it looks great.

One of the features I'd like to see is a restricted sliding expiration policy which is really just a combination of absolute and sliding expirations.

I've written a few systems that benefit from this because most of the work is done within the sliding expiration, but if the data usage remains hot, there is no eviction from the cache. Absolute expiration is not always appropriate either because you don't get the advantage of releasing memory if it is not used (sliding).

Thoughts?
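For what it's worth, the combination described above can already be expressed with the Microsoft.Extensions.Caching.Memory entry options that LazyCache 2.x accepts; both settings are honoured and the entry is evicted by whichever fires first (a sketch):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

public static class RestrictedSlidingExample
{
    // "Restricted sliding" = slide while the data stays hot, but never
    // let the entry outlive a hard cap.
    public static MemoryCacheEntryOptions RestrictedSliding() =>
        new MemoryCacheEntryOptions
        {
            SlidingExpiration = TimeSpan.FromMinutes(5),            // evict if unused for 5 min
            AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(1) // evict after 1 h regardless
        };
}
```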

Trying to use LazyCache and get some cache hit/miss stats.

Hi @alastairtree

I'm trying to leverage LazyCache but also get some stats on the number of hits/misses in some time frame.

So let's say: given a 1-hour window, how many hits/misses?

First, what's the start of the 1-hour window? Is it DateTime.Now for the first time this piece of code has run? (Think multiple web requests at the same time, too.)

Then, how could we increment two different cache keys? I'm assuming that a miss ends up calling the lazy delegate method, so that's not too bad. But I still don't know where I could store a counter for misses that would reset (be evicted?) after an hour.

Then the hits! GetOrAddAsync will just return something... a counter? Which I then need to increment thread-safely and THEN reset after an hour again.

So - tough problem to solve I think :(

Could you offer any ideas here how LazyCache might be able to leverage some of this?

EDIT: Oh! forgot to mention some reason/scenario also -> at the end of each 60 mins, I was going to send a logging message that reports the cache hit/miss ratio for the last hour. Then reset the hit/miss counters.

So then we can review the logs to see some of the stats.
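One way to approach this is a counting wrapper around IAppCache: a miss is simply "the factory ran". A sketch (`CountingCache` and its members are illustrative names, not LazyCache API):

```csharp
using System;
using System.Threading;
using LazyCache;

public class CountingCache
{
    private readonly IAppCache _inner = new CachingService();
    private long _requests, _misses;

    public T GetOrAdd<T>(string key, Func<T> factory)
    {
        Interlocked.Increment(ref _requests);
        return _inner.GetOrAdd(key, () =>
        {
            // The factory only runs on a miss, so count it here.
            Interlocked.Increment(ref _misses);
            return factory();
        });
    }

    // Call e.g. from an hourly timer: log the ratio, then reset the window.
    public (long Hits, long Misses) SnapshotAndReset()
    {
        long requests = Interlocked.Exchange(ref _requests, 0);
        long misses = Interlocked.Exchange(ref _misses, 0);
        return (requests - misses, misses);
    }
}
```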

Why return await?

It looks as if LazyCache's async methods return await.
While I do not consider myself an async code expert, my understanding is that this eliminates much of the utility of async code because all tasks are resolved before the return to the caller.
For example, if I have three mutually independent tasks that require 3 seconds of runtime each, if I await them individually as appears to be done with LazyCache, the total run time is 9 seconds, whereas if I Task.WaitAll the tasks, total run time is 3 seconds plus overhead.
Why not return without await?
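On the general async point: `return await` inside a single async method does not stop the *caller* from composing several independent calls concurrently. The caller can start all the tasks first and then await them together, as this standalone sketch shows:

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

public static class WhenAllExample
{
    static async Task<int> SlowAsync(int n)
    {
        await Task.Delay(1000); // stand-in for 1 second of independent work
        return n;
    }

    public static async Task Main()
    {
        var sw = Stopwatch.StartNew();

        // Start all three tasks before awaiting any of them: total time is
        // roughly 1 second, versus roughly 3 seconds for three back-to-back awaits.
        var results = await Task.WhenAll(SlowAsync(1), SlowAsync(2), SlowAsync(3));

        Console.WriteLine($"{results.Length} results in {sw.Elapsed.TotalSeconds:0.0}s");
    }
}
```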

LazyCache changes cached instance when changing retrieved instance

LazyCache persists changes to reference objects into the cache.

This, if not avoidable (which is very likely, as objects would have to be copyable, which is not a given), should at least be documented somewhere as a potential caveat.

Find an example application below:

using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using LazyCache;

namespace LazyCacheTest
{
    class ExampleClass
    {
        public string Prop { get; set; } = "Not set";
    }

    class Program
    {
        static void Main(string[] args)
        {
            IAppCache cache = new CachingService();

            var CacheResult = cache.GetOrAdd("StringOne", () => new ExampleClass());


            Console.WriteLine(CacheResult.Prop);
            CacheResult.Prop = "Changed";
            Console.WriteLine(CacheResult.Prop);

            CacheResult = cache.GetOrAdd("StringOne", () => new ExampleClass());

            Console.WriteLine(CacheResult.Prop);

            Console.WriteLine("Exiting");
        }
    }
}

Expected result:

"Not Set"
"Changed"
"Not Set"
"Exiting"

Result:

"Not Set"
"Changed"
"Changed"
"Exiting"
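Because the cache stores objects by reference, one way for callers to avoid leaking changes back into the cache is to clone on the way out. A sketch reusing the example class above (the `Clone` method is illustrative):

```csharp
using LazyCache;

class ExampleClass
{
    public string Prop { get; set; } = "Not set";
    public ExampleClass Clone() => new ExampleClass { Prop = Prop }; // shallow copy
}

class DefensiveCopyExample
{
    static void Main()
    {
        IAppCache cache = new CachingService();

        // Return a copy so mutations on the result don't alter the cached instance.
        var safeResult = cache.GetOrAdd("StringOne", () => new ExampleClass()).Clone();
        safeResult.Prop = "Changed"; // the cached instance still reads "Not set"
    }
}
```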

GetOrAdd / GetOrAddAsync<T> should return null if the item was ADDED.

Hi 👋

(edit: added repo and some more info)

Summary

Could the GetOrAddAsync<T> / GetOrAdd<T> methods return null if the item was ADDED? This is what Microsoft does with the ObjectCache.AddOrGetExisting(..) method, so this would then follow the same logic.

Details

Ok, so in this code for GetOrAddAsync (or GetOrAdd...) you internally call ObjectCache.AddOrGetExisting(..) under the hood to quietly add or get the item from the cache. Great!

Looking at the MS docs for OC.AOE it says (emphasis mine):

Return Value
Type: System.Runtime.Caching.CacheItem
If a cache entry with the same key exists, the specified cache entry; otherwise, null

which also is confirmed when I step through your code:

// CachingService.cs
var existingCacheItem = ObjectCache.AddOrGetExisting(key, newLazyCacheItem, policy);

existingCacheItem == null when the item was added.
existingCacheItem != null when the item already exists and was retrieved.

It's also a nice hint to the developer to explain if the item was actually ADDED or RETRIEVED (exists).
i.e.

  • If null, then added.
  • If not null, then retrieved.

So - what do you think?

Yes, this would be a major semver number increase/change.

--

How to repro:

  • Go to the test GetOrAddAsyncTaskAndThenGetTaskOfAnotherTypeReturnsNull
  • Add a breakpoint on line 311.
  • Add another breakpoint on CachingService.cs, line 87.
  • Debug...
  • After you step over line 87, existingCacheItem should equal null. This proves the item was added.
  • OPTIONAL: Confirm the item is in the cache by looking inside the ObjectCache using the Visual Studio debugger.

How to configure default expiry time in v2.0?

Like in prior version,

// Change the default cache duration from 20 minutes to 3 minutes
var cache = new CachingService() { DefaultCacheDuration= 60 * 3 };

Now that v2.0 favours DI, you don't create a new instance manually like the above anymore.

It's probably somewhere in the doc, but I couldn't find it yet.
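For reference, a sketch of how this is commonly done in 2.x. This assumes the `DefaultCachePolicy` property and `CacheDefaults` type; check the current API docs before relying on the exact names:

```csharp
using LazyCache;

class DefaultDurationExample
{
    static void Main()
    {
        // Direct construction still works in 2.x for non-DI apps.
        var cache = new CachingService
        {
            // Change the default cache duration from 20 minutes to 3 minutes;
            // note the 2.x setting is expressed in seconds, not minutes.
            DefaultCachePolicy = new CacheDefaults { DefaultCacheDurationSeconds = 60 * 3 }
        };
    }
}
```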

Absolute expiration of 7 seconds or less doesn't work

Hello,
I need absolute expiration of 7 seconds or less, but it does not expire below 15 seconds for me. It looks like there is a bug in MemoryCache; I think the same bug is causing this.

For MemoryCache, folks are using a workaround that overrides that property through reflection (please see this thread: https://stackoverflow.com/questions/12630168/memorycache-absoluteexpiration-acting-strange).

What would be the best way to incorporate that into your solution? Please advise.

CacheObject not getting overwritten.

I have a simple API that stores a short JSON string with a GUID as a key, for quick counters on a page so that it follows the user along on different devices.

The problem I have is that despite the code updating the underlying database correctly, the cache-object isn't updated.

On the GET method I have the simple GetOrAddAsync as per the sample:

Func<Task<string>> cacheableAsyncFunc = () => GetByGuid(key);
string result = await cache.GetOrAddAsync(key.ToString(), cacheableAsyncFunc);

However, that isn't enough: clearly, when a POST/PUT comes along and updates/sets the value, it should update the cache as well.

At first I had only this line in the code;

cache.Add(inc.key.ToString(), inc.keyValue);

It seemed to work on a small scale when I just used Postman and hit these in sequence, quite slowly and deliberately. However, there seems to be a race condition or something: if a GET request hits shortly after the POST/PUT has happened, the old cache object still prevails despite the update. So I tried to forcibly remove it, and now have this as the very first line in the POST/PUT method:

cache.Remove(inc.key.ToString());

That doesn't help, however. If I spam Postman on the GET method while reloading a page that triggers a POST/PUT request, the GET method never returns the new value set in the POST/PUT method.

However, this is rarely reproducible on my own machine/dev environment, but much more reliably reproduced in an Azure Web App. No errors are logged; the cache is just not updated.

Full Code

        [HttpGet]
        public async Task<HttpResponseMessage> Get(Guid key)
        {
            Func<Task<string>> cacheableAsyncFunc = () => GetByGuid(key);
            string result = await cache.GetOrAddAsync(key.ToString(), cacheableAsyncFunc);

            if (result != null)
                return new HttpResponseMessage { StatusCode = HttpStatusCode.OK, Content = new StringContent(result, Encoding.UTF8, "application/json") };
            else
                return new HttpResponseMessage { StatusCode = HttpStatusCode.OK };
        }
        private async Task<string> GetByGuid(Guid key)
        {
            using (var context = new CounterAPI.Models.CounterModelContainer(WebApiApplication.CounterDB))
            {
                Models.Counter cnt = await context.Counters.FirstOrDefaultAsync(c => c.key == key);
                string result = null;
                if (cnt != null)
                {
                    result = Regex.Unescape(cnt.keyValue);
                    return result;
                }
                return result;
            }
        }
        [HttpPost]
        public async Task<HttpResponseMessage> Post(Models.Counter inc)
        {
            if (!ModelState.IsValid || inc == null || inc.key == null || inc.keyValue == null) return new HttpResponseMessage { StatusCode = HttpStatusCode.BadRequest, ReasonPhrase = "Missing values" };
            cache.Remove(inc.key.ToString());
            using (var context = new CounterAPI.Models.CounterModelContainer(WebApiApplication.CounterDB))
            {
                Models.Counter cnt = await context.Counters.FirstOrDefaultAsync(c => c.key == inc.key);
                if(cnt == null) { context.Counters.AddOrUpdate(inc); }
                else { cnt.keyValue = inc.keyValue; }
                try
                {
                    await context.SaveChangesAsync();
                    cache.Add(inc.key.ToString(), inc.keyValue);
                }
#pragma warning disable CS0168 // Variable is declared but never used
                catch (Exception ex)
#pragma warning restore CS0168 // Variable is declared but never used
                {
                    return new HttpResponseMessage { StatusCode = HttpStatusCode.InternalServerError, ReasonPhrase = "Unable to add content" };
                }
                return new HttpResponseMessage { StatusCode = HttpStatusCode.Created };
            }
        }

After a POST, there remains a difference between ObjectCache and SQL DB entry;

(screenshot from the original issue omitted)

Maybe a problem...

This library was easy to implement in the repository layer of my web API. However, when I moved it to production I noticed that my CPU would gradually rise and never drop, all the way up to 70-75 percent, and stay there. As soon as I cleared the cache on the site, the CPU would go back to 0-5 percent and slowly begin to creep up again. I can manage it using the recycler in IIS, but I need to take a deeper look under the hood to see what the creep is. If you have a suggestion or idea of what is happening, that would be great.

WebApi and LazyCache issue

Hi @alastairtree ,

In my webapi I'm calling LazyCache in order to cache some consumed webservices.

At the moment I see a cache, but the webmethods are not called.

Here is my code:

        private Func<int, OBJDON> SWEB_CryptoClientCall = (x) =>
        {
            OBJDON resultDom = new OBJDON();

            using (PyAuth SERVIZIO = new PyAuth())
            {
                resultDom = SERVIZIO.PY_GET_DATA("", 0, 1, x, 0);
            }

            return resultDom;
        };

The call:

            OBJDON resultDom = new OBJDON();

            resultDom = cache.GetOrAdd($"SWEB_{cryptoString}", () => SWEB_CryptoClientCall((int)CUR_UTE), DateTimeOffset.Now.AddDays(1));

resultDom is always null and the function is not fired. What am I doing wrong?

RemovedCallback is not working correctly?

Hi,
Seems like the RemovedCallback is not working accurately: it is called at random intervals. For example, I want a callback when the item is removed after 5 seconds, but the callback sometimes happens after 11 seconds, and sometimes after 20, 30, etc.

Am I missing anything?

Ta

    class Program
    {
        static MemoryCache memoryCache = new MemoryCache("Program"); // Or???
        static CachingService cache = new CachingService(memoryCache); // cache = new LazyCache.CachingService() { DefaultCacheDuration = 5 }; ???

        public static void Main(string[] args)
        {
            var policy = new CacheItemPolicy()
            {
                Priority = CacheItemPriority.NotRemovable,
                AbsoluteExpiration = DateTime.UtcNow.AddSeconds(5),
                RemovedCallback = OnCacheExpired,
            };

            cache.Add("1", "One", policy);
            Console.WriteLine("Adding to cache: " + DateTime.Now);

            //Console.WriteLine("Press any key to continue . . . ");
            Console.ReadKey(true);
        }

        private static void OnCacheExpired(CacheEntryRemovedArguments arguments)
        {
            var productsBeingRemovedFromCache = (String)arguments.CacheItem.Value;
            Console.WriteLine("Removing item from cache: " + DateTime.Now);
            Console.WriteLine(cache.ObjectCache.GetCount());
        }

        private static void OnUpdateCallback(CacheEntryUpdateArguments arguments)
        {
            Console.WriteLine("Before removing => " + cache.ObjectCache.GetCount());
        }
    }

Can LazyCache 2.0.0-beta03 be used in .NET Framework 4.6+?

Exception of type 'Unity.ObjectBuilder.BuildPlan.DynamicMethod.Creation.DynamicMethodConstructorStrategy+InvalidRegistrationException' was thrown.

container.RegisterType<IMemoryCache, Microsoft.Extensions.Caching.Memory.MemoryCache>(new ContainerControlledLifetimeManager());
container.RegisterType<ICacheProvider, LazyCache.Providers.MemoryCacheProvider>(new ContainerControlledLifetimeManager(), new InjectionConstructor(typeof(IMemoryCache)));
container.RegisterType<IAppCache, CachingService>(new ContainerControlledLifetimeManager(), new InjectionConstructor(typeof(Lazy)));

Contention on multiple readers & writters

There are two levels of contention:

  1. Readers being blocked by writers
  2. Application-level locking

Readers being blocked by writers

Reader -> a client that hits when reading from the cache.
Writer -> a client that misses when reading from the cache and then sets the value.

With multiple readers and writers (threads) there is a lot of contention here:
https://github.com/alastairtree/LazyCache/blob/feat/netcore2/LazyCache/CachingService.cs#L95

Every reader and writer takes the lock even if the item is already in the cache, and there is no need to lock for just reading.

Adding an unlocked read attempt at the top would remove a lot of contention when many threads access the same items (which is the point of caching).

Application level

As locking happens at the IAppCache level, all readers and writers of the whole app are blocked at the same time.

That means if one writer takes 5 seconds to read something from backing storage A, there could be 10 readers waiting to read something from the cache (which was stored some time ago from storage B).

Changing the locking mechanism from "per application" to:

A) Per type -> table locking
B) Per type + key -> row locking

For B) there would need to be a mechanism to dispose of unused locking semaphores.

A) can be achieved by creating one cache service per type being cached, but as it stands, the locking happens at the highest possible level.
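A sketch of the "row locking" idea (option B): one gate per cache key instead of one global lock. As the issue notes, a production version would also need to evict unused semaphores; this sketch deliberately omits that:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public class KeyedLocks
{
    private readonly ConcurrentDictionary<string, SemaphoreSlim> _gates =
        new ConcurrentDictionary<string, SemaphoreSlim>();

    // Acquire the gate for one key; dispose the result to release it.
    public IDisposable Acquire(string key)
    {
        var gate = _gates.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
        gate.Wait();
        return new Releaser(gate);
    }

    private sealed class Releaser : IDisposable
    {
        private readonly SemaphoreSlim _gate;
        public Releaser(SemaphoreSlim gate) => _gate = gate;
        public void Dispose() => _gate.Release();
    }
}
```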

create instance with Unity in asp.net mvc controller error

public static void RegisterTypes(IUnityContainer container)
        {
            container.RegisterType<IAppCache>(new HierarchicalLifetimeManager(), null);
        }

public class HomeController : Controller
    {
        private readonly IAppCache cache;
        public HomeController(IAppCache cache)
        {
            this.cache = cache;
        }
        public ActionResult Index()
        {
            var result = this.cache.GetOrAdd("test", () => {
                return DateTime.Now.ToString();
            });
            ViewBag.Time = result;

            return View();
        }
}

error:
An error occurred when trying to create a controller of type 'WebApp.Controllers.HomeController'. Make sure that the controller has a parameterless public constructor.

Gotcha when using CacheItemPolicy.RemovedCallback and relying on CacheEntryRemovedArguments

I have a scenario where I create a cache key with a custom CacheItemPolicy, and I specify the RemovedCallback.

This delegate provides a parameter for the event arguments, which contains the cached item that has just been removed.

My scenario is that I'm caching a 3rd party system's login session ID value, and when that cache expires, I have to "clean up" by invoking a web service call to log out that session ID on their end.

The gotcha is that once I'm in the RemovedCallback delegate, I am no longer leveraging the LazyCache API to deal with the cache item. That means I don't get the nicety of the check for Lazy<T>, etc.

This drove me crazy for the last 30 mins, as I could see the value of the CacheEntryRemovedArguments param in the debugger, but it wasn't just my simple little int sessionID that I cached. It had been wrapped in a Lazy<int>.

It may be a good idea to provide a sample usage of the RemovedCallback that actually uses the args parameter, instead of just calling a logger, for completeness.
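A sketch of a RemovedCallback that actually uses the args parameter, including the Lazy unwrap the gotcha above describes (written against System.Runtime.Caching; `LogOutRemoteSession` is a hypothetical cleanup call):

```csharp
using System;
using System.Runtime.Caching;

public static class RemovedCallbackExample
{
    // Hypothetical cleanup call to the 3rd-party system.
    static void LogOutRemoteSession(int sessionId) =>
        Console.WriteLine($"Logging out session {sessionId}");

    // Inside RemovedCallback you are below the LazyCache API, so the cached
    // value arrives as the Lazy<T> wrapper that LazyCache stored.
    public static void OnSessionRemoved(CacheEntryRemovedArguments args)
    {
        int sessionId = args.CacheItem.Value is Lazy<int> lazy
            ? lazy.Value                    // unwrap LazyCache's wrapper
            : (int)args.CacheItem.Value;    // plain value (e.g. stored via Add)

        LogOutRemoteSession(sessionId);
    }
}
```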

Question How to update an item in a cache where a value is a list of object

How do I go about updating a particular object within a cached object with a given key?

Here is an example

cache.GetOrAdd("latest-products", () => oProductService.GetProducts(), policy);

I am populating a list of product to "latest-products" and want to update a particular product in the given list and then update the cache.

oProductService.GetProducts() returns a list of Products

public class Products
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Quantity { get; set; }
}
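One way to approach this, sketched with the names from the question (`Products` and the product loader come from the issue; `UpdateQuantity` is illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using LazyCache;

public class Products
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Quantity { get; set; }
}

public static class UpdateCachedListExample
{
    public static void UpdateQuantity(IAppCache cache, Guid productId, int newQuantity,
        Func<List<Products>> loadProducts)
    {
        var products = cache.GetOrAdd("latest-products", loadProducts);

        var target = products.FirstOrDefault(p => p.Id == productId);
        if (target != null)
        {
            // Because the cache stores the list by reference, mutating the item
            // is immediately visible to other readers; re-adding under the same
            // key just refreshes the entry explicitly.
            target.Quantity = newQuantity;
            cache.Add("latest-products", products);
        }
    }
}
```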

Proper IDisposable support in CachingService

The CachingService class silently creates an ICacheProvider instance in its constructor, and that provider implements IDisposable.

I would expect CachingService to be disposable too (i.e. implement IDisposable) in order to dispose of its internal resources properly.

Auto-refresh feature for LazyCache

First of all, this is an awesome library!
We tried to use your implementation for thread-safe cache usage while also being able to auto-refresh our cached objects. We wanted to define an absolute expiration time for each cached object, execute the delegate to create a new object before the cached one is removed, and replace the expired object with the new one using the UpdateCallback delegate. That way the cached item would be re-evaluated after the expiration time. We didn't succeed because it is impossible to define an UpdateCallback in LazyCache. Looking online we found a way to do just that, and we thought maybe you could include that feature in the LazyCache library.
Here’s the link:
https://www.programmingmusings.com/index.php/2016/09/27/auto-refresh-thread-safe-cache-in-net/

Thank you very much and have a nice day.

Generic Type Keys

Hi @alastairtree ,

I have started using your LazyCache library and it works great.

If you can support Generic Type Keys, it would be great. Right now only string type is supported as keys.

Usage:

I have an object which identifies a unique request made to the database server. I want to cache the response of the database query, and for that I want the key to be that object.

netcore GetOrAdd doesn't work with SizeLimit set

Hi alastairtree,

We have tried setting the SizeLimit on the cache using the .netcore version of IAppCache, and it doesn't set the size on the entry before GetOrCreate is called on the provider.

We think this is because of the use of the Lazy in GetOrAdd in the CachingService.cs

Here is some test code which fails for GetOrAdd (but works for Add):

IAppCache cacheWithSizeLimit = new CachingService(
                new MemoryCacheProvider(new MemoryCache(new MemoryCacheOptions { SizeLimit = 5 })));

            cacheWithSizeLimit.Add("key1", "item1", new MemoryCacheEntryOptions().SetSize(1));

            cacheWithSizeLimit.GetOrAdd("key2", () => "item2", new MemoryCacheEntryOptions().SetSize(1));

When we test the calls to MemoryCacheProvider directly, it fails only when a Lazy is passed to GetOrCreate; we believe that because it's using a Lazy, the callback that calls SetOptions on the entry is never executed.

The example below works for item1 and item2 but not item3.

var memoryCacheProvider =
    new MemoryCacheProvider(new MemoryCache(new MemoryCacheOptions { SizeLimit = 5 }));

memoryCacheProvider.Set("key1", "item1", new MemoryCacheEntryOptions().SetSize(1));

memoryCacheProvider.GetOrCreate("key2", entry =>
{
    entry.SetOptions(new MemoryCacheEntryOptions().SetSize(1));
    return "item2";
});

memoryCacheProvider.GetOrCreate("key3", entry =>
    new Lazy<string>(() =>
    {
        entry.SetOptions(new MemoryCacheEntryOptions().SetSize(1));
        return "item3";
    }));
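The deferred-callback behaviour described above can be shown with plain BCL code, no cache involved. This is a minimal sketch (the `optionsSet` flag stands in for "SetOptions was called on the entry"):

```csharp
using System;

// Why wrapping the factory in a Lazy defers the SetOptions callback: the
// body of a Lazy does not run until someone reads .Value, so any entry
// configuration done inside it cannot happen at GetOrCreate time.
bool optionsSet = false;  // stands in for "SetOptions was called"
var lazy = new Lazy<string>(() => { optionsSet = true; return "item3"; });

Console.WriteLine(optionsSet); // False: factory (and "SetOptions") deferred
var value = lazy.Value;        // first read finally runs the factory
Console.WriteLine(optionsSet); // True
```

Because MemoryCache applies the entry's size when the cache entry is committed, configuration that only happens on first read of the Lazy arrives too late for a SizeLimit check.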

Is this a bad idea?

public static LazyCache.IAppCache SearchCache => new CachingService();

Under the hood I understand it's using the shared default cache, so it's not newing up a cache / incurring unnecessary overhead, I hope.
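One thing worth noting about that snippet: `=> new CachingService()` is an expression-bodied property, so its body runs on every access. A small sketch with a counting factory (a hypothetical stand-in, not LazyCache code) shows the effect:

```csharp
using System;

// `public static IAppCache SearchCache => new CachingService();` evaluates
// its body on EVERY get. Modeled here with a counting factory delegate:
int constructions = 0;
Func<object> searchCachePerAccess = () => { constructions++; return new object(); };

var a = searchCachePerAccess();
var b = searchCachePerAccess();
Console.WriteLine(constructions); // 2: a fresh wrapper is built per access
```

Since CachingService defaults to a shared underlying cache (per the README), the cached entries are still shared across those wrappers, but the wrapper allocation repeats. A get-only property with an initializer, `public static IAppCache SearchCache { get; } = new CachingService();`, runs the constructor once.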

Please release a signed version of LazyCache

Hi @alastairtree

Tnx for a great library. Really enjoy working with it and it has saved me a lot of time and effort.

Would you be so kind as to release a signed version of the NuGet package, so I can use LazyCache in my signed desktop applications.

tnx

0.7.1.44 + Azure: "ValueFactory attempted to access the Value property of this instance" upon resuming from hibernation

I'm getting the following exception every time Azure puts my site to sleep due to no requests, and the site is then awakened by a new request.

Any thoughts?

Microsoft.LightSwitch.DataServiceOperationException: The type initializer for 'LightSwitchApplication.DomainClass' threw an exception. ---> System.TypeInitializationException: The type initializer for 'LightSwitchApplication.DomainClass' threw an exception. ---> System.InvalidOperationException: ValueFactory attempted to access the Value property of this instance.

at System.Lazy`1.CreateValue()
at System.Lazy`1.LazyInitValue()
at System.Lazy`1.get_Value()
at LazyCache.CachingService.UnwrapLazy[T](Object item)
at LazyCache.CachingService.GetOrAdd[T](String key, Func`1 addItemFactory, CacheItemPolicy policy)
at LazyCache.CachingService.GetOrAdd[T](String key, Func`1 addItemFactory, DateTimeOffset expires)
at LightSwitchApplication.Code.Cached.GetDomainClasses()

The get method is super simple:

public static DomainClass[] GetDomainClasses()
{
    Func<DomainClass[]> populate = () =>
        Helpers.GetLsTables().DomainClasses.OrderBy(x => x.EvalOrder).Execute().ToArray();

    return CachingService.GetOrAdd(DomainClassesKey, populate, DateTimeOffset.Now.AddDays(7));
}
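That "ValueFactory attempted to access the Value property of this instance" message is what Lazy throws when its own value factory re-enters `.Value` on the same instance. In this issue it presumably happens indirectly: a type initializer runs inside the cached delegate and calls back into the same cached Lazy while it is still initialising. A minimal self-contained reproduction (hypothetical, not the exact code path here):

```csharp
using System;

// The Lazy's value factory touches .Value on the same instance, which the
// default LazyThreadSafetyMode.ExecutionAndPublication detects as recursive
// initialisation and rejects with an InvalidOperationException.
Lazy<int> recursive = null;
recursive = new Lazy<int>(() => recursive.Value + 1); // factory re-enters itself

bool threw = false;
try
{
    var _ = recursive.Value;
}
catch (InvalidOperationException)
{
    threw = true; // Lazy detects the recursion and throws
}
Console.WriteLine(threw); // True
```

If the type initializer for `LightSwitchApplication.DomainClass` ends up calling `GetDomainClasses()` again (directly or via a static field), the cached Lazy would see exactly this kind of re-entrancy.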

Handling long running cache population functions

Great library! I'm looking to solve this use case:

  1. There is an item in the cache that has expired
  2. This cache item is now requested again but it takes 30 seconds to retrieve data from the db.
  3. App just spins on all requests needing this cache item as it waits for the update to complete.

Rather than have the thread lock and block the app, I'd rather return the "expired" item, and then, once the new item has been retrieved and is available, return that instead. Essentially the only time there would be a true block where the user waits would be on the initial cache load.

Is there a good way to handle this? Can the RemovedCallback be utilized for something like this?

Thanks for any help or direction you can provide.
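What's being described is often called stale-while-revalidate. It isn't a built-in LazyCache feature, but a minimal sketch of the pattern with plain BCL primitives might look like this (the soft TTL, `Load`, and all names are hypothetical):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Stale-while-revalidate sketch: after a soft TTL, serve the stale value
// immediately and refresh in the background, so only the very first load
// ever blocks callers.
int loads = 0;
string Load() { Interlocked.Increment(ref loads); return $"data v{loads}"; }

var softTtl = TimeSpan.FromMilliseconds(50);
string current = Load();                 // initial load blocks, exactly once
var refreshedAt = DateTimeOffset.UtcNow;
Task refresh = Task.CompletedTask;
object gate = new object();

string Get()
{
    if (DateTimeOffset.UtcNow - refreshedAt > softTtl)
    {
        lock (gate)
        {
            if (refresh.IsCompleted)     // at most one refresh in flight
                refresh = Task.Run(() =>
                {
                    var fresh = Load();  // slow rebuild runs off the hot path
                    current = fresh;
                    refreshedAt = DateTimeOffset.UtcNow;
                });
        }
    }
    return current;                      // stale or fresh, never blocks
}

Console.WriteLine(Get());                // "data v1" immediately
Thread.Sleep(100);                       // let the soft TTL lapse
Get();                                   // still returns v1, kicks off refresh
refresh.Wait();
Console.WriteLine(Get());                // now "data v2"
```

In LazyCache terms you could approximate this by caching the wrapper object with a long hard expiry and letting the wrapper manage the soft TTL itself; the RemovedCallback route is harder, because by the time it fires the old value has already been evicted.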

GetOrAdd/GetOrAddAsync creates policy/timespan every time?

Hi, the problem with the built-in MemoryCache is that its GetOrAdd() doesn't have an overload accepting a Func<CacheItemPolicy>, so a policy object is created on each call, even when the result is already cached, effectively discarding that object most of the time.

Is this the case for LazyCache as well? Can it be addressed by the library, or is this a built-in problem that can only be solved by the framework itself?

The proposed API would look like this:

Task<T> GetOrAddAsync<T>(
    string key,
    Func<Task<T>> addItemFactory,       // called once
    Func<CacheItemPolicy> policyFactory // called once
);
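The core of the proposal is easy to sketch without any cache library: take a `Func<...>` for the policy so it is only materialised on a miss. This is a hypothetical illustration (a plain dictionary standing in for the cache, a string standing in for `CacheItemPolicy`), not the LazyCache API:

```csharp
using System;
using System.Collections.Generic;

// Deferred policy creation: the policy factory only runs on a cache miss,
// so repeated hits allocate no policy objects at all.
var store = new Dictionary<string, int>();
int policiesCreated = 0;

int GetOrAdd(string key, Func<int> itemFactory, Func<string> policyFactory)
{
    if (store.TryGetValue(key, out var hit))
        return hit;                   // hit: policyFactory never invoked
    var policy = policyFactory();     // miss: create the policy exactly once
    var item = itemFactory();
    store[key] = item;                // a real impl would store `item` with `policy`
    return item;
}

GetOrAdd("k", () => 42, () => { policiesCreated++; return "sliding-5min"; });
GetOrAdd("k", () => 42, () => { policiesCreated++; return "sliding-5min"; });
Console.WriteLine(policiesCreated);   // 1: the second call was a hit
```

The same shape would apply to the async overload in the proposal above.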

.NETCore 2.0 Support?

I saw the previous post but I couldn't find if eventually there is a code branch or -pre nuget regarding .NET Core 2.0 support, any update?

Thank you for the library, works amazingly nice!

Multiple types with same key are returned as null from cache - last in wins?

First of all, great library - this one is really useful :-)

Currently it's possible to store values of different types in the cache but you need to be careful to generate unique keys. It would be a nice feature if the library could automatically mangle the internal key with the value type. In other words it would be desirable to be able to write...


class Foo { }
class Bar { }

static void Main(string[] args)
{
    IAppCache cache = new CachingService();
    var sharedKey = "some shared key";
    var foo = cache.GetOrAdd(sharedKey, () => FetchFoo(sharedKey));
    var bar = cache.GetOrAdd(sharedKey, () => FetchBar(sharedKey));
}


and have both 'foo' and 'bar' sensibly returned. In the current prerelease, bar is returned as null in this case, presumably because the cached 'foo' is returned and the cast to Bar then fails.
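Until something like this is built in, the mangling can be done at the call site with a small helper. A hypothetical sketch (not part of LazyCache): prefix the caller's key with the value type's full name so Foo and Bar under the same logical key occupy distinct cache slots.

```csharp
using System;

// Hypothetical helper: derive a per-type cache key from a shared logical key.
string TypedKey<T>(string key) => $"{typeof(T).FullName}:{key}";

Console.WriteLine(TypedKey<int>("some shared key"));    // System.Int32:some shared key
Console.WriteLine(TypedKey<string>("some shared key")); // System.String:some shared key
```

Usage would then be `cache.GetOrAdd(TypedKey<Foo>(sharedKey), () => FetchFoo(sharedKey))`, and the Bar entry would no longer collide with it.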

Upon expiration, pull data on next Get

We've started using this in a project, and I am wondering: after an item expires, is its corresponding Func called again to repopulate it? It would be nice if the Func was kept around to repopulate the cache after expiration.

We are creating a generic cache, where I can key by typeof(T).FullName, with a func to call out to a web api and get the data. It works, until it expires.

Once it expires the web api is never hit again and it's gone forever. I'm guessing I probably just need to keep the func around and use GetOrAddAsync instead of GetAsync.
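The "keep the func around" approach can be sketched with BCL types only. This hypothetical generic cache (a ConcurrentDictionary standing in for the real expiring cache; names are illustrative) reads through a GetOrAdd-style call on every access, so a missing or evicted entry is rebuilt by the same factory:

```csharp
using System;
using System.Collections.Concurrent;

// Read-through by type: key on typeof(T).FullName and always go via
// GetOrAdd, so an expired/evicted entry is repopulated transparently.
var store = new ConcurrentDictionary<string, object>();

T GetOrAdd<T>(Func<T> factory) =>
    (T)store.GetOrAdd(typeof(T).FullName, _ => factory());

int calls = 0;
int FetchFromApi() { calls++; return 42; }  // stands in for the web API call

Console.WriteLine(GetOrAdd(() => FetchFromApi())); // 42: factory runs
Console.WriteLine(GetOrAdd(() => FetchFromApi())); // 42: cached, factory skipped
store.Clear();                                     // simulate expiry/eviction
Console.WriteLine(GetOrAdd(() => FetchFromApi())); // 42: factory runs again
Console.WriteLine(calls);                          // 2
```

With LazyCache the equivalent is to route every read through GetOrAdd/GetOrAddAsync with the factory attached, rather than Add once and Get/GetAsync afterwards, which is exactly the guess at the end of the issue.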
