Comments (12)

alastairtree commented on May 30, 2024

LazyCache does not accept a delegate for the policy. Can you give a bit more info on why you might want to pass a policy factory rather than just creating a policy that might simply be ignored if the cache is hit? Is this related to the use case in issue #5?

from lazycache.

abatishchev commented on May 30, 2024

This just slows down reads from the cache: before the read returns, a new policy is created and then ignored. The more hits a key gets, the more objects are created and thrown away.

This might make sense for keys with expiration: once a key has expired, the freshly created policy will actually be used. But for keys with no expiration it's pure waste.

alastairtree commented on May 30, 2024

Do you have an actual bug/issue caused by the allocations? Being a garbage
collected language you would have to be doing some pretty serious
performance optimisations for those to be worth the effort to minimise. As
we are using MemoryCache under the hood LazyCache would have the same
behaviour.

abatishchev commented on May 30, 2024

The cost is small if it's just GetOrAdd(key, valueFactory, new CacheItemPolicy()). But it can also be GetOrAdd(key, valueFactory, CreateCacheItemPolicy()), which can take arbitrarily long: it could even go to a database or make an HTTP call. And it will do so on every call; nothing prevents it from doing that, and nothing in the signature indicates that it happens.

I realize this is yet another serious design flaw of MemoryCache itself. The main one is AddOrGetExisting() accepting an object rather than a Func&lt;object&gt;, which LazyCache already mitigates perfectly.

So by mitigating one more, by adding a Func&lt;CacheItemPolicy&gt; overload, you would add even more value to LazyCache!
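
For what it's worth, such an overload could be sketched roughly like this. This is a hypothetical extension method, not LazyCache's actual API, and two concurrent misses can still build the policy twice; it only shows how the factory could be deferred to the miss path:

```csharp
using System;
using System.Runtime.Caching;

public static class LazyPolicyExtensions
{
    // Hypothetical sketch of the proposed overload; not part of LazyCache.
    // The policy factory is only invoked on a cache miss.
    public static T GetOrAdd<T>(this ObjectCache cache, string key,
        Func<T> valueFactory, Func<CacheItemPolicy> policyFactory)
    {
        // Fast path: a cache hit never touches the policy factory.
        if (cache.Get(key) is Lazy<T> hit)
            return hit.Value;

        // Miss: only now pay the cost of building the policy.
        var newLazy = new Lazy<T>(valueFactory);
        var existing = (Lazy<T>)cache.AddOrGetExisting(key, newLazy, policyFactory());
        return (existing ?? newLazy).Value;
    }
}
```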

alastairtree commented on May 30, 2024

Sorry if this seems painful, but I still don't think I understand the actual use case: why would generating the policy be expensive? For example, why would it require a db/http call?

abatishchev commented on May 30, 2024

It doesn't require an external call, but it might make one, because nothing prevents it. In my case it's relatively simple, mid-size logic that determines the policy based on the key and typeof(T). But at the same time it's not as cheap and straightforward as new CacheItemPolicy { ... }.

Fixing one major design flaw of the underlying framework after another would be a huge win for LazyCache over raw MemoryCache, effectively making the latter unusable without the former. Like MemoryCache on steroids :)

alastairtree commented on May 30, 2024

I just don't believe that avoiding the unnecessary allocation of a cache policy is a worthwhile optimisation without any evidence that replacing it with a lambda makes a measurable improvement. In addition, the lambda is not "free", and there are some pitfalls with lambdas and closures that suggest caution in their use does have some merit - see https://blog.jetbrains.com/dotnet/2014/07/24/unusual-ways-of-boosting-up-app-performance-lambdas-and-linqs/ for some more info. I am going to close this issue for now, but I'm happy to reopen it if you can produce a test case that demonstrates the benefits of your proposal. Thanks for the feedback.

abatishchev commented on May 30, 2024

Oh, come on! Calling a method on every read is obviously more expensive than a single lambda allocation. You can use a delegate if you believe that a lambda is expensive.

Here's a simple test:

cache.GetOrAdd(key, () => value, CreatePolicy());

CacheItemPolicy CreatePolicy()
{
  // Runs on every call, hit or miss, because the argument
  // is evaluated before GetOrAdd is ever entered.
  Thread.Sleep(1000);
  return new CacheItemPolicy();
}

And again, there's not much of a problem with new CacheItemPolicy() passed to the method, compared to calling a method that returns a CacheItemPolicy.

alastairtree commented on May 30, 2024

I'm really struggling to understand in what realistic use case someone would have a policy factory method sufficiently complex to warrant this optimisation. Without a use case it feels a bit like premature optimisation.

brownbl1 commented on May 30, 2024

Hi! First, great little library. I've loved using it in our project and being lazy!

I think I do actually have a real use case for this functionality. I've created a ChangeMonitor based on the solution proposed here: http://stackoverflow.com/a/22388943/810667

Essentially, every time the change monitor is created, we subscribe to an event handler that can be fired to manually expire the cache. It seems to me that it would be better to not create the ChangeMonitor instance if we're not creating a new cache entry, and thus saving cycles on the expiration side when the cache is manually expired.
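
The pattern from that Stack Overflow answer can be sketched roughly like this (a minimal, hedged version based on the linked post; names are illustrative and may differ from the actual answer):

```csharp
using System;
using System.Runtime.Caching;

// Sketch of the signalled-change-monitor pattern: a static event acts
// as the signal, and every monitor created subscribes to it, which is
// the per-creation cost described above.
public class SignaledChangeMonitor : ChangeMonitor
{
    private static event EventHandler Signaled;

    private readonly string _uniqueId = Guid.NewGuid().ToString();
    public override string UniqueId => _uniqueId;

    public SignaledChangeMonitor()
    {
        Signaled += OnSignalRaised;   // subscription made on every creation
        InitializationComplete();     // required by the ChangeMonitor contract
    }

    // Fired manually to expire every cache entry holding one of these monitors.
    public static void Signal() => Signaled?.Invoke(null, EventArgs.Empty);

    private void OnSignalRaised(object sender, EventArgs e) => OnChanged(null);

    protected override void Dispose(bool disposing) => Signaled -= OnSignalRaised;
}
```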

Thoughts?

Also, thanks for the link on lambdas and performance! Very instructive.

alastairtree commented on May 30, 2024

Thanks @itslittlejohn, glad it helped. Interestingly, I have been playing with a "delete everything" type feature. Have a look at this class on a separate branch: https://github.com/alastairtree/LazyCache/blob/e270bd33f59e9c94bdaf06d644938521f06e82a9/LazyCache/InMemoryCacheManager.cs

I was thinking of implementing and comparing a "dispose the MemoryCache and create a new one" solution with a change-monitor based solution like yours, and seeing which had the nicer experience and better performance before merging into master. Change monitors have the advantage that they could also enable a regions feature.
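
The "dispose and recreate" idea could be sketched as follows (CacheHandle is a hypothetical wrapper for illustration, not code from the linked branch):

```csharp
using System.Runtime.Caching;
using System.Threading;

// Hypothetical wrapper: "delete everything" by swapping in a fresh
// MemoryCache and disposing the old one, which evicts all its entries.
public class CacheHandle
{
    private MemoryCache cache = new MemoryCache("LazyCache");

    public ObjectCache Cache => cache;

    public void ClearAll()
    {
        // Atomically swap in a new cache, then dispose the old instance.
        var old = Interlocked.Exchange(ref cache, new MemoryCache("LazyCache"));
        old.Dispose();
    }
}
```

One caveat with this approach: readers that grabbed a reference to the old instance just before the swap will briefly keep reading stale entries until they re-fetch the Cache property.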

brownbl1 commented on May 30, 2024

That's great! It looks promising, and I think it would be a great optional addition to the library. It seems to me that you could have both.

I propose adding IAppCache.Clear(string regionName = null)

If no region is specified, using the dispose approach would probably make the most sense in terms of performance, although it seems that calling Dispose does invoke the CacheEntryRemovedCallback delegate if one is set. Not an issue, but a consideration?

If a region is specified, you'd take an approach similar to what I've implemented, as outlined in the SO post above. When adding an item to the cache, you'd only create a change monitor for it if a region is specified (which would be an additional optional parameter on a few methods).
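
A region-aware change monitor along those lines might look like this (a hypothetical sketch of the approach described above; any eventual implementation could differ):

```csharp
using System;
using System.Runtime.Caching;

// Hypothetical region-aware monitor: Signal("users") only evicts cache
// entries whose monitor was created for the "users" region, while
// Signal(null) evicts entries in every region.
public class RegionChangeMonitor : ChangeMonitor
{
    private static event Action<string> Signaled;

    private readonly string _region;
    private readonly string _uniqueId = Guid.NewGuid().ToString();
    public override string UniqueId => _uniqueId;

    public RegionChangeMonitor(string region)
    {
        _region = region;
        Signaled += OnSignal;
        InitializationComplete();   // required by the ChangeMonitor contract
    }

    public static void Signal(string region = null) => Signaled?.Invoke(region);

    private void OnSignal(string region)
    {
        if (region == null || region == _region)
            OnChanged(null);        // marks the cache entry as changed -> eviction
    }

    protected override void Dispose(bool disposing) => Signaled -= OnSignal;
}
```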

Region functionality is something that we need in our application, so if you're open to contributions, I could submit what I have once I get something working. I'd prefer to have it integrated, or at least in the interface.

One caveat is that there seem to be some issues with MemoryCache.Default and Dispose. I guess you'd just need to be careful about not mixing MemoryCache.Default and IAppCache...

If something like this were implemented internally, then I agree with you that I see little reason to convert the CacheItemPolicy param to a delegate.
