chriso / lru
A simple LRU cache
License: MIT License
This way one can write:
var value = cache.get(key) || cache.set(key, read_from_database(key));
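The one-liner works because set returns the value it stores. A minimal sketch of that contract (the cache internals and readFromDatabase loader here are hypothetical stand-ins):

```javascript
// Minimal get-or-set sketch; assumes set(key, value) returns value,
// which is what makes the one-liner idiom work.
var cache = {
  store: {},
  get: function (key) { return this.store[key]; },
  set: function (key, value) {
    this.store[key] = value;
    return value; // returning the value enables `get(k) || set(k, load(k))`
  }
};

function readFromDatabase(key) {
  return 'value-for-' + key; // hypothetical loader
}

var value = cache.get('a') || cache.set('a', readFromDatabase('a'));
```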
The only documentation is the example; the events are not currently documented.
It'd be quite nice to be able to emit the evict event on all entries when clearing the cache.
This would have the benefit of being able to do something like the following:
process.on('exit', () => cache.vacate());
Hey,
I've always used https://github.com/isaacs/node-lru-cache when I need an in-memory LRU cache, but now I've found this.
Any reason I should change and use this instead of lru-cache? :)
Currently lru.remove(key) returns the element and not the value. I think it would be more consistent if it just returned the value, like set/get.
/home/substack/projects/kd-tree-store/node_modules/lru/index.js:33
this.cache[this.tail].prev = null;
^
TypeError: Cannot set property 'prev' of undefined
at LRU.remove (/home/substack/projects/kd-tree-store/node_modules/lru/index.js:33:40)
at LRU.evict (/home/substack/projects/kd-tree-store/node_modules/lru/index.js:121:29)
at LRU.set (/home/substack/projects/kd-tree-store/node_modules/lru/index.js:66:18)
at build (/home/substack/projects/kd-tree-store/index.js:35:17)
at write (/home/substack/projects/kd-tree-store/index.js:53:9)
at build (/home/substack/projects/kd-tree-store/index.js:42:7)
at write (/home/substack/projects/kd-tree-store/index.js:59:9)
at /home/substack/projects/kd-tree-store/index.js:38:14
at FSReqWrap.wrapper [as oncomplete] (fs.js:667:5)
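Going only by the trace, the crash happens when this.cache[this.tail] is undefined, e.g. after the list has been emptied or a pointer has gone stale. A defensive sketch of the failing pattern (the node/tail shapes here are a hypothetical reduced model; the real internals may differ):

```javascript
// Hypothetical reduced model of the linked-list eviction that threw.
// `cache` maps keys to nodes with prev/next pointers; `tail` is the
// oldest key. Guarding both lookups avoids the TypeError when the
// cache is empty or the pointer is stale.
function evictTail(lru) {
  var node = lru.cache[lru.tail];
  if (!node) return undefined; // guard: nothing to evict
  lru.tail = node.next;
  var next = lru.cache[lru.tail];
  if (next) next.prev = null; // original code assumed this always existed
  delete lru.cache[node.key];
  return node.value;
}
```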
Just a thought about the keys property: don't you think it should evict stale elements and return only the keys still valid with regard to maxAge?
This would clean the cache more aggressively, though getting the keys would be a little slower.
What do you think @chriso @mafintosh @DiegoRBaquero ?
Edit: the same could apply to the length property.
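A sketch of such a lazily-evicting keys computation (the store shape mapping key to {value, modified} is a hypothetical stand-in for the real internals):

```javascript
// Sketch of a keys getter that drops expired entries as it scans,
// so only keys still valid with regard to maxAge are returned.
function freshKeys(cache, maxAge, now) {
  now = now || Date.now();
  var keys = [];
  for (var key in cache.store) {
    if (maxAge && now - cache.store[key].modified > maxAge) {
      delete cache.store[key]; // evict the stale entry as we go
    } else {
      keys.push(key);
    }
  }
  return keys;
}
```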
Hi, in function get I see it checks whether an element's modified field differs from Date.now() by more than maxAge, and if so it removes that key and returns undefined.
However, it does not consider the most recent time an element was read, i.e. accessed with a previous get: the modified property is set only inside function set.
This is in contrast to the documentation and to generally expected LRU behavior, AFAIK.
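A sketch of the behavior the issue expects, with get refreshing the timestamp so maxAge measures time since last access rather than last write (store internals hypothetical):

```javascript
// get() that touches `modified` on read, making maxAge an idle
// timeout instead of a write-age limit (hypothetical internals).
function get(cache, key, maxAge, now) {
  now = now || Date.now();
  var entry = cache.store[key];
  if (!entry) return undefined;
  if (maxAge && now - entry.modified > maxAge) {
    delete cache.store[key]; // expired since the last access
    return undefined;
  }
  entry.modified = now; // refresh on read
  return entry.value;
}
```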
Since the module only has one export, it would be simpler to just do var lru = require('lru'), but now I'm just being pedantic about it. It would be a major version update, but otherwise trivial.
I'm loving this LRU cache implementation, but I need to use it outside of my main node process.
I'm playing with the idea of moving the cache into a redis service, so I want to ask: do you think it would be more or less easy to create a storage layer abstraction, using memory as the default, but making it possible to use a specific implementation like redis?
Hello,
The rule that maxAge is checked only on get calls is a bit restrictive to me. In my use case, I use peek to select some elements, then I really use only some of them and call get.
This way, I keep the last-used date accurate so that the oldest entries are evicted when max is reached.
But most of my elements are never accessed using get, so maxAge is never triggered.
What do you think about adding a method to check availability, or adding a parameter to get or peek to control maxAge eviction?
It would save me from doing this:
// Check manually if the peer is active
if (peers.maxAge && (Date.now() - peers.cache[peerId].modified) > peers.maxAge) {
peers.remove(peerId)
continue
}
// Don't mark the peer as most recently used on announce
var peer = this.peers.peek(peerId)
I can propose a PR along these lines if you want.
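A sketch of the proposed API: a peek that optionally evicts an expired entry without refreshing its recency (the option name and the internals are hypothetical):

```javascript
// peek(cache, key, { checkMaxAge: true }) evicts an expired entry
// but never touches `modified`, so recency is left untouched.
function peek(cache, key, opts, now) {
  now = now || Date.now();
  var entry = cache.store[key];
  if (!entry) return undefined;
  if (opts && opts.checkMaxAge &&
      cache.maxAge && now - entry.modified > cache.maxAge) {
    delete cache.store[key]; // expired: evict without marking as used
    return undefined;
  }
  return entry.value;
}
```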
Hello, I have a use case where I'm querying an API that tells browsers not to cache its responses. As a user bounces around from page to page in my application, the browser makes a fresh request each time, even if you're hitting "back" and "forward" repeatedly. I'd like to use a cache to save the most recent, say, 5 requests. But it should be the most recent unique requests; otherwise if you hit "back" and "forward" repeatedly, your cache will only contain those 2 pages and knock out any others you recently hit.
Is there a way to consider the uniqueness of items in an LRU like this one? Or am I thinking about it all wrong?
Thanks for your time.
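One way to look at it: if the cache is keyed by request URL, an LRU already stores each unique request once, so a repeated back/forward hit refreshes that single entry's recency rather than adding a duplicate that knocks others out. A minimal Map-based sketch of that behavior (all names hypothetical, not this library's API):

```javascript
// Map iterates in insertion order, so deleting and re-inserting a key
// moves it to the "most recent" end; evict from the front when full.
class UrlLru {
  constructor(max) {
    this.max = max;
    this.map = new Map();
  }
  get(url) {
    if (!this.map.has(url)) return undefined;
    const value = this.map.get(url);
    this.map.delete(url); // refresh recency of the one entry per URL
    this.map.set(url, value);
    return value;
  }
  set(url, response) {
    if (this.map.has(url)) this.map.delete(url);
    this.map.set(url, response);
    if (this.map.size > this.max) {
      // evict least recently used (first key in insertion order)
      this.map.delete(this.map.keys().next().value);
    }
    return response;
  }
}
```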
Deleting and recreating the object on each read leads to a lot of memory fragmentation. You don't have to recreate the object on read; just move the pointers in the queue.
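The point above, sketched: promoting a node on read by rewiring prev/next pointers, with no allocation (the node and list shapes here are hypothetical, not this library's internals):

```javascript
// Doubly-linked list promote-to-head on read: only pointer updates,
// no delete/recreate, so no allocation or fragmentation on hot reads.
function promote(list, node) {
  if (list.head === node) return; // already most recently used
  // unlink the node from its current position
  if (node.prev) node.prev.next = node.next;
  if (node.next) node.next.prev = node.prev;
  if (list.tail === node) list.tail = node.prev;
  // relink it at the head
  node.prev = null;
  node.next = list.head;
  if (list.head) list.head.prev = node;
  list.head = node;
}
```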