node-cache / node-cache
a node internal (in-memory) caching module
License: MIT License
How to obtain the remaining TTL of a key
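A sketch of one way to compute this, assuming `getTtl(key)` returns the absolute expiry timestamp in milliseconds (undefined for a missing key, 0 for a key without a TTL), which is how node-cache's `getTtl` is documented to behave. The stand-in cache object is there only to make the sketch self-contained:

```javascript
// Remaining TTL = expiry timestamp minus "now".
function remainingTtlMs(cache, key) {
  const expiresAt = cache.getTtl(key); // undefined if missing, 0 if no TTL
  if (expiresAt === undefined || expiresAt === 0) return undefined;
  return Math.max(0, expiresAt - Date.now());
}

// Stand-in cache so the sketch runs without node-cache installed:
const fakeCache = { getTtl: () => Date.now() + 5000 };
console.log(remainingTtlMs(fakeCache, "myKey")); // roughly 5000
```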
In the documentation for del() it says "...Returns the number of deleted entries..."
Will this always be 0 or 1?
Right now the module is very synchronous with the callbacks and some code execution.
I'm mostly concerned about the TTL check period which goes through all the keys synchronously.
Everything should be deferred using process.nextTick / setImmediate / setTimeout(fn, 0).
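A minimal sketch of the suggested approach, yielding to the event loop between keys via setImmediate so a large TTL sweep does not block other work for its full duration (this is an illustration of the idea, not node-cache's actual implementation):

```javascript
// Process one key per event-loop turn instead of looping synchronously.
function checkKeysDeferred(keys, checkOne, done) {
  const queue = keys.slice();
  (function next() {
    if (queue.length === 0) return done();
    checkOne(queue.shift());
    setImmediate(next); // defer the rest of the sweep
  })();
}

checkKeysDeferred(["a", "b", "c"], (k) => console.log("checked", k), () => console.log("sweep done"));
```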
Is it possible to increase this value?
const myCacheA = new NodeCache();
const myCacheB = new NodeCache();
Does it mean myCacheA gets 1m and myCacheB gets 1m?
thanks
Hey,
I can't understand why this is happening:
var NodeCache = require('node-cache');
var mycache = new NodeCache( { stdTTL: 100, checkperiod: 120 } );

mycache.get("test", function( err, cachepage ){
    console.log(cachepage);
    console.log(err);
    if (!err) {
        // Object is present in cache.
        console.log('VIEW: Page from cache');
        res.jsonp(cachepage);
    }
});
It ALWAYS goes into the if (!err) { } block: err is null and cachepage is {}.
I don't understand what the issue is.
var cache = new NodeCache({errorOnMissing:false});
Still throws an error instead of returning undefined.
Any way to save the cache to the file system? Or to memcache / some kind of db?
I have been looking into a caching solution for a node project of mine and I think this library looks very useful! One thing I think could be a great addition, though, is a maximum cache size (perhaps either a number of entries or a memory size) which, when exceeded, would cause the check routine to evict the oldest entries from the cache to bring it back down to this size.
This of course would be entirely optional and if not set the cache would operate exactly as it does now.
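A self-contained sketch of the proposed policy (not part of node-cache): a cache bounded by a maximum number of entries that evicts the oldest entry once the limit is exceeded. A Map stands in for the real store, since Map preserves insertion order:

```javascript
// Max-entries cache: evicts the oldest entry when over the limit.
class BoundedCache {
  constructor(maxEntries) {
    this.max = maxEntries;
    this.store = new Map(); // insertion order == age order
  }
  set(key, val) {
    this.store.delete(key);  // refresh insertion order on overwrite
    this.store.set(key, val);
    while (this.store.size > this.max) {
      const oldest = this.store.keys().next().value;
      this.store.delete(oldest); // evict the oldest entry
    }
  }
  get(key) {
    return this.store.get(key);
  }
}

const c = new BoundedCache(3);
c.set("a", 1); c.set("b", 2); c.set("c", 3);
c.set("d", 4);              // exceeds the limit, "a" is evicted
console.log(c.get("a"));    // undefined
console.log(c.get("d"));    // 4
```

A real implementation would also have to interact with TTL expiry, but the eviction loop itself is this simple.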
CoffeeScript is just overhead: it bloats the compiled app, makes it harder to debug, larger in size, and inconsistent with ES6.
It would be nice to have documentation with an internal version switch.
Greetings! I'm using node-cache module and I'm trying to store promises in the cache. Take a look at the following code snippet:
router.get('/test', (req, res) => {
    myCache.get('example', (err, value) => {
        if (!err) {
            if (value == undefined) {
                value = new Promise((resolve, reject) => {
                    resolve("hey");
                });
                myCache.set('example', value);
            }
            console.log(value instanceof Promise);
            value.then((str) => console.log(str));
            res.status(200);
            res.send();
        } else {
            logger.error(err);
            res.status(500);
            res.send();
        }
    });
});
The first request to that endpoint produces the following console output:
true
hey
The second request produces this:
true
TypeError: # is not a promise
So....a promise "is not" a promise? wat
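A likely explanation: by default node-cache deep-clones values on set/get (the useClones behavior), and a deep clone of a Promise loses its prototype, so the cached copy has no .then. The sketch below uses a JSON round-trip only to illustrate the same effect in a self-contained way; node-cache uses the clone module, not JSON:

```javascript
// A deep "clone" of a Promise is just a plain object.
const original = Promise.resolve("hey");
const cloned = JSON.parse(JSON.stringify(original)); // {}

console.log(original instanceof Promise); // true
console.log(cloned instanceof Promise);   // false, no .then anymore
```

A common workaround is `new NodeCache({ useClones: false })`, which stores references instead of clones.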
Once I added node-cache to my module it stopped exiting as it should. If you take a look at [1] you can see how node-cache has been included and used. If you look at [2], you can see where I have removed node-cache by stubbing it out.
The stubbed out version executes and returns control as it should, but the non-stubbed version remains running indefinitely. The simplest test is to run node goji.js. That should exit immediately, but when node-cache is not stubbed it will run until killed.
I really have no idea what is going on here. Am I doing something incorrectly and just overlooking it? Or is there something wrong with node-cache?
[1] -- https://github.com/jsumners/goji/compare/caching
[2] -- jsumners/goji@caching...node-cache-stub
I've set up a cache of objects using nodecache. These objects get "processed" roughly 10s after being inserted into the cache. After they're processed I don't need to reference them and they can be destroyed.
Is it more resource efficient to .del() them after processing them manually, or to let them automatically expire after ~15 seconds?
For instance: fetching some seldom-changing data from the db and caching it, then applying i18n to the result, pollutes subsequent cache fetches, because the plain English result has been translated into another language in the cache.
var NodeCache = require("node-cache");
var myCache = new NodeCache();

var map = new Map();
map.set(1, 1);
console.log(map);
// Map { 1 => 1 }

myCache.set("myKey", map, function (err, success) {
    if (!err && success) {
        console.log(success);
        // true
    }
});

myCache.get("myKey", function (err, value) {
    if (!err) {
        if (value == undefined) {
            console.log("undefined");
        } else {
            console.log(value);
            // Map {}
        }
    }
});
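This looks like the same cloning issue: the default clone of a Map comes back as an empty object. One workaround, sketched below, is to store a clone-safe representation (an array of entries) and rebuild the Map on read; another is to pass `{ useClones: false }` so the Map is stored by reference:

```javascript
// A Map does not survive a naive clone, but its entries array does.
const map = new Map([[1, 1]]);
const storable = [...map.entries()];                  // [[1, 1]]
const restored = new Map(JSON.parse(JSON.stringify(storable)));

console.log(restored.get(1)); // 1
console.log(restored instanceof Map); // true
```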
A max-memory option would be very cool.
Is there a way to list all keys in the cache?
(just like in memory-cache)
From the README.md:
value = myCache.get( "myKey" );
// { "myKey": { my: "Special", variable: 42 } }
Should this not just return
// { my: "Special", variable: 42 }
rather than have the wrapping object too?
Going through the documentation, there is no way to update the TTL of a key, so it can't refresh a session's TTL when used as a session store.
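For reference, node-cache does expose a `ttl(key, seconds)` method that resets a key's TTL in place, which covers the session-refresh case. A self-contained sketch of the same idea with a stand-in store (the session data below is hypothetical):

```javascript
// Stand-in store mapping key -> { val, expiresAt }.
const entries = new Map();

function set(key, val, ttlSec) {
  entries.set(key, { val, expiresAt: Date.now() + ttlSec * 1000 });
}

// Analogous to myCache.ttl(key, ttlSec): refresh the TTL, report a miss.
function refreshTtl(key, ttlSec) {
  const entry = entries.get(key);
  if (!entry) return false;
  entry.expiresAt = Date.now() + ttlSec * 1000;
  return true;
}

set("session:abc", { user: "jane" }, 60);
console.log(refreshTtl("session:abc", 60)); // true
console.log(refreshTtl("missing", 60));     // false
```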
I am using your package node-cache, its exactly what I was looking for!
I have one problem. I am using a simple mechanism where I want to query my MySQL DB once every 5 min:
var NodeCache = require('node-cache');
var queriesCache = new NodeCache({ stdTTL: 300, checkperiod: 1 });
And then whenever my query result object expires via the "expires" event, I query MySQL again and store the query result in the object cache.
The only problem I have is that sometimes the TTL doesn't seem to work, and my object expires all the time.
Have you heard about this problem?
Thanks!
Currently it is not clear whether expired entries are automatically purged from memory or only removed when you try to get them (so that your memory usage can potentially grow uncontrollably and explicit housekeeping for expired entries would have to be implemented).
Documentation mentions "Keys can have a timeout after which they expire and are cleaned from the cache", but it is not clear, whether values themselves are deleted as well.
Caching request objects not working from version 3.0.0.
I've created a gist to reproduce the issue: https://gist.github.com/szeist/777d67619579d4218144
Is it intentional that keys have to be strings? It's not mentioned in the docs, but I couldn't get the cache to work with an integer key.
Suppose I have to store a lot of information in the cache and the assigned memory gets full; how will the module behave? Will it use an LRU algorithm to clear the old cache, or will it throw an error?
This adds support for cloning ES6 Maps, Sets, Promises, and Symbols.
So the notes in the useClones option could be obsolete.
In a small module (randoms) I added code coverage via coveralls.io.
This would be nice for node-cache too...
I'm willing to go through and thunkify (https://github.com/visionmedia/node-thunkify) all of the methods in this nifty utility, but folks not running on node 0.11 with --harmony enabled aren't so lucky. May we please get some sync methods to complement the async defaults of this util?
Hi, will the cache get cleared when node restarts?
Can I ask what the unit of ksize and vsize in the stats is? I am displaying the stats in the admin section of my project and I would like to know when it is over a certain amount of memory.
So I suppose I can check vsize + ksize, but what is the unit of those values?
Btw thanks for this project, super useful.
I was just wondering if there is functionality in node-cache for deleting the first 10 (could this be configurable?) records from the dictionary, to avoid oversizing the cache?
Or is there a max limit in the node-cache library that I should know about?
Please let me know. Thanks!
Can I define a cacheKey in a "string + variable" way, to be able to use the same TTL rule for all of them?
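Since node-cache keys are plain strings, building them from a prefix plus a variable works; a TTL rule can then be applied per prefix by convention. A tiny sketch (userId and the prefix are hypothetical names):

```javascript
// Compose a key from a fixed prefix and a variable part.
const userId = 42;
const cacheKey = "user:" + userId + ":friends"; // or `user:${userId}:friends`
console.log(cacheKey); // "user:42:friends"
```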
When using node-cache in a clustered environment, it will obviously create n caches based on how many forks we create. I am trying to find a way for the forked processes to use just one cache. The initial idea was to instantiate the cache in cluster.js and send it to the forks via worker.send. The problem is:
var string = JSON.stringify(message) + '\n';
^
TypeError: Converting circular structure to JSON
at Object.stringify (native)
at ChildProcess.target._send (child_process.js:479:23)
at ChildProcess.target.send (child_process.js:416:12)
at Worker.send (cluster.js:64:21)
I set a cache key and store a value in one nodejs script. How do I use the same key and access the value in another nodejs script? Right now, it will say key_not_found in another script.
Hi, thanks for your lib, very useful. However, there is a small issue: I put elements in my cache that look like this:
var element = { buffer: aNodeBufferObject, contentType : 'image/png' };
The vsize for this type does not match, as it just doesn't know the type. Maybe it would make sense to add an optional key to the element that specifies the exact size? Another idea would be to allow defining a function that returns the exact size of an element.
what do you think?
Hello everyone, hope you are all doing good.
I have an app that uses node-cache to cache some JSON data. The keys are all unique because I set them from the name the user enters in an input. I have tried cache.flushAll(), but it doesn't clear the cache, and for now all I can do is restart the node server to clear it. Is there a way to do some kind of cache.del() without specifying exactly what to delete, so that it deletes everything?
Here is a demo of what I have:
var nc = require("node-cache");
var cache = new nc( { stdTTL: 3000 } );
// this cache-key is normally set by what the user add in a field. This is just for example:
var cacheKey = 'key-' + new Date().getTime();
cache.set(cacheKey, { hello: 'world' });
Then in another file I have:
var nc = require("node-cache");
cache = new nc();
cache.flushAll();
Thank you.
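The likely cause: every `new NodeCache()` creates an independent store, so calling flushAll() on a freshly constructed instance in the second file clears a different (empty) cache. The usual fix is to create the instance once in its own module and require() it everywhere; Node's module cache makes that a singleton. A self-contained sketch with a Map standing in for the single NodeCache instance:

```javascript
// cache.js would conceptually do:
//   module.exports = new NodeCache({ stdTTL: 3000 });
// and every require("./cache") would return the SAME instance.
const sharedCache = new Map(); // stand-in for that single exported instance

// File A sets a key:
sharedCache.set("key-123", { hello: "world" });

// File B, using the same required instance (not a new one):
sharedCache.clear(); // analogous to cache.flushAll()
console.log(sharedCache.size); // 0
```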
I created a simple js file:
var NodeCache = require('node-cache');
var myCache = new NodeCache();
Then run it:
node simple_js_file.js
And it doesn't exit after the line
var myCache = new NodeCache();
Do I need to call something like
myCache.close();
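What keeps the process alive is the internal checkperiod interval timer: a pending timer prevents Node's event loop from draining. A sketch of the mechanism and the standard fix, `timer.unref()` (newer node-cache versions handle this themselves and, as far as I can tell, also expose close() to stop the timer explicitly):

```javascript
// A pending interval keeps the Node process running...
const timer = setInterval(() => {}, 600000); // stand-in for the TTL check loop

// ...unless it is unref'd, which lets the process exit normally.
timer.unref();
console.log(typeof timer.unref); // "function"
```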
I am looking for an in-process cache module and I found node-cache. Is it stable enough for a production environment?
Pull request momentarily.
I'm trying to promisify this lib, but it doesn't seem to work:
http://bluebirdjs.com/docs/api/promisification.html
var Promise = require('bluebird');
var Cache = require('node-cache');
Promise.promisifyAll(Cache.prototype);

var cache = new Cache();
cache.set('Test', 1000).then(function () {
    console.log('Success');
});
Is it even possible?
Hi,
I'm wondering why a delete miss counts as a cache miss.
Here: https://github.com/tcs-de/nodecache/blob/master/_src/lib/node_cache.coffee#L223
Thank you
Promises (available in Node 0.12+) should not be cloned (or the clone module should handle Promises properly).
Currently, node-cache breaks implementations that cache promises.
Hello and thanks for that.
I would love the set method to accept an additional last argument: a callback function triggered at the end of the time to live.
The goal is to reuse the value one last time before it dies, to do something meaningful with it in another part of the code.
=> myCache.set( key, val, [ ttl ], successCallback, ttlCallback )
If it sounds interesting, I can have a look at it.
Best !
If I want to cache a user's online friends list, we would usually use CAS to keep it correct in a multi-threaded scenario. Could node-cache provide an easy way to operate on list-type values, e.g. remove_list_value_by_key and append_list_value methods to manipulate the cache?
Hiya,
For certain reasons I need a way to store and alter an object
that looks a little like this in the cache.
{
parentKey1: {
child1: {},
child2: {}
// Etc...
},
parentKey2: {
child1: {},
child2: {}
// Etc...
},
// Etc.....
}
I need to be able to get and set all the children of each parent key. In your examples I don't see this implementation anywhere. Is it possible, though?
Can you, for instance, do something like this? (Note the dot in the key string.)
myCache.set("parent1.child2", obj, function (err, success) {
    if (err) {
        // Handle error
        return;
    }
    // Do something with our success!
});
Add a kind of storage setting to change the persistence from memory to, for example, a JSON file, local storage, or a custom storage provider.
This storage should have a flag indicating whether all requests must be called async.
Not sure why I am getting this error...
var NodeCache = require( "node-cache" );
var cache = new NodeCache();
produces the error:
"NodeCache is not a constructor"
I am using TypeScript as well, but even when I use the same code from the examples provided, I get this error.
It would be great to have the possibility to define a backup for an expired cache.
Basically, instead of listening on cacheRegion.on( "expired", function( key, value ) ) and then updating the cache when it expires, we could have an option defined like this:
myCache.set( key, val, [ ttl, cacheBackup ], [callback] )
where cacheBackup is true if we want to have a backup and false otherwise.
Workflow:
- The key expires.
- A GET request for the key is made.
- Instead of a miss, an object with the key/value pair and a boolean CacheBackup equal to true is returned, letting the user know that it came from an expired cache entry.
This is very useful when the HTTP requests fail and the cache has expired.
Currently the tests are written on top of the deprecated expresso framework.
A rewrite of the tests to mocha or something similar would be nice.
Just wondering why zero isn't the default, since by default if the cached value is expired it returns undefined anyway.
Is it used to stop the cache from filling up with expired values? I would expect an expired value to be removed from the cache on the 'get' call - or is this not the case?
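For reference, the usual design here (which node-cache appears to follow) is lazy expiry: get() checks the timestamp and deletes an expired entry on read, while the periodic checkperiod sweep exists to reclaim entries that nobody ever reads again. A minimal sketch of that pattern:

```javascript
// Lazy expiry: an expired entry is removed the moment it is read.
const store = new Map();

function get(key) {
  const entry = store.get(key);
  if (!entry) return undefined;
  if (entry.expiresAt <= Date.now()) { // expired: delete on read
    store.delete(key);
    return undefined;
  }
  return entry.val;
}

store.set("k", { val: 1, expiresAt: Date.now() - 1 }); // already expired
console.log(get("k"));       // undefined
console.log(store.has("k")); // false, removed by the read
```

Without the periodic sweep (checkperiod: 0), entries that are never read again would stay in memory indefinitely, which is likely why the sweep is on by default.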