
node-locking-cache's Introduction

locking-cache

I am a locking LRU cache. This means that subsequent calls to a cached function will wait until the initial call has populated the cache, at which point all pending calls will be provided with cached data.

Usage

var request = require("request"),
    lockingCache = require("locking-cache"),
    lockedFetch = lockingCache({
      // lru-cache options
      max: 10
    });

var fetch = lockedFetch(function(uri, lock) {
  // generate a key
  var key = uri;

  // lock
  return lock(key, function(unlock) {

    // make the call that produces data to be cached
    return request.get(uri, function(err, rsp, body) {
      // optionally do stuff to data that's been returned

      // unlock, caching non-error arguments
      // all arguments will be passed to pending callbacks
      return unlock(err, rsp, body);
    });
  });
});

// will trigger the initial fetch
fetch("http://google.com/", function(err, rsp, body) {
  // ...

  // rsp and body will be returned from the cache
  // if evicted, a fetch will be triggered again
  fetch("http://google.com/", function(err, rsp, body) {
    // ...
  });
});

// will wait for the initial fetch to complete (or fail)
fetch("http://google.com/", function(err, rsp, body) {
  // ...
});

See lru-cache for cache options.
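For example, a time-based expiry can be layered on top of the size limit (a sketch, assuming an lru-cache version that supports the maxAge option):

var lockedFetch = lockingCache({
  max: 50,           // keep at most 50 entries
  maxAge: 60 * 1000  // consider entries stale after one minute
});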

The dispose option that's passed to the underlying LRU differs slightly from what lru-cache documents, as values are stored as arrays in order to support varargs (multiple values passed to unlock()):

var lockedFetch = lockingCache({
  dispose: function(key, values) {
    // values is the array of arguments cached for this key
  }
});
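In the fetch example above, unlock(err, rsp, body) caches the non-error arguments, so dispose would receive [rsp, body]. A minimal sketch (the logging is only illustrative):

var lockedFetch = lockingCache({
  max: 10,
  dispose: function(key, values) {
    // values is [rsp, body] from unlock(err, rsp, body) above
    console.log("evicting %s (%d cached values)", key, values.length);
  }
});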

A custom factory function (one that returns an LRU instance or something compatible) may be provided as the last argument. Here are two examples of how it can be used.

var LRU = require("lru-cache"),
    lockingCache = require("locking-cache"),
    lockedFetchA = lockingCache(function() {
      return LRU({
        max: 10
      });
    }),
    lockedFetchB = lockingCache({
      name: "B",
      max: 10
    }, function(options) {
      console.log("Creating cache for %s", options.name);
      return LRU(options);
    });

// ...
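The factory also makes it possible to share a single cache instance between several locked fetchers, as in the sketch below (an assumption about usage rather than documented behavior; it only makes sense if the fetchers' keys can't collide):

var LRU = require("lru-cache"),
    lockingCache = require("locking-cache"),
    sharedCache = LRU({ max: 100 }),
    // both fetchers draw from the same 100-entry cache
    lockedFetchC = lockingCache(function() {
      return sharedCache;
    }),
    lockedFetchD = lockingCache(function() {
      return sharedCache;
    });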

node-locking-cache's Issues

Streaming

When the cached function returns a stream, pipe it to all pending listeners. This means that the cache should return a stream immediately.

Inspiration: the first bullet of the Varnish 4.0 release announcement.
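
Nothing here is implemented yet; one possible approach is to give each pending listener its own PassThrough branch of the source stream, as in this sketch (fanOut is a hypothetical helper, not part of this library):

var stream = require("stream");

// Hypothetical helper: pipe one source stream into a PassThrough per
// pending listener so every consumer gets its own readable copy.
function fanOut(source, listenerCount) {
  var branches = [];

  for (var i = 0; i < listenerCount; i++) {
    var branch = new stream.PassThrough();
    source.pipe(branch);
    branches.push(branch);
  }

  return branches;
}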
