spilgames / erl-cache
Generic in-node caching library with memoization support
Home Page: www.spilgames.com
License: Other
Currently, function results are cached without looking at the result. This is a problem if the function to be called sometimes returns errors: all users will then see the error, even though it was perhaps only transient.
The user of this library should therefore be able to control what to cache and what not to cache. The most generic solution to the problem consists of two parts:
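The core of the idea can be sketched as "inspect the result before caching it". The module below is illustrative only: it uses a plain ETS table as a stand-in for the real cache backend and is not the erl_cache API.

```erlang
-module(cache_filter).
-export([get_or_compute/2]).

%% Sketch: only cache successful results, so a transient {error, _}
%% is retried on the next call instead of being served to all users.
get_or_compute(Key, Fun) ->
    Tab = ensure_table(),
    case ets:lookup(Tab, Key) of
        [{Key, Value}] ->
            {cached, Value};
        [] ->
            case Fun() of
                {error, _} = Err ->
                    Err;                              %% do not cache errors
                Value ->
                    ets:insert(Tab, {Key, Value}),
                    {computed, Value}
            end
    end.

ensure_table() ->
    case ets:info(cache_filter_tab) of
        undefined -> ets:new(cache_filter_tab, [named_table, public]);
        _ -> cache_filter_tab
    end.
```

With this shape, an error result never poisons the cache: the next caller re-runs the function.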
Currently the validity of cache entries is specified in milliseconds, which seems like overkill.
It would be simpler to specify it in seconds.
Move the stats out of cache_server and into its own module.
This would remove most of the logic from the gen_server.
Maybe we want to store the stats in a separate ETS table...
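A standalone stats module backed by its own ETS table could look roughly like this. Module and table names are hypothetical, not erl_cache's actual code; the point is that counters update atomically without going through the gen_server.

```erlang
-module(cache_stats).
-export([init/0, bump/1, get_all/0]).

%% Sketch of a stats module with its own ETS table, keeping counter
%% bookkeeping out of the gen_server entirely.
init() ->
    ets:new(cache_stats_tab, [named_table, public, {write_concurrency, true}]).

bump(Counter) ->
    %% Atomic increment; the {Counter, 0} default creates the row on first use.
    ets:update_counter(cache_stats_tab, Counter, 1, {Counter, 0}).

get_all() ->
    ets:tab2list(cache_stats_tab).
```

Because the table is public with write_concurrency, any process can bump counters directly, which also avoids serializing stats updates through one process.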
Test case:

-module(t).
-include_lib("erl_cache/include/erl_cache.hrl").
-include_lib("decorator_pt/include/decorator_pt.hrl").
-compile([export_all]).

start() ->
    erl_cache:start(),
    erl_cache:start_cache(test, []).

%% Caching decorator
?CACHE(test, [{validity, 30000}, {evict, 5000}]).
sum(A, B) when A < 5 ->
    A + B;
sum(A, B) when A > 5 ->
    A + B.
In shell:
6> c(t).
./t.erl:none: error in parse transform 'decorator_pt_core': {{badmatch,
[{clause,14,
[{var,14,'A'},{var,14,'B'}],
[[{op,14,'<',
{var,14,'A'},
{integer,14,5}}]],
[{op,15,'+',
{var,15,'A'},
{var,15,'B'}}]},
{clause,16,
[{var,16,'A'},{var,16,'B'}],
[[{op,16,'>',
{var,16,'A'},
{integer,16,5}}]],
[{op,17,'+',
{var,17,'B'},
{var,17,'A'}}]}]},
[{decorator_pt_core,
apply_decorators,2},
{decorator_pt_core,
transform_node,2},
{decorator_pt_core,mapfoldl,3},
{decorator_pt_core,mapfoldl,3},
{decorator_pt_core,
parse_transform,2},
{compile,
'-foldl_transform/2-anonymous-2-',
2},
{compile,foldl_transform,2},
{compile,
'-internal_comp/4-anonymous-1-',
2}]}
error
Right now, entries are only evicted when a get request is made for that key and the entry turns out to be overdue for eviction (Now > Evict).
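A proactive alternative would be a timer-driven sweep that deletes overdue entries regardless of whether they are read. The sketch below assumes {Key, Value, EvictAtMs} rows in an ETS table; the row layout and module name are illustrative, not erl_cache's internals.

```erlang
-module(evict_sweep).
-export([start/2]).

%% Sketch: periodically delete every entry whose eviction deadline has
%% passed, instead of waiting for a get on each individual key.
start(Tab, IntervalMs) ->
    spawn(fun() -> loop(Tab, IntervalMs) end).

loop(Tab, IntervalMs) ->
    timer:sleep(IntervalMs),
    Now = erlang:system_time(millisecond),
    %% Match spec: any {_, _, EvictAt} row with EvictAt < Now is deleted.
    ets:select_delete(Tab, [{{'_', '_', '$1'}, [{'<', '$1', Now}], [true]}]),
    loop(Tab, IntervalMs).
```

ets:select_delete/2 does the whole sweep in one pass, so the cost per interval is proportional to the table size rather than one message per key.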
To decrease the lookup/seek times for entries, sharding should be used.
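One simple sharding scheme is to hash each key onto one of N tables. The naming convention below is hypothetical, but erlang:phash2/2 gives a stable, uniform shard index.

```erlang
-module(shard).
-export([table_for/2]).

%% Sketch: map a key to one of NumShards ETS tables so lookups and
%% writes spread across tables instead of contending on a single one.
table_for(Key, NumShards) ->
    Shard = erlang:phash2(Key, NumShards),   %% deterministic, 0..NumShards-1
    list_to_atom("cache_shard_" ++ integer_to_list(Shard)).
```

In production code the atoms would be pre-created at startup (a fixed shard count), since list_to_atom/1 on unbounded input can exhaust the atom table.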
When the async refresh strategy is used to refresh stale entries, multiple processes can be spawned to fetch the same value at the same time. Avoiding this would reduce resource usage.
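One way to deduplicate in-flight refreshes is an atomic "lock" per key via ets:insert_new/2: only the first caller wins and spawns the fetch. This is a sketch of the idea, not erl_cache's implementation.

```erlang
-module(refresh_dedup).
-export([maybe_refresh/2]).

%% Sketch: at most one refresh per key in flight. ets:insert_new/2 is
%% atomic, so concurrent callers race safely; losers get already_running.
maybe_refresh(Key, RefreshFun) ->
    Tab = ensure_table(),
    case ets:insert_new(Tab, {Key, in_flight}) of
        true ->
            spawn(fun() ->
                try RefreshFun() after ets:delete(Tab, Key) end
            end),
            started;
        false ->
            already_running
    end.

ensure_table() ->
    case ets:info(refresh_inflight) of
        undefined -> ets:new(refresh_inflight, [named_table, public]);
        _ -> refresh_inflight
    end.
```

The try ... after ensures the lock row is removed even if the refresh function crashes, so a failed refresh does not block future attempts.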
Is there any interest in implementing an optional memory-based backpressure mechanism?
The idea is a user would set a memory high watermark and the cache would return hits / refresh hits but would not store new entries until memory usage was below the high watermark.
Memory usage could be computed at insertion time, or periodically via some kind of erlang:send_after timer.
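The watermark check itself is cheap with ETS, since every table reports its own memory footprint. A minimal sketch, assuming the cache table is named cache_tab (the table name and limit are illustrative):

```erlang
-module(mem_watermark).
-export([below_watermark/1]).

%% Sketch of the proposed high-watermark test: compare the cache table's
%% memory footprint in bytes against a configured limit. New entries
%% would only be stored while this returns true.
below_watermark(MaxBytes) ->
    Words = ets:info(cache_tab, memory),           %% reported in words
    Bytes = Words * erlang:system_info(wordsize),  %% words -> bytes
    Bytes < MaxBytes.
```

Calling this on every insert is O(1); a periodic erlang:send_after-driven recheck would instead cache the boolean and amortize even that cost.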
On a clean checkout, the unit tests fail consistently for me on R16B03-1. If I increase the timer:sleep to 600 ms, everything passes. Setting both the evict_interval and the sleep to 500 ms introduces a race condition.
======================== EUnit ========================
module 'erl_cache_driver'
module 'erl_cache_app'
module 'erl_cache_server'
module 'erl_cache_server_sup'
module 'erl_cache_eunit'
erl_cache_eunit: get_set_evict_test_ (Start/Stop cache servers)...[0.001 s] ok
erl_cache_eunit: get_set_evict_test_ (Catch faulty options in get and set)...ok
erl_cache_eunit: get_set_evict_test_ (Default get, set, evict and stats)...[0.004 s] ok
erl_cache_eunit: get_set_evict_test_ (Get and set with undefined refresh callback)...[0.357 s] ok
erl_cache_eunit: get_set_evict_test_ (Get and set with refresh overdue_async_mfa)...[0.469 s] ok
erl_cache_eunit: get_set_evict_test_ (Get and set with refresh overdue_sync_closure)...[0.419 s] ok
erl_cache_eunit: get_set_evict_test_ (Get and set with error values)...[0.006 s] ok
erl_cache_eunit: get_set_evict_test_ (Refresh with error)...[0.402 s] ok
erl_cache_eunit: get_set_evict_test_ (Stats after basic operations)...[0.160 s] ok
erl_cache_eunit: get_set_evict_test_ (Evict interval cache cleanup + correct stats)...*failed*
in function erl_cache_eunit:'-evict_interval/0-fun-5-'/2 (test/erl_cache_eunit.erl, line 191)
in call from erl_cache_eunit:evict_interval/0 (test/erl_cache_eunit.erl, line 191)
**error:{assertEqual_failed,[{module,erl_cache_eunit},
{line,191},
{expression,"proplists : get_value ( evict , Stats )"},
{expected,1},
{value,undefined}]}
erl_cache_eunit: get_set_evict_test_ (Parse transform basic usage)...[3.003 s] ok
[done in 5.400 s]
module 'erl_cache_sup'
module 'erl_cache'
module 'erl_cache_decorator'
=======================================================
Failed: 1. Skipped: 0. Passed: 10.
If the amount of memory used by the cache_server exceeds some threshold X (configurable, with a sane default), start evicting keys to constrain memory usage.
Multiple eviction strategies should be considered.
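As one candidate strategy, an oldest-first pass could drop entries until the table is back under budget. The row layout {Key, Value, InsertedAtMs} and module name are hypothetical; LRU, LFU, or random eviction would slot into the same loop.

```erlang
-module(evict).
-export([evict_until_under/2]).

%% Sketch: evict the oldest entries from Tab until its memory use is
%% below MaxBytes. Oldest-first is just one of the possible strategies.
evict_until_under(Tab, MaxBytes) ->
    Bytes = ets:info(Tab, memory) * erlang:system_info(wordsize),
    case Bytes < MaxBytes of
        true -> ok;
        false ->
            case oldest_key(Tab) of
                none -> ok;                            %% table empty, nothing left
                Key  -> ets:delete(Tab, Key),
                        evict_until_under(Tab, MaxBytes)
            end
    end.

%% Linear scan for the entry with the smallest InsertedAtMs timestamp.
oldest_key(Tab) ->
    Found = ets:foldl(
        fun({K, _V, T}, none) -> {K, T};
           ({K, _V, T}, {_K2, T2}) when T < T2 -> {K, T};
           (_, Acc) -> Acc
        end, none, Tab),
    case Found of
        none -> none;
        {K, _T} -> K
    end.
```

The linear scan keeps the sketch short; a real implementation would likely maintain an ordered index (e.g. an ordered_set keyed on timestamp) so each eviction is logarithmic instead of O(n).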