iamsinghrajat / async-cache

A caching solution for asyncio
Home Page: https://pypi.org/project/async-cache/
License: MIT License
Rarely, but sometimes, an unknown error happens when a function with the `@AsyncTTL` decorator is called: `<cache.key.KEY object at 0x7f242580c460>`
```text
  File "/usr/local/lib/python3.10/site-packages/cache/async_ttl.py", line 56, in wrapper
    self.ttl[key] = await func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/cache/async_ttl.py", line 36, in __setitem__
    super().__setitem__(key, (value, ttl_value))
  File "/usr/local/lib/python3.10/site-packages/cache/lru.py", line 17, in __setitem__
    oldest = next(iter(self))
```
There are some occasions when it's useful to ignore some parameters for the purpose of determining a key. For example, if you're caching a method, it might be that `self` is irrelevant to the desired caching behaviour.
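A minimal sketch of the idea, using a hypothetical decorator name (`async_cache_skip_first` is not part of async-cache's API): the key is built from everything except the first positional argument, so `self` is ignored and two instances share cached results.

```python
import asyncio
import functools

calls = {"count": 0}   # counts real invocations, to make cache hits visible
results = []

def async_cache_skip_first(func):
    """Hypothetical decorator: cache an async callable, ignoring the
    first positional argument (typically `self`) when building the key."""
    cache = {}

    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        key = (args[1:], tuple(sorted(kwargs.items())))  # drop args[0]
        if key not in cache:
            cache[key] = await func(*args, **kwargs)
        return cache[key]

    return wrapper

class Service:
    @async_cache_skip_first
    async def fetch(self, x):
        calls["count"] += 1
        return x * 2

async def main():
    a, b = Service(), Service()
    results.append(await a.fetch(3))
    results.append(await b.fetch(3))  # cache hit despite a different `self`

asyncio.run(main())
print(results, calls["count"])  # [6, 6] 1
```

Because the cache dict is created when the class body is evaluated, both instances see the same cache, and the second call never reaches the function body.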
As far as I can see, there is no method to clear a cache key; it would be nice to have one.
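Such a method is easy to sketch in the meantime; the `async_memo` decorator and `invalidate` helper below are hypothetical names, not part of async-cache's API.

```python
import asyncio
import functools

outcome = {}

def async_memo(func):
    """Hypothetical async memoizer with per-key invalidation."""
    cache = {}

    @functools.wraps(func)
    async def wrapper(*args):
        if args not in cache:
            cache[args] = await func(*args)
        return cache[args]

    def invalidate(*args):
        # Drop a single cached key; return the evicted value, if any.
        return cache.pop(args, None)

    wrapper.invalidate = invalidate
    return wrapper

@async_memo
async def double(x):
    return x * 2

async def main():
    await double(5)
    outcome["first"] = double.invalidate(5)   # evicts the entry
    outcome["second"] = double.invalidate(5)  # already gone

asyncio.run(main())
print(outcome)  # {'first': 10, 'second': None}
```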
Hello,
could you please create a release on PyPI with the latest changes? For example, the `use_cache` function is not yet available in the released version.
Thanks in advance.
Need to add support for caching using Redis.
async-cache/cache/async_ttl.py, line 34 (commit 20a9fda)
async-cache/cache/async_ttl.py, line 11 (commit 20a9fda)
```
@AsyncTTL(hours=1, seconds=1, days=0, milliseconds=0, ... etc. , maxsize=1024)
```

```python
datetime.timedelta(
    days=days,
    seconds=seconds,
    microseconds=microseconds,
    milliseconds=milliseconds,
    minutes=minutes,
    hours=hours,
    weeks=weeks,
)
```
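The keyword arguments proposed above fold naturally into a single TTL in seconds via `datetime.timedelta`; `ttl_seconds` below is a hypothetical helper sketching that conversion, not the library's code.

```python
import datetime

def ttl_seconds(days=0, seconds=0, microseconds=0, milliseconds=0,
                minutes=0, hours=0, weeks=0):
    """Collapse human-friendly time units into one TTL in seconds."""
    delta = datetime.timedelta(
        days=days, seconds=seconds, microseconds=microseconds,
        milliseconds=milliseconds, minutes=minutes, hours=hours, weeks=weeks,
    )
    return delta.total_seconds()

print(ttl_seconds(hours=1, seconds=1))  # 3601.0
```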
I get this error when I try to run the code with an import like `from async-cache import AsyncLRU`:

```text
  File "app/workers/simulator.py", line 7
    from async-cache import AsyncLRU
              ^
SyntaxError: invalid syntax
```
Looks like the `-` in the name is the problem. In VS Code I see the two words `async` and `cache` in different colors, as the linter flags it as an error. Is this a bug, or am I doing something wrong?
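A hyphen is not valid in a Python identifier, so `from async-cache import ...` can never parse, regardless of what is installed. Note that the tracebacks earlier on this page show the package living under `site-packages/cache/`, so the import name is `cache` (e.g. `from cache import AsyncLRU`) even though the PyPI distribution is named `async-cache`. A quick check of the identifier rule:

```python
# A PyPI distribution name may contain hyphens, but an importable module
# name must be a valid Python identifier -- hence the SyntaxError above.
print("async-cache".isidentifier())  # False
print("cache".isidentifier())        # True
```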
Need to add more types of caching methods. Currently only LRU is available.
cachetools' `cached` decorator has an `info` param that adds a wrapped info function for getting details on cache usage, which can be helpful for debugging. From their docs:

If `info` is set to `True`, the wrapped function is instrumented with a `cache_info()` function that returns a named tuple showing hits, misses, maxsize and currsize, to help measure the effectiveness of the cache.

Would be nice to have something similar here!
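A minimal sketch of what that could look like for an async LRU decorator; `async_lru` and its `cache_info()` attribute here are assumed names for illustration, not async-cache's implementation.

```python
import asyncio
import functools
from collections import OrderedDict, namedtuple

CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"])

def async_lru(maxsize=128):
    """Hypothetical async LRU decorator instrumented with cache_info()."""
    def decorator(func):
        cache = OrderedDict()
        stats = {"hits": 0, "misses": 0}

        @functools.wraps(func)
        async def wrapper(*args):
            if args in cache:
                stats["hits"] += 1
                cache.move_to_end(args)      # mark as most recently used
                return cache[args]
            stats["misses"] += 1
            result = await func(*args)
            cache[args] = result
            if len(cache) > maxsize:
                cache.popitem(last=False)    # evict least recently used
            return result

        wrapper.cache_info = lambda: CacheInfo(
            stats["hits"], stats["misses"], maxsize, len(cache)
        )
        return wrapper
    return decorator

@async_lru(maxsize=2)
async def square(x):
    return x * x

async def main():
    await square(2)   # miss
    await square(2)   # hit
    print(square.cache_info())  # CacheInfo(hits=1, misses=1, maxsize=2, currsize=1)

asyncio.run(main())
```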
Add support for TTL in async-cache.
Is it possible to set `maxsize` for AsyncTTL, as for AsyncLRU?