long2ice / fastapi-cache

fastapi-cache is a tool to cache FastAPI responses and function results, with backend support for Redis and Memcached.

Home Page: https://github.com/long2ice/fastapi-cache

License: Apache License 2.0

Languages: Python 97.83%, Makefile 1.19%, Jinja 0.97%
Topics: cache, fastapi, redis, memcached

fastapi-cache's Introduction

fastapi-cache


Introduction

fastapi-cache is a tool to cache FastAPI endpoint and function results, with backends supporting Redis, Memcached, and Amazon DynamoDB.

Features

  • Supports redis, memcache, dynamodb, and in-memory backends.
  • Easy integration with FastAPI.
  • Support for HTTP cache headers like ETag and Cache-Control, as well as conditional If-None-Match requests.

Requirements

  • FastAPI
  • redis when using RedisBackend.
  • memcache when using MemcacheBackend.
  • aiobotocore when using DynamoBackend.

Install

> pip install fastapi-cache2

or

> pip install "fastapi-cache2[redis]"

or

> pip install "fastapi-cache2[memcache]"

or

> pip install "fastapi-cache2[dynamodb]"

Usage

Quick Start

from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache

from redis import asyncio as aioredis

@asynccontextmanager
async def lifespan(_: FastAPI) -> AsyncIterator[None]:
    redis = aioredis.from_url("redis://localhost")
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
    yield


app = FastAPI(lifespan=lifespan)


@cache()
async def get_cache():
    return 1


@app.get("/")
@cache(expire=60)
async def index():
    return dict(hello="world")

Initialization

First you must call FastAPICache.init during FastAPI startup; this is where you set global configuration.

Use the @cache decorator

If you want to cache a FastAPI response transparently, you can use the @cache decorator between the router decorator and the view function.

  • expire (int): sets the caching time in seconds.
  • namespace (str, default ""): namespace to use to store certain cache items.
  • coder (Coder, default JsonCoder): which coder to use, e.g. JsonCoder.
  • key_builder (KeyBuilder callable, default default_key_builder): which key builder to use.
  • injected_dependency_namespace (str, default "__fastapi_cache"): prefix for injected dependency keywords.
  • cache_status_header (str, default "X-FastAPI-Cache"): name for the response header indicating whether the request was served from cache; either HIT or MISS.

You can also use the @cache decorator on regular functions to cache their result.
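For example, a minimal sketch (the function, namespace, and arguments are illustrative, and it assumes FastAPICache.init has already been called):

from fastapi_cache.decorator import cache


@cache(namespace="math", expire=30)
async def expensive_sum(a: int, b: int) -> int:
    # The result is cached for 30 seconds under a key derived from the
    # module, function name, and arguments; subsequent calls with the
    # same arguments return the cached value instead of re-running.
    return a + b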

Injected Request and Response dependencies

The cache decorator injects dependencies for the Request and Response objects, so that it can add cache control headers to the outgoing response and return a 304 Not Modified response when the incoming request has a matching If-None-Match header. This only happens if the decorated endpoint doesn't already list these dependencies.

The keyword arguments for these extra dependencies are named __fastapi_cache_request and __fastapi_cache_response to minimize collisions. Use the injected_dependency_namespace argument to @cache to change the prefix used if those names would clash anyway.
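For example, a sketch of overriding the prefix (the endpoint is illustrative, and the derived keyword names are an assumption based on the default __fastapi_cache_request/__fastapi_cache_response pattern described above):

@app.get("/books")
@cache(expire=60, injected_dependency_namespace="my_cache")
async def books():
    # Assumed: the injected keywords become my_cache_request and
    # my_cache_response instead of the __fastapi_cache-prefixed defaults.
    return {"books": []}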

Supported data types

When using the (default) JsonCoder, the cache can store any data type that FastAPI can convert to JSON, including Pydantic models and dataclasses, provided that your endpoint has a correct return type annotation. An annotation is not needed if the return type is a standard JSON-supported Python type such as a dictionary or a list.

E.g. for an endpoint that returns a Pydantic model named SomeModel, the return annotation is used to ensure that the cached result is converted back to the correct class:

from .models import SomeModel, create_some_model

@app.get("/foo")
@cache(expire=60)
async def foo() -> SomeModel:
    return create_some_model()

It is not sufficient to configure a response model in the route decorator; the cache needs to know what the method itself returns. If no return type annotation is given, the primitive JSON type is returned instead.
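For instance, an illustrative sketch reusing SomeModel from the example above:

@app.get("/bar", response_model=SomeModel)  # helps FastAPI document/validate, but not the cache
@cache(expire=60)
async def bar() -> SomeModel:  # the cache decoder relies on this annotation
    return create_some_model()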

For broader type support, use the fastapi_cache.coder.PickleCoder or implement a custom coder (see below).
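For example, a minimal sketch using PickleCoder (the datetime payload is illustrative):

from datetime import datetime

from fastapi_cache.coder import PickleCoder
from fastapi_cache.decorator import cache


@cache(expire=60, coder=PickleCoder)
async def last_refresh() -> datetime:
    # datetime objects are not natively JSON-serializable, but pickle
    # round-trips them without needing a return annotation.
    return datetime.now()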

Custom coder

By default, JsonCoder is used. You can write a custom coder to encode and decode cache results; it just needs to inherit from fastapi_cache.coder.Coder.

from typing import Any
import orjson
from fastapi.encoders import jsonable_encoder
from fastapi_cache import Coder

class ORJsonCoder(Coder):
    @classmethod
    def encode(cls, value: Any) -> bytes:
        return orjson.dumps(
            value,
            default=jsonable_encoder,
            option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SERIALIZE_NUMPY,
        )

    @classmethod
    def decode(cls, value: bytes) -> Any:
        return orjson.loads(value)


@app.get("/")
@cache(expire=60, coder=ORJsonCoder)
async def index():
    return dict(hello="world")

Custom key builder

By default, the built-in default_key_builder is used; it creates a cache key from the function module and name, plus the positional and keyword arguments converted to their repr() representations, encoded as an MD5 hash. You can provide your own by passing a key builder into @cache(), or into FastAPICache.init() to apply it globally.

For example, if you wanted to use the request method, URL and query string as a cache key instead of the function identifier you could use:

def request_key_builder(
    func,
    namespace: str = "",
    *,
    request: Request = None,
    response: Response = None,
    **kwargs,
):
    return ":".join([
        namespace,
        request.method.lower(),
        request.url.path,
        repr(sorted(request.query_params.items()))
    ])


@app.get("/")
@cache(expire=60, key_builder=request_key_builder)
async def index():
    return dict(hello="world")

Backend notes

InMemoryBackend

The InMemoryBackend stores cache data in memory and only evicts an entry when that expired key is next accessed. This means that if a function's data is never requested again after being cached, the stale data will not be removed automatically.
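A minimal initialization sketch (no extra dependencies are needed for this backend):

from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend

# Note: the store lives in process memory, so each uvicorn worker
# keeps its own independent cache.
FastAPICache.init(InMemoryBackend(), prefix="fastapi-cache")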

RedisBackend

When using the Redis backend, please make sure you pass in a redis client that does not decode responses (decode_responses must be False, which is the default). Cached data is stored as bytes (binary), decoding these in the Redis client would break caching.
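For example, a sketch of a conforming client (redis://localhost is a placeholder URL):

from redis import asyncio as aioredis

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend

# decode_responses defaults to False; leave it that way so cached
# bytes round-trip unmodified.
redis = aioredis.from_url("redis://localhost")
FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")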

Tests and coverage

coverage run -m pytest
coverage html
xdg-open htmlcov/index.html

License

This project is licensed under the Apache-2.0 License.

fastapi-cache's People

Contributors

agri1988, charlesperrotminothchb, cnkailyn, cpbotha, dependabot[bot], dveleztx, geo-mathijs, hackjammer, heliumbrain, jegork, joeflack4, lccmrx, long2ice, mjpieters, mkdir700, mrreadiness, naoki-jarvisml, rtruran, rushilsrivastava, s-rigaud, schmocker, shershen08, squaresmile, thentgesmindee, uriyyo, vvanglro


fastapi-cache's Issues

Not caching requests except for GET

Hello!

I do not completely understand why caching is disabled for all methods except GET, and if that is intended, why is the request passed as the first parameter to the function?

In my case, which is something along the lines of:

@cache
def redirect(request: Request):
    pass

this leads to an error, because the parameter request is passed twice (once as request, and again in args/kwargs).

Thanks!

fastapi-cache 0.1.4: TypeError: Unicode-objects must be encoded before hashing

Hi,

I updated fastapi-cache from GitHub to solve the "not JSON serializable" problem, but now I get a new error:

  File "/lib/python3.9/site-packages/fastapi_cache/decorator.py", line 40, in inner
    cache_key = key_builder(
  File "/lib/python3.9/site-packages/fastapi_cache/key_builder.py", line 21, in default_key_builder
    + hashlib.md5(  # nosec:B303
TypeError: Unicode-objects must be encoded before hashing

Before (with version 1.3.4) I was using something like the code below, but with the two commented-out lines enabled and returning json_updated_user:

@router.get("/me/items/", response_model=List[schemas.User_Pydantic])
@cache(namespace="test", expire=60, coder=JsonCoder)
async def read_current_user_items(user: UserDB = Depends(fastapi_users.get_current_active_user)):
    user_cards = await schemas.Card_Pydantic.from_queryset(
        Card.filter(owner_id=user.id).all())
    for city in user_cards:
        await weather_api.get_api_info_by_id(city.id)
    # user_with_items = await schemas.User_Pydantic.from_queryset(UserModel.filter(id=user.id).all())
    # json_updated_user = jsonable_encoder(user_with_items)
    return await schemas.User_Pydantic.from_queryset(UserModel.filter(id=user.id).all())

Without jsonable_encoder I get the "not JSON serializable" error, but now I get the hash error.
If I comment out the @cache decorator, it works fine (and I don't need jsonable_encoder).

Thanks in advance!

how to check if the data in cache is empty and then load data into the indicated cache?

For example, how can I check whether the cache "test" used by the function section_data below contains no data, and how can I load new data into it if it is empty?

def my_key_builder(
    func,
    namespace: Optional[str] = "",
    request: Request = None,
    response: Response = None,
    *args,
    **kwargs,
):
    prefix = FastAPICache.get_prefix()
    cache_key = f"{prefix}:{namespace}:{func.__module__}:{func.__name__}:{args}:{kwargs}"
    return cache_key

@app.get('/data', response_model=List[Dict])
@cache(namespace="test", expire=60, key_builder=my_key_builder)
async def section_data():
    data = func('section')
    return data

Thanks a lot

Object of type JSONResponse is not JSON serializable

In my controller file I have the following:

from fastapi_cache.decorator import cache
from fastapi_cache.coder import JsonCoder

...

@router.post("/questions/{type}")
@cache(expire=15, coder=JsonCoder)
async def read_item(type: str):
    # get data from 3rd part API
    questions = get_questions(type)
    json_compatible_item_data = jsonable_encoder({"items": questions, "result": "success"})
    return JSONResponse(content=json_compatible_item_data)

What I get is:

  json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type JSONResponse is not JSON serializable

[question] Cache results respecting raw request path?

I have a function that utilizes the raw starlette request to parse the path_parameters in order to provide dynamic content. Right now, the cache implementation returns the same cached result regardless of the path. Would there be any way to intercept the args/kwargs in the decorator to provide a more in-depth cache system?

My code looks something like:

@cache()
async def get(request: Request):
    # return content based on request[some_key]

Default key builder does not work with memcached

Memcached does not allow spaces in keys, so when the default key builder is used with an endpoint that has dependencies, a space can be introduced when an object is represented as a string.
Problem example:

@router.get("/",
             status_code=200,
             response_model=list)
async def distinct_values(field: str,
                              es: Elasticsearch = Depends(ElasticConnector)):

This results in a ValidationError.

Proposed solution:

def mcache_key_builder(
    func,
    namespace: Optional[str] = "",
    request: Request = None,
    response: Response = None,
    *args,
    **kwargs,
):
    prefix = FastAPICache.get_prefix()
    cache_key = f"{prefix}:{namespace}:{func.__module__}:{func.__name__}:{args}:{kwargs}".replace(" ", "")
    return cache_key

[QUESTION] Is there an ability to run a function when the cache is used or key is set?

Is it possible to run a function whenever the cache is used or a cache key is set? Ideally something similar to the custom_key_coder, for example:

def key_set_function(
    func,
    namespace: Optional[str] = "",
    request: Request = None,
    response: Response = None,
    *args,
    **kwargs,
):
    # Do something
    return something

@app.get("/")
@cache(expire=60, coder=JsonCoder , key_set_handler=key_set_function)
async def index(request: Request, response: Response):
    return dict(hello="world")

I'm aware that aiocache has hooks, implemented via 'plugins', which you can run before and after a new key is set. Thanks!

cache invalid default value

@cache() has expire: int = None, which is invalid for memcached. This parameter should therefore either be mandatory or have a sensible default.

Honor If-Modified-Since in request headers

This would be very useful in combination with Last-Modified in response headers, especially for SEO. The challenge would be how to elegantly decide whether the resource has been modified or not.

Help understanding cache

So I have an endpoint I need to cache for a short time, because it can get a lot of identical requests at the same time.
I use the InMemory cache, and the request takes a second or more to process.
If I get, say, 100 requests for the route "at the same time", would they all run the process because the cache wasn't ready, or would they all wait for the first one to finish and use its cached result?

[Errno 111] Connection refused

My application is fully dockerized and was working fine with FastAPI.
I'm trying to add caching, but it seems to produce an error.

I'm getting an error trying to create_redis_pool. I tried using localhost, and also other ports, without any luck.

@app.on_event("startup")
async def startup():
    redis = await aioredis.create_redis_pool("redis://127.0.0.1", encoding="utf8")
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")

The ports are also mapped in docker-compose.yml.

Honor Cache-Control in request headers

Do not pull from the cache when the incoming request has the header Cache-Control: no-cache. In addition to that, do not store anything in the cache if the request has the header Cache-Control: no-store.

Error thrown: TypeError: object list can't be used in 'await' expression

File "/usr/local/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 398, in run_asgi
result = await app(self.scope, self.receive, self.send)
File "/usr/local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in call
return await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/fastapi/applications.py", line 208, in call
await super().call(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/applications.py", line 112, in call
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 181, in call
raise exc from None
File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in call
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.8/site-packages/timing_asgi/middleware.py", line 68, in call
await self.app(scope, receive, send_wrapper)
File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 86, in call
await self.simple_response(scope, receive, send, request_headers=headers)
File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 142, in simple_response
await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in call
raise exc from None
File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in call
await self.app(scope, receive, sender)
File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 580, in call
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 241, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 52, in app
response = await func(request)
File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 226, in app
raw_response = await run_endpoint_function(
File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 159, in run_endpoint_function
return await dependant.call(**values)
File "/usr/local/lib/python3.8/site-packages/fastapi_cache/decorator.py", line 47, in inner
ret = await func(*args, **kwargs)
TypeError: object list can't be used in 'await' expression

Depends Cache

Hi, my dependency injection calls are not cached. Can I somehow avoid calling/triggering the dependency injection (current_user) in this example?

@router.get(
    "/test",
)
@cache_one_hour
async def test(
        request: Request,
        test: TestDependency = Depends()
):
    response = await test.get()
    return ORJSONResponse(content={'test': 1})

class TestDependency:
    def __init__(
            self,
            manager = Depends(get_query_manager),
    ):
...

async def get_query_manager(
        user: User = Depends(current_user)
):
...

[Question] Different expire value depending on HTTP response

Is there a possibility to set a different expire value for cached responses depending on the status code from an external API?
E.g. if the status code is 200 it should be cached for a longer period of time (2 months), and if it's 400 it should be cached for 1 day.

Response headers are sent only on the second request.

I noticed that the Cache-Control and ETag headers are not included in the first response. It takes a second request to generate and include these headers in the response. From looking at decorator.py, it seems that the reason is that the ETag header is a hash of the retrieved cache entry. The cache entry is created on the first request, and is not yet available for the first response.

However, it's possible to include the response headers already in the first response. I created a modified version of the @cache decorator, which seems to work fine (unless I'm overlooking something). I can create a PR if you like:

def cache(
    expire: int = None,
    coder: Type[Coder] = None,
    key_builder: Callable = None,
    namespace: Optional[str] = "",
):
    """
    cache all function
    :param namespace:
    :param expire:
    :param coder:
    :param key_builder:
    :return:
    """

    def wrapper(func):
        @wraps(func)
        async def inner(*args, **kwargs):
            nonlocal coder
            nonlocal expire
            nonlocal key_builder
            copy_kwargs = kwargs.copy()
            request = copy_kwargs.pop("request", None)
            response = copy_kwargs.pop("response", None)
            if (
                request and request.headers.get("Cache-Control") == "no-store"
            ) or not FastAPICache.get_enable():
                return await func(*args, **kwargs)

            coder = coder or FastAPICache.get_coder()
            expire = expire or FastAPICache.get_expire()
            key_builder = key_builder or FastAPICache.get_key_builder()
            backend = FastAPICache.get_backend()

            cache_key = key_builder(
                func, namespace, request=request, response=response, args=args, kwargs=copy_kwargs
            )
            ttl, ret_encoded = await backend.get_with_ttl(cache_key)
            if not request:
                if ret_encoded is not None:
                    return coder.decode(ret_encoded)
                ret = await func(*args, **kwargs)
                await backend.set(cache_key, coder.encode(ret), expire or FastAPICache.get_expire())
                return ret

            if request.method != "GET":
                return await func(request, *args, **kwargs)
            if_none_match = request.headers.get("if-none-match")

            if ret_encoded is None:
                ret = await func(*args, **kwargs)
                ret_encoded = coder.encode(ret)
                ttl = expire or FastAPICache.get_expire()
                await backend.set(cache_key, ret_encoded, ttl)
            else:
                ret = coder.decode(ret_encoded)

            if response:
                response.headers["Cache-Control"] = f"max-age={ttl}"
                etag = f"W/{hash(ret_encoded)}"
                if if_none_match == etag:
                    response.status_code = 304
                    return response
                response.headers["ETag"] = etag

            return ret

        return inner

    return wrapper

leaving "expire" empty results in TypeError

when using the decorator

@app.get("/foo")
@cache(coder=JsonCoder)
async def foo_bar(bar: int = 0):
    return bar

not defining expire results in a TypeError exception:

File "...\venv\lib\site-packages\fastapi_cache\backends\inmemory.py", line 46, in set
self._store[key] = Value(value, self._now + expire)
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'

fix:

fastapi_cache/backends/inmemory.py

    async def set(self, key: str, value: str, expire: int = None):
        async with self._lock:
            self._store[key] = Value(value, self._now + expire)

instead of None, have the default value for expire be whatever value it needs to be to reach the maximum timestamp possible

Alternative PyPI package names not available

The README.md says the following are both literally possible.

pip install fastapi-cache2[redis]
pip install fastapi-cache2[memcache]

But they are not:

pip install fastapi-cache2[redis]
zsh: no matches found: fastapi-cache2[redis]

I'm also not seeing these package names on PyPI.

[question] how to cache 200 http responses only?

I'm not sure how to keep 500 and 400 errors out of the cache when using InMemoryBackend. My use case depends on an external provider, but I want to cache only successful responses; if the provider fails, I want to keep asking for the data.
I tried to inject a 'no-cache' header, but request headers are immutable.

Not working on class method?

Not working example:

class A:

    @cache(namespace="app", expire=3600)
    async def test(self, a=True):
        print("test")
        return [], [], []

Working example:

class A:

    @staticmethod
    @cache(namespace="app", expire=3600)
    async def test(a=True):
        print("test")
        return [], [], []

Is there any way to cache data on a class method?

In-memory cache misbehaves when FastAPI is started with uvicorn and multiple workers

At first I thought it was because _store is not a process-shared variable, so I changed it to:

from multiprocessing import Manager
class InMemoryBackend:
    _store = Manager().dict()
    _lock = Lock()

I tried this, but it still doesn't work: when a request is handled by a different process, the data is still not shared.
This is where I use it in my code: https://github.com/vvanglro/fastapi-tortoise-orm/blob/1c7b8bec789ac29455def1e31782a5901d5cb030/coronavirus/main.py#L184
This is where I modified InMemoryBackend: https://github.com/vvanglro/fastapi-tortoise-orm/blob/1c7b8bec789ac29455def1e31782a5901d5cb030/coronavirus/cache.py#L17

Option to not set response header

By default, Cache-Control: max-age=<cache-ttl> is set on the response. I'd like an option to disable this and instead prevent intermediate caches or the client from caching the resource, so I can invalidate it (FastAPICache.clear()) in case of changes.

It reports an error, "got multiple values for argument", when I use the request object directly

My code is as follows:

@app.post('/oneline', response_model=ClassifyResult)
@cache()
async def predict(sentences: Sentences,request:Request):
    texts = sentences.texts
    type = sentences.type

It reports error

File "C:\Users\fang_\anaconda3\envs\hmqf_dl\lib\site-packages\fastapi_cache\decorator.py", line 52, in inner
TypeError: predict() got multiple values for argument 'sentences'

after debugging the code, I found that the reason may be these lines in decorator.py:

if request.method != "GET":
    return await func(request, *args, **kwargs)

**kwargs already contains the request argument, so request is passed twice

after I modified the code to

if request.method != "GET":
    return await func(*args, **kwargs)

it runs OK, but I don't know whether this has other effects.

So was it a bug, or am I using it the wrong way?

Way to dynamically disable/deactivate caching functionality?

Description

I'm having an issue where I want to debug certain endpoints locally, but since those responses are cached, my breakpoints are never reached.

What's the best way to temporarily disable caching when I'm in development/debug mode?

What I've tried

I added this condition to only initialize FastAPICache in production:

@app.on_event("startup")
async def startup():
    """Start up FastAPI server"""
    app.state.graph = TccmGraph()
    app.state.graph.connect()
    redis = aioredis.from_url(
        url=get_settings().redis_url,
        encoding="utf8",
        decode_responses=True)
    if get_settings().environment_name == 'production':
        FastAPICache.init(
            backend=RedisBackend(redis),
            prefix="fastapi-cache")

But, expectedly, I'm getting the following error:

  File "/Users/joeflack4/virtualenvs/ccdh-terminology-service/lib/python3.9/site-packages/fastapi_cache/__init__.py", line 35, in get_backend
    assert cls._backend, "You must call init first!"  # nosec: B101
AssertionError: You must call init first!

Possible solutions

What's the best way to handle this?

  1. Create my own cache decorator wrapper w/ custom logic around FastAPI's?
  2. Initialize FastAPICache.init() but do it differently?

support `filter` argument in cache decorator

It would be nice to have an optional parameter such as filter in the @cache decorator that takes a Python callable receiving the wrapped function's *args and **kwargs and returns a boolean indicating whether or not to cache.

The default could just be always true, e.g. lambda *args, **kwargs: True, but users may want more nuanced control based on the parameters of the function itself. In my particular case, I can't quite use the Cache-Control header because I want to decide whether or not to cache on the server side, based on query params.

If I get some time, I can jam on this and submit a PR, if you think it could be worthwhile?

Cache-Control not working

Hi,

I've been using your lib for a while now, and it's amazing!
But recently I've found out that Cache-Control: no-cache wasn't working.

Here is my code and some screenshots so you can understand my case:

import random
import string
import aioredis
from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache

app = FastAPI()

@app.get("/")
@cache(expire=60)
async def index(request: Request, response: Response):
    letters = string.ascii_lowercase
    rand = ''.join(random.choice(letters) for i in range(10))
    print(rand)
    return dict(hello=rand)

@app.on_event("startup")
async def startup():
    redis =  aioredis.from_url("redis://localhost", encoding="utf8", decode_responses=True)
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")

I've used something similar to the README.md.
I changed the response so we can see that I'm generating a new response on each call of the route.
Keep in mind that I print the newly generated string inside the function, so I know exactly when a new string is created.

[screenshot] As expected, a new string is generated and printed to the console.

[screenshot] But here you can see that I've added Cache-Control on the following requests, and it still retrieves the cached string.

[screenshot] After waiting out the 60 seconds, a new string is successfully generated.

Am I doing something wrong?

Cannot connect to Redis in Docker container

After calling an API endpoint decorated by fastapi-cache:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 269, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 124, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.10/site-packages/starlette/middleware/cors.py", line 84, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/starlette/exceptions.py", line 93, in __call__
    raise exc
  File "/usr/local/lib/python3.10/site-packages/starlette/exceptions.py", line 82, in __call__
    await self.app(scope, receive, sender)
  File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 670, in __call__
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 266, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 65, in app
    response = await func(request)
  File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 227, in app
    raw_response = await run_endpoint_function(
  File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 160, in run_endpoint_function
    return await dependant.call(**values)
  File "/usr/local/lib/python3.10/site-packages/fastapi_cache/decorator.py", line 45, in inner
    ttl, ret = await backend.get_with_ttl(cache_key)
  File "/usr/local/lib/python3.10/site-packages/fastapi_cache/backends/redis.py", line 14, in get_with_ttl
    return await (pipe.ttl(key).get(key).execute())
  File "/usr/local/lib/python3.10/site-packages/aioredis/client.py", line 4618, in execute
    conn = await self.connection_pool.get_connection("MULTI", self.shard_hint)
  File "/usr/local/lib/python3.10/site-packages/aioredis/connection.py", line 1416, in get_connection
    await connection.connect()
  File "/usr/local/lib/python3.10/site-packages/aioredis/connection.py", line 698, in connect
    raise ConnectionError(self._error_message(e))
aioredis.exceptions.ConnectionError: Error 111 connecting to 127.0.0.1:6379. 111.

My Dockerfile is:

FROM python:3.10

# INSTALL REDIS
RUN curl -fsSL https://packages.redis.io/gpg | gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg
RUN echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] https://packages.redis.io/deb $(lsb_release -cs) main" | tee /etc/apt/sources.list.d/redis.list
RUN tee /etc/apt/sources.list.d/redis.list
RUN apt-get update
RUN apt-get -qq install redis
RUN apt-get -qq install redis-server

#
WORKDIR /code

#
COPY ./requirements.txt /code/requirements.txt

#
RUN pip -q install --no-cache-dir --upgrade -r /code/requirements.txt --use-deprecated=legacy-resolver
#


COPY . /code

# the -h flag specifies the host name and the --daemonize flag runs it in the background
CMD ["redis-server","-h", "127.0.0.1", "--daemonize", "yes"]
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]

And the main.py file:

import aioredis
from fastapi import FastAPI, status, Query
from fastapi.middleware.cors import CORSMiddleware
from starlette.requests import Request
from starlette.responses import Response
from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache

@app.get("/")
@cache(expire=60)
async def index(request: Request, response: Response):
    return dict(hello="world")

@app.on_event("startup")
async def startup():
    redis =  aioredis.from_url("redis://127.0.0.1", encoding="utf8", decode_responses=True)
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")

Any assistance is much appreciated! Thanks a lot!

Does not work after upgrading to redis-py

Hi!

After upgrading to redis-py, the snippet provided in the example doesn't work correctly. It fails with the following message:

  File "/root/.cache/pypoetry/virtualenvs/backend-9TtSrW0h-py3.9/lib/python3.9/site-packages/fastapi/routing.py", line 226, in app
    raw_response = await run_endpoint_function(
  File "/root/.cache/pypoetry/virtualenvs/backend-9TtSrW0h-py3.9/lib/python3.9/site-packages/fastapi/routing.py", line 159, in run_endpoint_function
    return await dependant.call(**values)
  File "/root/.cache/pypoetry/virtualenvs/backend-9TtSrW0h-py3.9/lib/python3.9/site-packages/fastapi_cache/decorator.py", line 53, in inner
    ttl, ret = await backend.get_with_ttl(cache_key)
  File "/root/.cache/pypoetry/virtualenvs/backend-9TtSrW0h-py3.9/lib/python3.9/site-packages/fastapi_cache/backends/redis.py", line 14, in get_with_ttl
    return await (pipe.ttl(key).get(key).execute())
AttributeError: 'coroutine' object has no attribute 'get'

We did some tests, and it seems that the snippet provided in the README works well with aioredis.
However, when we change import aioredis to from redis import asyncio as aioredis, as advised in aioredis's docs, the code fails with the same error as above.

Testing with FastAPI TestClient - not calling init() first

I am testing using the FastAPI/Starlette TestClient. This means that @app.on_event("startup") never runs, and that is where I call init(). So when the decorator is used during testing, the "You must call init first!" error is thrown.

How do you advise I proceed? What's the simplest way I can skip caching if I'm not running the app with a web server?

Thanks.

[Question]: How would I handle redis failure?

Wondering how to go about handling Redis failure and keeping app availability

I know that we should treat Redis like any other database, but is there any way to handle Redis failure, or to restore Redis cache usage once Redis is back up, without having to restart the application?

Caching an Auth token validation until expiry

I receive an auth token from the frontend with an expiry date in the future. I would like to cache it until that expiry date, but the problem is that the expiry date cannot be known until the token has been validated once. Here is an example of the function. It currently caches the response for 60 seconds, but I would like it to be timedelta(token_expiry_timestamp - time.now()).

@cache(namespace="auth", expire=60)
def validate_token(auth_token):
    # first I validate the token
    # then I decode it to obtain user_email and token_expiry_timestamp
    #
    
   def main():
   # some code
   validate_token(auth_token)

I was thinking of returning token_expiry_timestamp from validate_token. Then, in main() I'd clear the cache for this specific case if token_expiry_timestamp has already passed. Is there a better way to do this?

Calling FastAPICache.clear when not initialized

__version__ = "0.1.9"

The .clear method needs to be callable without .init having been called first.

There is a genuine use case for this: a third-party package may or may not have called FastAPICache.init.

Therefore, calling FastAPICache.clear should not rely on properties being set, such as cls._prefix, or even need the namespace. The namespace parameter defaults to None, but that case is not handled in the code.

Currently, calling FastAPICache.clear results in this error:

await FastAPICache.clear()
  File ".../lib/python3.9/site-packages/fastapi_cache/__init__.py", line 64, in clear
    return await cls._backend.clear(namespace, key)
AttributeError: 'NoneType' object has no attribute 'clear'

Could we discuss modifying the behaviour, so:

  • Calling FastAPICache.clear can pass silently if it has not been initialised before.
  • Not passing any params (leaving defaults), would result in a global clearance of the cache.

On a side note, an alternative would be for the FastAPICache class not to act as a singleton, but instead be able to handle multiple caches.

Thoughts, please?
