iam-abbas / fastapi-production-boilerplate
FastAPI Production Template

A scalable and production ready boilerplate for FastAPI

Project Overview

This boilerplate follows a layered architecture with a model layer, a repository layer, a controller layer, and an API layer. The directory structure isolates boilerplate code in the core directory, which requires minimal attention, so new features can be developed quickly, and the layout stays predictable. The project's primary objective is to offer a production-ready boilerplate with a good developer experience and readily available, widely used features such as authentication, authorization, database migrations, and type checking, which are discussed in detail in the Features section.

Features

  • Python 3.11+ support
  • SQLAlchemy 2.0+ support
  • Asynchronous capabilities
  • Database migrations using Alembic
  • Basic Authentication using JWT
  • Row Level Access Control for permissions
  • Redis for caching
  • Celery for background tasks
  • Testing suite
  • Type checking using mypy
  • Dockerized database and Redis
  • Readily available CRUD operations
  • Linting using pylint
  • Formatting using black

Installation Guide

You need the following to run this project: Python 3.11+, Poetry, and Docker with Docker Compose.

I use asdf to manage my Python versions; you can use it too. However, it is only supported on Linux and macOS. On Windows, you can use something like pyenv.

Once you have installed the above and cloned the repository, follow these steps to get the project up and running:

  1. Create a virtual environment using poetry:
poetry shell
  2. Install the dependencies:
poetry install
  3. Run the database and Redis containers:
docker-compose up -d
  4. Copy the .env.example file to .env and update the values as needed.

  5. Run the migrations:

make migrate
  6. Run the server:
make run

The server should now be running on http://localhost:8000 and the API documentation should be available at http://localhost:8000/docs.

Usage Guide

The project is designed to be modular and scalable. There are 3 main directories in the project:

  1. core: This directory contains the central part of the project: most of the boilerplate code such as security dependencies, database connections, configuration, and middlewares, as well as the base classes for the models, repositories, and controllers. The core directory is designed to be as generic and minimal as possible and can be reused in any project. While building features you should rarely need to modify it, except to add more controllers to the Factory class in core/factory.py.

  2. app: This directory contains the actual application code: the models, repositories, controllers, and schemas. This is the directory you will spend most of your time in while building features. It has the following sub-directories:

    • models: where you add new tables.
    • repositories: for each model, you create a repository. This is where you add the CRUD operations for the model.
    • controllers: for each logical unit of the application, you create a controller. This is where you add the business logic.
    • schemas: where you add the schemas used for validation and serialization/deserialization of data.
  3. api: This directory contains the API layer of the application. It holds the API router and is where you add the API endpoints.

Advanced Usage

The boilerplate contains a lot of features, some of which are used in the example application and some of which are not. The following sections describe them in detail.

Database Migrations

The migrations are handled by Alembic and stored in the migrations directory. To create a new migration, run:

make generate-migration

It will ask you for a migration message. Once you enter it, a new migration file is created in the migrations directory. You can then apply the migrations with:

make migrate

If you need to downgrade or reset the database, use make rollback or make reset-database respectively.

Authentication

The authentication is a basic implementation of JWT with a bearer token. When the bearer token is supplied in the Authorization header, it is verified and the user is automatically authenticated by middleware that sets request.user.id. To use the user model in any endpoint, use the get_current_user dependency. To enforce authentication on an endpoint, use the AuthenticationRequired dependency, which raises an HTTPException if the user is not authenticated.
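The verification step itself can be sketched with stdlib tools only. This is not the project's JWTHandler, just an illustration of what bearer-token verification does; the secret and claim names below are made up:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # in the boilerplate the secret comes from .env/config

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode(payload: dict) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    sig = _b64(hmac.new(SECRET, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def decode(token: str) -> dict:
    # The middleware does the equivalent of this check before setting request.user.id.
    header, body, sig = token.split(".")
    expected = _b64(hmac.new(SECRET, f"{header}.{body}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid token signature")
    return json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))

token = encode({"user_id": 1})
assert decode(token) == {"user_id": 1}
```

In practice you would use a maintained library such as PyJWT rather than hand-rolling this.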

Row Level Access Control

The boilerplate contains a custom row-level permissions management module, inspired by fastapi-permissions and located in core/security/access_control.py. You can use it to enforce different permissions for different models. The module operates on Principals and permissions. Every user has their own set of principals, which must be set using a function; check core/fastapi/dependencies/permissions.py for an example. The principals are then used to check the user's permissions, which must be defined at the model level; check app/models/user.py for an example. You can then use the dependency directly in the route to raise an HTTPException if the user does not have the required permission. Below is an incomplete example:

from fastapi import APIRouter, Depends
from sqlalchemy import Column, Integer, String
from core.security.access_control import AccessControl, UserPrincipal, RolePrincipal, Allow
from core.database import Base
from core.fastapi.dependencies import get_current_user  # import path may differ

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    email = Column(String, unique=True)
    password = Column(String)
    role = Column(String)

    def __acl__(self):
        # Row-level permissions: the owner may view, admins may delete.
        return [
            (Allow, UserPrincipal(self.id), "view"),
            (Allow, RolePrincipal("admin"), "delete"),
        ]

def get_user_principals(user: User = Depends(get_current_user)):
    return [UserPrincipal(user.id)]

Permission = AccessControl(get_user_principals)

router = APIRouter()

@router.get("/users/{user_id}")
def get_user(user_id: int, assert_access=Depends(Permission("view"))):
    user = ...  # fetch the user, e.g. via the user repository
    assert_access(user)  # raises HTTPException if "view" is not permitted
    return user
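To make the mechanics concrete, here is a self-contained, hypothetical re-implementation of the principal/ACL matching idea. The names mirror the example above, but this is not the project's AccessControl module:

```python
# Hypothetical minimal sketch of principal/ACL matching; the real module
# lives in core/security/access_control.py and differs in detail.
Allow = "allow"

class Principal:
    def __init__(self, value):
        self.value = value
    def __eq__(self, other):
        return type(self) is type(other) and self.value == other.value
    def __hash__(self):
        return hash((type(self), self.value))

class UserPrincipal(Principal): pass
class RolePrincipal(Principal): pass

def has_permission(principals, acl, permission) -> bool:
    # Grant only if some (Allow, principal, permission) ACL entry matches.
    return any(
        action == Allow and principal in principals and perm == permission
        for action, principal, perm in acl
    )

acl = [(Allow, UserPrincipal(1), "view"), (Allow, RolePrincipal("admin"), "delete")]
assert has_permission([UserPrincipal(1)], acl, "view")
assert not has_permission([UserPrincipal(1)], acl, "delete")
```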

Caching

You can directly use the Cache.cached decorator from core.cache. Example:

from core.cache import Cache

@Cache.cached(prefix="user", ttl=60)
def get_user(user_id: int):
    ...

Celery

The Celery worker is already configured for the app. You can add your tasks in worker/. To run the Celery worker, run the following command:

make celery-worker

Session Management

Sessions are already handled by the middleware and the get_session dependency, which is injected into the repositories through FastAPI dependency injection inside the Factory class in core/factory.py. There is also a Transactional decorator which can be used to wrap functions that need to be executed in a transaction. Example:

@Transactional()
async def some_mutating_function():
    ...

Note: The decorator already handles the commit and rollback of the transaction. You do not need to do it manually.
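The commit/rollback behavior can be sketched like this; the FakeSession and decorator below are a hypothetical stand-in, not the project's Transactional implementation:

```python
import asyncio
from functools import wraps

class FakeSession:
    """Stand-in for the request-scoped database session."""
    def __init__(self):
        self.committed = False
        self.rolled_back = False
    async def commit(self):
        self.committed = True
    async def rollback(self):
        self.rolled_back = True

session = FakeSession()

def transactional(fn):
    # Commit on success, roll back on any exception, then re-raise.
    @wraps(fn)
    async def wrapper(*args, **kwargs):
        try:
            result = await fn(*args, **kwargs)
            await session.commit()
            return result
        except Exception:
            await session.rollback()
            raise
    return wrapper

@transactional
async def some_mutating_function():
    return "done"

assert asyncio.run(some_mutating_function()) == "done"
assert session.committed and not session.rolled_back
```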

If you need an isolated session, you can use the standalone_session decorator from core.database. Example:

@standalone_session
async def do_something():
    ...

Repository Pattern

The boilerplate uses the repository pattern. Every model has a repository, and all of them inherit the base repository from core/repository. The repositories are located in app/repositories and are injected into the controllers inside the Factory class in core/factory.py.

The base repository provides the basic CRUD operations. All custom operations can be added to the specific repository. Example:

from core.repository import BaseRepository
from app.models.user import User
from sqlalchemy.sql.expression import select

class UserRepository(BaseRepository[User]):
    async def get_by_email(self, email: str) -> User | None:
        query = select(User).filter(User.email == email)
        return await self._one_or_none(query)

To facilitate easier access to queries with complex joins, the BaseRepository class has a _query function (along with other handy helpers like _all() and _one_or_none()) which can be used to write complex queries easily. Example:

async def get_user_by_email_join_tasks(self, email: str):
    # join_ takes the names of the _join_{name} helpers to apply (see note below)
    query = await self._query(join_={"tasks"})
    query = query.filter(User.email == email)
    return await self._one_or_none(query)

Note: For every join you want to make, you need to create a function in the same repository following the pattern _join_{name}. Example: _join_tasks for tasks:

async def _join_tasks(self, query: Select) -> Select:
    return query.options(joinedload(User.tasks))

Controllers

Similar to repositories, every logical unit of the application has a controller. Each controller has a primary repository injected into it. The controllers are located in app/controllers.

These controllers contain all the business logic of the application. Check app/controllers/auth.py for an example.

Schemas

The schemas are located in app/schemas and are used to validate request and response bodies; they are also used to generate the OpenAPI documentation. The schemas inherit from pydantic's BaseModel and are primarily split into requests and responses, which are self-explanatory.
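A minimal sketch of the request/response split, assuming pydantic's BaseModel as described above; the schema names and fields are illustrative, not the project's actual schemas:

```python
from pydantic import BaseModel

# A "request" schema validates the incoming body...
class RegisterUserRequest(BaseModel):
    email: str
    password: str

# ...while a "response" schema controls what is serialized back to the client
# (note: no password field).
class UserResponse(BaseModel):
    id: int
    email: str

user = UserResponse(id=1, email="alice@example.com")
assert user.id == 1 and user.email == "alice@example.com"
```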

Formatting

You can use make format to format the code using black and isort.

Linting

You can use make lint to lint the code using pylint.

Testing

The project contains tests for all endpoints, some of the logical components like JWTHandler and AccessControl, and an example of testing complex inner components like BaseRepository. The tests are located in tests/. You can run them using make test.

Contributing

Contributions are highly welcome. Please open an issue or a PR if you want to contribute.

License

This project is licensed under the terms of the MIT license. See the LICENSE file.

Contributors

dependabot[bot], dka58, iam-abbas

fastapi-production-boilerplate's Issues

Is there some kind of trick in order to regenerate the OpenAPI schema?

Thanks for this repo, it is great.

I created a new model, a new repository wrapping the model, a controller, all the different pieces, but the OpenAPI schema, and thus the /docs endpoint, only shows the models that originally existed in the app.

I was wondering if you have any way to debug why the OpenAPI spec is not regenerating.

Thank you!

Authentication Middleware

Why don't we use the JWTHandler class in the authentication middleware to decode the received token, so we can also handle exceptions properly?

ImportError: cannot import name 'TypeAliasType' from 'typing_extensions'

When I run make migrate, I get this error:

$ make migrate
poetry run alembic upgrade head
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Scripts\alembic.exe\__main__.py", line 4, in <module>
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\alembic\__init__.py", line 1, in <module>
    from . import context
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\alembic\context.py", line 1, in <module>
    from .runtime.environment import EnvironmentContext
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\alembic\runtime\environment.py", line 19, in <module>
    from sqlalchemy.sql.schema import Column
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\__init__.py", line 12, in <module>
    from . import util as _util
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\util\__init__.py", line 15, in <module>
    from ._collections import coerce_generator_arg as coerce_generator_arg
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\util\_collections.py", line 39, in <module>
    from .typing import is_non_string_iterable
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\util\typing.py", line 56, in <module>
    from typing_extensions import TypeAliasType as TypeAliasType  # 3.12
ImportError: cannot import name 'TypeAliasType' from 'typing_extensions' (C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\typing_extensions.py)
make: *** [Makefile:55: migrate] Error 1

Authentication Workflow

I am able to start the server and try all of the endpoints. I saw that /users and /tasks require authentication. How can I generate a token to try out these two endpoints in Swagger? Right now I get an unauthorized response because I don't have a token.

make run
poetry run python main.py
INFO:     Will watch for changes in these directories: ['/Users/quankhuc/FastAPI-Production-Boilerplate']
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [61146] using WatchFiles
INFO:     Started server process [61152]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:52430 - "GET /docs HTTP/1.1" 200 OK
INFO:     127.0.0.1:52430 - "GET /openapi.json HTTP/1.1" 200 OK
INFO:     127.0.0.1:52430 - "GET /v1/users/me HTTP/1.1" 401 Unauthorized
INFO:     127.0.0.1:52563 - "GET /v1/tasks/ HTTP/1.1" 401 Unauthorized

Celery Task Did Not Register

I made a small example to test out Celery workers. It should get universities based on the country I query. However, it did not seem to register my task. Do you have any idea why that would be?

I created a test_celery.py in the tasks directory:

from celery import shared_task
from typing import List, Optional
import json
import httpx
from pydantic import BaseModel
from worker import celery_app

@celery_app.task(name="get_all_universities_task", bind=True)
def get_all_universities_task(self, country):
    return get_all_universities_for_country(country)

url = 'http://universities.hipolabs.com/search'


def get_all_universities_for_country(country: str) -> dict:
    print('get_all_universities_for_country ', country)
    params = {'country': country}
    client = httpx.Client()
    response = client.get(url, params=params)
    response_json = json.loads(response.text)
    universities = []
    for university in response_json:
        university_obj = University.parse_obj(university)
        universities.append(university_obj)
    return {country: universities}

class University(BaseModel):
    """Model representing a university."""

    country: Optional[str] = None  # Optional country name
    web_pages: List[str] = []  # List of web page URLs
    name: Optional[str] = None  # Optional university name
    alpha_two_code: Optional[str] = None  # Optional alpha two-code
    domains: List[str] = []  # List of domain names

And I created an API endpoint in an api/v1 directory; that endpoint triggers the task:

from fastapi import APIRouter
from worker.tasks.test_celery import get_all_universities_task
from celery.result import AsyncResult
import redis


test_celery_router = APIRouter()

@test_celery_router.get("/", status_code=200)
async def test_celery(country):
    task = get_all_universities_task.delay(country)
    return {"task_id": task.id, "task_status": task.status}

@test_celery_router.get("/status/{task_id}", status_code=200)
async def test_celery_status(task_id):
    async_result = AsyncResult(task_id)
    if async_result.ready():
        result_value = async_result.get()
        return {"task_id": task_id, "task_status": async_result.status, "task_result": result_value}
    else:
        return {"task_id": task_id, "task_status": async_result.status, "task_result": "Not ready yet!"}

When I ran make celery-worker, it seems it did not register the task "get_all_universities_task":

❯ make celery-worker
poetry run celery -A worker worker -l info


[email protected] v5.3.1 (emerald-rush)

macOS-10.16-x86_64-i386-64bit 2023-06-20 17:58:19

[config]
.> app:         worker:0x10cb337d0
.> transport:   amqp://guest:**@localhost:5672//
.> results:     redis://localhost:6379/0
.> concurrency: 10 (prefork)
.> task events: OFF (enable -E to monitor tasks in this worker)

[queues]
.> celery           exchange=celery(direct) key=celery


[tasks]


[2023-06-20 17:58:19,735: WARNING/MainProcess] /Users/quankhuc/anaconda3/envs/TradingBot/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-06-20 17:58:19,807: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2023-06-20 17:58:19,808: WARNING/MainProcess] /Users/quankhuc/anaconda3/envs/TradingBot/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-06-20 17:58:19,813: INFO/MainProcess] mingle: searching for neighbors
[2023-06-20 17:58:20,840: INFO/MainProcess] mingle: all alone
[2023-06-20 17:58:20,864: INFO/MainProcess] [email protected] ready.
[2023-06-20 17:58:21,142: INFO/MainProcess] Events of group {task} enabled by remote.
[2023-06-20 18:05:38,090: ERROR/MainProcess] Received unregistered task of type 'get_all_universities_task'.

The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you're using relative imports?

Please see
https://docs.celeryq.dev/en/latest/internals/protocol.html
for more information.

The full contents of the message body was:
'[["vietnam"], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (86b)

The full contents of the message headers:
{'lang': 'py', 'task': 'get_all_universities_task', 'id': '18094bcf-9a99-4330-94b8-96acc928edca', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '18094bcf-9a99-4330-94b8-96acc928edca', 'parent_id': None, 'argsrepr': "('vietnam',)", 'kwargsrepr': '{}', 'origin': '[email protected]', 'ignore_result': False, 'stamped_headers': None, 'stamps': {}}

The delivery info for this task is:
{'consumer_tag': 'None4', 'delivery_tag': 1, 'redelivered': False, 'exchange': '', 'routing_key': 'celery'}
Traceback (most recent call last):
  File "/Users/quankhuc/anaconda3/envs/TradingBot/lib/python3.11/site-packages/celery/worker/consumer/consumer.py", line 642, in on_task_received
    strategy = strategies[type_]
               ~~~~~~~~~~^^^^^^^
KeyError: 'get_all_universities_task'

Problems with join_ (or I can't get it)

join_ is not working well. I am trying to join {"tasks"}, but I got this error:

query = query.filter(User.id == user_id)
        ^^^^^^^^^^^^

AttributeError: 'coroutine' object has no attribute 'filter'

poetry install error

Poetry (version 1.4.1)
Python: 3.11.2
Implementation: CPython
Platform: linux
OS: posix
OS Distribution: Ubuntu 22.04.2 LTS

When running poetry install, some errors occurred:

Package operations: 50 installs, 1 update, 0 removals

• Installing parse (1.19.0): Failed

ChefBuildError

   from _ctypes import Union, Structure, Array

ModuleNotFoundError: No module named '_ctypes'

at ~/.pyenv/versions/3.11.2/lib/python3.11/site-packages/poetry/installation/chef.py:152 in _prepare
148│
149│ error = ChefBuildError("\n\n".join(message_parts))
150│
151│ if error is not None:
→ 152│ raise error from None
153│
154│ return path
155│
156│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:

Note: This error originates from the build backend, and is likely not a problem with poetry but with parse (1.19.0) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "parse (==1.19.0)"'.

• Installing wrapt (1.14.1): Failed

ChefBuildError

Backend subprocess exited when trying to invoke build_wheel
at ~/.pyenv/versions/3.11.2/lib/python3.11/site-packages/poetry/installation/chef.py:152 in _prepare
148│
149│ error = ChefBuildError("\n\n".join(message_parts))
150│
151│ if error is not None:
→ 152│ raise error from None
153│
154│ return path
155│
156│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:

Note: This error originates from the build backend, and is likely not a problem with poetry but with wrapt (1.14.1) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "wrapt (==1.14.1)"'.

Not able to run on ARM M1 Mac

Hi,

Thank you for making a production-grade boilerplate available. I ran into trouble when I tried to start this project on my Mac M1. It seems the Poetry environment is set up with a configuration for x86-64 machines. How can I change the configuration to install packages for an M1 machine?

Steps to reproduce:

Follow the jump-start instructions in the README. When running make migrate, this error happens:

  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/bin/alembic", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/config.py", line 591, in main
    CommandLine(prog=prog).main(argv=argv)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/config.py", line 585, in main
    self.run_cmd(cfg, options)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/config.py", line 562, in run_cmd
    fn(
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/command.py", line 378, in upgrade
    script.run_env()
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/script/base.py", line 569, in run_env
    util.load_python_file(self.dir, "env.py")
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 94, in load_python_file
    module = load_module_py(module_id, path)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 110, in load_module_py
    spec.loader.exec_module(module)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/migrations/env.py", line 29, in <module>
    from app.models import Base
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/app/models/__init__.py", line 1, in <module>
    from core.database import Base
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/core/database/__init__.py", line 1, in <module>
    from .session import (
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/core/database/session.py", line 30, in <module>
    "writer": create_async_engine(config.POSTGRES_URL, pool_recycle=3600),
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/ext/asyncio/engine.py", line 85, in create_async_engine
    sync_engine = _create_engine(url, **kw)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<string>", line 2, in create_engine
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/util/deprecations.py", line 277, in warned
    return fn(*args, **kwargs)  # type: ignore[no-any-return]
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/create.py", line 720, in create_engine
    event.listen(pool, "connect", on_connect)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/api.py", line 124, in listen
    _event_key(target, identifier, fn).listen(*args, **kw)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/registry.py", line 310, in listen
    self.dispatch_target.dispatch._listen(self, *args, **kw)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/base.py", line 177, in _listen
    return self._events._listen(event_key, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/events.py", line 94, in _listen
    event_key.base_listen(**kw)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/registry.py", line 348, in base_listen
    for_modify._set_asyncio()
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/attr.py", line 416, in _set_asyncio
    self._exec_once_mutex = AsyncAdaptedLock()
                            ^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/util/concurrency.py", line 63, in AsyncAdaptedLock
    _not_implemented()
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/util/concurrency.py", line 43, in _not_implemented
    raise ValueError(
ValueError: the greenlet library is required to use this function. dlopen(/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so, 0x0002): tried: '/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so' (no such file), '/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
make: *** [migrate] Error 1
