aio-libs / aiopg

aiopg is a library for accessing a PostgreSQL database from the asyncio framework.

Home Page: http://aiopg.readthedocs.io

License: BSD 2-Clause "Simplified" License

Topics: asyncio, postgresql, sqlalchemy

aiopg's Introduction

aiopg


aiopg is a library for accessing a PostgreSQL database from the asyncio (PEP-3156/tulip) framework. It wraps asynchronous features of the Psycopg database driver.

Example

import asyncio
import aiopg

dsn = 'dbname=aiopg user=aiopg password=passwd host=127.0.0.1'

async def go():
    pool = await aiopg.create_pool(dsn)
    async with pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute("SELECT 1")
            ret = []
            async for row in cur:
                ret.append(row)
            assert ret == [(1,)]

loop = asyncio.get_event_loop()
loop.run_until_complete(go())

Example of SQLAlchemy optional integration

import asyncio
from aiopg.sa import create_engine
import sqlalchemy as sa

metadata = sa.MetaData()

tbl = sa.Table('tbl', metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('val', sa.String(255)))

async def create_table(engine):
    async with engine.acquire() as conn:
        await conn.execute('DROP TABLE IF EXISTS tbl')
        await conn.execute('''CREATE TABLE tbl (
                                  id serial PRIMARY KEY,
                                  val varchar(255))''')

async def go():
    async with create_engine(user='aiopg',
                             database='aiopg',
                             host='127.0.0.1',
                             password='passwd') as engine:

        async with engine.acquire() as conn:
            await conn.execute(tbl.insert().values(val='abc'))

            async for row in conn.execute(tbl.select()):
                print(row.id, row.val)

loop = asyncio.get_event_loop()
loop.run_until_complete(go())

Please use:

$ make test

for executing the project's unittests. See https://aiopg.readthedocs.io/en/stable/contributing.html for details on how to set up your environment to run the tests.

aiopg's People

Contributors

alefteris, and-semakin, asvetlov, avostap, beezz, bitdancer, brianmaissy, dependabot-preview[bot], encukou, eumiro, gjcarneiro, gr1n, gyermolenko, hellysmile, hudecof, jettify, luhn, mpaolini, nickolai-dr, orf, pliner, popravich, pyup-bot, quantum-0, runtel, shanecarey17, stopspazzing, tolgahanuzun, vir-mir, wnp


aiopg's Issues

Asyncio + sqlalchemy ORM

Hi.

How do I work with Alchemy objects?
I mean, right now it is possible to execute SQL queries via conn.execute() and get scalars back from it.
Is it possible to get objects instead?

Migrate from "yield from" to await (TypeError: object Engine can't be used in 'await' expression)

Hi, I replaced "yield from" with "await" in my code and received this traceback:
"TypeError: object Engine can't be used in 'await' expression"

async def db_psql_middleware(app, handler):
    async def middleware(request):
        db = app.get('db_psql')
        if not db:
            app['db_psql'] = db = await create_engine(app['psql_dsn'], minsize=1, maxsize=5)
        request.app['db_psql'] = db
        return (await handler(request))
    return middleware


async def psql_select(request):
    with (await request.app['db_psql']) as conn:
        result = await conn.execute(models.select())

Traceback

[2015-09-17 14:50:29 +0300] [26045] [ERROR] Error handling request
Traceback (most recent call last):
  File "/Users/vvv/src/backend-tools/python/asyncio/venv35/lib/python3.5/site-packages/aiohttp/server.py", line 272, in start
    yield from self.handle_request(message, payload)
  File "/Users/vvv/src/backend-tools/python/asyncio/venv35/lib/python3.5/site-packages/aiohttp/web.py", line 85, in handle_request
    resp = yield from handler(request)
  File "/Users/vvv/src/backend-tools/python/asyncio/app.py", line 39, in middleware
    return (await handler(request))
  File "/Users/vvv/src/backend-tools/python/asyncio/app.py", line 46, in psql_select
    with (await request.app['db_psql']) as conn:
TypeError: object Engine can't be used in 'await' expression
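A minimal sketch of the fix, consistent with the README example above: create_engine must be awaited, but the resulting Engine itself is not awaitable; acquire a connection from it instead.

engine = await create_engine(app['psql_dsn'], minsize=1, maxsize=5)

async with engine.acquire() as conn:   # instead of: with (await engine) as conn
    result = await conn.execute(models.select())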

fetchone() return value is not a RowProxy but a generator

tbl2 = sa.Table('usert2', metadata,
  sa.Column('id', INTEGER, primary_key=True),
  sa.Column('webid', UUID),
  sa.Column('totyear', SMALLINT),
  sa.Column('brewdata', SMALLINT))

engine = yield from create_engine(user='postgres', database='t1',
                                  host='127.0.0.1', password='pg')
with (yield from engine) as conn:
    query = sa.select([tbl2])
    res = yield from conn.execute(query)
    alarm = res.fetchone()
    print('alarm=', alarm)

Output is:
   <generator object fetchone at 0x7f7b27eff5e8>

This is not right according to SQLAlchemy's rules; see the 'Selecting' section at:
  http://docs.sqlalchemy.org/en/rel_1_0/core/tutorial.html#executing
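A hedged sketch of the fix: in aiopg, fetchone() is itself a coroutine, so its result must be yielded from as well.

res = yield from conn.execute(query)
alarm = yield from res.fetchone()   # now a RowProxy, not a generator object
print('alarm=', alarm)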

Default value doesn't work

I've found an unexpected behavior in SQLAlchemy. I'm using the following versions:

  • SQLAlchemy (0.9.8)
  • PostgreSQL (9.3.5)
  • psycopg2 (2.5.4)
  • aiopg (0.5.1)

This is the table definition for the example:

import asyncio
from aiopg.sa import create_engine
from sqlalchemy import (
  MetaData,
  Column,
  Integer,
  Table,
  String,
)
metadata = MetaData()

users = Table('users', metadata,
  Column('id_user', Integer, primary_key=True, nullable=False),
  Column('name', String(20), unique=True),
  Column('age', Integer, nullable=False, default=0),
)

Now if I try to execute a simple insert to the table, populating just id_user and name, the age column should be auto-generated with the default value 0, right? Let's see...

@asyncio.coroutine
def go():
  engine = yield from create_engine('postgresql://USER@localhost/DB')
  data = {'id_user':1, 'name':'Jimmy' }
  stmt = users.insert(values=data, inline=False)
  with (yield from engine) as conn:
    result = yield from conn.execute(stmt)


loop = asyncio.get_event_loop()
loop.run_until_complete(go())

This is the resulting statement with the corresponding error:

INSERT INTO users (id_user, name, age) VALUES (1, 'Jimmy', null);

psycopg2.IntegrityError: null value in column "age" violates not-null constraint

I didn't provide the age column, so where is that age = null value coming from? I was expecting something like this:

INSERT INTO users (id_user, name) VALUES (1, 'Jimmy');

Or, if the default flag actually works, it should be:

INSERT INTO users (id_user, name, age) VALUES (1, 'Jimmy', 0);

Could you shed some light on this?
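One hedged workaround (an assumption, not a confirmed fix from the project): move the default to the server side, so PostgreSQL itself fills in the value even if the dialect never applies client-side defaults.

from sqlalchemy import MetaData, Table, Column, Integer, String, text

metadata = MetaData()

users = Table('users', metadata,
    Column('id_user', Integer, primary_key=True, nullable=False),
    Column('name', String(20), unique=True),
    # server_default is rendered into the CREATE TABLE DDL, so the database
    # supplies 0 whenever the column is omitted from an INSERT
    Column('age', Integer, nullable=False, server_default=text('0')),
)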

Closing connection object throws exception

Hi,

I am using aiopg with a Pool object that is shared among coroutines. Every once in a while I randomly see the following error:

psycopg2.ProgrammingError: close cannot be used while an asynchronous query is underway

I can reproduce the issue with the following:


pool = yield from aiopg.create_pool(dsn)
with (yield from pool.cursor(timeout=1.0)) as cur:
    yield from cur.execute("SELECT pg_sleep(2)")

I believe it would be more appropriate to raise TimeoutError rather than ProgrammingError. What do you think?

Thanks,

Issue with postgres JSON field deserialization.

Hello there, we started to use your modern framework aioREST in our projects with the aiopg driver on top of SQLAlchemy, and I ran into some problems with a JSON field.

https://github.com/aio-libs/aiopg/blob/master/aiopg/sa/result.py#L49

We have code like this:

    @classmethod
    @coroutine
    def get_by_id(cls, id):
        '''Return user or None'''
        with (yield from db.engine) as conn:
            cursor = yield from conn.execute(
                users.select().where(users.c.id == id).limit(1))
            user = yield from cursor.first()
        return cls(user) if user else None

As far as I understand, psycopg2 already deserializes JSON into native datatypes. I started digging with pdb, and this is what I got:

> /projecti/auth/models.py(81)__init__()
-> print(sq_row['contacts'])
(Pdb) s
--Call--
> /project/var/env/lib/python3.4/site-packages/aiopg/sa/result.py(27)__getitem__()
-> def __getitem__(self, key):
(Pdb) s
> /project/var/env/lib/python3.4/site-packages/aiopg/sa/result.py(28)__getitem__()
-> try:
(Pdb) s
> /project/var/env/lib/python3.4/site-packages/aiopg/sa/result.py(29)__getitem__()
-> processor, obj, index = self._keymap[key]
(Pdb) s
> /project/var/env/lib/python3.4/site-packages/aiopg/sa/result.py(45)__getitem__()
-> if index is None:
(Pdb) print(processor)
<function JSON.result_processor.<locals>.process at 0x10ecd4488>
(Pdb) print(processor.__name__)
process
(Pdb) s               
> /project/var/env/lib/python3.4/site-packages/aiopg/sa/result.py(49)__getitem__()
-> if processor is not None:
(Pdb) s
> /project/var/env/lib/python3.4/site-packages/aiopg/sa/result.py(50)__getitem__()
-> return processor(self._row[index])
(Pdb) print(self._row[index])
[{'type': 'phone', 'value': ''}]
(Pdb) s
--Call--
> /project/var/env/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/json.py(202)process()
-> def process(value):
(Pdb) s
> /project/var/env/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/json.py(203)process()
-> return json_deserializer(value)

but the value is already a native Python data structure:

(Pdb) print(type(value))
<class 'list'>

so finally:

TypeError: the JSON object must be str, not 'list'
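A hedged workaround for the double deserialization above (an assumption, not an official fix): tell psycopg2 to hand JSON columns back as raw strings, so that only the SQLAlchemy result processor runs json.loads.

import psycopg2.extras

# loads= is psycopg2's documented hook for JSON typecasting; the identity
# function leaves the value as the raw string from the server
psycopg2.extras.register_default_json(globally=True, loads=lambda x: x)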

Queries start failing after some time

I am facing an issue where my db queries start failing after some time. Here is the code I use to create a pool:

class PostgresStore:
    _pool = None
    _connection_params = {}

    @classmethod
    def connect(cls, database:str, user:str, password:str, host:str, port:int):
        """
        Sets connection parameters
        """
        cls._connection_params['database'] = database
        cls._connection_params['user'] = user
        cls._connection_params['password'] = password
        cls._connection_params['host'] = host
        cls._connection_params['port'] = port

    @classmethod
    def use_pool(cls, pool:Pool):
        """
        Sets an existing connection pool instead of using connect() to make one
        """
        cls._pool = pool

    @classmethod
    @coroutine
    def get_pool(cls) -> Pool:
        """
        Yields:
            existing db connection pool
        """
        if len(cls._connection_params) < 5:
            raise ConnectionError('Please call SQLStore.connect before calling this method')
        if not cls._pool:
            cls._pool = yield from create_pool(**cls._connection_params)
        return cls._pool

I use aiohttp to create a web server, and once the server has been up and running for a few hours, db queries start failing. All other APIs keep working fine. Here are the logs:

2015-07-14 16:19:18,531 ERROR [base_events:698] Fatal read error on socket transport
protocol: 
transport: 
Traceback (most recent call last):
  File "/usr/lib/python3.4/asyncio/selector_events.py", line 459, in _read_ready
    data = self._sock.recv(self.max_size)
TimeoutError: [Errno 110] Connection timed out
2015-07-14 17:26:42,070 ERROR [base_events:698] Fatal error on aiopg connection: bad state in _ready callback
connection: 
2015-07-14 17:26:58,017 ERROR [base_events:698] Fatal error on aiopg connection: bad state in _ready callback
connection: 
2015-07-14 17:27:02,606 ERROR [base_events:698] Fatal error on aiopg connection: bad state in _ready callback
connection: 
2015-07-14 17:27:03,226 ERROR [base_events:698] Fatal error on aiopg connection: bad state in _ready callback
connection: 
2015-07-14 17:27:14,691 ERROR [base_events:698] Fatal error on aiopg connection: bad state in _ready callback
connection: 
2015-07-14 18:47:51,427 ERROR [base_events:698] Fatal read error on socket transport
protocol: 
transport: 
Traceback (most recent call last):
  File "/usr/lib/python3.4/asyncio/selector_events.py", line 459, in _read_ready
    data = self._sock.recv(self.max_size)
TimeoutError: [Errno 110] Connection timed out
2015-07-14 18:50:02,499 ERROR [base_events:698] Fatal read error on socket transport
protocol: 
transport: 
Traceback (most recent call last):
  File "/usr/lib/python3.4/asyncio/selector_events.py", line 459, in _read_ready
    data = self._sock.recv(self.max_size)
TimeoutError: [Errno 110] Connection timed out
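A hedged mitigation sketch (the assumption being that idle connections are silently dropped by a firewall or server timeout, and the pool then hands back dead ones): retry once on a connection-level error, letting the pool open a fresh connection for the second attempt.

import asyncio
import psycopg2

@asyncio.coroutine
def execute_with_retry(pool, query):
    # A dead connection raises OperationalError on first use; the pool
    # replaces it, so a single retry usually succeeds.
    for attempt in (1, 2):
        try:
            with (yield from pool.cursor()) as cur:
                yield from cur.execute(query)
                return (yield from cur.fetchall())
        except psycopg2.OperationalError:
            if attempt == 2:
                raise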

Many cursor method examples are not correct?

I realize (now) that the docs say all the methods are coroutines (except close). However, the examples for fetchone, fetchmany, fetchiter, etc. do not use 'yield from', and I wasted a bit of time wondering why I was getting an empty list from my select statement before I noticed the (correct) example at the top of the section. Also, the example of using the cursor as an iterator does not work for me (it prints no data). I'm using 0.6.1.
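For reference, a minimal sketch of the calling convention that the top-of-section example uses (and the per-method examples omit): every fetch method is a coroutine and must be driven with yield from.

yield from cur.execute("SELECT * FROM tbl")
row = yield from cur.fetchone()     # not: row = cur.fetchone()
rows = yield from cur.fetchall()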

Confusing traceback in update

  File "/home/tag/.virtualenvs/moxie/lib/python3.4/site-packages/aiopg/sa/connection.py", line 53, in scalar
    return (yield from res.scalar())
  File "/home/tag/.virtualenvs/moxie/lib/python3.4/site-packages/aiopg/sa/result.py", line 417, in scalar
    row = yield from self.first()
  File "/home/tag/.virtualenvs/moxie/lib/python3.4/site-packages/aiopg/sa/result.py", line 405, in first
    self._non_result()
  File "/home/tag/.virtualenvs/moxie/lib/python3.4/site-packages/aiopg/sa/result.py", line 334, in _non_result
    "This result object does not return rows. "
aiopg.sa.exc.ResourceClosedError: This result object does not return rows. It has been closed automatically.

Code is similar to:

    engine = yield from aiopg.sa.create_engine(DATABASE_URL)
    with (yield from engine) as conn:
        reschedule = (dt.datetime.utcnow() + job.interval)
        yield from conn.scalar(
            update(
                Job.__table__
            ).where(
                Job.name==job.name
            ).values(
                active=True,
                scheduled=reschedule,
            ))

Database has:

moxie=# SELECT * FROM job;
 id |      name      |      description      |    command     |    image     |         scheduled          | interval | active | maintainer_id 
----+----------------+-----------------------+----------------+--------------+----------------------------+----------+--------+---------------
  3 | test-job-three | Test Job number Three | update holyoke | paultag/pupa | 2014-06-28 14:37:09.501019 | 1 day    | f      |             1
  4 | test-job-four  | Test Job number Four  | update holyoke | paultag/pupa | 2014-06-28 14:37:09.50455  | 1 day    | f      |             2
  5 | test-job-five  | Test Job number Five  | update holyoke | paultag/pupa | 2014-06-28 14:37:09.508053 | 1 day    | f      |             2
  6 | test-job-six   | Test Job number Six   | update holyoke | paultag/pupa | 2014-06-28 14:37:09.512476 | 1 day    | f      |             2
  1 | test-job-one   | Test Job number One   | update holyoke | paultag/pupa | 2014-06-29 14:37:15.559101 | 1 day    | t      |             1
  2 | test-job-two   | Test Job number Two   | update holyoke | paultag/pupa | 2014-06-29 14:37:15.56207  | 1 day    | t      |             1
(6 rows)

It failed on the entry test-job-three.
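A hedged rewrite (assuming no value needs to be read back): an UPDATE returns no rows, so scalar() raises ResourceClosedError; a plain execute() is enough, and RETURNING can be added if a value is wanted.

yield from conn.execute(
    update(Job.__table__)
    .where(Job.name == job.name)
    .values(active=True, scheduled=reschedule))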

Add advanced example of making sa select

I was not able to iterate over a select query result when running it from a view. Here is the flow:

I run a repository function from the view:

# views.py
users = yield from find_all()

find_all() is a function in the repository module:

# app/models/user.py
@asyncio.coroutine
def find_all():
    yield from run(User.table.select())

Finally, here is what run looks like:

# app/models/__init__.py
@asyncio.coroutine
def run(query):
    engine = yield from create_engine(user='test',
                                      database='test',
                                      host='127.0.0.1',
                                      password='test')
    with (yield from engine) as conn:
        yield from conn.execute(query)

So the problem is that users is None, but if I iterate over the result in the same file where the query is executed (app/models/__init__.py), it works fine:

        res = yield from conn.execute(query)
        for row in res:
            print(row)

Could you please add more examples of how to get a query result in such a case?
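A hedged sketch fixing both gaps in the snippets above: run() never returns anything, and rows must be fetched before the connection goes back to the pool.

@asyncio.coroutine
def run(query):
    engine = yield from create_engine(user='test', database='test',
                                      host='127.0.0.1', password='test')
    with (yield from engine) as conn:
        res = yield from conn.execute(query)
        return (yield from res.fetchall())   # fetch while the connection is held

@asyncio.coroutine
def find_all():
    return (yield from run(User.table.select()))   # return, don't just yield from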

session object integration

Hey aiopg people! Nice work so far, loving this!

It'd be awesome to find a way to get session objects working with asyncio!

Fix a bug in connection pool

Extracted from a private email:

In pool.py, in _fill_free_pool, after obtaining a new connection
we have to check again that the pool is still not full,

because a concurrent task may have filled it in the meantime.

As a result the pool ends up holding more connections than maxsize,
and that later leads to an error during release (it trips the assert that checks the pool is not full).

Here is how I patched it locally:

@asyncio.coroutine
def _fill_free_pool(self):
    while self.freesize < self.minsize and self.size < self.maxsize:
        conn = yield from connect(
            self._dsn, loop=self._loop,
            **self._conn_kwargs)
        # re-check: a concurrent task may have filled the pool while
        # connect() was in flight
        if self.freesize < self.minsize and self.size < self.maxsize:
            yield from self._free.put(conn)
        else:
            conn._close()

Connection handling in web application example

What is the most generic, proper way of handling database connections in real web apps? The naive way is just to create a connection pool and pick/release a cursor per query (or per request?).

But the naive approach has some pitfalls:

  1. The connection breaks because the database server goes down
  2. The connection breaks after an invalid query/transaction locks it
  3. Unknown reasons like #67, etc.

So what is a connect/re-connect approach that suits basic needs and copes with the problems of the naive approach listed above?

It would be great to have some ideas and examples on that in the docs; a minimal sketch follows below.

Thank you!
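A minimal sketch of one common pattern (the function and key names here are illustrative, not from aiopg): one pool per application, acquired per request, with the pool left to replace connections it finds closed.

import aiopg

dsn = 'dbname=aiopg user=aiopg password=passwd host=127.0.0.1'

async def init_pg(app):
    app['pool'] = await aiopg.create_pool(dsn, minsize=1, maxsize=10)

async def close_pg(app):
    app['pool'].close()
    await app['pool'].wait_closed()

async def handler(request):
    async with request.app['pool'].acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute("SELECT 1")
            row = await cur.fetchone()   # use the row to build a response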

SQL statements logging

Hi!
I would like to have SQL statement logging straight to stdout when I run my applications in debug mode. Normally (i.e. with a regular sqlalchemy-backed application) I set up the standard Python logger with the following structure:

logging:
  version: 1
  disable_existing_loggers: True
  formatters:
    generic:
      format: "%(asctime)s %(levelname)-5.5s [%(name)s][%(threadName)s] %(message)s"

  handlers:
    console:
      level: DEBUG
      class: logging.StreamHandler
      stream: ext://sys.stdout
      formatter: generic

  loggers:
    root:
      level: DEBUG
      handlers: [console]

    asyncio:
      level: DEBUG
      handlers: [console]

    sqlalchemy:
      # "level = INFO" logs SQL queries.
      # "level = DEBUG" logs SQL queries and results.
      # "level = WARN" logs neither.  (Recommended for production systems.)
      level: DEBUG
      handlers: [console]

    sqlalchemy.engine:
      # "level = INFO" logs SQL queries.
      # "level = DEBUG" logs SQL queries and results.
      # "level = WARN" logs neither.  (Recommended for production systems.)
      level: INFO
      handlers: [console]
      propagate: False

But apparently this is not enough for aiopg: adding an aiopg logger to that config makes no difference to the final output. So the question is how to set this up properly. Is there a working example of it?
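aiopg does not appear to emit per-statement logs the way sqlalchemy.engine does, so one stopgap (an assumption, not an official hook) is a thin wrapper around execute:

import asyncio
import logging

logger = logging.getLogger('app.sql')

@asyncio.coroutine
def logged_execute(cur, operation, parameters=None):
    # log the statement and its parameters, then delegate to the cursor
    logger.debug('SQL: %s params=%r', operation, parameters)
    yield from cur.execute(operation, parameters)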

Links for transactions docs

Could you please link to docs explaining the difference between two-phase and nested transactions?

Create a django database backend

Currently there is no Django database backend using aiopg.
Two backends should be implemented: one that provides the same behavior as the original psycopg2 adapter, and one that also provides connection pooling. The first backend should be a drop-in replacement.
How do we advance this issue further?
I assume that yield from should be added here and here, but there are probably other places where we should be async.
Care to provide some feedback? I'll create a pull request once I have further instructions on how to proceed.

Unicode encoding error over the line

When inserting, I get the following internal error from aiopg:

  File "/usr/local/lib/python3.4/dist-packages/aiopg/sa/connection.py", line 115, in scalar
    res = yield from self.execute(query, *multiparams, **params)
  File "/usr/local/lib/python3.4/dist-packages/aiopg/sa/connection.py", line 102, in execute
    yield from cursor.execute(str(compiled), post_processed_params[0])
  File "/usr/local/lib/python3.4/dist-packages/aiopg/cursor.py", line 106, in execute
    self._impl.execute(operation, parameters)
UnicodeEncodeError: 'ascii' codec can't encode character '\xe9' in position 4198: ordinal not in range(128)

Invocation is:

    @asyncio.coroutine
    def create(self, **kwargs):
        with (yield from self.db.engine) as conn:
            runid = yield from conn.scalar(insert(Run.__table__).values(**kwargs))

It's likely breaking on the log of a Run:

class Run(Base):
    __tablename__ = 'run'
    id = Column(Integer, primary_key=True)
    failed = Column(Boolean)
    job_id = Column(Integer, ForeignKey('job.id'))
    job = relationship("Job", foreign_keys=[job_id], backref='runs')
    log = Column(Text)
    start_time = Column(DateTime)
    end_time = Column(DateTime)

I'm guessing it's the log field.
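A hedged guess at a remedy (client_encoding is a standard libpq connection parameter, not an aiopg-specific option): force UTF-8 on the connection so psycopg2 stops encoding parameters as ASCII.

engine = yield from create_engine(
    user='...', database='...', host='...', password='...',
    client_encoding='utf8',   # forwarded through psycopg2 to libpq
)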

Problem with transaction

Hi,
I've got a problem with aiopg when using transactions. I used this simple code for a transaction:

trans = yield from conn.begin()
try:
    yield from fn(self, conn, *args, **kwargs)
except:
    yield from trans.rollback()
    raise
else:
    yield from trans.commit()

But if a timeout occurs, I get the following traceback:

Traceback (most recent call last):
...
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/connection.py", line 100, in execute
yield from cursor.execute(str(compiled), post_processed_params[0])
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/cursor.py", line 112, in execute
yield from self._conn._poll(waiter, timeout)
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/connection.py", line 162, in _poll
yield from asyncio.wait_for(self._waiter, timeout, loop=self._loop)
File "/usr/lib/python3.4/asyncio/tasks.py", line 381, in wait_for
raise futures.TimeoutError()
concurrent.futures._base.TimeoutError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/connection.py", line 185, in _rollback_impl
yield from cur.execute('ROLLBACK')
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/cursor.py", line 107, in execute
self._impl.execute(operation, parameters)
psycopg2.ProgrammingError: execute cannot be used while an asynchronous query is underway

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
...
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/transaction.py", line 67, in rollback
yield from self._do_rollback()
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/transaction.py", line 95, in _do_rollback
yield from self._connection._rollback_impl()
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/connection.py", line 187, in _rollback_impl
cur.close()
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/cursor.py", line 49, in close
self._impl.close()
psycopg2.ProgrammingError: close cannot be used while an asynchronous query is underway

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
...
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/engine.py", line 190, in exit
self._engine.release(self._conn)
File "/home/alex/work/python/.envs/stload3/lib/python3.4/site-packages/aiopg/sa/engine.py", line 133, in release
raise InvalidRequestError("Cannot release a connection with "
aiopg.sa.exc.InvalidRequestError: Cannot release a connection with not finished transaction

I think that if we get a TimeoutError while the query is executing, the query should be cancelled.
Thanks
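A hedged sketch of that suggestion (whether conn.connection.cancel() is available depends on the aiopg version in use; treat it as an assumption): cancel the in-flight query on timeout before rolling back, so ROLLBACK does not hit the "query is underway" state.

trans = yield from conn.begin()
try:
    yield from fn(self, conn, *args, **kwargs)
except asyncio.TimeoutError:
    yield from conn.connection.cancel()   # clear the pending query first
    yield from trans.rollback()
    raise
except Exception:
    yield from trans.rollback()
    raise
else:
    yield from trans.commit()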

Extract aiopg.sa as library

This package is almost the same in aiopg and aiomysql, so I think we should extract it into a separate library, shouldn't we?

Make benchmark

sync psycopg vs aiopg vs aiopg.sa vs aiopg.sa with query caching

SQLA dsn

Hello.

How about adding something like this?

dsn = "postgresql+psycopg2://aiopg:passwd@localhost:5432/aiopg"
    def parse_uri(uri):
        from urllib.parse import urlparse
        uri = urlparse(uri)
        return dict(
            user=uri.username,
            password=uri.password,
            database=uri.path[1:],
            port=str(uri.port),
            host=uri.hostname
        )
 engine = yield from create_engine(**parse_uri(dsn))

If we use SQLAlchemy we need the same URI to access the db.

Error with sqlalchemy

I'm getting an error when using sqlalchemy and nothing is getting inserted.

Fatal error on aiopg connection: bad state in _ready callback
connection: <aiopg.connection.Connection object at 0x7f0c3ab44710>

Not sure what's causing it; maybe this additional information will help. Is it possible to use an SQLAlchemy model defined with declarative to get the table object used by aiopg?

from aiopg.sa import create_engine
engine = create_engine(database='db')
Base.metadata.bind = engine
table = Base.metadata.tables['table']

And is it possible to provide only the 'database' argument, as above, given that in the non-aiopg version I define it as follows, with just a db name, because it uses local authentication?

engine = engine_from_config({'sqlalchemy.url': 'postgresql+psycopg2:///db'})

I'm running the insert statements like this (not sure if I need the commented-out parts):

    with (yield from engine) as conn:
        # tr = yield from conn.begin()
        yield from conn.execute(table.insert(), *multivalues)
        # yield from tr.commit()
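A hedged note on the probable cause: aiopg.sa.create_engine is a coroutine, so calling it without yield from produces a coroutine object, not an Engine, and binding metadata to that cannot work. The table lookup itself is fine.

engine = yield from create_engine(database='db')   # must be driven with yield from
table = Base.metadata.tables['table']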

Unable to do multi-insert according to docs

The following code should work according to http://aiopg.readthedocs.org/en/stable/sa.html#aiopg.sa.SAConnection.execute:

import asyncio
from aiopg.sa import create_engine
import sqlalchemy as sa


metadata = sa.MetaData()

table = sa.Table(
    'table',
    metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('value', sa.String(255)),
)


@asyncio.coroutine
def go():
    engine = yield from create_engine('postgresql://localhost/aiopg_issue')

    with (yield from engine) as conn:
        yield from conn.execute(
            table.insert(),
            {"id": 1, "value": "v1"},
            {"id": 2, "value": "v2"},
        )

        res = yield from conn.execute(table.select())
        for row in res:
            print(row.id, row.value)  # column is named 'value' in the table above


loop = asyncio.get_event_loop()
loop.run_until_complete(go())

However, I get an aiopg.sa.exc.ArgumentError ("Don't mix sqlalchemy clause and execution with parameters") because of the dictionaries passed to execute. Are the docs out of date?
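A hedged workaround (assuming the async dialect simply doesn't support executemany-style parameter lists): issue one INSERT per row.

for params in ({"id": 1, "value": "v1"}, {"id": 2, "value": "v2"}):
    yield from conn.execute(table.insert().values(**params))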

Cursor documentation: coros and __iter__

The Cursor documentation says that all methods are coroutines. I get that you don't want to document that on each method explicitly, but two things tripped me up while using it:

  1. The Cursor.close method is not a coroutine. The docs should probably mention that you don't need to wait for it.

  2. There is no __iter__ for async cursors. AFAIK you can't have one because there's no way to implement it correctly (is this right?). The docs say that

    Cursor objects are iterable [...]

    This isn't true here, so it should probably be adjusted.
    My proposal for a fix would be to introduce something like fetchiter that yields each row separately (if that is possible). Otherwise the only way I found is to use fetchall or (if you have many rows) to handle it yourself with fetchmany, as sketched below.
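A hedged sketch of that manual batching (the batch size and handle() are illustrative):

yield from cur.execute('SELECT * FROM big_table')
while True:
    rows = yield from cur.fetchmany(100)
    if not rows:
        break
    for row in rows:
        handle(row)   # hypothetical per-row callback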

Possibly use SQLAlchemy Strategies

It looks like the aiopg.sa module reimplements some of the lower-level parts of SQLAlchemy, providing alternatives that use a similar API. I wonder if this could instead be done using SQLAlchemy's support for different execution strategies?

There is an example of doing this for Twisted at https://github.com/alex/alchimia, which defers work to a thread pool, largely because it wants to work with any database, not just PostgreSQL. Doing something similar here might deduplicate some of the code.
