
aioboto3's Introduction

Async AWS SDK for Python


Updates

Breaking changes for v11: The S3Transfer config passed into upload/download_file etc. has been updated so that it matches what boto3 uses.

Breaking changes for v9: the aioboto3.resource and aioboto3.client methods no longer exist; create a session, then call session.client etc. This was done for various reasons, but mainly because it prevents the default session from living longer than it should, which breaks situations where event loops are replaced.

The .client and .resource functions must now be used as async context managers.

Now that aiobotocore has reached version 1.0.1, as a side effect of the work put in to fix various issues like bucket region redirection and supporting web assume role type credentials, the client must now be instantiated using a context manager, which by extension applies to the resource creator. You used to be able to get away with calling res = aioboto3.resource('dynamodb'), but that no longer works. If you really want to do that, you can do res = await aioboto3.resource('dynamodb').__aenter__(), but you'll need to remember to call __aexit__.

There will most likely be some parts that don't work now which I've missed; just open an issue and we'll get them resolved quickly.

Creating service resources must also be async now, e.g.

async def main():
    session = aioboto3.Session()
    async with session.resource("s3") as s3:
        bucket = await s3.Bucket('mybucket')  # <----------------
        async for s3_object in bucket.objects.all():
            print(s3_object)

Updating to aiobotocore 1.0.1 also brings with it support for running inside EKS, as well as asyncifying get_presigned_url.


This package is mostly just a wrapper combining the great work of boto3 and aiobotocore.

aiobotocore allows you to use near enough all of the boto3 client commands in an async manner just by prefixing the command with await.
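
For example, a minimal sketch of an awaited client call (the bucket name here is a placeholder):

import asyncio
import aioboto3

async def main():
    session = aioboto3.Session()
    async with session.client("s3") as s3:
        # Same call shape as boto3, just awaited
        resp = await s3.list_objects_v2(Bucket="my-bucket")
        print(resp.get("KeyCount"))

asyncio.run(main())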

With aioboto3 you can now use the higher level APIs provided by boto3 in an asynchronous manner. Mainly I developed this as I wanted to use the boto3 dynamodb Table object in some async microservices.

While all resources in boto3 should work, I haven't tested them all, so if what you're after is not in the table below, try it out; if it works, drop me an issue with a simple test case and I'll add it to the table.

Services                    Status
DynamoDB Service Resource   Tested and working
DynamoDB Table              Tested and working
S3                          Working
Kinesis                     Working
SSM Parameter Store         Working
Athena                      Working

Example

A simple example of using aioboto3 to put items into a DynamoDB table:

import asyncio
import aioboto3
from boto3.dynamodb.conditions import Key


async def main():
    session = aioboto3.Session()
    async with session.resource('dynamodb', region_name='eu-central-1') as dynamo_resource:
        table = await dynamo_resource.Table('test_table')

        await table.put_item(
            Item={'pk': 'test1', 'col1': 'some_data'}
        )

        result = await table.query(
            KeyConditionExpression=Key('pk').eq('test1')
        )
        print(result['Items'])

        # Example batch write
        more_items = [{'pk': 't2', 'col1': 'c1'},
                      {'pk': 't3', 'col1': 'c3'}]
        async with table.batch_writer() as batch:
            for item_ in more_items:
                await batch.put_item(Item=item_)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

# Outputs:
#  [{'col1': 'some_data', 'pk': 'test1'}]

Things that either don't work or have been patched

As this library literally wraps boto3, it's inevitable that some things won't magically be async.

Fixed:

  • s3_client.download_file* This is performed by the s3transfer module. -- Patched with get_object
  • s3_client.upload_file* This is performed by the s3transfer module. -- Patched with custom multipart upload
  • s3_client.copy This is performed by the s3transfer module. -- Patched to use get_object -> upload_fileobj
  • dynamodb_resource.Table.batch_writer This now returns an async context manager which performs the same function
  • Resource waiters - You can now await waiters which are part of resource objects, not just client waiters, e.g. await dynamodbtable.wait_until_exists() (see the sketch after this list)
  • Resource object properties are normally autoloaded; now they are all coroutines, and the metadata they come from will be loaded on first await and then cached thereafter.
  • S3 Bucket.objects object now works and has been asyncified. Examples here - https://aioboto3.readthedocs.io/en/latest/usage.html#s3-resource-objects
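
A short sketch of the waiter and property behaviour described above (the table name is a placeholder, and this assumes the Session-based usage shown in the examples):

import aioboto3

async def main():
    session = aioboto3.Session()
    async with session.resource("dynamodb") as dynamo:
        table = await dynamo.Table("my-table")
        await table.wait_until_exists()   # resource waiters are awaitable
        print(await table.item_count)     # properties are coroutines, cached after first await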

Amazon S3 Client-Side Encryption

Boto3 doesn't support AWS client-side encryption, so until it does I've added basic support for it. Docs here: CSE

CSE requires the Python cryptography library, so if you do pip install aioboto3[s3cse], that'll also include cryptography.

This library currently supports client-side encryption using KMS-managed master keys performing envelope encryption, using either AES/CBC/PKCS5Padding or, preferably, AES/GCM/NoPadding. The files generated are compatible with the Java Encryption SDK, so I will assume they are compatible with the Ruby, PHP, Go and C++ libraries as well.

Non-KMS managed keys are not yet supported, but if you have a use for that, raise an issue and I'll look into it.

Documentation

Docs are here - https://aioboto3.readthedocs.io/en/latest/

Examples here - https://aioboto3.readthedocs.io/en/latest/usage.html

Features

  • Closely mimics the usage of boto3.

Todo

  • More examples
  • Set up docs
  • Look into monkey-patching the aws xray sdk to be more async if it needs to be.

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template. It also makes use of the aiobotocore and boto3 libraries. All the credit goes to them, this is mainly a wrapper with some examples.


aioboto3's Issues

Incompatibility with latest boto3 and botocore

Description

When I install the latest awscli/boto3/botocore I see these errors. Why does aioboto3 hard-pin a specific version of botocore/aiobotocore? What needs to be done to support the latest version? This is becoming quite a problem for users installing our library, which depends on aioboto3. Do you have any suggestions on what we can do here?

ERROR: aiobotocore 0.11.0 has requirement botocore<1.13.15,>=1.13.14, but you'll have botocore 1.13.32 which is incompatible.
ERROR: aioboto3 6.4.1 has requirement aiobotocore[boto3]~=0.10.2, but you'll have aiobotocore 0.11.0 which is incompatible.

aws s3 client

  • Python version: 3.8.2
  • Operating System: Linux (based on docker 3.8-slim-buster)

Description

I was trying to create a client using aioboto3, without success, although it works if I use the boto3 client. However, when I tested on my localhost (Win10), the aioboto3 client works correctly.

What I Did

import boto3
import aioboto3


client = aioboto3.client('s3')  # <aiobotocore.session.ClientCreatorContext object at 0x7f5bac479820>
client2  = boto3.client('s3')  # <botocore.client.S3 object at 0x7f5bac2ec5b0>
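
Per the note in the README above, aioboto3's .client call returns a context object rather than a ready client, so (as a hedged sketch of that era's API) it has to be entered as an async context manager:

import asyncio
import aioboto3

async def main():
    async with aioboto3.client('s3') as client:
        print(client)  # a usable S3 client, only inside the context

asyncio.run(main())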

Onboard SQS as part of the arsenal of aioboto3

I'd love for this library to include support for SQS, as it is a basic and foundational service of AWS. I've spent the last day or so successfully migrating from boto3 to aioboto3 within a FastAPI application, utilizing the DynamoDB and S3 aioboto3 functions.

Upload callback reporting many more bytes than file size

  • Async AWS SDK for Python version: 8.0.2
  • Python version: 3.6
  • Operating System: ubuntu 19.10

Description

Uploading smaller files (on the order of KBs to several MBs) reports accurately. Uploading a large file (268MB) resulted in inaccurate bytes being reported to the callback: it showed the bytes uploaded as 4.43GB.

What I Did

Did some further digging and looked at the behavior of the boto3 upload in comparison: every time boto3 calls the callback, it sends the number of bytes sent since the last time it called the callback, not the cumulative total, which is what the aioboto3 upload appears to be doing.

Also note: the behavior of the download callbacks matches the original boto3, which is to call back with the bytes since the last callback.

Here's some sample code (assuming the ~/.aws/credentials file is set up, and self._bucket matches the bucket name you want to push to):

import asyncio
import boto3
import os
import aioboto3

def callback(self, bytes):
    self._total += bytes
    print("BYTES: " + str(bytes) + " TOTAL: " + str(self._total))

# aioboto3 version
def putfile(self, filename: str):
    loop = asyncio.get_event_loop()
    loop.run_until_complete(self.putfile_async(filename))

async def putfile_async(self, filename: str):
    objectname = os.path.basename(filename)
    async with aioboto3.client("s3") as s3:
        self._total = 0
        await s3.upload_file(filename, self._bucket, objectname, Callback=self.callback)

# boto3 version
def putfile_sync(self, filename: str):
    objectname = os.path.basename(filename)
    s3_client = boto3.client('s3', region_name=self._region)
    self._total = 0
    response = s3_client.upload_file(filename, self._bucket, objectname, Callback=self.callback)

Wrap the client into a normal call instead of Async call

  • Async AWS SDK for Python version: 4.1.1
  • Python version: 3.6
  • Operating System: Ubuntu

Description

I did not find another way to describe this, because it is not an issue but a query. If that's not OK, please delete this issue; I leave my email: [email protected].

Python async is a good library, but if we use this in the current format (using async and await), it requires the other parts of the code to be async as well. I think it may not be easy to plug this library into some existing code which is not in an async format.

I think we could wrap this great library in a class that at least provides a normal non-blocking call for write IO without bothering with async, so that it is easy to plug into existing code.

How can I do that? Thank you!

Regards,
Zhe
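
One possible answer, as a hedged sketch (the class and method names here are hypothetical, not part of aioboto3): run a private event loop on a background thread and submit coroutines to it, giving synchronous callers a non-blocking put_object.

import asyncio
import threading

import aioboto3


class SyncS3Writer:
    """Hypothetical wrapper: exposes async S3 writes to synchronous code."""

    def __init__(self):
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._loop.run_forever, daemon=True)
        self._thread.start()

    async def _put(self, bucket, key, body):
        session = aioboto3.Session()
        async with session.client("s3") as s3:
            await s3.put_object(Bucket=bucket, Key=key, Body=body)

    def put_object(self, bucket, key, body):
        # Returns a concurrent.futures.Future; ignore it for fire-and-forget
        # write IO, or call .result() to block until the upload finishes.
        return asyncio.run_coroutine_threadsafe(self._put(bucket, key, body), self._loop)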

Delete all objects from a bucket

While trying to delete all objects in a bucket using the resource API:

aioboto3.resource("s3").Bucket("my-bucket").objects.all().delete()

I get the following error:

NotImplementedError: <aiobotocore.paginate.AioPageIterator object at 0x7f775d3900f0> is an AsyncIterable: use `async for`

In line 14 of the paginate.py file.

The error originates from dist-packages/boto3/resources/collection.py, line 166

From my understanding, the resource API has some logic that iterates over a collection (in this case, a paginated one) and tries to do it in a synchronous way.
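
A hedged workaround sketch, following the error message's hint to use async for (bucket name is a placeholder; Session-style API as in the README above) and deleting objects one by one instead of calling .delete() on the collection:

import asyncio
import aioboto3

async def empty_bucket(bucket_name):
    session = aioboto3.Session()
    async with session.resource("s3") as s3:
        bucket = await s3.Bucket(bucket_name)
        async for obj in bucket.objects.all():
            await obj.delete()

asyncio.run(empty_bucket("my-bucket"))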

coroutine `AioSession._create_client` was never awaited

  • Async AWS SDK for Python version:

    • aioboto3==8.0.2
    • aiobotocore==1.0.3
    • boto3==1.12.32
    • botocore==1.15.32
  • Python version: 3.6

  • Operating System: Linux

Description

When upgrading from 7.1.0 to 8.0.2, I started seeing the following error, and the application quits.

Traceback (most recent call last):
  File "x.py", line 21, in <module>
    loop.run_until_complete(crash())
  File "/usr/lib/python3.6/asyncio/base_events.py", line 488, in run_until_complete
    return future.result()
  File "x.py", line 16, in crash
    bucket = await s3.meta.client.head_bucket(Bucket='test_bucket')
AttributeError: 'ResourceCreaterContext' object has no attribute 'meta'
sys:1: RuntimeWarning: coroutine 'AioSession._create_client' was never awaited

Rolling back to v7.1.0 resolves this issue (pip install aioboto3==7.1.0).

I've not had a chance to dig into this properly yet.

What I Did

import asyncio
import botocore
import aioboto3

async def crash():
    args = {
        'config': botocore.config.Config(
            signature_version='s3v4',
        ),
        'aws_access_key_id': 'minioadmin',
        'aws_secret_access_key': 'minioadmin',
        'endpoint_url': 'http://patch.attie.co.uk:9000/',
        'region_name': 'local',
    }
    s3 = aioboto3.resource('s3', **args)
    bucket = await s3.meta.client.head_bucket(Bucket='test_bucket')

    assert(False)

loop = asyncio.get_event_loop()
loop.run_until_complete(crash())
loop.close()
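
Per the breaking-change note in the README above, from v8 the .resource call must be entered as an async context manager; a hedged sketch of the corrected repro:

async def fixed():
    async with aioboto3.resource('s3', **args) as s3:
        bucket = await s3.meta.client.head_bucket(Bucket='test_bucket')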

dynamodb query with memory leak

  • Async AWS SDK for Python version: 6.4.1
  • Python version: Tested with Python 3.6.6, 3.7, and 3.8.1
  • Operating System: Arch Linux

Description

Code 1:

        def _query():
            res = boto3.resource('dynamodb')
            table = res.Table(config.METADATA_TABLE_NAME)
            return table.query(KeyConditionExpression=Key('namespace').eq(namespace))

        loop = asyncio.get_running_loop()
        response = await loop.run_in_executor(None, _query)

Code 2:


        async with aioboto3.resource('dynamodb') as dyn:
            table = dyn.Table(config.METADATA_TABLE_NAME)
            response = await table.query(KeyConditionExpression=Key('namespace').eq(namespace))

I use it to query around 5k namespaces. After 5k namespaces, Code 1 is consuming around 250MB, and Code 2 is using around 1.7 GB

s3 select api response payload can't be parsed

  • Async AWS SDK for Python version: aioboto3(6.4.1), aiobotocore(0.10.3)
  • Python version: 3.7
  • Operating System: MACOS 10.14.5
    asyncio (3.4.3)

Description

I am new to Python, so pardon me if I am doing something horribly wrong.
I am trying to call the S3 Select API asynchronously; I get a successful response but am not able to iterate over the payload. The EventStream returned by the call may not be compatible with asynchronous calls.

What I Did

I tried two different ways and neither of them worked. Note that the prints in the code sections below work fine.

Option 1:

Command: python <my_file_name>

Code:

session = aioboto3.Session(profile_name='<profile_name>', region_name='us-east-1')
async with session.client("s3") as client:
    obj = await client.select_object_content(**kwargs)
    print(obj['Payload'])
    print(sys.getsizeof(obj['Payload']))
    print(await obj['Payload']._raw_stream.read())

    async with obj['Payload'] as event_stream:
        for event in event_stream:
            if 'Records' in event:
                data = event['Records']['Payload'].decode('utf-8')
                print(data)

stack trace

Traceback (most recent call last):
  File "s3_util.py", line 164, in <module>
    asyncio.run(main())
  File "/usr/local/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/local/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 584, in run_until_complete
    return future.result()
  File "s3_util.py", line 159, in main
    'where \'aws_route53_record\' in s.resources[*].type')
  File "s3_util.py", line 53, in s3_select_from_object
    async with obj['Payload'] as event_stream:
AttributeError: __aexit__

Option 2:

Command: python <my_file_name>

Code:

session = aioboto3.Session(profile_name='<profile_name>', region_name='us-east-1')
async with session.client("s3") as client:
    obj = await client.select_object_content(**kwargs)
    print(obj['Payload'])
    print(sys.getsizeof(obj['Payload']))
    print(await obj['Payload']._raw_stream.read())

    for event in obj['Payload']:
        if 'Records' in event:
            data = event['Records']['Payload'].decode('utf-8')
            print(data)
    async with obj['Payload'] as event_stream:
        for event in event_stream:
            if 'Records' in event:
                data = event['Records']['Payload'].decode('utf-8')
                print(data)

stack trace:

Traceback (most recent call last):
  File "s3_util.py", line 167, in <module>
    asyncio.run(main())
  File "/usr/local/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/local/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/base_events.py", line 584, in run_until_complete
    return future.result()
  File "s3_util.py", line 162, in main
    'where \'aws_route53_record\' in s.resources[*].type')
  File "s3_util.py", line 52, in s3_select_from_object
    for event in obj['Payload']:
  File "/Users/lrd377/PycharmProjects/emr-cluster-init/venv/lib/python3.7/site-packages/botocore/eventstream.py", line 570, in __iter__
    for event in self._event_generator:
  File "/Users/lrd377/PycharmProjects/emr-cluster-init/venv/lib/python3.7/site-packages/botocore/eventstream.py", line 577, in _create_raw_event_generator
    for chunk in self._raw_stream.stream():
AttributeError: 'StreamReader' object has no attribute 'stream'
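
For reference, a hedged sketch of how the payload would be consumed if the event stream supported async iteration (an assumption; the versions listed above did not):

async with session.client("s3") as client:
    obj = await client.select_object_content(**kwargs)
    async for event in obj['Payload']:
        if 'Records' in event:
            print(event['Records']['Payload'].decode('utf-8'))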

Get object fails for zip file

  • Async AWS SDK for Python version:1.0.0
  • Python version:3.6.5
  • Operating System: Macos

Description

I have a zip file in my S3 bucket. I tried to get the zip file from S3 using client.get_object, however it fails with UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe7 in position 12: invalid continuation byte. The same error does not occur when I try to get the data using boto3.

What I Did

    async with aioboto3.client("s3", config=Config(signature_version='s3v4'), aws_access_key_id=AWS_ACCESS_KEY, aws_secret_access_key=AWS_SECRET_KEY) as s3_client:
        s3_ob = await s3_client.get_object(Bucket=DEFAULT_BUCKET_NAME, Key=path)

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe7 in position 12: invalid continuation byte

aioboto3 caches the loop and won't allow to use a new one even if the cached loop is closed

  • Async AWS SDK for Python version: 6.4.1
  • Python version: 3.7.4
  • Operating System: macOS 10.14.6

Description

I was writing tests with Pytest which uses multiple event loops and due to the global DEFAULT_SESSION behavior I had a bit of trouble.

Pytest's loop fixture is function-scoped, which recreates the asyncio loop every time a test runs. In my case I had a client factory which interacted with aioboto3; the first test would trigger the creation of the DEFAULT_SESSION, but subsequent tests will fail, as they will always refer to the loop of the DEFAULT_SESSION, which is closed. Even if you set the loop in the client or resource call, the loop of the DEFAULT_SESSION will be used.

I'm not sure if this is the intended design and just missing in the documentation and examples or if it is a bug, either way I'm willing to open a PR to fix it after knowing your thoughts.

What I Did

# This is not exactly the code I was executing, but it shows the problem

import pytest
import multicloud_table_client_factory

@pytest.fixture
async def multicloud_client_client(loop):
  # This client will eventually call `aioboto3.resource('dynamodb')`
  return await multicloud_table_client_factory.new_client()


def test_get_empty_result(multicloud_client_client):
  results = multicloud_client_client(primary_key="invalid_test", filter="*", only_use_aws=True)
  assert not len(results)

def test_get_many_results(multicloud_client_client):
  # This test will always fail because the loop aioboto3 cached was closed.
  results = multicloud_client_client(primary_key="many_entries", filter="*", only_use_aws=True)
  assert len(results) > 500

This is how I quickly fixed it for my tests:

@pytest.fixture
async def multicloud_client_client(loop):
    yield await timeline.initialize()
    # This will clear aioboto3 default session at the end of the test and it will recreate one with the current asyncio loop when it's called again
    aioboto3.DEFAULT_SESSION = None

It does feel like a bad fix, so I will just rewrite my code to use aioboto3.session.Session.resource instead, as it will use the correct loop. However, this is not documented and not shown in any example.
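
A hedged sketch of the Session-based rewrite mentioned above (the fixture and service names are illustrative):

import aioboto3
import pytest

@pytest.fixture
async def dynamo_resource(loop):
    # A fresh session per test binds to the current loop,
    # avoiding the cached DEFAULT_SESSION entirely.
    session = aioboto3.Session()
    async with session.resource('dynamodb') as resource:
        yield resource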

s3_bucket.objects.all() returns []

  • Async AWS SDK for Python version: 6.4.1
  • Python version: 3.7.6
  • Operating System: Linux 4.19.96 #1-NixOS SMP Tue Jan 14 19:07:09 UTC 2020 x86_64 GNU/Linux

Description

I'd like to list the objects in my bucket in order to, say, delete them so that I can delete the bucket, but I can't, because the list operation returns an empty list.

I noticed this operation isn't awaitable, so I guess it's just not implemented?

Following is a test that shows the faulty behaviour. You'll need pytest-asyncio. Be aware that this will create and delete an S3 bucket for the purposes of the test.

I'm not testing the expected behaviour, because then I wouldn't be able to delete the test bucket.

What I Did

@pytest.mark.asyncio
async def test_aioboto3_s3_bucket_list_objects(test_bucket):
    created_obj = await test_bucket.put_object(Key="test", Body="test data")
    objects = list(test_bucket.objects.all())
    assert objects == []  # This shouldn't be empty at all
    await created_obj.delete()


@pytest.fixture
async def test_bucket(s3_bucket_name, aio_s3_client, aio_s3_resource):
    region = aio_s3_client.meta.region_name
    response = await aio_s3_client.create_bucket(
        Bucket=s3_bucket_name,
        CreateBucketConfiguration=dict(LocationConstraint=region),
    )
    assert response["ResponseMetadata"]["HTTPStatusCode"] == 200
    yield aio_s3_resource.Bucket(s3_bucket_name)
    await aio_s3_client.delete_bucket(Bucket=s3_bucket_name)


@pytest.fixture
def s3_bucket_name():
    # Hardcoded for the example, mine is Faker-generated
    return "test-bucket-zqxvp9zl84p4"


@pytest.fixture
async def aio_s3_client():
    async with aioboto3.client("s3") as s3_client:
        yield s3_client


@pytest.fixture
async def aio_s3_resource():
    async with aioboto3.resource("s3") as s3_resource:
        yield s3_resource
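
Per the README note above that Bucket.objects has since been asyncified, a hedged sketch of how the listing would be collected with async iteration:

objects = [obj async for obj in test_bucket.objects.all()]  # inside the async test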

upload_fileobj shouldn't convert all exceptions to ClientError

  • Async AWS SDK for Python version: 1.9.91
  • Python version: 3.7
  • Operating System: osx

Description

I was trying to cancel() an upload task, but the CancelledError got converted to a ClientError, which means I can't safely do the right thing.

inject.py line 205 should just re-raise the previous exception, not convert it, i.e.:

- raise ClientError({'Error': {'Code': '404', 'Message': 'Not Found'}}, 'HeadObject')
+ raise

Slow S3 Download for multiple files

  • Async AWS SDK for Python version: 6.4.1
  • Python version: 3.6
  • Operating System: MacOS

Description

We have a Django app that needs to download 3 files for each id. There are N ids.

Doing it synchronously takes ~50s for 23 ids, so about 2 seconds per id.

Doing this asynchronously with aioboto3 takes the same length of time.

What I Did

I am following the example code from @Burrhank's answer from this issue... #137

The tasks are of the form....

                async def getRequest(jobKey):
                    async with aioboto3.resource(
                        "s3",
                        aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
                        aws_secret_access_key=settings.AWS_SECRECT_ACCESS_KEY,
                    ) as s3:
                        s3ObjectName = "{}_request.json".format(jobKey)
                        obj = s3.Object(
                            bucket, "{}{}".format(s3FolderKey, s3ObjectName)
                        )
                        obj = await obj.get()
                        data = await obj["Body"].read()
                        return json.loads(data.decode("utf-8"))

The tasks are managed as follows. When a jobId is provided, we want to make 3 calls to S3 to get 3 different files that are all in the same bucket/folder. The tasks getRequest, getResponse and getImage are all the same task, but each retrieves a different file; the max size is ~500KB.

                requestTask = self.loop.create_task(getRequest(jobId))
                responseTask = self.loop.create_task(getResponse(jobId))
                imageTask = self.loop.create_task(getImage(jobId))
                self.loop.run_until_complete(
                    asyncio.wait([requestTask, responseTask, imageTask])
                )
                response = {}
                response["request"] = requestTask.result()
                response["response"] = responseTask.result()
                response["image"] = imageTask.result()

Any advice on what could be going wrong here? Async time === Sync time. :(

Update....

I have tried with the client, instead of the resource, as per the documentation. This approach only saves me around 5s. What am I missing/doing wrong?

                async def getRequest(jobKey):
                    async with aioboto3.client(
                        "s3",
                        aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
                        aws_secret_access_key=settings.AWS_SECRECT_ACCESS_KEY,
                    ) as s3:
                        s3ObjectName = "{}_request.json".format(jobId)
                        obj = await s3.get_object(
                            Bucket=bucket, Key="{}{}".format(s3FolderKey, s3ObjectName)
                        )
                        data = await obj["Body"].read()
                        return json.loads(data.decode("utf-8"))

Does the region need to be specified? Or does aioboto know that S3 buckets are global?
I have also specified a region in the s3 client, but this does not affect performance.
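
One commonly suggested pattern, as a hedged sketch rather than a confirmed diagnosis of this report: open a single client once and share it across all the download tasks, instead of entering a new aioboto3.client context per task:

import asyncio
import json
import aioboto3

async def get_all(job_keys, bucket, s3_folder_key):
    async with aioboto3.client("s3") as s3:
        async def fetch(job_key):
            key = "{}{}_request.json".format(s3_folder_key, job_key)
            obj = await s3.get_object(Bucket=bucket, Key=key)
            data = await obj["Body"].read()
            return json.loads(data.decode("utf-8"))
        return await asyncio.gather(*(fetch(k) for k in job_keys))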

Feature request: s3_client.download_file* multipart download

Thanks for your excellent library! The docs mention the following patches to s3transfer:

s3_client.download_file* This is performed by the s3transfer module. -- Patched with get_object
s3_client.upload_file* This is performed by the s3transfer module. -- Patched with custom multipart upload

For performance reasons, it would be fantastic if the download_file* methods also did a custom multipart download, the same way the upload_file* methods do. Is this on the roadmap?

Aioboto3 not working on EKS

  • Async AWS SDK for Python version:
aioboto3==6.4.1
aiobotocore==0.10.4
boto3==1.9.252
botocore==1.12.252
  • Python version:
Python 3.7.1
  • Operating System:
Docker Image python:3.7.1-alpine3.8

Description

We are using aioboto3==5.0.0 for reading files from an S3 bucket inside a Kubernetes (EKS) pod. We now wanted to try out IAM Roles for Service Accounts (IRSA), and as aioboto3==5.0.0 doesn't support IRSA (check here), we updated aioboto3==5.0.0 to aioboto3==6.1.0 and then tried all the versions above, but it's not working.

What I Did

import asyncio
import aioboto3

async def get_file_text(key):
    """
    Get the text contents of the S3 object at the given key.
    :param key: string s3 key
    :return: string
    """
    async with aioboto3.resource('s3', endpoint_url="http://s3-ap-southeast-1.amazonaws.com") as s3:
        s3_object = s3.Object("call-ai-staging-notification-templates", key)
        return (await (await s3_object.get())['Body'].read()).decode("utf-8")

asyncio.get_event_loop().run_until_complete(get_file_text(key))

ERROR:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.7/asyncio/base_events.py", line 573, in run_until_complete
    return future.result()
  File "<stdin>", line 11, in get_file_text
  File "/usr/local/lib/python3.7/site-packages/aioboto3/resources.py", line 292, in do_action
    response = await action.async_call(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/aioboto3/resources.py", line 65, in async_call
    response = await getattr(parent.meta.client, operation_name)(**params)
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/client.py", line 92, in _make_api_call
    operation_model, request_dict, request_context)
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/client.py", line 113, in _make_request
    request_dict)
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/endpoint.py", line 226, in _send_request
    request = self.create_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 116, in create_request
    operation_name=operation_model.name)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 90, in handler
    return self.sign(operation_name, request)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 149, in sign
    auth = self.get_auth_instance(**kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 229, in get_auth_instance
    frozen_credentials = self._credentials.get_frozen_credentials()
  File "/usr/local/lib/python3.7/site-packages/botocore/credentials.py", line 577, in get_frozen_credentials
    self._refresh()
  File "/usr/local/lib/python3.7/site-packages/botocore/credentials.py", line 472, in _refresh
    self._protected_refresh(is_mandatory=is_mandatory_refresh)
  File "/usr/local/lib/python3.7/site-packages/botocore/credentials.py", line 488, in _protected_refresh
    metadata = self._refresh_using()
  File "/usr/local/lib/python3.7/site-packages/botocore/credentials.py", line 629, in fetch_credentials
    return self._get_cached_credentials()
  File "/usr/local/lib/python3.7/site-packages/botocore/credentials.py", line 640, in _get_cached_credentials
    self._write_to_cache(response)
  File "/usr/local/lib/python3.7/site-packages/botocore/credentials.py", line 665, in _write_to_cache
    self._cache[self._cache_key] = deepcopy(response)
  File "/usr/local/lib/python3.7/copy.py", line 169, in deepcopy
    rv = reductor(4)
TypeError: can't pickle coroutine objects

feature request

aioboto3/s3/inject.py

This is the current method for async (and sync) streaming upload of a file-like object:

async def upload_fileobj(self, Fileobj: BinaryIO, Bucket: str, Key: str, ExtraArgs: Optional[Dict[str, Any]] = None,
                         Callback: Optional[Callable[[int], None]] = None,
                         Config: Optional[S3TransferConfig] = None):
...
    async def file_reader() -> None:
...
        while not eof:
...
                if asyncio.iscoroutinefunction(Fileobj.read):  # handles if we pass in aiofiles obj
                    # noinspection PyUnresolvedReferences
                    data = await Fileobj.read(io_chunksize)
                else:
                    data = Fileobj.read(io_chunksize)
                    await asyncio.sleep(0.0)
...
                multipart_payload += data
...
            if not multipart_payload:
                break

            await io_queue.put({'Body': multipart_payload, 'Bucket': Bucket, 'Key': Key, 'PartNumber': part, 'UploadId': upload_id})

It would be great to be able to provide a method for stream-processing each chunk. Example:

async def upload_fileobj(self, Fileobj: BinaryIO, Bucket: str, Key: str, ExtraArgs: Optional[Dict[str, Any]] = None,
                         Callback: Optional[Callable[[int], None]] = None,
                         Config: Optional[S3TransferConfig] = None,
                         Processing: Callable[[bytes], bytes] = None):
...
    async def file_reader() -> None:
...
        while not eof:
...
                if asyncio.iscoroutinefunction(Fileobj.read):  # handles if we pass in aiofiles obj
                    # noinspection PyUnresolvedReferences
                    data = await Fileobj.read(io_chunksize)
                else:
                    data = Fileobj.read(io_chunksize)
                    await asyncio.sleep(0.0)
...
                multipart_payload += data
...
            if not multipart_payload:
                break
            elif Processing:
                multipart_payload = Processing(multipart_payload)

            await io_queue.put({'Body': multipart_payload, 'Bucket': Bucket, 'Key': Key, 'PartNumber': part, 'UploadId': upload_id})

Example usage: it would be useful to be able to copy a very big file from one bucket to another and transform it on the fly, without holding the whole file in RAM:

    async with aioboto3.client('s3') as s3:
        s3_ob = await s3.get_object(Bucket=bucket_src, Key=s3_key)
        async with s3_ob['Body'] as stream:
            await s3.upload_fileobj(
                Fileobj=stream,
                Bucket=bucket_dest,
                Key=s3_key,
                Processing=lambda x: x.lower(),
            )

RuntimeError: Event loop is closed with Chalice / Lambda

  • Async AWS SDK for Python version: 6.4.1
  • Python version: 3.7.2
  • Operating System: Windows 7 / Lambda python 3.7

Description

Running aioboto3 with Chalice (a Flask-like framework for AWS) / Lambda. Locally it always runs in the same runtime (the script does not restart after each call to the API), but in Lambda this sometimes happens too, because even though it is stateless, AWS internals will sometimes reuse the same instances. See https://stackoverflow.com/a/59309521/2470346.

The problem is that aioboto3 stores the session in a global, and while that works for the first call, it will not work for the second and subsequent calls, as it tries to reuse the same session, and therefore the same loop, which has already been closed.

What I Did

import asyncio
import aioboto3
from chalice import Chalice

app = Chalice(app_name="test")
@app.route('/v1/test', methods=['GET'])
def test():
    asyncio.run(test_main_coroutine())
    return {
        "foo": "bar",
    }

async def test_main_coroutine():
    return "lala"

Workaround

In the main coroutines, reset the default session and force it to use the new loop:

# rewrite aioboto default session so it uses a new one on each api call and not the global one on first run
async def test_main_coroutine():
    aioboto3.setup_default_session()

How it should've worked

I think aioboto3 should use the new loop on each call. This means it should not store the loop in the session, or it should always try to get the running loop.

Trace

Traceback (most recent call last):
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\chalice\app.py", line 1104, in _get_view_function_response
    response = view_function(**function_args)
  File "C:\Users\abivol\repos\venvname\src\chalice\app.py", line 122, in get_brochure
    response = asyncio.run(get_brochure_main_coroutine(config_id))
  File "C:\python37\Lib\asyncio\runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "C:\python37\Lib\asyncio\base_events.py", line 584, in run_until_complete
    return future.result()
  File "C:\Users\abivol\repos\venvname\src\chalice\app.py", line 139, in get_brochure_main_coroutine
    config_to_trunk = await get_config_to_trunk(BUCKET, config_id, raise_error=True)
  File "C:\Users\abivol\repos\venvname\src\chalice\chalicelib\utilities.py", line 42, in get_config_to_trunk
    config_to_trunk = await s3_fetch_json(bucket, get_config_to_trunk_key_name(config_id))
  File "C:\Users\abivol\repos\venvname\src\chalice\chalicelib\aws_utilities.py", line 67, in s3_fetch_json
    await s3_get(bucket, key, data)
  File "C:\Users\abivol\repos\venvname\src\chalice\chalicelib\aws_utilities.py", line 46, in s3_get
    return await s3.download_fileobj(bucket, key, filelike)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aioboto3\s3\inject.py", line 77, in download_fileobj
    resp = await self.get_object(Bucket=Bucket, Key=Key)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiobotocore\client.py", line 89, in _make_api_call
    operation_model, request_dict, request_context)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiobotocore\client.py", line 110, in _make_request
    request_dict)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiobotocore\endpoint.py", line 76, in _send_request
    exception):
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiobotocore\endpoint.py", line 178, in _needs_retry
    caught_exception=caught_exception, request_dict=request_dict)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\retryhandler.py", line 183, in __call__
    if self._checker(attempts, response, caught_exception):
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\retryhandler.py", line 251, in __call__
    caught_exception)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\retryhandler.py", line 269, in _should_retry
    return self._checker(attempt_number, response, caught_exception)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\retryhandler.py", line 317, in __call__
    caught_exception)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\retryhandler.py", line 223, in __call__
    attempt_number, caught_exception)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\botocore\retryhandler.py", line 359, in _check_caught_exception
    raise caught_exception
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiobotocore\endpoint.py", line 139, in _do_get_response
    http_response = await self._send(request)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiobotocore\endpoint.py", line 220, in _send
    request.method, url=url, headers=headers_, data=data, proxy=proxy)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiohttp\client.py", line 483, in _request
    timeout=real_timeout
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiohttp\connector.py", line 523, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiohttp\connector.py", line 859, in _create_connection
    req, traces, timeout)
  File "C:\Users\abivol\.virtualenvs\venvname-k8Lrhq7m\lib\site-packages\aiohttp\connector.py", line 967, in _create_direct_connection
    traces=traces), loop=self._loop)
  File "C:\python37\Lib\asyncio\tasks.py", line 769, in shield
    inner = ensure_future(arg, loop=loop)
  File "C:\python37\Lib\asyncio\tasks.py", line 581, in ensure_future
    task = loop.create_task(coro_or_future)
  File "C:\python37\Lib\asyncio\base_events.py", line 403, in create_task
    self._check_closed()
  File "C:\python37\Lib\asyncio\base_events.py", line 480, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed

Unable to locate credentials

  • Async AWS SDK for Python version: 8.2.0
  • Python version: Python 3.8.5
  • Operating System: Ubuntu 20.04.1 LTS
aioboto3==8.2.0
aiobotocore==1.1.2
aiohttp==3.7.3
boto3==1.14.44
botocore==1.17.44

Description

I need to download one hundred files from S3 and I'm using aioboto3.
After the first 50-60 files (the number varies from time to time), I get this error for some files:

Caught retryable HTTP exception while making metadata service request to http://169.254.169.254/latest/meta-data/iam/security-credentials/REDACTED: 
Traceback (most recent call last):
  File "/home/ubuntu/ENV/lib/python3.8/site-packages/aiobotocore/utils.py", line 79, in _get_request
    async with session.get(url, headers=headers) as resp:
  File "/home/ubuntu/ENV/lib/python3.8/site-packages/aiohttp/client.py", line 1117, in __aenter__
    self._resp = await self._coro
  File "/home/ubuntu/ENV/lib/python3.8/site-packages/aiohttp/client.py", line 619, in _request
    break
  File "/home/ubuntu/ENV/lib/python3.8/site-packages/aiohttp/helpers.py", line 656, in __exit__
    raise asyncio.TimeoutError from None
asyncio.exceptions.TimeoutError

repeated many times, and then:

Max number of attempts exceeded (1) when attempting to retrieve data from metadata service

Am I using it in the wrong way, or are we hitting some limit of the metadata service?

I think It's related to aio-libs/aiobotocore#808

WaiterAction support

Environment

  • Python version: 3.7
  • Package versions:
    • aioboto3-6.2.2
    • aiobotocore-0.10.2
    • boto3-1.9.91
    • botocore-1.12.91

Description

Currently, in order to perform a waiter action, the user needs to access the client:

waiter = resource.meta.client.get_waiter('instance_running')
await waiter.wait(InstanceId=[...])

Boto3, however, generates methods to simplify this operation; it would be good to support that too:

instance = await ec2.create_instance(...)
await instance.wait_instance_running()  # This operation can't be awaited now

To make this happen, we need to implement waiter action creation in AIOBoto3ResourceFactory. It should be similar to ServiceAction.
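
A hedged sketch of the behaviour such a generated method would need (the body below is an assumption based on how boto3's waiter actions delegate to the client waiter, not the actual factory code):

async def wait_instance_running(self, **kwargs):
    # Delegate to the client waiter, filling in the resource identifier.
    waiter = self.meta.client.get_waiter('instance_running')
    await waiter.wait(InstanceIds=[self.id], **kwargs)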

ModuleNotFoundError: No module named 'botocore.vendored'

  • Async AWS SDK for Python version:
  • Python version: 3.6.3
  • Operating System: Ubuntu 18.10

Description

Hi, first of all, AWESOME that someone actually made the effort to make this stuff async!!

I'm using:

python 3.6.3
aioboto3==6.3.0
aiobotocore==0.10.2
awscli==1.16.26
boto3==1.9.91
botocore==1.12.16

But this does not work with the latest botocore version (which has no "vendored" module, needed by aiobotocore.endpoint).

I've been fiddling with versions for almost an hour now and I can't get it right. Neither can pip ;(

Could you please share a working version set?

Traceback (most recent call last):
  File "encryption.py", line 3, in <module>
    import aioboto3
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aioboto3/__init__.py", line 5, in <module>
    from aioboto3.session import Session
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aioboto3/session.py", line 8, in <module>
    import aiobotocore.session
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aiobotocore/__init__.py", line 1, in <module>
    from .session import get_session, AioSession
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aiobotocore/session.py", line 8, in <module>
    from .client import AioClientCreator
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aiobotocore/client.py", line 11, in <module>
    from .args import AioClientArgsCreator
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aiobotocore/args.py", line 9, in <module>
    from .endpoint import AioEndpointCreator
  File "/ntfs/projects/gsmg/venv3.6/lib/python3.6/site-packages/aiobotocore/endpoint.py", line 14, in <module>
    from botocore.vendored.requests.structures import CaseInsensitiveDict
ModuleNotFoundError: No module named 'botocore.vendored'

Is it possible to use S3?

    async with aioboto3.resource('s3', region_name='us-west-1',
                                   #aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                                   #aws_access_key_id=AWS_ACCESS_KEY_ID
                                 ) as client:
        # upload object to amazon s3
        data = b'\x01'*1024
        resp = await client.put_object(Bucket=bucket,
                                            Key=key,
                                            Body=data)

There is an error:

     AttributeError: 's3.AIOServiceResource' object has no attribute 'put_object'
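
put_object lives on the client (or on a Bucket object), not on the service resource; a hedged sketch of the client-based call:

    async with aioboto3.client('s3', region_name='us-west-1') as client:
        data = b'\x01' * 1024
        resp = await client.put_object(Bucket=bucket, Key=key, Body=data)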

'coroutine' object has no attribute 'batch_writer'

  • Async AWS SDK for Python version: aioboto3 was deployed as a lambda layer built from the latest version.
    aioboto3==8.0.3
    aiobotocore==1.0.4
    aiohttp==3.6.2
    boto3==1.12.32
    botocore==1.15.32

  • Python version: 3.8

  • Operating System: AWS lambda

Description

I've been using aioboto3 for some time (release 7.0.0). Yesterday I tried the latest release and hit the problem described below:

'coroutine' object has no attribute 'batch_writer'

I use batch writes to load data from S3 into Dynamo.
Going back to version 7.0.0 solved the problem.

What I Did

Snippets of the code:

run(main(event, context))
    
async def main(event, context):
...
async with aioboto3_resource('dynamodb') as dynamo_resource:
            table = dynamo_resource.Table(environ['TABLE_NAME'])
            await s3_to_dynamo(event, context, shard_index, s3obj, table)
...

async def s3_to_dynamo (event, context, shard_index, s3obj, table):
...
await gather(*(dynamoBatchWrite(batch, table, segmentId) for batch in limitBatchArr))
...

async def dynamoBatchWrite(batch, table, segmentId, count=0):
        async with table.batch_writer() as batch_writer:
            for item in batch:
                await batch_writer.put_item(
                    Item=item
                )   

Here is the stack trace of the error:

{
  "error": "AttributeError",
  "cause": {
    "errorMessage": "'coroutine' object has no attribute 'batch_writer'",
    "errorType": "AttributeError",
    "stackTrace": [
      "  File \"/opt/python/aws_lambda_powertools/tracing/tracer.py\", line 266, in decorate\n    response = lambda_handler(event, context)\n",
      "  File \"/opt/python/aws_lambda_powertools/logging/logger.py\", line 442, in decorate\n    return lambda_handler(event, context)\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 231, in lambda_handler\n    run(main(event, context))\n",
      "  File \"/var/lang/lib/python3.8/asyncio/runners.py\", line 43, in run\n    return loop.run_until_complete(main)\n",
      "  File \"/var/lang/lib/python3.8/asyncio/base_events.py\", line 616, in run_until_complete\n    return future.result()\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 201, in main\n    await s3_to_dynamo(event, context, shard_index, s3obj, table)\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 150, in s3_to_dynamo\n    await gather(*(dynamoBatchWrite(batch, table, segmentId) for batch in limitBatchArr))\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 48, in dynamoBatchWrite\n    async with table.batch_writer() as batch_writer:\n"
    ]
  }
}
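
Per the README example near the top, in v8+ dynamo_resource.Table(...) is a coroutine and must be awaited before use; a hedged one-line fix for the snippet above:

table = await dynamo_resource.Table(environ['TABLE_NAME'])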

aioboto3 fails to install after 6.0.0 update

  • Async AWS SDK for Python version:
  • Python version:
    Python 3.6.6
    pip 18.0
  • Operating System:
    Kali Linux (Debian stretch based)
    commands executed in docker, but the same happens on the machine with non-root user

Description

I'm trying to install the latest version of aioboto3 inside a pipenv on Python 3.6.6.
The empty venv is successfully created.
Running pipenv install aioboto3 fails with the error message that it couldn't find a version that matches botocore<1.12.25,<1.13.0,>=1.12.24,>=1.12.25.
I expect aioboto3 to successfully install in a fresh pipenv on Python 3.6.6.

What I Did

create pipenv

command:

pipenv shell

output:

root@45aab38af4ee:/workspace/test# pipenv shell
Creating a virtualenv for this project…
Pipfile: /workspace/test/Pipfile
Using /usr/share/pyenv/versions/3.6.6/bin/python3.6 (3.6.6) to create virtualenv…
Already using interpreter /usr/share/pyenv/versions/3.6.6/bin/python3.6
Using base prefix '/usr/share/pyenv/versions/3.6.6'
New python executable in /workspace/test/.venv/bin/python3.6
Also creating executable in /workspace/test/.venv/bin/python
Installing setuptools, pip, wheel...done.

Virtualenv location: /workspace/test/.venv
Launching subshell in virtual environment…
 . /workspace/test/.venv/bin/activate
root@45aab38af4ee:/workspace/test#  . /workspace/test/.venv/bin/activate
(test) root@45aab38af4ee:/workspace/test#

installing aioboto3

command:

pipenv install aioboto3

output:

root@45aab38af4ee:/workspace/test# pipenv install aioboto3
Installing aioboto3…
Collecting aioboto3
  Downloading https://files.pythonhosted.org/packages/66/5d/99a7c6e08489f42f36eb2113c761d6244e8fbb3547548e07facacfccc703/aioboto3-6.0.0-py2.py3-none-any.whl
Collecting aiohttp>=3.3.1 (from aioboto3)
  Downloading https://files.pythonhosted.org/packages/52/f9/c22977fc95346911d8fe507f90c3c4e4f445fdf339b750be6f03f090498d/aiohttp-3.4.4-cp36-cp36m-manylinux1_x86_64.whl (1.1MB)
Collecting boto3<=1.9.25,>=1.9.24 (from aioboto3)
  Downloading https://files.pythonhosted.org/packages/38/84/70d0857806db52a238886c2405730cd19728c70b40285c1653e1c95c6464/boto3-1.9.25-py2.py3-none-any.whl (128kB)
Collecting botocore<1.12.25,>=1.12.24 (from aioboto3)
  Downloading https://files.pythonhosted.org/packages/f2/d5/87898cf85de64d62ae716cb2397cd3e58cdf3ba82847b0803bd2c4393f78/botocore-1.12.24-py2.py3-none-any.whl (4.7MB)
Collecting wrapt>=1.10.10 (from aioboto3)
  Downloading https://files.pythonhosted.org/packages/a0/47/66897906448185fcb77fc3c2b1bc20ed0ecca81a0f2f88eda3fc5a34fc3d/wrapt-1.10.11.tar.gz
Collecting yarl<2.0,>=1.0 (from aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/61/67/df71b367680e06bb4127e3df6189826d4b9daebf83c3bd5b9341c99ef528/yarl-1.2.6-cp36-cp36m-manylinux1_x86_64.whl (253kB)
Collecting idna-ssl>=1.0; python_version < "3.7" (from aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/46/03/07c4894aae38b0de52b52586b24bf189bb83e4ddabfe2e2c8f2419eec6f4/idna-ssl-1.1.0.tar.gz
Collecting async-timeout<4.0,>=3.0 (from aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/e1/1e/5a4441be21b0726c4464f3f23c8b19628372f606755a9d2e46c187e65ec4/async_timeout-3.0.1-py3-none-any.whl
Collecting attrs>=17.3.0 (from aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/3a/e1/5f9023cc983f1a628a8c2fd051ad19e76ff7b142a0faf329336f9a62a514/attrs-18.2.0-py2.py3-none-any.whl
Collecting multidict<5.0,>=4.0 (from aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/17/55/a4a9661d68b241eb3f4536287e2bb97048e4a6f8018747f02dab509f2062/multidict-4.5.1-cp36-cp36m-manylinux1_x86_64.whl (309kB)
Collecting chardet<4.0,>=2.0 (from aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting s3transfer<0.2.0,>=0.1.10 (from boto3<=1.9.25,>=1.9.24->aioboto3)
  Downloading https://files.pythonhosted.org/packages/d7/14/2a0004d487464d120c9fb85313a75cd3d71a7506955be458eebfe19a6b1d/s3transfer-0.1.13-py2.py3-none-any.whl (59kB)
Collecting jmespath<1.0.0,>=0.7.1 (from boto3<=1.9.25,>=1.9.24->aioboto3)
  Downloading https://files.pythonhosted.org/packages/b7/31/05c8d001f7f87f0f07289a5fc0fc3832e9a57f2dbd4d3b0fee70e0d51365/jmespath-0.9.3-py2.py3-none-any.whl
Collecting urllib3<1.24,>=1.20 (from botocore<1.12.25,>=1.12.24->aioboto3)
  Downloading https://files.pythonhosted.org/packages/bd/c9/6fdd990019071a4a32a5e7cb78a1d92c53851ef4f56f62a3486e6a7d8ffb/urllib3-1.23-py2.py3-none-any.whl (133kB)
Collecting python-dateutil<3.0.0,>=2.1; python_version >= "2.7" (from botocore<1.12.25,>=1.12.24->aioboto3)
  Downloading https://files.pythonhosted.org/packages/74/68/d87d9b36af36f44254a8d512cbfc48369103a3b9e474be9bdfe536abfc45/python_dateutil-2.7.5-py2.py3-none-any.whl (225kB)
Collecting docutils>=0.10 (from botocore<1.12.25,>=1.12.24->aioboto3)
  Downloading https://files.pythonhosted.org/packages/36/fa/08e9e6e0e3cbd1d362c3bbee8d01d0aedb2155c4ac112b19ef3cae8eed8d/docutils-0.14-py3-none-any.whl (543kB)
Collecting idna>=2.0 (from yarl<2.0,>=1.0->aiohttp>=3.3.1->aioboto3)
  Downloading https://files.pythonhosted.org/packages/4b/2a/0276479a4b3caeb8a8c1af2f8e4355746a97fab05a372e4a2c6a6b876165/idna-2.7-py2.py3-none-any.whl (58kB)
Collecting six>=1.5 (from python-dateutil<3.0.0,>=2.1; python_version >= "2.7"->botocore<1.12.25,>=1.12.24->aioboto3)
  Downloading https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Building wheels for collected packages: wrapt, idna-ssl
  Running setup.py bdist_wheel for wrapt: started
  Running setup.py bdist_wheel for wrapt: finished with status 'done'
  Stored in directory: /root/.cache/pipenv/wheels/48/5d/04/22361a593e70d23b1f7746d932802efe1f0e523376a74f321e
  Running setup.py bdist_wheel for idna-ssl: started
  Running setup.py bdist_wheel for idna-ssl: finished with status 'done'
  Stored in directory: /root/.cache/pipenv/wheels/d3/00/b3/32d613e19e08a739751dd6bf998cfed277728f8b2127ad4eb7
Successfully built wrapt idna-ssl
Installing collected packages: multidict, idna, yarl, idna-ssl, async-timeout, attrs, chardet, aiohttp, urllib3, jmespath, six, python-dateutil, docutils, botocore, s3transfer, boto3, wrapt, aioboto3
Successfully installed aioboto3-6.0.0 aiohttp-3.4.4 async-timeout-3.0.1 attrs-18.2.0 boto3-1.9.25 botocore-1.12.24 chardet-3.0.4 docutils-0.14 idna-2.7 idna-ssl-1.1.0 jmespath-0.9.3 multidict-4.5.1 python-dateutil-2.7.5 s3transfer-0.1.13 six-1.11.0 urllib3-1.23 wrapt-1.10.11 yarl-1.2.6

Adding aioboto3 to Pipfile's [packages]…
Pipfile.lock not found, creating…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…

Warning: Your dependencies could not be resolved. You likely have a mismatch in your sub-dependencies.
  First try clearing your dependency cache with $ pipenv lock --clear, then try the original command again.
 Alternatively, you can use $ pipenv install --skip-lock to bypass this mechanism, then run $ pipenv graph to inspect the situation.
  Hint: try $ pipenv lock --pre if it is a pre-release dependency.
Could not find a version that matches botocore<1.12.25,<1.13.0,>=1.12.24,>=1.12.25
Tried: 0.4.1, 0.4.2, 0.5.0, ..., 1.12.48, 1.12.49, 1.12.50 (full list of several hundred candidate versions elided)
Skipped pre-versions: 1.0.0a1, 1.0.0a2, 1.0.0a3, 1.0.0b1, 1.0.0b2, 1.0.0b3, 1.0.0rc1
There are incompatible versions in the resolved dependencies.


'async for' requires an object with __aiter__ method, got s3.Bucket.objectsCollection

  • Async AWS SDK for Python version: aioboto3-6.3.0
  • Python version: Python3.6
  • Operating System: Linux

Problem:

I am using aioboto3 to create an S3 resource, and then I call the bucket objects' filter method:

my_bucket = self.s3_resource.Bucket(bucket_name)
response = my_bucket.objects.filter(Prefix=remote_directory_prefix)

I can't loop/iterate over this response:

async for r in response:

The async for above throws this exception:
'async for' requires an object with __aiter__ method, got s3.Bucket.objectsCollection

Analysis

It seems s3.Bucket.objectsCollection doesn't implement the __aiter__() method.
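
For reference, on current aioboto3 releases (v9+, where resources come from a Session and Bucket() must be awaited) filtered collections are async-iterable; a minimal sketch with placeholder bucket and prefix names:

import asyncio
import aioboto3


async def main():
    session = aioboto3.Session()
    async with session.resource('s3') as s3:
        # Bucket() is a coroutine in aioboto3 and must be awaited
        bucket = await s3.Bucket('my-bucket')
        # Filtered collections support async iteration in current releases
        async for obj in bucket.objects.filter(Prefix='some/prefix/'):
            print(obj.key)


asyncio.run(main())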

update to aiobotocore 0.12.0

aiobotocore 0.11 depends on botocore 1.13.14, which requires python-dateutil<2.8.1, and that fails with Python 3.7.

s3_client.copy not working

  • Async AWS SDK for Python version:
Requirement already satisfied: aioboto3 (4.0.1)
Requirement already satisfied: botocore<=1.10.12 (1.10.12)
Requirement already satisfied: boto3<=1.7.12,>=1.4.7 (1.7.12)
Requirement already satisfied: aiobotocore==0.8.0 (0.8.0)
(remaining transitive dependency lines elided)

  • Python version: Py3.6.6
  • Operating System: OS X

Description

I am trying to use s3_client.copy to copy large files (> 5 GB), which requires the multipart support of the copy method. Unfortunately that fails.

What I Did

import asyncio
import aioboto3

# A, B, C, D and NAMES are placeholder bucket/key names elided by the reporter.

async def copy_rename(s3, name):
    bucket = A
    sha1sum_key = B

    # get object from s3
    response = await s3.get_object(Bucket=bucket, Key=sha1sum_key)
    async with response['Body'] as stream:
        sha1sum = (await stream.read()).decode('ascii').strip()

    response = await s3.copy(
        Bucket=C,
        Key=f'{name}-{sha1sum[:8]}.bin',
        CopySource=dict(Bucket=D, Key=f'{name}.bin'))


async def go():
    async with aioboto3.client('s3', region_name='us-east-1') as s3:
        tasks = []
        for name in NAMES:
            tasks.append(copy_rename(s3, name))
        await asyncio.gather(*tasks)

Fails with

Traceback (most recent call last):
  File "s3_links.py", line 33, in <module>
    loop.run_until_complete(go())
  File "/Users/lllausen/anaconda3/envs/py36/lib/python3.6/asyncio/base_events.py", line 468, in run_until_complete
    return future.result()
  File "s3_links.py", line 29, in go
    await asyncio.gather(*tasks)
  File "s3_links.py", line 21, in copy_rename
    Bucket=..., Key=...
  File "/Users/lllausen/anaconda3/envs/py36/lib/python3.6/site-packages/boto3/s3/inject.py", line 271, in copy
    return future.result()
  File "/Users/lllausen/anaconda3/envs/py36/lib/python3.6/site-packages/s3transfer/futures.py", line 73, in result
    return self._coordinator.result()
  File "/Users/lllausen/anaconda3/envs/py36/lib/python3.6/site-packages/s3transfer/futures.py", line 233, in result
    raise self._exception
  File "/Users/lllausen/anaconda3/envs/py36/lib/python3.6/site-packages/s3transfer/tasks.py", line 255, in _main
    self._submit(transfer_future=transfer_future, **kwargs)
  File "/Users/lllausen/anaconda3/envs/py36/lib/python3.6/site-packages/s3transfer/copies.py", line 112, in _submit
    response['ContentLength'])
TypeError: 'coroutine' object is not subscriptable
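
For anyone hitting this on an old version: the injected copy at the time still delegated to boto3's thread-based s3transfer, which is where the un-awaited coroutine comes from. A possible workaround is to drive the multipart copy yourself with the low-level client calls; a sketch (part size, bucket and key names are placeholders, and error handling is minimal):

PART_SIZE = 512 * 1024 * 1024  # 512 MiB; every part except the last must be >= 5 MiB


async def multipart_copy(s3, src_bucket, src_key, dst_bucket, dst_key):
    # Find out how big the source object is so we can slice it into parts
    head = await s3.head_object(Bucket=src_bucket, Key=src_key)
    size = head['ContentLength']

    mpu = await s3.create_multipart_upload(Bucket=dst_bucket, Key=dst_key)
    upload_id = mpu['UploadId']
    try:
        parts = []
        for part_number, start in enumerate(range(0, size, PART_SIZE), start=1):
            end = min(start + PART_SIZE, size) - 1
            resp = await s3.upload_part_copy(
                Bucket=dst_bucket, Key=dst_key,
                UploadId=upload_id, PartNumber=part_number,
                CopySource={'Bucket': src_bucket, 'Key': src_key},
                CopySourceRange=f'bytes={start}-{end}')
            parts.append({'ETag': resp['CopyPartResult']['ETag'],
                          'PartNumber': part_number})
        await s3.complete_multipart_upload(
            Bucket=dst_bucket, Key=dst_key, UploadId=upload_id,
            MultipartUpload={'Parts': parts})
    except Exception:
        # Abort so the partial upload doesn't linger (and keep costing money)
        await s3.abort_multipart_upload(Bucket=dst_bucket, Key=dst_key,
                                        UploadId=upload_id)
        raise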

how to use s3 download_file

  • Async AWS SDK for Python version: boto3 1.9.49
  • Python version:3.6
  • Operating System: Mac OS

Description

I tried to download a file from S3 and received this error:
A Future or coroutine is required

In the documentation I cannot find how to use S3 properly; only DynamoDB is covered. Would it be worth updating the docs?

What I Did

import asyncio

import aioboto3
import const  # the reporter's settings module providing bucket_name


async def download_files():
    async with aioboto3.resource('s3') as s3:
        await s3.meta.client.download_file(const.bucket_name, 'file/test.txt', '/tmp/test.txt')
        print('file downloaded')


def process():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(download_files())
    loop.close()


process()
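
For reference, with the current session-based API download_file is injected as a coroutine directly on the client; a minimal sketch with placeholder names:

import asyncio
import aioboto3


async def main():
    session = aioboto3.Session()
    async with session.client('s3') as s3:
        # download_file is awaitable in aioboto3
        await s3.download_file('my-bucket', 'file/test.txt', '/tmp/test.txt')


asyncio.run(main())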

Should I reuse resource?

In the example below, considering that put_item will be called multiple times during the lifetime of my application, should I store and reuse dynamo_resource or instantiate a new one for each call?

async with aioboto3.resource('dynamodb', region_name='eu-central-1') as dynamo_resource:
    table = await dynamo_resource.Table('test_table')

    await table.put_item(
        Item={'pk': 'test1', 'col1': 'some_data'}
    )

upgrade botocore and boto3 versions

  • Async AWS SDK for Python version: 5.0.0
  • Python version: 3.6.1
  • Operating System: OS X

Description

Are you planning to upgrade your botocore and boto3 dependencies anytime soon?
You currently require boto3>=1.4.7,<=1.7.58 while the latest version is already 1.9.40, and botocore>=1.10.58,<1.10.59 while the latest version is already 1.12.40.

Thanks

upload_fileobj does not execute abort_multipart_upload if an error occurs while reading the file

  • Async AWS SDK for Python version: 8.0.3
  • Python version: 3.8.3
  • Operating System: Mac OS Catalina 10.15.5

Description

upload_fileobj does not execute abort_multipart_upload if an error occurs while reading the file.
I expect abort_multipart_upload to be called if upload_fileobj aborts.

What I Did

import asyncio
import aioboto3

ENDPOINT_URL = ''
AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
Bucket = ''

# broken_file is a file-like object that raises partway through being read
# (its definition was elided by the reporter)


async def test():
    session = aioboto3.session.Session(aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
    async with session.client(service_name='s3', endpoint_url=ENDPOINT_URL) as s3:
        response = await s3.list_multipart_uploads(Bucket=Bucket)
        print(len(response['Uploads']))

        try:
            await s3.upload_fileobj(broken_file, Bucket, 'broken2')
        finally:
            response = await s3.list_multipart_uploads(Bucket=Bucket)
            print(len(response['Uploads']))


asyncio.run(test())
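
Until the library aborts the upload itself, a caller can clean up dangling uploads manually; a minimal sketch using the standard list/abort calls (the bucket name is a placeholder):

async def abort_stale_uploads(s3, bucket):
    # List any multipart uploads that were started but never completed
    response = await s3.list_multipart_uploads(Bucket=bucket)
    for upload in response.get('Uploads', []):
        # Abort each one so S3 stops storing (and billing for) orphaned parts
        await s3.abort_multipart_upload(
            Bucket=bucket,
            Key=upload['Key'],
            UploadId=upload['UploadId'],
        )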

Is it possible to upload to s3 asynchronously by chunks?

What I am trying to do: I want to download files from a remote source and upload them directly to S3 without storing the bytes on the local host (i.e. stream the data from the remote machine through my machine to S3). There are tens of files in the remote location, which is why I want to use an async library and stream all the files simultaneously to save time.

Is it possible with the current version of the lib to do something like the following?

async with aioboto3.client('s3', loop=loop) as client:
    async for chunk in response.content.iter_chunked(1024):
        if chunk:
            # await put chunk into appropriate file handle on s3

Thanks for some pointers.
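
This kind of streaming is doable with a manual multipart upload; a sketch that buffers the HTTP response into S3-sized parts (URL, bucket and key are placeholders; every part except the last must be at least 5 MiB):

import aiohttp
import aioboto3

PART_SIZE = 8 * 1024 * 1024  # 8 MiB; parts other than the last must be >= 5 MiB


async def stream_to_s3(url, bucket, key):
    session = aioboto3.Session()
    async with aiohttp.ClientSession() as http, session.client('s3') as s3:
        async with http.get(url) as response:
            mpu = await s3.create_multipart_upload(Bucket=bucket, Key=key)
            upload_id = mpu['UploadId']
            parts, part_number, buffer = [], 1, b''

            async def flush(body):
                nonlocal part_number
                resp = await s3.upload_part(
                    Bucket=bucket, Key=key, UploadId=upload_id,
                    PartNumber=part_number, Body=body)
                parts.append({'ETag': resp['ETag'], 'PartNumber': part_number})
                part_number += 1

            try:
                async for chunk in response.content.iter_chunked(64 * 1024):
                    buffer += chunk
                    if len(buffer) >= PART_SIZE:
                        await flush(buffer)
                        buffer = b''
                if buffer:  # the final part may be smaller than 5 MiB
                    await flush(buffer)
                await s3.complete_multipart_upload(
                    Bucket=bucket, Key=key, UploadId=upload_id,
                    MultipartUpload={'Parts': parts})
            except Exception:
                await s3.abort_multipart_upload(Bucket=bucket, Key=key,
                                                UploadId=upload_id)
                raise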

S3 client->put_object not working

  • Async AWS SDK for Python version: aioboto3==6.2.2
  • Python version: Python 3.7.3
  • Operating System: Windows 10

Description

I have a method that uploads data to S3 using put_object, but it fails with a timeout on the await.

What I Did

I looked for an example that could help me validate my approach, but couldn't find one in the documentation.

Here is my code:

import logging
import time
import aioboto3
import asyncio

logger = logging.getLogger(__name__)


async def upload_file_to_s3_async(bucket_name, key):
    async with aioboto3.client(
            's3',
            aws_access_key_id=<AWS ID>,
            aws_secret_access_key=<AWS-ACCESS-KEY>,
            use_ssl=False
    ) as client:
        start_time = time.time()
        logger.info('Starting Upload')
        await client.put_object(
            Bucket=bucket_name,
            Key=key,
            Body=b'Test Data'
        )
        logger.info('File Uploaded to S3 in {}'.format(time.time() - start_time))


async def main():
    await upload_file_to_s3_async('bucket_123', 'key/1/2/3')

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

It hangs after logger.info('Starting Upload'); I'm not sure why the put_object method is failing.

The exception displayed:
aiohttp.client_exceptions.ServerTimeoutError: Timeout on reading data from socket

P.S.: I have the same implementation in plain boto3 and it works.
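
If the socket really is slow rather than the request being wrong (and it may also be worth checking whether use_ssl=False is the culprit), the standard botocore Config timeouts can be passed through the client constructor; a sketch assuming aiobotocore honours them (values are arbitrary):

import asyncio
import aioboto3
from botocore.config import Config


async def main():
    # Standard botocore options; assumed here to be passed through to aiohttp
    config = Config(connect_timeout=60, read_timeout=300,
                    retries={'max_attempts': 3})
    session = aioboto3.Session()
    async with session.client('s3', config=config) as s3:
        await s3.put_object(Bucket='bucket_123', Key='key/1/2/3',
                            Body=b'Test Data')


asyncio.run(main())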

Running `pytest tests` fails

  • Async AWS SDK for Python version:
  • Python version: 3.7
  • Operating System: Mac

Description


What I Did

pytest tests

_____________________________________________________________ ERROR at setup of test_asymmetric_cse_encrypt_decrypt_aes_cbc ______________________________________________________________

    @pytest.yield_fixture(scope="session")
    def s3_server():
        host = "localhost"
        port = 5002
        url = "http://{host}:{port}".format(host=host, port=port)
>       process = start_service('s3', host, port)

tests/mock_server.py:72:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/mock_server.py:19: in start_service
    process = sp.Popen(args, stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE)  # shell=True
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py:775: in __init__
    restore_signals, start_new_session)
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py:1436: in _execute_child
    executable = os.fsencode(executable)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename = None

    def fsencode(filename):
        """Encode filename (an os.PathLike, bytes, or str) to the filesystem
        encoding with 'surrogateescape' error handler, return bytes unchanged.
        On Windows, use 'strict' error handler if the file system encoding is
        'mbcs' (which is the default encoding).
        """
>       filename = fspath(filename)  # Does type-checking of `filename`.
E       TypeError: expected str, bytes or os.PathLike object, not NoneType

/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/os.py:809: TypeError

BatchGetItem

Would you be interested in a PR that supports a background-thread-supported async batch get against a single table?

It's not really async all the way down - it spawns a single IO thread per table and does timed batching in the background, but it does offer an async interface that looks basically like this:

import logging
import typing as ty
from asyncio import get_event_loop

logger = logging.getLogger(__name__)


async def get_dynamo_item(
        table_name: str,
        primary_key_tuple: tuple,
        primary_key_attr_names: ty.Sequence[str] = ('id',),
        item_nicename: str = 'Item',
) -> dict:
    """Performs Dynamo request batching behind the scenes.

    The primary key tuple must be in the order defined by the primary_key_attr_names.

    This is guaranteed to be slower than just asking for a single item
    directly, because the batcher waits a short period to maximize the
    size of its batches, so only use this if you know your client code
    is going to be making multiple async requests.
    You should almost certainly wrap this function with a helper
    method for your given table and item type.
    """
    logger.debug(f'Asking for {primary_key_tuple}')
    # ensure_table_request_fulfiller is part of the author's unpublished helper code
    fulfiller = ensure_table_request_fulfiller(table_name, primary_key_attr_names)
    fut = get_event_loop().create_future()
    fulfiller.request_map[fut] = primary_key_tuple
    await fulfiller.queue.put(fut)
    item = await fut
    return item

The inputs at this level are primary key tuples instead of the 'normal' key dicts so that we can deduplicate them, as boto3 will not allow duplicates within a batch and different async users can't deduplicate against each other. This is why it is also necessary to supply a parallel tuple of the names of the primary key attributes. In many cases this will simply be ('id',), so the default is provided.

A table name is used instead of a TableResource so that we can pull the results from the proper part of the response.

It would be pretty trivial to write a wrapper that instead pulls this information (key schema and table name) from the TableResource and then extracts the keys from input dicts, to better match the batch_get_items interface itself.

I do have an entire implementation that's been running in production for ~6 months, so if you're interested, I could make a PR. It's Python 3.7 at the moment with static typing and whatnot, but that could be stripped. I'm not sure what versions of Python you need to support, and how much effort it would be to go back to, say, 3.3 or 2.7. And it depends on probably another ~500 lines of code that I haven't posted here. Again, it's backed by a worker thread, which might not be in scope for this library.
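
For comparison, the low-level client call this proposal would batch over looks roughly like the following (table and key names are placeholders; values use the DynamoDB attribute format):

import asyncio
import aioboto3


async def main():
    session = aioboto3.Session()
    async with session.client('dynamodb') as dynamo:
        # One BatchGetItem round trip covering several keys in one table
        response = await dynamo.batch_get_item(RequestItems={
            'test_table': {
                'Keys': [{'id': {'S': 'a'}}, {'id': {'S': 'b'}}],
            },
        })
        print(response['Responses']['test_table'])


asyncio.run(main())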

Can not call load() on s3.Object()

  • Async AWS SDK for Python version: latest
  • Python version: 3.7.3
  • Operating System: debian-stretch

Description

I'm trying to write an asynchronous function that checks whether a file exists on S3 quickly (independent of the number of files or the size of the file in question), which is classically done with load(). However, running this code asynchronously seems to cause issues.

What I Did

from urllib import parse

import aioboto3
from botocore.exceptions import ClientError

def _parse_s3_path(s3_path):
    parsed = parse.urlparse(s3_path)
    return parsed.netloc, parsed.path[1:]


async def s3_exists(filename):
    async with aioboto3.resource('s3') as s3:
        bucket, key = _parse_s3_path(filename)
        try:
            await s3.Object(bucket, key).load()
        except ClientError as e:
            if e.response['Error']['Code'] == "404":
                return False
            else:
                raise e
        else:
            return True

Traceback when checking for an existing filename, e.g. await s3_exists('s3://some-bucket/some/nested/file.txt'):

await s3.Object(bucket, key).load()
  File "/usr/local/lib/python3.7/site-packages/aioboto3/resources.py", line 276, in do_action
    response = await action.async_call(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/aioboto3/resources.py", line 65, in async_call
    response = await getattr(parent.meta.client, operation_name)(**params)
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/client.py", line 93, in _make_api_call
    operation_model, request_dict, request_context)
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/client.py", line 114, in _make_request
    request_dict)
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/endpoint.py", line 232, in _send_request
    exception)):
  File "/usr/local/lib/python3.7/site-packages/aiobotocore/endpoint.py", line 263, in _needs_retry
    caught_exception=caught_exception, request_dict=request_dict)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/utils.py", line 1146, in redirect_from_error
    new_region = self.get_bucket_region(bucket, response)
  File "/usr/local/lib/python3.7/site-packages/botocore/utils.py", line 1204, in get_bucket_region
    headers = response['ResponseMetadata']['HTTPHeaders']
TypeError: 'coroutine' object is not subscriptable

Thread safety

Description

I'm trying to evaluate whether to use aioboto3 for a project that will do multiple concurrent uploads to S3 using an aioboto3.Session. According to the boto3 documentation, sessions contain shared data and should not be shared across multiple threads. Is this also a problem in aioboto3, or is it safe to use a session multiple times concurrently? Since aioboto3 is described as just a wrapper around boto3, I would expect it to be unsafe to use concurrently if that shared state is used to track uploads or similar.

Error 'AIOBoto3ResourceFactory._create_autoload_property.' was never awaited on Object.copy_from

  • Async AWS SDK for Python version: 8.0.5
  • Python version: 3.8
  • Operating System: Ubuntu 20.04

Description

I tried to update tags using Object.copy_from but got the error sys:1: RuntimeWarning: coroutine 'AIOBoto3ResourceFactory._create_autoload_property.<locals>.property_loader' was never awaited. Here is the error output from the code.

Traceback (most recent call last):
  File "s3.py", line 36, in <module>
    loop.run_until_complete(main())
  File "/home/eka/.pyenv/versions/3.8.5/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "s3.py", line 29, in main
    await obj.copy_from(CopySource={'Bucket':'mybucket', 'Key':file.key}, TaggingDirective='REPLACE', MetadataDirective='REPLACE', Tagging="duplicate=true", Metadata=obj.metadata)
  File "/home/eka/.local/share/virtualenvs/s3-VfuamJbd/lib/python3.8/site-packages/aioboto3/resources/factory.py", line 239, in do_action
    response = await action(self, *args, **kwargs)
  File "/home/eka/.local/share/virtualenvs/s3-VfuamJbd/lib/python3.8/site-packages/aioboto3/resources/action.py", line 41, in __call__
    response = await getattr(parent.meta.client, operation_name)(**params)
  File "/home/eka/.local/share/virtualenvs/s3-VfuamJbd/lib/python3.8/site-packages/aiobotocore/client.py", line 97, in _make_api_call
    request_dict = await self._convert_to_request_dict(
  File "/home/eka/.local/share/virtualenvs/s3-VfuamJbd/lib/python3.8/site-packages/aiobotocore/client.py", line 145, in _convert_to_request_dict
    request_dict = self._serializer.serialize_to_request(
  File "/home/eka/.local/share/virtualenvs/s3-VfuamJbd/lib/python3.8/site-packages/botocore/validate.py", line 297, in serialize_to_request
    raise ParamValidationError(report=report.generate_report())
botocore.exceptions.ParamValidationError: Parameter validation failed:
Invalid type for parameter Metadata, value: <coroutine object AIOBoto3ResourceFactory._create_autoload_property.<locals>.property_loader at 0x7ff1e33ffc40>, type: <class 'coroutine'>, valid types: <class 'dict'>
sys:1: RuntimeWarning: coroutine 'AIOBoto3ResourceFactory._create_autoload_property.' was never awaited

What I Did

import asyncio
import aioboto3


async def main():
    async with aioboto3.resource('s3') as s3:
        audio_bucket = await s3.Bucket('mybucket')

        async for file in audio_bucket.objects.all():
            obj = await audio_bucket.Object(key=file.key)
            await obj.copy_from(CopySource={'Bucket': 'mybucket', 'Key': file.key}, TaggingDirective='REPLACE', MetadataDirective='REPLACE', Tagging="a=1", Metadata=obj.metadata)


loop = asyncio.get_event_loop()
loop.run_until_complete(main())
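
The validation error points at Metadata: in aioboto3, autoloaded resource properties such as obj.metadata are coroutines and need to be awaited before being passed back into an API call. A likely fix, keeping the reporter's v8-era aioboto3.resource API:

import asyncio
import aioboto3


async def main():
    async with aioboto3.resource('s3') as s3:
        audio_bucket = await s3.Bucket('mybucket')

        async for file in audio_bucket.objects.all():
            obj = await audio_bucket.Object(key=file.key)
            # Resolve the autoload property first instead of passing the coroutine
            metadata = await obj.metadata
            await obj.copy_from(
                CopySource={'Bucket': 'mybucket', 'Key': file.key},
                TaggingDirective='REPLACE',
                MetadataDirective='REPLACE',
                Tagging='duplicate=true',
                Metadata=metadata,
            )


asyncio.run(main())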

Feature request: concurrent upload

Looking at the code for upload_fileobj(), it appears max_concurrency is not used:
https://github.com/terrycain/aioboto3/blob/v6.3.0/aioboto3/s3/inject.py#L145

Compare with _upload_parts() in s3transfer; it would appear it should not be too difficult to add async concurrent uploads:
https://github.com/boto/s3transfer/blob/dc28ff80658ab65b25d98872ab20afe8a56ad119/s3transfer/__init__.py#L398

Would love this feature to be added so we can maximize throughput during uploads.
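
A sketch of what bounded concurrency over the part uploads could look like, assuming upload_id and the list of part bodies come from the surrounding multipart machinery:

import asyncio


async def upload_parts_concurrently(s3, bucket, key, upload_id, part_bodies,
                                    max_concurrency=10):
    semaphore = asyncio.Semaphore(max_concurrency)

    async def upload_one(part_number, body):
        async with semaphore:  # cap the number of in-flight requests
            resp = await s3.upload_part(
                Bucket=bucket, Key=key, UploadId=upload_id,
                PartNumber=part_number, Body=body)
            return {'ETag': resp['ETag'], 'PartNumber': part_number}

    coros = [upload_one(n, body) for n, body in enumerate(part_bodies, start=1)]
    # Returns the part list expected by complete_multipart_upload
    return await asyncio.gather(*coros)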

How to configure max connections?

Hello, is there a parameter where I can configure the max connections of the aiohttp connector?

This is very useful for rate limiting concurrent Lambda invocations.

For example, in aiohttp you can configure this :

conn = aiohttp.TCPConnector(limit=10, limit_per_host=10)
session = aiohttp.ClientSession(loop=loop, connector=conn)
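
botocore's standard Config has a max_pool_connections option, which aiobotocore is expected to map onto the underlying aiohttp connector's limit; a minimal sketch under that assumption:

import asyncio
import aioboto3
from botocore.config import Config


async def main():
    # max_pool_connections is standard botocore; assumed here to become
    # the aiohttp connector's connection limit
    config = Config(max_pool_connections=10)
    session = aioboto3.Session()
    async with session.client('lambda', config=config) as lam:
        await lam.list_functions()


asyncio.run(main())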

Saving the boto3 resource with aioboto3 8.x

After the upgrade of aiobotocore to v 1.x.x, the documentation suggests that the aioboto3 resource should be used with an async context manager. This raises a small concern on my side about saving the resource.

My use case is one in which I run a web server with aiohttp, and need to access dynamodb for every request made. Previously I was able to initiate the resource and subsequently the Table-object in my database abstraction object, which was then stored in the aiohttp request context at application startup. I would then call table.close() at application shutdown and cleanup. Creating a new resource and table object for each request is too slow.

It would be beneficial to add some documentation on how real world use cases are supported. Is there a way to maintain the Table-object with the new changes?
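
One pattern that keeps a single resource (and its Table object) alive for the lifetime of an aiohttp application is to hold the context manager open with contextlib.AsyncExitStack; a sketch with placeholder table and region names:

import contextlib

import aioboto3
from aiohttp import web


async def dynamo_ctx(app):
    # Enter the resource context at startup and keep it open
    exit_stack = contextlib.AsyncExitStack()
    session = aioboto3.Session()
    resource = await exit_stack.enter_async_context(
        session.resource('dynamodb', region_name='eu-central-1'))
    app['table'] = await resource.Table('test_table')
    yield
    # Close the resource (and its HTTP connections) at shutdown
    await exit_stack.aclose()


app = web.Application()
app.cleanup_ctx.append(dynamo_ctx)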

s3 resource ObjectSummary.size -> argument of type 'coroutine' is not iterable

  • Async AWS SDK for Python version: 8.0.5
  • Python version: 3.8.5
  • Operating System: MacOS 10.15.7

Description

Trying to use the s3 resource's ObjectSummary to retrieve the file size of an S3 object. I can await/obtain a valid ObjectSummary object, but the properties are all still coroutine objects. When I try to await the size property to resolve it to a value, I get argument of type 'coroutine' is not iterable.

Refer to the statement in the code below re: filesize = await object_summary.size

What I Did

import asyncio
import os

import aioboto3
import boto3


async def main():
    sess = boto3.Session(profile_name=os.getenv('AWS_PROFILE'))

    access_key_id = sess._session.full_config['profiles']['myprofile']['aws_access_key_id']
    access_key = sess._session.full_config['profiles']['myprofile']['aws_secret_access_key']
    aioboto3.setup_default_session(aws_access_key_id=access_key_id, aws_secret_access_key=access_key)

    bucket_name = '******'
    object_key = '******'

    async with aioboto3.resource("s3") as s3:
        object_summary = await s3.ObjectSummary(bucket_name, object_key)
        filesize = await object_summary.size

        print(f'File size: {filesize / 1024**2}')


if __name__ == '__main__':
    asyncio.run(main())

Traceback

RuntimeWarning: coroutine 'AIOBoto3ResourceFactory._create_autoload_property.<locals>.property_loader' was never awaited
  self._variable_reference_to_variable.clear()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
Traceback (most recent call last):
  File "/usr/local/anaconda3/envs/*****/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/anaconda3/envs/*****/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/*****/.vscode/extensions/ms-python.python-2020.10.332292344/pythonFiles/lib/python/debugpy/__main__.py", line 45, in <module>
    cli.main()
  File "/Users/*****/.vscode/extensions/ms-python.python-2020.10.332292344/pythonFiles/lib/python/debugpy/../debugpy/server/cli.py", line 430, in main
    run()
  File "/Users/*****/.vscode/extensions/ms-python.python-2020.10.332292344/pythonFiles/lib/python/debugpy/../debugpy/server/cli.py", line 267, in run_file
    runpy.run_path(options.target, run_name=compat.force_str("__main__"))
  File "/usr/local/anaconda3/envs/****/lib/python3.8/runpy.py", line 265, in run_path
    return _run_module_code(code, init_globals, run_name,
  File "/usr/local/anaconda3/envs/****/lib/python3.8/runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "/usr/local/anaconda3/envs/****/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/*****/Library/Mobile Documents/com~apple~CloudDocs/****/****/****/scripts/batch-tasks/test_aioboto3.py", line 36, in <module>
    asyncio.run(main())
  File "/usr/local/anaconda3/envs/****/lib/python3.8/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/local/anaconda3/envs/****/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/Users/*****/Library/Mobile Documents/com~apple~CloudDocs/****/****/****/scripts/batch-tasks/test_aioboto3.py", line 30, in main
    filesize = await object_summary.size
  File "/usr/local/anaconda3/envs/****/lib/python3.8/site-packages/aioboto3/resources/factory.py", line 116, in property_loader
    await self.load()
  File "/usr/local/anaconda3/envs/****/lib/python3.8/site-packages/boto3/s3/inject.py", line 88, in object_summary_load
    if 'ContentLength' in response:
TypeError: argument of type 'coroutine' is not iterable
sys:1: RuntimeWarning: coroutine 'AioBaseClient._make_api_call' was never awaited
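
Until the ObjectSummary autoload path is fixed, a workaround that sidesteps resource properties entirely is a plain head_object call; a minimal sketch with placeholder bucket and key names:

import asyncio
import aioboto3


async def main():
    session = aioboto3.Session()
    async with session.client('s3') as s3:
        # HeadObject returns the object's metadata without fetching the body
        response = await s3.head_object(Bucket='my-bucket', Key='my/key')
        print(f"File size: {response['ContentLength'] / 1024 ** 2:.2f} MiB")


asyncio.run(main())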
