sqs-extended-client's Introduction

sqs-extended-client

Implements the functionality of amazon-sqs-java-extended-client-lib in Python

Installation

pip install sqs-extended-client

Overview

sqs-extended-client allows for sending large messages through SQS via S3. This is the same mechanism that the Amazon library amazon-sqs-java-extended-client-lib provides, and the two libraries are interoperable.

To do this, the library automatically extends the normal boto3 SQS client and Queue resource classes upon import, using the botoinator library. This allows for further extension or decoration if desired.

Additional attributes available on boto3 SQS client and Queue objects

  • large_payload_support -- the S3 bucket name that will store large messages.
  • message_size_threshold -- the size threshold, in bytes, for storing the message in the large messages bucket. Cannot be less than 0 or greater than 262144 (the SQS maximum message size). Defaults to 262144.
  • always_through_s3 -- if True, all messages are stored in S3. Defaults to False.
  • s3 -- the boto3 S3 resource used to store message payloads. Set this if you need to control the S3 resource (for example, custom S3 config or credentials). Defaults to boto3.resource("s3") on first use if not previously set.

Usage

Note:

The S3 bucket must already exist before use and be accessible by whatever credentials you have available.
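For example, the bucket can be created up front with plain boto3. A minimal sketch; the bucket name and region are placeholders:

import boto3

# Create the bucket that will hold large message payloads.
# Outside us-east-1, S3 requires an explicit LocationConstraint.
s3 = boto3.client('s3', region_name='us-west-2')
s3.create_bucket(
    Bucket='my-bucket-name',
    CreateBucketConfiguration={'LocationConstraint': 'us-west-2'}
)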

Enabling support for large payloads (>256 KB)

import boto3
import sqs_extended_client

# Low level client
sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'

# boto resource
resource = boto3.resource('sqs')
queue = resource.Queue('queue-url')

# Or
queue = resource.create_queue(QueueName='queue-name')

queue.large_payload_support = 'my-bucket-name'
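Once large_payload_support is set, send_message is called exactly as with a plain boto3 client; bodies over the threshold are transparently offloaded to S3. A minimal sketch, assuming the bucket and queue already exist ('queue-url' is a placeholder):

import boto3
import sqs_extended_client

sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'

# A ~300 KB body exceeds the 262144-byte default threshold,
# so the payload is written to S3 and a pointer is sent to SQS.
sqs.send_message(QueueUrl='queue-url', MessageBody='x' * 300000)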

Enabling support for large payloads (>64 KB)

import boto3
import sqs_extended_client

# Low level client
sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'
sqs.message_size_threshold = 65536

# boto resource
resource = boto3.resource('sqs')
queue = resource.Queue('queue-url')

# Or
queue = resource.create_queue(QueueName='queue-name')

queue.large_payload_support = 'my-bucket-name'
queue.message_size_threshold = 65536
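With the lower threshold, anything over 64 KB is offloaded. A quick sketch against the client configured above ('queue-url' is a placeholder):

# 70000 bytes is well under the SQS limit but over the 65536-byte
# threshold, so this body is stored in S3 rather than sent inline.
sqs.send_message(QueueUrl='queue-url', MessageBody='x' * 70000)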

Enabling support for large payloads for all messages

import boto3
import sqs_extended_client

# Low level client
sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'
sqs.always_through_s3 = True

# boto resource
resource = boto3.resource('sqs')
queue = resource.Queue('queue-url')

# Or
queue = resource.create_queue(QueueName='queue-name')

queue.large_payload_support = 'my-bucket-name'
queue.always_through_s3 = True
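Receiving needs no configuration beyond the same attributes. A minimal sketch, assuming receipt is symmetric with sending as in the Java client ('queue-url' is a placeholder):

import boto3
import sqs_extended_client

sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'

# Bodies stored in S3 are fetched and returned transparently.
response = sqs.receive_message(QueueUrl='queue-url', MaxNumberOfMessages=1)
for message in response.get('Messages', []):
    print(len(message['Body']))
    sqs.delete_message(QueueUrl='queue-url', ReceiptHandle=message['ReceiptHandle'])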

Setting a custom S3 resource

import boto3
from botocore.config import Config
import sqs_extended_client

# Low level client
sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'
sqs.s3 = boto3.resource(
  's3', 
  config=Config(
    signature_version='s3v4',
    s3={
      "use_accelerate_endpoint": True
    }
  )
)

# boto resource
resource = boto3.resource('sqs')
queue = resource.Queue('queue-url')

# Or
queue = resource.create_queue(QueueName='queue-name')

queue.large_payload_support = 'my-bucket-name'
queue.s3 = boto3.resource(
  's3', 
  config=Config(
    signature_version='s3v4',
    s3={
      "use_accelerate_endpoint": True
    }
  )
)
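The same hook lets S3 use different credentials than the SQS client, for example a named profile. A sketch; 'payload-account' is a placeholder profile name:

import boto3
import sqs_extended_client

# Store payloads using credentials from a separate profile.
payload_session = boto3.Session(profile_name='payload-account')

sqs = boto3.client('sqs')
sqs.large_payload_support = 'my-bucket-name'
sqs.s3 = payload_session.resource('s3')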

sqs-extended-client's People

Contributors

edumucelli, gwenwahl, joseph-wortmann


sqs-extended-client's Issues

TypeError when using "receive_messages" with a boto resource

Ran into the following stack trace when using "receive_messages" with a boto3 resource.

  File "/usr/local/lib/python3.7/site-packages/sqs_extended_client/session.py", line 320, in _receive_messages
    retrieve_results = list(executor.map(args[0]._retrieve_from_s3, *iterables))
  File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 586, in result_iterator
    yield fs.pop().result()
  File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 425, in result
    return self.__get_result()
  File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
TypeError: _retrieve_from_s3() takes 4 positional arguments but 5 were given

I think the issue is in _receive_messages_decorator:

if messages:
    iterables = [[None for _ in range(len(messages))] for _ in range(4)]
    for index in range(len(messages)):
        iterables[0][index] = messages[index].message_attributes or {}
        iterables[1][index] = messages[index].body
        iterables[2][index] = messages[index].receipt_handle
    with ThreadPoolExecutor(max_workers=len(messages)) as executor:
        retrieve_results = list(executor.map(args[0]._retrieve_from_s3, *iterables))

Only three of the four iterables are ever populated, so executor.map passes an extra all-None iterable and the bound _retrieve_from_s3 receives five arguments instead of four. The analogous _receive_message_decorator correctly bounds the iterables to range(3).
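A minimal fix, assuming the three populated iterables are exactly what _retrieve_from_s3 expects, is to size the list with range(3):

if messages:
    # Three iterables: message attributes, body, receipt handle.
    iterables = [[None for _ in range(len(messages))] for _ in range(3)]
    for index in range(len(messages)):
        iterables[0][index] = messages[index].message_attributes or {}
        iterables[1][index] = messages[index].body
        iterables[2][index] = messages[index].receipt_handle
    with ThreadPoolExecutor(max_workers=len(messages)) as executor:
        retrieve_results = list(executor.map(args[0]._retrieve_from_s3, *iterables))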

Destructive PyPI Deploys

Hey, just a heads up that right now any new version deletes the older ones from PyPI, which makes it hard to keep your versions pinned.

Automatically retrieving the SQS message content from S3 when using a Lambda trigger (from the event dict)

Hi,

This seems to work fine when you are explicitly receiving the message, but with an automatic trigger from SQS to Lambda the message is already inside the event argument of the Lambda handler, and it contains the S3 location of the message rather than the actual message.

With the SQS Java library, https://awslabs.github.io/aws-lambda-powertools-java/utilities/sqs_large_message_handling/ can override this and automatically process the event, retrieving the message from S3.

How can we do the same thing here? Do we have the same functionality?

The workaround I see is to manually retrieve the message from S3 with a boto3 S3 client, using the S3 bucket and key.
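Pending built-in support, a hedged sketch of that workaround: parse the S3 pointer out of each record's body and fetch the payload yourself. The pointer shape assumed below (a JSON array whose second element carries s3BucketName and s3Key, as in the Java extended client) may differ between library versions, and process_payload is a hypothetical stand-in for your own handling:

import json
import boto3

s3 = boto3.client('s3')

def handler(event, context):
    for record in event['Records']:
        body = record['body']
        try:
            # Assumed shape: [class_name, {'s3BucketName': ..., 's3Key': ...}]
            pointer = json.loads(body)
            bucket = pointer[1]['s3BucketName']
            key = pointer[1]['s3Key']
        except (ValueError, LookupError, TypeError):
            # Small messages arrive inline, not as S3 pointers.
            process_payload(body)
            continue
        obj = s3.get_object(Bucket=bucket, Key=key)
        process_payload(obj['Body'].read().decode('utf-8'))

def process_payload(payload):
    print(len(payload))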
