
DynamoDB CLI written in Rust.

Home Page: https://github.com/awslabs/dynein

License: Apache License 2.0



dynein - DynamoDB CLI

dynein /daɪ.nɪn/ is a command line interface for Amazon DynamoDB written in Rust. dynein is designed to make it simple to interact with DynamoDB tables/items from the terminal.

Why use dynein?

Less Typing

  • Auto-completion of table and key definitions lets you use DynamoDB with minimal arguments, e.g. to get an item: dy get abc
  • Switch table context with an RDBMS-like "use" command.
  • Prefers standard JSON ( {"id": 123} ) over DynamoDB JSON ( {"id": {"N": "123"}} ).
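
To illustrate the last point, here is a minimal Python sketch (an illustration of the mapping, not dynein's actual implementation) of how standard JSON values correspond to DynamoDB's typed attribute-value form:

```python
import json

def to_dynamodb_json(value):
    """Convert a plain JSON value into DynamoDB's typed attribute-value form."""
    if isinstance(value, bool):          # bool must be checked before int
        return {"BOOL": value}
    if isinstance(value, (int, float)):
        return {"N": str(value)}         # DynamoDB numbers are transmitted as strings
    if isinstance(value, str):
        return {"S": value}
    if value is None:
        return {"NULL": True}
    if isinstance(value, list):
        return {"L": [to_dynamodb_json(v) for v in value]}
    if isinstance(value, dict):
        return {"M": {k: to_dynamodb_json(v) for k, v in value.items()}}
    raise TypeError(f"unsupported type: {type(value)}")

print(json.dumps({k: to_dynamodb_json(v) for k, v in {"id": 123}.items()}))
# → {"id": {"N": "123"}}
```

dynein lets you write the left-hand form and handles the right-hand form for you.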

Quick Start

  • The bootstrap command launches sample tables loaded with sample data sets.
  • Supports DynamoDB Local, so you can try DynamoDB at no charge.

For day-to-day tasks

  • Import/Export with a single command: export DynamoDB items to CSV/JSON files and, conversely, import them into tables.
  • Take on-demand backups and restore data from them.

Installation

Method 1. Download binaries

You can download binaries of a specific version from the releases page. For example, the instructions below show example commands to download the latest version on each platform.

macOS

$ curl -O -L https://github.com/awslabs/dynein/releases/latest/download/dynein-macos.tar.gz
$ tar xzvf dynein-macos.tar.gz
$ mv dy /usr/local/bin/
$ dy --help

Currently, the above binary is automatically built on an Intel Mac, as GitHub Actions does not yet support the Apple M1 (ARM) environment.

Linux (x86-64)

$ curl -O -L https://github.com/awslabs/dynein/releases/latest/download/dynein-linux.tar.gz
$ tar xzvf dynein-linux.tar.gz
$ sudo mv dy /usr/local/bin/
$ dy --help

Method 2. Homebrew (macOS/Linux)

$ brew install dynein

Method 3. Building from source

dynein is written in Rust, so you can build and install dynein with Cargo. To build dynein from source code you need to install Rust as a prerequisite.

$ git clone https://github.com/awslabs/dynein.git
$ cd dynein
$ cargo install --locked --path .
$ dy --help

cargo install places the binary named "dy" into $HOME/.cargo/bin, which is typically already in your $PATH.

How to Use

Prerequisites - AWS Credentials

First of all, please make sure you've already configured AWS credentials in your environment. dynein depends on rusoto, which can utilize the standard AWS credential chain - for example the ~/.aws/credentials file, an IAM EC2 instance profile, or environment variables such as AWS_DEFAULT_REGION / AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_PROFILE.

One convenient way to check that your AWS credential configuration works for dynein is to install the AWS CLI and execute a command such as $ aws dynamodb list-tables in your environment. Once the AWS CLI works, you should be ready to use dynein.

Commands overview

After installing dynein you should have a binary named dy in your $PATH. The first command you can try is dy ls, which lists the tables you have:

$ dy ls --all-regions
DynamoDB tables in region: us-west-2
  EventData
  EventUsers
* Forum
  Thread
DynamoDB tables in region: us-west-1
  No table in this region.
DynamoDB tables in region: us-east-2
  UserBooks
  Users
...

Here the --all-regions option iterates over all AWS regions and lists every table for you.

Next you can try dy scan with the region and table options. The dy scan command executes the Scan API internally to retrieve all items in the table.

$ dy scan --region us-west-2 --table Forum
Name             attributes
Amazon S3        {"Category":"Amazon Web Services"}
Amazon DynamoDB  {"Views":1000,"Threads":2,"Messages":4,"Category":...

Here Name is the primary key of the Forum table, and the attributes column contains the remaining attributes of each item.

Don't want to pass --region and --table every time? Mark the table as "currently in use" with the dy use command.

$ dy use Forum --region us-west-2

Now you can interact with the table without specifying a target.

$ dy scan
Name             attributes
Amazon S3        {"Category":"Amazon Web Services"}
Amazon DynamoDB  {"Threads":2,"Views":1000,"Messages":4,"Category":...

To find more features, dy --help shows the complete list of available commands.

$ dy --help
dynein x.x.x
dynein is a command line tool to interact with DynamoDB tables/data using concise interface.
dynein looks for config files under $HOME/.dynein/ directory.

USAGE:
    dy [OPTIONS] <SUBCOMMAND>

FLAGS:
    -h, --help       Prints help information
    -V, --version    Prints version information

OPTIONS:
    -r, --region <region>    The region to use (e.g. --region us-east-1). When using DynamoDB Local, use `--region
                             local`. You can use --region option in both top-level and subcommand-level
    -t, --table <table>      Target table of the operation. You can use --table option in both top-level and subcommand-
                             level. You can store table schema locally by executing `$ dy use`, after that
                             you need not to specify --table on every command

SUBCOMMANDS:
    admin        <sub> Admin operations such as creating/updating table or GSI
    backup       Take backup of a DynamoDB table using on-demand backup
    bootstrap    Create sample tables and load test data for bootstrapping
    bwrite       Put or Delete multiple items at one time, up to 25 requests. [API: BatchWriteItem]
    config       <sub> Manage configuration files (config.yml and cache.yml) from command line
    del          Delete an existing item. [API: DeleteItem]
    desc         Show detailed information of a table. [API: DescribeTable]
    export       Export items from a DynamoDB table and save them as CSV/JSON file
    get          Retrieve an item by specifying primary key(s). [API: GetItem]
    help         Prints this message or the help of the given subcommand(s)
    import       Import items into a DynamoDB table from CSV/JSON file
    list         List tables in the region. [API: ListTables]
    put          Create a new item, or replace an existing item. [API: PutItem]
    query        Retrieve items that match conditions. Partition key is required. [API: Query]
    restore      Restore a DynamoDB table from backup data
    scan         Retrieve items in a table without any condition. [API: Scan]
    upd          Update an existing item. [API: UpdateItem]
    use          Switch target table context. After you use the command you don't need to specify table every time,
                 but you may overwrite the target table with --table (-t) option

dynein consists of multiple layers of subcommands. For example, dy admin and dy config require an additional action to run.

$ dy admin --help
dy-admin x.x.x
<sub> Admin operations such as creating/updating table or GSI

USAGE:
    dy admin [OPTIONS] <SUBCOMMAND>

FLAGS: ...

OPTIONS: ...

SUBCOMMANDS:
    create    Create new DynamoDB table or GSI. [API: CreateTable, UpdateTable]
    delete    Delete a DynamoDB table or GSI. [API: DeleteTable]
    desc      Show detailed information of a table. [API: DescribeTable]
    help      Prints this message or the help of the given subcommand(s)
    list      List tables in the region. [API: ListTables]
    update    Update a DynamoDB table. [API: UpdateTable etc]

By executing the following command, you can create a DynamoDB table.

$ dy admin create table mytable --keys pk,S

Bootstrapping sample DynamoDB tables

The easiest way to get familiar with dynein and DynamoDB is to execute dy bootstrap. The bootstrap subcommand creates sample tables and automatically loads predefined sample data. After that, you'll see some sample commands that demonstrate basic usage of dynein.

$ dy bootstrap

Bootstrapping - dynein will create 4 sample tables defined here:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/AppendixSampleTables.html

'ProductCatalog' - simple primary key table
    Id (N)

'Forum' - simple primary key table
    Name (S)

'Thread' - composite primary key table
    ForumName (S)
    Subject (S)

'Reply' - composite primary key table, with GSI named 'PostedBy-Message-Index'
    Id (S)
    ReplyDateTime (S)

...(snip logs)...

Now all tables have sample data. Try following commands to play with dynein. Enjoy!
  $ dy --region us-west-2 ls
  $ dy --region us-west-2 desc --table Thread
  $ dy --region us-west-2 scan --table Thread
  $ dy --region us-west-2 use --table Thread
  $ dy scan

After you 'use' a table like above, dynein assume you're using the same region & table, which info is stored at ~/.dynein/config.yml and ~/.dynein/cache.yml
Let's move on with the 'us-west-2' region you've just 'use'd...
  $ dy scan --table Forum
  $ dy scan -t ProductCatalog
  $ dy get -t ProductCatalog 101
  $ dy query -t Reply "Amazon DynamoDB#DynamoDB Thread 2"
  $ dy query -t Reply "Amazon DynamoDB#DynamoDB Thread 2"  --sort-key "begins_with 2015-10"

If you're interested in other available sample tables with data, check dy bootstrap --list and pass the desired target to the --sample option.

Working with DynamoDB tables

Using dynein, you can create a table:

$ dy admin create table app_users --keys app_id,S user_id,S
---
name: app_users
region: us-east-1
status: CREATING
schema:
  pk: app_id (S)
  sk: user_id (S)
mode: OnDemand
capacity: ~
gsi: ~
lsi: ~
stream: ~
count: 0
size_bytes: 0
created_at: "2020-03-03T13:34:43+00:00"

After the table is ready (i.e. status: CREATING has changed to ACTIVE), you can write to and read from the table.

$ dy use app_users
$ dy desc
---
name: app_users
region: us-east-1
status: ACTIVE
schema:
  pk: app_id (S)
  sk: user_id (S)
mode: OnDemand
capacity: ~
gsi: ~
lsi: ~
stream: ~
count: 0
size_bytes: 0
created_at: "2020-03-03T13:34:43+00:00"

$ dy put myapp 1234 --item '{"rank": 99}'
Successfully put an item to the table 'app_users'.

$ dy scan
app_id  user_id  attributes
myapp   1234     {"rank":99}

Similarly you can update tables with dynein.

$ dy admin update table app_users --mode provisioned --wcu 10 --rcu 25

Infrastructure as Code - empowered by CloudFormation

NOTE: currently this feature is under development

Infrastructure as Code is a concept in which you define code to provision "infrastructure", such as DynamoDB tables, in a "declarative" way. (By contrast, the dy admin create table and dy admin update table commands are an "imperative" way.)

To manage DynamoDB tables in a "declarative" way, dynein provides the dy admin plan and dy admin apply commands. Internally, dynein executes AWS CloudFormation APIs to provision DynamoDB resources for you.

$ ls
mytable.cfn.yml

$ cat mytable.cfn.yml
Resources:
  MyDDB:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
      - AttributeName: pk
        AttributeType: S
      KeySchema:
      - AttributeName: pk
        KeyType: HASH
      BillingMode: PAY_PER_REQUEST

(currently not available) $ dy admin plan
(currently not available) $ dy admin apply

CloudFormation manages DynamoDB tables through the resource type named AWS::DynamoDB::Table - visit the link for more information.

dy use and dy config to switch/manage context

Specifying the table you want to interact with is straightforward: use the --table (-t) option. Let's say you want to scan data in the customers table.

$ dy scan --table customers
... display items in the "customers" table ...

However, dynein assumes that you typically work with only one table at a time, which means that passing the table name on every single command execution is a waste of your time.

By running dy use for a table, you can call commands such as scan, get, query, and put without specifying the table name.

$ dy use customers
$ dy scan
... display items in the "customers" table ...

In detail, when you execute the dy use command, dynein saves your table usage information in ~/.dynein/config.yml and caches the table schema in ~/.dynein/cache.yml. You can dump them with the dy config dump command.

$ ls ~/.dynein/
cache.yml   config.yml

$ dy config dump
---
tables:
  ap-northeast-1/customers:
    region: ap-northeast-1
    name: customers
    pk:
      name: user_id
      kind: S
    sk: ~
    indexes: ~
---
using_region: ap-northeast-1
using_table: customers

To clear current table configuration, simply execute dy config clear.

$ dy config clear
$ dy config dump
---
tables: ~
---
using_region: ~
using_table: ~

Working with DynamoDB items

As an example, let's assume you have the official "Movie" sample data. To prepare the table with data loaded, simply execute dy bootstrap --sample movie.

$ dy bootstrap --sample movie
... wait some time while dynein loading data ...
$ dy use Movie

After executing the dy use <your_table> command, dynein recognizes the key schema and data types of the table. This means that some of the arguments you would otherwise need to pass to access data (items) are inferred automatically when possible.

Before diving deep into each command, let me describe DynamoDB's "reserved words". One of the traps that beginners easily fall into is that you cannot use certain reserved words in DynamoDB APIs. DynamoDB reserved words include common words that you may want to use in your application. For example "name", "year", "url", "token", "error", "date", "group" -- all of them are reserved, so you cannot use them in expressions directly.

Normally, to use reserved words in expressions, you need to use placeholders instead of actual values. For more information, see Expression Attribute Names and Expression Attribute Values.

To make it easy to interact with DynamoDB items, dynein automatically replaces reserved words with placeholders internally - thus you don't need to care about them.
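
To give a feel for what happens under the hood, here is a simplified Python sketch of the substitution (an illustration, not dynein's actual code; the reserved-word list and the #attr0-style placeholder names are assumptions for the example):

```python
# Tiny subset of DynamoDB's reserved words, for illustration only.
RESERVED = {"name", "year", "url", "token", "error", "date", "group"}

def substitute_reserved(expression):
    """Rewrite an expression, replacing reserved words with #placeholders,
    and return the rewritten expression plus the ExpressionAttributeNames map."""
    names = {}
    out = []
    for word in expression.split():
        if word.lower() in RESERVED:
            placeholder = f"#attr{len(names)}"
            names[placeholder] = word
            out.append(placeholder)
        else:
            out.append(word)
    return " ".join(out), names

expr, names = substitute_reserved("year = :v")
print(expr)   # #attr0 = :v
print(names)  # {'#attr0': 'year'}
```

The rewritten expression and the names map are what DynamoDB actually receives, while you only ever type the original attribute names.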

Read

dy scan

The simplest command would be dy scan, which lists items in a table.

$ dy scan --limit 10
year  title                  attributes
1933  King Kong              {"info":{"actors":["Bruce Cabot","Fay Wray","Rober...
1944  Arsenic and Old Lace   {"info":{"actors":["Cary Grant","Priscilla Lane","...
1944  Double Indemnity       {"info":{"actors":["Barbara Stanwyck","Edward G. R...
1944  I'll Be Seeing You     {"info":{"actors":["Ginger Rogers","Joseph Cotten"...
1944  Lifeboat               {"info":{"actors":["John Hodiak","Tallulah Bankhea...
1958  Cat on a Hot Tin Roof  {"info":{"actors":["Burl Ives","Elizabeth Taylor",...
1958  Monster on the Campus  {"info":{"actors":["Arthur Franz","Joanna Moore","...
1958  No Time for Sergeants  {"info":{"actors":["Andy Griffith","Myron McCormic...
1958  Teacher's Pet          {"info":{"actors":["Clark Gable","Doris Day","Gig ...
1958  Touch of Evil          {"info":{"actors":["Charlton Heston","Janet Leigh"...

dy get

You may notice that non-key attributes are trimmed in the above dy scan results. To get the full details of a single item, use the dy get command with the primary key. As the "Movie" table is defined with a composite primary key, you have to pass the partition key (= year) and the sort key (= title) to identify an item.

$ dy desc
---
name: Movie
region: us-west-2
status: ACTIVE
schema:
  pk: year (N)     <<<<=== "year" and
  sk: title (S)    <<<<=== "title" are the information to identify an item.
...

$ dy get 1958 "Touch of Evil"
{
  "info": {
    "actors": [
      "Charlton Heston",
      "Janet Leigh",
      "Orson Welles"
    ],
    "directors": [
      "Orson Welles"
    ],
    "genres": [
      "Crime",
      "Film-Noir",
      "Thriller"
    ],
    "image_url": "http://ia.media-imdb.com/images/M/MV5BNjMwODI0ODg1Nl5BMl5BanBnXkFtZTcwMzgzNjk3OA@@._V1_SX400_.jpg",
    "plot": "A stark, perverse story of murder, kidnapping, and police corruption in a Mexican border town.",
    "rank": 3843,
    "rating": 8.2,
    "release_date": "1958-04-23T00:00:00Z",
    "running_time_secs": 5700
  },
  "title": "Touch of Evil",
  "year": 1958
}

Note that if your table has a simple primary key, the only argument you need to pass is a partition key (e.g. dy get yourpk), as the partition key is the only information DynamoDB requires to identify an item.

dy query

The next command you can try for retrieving items is dy query. By passing a partition key, dy query returns items that have the specified partition key.

$ dy query 1960
year  title                  attributes
1960  A bout de souffle      {"info":{"actors":["Daniel Boulanger","Jean Seberg...
1960  La dolce vita          {"info":{"actors":["Anita Ekberg","Anouk Aimee","M...
1960  Ocean's Eleven         {"info":{"actors":["Dean Martin","Frank Sinatra","...
1960  Plein soleil           {"info":{"actors":["Alain Delon","Marie Laforet","...
1960  Spartacus              {"info":{"actors":["Jean Simmons","Kirk Douglas","...
1960  The Apartment          {"info":{"actors":["Fred MacMurray","Jack Lemmon",...
1960  The Magnificent Seven  {"info":{"actors":["Charles Bronson","Steve McQuee...
1960  The Time Machine       {"info":{"actors":["Alan Young","Rod Taylor","Yvet...

You can also add conditions on the sort key. For example, the following command returns items whose sort keys begin with "The".

$ dy query 1960 --sort-key "begins_with The"
year  title                  attributes
1960  The Apartment          {"info":{"actors":["Fred MacMurray","Jack Lemmon",...
1960  The Magnificent Seven  {"info":{"actors":["Charles Bronson","Steve McQuee...
1960  The Time Machine       {"info":{"actors":["Alan Young","Rod Taylor","Yvet...

Other examples for the --sort-key option of dy query are: --sort-key "= 42", --sort-key "> 42", or --sort-key "between 10 and 42". You can find a more detailed explanation in the dedicated dy query command document.
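
These condition strings map onto DynamoDB's key condition operators. A rough Python sketch of how such a string could be translated into a KeyConditionExpression fragment (a simplified grammar for illustration, not dynein's parser; the :sk/:lo/:hi placeholder names are assumptions):

```python
def sort_key_condition(sk_name, condition):
    """Translate a dy-style sort-key condition into a KeyConditionExpression
    fragment plus its value placeholders (simplified illustration)."""
    parts = condition.split()
    op = parts[0].lower()
    if op == "begins_with":
        return f"begins_with({sk_name}, :sk)", {":sk": parts[1]}
    if op == "between":
        # e.g. "between 10 and 42" -> ["between", "10", "and", "42"]
        return f"{sk_name} BETWEEN :lo AND :hi", {":lo": parts[1], ":hi": parts[3]}
    if op in ("=", "<", "<=", ">", ">="):
        return f"{sk_name} {op} :sk", {":sk": parts[1]}
    raise ValueError(f"unsupported condition: {condition}")

print(sort_key_condition("title", "begins_with The"))
# → ('begins_with(title, :sk)', {':sk': 'The'})
```

dy query builds this kind of expression for you, so you only supply the short condition string.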

Write

dynein provides subcommands to write to DynamoDB tables as well.

dy put

dy put internally calls the PutItem API and saves an item to the target table. To save an item, you need to pass at least the primary key that identifies the item within the table.

$ dy admin create table write_test --keys id,N
$ dy use write_test

$ dy put 123
Successfully put an item to the table 'write_test'.
$ dy scan
id  attributes
123

Additionally, you can include an item body (non-key attributes) by passing the --item (-i) option. The --item option takes a JSON-style expression with extended syntax.

$ dy put 456 --item '{"a": 9, "b": "str"}'
Successfully put an item to the table 'write_test'.

$ dy scan
id  attributes
123
456  {"a":9,"b":"str"}

As the parameter of the --item option is automatically transformed into DynamoDB-style JSON syntax, writing items into a table is more straightforward than with the AWS CLI. See the following comparison:

$ dy put 789 --item '{"a": 9, "b": "str"}'

// The above dynein command is equivalent to AWS CLI's following command:
$ aws dynamodb put-item --table-name write_test --item '{"id": {"N": "789"}, "a": {"N": "9"}, "b": {"S": "str"}}'

Please see the dynein format for details of JSON-style data. To summarize, in addition to the string ("S") and number ("N"), dynein also supports other data types such as boolean ("BOOL"), null ("NULL"), binary ("B"), string set ("SS"), number set ("NS"), binary set ("BS"), list ("L"), and nested object ("M").

$ dy put 999 --item '{"myfield": "is", "nested": {"can": true, "go": false, "deep": [1,2,{"this_is_set": <<"x","y","z">>}]}}'
Successfully put an item to the table 'write_test'.
$ dy get 999
{
  "nested": {
    "can": true,
    "deep": [
      1,
      2,
      {
        "this_is_set": [
          "x",
          "y",
          "z"
        ]
      }
    ],
    "go": false
  },
  "myfield": "is",
  "id": 999
}

dy upd

The dy upd command internally executes the UpdateItem API, using an "update expression" to describe how to update an item. The recommended way to update items is to use the SET and REMOVE actions in the update expression.

With dynein, you can use the --set or --remove option. Here's an example:

$ dy put 42 -i '{"flag": true}'
Successfully put an item to the table 'test'.

$ dy get 42
{
  "flag": true,
  "id": 42
}

# Set a boolean
$ dy upd 42 --set "flag = false"
Successfully updated an item in the table 'write_test'.

$ dy get 42
{
  "flag": false,
  "id": 42
}

# Set a string value
$ dy upd 42 --set "date = '2022-02-22T22:22:22Z'"
$ dy get 42
{
  "date": "2022-02-22T22:22:22Z",
  "id": "42",
  "flag": false
}

# Set a number
$ dy upd 42 --set 'pi = +3.14159265358979323846'
$ dy get 42
{
  "date": "2022-02-22T22:22:22Z",
  "pi": 3.141592653589793,
  "id": "42",
  "flag": false
}

# You can apply addition (+) and subtraction (-) to numbers. Note that DynamoDB does not support unary operators (+, -), multiplication, or division.
$ dy upd 42 --set 'pi = pi + 10'
$ dy get 42 | jq .pi
13.141592653589793

$ dy upd 42 --set 'pi = 1 - pi'
$ dy get 42 | jq .pi
-12.141592653589793

Next, let me show an example using --remove. Note that --remove in the dy upd command never removes the item itself; it only removes an attribute.

$ dy upd 42 --remove flag
Successfully updated an item in the table 'write_test'.

$ dy get 42
{
  "id": "42",
  "date": "2022-02-22T22:22:22Z",
  "pi": 3.141592653589793
}

# You can remove multiple attributes
$ dy upd 42 --remove "date, pi"
$ dy get 42
{
  "id": "42"
}

DynamoDB supports a list type, which preserves order. Let's try it with dynein.

# Create an empty list
$ dy upd 42 --set "list = []"
$ dy get 42
{
  "id": "42",
  "list": []
}

# Append an element to the list
$ dy upd 42 --set "list = list_append(list, ['item1'])"
$ dy get 42
{
  "id": "42",
  "list": [
    "item1"
  ]
}

# Prepend an element to the list
$ dy upd 42 --set "list = list_append(['item0'], list)"
$ dy get 42 | jq .list
[
  "item0",
  "item1"
]

# Add more elements
$ dy upd 42 --set "list = list_append(list, ['item2', 'item3'])"
$ dy get 42 | jq .list
[
  "item0",
  "item1",
  "item2",
  "item3"
]

# You can directly modify the list element
$ dy upd 42 --set "list[0] = 'item0 modified'"
$ dy get 42 | jq .list
[
  "item0 modified",
  "item1",
  "item2",
  "item3"
]

# Delete the element from the list
$ dy upd 42 --remove 'list[0]'
$ dy get 42 | jq .list
[
  "item1",
  "item2",
  "item3"
]

# Remove the list attribute
$ dy upd 42 --remove list
$ dy get 42
{
  "id": "42"
}

Furthermore, it's possible to update multiple attributes simultaneously.

# Set numbers
$ dy upd 42 --set "n1 = 0, n2 = 1"
$ dy get 42
{
  "n2": 1,
  "id": "42",
  "n1": 0
}

# Calculate Fibonacci numbers
$ dy upd 42 --set "n1 = n2, n2 = n1 + n2"
$ dy get 42 | jq -c '[.n1,.n2]'
[1,1]

# Calculate the next value
$ dy upd 42 --set "n1 = n2, n2 = n1 + n2"
$ dy get 42 | jq -c '[.n1,.n2]'
[1,2]

# Continue the sequence
$ dy upd 42 --set "n1 = n2, n2 = n1 + n2"
$ dy get 42 | jq -c '[.n1,.n2]'
[2,3]

$ dy upd 42 --set "n1 = n2, n2 = n1 + n2"
$ dy get 42 | jq -c '[.n1,.n2]'
[3,5]

# Clean up the attributes
$ dy upd 42 --remove "n1,n2"
$ dy get 42
{
  "id": "42"
}

As demonstrated with dy put, the map type expresses nested values. Let's manipulate it with dynein.

$ dy upd 42 --set 'ProductReviews = {"metadata": {"counts": 0, "average": null}}'
$ dy get 42
{
  "id": "42",
  "ProductReviews": {
    "metadata": {
      "average": null,
      "counts": 0
    }
  }
}

$ dy upd 42 --set 'ProductReviews.FiveStar = ["Excellent product"], ProductReviews.metadata = {"average": 5, "sum": 5, "counts": 1}'
$ dy get 42
{
  "id": "42",
  "ProductReviews": {
    "FiveStar": [
      "Excellent product"
    ],
    "metadata": {
      "average": 5,
      "counts": 1,
      "sum": 5
    }
  }
}

$ dy upd 42 --set 'ProductReviews.FiveStar[1] = "Very happy with my purchase", ProductReviews.ThreeStar = ["Just OK - not that great"], ProductReviews.metadata = {"average": 4.3, "sum": 13, "counts": 3}'
$ dy get 42
{
  "id": "42",
  "ProductReviews": {
    "FiveStar": [
      "Excellent product",
      "Very happy with my purchase"
    ],
    "ThreeStar": [
      "Just OK - not that great"
    ],
    "metadata": {
      "average": 4.3,
      "counts": 3,
      "sum": 13
    }
  }
}

$ dy upd 42 --set 'ProductReviews.OneStar = if_not_exists(ProductReviews.OneStar, [])'
$ dy get 42
{
  "id": "42",
  "ProductReviews": {
    "FiveStar": [
      "Excellent product",
      "Very happy with my purchase"
    ],
    "OneStar": [],
    "ThreeStar": [
      "Just OK - not that great"
    ],
    "metadata": {
      "average": 4.3,
      "counts": 3,
      "sum": 13
    }
  }
}

$ dy upd 42 --set 'ProductReviews.OneStar = list_append(ProductReviews.OneStar, ["Broken"]), ProductReviews.metadata = {"average": 3.5, "sum": 14, "counts": 4}'
$ dy get 42
{
  "ProductReviews": {
    "FiveStar": [
      "Excellent product",
      "Very happy with my purchase"
    ],
    "OneStar": [
      "Broken"
    ],
    "ThreeStar": [
      "Just OK - not that great"
    ],
    "metadata": {
      "average": 3.5,
      "counts": 4,
      "sum": 14
    }
  },
  "id": "42"
}

$ dy upd 42 --remove ProductReviews
$ dy get 42
{
  "id": "42"
}

dynein has a special option named --atomic-counter. It increments the specified number attribute by 1.

$ dy get 52
{
  "age": 28,
  "name": "John",
  "id": 52
}

$ dy upd 52 --atomic-counter age
Successfully updated an item in the table 'write_test'.

$ dy get 52
{
  "age": 29,
  "id": 52,
  "name": "John"
}

Supported String Literals

There are two types of string literals that you can use:

  • Double quote ("): Double quoted string literals support escape sequences such as \0, \r, \n, \t, \\, \", and \'. Each of them represents a null character, carriage return, new line, horizontal tab, backslash, double quote, and single quote, respectively. If you need to include a double quote inside the literal, you must escape it.
  • Single quote ('): Single-quoted string literals are interpreted as you input them. However, you cannot specify a string that includes a single quote. In such cases, you can use a double-quoted string literal.

Supported Functions

The upd command supports the following functions:

  • list_append: This function is used to concatenate two lists, where each list can be a literal or a path to an attribute. When you call list_append([l1,l2], [l3,l4,l5]), it will return [l1,l2,l3,l4,l5].
  • if_not_exists: This function allows you to set a default value for the case where the attribute does not exist. The left-hand argument is the path to an attribute, while the right-hand argument specifies the default value.
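
The semantics of both functions can be sketched in a few lines of Python (an illustration of the behavior, not dynein's implementation; here a missing attribute is modeled as None):

```python
def list_append(a, b):
    """Semantics of DynamoDB's list_append: concatenate two lists."""
    return list(a) + list(b)

def if_not_exists(current, default):
    """Semantics of if_not_exists: fall back to the default when the
    attribute is absent (modeled here as None)."""
    return default if current is None else current

print(list_append(["l1", "l2"], ["l3", "l4", "l5"]))
# → ['l1', 'l2', 'l3', 'l4', 'l5']
print(if_not_exists(None, []))
# → []
```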

For more details, please refer to the official documentation.

Quoting a Path of an Attribute

Sometimes, you may need to specify a path that includes a space or special characters that are not allowed by dynein. In such cases, you can use backticks to quote the path. For example, consider the following item:

{
  "id": {"S":  "55"},
  "map": {
    "M": {
      "Do you have spaces?": {
        "S": "Yes"
      },
      "Dou you `?": {
        "S": "Yes"
      },
      "路径": {"S": "Chinese"},
      "パス": {"S": "Japanese"},
      "경로": {"S": "Korean"}
    }
  }
}

You can specify a path using the following syntax:

  • dy upd 55 --set 'map.`Do you have spaces?` = "Allowed"'
  • dy upd 55 --set 'map.`Dou you ``?` = "Maybe"'

As demonstrated above, you can use double backticks (``) to represent a backtick (`) within the path.

Please note that you may not need to escape non-ASCII paths like CJK characters. For example, you can specify 路径, パス, and 경로 without quotes. Dynein allows you to specify a path where the first character belongs to the ID_Start class and the subsequent characters belong to the ID_Continue class without requiring escape sequences. These classes are defined by the Unicode standard. The following examples illustrate this:

  • dy upd 55 --set 'map.路径 = "A word of Chinese"'
  • dy upd 55 --set 'map.パス = "A word of Japanese"'
  • dy upd 55 --set 'map.경로 = "A word of Korean"'
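
Python's identifier rules are built on the same Unicode classes, so str.isidentifier() gives a quick feel for which paths would not need quoting (an approximation only: Python uses the closely related XID_Start/XID_Continue classes, and dynein's exact rule may differ):

```python
# Paths made of identifier characters need no backtick quoting;
# paths with spaces or punctuation do.
for path in ["路径", "パス", "경로", "Do you have spaces?"]:
    print(f"{path!r}: needs quoting = {not path.isidentifier()}")
```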

dy del

To delete an item, use the dy del command with the primary key that identifies the item.

$ dy get 42
{ "id": 42 }
$ dy del 42
Successfully deleted an item from the table 'write_test'.
$ dy get 42
No item found.

dy bwrite

dy bwrite internally calls BatchWriteItem API and is used for putting and deleting multiple items.

You can use the --input option to provide operations from a JSON file that follows the request syntax of the BatchWriteItem API.

$ dy bwrite --input request.json
$ cat request.json
{
    "__TABLE_NAME__": [
        {
            "PutRequest": {
                "Item": {
                    "pk": { "S": "ichi" },
                    "ISBN": { "S": "111-1111111111" },
                    "Price": { "N": "2" },
                    "Dimensions": { "SS": ["Giraffe", "Hippo" ,"Zebra"] },
                    "PageCount": { "NS": ["42.2", "-19", "7.5", "3.14"] },
                    "InPublication": { "BOOL": false },
                    "Binary": {"B": "dGhpcyB0ZXh0IGlzIGJhc2U2NC1lbmNvZGVk"},
                    "BinarySet": {"BS": ["U3Vubnk=", "UmFpbnk=", "U25vd3k="]},
                    "Nothing": { "NULL": true },
                    "Authors": {
                        "L": [
                            { "S": "Author1" },
                            { "S": "Author2" },
                            { "N": "42" }
                        ]
                    },
                    "Details": {
                        "M": {
                            "Name": { "S": "Joe" },
                            "Age":  { "N": "35" },
                            "Misc": {
                                "M": {
                                    "hope": { "BOOL": true },
                                    "dream": { "L": [ { "N": "35" }, { "NULL": true } ] }
                                }
                            }
                        }
                    }
                }
            }
        }
    ]
}

You can also use --put or --del options to achieve corresponding operations. These options take the dynein format, a JSON-style expression. To put or delete items, you must provide at least a primary key to identify each item uniquely.

$ dy bwrite --put '{"pk": "1", "this_is_set": <<"i","j","k">>}' --put '{"pk": "2", "this_is_set": <<"x","y","z">>}'
$ dy scan
pk  attributes
1   {"this_is_set":["i","j","k"]}
2   {"this_is_set":["x","y","z"]}

The --put, --del, and --input options can be used simultaneously.

$ dy bwrite --del '{"pk": "1"}' --del '{"pk": "2"}' --put '{"pk": "3", "this_is_set": <<"a","b","c">>}'
$ dy scan
pk  attributes
3   {"this_is_set":["a","b","c"]}
$ dy bwrite --del '{"pk": "1"}' --del '{"pk": "2"}' --put '{"pk": "3", "this_is_set": <<"a","b","c">>}' --input request.json

Working with Indexes

DynamoDB provides a flexible way to query data efficiently through its secondary index features. There are two types of secondary indexes, GSI (Global Secondary Index) and LSI (Local Secondary Index), but an LSI can be created only at table creation time.

With dynein, you can add a GSI to an existing table.

$ dy admin create index top_rank_users_index --keys rank,N --table app_users
---
name: app_users
region: us-west-2
status: UPDATING
schema:
  pk: app_id (S)
  sk: user_id (S)
mode: OnDemand
capacity: ~
gsi:
  - name: top_rank_users_index
    schema:
      pk: rank (N)
      sk: ~
    capacity: ~
lsi: ~
stream: ~
count: 0
size_bytes: 0
created_at: "2020-06-02T14:22:56+00:00"

$ dy use app_users
$ dy scan --index top_rank_users_index

Import/Export for DynamoDB items

dy export

You can export DynamoDB items to a JSON or CSV file. As the default format is JSON, you can simply run the following command to export:

$ dy export --table Reply --format json --output-file out.json
$ cat out.json
[
  {
    "PostedBy": "User A",
    "ReplyDateTime": "2015-09-15T19:58:22.947Z",
    "Id": "Amazon DynamoDB#DynamoDB Thread 1",
    "Message": "DynamoDB Thread 1 Reply 1 text"
  },
  {
    "Id": "Amazon DynamoDB#DynamoDB Thread 1",
...

No --format option means --format json. If you want to dump data on one line, try --format json-compact. Or, if you want to export in JSONL (JSON Lines), i.e. "one JSON item per line" style, --format jsonl is also available.

$ dy export --table Reply --format jsonl --output-file out.jsonl
$ cat out.jsonl
{"PostedBy":"User A","ReplyDateTime":"2015-09-15T19:58:22.947Z","Message":"DynamoDB Thread 1 Reply 1 text","Id":"Amazon DynamoDB#DynamoDB Thread 1"}
{"PostedBy":"User B","Message":"DynamoDB Thread 1 Reply 2 text","ReplyDateTime":"2015-09-22T19:58:22.947Z","Id":"Amazon DynamoDB#DynamoDB Thread 1"}
...
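If you already have a plain JSON array export and need JSONL, the conversion is trivial in any scripting language. A minimal Python sketch (illustrative only, not part of dynein):

```python
import json

def json_array_to_jsonl(src: str) -> str:
    """Convert a JSON array (as produced by `dy export --format json`)
    into JSONL: one compact JSON object per line."""
    items = json.loads(src)
    return "\n".join(json.dumps(item, separators=(",", ":")) for item in items)

exported = '[{"Id": "a", "Message": "hi"}, {"Id": "b", "Message": "yo"}]'
print(json_array_to_jsonl(exported))
```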

When exporting data to CSV, primary key(s) are exported by default. You can explicitly pass additional attributes to export.

$ dy export --table Reply --output-file out.csv --format csv --attributes PostedBy,Message
$ cat out.csv
Id,ReplyDateTime,PostedBy,Message
"Amazon DynamoDB#DynamoDB Thread 1","2015-09-15T19:58:22.947Z","User A","DynamoDB Thread 1 Reply 1 text"
...

dy import

To import data into a table, run dy import with the --format option. As with dy export, the default format is JSON.

$ dy import --table target_movie --format json --input-file movie.json

Enable set type inference

Dynein provides type inference for set types (number set, string set) for backward compatibility. If you want to retain the inference behavior from before 0.3.0, you can use the --enable-set-inference option.

Without the option, all JSON lists are inferred as the list type.

$ cat load.json
{"pk":1,"string-set":["1","2","3"]}
{"pk":2,"number-set":[1,2,3]}
{"pk":3,"list":["1",2,"3"]}

$ dy admin create table target_movie -k pk,N
$ dy import --table target_movie --format jsonl --input-file load.json
$ aws dynamodb get-item --table-name target_movie --key '{"pk":{"N":"1"}}'
{
    "Item": {
        "string-set": {
            "L": [
                {
                    "S": "1"
                },
                {
                    "S": "2"
                },
                {
                    "S": "3"
                }
            ]
        },
        "pk": {
            "N": "1"
        }
    }
}

With the --enable-set-inference option, JSON lists are inferred based on their content.

$ dy import --table target_movie --format jsonl --enable-set-inference --input-file load.json
$ aws dynamodb get-item --table-name target_movie --key '{"pk":{"N":"1"}}'
{
    "Item": {
        "string-set": {
            "SS": [
                "1",
                "2",
                "3"
            ]
        },
        "pk": {
            "N": "1"
        }
    }
}

$ aws dynamodb get-item --table-name target_movie --key '{"pk":{"N":"2"}}'
{
    "Item": {
        "pk": {
            "N": "2"
        },
        "number-set": {
            "NS": [
                "3",
                "2",
                "1"
            ]
        }
    }
}

$ aws dynamodb get-item --table-name target_movie --key '{"pk":{"N":"3"}}'
{
    "Item": {
        "pk": {
            "N": "3"
        },
        "list": {
            "L": [
                {
                    "S": "1"
                },
                {
                    "N": "2"
                },
                {
                    "S": "3"
                }
            ]
        }
    }
}
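The behavior demonstrated above boils down to a small rule: a homogeneous list of strings becomes a string set (SS), a homogeneous list of numbers becomes a number set (NS), and anything else stays a list (L). A Python sketch of that rule (an illustration of the observed behavior, not dynein's actual implementation):

```python
def infer_dynamodb_type(value):
    """Guess the DynamoDB attribute type for a JSON list, mirroring the
    behavior shown above. Illustrative only, not dynein's code."""
    if not isinstance(value, list):
        raise TypeError("only lists are inferred here")
    if value and all(isinstance(v, str) for v in value):
        return "SS"  # all elements are strings -> string set
    if value and all(isinstance(v, (int, float)) and not isinstance(v, bool) for v in value):
        return "NS"  # all elements are numbers -> number set
    return "L"       # mixed or empty -> plain list

# Matches the three items in load.json above:
assert infer_dynamodb_type(["1", "2", "3"]) == "SS"
assert infer_dynamodb_type([1, 2, 3]) == "NS"
assert infer_dynamodb_type(["1", 2, "3"]) == "L"
```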

Using DynamoDB Local with --region local option

DynamoDB provides a free tier that consists of 25 GB of storage and 25 WCU/RCU, which is enough to handle up to 200M requests per month. However, if you're already using DynamoDB in your account and worried about additional costs when getting started with dynein, you can use DynamoDB Local.

Yes, dynein supports DynamoDB Local. The only difference is that you add the --region local option to every command. Getting started with the DynamoDB Local Docker image via dynein is quite simple.

Simply run a Docker container and expose port 8000 with the following command.

$ docker run -p 8000:8000 -d amazon/dynamodb-local

Optionally, if you prefer Kubernetes, you can use the manifest file in this repository.

$ kubectl apply -f k8s-deploy-dynamodb-local.yml
$ kubectl port-forward deployment/dynamodb 8000:8000

Now you can interact with DynamoDB Local using the --region local option.

$ dy --region local admin create table localdb --keys pk
$ dy --region local use -t localdb
$ dy put firstItem
$ dy put secondItem
$ dy scan

Contribution

We welcome community contributions and pull requests. See CONTRIBUTING.md for our guidelines on how to submit your code.

Misc

Asides

dynein is named after a motor protein.

Troubleshooting

If you encounter trouble, the first option worth trying is removing the files in ~/.dynein/, or the directory itself. Doing this just clears "cached" info stored locally by dynein and won't affect your data stored in DynamoDB tables.

$ rm -rf ~/.dynein/

To see verbose output for troubleshooting purposes, you can change the log level with the RUST_LOG environment variable. For example:

$ RUST_LOG=debug RUST_BACKTRACE=1 dy scan --table your_table

Ideas for future works

  • dy admin plan & dy admin apply commands to manage tables through CloudFormation.
  • Linux's top -like experience to monitor table status. e.g. dy top tables
    • inspired by kubectl top nodes
    • implementation: CloudWatch metrics such as Consumed WCU/RCU, SuccessfulRequestLatency, ReplicationLatency for Global Tables, etc.
  • Shell (bash/zsh) completion
  • Retrieving control plane APIs, integrated with CloudTrail
  • dy logs command to retrieve data plane API logs via DynamoDB Streams (write APIs only)
    • tail -f -ish usability. e.g. dy logs -f mytable
  • truncate command to delete all data in a table
  • Support Transaction APIs (TransactGetItems, TransactWriteItems)
  • simple load testing. e.g. dy load --tps 100
  • Support LTSV and TSV in the import/export tool
  • PITR configuration enable/disable (UpdateContinuousBackups) and exporting/restoring tables (ExportTableToPointInTime, RestoreTableToPointInTime)

dynein's People

Contributors

adamcrosby, amazon-auto, chenrui333, dependabot[bot], hhatto, jbpratt, keens, kirie-0217, kzym-w, mlafeldt, ryota-sakamoto, sebitommy123, sszafir, stonedot, tmyoda, wafuwafu13, zulinx86


dynein's Issues

Feature suggestion: Allow filtering a query response

The DynamoDB API allows for further filtering the response of a query to reduce the data returned to the client.

It would be nice to have a mechanism to apply a number of filters when using dynein to run a query.

Dynein format does not accept valid JSON string

Currently, dynein is not able to parse all JSON strings. For example, a JSON string can use \f for formfeed, which dynein does not support yet. We should improve parsing for the string type to accept strings containing all escape sequences that JSON supports.
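For reference, a standards-compliant JSON parser accepts all of these escape sequences. A quick check with Python's json module:

```python
import json

# Standard JSON accepts every escape sequence defined by the spec,
# including \f (formfeed); the issue reports that dynein's parser
# does not handle some of them yet.
parsed = json.loads('{"note": "page1\\fpage2"}')
assert parsed["note"] == "page1\fpage2"
assert len(parsed["note"]) == 11  # 5 + 1 (formfeed) + 5 characters
```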

Feature suggestion: Allow port override

It would be nice to be able to define an override for the default port of 8000. This is similar to being able to override the entire endpoint via --endpoint-url when using the AWS CLI.

Improve documents

Background

For now, README.md is big, with a lot of information in one place. I think it is hard to maintain, so we need to make README.md simple like other tools and move specific information, such as how to use each command, to dedicated pages.
Also, all command samples are maintained manually, which can cause differences from the actual command output.

Work

  • make README.md simple; divide information into dedicated pages
  • use trycmd for all dynein command samples in the docs

Design

The docs under docs/cmd/*.md describe how to use each command, with samples.

.
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── Cargo.lock
├── Cargo.toml
├── LICENSE
├── NOTICE
├── README.md
├── docs
│   ├── cmd
│   │   ├── admin.md
│   │   └── use.md
│   ├── development.md
│   ├── faq.md
│   └── troubleshooting.md
├── k8s-deploy-dynamodb-local.yml
├── rust-toolchain
├── src
└── tests

Ref

https://github.com/aws/aws-cli
https://github.com/aws/aws-sam-cli
https://github.com/aws/aws-nitro-enclaves-cli
https://github.com/aws/copilot-cli

Implement sophisticated parser for query command

We should offer the same parsing experience for the query command as the upd command implemented by #132.

For example, we should support the following ways.

dy query 13 --sort-key '= "12"'
dy query 13 --sort-key "= '12'"
dy query 13 --sort-key '= 12'
dy query 13 --sort-key '<=12'

Additionally, we should care about type differences between the schema of the sort key and the provided value. If dynein detects these kinds of mistakes, it should raise an error and propose a way to fix them, or fall back to legacy parsing.

For example, assume that a table was created with a sort key of string type. The following value of --sort-key does not match the expected type.

dy query 13 --sort-key '= 12'

In this case, dynein should raise an error and propose the following command, or fall back automatically.

dy query 13 --sort-key '= "12"'

Create a load testing command

Load testing command

Background

The load testing command is useful in understanding DynamoDB behaviors, for example, throttling, auto-scaling, metrics, etc. Also, it helps users to investigate an application's behavior when throttling happens.

Proposed design

The design decisions for the implementation are the following:

  • The amount of request traffic is controlled by a leaky bucket algorithm, with a feedback loop that adjusts the next amount of acquisition based on the actual consumed capacity.
  • The current consumed capacity is updated and presented in real time. But, in the first implementation, we will omit visualization like a graph.
  • To prevent consuming capacity unintentionally, RCU and WCU must be provided by the user.
  • The internal request manager controls the maximum number of parallel requests to DynamoDB. It is responsible for scaling the number of parallel requests in or out. It scales requests exponentially with base 2.
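The leaky bucket idea above can be sketched as follows. This is a simplified illustration: the proposal's feedback loop on actual consumed capacity is reduced here to a fixed drain rate:

```python
class LeakyBucket:
    """Simplified leaky bucket: a request is admitted only while the
    bucket has room, and the bucket drains at a fixed rate over time."""

    def __init__(self, capacity: float, drain_per_sec: float):
        self.capacity = capacity
        self.drain_per_sec = drain_per_sec
        self.level = 0.0
        self.last = 0.0  # logical clock in seconds

    def allow(self, now: float, cost: float = 1.0) -> bool:
        # Drain according to elapsed time, then try to add the request's cost.
        self.level = max(0.0, self.level - (now - self.last) * self.drain_per_sec)
        self.last = now
        if self.level + cost <= self.capacity:
            self.level += cost
            return True
        return False

bucket = LeakyBucket(capacity=2, drain_per_sec=1)
assert bucket.allow(0.0)      # level 0 -> 1
assert bucket.allow(0.0)      # level 1 -> 2
assert not bucket.allow(0.0)  # bucket full, request rejected
assert bucket.allow(1.0)      # one unit drained after 1s
```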

Interface

In the first implementation, load testing functionality is provided with the command dy bench run or dy benchmark run, and the provided options are the following:

  • --rcu <number>: Specify target RCU when reading items. This is a required argument.
  • --wcu <number>: Specify target WCU when writing items. This is a required argument unless you provide --skip-item-creation.
  • --size <number>: The preferred size of an attribute in bytes. The default value is 500.
  • --skip-item-creation: By default, dynein creates items first for the writing test, and then performs the read tests using the created items. This option skips WCU testing and uses the data already stored in the table.
  • --partition-key-variations <number>: The maximum number of primary key variations of items. The default value is 1000.
  • --sort-key-variations <number>: The maximum number of sort key variations of items. The default value is 100.
  • --duration-write <number>: The duration of the write testing. The default value is five minutes.
  • --duration-read <number>: The duration of the read testing. The default value is five minutes.

Common options like --table, --region, etc are considered as well as other commands.

We use a bench run subcommand for the initial implementation. Please note that we have room for future enhancements. For example, we could use dy bench run -s <scenario-file> for scenario-based tests and dy bench report <report-file> for showing the result of a test.

The workflow

The workflow of the load testing is schematically described as follows:

  1. Based on the --item-variations argument, create a list of primary keys to use in the test. In the case where --skip-item-creation is provided, Scan APIs are invoked to list primary keys instead. We must use parallel scans because sequential scans create a hot partition.
  2. Based on the --wcu argument, PutItem requests are invoked with the primary keys created in the first step for the duration of --duration-write. Each created item has an additional string attribute of --size bytes.
  3. Based on the --rcu argument, GetItem requests are invoked with the primary keys created in the first step for the duration of --duration-read.

"dy ls --all-regions" fails when using a table in local region.

Description

dy ls --all-regions fails when using a table in local region.

How to reproduce

$ docker run -p 8000:8000 -d amazon/dynamodb-local

$ ./dy --region local admin create table localdb --keys pk
---
name: localdb
region: local
status: ACTIVE
schema:
  pk: pk (S)
  sk: ~
mode: OnDemand
capacity: ~
gsi: ~
lsi: ~
stream: ~
count: 0
size_bytes: 0
created_at: "2021-12-04T23:37:48+00:00"

$ ./dy --region local use -t localdb
Now you're using the table 'localdb' (local).

$ ./dy ls --all-regions
[2021-12-04T23:38:20Z ERROR dy::control] Request ID: Some("c55ee3a5-458a-478f-9fb8-f680086b8791") Body: {"__type":"com.amazonaws.dynamodb.v20120810#InternalFailure","Message":"The request processing has failed because of an unknown error, exception or failure."}

$ ./dy config dump
---
tables:
  local/localdb:
    region: local
    name: localdb
    pk:
      name: pk
      kind: S
    sk: ~
    indexes: ~
    mode: OnDemand

---
using_region: local
using_table: localdb
using_port: 8000

Additional Info

dy --region local ls succeeds even when using a table in local region.

$ ./dy --region local ls
DynamoDB tables in region: local
* localdb

After config clear, dy ls --all-regions succeeds.

$ ./dy config clear
[ec2-user@ip-172-31-83-239 release]$ ./dy ls --all-regions
DynamoDB tables in region: us-east-1
(snipped)
DynamoDB tables in region: ap-southeast-1
  No table in this region.

Unable to run Dynein with custom config directories

I request that we allow users to change the location of config files with the env var DYNEIN_CONFIG_DIR. Right now, it is always ~/.dynein.

Changing this is useful for users that may want to run a Dynein process which does not interfere with the state of another Dynein process.

Artifact downloads named improperly for filetype

The macOS archive is a plain tar archive, not a Gzipped tar archive:

% shasum -a 256 -c dynein-macos.tar.gz.sha256 
dynein-macos.tar.gz: OK

% file dynein-macos.tar.gz
dynein-macos.tar.gz: POSIX tar archive

% gunzip dynein-linux.tar.gz
gzip: dynein-linux.tar.gz: not in gzip format

% tar xzvf dynein-linux.tar.gz
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now

% tar xvf dynein-linux.tar.gz
x dy

%

Please either rename the files or actually compress them to avoid confusion.

How about rename cmd `dy` to `ddb`?

dy is pronounced similarly to die, which is a bad name in my view. I would prefer ddb because it's a very common service acronym for DynamoDB.

Run integration tests on GitHub Actions

Currently, we don't execute CI/CD tests on GitHub Actions. This issue tracks blockers of CI/CD on GitHub Actions.

The current main blockers are the following:

Additional related enhancements are the following:

Before we tackle this issue, we should resolve the above blockers.

Empty string set failing on export/import

After exporting some of my tables in JSON format, I have the following error when importing:

Error: BatchWriteError(Validation("One or more parameter values were invalid: An string set  may not be empty"))

It looks like this is due to a string set attribute in the export JSON:

    "tags": []

The original exported table can contain:

  "tags": {
    "L": []
  }

In this case, the import fails with the error and stops. Is this expected?

And when the original export is done, the table has a list, not a string set:

  "tags": {
    "L": [
      {
        "S": "a-tag"
      }
    ]
  }

The import is probably trying to create a String Set attribute (SS) instead of a List attribute (L) due to JSON loss of typing, right? Is there some way to get a full identical table with export then import?

Add base64 literal for binary type to dynein format

Because normal DynamoDB JSON uses base64 encoded string to pass a binary, dynein format should support base64 encoded binary. For example, it is beneficial if we can create an item using dy put "{'pk': b64'$(echo 'key1' | base64)'}" command.
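For reference, the base64 payload in the proposed b64'...' literal would be produced by standard base64 encoding. A Python sketch (illustrative; the literal itself is only a proposal):

```python
import base64

# DynamoDB JSON carries binary (B) values as base64-encoded strings.
# Note that `echo 'key1' | base64` encodes the trailing newline too.
payload = base64.b64encode(b"key1\n").decode("ascii")
assert payload == "a2V5MQo="

# Decoding restores the original bytes.
assert base64.b64decode(payload) == b"key1\n"
```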

Cannot update a string attribute with an UUID.

Trying to set a string attribute to a UUID fails. It seems like dynein tries to parse it as if it were an operation on a number; however, the quotes should indicate that it is a raw string.

> dy upd item1Key --set 'Attribute = "00000000-0000-0000-0000-000000000000"' --table DataTable
2022-05-17T21:47:25Z ERROR dy::data] failed to parse a right hand statement '"00000000-0000-0000-0000-000000000000"'. Valid syntax would be: 'Attr = "val"', or 'Attr = Attr + 100'

Weirdly enough, passing "a00000000-0000-0000-0000-000000000000" or "-00000000-0000-0000-0000-000000000000" fails too. I'm not sure where it comes from, but it definitely looks like a bug. If not, the solution should be made clear in the documentation.

Allow Full Attributes on Query or Scan

It would be useful to be able to access the full attributes of a given query or scan. Alternatively, the export command could allow exporting to stdout (as opposed to requiring a file).

Implement sophisticated parser other than dy upd

Currently, only the upd command provides the sophisticated parser for input, which was implemented by #132. We should offer the same feature for the following commands:

  • dy put
  • dy query for --sort-key
  • dy bwrite

Tasks

  1. enhancement (StoneDot)
  2. enhancement (StoneDot)
  3. enhancement (tmyoda)

"extern crate" isn't required in 2018 edition.

dynein uses 2018 edition as described in Cargo.toml.

edition = "2018"

In the 2018 edition, extern crate is unidiomatic, as described below.
https://doc.rust-lang.org/beta/reference/names/preludes.html

Edition Differences: In the 2015 edition, crates in the extern prelude cannot be referenced via use declarations, so it is generally standard practice to include extern crate declarations to bring them into scope.

Beginning in the 2018 edition, use declarations can reference crates in the extern prelude, so it is considered unidiomatic to use extern crate.

Why do we still use extern crate?

Implement sophisticated parser for bwrite command

We should offer the same parsing experience for the bwrite command as the upd command implemented by #132.

Currently, dynein supports the following way.

dy bwrite --input file.json

However, it does not feel much like dynein. The example below demonstrates the preferable way.

dy bwrite --put '{Dynein format}' --put '{Dynein format}' --del '{Dynein format}'

Infer preferable format based on the situation

This proposal is related to #176.

As I mentioned in #176, the dynein format is preferable when showing output for humans to read, but JSON is more suitable for consumption by programs. I propose inferring the preferred format using atty.

If the user uses a pipe, a program consumes output; if the user does not, it means that a human checks the output. I think this correspondence can be used to infer a preferable format.
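The proposed detection is essentially a TTY check on stdout. In Python terms (illustrative; the issue proposes the Rust atty crate):

```python
import sys

def preferred_format() -> str:
    """Pick a human-friendly format when stdout is a terminal, and a
    machine-friendly one when output is piped to another program."""
    return "dynein" if sys.stdout.isatty() else "json"
```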

Info: how to use SSO profiles

This is more of an FYI than an issue, so I'll close it. Just wanted to note that if, like our company, you use AWS SSO profiles as part of your workflow, dynein won't work out of the box. However, it's possible to wrap it with aws2-wrap which makes life very simple.

Say you have the following SSO profile in .aws/config:

[profile dev]
sso_start_url = https://d-123456789.awsapps.com/start#/
sso_region = us-west-2
sso_account_id = 123456789
sso_role_name = DevUser
region = us-west-2
output = json
cli_pager =

Normally, you'd log in using:

$ aws sso login --profile dev
$ aws dynamodb list-tables --profile dev

etc. However,

$ AWS_PROFILE=dev dy ls

won't work. Enter aws2-wrap:

$ pip3 install aws2-wrap==1.2.2
$ alias dydev="aws2-wrap --profile dev dy"
$ dydev ls
DynamoDB tables in region: us-west-2
  some-table
* some-other-table

Works a treat.

Unreasonable type guessing for SS and NS causes Validation error when importing backup

When trying to import a backup containing a record with a schema like this:

      "lines": {
        "L": [
          {
            "S": "Lohkoweg 40"
          }
        ]
      },

dynein assumes a schema like this:

      "lines": {
        "SS": [
          "Lohkoweg 40"
        ]
      },

This causes a problem when we try to import a record where there are duplicates (allowed by our schema) such as this:

"lines":["Theostraße 42-90","Hamburg","Hamburg"],

This is caused by an unreasonable assumption in dynein/src/data.rs
line 831

any list of strings is a String Set "SS" and any list of numbers is a Number Set "NS"

To be correct, there should be a check that there are no duplicates; if there are duplicates, it should be a list "L".

However, this would still not be enough for us. In our schema we have no uses of String Set "SS" or Number Set "NS" anywhere, and we do not like the idea of the schema changing from one record to another.

A solution would be to add a CLI option to the import function that disables string sets, and one that disables number sets.
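The duplicate check suggested above could look like the following sketch (a proposed fix written in Python for illustration, not dynein's actual Rust code):

```python
def infer_string_list_type(values):
    """Proposed rule: only emit a string set (SS) when every element is a
    string AND there are no duplicates; otherwise keep a plain list (L),
    since DynamoDB sets cannot contain duplicate elements."""
    all_strings = all(isinstance(v, str) for v in values)
    if all_strings and len(set(values)) == len(values):
        return "SS"
    return "L"

# The record from the report: duplicates force a plain list.
assert infer_string_list_type(["Lohkoweg 40"]) == "SS"
assert infer_string_list_type(["Theostraße 42-90", "Hamburg", "Hamburg"]) == "L"
```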

Thanks

example to update item's attribute as string

I can update an existing item to set/add attribute as boolean or number.

# working when set the `attr` as boolean or number
dy upd key --set "attr = false"
dy upd key --set "attr = 100"

When set an attribute as string, I got below error. No idea how making it work.

dy upd key --set "attr = 'ssss'"
thread 'main' panicked at 'failed to parse right hand object ''ssss'' into AttributeValue.', src/data.rs:553:29
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

dy upd key --set "attr = ssss"  
thread 'main' panicked at 'failed to parse right hand object 'ssss' into AttributeValue.', src/data.rs:553:29
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

dy upd fails when right hand statement contains a hyphen

Using dynein 0.2.1 on Amazon Linux 2, I am unable to run dy upd to update a string attribute when the new value contains a hyphen. For example, running:

 dy upd somepk --set 'myattr = "my-value"'

gives the error:

[2023-03-08T18:19:45Z ERROR dy::data] failed to parse a right hand statement '"my-value"'. Valid syntax would be: 'Attr = "val"', or 'Attr = Attr + 100'

but the same command succeeds if I remove the hyphen.

I also tried escaping the hyphen with a backslash, to the same result:

[2023-03-08T18:20:20Z ERROR dy::data] failed to parse a right hand statement '"my\-value"'. Valid syntax would be: 'Attr = "val"', or 'Attr = Attr + 100'

`bwrite` command does not support binary data types

Using binary types with the bwrite command results in the following error.

dy bwrite --table test  --input tests/resources/test_batch_write.json                                                                                                                                       
[2023-09-13T09:07:17Z ERROR dy::batch] [skip] invalid/unsupported DynamoDB JSON format: Object {"B": String("dGhpcyB0ZXh0IGlzIGJhc2U2NC1lbmNvZGVk")}
[2023-09-13T09:07:17Z ERROR dy::batch] [skip] invalid/unsupported DynamoDB JSON format: Object {"BS": Array [String("dGhpcyB0ZXh0IGlzIGJhc2U2NC1lbmNvZGVk"), String("sdfsdhSRHwFEDw4f")]}

This issue seems to be due to the lack of implementation for B and BS types in the following.

dynein/src/batch.rs

Lines 413 to 475 in f95b9fc

fn ddbjson_val_to_attrval(ddb_jsonval: &JsonValue) -> Option<AttributeValue> {
    // prepare shared logic that can be used for both SS and NS.
    let set_logic = |val: &JsonValue| -> Vec<String> {
        val.as_array()
            .expect("should be valid JSON array")
            .iter()
            .map(|el| el.as_str().expect("should -> str").to_string())
            .collect::<Vec<String>>()
    };

    // following list of if-else statements would be return value of this function.
    if let Some(x) = ddb_jsonval.get("S") {
        Some(AttributeValue {
            s: Some(x.as_str().unwrap().to_string()),
            ..Default::default()
        })
    } else if let Some(x) = ddb_jsonval.get("N") {
        Some(AttributeValue {
            n: Some(x.as_str().unwrap().to_string()),
            ..Default::default()
        })
    } else if let Some(x) = ddb_jsonval.get("BOOL") {
        Some(AttributeValue {
            bool: Some(x.as_bool().unwrap()),
            ..Default::default()
        })
    } else if let Some(x) = ddb_jsonval.get("SS") {
        Some(AttributeValue {
            ss: Some(set_logic(x)),
            ..Default::default()
        })
    } else if let Some(x) = ddb_jsonval.get("NS") {
        Some(AttributeValue {
            ns: Some(set_logic(x)),
            ..Default::default()
        })
    } else if let Some(x) = ddb_jsonval.get("L") {
        let list_element = x
            .as_array()
            .unwrap()
            .iter()
            .map(|el| ddbjson_val_to_attrval(el).expect("failed to digest a list element"))
            .collect::<Vec<AttributeValue>>();
        debug!("List Element: {:?}", list_element);
        Some(AttributeValue {
            l: Some(list_element),
            ..Default::default()
        })
    } else if let Some(x) = ddb_jsonval.get("M") {
        let inner_map: HashMap<String, AttributeValue> = ddbjson_attributes_to_attrvals(x);
        Some(AttributeValue {
            m: Some(inner_map),
            ..Default::default()
        })
    } else if ddb_jsonval.get("NULL").is_some() {
        Some(AttributeValue {
            null: Some(true),
            ..Default::default()
        })
    } else {
        None
    }
}

I will work on this issue.

Unable to bootstrap because the GET request for sampledata.zip fails with 403.

$ RUST_LOG=debug cargo run -- bootstrap
...
[2022-06-24T06:25:54Z DEBUG dy::bootstrap] temporary download & unzip directory: TempDir { path: "/tmp/.tmpJwnuKk" }
Temporarily downloading sample data from https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/samples/sampledata.zip
[2022-06-24T06:25:54Z DEBUG reqwest::connect] starting new connection: https://docs.aws.amazon.com/
[2022-06-24T06:25:54Z DEBUG reqwest::async_impl::client] response '403 Forbidden' for https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/samples/sampledata.zip
[2022-06-24T06:25:54Z DEBUG dy::bootstrap] Downloading the file at: /tmp/.tmpJwnuKk/downloaded_sampledata.zip
[2022-06-24T06:25:54Z DEBUG dy::bootstrap] Finished writing content of the downloaded data into '/tmp/.tmpJwnuKk/downloaded_sampledata.zip'
Error: ZipError(InvalidArchive("Could not find central directory end"))

Support DynamoDB JSON for import/export table

Currently, we do not support DynamoDB JSON type for importing and exporting, as mentioned in #66. This prevents us from copying the table data as it is. We should provide an import/export option to use DynamoDB JSON format for such a use case.

`cargo test` failed due to difference in command output on windows environment

On the Windows environment, the cargo test command fails because of a snapshot test failure:

Testing tests\cmd\bwrite.md:4 ... failed
Exit: success

---- expected: stdout
++++ actual:   stdout
   1      - dy-bwrite 0.2.1
        1 + dy[EXE]-bwrite 0.2.1
   2    2 | Put or Delete multiple items at one time, up to 25 requests. [API: BatchWriteItem]
   3    3 | 
   4    4 | https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchWriteItem.html
   5    5 | 
   6    6 | USAGE:
   7      -     dy bwrite [OPTIONS] --input <input>
        7 +     dy[EXE] bwrite [OPTIONS] --input <input>
   8    8 | 
   9    9 | FLAGS:
  10   10 |     -h, --help       
  11   11 |             Prints help information
  12   12 | 
          ⋮
  26   26 |             --region option in both top-level and subcommand-level
  27   27 |     -t, --table <table>      
  28   28 |             Target table of the operation. You can use --table option in both top-level and subcommand-level. You can
  29   29 |             store table schema locally by executing `$ dy use`, after that you need not to specify --table on every
  30   30 |             command
stderr:

We should consider a way to absorb this type of difference.

AWS auth does not work as per instructions

The instructions say:

How to Use

Prerequisites - AWS Credentials

First of all, please make sure you've already configured AWS Credentials in your environment. dynein depends on [rusoto](https://github.com/rusoto/rusoto) and rusoto [can utilize standard AWS credential toolchains](https://github.com/rusoto/rusoto/blob/master/AWS-CREDENTIALS.md) - for example ~/.aws/credentials file, [IAM EC2 Instance Profile](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), or environment variables such as AWS_DEFAULT_REGION / AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_PROFILE.

One convenient way to check if your AWS credential configuration is ok to use dynein is to install and try to execute [AWS CLI](https://aws.amazon.com/cli/) in your environment (e.g. $ aws dynamodb list-tables). Once you've [configured AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html), you should be ready to use dynein.

When I try:

> aws sts get-caller-identity
{
    "UserId": "<redacted>:[email protected]",
    "Account": "<redacted>",
    "Arn": "arn:aws:sts::<redacted>:assumed-role/<redacted>/[email protected]"
}

> aws s3 ls
2021-02-20 04:04:50 <redacted>
2021-02-20 04:04:39 <redacted>
2022-04-20 17:34:26 <redacted>
2022-04-20 17:34:26 <redacted>
2022-07-05 01:10:09 <redacted>
2022-08-25 16:36:38 <redacted>
2022-09-08 08:50:24 <redacted>

> echo $AWS_PROFILE
<redacted>

> echo $AWS_DEFAULT_REGION
eu-west-1

$HOME/.aws/config -->

[profile <redacted>]
azure_tenant_id=<redacted>
azure_app_id_uri=https://signin.aws.amazon.com/saml\#<redacted>
[email protected]
azure_default_role_arn=arn:aws:iam::<redacted>:role/<redacted>
azure_default_duration_hours=8
azure_default_remember_me=true
region=eu-west-1

[profile <redacted>]
source_profile=<redacted>
role_arn=arn:aws:iam::<redacted>:role/<redacted>
[email protected]
region=eu-west-1

> arch
arm64

> file $(which dy)
/Users/curtis.wahlfeld/.local/bin/dy: Mach-O 64-bit executable arm64

> dy --version
dynein 0.2.1

> RUST_LOG=debug RUST_BACKTRACE=1 dy ls
[2022-09-16T05:00:02Z DEBUG dy] Command details: Dynein { child: Some(List { all_regions: false }), region: None, port: None, table: None, shell: false }
[2022-09-16T05:00:02Z DEBUG dy::app] Loading Config File: /Users/curtis.wahlfeld/.dynein/config.yml
[2022-09-16T05:00:02Z DEBUG dy::app] Loaded current config: Config { using_region: None, using_table: None, using_port: None }
[2022-09-16T05:00:02Z DEBUG dy::app] Loading Cache File: /Users/curtis.wahlfeld/.dynein/cache.yml
[2022-09-16T05:00:02Z DEBUG dy::app] Loaded current cache: Cache { tables: None }
[2022-09-16T05:00:02Z DEBUG dy] Initial command context: Context { config: Some(Config { using_region: None, using_table: None, using_port: None }), cache: Some(Cache { tables: None }), overwritten_region: None, overwritten_table_name: None, overwritten_port: None, output: None }
[2022-09-16T05:00:32Z DEBUG dy::control] ListTables API call got an error -- Credentials(
        CredentialsError {
            message: "Couldn't find AWS credentials in environment, credentials file, or IAM role.",
        },
    )
[2022-09-16T05:00:32Z ERROR dy::control] Couldn't find AWS credentials in environment, credentials file, or IAM role.

Race condition of loading config cause EndOfStream

Description

Sometimes test is failed due to EndOfStream as follows.

---- test_create_table_with_region_local_and_port_number_options stdout ----
thread 'test_create_table_with_region_local_and_port_number_options' panicked at 'Unexpected failure.
code-1
stderr=``````
[2023-09-16T13:54:39Z DEBUG dy] Command details: Dynein { child: Some(Admin { grandchild: Create { target_type: Table { new_table_name: \"table--test_create_table_with_region_local_and_port_number_options\", keys: [\"pk\"] } } }), region: Some(\"local\"), port: Some(8001), table: None, shell: false }
[2023-09-16T13:54:39Z DEBUG dy::app] Creating dynein config directory: /home/runner/.dynein
[2023-09-16T13:54:39Z DEBUG dy::app] Loading Config File: /home/runner/.dynein/config.yml
[2023-09-16T13:54:39Z INFO  dy::app] Config file doesn\'t exist in the path, hence creating a blank file: No such file or directory (os error 2)
[2023-09-16T13:54:39Z DEBUG dy::app] Loading Config File: /home/runner/.dynein/config.yml
Error: Yaml(EndOfStream)

The cause is that serde_yaml::from_str returns EndOfStream.

dynein/src/app.rs

Lines 364 to 368 in 0e627d8

Ok(_str) => {
    let config: Config = serde_yaml::from_str(&_str)?;
    debug!("Loaded current config: {:?}", config);
    Ok(config)
}
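The race is between checking whether the config file exists and writing its initial contents: a concurrent test process can create the blank file first, so this process reads a zero-length file and the YAML parse fails. A minimal std-only sketch of that sequence (load_or_create is an illustrative name, not dynein's actual function):

```rust
use std::fs;
use std::path::Path;

// Illustrative sketch of the racy load path (hypothetical, not dynein's
// actual code). If another process creates the blank config file between the
// existence check and the real write, this process can read a zero-length
// file, and parsing that empty string as YAML yields Yaml(EndOfStream).
fn load_or_create(path: &Path) -> std::io::Result<String> {
    if !path.exists() {
        // Race window: a concurrent process may also observe "doesn't exist"
        // and create the blank file first.
        fs::write(path, "")?; // blank file; real contents are written later
    }
    // May return "" here, which an empty-intolerant YAML parse rejects.
    fs::read_to_string(path)
}
```

Making the create-and-write step atomic (for example, writing to a temporary file and renaming it into place), or tolerating an empty file on load, would close the window.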

Implement IaC feature empowered by CloudFormation

IaC feature empowered by CloudFormation

Background

https://github.com/awslabs/dynein#infrastracture-as-code---enpowered-by-cloudformation

Infrastructure as Code - empowered by CloudFormation
NOTE: currently this feature is under development

Infrastructure as Code is a concept where you write code that provisions "infrastructure", such as DynamoDB tables, in a "declarative" way (in contrast, the dy admin create table and dy admin update table commands work in an "imperative" way).

To manage DynamoDB tables in a "declarative" way, dynein provides the dy admin plan and dy admin apply commands. Internally, dynein executes AWS CloudFormation APIs to provision DynamoDB resources for you.

Interface

The workflow

  1. create template file
$ ls
cfn.yml

$ cat cfn.yml
Resources:
  MyDDB:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: pk
          AttributeType: S
      KeySchema:
        - AttributeName: pk
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST
  2. dy admin plan

  3. dy admin apply

Roadmap

Tasks

[test] Improve setup function

Background

The setup() function in the tests is fairly heavy: it runs several processes, which increases the cost of the test suite. We will improve the performance of such functions, starting with setup().

Idea

  • make setup() lightweight
  • create a wrapper for Command::cargo_bin
  • reduce race conditions
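One way to sketch the wrapper idea: resolve the binary path once per test process and cache it, instead of paying the lookup in every setup() call. This is a hypothetical std-only sketch, not the repo's actual test code; Cargo does set CARGO_BIN_EXE_<name> for integration tests, but the "dy" fallback here is purely illustrative.

```rust
use std::process::Command;
use std::sync::OnceLock;

// Hypothetical sketch: resolve the dy binary path once and cache it so
// repeated setup() calls don't redo the lookup. Cargo sets CARGO_BIN_EXE_dy
// for integration tests; the "dy" fallback is for illustration only.
fn dy_bin() -> &'static str {
    static BIN: OnceLock<String> = OnceLock::new();
    BIN.get_or_init(|| {
        std::env::var("CARGO_BIN_EXE_dy").unwrap_or_else(|_| "dy".to_string())
    })
    .as_str()
}

// Build a ready-to-run Command against the cached binary path.
fn dy_command(args: &[&str]) -> Command {
    let mut cmd = Command::new(dy_bin());
    cmd.args(args);
    cmd
}
```

A per-process cache like this also narrows the window for races between tests, since each test no longer repeats the resolve-then-spawn dance from scratch.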

Add format option to show dynein format

Currently dynein outputs data in JSON format. That is useful when you feed the data to other tools, because JSON is a well-known format. However, it is inconvenient when you just want to quickly inspect data that resides in DynamoDB, because plain JSON does not show the DynamoDB data types.

I propose providing a new option to show items in dynein format.
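As an illustration of what a typed display could look like (the Attr enum and format_item function below are hypothetical, not dynein's internal types):

```rust
use std::collections::BTreeMap;

// Hypothetical attribute type for illustration only; dynein's real item
// representation differs.
#[derive(Debug)]
enum Attr {
    S(String),
    N(String),
    Bool(bool),
}

// Render an item so the DynamoDB type of each attribute stays visible,
// unlike plain JSON where {"id": 123} loses the N vs. S distinction.
fn format_item(item: &BTreeMap<String, Attr>) -> String {
    item.iter()
        .map(|(name, value)| format!("{}: {:?}", name, value))
        .collect::<Vec<_>>()
        .join(", ")
}
```

With such a format, an item would print as `id: N("123"), name: S("john")`, making the stored data type obvious at a glance.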

Add integration test for each command

Background

We need to check compatibility through tests before a pull request is merged. Some integration tests are already implemented, but they do not yet cover every command, so we need to add integration tests for all commands.

Tasks

`desc_all_tables` integration test sometimes fails

$ git show
commit f95b9fcce244e7a78b3112b68b13dd32fa959e16 (HEAD -> main, origin/main, origin/HEAD)
Author: Hiroaki Goto <[email protected]>
Date:   Fri Aug 25 07:36:42 2023 +0900

    test: fix broken tests for desc command

diff --git a/tests/desc.rs b/tests/desc.rs
index abb2069..cf1db78 100644
--- a/tests/desc.rs
+++ b/tests/desc.rs
  • AMI ID
ami-0d52744d6551d851e
  • OS version
 cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.2 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.2 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
  • Rust
rustc --version
rustc 1.69.0 (84c898d65 2023-04-16)

cargo --version
cargo 1.69.0 (6e9a83356 2023-04-12)

rustup --version
rustup 1.26.0 (5af9b9484 2023-04-05)
info: This is the version for the rustup toolchain manager, not the rustc compiler.
info: The currently active `rustc` version is `rustc 1.69.0 (84c898d65 2023-04-16)`

rustup toolchain list
stable-x86_64-unknown-linux-gnu (default)
1.69.0-x86_64-unknown-linux-gnu (override)


  • Error
ubuntu@ip-10-0-138-178:~/tmp/dynein$ RUST_BACKTRACE=full RUST_LOG=debug cargo test --test desc
    Finished test [unoptimized + debuginfo] target(s) in 0.18s
     Running tests/desc.rs (target/debug/deps/desc-b9a63d3448aa3521)

running 4 tests
test test_desc_non_existent_table ... ok
test test_desc_table_from_options ... ok
test test_desc_table_from_args ... ok
test test_desc_all_tables ... FAILED

failures:

---- test_desc_all_tables stdout ----
create temporary table: XKX7yywVK3brTykY
thread 'test_desc_all_tables' panicked at 'Unexpected failure.
code-1
stderr=```<281 lines total>

[2023-09-14T09:19:06Z DEBUG dy] Command details: Dynein { child: Some(Desc { target_table_to_desc: None, all_tables: true, output: None }), region: Some("local"), port: None, table: None, shell: false }
[2023-09-14T09:19:06Z DEBUG dy::app] Loading Config File: /home/ubuntu/.dynein/config.yml
[2023-09-14T09:19:06Z DEBUG dy::app] Loaded current config: Config { using_region: None, using_table: None, using_port: None }
[2023-09-14T09:19:06Z DEBUG dy::app] Loading Cache File: /home/ubuntu/.dynein/cache.yml
[2023-09-14T09:19:06Z DEBUG dy::app] Loaded current cache: Cache { tables: Some({"local/nJgvbJMLS3kdWCEJ": TableSchema { region: "local", name: "nJgvbJMLS3kdWCEJ", pk: Key { name: "pk", kind: S }, sk: None, indexes: None, mode: OnDemand }, "local/Ke2jRQM3n2NiRDDw": TableSchema { region: "local", name: "Ke2jRQM3n2NiRDDw", pk: Key { name: "pk", kind: S }, sk: Some(Key { name: "sk", kind: N }), indexes: None, mode: OnDemand }, "local/2AjCGFkT7ZYYfDQr": TableSchema { region: "local", name: "2AjCGFkT7ZYYfDQr", pk: Key { name: "pk", kind: S }, sk: Some(Key { name: "sk", kind: N }), indexes: None, mode: OnDemand }}) }
[2023-09-14T09:19:06Z DEBUG dy::app] setting DynamoDB Local 'http://localhost:8000' as target region.
[2023-09-14T09:19:06Z DEBUG dy] Initial command context: Context { config: Some(Config { using_region: None, using_table: None, using_port: None }), cache: Some(Cache { tables: Some({"local/nJgvbJMLS3kdWCEJ": TableSchema { region: "local", name: "nJgvbJMLS3kdWCEJ", pk: Key { name: "pk", kind: S }, sk: None, indexes: None, mode: OnDemand }, "local/Ke2jRQM3n2NiRDDw": TableSchema { region: "local", name: "Ke2jRQM3n2NiRDDw", pk: Key { name: "pk", kind: S }, sk: Some(Key { name: "sk", kind: N }), indexes: None, mode: OnDemand }, "local/2AjCGFkT7ZYYfDQr": TableSchema { region: "local", name: "2AjCGFkT7ZYYfDQr", pk: Key { name: "pk", kind: S }, sk: Some(Key { name: "sk", kind: N }), indexes: None, mode: OnDemand }}) }), overwritten_region: Some(Custom { name: "local", endpoint: "http://localhost:8000\" }), overwritten_table_name: None, overwritten_port: None, output: None }
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] Full request:
method: POST
final_uri: http://localhost:8000/
Headers:

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=ASIA2HDURIGY7YFLVB73/20230914/local/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-target, Signature=653378402a3d3aa7a5e7d83537edf0551297d69a2bedabae8de9910f8b0a4206"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-length:"2"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:"localhost:8000"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:"44136fa355b3678a1146ad16f7e8649e94fb4fc21fe77e8310c060f61caaff8a"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:"20230914T091906Z"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg=="
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.ListTables"

<221 lines omitted>

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:"localhost:8000"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:"33fa6cd1694b3da8e0ed826bd41a871f445be3f156578d05f22271504d63be85"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:"20230914T091906Z"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg=="
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.DescribeTable"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] user-agent:"rusoto/0.48.0 rust/1.69.0 linux"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] Full request:
method: POST
final_uri: http://localhost:8000/
Headers:

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=ASIA2HDURIGY7YFLVB73/20230914/local/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-target, Signature=e0f6848eb4b3ec3be2df196ec53f7f37fde04d77bd4ecff5242cda0b613b83f7"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-length:"32"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:"localhost:8000"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:"500695f2c87a6fad65bf19a0fb058d3ec3fa772408b16e39e8cfd8a3d50a6ce1"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:"20230914T091906Z"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg=="
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.DescribeTable"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] user-agent:"rusoto/0.48.0 rust/1.69.0 linux"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] Full request:
method: POST
final_uri: http://localhost:8000/
Headers:

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=ASIA2HDURIGY7YFLVB73/20230914/local/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-target, Signature=1a10ea2a6bf5a2390c2922aed0db8fca60c06ccfbf8606c291eced8be53e4daf"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-length:"32"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:"localhost:8000"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:"17e71deb2ba5335f3ccccb99a185bc2a479807631b8543212ec330121f42340f"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:"20230914T091906Z"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg=="
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.DescribeTable"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] user-agent:"rusoto/0.48.0 rust/1.69.0 linux"
[2023-09-14T09:19:06Z DEBUG dy::control] DescribeTable API call got an error -- Service(
ResourceNotFound(
"Cannot do operations on a non-existent table",
),
)
[2023-09-14T09:19:06Z ERROR dy::control] Cannot do operations on a non-existent table

command="/home/ubuntu/tmp/dynein/target/debug/dy" "--region" "local" "desc" "--all-tables"
code=1
stdout=""
stderr=<281 lines total>

[2023-09-14T09:19:06Z DEBUG dy] Command details: Dynein { child: Some(Desc { target_table_to_desc: None, all_tables: true, output: None }), region: Some(\"local\"), port: None, table: None, shell: false }
[2023-09-14T09:19:06Z DEBUG dy::app] Loading Config File: /home/ubuntu/.dynein/config.yml
[2023-09-14T09:19:06Z DEBUG dy::app] Loaded current config: Config { using_region: None, using_table: None, using_port: None }
[2023-09-14T09:19:06Z DEBUG dy::app] Loading Cache File: /home/ubuntu/.dynein/cache.yml
[2023-09-14T09:19:06Z DEBUG dy::app] Loaded current cache: Cache { tables: Some({\"local/nJgvbJMLS3kdWCEJ\": TableSchema { region: \"local\", name: \"nJgvbJMLS3kdWCEJ\", pk: Key { name: \"pk\", kind: S }, sk: None, indexes: None, mode: OnDemand }, \"local/Ke2jRQM3n2NiRDDw\": TableSchema { region: \"local\", name: \"Ke2jRQM3n2NiRDDw\", pk: Key { name: \"pk\", kind: S }, sk: Some(Key { name: \"sk\", kind: N }), indexes: None, mode: OnDemand }, \"local/2AjCGFkT7ZYYfDQr\": TableSchema { region: \"local\", name: \"2AjCGFkT7ZYYfDQr\", pk: Key { name: \"pk\", kind: S }, sk: Some(Key { name: \"sk\", kind: N }), indexes: None, mode: OnDemand }}) }
[2023-09-14T09:19:06Z DEBUG dy::app] setting DynamoDB Local \'http://localhost:8000\' as target region.
[2023-09-14T09:19:06Z DEBUG dy] Initial command context: Context { config: Some(Config { using_region: None, using_table: None, using_port: None }), cache: Some(Cache { tables: Some({\"local/nJgvbJMLS3kdWCEJ\": TableSchema { region: \"local\", name: \"nJgvbJMLS3kdWCEJ\", pk: Key { name: \"pk\", kind: S }, sk: None, indexes: None, mode: OnDemand }, \"local/Ke2jRQM3n2NiRDDw\": TableSchema { region: \"local\", name: \"Ke2jRQM3n2NiRDDw\", pk: Key { name: \"pk\", kind: S }, sk: Some(Key { name: \"sk\", kind: N }), indexes: None, mode: OnDemand }, \"local/2AjCGFkT7ZYYfDQr\": TableSchema { region: \"local\", name: \"2AjCGFkT7ZYYfDQr\", pk: Key { name: \"pk\", kind: S }, sk: Some(Key { name: \"sk\", kind: N }), indexes: None, mode: OnDemand }}) }), overwritten_region: Some(Custom { name: \"local\", endpoint: \"http://localhost:8000\" }), overwritten_table_name: None, overwritten_port: None, output: None }
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] Full request:
     method: POST
     final_uri: http://localhost:8000/
    Headers:

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] authorization:\"AWS4-HMAC-SHA256 Credential=ASIA2HDURIGY7YFLVB73/20230914/local/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-target, Signature=653378402a3d3aa7a5e7d83537edf0551297d69a2bedabae8de9910f8b0a4206\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-length:\"2\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-type:\"application/x-amz-json-1.0\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:\"localhost:8000\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:\"44136fa355b3678a1146ad16f7e8649e94fb4fc21fe77e8310c060f61caaff8a\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:\"20230914T091906Z\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:\"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg==\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:\"DynamoDB_20120810.ListTables\"

<221 lines omitted>

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:\"localhost:8000\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:\"33fa6cd1694b3da8e0ed826bd41a871f445be3f156578d05f22271504d63be85\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:\"20230914T091906Z\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:\"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg==\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:\"DynamoDB_20120810.DescribeTable\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] user-agent:\"rusoto/0.48.0 rust/1.69.0 linux\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] Full request:
     method: POST
     final_uri: http://localhost:8000/
    Headers:

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] authorization:\"AWS4-HMAC-SHA256 Credential=ASIA2HDURIGY7YFLVB73/20230914/local/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-target, Signature=e0f6848eb4b3ec3be2df196ec53f7f37fde04d77bd4ecff5242cda0b613b83f7\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-length:\"32\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-type:\"application/x-amz-json-1.0\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:\"localhost:8000\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:\"500695f2c87a6fad65bf19a0fb058d3ec3fa772408b16e39e8cfd8a3d50a6ce1\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:\"20230914T091906Z\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:\"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg==\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:\"DynamoDB_20120810.DescribeTable\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] user-agent:\"rusoto/0.48.0 rust/1.69.0 linux\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] Full request:
     method: POST
     final_uri: http://localhost:8000/
    Headers:

[2023-09-14T09:19:06Z DEBUG rusoto_core::request] authorization:\"AWS4-HMAC-SHA256 Credential=ASIA2HDURIGY7YFLVB73/20230914/local/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-target, Signature=1a10ea2a6bf5a2390c2922aed0db8fca60c06ccfbf8606c291eced8be53e4daf\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-length:\"32\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] content-type:\"application/x-amz-json-1.0\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] host:\"localhost:8000\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-content-sha256:\"17e71deb2ba5335f3ccccb99a185bc2a479807631b8543212ec330121f42340f\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-date:\"20230914T091906Z\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-security-token:\"IQoJb3JpZ2luX2VjEFkaDmFwLW5vcnRoZWFzdC0xIkcwRQIhAPnNXXQtxQ2oVhffpmDlfMKvE3gFhZ5cz2PeyusRx2NDAiArJWBy/0onk6e3NYxV6a/r7BoPW+CDWePlhL7TXQTAxirKBQhCEAEaDDcwMjQ3MTU1MzQ1NyIMCpfr3whS/HqIwnSYKqcFFRNFxr/MIKTTLpLtU3aspC4wKQmvmgdMAqSIZ1YY4uP0OJRtyUKrLtlAcR8TGuKOAgkcpDqBPC6pjf/PNb3FHoU5gwIC39FQ7RMimyaiVyjlZxSPeHI+gQYOaI3cYWYKPpGx9JH1WqNS0zMmS/L9ktZfULJzcygcgeg81IqQwlzYCy+sWs7fogwyzxAZ4ZtCSl+FH9XSg2bG635Z/ZrxDS47nlEwgn36PBmn/sTyrfw9of7fpxmWVITwaUw3L/qncl2OVjKKSGFr2e/aFxj9XMXB/xMV1XpOe4N5zLJCzK6D7uXjJf5yxGcYJNiEXSBXIQ9pl90GGDUK2w9ftYc8svEZGkS5VqxgKNu5cLPR0UeCtEMHmYnYGeQSpf+WVo0zYuWKGysjmsydDqlWGVbUYBfOYpEEgdQ05j6j1pZ7uLJaout1F/7zW2mphkUsSYWQLx7Do7XvlDHdsrfLOLCgNTrZi16PBDQ0W+tL44rxSwtCJVRGm3t6sNc7BwwOkeGWivXUyhNIOPnGbxwQhYiu+x8KBagkVBDPL+MuyvRhI4rA1UNUJJQNAuZ+ZiUJsMFYN99VKqGUFBFA/vDuaqglSJAXvy5l2uLn27n9tUz0Qz6lxmVj2Xkc2DvB+57onjCTlEDDsL8Hzn0eYyovKrhyXlc76kUBDYJte7pbGexAUN4goIB76Lin7XcVuuTytl94RPlKkVE5slGN9531523nTUEGDycYEMdM7mEAxcSmF1mFH70buXMIXqLDxgRpc/OJpzkrKoy9GdOKI2EYcFkRY4XW6Qf8tix/DbIbKbkkGaktMnl4YcKlXdvHGLDGJKA71x7TnPu9hTjkSjL2dDYvXmPODk4DQT0dKLRiNmiOHzBswVf+56Kjs9CCkMygRyi5kO3J9yPq0jCsk4uoBjqxAVf/s16eOk4Wpr2db1Vh7cDBRIOI+UNcLC02CstVwcrRO3FbNG42CgjKvP+n3Mh8VMdjLKSkJzOFv3AhBLWcG6BF8v9+dRf9ORqU6ZmuAqS2k2NwM8D+BChlBX8cpx8ZxbSHXJKGAehjKUeFaDGTP8fv7SoTeOspjD2x7ftkoK8sKbjDYqKIu6AQtiX/z9pwOZFSO6dP9Pxh9A/ty7FZvqRz1XO5EnVZauA9qJhPn8mYYg==\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] x-amz-target:\"DynamoDB_20120810.DescribeTable\"
[2023-09-14T09:19:06Z DEBUG rusoto_core::request] user-agent:\"rusoto/0.48.0 rust/1.69.0 linux\"
[2023-09-14T09:19:06Z DEBUG dy::control] DescribeTable API call got an error -- Service(
        ResourceNotFound(
            \"Cannot do operations on a non-existent table\",
        ),
    )
[2023-09-14T09:19:06Z ERROR dy::control] Cannot do operations on a non-existent table

', /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/ops/function.rs:250:5
stack backtrace:
0: 0x55fe0c88298a - std::backtrace_rs::backtrace::libunwind::trace::ha9053a9a07ca49cb
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/../../backtrace/src/backtrace/libunwind.rs:93:5
1: 0x55fe0c88298a - std::backtrace_rs::backtrace::trace_unsynchronized::h9c2852a457ad564e
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
2: 0x55fe0c88298a - std::sys_common::backtrace::_print_fmt::h457936fbfaa0070f
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys_common/backtrace.rs:65:5
3: 0x55fe0c88298a - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h5779d7bf7f70cb0c
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys_common/backtrace.rs:44:22
4: 0x55fe0c8aa88e - core::fmt::write::h5a4baaff1bcd3eb5
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/fmt/mod.rs:1232:17
5: 0x55fe0c87eff5 - std::io::Write::write_fmt::h478f79c628ef31d1
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/io/mod.rs:1684:15
6: 0x55fe0c882755 - std::sys_common::backtrace::_print::h5fcdc36060f177e8
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys_common/backtrace.rs:47:5
7: 0x55fe0c882755 - std::sys_common::backtrace::print::h54ca9458b876c8bf
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys_common/backtrace.rs:34:9
8: 0x55fe0c88459f - std::panicking::default_hook::{{closure}}::hbe471161c7664ed6
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:271:22
9: 0x55fe0c88425f - std::panicking::default_hook::ha3500da57aa4ac4f
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:287:9
10: 0x55fe0c10c7f7 - <alloc::boxed::Box<F,A> as core::ops::function::Fn>::call::hdf1b89dd137ece34
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/alloc/src/boxed.rs:2001:9
11: 0x55fe0c10c7f7 - test::test_main::{{closure}}::h9081a79f20e75698
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/test/src/lib.rs:135:21
12: 0x55fe0c884ccd - <alloc::boxed::Box<F,A> as core::ops::function::Fn>::call::h6507bddc3eebb4a5
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/alloc/src/boxed.rs:2001:9
13: 0x55fe0c884ccd - std::panicking::rust_panic_with_hook::h50c09d000dc561d2
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:696:13
14: 0x55fe0c884a49 - std::panicking::begin_panic_handler::{{closure}}::h9e2b2176e00e0d9c
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:583:13
15: 0x55fe0c882df6 - std::sys_common::backtrace::__rust_end_short_backtrace::h5739b8e512c09d02
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys_common/backtrace.rs:150:18
16: 0x55fe0c884752 - rust_begin_unwind
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:579:5
17: 0x55fe0c0c94b3 - core::panicking::panic_fmt::hf33a1475b4dc5c3e
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/panicking.rs:64:14
18: 0x55fe0c81a62c - core::panicking::panic_display::h6d51986636d8a5b9
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/panicking.rs:147:5
19: 0x55fe0c81fdb7 - assert_cmd::assert::AssertError::panic::h6c2e9f0f75fbf88e
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/assert_cmd-2.0.11/src/assert.rs:1033:9
20: 0x55fe0c820d17 - core::ops::function::FnOnce::call_once::hbcde93fefceffe5e
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/ops/function.rs:250:5
21: 0x55fe0c81e180 - core::result::Result<T,E>::unwrap_or_else::he37ca3c928540803
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/result.rs:1465:23
22: 0x55fe0c81ef8a - assert_cmd::assert::Assert::success::h140f2c001de31bfe
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/assert_cmd-2.0.11/src/assert.rs:158:9
23: 0x55fe0c0db2b1 - desc::test_desc_all_tables::{{closure}}::h81800d431328956a
at /home/ubuntu/tmp/dynein/tests/desc.rs:105:5
24: 0x55fe0c0d18ad - <core::pin::Pin as core::future::future::Future>::poll::hce8b359968a0c6b0
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/future/future.rs:125:9
25: 0x55fe0c0d17ca - <core::pin::Pin as core::future::future::Future>::poll::h0a19dade11a6492f
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/future/future.rs:125:9
26: 0x55fe0c0d1673 - tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::{{closure}}::{{closure}}::h9a2bef90e64c70a0
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:541:57
27: 0x55fe0c0d156c - tokio::runtime::coop::with_budget::he4ad1c37260581ad
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/coop.rs:107:5
28: 0x55fe0c0d156c - tokio::runtime::coop::budget::h59424dc29770443a
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/coop.rs:73:5
29: 0x55fe0c0d156c - tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::{{closure}}::h4ec4552abbe347d3
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:541:25
30: 0x55fe0c0d001c - tokio::runtime::scheduler::current_thread::Context::enter::he797df4c086cd110
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:350:19
31: 0x55fe0c0d0b7a - tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::h427865ba761e033d
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:540:36
32: 0x55fe0c0d0852 - tokio::runtime::scheduler::current_thread::CoreGuard::enter::{{closure}}::h72fed09ad7fe8ac1
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:615:57
33: 0x55fe0c0d7cdf - tokio::macros::scoped_tls::ScopedKey::set::h31044468c3ae8e23
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/macros/scoped_tls.rs:61:9
34: 0x55fe0c0d05dd - tokio::runtime::scheduler::current_thread::CoreGuard::enter::h25479f7676ad30c1
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:615:27
35: 0x55fe0c0d0888 - tokio::runtime::scheduler::current_thread::CoreGuard::block_on::hed12a46d7de2e2e8
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:530:19
36: 0x55fe0c0cf6d3 - tokio::runtime::scheduler::current_thread::CurrentThread::block_on::h3324286d5dadadaf
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/scheduler/current_thread.rs:154:24
37: 0x55fe0c0d5364 - tokio::runtime::runtime::Runtime::block_on::h4185dd94a6993e51
at /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.28.0/src/runtime/runtime.rs:302:47
38: 0x55fe0c0d869c - desc::test_desc_all_tables::h99acdcedea96c3d0
at /home/ubuntu/tmp/dynein/tests/desc.rs:125:5
39: 0x55fe0c0dabc7 - desc::test_desc_all_tables::{{closure}}::hb90fe737c00c64e5
at /home/ubuntu/tmp/dynein/tests/desc.rs:100:36
40: 0x55fe0c0d6265 - core::ops::function::FnOnce::call_once::h72e9f0fd494afede
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/ops/function.rs:250:5
41: 0x55fe0c111bdf - core::ops::function::FnOnce::call_once::h7d969581be7d0075
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/ops/function.rs:250:5
42: 0x55fe0c111bdf - test::__rust_begin_short_backtrace::hca25a52684e56655
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/test/src/lib.rs:656:18
43: 0x55fe0c0e31fc - test::run_test::{{closure}}::ha8ac744c4af1a4bb
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/test/src/lib.rs:647:30
44: 0x55fe0c0e31fc - core::ops::function::FnOnce::call_once{{vtable.shim}}::h7b411a9ed45bdecf
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/ops/function.rs:250:5
45: 0x55fe0c110ba6 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce>::call_once::hd05328869a8ed200
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/alloc/src/boxed.rs:1987:9
46: 0x55fe0c110ba6 - <core::panic::unwind_safe::AssertUnwindSafe as core::ops::function::FnOnce<()>>::call_once::h0c601ce20bd5b2be
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/panic/unwind_safe.rs:271:9
47: 0x55fe0c110ba6 - std::panicking::try::do_call::ha4be5c164fe30854
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:487:40
48: 0x55fe0c110ba6 - std::panicking::try::h604546f3609af05f
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:451:19
49: 0x55fe0c110ba6 - std::panic::catch_unwind::h5bfa8afe44c9c2f9
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panic.rs:140:14
50: 0x55fe0c110ba6 - test::run_test_in_process::hd50dee55dd63e6aa
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/test/src/lib.rs:679:27
51: 0x55fe0c110ba6 - test::run_test::run_test_inner::{{closure}}::h4aa0f433aa85cea8
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/test/src/lib.rs:573:39
52: 0x55fe0c0dd871 - test::run_test::run_test_inner::{{closure}}::had56431adf24b4d6
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/test/src/lib.rs:600:37
53: 0x55fe0c0dd871 - std::sys_common::backtrace::rust_begin_short_backtrace::ha8d9890e0c73bf13
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys_common/backtrace.rs:134:18
54: 0x55fe0c0e328b - std::thread::Builder::spawn_unchecked::{{closure}}::{{closure}}::h251832bcb4a95327
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/thread/mod.rs:560:17
55: 0x55fe0c0e328b - <core::panic::unwind_safe::AssertUnwindSafe as core::ops::function::FnOnce<()>>::call_once::h8fdff598fa414831
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/panic/unwind_safe.rs:271:9
56: 0x55fe0c0e328b - std::panicking::try::do_call::h455dd2f7764a950f
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:487:40
57: 0x55fe0c0e328b - std::panicking::try::had6f23b7b73ae72a
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panicking.rs:451:19
58: 0x55fe0c0e328b - std::panic::catch_unwind::ha6a5ca3915b4dad0
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/panic.rs:140:14
59: 0x55fe0c0e328b - std::thread::Builder::spawn_unchecked::{{closure}}::h883b72ef75da6231
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/thread/mod.rs:559:30
60: 0x55fe0c0e328b - core::ops::function::FnOnce::call_once{{vtable.shim}}::hb7dff73dc8bd2ccb
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/core/src/ops/function.rs:250:5
61: 0x55fe0c8899e3 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce>::call_once::h39990b24eedef2ab
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/alloc/src/boxed.rs:1987:9
62: 0x55fe0c8899e3 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce>::call_once::h01a027258444143b
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/alloc/src/boxed.rs:1987:9
63: 0x55fe0c8899e3 - std::sys::unix::thread::Thread::new::thread_start::ha4f1cdd9c25884ba
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library/std/src/sys/unix/thread.rs:108:17
64: 0x7fac84494b43 -
65: 0x7fac84526a00 -
66: 0x0 -

failures:
test_desc_all_tables

test result: FAILED. 3 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.42s

error: test failed, to rerun pass --test desc

Set bootstrap data locally instead of downloading

Currently, we use the download_and_extract_zip function to fetch the bootstrap data:

async fn download_and_extract_zip(target: &str) -> Result<tempfile::TempDir, DyneinBootstrapError> {

Downloading the JSON files on every bootstrap is redundant and can fail during the unzip step.

In my case, the following error occurred:

Error logs:
$ uname -a
Darwin c889f3a95659 22.5.0 Darwin Kernel Version 22.5.0: Mon Apr 24 20:52:24 PDT 2023; root:xnu-8796.121.2~5/RELEASE_ARM64_T6000 arm64

$ curl -O -L https://github.com/awslabs/dynein/releases/latest/download/dynein-macos.tar.gz
$ tar xzvf dynein-macos.tar.gz
$ mv dy /usr/local/bin/

$ RUST_LOG=debug dy bootstrap
[2023-08-13T15:43:34Z DEBUG dy] Command details: Dynein { child: Some(Bootstrap { list: false, sample: None }), region: None, port: None, table: None, shell: false }
[2023-08-13T15:43:34Z DEBUG dy::app] Loading Config File: /Users/herotaka/.dynein/config.yml
[2023-08-13T15:43:34Z DEBUG dy::app] Loaded current config: Config { using_region: None, using_table: None, using_port: None }
[2023-08-13T15:43:34Z DEBUG dy::app] Loading Cache File: /Users/herotaka/.dynein/cache.yml
[2023-08-13T15:43:34Z DEBUG dy::app] Loaded current cache: Cache { tables: None }
[2023-08-13T15:43:34Z DEBUG dy] Initial command context: Context { config: Some(Config { using_region: None, using_table: None, using_port: None }), cache: Some(Cache { tables: None }), overwritten_region: None, overwritten_table_name: None, overwritten_port: None, output: None }
Bootstrapping - dynein will creates 4 sample tables defined here:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/AppendixSampleTables.html

'ProductCatalog' - simple primary key table
    Id (N)

'Forum' - simple primary key table
    Name (S)

'Thread' - composite primary key table
    ForumName (S)
    Subject (S)

'Reply' - composite primary key table, with GSI named 'PostedBy-Message-Index'
    Id (S)
    ReplyDateTime (S)

[2023-08-13T15:43:34Z DEBUG dy::control] Trying to create a table 'ProductCatalog' with keys '["Id,N"]'
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] Full request: 
     method: POST
     final_uri: https://dynamodb.eu-west-1.amazonaws.com/
    Headers:
    
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=xxxxxxxxxx/20230813/eu-west-1/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-target, Signature=xxxxxxxxxx"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] content-length:"184"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] host:"dynamodb.eu-west-1.amazonaws.com"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-content-sha256:"xxxxxxxxxx"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-date:"20230813T154334Z"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.CreateTable"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] user-agent:"rusoto/0.47.0 rust/1.56.1 macos"
[skip] Table 'ProductCatalog' already exists, skipping to create new one.
[2023-08-13T15:43:34Z DEBUG dy::control] Trying to create a table 'Forum' with keys '["Name,S"]'
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] Full request: 
     method: POST
     final_uri: https://dynamodb.eu-west-1.amazonaws.com/
    Headers:
    
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=xxxxxxxxxx/20230813/eu-west-1/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-target, Signature=xxxxxxxxxx"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] content-length:"179"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] host:"dynamodb.eu-west-1.amazonaws.com"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-content-sha256:"xxxxxxxxxx"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-date:"20230813T154334Z"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.CreateTable"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] user-agent:"rusoto/0.47.0 rust/1.56.1 macos"
[skip] Table 'Forum' already exists, skipping to create new one.
[2023-08-13T15:43:34Z DEBUG dy::control] Trying to create a table 'Thread' with keys '["ForumName,S", "Subject,S"]'
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] Full request: 
     method: POST
     final_uri: https://dynamodb.eu-west-1.amazonaws.com/
    Headers:
    
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=xxxxxxxxxx/20230813/eu-west-1/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-target, Signature=xxxxxxxxxx"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] content-length:"284"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] host:"dynamodb.eu-west-1.amazonaws.com"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-content-sha256:"xxxxxxxxxx"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-date:"20230813T154334Z"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.CreateTable"
[2023-08-13T15:43:34Z DEBUG rusoto_core::request] user-agent:"rusoto/0.47.0 rust/1.56.1 macos"
[skip] Table 'Thread' already exists, skipping to create new one.
[2023-08-13T15:43:35Z DEBUG dy::control] Trying to create a table 'Reply' with keys '["Id,S", "ReplyDateTime,S"]'
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] Full request: 
     method: POST
     final_uri: https://dynamodb.eu-west-1.amazonaws.com/
    Headers:
    
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] authorization:"AWS4-HMAC-SHA256 Credential=xxxxxxxxxx/20230813/eu-west-1/dynamodb/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-target, Signature=xxxxxxxxxx"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] content-length:"281"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] content-type:"application/x-amz-json-1.0"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] host:"dynamodb.eu-west-1.amazonaws.com"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] x-amz-content-sha256:"xxxxxxxxxx"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] x-amz-date:"20230813T154335Z"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] x-amz-target:"DynamoDB_20120810.CreateTable"
[2023-08-13T15:43:35Z DEBUG rusoto_core::request] user-agent:"rusoto/0.47.0 rust/1.56.1 macos"
[skip] Table 'Reply' already exists, skipping to create new one.
[2023-08-13T15:43:35Z DEBUG dy::bootstrap] temporary download & unzip directory: TempDir { path: "/var/folders/jg/xxxxxxxxxxx/T/.tmpvIndj8" }
Temporarily downloading sample data from https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/samples/sampledata.zip
[2023-08-13T15:43:35Z DEBUG reqwest::connect] starting new connection: https://docs.aws.amazon.com/
[2023-08-13T15:43:35Z DEBUG reqwest::async_impl::client] response '403 Forbidden' for https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/samples/sampledata.zip
[2023-08-13T15:43:35Z DEBUG dy::bootstrap] Downloading the file at: /var/folders/jg/xxxxxxxxxxx/T/.tmpvIndj8/downloaded_sampledata.zip
[2023-08-13T15:43:35Z DEBUG dy::bootstrap] Finished writing content of the downloaded data into '/var/folders/jg/xxxxxxxxxx/T/.tmpvIndj8/downloaded_sampledata.zip'
Error: ZipError(InvalidArchive("Could not find central directory end"))

So, bundling the bootstrap data locally in the repository, e.g. dynein/src/resources/bootstrap/Forum.json, would be more reliable.
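The proposal above can be sketched as follows. This is an illustrative sketch only, not dynein's actual API: in a real change the sample data would live in a file such as src/resources/bootstrap/Forum.json and be embedded with include_str!; here a short inline literal (with attributes taken from the AWS docs sample tables) stands in for the file so the snippet is self-contained.

```rust
// Hypothetical sketch: ship the sample data inside the crate instead of
// downloading sampledata.zip at runtime. With `include_str!` the file is
// embedded into the binary at build time, so bootstrap needs no network
// access and no unzip step.
//
// Inline stand-in for include_str!("resources/bootstrap/Forum.json"):
static FORUM_JSON: &str = r#"[
    {"Name": {"S": "Amazon DynamoDB"}, "Threads": {"N": "2"}},
    {"Name": {"S": "Amazon S3"}, "Threads": {"N": "0"}}
]"#;

/// Return the embedded sample data for a bootstrap table, if any.
fn bootstrap_data(table: &str) -> Option<&'static str> {
    match table {
        "Forum" => Some(FORUM_JSON),
        _ => None,
    }
}

/// Count top-level items; a real implementation would deserialize the
/// DynamoDB-JSON items with serde_json instead of string matching.
fn bootstrap_item_count(json: &str) -> usize {
    json.matches("\"Name\"").count()
}

fn main() {
    let data = bootstrap_data("Forum").expect("embedded data");
    println!("Forum items: {}", bootstrap_item_count(data));
}
```

Because the data is baked in at compile time, the 403 and ZipError failures shown in the log above cannot occur at all.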

List tables does not work for the local region

$ dy -r local list
or
$ dy admin -r local list

fails with

[2021-03-01T02:25:07Z ERROR dy::app] 
    To execute the command you must specify target table in one of following ways:
        * [RECOMMENDED] $ dy use <your_table> ... save target table to use.
        * Or, optionally you can pass --region and --table options to specify target for your commands. Refer --help for more information.
    To find all tables in all regions, try:
        * $ dy ls --all-regions

I have used
$ dy -r local bootstrap
and it worked fine, populating the database with 4 tables.
Other individual commands, such as scan and query, work on those tables with -r local.
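One possible direction for a fix, sketched below with hypothetical names (dynein's real region handling lives elsewhere and differs from this): treat "local" as a pseudo-region that resolves to the DynamoDB Local endpoint, so that listing tables does not demand a table context the way item-level commands do.

```rust
// Illustrative sketch, not dynein's actual code: map the pseudo-region
// "local" to a DynamoDB Local endpoint, and any other region name to the
// regular AWS endpoint. A `list` command would then only need an endpoint,
// never a saved table, so `dy -r local list` could succeed.
fn resolve_endpoint(region: &str, port: u16) -> String {
    match region {
        "local" => format!("http://localhost:{}", port),
        r => format!("https://dynamodb.{}.amazonaws.com", r),
    }
}

fn main() {
    println!("{}", resolve_endpoint("local", 8000));
    println!("{}", resolve_endpoint("eu-west-1", 8000));
}
```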

Export command causes a panic when the table doesn't have items

Description

If the table doesn't have any items, the export command panics with an arithmetic underflow in truncate.

> cargo run -- scan
    Finished dev [unoptimized + debuginfo] target(s) in 0.28s
     Running `target/debug/dy scan`
No item to show in the table 'fuga'

> cargo run -- export --output-file export_test
    Finished dev [unoptimized + debuginfo] target(s) in 0.27s
     Running `target/debug/dy export --output-file export_test`
Specified output file already exists. Is it OK to truncate contents? yes
thread 'main' panicked at 'attempt to subtract with overflow', src/transfer.rs:461:20
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

We will need a simpler way to concatenate the output instead of inserting commas manually.

dynein/src/transfer.rs

Lines 451 to 465 in bd470cf

/// This function tweaks scan output items.
/// Each scan iteration, converted string would be a single JSON array: e.g. [ {a:1}, {a:2} ]
/// When multiple scan is needed (i.e. when last_evaluated_key is Some), connected string would be: e.g. [ {a:1}, {a:2} ][ {a:3}, {a:4} ]
/// To avoid this invalid JSON from written to output file, this method remove the first "[" and the last "]", then add "," after the last item.
fn connectable_json(mut s: String, compact: bool) -> String {
    s.remove(0); // remove first char "["
    let len = s.len();
    if compact {
        s.truncate(len - 1); // remove last char "]"
    } else {
        s.truncate(len - 2); // underflows here (transfer.rs:461) when s is too short, i.e. the table is empty
    }
    s.push(','); // add last "," so that continue to next iteration
    s
}
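The simpler concatenation suggested above could look like this. It is only a sketch under assumed types (names are illustrative, not dynein's): instead of stripping "[" and "]" from each scan page and re-adding commas, accumulate the item strings across pages and emit one valid JSON array at the end, which handles the empty-table case naturally.

```rust
// Hypothetical replacement strategy for connectable_json: collect every
// item string from every scan page, then build the JSON array in a single
// formatting step. No bracket surgery, so nothing can underflow.
fn join_pages(pages: &[Vec<String>]) -> String {
    let items: Vec<&str> = pages
        .iter()
        .flatten()
        .map(String::as_str)
        .collect();
    // An empty table yields "[]" instead of panicking.
    format!("[{}]", items.join(","))
}

fn main() {
    let empty: Vec<Vec<String>> = vec![];
    println!("{}", join_pages(&empty)); // "[]"

    let pages = vec![
        vec![r#"{"a":1}"#.to_string(), r#"{"a":2}"#.to_string()],
        vec![r#"{"a":3}"#.to_string()],
    ];
    println!("{}", join_pages(&pages)); // "[{"a":1},{"a":2},{"a":3}]"
}
```

The trade-off is that all pages are buffered before writing; for a streaming export, serializing each page's items with serde_json and writing a separator only between non-empty pages would achieve the same safety.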
