ex-aws / ex_aws
A flexible, easy-to-use set of clients for AWS APIs in Elixir
Home Page: https://hex.pm/packages/ex_aws
License: MIT License
It would be awesome if this client worked with EC2 roles, pulling credentials from instance metadata. Even better would be to use the default credentials chain that the Amazon-developed clients use :)
I'd like to use MIX_ENV=gamma or other custom environments. It currently fails with:
21) test sort works (OuteTest)
test/backends_test.exs:125
** (FunctionClauseError) no function clause matching in ExAws.Config.Defaults.defaults/1
stacktrace:
lib/ex_aws/config/defaults.ex:10: ExAws.Config.Defaults.defaults(:gamma)
lib/ex_aws/config/defaults.ex:4: ExAws.Config.Defaults.defaults/0
lib/ex_aws/config.ex:26: ExAws.Config.get/1
lib/ex_aws/config.ex:14: ExAws.Config.build/2
lib/ex_aws/dynamo.ex:2: ExAws.Dynamo.delete_table/1
test/backends_test.exs:12: OuteTest.__ex_unit_setup_0/1
test/backends_test.exs:1: OuteTest.__ex_unit__/2
Amazon has released support for a PutRecords action, allowing batched puts.
http://docs.aws.amazon.com/kinesis/latest/APIReference/API_PutRecords.html
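For reference, a sketch of the request body that PutRecords expects per the linked API reference. The module, stream name, and records here are illustrative, not ex_aws code; the Data blob must be base64-encoded.

```elixir
defmodule PutRecordsSketch do
  # Build a PutRecords request body from {data, partition_key} pairs.
  # Kinesis requires each Data value to be base64-encoded.
  def request_body(stream_name, records) do
    %{
      "StreamName" => stream_name,
      "Records" =>
        Enum.map(records, fn {data, partition_key} ->
          %{"Data" => Base.encode64(data), "PartitionKey" => partition_key}
        end)
    }
  end
end
```

A real client would also need to handle the per-record failure entries in the PutRecords response, since the call can partially succeed.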
Add test configuration for Travis CI. See, e.g., CargoSense/vex#14.
Obviously tests that depend on the network and/or local DynamoDB, etc, should not be run by Travis CI, so more configuration may be necessary.
Hi Ben,
I was using DynamoDB Local via ExAws just fine until this commit: d68ae5f
I have host: "localhost"
which seems to cause the following crash:
** (EXIT) an exception was raised:
** (ArgumentError) argument error
(stdlib) binary.erl:317: :binary.replace/4
(ex_aws) lib/ex_aws/config.ex:67: ExAws.Config.parse_host_for_region/1
(ex_aws) lib/ex_aws/config.ex:16: ExAws.Config.build/2
Is it still possible to use DynamoDB Local via ExAws? If so, how? If not, could you consider adding that option back?
Thanks!
The functions in dynamo and kinesis should all have bang variants that simply return the value, or raise an exception.
params = @params |> format_and_take(opts)
must be
params = opts |> format_and_take(@params)
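For context, a hypothetical sketch of what a format_and_take/2 along these lines would do: it filters the caller's opts by a whitelist of permitted keys and camelizes them for the AWS JSON API, which is why the argument order matters. The implementation below is assumed for illustration, not taken from ex_aws.

```elixir
defmodule FormatSketch do
  # Keep only permitted keys from the caller's opts, then camelize them
  # into the CamelCase parameter names AWS expects.
  def format_and_take(opts, permitted) do
    opts
    |> Keyword.take(permitted)
    |> Enum.into(%{}, fn {k, v} -> {camelize(k), v} end)
  end

  defp camelize(key) do
    key
    |> Atom.to_string()
    |> String.split("_")
    |> Enum.map_join(&String.capitalize/1)
  end
end
```

With this shape, `opts |> format_and_take(@params)` reads naturally: the data is piped in, the whitelist is the argument.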
** (Protocol.UndefinedError) protocol Enumerable not implemented for "connection/1.0.2/css/elixir.css"
(elixir) lib/enum.ex:1: Enumerable.impl_for!/1
(elixir) lib/enum.ex:112: Enumerable.reduce/3
(elixir) lib/enum.ex:1398: Enum.reduce/3
(ex_aws) lib/ex_aws/s3/impl.ex:201: ExAws.S3.Impl.delete_object/4
(ex_aws) lib/ex_aws/s3/impl.ex:204: ExAws.S3.Impl.delete_object!/4
(elixir) lib/enum.ex:585: anonymous fn/3 in Enum.each/2
(elixir) lib/enum.ex:1390: anonymous fn/3 in Enum.reduce/3
(elixir) lib/stream.ex:1220: anonymous fn/3 in Enumerable.Stream.reduce/3
(elixir) lib/enum.ex:2607: Enumerable.List.reduce/3
(elixir) lib/stream.ex:1119: Stream.do_list_resource/6
(elixir) lib/stream.ex:1240: Enumerable.Stream.do_each/4
(elixir) lib/enum.ex:1389: Enum.reduce/3
(elixir) lib/enum.ex:584: Enum.each/2
(hex_web) lib/hex_web/api/handlers/docs.ex:120: HexWeb.API.Handlers.Docs.upload_docs/3
(elixir) lib/task/supervised.ex:74: Task.Supervised.do_apply/2
(stdlib) proc_lib.erl:240: :proc_lib.init_p_do_apply/
Happens on commit 649d36c.
This is most likely a mistake on my part, so on that note, is there an easy way to see the generated Dynamo request for debugging purposes?
Let's say I have an Item in a table called Notifications:
{:ok, n} = Dynamo.get_item("Notifications", %{ id: "1234" });
=> {:ok,
%{"Item" => %{"attempts" => %{"N" => "0"}, "id" => %{"S" => "1234"},
"max_attempts" => %{"N" => "20"}, "mock" => %{"BOOL" => false},
"request" => %{"M" => %{}}, "response" => %{"M" => %{}},
"state" => %{"S" => "pending"}}}}
I then want to update that item, and since I'm updating an older application to use expressions, I'll also need to use expression attribute names for some of these keys with reserved names.
Dynamo.update_item("Notifications", %{id: n.id}, %{ return_values: "ALL_NEW", expression_attribute_names: %{ "#s" => "state" }, update_expression: "SET #s = foobar" })
=> {:error,
{"ValidationException",
"The provided expression refers to an attribute that does not exist in the item"}}
Again, I'm sure this is something wrong on my end. I could always stub out the Dynamo methods and see what's being passed in, but a baked in way to get the generated request body would be really helpful in situations like this.
Edit: To make things more frustrating, things seem to be working fine with good ol' AttributeUpdates.
Dynamo.update_item(table, %{id: "1234"}, %{return_values: "ALL_NEW", attribute_updates: %{state: %{Action: "PUT", Value: %{S: "queued"}}}})
=> {:ok,
%{"Attributes" => %{"attempts" => %{"N" => "0"}, "id" => %{"S" => "1234"},
"max_attempts" => %{"N" => "20"}, "mock" => %{"BOOL" => false},
"request" => %{"M" => %{}}, "response" => %{"M" => %{}},
"state" => %{"S" => "queued"}}}}
It stored OK, but it can't be decoded:
iex(29)> ExAws.Dynamo.get_item("Graph-dev",%{id: "3",r: "a"})
{:ok,
%{"Item" => %{"created_at" => %{"N" => "1431134399.66159391403"},
"id" => %{"S" => "3"}, "lst" => %{"NS" => ["4", "5", "6"]},
"r" => %{"S" => "a"}, "that" => %{"N" => "321"},
"this" => %{"S" => "this"}}}}
iex(30)> ExAws.Dynamo.get_item!("Graph-dev",%{id: "3",r: "a"}) |> ExAws.Dynamo.Decoder.decode
** (FunctionClauseError) no function clause matching in ExAws.Dynamo.Decoder.decode/1
lib/ex_aws/dynamo/decoder.ex:34: ExAws.Dynamo.Decoder.decode(["4", "5", "6"])
lib/ex_aws/dynamo/decoder.ex:47: anonymous fn/2 in ExAws.Dynamo.Decoder.decode/1
(stdlib) lists.erl:1261: :lists.foldl/3
lib/ex_aws/dynamo/decoder.ex:47: anonymous fn/2 in ExAws.Dynamo.Decoder.decode/1
(stdlib) lists.erl:1261: :lists.foldl/3
While I don't want to do tons of this, there's interest in adding Erlang-friendly clients so that Erlang users can use this instead of erlcloud. Right now Erlang users have to call 'Elixir.ExAws.S3':list_objects, which is awkward.
The following gives an example s3 client.
defmodule :ex_aws_s3 do
use ExAws.S3.Client
end
In Dynamo and other APIs, there is clear value in having ! corollaries to many of the regular functions. The question is, for something like get_item!, should it unwrap {:ok, result} to just result, or {:ok, %{"Item" => item}} to item? It's worth keeping in mind that if we do the latter, it invalidates certain options like :return_consumed_capacity, which add an attribute to the result map.
Basically the question boils down to what the ! asserts in something like scan! query! get_records! etc. Are we asserting that the requests succeeds in an HTTP sense or are we trying to have it do more? I'm presently inclined towards the former option, but will think it over more.
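To make the trade-off concrete, here is a sketch of both unwrapping styles against a hypothetical response shape. The module and function names are illustrative, not ex_aws API.

```elixir
defmodule BangSketch do
  # Variant A: assert only that the request succeeded at the HTTP level;
  # the full result map (including siblings like "ConsumedCapacity") survives.
  def get_item_a({:ok, result}), do: result
  def get_item_a({:error, reason}), do: raise("get_item failed: #{inspect(reason)}")

  # Variant B: also dig out "Item"; any sibling keys added by options such as
  # :return_consumed_capacity are silently discarded.
  def get_item_b({:ok, %{"Item" => item}}), do: item
  def get_item_b({:error, reason}), do: raise("get_item failed: #{inspect(reason)}")
end
```

Variant A composes better with response-shaping options, which matches the inclination stated above.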
Unless the s3:GetObjectAcl action is allowed, the Contents entries of the response XML will not contain this part, which seems to be expected by the parsing code:
<Owner><ID>...</ID><DisplayName>...</DisplayName></Owner>
Its absence leads to this crash:
** (Protocol.UndefinedError) protocol Enumerable not implemented for nil
(elixir) lib/enum.ex:1: Enumerable.impl_for!/1
(elixir) lib/enum.ex:112: Enumerable.reduce/3
(elixir) lib/stream.ex:695: Stream.do_enum_transform/8
(elixir) lib/stream.ex:647: Stream.do_transform/7
lib/sweet_xml.ex:484: anonymous fn/4 in SweetXml.continuation_opts/2
xmerl_scan.erl:565: :xmerl_scan.scan_document/2
xmerl_scan.erl:288: :xmerl_scan.string/2
lib/sweet_xml.ex:180: SweetXml.parse/2
Wondering if {:ok, %{}} is correct for an empty result. {:ok, nil} may be better; maybe not for get_item!, but get_item should return an empty map if you expect people to use case.
This smells a bit:
case Dynamo.get_item(..) do
  {:ok, map} when map == %{} -> ....
  # {:ok, %{}} always matches, so we can't use it with case...
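One way around that ambiguity is a map_size/1 guard: a bare %{} pattern matches any map, but map_size/1 is allowed in guard position and distinguishes the empty result. A minimal sketch:

```elixir
defmodule EmptySketch do
  # Classify a get_item-style response; map_size/1 is guard-safe,
  # so the empty map can get its own clause.
  def classify({:ok, map}) when map_size(map) == 0, do: :not_found
  def classify({:ok, map}), do: {:found, map}
  def classify({:error, reason}), do: {:error, reason}
end
```
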
The S3.Client list_objects! and get_object! functions do not raise a meaningful exception on error (as expected); they generate a MatchError instead.
defmodule ExAwsTest do
use ExUnit.Case
use ExAws.S3.Client, otp_app: Delorean.RankingsBuilder
test "abc" do
__MODULE__.list_objects!("foo", prefix: "bar/baz")
assert false
end
end
generates something like:
1) test abc (ExAwsTest)
test/unit/ex_aws_test.exs:5
** (MatchError) no match of right hand side value: {:error, {:http_error, 400, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AuthorizationHeaderMalformed</Code><Message>The authorization header is malformed; the region 'us-west-2' is wrong; expecting 'us-east-1'</Message><Region>us-east-1</Region><RequestId>C7DE1C18F8D90136</RequestId><HostId>x5m/0zmS4VlrUAzM/igI5iWwddeUYq7vozXq/x2+bmWDW6804WzntLsZ5e8m/Mqz</HostId></Error>"}}
stacktrace:
(ex_aws) lib/ex_aws/s3/impl.ex:57: ExAws.S3.Impl.list_objects!/3
test/unit/ex_aws_test.exs:6
Quoting Dave Thomas, Programming Elixir, p. 129: """The trailing exclamation point in the method name is an Elixir convention: if you see it, you know the function will raise an exception on error, and that exception will be meaningful.""" Sorry, it is not an LRM, but I do not think Elixir has an LRM.
Currently, the Dynamo decoder is set up to decode string data types with a value of "TRUE" to the atom true upon deserialization.
That absolutely makes sense when the dynamo type is boolean but I'm not so sure if it makes sense when the dynamo data type is string. In the case of a string data type, it seems like more of a bandaid for those that are not using the proper types when the better solution would be to simply fix their types.
I'm dealing with stock market data, and "TRUE" is a valid ticker symbol. This leads to strange behavior, since I cannot do this:
# Retrieve metadata
{:ok, %{"Item" => item}} = ExAws.Dynamo.get_item(table(), %{"symbol" => symbol})
metadata = ExAws.Dynamo.Decoder.decode(item, as: Metadata)
# Update metadata and save it back
new_metadata = update_metadata(metadata, new_data)
ExAws.Dynamo.put_item table(), new_metadata
Because decode transforms the "TRUE" string into an atom, put_item blows up with a validation exception, since I'm now trying to save a boolean value into a key field that is a string field:
{:error, {"ValidationException", "One or more parameter values were invalid: Type mismatch for key symbol expected: S actual: BOOL"}}
It seems like it makes sense to remove these two cases from the decode function:
def decode(%{"S" => "TRUE"}), do: true
def decode(%{"S" => "FALSE"}), do: false
What do you think?
Different services have different conventions about how errors are returned. Right now client_error/2 exists in the root request and only handles dynamo exceptions, which is bad. client_error should exist inside the individual service request modules and handle stuff there.
Hi Ben,
just updated the ex_aws dependency on my project from 0.4.8 to 0.4.10 and now get this error on compilation:
== Compilation error on file lib/ex_aws/s3/parsers.ex ==
** (CompileError) lib/ex_aws/s3/parsers.ex:3: module SweetXml is not loaded and could not be found
(elixir) expanding macro: Kernel.if/2
lib/ex_aws/s3/parsers.ex:2: ExAws.S3.Parsers (module)
(elixir) lib/kernel/parallel_compiler.ex:97: anonymous fn/4 in Kernel.ParallelCompiler.spawn_compilers/8
could not compile dependency ex_aws, mix compile failed. You can recompile this dependency with `mix deps.compile ex_aws` or update it with `mix deps.update ex_aws`
Has SweetXML become a mandatory dependency?
Thanks!
I may be wrong, but I don't seem to see this. Does ExAws currently provide any helper functions to build and/or sign expiring URLs for access to private objects on S3?
References:
From https://docs.aws.amazon.com/AmazonS3/latest/API/multiobjectdeleteapi.html
The Multi-Object Delete request contains a list of up to 1000 keys that you want to delete.
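A caller-side way to respect that 1000-key cap is to batch the keys before issuing requests. A minimal sketch (the module name is illustrative; Enum.chunk_every/2 requires Elixir 1.5+):

```elixir
defmodule DeleteChunks do
  @max_keys 1000

  # Split an arbitrary key list into Multi-Object Delete-sized batches;
  # each batch would become one DeleteObjects request.
  def batches(keys), do: Enum.chunk_every(keys, @max_keys)
end
```
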
https://github.com/CargoSense/ex_aws/blob/master/lib/ex_aws/dynamo/decoder.ex#L8
I think you mean:
items |> Stream.map(&decode(&1, as: struct_module))
Regardless, do you have any thoughts on the "correct" way to pipe results from a scan? Also, this is a good case for scan!; I saw your notes on stream_scan, but scan could still use this. It would require that Decoder know about "Item" vs "Items".
#my vote
Dynamo.scan!(@t_name) |> Dynamo.Decoder.decode(as: Foo)
# vs
Dynamo.scan!(@t_name)["Items"] |> Dynamo.Decoder.decode(as: Foo)
# vs.
{:ok, stuff} = Dynamo.scan(@t_name)
Dynamo.Decoder...
If a struct is nested inside another struct, Dynamo will not encode it as a map properly. While other maps, when encoded, return %{"M" => encoded_map}, structs just return encoded_map so that they can be used directly in Dynamo queries.
Ideally, the current usage of encode/1 for structs would be replaced by an encode_root/1 or encode(item, :root) function, and encode/1 would always return the %{"M" => encoded_map} form. How much of a breaking change this is needs to be evaluated.
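A minimal sketch of the proposed split, with a toy dynamization covering only strings, integers, and maps. The encode_root/1 name comes from the proposal above; everything else here is assumed for illustration.

```elixir
defmodule EncodeSketch do
  # encode/1 always wraps in the %{"M" => ...} form, suitable for nesting.
  def encode(value), do: %{"M" => encode_root(value)}

  # encode_root/1 returns the bare attribute map, suitable for the top level
  # of a put_item request.
  def encode_root(map) when is_map(map) do
    Enum.into(map, %{}, fn {k, v} -> {to_string(k), encode_value(v)} end)
  end

  defp encode_value(v) when is_binary(v), do: %{"S" => v}
  defp encode_value(v) when is_integer(v), do: %{"N" => Integer.to_string(v)}
  defp encode_value(v) when is_map(v), do: encode(v)
end
```

With this split, a nested map (or struct) always serializes to the %{"M" => ...} form, while the top level stays bare.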
I'm used to poking around in iex using h to figure out library APIs. Your hexdocs.pm docs are pretty good, but in iex the docs are awful:
I'm able to look at the source and hexdocs.pm to figure things out, but it would be great if you had usable docs available in iex. Actually, the reason I was even trying erlcloud was because I first tried ExAws, noticed the lack of iex docs and took that as a sign that ExAws is just a prototype so far, and that I should therefore look elsewhere for an S3 client. I guess I'm spoiled by many other elixir libraries having good docs available in iex :).
The float encoding uses Float.to_string/1, which means the results stored in Dynamo can differ from the ones in the source map. IMO, the encoding shouldn't implicitly transform the values it encodes.
e.g.
1.67 will be stored as "1.66999999999999992895e+00" instead of 1.67
A simple solution would be to use to_string/1 instead of Float.to_string/1. However, to_string/1 is much slower than Float.to_string/1.
Another solution would be to make the Float.to_string/1 options configurable.
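For illustration, Float.to_string/1 delegates to Erlang's float formatting, and :erlang.float_to_binary/2 with a decimals option plus :compact yields the shortest faithful decimal form:

```elixir
# Format to at most 15 decimal places, then trim trailing zeros,
# so 1.67 round-trips as "1.67" rather than a long scientific form.
s = :erlang.float_to_binary(1.67, [{:decimals, 15}, :compact])
# s == "1.67"
```
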
At the moment hackney pools are disabled which means:
Gold plating would be the ability to have a pool per service.
It doesn't look like a RANGE key is supported in key_schema for create_table. Is this intentional?
Hello,
When using temporary EC2 credentials based on an IAM role, you must pass an x-amz-security-token header when making requests to S3.
The token value is just the token value from the instance metadata.
More info at: http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html
Currently, all S3 requests using the temp credentials result in: The AWS Access Key Id you provided does not exist in our records.
This should go away once the header is added in.
I can't get any request to a bucket in a region other than "US Standard" to work.
iex(4)> ExAws.S3.list_objects "bucket"
Request URL: "https://bucket.s3.amazonaws.com/"
...
** (CaseClauseError) no case clause matching: {:ok, %HTTPoison.Response{body: "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>TemporaryRedirect</Code><Message>Please re-send this request to the specified temporary endpoint. Continue to use the original request endpoint for future requests.</Message><Bucket>bucket</Bucket><Endpoint>bucket.s3-eu-west-1.amazonaws.com</Endpoint><RequestId>...</RequestId><HostId>...</HostId></Error>", headers: [{"x-amz-bucket-region", "eu-west-1"}, {"x-amz-request-id", "..."}, {"x-amz-id-2", "..."}, {"Location", "https://shim-wtf.s3-us-west-2.amazonaws.com/"}, {"Content-Type", "application/xml"}, {"Transfer-Encoding", "chunked"}, {"Date", "Tue, 08 Sep 2015 13:04:34 GMT"}, {"Server", "AmazonS3"}], status_code: 307}}
(ex_aws) lib/ex_aws/request.ex:33: ExAws.Request.request_and_retry/7
(ex_aws) lib/ex_aws/s3/impl.ex:52: ExAws.S3.Impl.list_objects/3
Configuring s3.region doesn't make any difference...
It would be useful to see example tests for query and stream_query for people not familiar with DynamoDB query expressions. You might also want to clarify your position on QueryFilter and whether it is going to be supported.
I could not get the following to work:
user = %User{admin: false, age: 23, email: "[email protected]", name: "Bubba"}
Dynamo.scan("Users",
limit: 12,
exclusive_start_key: [email: "email"],
expression_attribute_names: [email: "#email"],
expression_attribute_values: [email: "[email protected]"],
filter_expression: "#email = :email")
{:error,
{:http_error, 400,
"{\"__type\":\"com.amazon.coral.validate#ValidationException\",\"Message\":\"ExpressionAttributeNames contains invalid key: Syntax error; key: \\\"email\\\"\"}"}}
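For reference, in the DynamoDB API, ExpressionAttributeNames maps the "#placeholder" back to the real attribute name (not the other way around), and value placeholders start with ":". A request fragment the service accepts looks roughly like this (the attribute and value are illustrative):

```elixir
# ExpressionAttributeNames keys must start with "#";
# ExpressionAttributeValues keys must start with ":".
request = %{
  "FilterExpression" => "#email = :email",
  "ExpressionAttributeNames" => %{"#email" => "email"},
  "ExpressionAttributeValues" => %{":email" => %{"S" => "[email protected]"}}
}
```

The error above fits this reading: the keyword list `[email: "#email"]` serializes with the key "email", which fails the "#"-prefix validation.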
https://github.com/CargoSense/ex_aws/blob/master/lib/ex_aws/s3/impl.ex#L375
Basically, there is no place to put the part's body.
== Compilation error on file lib/ex_aws/s3/parsers.ex ==
** (CompileError) lib/ex_aws/s3/parsers.ex:3: module SweetXml is not loaded and could not be found
(elixir) expanding macro: Kernel.if/2
lib/ex_aws/s3/parsers.ex:2: ExAws.S3.Parsers (module)
(elixir) lib/kernel/parallel_compiler.ex:100: anonymous fn/4 in Kernel.ParallelCompiler.spawn_compilers/8
In addition, the sweet_xml dependency requires an older version:
Looking up alternatives for conflicting requirements on sweet_xml
Activated version: 0.5.0
From mix.exs: ~> 0.5.0
From ex_aws v0.4.11: ~> 0.2.1
It would be nice to either make this truly optional or at least upgrade the dep to the latest version.
I'm looking for a way to safely change AWS credentials for each request; based on the request, I determine which account and bucket to use.
Thanks for ExAws and your help!
Dynamo queries that take a primary key need to support more than one key attribute, because primary keys can be composed of a hash and a range.
It would be great to have stream_query!(...) and so on.
When using S3.get_object! with a key that contains a : character, I get a 403 SignatureDoesNotMatch error from AWS. Keys that do not contain this character do not have this problem. I'm guessing there's some bug in the way the signing logic encodes the : character.
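A plausible angle on the fix: AWS Signature Version 4 canonical requests require RFC 3986-style encoding, in which ":" in a key must become "%3A". Elixir's URI.encode/2 with the unreserved-character predicate does exactly that (the key name below is illustrative):

```elixir
# Percent-encode everything outside RFC 3986 unreserved characters
# (A-Z, a-z, 0-9, "-", ".", "_", "~"), so ":" becomes "%3A".
encoded = URI.encode("2015:09:08.json", &URI.char_unreserved?/1)
# "2015%3A09%3A08.json"
```

If the signing code uses a laxer encoder that leaves ":" bare, the canonical request it signs will not match what S3 computes, producing exactly a SignatureDoesNotMatch error.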
I updated arc recently and moved from erlcloud to ex_aws. Now the image upload works with local storage, but it cannot upload to S3 using ex_aws; it throws a timeout error. I tried with the latest version of ex_aws and even with the current master branch. I previously had HTTPoison version 0.8, so I also tried 0.7.4. I tried with a region and without it. The error still exists. What am I doing wrong? Could someone help me?
# config.exs
config :ex_aws,
region: "eu-west-1",
access_key_id: [{:system, "MY_AWS_ACCESS_KEY_ID"}, :instance_role],
secret_access_key: [{:system, "MY_AWS_SECRET_ACCESS_KEY"}, :instance_role]
Based on these docs, I expected the following to work:
# lib/delorean/s3.ex
defmodule Delorean.S3 do
use ExAws.S3.Client, opt_app: :delorean
end
# config/config.exs
use Mix.Config
config :delorean, ExAws,
s3: [region: "us-west-2", scheme: "https://", host: "s3.amazonaws.com"]
import_config "s3_creds.exs"
# config/s3_creds.exs
use Mix.Config
config :delorean, ExAws,
access_key_id: "redacted",
secret_access_key: "redacted"
However, I get an error:
** (RuntimeError) A valid configuration root is required in your s3 client
stacktrace:
(ex_aws) lib/ex_aws/config.ex:23: ExAws.Config.get/1
(ex_aws) lib/ex_aws/config.ex:14: ExAws.Config.build/2
(delorean) lib/delorean/s3.ex:2: Delorean.S3.list_objects!/2
I can get it to work by changing lib/delorean/s3.ex
to:
defmodule Delorean.S3 do
use ExAws.S3.Client
def config_root do
Application.get_all_env(:delorean)[ExAws]
end
end
...but that doesn't seem necessary. What am I doing wrong? Or are your docs inaccurate?
https://github.com/aws/aws-sdk-go/tree/master/apis provides a JSON structure with all the documentation from AWS for each action. If it's possible to easily generate the documentation from this that might be a good idea.
Instead of using guard clauses, dynamo should have dynamization protocol that is implemented for various data types.
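A minimal sketch of what such a protocol could look like, with per-type implementations replacing guard clauses. The protocol and function names are hypothetical, and only two types are covered here:

```elixir
defprotocol Dynamizable do
  @doc "Convert an Elixir value into its DynamoDB attribute-value form."
  def dynamize(value)
end

defimpl Dynamizable, for: BitString do
  # strings map to the DynamoDB "S" type
  def dynamize(s), do: %{"S" => s}
end

defimpl Dynamizable, for: Integer do
  # numbers are transmitted as strings under the "N" type
  def dynamize(n), do: %{"N" => Integer.to_string(n)}
end
```

The advantage over guards is extensibility: users can add `defimpl` for their own structs without touching the library.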
This is going to be a very unsatisfactory issue report, I apologise in advance.
This was reported by @ericmj and experienced by myself in one of our projects.
It seems that sometimes uploads to S3 stop working. No error is being raised and it looks like they just stall. Data doesn't make it into S3 but again: No error.
Best guess right now is that requests are timing out and the pool only holds one connection.
Possible fixes:
I get the following when running the scan from your dynamodb test
iex(27)> Dynamo.scan("Users",
...(27)> limit: 12,
...(27)> exclusive_start_key: [api_key: "api_key"],
...(27)> expression_attribute_names: [api_key: "#api_key"],
...(27)> expression_attribute_values: [api_key: "asdfasdfasdf", name: "bubba"],
...(27)> filter_expression: "ApiKey = #api_key and Name = :name")
{:error,
{:http_error, 400,
"{\"__type\":\"com.amazon.coral.validate#ValidationException\",\"Message\":\"ExpressionAttributeNames contains invalid key: Syntax error; key: \\\"api_key\\\"\"}"}}
The lazy functions could be lazier. The issue is that Stream functions like unfold/2 and resource/3 calculate the N+1th value at the same time that you retrieve the Nth value. For example:
defp build_scan_stream(initial, request_fun) do
Stream.resource(fn -> initial end, fn
:quit -> {:halt, nil}
{:error, items} -> {[{:error, items}], :quit}
{:ok, %{"Items" => items, "LastEvaluatedKey" => key}} ->
{items, request_fun.(%{ExclusiveStartKey: key})}
{:ok, %{"Items" => items}} ->
{items, :quit}
end, &pass/1)
end
request_fun.(%{ExclusiveStartKey: key}) is called at the time the initial item set is added to the accumulator. Suppose initial looks like {:ok, %{"Items" => [1,2,3], "LastEvaluatedKey" => 3}}. This matches the {:ok, %{"Items" => items, "LastEvaluatedKey" => key}} clause, which means that request_fun is called EVEN if you only call Enum.take(1) on the stream. What would be desirable is some way to avoid calling request_fun until its result is actually needed.
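One way to get that behavior, sketched here under the same response shapes, is to stash the next request behind a zero-arity thunk in the accumulator and force it only when the consumer continues past the current page. The module name is illustrative.

```elixir
defmodule LazyScan do
  # Lazier variant: rather than eagerly calling request_fun while emitting
  # the current page, keep a thunk and call it only on the next iteration.
  def build_scan_stream(initial, request_fun) do
    Stream.resource(
      fn -> {:page, initial} end,
      fn
        :quit -> {:halt, nil}
        {:thunk, thunk} -> handle(thunk.(), request_fun)
        {:page, page} -> handle(page, request_fun)
      end,
      fn _ -> :ok end
    )
  end

  defp handle({:error, reason}, _request_fun), do: {[{:error, reason}], :quit}

  defp handle({:ok, %{"Items" => items, "LastEvaluatedKey" => key}}, request_fun) do
    # the next request is deferred behind a thunk, not executed here
    {items, {:thunk, fn -> request_fun.(%{ExclusiveStartKey: key}) end}}
  end

  defp handle({:ok, %{"Items" => items}}, _request_fun), do: {items, :quit}
end
```

With this shape, Enum.take(stream, 1) consumes only the first page and never issues the follow-up request.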
Having to pipe to decode seems a bit tedious; would you consider some shortcuts that return a decoded item or list? Something like:
get_map(table_name,primary_key) :: {Atom, Map}
query_maps( table_name,primary_key) :: {Atom,List}
You could also do:
get_struct(table_name,primary_key,struct_name)
query_structs(...)
Also, it is unclear how to use a primary key and a range key with get_item. Finally, I'm wondering if there is a way to annotate the primary key, range key, and even table name in the module. It would be nice to derive these for things like:
defmodule User do
@table_name "Users"
@primary_key "email"
@range_key "age"
@derive [ExAws.Dynamo.Encodable]
defstruct [:email, :name, :age, :admin]
end
user = %User{admin: false, age: 23, email: "[email protected]", name: "Bubba"}
put_item(user)
new_user = %User{admin: true, age: 33, email: "[email protected]", name: "Unf"}
update_item(new_user)
Hi Ben!
I am using version 0.4.8.
I am getting the following error when putting an item into Dynamo:
** (exit) an exception was raised:
** (Protocol.UndefinedError) protocol ExAws.Dynamo.Encodable not implemented for %Service.Operation{oid: "1442-696618-318045", type: "2"}
(ex_aws) lib/ex_aws/dynamo/encodable.ex:1: ExAws.Dynamo.Encodable.impl_for!/1
(ex_aws) lib/ex_aws/dynamo/encodable.ex:6: ExAws.Dynamo.Encodable.encode/1
(ex_aws) lib/ex_aws/dynamo/impl.ex:147: ExAws.Dynamo.Impl.put_item/4
Since I have declared the Encodable protocol in the Operation module I am really confused by this error. Here is a simplified version of the Operation module, which still causes the exception:
defmodule Service.Operation do
alias __MODULE__
@derive [ExAws.Dynamo.Encodable]
defstruct [:oid, :type]
def setup_store do
ExAws.Dynamo.create_table("Operations", "oid", %{oid: :string}, 1, 1)
end
def new(oid, _, _, _) do
%Operation{type: "2",
oid: oid
}
end
def write(operation) do
ExAws.Dynamo.put_item("Operations", operation)
end
end
Obviously I must be doing something wrong but I honestly can't see it... I'd appreciate some help.
Thank you again Ben!
Hi!
First of all, thank you so much for putting so much time and dedication into this library. I truly appreciate it! I think it's awesome that you made it possible to get the credentials via the EC2 role!
I am facing some problems when running my Elixir application from within an EC2 instance inside an ECS container. All requests time out.
This is my ex_aws config:
[dynamodb: [
scheme: {:system, "DYNAMODB_SCHEME"},
host: {:system, "DYNAMODB_HOST"},
port: {:system, "DYNAMODB_PORT"},
region: "eu-west-1"
],
included_applications: [],
access_key_id: [{:system, "AWS_ACCESS_KEY_ID"}, :instance_role],
secret_access_key: [{:system, "AWS_SECRET_ACCESS_KEY"}, :instance_role]]
The last two OS env variables are undefined, so the :instance_role is used.
I have put an IO.inspect on ExAws.Request.request/5 to better understand what might be the problem:
Body of the request:
%{"AttributeDefinitions" => [%{"AttributeName" => :uid,
"AttributeType" => "S"}],
"KeySchema" => [%{"AttributeName" => "uid", "KeyType" => "HASH"}],
"ProvisionedThroughput" => %{"ReadCapacityUnits" => 1,
"WriteCapacityUnits" => 1}, "TableName" => "facebook_service_users"}
Headers:
[{"Authorization",
"AWS4-HMAC-SHA256 Credential=/20150626/eu-west-1/dynamodb/aws4_request,SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-target,Signature=668....f"},
{"host", "dynamodb.eu-west-1.amazonaws.com"},
{"x-amz-date", "20150626T155125Z"},
{"x-amz-target", "DynamoDB_20120810.CreateTable"},
{"content-type", "application/x-amz-json-1.0"}, {"x-amz-content-sha256", ""}]
URL:
https://dynamodb.eu-west-1.amazonaws.com:80/
This is the error I get in the logs:
ExAws: HTTP ERROR: :connect_timeout
Just to rule out a problem on the IAM side, I tried to create a DynamoDB table from the EC2 instance via awscli, and that worked just fine. Therefore, the problem is at the application level.
I am running my application with MIX_ENV=test.
I would appreciate it if you would share with me any hints on what could be the problem.
Thank you a lot!
Hello,
What would be the best way to fix this conflicting version issue?
Looking up alternatives for conflicting requirements on poison
From ex_aws v0.4.9: ~> 1.2.0
From phoenix v1.0.2: ~> 1.3
** (Mix) Hex dependency resolution failed, relax the version requirements or unlock dependencies
Thanks!
Hi, thanks for making this library!
I'm looking into using it in our evercam-media application, and everything worked nicely in development, but I'm having problems making it run in the production environment. We use OTP releases generated with exrm, and that seems to conflict with assumptions that ex_aws makes. I first thought the problem was just that mix wasn't included in the release, but adding it uncovered another issue.
This is the error I'm getting:
[error] %ArgumentError{message: "argument error"}
[error] (stdlib) :ets.lookup(Mix.State, :env)
(mix) lib/mix/state.ex:33: Mix.State.fetch/1
(mix) lib/mix/state.ex:42: Mix.State.get/2
(ex_aws) lib/ex_aws/config/defaults.ex:3: ExAws.Config.Defaults.defaults/0
(ex_aws) lib/ex_aws/config.ex:26: ExAws.Config.get/1
(ex_aws) lib/ex_aws/config.ex:14: ExAws.Config.build/2
(ex_aws) lib/ex_aws/s3.ex:2: ExAws.S3.put_object!/3
(evercam_media) lib/snapshots/snapshot_fetch.ex:69: EvercamMedia.Snapshot.store/4
You can see the code I'm using at: https://github.com/evercam/evercam-media/pull/42/files
Any suggestions, am I doing something wrong?