codegenieapp / serverless-express

Run Express and other Node.js frameworks on AWS Serverless technologies such as Lambda, API Gateway, Lambda@Edge, and more.

Home Page: https://codegenie.codes

License: Apache License 2.0

JavaScript 99.74% EJS 0.05% Shell 0.21%
alb api-gateway aws aws-lambda aws-serverless dynamodb express express-js expressjs hacktoberfest lambda node node-js nodejs sam serverless serverless-applications serverless-express serverless-framework vendia

serverless-express's People

Contributors

brett-vendia, brettstack, cnuss, codpot, colonist4, dependabot[bot], ferjul17, forabi, fredericgermain, gitter-badger, h4ad, htchaan, jeremylevy, jongreenwood-dunelm, jun711, kahouieong, lightningspirit, martoncsikos, michaelmerrill, nogut0123, octavianmindera, ovalba, panva, sapessi, semantic-release-bot, shamshiel, shawnsparks-work, warfox, xavm, y13i


serverless-express's Issues

npm run win-local doesn't work

This lib seems to use Unix sockets for the proxy, which won't work on Windows. I was able to get it to work by modifying the http request and server listen calls to use localhost + port.
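A minimal sketch of that workaround, assuming a hypothetical local entry point (local.js) that serves the app over TCP instead of a Unix socket; the port is an arbitrary choice:

// local.js (hypothetical): listen on localhost + port so local testing also
// works on Windows, where Unix domain socket paths are not available.
const http = require('http')
const app = require('./app')

const server = http.createServer(app)
server.listen(3000, 'localhost', () => {
  console.log('Listening on http://localhost:3000')
})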

[BUG] package-function doesn't remove deleted files

Hello, I ran into a problem after deleting a file that had already been uploaded: the package-function script only adds new files and updates existing ones, so if you delete a file you must delete lambda-function.zip before compressing a new one.

Just update package-function to:

"package-function": "zip -FSr -q -r lambda-function.zip lambda.js app.js index.html node_modules",

"listener" argument must be a function

When I call awsServerlessExpress.createServer with my exported Express app as the first parameter, I get the error:

"listener" argument must be a function
  at _addListener (events.js:216)
  at Server.addListener (events.js:270)
  at new Server (_http_server.js:231)
  at Object.exports.createServer (http.js:44)
  at Object.exports.createServer (./node_modules/aws-serverless-express/index.js:134)

Listener appears to be the first parameter in the createServer function. I don't understand how listener could be a function if it's my express app.

My full code reads:

const awsServerlessExpress = require('aws-serverless-express');
const app = require('./dist/server');
const server = awsServerlessExpress.createServer(app);

exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context);

Any ideas?
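For reference, a sketch of one common cause, assuming ./dist/server was transpiled from an ES module with a default export; in that case require() returns a module object rather than the Express app function, and createServer then receives a non-function listener:

// lambda.js - unwrap a default export if the build produced one
// (assumption; check your build output).
const awsServerlessExpress = require('aws-serverless-express')
const serverModule = require('./dist/server')
const app = serverModule.default || serverModule
const server = awsServerlessExpress.createServer(app)

exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context)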

[Question] Internal server error when request method OPTIONS

Has anyone had the same issue? When I make an OPTIONS request via Postman, the response is
{"message": "Internal server error"}, but when I try the API Gateway console, the Lambda console, npm run local, or npm run invoke-lambda, the response is fine. Any idea what the problem is?
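Not a confirmed root cause, but one generic way to make Express answer OPTIONS preflight requests explicitly, assuming the cors package is installed:

const express = require('express')
const cors = require('cors')
const app = express()

// cors() answers OPTIONS preflight requests with 204 by default,
// so they never fall through to an unhandled route.
app.use(cors())
app.options('*', cors())

app.get('/items', (req, res) => res.json([]))

module.exports = app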

How to enable compression (gzip)?

Hello, how do I enable compression? I added the compression library:

app.use(compression());
app.use(cors());
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use(awsServerlessExpressMiddleware.eventContext());

I also added Accept-Encoding to the request.

The response is:

{ 'x-powered-by': 'Express',
     'access-control-allow-origin': '*',
     'content-type': 'application/json; charset=utf-8',
     'content-length': '331',
     etag: 'W/"14b-aAEboC6Fi//N5cLk6Fl9hQ"',
     vary: 'Accept-Encoding',
     date: 'Mon, 02 Jan 2017 17:59:22 GMT',
     connection: 'close' }

Why doesn't the response contain the following header?

Content-Encoding: gzip
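For context, a sketch of the conditions under which compression emits Content-Encoding: gzip: the request must advertise Accept-Encoding: gzip and the body must exceed the threshold (1 kB by default), and with aws-serverless-express the compressed body then has to be treated as binary. The threshold override and MIME list below are illustrative assumptions, not project defaults:

const express = require('express')
const compression = require('compression')
const awsServerlessExpress = require('aws-serverless-express')

const app = express()
// Compress every response regardless of size (the default threshold is 1kb),
// but only when the request sends an Accept-Encoding header that allows gzip.
app.use(compression({ threshold: 0 }))
app.get('/data', (req, res) => res.json({ hello: 'world' }))

// A gzipped body is binary, so its content type must be listed when creating
// the server (and in API Gateway's binary media types) to survive the proxy.
const binaryMimeTypes = ['application/json']
const server = awsServerlessExpress.createServer(app, null, binaryMimeTypes)
exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context)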

POST body is not in the apiGateway.event

The body of a POST request is not being put into the apiGateway.event object of the request. I'm not sure whether you'll consider this a defect, but it was surprising that it wasn't there, given that it typically is when not using this package (e.g. when you write a typical Lambda).

A simple fix would be to change the middleware to pluck it from the request.body value and put it in request.apiGateway.event.
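A rough sketch of that suggested workaround (not part of the package), assuming body-parser and the eventContext middleware are registered first:

const express = require('express')
const bodyParser = require('body-parser')
const awsServerlessExpressMiddleware = require('aws-serverless-express/middleware')

const app = express()
app.use(bodyParser.json())
app.use(awsServerlessExpressMiddleware.eventContext())

// Hypothetical workaround: copy the parsed body back onto the API Gateway
// event so handlers that read req.apiGateway.event.body can see it.
app.use((req, res, next) => {
  if (req.apiGateway && req.apiGateway.event && req.body !== undefined) {
    req.apiGateway.event.body = req.body
  }
  next()
})

module.exports = app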

req.originalUrl will not guarantee URI is exactly the same as incoming requests

Hi,

A webhook that I'm listening to signs requests with the entire path + queryString.
However, req.originalUrl does not guarantee that the exact same URI string is accessible to my Lambda function, because getPathWithQueryStringParams (https://github.com/awslabs/aws-serverless-express/blob/61d34f075d193e2b6ee83adbb9e6dfd45db4b98e/index.js#L18-L26) reconstructs the query string from an object of parameters, so the order of my query string is not identical to the signed request. Fixing this would require API Gateway to pass the raw URI in the context, but I don't see anything in http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-mapping-template-reference.html#context-variable-reference that would allow it. This may not be doable with the current API Gateway, but it would be a good fix to keep in mind.

Thanks

Unknown output type: JSON

Hey Guys,

I'm getting this error when I run npm run setup - the stack gets created but any subsequent updates won't deploy.

> aws cloudformation create-stack --stack-name $npm_package_config_cloudFormationStackName --template-body file://./cloudformation.json --capabilities CAPABILITY_IAM --parameters ParameterKey=AwsServerlessExpressS3Bucket,ParameterValue=$npm_package_config_s3BucketName --region $npm_package_config_region


Unknown output type: JSON

npm ERR! Darwin 16.0.0
npm ERR! argv "/Users/gb/.nvm/versions/node/v4.3.2/bin/node" "/Users/gb/.nvm/versions/node/v4.3.2/bin/npm" "run" "create-stack"
npm ERR! node v4.3.2
npm ERR! npm  v2.14.12
npm ERR! code ELIFECYCLE
npm ERR! [email protected] create-stack: `aws cloudformation create-stack --stack-name $npm_package_config_cloudFormationStackName --template-body file://./cloudformation.json --capabilities CAPABILITY_IAM --parameters ParameterKey=AwsServerlessExpressS3Bucket,ParameterValue=$npm_package_config_s3BucketName --region $npm_package_config_region`
npm ERR! Exit status 255
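This error usually comes from the AWS CLI itself rather than from the npm scripts: the output format in ~/.aws/config is case-sensitive and must be lowercase. A sketch of the fix, assuming the default profile is the one in use:

# ~/.aws/config should read, e.g.
#   [default]
#   region = us-east-1
#   output = json        <-- lowercase, not JSON
#
# or set it from the command line:
aws configure set output json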

The header content contains invalid characters

When running a local test with the following api-gateway-event.json:

{
  "httpMethod": "GET",
  "//body": "{\"name\": \"Jake\"}",
  "path": "/users",
  "resource": "/{proxy+}",
  "queryStringParameters": {
    "foo": "💩"
  },
  "pathParameters": {},
  "headers": {},
  "requestContext": {}
}

the following error occurs:

TypeError: The header content contains invalid characters
    at ClientRequest.OutgoingMessage.setHeader (_http_outgoing.js:358:11)
    at new ClientRequest (_http_client.js:86:14)
    at Object.exports.request (http.js:31:10)
    at forwardRequestToNodeServer (/Users/johnDoe/Developer/github.com/aws-serverless-express/example/node_modules/aws-serverless-express/index.js:115:22)
    at Server.startServer.on.error (/Users/johnDoe/Developer/github.com/aws-serverless-express/example/node_modules/aws-serverless-express/index.js:165:21)
    at emitNone (events.js:91:20)
    at Server.emit (events.js:185:7)
    at emitListeningNT (net.js:1285:10)
    at _combinedTickCallback (internal/process/next_tick.js:71:11)
    at process._tickCallback (internal/process/next_tick.js:98:9)

Any idea how to avoid this? Basically, anyone adding a weird character (an emoji) to a query parameter can now crash the app.
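HTTP header values cannot carry raw non-ASCII characters, which is why an emoji in a forwarded value blows up setHeader. A generic sketch of an encode/decode round trip that avoids the crash; the helper names are hypothetical and not part of the library's API:

// Percent-encode anything (like an emoji) before placing it in a header,
// and decode it when reading it back out on the server side.
function encodeHeaderValue (value) {
  return encodeURIComponent(JSON.stringify(value))
}

function decodeHeaderValue (headerValue) {
  return JSON.parse(decodeURIComponent(headerValue))
}

const event = { queryStringParameters: { foo: '💩' } }
const headerSafe = encodeHeaderValue(event)          // safe to pass to setHeader()
const roundTripped = decodeHeaderValue(headerSafe)
console.log(roundTripped.queryStringParameters.foo)  // '💩'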

Error: EADDRINUSE /tmp/server0.sock incrementing socketPathSuffix

Hi,

When I run my Express app on Lambda, I sometimes get this error:
EADDRINUSE /tmp/server0.sock incrementing socketPathSuffix.
EADDRINUSE /tmp/server1.sock incrementing socketPathSuffix.
EADDRINUSE /tmp/server2.sock incrementing socketPathSuffix.
EADDRINUSE /tmp/server3.sock incrementing socketPathSuffix.
EADDRINUSE /tmp/server4.sock incrementing socketPathSuffix.
EADDRINUSE /tmp/server5.sock incrementing socketPathSuffix.
EADDRINUSE /tmp/server6.sock incrementing socketPathSuffix.

When a call gets this error it takes around 1800 ms to resolve (compared to 3-8 ms normally). Is this related to aws-serverless-express or to my app?

Cognito Authorization

Can you include a quick write-up on how to add Cognito authentication, and how it fits into the YAML config?

Windows Compatibility

Hi,
this is great work and the proxy API concept will really really make things a lot easier!
Since I'm one of the few developing Node on a Windows machine (actually there are quite a few of us ...), I'd really love to see this library work seamlessly on Windows as well.

It took me just an hour to set up a medium-complexity project (about 10 endpoints, token authentication middleware, several folders, a host of third-party libraries, and a bunch of additional AWS services to grant authorization to), and I ran into only 2 issues, both easily resolved or worked around since they relate to command scripting in package.json:

  1. zip - "package-function": "zip -q -r lambda-function.zip lambda.js app.js node_modules ...". The best practice for command-line zip on Windows is to use 7-Zip, which is open source and doesn't need installation, so this command would translate into 7z a lambda-function.zip -r lambda.js app.js node_modules\ .... Would it be possible to have a platform-dependent variant of the script, something like "package-function-win"? (See the sketch after this message.)
  2. Opening the CloudFormation console from the command line - open https://console.aws.amazon.com/cloudformation/home?region=$npm_package_config_region in create-stack. This is not really a Windows issue; after erroring on zip I actually used Bash on Ubuntu on Windows (the new Win10 Ubuntu subsystem) to make it work, but it is related to not having a GUI at all. I'm not a Linux guy, so it took me a while to figure out where the error Couldn't get a file descriptor referring to the console was coming from and what it really meant: I finally went back to my browser (in Windows) to discover that everything had completed and was perfectly up and running! So my 2 cents here would be to take the open command out of the script and let users open the link in a browser themselves.

These are really minor things!
Thanks
paolo
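A sketch of the platform-specific script variant suggested in point 1 above, assuming 7-Zip's 7z executable is on the PATH; the script name is only the author's suggestion, not something the project ships, and the exact flags may need adjusting:

{
  "scripts": {
    "package-function": "zip -q -r lambda-function.zip lambda.js app.js node_modules",
    "package-function-win": "7z a lambda-function.zip lambda.js app.js node_modules"
  }
}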

README version update needed

Under "cons", the README states, "Currently limited to Node.js 4.3 (LTS)." It should be "4.3 and 6.10", although I'm not sure this really even rates as a con anymore. Even if it is, we can say that you can use babel to achieve whatever language features you want.

How to switch the stage from prod to test?

This repository code currently creates a stage 'prod', which is mentioned in:

  • cloudformation.yaml
  • packaged-sam.yaml
  • simple-proxy-api.yaml

I've changed the stage name to 'test' in the above files, which created the test stage successfully; however, the prod stage was removed.

How can I create additional stages while retaining 'prod'?

Thanks.

Response problem.

@lucianopf that's most likely an issue with buffering in aws-serverless-express, I suggest reporting the issue on their github repo

As directed by @gojko on the ClaudiaJS Gitter channel, I'll post my problem here:

Good afternoon community! =D
I'm using Claudia.js as a proxy to an Express application and it is working perfectly, with only one exception.
In one of my endpoints the user posts a username/password and should receive a token, which is pretty big but nothing ridiculous.
(This works on any instance running this Express app.)
It looks like the following:

{
  "success": true,
  "message": "Enjoy your token!",
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyIkX18iOnsic3RyaWN0TW9kZSI6dHJ1ZSwiZ2V0dGVycyI6e30sIndhc1BvcHVsYXRlZCI6ZmFsc2UsImFjdGl2ZVBhdGhzIjp7InBhdGhzIjp7ImVtYWlsIjoiaW5pdCIsInBhc3N3b3JkIjoiaW5pdCIsInVzZXJuYW1lIjoiaW5pdCIsIm5hbWUiOiJpbml0Iiwicm9sZSI6ImluaXQiLCJfX3YiOiJpbml0IiwiX2lkIjoiaW5pdCJ9LCJzdGF0ZXMiOnsiaWdub3JlIjp7fSwiZGVmYXVsdCI6e30sImluaXQiOnsiX192Ijp0cnVlLCJyb2xlIjp0cnVlLCJuYW1lIjp0cnVlLCJ1c2VybmFtZSI6dHJ1ZSwicGFzc3dvcmQiOnRydWUsImVtYWlsIjp0cnVlLCJfaWQiOnRydWV9LCJtb2RpZnkiOnt9LCJyZXF1aXJlIjp7fX0sInN0YXRlTmFtZXMiOlsicmVxdWlyZSIsIm1vZGlmeSIsImluaXQiLCJkZWZhdWx0IiwiaWdub3JlIl19LCJlbWl0dGVyIjp7ImRvbWFpbiI6bnVsbCwiX2V2ZW50cyI6e30sIl9ldmVudHNDb3VudCI6MCwiX21heExpc3RlbmVycyI6MH19LCJpc05ldyI6ZmFsc2UsIl9kb2MiOnsicm9sZSI6ImRlZmF1bHQiLCJfX3YiOjAsIm5hbWUiOiJsdWNpYW5vIiwidXNlcm5hbWUiOiJsdWNpYW5vcGYiLCJwYXNzd29yZCI6InFXdW1qblQwRisrN3owWFNLbHlGUWU2UEZobWpraVA5c0tqcks3cVpFNUhXSm90SGd3ZXRpVzVWWkFZWDAwUUxwcUd0MUFpQWdTR0VDVHlxdEdoM05QTVlCbS85azZIcjU4aTFWY3c4YUI5Y0xKUnJUNjJHN092bTQ1eVFUTE5PWjhYc1FHcUViTFZlczQxQ005Sk1neHIrQ2gxWlBQUlJsYWY3bmw5U1VtMD08fHw-TWRoMXhGRExlb3hONzRycWthRFFHaFFiRU9lRTg3WnExNGFqbjVxVjk2cy9VRHZZMFZoVy8yelRSb0JoQld0MVpBRWU4RHd4M3FncjNNZkJVUzU1NUgzZVl6MkNzOGNyNElCUVhKQUdEMjhucmtaY1ZRMCtZVHYwbXkyeTczQ2ZKOVN1SzFLcFh0VDBvNzRCTVFpVGowRFFxdENqRUQ3L1lTcmM2Mk9vbE80OEpmUmVJR1lHWTRVZUtqM3QrMGxObmRVd090dXppdWs5WjZmU3g3UTJRcDc1OUpSN1hGQnMrT2pTemtGYjQ2NGlmUk45aW1iazhxTEZDMU9GSjl0ZS9MeDRYcVJCUDRxWEFNUU90c0diaDhTaTJuc0tDS2hpVDZQUjFSMXIyaFdTd1UySWtoaHVRbGZnQmxFeHAwRERqZnZBaGZybDJmNGFmaWZjem5zUTF3PT0iLCJlbWFpbCI6Imx1Y2lhbm9wZkBvdXRsb29rLmNvbSIsIl9pZCI6IjU4OTQ5YjBiYjlhNjdhMDZlYTZhZDQwZiJ9LCJfcHJlcyI6eyIkX19vcmlnaW5hbF9zYXZlIjpbbnVsbCxudWxsLG51bGxdLCIkX19vcmlnaW5hbF92YWxpZGF0ZSI6W251bGxdLCIkX19vcmlnaW5hbF9yZW1vdmUiOltudWxsXX0sIl9wb3N0cyI6eyIkX19vcmlnaW5hbF9zYXZlIjpbXSwiJF9fb3JpZ2luYWxfdmFsaWRhdGUiOltdLCIkX19vcmlnaW5hbF9yZW1vdmUiOltdfSwiaWF0IjoxNDg3MDA5MzE0LCJleHAiOjE0ODcwMjM3MTR9.j-C0NO3FnCRsIafaq9YlZFbXsGQRKyFLE2OSJh5m3XE"
}

But when I call my Lambda function (API Gateway endpoint) I get this as the response:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<HTML>
    <HEAD>
        <META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
        <TITLE>ERROR: The request could not be satisfied</TITLE>
    </HEAD>
    <BODY>
        <H1>ERROR</H1>
        <H2>The request could not be satisfied.</H2>
        <HR noshade size="1px">
CloudFront attempted to establish a connection with the origin, but either the attempt failed or the origin closed the connection.

        <BR clear="all">
        <HR noshade size="1px">
        <PRE>
Generated by cloudfront (CloudFront)
Request ID: zLnwkqISkpsRVfDvUPRy1V25mJxPVfZ7UsjtYm8x0GtjEZOUWlwxwA==
</PRE>
        <ADDRESS></ADDRESS>
    </BODY>
</HTML>

When I reduce the token in this payload to ~900 characters it works (approximately 1 kB as a response).
Do you know if there's such a low limit?
And if there is, do you know how to solve this issue? =(

questions about jade route

I am new to serverless and am trying to get my simple Express site working on AWS Lambda + API Gateway.

When testing, you need to append /pets (or whatever the route is) in the browser address bar after the API Gateway stage name, while my Jade code is something like:

ul.nav.navbar-nav
  li
    a(href="/page1") Page1
  li
    a(href="/page") Page2

The Jade routes do not include the stage name, so they give errors after deploying to API Gateway. I just want to know whether I understand this correctly and how I can fix it so it works publicly.
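One possible workaround, sketched under the assumption that the eventContext middleware is in use: read the stage from the API Gateway event and pass a base path into the template so rendered links include the stage prefix.

const express = require('express')
const awsServerlessExpressMiddleware = require('aws-serverless-express/middleware')

const app = express()
app.set('view engine', 'jade')
app.use(awsServerlessExpressMiddleware.eventContext())

app.get('/', (req, res) => {
  // When invoked through API Gateway, requestContext.stage holds the stage
  // name (e.g. "prod"); when running locally there is no event, so fall back to ''.
  const stage = req.apiGateway ? req.apiGateway.event.requestContext.stage : ''
  const basePath = stage ? `/${stage}` : ''
  // The template can then render links as: a(href=basePath + "/page1") Page1
  res.render('index', { basePath })
})

module.exports = app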

Chunked transfer encoding

Hi,

I use an Express middleware that sets the transfer-encoding header to chunked:

'transfer-encoding': 'chunked'

In that case, I get the following error:

The request could not be satisfied.
CloudFront attempted to establish a connection with the origin, but either the attempt failed or the origin closed the connection.

To make my request work, I removed this header in the forwardResponseToApiGateway function of the aws-serverless-express/index.js file.

Do you think you could make chunked transfer encoding work?

aws profile option

Thanks for this awesome project; it's what I was hoping for.

I suggest adding a way to specify a particular profile (profiles are defined in ~/.aws/config).

For example:

npm run config <accountId> <bucketName> [region] [profileName]
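As a workaround until such an option exists, recent AWS CLI versions honor the AWS_PROFILE environment variable (older versions use AWS_DEFAULT_PROFILE), so the existing npm scripts can be pointed at a named profile, for example:

# run the packaged npm scripts against a named profile from ~/.aws/config
AWS_PROFILE=my-profile npm run setup
AWS_PROFILE=my-profile npm run package-deploy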

[Question] body response in base64

Hello, I would like to know if there is a way to print the body as plain text when using npm run local, instead of base64:

{ statusCode: 200,
  body: 'W3siaWQiOjEsIm5hbWUiOiJKb2UifSx7ImlkIjoyLCJuYW1lIjoiSmFuZSJ9XQ==',
  headers: 
   { 'x-powered-by': 'Express',
     'access-control-allow-origin': '*',
     'content-type': 'application/json; charset=utf-8',
     'content-length': '46',
     etag: 'W/"2e-Lu6qxFOQSPFulDAGUFiiK6QgREo"',
     vary: 'Accept-Encoding',
     date: 'Fri, 07 Apr 2017 11:27:14 GMT',
     connection: 'close' },
  isBase64Encoded: true }
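One way to inspect such a response is to decode the body by hand; a small sketch (not part of the package):

// Decode the base64 body of a proxy-style response for local inspection.
const response = {
  statusCode: 200,
  body: 'W3siaWQiOjEsIm5hbWUiOiJKb2UifSx7ImlkIjoyLCJuYW1lIjoiSmFuZSJ9XQ==',
  isBase64Encoded: true
}

const text = response.isBase64Encoded
  ? Buffer.from(response.body, 'base64').toString('utf8')
  : response.body

console.log(text) // [{"id":1,"name":"Joe"},{"id":2,"name":"Jane"}]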

Cannot deploy - constraint failed

I am running npm run setup, and it looks like it gets all the way through until the very end, where I get an exit status of 255 from npm run deploy. Adding the --debug flag to that aws command provides the full stack of the error.

The error is described in the last 30% of the output:

2017-04-19 12:04:47,538 - MainThread - botocore.parsers - DEBUG - Response body:
b'<ErrorResponse xmlns="http://cloudformation.amazonaws.com/doc/2010-05-15/">\n  <Error>\n    <Type>Sender</Type>\n    <Code>ValidationError</Code>\n    <Message>1 validation error detected: Value \'[CAPABILITY_NAMED_IAM, true[B]\' at \'capabilities\' failed to satisfy constraint: Member must satisfy constraint: [Member must satisfy enum value set: [CAPABILITY_NAMED_IAM, CAPABILITY_IAM]]</Message>\n  </Error>\n  <RequestId>4afa421b-2522-11e7-a6cd-c3bbf2db20f6</RequestId>\n</ErrorResponse>\n'
2017-04-19 12:04:47,539 - MainThread - botocore.hooks - DEBUG - Event needs-retry.cloudformation.CreateChangeSet: calling handler <botocore.retryhandler.RetryHandler object at 0x109266be0>
2017-04-19 12:04:47,539 - MainThread - botocore.retryhandler - DEBUG - No retry needed.
2017-04-19 12:04:47,539 - MainThread - awscli.customizations.cloudformation.deployer - DEBUG - Unable to create changeset
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/cloudformation/deployer.py", line 102, in create_changeset
    Description=description
  File "/usr/local/lib/python3.5/site-packages/botocore/client.py", line 253, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python3.5/site-packages/botocore/client.py", line 557, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ValidationError) when calling the CreateChangeSet operation: 1 validation error detected: Value '[CAPABILITY_NAMED_IAM, true[B]' at 'capabilities' failed to satisfy constraint: Member must satisfy constraint: [Member must satisfy enum value set: [CAPABILITY_NAMED_IAM, CAPABILITY_IAM]]
2017-04-19 12:04:47,540 - MainThread - awscli.clidriver - DEBUG - Exception caught in main()
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/awscli/clidriver.py", line 199, in main
    return command_table[parsed_args.command](remaining, parsed_args)
  File "/usr/local/lib/python3.5/site-packages/awscli/clidriver.py", line 337, in __call__
    return command_table[parsed_args.operation](remaining, parsed_globals)
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/commands.py", line 187, in __call__
    return self._run_main(parsed_args, parsed_globals)
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/cloudformation/deploy.py", line 160, in _run_main
    parsed_args.execute_changeset)
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/cloudformation/deploy.py", line 168, in deploy
    capabilities=capabilities)
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/cloudformation/deployer.py", line 178, in create_and_wait_for_changeset
    stack_name, cfn_template, parameter_values, capabilities)
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/cloudformation/deployer.py", line 107, in create_changeset
    raise ex
  File "/usr/local/lib/python3.5/site-packages/awscli/customizations/cloudformation/deployer.py", line 102, in create_changeset
    Description=description
  File "/usr/local/lib/python3.5/site-packages/botocore/client.py", line 253, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python3.5/site-packages/botocore/client.py", line 557, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ValidationError) when calling the CreateChangeSet operation: 1 validation error detected: Value '[CAPABILITY_NAMED_IAM, true[B]' at 'capabilities' failed to satisfy constraint: Member must satisfy constraint: [Member must satisfy enum value set: [CAPABILITY_NAMED_IAM, CAPABILITY_IAM]]
2017-04-19 12:04:47,540 - MainThread - awscli.clidriver - DEBUG - Exiting with rc 255

An error occurred (ValidationError) when calling the CreateChangeSet operation: 1 validation error detected: Value '[CAPABILITY_NAMED_IAM, true[B]' at 'capabilities' failed to satisfy constraint: Member must satisfy constraint: [Member must satisfy enum value set: [CAPABILITY_NAMED_IAM, CAPABILITY_IAM]]

Important part at the very end:

An error occurred (ValidationError) when calling the CreateChangeSet operation: 1 validation error detected: Value '[CAPABILITY_NAMED_IAM, true[B]' at 'capabilities' failed to satisfy constraint: Member must satisfy constraint: [Member must satisfy enum value set: [CAPABILITY_NAMED_IAM, CAPABILITY_IAM]]

I'm certain this misbehavior comes from the --capabilities CAPABILITY_IAM option, but I don't understand why, or how to fix it. Is this a common issue?

Compatibility with restify?

Hi,

has anybody tested this library in conjunction with restify instead of express? Is it doable or not?

Best
Andreas

process.on('uncaughtException') no longer catches all exception in express

Hi, after switching to aws-serverless-express, I noticed that process.on('uncaughtException') in our previous Express application no longer catches all exceptions from async callbacks. Is there another way to catch all exceptions that cause the server to crash?

This happens when we use the "request" module and an uncaught exception occurs in the callback of a request.

Fail to get binary data.

(Region : ap-northeast-2, seoul)

What I did

  1. Set up API Gateway binary support for Content-Type: application/octet-stream.
  2. Added the binary types parameter to createServer from aws-serverless-express.
  3. Set the request header Accept to application/octet-stream.
  4. Sent binary data with res.send(data);. I checked that isBase64Encoded is true.

What I expected

I expected to get binary data, decoded by API Gateway.

What I actually got

I got base64-encoded values.

Also, when I reverted commit b61d288, I could get binary data.

What am I doing wrong?

[Question] Socket.io / Long running connections

Hi everyone,

I was wondering whether this could work if your front end uses some kind of Socket.IO connection. Is it possible?
I guess the long-running connection is a limitation in this case (since Lambda times out).

Thank you for any explanation!

Response GZipped even when no 'accept-encoding' is defined?

Hi Team,

It seems that when I don't set the Accept-Encoding header, or set it to '', the response is always gzipped (and base64-encoded). Is that intended behavior? If the client doesn't explicitly request an encoding, we shouldn't compress the response (even if it exceeds the size threshold).

Thanks

P.S. Sorry but I'm a bit new to encoding - perhaps this has been explained already, but reading through other issues I didn't get a clear picture of how it works.

TypeError: The header content contains invalid characters

I got the error below when I set Japanese text as a POST parameter.
Is it a bug?

2016-11-01T15:46:46.806Z	653a43f4-a04a-11e6-8038-c5c1480ca076	TypeError: The header content contains invalid characters
at ClientRequest.OutgoingMessage.setHeader (_http_outgoing.js:351:13)
at new ClientRequest (_http_client.js:79:14)
at Object.exports.request (http.js:31:10)
at forwardRequestToNodeServer (/var/task/node_modules/aws-serverless-express/index.js:75:22)
at Object.response.setEncoding.on.on.req.on.exports.createServer.exports.proxy (/var/task/node_modules/aws-serverless-express/index.js:118:9)
at exports.handler (/var/task/lambda.js:6:60)

res.sendFile returning different results when run locally vs in Lambda

The code below returns different results when run locally vs. on Lambda. Locally it returns a properly formatted JavaScript file. However, when I deploy to Lambda I get a base64-encoded blob that is not readable by my application. I have this set up as a proxy integration with API Gateway, so it should not be altering the payload at all, and yet it is. Why is this library (or Lambda) forcing a base64-encoded response?

'use strict'
const path = require('path')
const compression = require('compression')
const express = require('express')
const bodyParser = require('body-parser')
const cors = require('cors')
const awsServerlessExpressMiddleware = require('aws-serverless-express/middleware')
const app = express()

app.set('view engine', 'pug')
app.use(compression())
app.use(cors())
app.use(bodyParser.json())
app.use(bodyParser.urlencoded({ extended: true }))
app.use(awsServerlessExpressMiddleware.eventContext())

app.get('/widget/:widgetId', (req, res) => {
  console.log(req)
  res.setHeader('Content-Type', 'text/javascript')
  res.sendFile(path.join(__dirname, '/widget/bundle.js'), {}, function (err) {
    if (err) {
      console.log(err)
    } else {
      console.log('no error')
    }
  })
})

// app.listen(3000, function () {
//   console.log('Example app listening on port 3000!')
// })

module.exports = app

Impossible to serve HTML larger than 1kB ?

It seems to be impossible to serve back HTML larger than 1kB.

No error in the logs but also no response from the APIGateway endpoint.

If you reduce the HTML response string to under 1 kB of text, it magically starts working.

Does anybody know how to fix this?

const app = express();

app.use((req, res, next) => {
  console.log(req.url);
  next();
});

app.disable('x-powered-by');

app.use(compression());

app.use(awsServerlessExpressMiddleware.eventContext());

app.get('*', (req, res, next) => {
    let htmlString = "<html><body>PUT A HTML STRING WITH MORE THAN 1KB TEXT HERE</body></html>";
    res.status(200).send(htmlString);
});

// NOTE: If you get ERR_CONTENT_DECODING_FAILED in your browser, this is likely
// due to a compressed response (e.g. gzip) which has not been handled correctly
// by aws-serverless-express and/or API Gateway. Add the necessary MIME types to
// binaryMimeTypes, and to the x-amazon-apigateway-binary-media-types array in
// simple-proxy-api.yaml, then redeploy (`npm run package-deploy`)
const binaryMimeTypes = [
  // 'application/javascript',
  // 'application/json',
  'application/octet-stream',
  // 'application/xml',
  'font/eot',
  'font/opentype',
  'font/otf',
  'image/jpeg',
  'image/png',
  'image/svg+xml',
  // 'text/comma-separated-values',
  // 'text/css',
  // 'text/html',
  // 'text/javascript',
  // 'text/plain',
  // 'text/text',
  // 'text/xml'
];

// Create the AWS Lambda server.
const server = awsServerlessExpress.createServer(app, null, binaryMimeTypes);

// Export the AWS Lambda server proxy.
exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context);

Clarification on Readme

On the Cons section on "Is AWS serverless right for my app?"

The maximum execution time for a request is 300 seconds (5 minutes). This is true for Lambda, but not if you use API Gateway, because API Gateway has a 30-second timeout.

Why is application/json listed as a binary media type in example project?

Hi!
I've just spent a few hours troubleshooting a strange error I got only when deploying to AWS Lambda / API Gateway, and it turns out that it is because application/json is listed in the binaryMimeTypes collection in your example project. When I removed it, the POST and PATCH calls work just fine.

What is the reason for this? In what scenario would you want to treat JSON input data as base64?

unable to import module "lambda"

I'm trying to migrate an existing express app.

I've deployed, but I'm getting this error in Cloudwatch:

Unable to import module 'lambda': Error
at Function.Module._resolveFilename (module.js:325:15)
at Function.Module._load (module.js:276:25)
at Module.require (module.js:353:17)
at require (internal/module.js:12:17)

And "internal server error" when trying to hit my ApiUrl.

I'm requiring my server in lambda.js and changed "main" in my package.json to lambda.js, as well as following the other instructions. Could I be missing something obvious?
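For comparison, a minimal lambda.js sketch of the shape the handler typically takes; the ./app path is an assumption about your project layout:

// lambda.js - the file and export name must match the Lambda handler setting
// (e.g. "lambda.handler"), and every require'd module (aws-serverless-express,
// ./app, and their dependencies) must be included in the deployed zip.
const awsServerlessExpress = require('aws-serverless-express')
const app = require('./app')

const server = awsServerlessExpress.createServer(app)

exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context)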

multipart request, corrupted files

I found this module useful because I want to handle multipart form data in my Lambda function, and the Express framework has a lot of working multipart modules. I am using formidable, which works fine with a normal Express application, but with this setup I keep getting corrupted files for some reason.

This happens only for non-text files; .txt files are uploaded and rendered properly.
Here is a code snippet that might be useful to reproduce the problem.

fs.readFile(req.files['file'].path, 'binary', (err, data) => {
  if (err) {
    throw err;
  }
  fs.writeFile('some.[some ext]', data, 'binary', (err, data) => {
    if (err) {
      throw err;
    }
    res.end('OK');
  });
});

After running this I get the file on my local server, but it is corrupted, so I can't open it.
For example, images are not shown in normal image preview applications.
For example, zip files cannot be opened by archivers (ERROR 21 is not a directory).

Can anyone explain the problem? Thanks in advance.

If content returns bigger than the certain size having error message.

I am using Elasticsearch. It returns fine for a single record, but when I request multiple records it fails and the following error message is returned from API Gateway:

ERROR

The request could not be satisfied.

CloudFront attempted to establish a connection with the origin, but either the attempt failed or the origin closed the connection.
Generated by cloudfront (CloudFront)
Request ID: eKTj_GMFZc692dOpJSiQa3qsv0IlbJgs_RYOxa-PSdCyWBDlSi9XiQ==

ECS Containers

Would you ever recommend packaging an app like this (Express -> Lambda) into a Docker container on ECS? Say I'm building the backend for a mobile app, and want to use the Microservice pattern.

Please add support to Facebook chatbot

const successResponse = { statusCode, body, headers, isBase64Encoded }

context.succeed(successResponse)

I have written a chatbot with Node.js and Express, and I have pushed it to Heroku, where it works fine. Now that I want to move it to a Lambda function, it needs this node module to support Express.

The Facebook webhook verification expects a body containing only the hub.challenge field as an integer, but this code sends the whole response, which the Facebook webhook doesn't accept; it expects just the hub.challenge number.
Please help, or let me know if there is another package that does the same thing, since this one won't work for me.
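For what it's worth, a sketch of a verification route that returns only the challenge value, assuming the standard Facebook webhook query parameters and a hypothetical VERIFY_TOKEN environment variable:

const express = require('express')
const app = express()

// Facebook webhook verification: echo back hub.challenge as the bare body
// when the verify token matches, otherwise reject the request.
app.get('/webhook', (req, res) => {
  if (req.query['hub.mode'] === 'subscribe' &&
      req.query['hub.verify_token'] === process.env.VERIFY_TOKEN) {
    res.status(200).send(req.query['hub.challenge'])
  } else {
    res.sendStatus(403)
  }
})

module.exports = app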

Changelog?

Hi, I can't find anything on what the breaking changes between 2.x and 3.x are. Can you please give me any pointers?

Socket hang up error after 3 sec when calling https.get from within app.get

I have a simple server, index.js. I upload the attached zip to Lambda. When I run the test to request /allPrizeData, I get the error noted above within 2 to 3 seconds. The callback in https.get is never called, as demonstrated with the console.log statement. This code executes quickly and correctly locally. The code also tests fine on Lambda when I give it mock data to return instead of using the call to https.get to retrieve the real data. But as stated, there is no trouble when running locally.
The problem can be reproduced by uploading the attached archive file as the code for a new Lambda function, changing the proxy test to GET /allPrizeData, and running the test. Note: the URL to the data will work for your test as well, because it is readable by everyone.
archive.zip

