
Starter Kit and tooling for authoring REST API backends with PostgREST

License: MIT License

Languages: PLpgSQL 56.85%, Shell 5.10%, Lua 13.04%, JavaScript 19.11%, HTML 5.30%, Dockerfile 0.59%
Topics: postgrest, postgresql, api, rest, openresty, docker, boilerplate, starter-kit, rabbitmq, automatic-api

postgrest-starter-kit's Introduction

PostgREST Starter Kit

Base project and tooling for authoring REST API backends with PostgREST.


Purpose

PostgREST enables a different way of building data-driven API backends. It does "one thing well": it gives you a REST API over your database. However, to build a complex production system that talks to 3rd-party systems, sends emails, implements real-time updates for browsers, has integration tests and implements authentication, you need additional components. For this reason, some developers either submit feature requests that are outside the scope of PostgREST or think of it as just a prototyping utility rather than a powerful, flexible production component with excellent performance. This repository aims to be a starting point for all PostgREST-based projects and to bring all the components together under a well-defined structure. We also provide tooling that helps you iterate on your project, plus tools/scripts to set up a build pipeline that pushes everything to production. There are quite a few components in the stack, but you can safely comment out the pg_amqp_bridge/rabbitmq (or even openresty) instances in docker-compose.yml if you don't need the features they provide.

PostgREST+ as a service

Run your PostgREST instance in subZero cloud and get additional features on top of the open-source version (free plan available).

Alternatively, deploy an enhanced version of PostgREST on your own infrastructure using the binary and Docker distributions.

Fully Managed — subZero automates every part of setting up, running and scaling PostgREST. Let your team focus on what they do best: building your product. Leave PostgREST management and monitoring to the experts.
Faster Queries — Run an enhanced PostgREST version that uses prepared statements instead of inline queries. This results in up to 30% faster response times.
Custom Relations — Define custom relations when automatic detection does not work. This lets you use the powerful embedding feature even with the most complicated views.
GraphQL API — In addition to the REST API you get a GraphQL API with no additional coding. Leverage all the powerful tooling, libraries and integrations for GraphQL in your frontend.
Preconfigured Authentication — Authenticate users with local email/password or with 3rd-party OAuth 2.0 providers (Google/Facebook/GitHub preconfigured).
Analytical Queries — Use GROUP BY, aggregates and window functions in your queries.

Features

✓ Cross-platform development on macOS, Windows or Linux inside Docker
✓ PostgreSQL database schema boilerplate with authentication and authorization flow
✓ OpenResty configuration files for the reverse proxy
✓ RabbitMQ integration through pg-amqp-bridge
✓ Lua functions to hook into each stage of the HTTP request and add custom logic (integrate 3rd-party systems)
✓ Debugging and live code reloading (sql/configs/lua) functionality using subzero-cli
✓ Full migration management (migration files are automatically created) through subzero-cli/sqitch/apgdiff
✓ SQL unit test using pgTAP
✓ Integration tests with SuperTest / Mocha
✓ Docker files for building production images
✓ Community support on Slack
✓ Compatible with subZero Starter Kit if you need a GraphQL API and a few more features with no additional work

Directory Layout

.
├── db                        # Database schema source files and tests
│   └── src                   # Schema definition
│       ├── api               # API entities available as REST endpoints
│       ├── data              # Definition of source tables that hold the data
│       ├── libs              # A collection of modules used throughout the code
│       ├── authorization     # Application level roles and their privileges
│       ├── sample_data       # A few sample rows
│       └── init.sql          # Schema definition entry point
├── openresty                 # Reverse proxy configurations and Lua code
│   ├── lua                   # Application Lua code
│   ├── nginx                 # Nginx configuration files
│   ├── html                  # Static frontend files
│   └── Dockerfile            # Dockerfile definition for building production images
├── tests                     # Tests for all the components
│   ├── db                    # pgTap tests for the db
│   └── rest                  # REST interface tests
├── docker-compose.yml        # Defines Docker services, networks and volumes
└── .env                      # Project configurations

Installation

Prerequisites

You will need Docker (with docker-compose), Node.js/npm (used by the test suite) and the subzero-cli (used for the development dashboard and migrations).

Create a New Project

Click the green [Use this template] button. Choose the name of your new repository, a description and the public/private state, then click the [Create repository from template] button. Check out the step-by-step guide if you encounter any problems.

After this, clone the newly created repository to your computer. In the root folder of the application, run the docker-compose command:

docker-compose up -d

The API server will become available at http://localhost:8080/rest/.

Try a simple request

curl http://localhost:8080/rest/todos?select=id,todo
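For orientation, the todos resource is backed by a view in the api schema over a table in the data schema (see db/src/api and db/src/data). A minimal sketch of what such a definition looks like, assuming the column names from the request above; the kit's actual source may differ:

-- sketch only: expose data.todo through the api schema
create or replace view api.todos as
    select id, todo
    from data.todo;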

Development workflow and debugging

Execute subzero dashboard in the root of your project.
After this step you can view the logs of all the stack components (SQL queries are also logged), and if you edit a sql/conf/lua file in your project, the changes are applied immediately.

Testing

The starter kit comes with a testing infrastructure already set up. You can write pgTAP tests that run directly in your database, useful for testing the logic that resides in your database (user privileges, Row Level Security, stored procedures). Integration tests are written in JavaScript.

Here is how you run them:

npm install                     # Install test dependencies
npm test                        # Run all tests (db, rest)
npm run test_db                 # Run pgTAP tests
npm run test_rest               # Run integration tests
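A pgTAP test is just a SQL file under tests/db. A minimal sketch, assuming the default todo objects shipped with the kit (replace the checks with assertions about your own schema):

begin;
select plan(2);

-- hypothetical checks against the kit's sample objects
select has_table('data'::name, 'todo'::name, 'data.todo table exists');
select has_view('api'::name, 'todos'::name, 'api.todos view is exposed');

select * from finish();
rollback;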

Contributing

Anyone and everyone is welcome to contribute.

Support and Documentation

License

Copyright © 2017-present subZero Cloud, LLC.
The source code is licensed under the MIT license.
The documentation for the project is licensed under the CC BY-SA 4.0 license.

postgrest-starter-kit's People

Contributors

akagomez, coolzilj, futtetennista, gavrilyak, kljensen, numtel, pjlindsay, ruslantalpa, soyuka, steve-chavez, stonecypher, synapseradio, wildsurfer


postgrest-starter-kit's Issues

Duplicate CORS headers for POST requests

POST requests to the REST API receive duplicated CORS (Access-Control-*) headers in the response.

When using the REST API endpoints from a JS app in a browser (I used react-admin as a sample), this behaviour leads to the following error in the browser console:

Access to fetch at 'http://localhost:8080/rest/rpc/login'
from origin 'http://localhost:3001' has been blocked by CORS policy:
The 'Access-Control-Allow-Origin' header contains multiple values '*, http://localhost:3001',
but only one is allowed. Have the server send the header with a valid value,
or, if an opaque response serves your needs,
set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

Test case with RPC login endpoint (same behaviour for resources):

curl -D- 'http://localhost:8080/rest/rpc/login' \
	-H 'Origin: http://localhost:3001' \
	-H 'content-type: application/json' \
	-H 'accept: application/vnd.pgrst.object+json' \
	--data-binary '{"email":"[email protected]","password":"pass"}'

Example output:

[...]
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET, POST, PATCH, DELETE, OPTIONS
Access-Control-Allow-Credentials: true
[...]
Access-Control-Allow-Origin: http://localhost:3001
Access-Control-Allow-Credentials: true
Access-Control-Expose-Headers: Content-Encoding, Content-Location, Content-Range, Content-Type, Date, Location, Server, Transfer-Encoding, Range-Unit
[...]
  • The first batch of headers is injected by openresty/nginx/conf/includes/http/server/locations/rest/cors.conf (line 12-19)
  • The second batch of headers (with customized origin name) is injected by PostgREST via src/PostgREST/Config.hs (line 89-105)

IMHO the current cors.conf is not needed at all, as PostgREST will always add CORS headers on its own (also for OPTIONS requests).

My fix was to comment out / delete everything inside openresty/nginx/conf/includes/http/server/locations/rest/cors.conf, which makes everything work perfectly.

The only use case where these headers might be needed is when using the subzero.jwt_session_cookie module (with subzero-starter-kit), as this module does not add the headers itself.

Maybe I'm overlooking something that still depends on the Lua injection of these headers (you put them there for a reason, I guess). A better option might be to add these headers from Lua in a response hook only if they are not already present in the response.

Getting Cannot find module 'C:\Projects\postGrest\subzero-base-project'

I installed npm, Docker and then the subzero CLI (all seemed to install fine).
I then ran the command subzero base-project and got this error:

C:\Projects\postGrest>subzero base-project
internal/modules/cjs/loader.js:783
throw err;
^

Error: Cannot find module 'C:\Projects\postGrest\subzero-base-project'
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:780:15)
    at Function.Module._load (internal/modules/cjs/loader.js:685:27)
    at Function.Module.runMain (internal/modules/cjs/loader.js:1014:10)
    at internal/main/run_main_module.js:17:11 {
  code: 'MODULE_NOT_FOUND',
requireStack: []
}

Any ideas what I'm missing? I'm running this on Windows 10 Pro

rabbitMQ stomp plugin

The pg-amqp-bridge README says that by enabling the STOMP plugin you can achieve real-time communication.
https://github.com/subzerocloud/pg-amqp-bridge
I think it's worthwhile to add such an integration directly to the repo. I need to do it in any case for a project that uses this starter kit, so if you think it's worthwhile I might fork and then merge it back here.

Do you have any suggestions on how this should be done? Probably a new Docker image that installs the plugin, and we would somehow need to expose an endpoint that forwards all notifications...

thanks!

Will this work without Docker

I had many issues with Docker, so I just installed PostgreSQL and PostgREST natively on Windows for development. I've verified that I can make API calls with curl, so I'm set from that perspective.

This starter kit seems great and if there's a way for me to use it without Docker I'd like to, so please let me know if that's possible.

psql cannot connect if DB port changed

Hello,

I have a clean install of postgrest-starter-kit. I edited the docker-compose.yml file to change db ports to "4000:5432". Everything works fine and I can access the database and retrieve results.

When I edit db/src/sample_data/data.sql I get the following message in the openresty tab of the subzero dashboard...

psql: could not connect to server: Connection refused
        Is the server running on host "localhost" (::1) and accepting TCP/IP connections on port 5432?
could not connect to server: Connection refused
        Is the server running on host "localhost" (127.0.0.1) and accepting TCP/IP connections on port 5432?

How to setup CORS in the docker

I'm accessing the starter kit from a remote web app, but I'm getting a CORS error. Would it be possible to set this up in the Docker container? Thanks!

lost all data in the DB

After running docker-compose down and then docker-compose restart / docker-compose up -d, I found I had lost all the tables and views I had added to the database; only the starter-kit db objects (user, todo) were left.

dashboard crashing

Hi,

I tried a brand new install: went down the line on the install instructions, the containers start successfully, and the curl example works just as published.

I try to change item_1 to item_1 update in the sample_data/data.sql file, and as soon as I save, the dashboard crashes...

Error: spawn psql ENOENT
    at _errnoException (util.js:1024:11)
    at Process.ChildProcess._handle.onexit (internal/child_process.js:190:19)
    at onErrorNT (internal/child_process.js:372:16)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickCallback (internal/process/next_tick.js:180:9)

I restart the dashboard with no problem, but when I run the curl again on /todo there is no change to item_1.

I press r in the running dashboard terminal and get the same crash as above.

macOS, using node v8.9.1

Revoke execute on functions from public

-- by default all functions are accessible to the public, we need to remove that and define our specific access rules
revoke all privileges on function login(text, text) from public;

It is possible to forget to revoke the grant on some function. To prevent this possibility once and for all, we can revoke execute on every function right at the beginning:

alter default privileges revoke execute on functions from public;
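As a sketch of how the two pieces would fit together (the login(text, text) function and the anonymous role are the kit's defaults referenced elsewhere in this document; adjust to your own schema):

-- from now on, newly created functions are not executable by public
-- (applies to functions created by the role running this statement)
alter default privileges revoke execute on functions from public;

-- access is then granted explicitly, function by function,
-- e.g. unauthenticated users must be able to call login
grant execute on function login(text, text) to anonymous;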

Are there API versioning settings in the starter kit?

I think it would be nice to have API versioning considerations in the starter kit. For example, in production an old client might call /rest/1.0/xx, while after an upgrade a newer client would need /rest/2.0/xx, etc.

Forgot Password/Password Recovery

Hi All --

I don't know if this is something that would make sense to include in the project itself, but it would probably be nice to have a section in the Wiki on password recovery (among other account management features). A more fleshed out auth system that doesn't require Auth0 would probably help draw a lot of people to the project.

I have done a simple and likely mediocre implementation here. It creates a JWT with a short expiry and a claim ensuring that the JWT can only be used to change the user's password a single time within the allotted duration. The JWT is then sent off to the RabbitMQ bridge and the user is expected to handle the actual emailing -- much like the welcoming-new-users section on the wiki.
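For illustration, here is a rough PL/pgSQL sketch of that flow. Everything here is hypothetical (the request_password_reset name, the data."user" table, the 'events' notification channel); pgjwt.sign and settings.get are the helpers the kit already uses to sign tokens, and the single-use enforcement on the consuming side is not shown:

-- hypothetical sketch, not the implementation linked above
create or replace function request_password_reset(_email text)
returns void as $$
declare
    usr record;
    token text;
begin
    select * into usr from data."user" where email = _email;
    if not found then
        return; -- do not reveal whether the email exists
    end if;

    -- short-lived token; the password_reset claim marks it as single-purpose
    token := pgjwt.sign(
        json_build_object(
            'user_id', usr.id,
            'role', 'webuser',
            'password_reset', true,
            'exp', extract(epoch from now())::integer + 900 -- 15 minutes
        ),
        settings.get('jwt_secret')
    );

    -- hand the token to pg-amqp-bridge (channel/payload format depends on
    -- your bridge configuration); a worker is expected to send the email
    perform pg_notify('events',
        json_build_object('email', _email, 'reset_token', token)::text);
end
$$ security definer language plpgsql;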

If there is demand for it I would be happy to flesh it out more, but it would be great if someone already had a battle tested/good solution that they could share.

Thanks

Question about fundamental differences between postgrest-starter-kit vs subzero-starter-kit

Hi,

I just started to learn postgrest-starter-kit, subzero-starter-kit and jekirl/postgrest-starter-kit, and I found that, even though they more or less solve the "same" problem, they provide different functionality and services when you look at the details.

If I am right, some basic differences are:

  • subzero-starter-kit focuses on using postgrest-plus in the background and provides tools for easy subzerocloud migration/support
  • postgrest-starter-kit is for the open-source postgrest and is not as tightly bundled to subzero
  • jekirl/postgrest-starter-kit is a restructured and modified subzero-starter-kit
  • all of them are basically based on Docker
  • only subzero provides a logout function, which deletes the cookie on the client using the set-header function
Feature subzero-* postgrest-* jekirl-*
postgrest X X
postgrest-plus (1 connection) X
docker image all-included except pgsql X
separate docker images for components (db, postgrest, rabbitmq, openresty) X X
supports cookies via set-header X
provides logout function out of box by deleting cookie X

Anything else that might be important?
(I will extend this post as I have more time.)

deploy: can't connect DB server through socket

We are having the same problem with our PostgREST container and the exact starter-kit example. From the CloudWatch logs for PostgREST:

{
    "details": "could not connect to server: No such file or directory\n\tIs the server running locally and accepting\n\tconnections on Unix domain socket \"/var/run/postgresql/.s.PGSQL.5432\"?\n",
    "code": "",
    "message": "Database connection error"
}

AWS support is not very helpful, but said that the recommended way is not to use a socket. Are you getting the same error in the logs? Every other indicator in the AWS web console says all is good.

JWSError JWSInvalidSignature

I'm using the JWT token from the example on this page:
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxLCJyb2xlIjoid2VidXNlciJ9.vAN3uJSleb2Yj8RVPRsb1UBkokqmKlfl6lJ2bg3JfFg

When I make a request this error message is returned (curl & Postman):

{
    "message": "JWSError JWSInvalidSignature"
}

I've also copied the example JWT token into the .env file replacing the string "reallyreallyreallysecret".

....
# Global configs
DEVELOPMENT=1
JWT_SECRET=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxLCJyb2xlIjoid2VidXNlciJ9.vAN3uJSleb2Yj8RVPRsb1UBkokqmKlfl6lJ2bg3JfFg

# DB connection details (used by all containers)
DB_HOST=db
....

Am I missing something here?

Milestones and releases

Big fan of postgrest-starter-kit here. I'm using it as a dependency for another production-ish app. It would be great to have versioned releases and a changelog/milestones added to this project, so that it's easier for me to pin a particular version of postgrest-starter-kit instead of grabbing master every time.

Happy to help!

Code reload not working

Hello, thank you for open-sourcing this project; it is quite interesting and I want to learn more about it. I am facing some issues with the code reload, as described below.

Every time I make a change in db/src/sample_data/data.sql, as specified here, the stack breaks down. I start getting the following errors.

OpenResty

./db/src/sample_data/data.sql changed
Starting code reload ------------------------
Ready ---------------------------------------
sed: /docker-entrypoint-initdb.d/libs/pgjwt/pgjwt-0.0.1.sql: No such file or directory
psql:/Users/steveazzopardi/Code/github.com/SteveAzz/khumbuicefall/db/src/libs/auth/schema.sql:27: ERROR:  function pgjwt.sign(json, text) does not exist
LINE 2:     select pgjwt.sign($1, settings.get('jwt_secret'))

pg-amqp-bridge

thread '<unnamed>' panicked at 'internal error: entered unreachable code', /home/travis/.cargo/registry/src/github.com-1ecc6299db9ec823/postgres-0.14.2/src/notification.rs:155:17
note: Run with `RUST_BACKTRACE=1` for a backtrace.

curl

$ curl -H "Authorization: Bearer $JWT_TOKEN" "http://localhost:8080/rest/todos?select=id,todo"
{"hint":null,"details":null,"code":"42P01","message":"relation \"api.todos\" does not exist"}⏎

I am running on the latest master, on macOS with the latest stable Docker version.

Line endings of .sh files break process on Windows 10

Hi -

First off, thank you so much for creating this starter kit. It's a huge help.

I wanted to submit this issue to let you know that there is a (solvable) problem that prevents running this on Windows 10, and probably other versions of Windows as well.

Windows has different line endings from Unix/macOS (CRLF instead of LF), which makes the .sh files in the project kick out errors in the PostgREST and RabbitMQ boxes due to unrecognized newline characters.

The way I fixed it was to download dos2unix and run

dos2unix openresty/entrypoint.sh
dos2unix db/src/init.sh
docker-compose up -d

in a terminal opened in the root directory. Once that happened, everything ran smoothly. I did not run it on the test scripts but I imagine the same thing would need to happen with those.

It was an easy fix, but a non-obvious solution for me. I don't know if there's a way to detect and fix this automatically, but if there is, I'm sure there are fellow travelers out there who would appreciate it. In any case, I'm happy to submit a PR to add the solution above to the documentation. Thanks again!

CircleCI build fails - 'import' statement unsupported

I'm able to run npm test successfully on my local computer running macOS, following the tutorial. However, when I run the same tests in CircleCI, I get the error log below:

#!/bin/bash -eo pipefail
npm test

> starter-kit@ test /home/circleci/project
> npm run test_db && npm run test_rest


> starter-kit@ test_db /home/circleci/project
> ( set -a && . ./.env && set +a && docker run -i -t --rm --name pgtap --net ${COMPOSE_PROJECT_NAME}_default --link ${COMPOSE_PROJECT_NAME}_db_1:db -v $(pwd)/tests/db/:/test -e HOST=$DB_HOST -e DATABASE=$DB_NAME -e USER=$SUPER_USER -e PASSWORD=$SUPER_USER_PASSWORD lren/pgtap )

Waiting for database...
2019/02/28 03:09:01 Waiting for: tcp://db:5432
2019/02/28 03:09:01 Connected to tcp://db:5432

Running tests: /test/*.sql
/test/rls.sql ........ ok
/test/simple.sql ..... ok
/test/structure.sql .. ok
All tests successful.
Files=3, Tests=13,  0 wallclock secs ( 0.03 usr +  0.01 sys =  0.04 CPU)
Result: PASS

> starter-kit@ test_rest /home/circleci/project
> mocha --compilers js:babel-core/register ./tests/rest/

/home/circleci/project/tests/rest/auth.js:1
(function (exports, require, module, __filename, __dirname) { import { rest_service, jwt, resetdb } from './common.js';
                                                              ^^^^^^
SyntaxError: Unexpected token import
    at exports.runInThisContext (vm.js:53:16)
    at Module._compile (module.js:511:25)
    at loader (/home/circleci/project/node_modules/babel-register/lib/node.js:144:5)
    at Object.require.extensions.(anonymous function) [as .js] (/home/circleci/project/node_modules/babel-register/lib/node.js:154:7)
    at Module.load (module.js:456:32)
    at tryModuleLoad (module.js:415:12)
    at Function.Module._load (module.js:407:3)
    at Module.require (module.js:466:17)
    at require (internal/module.js:20:19)
    at /home/circleci/project/node_modules/mocha/lib/mocha.js:231:27
    at Array.forEach (native)
    at Mocha.loadFiles (/home/circleci/project/node_modules/mocha/lib/mocha.js:228:14)
    at Mocha.run (/home/circleci/project/node_modules/mocha/lib/mocha.js:514:10)
    at Object.<anonymous> (/home/circleci/project/node_modules/mocha/bin/_mocha:480:18)
    at Module._compile (module.js:541:32)
    at Object.Module._extensions..js (module.js:550:10)
    at Module.load (module.js:456:32)
    at tryModuleLoad (module.js:415:12)
    at Function.Module._load (module.js:407:3)
    at Function.Module.runMain (module.js:575:10)
    at startup (node.js:160:18)
    at node.js:445:3
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! starter-kit@ test_rest: `mocha --compilers js:babel-core/register ./tests/rest/`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the starter-kit@ test_rest script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/circleci/.npm/_logs/2019-02-28T03_09_02_947Z-debug.log
npm ERR! Test failed.  See above for more details.
Exited with code 1

Searching around, it looks like others have run into this when using Mocha.

openresty module not found

Hi,

Trying to run docker-compose, the openresty/openresty:jessie instance returns these errors:

module ngx_http_lua_cache_module.so not found, then
lualib/user_code/hooks.lua:6: module 'subzero.jwt_session_cookie' not found

Any hints?

psql: FATAL: role "superuser" does not exist

When I modify my code in db/src/data/*.sql and save, the subZero devtools report this error. What could be causing it?
what about it ?

─Logs───────────────────────────────────────────
./db/src/data/todo.sql changed
Starting code reload ------------------------
psql: FATAL: role "superuser" does not exist

Ready ---------------------------------------
psql: FATAL: role "superuser" does not exist

GRANT api TO anonymous, webuser?

Hi!

Sorry to bother you with the following questions; it's probably a misunderstanding on my side.

I created a role api as you did:

-- this role will be used as the owner of the views in the api schema
-- it is needed for the definition of the RLS policies
drop role if exists api;
create role api;
grant api to current_user; -- this is a workaround for RDS where the master user does not have SUPERUSER privileges

Then created a policy assigned to api as you did:

-- define who can access todo model data
-- enable RLS on the table holding the data
alter table my_table enable row level security;
-- define the RLS policy controlling what rows are visible to a particular application user
create policy my_table_access_policy on my_table to api ...

api is the owner of the corresponding view and has access to the table as you did:

alter view my_table_view owner to api;
grant select, insert, update, delete on my_table to api;

However, the endpoint my_table_view returns 0 rows for webuser, although the policy allows it to see some rows. Then, once I executed:

grant api to anonymous, webuser;

the endpoint and the policy worked as expected.

So my question:

You never do grant api to anonymous, webuser in this project. Is it actually needed or did I miss something?

(And I'm wondering, what's the purpose of grant api to current_user;? I saw the comment, but why would the master user need SUPERUSER privileges to run PostgREST? And how does granting api to master solve the issue, since api is created without SUPERUSER privileges? I'm on RDS and I tried with both grant api to current_user; and revoke api from current_user; and I don't see a difference, but I might be missing something again.)

circleci build fails

Mocha tests fail with the following error: Cannot find module 'babel-core/register'

Fargate RDS Example

I didn't find a repo where I could more directly suggest a PR/change to your subzero.cloud docs, but your Fargate sample page suggests authorizing access to the DB using the CIDR range of the cluster. Logically, I assume that range could change (e.g. by redeploying the cluster), so you're better off authorizing the cluster's security group.

Here's some sample code (with slightly different variable names than you're using):

# create a subnet for the DB
aws rds create-db-subnet-group \
    --db-subnet-group-name $CLIENT-db-subnet \
    --db-subnet-group-description $CLIENT-db-subnet \
    --subnet-ids $Cluster_Resource_PubSubnetAz1 $Cluster_Resource_PubSubnetAz2

# get the Security Group ID for authorizing access
# I'm assuming each has only one and grabbing it using [0].  If you've already 
# added extra groups to either side, you may need to do something more
# complex.
export DB_SubnetGroup_VpcId=$(aws rds describe-db-subnet-groups\
 --db-subnet-group-name=$CLIENT-db-subnet\
 --query DBSubnetGroups[0].VpcId\
 --output text)
echo DB_SubnetGroup_VpcId=$DB_SubnetGroup_VpcId >> .env
export DB_SecurityGroup_VpcId=$(aws ec2 describe-security-groups\
 --filters Name=vpc-id,Values=${DB_SubnetGroup_VpcId}\
 --region ${AWS_REGION}\
 --query SecurityGroups[0].GroupId\
 --output text)
echo DB_SecurityGroup_VpcId=$DB_SecurityGroup_VpcId >> .env

... including making the cluster

# get the Cluster security group
# see https://docs.amazonaws.cn/en_us/AmazonECS/latest/userguide/ecs-cli-tutorial-fargate.html
export Cluster_Resource_EcsSecurityGroup=$(aws ec2 describe-security-groups\
 --filters Name=vpc-id,Values=${Cluster_Resource_Vpc}\
 --region ${AWS_REGION}\
 --query SecurityGroups[0].GroupId\
 --output text)
echo Cluster_Resource_EcsSecurityGroup=$Cluster_Resource_EcsSecurityGroup >> .env

# allow ECS nodes to connect to this db
aws ec2 authorize-security-group-ingress \
	--region $AWS_REGION \
	--group-id $DB_SecurityGroup_VpcId \
	--protocol tcp \
	--port 5432 \
	--source-group $Cluster_Resource_EcsSecurityGroup

I'm keeping all of the DB stuff separate from the ECS app, which may be unnecessary but feels more future-proof. This also means I can build them in any order (as long as I authorize the one to the other last).

problem at the end of API core tutorial with sub join

Hi Ruslan! I have been trying this out for a couple of weeks and it has been extremely helpful. I read the docs and went through your tutorial. There was one small problem at the very end of the API Core part of the tutorial: the comments(body) sub-join does not work:

--data-urlencode select="id,name,projects(id,name,comments(body),tasks(id,name,comments(body)))" | \

But this does work:

--data-urlencode select="id,name,projects(id,name,comments(body),tasks(id,name))" | \

Perhaps this is a bug in PostgREST, or just a change?

Can't get last request from chapter "API Core" in the tutorial to work

Hi!

I'm currently following the tutorial (thank you very much for it, very helpful) and I can't get the last request under "API Core" to work:

christian@Ubuntu-VM:~/khumbuicefall$ curl -s -G -H "Authorization: Bearer $JWT_TOKEN" http://localhost:8080/rest/clients --data-urlencode select="id,name,projects(id,name,comments(body),tasks(id,name,comments(body)))" | python -mjson.tool

I always get the following response:

{ "message": "Could not find foreign keys between these entities, No relation found between tasks and comments" }

All other requests from the "API Core" chapter as well as the ones from "Using the API" got the expected responses.

Steps I took so far trying to solve this:

  1. I double-checked several times if I maybe overlooked pieces of code in the previous sections of the tutorial, but everything seems to be in order – at least I hope so, see my data/API DDL files as well as the stored procedure for reference.
  2. Found a passage in the docs regarding schema reloading (link) that sounded promising, but issuing a killall -HUP postgrest just returns postgrest: no process found in my Ubuntu VM.
  3. Restarted the postgrest and db containers from subzero-dashboard.

Any help/hints to solve this would be greatly appreciated!

Kind regards,

Christian
