# appyter-catalog

A catalog of appyters.

Pull requests are encouraged; please refer to the example for registering your own appyter.
Currently, because this application deals with several independent appyters, we construct a Dockerfile for each one independently and facilitate deployment with a `docker-compose.yml`. In the future this could be extended to automatically generate a Kubernetes deployment, or to simply use docker-in-docker, but for now a simple `Makefile` does the trick of hosting the docker-compose setup on a single system.
```bash
# Download the catalog locally
git clone git@github.com:MaayanLab/appyter-catalog.git
# Run only, no build
make docker-compose.yml && docker-compose pull && docker-compose up -d --remove-orphans
# Build and run
make build && docker-compose up -d --remove-orphans
```
The appyter-catalog does several things to integrate several independent appyters, each with its own dependencies, while permitting modifications at the level of the entire application.
- Submit a pull request with the new appyter added to the `appyters` directory.
  - `.github/workflows/validate_merge.yml` instructs GitHub to execute `validate/validate_merge.py`
  - `validate/validate_merge.py` validates the structure of the directory, including:
    - Asserting that `appyter.json` is formatted according to the `schema/appyter-validator.json` json-schema validator
    - Asserting that other relevant files are present
    - Using `compose/build_dockerfile.py` to construct and build a Dockerfile the same way it would be done in production
  - The PR is accepted if and only if validation and manual review both pass
- The `Makefile` can be used to facilitate the remaining steps:
  - Run `compose/build_dockerfile.py` for each appyter to inject `override`s and `merge_j2`, and construct a Dockerfile for the `appyter`
    - When built, the files in `override` will be merged (using `compose/merge_j2.py`) with the appyter's own `appyter` overrides
  - Run `compose/build_appyters.py` to build a unified `appyters.json` file containing information about each appyter for the `app`
  - Run `cd app && npm i && npm run build` to build the `app` (written in nodejs) with the most recently rendered `appyters.json`
  - Run `compose/build_compose.py` to build an application-wide `docker-compose.yml`, which includes a unified proxy for serving all apps on one endpoint
  - Run `docker-compose build` to build all Dockerfiles for the `appyters` and the `app`
    - Variables in `.env` are automatically loaded by `docker-compose`
      - `appyter_version` in `.env` is used as a Dockerfile `ARG`, permitting easy updates to the appyter version used by all `appyters`
    - A `postgres` database is used through `postgrest` for `app` state; `postgres/migrations` contains the `postgres` schema of that database, which is applied at database initialization in `postgres/Dockerfile`
  - Run `docker-compose up -d` to start all docker containers in the application.
    - Variables in `.env` are automatically loaded by `docker-compose`
    - `maayanlab/proxy` is used to proxy different paths to their respective containers and to set up `https` with `letsencrypt`
    - `postgrest` exposes `postgres` tables, views, and functions on the `api` schema with the `guest` role over HTTP at `/postgrest`
    - `app` facilitates showing all `appyters` and navigating users to the mount location of the actual `appyter` container
    - `data/<container_name>` contains all application data, split up by container and mounted from the host
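How `appyter_version` flows from `.env` into each build can be illustrated with a hypothetical `docker-compose.yml` fragment. The service name and build context below are made up; only the `.env`-to-build-arg wiring is from the source:

```yaml
services:
  example-appyter:              # hypothetical service name
    build:
      context: appyters/example # hypothetical appyter directory
      args:
        # docker-compose interpolates appyter_version from .env automatically,
        # and the generated Dockerfile receives it via ARG appyter_version
        appyter_version: ${appyter_version}
```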
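The validation step above can be sketched in Python. This is only an illustration of the kind of checks `validate/validate_merge.py` performs; the required field names and file names below are assumptions, not the actual schema:

```python
import json
from pathlib import Path

# Hypothetical required fields/files -- illustrative, not the real schema,
# which lives in schema/appyter-validator.json.
REQUIRED_FIELDS = {"name", "title", "version"}
REQUIRED_FILES = {"appyter.json", "README.md"}

def validate_appyter(directory: Path) -> list:
    """Return a list of human-readable validation errors (empty if valid)."""
    errors = []
    present = {p.name for p in directory.iterdir()}
    # Check that all required files exist in the appyter directory
    for filename in sorted(REQUIRED_FILES - present):
        errors.append(f"missing required file: {filename}")
    # Check that appyter.json carries the required metadata fields
    config_path = directory / "appyter.json"
    if config_path.exists():
        config = json.loads(config_path.read_text())
        for field in sorted(REQUIRED_FIELDS - config.keys()):
            errors.append(f"appyter.json missing field: {field}")
    return errors
```

In practice a json-schema validator would replace the hand-rolled field checks, but the pass/fail contract is the same: an empty error list means the PR can proceed to manual review.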
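The `override` mechanism can be pictured as an overlay in which files from `override` shadow or supplement the appyter's own files. A minimal sketch, assuming a simple filename-to-content mapping (the real `compose/merge_j2.py` works on Jinja2 templates, which additionally permits block-level merging rather than whole-file replacement):

```python
def overlay_files(appyter_files: dict, override_files: dict) -> dict:
    """Return the file set after applying overrides: any file present in
    override_files replaces the appyter's version; others pass through."""
    merged = dict(appyter_files)
    merged.update(override_files)
    return merged
```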
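Similarly, the unified `appyters.json` build can be sketched as a walk over the catalog. The exact fields the `app` consumes are an assumption here; only the per-appyter `appyter.json` input and the aggregated output file come from the source:

```python
import json
from pathlib import Path

def build_appyters(catalog_root: Path) -> list:
    """Collect every appyters/<name>/appyter.json into a single list,
    recording which directory each entry came from."""
    appyters = []
    for config_path in sorted(catalog_root.glob("appyters/*/appyter.json")):
        config = json.loads(config_path.read_text())
        config["path"] = config_path.parent.name  # assumed field name
        appyters.append(config)
    return appyters
```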