directus_gcp

Note: This is the original step-by-step tutorial that was removed from the Directus docs.

Run Directus locally: https://www.npmjs.com/package/create-directus-project

Google Cloud Platform

While there are many different ways to run Directus on GCP, from a single Compute Engine instance to a full Kubernetes Engine stack, we like the following combination of services:

Google Cloud Run

We will run Directus in an autoscaling managed container environment called Google Cloud Run. Cloud Run scales down to zero instances by default, but can be configured to meet virtually any scalability requirement.

While there are numerous ways to get a Docker container up and running in Cloud Run, we will walk you through a simple setup that still leaves plenty of room for improvement. The Additional improvements section lists further improvements you can make to your setup.

Google Cloud SQL

Fully managed relational database service for MySQL, PostgreSQL, and SQL Server. This will be the persistent database layer. In this guide we will use PostgreSQL.

Google Cloud Storage

Cloud Storage will be used as object/file storage. Since we're running Directus in Google Cloud, it is authenticated and authorized to access all storage buckets in the same Google Cloud project by default. We just need to configure it through the environment variables.

Cost

Running Directus in Google Cloud can and will incur costs. Please read and understand all pricing options per service before rolling out Directus on Google Cloud. As a rule of thumb, with the setup described here the main cost comes from the persistent PostgreSQL instance in Cloud SQL. You can estimate the pricing here: https://cloud.google.com/products/calculator

Cloud Run click to deploy

Just to see Directus work on Google Cloud, you can try it out by clicking the button below. Keep in mind that this will use a non-persistent SQLite database inside the container itself, so everything will be lost once the container shuts down. And it will shut down.

Run on Google Cloud

After deploying, you can log in with [email protected] and localpassword

Walkthrough: complete setup

Let's get into it.

  1. Install the Google Cloud SDK. We will run a lot of gcloud commands to get your GCP environment set up.

  2. Go over the steps for Manually installing Directus

  3. Add a start script to your package.json like so:

"scripts": {
		"start": "npx directus bootstrap; npx directus start"
}
  4. Add a new Dockerfile to the root of your newly set up Directus folder and add these contents:
FROM node:16-alpine
WORKDIR /src
ADD . /src
RUN npm install --production
CMD ["npm", "run", "start"]
  5. Create a new project in Google Cloud: gcloud projects create <your-project-id> and make sure your-project-id is globally unique. Write down this ID, as it will be used in all subsequent gcloud commands.

  6. Add or link a billing account to your project by going here: https://console.cloud.google.com/billing/linkedaccount?project=<your-project-id>. Since the database and storage bucket are persistent, there are costs involved (see Cost above).

  7. Create a storage bucket where Directus will store files:

gsutil mb -p <your-project-id> -c standard -l europe-west4 gs://<unique-bucket-name>/

Some notes:

  • The region (europe-west4) is the region where the bucket resides. It's a good idea, sometimes even mandatory, to set all services to the same region.
  • The bucket name should be globally unique
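
To verify that the bucket exists and lives in the expected region, you can optionally list its metadata (a quick check, not part of the original tutorial):

gsutil ls -L -b gs://<unique-bucket-name>/
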
  8. Create a PostgreSQL 13 database instance:
gcloud sql instances create <database-instance-name> --region=europe-west4 --tier=db-f1-micro --project=<your-project-id> --database-version=POSTGRES_13

Some notes:

  • The region (europe-west4) is the region where the database resides. It's a good idea, sometimes even mandatory, to set all services to the same region.
  • The tier determines resources and cost of the database. For this example we've picked the smallest one.
  • This operation will take a while. If, for some reason, the gcloud command times out, you can still find your database instance and its status in the Cloud SQL console.

The output will look something like this:

Creating Cloud SQL instance...done.
Created [https://sqladmin.googleapis.com/sql/v1beta4/projects/your-project-id/instances/your-project-id-pg13].
NAME                         DATABASE_VERSION  LOCATION        TIER         PRIMARY_ADDRESS  PRIVATE_ADDRESS  STATUS
your-project-id-pg13         POSTGRES_13       europe-west4-b  db-f1-micro  123.456.789.0    -                RUNNABLE

Write down the IP address (in this example 123.456.789.0), you'll need to set it in your .env later
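
If you lost the output or the command timed out, you can always list your instances and their status again (optional check):

gcloud sql instances list --project=<your-project-id>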

  9. Set the root user password in your database:
gcloud sql users set-password root --host=% --instance <database-instance-name> --password <your-safe-root-password> --project=<your-project-id>
  10. Create the directus database:
gcloud sql databases create directus --instance=<database-instance-name> --project=<your-project-id>

In this example the database is called directus

  11. Get the connection name of your Cloud SQL instance:
gcloud sql instances describe <database-instance-name> --project=<your-project-id>

You will need the value of connectionName in steps 12 and 14.
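
The connection name has the form <project-id>:<region>:<instance-name>. If you only want that single value, you can optionally ask gcloud for it directly:

gcloud sql instances describe <database-instance-name> --project=<your-project-id> --format='value(connectionName)'
# prints something like: your-project-id:europe-west4:<database-instance-name>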

  12. Add these items to your .env file:
DB_CLIENT="pg"
DB_PORT="5432"
DB_DATABASE=directus
DB_USER=root
DB_PASSWORD=<your-root-password>
DB_HOST=/cloudsql/<connection-name-from-step-11>
STORAGE_LOCATIONS="gcs"
STORAGE_GCS_DRIVER="gcs"
STORAGE_GCS_BUCKET=<your-bucket-name>
ADMIN_EMAIL="[email protected]"
ADMIN_PASSWORD="localpassword"
KEY="secretkey"
SECRET="secret"
LOGGER_LEVELS="trace:DEBUG,debug:DEBUG,info:INFO,warn:WARNING,error:ERROR,fatal:CRITICAL"

Notes:

  • The value of connectionName from step 11 should be prefixed with /cloudsql/ and used as the value of DB_HOST (see the worked example after these notes).
  • LOGGER_LEVELS is optional, but makes Directus logs show up with the correct level in Google Cloud Logging.
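
As a worked example, if the connection name from step 11 were your-project-id:europe-west4:<database-instance-name>, the DB_HOST line would read:

DB_HOST=/cloudsql/your-project-id:europe-west4:<database-instance-name>
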
  13. Build and push your container. Run these commands:
docker build -t eu.gcr.io/<your-project-id>/directus .
gcloud auth configure-docker -q
docker push eu.gcr.io/<your-project-id>/directus

This will build the Docker container, authenticate your Docker installation with Google Cloud Platform, and push the container image to the Container Registry in your GCP project.
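
If you don't have (or don't want to use) a local Docker installation, you could alternatively let Cloud Build build and push the image for you; this is an optional variation on the step above and requires the Cloud Build API to be enabled in your project:

gcloud builds submit . --tag eu.gcr.io/<your-project-id>/directus --project <your-project-id>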

  14. Deploy your container:
gcloud run deploy directus \
     --project "<your-project-id>" \
     --image "eu.gcr.io/<your-project-id>/directus:latest" \
     --region "europe-west1" \
     --platform "managed" \
     --allow-unauthenticated \
     --add-cloudsql-instances "<database-connection-name-from-step11>"

Note: the value of connectionName from step 11 should be used as the value of --add-cloudsql-instances without any prefix.

  15. Done! The deploy command should have shown you the URL where you can access your Directus instance. You can log in with [email protected] and localpassword.
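
If you ever need to look up the service URL again, you can query it from Cloud Run (optional):

gcloud run services describe directus --project <your-project-id> --region europe-west1 --platform managed --format 'value(status.url)'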

Additional improvements

  • You should make your Cloud SQL instance accessible only through a private IP and a VPC connector. That way, only your current cloud project can access the database. More on this in the Cloud SQL private IP documentation.
  • You should not keep your .env file locally and build it into your Docker image. Ideally you should save your .env file in Google Cloud Secret Manager, retrieve it in your CI/CD pipeline, and add it to your container. Or, even better, let the container pick up the .env at runtime from Google Secret Manager.
  • Cloud Run typically allocates CPU and memory only in a request context, meaning async hooks and other background work get drastically fewer resources, often resulting in those processes not completing. You have two options: enable CPU that is always allocated (which will increase cost), or handle everything in your extensions synchronously.
  • You could set up caching using Memorystore for Redis.
  • Since, by default, Google Cloud Run scales back to zero instances of the container, it's impossible to use Schedule hooks: there is no container around to handle those schedules / cron jobs. Again, you have two options: set the minimum number of instances to 1, which will definitely increase cost since at least one container keeps running 24/7; or use Cloud Scheduler to schedule calls to custom endpoints that perform the tasks (see the sketch after this list).
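
As a rough sketch of the Cloud Scheduler option (the job name, schedule, and endpoint path below are placeholders, not part of the original tutorial; you may also need to enable the Cloud Scheduler API first):

gcloud scheduler jobs create http directus-scheduled-task \
     --project=<your-project-id> \
     --location=europe-west1 \
     --schedule="0 3 * * *" \
     --http-method=GET \
     --uri="https://<your-cloud-run-url>/<your-custom-endpoint>"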
