lucasallan / gcloud

This project forked from blacklabelops-legacy/gcloud

Dockerized Google Cloud SDK. Run & Schedule Commands Against the Cloud!

License: MIT License


gcloud's Introduction

This container bundles the latest Google Cloud SDK along with all of its components. It is the easiest way to run commands against your cloud instances and apps, and to switch between projects and regions!

Furthermore, you can schedule your commands with cron in order to manage the cloud!

Make It Short!

In short, this container lets you run Google Cloud SDK commands against your cloud projects. Just execute:

$ docker run -it --rm \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    -e "CLOUDSDK_CORE_PROJECT=example-project" \
    -e "CLOUDSDK_COMPUTE_ZONE=europe-west1-b" \
    -e "CLOUDSDK_COMPUTE_REGION=europe-west1" \
    blacklabelops/gcloud \
    gcloud compute instances list

This lists the instances inside the specified cloud project. Note: the authentication credentials are read from the file auth.json.
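
For reference, a Google Cloud service account key file has roughly the following shape; all values below are placeholders for illustration, not real credentials:

{
  "type": "service_account",
  "project_id": "example-project",
  "private_key_id": "PLACEHOLDER_KEY_ID",
  "private_key": "-----BEGIN PRIVATE KEY-----\nPLACEHOLDER\n-----END PRIVATE KEY-----\n",
  "client_email": "backup-agent@example-project.iam.gserviceaccount.com",
  "client_id": "PLACEHOLDER_CLIENT_ID"
}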

You can even set up a Cron Schedule and manage the cloud!

$ docker run --rm \
    -v $(pwd)/logs/:/logs \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    -e "GCLOUD_CRON=$(base64 example-crontab.txt)" \
    blacklabelops/gcloud
$ cat logs/gcloud.log

This starts the schedule and logs to the local logs folder. The cron schedule is defined inside the file example-crontab.txt.

Use Cases

  • Managing backups by pushing files to Cloud Storage and Buckets.
  • Restoring backups from Cloud Storage and Buckets into containers.
  • Executing long-running file transfers.

Google Cloud API

You can only run commands against existing cloud projects!

Documentation can be found here: Creating & Deleting Projects

You will also have to activate APIs manually before you can use them!

Documentation can be found here: Activating & Deactivating APIs
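
For example, assuming you want to use Compute Engine, a recent SDK can enable the corresponding API from inside the container; the project name is the same placeholder used throughout this README:

$ gcloud services enable compute.googleapis.com --project example-project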

Google Cloud Authentication

There are two ways to authenticate the gcloud tools and execute gcloud commands. Both ways need a Google Cloud OAuth Service Account file. This is documented here: Service Account Authentication.
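
If you do not have such a key file yet, one way to create one is sketched below; the service account name backup-agent is an assumption for illustration:

$ gcloud iam service-accounts create backup-agent
$ gcloud iam service-accounts keys create auth.json \
    --iam-account backup-agent@example-project.iam.gserviceaccount.com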

You can now mount the file into your container and execute commands like this:

$ docker run -it --rm \
    -v $(pwd)/auth.json:/auth.json \
    -e "GCLOUD_ACCOUNT_FILE=/auth.json" \
    -e "[email protected]" \
    blacklabelops/gcloud \
    bash
$ gcloud compute instances list

The first command opens a bash console inside the container; the second command is then executed inside the authenticated container. This works with both JSON and P12 key files.

You can also Base64-encode the authentication file and pass it in an environment variable. This works perfectly for long-running, stand-alone containers.

$ docker run -it --rm \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    blacklabelops/gcloud \
    bash
$ gcloud compute instances list

As before, this opens a bash console inside the authenticated container, and the second command is executed there. This works with both JSON and P12 key files.

Setting the Cloud Project

Set your default Google Project by defining the CLOUDSDK_CORE_PROJECT environment variable.

$ docker run -it --rm \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    -e "CLOUDSDK_CORE_PROJECT=example-project" \
    blacklabelops/gcloud \
    bash
$ gcloud compute instances list

This runs all commands against the project example-project.
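
CLOUDSDK_CORE_PROJECT is a standard Cloud SDK environment variable; the equivalent persistent setting from inside the container would be the following sketch:

$ gcloud config set project example-project
$ gcloud config list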

Setting the Zone and Region

Set your default Google Project Zone and Region with the environment variables CLOUDSDK_COMPUTE_ZONE and CLOUDSDK_COMPUTE_REGION.

The documentation can be found here: Regions & Zones

Example:

$ docker run -it --rm \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    -e "CLOUDSDK_CORE_PROJECT=example-project" \
    -e "CLOUDSDK_COMPUTE_ZONE=europe-west1-b" \
    -e "CLOUDSDK_COMPUTE_REGION=europe-west1" \
    blacklabelops/gcloud \
    bash
$ gcloud compute zones list
$ gcloud compute regions describe ${CLOUDSDK_COMPUTE_REGION}

This sets your region and zone to Belgium (europe-west1). The describe command prints further details about the region.
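
CLOUDSDK_COMPUTE_ZONE and CLOUDSDK_COMPUTE_REGION are likewise standard Cloud SDK environment variables; the equivalent persistent settings from inside the container would be:

$ gcloud config set compute/zone europe-west1-b
$ gcloud config set compute/region europe-west1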

Cron Scheduling

This container can run scheduled gcloud commands using cron. The crontab can either be mounted into the container or converted into a Base64 string and passed in via an environment variable.

A working example crontab can be found here: example-crontab.txt

Please note that when commands are triggered by cron, the environment variables have to be configured inside the crontab. See my example file for details, or the sketch below.

Also note that if you need to include your own scripts, you just have to extend this container.
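
For illustration, a minimal crontab in that spirit could look like this; the schedule and settings are assumptions for the sketch, not the contents of the real example-crontab.txt:

CLOUDSDK_CORE_PROJECT=example-project
CLOUDSDK_COMPUTE_ZONE=europe-west1-b
# every night at 02:00, list the instances into the log
0 2 * * * gcloud compute instances list >> /logs/gcloud.log 2>&1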

Mounting a crontab:

$ docker run -d \
    -v $(pwd)/example-crontab.txt:/example-crontab.txt \
    -v $(pwd)/logs/:/logs \
    -e "GCLOUD_CRONFILE=/example-crontab.txt" \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    blacklabelops/gcloud

The GCLOUD_CRONFILE environment variable tells the entrypoint script where to find the mounted crontab.

Using a Base64 encoded crontab:

$ docker run -d \
    -v $(pwd)/logs/:/logs \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    -e "GCLOUD_CRON=$(base64 example-crontab.txt)" \
    blacklabelops/gcloud

The authentication file and crontab are encoded on the fly.

Google Cloud SDK Logging

This container does not write a logfile by default. That is considered bad practice, as logs should be accessed with the command docker logs. There are use cases where you want an additional log file, e.g. my use case is to relay the log to Loggly (Loggly Homepage). I have added a logging routine, which is activated by defining a logfile.

Environment Variable: LOG_FILE

Example for a separate volume with a logfile:

$ docker run -d \
    --name gcloudcron \
    -v $(pwd)/logs/:/gcloudlogs \
    -e "LOG_FILE=/gcloudlogs/cron.log" \
    -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
    -e "[email protected]" \
    -e "GCLOUD_CRON=$(base64 example-crontab.txt)" \
    blacklabelops/gcloud

You can watch the log by typing cat ./logs/cron.log.

Now let's hook up the container with my Loggly side-car container and relay the log to Loggly! The full documentation of the Loggly container can be found here: blacklabelops/loggly

$ docker run -d \
  --volumes-from gcloudcron \
  -e "LOGS_DIRECTORIES=/gcloudlogs" \
	-e "LOGGLY_TOKEN=412e12ee-12e12e1-12e12e-12e12e" \
  -e "LOGGLY_TAG=gcloudlog" \
  --name gcloudloggly \
  blacklabelops/loggly

Note: You need a valid Loggly Customer Key in order to log to Loggly.

Google Storage Container Backups

First run the Jenkins example container:

$ docker run -d -p 8090:8080 --name jenkins_jenkins_1 blacklabelops/jenkins

This pulls the image and starts the latest Jenkins on port 8090.

Instant backup of the jenkins volume using a run-once container:

$ docker run \
  --volumes-from jenkins_jenkins_1 \
  -v $(pwd)/backups/:/backups \
  -v $(pwd)/logs/:/logs \
  -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
  -e "[email protected]" \
  blacklabelops/gcloud \
  bash -c "cd /jenkins/ && tar -czvf /backups/JenkinsBackup$(date +%Y-%m-%d-%H-%M-%S).tar.gz * && gsutil rsync /backups gs://jenkinsbackups"

Note: you need a Cloud Storage bucket named jenkinsbackups for this to work.
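
If the bucket does not exist yet, it can be created with gsutil; bucket names are globally unique, so treat jenkinsbackups as a placeholder:

$ gsutil mb gs://jenkinsbackups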

Now the same backup as a cron job with a predefined schedule:

$ docker run \
  --volumes-from jenkins_jenkins_1 \
  -v $(pwd)/backups/:/backups \
  -v $(pwd)/logs/:/logs \
  -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
  -e "[email protected]" \
  -e "GCLOUD_CRON=$(base64 example-crontab.backup.txt)" \
  blacklabelops/gcloud

This runs the backup on the cron schedule defined inside the file example-crontab.backup.txt.
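
The reverse direction, restoring a backup into the Jenkins volume, can be sketched the same way, mirroring the backup example above; the archive name is a placeholder and should match an existing object in the bucket:

$ docker run --rm \
  --volumes-from jenkins_jenkins_1 \
  -v $(pwd)/backups/:/backups \
  -e "GCLOUD_ACCOUNT=$(base64 auth.json)" \
  -e "[email protected]" \
  blacklabelops/gcloud \
  bash -c "gsutil cp gs://jenkinsbackups/JenkinsBackup-example.tar.gz /backups/ && tar -xzvf /backups/JenkinsBackup-example.tar.gz -C /jenkins/"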

Support

Leave a message and ask questions on Hipchat: blacklabelops/hipchat
