
                                                Cloud-DevOps-assignment

In this assignment I used Microsoft Azure cloud resources to carry out the required operations. Below is the workflow diagram of the assignment.

DevOps workflow

Note: The deployment files and YAML pipelines shown in the screenshots in this write-up are available in the repository; you can check them out.

                                                    Tools used:

Git and Azure Repos – source code management and version control

Docker – application containerization

Azure Container Registry – managing container images

DockerHub – sharing container images

Azure Kubernetes Service – managed container workloads and services

Azure Pipelines – continuous integration (CI) and continuous delivery (CD)

Datadog – metrics collection and monitoring

                                           React-and-Spring-data-rest

Deploy a pipeline to build, test and push the Docker image to Azure Container Registry.

As requested, the pipeline should build, test and push the Docker image to a repository. The pipeline built and pushed the Docker images (front-end and back-end) to an Azure Container Registry named abimbolacontainerregistry, as seen below.

The pipeline below built and pushed the front-end Docker image to the Azure Container Registry.

image
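
For reference, a minimal Azure Pipelines sketch of this build-and-push step could look like the following. This is an illustration only: the service connection name (abimbola-acr-connection), repository name and Dockerfile path are assumptions, not the exact values from the pipeline in the screenshot.

```yaml
# Minimal sketch of the front-end build-and-push stage.
# 'abimbola-acr-connection', 'frontend' and the Dockerfile path are
# hypothetical; substitute the values from your own project.
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: Docker@2
    displayName: Build and push the front-end image to ACR
    inputs:
      command: buildAndPush
      containerRegistry: abimbola-acr-connection   # ACR service connection
      repository: frontend
      Dockerfile: frontend/Dockerfile
      tags: |
        $(Build.BuildId)
```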

The Docker image for the front-end app was successfully built and pushed to the Azure Container Registry, as seen in the screenshot below.

image

The Docker image is in the Azure repository; it could also be pushed to DockerHub or another container image registry. The pipeline below built and pushed the back-end Docker image to the Azure Container Registry; it mirrors the front-end sketch above, with the back-end image name and Dockerfile.

image

The Docker image for the back-end app was successfully built and pushed to the Azure Container Registry, as seen in the screenshot below.

image

The Docker images can be seen in the repository.

image

                       Deploy the infrastructure using Infrastructure as Code (Terraform)

I understand that you would like the deployment pipelines to deploy the applications across different environments on the target infrastructure. However, we first need to build the target infrastructure; in this assignment I used Terraform as the Infrastructure as Code (IaC) tool to deploy it.

The target infrastructure is a managed service, Azure Kubernetes Service (AKS). I used the pipeline below to deploy the AKS cluster.

image
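
A minimal sketch of such a Terraform pipeline is shown below. It assumes the Terraform configuration lives in a terraform directory; authentication (for example a service connection, or ARM_* environment variables for a service principal) is omitted and must be supplied separately.

```yaml
# Sketch of a Terraform deployment pipeline; the 'terraform' working
# directory is an assumption, and Azure credentials are assumed to be
# provided separately (service connection or ARM_* variables).
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - script: terraform init
    displayName: Terraform init
    workingDirectory: terraform

  - script: terraform plan -out=tfplan
    displayName: Terraform plan
    workingDirectory: terraform

  - script: terraform apply -auto-approve tfplan
    displayName: Terraform apply
    workingDirectory: terraform
```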

A screenshot of the Azure Kubernetes Service cluster can be seen below.

image

                The next step is to deploy the applications across different environments on the target infrastructure

I created secrets for the image pull secret and the MySQL database using the YAML file below. This was done from the terminal; the secrets allow authentication to the registry and to the MySQL database.

image
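
As a rough sketch, the MySQL secret could look like the manifest below; the name, key and value are placeholders rather than the actual values used. The image pull secret itself is typically created directly from the terminal with `kubectl create secret docker-registry` rather than from a manifest.

```yaml
# Hypothetical MySQL credentials secret; the name, key and value are
# placeholders, not the values used in the assignment.
apiVersion: v1
kind: Secret
metadata:
  name: mysql-secret
type: Opaque
stringData:
  MYSQL_ROOT_PASSWORD: changeme   # replace with a real password
```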

            To deploy the front-end application using the front-end Docker image in ACR, I used this YAML pipeline

image
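
A minimal sketch of such a front-end deployment manifest follows; the labels, replica count, image tag and pull-secret name are assumptions.

```yaml
# Hypothetical front-end deployment pulling the image from ACR;
# labels, replica count and the pull-secret name are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: frontend
spec:
  replicas: 2
  selector:
    matchLabels:
      app: frontend
  template:
    metadata:
      labels:
        app: frontend
    spec:
      imagePullSecrets:
        - name: acr-pull-secret   # hypothetical image pull secret
      containers:
        - name: frontend
          image: abimbolacontainerregistry.azurecr.io/frontend:latest
          ports:
            - containerPort: 8080
```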

The front-end was successfully deployed

image

The front-end application is deployed to the Azure Kubernetes cluster, as seen below.

image

The front-end application can be accessed using the load balancer IP address and port 8080. In the deployment YAML file there is a service that exposes the deployment with type=LoadBalancer.

image
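
Such a service could look like the sketch below; the names and selector are assumptions matching the hypothetical deployment above.

```yaml
# Hypothetical LoadBalancer service exposing the front-end on port 8080.
apiVersion: v1
kind: Service
metadata:
  name: frontend
spec:
  type: LoadBalancer
  selector:
    app: frontend
  ports:
    - port: 8080
      targetPort: 8080
```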

I created a scaling policy that makes the infrastructure scale automatically.

image

When the CPU usage of the nodes in the Kubernetes cluster stays at or above the 70% threshold for 10 minutes, the cluster automatically scales out by one node; likewise, when the cluster's CPU usage drops below the threshold, the nodes scale back in.
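
The rule above operates at the node level (for example through the AKS cluster autoscaler or an autoscale rule on the node pool's scale set). The pod-level analogue in Kubernetes is a HorizontalPodAutoscaler; below is a minimal sketch with assumed names and replica bounds, targeting the same 70% CPU threshold.

```yaml
# Pod-level analogue of the node scaling rule: an HPA that scales the
# hypothetical front-end deployment at 70% average CPU utilization.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: frontend-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: frontend
  minReplicas: 2
  maxReplicas: 5
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```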

                                              Deploying the back-end application

Before deploying the back-end application, I first deployed a storage class to the Kubernetes cluster, then created the persistent volume claim, and lastly deployed the MySQL database to consume the persistent volume claim, ensuring that the data in the database is persistent. Once this is done, the back-end application can make use of the MySQL database.

See the storage class and persistent volume claim YAML files below.

image

image

I deployed the Storage Class and Persistent Volume Claim from the terminal
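
For reference, a minimal sketch of these two manifests, assuming the Azure Disk CSI provisioner and placeholder names and sizes:

```yaml
# Hypothetical storage class and claim; the names, size and
# provisioner are assumptions.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: mysql-storage
provisioner: disk.csi.azure.com   # Azure Disk CSI driver on AKS
reclaimPolicy: Retain
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: mysql-pvc
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: mysql-storage
  resources:
    requests:
      storage: 5Gi
```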

                                        To deploy the MySQL database, I used this YAML pipeline

image
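
A minimal sketch of such a MySQL Deployment, reusing the hypothetical secret and claim from the sketches above (the image tag and names are assumptions):

```yaml
# Hypothetical MySQL deployment consuming the secret and the
# persistent volume claim sketched earlier.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mysql
spec:
  replicas: 1
  strategy:
    type: Recreate   # an Azure Disk volume can only attach to one pod
  selector:
    matchLabels:
      app: mysql
  template:
    metadata:
      labels:
        app: mysql
    spec:
      containers:
        - name: mysql
          image: mysql:8.0
          env:
            - name: MYSQL_ROOT_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: mysql-secret
                  key: MYSQL_ROOT_PASSWORD
          ports:
            - containerPort: 3306
          volumeMounts:
            - name: mysql-data
              mountPath: /var/lib/mysql
      volumes:
        - name: mysql-data
          persistentVolumeClaim:
            claimName: mysql-pvc
```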

Database created successfully

image

The MySQL database was deployed to the Kubernetes cluster as a Deployment.

image

The persistent volume claim that persists the data in the MySQL database bound successfully.

image

                                To deploy the back-end application using the back-end Docker image in ACR, I used this YAML pipeline

image

The MySQL environment variables were used to deploy the back-end application, as can be seen in the deployment file.
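
As an illustration, the container section of such a back-end deployment might wire the database in as below. This is a fragment of the pod spec only, and the Spring-style variable names, service host and database name are assumptions:

```yaml
# Hypothetical fragment of the back-end pod spec; the variable names,
# host and database name are assumptions, not the actual values.
containers:
  - name: backend
    image: abimbolacontainerregistry.azurecr.io/backend:latest
    ports:
      - containerPort: 8080
    env:
      - name: SPRING_DATASOURCE_URL
        value: jdbc:mysql://mysql:3306/appdb   # 'mysql' service, assumed DB name
      - name: SPRING_DATASOURCE_USERNAME
        value: root
      - name: SPRING_DATASOURCE_PASSWORD
        valueFrom:
          secretKeyRef:
            name: mysql-secret
            key: MYSQL_ROOT_PASSWORD
```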

                                             Monitoring the Infrastructure

Datadog can help you get full visibility into your AKS deployment by collecting metrics, distributed request traces, and logs from Kubernetes, Azure, and every service running in your container infrastructure. To start monitoring AKS with Datadog, all you need to do is configure the integrations for Kubernetes and Azure. Deploy the containerized Datadog Agent as a DaemonSet within your AKS cluster using the Helm chart.

                                                    Install Datadog into Kubernetes using the Helm chart

# Set API_KEY to your Datadog API key before running the commands below.
API_KEY=""

# Add the Datadog Helm chart repository.
helm repo add datadog https://helm.datadoghq.com

# Install the Datadog Agent chart with APM enabled, pointing at the US site.
helm install datadog --set datadog.site='datadoghq.com' --set datadog.apiKey=$API_KEY --set datadog.apm.enabled=true datadog/datadog

The nodes in the Kubernetes cluster can be seen below.

image

The Datadog Agent was deployed as a DaemonSet; the agents can be seen in the Datadog portal.

image

Each agent represents a node running in the Kubernetes cluster; once an agent is selected, you can see the properties of that node, as shown in the screenshot below.

image

                                         Create a dashboard and start monitoring the cluster

image

                                             Terraform pipeline to clean up the infrastructure

image
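
A minimal sketch of this clean-up pipeline, assuming the same terraform working directory and remote state as the deployment pipeline:

```yaml
# Sketch of a Terraform clean-up pipeline; assumes the same working
# directory and backend state as the deployment pipeline.
trigger: none   # run manually when the environment should be torn down

pool:
  vmImage: ubuntu-latest

steps:
  - script: terraform init
    displayName: Terraform init
    workingDirectory: terraform

  - script: terraform destroy -auto-approve
    displayName: Terraform destroy
    workingDirectory: terraform
```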

This is a public repository; to fork it, simply select Fork.

image

Thank You
