
aws-codepipeline-terraform-cicd-samples's Introduction

AWS CodePipeline CI/CD example

Terraform is an infrastructure-as-code (IaC) tool that helps you create, update, and version your infrastructure in a secure and repeatable manner.

The scope of this pattern is to provide a guide and ready-to-use Terraform configurations for setting up validation pipelines with end-to-end tests based on AWS CodePipeline, AWS CodeBuild, AWS CodeCommit, and Terraform.

The created pipeline follows best practices for infrastructure validation and has the following stages:

  • validate - This stage runs Terraform IaC validation tools and commands such as terraform validate, terraform fmt, tfsec, tflint, and checkov.
  • plan - This stage creates an execution plan, which lets you preview the changes that Terraform plans to make to your infrastructure.
  • apply - This stage uses the plan created above to provision the infrastructure in the test account.
  • destroy - This stage destroys the infrastructure created in the above stage.

Running these four stages ensures the integrity of the Terraform configurations.
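Each stage is driven by a buildspec file from the templates directory. As an illustrative sketch (not the repository's exact file contents), a validate-stage buildspec could run commands along these lines:

```yaml
version: 0.2
phases:
  build:
    commands:
      - terraform init -backend=false   # no remote backend needed for static checks
      - terraform validate              # configuration syntax and consistency
      - terraform fmt -check -recursive # formatting check
      - tfsec .                         # static security analysis
      - tflint                          # Terraform linter
      - checkov -d .                    # policy-as-code scan
```

The repository's actual buildspec_validate.yml delegates this work to templates/scripts/tf_ssp_validation.sh; the sketch above only shows the kind of commands involved.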

Directory Structure

|-- CODE_OF_CONDUCT.md
|-- CONTRIBUTING.md
|-- LICENSE
|-- README.md
|-- data.tf
|-- examples
|   `-- terraform.tfvars
|-- locals.tf
|-- main.tf
|-- modules
|   |-- codebuild
|   |   |-- README.md
|   |   |-- main.tf
|   |   |-- outputs.tf
|   |   `-- variables.tf
|   |-- codecommit
|   |   |-- README.md
|   |   |-- data.tf
|   |   |-- main.tf
|   |   |-- outputs.tf
|   |   `-- variables.tf
|   |-- codepipeline
|   |   |-- README.md
|   |   |-- main.tf
|   |   |-- outputs.tf
|   |   `-- variables.tf
|   |-- iam-role
|   |   |-- README.md
|   |   |-- data.tf
|   |   |-- main.tf
|   |   |-- outputs.tf
|   |   `-- variables.tf
|   |-- kms
|   |   |-- README.md
|   |   |-- main.tf
|   |   |-- outputs.tf
|   |   `-- variables.tf
|   `-- s3
|       |-- README.md
|       |-- main.tf
|       |-- outputs.tf
|       `-- variables.tf
|-- templates
|   |-- buildspec_apply.yml
|   |-- buildspec_destroy.yml
|   |-- buildspec_plan.yml
|   |-- buildspec_validate.yml
|   `-- scripts
|       `-- tf_ssp_validation.sh
`-- variables.tf

Installation

Step 1: Clone this repository.

git clone git@github.com:aws-samples/aws-codepipeline-terraform-cicd-samples.git

Note: If you don't have git installed, install git.

Step 2: Update the variables in examples/terraform.tfvars based on your requirements. Make sure you update the variables project_name, environment, source_repo_name, source_repo_branch, create_new_repo, stage_input, and build_projects.

  • If you plan to use an existing Terraform CodeCommit repository, set the variable create_new_repo to false and provide the name of your existing repo in the variable source_repo_name.
  • If you plan to create a new Terraform CodeCommit repository, set the variable create_new_repo to true and provide the name of your new repo in the variable source_repo_name.
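For illustration, a minimal examples/terraform.tfvars could look like the following. All values are placeholders; stage_input and the remaining variables follow the structure shown in the shipped sample file.

```hcl
# All values below are placeholders - adjust to your environment.
project_name       = "tf-cicd-demo"
environment        = "dev"
create_new_repo    = true                # set to false to reuse an existing repo
source_repo_name   = "my-terraform-repo"
source_repo_branch = "main"
build_projects     = ["validate", "plan", "apply", "destroy"]
```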

Step 3: Update the remote backend configuration as required.
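As a sketch of what that backend configuration might look like, an S3 remote backend block (bucket, key, and table names below are placeholders) could be:

```hcl
terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket"  # placeholder bucket name
    key            = "codepipeline/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-state-lock"       # optional: state locking
    encrypt        = true
  }
}
```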

Step 4: Configure the AWS Command Line Interface (AWS CLI) where this IaC is being executed. For more information, see Configuring the AWS CLI.

Step 5: Initialize the directory. Run terraform init

Step 6: Start a Terraform run using the command terraform apply

Note: A sample terraform.tfvars is available in the examples directory. Use the command below to pass this sample tfvars file as input to the apply command.

terraform apply -var-file=./examples/terraform.tfvars

Pre-Requisites

Step 1: The installation step outputs source_repo_clone_url_http. Clone the repository to your local machine.

git clone <source_repo_clone_url_http>

Step 2: Clone this repository.

git clone git@github.com:aws-samples/aws-eks-accelerator-for-terraform.git

Note: If you don't have git installed, install git.

Step 3: Copy the templates folder to the AWS CodeCommit source code repository that contains the Terraform code to be deployed.

cd examples/ci-cd/aws-codepipeline
cp -r templates $YOUR_CODECOMMIT_REPO_ROOT

Step 4: Update the variables in the template files with appropriate values and push the changes.

Step 5: Trigger the pipeline created in the Installation step.

Note 1: The IAM role used by the newly created pipeline is very restrictive and follows the principle of least privilege. Update the IAM policy with the required permissions. Alternatively, set create_new_role = false to use an existing IAM role, and specify the role name in the variable codepipeline_iam_role_name.

Note 2: If the create_new_repo flag is set to true, a new blank repository is created with the name assigned to the variable source_repo_name. Since this repository will contain neither the templates folder specified in Step 3 nor any code files, the initial run of the pipeline will fail in the Download-Source stage.

Note 3: If the create_new_repo flag is set to false to use an existing repository, ensure the prerequisite steps specified in Step 3 have been completed on the target repository.

Requirements

Name Version
terraform >= 1.0.0

Providers

Name Version
aws >= 4.20.1

Modules

Name Source Version
codebuild_terraform ./modules/codebuild n/a
codecommit_infrastructure_source_repo ./modules/codecommit n/a
codepipeline_iam_role ./modules/iam-role n/a
codepipeline_kms ./modules/kms n/a
codepipeline_terraform ./modules/codepipeline n/a
s3_artifacts_bucket ./modules/s3 n/a

Resources

Name Type
aws_caller_identity.current data source
aws_region.current data source

Inputs

Name Description Type Default Required
build_project_source Source type for the CodeBuild project string "CODEPIPELINE" no
build_projects List of CodeBuild projects to be created list(string) n/a yes
builder_compute_type Compute type used by the CodeBuild project string "BUILD_GENERAL1_SMALL" no
builder_image Docker image to be used by CodeBuild string "aws/codebuild/amazonlinux2-x86_64-standard:3.0" no
builder_image_pull_credentials_type Image pull credentials type used by the CodeBuild project string "CODEBUILD" no
builder_type Type of CodeBuild run environment string "LINUX_CONTAINER" no
codepipeline_iam_role_name Name of the IAM role to be used by CodePipeline string "codepipeline-role" no
create_new_repo Whether to create a new repository (true or false). Defaults to true. bool true no
create_new_role Whether to create a new IAM role (true or false). Defaults to true. bool true no
environment Environment in which the script is run, e.g. dev, prod string n/a yes
project_name Unique name for this project string n/a yes
repo_approvers_arn ARN or ARN pattern of the IAM user/role/group that can approve pull requests string n/a yes
source_repo_branch Default branch in the source repo for which CodePipeline is configured string n/a yes
source_repo_name Name of the CodeCommit source repository string n/a yes
stage_input List of maps defining the stages of the CodePipeline list(map(any)) n/a yes

Outputs

Name Description
codebuild_arn The ARN of the CodeBuild project
codebuild_name The name of the CodeBuild project
codecommit_arn The ARN of the CodeCommit repository
codecommit_name The name of the CodeCommit repository
codecommit_url The clone URL of the CodeCommit repository
codepipeline_arn The ARN of the CodePipeline
codepipeline_name The name of the CodePipeline
iam_arn The ARN of the IAM role used by the CodePipeline
kms_arn The ARN of the KMS key used in the CodePipeline
s3_arn The ARN of the S3 bucket
s3_bucket_name The name of the S3 bucket

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.


aws-codepipeline-terraform-cicd-samples's Issues

tfsec console output shows success even when there are failures

The validation stage gives the output below, but in reality there are failures for both the checkov and tfsec validations.

## VALIDATION Summary ##
------------------------
Terraform Validate : 0
Terraform Format   : 0
Terraform checkov  : 1
Terraform tfsec    : 0
------------------------

Terraform Error: The bucket does not allow ACLs

Getting the following error when setting up the project:

│ Error: creating S3 bucket ACL for tf-validate-project-rpl20230723095148611900000001: AccessControlListNotSupported: The bucket does not allow ACLs
│ status code: 400, request id: V6MDTSSF5QK5H1JT, host id: KME82KNLmvnEfdMNOsbUfC8+bf1F6KiSYfcppi3NnFOZgVwOjdd0qwoHh+Dfsv+sE7xDNIiten8=

│ with module.s3_artifacts_bucket.aws_s3_bucket_acl.replication_bucket_acl,
│ on modules/s3/main.tf line 120, in resource "aws_s3_bucket_acl" "replication_bucket_acl":
│ 120: resource "aws_s3_bucket_acl" "replication_bucket_acl" {



│ Error: creating S3 bucket ACL for tf-validate-project20230723100746984200000001: AccessControlListNotSupported: The bucket does not allow ACLs
│ status code: 400, request id: ABM8EN00AAZVR2W3, host id: loDvDdpHBCpKuVnLCzztPyQcLiH453qit9bh9rdyZdetBRoY08oFXOlDZmZuNc8Wl9oAo35QbnU=

│ with module.s3_artifacts_bucket.aws_s3_bucket_acl.codepipeline_bucket_acl,
│ on modules/s3/main.tf line 199, in resource "aws_s3_bucket_acl" "codepipeline_bucket_acl":
│ 199: resource "aws_s3_bucket_acl" "codepipeline_bucket_acl" {

This aligns with the following issue: terraform-aws-modules/terraform-aws-s3-bucket#223

S3 buckets have ACLs disabled by default since April 2023 (announced in December 2022): https://aws.amazon.com/about-aws/whats-new/2022/12/amazon-s3-automatically-enable-block-public-access-disable-access-control-lists-buckets-april-2023/
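A commonly used workaround (a sketch, not the maintainers' official fix; the bucket resource name below is an assumption about the module's internals) is to add ownership controls that re-enable ACLs and make the ACL resource depend on them:

```hcl
# ACLs only work when object ownership is not BucketOwnerEnforced,
# which became the default for new buckets in April 2023.
resource "aws_s3_bucket_ownership_controls" "codepipeline_bucket" {
  bucket = aws_s3_bucket.codepipeline_bucket.id  # assumed resource name
  rule {
    object_ownership = "ObjectWriter"
  }
}

resource "aws_s3_bucket_acl" "codepipeline_bucket_acl" {
  depends_on = [aws_s3_bucket_ownership_controls.codepipeline_bucket]
  bucket     = aws_s3_bucket.codepipeline_bucket.id
  acl        = "private"
}
```

Alternatively, dropping the aws_s3_bucket_acl resources entirely and relying on the BucketOwnerEnforced default avoids ACLs altogether.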

Terraform reports a cycle in main.tf between the modules.

codepipeline_kms -> codepipeline_iam_role -> codepipeline_kms

module "codepipeline_kms" {
  source                = "./modules/kms"
  codepipeline_role_arn = module.codepipeline_iam_role.role_arn
  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
    Account_ID   = local.account_id
    Region       = local.region
  }

}

module "codepipeline_iam_role" {
  source                     = "./modules/iam-role"
  project_name               = var.project_name
  create_new_role            = var.create_new_role
  codepipeline_iam_role_name = var.create_new_role == true ? "${var.project_name}-codepipeline-role" : var.codepipeline_iam_role_name
  source_repository_name     = var.source_repo_name
  kms_key_arn                = module.codepipeline_kms.arn
  s3_bucket_arn              = module.s3_artifacts_bucket.arn
  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
    Account_ID   = local.account_id
    Region       = local.region
  }
}
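One way to break such a cycle (a sketch under the assumption that the role name is predictable, not the repository's actual fix) is to construct the role ARN in the KMS module call instead of consuming the IAM module's output, removing the kms -> iam-role dependency edge:

```hcl
module "codepipeline_kms" {
  source = "./modules/kms"
  # Build the ARN from the known naming convention instead of referencing
  # module.codepipeline_iam_role.role_arn, so the KMS module no longer
  # depends on the IAM module.
  codepipeline_role_arn = "arn:aws:iam::${local.account_id}:role/${var.project_name}-codepipeline-role"
  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
  }
}
```

The iam-role module can then keep consuming module.codepipeline_kms.arn without closing the loop.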

run_order is not working with codepipeline

Problem statement: I am using the codepipeline module to create a pipeline and setting the run_order value to run actions in parallel in CodePipeline, but all the actions in the pipeline are created sequentially.

I created my own pipeline module with small changes to the module in this repository. Here are the Terraform files of the module.

  1. main.tf
terraform {
  required_version = "~> 1.4"
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "5.4.0"
    }
  }
}


resource "aws_codepipeline" "deployment_pipeline" {
  name     = var.name
  role_arn = var.role_arn

  artifact_store {
    location = var.s3_bucket_name
    type     = "S3"
  }

  stage {
    name = "Source"

    action {
      name             = "Source"
      category         = "Source"
      owner            = "AWS"
      version          = "1"
      provider         = "CodeStarSourceConnection"
      output_artifacts = ["source_output"]

      configuration = {
        FullRepositoryId = var.source_repo_name
        BranchName       = var.source_repo_branch
        ConnectionArn    = var.ConnectionArn
      }
    }
  }

  dynamic "stage" {
    for_each = var.stages

    content {
      name = "Stage-${stage.value["name"]}"
      action {
        category         = stage.value["category"]
        name             = "Action-${stage.value["name"]}"
        owner            = stage.value["owner"]
        provider         = stage.value["provider"]
        input_artifacts  = [stage.value["input_artifacts"]]
        output_artifacts = [stage.value["output_artifacts"]]
        version          = "1"
        run_order        = stage.value["run_order"]

        configuration = {
          ProjectName = stage.value["project_name"]
        }
      }
    }
  }
  tags = var.tags
}
  2. variables.tf
variable "name" {
  description = "Unique name for this project"
  type        = string
}

variable "source_repo_name" {
  description = "Source repo name of the CodeCommit repository"
  type        = string
}

variable "source_repo_branch" {
  description = "Default branch in the Source repo for which CodePipeline needs to be configured"
  type        = string
}

variable "ConnectionArn" {
  description = "Github Connection ARN"
  type        = string
}

variable "s3_bucket_name" {
  description = "S3 bucket name to be used for storing the artifacts"
  type        = string
}

variable "role_arn" {
  description = "ARN of the codepipeline IAM role"
  type        = string
}

variable "tags" {
  description = "Tags to be attached to the CodePipeline"
  type        = map(any)
}

variable "stages" {
  description = "List of Map containing information about the stages of the CodePipeline"
  type        = list(map(any))
}

Here run_order is not passed as a static value; it is supplied by the parent module.

Now I am referencing this module in another Terraform file, shown here:

module "deployment_pipeline" {
  source = "../../modules/codepipeline"


  name           = "${var.namespace}-${var.environment}-terraform-pipeline"
  role_arn       = module.codepipeline_role.arn
  s3_bucket_name = data.aws_ssm_parameter.artifact_bucket.value

  ConnectionArn      = data.aws_codestarconnections_connection.existing_github_connection.arn
  source_repo_name   = var.github_FullRepositoryId
  source_repo_branch = var.github_BranchName
  stages = [
    { name = "Bootstrap", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 2, project_name = "${module.initial_bootstrap.name}" },
    { name = "Networking", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 3, project_name = "${module.networking_module_build_step_codebuild_project.name}" },
    { name = "Database", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = "${aws_codebuild_project.rds_module_build_step_codebuild_project.name}" },
    { name = "Elasticache", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = "${module.elasticache_module_build_step_codebuild_project.name}" },
    { name = "Opensearch", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = "${module.opensearch_module_build_step_codebuild_project.name}" },
    { name = "ClientVPN", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 4, project_name = "${module.vpn_module_build_step_codebuild_project.name}" },
    { name = "IAMRole", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 5, project_name = "${module.iam_role_module_build_step_codebuild_project.name}" },
    { name = "EKS", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 6, project_name = "${module.eks_module_build_step_codebuild_project.name}" },
    { name = "EKS-Auth", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = "${module.eks_auth_module_build_step_codebuild_project.name}" },
    { name = "EKS-Istio", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = "${module.istio_module_build_step_codebuild_project.name}" },
    { name = "Observability", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = "${module.eks_observability_module_build_step_codebuild_project.name}" },
    { name = "Opensearch-Ops", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 7, project_name = "${aws_codebuild_project.os_ops_module_build_step_codebuild_project.name}" },
    { name = "Cognito", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = "${module.cognito_module_build_step_codebuild_project.name}" },
    { name = "ControlPlaneApplication", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = "${module.control_plane_module_build_step_codebuild_project.name}" },
    { name = "TenantCodebuilds", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = "${module.tenant_codebuild_module_build_step_codebuild_project.name}" },
    { name = "Billing", category = "Build", owner = "AWS", provider = "CodeBuild", input_artifacts = "source_output", output_artifacts = "", run_order = 8, project_name = "${module.billing_module_build_step_codebuild_project.name}" }
  ]
  tags = module.tags.tags
}

Here I am providing the same run_order value for some actions (like Database and Elasticache), so they should be created as parallel actions in the pipeline, but they are created sequentially.


P.S. The module does not produce any errors. It creates a pipeline with all the actions mentioned above.

For any other information, please let me know.
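A likely explanation: in CodePipeline, run_order only orders actions within a single stage; stages themselves always execute sequentially. The dynamic "stage" block above creates one stage per list entry, each containing a single action, so equal run_order values never get a chance to run in parallel. A sketch of the alternative, grouping all the actions into one stage with a dynamic "action" block, where actions sharing a run_order value do run in parallel:

```hcl
stage {
  name = "Deploy"

  dynamic "action" {
    for_each = var.stages

    content {
      name            = "Action-${action.value["name"]}"
      category        = action.value["category"]
      owner           = action.value["owner"]
      provider        = action.value["provider"]
      input_artifacts = [action.value["input_artifacts"]]
      version         = "1"
      # run_order is now meaningful: actions with the same value run in parallel
      run_order       = action.value["run_order"]

      configuration = {
        ProjectName = action.value["project_name"]
      }
    }
  }
}
```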

"create_new_role" variable is not respected

Setting the create_new_role variable to false in the terraform.tfvars file has no effect, because it is not passed to the codepipeline_iam_role module in the main.tf file.

module "codepipeline_iam_role" {
  source                     = "./modules/iam-role"

  project_name               = var.project_name
  codepipeline_iam_role_name = var.create_new_role == true ? "${var.project_name}-codepipeline-role" : var.codepipeline_iam_role_name

  source_repository_name     = var.source_repo_name
  kms_key_arn                = module.codepipeline_kms.arn
  s3_bucket_arn              = module.s3_artifacts_bucket.arn

  tags = {
    Project_Name = var.project_name
    Environment  = var.environment
    Account_ID   = local.account_id
    Region       = local.region
  }
}
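A minimal fix (assuming the iam-role module already declares a create_new_role variable, as the root variables.tf suggests) is to forward the variable in the module call:

```hcl
module "codepipeline_iam_role" {
  source          = "./modules/iam-role"
  create_new_role = var.create_new_role  # forward the root variable
  # ...remaining arguments unchanged
}
```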
