terraform-google-modules / terraform-google-cloud-storage

Creates one or more Cloud Storage buckets and assigns basic permissions on them to arbitrary users

Home Page: https://registry.terraform.io/modules/terraform-google-modules/cloud-storage/google

License: Apache License 2.0

cft-terraform storage

terraform-google-cloud-storage's Introduction

Terraform Google Cloud Storage Module

This module makes it easy to create one or more GCS buckets, and assign basic permissions on them to arbitrary users.

The resources/services/activations/deletions that this module will create/trigger are:

  • One or more GCS buckets
  • Zero or more IAM bindings for those buckets

If you only wish to create a single bucket, consider using the simple bucket submodule instead.

Compatibility

This module is meant for use with Terraform 0.13+ and tested using Terraform 1.0+. If you find incompatibilities using Terraform >=0.13, please open an issue. If you haven't upgraded and need a Terraform 0.12.x-compatible version of this module, the last released version intended for Terraform 0.12.x is v1.7.1.

Usage

Basic usage of this module is as follows:

module "gcs_buckets" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 6.0"
  project_id  = "<PROJECT ID>"
  names = ["first", "second"]
  prefix = "my-unique-prefix"
  set_admin_roles = true
  admins = ["group:[email protected]"]
  versioning = {
    first = true
  }
  bucket_admins = {
    second = "user:[email protected],user:[email protected]"
  }
}

Functional examples are included in the examples directory.

Inputs

Name Description Type Default Required
admins IAM-style members who will be granted roles/storage.objectAdmin on all buckets. list(string) [] no
autoclass Optional map of lowercase unprefixed bucket name => boolean, defaults to false. map(bool) {} no
bucket_admins Map of lowercase unprefixed name => comma-delimited IAM-style per-bucket admins. map(string) {} no
bucket_creators Map of lowercase unprefixed name => comma-delimited IAM-style per-bucket creators. map(string) {} no
bucket_hmac_key_admins Map of lowercase unprefixed name => comma-delimited IAM-style per-bucket HMAC Key admins. map(string) {} no
bucket_lifecycle_rules Additional lifecycle_rules for specific buckets. Map of lowercase unprefixed name => list of lifecycle rules to configure.
map(set(object({
# Object with keys:
# - type - The type of the action of this Lifecycle Rule. Supported values: Delete and SetStorageClass.
# - storage_class - (Required if action type is SetStorageClass) The target Storage Class of objects affected by this Lifecycle Rule.
action = map(string)

# Object with keys:
# - age - (Optional) Minimum age of an object in days to satisfy this condition.
# - created_before - (Optional) Creation date of an object in RFC 3339 (e.g. 2017-06-13) to satisfy this condition.
# - with_state - (Optional) Match to live and/or archived objects. Supported values include: "LIVE", "ARCHIVED", "ANY".
# - matches_storage_class - (Optional) Comma delimited string for storage class of objects to satisfy this condition. Supported values include: MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE, STANDARD, DURABLE_REDUCED_AVAILABILITY.
# - num_newer_versions - (Optional) Relevant only for versioned objects. The number of newer versions of an object to satisfy this condition.
# - custom_time_before - (Optional) A date in the RFC 3339 format YYYY-MM-DD. This condition is satisfied when the customTime metadata for the object is set to an earlier date than the date used in this lifecycle condition.
# - days_since_custom_time - (Optional) The number of days from the Custom-Time metadata attribute after which this condition becomes true.
# - days_since_noncurrent_time - (Optional) Relevant only for versioned objects. Number of days elapsed since the noncurrent timestamp of an object.
# - noncurrent_time_before - (Optional) Relevant only for versioned objects. The date in RFC 3339 (e.g. 2017-06-13) when the object became noncurrent.
condition = map(string)
})))
{} no
bucket_policy_only Disable ad-hoc ACLs on specified buckets. Defaults to true. Map of lowercase unprefixed name => boolean map(bool) {} no
bucket_storage_admins Map of lowercase unprefixed name => comma-delimited IAM-style per-bucket storage admins. map(string) {} no
bucket_viewers Map of lowercase unprefixed name => comma-delimited IAM-style per-bucket viewers. map(string) {} no
cors Set of maps of mixed type attributes for CORS values. See appropriate attribute types here: https://www.terraform.io/docs/providers/google/r/storage_bucket.html#cors set(any) [] no
creators IAM-style members who will be granted roles/storage.objectCreator on all buckets. list(string) [] no
custom_placement_config Map of lowercase unprefixed name => custom placement config object. Format is the same as described in provider documentation https://www.terraform.io/docs/providers/google/r/storage_bucket#custom_placement_config any {} no
default_event_based_hold Enable event-based hold for new objects added to specific buckets. Defaults to false. Map of lowercase unprefixed name => boolean map(bool) {} no
encryption_key_names Optional map of lowercase unprefixed name => string, empty strings are ignored. map(string) {} no
folders Map of lowercase unprefixed name => list of top level folder objects. map(list(string)) {} no
force_destroy Optional map of lowercase unprefixed name => boolean, defaults to false. map(bool) {} no
hmac_key_admins IAM-style members who will be granted roles/storage.hmacKeyAdmin on all buckets. list(string) [] no
hmac_service_accounts List of HMAC service accounts to grant access to GCS. map(string) {} no
labels Labels to be attached to the buckets map(string) {} no
lifecycle_rules List of lifecycle rules to configure. Format is the same as described in provider documentation https://www.terraform.io/docs/providers/google/r/storage_bucket.html#lifecycle_rule except condition.matches_storage_class should be a comma delimited string.
set(object({
# Object with keys:
# - type - The type of the action of this Lifecycle Rule. Supported values: Delete and SetStorageClass.
# - storage_class - (Required if action type is SetStorageClass) The target Storage Class of objects affected by this Lifecycle Rule.
action = map(string)

# Object with keys:
# - age - (Optional) Minimum age of an object in days to satisfy this condition.
# - created_before - (Optional) Creation date of an object in RFC 3339 (e.g. 2017-06-13) to satisfy this condition.
# - with_state - (Optional) Match to live and/or archived objects. Supported values include: "LIVE", "ARCHIVED", "ANY".
# - matches_storage_class - (Optional) Comma delimited string for storage class of objects to satisfy this condition. Supported values include: MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE, STANDARD, DURABLE_REDUCED_AVAILABILITY.
# - matches_prefix - (Optional) One or more matching name prefixes to satisfy this condition.
# - matches_suffix - (Optional) One or more matching name suffixes to satisfy this condition.
# - num_newer_versions - (Optional) Relevant only for versioned objects. The number of newer versions of an object to satisfy this condition.
# - custom_time_before - (Optional) A date in the RFC 3339 format YYYY-MM-DD. This condition is satisfied when the customTime metadata for the object is set to an earlier date than the date used in this lifecycle condition.
# - days_since_custom_time - (Optional) The number of days from the Custom-Time metadata attribute after which this condition becomes true.
# - days_since_noncurrent_time - (Optional) Relevant only for versioned objects. Number of days elapsed since the noncurrent timestamp of an object.
# - noncurrent_time_before - (Optional) Relevant only for versioned objects. The date in RFC 3339 (e.g. 2017-06-13) when the object became noncurrent.
condition = map(string)
}))
[] no
location Bucket location. string "EU" no
logging Map of lowercase unprefixed name => bucket logging config object. Format is the same as described in provider documentation https://www.terraform.io/docs/providers/google/r/storage_bucket.html#logging any {} no
names Bucket name suffixes. list(string) n/a yes
prefix Prefix used to generate the bucket name. string "" no
project_id Bucket project id. string n/a yes
public_access_prevention Prevents public access to a bucket. Acceptable values are inherited or enforced. If inherited, the bucket uses public access prevention only if the bucket is subject to the public access prevention organization policy constraint. string "inherited" no
randomize_suffix Adds an identical, randomly generated 4-character suffix to all bucket names. bool false no
retention_policy Map of retention policy values. Format is the same as described in provider documentation https://www.terraform.io/docs/providers/google/r/storage_bucket#retention_policy any {} no
set_admin_roles Grant roles/storage.objectAdmin role to admins and bucket_admins. bool false no
set_creator_roles Grant roles/storage.objectCreator role to creators and bucket_creators. bool false no
set_hmac_access Set S3 compatible access to GCS. bool false no
set_hmac_key_admin_roles Grant roles/storage.hmacKeyAdmin role to hmac_key_admins and bucket_hmac_key_admins. bool false no
set_storage_admin_roles Grant roles/storage.admin role to storage_admins and bucket_storage_admins. bool false no
set_viewer_roles Grant roles/storage.objectViewer role to viewers and bucket_viewers. bool false no
soft_delete_policy Soft delete policies to apply. Map of lowercase unprefixed name => soft delete policy. Format is the same as described in provider documentation https://www.terraform.io/docs/providers/google/r/storage_bucket.html#nested_soft_delete_policy map(any) {} no
storage_admins IAM-style members who will be granted roles/storage.admin on all buckets. list(string) [] no
storage_class Bucket storage class. string "STANDARD" no
versioning Optional map of lowercase unprefixed name => boolean, defaults to false. map(bool) {} no
viewers IAM-style members who will be granted roles/storage.objectViewer on all buckets. list(string) [] no
website Map of website values. Supported attributes: main_page_suffix, not_found_page map(any) {} no
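
As an illustration of the lifecycle_rules format described above, a minimal sketch (project, prefix and rule values are placeholders; note that matches_storage_class is a comma-delimited string in this module):

module "gcs_buckets" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 6.0"

  project_id = "<PROJECT ID>"
  prefix     = "my-unique-prefix"
  names      = ["logs"]

  lifecycle_rules = [{
    action = {
      type          = "SetStorageClass"
      storage_class = "NEARLINE"
    }
    condition = {
      age                   = "60"
      matches_storage_class = "STANDARD"
    }
  }]
}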

Outputs

Name Description
bucket Bucket resource (for single use).
buckets Bucket resources as list.
buckets_map Bucket resources by name.
hmac_keys List of HMAC keys.
name Bucket name (for single use).
names Bucket names.
names_list List of bucket names.
url Bucket URL (for single use).
urls Bucket URLs.
urls_list List of bucket URLs.

Requirements

These sections describe requirements for using this module.

Software

The following dependencies must be available:

  • Terraform 0.13+
  • Terraform Provider for Google Cloud Platform

Service Account

User or service account credentials with the following roles must be used to provision the resources of this module:

  • Storage Admin: roles/storage.admin

The Project Factory module and the IAM module may be used in combination to provision a service account with the necessary roles applied.

APIs

A project with the following APIs enabled must be used to host the resources of this module:

  • Google Cloud Storage JSON API: storage-api.googleapis.com

The Project Factory module can be used to provision a project with the necessary APIs enabled.

Contributing

Refer to the contribution guidelines for information on contributing to this module.

terraform-google-cloud-storage's People

Contributors

aaron-lane, adrian-gierakowski, apeabody, bharathkkb, cloud-foundation-bot, dependabot[bot], g-awmalik, jberlinsky, kam1kaze, kopachevsky, kunalkg11, ludoo, matty-rose, mkjmdski, morgante, muffl0n, nandagirin, naseemkullah, petomalina, philip-harvey, pkatsovich, release-please[bot], renovate[bot], rjmco, sha65536, tensho, tpdownes, umairidris, vendin, vkamlov


terraform-google-cloud-storage's Issues

Fake feature: add a "silly_label" variable

This is a fake feature for learning how to contribute to CFT. We should not actually add it.

Desired addition: expose a new (string) variable silly_label which will automatically be added to all bucket labels. For example:

module "gcs_buckets" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 1.3"
  project_id  = "<PROJECT ID>"
  names = ["first", "second"]
  ...
  silly_label = "awesome"
}

This would label buckets first and second with silly: awesome.
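
A minimal sketch of how such a variable might be wired into the module (hypothetical names, not the actual implementation):

variable "silly_label" {
  description = "Optional value added to every bucket under the 'silly' label key."
  type        = string
  default     = ""
}

locals {
  # Merge the silly label into whatever labels the caller supplied.
  effective_labels = merge(var.labels, var.silly_label == "" ? {} : { silly = var.silly_label })
}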

When trying this out, please make sure to add it to examples + tests.

1.7.1 is borked

This change fixes it link

Can we cut a patch release?

Here's the error you get w/ 1.7.1

Error: Unsupported argument

  on .terraform/modules/gcs_buckets/main.tf line 98, in resource "google_storage_bucket" "buckets":
  98:         is_live               = lookup(lifecycle_rule.value.condition, "is_live", null)

An argument named "is_live" is not expected here.

Unable to create a PR

I wanted to create a PR with adding an optional parameter default_event_based_hold to google_storage_bucket, but I can not do it because of linting.

docker run --rm -it \
        -v /home/max/projects/terraform-google-cloud-storage:/workspace \
        gcr.io/cloud-foundation-cicd/cft/developer-tools:0.10.0 \
        /usr/local/bin/test_lint.sh
Checking for documentation generation
Checking for trailing whitespace
Checking for missing newline at end of file
Error: No newline at end of file ./.envrc
Running shellcheck
Checking file headers
Running flake8
Running terraform fmt
Running terraform validate
terraform_validate . 
Success! The configuration is valid.

terraform_validate ./examples/multiple_buckets 

Error: Unsupported argument

  on ../../main.tf line 35, in resource "google_storage_bucket" "buckets":
  35:   default_event_based_hold = var.default_event_based_hold

An argument named "default_event_based_hold" is not expected here.

terraform_validate ./examples/simple_bucket 
Success! The configuration is valid.

terraform_validate ./modules/simple_bucket 
Success! The configuration is valid.

terraform_validate ./test/fixtures/multiple_buckets 

Error: Unsupported argument

  on ../../../main.tf line 35, in resource "google_storage_bucket" "buckets":
  35:   default_event_based_hold = var.default_event_based_hold

An argument named "default_event_based_hold" is not expected here.

terraform_validate ./test/setup 

Warning: google_project_services is deprecated - many users reported issues with dependent services that were not resolvable.  Please use google_project_service or the https://github.com/terraform-google-modules/terraform-google-project-factory/tree/master/modules/project_services module.  It's recommended that you use a provider version of 2.13.0 or higher when you migrate so that requests are batched to the API, reducing the request rate. This resource will be removed in version 3.0.0.

  on .terraform/modules/project/terraform-google-project-factory-3.3.1/modules/core_project_factory/main.tf line 165, in resource "google_project_services" "project_services_authority":
 165: resource "google_project_services" "project_services_authority" {


Success! The configuration is valid, but there were some validation warnings as shown above.

Error: The following tests have failed: check_whitespace check_terraform
Makefile:70: recipe for target 'docker_test_lint' failed
make: *** [docker_test_lint] Error 2

The parameter `default_event_based_hold` was added to the Google provider in https://github.com/terraform-providers/terraform-provider-google/pull/5373.

Could you please fix the Docker image with developer tools? Maybe you can help me, @bharathkkb?

Create multiple buckets using same statefile and terraform script

Hello,

Could you please help me with the scenario below?

Scenario: To create multiple buckets using the same terraform script and same state file, for two different projects.

  • Suppose I have two projects, Project A and Project B and under each Project, I would like two create multiple buckets using same terraform script and state file, by passing bucket names through Jenkins Pipeline

Issue: I have tried to pass a bucket name from the Jenkins pipeline to the Terraform script. Each time a new value is passed, the bucket created in the previous run is destroyed and a new bucket is created with the new value, because both runs share the same state file. Ultimately, only a single bucket exists after each run of the script with a new value. I have tried to follow something similar to here - Using Modules. But I was only passing a single bucket_name value from the pipeline.
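
One way to avoid the replacement is to pass all bucket names to the module in a single list variable, so the pipeline appends names instead of overwriting the one name Terraform knows about; a rough sketch (variable names are placeholders):

variable "project_id" {
  type = string
}

variable "bucket_names" {
  type    = list(string)
  default = []
}

module "gcs_buckets" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 1.7"

  project_id = var.project_id
  # e.g. passed from Jenkins as: -var='bucket_names=["first","second"]'
  names      = var.bucket_names
}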

Broken tag v1.7.2

The release tag points to this commit: 935f42c

...which doesn't exist on master, and returns the message: "This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository."

Also seeing this:

~/projects/terraform-google-cloud-storage$ git pull origin master 
From github.com:kpeder/terraform-google-cloud-storage
 * branch            master     -> FETCH_HEAD
Already up to date.
~/projects/terraform-google-cloud-storage$ git pull origin --tag
Already up to date.
~/projects/terraform-google-cloud-storage$ git checkout v1.7.2
Note: switching to 'v1.7.2'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

HEAD is now at 935f42c fix: Fix deprecation of is_live lifecycle rule (#91)

...I have been using the branch as a reference in the meantime.

Condition `matches_storage_class` in the `lifecycle_rules` variable of the submodule `simple_bucket` must be an `Array`

Condition matches_storage_class in the lifecycle_rules variable of the simple_bucket submodule must be an array, and this is not clear from the comment in the code:

# - matches_storage_class - (Optional) Storage Class of objects to satisfy this condition. Supported values include: MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE, STANDARD, DURABLE_REDUCED_AVAILABILITY.

In the main module,

# - matches_storage_class - (Optional) Comma delimited string for storage class of objects to satisfy this condition. Supported values include: MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE, STANDARD, DURABLE_REDUCED_AVAILABILITY.

the condition is declared as a "comma delimited string for storage class of objects to satisfy the condition."

And it is transformed into an array when the resource is created:

matches_storage_class = contains(keys(lifecycle_rule.value.condition), "matches_storage_class") ? split(",", lifecycle_rule.value.condition["matches_storage_class"]) : null

but in the simple_bucket submodule it is used "as is" in the resource creation

matches_storage_class = lookup(lifecycle_rule.value.condition, "matches_storage_class", null)

The documentation for the condition in the simple_bucket submodule could make this difference clearer.
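
To make the difference concrete, a hedged sketch of both forms (values are placeholders):

# Root module: comma-delimited string
lifecycle_rules = [{
  action    = { type = "Delete" }
  condition = { age = "365", matches_storage_class = "MULTI_REGIONAL,STANDARD" }
}]

# simple_bucket submodule: list of strings
lifecycle_rules = [{
  action    = { type = "Delete" }
  condition = { age = 365, matches_storage_class = ["MULTI_REGIONAL", "STANDARD"] }
}]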

terraform 0.13 now supports for_each on modules: what does it mean for the future of this one?

Should a new module, which creates a single bucket with associated iam bindings and folders, be created and this module be implemented in terms of this new module?

Or maybe it doesn't even make sense to have a module which creates multiple buckets anymore? It seems to me that the pattern used in this module (and many other terraform-google-modules), namely creating multiple similar groups of resources using for_each on each of them, was simply a workaround used due to lack of support for for_each on user defined modules.

There is still value in grouping resources which often go together (like bucket + IAM + folders in that bucket), but having a module create multiple such groups seems redundant in 0.13.
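
For reference, a minimal sketch of what calling the single-bucket submodule with for_each looks like on Terraform 0.13+ (names and version are illustrative):

module "bucket" {
  source   = "terraform-google-modules/cloud-storage/google//modules/simple_bucket"
  version  = "~> 1.7"
  for_each = toset(["first", "second"])

  name       = "my-prefix-${each.key}"
  project_id = var.project_id
  location   = "EU"
}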

Feature request: Allow creation of top level folders in bucket

I would like to be able to create buckets with a few top level folders within it.
I'm thinking of something that looks like this:

module "gcs_buckets" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 1.3"
  project_id  = "<PROJECT ID>"
  names = ["first", "second"]
  folders = {
    first = ["test", "dev", "prod"]
  }
}

which would result in the "first" bucket containing three folders (test, dev, prod), while the second bucket would be empty.
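
Since GCS has no real directories, one way such folders could be implemented is as zero-length placeholder objects whose names end in a slash; a rough sketch outside the module (bucket and folder names are placeholders):

resource "google_storage_bucket_object" "folders" {
  for_each = toset(["test/", "dev/", "prod/"])

  bucket  = module.gcs_buckets.names["first"]
  name    = each.key # the trailing slash makes it appear as a folder in the console
  content = " "      # placeholder content; the object itself stays effectively empty
}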

Thank you,
Kent

simple_bucket - no possibility to add multiple members to role in one statement

Hi,

It seems that I cannot add multiple members to a role. I tried these:
1)

{
  role   = "roles/storage.legacyBucketOwner"
  member = "projectEditor:${module.project-factory.project_id}, projectOwner:${module.project-factory.project_id}"
}

2)

{
  role   = "roles/storage.legacyBucketOwner"
  member = ["projectEditor:${module.project-factory.project_id}", "projectOwner:${module.project-factory.project_id}"]
}

Do I have to specify this statement twice for the same role but with different member or am I missing something?
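
If the input in question is the simple_bucket submodule's iam_members list of { role, member } objects, one workaround sketch is to repeat the role with one entry per member:

iam_members = [
  {
    role   = "roles/storage.legacyBucketOwner"
    member = "projectEditor:${module.project-factory.project_id}"
  },
  {
    role   = "roles/storage.legacyBucketOwner"
    member = "projectOwner:${module.project-factory.project_id}"
  },
]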

Support for Google provider v3

I'm using GCP Provider v3.38 recently and noticed a deprecation warning. I understand that the requirement is GCP Provider v2, but are you planning to support v3 at some point? I'd also be happy to try making a PR.

Warning: "bucket_policy_only": [DEPRECATED] Please use the uniform_bucket_level_access as this field has been renamed by Google.

unspecified type constraints

For best practices, and for users who use Terragrunt [0], type constraints should be added [1]. Without these constraints, applying a Terragrunt plan/apply results in type errors such as [2]. Here is a PR that fixes this issue [3].

[0] https://github.com/gruntwork-io/terragrunt
[1] https://www.terraform.io/docs/configuration/variables.html#type-constraints
[2] Invalid value for "inputMap" parameter: lookup() requires a map as the first
argument.
[3] #43
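
For illustration, a typed variable declaration of the kind the linked PR adds (a generic example, not one of the module's actual variables):

variable "labels" {
  description = "Labels to attach to each bucket."
  type        = map(string)
  default     = {}
}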

lifecycle_rules not working for action.type=delete

There are two issues but they are both closely related

Steps to reproduce the issue
Add the following block for the module:

lifecycle_rules = [{
    action = {
      type = "Delete"
    }
    condition = {
      age = 1
    }
  }]

Error occurs when planning or applying:

  on .terraform/somepathhere/main.tf line 42, in module "my-bucket":
  42:   lifecycle_rules = [{
  43:     action = {
  44:       type = "Delete"
  46:     }
  47:     condition = {
  48:       age = 1
  49:     }
  50:   }]
The given value is not suitable for child module variable "lifecycle_rules"
defined at
.terraform/modules/somepathhere/terraform-google-cloud-storage-1.5.0/variables.tf:130,1-27:
element 0: attribute "action": attribute "storage_class" is required.

When I apply the required storage_class to the module:

lifecycle_rules = [{
    action = {
      type = "Delete"
      storage_class = "STANDARD"
    }
    condition = {
      age = 1
    }
  }]

terraform plan will work, but on a terraform apply we get another error:

Error: googleapi: Error 400: Invalid argument, invalid
  on .terraform/modules/somepathhere/main.tf line 21, in resource "google_storage_bucket" "buckets":
  21: resource "google_storage_bucket" "buckets" {

Workaround
I modified the local copy of the module to ignore the storage_class and I was able to run both a plan and apply with no issues.

Migrate to Terraform 0.12

As other modules under github.com/terraform-google-modules are moving to TF v0.12 in recent weeks, this one should too.

I'm happy to help out with this one if required.

simple_bucket module uses removed API "is_live"

Since today, terraform validate fails when handling resources managed by the simple_bucket module because it is still using is_live.

+ terraform validate
Error: Unsupported argument
  on .terraform/modules/firebase_project.storage/modules/simple_bucket/main.tf line 55, in resource "google_storage_bucket" "bucket":
  55:         is_live               = lookup(lifecycle_rule.value.condition, "is_live", null)
An argument named "is_live" is not expected here.

days_since_custom_time - syntax reference

There appears to be a documentation error for the google provider; I raised an issue for correction [1].

Using an integer value for the lifecycle rule, I am getting a 400 Bad Request error. Can someone help here?

lifecycle_rules = [{
  action    = { type = "Delete" }
  condition = { days_since_custom_time = 10 }
}]

Error: googleapi: Error 400: Invalid argument, invalid
{"error":{"code":400,"message":"Invalid argument","errors":[{"message":"Invalid argument","domain":"global","reason":"invalid"}]}}:

Not able to set website properties.

My configuration (using Terraform 0.13.6):

module "gcs_buckets" {
  source            = "terraform-google-modules/cloud-storage/google"
  version           = "~> 1.7"
  project_id        = "my_project"
  names             = ["site"]
  prefix            = "my_prefix"
  location          = "us"
  set_viewer_roles  = true
  viewers           = ["allUsers"]
  set_creator_roles = true
  versioning = {
    site = true
  }
  website = {
    main_page_suffix = "index.html"
    not_found_page   = "404.html"
  }
}

However the website properties aren't set. Am I doing something wrong?

simple bucket no longer works after uniform_bucket_level_access

Description

Recent update with #80 was a breaking change. There should have been some documentation around this.

Steps

terraform init
terraform plan
provider "google" {
  version = "~> 3.37.0"
  region  = var.region
  project = var.project_id
}
variable "name" { default = "my-backups" }
variable "region" {}
variable "project_id" {}

module "my_backups" {
  source                 = "git::https://github.com/terraform-google-modules/terraform-google-cloud-storage.git//modules/simple_bucket?ref=v1.7.0"
  name       = var.name
  project_id = var.project_id
  location   = var.region

  lifecycle_rules = [{
    action = {
      type = "Delete"
    }

    condition = {
      age        = 365
      with_state = "ANY"
    }
  }]
}

Results

Error: Unsupported argument

  on .terraform/modules/dgraph_backups/modules/simple_bucket/main.tf line 22, in resource "google_storage_bucket" "bucket":
  22:   uniform_bucket_level_access = var.bucket_policy_only

An argument named "uniform_bucket_level_access" is not expected here.

automatic bucket label not sanitized properly

3e7b600 introduced setting the prefix + name as a bucket label.

However the sanitizing of the name is not complete:

In our case we use a valid DNS name as a bucket name (e.g. host-name.example.com).
This module would then add the label name = host-name.example.com which violates the bucket label constraints.

Steps to reproduce:

module "gcs_terraform_state" {
  source            = "terraform-google-modules/cloud-storage/google"
  version           = "~> 1.0"
  ...
  names             = ["host-name.example.com"]
  ...
}
terraform apply -auto-approve

Result:

[...]
Error: googleapi: Error 400: Invalid argument, invalid

  on .terraform/modules/gcs_terraform_state/terraform-google-modules-terraform-google-cloud-storage-231f6ed/main.tf line 21, in resource "google_storage_bucket" "buckets":
  21: resource "google_storage_bucket" "buckets" {

Expected Result:

[...]
Apply complete! Resources: X added, Y changed, Z destroyed.

Solutions:

  • either properly sanitize the auto-generated label (a sketch of one approach follows below)
  • or give an option to disable automatically setting the label
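
A minimal sketch of what such sanitization could look like (illustrative only, not the module's actual code):

locals {
  bucket_name = "host-name.example.com"

  # Label values only allow lowercase letters, numerals, dashes and underscores,
  # so strip the dots from DNS-style bucket names before using them as a label.
  name_label = replace(lower(local.bucket_name), ".", "-")
}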

Call to function "zipmap" failed: number of keys (2) does not match number of values (3).

Summary

Run into the following issue when terraform apply after removing an item from the var.names list

Error: Error in function call
  on ../../terraform-google-cloud-storage/outputs.tf line 39, in output "names":
  39:   value       = zipmap(var.names, google_storage_bucket.buckets[*].name)
    |----------------
    | google_storage_bucket.buckets is tuple with 3 elements
    | var.names is list of string with 2 elements
Call to function "zipmap" failed: number of keys (2) does not match number of
values (3).
Error: Error in function call
  on ../../terraform-google-cloud-storage/outputs.tf line 44, in output "urls":
  44:   value       = zipmap(var.names, google_storage_bucket.buckets[*].url)
    |----------------
    | google_storage_bucket.buckets is tuple with 3 elements
    | var.names is list of string with 2 elements
Call to function "zipmap" failed: number of keys (2) does not match number of
values (3).

Steps to reproduce

  1. Use the module with some bucket names (e.g. names = ["first", "second", "third"])
  2. terraform apply
  3. Remove one name from the list (e.g. names = ["first", "second"])
  4. terraform apply again
  5. Encounter the above error

Refactor module interface for simplified IAM

We should consider revamping the root module interface, now that Terraform supports for_each on modules, so that it handles a single bucket and any arbitrary set of bucket IAM roles. That way we don't need a bool flag, a per-bucket map, and an all-buckets list for every bucket IAM role.
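
For illustration, the simple_bucket submodule already exposes a generic iam_members list; a sketch of that style, which a revamped root interface could adopt per bucket (values are placeholders):

module "bucket" {
  source = "terraform-google-modules/cloud-storage/google//modules/simple_bucket"

  name       = "my-bucket"
  project_id = var.project_id
  location   = "EU"

  # One generic list instead of a flag, a per-bucket map, and an all-buckets list per role.
  iam_members = [
    { role = "roles/storage.objectAdmin", member = "group:admins@example.com" },
    { role = "roles/storage.objectViewer", member = "user:viewer@example.com" },
  ]
}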

bucket_admins seems to do nothing

Adding people to the bucket_admins list seems to do nothing. Also looking at the examples directory I notice there is iam_members.

Looking at the module itself, it seems to reference iam_members in the variables.tf and the main.tf

variable "iam_members" {
  description = "The list of IAM members to grant permissions on the bucket."
  type = list(object({
    role   = string
    member = string
  }))
  default = []
}

So when using bucket_admins, terraform outputs nothing for IAM, while when setting iam_members it complains that

An argument named "iam_members" is not expected here.

Am I missing something or just confused?

My Buckets.tf

module "gcs_buckets" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 1.7.2"
  project_id  = "${var.project_id}"

  names = ["one", "two"]

  prefix = "${var.environment}"
  set_admin_roles = true
  versioning = {
    enabled = false
  }

  iam_members = [{
    role   = "roles/storage.viewer"
    member = "user:[email protected]"
  }]
}

Thanks for any help.

function list deprecated Terraform v0.15.0-beta1

Hi,

I tried to use the module with terraform v0.15.0-beta1 but got this error

│ Error: Error in function call
│ 
│   on .terraform/modules/dump_bucket/main.tf line 18:
│   (source code not available)
│     ├────────────────
│     │ var.location is "europe-west1"
│     │ var.prefix is "passculture-metier-prod"
│ 
│ Call to function "list" failed: the "list" function was deprecated in Terraform v0.12 and is no
│ longer available; use tolist([ ... ]) syntax to write a literal list.

From terraform documentation

This function is deprecated. From Terraform v0.12, the Terraform language has built-in syntax for creating lists using the [ and ] delimiters. Use the built-in syntax instead. The list function will be removed in a future version of Terraform.

It seems to work using 0.14.9, so I guess they hard-dropped this function in 0.15.
IMO, the list() occurrences just need to be replaced with the tolist([ ... ]) syntax.
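
For reference, the fix suggested by the error message would look roughly like this on the expression reported in the error (illustrative; the real code lives in the module's locals):

locals {
  # Before (fails on Terraform >= 0.15):
  #   prefix = var.prefix == "" ? "" : join("-", list(var.prefix, lower(var.location), ""))

  # After, using literal list syntax:
  prefix = var.prefix == "" ? "" : join("-", [var.prefix, lower(var.location), ""])
}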

Error: An argument named "bucket_policy_only" is not expected here

Hi,

I'm using the module and started running into this issue today. I have no explicit reference to this input variable. My TF template is below:

module "search_cloud_storage" {
source = "terraform-google-modules/cloud-storage/google"
version = "1.1.0"

project_id = var.our_project
prefix = "mstr"
names = ["gcts", "df"]
labels = {
name = "our label here"
}
storage_class = "STANDARD"
location = var.region
}

Error when running "terraform validate" or "terraform plan", for both versions 1.0.0 and 1.1.0:

Error: Unsupported argument

on .terraform/modules/search_cloud_storage/terraform-google-modules-terraform-google-cloud-storage-c011106/main.tf line 33, in resource "google_storage_bucket" "buckets":
33: bucket_policy_only = lookup(

An argument named "bucket_policy_only" is not expected here.

Error in function call

Hi,

I'm trying to use this module and I'm getting this error:

│ Error: Error in function call
│ 
│   on .terraform/modules/gcs_buckets/main.tf line 18, in locals:
│   18:   prefix = var.prefix == "" ? "" : join("-", list(var.prefix, lower(var.location), ""))
│     ├────────────────
│     │ var.location is "EU"
│     │ var.prefix is "my-unique-prefix"
│ 
│ Call to function "list" failed: the "list" function was deprecated in Terraform v0.12 and is no longer available; use tolist([ ... ]) syntax
│ to write a literal list.

I'm doing the same thing that's in this tutorial.

Is this a bug? Can you help me?

Thanks!

Unable to enable versioning on GCS bucket

I've been attempting to enable versioning but can't seem to get it to work and I'm unsure of what is wrong.

I have a parent module that is referencing a child module. The child module is handling the enabling of versioning. I have also tried with the parent module enabling versioning but that hasn't worked either.

If I look in my state file it shows versioning as set to false but I'm using the same syntax as the Usage example at the front page of the repository.

Parent module main.tf

module "permanent_storage_bucket" {
  source = "../../../../modules/gcs_bucket"

  // REQUIRED FIELDS
  project_id         = var.project_id
  bucket_suffix_name = var.permanent_bucket_suffix_name
  bucket_prefix_name = var.permanent_bucket_prefix_name

  // OPTIONAL FIELDS
  bucket_set_admin_roles = var.permanent_bucket_set_admin_roles
  admins                 = var.permanent_bucket_admins

  creators                    = var.permanent_bucket_creators
  bucket_encryption_key_names = var.permanent_bucket_encryption_key_names
  bucket_folders              = var.permanent_bucket_folders
  bucket_force_destroy        = var.permanent_bucket_force_destroy
  storage_bucket_labels       = var.permanent_bucket_labels
  bucket_location             = var.permanent_bucket_location
  bucket_set_creator_roles    = var.permanent_bucket_set_creator_roles
  bucket_set_viewer_roles     = var.permanent_bucket_set_viewer_roles
  bucket_storage_class        = var.permanent_bucket_storage_class
  viewers                     = var.permanent_bucket_viewers
  depends_on                  = []
}

Child module main.tf

module "gcs_bucket" {
  source               = "terraform-google-modules/cloud-storage/google"
  version              = "~> 1.7.0"
  project_id           = var.project_id
  names                = formatlist("%v-%v", var.bucket_suffix_name, random_id.bucket_suffix_addition.hex)
  prefix               = var.bucket_prefix_name
  set_admin_roles      = var.bucket_set_admin_roles
  admins               = var.admins
  versioning           = {
    first = true
  }
  bucket_admins        = var.bucket_admins
  bucket_creators      = var.bucket_creators
  bucket_viewers       = var.bucket_viewers
  creators             = var.creators
  encryption_key_names = var.bucket_encryption_key_names
  folders              = var.bucket_folders
  force_destroy        = var.bucket_force_destroy
  labels               = var.storage_bucket_labels
  location             = var.bucket_location
  set_creator_roles    = var.bucket_set_creator_roles
  set_viewer_roles     = var.bucket_set_viewer_roles
  storage_class        = var.bucket_storage_class
  viewers              = var.viewers
  depends_on           = []
}

default.tfstate

"versioning": [
      {
                "enabled": false
       }
],

Add Compatibility with `constraints/storage.bucketPolicyOnly` Org Policy Constraint

When attempting to create a bucket with this module, within a folder/organization where enforcement of constraints/storage.bucketPolicyOnly is enabled, creation of the bucket fails if no ACLs are provided:

module "test-bucket" {
  source     = "terraform-google-modules/cloud-storage/google"
  version    = "~> 0.1.0"
  project_id = "${var.project_id}"
  location   = "US"

  names = [
    "my-assets",
  ]

  prefix           = "${var.project_id}"
}
googleapi: Error 412: Constraint 'constraints/storage.bucketPolicyOnly' violated for 'projects/<REDACTED>' enabling legacy object ACLs, conditionNotMet

This is because there is no support for the bucket_policy_only attribute on the google_storage_bucket resource within this module. Setting this is required for satisfying the organization policy constraint.
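
Once the module supports it, usage could look roughly like this sketch, using the bucket_policy_only map input that later versions expose (version and values are placeholders):

module "test-bucket" {
  source     = "terraform-google-modules/cloud-storage/google"
  version    = "~> 1.7"
  project_id = var.project_id
  location   = "US"

  names  = ["my-assets"]
  prefix = var.project_id

  bucket_policy_only = {
    "my-assets" = true
  }
}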

Add notification configuration

Hi,

Any plans to include notification configuration on the bucket in this module anytime soon? I currently use this module and am wondering if there's a way to do the notifications independently and still use this module... should I do this after the bucket is created by using a depends_on?
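
One way notifications can be attached independently today is with the provider's google_storage_notification resource pointed at a bucket created by this module; a rough sketch (module, topic and bucket names are placeholders):

resource "google_pubsub_topic" "bucket_events" {
  name    = "bucket-events"
  project = var.project_id
}

resource "google_storage_notification" "events" {
  bucket         = module.gcs_buckets.names["first"]
  topic          = google_pubsub_topic.bucket_events.id
  payload_format = "JSON_API_V1"
  event_types    = ["OBJECT_FINALIZE"]
  # Note: the project's GCS service account also needs roles/pubsub.publisher on the topic.
}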

I can try taking this up, but don't know where to get started for working as a contributor.

Thanks
Kishore

Error: Variables not allowed

Hello,

I am trying to use a remote state file in GCS and pass the credentials to this backend through HashiCorp Vault. Is this even possible? Passing the credentials to the provider works, but not to the gcs backend.

main.tf

data "vault_generic_secret" "gcp_credentials" {
  path = "<<path_is_added_here>>"
}

// Configure the Google Cloud provider
provider "google" {
  credentials = data.vault_generic_secret.gcp_credentials.data_json
  project     = var.gcp_project
  region      = "europe-west3"
}

state.tf

terraform {
  backend "gcs" {
    bucket      = "<<bucket_added>>"
    prefix      = "<<prefix_added>>"
    credentials = data.vault_generic_secret.gcp_credentials.data_json // here is my problem
  }
}

I have tried adding it as a local as well and I still get this error. Any help is appreciated.

Error importing resources

Hi I'm trying to import bucket resource in to the state with this resource address:

git:(main) ✗ terraform import 'module.clients["myproject"].module.data_buckets.google_storage_bucket.buckets[0]' 'myproject/myproject_d6s_scoring'

But I got this error:

module.clients["myproject"].module.data_buckets.google_storage_bucket.buckets[0]: Importing from ID "myproject/myproject_d6s_scoring"...
module.clients["myproject"].module.data_buckets.google_storage_bucket.buckets[0]: Import prepared!
  Prepared google_storage_bucket for import
module.clients["myproject"].module.data_buckets.google_storage_bucket.buckets[0]: Refreshing state... [id=myproject/myproject_d6s_scoring]

Error: Invalid function argument

  on .terraform/modules/clients.data_buckets/outputs.tf line 39, in output "names":
  39:   value       = zipmap(var.names, slice(google_storage_bucket.buckets[*].name, 0, length(var.names)))
    |----------------
    | var.names is list of string with 2 elements

Invalid value for "end_index" parameter: end index must not be greater than
the length of the list.


Error: Invalid function argument

  on .terraform/modules/clients.data_buckets/outputs.tf line 44, in output "urls":
  44:   value       = zipmap(var.names, slice(google_storage_bucket.buckets[*].url, 0, length(var.names)))
    |----------------
    | var.names is list of string with 2 elements

Invalid value for "end_index" parameter: end index must not be greater than
the length of the list.

I have 2 buckets in var.names; the module version is 1.7.2.

The default value for storage_class variable in this module is using deprecated option "MULTI_REGIONAL"

The current default value for storage_class variable is "MULTI_REGIONAL":

variable "storage_class" {
  description = "Bucket storage class."
  type        = string
  default     = "MULTI_REGIONAL"
}

This value is deprecated according to the Cloud Storage documentation:

Additional classes
Cloud Storage supports several additional storage classes; however, these classes cannot be set using the Cloud Console. Unless you already are using one of these additional classes, you should use Standard Storage instead.

According to this, I suggest changing the default value to "STANDARD".

simple bucket broken as is_live removed

STEPS

module "dgraph_backups" {
  source     = "git::https://github.com/terraform-google-modules/terraform-google-cloud-storage.git//modules/simple_bucket?ref=v1.7.0"
  name       = var.name
  project_id = var.project_id
  location   = var.region

  lifecycle_rules = [{
    action = {
      type = "Delete"
    }

    condition = {
      age        = 365
      with_state = "ANY"
    }
  }]
}

Actual Results

Error: Unsupported argument

  on .terraform/modules/gcs.gcs/modules/simple_bucket/main.tf line 55, in resource "google_storage_bucket" "bucket":
  55:         is_live               = lookup(lifecycle_rule.value.condition, "is_live", null)

An argument named "is_live" is not expected here.

NOTES

`cors` and `website` parameter ignored

I tried to run version 2.1.0 with the following example code and found the cors and website parameters being ignored:

module "bucket" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "2.1.0"

  project_id    = "my-project-id"
  location      = "europe-west3"
  names         = ["xxx-xxx"]
  prefix        = ""
  storage_class = "STANDARD"
  cors = {
    x = {
      max_age_seconds = 3600
      method = [
        "POST",
        "OPTIONS",
      ]
      origin = [
        "*",
      ]
      response_header = [
        "Content-Type",
      ]
    }
    y = {
      max_age_seconds = 360
      method = [
        "POST",
      ]
      origin = [
        "*",
      ]
      response_header = [
        "Content-Type",
      ]
    }
  }
  website = {
    not_found_page = "404.html"
  }
}

This is resulting in:

Terraform will perform the following actions:

  # module.bucket.google_storage_bucket.buckets["xxx-xxx"] will be created
  + resource "google_storage_bucket" "buckets" {
      + bucket_policy_only          = (known after apply)
      + force_destroy               = false
      + id                          = (known after apply)
      + labels                      = {
          + "name" = "xxx-xxx"
        }
      + location                    = "EUROPE-WEST3"
      + name                        = "xxx-xxx"
      + project                     = "my-project-id"
      + self_link                   = (known after apply)
      + storage_class               = "STANDARD"
      + uniform_bucket_level_access = true
      + url                         = (known after apply)

      + versioning {
          + enabled = false
        }
    }

Plan: 1 to add, 0 to change, 0 to destroy.

Location included in the bucket name?

Version 1.0.0 now includes the location in the bucket name. Version 0.1.0 did not.

Having the location in the bucket name is good practice, but shouldn't 1.0.0 be backwards compatible?

Lifecycle rules

Hi,

I'm trying to use this module and am not understanding how to use lifecycle rules.

module "cloud_storage" {
  source             = "terraform-google-modules/cloud-storage/google"
  version = "1.1.0"
  project_id         = var.project_id
  prefix             = var.prefix
  names              = var.names
  bucket_policy_only = var.bucket_policy_only
  location	     = var.location
  storage_class      = var.storage_class
  force_destroy      = var.buckets_force_destroy
  lifecycle_rules = [{
    action = {
      type          = "SetStorageClass"
      storage_class = "NEARLINE"
    }
    condition = {
      age                   = "30"
      matches_storage_class = "MULTI_REGIONAL,STANDARD,DURABLE_REDUCED_AVAILABILITY"
    }
   },
   { action = {
      type          = "SetStorageClass"
      storage_class =  "COLDLINE"
    }
    condition = {
      age                   = "365"
      matches_storage_class = "MULTI_REGIONAL,STANDARD,DURABLE_REDUCED_AVAILABILITY"
    }
   }]  
}

This gives me an error -

$ terraform plan

Error: Unsupported argument

  on main.tf line 159, in module "cloud_storage":
 159:   lifecycle_rules = [{

An argument named "lifecycle_rules" is not expected here.

I do see this in variables.tf. What else needs to be done to expose this input argument?

Feature: Need to Dynamically assign encryption

For multi-regional buckets with encryption, an encryption key from the global key ring will not work; the keys need to be bound to the regional resource itself.
E.g. a bucket in the US can't use the global key to encrypt; it should use a key from the US location.
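
A sketch of what region-matched keys could look like with the existing encryption_key_names map, one module call per location (project, key ring and key names are placeholders):

module "gcs_buckets_us" {
  source  = "terraform-google-modules/cloud-storage/google"
  version = "~> 1.7"

  project_id = var.project_id
  location   = "US"
  names      = ["us-assets"]

  # The KMS key ring should live in (or cover) the bucket's location.
  encryption_key_names = {
    "us-assets" = "projects/my-project/locations/us/keyRings/us-ring/cryptoKeys/us-key"
  }
}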

Module fails when creating a single bucket

Not sure if something changed or I'm doing something wrong, but I could have sworn this used to work.

Basically, if I only specify one bucket name, a few of the outputs fail. With 2 or more, the plan appears to be successful.

I realise there is a module for single bucket creation, but I wanted to use this module as there will be an eventual need to add more buckets with similar specifications.

Create with two buckets

    source  = "terraform-google-modules/cloud-storage/google"
    version = "~> 1.3"

    project_id    = var.project_id
    names         = ["test1","test2"]
    #names         = [for name in var.bucket_names : "${local.resource_type_code}-${name}-${random_id.bucket-postfix.hex}"]
    location      = var.location
    storage_class = var.storage_class
    prefix        = var.feature_code

    versioning = {
        first = true
    }
}

No Worries

  + resource "google_storage_bucket" "buckets" {
      + bucket_policy_only = true
      + force_destroy      = false
      + id                 = (known after apply)
      + labels             = {
          + "name" = "test1"
        }
      + location           = "AUSTRALIA-SOUTHEAST1"
      + name               = "test1"
      + project            = "shareddata-dev-af5e"
      + self_link          = (known after apply)
      + storage_class      = "STANDARD"
      + url                = (known after apply)

      + versioning {
          + enabled = false
        }
    }

  # module.as-gcs-buckets.module.gcs_buckets.google_storage_bucket.buckets[1] will be created
  + resource "google_storage_bucket" "buckets" {
      + bucket_policy_only = true
      + force_destroy      = false
      + id                 = (known after apply)
      + labels             = {
          + "name" = "test2"
        }
      + location           = "AUSTRALIA-SOUTHEAST1"
      + name               = "test2"
      + project            = "shareddata-dev-af5e"
      + self_link          = (known after apply)
      + storage_class      = "STANDARD"
      + url                = (known after apply)

      + versioning {
          + enabled = false
        }
    }

Create with single bucket

    source  = "terraform-google-modules/cloud-storage/google"
    version = "~> 1.3"

    project_id    = var.project_id
    names         = ["test1"]
    #names         = [for name in var.bucket_names : "${local.resource_type_code}-${name}-${random_id.bucket-postfix.hex}"]
    location      = var.location
    storage_class = var.storage_class
    prefix        = var.feature_code

    versioning = {
        first = true
    }
}

The following errors occur in the outputs:


  on .terraform/modules/as-gcs-buckets.gcs_buckets/terraform-google-modules-terraform-google-cloud-storage-dfbab2e/outputs.tf line 19, in output "bucket":
  19:   value       = google_storage_bucket.buckets[0]
    |----------------
    | google_storage_bucket.buckets is empty tuple

The given key does not identify an element in this collection value.


Error: Invalid index

  on .terraform/modules/as-gcs-buckets.gcs_buckets/terraform-google-modules-terraform-google-cloud-storage-dfbab2e/outputs.tf line 24, in output "name":
  24:   value       = google_storage_bucket.buckets[0].name
    |----------------
    | google_storage_bucket.buckets is empty tuple

The given key does not identify an element in this collection value.


Error: Invalid index

  on .terraform/modules/as-gcs-buckets.gcs_buckets/terraform-google-modules-terraform-google-cloud-storage-dfbab2e/outputs.tf line 29, in output "url":
  29:   value       = google_storage_bucket.buckets[0].url
    |----------------
    | google_storage_bucket.buckets is empty tuple

The given key does not identify an element in this collection value.


Error: Invalid function argument

  on .terraform/modules/as-gcs-buckets.gcs_buckets/terraform-google-modules-terraform-google-cloud-storage-dfbab2e/outputs.tf line 39, in output "names":
  39:   value       = zipmap(var.names, slice(google_storage_bucket.buckets[*].name, 0, length(var.names)))
    |----------------
    | var.names is list of string with 1 element

Invalid value for "end_index" parameter: end index must not be greater than
the length of the list.


Error: Invalid function argument

  on .terraform/modules/as-gcs-buckets.gcs_buckets/terraform-google-modules-terraform-google-cloud-storage-dfbab2e/outputs.tf line 44, in output "urls":
  44:   value       = zipmap(var.names, slice(google_storage_bucket.buckets[*].url, 0, length(var.names)))
    |----------------
    | var.names is list of string with 1 element

Terraform Google Provider 3.41.0 Breaking Change for `is_live`

Hello,

Looks like the google terraform provider version 3.41.0 removed support for is_live:
hashicorp/terraform-provider-google@d1014fe

We'll patch our local copy of the module but can you create an official release to remove this?
https://github.com/terraform-google-modules/terraform-google-cloud-storage/blob/master/main.tf#L102

Too bad Terraform didn't provide a warning; it looks like it has been marked as "Removed" since July 2019.

Thanks

Simple bucket example fails - no such role storage.viewer

This sample https://registry.terraform.io/modules/terraform-google-modules/cloud-storage/google/latest/examples/simple_bucket fails to create the IAM-related resources.

Please refer to my original issue here: hashicorp/terraform-provider-google#9130

The issue seems to be that there is no role storage.viewer by default. See https://github.com/terraform-google-modules/terraform-google-cloud-storage/blob/v1.7.2/examples/simple_bucket/main.tf#L35
There's only roles/storage.admin.
