
Gradle S3 Plugin


Simple Gradle plugin that uploads and downloads S3 objects. It is designed to work with Gradle version 7 and later.

Setup

Add the following to your build.gradle file:

plugins {
    id 'com.mgd.core.gradle.s3' version '2.0.4'
}

Versioning

This project uses semantic versioning.

See the Gradle plugin portal page for other versions.

Usage

Authentication

The S3 plugin searches for credentials in the same order as the AWS default credentials provider chain. See the AWS Docs for details on credentials.

Profiles

You can specify a default credentials profile for the project to use by setting the project s3.profile property. These credentials will be used if no other authentication mechanism has been specified for the Gradle task.

s3 {
    profile = 'my-profile'
}

Environment Variables

Setting the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY is another way to provide your S3 credentials. Setting these variables at the machine level makes them available to every Gradle project as the default credentials. The environment variables can also be set or overridden at the task level for each Gradle task which requires them.
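For example, on a Unix-like shell the variables might be exported before invoking the build. This is a sketch with placeholder values, not real credentials:

```shell
# Placeholder credentials for illustration only -- never commit real keys.
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"

# Any Gradle invocation from this shell now inherits these variables
# through the AWS default credentials provider chain.
echo "AWS_ACCESS_KEY_ID is set: ${AWS_ACCESS_KEY_ID}"
```

In CI environments these values are typically injected as masked pipeline secrets rather than exported by hand.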

System Properties

Another way to set S3 credentials is to set system properties at the Gradle task level. This may be useful for managing multiple tasks in the same project which each require distinct credentials.
For example, suppose you want distinct tasks for uploading to different S3 buckets, each with different security credentials. You could define different Gradle tasks (e.g. uploadToS3Profile1, uploadToS3Profile2) and map the credentials to each using the AWS SDK v2 system properties:

['Profile1', 'Profile2'].each { profile ->
    tasks.register("uploadToS3${profile}", S3Upload) {
        // credentials are injected into the project as profile1KeyId/profile1SecretAccessKey and profile2KeyId/profile2SecretAccessKey
        System.setProperty('aws.accessKeyId', project.ext."${profile.toLowerCase()}KeyId")
        System.setProperty('aws.secretAccessKey', project.ext."${profile.toLowerCase()}SecretAccessKey")

        bucket = 'target-bucketname'
        key = 'artifact.jar'
        file = layout.buildDirectory.file('libs/artifact.jar').get().asFile
        overwrite = true
    }
}

Note that this example is provided for illustrative purposes only. All passwords should be externalized, secured via access control and/or encrypted. A good option for managing secrets in build files is the Gradle Credentials plugin.

Amazon S3 Endpoint

The s3.endpoint property can be used to define an Amazon S3 compatible third-party cloud or Docker environment for all tasks (e.g. LocalStack). This option is only valid when combined with the region property (either defined globally using s3.region or defined for the task using task-level properties). Endpoints can also be defined on a per-task basis, which enables switching between Amazon S3 and third-party endpoints for each task, if needed.

s3 {
    endpoint = 'http://localstack.cloud'
    region = 'global'
}
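Because endpoints can also be set per task, a single build can mix a local emulator with the real Amazon S3 service. A minimal sketch, assuming a LocalStack container on its default edge port (the task, bucket, and file names are hypothetical):

```groovy
// Hypothetical task targeting a LocalStack container instead of Amazon S3
tasks.register('uploadToLocalStack', S3Upload) {
    endpoint = 'http://localhost:4566'   // assumed LocalStack edge port
    region = 'us-east-1'                 // a region is required whenever an endpoint is set
    bucket = 'local-test-bucket'
    key = 'artifact.jar'
    file = layout.buildDirectory.file('libs/artifact.jar').get().asFile
}
```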

AWS Region

The s3.region property can optionally be set to define the AWS region if one has not been set in the authentication profile. It can also be used to override the default region set by the AWS credentials provider. Regions can also be defined on a per-task basis.

s3 {
    region = 'us-east-1'
}

Default S3 Bucket

The s3.bucket property sets a default S3 bucket that is common to all tasks. This can be useful if all S3 tasks operate against the same Amazon S3 bucket.

s3 {
    bucket = 'my-default-bucketname'
}

Tasks

The following Gradle tasks are provided.

S3Upload

Uploads one or more files to S3. This task has two modes of operation: single file upload and directory upload (including recursive upload of all child subdirectories).

Properties that apply to both modes:

  • profile - credentials profile to use (optional, defaults to the project s3 configured profile)
  • bucket - S3 bucket to use (optional, defaults to the project s3 configured bucket, if any)
  • region - the AWS region (optional, defaults to the project s3 configured region, if any)
  • endpoint - the third-party Amazon S3 compatible endpoint (optional, defaults to the project s3 configured endpoint, if any)

Single file upload:

  • key - key of S3 object to create
  • file - path of file to be uploaded
  • overwrite - (optional, default is false), if true the S3 object will be created or overwritten even if it already exists
  • then - (optional), callback closure called upon completion with the java.io.File that was uploaded

By default S3Upload does not overwrite the S3 object if it already exists. Set overwrite to true to upload the file even if it exists.
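A single-file upload that replaces any existing object might look like the following sketch (the bucket, key, and file names are hypothetical):

```groovy
tasks.register('uploadReleaseNotes', S3Upload) {
    bucket = 'my-release-bucket'        // hypothetical bucket name
    key = 'docs/release-notes.txt'      // S3 object key to create
    file = 'build/release-notes.txt'    // local file to upload
    overwrite = true                    // replace the object if it already exists
}
```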

Directory upload:

  • keyPrefix - root S3 prefix under which to create the uploaded contents (optional, if not provided files will be uploaded to S3 bucket root)
  • sourceDir - local directory containing the contents to be uploaded

A directory upload will always overwrite existing content if it already exists under the specified S3 prefix.
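A directory upload, including all child subdirectories, might be configured as follows (the names are hypothetical):

```groovy
tasks.register('uploadSiteDocs', S3Upload) {
    bucket = 'my-docs-bucket'   // hypothetical bucket name
    keyPrefix = 'site'          // objects are created under the site/ prefix
    sourceDir = 'build/docs'    // local directory to upload recursively
}
```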

S3Download

Downloads one or more S3 objects. This task has three modes of operation: single file download, recursive download and path pattern matching.

Properties that apply to all modes:

  • profile - credentials profile to use (optional, defaults to the project s3 configured profile)
  • bucket - S3 bucket to use (optional, defaults to the project s3 configured bucket, if any)
  • region - the AWS region (optional, defaults to the project s3 configured region, if any)
  • endpoint - the third-party Amazon S3 compatible endpoint (optional, defaults to the project s3 configured endpoint, if any)

Single file download:

  • key - key of S3 object to download
  • file - local path of file to save the download to
  • version - (optional), the specific object version id to download if the bucket has S3 versioning enabled
    • if S3 versioning is not enabled, this field should not be provided
    • if S3 versioning is enabled and a version id is not provided, the latest version will always be downloaded
    • discovery of S3 object version ids must be accomplished via other means and is beyond the scope of this plugin
      NOTE: care should be exercised when using this parameter, as incorrect values will cause the task to fail
  • then - (optional), callback closure called upon completion with the java.io.File that was downloaded
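A single-file download of a specific object version might look like this sketch (the bucket name and version id are hypothetical placeholders; real version ids must be discovered by other means):

```groovy
tasks.register('downloadPreviousConfig', S3Download) {
    bucket = 'my-versioned-bucket'      // hypothetical bucket with S3 versioning enabled
    key = 'config/settings.json'
    version = 'example-version-id'      // placeholder; an incorrect id fails the task
    file = 'build/settings.json'
    then = { File f ->
        println("Restored ${f.name}")
    }
}
```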

Recursive download:

  • keyPrefix - S3 prefix of objects to download (optional, if not provided entire S3 bucket will be downloaded)
  • destDir - local directory to download objects to

Path pattern matching:

  • pathPatterns - a list of path patterns to match against, which can specify any combination of the following items:
    • an individual S3 object name (e.g. /path/to/some-file.txt)
    • a key prefix pointing to a folder (e.g. /some-folder/)
      NOTE: when specifying folders, the folder name must end with a trailing forward slash (i.e. /); otherwise it will be treated as an object name
    • a wildcard path pattern ending with an asterisk to search for matching folders (e.g. /parent-folder/child-folder/folder-name-prefix-*)
  • destDir - local directory to download objects into
  • then - (optional, invoked only on individual S3 object name patterns), callback closure called upon completion with the java.io.File that was downloaded

Example:

...

s3 {
    bucket = 'project-default-bucketname'
    endpoint = 'http://localstack.cloud'
    region = 'us-east-1'
}

tasks.register('defaultFilesDownload', S3Download) {
    keyPrefix = 'sourceFolder'
    destDir = 'targetDirectory'
}

tasks.register('singleFileDownload', S3Download) {
    bucket = 'task-source-bucketname'
    key = 'source-filename'
    file = 'target-filename'
    then = { File file ->
        // do something with the file
        println("Downloaded file named ${file.name}!")
    }
}

tasks.register('downloadRecursive', S3Download) {
    keyPrefix = 'recursive/sourceFolder'
    destDir = './some/recursive/targetDirectory'
}

tasks.register('downloadPathPatterns', S3Download) {
    bucket = 'another-task-source-bucketname'
    pathPatterns = [
        'path/to/filename.txt',
        'single-folder/',
        'matching/folder/with-prefix-names*'
    ]
    destDir = 'pathPatternMatches'
    then = { File file ->
        // do something with the file
        println("Downloaded the file named 'path/to/filename.txt' to ${file.parent}!")
    }
}

tasks.register('filesUpload', S3Upload) {
    bucket = 'task-target-bucketname'
    keyPrefix = 'targetFolder'
    sourceDir = 'sourceDirectory'
}

tasks.register('defaultSingleFileUpload', S3Upload) {
    key = 'target-filename'
    file = 'source-filename'
}

Note:

Recursive downloads create a sparse directory tree containing the full keyPrefix under destDir. For example, given an S3 bucket containing the object keys:

top/foo/bar
top/README

a recursive download:

tasks.register('downloadRecursive', S3Download) {
    keyPrefix = 'top/foo/'
    destDir = 'local-dir'
}

results in this local tree:

local-dir/
└── top
    └── foo
        └── bar

So only files under top/foo are downloaded, but their full S3 paths are appended to destDir. This differs from the AWS CLI command aws s3 cp --recursive, which prunes the root of the downloaded objects. Use the flexible Gradle Copy task to prune the tree after downloading it.

For example:

String s3PathTree = 'path/to/source/location'
String tempDownloadRoot = 'temp-download-root'

tasks.register('downloadRecursive', S3Download) {
    bucket = 's3-bucket-name'
    keyPrefix = "${s3PathTree}"
    destDir = layout.buildDirectory.dir(tempDownloadRoot).get().asFile
}

// prune and re-root the downloaded tree, removing the keyPrefix
tasks.register('pruneDownload', Copy) {

    dependsOn(tasks.downloadRecursive)

    from layout.buildDirectory.dir("${tempDownloadRoot}/${s3PathTree}")
    into layout.buildDirectory.dir('path/to/destination')
}

Progress Reporting

Downloads report percentage progress at the Gradle INFO log level. Run Gradle with the -i option to see download progress.

License

MIT License

Contributors

github-actions[bot], jonnybot0, narenanandan, peter-thomas, peter-thomas-mgd, sanmibuh


gradle-s3-plugin's Issues

Not able to upload directory to bucket's root

Currently it is not possible to upload a directory to the root of a bucket:

task deployDocs(type: S3Upload) {
    bucket = System.getenv("MY_BUCKET")
    sourceDir = 'site'
}

Class S3Upload does not support this operation because of Groovy truth (keyPrefix is null in the above example):

if (keyPrefix && sourceDir) {

Would be very nice to have this supported in a next version.
Any chance to get this quickly released?

Could not get unknown property issue

Hi,
I have followed the README to add the plugin.

plugins {
	id "java"
	id "maven"
	id "maven-publish"
	id 'org.springframework.boot' version '2.3.1.RELEASE'
	id 'io.spring.dependency-management' version '1.0.9.RELEASE'
	id 'com.mgd.core.gradle.s3' version '1.1.0'
}
.
.
.
s3 {
	bucket = 'mybucketname'
	profile = 'myprofile'
}

task upload(type: S3Upload) {
	key = 'myobjectname'
	file = 'myfileIwantToUpload'
}

But I get the error:

A problem occurred evaluating root project 'redacted'.
> Could not get unknown property 'S3Upload' for root project...

Must be doing something wrong, any ideas?

Thanks

Use Alternative AWS providers

Hello,
It would be helpful to allow the use of AWS-compatible third-party clouds like Cloudflare or LocalStack.

To support this, we simply need to add another parameter for the service endpoint URL. The main change is adding this line

builder.withEndpointConfiguration(new EndpointConfiguration("myAlternativeEndpoint", region))
here:
https://github.com/mygrocerydeals/gradle-s3-plugin/blob/master/src/main/groovy/com/mgd/core/gradle/AbstractS3Task.groovy#L59

What do you think?

I could create a PR with the changes

Gradle with Configuration Cache

I use Gradle with Kotlin DSL. Everything worked fine until I tried to turn on the Gradle Configuration Cache.

  plugins {
      id("com.github.node-gradle.node") apply true
      id("com.mgd.core.gradle.s3") version "1.3.1" apply true
  }
  ...
   s3 {
          region = "us-east-1"
          bucket = "s3ArtifactFolder"
      }
  ...
    tasks.register("uploadArtifactToS3", S3Upload::class) {
        keyPrefix = "myS3Dir"
        sourceDir = "aritfactAbsolutePath"
    }

Error:

Caused by: groovy.lang.MissingPropertyException: Could not get unknown property 's3' for project ':uploadArtifactToS3' of type org.gradle.api.Project.
	at org.gradle.internal.metaobject.AbstractDynamicObject.getMissingProperty(AbstractDynamicObject.java:85)
	at org.gradle.internal.metaobject.AbstractDynamicObject.getProperty(AbstractDynamicObject.java:62)
	at org.gradle.api.internal.project.DefaultDynamicLookupRoutine.property(DefaultDynamicLookupRoutine.java:31)
	at org.gradle.api.internal.project.DefaultProject.property(DefaultProject.java:1158)
	at org.gradle.api.internal.project.DefaultProject.getProperty(DefaultProject.java:1135)
	at com.mgd.core.gradle.AbstractS3Task.getProfile(AbstractS3Task.groovy:34)
	at com.mgd.core.gradle.S3Upload_Decorated.getProfile(Unknown Source)

How to fix this?

fails to download large files

setup was done using:

s3 {
  region = 'us-east-1'
}

task downloadData(type: S3Download) {
    bucket = 'my-bucket'
    key = "data.tar.tbz2"
    file = "${buildDir}/data/data.tar.tbz2"
}

error:

"Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection. This is likely an error and may result in sub-optimal behavior. Request only the bytes you need via a ranged GET or drain the input stream after use."

Download specific version of object

S3 buckets have object versioning, so I want to know if I can download a specific version of an object. It's possible with the AWS SDK in Python:

s3 = boto3.resource('s3')
bucket = s3.Bucket("mybucket")
bucket.download_file(
    "somefile",
    "/donwload/path/somefile.txt",
    ExtraArgs={"VersionId": "my_version"}
)

Where "my_version" is the version I want to download. Is that possible with this plugin? If not, is that a feature that can be implemented?
