
You can use the Amazon S3 Object task to upload, download, delete, or copy Amazon Simple Storage Service (Amazon S3) objects (files). In particular, you can upload your Bamboo job's build artifacts or select local files and directories (optionally via Ant patterns). When addressing S3 objects, the task matches them by key prefix, which allows it to operate on multiple objects at once (virtual directories resp. folder hierarchies).
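Key-prefix matching can be pictured as a simple filter over object keys: a prefix selects every object whose key starts with it, which is what makes "virtual directories" work. A minimal sketch (the bucket contents and prefix are made up for illustration):

```python
# Illustration of S3 key-prefix matching: a prefix selects every object
# whose key starts with it, so one prefix can address many objects.
object_keys = [
    "builds/app-1.0.jar",
    "builds/app-1.1.jar",
    "docs/readme.txt",
]

def match_by_prefix(keys, prefix):
    """Return all object keys that start with the given prefix."""
    return [key for key in keys if key.startswith(prefix)]

print(match_by_prefix(object_keys, "builds/"))
# → ['builds/app-1.0.jar', 'builds/app-1.1.jar']
```

An empty prefix matches every object in the bucket, which is why the actions below treat the key prefix as addressing one or more objects.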

Configuration

To configure an Amazon S3 Object task:

Navigate to the Tasks configuration tab for the job (this will be the default job if creating a new plan).

Click the name of an existing Amazon S3 Object task, or click Add Task and then Amazon S3 Object Task to create a new task.

Complete the following settings:

Common to all tasks


Common (Bamboo)


Task Description – (Optional) Identify the purpose of the task.
Disable this task – Check, or clear, to selectively run this task.

...Configure task/action specific parameters, see below ...
Bamboo Variables
Namespace – Provide the namespace for generated variables (defaults to custom.aws for backward compatibility).
Scope – Select the scope for generated variables: either Local (variables will only be available in this job) or Result (variables will be available in subsequent plan stages and deployment releases).

Common (AWS)


Action

Each task supports one or more actions depending on the target AWS resource. Select the desired action and configure the action-specific parameters below.

Region – Select the desired AWS region. Alternatively, select [Use region variable ...] to supply the region dynamically via Bamboo variables (needs to be a region code such as ap-southeast-2) – refer to How to parametrize the AWS region via a Bamboo variable for details.
...Configure task/action specific parameters, see below ...
AWS Security Credentials
Source – Select the source for the required AWS Security Credentials – either Inline, an IAM Role for EC2, or a shared Identity Federation for AWS Connector.
Connector – (Conditional) Select the shared Identity Federation for AWS Connector. Alternatively, select [Use connector variable ...] to supply the connector dynamically via Bamboo variables (needs to be a connector id such as f24e81bc-7aff-42db-86a2-7cf82e24d871) – refer to How to parametrize the AWS connector via a Bamboo variable for details.

Actions supported by this task: Upload File(s), Download Object(s), Delete Object(s), Copy Object(s), and Generate Pre-signed URL – see below.

S3 Features

Most features offered by S3 are available by means of specifying additional Metadata fragments.

ZIP Archive Support

As of Tasks for AWS 2.9, you can compress files into a ZIP archive during upload, which eases using AWS services that support or require the deployment artifact to be a ZIP file. This is enabled by the Upload as ZIP archive option for the Amazon S3 Object task's Upload File(s) action.
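Conceptually, the option bundles the selected files into a single archive before the upload happens. A rough sketch of that step using Python's standard zipfile module (the file names and contents are made up for illustration):

```python
import io
import zipfile

# Rough sketch of what "Upload as ZIP archive" amounts to: the selected
# files are bundled into one ZIP archive, and that single artifact is
# what gets uploaded to S3 (file names/contents are illustrative only).
selected_files = {
    "index.html": b"<html></html>",
    "css/site.css": b"body {}",
}

buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
    for name, data in selected_files.items():
        archive.writestr(name, data)

archive_bytes = buffer.getvalue()
print(zipfile.ZipFile(io.BytesIO(archive_bytes)).namelist())
# → ['index.html', 'css/site.css']
```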


Upload File(s)


Don't fail if nothing to upload – Check to not fail the build if there is nothing to upload. Clear to trigger a build failure otherwise.
Artifact

Select the artifact you want to upload.

Source Local Path

Specify the local path (relative to the Bamboo working directory) to the files you want to upload. Use commas to separate files and directories. You can also use Ant patterns to select files.

Use default excludes when selecting files

Check to apply Ant default excludes when selecting files. Uncheck to ignore default excludes.

  • (warning) Due to the way file selection is implemented in Bamboo, this setting also applies when the Use Ant patterns to select files option is not checked!

  • (info) As of Ant 1.8.2, the default excludes also contain common version control meta files/directories (refer to the related upstream issue SCP Task to disable Ant default excludes (BAM-17438) for details):

    Default excludes contains common version control meta files/directories
    **/.git
    **/.git/**
    **/.gitattributes
    **/.gitignore
    **/.gitmodules
    **/.hg
    **/.hg/**
    **/.hgignore
    **/.hgsub
    **/.hgsubstate
    **/.hgtags
    **/.bzr
    **/.bzr/**
    **/.bzrignore
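The effect of those default excludes can be approximated by checking each path's components against the excluded directory and file names. A rough sketch (not Ant's actual pattern matcher, just an illustration of what the patterns above exclude):

```python
# Rough approximation of Ant's version-control default excludes:
# skip any path that lives under (or is) a VCS meta directory,
# or whose file name is one of the VCS meta files.
EXCLUDED_DIRS = {".git", ".hg", ".bzr"}
EXCLUDED_FILES = {
    ".gitattributes", ".gitignore", ".gitmodules",
    ".hgignore", ".hgsub", ".hgsubstate", ".hgtags",
    ".bzrignore",
}

def is_default_excluded(path):
    """True if a relative path matches the VCS-related default excludes."""
    parts = path.split("/")
    if any(part in EXCLUDED_DIRS for part in parts):
        return True
    return parts[-1] in EXCLUDED_FILES

print([p for p in ["src/main.c", ".git/config", "docs/.gitignore"]
       if not is_default_excluded(p)])
# → ['src/main.c']
```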
Upload as ZIP archive – Check to compress the selected file(s) into a single ZIP archive before uploading. Uncheck to upload the files separately.
Target Bucket Name

Specify the name of the S3 bucket where you want the files to be uploaded to.

Target Object Key Prefix (Virtual Directory)
(Optional) Specify the S3 object key prefix you want the uploaded files to gain in the target bucket.
Metadata Configuration

(Optional) Specify additional metadata in JSON format. Insert fragments from the inline Examples dialog to get started.

  • Refer to PUT Object and PUT Object - Copy for available metadata options and values covering various use cases, for example:

    •  – Declares file(s) to be compressed css.
    • Activate server-side encryption and reduced redundancy storage.
    • Redirect request to another location, if the bucket is configured as a website.
    • – Sets a one hour maximum age.
    •  – Grant public read access.
    •  – Replace source metadata with new values.
      • (warning) Note that this has somewhat tricky implications, refer to PUT Object - Copy for details.
    • Tagging – Use object tagging to categorize storage, refer to Object Tagging for details.
    • Object lock retention and legal hold – Store objects using a "Write Once Read Many" (WORM) model, refer to Object Lock for details.
    • ...
Tags

(Optional) Specify tags to apply to the resulting object(s) in JSON format. Insert fragments from the inline Examples dialog to get started.
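The exact JSON shape the task expects comes from its inline Examples dialog; as an assumption modelled on the S3 Tagging API's Key/Value pairs, a tag fragment could look like the following (tag names and values are made up):

```python
import json

# Hypothetical tag fragment modelled on the S3 Tagging API's Key/Value
# pairs - the task's actual schema comes from its inline Examples dialog.
tags = [
    {"Key": "team", "Value": "build"},
    {"Key": "stage", "Value": "release"},
]
print(json.dumps(tags))

# An empty JSON array ([]) carries no tags at all.
```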

Download Object(s)


Don't fail if nothing to download – Check to not fail the build if there is nothing to download. Clear to trigger a build failure otherwise.
Source Bucket Name

Specify the name of the S3 bucket where you want the objects to be downloaded from.

Source Object Key Prefix
Specify the key prefix of the S3 objects you want to download from the source bucket.
Target Local Path – (Optional) Specify the local path (relative to the working directory) where the objects will be downloaded to.

Delete Object(s)


Don't fail if nothing to delete – Check to not fail the build if there is nothing to delete. Clear to trigger a build failure otherwise.
Source Bucket Name

see Download above

Source Object Key Prefix
see Download above

Copy Object(s)


Don't fail if nothing to copy – Check to not fail the build if there is nothing to copy. Clear to trigger a build failure otherwise.
Source Bucket Name

see Download above

Source Object Key Prefix
see Download above
Target Bucket Name

see Upload above

Target Object Key Prefix (Virtual Directory)
see Upload above
Metadata Configuration – see Upload above
Tags

(Optional) Specify tags to apply to the resulting object(s) in JSON format. Insert fragments from the inline Examples dialog to get started.

  • (lightbulb) If you do not specify any tags, the existing tags will be copied – you can prevent the copying of existing tags by providing an empty JSON array.
  • (info) Refer to Object Tagging for details.

Generate Pre-signed URL

Use S3 to provide configuration as code

The Generate Pre-signed URL action allows using S3 objects to provide configuration as code, refer to Injecting task configuration via URLs for details.

Bucket Name

Specify the name of the S3 bucket the URL should target.

Object Key
Specify the key of the S3 object the URL should target.
Method

Select the HTTP method the URL is supposed to be used with.

  • (info) Currently supported is GET – if you have a use case for PUT, HEAD, or DELETE, please vote and comment on UAA-259.
Expiration – Specify how long the URL should be valid (in seconds).
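The expiration instants reported in the build logs use AWS's compact UTC timestamp form (YYYYMMDD'T'HHMMSS'Z'). A small sketch of how a validity window in seconds maps to such a timestamp (the sample instant is made up to match the log excerpt further below):

```python
from datetime import datetime, timedelta, timezone

def expiration_timestamp(now, valid_for_seconds):
    """Format an expiry instant in AWS's compact UTC form, e.g. 20170511T103911Z."""
    expires = now + timedelta(seconds=valid_for_seconds)
    return expires.strftime("%Y%m%dT%H%M%SZ")

# A URL issued at this (made-up) instant, valid for one hour:
issued = datetime(2017, 5, 11, 9, 39, 11, tzinfo=timezone.utc)
print(expiration_timestamp(issued, 3600))
# → 20170511T103911Z
```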

Variables

All tasks support Bamboo Variable Substitution/Definition - this task's actions generate variables as follows:

A task's generated variables might be amended with respective AWS API additions over time – a live build log will always show the most current variable shape.
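As the sample logs below suggest, a generated variable name is composed from the bamboo. prefix, the configured Namespace (default custom.aws), and a task-specific suffix. A sketch of that composition (the s3.object.first segment is taken from the sample output; that first presumably denotes the first object processed):

```python
# Sketch of how generated variable names compose: the "bamboo." prefix,
# the configured Namespace (default: custom.aws), and a task-specific
# suffix as seen in the sample build logs below.
def variable_name(field, namespace="custom.aws"):
    return "bamboo.{}.s3.object.first.{}".format(namespace, field)

print(variable_name("BucketName"))
# → bamboo.custom.aws.s3.object.first.BucketName
```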

Upload File(s)

Creating resource variables for uploaded object 'prefix/taws-tst-object-4B.txt':
... bamboo.custom.aws.s3.object.first.BucketName: taws-tst-target-us-east-1
... bamboo.custom.aws.s3.object.first.ETag: 1dafad37f6d9e169248bacb8485fd9cc
... bamboo.custom.aws.s3.object.first.ObjectKey: prefix/taws-tst-object-4B.txt
... bamboo.custom.aws.s3.object.first.VersionId: null

Download Object(s)

N/A

Delete Object(s)

N/A

Copy Object(s)

Creating resource variables for copied object 'taws-tst-object-4B.a.txt':
... bamboo.custom.aws.s3.object.first.BucketName: taws-tst-target-us-east-1
... bamboo.custom.aws.s3.object.first.ETag: 1dafad37f6d9e169248bacb8485fd9cc
... bamboo.custom.aws.s3.object.first.ObjectKey: taws-tst-object-4B.a.txt
... bamboo.custom.aws.s3.object.first.VersionId: null

Generate Pre-signed URL

Creating resource variables for pre-signed URL 'https://taws-tst-source-us-east-1.s3.amazonaws.com/taws-tst-object-4B.txt':
... bamboo.custom.aws.s3.object.first.PresignedUrl.password: ******
... bamboo.custom.aws.s3.object.first.BucketName: taws-tst-source-us-east-1
... bamboo.custom.aws.s3.object.first.ObjectKey: taws-tst-object-4B.txt
... bamboo.custom.aws.s3.object.first.Method: GET
... bamboo.custom.aws.s3.object.first.Expiration: 20170511T103911Z

How-to Articles

Frequently Asked Questions (FAQ)

