Using the Amazon S3 Object task in Bamboo

You can use the Amazon S3 Object task to upload, download, delete, or copy Amazon Simple Storage Service (Amazon S3) objects (files). In particular, you can upload your Bamboo job's build artifacts or select local files and directories (optionally via Ant patterns). When addressing S3 objects, the task matches them by key prefix, which allows operating on multiple objects at once (virtual directories, i.e. folder hierarchies).

 

Configuration

To configure an Amazon S3 Object task:

1. Navigate to the Tasks configuration tab for the job (this will be the default job if creating a new plan).

2. Click the name of an existing Amazon S3 Object task, or click Add Task and then Amazon S3 Object Task to create a new task.

3. Complete the following settings:

a. Common to all tasks

Common (Bamboo)


Task Description

(Optional) Identify the purpose of the task.

Disable this task

Check or clear to selectively run this task.

...Configure task/action specific parameters, see below ...
Bamboo Variables
Namespace

Provide the namespace for generated variables – defaults to custom.aws for backward compatibility.
Scope

Select the scope for generated variables – can be either Local (Variables will only be available in this job), or Result (Variables will be available in subsequent plan stages and deployment releases).
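Variables generated under this namespace can be referenced like any other Bamboo variable in subsequent tasks or stages. Assuming the default custom.aws namespace, a reference takes the following form (the trailing variable name is a placeholder – the actual variable names are emitted to the build log):

```
${bamboo.custom.aws.<variableName>}
```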

Common (AWS)


Action

Each task supports one or more actions depending on the target AWS resource. Select the desired action and configure the action specific parameters below.

Region

Select the desired AWS Region. Alternatively, select [Use region variable ...] to supply the region dynamically via Bamboo variables (needs to be a region code such as ap-southeast-2) - refer to How to parametrize the AWS region via a Bamboo variable for details.
...Configure task/action specific parameters, see below ...
AWS Security Credentials
Source

Select the source for the required AWS Security Credentials – can be either Inline, an IAM Role for EC2 or a shared Identity Federation for AWS Connector.
Connector

(Conditional) Select the shared Identity Federation for AWS Connector. Alternatively, select [Use connector variable ...] to supply the connector dynamically via Bamboo variables (needs to be a connector id such as f24e81bc-7aff-42db-86a2-7cf82e24d871) - refer to How to parametrize the AWS connector via a Bamboo variable for details.

b. Actions supported by this task:

S3 Features

Most features offered by S3 are available by means of specifying additional Metadata fragments.

ZIP Archive Support

As of Tasks for AWS 2.9, you can compress files into a ZIP archive during upload, which eases working with AWS services that support or require the deployment artifact to be a ZIP file. This is enabled by the Upload as ZIP archive option for the Amazon S3 Object task's Upload File(s) action.

 

Upload File(s)

Don't fail if nothing to upload

Check to not fail the build if there is nothing to upload; clear to trigger a build failure instead.

Artifact

Select the artifact you want to upload.

Source Local Path

Specify the local path (relative to the Bamboo working directory) to the files you want to upload. Use commas to separate files and directories. Ant patterns are supported when the Use Ant patterns to select files option is checked.
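For illustration, a Source Local Path value combining a plain directory and Ant patterns, separated by commas (the file and directory names are hypothetical):

```
target/reports, build/**/*.zip, docs/**
```

The Ant patterns only take effect when the Use Ant patterns to select files option is checked.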

Use default excludes when selecting files

Check to apply Ant default excludes when selecting files. Uncheck to ignore default excludes.

  •  Due to the way file selection is implemented in Bamboo, this setting also applies when the Use Ant patterns to select files option is not checked!

  •  As of Ant 1.8.2 the default excludes also contain common version control meta files/directories (refer to the related upstream issue SCP Task to disable Ant default excludes (BAM-17438) for details):

    Default excludes contains common version control meta files/directories

    **/.git **/.git/** **/.gitattributes **/.gitignore **/.gitmodules **/.hg **/.hg/** **/.hgignore **/.hgsub **/.hgsubstate **/.hgtags **/.bzr **/.bzr/** **/.bzrignore

Upload as ZIP archive

Check to compress the selected file(s) into a single ZIP archive before uploading. Uncheck to upload separately.

Target Bucket Name

Specify the name of the S3 bucket where you want the files to be uploaded to.

Target Object Key Prefix (Virtual Directory)

(Optional) Specify the S3 object key prefix you want the uploaded files to gain in the target bucket.

Metadata Configuration

(Optional) Specify additional metadata in JSON format. Insert fragments from the inline Examples dialog to get started.

  • Refer to PUT Object and PUT Object - Copy for available metadata options and values covering various use cases, for example:

  • Content type and encoding – Declares file(s) to be compressed CSS.

  • Encryption and storage class – Activate server-side encryption and reduced redundancy storage.

  • Website redirect location – Redirect request to another location, if the bucket is configured as a website. 

  • Cache control – Sets a one hour maximum age.

  • Access control – Grant public read access.

  • (Copy only) Replace metadata – Replace source metadata with new values.
    Note that this has somewhat tricky implications, refer to PUT Object - Copy for details.

  • Tagging – Use object tagging to categorize storage, refer to Object Tagging for details.

  • Object lock retention and legal hold – Store objects using a "Write Once Read Many" (WORM) model, refer to Object Lock for details.

  • ...
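As a sketch only – the authoritative fragments come from the task's inline Examples dialog, and the exact key names shown here are assumptions – a metadata configuration combining cache control with a content type and encoding might look like:

```json
{
    "cacheControl": "max-age=3600",
    "contentType": "text/css",
    "contentEncoding": "gzip"
}
```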

Tags

(Optional) Specify tags to apply to the resulting object(s) in JSON format. Insert fragments from the inline Examples dialog to get started.
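For illustration, a tags fragment using the common key/value array shape (the tag names are hypothetical, and the exact JSON shape expected by the task is best taken from the inline Examples dialog):

```json
[
    { "key": "team", "value": "build" },
    { "key": "environment", "value": "staging" }
]
```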

Download Object(s)

 

Don't fail if nothing to download

Check to not fail the build if there is nothing to download; clear to trigger a build failure instead.

Source Bucket Name

Specify the name of the S3 bucket where you want the objects to be downloaded from.

Source Object Key Prefix

Specify the key prefix of the S3 objects you want to download from the source bucket.

Target Local Path

(Optional) Specify the local path (relative to the working directory) where the objects will be downloaded to.

Delete Object(s)

 

Don't fail if nothing to delete

Check to not fail the build if there is nothing to delete; clear to trigger a build failure instead.

Source Bucket Name

see Download above

Source Object Key Prefix

see Download above

Copy Object(s)

 

Don't fail if nothing to copy

Check to not fail the build if there is nothing to copy; clear to trigger a build failure instead.

Source Bucket Name

see Download above

Source Object Key Prefix

see Download above

Target Bucket Name

see Upload above

Target Object Key Prefix (Virtual Directory)

see Upload above

Metadata Configuration

see Upload above

Tags

(Optional) Specify tags to apply to the resulting object(s) in JSON format. Insert fragments from the inline Examples dialog to get started.

  •  If you do not specify any tags, the existing tags will be copied – you can prevent the copying of existing tags by providing an empty JSON array.

  •  Refer to Object Tagging for details.
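For example, to prevent the existing tags from being copied without applying new ones, provide an empty JSON array:

```json
[]
```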

Generate Pre-signed URL

Use S3 to provide configuration as code

The Generate Pre-signed URL action allows using S3 objects to provide configuration as code, refer to Injecting task configuration via URLs for details.

Bucket Name

Specify the name of the S3 bucket the URL should target.

Object Key

Specify the key of the S3 object the URL should target.

Method

Select the HTTP method the URL will be used with.

Expiration

Specify how long the URL should be valid, in seconds.
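For orientation, a generated pre-signed URL is a regular S3 object URL carrying AWS Signature Version 4 query parameters; the bucket, key, and region below are placeholders, and X-Amz-Expires reflects the Expiration setting:

```
https://my-bucket.s3.ap-southeast-2.amazonaws.com/my-object-key?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=...&X-Amz-Date=20250101T000000Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=...
```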

Atlassian®, Atlassian Bamboo®, Bitbucket®, Atlassian Crowd®, Confluence®, Jira®, Jira Service Management™, Opsgenie®, and Statuspage™ are registered trademarks of Atlassian.
Amazon Web Services™, AWS™ and the “Powered by Amazon Web Services” logo are trademarks of Amazon.com, Inc. or its affiliates in the United States and/or other countries.

Utoolity® is a registered trademark of Utoolity GmbH.
© 2025 Utoolity GmbH. All rights reserved.