Documentation for Tasks for AWS 2.5 – other releases are available in the Tasks for AWS Documentation Directory, or visit the current documentation home.
Using the Amazon S3 Object task in Bamboo
You can use the Amazon S3 Object task to upload, download, delete or copy Amazon Simple Storage Service (Amazon S3) objects (files). In particular, you can upload your Bamboo job's build artifacts or select local files and directories (optionally via Ant patterns). When addressing existing S3 objects, the task matches them by key prefix, which allows it to operate on multiple objects at once (virtual directories resp. folder hierarchies).
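The key-prefix matching is a property of S3 itself rather than of this task: S3 has no real directories, so a shared key prefix acts as a virtual folder. As a rough illustration of that semantics only (not of the task's implementation), listing a bucket by prefix with the AWS SDK for Python returns every object "inside" such a virtual directory; the bucket name and prefix below are hypothetical:

```python
# Rough illustration of S3 key-prefix semantics (hypothetical bucket/prefix).
import boto3

s3 = boto3.client("s3", region_name="ap-southeast-2")
response = s3.list_objects_v2(Bucket="my-bucket", Prefix="builds/42/")
for obj in response.get("Contents", []):
    # Matches e.g. builds/42/app.war and builds/42/docs/index.html alike.
    print(obj["Key"])
```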
Configuration
To configure an Amazon S3 Object task:
- Navigate to the Tasks configuration tab for the job (this will be the default job if creating a new plan).
- Click the name of an existing Amazon S3 Object task, or click Add Task and then Amazon S3 Object Task to create a new task.
Complete the following settings:
Common to all tasks
- Task Description – (Optional) Identify the purpose of the task.
- Disable this task – Check, or clear, to selectively run this task.
- Action – Each task supports one or more actions depending on the target AWS resource. Select the desired action and configure the action-specific parameters below.
- Region – Select the desired AWS region from the preconfigured list. Alternatively, select [Use region variable ...] to supply the region dynamically, for example via a Bamboo variable; the value needs to be a region code such as ap-southeast-2.
- ... Configure task/action-specific parameters (see below) ...
- AWS Credentials Source – Select the source for the required AWS Security Credentials – this can be either Inline, an IAM Role for EC2 or a shared Identity Federation for AWS Connector.
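For orientation only (the task itself is configured entirely through the fields above): the region code and the credentials source correspond to how any AWS SDK client is constructed. The boto3 sketch below uses hypothetical placeholder values, including the Bamboo variable name:

```python
import boto3

# The region must be a region code such as ap-southeast-2, e.g. resolved from a
# Bamboo variable like ${bamboo.custom.aws.region} (variable name hypothetical).
region = "ap-southeast-2"

# Inline credentials (hypothetical placeholders) ...
session = boto3.Session(
    aws_access_key_id="AKIA...",
    aws_secret_access_key="...",
    region_name=region,
)
# ... whereas with an IAM Role for EC2, boto3 picks up the instance role
# credentials automatically:
# session = boto3.Session(region_name=region)

s3 = session.client("s3")
```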
Actions supported by this task:

Most features offered by S3 are available by specifying additional Metadata fragments (see Metadata Configuration below).
Upload
- Artifact – Select the artifact you want to upload.
- Source Local Path – Specify the local path (relative to the Bamboo working directory) to the files you want to upload. Use commas to separate files and directories. You can also use:
  - Ant-style pattern matching to include multiple files (for example, **/*.war)
  - Ant-style regular expression matching to rename multiple files
- Target Bucket Name – Specify the name of the S3 bucket you want the files to be uploaded to.
- Target Object Key Prefix (Virtual Directory) – (Optional) Specify the S3 object key prefix you want the uploaded files to gain in the target bucket.
- Metadata Configuration – (Optional) Specify additional metadata in JSON format. Insert fragments from the inline Examples dialog to get started.
Refer to PUT Object and PUT Object - Copy for available metadata options and values. For example:
- Content type and encoding – Declares the file(s) to be compressed CSS.
- Encryption and storage class – Activates server-side encryption and reduced redundancy storage.
- Website redirect location – Redirects requests to another location, if the bucket is configured as a website.
- Cache control – Sets a one hour maximum age.
- Access control – Grants public read access.
- (Copy only) Replace metadata – Replaces the source metadata with new values.
- Note that this has somewhat tricky implications; please see the PUT Object - Copy documentation for details.
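The Metadata Configuration field itself takes JSON fragments, and the exact fragment syntax is best taken from the task's inline Examples dialog. Purely to show what the options listed above correspond to on the S3 side, here is a boto3 sketch of a PUT Object request carrying the same settings; bucket, key and file names are hypothetical:

```python
import boto3

s3 = boto3.client("s3")
with open("site.css.gz", "rb") as body:           # hypothetical local file
    s3.put_object(
        Bucket="my-bucket",                       # hypothetical bucket
        Key="assets/site.css",                    # hypothetical key
        Body=body,
        ContentType="text/css",                   # content type ...
        ContentEncoding="gzip",                   # ... and encoding: compressed CSS
        ServerSideEncryption="AES256",            # server-side encryption
        StorageClass="REDUCED_REDUNDANCY",        # reduced redundancy storage
        WebsiteRedirectLocation="/new/site.css",  # website redirect location
        CacheControl="max-age=3600",              # one hour maximum age
        ACL="public-read",                        # grant public read access
    )
```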
Download
- Source Bucket Name – Specify the name of the S3 bucket you want the objects to be downloaded from.
- Source Object Key Prefix – Specify the key prefix of the S3 objects you want to download from the source bucket.
- Target Local Path – (Optional) Specify the local path (relative to the working directory) where the objects will be downloaded to.
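Again for illustration only (the task performs this for you): downloading everything under a key prefix into a local directory amounts to listing by prefix and fetching each object, roughly as in this boto3 sketch with hypothetical names:

```python
import os
import boto3

s3 = boto3.client("s3")
bucket, prefix, target = "my-bucket", "builds/42/", "downloads"  # hypothetical

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):          # skip folder placeholder objects
            continue
        local_path = os.path.join(target, key)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3.download_file(bucket, key, local_path)
```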
Delete
- Source Bucket Name – see Download above
- Source Object Key Prefix – see Download above
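Conceptually, deleting by key prefix is a list-then-delete; the task's own behaviour may batch or order the requests differently, but an illustrative boto3 sketch (bucket and prefix hypothetical) looks like this:

```python
import boto3

s3 = boto3.client("s3")
bucket, prefix = "my-bucket", "builds/42/"   # hypothetical

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    keys = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
    if keys:
        # delete_objects accepts up to 1000 keys per request, i.e. one list page.
        s3.delete_objects(Bucket=bucket, Delete={"Objects": keys})
```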
Copy
- Source Bucket Name – see Download above
- Source Object Key Prefix – see Download above
- Target Bucket Name – see Upload above
- Target Object Key Prefix (Virtual Directory) – see Upload above
- Metadata Configuration – see Upload above
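Regarding the Replace metadata option mentioned under Upload: on the S3 side this corresponds to a PUT Object - Copy request with the metadata directive set to REPLACE, in which case the new metadata must be supplied in full rather than being copied from the source object. An illustrative boto3 sketch (names hypothetical):

```python
import boto3

s3 = boto3.client("s3")
s3.copy_object(
    CopySource={"Bucket": "source-bucket", "Key": "builds/42/app.war"},  # hypothetical
    Bucket="target-bucket",
    Key="releases/app.war",
    MetadataDirective="REPLACE",              # replace instead of copying the metadata
    ContentType="application/java-archive",   # must be re-specified when replacing
    CacheControl="max-age=3600",
)
```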
Frequently Asked Questions (FAQ)