You want to use AWS credentials securely from AWS-unaware tasks, or you have encountered a limitation or bug within Utoolity's AWS-related Bamboo apps and need a workaround:
As of Identity Federation for AWS 2.2 (bundled free of charge with Tasks for AWS (Bamboo) and Automation with AWS (Bamboo)), you can use the AWS Credentials Variables task to ease using the AWS Command Line Interface (AWS CLI) in turn, a unified tool to manage [almost all current and future] AWS services.
Configure the AWS Credentials Variables task with the same AWS credentials source you would use for a dedicated task: this makes the resulting temporary AWS security credentials available as Bamboo variables. Then inject these AWS credentials variables as environment variables into a subsequent Bamboo Script task, for example:
Bash/Unix shell
export AWS_ACCESS_KEY_ID=$bamboo_custom_aws_accessKeyId
export AWS_SECRET_ACCESS_KEY=$bamboo_custom_aws_secretAccessKey_password
export AWS_SESSION_TOKEN=$bamboo_custom_aws_sessionToken_password
The environment variable names are significant: the AWS CLI automatically picks up AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN from the environment.
Refer to the AWS CLI reference to determine the relevant commands for your use case; for example, use describe-stacks to retrieve details for the example stack:
aws --region ap-southeast-2 cloudformation describe-stacks --stack-name myteststack
This might return a result like the following:
"Description": "AWS CloudFormation Sample Template S3_Bucket: Sample template showing how to create a publicly accessible S3 bucket. **WARNING** This template creates an S3 bucket. You will be billed for the AWS resources used if you create a stack from this template.",
"Description": "Name of S3 bucket to hold website content",
(Optional) Post-process the AWS CLI output to extract values relevant to your use case:
JSON post-processing
There are two main options to post process the AWS CLI's output:
The AWS CLI offers native options to control the command output in various ways, notably including an option to filter the default JSON output by means of the --query parameter (a JMESPath expression). If the --query option turns out to be too limiting for your use case, or you are more comfortable with using a dedicated tool, the
lightweight and flexible command-line JSON processor jq provides even more powerful options to slice and filter and map and transform structured data with the same ease that
grep and friends let you play with text.
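As a sketch of both approaches, the following commands extract the stack status of the example stack used above (the JMESPath and jq expressions are illustrative assumptions based on the describe-stacks output shape):

```shell
# Native --query (JMESPath): return the stack status as plain text.
aws --region ap-southeast-2 cloudformation describe-stacks \
    --stack-name myteststack \
    --query 'Stacks[0].StackStatus' --output text

# jq: extract the same value from the default JSON output.
aws --region ap-southeast-2 cloudformation describe-stacks \
    --stack-name myteststack | jq -r '.Stacks[0].StackStatus'
```

Both variants print a bare status string such as CREATE_COMPLETE, which is convenient for consumption in subsequent script steps.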