AWS connector reference on shell script step
complete
Revolutionary Salamander
Currently there is no capability to use a connector reference in the Shell Script step. To achieve this today, we have to establish an STS session with AWS inside the shell script itself. It would be easier if the Connector Reference were supported within the Shell Script step, so that we do not have to prepend the STS session logic to every script.
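For reference, the STS boilerplate we mean is roughly the following sketch (the role ARN and session name are placeholders, not our real values); this is the part of the script a supported Connector Reference would let us drop:

# manual STS session setup, repeated at the top of every shell script step
creds=$(aws sts assume-role \
  --role-arn "arn:aws:iam::123456789012:role/example-deploy-role" \
  --role-session-name "harness-shell-step" \
  --output json)
export AWS_ACCESS_KEY_ID=$(echo "$creds" | jq -r '.Credentials.AccessKeyId')
export AWS_SECRET_ACCESS_KEY=$(echo "$creds" | jq -r '.Credentials.SecretAccessKey')
export AWS_SESSION_TOKEN=$(echo "$creds" | jq -r '.Credentials.SessionToken')
# only after this can the actual script run, e.g.:
aws sts get-caller-identity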
Shylaja Sundararajan
complete
Shylaja Sundararajan
This feature is supported and available for consumption. Please refer to the docs and let us know if there are any queries in this regard.
Regards
Shylaja
CD Product Team
Nofar Bluestein
open
Nofar Bluestein
under review
Nofar Bluestein
pending feedback
Nofar Bluestein
Hey,
Could you please elaborate on your use case?
Are you trying to run a script in a Run step that communicates with the AWS CLI? Could you provide an example of how you achieve your use case today? It would be good to see an example of the command/script.
Thank you,
Nofar Bluestein,
CI product team
Environmental Parrot
Nofar Bluestein - Hi, we'd like to see this feature in Harness CD pipelines, i.e. have the ability to interrogate the AWS environment using the AWS CLI or boto3 scripts. It would be nice to have a pattern like the one https://github.com/aws-actions/configure-aws-credentials supports with OIDC/JWT auth: assume a named role and then expose the STS tokens/credentials to subsequent pipeline steps via their environment variables.
e.g.
- name: Configure AWS credentials
  uses: aws-actions/configure-aws-credentials@main
  with:
    aws-region: us-east-1
    role-to-assume: ${{ secrets.AWS_ROLE_FOR_GITHUB }}
    role-session-name: GitHubActions
- run: aws sts get-caller-identity
- name: Determine target bucket
  run: |
    aws s3api list-buckets --output json | jq -S '...'
- name: Deploy content
  run: |
    aws s3 sync /harness/local/directory s3://$target_bucket/path
Ultimately though, the objective is being able to run AWS CLI commands like the following in a simple shell script:
aws ec2 describe-instances \
--filters Name=instance-state-name,Values=stopped,running \
--query "Reservations[*].Instances[*].{Region:'$region', Name:Tags[?Key==\`Name\`]|[0].Value, Instance:InstanceId, Type:InstanceType, State:State.Name}" \
--output text \
--region $region >> instances.csv
or in a container step with Python:
import os
from boto.s3.connection import S3Connection

# credentials picked up from the step's environment variables
conn = S3Connection(os.environ.get('AWS_ACCESS_KEY_ID'), os.environ.get('AWS_SECRET_ACCESS_KEY'))
bucket = conn.get_bucket('bucket')
for key in bucket.list():
    print(key.name.encode('utf-8'))
...
etc, etc