@StevenMapes
Created January 3, 2024 10:37
Using the AWS-CLI on Unraid via the User Scripts plugin
#!/bin/bash
# Setup
## Create the config and credentials files on the NAS storage, e.g. under /root/.aws
## That path will be bind-mounted when you run the container so that the AWS CLI can read the credentials
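A minimal sketch of what those two files might look like, assuming a hypothetical profile named `my-profile` with placeholder keys and region (substitute your own values). On Unraid the CLI runs as root, so `$HOME` resolves to `/root`:

```shell
# Create the .aws directory and write minimal config/credentials files.
# "my-profile", the region, and the keys below are placeholders.
mkdir -p "$HOME/.aws"

cat > "$HOME/.aws/config" <<'EOF'
[profile my-profile]
region = eu-west-2
output = json
EOF

cat > "$HOME/.aws/credentials" <<'EOF'
[my-profile]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
EOF
```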
# Always use the latest CLI
docker pull amazon/aws-cli:latest
# Examples
## List all S3 buckets you have access to, using the named profile you created within the config and credentials files
## Just replace <profile_name> with the name of the profile you wish to use.
docker run --rm -v /root/.aws:/root/.aws amazon/aws-cli s3 ls --profile=<profile_name>
# In the example above, the folder /root/.aws on the host filesystem is mounted to the same path inside the container.
# The AWS CLI is then invoked to list all S3 buckets using the named profile
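If you run the containerised CLI often, one convenience pattern (a sketch, not part of the original gist) is a shell function that wraps the `docker run` invocation so you can call `aws` as if the CLI were installed locally. It mounts `~/.aws` for credentials and the current directory as `/aws`, the image's working directory:

```shell
# Wrapper function: forwards all arguments to the containerised AWS CLI.
aws() {
    docker run --rm \
        -v "$HOME/.aws:/root/.aws" \
        -v "$(pwd):/aws" \
        amazon/aws-cli "$@"
}

# Usage (assuming a profile named my-profile exists):
#   aws s3 ls --profile=my-profile
```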
StevenMapes commented Jan 3, 2024

For further example usages, such as downloading a file to the local filesystem, see the AWS documentation: https://docs.aws.amazon.com/cli/latest/userguide/getting-started-docker.html

Example
In this scenario a bucket called backup exists. Within the bucket there is an object with the key my-files/document1.pdf that I want to download to the current directory.
I've set up a profile called my-user that has the required access to the S3 bucket and the object.

To download it to the current working directory I just need to run this:

docker run --rm -v ~/.aws:/root/.aws -v $(pwd):/aws amazon/aws-cli s3 cp s3://backup/my-files/document1.pdf . --profile=my-user

This command mounts the .aws folder within the current user's home directory to /root/.aws inside the container, where the CLI (running as root) looks for credentials.
It then mounts the current working directory, $(pwd), to the folder /aws within the container, which is the image's working directory.
Finally it connects to the S3 bucket called backup and copies down the file document1.pdf from the path my-files/ to the current folder on the host.
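The same pattern works in the other direction. A hedged sketch of the upload counterpart (it assumes the same backup bucket and my-user profile): because only the mounted directory is visible to the container, the local file is addressed as /aws/document1.pdf inside it. The sketch below builds the command and prints it, so you can inspect the shape before running it on the Unraid box:

```shell
# Build the upload command as an array; the bucket, key and profile
# names are assumptions carried over from the download example above.
CMD=(docker run --rm
     -v "$HOME/.aws:/root/.aws"
     -v "$PWD:/aws"
     amazon/aws-cli s3 cp /aws/document1.pdf s3://backup/my-files/document1.pdf
     --profile=my-user)

# Print the command; execute with "${CMD[@]}" on a host with Docker.
printf '%s\n' "${CMD[*]}"
```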
