- (check) kill a training job (throttled)
aws sagemaker stop-training-job --training-job-name $TRAINING_JOB_NAME
This can be done in a for loop over several jobs; you can also check each job's status before killing it:
for d in $JOB_1 $JOB_2 $JOB_3
do
  aws sagemaker stop-training-job --training-job-name $d
done
Checking the job status:
aws sagemaker describe-training-job --training-job-name $TRAINING_JOB_NAME
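Combining the two commands, a small helper (hypothetical name `stop_if_running`) can stop a job only when it is still `InProgress`, so already-finished jobs are left alone:

```shell
# Hypothetical helper: stop a SageMaker training job only if it is still running.
stop_if_running() {
  local job="$1"
  local status
  # TrainingJobStatus is e.g. InProgress, Completed, Stopping, Stopped, Failed
  status=$(aws sagemaker describe-training-job \
    --training-job-name "$job" \
    --query 'TrainingJobStatus' --output text)
  if [ "$status" = "InProgress" ]; then
    aws sagemaker stop-training-job --training-job-name "$job"
  fi
}
```

Usage with the loop above: `for d in $JOB_1 $JOB_2 $JOB_3; do stop_if_running "$d"; done`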
- give read/write access to a bucket
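One way to do this is a bucket policy attached with `aws s3api put-bucket-policy`. This is only a sketch: the account ID, user name, and bucket name below are placeholders, and the exact actions to grant depend on what "RW" should mean:

```shell
# Hypothetical policy: grant one IAM user read/write on the bucket's objects.
# Replace the account ID, user name, and bucket name with real values, then:
#   aws s3api put-bucket-policy --bucket "$BUCKET" --policy file://policy.json
cat > policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowReadWrite",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/SOME_USER" },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::BUCKET_NAME/*"
    }
  ]
}
EOF
```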
- copy a file to / from a bucket
aws s3 cp $LOCAL_PATH s3://$BUCKET_PATH
aws s3 cp s3://$BUCKET_PATH $LOCAL_PATH
- list the contents of a bucket
aws s3 ls s3://$BUCKET
- Get the size of a bucket (in MiB or GiB)
aws s3 ls s3://$BUCKET_PATH --recursive --human-readable --summarize | grep "Total Size"
- Remove a file in a bucket
aws s3 rm s3://$BUCKET_PATH/file/path
- Sync a folder (copies only what is missing or changed at the target) and grant the bucket owner full control
aws s3 sync s3://$SOURCE_PATH s3://$TARGET_PATH --acl bucket-owner-full-control
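Before a real sync it can help to preview what would be transferred; a small wrapper (hypothetical name) around the `--dryrun` flag:

```shell
# Hypothetical helper: show what `aws s3 sync` would copy, without copying anything.
sync_preview() {
  aws s3 sync "$1" "$2" --dryrun --acl bucket-owner-full-control
}

# Usage: sync_preview s3://$SOURCE_PATH s3://$TARGET_PATH
```

Once the dry-run output looks right, rerun the plain `aws s3 sync` command above.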