@andreicek
Last active June 9, 2017 12:17
Delete multiple files from S3

Usage

  1. Clone this gist: git clone https://gist.github.com/01f95213129dfeaf4ebf80decfe27c4c.git
  2. Put all object keys to be deleted in a file (e.g. files.txt), one key per line
  3. Have the AWS CLI configured
  4. Run the whole thing:
cat files.txt | ./delete.sh

It deletes up to 1000 objects per call, the maximum the S3 API allows.

delete.sh

#!/usr/bin/env bash

BUCKET="bucket"

# Read up to $1 lines from stdin and echo them. Returns 1 once stdin is
# exhausted so the caller's while loop stops.
makechunk() {
  local N="$1"
  local line
  local rc="1"
  for i in $(seq 1 "$N"); do
    if read -r line; then
      echo "$line"
      rc="0"
    else
      break
    fi
  done
  return "$rc"
}

while chunk=$(makechunk 1000); do
  # ${chunk} is deliberately unquoted: word splitting collapses the newlines
  # to spaces, which format.js relies on. Keys must not contain spaces.
  JSON=$(echo ${chunk} | ./format.js)
  echo "$JSON" > files.json
  aws s3api delete-objects --bucket "$BUCKET" --delete file://files.json
done

rm -f files.json
format.js

#!/usr/bin/env node
// Reads whitespace-separated object keys from stdin and prints the JSON
// payload that `aws s3api delete-objects --delete file://...` expects.
const stdin = process.openStdin();

let data = '';
stdin.on('data', (chunk) => (data += chunk));
stdin.on('end', () => {
  // Split on any whitespace so both space- and newline-separated input works.
  const files = data.split(/\s+/).filter(Boolean);
  const payload = JSON.stringify({Objects: files.map((file) => ({Key: file}))});
  console.log(payload);
});