@groundcat
Created August 2, 2023 05:25
Backup Proxmox files to S3

Here is a step-by-step guide to using the AWS CLI with a non-AWS custom endpoint, access key ID, and secret key to move (upload, then delete) backup files to object storage.

  1. Install the AWS CLI

This can be done by running the following command in your terminal:

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
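To confirm the install succeeded, you can check whether the `aws` binary is now on your PATH (this is just a quick sketch; it reports either way rather than failing):

```shell
# Report whether the aws binary is available, without failing either way
if command -v aws >/dev/null 2>&1; then
  aws_status="installed: $(aws --version 2>&1)"
else
  aws_status="aws CLI not found on PATH"
fi
echo "$aws_status"
```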
  2. Configure the AWS CLI

You can configure the AWS CLI by running the aws configure command in your terminal. It will prompt you for your Access Key ID, Secret Access Key, Default Region Name, and Default Output Format.

aws configure

If you are using a non-AWS S3-compatible service, you also need to point the CLI at your custom endpoint. Newer versions of the AWS CLI let you set this in the ~/.aws/config file by adding endpoint_url under the default profile (or any custom profile you have created); alternatively, pass --endpoint-url on each command, as the script below does.

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
region = us-east-1
endpoint_url = https://your-custom-s3-endpoint.com
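Once the credentials are in place, a quick sanity check is to list your buckets through the custom endpoint (the URL below is a placeholder; if the endpoint or keys are wrong you will see an error message instead of a listing):

```shell
# Non-fatal connectivity check against the custom endpoint (placeholder URL)
result=$(aws s3 ls --endpoint-url "https://your-custom-s3-endpoint.com" 2>&1 || true)
echo "$result"
```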
  3. Create a Bash Script

Create a bash script that uploads each file and then deletes it from the local directory on success. For instance, you can name it backup.sh.

Replace https://your-custom-s3-endpoint.com with your actual custom S3 endpoint URL, and my-s3-bucket with your bucket name.

#!/bin/bash
set -u

# Directory where the Proxmox backup files are stored
BACKUP_DIR="/var/lib/vz/dump"

# Name of the S3 bucket
S3_BUCKET="my-s3-bucket"

# Custom endpoint URL
ENDPOINT_URL="https://your-custom-s3-endpoint.com"

# Expand to nothing (not a literal '*') if the directory is empty
shopt -s nullglob

# Upload each file to the S3 bucket, deleting it locally only on success
for file in "$BACKUP_DIR"/*; do
  if aws s3 cp "$file" "s3://$S3_BUCKET/" --endpoint-url "$ENDPOINT_URL"; then
    rm "$file"
  fi
done
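Before pointing the script at real backups, you can exercise the upload-then-delete loop locally by substituting a stub for the upload command (`upload_stub` below is a hypothetical stand-in for `aws s3 cp`, so no credentials or network access are needed):

```shell
#!/bin/bash
# Dry-run harness for the upload-then-delete loop, using a stub upload.
BACKUP_DIR=$(mktemp -d)
touch "$BACKUP_DIR/vzdump-test-1.tar.zst" "$BACKUP_DIR/vzdump-test-2.tar.zst"

upload_stub() {
  # Pretend the upload always succeeds
  echo "would upload: $1"
}

shopt -s nullglob   # an empty directory yields zero iterations, not a literal '*'
for file in "$BACKUP_DIR"/*; do
  if upload_stub "$file"; then
    rm "$file"   # delete only after a successful "upload"
  fi
done

remaining=$(ls -A "$BACKUP_DIR")
echo "files left: ${remaining:-none}"
```

Every file should be "uploaded" and removed, leaving the temporary directory empty.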
  4. Make the Script Executable

Run this command in your terminal:

chmod +x backup.sh 
  5. Create a Cron Job

Now create a cron job that will run this script on a weekly schedule.

To open crontab configuration, run:

crontab -e 

Add this line to run the script every Sunday at 3:00 AM:

0 3 * * SUN /path/to/backup.sh > /dev/null 2>&1 

Replace '/path/to/backup.sh' with the actual path of your bash script.
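If you would rather keep a record of each run than discard the output, a variant of the cron line (the log path here is just an example) is:

0 3 * * SUN /path/to/backup.sh >> /var/log/proxmox-s3-backup.log 2>&1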

  6. Test Your Setup

You should test this setup by manually running your bash script and checking that the backup files are uploaded to your S3 bucket and deleted from the local directory.

Please note that this is a simple example and may need adjustments for your environment, your backup retention needs, or permission issues.
