Commands in this gist can be used while upgrading clusters, copying data between clusters, or backing up data from clusters to S3.

To upgrade the cluster

REFERENCE: HERE

Create PSQL data dump

pg_dump -h <public DNS> -U <username> -f <dump file name>.sql <database name>
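For example, a hypothetical dump of a database named myapp_production (the host, user, and file names below are all placeholders):

pg_dump -h ec2-12-34-56-78.ap-southeast-1.compute.amazonaws.com -U myuser -f myapp_backup.sql myapp_production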

Restore PSQL data dump

psql -U <postgresql username> -d <database name> -f <dump file that you want to restore>
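A hypothetical restore of the dump created above (user and database names are placeholders):

psql -U myuser -d myapp_production -f myapp_backup.sql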

Copy a file from inside a POD to the local machine.

kubectl cp <some-namespace>/<some-pod>:/tmp/foo /tmp/bar
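For instance, pulling the dump created above out of a hypothetical pod (namespace and pod name are placeholders). Note that kubectl cp requires the tar binary to be present inside the container:

kubectl cp my-namespace/my-app-pod:/tmp/myapp_backup.sql ./myapp_backup.sql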

Copy a file from the local machine to an EKS cluster POD

kubectl cp <local file location> <namespace the POD resides in>/<POD name>:<destination inside the POD>
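For example, pushing the same hypothetical dump file back into a POD (all names are placeholders):

kubectl cp ./myapp_backup.sql my-namespace/my-app-pod:/tmp/myapp_backup.sql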

Copy a file from a remote machine to the local machine

scp -i <pem file location> <username>@<ip>:<remote path of the file to be copied> <local destination path>
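A hypothetical example (key path, user, and IP are all placeholders):

scp -i ~/.ssh/my-key.pem ubuntu@13.229.10.10:/home/ubuntu/myapp_backup.sql ~/Downloads/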

Create and upload clickhouse backup to S3

REMOTE_STORAGE=s3 S3_ACCESS_KEY=<access key of your AWS account> S3_SECRET_KEY=<secret key of your AWS account> S3_BUCKET=<S3 bucket in which you want to create the backup> S3_REGION=<S3 bucket region> BACKUP_NAME=<S3 backup name> clickhouse-backup create_remote <S3 backup name; pass it here as well, since the BACKUP_NAME variable alone sometimes does not work>
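For example, with placeholder credentials and bucket name; per the note above, the backup name is passed both as BACKUP_NAME and as the positional argument:

REMOTE_STORAGE=s3 S3_ACCESS_KEY=AKIAXXXXXXXXXXXXXXXX S3_SECRET_KEY=xxxxxxxxxxxxxxxx S3_BUCKET=my-clickhouse-backups S3_REGION=ap-southeast-1 BACKUP_NAME=ch_backup_2022_10_16 clickhouse-backup create_remote ch_backup_2022_10_16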

Restore clickhouse data from remote backup file (S3)

REMOTE_STORAGE=s3 S3_ACCESS_KEY=<access key of your AWS account> S3_SECRET_KEY=<secret key of your AWS account> S3_BUCKET=<S3 bucket from which the backup is to be extracted> S3_REGION=<S3 bucket region> clickhouse-backup restore_remote <backup file name on the S3 bucket>
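A matching hypothetical restore of the backup created above (same placeholder credentials):

REMOTE_STORAGE=s3 S3_ACCESS_KEY=AKIAXXXXXXXXXXXXXXXX S3_SECRET_KEY=xxxxxxxxxxxxxxxx S3_BUCKET=my-clickhouse-backups S3_REGION=ap-southeast-1 clickhouse-backup restore_remote ch_backup_2022_10_16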

Point/Switch to a cluster from the local system (a verification command follows the steps)

  1. Use your credentials to log in to your AWS account via the CLI

    aws configure
    
  2. Update kubeconfig locally

    aws eks update-kubeconfig --name <cluster-name> --region ap-southeast-1
    
  3. Alternatively, write the kubeconfig for the cluster using eksctl

    eksctl utils write-kubeconfig --cluster=<cluster-name>
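
To verify the switch worked, check the active context (kubectl is assumed to be installed):

kubectl config current-context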
    

Shifting from local to S3 or vice versa (source)

The following cp command copies a single file to a specified bucket and key

aws s3 cp test.txt s3://mybucket/test2.txt

The following cp command copies a single s3 object to a specified bucket and key

aws s3 cp s3://mybucket/test.txt s3://mybucket/test2.txt

The following cp command copies a single object to a specified file locally

aws s3 cp s3://mybucket/test.txt test2.txt

The following cp command copies a single object to a specified bucket while retaining its original name

aws s3 cp s3://mybucket/test.txt s3://mybucket2/
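
To copy an entire directory instead of a single object, add the --recursive flag (bucket and folder names are placeholders):

aws s3 cp ./myfolder s3://mybucket/myfolder --recursive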