
@mantasu
Last active March 29, 2025 05:39
How to Clone Repositories via SSH on GCP

SSH From GCP VMs (Clone Private Repositories and More!)

Overview

This guide explains how to automatically set up an SSH connection between your Google Cloud Platform Compute Engine VM instances and third-party applications or servers. The most common use case is authenticating automatically when cloning private repositories or pushing commits via SSH. Shame on Google for not having a tutorial on this one.

Warning

This is different from SSHing into GCP VMs, for which there are already plenty of guides.

Tags

  • How to clone private repositories on GCP
  • How to clone repositories without password on GCP
  • How to setup SSH connection from GCP
  • How to add private SSH keys to VM instances on GCP
  • How to SSH from Google Cloud Compute Engine to remote servers
  • How to clone GitHub, BitBucket, or GitLab repositories using SSH instead of HTTP on Google Cloud
  • Allow Google Cloud VM instances to access private SSH keys from Secret Manager

Grant Secret Access

Open Cloud Shell and specify your project ID and the target compute service account. The following lines automatically specify the default ones:

PROJECT_ID=$(gcloud config get-value project)
COMPUTE_SERVICE_ACCOUNT=$(gcloud iam service-accounts list --filter="email:compute" --format="value(email)")

Now, enable the Secret Manager service (the thing that stores secrets like SSH and GPG keys) and grant access to it for your Compute Engine service account (the thing that manages your VMs):

gcloud services enable secretmanager.googleapis.com
gcloud projects add-iam-policy-binding "$PROJECT_ID" --member="serviceAccount:$COMPUTE_SERVICE_ACCOUNT" --role="roles/secretmanager.secretAccessor"
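To double-check that the binding took effect, the project's IAM policy can be filtered for the new role (a sketch; the compute service account from above should appear in the output):

```shell
# List members holding the Secret Accessor role on the project;
# $COMPUTE_SERVICE_ACCOUNT should be among them
gcloud projects get-iam-policy "$PROJECT_ID" \
  --flatten="bindings[].members" \
  --filter="bindings.role:roles/secretmanager.secretAccessor" \
  --format="value(bindings.members)"
```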

Tip

You can also do these steps manually. Open the navigation section (left-side panel), go to IAM & Admin > IAM. Find the compute engine service account [SERVICE-ID]-compute@developer.gserviceaccount.com (you should have at least the default one) and click edit (pencil button). Add the role: Secret Manager > Secret Manager Secret Accessor, and click save.

SSH Key Generation

Open Cloud Shell and generate a temporary SSH key pair (leave the passphrase empty):

mkdir ssh-tmp
ssh-keygen -t ed25519 -f ssh-tmp/id_ed25519

Now add that pair to your Secret Manager service (the secrets here are named github-ssh-private and github-ssh-public):

gcloud secrets create github-ssh-private --replication-policy="automatic" --data-file=ssh-tmp/id_ed25519
gcloud secrets create github-ssh-public --replication-policy="automatic" --data-file=ssh-tmp/id_ed25519.pub

Now view the public key and copy it to your GitHub account (or Bitbucket, GitLab, etc.):

cat ssh-tmp/id_ed25519.pub

Now safely remove the temporary SSH key pair:

shred -u ssh-tmp/id_ed25519
shred -u ssh-tmp/id_ed25519.pub
rmdir ssh-tmp

Tip

Even after deleting the local files, you can still view the stored public and private keys manually. Open the navigation menu, go to Security > Secret Manager, click on the secret you want, then Actions (three dots) > View secret value.

Extending SSH Access to All VMs

Open Cloud Shell and create a temporary bash script:

vim tmp.sh

Press i to enter Insert mode and paste the following:

#!/bin/bash

# Get local user name and user home
USER_NAME=$(getent passwd | awk -F: '$3 >= 1000 && $3 < 65534 {print $1}')
USER_HOME=$(eval echo ~$USER_NAME)

if [ -f "$USER_HOME/.ssh/id_ed25519" ]; then
  # Exit early if the private key already exists
  echo "SSH private key already exists at $USER_HOME/.ssh/id_ed25519"
  exit 0
fi

# Fetch the private key from Secret Manager and write it to ~/.ssh/id_ed25519
secret_key=$(gcloud secrets versions access latest --secret="github-ssh-private")
mkdir -p "$USER_HOME/.ssh"
echo "$secret_key" > "$USER_HOME/.ssh/id_ed25519"
ssh-keyscan github.com bitbucket.org gitlab.com >> "$USER_HOME/.ssh/known_hosts"

# Optionally, configure the SSH client (e.g., specify identity file)
echo "Host github.com bitbucket.org gitlab.com" >> "$USER_HOME/.ssh/config"
echo "  AddKeysToAgent yes" >> "$USER_HOME/.ssh/config"
echo "  IdentityFile $USER_HOME/.ssh/id_ed25519" >> "$USER_HOME/.ssh/config"

# Change permission and user ownership
chmod 600 "$USER_HOME/.ssh/id_ed25519"
chown -R $USER_NAME:$USER_NAME "$USER_HOME/.ssh"

# [Optional] Configure git account globally
# echo "[user]" >> "$USER_HOME/.gitconfig"
# echo "        name = Your Name" >> "$USER_HOME/.gitconfig"
# echo "        email = your.email@example.com" >> "$USER_HOME/.gitconfig"
# chown $USER_NAME:$USER_NAME "$USER_HOME/.gitconfig"

Press Esc to leave Insert mode, then type :wq and press Enter to save the file and quit. Now add the startup script to your project metadata globally so it applies to all VM instances:

gcloud compute project-info add-metadata --metadata-from-file startup-script=tmp.sh
rm tmp.sh
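The USER_NAME detection at the top of the script filters getent passwd for regular (non-system) accounts, i.e. UIDs from 1000 up to but excluding 65534 (nobody). The filter can be sanity-checked locally on a crafted passwd snippet:

```shell
# Feed a fake passwd file through the same awk filter used in the startup
# script: only UIDs in [1000, 65534) survive, so root and nobody are dropped
printf 'root:x:0:0:root:/root:/bin/bash\nalice:x:1000:1000::/home/alice:/bin/bash\nnobody:x:65534:65534:nobody:/nonexistent:/usr/sbin/nologin\n' \
  | awk -F: '$3 >= 1000 && $3 < 65534 {print $1}'
# prints: alice
```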

Tip

If you want to add the startup script only to specific VM instances, you may want to check the official specification. Or if you further modify the script and want to check the logs, see how to view the output.
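If project-wide metadata is too broad, the same script can instead be attached to a single instance before deleting tmp.sh (a sketch; MY-INSTANCE and MY-ZONE are placeholders):

```shell
# Attach the startup script to one VM only instead of the whole project;
# MY-INSTANCE and MY-ZONE are placeholders for your instance name and zone
gcloud compute instances add-metadata MY-INSTANCE \
  --zone=MY-ZONE \
  --metadata-from-file startup-script=tmp.sh
```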

Verify

Create a new VM or start an existing one. Check that all three files are present under ~/.ssh:

$ ls ~/.ssh
# config  id_ed25519  known_hosts

Check that the ownership is correct, i.e., matches the current user:

[ $(stat -c "%U" ~/.ssh) = $USER ] && echo OK || echo NOT OK
# OK
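The SSH handshake itself can also be tested from the VM. With GitHub, a successful authentication prints a greeting and then closes the connection without opening a shell:

```shell
# Authenticate against GitHub without opening a shell; a working setup
# prints a "Hi <username>!" greeting (the command still exits non-zero
# because GitHub does not provide shell access, which is expected)
ssh -T git@github.com
```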

Optionally, also verify that the git account was set up correctly:

git config --global --list
# user.name=Your Name
# user.email=your.email@example.com

Tip

If you're already running a VM and don't want to restart it, just run the startup script manually: sudo google_metadata_script_runner startup
