@mikegreen
Last active February 16, 2021 15:25
Vault raft snapshot backup and restore quick demo
# 2020-06-23
# This demo creates a Vault instance running integrated storage (raft),
# adds a KV secret and takes a snapshot,
# then deletes the raft DB files to simulate a storage failure,
# then brings up a new Vault instance, restores the snapshot, unseals and
# authenticates with the original keys, and reads the data back to show
# how backup/restore works.
# Not meant to be a live script to run!
# this uses the vault_config.hcl from https://gist.github.com/mikegreen/c2df5eea2283f0dbc5f3a5d3650536fd
# start up Vault with integrated storage (raft)
$ vault server -config=vault_raft.hcl
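# For reference, a minimal vault_raft.hcl could look like the sketch below.
# (Assumed values - the actual config is in the gist linked above; the raft
# path matches the /opt/vault directory wiped later in this demo.)
#
#   storage "raft" {
#     path    = "/opt/vault"
#     node_id = "raft_node_1"
#   }
#
#   listener "tcp" {
#     address     = "127.0.0.1:8200"
#     tls_disable = true
#   }
#
#   api_addr      = "http://127.0.0.1:8200"
#   cluster_addr  = "http://127.0.0.1:8201"
#   disable_mlock = true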
$ vault operator init -key-shares=1 -key-threshold=1
# Init details:
# Unseal Key 1: sxYcm0n9CAg2QKzdAyEyJuGlzQj+8OPanmOABsCxTwc=
# Initial Root Token: s.f5Jv7son8PMGqBUI6R1ZqR2V
$ vault operator unseal sxYcm0n9CAg2QKzdAyEyJuGlzQj+8OPanmOABsCxTwc=
$ vault login s.f5Jv7son8PMGqBUI6R1ZqR2V
$ vault secrets enable -path=kvDemo -version=2 kv
$ vault kv put /kvDemo/legacy_app_creds_01 username=legacyUser password=supersecret
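# (optional sanity check, not in the original demo - confirm the write landed)
$ vault kv get /kvDemo/legacy_app_creds_01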
# Take a snapshot; this should be done pointing at the active node.
# If pointed at a standby you may get a 0-byte snapshot, as standby nodes
# do not forward this request (though this might be fixed in a later version)
$ vault operator raft snapshot save raft01.snap
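# (optional check, not in the original demo - confirm the snapshot is
# non-empty, guarding against the 0-byte standby case above)
$ ls -lh raft01.snap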
# Kill cluster, rm DB files
$ rm -rf /opt/vault/*
# restart Vault with the same config (but an empty raft data folder now)
# New instance details (the demo doesn't use these, but see the comments
# below - the new root token may be needed to authenticate the restore):
# Unseal Key 1: NxgdYN6W0mhamxMPfiNnOQipgAENU+eRwlPJHE6xR0Y=
# Initial Root Token: s.c75QL4pb4oPa2FVnF263Wofb
# restore the snapshot (-force bypasses the check that the snapshot's keys
# match this cluster's, needed since the snapshot came from another cluster)
$ vault operator raft snapshot restore -force raft01.snap
# unseal with original cluster keys
$ vault operator unseal sxYcm0n9CAg2QKzdAyEyJuGlzQj+8OPanmOABsCxTwc=
$ vault login s.f5Jv7son8PMGqBUI6R1ZqR2V
$ vault kv get /kvDemo/legacy_app_creds_01
====== Metadata ======
...
====== Data ======
Key         Value
---         -----
password    supersecret
username    legacyUser
neb14 commented Oct 2, 2020

I have found an inconsistency with the vault operator raft snapshot save raft01.snap command. I get a return code of 0, but no file is there after the command is run. It happens randomly.

mikegreen (Author) commented

@neb14 - no, but are you seeing this in a cluster or a single-server configuration? Is the URL you're hitting behind a load balancer? There were a bug or two fixed in the last month or so. What version are you on?

neb14 commented Oct 12, 2020

It is a cluster of 5 servers. The default URL does hit a load balancer. We are on version 1.5.4; I'll try the command with VAULT_ADDR exported to point at localhost.

neb14 commented Oct 13, 2020

Pointing to localhost results in a consistent backup.
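
For reference, the workaround looks something like this (the address below assumes a default non-TLS listener on port 8200; use https and your own port as configured):

export VAULT_ADDR=http://127.0.0.1:8200
vault operator raft snapshot save raft01.snap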

Dirc commented Feb 16, 2021

I got this error when running the restore command:

URL: POST https://127.0.0.1:8200/v1/sys/storage/raft/snapshot-force
Code: 503. Errors:

* Vault is sealed

Setting the initial root token of the new instance as VAULT_TOKEN solved this.

export VAULT_TOKEN=s.c75QL4pb4oPa2FVnF263Wofb
vault operator raft snapshot restore -force raft01.snap
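
Note that after the forced restore, the node still needs to be unsealed with the original cluster's unseal keys (as in the demo above) before the restored data can be read.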
