backup_github_issues.sh (by @pgorod, forked from bojanpopic/backup_github_issues.sh)
A (dirty) bash script that backs up the issues of a GitHub repo using the API. It takes pagination into account. Originally created by @boombatower, improved by @bojanpopic, then improved by me to detect the number of pages automatically.
#!/bin/bash
# based on the script by Bojan Popic https://gist.github.com/bojanpopic/2c3025d2952844de1dd0
# Gets all issues from a repo. Since pull requests are internally treated as issues by GitHub, they are included too.
repo="organization/repoName" # put the name of the repo here, usually in the form organization/repo-name
username="myuser" # your GitHub username
password="mypassword" # your GitHub password. Note: the GitHub API no longer accepts account passwords; put a personal access token here instead
# this script can be used on public repos with any user's credentials; you don't have to own the repo!
# -I will give back only the headers without the data
numberofpages=$( curl -sI -u "$username:$password" \
"https://api.github.com/repos/$repo/issues?per_page=100&state=all" \
| grep last | cut -d"=" -f8 | cut -f1 -d">" )
numberofpages=${numberofpages:-1} # if there is no Link header, everything fits on a single page
# the human logic behind these greps and cuts is: look for the "last" keyword in the output,
# the number of pages is a few characters behind it, between a '=' and a '>'
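# an illustrative Link header (the URLs and page counts are made up); note the page number between '=' and '>':
# link: <https://api.github.com/repos/organization/repoName/issues?per_page=100&state=all&page=2>; rel="next", <https://api.github.com/repos/organization/repoName/issues?per_page=100&state=all&page=7>; rel="last"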
echo
echo "Getting ${numberofpages} pages of issues..."
echo
echo
for i in $(seq 1 "$numberofpages")
do
filename=$(echo "$repo-issues-page$i.json" | tr / -)
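# the slash in the repo name becomes a dash, e.g. organization/repoName gives organization-repoName-issues-page1.json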
curl -s -u "$username:$password" \
"https://api.github.com/repos/$repo/issues?per_page=100&state=all&filter=all&page=$i" \
> "$filename"
echo
echo "Page $i/$numberofpages retrieved, saved as $filename"
echo
done
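
The saved pages can later be stitched back into a single JSON file. A minimal sketch, assuming jq is installed and the pages were produced by the script above (the merged filename is just a suggestion):

#!/bin/bash
# each page file holds a JSON array of issues; slurp them all and concatenate into one array
jq -s 'add' organization-repoName-issues-page*.json > organization-repoName-issues-all.json
# quick sanity check: how many issues ended up in the backup?
jq 'length' organization-repoName-issues-all.json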