Too many "pro" tier repos?
import requests
print "####################################"
print "# GitHub Collaborator Counter v1.0 #"
print "####################################"
# Paste a personal access token here before running.
access_token = None
if access_token is None:
print "Please set up a personal Oauth access token and store it in `access_token`."
exit()
page = 1
last_batch_size = -1
repos = list()
base_params = {
    'access_token': access_token
}
# Page through the authenticated user's repos until GitHub returns an empty batch.
while last_batch_size != 0:
    params = {
        'per_page': 100,  # GitHub doesn't seem to handle page sizes greater than this.
        'page': page,
        'affiliation': "owner"
    }
    params.update(base_params)
    url = "https://api.github.com/user/repos"
    r = requests.get(url, params=params)
    batch = r.json()
    page += 1
    repos += batch
    last_batch_size = len(batch)
key = "collaborators_url"
private_repos = list(filter(lambda r: r["private"] is True, repos))
print "You own {} repos and {} of them are private.".format(len(repos), len(private_repos))
urls = [repo[key] for repo in private_repos if key in repo]
urls = [url.replace("{/collaborator}", "") for url in urls]
free_plan_limit = 3
for url in urls:
    collaborators_request = requests.get(url, params=base_params)
    collaborators = collaborators_request.json()
    collaborator_count = len(collaborators)
    if collaborator_count > free_plan_limit:
        msg = "Repo has {} collaborators at url: {}".format(collaborator_count, url)
        print msg
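If you'd rather not paste the token into the file, here is a minimal sketch of reading it from the environment instead. The variable name GITHUB_TOKEN is only an assumption for illustration, not something the script requires:

import os

# Assumption: the token was exported as GITHUB_TOKEN in the shell. os.environ.get
# returns None when the variable is unset, so the `if access_token is None` check
# above still catches a missing token.
access_token = os.environ.get("GITHUB_TOKEN")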