How to Clone All of Your Organisation's Repos, Including Private Ones

There may come a time when you want to cancel or move an organisation's GitHub account. You will likely want to download all of its repos so you can point their origins to a new provider/server. Going through this process is not hard if you only have a handful of repos - you can easily do it manually. But what if you have 50, 100, or more? Luckily GitHub has an API which simplifies things like getting a list of all repos for an organisation.

This answer gives a very good breakdown of how to use this API. But the main missing piece is: how do you get private repos?

It turns out it is not too bad; it involves three things:

  1. Set up a "Personal Access Token"
  2. Use this token with the API to get a list of your repos' git addresses
  3. Run through this list and clone each item

Personal Access Token

This is a token that lets you access GitHub programmatically, scoped to whatever access you grant it. These tokens:

  • Cannot be created from an organisation's profile
    • You have to create one under your own profile, provided that profile has access to the organisation
  • Will work even if your profile has 2FA enabled; 2FA is not required when using these tokens.
  • These are created under your profile:
    • Under (your profile) Settings > Developer Settings > Personal access tokens
      • This link should take you straight there if you are logged in.
    • Give the token all repo-related read permissions
    • Copy the token you are given (you will not be able to see it again after leaving the page) - a quick way to check it works is shown below
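
Before wiring the token into any scripts, it is worth a quick sanity check that it actually works. A minimal sketch, assuming you have curl available, is to call the API's /user endpoint with the token in an Authorization header - it should return your profile details as JSON:

# Sanity check: a valid token returns your profile JSON, an invalid one returns a 401 error
curl -H "Authorization: token <Personal_Access_Token_Here>" https://api.github.com/user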

Use the Token to List an Organisation's Private Repos

We will use the command from the linked answer with one adjustment - the access_token parameter.

The command from the original answer will be changed to the following:

CNTX=orgs; NAME=foundery-rmb; PAGE=1; GITHUB_API_TOKEN=<Personal_Access_Token_Here>; curl "https://api.github.com/$CNTX/$NAME/repos?page=$PAGE&per_page=100&access_token=$GITHUB_API_TOKEN" | grep -o 'git@[^"]*' > results_p1.txt

A few points about the above:

  • GitHub has a limit of 100 items per page, so you will need to call this ceiling(numberOfRepos/100) times to get all of your repos (see the loop sketch after this list)
    • You do this by adjusting the PAGE number in the above query
  • The above query returns a bunch of metadata for each repo; grep is used to filter out just the git URL.
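
If you have more than 100 repos, rather than editing the page number by hand you can wrap the same call in a loop and collect everything into one file. This is a rough sketch of that idea; the upper bound of 5 pages is an arbitrary assumption, so adjust it to your own ceiling(numberOfRepos/100):

CNTX=orgs; NAME=foundery-rmb; GITHUB_API_TOKEN=<Personal_Access_Token_Here>
> repos.txt   # start with an empty file
# Assumes at most 5 pages (i.e. up to 500 repos) - raise the upper bound if you have more
for PAGE in 1 2 3 4 5; do
  curl "https://api.github.com/$CNTX/$NAME/repos?page=$PAGE&per_page=100&access_token=$GITHUB_API_TOKEN" | grep -o 'git@[^"]*' >> repos.txt
done

This leaves one git URL per line in repos.txt, which is exactly the format the next section expects.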

Run Through the Repo List and Clone Each Repo

Assuming you have all the repo links in one file, which we will call repos.txt, with one repo URL per line, downloading them is fairly simple:

  • Create a new empty directory
  • cd into this directory
  • Run (we'll assume that the repos.txt file is one directory back):
for repo in `cat ../repos.txt`; do echo "------------------- $repo"; git clone "$repo"; done

You do not have to worry about using your Personal Access Token on the clone step, as running this from the terminal is just like running a normal git clone (i.e. it will use your SSH keys, so no password/token is needed).
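
If the loop gets interrupted partway through a large list, re-running it will complain about repos that already exist on disk. A variant that skips those - assuming the target directory name matches the repo name, which is git clone's default - looks like this:

# Re-runnable version: skip repos whose target directory already exists
while read -r repo; do
  dir=$(basename "$repo" .git)   # e.g. git@github.com:my-org/foo.git -> foo
  if [ -d "$dir" ]; then
    echo "Skipping $repo (already cloned)"
  else
    echo "------------------- $repo"
    git clone "$repo"
  fi
done < ../repos.txt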

This approach should also work with personal repos (public and private) by changing the CNTX variable to users.
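
For example, the listing command for your own account would look something like this (the username here is a placeholder):

# Same call as before, but against your own account instead of an organisation
CNTX=users; NAME=<your_github_username>; PAGE=1; GITHUB_API_TOKEN=<Personal_Access_Token_Here>; curl "https://api.github.com/$CNTX/$NAME/repos?page=$PAGE&per_page=100&access_token=$GITHUB_API_TOKEN" | grep -o 'git@[^"]*' > results_p1.txt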