Over the years of using github, and especially while working with Adafruit, I've accumulated quite a collection of forked repositories there. Most of them haven't been updated in years. I decided to archive every fork of mine that hadn't been updated in the last 2 years—over 200 repositories. Doing this manually on the github website would take a long time, so I wrote a script to do it instead.
To use the script (included below), you'll need to:
- Create an authorization token for your github account. It needs access to all your repositories to perform administration tasks. You can create a "fine-grained" token within your user's settings on github.
- Edit the script to set the correct `owner_name` and `token`.
Next, run the script once in the terminal. It'll print a line for each repository it would archive. If the list looks correct, uncomment the `repo.edit` line and run the script a second time. This will actually archive the repositories.
These images show the permissions you need to give the token created for this purpose. They reflect github's web UI in early 2024:
```python
import datetime

from github import Auth, Github

# Repositories with no updates in the last two years are candidates.
activity_limit = datetime.datetime.now(tz=datetime.timezone.utc) - datetime.timedelta(days=2 * 365)

owner_name = "jepler"
token = "github_pat_....."

auth = Auth.Token(token)
g = Github(auth=auth)

print(f"Archive repositories not active since {activity_limit}")

to_archive = []
for repo in g.get_user(owner_name).get_repos():
    if repo.archived:
        continue
    if repo.owner.login != owner_name:
        continue
    if not repo.fork:
        continue
    if repo.updated_at < activity_limit:
        print(f"A {repo.name} {repo.updated_at}")
        to_archive.append(repo)
        ### To actually do something, uncomment the following line:
        # repo.edit(archived=True)
```
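The selection logic in the loop boils down to a pure predicate, which can be restated and tried out without touching the network. Here's a sketch of that filter; the standalone `should_archive` function and its parameter names are my own framing, not part of the script above:

```python
import datetime


def should_archive(archived, owner_login, is_fork, updated_at,
                   owner_name, activity_limit):
    """Mirror the script's filter: an unarchived fork owned by
    owner_name whose last update predates the activity cutoff."""
    if archived:
        return False
    if owner_login != owner_name:
        return False
    if not is_fork:
        return False
    return updated_at < activity_limit


# Example: with a two-year cutoff, a fork last touched three years ago qualifies.
now = datetime.datetime.now(tz=datetime.timezone.utc)
limit = now - datetime.timedelta(days=2 * 365)
old = now - datetime.timedelta(days=3 * 365)

print(should_archive(False, "jepler", True, old, "jepler", limit))   # True
print(should_archive(False, "jepler", False, old, "jepler", limit))  # not a fork: False
print(should_archive(False, "jepler", True, now, "jepler", limit))   # recently active: False
```

Checking each condition separately like this makes it easy to confirm the dry run's list is what you expect before uncommenting the destructive line.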