After unexpectedly needing to reset my work machine 😓 and set up my development environment again by hand 🛠️, I decided to create a solution for quickly restoring my local git repositories (and their associated folder structure) 🔄.
I took this opportunity to write two bash scripts that clone and update all repositories on GitHub belonging to either a user or an organization 📦.
This means that, for example, with a single command ⌨️, you can clone hundreds or thousands of repositories, with high levels of concurrency (50 clones in parallel is doable 💨).
The scripts allow for a configurable clone depth, a limit on the number of repositories cloned, and a concurrency level that determines how many clones run in parallel 📈.
By running the following command:
git-clone-all --owner f3rno64 --limit 200 --jobs 40 --dir ./f3rno64
I was able to clone all 174 personal repositories 📚, with full commit histories and all tags & branches, in 58 seconds ⏱️.
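For anyone curious how that kind of parallelism can be done in plain bash, here is a minimal sketch of the general idea, not the exact code from the repository: it assumes the GitHub CLI (gh) is available, and the variable names and defaults are purely illustrative (the actual scripts may query the GitHub API differently and handle errors and existing clones).

#!/usr/bin/env bash
# Simplified sketch: list an owner's repositories with the GitHub CLI and
# clone them in parallel with xargs. Values below are illustrative only.
OWNER="f3rno64"
LIMIT=200
JOBS=40
DIR="./f3rno64"

mkdir -p "$DIR"

# `gh repo list` emits one SSH clone URL per line; `xargs -P` runs that many
# `git clone` processes at once, each cloning into $DIR.
gh repo list "$OWNER" --limit "$LIMIT" --json sshUrl --jq '.[].sshUrl' \
  | xargs -P "$JOBS" -I {} git -C "$DIR" clone {}

The heavy lifting is really just xargs -P fanning out the clone processes; everything else is argument parsing and bookkeeping.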
I wrote a blog post describing this in more detail here 📝; check it out for a breakdown of the arguments and usage examples.
The GitHub repository is f3rno64/mass-git-scripts and the README also includes examples and general usage instructions 🗂️.
Please check it out and let me know what you think! 💬
I hope you find it useful, and any feedback or suggestions for improvement would be greatly appreciated! 🙏
No, I’ve never heard of it. It looks interesting and offers more functionality in a structured way, but it’s also unmaintained and hasn’t had any activity for a long time.
I’ll use it as inspiration to implement more features and develop the set of scripts I’ve already written into a full-fledged command-line tool.
Thank you for the idea! 🙇 I’m curious what this will lead to.
A feature I use constantly is the threading.
I’ve been using mr since the Raspberry Pi and Ubuntu days. It has allowed me to keep my remote clones up to date. I created a few bash scripts for populating the registry.
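For reference, populating the mr registry can be as simple as walking a directory tree and running `mr register` in each existing clone. This is a rough sketch rather than my exact scripts, and the paths are just examples:

#!/usr/bin/env bash
# Rough sketch: register every existing git clone under ~/src with mr (myrepos).
# `mr register` appends an entry for the current repository to ~/.mrconfig.
find ~/src -maxdepth 3 -type d -name .git | while read -r gitdir; do
  (cd "$(dirname "$gitdir")" && mr register)
done

Once the repos are registered, something like `mr -j 5 update` pulls them all, five at a time, which is where the threading pays off.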