September 2025
Makesure is a task/command runner that I am developing. It is somewhat similar to the well-known make
tool, but without most of its idiosyncrasies (and with a couple of unique features!).
Being zero-install it needs a way to self-update.
This is implemented by the -U/--selfupdate option:
./makesure -U
Under the hood it is implemented by downloading the latest version of the utility executable and replacing the current one.
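Conceptually, the replacement step can be sketched like this (a hypothetical sketch, not makesure's actual code; the download itself is stubbed out with a stand-in script, and the file name makesure-demo is made up for the example):

```shell
# Hypothetical sketch of the self-replace step (not makesure's actual code).
# Download the new version to a temp file, make it executable, then swap it in.
tmp=$(mktemp)
# In reality the new version would be downloaded, e.g.:
#   curl -fsS "$latest_url" -o "$tmp"
printf '%s\n' '#!/bin/sh' 'echo updated' > "$tmp"   # stand-in for the download
chmod +x "$tmp"
mv "$tmp" ./makesure-demo   # replace the current executable
./makesure-demo             # the new version is now in place
```

A real implementation also has to worry about write permissions and about not corrupting the executable that is currently running, but the shape is the same.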
The most important thing here is how to determine the latest version.
For a long time it was implemented by simply checking the version inside the most recent utility source file stored in the GitHub repo:
https://raw.githubusercontent.com/xonixx/makesure/main/makesure?token=$RANDOM. The trick with ?token=$RANDOM was needed to overcome caching. By default, GitHub caches raw links for an unpredictable amount of time (from minutes to, sometimes, days).
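For illustration, the old check amounted to something like this (a sketch, not the actual makesure code: the exact version-string format inside the script is an assumption, and the parsing runs on a hardcoded sample to keep the example self-contained):

```shell
# Sketch of the old version check. The real download was:
#   curl -fsS "https://raw.githubusercontent.com/xonixx/makesure/main/makesure?token=$RANDOM"
# Assume (hypothetically) the script embeds its version as a line "Version=0.9.24":
sample='#!/bin/sh
Version=0.9.24'
latest_version=$(printf '%s\n' "$sample" | grep -m1 -o 'Version=[0-9.]*' | cut -d= -f2)
echo "$latest_version"
```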
This trick was “patched” by GitHub, effectively breaking it. Now adding the parameter results in a 404 error.
Now, what options do we have?
Probably, the most correct one is to maintain a separate file (like a text file or JSON) on our own server with a list of all releases and their versions.
I didn’t want to go this route because the maintenance complexity of this solution would be much higher than the current scale of the project.
Another option would be to use the GitHub API to get the latest release version.
I did try this approach, but the main obstacle turned out to be the aggressive rate limiting that GitHub applies.
Let me demonstrate.
At first, I tried to get the latest commit hash and use it to reliably fetch the most recent (= not stale) version of the file.
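Sketched out, the commit-hash approach looks like this (the API endpoint and the sha field are real GitHub API details, but the JSON here is a hardcoded sample, and the extraction uses a fragile sed match rather than a proper JSON parser):

```shell
# Commit-hash approach, sketched. The real (unauthenticated) API call is:
#   curl -fsS https://api.github.com/repos/xonixx/makesure/commits/main
# Unauthenticated API requests are rate-limited to 60 per hour per IP.
sample_json='{"sha":"0123abcdef","commit":{"message":"sample"}}'
sha=$(printf '%s' "$sample_json" | sed -n 's/.*"sha":"\([^"]*\)".*/\1/p')
# A commit-pinned raw link never goes stale:
#   curl -fsS "https://raw.githubusercontent.com/xonixx/makesure/$sha/makesure"
echo "$sha"
```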
Quickly I realized that my self-update integration test breaks in GitHub Actions. In my pipeline I run the test suite over multiple OSes in parallel, and this hits the rate limit.
It’s important to note that I could compromise and exclude the test from the CI pipeline. It’s unlikely that rate limiting will affect end users. Unlikely, but not impossible. What if users sit inside a corporate network (i.e. behind a single IP) and decide to update the utility simultaneously?
My next attempt to outsmart GitHub (haha, how naive I was) was to download and parse the HTML page instead of the API/JSON one.
Apparently, this triggered yet another level of rate limiting, one that covers the GitHub UI.
For some reason I kept persisting and attempted yet another page, with predictably the same result.
All in all, GitHub presented an impenetrable wall here.
Basically, if you want to implement the mechanism staying solely in the realm of GitHub, you have to choose between:
- aggressive caching of raw links (raw.githubusercontent.com) (but no rate limiting!)
- aggressive rate limiting of the API (api.github.com) and GitHub UI (github.com) (but no caching!)
And if you think about it, these constraints make a lot of sense for the resilience of such a big service as GitHub.
Is there a way out? I found one; I call it the “incremental strategy”.
The idea is simple. Can we predict the next release version? Well, if the current version of the utility is 0.9.24 it seems reasonable to expect the next one to be 0.9.25.
If we know the next version beforehand, we can download the file from a (now known) raw link (in this case https://raw.githubusercontent.com/xonixx/makesure/v0.9.25/makesure), and there won’t be any caching-related problem!
How cool is that?
And that’s exactly what I’ve implemented.
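The version bump at the core of it is trivial with shell parameter expansion (a sketch assuming a strict three-part x.y.z version scheme; the actual makesure implementation may differ in details):

```shell
# Incremental strategy, sketched: predict the next patch version.
current=0.9.24
base=${current%.*}            # "0.9"
patch=${current##*.}          # "24"
next="$base.$((patch + 1))"   # "0.9.25"
echo "$next"
# If tag v$next exists, the pinned raw link serves the new release with no
# caching issues; if the request 404s, presumably we are already up to date:
#   curl -fsS "https://raw.githubusercontent.com/xonixx/makesure/v$next/makesure"
```

Presumably one could also repeat the bump until a request fails, to handle a user who is several releases behind.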
Does this solution have drawbacks? Lots of them!
The implemented approach is certainly not ideal. A more robust self-update implementation would need our own server hosting a file with release versions.
Even better would be to distribute the utility via the default package managers on every OS, but the implementation effort there is monumental 🤯.
I decided not to do any of this to keep it manageable for me.