Recently, I ran into a challenge with my personal website that sent me down a rabbit hole, and I wanted to share the solution with folks who might want to do something similar.
My website is generated by Jekyll, because I use Akamai as a content delivery network and a CDN just works better with statically generated sites: you build the pages once and they're served up as needed, without your origin server (the server you control) doing any extra work. Pages are regenerated only when something changes, they're cached at the Edge, and users get them without waiting for them to be generated.
The challenge: I uploaded a changed file to my server and the change didn't show up immediately in my browser. Obviously, this is because the file was cached; Akamai is doing good work to make sure my visitors have an excellent experience. You'd think I'd remember that files need to be purged from the cache after updating, but I didn't, and it was frustrating. And then I thought, jeez, shouldn't there be a way to automate this?
The great news is that there's a new Akamai CLI that can be leveraged for this sort of thing, and its purge command is particularly simple to work with.
So I set forth on my goal of automating the cache purge whenever the Jekyll site is updated, and realized… Jekyll neither knows nor cares which files have changed. It just regenerates the entire site willy-nilly, so it can't be relied upon to tell you what to purge.
I build the files on my local machine, so there's no reason I couldn't push them to the production server using git, which, as it turns out, cares deeply about which files have changed.
Having added the _site files to my repository, pushing changes from my local system to the web server was simple, which left only one order of business: the original goal, purging the cache when a file has changed. For that I used a server-side git hook (post-receive) that runs a git diff at the end of each push and clears the cache for each file that changed.
I’ll walk through the functionality here so you can see what’s happening.
First, we only act on the master branch - for any other branch, the hook is a no-op.
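That check can be sketched roughly as below. A post-receive hook receives one "&lt;oldrev&gt; &lt;newrev&gt; &lt;refname&gt;" line on stdin per pushed ref (that's standard git behavior); the function wrapper is just for illustration.

```shell
#!/bin/sh
# Sketch of the branch check in a post-receive hook.
# Reads "<oldrev> <newrev> <refname>" lines and passes through only the
# commit IDs for pushes to master; everything else is a no-op.
filter_master() {
  while read -r oldrev newrev refname; do
    [ "$refname" = "refs/heads/master" ] || continue
    echo "$oldrev $newrev"
  done
}
```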
Next, in my installation, the git files live in a separate directory from the content; keeping the repository outside the web root is good for security and privacy. So the first action is to check out the newest commit from the git directory into the content directory.
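A hedged sketch of that step: git can check a commit out into a work tree that's separate from the repository directory. In the hook the two paths would be hard-coded server paths (something like a repo under your home directory and a web root such as /var/www/html); here they're parameters so the shape is clear.

```shell
# Sketch: force-checkout a branch from a repo into a separate work tree.
# The paths are illustrative - in a real hook they'd be hard-coded.
deploy_site() {
  git_dir=$1; work_tree=$2; branch=$3
  mkdir -p "$work_tree"
  # --git-dir points at the repository, --work-tree at the content directory
  git --git-dir="$git_dir" --work-tree="$work_tree" checkout -f "$branch"
}
```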
Now we get to the meat of the issue. Once the files are received, we can ask git which files changed between the previous commit and this one (both commit IDs are available in the hook). For each of those files, a simple string substitution turns the repository path into the matching URL on my site, and then the URLs are purged using the Akamai CLI.
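Here's a hedged sketch of that step. The `_site/` prefix and the `example.com` domain are assumptions about the repository layout, not the actual site; `akamai purge invalidate` is the purge package's per-URL invalidation command, but check `akamai purge help` against your installed version.

```shell
# Sketch: map changed repo paths to live URLs, then purge each one.
# The "_site/" prefix and domain below are illustrative assumptions.
changed_to_urls() {
  sed -e 's|^_site/||' -e 's|^|https://example.com/|'
}

# $1 and $2 are the old and new commit IDs the post-receive hook
# reads from stdin.
purge_changed() {
  git diff --name-only "$1" "$2" | changed_to_urls |
  while read -r url; do
    akamai purge invalidate "$url"
  done
}
```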
And that's it. The purge CLI expects that you've gone through the authentication and provisioning portion of the "Get Started" section at https://developer.akamai.com, but once you've done that, your publication process is good to go, without having to manually purge changed files.