Java Sealed Classes are a powerful feature and a subpar design
When I was a youngling learning to code Delphi on the hard knock streets of Joburg… one thing made very clear by both the compiler and my teachers was that circular dependencies are wrong… just wrong. No code will compile. I appreciate we are about 800 years on from then and tech has moved on, so it can sort of handle circular dependencies now, but to me it still feels like a bad design decision.
Which brings me to the Java `permits` keyword (also known as sealed interfaces). This feature allows one to create an interface and "assign" classes to it, basically giving you a union type like TypeScript has. Example code (note this is an empty interface - but it doesn't have to be):
sealed interface Animal permits Dog, Cat {}
Now I can use `Animal` in my code and do a type check for whether it is a `Dog` or a `Cat`, and things just work. This also means the shapes of `Dog` and `Cat` do not need to match each other (again, a union type)… coolness. Again, an example:
void speak(Animal animal) {
  switch (animal) {
    case Dog dog -> dog.bark();
    case Cat cat -> cat.meow();
  }
}
Here is the wild part for me: all the classes "joined" in this interface must also implement that interface… So `Dog` is an `Animal`, `Cat` is an `Animal`, and an `Animal` is either a `Dog` or a `Cat`. Example:
final class Dog implements Animal {
  public String bark() { ... }
}
or
final class Cat implements Animal {
  public String meow() { ... }
}
This means `Animal` needs to know what `Cat` and `Dog` are first, to set itself up, but each of those classes needs to know what `Animal` is first so that they can use it… it is a circular dependency by design. It works… it just feels like bad design, and I likely would not have had them implement the interface if it were my call. There is a perk to this, though: you can force some shared shape in the interface
sealed interface Animal permits Dog, Cat {
  boolean isTailWagging();
}
and then you end up with both classes needing to implement it, à la
final class Dog implements Animal {
  public String bark() { ... }
  public boolean isTailWagging() { ... }
}
or
final class Cat implements Animal {
  public String meow() { ... }
  public boolean isTailWagging() { ... }
}
and then we use that without a type check
void speak(Animal animal) {
  if (!animal.isTailWagging()) {
    // animal is unhappy so will make a noise
    switch (animal) {
      case Dog dog -> dog.bark();
      case Cat cat -> cat.meow();
    }
  }
}
Though in my fictional design, one would just have two interfaces - one for the shared shape and one for the union… Anyway, I hope this little tangent was interesting; it is definitely good to see Java getting this sort of functionality (though, to be fair, it arrived a while back) - now we just need to see people using it!
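For what it is worth, here is a minimal sketch of that two-interface idea (my own illustration, nothing official): the shared shape lives in an ordinary interface, and the sealed interface only describes the union. Java still requires Dog and Cat to implement the sealed interface, but at least the shared behaviour is defined separately:
// ordinary interface: just the shared shape
interface HasTail {
  boolean isTailWagging();
}

// sealed interface: just the union of permitted types
sealed interface Animal permits Dog, Cat {}

final class Dog implements Animal, HasTail {
  public String bark() { return "Woof"; }
  public boolean isTailWagging() { return true; }
}

final class Cat implements Animal, HasTail {
  public String meow() { return "Meow"; }
  public boolean isTailWagging() { return false; }
}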
Why AI Centaurs need Git's excludesFile, aka why global .gitignore files are useful
Before we dive into the problem, let's set the stage with the terms which may be new to someone.
AI Centaur
Cory Doctorow describes Centaurs as
The idea of "centaurs" comes from automation theorists: it describes a system where a human and a machine collaborate to do more than either one could do on their own
I have recently also heard this referred to as an "AI Native", no doubt a play on "Digital Native" - but Centaur is the better term, both on its own and paired with Reverse-Centaur (see the link above for what that is).
AGENTS files
So you may work with a centaur, or you may be one yourself - and if you are in that situation, you have seen the AGENTS.md format slowly creeping into a variety of your code bases. Some tools use different filenames - Warp (which I highly recommend), for example, uses a `WARP.md` - but I am going to call them AGENTS files for this post.
A key detail on AGENTS files, though, is that while there is a standard name, there is no standard for the content - just recommendations that your favourite AI agent/tool should follow.
Problem
In theory, teams could commit their AGENTS file to the repo, and everyone in the team benefits from a single file. However, this seems to fail in two ways:
- Most teams I have worked with do not have a standard AI tooling plan, and each developer uses their own favourite (some may not even be using a tool). This means each tool will want the file tweaked differently, or could hallucinate in odd ways when it gets context it does not expect or understand. In theory this should not be an issue, especially if you handcraft your AGENTS file - but theory and practice don't always align… especially with LLMs.
- In some cases, different team members will want the AI agent they work with to behave slightly differently from the rest of the team. Warp handles this nicely with Rules, but many tools don't - and now you have each team member either compromising or battling over whose file is correct.
In these situations, the outcome is often not committing the AGENTS file at all, and when one has a lot of code bases to flip through, this can lead to situations where one forgets and accidentally commits it, or you need to update every .gitignore all the time.
Recently though, I found core.excludesFile, a Git setting which lets one specify an extra ignore file to use in conjunction with the standard .gitignore in your project. Why this is useful is that you can point it at a file in your home directory and set it in your global Git config… which means every Git project will use that file. So you, or an AI Centaur you know, can put the AGENTS file name in, say, ~/.config/.gitIgnore and then run git config --global core.excludesFile ~/.config/.gitIgnore to set the file, and problem solved!
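For example, that global ignore file could be as small as this (add whichever tool-specific filenames you run into):
AGENTS.md
WARP.md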
Maybe you don't need the config?
While writing this up, I learnt you don't even need to set this, as there is already a default value of $XDG_CONFIG_HOME/git/ignore, and if (like, I suspect, many people) you have not set the XDG_CONFIG_HOME environment variable, then it falls back to $HOME/.config/git/ignore.
Bonus trick
Also from the documentation, I learnt another trick
Patterns which are specific to a particular repository but which do not need to be shared with other related repositories (e.g., auxiliary files that live inside the repository but are specific to one user's workflow) should go into the $GIT_DIR/info/exclude file.
$GIT_DIR is the .git folder in the root of your project (normally - Git allows you to customise everything).
AWS Community Day - South Africa
What an amazing day! For the first time in South African history, we had an AWS Community Day, all thanks to the hard work of Hennie Francis and the team working with him. The organisation, swag, and content were all next-level. For a first-time event, it was stellar.
I skipped all the talks before lunch because I was having such a great time networking with the sponsors and catching up with friends. Before I knew it, it was lunchtime, followed by Dale Nunns doing his famous Duck Hunt talk (you can watch it on YouTube). Dale was amazing, as always.
The next talk I saw was from a speaker new to me, Louise Schmidtgen. She did my favourite type of talk: an honest review of a real-world failure, what was learned, and how they would avoid it in the future. I will absolutely be at her talks in future.
The final talk of the day was Mark Birch's. I had zero interest in seeing this talk from the title, "Community as Code: Engineering Human Infrastructure," especially since, like most attendees, I hadn't even read the description. But, as I've said before, conferences are a serendipity gold mine, and this is yet another example of it; it was not the last one of the day either. This talk felt like it was designed perfectly for me. I was wowed and definitely need to get his book - Community in a Box.
Following the day, I got an invitation to the speakers' dinner, and again, I just ended up in the right seat (thanks, serendipity) for an amazing night of discussion and inspiration. I got to hear from Jared Naude, Lindelani Mbatha, Roberto Arico, and Liza Cullis. I am still in awe of their skills and so grateful for them sharing their wisdom and stories with me.
It was not a perfect day, unfortunately. The venue had some rather short-sighted restrictions on movement and taking photos, which is why the images in my post are cartoons I "drew myself" and totally not from photos I sneakily took and an AI changed.
The venue also extended lunch without asking the organisers, which led to the last time slot of the day being cancelled. I appreciate that the venue staff were trying their best, but decisions like that need to be made by the organisers, not the venue. This meant that Candice Grobler, whose talks always blow me away, didn't get to do her talk - a big disappointment for me.
I hope this is a lesson for the venue to improve, because their reputation is in bad shape right now.
It was amazing to see how Hennie and the team coped with the challenges; they truly did an amazing job despite the venue. I cannot wait for the 2026 event! I will be getting tickets as soon as they come out!
The serendipity of the conference
My experience at yesterday's DataConf, a fantastic event organised by two people I adore, Candice Mesk and Mercia Malan, perfectly illustrates this. My day began as usual: catching up with friends and desperately trying to convince the Bootlegger coffee people to open early just for me.
I had my first serendipitous moment before the day even began. Quinn Grace asked me what I thought comes after AI. The conversation and the ideas that came from that discussion were awesome, and I am still thinking on it as I write this.
Another moment of serendipity happened after lunch. I was unsure which talk to attend, so I simply stayed in the main room without even really knowing what the talk was about. It turned out to be the most practical talk of the day for me: "How to craft teams of exceptional analysts" by Lisema Matsietsi. The title alone screamed that this was NOT a talk for me, and like many attendees, I never read the description, so I was flying blind. This was pure serendipity. The talk was entirely leadership-focused and helped illuminate parts of team dynamics I hadn't deeply understood. I'm so glad it was at the event; it just goes to show how you can end up in the right room at the right time, despite yourself.
The Learning Was in the Talks
I initially thought I'd written enough on the topic, but Duncan Dudley told me my posts were too short. I also thought I'd steal a page from Dale Nunns' wonderful post on the event (he even has photos… I am way too lazy for that), so here are some of the actual learnings that happened for me.
I had the pleasure of watching certified genius (certified by me - still counts) Michael Johnson talk about data engineering. As the resident "not data" guy, his talk was incredibly useful, giving a wonderful look at the history of the field and why we are where we are. It really helped me understand the landscape better.
This was followed by Pippa Hillebrand, who has the genius and laser focus of your favourite super villain, but without the desire to take over the world. Her talk on AI privacy was so powerful, focusing not on how we build AIs, but on how we run them and the risks involved.
Pippa's talk was a perfect lead-in for the funniest and most genuine speaker of the day, Georgina Armstrong. Her talk on recommender systems was genius. I wish DataConf had recorded these talks because hers is a must-see, if for no other reason than so my partner didn't have to listen to me go over every detail when I got home.
I've already mentioned Lisema's talk, so I'll move on to the talk by Marijn Hazelbag, PhD, on digital twins with cellphones and fibre networks. While it was entirely pointless to my work, it was SO interesting (also, extra points for the only live demo of the day, which helped captivate me even more). It opened a door to a world I didn't know existed. I have no idea if I'll ever need that knowledge, but serendipity may have a plan for it.
The talks of the day concluded with Carike Blignaut-Staden, who gave a must-see talk for any team building a dashboard. I've been guilty of doing all the things she said you shouldn't do, which is a great place to learn from because it's all about improving from there.
What a wonderful day. I hope this encourages you to try a conference. And when you do, maybe skip a talk to discuss the future of work or go to a talk you wouldn't normally have chosen. It just might lead to an even better experience.
Originally posted to my LinkedIn, but I thought I would share it here too for those who don't follow me there: https://www.linkedin.com/pulse/serendipity-conference-robert-maclean-342pf/?trackingId=shsYleAzXj22zxAHwx6J%2BQ%3D%3D
The Git LFS Problem with SSH Profiles
Like anyone with a brain, I want mine to be as empty as possible of important things so I can fill it with memes and TV quotes. To help that process, I put my passwords into a password manager - mostly for security, but also for convenience.
1Password, like others, supports running an SSH agent, which is amazing when you have multiple accounts on GitHub and GitLab that all need SSH. It provides great portability, and for most things, it just works.
This does mean having something like this in my ssh config file so I can reference each account with a different name:
Host personalgh
  HostName github.com
  User git
  IdentityFile ~/.ssh/rmaclean_github.pub
  IdentitiesOnly yes

Host clientAgh
  HostName github.com
  User git
  IdentityFile ~/.ssh/clientA_github.pub
  IdentitiesOnly yes
Then, when I git clone, I don't use git@github.com:rmaclean/developmentEnvironment.git; instead, I use personalgh:rmaclean/developmentEnvironment.git. This instructs Git to use the specific profile in the SSH config.
This works great, except if you use Git LFS (Large File Storage). Over the last 14 months, I've used LFS a lot since one of my clients is a game developer, and LFS is essential for all the binary assets. Since LFS uses a separate process, it makes assumptions about the hostname and thinks the SSH profile name is the hostname, so you get an error like this:
Cloning into 'demo'...
remote: Enumerating objects: 4108, done.
remote: Counting objects: 100% (234/234), done.
remote: Compressing objects: 100% (127/127), done.
remote: Total 4108 (delta 140), reused 149 (delta 103), pack-reused 3874 (from 3)
Receiving objects: 100% (4108/4108), 1.62 MiB | 1.46 MiB/s, done.
Resolving deltas: 100% (2524/2524), done.
Downloading public/assets/audio/music/Arcade_LoopNew.mp3 (2.4 MB)
Error downloading object: public/assets/audio/music/Arcade_LoopNew.mp3 (425014c): Smudge error: Error downloading public/assets/audio/music/Arcade_LoopNew.mp3 (425014c3f342e099e5c041b875d440e9547ef4fb2a725d41bb81992ad9f37ddd): batch request: ssh: Could not resolve hostname clientA: nodename nor servname provided, or not known: exit status 255
Errors logged to '/private/tmp/demo/.git/lfs/logs/20250826T090610.648994.log'.
Use `git lfs logs last` to view the log.
error: external filter 'git-lfs filter-process' failed
fatal: public/assets/audio/music/Arcade_LoopNew.mp3: smudge filter lfs failed
warning: Clone succeeded, but checkout failed.
You can inspect what was checked out with 'git status'
and retry with 'git restore --source=HEAD :/'
To fix this, you must use the standard git@github.com: host structure. But since you also need to pass in the correct SSH key, you can do that with a temporary config setting: core.sshCommand.
When cloning, you can specify the command directly:
git -c core.sshCommand="ssh -i ~/.ssh/clientA_github.pub" clone git@github.com:client/demo.git
After successfully cloning the repository, any subsequent pushes or pulls will still fail. This is because the repository is not yet configured to use the right key. The git clone command simply added a parameter for that single operation - it didn't change the repository's configuration.
To fix this, you need to set the configuration correctly after the clone. Switch into the cloned folder and run this command:
git config core.sshCommand "ssh -i ~/.ssh/clientA_github.pub"
This command sets core.sshCommand for the repository, ensuring that it always uses the correct key. Now it will just work.
The Case of the Dotty Environment Variables
This post is for that one other person who's losing their mind over a bizarre issue: lowercase environment variables with dots in their names not showing up in one environment, when they work everywhere else.
The client's request was simple: configure all environment variables with a naming convention like my.app.variable. Everything I knew suggested this was impossible - environment variables typically follow an UPPERCASE_SNAKE_CASE convention and don't play well with special characters like dots. Yet, in their environment, it worked.
The Investigation
To get to the bottom of this, I created a minimal, reproducible example: a simple Java application that reads an environment variable, along with the k8s manifest and Dockerfile needed to run it. I hosted the code on GitHub for anyone to see and confirm: it worked. I could even sh into the container and confirm the variables were present and correctly formatted using the env command.
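The heart of such a test app is nothing more than reading the variable back - roughly a sketch like this (the variable name is the illustrative one from above, and this is not the exact code from my repo):
public class EnvCheck {
  public static void main(String[] args) {
    // Dotted, lowercase names are unusual, but System.getenv is just a lookup,
    // so if the variable made it into the process environment this will see it.
    String value = System.getenv("my.app.variable");
    System.out.println("my.app.variable = " + value);
  }
}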
The puzzle deepened. The environment variables were definitely there and readable, so why couldn't my real application see them?
The Unexpected Culprit
After a lot of debugging, I finally found the problem: the Docker base image.
The client's working environment and my local test were (by coincidence) running on eclipse-temurin:21-jre-alpine. The actual project code, however, was using eclipse-temurin:21-jre.
The difference seems subtle, but it was critical. The alpine version, being a minimal Linux distribution, likely handles environment variable parsing differently or has a different shell configuration (via the entry point being bash) that allows these non-standard variable names to be passed and read correctly. The standard image, based on a different underlying OS (likely Debian or similar), does not.
Trying out DenoDeploy Early Access
I have been using DenoDeploy for my experiments and toys recently, and have been wowed by it. They recently announced that version 2 is coming, and you can try it out in early access now: https://deno.com/deploy
I wanted to try this and felt that, after a year, it was a good time to rebuild the website for my sole proprietorship, https://www.goodname.co.za. Last year when I launched it, I hosted it on my main (expensive) hosting provider, where I run this blog, and used Drupal as the system for it… all because it was quick to get going.
In a year, I made no updates to the site and yet spent way too long updating dependencies I didn't need… so why not use something new with DenoDeploy EA? And what could I not do before… run static HTML content! Yes, DenoDeploy now supports that, and since I can put together HTML/CSS/JS quickly… it makes it really solid. It also means I can use my normal dev tools and push updates via GitHub.
I don't have much more to say, because DenoDeploy EA just worked; it was easy to configure (just connect to GitHub), link the domain via DNS and BOOM! It is running! It is amazing. I am very excited to see what is coming from it in the future.
If you are looking for a place to run TypeScript, JavaScript, or static content… you owe it to yourself (and your wallet) to check it out.
Bring your Google calendar into a spreadsheet
When it comes to spreadsheets, Excel kicks ass - it is massively more powerful than anything else out there - but I recently had to pull Google Calendar info into a spreadsheet, and rather than manually capturing it, I found that Google Sheets with Apps Script is really powerful thanks to the unified Google experience.
To bring in the info, I followed these steps.
- Create a new spreadsheet (I used the awesome https://sheets.new url to do that)
- In the spreadsheet, add your start and end dates for the range you want to import. I put start in A1 and end in B1
- Click Extensions → Apps Script
- In the Code.gs file, drop the following code in
// Configuration constants
// change these as needed
const START_DATE_CELL = 'A1';
const END_DATE_CELL = 'B1';
const HEADER_ROW = 3;
const HEADER_COL = 2;
// do not change these
const DATA_START_ROW = HEADER_ROW + 1;
const NUM_COLS = 3;

function calendar_update() {
  // your calendar email address here
  var mycal = Session.getActiveUser().getEmail();
  var cal = CalendarApp.getCalendarById(mycal);
  var sheet = SpreadsheetApp.getActiveSheet();

  // Clear existing data rows
  var currentRow = DATA_START_ROW;
  while (true) {
    var checkRange = sheet.getRange(currentRow, HEADER_COL, 1, NUM_COLS);
    var values = checkRange.getValues()[0];
    var hasData = values.some(cell => cell !== '' && cell !== null && cell !== undefined);
    if (!hasData) {
      break;
    }
    checkRange.clearContent();
    currentRow++;
  }

  // put dates here
  var events = cal.getEvents(
    sheet.getRange(START_DATE_CELL).getValue(),
    sheet.getRange(END_DATE_CELL).getValue(),
    { search: '-project123' },
  );

  var header = [['Date', 'Event Title', 'Duration']];
  var range = sheet.getRange(HEADER_ROW, HEADER_COL, 1, NUM_COLS);
  range.setValues(header);

  var rowIndex = DATA_START_ROW;
  for (const event of events) {
    if (event.getTitle() === 'Busy' || event.getTitle() === 'WFH' || event.getMyStatus() === CalendarApp.GuestStatus.NO) {
      continue;
    }
    var duration = (event.getEndTime() - event.getStartTime()) / 3600000;
    var details = [[event.getStartTime(), event.getTitle(), duration]];
    var range = sheet.getRange(rowIndex, HEADER_COL, 1, 3);
    range.setValues(details);
    rowIndex++;
  }
}
- Set the config at the top of the script and hit save
const START_DATE_CELL = 'A1'; // this is where you specified the inclusive start date to pull from
const END_DATE_CELL = 'B1'; // this is where you specified the exclusive end date to pull to
const HEADER_ROW = 3; // the row for where the header for the table will be
const HEADER_COL = 1; // this is the column where the first part of the header is A = 1, B = 2 etc...
- Save and run… you will be asked for auth; this is a one-time approval
- The content will be in the sheet now! But let's make it easy to update
- Go to Insert → Drawing, draw a button or icon, and hit insert
- On the button, click the 3 dots and select Assign Script
- For "Which script do you want to assign", put in calendar_update and click OK. Now you can click that button at any time and it will update
And as a final awesome trick: you may wish to convert something like 0.25 into a human-readable "15 minutes". I use the formula =TEXT(<VALUE>/24,"[h] \h\o\u\r\s m \m\i\n\u\t\e\s") - the division by 24 is needed because the duration column is in hours, while Sheets' duration formats treat the number as days.
My Secret Weapon: Single-Letter Git Aliases
Today I want to share my favourite Git aliases that I've built up over the years. If you haven't heard of a Git alias, it's basically a way to add your own custom commands to Git. Think of it as a personal shortcut for frequently used Git operations (you can check out the official docs here for the technical deep dive).
You could add these to your shell directly, and they'd work pretty similarly. But for me, that requirement of still having to type git first really keeps them nicely ring-fenced. It helps keep my mental space focused just on Git commands. Plus, since I use these constantly, they're all single-letter commands. If you're doing everything on the shell, you might run out of good single letters (depending on your shell, you've only got so many!). Here, I still have a limit, but it's focused specifically on Git commands, so I'm not going to hit it anytime soon.
git u (Update Branch)
My go-to: first up is u. This one is for updating my branch. Most days, before I even start coding, I want to update my feature branch from main. I also like to do a bit of clean-up to keep Git performant and my local repo tidy. Man, this used to be a bunch of things I had to remember to do:
- Switch to main… except it might also be master, release, or staging. I bounce between multiple teams and clients, and remembering which one is which each time can be tiring.
- Run fetch --prune to clean up any local branches that no longer exist on the remote. Keeps things neat!
- Pull the latest changes into my local main.
- Run maintenance for performance goodness.
- Switch back to my feature branch and merge those fresh changes across.
That's a lot of individual commands, right? Today, that's just git u. Internally, the alias looks like this:
alias.u=!git switch $(git symbolic-ref refs/remotes/origin/HEAD | sed 's@^refs/remotes/origin/@@') && git fetch --prune && git pull && git maintenance run && git switch - && git merge $(git symbolic-ref refs/remotes/origin/HEAD | sed 's@^refs/remotes/origin/@@')
Now, where this still messes up for me is with one client whose main is actually production, but their workflow dictates pulling from staging… so it's a bit weird. And then there's the classic rebase vs. merge debate. I landed on merge for this alias since it's often safer, but I do wonder if I could build a smarter system that tries rebase first and, if that has issues, switches to merge. Food for thought!
git n (New Branch)
Branching out: next up is n. This one is for creating a new branch. Take everything from the u command above, except for the last step (merging back). Instead, I want to type in a new branch name, and it creates that new branch for me. That was super easy to integrate using read, and the final alias looks like this:
alias.n=!git switch $(git symbolic-ref refs/remotes/origin/HEAD | sed 's@^refs/remotes/origin/@@') && git fetch --prune && git pull && git maintenance run && read -p 'New branch name: ' branch && git switch -c $branch
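If you want to set these up yourself, they just live in your Git config; for example, in ~/.gitconfig they would look roughly like this (same commands as above, just in config-file form):
[alias]
    u = !git switch $(git symbolic-ref refs/remotes/origin/HEAD | sed 's@^refs/remotes/origin/@@') && git fetch --prune && git pull && git maintenance run && git switch - && git merge $(git symbolic-ref refs/remotes/origin/HEAD | sed 's@^refs/remotes/origin/@@')
    n = !git switch $(git symbolic-ref refs/remotes/origin/HEAD | sed 's@^refs/remotes/origin/@@') && git fetch --prune && git pull && git maintenance run && read -p 'New branch name: ' branch && git switch -c $branch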
And there you have it! These two aliases have saved me countless keystrokes and mental gymnastics over the years.
TIL: Tax is an asymptote
Tax is complicated, and we don't even have lunatic tariffs to worry about.
Often people talk about the fact that we pay up to 45% tax in South Africa, but that is seldom true; in fact, mathematically it will never be true, because the effective tax rate is an asymptote - it approaches 45% but never reaches it. Even if you round up, you have to earn over R32.1 million a month to get to 45% - something no-one (or very few people) will get close to.
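The reason, in general form (not the exact SARS numbers): if the top marginal rate t only applies above a threshold T, and B is the fixed amount owed on the income below T, then the effective rate on an income I in the top bracket is

\text{effective rate}(I) = \frac{B + t\,(I - T)}{I} = t - \frac{tT - B}{I}

Because the lower brackets are taxed at less than t, we have B < tT, so that last term is always positive: the effective rate climbs towards t (45% here) as income grows, but never actually touches it.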
I wanted to see this for myself, so I built taxreality.sadev.co.za this evening to play with the formula and visualise it.
Like with my last project, it is built with Deno, Fresh and Deploy. In addition, I also made use of System.css for some of the styling and RealFaviconGenerator, which is an amazing tool; it let me use an SVG and have it change to work with light and dark mode!