Highlights of the Week
You Will Know Nothing and Be Happy
https://seattledataguy.substack.com/p/you-will-know-nothing-and-be-happy
But there’s a difference between using AI to accelerate your thinking and using AI to replace your thinking!
Previously, we built our mental maps of how things work by writing code. Now that has changed, and we need to figure out what the new way is.
Production Is Where the Rigor Goes
https://www.honeycomb.io/blog/production-is-where-the-rigor-goes
The notes describe five destinations where rigor is already moving:
• Upstream to specification review
• Into test suites as first-class artifacts
• Into type systems and constraints
• Into risk mapping
• Into continuous comprehension
All of these are great and exciting. Beefing up your pre-production test quality, capturing intent in specification docs, separating specs from constraints, revisiting the jobs to be done by code review: yes, yes, yes, all of that. Yes please. But where is production on that list? If control is supposed to be moving “closer to reality,” what is closer to reality than your production systems? Production is reality! Reality is production!
Observability has always been a bit of an afterthought at most companies, just like documentation, but new AI systems can make it so much more useful that everyone might start actually caring about it again. Writing tests and docs for your code is never going to replicate the messy realities of production systems, so instead we need to be able to see what is going on there and use that as context for the AI to build and fix.
Lose Myself
https://www.eod.com/blog/2026/02/lose-myself/
It still makes me sad, though, that what I’ve spent 45 years of my life toiling at will likely end up as a footnote, the province of folksy artisans and historical reenactors. I didn’t leave a dent in the universe so much as splatted against it. The world no longer has a need for what I somewhat sardonically call my art. We are all product managers now, pleading with obtuse underlings to go back and try again and to get it right this time. I remain a father and husband and son and friend, but the need for what I can do — the need for what programmers can do — is shrinking, and my conception of myself and my usefulness along with it. There will be more software than ever, as its production is automated; we are entering the industrial age of the digital age. But less of this code will be elegant, or considerate, or graceful. Less of it will be created by removing what isn’t David, and less of it will be driven by a human understanding of human needs.
I see developers increasingly splitting into two camps: those who love writing code, and those who love building things. AI helps the latter but is detrimental to the former.
The Machine Didn’t Take Your Craft. You Gave It Up.
https://www.davidabram.dev/musings/the-machine-didnt-take-your-craft/
Speaking as a developer, this becomes obvious the moment you step outside the romantic framing. I have been doing this for years, and the hardest parts of the job were never about typing out code. I have always struggled most with understanding systems, debugging things that made no sense, designing architectures that wouldn’t collapse under heavy load, and making decisions that would save months of pain later. None of these problems can be solved by LLMs. They can suggest code, help with boilerplate, and sometimes act as a sounding board. But they don’t understand the system, they don’t carry context in their “minds”, and they certainly don’t know why a decision is right or wrong. And most importantly, they don’t choose. That part is still yours. The real work of software development, the part that makes someone valuable, is knowing what should exist in the first place, and why.
In contrast to the piece above, we developers still have lots of room for personalisation and choice. In many, if not most, places it is better to let the AI go off and write the code, and you can let it make higher-level decisions and assumptions too. But you don’t have to, and in many places, I would argue, you probably shouldn’t. You can still make your own decisions about how things work and how it all looks.
Meditation, Language, and LLMs
https://craigmod.com/roden/112/
I’m not actually a Doomer around AI — don’t worry, I think we’ll be working more than ever, sometimes more interestingly, but mostly, perhaps, more depressingly. What’s special about this moment is there is something existential in the air, and that makes us open to reflection and change. I really do believe the denuding of purpose and meaning is coming for many, many jobs (coding being the first). But I also think we’ve long ascribed meaning to the wrong activities. So it’s a good time to start meditating, to spend a few afternoons talking about what you’re doing, why you’re doing it, and maybe what you’d rather be doing.
Times are changing, and LLMs are the driving factor behind it. As the pieces above suggest, if you got your meaning from writing code, then you’re going to have to force an identity change towards what you actually care about. Is it solving problems, or something else?