There is a quiet shift happening in programming. It’s the subtle thinning of craftsmanship. Tools are more powerful than ever, the barriers to entry lower than ever, and yet, paradoxically, the depth of understanding behind much of today’s software is disappearing. It feels less like progress and more like the early stages of a decline: a transition into what could be described as a dark age of programming.
I’m not saying excellence has vanished. It’s a thinning – true craftsmanship still exists. There are engineers who understand systems down to their bones, who can reason about performance, memory, concurrency and architecture with clarity and precision. But they are becoming a minority, not because they are being replaced by something better, but because the ecosystem no longer demands their level of attention to detail from the majority.
In other words: on the surface, productivity and innovation are accelerating, but the expectation that a programmer understands the system they are building is diminishing. This used to be necessary. There was a general assumption that a programmer could, if need be, trace behavior all the way through the stack, reason about performance impacts and form a mental model that extended beyond the immediate layer they were working in. Programming was not just about producing outcomes; it was about understanding causality.
History offers an uncomfortable parallel.
History Repeating
Many civilizations have experienced periods of extraordinary growth followed by gradual decline – not through catastrophe, but through complacency. As systems become more sophisticated, the individuals within them rely increasingly on inherited structures rather than understanding how to build or repair them. Skills that were once essential become specialized, then rare, and eventually lost.
In Classical Greece and the later Hellenistic period, there was a bloom of philosophy, mathematics and engineering, much of which relied on a relatively small group of deeply trained thinkers and craftsmen. Over time, as those ideas spread and were adopted more broadly across cultures, they were often applied without the same depth of understanding that originally produced them. The forms remained: architecture, rhetoric, governance – but the underlying deep thinking and innovation became less consistently practiced, and knowledge gradually shifted from something actively developed to something inherited and reused. The result was not an immediate disappearance of capability, but a dilution, where complexity persisted while the proportion of people capable of fully maintaining it declined.
The Roman Empire maintained vast infrastructure: roads, aqueducts, engineering feats that defined an era. But over time, the knowledge required to construct and maintain these systems became scarce. What followed was not an immediate collapse, but a slow unraveling. Maintenance gave way to decay. Complexity outpaced understanding.
The comparisons aren’t perfect, but the resemblances are difficult to ignore.
Complexity persists. Understanding doesn’t.
In programming, we are not losing the ability to build; in fact, we are building more than ever. What we may be losing is the distributed depth required to maintain, debug, and evolve those systems reliably over time. The knowledge still exists, but it is no longer widespread, and in many environments it is not even valued.
Cultural Decay
Modern development practices tend to prioritize speed, iteration, and delivery above all else, which is reasonable within certain constraints but has secondary effects that are rarely examined. When the dominant message becomes “use the tool, trust the abstraction and move on”, the natural consequence is that fewer people invest the time required to understand what’s actually running under the surface.
Over time, this shifts the view of what it means to be competent. Craftsmanship becomes optional. And when something becomes optional at scale, it tends to fade from practice.
None of this implies that skilled engineers have disappeared. On the contrary, they remain essential, and in many cases they are the ones quietly holding systems together when abstractions begin to leak. What has changed is their proportion relative to the overall ecosystem, and the expectations placed on the average practitioner.
If this trajectory continues, a “dark age” in programming would not manifest as a visible halt, but as a divergence between appearance and reality. Software would continue to ship on time, capabilities would expand, new tools would emerge – but the underlying systems would become increasingly difficult to reason about, and the number of people capable of addressing fundamental issues would continue to shrink.
It would feel like progress, right up until the moment it doesn’t: the collapse.
I’m hopeful that we never reach that point.
DRY Principle To The Rescue
History doesn’t need to repeat itself. If there’s one principle many programmers adhere to, it’s DRY – Don’t Repeat Yourself.
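As a quick refresher on the literal principle, here is a minimal sketch (the function names and the discount rule are purely illustrative, not from any real codebase): duplicated knowledge is knowledge nobody fully owns, while a single definition is something one can actually read and understand.

```python
# Before DRY: the same pricing rule is copied into two functions.
# If the rule changes, every copy must be found and updated -- and
# the understanding of the rule is scattered rather than owned.
def invoice_total(prices):
    return sum(p * 0.9 if p > 100 else p for p in prices)

def receipt_total(prices):
    return sum(p * 0.9 if p > 100 else p for p in prices)

# After DRY: the rule lives in exactly one place. There is one thing
# to read, one thing to reason about, one thing to change.
def discounted(price):
    """Apply a 10% discount to items over 100 (illustrative rule)."""
    return price * 0.9 if price > 100 else price

def total(prices):
    return sum(discounted(p) for p in prices)
```

The point is not the deduplication itself, but that a single, named definition is the unit of understanding the rest of this essay argues for.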
The question is not whether tools will continue to improve. They will. The question is whether the people using them will continue to understand what they are building – or whether, like so many before them, they will inherit a system they can no longer fully comprehend. That is how dark ages begin. Not with a sudden loss, but with a quiet forgetting.
However, craftsmanship has a way of surviving. It persists in small groups, in individuals who care about the integrity of their work, who study systems not just to use them but to understand them. These are the programmers who read source code for insight, who question abstractions, who rebuild things from the ground up – not because they have to, but because they want to know.
If there is a way to avoid a digital dark age, it lies with the craftspeople and with a cultural recalibration that once again values depth alongside speed.