Memories

There was a time when software respected the machine it ran on. Developers measured memory in kilobytes, not because they were nostalgic minimalists, but because they had no choice. Code was written with intent, because waste meant failure.

Now, that discipline is gone, replaced by the assumption that users will simply have more RAM.

Open a modern system monitor and you’ll see the absurdity in numbers. A single browser tab, 500 MB. A chat client, 1 GB. Adobe Creative Cloud, several gigabytes even when you’re not using it (see fig. 1-1 and fig. 1-2; these are what prompted this whole article). These aren’t the natural costs of progress. They’re the symptoms of an industry that has stopped caring.
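
You don’t even have to trust the system monitor’s UI. Here’s a minimal sketch that lists the ten biggest resident-memory consumers, assuming the third-party psutil package is installed:

    import psutil

    # Collect (resident set size, name) for every visible process.
    procs = []
    for p in psutil.process_iter(['name', 'memory_info']):
        mem = p.info['memory_info']
        if mem is None:
            continue  # process vanished or access was denied mid-scan
        procs.append((mem.rss, p.info['name'] or '?'))

    # Print the ten hungriest, in megabytes.
    for rss, name in sorted(procs, reverse=True)[:10]:
        print(f"{rss / 2**20:8.0f} MB  {name}")

Run it on an idle machine and the top of the list is often software you never consciously opened.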

The Age of Bloat

Software today behaves like it owns your computer. Adobe, Microsoft, Google – they install “services” that run continuously, doing little of value. Half of them are update managers, telemetry clients, or background helpers that no one asked for. The other half exist solely to keep themselves running: watchdog processes and crash launchers.

We tolerate it because the machines keep getting bigger. The RAM doubles, the SSDs expand, the CPUs add more cores. And yet, the experience doesn’t feel faster. We’ve created a strange illusion of progress where hardware evolution hides software decay.

It’s not that modern applications can’t be efficient. There’s just no incentive to make them so. Frameworks promise speed of development, not speed of execution. Web stacks, cross-platform wrappers, and convenience runtimes all pile up until the original intent of the program is buried under its own scaffolding.

An app that used to fit on a floppy disk now ships with its own operating environment.

When Megabytes Were Enough

In the 1990s, entire office suites fit comfortably within a few dozen megabytes. Word processors launched instantly. Spreadsheets with tens of thousands of cells recalculated in real time on hardware that would now be considered an embedded microcontroller.

The same fundamental tasks – rendering text, decoding images, storing documents – are not meaningfully harder today. Higher resolutions and color depth explain only a fraction of the difference. The rest is self-inflicted: layers of frameworks, duplicated functionality, and unnecessary background processing that burns through memory for no tangible benefit.
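
A back-of-the-envelope sketch makes the point. This toy “spreadsheet” is deliberately naive Python, recalculating 50,000 dependent cells in a single pass:

    import time

    # Toy spreadsheet: 50,000 cells, each one a formula that
    # averages the two cells above it, recalculated top to bottom.
    N = 50_000
    cells = [1.0] * N

    start = time.perf_counter()
    for i in range(2, N):
        cells[i] = 0.5 * (cells[i - 1] + cells[i - 2])
    elapsed = time.perf_counter() - start

    print(f"Recalculated {N:,} cells in {elapsed * 1000:.1f} ms")

Even interpreted Python, hardly a model of efficiency, finishes in a few milliseconds. The spreadsheets of the 90s did the equivalent work in compiled code at a fraction of the clock speed.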

The Contempt of Convenience

At some point, many software companies stopped optimizing and started assuming. “Users have 32 GB of RAM anyway.” “It’s fine, SSDs are fast.”

But this mindset isn’t harmless. It’s contempt in disguise. It says, “Your resources belong to us. Your computer is just a host for our ecosystem.”

Adobe is a perfect example. Install Photoshop, and you get a miniature operating system of background daemons: updater agents, licensing managers, cloud synchronizers. All running, all hungry. They occupy memory and CPU cycles before you even open the program. That’s not engineering. That’s exploitation.

Look back at fig. 1-1 and fig. 1-2. The eight persistent Adobe processes run a total of 132 threads. There was a time when that scale of multithreading was unimaginable on a desktop computer.
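
If you want to reproduce that count on your own machine, here’s a small sketch using psutil again; the vendor-name filter is only an assumption about how the processes happen to be named:

    import psutil

    # Tally resident processes whose name contains a vendor string,
    # along with their combined thread count.
    VENDOR = "Adobe"

    matches = []
    for p in psutil.process_iter(['name', 'num_threads']):
        name = p.info['name'] or ''
        if VENDOR.lower() in name.lower() and p.info['num_threads']:
            matches.append((name, p.info['num_threads']))

    print(f"{len(matches)} processes, {sum(n for _, n in matches)} threads")
    for name, n in sorted(matches):
        print(f"  {n:4} threads  {name}")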

Fun fact: the picture for this article was generated with Adobe Illustrator, by asking it to visualize wasteful memory management in software. Ironically, it pictures that as a bullet train in utopian surroundings. While handling that file (which contains only vector data), Illustrator consumed over 4 GB of RAM on top of the 700 MB already claimed by the background processes.

Twenty years ago, this would have exceeded the total memory a typical desktop CPU and operating system could even address, thanks to the 32-bit / 4 GB memory cap. Adobe Illustrator first came out in 1987, when a typical Macintosh had 1 MB of RAM.
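
The arithmetic behind that claim fits in a few lines (powers of two for round numbers; the 4.7 GB figure is just the session described above):

    # 32-bit pointers can address 2**32 bytes, hence the 4 GB cap.
    print(2 ** 32 / 2 ** 30)            # 4.0 GiB

    # One Illustrator session (~4 GB + ~700 MB of daemons)
    # against the 1987 Macintosh's entire memory:
    session_bytes = 4.7 * 2 ** 30
    mac_1987_bytes = 1 * 2 ** 20        # 1 MB of RAM
    print(f"{session_bytes / mac_1987_bytes:,.0f}x the whole machine")

One editing session now spans roughly 4,800 times the memory of the machine the program debuted on.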

We laugh about it now, because what else can you do? We’ve normalized bloat as inevitability. But every gigabyte consumed without reason represents a little more arrogance from the developer and a little less control for the user.

The Lost Art of Efficiency

The programmers of the 90s didn’t have magic tools. They had limits, and those limits taught them craft. Efficient code wasn’t a stylistic choice. It was survival. You could see it in every line of C, every clever reuse of memory, every design built around precision rather than excess.

Today, we could do the same, but we don’t. We don’t need to. There’s no market reward for optimization. As long as the interface looks modern and there’s a little spinning thing for when you have to wait for more of it to load, it’s “good enough.”

But it’s not good enough. It’s wasteful, and it’s lazy. And worse, it’s dishonest – because deep down, any developer with a measure of self-respect knows software doesn’t need this much to function.

A Call for Respect

Efficiency isn’t nostalgia. It’s respect. Respect for the machine, for the user, for the craft. Somewhere in the rush to make software easier to build, we forgot that it’s supposed to serve the user, not occupy their hardware like a parasite.

Maybe it’s time to bring that respect back. Not because we can’t afford the gigabytes, but because we shouldn’t have to.

I propose a replacement for Moore’s law: Adobe’s law. It dictates that any sufficiently popular “heavyweight” piece of software will double its host memory requirement every three years. Here’s a really bad graph illustrating this on a logarithmic scale:

[Figure: Adobe’s law plotted on a logarithmic scale. Disclaimer: not real science.]
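
In formula form: if M0 is the footprint today, the requirement after t years is M0 × 2^(t/3). A throwaway sketch, starting from a hypothetical 5 GB suite:

    # "Adobe's law": host memory requirement doubles every three years.
    def adobes_law(base_gb, years):
        return base_gb * 2 ** (years / 3)

    for y in (0, 3, 6, 9, 12):
        print(f"year {y:2}: {adobes_law(5, y):5.1f} GB")

Twelve years out, the hypothetical 5 GB suite wants 80 GB. Not real science, but uncomfortably plausible.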


