There are a few generalizations in this article, but it mostly nails my thoughts on the current state of the IT industry.
Why can we watch 4K videos and play heavy games in hi-res on our new laptops, but Google Inbox takes 10-13 seconds to open an email that weighs a couple of MBs?
Why does Windows 10 take 30 minutes to update, when within that time frame I could flash a whole fresh Windows 10 ISO to an SSD drive 5 times?
Why do we have games that can draw hundreds of thousands of polygons on a screen in 16 ms, while most modern editors and IDEs need that same 16 ms just to draw a single character, consuming a comparable amount of RAM and CPU?
Why is writing code in IntelliJ today a much slower experience compared to writing code in vim/emacs on a 386 in the early 1990s? And don't tell me that autocompletion features justify the difference between an editor that takes 3 MB of RAM and one that takes 5 GB of RAM to edit the same project.
Why did Windows 95 take 30 MB of storage, but a vanilla installation of Android takes 6 GB?
Why does a keyboard app eat 150-200 MB of storage and account for 10-20% of the battery usage on many phones?
Why does a simple Electron-based todo/calendar app take 500 MB of storage?
Why do we insist on running everything in Docker containers that take minutes or hours to build, when most of those applications would run just fine on the underlying bare metal?
Why did we get to the point where the best way of shipping and running an app across multiple systems is to pack it into a container, a fat Electron bundle, or a Flatpak/Snap package - in other words, every app becomes its own mini-OS with its own filesystem and dependencies, each with its own installation of libc, GNU coreutils/busybox, Java, Python, Rust, Node.js, Spring, Django, Express and so on? Why did we decide to solve the problem of optimizing shared resources in a system by giving up on solving it? Just because we assume that it's always cheaper to add more storage and RAM?
Why does even a basic hello world Vue/React app install 200-300 MB of node_modules? What makes a hello world webapp 10x more complex than a whole Windows 95 installation?
We keep repeating "developer time is more expensive than computer time, so it's ok for an application to be dead inefficient if that saves a couple of days of engineering work", but I'd argue that even that no longer applies. I've spent the last couple of years working in companies where it takes hours (and sometimes days) to deliver a single change of 1-2 lines. All that time goes into huge pipelines that nobody understands in their entirety, compilation tasks that pull in GBs of dependencies just because a developer at some point wanted to try a new framework or flavour of programming in a module of 100 LoC, wasted electricity spent building and destroying dozens of containers just to run a test, and so on. While the pipelines do their obscure work, developers take long, expensive breaks browsing social media, playing games or watching videos, because often they can't do any other work in the meantime - so much for "optimizing for engineering costs".
How come nobody gets enraged at such an inefficient use of both computing and human resources?
Would you buy a car that can run at 1% (or less) of its potential performance, built with a process that used <10% of the available engineering resources? Then why do we routinely buy and use devices that take 10 seconds to open a simple todo app in 2023? No amount of splash screen animations can sugarcoat that bitter pill.
The thing is, we also know what's causing this problem.
As industries consolidate and monopolies/oligopolies form, businesses have fewer incentives to invest engineering resources in improving their products - or to take risks developing new products or features based on customer demand.
That creates a vicious cycle. Customers' expectations drop because they get used to sub-optimal solutions - that's all they know and all they've ever been offered. That drives businesses to take even fewer risks and enshittify their products even more, as they know they can get away with even more sub-optimal solutions without losing market share - folks will just buy a new phone or laptop when they realize that their hardware can no longer store more than 20 Electron apps, or when their browser can't keep more than 10 tabs open without swapping memory pages. That drives the bar further down. Businesses are incentivised to push out MVPs at a frantic pace and call them products - marketing and design tricks will cover the engineering gaps anyway. Moreover, companies now have one more incentive to enshittify their product: if the same software can no longer run on the same device, they make money out of the new hardware that people will be forced to buy (because, of course, you've made it hard to repair or replace components on their existing hardware). And the cycle repeats. Until you reach a point where progress isn't about getting new stuff, nor better versions of the existing stuff, but just about buying better hardware in order to do the same stuff we used to do 10-15 years ago.
Note, however, that it doesn't always have to be like this. The author brings up a good counter-example: gaming.
Gamers are definitely *not* ok with a new version of a game having a few more ms of latency than the previous one. They buy expensive hardware, and they expect the software running on it to make the best use of the available resources. As a result, gaming companies are pushed to release titles that draw more polygons on the screen than the previous ones, without requiring a 2-10x bump in resource requirements.
If the gaming industry didn't have such a demanding user base, I wouldn't be surprised if games in 2023 still looked pretty much like the 2D SNES games of the early 1990s, while using up 100-1000x more resources.
I guess the best solution to the decay that affects our industry would be for users of non-gaming software to develop expectations similar to those of gamers, and to simply walk away from products that can't deliver on them.
@blacklight
Great Points!
My first job as a programmer was on a CP/M computer that took ~2 seconds to boot from a 5.25-inch floppy disk, on a Z80 running at 4 MHz (~1 MIPS).
I've asked many people over the years: a modern computer executes billions of instructions per second... many billions... So what can it possibly be doing, when an Apple II got shit done at ~500,000 ops/sec? Counting cores and threads, that's at least five orders of magnitude more power.
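For what it's worth, that "five orders of magnitude" figure holds up to a quick back-of-the-envelope check. Here's a small Python sketch of the arithmetic; the modern-CPU numbers (core count, clock speed, instructions per cycle) are illustrative assumptions on my part, not measurements:

```python
import math

# Back-of-the-envelope comparison, not a benchmark.
# The modern-CPU figures below are assumed round numbers for illustration.

apple_ii_ops_per_sec = 500_000       # ~0.5 MIPS, as cited above

modern_cores = 8                     # assumed mid-range CPU
modern_clock_hz = 4_000_000_000      # assumed ~4 GHz clock
modern_ipc = 4                       # assumed instructions retired per cycle, per core

modern_ops_per_sec = modern_cores * modern_clock_hz * modern_ipc

ratio = modern_ops_per_sec / apple_ii_ops_per_sec
print(f"~{ratio:,.0f}x faster, i.e. roughly 10^{math.log10(ratio):.1f}")
# With these assumptions: ~256,000x, i.e. between 5 and 6 orders of magnitude.
```

Different assumptions shift the exact exponent a bit, but the gap stays enormous either way.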