Some software bloat is OK


In the era of fast CPUs, gigabytes of RAM, and terabytes of storage, software efficiency has become an overlooked concern. Many believe there are fewer reasons to optimize software today: processors are fast and memory is cheap. They claim we should focus on other things instead, such as developer productivity, maintainability, and fast prototyping. They often recall Donald Knuth's famous quote:

Premature optimization is the root of all evil.

How bad is software bloat nowadays?

Historically, computers had far less processing power and memory. CPUs and memory were expensive, so programmers were much more constrained by CPU speed and available memory, and a lot of work went into fitting a program into these limited resources. It's no surprise that in the 1970s-80s it was normal for programs to be written in very low-level languages such as machine code or assembly, which gave programmers ultimate control over every byte and every processor instruction.

Over time, memory and CPUs (and hardware in general) became cheaper. This relaxed the constraints and allowed the use of higher-level languages, which eventually led to the rise of garbage-collected languages (Java, C#, PHP, Python, JavaScript, etc.).

Software bloat progression

Sometimes when you compare the system requirements of older and newer generations of software, the contrast is shocking. Just compare the Windows 11 Calculator alone (let alone the full OS) with the whole Windows 95 OS! The Windows 11 Calculator consumes over 30MiB of RAM (possibly even an underestimate, since shared memory isn't counted), while Windows 95 could run with as little as 4MiB.

Windows 95
Windows 95 (the whole OS) required only 4MiB of RAM. Yes, it could be slow and swap a lot, but it would work.
Windows 11 Calculator.
The Windows 11 Calculator alone consumes more than 30MiB of RAM. It's unlikely to be doing anything so sophisticated that it needs 7-8 times the RAM Windows 95 required.

Here is another, more dramatic example. Super Mario Bros. was just 31KiB (or 40KiB, which doesn't change much) and used only 2KiB of RAM. Yet the high-quality (lossless, pixel-perfect) WebP image below is almost 54KiB. Larger than the game itself!

Super Mario Bros.
This image is almost 54KiB, larger than the game itself, which fit in less ROM and used only 2KiB of RAM!

A significant part of this bloat is actually a tradeoff

At first glance this seems insane; the developers probably don't care about efficiency, right? Well, it's more complicated than that, and a significant part of this bloat isn't caused by developers' incompetence or laziness.

Today's software solves problems that were less of a concern in the 1990s and before. Here are some examples:

Balance
There are other things that are just as important as, if not more important than, performance.

But yeah, a significant part of the bloat is also bad

That being said, a lot of this bloat is not a tradeoff but the result of incompetence (not only among developers) or laziness: using libraries or frameworks for trivial things, for example, or not understanding algorithmic complexity. Many websites are notoriously bloated with dozens (sometimes hundreds) of questionable dependencies, which not only degrade performance but also create security and maintenance problems. And nowadays AI is a multiplier of such problems.
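To make the "libraries for trivial things" point concrete, here is a hypothetical sketch (the dependency name is invented): a project pulls in a third-party padding package, plus its transitive dependencies, for something the standard library already does in one call.

```python
# Instead of:
#   from some_padding_lib import left_pad   # hypothetical extra dependency
# the standard library already covers this:

def left_pad(s: str, width: int, fill: str = " ") -> str:
    """Pad a string on the left using only built-in functionality."""
    return s.rjust(width, fill)

print(left_pad("42", 5, "0"))   # → "00042"
print(left_pad("abc", 5))       # → "  abc"
```

One tiny dependency rarely hurts on its own; the problem is the cumulative weight (and attack surface) of hundreds of them.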

Some of the issues also stem from software complexity. When software is large and complex, people tend to work on individual smaller pieces, and it's very common for developers not to understand how the entire system works. This, of course, can also lead to inefficiency and tech debt. Poorly thought-out or poorly structured and managed projects aggravate the problem by exacerbating burnout and lack of motivation, which often starts the vicious cycle of code rot.

Another source of bloat is over-engineering:

Finally, the obsession with containers is also concerning. Containers often increase startup time (which can be long even on SSDs), RAM usage, and CPU usage. Sadly, containers are appealing as an easy crutch for papering over compatibility and security issues in ordinary desktop apps (looking at you, Ubuntu Snap).

"Bottlenecks" still exist and are still optimized

Many programs still have small but critical areas of code that need some level of optimization. This might be a database query or a long-running function.
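As a sketch of how such bottlenecks are typically found before being optimized, here is a minimal Python example using the standard cProfile module; the function names are invented stand-ins for real application code.

```python
import cProfile
import io
import pstats

def slow_part(n):
    # Deliberately quadratic work, standing in for a real bottleneck.
    total = 0
    for i in range(n):
        for j in range(n):
            total += i * j
    return total

def fast_part(n):
    # Cheap work that the profiler will show as negligible.
    return sum(range(n))

def main():
    fast_part(10_000)
    slow_part(300)

profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

# Print the five most expensive calls by cumulative time.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```

The report shows slow_part dominating the runtime: that is the code worth optimizing, and the rest can stay simple.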

There is still high demand for optimized programs, or optimized parts of programs, and they won't disappear any time soon. Here is a small fraction of such software:

Such software will always exist; it has just moved into niches or become the lower-level "backbone" of higher-level software.

Too late optimization is also a source of evil

While premature optimization (sacrificing design and correctness for hypothetical gains) is harmful, delaying optimization for too long is also bad. You still need to choose the right algorithms and architecture early on, since wrong decisions might bite you later. The choice between O(N) and O(N²) algorithms usually matters even in the early stages.
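A minimal Python sketch of why that choice matters: computing the common elements of two lists with an O(N*M) scan versus an O(N+M) set lookup. The data sizes here are arbitrary; the gap only widens as they grow.

```python
import time

def common_items_quadratic(a, b):
    # O(N*M): the "x in b" test rescans list b for every element of a.
    return [x for x in a if x in b]

def common_items_linear(a, b):
    # O(N+M): build a set once, then each lookup is O(1) on average.
    b_set = set(b)
    return [x for x in a if x in b_set]

a = list(range(2_000))
b = list(range(1_000, 3_000))

t0 = time.perf_counter()
r1 = common_items_quadratic(a, b)
t1 = time.perf_counter()
r2 = common_items_linear(a, b)
t2 = time.perf_counter()

assert r1 == r2  # same answer, very different cost
print(f"quadratic: {t1 - t0:.4f}s, linear: {t2 - t1:.4f}s")
```

Swapping the data structure here is a five-minute decision when made early, and a painful refactor when the quadratic version is already woven through the codebase.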

Conclusion

Some bloat is actually OK and has benefits; without it we would not have so much innovation. Sure, we could make many things ultra-optimized and reduce many apps' sizes by a factor of 10-100, but most of the time it would be an unworthy exchange of developer time for a Pyrrhic victory. On the other hand, like most things, bloat becomes harmful when it is not kept in moderation, which we also see. And optimizing too late can be a problem as well.




