Software Bloat

20 years ago the most advanced games console I can think of was the PlayStation 2, which ran on a remarkable 32 MB of RAM (plus a small amount of other memory, video and sound RAM, totalling less than 10 MB). Today it is common to see a single program use that amount for an extraordinarily small task. This was a machine that played sound and full-motion video, accepted input, and did networking, all in a remarkably small amount of RAM.

How is it that hardware has become so sophisticated, yet software has grown so large that it requires all of it?

To put it into perspective, the cheapest machine I can find on Amazon right now has 2 GB of RAM. That's 64x the amount the PS2 had.
 
Gaming is not the best example...
Let's do some maths.

The PS2 can output an image of up to 1920x1080 pixels (1080i), with a colour depth of 8 bits per channel (red, green, blue, and alpha) and a 50 Hz refresh rate.

The PS5 can display a 4K image (3840x2160) with a colour depth of 10 bits per colour channel (alpha handling varies by format) and a refresh rate of up to 120 Hz.

What is the size of a single frame on the PS2 versus the PS5?
And of a second of video?

And this does not include the sound capabilities, ray tracing, and so on.

On my Mac SE (9" black-and-white screen), every game I played was distributed on a 1.44 MB floppy.
 
The most succinct take on this topic was already mentioned, but let's repeat it:

Hardware is cheap, Software is expensive.

Software is expensive because software development is complex (and yes, hardware development is complex as well, but in a much more limited domain. The problems you solve in software are much more varied).

So, if you can greatly cut down software development effort by using e.g. simplified languages, libraries, frameworks, and so on, and the price is you will need more RAM (for all that "hidden" complexity), that's a good deal.

Then, games are special, and development for a gaming console is even more special. Games (definitely 20 years ago) were relatively simple in structure: there's a limited number of things that can happen (or that the player can do) during gameplay.

Furthermore, if you write a game for a console, you find a special-purpose OS (if any) and well-defined hardware. There's no need for the abstraction layers you need with typical portable PC software. You never have to think about other processes; the only ones running are those your game needs. Directly accessing audio and video hardware isn't a problem at all because, again, you're "alone" on that machine and you know exactly which hardware is in use. Comparing the memory needed for this to programs running on a general-purpose multiuser/multitasking OS on a general-purpose machine is fundamentally flawed.

Finally, I think the "extraordinary small task" you talk about is also mistaken. Please compare such software to software on "desktop" machines 20 years ago. Back then, it wasn't uncommon for a crashing program to take down your whole system (yep, it was the time of e.g. Win 9x on your typical x86). UIs were "simple" but lacked a lot of functionality (and UX). Few things, if any, were configurable by the user. Sound was an exclusive resource. Well, the list goes on, depending on which software you actually look at...

So now, what's "software bloat"? Maybe the "excessive" use of libraries and frameworks. This does happen. I'm looking, for example, at Node and Electron. I don't like them for other reasons (portability isn't as nice as advertised, packaging software using them is a PITA), but I personally wouldn't mind the "wasted" memory if I can have a well-working application quickly, because the devs didn't have to "reinvent the wheel" over and over.

And finally, I just stumbled over this: https://v8.dev/blog/pointer-compression – it's IMHO an awesome example of how today(!), at a lower level, a lot of effort is still spent on optimization. Doing it there makes a lot of sense because a lot of software benefits from it immediately.
 
Software did not become bad.

I'd like to counter that... Today the approach to building a 'small' (in terms of functionality) program is, sadly, more often than not: "let's use this framework, which needs that ecosystem with this interpreter, drags in a few hundred libraries and dependencies, needs exactly *this* version of that graphical framework and exactly *that* version of this obscure library someone abandoned in 2005."
Programs that were a few MB in size 20 years ago are now 500+ MB behemoths, often under the guise of (non-existent) "cross-platform" support. Take, for example, all those fancy "desktop apps" (I hate that term) that are basically a complete browser plus something horrible like Electron, while the actual program logic and functionality is just a few KB of code and could easily have been built without all of that bloat.

Yes, this isn't as true or as extreme on all platforms, but it is *very* extreme on some (e.g. Android, which is a dumpster fire of bloat, bad code quality and horrible security practices).
It seems the craft of writing dedicated, optimized programs has nearly vanished, or at least been pushed back into a few ecosystems (of which the BSDs are luckily one). Often it has been replaced by cobbling together a bunch of libraries and frameworks that drag in tons of dependencies and are a nightmare to support and keep working over time.
 
Still, the specs of a Pi Zero would have looked high-end 20 years ago.
You could run Win2k/XP without trouble in 256 MB.
I ran MS Small Business Server (NT4) WITH Exchange on a K6 with 64 MB of RAM, serving 10-15 users.
Today a printer driver package is larger than Win2k(3), and it clearly does A LOT less.
 
I think some of you are taking my comparison with the consoles a little out of context. I used the PS2 because it was an example of a machine that did a lot with a small amount of memory, but consoles are still general-purpose computers. By all accounts the PS5 does memory optimisation better than most, although its storage use is sometimes questionable. A better question to ask is why I could play a game in 32 MB 20 years ago when I now need that much just to open a text editor... As for people saying the comparison isn't fair because modern machines are sophisticated multitasking ones: I do get that, but we also have massively parallel hardware compared to then.
 
Finally, I think the "extraordinary small task" you talk about is also mistaken. Please compare such software to software on "desktop" machines 20 years ago. Back then, it wasn't uncommon for a crashing program to take down your whole system (yep, it was the time of e.g. Win 9x on your typical x86). UIs were "simple" but lacked a lot of functionality (and UX). Few things, if any, were configurable by the user. Sound was an exclusive resource. Well, the list goes on, depending on which software you actually look at...

Still, the specs of a Pi Zero would have looked high-end 20 years ago.
You could run Win2k/XP without trouble in 256 MB.
I ran MS Small Business Server (NT4) WITH Exchange on a K6 with 64 MB of RAM, serving 10-15 users.
Today a printer driver package is larger than Win2k(3), and it clearly does A LOT less.
Exactly. The comparison with respect to graphics makes perfect sense: graphics have become more detailed. But what about simple UI sprites?

Also, I have highlighted a bit there. Is this not still the case? It is very common for a single program, whether Chrome or another application, to stall my entire system. Ironically, part of that is lack of memory, which people in this thread are telling me isn't much of an issue. Sure, I could get more, but couldn't programmers just be less wasteful?
 
I could get more, but couldn't programmers just be less wasteful?
No, because they don't get it, and they will fight tooth and nail against doing things better, because they don't understand.

It takes understanding the kind of algebra used to solve a calculus problem, not third-grade algebra. Many will say they get that, and fewer actually do than claim it, but even then they couldn't apply those concepts to anything else.

I've applied mathematical concepts in the workplace, even with physical objects, and people saw that what I did was better: it accomplished more work, faster, with the same effort. Someone else solved one problem a bit quicker, but I was already working that way; they just did one thing before I got to it. I also suspect the approach had crossed the boss's mind, but he didn't think it could work; once he saw it work, he understood it quickly and adopted it. I kept doing that problem-solving in my head, finding the best way to do more with the same effort, even though I could only do one part of a task at a time and other workers would change the setup by doing part of the work before I returned to finish. I was able to do this to a lesser, more basic extent at a previous workplace, before I took calculus.

The only way that will happen is by starting a new sub-project. Otherwise, the naysayers will keep saying, "No, you can't." It doesn't make a difference whether it's software or something else. We're beyond the technology of 1910 because people didn't listen to them.
 
BTW, as for comparing sizes, I just had a quick look at my latest tool, because:
  • It does one "simple" (but, at a lower level, not so simple) job and nothing else
  • It's written in plain C with no dependencies outside the POSIX/C standard libraries, so it probably can't be suspected of "bloat"
and it looks like this:
Code:
root@mail:~ # ls -lh /usr/local/bin/remusockd
-rwxr-xr-x  1 root  wheel    43K Oct  9 09:57 /usr/local/bin/remusockd
root@mail:~ # file /usr/local/bin/remusockd
/usr/local/bin/remusockd: ELF 64-bit LSB executable, x86-64, version 1 (FreeBSD), dynamically linked, interpreter /libexec/ld-elf.so.1, FreeBSD-style, stripped
root@mail:~ # ldd /usr/local/bin/remusockd
/usr/local/bin/remusockd:
    libthr.so.3 => /lib/libthr.so.3 (0x80024f000)
    libc.so.7 => /lib/libc.so.7 (0x80027c000)

So now I could say: damn, in 43K I could write a whole nice action game for the C64, with music and everything. And I would be right 😛 But my codebase uses a modular and flexible design, which e.g. helps prevent bugs, makes future changes easier, etc...

Yes, there's software out there that is "unnecessarily bloated". But there are also very valid reasons to use the resources you have at hand.
 
From what I have seen, it comes down to middleware.

Back in the day, a team would write their own (e.g.) occlusion-culling engine that did the bare minimum for their purposes. Or at least the off-the-shelf ones were minimal in features, to fit the hardware of the time.

Now you grab an off-the-shelf one and it has every bell and whistle, even if you don't use 99% of the features. Yes, the compiler can strip out unused code paths, but there is a lot it still can't.

Now consider that a typical game includes dozens of different types of middleware. It's even worse if you look at prosumer/hobby game engines like Unity: the fact that they stuck an entire .NET VM and runtime in there means a spinning-cube program is already 10 times the size of the PS2's total memory.

It is costly to "re-invent" the wheel. But sometimes you simply can't beat a wheel invented for a specific task.

These days, if I need a library, I often start with an old revision (from ~2005) and build up from there. I usually don't need to backport any security fixes because these always only address issues caused by the later bloat.
 
Just saw that Teams installs a local version of Edge and keeps the previous version around, so you end up with three Edge browsers on your company laptop. Plus Chrome. What the...
 
Software did not become bad. Memory just became cheap.
One word: Electron.

And aside from that, the underlying problem is that programmers got lazy. Those who started programming in the '70s and '80s, when memory was a scarce resource, needed to squeeze as much as possible out of the limited hardware. Today this is mostly a forgotten art, surviving only in a few niches like embedded systems.

Since memory is dirt cheap these days, programmers tend to treat it that way.
 
Today this is mostly a forgotten art, surviving only in a few niches like embedded systems.
Not sure I'd refer to embedded systems as a niche. As an embedded engineer, I might be biased, though.
But then again... it seems the "perceived definition" of embedded is changing too. Personally, I see more and more customers asking for embedded engineering, and then it turns out to be something based on a Raspberry Pi running a gazillion Python scripts.
 
And aside from that, the underlying problem is that programmers got lazy. Those who started programming in the '70s and '80s, when memory was a scarce resource, needed to squeeze as much as possible out of the limited hardware. Today this is mostly a forgotten art, surviving only in a few niches like embedded systems.
The thing is: would you prefer to pay someone for weeks of "optimizing" that they could instead spend actually satisfying business needs?
 