ELI5 - why do computers/operating systems become slow?

Out of curiosity, which browser do you use?

USB keyboards and mice follow a protocol, so I don't see how they wouldn't work.

I use www/luakit, but Firefox for banking, the tax office, etc., because those 'formal' sites often won't handle browsers other than the usual suspects. They apparently refuse to understand certain things.

When my keyboard broke down, I wanted a mechanical keyboard. On FreeBSD there was little written about them, and Linux sites only warned of problems. So I took my FBSD netbook with me to the store to try them out, and the Keychron K2 worked out of the box. Other models worked too, but had trouble with the Fn keys. For €140 a device should work, IMHO. And this K2 did.
 
Question: why is it that operating systems doing the same basic tasks become heavier on the hardware as time goes on?
My other observation is the ongoing 'caressability' of UI that consumes CPU, 'The Ultimate Unique User Experience®':

Technically, windows, dialogs and menus are simply there or not -- visibility on/off.

In modern-day operating systems, windows etc. always *appear* to the user. As if 'zoomed in from cyberspace', or aesthetically draping down like a curtain of menus. Compiz is the hyperbole of what most OSes/WMs do by default. Websites also use this 'hover = zoom in' effect. Consider all the extra code that has to be processed for that!
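A back-of-the-envelope sketch of that extra work, assuming a full-screen window, a 300 ms fade and a 60 Hz display (all the numbers are assumptions for illustration, not measurements):

```python
# Sketch: pixel work for an instant visibility toggle vs. an animated fade,
# assuming a 300 ms fade at 60 fps on a 1920x1080 display (illustrative only).
WIDTH, HEIGHT, FPS, FADE_MS = 1920, 1080, 60, 300

frames = FADE_MS * FPS // 1000        # 18 intermediate frames instead of 1
pixels_per_frame = WIDTH * HEIGHT     # every covered pixel recomposited per frame

print(f"toggle: {pixels_per_frame:,} pixel writes (one frame)")
print(f"fade:   {frames * pixels_per_frame:,} pixel writes across {frames} frames")
```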

Colors [1] are nice, and millions are more natural than monochrome or 16-bit. But all that extra code probably takes extra CPU, and seems useless unless you're working in photo editing. Even 'dark modes' are often not pitch black with all LEDs off on your display.
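The raw cost of depth is easy to estimate. A minimal sketch, assuming a 1920x1080 framebuffer (pure arithmetic, not measurements):

```python
# Sketch: bytes per frame at different colour depths on a 1920x1080 display.
WIDTH, HEIGHT = 1920, 1080

for name, bits in [("monochrome", 1), ("16-bit", 16), ("24-bit millions", 24)]:
    mib = WIDTH * HEIGHT * bits / 8 / 2**20   # bits -> bytes -> MiB
    print(f"{name:>16}: {mib:5.2f} MiB per frame")
```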

All that graphical trickery consumes time and CPU, but it makes regular people love their computers. Probably most hobby tweaking is about finding the better settings for eye candy. Other people prefer to read a good book (or man pages, if you like).

That's why a proper TTY is always fast. And distraction-free.

[1] US spelling of the word 'colour', because most OSes are of North American origin.
 
To add to the other answers, I think user perception is also often a factor.

I remember, maybe 15 years ago, visiting a friend to help set up a new computer her parents had bought her. The existing computer was fine but fairly slow; I can't remember why her parents had bought a new one... My friend didn't notice that her computer was slow, it just did what it did in the time it needed.

We set up her new computer and transferred some files across, turned off the old computer and put it to one side.
A few weeks later she realised there were still some files she needed on the old computer. We set it back up, she started to navigate around to find the files, and complained about how slow it was.

What had changed? Nothing on the old computer, but she had become accustomed to how zippy the new computer was.

Booting up and loading Firefox to watch a YouTube video or something takes a good 10 minutes before you are even looking at the YouTube home screen.
Sounds pretty awful, although I had almost forgotten that we used to turn on our computers in the morning and go and get tea/coffee while they booted.
That being said, that wasn't on SSDs or with an OS that was tuned to boot quickly...
 
We could add 'The Cloud' as a source of extra delay. At work, the whole lot is migrating to the cloud. Access time to software (nowadays called 'apps') is probably more CPU- and time-intensive than just reading its code from a local HDD or SSD.
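A rough sketch of why, assuming typical orders of magnitude for latency (the figures are assumptions, not benchmarks):

```python
# Sketch: local disk reads vs. cloud round trips for fetching 'app' resources.
# All latency figures are assumed orders of magnitude, not measurements.
SSD_READ_MS = 0.1      # a local SSD read
CLOUD_RTT_MS = 50.0    # one round trip to a remote data centre
REQUESTS = 100         # hypothetical number of resources fetched at startup

print(f"local SSD: {REQUESTS * SSD_READ_MS:7.1f} ms")
print(f"cloud:     {REQUESTS * CLOUD_RTT_MS:7.1f} ms")
```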

For obvious reasons on my own FreeBSD box all software and files are local. I even dislike IMAP.
 
We absolutely can!
On one past $DAYJOB, network delay meant loading a model into the special editor took three quarters of an hour. I sat there with a stopwatch. The "app designer" was sure it was not his problem, because on his side the thing loaded in seconds. Mind you, he was on the same subnet as the DB server. Add a firewall, a VPN and long-distance wires, and the delay became huge. I suggested maybe querying the DB for full rows instead of iterating through millions of IDs. Naaah, that would make it faster. Can't have that.
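For the record, the anti-pattern described above is the classic "N+1 query" problem: the per-query round trip, not the data volume, dominates over a slow link. A hypothetical sketch, using sqlite3 from Python's standard library as a stand-in for the real DB (table and column names are made up):

```python
# Sketch of the two access patterns. With a real DB behind a 50 ms link,
# the per-ID loop pays one round trip per row; the bulk query pays one total.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE model (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO model VALUES (?, ?)",
                 [(i, f"part-{i}") for i in range(10_000)])

# Slow pattern: one query (= one round trip) per ID.
ids = [row[0] for row in conn.execute("SELECT id FROM model ORDER BY id")]
parts_slow = [conn.execute("SELECT payload FROM model WHERE id = ?",
                           (i,)).fetchone()[0] for i in ids]

# Fast pattern: fetch the full rows in a single query.
parts_fast = [row[1] for row in
              conn.execute("SELECT id, payload FROM model ORDER BY id")]

assert parts_slow == parts_fast  # same data, wildly different round-trip count
```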
 

That is a good point, but I've never had this particular computer before. It's new to me. When running an operating system "of its time" it works perfectly well, but if I install a newer FreeBSD or Linux installation, it is basically unusable.

One thought might be that the newer operating systems might not have as good support for the older hardware. For example (and I'm not that technical, so bear with me), DDR5 has different memory management algorithms and optimisations than DDR4 and DDR3, so some applications on DDR5 hardware run slower than they would on DDR4 with a lower clock speed. Perhaps in this instance the older operating system has better support for DDR2 and the dual quad-core processors of this particular generation, along with some missing hardware features that the operating system either has to work around or go without?

Another factor might be the take-up of OpenGL in the operating system GUI. OpenGL shaders are handed over as source text and compiled by the driver at runtime. Pretty resource-intensive when you think about it.
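A minimal sketch of what that runtime compilation looks like, assuming the PyOpenGL and glfw packages are installed (a GL context must exist before any gl* call works):

```python
# Sketch: compiling a trivial shader at runtime via PyOpenGL.
import glfw
from OpenGL.GL import (
    glCreateShader, glShaderSource, glCompileShader,
    glGetShaderiv, glGetShaderInfoLog,
    GL_VERTEX_SHADER, GL_COMPILE_STATUS,
)

VERTEX_SRC = """
#version 120
void main() {
    gl_Position = gl_Vertex;  /* trivial pass-through shader */
}
"""

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)       # off-screen context only
window = glfw.create_window(64, 64, "ctx", None, None)
glfw.make_context_current(window)

shader = glCreateShader(GL_VERTEX_SHADER)
glShaderSource(shader, VERTEX_SRC)               # source handed over as text...
glCompileShader(shader)                          # ...and compiled by the driver, at runtime
if glGetShaderiv(shader, GL_COMPILE_STATUS) != 1:
    raise RuntimeError(glGetShaderInfoLog(shader))

glfw.terminate()
```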

My most responsive machine is actually a 2006 MacBook Pro running Snow Leopard. It's snappier than the fancy 12th-gen Intel laptop I have for work. The newer hardware comes into its own when I wake up too late and have to join a video call, open Visual Studio, fire up Chrome and Firefox, and rebuild a database concurrently.
 
Lots can be said about this subject, but maybe some of it is because today's engineers wouldn't understand how to cram a real system into 4 KB of RAM using a ROM BASIC interpreter and stuffing subroutines of machine code into arrays or string space.
Today's engineers can't understand, but somehow, there are still enough hackers who can... ?

Why do you think we have this chicken-and-egg game of computer security? :P My take is that this patching of holes in the basic design of OSes and the never-ending quest to find more holes is partly what contributes to OSes slowing down and requiring EPYC-grade power to do just basic stuff.
 
You do have a point there. The Spectre vulnerability, for example, can be exploited from JavaScript implementations, yet required Intel to make hardware changes? Sounds weird.
 
My work laptop is already at that point with all the updates it had to swallow.
That was exactly the moment I switched from Windows XP to Linux. The installed XP OS was 4 GB, and all the updates were another 4 GB. Removing them caused the laptop to crash, and with them it became terribly slow.
 