Software Bloat

What is bloat? A big program or a slow program? Speed does not depend on size, but on the
number of steps until the goal is reached. A single line of code like "10 GOTO 10" is bloat in speed, because
it never reaches a goal. A very big program may be very fast for every input. Today's bloat is both
in size and speed.
 
What is bloat? A big program or a slow program? Speed does not depend on size, but on the
number of steps until the goal is reached. A single line of code like "10 GOTO 10" is bloat in speed, because
it never reaches a goal. A very big program may be very fast for every input. Today's bloat is both
in size and speed.
I can forgive an increase in size if it trades off against performance (a smaller program is not necessarily a faster or more responsive one) or shortens development time. However, as someone above said, they have three versions of Edge. Why? The weight of modern-day dependency graphs is madness.
 
Back in the day a team would write their own, e.g., occlusion culling engine that did the bare minimum for their purpose. Or at least off-the-shelf ones were minimal in features, to fit the hardware of the time.
Yes, I worked there. We had "libcommon" or "liblocal" or "standard" that had all the things you need to write software, like lists, hashmaps, etc. They were all closed-source and had new and interesting bugs in them. If you were lucky there was a truly hair-raising library of concurrent programming abstractions.

The line between not-invented-here and dragging in the kitchen sink takes careful consideration, and many people are just not up to the task.

And aside from that, the underlying problem is that programmers got lazy.
Some people I highly respect consider laziness a virtue in programmers.
(EDIT: Wow. That URL went wrong!)

Ah yes, Python... "I learned programming with Python... this is so easy! Just import these 83 libraries and you can print 'hello world' on the screen with just 2 lines of code! Let's use it for everything!"
It's so much worse than that. Have you heard of Jupyter? Now you need a local web server and a browser to write (bad) Python.
 
The line between not-invented-here and dragging in the kitchen sink takes careful consideration, and many people are just not up to the task.
It helps to be aware there is actually a trade-off to consider.

And yes, home-grown libs implementing "basic" stuff that seems super simple (hashmaps – except when threads come to the table; or even stuff like a datetime type and its methods) are a huge red flag.
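A minimal sketch of why "just a hashmap" stops being simple once threads come to the table. The class and key names here are made up for illustration; the point is that `ConcurrentHashMap.merge` is atomic per key, which a naive home-grown table has to get right by itself:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentCount {
    public static void main(String[] args) throws InterruptedException {
        // ConcurrentHashMap.merge is atomic per key, so two threads can
        // increment the same counter without losing updates. The same loop
        // over a plain HashMap has no such guarantee: it usually comes up
        // short, and concurrent writes can corrupt the table outright.
        Map<String, Integer> hits = new ConcurrentHashMap<>();

        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                hits.merge("hits", 1, Integer::sum);
            }
        };
        Thread a = new Thread(work);
        Thread b = new Thread(work);
        a.start(); b.start();
        a.join();  b.join();

        System.out.println(hits.get("hits")); // prints 200000
    }
}
```

Getting that atomicity right in a home-grown library, without a global lock that kills throughput, is exactly the kind of "simple" problem that isn't.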

When looking for libs to use, we tend to look for a solution providing "just enough" for our problem. Unfortunately, you won't always find something.
 
When looking for libs to use, we tend to look for a solution providing "just enough" for our problem. Unfortunately, you won't always find something.
Agreed. This is exactly why, as mentioned, I start to look at older stuff. Yes, it may feel pretty weird at first, but ultimately, if a library has objectively gotten worse and regressed with cruft, it makes sense to use the version before the breakage.

The whole IT industry has tried to put fear into us that "old software is insecure". However, code is still code. Almost always, the less there is, the better.
 
And yes, home-grown libs implementing "basic" stuff that seems super simple (hashmaps – except when threads come to the table; or even stuff like a datetime type and its methods) are a huge red flag.
There weren't many options back in the day when the product was software. No way the lawyers were going to let you use any GPL stuff, the commercial libraries had their own new and interesting bugs, and were expensive to boot.

I think that was the original reason Java got so popular. It's a good language, but not spectacularly so. What was new and good was the rich core library it came with – especially back then, when everything had to be threaded and the POSIX threads libraries were a sick joke.
 
There weren't many options back in the day when the product was software. No way the lawyers were going to let you use any GPL stuff, the commercial libraries had their own new and interesting bugs, and were expensive to boot.

I think that was the original reason Java got so popular. It's a good language, but not spectacularly so. What was new and good was the rich core library it came with – especially back then, when everything had to be threaded and the POSIX threads libraries were a sick joke.
Yes, the core Java libraries are very good, and the implementations are solid. I also think they stay away from implementing the kitchen sink; there are some common variations of things, but on the whole they're quite generic. For example, there's no graph type, but you can implement one easily with a list.
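The "graph from a list" point can be sketched with nothing but core collections. The names here (`TinyGraph`, `addEdge`, `neighbours`) are my own for illustration, not anything from the JDK:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TinyGraph {
    // Adjacency list: each vertex maps to the list of its neighbours,
    // built entirely from the generic core collections.
    private final Map<String, List<String>> adj = new HashMap<>();

    void addEdge(String from, String to) {
        adj.computeIfAbsent(from, k -> new ArrayList<>()).add(to);
    }

    List<String> neighbours(String v) {
        return adj.getOrDefault(v, List.of());
    }

    public static void main(String[] args) {
        TinyGraph g = new TinyGraph();
        g.addEdge("a", "b");
        g.addEdge("a", "c");
        g.addEdge("b", "c");
        System.out.println(g.neighbours("a")); // prints [b, c]
    }
}
```

That the core library gives you generic maps and lists rather than a dedicated graph class is arguably the "no kitchen sink" restraint in action.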
 
Software bloat is irrelevant to me. Having said that, I always have new hardware and that hardware is almost always high end, because I can afford it. That’s not everyone’s situation and I understand that. I do see plenty of use cases where older hardware can be used so in those cases, using FOSS and a limited set of applications can work.

The web is terrible in my opinion. JavaScript is useful for devs and “can” make web applications more useful but can also ruin the experience and bring new hardware to its knees.
 
They don't get it, because they don't understand what bloat is or how much redundancy there is. They misattribute why there's redundancy, and there are excuses for multitudes of repeated redundancy. Even when code was cleaned up, those who never built or compiled anything will later falsely attribute the improvement to hardware. It's obvious from some replies.
 
They don't get it, because they don't understand what bloat is or how much redundancy there is. They misattribute why there's redundancy, and there are excuses for multitudes of repeated redundancy. It's obvious from some replies.
Please tell us about your experience in professional software development.
 
When a non-programmer actually pointed out redundancies, and they were fixed, cutting hours of compile time and making the programs run faster, that says everything about how flawed the thought process of many programmers is. It even shows that even good programmers didn't know where to start, or there was so much of it that they didn't realize it could be fixed so well.
 
So this excuses terrible programming, and piling on dependencies, while not understanding why. Actually, my example is programming experience anyway.

It's better to have people who understand math and know how to apply that to unrelated subjects than it is for those who don't understand math. Because math goes a long way, into every single topic.

And telling someone something like "you don't do this" or "you don't have this" is usually a cheap cop-out; there's no logic involved in it, just that they don't want to hear something that actually has a lot of validity.
 
I just don't see anything good in people talking about stuff they don't fully understand.
You don't realize it, but that refers to you. Programming, or anything else, without understanding math is inefficient. Of course you wouldn't understand: even if you do understand advanced math, your arguments show that you don't know how to apply it.
 
So only a programmer is able to tell that something runs better and compiles faster? It doesn't matter if someone else fixes something and makes it 10x better – that's irrelevant, because they're not a programmer?

You're on very thin ice, my friend.
Maybe you shouldn't be so argumentative in an antagonistic way.
 
Maybe you should get to the point. If you managed to improve something, show sources. This would "count" as an "experience".

Otherwise, don't tell me crap about maths. Studying computer science at a German university involves an awful lot of maths (more than you will ever need for the job, but that's fine, because universities are about science). And especially, don't tell me anything about software development, engineering, design, etc. That has been my profession for many, many years now. So?
 
My only mistake is not taking out a full ad in a newspaper.

If a tree falls and no one heard it, you would claim the tree never fell – even if it had an effect that was seamless. You may not know something happened, but to claim nothing ever happened is not a logical thought. You can't know everything that didn't happen, even when at least someone knows.

You also sound like you don't even apply what math you supposedly know.

Is a child not able to say, the leaning tower of Piza isn't leaning, because he's not an engineer?
 
tower of Piza
Hungry?


I just don't see anything good in people talking about stuff they don't fully understand.
As a programmer of various application software I can assure you that talking with the users of that software is very useful. Think of design flaws like implicit assumptions that turn out to be incomplete or even completely wrong. Users can tell you. It will usually lead to substantial improvements.
 
Hungry?

As a programmer of various application software I can assure you that talking with the users of that software is very useful. Think of design flaws like implicit assumptions that turn out to be incomplete or even completely wrong. Users can tell you. It will usually lead to substantial improvements.
That is usually the basis of good UI design so I very much agree.
 