AI bugs...

Meh. It's annoying because none of this is really "AI", imho. They decided that current neural networks were close enough to gen AI, and every company started hyping/lying about general intelligence being here, along with its capabilities.

That, and the whole deal with them scraping everyone's data, including stuff they don't have licenses to use, which is only more egregious given how companies like Meta, Microsoft, etc. all violently defend their own copyrights and licensing agreements.


That, and all the ML tooling is absolutely terrible; it's all glued-together proprietary binary blobs for GPUs and endless Python dependencies interfacing with C/C++ libraries in some of the most inefficient ways possible. 🤮

It's pretty crazy seeing the kind of projects/code being produced now as well. You'll see someone trying to get their embedded system to call a GPT API to handle OCR for numbers instead of just rolling their own solution or using the C standard library to do a tiny bit of compute work.

It's not even that hard anymore with 32-bit MCUs to do some SIMD-style work with uint8_t values packed into uint32_t matrices. Heck, I've seen packed booleans used to run decent perceptrons on resource-constrained hardware.
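To make that concrete, here's a rough sketch (my own, not from any particular project) of both tricks in plain C: a SWAR add that treats one uint32_t as four uint8_t lanes, and an XNOR-plus-popcount dot product of the kind a packed-boolean perceptron uses. It assumes a GCC/Clang-style toolchain for __builtin_popcount, and names like add_u8x4 and bin_dot are just placeholders.

```c
#include <stdint.h>

/* SWAR: four uint8_t lanes packed into one uint32_t. Masking off the
   top bit of each lane keeps carries from spilling into the next byte;
   each lane wraps modulo 256 on its own. */
static inline uint32_t add_u8x4(uint32_t a, uint32_t b)
{
    uint32_t lo = (a & 0x7F7F7F7Fu) + (b & 0x7F7F7F7Fu); /* add low 7 bits per lane */
    return lo ^ ((a ^ b) & 0x80808080u);                 /* patch the high bit back in */
}

/* Packed-boolean perceptron dot product: weights and inputs are +/-1,
   stored one bit per element. XNOR marks the positions where weight and
   input agree; popcount turns that into dot = 2 * matches - total_bits. */
static inline int32_t bin_dot(const uint32_t *w, const uint32_t *x, int words)
{
    int32_t matches = 0;
    for (int i = 0; i < words; ++i)
        matches += __builtin_popcount(~(w[i] ^ x[i])); /* GCC/Clang builtin */
    return 2 * matches - 32 * words;
}
```

The neuron output is then just the sign of bin_dot(w, x, words) plus a bias, or something along those lines. No floats, no GPU, no Python stack.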

So yeah I agree with USerID that the current state of the industry is for sure a bubble.
 
First there was Big Data. Corporations gathered a ton of data that could potentially be used to reshape current products and develop new ones. The world built new databases, new filesystems, new search algorithms, and a ton of new scaffolding to make Big Data work: the collection, the search, and the retrieval.

Then there was the Cloud, a sort of omnipresent Internet infrastructure that could swallow both commercial internet services and intranet networks. A lot of new network services and OS dev went into this.

These technologies never found any mass commercial application.
Good technology solves people's problems. Jobs (don't like him, but kudos) talked to customers and asked them what they wanted. That kind of approach.

These technologies were invented along the way. They're techno-infrastructure stuff, concerned with changing existing IT processes; there's nothing fundamentally new in them.

The best way to sell them both to the masses at the same time was slapping a natural language interface on top. There was IBM Watson, but in the form of a limited supercomputer, so innovations had to be made, and thus came the LLM.

The notion that somebody stumbled upon an AI/LLM 'discovery' and that we are now unearthing some artificial intelligence is insultingly dumb.
Both the Big Data and Cloud vendors had attractive rates in their first years; then, when they raised them to profitable levels, adoption of those services stalled.

When AI rates climb to their true market value, the output of both code and multimedia slop will fall off, but the AI vendors will move into the entertainment industry proper; they're signing contracts with Hollywood as we speak, so expect a new Beatles album soon. Of course, this stuff will initially be highly popular, then it will oversaturate and people will go back to normal stuff.

By that time the corps will have picked up the profits and moved on to corrupt the next area of technology and life, whether with AI/LLMs or something else by then.
 
Good technology solves people's problems. Jobs (don't like him, but kudos) talked to customers and asked them what they wanted. That kind of approach.
I remember reading an article, in Newsweek I think, about a reporter walking with Jobs in San Francisco before the iPhone came out. They walked into a store because he wanted to see what was out there. He picked up a couple of phones and fiddled with them, then said, "Bah!" and walked out. He despised how they worked and talked about how a user would actually use them.
 