AI as a form of lossy compression

I just read the HOWTO about local AI models, and I noticed that in only 28 GB of RAM an AI model covers many spoken and programming languages, algorithms, Stack Overflow answers, image recognition capabilities, and so on. That is roughly the size of all the articles of English Wikipedia, and already in compressed form.
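
For scale, a quick back-of-envelope sketch in Python (the parameter counts and quantization widths are illustrative assumptions, not the specs of any particular model):

    # Approximate model footprint = parameter count x bytes per parameter.
    def model_size_gb(params_billion: float, bits_per_param: int) -> float:
        return params_billion * 1e9 * bits_per_param / 8 / 1e9

    for params, bits in [(7, 4), (7, 8), (13, 16)]:
        print(f"{params}B params at {bits}-bit: ~{model_size_gb(params, bits):.1f} GB")
    # 7B at 4-bit:   ~3.5 GB
    # 7B at 8-bit:   ~7.0 GB
    # 13B at 16-bit: ~26.0 GB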

Yes, it hallucinates, but what it can store in 7-28 GB of space is incredible. If we treat hallucination as an artifact of lossy compression, the model is still enormously efficient. Compression, by the way, is a form of intelligence, because it rests on structure recognition and prediction.
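
That prediction-compression link can be made concrete: an ideal coder driven by a predictive model spends about -log2 p(symbol) bits per symbol, so whatever predicts the next symbol better also compresses the text into fewer bits. A minimal sketch (the adaptive bigram model and the coding_cost_bits helper are hypothetical stand-ins for any predictor, not how an LLM actually works):

    import math
    from collections import defaultdict

    def coding_cost_bits(text: str) -> float:
        # Ideal code length under an adaptive bigram model with add-one
        # smoothing: each character costs -log2 p(char | previous char).
        # An arithmetic coder would get within a few bits of this total.
        counts = defaultdict(lambda: defaultdict(int))
        totals = defaultdict(int)
        alphabet = sorted(set(text))  # simplification: alphabet known upfront
        bits, prev = 0.0, ""
        for ch in text:
            p = (counts[prev][ch] + 1) / (totals[prev] + len(alphabet))
            bits += -math.log2(p)
            counts[prev][ch] += 1
            totals[prev] += 1
            prev = ch
        return bits

    sample = "the cat sat on the mat. the cat sat on the mat."
    print(f"~{coding_cost_bits(sample):.0f} bits vs {8 * len(sample)} bits raw")

Swap in a stronger predictor and the bit count drops; that is the sense in which better prediction is better compression.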
 
Which says something about the quality and information density of "the internet".
There are ways to download Wikipedia and keep a local copy. I was astonished that it is only something like 30-40 GB (if I'm not mistaken). Since Wikipedia is maybe 95% of everything useful on the internet, that strikes me as a realistic figure.
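
As a loose sanity check on that figure (both numbers below are rough assumptions, not measured values):

    articles = 7_000_000   # order of magnitude for English Wikipedia
    avg_compressed_kb = 5  # assumed compressed text per article
    print(f"~{articles * avg_compressed_kb / 1e6:.0f} GB")  # ~35 GB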
 
Indeed, the guy who literally wrote the book on the formal link between compression and intelligence, Marcus Hutter, runs a prize for the best lossless compressor of a 1 GB snapshot of English Wikipedia, with a view to furthering AI research: http://prize.hutter1.net/ :)
 