I just read the HOWTO about local AI models, and I noted that in only 28 GB of RAM an AI model holds many spoken and programming languages, algorithms, Stack Overflow answers, image recognition capabilities, and so on. That is roughly the same size as all the articles of English Wikipedia in compressed form.
Yes, it hallucinates, but what it can store in 7-28 GB of space is incredible. If we consider hallucination a form of lossy compression, it is still enormously efficient. And compression is itself a form of intelligence, because it is based on structure recognition and prediction.
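To put rough numbers on the size claim, here is a quick back-of-the-envelope sketch. The parameter counts, bit widths, and the ~20 GiB Wikipedia figure are my own illustrative assumptions, not measurements from the HOWTO; the point is just that a model's on-disk size is roughly parameters times bits per weight:

```python
# Back-of-the-envelope model sizes. Parameter counts, bit widths,
# and the Wikipedia figure are rough illustrative assumptions.

GIB = 1024 ** 3

def model_size_gib(params: float, bits_per_weight: float) -> float:
    """Approximate on-disk size: parameters * bits per weight, in GiB."""
    return params * bits_per_weight / 8 / GIB

models = [
    ("7B model at 8-bit",  7e9,  8),
    ("13B model at 4-bit", 13e9, 4),
    ("70B model at 3-bit", 70e9, 3),
]

for name, params, bits in models:
    print(f"{name}: ~{model_size_gib(params, bits):.1f} GiB")

# For scale: a compressed English Wikipedia text dump is on the
# order of ~20 GiB (rough figure; varies by dump and compressor).
print("English Wikipedia (compressed text dump): ~20 GiB")
```

Running it gives roughly 6.5, 6.1, and 24.4 GiB, which is why quantized local models land right in that 7-28 GB range, comparable to the compressed encyclopedia.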