AI Hello World

Been playing last weekend with GPT-Neo-2.7B (2.7 billion parameters), a pre-trained model from EleutherAI (an open-source research collective). If you haven't tried any GPT (Generative Pre-trained Transformer) yet, you may be surprised by what it can do. GPT-Neo comes in three sizes; the 2.7B one is a 10GB download, plus another 5GB or so of installs for everything else.

Installing and running a Hello World of it with Python on Linux turned out to be trivial. Didn't try with FreeBSD; I don't have 15GB free in FreeBSD, sorry for the sudo apt.

Install
Bash:
sudo apt install python3 python3-pip
pip3 install torch torchvision torchaudio transformers
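
A quick sanity check that everything installed (just a sketch; your version numbers will differ):
Python:
import torch
import transformers

# If both imports succeed and print a version, the stack is ready
print("torch", torch.__version__)
print("transformers", transformers.__version__)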

Run from Python3
Python:
from transformers import pipeline

# Loading the pipeline pulls in the model weights on the first run
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B')

text_seed = "A horse, a frog and a giraffe walk into a bar."
ai_answer = generator(text_seed, max_length=128, do_sample=True, temperature=0.7)

# The pipeline returns a list of dicts, one per generated sequence
print(ai_answer[0]['generated_text'])

At the first run it will auto-download the 10GB model (by default into ~/.cache/huggingface/transformers; the smallest pretrained model that can be downloaded is gpt-neo-125M).
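
For a quick low-disk test, the same pipeline call works with that 125M variant (a sketch: the download is roughly 500MB, and the answers are noticeably less impressive):
Python:
from transformers import pipeline

# Same API as above, just a much smaller model
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-125M')

text_seed = "A horse, a frog and a giraffe walk into a bar."
print(generator(text_seed, max_length=64, do_sample=True, temperature=0.7)[0]['generated_text'])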

The generated answers can be quite surprising. To me it was scary good, uncanny and unsettling. :oops:

The same model with the 3 lines above was able to:
- generate plain English, so I tried to bamboozle it and put some French in the text_seed=" ... "
- and it spoke back in French! o_O
- I tried Romanian, and it answered in plain Romanian, too
- it even speaks "programming": I tried Python and it returned Python lines

If you've never done this before, you may want to give it a run and see for yourself. It's not the ultimate thing; this model is a free one from a couple of years ago, and it already has much bigger brothers, from both closed and open source implementations.




Some random answers from GPT-Neo-2.7B

text_seed = "A horse, a frog and a giraffe walk into a bar."
A horse, a frog and a giraffe walk into a bar. The frog says, “It’s been a while since I’ve had a drink, but I’ll have one.” The giraffe says, “I’m going to have one, too.” And the bartender says, “How about a drink for the whole group?” The horse says, “Are you going to pay for this?” And the bartender says, “No, it’s on me.”

text_seed = "La mécanisation de l'agriculture a changé la vie de nos villages."
L'économie est devenue stratosphère, la croissance économique était née de la mécanisation de l'agriculture et de l'élevage, les salaires augmentent et les coûts de production sont plus bas.

text_seed = "Unde dai şi unde crapă."
Unde dai şi unde crapă. În ciuda faptului că, în ciuda unui astfel de scop, ceva întâmplător se va întâmpla, cineva va întâlni, într-un post de televiziune, un politicien care-l va acuza pe un lider politic de corupţie, şi cineva va spune, în numele lui, ...

text_seed = "def palindrome(string): # returns the string backwards"
Python:
def palindrome(string):     # returns the string backwards in order
    for i, char in enumerate(string):
        if char != string[-i - 1]:
            return string[:i] + string[-i - 1]
    return string

Usage:
print(palindrome("I ate a piece of pie yesterday")[::-1])

Output:
I ate a piece of pie yesterday
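
Worth noting: the generated function doesn't actually return the string backwards (it bails out at the first mismatched pair of characters), and the claimed output doesn't match what the code does. Reversing a string in Python is just a negative-step slice; a trivial helper for comparison (the name backwards is mine, not the model's):
Python:
def backwards(string):
    # A [::-1] slice walks the string from the last character to the first
    return string[::-1]

print(backwards("I ate a piece of pie yesterday"))
# yadretsey eip fo eceip a eta I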
 
Eliza on steroids.

Look up one of the academic presentations about generating images. That stuff is amazing: you feed the AI program a sentence, and it returns an image matching the sentence.
 
Haven't read any papers yet, only fooled around with an online demo of 'DALL·E mini' that generates images from a text description: https://huggingface.co/spaces/dalle-mini/dalle-mini Results ranged from funny to nightmarish-looking pics. :)

For the record, Eliza's answers came from simple pattern-matching scripts, not from any online searches. While the canned text from Eliza and the generated text from GPT can look similar on the surface, PyTorch/Transformers/GPT-Neo also runs locally, no Internet required.

What I like the most about GPT generators is that they can generate code (for basic Python at least; generating C doesn't work that well). These code suggestions can be very helpful for someone like me, not a programmer by profession, yet having to write small pieces of code once in a while. It can spare a lot of Stack Overflow searches, all while running locally, without depending on remote third-party services that can go offline.
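
For example, seeding the generator from the Hello World above with a signature and a comment is usually enough to get a plausible completion (a sketch; the function name is made up and the completion changes every run):
Python:
# Reuses the 'generator' pipeline loaded in the Hello World above
code_seed = "def fahrenheit_to_celsius(f):\n    # convert Fahrenheit to Celsius\n"
suggestion = generator(code_seed, max_length=64, do_sample=True, temperature=0.2)

# A lower temperature keeps the completion closer to plain, predictable code
print(suggestion[0]['generated_text'])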
 
It throws an error: "Unable to connect to the model. Please try again."

 
Concerning the UNIX Philosophy's first rule by Mike Gancarz ("small is beautiful"): someone posts 1443x847 pixels to fill approx. 78% of the screen with de facto emptiness. I wonder if NI (natural intelligence) has already fallen behind AI. 😁
 
The answer varies a lot from one run to another, and a single word is too small a hint to generate something related. Since it didn't run for you, here's one of the answers:

Result

FreeBSD 11.0-RELEASE Now Available -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 The FreeBSD 11.0-RELEASE release is now available via: ftp://ftp.FreeBSD.org/pub/FreeBSD/ ############################################################################ FreeBSD 11.0-RELEASE - 496 packages available ############################################################################ The release notes for FreeBSD 11.0-RELEASE are available from the ftp://ftp.FreeBSD.org/pub/FreeBSD/releases/11.0/relnotes/. ISO images for the CD-ROM and
 