Been playing last weekend with GPT-Neo-2.7B (2.7 billion parameters), a pre-trained model from EleutherAI (an open-source AI research collective). If you haven't tried any GPT (Generative Pre-trained Transformer) yet, you may be surprised by what it can do. GPT-Neo comes in three sizes; the 2.7B model is a 10 GB download, and the rest of the install (PyTorch and friends) adds another 5 GB or so.
Installing and running a Hello World of it with Python on Linux turned out to be trivial. I didn't try FreeBSD; I don't have 15 GB free there.
Install
Bash:
sudo apt install python3 python3-pip
pip3 install torch torchvision torchaudio transformers
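A quick sanity check that the packages are importable before downloading anything big (both torch and transformers expose a __version__):
Bash:
python3 -c "import torch, transformers; print(torch.__version__, transformers.__version__)"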
Run from Python3
Python:
from transformers import pipeline
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B')
text_seed = "A horse, a frog and a giraffe walk into a bar."
ai_answer = generator(text_seed, max_length=128, do_sample=True, temperature=0.7)
print( ai_answer[0]['generated_text'] )
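A note on the parameters: max_length caps the total length (prompt plus generated text) in tokens, do_sample=True switches from greedy decoding to random sampling, and temperature=0.7 controls how adventurous that sampling is; lower values give more predictable text, higher values give wilder output.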
On the first run it will auto-download the 10 GB model (by default into ~/.cache/huggingface/transformers); the smallest pretrained model you can pick instead is gpt-neo-125M.
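If the 10 GB download is too much for a first try, the same code runs with the small checkpoint; a minimal sketch, the only change being the model name:
Python:
from transformers import pipeline

# Same text-generation pipeline, only with the much smaller gpt-neo-125M checkpoint
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-125M')

text_seed = "A horse, a frog and a giraffe walk into a bar."
ai_answer = generator(text_seed, max_length=128, do_sample=True, temperature=0.7)
print(ai_answer[0]['generated_text'])

The answers from the smaller models are noticeably weaker, but it's enough to verify that everything works.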
The generated answers can be quite surprising. To me it was scary good, uncanny and unsettling.

The same model, with the 3 lines above, was able to:
- generate plain English, so I tried to bamboozle it and put some French in the text_seed=" ... "
- and it spoke back in French!
- I tried Romanian, and it answered in plain Romanian, too
- it even "speaks" programming: I tried Python and it returned Python lines
If you have never done this before, you may want to give it a run and see for yourself. It's not the ultimate thing: this model is free and a couple of years old, and it already has much bigger brothers, both closed- and open-source.
Some random answers from GPT-Neo-2.7B
text_seed = "A horse, a frog and a giraffe walk into a bar."
A horse, a frog and a giraffe walk into a bar. The frog says, “It’s been a while since I’ve had a drink, but I’ll have one.” The giraffe says, “I’m going to have one, too.” And the bartender says, “How about a drink for the whole group?” The horse says, “Are you going to pay for this?” And the bartender says, “No, it’s on me.”
text_seed = "La mécanisation de l'agriculture a changé la vie de nos villages."
L'économie est devenue stratosphère, la croissance économique était née de la mécanisation de l'agriculture et de l'élevage, les salaires augmentent et les coûts de production sont plus bas.
text_seed = "Unde dai şi unde crapă."
Unde dai şi unde crapă. În ciuda faptului că, în ciuda unui astfel de scop, ceva întâmplător se va întâmpla, cineva va întâlni, într-un post de televiziune, un politicien care-l va acuza pe un lider politic de corupţie, şi cineva va spune, în numele lui, ...
text_seed = "def palindrome(string): # returns the string backwards"
Python:
def palindrome(string): # returns the string backwards in order
    for i, char in enumerate(string):
        if char != string[-i - 1]:
            return string[:i] + string[-i - 1]
    return string
Usage:
print(palindrome("I ate a piece of pie yesterday")[::-1])
Output:
I ate a piece of pie yesterday
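For comparison (not part of the model's output), the conventional way to return a string backwards in Python is just a slice; a minimal sketch:
Python:
def palindrome(string):  # returns the string backwards
    return string[::-1]

print(palindrome("hello"))                 # olleh
print("racecar" == palindrome("racecar"))  # True: a real palindrome reads the same backwards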