You've seen openai/gpt-2.
You've seen karpathy/minGPT.
You've even seen karpathy/nanoGPT!
But have you seen picoGPT??!?
picoGPT is an unnecessarily tiny and minimal implementation of GPT-2 in plain NumPy. The entire forward pass is 40 lines of code.
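For a flavor of what "plain NumPy" means here, a forward pass like this leans on a handful of small building blocks. This is a sketch of the usual GPT-2 versions of them, not necessarily picoGPT's exact code:

```python
import numpy as np

def gelu(x):
    # Tanh approximation of GELU, the activation GPT-2 uses.
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def softmax(x):
    # Subtract the row max for numerical stability before exponentiating.
    exp_x = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return exp_x / np.sum(exp_x, axis=-1, keepdims=True)

def layer_norm(x, g, b, eps=1e-5):
    # Normalize over the last axis, then scale by g and shift by b.
    mean = np.mean(x, axis=-1, keepdims=True)
    var = np.var(x, axis=-1, keepdims=True)
    return g * (x - mean) / np.sqrt(var + eps) + b
```

Stack these with a few matrix multiplies for attention and the MLP blocks, and 40 lines is genuinely enough.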
So is picoGPT:
- Fast? ❌ Nah, picoGPT is megaSLOW 🐌
- Training code? ❌ Error, 4️⃣0️⃣4️⃣ not found
- Batch inference? ❌ picoGPT is civilized, one prompt at a time only
- Support top-p? ❌ top-k? ❌ temperature? ❌ categorical sampling?! ❌ greedy? ✅
- Readable? 🤔 Well actually, it's not too horrible on the eyes!
- Smol??? ✅✅✅✅✅✅ YESS!!! TEENIE TINY in fact 🤏
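Greedy decoding is the simplest possible sampler: at every step, take the token with the highest logit. Here's a hedged sketch of greedy versus the temperature/categorical sampling picoGPT skips, using made-up logits over a toy 5-token vocabulary:

```python
import numpy as np

# Hypothetical logits over a 5-token vocabulary (illustrative values only).
logits = np.array([1.0, 3.2, 0.1, 2.7, -1.0])

# Greedy decoding: always pick the highest-logit token. No randomness, no knobs.
greedy_token = int(np.argmax(logits))

# Temperature + categorical sampling, shown for contrast.
def sample(logits, temperature=1.0, seed=0):
    rng = np.random.default_rng(seed)
    scaled = logits / temperature
    probs = np.exp(scaled - np.max(scaled))  # stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))
```

Greedy deterministically picks index 1 here; the sampler can pick any token, weighted by its probability.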
```shell
pip install -r requirements.txt
```

If you're using an M1 Macbook, you'll need to replace `tensorflow` with `tensorflow-macos`.

Tested on Python 3.9.10.
```shell
python gpt2.py "Alan Turing theorized that computers would one day become"
```

Which generates:

```text
the most powerful machines on the planet.

The computer is a machine that can perform complex calculations, and it can perform these calculations in a way that is very similar to the human brain.
```
You can also control the number of tokens to generate, the model size (one of `["124M", "355M", "774M", "1558M"]`), and the directory to save the models:
```shell
python gpt2.py \
    "Alan Turing theorized that computers would one day become" \
    --n_tokens_to_generate 40 \
    --model_size "124M" \
    --models_dir "models"
```
`gpt2_pico.py` is the same as `gpt2.py`, but with even fewer lines of code (comments and extra whitespace removed, and certain operations combined into single lines). Why? Because why not.