Comments (3)
Thanks for the note, but I think my figure is correct. The best source for this would be the original GPT-2 model implementation, which you can find here:
https://github.com/openai/gpt-2/blob/9b63575ef42771a015060c964af2c3da4cf7c8ab/src/model.py#L123-L130
def block(x, scope, *, past, hparams):
    with tf.variable_scope(scope):
        nx = x.shape[-1].value
        a, present = attn(norm(x, 'ln_1'), 'attn', nx, past=past, hparams=hparams)
        x = x + a
        m = mlp(norm(x, 'ln_2'), 'mlp', nx*4, hparams=hparams)
        x = x + m
        return x, present
Unless I am reading this incorrectly, the shortcut path starts before, not after, the LayerNorm: `norm(x, ...)` is applied only on the sublayer branch, while the unnormalized `x` is carried through to the addition.
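To make the placement of the shortcut explicit, here is a minimal NumPy sketch (my own illustration, not the official code; `layer_norm` omits the learned gain and bias for brevity) contrasting the Pre-LN block used by GPT-2 with the original Post-LN arrangement:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize over the last axis (learned scale/shift omitted for brevity)."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def pre_ln_block(x, attn, mlp):
    """GPT-2 style: the shortcut branches off *before* LayerNorm."""
    x = x + attn(layer_norm(x))  # shortcut carries raw x; LN only on the branch
    x = x + mlp(layer_norm(x))
    return x

def post_ln_block(x, attn, mlp):
    """Original Transformer style: LayerNorm is applied *after* the addition."""
    x = layer_norm(x + attn(x))
    x = layer_norm(x + mlp(x))
    return x
```

A quick way to see the difference: if both sublayers output zeros, a Pre-LN block is exactly the identity (the raw input flows straight through the shortcut), whereas a Post-LN block still normalizes its input.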
from llms-from-scratch.
Yes, I think you're right, including about the official code. For Pre-LN, the residual connection starts with the shortcut before the layer normalization step, which is nicely illustrated in your figure and in the standard illustration of Pre-LN on the far right side.
Apologies: I have found only a few graphics illustrating the GPT-2 architecture, and it seems like all of them were incorrect about the Pre-LN step (even on Wikipedia, surprisingly).
Thanks again; this issue can be closed then 🙂
Nice, I am glad that it all makes sense now (and also looks correct haha). Thanks for raising this issue though, it's always good to have multiple sets of eyes on these things!