
era_v1_session21

Bigram Language Model

Training GPT model from scratch

Libraries used:
- torch
- numpy
- pandas

Set up:
- Download the data source: !wget https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt
- Split the text into training and validation data
- Train a BigramLanguageModel architecture that uses an attention mechanism
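
The data setup above can be sketched as follows. This is a minimal, hedged sketch (the notebook may differ in details): it assumes a character-level vocabulary and a 90/10 train/validation split, as in Karpathy's lecture, and falls back to a tiny sample string so it runs even before the wget step.

```python
import os
import torch

# Read the Shakespeare corpus downloaded by the wget step; fall back to
# a tiny sample so this sketch runs standalone.
if os.path.exists("input.txt"):
    with open("input.txt", encoding="utf-8") as f:
        text = f.read()
else:
    text = "First Citizen:\nBefore we proceed any further, hear me speak.\n"

# Character-level vocabulary: map each unique character to an integer.
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
itos = {i: ch for ch, i in stoi.items()}
encode = lambda s: [stoi[c] for c in s]          # string -> list of ints
decode = lambda ids: "".join(itos[i] for i in ids)  # list of ints -> string

# Encode the whole corpus, then split 90/10 into train and validation.
data = torch.tensor(encode(text), dtype=torch.long)
n = int(0.9 * len(data))
train_data, val_data = data[:n], data[n:]
```

Round-tripping a string through `encode`/`decode` is a quick sanity check that the vocabulary covers the text.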


File structure:
- README.md
- S21_GPTFromScratch_Andrej_v1.ipynb

Steps:

The notebook S21_GPTFromScratch_Andrej_v1.ipynb is self-explanatory: simply run the cells in sequence, whether in Colab or any other environment.

Some background:

Notes:
- Attention is a **communication mechanism**. It can be seen as nodes in a directed graph looking at each other and aggregating information with a weighted sum from all nodes that point to them, with data-dependent weights.
- There is no notion of space. Attention simply acts over a set of vectors. This is why we need to positionally encode tokens.
- Each example across the batch dimension is processed completely independently; examples never "talk" to each other.
- For an "encoder" attention block, just delete the single line that does masking with `tril`, allowing all tokens to communicate. The block here is called a "decoder" attention block because it has triangular masking, and it is usually used in autoregressive settings such as language modeling.
- "Self-attention" just means that the keys and values are produced from the same source as the queries. In "cross-attention", the queries are still produced from x, but the keys and values come from some other, external source (e.g. an encoder module).
- "Scaled" attention additionally divides `wei` by sqrt(head_size). This makes it so that when the inputs Q and K are unit variance, `wei` will be unit variance too, and the softmax will stay diffuse and not saturate too much.
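
The notes above can be illustrated with a single self-attention head. This is a minimal sketch, not the notebook's exact code; the names (`wei`, `tril`, `head_size`) and dimensions follow Karpathy's lecture conventions and are assumptions here. It shows the 1/sqrt(head_size) scaling and the triangular mask that makes this a "decoder" block.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
B, T, C = 4, 8, 32        # batch, time (sequence length), channels
head_size = 16

x = torch.randn(B, T, C)
key   = torch.nn.Linear(C, head_size, bias=False)
query = torch.nn.Linear(C, head_size, bias=False)
value = torch.nn.Linear(C, head_size, bias=False)

k, q, v = key(x), query(x), value(x)  # each (B, T, head_size)

# "Scaled" attention: divide by sqrt(head_size) so that `wei` stays
# roughly unit variance and the softmax does not saturate.
wei = q @ k.transpose(-2, -1) * head_size**-0.5   # (B, T, T)

# Triangular masking makes this a *decoder* block: token t can only
# attend to positions <= t. Deleting the masked_fill line would allow
# all tokens to communicate, turning it into an *encoder* block.
tril = torch.tril(torch.ones(T, T))
wei = wei.masked_fill(tril == 0, float("-inf"))
wei = F.softmax(wei, dim=-1)

# Weighted aggregation of values: each row of `wei` sums to 1.
out = wei @ v   # (B, T, head_size)
```

Note that each batch element is processed independently: `wei` has shape (B, T, T), so information only flows between time steps within the same example.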

Acknowledgements: This is based on Andrej Karpathy's GPT-from-scratch tutorial and his accompanying YouTube video.

Collaborators: GK Divya and Pankaja Shankar. For any questions or comments, please drop a line.
