sol_87yf6cwp / LLMs-from-scratch (mirror of https://github.com/br8bit/LLMs-from-scratch.git)
ch03

Latest commit: rasbt, 1183fd7837, "add dropout scaling note", 1 year ago

| Name | Last commit | Commit message | Age |
| --- | --- | --- | --- |
| 01_main-chapter-code | 1183fd7837 | add dropout scaling note | 1 year ago |
| 02_bonus_efficient-multihead-attention | 5ff72c2850 | fixed typos (#414) | 1 year ago |
| 03_understanding-buffers | f5a003744e | Update README.md | 1 year ago |
| README.md | b6c4b2f9f1 | Update bonus section formatting (#400) | 1 year ago |
# Chapter 3: Coding Attention Mechanisms

## Main Chapter Code

- 01_main-chapter-code contains the main chapter code

## Bonus Materials

- 02_bonus_efficient-multihead-attention implements and compares several variants of multi-head attention
- 03_understanding-buffers explains the idea behind PyTorch buffers, which are used to implement the causal attention mechanism in Chapter 3
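To give a flavor of what "implementation variants" means here, the following is a minimal sketch (not the repository's code) of two mathematically equivalent multi-head attention implementations: a per-head Python loop, and a batched version that reshapes all heads at once and calls PyTorch's fused `scaled_dot_product_attention` (available in PyTorch 2.0+). The shapes, weight names, and dimensions are illustrative assumptions.

```python
import torch

torch.manual_seed(0)
B, T, d_model, n_heads = 2, 6, 16, 4   # illustrative sizes, not the book's
d_head = d_model // n_heads
x = torch.randn(B, T, d_model)

# Shared projection weights so both variants compute the same function.
Wq, Wk, Wv = (torch.randn(d_model, d_model) for _ in range(3))

def mha_loop(x):
    """Variant 1: one scaled-dot-product attention computation per head."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    outs = []
    for h in range(n_heads):
        sl = slice(h * d_head, (h + 1) * d_head)
        qh, kh, vh = q[..., sl], k[..., sl], v[..., sl]
        scores = qh @ kh.transpose(-2, -1) / d_head ** 0.5
        outs.append(torch.softmax(scores, dim=-1) @ vh)
    return torch.cat(outs, dim=-1)

def mha_batched(x):
    """Variant 2: all heads at once via reshape/transpose and a fused kernel."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    def split(t):  # (B, T, d_model) -> (B, n_heads, T, d_head)
        return t.view(B, T, n_heads, d_head).transpose(1, 2)
    out = torch.nn.functional.scaled_dot_product_attention(split(q), split(k), split(v))
    return out.transpose(1, 2).reshape(B, T, d_model)

print(torch.allclose(mha_loop(x), mha_batched(x), atol=1e-5))
```

The batched variant avoids the Python-level loop and lets PyTorch dispatch to an optimized attention kernel, which is the kind of trade-off the bonus folder measures.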
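The buffer idea can be sketched as follows. This is a hypothetical minimal module, not the repository's code: a causal mask stored with `register_buffer`, so it moves with `.to(device)` and appears in `state_dict`, yet is not a trainable parameter.

```python
import torch
import torch.nn as nn

class CausalMask(nn.Module):
    """Hypothetical example module: a non-trainable causal mask as a buffer."""

    def __init__(self, context_length):
        super().__init__()
        # Ones above the diagonal mark "future" positions to be masked out.
        self.register_buffer(
            "mask", torch.triu(torch.ones(context_length, context_length), diagonal=1)
        )

    def forward(self, attn_scores):
        T = attn_scores.shape[-1]
        # Buffers are plain tensors: index them like any tensor; no gradients.
        return attn_scores.masked_fill(self.mask.bool()[:T, :T], float("-inf"))

m = CausalMask(4)
print("mask" in dict(m.named_buffers()))   # registered as a buffer
print(len(list(m.parameters())) == 0)      # not a parameter
w = torch.softmax(m(torch.zeros(4, 4)), dim=-1)
print(w[0])                                # first token attends only to itself
```

Because the mask is a buffer rather than a parameter, the optimizer never touches it, but `model.to("cuda")` and checkpoint saving handle it automatically.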