
Introduction
Understanding Language Models: Language as a Bag-of-Words
Understanding Language Models: (Word) Embeddings
Understanding Language Models: Encoding and Decoding Context with Attention
Understanding Language Models: Transformers
Tokenizers
Architectural Overview
The Transformer Block
Self-Attention
Model Example
Recent Improvements
Mixture of Experts (MoE)
Conclusion
Appendix – Tips, Help, and Download