Attention in Transformers: Concepts and Code in PyTorch
Understand and implement the attention mechanism, a key element of transformer-based LLMs, using PyTorch.
StatQuest
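The attention mechanism named in this course can be sketched as scaled dot-product attention: each query is compared against all keys, the scaled scores are normalized with a softmax, and the resulting weights mix the values. This is a minimal NumPy sketch for illustration only (the course itself works in PyTorch); all variable names here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # scores = Q K^T / sqrt(d_k); weights = softmax(scores); output = weights V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 4 tokens, 8-dimensional query/key/value vectors.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

The output has the same shape as `V`, and each row of `w` is a probability distribution over the input tokens.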
Improving Accuracy of LLM Applications
Systematically improve the accuracy of LLM applications with evaluation, prompting, and memory tuning.
Lamini, Meta
How Transformer LLMs Work
Understand the transformer architecture that powers LLMs to use them more effectively.
Jay Alammar, Maarten Grootendorst