Course · Intermediate · 3h7m

Transformers in Practice

Instructor: Sharon Zhou


Earn a certificate with PRO

  • Intermediate
  • 3h7m
  • 19 Video Lessons
  • 8 Code Examples
  • 6 Graded Assignments (PRO)
  • Earn a certificate with PRO
  • Instructor: Sharon Zhou
  • AMD

Understand what's actually happening inside your LLMs

  • Understand text generation: see how transformers produce output one token at a time, and why that explains so much about their behavior.

  • Look inside the model: build intuition for what attention is really doing, how positional encoding works, and how layers combine to make predictions.

  • Optimize for production: learn how quantization, KV caching, and flash attention help transformers run efficiently on GPUs.
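Positional encoding, mentioned above, can be made concrete with a short sketch. Below is a minimal NumPy implementation of the classic sinusoidal scheme from the original transformer paper; the function name, an even `d_model`, and the toy shapes are illustrative assumptions, not material from the course:

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """Sinusoidal positional encoding (assumes d_model is even).

    Each position gets a unique pattern of sines and cosines at
    different frequencies, so the model can recover token order.
    """
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model/2)
    angles = pos / (10000 ** (2 * i / d_model))  # frequencies fall with i
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims: sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims: cosine
    return pe

pe = sinusoidal_positions(seq_len=8, d_model=16)
```

Because the encodings are fixed functions of position, they can be added to token embeddings without any learned parameters.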

Why Enroll

If you’ve worked with LLMs, you’ve probably run into slow inference, out-of-memory errors, or hallucinations you couldn’t explain. There’s no shortage of resources on how transformers work, but most of them either ask you to build one from scratch or get lost in theory that doesn’t connect to the problems you’re actually facing.

Transformers in Practice is different. Taught by Sharon Zhou, VP of Engineering & AI at AMD, this course gives you a complete practical view of how transformers work, from how they generate text to what’s happening inside the model to how it all gets optimized to run on real hardware. Interactive visualizations throughout let you see key concepts in action and build intuition that actually sticks.

Here’s what you’ll learn:

  • Model Behavior: You’ll learn how LLMs generate text through an autoregressive loop, selecting one token at a time from a probability distribution. You’ll see how sampling parameters like temperature shape the output, why hallucinations happen, and how techniques like RAG, constrained generation, and chain-of-thought reasoning all work within this same loop.
  • Model Architecture and Attention: You’ll look inside the transformer to understand what attention is really doing, how positional encoding tracks token order, and how multiple layers and attention heads work together to turn an input sequence into a next-token prediction.
  • Scaling and Deployment: You’ll learn why GPUs are well suited to transformer inference and where the real bottlenecks are. You’ll build practical intuition for quantization, KV caching, flash attention, and speculative decoding, including the tradeoffs each one introduces for cost, speed, and output quality.
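The autoregressive loop and temperature sampling described above can be sketched in a few lines. This is a toy illustration with a hand-picked logits vector standing in for a model's output; the function name and vocabulary are assumptions, not the course's code:

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Pick one token id from a logits vector, one step of the loop."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Temperature rescales logits before softmax:
    # <1 sharpens the distribution, >1 flattens it.
    scaled = logits / temperature
    scaled -= scaled.max()                       # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return int(rng.choice(len(probs), p=probs))

# Toy "model" output: fixed logits over a 5-token vocabulary.
logits = np.array([2.0, 1.0, 0.5, 0.1, -1.0])
greedy = int(np.argmax(logits))                  # the temperature -> 0 limit
sampled = sample_next_token(logits, temperature=1.5)
```

In a real model this function would be called once per generated token, with the chosen token appended to the input before the next forward pass.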
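Attention itself reduces to a compact formula, softmax(QKᵀ/√d)V, which a short NumPy sketch can make concrete. The random Q, K, V matrices below are stand-ins for learned projections of token embeddings; nothing here is taken from the course materials:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True) # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights                  # weighted mix of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))   # 3 query tokens, dim 8
K = rng.standard_normal((5, 8))   # 5 key tokens
V = rng.standard_normal((5, 8))
out, weights = attention(Q, K, V)
```

Each output row is a weighted average of the value vectors, with weights determined by how strongly each query matches each key; multi-head attention runs several of these in parallel on different projections.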
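KV caching, one of the optimizations listed above, can also be sketched: at each decode step only the newest token's key and value are computed and appended, and attention for the new query runs over the cached history instead of reprocessing the whole prefix. The toy projection matrices and dimensions below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
# Hypothetical per-layer projection matrices (random stand-ins).
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

cache_k, cache_v = [], []
for step in range(3):
    x = rng.standard_normal(d)      # embedding of the newest token
    cache_k.append(x @ Wk)          # compute ONE new key, not the whole prefix
    cache_v.append(x @ Wv)          # ...and one new value
    q = x @ Wq
    K = np.stack(cache_k)           # (step+1, d): all keys so far
    V = np.stack(cache_v)
    scores = K @ q / np.sqrt(d)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    out = w @ V                     # attention output for this step only
```

The tradeoff the course flags applies here too: the cache trades GPU memory (it grows with sequence length) for a large reduction in redundant compute per step.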

You’ll earn a certificate upon completing the course, recognizing your skills in transformer-based language models.

In partnership with AMD

We built this course with AMD to help engineers move beyond treating LLMs as black boxes. You’ll build practical intuition for how transformers generate text, process context, and run efficiently on GPUs, while learning techniques and concepts that apply across transformer-based models and hardware environments.

Who should join?

This course is designed for software engineers, ML engineers, and developers who work with LLMs and want to understand what’s actually happening under the hood.

You don’t need to have built a model from scratch, but you should be comfortable using LLMs through an API or chat interface and have a basic understanding of neural network concepts like weights, layers, and training.

Course Outline


Instructor

Sharon Zhou

VP of Engineering & AI at AMD


Frequently Asked Questions

I have questions about my DeepLearning.AI Pro subscription, whom can I ask?
How much does a Pro membership cost?

The DeepLearning.AI Pro membership costs $25/mo billed annually and $30/mo billed monthly.

More pricing details are available on the membership page.

Important details:

  • All prices are listed in USD
  • Payments are processed securely via Stripe
  • Taxes may apply depending on your location
Will I receive a certificate at the end of the course?

Yes! If you’re a DeepLearning.AI Pro member, you’ll earn a certificate upon completing the course, recognizing your skills in transformer-based language models.

Join today and be at the forefront of the next generation of AI!
