ChatGPT Prompt Engineering for Developers
Short Course

Learn the fundamentals of prompt engineering for ChatGPT: effective prompting, and how to use LLMs for summarizing, inferring, transforming, and expanding.

GenAI Applications, Prompt Engineering, Transformers
  • OpenAI
Finetuning Large Language Models
Short Course

Discover when to use finetuning vs. prompting for LLMs. Select suitable open-source models, prepare data, and train and evaluate for your specific domain.

Deep Learning, Fine-Tuning, Transformers
  • Lamini
Reinforcement Learning From Human Feedback
Short Course

Get an introduction to tuning and evaluating LLMs using Reinforcement Learning from Human Feedback (RLHF) and fine-tune the Llama 2 model.

Fine-Tuning, Generative Models, LLMOps, Transformers
  • Google Cloud
Prompt Engineering with Llama 2 & 3
Short Course

Learn best practices for prompting and selecting among Meta Llama 2 & 3 models. Interact with Meta Llama 2 Chat, Code Llama, and Llama Guard models.

AI Safety, GenAI Applications, Generative Models, Prompt Engineering, Transformers
  • Meta
Open Source Models with Hugging Face
Short Course

Learn how to easily build AI applications using open-source models and Hugging Face tools. Find and filter open-source models on the Hugging Face Hub.

Chatbots, Generative Models, MultiModal, NLP, Prompt Engineering, Transformers
  • Hugging Face
Efficiently Serving LLMs
Short Course

Understand how LLMs predict the next token and how techniques like KV caching can speed up text generation. Write code to serve LLM applications efficiently to multiple users.

Fine-Tuning, Generative Models, LLMOps, LLM Serving, Transformers
  • Predibase
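The KV-caching idea this course covers can be sketched in a few lines. The toy model below is an assumption for illustration only (`embed` stands in for a transformer layer's key/value projections); the point is the operation count: with a cache, each token's keys and values are computed once, while without one the whole prefix is recomputed at every decoding step.

```python
# Toy illustration of KV caching in autoregressive decoding.
# `embed` is a hypothetical stand-in for a layer's K/V projections;
# a real model would apply learned projection matrices here.

def embed(token):
    return (hash(token) % 97, hash(token) % 89)  # fake (key, value) pair

def decode_with_cache(tokens):
    cache = []           # one (key, value) entry per past token, reused each step
    steps_computed = 0
    for t in tokens:
        cache.append(embed(t))   # compute K/V once for the new token only
        steps_computed += 1
    return cache, steps_computed

def decode_without_cache(tokens):
    steps_computed = 0
    for i in range(1, len(tokens) + 1):
        for t in tokens[:i]:     # recompute K/V for the whole prefix every step
            _ = embed(t)
            steps_computed += 1
    return steps_computed

tokens = ["the", "cat", "sat", "on", "the", "mat"]
cache, with_cache = decode_with_cache(tokens)
without_cache = decode_without_cache(tokens)
print(with_cache, without_cache)  # 6 vs 21: linear vs quadratic K/V work
```

For 6 tokens the cached path does 6 K/V computations versus 1 + 2 + … + 6 = 21 without, which is why caching turns per-step cost from linear in the prefix into constant.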
Quantization Fundamentals with Hugging Face
Short Course

Learn how to quantize any open-source model, compressing it with the Hugging Face Transformers library and the Quanto library.

Generative Models, Compression and Quantization, MultiModal, Transformers
  • Hugging Face
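The core idea behind the quantization tools this course uses can be sketched without any library: linear quantization maps float weights to 8-bit integers with a per-tensor scale, then dequantizes back with a small rounding error. This is a minimal hand-rolled sketch of the concept, not the Quanto API.

```python
# Minimal sketch of symmetric linear (int8) quantization: store weights as
# small integers plus one float scale, trading a little precision for memory.

def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127  # map the largest weight to ±127
    q = [round(w / scale) for w in weights]     # integer codes in [-127, 127]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]             # approximate float recovery

weights = [0.4, -1.3, 0.07, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, max_err)  # each element is off by at most half a quantization step
```

Eight-bit codes take a quarter of the memory of 32-bit floats, and the worst-case error per element is bounded by half the scale, which is why quantized models stay usably accurate.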
Pretraining LLMs
Short Course

Learn the essential steps to pretrain a large language model from scratch.

Deep Learning, Evaluation and Monitoring, Fine-Tuning, GenAI Applications, LLMOps, Machine Learning, Mathematical Foundations, Transformers
  • Upstage
How Transformer LLMs Work
Short Course

Understand the transformer architecture that powers LLMs, in order to use them more effectively.

Deep Learning, Embeddings, GenAI Applications, LLMOps, Machine Learning, NLP, RAG, Transformers
  • Jay Alammar, Maarten Grootendorst
Attention in Transformers: Concepts and Code in PyTorch
Short Course

Understand and implement the attention mechanism, a key element of transformer-based LLMs, using PyTorch.

Deep Learning, Embeddings, GenAI Applications, Machine Learning, NLP, Transformers
  • StatQuest
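The attention mechanism this course implements with PyTorch tensors can be written out by hand. The sketch below is the standard scaled dot-product formula, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, in pure Python so every step is visible; the matrices are made-up toy values.

```python
import math

# Scaled dot-product attention, spelled out element by element.

def softmax(xs):
    m = max(xs)                          # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    d_k = len(K[0])
    out = []
    for q in Q:                          # one output row per query
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]            # similarity of this query to every key
        weights = softmax(scores)        # attention weights sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: one query that matches the first of two keys far more strongly,
# so the output is pulled almost entirely toward the first value vector.
Q = [[1.0, 0.0]]
K = [[10.0, 0.0], [0.0, 10.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
out = attention(Q, K, V)
print(out)  # close to [[1.0, 2.0]]: nearly all weight on the first token
```

A framework version replaces the loops with matrix multiplications, but the arithmetic is identical.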
Reinforcement Fine-Tuning LLMs With GRPO
Short Course

Improve LLM reasoning with reinforcement fine-tuning and reward functions.

Evaluation and Monitoring, Fine-Tuning, GenAI Applications, LLMOps, LLM Serving, Machine Learning, Prompt Engineering, Supervised Learning, Transformers
  • Predibase
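The reward-function idea at the heart of GRPO can be sketched with plain arithmetic: sample a group of completions per prompt, score each with a reward function, and normalize rewards within the group so above-average answers get positive advantages. The `reward` function and completions below are hypothetical toy examples, not anything from the course.

```python
# Sketch of the group-relative advantage used in GRPO-style fine-tuning.

def reward(completion, expected):
    # Hypothetical verifiable reward: 1 if the answer matches, else 0.
    return 1.0 if completion.strip() == expected else 0.0

def group_advantages(rewards, eps=1e-8):
    # Normalize each reward against its own group's mean and std deviation.
    mean = sum(rewards) / len(rewards)
    var = sum((r - mean) ** 2 for r in rewards) / len(rewards)
    std = var ** 0.5
    return [(r - mean) / (std + eps) for r in rewards]

completions = ["4", "5", "4", "22"]              # four sampled answers to "2 + 2 = ?"
rewards = [reward(c, "4") for c in completions]  # [1.0, 0.0, 1.0, 0.0]
advs = group_advantages(rewards)
print([round(a, 2) for a in advs])  # [1.0, -1.0, 1.0, -1.0]
```

Because advantages are computed relative to the group rather than a learned value model, correct answers are reinforced and incorrect ones penalized without training a separate critic.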