Hi, I'm excited to share with you this new course on using LangChain to chat with your data, built in collaboration with Harrison Chase, co-founder and CEO of LangChain. Large language models, or LLMs, such as ChatGPT can answer questions about a lot of topics, but an LLM in isolation knows only what it was trained on. That doesn't include your personal data, such as proprietary documents inside your company that aren't on the internet, or data and articles written after the LLM was trained. So wouldn't it be useful if you, or others such as your customers, could have a conversation with your own documents and get questions answered using information from those documents and an LLM? In this short course, we will cover how to use LangChain to chat with your data.

LangChain is an open-source developer framework for building LLM applications. It consists of several modular components as well as more end-to-end templates. The modular components in LangChain include prompts, models, indexes, chains, and agents. For a more detailed look at these components, you can see our first course that I taught with Andrew. In this course, we will zoom in on one of the more popular use cases of LangChain: how to use LangChain to chat with your data.

We will first cover how to use LangChain document loaders to load data from a variety of exciting sources. We will then touch on how to split these documents into semantically meaningful chunks. This pre-processing step may seem simple, but it has a lot of nuance. Next, we'll give an overview of semantic search, a basic method for fetching relevant information given a user question. It's the easiest method to get started with, but there are several cases where it fails; we'll go over those cases and how to fix them. We'll then show how to use the retrieved documents to enable an LLM to answer questions about a document, but also show that you're still missing one key piece needed to fully recreate that chatbot experience. Finally, we'll cover that missing piece, memory, and show how to build a fully functioning chatbot through which you can chat with your data.

This will be an exciting short course. We're grateful to Ankush Gola as well as Lance Martin from the LangChain team for working on all the materials that you hear Harrison present later, as well as, on the deeplearning.ai side, Geoff Ladwig and Diala Ezzeddine. In case you're going through this course and decide you'd like a refresher on the basics of LangChain, I encourage you to also take the earlier short course on LangChain for LLM application development that Harrison mentioned. But with that, let us now go on to the next video, where Harrison will show you how to use LangChain's very convenient collection of document loaders.
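To make that roadmap concrete, here is a minimal sketch of the first half of the pipeline: loading a document, splitting it into chunks, and indexing the chunks in a vector store for semantic search. This is an illustration rather than the course's own notebook code; it assumes the classic `langchain` Python package (plus `pypdf` and `chromadb`) and an `OPENAI_API_KEY` in the environment, and the PDF path and persist directory are placeholders.

```python
# Rough sketch of the ingestion half of "chat with your data".
# Assumes the classic `langchain` package plus `pypdf` and `chromadb`;
# "docs/example.pdf" and "docs/chroma" are placeholder paths.
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma

# Document loading: each PDF page becomes a Document (text + metadata).
pages = PyPDFLoader("docs/example.pdf").load()

# Document splitting: overlapping chunks small enough to embed and retrieve.
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=150)
chunks = splitter.split_documents(pages)

# Vectorstore and embeddings: index the chunks for semantic search.
vectordb = Chroma.from_documents(
    documents=chunks,
    embedding=OpenAIEmbeddings(),
    persist_directory="docs/chroma",  # placeholder on-disk location
)
print(f"Indexed {len(chunks)} chunks")
```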
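And here is an equally rough sketch of the second half: turning that vector store into a retriever and combining it with a chat model and conversation memory, so follow-up questions keep their context. It reuses the `vectordb` object from the previous snippet, and the question string is just an illustration.

```python
# Rough sketch of the chat half: retrieval + question answering + memory.
# Reuses `vectordb` from the ingestion sketch above.
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationalRetrievalChain

# Memory keeps the running chat history so follow-up questions stay in context.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

qa = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    retriever=vectordb.as_retriever(),  # semantic search over the indexed chunks
    memory=memory,
)

result = qa({"question": "What is this document about?"})
print(result["answer"])
```

Each of these pieces, from loaders and splitters to vector stores, retrieval, question answering, and chat memory, gets its own lesson in the outline below.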
LangChain Chat with Your Data
  • Introduction ・ Video ・ 2 mins
  • Document Loading ・ Video with Code Example ・ 7 mins
  • Document Splitting ・ Video with Code Example ・ 15 mins
  • Vectorstores and Embedding ・ Video with Code Example ・ 9 mins
  • Retrieval ・ Video with Code Example ・ 11 mins
  • Question Answering ・ Video with Code Example ・ 9 mins
  • Chat ・ Video with Code Example ・ 9 mins
  • Conclusion ・ Video ・ 1 min