This lesson is all about tool calling. You will learn both the conceptual steps and hands-on examples for tool calling with a Jamba model. Let's go!

Jamba models have built-in tool calling functionality to support your agentic workflow with external functions and tools. The tools can be external APIs that access real-time data, such as stock market data, news, or data from your CRM system. You may also define your own custom functions, for example, arithmetic functions that serve as calculators for the LLM.

First, let's go through a workflow diagram of tool calling with the Jamba model. When you send a user query to the Jamba model, the model decides whether any available tool should be used to provide the best answer for your query. If no tool is triggered, the Jamba model simply generates a response. If a tool should be used, the Jamba model extracts the proper parameters for the right tool based on the user query. The parameters are then used to call the tool, and the tool results are sent back to the Jamba model to generate the final response.

Let's go to the code. These two lines of code are used to ignore any unnecessary warnings. First, you import everything you need from the AI21 library. As in the previous lab, you just need to load the AI21 API key and create an AI21 client. The API key has already been set up for you, so you don't have to worry about it.

Now let's get to the first tool calling example. You probably know that large language models are not the best tool for arithmetic, and some even say you shouldn't ask an LLM to do arithmetic at all: you don't want to multiply billions of numbers just to multiply two numbers. So let's give the Jamba model two arithmetic functions, multiplication and addition, for it to use at the right time. Pretty simple: each function takes in two numbers and performs a simple arithmetic calculation. The next step is to let the Jamba model know that these two functions are available to it.
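The two arithmetic functions described above might look like this. This is a minimal sketch; the exact function and parameter names in the lesson's notebook may differ:

```python
# Two simple arithmetic functions the Jamba model can call as tools.
# Names and signatures here are illustrative, not the notebook's exact code.
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b


def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b
```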
You can use the tool definition class we imported earlier from the AI21 SDK to describe the available functions and pass them to the Jamba model via the tools parameter. You can define them one tool at a time. In each tool definition, you specify the type as function, provide a name and description that match the semantic meaning of the function, and supply the parameters for the function you want to call later. You also describe the type and give a description for each of those parameters; both numbers are required for the multiplication function. Similarly, you can define the addition function as well. Now you can put the two tools together in a list for the Jamba model.

To easily understand the Jamba response, let's first ask: what is the capital of France? Now you pass the messages and tools to the Jamba model to get a response. In the response, you can see that tool_calls in the assistant message is None, which means none of the tools we defined were used for this question. Which makes sense: we don't need multiplication or addition to know the capital of France.

Now it is time to ask a question that is relevant to our tools, for example, to multiply two numbers together. We can also add a system message to provide further guidance for the Jamba model. Now pass the messages and tools to the Jamba 1.5 Large model again. This time you can see the tool calling information in the assistant message's tool_calls section, with a tool call ID, the function name, multiplication, and the arguments. Because the query asks the Jamba model to multiply two numbers, the Jamba model correctly identifies that multiplication is the right function for the job.

You can parse the function arguments from the Jamba response and use them to call the multiplication function you defined earlier to do the calculation for you. The result from the multiplication function can then be wrapped in a tool message object.
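The tool definitions described above follow the common JSON-schema style. Here is a minimal sketch using plain dictionaries; the AI21 SDK also provides typed definition classes, so treat the exact shapes below as an assumption for illustration:

```python
# Tool definitions in the common JSON-schema style. Field names follow the
# widespread convention; the AI21 SDK also exposes typed classes for this,
# so this dictionary shape is an assumption for illustration.
multiply_tool = {
    "type": "function",
    "function": {
        "name": "multiplication",
        "description": "Multiply two numbers together.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "number", "description": "The first number."},
                "b": {"type": "number", "description": "The second number."},
            },
            "required": ["a", "b"],  # both numbers must be supplied
        },
    },
}

add_tool = {
    "type": "function",
    "function": {
        "name": "addition",
        "description": "Add two numbers together.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "number", "description": "The first number."},
                "b": {"type": "number", "description": "The second number."},
            },
            "required": ["a", "b"],
        },
    },
}

# Put both tools in a list to pass to the model via the tools parameter.
tools = [multiply_tool, add_tool]
```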
Make sure you add the correct tool call ID, with the result from your multiplication function as the content. The final step is to combine the initial assistant message and the tool message in the chat history, and send them together to the Jamba model to get the final response.

When you have multiple functions, you can use this code block to select the right function to call. With the same example, you can add the two arithmetic functions to a dictionary, parse the function name and the parameters from the Jamba model response, and call the right function.

Now let's look at another tool calling example: running some analysis on SEC 10-Q filing documents, which are public companies' quarterly reports. You will first import the SEC_10Q function from utils. This function goes to the SEC website and uses a company's stock ticker symbol to pull the full text of its most recent quarterly 10-Q report. The only parameter it needs is the stock ticker symbol.

Let's start by defining the tool for the Jamba model, like we did in the previous example, with a function type, name, description, and parameters. Now you can add it to a tools list. Let's say you want to get a summary of Nvidia's most recent 10-Q report. You can write a system message and a user message and pass them to the Jamba model, along with the tools list.

Let's take a look at the Jamba model response. The Jamba model correctly identifies that the SEC 10-Q tool is required here and returns the Nvidia ticker, NVDA, as a parameter to the function. Let's call the SEC_10Q function. Now you get the full text of Nvidia's most recent 10-Q quarterly report, which is a very long file. For convenience, let's clear the cell output, then append the initial Jamba model response and the tool message, with the 10-Q file as content, to the chat history. The Jamba model can now complete your initial request. And here is the final response: a summary of Nvidia's most recent 10-Q filing.
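The dispatch pattern from this lesson — keeping the functions in a dictionary, parsing the tool name and JSON-encoded arguments from the model's response, and wrapping each result in a tool message with the matching tool call ID — can be sketched like this. The dictionary shapes standing in for the SDK's response objects are assumptions for illustration:

```python
import json


def multiply(a, b):
    return a * b


def add(a, b):
    return a + b


# Map each tool name (as declared in the tool definitions) to the local
# Python function that implements it.
tool_registry = {"multiplication": multiply, "addition": add}


def run_tool_calls(assistant_message, registry):
    """Execute any tools requested in an assistant message.

    Returns a list of tool messages to append to the chat history before
    calling the model again. The dictionary shapes used here stand in for
    the SDK's response objects and are assumptions for illustration.
    """
    tool_messages = []
    for call in assistant_message.get("tool_calls") or []:
        func = registry[call["function"]["name"]]
        # The model supplies arguments as a JSON string, so decode first.
        args = json.loads(call["function"]["arguments"])
        result = func(**args)
        # Wrap the result in a tool message keyed by the tool call ID.
        tool_messages.append(
            {"role": "tool", "tool_call_id": call["id"], "content": str(result)}
        )
    return tool_messages


# Example with a dictionary-shaped assistant message (shape is an assumption):
demo_message = {
    "tool_calls": [
        {"id": "call_1",
         "function": {"name": "multiplication", "arguments": '{"a": 6, "b": 7}'}}
    ]
}
demo_tool_messages = run_tool_calls(demo_message, tool_registry)
```

If the assistant message carries no tool calls, `run_tool_calls` simply returns an empty list, matching the "no tool triggered" branch of the workflow diagram.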
In a few lines of code, the Jamba model was able to point to the right function, pull the right document from the SEC site, and generate this summary for you. You can now stitch together the entire process using this code block: it checks whether a particular tool is called, and generates a final response accordingly.

All right. In this lesson, you have learned how to use the Jamba model for tool calling, which is critical for building agentic AI applications. In the next lesson, you will learn how to expand the context window size for a large language model. See you there!