In this short video, you will learn how to set up Airflow and Weaviate locally so you can run the DAGs you've built in this course on your own machine. Go to the repository in the Astronomer organization called Orchestrating Workflows for GenAI DeepLearning.AI. Fork the repo into your own GitHub account, clone it to your local machine, and navigate into the directory. I'm using VS Code here to run a terminal and edit my DAG files, but you can edit your DAG files and run your commands any way you'd like.

The only package you need to install on your machine to run this project is the Astro CLI, a free tool developed by Astronomer to run Airflow locally in containers. Use brew install astro to install the package on your Mac. If you are on a different operating system, check the Astro CLI documentation for install instructions. If you already have the Astro CLI installed, like me here, make sure you are at least on version 1.34.1 by running astro version.

Once you have the Astro CLI installed, there is only one small change you need to make to the repo. Create a new file in the root directory called .env. This file will contain all environment variables for your Airflow project. Copy the contents of .env_example into this file. This environment variable defines the connection between the Airflow instance and the local Weaviate environment. If you want to connect to a different Weaviate environment, for example Weaviate Cloud, you can change this variable. Note that you can include your own OpenAI API key if you'd like, but this is not necessary to run the course pipelines.

After saving the change to the .env file, run astro dev start in the root directory of your project. This command spins up four containers running the Airflow components you learned about in this course: the scheduler, the API server, the DAG processor, and the Airflow metadata database.
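The .env step above can be sketched in the shell. This is only an illustration: the variable name follows Airflow's AIRFLOW_CONN_<conn_id> convention, but the exact name and JSON value here are assumptions, so copy the real line from .env_example in the repo rather than this placeholder.

```shell
# Create the .env file in the project root. The variable name and the
# connection JSON below are placeholders (assumed, not from the repo) --
# copy the exact contents from .env_example instead.
cat > .env <<'EOF'
AIRFLOW_CONN_WEAVIATE_DEFAULT='{"conn_type": "weaviate", "host": "http://weaviate:8081"}'
EOF

# Confirm the connection variable landed in the file.
grep AIRFLOW_CONN .env
```

To point the same project at a different Weaviate environment, such as Weaviate Cloud, you would edit only this variable's value.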
The astro dev start command also initializes the triggerer component, which we did not cover in the course. Alongside these five containers running Airflow, a container for a local Weaviate instance is created as well, based on the definition in the docker-compose.override file. You do not need Docker installed on your machine to run this command; if Docker is not available, the Astro CLI will set up Podman for you to run the containers.

Once the containers are ready, the Airflow UI opens at localhost:8080. You do not need any credentials to log in. Now you can run the same DAGs you built in this course on your computer and start building out your own GenAI pipelines. Once you are ready to deploy, you can send your DAGs to any type of Airflow deployment. If you have an Astro account (a free trial is available), you can use the command astro deploy to send your pipelines to the cloud.
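The local-development loop described above comes down to a handful of Astro CLI subcommands. As a sketch, this writes them to a small cheat-sheet file; all the subcommands listed are real Astro CLI commands, and astro deploy additionally assumes you have authenticated with astro login.

```shell
# Write a quick-reference file for the Astro CLI commands used in this
# workflow (run them from the project root).
cat > astro-commands.txt <<'EOF'
astro dev start   # start local Airflow plus Weaviate (docker-compose.override)
astro dev ps      # list the running containers
astro dev logs    # view logs from the Airflow components
astro dev stop    # stop the local environment
astro login       # authenticate with your Astro account
astro deploy      # push your DAGs to an Astro deployment
EOF

cat astro-commands.txt
```

Keeping a file like this next to the project is optional; the point is simply which command maps to which step of the workflow.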