Before we conclude this course, I wanted to share some interesting papers and projects with you to hopefully inspire you on your carbon-aware journey. Let's dive in.

The first is the Google Cloud Region Picker tool. In this course we talked about how to select a region based just on carbon, but of course selecting a region for a real project usually depends on many other factors, like latency, price, and data location. To help with region selection, you can use this region picker tool by specifying how important or unimportant different factors are to you, such as carbon footprint, lower price, and latency. So let's say that carbon is a little less than medium importance, we really care about price, and we don't care so much about latency. If it's applicable, you can also specify where your traffic is coming from, and then the products you're interested in running. So let's scroll down here and select Vertex AI. Once you've specified all these parameters, you'll get a recommendation for regions that are a good fit based on your specific preferences. It looks like I'm being recommended us-central1 in Iowa, which you might recall is the region we used in Lesson 3 to train the machine learning model. Later on, I also trained in Montreal, Canada, because that region had more renewable energy available at that point in time. So definitely check out the region picker if you plan to use Google Cloud. You can use it as a way to guide you toward the best region that still optimizes for carbon efficiency, while also keeping in mind the other factors that matter to you.

Now for some interesting research papers, if you want to learn a little more about this field. The first is a family of language models called GLaM, or Generalist Language Model. The authors of this paper show that the largest GLaM model, at 1.2 trillion parameters, consumes only about one third of the energy used to train GPT-3 and requires half the compute FLOPs for inference, all while achieving comparable performance across 29 NLP tasks. I think it's super cool how the researchers were able to achieve this efficiency. They use something called a sparsely activated mixture-of-experts architecture. It was something I wasn't familiar with until reading this paper, and I'd definitely recommend checking it out if you want to learn more about how it works and how they were able to get these efficiency gains in energy use. In addition to this paper, if you want to learn a little more about mixture of experts, there's another short course from Mistral AI that covers it, so check that out if you'd like an introduction before diving straight into this technical paper.

While we talked mostly about the footprint from training models in this course, there's recently been more research on energy consumption at inference time. So I would recommend checking out the paper "Power Hungry Processing: Watts Driving the Cost of AI Deployment?". The authors point out that while inference on a single example requires much less computation than training the same model, inference happens a lot more frequently than model training. Specifically, the researchers compare the amount of energy used per inference for a few different BLOOM models with the total amount of energy used to train and fine-tune them. They use these numbers to estimate how many inferences a given model would need to serve before the cost of inference reaches the cost of training.
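To make that break-even idea concrete, here's a minimal sketch of the arithmetic. The energy figures and query volumes below are hypothetical placeholders I picked for illustration, not numbers from the paper; the point is just to show how a per-inference energy cost gets compared against a one-time training cost.

```python
# Rough break-even estimate: how many inferences until the energy spent on
# inference matches the energy spent on training? (Hypothetical numbers.)

training_energy_kwh = 433_000      # assumed total energy to train + fine-tune the model, in kWh
energy_per_inference_kwh = 0.004   # assumed energy per single inference, in kWh

break_even_inferences = training_energy_kwh / energy_per_inference_kwh
print(f"Inference energy matches training energy after ~{break_even_inferences:,.0f} inferences")

# If the deployed model serves, say, 10 million queries per day (again hypothetical),
# you can translate that count into calendar time:
queries_per_day = 10_000_000
days_to_break_even = break_even_inferences / queries_per_day
print(f"At {queries_per_day:,} queries/day, that's roughly {days_to_break_even:.0f} days of deployment")
```

With numbers in this ballpark, deployment overtakes training within a couple of weeks, which is the same qualitative conclusion the authors reach for widely used models.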
For ChatGPT, they estimate that even assuming just a single query per user, the energy cost of deploying the model would surpass its training cost after just a few weeks or months of deployment. The paper studies 88 models across ten tasks and 30 datasets, spanning applications in natural language and computer vision, and the authors analyze the impact of modality, model size, architecture, and learning paradigm on energy efficiency. One thing you also learn in this paper is that, for instance, prompting a large language model to perform a summarization task is much more energy- and carbon-intensive than asking it to perform classification. And as you might imagine, image generation is much more energy-intensive than image classification or text generation.

Lastly, thinking beyond carbon, there's also a water footprint associated with generative AI. Remember that data centers running GPUs and TPUs can get really hot, and cooling systems may evaporate a significant amount of water to keep these processors running. Recently, there's been some work to understand and quantify water usage, so if you want to learn more about this, I would check out the paper Making AI Less Thirsty. There's definitely less research here than there is on quantifying the carbon footprint of compute, but water usage is getting more attention in the research community. Just like how, a few years ago, we didn't really understand or think much about the carbon footprint of compute, and that has changed a lot since then, I think water is a new area where we're going to see a lot more research into what the impact really is and how we can make changes to use less water and have a lower footprint. So definitely check out this paper if you want to see how this new field is starting to develop.

And with that, those are just a few papers and projects that I found really interesting. I hope this course is a starting point for you on your carbon-aware journey. I think it's really exciting that you can make a difference as a developer and engineer, and I hope from now on you'll start to consider carbon when you're developing software applications. We're already used to thinking about things like performance, security, and cost, so really, carbon is just one more thing to add to that list.