This is part 3 of our series. You can find part 1 and part 2 here.
There's no denying the immense carbon footprint of AI. Training a single large language model (LLM) can emit as much CO2 as five cars do over their entire lifetimes. That is alarming, particularly because data centers and data transmission networks already account for between 0.6% and 4% of worldwide greenhouse gas emissions, roughly on par with global air travel. And we're only at the cusp of the AI revolution. It is therefore crucial to consider how we can train AI models with the least possible impact on the environment.

This approach ties in with a growing vision in development known as GreenOps. GreenOps is an operational management approach that focuses on eco-awareness and sustainability within the IT sector. Rather than chasing efficiency and performance alone, GreenOps aims to shrink the ecological footprint, and with it the cost, of server use. Its principles include energy optimization, waste reduction, and cutting CO2 emissions. One of the most practical ways to achieve this is by optimizing server configurations: tailoring servers to the specific needs of workloads prevents excessive energy consumption. GreenOps is not just an ethical stance in IT management; it brings tangible benefits to businesses, such as cost savings, improved operational efficiency, and a smaller negative environmental impact. It is a step toward a more sustainable and responsible IT industry.

That is why, in this series of blogs, we're sharing tips on making your AI models as sustainable as possible. In previous blogs, we offered suggestions on reducing workload and optimizing server usage. In this blog, we'll show how timing and location also affect energy consumption and CO2 emissions. We present four questions to consider regarding the timing and location of training, so you can develop your AI models more responsibly.
Question 1: When are renewable energy sources available?
Optimizing the use of renewable energy sources, as opposed to fossil fuel-generated energy, is crucial in AI training. Timing matters: between 11:00 AM and 4:00 PM, the share of solar and wind power in the energy mix is typically at its highest. Using this energy, especially on sunny or windy days when it cannot all be stored, is important because electricity that isn't consumed when it is generated goes to waste. This applies above all if your cloud provider runs on renewable energy sources (have you checked that?). But even if it doesn't, grid operators dispatch renewable power before falling back on non-renewable sources, so maximizing daytime usage of this energy is beneficial in every scenario.
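As an illustration, a simple scheduler can hold back a training run until the grid's carbon intensity drops below a threshold. The sketch below is a minimal example: `get_carbon_intensity` is a hypothetical placeholder that you would back with a real signal, for instance data from your grid operator or a carbon-intensity API, and the 200 gCO2eq/kWh threshold is an arbitrary example value.

```python
import time

CARBON_THRESHOLD = 200  # gCO2eq/kWh; example value, tune it to your grid


def get_carbon_intensity() -> float:
    """Hypothetical placeholder: return the grid's current carbon intensity
    in gCO2eq/kWh, e.g. from your grid operator or a carbon-intensity API."""
    return 180.0  # dummy value so the sketch runs; replace with a real lookup


def wait_for_green_window(poll_seconds: int = 900) -> None:
    """Block until the grid is clean enough to start an energy-hungry job."""
    while get_carbon_intensity() > CARBON_THRESHOLD:
        time.sleep(poll_seconds)  # re-check every 15 minutes


if __name__ == "__main__":
    wait_for_green_window()
    # start_training()  # launch the training job once the green window opens
```

The same idea extends to pausing and resuming runs from checkpoints, so that long training jobs can follow the daily pattern of renewable generation.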
Question 2: What are your hardware options?
Depending on the hardware used to train your AI models, there are several choices you can make:
Energy-saving modes: Modern hardware often offers advanced energy management options in the BIOS. Optimizing these settings can significantly reduce server energy consumption without compromising performance, for example by enabling sleep states and capping maximum power draw (see the sketch after this list).
Dynamic frequency scaling: Many processors and GPUs support dynamic frequency scaling, which adjusts clock speeds to the workload and prevents unnecessary energy consumption during less intensive phases.
Selection of efficient hardware: Opt for devices with the above features. While avoiding electronic waste and not prematurely discarding servers is important, newer models are often designed for energy efficiency. Make a judgment call on whether recycled hardware suffices for your current needs and when it is time to upgrade.
Maintenance and upgrades: Whether in-house or through a cloud provider, regular maintenance and timely upgrades ensure efficient hardware operation. Outdated or poorly maintained systems often consume more energy.
Cooling techniques and reuse of waste heat: Server cooling consumes substantial energy. Inquire about your cloud provider's cooling methods. Data centers using liquid cooling often consume less energy than those using traditional air cooling.
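To make the power-capping point concrete, the sketch below uses nvidia-smi from Python to read a GPU's current power draw and set a lower power limit. This is a minimal sketch, assuming an NVIDIA GPU with nvidia-smi installed; the 250 W cap is an arbitrary example, and setting a limit usually requires root privileges.

```python
import subprocess

GPU_INDEX = 0
POWER_CAP_WATTS = 250  # example value; pick a cap that suits your GPU and workload


def current_power_draw(gpu: int = GPU_INDEX) -> float:
    """Read the GPU's current power draw in watts via nvidia-smi."""
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout.strip())


def cap_power(gpu: int = GPU_INDEX, watts: int = POWER_CAP_WATTS) -> None:
    """Cap the GPU's maximum power draw (typically requires root privileges)."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)


if __name__ == "__main__":
    print(f"GPU {GPU_INDEX} currently draws {current_power_draw():.1f} W")
    cap_power()
```

A modest power cap often costs only a small amount of training throughput while noticeably cutting energy use, which is worth measuring for your own workload.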
Even better is reusing the waste heat. Among cloud providers, Leafcloud excels in this respect, reusing 85% of server heat directly for warm tap water and significantly reducing fossil fuel use. These are hardware-related choices, but energy savings can also be achieved in software and usage; our previous blogs offer tips on optimizing servers and reducing the workload needed to accomplish the same tasks with less energy.
Question 3: What location do you choose: cloud, edge, or on-premises?
In line with your hardware choices, a conscious decision between cloud, edge, and on-premises is vital. Consider the factors from the previous question: energy sources, hardware efficiency, and waste heat reuse. Even when training is outsourced to a cloud provider, more options may be available than you would expect.
Choose a sustainable cloud provider: We've compiled a checklist for assessing the sustainability of your cloud provider.
Question 4: How do you measure and report the savings in training your AI models?
Sustainable AI model training isn't a one-time setup; it's an evolving process. Continuous monitoring of energy consumption provides insights for optimization using new technologies and ideas. Transparent reporting on energy usage and CO2 emissions demonstrates organizational responsibility and contribution to sustainability in the tech industry.
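As a starting point, open-source tooling can estimate the energy use and emissions of a single training run. The sketch below uses the codecarbon package as one possible option; train_model is a placeholder for your actual training code, and the project name is an arbitrary example.

```python
from codecarbon import EmissionsTracker  # pip install codecarbon


def train_model():
    """Placeholder for your actual training loop."""
    ...


# Track energy use and estimated CO2 emissions for one training run.
tracker = EmissionsTracker(project_name="sustainable-ai-training")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2eq for this run
    print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```

Logging these figures per run makes it possible to report trends over time and to see whether changes in timing, hardware, or location actually pay off.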
Choosing a Greener Future Now
At the onset of AI development, it's imperative to be conscious of how we handle its environmental impact. Training AI models is just the beginning; as the use of AI applications grows, so will the energy and resources they require. Organizations wield influence by making conscious choices, including the choice of cloud provider. Many providers claim climate neutrality, but Leafcloud approaches it differently. Our servers are placed in buildings directly connected to hot water systems. Running on green energy and using residual heat to warm tap water for doing the dishes or showering significantly reduces CO2 emissions. Moreover, we avoid building new data centers by using existing structures and making them more sustainable. For more details on our approach, visit our website. If you want to explore services tailored to your needs, contact us directly to address all your queries.