Core Concepts
The author explores the potential of generative AI models to produce sustainable code by optimizing for sustainability metrics. The study compares the sustainability of human-written code with that of code generated by three AI language models.
Abstract
The content examines the environmental impact of software development driven by the growing demand for data services. It discusses the energy consumption and carbon emissions of data centers, emphasizing the need for green coding practices. The study evaluates the sustainability awareness of generative AI models such as ChatGPT, Copilot, and CodeWhisperer in generating eco-friendly code. By comparing human submissions with AI-generated solutions, it offers insights into how advanced technologies can contribute to sustainable software development. The analysis covers metrics such as runtime, memory usage, FLOPs, energy consumption, and code correctness across different coding problems.
Stats
Global data center electricity consumption was estimated to be 240-340 TWh in 2022.
Estimated CO2 emissions from training a large NLP model are 1-10 times those from the lifecycle of a car.
Energy consumption of large NLP models during training ranged from 20 MWh to more than 1,200 MWh.
Development of GPT-3 estimated to have generated 552 tons of CO2 equivalent (tCO2e).
Energy use per query for ChatGPT is roughly 0.002 kWh.
Assuming 100 million daily queries, ChatGPT's estimated energy consumption is about 0.2 GWh per day.
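The daily figure follows directly from the per-query estimate. A minimal sketch of the arithmetic, taking the 0.002 kWh/query and 100 million queries/day values quoted above as given estimates:

```python
# Back-of-the-envelope estimate of ChatGPT's daily energy use.
# Both inputs are the rough estimates cited above, not measurements.
ENERGY_PER_QUERY_KWH = 0.002    # ~0.002 kWh per query
QUERIES_PER_DAY = 100_000_000   # assumed 100 million daily queries

daily_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
daily_gwh = daily_kwh / 1_000_000  # 1 GWh = 1,000,000 kWh

print(f"{daily_gwh:.1f} GWh/day")  # → 0.2 GWh/day
```

At this scale, a single day's inference load is on the same order as the total training energy reported above for the largest NLP models.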
Quotes
"Green coding practices aim to reduce carbon emissions associated with electricity consumption."
"AI models have the potential to contribute to environmental sustainability through 'carbon handprint' software."