From Efficiency Gains to Rebound Effects: The Problem of Jevons’ Paradox in AI’s Polarized Environmental Debate (Luccioni et al. 2025) [Paper]
SPROUT: Green Generative AI with Carbon-Efficient LLM Inference (Li et al. 2024) [Paper]
Artificial Intelligence in Climate Change Mitigation: A Review of Predictive Modeling and Data-Driven Solutions for Reducing Greenhouse Gas Emissions (Adegbite et al. 2024) [Paper]
Addition is All You Need for Energy-Efficient Language Models (Luo et al. 2024) [Paper]
LLMCO2: Advancing Accurate Carbon Footprint Prediction for LLM Inferences (Fu et al. 2024) [Paper]
AI, Climate, and Regulation: From Data Centers to the AI Act (Erbert et al. 2024) [Paper]
Offline Energy-Optimal LLM Serving: Workload-Based Energy Models for LLM Inference on Heterogeneous Systems (Wilkins et al. 2024) [Paper]
Hybrid Heterogeneous Clusters Can Lower the Energy Consumption of LLM Inference Workloads (Wilkins et al. 2024) [Paper]
The Price of Prompting: Profiling Energy Use in Large Language Models Inference (Husom et al. 2024) [Paper]
Towards Greener LLMs: Bringing Energy-Efficiency to the Forefront of LLM Inference (Stojkovic et al. 2024) [Paper]
Towards Efficient Generative Large Language Model Serving: A Survey from Algorithms to Systems (Miao et al. 2024) [Paper]
Beyond Efficiency: Scaling AI Sustainably (Wu et al. 2024) [Paper]
A Simplified Machine Learning Product Carbon Footprint Evaluation Tool (Lang et al. 2024) [Paper]
Green AI: Exploring Carbon Footprints, Mitigation Strategies, and Trade Offs in Large Language Model Training (Liu et al. 2024) [Paper]
Measuring and Improving the Energy Efficiency of Large Language Models Inference (Argerich et al. 2024) [Paper][GitHub]