The Climate Cost of Artificial Intelligence: An Architect’s View
By Vineet Gupta | Published by PuneNow.com
The global race for Artificial Intelligence is often framed as a battle of algorithms and hardware. But behind the seamless interface of a chatbot or the efficiency of a predictive model lies a staggering physical cost. As AI adoption scales, so does its appetite for electricity, water, and rare minerals.
To peel back the curtain on the environmental impact of our digital future, Jayant Mahajan, an educator and advocate for the “Change Before Climate Change” mission, had a conversation with Sudip Acharya, a Senior Enterprise Architect with over two decades of experience in GenAI, Cloud, and Data Platform architecture.
The Invisible Footprint of Innovation
Jayant Mahajan: Sudip, you’ve spent over twenty years designing large-scale architectures. From your vantage point, what are the hidden climate costs of AI that remain invisible to the average user?
Sudip Acharya: Thank you, PuneNow, for this platform. Like any major innovation, AI has an ever-growing need for energy and natural resources. High-performance data centers are expected to double their energy demand by 2030. Some top systems already emit as much carbon as a small country.
Currently, data centers use 1–2% of global electricity, and AI accounts for about 15% of that. We’ve seen major cloud providers’ emissions surge by 40% in five years due to AI. Beyond electricity, these centers, often located in arid regions, consume millions of gallons of water for cooling. Then there is the e-waste; 2023 alone produced 2,600 tons of specialized AI chip waste.
AI vs. Traditional IT: A Different Beast
Jayant Mahajan: How does the footprint of modern AI-driven data centers compare to the traditional enterprise IT infrastructure we’ve used for decades?
Sudip Acharya: AI-optimized facilities are designed to consume 10–30 times more energy than their traditional counterparts. Traditional IT relies on CPUs, but AI workloads depend on GPUs, which generate significantly more heat and therefore demand far more water for cooling. Even though modern facilities are more efficient per individual computation, the sheer volume of AI tasks leads to a much higher total energy footprint.

Training vs. Inference: Where is the Real Cost?
Jayant Mahajan: You’ve led GenAI implementations across various industries. Which phase is more carbon-intensive, training these massive models or running them at scale?
Sudip Acharya: Training one large AI model can produce as much CO₂ as 120 cars emit in a year. But while training is a massive, concentrated event, ‘inference’, the act of running the model for users, is generally more carbon-intensive over the model’s lifecycle.
Inference occurs every time you ask ChatGPT a question. When you multiply that by millions of users daily, inference accounts for 80–90% of a model’s total lifetime energy consumption.
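A quick back-of-envelope calculation shows why that 80–90% share matters. The numbers below are purely illustrative placeholders (real figures vary enormously by model and deployment), but the arithmetic itself is simple: if inference makes up 85% of lifetime energy, it consumes several times the energy of training.

```python
# Back-of-envelope: what an 80-90% inference share implies.
# All numbers are hypothetical, for illustration only.

training_energy_mwh = 1_000   # assumed one-off training cost, in MWh
inference_share = 0.85        # mid-point of the 80-90% range

# If training is the remaining 15%, total lifetime energy follows:
lifetime_total = training_energy_mwh / (1 - inference_share)
inference_energy = lifetime_total * inference_share

print(f"Total lifetime energy: {lifetime_total:,.0f} MWh")
print(f"Of which inference:    {inference_energy:,.0f} MWh "
      f"(about {inference_energy / training_energy_mwh:.1f}x training)")
```

In other words, under these assumptions the millions of daily queries end up costing several times the energy of the headline-grabbing training run.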
The Cloud Migration Myth
Jayant Mahajan: Cloud migration is often sold as a “green” move. In your experience, when does the cloud actually reduce emissions, and when does it just shift the problem?
Sudip Acharya: It’s a win when you move from rigid Virtual Machines to containers or serverless computing, ensuring servers only run when needed. However, sustainability takes a hit during ‘lift & shift’ migrations where there is no optimization. Poor architectural designs and unnecessary large-scale data transfers can actually increase resource usage compared to on-premise setups.
The Looming Water Crisis
Jayant Mahajan: As an architect, how serious is the water footprint, particularly in regions already facing climate stress?
Sudip Acharya: It is critical. A single data center can consume as much water per day as 30,000 to 40,000 residents. This creates friction between corporations and local communities, especially since many centers are in drought-prone areas. About 75% of this water evaporates in cooling towers, while “indirect” water is consumed in chip manufacturing and hydroelectric power generation.
Governance and “Green FinOps”
Jayant Mahajan: In regulated sectors like banking and healthcare, are leaders actually factoring in the environment, or is it still all about speed?
Sudip Acharya: It’s changing. Regulations like the EU AI Act and the Corporate Sustainability Reporting Directive (CSRD) are forcing firms to report their carbon footprints. We are also seeing the rise of “Green FinOps,” which means making carbon costs as visible as financial costs. Future “carbon-aware” architectures will automatically select data centers in regions with cleaner energy grids.
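To make the carbon-aware idea concrete, here is a minimal sketch, in Python, of how a scheduler might route a batch workload to the cleanest available grid. The region names and carbon-intensity figures are entirely hypothetical; a real deployment would pull live grid data from a carbon-intensity API rather than a hard-coded table.

```python
# Minimal sketch of carbon-aware region selection.
# Region names and gCO2/kWh values are illustrative placeholders.

GRID_CARBON_INTENSITY = {
    "region-hydro-north": 25,   # grams of CO2 per kWh (hypothetical)
    "region-mixed-east": 320,
    "region-coal-south": 820,
}

def pick_greenest_region(available_regions):
    """Return the available region with the lowest grid carbon intensity."""
    candidates = [r for r in available_regions if r in GRID_CARBON_INTENSITY]
    if not candidates:
        raise ValueError("no available region with known carbon intensity")
    return min(candidates, key=GRID_CARBON_INTENSITY.get)

if __name__ == "__main__":
    choice = pick_greenest_region(["region-mixed-east", "region-hydro-north"])
    print(f"Scheduling batch job in: {choice}")
```

The same pattern extends naturally to time-shifting: deferring a non-urgent training run until the local grid’s carbon intensity drops below a threshold.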
The Path Forward: Architecture as a Solution
Jayant Mahajan: What practical steps should CXOs take today to ensure AI adoption doesn’t compromise climate responsibility?
Sudip Acharya: From an architectural standpoint, organizations must:
- Adopt smaller, specialized models over massive, general ones.
- Use containerization and orchestration to maximize hardware efficiency.
- Partner with cloud providers using green energy (nuclear, hydro, or geothermal).
- Implement closed-loop cooling systems to reduce freshwater waste.
Jayant Mahajan: Sudip, thank you for sharing these insights. It’s clear that while AI is a tool to fight climate change, we must first ensure the tool itself doesn’t break the planet.

Vineet Gupta is the Founder and Managing Editor of PuneNow, where he oversees local news delivery and explores mindful living, parenting, and personal growth. An alumnus of the University of Wales, Vineet has travelled extensively and worked across hospitality, finance, and academia. Now based in Pune, his diverse global background informs his hyper-local perspective, helping the community find meaning, balance, and connection in everyday life.

