The Innovation vs Sustainability Paradox: The Case of AI

Introduction / Our Research Question

Generative AI and the data centers that power it are expanding rapidly. These technologies are promoted as engines of progress and even of sustainability, yet their energy use and material needs could actually undermine global climate goals. For example, some estimates suggest that data centers in the United States could consume up to 8.6% of the country’s electricity by 2035, driven largely by AI workloads whose energy use varies enormously depending on the task. This leads to an important research question: what does responsible and informed use of AI and related technological advances look like, and how should we govern their impact to truly support sustainability?

Argument

If the growth of generative AI and data centers continues without clear limits and oversight, these technologies could harm the environment more than they help. This is not just a technical problem; it is also a question of how we govern and regulate these technologies. At present, most rules focus on issues like privacy and fairness in AI. They do not pay enough attention to environmental risks such as energy use, water consumption, and e-waste. Global governance must evolve to include clear, enforceable rules that connect AI’s growth to what the planet can handle, and that requires a more nuanced and informed understanding of both the benefits and the trade-offs.

Evidence

Energy use is a key part of the research question. AI uses energy in ways that are not always clear or easy to compare. For example, a simple ChatGPT text query uses about 0.001 kWh, while a more complex query can use 0.3 kWh. Creating a 30-second video with AI can use up to 15 kWh, roughly a thousand full smartphone charges. This wide range means companies can highlight their most efficient features while downplaying the real cost of popular, energy-hungry tools. Without clear and standardized energy reporting, it is hard for the public and policymakers to know the true impact. Understanding and governing energy use is therefore essential to answering the research question.
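
To make these figures concrete, the back-of-the-envelope calculation below reproduces the comparison. The per-query and per-video numbers are the estimates cited above; the energy assumed for one full smartphone charge is an illustrative assumption, not a figure from this article.

```python
# Back-of-the-envelope comparison of the energy figures cited above.
# The per-query and per-video numbers are rough public estimates; the
# smartphone charge figure is an assumption used only for scale.

SIMPLE_QUERY_KWH = 0.001     # simple text query
COMPLEX_QUERY_KWH = 0.3      # complex, reasoning-heavy query
VIDEO_30S_KWH = 15.0         # 30-second AI-generated video (upper estimate)
PHONE_CHARGE_KWH = 0.015     # assumed energy for one full smartphone charge

def phone_charges(kwh: float) -> float:
    """Express an energy amount as equivalent full smartphone charges."""
    return kwh / PHONE_CHARGE_KWH

for label, kwh in [("simple query", SIMPLE_QUERY_KWH),
                   ("complex query", COMPLEX_QUERY_KWH),
                   ("30-second video", VIDEO_30S_KWH)]:
    print(f"{label}: {kwh} kWh ≈ {phone_charges(kwh):,.1f} smartphone charges")

# The spread between the cheapest and costliest query is roughly 300x.
print(f"spread between queries: {COMPLEX_QUERY_KWH / SIMPLE_QUERY_KWH:,.0f}x")
```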

How data centers are powered is also central to responsible AI use. Some assume that renewable energy will solve the problem, but many tech companies are still investing in fossil fuels to meet demand. For example, Entergy is spending $3.2 billion on new natural gas plants for Meta’s data center in Louisiana, and there are similar projects in Texas and Nevada. Supporters call natural gas a “bridge” fuel, but the International Energy Agency warns that this could delay decarbonization by 10 to 15 years. Each 1 GW gas plant adds about 3.6 million metric tons of CO₂ to the atmosphere every year, the equivalent of adding roughly 780,000 cars to the road. This evidence shows that choices about energy sourcing are not just technical; they are at the heart of responsible AI governance.
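
The car-equivalence figure follows from straightforward arithmetic. The sketch below reproduces it under stated assumptions: near-continuous operation, a typical combined-cycle emission factor, and average per-car emissions, none of which are drawn from the article itself.

```python
# Reproducing the 1 GW gas plant figures cited above, under stated assumptions.

PLANT_CAPACITY_MW = 1_000          # 1 GW plant
HOURS_PER_YEAR = 8_760
CAPACITY_FACTOR = 1.0              # assumption: near-continuous baseload operation
EMISSION_FACTOR_T_PER_MWH = 0.41   # assumption: typical combined-cycle gas plant
CAR_EMISSIONS_T_PER_YEAR = 4.6     # assumption: average passenger car

annual_mwh = PLANT_CAPACITY_MW * HOURS_PER_YEAR * CAPACITY_FACTOR
annual_co2_t = annual_mwh * EMISSION_FACTOR_T_PER_MWH
car_equivalents = annual_co2_t / CAR_EMISSIONS_T_PER_YEAR

print(f"annual CO2: {annual_co2_t / 1e6:.1f} million metric tons")  # ~3.6
print(f"equivalent cars on the road: {car_equivalents:,.0f}")        # ~780,000
```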

Waste governance is another important link to the research question. A recent review found that 73% of AI sustainability research focuses on energy, while only 12% covers hardware waste. Training a large language model requires roughly 400 specialized Graphics Processing Units (GPUs), which last only three to five years. This creates a large stream of e-waste, including rare metals such as neodymium and dysprosium, and less than 20% of this hardware is recycled. Improper disposal pollutes water and soil in places like Malaysia and Ghana. Managing the full lifecycle of hardware is therefore an essential part of responsible and informed AI use.
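
A rough sense of the turnover involved can be sketched from the figures above. The per-GPU mass below is an illustrative assumption, not a figure from the article, so the result should be read only as an order-of-magnitude estimate for a single training cluster.

```python
# Rough annualized hardware turnover for one large training cluster,
# using the lifecycle figures cited above; the per-GPU mass is an assumption.

GPUS_PER_TRAINING_CLUSTER = 400
LIFESPAN_YEARS = 4            # midpoint of the 3-5 year range cited above
RECYCLING_RATE = 0.20         # "less than 20% recycled" treated as an upper bound
GPU_MASS_KG = 2.0             # assumption: accelerator card plus heatsink

retired_per_year = GPUS_PER_TRAINING_CLUSTER / LIFESPAN_YEARS
unrecycled_ewaste_kg = retired_per_year * GPU_MASS_KG * (1 - RECYCLING_RATE)

print(f"GPUs retired per year: {retired_per_year:.0f}")
print(f"unrecycled e-waste: {unrecycled_ewaste_kg:.0f} kg per year, per cluster")
```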

Sustainable Use Examples

Some companies are trying new approaches to these challenges. For example, Google’s TPU v5 chips are designed so that 80% of the parts can be reused. Microsoft’s Azure Circular Cloud program uses AI to predict when servers need repairs, so they last longer. AI-powered recycling can now sort GPU parts with 92% accuracy and recover 98% pure silicon. Together, these steps could cut AI’s material footprint by 40% by 2030. However, most companies will not adopt them without strong rules and incentives. This shows that governance and policy frameworks are needed to make these solutions mainstream, and it links directly back to our research question.

Hardware is also becoming more energy efficient, but not fast enough. The power needed for AI models doubles every 100 days, outpacing efficiency improvements. Cooling systems also consume enormous amounts of water. For example, Nevada’s Switch data centers use 12 million gallons every day, and Singapore halted new data centers after they reached 7% of the country’s water use. These examples show that technical progress alone does not guarantee responsible use.
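
The arithmetic behind this mismatch is stark. The sketch below treats the doubling figure as growth in underlying compute demand and assumes, purely for illustration, a 40% annual improvement in energy per unit of compute; the efficiency figure is an assumption, not drawn from the article.

```python
# What "doubles every 100 days" implies over one year, compared against an
# assumed annual hardware-efficiency gain (the 40% figure is an assumption).

DOUBLING_PERIOD_DAYS = 100
DAYS_PER_YEAR = 365
EFFICIENCY_GAIN_PER_YEAR = 0.40   # assumed improvement in energy per unit of compute

demand_growth = 2 ** (DAYS_PER_YEAR / DOUBLING_PERIOD_DAYS)   # ~12.6x per year
net_growth = demand_growth * (1 - EFFICIENCY_GAIN_PER_YEAR)   # growth net of efficiency

print(f"compute demand growth per year: {demand_growth:.1f}x")
print(f"net energy growth after efficiency gains: {net_growth:.1f}x")
```

Even under a generous efficiency assumption, energy demand still grows several-fold each year, which is the point the paragraph above makes in words.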

Counterargument

Some argue that new technology will fix these problems. For example, AMD’s 3D V-Cache, Lightmatter’s optical chips, and Intel’s neuromorphic computing are all far more efficient. However, the Harvard Business Review warns that better technology alone is not enough: without limits, AI’s emissions could cancel out all the gains from new renewable energy by 2031. Current rules are also not strong enough to enforce such limits or to ensure that efficiency improvements translate into real reductions in impact. This again shows that governance, not just innovation, is central to responsible AI use.

Policy Recommendations

Governments need to set clear standards for energy reporting and emissions transparency for all large-scale AI operations. This would help ensure that energy use is measured, reported, and compared in a standard way. Governments should also require data centers to use a certain percentage of renewable energy and set limits on fossil fuel investments for new facilities. These steps would help align AI growth with climate goals.
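
As a minimal sketch, a standardized disclosure record might look something like the following. The field names, units, and the 60% renewable floor are illustrative assumptions, not part of any existing standard.

```python
from dataclasses import dataclass

# A minimal sketch of a standardized energy-disclosure record for a data center.
# The fields, units, and threshold are illustrative assumptions.

@dataclass
class FacilityEnergyReport:
    facility_id: str
    reporting_year: int
    total_energy_mwh: float          # total electricity consumed
    renewable_share: float           # fraction of energy from renewables (0-1)
    water_use_megalitres: float      # cooling water consumed
    scope2_emissions_t_co2e: float   # market-based scope 2 emissions

    def meets_renewable_floor(self, floor: float = 0.6) -> bool:
        """Check the report against a hypothetical minimum renewable share."""
        return self.renewable_share >= floor

# Example: a hypothetical facility falling short of a 60% renewable floor.
report = FacilityEnergyReport("dc-001", 2025, 850_000, 0.45, 1_200, 160_000)
print(report.meets_renewable_floor())   # False
```

Even a simple, common schema like this would let regulators and the public compare facilities on a like-for-like basis, which is the core of the transparency recommendation above.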

Companies must adopt best practices for hardware reuse and recycling, such as modular chip designs and predictive maintenance programs. Firms should also publicly report on their material sourcing and recycling rates. This would help make responsible AI use a core part of business strategy, not just a marketing claim.

Both governments and companies should work together to develop incentives for innovations that reduce water and energy use, and to support research into the full lifecycle impacts of AI hardware and software. These actions would help ensure that the benefits of AI are balanced with a clear understanding of the trade-offs.

Conclusion

The AI sector is at a turning point. The promise of greener, faster supply chains is achievable, but only if we resolve the innovation vs sustainability paradox. We already have the tools: real-time emissions tracking, hardware that can be reused, and pricing that reflects the true carbon cost. What is missing is the will to use them and the policy frameworks to require them. If we do nothing, data center emissions could exceed those of aviation by 2028. The next step must therefore be clear rules and shared responsibility. Only then can AI help the planet instead of harming it.

Pratul Malthumkar is a 2026 candidate for the Master of Arts in International Relations (MAIR) at Johns Hopkins University’s School of Advanced International Studies (SAIS). He previously earned a Bachelor of Commerce (Honours) in Accounting from Shri Ram College of Commerce, University of Delhi. Driven by a passion for the human stories behind the numbers, Pratul aspires to join the Indian Foreign Service and shape India’s diplomatic engagement. He is currently interning with the Foundation for Global Governance and Sustainability (FOGGS) on issues at the nexus of global governance and development.

