The Weight of Light: AI, Computation, and the Cost of Thinking Big

While AI illuminates vast possibilities, it casts long shadows of energy consumption, carbon emissions, and resource depletion, writes Shagun Tripathi.

At 3 a.m. in the Arizona desert, inside a data center lit by LEDs and cooled by evaporating water, GPT-4 is wide awake. Outside, the ground is dry and silent. Inside, air circulates over racks of humming machines, each one drawing energy, releasing heat, and consuming water just to keep thinking.

This glowing machinery is the infrastructure of modern intelligence. And we are still blinking in its light, much like the prisoners in Plato’s Republic. Using the double allegory of the cave and the sun, Plato describes prisoners chained in a cave since childhood, watching shadows flicker on a wall. Having never seen anything else, they believe the shadows are the only reality, until one day a prisoner escapes and stumbles out into the sunlight. Disoriented and in pain at first, he slowly adapts to the “light” of the sun. It is a timeless metaphor for knowledge and truth.

In many ways, artificial intelligence is our modern technological “sun”. It is no wonder, then, that we are using this “light” to predict climate trends, redesign energy systems, reimagine economies, discover life-saving drugs, solve intractable problems, and reconfigure the very fabric of our societies. Keeping with the metaphor, AI is intelligent, fast, and increasingly omnipresent. But here’s the twist: unlike the real sun, our algorithmic sun is not a self-sustaining fusion reactor. It runs on servers, consumes electricity, and leaves a carbon footprint. Behind every chatbot conversation, image generation, and algorithmic insight lies a vast, invisible infrastructure of computation that is data- and resource-hungry.

This brings us to the defining question of our times. On one hand, AI holds immense promise: it can amplify human intelligence, extend productivity, and unlock innovation at unprecedented scale. On the other lies a mounting cost: AI systems are increasingly data-hungry, computation-intensive, and resource-extractive. Concerns have grown around vast energy demands, strained water supplies for data center cooling, intellectual property conflicts, and limited transparency and oversight.

Is this the paradox we now face, where intelligence has both the ability to unlock a prosperous future and the potential to do harm? Or have we made a perilous Faustian bargain, trading the planet’s future for insight, and sustainability for speed? Either way, data-driven discernment and climate models suggest that our bills will come due, even if the receipt is still printing.

To reconcile the rising entries on AI’s expense ledger, we must trace them back to their technical origin, which starts with a simple idea: in AI, more is better.

The Scaling Laws of AI’s Power – Why Bigger Is Smarter. And Hungrier.

In the engine room of modern AI lies a deceptively simple idea: if you make it bigger, it gets better. This core idea comes from a 2020 OpenAI paper by Kaplan et al. titled “Scaling Laws for Neural Language Models”. Their findings were astonishingly consistent: as you increase three factors – model size (the number of parameters), training data volume, and compute (measured in floating-point operations, or FLOPs) – AI performance rises almost without limit. No plateau. No clear ceiling. Just a steady, predictable curve of improvement. This power-law relationship underpins why so many organizations are pouring massive resources into bigger models.
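What “power law” means here can be written down directly. As a sketch of the fits Kaplan et al. reported (the constants and exponents below are their approximate empirical values, not exact laws), test loss falls as a power of each factor when the other two are not bottlenecks:

```latex
% Empirical power-law fits from Kaplan et al. (2020).
% N: parameters, D: training tokens, C: training compute (FLOPs).
L(N) \approx (N_c / N)^{\alpha_N}, \qquad \alpha_N \approx 0.076
L(D) \approx (D_c / D)^{\alpha_D}, \qquad \alpha_D \approx 0.095
L(C) \approx (C_c / C)^{\alpha_C}, \qquad \alpha_C \approx 0.050
```

The small exponents are the hungry part: cutting the loss in half through parameters alone would mean scaling N by roughly 2^(1/0.076), on the order of ten thousand times.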

However, model growth inevitably comes with an escalating energy bill. Large AI models consume energy at every stage of their lifecycle. It starts even before the learning begins, in the pre-training phase. Here, massive datasets must be gathered, cleaned, formatted, and tokenized, i.e., transformed into a form the model can understand. This preparatory work alone demands serious computing power. Then comes training, the most energy-intensive stage. Billions – and increasingly trillions – of internal parameters are adjusted over days or weeks of nonstop computation. Specialized chips hum in data centers, drawing electricity and, often, water for cooling. Finally, there is inference – the phase after deployment, when the model is actually used. Each user query triggers a cascade of calculations. One response may not cost much, but at global scale, the power draw becomes enormous. Even gratitude, it turns out, has a cost. Sam Altman, CEO of OpenAI, recently revealed that polite phrases such as “thank you” and “please” alone cost the company millions of dollars in operational expenses.
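To see why training dominates the bill, a back-of-the-envelope estimate helps. The sketch below uses the common rule of thumb that training a transformer takes about 6 FLOPs per parameter per token; every input (parameter count, token count, hardware throughput, utilization, power draw, and data-center overhead) is an illustrative assumption, not a published figure for any particular model:

```python
# Back-of-the-envelope training-energy estimate.
# Every input below is an illustrative assumption, not a published figure.

params = 175e9               # model parameters (GPT-3-scale, for illustration)
tokens = 300e9               # training tokens (assumed)
flops = 6 * params * tokens  # rule of thumb: ~6 FLOPs per parameter per token

peak_flops = 312e12    # assumed accelerator peak throughput (FLOP/s)
utilization = 0.30     # assumed fraction of peak actually sustained
watts_per_chip = 400   # assumed board power draw (W)
pue = 1.5              # assumed data-center overhead (Power Usage Effectiveness)

chip_seconds = flops / (peak_flops * utilization)
joules = chip_seconds * watts_per_chip * pue  # W x s = J
energy_mwh = joules / 3.6e9                   # 1 MWh = 3.6e9 J

print(f"training compute: {flops:.2e} FLOPs")
print(f"estimated energy: {energy_mwh:,.0f} MWh")
```

Under these assumptions the answer lands in the hundreds of megawatt-hours for a single GPT-3-scale run. The exact inputs are debatable; the order of magnitude is not.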

Designing Lean Intelligence for Sustainability: Smarter AI without the Burn

Much of AI’s recent power has come from scaling up. Bigger models, more data, more computation. But this rush for scale has triggered an urgent question: how do we keep AI’s light without getting burned? The result is a wave of engineering innovations that rethink AI’s appetite. One such development uses specialist models that activate only the parts they need, rather than running the whole system every time. Think of it like calling on the right expert from a panel, instead of assembling the entire team for every question. These “Mixture of Experts” models can be enormous on paper, but only a small portion works at a time, cutting energy use significantly.
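A minimal sketch of the routing idea, in Python with NumPy, assuming toy sizes and random weights (nothing here reflects any production architecture): a small gate scores the experts for each input, and only the top-k actually run.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n_experts, k = 8, 4, 2   # toy sizes: hidden width, expert count, experts used per token
W_gate = rng.normal(size=(d, n_experts))                       # router ("gate") weights
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]  # one weight matrix per expert

def moe_layer(x):
    """Route one token vector to its top-k experts; the rest stay idle."""
    scores = x @ W_gate                    # one gate score per expert
    top = np.argsort(scores)[-k:]          # indices of the k best-scoring experts
    w = np.exp(scores[top])
    w /= w.sum()                           # softmax over the chosen experts only
    # Only k of the n_experts matrices are multiplied: that is the energy saving.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

y = moe_layer(rng.normal(size=d))
print(y.shape)  # (8,) – same output size, roughly k/n_experts of the compute
```

The model carries the parameters of all its experts, but each token pays the compute cost of only k of them.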

Another tactic involves compression. Just as a zip file holds the same contents in less space, “distilled” models deliver similar results with far less computing power. Tools like DistilBERT and quantized models are already being used to shrink AI’s carbon footprint without sacrificing much performance. Then there’s retrieval-augmented generation (RAG), which is a fancy way of saying: don’t store everything in the model, just look it up when you need it. Rather than memorizing all knowledge in advance, these systems fetch relevant information in real time.
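The look-it-up idea fits in a few lines. The sketch below stands in for a real pipeline: the hash-based embedder and three-document “corpus” are placeholders, where a production system would use a learned embedding model and a vector database.

```python
import re
import numpy as np

docs = [
    "Data centers often use evaporative cooling, which consumes water.",
    "Power purchase agreements lock in renewable electricity for years.",
    "Distilled models deliver similar accuracy with less compute.",
]

def embed(text):
    """Stand-in embedder: hash words into a fixed-size bag-of-words vector.
    A real RAG system would call a learned embedding model here."""
    v = np.zeros(64)
    for word in re.findall(r"[a-z]+", text.lower()):
        v[hash(word) % 64] += 1.0
    return v / (np.linalg.norm(v) + 1e-9)

doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query, k=1):
    """Return the k documents most similar to the query."""
    sims = doc_vecs @ embed(query)   # cosine similarity: vectors are unit-normalized
    return [docs[i] for i in np.argsort(sims)[-k:]]

# Only the retrieved context is handed to the generator, not the whole corpus.
print(retrieve("why do data centers use so much water?"))
```

Because knowledge lives in the index rather than in the weights, the generating model itself can stay smaller.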

Beyond individual models, the core architecture of AI systems is itself an evolving frontier for engineering and research. Engineers are rethinking how models read and process information, replacing the old “read everything at once” method with more targeted, energy-efficient scanning techniques. These redesigns reduce the strain on chips and servers, cutting bloat and building sustainability into the architecture itself.
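One concrete version of this redesign is windowed (local) attention, where each token attends to a fixed neighborhood instead of the entire sequence. The toy count below (sequence length and window size are arbitrary assumptions) shows why that matters: the cost of “read everything at once” grows with the square of the input, while the windowed cost grows linearly.

```python
def attention_scores(seq_len, window=None):
    """Approximate count of query-key score computations.
    window=None -> full attention: every token attends to every token.
    window=w    -> each token attends to ~w neighbors on each side."""
    if window is None:
        return seq_len * seq_len
    return seq_len * min(seq_len, 2 * window + 1)

n = 32_000  # a long-context input, for illustration
print(f"full attention:   {attention_scores(n):,}")              # 1,024,000,000
print(f"windowed (w=256): {attention_scores(n, window=256):,}")  # 16,416,000 (~60x fewer)
```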

The ESG Mirror: Measuring the Costs and Benefits of Thinking Big

Engineering innovations show us how to feed AI less. However, to confront AI’s full impact, we must ask a harder question: how do we account for AI’s growing costs beyond dollars and FLOPs? This is where ESG comes in: to measure what matters and serve as a compass for sustainability-oriented growth. In ESG vocabulary, materiality refers to impacts significant enough that they must be measured and disclosed. A close look reveals that on the environmental (E) front, AI is no longer immaterial. Training a model like GPT-3 is estimated to have consumed roughly 1,287 megawatt-hours of electricity – the equivalent of powering over 120 U.S. homes for a year. Inference adds further load as millions of queries run through servers, day and night. Furthermore, there are water-related costs: Microsoft reported a 34% spike in water consumption during a year of AI expansion, much of it for cooling data centers. Add rising e-waste from short-lived GPUs and AI accelerators, and the environmental costs become undeniably material.
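The household comparison is simple arithmetic. Average U.S. residential electricity use is roughly 10.5 megawatt-hours per year (the figure below is an approximation of the EIA-reported average):

```python
training_mwh = 1_287       # estimated GPT-3 training energy, as cited above
home_mwh_per_year = 10.5   # approx. average annual U.S. household electricity use
print(f"{training_mwh / home_mwh_per_year:.0f} home-years")  # ~123 home-years
```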

Yet this is only one side of the story. AI is also being used to optimize power grids, reduce food waste in commercial kitchens, and improve resource efficiency in agriculture. DeepMind’s data center system cut cooling energy use by 40%, and platforms like Microsoft’s FarmBeats help farmers irrigate more precisely, cutting water waste.

The social (S) impact of AI is just as complex. Automation threatens a wide swath of jobs, from call centers to content production, raising familiar questions about labor displacement. Even as jobs are threatened, new forms of invisible labor have emerged: annotators, moderators, and data labelers working in precarious conditions for cents per task. On the positive side, AI has opened new doors to healthcare, education, and financial services. Diagnostic models now match or exceed human doctors in detecting certain conditions. Language apps adapt to each user’s pace. Financial inclusion is expanding through AI-based credit scoring, while tools like Microsoft’s Seeing AI help visually impaired users navigate the world.

Finally, governance (G) is where the system feels most brittle. Many companies have AI ethics principles, but few enforce them transparently. Most large models remain black boxes, opaque in both design and operation. Meanwhile, greenwashing through low-quality carbon offsets continues to erode corporate credibility. Yet here, too, regulatory tools are emerging: impact assessments, the EU’s AI Act, and open frameworks for auditing models are early steps in this direction.

It is important to note that with its ability to process massive, unstructured datasets across languages and formats, AI shows tremendous potential for advancing ESG goals, helping companies measure emissions, monitor supplier behavior, and flag inconsistencies in ESG disclosures. Natural language processing models can scan thousands of regulatory filings, media reports, and sustainability documents to detect gaps or greenwashing risks. Machine learning tools now assist in predicting environmental risks, automating climate scenario modeling, and generating dynamic ESG dashboards in real time. Tools like the Bank for International Settlements’ Project Gaia use AI to assess climate-related financial exposures at scale. What once took analysts weeks can now be done in hours, with greater accuracy and fewer blind spots. In a landscape where ESG data is often fragmented, unaudited, and self-reported, AI offers a chance to shift ESG from a branding exercise into a verifiable, accountable system.
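As a toy illustration of that scanning idea (a deliberately crude stand-in: production systems use trained language models, not keyword rules), a few lines of Python can flag disclosure passages that make green claims without quantified backing:

```python
import re

VAGUE = re.compile(r"\b(eco-friendly|green|sustainable|carbon[- ]neutral)\b", re.I)
QUANTIFIED = re.compile(r"\d[\d,.]*\s*(%|tCO2e?|MWh|kWh|tonnes?)", re.I)

def flag_greenwashing_risk(passages):
    """Return passages that make green claims with no quantified metric."""
    return [p for p in passages if VAGUE.search(p) and not QUANTIFIED.search(p)]

filings = [
    "Our operations are fully sustainable and eco-friendly.",
    "Scope 2 emissions fell 12% to 48,300 tCO2e in FY2024.",
]
print(flag_greenwashing_risk(filings))  # only the unquantified claim is flagged
```

Real deployments replace the rules with models trained on labeled disclosures, but the shape of the task – scan, match claims to evidence, flag gaps – is the same.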

The Ripple Effects: Carbon, Energy, and Water in AI’s Wake

What ESG helps measure is only part of the AI sustainability story. The rest is playing out as ripple effects across carbon markets, energy systems, and water supplies. Shifts in carbon markets, for instance, have begun to reveal the price of AI’s emissions. Carbon credits, traditionally used by heavy industry, are now being absorbed by the tech sector at scale: Microsoft signed a deal to purchase 3.5 million carbon removal credits to offset the impact of its AI-driven data center expansion. The risk, though, is that credits become a financial workaround that allows emissions to rise while accountability lags.

Close on the heels of carbon credit markets are renewable energy markets, where AI’s appetite for clean power is changing the terrain. Tech giants like Google, Amazon, and Microsoft are locking in vast amounts of clean energy through long-term Power Purchase Agreements. Although these deals help future-proof their infrastructure, they can distort renewable markets, driving up prices in some regions and squeezing out smaller players. The mismatch between when renewables are generated and AI’s constant demand is already surfacing in energy policy debates – a sign that even clean intelligence can strain the grid.

Finally, water and local utility systems have begun to show AI-related strain – because, it turns out, even AI needs to drink. Many data centers use evaporative cooling systems that draw millions of gallons from local water supplies; Microsoft’s 34% year-on-year increase in water use, noted above, is one marker of the trend. Concerns have emerged in states like Arizona and Iowa, where freshwater is scarce and infrastructure is aging, and local communities are beginning to push back, sometimes calling for moratoriums or transparency mandates. As AI becomes a permanent fixture in the digital economy, its physical presence is being felt in places most tech users never think about.

Designing Intelligence for the World We Want

Once symbolic of the future, AI is now a material force with real-world weight. And unlike the sun in Plato’s allegory, its artificial light is not self-sustaining. It runs on electricity, draws water, and leaves emissions in its wake. As the sustainability receipt continues printing, we must confront the cost, not to halt progress, but to shape it with clarity. For researchers, especially in business schools, this opens critical ground: lifecycle accounting for computing infrastructure, the economics of data labor, and governance models for responsible scale. Computer scientists have an array of promising research directions for the AI sustainability challenge, while social scientists, philosophers, and legal experts can add evidence and recommendations for the future we are building. From a policy perspective, ESG frameworks must evolve to account for the growing reliance on digital infrastructure and the often-overlooked impact of technological design choices on energy consumption, human behavior, and societal outcomes. For firms, sustainable intelligence means consciously embedding environmental metrics into model development, infrastructure planning, and procurement.

Ultimately, when aligned with intent, the same systems that emit carbon and consume water can also optimize supply chains and energy grids, discover drugs, accelerate scientific progress, and create new opportunities for growth. We may have made a Faustian bargain, but the design is not final. The light is powerful. Whether it clarifies or consumes depends on how we govern it now.

 

© IE Insights.
