Italy’s second-richest man makes a disturbing comment about artificial intelligence: companies are, almost without realizing it, training their own replacements

May 3, 2026 - 13:00
Andrea Pignataro, founder and CEO of ION Group, says many businesses are panicking about AI for the wrong reason. In an op-ed published February 17, 2026, he argues the deeper danger is collective behavior, because companies adopt AI to stay competitive while also training systems that can one day sideline them.

There is a climate mirror of that idea, and it is easier to measure. The International Energy Agency says data centers used about 485 billion kilowatt-hours of electricity globally in 2025 and could reach about 950 billion kilowatt-hours by 2030, roughly 3% of the world’s electricity demand.

Pignataro’s “tragedy of the commons”

Pignataro wrote that more than $2 trillion in market value vanished from enterprise software between late January and mid-February 2026 as investors assumed AI agents could replace today’s tools. He calls that a substitution fallacy, arguing that enterprise software is also an institutional layer for permissions, audits, and compliance, which is hard to swap out overnight.

His warning goes further than software. When firms use AI in everyday workflows, he argues they collectively teach platforms the “grammar” of an industry, and he writes that “every customer is simultaneously a revenue source and a training signal.”

A parallel problem for the planet

In environmental terms, the shared resource is electricity, water, and the stable climate those systems depend on. Each company that adds AI copilots, agentic tools, or video generation is making a rational choice in isolation, but the combined demand forces more infrastructure to be built.

The IEA reports that global data center electricity demand grew 17% in 2025, while demand at AI-focused data centers surged 50% in a single year. That is fast enough to outpace the normal rhythm of grid planning.

The scale of electricity demand

The United States is a clear example of how quickly this can show up on a map. A Department of Energy-backed Lawrence Berkeley National Laboratory report estimates U.S. data centers used about 176 billion kilowatt-hours in 2023, or 4.4% of total U.S. electricity. It projects a range of about 325 to 580 billion kilowatt-hours by 2028, up to 12% of U.S. power use.

The IEA adds a vivid detail that helps explain why communities are paying attention. It says a single rack of AI servers could hit peak power demand comparable to about 65 homes by 2027, and one rack’s heat output can rival dozens of gas boilers.

Water use is rising alongside power

Cooling is not just an engineering detail – it is a local ecological issue. Berkeley Lab estimates direct water consumption by U.S. data centers rose from about 21.2 billion liters in 2014, roughly 5.6 billion gallons, to about 66 billion liters in 2023, roughly 17.4 billion gallons.

It also estimates an indirect water footprint through electricity generation of nearly 800 billion liters in 2023, about 211 billion gallons. Some cooling setups save energy but use more water, while others use less water but push up electricity demand, so the tradeoffs can shift stress from a river basin to the power grid.
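Those liter-to-gallon figures are easy to sanity-check. A quick sketch (the constant and variable names here are my own, not from the report):

```python
# Sanity check of the Berkeley Lab water figures quoted above.
# The liter values are as reported; the gallon values are derived.
LITERS_PER_GALLON = 3.78541  # U.S. liquid gallon

figures_liters = {
    "direct use, 2014": 21.2e9,    # direct water consumption, U.S. data centers
    "direct use, 2023": 66e9,
    "indirect (power generation), 2023": 800e9,
}

for label, liters in figures_liters.items():
    gallons = liters / LITERS_PER_GALLON
    print(f"{label}: {gallons / 1e9:.1f} billion gallons")
```

Running it reproduces the article’s rounded numbers: roughly 5.6, 17.4, and 211 billion gallons.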

Emissions depend on what powers the servers

The same Berkeley Lab report estimates the electricity used by U.S. data centers in 2023 corresponded to about 61 billion kilograms of CO2 equivalent, roughly 67 million short tons. It also reports an average emissions intensity near 0.75 pounds of CO2 equivalent per kilowatt-hour for that electricity mix.

Globally, the IEA projects data center emissions could roughly double and reach around 350 million metric tons of CO2 equivalent in 2035, about 386 million short tons. It also warns that community pushback is growing as concerns rise about affordability and environmental impacts.

Chips, supply chains, and e-waste

AI is also a materials story. The IEA says bottlenecks have tightened across advanced chip manufacturing, including a shortage of high-bandwidth memory that it expects to last through at least the end of 2027.

And more hardware usually means more waste unless recycling catches up. The U.N.-backed Global E-waste Monitor reported 62 billion kilograms of e-waste generated worldwide in 2022, about 68 million U.S. tons, with only 22.3% documented as formally collected and recycled.

AI can still support the clean energy transition

The environmental case is not one-sided. The IEA says energy use per simple AI text task has dropped sharply, and that if all conventional internet searches were done as simple AI text queries, the annual electricity demand would be under 4 terawatt-hours, less than 1% of today’s data center use.
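The “less than 1%” claim follows directly from the two numbers already cited: 4 terawatt-hours of hypothetical AI-search demand against the roughly 485 terawatt-hours the IEA reports for data centers in 2025. As a quick sketch (variable names are mine):

```python
# Share of 2025 data center electricity that AI-style searches
# would represent, using the IEA figures quoted in this article.
ai_search_twh = 4             # if all web searches ran as simple AI queries
data_center_twh_2025 = 485    # global data center demand, 2025

share = ai_search_twh / data_center_twh_2025
print(f"{share:.1%} of 2025 data center electricity use")
```

That works out to about 0.8%, comfortably under the 1% threshold the IEA describes.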

The larger prize is using AI to cut waste in buildings, industry, and transportation. The IEA estimates AI-enabled efficiency could unlock about 13 exajoules of savings in 2035, around 3% of global final energy consumption, but it stresses that energy policy and power sector decarbonization still matter most.

What “sustainable AI” looks like in real life

A good first step is transparency, because the IEA notes that newer uses like reasoning, agentic tools, and video generation can consume hundreds or thousands of times more energy per query than simple text generation. If companies disclosed energy use more consistently, it would be easier to compare tools, locations, and workloads.

The next step is flexibility, so data centers are not always competing with households at peak demand. The IEA projects around 20 to 25 gigawatts of battery storage could be installed at data centers globally by 2030, potentially turning them into grid assets if incentives are right. 

The global projections cited here come from the International Energy Agency; the U.S. figures come from Lawrence Berkeley National Laboratory.
