AI Needs Ever More Electricity—And Google Wants Us to Pay for It

Uncle Sugar’s “National Emergency” Pitch to Congress

At a recent Congressional hearing, former Google CEO Eric “Uncle Sugar” Schmidt delivered a message that was as jingoistic as it was revealing: if America wants to win the AI arms race, it better start building power plants. Fast. But the subtext was even clearer—he expects the taxpayer to foot the bill because, you know, the Chinese Communist Party. Yes, when it comes to fighting the Red Menace, the all-American boys in Silicon Valley will stand ready to fight to the last Ukrainian, or Taiwanese, or even Texan.

Testifying before the House Energy & Commerce Committee on April 9, Schmidt warned that AI’s natural limit isn’t chips—it’s electricity. He projected that the U.S. would need 92 gigawatts of new generation capacity—the equivalent of nearly 100 nuclear reactors—to keep up with AI demand.

Schmidt didn’t propose that Google, OpenAI, Meta, or Microsoft pay for this themselves, just like they didn’t pay for broadband penetration. No, Uncle Sugar pushed for permitting reform, federal subsidies, and government-driven buildouts of new energy infrastructure. In plain English? He wants the public sector to do the hard and expensive work of generating the electricity that Big Tech will profit from.

Will This Improve the Grid?

And let’s not forget: the U.S. electric grid is already dangerously fragile. It’s aging, fragmented, and increasingly vulnerable to cyberattacks, electromagnetic pulse (EMP) weapons, and even extreme weather events. Pouring public money into ultra-centralized AI data infrastructure—without first securing the grid itself—is like building a mansion on a cracked foundation.

If we are going to incur public debt, we should prioritize resilience, distributed energy, grid security, and community-level reliability—not a gold-plated private infrastructure buildout for companies that already have trillion-dollar valuations.

Big Tech’s Growing Appetite—and Private Hoarding

This isn’t just a future problem. The data center buildout is already in full swing and your Uncle Sugar must be getting nervous about where he’s going to get the money from to run his AI and his autonomous drone weapons. In Oregon, where electricity is famously cheap thanks to the Bonneville Power Administration’s hydroelectric dams on the Columbia River, tech companies have quietly snapped up huge portions of the grid’s output. What was once a shared public benefit—affordable, renewable power—is now being monopolized by AI compute farms whose profits leave the region for bank accounts in Silicon Valley.

Meanwhile, Microsoft is investing in a nuclear-powered data center next to the defunct Three Mile Island reactor—but again, it’s not about public benefit. It’s about keeping Azure’s training workloads running 24/7. And don’t expect them to share any of that power capacity with the public—or even with neighboring hospitals, schools, or communities.

Letting the Public Build Private Fortresses

The real play here isn’t just to use public power—it’s to get the public to build the power infrastructure, and then seal it off for proprietary use. Moats work both ways.

That includes:
– Publicly funded transmission lines across hundreds of miles to deliver power to remote server farms;
– Publicly subsidized generation capacity (nuclear, gas, solar, hydro—you name it);
– And potentially, prioritized access to the grid that lets AI workloads run while the rest of us face rolling blackouts during heatwaves.

All while tech giants don’t share their models, don’t open their training data, and don’t make their outputs public goods. It’s a privatized extractive model, powered by your tax dollars.

Been Burning for Decades

Don’t forget: Google and YouTube have already been burning massive amounts of electricity for 20 years. It didn’t start with ChatGPT or Gemini. Serving billions of search queries, video streams, and cloud storage events every day requires a permanent baseload—yet somehow this sudden “AI emergency” is being treated like a surprise, as if nobody saw it coming.

If they knew this was coming (and they did), why didn’t they build the power? Why didn’t they plan for sustainability? Why is the public now being told it’s our job to fix their bottleneck?

The Cold War Analogy—Flipped on Its Head

Some industry advocates argue that breaking up Big Tech or slowing AI infrastructure would be like disarming during a new Cold War with China. But Gail Slater, the Assistant Attorney General leading the DOJ’s Antitrust Division, pushed back forcefully—not at a hearing, but on the War Room podcast.

In that interview, Slater recalled how AT&T tried to frame its 1980s breakup as a national security threat, arguing it would hurt America’s Cold War posture. But the DOJ did it anyway—and it led to an explosion of innovation in wireless technology.

“AT&T said, ‘You can’t do this. We are a national champion. We are critical to this country’s success. We will lose the Cold War if you break up AT&T,’ in so many words. … Even so, [the DOJ] moved forward … America didn’t lose the Cold War, and … from that breakup came a lot of competition and innovation.”

“I learned that in order to compete against China, we need to be in all these global races the American way. And what I mean by that is we’ll never beat China by becoming more like China. China has national champions, they have a controlled economy, et cetera, et cetera.

We win all these races and history has taught by our free market system, by letting the ball rip, by letting companies compete, by innovating one another. And the reason why antitrust matters to that picture, to the free market system is because we’re the cop on the beat at the end of the day. We step in when competition is not working and we ensure that markets remain competitive.”

Slater’s message was clear: regulation and competition enforcement are not threats to national strength—they’re prerequisites to it. So there’s no way that the richest corporations in commercial history should be subsidized by the American taxpayer.

Bottom Line: It’s Public Risk, Private Reward

Let’s be clear:

– They want the public to bear the cost of new electricity generation.
– They want the public to underwrite transmission lines.
– They want the public to streamline regulatory hurdles.
– And they plan to privatize the upside, lock down the infrastructure, keep their models secret and socialize the investment risk.

This isn’t a public-private partnership. It’s a one-way extraction scheme. America needs a serious conversation about energy—but it shouldn’t begin with asking taxpayers to bail out the richest companies in commercial history.

AI is Using Forks and Knives to Eat Your Bacon…and then there’s the political clout

The Internet is an extraordinary electricity hog. You know this intuitively even if you’ve never studied just how big a hog it really is. AI has already pushed that electricity use to extraordinary new levels. These hogs will ultimately consume the farm if the herd is not thinned out.

This is nothing new. Consider YouTube. First of all, YouTube has long been the second largest search engine in the world. So there’s that. Reportedly, YouTube’s aggregate audience watches over 1 billion viewing hours per day.

To put that in context, the electricity burned by YouTubers works out to approximately 600 terawatt-hours (TWh) per year. (A terawatt-hour is the energy delivered by one terawatt of power running for one hour; the prefix ‘tera’ signifies 10^12, so one terawatt equals one trillion watts.)

600 TWh is roughly 2.5% of global electricity use, exceeds the combined consumption of all data centers and data transmission networks worldwide, and “could power an American household for about 2 billion years. Or all 127 million U.S. households for about 8 years.”
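The unit arithmetic behind that 2.5% figure is easy to check. A minimal sketch in Python, assuming annual global electricity use of roughly 24,000 TWh (that denominator is an assumption for illustration, not a figure from the text):

```python
# Back-of-the-envelope unit conversions for the YouTube figure above.
# Assumption: global electricity use of roughly 24,000 TWh per year.
YOUTUBE_TWH = 600
GLOBAL_TWH = 24_000  # assumed annual global electricity use

watt_hours = YOUTUBE_TWH * 10**12   # 'tera' = 10^12, so 600 TWh = 6e14 Wh
share = YOUTUBE_TWH / GLOBAL_TWH

print(f"{watt_hours:.1e} Wh")                    # 6.0e+14 Wh
print(f"{share:.1%} of global electricity use")  # 2.5%
```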

And that’s just YouTube.

AI’s electricity consumption is a growing concern. Data centers worldwide, which store the information required for online interactions, account for about 1 to 1.5 percent of global electricity use. The AI boom could increase this figure significantly.

A study by Alex de Vries, a data scientist at the central bank of the Netherlands, estimates that by 2027, NVIDIA could be shipping 1.5 million AI server units per year. These servers, running at full capacity, would consume at least 85.4 terawatt-hours (TWh) of electricity annually. That is more than what many countries use in a year.
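De Vries's 85.4 TWh figure is consistent with each server drawing about 6.5 kW around the clock. The per-unit wattage below is an inference for illustration (roughly the rated draw of an NVIDIA DGX A100 system), not a number stated in the text:

```python
# Reconstructing the de Vries estimate: 1.5 million AI servers at full load.
# Assumption: ~6.5 kW continuous draw per server (approx. a DGX A100 rating).
SERVERS = 1_500_000
KW_PER_SERVER = 6.5
HOURS_PER_YEAR = 8_760

kwh = SERVERS * KW_PER_SERVER * HOURS_PER_YEAR
twh = kwh / 1e9                # 1 TWh = 1e9 kWh
print(f"{twh:.1f} TWh/year")   # 85.4 TWh/year
```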

AI’s energy footprint does not end with training. After training comes the “inference” phase, when the model is put to work serving users. Generating outputs from prompts also uses a significant amount of computing power, and thus electricity. For example, generative AI like ChatGPT could cost 564 megawatt-hours (MWh) of electricity a day to run.
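Annualizing that daily figure is a straightforward conversion (no additional estimate involved):

```python
# ChatGPT inference cost from the figure above: 564 MWh per day.
MWH_PER_DAY = 564

gwh_per_year = MWH_PER_DAY * 365 / 1_000   # 1 GWh = 1,000 MWh
twh_per_year = gwh_per_year / 1_000        # 1 TWh = 1,000 GWh
print(f"{gwh_per_year:.0f} GWh/year (~{twh_per_year:.2f} TWh/year)")
# 206 GWh/year (~0.21 TWh/year)
```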

According to de Vries, “In 2021, Google’s total electricity consumption was 18.3 TWh, with AI accounting for 10%–15% of this total. The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase compared to its historical AI-related energy consumption.” Remember, that’s just Google.
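The range implied by the quoted figures works out as follows (simple arithmetic on the numbers as stated):

```python
# Google's 2021 AI electricity use implied by the de Vries quote.
GOOGLE_TWH_2021 = 18.3
low, high = 0.10, 0.15    # AI's share of the total, per the quote

print(f"AI in 2021: {GOOGLE_TWH_2021 * low:.1f}-{GOOGLE_TWH_2021 * high:.1f} TWh")
# AI in 2021: 1.8-2.7 TWh
# Worst case of 29.3 TWh/year is therefore roughly 11x-16x the 2021 level.
```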

And then there’s bitcoin mining. Bitcoin mining involves computers across the globe racing to complete a computation that produces a 64-digit hexadecimal number, or hash, for each new block of Bitcoin transactions. This hash goes into a public ledger so anyone can confirm that the transactions in that block happened. The computer that solves the computation first gets a reward of newly issued bitcoins (6.25 per block before the April 2024 halving, 3.125 after).
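The “64-digit hexadecimal number” is a SHA-256 digest; miners repeatedly vary a nonce until the hash falls below a difficulty target, and it is that brute-force search that burns the electricity. A toy illustration with an artificially easy target (not real mining, which hashes an 80-byte block header against a vastly harder target):

```python
import hashlib

def mine(data: str, difficulty: int = 4) -> tuple[int, str]:
    """Find a nonce so SHA-256(data + nonce) starts with `difficulty` zeros."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # digest is always 64 hex digits
        nonce += 1

nonce, digest = mine("block header bytes")
print(nonce, digest)  # a 64-hex-digit hash beginning with "0000"
```

Each failed attempt is wasted work; real mining rigs perform this loop trillions of times per second, which is where the terawatt-hours go.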

Bitcoin mining consumes a significant amount of electricity. In May 2023, Bitcoin mining was estimated to consume around 95.58 terawatt-hours of electricity. It reached its highest annual electricity consumption in 2022, peaking at 204.5 terawatt-hours, surpassing the electricity consumption of Finland.

By some estimates, a single Bitcoin transaction can spend up to 1,200 kWh of energy, which is equivalent to almost 100,000 VISA transactions. Another estimate is that Bitcoin uses more than 7 times as much electricity as all of Google’s global operations.
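Taking those two figures at face value, the implied energy cost of a single VISA transaction follows directly (this is arithmetic on the numbers as stated, not an independent estimate):

```python
# Implied per-VISA energy from the comparison above.
BTC_KWH_PER_TX = 1_200    # kWh per Bitcoin transaction, per the estimate
VISA_EQUIVALENT = 100_000  # VISA transactions per Bitcoin transaction

wh_per_visa = BTC_KWH_PER_TX / VISA_EQUIVALENT * 1_000  # kWh -> Wh
print(f"~{wh_per_visa:.0f} Wh per VISA transaction")    # ~12 Wh
```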

Back to AI, remember that Big Tech requires big data centers. According to Bloomberg, AI is currently–no pun intended–using so much electrical power that coal plants utilities had planned to shut down for climate sustainability are staying online, or being brought back online if they had already been shut down. For example, Virginia has been suffering from this surge in usage:

In a 30-square-mile patch of northern Virginia that’s been dubbed “data center alley,” the boom in artificial intelligence is turbocharging electricity use. Struggling to keep up, the power company that serves the area temporarily paused new data center connections at one point in 2022. Virginia’s environmental regulators considered a plan to allow data centers to run diesel generators during power shortages, but backed off after it drew strong community opposition.

It’s also important to realize that building data centers in states far flung from the Silicon Valley heartland also increases Big Tech’s political clout. This explains why Oregon Senator Ron Wyden is the confederate of the worst of Big Tech’s excesses, like child exploitation and, of course, copyright. Copyright never had a worse enemy, all because of Big Tech’s huge data center presence in Oregon–not their headquarters or anything obvious.

Wyden with his hand in his own pocket.

A terawatt here and a terawatt there and pretty soon you’re talking about a lot of electricity. So if you’re interested in climate change, there’s a lot of material here to work with. Maybe we do this before we slaughter the cattle, just sayin’.