What if I told you that powering the next generation of artificial intelligence might require as much electricity as two major U.S. cities? That’s exactly what experts are saying about Sam Altman’s AI empire. According to a recent Fortune article making waves on Reddit, the company behind ChatGPT and other advanced AI tools could soon need more energy than New York City *and* San Diego combined. Sounds wild, right?
How Much Power Could Sam Altman’s AI Really Use?
Let’s put it into perspective. New York City alone uses roughly 11 gigawatts of electricity at peak demand. Toss in San Diego’s 3 gigawatts or so, and you’re looking at around 14 gigawatts in total, a massive chunk of the U.S. power grid. Now imagine one tech company needing that much juice just to keep its servers humming and its AIs answering questions.
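To make that scale concrete, here’s a quick back-of-envelope sketch using the rough peak-demand figures above (the ~11 GW and ~3 GW numbers are the article’s illustrative estimates, not precise grid data):

```python
# Rough scale check using the article's illustrative peak-demand figures.
NYC_PEAK_GW = 11.0        # New York City, approximate peak demand
SAN_DIEGO_PEAK_GW = 3.0   # San Diego, approximate peak demand

combined_gw = NYC_PEAK_GW + SAN_DIEGO_PEAK_GW
print(f"Combined peak demand: ~{combined_gw:.0f} GW")

# For a sense of annual scale: running that load flat-out all year.
hours_per_year = 24 * 365
annual_twh = combined_gw * hours_per_year / 1000  # GWh -> TWh
print(f"If sustained year-round: ~{annual_twh:.0f} TWh/year")
```

That’s on the order of 14 gigawatts of peak capacity, and well over a hundred terawatt-hours a year if it ran continuously. Real data centers don’t run at peak 24/7, so treat this as an upper-bound illustration, not a forecast.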
Why is this happening? Well, powering artificial intelligence isn’t just about plugging in a few laptops. Training massive models like GPT-4 or running endless queries for millions of users means thousands upon thousands of high-powered graphics cards running 24/7 in data centers all over the country (and beyond). Each new leap in capability eats up even more energy.
Experts Are Sounding the Alarm
It’s not just armchair commentators raising their eyebrows—industry veterans are calling these numbers “scary.” The worry isn’t only about the sheer amount of electricity needed; it’s also about where that energy comes from and what it means for everything from climate change to your monthly electric bill.
Here are some big concerns experts have flagged:
- Strain on local power grids: Huge data centers can overwhelm existing infrastructure.
- Carbon emissions: If most of that power comes from fossil fuels, emissions will skyrocket.
- Water usage: Cooling these server farms often takes millions of gallons of water.
- Sustainability: Can renewable energy keep up with explosive tech growth?
- Inequality: Smaller players may get priced out if energy costs spike.
In short, building smarter machines could mean making some tough choices about how we power them—and who gets access.
The Human Side: A Data Center Next Door
On a personal note, I’ll never forget when a new data center popped up near my hometown. Overnight, our quiet rural roads were crowded with trucks hauling in generators and transformers. Folks started noticing brownouts during summer heatwaves—not just because people were cranking their ACs but because those shiny new servers were gobbling up local supply.
Local officials promised tax revenue and jobs (and sure, there were some), but we still scratched our heads every time the lights flickered or the water pressure dipped. Multiply that one facility by thousands across the country, each pushing technological boundaries—and you get an idea why experts are raising red flags about projects on the scale of Sam Altman’s AI ambitions.
What Happens Next?
So what does all this mean for you—and for everyone else who relies on a stable electric grid? Well, it depends on how quickly we can shift toward cleaner energy sources and smarter infrastructure planning. Some companies are racing to build solar and wind farms alongside their data centers; others are experimenting with liquid cooling or even small nuclear reactors.
But there aren’t any easy answers here. If artificial intelligence keeps advancing at breakneck speed—and if demand for tools like ChatGPT keeps growing—the pressure on our already-strained grids will only get worse.
At the end of the day, should we be excited about what powerful new AIs can do—or worried about whether we can afford to keep them running? Maybe both! What do you think—is society ready for an “AI empire” that needs as much power as entire cities?