Thursday, September 25, 2025

I met Sam Altman in Texas. He’s turning the race for AI into a gigawatt arms race | Fortune


Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition… OpenAI’s gigawatt arms race is underway in Abilene, Texas… Nscale announces record-breaking $1.1 billion Series B… OpenAI and Databricks strike AI agent deal… Trump administration will make Elon Musk’s xAI available to federal agencies.

Sam Altman stood outside Building 2 at OpenAI, Oracle, and SoftBank’s flagship Stargate data center in Abilene, Texas. He — along with the cluster of journalists peppering him with questions — looked small against the backdrop of the sprawling 800-acre site, swarming with thousands of construction workers and dotted with spools of fiber cable, steel beams, water pipes, and heavy machinery.

As I reported on Tuesday, we were there for a media event to tout the progress of the high-profile and ambitious “Stargate” AI infrastructure project. OpenAI and Oracle announced an expansion of the Abilene site, plus plans to build five massive new data center complexes across the U.S. over the next several years. Altogether, the initiative represents hundreds of billions of dollars in investment — a project of mind-boggling scale. In Abilene alone, a crew of 6,400 workers has already flattened hills by moving mountains of soil and laid enough fiber optic cable to wrap the Earth 16 times.

“We cannot fall behind in the need to put the infrastructure together to make this revolution happen,” Altman told reporters during the media event, which also included Clay Magouryk, one of Oracle’s two new CEOs, as well as Texas Senator Ted Cruz. “What you saw today is just a small fraction of what this site will eventually be — and this site is just a small fraction of what we’re building. All of that still won’t be enough to serve even the demand of ChatGPT,” he added, referring to OpenAI’s flagship product.

Building AI with brute industrial force

Altman and OpenAI have been relentless in their drive to “scale compute.” By this, they don’t mean chasing the next algorithmic breakthrough or elegant line of code. They mean brute industrial force: millions of chips, sprawling campuses wired with fiber, and gigawatts of electricity — along with the gallons of water needed to help cool all that equipment. To OpenAI, scaling compute means piling on ever more of this horsepower, betting that sheer scale — not software magic — is what will unlock not just artificial general intelligence (AGI), which the company defines as “highly autonomous systems that outperform humans at most economically valuable work,” but what it calls artificial superintelligence (ASI), which would hypothetically surpass human capabilities in all domains.

That’s why OpenAI keeps pointing to a number: 10 gigawatts of capacity across the Stargate project sites by the end of 2025. Ten gigawatts — enough to power roughly 7.5 million homes, or an entire regional grid — marks a shift in how AI capacity is measured. At this scale, Altman explained to me with a quick handshake before the press gaggle, companies like OpenAI don’t even bother counting GPUs anymore. The unit of measure has become gigawatts: how much electricity the entire fleet of chips consumes. That number is shorthand for the only thing that matters: how much compute the company can keep running.

That’s why it was so striking to come home from Texas and read Alex Heath’s Sources the very next day. In it, Heath revealed an internal Slack note Altman had shared with employees on the same day I saw him in Abilene. Altman spelled out what he called OpenAI’s “audacious long-term goal”: to build not 10, not 100, but a staggering 250 gigawatts of capacity by 2033. In the note, he disclosed that OpenAI started the year at “around” 230 megawatts of capacity and is “now on track to exit 2025 north of 2GW of operational capacity.”

To put that into perspective: 250 gigawatts would be about a quarter of the entire U.S. electrical generation capacity, which hovers around 1,200 GW. And Altman isn’t just talking about electricity. The number is shorthand for the entire industrial system required to use it: the chips, the data centers, the cooling and water, the networking fiber and high-speed interconnects to tie millions of processors into supercomputers.
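The gigawatt figures above can be sanity-checked with some back-of-envelope arithmetic. In this sketch, the per-home and per-accelerator power constants are my own assumptions (chosen to line up with the article’s “roughly 7.5 million homes” figure), not numbers from OpenAI:

```python
# Back-of-envelope sketch of the gigawatt figures in the article.
# Assumed constants (not from the article): average household draw and
# per-accelerator facility power, including cooling and rack overhead.

US_GRID_CAPACITY_GW = 1200   # approximate U.S. generation capacity (from the article)
KW_PER_HOME = 1.33           # assumed average household draw, in kilowatts
KW_PER_GPU = 1.0             # assumed facility power per AI accelerator, in kilowatts

def homes_powered(gigawatts: float) -> float:
    """Rough number of homes a given capacity could supply."""
    return gigawatts * 1e6 / KW_PER_HOME

def grid_share(gigawatts: float) -> float:
    """Fraction of total U.S. generation capacity."""
    return gigawatts / US_GRID_CAPACITY_GW

def implied_gpus(gigawatts: float) -> float:
    """Why 'gigawatts' replaces GPU counts: the implied accelerator fleet size."""
    return gigawatts * 1e6 / KW_PER_GPU

print(f"10 GW ≈ {homes_powered(10):,.0f} homes")           # ≈ 7.5 million
print(f"250 GW ≈ {grid_share(250):.0%} of U.S. capacity")  # ≈ 21%, roughly a fifth to a quarter
print(f"10 GW ≈ {implied_gpus(10):,.0f} accelerators")     # why nobody counts GPUs anymore
```

Under these assumptions, 10 GW already implies a fleet on the order of ten million accelerators, which is why power draw has become the more tractable unit of measure.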

‘A new core bet’ for OpenAI

Heath reported that Altman’s Slack note announced OpenAI is “formalizing the industrial compute team,” led by Peter Hoeschele, who reports to president Greg Brockman. “The mission is simple: create and deliver massive usable compute as fast as physics allows, to power us through ASI,” Altman wrote. “In several years, I think this could be something like a gigawatt per week, although that will require us to completely reimagine how we build compute.”

“Industrial compute should be considered a new core bet (like research, consumer devices, custom chips, robotics, applications, etc.) which will hire and operate in the way it needs to run at maximum effectiveness for the domain,” Altman continued. “We’ve already invested hundreds of billions of dollars, and doing this right will cost trillions. We will need support from team members across OpenAI to help us move fast, unlock projects, and clear the path for the buildout ahead.”

A quarter of the U.S. power grid. Trillions in cost. Does that sound bonkers to you? It does to me — which is precisely why I hopped on a plane to Dallas, rented a car, and drove three hours through rolling hills and ranches to Abilene to see for myself. The scale of this one site is staggering. Imagining it multiplied by dozens is nearly impossible.

I told Altman that the scene in Abilene reminded me a bit of a tour I recently took of Hoover Dam, one of the great engineering feats of the 20th century that produces about 2 gigawatts of power at capacity. In the 1930s, Hoover Dam was a symbol of American industrial might: concrete, turbines, and power on a scale no one had imagined.

Altman acknowledged that “people like to pick their historical analogies” and thought the “vibe was right” to compare Stargate to Hoover Dam. It wasn’t his personal favorite, however: “A recent thing I’ve thought about is airplane factories,” he said. “The history of what went into airplane factories, or container ships, the whole industry that came around those. And certainly, everything that went into the Apollo program.”

The need for public awareness

That’s when I realized: whether you think Altman’s goals make sense, seem nuts, or feel downright reckless really comes down to what you believe about AI itself. If you think supercharged versions of AI will change everything — and mostly for the good, like curing cancer — or you are a China hawk who wants to win the new AI ‘cold war’ with China, then Altman’s empire of data centers looks like a necessary bet. If you’re skeptical, it looks like the biggest boondoggle since America’s grandest infrastructure follies: think California’s long-awaited high-speed rail. If you’ve read Karen Hao’s Empire of AI, you might also be shouting that scaling isn’t inevitable — that building a ‘compute empire’ risks centralizing power, draining resources, and sidelining efficiency and safety. And if you think AGI will kill us all, like Eliezer Yudkowsky? Well, you won’t be a fan.

No one can predict the future, of course. My greater concern is that there isn’t nearly enough public awareness of what’s happening here. I don’t mean just in Abilene, with its mesquite shrubland ground into dust, or even OpenAI’s expanding Stargate ambitions around the US and beyond. I mean the vast, almost unimaginable infrastructure buildout across Big Tech — the buildout that’s propping up the stock market, fueling a data center arms race with China, and reshaping energy, land, and labor around the world. Are we sleepwalking into the equivalent of an AI industrial revolution—and not a metaphorical one, but in terms of actual building of physical stuff—without truly reckoning with its costs versus its benefits? 

Even Sam Altman doesn’t think enough people understand what he’s talking about. “Do you feel like people understand what ‘compute’ is?” I asked him outside of Building 2. That is, does the average citizen really grok what Altman is saying about the physical manifestation of these mega data centers? 

“No, that’s why we wanted to do this,” he said about the Abilene media event. “I don’t think when you hit the button on ChatGPT…you think of walking the halls here.” 

Of course, Hoover Dam, too, was divisive, controversial, and considered risky. But I wasn’t alive when it was built. This time I could see the dust rising in Abilene with my own eyes — and while Altman talked about walking the newly built halls filled with racks of AI chips, I walked away unsettled about what comes next.

With that, here’s the rest of the AI news.

Sharon Goldman
sharon.goldman@fortune.com
@sharongoldman

FORTUNE ON AI

Sam Altman’s AI empire will devour as much power as New York City and San Diego combined. Experts say it’s ‘scary’ – by Eva Roytburg

Exclusive: Startup using AI to automate software testing in the age of ‘vibe coding’ receives $20 million in new venture funding – by Jeremy Kahn

OpenAI plans to build 5 giant U.S. ‘Stargate’ datacenters, a $400B challenge to Meta and Microsoft in the relentless AI arms race – by Sharon Goldman

AI IN THE NEWS

Nscale announces record-breaking $1.1 billion Series B. UK cloud infrastructure company Nscale announced a $1.1 billion funding round, the largest in UK and European history. The Series B, led by Aker ASA with participation from NVIDIA, Dell, Fidelity, Point72, and others, will accelerate Nscale’s rollout of “AI factory” data centers across Europe, North America, and the Middle East. The company, which recently unveiled partnerships with Microsoft, NVIDIA, and OpenAI to establish Stargate UK and launched Stargate Norway with Aker, says the funding will expand its engineering teams and GPU deployment pipeline as it races to deliver sovereign, energy-efficient AI infrastructure at massive scale.

OpenAI and Databricks strike AI agent deal. OpenAI and data platform Databricks struck a multiyear deal expected to generate about $100M, the Wall Street Journal reported, making OpenAI’s models—including GPT-5—natively available inside Databricks so enterprises can build AI agents on their own data “out of the box.” The partnership includes joint research, and OpenAI COO Brad Lightcap says the two aim to “far eclipse” the contracted figure. It targets a key adoption barrier—reliable, data-integrated agents—tapping Databricks’ 20,000+ customers and $4B ARR footprint (Mastercard is already using Databricks-built agents for onboarding/support). The move sits alongside Databricks’ model partnerships (e.g., Anthropic) and a broader vendor push (Salesforce, Workday) to pair agent tooling with customer data, as OpenAI ramps its infrastructure ambitions with Oracle/SoftBank.

Trump administration will make Elon Musk’s xAI available to federal agencies. According to the Wall Street Journal, Elon Musk’s xAI will be available to federal agencies via the General Services Administration for just 42 cents—part of a broader effort to bring top AI systems into government. The arrangement mirrors similar nominal-fee deals with Google (47 cents), OpenAI ($1), and Anthropic ($1), meaning Washington is now working with all four leading U.S. model makers, each of which also has $200M Pentagon contracts. Officials say the low-cost access is less about revenue than securing a foothold in government AI adoption, where automating bureaucratic processes is seen as a major opportunity. The move also highlights a thaw in Musk’s relationship with the White House, while underscoring the administration’s push to foster competition among frontier AI providers.

AI CALENDAR

Oct. 6-10: World AI Week, Amsterdam

Oct. 21-22: TedAI San Francisco. Apply to attend here.

Nov. 10-13: Web Summit, Lisbon. 

Nov. 26-27: World AI Congress, London.

Dec. 2-7: NeurIPS, San Diego

Dec. 8-9: Fortune Brainstorm AI San Francisco. Apply to attend here.

EYE ON AI NUMBERS

$923 Billion

That’s how much AI-driven capital expenditures in fiscal year 2025 will translate into total U.S. economic output, according to new economic modeling results released by IMPLAN.

The analysis, based on reported 2025 capital expenditure estimates, found that Amazon, Alphabet, Microsoft, and Meta are set to spend a record $364B on AI-driven capital expenditures in fiscal 2025—more than all new U.S. commercial construction in 2023. The modeling showed that those dollars will generate $923B in total U.S. economic output, support 2.7M jobs, and add $469B to GDP.

Every $1 invested, the report said, yields $2.50 in impact, rippling from construction and chip manufacturing to retail and local services. For policymakers, it’s a reminder: Big Tech’s AI buildout isn’t just about data centers—it’s reshaping the broader U.S. economy.
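The multiplier quoted by the report follows directly from the headline numbers; this quick check uses only figures stated in the article:

```python
# Sanity check of the IMPLAN multiplier quoted above.
# All dollar figures come from the article; the multiplier is derived, not assumed.

capex_billion = 364    # reported 2025 AI capex for Amazon, Alphabet, Microsoft, and Meta
output_billion = 923   # modeled total U.S. economic output
gdp_billion = 469      # modeled contribution to GDP

multiplier = output_billion / capex_billion
print(f"Every $1 of capex → ${multiplier:.2f} of output")  # ≈ $2.54, matching the ~$2.50 quoted
```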

