OpenAI plans mega AI city on foundations of custom chip advancements

Source: Cryptopolitan

OpenAI is sketching out what looks like the blueprint for a machine-built civilization, one powered by its own chips, its own infrastructure, and enough electricity to light two New York Cities.

The company’s massive AI-city plan rests on an ambition that is very nearly impossible to pull off: designing and producing billions of custom chips in partnership with Broadcom to support what CEO Sam Altman calls the “computing spine” of the future.

Sam told the Wall Street Journal that delivering the artificial-intelligence services people demand will require at least one AI-specific chip per user, a mind-bending projection that runs into the billions.

Ali Farhadi, head of the Allen Institute for AI, backed that scale, saying if AI replaces human labor at the rate promised, “the world will need as many AI chips as it has conventional ones.” For OpenAI, this is about control: over costs, over power consumption, and over the long-term survival of its models as demand explodes.

OpenAI links Broadcom, Nvidia, and memory giants for next-gen compute

Nvidia, of course, still dominates the AI training space with roughly 70% market share, which is why OpenAI has to keep using its GPUs for model training.

But OpenAI is now splitting the pipeline: training happens on Nvidia, inference (the process of delivering answers to users) moves to Broadcom’s custom silicon. This two-track design could cut expenses and power usage at a scale where every percentage point matters.
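To make that distinction concrete, here is a minimal PyTorch sketch (purely illustrative; it does not reflect OpenAI's actual stack) of why the two workloads stress hardware differently: training runs a forward pass, a backward pass, and a weight update, while inference is a single forward pass per request that can run on leaner, purpose-built silicon.

import torch
import torch.nn as nn

# A stand-in model; any network would do for the purposes of this sketch.
model = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 512))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

def training_step(batch, targets):
    # Training: forward pass + backward pass + weight update.
    # Gradients and optimizer state make this the GPU-hungry half of the pipeline.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch), targets)
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()  # inference keeps no gradients at all
def inference_step(batch):
    # Inference: a single forward pass per request, the work the article says
    # is moving onto Broadcom's custom silicon.
    return model(batch)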

Jordan Nanos, a semiconductor researcher at SemiAnalysis, said Broadcom is helping OpenAI “remix the typical AI-chip recipe.” These chips won’t be generic. They’re being engineered specifically for OpenAI’s models, which rely on high-bandwidth memory, supplied by Samsung and SK Hynix, two firms the company recently partnered with.

That type of memory allows faster data movement between processors, critical for systems like OpenAI’s Pulse, an AI agent that scans the web daily to brief users. Pulse consumes so much computing power that Sam said it’s limited to those who pay $200 a month for the Pro tier.
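A rough back-of-envelope sketch shows why that bandwidth matters (the numbers below are hypothetical, not drawn from the article): when a model generates a reply, each new token requires streaming the active weights through the processor, so memory bandwidth, rather than raw arithmetic speed, often sets the ceiling on response speed.

# Back-of-envelope: memory-bandwidth-bound text generation.
# All figures are hypothetical illustrations, not OpenAI or Broadcom specs.
def tokens_per_second(active_params_billions: float,
                      bytes_per_param: float,
                      memory_bandwidth_tb_s: float) -> float:
    # Each generated token streams the active weights once from memory,
    # so throughput is roughly bandwidth divided by active-weight size.
    active_bytes = active_params_billions * 1e9 * bytes_per_param
    bandwidth_bytes = memory_bandwidth_tb_s * 1e12
    return bandwidth_bytes / active_bytes

# Example: 20 billion active parameters at 1 byte each on 4 TB/s of HBM.
print(tokens_per_second(20, 1.0, 4.0))  # ~200 tokens per second per chip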

This dependency on high-bandwidth memory ties directly to how OpenAI’s models operate. Early neural networks were “dense,” activating large sections of their systems for every query. Newer ones use “sparsity,” activating only the specific expert sections a query needs.

Instead of using 25% of the model to answer a question, modern systems trigger a fraction of a percent. That difference slashes power draw and speeds up response times. When a chip is built around that sparse logic, efficiency skyrockets, and Broadcom is the one making that hardware possible.
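A minimal sketch of that sparse “mixture of experts” idea (illustrative Python, not OpenAI’s architecture): a small router scores every expert for each input, but only the top-scoring few actually run, so the vast majority of the model’s weights sit idle on any given query.

import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    # Sparse mixture-of-experts routing reduced to its core idea (illustrative only).
    def __init__(self, dim=128, num_experts=64, top_k=2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # scores every expert per input
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):  # x: a single token embedding of shape (dim,)
        weights, idx = torch.topk(torch.softmax(self.router(x), dim=-1), self.top_k)
        # Only top_k of num_experts run: 2 of 64 here, about 3% of the expert weights.
        return sum(w * self.experts[int(i)](x) for w, i in zip(weights, idx))

Hardware built around that routing pattern can keep inactive experts parked in memory and spend its bandwidth only on the experts a query actually touches, which is the kind of efficiency Broadcom’s custom parts are reportedly chasing.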

OpenAI’s gigawatt-scale AI supercomputers redefine infrastructure

Sam has said that OpenAI’s current compute footprint is around 2 gigawatts, spread across global data centers. The Broadcom partnership aims to build up to 10 gigawatts by 2030, forming the physical base for what insiders are calling AI cities, dense campuses of servers, storage, and custom interconnects tied together by Broadcom’s Tomahawk Ultra networking chips.

That’s only part of the wave. Over the past three weeks, OpenAI has added 16 gigawatts in fresh capacity deals with AMD and Nvidia, bringing the total to levels that could require nearly $1 trillion in investment.
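A quick arithmetic cross-check of those numbers (the cost per gigawatt below is an assumed figure for illustration, not something the article reports):

# Rough cross-check of the capacity and cost figures above.
# The per-gigawatt cost is an assumption for illustration only.
current_gw = 2      # OpenAI's footprint today, per Altman
broadcom_gw = 10    # Broadcom partnership target by 2030
recent_gw = 16      # fresh capacity from the AMD and Nvidia deals

committed_gw = broadcom_gw + recent_gw     # 26 GW of commitments
assumed_cost_per_gw = 38e9                 # hypothetical ~$38 billion per gigawatt
print(committed_gw * assumed_cost_per_gw)  # ~9.9e11, close to $1 trillion
print(committed_gw / current_gw)           # a 13x jump over today's footprint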

xAI’s Memphis Colossus already reached 1.21 gigawatts this fall. Meta’s Hyperion facility in Louisiana is approved for 2.3 gigawatts, with Mark Zuckerberg targeting 5 gigawatts. The AI energy race is officially global.

Sam described this transformation as “the biggest joint industrial project in history,” saying even these deals are “a drop in the bucket compared to where we need to go.” Part of his goal is to diversify suppliers.

The Stargate campus in Abilene, Texas, being built by Oracle, will focus on AI training, mostly on Nvidia chips. AMD hardware will handle inference workloads, while Broadcom’s custom silicon fills the efficiency gap.

As Nanos put it, “OpenAI is looking quite far into the future, and trying to make sure they have access to enough supply of chips.”
