Nvidia Shares Slip as Google's AI Chips Gain Ground with Meta Deal Talks

Nvidia shares declined Tuesday following a report that Meta Platforms is in advanced talks to spend billions on Google's tensor processing units (TPUs), signaling the search giant's growing momentum in the competitive AI accelerator market.
According to The Information, Meta is discussing the use of Google's chips in its data centers starting in 2027, and may also rent TPUs through Google Cloud as early as next year. The news sent Nvidia shares down as much as 2.7% in after-hours trading, while Alphabet, Google's parent company, saw its stock rise 2.7%—extending recent gains driven by optimism around its Gemini AI model.

A potential agreement would position Google's TPUs as a credible alternative to Nvidia's dominant AI chips, which are currently used by most major tech firms—from Meta to OpenAI—to develop and run AI systems. Google has already secured a major deal to supply up to 1 million chips to AI startup Anthropic, though Nvidia continues to lead the market.
Seaport analyst Jay Goldberg described the Anthropic agreement as a "really powerful validation" of Google's chip technology, noting that it has spurred broader industry interest in TPUs as a competitive option.
Meta declined to comment on the report, while Google did not immediately respond to a request for comment.
In a research note, Bloomberg Intelligence analysts Mandeep Singh and Robert Biggar suggested that Meta's potential adoption of TPUs—following Anthropic's—indicates that third-party AI providers are increasingly viewing Google as a viable secondary supplier for inference chips. They estimate that Meta's projected 2026 capital expenditure of at least $100 billion could translate into $40–50 billion spent on inference-chip capacity next year, potentially accelerating growth for Google Cloud.
The news also lifted several Asian suppliers linked to Alphabet in early Tuesday trading. South Korea's IsuPetasys, which provides multilayer boards to Google, surged 18%, while Taiwan's MediaTek rose nearly 5%.
A deal with Meta—one of the world's largest investors in AI infrastructure—would represent a significant milestone for Google. Still, the long-term success of TPUs will depend on their ability to deliver competitive performance and power efficiency.
Originally developed over a decade ago for AI-specific workloads, Google's tensor chips are gaining traction beyond the company's own operations as more firms seek to reduce their reliance on Nvidia. While Nvidia's graphics processing units (GPUs) continue to dominate the AI training market, having evolved from their original use in gaming and graphics, TPUs take a more specialized approach: they are application-specific integrated circuits designed explicitly for AI and machine-learning tasks, refined through years of deployment in Google's own products and models such as Gemini.
This customization has enabled Google to optimize both its chips and its AI systems in tandem—a feedback loop that could strengthen its position as a rising challenger in the accelerating AI hardware race.
The above content was completed with the assistance of AI and has been reviewed by an editor.