Meta Platforms introduced its new AI training and inference chips last week.
Meta collaborated with Broadcom on the effort; Broadcom says customers are increasingly turning to specialized XPUs over GPUs for different workloads.
Should the announcements make Nvidia investors nervous?
Nvidia (NASDAQ: NVDA) is known as the king of artificial intelligence (AI), but as the industry migrates from training large language models to inference, will its competitive moat hold up?
Over the past week, even more competitive pressures emerged, with a big custom chip announcement from Meta Platforms (NASDAQ: META) and its chipmaking partner Broadcom (NASDAQ: AVGO).
As more large customers migrate to custom XPU solutions, should Nvidia investors be worried about the competition?
On Wednesday, Meta unveiled four new artificial intelligence chips: the MTIA 300, MTIA 400, MTIA 450, and MTIA 500.
The 300 is optimized for Meta's core ranking and recommendation (R&R) workloads, which were Meta's dominant workload before generative AI. The 400, 450, and 500 each target different types of inference workloads: the 400 can run larger generative AI models for traditional R&R applications; the 450 augments the 400's capabilities by doubling its high-bandwidth memory (HBM) capacity; and the 500 increases HBM capacity by a further 50% over the 450.
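Put another way, the stated scaling implies the 500 carries three times the 400's HBM. A minimal sketch of that arithmetic (Meta has not disclosed absolute capacities, so the baseline unit here is hypothetical):

```python
# Relative HBM scaling across the MTIA inference chips, per the stated
# generation-over-generation increases. BASELINE_HBM is a hypothetical
# unit; absolute capacities were not disclosed.
BASELINE_HBM = 1.0  # MTIA 400 (hypothetical baseline unit)

hbm_capacity = {
    "MTIA 400": BASELINE_HBM,
    "MTIA 450": BASELINE_HBM * 2,        # doubles the 400's HBM
    "MTIA 500": BASELINE_HBM * 2 * 1.5,  # 50% more than the 450
}

for chip, units in hbm_capacity.items():
    print(f"{chip}: {units:.1f}x the 400's HBM")
```

The multipliers compound, which is why a "double, then add 50%" cadence lands at 3x the baseline rather than 2.5x.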
Meta disclosed that while the 300 is in use now, the 400, 450, and 500 will be rolled out beginning in early 2027 for generative AI inference. Meta also elaborated on its chip design strategy, noting it uses a "modular" approach that enables it to iterate on new chip designs every six months, rather than the typical two-year cadence. Meta believes this is a necessity, given the rapid pace of AI evolution today:
Rather than placing a bet and waiting for a long period of time, we deliberately take an iterative approach: Each MTIA generation builds on the last, using modular chiplets, incorporating the latest AI workload insights and hardware technologies, and deploying on a shorter cadence. This tighter loop keeps our hardware better aligned with evolving models while enabling faster adoption of new technology.
Like other major cloud companies, Meta is using Broadcom to manufacture and package parts of its chips.
Broadcom counts Meta as one of its five major XPU customers, supplying the SerDes components that connect the chip logic to the networking fabric. Broadcom also handles packaging and other elements, ensuring these self-designed chips are manufacturable.
Broadcom held its quarterly earnings call last week, during which CEO Hock Tan elaborated on the current trend toward XPUs over graphics processing units (GPUs), noting that as AI workloads evolve, chips require greater specialization for each step in the AI training and inference processes:
The one-size-fits-all general-purpose GPU gets you only that far. ... In a GPU, you have a design for dense matrix multiplication. So you do it with software kernels, but it is not as effective as if you hard-coded it in silicon and make those XPUs purposely designed to be much more performing for mixture-of-experts workloads. The same applies for inference. ... And the design starts to depart from what is the traditional standard GPU design. Which is why, as we always indicated before, XPUs will eventually be more the choice simply because it will allow flexibility in making designs that work with particular workloads -- one for training even and one for inference. ... You can tweak your XPUs toward a particular kind of workload LLM that you want. And we are seeing that. We are seeing that road map in all our five customers.
As the AI computing industry evolves toward pre-training, post-training, reinforcement learning, and inference for diverse applications, is Nvidia in danger of losing market share? After all, Nvidia did shell out $20 billion for the intellectual property and engineering talent of inference chip start-up Groq late last year. That may indicate that Nvidia sees emerging demand for non-GPU chips as the industry pivots, as Hock Tan described.
That being said, Nvidia investors shouldn't necessarily panic. Even as the inference market is becoming more competitive, Nvidia still has a strong lead in training, and investment in training infrastructure will continue to grow.
Look no further than Meta itself for evidence. Despite unveiling new chips last week, Meta inked a massive, multiyear deal with Nvidia last month to deploy literally millions of Nvidia Blackwell and Rubin chips in its data centers, along with Nvidia central processing units (CPUs), all connected via Nvidia's Spectrum-X Ethernet switches.
So, even Meta's new chip designs haven't enabled it to stop buying Nvidia infrastructure.
Meta has legacy businesses across Facebook, Instagram, WhatsApp, and its Reality Labs segments, but it is also building its own Llama family of large language models (LLMs). So, Meta may be deploying Nvidia for its LLM efforts and frontier AI research, while the homegrown chips can more efficiently serve its legacy business footprint with optimized solutions.
But the big picture is that AI computing demand is still growing exponentially. The emergence of new inference chipmakers therefore shouldn't cause demand for traditional training-focused GPUs to decline; these new chips should be incremental to, not a displacement of, Nvidia's GPUs.
In this case, it appears the rising tide of AI compute truly lifts all boats.
Billy Duberstein and/or his clients have positions in Broadcom and Meta Platforms. The Motley Fool has positions in and recommends Meta Platforms and Nvidia. The Motley Fool recommends Broadcom. The Motley Fool has a disclosure policy.