The Newest Artificial Intelligence Stock Has Arrived -- and It Claims to Make Chips That Are 20x Faster Than Nvidia

Source: Motley Fool

Key Points

  • Nvidia's GPUs are the backbone of generative AI infrastructure.

  • Cerebras believes its wafer-scale chip design can deliver processing speeds 20 times faster than what Nvidia offers.

  • Cerebras previously planned to go public, but has shelved its IPO plans following a recent funding round.


Over the past three years, Nvidia (NASDAQ: NVDA) has evolved from a niche semiconductor player into the most valuable company in the world. The catalyst behind its meteoric rise can be summed up in three letters: GPU.

Nvidia's graphics processing units (GPUs) have become the engine of the artificial intelligence (AI) revolution -- fueling everything from large language models (LLMs) to autonomous vehicles, robotics, and high-end video rendering.


But while Nvidia's dominance appears unshakable, a challenger is emerging. The startup Cerebras is making bold claims that its chips can power AI models 20 times faster than Nvidia's hardware. It's an ambitious promise -- and one that has investors asking whether Nvidia's reign might finally face a serious contender.

Cerebras' wafer-scale chip explained: One giant engine for AI

To understand why Cerebras is generating so much buzz, investors need to look at how it's breaking the rules of traditional chip design.

Nvidia's GPUs are small but powerful processors that must be clustered -- sometimes in the tens of thousands -- to perform the enormous calculations required to train modern AI models. These clusters deliver incredible performance, but they also introduce inefficiencies. Each chip must constantly pass data to its neighbors through high-speed networking equipment, which creates communication delays, drives up energy costs, and adds technical complexity.

Cerebras turned this model upside down. Instead of linking thousands of smaller chips, it built a single, massive processor the size of an entire silicon wafer -- aptly named the Wafer Scale Engine. Within this one piece of silicon sit hundreds of thousands of cores that work together seamlessly. Because everything happens on a unified architecture, data no longer needs to bounce between chips -- dramatically boosting speed while cutting power consumption.

Person in a chip foundry manufacturing a wafer.

Image source: Getty Images.

Why Cerebras thinks it can outrun Nvidia

Cerebras' big idea is efficiency. By eliminating the need for inter-chip communication, its wafer-scale processor keeps an entire AI model housed within a single chip -- cutting out wasted time and power.

That's where Cerebras' claim of 20x faster performance originates. The breakthrough isn't about raw clock speed; rather, it's about streamlining how data moves and eliminating bottlenecks.
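To make the logic concrete, here is a minimal back-of-envelope sketch in Python. The per-step numbers are invented purely for illustration -- they are not benchmarks from Cerebras or Nvidia -- but they show how total step time, and therefore the headline speedup, is driven by shrinking communication overhead rather than by faster arithmetic.

```python
# Toy model (made-up numbers, arbitrary units): one AI training/inference step
# is time spent computing plus time spent moving data between processors.

def step_time(compute_s: float, comm_s: float) -> float:
    """Return total step time: compute plus communication overhead."""
    return compute_s + comm_s

# Hypothetical figures, NOT vendor benchmarks:
gpu_cluster = step_time(compute_s=1.0, comm_s=4.0)   # clustered chips spend much of each step exchanging data
wafer_scale = step_time(compute_s=1.0, comm_s=0.25)  # data stays on one piece of silicon

print(f"GPU-cluster step:  {gpu_cluster:.2f}")
print(f"Wafer-scale step:  {wafer_scale:.2f}")
print(f"Speedup from cutting communication: {gpu_cluster / wafer_scale:.1f}x")
```

With identical compute time in both cases, the speedup in this sketch comes entirely from the communication term -- the same bottleneck Cerebras says its architecture removes.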

Side by side comparison of Cerebras chip vs. Nvidia GPU.

Data source: Cerebras.

The practical advantage of this architecture is simplicity. Instead of managing, cooling, and synchronizing tens of thousands of GPUs, a single Cerebras system can fit in one rack and be ready for deployment -- translating to dramatic savings on AI infrastructure costs.

Why Nvidia still reigns

Despite the hype, Cerebras still carries risk. Manufacturing a chip this large is an engineering puzzle. Yield rates can fluctuate, and even a minor defect anywhere on the wafer can compromise a significant portion of the processor. This makes scaling the wafer-scale approach both costly and uncertain.

Nvidia remains the undisputed leader in AI computing. Beyond its powerful hardware, Nvidia's CUDA software platform has created a deeply entrenched ecosystem on which virtually every major hyperscaler builds its generative AI applications. Overcoming this kind of competitive moat requires more than cutting-edge hardware -- it demands a complete shift in how businesses design and deploy AI, and few customers are eager to shoulder the operational burden of switching.

That said, the total addressable market (TAM) for AI chips is expanding rapidly, leaving room for new architectures to coexist alongside incumbents like Nvidia. For instance, Alphabet's tensor processing units (TPUs) are tailored for deep learning tasks, whereas Nvidia's GPUs serve as versatile, general-purpose workhorses. This dynamic suggests that Cerebras could carve out its own niche within the AI chip realm without needing to dethrone Nvidia entirely.

How to invest in Cerebras stock

Cerebras previously explored an initial public offering (IPO) and even published a draft S-1 filing late last year. However, following a recent $1.1 billion funding round, the company appears to have put its IPO plans on hold. For now, investing in Cerebras is largely limited to accredited investors, venture capital (VC) firms, and private equity funds.

For everyday investors, the more practical approach is to stick with established chip leaders such as Nvidia, Advanced Micro Devices, Taiwan Semiconductor Manufacturing, or ancillary partners like Broadcom or Micron Technology -- all of which are poised to benefit from the explosive growth of AI infrastructure spending.


Adam Spatacco has positions in Alphabet and Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Nvidia, and Taiwan Semiconductor Manufacturing. The Motley Fool recommends Broadcom. The Motley Fool has a disclosure policy.
