Groq is developing chips that are optimized for artificial intelligence (AI) inference workloads.
The GPUs designed by Nvidia and AMD are optimized for training generative AI models.
Groq's entrance into the chip realm underscores the idea that AI developers and infrastructure providers will need more than just GPUs to stay ahead of the curve.
Over the last few years, investors have witnessed an unprecedented surge in capital expenditures by big tech companies, which are continuing to pour funds into building out their artificial intelligence (AI) infrastructure. Much of that money has been spent on graphics processing units (GPUs) from Nvidia (NASDAQ: NVDA) and Advanced Micro Devices, as well as networking gear and custom application-specific integrated circuits (ASICs) from Broadcom.
The tide is shifting, however. Capital is moving further downstream, where a handful of Silicon Valley startups are no longer just attracting curiosity; they are disrupting a semiconductor market long dominated by a few incumbents.
Enter Groq, fresh off a $750 million capital raise that values the company at $6.9 billion. Beyond the headline figure, what stands out is its investor roster, which includes Samsung, Cisco, and BlackRock. This funding round represents a pivotal moment in the broader semiconductor narrative -- one that could rewrite the dynamics between the newcomers and legacy giants.
Nvidia and AMD both design GPUs -- hardware that is particularly well suited to powering the training of generative AI models. Groq, however, is pioneering a different path with a chip category known as language processing units (LPUs). Unlike GPUs, LPUs are built for inference -- the stage at which trained models are deployed in real-world applications.
This distinction matters because inference demands faster processing, greater power efficiency, and lower latency than GPUs currently deliver. Groq's approach highlights the fact that semiconductors are not one-size-fits-all products, and that AI infrastructure providers will need to look beyond the GPUs they are currently hoarding.
Groq's impressive funding round signals that investors are betting that it can carve out a space in the chip realm by offering viable alternatives optimized for the next wave of AI development.
Nvidia today commands an estimated 90% share of the AI accelerator market. That dominance stems from its leading GPU architectures and the deep integration of its CUDA software ecosystem, which together have given the company a wide moat in its corner of the chip space.
Even so, Groq's rise underscores the fact that AI workloads are becoming more fragmented and specialized. If cloud hyperscalers like Microsoft, Amazon, Alphabet, and Oracle determine that Groq's chips are better suited for inference, Nvidia could be forced to defend its position more aggressively -- or risk ceding ground in certain high-value corners of the AI landscape.
Throughout the AI revolution, AMD's main pitch against Nvidia has been its ability to deliver lower-cost alternatives. But Groq's emergence could swiftly reshape that narrative.
If enterprises begin diversifying into multivendor platforms rather than relying solely on Nvidia's stack, AMD could benefit too. In effect, Groq's rise does not merely challenge Nvidia; rather, it broadens the playing field and gives buyers more leverage. Such moves could propel AMD into the spotlight.
Groq's $750 million equity raise shows that the AI chip race is far from settled. Even so, Nvidia certainly hasn't been dethroned just yet.
Its record levels of profitability give it unmatched financial strength -- and plenty of resources it can tap in its efforts to out-innovate smaller rivals. With its current growth trajectory, robust margins, and next-generation chips like Blackwell Ultra and Rubin on the horizon, Nvidia's leadership position still appears durable.
By contrast, AMD is a higher-risk bet. Although sales of its MI300 chips are gaining traction, the company ultimately lacks the ecosystem lock-in that underpins Nvidia's dominance. Groq's emergence as a chip supplier could help open doors for AMD, but it is unclear how much ground AMD can gain in its efforts to close the gap with the undisputed segment leader.
AMD may capture some incremental upside as a complementary chip provider, but Nvidia continues to be the smarter buy and perhaps the most direct way to profit from the ongoing secular tailwinds fueling the AI infrastructure boom.
Adam Spatacco has positions in Alphabet, Amazon, Microsoft, and Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Cisco Systems, Microsoft, Nvidia, and Oracle. The Motley Fool recommends Broadcom and recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.