AMD is a fraction of the size of AI chip leader Nvidia.
AMD has a huge opportunity in the fast-growing market for inference chips.
If it can take even a modest amount of market share away from Nvidia in that segment, it could have significant upside ahead.
Advanced Micro Devices (NASDAQ: AMD) has been in Nvidia's (NASDAQ: NVDA) shadow for a long time, but the artificial intelligence (AI) market is shifting in a way that could finally tilt in AMD's favor. The early boom in AI was all about training large language models (LLMs), and Nvidia owned that part of the market thanks in part to its CUDA software and ecosystem.
Nvidia retains that edge, but inference is where the market is headed, and that is a much bigger long-term opportunity.
Training AI models is a computationally heavy and expensive process, but it is largely a one-time cost for each model. Inference -- what happens when those models are deployed and actually used -- requires chips to run computations over and over again every time a query is answered or a recommendation is delivered. As models get bigger and are used more often, demand for inference-specific processing power grows. And in inference, the cost per query matters more than whether the chips running it deliver peak performance.
This is exactly why AMD can make inroads into the inference market, and it already has. One of the largest AI companies is already running a big chunk of its inference traffic on AMD's graphics processing units (GPUs), and seven of the 10 biggest AI operators are now using some of its chips. That is real progress.
The company has also been improving its ROCm software platform, which has always lagged behind Nvidia's CUDA. The most recent update, ROCm 7, was designed with inference in mind. It may not match CUDA on training, but for inference workloads, customers are finding that it's good enough. That is a key point, because once the software hurdles are lowered, the conversation becomes more about price and efficiency. If customers can save money by using AMD's chips without meaningfully losing performance, it will have a real opportunity to take some market share from the GPU leader.
Another development that could reshape the market is the UALink Consortium, which AMD helped form in partnership with several other companies. Nvidia's NVLink has been one of its biggest advantages because it lets its GPUs talk to each other at extremely high speeds, allowing clusters of them to operate almost like a single giant chip. The UALink Consortium is pushing an open standard alternative.
If this gains traction, data centers would no longer be locked into using only Nvidia hardware to build their AI clusters. Instead, they could mix and match chips from different vendors, which would be a huge boost for AMD and other chipmakers over time. The consortium is just getting started, but the effort shows AMD is thinking about how to chip away at Nvidia's moat in the long term.
While the GPU side of the story gets the most attention in the AI space today, AMD's biggest strength is in central processing units (CPUs), the versatile chips that provide the core "brains" of computer systems. It has been taking market share in data center CPUs and is now the leader in that space. In the AI data center segment, the market for CPUs is not as large as the market for GPUs, but they are nonetheless a critical part of that infrastructure, and AMD's growth on that front adds another ingredient to its story. Beyond that, AMD still has solid gaming and PC chip businesses.
One of the biggest reasons why AMD could deliver outsized returns in the coming years is the huge size gap between it and Nvidia. Nvidia's data center revenue last quarter was over $40 billion, while AMD's was around $3 billion. That massive spread explains why AMD's stock could soar if it just grabs a modestly bigger slice of the overall data center market.
The simple fact is that AMD is not going to overtake Nvidia for the top spot in the GPU market -- but it doesn't have to. It just needs to make its chips viable alternatives for companies in the inference market. Even small market share gains could move the needle in a big way, given its much smaller revenue base and the huge growth expected in the inference arena.
By 2030, inference could dwarf training in size. If AMD continues to improve its ROCm platform and UALink is adopted widely, it should be able to gain some market share in what should be a huge market. That makes the stock an attractive option to buy and hold while the next phase of the AI trend develops.
Geoffrey Seiler has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices and Nvidia. The Motley Fool has a disclosure policy.