Nvidia CEO Jensen Huang sees demand for AI inference surging.
Microsoft has built its business to deliver, and profit from, high volumes of AI usage across its services.
Broadcom's AI revenue is exploding, as leading AI companies use its custom accelerators for AI inference workloads.
Nvidia CEO Jensen Huang recently said the "inflection point for inference has arrived." Over time, the market for inference is expected to exceed the market for training artificial intelligence (AI) models. Training is what builds the model. Inference is what happens when that model is put to work in the real world -- answering questions, generating content, summarizing documents, writing code, and powering AI agents.
As more businesses deploy AI products and those products process more "tokens" (the bits of data that models consume and produce), demand for the cloud and computing infrastructure that enables inference should continue to grow. That means more spending on data centers, chips, networking, and cloud platforms.
Beyond Nvidia, two companies well positioned to benefit from this next phase of growth are Microsoft (NASDAQ: MSFT) and Broadcom (NASDAQ: AVGO).
Microsoft makes software products that are used by millions of people. The integration of Copilot across its products, along with its Azure enterprise cloud platform, puts Microsoft in a great position to benefit from the growth in AI inference.
CEO Satya Nadella describes the company as a "cloud and token factory," alluding to its expansive data center footprint and ability to efficiently process inference workloads, such as high-volume AI requests across its products. Microsoft is focused on improving the efficiency and profitability of its inference capabilities. It wants to make every AI prompt cheaper to process and more profitable to deliver.
On that note, Microsoft has shown significant efficiency gains in handling large inference workloads. On its highest-volume inference workloads with OpenAI, whose models underpin Microsoft's Copilot products, the company has achieved a 50% increase in throughput. In other words, it can process 50% more AI prompts with the same infrastructure, maximizing the return on its infrastructure spending.
It's also an advantage for Microsoft that it is making money across multiple AI-powered products. Azure captures cloud spending from enterprises that build and run AI applications. On top of that, Microsoft is layering AI features into products customers use every day, including Word, Excel, and Teams, with Microsoft 365 Copilot. Last quarter, Microsoft reported 15 million paid seats for Microsoft 365 Copilot, up 160% year over year.
Microsoft is converting demand for AI inference into growing revenue across its products. Importantly, management is focused on maximizing token throughput per dollar spent on infrastructure, which should drive higher earnings over time. With the stock still well below its highs and trading at a forward price-to-earnings (P/E) multiple of about 23, the recent pullback could be a great buying opportunity.
Top AI companies have been spending aggressively to expand AI capacity, with a significant share of capital expenditures going toward data centers.
Last year, tech giants, including Microsoft, spent a combined $410 billion on capital expenditures, according to The Motley Fool's research. This is up 80% over 2024, and it's expected to increase in 2026. Given the need for additional infrastructure to deliver AI inference at greater scale, Broadcom remains a compelling stock to buy.
Broadcom has been a leading supplier of specialized chips and networking solutions for many years. Its custom AI accelerators are in high demand because they cost less than general-purpose graphics processing units (GPUs) and deliver better cost efficiency on targeted AI workloads, including inference.
Three of its top customers are Google (Gemini), Anthropic (Claude), and OpenAI (ChatGPT). These companies are using Broadcom's accelerators to maximize performance and optimize costs for their AI workloads. In the recent quarter, Broadcom's AI semiconductor revenue doubled year over year to $8.4 billion.
Broadcom is also seeing strong demand for its networking gear, like its Tomahawk 6 switches and optical components, which connect these accelerators -- enabling extremely fast processing for inference workloads. In the recent quarter, Broadcom's AI networking revenue grew 60% year over year.
Overall, management says it has "line of sight" to achieve more than $100 billion in revenue from AI chips by 2027. The stock's forward P/E of 28 isn't cheap, but it's supported by analysts' estimates calling for 40% annualized earnings growth. Barring a sudden slowdown in data center spending, Broadcom stock could deliver more gains in 2026 and beyond.
John Ballard has positions in Nvidia. The Motley Fool has positions in and recommends Alphabet, Microsoft, and Nvidia. The Motley Fool recommends Broadcom. The Motley Fool has a disclosure policy.