Key Points
- Nvidia is not just a leader in AI training; it is also the leader in AI inference.
- AMD has carved out a nice niche in inference and has an agentic AI opportunity with its CPUs.
- Broadcom is set to benefit from a surge in custom AI chips.
The early innings of the artificial intelligence (AI) infrastructure buildout have been dominated by training, as companies rush to create the best AI models. However, according to reports, the AI inference market could climb from around $106 billion to nearly $255 billion by 2030.
Let’s look at three stocks set to benefit from this upward trend.
Nvidia
While Nvidia (NASDAQ: NVDA) is known for its dominance in large language model (LLM) training, the company is also the leader in AI inference. Through Nvidia NIM (Nvidia Inference Microservices), it offers prebuilt, optimized inference microservices. Meanwhile, its Blackwell GB300 Ultra graphics processing units (GPUs) have been optimized for inference and agentic AI, and its upcoming Vera Rubin platform is expected to improve inference performance further.
However, it is the company's reported acquisition of Groq's employees and licensing of its technology that could really set it up to be an AI inference winner. Groq developed a new type of chip, the language processing unit (LPU), designed specifically for AI inference. Nvidia plans to integrate this technology with its CUDA software platform and networking infrastructure to strengthen its inference offering. As such, I wouldn't overlook Nvidia in the inference market, where it should remain a winner.
Advanced Micro Devices
Since Nvidia’s CUDA moat isn’t as wide in inference as it is in training, this opens the door for Advanced Micro Devices (NASDAQ: AMD) to take some share. The company has already done a nice job carving out a niche in the inference market, so the overall growth of the market should benefit it, especially given its much smaller revenue base than Nvidia.
Meanwhile, AMD is set to benefit from an investment by OpenAI and the start-up's commitment to deploy 6 gigawatts worth of its GPUs. With 1 gigawatt of chips worth about $35 billion based on the price of Nvidia GPUs, that's a big upcoming growth driver for the company. OpenAI will use the chips specifically for inference, which could also open the door to inference deals with other companies.
Also not to be overlooked in the AMD story is the importance of central processing units (CPUs) to agentic AI. CPUs act as the brains of a computer, and with AI agents, they are becoming a more important part of the AI infrastructure story. Between rising AI inference demand and growing data center CPU demand, AMD looks well-positioned for the future.
Broadcom
As companies look to reduce AI infrastructure compute costs, they have been increasingly turning to AI ASICs (application-specific integrated circuits). ASICs are custom chips that are hardwired for specific tasks, and as such, they tend to perform these tasks very well while also being more energy-efficient. This becomes increasingly important with inference since it is an ongoing cost that consumes power every time it answers a query or completes a task.
As a leader in ASIC technology, Broadcom (NASDAQ: AVGO) is one of the best ways to play this trend. The company provides the building blocks to help take its customers’ chip designs and turn them into physical chips. Meanwhile, it also has important relationships with memory makers and foundries to secure important components and manufacturing capacity for these chips so that they can be manufactured at scale.
Broadcom helped Alphabet design its highly regarded tensor processing units (TPUs), and this alone is a big opportunity, especially as Alphabet is now letting customers deploy TPUs through Google Cloud. Anthropic has already placed a $21 billion TPU order with Broadcom for this year, while a nice chunk of Alphabet's approximately $180 billion in capital expenditures this year will likely go to TPUs as well. Meanwhile, the company is bringing in new ASIC customers, including OpenAI, which has committed to 10 gigawatts worth of chips.
With the inference market set to surge, Broadcom looks poised to be one of the biggest winners in the chip space.
Geoffrey Seiler has positions in Alphabet and Broadcom. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, and Nvidia. The Motley Fool recommends Broadcom. The Motley Fool has a disclosure policy.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.