The Nasdaq Composite and S&P 500 indexes' slide into correction territory has opened up a lot of good buying opportunities for investors, including in the semiconductor space. Among them are the stocks of the two leading companies in the artificial intelligence (AI) chip space: Nvidia (NVDA 0.76%) and Broadcom (AVGO -2.19%).
As of this writing, both stocks are down a little more than 20% from their recent highs. Let's examine why these two AI chip stocks look like no-brainer buys at these levels.
Nvidia
Nvidia is the undisputed champion of AI chips with its graphics processing units (GPUs). While originally designed to speed up graphics rendering in video games, today GPUs are used for a variety of tasks. The most important of these uses, at least in terms of revenue, is training large language models (LLMs) and running AI inference.
GPUs have become a key part of AI infrastructure due to their parallel processing capacity and high memory bandwidth, which together allow large amounts of data to be processed quickly. Nvidia is not the only company that designs GPUs; Advanced Micro Devices, and to a lesser extent Intel, also make these chips. However, Nvidia has a dominant market share of over 80% in the GPU space as a result of its CUDA software platform.
Nvidia created CUDA back in 2006 to allow developers to program its chips for tasks beyond their original purpose, in order to expand the market for GPUs. The use of GPUs in other areas was slow to develop, and AMD did not introduce a competitive software offering until about 10 years later. Meanwhile, developers learned to program GPUs on Nvidia's platform, and the company gained a decade-long head start in improving it. With the advent of AI, Nvidia was able to build a collection of GPU-accelerated libraries and tools designed specifically for AI on top of the CUDA platform with CUDA-X. This created the wide moat the company enjoys today.
As such, as long as AI infrastructure spending continues to be strong, Nvidia remains very well positioned, as its GPUs remain the main chips used to train AI models and run inference. AI infrastructure spending remains robust, as cloud computing companies build out data centers to keep up with growing demand and a number of tech companies rush to create better AI models. Currently, the chief way to improve AI models is through brute computing power, which means using more and more AI chips.
The recent market sell-off leaves the stock attractively valued. It trades at a forward price-to-earnings (P/E) ratio of 27 times 2025 analyst estimates and a price/earnings-to-growth (PEG) ratio of about 0.5, with PEG ratios below 1 often viewed as the threshold for a stock being considered undervalued.
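For readers unfamiliar with the PEG metric, the arithmetic behind the figures above can be sketched in a few lines of Python. Note that the roughly 54% growth rate below is merely implied by the quoted forward P/E and PEG, not a reported analyst estimate:

```python
def peg_ratio(forward_pe: float, expected_growth_pct: float) -> float:
    """PEG = forward P/E divided by expected annual earnings growth (in percent)."""
    return forward_pe / expected_growth_pct

# A forward P/E of 27 alongside a PEG of about 0.5 implies expected
# earnings growth of roughly 54% (27 / 54 = 0.5) -- comfortably below
# the threshold of 1 often used to flag a stock as undervalued.
print(round(peg_ratio(27.0, 54.0), 2))  # 0.5
```

The same two inputs let you back out any one figure from the other two, which is why PEG is a quick sanity check on whether a high P/E is justified by growth.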
Broadcom
Broadcom participates in the AI infrastructure market in two main ways. The first is its networking technology portfolio, through which it supplies switches, network interface cards (NICs), and other components. Switches are an essential part of AI infrastructure because they transfer data between GPUs and servers while managing the flow of data to avoid network congestion. NICs, meanwhile, let computers communicate with one another so that AI workloads can be distributed across multiple servers.
Broadcom is a leader in Ethernet switching technology, where it competes with other Ethernet switching companies as well as with Nvidia's InfiniBand switching technology. As AI clusters grow bigger, the need for high-bandwidth, low-latency Ethernet switches grows with them, which is benefiting Broadcom.
The other area in which Broadcom is involved in the AI infrastructure buildout is by helping customers develop custom chips, known as application-specific integrated circuits (ASICs), for AI. These custom chips are designed for very specific tasks, and as such they can outperform mass-market GPUs while also having lower power consumption. The downside is that these custom chips offer less flexibility, while the cost and time to develop them is much greater.
The company's first custom AI chip customer was Alphabet, which Broadcom helped develop the Trillium sixth-generation tensor processing unit (TPU). It took about 15 months for the chip to be designed and deployed within Google Cloud's data centers, which was considered quick. Since then, Broadcom has continued to add AI chip customers, bringing its total to seven.
Among its three AI chip customers that are furthest along in their development, Broadcom sees a serviceable addressable market of between $60 billion and $90 billion in its fiscal year 2027 (ending October 2027) as they look to deploy clusters of 1 million AI chips. For context, Broadcom ended last quarter at about a $16 billion AI revenue run rate (its most recent quarter's AI revenue, annualized).
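The run-rate convention is simple to verify: annualize the latest quarter by multiplying it by four. The sketch below uses a roughly $4 billion quarterly figure, which is simply implied by the quoted $16 billion run rate rather than separately reported:

```python
def annualized_run_rate(quarterly_revenue_b: float) -> float:
    """A run rate annualizes the latest quarter: quarterly revenue times 4."""
    return quarterly_revenue_b * 4

# A ~$16 billion AI run rate implies roughly $4 billion of AI revenue
# in the most recent quarter (4 * 4 = 16).
print(annualized_run_rate(4.0))  # 16.0
```

Comparing that $16 billion run rate against a $60 billion to $90 billion addressable market is what makes the fiscal 2027 opportunity look so large relative to today's business.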
While Nvidia will capture its fair share of this market as well, this is a huge opportunity for Broadcom. Meanwhile, it should see continued AI growth as its new AI chips move into production in later years.
The sell-off has dropped Broadcom's valuation to a forward P/E ratio of just over 29, which will prove inexpensive if the company can capture a meaningful share of the AI opportunity in front of it.