Something strange is happening in the AI economy.
Take Broadcom (NASDAQ:AVGO). The semiconductor giant just delivered one of the most explosive growth outlooks in tech. Revenue is rising sharply. AI demand is surging. And the company now has a line of sight to over $100 billion in annual AI chip revenue by 2027.
Those numbers would have sounded absurd just a few years ago. Today, they barely move the needle for investors.
That paradox says a lot about where the market is in 2026. AI expectations have become enormous. Companies are announcing growth figures that would once have defined a decade. Yet the market response often feels muted. Nvidia, Microsoft, Amazon, and Alphabet have all experienced similar reactions despite strong AI results.
Broadcom sits right in the middle of that phenomenon. The company is producing remarkable numbers, but investors are asking harder questions. Who ultimately earns returns from this spending? How sustainable is the AI boom? And when will the profits show up outside the chip industry?
Broadcom’s latest results reveal something deeper than a single earnings report. They show how the AI market has entered an era where even extraordinary growth struggles to impress.
AI Growth Has Become Too Big To Surprise Anyone
Broadcom’s latest quarter reads like a highlight reel from the AI boom.
The company reported $19.3 billion in quarterly revenue, up 29% year over year. Semiconductor revenue alone reached $12.5 billion, growing 52%. Much of that surge came from AI chips, which generated $8.4 billion in revenue during the quarter. That figure more than doubled compared with last year.
The momentum appears to be accelerating. Broadcom expects AI semiconductor revenue to reach $10.7 billion next quarter, representing growth of roughly 140% year over year. Overall company revenue could reach about $22 billion, which would mean 47% growth.
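As a rough sanity check, the guided growth rates imply specific year-ago figures. The sketch below back-solves them from the numbers cited above; results are approximate because the article's figures are rounded.

```python
# Back-of-the-envelope check of the guided growth figures (inputs from the
# article, in $ billions; rounding makes the outputs approximate).

def implied_prior_year(current: float, yoy_growth: float) -> float:
    """Year-ago revenue implied by current revenue and a YoY growth rate."""
    return current / (1 + yoy_growth)

# Guided AI semiconductor revenue of $10.7B at ~140% YoY growth
prior_ai = implied_prior_year(10.7, 1.40)
print(f"Implied year-ago AI revenue: ${prior_ai:.1f}B")  # ≈ $4.5B

# Guided total revenue of ~$22B at 47% YoY growth
prior_total = implied_prior_year(22.0, 0.47)
print(f"Implied year-ago total revenue: ${prior_total:.1f}B")  # ≈ $15.0B
```

In other words, the guidance implies AI chip revenue more than doubling from a base of roughly $4.5 billion a year earlier, consistent with the trajectory described above.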
Those numbers would normally ignite a rally across the semiconductor sector. Yet investors have grown surprisingly cautious about AI results.
Part of the reason is simple. AI growth has become so large that it no longer feels surprising. Nvidia recently reported record demand. Microsoft and Amazon are spending tens of billions on data centers. Alphabet continues to expand AI infrastructure across its cloud platform.
Against that backdrop, Broadcom’s results feel less like an outlier and more like another chapter in the same story.
The market now expects extraordinary growth. Anything less than perfection invites skepticism. Ironically, that dynamic makes the most powerful growth cycle in semiconductor history feel strangely routine.
Six Customers Are Quietly Driving The Entire AI Infrastructure Boom
Another remarkable detail in Broadcom’s earnings call received little attention. The company’s AI future depends on just six major customers.
Those customers include some of the biggest names in artificial intelligence: Google, Meta Platforms, OpenAI, and Anthropic. Two additional large technology players are also part of the group. Each is developing massive computing clusters to train and deploy large language models.
Broadcom works closely with these companies to design custom AI accelerators, known as XPUs. Unlike general-purpose GPUs, these chips are optimized for specific workloads. That approach can reduce costs and improve performance for training or inference tasks.
Because the partnerships are long term, Broadcom has unusual visibility into future demand. These companies plan their AI infrastructure years in advance. The roadmaps include both training chips and inference systems that run deployed AI models.
The scale of those plans is staggering. Broadcom estimates that AI demand across its customers could approach ten gigawatts of compute capacity by 2027.
That is an extraordinary number. Ten gigawatts is roughly the combined output of ten large nuclear reactors, each of which typically produces about one gigawatt. It illustrates just how large AI infrastructure projects have become.
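To put that capacity in perspective, a quick calculation shows what ten gigawatts would mean in annual energy terms. The ~1 GW-per-reactor figure is a common approximation, not a number from the article.

```python
# Rough sense of scale for 10 GW of AI compute capacity (the 2027 estimate
# cited above). The 1 GW reactor figure is a standard approximation.

capacity_gw = 10
reactor_gw = 1.0  # typical output of a large nuclear reactor
print(f"Equivalent large reactors: ~{capacity_gw / reactor_gw:.0f}")

# Energy consumed if that capacity ran continuously for a full year
hours_per_year = 24 * 365
twh_per_year = capacity_gw * hours_per_year / 1000  # GWh -> TWh
print(f"Continuous annual consumption: ~{twh_per_year:.1f} TWh")  # ≈ 87.6 TWh
```

Run continuously, ten gigawatts works out to nearly 90 terawatt-hours per year, on the order of a mid-sized country's total electricity use.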
Yet the economic impact is concentrated among a small group of companies. A handful of technology giants are driving much of the demand for AI chips, networking equipment, and data-center capacity.
Broadcom sits directly inside that ecosystem. Its relationships with those six customers may ultimately shape the company’s next decade of growth.
Custom Chips & Networking Are Becoming The New AI Battleground
Much of the public discussion around AI chips focuses on graphics processors. Nvidia’s GPUs still dominate the market for training large AI models. But the next phase of AI infrastructure may look different.
Broadcom is building its strategy around custom silicon.
The company collaborates with cloud providers and AI developers to design chips tailored to specific workloads. These custom accelerators can focus on particular tasks, such as inference or mixture-of-experts models. In some cases, they may deliver better efficiency than general-purpose GPUs.
Broadcom’s opportunity extends beyond processors. Networking hardware has become just as important inside large AI clusters.
Modern AI data centers contain thousands of accelerators that must communicate at extremely high speeds. Broadcom provides the switches and connectivity technology that link those systems together.
Products such as the Tomahawk switch platform and high-speed SerDes technology are gaining traction among hyperscale customers. AI networking revenue already represents about one-third of Broadcom’s AI segment and could approach 40% in the near future.
This combination of custom processors and networking creates a powerful position. Broadcom participates in both the compute layer and the infrastructure that connects it.
As AI clusters grow larger, networking performance becomes critical. Faster interconnects can improve training efficiency and reduce latency for inference workloads.
In other words, the AI race is not just about who builds the fastest chips. It is also about who controls the plumbing that connects them.
The AI Arms Race Is Locking Up The Entire Supply Chain
One of the more revealing moments from Broadcom’s earnings call involved the supply chain.
The company has already secured key components needed for AI production through 2028. That includes advanced substrates, packaging capacity, wafers, and other specialized materials required for leading-edge chips.
Locking in supply several years ahead is unusual in the semiconductor industry. But the AI boom has created intense competition for manufacturing capacity.
Hyperscale customers want assurance that their infrastructure projects will not be delayed by shortages. Chipmakers want to guarantee access to scarce components. The result is a wave of long-term agreements across the semiconductor ecosystem.
Broadcom’s inventory also increased during the quarter as the company prepared for future AI demand. Inventory days rose from 58 to 68 as the company accumulated components needed for upcoming deployments.
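The days-of-inventory metric behind that 58-to-68 move is a standard ratio. Here is a minimal sketch of the calculation, using hypothetical cost-of-goods-sold figures since the article reports only the resulting day counts.

```python
# Standard days-inventory-on-hand calculation. The dollar inputs below are
# hypothetical, chosen only to show the mechanics; the article gives only
# the resulting move from 58 to 68 days.

def inventory_days(inventory: float, quarterly_cogs: float) -> float:
    """Days of inventory on hand = inventory / daily COGS (91-day quarter)."""
    return inventory / (quarterly_cogs / 91)

# Hypothetical figures in $ billions
print(f"{inventory_days(2.0, 3.2):.0f} days")  # ≈ 57 days
```

Holding cost of goods sold steady, a ten-day increase in this metric means inventory itself grew by roughly a sixth, which matches the picture of components being stockpiled ahead of deployments.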
This behavior reflects a broader shift in the technology industry. AI infrastructure is increasingly treated as a strategic investment rather than short-term spending.
Companies building large language models view computing power as a core competitive advantage. Training and deploying AI systems requires massive clusters that must operate continuously.
As a result, supply chains are being secured years in advance. That kind of forward planning suggests the AI race may be entering a longer and more structural phase.
Final Thoughts
Broadcom sits at the center of the AI infrastructure buildout. Its custom accelerators, networking chips, and data-center software are becoming essential components of modern AI systems. Revenue growth has been extraordinary, and projections for 2027 suggest a dramatic expansion of the company’s semiconductor business.
Yet the market response highlights a broader shift in investor psychology. Artificial intelligence has raised expectations to unprecedented levels. Even forecasts exceeding $100 billion in AI chip revenue no longer guarantee excitement.
Valuation multiples also reflect this tension. Broadcom currently trades around 22.8x LTM enterprise value to revenue, 41.7x EV/EBITDA, and roughly 64.9x trailing earnings. These figures remain elevated, though they have declined from earlier peaks. Forward multiples have also moderated, with NTM EV/EBITDA near 19x and forward P/E around 25x.
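For readers less familiar with these ratios, the sketch below shows how such multiples are derived. The dollar inputs are placeholders chosen purely for illustration; the article supplies only the ratios themselves.

```python
# How valuation multiples of the kind quoted above are computed. The inputs
# are hypothetical placeholders (in $ billions), not Broadcom's actual
# financials; the article reports only the resulting ratios.

def valuation_multiples(ev: float, revenue: float, ebitda: float,
                        market_cap: float, net_income: float) -> dict:
    """Enterprise-value and equity multiples from trailing financials."""
    return {
        "EV/Revenue": ev / revenue,
        "EV/EBITDA": ev / ebitda,
        "P/E": market_cap / net_income,
    }

# Placeholder trailing figures, for illustration only
m = valuation_multiples(ev=1400, revenue=61, ebitda=34,
                        market_cap=1300, net_income=20)
for name, value in m.items():
    print(f"{name}: {value:.1f}x")
```

The key distinction is that EV-based multiples capture the whole capital structure (equity plus net debt), while P/E reflects only the equity, which is why the two families of ratios can diverge.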
In other words, the market still assigns Broadcom a premium valuation. At the same time, investors appear cautious about how quickly AI investments will translate into durable profits. The result is a company delivering historic growth while operating in a market that has become increasingly hard to impress.
Disclaimer: We do not hold any positions in the above stock(s). Read our full disclaimer here.