AI Chip Makers Diverge on Growth Paths
March 30, 2026 at 15:17 UTC

Key Points
- AMD, Intel (INTC), Micron (MU), Broadcom (AVGO) and Marvell all report surging AI-related demand but with sharply different profit profiles
- AMD posts record data center results while Intel (INTC) absorbs heavy foundry losses and bets on its 18A node
- Micron (MU) sells out 2026 HBM supply despite a sector-wide pullback after Google’s TurboQuant announcement
- Broadcom (AVGO) and Marvell extend leadership in custom AI silicon as new switches and accelerators ramp
AI Infrastructure Boom Reshapes Semiconductor Leaders
Across the semiconductor sector, AI infrastructure spending is driving rapid growth, but recent results show a clear divergence in how major chip makers are converting that demand into revenue and profit. The latest disclosures from AMD, Intel (INTC), Micron (MU), Broadcom (AVGO) and Marvell highlight contrasting strategies and risk profiles as they chase data center and accelerator opportunities.
The SPDR S&P Semiconductor ETF (XSD), which holds 43 names on an equal‑weight basis, illustrates the mixed picture. The fund is up roughly 52% over the past year but down about 8% in the past month, reflecting a structural AI demand boom colliding with near‑term supply frictions and macro uncertainty.
AMD’s Record Data Center Quarter and AI Megadeals
Advanced Micro Devices (AMD) closed 2025 with record Q4 revenue of $10.27 billion, up 34.1% year over year and ahead of estimates. Data Center revenue reached a record $5.38 billion, up 39% year over year, driven by EPYC server processors and Instinct GPU shipments. Client revenue rose 34% to $3.10 billion, while Gaming jumped 50% to $843 million.
AMD’s AI strategy centers on scaling EPYC CPUs and Instinct accelerators into hyperscale data centers while expanding its ROCm software ecosystem. The company has secured large infrastructure commitments, including a 6 gigawatt Instinct GPU deployment commitment from OpenAI, an Oracle order for an AI supercluster of 50,000 Instinct MI450 GPUs targeted for Q3 2026, and a deal with Cisco and HUMAIN to build 1 gigawatt of AI infrastructure by 2030.
Analysts and banks have highlighted these “gigawatt‑scale” deals as potential share‑gain drivers in the accelerator market. UBS maintains a $310 price target on AMD, implying about 54% upside from a recent close at $201.99, and characterizes the stock as primarily a second‑half 2026 story as MI450 shipments tied to OpenAI and Meta ramp.
Intel Leans on a Costly Foundry Turnaround
Intel’s most recent quarter presents a contrast. Revenue of $13.67 billion declined 4.1% year over year, and the company reported a GAAP net loss of $591 million. Its Client Computing Group fell 7% to $8.19 billion, while Data Center and AI revenue grew 9% to $4.74 billion, well behind AMD’s data center growth rate.
Intel Foundry was a major drag, posting a $2.51 billion operating loss in the quarter. CEO Lip‑Bu Tan is staking the company’s recovery on its U.S.‑based Intel 18A process node. Panther Lake, the first client system‑on‑chip on 18A, is expected to power more than 200 OEM designs, and Nvidia (NVDA) has invested $5 billion in Intel’s foundry turnaround.
The company has guided Q1 2026 non‑GAAP EPS to approximately breakeven and flagged that future nodes such as Intel 14A could be paused if external customers do not materialize. That warning underscores how central the foundry revival is to Intel’s strategy, and to equal‑weight vehicles like XSD that give the stock influence similar to that of faster‑growing peers.
Micron Rides HBM Demand Through Algorithm Jitters
Micron Technology has emerged as a key AI memory supplier. It is the only U.S.‑based DRAM manufacturer and a primary U.S. NAND producer, and its CEO describes Micron as one of the industry’s biggest enablers of AI. The stock is up 291.9% over the past year but recently pulled back after Google announced its TurboQuant algorithm on March 24, which significantly reduces memory usage in AI workloads.
The TurboQuant news sparked a sector‑wide selloff, with Micron down 15.5% over the past week and 13.4% over the past month and Lam Research (LRCX) off 9.4% on the announcement day. Yet Micron continues to post strong fundamentals: revenue climbed from $8.053 billion in Q2 FY2025 to $23.86 billion in Q2 FY2026, with Q3 FY2026 guidance of $33.5 billion and GAAP gross margin guided to 67%.
Micron says its entire calendar 2026 high‑bandwidth memory supply, including its HBM4 products, is already sold out, with pricing and volumes agreed. HBM4 for Nvidia’s Vera Rubin platform entered mass production in late March, and the stock trades at $357.22 against a consensus target of $527.60, implying about 47% upside, with 38 of 43 covering analysts rating it Buy or Strong Buy.
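The “implied upside” figures quoted for AMD and Micron follow from the standard target‑versus‑price calculation. A minimal sketch, using the prices and targets cited above (the helper name is illustrative, not from any source):

```python
def implied_upside(price: float, target: float) -> float:
    """Percent upside implied by an analyst price target: (target / price - 1) * 100."""
    return (target / price - 1.0) * 100.0

# Figures cited in the article
amd = implied_upside(201.99, 310.00)   # UBS target vs. recent AMD close
mu = implied_upside(357.22, 527.60)    # consensus target vs. Micron's price

print(f"AMD implied upside: {amd:.1f}%")  # 53.5%, rounded to "about 54%" in coverage
print(f"MU implied upside: {mu:.1f}%")    # 47.7%, the "about 47%" cited
```

The small rounding gaps (53.5% reported as 54%, 47.7% as 47%) are typical of how these figures are quoted in coverage.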
Custom AI Silicon: Broadcom and Marvell Expand Roles
Custom accelerators are another growth pillar. Broadcom, described as the market leader in custom AI accelerators with more than 70% share, serves major hyperscalers including Alphabet (GOOGL), Meta Platforms (META), OpenAI and Anthropic. In Q1 2026, Broadcom’s revenue exceeded $19 billion, up 29% year over year, while AI semiconductor revenue grew 106%. It has guided Q2 revenue to $22 billion, up roughly 47% from the year‑ago quarter.
Marvell Technology, a smaller rival focused on custom silicon and high‑speed interconnects, reported record fiscal 2026 revenue of nearly $8.2 billion, up 42%, with earnings per share rising 81%. Data center revenue reached $1.52 billion in Q3 FY2026, 73% of total revenue and up 38% year over year, and management now expects faster data center growth next year than it previously forecast.
Marvell recently broadened its AI networking franchise with new Structera S 60260 PCIe 6.0 and S 30260 CXL switches, offering 260 lanes to boost bandwidth, memory pooling and efficiency in AI data centers. By combining these with earlier CXL, PCIe and optical products, Marvell aims to provide an end‑to‑end interconnect fabric for increasingly memory‑hungry AI workloads.
ETF and Market Signals to Watch
For diversified investors, XSD’s equal‑weight construction means outperformers like Micron and Marvell are periodically trimmed, while turnaround stories such as Intel’s foundry segment maintain similar influence. The fund currently gives Micron a 4.21% weight and Marvell 2.25%, and has fallen about 8% in a month even as it is up sharply year on year.
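The trimming effect described above comes from the mechanics of equal weighting: between rebalances, holdings drift with their returns, and each rebalance resets every position to 1/N, selling down outperformers. A toy three‑name sketch (tickers and returns are purely hypothetical, not XSD’s actual methodology or holdings data):

```python
def drift(weights, returns):
    """Weights after one period of returns, before any rebalancing."""
    grown = [w * (1 + r) for w, r in zip(weights, returns)]
    total = sum(grown)
    return [g / total for g in grown]

names = ["MU", "MRVL", "INTC"]   # illustrative subset of an equal-weight basket
weights = [1 / 3] * 3            # equal weight at the last rebalance
returns = [0.50, 0.20, -0.10]    # hypothetical one-period returns

for name, w in zip(names, drift(weights, returns)):
    # Rebalancing resets each name to 1/3, trimming whatever drifted above it
    print(f"{name}: drifted to {w:.1%}, reset to {1/3:.1%} at rebalance")
```

In this sketch the winner drifts from 33.3% to 41.7% of the basket and is sold back down, while the laggard drifts to 25.0% and is bought back up, which is why equal‑weight funds keep turnaround names like Intel as influential as their faster‑growing peers.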
Taken together, the source reports highlight two key signals for the next year: the durability of hyperscaler AI capital spending, which underpins demand for data center logic, memory and interconnects, and Intel’s progress in narrowing its foundry losses and attracting paying customers to its 18A node. Together, they will shape how the AI infrastructure boom translates into returns across the semiconductor landscape.
Key Takeaways
- AI infrastructure demand is lifting revenue across multiple chip makers, but profitability and execution vary sharply by company and business model.
- AMD, Micron, Broadcom and Marvell are translating AI growth into record data center and memory results, while Intel’s strategy remains centered on a high‑risk foundry turnaround.
- Sold‑out 2026 HBM capacity and large, multi‑gigawatt accelerator deals point to continued AI build‑out, even as events like Google’s TurboQuant trigger bouts of sector volatility.
- Equal‑weight vehicles such as XSD concentrate investors’ attention on both AI capex trends and Intel’s foundry trajectory, since both can swing ETF performance despite strong peers.
References
1. https://247wallst.com/investing/2026/03/30/amd-vs-intel-which-stock-will-lead-in-2026/
2. https://finance.yahoo.com/m/0853b18f-9826-3973-9932-522bbe35cb8a/amd-vs-intel%3A-which-stock.html
3. https://simplywall.st/stocks/us/semiconductors/nasdaq-mrvl/marvell-technology/news/marvell-technology-mrvl-is-up-52-after-launching-260-lane-ai
4. https://www.ad-hoc-news.de/boerse/news/ueberblick/advanced-micro-devices-inc-stock-ai-momentum-and-strategic-positioning/69029410