Micron’s AI Memory Bet Delivers a Blowout Quarter — But Supply Is the New Brake
Highlights
- Revenue: $13.6B (+21% QoQ, +57% YoY; third consecutive quarterly record)
- DRAM revenue: $10.8B (+20% QoQ, +69% YoY; 79% of total)
- NAND revenue: $2.7B (+22% QoQ, +22% YoY; 20% of total)
- Gross margin: 56.8% (up 11ppt QoQ; record expansion)
- Operating income: $6.4B; operating margin 47% (up 12ppt QoQ, 20ppt YoY)
- Non-GAAP EPS: $4.78 (+58% QoQ, +167% YoY)
- Free cash flow: $3.9B (quarterly record; >20% above prior record)
- Cloud Memory BU revenue: $5.3B (+16% QoQ; GM 66%, +620bps QoQ)
- Core Data Center BU revenue: $2.4B (+51% QoQ; GM 51%, +990bps QoQ)
- Mobile & Client BU revenue: $4.3B (+13% QoQ; GM 54%, +1700bps QoQ)
- Auto & Embedded BU revenue: $1.7B (+20% QoQ; GM 45%, +1400bps QoQ)
- Cash & investments: $12B; liquidity (incl. credit facility): $15.5B
- Net cash position: >$250M (after $2.7B debt reduction)
- FY26 CapEx: ~$20B (up from prior ~$18B plan; weighted to 2H; aimed primarily at HBM and 1-gamma DRAM)
- HBM TAM forecast: ~$35B in 2025 to ~$100B by 2028 (~40% CAGR; timeline pulled in by 2 years)
- Q2 FY26 guidance: revenue $18.7B ±$0.4B; GM 68% ±100bps; OpEx $1.38B ±$20M; EPS $8.42 ±$0.20 (revenue, gross margin, and EPS all at record levels)
- Ability to meet only ~50–67% of demand from several key customers in the medium term due to supply constraints
Record quarter built on scarcity and pricing power
Micron opened fiscal 2026 with numbers that underline just how radically AI has reshaped the memory industry’s economics.
Revenue in the November quarter surged to $13.6 billion, up 21% sequentially and 57% year on year, marking a third consecutive quarterly record. Every major product line and business unit set new revenue highs, with DRAM and NAND both benefiting from a mix of tight supply and aggressive pricing.
Gross margin leapt 11 percentage points sequentially to 56.8%, the largest quarterly expansion in Micron’s history. Operating margin rose to 47%, up 12 points quarter on quarter. Non‑GAAP EPS of $4.78 grew 58% sequentially and 167% year on year, far above prior guidance.
The top line was driven predominantly by DRAM:
- DRAM revenue: $10.8 billion (79% of total), up 20% sequentially and 69% year on year. Bit shipments were only “slightly” higher, but average selling prices rose by 20%, a stark illustration of tight industry supply and Micron’s pricing leverage.
- NAND revenue: $2.7 billion (20% of total), up 22% sequentially and 22% year on year. NAND bits grew in the mid‑ to high‑single digits while prices climbed in the mid‑teens percentage range.
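Those growth rates decompose roughly into bit growth multiplied by price growth. The sketch below plugs illustrative midpoints into the qualitative ranges Micron described; the exact percentages are assumptions, not disclosed values:

```python
# Rough decomposition: revenue growth ≈ (1 + bit growth) * (1 + ASP growth) - 1.
# The inputs are illustrative midpoints for Micron's qualitative commentary,
# not disclosed figures.

def revenue_growth(bit_growth: float, asp_growth: float) -> float:
    """Approximate sequential revenue growth from bit and price changes."""
    return (1 + bit_growth) * (1 + asp_growth) - 1

# DRAM: bits "slightly" higher (assume ~1%), ASPs up ~20%
dram = revenue_growth(0.01, 0.20)   # ≈ 21%, roughly in line with the +20% QoQ reported

# NAND: bits up mid-to-high single digits (assume ~7%), prices up mid-teens (assume ~14%)
nand = revenue_growth(0.07, 0.14)   # ≈ 22%, in line with the +22% QoQ reported

print(f"DRAM implied QoQ growth: {dram:.1%}")
print(f"NAND implied QoQ growth: {nand:.1%}")
```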
The company’s ability to expand margins so quickly rests less on volume and more on the scarcity value of its product. Both DRAM and NAND markets are capacity constrained, with Micron repeatedly emphasizing that “industry demand is greater than supply” and that tightness is likely to persist through and beyond calendar 2026.
Free cash flow reached an all‑time quarterly high of $3.9 billion, surpassing the previous 2018 peak by more than 20%. Operating cash flow was $8.4 billion against capital expenditures of $4.5 billion. Micron used the windfall to pay down $2.7 billion of debt and modestly return capital: $300 million of share repurchases, constrained by CHIPS Act covenants. The balance sheet now shows a net cash position of more than $250 million, with $12 billion of cash and investments and $15.5 billion of available liquidity including undrawn credit.
Inventory days stood at 126, with DRAM inventory “tight and below 120 days” — an operational data point consistent with the pricing behavior investors are seeing in the P&L.
AI turns memory into a strategic choke point
Management’s narrative framed memory as having crossed an inflection point: from a fungible component to a strategic asset that determines AI system performance.
High Bandwidth Memory (HBM) — the stacked DRAM that sits next to leading AI accelerators — is now the clearest expression of that shift. Micron has fully contracted price and volume for its entire 2026 HBM output, including its upcoming HBM4 generation. The company now expects:
- HBM total addressable market (TAM) to grow from roughly $35 billion in 2025 to about $100 billion in 2028, a 40% CAGR.
- The $100 billion milestone to occur two years earlier than its prior projections.
- The 2028 HBM TAM to exceed the size of the entire DRAM market in 2024.
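Those TAM endpoints imply the roughly 40% compound annual growth rate Micron cites. A quick back‑of‑envelope check, treating the ~$35 billion and ~$100 billion figures as point estimates three years apart:

```python
# CAGR = (end / start) ** (1 / years) - 1
# Treats 2025 (~$35B) and 2028 (~$100B) as point estimates over three years.
start, end, years = 35e9, 100e9, 3
cagr = (end / start) ** (1 / years) - 1
print(f"Implied HBM TAM CAGR: {cagr:.0%}")  # ~42%, consistent with the ~40% figure cited
```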
HBM is also structurally constraining the broader DRAM ecosystem. Producing a given number of HBM bits typically consumes about 3x the wafer capacity of the same bits of conventional DDR5, and that trade ratio rises with future HBM generations. As hyperscalers and AI chip vendors raise their HBM content per accelerator, the resulting trade‑off is squeezing supply of “standard” DRAM.
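To see why the trade ratio matters, consider a simplified model in which a fixed pool of DRAM wafer starts is split between HBM and standard DRAM. The 3:1 ratio and the mix shifts below are illustrative assumptions used only to show the mechanics:

```python
# Simplified model: a fixed pool of DRAM wafer starts is split between HBM and
# standard DRAM. At a ~3:1 trade ratio, each wafer moved to HBM yields only about
# one-third the bits it would have produced as standard DRAM, so total bit supply
# shrinks as HBM mix rises. All numbers below are illustrative assumptions.

def bit_supply(hbm_wafer_share: float, trade_ratio: float = 3.0) -> tuple[float, float]:
    """Return (standard DRAM bits, total bits) relative to an all-standard baseline."""
    standard_bits = 1.0 - hbm_wafer_share      # wafers left for standard DRAM
    hbm_bits = hbm_wafer_share / trade_ratio   # HBM output in standard-bit equivalents
    return standard_bits, standard_bits + hbm_bits

for share in (0.10, 0.20, 0.30):
    std, total = bit_supply(share)
    print(f"{share:.0%} of wafers to HBM -> standard bits {std:.0%}, total bits {total:.0%} of baseline")
```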
Micron says the gap between demand and supply across DRAM — including HBM — is now the largest the company has ever seen. In the “medium term,” it estimates it can meet only about 50% to two‑thirds of demand from several key customers, even as it pushes node transitions and accelerates capacity additions.
This is the context for Micron’s decision to raise fiscal 2026 CapEx to approximately $20 billion, up from a prior $18 billion estimate. The incremental $2 billion is aimed primarily at expanding HBM as well as the 1‑gamma DRAM node, which will be the main driver of DRAM bit growth in calendar 2026.
Yet even with that step‑up, capital intensity is running below Micron’s historical 35% of revenue benchmark, a function of both elevated revenue and the physical constraints on how fast new cleanroom space can be brought online. Management stressed that from fiscal 2025 to 2026, brick‑and‑mortar construction CapEx will “roughly double,” with further increases expected in 2027, but reiterated a commitment to keep capacity adds aligned with long‑term demand, rather than chasing short‑term pricing spikes.
Business units: data center leads, but strength is broad‑based
All four operating segments delivered record revenue and significant margin expansion, underscoring how AI is now pulling through demand not just for HBM, but for a broad array of memory types and form factors.
Cloud Memory Business Unit (CMBU):
Revenue reached $5.3 billion (39% of total), up 16% sequentially. Gross margin was 66%, up 620 basis points quarter on quarter, supported by bit growth and price increases. This segment encompasses HBM and high‑capacity server DRAM and reflects Micron’s move to treat cloud memory as a distinct, higher‑value franchise.
Core Data Center Business Unit (CDBU):
Revenue was $2.4 billion (17% of total), up 51% sequentially, with gross margin at 51%, up 990 basis points. Data center NAND revenue alone exceeded $1 billion in the quarter, lifted by momentum in performance SSDs and capacity QLC (quad‑level cell) products. Micron highlighted the world’s first PCIe Gen6 SSD, built on its G9 NAND, now moving through qualifications at multiple hyperscalers.
Mobile & Client Business Unit (MCBU):
Revenue came in at $4.3 billion (31% of total), up 13% quarter on quarter. Gross margin jumped to 54%, a 17‑point sequential increase, mainly due to pricing. PC memory demand is being pushed by Windows 10 end‑of‑life and the emergence of “AI PCs,” while smartphone DRAM content is climbing as flagship models shift to 12GB and beyond. Micron is sampling 1‑gamma LPDDR6 and higher‑density LPDDR5X parts, positioning itself for the next wave of edge AI.
Automotive & Embedded Business Unit (AEBU):
Revenue rose to $1.7 billion (13% of total), up 20% sequentially. Gross margin improved to 45%, up 14 points. Advanced driver assistance (L2+ and above) and industrial automation are each pushing higher memory content per system. Micron’s ASIL‑rated LPDDR5X and UFS 4.1 NAND for automotive and robotics have already secured “billions of dollars” in design wins, and management underscored persistent strength in legacy LPDDR4X and DDR4 for long‑lifecycle industrial and automotive applications.
What is striking from an investor perspective is the breadth of the margin expansion: every BU added at least roughly 600 basis points of gross margin quarter on quarter, and two segments added 14 points or more, driven more by pricing than by volume. That pattern reinforces the broader thesis that we are in a pricing‑led upcycle, underpinned by structural scarcity rather than just cyclical demand.
Technology roadmap: 1‑gamma DRAM, G9 NAND, and HBM4
Micron’s operational performance in this cycle rests heavily on its claim to technology leadership in both DRAM and NAND.
DRAM: The 1‑gamma node is ramping well and is set to become the majority of Micron’s DRAM bit output in the second half of calendar 2026. This node will be the primary driver of DRAM bit growth in 2026. Development is already underway for 1‑delta and 1‑epsilon, which Micron says will extend its lead with further innovations. On the data center side, Micron continues to push low‑power DRAM (LPDRAM) into servers, with 192GB LP modules enabling rack‑scale densities above 50TB at one‑third the power of traditional DDR.
NAND: The G9 node is ramping across both client and data center SSDs, with QLC mix at a record high in the quarter. G9 transitions will be the main driver of NAND bit growth in 2026, with G9 expected to become Micron’s largest NAND node later in the fiscal year. At the capacity end, Micron’s G9‑based 122TB and 245TB QLC SSDs are entering qualification at multiple hyperscalers.
HBM: The company’s HBM4 product, with >11Gbps per pin, is slated to ramp in the second calendar quarter of 2026 with what Micron characterizes as “high yields” and a faster learning curve than HBM3E. Both the base logic die and DRAM core dies are designed and manufactured in‑house on advanced CMOS and metallization processes, which management sees as a competitive differentiator for power and performance. In the nearer term, HBM3E remains in heavy demand, especially for custom AI accelerators from cloud providers, with Micron managing the mix between HBM3E and HBM4 in 2026 based on customer roadmaps.
Management reiterated that its share of the HBM market reached parity with its DRAM share around calendar Q3 and that, in the current environment, both HBM and conventional DRAM are “in high demand” and enjoy strong profitability. The company is explicitly managing the allocation between HBM and non‑HBM DRAM to balance strategic customer needs with margin optimization.
Capacity build‑out: racing against the clock
Behind the pristine P&L, Micron’s main strategic challenge is physical: building enough cleanroom capacity fast enough to keep up with AI‑driven demand.
The company laid out an ambitious, multi‑geography manufacturing roadmap:
Idaho (US):
The first new fab in Boise has been pulled in, with first wafer output now expected in mid‑2027, earlier than the previous “second half of 2027” target. A second fab in Idaho will start construction in 2026 and become operational in 2028.
New York (US):
Micron is progressing through permitting for its planned megafab complex and now expects to break ground in early 2026, with volume supply “in 2030 and beyond.” The site is central to Micron’s long‑term US manufacturing strategy under the CHIPS Act framework.
Japan:
With support from METI, Micron is investing to enable future DRAM node transitions, in coordination with Boise R&D. Additional cleanroom space in Hiroshima will support advanced DRAM nodes and improve scale economics.
Singapore:
An HBM advanced packaging facility in Singapore is on track to contribute “meaningfully” to HBM supply in 2027. Integrating HBM packaging into the existing NAND‑centric footprint is expected to create synergies between DRAM and NAND production.
India:
An assembly and test facility has begun pilot production and is set to ramp in 2026.
Despite this build‑out, there is no short‑term fix for the current tightness. Management stressed that 2026 DRAM and NAND bit shipment growth will be constrained by industry supply, not demand, with industry and Micron bit shipments both projected to grow around 20% year on year. Node transitions — 1‑gamma DRAM and G9 NAND — will be the primary vehicles for adding bits, rather than greenfield wafer starts, at least through 2026.
Contracting discipline in a seller’s market
Perhaps the most telling structural change in this cycle is emerging in Micron’s customer contracts.
The company is currently negotiating multi‑year agreements with several key customers, covering both DRAM and NAND, and in some cases bundling HBM, DDR5, and SSDs. While management declined to share detailed terms, they drew a sharp contrast with previous long‑term agreement (LTA) cycles:
- The new contracts are longer in duration, stretching into 2026, 2027, and even 2028 in some cases.
- They include “specific commitments” and a “much stronger contract structure” than prior volume‑flexible, price‑reset agreements.
- Multiple market segments are involved, not just data center hyperscalers.
Given that Micron can only meet about half to two‑thirds of demand from several key customers in the medium term, the company is in a position to demand more stringent commitments in exchange for capacity allocation. Management confirmed that HBM 2026 volumes and pricing are already locked in via these agreements, and stressed that both HBM and non‑HBM DRAM are earning attractive returns.
For investors, these contracts matter not just as a sign of pricing power today, but as a potential dampener on the extreme volatility that has historically characterized memory cycles. If a larger portion of Micron’s output is pre‑sold under binding, multi‑year frameworks, the amplitude of future downturns could be structurally lower — though that remains a hypothesis rather than a proven fact.
Outlook: more records in sight, but growth constrained by supply
For the current quarter (fiscal Q2 2026), Micron guided to:
- Revenue: $18.7 billion ± $400 million (another record).
- Gross margin: 68% ± 100 basis points, up roughly 11 points quarter on quarter and 7 points above the prior record.
- Operating expenses: $1.38 billion ± $20 million.
- Non‑GAAP EPS: $8.42 ± $0.20, another all‑time high.
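Taken at the midpoints, the guidance implies another sharp step up in profitability. A rough, non‑GAAP back‑of‑envelope, ignoring below‑the‑line items and treating the guided midpoints as point estimates, looks like this:

```python
# Back-of-envelope check on the Q2 FY26 guidance midpoints (non-GAAP, approximate).
# Below-the-line items and taxes are ignored, so these are rough implied figures,
# not company-provided numbers.

revenue = 18.7e9          # guided revenue midpoint
gross_margin = 0.68       # guided gross margin midpoint
opex = 1.38e9             # guided operating expenses midpoint

gross_profit = revenue * gross_margin
operating_income = gross_profit - opex
operating_margin = operating_income / revenue

print(f"Implied gross profit:     ${gross_profit / 1e9:.1f}B")
print(f"Implied operating income: ${operating_income / 1e9:.1f}B")
print(f"Implied operating margin: {operating_margin:.0%}")  # ~61%, vs 47% this quarter
```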
Management expects the business to “strengthen through the year” with further gross margin expansion for both DRAM and NAND, albeit at a more gradual pace than the step‑function gains of the last two quarters. Cost reductions from 1‑gamma and G9 ramps, combined with higher pricing and favorable mix, are expected to offset start‑up costs from new fabs.
The underlying message for investors is that Micron’s growth in this AI cycle is now less a function of demand — which continues to surprise to the upside in data center, PC and mobile — and more a function of how quickly it can translate leading‑edge process technology and government‑backed fabs into usable bits. In that sense, the company looks less like the cyclical price taker of old, and more like a capital‑intensive infrastructure provider sitting at the heart of the AI economy’s physical constraints.