Micron’s AI Windfall Forces a Rethink of Memory Cycles

Highlights
  • Q2 revenue: $23.9B (+75% QoQ, +196% YoY; 4th consecutive record)
  • DRAM revenue: $18.8B (+207% YoY), 79% of total
  • NAND revenue: $5.0B (+169% YoY), 21% of total
  • Gross margin: 75% (record; +18 ppt QoQ), guided to ~81% in Q3
  • Operating income: $16.5B, 69% margin
  • Non-GAAP EPS: $12.20 (+155% QoQ, +682% YoY)
  • Free cash flow: $6.9B (record; +77% vs prior record in Q1)
  • Net cash: $6.5B (record), debt reduced by $1.6B in Q2
  • Q3 revenue guidance: $33.5B ± $0.75B (another record)
  • Q3 EPS guidance: $19.15 ± $0.40 (record)
  • Fiscal 2026 CapEx: >$25B; fiscal 2027 construction CapEx up >$10B YoY
  • Quarterly dividend: +30% to $0.15/share
  • DRAM and NAND supply shortages; some key customers receiving only 50–66% of demand
  • PC and smartphone units expected to fall by a low-double-digit percentage in 2026 amid tight memory supply

A new ceiling for margins in an AI-constrained world

Micron Technology’s latest quarter reads less like a traditional memory-cycle upswing and more like a structural reset. Revenue in the fiscal second quarter of 2026 surged to $23.9bn, almost tripling year-on-year and marking the fourth consecutive quarterly record. Yet it is the profitability metrics that signal how sharply the economics of memory have shifted in the AI age.

Gross margin hit 75%, up 18 percentage points in just three months and nearly double last year’s level. Operating margin reached 69%, turning Micron’s P&L into something more akin to a software franchise than a commodity hardware vendor. Management is now guiding to an 81% gross margin in the current quarter, a level that would have seemed fanciful in prior memory booms.

The company insists this is not a fleeting spike. Chief executive Sanjay Mehrotra framed AI as having “fundamentally recast memory as a defining strategic asset,” while chief financial officer Mark Murphy argued that historical precedents are of limited use: AI is both expanding the total addressable market and binding the industry in structural supply constraints that will “remain very tight beyond 2026.”

AI demand meets physical limits of supply

Underneath the headline numbers lies a simple story: AI systems are devouring memory faster than the industry can physically add cleanrooms and tools.

On the demand side, DRAM and NAND consumption by data centres is set to exceed half of total industry bit demand for the first time in calendar 2026. Advanced AI accelerators now require roughly double the DRAM content they did a year ago, with both training and inference workloads pushing architectures toward higher-capacity, higher-bandwidth memory. Micron is riding this wave with new generations of products: HBM4 ramping in volume, HBM4E in development for a 2027 launch, and an expanding portfolio of DDR, LPDRAM and SSDs tailored to AI-optimised system designs.

On the supply side, constraints are biting. Micron expects industry DRAM bit shipments in 2026 to grow only in the low-20s percent, and NAND at roughly 20%, with several structural bottlenecks: limited cleanroom space, long construction lead times, higher HBM mix consuming more wafer capacity, and declining bits-per-wafer gains from node shrinks. Some rivals are even reallocating NAND cleanroom space to DRAM, further tightening flash supply.

Micron itself plans to grow its DRAM and NAND bit output only in line with the industry in 2026, despite “demand significantly in excess” of its available NAND supply and DRAM inventory days sitting below 120. The upshot is a pricing environment where bit shipments are up only modestly, but prices have risen in the mid-60s percent range for DRAM and high-70s percent for NAND quarter-on-quarter, magnified by favourable mix.
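The interplay of modest bit growth and steep price rises can be made concrete, since quarterly revenue growth decomposes multiplicatively into bit, price and mix effects. A minimal sketch — the bit-growth figures below are illustrative assumptions, as the company quotes price moves but not exact bit shipments:

```python
# QoQ revenue growth decomposes as:
# (1 + revenue_growth) = (1 + bit_growth) * (1 + price_growth) * (1 + mix_effect)

def revenue_growth(bit_growth: float, price_growth: float, mix_effect: float = 0.0) -> float:
    """Return the QoQ revenue growth implied by bit, price and mix changes."""
    return (1 + bit_growth) * (1 + price_growth) * (1 + mix_effect) - 1

# Reported: DRAM prices up ~65% QoQ ("mid-60s"), NAND up ~77% ("high-70s").
# The bit-growth inputs are ILLUSTRATIVE assumptions, not disclosed values.
dram = revenue_growth(bit_growth=0.05, price_growth=0.65)
nand = revenue_growth(bit_growth=0.03, price_growth=0.77)
print(f"Implied DRAM QoQ revenue growth: {dram:.0%}")
print(f"Implied NAND QoQ revenue growth: {nand:.0%}")
```

Even with bit shipments up only a few percent, price moves of this size mechanically produce revenue growth in the 70–80% range before mix effects.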

Segment records across the board

The surge in pricing and mix has washed through every corner of Micron’s business. All four operating units posted record revenues and rich margins, underscoring how broadly AI-driven demand and tight supply are reshaping the portfolio.

Cloud Memory, Micron’s core DRAM franchise for hyperscaler data centres, generated $7.7bn in revenue, up 47% sequentially, at 74% gross margins. The separate Core Data Center Business Unit, which leans more heavily on SSDs and broader infrastructure products, reached $5.7bn, also at 74% margins and with a 23-point sequential margin expansion.

The Mobile and Client segment – long the cyclical heart of DRAM and NAND – has been pulled into the AI slipstream as PCs and smartphones adopt on-device agentic AI features. Revenue there rose 81% sequentially to $7.7bn, with gross margin at 79%. Micron highlighted that AI-capable PCs now recommend at least 32GB of memory, twice the average, while new “personal AI workstations” ship at 128GB. In smartphones, flagship devices with 12GB or more DRAM have jumped to nearly 80% of shipments from under 20% a year earlier.

Even the more prosaic automotive and embedded business is enjoying the AI uplift. Revenues reached a record $2.7bn, up 57% sequentially, with 68% gross margins. As Level 2+ driver assistance proliferates and the industry eyes L4 autonomy with over 300GB of DRAM per vehicle, Micron is seeding the market with 1γ LPDDR5 DRAM and G9-based UFS 4.1 NAND for automotive.
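The four segment records can be cross-checked against the company total. A quick sanity check using the figures quoted above (the small gap is rounding in the reported numbers):

```python
# Segment revenues as reported ($bn), vs the $23.9bn company total.
segments = {
    "Cloud Memory": 7.7,
    "Core Data Center": 5.7,
    "Mobile and Client": 7.7,
    "Automotive and Embedded": 2.7,
}
total_reported = 23.9
segment_sum = sum(segments.values())
print(f"Segment sum: ${segment_sum:.1f}bn vs reported total ${total_reported:.1f}bn")
# The sum is $23.8bn -- within rounding of the reported total.
assert abs(segment_sum - total_reported) < 0.2
```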

Balance sheet muscle and a capex supercycle

The financial transformation is as striking on the balance sheet as in the income statement. Operating cash flow in the quarter totalled $11.9bn, with capital expenditure of $5bn producing $6.9bn of free cash flow – a company record, and 77% above the previous high set just one quarter earlier. Net cash now stands at $6.5bn, the highest in Micron’s history, even after repurchasing $350m of stock and paying down $1.6bn of debt.
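The cash-flow figures tie out arithmetically. A minimal check using the numbers above; the prior-quarter figure is implied from the stated 77% growth rather than disclosed directly:

```python
# Free cash flow = operating cash flow - capital expenditure ($bn).
operating_cash_flow = 11.9
capex = 5.0
free_cash_flow = operating_cash_flow - capex
print(f"Free cash flow: ${free_cash_flow:.1f}bn")  # matches the reported $6.9bn record

# The prior record was set one quarter earlier; reported growth was +77%,
# implying a prior-quarter FCF of roughly $3.9bn.
prior_fcf = free_cash_flow / 1.77
print(f"Implied prior-quarter FCF: ~${prior_fcf:.1f}bn")
```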

This torrent of cash is being recycled into what amounts to a global construction programme. Micron now expects fiscal 2026 capital spending to exceed $25bn, with fiscal 2027 capex stepping up “meaningfully” as both construction and equipment outlays rise to support HBM and DRAM investments. Construction spending alone is slated to increase by more than $10bn year-on-year in 2027.

The list of projects reads like a map of the new AI industrial base. In DRAM, Micron has closed its acquisition of Powerchip Semiconductor’s Tongluo site in Taiwan ahead of schedule, which should support meaningful shipments from the existing fab by fiscal 2028, with a second cleanroom to follow. In the US, its first Idaho fab is targeted for initial wafer output in mid-2027, with ground preparation already under way for a second. A new fab in New York has broken ground, ahead of plan, and Hiroshima in Japan is being prepped for cleanroom expansion.

In NAND, Micron has committed to a new fab in Singapore, underpinned by a stronger demand outlook and the decision to colocate R&D with manufacturing – a move the company believes will accelerate time to market. Initial wafer output there is pencilled in for 2028. Assembly and test are also scaling: a vast new facility in India has begun commercial shipments, while Singapore will host advanced HBM packaging capacity that should contribute meaningfully from 2027.

Investors, meanwhile, are being offered a modest but symbolically important uptick in direct returns. The quarterly dividend is being raised 30% to $0.15 a share, acknowledging what management calls “the sustained strength” of Micron’s technology leadership and cash generation. Yet reinvestment remains the clear priority: the company expects to increase operating expenses as it ramps R&D in fiscal 2027 to pursue what Mehrotra described as an “unprecedented set of opportunities” in memory and storage.

Strategic contracts and a new AI-era business model

Perhaps the most telling change, though, is in how Micron is trying to insulate itself from the brutal down-cycles that have historically plagued memory. The company has been evolving its customer contracts from traditional one-year long-term agreements to what it now labels “strategic customer agreements,” or SCAs. It has just signed its first five-year SCA and is in talks with multiple customers across markets for similar deals.

Unlike past LTAs, these SCAs include specific multiyear commitments designed to give Micron better visibility into demand and stabilise its business model. For customers, they offer supply assurance in a persistently tight market and closer R&D collaboration on product roadmaps – particularly important for customised high-bandwidth memory stacks coupled tightly to next-generation AI accelerators.

Management was careful not to disclose economic details, citing confidentiality. But the thrust is clear: in a world where some key customers are receiving only half to two-thirds of the memory they request, the bargaining power of a leading-edge supplier has risen. Mehrotra characterised memory as “one of the biggest beneficiaries and enablers of AI,” and the SCAs as a contractual acknowledgement of that status.

Sector implications: beyond the old memory playbook

For investors used to trading the memory cycle, Micron’s Q2 results pose an uncomfortable question: how much of this is still cyclical, and how much is secular?

On one side sits a familiar pattern. Prices have risen sharply as supply tightened, driving exceptional margins and cash flow. PCs and smartphones, more price-sensitive segments, are expected to see unit declines in the low double-digits in 2026, a sign that high memory prices are starting to ration demand at the edge.

On the other side are AI-driven dynamics that look stubbornly durable. Memory is being pulled closer to the core of system architecture, with LPDRAM and high-performance SSDs sitting alongside HBM as critical levers for token cost, latency and energy per inference. New categories – personal AI workstations, agentic AI PCs, AI-embedded vehicles, and even humanoid robots – are emerging with memory footprints that dwarf legacy devices.

Micron’s guidance suggests it expects to ride this wave much further. Third-quarter revenue is projected at $33.5bn, plus or minus $750m, with record EPS of around $19.15. Gross margin is guided up another six percentage points to 81%. Management’s repeated assertion that supply-demand tightness will extend “beyond 2026” is not simply a comment on the next quarter: it is a statement about the physical and financial cadence of building the AI infrastructure layer.
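Taken at the midpoints, the guidance implies another step-change quarter. A quick sketch of the implied sequential growth, using the Q2 actuals and Q3 guidance quoted above:

```python
# Q3 guidance midpoints vs reported Q2 actuals.
q2_revenue, q3_revenue_mid = 23.9, 33.5    # $bn; guidance is +/- $0.75bn
q2_eps, q3_eps_mid = 12.20, 19.15          # non-GAAP; guidance is +/- $0.40

rev_growth = q3_revenue_mid / q2_revenue - 1
eps_growth = q3_eps_mid / q2_eps - 1
print(f"Implied QoQ revenue growth: {rev_growth:.0%}")  # ~40%
print(f"Implied QoQ EPS growth: {eps_growth:.0%}")      # ~57%
```

EPS guided to grow faster than revenue is consistent with the six-point gross-margin step-up to 81%.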

For now, the company has the cash, the technology roadmap, and – increasingly – the contractual tools to try to turn what was once a boom-bust commodity business into something more structurally advantaged. Whether AI ultimately smooths the memory cycle or simply elevates it to a higher, more volatile plateau will be a question for future quarters. But on the numbers just reported, Micron has already moved the goalposts for what investors expect from a memory manufacturer.