Oracle’s AI land‑grab hits the cash flow
Highlights
- Total revenue: $16.1B (+13% YoY, vs. +9% a year ago)
- Total cloud revenue: $8.0B (+33% YoY), now ~50% of total revenue
- Cloud infrastructure (OCI) revenue: $4.1B (+66% YoY); GPU-related revenue +177% YoY
- Cloud applications revenue: $3.9B (+11% YoY); strategic back-office apps: $2.4B (+16% YoY)
- Remaining Performance Obligations (RPO): $523.3B (+433% YoY; +$68B QoQ)
- RPO to be recognized in next 12 months: +40% YoY (vs. +25% last quarter)
- Non-GAAP operating income: $6.7B (+8% YoY)
- Non-GAAP EPS: $2.26 (+51% YoY); GAAP EPS: $2.10 (+86% YoY)
- Multi‑cloud database consumption: +817% YoY; Dedicated Region & Alloy consumption: +69% YoY
- Marketplace consumption: +89% YoY
- Cloud apps deferred revenue: +14% YoY (outpacing +11% cloud apps revenue growth)
- Free cash flow: -$10B in Q2 (CapEx $12B, largely for AI/data center build‑out)
- FY26 CapEx now expected to be ~$15B higher than post‑Q1 forecast
Hyper‑growth in AI infrastructure, with an even bigger backlog
Oracle’s Q2 FY26 numbers confirm that the company has shifted decisively into AI‑infrastructure hyper‑growth.
Total revenue rose 13% year on year to $16.1 billion, the third consecutive quarter of double‑digit top‑line growth and an acceleration from 9% a year ago. Cloud revenue reached $8 billion, up 33% and now accounting for roughly half of group revenue—evidence that Oracle’s long‑promised pivot to cloud is no longer aspirational but central to the model.
The standout remains Oracle Cloud Infrastructure (OCI). Infrastructure revenue grew 66% year on year to $4.1 billion, fuelled by 177% growth in GPU‑related revenue. Oracle is positioning itself as the capacity provider of choice to large AI labs and high‑end enterprise AI users, with contracts from Meta, NVIDIA and others feeding a backlog that is now starting to look less like a pipeline and more like a balance‑sheet item in its own right.
Remaining Performance Obligations (RPO) swelled to $523.3 billion—up 433% year on year and $68 billion higher than just three months ago. The near‑term quality of that backlog is also improving: RPO due to be recognized in the next 12 months grew 40%, versus 25% in the prior quarter and 21% a year ago. Management emphasizes that most of the new RPO is tied to capacity Oracle can deliver in the near term, which should translate into revenue faster than traditional long‑dated cloud contracts.
The company now expects the incremental RPO signed in Q2 alone to underpin an extra $4 billion of revenue in FY27. Management has left its FY26 revenue outlook at $67 billion, signalling that the RPO surge is more about deepening the outer years than rescuing the current year.
Cash flow sacrificed at the altar of GPUs
The other side of this AI land‑grab shows up on the cash ledger. Q2 operating cash flow was $2.1 billion, but free cash flow was negative $10 billion after $12 billion of capital expenditure.
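For readers reconciling the headline figures, the free‑cash‑flow number is simply operating cash flow less capital expenditure:

$$\text{FCF} = \text{OCF} - \text{CapEx} \approx \$2.1\,\text{B} - \$12\,\text{B} \approx -\$9.9\,\text{B},$$

which rounds to the reported negative $10 billion.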
That CapEx is overwhelmingly destined for revenue‑generating kit—GPUs, servers, networking—and not for land or buildings, which Oracle typically accesses via leases that begin only when data centers are delivered ready‑to‑use. In practice, Oracle is buying hardware towards the end of the construction cycle and moving it into service quickly, hoping to compress the lag between cash outlay and revenue recognition to “on the order of a couple of months” for each site.
Even so, the build‑out is escalating. On the back of the Q2 bookings surge, Oracle now expects FY26 CapEx to be about $15 billion higher than it guided after Q1. Management is explicit that this is a response to demand it believes can be monetized quickly from FY27 onward.
Funding this wave is becoming a central investor question. Oracle argues it has more levers than a straightforward bond issuance:
- Customers can “bring their own chips,” reducing Oracle’s upfront GPU spend.
- Some suppliers are willing to lease chips rather than sell them, turning capex into more flexible operating commitments.
- The company can also lean on its existing debt structure across public bonds, bank lines, and private markets.
The company insists it is committed to maintaining an investment‑grade rating and says published estimates suggesting it must raise “upwards of $100 billion” to complete its AI build‑out materially overstate its borrowing needs. Even if that proves true, the negative free cash flow profile will likely persist as long as Oracle is adding capacity faster than it is filling it.
Margin architecture: build fast, fill faster
On profitability, Oracle generated $6.7 billion of non‑GAAP operating income in Q2, up 8% year on year. Non‑GAAP EPS jumped 51% to $2.26, and GAAP EPS rose 86% to $2.10, buoyed by a $2.7 billion pretax gain from the sale of its interest in chip start‑up Ampere.
The core margin story, however, is OCI. At its recent analyst day, Oracle floated a 30–40% gross margin target over the life of AI customer contracts. The question is how fast the aggregate OCI margin can converge on that range while Oracle is in heavy build‑out mode.
Oracle’s answer is that timing is more about utilization and mix than about long construction lags. Because Oracle doesn’t carry data center construction costs until facilities are delivered, the main exposure is the equipment and ramp‑up period once those GPUs are installed. Management says the window between spending on equipment and realizing the “steady‑state” margin profile is a matter of months per site, not years.
The bigger swing factor is the mix of capacity that is online and billable versus new capacity still being lit up. As more of the global footprint flips from construction to production, the average OCI margin should move towards that 30–40% range, assuming Oracle can keep filling racks roughly as fast as it is adding them.
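As a rough illustration of that mix effect, a simple weighted average captures the mechanics. The sketch below is not Oracle's model: only the 30–40% steady‑state range comes from the analyst‑day framing, while the ramp‑phase margin and the capacity shares are hypothetical assumptions chosen to show the direction of travel.

```python
# Illustrative blended-margin sketch, not an Oracle disclosure.
# Only the 30-40% steady-state range comes from the article; the ramp-phase
# margin and the capacity shares below are hypothetical assumptions.

def blended_gross_margin(share_in_production: float,
                         steady_state_margin: float = 0.35,   # midpoint of the 30-40% target
                         ramp_margin: float = -0.10) -> float: # assumed drag while racks are lit up
    """Weighted-average margin across billable and still-ramping capacity."""
    share_ramping = 1.0 - share_in_production
    return (share_in_production * steady_state_margin
            + share_ramping * ramp_margin)

for share in (0.5, 0.7, 0.9):
    print(f"{share:.0%} of capacity billable -> blended gross margin ~ {blended_gross_margin(share):.0%}")
```

Directionally, the more of the footprint that is billable, the closer the blended figure gets to the contract‑level target, which is the mechanism management describes.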
The fungibility of this capacity is a key risk mitigant. Clay Magouyrk stressed that AI customers use the same underlying OCI cloud stack as any other user. Bare‑metal instances can be securely wiped and handed to another customer in under an hour; large AI customers typically bring new capacity into productive use over two to three days. That operational flexibility is designed to reduce the tail risk of stranded capacity if a marquee AI customer pulls back.
Multi‑cloud and the database as AI’s spine
If OCI is the growth engine, the database is the spine of Oracle’s AI narrative.
The company is pushing hard on “multi‑cloud” as an architectural principle rather than a marketing phrase. Oracle now operates 147 live OCI regions, with 64 more planned, and has embedded OCI capacity inside other major clouds. In Q2 alone it launched 11 new multi‑cloud regions, bringing the total to 45 across AWS, Azure and Google Cloud, with 27 more expected shortly.
Multi‑cloud database consumption grew 817% year on year, a metric that underscores the degree to which Oracle sees itself as the neutral data layer for enterprises straddling several clouds. Dedicated Region and Alloy—programmes that essentially allow customers and partners to run a full OCI stack in their own facilities or as white‑label providers—saw consumption grow 69% year on year. Oracle now has 39 dedicated regions live, with 25 more planned.
Larry Ellison used the call to sketch a tighter integration between Oracle’s legacy strengths and the AI wave. Three planks stand out:
- AI‑ready database: Oracle has added vectorization to its flagship database, effectively turning it into an “AI database” designed to store embeddings and serve as a high‑value retrieval layer for large language models (the sketch after this list illustrates the generic pattern).
- AI data platform / lakehouse: Beyond Oracle’s own database and applications, the new AI data platform is intended to catalogue and vectorize data across object stores, third‑party databases and bespoke applications across any cloud.
- Model neutrality: Oracle is hosting leading frontier models from OpenAI, Google, xAI and Meta in its cloud and wants those models to perform “multi‑step reasoning” across private enterprise data while preserving security and isolation.
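To make the “retrieval layer” idea concrete, the sketch below shows the generic pattern in plain Python: store an embedding alongside each record, find the records nearest to a query, and pass them to a model as context. It is a vendor‑neutral illustration rather than Oracle's API; the embed() and call_llm() helpers are hypothetical placeholders.

```python
# Vendor-neutral sketch of the "vector retrieval layer" pattern.
# embed() and call_llm() are hypothetical placeholders, not Oracle APIs.
import numpy as np

# Pretend these rows live in an "AI database"; each will carry an embedding.
records = [
    {"id": 1, "text": "Q2 invoice dispute for customer ACME"},
    {"id": 2, "text": "Supply-chain delay on GPU rack delivery"},
    {"id": 3, "text": "HR policy update for remote onboarding"},
]

def embed(text: str) -> np.ndarray:
    """Placeholder embedding; a real system would call an embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

# Vectorize every record once, as the database would on ingest.
for r in records:
    r["vec"] = embed(r["text"])

def retrieve(query: str, k: int = 2) -> list[dict]:
    """Return the k records whose embeddings are closest to the query (cosine similarity)."""
    q = embed(query)
    return sorted(records, key=lambda r: float(q @ r["vec"]), reverse=True)[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for any hosted frontier model (the 'model neutrality' plank)."""
    return f"[model answer grounded in: {prompt[:80]}...]"

question = "What is holding up the GPU deliveries?"
context = "\n".join(r["text"] for r in retrieve(question))
print(call_llm(f"Context:\n{context}\n\nQuestion: {question}"))
```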
The strategic bet is that enterprises care less about who owns the GPU and more about who can reliably connect their fragmented data estates to the latest models, without sacrificing governance. Oracle argues that its applications and databases already hold “most of the world’s high‑value private data,” and that wiring those assets into AI workflows will unlock a second, potentially larger AI market than today’s public‑data model training.
Applications: suites, not point products
While OCI captured most of the growth headlines, Oracle’s applications business quietly posted solid numbers and, crucially, improving forward indicators.
Cloud applications revenue rose 11% year on year to $3.9 billion, implying an annualized run rate of roughly $16 billion. Within that:
- Fusion ERP: +17%
- Fusion SCM: +18%
- Fusion HCM: +14%
- Fusion CX: +12%
- NetSuite: +13%
- Industry cloud applications (spanning hospitality, construction, retail, banking, restaurants, local government and communications): +21%
Cloud apps deferred revenue grew 14%, outpacing current‑period cloud apps revenue growth, suggesting a strengthening bookings trajectory.
Oracle’s pitch here is deliberately counter‑cyclical to the broader SaaS narrative. Where many pure‑play SaaS peers are grappling with decelerating growth and best‑of‑breed fatigue among customers, Oracle is doubling down on full suites. It argues that only it can offer complete back‑office, front‑office and deep industry vertical stacks, with AI “baked in” rather than bolted on.
Management says there are now more than 400 AI features live in Fusion, and 274 healthcare customers are in production on Oracle’s clinical AI agent. Crucially, these AI agents are positioned as quick‑time‑to‑value products: go‑lives measured in weeks, often self‑implemented by customers.
The sales model has been retooled to match. Oracle has now unified its Fusion and industry‑cloud salesforces into a single “One Oracle” organization to promote larger, multi‑pillar deals and to pull through cloud adoption from its on‑premises base, where simply moving a customer from on‑premises support to the cloud can generate three to five times the former annual revenue per customer.
Large deals cited on the call included TIM Brasil (extending a five‑year AI‑centric partnership on OCI), Motorola Solutions, Uber (now running more than 3 million OCI cores) and T‑Mobile, alongside retail and consumer workloads scaling through the holiday season. On the applications side, wins ranged from communications (DigitalBridge, Etisalat by e&) to financial services (Jewelers Mutual, PrimeLife), public sector (US Space Force, the cities of Costa Mesa and Santa Ana) and high‑tech (Zscaler, Dropbox, SolarEdge).
With 330 cloud‑apps go‑lives in the quarter, Oracle is building an installed base that can, in theory, absorb its AI data platform and lakehouse offerings as these mature.
Strategic picture for investors
For investors, Oracle is increasingly a tale of two curves. On one side is a rapidly expanding cloud and AI infrastructure franchise, supported by a massive RPO build‑up, a multi‑cloud distribution strategy and early evidence of pricing power and margin potential in AI workloads. On the other is a cash flow statement currently bearing the weight of an unprecedented capital programme, one that Oracle insists is more flexible and customer‑funded than headline CapEx numbers imply.
How quickly OCI margins scale towards the 30–40% target, and how effectively Oracle converts its AI positioning into durable application and database pull‑through, will determine whether today’s negative free cash flow looks like a temporary trough on a steeper earnings trajectory—or the new normal in a capital‑hungry AI era.