But scale alone does not finance infrastructure. And when measured against the capital now being discussed by OpenAI’s leadership, the gap between usage and monetised demand remains striking.
Industry estimates suggest that roughly 30 to 35 million users pay for ChatGPT through individual, professional or enterprise subscriptions. That equates to a conversion rate of around five percent. Even under generous assumptions, this supports a business generating several billion dollars in annual revenue. It does not, on its own, support the kind of investment figures now circulating.
Sam Altman has been linked to proposals that would require capital on a scale rarely seen in private technology markets. Reported ambitions range from several hundred billion dollars to figures as high as $7 trillion, aimed at expanding semiconductor manufacturing, energy supply and AI-specific infrastructure worldwide.
Even the lower end of that range would represent a historic commitment. A single advanced semiconductor fabrication plant can cost $20 billion or more. Building meaningful additional capacity would require dozens of such facilities, alongside power generation, cooling, networking and a skilled workforce. The investment horizon stretches across decades.
A simple stress test highlights the imbalance. Assume 35 million paying users generate an average of $25 per month, including enterprise usage. That implies roughly $10.5 billion in annual revenue. Doubling or even tripling that figure still leaves a wide gap relative to a $200 billion infrastructure programme, let alone anything larger. A trillion-dollar investment would represent many years of revenue before accounting for operating costs, depreciation and reinvestment.
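The stress test above can be sketched as a quick calculation. The inputs are the article's assumed figures (35 million payers, a blended $25 per month), not reported financials:

```python
# Back-of-envelope stress test using the article's assumed inputs.
PAYING_USERS = 35_000_000      # assumed paying-subscriber estimate
AVG_MONTHLY_REVENUE = 25.0     # assumed blended $/user/month, incl. enterprise

annual_revenue = PAYING_USERS * AVG_MONTHLY_REVENUE * 12
print(f"Implied annual revenue: ${annual_revenue / 1e9:.1f}B")

# How many years of gross revenue would each programme absorb,
# before operating costs, depreciation and reinvestment?
for programme_cost in (200e9, 1e12):
    years = programme_cost / annual_revenue
    print(f"${programme_cost / 1e9:,.0f}B programme ≈ {years:.0f} years of revenue")
```

Even tripling the revenue assumption leaves the $200 billion programme at more than six years of gross revenue, which is the imbalance the article describes.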
This mismatch has prompted scepticism about whether the infrastructure race is running ahead of demand. AI does not benefit from the near-zero marginal costs that defined earlier software platforms. Inference remains expensive. Training costs rise with each new model generation. More users mean higher costs, not just higher margins.
Yet focusing only on current subscription economics risks missing the broader argument.
The case for building ahead of demand
The counter-argument is that ChatGPT subscriptions are not the point.

From this perspective, consumer monetisation is a leading indicator, not the revenue engine that justifies infrastructure investment. The real bet is that AI will become a foundational input across the economy, closer to electricity or cloud computing than to a standalone app. If that happens, demand for compute will not scale linearly with today's subscription base. It will compound across industries.
History offers precedents. Cloud providers invested heavily in data centres years before enterprise demand fully materialised. Those investments looked excessive at the time, but proved decisive once software, data and workflows migrated en masse. Semiconductor capacity, too, has often been built in anticipation of future cycles rather than current utilisation.
Under this view, waiting for monetisation to catch up before investing would be strategically dangerous. Compute shortages would constrain model development, limit deployment and hand leverage to competitors or governments willing to invest earlier. Infrastructure, once built, becomes a long-lived strategic asset rather than a short-term cost.
There is also a geopolitical dimension. Advanced AI systems increasingly intersect with national security, industrial policy and energy planning. Large-scale infrastructure investments may be justified less by near-term returns than by long-term control over critical capabilities.
If AI becomes embedded in healthcare diagnostics, logistics optimisation, scientific research and national infrastructure, the willingness to pay may come not from individual users, but from institutions with far larger budgets and fewer short-term constraints.
From that angle, the infrastructure push is not premature; it is defensive.
Why most users still don’t pay
Still, the current conversion rate deserves explanation. It is tempting to frame it as a product or pricing problem, but the reality is more structural.

Consumers today are already saturated with subscriptions. Rent, mobile plans, insurance, utilities and transport are recurring costs that cannot be avoided. On top of that sit streaming services, cloud storage, productivity tools and news subscriptions. Budgets are finite, and every additional subscription competes with essentials.
In that environment, even useful tools struggle to convert casual users into paying ones. ChatGPT is no exception.
The problem is compounded by competition. High-quality alternatives are readily available. Google’s Gemini and Microsoft’s Copilot are bundled into ecosystems many users already pay for, directly or indirectly. Open-source models and free tiers continue to improve. Even ChatGPT’s own free version is good enough for most everyday tasks.
For many users, the incremental benefit of paying is real, but not decisive. They like the product, but they do not rely on it enough to justify another monthly charge. That does not signal weak demand; it reflects prioritisation.
This dynamic reinforces why consumer subscriptions are unlikely to fund AI infrastructure at scale. The ceiling is not technological. It is economic and psychological.
A bet that still carries risk
The infrastructure race may ultimately be justified. But it assumes a future in which AI usage shifts decisively from curiosity and assistance to dependency and integration, with pricing power migrating from consumers to institutions.

That transition is plausible, but not guaranteed. Building infrastructure ahead of that shift locks in costs long before revenues are certain. If adoption stalls, pricing compresses, or regulation intervenes, excess capacity could become a burden rather than a moat.
ChatGPT has proven that global demand for AI exists. What it has not yet proven is that this demand can be monetised at a level that supports trillion-dollar capital commitments.
The bet being placed is not on today’s users. It is on what AI becomes to the economy as a whole. Whether that bet pays off will determine whether this moment is remembered as foresight — or as the point where enthusiasm ran ahead of fundamentals.