If you’ve ever typed “please” or “thank you” into ChatGPT, you’ve probably added a little to OpenAI’s electricity bill. Earlier this year, one social media user asked how much money the company had “lost on its electricity bill” due to users being polite. OpenAI CEO Sam Altman replied that it amounted to “tens of millions of dollars well spent,” adding: “You never know.”

The reply was partly a joke. But it also points to a serious question: how much does it actually cost to run one of the world’s most widely used artificial intelligence systems, and how does OpenAI plan to pay for it?

As of early 2026, ChatGPT serves 800 million to 900 million weekly active users worldwide, of whom approximately 35 million are paid subscribers; the vast majority use the platform for free. This scale is the result of one of the fastest adoption curves in consumer technology history: ChatGPT surpassed 100 million users just two months after launch, cementing its place among the fastest-scaling digital products ever.

But such rapid growth comes at a huge cost. Supporting hundreds of millions of weekly users requires massive computing infrastructure, driving operating expenses to levels that are profoundly reshaping OpenAI’s structure, capital strategy, and long-term roadmap.
The daily bills are dizzying
Running a large language model is not the same as hosting a website. Each prompt triggers fresh computation across thousands of high-performance chips. In 2023, the technology research firm SemiAnalysis estimated that ChatGPT cost approximately $700,000 per day to operate, of which approximately $694,444 was attributed to hardware and inference costs. At the time, the service ran primarily on GPT-3-era infrastructure, with an estimated 3,600-plus servers powering the system. Dylan Patel, principal analyst at SemiAnalysis, said the cost of running GPT-4 would be much higher. These figures are now widely considered conservative: since 2023, OpenAI has launched more powerful models, including GPT-5, and inference, the cost of generating each response, rises dramatically at this scale.

The Washington Post previously calculated that generating one 100-word AI email per week for a year could consume 7.5 kilowatt-hours of electricity, roughly the energy nine Washington, D.C., households use in one hour. Multiply that by hundreds of millions of users and growing enterprise use, and the energy footprint grows quickly. OpenAI’s annual cash burn rate has reportedly reached approximately $17 billion, driven mainly by computing infrastructure. The company does not expect to become profitable until around 2030.
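To make those numbers concrete, here is a rough back-of-envelope sketch in Python. The per-message energy value is derived from the Washington Post figure cited above; the weekly user count is an illustrative assumption, not a reported statistic.

```python
# Back-of-envelope energy arithmetic based on the Washington Post figure above:
# one 100-word AI email per week for a year is roughly 7.5 kWh.
KWH_PER_YEAR_OF_WEEKLY_EMAILS = 7.5   # reported figure
WEEKS_PER_YEAR = 52

# Implied energy for a single 100-word response (an approximation).
kwh_per_message = KWH_PER_YEAR_OF_WEEKLY_EMAILS / WEEKS_PER_YEAR  # ~0.14 kWh

# Illustrative assumption, not a reported number: 850 million weekly users,
# each triggering one such response per week.
assumed_weekly_users = 850_000_000

weekly_gwh = assumed_weekly_users * kwh_per_message / 1e6  # kWh -> GWh
print(f"~{kwh_per_message:.2f} kWh per message")
print(f"~{weekly_gwh:.0f} GWh per week under these assumptions")
```

Under those assumptions the arithmetic lands in the hundreds of gigawatt-hours per week, which is why energy and hardware dominate the cost conversation.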
From nonprofit idealism to capped-profit reality
OpenAI was founded in 2015 as a non-profit organization with a mission to develop artificial intelligence “in the way most likely to benefit humanity.” By 2019, leadership concluded that donations alone could not fund the scale of computing required to pursue advanced AI systems, specifically artificial general intelligence (AGI). The company shifted to a “capped-profit” structure that allowed outside investment but limited investor returns. Microsoft invested billions of dollars, as did SoftBank and Nvidia. By the end of 2025, OpenAI’s valuation had climbed to about $500 billion following a $6.6 billion stock sale. Reports now indicate the company is preparing for a potential IPO in late 2026 or 2027, with some estimates putting its valuation as high as $1 trillion, although those figures remain speculative. After California and Delaware regulators approved the reorganization in October 2025, ownership is divided roughly as follows:
- 26% held by the non-profit OpenAI Foundation
- 27% held by Microsoft
- 47% held by employees and other investors
The pressure to demonstrate a reliable path to profitability is intensifying.
Who pays for ChatGPT?
OpenAI’s revenue model is multi-tiered. ChatGPT subscriptions include:
- Free (basic access)
- Plus: $20 per month
- Team: $25-30 per user per month
- Enterprise: custom pricing
- Pro: $200 per month (or more for enterprise-level annual plans), with significantly expanded usage limits
According to reported figures as of mid-2025:
- ChatGPT Plus had approximately 10 million subscribers
- OpenAI had 3 million paying business users across Enterprise, Team, and Education plans
- Total paying subscribers were estimated at approximately 35 million
- The free-to-paid conversion rate was about 5-6%
The company reported annual revenue of more than $2 billion in 2023. Since then, growth has accelerated dramatically: OpenAI said its annualized revenue run rate exceeded $20 billion in 2025, a roughly 233% increase over 2024, after revenue grew from about $2 billion in 2023 to roughly $6 billion in 2024. Despite this historic growth, the company is reportedly burning through more than $17 billion per year, and subscription revenue alone may still not cover the massive computing and infrastructure costs required to keep its AI services running.
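As a quick sanity check on those growth figures, here is a minimal sketch using only the numbers reported above:

```python
# Quick arithmetic check of the reported revenue figures above (no new data).
revenue_2023 = 2e9      # ~$2 billion (reported, 2023)
revenue_2024 = 6e9      # ~$6 billion (reported, 2024)
run_rate_2025 = 20e9    # >$20 billion annualized run rate (reported, 2025)

growth_2024 = (revenue_2024 - revenue_2023) / revenue_2023   # ~200%
growth_2025 = (run_rate_2025 - revenue_2024) / revenue_2024  # ~233%
print(f"2023 -> 2024: {growth_2024:.0%}   2024 -> 2025: {growth_2025:.0%}")
```

The jump from roughly $6 billion to a $20 billion-plus run rate is where the widely cited 233% figure comes from.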
API access
Developers pay per token. For premium models, pricing can reach:
- $1.25 per million tokens (input)
- $10 per million tokens (output)
At enterprise scale, these costs add up quickly, both for customers and for OpenAI’s own infrastructure.
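To see how per-token pricing translates into dollars, here is a minimal sketch using the rates listed above; the token counts and request volume are illustrative assumptions rather than OpenAI figures.

```python
# Minimal cost sketch using the per-token rates listed above.
# Token counts and request volume are illustrative assumptions, not OpenAI data.
INPUT_PRICE_PER_MILLION = 1.25    # dollars per million input tokens
OUTPUT_PRICE_PER_MILLION = 10.00  # dollars per million output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one API request at the rates above."""
    return (input_tokens * INPUT_PRICE_PER_MILLION
            + output_tokens * OUTPUT_PRICE_PER_MILLION) / 1_000_000

# Hypothetical request: a 2,000-token prompt and a 500-token reply.
per_request = request_cost(2_000, 500)
print(f"${per_request:.4f} per request")                        # ~$0.0075
print(f"${per_request * 1_000_000:,.0f} per million requests")  # ~$7,500
```

A fraction of a cent per request sounds small, but at millions of requests per day the totals climb fast, for the customer paying the bill and for the data centers serving the traffic.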
GPT Store and custom models
Within two months of launching custom GPTs, users had created more than 3 million of them. Enterprise adoption has also accelerated: in one survey, 61% of marketers said their company provides ChatGPT Team or Enterprise licenses.
Advertising, IPO talk and sustainability issues
Sam Altman has publicly expressed his distaste for advertising for years. He has said he “hates” advertising, calling it a “last resort” and describing its combination with artificial intelligence as “uniquely disturbing.” In 2025, he softened that stance, saying he wasn’t “completely opposed” to advertising, but that “you need to be very careful to get it right.” As of February 2026, OpenAI is testing ads in ChatGPT for free-tier and $8-per-month “Go” tier users in the United States. The company says the ads are contextual, clearly labeled, kept separate from chat responses, and designed to protect user privacy.

With 800-900 million weekly active users, most of whom pay nothing, and infrastructure costs running into the billions of dollars per year, subsidizing free usage indefinitely without additional revenue streams is financially unsustainable. IPO speculation is tied to the same reality: OpenAI reportedly hopes to go public as early as late 2026, in part to tap capital markets to fund its expanding computing needs and compete with rivals such as Anthropic.


