In a move that has sent ripples through the AI community, OpenAI has significantly tightened the daily usage limits for its groundbreaking video generation tool, Sora. Free users now find themselves restricted to generating just six videos per day, a stark reduction that underscores the immense computational pressure the platform is under.
The announcement came directly from Bill Peebles, the leader of the Sora team at OpenAI, who took to the social media platform X to address user concerns. Peebles candidly cited "overwhelming demand" as the core reason for the new restrictions, using a now-viral phrase to describe the strain on their infrastructure: "our GPUs are melting."
A Permanent Shift, Not a Temporary Fix?
While OpenAI has a history of implementing temporary caps during periods of peak traffic, Peebles' statement lacked any indication that this latest restriction is short-term. Unlike previous pauses that were framed as capacity-building measures, this change appears more structural.
Instead of promising a return to higher limits, Peebles pointed users toward a new option: the ability to purchase additional video generations. This pivot highlights OpenAI's increasing focus on scalable monetization for its most advanced models, moving beyond just subscription tiers for its flagship ChatGPT.
For now, paying customers appear to be insulated from the cuts. Limits for ChatGPT Plus and ChatGPT Pro subscribers accessing Sora remain unchanged, although OpenAI has characteristically declined to specify what those caps actually are.
Google Follows Suit with Gemini and Nano Banana Pro Cuts
OpenAI is not alone in this belt-tightening. In a parallel development, Google has also been quietly reducing free access to its own suite of AI tools. As first reported by 9to5Google, the company's new Nano Banana Pro image generator has seen its daily free limit reduced from three images down to just two.
The change was later confirmed within the tool's own interface, which now warns users that usage caps "may change frequently and without notice" — signaling a new era of fluid, often restrictive access for non-paying users.
Furthermore, Google appears to be extending these restrictions to its powerful Gemini 3 Pro model, continuing a clear industry trend of reeling in generous free access following high-demand model rollouts. You can read the full breakdown of Google's changes in the original report from 9to5Google.
The Unprecedented Demand Crunch: Why Now?
The simultaneous clampdown by the two AI giants points to a simple, shared reality: user demand for state-of-the-art AI generation is outstripping even the most optimistic supply projections. Since releasing their latest models, both companies have seen server-crushing usage that makes resource management a top priority.
The timing of these announcements is also critical. We are entering a holiday weekend and the broader festive season, a period historically known for driving massive online traffic. As more casual users log on during their time off to experiment with these viral AI tools, the strain on infrastructure will only intensify, making these generation limits more visible—and more frustrating—to the average person.
This creates a clear two-tiered system: a limited, sometimes unreliable experience for free users, and a premium, uninterrupted service for those willing to pay. For a deeper analysis of the business strategy behind these moves, Forbes explores the implications of Peebles' "melting GPUs" statement.
What This Means for the Future of AI Access
The message from both OpenAI and Google is unequivocal. The early days of seemingly boundless, free AI access are drawing to a close. The computational cost of running these advanced models is simply too high to offer them without significant restrictions.
While paid users across both platforms remain unaffected for now, the companies are clearly signaling that the landscape is evolving. As demand continues to grow and models become even more complex, today's policies for subscribers could easily become tomorrow's memories. For creators and businesses relying on these tools, the era of budgeting for AI compute has well and truly begun.
