Friday, February 28th 2025
OpenAI Has "Run Out of GPUs" - Sam Altman Mentions Incoming Delivery of "Tens of Thousands"
Yesterday, OpenAI introduced its "strongest" GPT-4.5 model. A research preview build is available only to paying customers—Pro-tier subscribers fork out $200 a month for early access privileges. The company's CEO shared an update via social media post, complete with a "necessary" hyping up of version 4.5: "it is the first model that feels like talking to a thoughtful person to me. I have had several moments where I've sat back in my chair and been astonished at getting actual good advice from an AI." There are apparent performance caveats—Sam Altman proceeded to add a short addendum: "this isn't a reasoning model and won't crush benchmarks. It's a different kind of intelligence, and there's a magic to it (that) I haven't felt before. Really excited for people to try it!" OpenAI had planned to make GPT-4.5 available to its audience of "Plus" subscribers, but major hardware shortages have delayed a roll-out to the $20 per month tier.
Altman disclosed his personal disappointment: "bad news: it is a giant, expensive model. We really wanted to launch it to Plus and Pro (customers) at the same time, but we've been growing a lot and are out of GPUs. We will add tens of thousands of GPUs next week, and roll it out to the plus tier then...Hundreds of thousands coming soon, and I'm pretty sure y'all will use every one we can rack up." Insiders believe that OpenAI is finalizing a proprietary AI-crunching solution, but a rumored mass production phase is not expected to kick off until 2026. In the meantime, Altman & Co. remain reliant on NVIDIA for new shipments of AI GPUs. Despite being a very important customer, OpenAI is reportedly not satisfied with the "slow" flow of Team Green's latest DGX B200 and DGX H200 platforms into server facilities. Several big players are developing in-house designs in an attempt to wean themselves off prevalent NVIDIA technologies.
Sources:
Sam Altman Tweet, Tom's Hardware, PC Gamer
48 Comments
garymarcus.substack.com/p/hot-take-gpt-45-is-a-nothing-burger
Maybe AMD doesn't have the skills to incorporate the hardware needed... just throwing that out there.
Also, with OpenAI thinking of building its own chips, and Altman being on bad terms with Musk (who, let me repeat, is in the government), OpenAI probably gets last place on the priority list of Nvidia's top customers. They either differentiate themselves, or keep begging.
Across every two dozen or so cards missing their 8 ROPs, that's a whole other flagship card's worth of silicon gone missing.
That's actually expensive.
The main issue with the current AI stuff is the utility of it just isn't there for anything other than psychotic chatbots, pictures of big-titted elf girls, and source code analysis.
No idea how OpenAI expects to compete; their costs are 100x too high, and they'll most probably be undercut into bankruptcy.
Just a matter of time before open-source competitors show up trained on Western data, instead of the Chinese stuff.