Monday, March 25th 2024

Tiny Corp. Prepping Separate AMD & NVIDIA GPU-based AI Compute Systems

George Hotz and his startup, Tiny Corporation, appeared ready to abandon AMD Radeon GPUs entirely last week after a period of firmware-related headaches. The original plan involved a pre-orderable $15,000 TinyBox AI compute cluster housing six XFX Speedster MERC310 RX 7900 XTX graphics cards, but software/driver issues prompted experimentation with alternative hardware. Much of the media coverage has focused on the unusual choice of consumer-grade GPUs; Tiny Corp.'s struggles with RDNA 3 (rather than CDNA 3) drew further public attention after top AMD brass pitched in.

The startup's social media feed is very transparent about showcasing everyday tasks, problem-solving and important decision-making. Several Acer Predator BiFrost Arc A770 OC cards were purchased and promptly integrated into a colorfully-lit TinyBox prototype, but Hotz & Co. swiftly moved on to Team Green pastures; Tiny Corp. begrudgingly adopted NVIDIA GeForce RTX 4090 GPUs. Earlier today, it was announced that work on the AMD-based system has resumed, although customers were forewarned about anticipated teething problems. The surprising message arrived in the early hours: "a hard to find 'umr' repo has turned around the feasibility of the AMD TinyBox. It will be a journey, but it gives us an ability to debug. We're going to sell both, red for $15,000 and green for $25,000. When you realize your pre-order you'll choose your color. Website has been updated. If you like to tinker and feel pain, buy red. The driver still crashes the GPU and hangs sometimes, but we can work together to improve it."
Tiny Corp. recommends that potential clients spend a little extra on the stable NVIDIA Ada Lovelace-based system, but work will continue on ironing out the nitty-gritty details with AMD engineers. Today's tweet outlined a short roadmap: "going to start documenting the Radeon RX 7900 XTX GPU, and we're going right to the kernel in tinygrad with a KFD backend. Also, expect an announcement from AMD, it's not everything we asked for, but it's a start. If you want 'it just works' buy green. You pay the tax, but it's rock solid. Not too much more to say about it. Compare to more expensive 6x GeForce RTX 4090 boxes elsewhere. Hopefully we get both colors of TinyBox on MLPerf in June." It is not clear, at the time of writing, whether the Intel-based TinyBox will be prepped for pre-orders.
Sources: tinygrad Tweet #1, Tom's Hardware, Wccftech, tinygrad Tweet #2

17 Comments on Tiny Corp. Prepping Separate AMD & NVIDIA GPU-based AI Compute Systems

#1
TechLurker
I wonder how long until NVIDIA forces an update that disables the critical parts of this AI box? It's clearly going beyond personal and prosumer use, entering the enterprise sector that NVIDIA's dedicated AI accelerators compete in. Or will NVIDIA force them to cough up a fee to keep it from shutting their efforts down?
Posted on Reply
#2
cvaldes
This is just posturing. They're setting up an Nvidia compute box to see if AMD will cave in and help them figure out firmware issues to stay competitive.

I'm sure Jensen looked over at his senior team, chuckled and said "I told you!"

:):p:D
Posted on Reply
#3
john_
TechLurker: I wonder how long until NVIDIA forces an update that disables the critical parts of this AI box? It's clearly going beyond personal and prosumer use, entering the enterprise sector that NVIDIA's dedicated AI accelerators compete in. Or will NVIDIA force them to cough up a fee to keep it from shutting their efforts down?
Right now Nvidia is selling everything it produces. Small companies going for these kinds of solutions could be a win-win scenario for Nvidia, especially when they use the RTX 4090, the most expensive model, and not, for example, the much cheaper RTX 4080 Super. If Nvidia stops these attempts, those small companies will either have to find an Nvidia AI accelerator, where a long waiting time could send them to AMD, or go AMD directly. I mean an MI300X, not 6 gaming cards. With the RTX 5000 series also coming closer, I think Nvidia can keep this road open and, if needed, lock it with the next series of gaming GPUs.
Posted on Reply
#4
ncrs
TechLurker: I wonder how long until NVIDIA forces an update that disables the critical parts of this AI box? It's clearly going beyond personal and prosumer use, entering the enterprise sector that NVIDIA's dedicated AI accelerators compete in. Or will NVIDIA force them to cough up a fee to keep it from shutting their efforts down?
The Consumer GeForce driver EULA took care of that around 2017:
[...]
No Datacenter Deployment. The SOFTWARE is not licensed for datacenter deployment, except that blockchain processing in a datacenter is permitted.
[...]
Not to mention that consumer cards don't have as much VRAM (crucial for AI) as the professional versions either way: the RTX 6000 Ada has 48 GB and uses the same AD102 chip as the RTX 4090, which has 24 GB. The "real" AI GPUs like the A100 and H100 have 80 GB of HBM or more.
Systems like the TinyBox aren't really anything special hardware-wise, since most likely all their parts can be bought online. The interesting part is selling it as a working appliance, and providing support for the clients.
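As a rough sense of why those VRAM numbers matter: the memory needed just to hold a model's weights scales linearly with parameter count and precision. A minimal back-of-the-envelope sketch (the function name and example model sizes are illustrative, not from the thread):

```python
def weights_gib(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate VRAM (GiB) needed just to store model weights.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32, 1 for int8.
    Ignores activations, KV cache, and optimizer state, which add more on top.
    """
    return params_billion * 1e9 * bytes_per_param / 2**30

# A 13B-parameter model in fp16 already overflows a 24 GB RTX 4090,
# while an 80 GB H100 can hold roughly 40B parameters of weights alone.
print(round(weights_gib(13), 1))  # ~24.2 GiB
print(round(weights_gib(40), 1))  # ~74.5 GiB
```

By this estimate, even a modest model leaves little headroom on a 24 GB consumer card before activations and KV cache are counted.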
Posted on Reply
#5
LabRat 891
IIRC, from the last story on this, issues were mostly coming up beyond 4 cards. Why not sell a dual-system chassis with 2x4 cards?
Posted on Reply
#6
cvaldes
LabRat 891: IIRC, from the last story on this, issues were mostly coming up beyond 4 cards. Why not sell a dual-system chassis with 2x4 cards?
My guess is that potential customers have use cases that require more transistors addressable as a single system.

From what I can tell, bigger is better for many of these nascent AI models. It's not just the number of AI processing cores, it also extends to VRAM, memory bandwidth, etc.

A workload distributed amongst several systems is likely less efficient than one larger system in many cases. There are probably some significant overhead demands from distributing AI computing. I thought I heard Nvidia (probably Jensen) say that in many AI systems, the AI cores only spend 60% of their time doing compute; the rest of it is talking to other cores. They are trying to develop new interconnects and infrastructure improvements to reduce that, so AI cores spend more time doing what they're supposed to do (compute). These types of challenges aren't captured in synthetic benchmarks.

From a vendor standpoint, Tiny might be thinking that supporting ten customers with half-sized systems will cost them more than supporting five customers with full-sized systems. Customer support costs do not scale linearly.
Posted on Reply
#7
mechtech
$25k

planning to buy a new vehicle instead
Posted on Reply
#8
LabRat 891
mechtech: $25k

planning to buy a new vehicle instead
Depending on one's profession, either could be a 'money maker'.

Also, $25k is still probably considerably more affordable than the 'big boy' solutions on the market.
Example: When I was researching *where* my MI25s came from, the first servers w/ 4+ of them were $50K+.
(Kinda made me grin ear-to-ear knowing I paid ~$50 ea for the cards, that came out of 1000x more expensive systems :laugh:)
Posted on Reply
#9
dragontamer5788
LabRat 891: Depending on one's profession, either could be a 'money maker'.
As far as I can tell, the only company making money on AI right now is NVidia.

All the other companies are raising money from investors. But that's not "making money", that's just them calling investors and investors throwing money at them without understanding what the money is even doing.
Posted on Reply
#10
LabRat 891
dragontamer5788: As far as I can tell, the only company making money on AI right now is NVidia.

All the other companies are raising money from investors. But that's not "making money", that's just them calling investors and investors throwing money at them without understanding what the money is even doing.
Fundamentally, you're correct. However, effectively scamming investors has been a time-proven money maker, sadly...
Posted on Reply
#11
cvaldes
Fundamentally incorrect.

Nvidia is making money from AI. So are AMD and Intel. It's just that Nvidia is making far more money than the other two at this point in time, taking a larger percentage of the pie. But Nvidia is far from the only one making money from people jumping onto the AI bandwagon. There are other AI players. For sure, AMD is accepting purchase orders from customers who don't want to get in the back of Nvidia's line and wait 12-18 months for Team Green's latest and greatest AI accelerator.

There's also a synergistic effect on the semiconductor industry from the AI boom. Semiconductor manufacturing companies like Applied Materials, ASML, Lam Research, KLA, etc. are all benefiting indirectly. Same with memory manufacturers like Micron, SK Hynix, etc. If you crack open any graphics card or AI accelerator, there are hundreds of ICs but only one or two GPUs. All those other components are being made by someone, and all those companies benefit, whether it's Texas Instruments or some no-name company making filters, capacitors, whatever.

For sure TSMC isn't a charity just pumping out GPUs for giggles. And an AI accelerator card is just a fancy paperweight unless you stick it in a rack, in a server room, with miles of cabling, multiple HVAC units on the roof, et cetera ad nauseam. Even the power company benefits from the moment you press the power button.

As a GPU designer, Nvidia already has an established customer base of AI customers, some of whom are probably thinking about their third and fourth generation purchases. And what is very frequently ignored here by many TPU commenters is Nvidia's formidable developer ecosystem, something that is not measured in a synthetic AI benchmark run.

As for this Tiny company, it's too early to tell whether it's just smoke and mirrors or something substantial.

I know a lot of people in these discussion groups never acknowledge history. Nvidia was once a struggling startup. Same as AMD. Same as Intel. Same as Fairchild Semiconductor. Same as their customers: Amazon, Alphabet/Google, Meta/Facebook, Tesla, FedEx, Walmart, Ford Motors, etc.

Sure there are some people who just want to bilk their investors. They typically don't stay in business very long. They also end up not having many friends and don't get invited for repeat performances. There are certainly people who make poor wagers and end up creating things that aren't interesting to the marketplace. Ask Microsoft about their mobile phone lineup. Or Alphabet about their Google Stadia cloud gaming service.

Whether you do or do not have a market capitalization of a trillion dollars, you can still make good and bad decisions. It's up to Tiny to increase shareholder value even while they are still a private company.

And it's not just AI. If you buy a Ford Mustang with leather seats, it's not just Ford Motor that benefits. Some cattle rancher who raises steer for hide got paid too.
Posted on Reply
#12
Vya Domus
TechLurker: I wonder how long until NVIDIA forces an update that disables the critical parts of this AI box?
You can put this stuff together yourself, you don't have to buy this.
TechLurker: It's clearly going beyond personal and prosumer use, entering the enterprise sector that NVIDIA's dedicated AI accelerators compete in.
They don't compete with that sector at all. These GPUs are still limited by their VRAM and interconnect; both AMD and Nvidia have accelerators with VRAM capacities in the hundreds of GBs and much faster interconnects, which makes them capable of things consumer GPUs can never do. So who exactly is this for, then? I honestly don't know; there is nothing you can't do with a cloud instance, and there are very few scenarios where you'd want to use something like this for ML.
cvaldes: I'm sure Jensen looked over at his senior team, chuckled and said "I told you!"
This guy is not even on the map; I doubt Jensen even knows about this startup.
Posted on Reply
#13
_Flare
"The driver still crashes the GPU and hangs sometimes, ..."
That seems to be intended for team red consumer cards anyway.
Posted on Reply
#14
cvaldes
Vya Domus: This guy is not even on the map, I doubt he even knows about his startup.
Dr. Lisa Su responded to the Tiny people. I'm sure someone brought it to Jensen's attention, especially now that it has been covered multiple times by mainstream tech media.

Remember that Nvidia was once a startup. In fact, this company's business plan is right in Nvidia's domain: selling AI solutions to corporations. I would be very, very surprised if Tiny Corporation didn't have former Nvidia engineers on its payroll, maybe even ones that Jensen knows by name.

More than that, CEO George Hotz previously ran an AI-powered autonomous driving car company which likely would have used Nvidia technology. On top of that, this guy has a pretty bombastic personality. He's not very subtle about anything he does: not in the past, not now, and probably not in the future. He likes to be the controversial headline.

And besides, Silicon Valley isn't that big, people generally know about stuff, even if it's not the specifics. This is hard to understand if you don't live near Silicon Valley. It's simply not that big of a place. And people hop around all the time. Word travels fast.
Posted on Reply
#15
Vya Domus
cvaldes: I'm sure someone brought it to Jensen's attention
His startup is basically about making a bunch of prebuilds, not multi-million-dollar clusters; it's safe to say it definitely didn't get his attention.
cvaldes: I would be very, very surprised if Tiny Corporation didn't have former Nvidia engineers on its payroll maybe even ones that Jensen knows by name.
I would be very baffled if they did. Why the hell would former Nvidia employees work for a random startup, making probably a fraction of the money, programming AMD GPUs no less? Dude, come on.
cvaldes: More than that CEO George Hotz previously ran an AI-powered autonomous driving car company which likely would have used Nvidia technology.
They didn't; they used Qualcomm SoCs. Ain't nobody putting anything with Nvidia written on it in something that low-cost.
Posted on Reply
#16
cvaldes
Vya Domus: His startup is basically about making a bunch of prebuilds not multi million dollar clusters, it's safe to say it definitely didn't get his attention.

Vya Domus: I would be very baffled if they did, why the hell would former Nvidia employees work for a random startup making probably a fraction of the money programming AMD GPUs no less.
Stock options. You do not understand some of the fundamental characteristics of Silicon Valley: the startup culture and the promise of instant riches through M&A or an IPO.

And there are plenty of people here in Silicon Valley who like the energy of a startup. You cleverly ignored the point that Nvidia itself was once a startup. The same sort of people who started Nvidia would leave a big Fortune 100 firm to go work for some scrappy little startup like Tiny Corporation.

Jensen likely knows about this guy. Like I said, Silicon Valley is SMALL. There's a good chance Jensen will not say anything even if he is aware of this guy. It's not like Nvidia needs any more purchase orders for 4090s. They would rather use the wafers to make AI accelerator chips that they can charge 5x to 10x more for.

But I strongly disagree with the assumption that Jensen doesn't know of this guy. Hotz's loud posturing has been covered multiple times by multiple media organizations that specialize in tech news. It's not like Hotz is complaining to People magazine or TMZ.

And at some point (which has probably already happened), someone with REAL money (like a venture capitalist) is going to bring it up.

VC: "Hey Jensen, there's this dude George Hotz who thinks he can get rich quick stuffing gaming cards into an AI dev box and selling them for $25K. Should I give him any money?"
Jensen: "That guy?" (laughs) "You're buying the next round."
Posted on Reply
#17
Vya Domus
cvaldes: Stock options.
Yeah dude totally, I am sure former Nvidia employees are dying to get stock options from a startup company making prebuilds, it all makes sense now.
Posted on Reply