Saturday, January 28th 2023

NVIDIA RTX 4090 Ti / RTX TITAN (Ada) Pictured, Behold the 4-slot Cinder Block

Here's the very first picture of an alleged upcoming NVIDIA flagship/halo product to be positioned above the GeForce RTX 4090. There are two distinct brand names being rumored for this product: the GeForce RTX 4090 Ti and the NVIDIA RTX TITAN (Ada). The RTX 4090 uses only 128 out of 144 (roughly 89 percent) of the streaming multiprocessors (SM) on the 4 nm "AD102" silicon, leaving NVIDIA with plenty of room to design a halo product that maxes it out. Besides maxing out the silicon, NVIDIA has the opportunity to increase the typical graphics power closer to the 600 W continuous power-delivery limit of the 16-pin ATX 12VHPWR connector, and to use faster 24 Gbps-rated GDDR6X memory chips (the RTX 4090 uses 21 Gbps memory).
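To put that headroom in numbers, here's a quick back-of-the-envelope sketch in Python. The SM counts and memory speeds are the ones cited above; the 384-bit memory bus is our assumption, carried over from the RTX 4090 rather than confirmed by any leak.

```python
# Rough uplift estimates for a fully enabled AD102 vs. the RTX 4090.
# Assumption (ours, not from the leak): the halo card keeps the
# RTX 4090's 384-bit memory bus.

SM_4090, SM_FULL = 128, 144         # streaming multiprocessors (from the article)
GBPS_4090, GBPS_TI = 21, 24         # GDDR6X per-pin data rate, Gbps (from the article)
BUS_BITS = 384                      # memory bus width, assumed carried over

sm_uplift = SM_FULL / SM_4090 - 1
bw_4090 = GBPS_4090 * BUS_BITS / 8  # GB/s
bw_ti = GBPS_TI * BUS_BITS / 8      # GB/s

print(f"SM uplift: {sm_uplift:.1%}")                       # 12.5%
print(f"Memory bandwidth: {bw_4090:.0f} -> {bw_ti:.0f} GB/s "
      f"(+{bw_ti / bw_4090 - 1:.1%})")                     # 1008 -> 1152 GB/s (+14.3%)
```

Neither figure accounts for clock speeds or the higher power limit, so treat them as a floor on the theoretical uplift rather than a performance prediction.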

The card is 4 slots thick, with the rear I/O bracket covering all 4 slots. The card's display outputs are arranged along the thickness of the card, rather than along the base. The cooler is a monstrous scale-up of the Dual-Axial Flow Through cooler of the RTX 4090 Founders Edition. The card is designed such that the PCB doesn't stand perpendicular to the plane of the motherboard like that of any other add-on card; rather, it is parallel to the plane of the motherboard, arranged along the thickness of the card. This has probably been done to maximize the spatial volume occupied by the cooling solution, and perhaps even to make room for a third fan. We also predict that the PCB is split, such that a smaller PCB holds the display I/O, and yet another handles the PCI-Express slot interface. Suffice it to say, the RTX 4090 Ti / RTX TITAN will be an engineering masterpiece by NVIDIA.
Sources: MEGAsizeGPU (Twitter), VideoCardz

193 Comments on NVIDIA RTX 4090 Ti / RTX TITAN (Ada) Pictured, Behold the 4-slot Cinder Block

#176
Yraggul666
Icy1007: These pictures are of an old 4090 prototype. They've been on the internet since before the 40-series was even announced. They are not new pics of a 4090 Ti/Titan Ada.
FML, I hope you're right...
#177
Tartaros
I was thinking about this, and at this point, what is stopping motherboard manufacturers from baking a complete high-end GPU into a motherboard? Most people don't use PCIe slots for anything else: sound cards are good enough, and those who want something better usually get an external one, and we've had baked-in storage expansion for days... How about a big-ass EATX mobo with a 4090 built in? It would probably be more efficient in energy and heat than the actual monster things we have to put in our PCs.
#178
Icy1007
Tartaros: I was thinking about this, and at this point, what is stopping motherboard manufacturers from baking a complete high-end GPU into a motherboard? Most people don't use PCIe slots for anything else: sound cards are good enough, and those who want something better usually get an external one, and we've had baked-in storage expansion for days... How about a big-ass EATX mobo with a 4090 built in? It would probably be more efficient in energy and heat than the actual monster things we have to put in our PCs.
A long time ago, motherboards did come with integrated graphics and some had pretty decent GPUs for the time. Especially when Nvidia still made their nForce chipsets.
#179
stimpy88
Icy1007: A long time ago, motherboards did come with integrated graphics and some had pretty decent GPUs for the time. Especially when Nvidia still made their nForce chipsets.
I loved the nVidia chipset; I'd experienced only VIA up until that point! I always wondered why nVidia got out of the chipset business. I guess it didn't scratch the greed itch enough for them.
#180
agent_x007
Nope. They were forced out by licensing agreements with Intel/AMD.
Their chipsets simply weren't available from the LGA11xx (1366)/AM4 platforms onward.
#181
Avro Arrow
Tartaros: I was thinking about this, and at this point, what is stopping motherboard manufacturers from baking a complete high-end GPU into a motherboard? Most people don't use PCIe slots for anything else: sound cards are good enough, and those who want something better usually get an external one, and we've had baked-in storage expansion for days... How about a big-ass EATX mobo with a 4090 built in? It would probably be more efficient in energy and heat than the actual monster things we have to put in our PCs.
It's an interesting idea, but there are a couple of problems. One is that people all want different video cards, so they'd just be eliminating potential customers by doing that. The other is that it would use exactly the same amount of power, but it wouldn't be able to cool itself as well as a separate card, and the circuits in the motherboard carrying the power would have to be extremely beefed up. Having that level of power flowing through a motherboard would increase its failure rate dramatically. That's why the only IGPs we've ever seen built into a motherboard were low-end, like the GeForce 8120 or Radeon HD 3300.
#182
Tartaros
Avro Arrow: It's an interesting idea, but there are a couple of problems. One is that people all want different video cards, so they'd just be eliminating potential customers by doing that. The other is that it would use exactly the same amount of power, but it wouldn't be able to cool itself as well as a separate card, and the circuits in the motherboard carrying the power would have to be extremely beefed up. Having that level of power flowing through a motherboard would increase its failure rate dramatically. That's why the only IGPs we've ever seen built into a motherboard were low-end, like the GeForce 8120 or Radeon HD 3300.
I don't mean one or the other, just a line of mobos with integrated desktop-class GPUs for those who upgrade the entire system every 5-10 years; maybe something like the cooler in the article would make sense. I don't think the fail rate would be that different if things are done right; pretty much like now, you get what you pay for, and laptops are an example of that. The only disadvantage I see is that if a component fails, you have to replace the entire system.
#183
Avro Arrow
Tartaros: I don't mean one or the other, just a line of mobos with integrated desktop-class GPUs for those who upgrade the entire system every 5-10 years; maybe something like the cooler in the article would make sense.
It's possible. Maybe OEMs would be interested in this.
Tartaros: I don't think the fail rate would be that different if things are done right
Agreed, but that's also the problem. When are things ever done right? Things done right aren't profitable enough for corporations these days.
Tartaros: The only disadvantage I see is that if a component fails, you have to replace the entire system.
I agree with you but I think that OEMs like Dell and HP wouldn't consider that to be a disadvantage. :laugh:
#184
Count von Schwalbe
IMHO the best place for a GPU of this type is on its own board, in the same plane as the motherboard. Nice and easy for a tower-style cooler without covering up increasingly valuable motherboard real estate. Lower powered systems could be made thinner if the cooler doesn't need to be as tall - nice for SFF.
#185
SOAREVERSOR
agent_x007: Nope. They were forced out by licensing agreements with Intel/AMD.
Their chipsets simply weren't available from the LGA11xx (1366)/AM4 platforms onward.
This is true, but it leaves out a whole ton of details. First, Nvidia locked SLI to their chipsets, which weren't actually that good. Sure, on AMD, where they competed with VIA and SiS, they looked good, but they were a garbage fire compared to Intel's offerings. Then there was the utter debacle of their socket 775 chipsets (680i, 780i, 790i), with blown boards and CPUs left, right and center, and furious customers. If you wanted top CPU performance you needed a Core 2 from Intel; if you wanted top GPU performance you needed Nvidia. If you wanted dual GPU, well, you were screwed: you could get an Nvidia chipset that ran stupidly hot and failed left, right and center, or you had to pair AMD with Nvidia, or Intel with ATI/AMD.

Had they opened up the SLI license, or had they not released dumpster fire after dumpster fire, they'd have survived and still be here. Let's also not forget that it was SLI that ushered in stupidly expensive "gaming" boards with LEDs all over the fucking thing, heatpipes, "gamer" branding, ROG, and more, starting with the ASUS socket 939 Deluxe, which caused sticker shock at the time. But hey, pair it with ASUS 6800 Ultras, all-black PCBs, and LEDs, and off you go! Then people fell over with sticker shock at the first Crosshair, the first ROG board, which, while worse than its competition at a lower price, had a logo, was for "the gamers", came with door badges and a leather key ring, and had LEDs for status on things (that nobody used), and it took off. That was topped off with the Striker Extreme for the 680i, which took boards into the 400-buck price range and was worse than everything else. But ASUS drove out Abit and DFI simply by making things more blinging, throwing LEDs at everything, and selling ROG as a lifestyle brand. And gamers ate it up.

All our tech problems now as gamers, we gamers created. Not the companies. Not the investors. We ate all this shit up, and now people bitch, whine, and complain that we got exactly what we ordered, repeatedly, for decades, and now that we're here we don't like it. The only way out is to make PC gaming suck again. Give us worse graphics and worse performance than consoles, and do that for decades. Then things will be sane again. If you don't want to do that, welcome to the cloud; you've just been served the final plate of your course. You own it. Not Nvidia, not Intel, not anybody, but you, the gamer.
#186
Vayra86
SOAREVERSOR: problems now as gamers, we gamers created
This is the human condition. It doesn't make sense and we're all susceptible to it.

We do it in all marketplaces, don't we... Social media is a great example. We're actively digging our own holes and creating our own conflicts out of thin air. We're driving ad revenue that produces more of the ads we hate. And then we complain about how society is all about what's hot on social media and not about the things that really matter. It's even gotten worse: there's a large group now that thinks the things on social media really matter.

Similarly, for gaming, the focus shifted heavily not towards better content but towards 'more games' (Steam sales, backlog growth, etc.) and 'more hardware'. Even if the advantages are slim at best, we're ready to pay through the nose for those improvements. The low-hanging gaming fruit is long gone... There's also the social media aspect again: showing off your hardware is apparently more interesting to the algorithms than showing off game X or Y that anyone can start playing. Expressing your identity through your PC... I think it's a mental condition, but then again, rampantly buying anything new because it's new might fall under the same category to begin with. It doesn't make sense; you're just stroking your dopamine reservoir.
#188
Radxge
ReallyBigMistake: It looks like the future of GPUs is water cooling only
Perfect! Water cooling will be very cheap once the sea level has risen by a few meters.
#189
chaoshusky
Now, I strongly dislike... okay, hate Apple. And I'm quite the nVidia fan too... but I can't help thinking it's a big "f you" to Apple from an engineering standpoint alone, lol. Holy fffffffffffffff