Saturday, January 28th 2023

NVIDIA RTX 4090 Ti / RTX TITAN (Ada) Pictured, Behold the 4-slot Cinder Block

Here's the very first picture of an alleged upcoming NVIDIA flagship/halo product to be positioned above the GeForce RTX 4090. There are two distinct brand names being rumored for this product: the GeForce RTX 4090 Ti and the NVIDIA RTX TITAN (Ada). The RTX 4090 only uses 128 out of 144 (88 percent) of the streaming multiprocessors (SM) on the 4 nm "AD102" silicon, leaving NVIDIA with plenty of room to design a halo product that maxes it out. Besides maxing out the silicon, NVIDIA has the opportunity to increase the typical graphics power closer to the 600 W continuous power-delivery limit of the 16-pin ATX 12VHPWR connector, and to use faster 24 Gbps-rated GDDR6X memory chips (the RTX 4090 uses 21 Gbps memory).
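On paper, those two levers add up roughly as sketched below. This is a back-of-the-envelope estimate, not from the source: it assumes the new card keeps the RTX 4090's 384-bit memory bus and ignores any clock-speed or power-limit changes.

```python
# Rough, on-paper uplift estimate for a fully enabled AD102 card versus
# the RTX 4090, using the figures above plus one assumed detail:
# the 384-bit memory bus of the RTX 4090 is carried over unchanged.

RTX_4090_SMS = 128       # enabled SMs on the RTX 4090
AD102_FULL_SMS = 144     # full AD102 SM count
GDDR6X_4090_GBPS = 21    # per-pin data rate on the RTX 4090
GDDR6X_NEW_GBPS = 24     # rumored data rate for the new card
BUS_WIDTH_BITS = 384     # assumed: same bus width as the RTX 4090

sm_uplift = AD102_FULL_SMS / RTX_4090_SMS - 1
bw_old_gbs = GDDR6X_4090_GBPS * BUS_WIDTH_BITS / 8   # GB/s
bw_new_gbs = GDDR6X_NEW_GBPS * BUS_WIDTH_BITS / 8    # GB/s
bw_uplift = bw_new_gbs / bw_old_gbs - 1

print(f"SM count:         +{sm_uplift:.1%}")   # +12.5%
print(f"Memory bandwidth: {bw_old_gbs:.0f} -> {bw_new_gbs:.0f} GB/s (+{bw_uplift:.1%})")  # 1008 -> 1152 GB/s (+14.3%)
```

In other words, the shader-count and memory-speed gains are each in the low teens of percent; anything beyond that would have to come from clocks and the higher power limit.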

The card is 4 slots thick, with the rear I/O bracket covering all 4 slots. The card's display outputs are arranged along the thickness of the card, rather than along the base. The cooler is a monstrous scale-up of the Dual-Axial Flow Through cooler of the RTX 4090 Founders Edition. The card is designed such that the PCB doesn't stand perpendicular to the plane of the motherboard as on any other add-on card; rather, the PCB lies parallel to the plane of the motherboard, arranged along the thickness of the card. This has probably been done to maximize the spatial volume occupied by the cooling solution, and perhaps even make room for a third fan. We also predict that the PCB is split in such a way that a smaller board carries the display I/O, and yet another board handles the PCI-Express slot interface. Suffice it to say, the RTX 4090 Ti / RTX TITAN will be an engineering masterpiece by NVIDIA.
Sources: MEGAsizeGPU (Twitter), VideoCardz

193 Comments on NVIDIA RTX 4090 Ti / RTX TITAN (Ada) Pictured, Behold the 4-slot Cinder Block

#101
TheDeeGee
natr0n: 2nd pic is confusing, is that 2 parts or what?
Looks like the PCB is on the side of the cooler instead.

It's basically mounted the same way as a CPU tower cooler.
Darthgrey: Hmm, seems like the video card PCB will be long and narrow, like a very big M.2 SSD on a motherboard with a giant tower-type cooling system. But at least it will not be bent by a heavy cooler.

I'm very interested in how the PCI-E slot looks; that must be its own PCB as well then.

I also wonder where the power connectors are located, with the PCB being against the motherboard.
#102
Bigshrimp
Might as well get a 1600 watt power supply to go with that thick ass video card. Kind of getting a tad obnoxious with the size of these gigantic coolers they are using to cool the 800 watt beast. Plus that is a lot of power that will ultimately drive your power bill much higher.
#103
wheresmycar
I don't know about you guys, but I've always liked chunky graphics cards. The only difference is I prefer the heatsink being sexificially visible from the front (e.g. AIB EVGA FTW 3-slot cards), but I'm not sure about these chunky side-enclosed units... takes away some of that industrial appeal, hence I'm more of an AIB devotee.
#104
TheinsanegamerN
Bigshrimp: Might as well get a 1600 watt power supply to go with that thick ass video card. Kind of getting a tad obnoxious with the size of these gigantic coolers they are using to cool the 800 watt beast. Plus that is a lot of power that will ultimately drive your power bill much higher.
If you're paying over $2k for a video card, an extra $10 a year in electricity isn't going to worry you. If it does, you are not the market for $2k graphics cards in the first place. If the power draw is enough to noticeably increase your power bill, you are likely gaming almost to the level of a full-time job, and then the performance of the card will likely be worth the price of entry; or you're a streamer, and a single day will bring in enough income to cover a year of electricity.

If the size of the cooler is an issue, there are waterblock options. And again, at this price, spending another $200 on a water cooling solution isn't a major issue.

Ferrari owners don't sweat the cost of gas, yadda yadda, etc. The same arguments were presented 15 years ago with triple-SLI setups, the first 2 kW power supplies, and major water cooling setups, and the answer is the same now: it's a niche market and that niche wants absolute performance, cost be damned.
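As a rough check of the electricity figure in the post above, the sketch below runs the numbers with assumed inputs; the extra draw over an RTX 4090, the daily hours, and the price per kWh are illustrative guesses, not figures from the post or the article.

```python
# Back-of-the-envelope yearly cost of the extra power a hypothetical
# RTX 4090 Ti might draw over an RTX 4090. All inputs are assumptions
# for illustration only; actual draw, usage, and tariffs vary widely.

extra_draw_w = 150      # assumed extra board power vs. an RTX 4090, in watts
hours_per_day = 2.0     # assumed daily gaming time
price_per_kwh = 0.15    # assumed electricity price, USD per kWh

extra_kwh_per_year = extra_draw_w / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")   # ~110 kWh
print(f"Extra cost per year:   ${extra_cost_per_year:.2f}")     # ~$16
```

Heavier daily use or higher tariffs scale this linearly, but it stays a small fraction of the card's purchase price.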
#105
Totally
TheDeeGee: I also wonder where the power connectors are located, with the PCB being against the motherboard.
The PCB isn't against the motherboard; the daughterboard is mounted on the side facing away from the connector.
#106
AsRock
TPU addict
Damn, I want one that fits an RX 6950. Don't judge; my CPU has a massive cooler on it, therefore my GPU should have a massive one too.
#107
bogami
I have nothing else to mention except that it is a totally messed-up connector setup compared to any GPU so far. The first 2x 4K monitors are coming, and I hope they will be able to do the job with the existing outputs. It looks like they didn't think about how everything would look with the GPU cooling block. Maybe they used AI in the design, because it doesn't make sense to me that anyone could be so short-sighted. :kookoo: :wtf:
#108
TheDeeGee
Totally: The PCB isn't against the motherboard; the daughterboard is mounted on the side facing away from the connector.
It can't be just a daughterboard; I highlighted all the contact areas. The entire PCB is on the side of the card against the motherboard when mounted in the case.

#109
N/A
Totally: Looks like the VRM/power is just being moved to a separate daughterboard. They've done this before; why is this a big deal? Just not sandwiched this time?
There is no daughterboard; it's the only board. All the memory chips, the GPU, and the VRMs are there. Oops.
#110
SOAREVERSOR
TheinsanegamerN: If they did that, they'd either need two SKUs, one for air and one for water, or they'd need to abandon the air market. It's easier for them to leave that to AIBs to offer a model with a waterblock, like Gigabyte typically does.

A model with no heatsink would be nice for watercooling bros, but it'd be hard to verify warranty needs.


Bruh, DVI has been a dead standard for over a decade now, and VGA? Really? What high-end monitor today is using VGA? If it's not a 4K panel, or even 1440p, why on earth are you buying a 4090 Ti? (And honestly, just use a DisplayPort-to-VGA adapter if you really need it.)
DVI morphed into HDMI and DP; many high-end projectors still use VGA as it's better over longer runs and doesn't require repeaters. It's still very common.
#111
dorsetknob
"YOUR RMA REQUEST IS CON-REFUSED"
GREAT, a graphics card that makes an EATX case feel like an SFF case (big FAIL).
#112
ShiBDiB
I'm calling bullshit... between the guy's random feet in the pic and the fact that the parts look poorly made with weird angles, I think this is a fake.
#113
Gmr_Chick
I saw the pics and The Commodores' "Brick House" started playing in my head :pimp: :roll:
ShiBDiB: I'm calling bullshit... between the guy's random feet in the pic and the fact that the parts look poorly made with weird angles, I think this is a fake.
Homie be wearin' socks with flip flops :fear:
#114
QUANTUMPHYSICS
Anything larger than 2 slots at this point is pure arrogance.

4 slots is beyond arrogance.

These cards are too bulky and run hot. They should simply be 2 slots with an AIO liquid cooler.
#115
sepheronx
eidairaman1: So does it do laundry, cook and wash dishes, clean the house?
You can use the NVIDIA proprietary technology of CUDA cores for AI development, so you can power your wife with the NVIDIA algorithm to be able to wash dishes and your clothing and not complain of a headache.

I look forward to such tech as it will act as a great replacement for the yesteryear technology of woman known as "biology"
#116
wolf
Performance Enthusiast
TheinsanegamerN: it's a niche market and that niche wants absolute performance, cost be damned
It amazes me the number of people that either don't understand this, or know it and chime in just to spew hate anyway. Almost by definition this product isn't for the people whining about size/cost/power requirements. The Ferrari analogy works a treat. I suppose it's easier to go hurr durr nvidia bad power connector melt har har.
#117
the54thvoid
Intoxicated Moderator
wolf: It amazes me the number of people that either don't understand this, or know it and chime in just to spew hate anyway. Almost by definition this product isn't for the people whining about size/cost/power requirements. The Ferrari analogy works a treat. I suppose it's easier to go hurr durr nvidia bad power connector melt har har.
There's nothing wrong with criticising the size and cost of a graphics card. If we look at Nvidia's professional line, they're all still dual slot (yes, I know - noise).

The Ferrari analogy is not good at all: a Ferrari is a luxury performance sports car with performance that does not decrease over time; a car that did 200 mph 20 years ago is still capable of that. A £2000+ graphics card today will perform worse than a mid-range card in two generations. It's also hard to sell a £2000 gfx card at £2000 when the mid-range card costs less than half that. Ferrari performance and value do not crash like that. Bad analogy. A luxury product's value does not noticeably decrease with time. Tech does; that's why there is only a small % of enthusiasts willing to pay for it. Of note, it's very poor taste to criticise a person for buying what they can afford. But poking fun at tech that many people see as ridiculous, or unnecessary, or whatever, is fair game in context.

It is not hatred, as you say it is.
#118
wolf
Performance Enthusiast
the54thvoid: The Ferrari analogy is not good at all: a Ferrari is a luxury performance sports car with performance that does not decrease over time; a car that did 200 mph 20 years ago is still capable of that. A £2000+ graphics card today will perform worse than a mid-range card in two generations.
I disagree; the same card used for the exact same workload 20 years later will retain the same performance. Does it fit perfectly? Of course not, one's a car and one's a GFX card, but in a way it does apply in the sense of insanely high-end luxury products. In the same vein that most people don't go out of their way to criticise a Ferrari or a supercomputer for their cost, energy requirements, etc., some products were absolutely never going to be something they'd have ever entertained the possibility of buying. I see a lot of what I'd call fair game, as you say, because it's ridiculous and unnecessary; there's a way to do that and I can happily laugh along/shrug it off, etc. But I also see hatred, not all of the time, but I can personally guarantee to you that for some people, it's hatred. We don't have to agree on the analogy, the criticism 'fuel', etc.; I'm happy to agree to disagree.
#119
medi01
So, sell the 4090 to people who just want to buy the fastest GPU no matter what.

Do an SM bump from 128 to 144 (that's merely +12.5%), up the max TDP, and let those people buy yet another brick.

Sounds quite profitable.


I recall calculating that the 2080 Ti, even though it was a single-digit market share GPU, was about 10% of NV's revenue.
#120
Fluffmeister
Nvidia have been holding back their fastest silicon for years; it's like some of you have just rolled out of a cave.
#121
medi01
I don't recall it rolling out so fast, unless forced by the competitor.
#122
the54thvoid
Intoxicated Moderator
FluffmeisterNvidia have been holding back their fastest silicon for years, it's like some of you have just rolled out of a cave.
You're just trolling now. Everyone knows Nvidia never maxes out the initial high-end silicon. The thing here is that it doesn't need to release it because the 4090 is stupid fast. It actually doesn't make sense to do so. Except of course to one-up the 4090 and make that a 2nd-tier product.
#123
Fluffmeister
the54thvoid: You're just trolling now. Everyone knows Nvidia never maxes out the initial high-end silicon. The thing here is that it doesn't need to release it because the 4090 is stupid fast. It actually doesn't make sense to do so. Except of course to one-up the 4090 and make that a 2nd-tier product.
Oh please, the 4090 Ti was always going to be released at some point; acting shocked or heartbroken for RTX 4090 owners is the very definition of trolling.


We aren't talking about luxury Ferraris, after all.
#124
medi01
Fluffmeister: acting shocked or heartbroken for RTX 4090 owners
That's just your perception, not reality.
#125
Fluffmeister
medi01: That's just your perception, not reality.
Guys, I heard a rumour... If you're patient and happy to wait, well, wait for it... Apparently there will be a card coming called the RTX 5080! And rumours suggest it may well end up being cheaper and faster than the unreleased 4090 Ti, certainly better value for money... Crazy. This is just between you and me, don't tell anyone else.