Monday, June 26th 2023

More Pictures of NVIDIA's Cinder Block-sized RTX 4090 Ti Cooler Surface

Back in January, we got our first look at the cinder block-like 4-slot cooling solution of NVIDIA's upcoming flagship graphics card (called either the RTX 4090 Ti or the TITAN Ada). "ExperteVallah" on Twitter scored additional pictures of the cooler. Its design pushes the heat-dissipation surface across the entire thickness of the cooler, with ventilation along the entire length.

The card's PCB isn't conventional. Rather than standing perpendicular to the plane of the motherboard like any other add-in card's, it lies along the plane of the motherboard, with additional breakaway daughter cards interfacing with the sole 12VHPWR power connector and the PCIe slot. This slender, ruler-shaped PCB spans the entire length of the card without getting in the way of its heat-dissipation surfaces. The length is used for the large AD102 ASIC, which is probably maxed out (all 144 SM enabled); twelve GDDR6X memory chips (possibly faster 23 Gbps ones); and a mammoth VRM that nearly maxes out the 600 W continuous power-delivery design limit of the 12VHPWR connector.
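For a sense of what twelve 23 Gbps GDDR6X chips would mean, here's a quick back-of-the-envelope sketch (assuming the usual 384-bit bus of a fully enabled AD102, i.e. twelve 32-bit chips; the data rate itself is unconfirmed):

```python
# Rough memory-bandwidth estimate for a fully enabled AD102 card.
# Assumptions (not confirmed by NVIDIA): 23 Gbps GDDR6X, 384-bit bus.
data_rate_gbps = 23          # per-pin data rate in Gbit/s
bus_width_bits = 12 * 32     # twelve GDDR6X chips, 32 bits each = 384-bit

bandwidth_gbs = data_rate_gbps * bus_width_bits / 8  # Gbit/s -> GB/s
print(f"{bandwidth_gbs:.0f} GB/s")  # ~1104 GB/s, vs ~1008 GB/s for 21 Gbps on the RTX 4090
```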
Sources: ExperteVallah (Twitter), Hassan Mujtaba (Twitter), VideoCardz

145 Comments on More Pictures of NVIDIA's Cinder Block-sized RTX 4090 Ti Cooler Surface

#101
N/A
JohHGrotesque, really.
And still only one 16-pin. When can we expect at least two, for safety?
One slot is 2 cm; four slots is just wasted.
#102
evernessince
I can't imagine they will only go with one 12VHPWR connector. Aris from Hardware Busters has already demonstrated that the connector near-instantly starts burning at 650 W, and the 4090 already goes up to 500-520 W.
#103
LabRat 891
evernessinceI can't imagine they will only go with one 12VHPWR connector. Aris from Hardware Busters has already demonstrated that the connector near-instantly starts burning at 650 W, and the 4090 already goes up to 500-520 W.
A 600W-rated plug that starts to fail at 650W?!
That seems like an awfully small margin for error/excursions.

As I recall, both previous PCIe power plugs routinely went well above their spec'd rating.

<10% overhead would be absolutely unacceptable in any AC mains receptacle/plug; why is it okay here?
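To put numbers on that margin, here's a rough per-pin sketch (assuming the connector's six 12 V pin pairs and the roughly 9.5 A per-contact rating commonly cited for its Micro-Fit-style terminals; both are ballpark figures, not spec quotes):

```python
# Per-pin current at the 12VHPWR's rated and reported-failure loads.
# Assumptions: 12 V rail, 6 current-carrying pin pairs, ~9.5 A per-contact rating.
PINS = 6
VOLTS = 12.0
PIN_RATING_A = 9.5  # approximate; the exact rating depends on the terminal used

for watts in (450, 600, 650):
    amps_per_pin = watts / VOLTS / PINS
    headroom = (PIN_RATING_A - amps_per_pin) / PIN_RATING_A
    print(f"{watts} W -> {amps_per_pin:.2f} A/pin, {headroom:.0%} headroom")
# 450 W -> 6.25 A/pin, 34% headroom
# 600 W -> 8.33 A/pin, 12% headroom
# 650 W -> 9.03 A/pin,  5% headroom
```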
#104
ir_cow
Count von SchwalbeFrom the pictures, the rest of the sentence was redundant.
I'm okay with body shots.
#105
eidairaman1
The Exiled Airman
Well, we know that throwing it is as effective as a traditional brick.
#106
sLowEnd
LabRat 891Less reliable/durable and easier for an End-User to break. Also, a completely on-card solution would be just as bulky and heavy.

I don't like the Apple-like layout and intricacies of the cooler, but I am 'on-board' for smaller PCBs and big coolers (that take the strain, not the PCB.)

In the future, I hope to see 'passive phase-change' coolers. No pump, no tubes; just a fully self-integrated and hermetically sealed, passive heatpump.
There's been quite a bit of work on this concept, but the tech is largely 'stuck' in development. (At least, on the consumer/prosumer side.)
An AIO would allow you to reposition that bulk and use your expansion slots.
#107
Vayra86
HazizaDamn, it will be a beast for sure, but my 4090 will do until the RTX 5000 series next year. They should have released it at launch. It feels pointless to buy something so expensive and only get a year of the top-of-the-line card experience for the price.
Nvidia is making sure that neither you as a 4090 buyer nor the 4090 Ti buyer gets that experience.
#108
AusWolf
HazizaDamn, it will be a beast for sure, but my 4090 will do until the RTX 5000 series next year. They should have released it at launch. It feels pointless to buy something so expensive and only get a year of the top-of-the-line card experience for the price.
You could have expected it by looking at the specs of the 4090 and seeing that it's based on a partially disabled GPU die.
#109
Wirko
N/AAnd still only one 16-pin. When can we expect at least two, for safety?
One slot is 2 cm; four slots is just wasted.
Has Nvidia or Intel or anyone else ever confirmed that more than one 16-pin connector can be used in parallel?

Sure, the power pins aren't an issue, but the signaling pins might be, and may require a non-standard solution, especially if the creators of the standard were shortsighted enough not to think that any GPU's consumption could ever exceed 0.6 kW.
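For context, the connector's sideband pins already encode a fixed power budget, which is where paralleling could get awkward. A minimal sketch of the SENSE0/SENSE1 coding as commonly documented for ATX 3.0 / PCIe 5.0 (paraphrased from public sources, so treat the exact table as an assumption):

```python
# 12VHPWR sideband coding: SENSE0/SENSE1 tell the card the PSU's power budget.
# "Gnd" = pin pulled to ground, "Open" = floating. Table as commonly published
# for ATX 3.0 / PCIe CEM 5.0; the standard tops out at 600 W per connector.
SENSE_TABLE_W = {
    ("Gnd",  "Gnd"):  600,
    ("Open", "Gnd"):  450,
    ("Gnd",  "Open"): 300,
    ("Open", "Open"): 150,
}

def initial_power_limit(sense0: str, sense1: str) -> int:
    """Maximum sustained draw the card may assume from one connector."""
    return SENSE_TABLE_W[(sense0, sense1)]

print(initial_power_limit("Gnd", "Gnd"))  # 600 -> anything beyond needs a second connector
```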
#110
TheinsanegamerN
Beginner Micro DeviceThose who are okay are. There would be no 999 Watt monsters whatsoever if no one bought them.
:laugh: :roll: :laugh: :roll: :laugh:
The PC community has literally always been chasing the performance dragon. Remember clock doubling? The AMD 5x86? Bus OCing?
Beginner Micro DeviceYet being outpushed. You can manage to push an extra 3 percent speed outta them, yet their appetites are record-breaking at these settings.

I understand them, they want their GPUs to be competitive, but 2x wattage per decade is a complete clown fiesta. Especially considering how much these cards consume in low loads like media playback, hitting dozens of watts.

Time will pass, and video cards of <200 W TDP will cease to exist. This is what the "oh come on, it's fine" attitude is doing to the industry.
It sounds like you are fundamentally unhappy. :mad: You still have 200 W GPUs, in the form of cards like the 3060 Ti. You have the 3050. You can easily buy one of these cards, undervolt it by 5%, and have the uber-efficient GPU of your dreams for not that much money.

If they took a 3060 and made it the 3090 Ti, would that make you happy? Or would you immediately be complaining that Nvidia was sandbagging performance for the next generation? Something tells me you would be fundamentally unhappy with Nvidia no matter what they did, unless they gave you a 3090 Ti with a 3050 power envelope for $100, and even then, you'd be unhappy that it wasn't a 4090.
Razrback16Ya this is ridiculous at a comical level at this point.
Funny, people said the same thing with the 8800 Ultra. Almost like innovation is scary.
Razrback16Honestly with these cards (even the previous gen 3090) requiring the power they do, and putting out the heat we are seeing, I would love to see companies try to market dedicated liquid cooling a bit more. I'd never even consider buying one of these with an air cooler on it.
Why? The air cooler is relatively quiet, keeps temps in check, and is much easier to ship and install than an AIO-based card. Most people don't want to deal with water cooling, so if you can manage with an air cooler, why even bother?
#111
AusWolf
TheinsanegamerN:laugh: :roll: :laugh: :roll: :laugh:
The PC community has literally always been chasing the performance dragon. Remember clock doubling? The AMD 5x86? Bus OCing?

It sounds like you are fundamentally unhappy. :mad: You still have 200 W GPUs, in the form of cards like the 3060 Ti. You have the 3050. You can easily buy one of these cards, undervolt it by 5%, and have the uber-efficient GPU of your dreams for not that much money.

If they took a 3060 and made it the 3090 Ti, would that make you happy? Or would you immediately be complaining that Nvidia was sandbagging performance for the next generation? Something tells me you would be fundamentally unhappy with Nvidia no matter what they did, unless they gave you a 3090 Ti with a 3050 power envelope for $100, and even then, you'd be unhappy that it wasn't a 4090.

Funny, people said the same thing with the 8800 Ultra. Almost like innovation is scary.
Making something that eats 2x more power but is 4x faster is different from making something that's 2x more power hungry but is only 10% faster. The modern PC world is slowly but surely moving into the latter, while gamers shrug saying "eh, my super-duper-ultra-uber 39-slot cooler and 2 kW PSU will handle it", not even thinking about the fact that giving up that barely noticeable 10% performance difference would let us have normal-sized components that fit in any case and don't trip all the fuses in your house. Undervolting is not an argument, either, as long as GPU sizes are what they are, and you need that 100 kW PSU to get into Windows and install software to undervolt to begin with.
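To put that distinction in numbers, a tiny sketch with made-up generational figures (the multipliers are illustrative, not measurements of any real card):

```python
# Perf-per-watt change for two hypothetical generational jumps.
def perf_per_watt_gain(perf_mult: float, power_mult: float) -> float:
    """Ratio of new perf/W to old perf/W."""
    return perf_mult / power_mult

print(perf_per_watt_gain(4.0, 2.0))  # 2.00 -> 2x power, 4x speed: efficiency doubles
print(perf_per_watt_gain(1.1, 2.0))  # 0.55 -> 2x power, 10% faster: efficiency nearly halves
```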
#112
TheinsanegamerN
AusWolfMaking something that eats 2x more power but is 4x faster is different from making something that's 2x more power hungry but is only 10% faster. The modern PC world is slowly but surely moving into the latter, while gamers shrug saying "eh, my super-duper-ultra-uber 39-slot cooler and 2 kW PSU will handle it", not even thinking about the fact that giving up that barely noticeable 10% performance would let us have normal-sized components that fit in any case and don't trip all the fuses in your house. Undervolting is not an argument, either, as long as GPU sizes are what they are, and you need that 100 kW PSU to get into Windows and install software to undervolt to begin with.
How on earth would you have survived in the before years, when PCs idled at 100 W+? I'm talking the 2000s, before power states were widespread. The 2 kW PSU meme came from the late 2000s when tri and quad SLI were a thing; that's not new.

Also, this argument is totally stilted anyway. People always whine and cry about the power use, like you did with "Making something that eats 2x more power but is 4x faster is different from making something that's 2x more power hungry but is only 10% faster", but the GPUs we are talking about are the likes of the 4090/Ti, you know, the GPU that is not only significantly faster than a 3090 Ti but also uses less power? Makes the argument fall on its face.

If we're being serious here, the entire 4000 series is much more efficient than the 3000 series. Your complaint holds no water. The 4060 Ti pulls less power than the 3060, or the 2060, while being as fast as a 3060 Ti or 3070. Same for the rest of the gen. As defenders of the 7600 love to point out, it draws 90 W compared to 135 W for the 6650 XT.

It reinforces my argument that the people whining about power use are just that, whiners. They will never be happy; they have been given a generation that is significantly more efficient, and all they can do is whine about how it still uses too much power for them, why isn't the 4090 a 150 W GPU, blah blah blah, while simultaneously ALSO complaining that the 4000s don't provide enough generational uplift. Like WTF do you people want? The 4000s don't need undervolting like the 3000s do, and yet, all we hear about is that the 4000s pull too much power.

It's all so tiresome.
#113
AusWolf
TheinsanegamerNHow on earth would you have survived in the before years, when PCs idled at 100 W+? I'm talking the 2000s, before power states were widespread. The 2 kW PSU meme came from the late 2000s when tri and quad SLI were a thing; that's not new.
How would I have? Well, I did actually survive it, and loved it, thanks very much. ;)

And the answer is simple: the delta between idle and load power consumption wasn't anywhere near what it is today. Don't get me wrong, being efficient at idle is a good thing. All I'm saying is, we (at least in my area) were rocking noname 350-400 W office PSUs back then, while a 600 W quality unit is the minimum when you think about building an even slightly gaming-capable PC today.
TheinsanegamerNAlso, this argument is totally stilted anyway. People always whine and cry about the power use, like you did with "Making something that eats 2x more power but is 4x faster is different from making something that's 2x more power hungry but is only 10% faster", but the GPUs we are talking about are the likes of the 4090/Ti, you know, the GPU that is not only significantly faster than a 3090 Ti but also uses less power? Makes the argument fall on its face.

If we're being serious here, the entire 4000 series is much more efficient than the 3000 series. Your complaint holds no water. The 4060 Ti pulls less power than the 3060, or the 2060, while being as fast as a 3060 Ti or 3070. Same for the rest of the gen. As defenders of the 7600 love to point out, it draws 90 W compared to 135 W for the 6650 XT.

It reinforces my argument that the people whining about power use are just that, whiners. They will never be happy; they have been given a generation that is significantly more efficient, and all they can do is whine about how it still uses too much power for them, why isn't the 4090 a 150 W GPU, blah blah blah, while simultaneously ALSO complaining that the 4000s don't provide enough generational uplift. Like WTF do you people want? The 4000s don't need undervolting like the 3000s do, and yet, all we hear about is that the 4000s pull too much power.

It's all so tiresome.
I don't know what people want. What I want is for midrange cards to stay where they are/were in power consumption. I remember the 1060 being extremely efficient with its 120 W TDP; then the 2060 increased on it by a lot, and the 3060 by even more. Of course the 40-series is efficient: it has the performance to compensate for its power needs in the high end, and it seems to improve on power consumption while not giving a lot more performance in the lower segments, just like the 7600 doesn't. I definitely appreciate a move towards better power consumption over better performance, but I'm not sure it convinces everyone, and that's probably what the crying is about. The lower end doesn't improve on last gen in performance, while the higher end consumes enormous amounts of power.
#114
TheinsanegamerN
AusWolfHow would I have? Well, I did actually survive it, and loved it, thanks very much. ;)

And the answer is simple: the delta between idle and load power consumption wasn't anywhere near what it is today. Don't get me wrong, being efficient at idle is a good thing. All I'm saying is, we (at least in my area) were rocking noname 350-400 W office PSUs back then, while a 600 W quality unit is the minimum when you think about building an even slightly gaming-capable PC today.
People way overestimate the PSU you need today. A 600 W unit would easily handle a 4070 and an i5 with headroom to spare.
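As a rough sanity check, a sketch with ballpark draws (assuming roughly 200 W sustained for a 4070, 125 W for a midrange i5, 50 W for the rest of the system, and a crude allowance for GPU transients; none of these are measured figures):

```python
# Back-of-the-envelope PSU sizing for a 4070 + i5 build.
GPU_W, CPU_W, REST_W = 200, 125, 50           # rough sustained draws (assumptions)

sustained = GPU_W + CPU_W + REST_W            # ~375 W steady-state
with_transients = sustained + 0.5 * GPU_W     # crude margin for short GPU power spikes

print(f"{sustained} W sustained, ~{with_transients:.0f} W with spikes")
# 375 W sustained, ~475 W with spikes -> a quality 600 W unit still has headroom
```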
AusWolfI don't know what people want. What I want is midrange cards to stay where they are/were in power consumption. I remember the 1060 being extremely efficient with its 120 W TDP, then the 2060 increased on it by a lot, and then the 3060 by even more.
The 1060 pulled 120 W because, aside from Pascal being an efficient arch, it was limited by its node. They couldn't get Pascal to regularly clock to 2.4 GHz like they originally wanted. If they could, I guarantee they would have, and the 1060 would not have been the efficiency beast it was.

The 2060 introduced RT cores and clocked noticeably higher. The 3060 offered a huge relative performance boost over the 2060, along with better RT, and was stuck on Samsung's inferior node. The 4060 Ti offers nearly double the performance of a 2060 while pulling less power than a 2060.
AusWolfOf course the 40-series is efficient because it has the performance to compensate for power needs in the high end, and it seems to be improving on power consumption while not giving a lot more in performance in the lower segments, just like the 7600 doesn't. I definitely appreciate the move towards better power vs better performance, but I'm not sure it convinces everyone, and that's probably what the crying is about. The lower end doesn't improve on last gen in performance, while the higher end consumes enormous amounts of power.
The high end has ALWAYS consumed enormous amounts of power; the limit was always the node, not the GPU. For some reason, people want those high-end GPUs to be 150 W power sippers, against any and all reason.
#115
SCP-001
Gmr_ChickY'all are going through it right now in Texas, with that dangerous heatwave that's still (!) going strong. I remember when I lived in CA's central valley, the unbearable heat during the summer. Luckily we had AC, but my bedroom was right over the garage (we lived in a two story) and sometimes it would get so goddamn hot in my room at night while I was gaming - and this was with a 1660 Super, mind you - I'd have to give in and turn the AC on for a little while (we usually kept it off at night, don't ask me why lol).

Where I am now, the AC pretty much has to stay on 24/7 during the late spring through summer months due to the miserable humidity. Really hoping the 6800XT I just bought doesn't run crazy hot....
It reaches 105 °F constantly here near Dallas now. I live in a 2-story apartment, so my room alone gets insanely warm (even with the AC running), even without my PC being on. I've just learned to accept it at this point. Though, I have been playing more on my Steam Deck lately. I'm just hoping that next gen brings the board power down to something reasonable for the same performance, but that probably won't be the case.
#116
shoskunk
For a company with gobs of cash, this cooling solution needs to be retired. As if painted, thick fins equal more surface area.

It's lost that "Apple" appeal... ;)
#117
Bomby569
TheinsanegamerNThe 2060 introduced RT cores and clocked noticeably higher. The 3060 offered a huge relative performance boost over the 2060, along with better RT, and was stuck on Samsung's inferior node. The 4060 Ti offers nearly double the performance of a 2060 while pulling less power than a 2060.
You need to pick a lane and stick with it; the difference between the 3060 and the 2060 is smaller than the one from the 2060 to the 1060.
They picked Samsung to save money; it was a choice.
Then you abandon the pure 60-class comparison to go for the 4060 Ti, a class higher and obviously also priced higher, and use it to compare power draw, when we know the 4060 Ti is a joke that brought no evolution, barely beats the 3060 Ti, and sometimes even loses to it. Using it as an argument to discuss power-draw evolution is absurd; it's a card that brings zero evolution.

#119
TheinsanegamerN
Bomby569You need to pick a lane and stick with it;
You need to go touch grass.
Bomby569the difference between the 3060 and the 2060 is smaller than the one from the 2060 to the 1060.
Bomby569They picked Samsung to save money; it was a choice.
Bomby569Then you abandon the pure 60-class comparison to go for the 4060 Ti, a class higher
Bomby569and obviously also priced higher, and use it to compare power draw, when we know the 4060 Ti is a joke that brought no evolution, barely beats the 3060 Ti, and sometimes even loses to it. Using it as an argument to discuss power-draw evolution is absurd; it's a card that brings zero evolution

Imagine taking arguments out of context because you are THIS MAD over nvidia not giving you a 4090ti made out of god's ballsack.
#120
firejohn
A faster CPU than the current ones is needed, even for the 4090 already. I wish Intel would use TSMC's 4 or 3 nm technology.
#121
Unregistered
TheinsanegamerNFunny, people said the same thing with the 8800 Ultra. Almost like innovation is scary.
Eh, not sure how innovative it is if the card is covering up half of the PCIe slots on the board.
Oh and just wanted to add on here - it's funny you mentioned the 8800 Ultra...it was after that card (I had the 8800 GTX with a heavily modified cooler that did 8800 Ultra clocks but took up like 3 slots) that I started moving to dedicated water for both the CPU & GPU. I was tired of the insanity with the air cooler size, hah.
TheinsanegamerNWhy? The air cooler is relatively quiet, keeps temps in check, and is much easier to ship and install than an AIO-based card. Most people don't want to deal with water cooling, so if you can manage with an air cooler, why even bother?
See above. I understand your point, and I'm not saying the cooler is poorly designed for what it does in terms of handling the heat + noise, only that, when you have the cooler taking up 3-4 slots and preventing usage of the southern hemisphere (lol) of your mainboard, I'd say it wouldn't necessarily be the worst thing to look at other solutions, because things are getting a bit out of hand.

I believe the 3090 FE cooler was 3 slots wide, and as you noted, it's not a poorly designed device considering the job it does. But then I look at my own 3090, which is an ROG Strix with the front and back EK waterblock, and it doesn't even take up the full 2nd slot that is reserved by the block's mounting bracket. So yes, while some folks don't want to deal with watercooling, it could offer some nice incentives in this area: better cooling, even less noise, and cards that don't consume as much space.
#122
AusWolf
TheinsanegamerNPeople way overestimate the PSU you need today. A 600 W unit would easily handle a 4070 and an i5 with headroom to spare.

The 1060 pulled 120 W because, aside from Pascal being an efficient arch, it was limited by its node. They couldn't get Pascal to regularly clock to 2.4 GHz like they originally wanted. If they could, I guarantee they would have, and the 1060 would not have been the efficiency beast it was.

The 2060 introduced RT cores and clocked noticeably higher. The 3060 offered a huge relative performance boost over the 2060, along with better RT, and was stuck on Samsung's inferior node. The 4060 Ti offers nearly double the performance of a 2060 while pulling less power than a 2060.

The high end has ALWAYS consumed enormous amounts of power; the limit was always the node, not the GPU. For some reason, people want those high-end GPUs to be 150 W power sippers, against any and all reason.
It's only that high-end used to mean 200+ W, then 400+ W, and now it's approaching 600+ W. Though let's not forget that you had SLi and CrossFire back in the day; now you don't. Owning a 3090 or 4090 is the same as owning two 7900 GX2s (or whatever that dual-PCB monster was called), or a quad-SLi setup back then. As a mainstream gamer, it was totally unnecessary for me, just like it's unnecessary to have a 3090 now.
#123
LabRat 891
AusWolfIt's only that high-end used to mean 200+ W, then 400+ W, and now it's approaching 600+ W. Though let's not forget that you had SLi and CrossFire back in the day; now you don't. Owning a 3090 or 4090 is the same as owning two 7900 GX2s (or whatever that dual-PCB monster was called), or a quad-SLi setup back then. As a mainstream gamer, it was totally unnecessary for me, just like it's unnecessary to have a 3090 now.
Yup, 7900 GX2, 9800 GX2, GTX 295, GTX 590, GTX 690, etc.

IIRC, stuff like this is/was called 'Halo Tier' (as in: 'above the head of its class')
#124
Cyberaegis
eggymelonthis design is kinda cool, flow-through heatsink for the entire length of the card, could rotate CPU air cooler 90 degrees and achieve a nice flowing bottom-to-top wind tunnel. RAM would be facing the right direction for this too!
Yes, this time the CPU and RAM will be fully cooked.
#125
Ayhamb99
I said it before and I'll say it again:

Why don't they just make an AIO water cooler for the 4090 Ti FE? With how wide that cooler is, I would actually be concerned about the GPU being heavy enough for the PCIe slot on the motherboard to get torn off, letting the card fall.