Tuesday, June 9th 2020

NVIDIA's Next-Gen Reference Cooler Costs $150 By Itself, to Feature in Three SKUs

Pictures of alleged next-generation GeForce "Ampere" graphics cards emerged over the weekend, which many of our readers found hard to believe. It features a dual-fan cooling solution, in which one of the two fans is on the reverse side of the card, blowing air outward from the cooling solution, while the PCB extends two-thirds the length of the card. Since then, there have been several fan-made 3D renders of the card. NVIDIA is not happy with the leak, and has started an investigation into two of its contractors responsible for manufacturing Founders Edition (reference design) GeForce graphics cards, Foxconn and BYD (Build Your Dreams), according to a report by Igor's Lab.

According to the report, the cooling solution, which looks a lot more overengineered than the company's RTX 20-series Founders Edition cooler, costs a hefty USD 150, or roughly the price of a 280 mm AIO CLC. It wouldn't surprise us if Asetek's RadCard costs less. The cooler consists of several interconnected heatsink elements with the PCB in the middle. Igor's Lab reports that the card is estimated to be 21.9 cm in length. Given its cost, NVIDIA is reserving this cooler for only the top three SKUs in the lineup: the TITAN RTX successor, the RTX 2080 Ti successor, and the RTX 2080/SUPER successor.
All three will use the same cooling solution, and a common PCB design codenamed PG132. Further, all three cards will be based on a common ASIC, codenamed "GA102," with varying hardware specs. The "SKU10" (TITAN RTX successor) could ditch the TITAN brand to carry the model name "GeForce RTX 3090," max out the 384-bit wide memory bus of the GA102 ASIC, and feature a whopping 24 GB of GDDR6X memory, with 350 W typical board power.

The next SKU, the "SKU20," which is the RTX 2080 Ti successor, will be cut down from the SKU10. It will feature 11 GB of GDDR6X memory across a 352-bit wide memory interface, and have a 320 W typical board power rating. This board will likely feature the RTX 3080 Ti branding. Lastly, there's the SKU30, which is further cut down, features 10 GB of GDDR6X memory across a 320-bit wide memory interface, and bears the RTX 3080 model number, succeeding the RTX 2080 / RTX 2080 Super.

When launched, "Ampere" could be the first implementation of the new GDDR6X memory standard, which could come with data-rates above even the 16 Gbps of today's GDDR6, likely in the 18-20 Gbps range, if not more. Lesser SKUs could use current-gen GDDR6 memory at data-rates of up to 16 Gbps.
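For perspective, the rumoured bus widths translate into theoretical memory bandwidth via the usual formula: bus width in bits × data rate in Gbps ÷ 8. The short Python sketch below runs that arithmetic; the 19 Gbps figure is purely an assumption taken from the rumoured 18-20 Gbps range, not a confirmed specification.

```python
# Back-of-the-envelope estimate of theoretical memory bandwidth for the
# rumoured SKUs. The 19 Gbps data rate is an assumption from the rumoured
# 18-20 Gbps range; none of these figures are confirmed.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical bandwidth in GB/s: bus width (bits) * data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

rumoured_buses = {
    "SKU10 (RTX 3090?)": 384,
    "SKU20 (RTX 3080 Ti?)": 352,
    "SKU30 (RTX 3080?)": 320,
}

for sku, bus_width in rumoured_buses.items():
    print(f"{sku}: {bandwidth_gb_s(bus_width, 19):.0f} GB/s at an assumed 19 Gbps")
```

At an assumed 19 Gbps, the 384-bit card would land around 912 GB/s, comfortably above the RTX 2080 Ti's 616 GB/s (352-bit at 14 Gbps).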
Sources: Igor's Lab, tor6770 (Reddit), VideoCardz, ChipHell Forums

92 Comments on NVIDIA's Next-Gen Reference Cooler Costs $150 By Itself, to Feature in Three SKUs

#26
Ruru
S.T.A.R.S.
lexluthermiester: Pretty much this. I'll take an EVGA card, thank you very much!
Too bad Sapphire doesn't make Nvidia cards. They've been great for years :)
Chrispy_: This still isn't an official render, but this version raises two more drawbacks.
  1. It's very clear that the front fan needs to be a radial fan. An ordinary axial fan without the outer ring would be a better choice than the one Nvidia have picked, which effectively prevents the blade tips from acting as a pseudo-radial blower.
  2. With the PCIe plugs on the end of the card connected via that daughterboard, half of the effective cooling from the rear fan is blocked.
I mean, it didn't look like the best design to start off with but if this rendition is accurate then it's even worse than I thought.
PCIe plugs could be wired to the board like in GTX 1060 and RTX 2060..?
Posted on Reply
#27
TheoneandonlyMrK
Vayra86: Yeah 320W does scream Fermi and Vega to me too. Curious about the idea behind that, Nvidia's 250W TDP for top end was almost becoming a fixed thing, and now they shrink and still need to push this up radically?

Either the perf jump is massive, or Nvidia is running out of ideas. Surely they won't do an RTG now....?
Seems to me like someone did some back of the cigarette packet maths on their competition and came up short via Xbox Series X specs.
Now they're dragging out the bigger chips, going with as much cooling and power as possible and shooting for the max clocks they can get.
Someone's worried.
Posted on Reply
#28
cucker tarlson
Expensive overengineered boards and cooling for the affluent.
Better stick to AIBs.
Posted on Reply
#29
Vayra86
theoneandonlymrk: Seems to me like someone did some back of the cigarette packet maths on their competition and came up short via Xbox Series X specs.
Now they're dragging out the bigger chips, going with as much cooling and power as possible and shooting for the max clocks they can get.
Someone's worried.
Imagine if the GPU table finally turns in AMD's favor, now of all times.

Turing might have been the writing on the wall.... won't say told you so... but.... :P
Posted on Reply
#30
lexluthermiester
Chloe Price: Too bad Sapphire doesn't make Nvidia cards. They've been great for years :)
Agreed! Sapphire has worked hard to reach the status they have earned.
Posted on Reply
#31
Flanker
Wouldn't be surprised with Foxconn lol. These leaks can do some work on their share prices.
Posted on Reply
#32
holyprof
I was dreaming of upgrading my EVGA 1080 hybrid to one of those new 3080s ...
If the cooler itself is $150, then my 1080 with AIO, bought in the middle of the crypto craze for $600, looks like a bargain.
If AMD comes up with something good to compete with Nvidia (an RDNA2 card that ties with the 3080), they will price it accordingly, so having an upper-tier GPU is no longer for me. Back to the XX60 class I guess, probably a 4060 in ... 2023?
Posted on Reply
#33
TheoneandonlyMrK
Vayra86: Imagine if the GPU table finally turns in AMD's favor, now of all times.

Turing might have been the writing on the wall.... won't say told you so... but.... :P
I am not sure it will, but I am at least becoming more sure that the next year will be interesting from a GPU PoV, it's all rumours though.
But it is looking like some will have a good reason to get the Phillips screwdriver out.
Posted on Reply
#34
Assimilator
Vayra86: Yeah 320W does scream Fermi and Vega to me too. Curious about the idea behind that, Nvidia's 250W TDP for top end was almost becoming a fixed thing, and now they shrink and still need to push this up radically?

Either the perf jump is massive, or Nvidia is running out of ideas. Surely they won't do an RTG now....?
These power draws are a rumour amongst a sea of rumours that isn't corroborated by anything else. In particular, the rumoured configuration of the CUDA cores has GA102 at only 15% more cores than TU102. Coupled with the 7nm shrink, that means Ampere should draw far less power than Turing.

Further, NVIDIA has historically been extremely hesitant to go above 300W TBP because that's the maximum that can be supplied by an 8+6 pin connector combination. The only reason they'd want to blow that budget is for performance reasons, and I don't see any reason that they need to worry about that.
theoneandonlymrk: Seems to me like someone did some back of the cigarette packet maths on their competition and came up short via Xbox Series X specs.
LOL, no. Console paper specs don't mean s**t. Especially considering RDNA2 is the first time AMD is doing HW accelerated ray-tracing.
Posted on Reply
#35
Vayra86
Assimilator: These power draws are a rumour amongst a sea of rumours that isn't corroborated by anything else. In particular, the rumoured configuration of the CUDA cores has GA102 at only 15% more cores than TU102. Coupled with the 7nm shrink, that means Ampere should draw far less power than Turing.

Further, NVIDIA has historically been extremely hesitant to go above 300W TBP because that's the maximum that can be supplied by an 8+6 pin connector combination. The only reason they'd want to blow that budget is for performance reasons, and I don't see any reason that they need to worry about that.



LOL, no. Console paper specs don't mean s**t. Especially considering RDNA2 is the first time AMD is doing HW accelerated ray-tracing.
You're right. I got carried away, but what if
Posted on Reply
#36
Ruru
S.T.A.R.S.
holyprof: I was dreaming of upgrading my EVGA 1080 hybrid to one of those new 3080s ...
If the cooler itself is $150, then my 1080 with AIO, bought in the middle of the crypto craze for $600, looks like a bargain.
If AMD comes up with something good to compete with Nvidia (an RDNA2 card that ties with the 3080), they will price it accordingly, so having an upper-tier GPU is no longer for me. Back to the XX60 class I guess, probably a 4060 in ... 2023?
I'd wait for AIB models with custom coolers if I were you and an upgrade is in sight.
Posted on Reply
#37
john_
$150 for the cooler, plus the cost of the PCB, plus the extra chips, plus the memory, plus what the manufacturer of the card will make, plus what the retailer will make, plus what Nvidia charges for the GPU, and whatever it charges includes a 60% or higher profit margin.

I guess it will cost $499 :laugh:
Posted on Reply
#38
GreiverBlade
laszlo: seems the card price will be 800 $ at least
ahhhhhh and to say all 1080Ti and 2080/2080Ti were above 1000 for me ... oh woes ...

alright, benchmark will not matter ... i will go RDNA2 once i want to get rid of my 1070 (which is bound to happen before RDNA2 cards availability .... oh well maybe a second hand RX5700XT)


$150 for that cooler? Heck, even aftermarket coolers are cheaper than that (and would probably work just as fine ...). Jen-Hsun really needs to stop coming up with BS to grab extra money just to sponsor his next leather jacket ...
Posted on Reply
#39
sutyi
Is this supposed to be some pre-emptive marketing BS to justify another price hike?

US$150 for 2x 90-100 mm fans, a vapor chamber (?) over the GPU, a separate heatsink for the VRAM I presume, and another for the VRM? My ass...
Posted on Reply
#40
TheoneandonlyMrK
Assimilator: These power draws are a rumour amongst a sea of rumours that isn't corroborated by anything else. In particular, the rumoured configuration of the CUDA cores has GA102 at only 15% more cores than TU102. Coupled with the 7nm shrink, that means Ampere should draw far less power than Turing.

Further, NVIDIA has historically been extremely hesitant to go above 300W TBP because that's the maximum that can be supplied by an 8+6 pin connector combination. The only reason they'd want to blow that budget is for performance reasons, and I don't see any reason that they need to worry about that.



LOL, no. Console paper specs don't mean s**t. Especially considering RDNA2 is the first time AMD is doing HW accelerated ray-tracing.
Paper specs? What are you smoking? The Xbox Series X is in hands, it defo isn't a unicorn.

We will see eh.
Posted on Reply
#41
midnightoil
That's a pretty simple, low-cost design. No way in hell it exceeds $30-35 in cost.
Posted on Reply
#42
Assimilator
theoneandonlymrk: Paper specs? What are you smoking? The Xbox Series X is in hands, it defo isn't a unicorn.
Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
Posted on Reply
#43
Ruru
S.T.A.R.S.
Assimilator: Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
Xbox 360 GPU was pretty powerful in 2005 when the console was launched..?
Posted on Reply
#44
Decryptor009
They'll probably charge 2K just because it has a fan on the back.
Assimilator: Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
Very wrong indeed.
Chloe Price: Xbox 360 GPU was pretty powerful in 2005 when the console was launched..?
Both the 360 and original Xbox had GPUs more capable upon launch than anything on the desktop market, but it was merely months before the PC had something better, so not truly a win, but they were more powerful for a moment in time.
Posted on Reply
#45
Ruru
S.T.A.R.S.
Decryptor009: Both the 360 and original Xbox had GPUs more capable upon launch than anything on the desktop market, but it was merely months before the PC had something better, so not truly a win, but they were more powerful for a moment in time.
I totally forgot the OG Xbox, yeah, it had a hybrid between GeForce 3 and 4.
Posted on Reply
#46
Chrispy_
Chloe Price: Too bad Sapphire doesn't make Nvidia cards. They've been great for years :)


PCIe plugs could be wired to the board like in GTX 1060 and RTX 2060..?
In the other thread I assumed the PCIe plugs would logically go where the NVLink fingers are, thus avoiding silly wires like we had in the 2060 and this insane daughter board shown in the render.

Wires would obviously be better for airflow on the rear fan but I was just going off the exploded diagram render that this thread is discussing.
Posted on Reply
#47
Ruru
S.T.A.R.S.
Chrispy_: In the other thread I assumed the PCIe plugs would logically go where the NVLink fingers are, thus avoiding silly wires like we had in the 2060 and this insane daughter board shown in the render.

Wires would obviously be better for airflow on the rear fan but I was just going off the exploded diagram render that this thread is discussing.
Totally missed the plugs on that daughter board, my bad.
Posted on Reply
#48
TheoneandonlyMrK
Assimilator: Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
You're getting a bit OT and wrong. I'm talking about devs, not influencers, and you said paper specs; they're set in stone now, that's not paper. And non-final? They're going to respin its chip before releasing? Nah.
It doesn't matter if it's better or worse; it's competing with what comes out for some people's money, and the main point is that it does well.
So competition is turning up this year.

How competitive they all are is yet to be decided, but I sure as shit am not paying £150 for no air cooler, I'll say that much on topic.
Posted on Reply
#49
cucker tarlson
Vayra86: You're right. I got carried away, but what if
They're TBP not TDP.
IIRC my 2070S has a 285 W board power rating.
Posted on Reply
#50
Vayra86
cucker tarlson: They're TBP not TDP.
IIRC my 2070S has a 285 W board power rating.
Correct, but the trend is still up, and the fact that Ampere pushes that up further even on a smaller node surprises me. IF it does and this isn't all BS.
Posted on Reply