Tuesday, September 22nd 2020
The Reason Why NVIDIA's GeForce RTX 3080 GPU Uses 19 Gbps GDDR6X Memory and not Faster Variants
When NVIDIA announced its next-generation GeForce RTX 3080 and 3090 Ampere GPUs, it specified that the memory found in the new cards would be Micron's GDDR6X running at 19 Gbps. However, with faster 21 Gbps GDDR6X modules already available, everyone was left wondering why NVIDIA didn't simply use the faster memory from Micron. That is exactly what Igor's Lab, a technology website, has been wondering as well, so they conducted testing with an infrared camera that measures the heat the card produces. To check out the full testing setup and methodology, you can go here and read it, including the embedded video.
Micron's GDDR5, GDDR5X, and GDDR6 chips are rated for a maximum junction temperature (TJ Max) of 100 degrees Celsius, and it is recommended that they run anywhere from 0C to 95C for best results. However, for the new GDDR6X modules found in the new graphics cards, there are not yet any official specifications available to the public. Igor's Lab estimates that they can reach 120C before they become damaged, which would put TJ Max at around 105C to 110C. When measuring the temperature of the GDDR6X modules, Igor found that the hottest chip ran at 104C, meaning the chips are running very close to the TJ Max they are (supposedly) specified for. NVIDIA's PCB design decisions contribute to this, as the hottest chips sit right next to the voltage regulators, which get quite hot on their own.

The takeaway here is that a card with a TGP of 320 W produces a great deal of heat, and while NVIDIA's cooler manages to keep it under control, the design decisions resulted in some possible performance loss. Instead of using Micron's faster 21 Gbps chips, NVIDIA is effectively forced to use the 19 Gbps variant. Supply is not the issue, as NVIDIA is currently Micron's only GDDR6X customer; the real reason is heat management. Moving from 19 Gbps to 21 Gbps chips is a speedup of roughly 10%, and the memory's heat output would scale along with it, resulting in an even greater TGP for the card. The conclusion drawn here is that the current chips are limited by temperature alone, as the card already runs hot, and that is why NVIDIA doesn't use the faster GDDR6X variant.
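As a quick sanity check on the arithmetic above, here is a minimal sketch (in Python, not from the article) of the bandwidth and heat math; it assumes the RTX 3080's 320-bit memory bus and a deliberately naive model in which memory heat scales linearly with the per-pin data rate.

# Rough sketch of the bandwidth and heat arithmetic discussed above.
# Assumptions (not from the article): the RTX 3080's 320-bit memory bus,
# and a naive model where memory power scales linearly with data rate.

BUS_WIDTH_BITS = 320          # RTX 3080 memory bus width
RATES_GBPS = (19, 21)         # per-pin data rates being compared

def bandwidth_gbs(rate_gbps: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Aggregate memory bandwidth in GB/s for a given per-pin data rate."""
    return rate_gbps * bus_width_bits / 8

bw_19, bw_21 = (bandwidth_gbs(r) for r in RATES_GBPS)
speedup = (21 - 19) / 19 * 100

print(f"19 Gbps -> {bw_19:.0f} GB/s, 21 Gbps -> {bw_21:.0f} GB/s")
print(f"Speedup: {speedup:.1f}%  (memory heat would rise roughly in step, "
      "under the simple linear scaling assumption)")

This works out to 760 GB/s versus 840 GB/s, a roughly 10% uplift, which is the figure the article's heat argument rests on.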
Sources:
Igor's Lab, via Tom's Hardware
55 Comments on The Reason Why NVIDIA's GeForce RTX 3080 GPU Uses 19 Gbps GDDR6X Memory and not Faster Variants
www.igorslab.de/en/simple-pad-mod-for-the-force-rtx-3080-founders-edition-slowers-the-gddr6x-temperature-by-a-whopping-8-degrees/
Firstly, nobody knows what safe temperatures are for GDDR6X, since that information isn't publicly available. 110 °C is the maximum temp for GDDR6 non-X; for all we know, G6X could be rated to 125 °C.
Secondly, even if G6X is only rated to 110 °C, the modules have thermal throttling built in, so they shouldn't be damaged.
Thirdly, Igor himself states: "Finally, if you really have a problem with this, do what everyone sane does: buy an AIB version with a proper cooler."
On topic:
When cards are designed right side up, heat will actually travel away from the chips naturally.
Should evidence emerge showing that these temps are a problem, I will join in rightly criticising NVIDIA for putting form over function. But not before. There's far too much fanboyism and too many idiot brigades on these forums; I reject such nonsense wholeheartedly.
Why do you think these specs aren't public? Coincidence? Materials don't suddenly, magically tolerate more heat. They're just stretching the limits of what's safe and what's not. As long as it lasts through the warranty period, right?
Time to put two and two together.
Absolute maximum ratings, storage temperature: -55 °C min, +125 °C max. This can be found under the data sheet at this link:
www.micron.com/products/ultra-bandwidth-solutions/gddr6x/part-catalog/mt61k256m32je-19
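To make the "two and two" explicit, here is a small sketch (same hypothetical Python as above) comparing Igor's 104C hotspot reading against the various limits floated in the article and this thread; note that the 125 °C figure from the data sheet is a storage temperature, not an operating junction temperature, and none of the G6X limits are confirmed specifications.

# Quick headroom check, putting the thread's numbers side by side.
# Only the 104 C hotspot reading is from Igor's measurement; the candidate
# limits below are the rated/estimated figures mentioned in the article
# and comments, not confirmed GDDR6X specifications.

MEASURED_HOTSPOT_C = 104

candidate_limits_c = {
    "GDDR6 (non-X) TJ Max per the article": 100,
    "Igor's low TJ Max estimate for G6X": 105,
    "Igor's high TJ Max estimate for G6X": 110,
    "Micron data sheet absolute-max STORAGE temp (not operating)": 125,
}

for label, limit in candidate_limits_c.items():
    margin = limit - MEASURED_HOTSPOT_C
    print(f"{label}: limit {limit} C, margin {margin:+d} C")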
Nvidia's cards generally aged just fine.
The hot ones, however, really didn't. Also on the AMD side. I don't see why this would be an exception to that rule. But you are welcome to provide examples of VRAM running close to 100C doing just fine after 4-5 years. I have hands full of examples showing the opposite.
And, ehh, tarnish reputation? The card made it past warranty, right?
I'm definitely disappointed that Nvidia has designed such a good-looking card that cools the GPU itself just fine but somehow fails to keep the memory chips cool enough. I'd avoid the FE and look somewhere else.
And we both know expecting 5-6 years of life out of a GPU is not a strange idea at all. Obviously it won't run everything beautifully, but it certainly should not be defective before then. Broken or crappy fan over time? Sure. Chip and memory issues? Bad design.
Now, when it comes to those AIB cards... the limitations of the FE do translate to those as well, since they're also 19 Gbps cards because 'the FE has it'.
I expected so much more from Nvidia in this day and age...