
The Reason Why NVIDIA's GeForce RTX 3080 GPU Uses 19 Gbps GDDR6X Memory and not Faster Variants

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Obviously, but that is not what this topic is about, is it... Nobody ever said 'buy an FE'. The article here is specifically talking about temps on the FE.

And we both know expecting 5-6 years of life out of a GPU is not a strange idea at all. Obviously it won't run everything beautifully, but it certainly should not be defective before then. Broken or crappy fan over time? Sure. Chip and memory issues? Bad design.

Now, when it comes to those AIB cards... the limitations of the FE do translate to those as well, since they're also 19 Gbps cards because 'the FE has it'.

Yeah; I just read the original article and it seems like the FE card suffers from stability issues under certain intensive workloads.
I expected so much more from Nvidia in this day and age...
 
Joined
Sep 17, 2014
Messages
21,284 (5.99/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Yeah; I just read the original article and it seems like the FE card suffers from stability issues under certain intensive workloads.
I expected so much more from Nvidia in this day and age...

Like I said, I can smell Intel CPU nonsense here: nice burst, shit sustained performance unless you put a monstrous cooler on it.

This new cutting edge we're getting stinks a little bit, if you ask me.
 
Joined
Feb 1, 2019
Messages
2,757 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
These cards are not memory-bandwidth starved anyway; preliminary memory overclock tests show very little performance gain. The extra performance is not the real problem, though; the existing temps already seem to be one.

I remember some 5700 XT cards having to sell for very low prices because they ran their memory too hot (the ASUS TUF and the first-gen MSI Evoke), and those temps were slightly under 100°C.

The memory bandwidth is the bottleneck if we take Gamers Nexus' results as accurate.

He did manual overclocks and found that increasing the GPU clock gave basically no extra performance.
He then boosted the memory clock and got a measurable increase; it was sub-5%, but it was there. The napkin math below shows why a sub-5% gain lines up with a typical memory overclock.

Even my 1080 Ti gets bigger gains from memory clocks than from GPU core clocks, so these rumours that memory clock speeds are pointless seem wrong.

The most likely reason the fastest GDDR6X chips are not being used is that they will instead go into a future 3000-series product; NVIDIA doesn't show its best hand on the early products of a generation. They probably cost more, for starters.
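
Quick napkin math on that, assuming the 3080's 320-bit bus and 19 Gbps stock data rate from the spec sheet (a sketch in Python; the 20 Gbps value just stands in for a typical memory overclock, not a measured result):

```python
# Rough GDDR6X bandwidth math for the RTX 3080 (320-bit bus, 19 Gbps stock).
BUS_WIDTH_BITS = 320  # 10 GDDR6X modules x 32 bits each

def bandwidth_gbs(data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return data_rate_gbps * BUS_WIDTH_BITS / 8

stock = bandwidth_gbs(19.0)  # ~760 GB/s, the official figure
oc = bandwidth_gbs(20.0)     # stand-in for a typical memory overclock

print(f"stock: {stock:.0f} GB/s, OC: {oc:.0f} GB/s, "
      f"uplift: {(oc / stock - 1) * 100:.1f}%")  # ~5.3%
```

A roughly 5% bandwidth uplift producing a measurable but sub-5% FPS gain is exactly what you'd expect from a card that is at least partly bandwidth-sensitive.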
 
Joined
Dec 24, 2008
Messages
2,062 (0.37/day)
Location
Volos, Greece
System Name ATLAS
Processor Intel Core i7-4770 (4C/8T) Haswell
Motherboard GA-Z87X-UD5H , Dual Intel LAN, 10x SATA, 16x Power phace.
Cooling ProlimaTech Armageddon - Dual GELID 140 Silent PWM
Memory Mushkin Blackline DDR3 2400 997123F 16GB
Video Card(s) MSI GTX1060 OC 6GB (single fan) Micron
Storage WD Raptors 73Gb - Raid1 10.000rpm
Display(s) DELL U2311H
Case HEC Compucase CI-6919 Full tower (2003) moded .. hec-group.com.tw
Audio Device(s) Creative X-Fi Music + mods, Audigy front Panel - YAMAHA quad speakers with Sub.
Power Supply HPU-4M780-PE refurbished 23-3-2022
Mouse MS Pro IntelliMouse 16.000 Dpi Pixart Paw 3389
Keyboard Microsoft Wired 600
Software Win 7 Pro x64 ( Retail Box ) for EU
It is unfortunate that NVIDIA's engineers do not keep track of the TPU forum boards.
I have already written plenty of hints they should follow so that this 'hot pan' GPU technology model can change.

In the product design of the RTX 3080, every supporting technology came in at its maximum headroom.
Power usage is maxed to the point that a power limiter has to be enforced.
The air cooling system was developed to the maximum performance obtainable with air.
Memory modules that run thermally cooler and use less energy were selected.
A maximum GPU frequency that does not cause destructive power usage is enforced on the highest-performance variants.

In summary: dear NVIDIA, in 2020 you succeeded in bringing yourself to a DEAD END for any further GPU development.
I bet 1000 Euro that your R&D team today is pulling their hair out in desperation over what to use to develop the next BIG thing.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.38/day)
On the 3080 the memory plane draws 70 W, and on the 3090 probably up to 170 W, so what is the issue? Well, the same way there were 6, 7, 8, and 9 Gbps bins of GDDR5, there are now 19, 20, 21, 22, and 23 Gbps bins of GDDR6X, and it comes down to quality and price. Clocks will improve with time. Perhaps error correction causes performance to drop at those temperatures and at clock speeds above 19 Gbps.
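
For scale, here is a back-of-the-envelope split of those figures across the modules (the wattages are my estimates above, not official specs; the module counts are the known board layouts):

```python
# Per-module memory power, using the estimated "memory plane" figures above.

def per_module_watts(plane_watts: float, modules: int) -> float:
    return plane_watts / modules

rtx3080 = per_module_watts(70.0, 10)   # 3080: 10x 1GB GDDR6X modules
rtx3090 = per_module_watts(170.0, 24)  # 3090: 24x 1GB modules, both PCB sides

print(f"3080: ~{rtx3080:.1f} W/module, 3090: ~{rtx3090:.1f} W/module")
# ~7 W per module either way; faster bins at higher voltage would push each
# module's draw and temperature further up, which fits picking 19 Gbps.
```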
 
Joined
Dec 31, 2009
Messages
19,366 (3.67/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
On the 3080 the memory plane draws 70 W, and on the 3090 probably up to 170 W,
So... if that is remotely true... how is the 3090 spec'd at only 30 W more board power? If memory alone draws up to 100 W extra, the rest of the card would have to save around 70 W, and the significantly increased SP count and slightly lower clocks don't account for that kind of power saving.
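
For anyone following along, the napkin math behind that objection, using the official 320 W / 350 W board power figures and the memory estimates quoted above:

```python
# Sanity check on the power budget. TBP values are the official specs;
# the memory-plane wattages are the estimates from the post quoted above.

TBP_3080, TBP_3090 = 320.0, 350.0  # total board power, watts
MEM_3080, MEM_3090 = 70.0, 170.0   # estimated memory plane draw, watts

board_delta = TBP_3090 - TBP_3080  # +30 W total
mem_delta = MEM_3090 - MEM_3080    # +100 W for memory alone

# Everything else (GPU, VRM losses, fans) would have to shrink by the gap:
rest_delta = board_delta - mem_delta  # -70 W

print(f"memory: +{mem_delta:.0f} W, board: +{board_delta:.0f} W, "
      f"rest of card: {rest_delta:+.0f} W")
# A bigger GA102 configuration somehow drawing 70 W *less* is implausible,
# which is the point of the objection: the 170 W estimate looks too high.
```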
 