
NVIDIA GeForce RTX 4090 Founders Edition

Do you understand the concept of "adding bandwidth potential doesn't matter when the existing bandwidth was nowhere near saturated"?
Doesn't seem like it, no.
I guess what the market really needed right now was another price hike on current-gen motherboards, with OEMs upselling a "feature" that benefits zero people (PCIe Gen 5 SSDs aren't even out yet, and it's doubtful that even next-generation cards will saturate the PCIe Gen 4 bus).
They 99.9% sure won't, given that current cards aren't even saturating PCIe 3.0 consistently. And even PCIe 5.0 SSDs are meaningless outside of large sequential file transfers, which most consumers don't spend much of their time doing. The NAND is still the same, so random and mixed performance will be the same - and outside of the effects of slightly faster controller cores, PCIe 3.0 and 4.0 drives perform pretty much the same in those metrics. The only reason PCIe 4.0 drives are faster today is that nobody is producing PCIe 3.0 drives with premium NAND and controllers - if you had a 3.0 controller paired with the fastest NAND available today, that drive would most likely match 4.0 drives in any non-sequential task. In real-world applications, even the fastest 4.0 drives don't come close to saturating PCIe 3.0 bandwidth.
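To put rough numbers on that: here's a quick sketch of the theoretical per-direction link maximums by generation for a typical x4 NVMe slot. These are bus ceilings, not real drive speeds - actual drives fall well short in anything non-sequential.

```python
# Theoretical per-direction PCIe bandwidth by generation for an x4 NVMe link.
# GT/s figures are per lane; 128b/130b line coding applies from Gen3 onward.
GTS = {"3.0": 8.0, "4.0": 16.0, "5.0": 32.0}  # giga-transfers/s per lane
ENCODING = 128 / 130                           # 128b/130b coding overhead
LANES = 4                                      # typical M.2 slot

bandwidth = {gen: gts * ENCODING * LANES / 8 for gen, gts in GTS.items()}
for gen, gbps in bandwidth.items():
    print(f"PCIe {gen} x4: ~{gbps:.2f} GB/s")
```

So a Gen3 x4 slot already offers close to 4 GB/s each way - far more than any drive delivers in random or mixed workloads.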

PCIe 5.0 for consumers is a clear-cut case of Intel and AMD being caught in a destructive game of one-upmanship, where they "have no choice" but to include whatever harebrained features exist, because if they don't, they'll get all kinds of shit from customers complaining that their competitor has [feature X that nobody can make use of].
 
@W1zzard Table on first page has GA102 at 28 billion transistors while [the DB](https://www.techpowerup.com/gpu-specs/nvidia-ga102.g930) has it at 28.3 bil. Others appear to be correct (or at least, correspond to what the DB says).
 
I watched Terminator 3 today and this scene made me chuckle. :)

We finally have a GPU more powerful than Skynet!

Skynet teraflops.jpg
 
Not quite. Supercomputer TFLOPS figures almost always refer to peak double-precision (FP64) throughput.
 
Dayum. According to the TPU database, AD102 doesn't support full-speed FP64 even in the professional cards?

That was reserved for Hopper, which does around 30 teraflops of double precision?
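The gap is easy to put in rough numbers. The figures below are approximations pulled from the TPU database (the 1:64 FP64:FP32 ratio is the consumer Ada rate), so treat them as ballpark:

```python
# Back-of-the-envelope check of the FP64 gap discussed above.
# Approximate figures; FP64 on AD102 runs at 1/64 the FP32 rate,
# while Hopper is built for full-rate double precision.
ad102_fp32_tflops = 82.6      # RTX 4090 peak FP32 (approx., TPU database)
ad102_fp64_ratio = 1 / 64     # consumer Ada FP64:FP32 throughput ratio
ad102_fp64 = ad102_fp32_tflops * ad102_fp64_ratio
print(f"AD102 FP64: ~{ad102_fp64:.2f} TFLOPS")

hopper_fp64 = 30.0            # Hopper FP64 figure cited above
print(f"Hopper is ~{hopper_fp64 / ad102_fp64:.0f}x faster at FP64")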
 
Question: why does the backside of the AD102 on the Founders Edition cards look different from the backside of this chip on all the board partners' cards?
It looks like all board partners use 4 big capacitors and 2×10 very small capacitors, but Nvidia uses 6×10 very small capacitors on the backside of the chip.
 

Attachments

  • Screenshot_2024-02-22-06-50-17-391_com.android.chrome.jpg (1.3 MB)
  • Screenshot_2024-02-22-06-51-33-299_com.android.chrome.jpg (1.3 MB)
We never really got a good explanation for this last gen, when several RTX 3080 and 3080 Ti cards fried themselves running New World at thousands of FPS on the title screen. A few vendors, most notably EVGA, were using the 'fewer, larger caps' layout, while the Founders boards had the 'more, smaller caps'.

I can't remember exactly, but I think a few other GPU brands suffered from this too, possibly Galax and MSI - though it was EVGA that had the most failures. IIRC der8auer and GamersNexus did articles on it that were news coverage with some speculation, but nobody really got to the bottom of why this change caused cards to die, or why other cards with the fewer, larger caps weren't also failing.
 
I had an EVGA 3080 Ti just shit the bed after about three months of use. I wasn't playing anything that exciting - I was actually in the middle of a mission in Sniper Elite 5 when it died. The replacement has been going strong for about 13-14 months now. Maybe it wasn't just one game killing them, but poor quality control?
 
Again, my memory isn't perfect on this, but I'm fairly sure the issue affected EVGA more than other brands with the same capacitor layout, and that was attributed to EVGA's PCB layout and quality control to some extent.

It definitely wasn't just New World that caused problems - it was the first of a few games that exposed design flaws in those cards, and the common denominator was uncapped framerates at exceptionally low loads (15,000 FPS in a menu or loading screen, for example).
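That failure mode is why affected games later shipped menu frame caps. A minimal sketch of the idea - limit the loop so a trivially cheap scene can't spin at thousands of FPS (`render_menu` here is a stand-in, not a real engine API):

```python
# Minimal frame limiter: sleep out the remainder of each frame budget
# instead of letting a cheap menu scene spin the render loop flat out.
import time

def capped_loop(target_fps=120, frames=5):
    frame_time = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        # render_menu()  # stand-in for the actual (cheap) menu draw
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)  # idle instead of spinning

capped_loop()
```

Real engines do this with higher-resolution timers and driver-level limiters, but the principle is the same: bound the load even when there's almost nothing to draw.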
 
Maybe cost cutting, then? A single large capacitor should be cheaper to place and solder onto the PCB than ten small ones. Just a guess.
 