
Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

Joined
Dec 12, 2016
Messages
1,834 (0.63/day)
I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually do anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash: the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
AMD's inability to compete is because no one will buy their chips, even though they are very competitive against Nvidia's offerings. Luckily, you, Assimilator, have just volunteered to buy AMD as your next graphics card to help drive down Nvidia prices. I will join you, and together we will show everyone that the only way to bring about a competitive market is for everyone to stop buying brands and gimmicks and start buying great performance-per-dollar tech regardless of what name is on the box.

5080 Ti/5090 here I come.

3080 Ti has been great, but it's time for an upgrade.

Hoping for ~40% better performance than Ada; more is great, of course.

3080 Ti to 5090/5080 Ti should ideally be around 2x faster.

Since I framecap to 237 FPS, faster/more efficient cards also mean a lower total wattage, which is always nice, unless new games/software push the GPU that much harder, which I doubt. Ada was a significant efficiency leap and very tempting, but I don't upgrade every gen of GPU.
I'll either be buying a 9950X3D and a Radeon 8900XTX for my next build, or skip a generation and get Zen 6 and RDNA5. Since AMD is best for gaming in my opinion and will continue to focus equally on gaming and AI, my dollars will continue to go to them until Nvidia stops wasting resources on RT and AI.
 
Joined
Jun 11, 2019
Messages
611 (0.31/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
Nvidia stops wasting resources on RT and AI.
AMD's already all-in on AI just like everyone else in the market, lol, and we're just one gen away from seeing if they're finally going to improve their RT. What are you going to do if they follow Nvidia?
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Yeah! More stagnation! Let's vote for stagnation!

I don't know what you're looking at, but I'm seeing a slight uptick per tier, with all things increased except the bus width. GDDR7 makes up for part of the deficit though so bandwidth won't be worse than Ada relatively, that's good. But capacity, still 12GB in the midrange and 8GB bottom end? You're saying this is a good thing, now?
No, I'm saying it's good enough without sufficient competition.

Ada is already bandwidth constrained at the lower tiers. Nvidia is trying real hard to keep those tiers to what, 1080p gaming?
Every company wants to segment their products. When there's no competition that becomes a lot easier.

To each their own, but I think in 2025 people would like to move on from 1080p.
I've only played at 1440p since 2019. The 4060 Ti I switched to earlier this year has not given me any problems in this regard despite "only" 8GB of VRAM and "only" a 128-bit bus. The people you regularly see bemoaning NVIDIA GPUs are not the ones who own one.

As for AMD's inability to compete... RT fools & money were parted here. AMD keeps pace just fine in actual gaming and raster perf and is/has been on many occasions cheaper. They compete better than they have done in the past. Customers just buy Nvidia, and if that makes them feel 'screwed over'... yeah... a token of the snowflake generation, that also doesn't vote and then wonders why the world's going to shit.
It's nothing to do with RT and everything to do with marketing and advertising. When people read the news they see "NVIDIA" due to the AI hype, and that has made a significant and quantifiable impression on ordinary consumers' minds. AMD has completely failed to understand this basic concept; they seem to be operating on the assumption that having a slightly worse product at a slightly lower price point is good enough, and the market has very obviously shown that it absolutely is not. AMD has options to fight back against NVIDIA's mindshare, such as price cuts, but again, because AMD doesn't understand that they need to do this, they aren't.

Let's make it clear here: AMD is staring down the barrel regarding GPUs. The last 7 quarters are the worst for them since Jon Peddie Research started tracking this metric a decade ago; they had never dropped under 18% until Q3 2022, and with the upcoming Blackwell launch and nothing new from AMD, we can expect NVIDIA to breach 90% of the desktop GPU market. That is annihilation territory for AMD GPUs; that is territory where they consider exiting the desktop consumer market and concentrating on consoles only. That is territory where your company should be pulling out all the stops to recover, yet what is AMD doing in response? Literally nothing.

And it all compounds. If NVIDIA believes they're going to outsell AMD by 9:1, NVIDIA is going to book out 9x as much capacity at TSMC, which gives them a much larger volume discount than AMD will get, which means AMD's GPUs cost more; AIBs will have the same issue with all the other components they use like memory chips, PCBs, ... Once you start losing economies of scale and the associated discounts, you get into an even worse position when it comes to adjusting your prices to compete.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Let's make it clear here: AMD is staring down the barrel regarding GPUs. The last 7 quarters are the worst for them since Jon Peddie Research started tracking this metric a decade ago; they had never dropped under 18% until Q3 2022, and with the upcoming Blackwell launch and nothing new from AMD, we can expect NVIDIA to breach 90% of the desktop GPU market. That is annihilation territory for AMD GPUs; that is territory where they consider exiting the desktop consumer market and concentrating on consoles only. And what is AMD doing in response? Literally nothing.
See, this is conjecture. Who said this? AMD isn't saying this; they're simply continuing development, and they're not trying to keep pace with Nvidia because they know they can't.

Is AMD staring down the barrel? Is this really worse here than the years they were getting by on very low cashflow/margin products, pre-Ryzen? Are we really thinking they will destroy the one division that makes them a unique, synergistic player in the market?

There are a few indicators of where the market is moving.
- APUs are getting strong enough to run games properly, as gaming requirements are actually plateauing; you said it yourself, that 4060 Ti can even run 1440p. Does the PC market truly need discrete GPUs for a large segment of its gaming soon? Part of this key driver is also the PC handheld market, which AMD has captured admirably and IS devoting resources to.
- Their custom chip business line floats entirely on the presence and continued development of RDNA
- Their console business floats on continued development of RDNA - notably sub-high-end, as those are the chips consoles want
- The endgame in PC gaming still floats on console ports more than PC-first games at this point, and with more cloud-based play and unification between platforms, that won't become less pronounced, it will become more so.
- AI will always move fastest on GPUs, another huge driver to keep RDNA.

Where is heavy RT in this outlook, I wonder? I'm not seeing it. So Nvidia will command its little mountain of 'RT aficionados on the PC', a dwindling discrete PC gaming market with a high cost of entry, and I think AMD will be fine selling vastly reduced numbers of GPUs in that discrete PC segment, because it's just easy money alongside their other strategic business lines.

This whole thing isn't new and hasn't changed since, what, the first PS4.

AMD is fine, and I can totally see why they aren't moving. It would only introduce more risk for questionable gains; they can't just conjure up the technology to 'beat Nvidia', can they? Nvidia beats them at integration of software and hardware.

Still, I see your other points about them and I understand why people are worried. But this isn't new to AMD. It's the story of their life, and they're still here, and their share price has gained 400% over the last five years.
 
Joined
Feb 20, 2019
Messages
8,277 (3.94/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Curious to see how the 5060/Ti and maybe a 5070 will end up.
I have no upgrade plans left for this year, but sometime next year I wouldn't mind upgrading my GPU, and that's the highest I'm willing to go/what my budget allows. 'Those will be plenty expensive enough where I live, even second hand.. :shadedshu:'
4060Ti 16GB is a 1080p card in 2023. I bought one (needed the VRAM buffer for work) and dumped it into the second PC in the living room with a 4K TV. It can barely handle 1440p without performance nosediving because there's simply not enough bandwidth.

If they're going to keep it on a 128-bit bus, GDDR7 is maybe going to turn it into a 1440p card. At 448GB/s it's still 12% less bandwidth on paper than a vanilla 4070 which is okay at 1440p, but that's with lower-latency GDDR6. I'm not 100% sure you can just compare bandwidth between GDDR6 and GDDR7 because latency will have doubled, clock for clock - which means (only a guess here) that the 5060Ti will have 88% the bandwidth of a 4070 but ~50% higher latency. That's going to make it considerably handicapped compared to a 4070 overall, so I guess the rest of it is down to how well they've mitigated that shortcoming with better cache, more cache, and hopefully some lessons learned from the pointlessness of the 4060Ti.
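For anyone who wants to sanity-check those numbers, here's a minimal back-of-envelope sketch in Python. The 28 Gbps GDDR7 and 21 Gbps GDDR6X per-pin rates are assumptions on my part (nothing from the leak), and peak bandwidth of course says nothing about latency:

```python
# Back-of-envelope peak bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8.
# Assumed figures: 128-bit @ 28 Gbps GDDR7 for the rumoured 5060 Ti,
# 192-bit @ 21 Gbps GDDR6X for the RTX 4070 (assumptions, not confirmed specs).

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

rumoured_5060_ti = peak_bandwidth_gb_s(128, 28.0)  # 448 GB/s
rtx_4070 = peak_bandwidth_gb_s(192, 21.0)          # 504 GB/s

print(f"5060 Ti (assumed 128-bit GDDR7 @ 28 Gbps): {rumoured_5060_ti:.0f} GB/s")
print(f"4070 (192-bit GDDR6X @ 21 Gbps):           {rtx_4070:.0f} GB/s")
print(f"Ratio: {rumoured_5060_ti / rtx_4070:.0%}")  # ~89%, i.e. roughly 11-12% less
```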
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
4060Ti 16GB is a 1080p card in 2023. I bought one (needed the VRAM buffer for work) and dumped it into the second PC in the living room with a 4K TV. It can barely handle 1440p without performance nosediving because there's simply not enough bandwidth.

If they're going to keep it on a 128-bit bus, GDDR7 is maybe going to turn it into a 1440p card. At 448GB/s it's still more than 12% less bandwidth than a vanilla 4070, which is a decent 1440p offering, but that's with lower-latency GDDR6. I'm not 100% sure bandwidth comparisons between GDDR6 and GDDR7 are possible because latency will have doubled, clock for clock - which means (only a guess) that the 5060Ti will have 88% of the bandwidth of a 4070 but ~50% higher latency.
They could fix the latency with cache
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,029 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
A two-generation gap? For me, the 2080 Ti to 4070 Ti was a 50% jump.

Settle for nothing less! :cool:
A 50% jump is great considering you went down the stack by ~2 tiers and are only using ~30 W more power compared to the FE 2080 Ti (a 4080 Ti doesn't exist, and the 4070 Ti Super is a "4080 lite", arguably a different tier than the 4070 Ti).

I'm hoping two generations plus the same tier, or 1-2 tiers up (5090/5090 Ti?), is enough to double performance.

Fingers crossed lol. If I do go 5090/Ti I'll likely keep it three generations to recoup the extra cost.

They could fix the latency with cache
Maybe; still, I think xx60-class cards will be native 1080p/DLSS 1440p for at least this next gen.

It's important to bear in mind that 1080p on PC or 1440p DLSS arguably looks better than "native" 4K on console, which is realistically the competition at the entry level.

"Native" is in quotes because consoles typically vary resolution and make heavy use of mediocre upscaling when playing at 4K; that, or they have a 30 FPS frame target, which is pathetic.
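To put rough numbers on that comparison, here's a small sketch of the internal render resolutions; the per-axis scale factors are the commonly cited DLSS 2 presets (Quality ~0.667, Balanced ~0.58, Performance 0.5), so treat them as approximations rather than official figures:

```python
# Approximate internal render resolution for common DLSS presets.
# Per-axis scale factors are the commonly cited DLSS 2 values (approximate).
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
    return round(width * scale), round(height * scale)

for name, scale in PRESETS.items():
    w, h = internal_res(2560, 1440, scale)
    print(f"1440p DLSS {name}: ~{w}x{h} internal")
# Quality at 1440p renders around 1707x960, which is below native 1080p,
# so the "looks better than console 4K" argument rests on reconstruction quality.
```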
 

Raysterize

New Member
Joined
Jun 11, 2024
Messages
7 (0.04/day)
Here we go again...

Should be:

5090 - 512-bit 32GB <-- Needed for 4K Max settings in all games with 64GB being overkill.
5080 - 384-bit 24GB <-- 16GB is too little for something that will be around the power of a 4090.
5070 - 256-bit 16GB <-- Sweet spot for mid range.
5060 Ti - 192-bit 12GB <-- Would sell really well.
5060 - 128-bit 8GB <-- 8GB is fine if priced right...
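Those capacity/bus-width pairings aren't arbitrary, by the way: GDDR hangs one chip off each 32-bit channel, so capacity scales directly with bus width. A quick sketch, assuming 2 GB (16 Gbit) chips and no clamshell mounting:

```python
# One GDDR chip per 32-bit channel; assumes 2 GB (16 Gbit) chips, no clamshell.
CHIP_CAPACITY_GB = 2

for bus_width in (512, 384, 256, 192, 128):
    chips = bus_width // 32
    print(f"{bus_width}-bit -> {chips} chips -> {chips * CHIP_CAPACITY_GB} GB")
# 512-bit -> 32 GB, 384-bit -> 24 GB, 256-bit -> 16 GB, 192-bit -> 12 GB, 128-bit -> 8 GB.
# A 64 GB config on a 512-bit bus would need clamshell mounting or 4 GB chips.
```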

And for the people slating AMD: I had the ASUS 7900XTX TUF Gaming OC and it was incredible! Sure, the street lights would flicker when I was 4K gaming, but hey ho...
 
Joined
Jun 27, 2019
Messages
2,109 (1.07/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
4060Ti 16GB is a 1080p card in 2023. I bought one (needed the VRAM buffer for work) and dumped it into the second PC in the living room with a 4K TV. It can barely handle 1440p without performance nosediving because there's simply not enough bandwidth.

If they're going to keep it on a 128-bit bus, GDDR7 is maybe going to turn it into a 1440p card. At 448GB/s it's still 12% less bandwidth on paper than a vanilla 4070 which is okay at 1440p, but that's with lower-latency GDDR6. I'm not 100% sure you can just compare bandwidth between GDDR6 and GDDR7 because latency will have doubled, clock for clock - which means (only a guess here) that the 5060Ti will have 88% the bandwidth of a 4070 but ~50% higher latency. That's going to make it considerably handicapped compared to a 4070 overall, so I guess the rest of it is down to how well they've mitigated that shortcoming with better cache, more cache, and hopefully some lessons learned from the pointlessness of the 4060Ti.
I'm not planning to upgrade my resolution/monitor so I'm fine in that regard. :)
2560x1080 21:9 is somewhere between 1080p and 1440p based on my own testing over the years, and most of the time I'm running out of raw GPU raster performance first when I crank up the settings at this resolution, so I wouldn't exactly mind 12 GB of VRAM either, but 16 is welcome if it's not too overpriced. 'I'm also a constant user of DLSS whenever it's available in a game, so that helps.'
Tbh if the ~mid-range 5000 series fails to deliver in my budget range, then I will just pick up a second-hand 4070 Super and call it a day. 'Plenty enough for my needs.'
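For what it's worth, the pixel-count arithmetic backs up that "between 1080p and 1440p" placement; a quick sketch:

```python
# Total pixel counts: 2560x1080 ultrawide sits between 16:9 1080p and 1440p.
resolutions = {
    "1920x1080 (16:9)": 1920 * 1080,  # ~2.07 MP
    "2560x1080 (21:9)": 2560 * 1080,  # ~2.76 MP
    "2560x1440 (16:9)": 2560 * 1440,  # ~3.69 MP
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP")
# 2560x1080 pushes ~33% more pixels than 1080p but ~25% fewer than 1440p.
```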
 
Joined
Feb 20, 2019
Messages
8,277 (3.94/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
They could fix the latency with cache
Yeah, that's what they said about Ada, and that didn't work - so I'll believe it when I see performance scaling without a huge nosedive!

Maybe a combination of refinements to the cache that they got wrong with Ada and the switch to GDDR7 will be enough. As always, it'll really just come down to what they're charging for it - the 4060 Ti 16GB would have been a fantastic $349 GPU, but that's not what we got...

Tbh if the ~mid-range 5000 series fails to deliver in my budget range, then I will just pick up a second-hand 4070 Super and call it a day. 'Plenty enough for my needs.'
If the major benefits to the 50-series are for AI, the 40-series will remain perfectly good for this generation of games.
 

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,788 (1.50/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) ASUS DirectCU II R9 290 4GB
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually do anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash: the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Your post definitely smells of fanboying :wtf:

Which is so laughable considering AMD has no problem competing with Nvidia's offerings outside of the RTX 4090.

The RX 7900XTX trades blows with the RTX 4080 Super, mostly edging it out
The RX 7900XT beats the RTX 4070 Ti Super
The RX 7900GRE beats the RTX 4070 Super
The RX 7800XT beats the RTX 4070
etc....

All while offering much better prices


[Charts: relative performance at 2560x1440 and 3840x2160]
 
Joined
Dec 12, 2016
Messages
1,834 (0.63/day)
Your post definitely smells of fanboying :wtf:

Which is so laughable considering AMD has no problem competing with Nvidia's offerings outside of the RTX 4090.
Nvidia brand loyalists are fixated on three things:
  • RT
  • DLSS
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Outside of those three things, the GPU market looks very even and competitive, with AMD doing slightly better in performance and price, as you pointed out. But even if all three of my points above didn't exist, these loyalists would still buy Nvidia. But I appreciate you and everyone else doing what you can to push back against the blind fealty to one company that threatens to ruin our DIY PC building market that we love so much.
 
Joined
Oct 6, 2021
Messages
1,605 (1.40/day)
What a monstrous difference from the largest chip to the level below. More than 2x bigger. :')
 
Joined
Feb 23, 2019
Messages
6,062 (2.89/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
You'd be surprised how often I've heard "Aaaaand AMD display driver just crashed" from my buddy rocking a 6600 XT on a new AM5 system while playing the same game online.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,029 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
What a monstrous difference from the largest chip to the level below. More than 2x bigger. :')
4090 wasn't fully enabled, not even close.
5090 probably won't be either.

These 100% enabled die numbers aren't representative of consumer cards, but of Quadro ones.
 
Joined
Aug 2, 2012
Messages
1,986 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
AMD's inability to compete is because no one will buy their chips, even though they are very competitive against Nvidia's offerings. Luckily, you, Assimilator, have just volunteered to buy AMD as your next graphics card to help drive down Nvidia prices. I will join you, and together we will show everyone that the only way to bring about a competitive market is for everyone to stop buying brands and gimmicks and start buying great performance-per-dollar tech regardless of what name is on the box.


I'll either be buying a 9950X3D and a Radeon 8900XTX for my next build, or skip a generation and get Zen 6 and RDNA5. Since AMD is best for gaming in my opinion and will continue to focus equally on gaming and AI, my dollars will continue to go to them until Nvidia stops wasting resources on RT and AI.
Sucks to be you, but Path Tracing is the future of videogame lighting; even AMD will have to optimize for it.
 

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,788 (1.50/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) ASUS DirectCU II R9 290 4GB
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
Nvidia brand loyalists are fixated on three things:
  • RT
  • DLSS
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Outside of those three things, the GPU market looks very even and competitive, with AMD doing slightly better in performance and price, as you pointed out. But even if all three of my points above didn't exist, these loyalists would still buy Nvidia. But I appreciate you and everyone else doing what you can to push back against the blind fealty to one company that threatens to ruin our DIY PC building market that we love so much.
RT still isn't viable, as the performance hit is still too big without DLSS.

DLSS is ok but so is FSR

And yeah, I hear that a lot. Which is funny, because I've used AMD since the HD 4000 days and haven't had driver issues since Hawaii. Which was quite some time ago.
 
Joined
Oct 22, 2014
Messages
14,086 (3.82/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
Joined
Nov 27, 2023
Messages
2,321 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
4090 wasn't fully enabled, not even close.
5090 probably won't be either.

These 100% enabled die numbers aren't representative of consumer cards, but of Quadro ones.
That's actually an important point that people seem to miss. If the chart turns out correct (and that's a big IF), then I would wager that a full GB202 with 64 gigs will be the most expensive pro-card config. Said 64 gigs might not even be GDDR7; we have the precedent of the RTX 6000 Ada using regular GDDR6 instead of 6X. It would be interesting to see if, this time around, the yields will actually be enough to create a fully enabled card. With AD102, there never WAS a full-chip card. And the 4090 was obvious dregs, sold for a ton to consumers.
 
Joined
May 19, 2011
Messages
106 (0.02/day)
Yeah! More stagnation! Let's vote for stagnation!

I don't know what you're looking at, but I'm seeing a slight uptick per tier, with all things increased except the bus width. GDDR7 makes up for part of the deficit though so bandwidth won't be worse than Ada relatively, that's good. But capacity, still 12GB in the midrange and 8GB bottom end? You're saying this is a good thing, now? Ada is already bandwidth constrained at the lower tiers. Nvidia is trying real hard to keep those tiers to what, 1080p gaming?

To each their own, but I think in 2025 people would like to move on from 1080p. The 8GB tier is by then bottom-line useless and relies mostly on cache; the 12GB tier can't ever become a real midrange performance tier for long, and it's worse than the position Ada's 12GB cards are in today in terms of longevity. Sure, they'll be fine today and on release. But they're useless by or around 2026, much like the current crop of Ada 12GB cards.

As for AMD's inability to compete... RT fools & money were parted here. AMD keeps pace just fine in actual gaming and raster perf and is/has been on many occasions cheaper. They compete better than they have done in the past. Customers just buy Nvidia, and if that makes them feel 'screwed over'... yeah... a token of the snowflake generation, that also doesn't vote and then wonders why the world's going to shit.

You can't fix stupidity. Apparently people love to watch in apathy as things escalate into dystopia, spending money as they go and selling off their autonomy one purchase and subscription at a time.

Personally I blame AMD for not being able to compete for so long in terms of perf/watt, software (read: following in nVidia's footsteps), drivers, *compatibility with emerging technologies such as RT and AI especially* (call/cope it how some may), etc… Their recent move of leaving the high end to Nvidia was basically them admitting defeat, and now prices are sky high. The fact of the matter is that integrated graphics makes a dGPU a nonessential part of a system; by that I mean you technically aren't forced to buy one in the same way that you're forced to buy DRAM (especially given that dGPUs are interchangeable, not being locked to a certain vendor like you would be with a CPU socket, for example), so you really have to sell the product on its merits more than anything.

And if anyone thinks AMD are innocent in all this, don't forget: they launched their 7900XTX at $1,000. So they aren't gonna save you either.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,029 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
That's actually an important point that people seem to miss. If the chart turns out correct (and that's a big IF), then I would wager that a full GB202 with 64 gigs will be the most expensive pro-card config. Said 64 gigs might not even be GDDR7; we have the precedent of the RTX 6000 Ada using regular GDDR6 instead of 6X. It would be interesting to see if, this time around, the yields will actually be enough to create a fully enabled card. With AD102, there never WAS a full-chip card. And the 4090 was obvious dregs, sold for a ton to consumers.
Yeah, and the 4090 Ti was likely cancelled because there was no competition for the 4090. With RDNA4 supposedly being 7900XTX performance at 7800XT prices, I doubt the full-die 5090/Ti is needed either.

Why sell 90-100% enabled dies to consumers when you can sell them for 2-3x the price as Quadro cards anyway?
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Personally I blame AMD for not being able to compete for so long in terms of perf/watt, software (read: following in nVidia's footsteps), drivers, *compatibility with emerging technologies such as RT and AI especially* (call/cope it how some may), etc… Their recent move of leaving the high end to Nvidia was basically them admitting defeat, and now prices are sky high. The fact of the matter is that integrated graphics makes a dGPU a nonessential part of a system; by that I mean you technically aren't forced to buy one in the same way that you're forced to buy DRAM (especially given that dGPUs are interchangeable, not being locked to a certain vendor like you would be with a CPU socket, for example), so you really have to sell the product on its merits more than anything.

And if anyone thinks AMD are innocent in all this, don't forget: they launched their 7900XTX at $1,000. So they aren't gonna save you either.
The prices were sky high before 'AMD admitted defeat'. It has had zero impact - Nvidia released SUPER cards with better perf/$ around the same time. Let's also not forget that AMD's RDNA3 price points were too high to begin with, so even their market presence hasn't had any impact on pricing. They happily priced up alongside Nvidia. It wasn't until the 7900GRE and 7800XT that things got somewhat sensible, and competitive versus the EOL RDNA2 offerings, which were also priced high in tandem with Ampere and lowered very late in the cycle.

The real fact is that no matter what AMD has done in the past, their PC discrete share is dropping. They're just not consistent enough, and this echoes in consumer sentiment. It's also clear they've adopted a different strategy and have been betting on different horses for quite a while now.

There is nothing new here with RDNA3 or RDNA4 in terms of market movement. Granted - RDNA3 didn't turn out as expected, but what if it did score higher on raster? Would that change the world?
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Is AMD staring down the barrel? Is this really worse here than the years they were getting by on very low cashflow/margin products, pre-Ryzen? Are we really thinking they will destroy the one division that makes them a unique, synergistic player in the market?
Yes, it has literally never been worse for their GPU division than today. Until Q3 2022, AMD had rarely dropped below 20% market share, and when they did they pulled back above that level within a maximum of 2 quarters... since then they have had 7 consecutive quarters below that threshold. That's nearly 2 years of failing not just to gain, but to hold, market share. That's staring down the barrel.

[Attachment: discrete GPU market share chart]



Your post definitely smells of fanboying :wtf:

Which is so laughable considering AMD has no problem competing with Nvidia's offerings outside of the RTX 4090.

The RX 7900XTX trades blows with the RTX 4080 Super, mostly edging it out
The RX 7900XT beats the RTX 4070 Ti Super
The RX 7900GRE beats the RTX 4070 Super
The RX 7800XT beats the RTX 4070
etc....

All while offering much better prices


Thanks for demonstrating exactly the same failure of understanding that I documented for AMD's marketing department in my post.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yes, it has literally never been worse for their GPU division than today. Until Q3 2022, AMD had rarely dropped below 20% market share, and when they did they pulled back above that level within a maximum of 2 quarters... since then they have had 7 consecutive quarters below that threshold. That's nearly 2 years of failing not just to gain, but to hold, market share. That's staring down the barrel.




Thanks for demonstrating exactly the same failure of understanding that I documented for AMD's marketing department in my post.

If we're looking at trends (I don't deny their share is lowest of all time, mind)...

2015: 18%
2019: 18.8%
2020H2: 18%
2022: 10%
2023Q4: 19%

They've been 'rock bottom' many times before. And if you draw a line over this graph, isn't this just the continuation of the trend of the last decade?

[Attachment: discrete GPU market share trend graph]


Sucks to be you, but Path Tracing is the future of videogame lighting; even AMD will have to optimize for it.
Oh? I must have missed that statement after Cyberpunk ran at sub 30 FPS on a 4090.

I think it mostly sucks for people who expect Path Tracing to be the norm. They're gonna be waiting and getting disappointed for a loooong time. Game graphics haven't stopped moving forward despite Path Tracing. Gonna be fun :)
 
Joined
Sep 15, 2011
Messages
6,721 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Where's the 384-bit model with 24GB GDDR7 though? Seems like a big gap between the top model and the next one down
That's going to be next year's Super Titanium Ultra Max Plus Extreme GPU releases.
Please stand by.
 