
Next Gen GPUs will be even more expensive

Joined
Apr 30, 2020
Messages
999 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200 MHz 32 GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500 GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
It doesn't help that a lot of the AAA releases of 2023 and 2024 ran like total ass at launch, no matter your system. Overclocked 13900KS and 4090 SLI? It didn't matter: Fallen Order and The Last of Us Part I were still stuttery, crashy messes with serious quality and performance issues, to name just two examples.

The fact that stuttering and other problems still happen on top high-end single-card setups basically proves that dual GPUs were never the problem. It's janky game engines that are underdeveloped.

SLI/mGPU could help if developers actually bothered to use it, but instead they take the easy way out: have some AI do the hard part, then just patch DLSS/FSR/XeSS onto the game. Done, and let the driver team do the dirty work for DLSS, because you always need a driver update for DLSS to work, look good, or not flicker.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The fact that stuttering and other problems still happen on top high-end single-card setups basically proves that dual GPUs were never the problem. It's janky game engines that are underdeveloped.

SLI/mGPU could help if developers actually bothered to use it, but instead they take the easy way out: have some AI do the hard part, then just patch DLSS/FSR/XeSS onto the game. Done, and let the driver team do the dirty work for DLSS, because you always need a driver update for DLSS to work, look good, or not flicker.
It's easier to develop for a single high-end GPU with a lot of VRAM than for two or three mediocre ones with less VRAM.
 
Joined
May 10, 2023
Messages
353 (0.60/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
SLI/mGPU could help if developers actually bothered to use it
mGPU development for anything that requires extremely high levels of synchronization is really hard, and totally not worth the effort given how expensive it is nowadays, and it's not even easy to get a platform that allows you to do it to begin with.

Given how game dev nowadays is under extreme crunch with tight deadlines, which leads to awful quality, I really don't see studios caring about mGPU stuff at all.
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
It's easier to develop for a single high-end GPU with a lot of VRAM than for two or three mediocre ones with less VRAM.

I still miss SLI... CrossFire not so much, although I only CrossFired 7970s and a couple of 290Xs, maybe the HD 6970. It was better before that, but I didn't have the $$$ to do it back then... I loved 680 SLI / 970 SLI / 1080 SLI, but as soon as DX12 took over, not only did we get more games that stutter, but mGPU died... I'd much rather try to figure out frametimes than what we get now with shader compilation stutter and traversal stutter.
 
Joined
Apr 30, 2020
Messages
999 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200 MHz 32 GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500 GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
I still miss SLI... CrossFire not so much, although I only CrossFired 7970s and a couple of 290Xs, maybe the HD 6970. It was better before that, but I didn't have the $$$ to do it back then... I loved 680 SLI / 970 SLI / 1080 SLI, but as soon as DX12 took over, not only did we get more games that stutter, but mGPU died... I'd much rather try to figure out frametimes than what we get now with shader compilation stutter and traversal stutter.

Thanks for addressing the problem I was pointing out.

They can't fix the shader compilation stutter and traversal stutter, not with the new rendering approach and TAA. It's already been 8 to 9 years since DX12 came out; it launched around the end of 2015 to the beginning of 2016. A stutter problem like this would've been fixed already if it were possible to fix.
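For context, the mechanism behind shader compilation stutter is simple to sketch. Here's a minimal, generic C++ illustration, not any engine's real code: create_pipeline() is a hypothetical stand-in for the actual driver compile call (vkCreateGraphicsPipelines / CreateGraphicsPipelineState), which can take tens of milliseconds per pipeline.

```cpp
#include <chrono>
#include <future>
#include <mutex>
#include <string>
#include <thread>
#include <unordered_map>
#include <vector>

struct Pipeline { /* compiled GPU state */ };

// Hypothetical stand-in for the real driver compile; sleeps to simulate
// the tens of milliseconds a pipeline compile can take.
Pipeline create_pipeline(const std::string&) {
    std::this_thread::sleep_for(std::chrono::milliseconds(30));
    return {};
}

std::unordered_map<std::string, Pipeline> cache;
std::mutex cache_mutex;

// Pattern A: compile at first use. The first frame that needs a new
// material blocks the render thread for ~30 ms: a visible hitch.
const Pipeline& get_pipeline_lazy(const std::string& key) {
    std::lock_guard<std::mutex> lock(cache_mutex);
    auto it = cache.find(key);
    if (it == cache.end())
        it = cache.emplace(key, create_pipeline(key)).first; // the "stutter"
    return it->second;
}

// Pattern B: precompile every known pipeline on worker threads during a
// loading screen, so draws only ever hit the cache. This is what the
// "compiling shaders..." screens in recent titles are doing.
void precompile(const std::vector<std::string>& keys) {
    std::vector<std::future<void>> jobs;
    for (const auto& k : keys)
        jobs.push_back(std::async(std::launch::async, [&k] {
            Pipeline p = create_pipeline(k);
            std::lock_guard<std::mutex> lock(cache_mutex);
            cache.emplace(k, std::move(p));
        }));
    for (auto& j : jobs) j.wait();
}

int main() {
    precompile({"opaque_pbr", "skinned_pbr", "particles_additive"});
    (void)get_pipeline_lazy("opaque_pbr"); // cache hit: no hitch
    return 0;
}
```

The hard part in practice is Pattern B's input: on PC, the exact set of pipelines depends on hardware, driver, and settings, so engines can't always enumerate "every known pipeline" up front, which is part of why the problem keeps coming back.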

It's funny that developers were the ones who wanted a more "low-level API" like DX12. Yet they can't even use it, because it requires too much knowledge of coding, recoding, redesigning and recompiling, and too much time to do properly, which developers didn't need with the older DX11. DX11 had a lot of the work done for developers by compilers, and a lot of those compilers don't work on DX12.

I don't believe it's harder to code for multiple GPUs now either; the newer APIs were designed around the idea, especially since Vulkan uses a similar method to DX12 for multi-GPU. It's pretty simple to add support for it on newer Vulkan builds. The trouble is when game developers purposely remove it from their engines or don't bother to update their Vulkan base code for it. Vulkan so far is better, but it's barely used by any developers, as are the tools that do a lot of the work.
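For what it's worth, the entry point really is small. A minimal sketch of Vulkan's explicit multi-GPU path (device groups, core since Vulkan 1.1), showing only the discovery step; all the real work of splitting frames, memory, and synchronization across GPUs comes after this and is on the engine:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Device groups are core in Vulkan 1.1, so request that API version.
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    // Enumerate physical device groups; a group with 2+ devices is what
    // an SLI/CrossFire-style linked setup shows up as.
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        count, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

    for (uint32_t i = 0; i < count; ++i)
        printf("group %u: %u physical device(s)\n",
               i, groups[i].physicalDeviceCount);

    // To actually use a multi-GPU group, you chain a
    // VkDeviceGroupDeviceCreateInfo into vkCreateDevice, then mask every
    // command and allocation to specific GPUs yourself.
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

Whether that counts as "pretty simple" is the real debate: the enumeration is trivial, but everything after it is where the decade of SLI pain lived.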

I mean, we're almost 10 years into DX12 and we've got fewer than 500 DX12 games. DX11 had 700 games within two years.

The rate difference: DX11 was putting out around 349 games a year versus DX12's current 49 games a year; that's around seven times as many for DX11.

"Crunch time" for developers is controlled by release dates that publisher put forth. A lot of them have unrealist deadlines. Then theres the adveriser pushing hype for games that gamers get too eger for. They complain about ir taking too long game gets a release date then maybe a bug found get pushed bacl gamer get mad. Pubisher feels heat by community complaints about game. Game gets slightly pushed hsrd to release.

In the end Dx12x is a big stinking pile of M$ broken promisies.

DXR does not help it either, since most general consumers assume raytracing = RTX. They have no clue thats a nvidia branding, nore any idea what DXR means.

Being stuck with single card as the only choice. Means they can charge what ever they want for the top tier card. You have no choice but to biy ot if you the maxium. ( I remeber someone on xtreme systems forum 10 to 15 years ago saying that it got to single card gpus only nvidia would charge $1,200 for the gpu. Here we are today talking about a gpu [RTX 5090] for $2,000 to $2,500)

Losing choices is never a good thing.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I still miss SLI... CrossFire not so much, although I only CrossFired 7970s and a couple of 290Xs, maybe the HD 6970. It was better before that, but I didn't have the $$$ to do it back then... I loved 680 SLI / 970 SLI / 1080 SLI, but as soon as DX12 took over, not only did we get more games that stutter, but mGPU died... I'd much rather try to figure out frametimes than what we get now with shader compilation stutter and traversal stutter.
I wanted to CrossFire an HD 7770, but I got a 7870 cheaper. Then I wanted to CrossFire that, but I got a 7970 cheaper. SLI/CF was never worth the effort. Performance scaling never came close to 100%, and you were stuck with low VRAM for double the money.
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Holding onto my 3080; I'm speculating the 5000 series will be even sillier money, which will in turn raise the value of the used market. I remember getting a nice amount for my 1080 Ti during the price-gouging season.
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I wanted to CrossFire an HD 7770, but I got a 7870 cheaper. Then I wanted to CrossFire that, but I got a 7970 cheaper. SLI/CF was never worth the effort. Performance scaling never came close to 100%, and you were stuck with low VRAM for double the money.

My issue with it is that SLI was better and easier to try different SLI profiles with... Kinda reminds me of DLSS, honestly, with swapping in different DLL versions till you find the one that's best for a specific game.
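The swap itself is just a file copy with a backup. A hypothetical sketch; the paths are made up, nvngx_dlss.dll is the usual file name but its location varies per game, and some titles or anti-cheats will object:

```cpp
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main() {
    // Assumed locations for illustration only.
    fs::path game   = "C:/Games/SomeGame";
    fs::path dll    = game / "nvngx_dlss.dll";             // the game's bundled DLSS
    fs::path backup = game / "nvngx_dlss.dll.bak";
    fs::path trial  = "C:/Downloads/nvngx_dlss_trial.dll"; // version to try

    std::error_code ec;
    if (!fs::exists(backup))
        fs::copy_file(dll, backup, ec);  // keep the original so you can revert
    if (!ec)
        fs::copy_file(trial, dll, fs::copy_options::overwrite_existing, ec);
    std::cout << (ec ? "swap failed\n" : "swapped; relaunch the game\n");
    return 0;
}
```

Tools like DLSS Swapper automate exactly this loop.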

In the games I played back then, of course. It was really good in Witcher 2/3... I think people forget that above 1080p was hard AF back in 2012. I was able to play with ubersampling in Witcher 2 on SLI 680s and it looked great.

Obviously terrible value lol but still fun to mess around with.

I will say this though: even when I SLI'd 680s and 1080s, a whole PC build would still run under $2,000 during those generations; now one GPU can cost almost that, lol. Probably $2,200 during the GTX 1080 generation, but still...
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
You never know what AMD might pull out of their sleeve; don't lose hope yet.

What? Misdirection is a thing, you know...
"You thought our 5090 would be $3000!!!!!! wrong!!! buy now at the great value of $2500."
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
"You thought our 5090 would be $3000!!!!!! wrong!!! buy now at the great value of $2500."

While a $2,500 MSRP wouldn't surprise me, I still think $2k is the limit, unless it just destroys AI workloads; but if that were the case, they'd just charge $3k for it...

I'm honestly super curious where it all falls, and whether, if it is super expensive, it will even matter with Nvidia earning so much outside of gaming.

The Titan RTX was $2,500, the Titan V was $3k, and the Titan Z was $3k, so none of these prices are unprecedented...
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
While a $2,500 MSRP wouldn't surprise me, I still think $2k is the limit, unless it just destroys AI workloads; but if that were the case, they'd just charge $3k for it...

I'm honestly super curious where it all falls, and whether, if it is super expensive, it will even matter with Nvidia earning so much outside of gaming.

The Titan RTX was $2,500, the Titan V was $3k, and the Titan Z was $3k, so none of these prices are unprecedented...
There was a HUB video, one of those where they answer questions from viewers. The monitor guy commented that he thought prices were too high, but the main guy defended Nvidia on the basis that they could easily sell the same chip for much more in their other markets, kind of indicating we should be grateful for what we get.

I can see the 5090 being more expensive, assuming it's the flagship at launch; the other cards feel more like a coin flip, harder to predict.
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
There was a HUB video, one of those where they answer questions from viewers. The monitor guy commented that he thought prices were too high, but the main guy defended Nvidia on the basis that they could easily sell the same chip for much more in their other markets, kind of indicating we should be grateful for what we get.

I can see the 5090 being more expensive, assuming it's the flagship at launch; the other cards feel more like a coin flip, harder to predict.

Obviously a bucketful of salt needs to be taken, but it sounds like everything below the 5090 will be insanely cut down, with nothing significantly better than its predecessor...

I think what he means is that the actual silicon is worth significantly more outside of the gaming market. It's probably why AMD dropped out of the high-end GPU race, according to rumors: why waste silicon on a big die when they can sell chiplets or MI GPUs for substantially more...

This is just the reality regardless of how much it sucks for us gamers.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
My issue with it is that SLI was better and easier to try different SLI profiles with... Kinda reminds me of DLSS, honestly, with swapping in different DLL versions till you find the one that's best for a specific game.
It reminds me of DLSS, too, and RT as well in fact... in the way that "Nvidia can do it better than AMD, but it's still kind of crap".

In the games I played back then, of course. It was really good in Witcher 2/3... I think people forget that above 1080p was hard AF back in 2012. I was able to play with ubersampling in Witcher 2 on SLI 680s and it looked great.
I don't know a single person who played above 1080p back then. Luxury resolutions require luxury GPUs. It was SLi/CF back then, and it's the x80-90 series now.

Obviously terrible value lol but still fun to mess around with.
That I can't deny. :)
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
It reminds me of DLSS, too, and RT as well in fact... in the way that "Nvidia can do it better than AMD, but it's still kind of crap".

I will say this though: it's come an insanely long way since Battlefield V, which was the first shipping game with DLSS and RT. If you compare that to something like Alan Wake 2, it's an insanely large gap. The problem is that 5-6 years later, only one GPU can really do it, and just barely. The 4080/4080 Super can do it at 1080p, so I guess that counts, lol, so let's say two...

Still, even though more and more games are using RT well, by the time developers have mastered it we'll switch to path tracing and be back to square one performance-wise. At least until Nvidia figures out how to fake it even better, lol.

Regardless, I hope all these technologies get better and better, and that AMD figures this shit out. As for being able to afford it, I guess only time will tell.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,982 (2.96/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
There's nothing new here; practically, the 10 series was the last reasonably priced series. Well, OK, if you managed to get a 3080 at MSRP, that had awesome bang for the buck too.


5090 won't be less than 2k. My opinion. I actually think they will price it between 2k and 2.5k. Ain't no way in hell they're not going to milk that. It's not coming out until Feb or March of '25, methinks.
I'm honestly surprised they haven't released a 4090 Ti (or even a 4080 Ti) to milk even more from the current gen.

I don't know a single person who played above 1080p back then. Luxury resolutions require luxury GPUs. It was SLi/CF back then, and it's the x80-90 series now.
I didn't personally know anyone either, but 2560x1600 monitors had been a thing for enthusiasts since the late 2000s.
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I'm honestly surprised they haven't released a 4090 Ti (or even a 4080 Ti) to milk even more from the current gen.


This is the closest die configuration to a non-existent 4090 Ti, but why sell it to gamers when you can sell it for 2-3x or even more to data centers?

They've actually never made a card that uses the full die, AFAIK, even for enterprise, etc.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,982 (2.96/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis

This is the closest die configuration to a non-existent 4090 Ti, but why sell it to gamers when you can sell it for 2-3x or even more to data centers?

They've actually never made a card that uses the full die, AFAIK, even for enterprise, etc.
Interesting. Is AD102 such a complex chip that the yields just don't allow a full chip to be manufactured without disabling units from it?
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Interesting. Is AD102 such a complex chip that the yields just don't allow a full chip to be manufactured without disabling units from it?

It's definitely a big, complex die, but I have no idea.
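A back-of-the-envelope answer using the classic Poisson yield model, P(zero defects) = exp(-D x A). The defect density below is an assumption for illustration; foundries don't publish exact numbers, and real binning is more nuanced than "any defect kills a full die":

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double die_area_cm2    = 6.09; // AD102 is ~609 mm^2
    const double defects_per_cm2 = 0.08; // assumed N5-class defect density

    // Fraction of dies with zero defects, i.e. candidates for a
    // fully enabled part (the hypothetical 4090 Ti).
    const double perfect = std::exp(-defects_per_cm2 * die_area_cm2);
    printf("defect-free dies: %.1f%%\n", perfect * 100.0);

    // ~61% under these assumptions. The other ~39% can still sell with
    // units fused off: the 4090 enables 128 of 144 SMs, so nearly every
    // die bins as a 4090 or an enterprise part, while a full-die SKU
    // would compete for the small defect-free slice.
    return 0;
}
```

So yields alone probably don't forbid a full AD102 SKU; it's more that the defect-free dies are worth more elsewhere, which matches the data-center point above.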
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Thanks for addressing the problem I was pointing out.

They can't fix the shader compilation stutter and traversal stutter, not with the new rendering approach and TAA. It's already been 8 to 9 years since DX12 came out; it launched around the end of 2015 to the beginning of 2016. A stutter problem like this would've been fixed already if it were possible to fix.

It's funny that developers were the ones who wanted a more "low-level API" like DX12. Yet they can't even use it, because it requires too much knowledge of coding, recoding, redesigning and recompiling, and too much time to do properly, which developers didn't need with the older DX11. DX11 had a lot of the work done for developers by compilers, and a lot of those compilers don't work on DX12.

I don't believe it's harder to code for multiple GPUs now either; the newer APIs were designed around the idea, especially since Vulkan uses a similar method to DX12 for multi-GPU. It's pretty simple to add support for it on newer Vulkan builds. The trouble is when game developers purposely remove it from their engines or don't bother to update their Vulkan base code for it. Vulkan so far is better, but it's barely used by any developers, as are the tools that do a lot of the work.

I mean, we're almost 10 years into DX12 and we've got fewer than 500 DX12 games. DX11 had 700 games within two years.

The rate difference: DX11 was putting out around 349 games a year versus DX12's current 49 games a year; that's around seven times as many for DX11.

"Crunch time" for developers is driven by the release dates publishers put forth, and a lot of them are unrealistic deadlines. Then there's the advertising pushing hype for games that gamers get too eager for. Gamers complain about a game taking too long, the game gets a release date, then maybe a bug is found and it gets pushed back, gamers get mad, the publisher feels the heat from community complaints, and the game gets pushed hard to release anyway.

In the end, DX12 is a big stinking pile of M$ broken promises.

DXR doesn't help either, since most general consumers assume ray tracing = RTX. They have no clue that's Nvidia branding, nor any idea what DXR means.

Being stuck with a single card as the only choice means they can charge whatever they want for the top-tier card. You have no choice but to buy it if you want the maximum. (I remember someone on the XtremeSystems forum 10 to 15 years ago saying that if it got to single-card GPUs only, Nvidia would charge $1,200 for the GPU. Here we are today talking about a GPU [RTX 5090] for $2,000 to $2,500.)

Losing choices is never a good thing.
Sure they can: Black Myth: Wukong loses it entirely after the first couple of hours played. It's smooth, like, buttery, even at 50 FPS. Similarly, Darktide (UE4) has it mostly fixed at this point; mostly, because it's not really possible to determine where the very rare occasional stutter that's left actually comes from, but it doesn't feel like traversal stutter, and the rare stutter you still encounter never detracts from gameplay anymore. Many games have similar stuff going on.

On the second point, I think we're actually seeing that since DX12, CPU bottlenecking has been virtually alleviated: games use more threads. Also, it's important to keep in mind we're still living in a DX11 world in terms of GPU market share. Games are still being developed with DX11 modes, so it's not surprising the DX12 API isn't always fully explored. This happens not because devs don't want to move away from DX11; it's simply because studios want to sell games, and excluding everyone without a DX12-capable GPU isn't exactly helping your sales.
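A rough sketch of that threading point; record_chunk() is a hypothetical stand-in for filling a real ID3D12GraphicsCommandList or VkCommandBuffer, which is the work DX11 drivers mostly had to serialize on one thread:

```cpp
#include <thread>
#include <vector>

struct CommandList { /* recorded GPU commands */ };

// Hypothetical stand-in: record draw calls for one slice of the scene.
CommandList record_chunk(int /*first_object*/, int /*count*/) {
    return {};
}

int main() {
    const int objects = 10000, workers = 8;
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;

    // DX12/Vulkan: each worker records its slice in parallel, then the
    // engine submits all lists in one go. DX11 offered deferred contexts,
    // but in practice drivers still serialized most of this.
    for (int w = 0; w < workers; ++w)
        pool.emplace_back([&lists, w, objects, workers] {
            lists[w] = record_chunk(w * (objects / workers), objects / workers);
        });
    for (auto& t : pool) t.join();
    // submit(lists): one queue submission containing all recorded work
    return 0;
}
```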

Third point... we have over a decade of living proof that it's definitely harder to code well for mGPU / CrossFire / SLI, for the simple fact that you are adding complexity to the pipeline. Frames must be sent to one GPU or the other, or both, so there is overhead that you do not have with a single GPU; plus, all of this extra overhead leans on VRAM data transport, which is a primary focus for actually making GPUs faster by using it better, and VRAM bandwidth is expensive. These are simple facts, and these facts add milliseconds of latency to everything you do. This is why all SLI setups could spit out fantastic FPS, but their frametimes were (virtually) always all over the place: you literally feel the impact of that extra overhead on every frame. This problem was never fixed, and where it was, it always came at an immense scaling performance cost, effectively killing the advantage of mGPU on its own.
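To put numbers on the frametime point, a toy model (all values made up) of two-GPU alternate-frame rendering: average FPS nearly doubles, but frames arrive in uneven short/long pairs, the classic SLI micro-stutter:

```cpp
#include <cstdio>

int main() {
    const double render_ms   = 20.0; // one GPU needs 20 ms per frame (50 FPS)
    const double transfer_ms = 3.0;  // assumed cross-GPU copy/sync cost (AFR)
    const double offset_ms   = 6.0;  // imperfect driver metering of GPU1's start

    // GPU0 finishes frames at 23, 46, 69, ... ms; GPU1 at 29, 52, 75, ... ms.
    double t0 = render_ms + transfer_ms;
    double t1 = offset_ms + render_ms + transfer_ms;
    double prev = 0.0;
    for (int frame = 0; frame < 8; ++frame) {
        const bool even = (frame % 2 == 0);
        const double t = even ? t0 : t1;
        printf("frame %d at %5.1f ms (gap %4.1f ms)\n", frame, t, t - prev);
        prev = t;
        (even ? t0 : t1) += render_ms + transfer_ms;
    }
    // Gaps alternate ~6 ms / ~17 ms: the average says ~87 FPS, but the
    // delivery cadence alternates between ~167 FPS and ~59 FPS pacing.
    return 0;
}
```

Driver-side frame metering later smoothed this by delaying presents, at the cost of added latency.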

It's definitely a big, complex die, but I have no idea.
They can make it, no doubt in my mind, because they make larger GPUs too. The only/primary reason Nvidia positions its cards the way it does is that it wants to maximize profits. A combination of yield, margin, 'market demand', and their overall strategy determines what we get. Everything is just a little knob they can turn to tweak their offering AND prepare customers for the next round of GPUs. Nvidia has learned, and has the position, to look beyond the current gen, and has had it for quite a while now. That's something they sucked at during the Kepler / refresh stage up until Maxwell, when they were just focused on developing faster chips than AMD and it was a close call every time. I think Pascal was the first gen where Nvidia was confident it had market dominance, and acted upon it: they took direct sales into their own hands with the Founders Editions, and have been elevating price points per tier ever since. The cadence between generations slowed down considerably, too, and they started pushing RTX. All of those moves signal an Nvidia that has gained lots of wiggle room to plan ahead and 'be safe' doing so.

It's also part of the reason why AMD's GPU gen cadence is still messy: they're reactive, not proactive.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,982 (2.96/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
This is why all SLI setups could spit out fantastic FPS, but their frametimes were (virtually) always all over the place: you literally feel the impact of that extra overhead on every frame. This problem was never fixed, and where it was, it always came at an immense scaling performance cost, effectively killing the advantage of mGPU on its own.
I remember when I switched from R9 290 CrossFire to a 980 Ti: the pure FPS was lower, but games felt way smoother because there were no mGPU drawbacks anymore.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I will say this though: it's come an insanely long way since Battlefield V, which was the first shipping game with DLSS and RT. If you compare that to something like Alan Wake 2, it's an insanely large gap. The problem is that 5-6 years later, only one GPU can really do it, and just barely. The 4080/4080 Super can do it at 1080p, so I guess that counts, lol, so let's say two...

Still, even though more and more games are using RT well, by the time developers have mastered it we'll switch to path tracing and be back to square one performance-wise. At least until Nvidia figures out how to fake it even better, lol.

Regardless, I hope all these technologies get better and better, and that AMD figures this shit out. As for being able to afford it, I guess only time will tell.
I think the real progress in terms of RT capabilities is what we've seen recently with shader based / in-engine RT. The form of RT that consists of an Nvidia-only toggle to get an extra layer of sauce is inevitably going to die, even just for the simple fact that consoles can't do that and never will. Even if Nvidia owns the ENTIRE PC GPU market, it ain't gonna happen. Devs design games console-first these days, and the PC-first games (strategy, mouse-friendly games, isometric/higher complexity, etc.) generally don't benefit from RT at all. Indie devs ain't got time for this, and I think there's no real demand from that corner either.

Black Myth: Wukong is a good example. There is absolutely no need to have an Nvidia card for that game to look and convey itself the way it wants to. Additionally, the gap between PT and not having it is simply too small for its performance hit. It ain't gonna fly; it's a temporary thing.

I remember when I switched from R9 290 CrossFire to a 980 Ti: the pure FPS was lower, but games felt way smoother because there were no mGPU drawbacks anymore.
Same: I switched from 2x GTX 660 to a single 770, about 65-75% of the net performance, but everything played miles better. Ironically, it felt like a 2x perf upgrade.
 
Joined
Aug 12, 2019
Messages
2,248 (1.15/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i
Memory 4x8 GB 3200 MHz Corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500 GB Samsung 970 Evo Plus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
Joined
Aug 20, 2007
Messages
21,542 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
mGPU development for anything that requires extremely high levels of synchronization is really hard, and totally not worth the effort given how expensive it is nowadays, and it's not even easy to get a platform that allows you to do it to begin with.

Given how game dev nowadays is under extreme crunch with tight deadlines, which leads to awful quality, I really don't see studios caring about mGPU stuff at all.
No you don't understand, devs are just lazy! /s

It's funny that developers were the ones who wanted a more "low-level API" like DX12.
lolwut. Hell no. That was never really asked for, but Mantle kind of thrust its benefits into the limelight. It's extra work that most devs don't want nor can benefit from, and I told all of you this when it came out. Mainly a few select AAA-budget-type titles have benefited.

You clearly have never tried to make a game in your life so why not hold the judgments a bit?
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I think the real progress in terms of RT capabilities is what we've seen recently with shader based / in-engine RT. The form of RT that consists of an Nvidia-only toggle to get an extra layer of sauce is inevitably going to die, even just for the simple fact that consoles can't do that and never will. Even if Nvidia owns the ENTIRE PC GPU market, it ain't gonna happen. Devs design games console-first these days, and the PC-first games (strategy, mouse-friendly games, isometric/higher complexity, etc.) generally don't benefit from RT at all. Indie devs ain't got time for this, and I think there's no real demand from that corner either.

Black Myth: Wukong is a good example. There is absolutely no need to have an Nvidia card for that game to look and convey itself the way it wants to. Additionally, the gap between PT and not having it is simply too small for its performance hit. It ain't gonna fly; it's a temporary thing.

While I like the shader-based solutions in UE5/Snowdrop, they're still not very good at a lot of things: reflections are still pretty terrible, and the lighting still easily breaks. Hardware Lumen is significantly better, but still significantly behind path tracing.

If they can substantially improve it, sure, but it likely only exists for weak console hardware to begin with, and for AMD's current-generation GPUs that just don't handle full RT whatsoever.

Even BMW, with its only-okay RT implementation, looks substantially better with hardware RT, especially water and shadows; it just breaks way less.

Even Hellblade 2, which I still think looks better than BMW, would greatly benefit from at least hardware RT reflections, which are pretty terrible with the Lumen pass.

Beyond that, hardware is all moving towards handling real-time RT better. That seems to be the primary focus with the PS5 Pro/RDNA 4 and, looking at leaked slides, next-generation consoles. The shader solution is just a half-step band-aid till AMD gets its shit together, IMHO.
 
Joined
Mar 7, 2023
Messages
919 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
"You thought our 5090 would be $3000!!!!!! wrong!!! buy now at the great value of $2500."
AMD, I said, lol. I don't think anybody doubts the 5090 will be expensive. I think there's hope for the lower-tier cards; we'll just have to wait and see. I won't be buying either way. I'm using this card till it dies.
 