
AMD's Reviewers Guide for the Ryzen 9 7950X3D Leaks

Joined
Jun 14, 2020
Messages
3,460 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
4K is 4x the pixels of 1080p, so one generation's uplift is much smaller than the gap between 4K and 1080p FPS. An RTX 3090 averages 88 FPS across TechPowerUp's suite of games at 4K; at 1080p it gets 181 FPS, 105% more. A 2080 Ti goes from 59 FPS at 4K to 130 FPS at 1080p, 120% more.

With an RTX 4090 you get 144 FPS at 4K but only 74% more at 1080p. That absurdly high 251 FPS is limited by system latency rather than CPU performance, though, so we can be sure the actual uplift at 1080p over 4K is more than 100%.

The generational uplift from RTX 3090 to 4090 only gives you 63% at 4K and 40% at 1080p, and going from the 2080 Ti to the 3090 only gave you a 49% uplift at 4K and 39% at 1080p.
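To make the comparison concrete, here is a quick sketch using the averaged FPS figures quoted above; the uplifts are just (new / old - 1) x 100, and small rounding differences from the percentages in the text are expected:

```python
# Average FPS figures quoted above (TechPowerUp game suite)
fps = {
    "2080 Ti": {"4K": 59, "1080p": 130},
    "3090": {"4K": 88, "1080p": 181},
    "4090": {"4K": 144, "1080p": 251},
}

def uplift(old, new):
    """Percentage FPS gain going from old to new, rounded."""
    return round((new / old - 1) * 100)

# Dropping resolution on the same card: huge jumps
print(uplift(fps["2080 Ti"]["4K"], fps["2080 Ti"]["1080p"]))  # 120
print(uplift(fps["3090"]["4K"], fps["3090"]["1080p"]))        # 106

# Upgrading the card at the same resolution: much smaller
print(uplift(fps["3090"]["4K"], fps["4090"]["4K"]))           # 64
print(uplift(fps["3090"]["1080p"], fps["4090"]["1080p"]))     # 39
```

The point of the comparison: a resolution drop on a fixed card roughly doubles FPS, while a full GPU generation at a fixed resolution gains you well under half that.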

So you really have to skip a GPU generation and keep the same CPU for four years, and even then the gap is not as large as comparing 4K numbers to 1080p.

720p is of course even more absurd; that's like planning to use the same CPU for more than eight years.

These "let's get the GPU bottleneck out of the equation" numbers are very good for selling CPUs, but the theoretical gains at low resolution have very little effect on what you'll actually get from the product.

But it's good for sales, good for creating FOMO in buyers.

Showing the actual FPS you'd get from a cheap $250 CPU versus a very expensive $700 one at 4K, perhaps with a more down-to-earth GPU like the majority of gamers use, paints a very, very different picture.
With all of that said, my 12900K with max-tuned DDR5 at 7200c30 is bottlenecking my 4090 at the settings I actually play at, that is, 4K DLSS Q + RT. It majorly bottlenecks in Hogwarts, for example, and it gets really close in some areas of Cyberpunk.
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
Unfortunately, all CPU reviews for gaming are flawed, because a massive sector of games with hundreds of thousands of concurrent players gets totally ignored. Until Civ 6 turn time gets tested, Stellaris/HOI4 and other grand strategy simulation rates get tested, Cities: Skylines simulation rate gets tested, etc., there is a huge hole in the data.

What is the point in testing older AAA games that get at most 20-30% of the concurrent players of Civ 6, for example? And what is the point in testing Civ 6 FPS? The damn game is basically a complicated board game on the PC; 60 FPS is absolutely fine, and the turn time makes a far bigger difference to playability. Where is the Stellaris or HOI4 data? All Paradox grand strategy games run on the same engine, so testing one would indicate performance for all of them, and there are over 100K concurrent players across those titles on Steam alone. Sure, Hogwarts might be at 300K concurrent players at the moment, but give it a month and those Paradox games will still be at 100K or so while Hogwarts will have dropped massively.

Unless a game is there because it has a specific engine you want tested, or because it uses an API you want to test, surely the games tested should cover as broad an array of game types as possible to be considered complete.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,214 (4.66/day)
Location
Kepler-186f
Until Civ 6 turn time gets tested,

Can't remember who does this, but it's either PCGamer, Guru3D, or GamersNexus. I know one of those tests Civ 6 turn time; I remember coming across it before in CPU reviews.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
With all of that said, my 12900K with max-tuned DDR5 at 7200c30 is bottlenecking my 4090 at the settings I actually play at, that is, 4K DLSS Q + RT. It majorly bottlenecks in Hogwarts, for example, and it gets really close in some areas of Cyberpunk.
You cannot use Hogwarts (or CP2077) as benchmarks for bottlenecks: they are just poorly optimized games.
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
can't remember who does this, but its either pcgamer, guru3d, or gamersnexus. I know one of those tests civ 6 turn time. i remember coming across it before on cpu reviews.

GN used to but they stopped in recent reviews.

HUB do Factorio but not sure if they include it in the final average performance comparison.
 
Joined
Jun 14, 2020
Messages
3,460 (2.13/day)
You cannot use Hogwarts (and CP2077) as a benchmark for bottleneck: they are just poorly optimized games.
CP is anything but a poorly optimized game. That's so far from the truth that whoever says something like that has no idea what they're talking about.
 
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
CP is anything but a poorly optimized game. It's so far from the truth whoever says something like that has no idea what he is talking about.

Indeed. It scales really well across different hardware, and it has great multi-threading support. You can get over 200 FPS at 1080p on a 4090, and DDR5 has a significant impact as well. The game is very demanding, but it looks amazing.

Good optimization does not mean low system requirements. It means being able to utilize the hardware properly and scale accordingly.

Hogwarts Legacy definitely has optimization problems, especially on the CPU side (it doesn't even fully utilize 4 cores). VRAM usage is also too high. The graphics don't justify the requirements.
Forspoken is even worse. That's a completely messed up game, on both platforms.
 
Joined
Feb 22, 2017
Messages
26 (0.01/day)
The generational uplift from RTX 3090 to 4090 only gives you 63% at 4K and 40% at 1080p, and going from the 2080 Ti to the 3090 only gave you a 49% uplift at 4K and 39% at 1080p.

So by that math, one generational uplift is roughly from 1080p to 1440p.

Many like myself run a 3440x1440 ultrawide. From https://www.gpucheck.com/gpu/nvidia-geforce-rtx-4090/, we know ultrawide framerates are closer to 1440p than to 4K. Some others I know just pair a high-end card with 1440p due to crappy Windows DPI scaling, etc. For all those cases, the 1080p numbers here will be a good indicator once the GPU is upgraded to an RTX 5090, which makes perfect sense: GPU every 2 years, CPU every 4 years.
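As a rough sanity check, comparing raw pixel counts shows why 3440x1440 lands closer to 1440p than to 4K (pixel count is only a proxy for GPU load, but it tracks reasonably well):

```python
# Pixel counts of common gaming resolutions, relative to 1080p
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "UW (3440x1440)": 3440 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}

base = 1920 * 1080
for name, px in resolutions.items():
    print(f"{name}: {px / base:.2f}x 1080p")
# 1080p: 1.00x, 1440p: 1.78x, ultrawide: 2.39x, 4K: 4.00x
```

At 2.39x 1080p's pixels, ultrawide sits 0.61x above 1440p but 1.61x below 4K, so its GPU load is much closer to the 1440p case.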

So you really have to skip a generation of GPU and use the same CPU after 4 years

And what's wrong with that? GPU every 4 years, CPU every 8 years. Also works for many. So 1080p could indicate 4k performance on RTX 6090.

See? Just because you don't need the numbers, doesn't mean others don't. Many really don't upgrade CPUs that often.
 