
AMD's Reviewers Guide for the Ryzen 9 7950X3D Leaks

Joined
Sep 4, 2022
Messages
308 (0.38/day)
why only 1080p?
For marketing purposes.
Also, if anyone is wondering why CPUs are tested at 1080p and not at lower resolutions: 1080p is the practical minimum resolution still played by the majority of gamers, and the results can be extrapolated to future GPU upgrade paths, especially at higher resolutions. If CPU A can hit 240 FPS at 1080p with GPU B in title C, then that CPU should be able to hit 240 FPS at a higher resolution with a successor to GPU B in the same title, IMO. It's not practical to test at lower resolutions, although I've seen it done. I'd bet the majority of readers here who are interested in the Zen 4 3D CPUs play at 1440p or 4K; a niche group for a niche product. Those playing at 1440p or 4K couldn't care less about the delta at resolutions below 1080p; many just want to know what the 0.1% lows and the frame-variance graph look like.
Lastly, if you invest in AM5 and have a DDR5-6000 CL30 kit, you may not have to upgrade the memory even for a Zen 6 3D upgrade. Extrapolating from the 5800X3D with DDR4-3600 CL14, which is still very competitive with current CPUs on DDR5, I believe we'll see the same with future AM5 CPUs without constantly upgrading RAM.
 
Joined
Apr 29, 2018
Messages
129 (0.05/day)
Any sense of rationality is completely gone from your comment; just look at the previous one:


This is not only nonsensical, it's blatantly false. Benchmarking real workloads at realistic settings absolutely gives you a lot of information, as it tells the reader how the contending products will perform in real life.


Your example is nonsense, as you wouldn't see that kind of performance discrepancy unless you are comparing a low-end CPU to a high-end CPU when benchmarking a large sample of games. And trying to use a cherry-picked game or two to showcase a "bottleneck" is ridiculous, as there is no reason to assume your selection has the characteristics of future games coming down the line.

The best approach is a large selection of games at realistic settings; look at the overall trend while eliminating the outliers. That will give you a better prediction of which product is the better long-term investment.
You clearly lack any comprehension of what the hell a CPU review is all about.
 
Joined
Feb 1, 2019
Messages
3,590 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Too much manipulation there; no wonder most reviewers bench the same few games.

Also why did they disable VBS?
 
Joined
Apr 19, 2018
Messages
1,227 (0.51/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
AMD, Zen 5 better be a huge jump, or you're done playing in the big boy league.

Also why did they disable VBS?
Because it slows the system down by 5-10%.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
All you did was confirm that you don't understand the process.

Example 1:

1080p - you get 60 FPS
1440p - you get 60 FPS
4K - you get 40 FPS

In this situation, upgrading your GPU would provide NO performance increase in 1440p. ZERO.
In 4K, you would only gain a maximum of 50% extra performance, even if the new GPU was twice as fast.
How would you know this without the 1080p test?

Example 2:
1080p - you get 100 FPS
1440p - you get 60 FPS
4K - you get 40 FPS

In this situation, the CPU bottleneck happens at 100 FPS. Which means you can get 67% more performance in 1440p after upgrading the GPU, and you can get 150% more performance in 4K.
You know the maximum framerate the CPU can achieve without a GPU bottleneck, which means you know what to expect when you upgrade your GPU.

What's important is to have this data for as many games as possible. Some games don't need a lot of CPU power, some are badly threaded, and some will utilize all 8 cores fully.

If you test in 4K, you're not testing the maximum potential of the CPU. You want to know this if you're planning on keeping your system for more than two years. Most people WILL upgrade their GPU before their CPU.
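The two worked examples above boil down to one ratio: the CPU-bound ceiling (measured at 1080p) divided by your current FPS at the target resolution. A minimal sketch using the same numbers; the `max_uplift` helper is just a made-up name for illustration:

```python
def max_uplift(cpu_cap_fps, current_fps):
    """Upper bound on the FPS gain a GPU upgrade can deliver,
    given the CPU-bound ceiling measured at 1080p."""
    return cpu_cap_fps / current_fps - 1

# Example 1: the CPU caps out at 60 FPS
print(f"{max_uplift(60, 60):.0%}")   # 1440p: 0% headroom, a faster GPU gains nothing
print(f"{max_uplift(60, 40):.0%}")   # 4K: at most 50% more, even with a twice-as-fast GPU

# Example 2: the CPU caps out at 100 FPS
print(f"{max_uplift(100, 60):.0%}")  # 1440p: up to 67% more
print(f"{max_uplift(100, 40):.0%}")  # 4K: up to 150% more
```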
Seriously, please just go watch the Hardware Unboxed video.
I'm quoting you, but my point applies to everyone. There's a lot of arguing about testing resolution, and you are… both right.
Testing at 1080p is still relevant to keep the GPU (mostly) out of the equation, BUT testing at higher resolutions is important to give customers an idea of what to expect when they buy the CPU. Showing only 1080p results is wrong (or it's a marketing move to highlight something you're advertising).
 
Joined
Aug 29, 2005
Messages
7,260 (1.03/day)
Location
Stuck somewhere in the 80's Jpop era....
System Name Lynni PS \ Lenowo TwinkPad L14 G2
Processor AMD Ryzen 7 7700 Raphael (Waiting on 9800X3D) \ i5-1135G7 Tiger Lake-U
Motherboard ASRock B650M PG Riptide Bios v. 3.10 AMD AGESA 1.2.0.2a \ Lenowo BDPLANAR Bios 1.68
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Asus GeForce RTX™ 4070 Dual OC (Waiting on RX 8800 XT) | Intel® Iris® Xe Graphics
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB
Display(s) KTC M27T20S 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p
Case Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi
Software Win11 IoT Enterprise 24H2 UK | Win11 IoT Enterprise LTSC 24H2 UK / Arch (Fan)
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
I would love to see a non-X 3D CPU, to see how the performance and the market would cope with it. If a CPU like a Ryzen 7 7800 non-X 3D were strong but held to a lower TDP than the X part, we might have a solid max-100 W CPU that hits hard.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
I would love to see a non-X 3D CPU, to see how the performance and the market would cope with it. If a CPU like a Ryzen 7 7800 non-X 3D were strong but held to a lower TDP than the X part, we might have a solid max-100 W CPU that hits hard.
Which kind of application do you mean? Because if we're talking about gaming, basically every CPU stays below 100 W.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,540 (3.78/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Maybe it wont be so bad. I was feeling kinda blah yesterday. I watched a video this morning that was fairly enlightening.
 
Joined
Aug 29, 2005
Messages
7,260 (1.03/day)
Which kind of application do you mean? Because if we're talking about gaming, basically every CPU stays below 100 W.

You sure? In @W1zzard's review, the Ryzen 9 7950X has an average of 125 W and the Ryzen 7 7700X an average of 80 W.

So think about it: the Ryzen 7700 non-X averages 57.4 W in games, so if a well-tuned non-X 3D Ryzen 7 did about 65 W on average, it would be a killer gaming CPU. We saw with the 5800X3D how much FPS the extra cache gains in gaming without using an insane amount of power.

This would make a lot of customers happy in countries where power still costs money. You have to think outside the United States: Americans complain about prices, but they get things in bigger sizes than other countries, so things are still cheaper there than in the rest of the world.
 
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I don't understand what you're trying to say. You want a 7800 non-X non-3D? Why? There's already the 7700, which is exactly the same as the 7700X, just with slightly lower boost clocks and voltages.

There is a point where every additional 100 MHz raises the voltage and power consumption significantly. That's why the X models are so inefficient at stock settings. There's no room for a 7800 non-X, because it would have to be faster than a 7700X while being more efficient. That's not possible, because they would have to use top quality chiplets, which are reserved for server CPUs.

The 7700 is already redundant for JUST gaming workloads. You either choose the 7600 for maximum value, or the 7800X3D for maximum performance.
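The point about every extra 100 MHz costing disproportionate power comes from the classic dynamic-power relation P ≈ C·V²·f: frequency enters linearly, but the voltage needed to hold that frequency enters squared. A rough sketch; the baseline and operating-point figures below are illustrative assumptions, not measurements of any real SKU:

```python
def dynamic_power(freq_ghz, volts, base_freq=4.5, base_volts=1.10, base_watts=65.0):
    """Scale package power from a baseline operating point using P ~ C * V^2 * f.
    All baseline figures are illustrative, not measured values."""
    return base_watts * (freq_ghz / base_freq) * (volts / base_volts) ** 2

# A +20% clock bump that needs ~23% more voltage costs ~81% more power
print(round(dynamic_power(5.4, 1.35)))  # -> 117 (watts), vs the 65 W baseline
```

This is why the last few hundred MHz on the X models are so expensive, and why a lower-clocked non-X part can land at a far better efficiency point.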
 
Joined
Aug 29, 2005
Messages
7,260 (1.03/day)
I don't understand what you're trying to say. You want a 7800 non-X non-3D? Why? There's already the 7700, which is exactly the same as the 7700X, just with slightly lower boost clocks and voltages.

There is a point where every additional 100 MHz raises the voltage and power consumption significantly. That's why the X models are so inefficient at stock settings. There's no room for a 7800 non-X, because it would have to be faster than a 7700X while being more efficient. That's not possible, because they would have to use top quality chiplets, which are reserved for server CPUs.

The 7700 is already redundant for JUST gaming workloads. You either choose the 7600 for maximum value, or the 7800X3D for maximum performance.

I was speaking of a Ryzen 7 7800 non-X 3D; it would make sense if the gaming performance uplift is there.

Between the 7700 and the 7700X there's about a 5% difference in gaming, not much, but ITX systems could also benefit from the lower power usage and heat. I guess the market might be small though, and people want a 100°C CPU that needs an AIO to keep cool.
 
Joined
Sep 2, 2021
Messages
195 (0.17/day)
Location
Colorado
Processor AMD Ryzen 9 7950X
Motherboard Asus ROG Crosshair X670E Gene
Cooling Full Custom Water
Memory 48GB DDR5
Video Card(s) Nvidia RTX 3090 FE
Storage Crucial T700 2TB Gen5 SSD
Display(s) Asus PG32UQX
Case Primochill Praxis WetBench
Audio Device(s) SteelSeries Arctis Pro
Power Supply SeaSonic Prime TX-1600
Mouse G502 Lightspeed
Keyboard Keychron Q1 HE, Drop Custom Keycaps
Software Windows 11 Pro 23H2
Benchmark Scores http://www.3dmark.com/pcm10b/1944567
The platform BIOS, AGESA version, chipset driver, and Windows version will all have an impact on the benchmarks.

I have been testing beta BIOS 0921, AGESA 1005c, Chipset V5, and Windows 11 Pro 22H2 with a 7950X chip.


My system was fully stable; I had no issues with EXPO 6000 or manual 6400 memory profiles on AGESA 1005c. The all-core average active clock under load was 55.0x with AGESA 1005c & Chipset V5 at default settings. I tested several benchmarks and games and got the same results.

Default settings, EXPO under load: all-core 55.0x.
Default settings, EXPO idle: 57.5x.

I understand Asus has been working on an official BIOS & firmware for the new X3D CPUs and has released new BIOS 0922 on the support site; I haven't tested it. I rolled back to 0805, AGESA 1003, and Chipset V4, which seems to work best for my non-X3D chip.

Is the new 7950X3D's base clock lowered 300 MHz to 4.2 GHz, with boost clocks of 5.1 GHz all-core and 5.7 GHz single-core? :confused:

The review will be interesting, looking forward to results. :)
 
Joined
Nov 21, 2010
Messages
2,351 (0.46/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
Looking at the list of games, I have to say they suck. The ones that are interesting/good show little to no performance gains. Why would someone rush out to buy a gaming CPU that offers no benefit?
 
Joined
Jun 14, 2020
Messages
3,460 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
You sure? In @W1zzard's review, the Ryzen 9 7950X has an average of 125 W and the Ryzen 7 7700X an average of 80 W.

So think about it: the Ryzen 7700 non-X averages 57.4 W in games, so if a well-tuned non-X 3D Ryzen 7 did about 65 W on average, it would be a killer gaming CPU. We saw with the 5800X3D how much FPS the extra cache gains in gaming without using an insane amount of power.

This would make a lot of customers happy in countries where power still costs money. You have to think outside the United States: Americans complain about prices, but they get things in bigger sizes than other countries, so things are still cheaper there than in the rest of the world.
The 12900K averages less than 57 W in games.
 
Joined
Jul 30, 2019
Messages
3,276 (1.68/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
Testing at 1080p is still relevant to keep the GPU (mostly) out of the equation,
Yes, so you are comparing the CPUs. Another way to look at it: in a game, the CPU is not nearly as limited by the GPU as the GPU is limited by the CPU. So by testing at 1080p you're able to compare the CPUs' relative performance far better than the GPUs'.
BUT testing at higher resolutions is important to give customers an idea of what to expect when they buy the CPU.
Isn't that what other sets of synthetic CPU tests are for?
 
Joined
Feb 22, 2017
Messages
26 (0.01/day)
Going by that logic, why not test at 720p or 480p? What he said makes sense: those resolutions are obsolete, same as 1080p.
I doubt anyone spending that amount of money (a 7950X3D with a 4090) will be using 1080p.
These tests are downright useless and have zero real value. But then again, if you were shown real-world tests, the 99%-of-people case and not the 0.0001% weirdo who will actually run this setup, you wouldn't even care to upgrade, because in reality the difference is minimal; that's also true for new-generation CPUs versus previous ones.
Testing at 1080p could be indicative of what we will get at 4k/ultrawide when paired with RTX 5090/6090 later.

For the best gaming performance per buck you might want to upgrade GPU more often than CPU, which is especially true for Intel where MB needs to be upgraded. So this info is very useful.
 
Joined
Oct 12, 2005
Messages
707 (0.10/day)
Yes, so you are comparing the CPUs. Another way to look at it: in a game, the CPU is not nearly as limited by the GPU as the GPU is limited by the CPU. So by testing at 1080p you're able to compare the CPUs' relative performance far better than the GPUs'.

Isn't that what other sets of synthetic CPU tests are for?
Synthetics are good for comparing CPUs in that specific synthetic workload; they're not very useful for gaming. For example, Zen 1 was very good in many synthetic workloads but was behind in gaming.

There are a few good reasons to test a CPU in a CPU-limited scenario:

1) To see how the CPU does in CPU-bound games that are harder to test, like MMOs
2) To see how well the CPU could do in the future
3) To see what the best of the best is (because for many, that matters)

Depending on what you do, it may or may not matter. If you mostly play competitive shooters, for example, just get whatever mid-range CPU from the brand you prefer and stop worrying about this futile debate. If you play MMOs or heavily CPU-bound games like Factorio or Valheim (when focusing on base building), get the best CPU your budget allows.

This debate is way overstated. We always have needed, and always will need, to test CPUs in non-limited scenarios. But in the end it's not as important as people think, unless there's a huge disparity between the options (like in the Bulldozer era). For most people, any modern CPU that isn't low-end will be more than enough for quite some time. That doesn't mean it's worthless to see what the best is; this is an enthusiast site, after all.
 

Joined
May 10, 2020
Messages
738 (0.44/day)
You sure? In @W1zzard's review, the Ryzen 9 7950X has an average of 125 W and the Ryzen 7 7700X an average of 80 W.
I don't know what you're referring to.
In the 7950X review, the gaming average is this:



87W.

The 12900K averages less than 57 W in games.
It depends on the game, but my 13900K is around 80 W.
 

JrRacinFan

Served 5k and counting ...
Joined
Mar 17, 2007
Messages
20,119 (3.11/day)
Location
Youngstown, OH
System Name Snow White
Processor Ryzen 7900x3d
Motherboard AsRock B650E Steel Legend
Cooling Custom Water 1x420
Memory 32GB T-Force Deltas
Video Card(s) PowerColor 7900 XTX Liquid Devil
Storage 20+ TB
Display(s) Sammy 49" 5k Ultrawide
Case Tt CTE 600 Snow Edition
Audio Device(s) Onboard
Power Supply EVGA 1200W P2
Mouse Corsair M65 RGB Elite White
Keyboard Corsair K65 Mini
Software Windows 10
Benchmark Scores Avermedia Live HD2
which is exactly the same as the 7700X, just with slightly lower boost clocks and voltages.
A few BIOS tweaks and a slight OC and it's the exact same SKU, with the same max FCLK.
 
Joined
Sep 15, 2011
Messages
6,722 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
The pi$$ing contest is still going strong: 200 FPS vs 205 FPS at 1080p.
Time to change the CPU... /s
 

tanaka_007

New Member
Joined
Mar 29, 2022
Messages
12 (0.01/day)
A 2-DIMM motherboard is required if you want to raise the memory clock in 1:2 mode.
It would be interesting to look for DDR5 modules with Hynix chips in shops, on eBay, etc.
For example, a Hynix DDR5-4800 8GBx2 kit (green PCB) works at DDR5-6000.
Add a heatsink if you want to OC a cheap DDR5-4800 Hynix chip to DDR5-7500 or higher.
If you're looking for a cheaper option, you may want to wait for information on Micron's and Samsung's 2nd-generation (DDR5-5600) chips.

Example
7950X + 4-DIMM M/B + DDR5-6000 CL30 (2x16GB), 1:1 mode (*4-DIMM boards are limited to about DDR5-7000 due to signal reflection)
7950X + 2-DIMM M/B + DDR5-8600 CL42 (2x16GB), 1:2 mode

Hynix M-die (DDR5-4800 / DDR5-7000+ Manual OC) *Release 2020/10
Hynix A-die (DDR5-5600 / DDR5-8000+ Manual OC) *Release 2022/11
Micron (DDR5-4800 / DDR5-5200+ Manual OC) *Release 2021/11
Micron (DDR5-5600 / unknown) *There is no information yet as it has just been released. *Release 2023/02
Samsung M-die (DDR5-4800 / DDR5-6000+ Manual OC) *Release 2021/12
Samsung D-die (DDR5-5600 / unknown) *There is no information yet as it has just been released. *Release 2023/02
*Release times may vary by country.
*Very high clocks may not run due to IMC and motherboard variations.
*OEM memory may have PMIC voltage locked.


 
Joined
Aug 29, 2005
Messages
7,260 (1.03/day)
I don't know what you're referring to.
In the 7950X review, the gaming average is this:


87W.
I was looking at the application power consumption figure.

But I'm still waiting to see whether the 3D V-Cache uplift is gaming-only, because if so, non-X variants would still make sense for normal people and OEMs if they can come in at a good price below the X variant.

People are going to spend 2k minimum and then use 1080p?

Let's see the 4k results with RT or higher.

Yeah, that hasn't really moved for some years now with 1080p gaming. I tried 4K; it takes too much money, and day to day 1080p is too small, so I went back to 1440p. It's still much easier to drive than 4K, and it looks better to my eyes than 1080p.
 

Joined
May 11, 2018
Messages
1,254 (0.52/day)
Testing at 1080p could be indicative of what we will get at 4k/ultrawide when paired with RTX 5090/6090 later.

For the best gaming performance per buck you might want to upgrade GPU more often than CPU, which is especially true for Intel where MB needs to be upgraded. So this info is very useful.

4K has 4x the pixels of 1080p, so one generation's uplift is much smaller than the gap between 4K and 1080p FPS. An RTX 3090 averages 88 FPS across TechPowerUp's suite of games at 4K; at 1080p it gets 181 FPS, 105% more. And a 2080 Ti goes from 59 FPS at 4K to 130 FPS at 1080p, 120% more.

With an RTX 4090 you get 144 FPS at 4K but only 74% more at 1080p; the absurdly high 251 FPS there is limited by system latency, not CPU performance, so we can be sure the real ceiling is more than 100% above the 4K number.

The generational uplift from the RTX 3090 to the 4090 only gives you 63% at 4K and 40% at 1080p, and going from the 2080 Ti to the 3090 only gave you 49% at 4K and 39% at 1080p.

So you really have to skip a GPU generation and keep the same CPU for four years, and even then it's not the same as comparing 4K numbers to 1080p numbers.

720p is of course even more absurd; that's like planning to use the same CPU for more than eight years.

These "let's get the GPU bottleneck out of the equation" numbers are very good for selling CPUs, but the theoretical increases at low resolution have very little bearing on what you'll actually get from the product.

It's good for sales, though, and good for creating FOMO in buyers.

Because showing the actual FPS you'd get from a cheap $250 CPU and a very expensive $700 one at 4K, perhaps with the more down-to-earth GPU that the majority of gamers actually use, paints a very, very different picture.
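The percentages in this post can be reproduced from the quoted average-FPS figures (same numbers as above; the ±1% drift versus the post is just rounding):

```python
fps = {  # average FPS in TechPowerUp's game suite, as quoted above
    "2080 Ti": {"4K": 59, "1080p": 130},
    "3090":    {"4K": 88, "1080p": 181},
    "4090":    {"4K": 144, "1080p": 251},
}

def pct_more(a, b):
    """How much faster a is than b, as a rounded percentage."""
    return round((a / b - 1) * 100)

# 1080p vs 4K on the same card
print(pct_more(fps["3090"]["1080p"], fps["3090"]["4K"]))        # -> 106
print(pct_more(fps["2080 Ti"]["1080p"], fps["2080 Ti"]["4K"]))  # -> 120
# generational uplift at a fixed resolution
print(pct_more(fps["4090"]["4K"], fps["3090"]["4K"]))           # -> 64
print(pct_more(fps["4090"]["1080p"], fps["3090"]["1080p"]))     # -> 39
```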
 