Here is a good example of those claims that the 12900K uses 50 watts while gaming: "Core i9 actually outscores the Ryzen 9 by 40 percent when the integrated graphics (IGP) are enabled and by 6 percent when it's off. For this test, we show power consumption when the IGP is off."
Yeah... because it's using the IGP's hardware encoding/decoding? That's a pretty good example of misinterpreting the results.
That 6% win is the CPU performance difference.
As for what gamers do with their PCs? They game. On anything but a 5950X.
And in gaming, you'll find that what everyone says about AMD being more efficient still holds up.
View attachment 258243 View attachment 258248
It doesn't matter whether it's power consumption over time in never-ending tasks, or total energy used on tasks that benefit from finishing the job faster.
View attachment 258244
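That distinction (sustained draw in watts for an open-ended load, versus energy in watt-hours for a job that ends) is just arithmetic. A minimal sketch of how the two metrics are computed, with purely hypothetical wattages and durations rather than numbers taken from the attachments:

```python
# Back-of-envelope comparison of the two efficiency metrics.
# All figures below are hypothetical placeholders, not benchmark data.

def task_energy_wh(avg_power_w: float, task_seconds: float) -> float:
    """Total energy for one finite task (e.g. a render), in watt-hours."""
    return avg_power_w * task_seconds / 3600.0

# For a never-ending load (e.g. hours of gaming), average watts is the metric.
cpu_a_gaming_w = 110.0   # hypothetical average draw while gaming
cpu_b_gaming_w = 70.0

# For a fixed job, energy per task is the metric: a chip can draw more watts
# yet still win (or lose) on watt-hours depending on how fast it finishes.
cpu_a_render_wh = task_energy_wh(avg_power_w=180.0, task_seconds=540.0)
cpu_b_render_wh = task_energy_wh(avg_power_w=120.0, task_seconds=700.0)

print(f"Gaming draw:   A={cpu_a_gaming_w:.0f} W, B={cpu_b_gaming_w:.0f} W")
print(f"Render energy: A={cpu_a_render_wh:.1f} Wh, B={cpu_b_render_wh:.1f} Wh")
```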
It's strange, the ways people reach to describe this stuff: no gamer needs or wants a 5950X or a 12900K/F.
The gaming performance difference between them is tiny when you're looking at options that perform within 6% (per your claim) or 1.5-10% (per TPU's numbers).
View attachment 258245 View attachment 258246
At low resolutions, when not GPU limited, Intel has a performance advantage.
That's when their power consumption goes up.
Anything that prevents the CPU from reaching those high turbo states and the high power consumption also prevents that performance advantage. They happen together, or not at all.
The argument that "my 12th-gen Intel isn't bad with power consumption because it's GPU limited" is just so... strange.
Because you can get that exact same performance from a CPU that won't suddenly triple in power usage any time the CPU actually has work to do.
Anything beyond a 12600K is simply not power efficient by any metric: single-threaded, multi-threaded, or total consumption for a task like rendering.
You can get a 5600X or a 12600K, and unless you're running at 1080p 360 Hz with a 3090 Ti, the higher-end CPUs from AMD and Intel literally just throw away power for no gains.
Running GPU limited or with a frame cap reduces that, but if you *rely* on that you might as well underclock the CPU, because you're counting on a low enough load to do it for you automatically.
It's like buying a 3090 Ti, gaming at 720p 30 Hz with Vsync on, and claiming it's the most power-efficient GPU of all time.
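To make that point concrete: the low power you see under a frame cap exists only because the cap keeps the CPU mostly idle. A crude sketch of that idea, using hypothetical numbers and an intentionally simplistic linear model (real chips don't scale this neatly, and idle power varies):

```python
# Rough duty-cycle illustration: a frame cap lowers CPU power only by
# lowering CPU load. All figures are hypothetical, not measurements.

def capped_power_estimate(uncapped_fps: float, fps_cap: float,
                          full_load_w: float, idle_w: float = 20.0) -> float:
    """Naive linear model: power scales with the fraction of time the CPU
    actually spends producing frames once the cap is applied."""
    duty_cycle = min(1.0, fps_cap / uncapped_fps)
    return idle_w + duty_cycle * (full_load_w - idle_w)

# CPU that could push 300 fps at 140 W, capped to 120 fps:
print(f"{capped_power_estimate(300, 120, full_load_w=140):.0f} W")  # ~68 W
# Remove the cap (or hit a CPU-heavy scene) and the full draw comes right back:
print(f"{capped_power_estimate(300, 300, full_load_w=140):.0f} W")  # 140 W
```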
If you don't care about Cinebench, rendering, or multithreaded workloads, why would you buy anything bigger than a 6-core CPU?
From the reads here, in a few CPU-demanding games the power usage jumps to 140 W in some cases for the 12900K.
Not to mention the CPU is only being utilized at around 60% or so (in Death Stranding or Cyberpunk, for instance).