Processor | Ryzen 5 5700x |
---|---|
Motherboard | B550 Elite |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 32GB Fury Beast DDR4 3200 MHz |
Video Card(s) | Gigabyte 3060 ti gaming oc pro |
Storage | Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs |
Display(s) | LG 27gp850 1440p 165Hz 27'' |
Case | Lian Li Lancool II performance |
Power Supply | MSI 750w |
Mouse | G502 |
If only the government could help me get rid of the heat output.
My undervolted 3080 turns my room into a sauna during the summer, together with the rest of my equipment (TV, receiver). I am never buying a 300+ W card again.
Power consumption is the main thing I will be looking at from now on, for both CPUs and GPUs.
System Name | Firelance. |
---|---|
Processor | Threadripper 3960X |
Motherboard | ROG Strix TRX40-E Gaming |
Cooling | IceGem 360 + 6x Arctic Cooling P12 |
Memory | 8x 16GB Patriot Viper DDR4-3200 CL16 |
Video Card(s) | MSI GeForce RTX 4060 Ti Ventus 2X OC |
Storage | 2TB WD SN850X (boot), 4TB Crucial P3 (data) |
Display(s) | Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz) |
Case | Enthoo Pro II Server Edition (Closed Panel) + 6 fans |
Power Supply | Fractal Design Ion+ 2 Platinum 760W |
Mouse | Logitech G604 |
Keyboard | Razer Pro Type Ultra |
Software | Windows 10 Professional x64 |
System Name | The Phantom in the Black Tower |
---|---|
Processor | AMD Ryzen 7 5800X3D |
Motherboard | ASRock X570 Pro4 AM4 |
Cooling | AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm |
Memory | 64GB Team Vulcan DDR4-3600 CL18 (4×16GB) |
Video Card(s) | ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB |
Storage | WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space |
Display(s) | Haier 55E5500U 55" 2160p60Hz |
Case | Ultra U12-40670 Super Tower |
Audio Device(s) | Logitech Z200 |
Power Supply | EVGA 1000 G2 Supernova 1kW 80+Gold-Certified |
Mouse | Logitech MK320 |
Keyboard | Logitech MK320 |
VR HMD | None |
Software | Windows 10 Professional |
Benchmark Scores | Fire Strike Ultra: 19484, Time Spy Extreme: 11006, Port Royal: 16545, Superposition 4K Optimised: 23439 |
I couldn't agree more, because I don't think that DDR5 is worth it. I think that AMD made a mistake by offering the AM5 platform with DDR5 compatibility only. At the same time, though, DDR4 and DDR3 were also premature at launch. When DDR3 came out, DDR2 was still perfectly viable, and the same could be said about DDR4 vis-à-vis DDR3. Hell, just think of the people out there still using DDR3 with their FX processors. For example, Steve Burke, the Tech Jesus himself, still uses an FX-8350 in his home PC.

> Well, as a DDR5 platform, the real cost comparison needs to be an R7 7600 on B650 vs. an i5-13400(F) on B660 DDR5.

I agree that the price comparison between those two now looks evenly matched, but we need reviews to answer whether DDR5 is even worth bothering with at this price point for now.
You know what's funny? Ten years ago, people were citing the high power usage of AMD CPUs and Radeon GPUs as reasons they didn't buy them. Now those people have been proven to be full of donkey doo-doo, because suddenly such things don't seem to matter to them anymore.

> At the very least, these non-X CPUs offer the best AM5 value yet, and while they're not as fast as Intel, they are vastly more power-efficient, which is an increasingly important metric to judge a CPU by.

How much did the Intel DDR5 platform cost again?

> $460 is a reasonable cost for a low-tier CPU and a mobo?!

What do you drive, a Lambo?
It's a problem with AMD video cards, because they report much less than their real consumption. That's how they are designed. You can find the explanation in every video card review. For processors, consumption is reported accurately via the VRM sensors.

> Please stop posting 3DMark graphs of power consumption as if they prove something. The only accurate measure of power consumption is physically at the connectors and the wall.
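Since the argument keeps circling around software readings versus physical measurements, here is a minimal sketch of how one could cross-check a card's software-reported power against a wall meter: take the wall draw at idle and under a GPU-only load, correct for an assumed PSU efficiency, and compare the difference with what the monitoring software claims. Every number and the efficiency figure below are hypothetical placeholders, not measurements from this thread.

```python
# Hypothetical cross-check of software-reported GPU power vs. a wall meter.
# All readings below are invented placeholders.

PSU_EFFICIENCY = 0.90            # assumed PSU efficiency at this load

wall_idle_w = 75.0               # wall meter, system idle (W)
wall_gpu_load_w = 390.0          # wall meter, GPU-only load (W)
software_reported_gpu_w = 230.0  # what the monitoring software claims for the GPU (W)

# Extra DC power the system actually drew for that load, after PSU losses.
# (This delta also includes whatever extra the CPU needed to feed the GPU,
# so it is an upper bound rather than an exact board-power figure.)
measured_delta_w = (wall_gpu_load_w - wall_idle_w) * PSU_EFFICIENCY

gap_w = measured_delta_w - software_reported_gpu_w
gap_pct = 100.0 * gap_w / measured_delta_w

print(f"Measured load delta:   {measured_delta_w:.0f} W")
print(f"Software-reported GPU: {software_reported_gpu_w:.0f} W")
print(f"Unaccounted-for power: {gap_w:.0f} W ({gap_pct:.0f}% of the delta)")
```

A persistently large gap would point at the sensor covering only part of the board; a near-zero gap would suggest it reports something close to total board power.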
They consumed too much and offered too little; that was the problem with FX. Now it's funny that many AMD fans invoke the consumption of Intel processors in programs they don't even use, without realizing that the same processor consumes less in many other workloads than an AMD does at idle.

> You know what's funny? Ten years ago, people were citing the high power usage of AMD CPUs and Radeon GPUs as reasons they didn't buy them. Now those people have been proven to be full of donkey doo-doo, because suddenly such things don't seem to matter to them anymore.
System Name | Firelance. |
---|---|
Processor | Threadripper 3960X |
Motherboard | ROG Strix TRX40-E Gaming |
Cooling | IceGem 360 + 6x Arctic Cooling P12 |
Memory | 8x 16GB Patriot Viper DDR4-3200 CL16 |
Video Card(s) | MSI GeForce RTX 4060 Ti Ventus 2X OC |
Storage | 2TB WD SN850X (boot), 4TB Crucial P3 (data) |
Display(s) | Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz) |
Case | Enthoo Pro II Server Edition (Closed Panel) + 6 fans |
Power Supply | Fractal Design Ion+ 2 Platinum 760W |
Mouse | Logitech G604 |
Keyboard | Razer Pro Type Ultra |
Software | Windows 10 Professional x64 |
If it was accurately reported, we wouldn't have TPU and GN showing us that it's not.

> For processors, consumption is reported accurately via the VRM sensors.
System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this though....
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
It's only a problem if you don't read the labels of the values you're given.

> It's a problem with AMD video cards, because they report much less than their real consumption. That's how they are designed. You can find the explanation in every video card review. For processors, consumption is reported accurately via the VRM sensors.
Processor | Ryzen 5 5700x |
---|---|
Motherboard | B550 Elite |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 32GB Fury Beast DDR4 3200 MHz |
Video Card(s) | Gigabyte 3060 ti gaming oc pro |
Storage | Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs |
Display(s) | LG 27gp850 1440p 165Hz 27'' |
Case | Lian Li Lancool II performance |
Power Supply | MSI 750w |
Mouse | G502 |
> How much did the Intel DDR5 platform cost again?

What do you game on, Sandy Bridge?
OK, so I must be the one who's wrong, then. The 25 W the wattmeter reports for the entire system at idle is apparently the consumption of the processor alone. And it's only correct for AMD, which shows 20 W at idle and 50 degrees for the processor alone, which would mean the AMD processor generates current rather than consuming it.

> If it was accurately reported, we wouldn't have TPU and GN showing us that it's not.
System Name | Skunkworks 3.0 |
---|---|
Processor | 5800x3d |
Motherboard | x570 unify |
Cooling | Noctua NH-U12A |
Memory | 32GB 3600 MHz |
Video Card(s) | asrock 6800xt challenger D |
Storage | Sabrent Rocket 4.0 2TB, MX500 2TB |
Display(s) | Asus 1440p144 27" |
Case | Old arse cooler master 932 |
Power Supply | Corsair 1200w platinum |
Mouse | *squeak* |
Keyboard | Some old office thing |
Software | Manjaro |
It's AMD's old cycle. They were doing alright with CPUs before Bulldozer, and in GPUs they were great: they got to 49% market share with Evergreen... then they just rebranded everything into the 6000 series, shoved it at us, and got caught with their pants down when Fermi 2.0 came out.

> This pricing makes ZERO sense... the lowest-rung non-X CPU (7600) is a whopping $70 cheaper than its X counterpart, yet the next one up (7700) is a mere $15 cheaper? Surely it should be the other way around?

It doesn't matter though; people are still not going to pay a premium for a 6-core 7600 + motherboard + DDR5 when they can get an 8-core 5800X3D + motherboard + DDR4 for less and have better gaming performance. I don't understand which genius at AMD thought that people would be happy to pay more for CPU performance after years of the same company charging less. A smart company would be selling its newer CPUs for less than its old ones to encourage consumers to upgrade... maybe AMD is just hoping that AM4 stock will run out and consumers will be forced to buy AM5... or maybe those consumers will just buy Intel...

Everything about AM5 has felt like Bulldozer-era AMD, TBH: clueless about the actual value proposition of their product and expecting people to buy it solely out of brand loyalty.
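The value argument in the last few posts is ultimately just a sum over parts, so a throwaway sketch like the one below makes the comparison explicit. Every price in it is a hypothetical placeholder chosen purely for illustration, not a quote from any shop or from this thread.

```python
# Toy platform-cost comparison. All prices are hypothetical placeholders.

platforms = {
    "AM5: R5 7600 + B650 + 32GB DDR5":       {"cpu": 230, "board": 180, "ram": 110},
    "AM4: R7 5800X3D + B550 + 32GB DDR4":    {"cpu": 330, "board": 130, "ram": 70},
    "LGA1700: i5-13400F + B660 + 32GB DDR5": {"cpu": 220, "board": 160, "ram": 110},
}

for name, parts in platforms.items():
    total = sum(parts.values())
    print(f"{name:40s} -> ${total}")
```

Swap in real local prices and the "premium versus what you get" argument above becomes a one-line check rather than a shouting match.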
That's A) on a Ryzen 5 3600, not a 7600, and B) the new Wraith Stealth cooler (which is what comes with the 7600) is hot garbage.

> Wraith Prism is the best stock cooler currently available and superior to the Intel cooler. It performs only slightly worse than a CM 212. It does a decent job up to a 100 W load.
(Attachment 278394)
Processor | 7800X3D 2x16GB CO |
---|---|
Motherboard | Asrock B650m HDV |
Cooling | Peerless Assassin SE |
Memory | 2x16GB DR A-die@6000c30 tuned |
Video Card(s) | Asus 4070 dual OC 2610@915mv |
Storage | WD blue 1TB nvme |
Display(s) | Lenovo G24-10 144Hz |
Case | Corsair D4000 Airflow |
Power Supply | EVGA GQ 650W |
Software | Windows 10 home 64 |
Benchmark Scores | Superposition 8K: 5267, AIDA64: 58.5 ns |
As I replied later, the Prism works fine on the 7700, and I thought the 7600 also came with the Prism. The Stealth is crap.

> It's AMD's old cycle. They were doing alright with CPUs before Bulldozer, and in GPUs they were great: they got to 49% market share with Evergreen... then they just rebranded everything into the 6000 series, shoved it at us, and got caught with their pants down when Fermi 2.0 came out.

Or before that, when Athlon 64 was kicking Intel's arse and AMD went and bought ATi while pushing back their new architecture and charging $1000 for the FX-62, only for Conroe to show up and obliterate them from orbit.

> That's A) on a Ryzen 5 3600, not a 7600, and B) the new Wraith Stealth cooler (which is what comes with the 7600) is hot garbage.
System Name | The Phantom in the Black Tower |
---|---|
Processor | AMD Ryzen 7 5800X3D |
Motherboard | ASRock X570 Pro4 AM4 |
Cooling | AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm |
Memory | 64GB Team Vulcan DDR4-3600 CL18 (4×16GB) |
Video Card(s) | ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB |
Storage | WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space |
Display(s) | Haier 55E5500U 55" 2160p60Hz |
Case | Ultra U12-40670 Super Tower |
Audio Device(s) | Logitech Z200 |
Power Supply | EVGA 1000 G2 Supernova 1kW 80+Gold-Certified |
Mouse | Logitech MK320 |
Keyboard | Logitech MK320 |
VR HMD | None |
Software | Windows 10 Professional |
Benchmark Scores | Fire Strike Ultra: 19484, Time Spy Extreme: 11006, Port Royal: 16545, Superposition 4K Optimised: 23439 |
The AM5 platform, including the CPU and motherboard, is about the same price now as Intel's LGA 1700 with DDR5. If both platforms are about the same price, NEITHER of them can really be called a negative, and it has nothing to do with Intel or AMD. I didn't think it was that complicated a concept, but I guess I was mistaken, because you clearly proved me wrong in that regard.

> So it's fine because Intel. Got it.
Processor | Ryzen 7 5800X3D |
---|---|
Motherboard | MSI B550 Tomahawk |
Cooling | Noctua U12S |
Memory | 32GB @ 3600 CL18 |
Video Card(s) | AMD 6800XT |
Storage | WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB) |
Display(s) | Samsung Odyssey G9 |
Case | Be Quiet! Silent Base 802 |
Power Supply | Seasonic PRIME-GX-1000 |
Are you just kind of rambling to yourself on the thread? Oh well, I'll engage this time.

> Off and on at the same time (the impact of the processor in gaming).
I replaced my 11600KF with a 10500. The reason: I had two processors and one motherboard, and the 10500 practically cannot be sold, whereas I got rid of the 11600KF immediately, at a reasonable price.

I did some tests beforehand so that I wouldn't regret it later. To my surprise, the 10500 holds up very well next to the 3070 Ti. I even have a 3% BCLK reserve in hand (about 120 MHz more on all cores) and the Gaming profile in the BIOS (another 100 MHz on all cores), but I'll try those when I have time; I have no serious reason to right now. The eye cannot tell the difference.
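The "3% BCLK = about 120 MHz more on all cores" bit is simple multiplier arithmetic. A quick check, assuming an all-core multiplier of around 42x for the 10500 (my assumption, not something stated in the post):

```python
# Sanity check of the "3% BCLK = ~120 MHz more on all cores" estimate.
# The 42x all-core multiplier is an assumption used only for illustration.

bclk_stock = 100.0    # MHz
bclk_raised = 103.0   # MHz, a 3% bump
multiplier = 42

stock_clock = bclk_stock * multiplier     # 4200 MHz
raised_clock = bclk_raised * multiplier   # 4326 MHz
print(f"All-core clock gain: {raised_clock - stock_clock:.0f} MHz")  # ~126 MHz
```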
In synthetic tests the differences are huge. They are there in gaming too (see force), but the eye does not see them. With the downgrade to PCIe 3.0 on the video card and the NVMe drive, you'd say it's a catastrophe, but the reality is completely different.

So, if we invoke gaming, what is the real impact of the processor in gaming? Reviews use powerful, top-of-the-line video cards, while most users, hmmm... I think entry-level and mid-range video cards definitely dominate.

So what are we really talking about? How to spend the money, and how to encourage them to keep selling to us at high prices?
Processor | Ryzen 7 5800X3D |
---|---|
Motherboard | MSI B550 Tomahawk |
Cooling | Noctua U12S |
Memory | 32GB @ 3600 CL18 |
Video Card(s) | AMD 6800XT |
Storage | WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB) |
Display(s) | Samsung Odyssey G9 |
Case | Be Quiet! Silent Base 802 |
Power Supply | Seasonic PRIME-GX-1000 |
You're all over the place. Your last post was about games, not productivity. In games, the 11600K and 10600K were separated by a mere 0.8% at 720p, and it was actually the 10600K that was that little bit faster than the 11600K. I'm just sticking with your own talking points.

> You are seriously mistaken. The 10-series was the last of the Skylake line, while the 11-series was a stopgap created by backporting a 10 nm design to 14 nm lithography. The differences between them are very big, both single- and multi-threaded. According to the benchmarks, the 11600K exceeds the 10500 by more than 30%. In reviews. In real life, you don't die with a 10500 in the system.

On topic:

Yes, I know, it's wrong. It's not just my wattmeter that generates errors, but everything that shows Intel as consumption-friendly. Like in this video.
System Name | Main |
---|---|
Processor | 5900X |
Motherboard | ASRock X570 Taichi |
Memory | 32GB |
Video Card(s) | 6800XT |
Display(s) | Odyssey C49G95T - 5120 x 1440 |
What do you determine as "consumption in real life"?

> At least one hour of WoT with the iGPU. If you ask the wattmeter, ~60 W average in this game, towards 70 W peak. The whole system!

You can also capture sessions of 3-4 hours with a Ryzen; let's see the consumption in real life.

The 7600 is also 18 W in the single-thread test case, which last time I checked is under 20+ W.

> All power measurements on this page are based on a physical measurement of the voltage, current and power flowing through the 12-pin CPU power connector(s), which makes them "CPU only," not "full system." We're not using the software sensors inside the processor, as these can be quite inaccurate and will vary between manufacturers.
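As a rough illustration of that kind of physical measurement (a sketch of the general idea, not W1zzard's actual tooling), CPU-only power is just voltage times current sampled at the CPU power connector, then averaged over the capture. The sample values below are invented placeholders.

```python
# Minimal sketch of connector-level power measurement: P = V * I per sample,
# then averaged over the capture. All sample values are invented placeholders.

voltage_v = [12.05, 12.03, 12.04, 12.02, 12.05]   # volts at the CPU power connector
current_a = [5.1, 5.4, 5.2, 5.6, 5.3]             # amps through the same connector

power_w = [v * i for v, i in zip(voltage_v, current_a)]
avg_w = sum(power_w) / len(power_w)
peak_w = max(power_w)

print(f"Average CPU power: {avg_w:.1f} W, peak: {peak_w:.1f} W")
```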
System Name | THU |
---|---|
Processor | Intel Core i5-13600KF |
Motherboard | ASUS PRIME Z790-P D4 |
Cooling | SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2 |
Memory | Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank) |
Video Card(s) | MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V) |
Storage | Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB |
Display(s) | LG OLED C8 55" + ASUS VP229Q |
Case | Fractal Design Define R6 |
Audio Device(s) | Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506 |
Power Supply | Corsair RM650 |
Mouse | Logitech M705 Marathon |
Keyboard | Corsair K55 RGB PRO |
Software | Windows 10 Home |
Benchmark Scores | Benchmarks in 2024? |
> What do you determine as "consumption in real life"?

W1zzard already has 3 different power consumption test cases that actually test the CPU power rather than relying on the software monitor (a combination of those is somewhat close to my real usage).

> The 7600 is also 18 W in the single-thread test case, which last time I checked is under 20+ W.

Strangely enough(?), the lowest-consumption single-thread CPU is the 5700G...
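Tying the "consumption in real life" question back to the wattmeter sessions mentioned above: the simplest way to settle it is to log wall readings over a long gaming session and reduce them to an average, a peak and an energy figure. A minimal sketch, with invented readings standing in for a real multi-hour log:

```python
# Reduce a logged wall-meter session to average power, peak power and energy.
# The readings are invented placeholders; a real session would have many more.

sample_interval_s = 60                                        # one reading per minute
wall_readings_w = [62, 65, 70, 68, 63, 66, 64, 69, 67, 65]    # whole-system watts

avg_w = sum(wall_readings_w) / len(wall_readings_w)
peak_w = max(wall_readings_w)
energy_wh = sum(w * sample_interval_s for w in wall_readings_w) / 3600

print(f"Average: {avg_w:.0f} W, peak: {peak_w} W, energy: {energy_wh:.1f} Wh")
```

Logged this way, a 3-4 hour Ryzen session and an equivalent Intel one can be compared on the same terms, which is the only comparison that actually answers the "real life" question.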