
AMD Ryzen 9 7950X3D Runs First Benchmarks

Joined
Nov 15, 2020
Messages
930 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
Geekbench is pointless for testing what these chips are made to do.

I currently have 3 choices:

1. Asus X670E Strix : If I want separation with my marriage
2. Asus B650E Strix : If I want my family to continue
3. MSI X670E Ace Max: If I want a full blown divorce.

Hopefully the costs will keep coming down though, as the cheapest of those boards is $500 Canadian and the Ace Max is a cool $999 Canadian.
I like my Gigabyte Extreme board. :)
 
Joined
Mar 17, 2011
Messages
159 (0.03/day)
Location
Christchurch, New Zealand
I once had an i7-980X at a 4.3 GHz all-core OC that lasted 10 years and could play Metro Exodus at extreme settings at 3440×1440 (no RT) with a 1080 Ti FTW3 at 60 fps. The best way to make a CPU last almost a decade, as in my case, is to upgrade the resolution. Unfortunately, 8K gaming is nowhere near within grasp due to the stagnation of display technology and the slow adoption of the DP 2.x standards. Even without upgrading your monitor's resolution, you can always downscale from a higher rendered resolution to a lower one to further improve image quality and keep yourself GPU-bound longer, as with Nvidia's DSR, DLDSR and DLAA AI anti-aliasing, and AMD's Virtual Super Resolution. As GPUs become more powerful, I believe these downscaling techniques will become more popular, especially when there is performance left on the table.
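The render-high-then-average idea behind DSR/DLDSR/VSR can be shown with a toy box filter. This is a minimal sketch of the concept only, not any vendor's actual (much smarter) filter:

```python
# Toy sketch of supersampled downscaling (the idea behind DSR/VSR):
# render at 2x resolution, then average each 2x2 block down to one pixel.
def downscale_2x(image):
    """Box-filter a 2D grid of luminance values to half resolution."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A 4x4 "render" becomes a smoother 2x2 output.
hi_res = [
    [0, 0, 4, 4],
    [0, 0, 4, 4],
    [8, 8, 2, 2],
    [8, 8, 2, 2],
]
print(downscale_2x(hi_res))  # [[0.0, 4.0], [8.0, 2.0]]
```

Each output pixel carries information from four rendered samples, which is why edges look cleaner than at native resolution.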

Many people talk down buying high-end PC gear, but I want good performance and a long time between PC upgrades. The bedroom PC I built in Jan 2009 runs Battlefield 4 very smoothly. A couple of years after BF5 arrived, I swapped its CPU for a (6-core) Xeon X5690 to get BF5 running just as smoothly. If it gets old and crusty, replace it with a Xeon. However, your i7-980X is probably in the ballpark of an X5690 already, which just goes to show that paying for the good stuff from the get-go means all one has to care about for many years to come is buying a new graphics card.

I might try downscaling to see if it improves BF2042 performance on the 2009 PC. Runs like a slideshow atm.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,861 (3.87/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
However, your i7-980X is probably in the ballpark of an X5690 already,
It’s the same CPU.

The 980X is actually better, because it has memory dividers that can be useful if you have good memory.
 

Sara bnt yazn

New Member
Joined
Jan 13, 2021
Messages
4 (0.00/day)
The 5800X3D was a different kind of processor: its clock speed was much lower than the 5800X's, but its performance in games was higher by a large margin. Now the 7900 and 7950 processors will come at the same speeds as the original X versions, and in the future the differences between processors in games will be like the differences between graphics cards, and God knows best what people will invent.
 
Joined
Jun 14, 2020
Messages
3,536 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
The 7800X3D will probably beat the 13900KS in most CPU-bound games, while costing half as much and using far less power.
And then you compare it to a 13600K and realize it only beats that by 10%, while costing way more and being way slower in everything but games. Yikes.

We are in an era where Intel has taken over the market by selling a higher number of cores, most of them usually E-cores. For many, 24 cores are 24 cores. They close their eyes to the fact that 16 of those are E-cores, and never, I mean NEVER, ask themselves: "What if we had 24 P-cores? What difference in performance would we have witnessed?"
Now AMD comes along with X3D chips that will probably win many gaming benchmarks, and because the non-X3D chips are already on the market and we know what those chips can do with an unlocked TDP, people try to find flaws to paint a negative image of the X3D chips. Intel's use of E-cores might translate to a 50+% degradation in multi-core performance compared to a hypothetical all-P-core chip, but no, we should start a revolution because the X3D chips will be 10% slower in the game of... Cinebench.

These chips are what people were asking for from AMD: integrate the X3D cache into the high-end models, not just the 8-core model. AMD did it, and now all we have to do is wait and see whether that extra cache and the benchmark results in games can justify those prices.
The difference is that E-cores make the CPU cheaper (less die space) while providing MORE MT performance, whereas the 3D cache makes the CPU more expensive while hindering performance in everything but games. So not really a good analogy.
 

armit

New Member
Joined
Feb 21, 2023
Messages
5 (0.01/day)
AMD processors are very unstable. Even if you record phenomena such as freezes and interruptions on video and send it to the service centre, they judge the defective product to be a normal one, because in the case of a persistent defect the AMD CPU would have to be refunded under the Consumer Protection Act. Intel, in the same situation, issued a recall.

In the end, I discarded the Ryzen and ASUS B450 board and replaced them with an Intel CPU and board, and there have been no problems since, except for heat.
 
Last edited:
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
The Ryzen CPUs with stacked cache will have lower clock frequencies than their regular counterparts until the stacked cache die is moved below the main die. That is possible: AMD's MI300 already does it.
 
Joined
Apr 30, 2011
Messages
2,716 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Something else that most people tend to ignore or forget: the die without the 3D cache will not be hampered in clocks, so in all apps apart from the heavily multi-threaded ones, the CPU will match the 7950X, and in cache-sensitive ones it will easily beat it. Personally, I cannot find any con in the 7900X3D and 7950X3D CPUs, especially with their power limits being absolutely sensible.
 

imeem1

New Member
Joined
Jun 15, 2022
Messages
17 (0.02/day)
So, around 5% slower outside of gaming vs. the 7950X. Would it in theory be 5% faster in gaming?
 
Joined
Sep 4, 2022
Messages
348 (0.41/day)
So, around 5% slower outside of gaming vs. the 7950X. Would it in theory be 5% faster in gaming?
It might have better 0.1% lows and less frame-time variance. Even if the average gain is only a 5% delta, if the 0.1% lows are much better, then even at higher practical resolutions like 4K it will be a win, IMO.
 
Joined
Dec 10, 2022
Messages
486 (0.65/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I really think that AMD made a huge mistake by creating the R9-7900X3D and R9-7950X3D. We've already seen from tests of the R7-5800X3D that the increased cache does little to nothing to improve productivity performance, while the reduced (and restricted) clock speeds do a lot to hinder it. I think that people who buy these APUs are going to be unhappy with them because, generally, gamers (almost) never buy 12 or 16-core chips; we (almost) always buy 6 or 8 cores. I fear that the R9 X3D APUs will end up being a bit of a debacle for AMD because I don't see them selling well at all. Prosumers are, on average, more tech-savvy than gamers. Sure, there are gamers who are every bit as tech-savvy as any prosumer, but there are also the kids who game on Alienwares and don't have a clue about PC tech. Besides the lack of sales that I foresee, those who do take the chance and buy one won't be satisfied with it, which is especially bad considering the price.

There's also the huge opportunity that AMD missed by not having an R5-7600X3D. We've seen that X3D can have huge performance advantages in gaming and, while I'm sure that the R7-7800X3D will sell well, AMD could've been the undisputed king of the gaming CPU market with an R5-7600X3D. Any competition from Intel in that space would've been a joke at best. AMD might change its mind and end up releasing an R5-7600X3D but by then it might be too little, too late because if gamers are forced to buy something else, like the R5-7600(X) or i5-13500K, they're not going to throw money down on a new (and more expensive) APU to get that extra performance after they've already shelled out once.

I believe that AMD's product choices with regard to the Ryzen 7000-series X3D parts have been based on greed and that greed is going to cost them in the long-term. When a decision like this is made for short-term benefit, sooner or later the future comes along and plants a big, juicy bite right on the buttocks of the company that made said decision. It happened to Intel because of their refusal to offer more than four cores in a CPU for too long and I foresee it happening to AMD. They just never seem to learn, eh?
 
Last edited:
Joined
Jul 30, 2019
Messages
3,338 (1.69/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
There's also the huge opportunity that AMD missed by not having an R5-7600X3D. We've seen that X3D can have huge performance advantages in gaming and, while I'm sure that the R7-7800X3D will sell well, AMD could've been the undisputed king of the gaming CPU market with an R5-7600X3D. Any competition from Intel in that space would've been a joke at best. AMD might change its mind and end up releasing an R5-7600X3D but by then it might be too little, too late because if gamers are forced to buy something else, like the R5-7600(X) or i5-13500K, they're not going to throw money down on a new (and more expensive) APU to get that extra performance after they've already shelled out once.
I suspect a 7600X3D would have cut too close into 5800X3D sales. If you were already on AM4 a 5800X3D would be the final stop for many looking for an upgrade that would be worth it and who don't need or necessarily want more cores. Maybe it's greed or perhaps they are trying to hit some target that makes something look good on paper because they know they are going to take some losses with current trends going into 2023.
 
I suspect a 7600X3D would have cut too close into 5800X3D sales. If you were already on AM4 a 5800X3D would be the final stop for many looking for an upgrade that would be worth it and who don't need or necessarily want more cores.
I honestly don't know how an R5-7600X3D would affect R7-5800X3D sales because they're on completely different platforms. The only people who would be interested in the 5800X3D would be people who already own AM4. There's no way to make the performance of the R5-7600X3D over the R7-5800X3D worth the cost of an expensive B650 motherboard and new DDR5 RAM. I don't think that it would have any effect on people who already own AM4 because the 5800X3D is already amazing and no new motherboard or RAM is required. OTOH, people who don't already own AM4 aren't interested in the 5800X3D because AM4 is a dead platform. If they're going to invest new money, they're going to go for the AM5 APU over the AM4 CPU. The R7-5800X3D is only worth it if you already have an AM4 motherboard while an R5-7600X3D would only be worth it if you didn't already have an AM4 motherboard. There's no way that could possibly change because the adoption cost of AM5 isn't cheap and people will either want to put off paying for AM5 as long as possible or would be averse to spending money on a dead platform instead of investing in the new one. I really think that this is a non-issue.

Besides, even if you were right, all that would mean to AMD is more motherboard sales and I'm sure that wouldn't be considered a bad thing. I don't see that being their reasoning.
Maybe it's greed or perhaps they are trying to hit some target that makes something look good on paper because they know they are going to take some losses with current trends going into 2023.
That sounds a lot more plausible but if it's true, AMD's greed is causing them to be galactically stupid here.

I say this because I believe the R9-7900X3D and R9-7950X3D to be complete wastes of time and resources on AMD's part. As a rule, people who buy 12 and 16-core R9s don't buy them primarily for gaming. Sure, they might game a bit on them, but their primary use will always be productivity. The only people who buy those specifically for gaming are people with more money than brains, and while such people surely exist, there aren't enough of them to make a product successful.

We've all seen from the tests of the R7-5800X3D that the cheaper R7-5800X beats it in productivity because of its faster stock clocks and its ability to be overclocked. The 3D cache has a net-negative impact on productivity because, while very few (if any) productivity suites benefit from the extra cache, they all suffer from the clock-speed restrictions. People looking for a gaming APU will choose the R7-7800X3D, while people looking for a productivity APU will choose the R9-7900X or R9-7950X. After all, why would anyone pay more for a product that they know will be inferior for their purposes than an APU that costs a good deal less?

If your answer is "nobody" then you win the prize. In this case, the prize is AMD losing a crap-tonne of money when they could've made an absolute killing with an R5-7600X3D. The very expensive (not just to buy but to produce) R9-7900X3D and R9-7950X3D will gather dust on the shelves and AMD will be forced to take a huge loss on them. Even worse, AMD's brand-image will be damaged because knowingly bringing a useless product to market which will inevitably result in returns from unsatisfied customers is easily the best way to drag your own name through the mud. I honestly can't believe that Lisa Su signed off on it because she's a lot smarter than this.
 
The very expensive (not just to buy but to produce) R9-7900X3D and R9-7950X3D will gather dust on the shelves and AMD will be forced to take a huge loss on them. Even worse, AMD's brand-image will be damaged because knowingly bringing a useless product to market which will inevitably result in returns from unsatisfied customers is easily the best way to drag your own name through the mud.
The difference between the 7950X3D and the 7950X is rather small in productivity; the higher-clocked CCD helps there. The difference is almost entirely due to the 7950X's higher power limit, which makes the 7950X3D a much saner product. Moreover, if you're buying a CPU for a work-related multithreaded application, you should look at the benchmarks for that particular application. For some scientific workloads, the 7950X3D is significantly faster than the 7950X; for other workloads, the 13900K is faster. As always, for specialized workloads, don't look at the general rating; look at workloads similar to yours.

(attached: application benchmark chart)
 
I honestly don't know how an R5-7600X3D would affect R7-5800X3D sales because they're on completely different platforms. The only people who would be interested in the 5800X3D would be people who already own AM4. There's no way to make the performance of the R5-7600X3D over the R7-5800X3D worth the cost of an expensive B650 motherboard and new DDR5 RAM. I don't think that it would have any effect on people who already own AM4 because the 5800X3D is already amazing and no new motherboard or RAM is required. OTOH, people who don't already own AM4 aren't interested in the 5800X3D because AM4 is a dead platform. If they're going to invest new money, they're going to go for the AM5 APU over the AM4 CPU. The R7-5800X3D is only worth it if you already have an AM4 motherboard while an R5-7600X3D would only be worth it if you didn't already have an AM4 motherboard. There's no way that could possibly change because the adoption cost of AM5 isn't cheap and people will either want to put off paying for AM5 as long as possible or would be averse to spending money on a dead platform instead of investing in the new one. I really think that this is a non-issue.

Besides, even if you were right, all that would mean to AMD is more motherboard sales and I'm sure that wouldn't be considered a bad thing. I don't see that being their reasoning.

Just spitballing some numbers: $350 for a 5800X3D (the Newegg sale right now is $310, but ignore that for a moment).

About $570 for AM5 with DDR5 and a 7600X3D (estimated).

$570 - $350 = $220. When the cheaper AM5 boards come out and price reductions in DDR5 knock off at least $100, you're close to $120.

If you are on an older AM4 300- or 400-series board with Zen, Zen+, or Zen 2, it might be more appealing to jump to the growing AM5 platform with a 7600X3D for the minor difference in price and a significant bump in performance.

But since it's not available, AMD can milk the 5800X3D for a bit longer. Motherboard sales are in a bit of a slump right now, so AMD is in a unique position to move CPU inventory on AM4 with CPU upgrades.
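The back-of-the-envelope math above, as a quick sanity check (all figures are this post's rough estimates, not quotes):

```python
# Rough upgrade-cost comparison using the estimated figures from the post.
# All prices are placeholder estimates in dollars, not real quotes.
am4_5800x3d = 350              # 5800X3D drop-in upgrade, CPU only
am5_bundle_now = 570           # hypothetical 7600X3D + AM5 board + DDR5 today
expected_platform_drop = 100   # assumed future board/DDR5 price cuts

premium_now = am5_bundle_now - am4_5800x3d
premium_later = premium_now - expected_platform_drop

print(premium_now)    # 220
print(premium_later)  # 120
```

So under those assumptions, the AM5 route costs about $220 extra today, shrinking toward $120 as platform prices fall.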
 
Joined
Feb 20, 2020
Messages
9,340 (5.28/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
We are in an era where Intel has taken over the market by selling a higher number of cores, most of them usually E-cores. For many, 24 cores are 24 cores. They close their eyes to the fact that 16 of those are E-cores, and never, I mean NEVER, ask themselves: "What if we had 24 P-cores? What difference in performance would we have witnessed?"
Now AMD comes along with X3D chips that will probably win many gaming benchmarks, and because the non-X3D chips are already on the market and we know what those chips can do with an unlocked TDP, people try to find flaws to paint a negative image of the X3D chips. Intel's use of E-cores might translate to a 50+% degradation in multi-core performance compared to a hypothetical all-P-core chip, but no, we should start a revolution because the X3D chips will be 10% slower in the game of... Cinebench.

These chips are what people were asking for from AMD: integrate the X3D cache into the high-end models, not just the 8-core model. AMD did it, and now all we have to do is wait and see whether that extra cache and the benchmark results in games can justify those prices.
Hi,
Yeah, those E-threads are worse than Skylake-X 79-series thermals, but at least that was easy to delid :laugh:
 
The difference between the 7950X3D and the 7950X is rather small in productivity; the higher-clocked CCD helps there. The difference is almost entirely due to the 7950X's higher power limit, which makes the 7950X3D a much saner product. Moreover, if you're buying a CPU for a work-related multithreaded application, you should look at the benchmarks for that particular application. For some scientific workloads, the 7950X3D is significantly faster than the 7950X; for other workloads, the 13900K is faster. As always, for specialized workloads, don't look at the general rating; look at workloads similar to yours.

View attachment 285719
The problem is that it doesn't change the price or the lack of value. All anyone has to do if they want sane power draw and temps from an R9-7950X is enable Eco mode, which, as you can see, completely nullifies the advantage you refer to. If I'm a prosumer who wants a processor for productivity, I'm going to choose the R9-7950X because it's a lot less expensive and performs better in productivity workloads, even if the difference isn't huge. I can then decide whether I want Eco mode or all-out performance and/or overclocking, while paying less for it. If I also want to game with it, that's no problem either, because the R9-7950X is still a top-tier gaming APU, matching the i9-12900K for much less money than the R9-7950X3D.

Even at 1080p (a resolution that nobody will game at anyway), the gaming performance difference between the R9-7950X and R9-7950X3D would be imperceptible while the productivity performance difference would be quite obvious. Therefore, it's a bad product for prosumers.

For gamers, it's an even worse value proposition because the R7-7800X3D is coming out. The reason we gamers only choose 6 or 8 cores is that everything beyond that is a bunch of expensive cores sitting idle and eating expensive power for no reason.

The title of Steve Walton's review says it all and confirms everything that I've said about AMD's moronic product choices:

AMD Ryzen 9 7950X3D Review: Gamers, Don't Buy This One!

Check Out Our 7800X3D Simulated Results Instead

If you're wondering what "Simulated 7800X3D Results" are: Steve did something pretty ingenious by disabling the R9-7950X3D's conventional CCX so that the APU was only using the CCX with the 3D cache on it, to simulate the performance of the R7-7800X3D. I've seen this done before for similar purposes, but I didn't see any other reviewer try it with the R9-7950X3D. Since Steve did, though, we can see just how terrible a product the R9-7950X3D is.


Now, simulations like this are never 100% accurate because the standalone chip usually performs better than the one that was cut in half. This only makes things worse for the R9-7950X3D. I actually expected this because the 3D cache makes CPU and RAM speeds essentially irrelevant (within the same generation), so even the R7-7800X3D running at lower clock speeds than the R9-7900X3D/7950X3D won't matter. The R9-7900X3D will be a real dumpster fire because it will have only six cores in its 3D-imbued CCX and will therefore perform in games like an R5-7600X3D would.

I said before that AMD made a colossal mistake by creating R9 APUs instead of R5 APUs with 3D cache. I caught A LOT of flak from fools who can't see the bigger picture but I didn't care because you can't fix stupid. I said that I hoped I was wrong but I knew that I wasn't because AMD isn't magic and you can't make a CPU that's a "best choice" for both gaming and productivity at the same time. The vindication is bittersweet though because it doesn't change the fact that every member of AMD's executive leadership belongs in Arkham Asylum for this.

AMD made the most galactically stupid decision I've ever seen them make by producing two APUs that couldn't succeed (the R9-7900X3D and R9-7950X3D) instead of one that couldn't fail (an R5-7600X3D). This is not a new concept; it really didn't take a genius to foresee, and I don't know why so few people did.
 
Last edited:
Even at 1080p (a resolution that nobody will game at anyway),
I think this is incorrect. Plenty of people will game at 1080p for reasons I won't bother to enumerate. 1080p gaming is probably still the reality for most people because of their hardware, the game, and/or the desired experience.
For gamers, it's an even worse value proposition because the R7-5800X3D is coming out. The reason we gamers only choose 6 or 8 cores is because all you get beyond that is a bunch of expensive cores sitting idle and eating expensive power for no reason.
I think the 7950X3D will make a lot of sense for some people, not just for gaming but for gaming while streaming or doing other activities. You can still pin your games to the 3D-cached cores while streaming and other tasks run on the remaining cores. From that vantage point it's pretty useful, especially in a multi-monitor setup.
Just yesterday I was playing TF2 on my 5950x with YouTube music playing and browsing the web between respawns. To my understanding, the cores don't eat expensive power as you describe when they aren't being used.
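The pinning described above can be sketched in a few lines. This is a minimal illustration only, assuming the V-Cache CCD is CCD0, that logical CPUs are numbered CCD-by-CCD, and that SMT is on (none of which this thread confirms); it is not AMD's actual scheduling mechanism, which on Windows is handled automatically by the chipset driver:

```python
# Sketch only: pin a process to the 3D V-Cache CCD on a 7950X3D.
# Assumptions (not confirmed in this thread): the V-Cache CCD is CCD0,
# logical CPUs are numbered CCD-by-CCD, and SMT is enabled, so CCD0 owns
# logical CPUs 0-15. `pid` is whatever game process you want to pin.
import os

def ccd_mask(ccd: int, cores_per_ccd: int = 8, smt: bool = True) -> set[int]:
    """Logical-CPU ids belonging to one CCD under the numbering above."""
    width = cores_per_ccd * (2 if smt else 1)
    start = ccd * width
    return set(range(start, start + width))

def pin_to_vcache_ccd(pid: int) -> None:
    # Linux-only call; on Windows you'd use Task Manager's affinity dialog,
    # `start /affinity`, or the SetProcessAffinityMask API instead.
    os.sched_setaffinity(pid, ccd_mask(0))
```

Under the same assumptions, `ccd_mask(1)` gives the frequency CCD's logical CPUs (16-31) for pinning streaming and background tasks.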
Now, simulations like this are never 100% accurate because the standalone chip usually performs better than the one that was cut in half.
I'm not so sure about that, and I think the margins of difference would be really small, if any. I have a 3800x and a 3950x, and cutting the 3950x in half, from what I recall, basically equaled the 3800x in the performance benchmarks I did on my machines at the time. That's not to say the same will happen with the 7800x3D/7950x3D, but I think it's a reasonable expectation. Testing will bear out whether or not that expectation was realistic, but I wouldn't expect there to be a huge difference.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
If you're wondering what "Simulated 7800X3D Results" are, Steve did something pretty ingenious by disabling the R9-7950X3D's conventional CCD so that the chip was only using the CCD with the 3D cache on it, to simulate the performance of the R7-7800X3D. I've seen this done before for similar purposes, but I didn't see any other reviewer try this with the R9-7950X3D. Since Steve did, though, we can see just how terrible a product the R9-7950X3D is:
@W1zzard also simulated the 7800X3D by disabling a CCD. As for the rest of your points, I agree that it doesn't make sense for a gaming-only CPU. However, I disagree on the pointlessness of the 7900 X3D and the desirability of the 7600 X3D. If the former is pointless, then a 7600 X3D would be pointless too. As far as productivity is concerned, for most users, the 7950X is the better option. However, productivity isn't a one-size-fits-all case, and there are non-gaming workloads where the 7950 X3D dominates the 7950X. For productivity, you should always research what suits your application and buy that.

The 7950 X3D isn't stupid; it's appealing to the braggarts and the spendthrifts among us. As Nvidia has proven, that is a winning strategy. It's also suitable for those who have workloads that need the multithreaded performance and want to both game and work on the same system.
 
Joined
Dec 10, 2022
Messages
486 (0.65/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I think this is incorrect. Plenty of people will game at 1080p for reasons I won't bother to enumerate. 1080p gaming is still the reality for most people because of their hardware, the games they play, and/or the experience they want.
You didn't correctly read what I wrote. I said EVEN at 1080p..... which INCLUDES 1080p.
I think the 7950X3D will make a lot of sense for some people, not just for gaming but for gaming while streaming or doing other activities.
No, it really won't.
You can still pin your games to the 3D-cached cores while streaming and other tasks run on the remaining cores. From that vantage point it's pretty useful, especially in a multi-monitor setup.
You can already do that with an R9-7950X for $100 less. It has the same gaming performance as the i9-12900K, which makes it an incredible gaming APU in its own right and definitely NOT in need of an increase in gaming performance. The R9-7950X would actually be MORE suitable for that because it's already so fast in gaming that adding the 3D cache won't make any appreciable difference no matter what resolution you use:


You didn't think that just because AMD released this stupid abomination R9 X3D APU that suddenly the R9-7950X started to suck at gaming, did you? :laugh:
Just yesterday I was playing TF2 on my 5950x with YouTube music playing and browsing the web between respawns. To my understanding, the cores don't eat expensive power as you describe when they aren't being used.
I'm afraid that you're wrong. As long as they are active, they do eat power, just not as much as when they're being used. It's called "Idle power draw" and it has always been a thing. The only way that they don't use power is if they're disabled in the BIOS.
I'm not so sure about that, and I think the margins of difference would be really small, if any.
That's exactly the point. The margins will be similar, and they could easily price an R5-7600X3D at the same price as or higher than the R7-7700X, which would make it even MORE profitable, because I can guarantee you that a little extra cache silicon doesn't cost AMD very much.
I have a 3800x and a 3950x, and cutting the 3950x in half, from what I recall, basically equaled the 3800x in the performance benchmarks I did on my machines at the time. That's not to say the same will happen with the 7800x3D/7950x3D, but I think it's a reasonable expectation. Testing will bear out whether or not that expectation was realistic, but I wouldn't expect there to be a huge difference.
I would actually expect the R7-3800X to have performed slightly better than the R9-3950X with one CCD disabled.
 
Joined
Jul 30, 2019
You didn't correctly read what I wrote. I said EVEN at 1080p..... which INCLUDES 1080p.
You also said "(a resolution that nobody will game at anyway)" hence my reply to that.

You didn't think that just because AMD released this stupid abomination R9 X3D APU that suddenly the R9-7950X started to suck at gaming, did you? :laugh:
Of course not. I'm not sure what you are driving at here.
 
Joined
Dec 10, 2022
@W1zzard also simulated the 7800X3D by disabling a CCD. As for the rest of your points, I agree that it doesn't make sense for a gaming-only CPU. However, I disagree on the pointlessness of the 7900 X3D and the desirability of the 7600 X3D. If the former is pointless, then a 7600 X3D would be pointless too.
You're 100% wrong. If what you say here were true, then there would be no R7-5800X3D and no R7-7800X3D. The 3D cache is beneficial in GAMING and ONLY GAMING. Therefore, a CPU like the R7-5800X3D or an APU like the R7-7800X3D or R5-7600X3D would be extremely attractive to gamers (as the R7-5800X3D has been) because gamers tend to buy 6- or 8-core processors.
As far as productivity is concerned, for most users, the 7950X is the better option. However, productivity isn't a one-size-fits-all case, and there are non-gaming workloads where the 7950 X3D dominates the 7950X.
You think that the odd win of no more than 10% is "domination"? I have seen nothing in any of these tests that says "This APU is worth $100 more than the R9-7950X", and neither have you; you just don't realise it.
The 7950 X3D isn't stupid; it's appealing to the braggarts and the spendthrifts among us. As Nvidia has proven, that is a winning strategy.
No, nVidia hasn't proven that at all. What nVidia has proven is that if your competition has historically had unstable drivers, as long as you make sure that yours are stable, people will flock to you because they don't know better. THEN you can charge whatever you want.

If AMD has made a very limited number of these, then sure, it's not that bad. The fact that they didn't make an R5-7600X3D, an APU that would have guaranteed their dominance over Intel in the gaming space, IS that bad. I'm not even saying this for me, because I HAVE an R7-5800X3D which means that I have no interest in buying anything from the first generation of AM5. What AMD did is push god-only-knows how many consumers into Intel's arms instead of releasing an R5-7600X3D and getting thousands more people onto the AM5 platform with it, guaranteeing future APU sales because of their platform philosophy that started with AM4.

You're either relatively new to PC tech or you just don't pick up on historical patterns. That doesn't mean they're not there.
It's also suitable for those who have workloads that need the multithreaded performance and want to both game and work on the same system.
Again, you're only proving how little you understand PC tech, because the R9-7950X can already do that. It has the gaming performance of the i9-12900K, and any attempt to increase its gaming performance will go unnoticed because of how good it already is. As I said to another person, do you think that just because this X3D abomination came out, the R9-7950X suddenly started to suck at gaming? I honestly wonder if you're both young, because nobody with any real level of expertise would use such broken arguments. As if you need to spend an extra $100USD just to be able to do something that can already be done with the extant R9-7950X. Only a newbie would think that, and I don't say that to be insulting; I say it because it's true. Anyone with any degree of experience and expertise immediately sees the R9-7950X3D for what it is: a shameless and useless cash-grab. That you would defend it with a bunch of broken arguments says a lot more about you than it does about AMD.

That you would say that the utility of 3D cache in the 6-core R5-7600X is no different from its utility in the 16-core R9-7950X means that you really don't have much understanding of PC tech, how it's used, and what is good for what. I honestly can't take you seriously after these words.
 
Joined
Jul 30, 2019
I'm afraid that you're wrong. As long as they are active, they do eat power, just not as much as when they're being used. It's called "Idle power draw" and it has always been a thing. The only way that they don't use power is if they're disabled in the BIOS.
My understanding was that with Zen-series cores (default configuration), when you weren't using a core it was parked and hence not active, thus not using any meaningful power. I could be wrong.

I would actually expect the R7-3800X to have performed slightly better than the R9-3950X with one CCD disabled.
Well, to my recollection, CCD1 was on par with my 3800x and CCD2 was a tad slower.
 
Joined
Dec 10, 2022
My understanding was that with Zen-series cores (default configuration), when you weren't using a core it was parked and hence not active, thus not using any meaningful power. I could be wrong.


Well, to my recollection, CCD1 was on par with my 3800x and CCD2 was a tad slower.
Well, you're wrong. Even parked cores use some power, just very little. Being parked is like sleep, not shutdown.
 