# AMD Ryzen 7 5800X3D



## W1zzard (Apr 12, 2022)

The AMD Ryzen 7 5800X3D is the company's new flagship gaming processor. It introduces 3D V-Cache, a dedicated piece of silicon with additional L3 capacity. In our review, we're testing how much the larger cache helps in intensive gaming workloads and applications, and we compare it to the Intel Core i9-12900KS too, of course.



----------



## Wavetrex (Apr 12, 2022)

Factorio:


FactorioBox Results


Beats Alder Lake and is also 48% (!) faster than its non-V-Cache brother.


----------



## dgianstefani (Apr 12, 2022)

Temptttting


----------



## zlobby (Apr 12, 2022)

A new gaming king is here!


----------



## han32 (Apr 12, 2022)

That power consumption is really good.


----------



## Bloax (Apr 12, 2022)

"1.35v is the highest the vcache supports, therefore the maximum possible frequency is 4.5 GHz"

yeah that's cute, tell me more

Hopefully AM5 won't have these.. Funny features, let's call them that, yes.


----------



## GoldenX (Apr 12, 2022)

W1zzard, use Curve Optimizer on this one; it should help keep clock speeds higher under heavy thread use.
Does it support the +200 MHz of AutoOC?


----------



## billeman (Apr 12, 2022)

Nice processor if you want to game at 720p/1080p


----------



## _Flare (Apr 12, 2022)

cite: [...]
Quick Word on voltages – AMD Ryzen 7 5800X3D

As you saw, I rushed a little more than I do with regular reviews, so data is being added through the day. Someone did ask me about OC, and while we found out that a lot of the _AMD OC menu is locked up (no Curve Optimizer)_, it made me wonder about voltage behavior in single-thread and multi-thread situations.

Hence I did a *Cinebench R23* run (in both scenarios), and the CPU behaves very differently from a Zen 3 non-*3D V-CACHE* CPU.

According to the *GIGABYTE ITE* sensor and *SVI2 TFN*, VCore in the CBR23 single-core test runs at around 1.20-1.218 V. That is way lower than most Zen 3 chips, which favor high voltage to keep boost high under single-core loads.

Meanwhile, in the *CBR23 multi-core test*, VCore is actually higher, at around *1.248 V*. Zen 3 with 3D V-CACHE, in this retail sample, behaves differently from regular Zen 3 CPUs.
[...]
source: https://xanxogaming.com/reviews/amd-ryzen-7-5800x3d-review-the-last-gaming-gift-for-am4/


----------



## btk2k2 (Apr 12, 2022)

In the perf/$ charts do you include the cost of the ram and mobo or is it just CPU prices compared?


----------



## W1zzard (Apr 12, 2022)

GoldenX said:


> W1zzard, use Curve Optimizer on this one, should help keep clock speeds higher on high thread use.
> Does it support the +200MHz of AutoOC?

No OC, no PBO, no multiplier changes, no BCLK changes





btk2k2 said:


> In the perf/$ charts do you include the cost of the ram and mobo or is it just CPU prices compared?


Just the CPU price; a motherboard for this CPU costs only around €40 (A320 chipset)


----------



## Xuper (Apr 12, 2022)

What a nice jump just by adding 64 MB of cache!


----------



## Shatun_Bear (Apr 12, 2022)

Very impressive. Not sure why they are stealth-launching this; it matches or beats the limited-edition, behemoth, power-guzzling 12900KS for a much, much lower price.

Low stock maybe?


----------



## hoosierboy8807 (Apr 12, 2022)

Can't wait till AM5


----------



## ARF (Apr 12, 2022)

hoosierboy8807 said:


> cant wait til am5



It has entered mass production which means in the next weeks we will see the official launch.

Nice uplift in Far Cry 5, up to 35% at 720p, but besides that, it's meh.
Totally irrelevant at 2160p today.


----------



## billeman (Apr 12, 2022)

ARF said:


> Nice uplift in Far Cry 5, up to 35% at 720p, but besides that, it's meh.
> Totally irrelevant at 2160p today.



Exactly. I have been doing some tests @4K with my new (finally!) 3080, and tbh, if I limit my 5950X to a single CCD and 6 cores/CCD (6 cores in total, like a 3600X), it doesn't make a difference.


----------



## mb194dc (Apr 12, 2022)

Interesting if you're gaming at 1080p and want to keep the power bills down.


----------



## Shatun_Bear (Apr 12, 2022)

mb194dc said:


> Interesting if you're gaming at 1080p and want to keep the power bills down.



It's just as fast at 1440p and 4K, i.e. matching the fastest gaming processor on the market.


----------



## Al Chafai (Apr 12, 2022)

billeman said:


> Exactly. I have been doing some tests @4K with my new (finally!) 3080, and tbh, if I limit my 5950X to a single CCD and 6 cores/CCD (6 cores in total, like a 3600X), it doesn't make a difference.


Everyone knows that at higher resolutions, any modern hyperthreaded 6-core CPU can almost match any high-end CPU in most gaming scenarios.


----------



## Airisom (Apr 12, 2022)

Pretty good overall, I guess. Someone needs to slap this on a B550 Unify-X to see if BCLK OC is possible. X570 is good for 101-102 MHz, while B550 can ramp up to 107 MHz.

Perhaps AGESA updates will help in the games that show little change.


----------



## btk2k2 (Apr 12, 2022)

W1zzard said:


> Just the CPU price, a motherboard for this CPU costs like 40 € only, A320 chipset



So you don't factor in the much more expensive ram the 12900K needs to get the scores shown?


----------



## jt94 (Apr 12, 2022)

btk2k2 said:


> So you don't factor in the much more expensive ram the 12900K needs to get the scores shown?


It's a CPU performance analysis, not a platform review.


----------



## Meanhx (Apr 12, 2022)

ARF said:


> It has entered mass production which means in the next weeks we will see the official launch.
> 
> Nice uplift in Farcry 5, up to 35% at 720p but besides that, it's a meh.
> Totally irrelevant at 2160p today.


Since upscaling with DLSS, and in the future FSR 2.0 and XeSS, will become more and more common, I would argue that maybe not the 720p results, but at least the 1080p results, are more relevant in CPU benchmarks than they have ever been, even for 4K gaming.
Personally, I find the 5800X3D interesting, and I might just get it as an easy replacement for the 3600 in my secondary PC. Paired with a 6800 XT for high-refresh 1440p gaming, the 5800X3D should be a decent upgrade.


----------



## btk2k2 (Apr 12, 2022)

jt94 said:


> It's a CPU performance analysis, not a platform review.



Yet the CPU does not work without the platform. They are inextricably linked.


----------



## GreiverBlade (Apr 12, 2022)

Ohhhh, interesting... it turns out that the non-OC "fiasco" CPU, as some thought it, is not much of a fiasco... in short, just the 3DV addition and Zen 3 compete with ADL, without the hybrid BS... (yeah, sorry... not a fan of Intel's hybrid implementation; I mean, big.LITTLE and hybrid are good in certain scenarios, but not all)

Well... I am impressed, although a 5700X will be good enough for me later to replace my 3600 (I mean... an 8c/16t upgrade for a lower price than the 5600X is all good!)
Glad AMD still delivers.


----------



## yeeeeman (Apr 12, 2022)

People are looking at specific games, but you should look at the overall performance. At 720p, it is 10% better than the 5800X, and it slots between the 12700K and 12900K. That is a good feat, but it is a waste of silicon, given how much space that 64 MB takes, almost a full chiplet.


----------



## demu (Apr 12, 2022)

Not the best possible memory for Ryzen (latencies other than CL are quite poor), and no information about single vs. dual rank (which may have a big impact).
Alder Lake has about the best DDR5 memory money can buy.
Ryzen:
2x 16 GB DDR4-3600
16-20-20-34 1T
Infinity Fabric @ 1800 MHz 1:1

Alder Lake:
2x 16 GB G.SKILL Trident Z5 RGB DDR5-6000
36-36-36-76 2T / Gear 2


----------



## thegnome (Apr 12, 2022)

Hmm, would putting in something like 3800 CL16 increase the performance? It would be fair, given that the DDR5 kit is pretty top-notch while the DDR4 kit is pretty slow by today's standards.


----------



## LupintheIII (Apr 12, 2022)

Congrats on the early review, and thanks for your hard work as always.
Honestly, I'm extremely curious what that 96 MB of L3$ can do in combination with 128 MB of Infinity Cache and SAM on a 6900 XT or 6800 XT (to keep conditions similar to the RTX 3080 tested here), so curious that I'm considering buying one myself to test it!
Do you think such a test could be possible?
I know it's incredibly time-consuming, but I believe it could be a great piece of content ;-)


----------



## W1zzard (Apr 12, 2022)

thegnome said:


> Hmm, would putting in something like 3800 CL16 increase the performance? It would be fair, given that the DDR5 kit is pretty top-notch while the DDR4 kit is pretty slow by today's standards.


Yes, but the CPU can't run DDR4-3800 at 1:1, max FCLK is 1866 MHz
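For readers unfamiliar with the 1:1 rule, here is a minimal sketch of the arithmetic (the function name is just illustrative): DDR4 is double data rate, so its memory clock is half the transfer rate, and for 1:1 operation the Infinity Fabric clock (FCLK) must match that memory clock.

```python
# Sketch of the Zen 3 1:1 (FCLK = MEMCLK) rule: DDR4 transfers twice per
# memory-clock cycle, so DDR4-3800 needs MEMCLK 1900 MHz, and 1:1 operation
# then needs FCLK 1900 MHz as well.
def max_ddr4_at_1to1(max_fclk_mhz: int) -> int:
    """Highest DDR4 transfer rate that still allows FCLK:MEMCLK = 1:1."""
    return max_fclk_mhz * 2

print(max_ddr4_at_1to1(1866))  # 3732 -> DDR4-3733 is the practical ceiling here
print(max_ddr4_at_1to1(1900))  # 3800 would need FCLK 1900, which didn't POST
```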


----------



## grammar_phreak (Apr 12, 2022)

Could you add Far Cry 6?
It seems Alder Lake does really well in Far Cry 6, and I wanted to see how the 5800X3D compares.


----------



## Selaya (Apr 12, 2022)

Is, like, everything about it locked? Can't change any voltages at all? Like the IOD ones related to memory/FCLK; those shouldn't hurt the 3D cache, no?


----------



## LupintheIII (Apr 12, 2022)

billeman said:


> Exactly. I have been doing some tests @4K with my new (finally!) 3080, and tbh, if I limit my 5950X to a single CCD and 6 cores/CCD (6 cores in total, like a 3600X), it doesn't make a difference.


Well, you didn't need the 5800X3D review to be out to know that


----------



## darksf (Apr 12, 2022)

You're all looking at average FPS; look at the frame pacing. This CPU delivers the smoothest frame times, and its 99.9th-percentile figures are higher than the average frame rates of most of the other CPUs. That means it delivers the smoothest gameplay with fewer hiccups, no matter the resolution or the GPU.
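As a quick illustration of why percentile lows matter more than averages (toy numbers, and using the common simplification that the "1% low" is the FPS at the 99th-percentile frame time):

```python
# Toy example: average FPS can look fine while percentile lows expose stutter.
frame_times_ms = [10.0] * 99 + [100.0]  # 99 smooth frames plus one big hitch

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_1pct_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
low_1pct_fps = 1000 / worst_1pct_ms

print(round(avg_fps, 1))       # ~91.7 FPS average looks healthy
print(round(low_1pct_fps, 1))  # 10.0 FPS 1% low reveals the hitch
```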


----------



## demu (Apr 12, 2022)

W1zzard said:


> Yes, but the CPU can't run DDR4-3800 at 1:1, max FCLK is 1866 MHz


Would you please show where that is stated?
Almost all Ryzen 5000-series chips can run FCLK 1900 or 2000. Even many Ryzen 3000-series chips work at FCLK 1900 (like my TR 3960X).
A fast (dual-rank) B-die kit @3600 CL16-16-16-36, or even 3600 CL14-14-14-34, would have given quite a bit better results.


----------



## damric (Apr 12, 2022)

Very interesting, but I guess I won't be getting this since the improvement at 4K just doesn't matter over my existing 5800x.


----------



## W1zzard (Apr 12, 2022)

demu said:


> Would you please show where that is stated?
> Almost all Ryzen 5000-series chips can run FCLK 1900 or 2000. Even many Ryzen 3000-series chips work at FCLK 1900 (like my TR 3960X).
> A fast (dual-rank) B-die kit @3600 CL16-16-16-36, or even 3600 CL14-14-14-34, would have given quite a bit better results.


It's not stated anywhere. I tested it: 1866 POSTs, 1900 doesn't POST, which probably means WHEA errors at 1866


----------



## Xebec (Apr 12, 2022)

Well-written summary, and I appreciate the detailed benchmarks. I was expecting more than WinRAR to really appreciate the cache, but this is still quite interesting.

I'm really curious whether Microsoft Flight Sim will benefit from the cache.


----------



## nicamarvin (Apr 12, 2022)

W1zzard said:


> It's not stated anywhere.. I tested it, 1866 POST, 1900 no POST, which probably means WHEA errors at 1866


The BIOS you are using lacks the 3D V-Cache Optimizer Driver, and you are using an unoptimized AGESA 1.2.0.6c release. These benchmarks need to be redone when AMD releases the 3D V-Cache Optimizer Driver broadly (it is currently only available for Gigabyte motherboards), and 1usmus is stating that performance is enhanced by AGESA 1.2.0.7.


----------



## Totally (Apr 12, 2022)

Dang, should have bitten the bullet and gotten a 5900X months ago.


----------



## Zareek (Apr 12, 2022)

I'm underwhelmed for the price. With the exception of a few odd ducks, it's less than 10% faster than the 5800X for 20% more cost.


----------



## W1zzard (Apr 12, 2022)

nicamarvin said:


> The BIOS he is using is without the 3D V-Cache Optimizer Driver and uses unoptimized AGESA 1.2.0.6c. These benchmarks need to be redone when AMD releases the 3D V-Cache Optimizer Driver broadly (currently only available for Gigabyte motherboards), and 1usmus is stating that performance is enhanced by AGESA 1.2.0.7.


Not true; let's talk again in a couple of days, when other sites have posted their reviews using the same setup


----------



## Iain Saturn (Apr 12, 2022)

*Very impressive CPU.*

Easy recommendation for those two gens behind, if their motherboard supports it.

Killer Far Cry (lol) and Borderlands performance. Wow!

"Can it play Far Cry?"

Yes, yes indeed.

Bravo, AMD


----------



## FlanK3r (Apr 12, 2022)

I think OC is possible on boards with an external clock generator, like the Crosshair VI or VII Hero. But not by much.


----------



## nicamarvin (Apr 12, 2022)

W1zzard said:


> Not true, let's talk again in a couple of days when other sites posted their reviews using the same setup


It's a fact that you are using an unoptimized driver. AMD has not released the 3D V-Cache Optimizer Driver broadly (it is currently only available for Gigabyte motherboards).


----------



## W1zzard (Apr 12, 2022)

nicamarvin said:


> It's a fact that you are using an unoptimized driver. AMD has not released the 3D V-Cache Optimizer Driver broadly (it is currently only available for Gigabyte motherboards).


Nope, I used the driver that AMD recommends for reviews


----------



## btk2k2 (Apr 12, 2022)

W1zzard said:


> It's not stated anywhere.. I tested it, 1866 POST, 1900 no POST, which probably means WHEA errors at 1866



I asked over on Reddit, but might as well ask here. Any idea if that is just a dud memory controller, or is it another X3D limitation?


----------



## Xuper (Apr 12, 2022)

W1zzard said:


> It's not stated anywhere.. I tested it, 1866 POST, 1900 no POST, which probably means WHEA errors at 1866



For Alder Lake with DDR4, what's the maximum clock without getting errors in Prime95 (or in general)?


----------



## Vecix6 (Apr 12, 2022)

nicamarvin said:


> It's a fact that you are using an unoptimized driver. AMD has not released the 3D V-Cache Optimizer Driver broadly (it is currently only available for Gigabyte motherboards).


I'm curious about those new drivers, but maybe they are more useful in non-game scenarios. Here we can see a 10% increase over the 5800X even with lower clocks, so 3D V-Cache is doing its job excellently


----------



## W1zzard (Apr 12, 2022)

btk2k2 said:


> I asked over on reddit but might as well ask here. Any idea if that is just a dud mc or is it another X3D limitation?


No way to know at this time; I'm sure other reviewers will test it in a couple of days. The CPU is retail, not an ES, in case you're wondering


----------



## nicamarvin (Apr 12, 2022)

W1zzard said:


> Nope, as I said, let's talk again in a few days


I just downloaded the driver, and there was no reference to a 5800X3D Optimizer Driver.


----------



## puma99dk| (Apr 12, 2022)

@W1zzard I've heard @buildzoid point out many times that X570 won't boot if the base clock is over 100 MHz, and that's why he recommends B550 instead. Maybe it's time to test with a B550 board, then?

Great review as always; it makes me want to go back to AM4 so badly


----------



## W1zzard (Apr 12, 2022)

nicamarvin said:


> and there was no reference to a 5800X3D
> 
> View attachment 243415


you are joking, right? did you read the changelog?


----------



## nicamarvin (Apr 12, 2022)

W1zzard said:


> you are joking, right? did you read the changelog?


No, I was hoping you'd post that here?


----------



## Izzy1985 (Apr 12, 2022)

nicamarvin said:


> No, I was hoping you post that here?


Gigabyte has officially released a new AMD Chipset driver (4.03.03.624) to some of its motherboards, adding support for a new sub-driver designed specifically for the Ryzen 7 5800X3D. Gigabyte lists this driver as the "AMD 3D V-Cache Performance Optimizer Driver" in the release notes, and it is only intended for Windows 10 users.

It's important to note that these drivers come directly from Gigabyte, as AMD has not officially released chipset driver 4.03.03.624 to its support page just yet. 

The in-depth details about AMD's new "V-Cache optimizer" remain a complete mystery, with Gigabyte's new driver being the only relevant source to its existence. 

Source: https://www.tomshardware.com/news/v-cache-optimizer-driver-for-5800x-3d


----------



## InVasMani (Apr 12, 2022)

I wonder what happens to the 1% lows in just the RTRT titles when it's enabled/disabled, and what implications that might have for stronger RTRT hardware on newer generations of GPUs. At a glance, RTRT titles seem to see less immediate upside from the 3D stacked cache, at least currently, but I suspect that's mostly because the fixed-function RTRT hardware on current GPUs is the bigger limiting factor.


----------



## Izzy1985 (Apr 12, 2022)

Izzy1985 said:


> Gigabyte has officially released a new AMD Chipset driver (4.03.03.624) to some of its motherboards, adding support for a new sub-driver designed specifically for the Ryzen 7 5800X3D. Gigabyte lists this driver as the "AMD 3D V-Cache Performance Optimizer Driver" in the release notes, and it is only intended for Windows 10 users.
> 
> It's important to note that these drivers come directly from Gigabyte, as AMD has not officially released chipset driver 4.03.03.624 to its support page just yet.
> 
> ...


However, AMD has stated that the 3D V-Cache is transparent to the operating system and programs; it simply appears as one large L3 cache. As such, it will only require a BIOS update for existing motherboards. That makes the existence of this new Gigabyte driver a head-scratcher.

It's even stranger that this driver is exclusive to Windows 10, which either confirms that AMD has partnered with Microsoft to bring this driver natively to Windows 11, or this driver doesn't control the L3 cache at all.


----------



## sparkyyy (Apr 12, 2022)

CS:GO should never be tested in terms of averages; every PC can get a high frame rate in it. I wanted to see the 1% lows :////


----------



## nicamarvin (Apr 12, 2022)

Izzy1985 said:


> Gigabyte has officially released a new AMD Chipset driver (4.03.03.624) to some of its motherboards, adding support for a new sub-driver designed specifically for the Ryzen 7 5800X3D. Gigabyte lists this driver as the "AMD 3D V-Cache Performance Optimizer Driver" in the release notes, and it is only intended for Windows 10 users.


It's good for Windows 11 too. I have screenshots of that; let me find them.


----------



## Shatun_Bear (Apr 12, 2022)

demu said:


> Not the best possible memory for Ryzen (latencies other than CL quite poor), and no information about either single or dual rank (which may have a big impact).
> Alder Lake has about the best DDR5 memory money can buy.
> Ryzen:
> 2x 16 GB DDR4-3600
> ...



Yeah I just noticed that. How much does that Alder Lake kit cost? That's some of the best memory money can buy.


----------



## phanbuey (Apr 12, 2022)

Super promising for 3D V-Cache Zen 4.

The Alder Lake kit isn't that expensive compared to other DDR5 prices; it's a $400 32 GB kit


----------



## nicamarvin (Apr 12, 2022)

This is what Gigabyte is listing as the 3D V-Cache Optimizer Driver. It's good for Windows 10 and Windows 11.


----------



## tussinman (Apr 12, 2022)

Shatun_Bear said:


> Very impressive. Not sure why they are stealth launching this, it matches or beats the limited ediition behemoth power guzzling 12900KS for a much, much lower price.
> 
> Low stock maybe?


Low stock, plus the reality is that if you can spend $450 on a CPU like it's nothing, then you most likely already own a CPU within 4-10% of it anyway ($450 would be better invested in Zen 4)


----------



## Izzy1985 (Apr 12, 2022)

AMD should have locked only the voltage to 1.35 V and let owners test their silicon to see what kind of all-core OCs we could achieve. At 1.35 V, an all-core 4.6 or 4.7 GHz should be easily attainable.


----------



## buildzoid (Apr 12, 2022)

You need to use B550 or A520 for BCLK overclocking. The X570 chipset pretty much stops working with the BCLK above 101 MHz.


----------



## Izzy1985 (Apr 12, 2022)

nicamarvin said:


> This is What Gigabyte are listing as 3D V Cache Optimizer Driver. It's good for Windows 10 and Windows 11.
> 
> View attachment 243417


Hmmm... a screenshot from a TPU article from 6 days ago.



buildzoid said:


> you need to use B550 or A520 for BCLK overclocking. The X570 chipset pretty much stops working with the BCLK above 101MHz.


Have you confirmed this isn't locked out for the 5800X3D?


----------



## Vecix6 (Apr 12, 2022)

Non-game benchmarks don't show good results, so in which scenarios would those new Milan-X CPUs be good?


----------



## W1zzard (Apr 12, 2022)

Vecix6 said:


> in which scenarios would those new Milan-X CPUs be good?


Milan-X will be awesome with your own specific workloads, for which you own the source code. You can reconfigure the cache in several ways, to hand-tune it to your application, and hand-tune your application to it.


----------



## nicamarvin (Apr 12, 2022)

Vecix6 said:


> Non-game benchmarks don't show good results, so in which scenarios would those new Milan-X CPUs be good?


Fluid dynamics simulation has shown up to an 80% performance boost.


----------



## msimax (Apr 12, 2022)

Nice review, but on the memory side, everyone knows you at least want 16-16-16 with AMD; hell, G.Skill has 14-14-14 kits that are always available. This will be AMD's top gaming-performance chip for now, and it's handcuffed by that RAM spec, while Alder Lake already has a bandwidth advantage running DDR5-6000 CAS 36.

Also, if it doesn't OC in the BIOS, maybe software apps can be tested to BCLK it up in steps to check for instability


----------



## nicamarvin (Apr 12, 2022)

Izzy1985 said:


> Hmmm..... screenshot from TPU article 6 days ago.


This one?

I downloaded the driver and opened the .txt readme, which I had issues with on the MSI motherboard that the TPU review was done on...


----------



## David Fallaha (Apr 12, 2022)

_Flare said:


> cite: [...]
> Quick Word on voltages – AMD Ryzen 7 5800X3D
> As you saw, I rushed a little more than I do with regular reviews, so data is being added through the day. Someone did ask me about OC, and while we found out that a lot of the _AMD OC menu is locked up (no Curve Optimizer)_, it made me wonder about voltage behavior in single-thread and multi-thread situations.
> 
> Hence I did a *Cinebench R23* run (in both scenarios), and the CPU behaves very differently from a Zen 3 non-*3D V-CACHE* CPU.
> ...


Thanks, appreciate the schedule; looking forward to perhaps some more data with proper RAM... feels a little unfair to give Intel one of the fastest RAM kits on the market but limit Ryzen to 3600 CL16, no? Or, given the ludicrous power numbers, perhaps some FPS/watt?



msimax said:


> Nice review  but on the memory side everyone knows you atleast want 16-16-16 with AMD hell gskill has 14-14-14 kits that's always available, This will be AMD top gaming performance chip currently and to handcuff it with that ram spec, Alderlake already has a bandwidth advantage running ddr5 6000 cas 36.
> 
> Also if it doesn't oc in bios maybe software apps can be tested to bclk it up in steps to check for instability


Completely agree, this is a pretty major drawback of the review but can hopefully be fixed soon



phanbuey said:


> Super promising for 3D Cache zen 4
> 
> the alder lake kit isn't that expensive compared to other DDR 5 prices - it's a $400 kit of 32GB


Hmm, the DDR5 kit is ~$600 in the UK vs. ~$250 for DDR4-3600 CL16; in fact, you can get 32 GB of 4000 MHz+ DDR4 for ~$300.

Patriot Viper Steel 16GB (2x8GB) DDR4 PC4-35200C19 4400MHz Dual Channel Kit (PVS416G440C9K) | OcUK (overclockers.co.uk)


----------



## Denver (Apr 12, 2022)

damric said:


> Very interesting, but I guess I won't be getting this since the improvement at 4K just doesn't matter over my existing 5800x.


Maybe it makes a difference with GPUs 2x faster than current ones?


----------



## David Fallaha (Apr 12, 2022)

demu said:


> Not the best possible memory for Ryzen (latencies other than CL quite poor), and no information about either single or dual rank (which may have a big impact).
> Alder Lake has about the best DDR5 memory money can buy.
> Ryzen:
> 2x 16 GB DDR4-3600
> ...


Yep, you can get 4000 MHz+ low-latency DDR4 for half the price of the Intel kit here...
Patriot Viper Steel 16GB (2x8GB) DDR4 PC4-35200C19 4400MHz Dual Channel Kit (PVS416G440C9K) | OcUK (overclockers.co.uk)



Zareek said:


> I'm underwhelmed for the price. With the exception of a few odd ducks, it's less than 10% faster than 5800x for 20% more cost.


Er, no. Once it's given properly fast RAM and the BIOS/driver update, it will be the fastest gaming chip on the planet at *half* the power of the toaster 12900K, for 2/3 the price...


----------



## Airisom (Apr 12, 2022)

Realistically speaking, a 2x 16 GB 3600 CL14 kit is the best for Ryzen unless you have an IMC that can handle more. A little cheaper as well versus the 6000 CL36 kits out there.


----------



## AnotherReader (Apr 12, 2022)

This lives up to AMD's performance claims. I'm not surprised, but a few of our fellow forum members might be pleasantly surprised.


----------



## fevgatos (Apr 12, 2022)

So basically it's about 3-4% faster in games than a 12700F, but gets absolutely creamed in everything else (single thread, multithread, upgradability) while also costing 50% more? Whoa, that's just a bad product. It needs a big price cut to $300-350. At $450 it's a joke


----------



## tussinman (Apr 12, 2022)

David Fallaha said:


> Er no, once it's given proper fast RAM and the BIOS/driver update it will be the fastest gaming chip on the planet at *half* the power of the toaster 12900K for 2/3 the price...


I think he's more referring to the price against realistic alternatives; obviously that doesn't include the 12900K.

The 12700K, for example, is over 10% faster in CPU tests, only 2-3% slower in gaming, has 13th-gen support, and is $100-125 cheaper. The 5700/5800X is only about 6-9% slower in gaming and can be had for 25-30% less


----------



## Sound_Card (Apr 12, 2022)

Imagine having a new platform with a new PCIe bus, new RAM speed, new IPC, etc., and AMD just sticks some more cache on top of the CPU while consuming considerably less power and matching your performance in games (if not beating it in the 1% lows). Just sold my Intel stock.


----------



## ArcanisGK507 (Apr 12, 2022)

I'm going to hold out until the AMD Ryzen 7 7800X3D


----------



## Hugh Jass (Apr 12, 2022)

Bloax said:


> "1.35v is the highest the vcache supports, therefore the maximum possible frequency is 4.5 GHz"
> 
> yeah that's cute, tell me more
> 
> Hopefully AM5 won't have these.. Funny features, let's call them that, yes.


This happens across the Zen 3 range (personally tested on a 5700G, a 5800X, and 2x 5900X). It's called clock stretching, and it's only really useful if all you care about is seeing a bigger number in monitoring software.

If you run Cinebench R23/CPU-Z, you'll find you're nowhere near the 16k multi/1650 single or 7000 multi/650 single you should be at for those clocks.
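A rough way to put a number on this kind of clock stretching (illustrative figures; benchmark scores only scale approximately linearly with clock speed):

```python
def effective_clock_mhz(reported_mhz: float, measured_score: float,
                        expected_score: float) -> float:
    """Estimate the clock actually delivered under clock stretching.

    Benchmark scores scale roughly linearly with clock speed, so a score
    shortfall implies a proportional shortfall in the real clock, no matter
    what the monitoring software reports.
    """
    return reported_mhz * measured_score / expected_score

# Monitoring claims 4700 MHz, but the single-core score comes in 10% short:
print(effective_clock_mhz(4700, 1485, 1650))  # 4230.0 MHz actually delivered
```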


----------



## nicamarvin (Apr 12, 2022)

I am still waiting for the MSI changelog that lists the *AMD 3D V-Cache Performance Optimizer* in their beta BIOS. Otherwise, guys, let's just wait for benchmarks with an actually released BIOS.


----------



## phanbuey (Apr 12, 2022)

David Fallaha said:


> thanks, appreciate the schedule, look forward to perhaps some more data with proper RAM...feels a little unfair to give Intel one of the fastest RAM kits on the market but limit Ryzen to 3600C16, no? Or given the ludicrous power numbers perhaps some FPS/watt?
> 
> 
> Completely agree, this is a pretty major drawback of the review but can hopefully be fixed soon
> ...


IDK about the UK, but you can get a 6000 CL36 kit here for $360; in fact, that's what I'm running now, and it's faster than my old 32 GB DDR4-4133 4x single-rank B-die setup.





Prices aren't that different here anymore and haven't been for a while.


----------



## Aquinus (Apr 12, 2022)

The gaming numbers look really good, but I'm underwhelmed by the synthetic, number-crunching, and server benchmarks. The performance of Milan-X over the non-cache variant is simply staggering, at least in Linux. Maybe it's because that's 768 MB of cache as opposed to 96 MB, but I was expecting it to, at the very least, be on par with the non-3D variant with slightly higher clocks. It will be interesting to see if Phoronix does a review of this chip on Linux, and whether those results match the trends we're seeing here.


----------



## Makaveli (Apr 12, 2022)

nicamarvin said:


> I just downloaded the Driver and there were no reference for a 5800X3D Optimizer Driver.
> 
> View attachment 243416
> 
> View attachment 243415


You downloaded a BIOS, not a driver; two different things.

@W1zzard Any chance you can run The Matrix UE5 PC demo on it? I would love to see if the cache helps with that.


----------



## ARF (Apr 12, 2022)

fevgatos said:


> So basically its about 3-4% faster in games than a 12700f, but gets absolutely creamed in everything else (single thread, multithread, upgradability) while also costing 50% more? Woah, thats just a bad product. Needs a big pricecut to 300-350. At 450 its a joke



AMD has been very aggressive with the whole 5000-series pricing. It never cared to make the processors more affordable in order to gain more significant market share.
Meanwhile, the Bitcoin mining bubble is no more, so the share price is now well below $99. It was up to $160 some months ago...


----------



## nicamarvin (Apr 12, 2022)

Makaveli said:


> You downloaded a bios not a driver two different things.



But the Performance Optimizer for 3D V-Cache is listed in the .txt for the Gigabyte BIOS. It's also mentioned before you download the driver for the Asus TUF B550 motherboard. I can't seem to find that in the MSI beta BIOS used in this review


----------



## Kovoet (Apr 12, 2022)

Right, I'm replacing my CPU, I guess. I'll keep the 5900 as a spare


----------



## phanbuey (Apr 12, 2022)

Kovoet said:


> Right I'm replacing my CPU I guess. Keep the 5900 spare



I don't think you will see a difference.


----------



## efikkan (Apr 12, 2022)

As expected, the gains from extra L3 cache are hit and miss, since it has to sacrifice some clock speed.
I would be interested in a deeper dive into frame-time consistency with this chip, to see if there are significant gains there, because if so, those will be noticeable to the end user.

But for those of you who are disappointed, as I've been saying: adding L3 cache is not going to help performance across the board. It mostly affects instruction cache misses, which programmers know are mostly associated with poorly written software. So most heavy applications wouldn't see significant performance gains here (as they are better written), and you wouldn't see a multithreading boost as some of you expected, either.
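The point above, that extra L3 mainly pays off once the working set overflows the smaller cache, can be illustrated with a toy LRU cache model (illustrative sizes, not a model of Zen 3's actual 32 MB vs. 96 MB L3):

```python
import random

def hit_rate(accesses, cache_lines):
    """Fully associative LRU cache model: return the fraction of hits."""
    cache, hits = [], 0
    for line in accesses:
        if line in cache:
            hits += 1
            cache.remove(line)   # refresh: move to most-recently-used spot
        elif len(cache) == cache_lines:
            cache.pop(0)         # full on a miss: evict the LRU line
        cache.append(line)
    return hits / len(accesses)

random.seed(0)
# A scattered workload: 10,000 accesses spread over 256 cache lines.
accesses = [random.randrange(256) for _ in range(10000)]

print(hit_rate(accesses, cache_lines=128))  # holds half the working set: ~50% hits
print(hit_rate(accesses, cache_lines=256))  # holds all of it: near-100% hits
```

Doubling the cache turns the same scattered pattern from coin-flip hits into near-perfect hits, which is why titles with large, irregular working sets gain the most.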


----------



## phanbuey (Apr 12, 2022)

efikkan said:


> As expected, the gains from extra L3 cache is hit and miss, since it has to sacrifice some clock speed.
> I would be interested to see a more deep-dive into frame time consistency with this chip to see if there are significant gains there, because if so, those will be noticeable to the end user.
> 
> But for those of you who are disappointed, as I've been saying; adding L3 cache is not going to help performance across the board, it mostly affects instruction cache misses, which programmers know are mostly associated with poorly written software. So most heavy applications wouldn't see significant performance gains here (as they are better written), and you wouldn't see a multithreading boost as some of you expected either.


But Borderlands 3 and Far Cry 5 are SO WELL coded... ah yeah... ok.


----------



## TheDeeGee (Apr 12, 2022)

Even warmer than the 5800X, and that's a bitch to keep cool already.


----------



## thunderingroar (Apr 12, 2022)

would love to see how this one performs in some of the heavier emulators like RPCS3 and compared to ADL

hopefully zen4 also gets this 3D vcache


----------



## ARF (Apr 12, 2022)

phanbuey said:


> But Borderlands 3 and Far Cry 5 are SO WELL coded... ah yeah... ok.



Yeah, high gains mean poor code


----------



## efikkan (Apr 12, 2022)

phanbuey said:


> But Borderlands 3 and Far Cry 5 are SO WELL coded... ah yeah... ok.


In case I wasn't clear enough;
Instruction cache misses are mostly associated with (poor) software design, so in simplified terms; the worse the software, the more gain is to be expected from extra L3. Get it?


----------



## phanbuey (Apr 12, 2022)

efikkan said:


> In case I wasn't clear enough;
> Instruction cache misses are mostly associated with (poor) software design, so in simplified terms; the worse the software, the more gain is to be expected from extra L3. Get it?



I was actually agreeing with you. But yes... I do get it.


----------



## Crackong (Apr 12, 2022)

Impressive!

It's running at just 4.5 GHz and beats the competitor's 5.2 / 5.5 GHz.


----------



## roberto888 (Apr 12, 2022)

Makaveli said:


> You downloaded a bios not a driver two different things.
> 
> @W1zzard Any chance you can run the matrix UE5 PC demo on it would love to see if the cache helps with that.


There is indeed a driver for optimizing 3D V-Cache in that driver package.


----------



## Makaveli (Apr 12, 2022)

nicamarvin said:


> But the Performance Optimizer for 3D V-Cache is listed on the .txt on the Gigabyte BIOS. And it's also mentioned before you download the driver on the Asus TUF B550 MB. I can't seem to find that on the MSI BIOS(Beta) on this Review
> View attachment 243427



The Performance Optimizer that everyone is talking about is this. What you have listed above is a BIOS that supports the 5800X3D and has some performance improvements, which are two separate things. The BIOS that W1zzard used is V2 1.2.0.6 c, so it should be the same as what is listed there.


----------



## zlobby (Apr 12, 2022)

Selaya said:


> is like, everything about it locked? cant change any voltages at all? like, the IOD ones related to memory/fclk, they shouldn't hurt the 3dcache no


Very sad, indeed. /s



btk2k2 said:


> I asked over on reddit but might as well ask here. Any idea if that is just a dud mc or is it another X3D limitation?


I think it's either an early AGESA issue or an X3D limit.


----------



## Crackong (Apr 13, 2022)

I think a typical 5800X could do 4.7 GHz at 1.35 V?
Further optimization, please?


----------



## chrcoluk (Apr 13, 2022)

Somewhat underwhelming in non-gaming workloads, but I will praise AMD for not going the Black Edition route.


----------



## Xebec (Apr 13, 2022)

Crackong said:


> I think a typical 5800X could do 4.7 GHz at 1.35 V?
> Further optimization, please?


A 5800X will typically hit 1.40-1.45 V when boosting to 4.7 GHz for light workloads.


----------



## SeventhReign (Apr 13, 2022)

How in Gods name can you possibly recommend this??????????
It's literally SLOWER than a standard 5800X in EVERYTHING except gaming, and it's BARELY any faster in games!!!!  What are you people smoking??


----------



## Deleted member 24505 (Apr 13, 2022)

2.5% faster than my 12700K; balls, I should have waited and got one of these /s


----------



## Totally (Apr 13, 2022)

SeventhReign said:


> How in Gods name can you possibly recommend this??????????
> Its literally SLOWER than a standard 5800x in EVERYTHING except gaming, and its BARELY any faster in games!!!!  What are you people smoking??



Same thing I'm seeing. Shame, I was hoping this chip was going to surprise me.

The 5900X is the lowkey star; now at $400, it's $50 cheaper to boot.

Going that route comes in 12% cheaper with 50% more cores and higher clocks, which translates to 20% more perf in productivity and 7% less in gaming... and you can still OC if you're into that. My 2c.


----------



## Minus Infinity (Apr 13, 2022)

I'm still amazed at the efficiency of the 5800X for the performance. And the 3700X literally sips power. I have no regrets getting the 5800X, but I can't wait for Zen 4 vs Raptor Lake; it will be a lot closer than many think. Even AMD says Zen 4 is the biggest change to the Zen architecture since release, and we will see bigger performance jumps from Zen 3 to Zen 4 than we had from Zen 2 to Zen 3. Add in the possibility of V-Cache for multiple SKUs and it's looking good for AMD. I have an old Zen 1700X looking for an upgrade next year, so it will be interesting to see whether it'll be Zen 4 or Raptor Lake.


----------



## simlife (Apr 13, 2022)

A bit odd to list the lack of a weak integrated GPU as a negative for a CPU that costs as much as an entire next-gen console (so an entire PC)... if you can afford this CPU you don't need an iGPU; even just for testing you can buy one or use an old card. And "7 nm" as a plus? 7 nm has been out for many, many years now, so it's not really a pro when graphics cards in 2019 already had it...


----------



## Aretak (Apr 13, 2022)

SeventhReign said:


> How in Gods name can you possibly recommend this??????????
> Its literally SLOWER than a standard 5800x in EVERYTHING except gaming, and its BARELY any faster in games!!!!  What are you people smoking??


Yeah, it's absolutely baffling that people are interested in the gaming performance of a chip that's entirely focused on and marketed around gaming.

It's a lot faster than the 5800X in some games, and anyone who cares about productivity would (or should) be buying a 5900X or a 5950X anyway. If people just want a gaming CPU, this is the best one around. If that's not a use case which interests you, why are you here wasting your time reading a review of a CPU marketed towards gamers, friendo? Can products not exist which are for other people, or are you so arrogant that you think anything you're not personally interested in must be garbage?


----------



## Crackong (Apr 13, 2022)

Xebec said:


> A 5800X will typically hit 1.40-1.45 V when boosting to 4.7 GHz for light workloads.



I mean some fine tuning.
A 5800X boosting to 4.7 at 1.45 V is just a result of AMD choosing a 'safe setting' for all 5800X chips with varying silicon quality.
With PBO a good 5800X could hit 5.0 at the same voltage.

And the 5800X3D is supposed to use the best-binned silicon to cope with the 3D cache,
so I wouldn't be surprised if the 5800X part of it can hit 4.7-4.8 at 1.35 V.


----------



## Akkedie (Apr 13, 2022)

Please, can you test games with RT at 720p? It's maddening how ignored this test scenario is, when in fact it's the most relevant one given how much of a beating RT puts on CPUs, and those are in fact settings you'd use. For me, it's the only reason I need more CPU power.


----------



## Totally (Apr 13, 2022)

Aretak said:


> Yeah, it's absolutely baffling that people are interested in the gaming performance of a chip that's entirely focused and marketed around gaming.
> 
> It's a lot faster than the 5800X in some games, and anyone who cares about productivity would (or should) be buying a 5900X or a 5950X anyway. If people just want a gaming CPU, this is the best one around. If that's not a use case which interests you, why are you here wasting your time reading a review of a CPU marketed towards gamers, friendo? Can products not exist which are for other people, or are you so arrogant that you think anything you're not personally interested in must be garbage?



It's not baffling; the 5600X exists. Considering only gaming, who in their right mind would pay more than double for a 10% increase in gaming performance? So it's not baffling for people to expect it to be more well-rounded.


----------



## Pastuch (Apr 13, 2022)

I can't wait to see Warzone benchmarks! I seriously think this will sell like hot cakes if the memory performance scales in Warzone like it does in Far Cry and Shadow of the Tomb Raider. Warzone may be the most latency- and memory-performance-demanding game on the market, because everyone who plays it is CPU limited, even with a 3080 at 1080p. I have a 280 Hz monitor; if replacing my 5600X with a 5800X3D gives me 35% more frames, I would go from a 190 fps average to around 250 fps. I have the new Alienware 34-inch QD-OLED coming in June, and based on performance testing I've done in Warzone, even at 3440x1440 I am CPU limited with a 5600X.

My 5600X is running insanely tight Samsung B-die, AIDA64 latency of 53.4 ns (4x8); dual rank is better for Warzone. Highs are around 240 fps, average around 195, lows around 170.


----------



## igniz (Apr 13, 2022)

This site is total crap, another UserBenchmark 2.0 in sight. Even though they tried hard to gimp 5800X3D performance by pairing it with DDR4-3600 with very bad timings, they don't even mention that in order to match 5800X3D performance you will need to spend $800 for the KS, $400 for new DDR5-6000, $300 for a decent Z690 and another $180 for a decent 360 cooler. Do your math, can you imagine?


----------



## mechtech (Apr 13, 2022)

As always a nicely done and thorough review

Well, an interesting attempt at more performance by increasing the cache size.
-good for 1080p
-diminishing returns 1440p
-basically no difference at 2160p
-minimal changes on non-gaming apps

Feedback for @W1zzard
-top right corner of charts - perhaps bold or darken "higher or lower is better" so it stands out more?
-top of game charts - perhaps add the game engine & version the game is using beside the game title? 

Borderlands 3 seems to really love that cache - I wonder if games built on the same engine would also benefit from the extra cache, and if other game engines could be coded to take advantage of a larger cache in the future??

Also, one more thing: I see a lot of games like Satisfactory, Factorio, Subnautica, planet builders, etc. becoming popular, with most having overwhelmingly positive reviews on Steam. Any plans to include one of these games in future reviews?

I am looking forward to the 5600 and 5500 to see how they compare!!


----------



## evernessince (Apr 13, 2022)

I feel like it should be pointed out that the Intel platform includes a $700 motherboard, whereas the AMD platform uses a $140 one. I believe you can squeeze a bit more performance out of a higher-end motherboard on the AMD side. Can't say I'd ever spend $700 on a motherboard.



fevgatos said:


> So basically its about 3-4% faster in games than a 12700f, but gets absolutely creamed in everything else (single thread, multithread, upgradability) while also costing 50% more? Woah, thats just a bad product. Needs a big pricecut to 300-350. At 450 its a joke



Uh, half of 450 is 225, not 375. How did you come up with the idea that the 12700F is 50% cheaper? It's not even close. Including platform costs, a 5800X3D and a 12700F will end up costing similar amounts. In essence it's top-of-the-line gaming performance. For people who want that, it's a heck of a lot cheaper than a 12900K.

Can't say I'll buy it, as I value an all-rounder (including power efficiency), but I can definitely see its market.


----------



## RedBear (Apr 13, 2022)

Congratulations and thanks for the early review. For what it's worth I'm mostly curious about the long term availability of this part, considering it's coming late in the life of the AM4 platform and it's relatively expensive, but at this point I guess we'll have to wait and see. I'm very much looking forward to the reviews of the 5600 and 5700X.


----------



## Kissamies (Apr 13, 2022)

Trades blows with the 12900KS "Emergency Edition" from game to game, so they truly delivered what they promised. The pricing is kinda meh, but as it's the first of its kind, I guess it's reasonable.

On the other hand, the 12900KS has a price premium over the regular SKU, and that's marketed as a "gaming CPU" as well.


----------



## Why_Me (Apr 13, 2022)

Unless I already owned an AMD board, I'd opt for the less expensive 12700 / 12700F ($310) or the 12700K / 12700KF.


----------



## mahoney (Apr 13, 2022)

If only this thing cost €300. Wizz, how much did you pay for it?


----------



## Kissamies (Apr 13, 2022)

Why_Me said:


> Unless I already owned an AMD board, I'd opt for the less expensive 12700 / 12700F ($310) or the 12700K / 12700KF.


Yeah, I see no sense in getting a new board just to get this. But I'd see this more as an upgrade for existing AM4 users.


----------



## Why_Me (Apr 13, 2022)

billeman said:


> Nice processor if you want to game at 720p/1080p


This ^^



thegnome said:


> Hmm, would putting in something like 3800 CL16 increase the performance? Would be fair, given that the DDR5 kit is pretty top-notch while the DDR4 kit is pretty slow by today's standards.


DDR5 has been shown to be slower than DDR4 when it comes to gaming.


----------



## Kissamies (Apr 13, 2022)

Why_Me said:


> DDR5 has been shown to be slower than DDR4 when it comes to gaming.


Isn't this always the case when a new RAM generation is released? After a few years, with faster modules, the newer generation starts to show its potential.


----------



## Mussels (Apr 13, 2022)

Part of me says "get this. now."

And then I realise... I'm never CPU limited in games as it is.



W1zzard said:


> Yes, but the CPU can't run DDR4-3800 at 1:1, max FCLK is 1866 MHz


Could you try 3200 CL14, to see if lower latency helps the V-Cache?



darksf said:


> Well, you all look at avg FPS; look at the frame pacing. This CPU is delivering the smoothest frametimes and has 99.9th percentile lows higher than the average framerates of most of the other CPUs, which means it is delivering the smoothest gameplay with fewer hiccups, no matter the resolution or the GPU.


This. Those graphs take time to interpret, but it looks like it's going toe to toe with the 12900K, with lower power usage, temps, and price... so that's quite a winner.



fevgatos said:


> So basically its about 3-4% faster in games than a 12700f, but gets absolutely creamed in everything else (single thread, multithread, upgradability) while also costing 50% more? Woah, thats just a bad product. Needs a big pricecut to 300-350. At 450 its a joke


It's a gaming chip. It's made for gaming.

Oh lordy, no, the 8-core CPU is getting creamed in multithreaded tests... by a 16-core chip.
Well... duh?


----------



## SirMaster (Apr 13, 2022)

Anyone know how this CPU performs for emulation like Dolphin, Cemu, Yuzu, or Ryujinx?

Wonder how it would end up with something like 3800 14-14-14-28 memory in games as well.


----------



## Garrus (Apr 13, 2022)

billeman said:


> Nice processor if you want to game at 720p/1080p


Dude, read the review next time. The chart where it came out on top was 1440p: +0.6 percent versus the 12900K at 1440p.



ARF said:


> It has entered mass production which means in the next weeks we will see the official launch.
> 
> Nice uplift in Farcry 5, up to 35% at 720p but besides that, it's a meh.
> Totally irrelevant at 2160p today.


Read the review. It was clustered at the top of the charts at 4K as well. All you're really saying is "NO CPU MATTERS that much when playing at 4K," which is true. But the entire point of buying a 12900KS or 5800X3D is to play at high refresh rates: 4K at minimum settings, 1440p 240 Hz, 1080p 360 Hz, etc. Don't you think it's silly to say "no CPU matters at 4K" as a way to attack the best CPU? LOL


----------



## han32 (Apr 13, 2022)

SirMaster said:


> Anyone know how this CPU performs for emulation like dolphin, cemu, yuzu, ryujinx?
> 
> Wonder how it would end up with something like 3800 14-14-14-28 memory in games as well.


Yup, exactly... look at the Alder Lake DDR5 reviews using low-CL DDR5-6000 from G.Skill... not fair.
And if AMD were using DDR4-4000 CL14, it would be a good duel.


----------



## Garrus (Apr 13, 2022)

SeventhReign said:


> How in Gods name can you possibly recommend this??????????
> Its literally SLOWER than a standard 5800x in EVERYTHING except gaming, and its BARELY any faster in games!!!!  What are you people smoking??


90 percent of buyers who buy an i7 or i9 are buying for the gaming; your argument is the same there, so the only CPU that makes sense then is the 12600K.

The entire reason Intel limits the cache on the i5 and i7 is to create a reason to buy the i9.

If you are not interested in high-refresh gaming and don't want the best fps, that's fine.

AMD was so far ahead of Intel 10th and 11th gen precisely because the 5600X has the same cache as the 5800X and 5900X, so gaming never needed anything better than the 5600X (imagine a 12600K with the same amount of cache as the i9; NOW THAT WOULD BE THE BEST CPU EVER at that price). Unlike with Intel, there's no reason to upgrade for cache; the 5600X was tops in game fps for so long because of that (and that's why it suddenly commanded higher prices).

(Did I make sense? Intel cuts the cache from i9 to i7 to i5 to make gaming worse on purpose, even though the i5 has more than enough cores.)



han32 said:


> Yup, exactly... look at the Alder Lake DDR5 reviews using low-CL DDR5-6000 from G.Skill... not fair.
> And if AMD were using DDR4-4000 CL14, it would be a good duel.


I'd love a follow-up piece with the $130 Newegg DDR4-3600 C14 stuff


----------



## Mussels (Apr 13, 2022)

The only reason to buy anything above 6 cores from either camp is as a high-refresh gamer.
For a lot of games that means low res or lower settings (or future, unreleased GPUs).

If you do nothing but game, get a 12400F or 5600X and stop wasting money.

If you intend to game at 144 Hz+, this review shows you which CPUs can do that, if your GPU can keep up.


----------



## Why_Me (Apr 13, 2022)

han32 said:


> Yup, exactly... look at the Alder Lake DDR5 reviews using low-CL DDR5-6000 from G.Skill... not fair.
> And if AMD were using DDR4-4000 CL14, it would be a good duel.


DDR4 3800 vs DDR5 5200.


----------



## birdie (Apr 13, 2022)

I don't understand why everyone is so excited.

AMD is _not_ the first company to have a massive cache: Intel did that _seven_ years ago with Broadwell, which featured a huge 128 MB L4 cache and showed crazy improvements in certain applications and games as well.
The 5800X3D is a single experimental CPU which rocks in some games but loses in others, and also loses in general applications that don't require an enormous L3 cache, due to its decreased frequencies.
Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.
RPL, according to the leaked information, will feature increased IPC for both its P and E cores as well as significantly larger caches, which means Intel will swiftly catch up and overtake this CPU, and maybe even Zen 4.
It's too effing expensive for what it offers. No, it's _not $100_ more expensive than the 5800X; the 5800X has recently been sold for as low as $350, which makes it a huge $200 difference. (I got it wrong, sorry.) Still, we're only talking about rare applications and gaming at quirky resolutions.
This is the last hurrah of AM4; there's no future upgrade path.
Kudos to AMD for this experiment. The rare AMD fans who game at lower resolutions, and those who like to boast about gaming benchmarks, must be happy.


----------



## Why_Me (Apr 13, 2022)

birdie said:


> I don't understand why everyone is so excited.
> 
> AMD is _not_ the first company to have a massive L3 cache: Intel did that _seven_ years ago with Broadwell which featured a massive 128MB L4 cache and showed crazy improvements in certain applications and games as well.
> 5800X3D is a single experimental CPU which rocks in some games but loses in others and also loses in general applications which don't require enormous L3 cache due to decreased frequencies.
> ...


*$449* for a CPU that doesn't overclock, in order to game at 1080p; not to mention content creators won't give this CPU a second thought after looking at the rendering benches. This should be interesting to say the least, when the 12700F is going for *$310* and the 12700K/KF is going for *$370* atm.


----------



## Bwaze (Apr 13, 2022)

I think at that price point there should not have been a downgrade in productivity applications. It's not the pre-Alder Lake era, when Intel was trailing badly in that area.

What we have is more of a proof-of-concept product, and I imagine lessons learned here will be valuable in future Ryzen chips that are designed from the start with massive cache - and the ability to power and cool it properly.


----------



## watzupken (Apr 13, 2022)

birdie said:


> I don't understand why everyone is so excited.
> 
> AMD is _not_ the first company to have a massive L3 cache: Intel did that _seven_ years ago with Broadwell which featured a massive 128MB L4 cache and showed crazy improvements in certain applications and games as well.
> 5800X3D is a single experimental CPU which rocks in some games but loses in others and also loses in general applications which don't require enormous L3 cache due to decreased frequencies.
> ...


To me, this is like a warning shot by AMD, and also a way for them to try to claw back some of the "thunder" they lost to Alder Lake. The 5800X3D to me is too little and too late. Too little because you are not going to get a consistent performance improvement when it is just the cache that got increased. So if I were looking for an overhaul of my system, Alder Lake is clearly the better choice because the performance uplift is significant. I know people are going to say it uses a lot more power, but the truth is, unless you consistently load the CPU to its max, you won't see 200+ W of power consumption. In games at 1440p, I rarely see HWiNFO reporting power draw of more than 70 to 80 W. In fact, if one really wants to get a Ryzen 5000 series chip at this point in time, I feel the 5900X is actually better value if you have the use case to utilise the cores.
This product is too late as well because it is going to end up like Rocket Lake, replaced by a more capable chip in the next 6 months or so.


----------



## btarunr (Apr 13, 2022)

birdie said:


> I don't understand why everyone is so excited.
> ...
> Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.



This processor exists to tell investors that Zen 4 will beat Intel at gaming, and 16-core Zen 4 will beat Intel's 8+8-core processors. It exists to prove Gelsinger's claim wrong about AMD's heyday being over.


----------



## Why_Me (Apr 13, 2022)

btarunr said:


> This processor exists to tell investors that Zen 4 will beat Intel at gaming, and 16-core Zen 4 will beat Intel's 8+8 core processors. It exists to prove Gelsinger's claim wrong about AMD's heydays being over.


Lest we forget, Raptor Lake is right around the corner, and those CPUs will work with LGA 1700 600-series boards.


----------



## btarunr (Apr 13, 2022)

Why_Me said:


> Lest we forget Raptor Lake is right around the corner and those cpu's will work with LGA1700 600 series boards.


And Raptor Lake is built on the existing Intel 7 node (10 nm), whereas Zen 4 is on N5 (5 nm). If the 5800X3D is a ~10% gaming perf uplift over Zen 3, the expectation for Zen 4 will be set at 25% over Zen 3, which is about 10-15% above Alder Lake. Good luck to Intel trying to get there on the existing node.


----------



## birdie (Apr 13, 2022)

btarunr said:


> This processor exists to tell investors that Zen 4 will beat Intel at gaming, and 16-core Zen 4 will beat Intel's 8+8 core processors. It exists to prove Gelsinger's claim wrong about AMD's heydays being over.



I don't know where you pulled this from. This CPU says absolutely _nothing_ about the performance of Zen 4, except that the latter will be faster. No one has even confirmed that desktop/mobile Zen 4 SKUs will have 3D V-Cache.



btarunr said:


> And Raptor Lake is built on existing Intel 7 node (10 nm), whereas Zen 4 is N5 (5 nm). If 5800X3D is a ~10% gaming perf uplift over Zen 3, the expectation from Zen4 will be set at 25% over Zen 3, which is about 10-15% above Alder Lake. Good luck to Intel trying to get there on existing node.



The node advantage will allow packing more transistors into the same power envelope; a smaller node doesn't let a CPU magically clock a lot higher. With very high confidence I can claim that ADL's 5.5 GHz will be unattainable for Zen 4 CPUs.


----------



## btarunr (Apr 13, 2022)

birdie said:


> The node advantage will allow to pack more transistors at the same power package - a smaller node doesn't allow the CPU to magically clock a lot higher. With a very high confidence I can claim that 5.5GHz of ADL will be unattainable for Zen 4 CPUs.



Please read our 12900KS review. Raptor Lake is built on that same node, where Intel plans to add even bigger P-cores and two more E-core clusters (8P+16E). The 12900KS is already burning up the block, and with thermal limits overridden, it crosses 100 °C.

Also, the 5.5 GHz ADL clock-speed advantage is kinda pointless in the face of the 5800X3D's gaming perf?



birdie said:


> I don't know where you pulled this from. This CPU has absolutely _nothing_ to say about the performance of Zen 4 except that the latter will be faster. No one has even confirmed that desktop/mobile Zen 4 SKUs will have 3D V-Cache.


From this review, and the gaming performance uplift.


----------



## TheLostSwede (Apr 13, 2022)

I really don't get all the haters. Are you being forced to buy this CPU?
Yes, it's very much a last hurrah for the AM4 socket, but why does this irk you?
No-one has to buy it and if it doesn't suit your needs, there are plenty other options.
I'm glad I got a 5800X for the same price as the 5700X, but I'm sure some people will go for the 5800X3D as it suits their needs.

Is it the most amazing processor ever? No. 
I would actually call this a retail tech demonstration by AMD. 
It shows what the company is capable of, but it comes at a cost that is going to make most people look elsewhere, and that's fine.


----------



## demu (Apr 13, 2022)

W1zzard said:


> It's not stated anywhere.. I tested it, 1866 POST, 1900 no POST, which probably means WHEA errors at 1866


No wonder, if you tested with those semi-crap memory sticks. Try decent B-die sticks.
By the way, did you use dual-rank memory sticks?


----------



## birdie (Apr 13, 2022)

btarunr said:


> Please read our 12900KS review. Raptor Lake is built on that same node, where Intel plans to add even bigger P-cores, and two more E-core clusters (8P+16E). 12900KS is already burning the block, and with thermal limits overridden, it crosses 100C.
> 
> Also, the 5.5 GHz ADL clock speed advantage is kinda pointless in the face of 5800X3D gaming perf?
> 
> ...


God, I'm tired of this crap. The 5800X3D is faster in a _few_ selected games and it loses in all others. It's indistinguishable at 1440p and 4K.

We have discussed ADL power consumption ad nauseam already; I'm not going down that route ever again. Hint: it's _not_ 300 W, it's _not_ 100 °C. In actual games, ADL has been shown to be as power efficient as Zen 3 CPUs.

Is this a discussion of the review of 5800X3D or what?

Lastly, let me tell you how much I hate yellow headlines: "AMD Ryzen 7 5800X3D CPU *Crushes* Intel's Fastest Gaming Chip, The Core i9-12900K, In Gaming Benchmarks". There's so much pain and fallacy in it that it's just cringe-worthy. Out of two dozen tested titles, the 5800X3D runs faster in what, 3? 4? At resolutions most people couldn't care less about.

OH GOD, THIS IS A REVOLUTION, AS IF BROADWELL NEVER EXISTED. I'm done and out of this completely pointless discussion. AMD fans have collectively ejaculated - great! Your idol has reclaimed the performance crown under quite uncommon conditions that maybe 5,000 people in the world care about.


----------



## Legacy-ZA (Apr 13, 2022)

Shatun_Bear said:


> Very impressive. Not sure why they are stealth launching this, it matches or beats the limited edition behemoth power-guzzling 12900KS for a much, much lower price.
> 
> Low stock maybe?



Or to keep scalpers from catching wind of the launch.


----------



## fevgatos (Apr 13, 2022)

evernessince said:


> Uh, half of 450 is 225 and not 375.  How did you come up with the idea the 12700F is 50% cheaper?  It's not even close.  Including platform costs a 5800X3D and 12700F will end up costing similar amounts.  In essence it's top of the line gaming performance.  For people who want that, it's a heck of a lot cheaper option than a 12900K.


I didn't say 50% cheaper. I said the 5800X3D is 50% more expensive.


----------



## Why_Me (Apr 13, 2022)

Legacy-ZA said:


> Or to keep scalpers from catching wind of the launch.


Could scalpers fetch more than the $449 suggested retail for this cpu?


----------



## Taraquin (Apr 13, 2022)

Impressive in gaming, but I wish it at least supported a negative Curve Optimizer; that could have yielded up to 6% better multicore perf and would lower single-core temps.


----------



## Mussels (Apr 13, 2022)

birdie said:


> I don't understand why everyone is so excited.
> 
> AMD is _not_ the first company to have a massive L3 cache: Intel did that _seven_ years ago with Broadwell which featured a massive 128MB L4 cache and showed crazy improvements in certain applications and games as well.
> 5800X3D is a single experimental CPU which rocks in some games but loses in others and also loses in general applications which don't require enormous L3 cache due to decreased frequencies.
> ...


Did you read the review?
*It's winning at 1440p.*


----------



## birdie (Apr 13, 2022)

Mussels said:


> Did you read the review?
> *It's winning at 1440p.*



*0.6%? OMG. A FAT CONCLUSIVE WIN.*


----------



## DemonicRyzen666 (Apr 13, 2022)

Taraquin said:


> Impressive in gaming, but I wish it at least supported a negative Curve Optimizer; that could have yielded up to 6% better multicore perf and would lower single-core temps.


Curve Optimizer is a glorified per-core Load Line Calibration.


----------



## Mussels (Apr 13, 2022)

birdie said:


> 0.6%? OMG.


And?
It still goes against everything you just said


> Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.


----------



## birdie (Apr 13, 2022)

Mussels said:


> And?
> It still goes against everything you just said
> 
> 
> Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.


Against *everything* I said? You, sir, are lying or blatantly exaggerating. Didn't expect it from a moderator, but whatever. The underdog's mentality just refuses to die.

@W1zzard, nowhere in your review do you mention Intel Broadwell, which is quite sad actually. I expected you to be a little more well versed in the history of computing.


----------



## AusWolf (Apr 13, 2022)

I was itching to comment a big "BOOO" until I got to the game tests. Then I went "hmm". Not bad. Not bad at all!


----------



## Taraquin (Apr 13, 2022)

DemonicRyzen666 said:


> Curve Optimizer is a glorified per-core Load Line Calibration.


That's one way of putting it, but the actual results can be quite impressive. Going from stock to -30 all-core on my 5600X yielded 6% better multicore in CB23 at the same 76 W PPT. Temps in single-core scenarios dropped by 5 °C due to the voltage under load being 90 mV lower vs stock.



Mussels said:


> And?
> It still goes against everything you just said
> 
> 
> Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.


Actually, many game at 1080p or 720p without knowing it (DLSS/FSR).


----------



## Mussels (Apr 13, 2022)

birdie said:


> Against *everything *I said? You, sir, are lying or blatantly exaggerating. Didn't expect it from a moderator but whatever. Underdog's mentality just refuses to die.
> 
> @W1zzard nowhere in your review do you mention Intel Broadwell, which is quite sad actually. I expected you to be a little more well versed in the history of computing.


I give up - you're just here to troll and waste time.


----------



## Bwaze (Apr 13, 2022)

So, is this the "20% uplift" in gaming that AMD teased? I know it was worded "up to", and at equivalent frequency to the 5800X (which it doesn't achieve), and we all knew this would only be the case in non-GPU-limited scenarios (so ultra-low resolution).

So yeah, it does what AMD was teasing. In some games at ultra-low resolution it has an incredible performance uplift - mostly in games that don't need it, as they were at very high FPS even on the 5800X.

But I still think a steep price increase should indicate an all-around faster processor, and here we have a processor with mostly the same or even lower performance than the 5800X in productivity. Even if we focus purely on gaming, there are lots of games that don't benefit much. And focusing more on games that really tax the GPU would show somewhat different results, even at lower resolutions.


----------



## btarunr (Apr 13, 2022)

Bwaze said:


> So, is it "20% uplift" in gaming that AMD teased? I know it was worded "up to", and at equivalent frequency to 5800x (which it doesn't achieve), and we all knew this would only be the case in non-GPU limited scenarios (so ultra low resolution).
> 
> So yeah, it does what AMD was teasing. In some games at ultra low resolution it has incredible performance uplift - mostly at games that don't need it, they were at very high FPS even on 5800x.
> 
> But I still think steep price increase would indicate all around faster processor, and here we have a processor that has mostly the same performance or even lower than 5800x in productivity, and even if we focus purely on gaming, there are lots of games that don't benefit much. And focusing more on games that really tax the GPU would show a bit different results, even at lower resolutions.


How's this for an "up to"?





That's a 43% gain over the 5800X, and at a resolution that can be GPU-limited for every graphics card priced under $1000.


----------



## Bwaze (Apr 13, 2022)

Yes, certain games show great uplifts even at higher resolutions. Borderlands 3, Far Cry 5. But some games show almost zero uplift, and they aren't GPU bound - RDR2 for instance.


----------



## Taraquin (Apr 13, 2022)

Bwaze said:


> Yes, certain games show great uplifts even at higher resolutions. Borderlands 3, Far Cry 5. But some games show almost zero uplift, and they aren't GPU bound - RDR2 for instance.


It depends a lot on what games prefer: many games love cache (BL3 and FC), some love latency (FC), some love bandwidth (Cyberpunk, Total War), some love cores (RSS), and some like everything (SOTTR).


----------



## Bwaze (Apr 13, 2022)

I wonder if it would do anything for Microsoft Flight Simulator and DCS - two games that are often CPU bound, but I don't think they are the type that would "fit in cache".


----------



## GoldenX (Apr 13, 2022)

A friend is very interested in it; he would go from a 1800X to this. No need to update the X370 board thanks to the new BIOSes, so it's a solid five-year upgrade.


----------



## Chomiq (Apr 13, 2022)

Ah, just what I expected - Intel cherrypickers vs AMD apologists. I need a fresh batch of popcorn.


----------



## btk2k2 (Apr 13, 2022)

Taraquin said:


> It depends a lot on what games prefer, many games love cache (BL3 and FC), some love latency (FC), some love bandwidth (Cyberpunk, Total war), some love cores (RSS) some like everything (SOTTR).



It also depends on the scene. Games like CP:2077 and Tomb Raider are pretty large, so it's very easy for different reviewers to test different scenes and see different results depending on how CPU- or GPU-intensive that scene is.

EDIT: I also find it a shame that nobody really tests non-FPS metrics. Some places do Factorio UPS, a few do Civ 6 turn time, and Anandtech used to do something with Dwarf Fortress world building, but where are the tick-rate tests for the Paradox grand strategy games, the simulation-rate tests for Cities: Skylines, or the AI turn-time tests for turn-based games? I don't get the lack of testing for this. A lot of these games run fine at 4K on a pretty low-end GPU, but by the endgame the CPU is the part crying out for help, and nobody seems to want to come up with a test for them. It's kind of irritating really, because I have to guess based on the FPS of games with high unit counts and a lot of background calculation going on at the same time, like RTS games.
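To be concrete, the kind of harness I mean is trivial - something like this, where `sim_tick()` is a made-up stand-in for a game's update function (a Factorio update, a Paradox daily tick, etc.):

```python
# Minimal sketch of a non-FPS benchmark: run a fixed number of simulation
# ticks and report updates per second (UPS), independent of rendering.

import time

def sim_tick(n):
    # Placeholder workload standing in for one simulation update.
    return sum(i * i for i in range(n))

def measure_ups(ticks=200, work=10_000):
    """Time a fixed number of ticks and return updates per second."""
    start = time.perf_counter()
    for _ in range(ticks):
        sim_tick(work)
    elapsed = time.perf_counter() - start
    return ticks / elapsed

print(f"{measure_ups():.0f} UPS")
```

The real work is getting a reproducible save/scenario to tick through; the measurement itself is just this.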


----------



## Crackong (Apr 13, 2022)

It is so funny that 


Bwaze said:


> So, is it "20% uplift" in gaming that AMD teased? I know it was worded "up to", and at equivalent frequency to 5800x (which it doesn't achieve), and we all knew this would only be the case in non-GPU limited scenarios (so ultra low resolution).
> 
> So yeah, it does what AMD was teasing. In some games at ultra low resolution it has incredible performance uplift - mostly at games that don't need it, they were at very high FPS even on 5800x.
> 
> But I still think steep price increase would indicate all around faster processor, and here we have a processor that has mostly the same performance or even lower than 5800x in productivity, and even if we focus purely on gaming, there are lots of games that don't benefit much. And focusing more on games that really tax the GPU would show a bit different results, even at lower resolutions.



By "ultra low resolution" you mean 1440p?
What "normal resolution" are you gaming at? 16K?


----------



## Legacy-ZA (Apr 13, 2022)

I am really looking forward to the next-gen CPU's AMD has done something fantastic this time around, it can only spell good things going forward, I mean, look at the power draw compared against Intels best CPU and look at the FPS gains, it's simply amazing.


----------



## Bwaze (Apr 13, 2022)

Crackong said:


> It is so funny that
> 
> 
> By "ultra low resolution" you mean 1440p?
> What "normal resolution" are you gaming at? 16K?




Of course I was referring to the huge uplift at lower resolutions. Some games show large gains even at higher resolutions, but by no means all of them - the 5800X is just 6.7% slower at that resolution on average in this set of games.


----------



## dont whant to set it"' (Apr 13, 2022)

The drop-in replacement option is magnificent for just about anyone rocking an AM4 platform, provided appropriate CPU support from motherboard manufacturers.

Naysayers can boo all they want because, most likely, this CPU is not for them.
Where did it score in gaming?
Right in the top echelon.


----------



## Shatun_Bear (Apr 13, 2022)

At this point 7 nm is an old node, certainly less dense than Intel's brand-new Intel 7 (10nm++), yet at lowish clocks on this old node it's matching a 5.5 GHz behemoth 12900KS at roughly half the power draw in gaming. It seems to me that Intel pushing CPUs to quite frankly absurd power-draw levels to keep up is foolish, when their rival has tech that means they only have to stack some cache on top of the die to match them.

At this point Ryzen 6000 with 3D cache (this will likely be the refresh from what's coming later this year) is going to wipe the floor with Intel, even if they push clocks to 5.7 GHz at the top end and draw 350 W!


----------



## btk2k2 (Apr 13, 2022)

@W1zzard Sorry to be a pain, but is there a reason why the bar-chart scores for the 5800X in Metro Exodus differ from the frame time analysis scores?






 VS






Is this a GPU limit restricting maximum FPS, with the difference down to the scene in the bar chart being less CPU-intensive than the one in the frame time analysis? If so, wouldn't it make more sense to use CPU-heavy scenes for a CPU test?


----------



## ARF (Apr 13, 2022)

Garrus said:


> Read the review. It was clustered at the top of the charts for 4k also. All you're really saying is "NO CPU MATTERS that much when playing at 4k" which is true. But the entire point of buying the 12900ks or 5800x3d is to play at high refresh rates. 4k at minimum settings. 1440p 240hz. 1080p 360hz etc. Don't you think it is silly to say "no cpu matters at 4k" as a way to attack the best cpu? LOL



I don't think anyone attacks it. Here we are discussing the pros and cons, and my ultimate goal is to find a reason not to buy it, which is pretty good because it saves a decent amount of cash. Don't you want to save some nice cash now, in this worldwide economic crisis?


----------



## Maranak (Apr 13, 2022)

birdie said:


> AMD is _not_ the first company to have a massive L3 cache: Intel did that _seven_ years ago with Broadwell which featured a massive 128MB L4 cache and showed crazy improvements in certain applications and games as well.


L4, not L3. Of course massive cache is not a new thing for CPUs, but now we are at a point where the latency in particular is extremely good with AMD's L3.


birdie said:


> 800X3D is a single experimental CPU which rocks in some games but loses in others and also loses in general applications which don't require enormous L3 cache due to decreased frequencies.


It's not an experimental CPU; just look at what the massive L3 is capable of in Milan-X. That's the future, because you get "free" performance boosts without launching a whole new architecture.


birdie said:


> Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.


Doesn't matter how many people are gaming at 720p; that's not the point of CPU benchmarks. You want to know how much FPS a CPU can deliver, and for that you need a lower resolution to eliminate the GPU as the limiting factor. Although I don't think that's fully the case in the benchmarks here, since in a lot of the games the CPUs are all too close to each other. Smells like GPU-bound even at 720p, which is... not a good way to test a CPU.

And for benchmarks you should never use the integrated ones, since they suck (which I think happened here a few times), and in actual gameplay the FPS is always lower, since the games themselves are far more demanding.

The thing with lower resolutions is: if your 3090 is capable of delivering 80 fps at 2160p and the bar shows 80 fps, you don't know how good the CPUs are. If the bar shows 115 fps and you know the 3090 can only deliver 80 fps, you know how good the CPUs are and where the GPU is the bottleneck. That is especially important for future GPUs, and if the rumors are true we are getting something like >2x performance with the next GPU generations coming this year already.

Besides that, I don't like testing games at 300+ fps in CPU benchmarks. Who cares about 340 fps vs. 370 fps? There are heavily CPU-bound games out there like Anno, Total War, Cities: Skylines; hell, even in Elden Ring you're looking at something well below 100 fps on the CPU side. Why not test those games, where you actually need a lot more FPS?
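To put the 80 vs. 115 fps point another way: the bar you see on a chart is just the slower of the two limits. A toy sketch with made-up numbers:

```python
# Toy model: what a benchmark bar shows is capped by whichever of the
# CPU or GPU is slower for that scene and resolution.

def effective_fps(cpu_fps, gpu_fps):
    """The frame rate a CPU-or-GPU-limited benchmark actually reports."""
    return min(cpu_fps, gpu_fps)

# Made-up numbers in the spirit of the example: a GPU good for 80 fps at 2160p.
gpu_limit = 80
for cpu_fps in (80, 115):
    # Both CPUs report 80 fps at 2160p, so the chart says nothing about
    # them; only raising the GPU limit (lower resolution) separates them.
    print(cpu_fps, "->", effective_fps(cpu_fps, gpu_limit))
```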


birdie said:


> RPL according to the leaked information will feature an increased IPC for both its P and E cores as well as a significantly increased caches which means Intel will swiftly catch up and overtake this CPU and maybe even Zen 4.


According to the leaked information, the IPC increase won't be large, and the increased cache is "only" L2. It will help a bit here and there, but nothing special.



birdie said:


> This is the last hooray of AM4, there's no future upgrade path.


And yet people with old AM4 boards and Zen 2 or earlier get a big last upgrade if they're mainly into gaming. It looks exciting enough to me, comparing the original Zen (1800X etc.) with the 5800X3D and looking at the huge performance difference - requiring no new socket or motherboard.


----------



## Felix123BU (Apr 13, 2022)

This 5800X3D would be super interesting with a 1900 MHz+ capable IMC and very fast RAM with optimized timings.
Also, not sure, but does it support negative voltage adjustment?


----------



## Luminescent (Apr 13, 2022)

This is the stupidest contest between Intel and AMD to have: "best gaming CPU" at 1280x720.
It's like Windows users are in another reality with 1000 W power supplies and extreme water cooling, while Apple users now have those stylish tiny computers with ~400 W power supplies and whisper-quiet operation.
Intel, AMD and Nvidia should be thankful Apple is not interested in gaming and crypto-mining scheming; they would be screwed if Apple ever wanted a piece of this nasty market.


----------



## swirl09 (Apr 13, 2022)

billeman said:


> Nice processor if you want to game at 720p/1080p


Or 1440p. And even 4K, tbh - the difference is so small it's not worth mentioning, unless you specifically and exclusively game at that resolution (which I do). Intel's small edge is only realised if your target is 4K and you can get a good DDR5 kit (or I guess good DDR4, since you are likely to get one faster that'll play nice with Intel over AMD).

So AMD delivered what it said it would, and it's a no-brainer at this point in time to get a 5800X3D in most cases.

I still wish TPU would add a gaming power chart; the 12900 looks insane on the current charts when they in no way reflect a typical gaming scenario.



Zareek said:


> I'm underwhelmed for the price. With the exception of a few odd ducks, it's less than 10% faster than 5800x for 20% more cost.


That's one way to look at it. I think it's great to see how much extra performance they got out of a year-and-a-half-old CPU, while matching the launch price of the older chip rather than price creeping.


----------



## birdie (Apr 13, 2022)

Mussels said:


> I give up - you're just here to troll and waste time.



This is called ad hominem: attacking the person instead of addressing what he says. And thank you for *not* admitting you greatly exaggerated or lied about my statement.

What a nice discussion we have here. Congratulations on the high standards of the TPU forums. *Nowhere in my posts on TPU have I ever trolled or wasted anyone's time.* I'm not known for trolling, but I am well known for crushing the red-tinted glasses people love to put on when they talk about particular companies.



Maranak said:


> L4, not L3. Of course massive cache is not a new thing for CPUs, but now we are at a point where especially the latency is extremely good with AMD's L3.
> 
> It's not an experimental CPU, just look at what the massive L3 is capable of at Milan-X. That's the future because you get "free" performance boosts without launching a whole new architecture.
> 
> ...



What AMD fans absolutely love to do is talk about the future. Almost the entirety of your post talks about the future.

Speaking of the future: RPL will have massively increased L2/L3 caches, OK? And considering Intel has ample time to redesign them, they can go ahead and add even more L3 cache, because I'm sure as hell Intel will do anything to retain their performance crown.

Speaking of this CPU being experimental: why the hell are you talking about Milan-X? Is this a desktop CPU? Is TPU about servers or gaming PCs? Where are the other Zen 3 desktop SKUs with 3D V-Cache? Why is the 5800X3D priced so high?

People with old AM4 boards and Zen 2 or earlier will be better served by the normal 5800X/5900X CPUs, which show much better *overall* performance. People who have been waiting to upgrade normally don't rock an RTX 3090 and game at 720p or 1080p.

TLDR: not a single argument for the 5800X3D outside of some very specific games at very unusual resolutions.


----------



## W1zzard (Apr 13, 2022)

btk2k2 said:


> EDIT: I also find it a shame that nobody really tests non-FPS metrics. Some places do Factorio UPS, a few do Civ 6 turn time and Anandtech used to do something with Dwarf Fortress world building, but where are tick-rate tests for the Paradox grand strategy games, where are the simulation rate tests for Cities Skylines or the AI turn time tests for turn based games? I don't get the lack of testing for this. A lot of these games run fine at 4K on a pretty low end GPU but by end game the CPU is the part that is crying out for help but nobody seems to want to come up with a test for them. Kind of irritating really because I have to guess based on the FPS of games with high unit counts and a lot of background calculation going on at the same time like RTS games.


Can you start a new thread with ideas? I definitely want to add something like this in the next rebench



swirl09 said:


> I still wish TPU would add a gaming power chart, the 12900 looks insane on those charts when it in no way reflects a typical gaming scenario.


Will definitely be included in the next rebench, just too complicated to add now for 30 CPUs.

I've been recording gaming power already for a few months to get a feel for it. Don't take this as gospel; rather, consider it experimental and preliminary:






birdie said:


> @W1zzard nowhere in your review you mention Intel Broadwell which is quite sad actually. I expected you to be a little bit more well versed in the history of computing.





birdie said:


> trolled or wasted anyone's time


That's exactly what I thought when I read your first statement; could be language differences.


----------



## NicklasAPJ (Apr 13, 2022)

In Denmark this is a hard CPU to sell.

You can get the 12900KF for 479 Euro and the 5800X for 363 Euro, which means the 5800X3D will cost about the same as the 12900KF, with fewer cores...


----------



## xenocide (Apr 13, 2022)

birdie said:


> TLDR: not a single argument for 5800X3D outside of some very specific games at very unusual resolutions.


This is like, the coldest of takes. This isn't a CPU intended as an upgrade for someone using a 5800X or equivalent. It's a final upgrade for people on a 2xxx- or even 1xxx-series CPU, so they can go a few years longer without having to rebuild their entire PC. The performance is there, and in that regard it achieves its goal.


----------



## The King (Apr 13, 2022)

__ https://twitter.com/i/web/status/1514063342440116229


----------



## Antonis_35 (Apr 13, 2022)

Thank you for the very thorough and informative review. Personally I will wait a few weeks before I decide whether I will upgrade to this CPU, to avoid the perils of the early adopter.


----------



## chrcoluk (Apr 13, 2022)

A friend of mine is getting this and will test it on Lightning Returns. For those who don't know, that game is like the ultimate test for a CPU: it's single-threaded and has big optimisation issues, constantly flushing and reloading textures with massive stutters in certain areas. After he tests it, I'll report back on whether it helps a lot.


----------



## thunderingroar (Apr 13, 2022)

Why_Me said:


> DDR5 has shown to be slower than DDR4 in regards to gaming.


That's not true for high-end DDR5 like the 6000 CL36 kit used in this test.


----------



## Denver (Apr 13, 2022)

W1zzard said:


> Can you start a new thread with ideas? I definitely want to add something like this in the next rebench
> 
> 
> Will definitely be included in next rebench, just too complicated to add this now for 30 CPUs.
> ...


Yes, add more games please, and maybe also drop games like CS:GO, as its framerate is so out of line with everything else.


----------



## 529th (Apr 13, 2022)

Is it possible to add 1% lows and those types of numbers?


----------



## Aquinus (Apr 13, 2022)

birdie said:


> *Nowhere in my posts on TPU I've ever trolled or wasted anyone's time*.


I beg to differ. You waste people's time both here and over on the Phoronix forums. You've created quite a reputation for yourself. Either way, why would we test this against Broadwell when absolutely nothing in this review is that old? That's one example of how you are trolling and wasting our time, because nobody is going to buy a Broadwell chip now. The comparison would be strictly for you.


----------



## efikkan (Apr 13, 2022)

btarunr said:


> And Raptor Lake is built on existing Intel 7 node (10 nm), whereas Zen 4 is N5 (5 nm). If 5800X3D is a ~10% gaming perf uplift over Zen 3, the expectation from Zen4 will be set at 25% over Zen 3, which is about 10-15% above Alder Lake. Good luck to Intel trying to get there on existing node.


Are you talking about 25% performance uplift in general or 25% performance uplift in gaming (average)?
The latter will be virtually impossible as most games aren't significantly CPU bottlenecked unless you run them at unrealistically low resolutions.



Taraquin said:


> It depends a lot of what games prefer, many games love cache (BL3 and FC), some love latency (FC), some love bandwith (Cyberpunk, Total war), some love cores (RSS) some like everything (SOTTR).


I have never seen a developer intending to write code to _love_ cache or _love_ frequency.
If you wanted to write code to _love_ cache, you would have to write bloated code or de-optimize existing code. Cache sensitivity, especially to L3, is often regarded as a symptom of a lack of optimization and bloated code.
Any coder should want to make their code scale with CPU performance, so if the code scales with e.g. frequency, it's an indicator of better code (or, in your terms, it "loves" frequency).
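To illustrate what cache sensitivity looks like in practice: the same total work done with sequential versus strided access can time quite differently purely because of locality. A rough sketch (pure Python's interpreter overhead mutes the effect heavily compared to compiled code, so treat the timings as illustrative only):

```python
# Same total work, two access patterns: a sequential walk vs a large-stride
# walk over the same array. Every element is visited exactly once either
# way, so any timing gap is down to memory locality, not arithmetic.

import time

def walk(data, step):
    """Sum all elements, visiting them with the given stride."""
    total = 0
    n = len(data)
    for start in range(step):          # strided passes still cover all indices
        for i in range(start, n, step):
            total += data[i]
    return total

data = list(range(1_000_000))
for step in (1, 4096):
    t0 = time.perf_counter()
    checksum = walk(data, step)
    print(f"step={step}: {time.perf_counter() - t0:.3f}s, sum={checksum}")
```

In C or Rust with a flat array the strided version is reliably much slower; that gap is exactly the behavior a big L3 papers over.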



Antonis_35 said:


> Personally I will wait a few weeks before I decide whether I will upgrade to this CPU, to avoid the perils of the early adopter.


It's not a new platform or a new base die, so the only thing to watch out for is whether the assembly of the extra cache is problem-free.


----------



## fevgatos (Apr 13, 2022)

W1zzard said:


> Can you start a new thread with ideas? I definitely want to add something like this in the next rebench
> 
> 
> Will definitely be included in next rebench, just too complicated to add this now for 30 CPUs.
> ...


System consumption during gaming is a little bit useless. If CPU X drives the GPU harder, it will show up as higher system consumption when it's just the GPU being pushed harder / delivering more FPS. CPU power consumption would be more informative imo.


----------



## Garrus (Apr 13, 2022)

birdie said:


> *0.6%? OMG. A FAT CONCLUSIVE WIN.*


Birdie, you are intentionally not paying attention. It's a win against the most expensive CPU you can buy: $450 versus $800 or whatever.

It's an extra $100 over the 5800X, but it massively improves gaming performance - the main reason people upgrade from the 12600K to the 12900KS. Do you not understand why people spend an extra $550 doing that? It's $100 versus $550 for the same gaming improvements.

So now you'll save more than $700 getting DDR4-3600 C14 (please W1zzard, those sticks cost $130 on Newegg, buy a pair and compare) with a 5800X3D instead of buying the 12900KS with $400 DDR5, and get the same gaming FPS on average. Then you save $100+ on the motherboard too. It's fantastic, especially for people like my excited brother who has a five-year-old X370 board sitting on the shelf. Can you find an almost-free or $50 motherboard for your 12900KS that can handle 300 W? Because the 5800X3D uses only ~150 W, cheap normal tower coolers work with it, so you've saved another $50-$150, and you don't have to worry about VRMs overheating on your motherboard.

I could go on. Do you not understand, or do you not want to understand? It might just be 0.6%, but it is ~$800 all-in cheaper for the same gaming performance, with none of the pain of cooling 300 W.
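If you tally my rough, back-of-the-envelope numbers (all hypothetical street prices, not quotes):

```python
# Back-of-the-envelope platform totals using the hypothetical prices from
# this post (rough US street prices at the time, not actual quotes).

platforms = {
    "5800X3D + DDR4": {"cpu": 450, "ram": 130, "board": 100, "cooler": 50},
    "12900KS + DDR5": {"cpu": 800, "ram": 400, "board": 250, "cooler": 150},
}

totals = {name: sum(parts.values()) for name, parts in platforms.items()}
saving = totals["12900KS + DDR5"] - totals["5800X3D + DDR4"]
print(totals, "-> difference:", saving)  # lands in the ~$800 ballpark
```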

If you don't get it, let others buy it. More for me.

For me, the very first time I got irritated by a CPU bottleneck was playing Borderlands 3. It was fixed for Tiny Tina, but Borderlands 3 being fixed by the Ryzen 5800X3D is perfect for me. If that is a game you like, you'll love this CPU. I don't care about Counter-Strike, yikes.


----------



## Denver (Apr 13, 2022)

efikkan said:


> Are you talking about 25% performance uplift in general or 25% performance uplift in gaming (average)?
> The latter will be virtually impossible as most games aren't significantly CPU bottlenecked unless you run them at unrealistically low resolutions.
> 
> 
> ...


25-30% in overall IPC.

I believe CPUs will matter even at 4K, with new GPUs being 2x faster and boosting framerates to the moon.


----------



## Pastuch (Apr 13, 2022)

Bwaze said:


> Yes, certain games show great uplifts even at higher resolutions. Borderlands 3, Far Cry 5. But some games show almost zero uplift, and they aren't GPU bound - RDR2 for instance.


I feel like everyone, including W1zzard, is missing the point… Based on the numbers, this could be the best battle royale CPU ever made. There are millions of PC gamers who exclusively play BR games and spend a fortune on hardware to do it. Yet no one tested Apex, Warzone, or Halo (its BR was just announced).


----------



## Dr. Dro (Apr 13, 2022)

Given the sheer amount of hype surrounding this processor, I frankly expected more. It basically closes the gap with the 5950X in games, outperforms it in a few by... up to 7.5%? And it gets spanked by the regular 5800X everywhere else...

It's not a bad chip, but it's not a $450 processor in this day and age. For $350, AMD would have a winner here.



Pastuch said:


> I feel like everyone, including Wizard, is missing the point… Based on the numbers this could be the best Battle Royale cpu ever made. There are millions of pc gamers that exclusively play BR games and spend a fortune on the hardware to do it. Yet, no one tested Apex, Warzone, or Halo (br was just announced).



I don't think this CPU will be significantly better than the others in Apex. I'm clearly GPU-bottlenecked with a 5950X + 3090.


----------



## ARF (Apr 13, 2022)

Dr. Dro said:


> Given the sheer amount of hype surrounding this processor I frankly expected more. It basically closes the gap with the 5950X in games, outperforms it in a few by... up to 7.5%? And it gets spanked by the regular 5800X everywhere else...
> 
> It's not a bad chip but, it's not a $450 processor in this day and age. For $350, AMD would have a winner here.



Even $350 is too much. This is 2020 performance level, and the new Zen 4 generation of Ryzen is coming soon.
You can just hold off on the purchase to see the new generation.


----------



## InVasMani (Apr 13, 2022)

btarunr said:


> How's this for an "up to"?
> View attachment 243457
> 
> That's a 43% gain over the 5800X, and at a resolution that can be GPU-limited for every graphics card priced under $1000.


The thing is, a lot of the other games tested had RTRT, and I get the feeling the GPU is the biggest limitation in those. If you disable the RTRT settings in those other titles, I bet things start looking more favorable for the cache there as well. If that is the case, it simply means newer GPU architectures will better leverage the cache down the road.


----------



## Dr. Dro (Apr 13, 2022)

ARF said:


> Even 350 is too much. This is 2020 performance level and Ryzen Zen 4 new generation is coming soon.
> You can just hold the purchase to see the new generation.



I mean, it is fairly competitive with the i7-12700K, though I would not really call it a winner over that processor (it's not a decisive win wherever it does win, and it's a decisive loss elsewhere). The i9-12900K is ahead, as are the Ryzen 9 chips. It's... just that, a performance segment processor that is built around some genuinely interesting, new technology, at the same time it's got the drawbacks of the existing technology that serves as its foundation. $350 would be fine, $450 is just really steep, but I do not expect many of these will be made anyway. It's a last salvo for AM4, and a showcase of their future tech.

The way I see it, it's a beta product, kind of like the R9 Fury X was a beta for HBM GPUs. There is a future here, but there is also a long road to walk through yet.


----------



## ARF (Apr 13, 2022)

Dr. Dro said:


> I mean, it is fairly competitive with the i7-12700K, though I would not really call it a winner over that processor (it's not a decisive win wherever it does win, and it's a decisive loss elsewhere). The i9-12900K is ahead, as are the Ryzen 9 chips. It's... just that, a performance segment processor that is built around some genuinely interesting, new technology, at the same time it's got the drawbacks of the existing technology that serves as its foundation. $350 would be fine, $450 is just really steep, but I do not expect many of these will be made anyway. It's a last salvo for AM4, and a showcase of their future tech.
> 
> The way I see it, it's a beta product, kind of like the R9 Fury X was a beta for HBM GPUs. There is a future here, but there is also a long road to walk through yet.



Yes, this is what I am saying too.
Regarding HBM, I think AMD gave up on it for gaming products. They now use a large last-level cache called Infinity Cache to offset the lower memory throughput.


----------



## laszlo (Apr 13, 2022)

Even if AM4 will reach EOL soon, I think AMD made this on purpose, just to show Intel what to expect in the near future...

Considering current electricity prices, which won't go down, only higher... it needs only ~50% of what the 12900KS is consuming. That's a remarkable achievement looking at the FPS output!

Great job AMD!!


----------



## Dr. Dro (Apr 13, 2022)

ARF said:


> Yes, this is what I am saying too.
> Regarding HBM, I think AMD gave up on this for the gaming products. It uses now third level cache called Infinity cache in order to offset the lower memory throughputs.



They did, yes. But they also made three generations of HBM hardware since the Fury; it just became unsustainable for the consumer market due to high costs, not because the tech sucks.


----------



## btk2k2 (Apr 13, 2022)

laszlo said:


> even if AM4 will reach eol soon i think AMD make it on purpose just to show Intel what to expect in near future...
> 
> considering the current electricity prices, which won't go down only higher... it needs only ~50% of what  the 12900ks is consuming....this a remarkable achievement looking the fps output!
> 
> great job AMD !!



I think AM4 will last for a good while yet. I expect AM5 + Zen 4 + DDR5 to be an expensive platform, so AMD can use AM4 to offer more budget options.


----------



## Taraquin (Apr 13, 2022)

efikkan said:


> Are you talking about 25% performance uplift in general or 25% performance uplift in gaming (average)?
> The latter will be virtually impossible as most games aren't significantly CPU bottlenecked unless you run them at unrealistically low resolutions.
> 
> 
> ...


Of course they don't write code for cache etc., but different game engines scale differently. Geometry, shaders, etc. matter for what a game utilizes. The engine used in Total War and Cyberpunk loves bandwidth and performs excellently with DDR5 because of this; the engine used in FC5 and 6 mostly uses one thread and performs very well on Alder Lake due to its high IPC, etc.


----------



## Pastuch (Apr 13, 2022)

W1zzard said:


> Can you start a new thread with ideas? I definitely want to add something like this in the next rebench



Want to generate insane page clicks and draw a massive different audience to TPU? Call the next piece you write on the 5800x3d something along the lines of "Best Warzone and Apex Legends CPU" then compare it to the 12900ks. Do the testing with dual rank Bdie at 3600c14 on both platforms at 1080p and 1440p (None of us use 4k).  People like me don't play anything that isn't a BR, we wake up EVERY SINGLE DAY and play Kovaaks aim trainer for at least 30 minutes, then lock in to no-life Battle Royale games. If I'm not sweating, I'm not having fun. Warzone is begging for the 5800x3d tests, you wouldn't believe how many youtube content creators have built careers off of sweat lords trying to squeeze 5 more frames out of their CPU for Warzone. I've been overclocking CPUs for 20 years but Warzone forced me spend 50 hours working on mastering Bdie subtimings to get my AIDA memory latency on Ryzen down to 53.4ns with 4 dimms. 

I play with an army of guys that have spent over $3000 each on just Warzone hardware since the game came out. Maintaining 200+ FPS in Warzone is a right of passage in the overclocking community. At present, no Ryzen chips compare to a 12900k in Warzone, no matter how low you get your memory latency and no matter how high you push your boost clocks maintaining a minimum of 200 FPS in that game at 1080P is almost impossible on Ryzen. Test Warzone in the Port area of Caldera, it eats CPUs for breakfast. 

If the 5800X3D can do 250 FPS with 220 FPS lows in Warzone, I would happily pay $600 USD for it. There are hundreds of millions of "BR only" gamers; I genuinely suspect the 5800X3D was made for us.


----------



## FarukPehlione (Apr 13, 2022)

Gaming performance of this processor looks really good, no doubt about it. I'm currently using the Ryzen 5 1600 AF. I have a question: I want to play games and broadcast live at the same time, using the processor for the encoding, so there will be an FPS drop. Which processor would you buy now, the Ryzen 7 5800X3D or the 12-core Ryzen 9 5900X? I'm really undecided and would appreciate your advice.


----------



## Dr. Dro (Apr 13, 2022)

FarukPehlione said:


> Gaming performance of this processor looks really good, no doubt about it. I'm currently using the Ryzen 5 1600 AF. I have a question: I want to play games and broadcast live at the same time, using the processor for the encoding, so there will be an FPS drop. Which processor would you buy now, the Ryzen 7 5800X3D or the 12-core Ryzen 9 5900X? I'm really undecided and would appreciate your advice.



5900X hands down if you plan on doing any sort of serious video encoding and/or streaming



Pastuch said:


> Want to generate insane page clicks and draw a massively different audience to TPU? Call the next piece you write on the 5800X3D something along the lines of "Best Warzone and Apex Legends CPU", then compare it to the 12900KS. Do the testing with dual-rank B-die at 3600C14 on both platforms at 1080p and 1440p (none of us use 4K). People like me don't play anything that isn't a BR; we wake up EVERY SINGLE DAY and play the Kovaak's aim trainer for at least 30 minutes, then lock in to no-life battle royale games. If I'm not sweating, I'm not having fun. Warzone is begging for 5800X3D tests; you wouldn't believe how many YouTube content creators have built careers off of sweat lords trying to squeeze 5 more frames out of their CPU for Warzone. I've been overclocking CPUs for 20 years, but Warzone forced me to spend 50 hours mastering B-die subtimings to get my AIDA memory latency on Ryzen down to 53.4 ns with 4 DIMMs.
> 
> I play with an army of guys who have spent over $3000 each on just Warzone hardware since the game came out. Maintaining 200+ FPS in Warzone is a rite of passage in the overclocking community. At present, no Ryzen chip compares to a 12900K in Warzone; no matter how low you get your memory latency and no matter how high you push your boost clocks, maintaining a minimum of 200 FPS in that game at 1080p is almost impossible on Ryzen. Test Warzone in the Port area of Caldera, it eats CPUs for breakfast.
> 
> If the 5800X3D can do 250 FPS with 220 FPS lows in Warzone, I would happily pay $600 USD for it. There are hundreds of millions of "BR only" gamers; I genuinely suspect the 5800X3D was made for us.



Okay, but you still reek of Gold IV, what a scrub doesn't even own Bloodhound's prestige skin smh 

Nah for real, this chip really, really won't be such a big difference on Apex. Can't speak for Warzone as I don't play it, but Apex is fairly easy to run CPU-side. You want more GPU horsepower for it.


----------



## fevgatos (Apr 13, 2022)

Garrus said:


> Birdie, you are intentionally not paying attention. It's a win against the most expensive CPU you can buy. $450 versus $800 or whatever.
> 
> It's an extra $100 over the 5800X, but massively improves the gaming performance, the main reason people upgrade from the 12600k to the 12900ks, do you not understand why people spend an extra $550 doing that? $100 versus $550 for the same gaming improvements.
> 
> ...


Or your brother buys a 12700F with a new mobo with newer I/O, better features and longevity? I mean, the CPU + mobo is going to cost as much as the 3D on its own, lol. Of course the 3D will lead by 3-4% in 720p gaming, but it gets destroyed in everything else. AMD fleecing customers again


----------



## blu3dragon (Apr 13, 2022)

Did anyone figure out if PBO works with it or not?
Might be fun to play with.


----------



## fevgatos (Apr 13, 2022)

laszlo said:


> even if AM4 will reach eol soon i think AMD make it on purpose just to show Intel what to expect in near future...
> 
> considering the current electricity prices, which won't go down only higher... it needs only ~50% of what  the 12900ks is consuming....this a remarkable achievement looking the fps output!
> 
> great job AMD !!


???? Jesus with these stupid posts. Where did you see that it needs half the consumption for the same fps? The only time the 12900 consumes double is in cinebench, and it actually gets double the score as well, so efficiency is the same.


----------



## lexluthermiester (Apr 13, 2022)

Just read the review and I'm not impressed. The drop in clocks hurt performance.

@ AMD, 
You folks really needed to have the 5900X3D, with the clocks running at or close to the 5900X. Just saying..


----------



## Pastuch (Apr 13, 2022)

Dr. Dro said:


> Given the sheer amount of hype surrounding this processor I frankly expected more. It basically closes the gap with the 5950X in games, outperforms it in a few by... up to 7.5%? And it gets spanked by the regular 5800X everywhere else...
> 
> It's not a bad chip but, it's not a $450 processor in this day and age. For $350, AMD would have a winner here.
> 
> ...



I'm no Apex expert but doesn't it only use a maximum of 8 cores and 16 threads? In that case your 5950x would always show a lot of idle CPU time because it can only use half your CPU.


----------



## Dr. Dro (Apr 13, 2022)

Pastuch said:


> I'm no Apex expert but doesn't it only use a maximum of 8 cores and 16 threads? In that case your 5950x would always show a lot of idle CPU time because it can only use half your CPU.



Apex runs on 3 threads only. The amount used can be specified with the +threads=X command-line argument at launch, but it does not use more than three. Even a quad-core processor is enough to take full advantage of it, so an i5-12600K or something with really fast Golden Cove cores would be the winner for it, hands down. But even large maps like Storm Point run perfectly fine

The additional CPU load on other threads is due to D3D11 scheduling, especially on NVIDIA.


----------



## InVasMani (Apr 13, 2022)

lexluthermiester said:


> Just read the review and I'm not impressed. The drop in clocks hurt performance.
> 
> AMD, you folks really needed to have the 5900X3D. Just saying..


5600X3D would've been pretty interesting too. Where this chip would have killed it is on an APU.


----------



## Pastuch (Apr 13, 2022)

Dr. Dro said:


> Apex runs on 3 threads only. The amount used can be specified with the +threads=X command-line argument at launch, but it does not use more than three. Even a quad-core processor is enough to take full advantage of it, so an i5-12600K or something with really fast Golden Cove cores would be the winner for it, hands down. But even large maps like Storm Point run perfectly fine


I watch a bunch of tech toobers and they all show that 300+ FPS is relatively easy in Apex. Warzone, on the other hand, is a different beast: NOTHING can maintain 280+ FPS right now in Warzone, and we all have 240 Hz+ monitors that we aren't using to their full potential.

The new Halo BR is going to be huge too and then we get Warzone 2 early 2023.


----------



## W1zzard (Apr 13, 2022)

blu3dragon said:


> Did anyone figure out if PBO works with it or not?
> Might be fun to play with.


PBO does not work, as mentioned several times, neither does multiplier increase or decrease. BCLK works to a tiny degree, if you have a special motherboard. B550 might work better than X570, but we're talking 2-7% OC here, not worth it. Didn't test it but voltage control should work, because on AM4 it's a function of the motherboard (no FIVR in the CPU)


----------



## Chomiq (Apr 13, 2022)

W1zzard said:


> PBO does not work, as mentioned several times, neither does multiplier increase or decrease. BCLK works to a tiny degree, if you have a special motherboard. B550 might work better than X570, but we're talking 2-7% OC here, not worth it. Didn't test it but voltage control should work, because on AM4 it's a function of the motherboard (no FIVR in the CPU)


Can't wait for "Heavily tuned 5800X @ 1900 IF vs 5800X3D" comparisons.


----------



## phanbuey (Apr 13, 2022)

I think people are forgetting this is just a precursor to Zen 4 - and that it's proof that 3d cache does make a difference in memory bound applications.  IMO the trend is great.

First AMD gives us more cores in desktop, then they will force Intel to give us all more cache.

This is a great showing.


----------



## Pastuch (Apr 13, 2022)

Chomiq said:


> Can't wait for "Heavily tuned 5800X @ 1900 IF vs 5800X3D" comparisons.


The biggest performance improvements on Ryzen are always in the memory and Infinity Fabric tuning, which is still possible on the 5800X3D. I hope some of them can do more than 1866 IF. I wonder what the lowest memory latency you can get with a 5800X3D is? Maybe @W1zzard can add his results to the AIDA latency thread?
Post in thread 'Share your AIDA 64 cache and memory benchmark here'
https://www.techpowerup.com/forums/...and-memory-benchmark-here.186338/post-4687879


----------



## InVasMani (Apr 13, 2022)

If you drop the memory divider a setting or two, can you eke out any additional BCLK headroom, trading some peak bandwidth for CPU clock speed? It would be interesting to see if that works for AMD like it does for Intel with Skylake. I can push BCLK to around 287 MHz on Skylake with that trick. I also noticed the multiplier drops when you set the memory divider lower.


----------



## Pastuch (Apr 13, 2022)

InVasMani said:


> If you drop the memory divider a setting or two, can you eke out any additional BCLK headroom, trading some peak bandwidth for CPU clock speed? It would be interesting to see if that works for AMD like it does for Intel with Skylake. I can push BCLK to around 287 MHz on Skylake with that trick. I also noticed the multiplier drops when you set the memory divider lower.


It’s never worth using a divider on Ryzen, the performance loss of not running 1:1 IF:memory is massive
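To put rough numbers on the 1:1 rule of thumb, here's a toy sketch (my simplification; the `effective_clocks` helper and the 1800 MHz standalone-FCLK default are illustrative assumptions, not AMD spec):

```python
# Rough sketch of Zen 3 / DDR4 memory-clock coupling.
# Simplified: real boards let you set FCLK independently; the 1800 MHz
# fallback below is just a typical value, not a guaranteed cap.

def effective_clocks(ddr_rate_mts, coupled=True, fclk=None):
    """Return MCLK/UCLK/FCLK in MHz for a DDR4 data rate in MT/s."""
    mclk = ddr_rate_mts // 2                 # memory clock is half the data rate
    uclk = mclk if coupled else mclk // 2    # 1:2 mode halves the controller clock
    if fclk is None:
        fclk = mclk if coupled else 1800     # async FCLK also adds latency
    return {"MCLK": mclk, "UCLK": uclk, "FCLK": fclk}

# DDR4-3600 in 1:1 mode: everything runs at 1800 MHz, latency is lowest.
print(effective_clocks(3600))
# DDR4-4400 forced out of 1:1: UCLK drops to 1100 MHz, so the extra
# bandwidth rarely makes up for the added latency.
print(effective_clocks(4400, coupled=False))
```

The point being: dropping the divider halves the memory-controller clock, which is where the "massive" latency penalty comes from.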


----------



## nicamarvin (Apr 13, 2022)

Pastuch said:


> It’s never worth using a divider on Ryzen, the performance loss of not running 1:1 IF:memory is massive


Except that we are in new territory here with 3D V-Cache so who knows? I would say try everything even if it does not make sense.


----------



## HD64G (Apr 13, 2022)

@W1zzard : Since 1080p and 1440p are the best resolutions for CPU comparisons in gaming, I would like to see the same CPU paired with a 6900 XT, since that would remove most of the GPU bottlenecks we see in this review. The only other CPUs that need to be tested against it are the 5800X and 12900KS: the non-3D-cached part with the same arch and core count, and the other contender for the gaming CPU crown.


----------



## nicamarvin (Apr 13, 2022)

HD64G said:


> Since for CPU comparisons in gaming, 1080P and 1440P are the best resolutions to go for, I would like to see the same CPU tested with a 6900XT since it will remove for sure most of the GPU bottlenecks we see in the review.


Really? It's the 3080 bottlenecking the 5800X3D at 1080p?


----------



## defaultluser (Apr 13, 2022)

billeman said:


> Nice processor if you want to game at 720p/1080p




The whole point of doing that: when you buy a new processor, *you know how much gas it has left in the tank!*

I hope you don't buy a new GPU every time you add a demanding new game to your library? Most of us out there in the PC hardware world don't consider buying a 20% hardware bump, then reselling the old hardware post-haste, to be part of the game!

If you're buying a new processor that will last you years, then comparisons at 720p tell you:

1. how much faster it is than your old CPU (for those buying an upgrade on AM4)
2. how fast it is in comparison to the rest of the market (for those buying new)


----------



## HD64G (Apr 13, 2022)

nicamarvin said:


> Really? It's the 3080 bottlenecking the 5800X3D?


Check the graphs on the left side below and compare the FPS of the 6900 XT (all GPUs are tested with the 5800X) vs the graphs on the right side with the RTX 3080. It's clear the 3080 isn't strong enough to push the best CPUs to their limits in many games at resolutions below 4K.


----------



## defaultluser (Apr 13, 2022)

InVasMani said:


> 5600X3D would've been pretty interesting too. Where this chip would have killed it is on an APU.



That's kinda redundant when you already have zen 4 in the pipeline.


----------



## darksf (Apr 13, 2022)

birdie said:


> This is called ad hominem. Attacking the person vs talking about what he says. And thank you for *not* admitting you greatly exaggerated or lied about my statement.
> 
> What a nice discussion we have here. Congratulations on the high standards of TPU forums. *Nowhere in my posts on TPU I've ever trolled or wasted anyone's time*. I'm not known for trolling but I'm well known for crushing red-tinted glasses people love to put on when they talk about particular companies.
> 
> ...


You started with "AMD fans" and then you wonder why everyone is meeting you with such "joy".

"It's too effing expensive for what it offers" - is it? I could say the same for every CPU on the list; they're all too freaking expensive for what they offer. They should be free, given the amount of money I'm going to spend on electricity just to listen to MP3s and argue on the TPU forums. My trusty IBM T60 with a C2D is free and can do that.

There is a specific argument when talking about gaming, and especially competitive gaming: "smooth gameplay". There is an entire section of the review called "Frametime Analysis" dedicated to this, which you obviously chose to ignore. As a CS (Source and GO), DOTA 2 and WoT player, I can tell you I'm far more interested in a stable framerate than a higher average framerate.
And it turns out the i7-5775C does this perfectly (with the iGPU disabled and the L4 clocked at 2000 MHz). It's not a fringe case either; big, fast, low-latency memory has proven its worth over and over again.

Yes, the 5800X3D is nothing special in everyday tasks and is expensive, but everyday tasks are nothing special on their own, and I don't need to read a CPU review to pick a CPU for office work; I just check component prices.

But for my personal home machine (where I'm persistently using Windows 7 so I can play NFS Porsche and some other legacy titles and use my trusty Creative X-Fi Titanium without doing tons of hacks each time Windows 10 updates) I want a monolithic, high-cache CPU so I can enjoy the multiplayer games I play the way I want. The i7-5775C does this perfectly, the 5800X3D will do it even better, and best of all, with a one-time set of hacks and tricks I'll be able to set up Windows 7 on it. You can't put a price on that.

In the end, the 5800X3D is what it was expected to be: a really good, if not the best, GAMING CPU because of its stable frame rates. I suppose it will receive the same spitting the i7-5775C received back when it launched (3.7 GHz turbo vs the 4790K's 4.4 GHz; a 4.2 GHz overclock vs the 4790K's 4.7/4.8, even 5.0 GHz for some). But in the end that chip turned out invaluable at what it does best: stable FPS.


----------



## JayEe (Apr 13, 2022)

fevgatos said:


> So basically its about 3-4% faster in games than a 12700f, but gets absolutely creamed in everything else (single thread, multithread, upgradability) while also costing 50% more? Woah, thats just a bad product. Needs a big pricecut to 300-350. At 450 its a joke


Fastest gaming cpu under 350€/$?? Keep on dreaming


----------



## 529th (Apr 13, 2022)

Good lord I forgot there was a frame-time analysis section.  Thanks for including this!

@darksf thanks for pointing that out.  It really is more important than Avg fps winners

sheesh


----------



## progste (Apr 13, 2022)

Very interesting results, it would be fun to dig into why and how some games benefit more from it than others, but if your main interest is gaming performance I would say it's a good CPU.
It's a bit of a shame it doesn't give an advantage in non-gaming workloads, but I guess that wasn't the objective.


----------



## fevgatos (Apr 13, 2022)

JayEe said:


> Fastest gaming cpu under 350€/$?? Keep on dreaming


Even if it was (it isn't), Intel frequently had the fastest gaming CPU at that price: the 7700 / 8700. But I guess this is AMD fleecing customers as per usual


----------



## Gooigi's Ex (Apr 13, 2022)

fevgatos said:


> Or your brother buys a 12700F with a new mobo with newer I/O, better features and longevity? I mean, the CPU + mobo is going to cost as much as the 3D on its own, lol. Of course the 3D will lead by 3-4% in 720p gaming, but it gets destroyed in everything else. AMD fleecing customers again


Longevity? Two generations are not longevity. He can just wait for AM5 and get the X600 processor



fevgatos said:


> ???? Jesus with these stupid posts. Where did you see that it needs half the consumption for the same fps? The only time the 12900 consumes double is in cinebench, and it actually gets double the score as well, so efficiency is the same. Why can't amd fanboys stop spreading nonsense is beyond me.



Did you forget to look at this? 








fevgatos said:


> Even if it was (it isnt), intel frequently had fastest gaming cpu at that price. 7700 / 8700. But i guess, this is amd fleecing customers as per the usual


7700/8700??? This is 2022, not 2017. You need to let that go. Intel is not that guy anymore


----------



## fevgatos (Apr 13, 2022)

Gooigi's Ex said:


> Longevity? Two generations are not longevity. He can just wait for AM5 and get the X600 processor
> 
> 
> 
> ...


Two generations are more than what AM4 offers right now. 

Is 566 double 510? Oh okay then


----------



## lexluthermiester (Apr 13, 2022)

InVasMani said:


> 5600X3D would've been pretty interesting too.


Very likely, as long as the clocks were not dropped.


InVasMani said:


> Where this chip would have killed it is on an APU.


Again, very likely. I can see this 3DCache making a big improvement for APUs.

This is AMD's first go with this technology. While the overall performance was not so impressive, it's a solid first go. As they refine it, it'll become something special.


----------



## Chomiq (Apr 13, 2022)

lexluthermiester said:


> Very likely, as long as the clocks were not dropped.
> 
> Again, very likely. I can see this 3DCache making a big improvement for APUs.
> 
> *This is AMD's first go with this technology*. While the overall performance was not so impressive, it's a solid first go. As they refine it, it'll become something special.


Not if you count Milan-X.


----------



## nicamarvin (Apr 13, 2022)

Chomiq said:


> Not if you count Milan-X.


He was talking about 3D V-Cache technology in general, which is exactly the same as found on the 5800X3D. AMD just repurposed those dies as "gaming" CPUs, which they do quite well. But Milan-X is where they shine, putting a beatdown on regular EPYC and Intel's 10 nm Xeons


----------



## lexluthermiester (Apr 13, 2022)

Chomiq said:


> Not if you count Milan-X.


That's a fair point, but those are EPYC CPUs on a whole different platform and manufacturing system, which are not available to the general consumer. In the consumer/prosumer space, this is AMD's first go and while they have the experience with EPYC to go on, it's still a very different beast.


----------



## Aquinus (Apr 13, 2022)

lexluthermiester said:


> Just read the review and I'm not impressed. The drop in clocks hurt performance.
> 
> @ AMD,
> You folks really needed to have the 5900X3D, with the clocks running at or close to the 5900X. Just saying..





lexluthermiester said:


> That's a fair point, but those are EPYC CPUs on a whole different platform and manufacturing system, which are not available to the general consumer. In the consumer/prosumer space, this is AMD's first go and while they have the experience with EPYC to go on, it's still a very different beast.


What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism. I am wondering if the OS plays a role here when it comes to scheduling because that can impact how data is evicted from cache, which is why I'd like to see some benchmarks with this chip in Linux to see if the trend is consistent with what we're seeing here, because it is not at all what I expected given Milan-X's performance uplift _*in Linux*_.


----------



## lexluthermiester (Apr 13, 2022)

Aquinus said:


> What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism. I am wondering if the OS plays a role here when it comes to scheduling because that can impact how data is evicted from cache, which is why I'd like to see some benchmarks with this chip in Linux to see if the trend is consistent with what we're seeing here, because it is not at all what I expected given Milan-X's performance uplift.


Good point. Those results would be interesting to see.


----------



## Aquinus (Apr 13, 2022)

lexluthermiester said:


> Good point. Those results would be interesting to see.


It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.


----------



## nicamarvin (Apr 13, 2022)

Aquinus said:


> What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism


Are you really comparing real-world OS like Linux/Unix with Windows? Real Apps with Gaming/Benching apps?



Aquinus said:


> It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.


The 5900X3D prototype with 192 MiB had the same performance uplift (15% average) in the same games as the 5800X3D. You just need to realize that gaming is really a niche segment of computing. Unless game developers start coding for 3D V-Cache, it will stay like this.


----------



## Aquinus (Apr 13, 2022)

nicamarvin said:


> Are you really comparing real-world OS like Linux/Unix with Windows? Real Apps with Gaming/Benching apps?


Well, the funny thing is that the improvement we saw in gaming is what I would have expected across the board, because that's what Milan-X demonstrated the vast majority of the time. So yes, I am looking squarely at Windows until I get more data.


----------



## lexluthermiester (Apr 13, 2022)

Aquinus said:


> It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.


Agreed, more analysis needs to be done. 

I'd like to be clear, I'm not criticizing W1zzard's methodology. His testing methods are sound. However, he is limited by the tools available. As testing tools, programs and games become available that are aware of the additional cache, and are optimized for it, the benefit and scope of the 3DCache effect will become more clear.



nicamarvin said:


> Are you really comparing real-world OS like Linux/Unix with Windows?


Yes. Such is a valid comparison. Not everyone uses Windows.



Aquinus said:


> Well, the funny thing is that the improvement we saw in gaming is what I would have expected across the board, because that's what Milan-X demonstrated the vast majority of the time. So yes, I am looking squarely at Windows until I get more data.


This.


----------



## nicamarvin (Apr 13, 2022)

Aquinus said:


> Well, the funny thing is that the improvement we saw in gaming is what I would have expected across the board, because that's what Milan-X demonstrated the vast majority of the time. So yes, I am looking squarely at Windows until I get more data.


Really? Windows users use benchmark apps like Geekbench and Cinebench for bragging rights. Scientists/engineers use Linux and Ansys/OpenFOAM to make money and save lives.







The truth is that 3D V-Cache is for HPC computational fluid dynamics simulations. That it did so well in gaming is just a byproduct of AMD pushing for ever more performance at the HPC level. Games and desktop apps just don't see that type of performance improvement.


----------



## Aquinus (Apr 13, 2022)

nicamarvin said:


> Really? Windows users use benchmark apps like Geekbench and Cinebench for bragging rights. Scientists/engineers use Linux and Ansys/OpenFOAM to make money and save lives.
> 
> View attachment 243511


That's not the 5800X3D, but I think that kind of proves what I'm trying to say. I'm expecting more of an uplift than I'm seeing in this review, given Milan-X's performance uplift. This is why I suspect it's not the chip but the OS. Once again, we need some Linux numbers for this chip to confirm that suspicion; right now it's just a theory with the information we have.


----------



## David Fallaha (Apr 13, 2022)

phanbuey said:


> IDK about UK but you can get a 6000 CL 36 kit here for $360 -- in fact that's what I'm running now, and it's faster than my old DDR4 32gb 4133 4x single rank b dies.
> 
> View attachment 243426
> 
> Prices aren't that different here anymore and haven't been for a while.


Fair enough. That said I think it would be fascinating to know if the V-cache eliminates the need for 3800CL14 tuned RAM? I’ve also just seen some rumours about successful 5800X3D BCLK overclocking



tussinman said:


> I think he's more referring to price against realisitic alternates, obviously that doesn't include the 12900k.
> 
> 12700k for example is over 10% faster in CPU test, only 2-3% slower in gaming + has 13th gen support and it's $100-125 cheaper. 5700/5800X is only like 6-9% slower in gaming and can be had for 25-30% cheaper


Sure but don’t forget we’re talking about an AMD chip that is a literal drop in upgrade vs an entire new build…


----------



## efikkan (Apr 13, 2022)

Aquinus said:


> What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism. I am wondering if the OS plays a role here when it comes to scheduling because that can impact how data is evicted from cache, which is why I'd like to see some benchmarks with this chip in Linux to see if the trend is consistent with what we're seeing here, because it is not at all what I expected given Milan-X's performance uplift _*in Linux*_.


Only to the extent that having more running in the background, or using more cores, will "pollute" the L3 more.
Even with aggressive scheduling, the OS is usually far too slow to matter next to the rate of data flowing through the caches. Considering the L3 is an LRU spillover cache for anything evicted from L2 (across all the cores), it gets overwritten at an incredible rate. Effectively, the more data an algorithm churns through, the less effective the L3 will be.
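That churn argument can be illustrated with a toy victim-cache simulation (purely illustrative; the line counts, uniform-random access pattern and strict LRU policy are simplifications I've assumed, not Zen 3 specifics):

```python
from collections import OrderedDict
import random

class VictimL3:
    """Tiny model of an L3 that only holds lines evicted (spilled) from L2."""
    def __init__(self, l2_lines, l3_lines):
        self.l2 = OrderedDict()   # OrderedDict in insertion order works as LRU
        self.l3 = OrderedDict()
        self.l2_lines, self.l3_lines = l2_lines, l3_lines
        self.l3_hits = self.misses = 0

    def access(self, line):
        if line in self.l2:
            self.l2.move_to_end(line)          # refresh LRU position
            return
        if line in self.l3:                    # hit in the victim cache
            self.l3_hits += 1
            del self.l3[line]
        else:
            self.misses += 1                   # would be fetched from DRAM
        self.l2[line] = None
        if len(self.l2) > self.l2_lines:
            victim, _ = self.l2.popitem(last=False)  # L2 eviction spills into L3
            self.l3[victim] = None
            if len(self.l3) > self.l3_lines:
                self.l3.popitem(last=False)          # churned out of L3 too

def l3_hit_rate(working_set, accesses=200_000, seed=0):
    """Fraction of L2 misses that the spillover L3 manages to catch."""
    rng = random.Random(seed)
    cache = VictimL3(l2_lines=512, l3_lines=4096)
    for _ in range(accesses):
        cache.access(rng.randrange(working_set))
    return cache.l3_hits / (cache.l3_hits + cache.misses)

# Working set just over combined capacity vs. ten times larger:
print(l3_hit_rate(5_000), l3_hit_rate(50_000))
```

Push the working set well past the combined capacity and the L3 hit rate collapses, because lines are overwritten before they're ever reused; that's the churn effect described above.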


----------



## phanbuey (Apr 13, 2022)

David Fallaha said:


> Fair enough. That said I think it would be fascinating to know if the V-cache eliminates the need for 3800CL14 tuned RAM? I’ve also just seen some rumours about successful 5800X3D BCLK overclocking
> 
> 
> Sure but don’t forget we’re talking about an AMD chip that is a literal drop in upgrade vs an entire new build…



Yeah, I would guess that it does to a great extent -- having more cache compensates for latency quite effectively in many latency-sensitive workflows and games (i.e. a 5800X with tuned 3800CL14 performs roughly on par with a 5800X3D on 3200CL14 in those workflows), so gobs of 3D cache + DDR5 (which, as we know, trades latency for bandwidth) really seems like a smart mix for Zen 4 with early-gen DDR5.


----------



## nicamarvin (Apr 13, 2022)

I would like to point out that AMD's intent with 3D V-Cache is to gain market share in the most profitable segment of computing: to provide generational improvements exceeding 30% (up to 80% in many cases) in HPC. That the 3D-stacked L3 does so well in gaming is just a byproduct of that.


Also, the 5900X3D prototype with 192 MiB had the same performance uplift (15% average) in the same games as the 5800X3D. You just need to realize that gaming is really a niche segment of computing. Unless game developers start coding for 3D V-Cache, it will stay like this. A 5900X3D or 5950X3D would have been great at HPC tasks, but would provide no advantage over the 5800X3D in games.


----------



## Pastuch (Apr 13, 2022)

fevgatos said:


> Two generarions are more than what am4 offers right now.
> 
> Is 566 double of 510? Oh okay then


I can't help but defend AM4, and X570 in particular. I started with a 3600X, then a 5600X, and next is a 5800X3D, all on the same board. It's been the most reliable motherboard I've ever had, and the BIOS updates just keep coming. I'll only buy the 5800X3D once I see Warzone benches; I want 280+ FPS SO bad.


----------



## Xuper (Apr 13, 2022)

This CPU is only good if you already own an AM4 board; otherwise, for a new PC, you'd want ADL or AM5


----------



## lexluthermiester (Apr 13, 2022)

nicamarvin said:


> That the 3D-stacked L3 does so well in gaming is just a byproduct of that.


And? What point are you trying to make?



nicamarvin said:


> You just need to realize that Gaming is really a Niche segment of computing.


That is one of the most silly statements I've read this year. So out of touch with reality.


----------



## Vunnie (Apr 13, 2022)

Don't know why everyone is so positive; $450 for a CPU slower than the 5800X??


----------



## logicisntforyou (Apr 13, 2022)

Why_Me said:


> *$449* for a cpu that doesn't overclock in order to game at 1080P not to mention content creators won't give this cpu a second thought after looking at the rendering benches.  This should be interesting to say the least when the 12700F is going for *$310* and the 12700K/KF is going for *$370* atm.


Why? Most content creators have a 2-3 PC setup for streaming and recording content, and that's besides the fact that most of them don't actually edit their own videos; they have someone do it for them. If they are gaming content creators, all you care about is the frames if you're playing a competitive game.


----------



## Jism (Apr 13, 2022)

lexluthermiester said:


> That's a fair point, but those are EPYC CPUs on a whole different platform and manufacturing system, which are not available to the general consumer. In the consumer/prosumer space, this is AMD's first go and while they have the experience with EPYC to go on, it's still a very different beast.



The 5800X is just a single CCD; the main benefit of that alone is no cross-CCD latency, since it never has to switch to or read/write data from cores on another die.

The extra cache stacked on top (96 MB of L3 in total) would work wonders on any CPU really, but this cache approach has been tested with EPYC before, with good results.

Basically, you've now got a CPU that's clocked lower and draws half the power, yet with 3D V-Cache is able to compete with Intel's highest offerings, which need to run at 5 to 5.5 GHz and consume a truckload of power to do it.

If AMD manages to build a separate voltage rail into its next 3D cache CPU, we'd be able to overclock it. But this CPU is extremely limited in regards to tweaking or overclocking. Still a nice gimmick.


----------



## tussinman (Apr 13, 2022)

David Fallaha said:


> Sure but don’t forget we’re talking about an AMD chip that is a literal drop in upgrade vs an entire new build…


How did I forget, when the quote you quoted literally says "5700x/5800x for current am4 users will offer within 6-9% of the performance for a lot less money"?

The 5800X3D, for example, is 1.5x the price of the 5700X in my country.

I didn't just mention a new build, I mentioned that the other 2 AMD processors are way more realistic values.

It's only being compared to the overpriced 12900, but that completely ignores the obvious alternatives.


----------



## wheresmycar (Apr 13, 2022)

Please fill me in....

So AMD only increased the cache size with the X3D model and dropped 200 MHz off the boost clock to get the current results? I think it's amazing the AM4 socket lives on with yet another jump in performance, although $450 for this newcomer is a bit of a bummer.

Also, why not a 5600X3D at a more affordable rate? Imagine that... a 5600X3D for around $300 for existing AM4 board owners, achieving 12600K-or-better performance... that would have been a treat and something worth shouting about for the wider audience


----------



## W1zzard (Apr 13, 2022)

lexluthermiester said:


> I can see this 3DCache making a big improvement for APUs.


Too expensive I think



nicamarvin said:


> Ansys/OpenFOAM


Is it worth looking into OpenFOAM for future CPU benchmarks, considering I already have Comsol? Are you working in that field? Anything else that a lot of people use that demands significant compute power?


----------



## Aquinus (Apr 13, 2022)

W1zzard said:


> Anything else that a lot of people use that demands significant compute power?


pgbench using a modern version of PostgreSQL would be nice to augment your existing MySQL benchmark, if you're looking for suggestions.


----------



## InVasMani (Apr 13, 2022)

Aquinus said:


> It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.


That's basically a whole CD's worth of data in the L3 cache. Obviously, that amount of data, be it one file or thousands of smaller files, not having to go out to slower system memory adds up to a big latency win.


----------



## MarsM4N (Apr 13, 2022)

Sound_Card said:


> Imagine having a new platform with a new PCIe bus, ram speed, new IPC, etc. AMD just sticks some more cache on top of the CPU while consuming considerably less power and tying your performance in games (if not better with the 1% lows). Just sold my Intel stock.



^^ This.  The 5800X3D is like a body builder in high heels, held back by the (outdated) platform.

It's just to give the audience a sneak peek of what's on the horizon with Zen 4. Bet the Intel boys are sweating beans & readjusting their Q4/22+ earnings expectations.


----------



## Pastuch (Apr 13, 2022)

Vunnie said:


> Don't know why everyone is so positive, 450 dollars for a CPU slower than the 5800X??


Bro, gaming is all that matters. It’s like some of you work for a living. If you’re not sweating you’re not gaming.


----------



## Why_Me (Apr 13, 2022)

logicisntforyou said:


> Why? Most content creators have a 2-3 PC setup for streaming and recording content, and that's beside the fact that most of them don't actually edit their own videos and have someone do it for them. If they are gaming content creators, all you care about are the frames if you're playing any competitive game.


And then there's the content creators who don't have sponsors and didn't win the lottery.


----------



## logicisntforyou (Apr 13, 2022)

Why_Me said:


> And then there's the content creators who don't have sponsors and didn't win the lottery.


I didn't do either of those things and even I have a 2 PC setup for content creation.


----------



## Pastuch (Apr 13, 2022)

Why_Me said:


> And then there's the content creators who don't have sponsors and didn't win the lottery.


Get a real job and then you can use an ultrawide monitor.... I know, I know, streaming doesn't work well with ultrawides.


----------



## harm9963 (Apr 13, 2022)

Testing with an ASUS Dark Hero with DOCP, that would be interesting.


----------



## Gooigi's Ex (Apr 13, 2022)

fevgatos said:


> Two generations are more than what AM4 offers right now.
> 
> Is 566 double of 510? Oh okay then


Right, compared to the 5 generations that AM4 had. Again, it's cute what Intel did, but after Raptor Lake, Intel is gonna be on a new platform, while AMD will still be on AM5 (given their history), so yeah...

Oh, you got me there, har har (even though I never claimed that), and no, they are not at the same efficiency. The fact that it consumes energy roughly around a 12th-gen i3 while matching or besting the HIGHEST END (NOT the 12900K) while using 127W less is progress


----------



## waltran (Apr 14, 2022)

Honestly this puts me in a very hard place. I bought a 5800X 7 months ago to replace my 10-year-old 2500K (at that point I was really waiting for 12th gen so it would be a 10-generation leap on Intel, but 11th gen was so bad I lost faith in Alder Lake and just went with it).

While I was buying the 5800X I was planning to change it to a 5900X or 5950X after Alder Lake got released and DDR5 became a thing, so those CPUs/platforms would get cheaper, but now the 5800X3D has also joined the game and changed the balance, as I use my computer for both gaming and work... the 5800X3D might be a limited production run and never get cheaper too :/
Do you think AMD would make a 5900/50X3D? 96 MB of L3 for each CCD on a possible 5900/50X3D? 192 MB of L3 in total.


----------



## cyberloner (Apr 14, 2022)

waltran said:


> Honestly this puts me in a very hard place. I bought a 5800X 7 months ago to replace my 10-year-old 2500K (at that point I was really waiting for 12th gen so it would be a 10-generation leap on Intel, but 11th gen was so bad I lost faith in Alder Lake and just went with it).
> 
> While I was buying the 5800X I was planning to change it to a 5900X or 5950X after Alder Lake got released and DDR5 became a thing, so those CPUs/platforms would get cheaper, but now the 5800X3D has also joined the game and changed the balance, as I use my computer for both gaming and work... the 5800X3D might be a limited production run and never get cheaper too :/
> Do you think AMD would make a 5900/50X3D? 96 MB of L3 for each CCD on a possible 5900/50X3D? 192 MB of L3 in total.


The 5800X has only one CCD, so it has some space for the extra cache... the 5900X/5950X are dual-CCD, so the package is full... I also have a 5800X; it is not a bad CPU after all


----------



## Mussels (Apr 14, 2022)

GoldenX said:


> A friend is very interested in it, he would go from a 1800X to this. No need to update the X370 board, thanks to the new BIOSes, so it's a solid 5 years upgrade.


That would be an insane upgrade. The jump from 3000 to 5000 was already large, from original zen to X3D? It's like going from a core 2 quad to an i9


----------



## wheresmycar (Apr 14, 2022)

Mussels said:


> That would be an insane upgrade. The jump from 3000 to 5000 was already large, from original zen to X3D? It's like going from a core 2 quad to an i9



I landed 2 spare used AM4 systems a couple of years back, built around Ryzen 5 1000/2000 series.. one of them is a B350 (still waiting for that BIOS update to patch in 5000 series chips). A little gutted the X3D is limited to the 5800X only, as I don't need the extra compute power and am definitely not fond of the asking price. But it has got me wondering whether it's worth swapping out my Intel 9700K platform for a 5800X3D


----------



## theglaze (Apr 14, 2022)

wheresmycar said:


> But it has got me wondering whether it's worth swapping out my Intel 9700K platform for a 5800X3D


I'm in the same boat... Alder Lake runs too hot and inefficient for my taste. While Zen 4 is probably worth the wait, it will likely be scalped for months after launch, there's a risk of teething issues with new chipsets and first-gen mobos, and DDR5 will still be expensive.


----------



## Bwaze (Apr 14, 2022)

I hope there won't be as much scalping; the main fuel for it seemed to be a willingness to pay extremely high prices for GPUs - for mining purposes. Combined with covid bonuses, lots of spare time, and the theory that large amounts of money could be made out of thin air, it spilled into anything remotely associated; it really was a perfect storm. Companies are already lamenting that they won't be able to repeat these results unless a new crypto boom saves them (and dooms us, non-believers)...


----------



## Chomiq (Apr 14, 2022)

Bwaze said:


> I hope there won't be as much scalping; the main fuel for it seemed to be a willingness to pay extremely high prices for GPUs - for mining purposes. Combined with covid bonuses, lots of spare time, and the theory that large amounts of money could be made out of thin air, it spilled into anything remotely associated; it really was a perfect storm. *Companies are already lamenting that they won't be able to repeat these results unless a new crypto boom saves them* (and dooms us, non-believers)...


Where there's a will, there's a way.

New Ryzens aren't really flying off the shelves locally. They aren't even advertised in stores; you have to look to find them when browsing online stores. The 5800X3D isn't even listed for preorder or anything.


----------



## puma99dk| (Apr 14, 2022)

Chomiq said:


> Where there's a will, there's a way.
> 
> New Ryzens aren't really flying off the shelves locally. They aren't even advertised in stores; you have to look to find them when browsing online stores. The 5800X3D isn't even listed for preorder or anything.


Same here, I cannot even find them, not even at Caseking in Germany. I am wondering what AMD's plan is, because it's not even available on AMD's own store yet.


----------



## Chomiq (Apr 14, 2022)

puma99dk| said:


> Same here, I cannot even find them, not even at Caseking in Germany. I am wondering what AMD's plan is, because it's not even available on AMD's own store yet.


5700x is listed and available, but it's only €20 away from 5800x (which had a €50 price drop). If 5800X3D goes for €450 (has AMD even announced EU pricing for this?) then it will match the new RRP for 5900X.


----------



## puma99dk| (Apr 14, 2022)

Chomiq said:


> 5700x is listed and available, but it's only €20 away from 5800x (which had a €50 price drop). If 5800X3D goes for €450 (has AMD even announced EU pricing for this?) then it will match the new RRP for 5900X.


I wouldn't launch it at the new price of the 5900X, because then it won't sell, not even to me; it would be a waste.

I can see even Caseking has a sale on the 5800X, -27%, so it goes down from €489 to €359,17.
Link: https://www.caseking.de/AMD-Ryzen-7...-ohne-Kuehler-HPAM-203.html?flxSmartSuggest=1

But it's really weird that Caseking has the 5900X for €429 not on sale, which means it's cheaper than the 5800X when that's not on sale. What's going on?
Link: https://www.caseking.de/AMD-Ryzen-9...-ohne-Kuehler-HPAM-204.html?flxSmartSuggest=1


----------



## thepath (Apr 14, 2022)

The 12700K has better performance per dollar in productivity and gaming (both low and high resolution).

Also, keep in mind that turning E-cores off is going to improve gaming performance. Alder Lake was not tested at its full potential for gaming.
The 5800X3D is more expensive than the 12700K, so turning E-cores off will not put the 12700K at any disadvantage anyway


----------



## Vunnie (Apr 14, 2022)

Pastuch said:


> Bro, gaming is all that matters. It’s like some of you work for a living. If you’re not sweating you’re not gaming.


So you're paying 450 dollars for a 5800X3D when you can get a 12600KF for 280 with 4.4% slower gaming performance at 1080p?


----------



## Denver (Apr 14, 2022)

Vunnie said:


> So you're paying 450 dollars for a 5800X3D when you can get a 12600KF for 280 with 4.4% slower gaming performance at 1080p?


That will depend on what you play; the difference can be insignificant (5%), but in some titles it exceeds 20%.


----------



## Mussels (Apr 14, 2022)

thepath said:


> The 12700K has better performance per dollar in productivity and gaming (both low and high resolution).
> 
> Also, keep in mind that turning E-cores off is going to improve gaming performance. Alder Lake was not tested at its full potential for gaming.
> The 5800X3D is more expensive than the 12700K, so turning E-cores off will not put the 12700K at any disadvantage anyway





Vunnie said:


> So you're paying 450 dollars for a 5800X3D when you can get a 12600KF for 280 with 4.4% slower gaming performance at 1080p?




To both of you: pricing varies per country, let alone per state in the USA.
And as always... stop looking at just the CPU prices.

You can mix this with an A320 board and 16GB of DDR4 3200 and a 120mm air cooler, vs needing a top end DDR5 board and a 360mm AIO


----------



## fevgatos (Apr 14, 2022)

Mussels said:


> To both of you: pricing varies per country, let alone per state in the USA.
> And as always... stop looking at just the CPU prices.
> 
> You can mix this with an A320 board and 16GB of DDR4 3200 and a 120mm air cooler, vs needing a top end DDR5 board and a 360mm AIO


Really? You need DDR5 and a 360 AIO for gaming on a 12600/12700? REALLY? Come on... 

Should I repost my results from a 12900K on a small single-tower cooler in CBR23? 

Fact is, the 3D is almost 50% more expensive than a 12700F, gets crucified in every workload, and only wins in 240p gaming by single-digit percentages. Assuming you only care about gaming, you can even disable the E-cores to close some of that single-digit difference. 

And no, you don't need DDR5. TPU also tested the difference between DDR5 and DDR4; they actually tested the exact same kit they used on the 5800X3D against the one used on Alder Lake. The difference was 2%. So the total gaming difference between the 12700 and the 5800X3D, both with DDR4, is less than 5%.


----------



## darksf (Apr 14, 2022)

fevgatos said:


> Really? You need ddr5 and a 360 aio for gaming on a 12600 / 12700? REALLY? Come on...
> 
> Should i repost my results on a 12900k on a small single tower cooler in cbr23?
> 
> ...


Like someone will get a 12700F or 5800X3D for "workload". You will either get the big boys, 5950X or Threadripper, or the cheapest Celeron; there is no "workload" in between.


----------



## fevgatos (Apr 14, 2022)

darksf said:


> Like someone will get a 12700F or 5800X3D for "workload". You will either get the big boys, 5950X or Threadripper, or the cheapest Celeron; there is no "workload" in between.


Like someone would get the 5950X for "workload". You either get the 64-core Threadripper or a Pentium 3; there is no workload in between


----------



## darksf (Apr 14, 2022)

fevgatos said:


> Like someone would get the 5950x for "workload", you either get the 64core threadripper or a pentium 3. There is no workload in between


You get a 5950X when you need both physics simulation and then rendering. Threadripper is slow for most physics tools because they care more about core clock than core count. The best case would be an Alder Lake machine for the physics and a Threadripper for the rendering, but then the budget skyrockets.


----------



## gffermari (Apr 14, 2022)

AMD's Ryzen 7 5800X3D Overclocked to 4.8GHz Using BCLK | Hardware Times

AMD's Ryzen 7 5800X3D is slated to launch next week with an MSRP of $449. Armed with 64MB of 3D stacked V-Cache (L3), it is expected to deliver leadership gaming performance on a budget. As per early benchmarks, it is roughly 10% faster than the Ryzen 7 5800X and on par with the Core i9-12900K…

www.hardwaretimes.com

A bench at the above clocks would be interesting...


----------



## nicamarvin (Apr 14, 2022)

New Benchmarks..

AMD Ryzen 7 5800X3D Review: Gaming-First CPU

Making CPU cores faster rather than adding more cores is the best way to boost PC gaming performance. That's why AMD has supercharged their 8-core, 16-thread CPU...

www.techspot.com


----------



## Sound_Card (Apr 14, 2022)

Vunnie said:


> So you're paying 450 dollars for a 5800X3D when you can get a 12600KF for 280 with 4.4% slower gaming performance at 1080p?



The price of Intel CPUs is built into the chipset itself. If you look at motherboard prices, they are pretty insane. A clever trick of marketing. Also, a good number of games have a pretty large disparity. It makes me really wonder about a Zen 4 3D-stacked part in the works, because I would be all over that. Imagine a 20% IPC gain and an additional 15% game performance gain with a cache-edition CPU, marketed as a Gamer Edition (kinda like Black Edition), where the people buying those CPUs don't care about Blender. I was really hoping someone would test this on StarCraft 2, since it's single-thread limited; I was curious if the cache would boost the performance.


----------



## Denver (Apr 14, 2022)

nicamarvin said:


> New Benchmarks..
> 
> 
> 
> ...


The difference is surprising in some titles; no one can deny it


----------



## efikkan (Apr 14, 2022)

Jism said:


> The 5800X is just a single CCD; the main benefit from this alone would be no latency at all, since it never has to switch or read/write data from other cores.
> 
> The 96MB of additional cache added on top would work wonders on any CPU really, but this cache experiment has been tested with EPYC before with good results.


No latency at all? There is quite significant latency.
If this design worked the way you expected, then we should see the 5800X3D pull ahead of the 5800X in most heavy multithreaded workloads. Instead we see the opposite; the extra L3 proves useful mainly in gaming and a few select workloads. And the reason for this will be obvious to those who know how CPUs and caches work: the caches are overwritten every few microseconds (if not faster), so if a core were to benefit from another core recently using a cache line, the window is extremely small and the chance of a cache hit in L3 from another core is very small, at least for data cache lines. With instruction cache lines the chances are a bit larger, but it's still more likely that the same core just evicted that cache line from L2 than that another core executed the same code moments apart. 

This is why we only see select workloads benefit from the extra L3. And as anyone who understands caches knows, just throwing more cache at it will not suddenly make the gains significant to "every" workload; in fact, adding more cache has diminishing returns.
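The working-set effect is easy to sketch with a toy microbenchmark (Python, illustrative only; interpreter overhead masks part of the gap and absolute timings vary per machine):

```python
from array import array
import random
import time

def random_access_time(n_bytes, accesses=1_000_000):
    """Time random reads over a working set of roughly n_bytes."""
    n = n_bytes // 8
    data = array("q", range(n))  # flat buffer, 8 bytes per element
    # Pre-generate indices so the timed loop does only the accesses.
    idx = [random.randrange(n) for _ in range(accesses)]
    total = 0
    t0 = time.perf_counter()
    for i in idx:
        total += data[i]
    return time.perf_counter() - t0

small = random_access_time(1 << 20)   # ~1 MB working set: mostly cache hits
large = random_access_time(64 << 20)  # ~64 MB working set: spills past L3
```

On a typical machine the large run is measurably slower per access, which is exactly why keeping the hot working set on-die in a bigger L3 pays off for the workloads whose data actually fits.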

And don't get me wrong, 5800X3D looks like a good gaming CPU.


----------



## fevgatos (Apr 14, 2022)

Denver said:


> The difference is surprising in some titles, no one can deny it


Yep, surprising both ways. It's so far behind in Cyberpunk and The Riftbreaker


----------



## Sound_Card (Apr 14, 2022)

fevgatos said:


> Yep, surprising both ways. It's so far behind in Cyberpunk and The Riftbreaker



It significantly improved over the 5950X in those titles, however, majorly closing the gap with the Intels. And in other titles, it's just beating Intel, period. Then you have cost and power to factor in. I suspect a review that covers 30 games or more will be more telling.


----------



## btk2k2 (Apr 14, 2022)

The factorio result is good. 26% faster than the 12900KS.


----------



## Dr. Dro (Apr 14, 2022)

nicamarvin said:


> New Benchmarks..



Bear in mind that Far Cry and Borderlands are the games that benefit most aggressively from the 3D cache, making this an insanely cherry-picked result, as is the Factorio one.

Decisive losses throughout the entire productivity suite and general experiences aren't something to be set aside too lightly, IMHO, but the X3D has achieved the goal of bringing it to parity with Alder Lake in gaming - it doesn't necessarily beat it, however, nor is the gap significant against it or the standard Zen 3 chips. I like this CPU, it will do great for gamers (though anyone who bought a Ryzen 9 or Alder Lake absolutely does not need to lose sleep over it), but I really question its $450 asking price.

I'm seeing the TPU testing methodology being called into question and how many seem unwilling to take the site seriously, but I fail to see where TPU's findings conflict with other media. The use of DDR4-3200 memory isn't a big deal, as reflected in Hardware Unboxed's testing, and imho it helps to see how the processor would behave in a medium-budget gaming system. I've also seen the use of an RTX 3080 10GB being questioned - sure, it's no RTX 3090 Ti or 6900 XT, but at best this would make the 720p scores a little less representative of the processor's true capabilities (since GA102 scales poorly to low resolutions) - maybe do that one pass using the RX 6900 XT as an addendum? People will never be happy anyway.


----------



## fevgatos (Apr 14, 2022)

Sound_Card said:


> Then you have cost and power to factor in. I suspect in a review that covers 30 games or more, will be more telling.


Yeah, cost is the major issue. The 12700F is practically as fast at a fraction of the cost, while absolutely crucifying the 3D in everything non-gaming. Who would pay 450 for the 3D? 350 is already pushing it


----------



## Xuper (Apr 14, 2022)

Well, this CPU is a last gift to AM4. It was a showcase to celebrate the AM4 platform.

Farewell AM4 !


----------



## iO (Apr 14, 2022)

Power consumption numbers are impressive!
The 12900K needs 80% more power, the 12900KS even 260% more...
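For anyone reading those percentages: "X% more" is just relative power, (other - base) / base. A quick sketch with hypothetical placeholder wattages (not the review's measured figures):

```python
def percent_more(base_w, other_w):
    """How much more power other_w draws than base_w, in percent."""
    return (other_w - base_w) / base_w * 100

# Hypothetical placeholder figures, for illustration only:
# a 135 W chip vs a 75 W baseline is 80% more,
# a 270 W chip vs the same baseline is 260% more.
eighty = percent_more(75, 135)
two_sixty = percent_more(75, 270)
```

So "260% more" means 3.6x the baseline draw, not 2.6x; that's the easy misreading.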


----------



## HD64G (Apr 14, 2022)

Most people already have, or are able to get, an inexpensive AM4 board and RAM, and with that CPU will have the best gaming experience to date, even when the next-gen GPUs arrive. So, what's not to like about this CPU?


----------



## gffermari (Apr 14, 2022)

fevgatos said:


> Yeah, cost is the major issue. The 12700F is practically as fast at a fraction of the cost, while absolutely crucifying the 3D in everything non-gaming. Who would pay 450 for the 3D? 350 is already pushing it



Don't mix productivity performance with gaming performance.
Obviously, someone who builds a PC from scratch now is better off buying the Intel platform, although it costs ridiculously more.
(Or wait for AM5 - the sensible person will invest in DDR5/PCIe 5.0 and even get a 12400 in order to have an upgrade path.)

But for the majority who already own an X370/X470/X570/B350/B450/B550 - and most active gamers are on AMD now - this is the absolute CPU. No one will replace the whole PC to get the value of a 12700K. It's not meant to lure that group of people.

The CPU is for those who seek absolute gaming performance without having to replace the whole platform.
And yes, it's meaningless to talk about value when we talk about absolute performance, even in a specific target group.
No reasonable person will choose the 3D over a 5900X/5950X. Even I, who waited for this CPU for so long (I'm on a 3700X now), may end up with a 5900X/5950X.

It's embarrassing for Intel, having released the KS that needs a nuclear power plant to work, to brag about the crown of absolute gaming performance. But the KS is the king.
Also, when we talk about absolute performance, there is no place for value or power consumption in the discussion. The 3D makes the KS look like a dinosaur (and it is), but the latter has got the Crown, no matter if it costs twice the price.


----------



## Яid!culousOwO (Apr 14, 2022)

I've got a question... Since they had to shrink the thickness of the original Zen 3 silicon to make room for the 3D V-Cache silicon, why didn't they flip it upside down, i.e. put the V-Cache at the bottom, close to the substrate, and the Zen 3 die on top, close to the IHS? I think it would help greatly with thermal performance... Well, maybe it's too difficult for the cores to connect with the pins underneath... They know it way better than me, of course.
And at the same time I wish they would apply the same method to the new Zen 4 to cut it thin, just like Intel did on the 10th-gen Core. Looking forward to Zen 4.
By the way, I have long said that E-cores can't avoid scheduling problems, which is why I never liked the idea of a hybrid architecture. It's hilarious when they call them E-cores yet the chip still draws 250 W or so.


----------



## Dr. Dro (Apr 14, 2022)

iO said:


> Power consumtion numbers are impressive!
> 12900K needs 80% more power, 12900KS even 260% more...



The same power consumption numbers are attainable by running a regular Ryzen with the same 1.35-volt ceiling. XFR (and by extension, PBO) throws efficiency out of the window in bursts in order to achieve the performance target; the biggest achievement here is how AMD managed to do this without incurring a significant power consumption penalty. Also consider that Alder Lake has twice the cores, and the efficiency is not all that different when spread across all available execution units.



RidiculousOwO said:


> I've got a question... Since they had to shrink the thickness of the original Zen 3 silicon to make room for the 3D V-Cache silicon, why didn't they flip it upside down, i.e. put the V-Cache at the bottom, close to the substrate, and the Zen 3 die on top, close to the IHS? I think it would help greatly with thermal performance... Well, maybe it's too difficult for the cores to connect with the pins underneath... They know it way better than me, of course.
> And at the same time I wish they would apply the same method to the new Zen 4 to cut it thin, just like Intel did on the 10th-gen Core. Looking forward to Zen 4.
> By the way, I have long said that E-cores can't avoid scheduling problems, which is why I never liked the idea of a hybrid architecture. It's hilarious when they call them E-cores yet the chip still draws 250 W or so.



Alder Lake gets around your claimed scheduling problems using a hardware thread scheduler called Thread Director. The processor knows exactly where to place data before it even reaches the cores themselves, the downside to this is that it requires Windows 11 or a modern Linux kernel that can understand how the ITD works.

As for the positioning, you'll find that the through-silicon vias for the 3D cache were already present on the original Zen 3 design. I would guess AMD just did not have the packaging technology ready at the time, at least not with an acceptable cost anyhow.


----------



## Sound_Card (Apr 14, 2022)

fevgatos said:


> Yeah, cost is the major issue. The 12700f is practically as fast at a fraction of the cost, while absolutely crucifying the 3d in everything non gaming. Who would pay 450 for the 3d? 350 is already pushing it


Like I said in another post, the cost of the Intel CPU is partitioned off into the chipset itself, never mind the cost of DDR5. Using DDR4 with ADL absolutely brings down its performance.


----------



## ArcanisGK507 (Apr 14, 2022)

It seems stupid to me that they have not yet realized how dependent games are on cache memory... every time there is a jump in cache capacity and speed, it has a bigger effect on a processor's performance than any improvement in IPC... for end users, multitasking and gaming currently matter more than compressing a 100GB file...

Honestly, synthetic tests do not measure reality.


----------



## InVasMani (Apr 14, 2022)

fevgatos said:


> Yep, surprising both ways. It's so far behind in Cyberpunk and The Riftbreaker


It's a meager 3.2 frames behind in Cyberpunk at 720p, and less at higher resolutions, while the Intel chip draws over 3x the average/max wattage. It also costs significantly less.


----------



## fevgatos (Apr 14, 2022)

Sound_Card said:


> Like I said in another post, the cost of the Intel CPU is partitioned off into the chipset itself, never mind the cost of DDR5. Using DDR4 with ADL absolutely brings down its performance.


That is factually wrong. The chipset price (the money Intel is charging) went up by a whopping... €1 over the last couple of years. LOL


----------



## Яid!culousOwO (Apr 14, 2022)

Dr. Dro said:


> Alder Lake gets around your claimed scheduling problems using a hardware thread scheduler called Thread Director. The processor knows exactly where to place data before it even reaches the cores themselves, the downside to this is that it requires Windows 11 or a modern Linux kernel that can understand how the ITD works.


Yeah, I know there's a hardware-level scheduler. But just as the review has shown, there's always going to be some software that gets mis-scheduled, waiting to be corrected by a human. That's the problem and that's my point. With all cores being both powerful and efficient, like Zen 3, there wouldn't be this trouble...


----------



## Cutechri (Apr 14, 2022)

I am impressed. More CPUs like this, AMD, and fewer CPUs like the R5 4500.


----------



## tussinman (Apr 14, 2022)

Sound_Card said:


> Like I said in another post, the cost of the Intel CPU is partitioned off into the chipset itself, never mind the cost of DDR5. Using DDR4 with ADL absolutely brings down its performance.


The benchmarks I've seen have mostly shown that low-latency DDR4 is equal if not faster in gaming? (That's the controversy with DDR5; right now it's not really worth the cost in most consumer applications.)


----------



## Pastuch (Apr 14, 2022)

If anyone finds a benchmark of the 5800x3d in Warzone please share. I'm scouring the internet and the reviewers keep using the same games.


----------



## z1n0x (Apr 14, 2022)

iO said:


> Power consumption numbers are impressive!
> The 12900K needs 80% more power, the 12900KS even 260% more...
> 
> View attachment 243594


A pretty illustrative example of the effect of data locality on performance and power consumption. Trips to memory are expensive.


----------



## Shatun_Bear (Apr 14, 2022)

nicamarvin said:


> New Benchmarks..
> 
> 
> 
> ...



A new gaming king is here. Intel wasn't at the top for very long, even with their 280W, overclocked-to-the-absolute-limit special edition behemoth


----------



## Deleted member 24505 (Apr 14, 2022)

Guessing there will be a fair few AMD CPUs on the second-hand market very soon. Good luck selling them though. Make sure you have secured your X3D first though


----------



## fevgatos (Apr 14, 2022)

Shatun_Bear said:


> A new gaming king is here. Intel wasn't at the top for very long, even with their 280W, overclocked-to-the-absolute-limit special edition behemoth


What? The TechSpot review shows the Alder Lakes on top, lol


----------



## nicamarvin (Apr 14, 2022)

fevgatos said:


> What? The TechSpot review shows the Alder Lakes on top, lol



2% at 1080p with $500 DDR5 RAM... Yeah, good luck getting DDR5-6400


----------



## fevgatos (Apr 14, 2022)

nicamarvin said:


> 2% at 1080p with $500 DDR5 RAM... Yeah, good luck getting DDR5-6400
> 
> View attachment 243609


And that's the difference a 12700F has compared to the 5800X3D. Only it absolutely crucifies it in everything else. And it costs 320€, lol

PS: You don't need to get 6400c32. Any Hynix kit clocks to 6800-7000c30. Just sayin'..


----------



## Deleted member 24505 (Apr 14, 2022)

fevgatos said:


> And that's the difference a 12700F has compared to the 5800X3D. Only it absolutely crucifies it in everything else. And it costs 320€, lol
> 
> PS: You don't need to get 6400c32. Any Hynix kit clocks to 6800-7000c30. Just sayin'..



This forum is blind to anything Intel now. It's like talking to a brick wall.


----------



## Shatun_Bear (Apr 14, 2022)

fevgatos said:


> And that's the difference a 12700F has compared to the 5800X3D. Only it absolutely crucifies it in everything else. And it costs 320€, lol
> 
> PS: You don't need to get 6400c32. Any Hynix kit clocks to 6800-7000c30. Just sayin'..



The only thing the 12700K crucifies is power consumption, it draws significantly more power than the 5800X3D multithreaded:

And it's still slower in gaming. This is not a good place for Intel to be; slower in gaming while drawing significantly more power!


----------



## Deleted member 24505 (Apr 14, 2022)

Shatun_Bear said:


> The only thing the 12700K crucifies is power consumption, it draws significantly more power than the 5800X3D multithreaded:
> 
> 
> 
> ...



But better at everything else, ST/MT. Keep your X3D; I don't spend most of my time gaming anyway. If you do, then you are onto a winner with the X3D.


----------



## nicamarvin (Apr 14, 2022)

fevgatos said:


> And that's the difference that a 12700f has compared to the 5800x 3d. Only it absolutely crucifies it in everything else. And it costs 320€, lol
> 
> PS1. You don't need to get 6400c32. Any hynix kit clocks to 6800-7000c30. Just sayin..


I mean, any recent CPU from the last two years that is above $200, paired with a 3090 Ti, will come within single-digit performance (look at the top 5 CPUs, choose the lowest price), so your point is meaningless.



Tigger said:


> This forum is blind to anything Intel now. It's like talking to a brick wall.


We are in the presence of an engineering marvel of a CPU and many of you find a way to criticize it?


----------



## Deleted member 202104 (Apr 14, 2022)

Shatun_Bear said:


> The only thing the 12700K crucifies is power consumption, it draws significantly more power than the 5800X3D multithreaded:
> 
> 
> 
> ...



And for those extra 55 watts, it scores 7544 points higher.

I know thinking hurts, but no pain, no gain.


----------



## Deleted member 24505 (Apr 14, 2022)

nicamarvin said:


> I mean any recent CPU within the last two years that is above $200 paired with a 3090 Ti will come within single-digit performance(look at the Top 5 CPUs, Choose the lowest price ) so your point is meaningless.
> 
> 
> We are in the presence of an engineering marvel of a CPU and many of you find a way to criticize it?



You mean the same way every thread that mentions it criticizes ADL? Get back to your worship


----------



## neatfeatguy (Apr 14, 2022)

If you're strictly looking for a PC for gaming, the 5800X3D might be the way to go. If you have other uses for your computer and want something better that is still solid for gaming, almost any other current CPU will handle practical applications as well as the 5800X3D or better and still provide solid gaming performance.

In my situation I have days where I'm putting a lot of stuff on my Plex server and the speed and effectiveness of the 5900x is awesome for my needs. Otherwise I use it for gaming. I'd rather have a 5900x that excels at encoding work by upwards of 40% over the 5800X/X3D (I'd even be happy with a 12700K that's right there with the 5900X at encoding) and is a solid CPU for gaming over just having the top CPU for gaming - be it the 5800X3D or 12900KS.

I find it hilarious that so many people are crying "Intel best! AMD best!" and how they argue back and forth about a minuscule thing such as "gaming king".

I don't care what side holds the "gaming king" crown. What I can admit is that I'm impressed with the 5800X3D: it comes in lower clocked than the 5800X, yet in gaming it surpasses what the 5800X can do and comes within spitting distance of what the 12900KS does (sometimes surpassing it). It's a cool feat to see from AMD. Anyone unwilling to see this and admit it is a fool or a moron.

I for one like to take the reviews done by W1zzard and other sites such as Techspot and use that information to base my opinion on what would be best for my needs. Everyone else should do that, too, and stop trying to convince the other side of the spectrum that your way is right and they're wrong. It's hard to talk someone down when they don't want to be talked down.

In the end:
Strictly gaming and don't do much else? Go for that 5800X3D build if you want to save some money over a 12900K build.
Lots of media encoding? Go for the 12700K or 12900K from Intel, or the 5900X or 5950X from AMD.
Maybe you want the bragging rights for overall fastest and/or most expensive? The 12900KS is what you're looking for.
You don't plan on gaming? Then who cares what CPU you pick; find one that meets your needs.

Build your computer for your needs, enjoy what it does and enjoy the new hardware that keeps coming out in the reviews.


----------



## nicamarvin (Apr 14, 2022)

Tigger said:


> You mean the same way every thread that mentions it criticizes ADL? Get back to your worship


But it has earned its criticism (at least the 12900K/KS). It has been pushed beyond its efficiency curve to match/beat Zen 3 at MT and gaming.


----------



## Deleted member 24505 (Apr 14, 2022)

If you already have a pretty good PC, spending $400+ for the 20% gaming uplift the X3D offers is dumb. If you are building new or have a potato CPU then fine, otherwise why bother. So you can preen yourself on a forum, yeah, that's it.


----------



## z1n0x (Apr 14, 2022)

Shatun_Bear said:


> The only thing the 12700K crucifies is power consumption, it draws significantly more power than the 5800X3D multithreaded
> And its still slower in gaming. This is not a good place for Intel to be; slower in gaming while drawing significantly more power!


Haven't you learned by now? Efficiency only matters when AMD is behind Intel/Nvidia, otherwise it does not.


----------



## Deleted member 24505 (Apr 14, 2022)

nicamarvin said:


> But it has earned its criticism(at least the 12900K/KS) It has been Pushed Beyond its efficiency curb to match/beat Zen3 at MT and Gaming.



It's only inefficient when running R23 maxed out, or haven't you been reading this forum for the past months... oh, 20 posts, yeah, no you haven't. Move along.


----------



## nicamarvin (Apr 14, 2022)

Tigger said:


> If you already have a pretty good PC, spending $400+ for the 20% better the X3D is for gaming is dumb. If you are building new or have a potato CPU then fine, otherwise why bother. So you can preen yourself on a forum, yeah that's it.


Actually, I am upgrading from a stock 1700 (a dud at OC) to a 5800X on an X370 MB with a 6800 GPU. I think I should see a substantial boost in ST/MT and in gaming.


----------



## fevgatos (Apr 14, 2022)

Shatun_Bear said:


> The only thing the 12700K crucifies is power consumption, it draws significantly more power than the 5800X3D multithreaded:
> 
> 
> 
> ...


You are putting up a graph of power consumption in Cinebench and then talking about gaming. Are you daft, dude? You realize that according to the graph you just posted, the 12700 is more efficient than the 5800X3D, right? Lol, way to shoot yourself in the foot, buddy.


----------



## Deleted member 24505 (Apr 14, 2022)

nicamarvin said:


> Actually I am upgrading from a stock 1700(dud at OC) to the 5800X x370 MB and a 6800 GPU. I think I should see a substantial boost in ST/MT and in Gaming



Indeed you will, and that's exactly the reason to buy a 5800X/X3D.


----------



## Shatun_Bear (Apr 14, 2022)

Tigger said:


> But better for everything else. ST/MT keep your X3D i don't spend most of my time gaming anyway, If you do then you are onto a winner with the X3D.



Not exactly. 12700K:

Worse power draw.
Worse heat output.
Worse total platform cost.
Worse gaming performance.
Better productivity performance.
Better platform.



weekendgeek said:


> And for those extra 55 watts, it scores 7544 points higher.
> 
> I know thinking hurts, but no pain, no gain.



OK, sure, if you play CPU benchmarks you're onto a winner with the Intel, pardon me.


----------



## fevgatos (Apr 14, 2022)

z1n0x said:


> Haven't you learned by now? Efficiency only matters when AMD is behind Intel/Nvidia, otherwise it does not.


But the 3D IS behind in the graph he posted. The 12700 scores 50% higher in Cinebench for 30% more watts. Hello, do you know what efficiency is?



Shatun_Bear said:


> Not exactly. 12700K:
> 
> Worse power draw.
> Worse heat output.
> ...


What worse power draw, dude? Your own freaking graph shows that the 12700 is more efficient!
The 12700F costs 320 euros, crucifies the 3D in everything, on a better/newer platform with more features and more upgradability, while it only loses in 240p gaming by 5%. And for that 5% you have to pay 450 euros. That's how much the 12700F paired with a mobo costs. LOL?
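The efficiency arithmetic being argued over here is simple to sketch. The inputs below are the thread's rounded claims ("50% higher score for 30% more watts"), normalized to an arbitrary baseline, not actual review numbers:

```python
# Hedged sketch: points-per-watt comparison using the rounded
# percentages quoted in the thread (illustrative, not review data).
def points_per_watt(score: float, watts: float) -> float:
    return score / watts

# Normalize the 5800X3D to 100 points at 100 W.
x3d_eff = points_per_watt(100, 100)

# "Scores 50% higher for 30% more watts" -> 150 points at 130 W.
rival_eff = points_per_watt(150, 130)

print(round(x3d_eff, 3))    # baseline efficiency
print(round(rival_eff, 3))  # higher points-per-watt despite the bigger draw
```

In other words, a chip can draw more total power and still be the more efficient one in that workload; the ratio is what decides it.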


----------



## Deleted member 24505 (Apr 14, 2022)

Shatun_Bear said:


> Worst gaming performance.



But better than most other CPUs at everything else. And the X3D is a one-off, last-ditch AM4 part just for them to get the gaming crown back.


----------



## Deleted member 202104 (Apr 14, 2022)

Shatun_Bear said:


> Not exactly. 12700K:
> 
> Worse power draw.
> Worse heat output.
> ...


You showed the power consumption from CB23.  You're playing benchmarks, not me.


----------



## fevgatos (Apr 14, 2022)

weekendgeek said:


> You showed the power consumption from CB23.  You're playing benchmarks, not me.


That's not the funny part. The funny part is he showed a graph where the 5800X3D is highly inefficient and then self-proclaims it the efficiency king. Like, wtf is going on with posters' IQ nowadays?


----------



## z1n0x (Apr 14, 2022)

fevgatos said:


> But the 3d IS behind in the graph he posted. It scores 50% on cinebench for 30% more watts. Hello, do you know what efficiency is?


It seems you missed post #293. No further discussion is needed.


----------



## gffermari (Apr 14, 2022)

To sum up:
For absolute gaming performance, the 5800X3D or 12900KS. The budget is the limit - both CPUs are 100% the best in gaming, but it's meaningless to buy either of them for anything else. The 5800X3D is mediocre at productivity, while the 12900KS is ridiculously more expensive than the 12900K.
For gaming and productivity compromised even slightly... all the other CPUs.

The 12700K and 5900X are the best all-around high-end CPUs.


----------



## fevgatos (Apr 14, 2022)

z1n0x said:


> It seems you missed post #293. No further discussion is needed.


I don't care about some other post. I'm talking about the one he showed. Do we agree that in that one the 3D is highly inefficient compared to the 12700, and therefore the poster was absolutely wrong? Answer that and then I'll check post 293, deal?



gffermari said:


> To sum up:
> For absolute gaming performance 5800X3D or 12900KS. The budget is the limit - both cpus are 100% the best in gaming but it's meaningless to buy each of them for anything else. The 5800X3D is mediocre at productivity while the 12900KS is ridiculously more expensive against the 12900K.
> For compromised, even slightly, gaming and productivity....all the other cpus.
> 
> The 12700K and 5900X are the best all around high end cpus.


From my point of view, you either go for the crown, which means a 12900K/KS with a Hynix DDR5 kit, or you go for value, which is a 12700F with a B660. The combo costs as much as the 5800X3D alone and loses 5% FPS in 240p gaming with a 3090 Ti, while it absolutely slaughters the 3D in everything else.

The 3D needs a huge price drop; with the current pricing only sworn AMD fanboys will buy it


----------



## z1n0x (Apr 14, 2022)

fevgatos said:


> I dont care about some other post. Im talking about that one he showed. We agree that in that one the 3d is highly inefficient compared to the 12700 and therefore the poster was absolutely wrong? Answer that and then ill check post 293,deal?


Here is the "deal". Source: Computerbase


----------



## Deleted member 24505 (Apr 14, 2022)

fevgatos said:


> only swore amd fanboys will buy it



They will in droves


----------



## fevgatos (Apr 14, 2022)

z1n0x said:


> Here is the "deal". Source: Computerbase
> View attachment 243616


If you are not going to answer my question, I'm not addressing yours. That was the deal. I know it pains you to admit the 3D was inefficient as fuck in that graph, but hey, what you gonna do


----------



## gffermari (Apr 14, 2022)

AMD Ryzen 7 5800X3D CPU Reaches Almost 4.9 GHz Overclock on MSI's X570 GODLIKE Motherboard

AMD's Ryzen 7 5800X3D 3D V-Cache CPU has been overclocked to almost 4.9 GHz on MSI's flagship MEG X570 GODLIKE motherboard.

wccftech.com


----------



## mb194dc (Apr 14, 2022)

Your use case matters. No need for fanboy wars. It depends on whether you're already on AM4 and mainly what you use it for!

In my case, I've only got a 60 Hz screen at 1080p, so it's probably irrelevant what CPU I have...

I still have a B350 board on Zen 1, and now that ASUS has released a BIOS for Zen 3, I will bother to upgrade eventually, when it's a bit cheaper.

Probably not to the 5800X3D. Maybe a 5900 if cheap enough.


----------



## Deleted member 202104 (Apr 14, 2022)

Why wouldn't somebody who has an older CPU on the AM4 platform just grab a 5700X for $299? PBO/CO that MoFo and call it a day.

You could even take the $150 saved over the X3D and grab a good B550 board and get PCIe 4. I mean, if you need the fastest gaming CPU, what the F are you doing on an older board?

Did the 5800X suck that bad as a gaming CPU that we need this? (I know the answer is that it didn't)


----------



## nicamarvin (Apr 14, 2022)

gffermari said:


> AMD Ryzen 7 5800X3D CPU Reaches Almost 4.9 GHz Overclock on MSI's X570 GODLIKE Motherboard
> 
> 
> AMD's Ryzen 7 5800X3D 3D V-Cache CPU has been overclocked to almost 4.9 GHz on MSI's flagship MEG X570 GODLIKE motherboard.
> ...



I pulled the best screenshots from the video. Impressive performance (mostly in productivity).

Stock performance and Prime testing with AVX enabled.

OC results, also with Prime AVX


----------



## grammar_phreak (Apr 14, 2022)

Doesn't the CryptoNight algorithm mine well with CPUs that have a ton of cache?
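For context on why cache matters here: classic CryptoNight hashes against a roughly 2 MiB per-thread scratchpad, so a thread whose scratchpad fits entirely in L3 avoids most trips to DRAM. A rough back-of-the-envelope sketch (the 96 MB L3 figure is the 5800X3D's advertised total, the scratchpad size is the classic CryptoNight figure; treat both as assumptions):

```python
# Rough sketch: how many ~2 MiB CryptoNight scratchpads fit in L3.
# Cache sizes per AMD's advertised specs; scratchpad size is the
# classic CryptoNight figure. Both treated as assumptions here.
MIB = 1024 * 1024
scratchpad = 2 * MIB        # per mining thread
l3_5800x3d = 96 * MIB       # 32 MB native + 64 MB V-Cache
l3_5800x = 32 * MIB         # regular 5800X for comparison

print(l3_5800x3d // scratchpad)  # scratchpads resident in cache on the X3D
print(l3_5800x // scratchpad)    # same count on the regular 5800X
```

Whether that translates into real hashrate depends on the specific algorithm variant and memory latency, so this is only an upper-bound intuition, not a mining benchmark.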


----------



## QuietBob (Apr 14, 2022)

This chip has excellent perf/watt in games. It seems to be more efficient than anything AMD or Intel currently have on offer:

Source


----------



## fevgatos (Apr 14, 2022)

QuietBob said:


> This chip has excellent perf/watt in games. It seems to be more efficient than anything AMD or Intel currently have on offer:
> 
> View attachment 243624
> 
> Source


It is impressive indeed, but keep in mind they used DDR5-4400 on Intel. That's literally the worst DDR5 kit you can buy. Imagine the pitchforks if they had used 2133c19 for the 5800X3D.


----------



## Mats (Apr 14, 2022)

fevgatos said:


> It is impressive indeed, but keep in mind they used 4400 ddr5 on intel. That's literally the worst ddr5 kit you can buy. Imagine the pitchforks if they used 2133c19 for the 5800x3d


On the other hand, that pic shows power consumption, not efficiency, so better RAM wouldn't have helped there.


----------



## nicamarvin (Apr 14, 2022)

fevgatos said:


> It is impressive indeed, but keep in mind they used 4400 ddr5 on intel. That's literally the worst ddr5 kit you can buy. Imagine the pitchforks if they used 2133c19 for the 5800x3d


On the 12900K/KS, DDR5 has worse power efficiency than DDR4


----------



## Mats (Apr 14, 2022)

There will be pitchforks either way. Use the same DDR4 for both and they won't get the best out of Alder Lake. Use fast DDR5 and it will cost much more.

But going for slow DDR5 is the worst alternative, lol.


----------



## fevgatos (Apr 14, 2022)

Mats said:


> On the other hand, that pic shows power consumption, not efficiency, so better RAM wouldn't have helped there.


Well power draw is irrelevant, efficiency is what matters, so in that sense faster ram would help in the efficiency department


----------



## Mats (Apr 14, 2022)

That pic hurts my eyes.. starting at 90 %, way to go.


----------



## fevgatos (Apr 14, 2022)

Mats said:


> There will be pitchforks either way. Use same DDR4 for both and they won't get the best out of Alder. Use fast DDR5 and it will cost much more.
> 
> But going for slow DDR5 is the worst alternative lol.


Yeah, they basically used both the slowest and the most expensive RAM, lol


----------



## Mats (Apr 14, 2022)

fevgatos said:


> Well power draw is irrelevant, efficiency is what matters, so in that sense faster ram would help in the efficiency department


I wasn't the one who posted the pic, just sayin'.


----------



## Icon Charlie (Apr 14, 2022)

weekendgeek said:


> Why wouldn't somebody who has an older CPU on the AM4 platform just grab a 5700x for $299?  PBO/CO that MoFo and call it a day.
> 
> You could even take the $150 saved over the X3D and grab a good B550 board and get PCIE4.  I mean if you need the fastest gaming CPU, what the F are you doing on a older board?
> 
> Did the 5800x suck that bad as a gaming CPU that we need this? (I know the answer is that it didn't)


Yea, I have to agree with this. This was suspect to me when the hype train came hard about 6 months ago. The performance over the 5800X was overall not earth-shattering in gaming, while it got gimped in production. The 5800X got nailed by the 5600 at launch, as it was far cheaper to go with the 5600 for gaming than to buy a 5800X. The 5600 sold extremely well, so I can see AMD making this a gimmick CPU.

More and more I believe this was done to move flawed product that did not make the grade, as well as overstock they have on hand.

For myself, it is not worth the price they are asking. Save your money and look at other options.


----------



## Deleted member 24505 (Apr 15, 2022)

Icon Charlie said:


> Yea I have to agree with this.   This was suspect to me when the hype train came hard about 6 months ago. The performance over the 5800 was overall not earth shattering in gaming while it got gimped doing production.    The 5800X got nailed by the 5600  during launch as it was far cheaper to go the 5600 in gaming than buying a 5800X  The 5600 sold extremely well so I can see AMD making this a gimmick CPU.
> 
> More and more I believe that this was done to remove flawed product that did not pass the grade as well as overstock they have on hand.
> 
> For myself it is not worth the price that they are asking for. Save your money and look at other options.



It was good enough to get all the AMD fans slavering and jabbering again about how crappy ADL is.


----------



## Mats (Apr 15, 2022)

I can't wait for Raphael to show up, so we can hopefully leave the apples-to-oranges choice of RAM behind (for now).


----------



## Dr. Dro (Apr 15, 2022)

Shatun_Bear said:


> The only thing the 12700K crucifies is power consumption, it draws significantly more power than the 5800X3D multithreaded:
> 
> 
> 
> ...



I mean, my 5950X also doesn't draw more than 170 W or so. My motherboard won't let it. I'm not entirely sure how this validates your argument - the 5800X3D at 166 W for 8/16 is actually barely half of the 5950X's efficiency at 179 W, if you want to nitpick that way. Which means the 12900KS remains more efficient if you do the whole wildly inaccurate thing of simply doing cores*2 math (remember that ADL has both high-performance and efficient cores, which have differing levels of power consumption). Apples to oranges, imho.



QuietBob said:


> This chip has excellent perf/watt in games. It seems to be more efficient than anything AMD or Intel currently have on offer:
> 
> Source



It's running at the lowest voltage and clock speeds of the bunch, so it's no wonder, really. I'm not entirely sure how this is being spun as an achievement, given that all power-efficient Ryzen SKUs have been doing exactly that thus far - it's less about it being insanely efficient, and more about its default settings not bursting to insanely high clocks like the regular SKUs do. Think of high-performance mobile parts (H series) or the desktop GE series: they both run a more conservative v/f curve and have a strict PPT setting to lower consumption. Since this processor is locked, WYSIWYG.

It's a great chip, but I fear it's being praised for something it hasn't earned. It's just how I feel about it, though, I will say that it is an excellent processor - I just think it costs too much for what it offers.


----------



## Deleted member 24505 (Apr 15, 2022)

Oh, and we will lock it so you can't OC it and break it, as it is so efficient. We made it and don't know how to fully manage it, so we had to nobble it. Stick it in and run it as it is, which, considering it is an enthusiast CPU, you will really appreciate.


----------



## Lifeless222 (Apr 15, 2022)

Still waiting for the gaming version of ADL - Raptor Lake with 3D game cache.


----------



## Mussels (Apr 15, 2022)

Sound_Card said:


> The price of Intel CPU's are built into the chipset itself. If you look at motherboard prices, they are pretty insane. A clever trick of marketing. Also, a good amount of games have a pretty large disparity. It makes really wonder about a Zen 4 3D stacker in the works, because I would be all over that. Imagine a 20% IPC gain, and additional 15% game performance gain with a cache edition CPU, marketed as Gamer Edition (kinda like Black Edition), where people buying those CPU's don't care about blender. I was really hoping someone would test this on starcraft 2 since it's single thread limited, was curious if cache would help boost the performance.


As much as I'd love SC2 benchmarks, it's pretty much impossible to create a reproducible test, and in MP the other players' CPUs slowing down force yours to slow down too



weekendgeek said:


> And for those extra 55 watts, it scores 7544 points higher.
> 
> I know thinking hurts, but no pain, no gain.


Because it has more cores. See what those 55 watts do for the higher-core-count AMD chips.
This argument is just moving the goalposts.



fevgatos said:


> You are putting a graph of power consumption in cinebench and then talk about gaming. Are you daft dude? You realize according to the graph you just posted the 12700 is more efficient than the 5800x 3d,right? Lol, way to shoot your self on the foot buddy


w1zzard posted his own graph:

and another source quoted above for emphasis:

The higher-core-count Intels are faster in multi-threaded apps.
So are the higher-core-count Ryzens.
Duh.


The Intels draw more power in *everything*.
That ups the cost of the entire PC, and it's relevant as heck. Intel selling the CPUs cheap is a marketing tactic when the boards are so expensive.
(Don't forget all the scandals where the cheaper motherboards couldn't handle the CPUs at stock and performance tanked)

The moment you limit the Intels' wattages, as some people keep suggesting, you lose that performance.
Remember that many sources for the Intel systems flip between three states: Intel's limits, motherboard limits, and static overclocks.
*You can't cherry-pick the performance of the CPU with unlimited power while quoting the wattages and heat of the power-limited states.

For that max performance, you need a TDP-unlimited CPU, high-end cooling, a high-end motherboard, and high-end RAM. Skip just one, and you get worse performance than the Ryzens.*

The actual cost of a PC changes drastically with all that in mind, even if the motherboards and CPUs were equally priced - because the Intel systems need a larger PSU, larger cooling, more case fans, and often a larger case to fit it all in.



The only valid use case for the 12900K/S is for someone who needs top tier gaming performance (165Hz and above) *and* workstation needs at the same time.
Sub 120Hz Gamers can get a 5600x or 12400F and be set for another 5 years.



Why do I care about these distinctions so much?
Because as an end user buying these products from these companies for the last 30 years, I've been burned too many times by hidden catches and gotchas.
It's like trusting a marketing slide:
100% faster than last year's product!***
*** (If you use DDR4-2667 for last year's product but DDR5-6400 in this year's, with a 1 kW PSU, a 360 mm AIO, and a $500 motherboard with water-cooled 24-phase VRMs)


----------



## Deleted member 202104 (Apr 15, 2022)

Mussels said:


> because it has more cores. See what those 55 watts does for the higher core count AMD chips.
> This argument is just moving the goalposts



Of course it has more cores. This isn't anything against the X3D, so don't get all defensive. The post I was replying to showed a chart with power consumption, and the moron was claiming that the 12700K had worse performance while using more power. In that graph, the 12700K did use more power, but it was *50%* faster.


----------



## Mussels (Apr 15, 2022)

I edited my post above and turned it into quite the essay.
The 12700K uses more power for more MT performance, absolutely.
It's also not the power whore the 12900K/S is; IMO it's the better chip.

It's just about comparing like for like, because in this case the 5800X3D has the gaming performance of a 12900KS, but it is NOT competing, or intended to compete, on multi-threaded performance.

Imagine a 5600X3D and how people would whinge that it's no good because of a low Cinebench score??


----------



## fevgatos (Apr 15, 2022)

nicamarvin said:


> On 12900K/KS DDR5 has worst power efficiency than DD4
> 
> View attachment 243627





Mussels said:


> As much as i'd love SC2 benchmarks, it's pretty much impossible to create a reproducible test, and in MP the other players CPU's slowing down forces yours to slown down too
> 
> 
> because it has more cores. See what those 55 watts does for the higher core count AMD chips.
> ...


Can we stop with that high-end bullshit argument? It gets really boring. Techspot tested a 12700F on an el cheapo B660 with the stock cooler and no power limits. Guess what: it performs identically in games to the 12700K on a 360 AIO.

A 12700F with a 150-euro B660 is cheaper than the 3D and crucifies it in every workload other than 240p gaming. Period. The rest is just AMD fanboying defense.


----------



## Mussels (Apr 15, 2022)

fevgatos said:


> Can we stop with that high end bullshit argument? It gets really boring. Techspot tested a 12700f on an el cheapo b660 with the stock cooler and no power limits. Guess what, it performs identical in games to the 12700k on a 360 aio.
> 
> 12700f with a 150 euro b660 is cheaper than the 3d and crucifies it in every other workload other than 240p gaming. Period. The rest is just amd fanboying defense.


Except it's not bullshit. It's rare that a cheaper board has no performance penalty.
A 12700F is not a 12900KS.
Did they test that budget board with budget RAM versus a top-tier board with top-tier RAM and a high-end GPU, the same way they do their normal reviews? Did they test it in a normal case, or on an open-air test bench? Did they still use a big-ass expensive cooler on it? Cheap JEDEC RAM?

It's very easy to skew results and say "oh, only 2% different, they're the same!" except 2% is the entire leaderboard in 4K testing.


----------



## gffermari (Apr 15, 2022)

The man who buys a 5800X3D does not care whether the 12700K is faster in MT tasks.
If he does care, then he shouldn't buy it.
If he also doesn't mind the small difference in games, he could buy any other 5000-series CPU or a cheap 12th-gen CPU.
It's just not for them.

It may not make much sense to buy the 3D for general usage, just as it doesn't make sense to buy the KS.


----------



## usul1978 (Apr 15, 2022)

I was willing to spend the stupid money to get that chip for my AM4 gaming platform to replace my trusty 5600X, just for (pseudo) future-proofing, the bragging rights, and because my birthday is 5 days after launch day.

I only game, at 1440p with a 3080, and I aim for 110 FPS in-game (if I get more FPS, I up the resolution and use downscaling, like in Warzone for instance, where I play at 130%), so I'm basically always GPU-bottlenecked. I have not much to gain in real life from a 5800X3D.

But what bothers me is what I'll probably lose with the 5800X3D: I'll probably lose my sweet RAM settings, as it's apparently not certain the 5800X3D will take a 1900 MHz FCLK and make my dual-rank 3800 CL14 DDR4 fly. I'll lose my low temps and low consumption. I'll lose my (slight) OC on the 5600X... And maybe I'll lose some stability, as the 5800X3D looks like something pushed to its limits.

So, after weeks of envy, I'll pass on the 5800X3D, and it makes me sad.


----------



## fevgatos (Apr 15, 2022)

Mussels said:


> Except it's not bullshit. It's rare that a cheaper board has no performance penalty.
> A 12700F is not a 12900KS.
> Did they test that budget board with budget ram, vs a top tier board with top tier ram, and a high end GPU the same way they do their normal reviews? Did they test it in a normal case, or an open air test bench? Did they still use a bigass expensive cooler on it? cheap JEDEC RAM?
> 
> It's very easy to skew results and say "oh only 2% different, they're the same!" except 2% is the entire leaderboard in 4K testing.


Well, let's agree on something. If you care about the fastest, you get a 12900K/KS on a 2-DIMM mobo with a Hynix 7000c30 kit. That will cost you 1500, maybe more, but it kisses everything goodbye...

Now regarding the 12700F: even the cheapest of cheap B660s will handle it just fine for gaming. Of course I'm not suggesting buying the cheapest, because a computer is not a games console; I'm just stating that it is possible. A 150-euro B660 Bazooka maxes it out easily in CBR23 with no power limits; coupled with the 12700F that's 470 euros. And it comes with a cooler, which again is fine for gaming (tested by Techspot). Of course I'd suggest changing the cooler as well, but still, the fact is that for almost the price of the 3D alone you get a CPU, a mobo, and a cooler that absolutely nails the 3D in everything but 240p games, and even there the difference is what, 5%?



gffermari said:


> The man who buys a 5800X3D does not care if the 12700K is faster in MT tasks.
> If he does care, then he shouldn't buy it.
> If also he doesn't mind the small difference in games, he could buy any other 5000 cpu or cheap 12000 cpu.
> It's just not for them.
> ...


Then that man would be buying a 12900 with high end ddr5, no?


----------



## gffermari (Apr 15, 2022)

@usul1978 

Yes, but that's by far the best and, most importantly, the last gaming CPU on AM4.

The only reason I'm thinking of pulling the trigger on the 5800X3D over the 5900X/5950X (which makes much more sense to buy in general) is that when I replace my 2080 Ti with a 4080/4080 Ti, the difference in games will be even bigger than it is now among the 5000-series CPUs.



fevgatos said:


> Then that man would be buying a 12900 with high end ddr5, no?


Not if he wants to spend half the price of the CPU, or maybe less than half of a system.
(Although we agree that it's pointless to invest in AM4 now, in theory for a ''mad'' man who wants the absolute best as cheaply as possible, the 3D is for him)


----------



## fevgatos (Apr 15, 2022)

gffermari said:


> he wants to spend half the price of a cpu or maybe less than half of a system.
> (although we agree that it's pointless to invest on AM4 now but in theory a ''mad'' man who wants the absolute best and cheapest possible, then the 3D is for him)


I beg to differ. The best and cheapest is the 12700F. Everything above that is within 5%, pretty much. The only reason I got the 12900K over the 12700 is that I wanted a better-binned chip for OCing.


----------



## W1zzard (Apr 15, 2022)

nicamarvin said:


> https://www.techpowerup.com/forums/attachments/1649970428652-png.243621/


Impressive results because the graphs don't start at zero.
(Not trying to shit on the OC achievement, Pieter is a master at overclocking and has done amazing work)
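The truncated-axis effect is easy to quantify: a bar chart whose axis starts near the smallest value makes a small gap look enormous. A minimal sketch with made-up values:

```python
# Sketch: how a truncated y-axis exaggerates a small difference.
# The scores and axis start are made up for illustration.
a, b = 100.0, 95.0   # two hypothetical benchmark scores
axis_start = 90.0    # where the truncated axis begins

actual_ratio = a / b                                  # true difference
apparent_ratio = (a - axis_start) / (b - axis_start)  # visual bar-height ratio

print(round(actual_ratio, 2))    # ~1.05x in reality
print(round(apparent_ratio, 1))  # the bar looks 2x as tall
```

A real 5% gap renders as a bar twice the height of its neighbor, which is exactly why zero-based axes are the convention for comparison charts.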


----------



## Deleted member 24505 (Apr 15, 2022)

Why would people who spend $1k+ on just their GPU quibble over the price of any other part of their PC? Either you can afford it or you can't. I understand the power thing, kind of, but I don't give a crap what power any part uses as long as I can cool it. If you have/want a high-end rig, it should be high end. You don't buy a Ferrari and swap the engine for a 1000cc 4-cyl 60mpg one.


----------



## InVasMani (Apr 15, 2022)

Are we going to see this same end-of-life platform-support banter the other way around with Alder Lake's successor from Intel? Everyone considering buying an AM4 chip now already knows this, and anyone already on AM4 who is upgrading doesn't care. We don't even know what to expect from Alder Lake's successor, because it's f*cking Intel we're talking about here. Remember Kaby Lake, which was literally just Skylake with another 100MHz thrown on top, the IMC slightly ironed out, and a tiny pittance of additional L3 cache? Is that what we should expect? 

Who knows, because Intel is the same company that often locks down overclocking of their chips. Sure, AMD does sometimes as well, but usually only for good reason, not "hey, I want to sell it as an unlocked chip for more money while I lock down everything below it to be less appealing." Hell, Skylake has less L3 cache than an LGA 775 C2Q had L2; honestly, years of node advancements and here you go, these are your scraps of L3 cache, enjoy. They also tried to sell you on the chips' "security" features initially, which then turned out to make them some of the most vulnerable chips in recent history. Intel pulled the rug out on everyone with their fantasy of chip "security" superiority over the competition. Look at all these insecure IPC gains of 1-2% year over year and +0 core count. Sorry, but Intel didn't give a f*ck for a decade, and I have a decade of not giving a f*ck to give back to them.


----------



## BHS1975 (Apr 15, 2022)

usul1978 said:


> I was willing to spend the stupid money to get that chip on my gaming AM4 plateform to replace my trusty 5600x just for (pseudo) future proofing, get the bragging rights and because my birthday is 5 days after launch day.
> 
> I only game, in 1440p with a 3080 and I aim ingame at 110 fps (if I get more FPS, I up the rez and use downscaling, like in Warzone for instance where I play at 130%) so I'm basically always gpu bottlenecked. I have not much to gain in real life if I get a 5800x3d.
> 
> ...


I have a 5600X too, slightly OC'd to 4.7, and the 5800X3D would be more efficient, as mine uses about 70W gaming while the 5800X3D uses 60W, so it should run about the same temps. And what RAM sticks do you have? I have the Trident Z 32GB 3600 CL16 running at 3800 CL15 flat, 2T at 1.47V, as they won't run at CL14 unless I use a much higher voltage.


----------



## usul1978 (Apr 15, 2022)

BHS1975 said:


> I have a 5600x too slightly OC to 4.7 and the 5800x3d would be more efficient as mine uses about 70w gaming while the 5800x3d uses 60w so it should run about the same temps. And what ram sticks do you have? I have the trident z 32gb 3600 cl16 running at 3800 cl15 flat 2T at 1.47v as they won't run at cl14 unless I use a much higher voltage.



I have the Trident Neo 3800 MHz CL14 2x16GB kit (F4-3800C14D-32GTZN), @3800, CL14-16-16, 1T. That was the only B-die in dual rank I found. They suck 1.5V! 

Actually my 5600X OC is really small, and I guess a slightly lower FCLK wouldn't be a real problem... so who knows, I may get one anyway  : )

I would like to try a 5800X3D and return it if it doesn't take 1900 FCLK, but that seems a bit complicated.
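For anyone weighing kits like these, a rough sketch of the first-word latency math; CL and transfer rate are the only inputs here, so it ignores the secondary timings that also matter:

```python
def cas_ns(cl, mts):
    """Approximate CAS latency in ns: the I/O clock is half the MT/s rate."""
    return cl * 2000 / mts

for cl, mts in [(16, 3600), (15, 3800), (14, 3800)]:
    print(f"DDR4-{mts} CL{cl}: {cas_ns(cl, mts):.2f} ns")
```

By this measure, 3800 CL14 shaves roughly 1.5 ns off 3600 CL16.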


----------



## Mats (Apr 15, 2022)

I'm actually surprised that AMD is as popular as it is right now, with Ryzen taking the top 5 spots on Amazon's best sellers, and 8 of the top 10 as well.
I would have expected more of a 50-50 situation. Oh, and only one model is from 2022, so not much of an impact from those models yet, although they're still pretty fresh.
I guess having fewer models helps AMD here.

Only 4 Alder Lake chips in the top 20; I thought Alder Lake was so good that it would do better than that. Then there are other stores, of course..


----------



## LupintheIII (Apr 15, 2022)

Xebec said:


> Well written summary and appreciate the detailed benchmarks.  I was expecting more than WinRAR to really appreciate the cache but this is still quite interesting.
> 
> I'm really curious if Microsoft Flight Sim will benefit from the cache..


It does


----------



## BHS1975 (Apr 15, 2022)

usul1978 said:


> I have Trident Neo 3800mhz cl14 2x16 gb kit (F4-3800C14D-32GTZN), @3800, CL14-16-16, 1T. That was the only B Die in dual rank I found. They suck 1.5V!
> 
> Actually my 5600x OC is really small and I guess that a slightly lower FCLK wouldn't be a real problem...so who knows, I may get one anyway  : )
> 
> I would like to try a 5800x3D, and return it if it doesn't take 1900 FCLK, but that seems a bit complicated.


Get it through Amazon Prime so you can return it easily.


----------



## usul1978 (Apr 15, 2022)

BHS1975 said:


> Get it through Amazon Prime so you can return it easily.



Even if the box is opened?


----------



## lexluthermiester (Apr 15, 2022)

W1zzard said:


> Impressive results because graphs dont start at zero.
> (Not trying to shit on the OC achievement, Pieter is a master at overclocking and has done amazing work)


To me, graphs like that border on dishonest, as they don't show the actual differences even when they show the correct numbers. The fact that you folks don't do that is one of the many reasons TPU is so respected, and why people are willing to put a great deal of stock in the benchmark results and reviews you folks offer.


----------



## BHS1975 (Apr 15, 2022)

usul1978 said:


> Even if the box is opened?


Yeah they will credit your account.


----------



## Aquinus (Apr 15, 2022)

I'd love to see how a 12c/24t version with 192MB of LLC would turn out. Fewer cores per CCD should help with cache hits if you're only sharing that 96MB per CCD between 6 cores instead of 8. Either way, I think AMD deserves some credit for trying something a little different from the rest of their lineup. It's that extra level of spiciness we should be craving from tech companies, particularly if they can build this SRAM on nodes small enough that the power cost is minimal for the gain.
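Back-of-the-envelope math for that idea (the 12-core part and its cache split are hypothetical):

```python
def l3_per_core(l3_per_ccd_mb, cores_per_ccd):
    """L3 available per core, assuming the CCD's pool is shared evenly."""
    return l3_per_ccd_mb / cores_per_ccd

print(l3_per_core(96, 8))  # shipping 5800X3D: 12.0 MB per core
print(l3_per_core(96, 6))  # hypothetical 6-core CCD: 16.0 MB per core
```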


----------



## QuietBob (Apr 15, 2022)

fevgatos said:


> It is impressive indeed, but keep in mind they used 4400 ddr5 on intel.


They actually used DDR5-4800 CL38. Yes, I agree that using faster RAM would allow the 12900K to score higher on the perf/watt metric. But similarly, the 5800X3D would do better with DDR4-3800 CL14.



Dr. Dro said:


> It's a great chip, but I fear it's being praised for something it hasn't earned.


I meant efficiency _in games_ specifically. Limiting the voltage and clocks evidently brought the power consumption down, but it's the addition of the 3D V-cache that brought the efficiency _in games _to an entirely new level for Zen 3:






Source: ComputerBase

Is the new 5800X3D the most efficient CPU in terms of price/performance in all scenarios? No. 
Is it the most efficient CPU in terms of performance/watt in games? If the above ComputerBase numbers are anything to go by, I'd say yes.


----------



## Deleted member 24505 (Apr 15, 2022)

For a new build maybe, not worth buying if you already have a good CPU though.


----------



## Mussels (Apr 16, 2022)

Tigger said:


> For a new build maybe, not worth buying if you already have a good CPU though.


That's always true, new chips are only relevant to those who want a new build, or got lucky with an upgrade path (the people with X370/B450 that can run the 5800X3D struck gold)


----------



## Dr. Dro (Apr 16, 2022)

QuietBob said:


> I meant efficiency _in games_ specifically. Limiting the voltage and clocks evidently brought the power consumption down, but it's the addition of the 3D V-cache that brought the efficiency _in games _to an entirely new level for Zen 3:
> 
> Is the new 5800X3D the most efficient CPU in terms of price/performance in all scenarios? No.
> Is it the most efficient CPU in terms of performance/watt in games? If the above ComputerBase numbers are anything to go by, I'd say yes.



Yeah, it's a great processor if you strictly focus on vidya; I just disagree with the pricing. It might make sense to a lot of people, given the upgrade paths it provides. I've just been puzzled here; the setbacks otherwise seem a bit more than I'm willing to deal with for $450. But I'm also spoiled by my 5950X's phenomenal all-around performance. I've owned this processor for a year now and it still feels limitlessly powerful. I also fully acknowledge that this CPU just isn't intended for me, but I wouldn't mind having a sidekick rig with one.


----------



## wheresmycar (Apr 16, 2022)

theglaze said:


> I'm in the same boat... Alder Lake runs too hot and inefficient for my taste. While Zen 4 is probably worth the wait, it will likely be scalped for months after launch, a risk to teething issues with new chipsets and first-gen mobos, and DDR5 will still be expensive.



My fear with Zen 4 is AMD's value-king status being further tarnished, although we can't hold it against them. Judging by Zen 3's initial prices, I suspect Zen 4 is gonna be extortionate from every angle (1st-gen processors, mobos and DDR5). The more I look at it, the 5600/5600X, and if the price drops the 5800X3D, might be the way forward for me. I dunno, I have a weird self-gratifying buzz about bringing back one of our older AM4 boards and breathing life into it.

What are your thoughts on 1440p gaming... will Zen 4 (DDR5) see more significant performance returns, or will we still be largely GPU-bound at this resolution? I'm still considering sticking with the 9700K long enough to see how much of a Zen 4 performance uplift we get at 1440p before considering any/all options. If I'm overthinking it, you have full legal authority to slap me across the head with a wet fish and knock some sense into me. 

Just info-hungry-curious... any chance Zen 4 will be DDR4 backwards compatible, or are we 100% moving along with DDR5?


----------



## harm9963 (Apr 16, 2022)

Dr. Dro said:


> Yeah, it's a great processor if you strictly focus on vidya, I just disagree with the pricing. It might make sense to a lot of people, given the provided upgrade paths. I've just been puzzled here, the setbacks otherwise seem a bit more than i'm willing to deal with for $450; but i'm also spoiled by my 5950X's phenomenal performance all-around. I've owned this processor for a year now and it still feels limitlessly powerful. I also fully acknowledge that this CPU just isn't intended for me, but I wouldn't mind having a sidekick rig with one.


That's what I plan to do; my second system is an X470 with a 2700X. The 5950X will replace it, and the 5800X3D will go in the ASUS Dark Hero.


----------



## Mussels (Apr 16, 2022)

Just checked my motherboards for 5800x3d compatibility
**all of them**

AX370 Gaming 5? BIOS T51D adds all Zen 3 (incl. the 5800X3D)
Strix B450-I? Yup.
The otherwise garbage B450M PRO-M2 MAX? Yup.
And no one's shocked: the X570-F also has it.

It seems that a few companies were just waiting on the 5800X3D to launch, so they could do all of Zen 3 in one go, including all the new chips.


----------



## mb194dc (Apr 16, 2022)

Even B350 is getting all of Zen 3 by the look of it. Guess AMD changed their mind...

Better to keep people on their platform; if you're forcing people to change boards, a lot would go for AL. 

Changing the chip is easy for those still on Zen 1 like me. Doubt I'll go for the 3D version... will see if prices come down further.


----------



## nicamarvin (Apr 16, 2022)

LupintheIII said:


> It does





Xebec said:


> I'm really curious if Microsoft Flight Sim will benefit from the cache..



More tests at 1440P and 4K


----------



## Pastuch (Apr 16, 2022)

nicamarvin said:


> More tests at 1440P and 4K
> 
> View attachment 243787


Wow, if you're a flight sim guy this is pretty amazing.


----------



## The King (Apr 17, 2022)

I have regularly seen the 5600X go for $200 on Amazon.com with the 5600 going for $199. Complete BS if you ask me, since the 5600 seems to be a limited release in many parts of the world; it was not even released here in India while all the other SKUs are for sale. @W1zzard @btarunr care to share any opinions on why AMD is doing this? What is the need for a $199 5600 when the 5600X is $1 more in the US marketplace?


----------



## puma99dk| (Apr 17, 2022)

I am really curious to see the price of the AMD Ryzen 7 5800X3D in my country.

I am sad to see that motherboard prices and features aren't really good compared to boards for Intel 12th gen like MSI's Pro Z690-A.

I have mainly been looking at X570, because most B550 boards only give me 2x M.2 (and using the second M.2 takes away 2x SATA ports). I also want an internal Type-C header for my Meshify 2 case, because I always use a Delock USB-C to USB-A adapter; the average transfer speeds to all my USB 3.0 thumb drives are better than over normal USB 3.0.

Proshop is having an Easter sale on some boards I have been looking at, but almost all of them got Realtek LAN  

£167 gets me an MSI MAG X570S Torpedo, but it has Realtek LAN; not a fan of that LAN.
£200 gets me an MSI MPG Edge Max WiFi: same features as the Torpedo, adds a third M.2 and WiFi 6E.
£223 gets me my dream board, the Gigabyte X570S Aero G. It has everything, but my wallet doesn't agree with me on this price, and I do not want to fight my mixed-DDR4 battle again like on my Z590 Vision G. 

Since I had an MSI B450 and an Asus B550 in the past with no issues with my mixed DDR4 RAM, I might jump on the Asus ROG Strix X570-F for £184 in the Easter sale, because I'd get Intel I211-AT (Gigabit) LAN, 2x M.2 and 8x SATA ports.


----------



## HD64G (Apr 17, 2022)

AMD Ryzen 7 5800X3D @ 5141.78 MHz - CPU-Z VALIDATOR (valid.x86.fr)

[m1eu6l] Validated Dump by TSAIK (2022-04-16 04:07:30) - MB: MSI MEG X570 GODLIKE (MS-7C34) - RAM: 8192 MB


----------



## The King (Apr 17, 2022)

HD64G said:


> AMD Ryzen 7 5800X3D @ 5141.78 MHz - CPU-Z VALIDATOR
> 
> 
> [m1eu6l] Validated Dump by TSAIK (2022-04-16 04:07:30) - MB: MSI MEG X570 GODLIKE (MS-7C34) - RAM: 8192 MB
> ...


Sadly he never ran the built-in benchmark, so I'm not sure whether that OC would be of any practical use or is just a pure CPU frequency validation.


----------



## Deleted member 24505 (Apr 17, 2022)

5.1ghz at 1.2v

https://hothardware.com/news/ryzen-7-5800x3d-overclocked-to-51ghz-by-msi

To achieve that overclock, TSAIK had to raise the reference clock to 113 MHz. We don't know what compromises that required on the MSI MEG X570 GODLIKE board he was using, but we do know that it required him to use only a single 8GB memory module running at just 1,205 MHz (2410 MT/s). Essentially, it's a single stick running at JEDEC "safe" timings, and the only reason it's 1,205 MHz instead of 1,066 MHz is because of the increased reference clock.
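A sketch of the reference-clock arithmetic behind that run; the 45.5x core multiplier here is inferred from the validated 5141.78 MHz, not confirmed:

```python
BCLK = 113.0  # MHz, raised from the stock 100 MHz

core_mhz = 45.5 * BCLK         # ~5141.5 MHz, in line with the CPU-Z validation
mem_mhz = 1066.0 * BCLK / 100  # the JEDEC ~1066 MHz strap scales to ~1205 MHz

print(f"core: {core_mhz:.1f} MHz, memory: {mem_mhz:.0f} MHz")
```

Everything on the board scales with BCLK, which is why the memory ends up at an odd 1,205 MHz rather than a round strap value.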


----------



## The King (Apr 17, 2022)

Tigger said:


> 5.1ghz at 1.2v
> 
> https://hothardware.com/news/ryzen-7-5800x3d-overclocked-to-51ghz-by-msi
> 
> To achieve that overclock, TSAIK had to raise the reference clock to 113 MHz. We don't know what compromises that required on the MSI MEG X570 GODLIKE board he was using, but we do know that it required him to use only a single 8GB memory module running at just 1,205 MHz (2410 MT/s). Essentially, it's a single stick running at JEDEC "safe" timings, and the only reason it's 1,205 MHz instead of 1,066 Mhz is because of the increased reference clock.


Was just going to post that his validation was done in single channel


----------



## Deleted member 24505 (Apr 17, 2022)

The King said:


> Was just going to post that his validation was done in single channel



It's not really a practical OC i guess


----------



## blu3dragon (Apr 17, 2022)

So according to this thread, PBO works, meaning you can tune power limits and core offsets.









5800X3D Owners - www.overclock.net

Scroll Down For Gaming Results If Not Interested In Details  TLDR: 5800X3D is a monster at gaming, is easily OCd, and runs hot and beats even a tuned 5950x in most games; for general computing the 5800x, and more importantly now, the 5700x trounce it when OCd as well but cannot match it in...




Makes sense and is good news, although the benefits of doing it on this chip seem to be limited.


----------



## Totally (Apr 17, 2022)

xenocide said:


> This is like, the coldest of takes. This isn't a CPU intended to be an upgrade from someone using a 5800X or equivalent CPU. It's a final upgrade for people using a 2xxx or even 1xxx series CPU so they can go a few years longer without having to rebuild their entire PC. The performance is there, and in that regard it achieves its goal.



Your argument makes no sense; it's as if people upgrading from 2xxx or even 1xxx aren't the least bit interested in being wise with their money and will pretend the chips that bracket the 5800X3D (the 5600X, 5700X/5800X, and 5900X) don't exist. The 5800X3D isn't a bad chip, but pricing puts it in a very awkward spot: too expensive for what it offers to make a hard case against the 5600X, and expensive enough for the 5900X to be the more logical choice.


----------



## Mussels (Apr 18, 2022)

blu3dragon said:


> So according to this thread, PBO works, meaning you can tune power limits and core offsets.
> 
> 
> 
> ...


I would assume BIOS updates would be needed to enable it, as it's not working on many review boards yet



Totally said:


> Your argument makes no sense; it's as if people upgrading from 2xxx or even 1xxx aren't the least bit interested in being wise with their money and will pretend the chips that bracket the 5800X3D (the 5600X, 5700X/5800X, and 5900X) don't exist. The 5800X3D isn't a bad chip, but pricing puts it in a very awkward spot: too expensive for what it offers to make a hard case against the 5600X, and expensive enough for the 5900X to be the more logical choice.


Not just for you, but to everyone else: 

Launch prices are always weird. The new products are in demand and priced higher than the older discounted products - give it 6 months and they'll make more sense.
Early adopter tax.


----------



## blu3dragon (Apr 18, 2022)

Mussels said:


> I would assume BIOS updates would be needed to enable it, as its not working on many review boards yet


I haven't seen any review talk about this :-/

Everyone does state that overclocking is disabled (aka manual multiplier adjustment is locked)

Might have to order one just to find out :-D


----------



## mama (Apr 18, 2022)

Seriously good for gaming.  At least as good as the 12900K in games, even with strong DDR5 memory.  Not much good for anything else, however.


----------



## Deleted member 202104 (Apr 18, 2022)

mama said:


> Seriously good for gaming.  At least as good as the 12900K in games, even with strong DDR5 memory.  *Not much good for anything else, however.*



I probably wouldn't go that far, but I'm guessing a 5700x will have about the same performance as the x3D in everything but gaming.  That makes it a $150 premium to have the 'fastest' gaming CPU.

Considering the price of everything else with PC gaming lately, $150 isn't the end of the world.

As much as I didn't care for my 5800x, I'd consider picking this up for a gaming focused build.


----------



## Why_Me (Apr 18, 2022)

gffermari said:


> To sum up:
> For absolute gaming performance 5800X3D or 12900KS. The budget is the limit - both cpus are 100% the best in gaming but it's meaningless to buy either of them for anything else. The 5800X3D is mediocre at productivity while the 12900KS is ridiculously more expensive against the 12900K.
> For compromised, even slightly, gaming and productivity....all the other cpus.
> 
> The 12700K and 5900X are the best all around high end cpus.


The 12700/12700F for the price imo.


----------



## sukathukassa (Apr 18, 2022)

Does the eco-mode work on it?


----------



## Melvis (Apr 18, 2022)

Very well done AMD. This is most likely what I will upgrade my 2700X to, since I mostly game on this PC and stream from time to time, and having a nice drop-in upgrade, regardless of the price, is way cheaper than building/buying a new CPU/RAM/mobo.

What people seem to forget is that these CPUs, and this X3D one in particular, are there for those of us already on the AM4 platform as an easy drop-in upgrade, not aimed at brand-new system builds. So if you're going to compare it to some AL CPU, you need to compare not just the CPU but (most likely) the DDR5 RAM, CPU, motherboard and cooling costs against just the cost of a 5800X3D, and when you then compare the two, the price-to-performance is massively in AMD's favour.


----------



## fevgatos (Apr 18, 2022)

Melvis said:


> Very well done AMD, this is most likely what I will upgrade my 2700X to, since I do mostly gaming on this PC and stream from time to time and to just have a nice drop in upgrade regardless of the price is way cheaper then build/buying a new CPU/RAM/Mobo.
> 
> What people seem to forget is these CPU's and this X3D one are there for us already on the AM4 platform and for an easy drop in upgrade not aimed for brand new system builds. So if your going to compare it to some AL CPU you need to compare it to not just the CPU but DDR5 RAM (most likely) CPU, Motherboard and cooling cost compared to just the cost of a 5800X3D and when you then compare the two the price to performance is massively in AMD's favour.


I wish that was true. It's not. A 12700F + B660 Bazooka + stock cooler costs the same as the 3D alone. And of course the 3D still loses massively in ST and MT performance


----------



## Melvis (Apr 18, 2022)

fevgatos said:


> I wish that was true. Its not. 12700f+b660 bazooka+ stock cooler cost the same as the 3d alone. Of course the 3d still loses massively in st and mt performance



Ummm, really? It is true for me and most likely most of the planet; we don't count the USA with their dirt-cheap prices compared to everyone else. A 12700F and B660 Bazooka is $730 here, about $200 more than what the 5800X3D is going to be, so sadly that just isn't the best option for anyone already on an AM4 system. And again, people keep forgetting: it's a gaming CPU.......gaming! Hence the gaming performance and the name  

Also, I forgot to add in the cost of crappy Windows 11 on top of all this just to have the AL CPU work correctly..... yeah, no thank you!


----------



## ThrashZone (Apr 18, 2022)

Hi,
I'd be very tempted on a laptop sporting a 5800x3d


----------



## Dr. Dro (Apr 18, 2022)

sukathukassa said:


> Does the eco-mode work on it?



It should (as it's just limiting the CPU to the 65W spec/88W PPT), but there might not be much of a point. The power consumption is already significantly lower than the other Ryzen models because it doesn't peak the V/F curve as high.

Ryzen Master was recently updated to support Curve Optimizer adjustment from Windows and to add support for this processor, but core frequency control is disabled for the X3D.
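For what it's worth, AMD's usual rule of thumb maps TDP to the PPT socket-power limit as roughly 1.35x TDP, which is where the Eco Mode figures come from:

```python
def ppt_watts(tdp_watts):
    """Approximate AMD PPT limit from TDP (PPT ~ 1.35 x TDP)."""
    return round(1.35 * tdp_watts)

print(ppt_watts(105))  # 142 W: stock limit for the 5800X3D's 105 W TDP
print(ppt_watts(65))   # 88 W: the 65 W-class / Eco Mode target
```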


----------



## Mussels (Apr 18, 2022)

ThrashZone said:


> Hi,
> I'd be very tempted on a laptop sporting a 5800x3d


Someone mentioned an APU with 3D cache before, and the reply was that it was too niche: boosting the APU's performance would simply push the price high enough that you'd get a dGPU instead

That said...  game consoles?

Imagine the PS5/Xbox Series X APU with its dedicated GDDR6 but with the 3D cache on the CPU. That's the kind of chip that can be tweaked to within an inch of its life, and then have the games optimised to benefit from the cache, too.


----------



## fevgatos (Apr 18, 2022)

Melvis said:


> Ummm, really? It is true for me and most likely most of the planet; we don't count the USA with their dirt-cheap prices compared to everyone else. A 12700F and B660 Bazooka is $730 here, about $200 more than what the 5800X3D is going to be, so sadly that just isn't the best option for anyone already on an AM4 system. And again, people keep forgetting: it's a gaming CPU.......gaming! Hence the gaming performance and the name
> 
> Also, I forgot to add in the cost of crappy Windows 11 on top of all this just to have the AL CPU work correctly..... yeah, no thank you!


What are you talking about? The 12700F is 320 to 330 euros and the B660 is 150. So yeah, the 3D is stupidly overpriced even for current AM4 owners. 450 gets you a fresh new mobo with modern features and upgradability, plus a CPU that obliterates the 3D in both single- and multithreaded performance


----------



## Mussels (Apr 18, 2022)

fevgatos said:


> What are you talking about? the 12700f is 320 to 330 euros and the b660 is 150. So yeah, the 3d is stupidly overpriced even for current am4 owners. 450 gets you a fresh new mobo with modern features and upgradability plus a cpu that obliterates the 3d in both single and multithreaded performance


What modern features is X570 missing?
DDR5?

You can get literally everything else on X570 (and B550)
And then you mention that the Intel chip "obliterates" the 5800X3D in 3D performance... which it does not do. 


You are aware that new X570 boards come out all the time, and that we aren't stuck on the launch designs from a few years ago, yeah?
X570S, the silent launch of V2 boards with updated extras?


----------



## Melvis (Apr 18, 2022)

fevgatos said:


> What are you talking about? the 12700f is 320 to 330 euros and the b660 is 150. So yeah, the 3d is stupidly overpriced even for current am4 owners. 450 gets you a fresh new mobo with modern features and upgradability plus a cpu that obliterates the 3d in both single and multithreaded performance



What are you talking about? Your fancy 12700F gets destroyed in gaming, like the 3D was designed to do  Oh, and $159 for Windows 11


----------



## Tomorrow (Apr 18, 2022)

fevgatos said:


> What are you talking about? the 12700f is 320 to 330 euros and the b660 is 150. So yeah, the 3d is stupidly overpriced even for current am4 owners. 450 gets you a fresh new mobo with modern features and upgradability plus a cpu that obliterates the 3d in both single and multithreaded performance


Like he said, regional pricing varies. In my country the 12700KF (the K is even more expensive at 450€) is 415€ minimum and a B660 is 155€ minimum; total 570€. That's 120€ more expensive than the 5800X3D.

I'm not even sure where you got the 320-330€ pricing, as the cheapest I can see in the rest of the EU is 380€, with the motherboard being the same 155€. That's still 535€, and 85€ more expensive.

What modern features do you mean? Expensive DDR5 that your selected motherboard does not even support? Because that would make the 12700KF even more expensive, with both the motherboard and the DDR5 cost to match low-latency DDR4.

PCIe 5.0?
There are no 5.0 SSDs yet and won't be until the end of this year. Also no GPUs, and even if there were, they would not benefit from 5.0


----------



## Deleted member 24505 (Apr 18, 2022)

The 5800X3D is great for gaming, nothing else. If you spend all your time gaming, great. If not, get a 12700K or 5900X


----------



## Tomorrow (Apr 18, 2022)

Tigger said:


> The 5800X3D is great for gaming, nothing else. If you spend all your time gaming great. If not get a 12700k or 5900X


What is the 5900X great for other than productivity applications?
Most people would not benefit from 12c/24t and would be better served by the 5600/5800 models or the 12400/12700.

And what do you mean, the 5800X3D is great for gaming and nothing else? It's functionally still a 5800X: a modern 8c/16t CPU capable of handling most use cases. Sure, people who want GREAT gaming performance should go for the 5800X3D or 12900K; those after GREAT productivity (especially if efficiency is important) should go for the 5950X instead, etc.

Some people are acting like the 5800X3D is a one-trick pony with terrible results in everything but gaming. That is factually wrong.


----------



## lexluthermiester (Apr 18, 2022)

Melvis said:


> Oh and $159 for Windows 11


For a retail copy of Pro?


----------



## TheoneandonlyMrK (Apr 18, 2022)

birdie said:


> I don't understand why everyone is so excited.
> 
> AMD is _not_ the first company to have a massive L3 cache: Intel did that _seven_ years ago with Broadwell which featured a massive 128MB L4 cache and showed crazy improvements in certain applications and games as well.


Yes, they are the first to offer a commercial CPU with such a big L3 cache, as you also say; off-die L4 isn't L3 stacked on top of the die




birdie said:


> 5800X3D is a single experimental CPU which rocks in some games but loses in others and also loses in general applications which don't require enormous L3 cache due to decreased frequencies.


So far it is.


birdie said:


> Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.


So you'd suggest an i3 then, and own one yourself? F#£@ no you don't.


birdie said:


> RPL according to the leaked information will feature an increased IPC for both its P and E cores as well as a significantly increased caches which means Intel will swiftly catch up and overtake this CPU and maybe even Zen 4.


Irrelevant; it's not out yet, so that's a pointless comment


birdie said:


> It's too effing expensive for what it offers, no, it's _not $100_ more expensive than 5800X, 5800X has been recently sold for as low as $350 which makes it a huge $200 difference. I got it wrong, sorry. Still we're only talking about rare applications and gaming at quirky resolutions.


And a 12900K or KS is cheap? To you, perhaps.


birdie said:


> This is the last hooray of AM4, there's no future upgrade path.


Jealousy bites, eh?


birdie said:


> Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.


Rare Intel fans show just as much bias and boast just as much.


----------



## fevgatos (Apr 18, 2022)

Melvis said:


> What are you talking about? Your fancy 12700F gets destroyed in gaming, like the 3D was designed to do  Oh, and $159 for Windows 11


Buy from somewhere else then? I mean, you realize that this shop has the 5600X for 340, which means the 5800X3D will be 700+, so the argument still stands, right? RIGHT?



Tomorrow said:


> Like he said. Regional pricing varies. In my country 12700KF (K is even more expensive at 450€) is 415€ minimum and B660 is 155€ minimum. Total 570€. That's 120€ more expensive than 5800X3D.
> 
> Im not even sure where you got the 320-330€ pricing as the cheapest i can see in the rest of EU is 380€ with the motherboard being the same 155€. That's still 535€ and 85€ more expensive.
> 
> ...


I was talking about the 12700F, not the KF.



Tomorrow said:


> Some people acting like 5800X3D is a one trick pony with terrible results in everything but gaming. That is factually wrong.


No, that's factually correct. For the price, the results are terrible in everything but gaming. It basically loses to CPUs that cost a fraction of the price, so yeah



Mussels said:


> What modern features is X570 missing?
> DDR5?
> 
> You can get literally everything else, on x570 (and B550)
> ...


I didn't mention the X570. Plus, I don't think anyone with a CPU worth upgrading to the 3D rocks an X570 or an X570S.


----------



## Tomorrow (Apr 18, 2022)

fevgatos said:


> Was talking about the 12700f, not the KF.
> 
> No, that's factually correct. For the price, the results are terrible in everything but gaming. It basically loses to CPUs that cost a fraction, so yeah
> 
> I didn't mention the x570. Plus i don't think anyone with  a CPU worth upgrading to 3d rocks an x570 or an x570S.


Choosing the 12700F is even worse, as you're choosing a locked (non-overclockable) CPU. At least in the 5800X3D's case there's the argument that there is no other V-Cache model that allows overclocking to choose instead.

I did not see people complain about the 5800X, which launched at $450 too, as being too expensive for the performance it offered. But now an improved version at the same price is suddenly terrible? Please. The 5800X3D's performance is still good outside gaming. You're talking like it's unusable outside gaming. I doubt most people would even notice the few percentage points of performance lost to the 200 MHz lower boost clock.

Or would notice the difference in a blind test vs a 12700K outside games.
There are plenty of X570 owners like myself using the 3000 series, in my case a 3800X. The 5800X3D will be a good upgrade for me even outside gaming, and cheaper than building a 12700-based system on an inferior B660 motherboard.


----------



## Deleted member 24505 (Apr 18, 2022)

The 5800X3D is a good CPU, but not the king of CPUs like some people are acting as if it is. It is nearly the last gasp of AM4, and unless you have a shitty CPU now, imo don't bother and wait for AM5, as you would be getting something so much better. If you have $450 to piss up the wall, fine, buy it anyway. It's not worth ditching a perfectly fine 5800/900/950X for just x% more FPS, though. Chances are you will be switching to AM5, and then what happens with the X3D? Second rig, sell it on? Not worth the hassle imo. But what do I know, I'm just a dirty Intel-using peasant.


----------



## fevgatos (Apr 18, 2022)

Tomorrow said:


> Or would notice the difference in a blind test vs 12700K outside games.


The 12700k is 50% faster in multithreaded tasks. So you are saying they wouldn't notice a 50% difference, but they would notice a 5% difference in games? I see, you are not at all biased 



Tomorrow said:


> I did not see people complain about 5800X that launched at 450 too as being too expensive for the performance it offered. But now an improved version at the same price is suddenly terrible? Please. 5800X3D performance is still good outside gaming.


Thank you, that is a nice comparison. The 5800x cost the same, but it was actually way faster than the i9 of the time in single threaded performance (the 5800x 3d on the other hand is way slower than the current i9), it was equal in multithreaded performance (the 5800x 3d is getting obliterated). So yeah, the 5800x was okay priced in comparison to the i9 10900k, the 5800x 3d is a joke compared to the current i9.


----------



## Why_Me (Apr 18, 2022)

Tomorrow said:


> What is 5900X great for other than productivity applications?
> Most people would not benefit from 12c/24t and would be better served by 5600/5800 models or 12400/12700.
> 
> And what do you mean 5800X3D is great for gaming and nothing else? It's functionally still 5800X - a modern 8c/16t CPU capable of handling most use cases. Sure people who want GREAT gaming performance should go for 5800X3D on 12900K. Those after GREAT productivity (especially if efficiency is important) should go for 5950X instead etc.
> ...


The 12900K  is a hose job when you have the 12700K/KF and 12700/F nipping at its heels for far less.


----------



## BHS1975 (Apr 18, 2022)

I have a 5600x with 3800 cl15 at 2T ram and a 6900xt and play bf2042 mostly with my gpu usage dipping to 70% at times. Would it be worth it to get the 5800X3D?


----------



## Tomorrow (Apr 18, 2022)

BHS1975 said:


> I have a 5600x with 3800 cl15 at 2T ram and a 6900xt and play bf2042 mostly with my gpu usage dipping to 70% at times. Would it be worth it to get the 5800X3D?


At what resolution?


----------



## TheoneandonlyMrK (Apr 18, 2022)

Tigger said:


> The 5800X3D is a good CPU, but not the king of CPU's like some people are acting as if. It is nearly the last gasp of AM4 and unless you have a shitty CPU now imo don't bother and wait for AM5 as you would be getting something so much better. If you have $450 to piss up the wall fine buy it anyway. It's not worth ditching a perfectly fine 5800/900/950X for though just for x% more FPS. Chances are you will be switching to AM5, then what with the X3D, second rig, sell it on, not worth the hassle imo. But what do I know, i'm just a dirty Intel using peasant.


For me you could hold off the last bit of your statement.

It's just the king of gaming, no more; for productivity a 5950X is the best. And while I agree it isn't necessarily worth buying for anyone on a two-year-old CPU or better, it's ironic coming from someone already on the latest Alder Lake.


----------



## Deleted member 24505 (Apr 18, 2022)

TheoneandonlyMrK said:


> For me you could hold off the last bit of your statement.
> 
> It's Just the king of gaming, no more, for productivity a 5950x is the best, and while I agree it is not worth anyone using a two year old CPU or better to buy necessarily, it's ironic coming from someone already on the latest AL to be saying.



My previous post was, imo, shitty. I only have the AL because it was gifted; if not, I would probably be slotting one of these into the B450 board I had with the 2600X.


----------



## tussinman (Apr 18, 2022)

Melvis said:


> What people seem to forget is these CPU's and this X3D one are there for us already on the AM4 platform and for an easy drop in upgrade not aimed for brand new system builds.


The problem is the 5700/5800 are a lot cheaper and perform within 10% of it. In the US, for example, the 5800X3D is exactly 1.5x the price of the 5700 (and that's assuming the 5800X3D doesn't get price gouged by scalpers or retailers), but the 5700 is within 5-8% of it in most real-world gaming scenarios.

Even the demographic it's targeting is more likely to get the better-priced alternatives instead (people who have been sitting on 2018/2019-era Ryzens are way more likely to save the extra money).



Melvis said:


> So if your going to compare it to some AL CPU you need to compare it to not just the CPU but DDR5 RAM (most likely) CPU, Motherboard and cooling cost compared to just the cost of a 5800X3D and when you then compare the two the price to performance is massively in AMD's favour.


Actually not true. Most benchmarks show DDR4-3600 either equal, within a few percent, or actually faster than DDR5 (that's the whole controversy with DDR5: not worth the cost). If you're doing a complete new build, then a 12700 w/B660 and DDR4 is the best value right now. A 12400/12600 isn't a bad temporary plan if you have the desire to swap in a 13th gen CPU later.


----------



## gffermari (Apr 18, 2022)

The 5800X3D is the only CPU on the AM4 platform with 12900KS performance in gaming. It's miles ahead of any 5000-series CPU.
How much did you expect it to cost?

For am4 users, it’s this or sell everything and buy intel.

The price is ok, not good but not bad either.


----------



## BHS1975 (Apr 18, 2022)

Tomorrow said:


> At what resolution?


1440p 144hz


----------



## Melvis (Apr 19, 2022)

lexluthermiester said:


> For a retail copy of Pro?



Retail copy of Home, Pro is $269


----------



## Mussels (Apr 19, 2022)

fevgatos said:


> Buy from somewhere else then? I mean you realize that this shop has the 5600x for 340, which means the 5800x 3d will be 700+, so the argument still stands, right? RIGHT?
> 
> 
> Was talking about the 12700f, not the KF.
> ...





fevgatos said:


> What are you talking about? the 12700f is 320 to 330 euros and the b660 is 150. So yeah, the 3d is stupidly overpriced even for current am4 owners. 450 gets you a *fresh new mobo with modern features and upgradability* plus a cpu that obliterates the 3d in both single and multithreaded performance



You didn't mention any chipsets; I'm happy to point out that you're moving the goalposts and changing topics instead of actually backing up your comments.
"Intel has better mobos with more features!... but I didn't name any specific motherboards or features, so I don't have to provide any facts to back this up at all!"


You just like Intel. You want Intel. That's fine.
Shitposting and trolling is not. If this is, in fact, you just not understanding why you dislike the AM4 platform? That's also fine... if you back up your claims with hard info. If you can't, then you're just lying.



"Go Intel because the platform has more upgrades!"

You what? Any Intel CPU released is gonna need a new motherboard anyway.
It's not like they have a track record of motherboards not working with supposedly compatible CPUs, or abandoning certain chipsets for funsies, or changing the power requirements, or how using a 10th gen CPU in an 11th gen board leaves your first M.2 slot unusable (as well as the usual NVMe use disabling SATA ports, PCI-E slots, etc., leaving the boards with far fewer features than they seem to have).
(That's four separate links for three separate issues, mind you.)
What Intel lets you do is reuse your CPU in a newer motherboard; they rarely let you use a newer CPU in your older motherboard.

*Don't make claims and then try and pretend you never made them, because you got caught out making utter bullshit up.*


----------



## fevgatos (Apr 19, 2022)

Mussels said:


> You didnt mention any chipsets, i'm happy to point out that you're moving the goalposts and changing topics instead of actually backing up your comments.
> "Intel has better mobos with more features!.... but i didnt name any specific motherboards or features so i dont have to provide any facts to back this up at all!"
> 
> 
> ...


Do I need to repeat myself? The user I was replying to has a 2700X. You think he is rocking an X570 or an X570S?

I don't know what you are talking about; Z170 supported Kaby Lake, Z370 supported Coffee Lake Refresh, Z490 supported Rocket Lake. What track record are you talking about?

PS: I have a B550 Aorus Master and a 3700X.


----------



## Taraquin (Apr 19, 2022)

tussinman said:


> The problem is the 5700/5800 are alot cheaper and perform within 10% of it. In the US for example the 5800x 3d is exactly 1.5x the price of the 5700 (and that's assuming the 5800x3d doesn't get price gouged by scalpers or retailers) but the 5700 is within 5-8% of it in performance in most real world gaming scenarios.
> 
> Even the demographic it's targeting are more likely to get the better priced alternatives instead (people that have been sitting on 2018/2019 era Ryzens are way more likely to save the extra money)
> 
> ...


No. Coming from a 12400F+B660 owner: don't buy locked Alder Lakes and expect RAM to run at 3600. SA voltage is locked below 1.0 V, meaning you may not even be able to run 3200 XMP in Gear 1 (a guy I talked to didn't get over 2900 XMP due to a 0.895 V SA lock). I was very lucky and can do 3600 Gear 1, but many are not so fortunate. The 12600KF is a much better deal, since you can run RAM at 4000-4300 Gear 1 thanks to the SA voltage being unlocked.



fevgatos said:


> I wish that was true. Its not. 12700f+b660 bazooka+ stock cooler cost the same as the 3d alone. Of course the 3d still loses massively in st and mt performance


But remember that you may not even get your RAM to run at 3200 XMP in Gear 1, since SA voltage is locked. It depends on luck. A 5800X3D would have no issue running 3800 in most cases, while your luck decides where you end up with a locked Alder Lake. One unlucky guy I talked to could run RAM at 2900 XMP at most, since his MSI B660+12400F ran the SA voltage at 0.895 V. I was lucky and can do 3600 Gear 1 with an SA of 0.95 V on my 12400F+B660, but most can't.

I would rather buy a 12600KF+B660, as you can run RAM at 4000-4300 Gear 1. In games I'm pretty sure that combo will easily outperform the 12700F due to the RAM speed, and in some apps too.
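For anyone not familiar with the Gear terminology above: Gear 1 runs the memory controller 1:1 with the memory clock, Gear 2 at 1:2. A tiny illustrative sketch (the function name and example rates are mine, not from any spec):

```python
# Illustrative only: relate a DDR4 transfer rate to the memory-controller
# clock under Intel's Gear 1 (1:1) and Gear 2 (1:2) modes.

def controller_clock_mhz(ddr_rate: float, gear: int) -> float:
    """Memory-controller clock for a DDR transfer rate (MT/s) and gear mode."""
    mclk = ddr_rate / 2   # DDR transfers twice per memory clock
    return mclk / gear    # Gear 1 divides by 1, Gear 2 by 2

print(controller_clock_mhz(3600, 1))  # DDR4-3600 Gear 1 -> 1800.0
print(controller_clock_mhz(4300, 2))  # DDR4-4300 Gear 2 -> 1075.0
```

That's why 3600-4300 in Gear 1 matters for games: the controller stays fast and latency stays low.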


----------



## usul1978 (Apr 19, 2022)

Taraquin said:


> 5800x3D would have no issue running 3800 in most cases


Fingers crossed! I have some aggressive RAM settings on my 5600x and fear I won't be able to keep them if I get a 5800x3D.


----------



## Melvis (Apr 19, 2022)

fevgatos said:


> Buy from somewhere else then? I mean you realize that this shop has the 5600x for 340, which means the 5800x 3d will be 700+, so the argument still stands, right? RIGHT?


From somewhere else? They're basically the best and cheapest seller in Aus... and no, the 5800X is under $500 and the 5900X is under $600, so it will be priced between those two and still end up cheaper! Right, right?


tussinman said:


> The problem is the 5700/5800 are alot cheaper and perform within 10% of it. In the US for example the 5800x 3d is exactly 1.5x the price of the 5700 (and that's assuming the 5800x3d doesn't get price gouged by scalpers or retailers) but the 5700 is within 5-8% of it in performance in most real world gaming scenarios.
> 
> Even the demographic it's targeting are more likely to get the better priced alternatives instead (people that have been sitting on 2018/2019 era Ryzens are way more likely to save the extra money)


It doesn't matter what the price to performance is compared to the 5700/5800; the point is it's there as a drop-in upgrade for anyone on the AM4 platform, which is a lot of people over the past 4-5 years. Yes, the others might be cheaper, blah blah blah, but if you want a much faster gaming CPU without upgrading the entire platform, then there it is!


tussinman said:


> Actually not true. Most benchmarks show DDR4 3600 either equal, within a few percent, or actually faster than DDR5 (that's the whole controversy with DDR5, not worth the cost). If your doing a complete new build then 12700 w/B660 and DDR4 is the best value right now. 12400/12600 isn't a bad temp plan if you have desire to swap out a 13th gen CPU later


Again, you're missing the point: this isn't for new builds, never was. This is for people already rocking an AM4 platform, all the way back to the B350/X370 mobos and up. So if anyone wants to compare the two, you MUST compare them with the Intel build including a motherboard, possibly DDR5, a cooler if the CPU doesn't come with one, and W11. Then, and only then, can you compare the two, and the 5800X3D wins, as pointed out many times before, for pure gaming.

And the 12900K is faster than the 12700, right?


----------



## btk2k2 (Apr 19, 2022)

fevgatos said:


> I wish that was true. Its not. 12700f+b660 bazooka+ stock cooler cost the same as the 3d alone. Of course the 3d still loses massively in st and mt performance



Prove it.

In the UK it does not. In the US it does not. Does it at MindFactory or other EU retailers?

As someone else mentioned the 12700F has e-cores so you do really need Win 11 to ensure it does not screw up in the odd game here and there.

Even still 5800X3D + B550 Bazooka is in the region of £530. 12700F + B660 Bazooka is in the region of £460. So for both parts (assuming the user already has DDR4, PSU etc etc) that is a 15% difference in price for 10-15% more performance.

For someone doing a full system build, the relative cost is closer to 5% more for the 5800X3D build, making it better value than the 12700F build if the user just wants to game / YouTube / argue on forums. For productivity, the 5900X is cheaper than the 5800X3D and faster than the 12700F, but the 12700F gives you a better balance of gaming and productivity performance, so it would be a good choice too, depending on a user's individual weighting of their use cases.
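To make the arithmetic above concrete, a rough sketch using the ballpark UK figures from this post (the 12% midpoint is my assumption for the quoted 10-15% gap):

```python
# Back-of-the-envelope price-per-performance using the figures above.
def price_per_perf(price_gbp: float, relative_perf: float) -> float:
    """Pounds paid per unit of relative gaming performance."""
    return price_gbp / relative_perf

intel = price_per_perf(460, 1.00)  # 12700F + B660 Bazooka, baseline perf
amd = price_per_perf(530, 1.12)    # 5800X3D + B550 Bazooka, ~12% faster

print(round(intel), round(amd))  # 460 473 -> roughly a wash at CPU+board level
```

At the CPU+board level the two land within a few percent of each other, which is why the rest of the system cost decides the comparison.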



tussinman said:


> The problem is the 5700/5800 are alot cheaper and perform within 10% of it. In the US for example the 5800x 3d is exactly 1.5x the price of the 5700 (and that's assuming the 5800x3d doesn't get price gouged by scalpers or retailers) but the 5700 is within 5-8% of it in performance in most real world gaming scenarios.
> 
> Even the demographic it's targeting are more likely to get the better priced alternatives instead (people that have been sitting on 2018/2019 era Ryzens are way more likely to save the extra money)
> 
> ...



The 5800X3D is 20% ahead of the 5800X on average and in some games it is over 30% faster with a few going into the 40%+ range. If you play CS:GO then maybe don't bother but if you play Anno or Flight Sim or Assetto Corsa or F1 or BF:V or Factorio or Kingdom Come or RE:Village etc then you are seeing some really impressive gains over the 5800X.


----------



## The King (Apr 19, 2022)

I don't agree with this story about the 5800X3D being good value for old builds.

Here in India, where prices are generally high, you can buy an MSI Pro Z690 mobo and a 12600K for less than what the 5800X3D goes for.
I'm sure in the US and UK you could get better deals.

This "5800X3D is for old builds" argument does not fly with me. It's overpriced, and I fully expect mid-range Zen 4 CPUs to outperform this CPU when they are released.

So it's a fail in my book from AMD, even though it does beat the 12900K in gaming and costs less than that CPU.

It would be cheaper for me to buy the MSI Z690 Pro DDR4 mobo and a 12600K than to buy the 5800X3D for my existing B450 setup.

If I sell my B450 setup, I could upgrade to a 12700K or even a 12900K. So no value for old AM4 users as far as I am concerned.


----------



## Taraquin (Apr 19, 2022)

Where I live, 5800X3D+B550 Bazooka costs 700 USD; 12700F+B660 Bazooka costs 600 USD. The 5800X3D will be superior in gaming at stock, and even more so thanks to far better RAM overclocking if you do that. The 12700F will be far better for most productivity.



The King said:


> I don't agree with this story about old value build for the 5800X3D.
> 
> Here in India where prices are generally high. You can buy a MSi Pro Z690 mobo and a 12600K for less than what the 5800X3D goes for.
> I'm sure in the US and UK you could get better deals.
> ...


It depends on where you live. The 5800X3D will be faster than the 12600K in most games, but if you have a good B450, that is a viable option, since you'd most likely need a new cooler for Z690 (some makers can ship/sell brackets, though). Z690+12600K plus ~50 USD for a cooler makes the 5800X3D a bit more interesting; what you can sell your current mobo/CPU for also matters.

Personally I would buy Z690+12600KF for a new setup, but the 5800X3D can be interesting as an upgrade.


----------



## BHS1975 (Apr 19, 2022)

Anyone know what time the 3D releases?


----------



## gffermari (Apr 19, 2022)

Tomorrow is the release date.


----------



## The King (Apr 19, 2022)

Taraquin said:


> Where I live 5800X3D+B550 Bazooka costs 700usd. 12700f+B660 Bazooka costs 600usd. 5800X3D will be superior in gaming stock and even more due to far better ram overclocking if you fo that. 12700f will be far better for most productivity.
> 
> 
> It depends where you live. 5800X3D will be faster than 12600K in most games, but if you have a good B450 that is a viable option since you most likely need new cooler for Z690 (some can ship/sell brackets though). Z690+12600K+50usd+cooler makes 5800X3D a bit more interesting, what you can sell you current mobo/cpu for also matters.
> ...


While it's true the 5800X3D will be faster than the 12600K, the 12600K still has 93.6% of its gaming performance at 720p and 95.6% at 1080p. Most people will not notice this difference in games.
People running older setups are hardly pushing 1440p 144 Hz, but even at that level the 12600K is still at 96.4% of the 5800X3D's gaming performance.

12600K              Rs-24780 (330 USD)
MSI Z690 Pro     Rs-17795 (237 USD)
Total                  Rs-42575 (567 USD)

5800X3D - Rs-44250 (590 USD)

Also, the 5900X is cheaper, has 20% better raw CPU performance, and is not that far off the 12600K in gaming performance.

Here the 12600KF is Rs-23999 (320 USD).

I'm actually leaning towards no upgrade, even to ADL, and would rather go for Zen 4. I think this is the best option for me.
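The percentages quoted above translate into absolute frame rates like this (a sketch; the 100 fps baseline is arbitrary, only the percentages come from the post):

```python
# Scale a hypothetical 5800X3D baseline by the quoted relative performance
# to see the absolute gap at each resolution.
baseline_fps = 100.0  # arbitrary 5800X3D result, for illustration only

for res, pct in [("720p", 93.6), ("1080p", 95.6), ("1440p", 96.4)]:
    fps_12600k = baseline_fps * pct / 100
    print(f"{res}: 12600K ~{fps_12600k:.1f} fps vs 5800X3D {baseline_fps:.0f} fps")
```

A 4-6 fps gap on a 100 fps baseline is the "most people will not notice" argument in numbers.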


----------



## Chomiq (Apr 19, 2022)

BHS1975 said:


> Anyone know what time the 3D releases?


4:20, 3PM GMT+1.


----------



## Deleted member 24505 (Apr 19, 2022)

The King said:


> While its true the 5800X3D will be faster than the 12600K the 12600K still has 93.6% the gaming performance at 720p and 95.6% at 1080p. Most people will not notice this difference in games.
> People running older setups hardly pushing 1440p 144Hz but at the level the 12600K is still 96.4% gaming performance when compared the 5800X3D.
> 
> 12600K              Rs-24780 (330USD)
> ...



Since I already have a fine 12700K/Z690, the X3D is kind of moot for me. Though I am interested to see what AM5 turns out like. If it is a whopper, I might be persuaded to jump back to AMD.


----------



## Totally (Apr 19, 2022)

Mussels said:


> Not just for you, but to everyone else:
> 
> Launch prices are always weird. The new products are in demand and priced higher than the older discounted products - give it 6 months and they'll make more sense.
> Early adopter tax.



That's under the assumption that AMD plans to produce the X3D in quantities that meet demand. We know this is merely a test run and preview for them before they apply it to Zen 4, which should be out 6 months from now.


----------



## Count von Schwalbe (Apr 19, 2022)

Totally said:


> That's under the assumption that AMD plans to produce the X3D in quantities that meet demand. We know for the this merely a test run for them and preview for them before they apply it to zen 4 which should be out 6 months from now.


It looks like this was positioned as a niche product - only worth it if you have a decent compatible motherboard and an older/low-end CPU. Not worth it for new builds. If that is how AMD was thinking when setting production and pricing, which seems reasonably likely, then I would expect production volume to be comparatively low. 

While we are here: I built a system using a B550 and a Ryzen 5 3600. A 5800X3D vs a 12600K + B660 at near feature parity is equivalent in price. I would go the upgrade route myself.


----------



## Totally (Apr 19, 2022)

Count von Schwalbe said:


> It looks like this was positioned as a niche product - only worth it if you have a decent compatible motherboard and an older/low-end CPU. Not worth it for new builds. If that is how AMD was thinking when setting production and pricing, which seems reasonably likely, then I would expect production volume to be comparatively low.
> 
> While we are here, I built a system using a B550 and a Ryzen 5 3600. 5800X3D vs 12600K + B660 and near parity on features is equivalent in price. I would go the upgrade route myself.



What would be the reason you'd pass over the 5600X‽ 90% of the performance at 50% of the cost vs the X3D in this scenario?


----------



## tussinman (Apr 20, 2022)

Melvis said:


> *It doesnt matter what the price to performance is compared to the 5700/5800 *the point is its there as a drop in upgrade for anyone on the AM4 platform which is alot! in the past 4-5yrs. Yes the others might be cheaper bla bla bla but if you want a very faster gaming CPU without upgrading the in tire platform then there it is!


No, what I said was valid, especially since we're talking about a more casual buying group.

My point was I'm not buying this whole "oh, this product is for 2018/2019-era Ryzen owners who are money conscious, they'll drop $450+ like it's nothing" narrative that's getting thrown around.

It's way more logical that they either get most of the performance for a lot less money (5700X), or, if they haven't upgraded to the Ryzen 5000 series even though it's 1.5 years old at this point, they're likely to just wait for Zen 4.


Melvis said:


> Again your missing the point this isnt for new builds, never was, this is for people already rocking a AM4 platform all the way back to the B350/X370 Mobos


This whole thread started with people mocking the 12900 and saying how the 5800X3D is better for a new build. Then the moment the 12700K got thrown into the mix (almost half the price of the 12900 and $100 cheaper than the 5800X3D, but 97% as good as both), the narrative magically became "oh no, this chip is for those long-term Ryzen owners that haven't upgraded in years".

This is an enthusiast product. I'm not buying this baloney of people trying to play both sides (basically claiming this chip has enthusiast price/performance but is being targeted at a more casual/mainstream buyer).


----------



## FreedomEclipse (Apr 20, 2022)

Oc'd to 5.1ghz on an X570 board


----------



## fevgatos (Apr 20, 2022)

tussinman said:


> No what I said was valid, especially since where talking about a more casual buying group
> 
> My point was i'm not buying this whole "oh this product is for 2018/2019 era ryzen owners who are money conscious, they'll drop $450+ like it's nothing" narrative that's getting thrown around.
> 
> ...


Of course it's nonsense. No one stuck with a 5-year-old CPU and mobo would be interested in a 450 euro CPU, which, btw, in order to even make a difference over a 200 euro CPU, needs a 1.5k graphics card... It's an enthusiast product at a ridiculous price; at 250 to 280 euros it would be pretty good.

Just for clarification: if Intel released a product with exactly similar performance to the 3D, it would cost below 300. It's basically a locked 12700 with no iGPU and no e-cores (so low multithread performance) and a low single-core turbo (so low single-thread performance), but because the e-cores are off it boosts to 4.6 and manages 5% better in gaming. That CPU would cost less than a 12700F, which goes for around 320 euros. So yeah..



btk2k2 said:


> Prove it.
> 
> In the UK it does not. In the US it does not. Does it at MindFactory or other EU retailers?
> 
> ...


Instead of asking me to prove it, you could check yourself. Yes, around the EU (I checked lots of places) it goes for between 316 and 340 euros. Funnily enough, after the 3D benchmark leaks it started rising in price, LOL. People realised how much better value the 12700F is and bought it like crazy.


----------



## Deleted member 24505 (Apr 20, 2022)

FreedomEclipse said:


> Oc'd to 5.1ghz on an X570 board


It's worth noting that this was achieved with a single 8 GB stick of RAM running at 1205 MHz, showing that this isn't an overclock you'd want to use on a daily basis.


----------



## Count von Schwalbe (Apr 20, 2022)

Totally said:


> What would be the reason you'd pass over the 5600X‽ 90% the performance, 50% cost vs x3d in this scenario?


Already have one in another system, which would be used for non-gaming stuff. I wasn't saying that I was planning on it, just pointing out that if I had a board and wanted ultimate gaming performance only, it would make more sense to go with the X3D than an Intel product.

Many people were hit by the shortage of Ryzen CPUs back when Intel had nothing comparable (the whole reason I bought a 3600). I can see lots of people upgrading from whatever they could get their hands on to one of these if they are strictly gamers.


----------



## Shatun_Bear (Apr 20, 2022)

fevgatos said:


> Do i need to repeat myself? The user I was replying to has a 2700x. You think he is rocking an x570 or an x570s?
> 
> I dont know what you are talking about, z170 support kaby, z370 supported coffeelake refresh, z490 supported rocketlake. What track record are you talking about
> 
> Ps1. I have a b550 aorus master and a 3700x



Jeez, I left this review forum for several days and I return to still see you on your Intel defence crusade


----------



## dont whant to set it"' (Apr 20, 2022)

Nabbed one. Expected delivery tomorrow.


----------



## Chomiq (Apr 20, 2022)

dont whant to set it' said:


> Nabbed one. Expected delivery tomorrow.


Same price in Poland after conversion. Granted it's sold by x-kom which likes to hike the prices.


----------



## QuietBob (Apr 20, 2022)

dont whant to set it' said:


> Nabbed one. Expected delivery tomorrow.


Also got one


----------



## 529th (Apr 20, 2022)

Leaving to pick mine up now


----------



## xorbe (Apr 20, 2022)

There seems to be plenty online in the USA with just NE and AMD so far (presumably soon B&H and Amazon and BB too).  RIP to the scalpers on eBay asking $650-850.


----------



## BHS1975 (Apr 20, 2022)

xorbe said:


> There seems to be plenty online in the USA with just NE and AMD so far (presumably soon B&H and Amazon and BB too).  RIP to the scalpers on eBay asking $650-850.


When will they drop on Amazon?


----------



## gffermari (Apr 20, 2022)

410£ the 5800X3D
495£ the 5950X
360£ the 5900X

it’s still too difficult to decide…


----------



## Deleted member 24505 (Apr 20, 2022)

gffermari said:


> 410£ the 5800X3D
> 495£ the 5950X
> 360£ the 5900X
> 
> it’s still too difficult to decide…



if you have money to burn 5800X3D if not sit on your current and wait till next gen.


----------



## xorbe (Apr 20, 2022)

BHS1975 said:


> When will they drop on Amazon?



Beats me, but I would check at 8AM and 9AM Pacific, etc.



gffermari said:


> 410£ the 5800X3D
> 495£ the 5950X
> 360£ the 5900X
> 
> it’s still too difficult to decide…



5950X, 5700X -> the sane choices imho, either all the threads/perf, or 65w is my reasoning
5900X -> if the 5950X costs too much but you really do need more than 16 threads
5800X3D -> if it really hits your specific use case, or you're a cpu hobbyist person


----------



## Chomiq (Apr 20, 2022)

€519 in mindfactory




€586 at amazon.de, nuts!


----------



## xorbe (Apr 20, 2022)

Weird, it's after 9AM Pacific and never showed at Amazon / B&H / BB.  And out of stock at NE and AMD now.  One of the eBay listings sold ...



BHS1975 said:


> When will they drop on Amazon?



Edit: Your time has come, it's on Amazon.  (B&H is closed until Sunday for Passover.)


----------



## Chomiq (Apr 21, 2022)

Price bumped up by €20 within a day at x-kom in Poland:


----------



## Dr. Dro (Apr 21, 2022)

TheoneandonlyMrK said:


> Yes they are the first to offer a commercial CPU with such a big L3 cache as you also say, ,, L4 off die isn't L3 on top of die



I agree, but honestly, this processor perfects what the Core i7-5775C attempted to do seven years ago. If you look at it, the reasoning is sound. I owned one for a while and it was a pretty great CPU; shame my Z97 board died and replacements cost more than that platform is worth, so I flipped the chip. If you look at what it is, a low-power, low-TDP quad-core with a small L3, doing what it does... funnily enough, Ian ran a test for AnandTech back in 2020 which covered many games still in the bench suite today, and these are the same games that show exceptional performance on the 5800X3D today. Was it worth it back then? I would argue no... is it worth it today? I think it still isn't, for the price being asked. But it is the way forward, and once the packaging technology advances enough we should see this deployed throughout a full stack, on both companies' offerings.





and Borderlands 3:

It is really worth reading, I'll leave the link here:

A Broadwell Retrospective Review in 2020: Is eDRAM Still Worth It?
www.anandtech.com


----------



## TheoneandonlyMrK (Apr 21, 2022)

Dr. Dro said:


> I agree, but honestly, this processor perfects what the Core i7-5775C attempted to do seven years ago. If you look at it, the reasoning is sound. I owned one for a while and it was a pretty great CPU, shame my Z97 board died and they cost more than that platform is worth it, so I flipped the chip. If you look at what it is, a low-power, low-TDP quad-core with a small L3, doing what it does... funnily enough Ian made a test for Anandtech back in 2020 which covered many games still in the bench suite today, and these are the same games that show exceptional performance on the 5800X3D today. Was it worth it back then? I would argue no... is it worth it today? I think it still isn't, for the price being asked. But it is the way forward, and once the packaging technology advances enough we should see this deployed throughout a full stack, on both companies' offerings.
> 
> 
> 
> ...


I do appreciate your points on the similarity of its L4 cache to this L3 vertical extension, and I agree with most of them.
I just think calling this vertically stacked L3 a rehash of a 2.5D equivalent is beyond ridiculous.
It's a first for consumers, though I certainly agree a wider range (and OC, or at least PBO) would have been better for consumers than one SKU.

It's an interesting chip to mention while arguing against people upgrading to this, though, since Intel likewise didn't spread their cache love far either; perhaps an expensive endeavour.


----------



## Richards (Apr 22, 2022)

Dr. Dro said:


> I agree, but honestly, this processor perfects what the Core i7-5775C attempted to do seven years ago. If you look at it, the reasoning is sound. I owned one for a while and it was a pretty great CPU, shame my Z97 board died and they cost more than that platform is worth it, so I flipped the chip. If you look at what it is, a low-power, low-TDP quad-core with a small L3, doing what it does... funnily enough Ian made a test for Anandtech back in 2020 which covered many games still in the bench suite today, and these are the same games that show exceptional performance on the 5800X3D today. Was it worth it back then? I would argue no... is it worth it today? I think it still isn't, for the price being asked. But it is the way forward, and once the packaging technology advances enough we should see this deployed throughout a full stack, on both companies' offerings.
> 
> 
> 
> ...


The PS5 and Xbox need this 3D V-Cache. AMD needs this on laptops and iGPUs for bandwidth.


----------



## z1n0x (Apr 22, 2022)

AMD Ryzen 7 5800X3D - the only (and last) fighter of its kind as a perfect and very efficient upgrade | igor'sLAB

To say it in advance: The Ryzen 7 5800X3D is AMD's almost condescending gesture of nonchalance to make use of an Epyc chiplet ("Milan") with 3D-V cache as a "secondary purpose", pack it into a normal…

www.igorslab.de


----------



## QuietBob (Apr 22, 2022)

z1n0x said:


> AMD Ryzen 7 5800X3D - the only (and last) fighter of its kind as a perfect and very efficient upgrade | igor'sLAB
> 
> 
> To say it in advance: The Ryzen 7 5800X3D is AMD's almost condescending gesture of nonchalance to make use of an Epyc chiplet ("Milan") with 3D-V cache as a "secondary purpose", pack it into a normal…
> ...


Thanks for sharing! The two most relevant graphs for each game IMO are *Frame Time Variance* and *Average CPU Watts by FPS* @ 1080p. In both cases lower values are better. The lower the variance, the smoother the overall gameplay. As for the other metric - lower power consumption for the same fps means better efficiency.


----------



## The King (Apr 22, 2022)

Those not running a higher-end GPU like the RTX 3080 used in the review can expect much smaller gaming improvements from the 5800X3D @ 1080p.









Maybe @W1zzard  should also test with multiple GPUs or do a GPU scaling review with the 5800X3D


----------



## Dr. Dro (Apr 22, 2022)

Richards said:


> Ps5 and xbox need this 3d v-cache.. amd needs this on laptops and igpus for bandwidth



The Xbox One had an eSRAM cache sized around 32 MB. It is still present on the Xbox One S, was removed on the Xbox One X, and the PlayStation 4 never had it. Neither of the current generation consoles (XSS, XSX and PS5) have a high-speed buffer cache and this is unlikely to return to consoles any time soon. Regarding APUs, I believe it was considered and then disregarded because the cost of the processor would be fairly high for the market segment that these chips are intended to service, creating a niche that would not be worth it for AMD.



TheoneandonlyMrK said:


> I do appreciate your points on the similarity of it's L4 cache with this L3 vertical extension.
> And agree with most.
> I just think calling this, vertically stacked L3 a re hash of a 2.5D equivalent is beyond ridiculous.
> It's a first for consumers, though I certainly agree a wider range (and Oc or at least PBO) would have been better than one SKU, for consumers.
> ...



No doubt about it, the packaging technology is far more advanced, and this is also more efficient because it's a level 3 cache, completely transparent not only to the OS but to the processor internally. Anything that could even remotely benefit from data being closer to the cores will see a benefit with the X3D. It's a shame AMD decided not to refresh the Ryzen 9 lineup; I would sell my 5950X and buy a 5950X3D without thinking twice, but the 5800X3D is just not worth it for me.


----------



## Count von Schwalbe (Apr 22, 2022)

Dr. Dro said:


> No doubt about it, the packaging technology is far more advanced and also this is more efficient because it is a level 3 cache and completely transparent not only to the OS, but to the processor internally, so anything that could even remotely benefit from data being closer to the processor cores in any manner will see a benefit with the X3D. It's a shame AMD decided not to refresh the Ryzen 9 lineup, I would sell my 5950X and buy a 5950X3D without thinking twice, but the 5800X3D is just not worth it for me.


EPYC 7373-X


----------



## Dr. Dro (Apr 22, 2022)

Count von Schwalbe said:


> EPYC 7373-X



Yes... kind of like a 7373X but for AM4. And for about 4 thousand dollars less, given that chip sells for $4600.


----------



## Count von Schwalbe (Apr 22, 2022)

Dr. Dro said:


> Yes... kind of like a 7373X but for AM4. And for about 4 thousand dollars less, given that chip sells for $4600.


And with 2x CCD instead of 8? 105W instead of 240W? Seriously though, it would "only" have the same 96MB of L3 cache as it has to be duplicated across chiplets. Or, using the trick they used for the 7373X, you could take 4x CCD and disable half of the cores each - which I doubt they could do within the constraints of AM4. I really couldn't see them selling a 96MB version for less than $1k and a 192MB version for less than $1300. Would kind of take the shine off of them, value-wise.


----------



## Dr. Dro (Apr 22, 2022)

Count von Schwalbe said:


> And with 2x CCD instead of 8? 105W instead of 240W? Seriously though, it would "only" have the same 96MB of L3 cache as it has to be duplicated across chiplets. Or, using the trick they used for the 7373X, you could take 4x CCD and disable half of the cores each - which I doubt they could do within the constraints of AM4. I really couldn't see them selling a 96MB version for less than $1k and a 192MB version for less than $1300. Would kind of take the shine off of them, value-wise.



I mean, the 5950X has two fully enabled "5800X dies" (but with low leakage); a 5950X3D simply needs two "5800X3D dies" in it (again, lower-leakage versions). For its intended usage, that is really all it needs to do to achieve the same goal, eh? It's a desktop processor. The 3D cache layer is a 64 MB addition; the 5800X3D has 96 MB (32 + 64 3D slice), so a 5950X3D would have 192 MB ((32 + 64 3D slice) * 2), with 96 MB of L3 per CCX/CCD.


----------



## Count von Schwalbe (Apr 23, 2022)

Dr. Dro said:


> I mean, the 5950X has two fully enabled "5800X dies" (but with low leakage), the 5950X3D simply needs to have two "5800X3D dies" in it (again, lower leakage versions). For its intended usage, that is really all it needs to do to achieve the same goal, eh? It's a desktop processor. The 3D cache layer is a 64 MB addition, the 5800X3D has 96 MB (32 + 64 3D slice), so a 5950X3D would have 192 MB (32 + 64 3D slice * 2), with 96MB L3/CCX/D.


Correct me if I am wrong, but I thought L3 across 2 chiplets had to be duplicated for cache coherency, meaning you cannot simply add up the L3?


----------



## Dr. Dro (Apr 23, 2022)

Count von Schwalbe said:


> Correct me if I am wrong, but I thought L3 across 2 chiplets had to be duplicated for core cohesion, meaning you cannot simply add up L3?



No, they are independent and fully usable, though this is not without certain drawbacks. In Zen 2 and Zen 3, L3 cache slices are tied up to a core complex (CCX), and while data can be accessed between CCXs, doing so incurs an access latency penalty.

Zen 2 had two CCXs per CCD (die), and Zen 3 streamlined this to have one CCX per CCD, as it doubled the amount of cores and associated L3 per CCX. The magic of the 5800X3D is that it is a single CCD design, so it turns out to be a very straightforward setup that won't incur the inter-CCD and inter-CCX penalties because it only has one of each.

R9 3950X: (4 cores + 16 MB L3) per CCX, 2 CCXs per CCD, 2 CCDs. You can see this was not the most efficient topology; imagine a thread on CCX4/CCD2 trying to access something cached on CCX1/CCD1.
R9 5950X: (8 cores + 32 MB L3) per CCD, 2 CCDs. Far more efficient, as few tasks ever need more than 8 cores or 32 MB of cache, so it usually manages with far fewer issues.
R7 5800X3D: 8 cores + 96 MB L3 in a single CCD. Self-explanatory; the processor can fully utilize its resources with maximum efficiency.

An eventual 5950X3D would behave much like the 5950X, except that each CCD/CCX would get the full benefit of 96 MB of L3 (just like the 5800X3D), enabling very large data sets.
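For anyone following along, here is a minimal sketch of how those totals add up. The per-CCD figures (32 MB base L3, 64 MB V-Cache slice) are the ones discussed in this thread; the helper itself is just my own illustration, not anything official.

```python
# Toy arithmetic for the configurations discussed above; the per-CCD
# figures (32 MB base L3, 64 MB 3D V-Cache slice) come from the thread,
# the helper itself is purely illustrative.
def total_l3_mb(ccds: int, vcache: bool) -> int:
    base, slice_3d = 32, 64
    return ccds * (base + (slice_3d if vcache else 0))

configs = {
    "R7 5800X":              total_l3_mb(ccds=1, vcache=False),  # 32 MB
    "R7 5800X3D":            total_l3_mb(ccds=1, vcache=True),   # 96 MB
    "R9 5950X":              total_l3_mb(ccds=2, vcache=False),  # 64 MB
    "5950X3D (theoretical)": total_l3_mb(ccds=2, vcache=True),   # 192 MB
}
for name, mb in configs.items():
    print(f"{name}: {mb} MB L3 total")
```

Keep in mind the caveat above: a core only has fast access to its own CCX's slice, so the totals are not one unified pool.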


----------



## InVasMani (Apr 23, 2022)

AMD could cut the on-die L3 cache in half, 3D-stack it to compensate, and repurpose the freed die area, even in tandem with a 6 nm die shrink. If they wanted, they could probably fit a 10- to 12-core single-CCD chip that way. I figure they could potentially have a 20-core 5975X3D with the same L3 size as the 5950X but four more cores, which also adds L1/L2 cache (of more importance and usefulness) along with the extra cores. This 3D-stacked cache was good if for no other reason than to prototype and explore its effectiveness.


----------



## Count von Schwalbe (Apr 23, 2022)

Dr. Dro said:


> No, they are independent and fully usable, though this is not without certain drawbacks. In Zen 2 and Zen 3, L3 cache slices are tied up to a core complex (CCX), and while data can be accessed between CCXs, doing so incurs an access latency penalty.


Oh, I see. I thought I had read the opposite somewhere.

Edit: wouldn't the higher L3 latency and the reduced clocks make it a bit of a step backwards, for applications that use more than the 8 cores of the 5800X3D?


----------



## gffermari (Apr 23, 2022)

It’s not very likely for AMD to release the 5900X3D (192MB cache), that would be kind of vandalism on their own next platform, but it would be great to close the AM4 era with something like that.


----------



## Dr. Dro (Apr 23, 2022)

Count von Schwalbe said:


> Oh, I see. I thought I had read the opposite somewhere.
> 
> Edit: wouldn't the higher L3 latency and the reduced clocks make it a bit of a step backwards, for applications that use more than the 8 cores of the 5800X3D?



Well, as the 5800X3D has shown, any losses from access latency are easily offset and overcome by the benefits of the larger cache, so I don't think so. The same "limitations" of the 5950X's design would certainly apply, though it would be more of a "maximum efficiency" issue, e.g. it could perhaps be even faster. It would be a royal processor, mate...



gffermari said:


> View attachment 244625
> 
> It’s not very likely for AMD to release the 5900X3D (192MB cache), that would be kind of vandalism on their own next platform, but it would be great to close the AM4 era with something like that.



Yeah I believe that ES was a one-off and they decided not to release a 12- or 16- core SKU, unfortunately.


----------



## InVasMani (Apr 23, 2022)

Shame they didn't cut the L3 cache in half and then 3D-stack it. If you look at the die shot, it seems you could fit another 4 cores per CCD, for a potential single-CCD 12-core part, or a 24-core part to replace the 5950X.


----------



## ratirt (Apr 23, 2022)

Oh boy. I read the posts but I could not read them all. The drama and butt-hurt of some people here; it's like a soap opera, with their arguments and their problems with the gaming results of the 5800X3D.

Nice halo product, a cherry on top. Awesome performance considering it's only a bigger cache, at even slightly lower clocks. I actually didn't expect that much and thought AMD had stretched the truth a bit, but it would seem they didn't. The CPU performs pretty well. It seems the clocks are not that important; good to have, but the cache does the trick. It's a nice showcase of how much cache capacity matters. It's hard to imagine how much more clock frequency you would need to achieve this.


----------



## Mussels (Apr 23, 2022)

Well, hellooo

Wccftech Reader Tunes His AMD Ryzen 7 5800X3D Into an Efficiency Monster With Undervolting: Same Performance at 1V, 57W Peak Power at Sub-1V


"At 1V, Shaun states that he started seeing performance regression but one interesting aspect was that the performance itself didn't take a huge hit. The power and temperatures saw a huge fall. It was stated that at 1V (4.4 GHz all-core), the CPU peaked at 43C in Cinebench whereas it peaked at 80C in the same benchmark when running at stock. The power consumption was rated at 73W."
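Those numbers line up roughly with the usual first-order dynamic-power model, P ∝ f·V². A quick sketch; note the stock voltage and clock below are my own assumptions for the sake of the estimate, not figures from the article:

```python
# First-order dynamic power scaling: P ~ f * V^2 (leakage ignored).
# The stock figures below are assumed for illustration, not measured.
def scaled_power(p_stock_w: float, v_stock: float, f_stock: float,
                 v_new: float, f_new: float) -> float:
    return p_stock_w * (v_new / v_stock) ** 2 * (f_new / f_stock)

# Assume ~105 W at ~1.2 V / 4.45 GHz all-core at stock:
est = scaled_power(105, v_stock=1.2, f_stock=4.45, v_new=1.0, f_new=4.4)
print(f"estimated all-core power at 1.0 V: {est:.0f} W")  # ~72 W
```

That lands close to the reported 73 W, which is why undervolting pays off so disproportionately: power falls with the square of voltage while clocks barely move.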


----------



## gffermari (Apr 23, 2022)

Very interesting for SFF builds. In general, though, it's an enthusiast-class gaming CPU and the power consumption is not much of a selling point.
It only matters when it's ridiculous, like the KS; high-end products are meant to be powerful, not efficient.


----------



## Deleted member 24505 (Apr 23, 2022)

"Shaun is running a custom-loop cooling kit with a 420mm radiator and triple 140mm fans" - I'm sure the temps, even at 1 V, would not be the same with an air cooler. His cooling could be classed as high-end custom.

I'd like to see the temps with the same settings and an air cooler.

Nice CPU though, and if I did not have the ADL I would probably go for one. I do not regret the 12700K, though, as it is still a very good CPU, even though ADL has gotten loads of derision from AMD fans.


----------



## zx128k (Apr 23, 2022)

Basically anything that scales with cores and frequency will be worse on a 5800X3D compared to the 5800X. People will be posting BCLK overclocking and undervolting results, and that will then be "the new better performance". Ignore it; those are single samples and you can't make any judgements from them.

The 12900K/KS overclocked will be faster: better RAM, more cores/higher clocks, and higher DDR5 frequency with tightened timings. For example, the 5800X3D will likely max out at 1900 IF at most, which limits RAM frequency and performance (example: highest possible on a 5800X).

Next will be "the power is lower on the 5800X3D"; both gamers and overclockers don't care. An overclocked 5800X at 5 GHz will do about 100 W in gaming, and an Intel 12900K will do a little more with an overclock. Proof here: 5800X at 98 W peak. Power draw stock, AMD vs Intel; example of a 12900K using lower power in games. Note that most of the time it's below 100 W and below the 5950X; the two are close at times.

What the 5800X3D is great for is people who don't overclock and want to upgrade from, say, a 1000- or 3000-series CPU. They already have DDR4-3600 RAM, and this gives them a path to great gaming performance on an older motherboard that can't take the power draw of, say, a 5950X. I have an old 3800X with DDR4-3600 RAM on an X570 motherboard. I could put the 5800X3D in that system and change nothing else. That system would outperform my 10900K system at 1080p, but I am still GPU-limited at 4K (3DMark Time Spy on the 10900K and Time Spy on the 3800X).


----------



## QuietBob (Apr 23, 2022)

The King said:


> Those not running a higher end GPU like the RTX 3080 in the review can expect much lower gaming improvements from running the 5800X3D @1080p


That's something I'm planning to test myself with a 6600XT. However, I don't really care for maximum fps or 200+ averages in games. What I'm interested in is frame time consistency, 1% and 0.1% lows and overall energy efficiency.



Dr. Dro said:


> It's a shame AMD decided not to refresh the Ryzen 9 lineup, I would sell my 5950X and buy a 5950X3D without thinking twice


I'm almost sure we won't get another Ryzen3D on AM4. One reason is obviously the imminent release of Zen 4. And the gains we've seen in games owing to V-cache do not really translate to massively parallel workloads, such as rendering or encoding - which are primary use scenarios for Ryzen 9.


----------



## nicamarvin (Apr 23, 2022)

zx128k said:


> The 12900k/ks overclocked will be faster, better RAM, more cores/higher clocks and higher DDR5 ram frequency with tightened timings. Example 5800x3d will max out likely at 1900 IF at most, so this limits RAM frequency and performance. Example Highest posible on 5800x.


What are you on? The 12900KS is an OC version of the 12900K, and the 5800X3D is beating both of them even with 6400 MHz RAM...


----------



## zx128k (Apr 23, 2022)

nicamarvin said:


> What are you On? the 12900KS is an OC version of the 12900K and the 5800X3D it's beating both of them even with 6400Mhz RAM...


It's a better-binned 12900K, that is correct, but the 12900KS is its own product with higher clocks. Better-binned CPUs will normally reach higher clocks at lower vcore. There is no way a 5800X3D can beat an overclocked 12900KS with overclocked and tuned DDR5 memory. Look again at the AIDA64 score on the DDR5 RAM I posted. If you know how to tune RAM, you can go high with a 12900K system; most of the performance is RAM-side. The 5800X3D will cap out at its maximum IF frequency. The better you tune the RAM, the more copy bandwidth and the less latency you get, which is what the extra cache of the 5800X3D provides over a normal 5800X.

Take my 10900K system; look at the AIDA64 link and see what my RAM gets. Game: Shadow of the Tomb Raider, 1080p, highest, TAA. My 10900K does 225 fps with a 3080 Ti (380 W power limit). Here, same settings, a 3090 Ti and 5800X3D/DDR4-3800: 191 fps (scroll down to see the bar chart). 12900KS/DDR5-6400: 190 fps.

The higher the RAM frequency and the better the RAM tuning, the bigger the effect on performance in games.

Note: fixed the Tomb Raider benchmark, as the PC was in power-saving mode.


----------



## ThrashZone (Apr 23, 2022)

Hi,
The 12900KS was an $800 US suckers' release, dropping early just before the 5800X3D, so it's a typical Intel trolling release.


----------



## zx128k (Apr 23, 2022)

Some 12900KS CPUs can reach 5.7 GHz on the performance cores for two threads and 5.2 GHz for all P-cores. You can set E-cores to +3, settings as per this video. So yeah, Intel have the fastest overclocked CPU, but they make you pay for it. Call it trolling if you like, but some people will buy one, or even buy and bin thousands, then get a 3090 Ti and sell the rejected CPUs on eBay.

Also, my Shadow of the Tomb Raider result was run in power-saving mode.


----------



## kapone32 (Apr 24, 2022)

Why do I fear this CPU will go the way of the 3300X?


----------



## Chomiq (Apr 24, 2022)

kapone32 said:


> Why do I fear this CPU will go the way of the 3300X?


Yeah I'm starting to regret not grabbing it on launch. It's sold out everywhere now and it looks like there won't be any restock until 5/12, at least in EU (31st of May in UK).


----------



## Deleted member 24505 (Apr 24, 2022)

Get one off ebay for twice the price 



Chomiq said:


> Yeah I'm starting to regret not grabbing it on launch. It's sold out everywhere now and it looks like there won't be any restock until 5/12, at least in EU (31st of May in UK).



This is not a bad price really

https://www.ebay.co.uk/itm/125276622939?epid=9041991188&hash=item1d2b11905b:g:TdIAAOSwcJRhgQpn


----------



## QuietBob (Apr 24, 2022)

kapone32 said:


> Why do I fear this CPU will go the way of the 3300X?


It most likely will. It's a unicorn chip, just as the 3300X was. And just like the 3300X today, I believe it's going to stay relevant in games for a long time. The next shipment may be the last chance to grab it first hand, though probably at an inflated price.

Luckily, there are other good options, such as the 5700X/5800X/5900X or the 12700 from Intel. None of them is better suited to gaming than the 5800X3D, but they're priced lower, and so offer better value overall.

And of course we'll have Zen 4 and Raptor Lake in a few months.


----------



## Deleted member 24505 (Apr 24, 2022)

QuietBob said:


> It most likely will. It's a unicorn chip, just as the 3300X was. And just like the 3300X today, I believe it's going to stay relevant in games for a long time. The next shipment may be the last chance to grab it first hand, though probably at an inflated price.
> 
> Luckily, there are other good options, such as the 5700X/5800X/5900X or the 12700 from Intel. None of them are better suited to gaming than the 5800X3D, but they're priced lower, and so offer better value overall.
> 
> And of course we'll have Zen 4 and Raptor Lake in a few months.



All of them are pretty good for gaming, and better for everything else.


----------



## kapone32 (Apr 24, 2022)

QuietBob said:


> It most likely will. It's a unicorn chip, just as the 3300X was. And just like the 3300X today, I believe it's going to stay relevant in games for a long time. The next shipment may be the last chance to grab it first hand, though probably at an inflated price.
> 
> Luckily, there are other good options, such as the 5700X/5800X/5900X or the 12700 from Intel. None of them is better suited to gaming than the 5800X3D, but they're priced lower, and so offer better value overall.
> 
> And of course we'll have Zen 4 and Raptor Lake in a few months.


The thought process that I have for this chip is gaming. The PS5 and Xbox Series consoles run on closely related Zen silicon. As most games will be produced for one or the other, and as developers extract more and more performance from the consoles as the platform ages, this chip could indeed remain the best gaming CPU around. I was gobsmacked that even at $569.99 Canadian it sold out in one day.



Tigger said:


> All of them are pretty good for gaming, and better for everything else.


The 5000 series chips are all sweet. The 5900X is a beast of a CPU and stable as granite. It is actually cheaper than the 5800X3D but that will not matter. I love my 5950X because there is nothing I can do to make the CPU feel sluggish. The best thing about AMD though is the utter flexibility that all these chips have now that there is official support for X370/B350.


----------



## Dr. Dro (Apr 24, 2022)

ThrashZone said:


> Hi,
> 12900ks was a 800.us suckers release dropping early just before 5800x3d dropped so it's a typical intel trolling release.



I don't think Intel did this because they knew the 5800X3D would come, but simply due to a demand for a halo product in that market segment. The KS is a nicely pre-binned CPU, and just as I bought a 5950X, I would buy a 12900KS if I were building today.



QuietBob said:


> I'm almost sure we won't get another Ryzen3D on AM4. One reason is obviously the imminent release of Zen 4. And the gains we've seen in games owing to V-cache do not really translate to massively parallel workloads, such as rendering or encoding - which are primary use scenarios for Ryzen 9.



Gaming-wise you're probably right, but otherwise I don't think so. It's not that the results don't translate; it's that the resulting chip would be like the 1080 Ti, a bit too good a product that would cannibalize AMD's own sales of higher-end products in the future. The aforementioned EPYC 7373X, for example.


----------



## Count von Schwalbe (Apr 25, 2022)

Dr. Dro said:


> I don't think Intel did this because they knew the 5800X3D would come, but simply due to a demand for a halo product in that market segment. The KS is a nicely pre-binned CPU, and just as I bought a 5950X, I would buy a 12900KS if I were building today.
> 
> 
> 
> Gaming-wise you're probably right, but otherwise, I don't think so, it's not that results don't translate, it's that the resulting chip would be like the 1080 Ti - a bit too good of a product that would cannibalize AMD's own sales of higher end products in the future. Aforementioned EPYC 7373X for example


Not sure the 192 MB would compete with the 768 MB and 8 memory channels of the EPYC.


----------



## zx128k (Apr 25, 2022)

Count von Schwalbe said:


> Not sure the 192 MB would compete with the 768 MB and 8 memory channels of the EPYC.


The 5800X3D works because it's one chiplet. The cache helps reduce latency and improves game performance as a result. The downside is reduced clock speeds and worse temperatures. For a desktop CPU, more chiplets would just mean more of the same chiplet as the 5800X3D: more heat and increased latency, because data has to be accessed between different chiplets. You would be looking at reduced multi-core performance in a CPU whose primary purpose is to provide more multi-core performance.

Only the 5800X3D makes sense; one chiplet means lower latency because you don't have to talk to others. Before, AMD had two 4-core CCXs per die (CCD). With the Zen 3 based Ryzen 5000 and Milan processors, AMD discarded the concept of two CCXs in a CCD; instead, an 8-core CCD (one CCX) has access to the entire 32 MB of cache on the die, with the V-Cache added on top. This means nothing has to go over the IF to access other CCDs or CCXs.

This all helps latencies, the fact that all cores and cache are on one CCD. As the V-Cache sits on top, it acts like a blanket blocking heat from escaping the cores. The extra cache also limits vcore to 1.35 volts, which further limits boost frequencies and overclocking.

The upside is better gaming performance in some (not all) games, but the downside is reduced overall CPU performance compared to the 5800X. For Space Engineers, as an example, a 5800X is the better CPU. There are many games like this where pushing high fps is not the issue, but the CPU gets hammered simulating the game world.

The same happens with server chips like EPYC: if you don't need the large cache, performance is reduced.



> It comes packed with 256 MB of standard L3 cache and an additional 512 MB of 3D V-Cache, giving up to 768 MB of L3 cache and 804 MB of total cache per chip. Since two of these chips are featured on the 2P SP3 platform, you get 128 cores, 256 threads, and 1608 MB of cache which is truly insane. Each chip also comes with 280W of TDP though the ES chips may operate at a different TDP owing to their lower clocks.





> In all of the benchmarks used in the test suite, the AMD EPYC 7773X Milan-X Dual-CPU config was lost to the older EPYC 7T83 Milan CPU and also the Core i9-12900K despite having a massive cache & core advantage. The reason is simply the fact that this CPU isn't designed for the applications the content creator used in his test suite. The Milan-X lineup is designed specifically for workloads that are cache-dependent & software suites such as the ones used for benchmarking aren't optimized for the high core and cache count that this chip has to offer. The second reason is that this is an ES CPU so clocks may not be boosting as intended, hence the variable in performance versus the old part. Source





> Here are some examples from AMD on how their new processors will improve specific time-to-results workloads:
> 
> EDA – The 16-core, AMD EPYC 7373X CPU can deliver up to 66 percent faster simulations on Synopsys VCS, when compared to the EPYC 73F3 CPU.
> FEA – The 64-core, AMD EPYC 7773X processor can deliver, on average, 44 percent more performance on Altair Radioss simulation applications compared to the competition’s top of stack processor.
> CFD – The 32-core AMD EPYC 7573X processor can solve an average of 88 percent more CFD problems per day than a comparable competitive 32-core count processor while running Ansys CFX.





> AMD indicates the following workloads that might be a good fit for Milan-X:
> 
> Workloads that are sensitive to L3 cache size
> Workloads that have high L3 cache capacity misses (for example, the data set is often too large for L3 cache)
> Workloads that have high L3 cache conflict misses (for example, the data pulled into cache has low associativity





> Some areas that might have these kinds of workloads include fluid dynamics (CFD), finite element analysis (FEA), electronic design automation (EDA) and structural analysis. Source


----------



## nicamarvin (Apr 25, 2022)

Linux benchmarks at Phoronix are up.

As expected, the HPC, 3D fluid dynamics, and deep learning performance is off the charts.


----------



## GURU7OF9 (Apr 25, 2022)

In a review of the new "GAMING focussed AMD RYZEN R7 5800X3D", why would you choose to test 38 applications and only 10 games?
This defies common sense and logic!
It should have been 38 games tested and only 10 applications!
Maybe I missed something? Confused?


----------



## zx128k (Apr 25, 2022)

GURU7OF9 said:


> In a review of the new "GAMING focussed AMD RYZEN R7 5800X3D " why would you choose to test 38 applications and only 10 games?
> This defies common sense and logic !
> It should have been 38 games tested and only 10 applications !
> Maybe I missed something ? Confused ?


Linux is likely the OS.


----------



## GURU7OF9 (Apr 25, 2022)

Nope, it says in the test setup they used Win11.
Maybe somebody just wants to see which apps will benefit from 3D V-Cache? Or highlight the fact that the 12900K will crush it at apps!
I still wonder what the logic is behind it.
It just doesn't make any sense to me other than: think coincidence - think again!


----------



## zx128k (Apr 25, 2022)

GURU7OF9 said:


> Nope,  it says in test setup used win11.
> Maybe somebody just wants to see what apps will benefit from 3D V cache ?
> I still wonder what the logic is behind it?
> It just doesn't make any sense to me other than,  think coincidence - think again !


Must be interested in productivity applications and some gaming.


----------



## nicamarvin (Apr 25, 2022)

zx128k said:


> Linux is likely the OS.


He is Clearly not talking about Phoronix....


----------



## R0H1T (Apr 25, 2022)

GURU7OF9 said:


> Or highlight the fact that 12900k will crush it at apps !


What apps?


----------



## zx128k (Apr 25, 2022)

nicamarvin said:


> He is Clearly not talking about Phoronix....


People work and play some games on Linux, or dual boot. Many people are not solely focused on gaming; some want to know more about how it affects productivity programs. Each review caters to its demographic. If you're purely interested in a gaming review, there are lots of sources with more information.


----------



## Aquinus (Apr 25, 2022)

nicamarvin said:


> Linux Benchmarks at Phoronix is Up..
> 
> As expected the HPC, 3D Fluid Dynamics and Deep Learning Performance is Off the Charts
> 
> ...



I'm actually pretty surprised at how good the improvement for zstd was, given how lackluster the WinRAR and 7z performance was in the TPU review. This could be further evidence that the OS's CPU scheduler can directly impact cache hit ratios and evictions through how tasks are scheduled.
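The scheduling/hit-ratio interaction can be illustrated with a toy model. This is purely my own sketch: two LRU "caches" stand in for CCX L3 slices, a counter stands in for the scheduler, and none of it models any real OS.

```python
from collections import OrderedDict

# Toy sketch: a task that stays pinned to one "CCX" keeps its working
# set warm; a task the scheduler migrates keeps returning to a cache
# that background traffic has partially evicted in the meantime.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.lines = OrderedDict()

    def access(self, addr):
        """Return True on a hit; on a miss, insert and evict the LRU line."""
        if addr in self.lines:
            self.lines.move_to_end(addr)
            return True
        self.lines[addr] = True
        if len(self.lines) > self.capacity:
            self.lines.popitem(last=False)
        return False

def run(migrate_every):
    ccx = [LRUCache(64), LRUCache(64)]  # one toy "L3 slice" per CCX
    current, task_hits, steps = 0, 0, 1000
    for step in range(steps):
        if migrate_every and step % migrate_every == 0:
            current = 1 - current           # "scheduler" moves the task
        if ccx[current].access(step % 48):  # working set fits in one slice
            task_hits += 1
        ccx[1 - current].access(10_000 + step)  # background traffic elsewhere
    return task_hits / steps

print(f"pinned to one CCX:        {run(0):.1%} task hit rate")
print(f"migrating every 50 steps: {run(50):.1%} task hit rate")
```

In the pinned case the working set stays warm after the first pass; in the migrating case the background traffic evicts part of it during each absence, so the task re-misses on return. That is the kind of effect a cache-topology-aware scheduler can avoid.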


----------



## nicamarvin (Apr 25, 2022)

Aquinus said:


> I'm actually pretty surprised at how well the improvement for zstd was given how lackluster the WinRAR and 7z performance was on the TPU review. This could be further evidence that the CPU scheduler in the OS can directly impact hit ratios and evictions from cache due to how tasks are scheduled.


It would be nice to see whether there's any performance boost for Windows games running on Linux + Proton/Wine (SteamOS) vs. Windows 10/11 with 3D V-Cache.

What a 5950X3D might have been:

Intel Core i9-12900K vs. AMD EPYC 7373X vs. AMD Ryzen 9 5950X Benchmarks - OpenBenchmarking.org
openbenchmarking.org


----------



## Mussels (Apr 26, 2022)

GURU7OF9 said:


> In a review of the new "GAMING focussed AMD RYZEN R7 5800X3D " why would you choose to test 38 applications and only 10 games?
> This defies common sense and logic !
> It should have been 38 games tested and only 10 applications !
> Maybe I missed something ? Confused ?


Because that's what all previous CPUs were tested with. If they're not all tested the same way, then you're not a reviewer, you're a shill.

All parts must be tested equally, and then your conclusion is drawn from the results.
(In this case, the 5800X3D has weak multi-threaded performance compared to other Ryzen and Intel chips, but that's not a negative for a gaming chip.)


----------



## Deleted member 24505 (Apr 26, 2022)

Most people buying the X3D are not going to be bothered about app performance; if they were, they would buy a different chip.


----------



## lexluthermiester (Apr 26, 2022)

Tigger said:


> Most people buying the X3D are not going to be bothered about app performance,  if they were they would buy a different chip


Not true. Some will buy it for gaming, as there is clearly some benefit. Others will be buying it for certain types of compute, as there is a very clear advantage over CPUs with less cache.


----------



## GURU7OF9 (Apr 26, 2022)

Aquinus said:


> I'm actually pretty surprised at how well the improvement for zstd was given how lackluster the WinRAR and 7z performance was on the TPU review. This could be further evidence that the CPU scheduler in the OS can directly impact hit ratios and evictions from cache due to how tasks are scheduled.


Imagine how good they will be on AMD EPYC CPUs with 3D V-Cache AND 64 cores! TOTAL BEASTS!



Tigger said:


> Most people buying the X3D are not going to be bothered about app performance,  if they were they would buy a different chip





lexluthermiester said:


> Not true. Some will buy it for gaming as there is clearly some benefit. Others will be buying it for certain types of compute as there is a very clear advantage over CPU's with less cache.


TIGGER IS RIGHT! If they had done any sort of research for mainly app usage, then they wouldn't buy the 5800X3D. If they did, then they are just stupid!



Mussels said:


> because that's what all previous CPU's were tested with, if they're not all tested the same way then you're not a reviewer - you're a shill
> 
> 
> All parts must be tested equally, and then your conclusion is drawn off the results
> (In this case, the 5800x3D has weak multi threaded performance compared to other ryzen and intel chips - but that's not a negative, for a gaming chip)


I understand what you are saying, but this is a GAMING-oriented CPU. I would have thought that, to keep it the same but still focus on game performance, testing extra games would have been logical.
The bottom line is it has this nice shiny new 3D vertical cache, and all everyone wants to know is whether it will be useful for more than only a few games!
Everyone already knows the performance in apps is going to be pretty much the same as the standard 5800X, +/- 5%, which is a minimal difference!
So 38 apps have been basically retested for trivial results! I suppose it just confirms that, but it seems like a waste to me!


----------



## Dr. Dro (Apr 26, 2022)

Truth is, "gaming" processors don't exist. The 5800X3D is marketed towards desktop workloads (which include gaming) because of its relatively low core count (really, it is just one CCX/CCD) and AM4's relatively pedestrian I/O. Even with a single compute die, this processor, at the price it sells for, will have a lot of buyers outside the gaming spectrum, that's for sure.


----------



## lexluthermiester (Apr 27, 2022)

GURU7OF9 said:


> TIGGER IS RIGHT! if they have done any sort of research for mainly app usage then they wouldnt buy the 5800X3D.


Tigger was making a specific point, which you failed to understand the context of. I was making an addendum to his point. You failed to understand that too. Context is not one of your strong suits, it would seem.


GURU7OF9 said:


> If they did then they are just stupid !


Look in a mirror and say that again. Anyone who READS the benchmarks and reviews will know that the X3D has an advantage in certain areas over non-X3D parts. IF those areas of advantage are important to a user, then the X3D is the CPU to buy.


GURU7OF9 said:


> I suppose just confirming it, but seems like a waste to me !


Then don't buy it.


----------



## Dr. Dro (Apr 27, 2022)

Thought I'd share: Phoronix has tested the 5800X3D under Linux, where it somewhat surprisingly "sucks" at games, but excels at a wide range of other... business-y things

						AMD Ryzen 7 5800X3D On Linux: Not For Gaming, But Very Exciting For Other Workloads - Phoronix
					www.phoronix.com


----------



## btk2k2 (Apr 28, 2022)

Some interesting results in the below writeup. Memory overclocking is far less important on Zen 3D than regular Zen. Also some fantastic improvements in Stellaris. I like how for the 1st metric he notes that as each day goes by the simulation gets more complex so comparing days in a fixed time does somewhat penalise faster parts. He then does time to day X comparison as well which I have attached below but it is well worth actually giving it a read.

Writeup here.


----------



## GURU7OF9 (Apr 28, 2022)

lexluthermiester said:


> Tigger was making a specific point, which you failed to understand the context of. I was making an addendum to his point. You failed to understand that too. Context is not one of your strong suits it would seem.
> 
> Look in a mirror and say that again. Anyone who READS the benchmarks and reviews will know that the X3D has an advantage in certain area's over non-X3D parts. IF those areas of advantage are important to a user than the X3D is the CPU to buy.
> 
> Then don't buy it.


Well, yes, you are correct: for very specific apps it may well be very good, but that would be a very small number of them! If it is what you need, then go for it!
But you have taken my comments out of context, and you should take a good look in the mirror yourself!

Firstly, my point agreeing with Tigger is a generalisation: for 95% of apps there are far better CPUs (Intel and AMD) to use than the 5800X3D, except for some very small specific use cases.
Secondly, they did the whole testing procedure on a gaming-oriented CPU with a heavy focus on application performance, and roughly only 25% of that amount on gaming, which is the CPU's main focus!
(38 apps and only 10 games)
That kind of speaks for itself.

Mussels went on to inform me it is because they are keeping it the same for review consistency, which is fair enough, but as I replied to him, I would have thought they could have put extra gaming benchmarks in the review to satisfy the masses, myself included, wanting to know how much the extra 3D cache helps in gaming!

You seem to think I have a problem with the 5800X3D, but I don't; it's a neat bit of kit that I may well buy myself in the future!
Certainly thinking about it, but the high price is a bit rich at the minute! Hopefully it will drop before they run out of them!


----------



## Deleted member 24505 (Apr 28, 2022)

For a general-use PC there are better options; for a mainly gaming rig, there is this. That was kind of my point.


----------



## lexluthermiester (Apr 28, 2022)

GURU7OF9 said:


> Well yes you are correct for *a wide range of* specific apps


Fixed that for you..


----------



## InVasMani (Apr 29, 2022)

btk2k2 said:


> Some interesting results in the below writeup. Memory overclocking is far less important on Zen 3D than regular Zen. Also some fantastic improvements in Stellaris. I like how for the 1st metric he notes that as each day goes by the simulation gets more complex so comparing days in a fixed time does somewhat penalise faster parts. He then does time to day X comparison as well which I have attached below but it is well worth actually giving it a read.
> 
> Writeup here.



I found this chart rather interesting overall. What would be interesting is seeing the 5800X3D tested at 3200MT/s and 3733MT/s and comparing temperatures; throwing in 2133MT/s and adding that to the performance comparisons would be nice as well. If you reduce the memory frequency (MT/s), it makes it easier to push BCLK higher. I don't know whether pushing BCLK helps push Infinity Fabric, on the other hand. I do like the performance implications, though: 3200MT/s isn't too far off from 3733MT/s memory, so that seems to imply the CPU would handle 2x32GB DIMMs with slower latency timings, and 4 DIMMs by extension, more reasonably, since loosening the CL to compensate for stability won't impact performance too aggressively.
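Since absolute memory latency depends on both data rate and CAS timing, a quick back-of-the-envelope comparison (with hypothetical timings picked for illustration, not ones tested in the writeup) shows why 3200MT/s with tight timings can sit close to 3733MT/s:

```python
# First-word CAS latency in nanoseconds for DDR4: the memory clock is half the
# data rate (MT/s), so latency_ns = CL / (MT/2) * 1000 = 2000 * CL / MT.
def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    return 2000 * cl / mt_per_s

# Hypothetical example kits (timings are assumptions, not from the thread):
for mts, cl in [(2133, 15), (3200, 14), (3733, 16)]:
    print(f"DDR4-{mts} CL{cl}: {cas_latency_ns(mts, cl):.2f} ns")
```

DDR4-3200 CL14 lands at 8.75 ns, within a couple of percent of DDR4-3733 CL16 (~8.57 ns), while DDR4-2133 CL15 sits around 14 ns; and with the 3D V-Cache absorbing many of the trips to DRAM in the first place, even that gap matters less.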


----------



## Mats (May 1, 2022)

nicamarvin said:


> What 5950X3D Might have been


Just no.

You'd better check the specs of the 7373X before posting anything like this. It takes 8 chiplets to get that amount of L3 cache, while AM4 only has room for 2 chiplets.


----------



## lexluthermiester (May 1, 2022)

Mats said:


> Just no.


Why not?


----------



## Mats (May 1, 2022)

lexluthermiester said:


> Why not?


See second line.


----------



## lexluthermiester (May 1, 2022)

Mats said:


> See second line.


You might have missed the point..


----------



## Mats (May 1, 2022)

lexluthermiester said:


> You might have missed the point..


Care to explain? It's not really that easy to see your point right now.


----------



## 5 o'clock Charlie (May 2, 2022)

Thank you @W1zzard for this in-depth review before the release date. Your information was able to help me decide on my CPU upgrade.


----------



## Agent_D (May 3, 2022)

Was able to run a -50 curve with PBO2 Tuner, which is stable for 12 hours in CoreCycler. Cinebench scores 15050-15200, and it knocks 8-12°C off temps. Since we don't currently have any access to Curve Optimizer for the 5800X3D in BIOS, the PBO2 Tuner software works nicely, though -50 is the max it will allow. With -50 on all cores, voltage is between 1.145 and 1.165 in the multi-core Cinebench run; single-core voltage sits between 1.11-1.113V, and all cores reach 4550.1MHz.


----------



## gffermari (May 4, 2022)




----------



## 529th (May 5, 2022)

This is how benchmark graphs should be: based on minimum fps, not averages. der8auer also does an fps-per-watt comparison.
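For anyone curious how "minimum fps" figures are usually derived, here's a minimal sketch (my own illustration, not der8auer's actual method) computing average fps, 1% lows, and fps per watt from raw frametimes:

```python
# Compute average fps, 1%-low fps, and fps-per-watt from frametimes (ms).
# The 1% low averages the slowest 1% of frames, which is what most reviewers
# report as "minimum fps" these days.
def fps_stats(frametimes_ms, avg_power_w):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps, avg_fps / avg_power_w

# 99 smooth frames plus one stutter frame (made-up numbers):
avg, low, eff = fps_stats([10.0] * 99 + [20.0], avg_power_w=100.0)
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps, {eff:.2f} fps/W")
# → avg 99.0 fps, 1% low 50.0 fps, 0.99 fps/W
```

Note how a single stutter frame barely moves the average but halves the 1% low, which is exactly why lows tell you more about perceived smoothness.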


----------



## Agent_D (May 6, 2022)

Playing around with PBO2 Tuner some more; with the -50 all core curve and using the same settings I use with a 5800x in another system for the power values: 105w PPT, 80A TDC, 110A EDC. Cinebench values drop ever so slightly to the 14900 mark, but also drop another 4-6c in temp. Not going to mess with the max boost setting.
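To put those numbers in perspective, trading ~200 Cinebench points for a 105 W cap is a large efficiency win. A rough sketch, assuming the unrestricted run actually draws near AMD's stock 142 W PPT (an assumption on my part, not a measured figure):

```python
# Rough Cinebench R23 efficiency comparison at two power limits, using the
# scores mentioned above. The 142 W value is AMD's stock PPT for 105 W TDP
# parts and is assumed, not measured, for the unrestricted run.
def points_per_watt(score: float, watts: float) -> float:
    return score / watts

stock  = points_per_watt(15100, 142)  # -50 curve, stock power limits
capped = points_per_watt(14900, 105)  # -50 curve, 105 W PPT / 80 A TDC / 110 A EDC
print(f"stock: {stock:.1f} pts/W, capped: {capped:.1f} pts/W")
# → stock: 106.3 pts/W, capped: 141.9 pts/W
```

Roughly a third more work per watt for a ~1% score drop, which matches the temperature drop Agent_D reports.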


----------



## Agent_D (May 8, 2022)

Updated BIOS to 4201 beta on my Dark Hero X570. They significantly dropped the stock all-core clock speeds in what I think is an effort to reduce temps, and likely to stop throttling for users with lower-end cooling solutions. The all-stock-settings Cinebench score is 13800-13900 on the 4201 BIOS, but it jumps back up to the 14900-15000 mark with the -50 all-core curve optimizer setting in PBO2 Tuner.


----------



## xorbe (May 10, 2022)

Note to self, don't update my Dark Hero bios any further, lol.


----------



## zx128k (May 10, 2022)

Agent_D said:


> Updated bios to 4201 beta on my Dark Hero x570. They significantly dropped the stock all-core clock speeds in, what I think, is an effort to reduce temps, and likely stop throttling for users with lower end cooling solutions. All stock settings Cinebench score is 13800-13900 on the 4201 bios, but jumps back up to the 14900-15000 mark with the -50 all-core curve optimizer setting in PBO2 Tuner.


That's a big boost for doing next to nothing.


----------



## InVasMani (May 10, 2022)

Indeed, undervolting/underclocking can see huge efficiency gains and temperature reductions. That's nothing new to anyone; in fact, overclocking can see performance degradation when pushed too far.


----------



## gffermari (Jun 10, 2022)

gffermari said:


> 410£ the 5800X3D
> 495£ the 5950X
> 360£ the 5900X
> 
> it’s still too difficult to decide…



I purchased the 5950X and a few minutes later I canceled the order.

I made the mistake. Here it is...


----------



## Mussels (Jun 11, 2022)

Niiiice.

I was considering upgrading my VR system, but the 5800X3D still costs more than a 5900X (and the 5900X is actually in stock)


----------



## Makaveli (Jun 11, 2022)

Mussels said:


> Niiiice.
> 
> I was considering upgrading my VR system, the 5800x3d costs more than a 5900x still (and the 5900x is actually in stock)


I've looked at both myself.

However, the gains I would see at 3440x1440 won't be that great, and I don't really need the extra 4 cores the 5900X would provide; so, while very tempting, I think I will save the cash to upgrade my 6800XT to a 7800XT.


----------



## kapone32 (Jun 11, 2022)

Mussels said:


> Niiiice.
> 
> I was considering upgrading my VR system, the 5800x3d costs more than a 5900x still (and the 5900x is actually in stock)


The 5800X3D is good for gaming, but using it side by side with the 5950X is underwhelming. The issue I see is that on my 165Hz panel, 200 vs 195 FPS means absolutely nothing in gaming. Having said that, it blows away any of the budget CPUs but is too expensive (I know they sell as many as they make). Like Makaveli, my next plan will be to buy a 7800XT (from AMD's website, to get the best launch price possible).


----------



## gffermari (Jun 11, 2022)

I have a problem with the 3D's performance.

In CPU-Z I got a ~610/6100 score while the single-core clock is at 4450 and the all-core at 4250.
About the same in Cinebench: 1000 points less than in reviews.

I don't have any temperature issues, although the 3D is extremely hot considering I have custom water cooling.
I updated the BIOS 2 days ago.

Is it a bad part? RMA?


----------



## zlobby (Jun 12, 2022)

gffermari said:


> I have a problem with the 3D's performance.
> 
> In cpuz I got ~610/6100 score while the single core clock is at 4450 and the all core at 4250.
> About the same in Cinebench, having 1000 points less than in reviews.
> ...


More out of curiosity than anything else - what are your temps under full load?


----------



## Mussels (Jun 12, 2022)

gffermari said:


> I have a problem with the 3D's performance.
> 
> In cpuz I got ~610/6100 score while the single core clock is at 4450 and the all core at 4250.
> About the same in Cinebench, having 1000 points less than in reviews.
> ...


Your RAM has slow timings, and you didn't go above 90W on the CPU; either you're on an old BIOS or you've got custom settings limiting it.

70°C is not extremely hot. It's barely hot at all.


----------



## dont whant to set it"' (Jun 12, 2022)

@gffermari it's not a bad CPU going by your posted score; it could be some setting, or software running an intermittent load in the background.
A 5800X scores ~10% more for about 10% more clock speed (single-threaded).

LE: As I currently have it set up, I've changed the CPU fan to another one running the same fan profile, now at ~250rpm from a 2000rpm one with all cores loaded. The CPU cooler is a 120mm-fan-compatible air tower cooler.
Attached screenshot.


----------



## Tomorrow (Jun 12, 2022)

Indeed. My 5800X3D goes to 90°C with a 420mm AIO. Now that's what I call hot. Though this obviously only happens in all-core workloads, and those are rare for me (if they were not, I would have gone with a 5950X instead). In games I rarely see anything above 70°C.

I'm guessing some motherboard setting is limiting performance. Possibly TDC, PPT, EDC?
It can't be too old a BIOS; on one of those, this CPU would run at base clock (3.4GHz) and not boost properly at all, like what happened to me.

Initially, when I got my 5800X3D, I flashed the newest BIOS for my X570 Master, but I made the mistake of restoring an old BIOS profile because I did not want to manually re-enter all timings etc. As it turns out, this profile had a custom vcore setting. Since that value is hidden with the 5800X3D, it screwed up the boosting algorithm, and the chip was stuck at 3.4GHz.

Only after clearing CMOS and painstakingly re-entering my custom settings was I able to get everything set up. Since the 5800X3D does not show the vcore value, I was unable to reset it in an old profile. For that I would have had to unmount the 5800X3D, put back my old 3800X, edit the profile, save it, and take the 3800X out again. I figured it was faster to re-enter all values manually and create a new profile instead.


----------



## gffermari (Jun 12, 2022)

@zlobby 
70 degrees under full load in CPU-Z, 73-74 in Cinebench R23 on loop mode.
(I have 3x240 rads, although one of them is a slim rad with slim fans under the GPU)

@Mussels 
Yes, my memory is Corsair Vengeance 3600, one of the worst out there. I haven't tried to tweak the timings; I just picked DOCP.
I had a BIOS supporting the 3D before installing the CPU, but checking two days ago there was a newer version,
and I put it on. I'm on the latest now.





@dont whant to set it"' 
I'll try closing everything. Yes, I have HWiNFO, Razer, Logitech, AI Suite, Dropbox etc. in the tray.

@Tomorrow 
I have an X570 Impact, and after installing the 3D, most of the PBO and other settings disappeared.
I don't know if there's anything in the BIOS that I can change.
Pic of my rig right before installing the 3D. A huge mess...


Spoiler


----------



## Tomorrow (Jun 12, 2022)

gffermari said:


> I have a X570 Impact and after installing the 3D, most of the PBO and other settings, disappeared.
> I don't know if there's anything in the BIOS that I can change.


You should be able to change PBO settings. Since I'm not an ASUS user, I can't comment on exactly where these are stored. If I had to guess, they should be under the AMD CBS section somewhere. You may have to switch PBO from Auto to Manual to see them.

The only settings that disappeared on mine were related to vcore.


----------



## The King (Jun 12, 2022)

@gffermari 
Some P-states and boost clocks may not work correctly unless you update your motherboard chipset driver; this sometimes gets overlooked when the BIOS is updated.
Also make sure you are running the latest updates on Windows 10 and 11.


----------



## zlobby (Jun 12, 2022)

@gffermari I am not familiar with this particular mobo, but it should be decent enough to let you use full potential of your CPU.

I'd generally recommend putting the latest UEFI, resetting all settings, save and exit. Then, enable DOCP profile for the mem, and cherry-pick (enable) only those parameters that allow maximum auto OC e.g., thermal and power limits, PBO, etc.

Then plop Windows from scratch but only install drivers and OS updates. Who knows what AI suite and the rest are doing to your system?

Then we should have a good baseline to work with.

I know it's inconvenient, especially if it's your main rig but...

Edit: I recommend Win 11 (and CSM disabled of course).


----------



## Mussels (Jun 12, 2022)

As came up in another thread I posted in, some simple mobo software can tank your CPU performance by 10% or more.


Make sure nothing's running in the background.


----------



## gffermari (Jun 12, 2022)

-The background apps affect the results. I closed most of them and saw an increase in CPU-Z and R23.
Also, in AI Suite I managed to increase the CPU power: from 90 to 124.5 watts during multicore benchmarking in Cinebench.
Note: in R23 I now hit 86 degrees after 4-5 loops, and I'm on three 240 radiators...
But the CPU doesn't throttle, while the water temp increased by just 1.5 degrees.

The thing is, I'm still behind in numbers.

Regarding the Windows thing, I have to admit I'm in the worst possible position:
I've had Windows 11 for almost a year now, upgraded from Windows 10, which was installed about 3-4 years ago...

*CPUZ: 617/6180*


Guru3D: 639 / 6599
Tom's Hardware: 627 / 6070

Most reviewers do not use CPU-Z. Googling, you can see screenshots of 5800X3D CPU-Z numbers which are significantly higher than mine.
example:











*Cinebench R23: 14175 (I haven't tried the single core benchmark)*


(single core / multi core)
TechPowerUp: 1496 / 14688
TechSpot: 1398 / 14221
TechRadar: 1483 / 15070
PCWorld: 1490 / 15014
KitGuru: 1495 / 15171
HotHardware: 1486 / 14612
Overclock3D: 1452 / 14461
ComputerBase: 1488 / 14907
(unnamed): 1479 / 15031
(unnamed): 1484 / 14739


----------



## 529th (Jun 12, 2022)

@gffermari CPU-Z's ST benchmark is on par at 617 if you have Global C-States disabled... possibly some other BIOS tweaks in there causing it also. What's it set at?

Those other results are from bus overclocking.


----------



## R0H1T (Jun 12, 2022)

You should set CPU Power & VDDSOC phase controls to Optimized and turn off Spread Spectrum under CPU VRM Switching Frequency. This won't make a big difference, but perhaps enough in your case.


----------



## Mussels (Jun 13, 2022)

CPU-Z's stock results are always higher than mine.

There were also bugs with Windows 11 where, if you changed processor, you had to reinstall Windows clean to fix performance issues - I never heard back from the people I suggested try an 'upgrade' install to see if it helped.


----------



## Count von Schwalbe (Jun 13, 2022)

Mussels said:


> There was also bugs with windows 11 where if you changed processor you had to reinstall windows clean to fix performance issues - i never heard back from people i suggested try an 'upgrade' install if it helped


Did the Ryzen L3 bug on W11 get a final fix, or just a workaround? As the L3 cache is this chip's reason for existence, that could be an issue...

@gffermari have you tried clean installs of W11 and W10? Just a thought


----------



## gffermari (Jun 13, 2022)

Thank you all. I tried every possible setting combination in the BIOS, but in most cases I get much worse numbers (meaning CPU-Z in the 590/5800 range).
I will install Windows on a second SSD and check.


----------



## Taraquin (Jun 13, 2022)

I hope AMD makes Curve Optimizer available in future AGESAs; the 5800X3D sure needs it. Until then, the PBO2 Tuner tool can be quite helpful: it lowers temps a lot and can even boost performance if cooling is not very good.


----------



## Mussels (Jun 13, 2022)

Count von Schwalbe said:


> Did the Ryzen L3 bug on W11 get a final fix or just a workaround? As the L3 is the reason for the existence that could be an issue...
> 
> @gffermari have you tried clean installs of W11 and W10? Just a thought


The L3 bug got its fixes; this CPU-upgrade one seems to be separate - people kind of stopped talking about it.


----------



## Amaze (Jun 14, 2022)

New BIOS update on the way:

						AMD AGESA V2 1.2.0.7 Microcode To Fix fTPM Stutters
Motherboard manufacturers are addressing AMD's stuttering issue in Windows 10 and 11 with a new BIOS update. The (intermittent) performance losses and stuttering are caused mostly by the TPM 2.0 func...
					www.guru3d.com


----------



## Chomiq (Jun 14, 2022)

Amaze said:


> New bios update on the way:
> 
> 
> 
> ...


There's already news about this on TPU and 1.2.0.7 bioses were out a month ago.


----------



## chrcoluk (Jun 14, 2022)

How many microcodes do they need to fix fTPM stuttering? I count 3 so far.  Will 1.2.0.8 fix it as well?


----------



## dont whant to set it"' (Jun 14, 2022)

@chrcoluk  are you up to something?
I've been on the "1.2.0.6c beta" with my 5800x3d.


----------



## Amaze (Jun 14, 2022)

Chomiq said:


> There's already news about this on TPU and 1.2.0.7 bioses were out a month ago.


It says V2.

Edit:
It is indeed old. My board already has this as well. Who knows why this is news then.


----------



## Deleted member 24505 (Jun 14, 2022)

chrcoluk said:


> How many microcodes do they need to fix fTPM stuttering? I count 3 so far.  Will 1.2.0.8 fix it as well?



They're getting as bad as Intel now............oh wait Intel just works.


----------



## Count von Schwalbe (Jun 14, 2022)

Tigger said:


> They're getting as bad as Intel now............oh wait Intel just works.


Not illogical as Intel has dominated the market for years, well-nigh forcing MS to develop around their hardware. However, this is strongly off topic.


----------



## TheoneandonlyMrK (Jun 14, 2022)

Tigger said:


> They're getting as bad as Intel now............oh wait Intel just works.


It's like Spectre didn't exist in your head. Wait long enough and Spectre Mk2 will release, bringing you an Intel microcode update that fixes things by tanking performance.



Now, on topic: I have not had these stutters with my Ryzen system, despite being on Windows 11 with fTPM enabled and the allegedly enhanced security it offers. Odd.


----------



## Deleted member 24505 (Jun 14, 2022)

Funny how it's different when it's an Intel thread with off-topic anti-Intel posts; the usual TPU double standards now, though.


----------



## TheoneandonlyMrK (Jun 14, 2022)

Personally, I'm fine with your comments; they need countering (in any thread where I disagree with you, I'm there). But troll on in this or any Intel thread; it's a public forum after all. Man, do you moan, though.


----------



## Tomorrow (Jun 14, 2022)

Tigger said:


> Funny how it's different when it's a Intel thread though with off topic anti Intel posts, usual TPU double standards now though.


^ Says the resident Intel fanboy. If you don't have anything meaningful to add to this topic, then don't post here.


----------



## Mussels (Jun 15, 2022)

chrcoluk said:


> How many microcodes do they need to fix fTPM stuttering? I count 3 so far.  Will 1.2.0.8 fix it as well?


1.2.0.7 was the only one to actually fix the stuttering, since it was caused by 1.2.0.6 (which added the Windows 11 support).


It's not like we could just disable fTPM and have it all work fine the entire time...


----------



## Deleted member 24505 (Jun 15, 2022)

Tomorrow said:


> If you dont have anything meaningful to add to this topic then dont post here.



The same should apply to Intel threads then.


----------



## Tomorrow (Jun 15, 2022)

Tigger said:


> The same should apply to Intel threads then.


Yes it should. But someone else doing it in Intel threads does not excuse you doing it in AMD threads.


----------



## chrcoluk (Jun 15, 2022)

dont whant to set it' said:


> @chrcoluk  are you up to something?
> I've been on the "1.2.0.6c beta" with my 5800x3d.


I am on 1.2.0.6 as well.  Working great for me.


----------



## gffermari (Jun 18, 2022)

Fresh Windows install today.

Results:
CPUZ ST = 610-615 on the mediocre cores, 621 on the best core.
CPUZ MT = ~6225. I get random results every time, and they don't follow a pattern. No matter what settings I set, the score ranges from 5800 up to 6225 (best), while in every test it begins at 6300 and gradually decreases to a random number... 5900, 6150, 6025, whatever...

R23 MT = ~4250. No matter what settings I set in the BIOS, I can't get over 4200 points.

OCCT ST (SSE, AVX) = I get the lowest score for a 5800X3D - slightly behind or in front of the lowest 5800X3D score in the benchmark.
OCCT MT (SSE, AVX) = I get a slightly better score than the average for a 5800X3D.

I don't know if my RAM causes all these issues, but just today I noticed that I get some weird results in OCCT.




----------------------------------------------------------------------
edit....





....at last, I managed to unleash some of its power!!!


----------



## Amaze (Jun 19, 2022)

gffermari said:


> Fresh Windows install today.
> 
> Results:
> 
> R23 MT = ~4250. No matter what settings I put on BIOS, I can't get over 4200 points.


That is extremely low. The best I could do so far with the PBO curve tool was 15013.

						How-to-undervolt-AMD-RYZEN-5800X3D-Guide-with-PBO2-Tuner/README.md at main · PrimeO7/How-to-undervolt-AMD-RYZEN-5800X3D-Guide-with-PBO2-Tuner
Get the Most out of your 5800X3D using PBO Curve Optimizer! - How-to-undervolt-AMD-RYZEN-5800X3D-Guide-with-PBO2-Tuner/README.md at main · PrimeO7/How-to-undervolt-AMD-RYZEN-5800X3D-Guide-with-PBO2...
					github.com


----------



## R0H1T (Jun 19, 2022)

You could probably ask if ASUS can lend you one of their *Beta/Alpha BIOS* builds to test the chip out further ~

						MSI X570 Beta BIOS Enables Ryzen 7 5800X3D Tweaking
A user (via Komachi_Ensaka) on the MSI forums has shared a beta firmware for MSI's MEG X570 Unify motherboard that unlocks several previously unavailable options for the Ryzen 7 5800X3D. Before the Ryzen 7 5800X3D's launch, AMD had confirmed that overclocking...
					www.techpowerup.com

Pretty sure they're also working on easing some of the restrictions on this, not unlike MSI.

Also the mandatory "Try this at your own risk" warning.


----------



## gffermari (Jun 19, 2022)

Amaze said:


> That is extremely low. The best I could do so far with the PBO curve tool was 15013.
> 
> 
> 
> ...



Yes it was.

Using PBO2 tuner, I get 4350 all core in Cinebench and about 14700 points.
Still I haven't managed to get this level of performance tweaking only the BIOS settings.
Enabling the PBO in BIOS and using the PBO2 tuner, I managed to get a reasonable level of performance out of it.


----------



## Amaze (Jun 19, 2022)

gffermari said:


> Yes it was.
> 
> Using PBO2 tuner, I get 4350 all core in Cinebench and about 14700 points.
> Still I haven't managed to get this level of performance tweaking only settings the BIOS settings.
> Enabling the PBO in BIOS and using the PBO2 tuner, I managed to get a reasonable level of performance out of it.


That's more like it


----------



## HenrySomeone (Jun 20, 2022)

Shatun_Bear said:


> Very impressive. Not sure why they are stealth launching this, it matches or beats the limited ediition behemoth power guzzling 12900KS for a much, much lower price.
> 
> Low stock maybe?


It does NOT match or beat the KS; the latter is in fact 2% faster on average, while certainly not using much more power while gaming (this myth needs to stop). Oh, and it absolutely demolishes it in everything else at comparable energy efficiency:


----------



## gffermari (Jun 20, 2022)

loool mate.
We're talking about an 8-core CPU on 2-year-old tech with some added cache, running at 4.5GHz at best, and you're comparing it to an €800 5.5GHz monstrosity with PCIe 5 and DDR5.

The 3D is not worth it because it's expensive for not being an all-round CPU. A theoretical 5900X3D would be that.
The KS is just not worth it.


----------



## Amaze (Jun 20, 2022)

A whopping 2%, eh?
Either way, those power consumption numbers are demolished by the optimized curve that users have discovered.


----------



## HenrySomeone (Jun 20, 2022)

gffermari said:


> loool mate.
> We're talking about a 2 years old tech 8 core cpu with some added cache at 4.5Ghz at best and you for a 800e 5.5Ghz monstrosity with pcie5 and ddr5.
> 
> The 3D is not worth it because it's expensive for not being an all around cpu. A theoretical 5900X3D would be that.
> The KS is just not worth it.


Of course the KS isn't worth it for most people, even if they are assembling a high-end gaming system, BUT it is the best in pretty much everything across the board, while the X3D is only the best gaming Ryzen, getting smoked even by the lowly (in comparison, especially price-wise) 12600K in everything else. It should be a $300 chip, and I suspect the price will eventually get there... or they'll just stop making them after all the hardcore AMD enthusiasts already have them and no one else takes a second glance at it anymore. Either way, we'll see; but honestly, even at 300 it would be a somewhat difficult sell to someone buying an all-new platform...


----------



## Dr. Dro (Jun 20, 2022)

HenrySomeone said:


> Of course the KS isn't worth it for most people, even if they are assembling a high end gaming system, BUT it is the best in pretty much everything down the line, while the X3D is only the best gaming Ryzen, while getting smoked even by the lowly (in comparison, especially price wise) 12600k in everything else. It should be a $300 chip and I suspect the price will eventually get there...or they'll just stop making them after all the hardcore AMD enthusiasts will already have them and no one else will take a second glance at it anymore. Either way, we'll see, but honestly even at 300, it would be a somewhat difficult sell to someone buying an all new platform...



We 5950X owners say hi and send our heartfelt regards.



Amaze said:


> A whopping 2% eh.
> Either way, those power consumption numbers are demolished by the optimized curve that users have discovered.



I still find it amazing that, as power efficient as Zen 3 is, there's still room for improvement. It's a shame that the 5800X3D doesn't seem to have gone through the standard binning process; they just test whether the cores work and ship the processor, which is why you're seeing so many of them doing -25 to -30 in the Curve Optimizer. Those values would indicate that the processors are quite horribly binned, but given the fixed v/f curve that makes sense. If it's eventually unlocked (even if unofficially), it'd be interesting to see how different X3Ds behave, as the other Ryzen SKUs seem to live and die by their binning quality (e.g. 5950Xs and their very narrow CO range).

My personal 5950X only does -2 all-core; setting -2 and then -3 on the cores identified as the best ones will still result in a crash. The PBO scalar setting introduces crashes above 1x, and it also hates to have the XFR frequency range increased, even by just 50 MHz. It's an excellent bin, though: even without PBO it tends to clock itself very, very generously, which is why I get cold feet about upgrading it. Might just wait for the chance to buy something like a C8DH or C8E cheaper once they're no longer the flagships.


----------



## HenrySomeone (Jun 20, 2022)

Amaze said:


> A whopping 2% eh.
> Either way, those power consumption numbers are demolished by the optimized curve that users have discovered.


Definitely not whopping, no, but it's not the other way around like the guy states and just like I said above - for purely gaming neither of these two is a smart choice. Also, not sure what kind of optimized curve you're talking about.


----------



## Dr. Dro (Jun 20, 2022)

HenrySomeone said:


> Definitely not whopping, no, but it's not the other way around like the guy states and just like I said above - for purely gaming neither of these two is a smart choice. Also, not sure what kind of optimized curve you're talking about.











Maximizing Ryzen 5000 Performance With AMD Curve Optimizer (hothardware.com)

AMD's new AGESA update adds simple controls for complex overclocking and under-volting that can bring nice performance gains.
				




This. On Intel processors you still do traditional multiplier-based overclocks, but on AMD you can simply tweak and reprogram the processor's reliability algorithm using the provided Curve Optimizer function to extract every last bit of performance your processor has, with 25 MHz granularity, since the multipliers on Ryzen are adjustable in 0.25x steps. The best part is that it does this by itself; all you need to be concerned with is keeping it cold. Due to being locked and programmed with a fixed curve, the X3D seems to have very, very loose binning. PBO2 Tuner works, but it's not even *supposed* to; Ryzen Master and BIOSes won't allow CO adjustment, even to negative voltage points.
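The 25 MHz granularity mentioned above falls straight out of the multiplier math; a minimal sketch of the arithmetic (values and function name are mine, not AMD's):

```python
# A minimal sketch (not AMD firmware) of why Ryzen clock tuning moves in
# 25 MHz increments: multipliers step in 0.25x units against the 100 MHz
# reference clock (BCLK), so 0.25 x 100 MHz = 25 MHz per step.
BCLK_MHZ = 100.0
MULT_STEP = 0.25  # smallest multiplier increment on Ryzen

def achievable_clocks(base_mult: float, steps_up: int) -> list[float]:
    """List the core frequencies (MHz) reachable in whole multiplier steps."""
    return [(base_mult + i * MULT_STEP) * BCLK_MHZ for i in range(steps_up + 1)]

print(achievable_clocks(44.5, 2))  # [4450.0, 4475.0, 4500.0]
```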


----------



## Tomorrow (Jun 20, 2022)

AMD should really have kept negative CO for the X3D. It does not harm the cache and actually improves (at least multithreaded) performance.
I mean they had a whole year to work on it and test it with various settings.


----------



## HenrySomeone (Jun 20, 2022)

Yup, as soon as Intel's 10nm chips finally came out, AMD started drifting back towards their once-common practices, namely taking too long to get something new out, and even then it's not properly polished. It probably won't be long before they'll be OCing their chips to high heavens (in the manner of their Black Edition SKUs) in hopes of catching up, or rather not falling behind even more. My bet would be after Meteor Lake lands next year, but it could be even sooner, as the 13900K with its 24 cores is bound to be a multi-thread monster!


----------



## Deleted member 24505 (Jun 21, 2022)

HenrySomeone said:


> Yup, as soon as Intel's 10nm chips finally came out, AMD has already started to drift towards their once common practices, namely taking too long to get something new out and even then, it's not properly polished. It probably won't be long before they'll be OCing their chips to high heavens (in the manner of their black edition skus) in hopes of catching up or rather not falling behind even more. My bet would be after Meteor Lake lands next year, but could be even sooner as 13900k with its 24 cores is bound to be a multi thread monster!



Hey, you can't say that. You will hurt about 60% of TPU members' feelings /s


----------



## Mussels (Jun 21, 2022)

Well, you're flatlining at 80°C; you're thermal throttling.


For <100 Hz gaming, get an i5 or R5 and go play games.
You only need the faster chips (especially the X3D) for high-refresh-rate gaming, where the 99th percentile lows are smoother.


----------



## Tomorrow (Jun 21, 2022)

Good news for 5800X3D owners on MSI X570 Unify:








MSI's BETA X570 BIOS For AMD Ryzen 7 5800X3D CPU Allows Users To Tweak VCore, Precision Boost & Curve Optimizer (wccftech.com)

MSI has a new BETA BIOS out for its X570 motherboards that enables users to further tweak their AMD Ryzen 7 5800X3D CPU.


----------



## Amaze (Jun 21, 2022)

AMD Ryzen 7 5800X3D review: 3D V-Cache and high efficiency against Intel (www.pcgameshardware.de, translated)

On this page we look at the power consumption of the Ryzen 7 5800X3D. We also check how efficient the CPU is in games and chart the fps per watt.


----------



## fevgatos (Jun 21, 2022)

Amaze said:


> AMD Ryzen 7 5800X3D review: 3D V-Cache and high efficiency against Intel
> 
> On this page we look at the power consumption of the Ryzen 7 5800X3D. We also check how efficient the CPU is in games and chart the fps per watt.
> ...


I admire the reviewer's work, testing 50 CPUs or whatever, but DDR5-4400? I mean... come on.


----------



## Amaze (Jun 21, 2022)

fevgatos said:


> I admire the reviewer's work, testing 50 CPUs or whatever, but DDR5-4400? I mean... come on.


Yeah, not the best, but according to this Tom's article, going from 4800 to 6200 didn't have a meaningful impact on game performance.








Intel Alder Lake RAM Guide: Picking Between DDR4 and DDR5 (www.tomshardware.com)

Choose wisely and save money.


----------



## fevgatos (Jun 21, 2022)

Amaze said:


> Yeah not the best but according to this Toms article, going from 4800 to 6200 didn't have a meaningful impact on game performance.
> 
> 
> 
> ...


That's BS; there is actually a huge difference if you are CPU benching, meaning a high-end GPU at low resolution.


----------



## Amaze (Jun 21, 2022)

fevgatos said:


> That's BS; there is actually a huge difference if you are CPU benching, meaning a high-end GPU at low resolution.


Can you show me some examples?


----------



## Mussels (Jun 21, 2022)

TPU's reviews say latency helps more than MHz
DDR4 vs. DDR5 on Intel Core i9-12900K Alder Lake Review - Application Performance | TechPowerUp




Low-res gaming is similar, with Gear 2 at "low" RAM speeds being the worst way to run it.
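A rough illustration of why latency can matter more than raw transfer rate; a back-of-envelope sketch using the standard first-word CAS latency approximation (function name and example kits are mine):

```python
# True CAS latency in nanoseconds is cycles / clock. DDR runs its I/O clock
# at half the transfer rate, so a kit rated N MT/s clocks at N/2 MHz.
def cas_latency_ns(transfer_rate_mts: int, cas_cycles: int) -> float:
    io_clock_mhz = transfer_rate_mts / 2
    return round(cas_cycles / io_clock_mhz * 1000, 2)

print(cas_latency_ns(6000, 30))  # DDR5-6000 CL30 -> 10.0 ns
print(cas_latency_ns(4800, 40))  # DDR5-4800 CL40 -> 16.67 ns
print(cas_latency_ns(3600, 16))  # DDR4-3600 CL16 -> 8.89 ns
```

The slower-rated DDR4 kit actually has the lowest first-word latency of the three, which is the point being made about latency versus MHz.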


----------



## fevgatos (Jun 21, 2022)

Amaze said:


> Can you show me some examples?


I've tested on my PC; the 6000C30 1T I'm running right now is up to 40% faster than 4800C40 in Cyberpunk and SotTR. Haven't tested more games, but it's kinda obvious there is a big difference.


----------



## TheoneandonlyMrK (Jun 21, 2022)

fevgatos said:


> I've tested on my PC; the 6000C30 1T I'm running right now is up to 40% faster than 4800C40 in Cyberpunk and SotTR. Haven't tested more games, but it's kinda obvious there is a big difference.


W1zzard posts actual proof. That's kinda obvious; what you're saying is 99.9% your opinion until you back it up with 100% proof.
Until then it's lips flapping.


----------



## fevgatos (Jun 21, 2022)

TheoneandonlyMrK said:


> W1zzard posts actual proof. That's kinda obvious; what you're saying is 99.9% your opinion until you back it up with 100% proof.
> Until then it's lips flapping.


Actual proof... as in, pretty bars made in Paint. I don't have any problem with his reviews, but don't tell me they contain actual proof, because they just don't. Bars made in Paint are not proof.


----------



## TheoneandonlyMrK (Jun 21, 2022)

fevgatos said:


> Actual proof... as in, pretty bars made in Paint. I don't have any problem with his reviews, but don't tell me they contain actual proof, because they just don't. Bars made in Paint are not proof.


Right, you're going on ignore.

You need to realise your proof is actually nothing: words.

If I and many others didn't think the likes of W1zzard use actual benchmarks, collate results, then present those results in an informative way to PROVE their opinions and any statements, this site wouldn't still be here for you to flap your mouth in while showing zero proof.

I know which opinion I'm backing, and you going on ignore should tell you it's not yours.


----------



## fevgatos (Jun 21, 2022)

TheoneandonlyMrK said:


> Right, you're going on ignore.
> 
> You need to realise your proof is actually nothing: words.
> 
> ...


So you trust the reviewer. Great, me too. That doesn't mean the review has proof, because it just doesn't. The fact that instead of addressing my point you decided to announce that you'll ignore me makes me right by definition. So thanks, I guess.


----------



## dont whant to set it"' (Jun 21, 2022)

Girls and boys, while popcorn would be fun on the side, I kind of sense an admin may be lurking.

One can only compare Zen 3 to Alder Lake so far, because these are very different architectures.

It would have been really nice to get a better IOD with Zen 3 on AM4 that could reliably push the IF clock to 2200 MHz, possibly above. But just because that didn't happen does not take away from Alder Lake's DDR5 integrated memory controller's performance scalability in this or that workload.

I for one have found on my AM4 platform that going from 4x 8 GB DIMMs (dual-rank) @ 3200 MT/s C16 flat to 4x 8 GB DIMMs (single-rank) @ 3600 MT/s C14 flat did not score better (within margin of error), so yes, architectural differences. LE: in Cinebench R23.


----------



## QuietBob (Jun 21, 2022)

Average CPU power consumption in 14 games from the already-quoted PCGH article. They tested with an OC'd 6900 XT at the lowest possible resolution, but at maximum detail with RT on. The wattage was measured at the motherboard, so it accounts for VRM losses:






In my own testing of 18 games, the average CPU package power for the 5800X3D was 55 W at 1080p with maximum quality and RT off, paired with a stock 6600 XT.


----------



## TheoneandonlyMrK (Jun 21, 2022)

fevgatos said:


> So you trust the reviewer. Great, me too. That doesn't mean the review has proof, cause it just doesn't. The fact that instead of addressing my point you decided to announce that you'll ignore me makes me right by definition. So thanks i guess


Does it f£#@.    . ..


----------



## fevgatos (Jun 21, 2022)

TheoneandonlyMrK said:


> Does it f£#@.    . ..


Thought you ignored me


----------



## gffermari (Jun 25, 2022)

Are there any Asus X570 boards that have the Curve Optimizer available with a 5800X3D?

My X570 Impact's BIOS does not have it. Are there any beta BIOSes for other Asus boards?


----------



## Tomorrow (Jun 25, 2022)

gffermari said:


> Are there any Asus X570 boards that have the Curve Optimizer available with a 5800X3D?
> 
> My X570 Impact's BIOS does not have it. Are there any beta BIOSes for other Asus boards?


The MSI X570 Unify supposedly has a beta BIOS, though I'm not sure which version of the Unify, the old one or the newer Unify-X Max. I posted this news somewhere in this thread.


----------



## VulkanBros (Jun 30, 2022)

https://www.msi.com/Motherboard/MEG-X570-UNIFY/support 

"Improved CPU performance of RyZen 7 5800X3D


----------



## Tomorrow (Jun 30, 2022)

MSI Intros 'Kombo Strike' Feature on AM4 500-Series Motherboards For AMD Ryzen 7 5800X3D CPU, Offers Increased Performance & BIOS Undervolting Support








MSI has added the Kombo Strike feature for the AMD Ryzen 7 5800X3D in AM4 500-series motherboard BIOSes, offering improved performance & undervolting. (wccftech.com)




The following motherboards have already received the new BIOS which is available from their respective product page:

| Motherboard | BIOS version* |
| --- | --- |
| MEG X570 GODLIKE | E7C34AMS.1I1 |
| MEG X570 ACE | E7C35AMS.1J1 |
| MEG X570 Unify | E7C35AMS.AE1 |
| MEG X570S ACE MAX | A7D50AMS.151 |
| MEG X570S Unify-X MAX | A7D51AMS.141 |
| MEG B550 Unify | E7D13AMS.171 |
| MEG B550 Unify-X | E7D13AMS.A71 |
| MEG B550 Mortar WiFi | E7C94AMS.1D1 |
| MEG B550M Mortar | E7C94AMS.1D1 |


----------



## Chomiq (Jun 30, 2022)

Kombo Strike, lmao.


----------



## VulkanBros (Jun 30, 2022)

MSI Kombo Strike For AM4 500-Series Motherboards Released: Up To 5% Performance Uplift & Undervolting Support For AMD Ryzen 7 5800X3D CPU Through BIOS

Kombo Strike is a brand-new feature that's being introduced by MSI for its AM4 500-series boards. The feature comes packaged in the latest BIOS for MSI's 500-series motherboards, which not only adds the Kombo Strike feature for the Ryzen 7 5800X3D CPU but also unlocks undervolting support for the chip.


----------



## gffermari (Jul 1, 2022)

Asus, I'm still waiting.....


----------



## Amaze (Jul 4, 2022)

VulkanBros said:


> MSI Kombo Strike For AM4 500-Series Motherboards Released: Up To 5% Performance Uplift & Undervolting Support For AMD Ryzen 7 5800X3D CPU Through BIOS. Kombo Strike is a brand-new feature that's being introduced by MSI for its AM4 500-series boards. The feature comes packaged in the latest BIOS for MSI's 500-series motherboards, which not only adds the Kombo Strike feature for the Ryzen 7 5800X3D CPU but also unlocks undervolting support for the chip.


According to a user on Overclock.net:
"Kombo Strike" is Curve Optimizer. Setting it to 1 is equivalent to -10, 2 to -20, and 3 to -30.
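If that report holds, the mapping is trivial to express; a hypothetical helper (the level-to-offset table comes from that Overclock.net report, not from MSI documentation):

```python
# Reported (unofficial) mapping: Kombo Strike level 1/2/3 corresponds to an
# all-core Curve Optimizer offset of -10/-20/-30; 0 means stock.
KOMBO_TO_CO_OFFSET = {0: 0, 1: -10, 2: -20, 3: -30}

def co_offset(kombo_level: int) -> int:
    """Return the equivalent all-core Curve Optimizer offset."""
    try:
        return KOMBO_TO_CO_OFFSET[kombo_level]
    except KeyError:
        raise ValueError("Kombo Strike exposes levels 0-3 only") from None

print(co_offset(3))  # -30
```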


----------



## RyzenBurner (Jul 4, 2022)

So I bought this 5800X3D last week... so far, according to my testing, I think it's underperforming slightly in that it's missing the advertised boost clocks by about 50-150 MHz. Most reviews seem to put it at around 4.35 GHz all-core boost and 4.5 to 4.55 GHz single-core boost, but as you can see from the images I'm getting more like 4.2 all-core and 4.45 single. Cinebench R23 multi-core came back at around 13000-13100 too, which seems low.

all core





single core




This is before the CPU is hitting thermal or electrical limits, also. My board is an Asus X470-F with BIOS 6024/AGESA 1206. I have tried resetting the BIOS to optimized defaults and running a clean benchmark that way, and it's the same.

Interestingly, in my testing BIOS 6042/AGESA 1207 actually dropped boost clocks another 200 MHz; all-core was down to 4.0 GHz, which is way off, so I went back to 6024. No idea what that was about.

I'm not an expert, but everything I know right now says this CPU should be doing a little bit better than this. So did I get a dud chip, and should I RMA it? I'm at a loss why I'm not getting quite the clocks that many others and many reviews say they're getting.


----------



## Tomorrow (Jul 4, 2022)

Same here. For some reason I'm maxing out at 4450 MHz. I've never seen 4500 or even 4550. All-core is indeed 4350 max. I have a different board: Gigabyte X570 Aorus Master (v1.0).
I too have tried optimized defaults, and I'm running the latest 1.2.0.7-based BIOS. Thankfully it's not as low as 4 GHz.

Others have suggested a clean OS install. That's about the only advice I can offer. Either that, or some X3D chips are severely underperforming.
It also seems like I'm missing some voltage. I've never seen it go above 1.30 V, though it should be 1.35 V max.


----------



## RyzenBurner (Jul 4, 2022)

Tomorrow said:


> It also seems like im missing some voltage. I've never seen it go above 1.30v tho it sould be 1.35v max.



Yeah, now that you mention it, I've never seen my VCore go over 1.2, actually. Is it really supposed to be 1.35?

With regards to a clean install, I actually dual-booted a fresh Ubuntu install and verified that I get the same low frequencies there, so I believe I've ruled out a Windows problem.


----------



## Tomorrow (Jul 4, 2022)

RyzenBurner said:


> Yeah now that you mention it, ive never seen my vcore go over 1.2 actually. Is it really supposed to be 1.35?
> 
> With regards to a clean install, i actually dual booted a fresh ubuntu install and i verified that i get the same low frequencies in that, so i believe ive ruled out a windows problem here.


I'm not sure how official it is, but before the launch AMD's rep told media that it's limited to 1.35 V.

Unfortunately AMD does not specify the voltage, and boost is marked as "up to 4.5GHz": https://www.amd.com/en/product/11576

So it could be manufacturing variability why some are getting 4450, some 4500, and some lucky ones 4550.
Also, mine is a below-average FCLK overclocker, as it's limited to 1866 MHz. Same as the 3800X I upgraded from. I was hoping for at least 1900, considering it's the B2 stepping of Zen 3.


----------



## gffermari (Jul 4, 2022)

@RyzenBurner 

Use the PBO2 Tuner. Set -20 or -30 to all cores and check again.
AMD CBS > NBIO > SMU > CPPC Enabled
AMD CBS > NBIO > SMU > CPPC Preferred Cores Disabled
AMD CBS > CPU > Global C-State Control Enabled

This behavior is common. I had the same issues.

Now I'm at 4300-4375 all-core in R23 and 4450 in every game tested, while every core hits 4550 in light tasks.


----------



## Tomorrow (Jul 4, 2022)

gffermari said:


> @RyzenBurner
> 
> Use the PBO2 Tuner. Set -20 or -30 to all cores and check again.
> AMD CBS > NBIO > SMU > CPPC Enabled
> ...


Why would you disable preferred cores?

My settings for those are Enabled, Enabled and Disabled, but I guess I can try to see if it makes a difference.


----------



## gffermari (Jul 4, 2022)

It doesn't make any difference to me but that's how I saw it recommended.


----------



## Tomorrow (Jul 4, 2022)

OK, I did some testing and it's weird. I also played with PBO2 Tuner and tested -20, -30 and -40.
I enabled C-States and disabled preferred cores. Rebooted and ran both the CB23 ST and MT tests.

What I noticed was that CB23 used the lowest-performing core, but that core was at 4450 during the ST test. Other cores were idling.
I then rebooted and went to the BIOS again. Enabled preferred cores but left C-States enabled.

And then, when booting back into Windows, for the first time I saw a few cores hitting 4500 and one core (perf #1/1) hitting 4550.
I'm not sure if it was because I changed those BIOS settings or because I played with PBO2 Tuner (or both), but something did change.

However, now I'm not seeing anything past 4450 again after playing with PBO2 Tuner and running CB23 (no reboot). I guess I will have to go back into the BIOS and disable preferred cores again.
The behaviour is inconsistent. I did gain about 1000 points in the CB23 MT test compared to before, because some cores ran at 4400 during the test. The ST score increased slightly and is at 1495; MT is at 14708.

Max core voltage is still at 1.29-something, and temps did come down a little with the -40 offset.


----------



## Mussels (Jul 5, 2022)

RyzenBurner said:


> So i bought this 5800X3D last week....so far according to my testing, I think its underperforming slightly in that its missing the advertised boost clocks by about 50-150mhz? Most reviews seem to put it at around 4.35ghz all core boost and 4.5 to 4.55ghz single core boost, but as you can see from the images im getting more like 4.2 all core and 4.45 single. Cinebench r23 multicore came back with around 13000 - 13100 too, which seems low.
> 
> all core
> 
> ...


You're at 89C. You're throttling.

The reason you see speed ups that go away is you're saturating your cooling.


----------



## Tomorrow (Jul 5, 2022)

Mussels said:


> You're at 89C. You're throttling.
> 
> The reason you see speed ups that go away is you're saturating your cooling.


No he's not. Look at the second attachment: he's at 75°C, well below the 90°C TjMax, and still limited to 4450 MHz. I've run it at 92°C myself and it did not downclock.
It's clearly not due to throttling. Based on what others have said, it's something to do with the C-States and CPPC functions in the BIOS. I was briefly able to get 4500 and 4550 when playing with those, but it went away as quickly as it appeared.


----------



## Mussels (Jul 5, 2022)

Tomorrow said:


> No he's not. Look at the second attachment. He's 75c. Well below 90c TjMax and still limited to 4450Mhz. I've run it at 92c myself and it did not downlock.
> It's clearly not due to throttling. Based on what others have said it's something to do with C-States and CPPC function in BIOS. I was briefly able to get 4500 and 4550 when playing with those but it went away as quickly as it appeared.






You don't have to hit 90°C to reduce the maximum boost clocks; he's running hot.


----------



## Tomorrow (Jul 5, 2022)

Mussels said:


> View attachment 253674
> 
> You dont have to hit 90C to reduce the maximum boost clocks, he's running hot


The maximum boost clock on the 5800X3D can't reach 4500 or 4550 even when idling around 40°C with a 420 mm AIO. I know because I'm cooling it with one. This is clearly an issue with BIOS settings, even default ones: not boosting to the maximum in lightly threaded workloads.


----------



## RyzenBurner (Jul 5, 2022)

Just to settle the argument... my frequency limit is no more than 4200 from the moment I start a benchmark, i.e. before the chip has even heated up.



gffermari said:


> @RyzenBurner
> 
> Use the PBO2 Tuner. Set -20 or -30 to all cores and check again.
> AMD CBS > NBIO > SMU > CPPC Enabled
> ...



I've tried this now and can report that it made no difference; my frequency limit is still ~4200 all-core.

I've also now played with PBO2 Tuner with all cores down to -40, and interestingly that got my all-core boost up to ~4375 to 4400, which is more like it. But the single-core frequency cap was still 4450 at most, so no effect there.

I know this is a new CPU and there's not a lot of community testing on it yet, but I find it strange that different samples are showing such large differences in stock performance.


----------



## dont whant to set it"' (Jul 6, 2022)

I used way too much grease between the CPU and CPU cooler this time, thus the multi-core scores can be improved.
High-score settings: 104.8 MHz BCLK x44, VCore 1.206 V, LLC level 1, CPU VRM SF 400 kHz;
Low-score settings: 104.8 MHz BCLK x43.5, VCore 1.175 V, LLC level 2, CPU VRM SF 400 kHz;
Scythe air tower cooler, 120 mm fan @ 1250 rpm
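For reference, the effective core clock from those two profiles is just BCLK times multiplier; a quick sketch of the arithmetic (function name is mine):

```python
# Effective core clock = BCLK x multiplier. Raising BCLK from the stock
# 100 MHz scales the whole frequency range, which is how a part without
# normal multiplier overclocking can still be nudged upward.
def core_clock_mhz(bclk_mhz: float, multiplier: float) -> float:
    return round(bclk_mhz * multiplier, 1)

print(core_clock_mhz(104.8, 44.0))  # high-score profile: 4611.2 MHz
print(core_clock_mhz(104.8, 43.5))  # low-score profile: 4558.8 MHz
```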


----------



## QuietBob (Jul 6, 2022)

Tomorrow said:


> Maximum boost clock on 5800X3D can't reach 4500 or 4550 even when idling around 40c with a 420mm AIO. I know because im cooling it with that. This is clearly issue with BIOS settings, even default ones - not boosting to a maximum in lightly threaded workloads.


The current AGESA version is most likely the culprit. Many users reported lower boost clocks with the 1.2.0.7.



RyzenBurner said:


> Ive also now played with PBO2 Tuner with all cores down to -40, and interestingly that got my all core boost up to ~4375 to 4400


This suggests you are limited by your cooling solution. The way the boost works, you will see higher clocks with lower temperatures. What do you use for cooling?

Also guys, you might get more responses if you move this conversation to the Ryzen owners thread.


----------



## fusseli (Aug 20, 2022)

I like how the 10700K is still as fast as this new release for 4K gaming. No need to upgrade CPUs too often.


----------



## neatfeatguy (Aug 21, 2022)

fusseli said:


> I like how 10700k is still as fast as this new release for 4k gaming. No need to upgrade cpus too often.



Most CPUs have the same or near identical performance for 4K gaming........so, what does that tell you?


----------



## GreiverBlade (Aug 21, 2022)

neatfeatguy said:


> Most CPUs have the same or near identical performance for 4K gaming........so, what does that tell you?


yep, that's like me if i was saying "glad my R5 3600 can beat a 10700K in SotTR 4K, heck it's even faster than the 12700K and 10900K." (by 0.1/0.2 fps since the 3600X at stock is nearly identical but the GPU is the deciding factor... shocker? right? )

ah, drat, i play in 3k


----------



## AusWolf (Aug 22, 2022)

neatfeatguy said:


> Most CPUs have the same or near identical performance for 4K gaming........so, what does that tell you?


Even at 1080p, it doesn't make any difference whether you have 200 FPS with a cheaper CPU or 250 with a more expensive one. Higher-end parts above a certain level, solely for gaming, are nothing more than wasted money.


----------



## Mussels (Aug 22, 2022)

fusseli said:


> I like how 10700k is still as fast as this new release for 4k gaming. No need to upgrade cpus too often.


Well duh, might as well stay with a 6700K






For real, this is something that's always misunderstood: your GPU and CPU both need to be capable of reaching higher FPS goals.
You can always turn down GPU-related settings, but you can rarely reduce a game's CPU requirements.

If your GPU can't keep up with high framerates at your resolution for any reason (high res, RTX, blah blah), then the required CPU power is really, really low.

On the flip side, if you do low-res high-refresh-rate gaming for esports or something... oh wait, most of them are fine too.







A modern CPU does help maintain that FPS consistently, but unless you're running a 240 Hz display you won't see much difference between any modern CPUs.

(There are titles that show greater differences; 3700X to 5800X moved me from ~130 FPS to ~160 FPS in a lot of DX12 titles on average. With a 165 Hz display that was worth it, but with a 120 Hz one? Not at all.)


----------



## ratirt (Aug 22, 2022)

Mussels said:


> Well duh, might as well stay with a 6700K
> 
> View attachment 258998
> 
> ...


True to that. I had a similar case: a 144 Hz 4K display with a 6900 XT and my 2700K. Moving to a 5800X made a huge difference in certain games I played, even at 4K (Euro Truck Sim, for instance).

Sometimes people are so fixated on one thing, CPU or GPU, that they forget these work together; you need to make sure an upgrade to either makes sense, considering whether the other one will be able to keep up. It is best to get as much out of your setup as possible. I had to change my CPU to achieve that.


----------



## GreiverBlade (Aug 22, 2022)

ratirt said:


> True to that. I had a similar case. 144hz 4k display with a 6900xt with my 2700K. Moving to a 5800x was a huge difference in certain games I played even at 4k. (Euro truck sim for instance)
> 
> Sometimes people are so fixated at one thing CPU or GPU and forget that these are working together and you need to make sure it does make sense for you to upgrade either of them considering the other one will be able to keep up. It is best to take as much of your set up as possible. I had to change my CPU to achieve that.


Usually RTSes are CPU-heavy, thus a CPU upgrade will make a difference.
Hence why my R5 3600 and my RX 6700 XT were huge upgrades @1620p60 versus an i5 6600K and a 1070, and are still hugely relevant even versus newer gens, CPU-wise mostly. Although I still plan to get either a 5700X or 5800X3D later (I could even go for a 5900X/5950X, as the 5900X is often seen under 500 CHF at the moment and the 5950X is priced like the 5800X3D for me); the 5700X is closer to 250 CHF now... If I wait for the 7XX0 launch, dunno; if they are still stocking them I might even see lower prices and still get a good upgrade.


----------



## ratirt (Aug 22, 2022)

GreiverBlade said:


> usually RTS are CPU heavy, thus CPU upgrade will make a difference
> hence why my R5 3600, or my RX 6700 XT were huge upgrades @1620p60 versus a i5 6600K and a 1070, and are still hugely relevant even versus newer gen, CPU wise mostly, although i still plan to get either a 5700X or 5800X3D later (i could even go for a 5900X/5950X as they are often seen under 500chf atm for the 5900X and the 5950X is priced like the 5800X3D for me) the 5700X is closer to 250chf now ... if i wait 7XX0 launch dunno, if they are still stocking them i might even see lower price and still get a good upgrade


I'd wait for the 7000 series. Once these hit the market, you can either wait and see what the new CPUs offer and buy if you are satisfied, or buy a 5000-series chip, the 3D chip for instance. I'm sure the price of the 5000 series will drop even more once the 7000 series is out.


----------



## GreiverBlade (Aug 22, 2022)

ratirt said:


> I'd wait for the 7000 series. If these hit the market, you can either wait and see what the new CPUs offer and buy if you are satisfied or by 5000 series the 3D chip for instance. I'm sure the price for 5000 series will drop even more if the 7000 series are out.


Well, I got the 3600 for free and found a good B550 for 99 CHF, thus a 5XXX will be good enough, even against 7XXX and the next Intel gen.

As I said, the 5900X and 5950X did drop quite a bit, and given next gen's probable pricing I would be crazy to pass on sub-500 CHF top dogs; the 5700X for ~250 is the cheaper option at the moment (and it dropped to that from 319 at launch).

(I spent at least 6 years with the 6600K and the GTX 1070.)


----------



## Mussels (Aug 29, 2022)

GreiverBlade said:


> usually RTS are CPU heavy, thus CPU upgrade will make a difference
> hence why my R5 3600, or my RX 6700 XT were huge upgrades @1620p60 versus a i5 6600K and a 1070, and are still hugely relevant even versus newer gen, CPU wise mostly, although i still plan to get either a 5700X or 5800X3D later (i could even go for a 5900X/5950X as they are often seen under 500chf atm for the 5900X and the 5950X is priced like the 5800X3D for me) the 5700X is closer to 250chf now ... if i wait 7XX0 launch dunno, if they are still stocking them i might even see lower price and still get a good upgrade


RTSes are usually single-thread heavy.
Gah, StarCraft II still sucks to this very day because of its single-threaded nature, and even RTS games with heavy multi-threading (SupCom: FA) have issues with AI pathing (needs mods to avoid it).

I do love that we don't need the top-tier CPUs, or even the latest-gen CPUs, at the moment, with the exception of the very best GPUs.


----------



## GreiverBlade (Aug 29, 2022)

Mussels said:


> RTS are usually single threaded heavy
> Gah, starcraft II still sucks to this very day because of its single threaded nature, and even RTS games with heavy multi threading (Supcom: FA) have issues with AI pathing (needs mods to avoid it)
> 
> I do love that we don't need the top tier CPUs or even the latest gen CPUs at the moment, with the exception of the very best GPUs.


I said CPU-heavy (which is the opposite of "GPU-heavy")... not multithread-heavy; thus, even with one core used, a CPU gen upgrade will make a difference.

Yeah, SCII is... hum... special (not that I have any issues playing it, but the CPU's single-threaded usage is hilarious).


----------



## Mussels (Aug 29, 2022)

GreiverBlade said:


> i said CPU heavy (which is the opposed of "GPU heavy").... not multithread heavy, thus a even with one core used a CPU gen upgrade will make a difference.
> 
> yeah SCII is ... hum ... special (not that i have any issues playing it, but the CPU single threaded usage is hilarious)


Yeah, but ST-heavy is like... 6.25% in Task Manager with 16 threads. That doesn't feel CPU-heavy.

(Merely being pedantic about wording)


----------



## GreiverBlade (Aug 29, 2022)

Ah, so to make it really CPU-heavy I would have to deactivate 5 cores out of 6 (and SMT) on my R5 3600 then... oh drat... it's going to be annoying to play SCII now.

AH! Whatever! CPU vs GPU, not ST vs MT! (joke)

Well, I am also glad not to need a top-tier CPU, because my 3600 is already quite awesome and enough for all I do for now.


----------



## AusWolf (Aug 29, 2022)

GreiverBlade said:


> I said CPU heavy (which is the opposite of "GPU heavy")... not multithread heavy, so even with only one core used, a CPU gen upgrade will make a difference.


Well, that depends whether you call having 150 fps instead of 120 a difference. Personally, I don't. 



GreiverBlade said:


> well i am also glad not needing top tier CPU because my 3600 is already quite awesome and enough for all i do for now


Same here with my 11700. When I read news about Zen 4, I'm tempted like any of us, but I know that I don't need it. I've already had a 5950X once, and I downgraded because it was a waste of money.


----------



## InVasMani (Aug 29, 2022)

In the case of an ST-heavy game, assign it to the last core with *imagecfg*; the other remaining cores should soak up more of the background scheduling CPU utilization. You might want to assign to the last core of a CCD (or chip die) in scenarios where latency can come into play. It works well to permanently assign thread focus and priority to a core or set of cores, though.
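A hedged sketch of what that kind of pinning boils down to: affinity is just a bitmask of allowed logical CPUs, and the core indices below are hypothetical (assuming an 8-core/16-thread chip where the last physical core exposes logical CPUs 14-15 as its SMT pair).

```python
# Affinity is a bitfield: bit N set = logical CPU N allowed.
# Tools like imagecfg (or `start /affinity`) pass a mask like the
# ones computed here to restrict a process to chosen cores.

def core_mask(cpus):
    """Build an affinity mask allowing exactly these logical CPUs."""
    mask = 0
    for c in cpus:
        mask |= 1 << c
    return mask

# Hypothetical 8C/16T part: the last physical core is logical CPUs 14-15.
game_mask = core_mask([14, 15])
background_mask = core_mask(range(0, 14))  # everything else

print(hex(game_mask))        # 0xc000
print(hex(background_mask))  # 0x3fff
```

With the game held to `game_mask`, background work naturally lands on the remaining cores, which is the effect described above.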


----------



## GreiverBlade (Aug 29, 2022)

AusWolf said:


> Well, that depends whether you call having 150 fps instead of 120 a difference. Personally, I don't.


Neither do I  I still remember when Intel got back the gaming crown by a few fps (and not 30); since then I have abs of steel from all the laughing.

It sparked some funny moments, like when a friend showed me his i5-11600K + RTX 3070, a combo he paid 300 for the CPU at launch (275$ for an 11600K? AH! welcome to Switzerland) and 750 for the GPU. At 1440p in various games, the differences versus my current configuration were at most 23.3fps, and he kept hammering me that my rig was vastly inferior due to being "all red crap"

I laughed internally all day long. I did not tell him that the R5 3600 was 2 years older than the i5-11600K (or that it cost 121chf less, and the GPU 300 less ); I did not want to hurt him  But I did give him the "inferior" call right back on RTRT, since my RX 6700 XT has a higher performance loss when it is on... (although neither he nor I really use RT in games   )

After the "showdown" I bought him a beer (as a peace offering  ) and proceeded to carry on with my "red crap"


----------



## Mussels (Oct 12, 2022)

Just got mine in.
-30 on the optimiser in the BIOS, unsure if it's actually applying at this stage
PBO set to motherboard, again - if it does anything

2.7% faster than w1zz result 
4.45GHz all core AVX, 117W, 77C max






Like others, I can undervolt, but it harms performance eventually.

-50mV gives me stock performance with 20W and 7.3°C less




-100 dropped temps massively, but performance dropped to about 14,000.

For a gaming system, does it matter if you lose 1% MT for better temps and sustained ST boosts? Absolutely do it.


----------



## puma99dk| (Oct 12, 2022)

Mussels said:


> Just got mine in.
> -30 on the optimiser in the BIOS, unsure if it's actually applying at this stage
> PBO set to motherboard, again - if it does anything
> 
> ...


Looks like I've got some work to do when electricity doesn't cost me £0.60-0.95 per kWh 

Also waiting on 1usmus to release his Hydra software.


----------



## VulkanBros (Oct 12, 2022)

So - performance wise (and, as I live in the same country as puma99dk, power consumption wise), it would not make sense to replace the 5800X with the X3D version - correct?


----------



## Chomiq (Oct 12, 2022)

VulkanBros said:


> So - performance wise (and, as I live in the same country as puma99dk, power consumption wise), it would not make sense, to replace the 5800X with the X3D version - correct?


With your 2070? No.


----------



## Mussels (Oct 12, 2022)

Oooh my IMC looks nice too





VulkanBros said:


> So - performance wise (and, as I live in the same country as puma99dk, power consumption wise), it would not make sense, to replace the 5800X with the X3D version - correct?


It's not something you'll see gains in if you're GPU limited. If you run DLSS, or games that are CPU limited? Yes.

As an example, your frametimes might be a lot smoother with less microstutter... not that I ever had any on my 5800X outside of game bugs.

Oh, and it seems that with zero effort, the RAM that only ran at 3800 on my 5800X will now do 4000 1:1 on my X3D.
(No, the timings and latencies are not great here - they're at loose values because, duh, 4000MT/s)


I am pleased.






VulkanBros said:


> So - performance wise (and, as I live in the same country as puma99dk, power consumption wise), it would not make sense, to replace the 5800X with the X3D version - correct?


the 5800X3D uses *less* power in gaming than the 5800X - about 10W, or 17% lower, if my math is correct

TPU's 7700x review has some new testing methods and fancy graphs:
AMD Ryzen 7 7700X Review - The Best Zen 4 for Gaming - Power Consumption & Efficiency | TechPowerUp


----------



## puma99dk| (Oct 12, 2022)

Mussels said:


> Oooh my IMC looks nice too
> View attachment 265113
> 
> It's not something you'll see gains in, if you were GPU limited. If you run DLSS or games that have CPU limits? yes.
> ...


The price for the Ryzen 7 7700X, which I really wanted, was too much once you add the outrageous motherboard cost and DDR5 EXPO memory. That's why I got drawn to the Ryzen 7 5800X3D; I got a good Asus ROG Crosshair VIII Dark Hero with one year of warranty left for half the price of a new board, and I'm reusing my DDR4 RAM, so it was a win for me power-wise too.


----------



## Taraquin (Oct 12, 2022)

Mussels said:


> Oooh my IMC looks nice too
> View attachment 265113
> 
> It's not something you'll see gains in, if you were GPU limited. If you run DLSS or games that have CPU limits? yes.
> ...


No WHEA 19 errors? Unless you must run ridiculous voltages on SOC/IOD or raise VDD18, 2000 FCLK is a good deal


----------



## Tomorrow (Oct 12, 2022)

Mussels said:


> View attachment 265114


You should get some better RAM. Running 1:1:1 is useless if your latency is 69ns (no, not nice) at CL18.


----------



## dgianstefani (Oct 12, 2022)

Mussels said:


> Oooh my IMC looks nice too
> View attachment 265113
> 
> It's not something you'll see gains in, if you were GPU limited. If you run DLSS or games that have CPU limits? yes.
> ...


Bruh get some dual rank B die. 

You can cut your latency by 15ns.

RAM still matters with the 5800X3D; it's just the new cause of stutter. Before, the overall latency was the issue, so consistently lower fps etc.; now RAM is the bottleneck, due to the massive difference between cache and memory access times whenever games spill out of the cache.

E.g. a 5800X averages 100 with lows of 50; better RAM can increase both of those.
A 5800X3D averages 140 with lows of 90; better RAM can increase those lows to around 100/110.


----------



## puma99dk| (Oct 12, 2022)

I am wondering if running my 5800X3D at 4.5GHz all-core with -0.100V (HWiNFO64 shows CPU package power at max 82.702W) is bad, when it originally uses 103.090W at default.

Running Cinebench R23, I originally scored 13883 pts and I am down to 11962 pts, which if I can count is about a 13.8% performance loss for roughly 19.8% power savings; not sure if I am totally off, I am not good at maths  
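For what it's worth, the percentages come out like this if both deltas are taken relative to stock; a quick check using the scores and wattages quoted above:

```python
# Stock vs undervolted Cinebench R23 run, both deltas relative to stock.
stock_score, uv_score = 13883, 11962
stock_watts, uv_watts = 103.090, 82.702

perf_loss   = (stock_score - uv_score) / stock_score * 100
power_saved = (stock_watts - uv_watts) / stock_watts * 100
print(f"{perf_loss:.1f}% slower for {power_saved:.1f}% less power")
# -> 13.8% slower for 19.8% less power

# Efficiency (points per watt) still improves despite the lower score:
print(f"{stock_score / stock_watts:.1f} vs {uv_score / uv_watts:.1f} pts/W")
# -> 134.7 vs 144.6 pts/W
```

So the undervolt trades raw score for a clear gain in points per watt.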






So what do people say good or bad?


----------



## gffermari (Oct 12, 2022)

-100 is too much, and you lose performance.
-25 to -50 is OK, but it depends on the CPU silicon. Then you should score 14-15K.

The 5800X3D is already extremely efficient at gaming. There's no point in limiting the CPU power/voltage as long as the temps are reasonable.


----------



## Mussels (Oct 13, 2022)

puma99dk| said:


> The price for the Ryzen 7 7700X which I really wanted was too much adding the outrageous motherboard cost and DDR5 Expo memory that's why I got drawn to the Ryzen 7 5800X3D and I got a good Asus ROG Crosshair VIII Dark Hero with one year warranty left with half the price of a new board and reusing my DDR4 ram so it was a win for me also power wise.


I just got my X3D in and I'm fine-tuning the PBO and voltage settings.

Lower wattage and higher performance is what an upgrade should be, and I can consider this a finalised DDR4 system.



puma99dk| said:


> I am wondering if running my 5800X3D at 4.5GHz all-core with -0.100V (HWiNFO64 shows CPU package power at max 82.702W) is bad, when it originally uses 103.090W at default.
> 
> Running Cinebench R23, I originally scored 13883 pts and I am down to 11962 pts, which if I can count is about a 13.8% performance loss for roughly 19.8% power savings; not sure if I am totally off, I am not good at maths
> 
> ...


I get very similar results; I'm testing -75 right now.

It can scale from 80W to 120W depending on the chosen settings. You can also set a PPT limit to cap the wattage at, say, 105W while keeping most of the performance. I've found 3 different locations to enter it in my BIOS; even boards with the setting removed tend to have it hidden away in the generic AMD BIOS menus.

Cross posting with the zen garden thread:

Gaming results:
I reset HWinfo after I was already in game and took the screenshot before quitting so the average stats here are true gameplay averages, no idle time.

80 minutes of DRG, 4K ultra 140FPS (DX12, Unreal Engine 4)
Peak system wattage of 380W, 32" monitor included.
CPU peaked at 63C, 69W
CPU averaged 48C 56W
3090 was in the 200-250W range (seriously, the gains from it going to the 375W limit are totally not worth it)

It's the little things, like thread usage averaging 3 threads (5 at most), that show 6-core CPUs are definitely still viable for gaming.


----------



## puma99dk| (Oct 13, 2022)

Mussels said:


> I just got my x3D in and i'm fine tuning the PBO and voltage settings
> 
> Lower wattage and higher performance is what an upgrade should be, and i can consider this a finalised DDR4 system
> 
> ...



Well, my memory only runs at 3000MHz; it's a mix of a 3000MHz kit and a 4400MHz Samsung B-Die kit, and the 3000MHz kit doesn't work great at 3600MHz.


----------



## Mussels (Oct 15, 2022)

puma99dk| said:


> Well, my memory only runs at 3000MHz; it's a mix of a 3000MHz kit and a 4400MHz Samsung B-Die kit, and the 3000MHz kit doesn't work great at 3600MHz.


Mixed memory makes it more fun, but every DDR4 platform I've used has had issues with more memory ranks.

IMO it's the reason why Intel locked the early DDR4 platforms down to 2133: it let them cut costs on the boards, instead of doing what AMD did and hoping end users would be understanding of problems if they tried to run it too fast...


----------



## puma99dk| (Oct 15, 2022)

Mussels said:


> mixed memory makes it more fun, but every platform on DDR4 i've used has had issues with more memory ranks.
> 
> IMO it's the reason why intel locked the early DDR4 platforms so low to 2133, to let them cut costs on the boards instead of what AMD did hoping that end users would be understanding of problems if they tried to run it too fast...



I noticed that Asus actually accomplishes this task with mixed memory far better than MSI.
MSI also works, whereas Gigabyte is the worst and blames the memory vendor; if a vendor says some RAM works on their board and the customer has issues, it's like Gigabyte wants the memory vendor to write the BIOS for the Gigabyte motherboard.

From my testing with my current DDR4, my ranking of motherboard vendors goes like this:
1. Asus (B550/X570)
2. AsRock (Z370/Z390)
3. MSI (B450), which could share a place with AsRock

And at the bottom is Gigabyte, because even when I tested my previous Gigabyte Z590 Vision G with compatible RAM I borrowed, it wouldn't run XMP, and Kingston ValueRAM ran best at 2133MHz. Maybe I was just unlucky, but in the future I would go with another brand over visuals; as nice as Gigabyte's features can be, in the visual department too, I do not want to face their support again with anything.


----------



## Taraquin (Oct 17, 2022)

puma99dk| said:


> Well, my memory only runs at 3000MHz; it's a mix of a 3000MHz kit and a 4400MHz Samsung B-Die kit, and the 3000MHz kit doesn't work great at 3600MHz.


I would unmix and go B-die only; tuned, it can get you up to 20% in some games vs 3000 XMP.


----------



## Mussels (Oct 18, 2022)

Taraquin said:


> I would unmix and go B-die only; tuned, it can get you up to 20% in some games vs 3000 XMP.


You get free performance just from having four ranks, so it can end up pretty even


----------



## Taraquin (Oct 19, 2022)

Mussels said:


> You get free performance just having four ranks, it can end up pretty even


Maybe; it depends on what die you get. If the 3000 XMP kit is Hynix C/D or Micron E/B, it will be close; if it's Hynix AFR, Samsung C, E etc., then no way  Once tuned, that is!


----------



## puma99dk| (Oct 19, 2022)

Taraquin said:


> Maybe; it depends on what die you get. If the 3000 XMP kit is Hynix C/D or Micron E/B, it will be close; if it's Hynix AFR, Samsung C, E etc., then no way  Once tuned, that is!



Most memory controllers in Ryzen 5000 only support up to 3600MHz while staying 1:1 with the Infinity Fabric.

The Hynix dies on my 3000MT/s kit are a pain to set up for anything other than XMP. Yes, I tried, but lately I don't have the time, plus electricity is expensive, so I prefer to use my time elsewhere.

Plus I need more than 16GB of memory.
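For context on the 1:1 point: DDR4 is double data rate, so the memory clock is half the MT/s figure, and "1:1" means the Infinity Fabric clock (FCLK) matches that memory clock. A minimal sketch:

```python
def fclk_for_1to1(mt_per_s):
    """FCLK (MHz) needed to stay 1:1 with a given DDR4 transfer rate."""
    return mt_per_s / 2  # DDR moves data twice per memory clock

print(fclk_for_1to1(3600))  # 1800.0 -> the typical Zen 3 sweet spot
print(fclk_for_1to1(4000))  # 2000.0 -> needs an unusually good IMC/fabric
```

This is why 3600MT/s (1800MHz FCLK) is the common ceiling, while the 4000MT/s 1:1 mentioned earlier in the thread is a lucky sample.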


----------



## Mussels (Oct 19, 2022)

Taraquin said:


> Maybe, depends on what die you get, if 3000 xmp is Hynix C/D or Micron E/B then it will be close, if they are Hynix AFR, Samsung C, E etc then no way  Once tuned that is!


The 1usmus TPU article kinda says otherwise, and it was done all the way back in 2019.

You can pick just about any benchmark in there; some things like latency are obviously impacted by clock speed alone, but as far as gaming is concerned, you get quite the boost from adding more ranks.


Even if you stay with just Samsung, 3200CL14 dual rank outperforms everything above it.
I'm not 100% sure what "multi rank" is vs dual rank; I think it may mean mixing a single- and a dual-rank stick. For this discussion, focus on the SR and DR results.






Cropping the relevant ones from that for a side-by-side:
That's CL12 vs CL14 - and the CL14 setup wins thanks to the extra ranks.








Yes, tuned RAM is faster - it's fantastic.

But the advantage of dual rank definitely makes up a lot of that ground:
3600C14 SR 204FPS











There are oddities and outliers, since these weren't tested as extensively as a normal TPU review; I'm just sharing the knowledge that dual-rank memory (or 4 sticks), while it can hurt max clock speeds, genuinely *is* a big performance boost on AM4.
These tests are on Ryzen 2000; the gains got bigger on Zen 3 according to a lot of other reviews out there.


GN covered it as well, with 10% gains on a 5600x









They ran their review setup (4x8 3200C14) as well as a bunch of other setups; I'll paste the relevant ones near each other, since the visuals are a mess otherwise.

Their review setup, 4x8 vs 2x8:
A 15FPS drop from removing two sticks, no other changes.








3800 C18 (yay, it's me):
An 11FPS drop.








Clearly, the tuned great-timings RAM is faster - but average-speed RAM with four ranks is going to beat tuned RAM with two ranks.


They show this in their more confusing all-together graph: looking at the bottom results, they're all 2-stick/2-rank memory setups.

The worst-performing setup (sigh, the one closest to mine) beat the best-performing one, as long as it had the extra memory ranks.


----------



## Taraquin (Oct 19, 2022)

Mussels said:


> the 1usmus TPU article kinda says otherwise, and it was done all the way back in 2019
> 
> You can pick just about any benchmark on here, some things like latency are obviously impacted by just clock speeds, but as far as gaming is concerned you get quite the boost from adding more ranks
> 
> ...


I don't disagree with you; especially if running XMP, DR will be significantly faster in most cases. But if you mix kits and one kit is rather poor, a tuned set of B-die will beat it easily, even single-rank B-die vs a poor DR kit. Things like running tRFC at 240 vs 600, tRC at 40 vs 65, tFAW at 16 vs 28 etc. have a very high impact on performance. The tRFC portion alone can account for 5% performance, and that is usually what DR vs SR achieves.
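Converting those cycle counts into nanoseconds shows why tRFC dominates; a rough sketch, assuming a 3600MT/s kit:

```python
def cycles_to_ns(cycles, mt_per_s):
    """Convert a memory timing in clock cycles to nanoseconds."""
    memclk_mhz = mt_per_s / 2        # DDR: clock is half the data rate
    return cycles / memclk_mhz * 1e3

tight = cycles_to_ns(240, 3600)  # tight B-die tRFC
loose = cycles_to_ns(600, 3600)  # loose XMP-ish tRFC
print(f"tRFC 240 = {tight:.0f} ns, tRFC 600 = {loose:.0f} ns")
# -> tRFC 240 = 133 ns, tRFC 600 = 333 ns
```

So the DIMM is unavailable roughly 2.5x longer during every refresh on the loose setting, which is where that ~5% comes from.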


----------



## Mussels (Oct 19, 2022)

Mixing kits is bad, agreed.
I just mean that when buying, 4x8 of cheapo 3600 C18 RAM is going to work better than an expensive 2x8 3200C14 kit.

You're better off filling those ranks to get 32GB, and tuning that.


I got my 64GB of 3600 C18 cheaper than I could have got 16GB of 3200C14


----------



## gffermari (Oct 21, 2022)

Can someone explain this?


----------



## Mussels (Oct 21, 2022)

gffermari said:


> View attachment 266461
> Can someone explain this?


Explain what? The 7000 series are faster CPUs.


----------



## gffermari (Oct 21, 2022)

They are. No one disagrees with that.

I meant: why does the 4090 perform the same as the 3090Ti when using the 3D?


----------



## Tomorrow (Oct 21, 2022)

gffermari said:


> They are. No one disagrees with that.
> 
> I meant why the 4090 performs the same to 3090Ti when using the 3D.


CPU bottleneck. The 4090 requires a fast CPU at 1080p (looking at these numbers, I assume this is 1080p).
It's sometimes CPU bottlenecked even at 1440p.


----------



## gffermari (Oct 21, 2022)

Yes, the magic cache doesn't help in CS:GO.
The 3D performs similarly to the normal 5000-series CPUs.


----------



## xorbe (Oct 21, 2022)

_Less than 800 fps CS:GO, literally unplayable!_


----------



## marios15 (Oct 21, 2022)

Once you get into hundreds of frames per second, what matters most is instruction and cache latency in nanoseconds, plus VRAM latency/bandwidth; cache size will not help beyond a certain point.
You probably have a loop of 50-100-200 instructions repeating at that point; if you increase the throughput of one of them, another becomes the bottleneck. 

If you increase clock speed, frames still need the same number of cycles to complete, but more cycles are completed per second = higher FPS
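The clock-speed point can be sketched numerically; the cycles-per-frame figure below is made up purely for illustration:

```python
def fps(cycles_per_frame, clock_hz):
    """Frames per second when each frame costs a fixed cycle budget."""
    return clock_hz / cycles_per_frame

CYCLES = 5.6e6  # hypothetical cycle cost of the hot loop per frame
print(round(fps(CYCLES, 4.5e9)))  # ~804 FPS at 4.5 GHz
print(round(fps(CYCLES, 5.5e9)))  # ~982 FPS at 5.5 GHz: same cycles
                                  # per frame, more cycles per second
```

That is why a higher-clocked chip keeps winning these engine-limited benchmarks even when extra cache does nothing.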


----------



## Badelhas (Nov 6, 2022)

Hi Guys, I own a ryzen 3600, a MSI b450 tomahawk max and a Nvidia 3060ti. I play games at 1440p. 
Will I see significant performance gains if I upgrade to the 5800x3d or just isn't worth it? 
Anyone here made the same upgrade? 
Cheers


----------



## puma99dk| (Nov 6, 2022)

Badelhas said:


> Hi Guys, I own a ryzen 3600, a MSI b450 tomahawk max and a Nvidia 3060ti. I play games at 1440p.
> Will I see significant performance gains if I upgrade to the 5800x3d or just isn't worth it?
> Anyone here made the same upgrade?
> Cheers



Depends on what games you're playing, because on average there should be a difference of about 11% in fps going from a 3600X to a normal 5800X.

What is the rest of your system?

You will still miss out on PCI-E 4.0 with the B450 chipset, even though it's a fine board.


----------



## Badelhas (Nov 6, 2022)

puma99dk| said:


> Depends, what games are you gaming? Because there should be a difference on average of about 11% in fps going from a 3600X to a normal 5800X.
> 
> What is the rest of your system?
> 
> You will still miss out on PCI-E 4.0 with the B450 chipset even it's a fine board.


Thanks for your answers.
I've been playing Metro Exodus, Death Stranding, Cyberpunk 2077, Doom Eternal, Half Life: Alyx.
Regarding PCI-E 4.0 there's no problem because I own PCI-E 3.0 SSDs only and don't see any need to upgrade for my kind of use.


----------



## puma99dk| (Nov 6, 2022)

Badelhas said:


> Thanks for your answers.
> I've been playing Metro Exodus, Death Stranding, Cyberpunk 2077, Doom Eternal, Half Life: Alyx.
> Regarding PCI-E 4.0 there's no problem because I own PCI-E 3.0 SSDs only and don't see any need to upgrade for my kind of use.



I wasn't thinking so much about SSDs here; I know graphics cards lose about 5% at most on PCI-E 3.0. It's just that your system specs looked outdated, I guess.

I am running a Gigabyte M30 1TB NVMe SSD that's only PCI-E 3.0 x4 as a boot drive, and I use my Sabrent Rocket 4.0 2TB as a game library.

VR can be CPU heavy, and Cyberpunk too, but have you tried the MSI overlay via RivaTuner to see which component maxes out at 99-100% in games? It doesn't have to be the CPU holding you back, though for VR I guess you'd want something better than the GTX 1070 your specs state.


----------



## gffermari (Nov 6, 2022)

Badelhas said:


> Thanks for your answers.
> I've been playing Metro Exodus, Death Stranding, Cyberpunk 2077, Doom Eternal, Half Life: Alyx.
> Regarding PCI-E 4.0 there's no problem because I own PCI-E 3.0 SSDs only and don't see any need to upgrade for my kind of use.











From Ryzen 5 3600 to 5800X3D: The Big Upgrade
"Going from user feedback, it sounds like many are planning a final AM4 upgrade and you may be most interested in going all out on the 5800X3D,..."
www.techspot.com




Yes, there will be a difference, but it won't be noticeable everywhere, since you play at 1440p with a 3060Ti.
Still, the 3D will boost the fps your GPU can push.


----------



## Badelhas (Nov 6, 2022)

puma99dk| said:


> Not thinking much about SSD's here and i know that graphics cards loose about 5% at max but also gpu and so on because your system specs are out dated I guess.
> 
> I am running a Gigabyte M30 1TB NVME SSD that's only PCI-E 3.0 x4 as a boot drive and I use my Sabrent Rocket 4.0 2TB as a game library.
> 
> VR can be cpu heavy and Cyberpunk too but have you tried the MSI overlay via Rivatuner to see which maxes out at 99-100% doing games because it doesn't have to be CPU that holding you back but I guess for VR you would run a better then GTX 1070 as you state.


Sorry, my specs were outdated in my TechPowerUp forum preferences. 
My setup is an MSI B450 Tomahawk Max, Ryzen 5 3600, 16GB DDR4 3600 CL16, a Samsung 970 Evo NVMe 1TB (OS and part of the Steam library) and a Samsung 850 Evo (the rest of the Steam library), a Qnix 27-inch IPS 1440p @90Hz and a 23-inch 1080p @144Hz LCD, an Asus Xonar audio card, etc. 
I'm pretty happy with the performance, but I keep reading articles stating that the best-value upgrade I could make at the moment is the 5800X3D, instead of buying a new motherboard, DDR5 and a new CPU - especially if I can find a 5800X3D for 350€. But the articles also say it matters most for those who play at 1080p. 
What are your thoughts on this? 
Thanks for your input


----------



## gffermari (Nov 6, 2022)

In 1080p, or in CPU-bound games, you will see the biggest improvement.
But there is an improvement at all resolutions, even 4K; it depends on the GPU.

Your GPU is quite good, so I would personally make the move to the 3D.

((Basically, I already did it, moving from a 3700X to a 5800X3D with a similar GPU))


----------



## Mussels (Nov 7, 2022)

Badelhas said:


> Hi Guys, I own a ryzen 3600, a MSI b450 tomahawk max and a Nvidia 3060ti. I play games at 1440p.
> Will I see significant performance gains if I upgrade to the 5800x3d or just isn't worth it?
> Anyone here made the same upgrade?
> Cheers


Look at the latest TPU article, where a 5800X3D kept up with a 13900K except at low res with a 4090:

RTX 4090 & 53 Games: Ryzen 7 5800X vs Ryzen 7 5800X3D Review | TechPowerUp


Is it more CPU power than your GPU can use? Yes.
(Lower settings and DLSS change this: a 3060Ti with DLSS on is going to be faster than a 3070 without it, and will show benefits from the faster CPU.)
But that just means you can reuse it for many years to come, since it's low wattage, guaranteed to have compatible boards and RAM for sale for years, and will work with any future GPU.


----------



## Badelhas (Nov 8, 2022)

Mussels said:


> Look at the latest TPU article where a 5800x3D kept up with a 13900k, except at low res with a 4090
> 
> RTX 4090 & 53 Games: Ryzen 7 5800X vs Ryzen 7 5800X3D Review | TechPowerUp
> 
> ...


Thanks for the tips. I'll keep an eye on prices, and when I see the 5800X3D at less than 350 euros I'll buy it and sell my 3600. I feel like I don't have to rush it; prices will tend to decline, right? 
Cheers


----------



## Mussels (Nov 8, 2022)

Badelhas said:


> Thanks for the tips. I'll keep an eye on prices and when I see the 5800X3D at less than 350 euros I'll buy it and sell my 3600. I feel like I don't have to rush it, prices will tend to decline, right.
> Cheers


Prices will go up and down repeatedly over time as the process matures - it's a popular CPU, but they don't want to flood the market with it either.

Remember that even a plain old 5600 (no X or G) is going to be faster for gaming than your 3600, by quite a margin.


----------

