# RTX 4090 & 53 Games: Ryzen 7 5800X vs Ryzen 7 5800X3D



## W1zzard (Oct 31, 2022)

How much performance can you gain with the Ryzen 7 5800X3D over the non-3D V-Cache Ryzen 7 5800X when using the mighty GeForce RTX 4090? Our review has the answer: we test 53 games at three resolutions.

*Show full review*


----------



## Daven (Oct 31, 2022)

This means the small selection of games in the CPU review suite is either not large enough or not the right type to show the performance differences. I believe the 5800X3D's standing regressed with the choice of the 12 games in the latest reviews.


----------



## seventy (Oct 31, 2022)

Dual-rank Samsung B-die on the 5800X3D vs "questionable" RAM on the 5800X.
Would really like to see the opposite, i.e. the 5800X with good B-die vs the 5800X3D with trash RAM.

@W1zzard if possible, can you also add 0.1% lows to the graph?


----------



## clopezi (Oct 31, 2022)

Daven said:


> This means the small selection of games in the CPU review suite is not enough or the right type to show the performance differences. I believe the 5800X3D regressed with the choice of the 12 games in the latest reviews.


I think it's more than enough for 99% of people. With those games, you can get a good picture of CPU performance.

If you need a 100-game, pixel-perfect comparison to tell whether one CPU or the other is better for gaming, we can draw two conclusions. First, both CPUs are great. Second, you are demanding a very specific scenario, and it's impossible for media outlets to test 100 games in every single review...


----------



## W1zzard (Oct 31, 2022)

clopezi said:


> and it's impossible for media outlets to test 100 games in every single review


I've been having wet dreams about a 100-game benchmark, but it's kinda difficult to find that many games (that are relevant and not impossible to test due to always-online requirements, etc.)



seventy said:


> if possible, can you also add 0.1% lows to the graph?


I've been thinking about ways to add some kind of min FPS indication, but it's not easy. Definitely not gonna happen for this review or the upcoming 13900K


----------



## Toss (Oct 31, 2022)

That's why I underclocked/undervolted my CPU to 4 GHz while gaming at 4K60. It just can't be a bottleneck with modern GPUs, so why bother?


----------



## Daven (Oct 31, 2022)

clopezi said:


> I think it's more than enough for 99% of the people. With those games, you can do a nice picture of the CPU performance.
> 
> If you need 100 games with pixel perfect versus to test if one or another CPU it's better for gaming, we can get two conclusions. First, both CPU's are great. Two, you are demanding a very specific scenario and it's impossible to media outlets to test 100 games in every single review...


It's a pretty big difference. In the 53-game 12900K review, Intel is 15.7% faster than the 5800X at 1080p. In this review, the 5800X3D is 18.5% faster. But in the latest 13700K review, the 12900K summary shows an 8.3% lead at 1080p versus the 5800X3D. That's quite a lot when it should be a few percent deficit.

But you are right: all CPUs from the last few years do gaming well, and it is hard to pick one over the other, as tech reviewers cannot bench all scenarios. Either way, it looks like the 13900K will have the same 1080p gaming performance as the 5800X3D. I'm guessing that's the last 53-game test before TPU makes a decision on a new test bench.
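Chaining the relative results (a rough sketch; the percentages are the ones quoted above, and compounding figures across different reviews is only an approximation, since the game lists and test setups differ):

```python
# From the earlier 53-game review: 12900K vs 5800X at 1080p
gain_12900k_over_5800x = 1.157   # +15.7%
# From this review: 5800X3D vs 5800X at 1080p
gain_x3d_over_5800x = 1.185      # +18.5%

# Implied 12900K vs 5800X3D, if both comparisons held simultaneously
implied = gain_12900k_over_5800x / gain_x3d_over_5800x - 1
print(f"{implied:+.1%}")         # about -2.4%, i.e. a small deficit
```

That small implied deficit is what makes the 8.3% lead in the 13700K review's summary look inconsistent.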


----------



## seventy (Oct 31, 2022)

W1zzard said:


> I've been thinking about ways how to add some kind of min fps indication, but it's not easy.


Can I ask why? You should have the data already, but right now you're just throwing it away.
Even an identical graph, but with 0.1% lows instead of average FPS, would be better than nothing.
People who are interested in that kind of stuff can easily figure it out; just give us a way to access it, please.
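For anyone unfamiliar with the metric being requested: a rough sketch of how a "0.1% low" can be derived from a per-frame frametime log (definitions vary between capture tools; this version simply averages the slowest 0.1% of frames and converts back to FPS):

```python
def percentile_low_fps(frametimes_ms, pct=0.1):
    """Average FPS over the slowest pct% of frames in a frametime log."""
    worst = sorted(frametimes_ms, reverse=True)      # slowest frames first
    n = max(1, int(len(frametimes_ms) * pct / 100))  # how many frames = pct%
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 10,000 frames: mostly a steady 60 fps with a handful of 30 fps stutters
frames = [16.7] * 9990 + [33.3] * 10
print(round(1000 * len(frames) / sum(frames), 1))    # average fps, ~59.8
print(round(percentile_low_fps(frames, 0.1), 1))     # 0.1% low, ~30.0
```

This is exactly why people want the metric: the average barely moves, while the 0.1% low exposes the stutters.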


----------



## Mats (Oct 31, 2022)

seventy said:


> @W1zzard if possible, can you also add 0.1% lows to the graph?


Given how many games there are, it really needs a separate graph.


----------



## Xuper (Oct 31, 2022)

So far, only these games show less than a 5% difference at 1080p:

AC Valhalla
Control
Devil May cry 5
GTA V
Rage 2
Strange Brigade
Witcher 3

About Witcher 3 and GTA V, I'm quite surprised!

I checked this:

> RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review (www.techpowerup.com)
> We test the NVIDIA GeForce RTX 4090 with 53 games at three resolutions, comparing the AMD Ryzen 7 5800X against the Intel Core i9-12900K. The idea here is to get a feel for how much graphics performance is lost by a weaker processor.

About Devil May Cry 5: what the hell? The 5800X is 59.7% faster than the 12900K?


----------



## GreiverBlade (Oct 31, 2022)

Mmmh, now that I can find a 5800X for 299.90 CHF (about the same price as a 5700X) and the 5800X3D is 409.90 CHF (hey! I said I'd place an order ASAP if it dropped below 450 when it was 449),
given that I play at 1440p/1620p, the 5800X3D is highly tempting and would still cost me way less than a full new Intel or AMD build.
But dang... 300 CHF for a 5800X. I could settle for that one and add 50 CHF for 16 GB more RAM (same sticks as I have, of course) and go to 32 GB.


----------



## rv8000 (Oct 31, 2022)

seventy said:


> Dual rank sams b-die on 5800x3d vs "questionable" ram on 5800x.
> Would really like to see the opposite, ie 5800x with good b-die vs 5800x3d with trash ram.
> 
> @W1zzard if possible, can you also add 0.1% lows to the graph?



This is sort of a good point, but neither CPU should have a different kit in a proper testing environment.

_(Seems like an oversight in the review)_ *Nevermind, looks like a poor decision, as it's pointed out in the test rig descriptions.* From personal testing, higher DDR frequencies and IF clocks have a dramatic impact on minimum frame rates (and thus averages). Both CPUs should have been run at the lower of the two chips' maximum achievable IF clocks.


----------



## LifeOnMars (Oct 31, 2022)

Toss said:


> That's why I underclocked/volted my CPU to 4 GHZ while gaming at 4k60. It just can't be bottlenecked with modern GPU's so why bother?


I run my 5700X in Eco Mode at 4K with 32 GB of slow DDR4 and my 3080 12 GB. No issues, and a nice, cool and quiet gaming experience.


----------



## cowcatwithahat (Oct 31, 2022)

It is scary how fast technology moves now vs the time we were stuck with 4 cores for so damn long.
As a 5800X user, it's hard to justify spending money on a processor again when 90% of its use is gaming & YouTube.
I get it, it's fast! But in most cases, at least for me, anything beyond a steady 120 FPS at any resolution is placebo.
I am amazed by the content you guys post. Thank you!


----------



## Selaya (Oct 31, 2022)

this is kinda cool (story bro) ig but ... we all _know_ 5800x3d is better at gaming. (ig it's telling us which games have cache scaling?)
5800x3d vs 13900k/13700k would've been far more interesting, tbh

EDIT: it's already on the menu i see (read first spam later ig)
however this still feels kinda ... superfluous? could be good for perspective later, ig


----------



## rusTORK (Oct 31, 2022)

Didn't read it yet, but... FINALLY! W1zzard is a true magician!

P.S. I have an R7 5800X and have been thinking about switching to the R7 5800X3D.
P.P.S. Oh... and a Dark Hero too... %)


----------



## dirtyferret (Oct 31, 2022)

W1zzard said:


> I've been having wet dreams about a 100 game benchmark, but kinda difficult to find that many games (that are relevant and not impossible to test due to always-online, etc)


how about a 101-game benchmark on a Tuesday under a waning crescent moon while eating a Ben & Jerry's Half Baked pint of ice cream? Otherwise, what's the point? None of the data would be valid under any other scenario.


----------



## BlueKnight (Oct 31, 2022)

Ten days ago I sold my 5900X for 330 euros to buy a 5800X3D for 405 euros. I'm very satisfied with my choice!


----------



## ARF (Oct 31, 2022)

cowcatwithahat said:


> It is scary how fast the technology moves now vs the time we were stuck with the 4 cores for so damn long.
> As a 5800X user, it is hard to justify spending money again on a processor when 90% of it`s use is gaming & youtube;
> I get it, it`s fast! But in most cases, at least for me, more then a steady 120 FPS at any resolution it is placebo.
> I am amazed of the content you guys post. Thank you!



This is thanks to AMD achieving some kind of miracle with the design of such a potent Zen architecture. It's really fast and competitive, unlike the previous AM3 and AM2 designs.

But speaking of stagnation: you can't imagine how bad it is to walk into one of those big electronics stores and look at large 1080p screens. So ugly: bad colours, huge pixels and the screen-door effect everywhere.
Will we be stuck with 1080p forever?


----------



## Gucky (Oct 31, 2022)

I bought a 5800X3D last week for 349€... my current CPU is slower than the 5800X in the review, but goes for 450€ on eBay.
I already have an AM4 Board with 3200CL14 RAM here.


----------



## Shazamy (Oct 31, 2022)

Thanks so much for another great comparison W1zzard! Always appreciated


----------



## QuietBob (Oct 31, 2022)

Big thanks to @W1zzard for another comprehensive article! I can only imagine the amount of time that went into testing such a huge number of games 

Since the previous article puts the same 5800X against the 12900K, we can draw some more comparisons by looking at the graph data for both:







In terms of average frame rate at 1080p, the 12900K is meaningfully faster than the 5800X3D (>10%) in only 2 of the 53 games tested. In Civilization VI it posts 12% higher fps, and in Cyberpunk 2077, 19% higher. Both titles are known to scale beyond 16 threads, so Intel's advantage here is most likely due to having 50% more threads than AMD.

On the other hand, the 5800X3D outstrips the 12900K by a similar margin in 5 titles. Borderlands 3, Darksiders 3 and Dota 2 show 15%, 14%, and 16% higher scores, respectively. Prey is 20% faster, and Devil May Cry 5, while a clear outlier, is 61% faster in average fps.

It's worth noting that at 4K the tables are turned: the 5800X3D leads the 12900K by 11% in Civilization VI and pulls even in Cyberpunk 2077. AMD is still faster at 4K in Borderlands 3 by 14% and in Dota 2 by 15%. Intel catches up in Darksiders 3 and Devil May Cry 5, and ekes out a 5% advantage in Prey.


----------



## zlobby (Oct 31, 2022)

Mmm, pants-wetting! Imagine the latest X3D Zen!


----------



## ffolekram (Oct 31, 2022)

Since the 5800X3D is the only AM4 CPU with 3D V-Cache, almost all AM4 boards (B350, X370, B450, X470, B550, X570) can run it with no problem, and thanks to the 3D V-Cache there's very little performance difference between 3200 CL16 and 3800 CL14 RAM.
This makes the 5800X3D a very popular option for existing AM4 users to consider upgrading to when the time comes.
I currently have a 5600X on an X570 Taichi with a 3080 12 GB at 1440p; personally, I would definitely consider a 5800X3D over AM5 CPUs in the next few years when it comes time to upgrade (since I can reuse almost all of the existing parts in my system).


----------



## blazeddd (Oct 31, 2022)

Any chance we can get FPS numbers instead of percentages?


----------



## regs (Oct 31, 2022)

Once again: save the $200 and get a better GPU with it.


----------



## Cryio (Oct 31, 2022)

> Days Gone (2021, DX12, UE4)

This is time #47514 that I've asked you to stop listing Days Gone under DX12. This meta joke has to STAHP.


----------



## jallenlabs (Oct 31, 2022)

Thanks for this article.  I look forward to seeing how the 13900k fares...


----------



## Count Shagula (Oct 31, 2022)

Def glad I upgraded my 5600X to the 3D. The performance uplift was like getting a new GPU, as I'm using a 360 Hz screen


----------



## LifeOnMars (Oct 31, 2022)

Count Shagula said:


> Def glad i upgraded my 5600x to the 3D. The performance uplift was like getting a new GPU as im using a 360hz screen


Yep, a worthy upgrade in that scenario, especially if you are pushing a 3090.


----------



## Count Shagula (Oct 31, 2022)

LifeOnMars said:


> Yep, a worthy upgrade in that scenario, especially if you are pushing a 3090.


Soon to be a 4090 once they stop melting, or a 7900. Waiting till the reviews drop for the 7900 to decide which one


----------



## wheresmycar (Oct 31, 2022)

Thanks W1zzard.

It's amazing how many of these titles are cache starved, with a few hitting the 30-50% range. I'm just wondering why this increase in cache wasn't implemented earlier..... I mean way earlierrrr (AMD/Intel?). Was it something overlooked, was it down to architectural/component limitations, or were games just not as cache demanding previously... or a mixed bag of intolerables?

lol, doesn't help me though... the 5800X3D was a strong contender for a potential late-2022 upgrade, but seeing how effective X3D is with Zen 3, now I'm even more curious how Zen 4 X3D plays out.


----------



## zlobby (Oct 31, 2022)

regs said:


> Once again - save $200 and get a better GPU for them.


Riiiiight.


----------



## Orangeatom (Oct 31, 2022)

What about a comparison between the 5800X3D and the 5900X? Is the difference in cost worth it?


----------



## P4-630 (Oct 31, 2022)

BlueKnight said:


> 10 days ago I sell my 5900X at 330 Euro to buy a 5800X3D for 405 Euro. I'm very satisfied for my choice!



"2x1GB Samsung TCCC DDR400 - 217MHz 2.5-3-3-6"

You might want some faster memory for that 5800x3d.....


----------



## Minus Infinity (Nov 1, 2022)

The good thing for the Zen 4 V-Cache models is that their clocks will be basically the same as the regular models' this time round. Much better silicon for the cache: AMD said the 5800X3D used beta silicon, while Zen 4 will get 2nd-gen silicon. No issues with clocks and voltages this time, so we should see bigger gains overall than the 5800X3D had, and no penalty when the V-Cache isn't helping. If the 7900X3D keeps its price at $549 max and also boosts many productivity apps, it might keep me from switching to a 13700K for my upgrade.


----------



## Mussels (Nov 1, 2022)

seventy said:


> Dual rank sams b-die on 5800x3d vs "questionable" ram on 5800x.
> Would really like to see the opposite, ie 5800x with good b-die vs 5800x3d with trash ram.
> 
> @W1zzard if possible, can you also add 0.1% lows to the graph?


All Zen 2 and Zen 3 CPUs need four memory ranks for full performance.

0.1% lows are something I badly want as well, but I'm not sure the current test setup records them.
High peak FPS values are meaningless if they come with worse lows.



ffolekram said:


> Due to the fact that 5800x3d is the only AM4 cpu with 3D V-cache, almost all AM4 mobos (B350, X370, B450, X470, B550, X570) can run 5800x3d with no problem, and very small performance difference because of the 3D V-cache between 3200cl16 and 3800cl14,
> this would make the 5800x3D a very popular option for any existing AM4 users to consider upgrade to, when it's the time.
> I currently have a 5600x on a x570 taichi with a 3080 12g at 1440p, personally I would definitely consider 5800x3D instead of AM5 cpus in the next few years when comes to upgrade (since I can reuse almost all of the existing parts in my system).


It is; it's been the top-selling AMD CPU for some time now in a lot of countries, just not the USA (where you guys get Intel stuff cheaper than everyone else)


----------



## BoredErica (Nov 1, 2022)

Is there an article that explains how each game was tested? Many open-world games are more CPU or GPU limited depending on which part of the game is tested. Thanks


----------



## kapone32 (Nov 1, 2022)

ffolekram said:


> Due to the fact that 5800x3d is the only AM4 cpu with 3D V-cache, almost all AM4 mobos (B350, X370, B450, X470, B550, X570) can run 5800x3d with no problem, and very small performance difference because of the 3D V-cache between 3200cl16 and 3800cl14,
> this would make the 5800x3D a very popular option for any existing AM4 users to consider upgrade to, when it's the time.
> I currently have a 5600x on a x570 taichi with a 3080 12g at 1440p, personally I would definitely consider 5800x3D instead of AM5 cpus in the next few years when comes to upgrade (since I can reuse almost all of the existing parts in my system).


It also sips power compared to chips like the 5900X and 5950X. You won't get the desktop performance those offer, but I swear I have not felt anything as smooth in gaming as the 5800X3D. I even paired it with a 6500XT, and at 210 W total system power draw that was cool, but high-end GPUs are where it shines.



Orangeatom said:


> What about a comparison between 5800x3d vs 5900x , is the difference in cost worth?


If you know that 90% of what you plan to do with your PC is gaming, yes. If not, then no. The 5900X feels bulletproof and has serious horsepower, but the cache does make a difference in traditionally CPU-bound games.


----------



## Makaveli (Nov 1, 2022)

Great review.

This below should be removed, as it doesn't need to be in there twice:

"Also, while it's not reported in this review, the minimum FPS *on 5800X3D* are considerably better on Ryzen 5800X3D than on Ryzen 5800X."


----------



## VeqIR (Nov 1, 2022)

Thank you so much for this detailed article!  Looking forward to the next comparison!


----------



## Footman (Nov 1, 2022)

Interesting stuff! I really love this type of review. 
Of course I have the upgrade bug this time of year! Can't really justify any upgrade, as my system is purring along just fine with its 5600X and 6700XT for 2K gaming on a 144 Hz monitor. Now I might consider an upgrade for my son, who is still using an older 2600X with his 5700XT on a 165 Hz 1080p monitor. But really, the 5800X3D would be total overkill for him; a 5700X would be a nice upgrade. Or I could give him my 5600X and nab a cheap 5800X for me

Nice review.


----------



## wheresmycar (Nov 1, 2022)

Footman said:


> Or I could give him my 5600X and nab a cheap 5800X for me



my thoughts too!!

Slip him the 5600X and for $80 more, if possible, grab a 5800X3D. Every dad deserves one


----------



## Footman (Nov 1, 2022)

wheresmycar said:


> my thoughts too!!
> 
> Slip him the 5600X and for $80 more, if possible, grab a 5800X3D. Every dad deserves one


That's too funny..... But the thought did cross my mind!!! Perhaps I need to see what Black Friday brings


----------



## uftfa (Nov 1, 2022)

I don't have anything meaningful to add to the conversation, just want to drop in and show my appreciation for doing this. Thanks, W1zzard (or whoever else ran the tests).


----------



## Space Lynx (Nov 1, 2022)

@W1zzard, the use of multiple colors in these charts is the best work you have ever done. Seriously, it was so pleasant on the eyes to read, and not confusing at all. Absolutely fantastic; I really quite enjoyed it.


----------



## wolf (Nov 1, 2022)

Can't wait to pop in the 5800X3D tonight and offload the 5900X; some serious gains across all your results.


----------



## JAB Creations (Nov 1, 2022)

Daven said:


> Its a pretty big difference. In the 53 games tested 12900k review, intel is 15.7% faster than the 5800x at 1080p. In this review, the 5800x3d is 18.5% faster. But in the latest 13700k review, the 12900k summary shows a 8.3% lead at 1080p versus the 5800x3d. That’s quite a lot when it should be a few percent deficit.
> 
> But you are right, all CPUs do gaming well from the last few years and it is hard to pick one over the other as tech reviewers cannot bench all scenarios. Either way it looks like the 13900k will have the same 1080p gaming performance as the 5800x3d. I’m guessing that’s the last 53 game test before TPU makes a decision on a new test bench.


Using a Nascar race car to win a running marathon doesn't count. Let's talk about thermals and competing on even footing. My 10x10 foot room doesn't have the capacity to absorb hundreds of watts of exhaust.


----------



## Mussels (Nov 1, 2022)

JAB Creations said:


> Using a Nascar race car to win a running marathon doesn't count. Let's talk about thermals and competing on even footing. My 10x10 foot room doesn't have the capacity to absorb hundreds of watts of exhaust.


This. Even my 3090 at full wattage requires some extreme cooling here. It's enough to heat the entire house: since the house is insulated to keep heat OUT in summer, it also traps heat in, and a 500 W (or higher) PC is like running a hairdryer and expecting that heat to vanish.

I can only imagine people are living in poorly insulated, massive houses in cold climates or something, because the sheer heat output makes a lot of these modern products seem untenable for half the year.


----------






## zlobby (Nov 1, 2022)

Mussels said:


> I can only imagine people are living in poorly insulated massive houses in cold climates or something


People living in the conditions you are describing can't usually afford a 3090.


----------



## Mussels (Nov 1, 2022)

zlobby said:


> People living in the conditions you are describing can't usually afford a 3090.


I'm a disabled single dad and i could afford one.


----------



## Gica (Nov 1, 2022)

With a much more expensive X3D, it should be taken into account that it is weaker than the 1800X in applications that do not react intensively to the cache memory (3.8/4.7 GHz versus 3.4/4.5 GHz).


----------



## oxrufiioxo (Nov 1, 2022)

This is pretty much why none of the new-gen CPUs are all that appealing for gaming... Yeah, they are probably slightly ahead of this when tweaked to the max, but this chip performs well with crap DDR4 and a sub-$150 board, possibly even a sub-$100 board, while being easy to cool.


----------



## The King (Nov 1, 2022)

seventy said:


> Dual rank sams b-die on 5800x3d vs "questionable" ram on 5800x.
> Would really like to see the opposite, ie 5800x with good b-die vs 5800x3d with trash ram.
> 
> @W1zzard if possible, can you also add 0.1% lows to the graph?


I completely missed that!

If those are 2x8 GB = 16 GB, then the 5800X is at a big disadvantage vs the 5800X3D's 2x16 GB dual-rank setup.

Another possible issue is that many Zen 3 chips see some performance loss running 2000 FCLK, and many people don't even run that.

I would have also preferred to see the same 2x16 GB 3600 CL14-14-14 1T on the 5800X instead of the config that was run in this test.

I don't expect the outcome to change, but the gap may not be as big then. Also 4000 CL20 RCD 23


----------



## Space Lynx (Nov 1, 2022)

I'm less interested in the RAM testing stuff and more interested in weird stuff, like how much more wattage a 13600K draws vs a 13600KF even when the integrated graphics aren't being used. Is it the same 74 watts when gaming, is the iGPU truly fully dormant, or does it sip a few watts simply because of its proximity on a monolithic design?

I understand it doesn't really matter though, so I wouldn't ask anyone to test this for me lmao


----------



## zlobby (Nov 1, 2022)

Mussels said:


> I'm a disabled single dad and i could afford one.


Hence the 'usually' part in my previous post.


----------



## gffermari (Nov 1, 2022)

**HU used a 3090Ti, not a 4090.**
Excellent job!
I wanted to see a 1% lows chart using the 4090 but ok.
We can imagine that the difference would be massive.


----------



## Richards (Nov 1, 2022)

@W1zzard  we need a 13900k vs 7950x oc'd on a 4090 and 7900 xt when you get it


----------



## Mussels (Nov 1, 2022)

Richards said:


> @W1zzard  we need a 13900k vs 7950x oc'd on a 4090 and 7900 xt when you get it


I get the feeling he's going to do a few of these comparisons, and when he has enough done he'll be able to use the results in the new reviews from then on


When you first start with a new review setup, you can't compare it to the old results


----------



## Nopa (Nov 1, 2022)

gffermari said:


> **HU used a 3090Ti, not a 4090.**
> ...


Really intriguing to see by how many % the next X3D is gonna demolish the 13600K & 13700K. I wager anywhere from 15-30% faster.


----------



## W1zzard (Nov 1, 2022)

Cryio said:


> Days Gone (2021, DX12, UE4)
>
> This is time #47514 that I've asked you to stop listing Days Gone under DX12. This meta joke has to STAHP.


Lol, I can't believe I failed this again… fixed now, and hopefully for future articles too


----------



## Vayra86 (Nov 1, 2022)

Impressive.

That X3D is a monster CPU. I need one. I don't need one. But I do. Shit


----------



## ratirt (Nov 1, 2022)

Gica said:


> With a much more expensive X3D, it should be taken into account that it is weaker than the 1800X in applications that do not react intensively to the cache memory (3.8/4.7 GHz versus 3.4/4.5 GHz).


You have no idea what you are talking about. The 1800X is much slower than any 5000-series processor. Here you are comparing 8 cores to 8 cores: the 5800X3D, even with slightly lower clocks than the 5800X, eats the 1800X whole. Even the 3000-series CPUs were miles ahead of 1st-gen Ryzen.


----------



## kapone32 (Nov 1, 2022)

Gica said:


> With a much more expensive X3D, it should be taken into account that it is weaker than the 1800X in applications that do not react intensively to the cache memory (3.8/4.7 GHz versus 3.4/4.5 GHz).


What are you talking about? The IPC improvement alone between 1st gen and 4th gen belies what you are saying. Then let's look realistically at clock speeds: if you can get an 1800X to 4.7 GHz (on air), you have the ultimate unicorn. The 5800X is faster than the 5800X3D in most applications, but if you are coming from anything before that, outside of the 3900X/3950X, the 5800X3D is a common-sense upgrade for anyone with a previous-gen AM4 chip who wants to game. Put another way, the 5800X3D is the next compelling CPU in the way the 3300X was. I can also confirm once again that the 5800X3D uses less power than the 5800X/5900X/5950X and blows them away in 1% lows and FPS in some games.


----------



## HTC (Nov 1, 2022)

kapone32 said:


> If you can get a 1800X to 4.7 GHZ



I'm pretty sure the dude meant a 5800X but wrote 1800X instead: his statement makes absolutely NO SENSE otherwise.


----------



## HD64G (Nov 1, 2022)

@W1zzard, many kudos for your great job! This 3D cache will be standard on many future CPU generations from AMD, since it not only makes fast RAM less important but also stops the CPU being the bottleneck for many unoptimised game engines out there. Add in the success the 5800X3D is having in the market right now, just as both Intel and AMD have released their latest CPU series, and it will become the norm for the desktop (mostly) sooner than most expected.


----------



## Xex360 (Nov 1, 2022)

> All games are tested in custom bench scenes as the integrated benchmarks often paint a completely inaccurate picture compared to actual gameplay. Also, the GPU vendors actively optimize their drivers to achieve good results in integrated benchmarks


Apparently even replays are not reliable, according to Ian (formerly of AnandTech); parts of the game engine, like physics, don't run the same way during a replay.


----------



## Nopa (Nov 1, 2022)

HD64G said:


> @W1zzard many kudos for your great job! This 3D-cache will be standard on many future CPU gens from AMD since not also make fast RAM less important but also it stop the CPUs being the bottleneck of many unoptimised game engines out there. Adding the success 5800X3D has in the market atm while both Intel and AMD released the latest CPU series, it will become the normality for the desktop (mostly) sooner that most expected.


The 5800X3D's probably the best AMD CPU since the Athlon 64 era, tbh. I imagine the profits from it must be so good that we'll see X3D featured in every generation from now on.


----------



## gffermari (Nov 1, 2022)

HD64G said:


> @W1zzard many kudos for your great job! This 3D-cache will be standard on many future CPU gens from AMD since not also make fast RAM less important but also it stop the CPUs being the bottleneck of many unoptimised game engines out there. Adding the success 5800X3D has in the market atm while both Intel and AMD released the latest CPU series, it will become the normality for the desktop (mostly) sooner that most expected.



And that's the problem with the 3D versions: they make the normal SKUs..... somewhat not worth it.
Which shouldn't be the case.


----------



## ODOGG26 (Nov 1, 2022)

At this point, going forward AMD should release X3D versions of all 4 chips first, then drop the non-X3D versions later. Make X3D their main SKUs, then release X versions later with lower wattage and price, and if they want they can go a step further and drop non-X versions with even less wattage and an even lower price. Maybe those could go in productivity laptops.


----------



## theglaze (Nov 1, 2022)

Thanks for the benchmark analysis!

It would be very interesting to learn more about _WHY_ the 3D V-Cache provides a massive gain in some titles (even at 4K, there are 10 titles that still show a +15% FPS improvement). 

From all the tech talk about 3D V-Cache, I have decoded in layman's terms that the 5800X3D accelerates when:

- Workloads cannot fit in a 'normal' sized cache
- Workloads are extremely CPU limited
- Workloads are memory sensitive
- Workloads reuse data and calculations

It would take some investigative journalism to reach out to the game developers, but I'm curious to learn how those 10 titles (and others) align with different categories, such as the 4 above.

AMD has clear intentions to move forward with 3D V-Cache on AM5, and the package will (hopefully) be improved upon. But it's still an assumption to think the same game titles will benefit (in the same way) from future X3D CPUs.


----------



## Xuper (Nov 1, 2022)

can we have RUST in bench?


----------



## Vayra86 (Nov 1, 2022)

theglaze said:


> Thanks for the benchmark analysis!
> 
> It would be very interesting to learn more about _WHY_ the 3D V-Cache provides a massive gain in some titles (even at 4K, there are 10 titles that still show a +15% FPS improvement).
> 
> ...


It's not a baseless assumption.

A bigger cache means you can keep more information on hand to be (re-)used faster. A larger cache simply works like lubricant in a real-time application: the rest of the gears in the processing pipeline run more smoothly, or 'barely have to wait' to process the next bit of data required to produce the next frame. Put differently, because gaming is real time, what you want is the fastest possible throughput of code; you're not really working on one massive problem, but thousands of tiny ones that each provide input for your next frame in the game. Cache helps that throughput. This explains 1, 2, 3 and 4, pretty much.

CPU limited is a broad term: you can be short on threads in applications that require parallelism. If an application requires (or uses) peak overall 5800X3D performance, extra cache won't help it much; if it requires 16 physical cores for optimal performance, cache won't get it closer to that either. With gaming, the most influential factor is the latency to provide the next bit of information, and cache reduces latency. Simply put, the slowest link in a pipeline can single-handedly limit your FPS even if everything else is super fast. The less you need to move data around, or the shorter the distance (actual physical distance, yes), the more efficiently all parts of the pipeline get to work, as they keep getting 'fed'. Cache is ON the CPU, and even has several levels to show how close (L1, L2, L3). RAM is on the board. The GPU is a whole lot further down the board, furthest from the CPU. If data takes longer to be transported, or if there is congestion somewhere, you lose frames. Cache makes the flow of traffic less congested.
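The "working set fits in cache" point can be illustrated with a toy model (not a real CPU simulation; the sizes and access pattern are made up): an LRU cache slightly too small for a looping access pattern misses on nearly every access, while one that fits hits almost every time.

```python
from collections import OrderedDict

def lru_hit_rate(capacity, accesses):
    """Hit rate of a fully associative LRU cache over an access trace."""
    cache, hits = OrderedDict(), 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)              # mark as most recently used
        else:
            cache[addr] = True
            if len(cache) > capacity:
                cache.popitem(last=False)        # evict least recently used
    return hits / len(accesses)

# Loop repeatedly over a 100-line working set
pattern = list(range(100)) * 50
print(lru_hit_rate(96, pattern))    # slightly too small: LRU thrashes, 0% hits
print(lru_hit_rate(128, pattern))   # working set fits: 98% hits
```

That cliff between "almost fits" and "fits" is one intuition for why some titles jump 30-50% on the X3D while others barely move.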


----------



## Mats (Nov 1, 2022)

HTC said:


> I'm pretty sure dude meant a 5800X but wrote 1800X instead: his statement makes absolutely NO SENSE otherwise.


Yeah, everyone got so emotional about the 1800X statement that they missed the mismatched clock speeds in the same post...

Anyway, Gica is right, but a 4% clock speed difference isn't much, and the price difference is smaller these days.


----------



## below ambient (Nov 1, 2022)

This is an all-time low for you guys... First of all, maybe 1 out of 10 5800Xs will run 2000 FCLK without WHEA errors - did you check for that?? Secondly, you ran B-die at a different speed and timings... WHY?? Please fix this... I'm losing faith in you guys.


----------



## Gica (Nov 2, 2022)

5800X, not 1800X. I mistyped. I think the 1800X is no longer available for sale.
Due to higher clock speeds, the 5800X outperforms the 5800X3D in everything that does not react strongly to the extra L3 cache. And it is about 30% cheaper. In my opinion, if you don't have a top video card, the X3D is the wrong choice (because it is much more expensive; you could buy an SSD with the difference). Tested with a mid-range card, the differences between the two processors are marginal in gaming, practically non-existent, but the 5800X keeps its advantage in other applications.

However, those who scolded me could have been guided by the frequencies I typed, which are impossible for a 1800X to reach.


----------



## kapone32 (Nov 2, 2022)

Gica said:


> 5800X, not 1800X. I mistyped. I think the 1800X is no longer available for sale.
> Due to higher clock speeds, the 5800X outperforms the 5800X3D in everything that does not react strongly to the extra L3 cache. And it is about 30% cheaper. In my opinion, if you don't have a top video card, the X3D is the wrong choice (because it is much more expensive; you could buy an SSD with the difference). Tested with a mid-range card, the differences between the two processors are marginal in gaming, practically non-existent, but the 5800X keeps its advantage in other applications.
> 
> However, those who scolded me could have been guided by the frequencies I typed, which are impossible for a 1800X to reach.


What is a middle card? I put my X3D with a 6500 XT and it gave me 30% more FPS in some games than a 5900X paired with the same card, and also a smoother experience. I am not just talking out of my butt; I own all of those. No one is disputing that the 5800X is a better overall processor. The fact remains that the X3D chip is a revelation in gaming. I am confident that when the 7000-series X3D chips launch, you will see more mass adoption of AM5. The chip was too expensive when it launched, but as soon as it went below $500 CAD it started moving. It will sell quite well for the rest of the year as its price keeps falling, too. You should be glad, because it will put price pressure on the rest of the lineup. If you are starting out and are budget-conscious about the cost of an SSD, you should get a 5600. Then you could buy a PSU too, and maybe a case.


----------



## Gica (Nov 2, 2022)

How would it be better for gaming on the same budget?
5800X3D + 6500 XT, or 5800X + 3060?
The price difference between the processors is redirected to the video card in this case, not to an SSD.


----------



## kapone32 (Nov 2, 2022)

Gica said:


> How would it be better for gaming in the same budget?
> 5800X3D + 6500XT or 5800X +3060
> The difference from the processors is redirected to the video card in this case, not to the SSD.


You said it would not make a difference with a mid-grade card. I am telling you that even a 6500 XT will show the difference, and that is not a mid-range card - and you show me the price difference between two orders. It doesn't matter; I am not using YouTube as my source. The X3D chip is better than any other 5000-series chip in gaming. The only rival is the 5950X, and that draws more power to do it. I almost bought this yesterday but held off; the price is sweet, though.









ASRock Radeon RX 6700 XT Challenger D Gaming Graphic Card, 12GB GDDR6 VRAM, AMD RDNA2 (RX6700XT CLD 12G) - www.newegg.ca


----------



## tekjunkie28 (Nov 2, 2022)

Has anyone tested Farming Simulator 22 with a 5800X3D? That's the only game I have real issues with. It's also not threaded very well.


----------



## Mats (Nov 2, 2022)

Gica said:


> How would it be better for gaming in the same budget?
> 5800X3D + 6500XT or 5800X +3060
> The difference from the processors is redirected to the video card in this case, not to the SSD.


You might as well look into combinations with the 5700X or the 5600, together with even more powerful graphics cards. That would give you better gaming performance for the same price.
In Germany the price difference between the 3060 and the 6700 XT is about €70, and the 6700 XT gets 37% higher FPS at 1080p.

The 5800X isn't worth the extra cost over the 5700X IMO.


----------



## Gica (Nov 2, 2022)

kapone32 said:


> You said it would not make a difference using a mid grade card. I tell you that a 6500XT will show the difference and that is not a mid range card and you show me the difference in price between 2 orders. it doesn't matter I am not using Youtube as my source the X3D chip is better than any 5000 chip in Gaming. The only rival is the 5950X but that draws more power to do that. I almost bought this yesterday but held off but the price is sweet.
> 
> 
> 
> ...


I am saying it is not worth the price difference for a low-to-midrange video card. In your case, you would have had more gaming performance if you had directed that difference to the video card (I gave the example of the RTX 3060, a video card in a completely different galaxy than the RX 6500 XT).
The claim of a 30% performance boost should be taken with a lot of salt.


----------



## kapone32 (Nov 2, 2022)

Gica said:


> I say that it is not worth the price difference for a low-midrange video card. In your case, you had more performance in gaming if you directed this difference to the video card (I gave the example of the RTX 3060, a video card in a completely different galaxy than the RX 6500XT).
> They have a lot of salt in the statement with a 30% performance boost.


This is the lowest price 3060 on Newegg in Canada 









MSI Ventus GeForce RTX 3060 Video Card RTX 3060 Ventus 3X 12G OC - www.newegg.ca
				




This is the lowest price 6500XT









ASUS Dual Radeon RX 6500 XT Video Card DUAL-RX6500XT-O4G - www.newegg.ca
				




There is no comparison between those two cards in performance or price; I could buy two 6500 XTs for one 3060. I only mentioned it because I have one, and you said mid-range and lower didn't matter. The 6700 XT costs less than the 3060, or is at least comparable.

If I am using an AM4 processor, the potential of console-driven development gives me confidence to pair it with a Radeon GPU. That is much more attractive and tangible to me than hardware-based ray tracing and DLSS, which are in fewer games than currently support multi-GPU. I am not bagging on the 3060 - it is a good card - but it is overpriced, and the VRAM buffer is an illusion; the boost clock does not help, as the 6700 XT is 700 MHz faster and it shows. It doesn't matter anyway, because the best budget GPU right now is hands down the 6600M.

It comes back to the root of our debate, though. If you are into gaming, the 5800X3D is worth the premium over the 5800X regardless of the GPU, and you can feel it. That is the key.


----------



## Mussels (Nov 3, 2022)

Gica said:


> How would it be better for gaming in the same budget?
> 5800X3D + 6500XT or 5800X +3060
> The difference from the processors is redirected to the video card in this case, not to the SSD.


My GTX 1080 got a 30 FPS boost going from a 3700X to a 5800X.

You get gains on older GPUs with more modern CPUs.


----------



## Badelhas (Nov 3, 2022)

kapone32 said:


> What is a middle card? I put my x3d with a 6500XT and it gave me 30% more FPS in some Games than a 5900X connected to that but also a smoother experience. I am not just talking out of my butt, I own all of those. No one is disputing that the 5800X is a better overall processor.  The fact remains that the x3D chip is a revelation in Gaming. I am confident that when the X3D chips launch that you will see more mass adoption of AM5. The chip was too expensive when it launched but as soon as it went below $500 CAD it started moving. It will sell quite well for the rest of the year as it's price keeps falling too. You should be glad because it will put price pressure on the rest of the lineup. If you are starting out and are budget conscious about the cost of an SSD you should get a 5600. Then you could buy a PSU too and maybe a case.


Hi.
I own a Ryzen 3600 and an Nvidia 3060 Ti. I play games at 1440p. In your opinion, will I see massive gains in performance if I sell my 3600 and buy the 5800X3D?
Cheers


----------



## Mussels (Nov 3, 2022)

Badelhas said:


> Hi.
> I own a ryzen 3600 and the nvidia 3060 ti. I play games at 1400p. In your opinion, will I see massive gains in performance if I sell my 3600 and buy the 5800x3d?
> Cheers


You definitely would - just not in situations where your GPU is the limit.
In most titles that changes second by second; plenty of games use the CPU for lighting or effects, so the heaviest, worst-performing scenes are where a CPU upgrade shows through - the minimums, not the maximums.

CP2077 on psycho settings? No.

In games you used to get 100 FPS in, you'll get 120-140 FPS (or more, if they were heavily CPU/cache-limited games like Borderlands 3), and scenes that used to dip simply may not dip at all.


----------



## Nopa (Nov 3, 2022)

Don't pull the trigger just yet - we'll have a new gaming king this January.


----------



## Gica (Nov 3, 2022)

kapone32 said:


> This is the lowest price 3060 on Newegg in Canada
> 
> 
> 
> ...


I gave you an example: *two systems at the same price*. *Question: with which do you think you will have the better gaming experience:* *the system with 5800X3D + 6500 XT, or the one with 5800X + RTX 3060*? If we replaced the 3060 with a 6600, this system would have been much cheaper (~$200 less) and still better.

The 6500 XT is a fail. Too expensive for what it offers, and it cannot push the processor hard enough for you to see big differences. It can render at 60 FPS in 1080p, but only at low details in AAA games, if you want to avoid the "pleasure" of gaming at 15-20 FPS.

Below, 5800X versus 5800X3D with an RTX 3090, a graphics card that places enormous demands on the processor. There are differences, but not that big, and I don't see how the X3D can give a 6500 XT a 30% boost.
How can you prove it?











Mussels said:


> my GTX 1080 got a 30FPS boost going from a 3700x to a 5800x
> 
> 
> you get gains on older GPUs with more modern CPUs


I can choose games where the 3070 Ti offers the same performance with the 11600KF at 6c/12t as at 2c/4t, but they are not CPU-intensive or AAA. I don't think the migration from a 3700X to a 5800X (a much better-performing, higher-class processor) will bring you a 30 FPS boost in Cyberpunk, Forza 5 or Control.


----------



## Nopa (Nov 3, 2022)

5800X3D's still a mad dog to mess with.


----------



## kapone32 (Nov 3, 2022)

Gica said:


> I gave you an example: *two systems at the same price*. *Question: with which do you think you will have a better gaming experience:* *the system with 5800X3D + 6500XT or the one with 5800X + RTX 3060*? If we replaced 3060 with 6600, the price of this system would have been much lower (~$200) and it would still have been better.
> 
> The 6500XT is a fail. Too expensive for what it offers and cannot force the processor in such a way that you can see big differences. It can render at 60fps in 1080p, but at low details in AAA games if you want to avoid the "pleasure" of gaming at 15-20FPS.
> 
> ...


There is no one who would believe that a 6500 XT matches a 3060 in performance. Those cards are not in the same class, but I have said that before. Even the 6600 is miles ahead of the 6500 XT. That was a card for gamers at the height of the crypto boom, when 3060s were $700 and 6700 XTs were $900. The 6500 XT has been about the same price since launch.

What you have said in this post just repeats the official narrative about the 6500 XT. Forget that it can boost to 2,900+ MHz, forget that it supports HDMI 2.1, forget that it competes with the 1660 in price, or that it does not consume more than 90 watts in gaming. Now, as for proof: I could spend time making a video showing how the 6500 XT does in gaming, but all you have to do is look at the 1080p numbers on this very site for the 6500 XT and you will see it is proficient. If that is not enough, go on Newegg and wonder to yourself why every 6500 XT averages 4+ egg reviews. I guess the people who actually bought the card don't count, though. As much as you may want to focus on AAA games, what defines a AAA game is a nuanced topic - sales, production cost, users? Is TWWH a AAA game? Is Batman: Arkham Knight? Stardew Valley? Or is it just Nvidia-sponsored titles like CP2077 and Control? The most astounding thing about the 6500 XT, though, is the size of the die. It is literally about the size of one of the HBM2 chips on Vega. That blew me away when I took mine apart. It was the budget choice until the 6600M became available.

Is the 3090 hands down the fastest of the last generation? Is it faster (on AMD platforms) than a 6950 XT in all games? What GPU drivers did they use? When did they record the data? As much as I enjoy YouTube as much as the next person, it is not the bible of technology.

Let's get back to the topic at hand. The 5800X3D is a faster-feeling CPU in gaming than any other AM4 chip. Ask any user in this very thread who has one; you will see that the theme is the same - everyone, for the most part, understands why they bought it. You can wax on about the 5800X, and yes, overall it is a better CPU, but it does not have 3D V-Cache. As I have said before (as long as the economy does not collapse), AM5 mass adoption will occur when the 7000X3D chips launch in 2023.


----------



## nexxusty (Nov 3, 2022)

GreiverBlade said:


> Mmmh, now that I can find a 5800X for 299.90 CHF (about the same price as a 5700X) and a 5800X3D is 409.90 CHF (hey! I said if it dropped below 450 when it was 449 I would place an order asap)
> Given that I play at 1440p/1620p, the 5800X3D is highly tempting and would still cost me way less than a full new Intel or AMD build
> But dang... 300 CHF for a 5800X. I could settle for that one and add 50 CHF for 16 GB more RAM (same sticks as I have, of course) and go 32 GB


You'd be an idiot not to get the 5800X3D.

Period.

A 5800X plus 16 GB more RAM will not be anywhere close to as good for gaming as a 5800X3D.


----------



## GreiverBlade (Nov 3, 2022)

nexxusty said:


> You'd be an idiot not to get the 5800x3D.
> 
> Period.
> 
> A 5800x and 16GB more RAM will not be even close to as good for gaming as a 5800x3D will be.


Thanks for the kind words.

Well, budget is budget, and while I can stretch to 350 CHF,
I can't do 409. Yep, even 59 more is too much right now.

Although if the 5800X3D drops a bit more, I will get it.

I can wait a bit; my R5 3600 does a banging job nonetheless.


----------



## nexxusty (Nov 3, 2022)

GreiverBlade said:


> Thanks for the kind words
> 
> Well, budget is budget and while I can stretch to 350chf
> I can't do 409 yep, even 59 is too much right now
> ...


LOL, not you specifically, brother - anyone deciding between the two CPUs.

Especially for gaming.

I fully understand financial constraints being an issue. I would honestly wait just a tiny bit longer and save the extra for the 5800X3D.

It will be worth it. I promise you that.


----------



## Mussels (Nov 4, 2022)

Nopa said:


> 5800X3D's still a mad dog to mess with.


The 5800X3D can be cooled by a Wraith Stealth, while the Intel needs something a lot bigger and more expensive.

It's really misleading to compare the CPUs' performance against their price when a CPU cannot be used without additional purchases - and the Intel's performance tanks if you don't use it with fast enough RAM, a good enough board (MANY budget Intel boards have throttling issues) and, I'd say, a minimum 240 mm AIO.

As people keep saying, the X3D negates the need for low-latency RAM - 4x8 GB 3200 feeds it as well as high-speed CL14 RAM does on the regular Zen 3 chips, and you can run it on anything from an A320 through an X570, whatever you can get cheap (or already have).

Intel's best marketing was to sell the CPUs cheap and then change sockets constantly, so you need a second purchase from them anyway.


----------



## Gica (Nov 4, 2022)

kapone32 said:


> There is no one that would believe that a 6500XT would match a 3060 in performance. Those cards are not in the same class but I have said that before. Even the 6600 is miles ahead of the 6500XT. That was a card for Gamers at the height of the Crypto boom. When 3060s were $700 and 6700XT were $900. The 6500XT has been about the same price since launch.
> 
> You know what you have said in this post is repeating what the official narrative is about the 6500XT. Forget that it can boost to 2900+ MHZ, forget that it supports HDMI 2.1, forget that it competes with the 1660 in price or that it does not consume more than 90 Watts in Gaming. Now as for proof. I could spend time making a video showing how the 6500Xt does in Gaming but all you have to do is look at the 1080P numbers on this very site for the 6500XT and you will see it is proficient at Gaming. If that is not enough go on Newegg and wonder to yourself why every 6500XT has 4+ egg reviews average. I guess the people that actually bought the card don't count though. As much as you may want to focus on AAA Games that is such a nuanced topic as what defines a AAA Game, sales, production cost, users. Is TWWH a AAA Game, Is Batman Arkham Knight a AAA Game, Stardew Valley or is it just Nvidia sponsored titles like CP2077 or Control. The most astounding thing about the 6500XT though is the size of the die. It is literally about the size of one of the HBM2 chips on Vega. That blew me away when I took mine apart. It was the budget choice until the 6600M became available.
> 
> ...


The X3D cost over $500 until recently. We are talking about the present, when, for the same amount, you can buy 5800X + RTX 3060 (or RX 6600 XT) or 5800X3D + 6500 XT. The question was simple: which configuration performs better in gaming? I see you are avoiding the answer.
Naturally, everyone praises what they buy. I don't give a damn about praise in the comments; I prefer product criticism, and only from those who have actually bought.

6500 XT
Performance: poor (comparable to the ancient GTX 1060, destroyed by the non-Super 1660, launched in 2019 at the same power consumption)
RT: epic fail
Productivity and streaming: zero!!! (no encoder in 2021 - what were they thinking???)
It is definitely not a video card that will satisfy today's demands. It was launched together with the RTX 3050, a bit more expensive, but which still offers something in gaming and comes with full access to decode/encode.


----------



## nexxusty (Nov 4, 2022)

Gica said:


> X3D costs over $500 until recently. We are talking about the present, when you can buy, for the same amount, 5800X + RTX 3060 (RX 6600XT) or 5800X3D + 6500XT. The question was simple: which configuration is better performing in gaming? I see you are avoiding the answer.
> Naturally, everyone praises what they buy. I don't give a damn about praise in the comments, I prefer product criticism and only from those who have bought.
> 
> 6500XT
> ...


The 6500XT is a laptop GPU. That's why it doesn't have hardware encoding. Among other things.

Like performance. Lol.



Mussels said:


> the 5800x3D can be cooled by a wraith stealth while the intel needs something a lot bigger and more expensive
> 
> 
> It's really misleading to compare performance of the CPU's vs their price when the CPU cannot be used without additional purchases - and the intels performance tanks if you dont use it with fast enough RAM, a good enough board (MANY budget intel boards have throttling issues) and i'd say a minimum 240mm AIO
> ...


Well said brother.

Couldn't have painted a better picture myself.


----------



## kapone32 (Nov 4, 2022)

Gica said:


> X3D costs over $500 until recently. We are talking about the present, when you can buy, for the same amount, 5800X + RTX 3060 (RX 6600XT) or 5800X3D + 6500XT. The question was simple: which configuration is better performing in gaming? I see you are avoiding the answer.
> Naturally, everyone praises what they buy. I don't give a damn about praise in the comments, I prefer product criticism and only from those who have bought.
> 
> 6500XT
> ...


Why are you trying so hard? I did not write a single post about the 5800X3D until I got mine. Yes, the price was too high, but now it is very attractive, and that is why I say it is the best.

Do you have a 6500 XT? If not, you are just repeating the narrative. As I stated before, I have never claimed the 6500 XT is in the league of the 3060, but I guess that does not matter, as you think a 1660 destroys a 6500 XT. Why are you focusing so intently on the 6500 XT anyway? I am quite happy with mine, as I paid $215 CAD for it (versus $599 for a 6600), and regardless of your opinion, the missing encoder only matters if you use the card with a CPU that lacks those capabilities. Streaming and productivity (do you mean making videos?) - I would not have bought it for those purposes. When my main PC is in use, I have no issue using my budget card in my budget PC. Please don't use the 3050 as a comparison; that card is easily double the price of the 6500 XT in Canada, not a bit more expensive. If you actually owned one, you would know what the actual biggest caveat is, and it is not streaming.


----------



## Footman (Nov 4, 2022)

These days it is generally my monitor that dictates how much money I budget for PC upgrades.

I know this sounds strange, but my current 2K IPS monitor does 144 Hz, and anything above that (in gaming) is really a waste. I tend to upgrade only when a new game comes out that prevents me from running close to 144 Hz at max details. So although I am really tempted by the 5800X3D, I can max out my FPS with the 5700X or 5800X, or just keep the 5600X I already have and upgrade my video card instead.

The 5800X3D is now available on AMD's website for $329 and the 5800X for $249, cheaper than my 5600X (which I paid $299 for when it first released). They even have the 5900X for $349!!!! Black Friday comes early...


----------



## nexxusty (Nov 4, 2022)

Footman said:


> These days for me it is generally my monitor that dictates how much money I budget for PC upgrades.
> 
> I know this sounds strange, but currently my 2K IPS monitor does 144Hz, anything above this (gaming) is really a waste. I tend to upgrade only when a new game comes out that prevents me from running close to my 144hz (max details). So although I am really tempted to buy the 5800X3D, I can max out my FPS with the 5700X or 5800X, or just keep the 5600X I already have and just upgrade my video card.
> 
> 5800X3D is now available on AMD's website for $329 and the 5800X is $249, cheaper than my 5600X (which I paid $299 when it first released). They even have the 5900X for $349!!!! Black Friday comes early...


The 5800X3D isn't best at the high end of the FPS range; it's the FPS lows that it excels at.

It handles those as well as basically any CPU out right now, too.

100% the ONLY CPU I'd ever even call "future proof". It's like a 5775C on steroids.


----------



## Footman (Nov 4, 2022)

nexxusty said:


> 5800x3D isn't best at the higher FPS range. It's FPS lows that it excels at.
> 
> Does it as well as basically any CPU out right now as well.
> 
> 100% the ONLY CPU I'd ever even call "Future Proof". It's like a 5775c on steroids.


You make an interesting point about the low FPS. 

Techspot looked at the low 1% and found that compared to the 5800X:

Battlefield V ran significantly faster with the 5800X3D, boosting 1% lows by 33% at 1080p and the average frame rate by a whopping 41%. Even at 1440p, we're looking at a 30% improvement in 1% lows and a 21% increase for the average frame rate. Then at 4K, we're still looking at up to a 14% boost for the 3D V-Cache, an impressive set of results.

That's the difference between 133 and 173 FPS. As this is one of my most played games, it is relevant... However, they were testing with a 3090 Ti and I run my rig with a 6700 XT, so my differences will be a lot smaller.
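For reference, "1% lows" are usually derived from frame times rather than an FPS counter. A minimal sketch of one common definition (average FPS over the slowest 1% of frames; some tools instead report the 99th-percentile frame time):

```python
def one_percent_low(frame_times_ms: list) -> float:
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # the worst 1% of frames
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at 10 ms plus one 20 ms stutter: the average FPS is
# still ~99, but the 1% low is 50 FPS - exactly the dips the X3D smooths out.
times = [10.0] * 99 + [20.0]
print(one_percent_low(times))
```

This is why a CPU can look identical on average-FPS charts yet feel much smoother in play.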


----------



## kapone32 (Nov 4, 2022)

Footman said:


> You make an interesting point about the low FPS.
> 
> Techspot looked at the low 1% and found that compared to the 5800X:
> 
> ...


Not necessarily. The 5800X3D loves SAM on the 6000-series GPUs.


----------



## Gica (Nov 4, 2022)

kapone32 said:


> Why are you trying so hard? I did not even write one post about the 5800X3D until I got mine. Yes the price was too high but now is very attractive and that is why I say it is the best.
> 
> Do you have a 6500XT? If not you are just repeating what the narrative is. As I stated before I have never stated that the 6500XT is in the league of the 3060 but I guess that does not matter as you think that a 1660 destroys a 6500XT. Why are you focusing so intently on the 6500XT anyway? I know that I am quite happy with mine as I paid $215 CAD (vs 6600 for $599) for mine and regardless of your opinion that an encoder makes it useless is if you use it by itself without a CPU that actually has those instructions. Streaming? and no productivity (Do you mean making videos) because I would not have bought it for those purposes. When my main PC is in use I have no issue using my budget card in my budget PC. Please don't use the 3050 as that card is easily double the price of the 6500XT in Canada not a bit more expensive. If you actually owned one you would know what the actual biggest caveat it is not streaming.


I don't have the 6500 XT, but I think you missed the general impression from the launch of this model: impossible to find material praising it, only criticism. And you say the X3D brings a 30% average increase in this video card's performance. Even an i3-10100 sleeps in games with a 6500 XT.
Without concrete proof, I don't think so. Can an RTX 3070 Ti be turned into a 3080 Ti if I replace the 11600KF with the 5800X3D?


----------



## kapone32 (Nov 4, 2022)

Gica said:


> I don't have the 6500 XT, but I think you missed the general impression from the launch of this model: impossible to find material praising it, only criticism. And you say the X3D brings a 30% average increase in this video card's performance. Even an i3-10100 sleeps in games with a 6500 XT.
> Without concrete proof, I don't think so. Can an RTX 3070 Ti be turned into a 3080 Ti if I replace the 11600KF with the 5800X3D?


This is the issue I have with launch data. Many people will tout it and extrapolate what they will. Reviews are generally day one and rarely revisited months later. In TechPowerUp's review of the 6500 XT, they explained that the missing encoder would be covered by the CPU, but no one references that - only the negatives.

Now, here is the caveat, since you mentioned the 10100: to stretch the 6500 XT's legs, you should use a PCIe 4.0 CPU. Keep in mind that the card is wired at x4. With a 10100 you are restricting the card to four lanes of PCIe 3.0. That is why I would not use the 6500 XT with a 5600G/5700G either, as those are 3.0 as well (I tried). With a PCIe 4.0 CPU the card runs at its native 4.0, and even though it only has four lanes, each lane is twice as fast, so you effectively get the bandwidth of 3.0 x8 - basically double. That is where you will see the improvement, as a card with a 64-bit memory bus needs all the bandwidth it can get.

The 5600 is a perfect CPU to pair with it. As for concrete proof, user reviews are plenty: generally only two types of people write reviews, those who love a product and those who abhor it, and as much as the online narrative says the 6500 XT is absolute garbage, the user reviews do not agree. Remember, too, that online reviewers generally do not pay for the cards they review; when you spend your own money, it is more indicative of how you actually feel about the purchase. If you have a 4K TV in your living room with HDMI 2.1 and VRR (FreeSync support), a 6500 XT will be butter smooth - the panel's refresh range is 44-144 Hz, so when you set the resolution to 1080p it is butter smooth. That alone makes it better than the 570/1660.

Games like CP2077 that use tremendous amounts of VRAM will be slower - say 50 FPS with settings adjusted - but so smooth that you will be in disbelief. The problem with modern benchmarks is that they don't take VRR into account, which really moots the FPS argument. If I am playing anything above 1080p I would not use the 6500 XT, but my 6800 XT handles that just fine. Everyone is not me, though, so I agree that if you can afford it you should get something better; as I have said before, the 6600M is probably the best price/performance GPU right now, as it is cheaper than anything we have mentioned and fully supported with desktop drivers.

The 5800X3D should feel faster to you than the 11600K in some games. Indeed, with the improvement that V-Cache delivers, you may feel that your 3070 Ti is a 3080 in some benchmarks, but you are looking at cards that are 4,000 cores apart with a 50% difference in VRAM.


----------



## Mussels (Nov 5, 2022)

nexxusty said:


> 5800x3D isn't best at the higher FPS range. It's FPS lows that it excels at.
> 
> Does it as well as basically any CPU out right now as well.
> 
> 100% the ONLY CPU I'd ever even call "Future Proof". It's like a 5775c on steroids.


Good way to explain it. For some time it was at the top of the FPS charts as well, but the new generation has outdone it there - at much higher wattages, and without always keeping those lows up.

I'm expecting mine to age like the old i7 920 did, or the golden-era K chips (2500K through 4790K), whose owners are still able to game on them today.


----------



## Gica (Nov 5, 2022)

kapone32 said:


> This is the issue I have with launch data. Many people will tout that and extrapolate what they will. The reviews are generally day one and rarely ever revisited months later. In the review of 6500XT on Techpowerup they described the issue with the encoder would be solved by the CPU but no one references that, only the negative. Now here is the caveat since you mentioned the 10100. In order to stretch the legs of the 6500XT you should use a Pcie 4 CPU. Lets keep in mind that the card is wired at x4. So when the 10100 is used you are restricting the card to 4 lanes of 3.0. That is why I would not use the 6500XT with a 5600/5700G CPU as those are 3.0 as well(I tried). Here is where you will see the difference. When you use a 4.0 CPU it runs at it's native 4.0 and even though it only has 4 lanes because it is 4.0 it actually gives you the bandwidth of 3.0 X 8 basically doubling the bandwidth. That is where you will see the 30% improvement as a card that is wired @ 64 bit needs as much bandwidth as it can get and would be affected by this. That is where the clock speed comes in.  The 5600 is a perfect CPU to pair with it but the concrete proof I am trying to show you (User reviews) is plenty enough as only 2 types of people generally write reviews, those who love it and those who abhor it. So again as much as the narrative online may seem the the 6500XT is absolute Garbage the user reviews do not agree with that. Now I also have to remind you that online reviewers are not generally paying for the cards they review and when you spend your own money it more indicative of the actual way a person will feel about their purchase and not opinion. If you have a 4K TV in your living room that has HDMI 2.1 with VRR (Freesync support) a 6500XT will be butter smooth as the refresh range is 44-144 but that is the panel so when you set the resolution to 1080P it is butter smooth. That alone makes it better than the 570/1660. 
Now Games like CP 2077 that use tremendous amounts of VRAM will be slower so let's say 50 FPS with settings adjusted, will be so smooth that you will be in disbelief. The problem with modern benchmarks is they don't take into account VRR as it really does moot the FPS argument. Now if I am playing anything above 1080P I would not use the 6500Xt but my 6800XT handles that just fine. Everyone is not me though so I do agree that if you can afford it you should get something better but like I have said before the 6600M is probably the best price/performance GPU right now as it is cheaper than anything we have mentioned and is fully supported on desktop with drivers.
> 
> The 5800X3D should feel faster to you than the 11600K in some Games. Indeed with the improvement that V cache delivers you my feel that your 3070TI is a 3080 in some benchmarks but you are looking at cards that are 4000 cores apart from each other with a 50% difference in VRAM.


I used 10100 as a metaphor, in the idea that even a quad core is more than enough for the 6500XT. Even after the latest reviews, the 5800X3D is a processor that sits well next to top video cards, not next to low video cards. It can't help much because *many operations when rendering a 3D scene are exclusively performed by the GPU*. 
The gaming reference tests for processors are done at low resolutions because that way the video card renders more frames, the processor is stressed harder, and you get a better idea of its gaming performance. As the resolution increases, the video card renders fewer frames and the differences between processors shrink, even to zero at 4K. Do you have any other explanation than that the powerful processor waits for the video card, which allows the weaker processor to almost match it? The processor just must not be so weak that it causes stuttering; real performance increases can only come from game developers and video card drivers, not from replacing a high-performance processor with a slightly faster one.
You can take the iGPU performance (link) of the 7000 series as a benchmark (at Intel it's different because UHD 770 > UHD 630). So: zero differences between AMD chips of the same series (6, 8, 12, 16 cores) with the same integrated graphics.

The only logical explanation that comes to mind is that, somehow, you had PCIe 3.0 set to the old processor. I don't see how a processor can bring 30% boost to this video card.
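To illustrate the resolution-scaling point, here is a toy model (the CPU and GPU numbers are made up for illustration only, not measured data): the FPS you actually see is capped by whichever of the CPU or GPU is slower, so CPU differences shrink as the GPU limit drops with resolution.

```python
# Toy bottleneck model: delivered FPS = min(CPU limit, GPU limit).
# The CPU limits (200 vs 150 fps) and per-resolution GPU limits
# below are illustrative numbers, not benchmark results.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The frame rate you actually get is set by the slower component."""
    return min(cpu_fps, gpu_fps)

gpu_limit = {"1080p": 250, "1440p": 160, "4K": 90}

for res, gpu_fps in gpu_limit.items():
    fast_cpu = delivered_fps(200, gpu_fps)
    slow_cpu = delivered_fps(150, gpu_fps)
    gap = (fast_cpu - slow_cpu) / slow_cpu * 100
    print(f"{res}: {fast_cpu} vs {slow_cpu} fps -> {gap:.0f}% gap")
    # -> 33% gap at 1080p, 7% at 1440p, 0% at 4K
```

With these made-up numbers, the same two CPUs show a 33% gap at 1080p and no gap at all at 4K, which is exactly why CPU reviews test at low resolutions.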


----------



## kapone32 (Nov 5, 2022)

Gica said:


> I used 10100 as a metaphor, in the idea that even a quad core is more than enough for the 6500XT. Even after the latest reviews, the 5800X3D is a processor that sits well next to top video cards, not next to low video cards. It can't help much because *many operations when rendering a 3D scene are exclusively performed by the GPU*.
> The gaming reference tests for the processor are done at low resolutions because that way the video card renders more frames and the processor is used more intensively and you have a better idea of its performance in gaming. As the resolution increases, the video card renders fewer frames and the differences between the processors decrease, even to zero in 4K. Do you have any other explanation besides that the powerful processor waits for the video card and allows the weaker processor to almost match it? The processor just must not be weak and cause stuttering and real performance increases can only be brought by game developers and video card drivers, not by replacing a high-performance processor with another slightly higher-performance one.
> You can take the igp performance (link) of the 7000 series as a benchmark (at Intel they are different because UHD 770 > UHD 630). So, zero differences with AMD from the same series (6, 8, 12, 16 cores) and with the same integrated graphics.
> 
> The only logical explanation that comes to mind is that, somehow, you had PCIe 3.0 set to the old processor. I don't see how a processor can bring 30% boost to this video card.


That is because you don't have a V-Cache enabled CPU. What you don't understand is that I used a 5600G and a 5900X to get my information. Single-CCD chips are better at gaming than dual-CCD, but because the PCIe generations are different it did indeed provide that performance, so I thought, OK, let's get a 5600 (as they are inexpensive, and the 5900X is a beast), and sure enough, same result vs the 5600G. I know that cards like the 6800XT or 3080 would not really show the same thing, but you have to keep in mind that the 6500XT is starved vs those cards, both in wiring and in bandwidth. It cannot play 1440p well and should never be considered for that, but 1080p all day is fine. Don't forget that there are also software innovations like FSR (now on version 3), Anti-Lag and Enhanced Sync to help mitigate the gap.

So when I got my X3D chip, I put it in my HTPC before my main rig, and yes, the 5800X3D made me smile at how a CPU upgrade could affect gaming at 1080p like a GPU upgrade. AMD will sell every 5800X3D they make. Do you know why the 7000 series isn't selling? Part of it is cost, part of it is power, but a huge part is that Intel deceptively showed that 13th Gen is, at best, as fast in gaming as the X3D chip, and combined with the price drop there will be plenty of people singing about pairing the X3D chip with a 7000-series GPU around Xmas. Just look at how many supporters are singing the praises of the X3D on this forum. I can understand your sentiment, though, as you don't have one.
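For reference, the PCIe arithmetic quoted earlier in the thread (a 4.0 x4 link carrying about as much as a 3.0 x8 link) checks out on paper. A quick sketch, assuming the nominal per-lane rates of ~0.985 GB/s for PCIe 3.0 and ~1.969 GB/s for PCIe 4.0 (after 128b/130b encoding; real-world throughput is a bit lower):

```python
# Nominal usable bandwidth per lane in GB/s for PCIe 3.0 (8 GT/s)
# and 4.0 (16 GT/s) with 128b/130b encoding. Spec-sheet values;
# achievable throughput is somewhat lower in practice.
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction link bandwidth for a given generation and width."""
    return PER_LANE_GBPS[gen] * lanes

print(f"PCIe 3.0 x4: {link_bandwidth('3.0', 4):.2f} GB/s")  # ~3.94 GB/s
print(f"PCIe 4.0 x4: {link_bandwidth('4.0', 4):.2f} GB/s")  # ~7.88 GB/s
print(f"PCIe 3.0 x8: {link_bandwidth('3.0', 8):.2f} GB/s")  # ~7.88 GB/s
```

So an x4-wired card like the 6500 XT really does get roughly double the link bandwidth on a PCIe 4.0 platform; whether that translates into 30% more FPS is a separate, game-dependent question.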


----------



## Gica (Nov 6, 2022)

kapone32 said:


> That is because you don't have a V cache enabled CPU. What you don't understand is that I used a 5600G and 5900X to get my information. Single CCDs are better at Gaming than dual but because the PCie generations are different it did indeed provide that performance so I thought ok let's get a 5600 (as they are inexpensive) as the 5900X is a beast and sure enough the same thing vs the 5600G. I know that cards like the 6800XT or 3080 would not really show the same but you have to keep in mind that the 6500XT is starved vs those cards both in wiring and bandwidth. It cannot play 1440P well and should never be considered for that but 1080P all day is fine. Don't forget that there are also software innovations like FSR (Which is now on Version 3) anti lag and Enhanced Sync to help mitigate the gap as well. So when I got my X3D chip, I put it in my HTPC before my main rig and yes the 5800X3D made me smile in how a CPU upgrade could have an effect on Gaming at 1080P like a GPU upgrade. AMD will sell every 5800X3D they make. Do you know why the 7000 are not selling. Part of it is cost, part of it is power but a huge part of it is Intel deceptively showed that the 13th Gen is at best in Gaming as fast as the X3D chip and combined with the price drop it will have plenty singing about using the X3D chip with a 7000 series GPU around Xmas. Just look at how many supporters are singing the praises of X3D on this forum but I can understand your sentiment as you don't have one.


Yes, I saw that claim from you: 5800X3D + 6500XT is 30% faster in gaming than 5900X + 6500XT. And I don't believe it! You are that fan who praises a video card that has serious problems at 1080p with details worthy of 2022. Try to demonstrate it through reviews in which the most powerful video cards were used; even with them, such a big jump was not possible. According to this review, between the 5900X and the 5800X3D there is only a 7% difference with the RTX 3080, a video card that renders hundreds of frames at 1080p and demands a lot from the processor.
You can't turn an RX 6500 XT into an RX 6600 with just the processor, believe me. Drivers bring extra performance, but I don't think AMD is spending its resources on that bad joke now. It got what it deserved at launch.


----------



## ratirt (Nov 6, 2022)

HWUB recently released a video comparing gaming performance with a 4090 across 4 CPUs. It does look interesting, especially on price-to-performance.


----------



## kapone32 (Nov 6, 2022)

Gica said:


> Yes, I saw that with you: 5800X3D + 6500XT is 30% more performant in gaming than 5900X + 6500XT. And I don't believe it! You are that fan who praises a video card that has serious problems in 1080p at details worthy of 2022. Try to demonstrate through reviews in which the most powerful video cards were used and that anyway, even with them, a jump was not possible so big. According to this review, between the 5900X and 5800X3D there is only a 7% difference with the RTX 3080, a video card that renders hundreds of frames in 1080p and demands a lot from the processor.
> You can't turn RX6500XT into RX6600 with just the processor, believe me. The drivers bring extra performance, but I don't think that AMD is now consuming its resources with that bad joke. They got what they deserved at its launch.


The chart you attached does not have the 5600G. Let's also drill down: when you look at Borderlands 3 at 1080p, what is the performance gap? Then, using that same data, think about what the difference would be between the 5600G and the 5800X3D, and your answer is there, but I guess it doesn't qualify because it doesn't do ray tracing and DLSS. There is FSR, though, and Unreal Engine 5 looks like it will be a nice rival to hardware-based ray tracing. So let me ask you this: which would need the CPU more, a card like the 3080 or the 6500XT? TPU and others have done enough to prove that the 5800X3D is the fastest gaming chip AMD makes, and whatever GPU you use will feel faster vs any other AM4 chip outside of the 5950X (if the game uses 1-2 threads). By the way, my 30% difference was actually PCIe 3 vs PCIe 4, but yes, the 5800X3D did feel faster than the 5900X. I can see that you have a desultory opinion of the card, and I ask you to just go on Newegg and read the user reviews. Consuming resources? Have you seen the size of the GPU on the 6500XT? They could probably make 4-6 6500XTs for every 6700XT.


----------



## Gica (Nov 6, 2022)

kapone32 said:


> *That chart you attached does not have the 5600G*. Let's also drill down and when you see Borderlands 3 at 1080P what is the performance gap? Then using that same data think about what the difference would be between the 5600G and 5800X3D and you answer is there but I guess it doesn't qualify because it doesn't do Ray Tracing and DLSS. There is FSR though and Unreal Engine 5 looks like it will be a nice rival to hardware based Ray Tracing. So let me ask you this, what would need the CPU more a card like the 3080 or the 6500XT? TPU and others have done enough to prove that the 5800X3D is the fastest Gaming chip AMD make and whatever GPU you use will feel faster vs any other AM4 outside of the 5950X (if the Game uses 1/2 threads). By the way my 30% difference was actually PCIe 3 vs PCIe 4 but yes the 5800X3D did feel faster than the 5900X. I can see that you have a desultory opinion of the card and ask you to just go on Newegg and read user reviews. Consuming resources? Have you seen the size of the GPU on the 6500XT? They could probably make 4-6 6500XT for every 6700XT.


You see, you've gotten your own lies mixed up. I've wasted too much time on an AMD Taliban.


kapone32 said:


> What is a middle card?* I put my x3d with a 6500XT and it gave me 30% more FPS in some Games than a 5900X connected to that but also a smoother experience.* I am not just talking out of my butt, I own all of those. No one is disputing that the 5800X is a better overall processor.  The fact remains that the x3D chip is a revelation in Gaming. I am confident that when the X3D chips launch that you will see more mass adoption of AM5. The chip was too expensive when it launched but as soon as it went below $500 CAD it started moving. It will sell quite well for the rest of the year as it's price keeps falling too. You should be glad because it will put price pressure on the rest of the lineup. If you are starting out and are budget conscious about the cost of an SSD you should get a 5600. Then you could buy a PSU too and maybe a case.


----------



## Nopa (Nov 6, 2022)

Gica said:


> an AMD Taliban.


First time seeing this... You're creative af, man!


----------



## kapone32 (Nov 6, 2022)

Gica said:


> You see that you have confused the lies. I wasted too much time with an AMD Taliban.


What was the difference between the 5900X and 5800X3D in Borderlands 3 at 1080P? 
Did you look? 
Did you read any user reviews of the 6500XT on Newegg? 
Did you watch the video posted from Hardware Unboxed? 

All of your points are based on opinions from what you have read, not real-world use. Calling me a terrorist because I like my 5800X3D and 6500XT is pretty juvenile, in my opinion, but it is just that, as you have already stated that you own none of this hardware yet seem so knowledgeable about it.

If you notice, I clearly stated some games. I assure you, though, that the 5800X3D is more than 30% faster than the 5600G using a 6500XT, but you don't have to believe me. This thread is not about the 5600G or the 6500XT, but is there anyone who thinks you won't notice a difference going from a 5600G to a 5800X3D? Would it be foolish to even ask that question?


----------



## The King (Nov 18, 2022)

@W1zzard here are the results of the performance difference between the 5800X3D and the 5800X using the same DR RAM kit and GPU.

AMD Ryzen 7 5800X3D Review - The Magic of 3D V-Cache (www.techpowerup.com)

I'm calling BS on this comparison; you're making the 5800X look far worse than it actually is, and the difference is not "HUGE" as you put it.


----------



## W1zzard (Nov 18, 2022)

The King said:


> @W1zzard here are the results from performance difference between 5800X3D and 5800X using same DR RAM kit and GPU.
> View attachment 270470
> 
> View attachment 270469
> ...


RTX 3080


----------



## The King (Nov 18, 2022)

W1zzard said:


> RTX 3080


I still feel that you not running the same RAM config is playing a huge role in the performance difference between the two Zen 3 CPUs.
If the 5800X performs 8% better when running DR, that will definitely make a difference, no?

I would be very surprised if you got the same FPS in all the games tested after changing the RAM config to the same one used on the 5800X3D (2x16GB 3600 14-14-14).
It would mean DR setups on Zen 3 just don't work at all and do not increase FPS in games.


----------



## FreezingPC (Nov 18, 2022)

Again;



> Test System​*The goal of this review is NOT to test "5800X vs 5800X3D at similar config," but "The current GPU Test System that I have right now, a decent but slightly aged config, vs 5800X3D the way you would build it today" to find out how much of a difference an upgrade can bring.*



And even if you use the same RAM:

With a 3090 Ti, the 5800X3D already slapped the 5800X, with a 15% performance difference...








Ryzen 7 5800X3D vs. Ryzen 7 5800X: Zen 3 Gaming Shootout (www.techspot.com)





In a more recent review, with a 4090, we get a 22% performance gap:








Intel Core i5-13600K vs. AMD Ryzen 5 7600X (www.techspot.com)





What sort of *miracle* are people expecting from RAM when the gap is this large???


----------



## The King (Nov 19, 2022)

FreezingPC said:


> Again;
> 
> 
> 
> ...



Is this not a tech forum? It is well established that Zen 3 CPUs can benefit by up to 8% when running a dual-rank RAM config vs single rank. This is a fact, not an opinion.
It was confirmed by several hardware reviewers shortly after Zen 3 launched.

The 5800X was always going to lose; that won't change.

The 5800X3D would still beat the 5800X even if the RAM config was switched around; heck, the 5800X3D might even beat the 5800X with a single stick of RAM, that's how fast it is. That is not the point at all.

The point is that the 5800X setup has many things stacked against it: very questionable DDR4-4000 CL20 RCD 23 RAM against a dual-rank setup of 3600 CL14-14-14.

Whatever the goal of the test is, it paints the 5800X as far worse against the 5800X3D in gaming than it actually is when they share the same RAM config. That is my point.

Many people won't even look at the test config or read the whole article, and will only look at the charts. I personally am running dual-rank setups on all three of my Zen 3 CPUs.

I would expect anyone else running Zen 3 who wants the best FPS in gaming to upgrade their RAM before anything else like a CPU or GPU that would cost much more!

Let me add some commentary to this.

Hey guys, so today we're going to see how much faster the 5800X3D is compared to a 5800X in games. But wait, we'll give the 5800X3D a dual-rank setup so that it can achieve its maximum performance level, but leave the 5800X at an 8% avg FPS disadvantage by running it single rank??? You see the problem here?

I don't care that the 5800X loses. The issue is by how much, and this test setup is simply very biased towards the 5800X3D.




> *The goal of this review is NOT to test "5800X vs 5800X3D at similar config," but "The current GPU Test System that I have right now, a decent but slightly aged config, vs 5800X3D the way you would build it today" to find out how much of a difference an upgrade can bring.*


This shows what upgrading both the CPU and the RAM will do, not just upgrading the CPU alone, so that is an important factor in the gaming performance comparison between these two setups.
The 5800X3D also benefits from running dual rank, just like all other Zen 3 CPUs do.

As for this *note in bold*: I certainly never saw it before, and I'd bet the majority of people on this site missed it as well. At the end of the day, the gaming performance data is biased in the 5800X3D's favor.
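If we take the ~8% dual-rank uplift at face value, a rough like-for-like estimate is easy to make. This is only a sketch, assuming the uplift is multiplicative and applies uniformly across games (a simplification):

```python
def adjusted_gap(measured_gap_pct: float, ram_uplift_pct: float = 8.0) -> float:
    """Estimated 5800X3D lead if the 5800X also ran the faster dual-rank RAM.

    measured_gap_pct: X3D lead over the single-rank 5800X, in percent.
    ram_uplift_pct: assumed dual-rank benefit for the 5800X (~8% here).
    """
    return ((1 + measured_gap_pct / 100) / (1 + ram_uplift_pct / 100) - 1) * 100

# Example: a measured 25% lead shrinks to ~15.7% like-for-like.
print(f"{adjusted_gap(25):.1f}%")
```

So the RAM config doesn't erase the X3D's lead, but it can shave several points off the headline gap, which is the whole argument here.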


----------



## FreezingPC (Nov 19, 2022)

The King said:


> At the end of the day the gaming performance data is bias in the 5800X3D favor.


Yes, now people just need to get over it and stop expecting RAM to do miracles when you have this wide a gap...
Easy...


----------



## The King (Nov 19, 2022)

FreezingPC said:


> Yes, now people just need to get over it and stop expecting ram to do miracles when you have this wide of a gap...
> Easy...


So an 8-17% increase in gaming FPS is just not significant, I guess. 
And yet again, I'm not sure if you have a comprehension problem. This does affect the gap!


----------



## FreezingPC (Nov 19, 2022)

The King said:


> So 8-17% fps increasing in gaming fps is just not significant i guess.
> And yet again not sure if you have a comprehension problem. This does affect the gap!



With the 3200 MHz CL14 dual-rank kit as the reference, on average the 3200 MHz CL14 single-rank is ~3% slower and the tuned 4000 MHz CL16 is ~3% faster...

GN's review settings configuration is ~5% slower (vs stock) on average...

Unless I start cherry-picking the results (taking only the 8%-or-more cases, or taking even slower RAM results for some reason), the two videos gave an average that is within the margin of error...
It's pretty underwhelming, and I really don't see the purpose of continuing the RAM discussion...

*"The goal of this review is NOT to test '5800X vs 5800X3D at similar config'"* - The end, and GL to the others in getting over this.


----------



## The King (Nov 19, 2022)

FreezingPC said:


> The 3200MHz cl14 dual ranked being the reference, on average, the 3200MHz cl14 single is 3%~ slower and the 4000MHz CL16 tuned is 3%~ faster...
> 
> GN review settings configuration is 5%~ slower (vs stock) on average...
> 
> ...


You clearly know nothing about RAM if you want to compare 4000 CL16, which is a good CAS latency, to 4000 CL20, which is not even close.
I'd rather end this topic here.
I made all the points I needed to in my posts above.



> *" *The goal of this review is NOT to test "5800X vs 5800X3D at similar config,"*  " - *The end and GL for others to get over this.


What is the title of this thread/article?
RTX 4090 & 53 Games: *Ryzen 7 5800X vs Ryzen 7 5800X3D*


----------



## FreezingPC (Nov 20, 2022)

The King said:


> What is the title of this thread \ article?


If your problem is the title, then I advise you to talk to the person who made this thread in the first place, since a random TechPowerUp forum user can't change this type of stuff...


----------



## Cjsev (Nov 20, 2022)

So, if you own a 4080 running at 1440p @ 240 Hz and have a 5800X CPU, is it worth upgrading to the 5800X3D?


----------



## The King (Nov 21, 2022)

Cjsev said:


> So, if you own a 4080 running on a 1440 @ 240z and have a 5800x cpu, is it worth upgrading to the 5800x 3d?


A 9% AVG increase in performance @ 1440p using a 3090 Ti and the *same RAM config*. The gap could be bigger with the RTX 4080, as it is around 12% faster than the 3090 Ti @ 1440p.

Since this TPU test does not use the same RAM config, it is hard to use it as a fair comparison. Yes, I went there!  

In some games you won't get any FPS improvement at all, so you need to look at which games you'd be playing; only you can answer whether it's worth it or not. If you plan on selling your 5800X to cover the cost of the upgrade, then it should be worth it.









Ryzen 7 5800X3D vs. Ryzen 7 5800X: Zen 3 Gaming Shootout (www.techspot.com)


----------



## 529th (Nov 26, 2022)

Mussels said:


> Good way to explain it. For some time it was great at being the top of FPS charts as well, but the new gen have outdone it there - at much higher wattages, but not always keeping those lows up
> 
> I'm expecting mine to age like the old i7 920 did, or the golden era K chips (2500k through 4790K) where owners of those are still able to game on them today



I immediately saw the same correlation with my old X58 board and the 6-core 32 nm chips I was using in it. I'll be gaming on this 3D for a few years or more, doing only GPU upgrades.


----------



## cvearl (Nov 28, 2022)

Thanks for the AM4 parting gift, AMD! I sold my 5800X for $250 and bought the 5800X3D for $450, and got Uncharted for free, LOL. So a drop-in CPU upgrade for $150 with this kind of increase in performance? Wow. AM4 truly is a legend. Now if only AMD could get the price of B650 boards in line with B550 boards. With the new 7000-series pricing, we would be back on the path of value once again.


----------



## Hotobu (Jan 1, 2023)

Are the game settings listed anywhere?


----------



## Mussels (Jan 2, 2023)

Hotobu said:


> Are the game settings listed anywhere?


Traditionally they're run at the highest in-game preset, but it may not be specified in this article

@W1zzard  ?


----------



## W1zzard (Jan 2, 2023)

Mussels said:


> Traditionally they're ran at the highest in-game preset, but it may not be specified in this article
> 
> @W1zzard  ?


Yeah, it's max settings pretty much. Some games run slightly higher than the ultra-preset equivalent when that preset leaves some settings not maxed out. In some games I turn off things that are pretty much vendor-specific, like HairWorks in The Witcher


----------

