# Intel 10th gen beats Ryzen in gaming by 40-60 fps when using high speed ram...



## Space Lynx (Jul 14, 2020)

Why is no one talking about this? It's absolutely insane the gains...

@R-T-B

Am I missing something? Why are the gains so insane???? Ryzen is history... I mean yeah the ram costs $200 versus $100 for the same capacity, but small price to pay for 40-60 fps gains across the board...



gamersnexus review of high speed ram with intel i5-10600k beats ryzen 3700x by 60-70 fps in many games... and I can't believe no one is talking about that. its absolutely insane imo.


----------



## EzioAs (Jul 14, 2020)

Maybe you linked the wrong source because I don't see how you reach the "high speed ram with intel i5-10600k beats ryzen 3700x by 60-70 fps in many games" conclusion. Emphasis on the 60-70 fps in many games.

The biggest gap I can see is with F1 2019 (1080p w/high settings) and even then, it's 51 fps at most. Enlighten me if I'm wrong.


----------



## Space Lynx (Jul 14, 2020)

EzioAs said:


> Maybe you linked the wrong source because I don't see how you reach the "high speed ram with intel i5-10600k beats ryzen 3700x by 60-70 fps in many games" conclusion. Emphasis on the 60-70 fps in many games.
> 
> The biggest gap I can see is with F1 2019 (1080p w/high settings) and even then, it's 51 fps at most. Enlighten me if I'm wrong.



i meant ryzen 3600 - for example in shadow of the tomb raider, the i5-10600k with high speed ram gets 178 fps, and versus a ryzen 3600 with ddr4 3200 cas 14 it's a 40 fps difference.  i guess i read it wrong, but still 40-50 fps doesn't change my point... insane gains...

i mean it's $100 more for 4000 cas 15 ram over cas 14 3200 ram, and $75 more for a 10600k vs a 3600, and mobo costs are about the same - the MSI Z490-A Pro is $160 and has great vrm cooling....  we pay what, $200-400 premiums on graphics cards for 15-30 fps gains...
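The cost math in that post can be sketched quickly. This is a rough illustration using the post's own figures only; the 40 fps gain is the Shadow of the Tomb Raider example and will vary per game:

```python
# Platform premium for the Intel build described above, using the post's
# own price deltas. Illustrative only - prices and fps gains vary.
intel_premium = {
    "ram (4000 cas 15 vs 3200 cas 14)": 100,
    "cpu (10600K vs 3600)": 75,
    "motherboard (Z490-A Pro vs similar)": 0,
}

total = sum(intel_premium.values())
fps_gain = 40  # low end of the claimed 40-50 fps spread
print(f"premium: ${total}, cost per extra fps: ${total / fps_gain:.2f}")
```

At the claimed gains that works out to only a few dollars per extra frame, which is the comparison the post is drawing against GPU upgrades.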


----------



## phanbuey (Jul 14, 2020)

People do know this, and have for a while - this has been true since the 8700K / 8600K launch.  10th Gen is basically exactly the same crap.  High end/low latency ram on coffee lake would stomp ryzen up and down in academic gaming numbers.  The main issue is ppl dont care as much when CPU bound vs GPU bound...

Take your shadow of the tomb raider example:
You would only notice this IF you had a 1080P 165hz+ screen and you were running a high end card, and you cared enough  to notice those FPS.

If any part of that scenario changes, i.e. if you're on a 1440P screen, or you're running a budget build with a lesser video card or slower ram, even on a 120hz 1080p monitor, you don't care about or feel that 40 FPS from the CPU at all.  Those extra FPS never really amount to that much irl.

When it comes to the GPU it's a different story: there you're paying for 15% more FPS in, say, 4K or more eyecandy or whatever, so it's more noticeable than 40 fps when you're already pushing 140, because it allows you to experience more.


----------



## EzioAs (Jul 14, 2020)

Should also mention that the i5-10600K is OC'ed to 5.1GHz. I mean, yeah it's nice to look at, but it doesn't really change what a lot of us know about the current state of CPUs from both sides. If you focus primarily on gaming, the i5-10600K is the best gaming CPU, period. That was the conclusion many reviewers reached when it originally launched. Overclock it and add higher speed memory (though I'm not sure what speed the 10th gen CPUs are comfortable at for daily use), and it becomes even faster, no surprises there. However, like many reviewers have noted, if you do more than just gaming and want to focus more on productivity workloads, Ryzen 3000 CPUs are usually the better performer/value (outside of the XT variants of course). 

I guess the reason most people here don't bother adding more to what's shown in the video above is that these results aren't really that surprising, although they might be to some people. In the end, people buy what fits their needs and budget, and if they're happy with what they chose, there's not much more to add, is there?


----------



## Kanan (Jul 14, 2020)

Intel has an academically relevant FPS advantage in games, as in, barely relevant, while having an inferior architecture with security problems and no PCI-E 4.0 support, which may very well be relevant soon with the coming gen of graphics cards. Plus, Intel's CPUs are awfully inefficient, especially when overclocked. And 170 bucks extra may not be much money for you, but for most people it is rather significant.


----------



## Space Lynx (Jul 14, 2020)

phanbuey said:


> People do know this, and have for a while - this has been true since the 8700K / 8600K launch.  10th Gen is basically exactly the same crap.  High end/low latency ram on coffee lake would stomp ryzen up and down in academic gaming numbers.  The main issue is ppl dont care as much when CPU bound vs GPU bound...
> 
> Take your shadow of the tomb raider example:
> You would only notice this IF you had a 1080P 165hz+ screen and you were running a high end card, and you cared enough  to notice those FPS.
> ...




this makes sense, plus @Mussels pointed out to me the test was done with a 2080 ti, which skews everything a bit more since it's an unrealistic gpu for most people.  and that is true, I would have liked to see the tests done with a stock 2070 super or something to see if the gains are as wide, and more 1440p tests, cause if you have that rig and you are gaming at 1080p you are doing it wrong, lol

still I find it impressive that ram can deliver so much in gains... it's really impressive how far ram has come. i remember in ddr3 days when I built my first PC from ground up, my buddy just told me ram is ram try not to worry about it, and I think that was true back then, not so much anymore.


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> i remember in ddr3 days when I built my first PC from ground up, my buddy just told me ram is ram try not to worry about it, and I think that was true back then, not so much anymore.


The first days I remember Ram being more significant were with DDR (1), when I bought my first high quality kit, for my Athlon 64 3800+ at the time. It had to be high quality because, via FSB overclocking (the multiplier was locked on the regular CPUs), the Ram was overclocked by 20% as well, to 240 MHz. The CPU then ran at 2400 instead of 2000 MHz. The Ram even did as much as 280 MHz without problems. Another good kit I bought later was for the Phenom II 940, a DDR2 kit with 800 MHz and CL9 timings.

This is in contrast to my first days with a PC, when I would find spare EDO Ram sticks lying around in my father's office and simply insert them into the empty slots of my Asus Socket 7 board at the time, which had only 2 of 6 slots populated. It was a nice increase from 32 MB to 196 MB.


----------



## Space Lynx (Jul 14, 2020)

Kanan said:


> The first days I remember Ram being more significant were with DDR (1), when I bought my first high quality kit, for my Athlon 64 3800+ at the time. It had to be high quality because, via FSB overclocking (the multiplier was locked on the regular CPUs), the Ram was overclocked by 20% as well, to 240 MHz. The CPU then ran at 2400 instead of 2000 MHz. The Ram even did as much as 280 MHz without problems. Another good kit I bought later was for the Phenom II 940, a DDR2 kit with 800 MHz and CL9 timings.
> 
> This is in contrast to my first days with a PC, when I would find spare EDO Ram sticks lying around in my father's office and simply insert them into the empty slots of my Asus Socket 7 board at the time, which had only 2 of 6 slots populated. It was a nice increase from 32 MB to 196 MB.




Did that actually increase FPS in games by large margins though? That is my point here.


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> Did that actually increase FPS in games by large margins though? That is my point here.


Back then every bit helped. Since 2008 or so, CPU performance has been in excess because of the advent of quad cores and games actually using more than just one core. In 2006 or earlier every bit helped, but reviews with actual data about that, or people generally caring about CAS timings, were rather rare. You only wanted higher quality kits for the better OCs, which helped of course. For the Phenom II 940, which was the AM2+ version of a CPU more widely used with DDR3 on AM3 boards, it helped a lot as well, because you had to counteract the loss of bandwidth with sharp timings; those were the only upside of DDR2 compared to DDR3, aside from the actually affordable prices back then.


----------



## Space Lynx (Jul 14, 2020)

does anyone know how much FPS I would gain at 1440p high refresh with everything the same on a more realistic system, say a 2070 super and ryzen 3700x, versus what the gamersnexus 2080 ti benches at 1080p show?

and the only variable change would be ram - say realistic ram - 3200 cas 16-18-18 vs 3600 cas 16-16-16
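One quick back-of-the-envelope way to compare those two kits is first-word latency (CAS cycles divided by the memory clock). This ignores the secondary timings and real memory-controller behaviour, so treat it as a rough sketch only:

```python
def first_word_latency_ns(mts: int, cas: int) -> float:
    # DDR transfers twice per clock, so clock (MHz) = MT/s / 2;
    # latency (ns) = CAS cycles / clock (GHz) = cas * 2000 / mts
    return cas * 2000 / mts

kits = {
    "ddr4-3200 cas 16-18-18": (3200, 16),
    "ddr4-3600 cas 16-16-16": (3600, 16),
}
for name, (mts, cas) in kits.items():
    print(f"{name}: {first_word_latency_ns(mts, cas):.2f} ns first word, "
          f"{mts * 8 / 1000:.1f} GB/s peak per channel")
```

By this metric the 3600 CL16 kit is about 11% quicker to first word (roughly 8.9 ns vs 10.0 ns) on top of its bandwidth advantage, though how much of that shows up as FPS depends on the game.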


----------



## Chomiq (Jul 14, 2020)

lynx29 said:


> Why is no one talking about this? It's absolutely insane the gains...
> 
> @R-T-B
> 
> ...


You're missing the bit about cpus not running at identical clock speeds. Increased clock speed, higher ram speed, reduced latency and there you have it. Clear advantage in cpu performance.


----------



## Assimilator (Jul 14, 2020)

I wish you would stop posting these BS clickbait "Intel is bettar than Ryzen OMG WTF LOL" threads. Or at least, read further than the end of your own nose before people point out to you that you should have read further than the end of your nose, before posting said threads.

I would've thought you would've learned from the previous one, but as they say, the definition of insanity...


----------



## Space Lynx (Jul 14, 2020)

Assimilator said:


> I wish you would stop posting these BS clickbait "Intel is bettar than Ryzen OMG WTF LOL" threads. Or at least, read further than the end of your own nose before people point out to you that you should have read further than the end of your nose, before posting said threads.
> 
> I would've thought you would've learned from the previous one, but as they say, the definition of insanity...



huh, did you not read any of my posts? I didn't realize the 2080 ti was the skewing factor and I am now trying to find more realistic results... maybe you should not post on my threads unless you have something valuable to add?  welcome to my ignore list. you are of no help.


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> huh, did you not read any of my posts? I didn't realize the 2080 ti was the skewing factor and I am now trying to find more realistic results... maybe you should not post on my threads unless you have something valuable to add?  welcome to my ignore list. you are of no help.


What do all these Youtubers have in common? They all recommend Ryzen, why do you think that is? Simply watch videos to the end, they always add a disclaimer and tell which CPUs are the best, regarding value and performance.

That said, a lot of GamersNexus videos are highly made for entertainment. They do a lot of freakish benchmark runs, that are barely realistic.


----------



## Space Lynx (Jul 14, 2020)

Kanan said:


> What do all these Youtubers have in common? They all recommend Ryzen, why do you think that is? Simply watch videos to the end, they always add a disclaimer and tell which CPUs are the best, regarding value and performance.
> 
> That said, a lot of GamersNexus videos are highly made for entertainment. They do a lot of freakish benchmark runs, that are barely realistic.



gamersnexus is an enthusiast channel (some people enjoy watching that) and trustworthy imo.  he has a wide variety of videos catering to many budgets.


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> gamersnexus is an enthusiast channel (some people enjoy watching that) and trustworthy imo.  he has a wide variety of videos catering to many budgets.


Didn't say anything to the contrary, I'm just saying, not all his videos are to be taken for realistic world usage, unless he adds his disclaimer at the end, as per usual, that these combinations are highly unlikely to happen. Super expensive Ram and a 1200$ card with a midrange CPU - I don't think so.


----------



## Space Lynx (Jul 14, 2020)

Kanan said:


> Didn't say anything to the contrary, I'm just saying, not all his videos are to be taken for realistic world usage, unless he adds his disclaimer at the end, as per usual, that these combinations are highly unlikely to happen. Super expensive Ram and a 1200$ card with a midrange CPU - I don't think so.



I agree, but I didn't realize this until Mussels pointed it out to me, which is why I mentioned it in an earlier post (making your comment redundant)

I'm now curious what realistic builds on mid-range budgets look like if you just take, say, a stock 3600 and some high end ram paired with a mid-range gpu, as mentioned in a previous post. I cannot find any benches for this and I think it would be interesting to see the results. Could be some interesting gains for only $100 more, or maybe it all levels off because it really was the 2080 ti doing the skewing? we won't know without a thorough and controlled study that only people like w1zz and gamersnexus are capable of.  also i would love to see some older games tested again for this stuff, like witcher 3. why everyone is using F1 racing, a game 3% of gamers play, is beyond me.


----------



## kayjay010101 (Jul 14, 2020)

Kanan said:


> Didn't say anything to the contrary, I'm just saying, not all his videos are to be taken for realistic world usage, unless he adds his disclaimer at the end, as per usual, that these combinations are highly unlikely to happen. Super expensive Ram and a 1200$ card with a midrange CPU - I don't think so.


You really don't understand how benchmarking works, do you? You need to eliminate potential bottlenecks when testing a specific part so that you're actually testing the part. If they built realistic builds that were balanced, they'd 9/10 times just be testing the GPUs when they're intending to test CPUs, since most games are GPU limited. By going with a standardized build that just swaps CPUs you eliminate those variables. Their videos aren't to be taken as purchasing advice for builds; they never advertise themselves as that. Their videos are benchmarks and reviews of single products, compared against other single products. They aren't there to make build guides for you.

Also, Steve is not the only one at GamersNexus. They have a whole team, Steve is just the face of the channel and the main writer.


----------



## Space Lynx (Jul 14, 2020)

kayjay010101 said:


> You really don't understand how benchmarking works, do you? You need to eliminate potential bottlenecks when testing a specific part so that you're actually testing the part. If they built realistic builds that were balanced, they'd 9/10 times just be testing the GPUs when they're intending to test CPUs, since most games are GPU limited. By going with a standardized build that just swaps CPUs you eliminate those variables.
> 
> Also, Steve is not the only one at GamersNexus. They have a whole team, Steve is just the face of the channel and the main writer.



ram is a different ball game these days though, are you not curious at all what 3800 cas 15/16 ram can do with a ryzen 3600 and rtx 2070 super at 1440p?  i mean this ram is very affordable now, $95 for 3200 cas 14 and then oc it with ryzen calculator.  i am really interested to see how much it gains over someone who just went full budget with $60 ram at 3200 cas 16


----------



## Kanan (Jul 14, 2020)

kayjay010101 said:


> You really don't understand how benchmarking works, do you? You need to eliminate potential bottlenecks when testing a specific part so that you're actually testing the part. If they built realistic builds that were balanced, they'd 9/10 times just be testing the GPUs when they're intending to test CPUs, since most games are GPU limited. By going with a standardized build that just swaps CPUs you eliminate those variables.
> 
> Also, Steve is not the only one at GamersNexus. They have a whole team, Steve is just the face of the channel and the main writer.


I do understand it, don't get angry. My point is just that nobody will sport this combination (in this topic's video), unless he does it for fun. A lot of his videos are interesting *and* realistic. This one isn't. Also, where exactly did I say he is one guy? I follow this channel regularly, I know that he isn't alone. Nor can he be, with so much work at hand.


----------



## kayjay010101 (Jul 14, 2020)

lynx29 said:


> ram is a different ball game these days though, are you not curious at all what 3800 cas 15/16 ram can do with a ryzen 3600 and rtx 2070 super at 1440p?  i mean this ram is very affordable now, $95 for 3200 cas 14 and then oc it with ryzen calculator.  i am really interested to see how much it gains over someone who just went full budget with $60 ram at 3200 cas 16


I have a 3700X with 3600CL14-14-14-28 timings. Before that I had 3200 CL16 with the same setup. This was at 1440p. At most I've seen a 3-4 FPS difference between the two, the real difference was in the AIDA64 memory benchmark.


----------



## Space Lynx (Jul 14, 2020)

kayjay010101 said:


> I have a 3700X with 3600CL14-14-14-28 timings. Before that I had 3200 CL16 with the same setup. This was at 1440p. At most I've seen a 3-4 FPS difference between the two, the real difference was in the AIDA64 memory benchmark.



see, I find those results very interesting, though it is only one person.  i would expect a 10 fps increase, especially seeing as how it scales to 40-50 fps when you bring in the 2080 ti at 1080p. or maybe 1440p is what is stopping the ram performance gains from scaling? this is why i wish benchmarkers did budget build tests. something doesn't make sense here, it should be at least 10 fps. i know it doesn't make sense to match the 40-50 fps, but i would think it would be more than that even scaled down


----------



## kayjay010101 (Jul 14, 2020)

Kanan said:


> I do understand it, don't get angry. My point is just that nobody will sport this combination (in this topic's video), unless he does it for fun.


My point is your point is irrelevant. GN isn't making a suggestion, they're just doing benchmarks to test.



Kanan said:


> A lot of his videos are interesting *and* realistic. This one isn't.



I would argue most of their videos aren't realistic. They're supposed to eliminate variables that in the real world would be present.



Kanan said:


> Also, where exactly did I say he is one guy? I follow this channel regularly, I know that he isn't alone. Nor can he be, with so much work at hand.


You keep referring to *their* channel as *his*.



lynx29 said:


> see, I find those results very interesting, though it is only one person.  i would expect a 10 fps increase, especially seeing as how it scales to 40-50 fps when you bring in the 2080 ti at 1080p. or maybe 1440p is what is stopping the ram performance gains from scaling? this is why i wish benchmarkers did budget build tests. something doesn't make sense here, it should be at least 10 fps. i know it doesn't make sense to match the 40-50 fps, but i would think it would be more than that even scaled down


I don't know if those numbers are right btw, I haven't really done any actual testing. It's just my observation that I haven't noticed any higher FPS numbers after upgrading the RAM, but without doing actual benchmarks my words are naught.


----------



## Kanan (Jul 14, 2020)

kayjay010101 said:


> My point is your point is irrelevant. GN isn't making a suggestion, they're just doing benchmarks to test.
> 
> 
> 
> ...


Calm down. First of all, GN does plenty of testing and recommending. And they also do a lot of enthusiast videos for entertainment. Both. My point isn't irrelevant, because the OP thought Ryzen is "so much" inferior, while it really isn't. In real world usage, with normal resolutions, RAM sticks and everyman's GPUs, the difference between the CPUs is marginal. And then, you can switch from Zen 2 to Zen 3, while with Intel you have a dead end ahead.


----------



## Space Lynx (Jul 14, 2020)

Kanan said:


> Calm down. First of all, GN does plenty of testing and recommending. And they also do a lot of enthusiast videos for entertainment. Both. My point isn't irrelevant, because the OP thought Ryzen is "so much" inferior, while it really isn't. In real world usage, with normal resolutions, RAM sticks and everyman's GPUs, the difference between the CPUs is marginal. And then, you can switch from Zen 2 to Zen 3, while with Intel you have a dead end ahead.



well I currently own a ryzen and an msi x570 tomahawk, so this has nothing to do with fanboy stuff. I am simply interested in the numbers from GN benchmarks and the implementation of faster ram, now that it is affordable for everyone to own B-Die ram at $95 for 16 gigs. there does seem to be some gains with both intel and amd, and Intel seems to benefit the most from the ram, but again we are not seeing real world tests, as you said and as I have mentioned multiple times now, which is why i am still interested in this thread... because I still feel like spending $30 extra on ram can have enormous benefits for even real world builds, but I am struggling to find benches for that.

at this point though I can see not caring about context is going to just keep drawing hate to this thread... TPU has become toxic imo, so i guess nm and close thread mods. i thought we were enthusiasts here who enjoyed overclocking and pushing tech to the limits, but I guess not.


----------



## kayjay010101 (Jul 14, 2020)

Kanan said:


> Calm down.


Why do you keep telling me to calm down? I'm not mad



Kanan said:


> First of all, GN is testing and recommending a lot. And they also do a lot of enthusiast videos for entertainment. Both.


Yes, separately. Their "entertaining" videos aren't meant to be taken as recommendations, and their serious testing isn't supposed to be ultra mega entertaining (although that doesn't mean they're a snorefest). They know to keep the two separate. 

Either way, my rebuttal was towards your comment that their testing videos are to be taken as realistic build scenarios, which they're not. Although reading a bit more into your comment it seems I skipped a few words which indicated you were separating the two, which is my bad. It seems I jumped the gun a bit here.



Kanan said:


> My point isn't irrelevant, because the OP thought Ryzen is "so much" inferior, while it really isn't. In real world usage, with normal resolutions, RAM sticks and everyman's GPUs, the difference between the CPUs is marginal. And then, you can switch from Zen 2 to Zen 3, while with Intel you have a dead end ahead.


Agreed, I wasn't refuting this point though.


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> well I currently own a ryzen and an msi x570 tomahawk, so this has nothing to do with fanboy stuff. I am simply interested in the numbers from GN benchmarks and the implementation of faster ram, now that it is affordable for everyone to own B-Die ram at $95 for 16 gigs. there does seem to be some gains with both intel and amd, and Intel seems to benefit the most from the ram, but again we are not seeing real world tests, as you said and as I have mentioned multiple times now, which is why i am still interested in this thread... because I still feel like spending $30 extra on ram can have enormous benefits for even real world builds, but I am struggling to find benches for that.
> 
> at this point though I can see not caring about context is going to just keep drawing hate to this thread... TPU has become toxic imo, so i guess nm and close thread mods. i thought we were enthusiasts here who enjoyed overclocking and pushing tech to the limits, but I guess not.


Sorry if I came across as toxic, not my intention - I tried to be fair here. Hype is bad for hardware communities, and this is the issue here. Good hype, bad hype, both are annoying and misleading.

When it comes to RAM, I recommend you don't go over 200 bucks (for a 32 GB kit), and certainly don't pay more for RAM than for the CPU.


lynx29 said:


> at this point though I can see not caring about context is going to just keep drawing hate to this thread... TPU has become toxic imo, so i guess nm and close thread mods. i thought we were enthusiasts here who enjoyed overclocking and pushing tech to the limits, but I guess not.


True. For enthusiasts look in the club forum thread Zen Garden, though, or over at Overclocking. Everything has its place here.


kayjay010101 said:


> Although reading a bit more into your comment it seems I skipped a few words which indicated you were separating the two, which is my bad. It seems I jumped the gun a bit here.


No problem.


----------



## Space Lynx (Jul 14, 2020)

Kanan said:


> Sorry if I came across as toxic, not my intention - I tried to be fair here. Hype is bad for hardware communities, and this is the issue here. Good hype, bad hype, both are annoying and misleading.
> 
> When it comes to RAM, I recommend you don't go over 200 bucks (for a 32 GB kit), and certainly don't pay more for RAM than for the CPU.
> 
> ...



Wasn't referring to just you. also I just asked this same question on overclock.net and got some healthy answers, so I guess from now on if I get excited about something tech/enthusiast related I will just ask there. cheers, and the thread can be closed now, i got my answers at OCN

for some reason on TPU everything must be fanboy related if it's a new topic... i own a gtx 1070 laptop and a ryzen msi x570 tomahawk desktop... it's nothing to do with fanboyism for me, im just genuinely interested in the latest tech and what is going on. if I read the numbers wrong that is on me, and I admitted that, and thanks again to Mussels for pointing that out to me. that doesn't mean I shouldn't get excited about new possibilities in builds. this is one of my main hobbies


----------



## Melvis (Jul 14, 2020)

Nope

and it's not even a 20FPS difference comparing both tuned OC scores with the link you gave us, not this BS 60-70FPS crap you're saying


----------



## phill (Jul 14, 2020)

I might have to do some testing of my own when I get the hardware through....  I believe I have enough systems to compare to.....

Thing is, we all know Ryzen won't do as well in games, but how many people buy a 2080 Ti, run a 3600 or such Intel equivalent, and then game at 1080p??  I can't see it.  I know I run triple screens at 1080p, but that's because I don't see a monitor out currently that I'd like to upgrade to; when I wish to buy something decent, I want to make sure it's worth the cash.  I'd love some higher res monitors, so I am looking, my 1080 Ti's need a stretch....
Once you go past 1080p, to 1440/1600 and then to 4k, there's really not enough difference for me personally to worry about the few fps.  Right now to me, AMD makes the sensible option..


----------



## R-T-B (Jul 15, 2020)

If I didn't know better, I'd think you might've hit a nerve, lynx.

People need to chill.  That's all I got.


----------



## Papahyooie (Jul 15, 2020)

For nearly twice the price, it'd better stomp lol.


----------



## R0H1T (Jul 15, 2020)

lynx29 said:


> now that it is affordable for everyone to own B-Die ram at $95 for 16 gigs.


Well you're wrong there, it isn't *cheap* for the vast majority of the world outside the US. Pretty sure B-die RAM starts at least 20% higher over here, and similarly in Oz, Europe, much of Asia (except perhaps China) et al.


----------



## Nordic (Jul 15, 2020)

phill said:


> How many people buy a 2080 Ti, run a 3600 or such Intel equivalent, and then game at 1080p??


Competitive gamers, competitive wannabes, and people who like smooth fps. That is who runs hardware like that.


----------



## R-T-B (Jul 15, 2020)

Nordic said:


> Competitive gamers, competitive wannabes, and people who like smooth fps. That is who runs hardware like that.



And people like me, who run Kerbal Space Program, the most singlethreaded physics simulator known to man (and loads of fun too!)


----------



## cucker tarlson (Jul 15, 2020)

there is nothing wrong with testing a cpu in gaming without situations where you're gpu bound. unless you are an idiot you understand that is part of the test. in every part of a cpu review the focus is on the cpu, not only the gaming part of it. do you enable cuda acceleration for video tests and nvenc for streaming tests?

there is nothing wrong with presenting unrealistic setups, like someone buying a 3600xt and pairing it with 3800 c13 ram. it's what tech channels are supposed to do for us.

there is, however, a problem with people not being smart enough to interpret those results and putting themselves on opposite sides of the barricade. what is the difference? it's usually neither 40 fps nor 4 fps. the difference is certainly there and it's visible. if it's *that* big in cpu testing, and you have to use 1440p ultra on a 2070/2080 level card to bring it down to a couple of percent, that means the difference will show now and even more in the future.

how hilarious it is that those who think 2080 Ti testing is unrealistic will instantly believe "news" from clickbait video channels about navi 20 being 50% faster than the 2080 Ti, and fail to see any dissonance.


----------



## Nordic (Jul 15, 2020)

R-T-B said:


> And people like me, who run Kerbal Space Program, the most singlethreaded physics simulator known to man (and loads of fun too!)


Do you need a 2080ti for that?


----------



## R-T-B (Jul 15, 2020)

Nordic said:


> Do you need a 2080ti for that?



You might with the right absurd visual mods, but it's more cpu-bound than anything.  It likely won't care whether it's a 2080 Ti or a 1080.


----------



## phill (Jul 15, 2020)

Nordic said:


> Competitive gamers, competitive wannabes, and people who like smooth fps. That is who runs hardware like that.


I'd have thought they'd run something faster to maximise the FPS??  Like they always say with a car, there's no replacement for displacement....  I think it's pretty similar in PC land 

But then reviews are meant to show the biggest difference between the CPUs, so it's more of a wow than "oh well, at 4k there's like 5fps difference, meh"...  That's not going to get the reads/likes and such like  
We are all tweakers underneath, but sometimes the amount of tweaking just doesn't help so much with the performance you get out of it..


----------



## John Naylor (Jul 15, 2020)

You almost lost me starting with Gamers Nexus, as the only site I distrust more is jayztwocents.

The biggest problem with RAM testing has been:

1.  It seems that reviewers go in with a pre-conceived notion and then select games to test which prove their hypothesis

2.  It's also true that more effort is needed to include hardware that is not already bottlenecking the system.  Back in the day you'd see tests where fps didn't change much because the system was GPU bottlenecked.  But run that test again with twin cards in SLI / CF and bang, big difference.

3.  Another testing inadequacy was that most folks tested average fps... and not min fps.    In many games, avg fps changed little but the test with faster RAM got rid of the stutters because faster RAM brought up minimum fps dramatically.

4.  Most of the build requests we get are from gaming enthusiasts who also want the ability to do a decent job in video editing.  At any price point we've looked at, it's hard for me to make a case for AMD.  And of course, the faster RAM also helps in the video editing arena.

5.  Yes, Intel's overclocking headroom is something AMD can't match, but in TPU's tests the 10600K hit 80C, and that was at a 4.9 GHz OC, not 5.1 ... granted, gaming presents a far lower load than Blender, and most gaming boxes will have something a bit better than a Noctua NH-U12 (the Scythe FUMA is 5C cooler and the best AIOs about 10C cooler, but that's a cost that has to be considered).  Then again... the 3900X is only 1C cooler than the heavily OC'd 10600K.




Kanan said:


> Intel has academically relevant FPS advantage in games, as in, barely relevant, while having a inferior architecture with security problems and no PCI-E 4.0 support which may very well be relevant soon, coming gen graphics cards. Plus, Intels CPUs are awfully inefficient, especially when overclocked. And 170 bucks extra, may not be much money for you, but for the most people it is rather significant.



"Inferior architecture" is a non-relevant rebuttal to "your CPU is slower".  CPUs are tools; their elegance in design is not relevant to getting the job done.  You can make a hammer with a carbon fiber handle and space-age metals, but if it doesn't bang nails in any faster, the fancy-schmancy design has no advantage.  I have several hammers of various designs ... the newest one is 25 years old.  I have not used, seen or read about another hammer that is going to change how fast I can replace the shingles on my garage.

And exactly what effect have the security problems had for either CPU?  Who has been impacted?  Not only have I not seen articles explaining that x% of users have been affected and how it has impacted them, I have yet to see a single post where a single user has a sob story to tell.

And if all we are talking about is "enthusiast gaming," what does AMD have that can compete with the $160 10400F?



https://tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-1920-1080.png


$160 - 10400F = 100%
$390 - 3900X = 97.3%
$273 - 3700X = 97.0%
$218 - 3600x = 94.7%
$172 - 3600 = 93.8%
$256 - 3300x = 92.8%
$209 - 2700x = 89.7%
$290 - 10600k = 102.5%

Selecting a CPU is an exercise in picking the best tool for the job ... 

... it's not better because it has a smaller die size
... it's not better because it has more cores
... it's not better because it has fewer vulnerabilities to threats that don't actually exist in the real world
... it's not better because the design is more elegant
... it's only better when it performs better

Finally, let's look at the pricing argument ... specious as it is.

Compared to the 3600, the 10600k is only 9.3% (102.5 / 93.8) faster and it costs 68% (290 / 172) more; therefore the 3600 is the better buy?  OK, so I have bought the two CPUs and have them sitting on my desk ... neither one is faster, because they don't do anything without the rest of the componentry.  So we need to make this real ... Let's say we are building a $1,250 system with a 3600.  Is it worth it to switch to a 10600k just for gaming?

3600 System = $1250
10600k system = $1250 + ($290 - $172) = $1368, or 9.4% more cost.

I'd call that a mathematical wash.  And that's from a strictly fiscal perspective ... what does the user gain in the way of user experience for being just under 10% faster?
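That whole-system math can be sketched in a few lines, using the prices and relative-performance figures quoted above:

```python
# Sketch: compare CPUs by whole-system price vs relative gaming performance,
# not CPU price alone. Prices/perf figures are the ones quoted in this post.

def system_comparison(base_system, cpu_a, cpu_b):
    """cpu_* = (price, relative_perf). Returns (extra cost %, extra perf %) of B over A."""
    price_a, perf_a = cpu_a
    price_b, perf_b = cpu_b
    cost_a = base_system
    cost_b = base_system + (price_b - price_a)  # swap only the CPU
    extra_cost = 100 * (cost_b - cost_a) / cost_a
    extra_perf = 100 * (perf_b - perf_a) / perf_a
    return round(extra_cost, 1), round(extra_perf, 1)

r5_3600 = (172, 93.8)     # $ price, relative gaming perf (%)
i5_10600k = (290, 102.5)

extra_cost, extra_perf = system_comparison(1250, r5_3600, i5_10600k)
print(extra_cost, extra_perf)  # ~9.4% more system cost for ~9.3% more performance
```

The CPU-only comparison makes the 10600k look 68% more expensive; amortised over a $1,250 build, the premium and the performance gain are nearly the same percentage.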

Now of course, given the audience here, the user typically will be taking the additional step of OC'ing the system to the max and grabbing a better cooler than most, and they can pick up about 2% with the OC on the 10600k, 1% with the 3600 in gaming.  If they're doing video editing of their gaming sessions, then the gains are about 4% for the 10600k and 1% for the 3600.

If the OP's referenced testing delivers even half of what the article claims with faster RAM ... the case only gets stronger.

In summary, if you are building a box to play games, or to play games and edit video ... you should only be looking at CPU performance in those areas; Intel wins for this job.  If you are going to be doing rendering or software/game development, then you should choose AMD because it is the best tool for that job.  That's basically it.

Secondary considerations are:

Power consumption - I don't see any major differences here but it favors Intel:


https://tpucdn.com/review/intel-core-i5-10400f/images/power-gaming.png


Temperature -  I don't see any major differences here but it favors Intel:


https://tpucdn.com/review/intel-core-i5-10400f/images/cpu-temperature.png


If you are a price-conscious gamer, then you won't be looking at the 10600k; you'll be grabbing the 10400F, which is 6% faster than the 3600, and pairing it with a decent air cooler.  In addition, your GFX card budget is likely to be the system bottleneck anyway.

There is no best tool ... which tool you should grab for your toolbox kinda depends upon whether you want to take out a bolt, screw in a screw or bang in a nail.  It really doesn't matter what color it is, how elegant it looks or how it reacts to things it will never be exposed to.  Same for the CPU, which is also a tool.  There is no best CPU, just the best CPU for a particular job.

In an engineering or architectural office with 10 CAD operators, a typical workload might be 9 stations using AutoCAD to prepare construction drawings and 1 station doing rendering for presentation purposes.  I'd build them 9 boxes with Intel CPUs and Nvidia RTX cards because AutoCAD is primarily single threaded and the application runs faster on this hardware.  The rendering box would be AMD based with a Quadro card because those components do rendering better.  It's not a popularity or beauty contest; it's about getting the job done.  And while AMD has closed the gap, it's still a choice that's hard to justify against Intel's 10xxx lineup.


----------



## Nordic (Jul 15, 2020)

phill said:


> I'd have thought they'd run something faster to maximise the FPS??  Like they always say with a car, there's no replacement for displacement....  I think it's pretty similar in PC land
> 
> But then reviews are meant to show the biggest difference between the CPUs so it's more of a wave than oh well at 4k there's like 5fps difference, meh...  That's not going to get the reads/likes and such like
> We are all tweakers underneath but sometimes the amount of tweaking just doesn't help so much with the performance you get out of it..


The competitive gamers I know want/need super high fps and single threaded performance to consistently reach and exceed their refresh rate. They will run the best hardware they can afford and still do 720p lowest graphics settings.


----------



## xman2007 (Jul 15, 2020)

Nordic said:


> The competitive gamers I know want/need super high fps and single threaded performance to consistently reach and exceed their refresh rate. They will run the best hardware they can afford and still do 720p lowest graphics settings.


Which accounts for how much % of the general PC gaming population? 1%, maybe lower? Well, when the general population games at 720p with all low settings to achieve 250+ fps, and poor little Ryzen can only achieve 200 fps, then I think you have made your point and Intel is the dominant master PC race.

Meanwhile, I'm gaming happily on my Ryzen 1600AF, which performs as good if not better when overclocked than a 2600 (4.2GHz daily use), and it cost me a whopping £80. My RX 580 cost me similar, and my B-die Team Group RAM cost me about £70 for 16GB and chugs along happily at 3400MHz CL14 (it does do 3600, but I think my IMC is the limiting factor in stability at that speed, but w/e). I'm gutted I didn't spend an extra £120 on CPU, £100+ on GPU and another £50+ on RAM for the same performance, or 5% give or take.

Obviously the used market is a different beast (my RX 580 was used at £80, though my 1600AF and RAM were both new), but in my case there is no way, no how, I could achieve the same gaming performance that I do now having gone Intel and Nvidia, used or not, so I guess it's horses for courses?


----------



## TheoneandonlyMrK (Jul 15, 2020)

R0H1T said:


> Well you're wrong there, it isn't *cheap *for the vast majority of the world outside the US. Pretty sure B die RAM start at least 20% higher over here, similarly in Oz, Europe, much of Asia (except perhaps China) et al.


And is not in production anymore, it's getting rarer.


----------



## RealNeil (Jul 16, 2020)

lynx29 said:


> are you not curious at all what 3800 cas 15/16 ram can do with a ryzen 3600


Getting ready to build a Ryzen 3600 with 32GB of GEIL 4132MHz RAM (after a few days I'll drop in the 32GB of G.Skill 3600MHz and leave it).


----------



## phill (Jul 16, 2020)

Nordic said:


> The competitive gamers I know want/need super high fps and single threaded performance to consistently reach and exceed their refresh rate. They will run the best hardware they can afford and still do 720p lowest graphics settings.


Daft question from myself as I'm interested at this point...  If they care about just FPS and so on.  What sort of setups do they have?  Highly overclocked?  High end kit?  I'm sure as eggs it's not going to be something like a 3600/10400 or whatever is around at the price point with a 2080 Ti and C12 40000MHz RAM..... (The 40000 was intentional!! lol)


----------



## thesmokingman (Jul 16, 2020)

Nordic said:


> The competitive gamers I know want/need super high fps and single threaded performance to consistently reach and exceed their refresh rate. They will run the best hardware they can afford and still do 720p lowest graphics settings.



Yea, they're kinda stupid in that way. smh...


This whole farce reminds me of the old ricer craze from a few years back, which was a result of idiots wanting their cars to be like race cars. And nowadays you have idiots running real or fake anti-lag because some idiots saw it on the WRC.


----------



## Nordic (Jul 16, 2020)

xman2007 said:


> Which accounts for how much % of the general PC gaming population? 1% maybe lower?


That is irrelevant. I brought that group up when phill asked who used hardware like that. I run ryzen myself.



phill said:


> Daft question from myself as I'm interested at this point...  If they care about just FPS and so on.  What sort of setups do they have?  Highly overclocked?  High end kit?  I'm sure as eggs it's not going to be something like a 3600/10400 or whatever is around at the price point with a 2080 Ti and C12 40000MHz RAM..... (The 40000 was intentional!! lol)


It depends on how much budget they have. They usually run the fastest single threaded cpu they can afford with the fastest ram and gpu they can afford. That 1% performance matters to them. They care very much about their minimum frame rate.



thesmokingman said:


> Yea, they're kinda stupid in that way. smh...
> 
> 
> This whole farce reminds me of the old ricer craze from a few years back, which was a result of idiots wanting their cars to be like race cars. And nowadays you have idiots running real or fake anti-lag because some idiots saw it on the WRC.


The real competitive people don't care about whatever anti-lag software. They have a real use case for maximum single threaded performance.


----------



## Palladium (Jul 16, 2020)

Well, I don't know about the rest of you, but I've got better things to do than watch which YouTuber is waving a bigger meat rod at their ADHD audience.


----------



## phill (Jul 16, 2020)

Nordic said:


> That is irrelevant. I brought that group up when phill asked who used hardware like that. I run ryzen myself.
> 
> 
> It depends on how much budget they have. They usually run the fastest single threaded cpu they can afford with the fastest ram and gpu they can afford. That 1% performance matters to them. They care very much about their minimum frame rate.
> ...


And rather than de-rail the thread, I'll ask over in a PM if that's ok @Nordic ?


----------



## efikkan (Jul 16, 2020)

lynx29 said:


> Why is no one talking about this? It's absolutely insane the gains...
> Am I missing something? Why are the gains so insane???? Ryzen is history... I mean yeah the ram costs $200 versus $100 for the same capacity, but small price to pay for 40-60 fps gains across the board...
> gamersnexus review of high speed ram with intel i5-10600k beats ryzen 3700x by 60-70 fps in many games... and I can't believe no one is talking about that. its absolutely insane imo.


This is not really news; in some games at 1080p at medium settings with a high-end GPU, some Intel CPUs will pull ahead significantly, especially with tighter memory timings. But I would call this close to an artificial scenario: you will not game with an RTX 2080 Ti at these settings, so this is more interesting as an academic discussion than anything.

But I think most of you missed the most important detail in the benchmark results: the R5 3600 XT and its memory need to be overclocked to match what the i5-10600K (and i7-8700K) does at stock.

By the time you run this at a realistic configuration, like 1440p at high/ultra, there will no longer be a 60-70 FPS difference, and then the $100 extra in RAM is wasted, $100 you could have spent on a better GPU instead. I would generally advise against memory overclocking, especially if you think it will give you extra "value". Overclocking should be reserved for those doing it for fun; everyone else should run RAM at the fastest JEDEC speed supported by their CPU. It annoys me that so many these days seem to recommend overclocking by default. The countless crashes and stability issues from overclocked memory (especially as the memory controller degrades) are not worth it, you have to pay a lot extra for the memory, and you probably get <5% gains in most realistic gaming workloads. However old Skylake may be, the i5-10600K is an excellent gaming CPU capable of running games at stock memory speeds.


----------



## Mats (Jul 16, 2020)

lynx29 said:


> timestamped:
> 
> gamersnexus review of high speed ram with intel i5-10600k beats ryzen 3700x by 60-70 fps in many games... and I can't believe no one is talking about that. its absolutely insane imo.
> timestamped:


gobsmacked:

*x FPS higher* says nothing here, because it's not in relation to anything. Percentage is the way to go. Otherwise it just looks like clickbait.

gobsmacked:
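To illustrate the point about percentages, here's a minimal sketch (the fps numbers are hypothetical):

```python
# Sketch: why "+X fps" is meaningless without a baseline.
# The same absolute gain is a very different relative gain.

def pct_gain(base_fps, new_fps):
    # Percentage improvement of new_fps over base_fps
    return round(100 * (new_fps - base_fps) / base_fps, 1)

# A 40 fps gap means something different at different baselines:
print(pct_gain(60, 100))   # 66.7 -> a huge, very noticeable gain
print(pct_gain(240, 280))  # 16.7 -> far less dramatic in practice
```

Same "+40 fps" headline, wildly different real-world meaning, which is why reviews should lead with the percentage.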


----------



## Kanan (Jul 16, 2020)

efikkan said:


> This is not really news; in some games at 1080p at medium settings with a high-end GPU, some Intel CPUs will pull ahead significantly, especially with tighter memory timings. But I would call this close to an artificial scenario, you will not game with a RTX 2080 Ti at these settings, so this is more interesting for an academic discussion than anything.
> 
> But I think most of you missed the most important detail in the benchmark results;
> R5 3600 XT and memory needs to be overclocked to match what the i5-10600K (and i7-8700K) does at stock.
> ...


Ryzen is fine with up to 3600, and sometimes 3800, RAM speed without problems, but yes, a tight 3200 kit is more than enough if it has nice timings like CL14. In general I agree with you.


----------



## Mats (Jul 16, 2020)

Papahyooie said:


> For nearly twice the price, It'd better stomp lol.


They're comparing it to the 3600 XT, so no, the price difference isn't very big.


----------



## Kanan (Jul 16, 2020)

Mats said:


> They're comparing it to the 3600 XT, so no, the price difference isn't very big.
> View attachment 162352


But do not forget that the Ryzen can be paired with a cheap mainboard, while the Intel needs to be paired with an expensive Z mainboard for overclocking. And without overclocking I don't see a reason to buy Intel, honestly.


----------



## Mats (Jul 16, 2020)

Kanan said:


> But do not forget that the Ryzen can be paired with a cheap mainboard, while the Intel needs to be paired with an expensive Z mainboard for overclocking. And without overclocking I don't see a reason to buy Intel, honestly.


In this scenario it doesn't matter. To achieve these gains you need a 2080 Ti (AND run it at 1080p), and that's *€1000* just for the GPU. This is not for the budget-conscious buyer.

Besides, I'm not sure that such budget AM4 boards can overclock to 4.6 GHz.


----------



## Kanan (Jul 16, 2020)

Mats said:


> In this scenario it doesn't matter. To achieve these gains you need a 2080 TI (AND run it at 1080p), that's *€1000* just for the GPU. This is not for the budget conscious buyer.
> 
> Besides, I'm not sure that such budget AM4 boards can overclock to 4.6 GHz.


True. Well, it depends; not for long-term usage, at least. If the board has weak power stages, that kind of load will abuse it too hard. With a 4/6-core it could work for a time, though.


----------



## Papahyooie (Jul 16, 2020)

Mats said:


> They're comparing it to the 3600 XT, so no, the price difference isn't very big.
> View attachment 162352


I guess I missed the fact it was the XT; I thought it was referring to the 3600X. Good call.


----------



## efikkan (Jul 16, 2020)

Kanan said:


> Ryzen is fine with up to 3600 and sometimes 3800 Ram speed without problems, but yes, tight 3200 speed is more than enough if it has nice timings like CL14. In general I agree with you.


It depends on the user's definition of "fine". I'm pretty sure that most of you who have overclocked memory don't have 100% stability, even if it passes Prime95 and AIDA64. It might be _good enough_ for you, but it's probably not as stable as stock over time.

If the purpose of your machine is to have fun with overclocking, then go ahead and have fun; then it's a conscious decision.

If not, and especially if it's used for any kind of productive workload, then stay away from any overclocking and stick to JEDEC profiles for memory.
It's kind of funny that AMD's graphics department warns about using overclocked memory, yet their CPU department almost seems to encourage it. 
It annoys me when videos from Linus Tech Tips and the like encourage people building their first computer to overclock it for more "value". That's only going to give people some pain.


----------



## Mats (Jul 16, 2020)

Kanan said:


> With the 4/6 core it could work for a time though.


Maybe, although I wonder if 4.6 is possible at all.


----------



## Kanan (Jul 16, 2020)

efikkan said:


> It depends on the user's definition of "fine", I'm pretty sure that most of you who have overclocked memory don't have 100% stability, even if it passes Prime95 and AIDA64. It might be _good enough_ for you, but it's probably not as stable as stock over time.
> 
> If your purpose of the machine is to have fun with overclocking, then go ahead and have fun, then it's a conscious decision.
> 
> ...


I agree absolutely; for work or similar usage, I would not overclock. I have bought 3600 RAM, and the RAM itself is not overclocked. The thing with AMD is, yes, 3200 is official, but AMD said the controller will do up to 3600 or 3800 nicely in 1:1 mode. After that you most likely need to decouple it, which decreases performance too much. But 3600 is no problem with a 3700X. I even tried 3800, and it ran the memory stress test without problems.
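The 1:1 ("coupled") mode point can be sketched as follows. Note the 1900 MHz FCLK ceiling below is an assumption: a typical Zen 2 figure that varies per chip, not an official spec.

```python
# Sketch of Zen 2's coupled memory mode: the Infinity Fabric clock (FCLK)
# ideally matches the memory clock (MCLK), which is half the DDR transfer
# rate. Past roughly DDR4-3600/3800 the fabric usually can't keep up and
# the board falls back to 2:1 mode, adding latency.

FCLK_PRACTICAL_LIMIT_MHZ = 1900  # assumed typical Zen 2 ceiling; silicon lottery

def fclk_for_coupled(ddr_mt_per_s):
    # DDR transfers twice per clock, so MCLK (and 1:1 FCLK) = MT/s / 2
    return ddr_mt_per_s // 2

def runs_coupled(ddr_mt_per_s):
    return fclk_for_coupled(ddr_mt_per_s) <= FCLK_PRACTICAL_LIMIT_MHZ

print(fclk_for_coupled(3600), runs_coupled(3600))  # 1800 True
print(fclk_for_coupled(3800), runs_coupled(3800))  # 1900 True (on good chips)
print(fclk_for_coupled(4000), runs_coupled(4000))  # 2000 False -> 2:1 mode
```

This is why DDR4-3600 is the usual sweet spot for Ryzen 3000: faster kits push FCLK past what most chips can hold in 1:1.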



Mats said:


> Maybe, although I wonder if 4.6 is possible at all.


I don't know; I assumed it because I thought you saw it in a review. Most likely it is not, in my experience. 4.4 maybe; everything beyond that I would consider lucky, or a small miracle, and that's for the newest CPUs. For the older ones, even 4.4 is a miracle.


----------



## Mats (Jul 16, 2020)

Kanan said:


> I don't know, I considered it because I thought you saw it in a review. Most likely it is not, in my experience. 4.4 maybe, everything after that, I would consider lucky or a wonder. For the newest CPUs, for the older ones? Even 4.4 is a wonder then.


Yeah, I picked the clock speed from the OP's link. 
I'm not talking about what the CPU can do, just the board. If you have a 3600 XT that does 4.6 GHz in an X570 board, I wouldn't expect it to automatically do the same in a B450 board, even though it's not impossible.


----------



## Kanan (Jul 16, 2020)

Mats said:


> Yeah, I picked the clock speed from the OP's link.
> I'm not talking about what the CPU can do, just the board. If you have a 3600 XT that does 4.6 GHZ in an X570 board, I wouldn't expect it to automatically do the same in a B450 board, even though it's not impossible.


A €50 B450? Yes, for a few days maybe. Oh, it also depends a lot on the usage type: if it's just the usual games, and not 64-player Battlefield V, it would probably work out for a good while. Definitely not for 24/7 use or coding.


----------



## Mats (Jul 16, 2020)

Kanan said:


> 50 € B450? Yes for a few days maybe


How do you know that it will work at all? Those boards were made when 4.6 GHz was a pipe dream, more or less.


----------



## Kanan (Jul 16, 2020)

Mats said:


> How do you know that it will work at all? Those boards were made when 4.6 GHz was a pipe dream, more or less.


Power stages: if they are weaker, it just means the board has lower limits and less durability. It doesn't mean the achievable overclocks are lower, just that stability is lower.

I have learned much of this from @buildzoid on his channel AHOC. He knows a lot.


----------



## lexluthermiester (Jul 16, 2020)

lynx29 said:


> Ryzen is history...


ROFLMFBO!! Funniest damned thing I've read all year! Full stop!


----------



## Mats (Jul 16, 2020)

Kanan said:


> Power stages, if they are weaker, it just means they have lower limits and less durability. It doesn't mean the overclocks are lower. Just the stability is low or lower.


Yeah, I was going for stable clocks, of course.


----------



## Kanan (Jul 16, 2020)

Mats said:


> Yeah, I was going for stable clocks, of course.


If you're an overclocker, it's better to overspend than to underspend.


----------



## dirtyferret (Jul 16, 2020)

lynx29 said:


> Why is no one talking about this? It's absolutely insane the gains...
> 
> @R-T-B
> 
> ...


This review changes EVERYTHING!


----------



## Mats (Jul 16, 2020)

This just in: Faster stuff runs faster!!!


----------



## Nordic (Jul 16, 2020)

phill said:


> And rather than de-rail the thread, I'll ask over in a PM if that's ok @Nordic ?


You can.


----------



## Mats (Jul 16, 2020)

Kanan said:


> If you're a overclocker it's better to overspend than to underspend.


Yup, but also:


Kanan said:


> the Ryzen can be paired with a cheap mainboard


 
I never suggested going for a budget board in this situation..


----------



## Kanan (Jul 16, 2020)

Mats said:


> Yup, but also:
> 
> 
> I never suggested going for a budget board in this situation..


You compared prices. And Intel K CPUs are not paired with cheap boards; therefore your conclusion was flawed, IMO.


----------



## Mats (Jul 16, 2020)

Kanan said:


> You compared prices.


I replied to someone who compared prices, and then you compared the prices of the boards. What's your point?


Kanan said:


> And Intel K CPUs are not paired with cheap boards, therefore your conclusion was flawed.


What conclusion?


----------



## Kanan (Jul 16, 2020)

Mats said:


> I replied to someone who compared prices, and then you compared prices of the boards. What's your point.
> 
> What conclusion?


Now you suddenly forgot what everything is about and I have to do your job, despite you trying to get on MY nerve? No thanks, I'll pass. Fuck this.


----------



## Mats (Jul 16, 2020)

Honestly, that wasn't my intention. For me it started with post #54, someone else compared prices.

That wasn't meant to get on someone's nerves.
Maybe I should have used a regular smiley.


Have a good day.


----------



## Mats (Jul 16, 2020)

Anyway, the OP's interpretation of GN's review is a bit hard to understand. If there's a difference in FPS when comparing stock vs stock, it isn't far-fetched to expect something similar when OC'd.
I'm not surprised that the 10600K scales better than the 3600 XT.


----------



## Kursah (Jul 16, 2020)

Let's keep it on topic folks, no more bickering unless you want to earn points. We have forum guidelines for a reason, please review them (link in sig if you need it) and then let's get back to this topic constructively.


----------

