# 3700X vs 9900K, that is the question...



## trparky (Jul 23, 2019)

So I'm sitting here thinking about building a new system. I already have an NVIDIA GTX 1060, so that saves me some cash on the build.

I have two systems with parts chosen...

*Config #1:* Intel 9900K, Gigabyte Z390 Aorus Ultra, and a Corsair Hydro H115i Platinum
*Config #2:* AMD 3700X and a Gigabyte X570 Gaming X (this config would use the AMD Prism cooler)

Both systems would have 16 GB of RAM and a Samsung 970 EVO 500 GB SSD.

Of course, if I go by price, I could save myself almost $400 by going with the AMD config, but as with everything in life, it's really not that easy.

Here's some background... I play Diablo 3 and StarCraft 2 casually; I game casually and I really don't care about extremely high frames per second. I may get back into World of Warcraft at some point, but that's a maybe. Hell, I still have two 1080p-class monitors locked at 60Hz. The most I would probably get is a 75Hz FreeSync monitor that's, again, a 1080p-class monitor. Again, I don't care about extremely high refresh rates, since I'm not into the whole competitive gaming scene. I also like to spin up Hyper-V virtual machines for testing stuff in... just because. Right now I have a Hyper-V VM that I test Windows 10 Fast ring builds in, just to see what Microsoft is up to.

Yes, I have posted in other threads about how Intel is best for gaming *IF* you are into highly competitive gaming and you absolutely *must* have the highest FPS. But again, I'm not that kind of person.

So with all of that said, the question of course is... Do I need an Intel build for my needs or should I save some cash (nearly $400) and go with the AMD build?


----------



## cucker tarlson (Jul 23, 2019)

why change the 8700K ?

oh a brand new one.
then a 3700x + x570 or a 3600 on b450 really.


----------



## trparky (Jul 23, 2019)

cucker tarlson said:


> why change the 8700K ?


Giving it to my father.


----------



## cucker tarlson (Jul 23, 2019)

got it.updated.

see this thread: *Ryzen 2600X build, Need case/psu suggestions* (www.techpowerup.com)


----------



## HD64G (Jul 23, 2019)

If I were in your place I wouldn't think twice and would buy a Ryzen 3000 CPU at once. You will have the ability to upgrade both to the upcoming 16C/32T part in a few months and to the next series in 2020, which will be even better, while using the same board and memory. And with the 3900X you will already have a powerhouse in your PC that consumes very little. Just take care of its cooling to allow it to boost as high as possible.


----------



## cucker tarlson (Jul 23, 2019)

HD64G said:


> If I was in your place I wouldn't think at all and would buy a Ryzen 3000 CPU at once. You will have the ability to upgrade to both the upcoming 16C/32T in a few months and both to the next series in 2020 that will be even better while using the same board and memory. And with the 3900X you will already have a powerhouse in your PC that consumes very little. Just take care of its cooling to allow it to boost as high as possible.


dafuq?
did you read what he said ?

he's asking about the 3700x,not the 3900x.why would he get a 3900x for 60 fps gaming ? and then upgrade to 16 cores ?
people and their drugs.all he needs is a 3600,or a 3700x if he really wants to pay premium.


----------



## Vario (Jul 23, 2019)

It seems to me you artificially increased the price between the two setups to make that $400 gap.
You could buy a much cheaper Z390 board that is $100 less and have the same experience.  Similarly, you picked an expensive CPU heatsink, $140.  You will likely be disappointed with the lackluster Prism cooler, so you may as well apply that overpriced Corsair heatsink to both builds; then the Intel setup is actually ~$160 greater.
You could also pick a 9700 instead of a 9900K and then the cost is about the same.


----------



## cucker tarlson (Jul 23, 2019)

Vario said:


> It seems to me you artificially increased the price between the two setups to make that $400 gap.
> You could buy a much cheaper Z390 board that is $100 less and have the same experience.  Similarly you picked an expensive CPU heatsink.  You would be disappointed with the lackluster Prism cooler, so you may as well apply the expensive heatsink to both of them.
> You could also pick a 9700 instead of a 9900K.


he's comparing the wraith prism to a 115i plat 
the wraith prism maybe matches $30 aftermarket coolers,and that's hopeful.meanwhile,amd has no igpu,and once you're between gpus or one dies,your pc is unusable.but that's not added value for the red team apparently.


----------



## Vario (Jul 23, 2019)

cucker tarlson said:


> he's comparing wraith prism to 115i plat


Yeah, thanks to marketing, people somehow believe the Prism cooler is sufficient when the AMD setup runs hot as hell with it and real-world AMD performance depends so much on thermals.  I am also not sure why a top-of-the-line Intel with a top-of-the-line motherboard and a top-of-the-line AIO cooler is being compared to a midrange AMD with a midrange motherboard and a low-end heatsink.  No wonder the price differential needlessly balloons to $400.


----------



## trparky (Jul 23, 2019)

Alright, no arguments here. The plain question is... Can I get away with the 3700X based upon my needs?

Well for one, the 9900K is a power-hungry beast of a chip (hence a motherboard with a very good VRM array) and it needs serious cooling to keep temperatures in check (hence a 280mm radiator).


----------



## cucker tarlson (Jul 23, 2019)

trparky said:


> Alright, no arguments here. The plain question is... Can I get away with the 3700X based upon my needs?
> 
> Well for one, the 9900K is a power-hungry beast of a chip (thus a motherboard with a very good VRM array) and it needs serious cooling to keep it cool (thus a 280mm radiator cooler).


absolutely.


----------



## Vario (Jul 23, 2019)

trparky said:


> Alright, no arguments here. The plain question is... Can I get away with the 3700X based upon my needs?


If you game, and all you do is game, why do you need more than an 8700K?  Are you gaming while also running the VM at the same time?

Does your father truly need an 8700K?


----------



## trparky (Jul 23, 2019)

Vario said:


> If you game and all you do is game why do you need more than a 8700K?  Are you gaming while also running the VM at the same time?


Again, I'm giving the 8700K to my father.



Vario said:


> Does your father truly need an 8700K?


He's got an old 3570K right now and it's showing its age.


----------



## cucker tarlson (Jul 23, 2019)

Vario said:


> Yeah, somehow thanks to marketing, people believe the Prism cooler is somehow sufficient when the AMD setup runs hot as hell with it and the real world AMD performance depends so much on thermals.  I am also not sure why the top of the line Intel with a top of the line motherboard and a top of the line AIO heatsink is being compared to a midrange AMD with a midrange motherboard and a low end heatsink.  No wonder somehow the price differential needlessly becomes $400.


that wraith cooler is pretty shit actually 

*Matisse (Ryzen 3000) overclocking/undervolting* (www.techpowerup.com)


----------



## Vario (Jul 23, 2019)

cucker tarlson said:


> that wraith cooler is pretty shit actually
> 
> 
> 
> ...


So he may as well apply an extra $140 to the cost of the AMD, so the price differential is now $260.  Knock the Z390 board down to a midrange and now it's $160.


----------



## HD64G (Jul 23, 2019)

The 3700X will fully cover your PC needs for years, until a more powerful CPU is needed. With its 65W TDP the box cooler is sufficient, unless you live in a very hot region, and even then a $25 cooler will be more than enough.


----------



## kapone32 (Jul 23, 2019)

Vario said:


> So he may as well apply an extra $140 to the cost of the AMD so the price differential is now $260.  Knock the Z390 board to a midrange and now its $160.



Why? A $30 aftermarket cooler will keep the 3700x good.


----------



## trparky (Jul 23, 2019)

Vario said:


> It seems to me you artificially increased the price between the two setups to make that $400 gap.


Hell, the motherboard is $120.


Vario said:


> Similarly you picked an expensive CPU heatsink, $140.


You practically need a 280mm radiator to keep the 9900K from cooking itself to death.


----------



## cucker tarlson (Jul 23, 2019)

trparky said:


> Hell, the motherboard is $120.
> 
> You practically need a 280mm radiator to keep the 9900K from cooking itself to death.


but 80 degrees in gaming on wraith is fine


----------



## Vario (Jul 23, 2019)

trparky said:


> Hell, the motherboard is $120.
> 
> You practically need a 280mm radiator to keep the 9900K from cooking itself to death.


The funny thing is the 8700K will perform the same as the 9900K for your uses, so if it were me I'd buy my dad a 3770K, sell his 3570K, and get a new video card for my 8700K build.  Both of the builds in your OP are sidegrades from your current setup.


----------



## Papahyooie (Jul 23, 2019)

Get the 3700x and a Hyper 212 evo cooler. Call it a day. Either of those CPUs will slay 60 fps gaming without even a yawn. No need to spend extra money for that. And the 3700x would be great for your VM use case. Hell, save some more money by going with a decent x470 or even B350 motherboard, unless you need PCI-e 4 for some reason. Benchmarks show negligible difference in performance by going with the X570 boards. So there's no need, as long as you get an older motherboard with decent VRM.

(That being said, as above, it's exactly a side-grade.)


----------



## xtreemchaos (Jul 23, 2019)

go for the Zen, the 9900k is a nice chip but runs hot at 5GHz, and the savings from the Ryzen are a happy place.


----------



## cucker tarlson (Jul 23, 2019)

xtreemchaos said:


> go for the Zen, the 9900k is a nice chip but runs hot at 5ghz and the savings from the ryzen is a happy place.


sums it up pretty much.


----------



## Papahyooie (Jul 23, 2019)

To be clear, it's a side-grade from your 8700k for gaming. The extra 2c/4t on the 3700x can help out in a multiple VM scenario, for sure. And a win on power consumption, if you care about that sort of thing.


----------



## trparky (Jul 23, 2019)

Yes, I am fully aware that it's a side-grade here but I wouldn't want to give my father a piece of junk either. I mean it's my father for God's sake, it's called being a good son.

As for staying with a third-generation chip: since there are no firmware or microcode updates for Spectre and Meltdown, the chip stays vulnerable.


----------



## Lionheart (Jul 23, 2019)

I would suggest getting a decent B450 motherboard with Q-Flash and a Ryzen 5 3600 instead of the 3700X, but hey, if you want the 8 core, grab it. IMHO the 3600 is plenty, especially paired with a GTX 1060. There's not much point going all out on the best gaming CPU (9900K) unless you're going to get a high-end GPU as well, and it seems to me you're quite content with your 1060 for now, plus you don't really care for high refresh rates. As for CPU cooling, just stick with the AMD box cooler, or look for a decent aftermarket one for around $30 max from Cooler Master; it should be fine.


----------



## Papahyooie (Jul 23, 2019)

trparky said:


> Yes, I am fully aware that it's a side-grade here but I wouldn't want to give my father a piece of junk either. I mean it's my father for God's sake, it's called being a good son.
> 
> As for staying with a third-generation chip, because there are no firmware upgrades for Spectre and Meltdown the chip is vulnerable.


Fair enough. 

When you say third generation chip, are you talking about the x570 motherboard?


----------



## trparky (Jul 23, 2019)

Papahyooie said:


> When you say third generation chip, are you talking about the x570 motherboard?


No, Intel 3rd generation. Someone suggested I get a 3770K for him.


----------



## Papahyooie (Jul 23, 2019)

trparky said:


> No, Intel 3rd generation. Someone mentioned I get a 3770K for him.


Ah ok. Gotcha.


----------



## trparky (Jul 23, 2019)

OK, so at least I know that a 3700X will be good enough for my needs; that's good. I haven't decided on the motherboard yet, so there are some more decisions there. Yes, I could get the 3600X instead of the 3700X, but I do like the idea of the two extra cores.

I am limited by the choices of motherboards at Microcenter.


----------



## oxrufiioxo (Jul 23, 2019)

You're better off upgrading your dad to a 3600 and keeping your 8700k.

The 8700k is better than any ryzen chip at gaming.


----------



## dirtyferret (Jul 23, 2019)

trparky said:


> Giving it to my father.



Give me the 8700k


----------



## Vario (Jul 23, 2019)

trparky said:


> OK, so at least I know that a 3700X will be good enough for my needs; that's good. I haven't yet decided on the motherboard yet so there's some more decisions there. Yes, I could get the 3600X instead of the 3700X but I do like the idea of the two extra cores.
> 
> I am limited by the choices of motherboards at Microcenter.


Microcenter has some bad board options a lot of the time; you might want to forgo the $30 combo savings and get a board elsewhere, since sometimes the rebates or coupon codes make up for it.  If you do buy from Microcenter, there's a $5 coupon out there you can print out and bring to the store.


----------



## trparky (Jul 23, 2019)

You see the thing is, I live like half an hour away (in good traffic) from a Microcenter.


----------



## John Naylor (Jul 23, 2019)

I can't see a reason to make a change ... especially with a 1060.   The change in gaming experience will not be in any way noticeable.

9900k is 1.9 % faster ... big whoop ... https://tpucdn.com/review/intel-core-i9-9900k/images/relative-performance-games-1920-1080.png
3700X is 4% slower .... going backwards ... https://tpucdn.com/review/amd-ryzen-7-3700x/images/relative-performance-games-1920-1080.png
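To put those percentages in frame terms, here is a back-of-the-envelope sketch; the 60 fps baseline is just an assumed round number for illustration, not a benchmark result:

```python
# Translate the review's relative-performance percentages into frames,
# assuming an illustrative 60 fps baseline (not a measured figure).
baseline_fps = 60.0

fps_9900k = baseline_fps * 1.019  # ~1.9% faster than the baseline
fps_3700x = baseline_fps * 0.96   # ~4% slower than the baseline

print(f"9900K: {fps_9900k:.1f} fps")  # roughly one frame gained
print(f"3700X: {fps_3700x:.1f} fps")  # roughly two frames lost
```

Roughly one frame gained or two lost, which is the point: on a 60 Hz monitor neither difference is perceptible.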

Never quite understood the urge to upgrade a system where the end result is going slower or getting an increase that can't be perceived.

If you want to upgrade your Gaming Experience ... I would do the following:

1.  Increase your FPS by a factor of 3 with an AIB 2080 Super or AIB 2080 on the same monitor.  That will cost ya $640 - $740.

2.  Get the best gaming experience available at present w/ a 165 Hz IPS 10-bit AU Optronics panel, which are now available for $550 ... A decent AIB 1060 delivers 60 fps in Witcher 3.  One of these will deliver 120 fps at 1440p in sharp 10-bit color and no ghosting.






*Acer XB271HU bmiprz 27.0" 2560x1440 165 Hz Monitor* (pcpartpicker.com)




Either should cost the same as or less than a new system that brings nothing to the table or loses ground.

Take the 1060 and put it in Dad's system with one of your old monitors... if it's a hot CPU, a $46 Scythe Fuma will outperform most 240/280mm AIOs.

I'd hold off building anything fresh outta the gate till first-stepping products have cleared the shelves... no sense dealing with unstable BIOSes, broken features, and lower clocks... all this will improve in later steppings.


----------



## trparky (Jul 23, 2019)

But what do I do about my father's current system? It has a 3570K that's quite old; it was once overclocked to 4.4 GHz but recently became unstable, so I had to revert it to stock clocks. Intel doesn't support it anymore, so no new firmware or microcode updates, which means it's vulnerable to Spectre and Meltdown, and any of the fixes for them results in less performance. On top of that, much of what he does on it is choking the 3570K. He's got Excel spreadsheets that take forever to load on the 3570K yet load in no time at all on my 8700K.

So again... what do I do about my father's current system? I want to do an upgrade for him.


----------



## cucker tarlson (Jul 23, 2019)

trparky said:


> But what do I do about my father's current system? It has a 3570K that's quite old that was once overclocked to 4.4 GHz but recently became unstable so I had to revert it to stock clocks. Intel doesn't support it anymore so no new firmware or microcode updates and that means that it's vulnerable to Spectre and Meltdown and any of the fixes for them results in less performance. And not only that but much of what he does on it is choking the 3570K. He's got Excel spreadsheets that take forever to load on the 3570K yet they load in no time at all on my 8700K.
> 
> So again... what do I do about my father's current system? I want to do an upgrade for him.


3600


----------



## Joss (Jul 23, 2019)

oxrufiioxo said:


> You're better off upgrading your dad to a 3600 and keeping your 8700k.


This.


----------



## HD64G (Jul 23, 2019)

Get a Ryzen with a good B450 and call it a day. Don't overthink it. It is the best option for both the present and the future: less power consumption, better security against vulnerabilities, and upgradeability to 16C/32T later on.


----------



## Papahyooie (Jul 23, 2019)

You guys saying keep the 8700k aren't reading what he's saying, though... his use case is 60 Hz gaming and VMs. The 3700x is going to be objectively better than the 8700k at running multiple VMs. It would make no sense to keep the 8700k and get his dad a 3600 (unless the goal is simply to save even more money, but that won't get you any increase in VM performance).

If the goal is just to get your dad upgraded, get the cheapest 1600 and motherboard and be done with it. But he's already expressed that he wants his dad to have a nice machine. The 3700x will absolutely benefit him in VMs, which sound like his only really strenuous activity (because 60 Hz gaming will not benefit from either option, or ANY of the options that have been thrown out here... both these processors can do 60 Hz gaming at basically idle clocks lol). If he wanted to do high-refresh gaming, sure, keep the 8700k. But that's not what he said. His heavy usage is VMs, and the 3700x will blow the 8700k away with 2c/4t more.


----------



## trparky (Jul 23, 2019)

Alright, maybe you guys are right. I'll keep my 8700K and just do a cheap upgrade for him with a Ryzen 5 2600 along with a cheap B450 board.

I’m looking at the Gigabyte B450M DS3H. I need compatible memory, preferably DDR4-3000 CAS16, and I’ll do a cheap upgrade for him.


----------



## Vario (Jul 23, 2019)

Papahyooie said:


> You guys saying keep the 8700k aren't reading what he's saying though... his use case is 60 hz gaming and VMs. The 3700x is going to be objectively better than the 8700k running multiple VMs. It would make no sense to keep the 8700k and get his Dad a 3600 (unless the goal is simply to save even more money, but that won't get you any increase in VM)
> 
> If the goal is to get your Dad upgraded, get the cheapest 1600 and motherboard, and be done with it. But he's already expressed that he wants his Dad to have a nice machine. The 3700x will absolutely benefit him in VMs, which sounds like his only really strenuous activity (because 60hz gaming will not benefit from either option, or ANY of the options that have been thrown out here... both these processors can do 60hz gaming at basically idle clocks lol) If he wanted to do high hz gaming, sure, keep the 8700k. But that's not what he said. His heavy usage is VMs, and the 3700x will blow the 8700k away with 2c/4t more.


In my opinion, the occasional once-a-month VM usage just to test a Win 10 install isn't worth spending a ton of cash. It doesn't sound like "heavy usage".





> I also like to spin up Hyper-V virtual machines for testing stuff in... just because. Right now I have a Hyper-V VM that I test Windows 10 Fast Builds in just to see what Microsoft is up to.


----------



## trparky (Jul 23, 2019)

Unless of course you guys can recommend a better B450 board.


----------



## Komshija (Jul 23, 2019)

Well, my dad has an i5 3470 and he is satisfied with it. After all, he's not gaming or doing video editing, so he doesn't need anything faster. 
For the internet, internet banking/shopping, and general MS Office tasks, an i5 3570K will be more than enough. Ahh... is your father gaming? 

If you absolutely must replace the i7 8700K (which is excellent for gaming and will be so for at least the next 3-4 years), I would recommend a Ryzen-based system.


----------



## trparky (Jul 23, 2019)

He plays with Excel spreadsheets that seem to bring his 3570K to its knees. Not only that, but because it was once overclocked and now isn't due to BSODs, I question the long-term stability of the system.


----------



## oxrufiioxo (Jul 23, 2019)

trparky said:


> Unless of course you guys can recommend a better B450 board.











*ASRock B450M Pro4 AMD AM4 microATX Motherboard* (www.microcenter.com)


----------



## Vario (Jul 23, 2019)

Why not do a 3600 for him instead of the 2600? It's only $60 more.


----------



## trparky (Jul 23, 2019)

Vario said:


> Why not do a 3600 for him instead of the 2600, its only $60 more.


Because I can get a B450 motherboard *and a* Ryzen 5 2600 for $160 plus tax. That’s a steal if I ever saw one!

Now all I need is compatible memory and I can start planning his system upgrade.

*Edit:* Damn autocorrect.


----------



## eidairaman1 (Jul 23, 2019)

Does your dad play games?


----------



## trparky (Jul 24, 2019)

No, it would be a productivity machine only.


----------



## eidairaman1 (Jul 24, 2019)

trparky said:


> No, it would be a productivity machine only.



Research APUs; you might be able to skip the GPU since it is productivity only.


----------



## trparky (Jul 24, 2019)

eidairaman1 said:


> Research APU, might be able to skip the gpu since it is productivity only


I already have a graphics card that I can reuse.


----------



## eidairaman1 (Jul 24, 2019)

trparky said:


> I already have a graphics card that I can reuse.



I guess so since the 1060 aint worth anything now...


----------



## trparky (Jul 24, 2019)

Oh, he has an R9 380 in his system; I’ll reuse that.


----------



## oxrufiioxo (Jul 24, 2019)

If you can squeeze this in your budget I would.









*G.SKILL Flare X 16GB (2 x 8GB) DDR4-3200 CL14, Model F4-3200C14D-16GFX* (www.newegg.com)





If not, any 3200 CL16 kit will be fine; just stay away from Corsair LPX, I've had nothing but issues with it on multiple Ryzen systems.


----------



## swirl09 (Jul 24, 2019)

Is there a reason you are not considering the 9700K? It's within 1% of the 9900K, and considering there is only a small clock difference, which you can make disappear without effort...

You also get to skim over the monthly articles about HT exploits. Sold me anyway lol


----------



## cucker tarlson (Jul 24, 2019)

by the way,are you sure it's the cpu that's slowing down excel?


----------



## ratirt (Jul 24, 2019)

Greetings.
I read all the posts and I understand our troubled friend's position on switching to a new PC. Normally I'd say lovely days (when you get to buy new stuff), but for some reason I see a troubled man.
With the games you play, you are good with any CPU. I moved to a 2700x at the end of last year and you'd be surprised what it has to offer. I play games too, but casually like you, and with the card you've got (a 1060), 1080p is your goal considering your monitor's capabilities.
Even though I bought a 2700x, I would not recommend buying one. Not that it's a bad chip, on the contrary, but new stuff is out. Not sure what your budget is, but I assume you have a few hundred for this.
I've seen people saying 9700K, 9900K, etc.
I wouldn't go with one of those because you've already experienced the 8700K, plus the price for these (gaming chips) knocks my socks off.
The 3700x is a nice choice for you, but what about the 3700 without the X? You save a few bucks and you can go for an X470 board. From my standpoint, Ryzens are the better choice for you. Because beyond gaming (and the claims that Intel is better, and it is, but is it worth it considering what you play? And how much better is it?), what is there? Casual games and some work VMs somebody mentioned? A 1060 and a 60Hz monitor playing 1080p? How can you lose by purchasing a Ryzen? I simply can't see it.
My recommendation is a Ryzen 3700 or 3700x with an X470 board. Besides, it would be a good experience for you to switch to something different and see how it goes. You will not be disappointed, I can vouch for that. More cores will be useful within a year's time or even less. I can see the i5 needing to be pushed to 5GHz to keep up with FPS minimums. Don't go that way, bro.
Also, you'd better hurry with the purchase, since RAM is quite cheap now, but from what I've read this may change this year due to upcoming shortages.


----------



## cucker tarlson (Jul 24, 2019)

there is no 3700 non-x


----------



## ratirt (Jul 24, 2019)

cucker tarlson said:


> there is no 3700 non-x


You are right. Been thinking about the 3600 somebody recommended.


----------



## crazyeyesreaper (Jul 24, 2019)

3700X, grab a B450 board, save a ton of money, grab a decent midrange cooler, and be done with it.

If he doesn't need the latest and greatest, then as others have said a 2600 on a B450 will suffice.

Considering AMD's chips with XFR/PBO pretty much push themselves as far as they can go anyway, you don't need an extreme VRM setup with either option. If you slot in a downdraft-style cooler, something like the be quiet! Dark Rock TF, it will cool the VRM as well while providing adequate CPU cooling. But again, it depends on the usage scenario. Going with a 2600 + B450 you can easily use a generic cooler and be fine.


----------



## biffzinker (Jul 24, 2019)

A Core i5-9400F with a B360 motherboard for your father, and keep the 8700K?


----------



## cucker tarlson (Jul 24, 2019)

biffzinker said:


> A Core i5-9400F with a B360 motherboard for your father, and keep the 8700K?


yup,most office type work will prioritize ipc.9400f single core boost is 4.1ghz so it matches ryzen 3000 at 4.1ghz
I'd still get the 3600 if money's no object,though this 9400f looks like a great bang for the buck option for home office work.


----------



## ratirt (Jul 24, 2019)

cucker tarlson said:


> yup,most office type work will prioritize ipc.9400f single core boost is 4.1ghz so it matches ryzen 3000 at 4.1ghz
> I'd still get the 3600 if money's no object,though this 9400f looks like a great bang for the buck option for home office work.


As a matter of fact, Ryzen's IPC is higher. That of course depends on the software you are using. I honestly wouldn't buy an i5; I'd go for more cores and threads, since more and more of them are going to be utilized for sure.


----------



## cucker tarlson (Jul 24, 2019)

ratirt said:


> As a matter of fact Ryzens IPC is higher. That of course depends on the software you are using. I honestly wouldn't buy I5. I'd go for more cores and threads since more and more are going to be utilized for sure.


hard to measure,but you yourself said it depends on the software.
gaming,office suite,photoshop,premiere - look at all those parts of the review.the 9400f definitely has more performance per clock than the ryzen 3600 there.
of course ryzen wins in other stuff but that's mostly due to SMT and the fact we've got rendering and deep learning performance thrown into the average score of a low-end cpu like the 9400F.


----------



## Melvis (Jul 24, 2019)

trparky said:


> So I'm sitting here thinking about building a new system, I already have an nVidia GTX 1060 so that saves me some cash in building the system.
> 
> I have two systems with parts chosen...
> 
> ...



I actually think you answered your own question there: save the money, go AMD.



trparky said:


> Giving it to my father.



trparky  I AM your father!!


----------



## ratirt (Jul 24, 2019)

cucker tarlson said:


> hard to measure,but you yourself said it depends on the software.
> gaming,office suite,photoshop,premiere - look at all those parts of the review.the 9400f definitely has more performance per clock than the ryzen 3600 there.
> of course ryzen wins in other stuff but that's mostly due to SMT and the fact we've got rendering and deep learning performance thrown into the average score of a low-end cpu like the 9400F.


Well, I've seen measurements with different software, Cinebench 15 and 20, on different sites, and they show different things. The CPU-Z bench also shows single-core performance differently. One bench drew my attention where clocks were locked at 3.5 for both platforms and Ryzen was ahead of Intel by quite a margin. 
I keep saying this: Intel's been king for so long. Software has been developed under Intel's banner, which means it will work better on Intel, and that is how I see it. One piece of software shows something totally different than another? That is strange when they are measuring exactly the same thing. Anyway, I think the situation is changing and AMD is catching up, even though the company doesn't have the software-support advantage Intel does. Some of you may disagree, but that's how I see it. Otherwise there is no way to explain the different behavior of benchmarks which are supposedly measuring the same thing. 
The other aspect is: more cores are going to be needed. You can see it now, and this supports my point that Intel's been in the lead and the software companies were clinging to the master's will, using the resources they were given. Now it is different. Intel can't push clocks much higher (shrinks and such won't help; if anything they'll make clocks lower, as the node manufacturers have clearly stated). All the companies know this, and they will look for other ways to improve. That means better core utilization if they want to stay competitive. New rules, I guess, caused not by Intel or AMD but by simple physics and the limits we hit on the way to improvement.
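The locked-clock comparison above amounts to score-per-clock normalization; a quick sketch, where the scores are invented placeholders rather than numbers from any real review:

```python
# Normalize a single-thread benchmark score by the locked clock speed to
# get a per-clock (IPC-proxy) figure. Scores below are made-up placeholders.
def score_per_ghz(score: float, clock_ghz: float) -> float:
    return score / clock_ghz

clock = 3.5  # both CPUs locked to 3.5 GHz, as in the test described above
ryzen = score_per_ghz(190.0, clock)   # hypothetical Ryzen score
intel = score_per_ghz(180.0, clock)   # hypothetical Intel score

print(f"Ryzen: {ryzen:.1f} points/GHz, Intel: {intel:.1f} points/GHz")
```

Same division either way; which side comes out ahead depends entirely on which benchmark supplies the raw score, which is exactly why different suites disagree.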


----------



## biffzinker (Jul 24, 2019)

ratirt said:


> One bench draw my attention when clocks were locked at 3.5 for both platforms and Ryzen was ahead by quite a margin from Intel.


This one?












*AMD Ryzen 5 3600 review* (www.guru3d.com)


----------



## ratirt (Jul 24, 2019)

biffzinker said:


> This one?
> View attachment 127641
> 
> 
> ...


I think that's the one.

and you have this one (image attachment)

I assume in that one the 9900k is at stock clocks?

Someone suggested the 9400F because of IPC and gaming? Well sure, why not, if it's enough, but one game shows different things than another. So what would be the reason for this? Is it really IPC, or something else?
I'd bet that if you took a game where Intel is normally ahead of AMD, locked the clocks on both platforms, and benched the two, Intel would still win despite the locked clocks. The graphs above show it shouldn't, and yet it does. I haven't actually seen this tested; I wonder if that's how it would turn out.


----------



## EarthDog (Jul 24, 2019)

trparky said:


> So I'm sitting here thinking about building a new system, I already have an nVidia GTX 1060 so that saves me some cash in building the system.
> 
> I have two systems with parts chosen...
> 
> ...


Didn't read one post in the thread...

Wondering how this is our choice... It's your money and you know how they perform. What is this 'brain trust' going to do but get into a pissing contest and muddy the waters?


----------



## cucker tarlson (Jul 24, 2019)

ratirt said:


> Well, I've seen some measurements with different software, Cinebench R15 and R20, on different sites, and they show different things. The CPU-Z bench also shows single-core performance differently. One bench drew my attention, where clocks were locked at 3.5 GHz for both platforms and Ryzen was ahead of Intel by quite a margin.
> I keep saying this. Intel's been king for so long that software has been developed under Intel's banner, which means it tends to run better on Intel, and that's how I see it. One benchmark shows something totally different from another? That's strange when they're measuring exactly the same thing. Anyway, I think the situation is changing and AMD is catching up, even though it doesn't have the software-support advantage Intel does. Some of you may disagree, but that's how I see it; otherwise there's no way to explain the different behavior of benchmarks that are supposedly measuring the same thing.
> The other aspect is: more cores are going to be needed. You can see it already, and it supports my point that Intel has been in the lead while software companies clung to the master's will and used the resources they were given. Now it's different. Intel can't push clocks any higher (shrinks won't help; if anything they lower attainable clocks, as the foundries themselves have stated). All the companies know this, and they will look for other ways to improve. That means better core utilization if they want to stay competitive. New rules, I guess, caused not by Intel or AMD but by simple physics and the limits we run into on our way to improvement.


Well, that's one rendering benchmark, pretty one-dimensional compared to a whole suite of games, office programs, Photoshop, and Premiere.


----------



## ratirt (Jul 24, 2019)

cucker tarlson said:


> Well, that's one rendering benchmark, pretty one-dimensional compared to a whole suite of games, office programs, Photoshop, and Premiere.


I was referring to IPC, not a suite of benchmarks. IPC should be a constant, not something that changes depending on the app you're using.


----------



## cucker tarlson (Jul 24, 2019)

ratirt said:


> I was referring to IPC, not a suite of benchmarks. IPC should be a constant, not something that changes depending on the app you're using.


Looks like in most cases, except for gaming, it's splitting hairs.


----------



## ratirt (Jul 24, 2019)

cucker tarlson said:


> Looks like in most cases except for gaming it's splitting hairs


I've been trying to draw a picture which you refuse to look at. IPC is a constant; it doesn't change.
Anyway.
I'd still go with the Ryzen 3700X, knowing what the OP will be using it for, and it's cheaper than the 9900K mentioned.


----------



## cucker tarlson (Jul 24, 2019)

ratirt said:


> I've been trying to draw a picture which you refuse to look at. IPC is a constant; it doesn't change.
> Anyway.
> I'd still go with the Ryzen 3700X, knowing what the OP will be using it for, and it's cheaper than the 9900K mentioned.


Yeah, but single-threaded performance is what you should look at. What even is an "IPC benchmark"?


----------



## ratirt (Jul 24, 2019)

cucker tarlson said:


> Yeah, but single-threaded performance is what you should look at. What even is an "IPC benchmark"?


Well, I thought we had covered that, and Ryzen is better on IPC at the moment. An IPC measurement isn't a particular game benchmark, or even a suite of games, which can be coded to perform better on the top dog that Intel has been for over a decade. Wouldn't you say?


----------



## cucker tarlson (Jul 24, 2019)

ratirt said:


> Well, I thought we had covered that, and Ryzen is better on IPC at the moment. An IPC measurement isn't a particular game benchmark, or even a suite of games, which can be coded to perform better on the top dog that Intel has been for over a decade. Wouldn't you say?


Is that what happens? Games and software are coded specifically for Intel?


----------



## Slizzo (Jul 24, 2019)

Most are compiled exclusively with an Intel-based compiler. So, yes, sort of. But many developers are compiling with both in mind now.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> Is that what happens? Games and software are coded specifically for Intel?


I'd say there are too many other variables to test and figure that out in the first place.

Clock speed is a key component.


----------



## trparky (Jul 24, 2019)

Slizzo said:


> Most are compiled exclusively with an Intel based compiler. So, yes, sort of. But many developers are compiling with both in mind now.


I thought most Windows applications were compiled with Microsoft's own C++ compiler, the one included with Visual Studio.


----------



## Viruzz (Jul 24, 2019)

trparky said:


> So I'm sitting here thinking about building a new system, I already have an nVidia GTX 1060 so that saves me some cash in building the system.
> 
> I have two systems with parts chosen...
> 
> ...



If you play video games, go look at benchmarks and see how Ryzen is humiliated by the 9900K at every resolution. If you need to encode video, go see how Intel with Quick Sync is MUCH faster than the 3900X cranking all 12 cores and 24 threads compressing video.
Then go read how the whole platform is broken: the 3800X and 3900X won't boost to their advertised speeds, PCIe Gen 4 SSDs perform MUCH better on Intel's PCIe Gen 3 platform than on AMD's Gen 4 (which is a huge LOL at AMD), and people have tons of issues, like CPUs stuck at base clocks or Windows keeping the voltage at max when some programs run in the background.


----------



## cucker tarlson (Jul 24, 2019)

Viruzz said:


> If you play video games, go look at benchmarks and see how Ryzen is humiliated by the 9900K at every resolution. If you need to encode video, go see how Intel with Quick Sync is MUCH faster than the 3900X cranking all 12 cores and 24 threads compressing video.
> Then go read how the whole platform is broken: the 3800X and 3900X won't boost to their advertised speeds, PCIe Gen 4 SSDs perform MUCH better on Intel's PCIe Gen 3 platform than on AMD's Gen 4 (which is a huge LOL at AMD), and people have tons of issues, like CPUs stuck at base clocks or Windows keeping the voltage at max when some programs run in the background.


Dunno about voltage and boost, that stuff is gonna get ironed out soon, but that SSD speed is a huge blow. Those high-end drives cost as much as the CPU sometimes.
Quick Sync is just one of the options; from what I measured, Nvidia's NVENC is faster than Quick Sync and will work with every format anyway.


----------



## Viruzz (Jul 24, 2019)

cucker tarlson said:


> Dunno about voltage and boost, that stuff is gonna get ironed out soon, but that SSD speed is a huge blow. Those drives cost as much as the CPU sometimes.



They cost twice as much and perform worse, except in synthetic read benchmarks.
I hope AMD fixes this ASAP. I plan on getting a 16-core Ryzen myself, but mostly because I want a 16-core CPU, and I game at 4K, so I'm GPU-limited even with my 2080 Ti; I just want something new with many cores, not because it's faster, more as a novelty.
Back when AMD released their first 6-core CPUs, I think it was the 1090T or something like that, I got one instead of an Intel quad-core because it was the first mainstream 6-core CPU (it was my last AMD CPU).


----------



## cucker tarlson (Jul 24, 2019)

Viruzz said:


> They cost twice as much and perform worse, except in synthetic read benchmarks.
> I hope AMD fixes this ASAP. I plan on getting a 16-core Ryzen myself, but mostly because I want a 16-core CPU, and I game at 4K, so I'm GPU-limited even with my 2080 Ti; I just want something new with many cores, not because it's faster, more as a novelty.
> Back when AMD released their first 6-core CPUs, I think it was the 1090T or something like that, I got one instead of Intel (it was my last AMD CPU).


The thing is, Ryzen has had lower NVMe performance than Intel since Ryzen 1000. Dunno if this is ever gonna change. Look at the Anvil suite score: a 20% difference.










						AMD Ryzen SSD Storage Performance Preview
					

We provide an early look at AMD Ryzen SSD storage performance versus Intel in a range of different early tests.




					www.tweaktown.com
				




Even SATA SSDs perform way down.







----------



## HenrySomeone (Jul 24, 2019)

Viruzz said:


> If you play video games, go look at benchmarks and see how Ryzen is humiliated by the 9900K at every resolution. If you need to encode video, go see how Intel with Quick Sync is MUCH faster than the 3900X cranking all 12 cores and 24 threads compressing video.
> Then go read how the whole platform is broken: the 3800X and 3900X won't boost to their advertised speeds, PCIe Gen 4 SSDs perform MUCH better on Intel's PCIe Gen 3 platform than on AMD's Gen 4 (which is a huge LOL at AMD), and people have tons of issues, like CPUs stuck at base clocks or Windows keeping the voltage at max when some programs run in the background.


Yeah, even if we forget the gaming deficiencies (say, for someone who games at 4K or doesn't plan to upgrade past a 60 Hz monitor for some reason), their new platform is so full of bugs it's like a colony. And I also feel that some of them aren't going to be fixed at all, or will take so long that Ice Lake will be out first, which will once again spell big trouble for them.


----------



## Viruzz (Jul 24, 2019)

cucker tarlson said:


> The thing is, Ryzen has had lower NVMe performance than Intel since Ryzen 1000. Dunno if this is ever gonna change. Look at the Anvil suite score: a 20% difference.
> 
> 
> 
> ...



This is bad news. Look at the results you posted: it's especially bad in the 4K random benchmarks, the most important ones.
I plan on doing RAID 0 with MP510s; I hope AMD fixes this.


----------



## cucker tarlson (Jul 24, 2019)

Viruzz said:


> This is bad news. Look at the results you posted: it's especially bad in the 4K random benchmarks, the most important ones.
> I plan on doing RAID 0 with MP510s; I hope AMD fixes this.


Yup, small random I/O gets hit the worst, and that just sucks.


----------



## Viruzz (Jul 24, 2019)

HenrySomeone said:


> Yeah, even if we forget the gaming deficiencies (say, for someone who games at 4K or doesn't plan to upgrade past a 60 Hz monitor for some reason), their new platform is so full of bugs it's like a colony. And I also feel that some of them aren't going to be fixed at all, or will take so long that Ice Lake will be out first, which will once again spell big trouble for them.



If it's not going to be fixed, it's because of corporate shills and fantards. I posted about the boost bug issues, with videos from der8auer (not an unknown person), on the AMD reddit, and tons of people started to defend AMD; the post only has 60% upvotes.
Some people are enemies from within: instead of pressuring AMD to fix things, they defend them.


----------



## Vario (Jul 24, 2019)

AMD Reddit, good luck.


----------



## Papahyooie (Jul 24, 2019)

Viruzz said:


> If you play video games, go look at benchmarks and see how Ryzen is humiliated by the 9900K at every resolution. If you need to encode video, go see how Intel with Quick Sync is MUCH faster than the 3900X cranking all 12 cores and 24 threads compressing video.
> Then go read how the whole platform is broken: the 3800X and 3900X won't boost to their advertised speeds, PCIe Gen 4 SSDs perform MUCH better on Intel's PCIe Gen 3 platform than on AMD's Gen 4 (which is a huge LOL at AMD), and people have tons of issues, like CPUs stuck at base clocks or Windows keeping the voltage at max when some programs run in the background.



I'm not a fanboy at all, I own both Intel and AMD, and I sure as heck want AMD to fix their issues, but... lolwut?









						AMD Ryzen 9 3900X, SMT on vs SMT off, vs Intel 9900K
					

By community request, we present our findings on how the AMD Ryzen 9 3900X performs with SMT disabled. This approach has potential, especially for gaming, because it ensures more physical hardware units are available for each thread, and could also benefit the processor's power management.




					www.techpowerup.com
				




The 3900X's performance difference from the 9900K is negligible, at a similar price, and a 3600 is 90% of the performance of a 9900K at less than half the price.

I fail to see that as humiliation lol.
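To put rough numbers on the value argument: the relative performance figures below echo the kind of results in the review above, but the prices are launch MSRPs I'm assuming for illustration, not figures quoted anywhere in this thread.

```python
# Perf-per-dollar sketch. rel_perf is gaming performance relative to the
# 9900K; prices are ASSUMED launch MSRPs, not data from this thread.
cpus = {
    "9900K": {"rel_perf": 1.00, "price": 485.0},
    "3900X": {"rel_perf": 0.99, "price": 499.0},
    "3600":  {"rel_perf": 0.90, "price": 199.0},
}

def perf_per_100_dollars(rel_perf: float, price: float) -> float:
    """Performance points delivered per $100 spent."""
    return rel_perf / price * 100.0

for name, c in cpus.items():
    print(f"{name}: {perf_per_100_dollars(c['rel_perf'], c['price']):.2f} perf per $100")
```

On those assumed numbers, the 3600 delivers roughly twice the performance per dollar of the 9900K, which is the point being made here.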


----------



## cucker tarlson (Jul 24, 2019)

Viruzz said:


> If it's not going to be fixed, it's because of corporate shills and fantards. I posted about the boost bug issues, with videos from der8auer (not an unknown person), on the AMD reddit, and tons of people started to defend AMD; the post only has 60% upvotes.
> Some people are enemies from within: instead of pressuring AMD to fix things, they defend them.


That's the biggest ??? moment for me.
Don't you wanna see this fixed?


----------



## HenrySomeone (Jul 24, 2019)

I feel the new 3000 series and the X570 chipset are a lot like those candies or cookies from Harry Potter (it's been ages since I saw the movie, so I can't remember the name, but I mean the ones that came in all flavors, and I mean all of them): for every toffee or mousse (good IPC improvement over Zen+, large cache) we get a couple of vomit- and earwax-flavored ones.



cucker tarlson said:


> that's the biggest ??? moment for me.
> don't you wanna see this fixed?


The AMD reddit is basically a cult; they don't take facts, arguments, or reason.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> I'm not a fanboy at all, I own both Intel and AMD, and I sure as heck want AMD to fix their issues but.... lolwut?
> 
> 
> 
> ...


TPU's gaming tests aren't the best, frankly.










This thing can get crushed by the 9600K on occasion, though SMT off can help a lot.

Yeah, with SMT off it looks like "humiliate" is too big a word. Still behind the 9900K, and even the 8700K/9600K.


----------



## vega22 (Jul 24, 2019)

Lock any open-source test to one thread at the same clock speed and you get a good idea of IPC.

Games are not a good indication, as FPS is often linked to latency (cache and RAM) as much as anything else when CPU-bound. Not to mention GPU drivers...
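That locked-clock method is just simple arithmetic: a single-thread score divided by clock approximates work per cycle, so at equal frequency any score gap is an IPC gap. A sketch with made-up placeholder scores, not real measurements:

```python
def relative_ipc(score_a: float, score_b: float,
                 clock_a_ghz: float, clock_b_ghz: float) -> float:
    """Benchmark score normalized by clock approximates IPC;
    returns CPU A's IPC relative to CPU B's."""
    return (score_a / clock_a_ghz) / (score_b / clock_b_ghz)

# Both CPUs locked at 3.5 GHz, hypothetical single-thread scores:
print(relative_ipc(200.0, 185.0, 3.5, 3.5))  # ~1.08, i.e. ~8% higher IPC
```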


----------



## Space Lynx (Jul 24, 2019)

As long as you don't play Destiny 2, I would go with AMD.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> TPU's gaming tests aren't the best, frankly.


Oh? Do tell..........


----------



## cucker tarlson (Jul 24, 2019)

EarthDog said:


> Oh? Do tell..........


No mention of testing locations, plain and simple.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> No mention of testing locations, plain and simple.


Really?

Many games have integrated benchmarks, and those are used. That said, I would like a bit more detail, but that notwithstanding (it's only relevant for those who want to repeat the tests; the data is more than solid), it's one of the best out there.


----------



## cucker tarlson (Jul 24, 2019)

EarthDog said:


> Really?
> 
> Many games have integrated benchmarks, and those are used. That said, I would like a bit more detail, but that notwithstanding (it's only relevant for those who want to repeat the tests; the data is more than solid), it's one of the best out there.



Look at the Witcher 3 result, for that matter. Pretty flat, even at 720p; not much difference, just 9% from fastest to slowest, with FPS well over 200.

Now at 1080p, but in a CPU-heavy location, quite another story, right?









						Test procesora AMD Ryzen 7 3700X - Premiera architektury Zen 2 | PurePC.pl
					

AMD Ryzen 7 3700X processor review - debut of the Zen 2 architecture (page 52). Launch review of the new AMD Ryzen 7 3700X processor on the Zen 2 architecture, which goes up against the Intel Core i7-8700K and Core i7-9700K. Can AMD pull off a win?




					www.purepc.pl
				




or







Look at the performance uplift going from 3200 to 3733 memory on Ryzen.


----------



## EarthDog (Jul 24, 2019)

It's helpful to know, sure... but doesn't Witcher 3 have an integrated benchmark? I don't run it...
EDIT: It does not...

I also thought he had it listed somewhere... just not in the review? I agree a section in each review listing the testing settings would be helpful.

It's also, to me, mind-numbingly stupid to benchmark any game at 720p in the first place, for the reason of the results you just posted... because it doesn't scale the same and exaggerates a difference not found at higher resolutions.

Regardless of what area... nobody is going to be happy. You test in the worst area, people bitch. You test in a faster area, people bitch. Reviewers just can't win in the eyes of the couch-potato forum users.


----------



## cucker tarlson (Jul 24, 2019)

EarthDog said:


> It's helpful to know, sure... but doesn't Witcher 3 have an integrated benchmark? I don't run it...
> 
> I also thought he had it listed somewhere... just not in the review? I agree a section in each review listing the testing settings would be helpful.
> 
> It's also, to me, mind-numbingly stupid to benchmark any game at 720p in the first place, for the reason of the results you just posted... because it doesn't scale the same and exaggerates a difference not found at higher resolutions.


Using a regular resolution like 1080p and finding a CPU-heavy place is ideal.
Witcher 3 doesn't have a benchmark, but the big cities (Novigrad and Beauclair) are extremely heavy on the CPU. Tested myself.









						What I found about 5775c EDRAM's impact on gaming performance.
					

I found the option in bios to OC my EDRAM as well as disable it. I chose WatchDogs2 as my benchmark because of how CPU dependent that games is. Resolution is 2560x1440 but graphics settings are low to rule out GPU bottleneck. The results are staggering.  EDRAM @2GHz - 118 FPS   EDRAM disabled -...




					www.techpowerup.com
				




Digital Foundry's recent vid: Novigrad absolutely destroys CPUs.


----------



## mbeeston (Jul 24, 2019)

trparky said:


> So I'm sitting here thinking about building a new system, I already have an nVidia GTX 1060 so that saves me some cash in building the system.
> 
> I have two systems with parts chosen...
> 
> ...


I'd get the 3700X and put the extra money towards a new GPU; with a 1060 you'll get the same performance out of both anyway.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> Using a regular resolution like 1080p and finding a CPU-heavy place is ideal.


Is it? See last sentence (I edited).

Anyway, a bit OT, this silly thread in the first place... I digress.


----------



## cucker tarlson (Jul 24, 2019)

EarthDog said:


> Is it? See last sentence (I edited).
> 
> Anyway, a bit OT, this silly thread in the first place... I digress.


It's not about people moaning.
A CPU test in gaming should reflect what you will experience at a normal resolution, but in places where the CPU is the limiting factor; that's just common sense. People can say whatever they want, frankly; opinions can't change the facts.

OT, but still impressively civil for an Intel-AMD thread.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> but in places where the CPU is the limiting factor,


We'll have to agree to disagree... because cherry-picking a specific area like that is but a snapshot of the whole game. A worst-case scenario. People bitch if it's that, and people bitch if it's another way. Reviewers cannot please everyone.


----------



## cucker tarlson (Jul 24, 2019)

EarthDog said:


> We'll have to agree to disagree... because cherry picking a specific area like that is but a snapshot of the whole game. A worst case scenario. People bitch if it is that, and people bitch if it is another way. Reviewers cannot please everyone.


Well, they can bitch if they want to, but this is what CPU testing is: what your CPU will perform like when it's the limiting factor. Worst-case scenario, yes, but how else are you going to learn how a game behaves in CPU-heavy locations? CPU reviews are for people who can interpret them, too.


----------



## Vayra86 (Jul 24, 2019)

trparky said:


> Alright, maybe you guys are right. Keep my 8700K and just do a cheap upgrade with a Ryzen 5 2600 along with a cheap B450 board.
> 
> I’m looking at the Gigabyte B450M DS3H. I need compatible memory, preferably DDR4-3000 CAS16, and I’ll do a cheap upgrade for him.



That was my thought on page 1. Sensible choice, well played. Keep at it 



cucker tarlson said:


> Well, they can bitch if they want to, but this is what CPU testing is: what your CPU will perform like when it's the limiting factor. Worst-case scenario, yes, but how else are you going to learn how a game behaves in CPU-heavy locations? CPU reviews are for people who can interpret them, too.



That again, huh. @EarthDog and I don't agree on that either, even though my practical experience when the CPU gets pushed in games proves him wrong every time. It's hard to convince him, trust me.

But yeah, _we_ can at least enjoy W1zzard's 720p benches.


----------



## cucker tarlson (Jul 24, 2019)

Vayra86 said:


> That was my thought on page 1. Sensible choice, well played. Keep at it
> 
> 
> 
> ...


I kinda feel like W1zzard could do so much more with so much less. He's testing 10 games at 3 resolutions. Look at what GamersNexus or PurePC do: they test 5-7 games at only one resolution, but do a lot more research into CPU-limited areas. They come up with results that better reflect reality, and they do it much quicker.


----------



## Vayra86 (Jul 24, 2019)

cucker tarlson said:


> Using a regular resolution like 1080p and finding a CPU-heavy place is ideal.
> Witcher 3 doesn't have a benchmark, but the big cities (Novigrad and Beauclair) are extremely heavy on the CPU. Tested myself.
> 
> 
> ...



This video, man... I watched it. That is 90% total BS, though. He is actually advocating playing at different resolutions to gain consistency?! What is he smoking? The overall conclusion is that no CPU guarantees flawless gaming and shit code will be shit code regardless. This really goes nowhere. The cherry-picked frame drops are also remarkably pro-Intel across the whole thing, if you ask me. That Crysis one... what? All CPUs drop there, and it's common when the viewport shifts to a large open area, not just in Crysis... And it doesn't even hamper gameplay; it's a checkpoint load/save point as well (which induces that stutter alongside the other load!)...

Goes to show you need to actually play games to know which examples to pick, instead of staring at charts.


----------



## EarthDog (Jul 24, 2019)

Vayra86 said:


> That again huh. @EarthDog and I don't agree on that either even though my practical experience when the CPU gets pushed in games, proves him wrong every time. Its hard to convince him, trust me


It doesn't show ANYTHING except at that abhorrently low resolution. That is the ONLY thing it is good for. The proportions DO NOT SCALE, and they change as resolution goes up. So for those who play at 1080p and above, that type of exaggerated testing is largely not relevant, because the result simply does not apply in anything close to the same manner.

What can you take away from a result that says... zOMG holy shyte this CPU holds back FPS by 20% WTFBBQ???? When at the resolution you game at, the difference is nil? If people used 720p results as The Gospel, we'd be in a whole world of hurt.

Your practical experience, respectfully, is nothing compared to actual empirical testing, which you can easily reference to see my point. But hey, it's hard to convince some people of the facts, trust me. I can be convinced; it just takes facts and empirical testing to do so, not an unsupported butt-dyno experience.

EDIT... weird... people are disappearing off my ignore list. I don't think I've seen a Vayra86 post (or was it Vaya dumas?) in MONTHS! But I surely see why!!! LOL!


----------



## cucker tarlson (Jul 24, 2019)

Vayra86 said:


> This video man... I watched it. That is 90% total BS though. He is actually advocating to play on different resolutions to gain consistency?! What is he smoking. The overall conclusion is no CPU guarantees flawless gaming and shit code will be shit code regardless.


No, he's saying that 4K gives better consistency in frametimes, and he's right. Don't know why you misinterpreted it. Look at the frametime graph when it's running 1080p: even a 9700K can stutter here and there. At 4K, no stutter at all.

The problem EarthDog has is thinking that testing e.g. Witcher 3 in Novigrad specifically is cherry-picking, while it's absolutely not.



EarthDog said:


> It doesn't show ANYTHING except at that abhorrently low resolution. That is the ONLY thing it is good for. The proportions DO NOT SCALE, and they change. So for those who play at 1080p and above, that type of exaggerated testing is largely not relevant.
> 
> What can you take away from a result that says... zOMG holy shyte this CPU holds back FPS by 20% WTFBBQ???? When at the resolution you game at, the difference is nil?
> 
> Your practical experience, respectfully, is nothing compared to actual empirical testing, which you can easily reference to see my point. But hey, it's hard to convince some people of the facts, trust me. I can be convinced; it just takes facts and empirical testing to do so, not your butt-dyno experience.



Yeah man, but that's what I said earlier: CPU testing in games actually requires the user to know their stuff, or else they come up with conclusions like the "omg omg" sentence you wrote.


----------



## Vayra86 (Jul 24, 2019)

cucker tarlson said:


> no,he's saying that 4K gives better consistency in frametimes and he's right.Don't know why you misinterpreted it.Look at the frametime graph when it's running 1080p - even 9700k can stutter here and there.4K - no stutter at all.
> 
> the problem EarthDawg has is thinking that testing e.g. witcher 3 in novigrad specifically is cherry picking while it's absolutely not.



Right, but how does that affect either the CPU choice or the monitor you're stuck with anyway? Sure, it gives better consistency, but what value does that statement have? It's been obvious since the dawn of gaming. And meanwhile we constantly need to hear that the 9700K leads a bit. Every time.


----------



## cucker tarlson (Jul 24, 2019)

Vayra86 said:


> Right but how does that affect either the CPU choice, or the monitor you are stuck with anyway? Sure it gives better consistency, but what value does that statement have? Its been obvious since the dawn of gaming. And meanwhile, we constantly need to hear the 9700K leads a bit. Every time.


He's just speaking from the perspective of reviewing it at different resolutions.
Again, it's for the one who watches it to interpret.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> yeah man but that's what I said earlier,cpu testing in games actually requires the user to know their stuff or else they come up with conclusions like the "omg omg" sentence you wrote.


And if you don't test that way... the uninformed don't have to walk away misinformed.


----------



## Vayra86 (Jul 24, 2019)

EarthDog said:


> It doesn't show ANYTHING except at that abhorrently low resolution. That is the ONLY thing it is good for. The proportions DO NOT SCALE and the proportions are different as it goes up. So those who play at 1080p+, that type of exaggerated testing is largely not relevant as the result simply does not apply anywhere close to the same manner.
> 
> What can you take away from a result that says... zOMG holy shyte this CPU holds back FPS by 20% WTFBBQ???? When at the resolution you game at it, the difference is nill? If people used 720p results as The Gospel, we'd be in a whole world of hurt.
> 
> ...



It does not scale, and yet the performance gaps return in the worst-case scenarios. In other words, it can be used as a stand-in for a worst-case in-game scenario with respect to CPU load. You're at liberty to believe or disbelieve that; I really don't care. Go go ignore button, quick!

Anyway, topic hijacked... sorry about that. Dropping it.


----------



## cucker tarlson (Jul 24, 2019)

EarthDog said:


> And if you don't test that way... the uninformed don't have to walk away misinformed.


I never said it's not needed; I said it's not representative of a real-world CPU bottleneck.




EarthDog said:


> EDIT... weird... people are disappearing off my ignore list. I don't think I've seen a vayra86 post (or was it *Vaya dumas*?) in MONTHS! But I surely see why!!! LOL!


----------



## dirtyferret (Jul 24, 2019)

cucker tarlson said:


> I kinda feel like W1zzard could do so much more by so much less.He's testing 10 games at 3 resolutions.Look at what gamersnexus or purepc do,they test 5-7 games at one resolution only,but do a lot more research as far as cpu limited areas.They come up with results that better reflect reality and they do it much quicker.


GN tests at 1080p and 1440p, one less resolution than TPU.


----------



## cucker tarlson (Jul 24, 2019)

dirtyferret said:


> GN tests at 1080p and 1440p, one less resolution than TPU.


Ah yes, they do. Still, it's less work with a more informative end result.


----------



## dirtyferret (Jul 24, 2019)

I'm going to put in my two cents and try to avoid the flying bullets.

I like that TPU tests at 720p; it gives me a basic understanding of what CPU headroom there may be in certain games.

I like that GN tests 0.1% lows, even though that's hardly the end-all be-all of micro-stuttering.

I like that PC Gamer shows a composite of the 97th percentile across their gaming suite.

I like that AnandTech shows the 95th percentile.

I like that TechSpot/Hardware Unboxed does 36-game reviews.

I understand fanboys will cherry-pick these results to justify why "their" CPU is the best of the best, regardless of how it flies in the big picture of things.

I think it would be rather boring if every site ran the exact same tests with the exact same hardware and games. After all, who here plays the exact same games on the exact same hardware? Now if you'll excuse me, I'm going to hide in my bunker.


----------



## Papahyooie (Jul 24, 2019)

You guys really don't understand benchmarking, do you? lol. Funnily enough, it literally says why TPU does 720p tests IN the review, so you're just too lazy to find out.

They test games at 720p resolution because that effectively removes the GPU from the equation. A high end GPU will barely be working at 720p, so it allows the differences in the CPUs to come out. At 4k, the CPU doesn't matter as much because the GPU is loaded to the point that it's not waiting on the CPU. At 720p, the GPU is almost always waiting on the CPU, so you get a true representation of how a CPU performs. It also gives a good indicator of how the CPU will perform with a large range of GPUs, because that's an indicator of how many frames you will get MAXIMUM with a particular CPU, regardless of what GPU you use. If a CPU cannot hit 144 fps in a particular game at 720p, it will never EVER hit 144 fps in that game at realistic resolutions, even if you upgrade to a faster GPU. As such, it is a true representation of the CPU's performance in that game, uncolored by GPU choice.
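All of that boils down to a simple bottleneck model: the delivered frame rate is capped by whichever of the CPU or GPU is slower at a given resolution. A toy illustration with made-up numbers, not measured data:

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower side of the pipeline sets the frame rate."""
    return min(cpu_fps, gpu_fps)

cpu_cap = 150.0  # what the CPU can feed per second; roughly resolution-independent
gpu_caps = {"720p": 400.0, "1080p": 220.0, "4K": 60.0}  # GPU render rate by resolution

for res, gpu_cap in gpu_caps.items():
    print(res, delivered_fps(cpu_cap, gpu_cap))
# At 720p the GPU is nowhere near its limit, so the result exposes the
# CPU's 150 FPS ceiling; at 4K any CPU faster than ~60 FPS looks identical.
```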

But you all will probably still argue with me lol...


----------



## trparky (Jul 24, 2019)

I've changed my plans for my upgrade. The plan has been completely scrapped, the change from an 8700K to a 9900K (like many of you in this thread have said) would be so negligible an upgrade it wouldn't be even noticeable.

I am however planning on upgrading my father's system soon. I plan on getting a cheap AMD B450 motherboard, a Ryzen 2600 (last generation), and some RAM for less than $250. I already have a case, SATA SSD (which is more than enough for him), power supply, and a video card we can reuse. So that leaves only the processor, motherboard, and memory to replace which should make for a quick and cheap upgrade that will last him for years to come.

Hopefully in a year or two, when Ryzen gets even better, I'll do the same. Keep my SSD and power supply to do a cheap upgrade instead of having to do a full system rebuild.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> You guys really don't understand benchmarking do you lol? Funnily enough, it literally says why TPU does it IN the review, so you're just too lazy to find out.
> 
> They test games at 720p resolution because that effectively removes the GPU from the equation. A high end GPU will barely be working at 720p, so it allows the differences in the CPUs to come out. At 4k, the CPU doesn't matter as much because the GPU is loaded to the point that it's not waiting on the CPU. At 720p, the GPU is almost always waiting on the CPU, so you get a true representation of how a CPU performs. It also gives a good indicator of how the CPU will perform with a large range of GPUs, because that's an indicator of how many frames you will get MAXIMUM with a particular CPU, regardless of what GPU you use. If a CPU cannot hit 144 fps in a particular game at 720p, it will never EVER hit 144 fps in that game at realistic resolutions, even if you upgrade to a faster GPU. As such, it is a true representation of the CPU's performance in that game, uncolored by GPU choice.
> 
> But you all will probably still argue with me lol...


and yet you see completely different results at TPU testing 720p and other sites testing 1080p but in CPU-limited locations, the latter actually being worse for the CPU despite 1080p/Ultra settings.
choosing a testing location is the alpha and omega of CPU testing in games. It's all you need to do.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> and yet you see completely different results at TPU testing 720p and other sites testing 1080p but in CPU-limited locations.
> choosing a testing location is the alpha and omega of CPU testing in games. It's all you need to do.


You're suggesting TPU doesn't test their games in specific locations, consistent over their tests? 
You're suggesting that the random fluctuations and circumstances that a PC happens to be in at any given moment, including changes in game state, memory loaded, human error by looking at a slightly different spot, etc are more reliable than blanket limiting the GPU out of the equation by lowering the resolution?

Both nonsense.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> You're suggesting TPU doesn't test their games in specific locations, consistent over their tests?
> You're suggesting that the random fluctuations and circumstances that a PC happens to be in at any given moment, including changes in game state, memory loaded, human error by looking at a slightly different spot, etc are more reliable than blanket limiting the GPU out of the equation by lowering the resolution?
> 
> Both nonsense.


1. no, but they don't mention locations and the results actually look like they're not.
2. you're blowing the in-game testing margin of error out of proportion. way, way, waaay out of proportion. in games that include a quicksave option, and for a reviewer that knows how to do it, it's not that difficult to do a hundred identical runs.

look at AC Odyssey results.
if they tested in-game in Phokis those results would be 50% lower, 1080p, 1440p, doesn't matter. My 5775C gets so hammered in this location it can drop below 50, and it's a place you go to very, very often.

matter of fact,wait a second......


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> 1. no,but they don't mention locations and the results actually look like they're not.
> 2.you're blowing the in-game testing margin of error out of propotion.way,way,waay out of proportion.in games that include a quicksave option and for a reviewer that knows how to do it it's not that difficult to do a hundred identical runs.
> 
> 
> ...



What in the world does that have to do with anything? It doesn't matter if your 5775c gets hammered at a specific location. If the runs are exactly the same, then any CPU will have to do the exact same calculations that your 5775c did, and therefore would be "hammered" the exact same amount. So in Phokis, the results would be 50% lower... so what?? They'd be 50% lower across ALL CPUs, so the ratios would be maintained. The delta between the processors listed would be identical. Benchmarks are never about what the raw fps number is. It's about COMPARISON of the processors tested. You're completely misunderstanding the point of benchmarks here. It's not to give you a reliable indication of a specific performance target. That's not how that works, because your setup will ALWAYS be slightly different, and you'll always have slightly different numbers. The fact that the framerates would be lower in a specific location across the board is entirely irrelevant. The point of doing 720p tests is to provide an accurate comparison between the chips, while taking the GPU effectively out of the comparison.
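The "ratios are maintained" argument above is just arithmetic: if a heavier location scales every CPU's frame rate by the same factor, the percentage delta between CPUs is unchanged. A quick sketch with invented numbers:

```python
# Hypothetical FPS figures for two CPUs in a light and a heavy location.
# If the heavy location costs every CPU the same fraction of performance,
# the percentage delta between them survives intact.

light = {"cpu_a": 150.0, "cpu_b": 120.0}        # e.g. open countryside
heavy = {k: v * 0.5 for k, v in light.items()}  # same scene, 50% heavier

def delta_pct(results: dict) -> float:
    """Percentage lead of the faster CPU over the slower one."""
    hi, lo = max(results.values()), min(results.values())
    return (hi / lo - 1) * 100

print(delta_pct(light))  # 25.0
print(delta_pct(heavy))  # 25.0 -- same delta despite half the FPS
```

The raw numbers halve, but the comparison between chips is identical, which is the point of the benchmark.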


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> What in the world does that have to do with anything? It doesn't matter if your 5775c gets hammered at a specific location. If the runs are exactly the same, then any CPU will have to do the exact same calculations that your 5775c did, and therefore would be "hammered" the exact same amount. So in Phokis, the results would be 50% lower... so what?? They'd be 50% lower across ALL CPUs, so the ratios would be maintained. The delta between the processors listed would be identical. Benchmarks are never about what the raw fps number is. It's about COMPARISON of the processors tested. You're completely misunderstanding the point of benchmarks here. It's not to give you a reliable indication of a specific performance target. That's not how that works, because your setup will ALWAYS be slightly different, and you'll always have slightly different numbers. The fact that the framerates would be lower in a specific location across the board is entirely irrelevant. The point of doing 720p tests is to provide an accurate comparison between the chips, while taking the GPU effectively out of the comparison.


you still don't understand.
testing at 720p does not always equate to CPU testing.
see this post and look at what I said about the difference between the slowest (Ryzen 1st gen) and fastest (9900K). even though they're testing at 720p, it's clearly not as CPU-bound as another location is at a higher resolution








(embedded link: "3700X vs 9900K, that is the question..." at www.techpowerup.com)
				




in that test, a game is not really limited by a resolution but by a scenario (location)


----------



## biffzinker (Jul 24, 2019)

trparky said:


> I plan on getting a cheap AMD B450 motherboard, a Ryzen 2600 (last generation), and some RAM for less than $250.


Interested in buying a used 2600X? There are a couple of circular marks on top of the heatspreader from the AMD OEM cooler. Otherwise the chip hasn't been abused by excessive voltage; this mobo didn't allow anything past 1.4 V, or any overclocking besides PBO.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> you still don't understand.
> testing at 720p does not always equate to CPU testing.
> see this post and look at what I said about the difference between the slowest (Ryzen 1st gen) and fastest (9900K). even though they're testing at 720p, it's clearly not as CPU-bound as another location is at a higher resolution
> 
> ...


When comparing the same two processors at the same clockspeeds (i.e., 9900K vs R7 2700, because those are the top and bottom on the TPU chart), between your chart and mine, the % difference is negligible. They basically say the same thing. You can't just take the top and bottom from each chart and compare those, because the top and bottom chips aren't the same chip on each chart lol.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> When comparing the same two processors at the same clockspeeds (i.e, 9900k vs R7 2700 because those are the top and bottom on the TPU chart), between your chart and mine, the % difference is negligible. They basically say the same thing. You can't just take the top and bottom from each chart and compare those, because the top and bottom chips aren't the same chip on each chart lol.


no, but take the 1800X and 9900K. 9% at 720p and 43% in the other test.


----------



## Papahyooie (Jul 24, 2019)

That just says to me that your data isn't consistent lol. Why would two processors' delta be almost identical in both tests, but two other processors have a huge difference? 

Which is more likely to be at fault? An unscientific test that uses unknown and unreplicable data, or one that uses known and defined degrees of freedom?


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> That just says to me that your data isn't consistent lol. Why would two processors' delta be almost identical in both tests, but two other processors have a huge difference?
> 
> Who's more likely to blame? An unscientific test that uses unknown and unreplicatable data? Or one that uses known and defined degrees of freedom?


why do you insist it's unreplicable?


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> why do you insist it's unreplicable?


Because you're relying on data that cannot be defined. Prove to me that looking at the same city each time provides a consistent load on the processor. You can't, because you're not defining the contents of memory reliably. You're making an assumption. Using 720p at least provides a stable and known factor, whereas "game location" cannot be said to equate to any definable factor.


----------



## EarthDog (Jul 24, 2019)

cucker tarlson said:


> why do you insist it's unreplicable?


this is the problem with manual run-throughs of games... they're never the same, which increases the variability between runs. Regardless of the resolution, a canned benchmark is an exactly repeatable scene and minimizes run variance, whereas a manual run-through from a given location adds variance.


----------



## Papahyooie (Jul 24, 2019)

To be clear, the TPU runs are subject to the same possible problems, because each run may be slightly different. The difference is that TPU depends on that CPU load to gather data, but does NOT rely on a consistent CPU load as the primary basis for comparison; it relies instead on the repeatable known factor of a low resolution. If such an anomalous result were found in the TPU results, the benchmarker should retry the run multiple times to try to shake out the anomaly, or smooth it over with an average. If the anomaly continues to be repeatable, it would behoove the tester to investigate why.
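The retry-and-average procedure described here can be sketched roughly as follows (the 5% threshold and the FPS figures are invented, just to illustrate the idea): repeat the run, report the median, and flag any run that strays too far from it for re-testing.

```python
import statistics

def summarize_runs(fps_runs: list[float], tolerance: float = 0.05):
    """Report the median of repeated benchmark runs and flag outliers.

    A run is flagged if it deviates from the median by more than
    `tolerance` (5% by default) -- an assumed threshold, purely to
    illustrate shaking anomalous runs out of the data.
    """
    med = statistics.median(fps_runs)
    outliers = [r for r in fps_runs if abs(r - med) / med > tolerance]
    return med, outliers

runs = [143.2, 144.0, 142.8, 121.5, 143.6]  # one run is clearly anomalous
median, flagged = summarize_runs(runs)
print(f"reported: {median:.1f} fps, re-test: {flagged}")
```

With these numbers the 121.5 fps run gets flagged for a retry while the other four agree within a fraction of a percent.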


----------



## cucker tarlson (Jul 24, 2019)

so you're saying that if those tests were carried out in a more consistent way, the 40% difference during the in-game run would be 9% again? I don't think you believe that.
and isn't a very specific choice of location a repeatable factor, as you called it, too?


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> so you're saying that if those tests were carried out in a more consistent way, the 40% difference during the in-game run would be 9% again? I don't think you believe that.


You're the one claiming that game location causes such a huge difference in the test data. So you tell me... Could a change in the test that resulted in more or less CPU load on a specific run cause that much of a difference?

If so, then your data is not reliable.

If not, then your claim that in game location is a better indicator of performance is false, and therefore your entire premise is flawed.

You tell me.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> You're the one claiming that game location causes such a huge difference in the test data. So you tell me... Could a change in the test that resulted in more or less CPU load on a specific run cause that much of a difference?
> 
> If so, then your data is not reliable.
> 
> ...


you're confusing the very thing you're arguing about.
your point was about the replicability factor, the consistency in carrying out the tests, not the location vs. resolution difference that I'm trying to get across. just because your data at 720p could be more consistent does not mean it's any more relevant than location testing. I mean it could be for a reviewer maybe, but not for the end user of a CPU.

Please don't think I don't get your point. I do, though I think you exaggerate things. I just think that a more consistent methodology (720p benchmark) does not matter for me as much as testing a CPU-heavy location at a more standard resolution would. You end up with a consistent bag of nothing. Which result will the end user ever be closer to seeing with their own eyes while playing?


----------



## trparky (Jul 24, 2019)

What’s this about SSD performance?


----------



## cucker tarlson (Jul 24, 2019)

trparky said:


> What’s this about SSD performance?


nothing you need to worry about, really, if that Ryzen system is gonna end up in your dad's office. just nerds comparing score points.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> you're confusing the very thing you're arguing about.
> your point was about the replicability factor, the consistency in carrying out the tests, not the location vs. resolution difference that I'm trying to get across. just because your data at 720p could be more consistent does not mean it's any more relevant than location testing. I mean it could be for a reviewer maybe, but not for the end user of a CPU.
> 
> Please don't think I don't get your point. I do, though I think you exaggerate things. I just think that a more consistent methodology (720p benchmark) does not matter for me as much as testing a CPU-heavy location at a more standard resolution would. You end up with a consistent bag of nothing. Which result will the end user ever be closer to seeing with their own eyes while playing?



That's fine if you want an indicator of what FPS you will get from a particular chip in a particular situation, but that's not what these benchmarks are for! They are for comparing processors against each other. And your data shows an anomaly, ostensibly because the "particular situation" that was tested, was changed somehow across runs. This invalidates the result. Two processors show identical performance deltas over the two testing methodologies, and two others show hugely different deltas. So there is something wrong with SOMEONE'S tests. I contend that the 720p tests are more accurate and reliable. If I am correct, then your fps numbers are objectively wrong, and therefore not even a good indicator for what you want. Something has skewed your numbers, so for whatever purpose you may want, your numbers are wrong! Even if you deem such a test more "useful" to you, it doesn't matter because the results are flawed, so your "more useful" results are inherently useless.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> That's fine if you want an indicator of what FPS you will get from a particular chip in a particular situation, but that's not what these benchmarks are for! They are for comparing processors against each other. And your data shows an anomaly. Two processors show identical performance deltas over the two testing methodologies, and two others show hugely different deltas. So there is something wrong with SOMEONE'S tests. I contend that the 720p tests are more accurate and reliable. If I am correct, then your fps numbers are objectively wrong, and therefore not even a good indicator for what you want. Something has skewed your numbers, so for whatever purpose you may want, your numbers are wrong! Even if you deem such a test more "useful" to you, it doesn't matter because the results are flawed, so your "more useful" results are inherently useless.


please, just because the 720p benchmark may be more consistent does not imply all the other things you claim. wrong results? yeah, sure.
two different deltas do not imply either number is wrong, I don't know why you'd think that. wouldn't having identical deltas imply that?


----------



## Papahyooie (Jul 24, 2019)

I'll copy/paste something I edited into the reply above that you probably didn't see:

"your data shows an anomaly, ostensibly because the "particular situation" that was tested, was changed somehow across runs. This invalidates the result."

Yes. Incorrect results. If you change the test across runs, which obviously did happen somehow, you invalidate that result. This is basic testing methodology.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> I'll copy/paste something I edited into the reply above that you probably didn't see:
> 
> "your data shows an anomaly, ostensibly because the "particular situation" that was tested, was changed somehow across runs. This invalidates the result."
> 
> Yes. Incorrect results. If you change the test across runs, which obviously did happen somehow, you invalidate that result. This is basic testing method.


but the fact is you don't change it.

you don't really think that a game without a built-in benchmark is actually impossible to test, do you?


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> but the fact is you don't change it.


Something obviously was changed, or you wouldn't have such disparity in the deltas between the two separate testing methodologies. You'd have different numbers, sure. But you SHOULD have consistent deltas, at least in percentage. So something changed, there IS anomaly in SOMEONE's testing methodology. You can argue all you want that TPU's testing methodology is the less consistent, but that's laughable considering the fact that TPU's methodology has repeatable factors, whereas yours doesn't. So I maintain that it's the one you shared that has anomaly. I'm not going to spend the time to prove it, because it's common sense, and any reasonable person would agree.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> Something obviously was changed, or you wouldn't have such disparity in the deltas between the two separate testing methodologies. You'd have different numbers, sure. But you SHOULD have consistent deltas, at least in percentage. So something changed, there IS anomaly in SOMEONE's testing methodology. You can argue all you want that TPU's testing methodology is the less consistent, but that's laughable considering the fact that TPU's methodology has repeatable factors, whereas yours doesn't. So I maintain that it's the one you shared that has anomaly. I'm not going to spend the time to prove it, because it's common sense, and any reasonable person would agree.


is it even possible for a minor change to skew results by such margins? not really. you're looking for answers by picking at details while the answer lies elsewhere, in plain sight.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> is it even possible for a minor change to skew results by such margins? not really. you're looking for answers by picking at details while the answer lies elsewhere, in plain sight.


We've already been through this lol. 



Papahyooie said:


> You're the one claiming that game location causes such a huge difference in the test data. So you tell me... Could a change in the test that resulted in more or less CPU load on a specific run cause that much of a difference?
> 
> If so, then your data is not reliable.
> 
> ...


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> We've already been through this lol.


I tell you, the reviewer must have been hit in the head with a hammer in the middle of each run to obtain such skewed results.

it's a change between the two tests, not between runs of each test, for Christ's sake

we are so dead when the moderator sees this.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> I tell you, the reviewer must have been hit in the head with a hammer in the middle of each run to obtain such skewed results.


That's the problem with your claims... You can't say why. The anomaly is obviously there, when compared to other tests. And yet, you can't explain why. Which is why I contend that your methodology is unreliable.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> That's the problem with your claims... You can't say why. The anomaly is obviously there, when compared to other tests. And yet, you can't explain why. Which is why I contend that your methodology is unreliable.


it's not an anomaly though. you just call it that.

hey, look, testing a CPU-heavy location in-game shows different results than the built-in benchmark at another resolution. must be an anomaly.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> it's not an anomaly though. you just call it that.



It is. Other tests show consistent deltas between two sets of two processors. Yours show consistent deltas between two processors, but wildly different deltas for two other processors, of the same set of four. This is an anomaly.


----------



## cucker tarlson (Jul 24, 2019)

Papahyooie said:


> It is. Other tests show consistent deltas between two sets of two processors. Yours show consistent deltas between two processors, but wildly different deltas for two other processors, of the same set of four. This is an anomaly.


thinking that just because a 720p test shows XX% difference between two, three, or five CPUs, any other run that shows a wider or smaller gap must be incorrect is just wrong.


----------



## Papahyooie (Jul 24, 2019)

cucker tarlson said:


> thinking that just because a 720p test shows XX% difference between two, three, or five CPUs, any other run that shows a wider or smaller gap must be incorrect is just wrong.



That's not how deltas work lol. When the gap is consistent across two testing methodologies except for one specific result, that is an anomaly.


----------



## biffzinker (Jul 25, 2019)

trparky said:


> What’s this about SSD performance?


Seems to be fine for me.


----------



## btarunr (Jul 25, 2019)

If you don't overclock, have you considered a combination of i9-9900 (non-K) with a cheap B365 motherboard?

https://www.newegg.com/intel-core-i9-9th-gen-core-i9-9900/p/N82E16819118023  ($440 here)

https://www.newegg.com/p/N82E168131...B365&cm_re=ASRock_B365-_-13-157-863-_-Product ($90 here).


----------



## dgianstefani (Jul 25, 2019)

What a silly suggestion. The point of Intel is to overclock. If you're not going to overclock, buy a Ryzen; they're about equivalent to Intel at stock, but the Intel chips gain 10 to 20% by overclocking whereas the Ryzens are already running maxed out.


----------



## Vayra86 (Jul 25, 2019)

cucker tarlson said:


> you're confusing the very thing you're arguing about.
> your point was about replicablility factor,the consistency in carrying out the tests.not the location vs resolution difference that I'm trying to get across.just cause your data with 720p could be more consistent does not mean it's any more relevant than location testing.I mean it could be for a reviewer maybe,but not for the end user of a cpu.
> 
> Please don't think I don't get your point.I do,though I think you exagerrate things.I just think that more consistent methodology (720p benchmark) does not matter for me as much as testing a cpu heavy location at more standard resolution would.You end up with a consitent bag of nothing.Which result will the end user ever be closer to seeing with their own eyes while playing?



You're both right here, to be honest... the idea behind the test TPU does and the one you see elsewhere focusing on specific game events/locations is different.

- TPU wants to show the maximum potential FPS in game X on CPU Y, and how each processor ranks in that maxed out scenario - uncolored, 'scientific' value. A window into the maxed potential with regards to high FPS gaming. Keep in mind those CS GO players used to run 4:3 at low res and super low settings... They like this test and it serves a purpose in their buying decision - and rightly so, I might add. It is also here you see Ryzen's improvement over last gen in all of its glory.

- Specific game location tests serve a night and day different purpose; they show us the _worst case scenario_ for a CPU. Does it run maxed out? Yes, but it also runs into a badly optimized, or heavy in asset/code part of the game. This gives us a window into the worst frame drops per game per CPU. It also tells you mostly just thát: how does it work out in that specific game. This is interesting for people who play mostly that specific type of games. TW3's Novigrad or AC: O's towns, same type of game & same behavior on CPUs.

Different purpose, and I think both approaches are valuable to readers. I also have to say the 720p testing fits well in the rest of the review for TPU because there is zero focus on minimum FPS. Its all about averages, ballpark idea of relative performance. Not a deep dive.


----------



## cucker tarlson (Jul 25, 2019)

dgianstefani said:


> What a silly suggestion. The point of Intel is to overclock. If you're not going to overclock buy a Ryzen, they're about equivalent to Intel at stock, but the Intels gain 10/20% by overclocking whereas the Ryzens are already running maxed out.


stock vs. stock, the K SKUs look meh compared to Ryzen on price terms; the non-K will be cheaper for the CPU and cheaper for the mobo, while the performance hit is not that big.
it's just another option, and tbh it's not silly at all.


Vayra86 said:


> You're both right here, to be honest... the idea behind the test TPU does and the one you see elsewhere focusing on specific game events/locations is different.
> 
> - TPU wants to show the maximum potential FPS in game X on CPU Y, and how each processor ranks in that maxed out scenario - uncolored, 'scientific' value. A window into the maxed potential with regards to high FPS gaming. Keep in mind those CS GO players used to run 4:3 at low res and super low settings... They like this test and it serves a purpose in their buying decision - and rightly so, I might add. It is also here you see Ryzen's improvement over last gen in all of its glory.
> 
> ...


well, in-game tests are certainly not wrong or flawed.
like I said, 720p built-in testing is very consistent, but max CPU performance is not the limiting factor, the min is.
like that i5 result: somebody sees it and thinks his i5 is ready to handle Odyssey at 90 fps, buys a 120 Hz panel, and then sees the CPU limiting performance to 60 fps in every city.


----------



## Vayra86 (Jul 25, 2019)

cucker tarlson said:


> stock vs. stock the K skus look meh compared to ryzen on price terms,non-k will be cheaper for the cpu and cheaper for the mobo while the performance hit is not that big.
> it's just another option,and tbh it's not silly at all.
> 
> well in game tests are certainly not wrong or flawed
> ...



Of course, but the blame should be placed not on the CPU but the game. All CPUs drop heavily in these scenarios, and the gist of that is just that you want the fastest CPU available, there are barely any exceptions to that. And that conclusion also echoes from the 720p tests.


----------



## cucker tarlson (Jul 25, 2019)

Vayra86 said:


> Of course, but the blame should be placed not on the CPU but the game. All CPUs drop heavily in these scenarios, and the gist of that is just that you want the fastest CPU available, there are barely any exceptions to that. And that conclusion also echoes from the 720p tests.


well, there are big differences in how much those CPUs drop, and that's the whole point of testing CPUs for gaming. balanced performance is not high avg. fps, it's when the basement is as close to the ceiling as possible.


----------



## Vayra86 (Jul 25, 2019)

cucker tarlson said:


> well there are big differences in how much those cpus drop,the whole point of testing cpus for gaming.



....drop in that _specific scenario. _A different game may put other CPUs on top in its own heaviest scenarios. Those tests have little value beyond a specific type of game, type of engine and/or even only just that specific game. They say little about the 'overall' relative CPU performance. 720p does give you that.

Go figure, the two examples you keep giving are two very similar scenarios. Different engines, but 3rd person open world and both in a population hub. That is precisely the scope of the results you are looking at.


----------



## cucker tarlson (Jul 25, 2019)

Vayra86 said:


> ....drop in that _specific scenario. _A different game may put other CPUs on top in its own heaviest scenarios. Those tests have little value beyond a specific type of game, type of engine and/or even only just that specific game. They say little about the 'overall' relative CPU performance. 720p does give you that.


that's why you test across games, engines, and APIs.
look at GamersNexus: they do just a bunch of games, but each is a different engine, type, API.
if a CPU does well in 5 wildly different scenarios, chances are it's gonna do well in every other one.
that's why I always say that the best CPU is not the one with the highest avg. fps, but the one that always makes it to the top 3 of any chart. if you get an 8700K you can be sure you're never gonna see your performance fall off a cliff in any game.


----------



## Vayra86 (Jul 25, 2019)

cucker tarlson said:


> that's why you test across games, engines, and APIs.
> look at GamersNexus: they do just a bunch of games, but each is a different engine, type, API.
> if a CPU does well in 5 wildly different scenarios, chances are it's gonna do well in every other one.
> that's why I always say that the best CPU is not the one with the highest avg. fps, but the one that always makes it to the top 3 of any chart.



I think the relative performance charts on TPU still cover that quite well. Or can you put two reviews side by side and point out a different top 3 based on that between GN and TPU?

A large test selection kinda mitigates the outliers and arrives at similar results as to the 'overall best'.


----------



## cucker tarlson (Jul 25, 2019)

Vayra86 said:


> I think the relative performance charts on TPU still cover that quite well.


in terms of a CPU rank list - yes
in terms of measuring actual CPU-bottleneck performance deltas - not quite

fact is I don't care much for 720p testing, but that's cause I have more personal experience with several dozen games run on high refresh rate panels; that's why I look for CPU-heavy location testing.
both are indicative of something that someone might be looking for.


----------



## Vayra86 (Jul 25, 2019)

cucker tarlson said:


> in terms of measuring actual cpu bottleneck performance deltas - not quite



Alright I get ya, but that really is equivalent to a deep dive, one you would expect to be looking for AFTER buying your CPU. A lot of these game specific quirks can be fixed either through software tweaks, or hardware OC. And in other examples, a simple game patch makes all problems go away... so again... I highly question the value of going that deep into it, when it comes to a purchase decision. For that, overall relative performance is where its at, unless you are _very keen _on serving just a single type of workload.


----------



## cucker tarlson (Jul 25, 2019)

Vayra86 said:


> Alright I get ya, but that really is equivalent to a deep dive, one you would expect to be looking for AFTER buying your CPU. A lot of these game specific quirks can be fixed either through software tweaks, or *hardware OC*.


ryzen 3000 would not like what you just said


----------



## Vayra86 (Jul 25, 2019)

cucker tarlson said:


> ryzen 3000 would not like what you just said



Amen to that, it's the reason Intel had dominance for so long, right? The CPUs were _versatile._ They excelled both at high FPS and at high-load scenarios. Now the tables are turning. Good thing Ryzen 3000 kills stuff right out of the box.


----------



## cucker tarlson (Jul 25, 2019)

Vayra86 said:


> Amen to that, it's the reason Intel had dominance for so long, right? The CPUs were _versatile._ They excelled both at high FPS and at high-load scenarios. Now the tables are turning. Good thing Ryzen 3000 kills stuff right out of the box.


Well, another thing is that min fps is just part of the story. Frametime consistency is just as important. I'd like to see how that humongous L3 cache on Ryzen 3000 handles that.
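For anyone wondering how "min fps" gets reported in practice: rather than the single worst frame, reviewers usually compute a 1% low, i.e. the average over the worst 1% of frametimes, which captures stutter that a plain average hides. A minimal sketch with made-up frametime numbers (not measurements from any real CPU):

```python
# Sketch: deriving average FPS and "1% low" FPS from a frametime log.
# The frametimes below are made-up illustrative numbers.

def one_percent_low_fps(frametimes_ms):
    """Average FPS over the worst 1% of frames (highest frametimes)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)        # worst 1% of frames, at least one
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# A mostly smooth run (~60 fps) with ten 50 ms stutter spikes:
frametimes = [16.7] * 990 + [50.0] * 10

avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
low_fps = one_percent_low_fps(frametimes)

# The average barely moves, but the 1% low exposes the stutter.
print(f"average: {avg_fps:.0f} fps, 1% low: {low_fps:.0f} fps")
```

The point of the example: the ten hitches hardly dent the average, but the 1% low drops to 20 fps, which is exactly the consistency difference being discussed.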


----------



## ratirt (Jul 26, 2019)

Vayra86 said:


> Amen to that, it's the reason Intel had dominance for so long, right? The CPUs were _versatile._ They excelled both at high FPS and at high-load scenarios. Now the tables are turning. Good thing Ryzen 3000 kills stuff right out of the box.


Intel has a slight advantage now because of its long dominance, one that shrinks every time AMD releases a new CPU.



cucker tarlson said:


> Well, another thing is that min fps is just part of the story. Frametime consistency is just as important. I'd like to see how that humongous L3 cache on Ryzen 3000 handles that.


I agree. Min FPS is the key, and benching at 720p is ridiculous from one standpoint. Who cares about a 120 max when the lows go below 40, for example. The narrower the gap between min and max, the better. I see reviews where, benching at 720p, the FPS isn't much different from 1080p or even 2K, and the reviewers tend to say "CPU bottleneck". That's kinda horse crap given that the CPU is only being utilized at 5-10%. How can the CPU be a bottleneck when it still has that much headroom? The game is the bottleneck: a poor scheduler (maybe), or the execution of its code, because it can't utilize the resources. The game software feeds commands to the CPU and tells it how to make things happen, not the other way around. For me, a CPU bottleneck is when the CPU is at 100% and struggles to keep up, and the GPU has to wait for it to finish whatever there is to finish, hanging at 60% for instance.
That's a definite CPU bottleneck. I know this will change, and it has already started, but for some reason it is happening slower than CPU architecture and core counts are advancing.
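The rule described above (CPU pegged while the GPU sits waiting) can be sketched as a tiny classifier. The utilization samples and the 95%/90% thresholds are made-up assumptions, just to make the definition concrete:

```python
# Sketch of the bottleneck rule described above. Thresholds and samples
# are arbitrary illustrative values, not a standard.

def classify_bottleneck(cpu_util, gpu_util):
    """Label a (cpu %, gpu %) sample by the rule: CPU pegged, GPU waiting."""
    if cpu_util >= 95 and gpu_util < 90:
        return "cpu-bound"
    if gpu_util >= 95:
        return "gpu-bound"
    return "neither (engine/game limited?)"

# (cpu %, gpu %) samples: a clear CPU bottleneck, a clear GPU bottleneck,
# and the 720p case from the post where the CPU sits at 5-10%.
samples = [(100, 60), (40, 99), (10, 55)]
for cpu, gpu in samples:
    print(f"cpu {cpu:3d}%  gpu {gpu:3d}%  -> {classify_bottleneck(cpu, gpu)}")
```

Note how the third sample, the "720p with the CPU at 10%" scenario from the post, lands in neither bucket, which is exactly the argument being made: neither chip is saturated, so the game itself is the limit.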


----------

