# AMD Ryzen 7 2700X 3.7 GHz



## W1zzard (Apr 19, 2018)

AMD's new second-generation Ryzen processors are here. We run the Ryzen 7 2700X flagship through our completely revamped test suite, which features the latest BIOS, OS, and software updates, as well as new tests and games. Results are very impressive and considerably reduce the gap to Intel's offerings.

*Show full review*


----------



## jabbadap (Apr 19, 2018)

So no point in overclocking it, then? Can't you tune the single-core clock any higher manually? Very puzzling results, I must say.


----------



## v12dock (Apr 19, 2018)

Way more than I expected; I think we can all expect a knee-jerk reaction from Intel.


----------



## Vario (Apr 19, 2018)

Thanks for the review.  About time things got competitive for CPUs.  I am excited to see what Intel counters with.


----------



## Fleurious (Apr 19, 2018)

Looks like a great CPU.  For me it’s a toss up with the i5 8400 and will be decided by the chipset/platform features.


----------



## yeeeeman (Apr 19, 2018)

Nice. A 7% difference in gaming at 1080p is nothing compared to the 8700K. You get more cores and a cheaper platform; best buy in my opinion.


----------



## phill (Apr 19, 2018)

Bottom line: overclocking or not, I'd like one, maybe two...

Hopefully performance will improve further as the refresh matures, and even more so with overclocking, which I believe would make a bigger difference if they can push 4.4 GHz or more.



jabbadap said:


> So no point in overclocking it, then? Can't you tune the single-core clock any higher manually? Very puzzling results, I must say.



As they said in the review, it's possibly down to how well AMD's auto-boosting works and the fact that they couldn't go above the peak boost level of 4.3 GHz. Here's hoping for 4.5 GHz or more at some point...


----------



## Mussels (Apr 19, 2018)

At 4K gaming, overclocking made it run slower? What?


----------



## jabbadap (Apr 19, 2018)

yeeeeman said:


> Nice, 7% difference in gaming at 1080P is nothing compared to 8700K. You get more cores, cheaper platform, best buy in my opinion.



Until you overclock the Intel to ~5 GHz...


----------



## W1zzard (Apr 19, 2018)

Mussels said:


> At 4K gaming overclocking it made it run slower? what?


As explained in the overclocking section and conclusion, the CPU will boost higher than our manual OC when few cores are loaded.


----------



## Vario (Apr 19, 2018)

Would it overclock higher with a better cooler?


----------



## W1zzard (Apr 19, 2018)

jabbadap said:


> Can't you tune single core clock manually any higher?


Unlike with Intel, it seems there is no way to adjust boost levels for specific core-count workloads.


----------



## jabbadap (Apr 19, 2018)

W1zzard said:


> As explained in the overclocking section and conclusion, the CPU will boost higher than our manual OC when few cores are loaded.



Did you use Bios or Wattman for overclocking?


----------



## W1zzard (Apr 19, 2018)

Vario said:


> Would it overclock higher with a better cooler?


Sure, get out the LN2. I did some quick testing with a 240 mm watercooler and saw negligible improvements, maybe 50 MHz.



jabbadap said:


> Did you use Bios or Wattman for overclocking?


Tried both; makes no difference. For me, Ryzen Master is nice for quickly finding the rough maximum OC (not so many reboots), and then I fine-tune using the BIOS, because you'll have to reboot anyway due to system crashes.


----------



## Mussels (Apr 19, 2018)

W1zzard said:


> As explained in the overclocking section and conclusion, the CPU will boost higher than our manual OC when few cores are loaded.



It's still mind-boggling how good XFR2 is.


----------



## jabbadap (Apr 19, 2018)

What are the all-core clocks, by the way? I can't seem to find that information anywhere.


----------



## phill (Apr 19, 2018)

Thank you W1zzard, brilliant review. I wonder how it works for cancer crunching...


----------



## W1zzard (Apr 19, 2018)

jabbadap said:


> What are the all core clocks by the way, can't seem to find that information anywhere.


Great idea. This is something I'll include in future CPU reviews.


----------



## Mussels (Apr 19, 2018)

Oh @W1zzard, just to be a PITA: can you see if the clocks/performance (with XFR/high manual OC/voltages) change with active cooling on the VRMs? My Ryzen systems had lower performance over time until I did that. Although my MSI boards don't exactly have high-end VRM cooling, it still made a massive difference with the Ryzen 1700 chip (invisible thermal throttling, basically).


----------



## cucker tarlson (Apr 19, 2018)

Stock 7600K-like performance in games from an overclocked 8c/16t is disappointing, to say the least. The 2700X has serious workstation performance though. I just can't explain people putting Ryzen in rigs that are mostly for gaming.


----------



## Mussels (Apr 19, 2018)

cucker tarlson said:


> Stock 7600K-like performance in games from an overclocked 8c/16t is disappointing to say the least . 2700X has serious workstation performance tho. I just can't explain people putting ryzen in rigs that are mostly for gaming.



Ryzen's cheaper, especially outside the USA. Simple as that.


----------



## cucker tarlson (Apr 19, 2018)

Mussels said:


> Ryzens cheaper, especially outside the USA. Simple as that.


Than the 7600K? What?
Here the Ryzen 2700X is 1350, the 8700K is 1400, and the 8700K is just better overall.


----------



## intrepid3d (Apr 19, 2018)

Why is your Ryzen 2700X at 4.2 GHz pretty much identical in gaming performance to the 1800X at 3.6 GHz? Despite a 600 MHz clock-speed difference, the performance is exactly the same.
Did you make a mistake, or has Ryzen 2 lost a huge lump of IPC vs. Ryzen 1?

Your review is very strange; I don't understand it.


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> Why is your Ryzen 2700X at 4.2Ghz gaming performance pretty much identical to the 1800X at 3.6Ghz? despite a 600Mhz clock speed difference the performance is exactly the same?
> Did you make a mistake or has Ryzen 2 lost a huge lump of IPC vs Ryzen 1?
> 
> Your review is very strange i don't understand it.


What exactly are you looking at? GPU limited?


----------



## kruk (Apr 19, 2018)

cucker tarlson said:


> Stock 7600K-like performance in games from an overclocked 8c/16t is disappointing to say the least .



No, it's actually not; some games just don't scale. Also, this is a benchmark, where interference from other apps is kept to a minimum. In real-world use (with background apps running), you might notice that in heavily threaded scenarios frame drops can occur on the 4T CPU, while the 8C/16T CPU will have no problems.


----------



## intrepid3d (Apr 19, 2018)

W1zzard said:


> What exactly are you looking at? GPU limited?



No, I'm looking at your review...

https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/20.html


----------



## dyonoctis (Apr 19, 2018)

If you are gaming and doing workstation work, the 2700X is a steal: it's only 10% slower than the 7820X in productivity, but 17% faster in games (HFR).
Overclocking doesn't seem really interesting, since you would have to turn it on and off according to what you are doing.


----------



## intrepid3d (Apr 19, 2018)

Guys, there are a lot more reviews on the net, and *all of them* look a lot better for Ryzen 2 than this one, especially with its strange claim of zero performance difference from +600 MHz.

This makes it look like Bulldozer Mk2. Ryzen is not Bulldozer; all the other reviews have it close to Intel.


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> No i'm looking at your review...



I guess you are looking at the 4K resolution only? At that resolution the games are GPU-limited: the GPU cannot produce frames any faster, which is why the CPU sits idle, waiting for the next frame to be ready from the GPU. A faster CPU will just sit idle "faster", i.e., not provide any performance gain. Look at lower resolutions, which have higher FPS, which means the CPU needs to do more work and becomes the limiting factor. CPU time for a given frame is roughly constant.
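In pseudocode terms, the GPU-limit effect reads like this toy Python sketch (all the millisecond figures are invented purely for illustration): each frame needs both CPU and GPU work, so FPS is capped by whichever is slower, and a faster CPU changes nothing while the GPU is the bottleneck.

```python
# Toy model of CPU/GPU frame limits; the per-frame times are made up
# to illustrate the point, not measured from any real system.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """FPS is capped by the slower of the CPU and GPU per-frame work."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# 4K: the GPU needs ~25 ms per frame, so the CPU barely matters.
print(fps(5.0, 25.0))  # 40.0 FPS
print(fps(4.0, 25.0))  # 40.0 FPS -- a faster CPU gains nothing

# 1080p: the GPU is quick, so CPU time becomes the limiter.
print(fps(5.0, 4.0))   # 200.0 FPS
print(fps(4.0, 4.0))   # 250.0 FPS -- now the faster CPU shows
```

That's all "GPU limited" means: the max() is dominated by the GPU term, so every CPU in the chart lands on the same number.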


----------



## cucker tarlson (Apr 19, 2018)

kruk said:


> No, its actually not, some games just don't scale. Also, this is a benchmark, where interference from other apps is kept to a minimum. In real world use cases (where you have background apps running), you might notice that (in heavily threaded scenarios) frame drops can occur on the 4T CPU but the 8C/16T CPU will have no problems.


I know it's better than the 7600K for streaming, but the 8700K is better than Ryzen too and costs the same. Let's end this discussion here; we won't draw any conclusions that haven't already been drawn with the Ryzen 1xxx series. By the way, there's so much more to talk about in this CPU than its rather mediocre gaming price/perf; I think it's splendid. When they release the non-X version at a lower price, people who consider the 8700K for a workstation CPU might instead give the 2700 and faster DDR4 a try.


----------



## kruk (Apr 19, 2018)

intrepid3d said:


> Guys, there are a lot more reviews on the net and *all of them* look a lot better for Ryzen 2 than this one, especially with its strange claims of 0 performance difference with +600Mhz.



You should reread the conclusion:



> Yes, overclocking is possible and our sample reached 4.2 GHz stable on all cores, but that won't always give you higher performance, as our benchmarks show. The underlying reason is that Ryzen has very clever boost algorithms that automagically increase clock frequencies beyond rated stock frequencies. The 2700X can go up to 4.3 GHz when just a single core is active, which is higher than what we managed with manual overclocking. As a result, single-threaded applications will run faster than without overclocking.



You should really look at the green bars if you want to compare the 1800X to the 2700X...


----------



## W1zzard (Apr 19, 2018)

kruk said:


> You should reread the conclusion:


I think @intrepid3d is looking at the gaming performance summary charts for 4K and is wondering why all CPUs are identical


----------



## intrepid3d (Apr 19, 2018)

W1zzard said:


> I guess you are looking at 4K resolution only? At that resolution the games are GPU limited - the GPU can not produce frames any faster, which is why the CPU sits idle, waiting for the next frame to be ready from the GPU. So a faster CPU will sit idle, "but faster", ie, not provide any performance gain. Look at lower resolutions, which have higher FPS rates, which means the CPU needs to do more work and become a limiting factor. CPU time for a given frame is roughly constant.



You don't even know your own slides? Also, why the sub-i3 performance? Really? Did you turn all the graphics settings off to make all your game results single-threaded?

I would have thought by now people were clued up enough to know that by turning graphics settings down you're reducing the load on CPUs, especially MT CPUs.


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> did you turn all the graphics settings off


All games are at highest settings.

Aaaaah, now I know what you mean. When overclocked, the 2700X runs 4.2 GHz on all cores. At stock (what is listed as "2700X 3.7 GHz" in the charts), the CPU will boost up to 4.3 GHz, which is 100 MHz higher than the manual OC. The 1800X will boost up to 4.1 GHz if I remember correctly, which is close enough to the 2700X's 4.2 GHz to explain the small difference.

Does that help?
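To put that behavior in code, here's a toy boost table in Python (the per-core-count clocks are assumed for illustration, not AMD's actual XFR2 bins): a flat manual OC only wins once enough cores are loaded, while stock boost wins when few are.

```python
# Illustrative boost table, NOT AMD's real XFR2 bins: stock boost falls
# as more cores load, while a manual OC pins every core to one clock.

STOCK_BOOST_GHZ = {1: 4.3, 2: 4.25, 4: 4.1, 8: 4.0}  # assumed values
MANUAL_OC_GHZ = 4.2

def effective_clock(active_cores: int, overclocked: bool) -> float:
    """Clock the loaded cores run at in this toy model."""
    if overclocked:
        return MANUAL_OC_GHZ
    # pick the smallest boost bin that covers the active core count
    bin_cores = next(b for b in sorted(STOCK_BOOST_GHZ) if active_cores <= b)
    return STOCK_BOOST_GHZ[bin_cores]

print(effective_clock(1, overclocked=False))  # 4.3 -> stock beats the OC
print(effective_clock(1, overclocked=True))   # 4.2
print(effective_clock(8, overclocked=False))  # 4.0
print(effective_clock(8, overclocked=True))   # 4.2 -> OC wins all-core
```

Lightly threaded games sit near the top of the table, which is why the stock chip can beat the manually overclocked one.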


----------



## bug (Apr 19, 2018)

Just curious, but if "Limited overclocking potential" is a con, why is "Soldered IHS" a pro?


----------



## DeathtoGnomes (Apr 19, 2018)

First off, thanks for the review @W1zzard. 

You can expect the price/performance to change the first time this chip goes on sale; I expect we'll see up to $100 off in special sales from Amazon and maybe a few other e-tailers. From what I can gather, the 2600's performance is right up there with the 2700, which might be a better price/performance buy.



bug said:


> Just curious, but if "Limited overclocking potential" is a con, *why is "Soldered IHS" a pro*?



Because it's not TIM!


----------



## jabbadap (Apr 19, 2018)

cucker tarlson said:


> Than 7600K ? What ?
> Here Ryzen 2700X is 1350, 8700K is 1400, and 8700K is just better overall.



You mean the old Kaby Lake 7600K? It's quite literally the same processor as the i3-8350K.


----------



## Xaled (Apr 19, 2018)

intrepid3d said:


> Guys, there are a lot more reviews on the net and *all of them* look a lot better for Ryzen 2 than this one, especially with its strange claims of 0 performance difference with +600Mhz.
> 
> This makes it look like Bulldozer Mk2, Ryzen is not Bulldozer, all other reviews have it close to Intel.



Zero performance difference with +600 MHz? Did you smoke something before reading the article, or did you even read it? You don't actually need to read it to know that the 1800X and all the other CPUs have clock boost, unless the reviews you found on the net fixed the 1800X's clock at 3.6 GHz.

Ryzen 2 does look good in this review, and it's not close to Bulldozer at all. It's a shame even to mention their names together.


----------



## intrepid3d (Apr 19, 2018)

W1zzard said:


> All games are at highest settings.
> 
> Aaaaah, now I know what you mean. When overclocked, the 2700X is running 4.2 GHz all cores. When at stock (what is listed as 2700X 3.7 GHz in the charts), the CPU will boost, up to 4.3 GHz, which is 100 MHz higher than the manual OC. The 1800X will boost up to 4.1 GHz if I remember correctly, which is close enough to the 2700X's 4.2 GHz, to explain the small difference.
> 
> Does that help?



Which proves your benchmarks are all single-thread limited. In a world of multithreaded CPUs, what is the point of that?

I'm probably wasting my time here, because no reviewer likes to be told something is wrong with what they're doing. But in the hope that you might listen: please seriously look again at how you do your benchmarks, because you are telling your viewers an i3 is a better gaming CPU than any Ryzen CPU. Do you use an i3 for high-end GPU gaming? It's a dreadful combination; the i3 is a really bad gaming CPU because what your slides don't show is the bad variation in performance between complex and empty scenes. You must have run your benchmarks in empty scenes, because had you used complex scenes, the i3 would have been right at the bottom.

Watch this Ryzen 1600 vs. 7600K review; this guy knows what he's talking about. Sure enough, in complex scenes the 1600 is almost twice, yes twice, as fast, and the frame-rate swings on the 7600K are so bad there is massive stuttering. None of this shows on any of your slides. The way you do reviews these days is so bare-bones it's simply not what anyone actually using these products would experience. It's misleading, as misleading as labeling the 8700K as running at 3.7 GHz; it's running at at least 4.3 GHz, if not 4.7 GHz. Why is telling people the CPU runs at 3.7 GHz a problem? Because it makes them think they can get another 30% performance out of it by overclocking, when in reality it's 10%, or 2% if it's actually running at 4.7 GHz, and we don't know which it is. It's certainly not 3.7 GHz.

http://imgur.com/jdinejn


----------



## cucker tarlson (Apr 19, 2018)

intrepid3d said:


> I would have thought by now people are clued up enough to know that by turning graphics setting down you're reducing the load on CPU's, especially MT CPU's.


Uuumm... no. As a matter of fact, the exact opposite is true: decrease the visual fidelity and the CPU has more work to do; increase the settings and resolution and the CPU has less work.


----------



## btarunr (Apr 19, 2018)




----------



## fullinfusion (Apr 19, 2018)

Great review @W1zzard thank you!


----------



## cucker tarlson (Apr 19, 2018)

btarunr said:


> View attachment 99974


AMD: we make your Intel CPU a better buy for the money!


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> ...as misleading as labeling the 8700K as running at 3.7 GHz; it's running at at least 4.3 GHz, if not 4.7 GHz. Why is telling people the CPU runs at 3.7 GHz a problem? Because it makes them think they can get another 30% performance out of it by overclocking, when in reality it's 10%, or 2% if it's actually running at 4.7 GHz, and we don't know which it is. It's certainly not 3.7 GHz.


Hmm .. maybe we could list them as Base/Boost. ie "Core i7-8700K 3.7 / 4.7 GHz". But wouldn't that add a ton of noise to the charts?
What do other readers think?


----------



## cucker tarlson (Apr 19, 2018)

W1zzard said:


> Hmm .. maybe we could list them as Base/Boost. ie "Core i7-8700K 3.7 / 4.7 GHz". But wouldn't that add a ton of noise to the charts?
> What do other readers think?


It should be 4.3 GHz for the 8700K. That's what it runs stock in games. Not 3.7, not 4.7. It runs 4.3 GHz.


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> i3 is a really bad gaming CPU








I cherry picked a result just like you did.


----------



## intrepid3d (Apr 19, 2018)

W1zzard said:


> Hmm .. maybe we could list them as Base/Boost. ie "Core i7-8700K 3.7 / 4.7 GHz". But wouldn't that add a ton of noise to the charts?
> What do other readers think?



It should be what it's actually running at.


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> It should be what its actually running at.


During each benchmark? Average? Which cores? Average the clocks? Or the highest? Or the lowest? (serious question)

I'm thinking about this right now for the request further up on reporting clocks at various thread-counts
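One possible answer, sketched in Python: log every core's clock during the benchmark and report several aggregates side by side, leaving readers to pick the view they care about. The sample data below is invented for illustration.

```python
# Hypothetical per-core clock log (MHz): samples[t][c] is core c's clock
# at sample t. The numbers are invented purely for illustration.
from statistics import mean

samples = [
    [4300, 3700, 3700, 3700],  # lightly threaded: one core boosting
    [4250, 4250, 3700, 3700],
    [4000, 4000, 4000, 4000],  # all-core load
]

summary = {
    "highest": max(max(s) for s in samples),  # best single-core clock seen
    "lowest": min(min(s) for s in samples),   # worst clock seen
    "avg_of_core_averages": round(mean(mean(s) for s in samples), 1),
}
print(summary)
```

Reporting highest, lowest, and the average together sidesteps the "which one?" question entirely, at the cost of three numbers per CPU instead of one.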


----------



## cucker tarlson (Apr 19, 2018)

Intrepid3d doesn't get one thing: faster single-core performance is not equivalent to fewer cores, and vice versa. You're going to find examples of Ryzen beating the i3 and the i3 beating Ryzen. That's why I think neither Ryzen nor the i3 is good for gaming. 6c/6t i5s and 4c/8t i7s have the best balance: fast single core and enough threads not to choke the CPU.


----------



## intrepid3d (Apr 19, 2018)

W1zzard said:


> I cherry picked a result just like you did.



Bad cherry-pick; only a 10% difference. In the one I picked, the 1600 was 100% faster.

And you have clearly missed the whole point, something a reviewer worth his salt should know: you measure a CPU's performance at its weakest, not, as you do, at its best in empty scenes.

At their weakest, the Ryzen 1600 is 100% faster than the 7600K. That is the true measure of the performance differences in games: how they perform in those parts of the games that actually stress the CPU.


----------



## jabbadap (Apr 19, 2018)

cucker tarlson said:


> Uuumm... no. As a matter of fact, the exact opposite is true: decrease the visual fidelity and the CPU has more work to do; increase the settings and resolution and the CPU has less work.



Unless it's some physics running on the CPU. If I remember correctly, some graphics options have a CPU bottleneck too (was it geometry? can't remember).


----------



## cucker tarlson (Apr 19, 2018)

jabbadap said:


> Unless it is some physics running on CPU. If I remember correctly some graphics options have cpu bottleneck too(was it geometry, can't remember).


Some post-FX effects and geometry too; they like a fast single core. This (I think) is actually the reason for a very curious thing: in games like Watch Dogs 2 (a lot of geometry in the city), Ryzen 7 can scale on all 16 threads equally, but it still loses to an i5 that hits 80-90% utilization on 6 threads.
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,42
Like I said, more cores is not equivalent to a faster core and vice versa; some games prefer one to the other.
Physics is done on the GPU these days, at least most of the time.


----------



## the54thvoid (Apr 19, 2018)

cucker tarlson said:


> Stock 7600K-like performance in games from an overclocked 8c/16t is disappointing to say the least . 2700X has serious workstation performance tho. I just can't explain people putting ryzen in rigs that are mostly for gaming.



Things in bold for speed.

1 - *I upgraded before the 8700k was released,* at the time, a nice 8 core was a 'psychological' boost over my old 6 core.
2 - The 8700k, which wasn't out, was pushed out at a lower Intel price point (had I been able to see into the future and see Intel's post Ryzen pricing restructure, I may not have gone with Ryzen).
3 - The Intel tendency to ditch motherboard for new chips was a factor to me.  *I can upgrade to the next Ryzen mArch change and keep the motherboard.*
4 - My Ryzen was a sideways upgrade, *willing to experiment with a new tech arrival, you know, like tech enthusiasts like to do. *
5 - I own a 1080ti.  Going by your logic of not being able to explain putting a Ryzen in a gaming rig, I can't explain why anybody would buy a Vega 56 or 64.
6 -* I gamed at 1440p till last week when I then went 4k*.  Intel has nothing exceptional to offer me over Ryzen.  

Point 4 is the most delicate point.  Buying the fastest thing doesn't make it interesting.  I enjoyed my Ryzen build.  And again, when I bought it, the 8700K wasn't in existence.


----------



## bug (Apr 19, 2018)

cucker tarlson said:


> AMD - we make your intel CPU a better buy for the money !


Well, yeah. It's why, pre-Ryzen launch, I always said that even if you buy Intel exclusively, it's still wise to root for AMD.


----------



## Xaled (Apr 19, 2018)

W1zzard said:


> Hmm .. maybe we could list them as Base/Boost. ie "Core i7-8700K 3.7 / 4.7 GHz". But wouldn't that add a ton of noise to the charts?
> What do other readers think?


Yeah, it may look so, but that would be more accurate and less misleading, especially for newcomers. Also, adding a "CPU Name - Default Clock/Boost Clock" label at the top of the chart may make it perfect.


----------



## cucker tarlson (Apr 19, 2018)

the54thvoid said:


> Things in bold for speed.
> 
> 1 - *I upgraded before the 8700k was released,* at the time, a nice 8 core was a 'psychological' boost over my old 6 core.
> 2 - The 8700k, which wasn't out, was pushed out at a lower Intel price point (had I been able to see into the future and see Intel's post Ryzen pricing restructure, I may not have gone with Ryzen).
> ...


Lol, I'm not judging you; you don't have to explain anything to me. You made your decision, and even though I'm more inclined towards Intel CPUs in gaming rigs, I can't argue with any of the points you just made. They make sense.


----------



## xkm1948 (Apr 19, 2018)

Great review. Thank you @W1zzard 

With this kind of performance I'd rather pick an 8700K, TBH. Just an overall stronger CPU in gaming, which is what this price segment is for.


----------



## dj-electric (Apr 19, 2018)

xkm1948 said:


> Great review. Thank you @W1zzard
> 
> With this kind of performance I'd rather pick a 8700K TBH. Just overall a stronger CPU in gaming, which is what the price segment is for.



Definitely is. Gaming performance, especially for me as a 144 Hz user, is what most prevents me from going AMD.
Still, these are OK. Can be recommended to most people.


----------



## kastriot (Apr 19, 2018)

Well, that's it then; now we wait for lower prices and for next-gen CPUs.


----------



## W1zzard (Apr 19, 2018)

intrepid3d said:


> At their weakest, the Ryzen 1600 is 100% faster than the 7600K


How does 7600K do at Ryzen 1600's weakest?



mcraygsx said:


> Is the IHS soldered like first-gen Ryzen chips, or are they using TIM between the die this time around?


It is soldered, as mentioned a couple of times in the review.


----------



## jabbadap (Apr 19, 2018)

cucker tarlson said:


> Some post-FX effects and geometry too. They like a fast single core. Physics is done on the GPU these days, at least most of the time.



The two most common physics middlewares, PhysX and Havok, both run on the CPU most of the time, though granted, you can't even necessarily disable that physics. Well, maybe some games have cloth and debris options, but yeah, it varies.


----------



## Mighty-Lu-Bu (Apr 19, 2018)

Wow, at that price point it almost makes sense to upgrade my 1700X.


----------



## efikkan (Apr 19, 2018)

DDR4-2933? Why? Isn't DDR4-3000 more available? (Cascade Lake-X is rumored to feature DDR4-2933 as well…)

I wish they sold versions without these boxed coolers. This CPU runs hot and needs a proper cooler. Stock coolers are fine on low-end products, but at least >$300 products should be offered without them.

-----

It's sad to see that AMD has taken such extreme measures to close the gap with Intel. I'm afraid products like this, which boost outside of thermal specifications, will be the new norm, and unfortunately, Intel will probably respond by doing the same. We have already seen boosting taken too far on Vega and Pascal, where real-world builds easily lose 5-10% when the card goes into a case with normal airflow. But this takes it one step further by boosting beyond thermal specifications, rendering all open-case or open-bench tests useless, since the results are no longer reliable. This is effectively an overclocked product and should be regarded as such.



> These chips hit Intel's 7th generation Core "Kaby Lake" series so hard, that the company cut its yearly generational product cycle by half and rushed in the 8th generation Core "Coffee Lake" series, with 50-100% core-count increases across the board, to restore competitiveness.


The planned bump in core count has been known to the public since around the launch of Skylake and is not a result of Ryzen in any way. Even the greatly delayed Ice Lake had already taped out by early last summer. These are designs made before Intel knew what Zen would deliver. Please refrain from such attempts to misrepresent history when you know better.



btarunr said:


> View attachment 99974


Has AMD forgotten its own product from 2011? Or is it no longer considering the FX line "8-core"?


----------



## Xaled (Apr 19, 2018)

bug said:


> Well, yeah. It's why, pre Ryzen launch I always said that even if you buy Intel exclusively, it's still wise to root for AMD.


If only rooting alone would help AMD make better CPUs...
Actually, the only way to keep Intel and NVIDIA from increasing prices, and to get their future products for less money, is to buy AMD products now, so AMD can spend more on developing its products.
It really works like that. I am not an AMD fan nor an Intel hater; if AMD were in Intel's place, I would say the opposite.


----------



## GhostRyder (Apr 19, 2018)

Excellent review; I have been very curious about this chip for a while. I am guessing its temperature limit is about the same as before. Did you ever manage to make it throttle?


----------



## intrepid3d (Apr 19, 2018)

W1zzard said:


> How does 7600K do at Ryzen 1600's weakest?



I don't quite understand the question. When your gaming performance depends on the CPU, not the GPU, the Ryzen 1600 is roughly 100% faster than the 7600K.

Your reviews don't reflect that; they appear to be best-case for low-core-count CPUs. In other words, your CPU reviews are not based on CPU performance; they mislead people into thinking 4 cores are better than 6 or 8, when in real life the higher-core-count CPUs are vastly better gaming CPUs.

Where are your testing methodologies? I want to replicate them so I can help you see exactly where you are going wrong.


----------



## John Naylor (Apr 19, 2018)

Commendable effort by AMD, and it comes pretty close to much of the pre-launch hoopla. Obvious choice for the gamer who's using workstation apps at least a third of the time. And for the pure gamer, while I understand the cost argument logically, I don't think $70 is going to sway choices in this $300+ CPU price niche.

One thing I'd like to see is the effect of overclocking in a couple of gaming titles. For example, when comparing AMD and NVIDIA GPUs, TPU reviews let us see the impact of overclocking. I think the best example of that was the RX 480 review, which showed:

The MSI 480 overclocks 8.6% and the MSI 1060 overclocks 15.1%. So when the 1060 (with a 10% performance advantage at stock) is overclocked, the relative difference would be:

https://www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/26.html
https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X/27.html

110% × (115.1 / 108.6) = 116.6% of the 480's speed, or 16.6% faster

In some cases, one card is faster at stock, but when both are OC'd the situation reverses. For those who OC, stock performance is something they will never see. It would be an overwhelming task to do 18 games, but a chart with 2-4, with just the new CPU and its most apt price-niche competitor, would be very welcome.
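That arithmetic generalizes into a small helper; a quick Python sketch, using the percentages quoted above from the two MSI reviews:

```python
# Estimate relative performance once BOTH cards are overclocked, from the
# stock performance ratio and each card's fractional OC headroom.

def oc_relative(stock_ratio: float, oc_gain_a: float, oc_gain_b: float) -> float:
    """stock_ratio = perf(A)/perf(B) at stock; gains are e.g. 0.151 for 15.1%."""
    return stock_ratio * (1 + oc_gain_a) / (1 + oc_gain_b)

# GTX 1060: 10% faster at stock, +15.1% OC headroom; RX 480: +8.6% headroom.
ratio = oc_relative(1.10, 0.151, 0.086)
print(round(ratio, 3))  # 1.166 -> the overclocked 1060 is ~16.6% faster
```

Run it for any pair of cards (or CPUs) and the stock-vs-OC reversal cases fall out immediately: the ratio crosses 1.0 whenever the slower stock part has enough extra headroom.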

The solder, however, clearly serves as no more than a marketing ploy... why do folks delid? So they can avoid the temperature wall and get higher OCs... and they aren't getting any higher OCs here.

Can ya turn off the RGB stuff?


----------



## bug (Apr 19, 2018)

Xaled said:


> if only rooting alone would help AMD make better cpus...
> Actually the only way to not let Intel and nVidia inrease prices and to gèt their future products for less money is to buy AMD products now so they can spend more on devoloping their products.
> it really works like that. i am not an amd fan nor an intel hater. if AMD was in  Intels place i would have said the opposite


I get that you think you know what people should buy, but that's people's choice, really.

What you're suggesting doesn't work. It requires everybody turning their purchases into a political statement. It also requires rewarding the player that falls behind, which is not always desirable.
They are businesses, it's their job to compete with each other. Buyers are seldom concerned with the politics behind that.

I would buy from the underdog if a comparable product is $10-20 more expensive or at the same price I can get 5-10% less performance (_all_ other things being equal). But I never picked Bulldozer over Core. Nor do I expect people that do not follow CPU development as religiously as I do to have the knowledge to do the same.


----------



## springs113 (Apr 19, 2018)

W1zzard said:


> During each benchmark? Average? Which cores? Average the clocks? Or the highest? Or the lowest? (serious question)
> 
> I'm thinking about this right now for the request further up on reporting clocks at various thread-counts


Hate to sound greedy, but how about all of the above?

Edit: side note, I guess... if the results are this good, only AMD can ruin Zen 2, which I doubt will happen.


----------



## dicktracy (Apr 19, 2018)

cucker tarlson said:


> Stock 7600K-like performance in games from an overclocked 8c/16t is disappointing to say the least . 2700X has serious workstation performance tho. I just can't explain people putting ryzen in rigs that are mostly for gaming.


I wonder about that myself. It's like a gamer choosing Vega over a 1080 Ti for compute power he will never utilize, and in the end he's stuck with the option that produces less FPS.


----------



## intrepid3d (Apr 19, 2018)

Interesting results from AnandTech... the Ryzen 2700X walks all over the 8700K: https://www.anandtech.com/show/12625/amd-second-generation-ryzen-7-2700x-2700-ryzen-5-2600x-2600/17

And this may be why:

"We ran our tests on a fresh version of RS3 + April Security Updates + Meltdown/Spectre patches using our standard testing implementation."

Did TPU do that, or was it a clean system without the performance-hampering Intel fixes?


----------



## jabbadap (Apr 19, 2018)

efikkan said:


> DDR4-2933? Why? Isn't DDR4-3000 more available? (Cascade Lake-X is rumored to feature DDR4-2933 as well…)
> 
> I wish they sold versions without these boxed fans. This CPU is hot and needs a proper cooler. Stock fans are fine on low-end products, but at least >$300 products should be offered without them.
> 
> ...



AMD's official system memory spec for the Ryzen 7 2700X is 2933 MHz; anything above that is considered overclocking (XMP profiles). For last-gen Ryzen 7s it was 2666 MHz, which is why those reviews used 2667 MHz memory.


----------



## dicktracy (Apr 19, 2018)

intrepid3d said:


> Interesting results from Anandtech... Ryzen 2700X walks all over the 8700K https://www.anandtech.com/show/12625/amd-second-generation-ryzen-7-2700x-2700-ryzen-5-2600x-2600/17
> 
> And this may be why....
> 
> ...


I wouldn't put much weight in Anandtech. Meltdown doesn't hurt gaming performance, or barely does. Their review is just weird compared to TPU and everywhere else (even biased ones like Hardware Unboxed had the 2700X trailing by some).


----------



## MxPhenom 216 (Apr 19, 2018)

intrepid3d said:


> I don't quite understand the question? when your gaming performance depends on the CPU, not the GPU, the Ryzen 1600 is roughly around 100% faster than the 7600K.
> 
> Your reviews don't reflect that, they appear to be best case for low core count CPU's, in other words your CPU reviews are not based on CPU performance, they mislead people into thinking 4 cores are better than 6 or 8 when in fact in real life the higher core count CPU's are vastly better gaming CPU's.
> 
> Where are your testing methodologies? i want to replicate them so i can help you see exactly where you are going wrong.



LOL, you act like this is W1zzard's first go at a review.


----------



## intrepid3d (Apr 19, 2018)

dicktracy said:


> I wouldn't put much weight in Anandtech. Meltdown doesn't hurt gaming performance, or barely does. Their review is just weird compared to TPU and everywhere else (even biased ones like Hardware Unboxed had the 2700X trailing by some).



Anand are one of the oldest, most knowledgeable and professional reviewers there are, they have been around doing it better than most for decades.

I think they are on to something, they usually are.


----------



## BasedOndread (Apr 19, 2018)

cucker tarlson said:


> Than 7600K ? What ?
> Here Ryzen 2700X is 1350, 8700K is 1400, and 8700K is just better overall.


Did you forget the 8700K doesn't come with a cooler? You pretty much must buy water cooling for the 8700K to max out performance, and for what margin? 10 FPS? Adding all that cost to gain 10 to 20 FPS? A $109 AIO cooler plus a Z-series board...


----------



## cucker tarlson (Apr 19, 2018)

BasedOndread said:


> Did you forget the 8700K doesn't come with a cooler? You pretty much must buy water cooling for the 8700K to max out performance, and for what margin? 10 FPS? Adding all that cost to gain 10 to 20 FPS? A $109 AIO cooler plus a Z-series board...


Yes. Are you kidding ? 20 fps ? Even 10 fps ? It's a huge difference. That's 1080Ti vs 1080 difference.


----------



## dicktracy (Apr 19, 2018)

intrepid3d said:


> Anand are one of the oldest, most knowledgeable and professional reviewers there are, they have been around doing it better than most for decades.
> 
> I think they are on to something, they usually are.


That was valid when Anand was still heading it. It’s not the same site it once was.


----------



## BasedOndread (Apr 19, 2018)

cucker tarlson said:


> Yes. Are you kidding ? 20 fps ? Even 10 fps ? It's a huge difference. That's 1080Ti vs 1080 difference.


No, no, no.

We're talking about price, per the initial moderator reply. If you're a rich kid and want 10 more FPS for $200 more than the slightly slower Ryzen, fine. You said the 8700K is the same price and better, but you're forgetting about overall build cost.

But Ryzen is a great bargain: gaming is good and it dominates multithreaded workloads. I've always bought 4-core chips, from the Phenom II X4 era through the i5 2500 I'm using now, so the extra cores give me a lot of headroom.


----------



## cucker tarlson (Apr 19, 2018)

BasedOndread said:


> No, no, no.
> 
> We're talking about price, per the initial moderator reply. If you're a rich kid and want 10 more FPS for $200 more than the slightly slower Ryzen, fine. You said the 8700K is the same price and better, but you're forgetting about overall build cost.
> 
> But Ryzen is a great bargain: gaming is good and it dominates multithreaded workloads. I've always bought 4-core chips, from the Phenom II X4 era through the i5 2500 I'm using now, so the extra cores give me a lot of headroom.


I hear your point, and I think you're right up to a point, but you're exaggerating as well. You don't have to spend $200 more on a Z370 board and cooling. As a matter of fact, for a 1700X/2700X you WILL need a good X370/X470 board too if you wanna overclock. Sure, the multiplier is unlocked on B350, but the VRM quality just won't handle an 8-core chip at 1.35-1.45 V. A good cooler is a quality feature on its own. Sure, you may stick with AMD's cooler, but you'll never get the noise/performance ratio of a good $50 air cooler like the TR Macho, which is usually the best price/quality.


----------



## B-Real (Apr 19, 2018)

So 7% difference in FHD gaming, 3% at 1440p and 1% at 4K with a 1080 Ti, and the same overall CPU performance for $40 less? And I still haven't mentioned the 2700 or the 2600. Great work again, AMD.



cucker tarlson said:


> Stock 7600K-like performance in games from an overclocked 8c/16t is disappointing to say the least . 2700X has serious workstation performance tho. I just can't explain people putting ryzen in rigs that are mostly for gaming.



Maybe gamers don't buy the Ryzen 2700X but a 2600/2600X for gaming, with better-than-8600K CPU performance? The 2600X and 8600K differ by only 7% with a 1080 Ti, and you get better overall CPU performance with the 2600X for 12% less money. BTW, gamers don't buy the 8700K either, but an 8600K.


----------



## unclesharkey (Apr 19, 2018)

intrepid3d said:


> Interesting results from Anandtech... Ryzen 2700X walks all over the 8700K https://www.anandtech.com/show/12625/amd-second-generation-ryzen-7-2700x-2700-ryzen-5-2600x-2600/17
> 
> And this may be why....
> 
> ...


Did you just come here to troll? Just read the review, and the many other reviews online, and on YouTube, and come to your own conclusions. If you don't like the reviews being done then do your own. But don't be a troll.


----------



## VSG (Apr 19, 2018)

unclesharkey said:


> Did you just come here to troll? Just read the review, and the many other reviews online, and on YouTube, and come to your own conclusions. If you don't like the reviews being done then do your own. But don't be a troll.



I don't see him trolling at all, he's being courteous and trying to understand discrepancies in expectations and numbers from one outlet to another. Whether the reasoning is valid is a different matter, but I have zero problems with him.


----------



## S@LEM! (Apr 19, 2018)

Aside from the gap closing (3% at 1440p, which most will play at), these benchmarks don't represent the real-world scenario where you have 10 apps running in the background, Windows updates all over the place, leftover registry entries, and bloated files. Things will get interesting.

good review!
Reporting the boost clocks while gaming is a great addition; frankly, even I had missed that.


----------



## intrepid3d (Apr 19, 2018)

unclesharkey said:


> Did you just come here to troll? Just read the review, and the many other reviews online, and on YouTube, and come to your own conclusions. If you don't like the reviews being done then do your own. But don't be a troll.



Because I disagree with some of it? Did you actually study anything I said, or do you just want to live in an echo chamber?


----------



## CheapMeat (Apr 19, 2018)

Man, I love it all. It's amazing how well the XFR and similar tech in it does. I can't imagine how much better Zen 2 will be (I decided to wait).


----------



## cadaveca (Apr 19, 2018)

intrepid3d said:


> or do you just want to live in an echo-chamber?



Interestingly enough, asking all sites to do the exact same review is kind of asking for an echo-chamber.


----------



## Vario (Apr 19, 2018)

BasedOndread said:


> did you forget 8700k dont come with cooler you must and must buy water cooling for 8700k to max out performance and what margin ? 10 FPS ? adding freaking cost to gain 10 to 20 FPS ? AIO Cooler 109$ + Z board



No you don't.  It runs just fine on air.


----------



## Space Lynx (Apr 19, 2018)

v12dock said:


> Way more than I expected, I think we can all expect a knee jerk reaction from Intel.



at 1440p (which is what I game at), the stock-clocked 8600K beats it by 10 fps in Far Cry 5 and 9 fps in BF1... and since my 8600K runs at 5.1 GHz 24/7 on air, never breaking 60 Celsius, you can probably add another 5 fps on top of that if not more. If I needed that many threads the 2700X would be a good deal, but since all I do is game, I will stick with my 8600K at 5.1.

@W1zzard why no min fps tests this time around? It's an important bench for smooth gameplay... would love to see how much of a difference there is; it was 10 fps across the board last gen, and that's a big difference for smooth gameplay imo.

@Vario is correct. I never break 60 Celsius on air at 5.1 GHz, no downclocking ever.  

edit: i found a review with min fps... https://www.techspot.com/review/1613-amd-ryzen-2700x-2600x/page3.html  decimation, though it would have been nice to see 1440p numbers.

At 1080p my 5.1 GHz 8600K is utterly decimating the OC'd Ryzen 2700X... 20-30 fps across the board... yikes, not to mention it's $100 cheaper and I re-used my old Noctua cooler from 5 years ago...


----------



## R0H1T (Apr 19, 2018)

cadaveca said:


> Interestingly enough, asking all sites to do the exact same review is kind of asking for an echo-chamber.


Not really. As has been said and discussed over a million times on this forum and many others, ideally these tests should include the Spectre and Meltdown fixes, all of them. Do you think people should hold off updating their OS to the Fall Creators Update, or whatever comes next, because those will have the fixes baked in? You aren't showing the full picture unless you're running the tests with the latest patches, which impact performance and, most importantly, *security*. I can't believe how tech (review) sites don't highlight this aspect of hardware more rigorously! If the performance impact of the patches stays fluid, then at least users will get a better idea of what they can expect from their current hardware or even future purchases.


----------



## efikkan (Apr 19, 2018)

jabbadap said:


> The amd official system memory for Ryzen 7 2700x is 2933MHz, over than that is considered overclocking(xmp profiles). Last gen. Ryzen 7s it was 2666MHz, thus review were with 2667MHz memory.


Sorry, you missed my point. I *know* DDR4-2933 is the spec; I was asking if anyone knew why AMD (and Intel) chose it.
Actually, this review says "DDR4-3200 14-14-14-34".


----------



## trparky (Apr 19, 2018)

lynx29 said:


> at 1440p (which is what I game at) stock clocks 8600k beats it by 10 fps in Far Cry 5


No offense, but can you really tell the difference? Ten frames per second is really no big deal. Now, if we were talking about a 25 to 30 frames per second difference, then we'd definitely have something to talk about. But ten? Really? Other than bragging rights there's really no difference.


----------



## Space Lynx (Apr 19, 2018)

trparky said:


> No offense, but can you really tell the difference? Ten frames per second is really no big deal. Now, if we were talking about a 25 to 30 frames per second difference, then we'd definitely have something to talk about. But ten? Really? Other than bragging rights there's really no difference.



When you add in my 5.1 GHz OC it's probably more like 15. And yes, I can tell the difference between 144 Hz and 165 Hz; I switch back and forth in game all the time to test it. I have also seen a 240 Hz monitor, and while it's hugely diminishing returns after 165 fps, it's still slightly better. 165 is indeed my sweet spot.


----------



## trparky (Apr 19, 2018)

lynx29 said:


> When you add in my 5.1ghz OC its prob more like 15. and yes I can tell the difference between 144hz and 165hz, I switch back and forth in game all the time to test it, I have also seen a 240hz monitor, and while its huge diminishing returns after 165 fps, its still slightly better, but 165 is indeed my sweet spot.


Here I am at 1080p@60Hz. I feel like such a pleb with that comment of yours.


----------



## Space Lynx (Apr 19, 2018)

trparky said:


> Here I am at 1080p@60Hz. I feel like such a pleb with that comment of yours.



In all honesty, 100 Hz is the real sweet spot. Say you're slaying orcs in the Lord of the Rings game: there is no blur at all, you can see the sword move in full, it's glorious and very immersive. Anything above 100 Hz is diminishing returns, but 60 Hz is too blurry for me; a sword in mid-swing is blurry and ruins the immersion. Just imo.


----------



## trparky (Apr 19, 2018)

Oh, I would definitely love to go to 1440p@100Hz, but damned if I can afford it. Not only are monitor prices high, but GPU prices are still stupidly above MSRP. If it weren't for price I would go for it in a heartbeat, but seeing as I'm trying to stay below $1450 USD with this build of mine, I have to cut corners where I can.


----------



## the54thvoid (Apr 19, 2018)

trparky said:


> Here I am at 1080p@60Hz. I feel like such a pleb with that comment of yours.



I wouldn't worry; all major media is still at, what, 24 fps. It's a bit of a phallic contest with gamers right now, chasing ultra-high fps for _marginal_ gains. I'd rather have 4K, given I'm not a pro gamer and would rather have more pixel real estate.


----------



## Space Lynx (Apr 19, 2018)

the54thvoid said:


> I wouldn't worry, all major media is still at, what, 24fps, it's a bit of a phallic contest with gamers right now with ultra high fps with _marginal_ gains.  I'd rather have 4k, given I'm not a pro gamer and would rather have more pixel real estate.



Enjoy your blurry sword swings.  Also, headshots are easier to get in FPS games at 100 Hz, and it's just more immersive. But we all have different tastes; no reason to call those of us who don't enjoy blurry messes "phallic".


----------



## Shatun_Bear (Apr 19, 2018)

lynx29 said:


> at 1440p (which is what I game at), the stock-clocked 8600K beats it by 10 fps in Far Cry 5 and 9 fps in BF1... and since my 8600K runs at 5.1 GHz 24/7 on air, never breaking 60 Celsius, you can probably add another 5 fps on top of that if not more. If I needed that many threads the 2700X would be a good deal, but since all I do is game, I will stick with my 8600K at 5.1.
> 
> At 1080p my 5.1 GHz 8600K is utterly decimating the OC'd Ryzen 2700X... 20-30 fps across the board... yikes, not to mention it's $100 cheaper and I re-used my old Noctua cooler from 5 years ago...



What on earth are you talking about?

1. The 8600K is virtually neck-and-neck with the 2700X at 1440p on average. Throwing in outliers is not going to change that fact. Look at the graphs.
2. A 5.1 GHz OC is not going to add much; in this very review, TPU found that a 5 GHz overclock on the 8700K nets only... 1-3% gaming performance gains over stock.
3. Please tell me you game with a *1080 Ti*, or this is all just face-palm. You do realise that for Intel to gain this small performance advantage (was it 7%?) in gaming, you've got to be using a 1080 Ti while gaming at 1080p? (Bizarrely, you're talking about 1440p, which adds to the silliness.)

Please explain the critical thinking inside your head, by all means...


----------



## Space Lynx (Apr 19, 2018)

Shatun_Bear said:


> What on earth are you talking about?
> 
> 1. 8600K virtually neck-and-neck with the 2700X @ 1440p resolution on average. Throwing in outliers is not going to change that fact. Look at the graphs.
> 2. 5.1Ghz OC is not going to add much as in this very review, TPU found that 5Ghz overclock on a 8700K nets only....1-3% gaming performance gains over stock.
> ...



I found 5 games that have a 10+ fps increase on the same system for a stock 8600K over an OC'd 2700X...
Learn to use the System Specs under people's names; yes, I use a 1080 Ti...
Also, min fps is increased across the board with Intel, as shown in the TechSpot review I linked a few posts up (higher min fps makes for a smoother gaming experience).


----------



## bug (Apr 19, 2018)

intrepid3d said:


> Because i disagree with some of it? did you actually study anything i said or do you just want to live in an echo-chamber?


Friendly advice: tone it down while all your activity here is basically spamming one thread with "your review is wrong because it doesn't show what I want".
Grow a reputation before you expect people to listen to you.


----------



## W1zzard (Apr 19, 2018)

lynx29 said:


> why no min fps tests this time around?


it seemed nobody cared, so I just dropped them. I couldn't find much to read out of them anyway, tbh


----------



## Shatun_Bear (Apr 19, 2018)

lynx29 said:


> I found 5 games that have 10+ fps increase in same system over 2700x OC'd and 8600k at stock...
> Learn to use System Specs under peoples names, yes I use a 1080 Ti...
> Also, min fps is increased across the board with Intel, as shown in the techspot review I linked a few posts up, (min fps creates a smoother gaming experience).



I don't care how many cherry-picked games you found, the two processors are virtually neck-and-neck at 1440p on average, across 32 games. I will say again, look at the graphs.

You sound like someone insecure about his CPU choice who is trying to justify it against something that is far more powerful and ought to be much more expensive. And if you bring your 8600K into the price equation, *don't try and pretend the 2600 or 2600X doesn't exist* . The 2600X is just 1.9% slower than the 2700X (which has parity with your 8600K) at the res you game at, while only costing $230! That blows the 8600K out of the water in its price-to-performance ratio. Or are you just choosing to ignore that inconvenient fact?

Also, why would you pair a 1080 Ti with a bargain-price 6-thread CPU?


----------



## unclesharkey (Apr 19, 2018)

VSG said:


> I don't see him trolling at all, he's being courteous and trying to understand discrepancies in expectations and numbers from one outlet to another. Whether the reasoning is valid is a different matter, but I have zero problems with him.





intrepid3d said:


> Because i disagree with some of it? did you actually study anything i said or do you just want to live in an echo-chamber?



I understand what you are saying, but you seem to be rather harsh regarding this specific review. A lot of the information presented in these types of reviews can be rather cryptic. I also think some of these graphs can be misleading, because one low score can skew an entire set of numbers. I mean, if I were doing research, I don't know if I could trust half of these results. That is why I prefer some of the sites on YouTube where you can see them actually playing the game for 15-30 minutes at a time; you see when the fps go up and down, etc. I also think people make too much out of these reviews and benchmarks. Benchmarks aside, most things in life are subjective, and I believe in the good-enough philosophy. Hell, I run a stock FX-8320 with a 1060 6 GB and I have no problems playing all the games I like. If you look at the benchmarks my CPU is crap, garbage, a portable heater. That may be true, but it is good enough for me, and cost-wise I have saved a lot of money. 

My opinion is that Ryzen is a great CPU when compared to Intel, in a lot of cases superior: faster than Intel in most threaded applications and very little difference in games. But everyone likes to point out the game benchmarks where Intel is faster... well, in some specific situations. The truth is, unless you are gaming with a 1080 there is very little difference between the Ryzen and Intel chips.
Ryzen is for the most part faster where it counts and is more cost friendly.


----------



## bug (Apr 19, 2018)

unclesharkey said:


> I understand what you are saying, but you seem to be rather harsh regarding this specific review. A lot of the information presented in these types of reviews can be rather cryptic. I also think some of these graphs can be misleading, because one low score can skew an entire set of numbers. I mean, if I were doing research, I don't know if I could trust half of these results. That is why I prefer some of the sites on YouTube where you can see them actually playing the game for 15-30 minutes at a time; you see when the fps go up and down, etc. I also think people make too much out of these reviews and benchmarks. Benchmarks aside, most things in life are subjective, and I believe in the good-enough philosophy. Hell, I run a stock FX-8320 with a 1060 6 GB and I have no problems playing all the games I like. If you look at the benchmarks my CPU is crap, garbage, a portable heater. That may be true, but it is good enough for me, and cost-wise I have saved a lot of money.
> 
> My opinion is that Ryzen is a great CPU when compared to Intel, in a lot of cases superior: faster than Intel in most threaded applications and very little difference in games. But everyone likes to point out the game benchmarks where Intel is faster... well, in some specific situations. The truth is, unless you are gaming with a 1080 there is very little difference between the Ryzen and Intel chips.
> Ryzen is for the most part faster where it counts and is more cost friendly.


I'm totally with you.
I mean, people forget all too easily that many times the difference between the fastest and the slowest CPU in a benchmark is really the difference between an OK CPU and a more-than-OK CPU. They'll all get the job done, after all. Video card, motherboard, and SSD reviews all tell similar stories.
That said, games are not the only weakness for Ryzen; it also loses in a number of productivity benchmarks. Basically, if what you do doesn't saturate _a lot_ of cores most of the time, then you're probably better off with Intel. But that's for everyone to judge; there can be (and usually are) more factors involved.
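The "saturate a lot of cores" point can be made concrete with Amdahl's law. A minimal sketch (my own illustration, not from the review or this post):

```python
def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup on `cores` cores when only
    `parallel_fraction` of the work can actually run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A mostly-serial workload barely benefits from 8 cores...
print(speedup(0.5, 8))   # ~1.78x
# ...while a highly parallel one (e.g. rendering) scales much better.
print(speedup(0.95, 8))  # ~5.93x
```

Which is roughly why per-core speed still wins for lightly threaded work, while renderers and encoders reward the extra cores.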


----------



## W1zzard (Apr 19, 2018)

Charts have been remade and now include both base clock and max turbo


----------



## Space Lynx (Apr 19, 2018)

bug said:


> I'm totally with you.
> I mean, people forget all too easily that many times the difference between the fastest and the slowest CPU in a benchmark is really the difference between an ok CPU and a more than ok CPU. They'll all get the job done after all. Video card, motherboard, SSD reviews, they all tell similar stories.
> That said, games are not the only weakness for Ryzen. Ryzen also loses in a number of productivity benchmarks. Basically if what you do doesn't saturate _a lot_ of cores most of the time, then you're probably better of with Intel. But that's for everyone to judge, there can (and usually are) more factors involved.



Yep, that is what I originally said. Since all I do is game, it makes sense for me to go Intel. Now, if I gamed and streamed... well, I might do the 2700X. But I don't; I am a private person and would never stream, so Intel it is.

Also, the 8600K at 5 GHz beats the 8700K at 5 GHz in Assassin's Creed Origins at 1440p, according to the GamersNexus review, and min fps is again 12 fps better than Ryzen OC'd at 4.3... so yeah, Intel for the pure gaming experience.



W1zzard said:


> Charts have been remade and now include both base clock and max turbo



you should have added a 5.1 GHz 8600K to fight the OC'd 2700X


----------



## HTC (Apr 19, 2018)

lynx29 said:


> you should have added in a 5.1 ghz 8600k to fight the oc'd 2700x



1st: just because some manage to get the 8600K to 5.1 GHz doesn't mean everyone does, so a more "conservative" number would be required: 4.9? I'm sure most would reach that number.

That said, if the object of this review were an 8600K, sure: I'd totally agree. Since it's not, I don't see the need, really.


----------



## Space Lynx (Apr 19, 2018)

HTC said:


> 1st: just because some manage to get the 8600k to 5.1 GHz doesn't mean everyone does, so a more "conservative" number would be required: 4.9? I'm sure most would reach that number.
> 
> That said, it the object of this review were an 8600k, sure: i'd totally agree. Since it's not, don't see the need, really.



https://www.tweaktown.com/reviews/8602/amd-ryzen-7-2700x-5-2600x-review/index9.html

another review showing the stock 8700K beating the stock Ryzen 2700X at 1440p by 20 fps in multiple games.

also, GamersNexus shows multiple OCs in their review. It's fine that TPU doesn't, as I have other sources that do, so meh. Also, like 90% of the chips hit 5 GHz.

You wanted more games tested? Sure thing, here you go: the 8700K is 42-100% faster in many games; in the cherry-picked latest games it's only 10 fps or 10% faster, sure...

I have 20 more games as well... you get the picture by now I think...


----------



## Vya Domus (Apr 19, 2018)

I am seeing some pretty considerable discrepancies between benchmarks of the same games on several other sites. Don't know what to make of any of them.


----------



## Space Lynx (Apr 19, 2018)

Vya Domus said:


> I am seeing some pretty considerable discrepancies between benchmarks of the same games on several other sites. Don't know what to make of any of them.



Same. Intel leads in all of them for pure gaming min, max, and avg FPS, though. So if you're an "I only game on PC" gamer, there you go: buy Intel 8th gen. If you're a gamer who streams, the 2700X is the winner. Simple as that.


----------



## Shatun_Bear (Apr 20, 2018)

lynx29 said:


> Same. Intel leads in all of them for pure gaming min, max, and avg FPS though. So if you are a 'I only game on PC' gamer, there you go, buy Intel 8th gen. If you are a gamer who streams, 2700x is the winner. Simple as that.



Here you go again.

It's not as simple as that. If you only game on PC, it only makes sense to buy an 8700K if:

1. You game at 1080p using a 1080 Ti.

Seeing as the gaming performance differences are small even in this extremely narrow scenario, imo it makes sense in nearly all other circumstances to choose a 2700X: much more multicore performance, a bundled cooler, cheaper motherboards, and AM4 will last you another gen on top of this one. No brainer.


----------



## FYFI13 (Apr 20, 2018)

intrepid3d said:


> when your gaming performance depends on the CPU, not the GPU, the Ryzen 1600 is roughly around 100% faster than the 7600K.


 This is what happens to kids that eat Tide pods.


----------



## evernessince (Apr 20, 2018)

Was the spectre/meltdown patch applied to the Intel system in this review?


----------



## intrepid3d (Apr 20, 2018)

I asked that and didn't get an answer ^^^
----------------------------

Meh... to all these random slides. Anyone can make a slide and write anything they like on it; it's why slide reviews are a dying breed. Some of them are so extreme they are obvious fakes.

In videos you can actually see what's going on, and they don't lie.

Other than 10 FPS in Arma III, there is nothing between the 8700K and the Ryzen 2700X.


----------



## xorbe (Apr 20, 2018)

lynx29 said:


>



What's going on here? The 8700K minimum is 101 and the AMD average is 123, so Intel's min is not greater than AMD's average. And all the charts say 1700X?


----------



## intrepid3d (Apr 20, 2018)

xorbe said:


> What's going on here?  8700k minimum is 101, and the AMD average is 123, so Intel's min is not greater than AMD's average. And all the  charts say 1700X?



It's Tweaktown; does anything more need to be said? A known Intel shill site.


----------



## magickai (Apr 20, 2018)

intrepid3d said:


> I asked that, didn't get an answer ^^^
> ----------------------------
> 
> Meh... to all these random slides. Anyone can make a slide and write anything they like on it, its why slide reviews are a dying breed, some of them are so extreme they are an obvious fake.
> ...


You talk about lies, but that's what this video is about: an OC'd 2700X vs a stock 8700K.


----------



## Melvis (Apr 20, 2018)

jabbadap said:


> What are the all core clocks by the way, can't seem to find that information anywhere.



Just over 4GHz which is lovely!


----------



## Nima (Apr 20, 2018)

intrepid3d said:


> Its Tweaktown, anything more need to be said? A known Intel shill site.


OMG, you are so obsessed with AMD that you are losing your mind. Listen, kid, these companies don't care about you. They are like monsters; they'd eat your flesh and spit out your bones if they could. Don't waste your mind and soul on them.


----------



## W1zzard (Apr 20, 2018)

evernessince said:


> Was the spectre/meltdown patch applied to the Intel system in this review?


yup


----------



## sergionography (Apr 20, 2018)

Thanks for the excellent review. 
One point I wasn't sure about and couldn't find mentioned in the review: were the Intel systems tested with the latest Spectre/Meltdown variant 2 patches? To my knowledge, all the X470 motherboards are patched on release.


----------



## geon2k2 (Apr 20, 2018)

Hmm, there is something wrong with the graphs; somehow you just like to show Ryzen at the bottom even when it is the clear winner and should be at the top.

If lower is better, why won't you sort the entries in the opposite order, so that the lowest (better) result would be at the top?
See the wprime graph, for example.


----------



## W1zzard (Apr 20, 2018)

sergionography said:


> whether the intel systems were tested with the latest spectre/meltdown variant 2 patches.


see two posts further up, or the end of the intro



geon2k2 said:


> at the bottom


we have always sorted "best" to the bottom, starting at the top with the worst scores



geon2k2 said:


> opposite order


charts where higher is better are sorted "opposite", i.e. still: best result at the bottom
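For anyone still puzzled, the ordering rule described above can be sketched like this (a hypothetical illustration of the rule, not TPU's actual charting code):

```python
def sort_for_chart(results, higher_is_better):
    """Order (label, score) pairs top-to-bottom so the best score
    always lands last, i.e. at the bottom of the chart."""
    # higher-is-better charts sort ascending (highest = best = last);
    # lower-is-better charts sort descending (lowest = best = last).
    return sorted(results, key=lambda kv: kv[1], reverse=not higher_is_better)

fps = [("2700X", 123), ("8700K", 131)]        # higher is better
wprime = [("2700X", 98.0), ("8700K", 112.0)]  # lower is better (seconds)
print(sort_for_chart(fps, True))      # best (131) ends up at the bottom
print(sort_for_chart(wprime, False))  # best (98.0) ends up at the bottom
```

(The scores above are made up for illustration, not taken from the review.)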


----------



## [XC] Oj101 (Apr 20, 2018)

W1zzard said:


> As explained in the overclocking section and conclusion, the CPU will boost higher than our manual OC when few cores are loaded.



Try using the BCLK to overclock instead of the multiplier


----------



## Totally (Apr 20, 2018)

intrepid3d said:


> I asked that, didn't get an answer ^^^
> ----------------------------
> 
> Meh... to all these random slides. Anyone can make a slide and write anything they like on it, its why slide reviews are a dying breed, some of them are so extreme they are an obvious fake.
> ...



FYI, the Ryzen gameplay was a lot smoother than the Intel, where the frame rate kept dropping or jumping between two extremes. I'll take smoothness over that kind of distraction.


----------



## Space Lynx (Apr 20, 2018)

Totally said:


> FYI, the Ryzen gameplay was a lot smoother than the Intel, where the frame rate kept dropping or jumping between two extremes. I'll take smoothness over that kind of distraction.



I hope you don't play older games, Ryzen can't even play some of them.


----------



## W1zzard (Apr 20, 2018)

[XC] Oj101 said:


> Try using the BCLK to overclock instead of the multiplier


Interesting idea, but it won't get you any significant increases unless you have a special motherboard that fixes the SATA and PCIe clocks to 100 MHz
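To illustrate the constraint (a hypothetical sketch, not the actual test setup): on most boards the 100 MHz reference clock (BCLK) feeds the CPU, PCIe, and SATA domains alike, so raising it overclocks the buses too:

```python
def derived_clocks(bclk_mhz: float, multiplier: int) -> dict:
    """Clocks derived from the shared reference clock on a board
    WITHOUT a separated clock generator (the common case)."""
    return {
        "cpu_mhz": bclk_mhz * multiplier,  # core clock scales with BCLK
        "pcie_mhz": bclk_mhz,              # PCIe runs off the same reference
        "sata_mhz": bclk_mhz,              # so does SATA
    }

# Even a modest 4% BCLK bump (100 -> 104 MHz) pushes PCIe/SATA 4% out
# of spec, which quickly causes drive/GPU instability.
print(derived_clocks(104.0, 43))
```

A board that fixes PCIe/SATA at 100 MHz decouples the last two values, which is why only such boards allow meaningful BCLK overclocking.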


----------



## medi01 (Apr 20, 2018)

Sorry, I just quickly skimmed through the article, had latest *Intel's security patches against Meltdown/Spectre* been used in this benchmark?


----------



## Space Lynx (Apr 20, 2018)

medi01 said:


> Sorry, I just quickly skimmed through the article, had latest *Intel's security patches against Meltdown/Spectre* been used in this benchmark?



Yes, it says so on the first page...

lol, W1zzard is going to quit posting reviews after this launch, his poor soul


----------



## Readlight (Apr 20, 2018)

Isn't that StoreMI something like RAPID mode? It steals RAM and won't do anything.


----------



## intrepid3d (Apr 20, 2018)

magickai said:


> you talk about lies, but thats what this video about, OC 2700x vs stock 8700K



Actually, both are overclocked: 4.4 GHz vs. 4.2 GHz.

Most reviewers like to use custom loops or at least high-end 240 mm AIOs to cool overclocked 8700Ks; they don't overclock at all without those. Not everyone can afford those high-end coolers; maybe the reviewer in this case couldn't.

And if so, I don't see anything wrong with the review. It doesn't always have to be shown in its best light with those very expensive cooling solutions. Ryzen doesn't need them because its heat spreader is soldered and the CPU is power efficient. Intel doesn't always need to be given a free ride. Make your CPUs better, Intel.


----------



## medi01 (Apr 20, 2018)

W1zzard said:


> yup


Any noticeable impact in games?




lynx29 said:


> Ryzen can't even play some of them.


Are you paid for spreading FUD, or is it sorta hobby?


----------



## W1zzard (Apr 20, 2018)

lynx29 said:


> Yes, says it on first page...


Added the same text to the conclusion page, in bold.



medi01 said:


> Any noticeable impact in games?


No idea. I didn't test before and after with just the patches. A lot has changed between now and earlier CPU tests, so you can't just compare scores.


----------



## Melvis (Apr 20, 2018)

lynx29 said:


> I hope you don't play older games, Ryzen can't even play some of them.



Like which ones?


----------



## ikeke (Apr 20, 2018)

I quite like mine. Upgraded from a launch-day R7 1700, which did 3.8 GHz all-core max.

http://valid.x86.fr/za2lwf


----------



## jabbadap (Apr 20, 2018)

Melvis said:


> Just over 4GHz which is lovely!



Well yeah, he reports 4.05 GHz all-core in OCCT and Blender. AMD did give a table of threads vs. clock.


Spoiler: R7 2700X average clock speed per thread

There are striking similarities to CanardPC's printed review in those curves (CanardPC tested with SMT off). Differences in core clocks can be explained by the motherboard used (CanardPC used an A320 board, which supports neither Enhanced XFR 2 nor Precision Boost 2).


----------



## medi01 (Apr 20, 2018)

computerbase.de is also done with their review; they focus on memory latencies (which seem to have a major impact on gaming) and find the 2700X beating the 1800X by about 12% (2-3 times more than TechPowerUp).
https://www.computerbase.de/2018-04/amd-ryzen-2000-test/8/


----------



## laszlo (Apr 20, 2018)

Must say, good job AMD!

It's a win-win for all of us no matter our preferences, so let the price war begin!


----------



## I No (Apr 20, 2018)

Good for AMD. I'm really impressed more with the pressure it puts on Intel to get their heads out of the gutter. Shame the same can't be said on the GPU side of things...


----------



## intrepid3d (Apr 20, 2018)

I No said:


> Good for AMD. I'm really impressed more with the pressure it puts on Intel to get their heads out of the gutter. Shame the same thing can't be said in the GPU side of things...



According to this review, the 2700X is slower than an i3, which on the face of it makes it overpriced and slow; it should be priced below the i3.

Other reviews show a vastly better picture than this one, but the question is: who do you believe? If you believe this one, the 2700X is useless for gaming.

As always with AMD, how good or bad it is depends on which review site you go to. With that in mind, if you use your PC for games, why would you buy AMD at all?


----------



## I No (Apr 20, 2018)

intrepid3d said:


> According to this review, the 2700X is slower than an i3, which on the face of this review makes them over priced and slow.
> 
> Other reviews show a vastly better picture than this one but the question is who do you believe? if you believe this one the 2700X is useless for gaming.



Doesn't really matter in the long run. It gives people options, and that is the big picture. It might take them two additional cores to keep up with Intel's offerings, but they do it at a lower price point. The market will never be 50-50, nor close to it for that matter, but it might be a good smack for the competition, which would drive innovation, and for me that matters more than a few percent of FPS in a game. Be that as it may, I don't see many current Ryzen owners, or anyone on Skylake and up, upgrading to this platform, but it does give people running older-gen rigs options as opposed to a forced upgrade pattern dictated by one manufacturer.


----------



## bug (Apr 20, 2018)

intrepid3d said:


> According to this review, the 2700X is slower than an i3, which on the face of this review makes them over priced and slow. it should be priced at below the i3.
> 
> Other reviews show a vastly better picture than this one but the question is who do you believe? if you believe this one the 2700X is useless for gaming.
> 
> As always with AMD how good or bad it is depends on which review site you go to, with that in mind if you use you PC for games why would you buy AMD at all?


At this point I can't tell whether you're just being stubborn or you just can't read.
How on Earth does this review paint the 2700X as "useless for gaming" when it shows that even in the most CPU-limited scenario (gaming @720p), the 8700K is just 16% faster? There's a big graph showing exactly this in the review.


----------



## intrepid3d (Apr 20, 2018)

Well, I'm struggling to understand what is and isn't real.

Of course it matters. Forget about the 8700K: if the 2700X can't even compete with the i3, it's junk for gaming. Think about it, what place does it have if it cannot keep up with an i3?

Can you answer that?


----------



## I No (Apr 20, 2018)

intrepid3d said:


> Well i'm struggling to understand what is and isn't real.
> 
> Of course it matters, forget about the 8700K, if the 2700X can't even compete with the i3 its junk for gaming, think about it, what place does it have if it cannot keep up with an i3?
> 
> Can you answer that?



It actually depends. If you base it on benchmarks alone, it paints a picture far from what you might actually end up using it for. Let's say you're getting a new rig and your aim is gaming. Further down the road you might want to stream or do a bit of video encoding and whatnot. Would you take four cores for the FPS, or go for the extra cores and threads that help in things other than gaming? And while we're on that note, no one uses their PC only for gaming.
Truth be told, the reason I went with an 8700K instead of a Ryzen was the platform instability, which was a deal breaker for me at the time. One thing has always stuck to AMD: they try to be the jack of all trades. The only difference is that this time they put a dent in the market. Ryzen may be many things, but Bulldozer it is not. I wouldn't have any issue recommending AMD vs. Intel these days.


----------



## intrepid3d (Apr 20, 2018)

I No said:


> It actually depends. For example if you only base it on benchmarks alone it would paint a picture far from what you might end up using it for. Let's say you're getting a new rig and your aim is gaming. Further down the road you might want to stream or do a little bit of video encoding and whatnot. Would you take 4 cores for the FPS or you'll go for the extra cores and threads that would help out in something other than gaming? And since we're on that note no one uses their PCs for only gaming.
> Truth be told the reason why I went to a 8700k instead of a Ryzen was the platform instability that was a deal breaker for me at that time. One thing sticks to AMD and always has been they are trying to be the jack of all trades the only difference is this time they put a dent in market. Ryzen maybe many things but Bulldozer it is not. I wouldn't have any issues with recommending AMD vs Intel in this day and time.



It's not actually Bulldozer, but I'm sorry, this looks to me like the Bulldozer of 2018. Bulldozer was also good for gaming and streaming at the same time; it was junk for gaming alone, so people had to find merits to apply to it. "It's good for streaming your games", yes, but a low-end Intel CPU is better for gaming, full stop.

Look at this slide: the gaming performance is abysmal. Again, why would anyone spend more than $100 on anything this bad?


----------



## bug (Apr 20, 2018)

intrepid3d said:


> Well i'm struggling to understand what is and isn't real.
> 
> Of course it matters, forget about the 8700K, if the 2700X can't even compete with the i3 its junk for gaming, think about it, what place does it have if it cannot keep up with an i3?
> 
> Can you answer that?


The same graph shows the i3 giving 97.1% of the performance of a 2700X in gaming @720p, so you may want to explain what you mean by "the 2700X can't even compete with the i3".


----------



## I No (Apr 20, 2018)

intrepid3d said:


> Its not actually Bulldozer, but i'm sorry this looks to me like Bulldozer of 2018. Bulldozer was also good for gaming and streaming at the same time, it was junk just for gaming so people had to find merrits to apply to it, "its good for streaming your games" yes but a low end Intel CPU is better for gaming full stop.
> 
> Look at this slide the gaming performance is abysmal, again why would any one spend more than $100 on anything this bad?
> 
> View attachment 100018


It's 16% out of the box, at a lower price point for gaming, and it's better for productivity. I don't see anything wrong with that. Furthermore, no one keeps a CPU until it becomes the bottleneck; *the GPU will bottleneck the system far sooner than the CPU will*. If you put the whole thing into day-to-day context, you won't notice a big difference between the platforms. No one games at 720p with a shiny new processor, let's be honest. The only thing the 720p test shows is how CPUs stack up against each other in a CPU-limited scenario, and I for one have never felt that scenario actually happening, and no gamer would. If you up the resolution, the GPU becomes the elephant in the room, so yeah....
And oh, the only thing Bulldozer was good for was as a coaster and for heating up your room.


----------



## intrepid3d (Apr 20, 2018)

The Ryzen CPU is already a massive bottleneck; look at the slide. ^^^^^



bug said:


> The same graph shows the i3 giving 97.1% of the performance of a 2700X in gaming @720, so you may want to explain what do you mean by "the 2700X can't even compete with the i3".



Ironically, the 2700X is slower when overclocked; that's why. The i3 overclocks too, but it actually gets faster when you overclock it.

Look, it's an i3, a cheap and cheerful CPU, better than AMD's finest. Clearly it's the Bulldozer of 2018.


----------



## I No (Apr 20, 2018)

intrepid3d said:


> The Ryzen CPU is already a massive bottleneck, look at the slide. ^^^^^
> 
> 
> 
> ...




Dude, you're looking at the picture from a GAMING-at-720p point of view. Widen the frame: people use these things for a lot more than getting high FPS in CS:GO. Put yourself in a content creator's shoes and look at this from their POV. No one games at 720p in 2018 with a brand-new CPU; check the standard resolutions and draw your conclusions from there.


----------



## bug (Apr 20, 2018)

@W1zzard Consider this my reporting on @intrepid3d for spamming/rambling.


----------



## intrepid3d (Apr 20, 2018)

I No said:


> Dude you're looking at the picture from GAMING at 720p point of view. Widen the frame, people use these things for a lot more things that getting high FPS in CS:GO. Put yourself in a content creator's shoes and look at this from his POV. No one games on 720p in 2018 with a brand new CPU check the standard resolutions and draw your conclusions from there.



Why bother testing for it, then?

He also tested 1080p, and it has the same outcome, with what was it, about 20 games? In most of them, even at 1080p, the performance is below the i3.

https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/13.html

Again, why would anyone buy something like that to play games on?


----------



## I No (Apr 20, 2018)

intrepid3d said:


> Why bother testing for it?
> 
> He also tested 1080P and it has the same outcome and with what was it? about 20 games? most of them even at 1080P the performance is below the i3.
> 
> ...




https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/14.html
https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/13.html

Do these results make the games tested unplayable? We're talking about less than a 2% difference in 1080p gaming, and a whole 7% less than an 8700K, at a much cheaper price point. The only way this would mean something is if that 2% made the difference between playable and unplayable, which clearly isn't the case. Seriously, this discussion is pointless if you think something that's priced lower and offers performance within single digits of the competition in games is wrong. I rest my case.


----------



## cucker tarlson (Apr 20, 2018)

intrepid3d said:


> Why bother testing for it?
> 
> He also tested 1080P and it has the same outcome and with what was it? about 20 games? most of them even at 1080P the performance is below the i3.
> 
> ...


I don't know whether you like or dislike the 2700X; can you give us, like, a two-sentence summary?



I No said:


> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/14.html
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/13.html
> 
> Are these results make the games tested unplayable? we're talking about less that 2% difference in 1080p gaming.....  and a whole 7% less than a 8700k for a much cheaper price point. The only way this would mean something is that if the 2% would make the difference between playable and unplayable which clearly isn't the case........ Seriously this discussion is pointless if you think something that's priced lower and offers performance within single digits of the competition in games is wrong .... I rest my case



It's 7% in this review. If you look at more CPU-intensive scenarios, the gap grows substantially.

https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,34
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,35
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,36
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,37
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,38
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,39
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,40
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,41
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,42
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,43

This can be interpreted as the worst possible scenario; whether it is indicative of how a game will run depends on the particular game. I've played most of them, and I can tell you this: in some of them you'll see this difference quite often; in others it will definitely not be indicative of how the game runs overall. If you're a performance enthusiast, you'll likely be disappointed by how Ryzen handles some CPU-intensive scenarios.

Also, OCCT power draw:

https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,44

Their 1800X system needed 280 W for a 4.1 GHz OC, so that's a lot more power for a 150 MHz overclock, mostly due to voltage; it needs a crazy 1.475 V to stay stable at 4250 MHz on all cores.
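The voltage is the dominant cost there: as a rule of thumb, dynamic CPU power scales roughly with f·V². A quick sketch of that scaling (the 1.35 V stock voltage below is an illustrative assumption, not a figure from the post):

```python
def relative_power(f_base, v_base, f_oc, v_oc):
    """Rule-of-thumb dynamic power ratio: P is proportional to f * V^2."""
    return (f_oc * v_oc ** 2) / (f_base * v_base ** 2)

# Illustrative: 4.0 GHz @ 1.35 V stock vs. 4.25 GHz @ 1.475 V overclocked
ratio = relative_power(4.0, 1.35, 4.25, 1.475)
print(round(ratio, 2))  # ~1.27: roughly 27% more power for ~6% more clock
```

So a small clock bump bought with a big voltage bump gets expensive fast, which matches the 280 W figure above.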


----------



## intrepid3d (Apr 20, 2018)

I don't know; when mulling over a gaming CPU, all I can see from these slides is that the i3 at half the price is better. ^^^^



I No said:


> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/14.html
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/13.html
> 
> Are these results make the games tested unplayable? we're talking about less that 2% difference in 1080p gaming.....  and a whole 7% less than a 8700k for a much cheaper price point. The only way this would mean something is that if the 2% would make the difference between playable and unplayable which clearly isn't the case........ Seriously this discussion is pointless if you think something that's priced lower and offers performance within single digits of the competition in games is wrong .... I rest my case



According to these slides, all the Intel CPUs have, within margin of error, the same performance. What that tells me is they can't all really be the same, so they must be bottlenecked by the GPU; given that only the Ryzen CPUs are actually slower, it can only be that they are the bottleneck.


----------



## mroofie (Apr 20, 2018)

cucker tarlson said:


> AMD - we make your intel CPU a better buy for the money !



Savage


----------



## FeelinFroggy (Apr 20, 2018)

intrepid3d said:


> I don't know, when mulling over a gaming CPU all i can see from these slides is the i3 at half the price is better. ^^^^



I have read all your comments in this thread, and it reminds me of the scene in the movie Swingers where the guy leaves about 20 messages on the girl's phone. It's still painful to watch.

A lot of games scale very well with fast single-threaded CPU performance. This is something the i3-8350K and other Intel chips are very good at: they have a (small) lead over AMD in single-thread IPC, and they have higher clock speeds. Thus, they will perform better in applications (games) that like single-threaded performance. In those situations, the i3 will perform better than the 2700X, 2600X, 1950X, and so on. This is why the i3 (particularly the 8100) is a fantastic budget CPU.

Not all games scale with faster single cores. BF1, as you showed earlier, is a very CPU-heavy title, one of the hardest games on a CPU. The 1600 is going to do a lot better than a 4-thread CPU in that game, as a 6600K is the minimum CPU requirement for it. There is no one CPU that is great at everything, but the 2700X is good at everything.

You are also comparing the overclocked 2700X to the stock 2700X. When you overclock a Ryzen CPU (and an Intel CPU, for that matter), you disable turbo. The way these turbos work is that they scale with active cores. All-core turbo on an 8700K is 4.3 GHz, but at stock the 8700K will run single-threaded processes at 4.7 GHz. AMD has a similar feature with its single-core boost (I think it is 4.35 GHz, but it has only been out for a day, so someone will correct me if I am wrong). So when you overclock the 2700X to 4.2 GHz, turbo and XFR are disabled and all cores run at 4.2 GHz. When a game that scales well with single-threaded performance (see the paragraph above) is played on an overclocked 2700X, the single-core speed is only 4.2 GHz instead of the 4.35 GHz of a stock 2700X. So when you overclock the 2700X, you hurt its single-core performance, which in turn hurts performance in a lot of games.
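A toy model of that trade-off (the boost steps below are illustrative assumptions for the sake of the example, not AMD's actual Precision Boost 2 curve):

```python
# Illustrative boost table: max clock (GHz) by number of active cores.
STOCK_BOOST = {1: 4.35, 2: 4.30, 4: 4.20, 8: 4.05}
MANUAL_OC = 4.2  # GHz, fixed all-core overclock; disables turbo/XFR

def effective_clock(active_cores, overclocked):
    """Clock (GHz) the active cores run at under each configuration."""
    if overclocked:
        return MANUAL_OC
    for cores in sorted(STOCK_BOOST):  # smallest bucket that fits
        if active_cores <= cores:
            return STOCK_BOOST[cores]
    return STOCK_BOOST[max(STOCK_BOOST)]

# Lightly threaded game: stock boost wins (4.35 vs. 4.2 GHz)
print(effective_clock(1, overclocked=False), effective_clock(1, overclocked=True))
# Fully threaded workload: the manual OC wins (4.2 vs. 4.05 GHz)
print(effective_clock(8, overclocked=True), effective_clock(8, overclocked=False))
```

Which is exactly why a manual all-core OC can lose in games while winning in rendering.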

The best part about Ryzen+ is also the worst part. AMD has, in a lot of ways, eliminated the need for overclocking Ryzen CPUs in gaming with its boost technology. You don't need an expensive "X" motherboard or cooler; you just need a motherboard and cooler that can handle turbo and XFR. The worst part is that it does not look like Ryzen+ can overclock past its single-core turbo; they have squeezed all the juice out of the orange. This was an issue with the first generation too, as Ryzen seems to hit a hard wall at 4.2-4.3 GHz. Maybe Zen 2 will fix this, and I am sure AMD is trying.

I am impressed with what AMD has done, as they really have closed the gap in gaming, and this review shows that. It also shows that single-threaded CPU performance is still very important in today's games, and that AMD has closed the gap considerably from where they were two years ago.


----------



## intrepid3d (Apr 20, 2018)

FeelinFroggy said:


> I have read all your comments on this thread and it reminds me of the scene in the movie Swingers.  Where the guy leaves about 20 messages on the girls phone.  Its still painful to watch.
> 
> A lot of games, scale very well with fast single threaded CPU performance.  This is something the i3 8350k and other Intel chips are very good at.  They have a lead over AMD (a small lead) in IPC for single threads and they have higher clock speeds.  Thus, they will perform better in applications (games) that like single threaded performance.   In these situations, the i3 will perform better than the 2700x, 2600x, 1950x, and so on.  This is why the i3 (particularly the 8100) is a fantastic budget CPU.
> 
> ...



So why does not a single one of the reviews show this? Using your example, BF1, the i3 is 15% out in front of the 2700X. If that's a case where the 2700X is supposed to come good in CPU-heavy games, it's still really bad, far worse than the i3.

https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/12.html

lol, the same guy keeps downvoting me. Don't lurk, laszlo, say what's on your mind.


----------



## trparky (Apr 20, 2018)

I No said:


> Seriously this discussion is pointless if you think something that's priced lower and offers performance within single digits of the competition in games is wrong .... I rest my case


Not in my area; I'm seeing prices for AMD Ryzen and the Intel 8600K nearly neck-and-neck. Then again, Microcenter is known to run some really good deals on motherboard and processor bundles, where they take a loss on the processor because they're betting on you buying other things with it (case, PSU, RAM, etc.).

Microcenter AMD Ryzen 5 2600X and ASUS ROG Strix X470-F Processor and Motherboard Bundle -- $394.98
Microcenter Intel Core i5-8600K and ASUS ROG Strix Z370-E Processor and Motherboard Bundle -- $399.98

They're practically trading blows here.


----------



## Space Lynx (Apr 20, 2018)

The thing that no one seems to be looking at here is that minimum frame rates are still 10-15% better across the board with Intel, which means smoother gameplay when more enemies get on screen, etc. This is according to GamersNexus' YouTube review.



trparky said:


> Not in my area, I'm seeing prices for both AMD Ryzen and the Intel 8600K to be nearly neck-and-neck. Then again. Microcenter is known to run some really good deals on motherboard and processor bundles where they take a loss on the processor because they're betting on you wanting to buy other things with it (case, PSU, RAM, etc.).
> 
> Microcenter AMD Ryzen 5 2600X and ASUS ROG Strix X470-F Processor and Motherboard Bundle -- $394.98
> Microcenter Intel Core i5-8600K and ASUS ROG Strix Z370-E Processor and Motherboard Bundle -- $399.98
> ...




That is a bad price example... I got my Z370 Tomahawk for $99 and my 8600K for $221 just two weeks ago in a sale.


----------



## trparky (Apr 20, 2018)

lynx29 said:


> that is a bad price example... I got my Z370 tomahawk for $99, and 8600k for $221 just two weeks ago on a sale.


I'm just saying that the idea that AMD is somehow drastically cheaper than Intel is patently false; the prices I quoted (even if they aren't as good as the prices you got) prove that.


----------



## bug (Apr 20, 2018)

trparky said:


> Not in my area, I'm seeing prices for both AMD Ryzen and the Intel 8600K to be nearly neck-and-neck. Then again. Microcenter is known to run some really good deals on motherboard and processor bundles where they take a loss on the processor because they're betting on you wanting to buy other things with it (case, PSU, RAM, etc.).
> 
> Microcenter AMD Ryzen 5 2600X and ASUS ROG Strix X470-F Processor and Motherboard Bundle -- $394.98
> Microcenter Intel Core i5-8600K and ASUS ROG Strix Z370-E Processor and Motherboard Bundle -- $399.98
> ...


Intel doesn't come with a heatsink, so that's another $20-30 in favour of AMD


----------



## cucker tarlson (Apr 20, 2018)

trparky said:


> I'm just saying that the idea that AMD is somehow drastically cheaper than Intel is patently false, the prices I quoted (even though they may not be as good as the prices you got) prove that.


People fall for the illusion of AMD being cheap; that is what their marketing team is trying to achieve, and it is doing so successfully. Of course AMD has an in-box cooler and can be OC'd on B350 boards. But what OC can you achieve with the stock cooler and a B350? You need as good a cooler and as good a mobo for the 2600X as you do for the 8600K if you want to push them.


----------



## Space Lynx (Apr 20, 2018)

trparky said:


> I'm just saying that the idea that AMD is somehow drastically cheaper than Intel is patently false, the prices I quoted (even though they may not be as good as the prices you got) prove that.



Holy crap, I just realized I got my mobo and 8600K for less than a single 2700X chip, lol. I didn't realize that until now for some reason. I re-used my 5-6 year old NH-D14 as well; meh, it's all good. Competition is good for all of us, we can all agree there. I'm very happy with the deals I got, though.


----------



## FeelinFroggy (Apr 20, 2018)

intrepid3d said:


> So why is it not a single one of the reviews show this? using your example, BF1, the i3 is 15% out infront of the 2700X. if that's a case where the 2700x is supposed to come good in CPU heavy games its still really bad, far worse than the i3.
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/12.html
> 
> lol, Same guy keeps down voting me, don't lurk laszlo say whats on your mind.



Honestly, I don't know what to make of the 720p stats; in fact, I don't really care. Some people think running at 720p puts all of the bottleneck on the CPU, and in some ways it definitely does. But it is not a real-world indicator, and more factors go into playing a game than the CPU. No one is going to run a system with a 1080 Ti at 720p; the 1080 Ti was not designed to run at such a low resolution, so there are going to be anomalies. When you look at the 1080p results for BF1, the results are very different, as there is only a few FPS difference between all of the top Intel CPUs. Maybe it was the sequence that was benched in BF1. The game does not have a built-in benchmark, which makes things very difficult; in fact, many of the games don't have built-in benchmarks. This is why you should look at cumulative averages and not one-off performances, because there are just too many variables, from hardware configuration to settings to the sequence used in the games.

Oh, and here is a nickel's worth of free advice: expect to be downvoted if you come to a site criticizing the reviewer and saying that they do not know how to do a review. You have basically accused the reviewer of being paid by Intel to make an AMD product look bad; you accused him of running Intel CPUs without the Spectre and Meltdown patches and accused his graphs of being misleading. Meanwhile, the reviewer addressed the Meltdown and Spectre patches and went back and updated every graph to include the base and boost clocks so your feelings would not get hurt. That is 14 games with 22 different CPUs at 4 different resolutions, which comes out to 1,232 individual results in just the gaming benchmarks. Do you have any idea how long it takes to bench 1,232 individual gaming results on three different platforms? And you are complaining about 1 of 1,232. Show some respect and you will get respect.
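For what it's worth, that benchmark count checks out:

```python
# 14 games x 22 CPUs x 4 resolutions, as described above
games, cpus, resolutions = 14, 22, 4
total = games * cpus * resolutions
print(total)  # 1232 individual gaming results
```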


----------



## trparky (Apr 20, 2018)

OK then...

ASUS TUF Z370-PLUS GAMING LGA 1151 ATX Intel Motherboard -- $119.99
Intel Core i5-8600K Coffee Lake 3.6 GHz LGA 1151 Boxed Processor -- $219.99 Retail Price ($189.99 if bought with a compatible motherboard)
Corsair Certified Hydro Series H100i v2 -- $69.99

Total price... *$379.97*

Compare that to the Microcenter AMD Ryzen 5 2600X and ASUS ROG Strix X470-F Processor and Motherboard Bundle which is $394.98.

Again, all I'm saying is that the idea that AMD is somehow drastically cheaper than Intel is patently false. I'm not trying to be a fanboy here; all I'm doing is the math, and the math doesn't lie.


----------



## Space Lynx (Apr 20, 2018)

FeelinFroggy said:


> Honestly, I don't know what to make of the 720p resolution stats.  In fact, I don't really care.  Some people think running at 720p puts all of the bottleneck on the CPU, and in some ways it definitely does.  But it is not a real world indicator and there are more factors going into playing a game than the CPU.  No one is going to run a system with a 1080ti at 720p.  The 1080ti was not designed to run at such low resolution so there are going to be anomalies.  When you look at the 1080p results for BF1 the results are very different as the is only a few fps difference between all of the top Intel CPUs.  Maybe it was the sequence that was benched in BF1.  The game does not have a built in benchmark which makes things very difficult.  In fact many of the games don't have built in benchmarks.  This is why you should be look at cumulative averages and not one off performances because there are just too many variables from hardware configuration to settings to what sequence used in the games.
> 
> Oh, and here is a nickel's worth of free advice.  Expect to be down voted if you come to a site criticizing the reviewer and saying that they do not know how to do a review.  You have basically accused the reviewer of being paid by Intel to make an AMD product look bad.  You accused him of running Intel CPUs pre specter and meltdown patches and accused his graphs as being misleading.  When the reviewer addressed the meltdown and specter patches and went back and updated every graph to include the base and boost clocks so your feelings would not get hurt.  That is 14 games with 22 different CPUs at 4 different resolutions.  That comes out to 1,232 individual results in just gaming benchmarks.  Do you have any idea how long it takes to bench 1,232 individual gaming results on three different platforms?  And you are complaining about 1 of 1,232.  Show some respect and you will get respect.




Ignore all the 720p benches and go look at TPU's own 1440p benches: my 8600K at stock beats the 2700X in every game at 1440p, sometimes by as much as 10 FPS. Not to mention that minimum frame rates at all resolutions are still better on Intel (check the GamersNexus and TweakTown reviews).

That being said, AMD is definitely the winner if you stream on Twitch etc.; Intel just can't keep up when it comes to high-end streaming. Those extra cores really benefit Twitch streamers.


----------



## SIGSEGV (Apr 20, 2018)

Great review, thanks Wizz.
AMD is able to catch up to Intel right now. Yes, bring on the competition. 

OT: I am so sick, but it's also really fun, and it makes me lol reading the comment section, especially with so many Intel fanboys butthurt over this AMD Ryzen 2 release. Am I a psycho? lol


----------



## intrepid3d (Apr 20, 2018)

FeelinFroggy said:


> Honestly, I don't know what to make of the 720p resolution stats.  In fact, I don't really care.  Some people think running at 720p puts all of the bottleneck on the CPU, and in some ways it definitely does.  But it is not a real world indicator and there are more factors going into playing a game than the CPU.  No one is going to run a system with a 1080ti at 720p.  The 1080ti was not designed to run at such low resolution so there are going to be anomalies.  When you look at the 1080p results for BF1 the results are very different as the is only a few fps difference between all of the top Intel CPUs.  Maybe it was the sequence that was benched in BF1.  The game does not have a built in benchmark which makes things very difficult.  In fact many of the games don't have built in benchmarks.  This is why you should be look at cumulative averages and not one off performances because there are just too many variables from hardware configuration to settings to what sequence used in the games.
> 
> Oh, and here is a nickel's worth of free advice.  Expect to be down voted if you come to a site criticizing the reviewer and saying that they do not know how to do a review.  You have basically accused the reviewer of being paid by Intel to make an AMD product look bad.  You accused him of running Intel CPUs pre specter and meltdown patches and accused his graphs as being misleading.  When the reviewer addressed the meltdown and specter patches and went back and updated every graph to include the base and boost clocks so your feelings would not get hurt.  That is 14 games with 22 different CPUs at 4 different resolutions.  That comes out to 1,232 individual results in just gaming benchmarks.  Do you have any idea how long it takes to bench 1,232 individual gaming results on three different platforms?  And you are complaining about 1 of 1,232.  Show some respect and you will get respect.



Not really advice, given you based it entirely on your assumptions.

I do think his reviews could be better, given how oddly clumped together the Intel and Ryzen results are. The only variation lies between Ryzen and Intel; among the low-end Ryzens there is little variation from the bottom up, and it's even worse on the Intel side, with low-end older-gen Intels showing pretty much the same performance as the latest and greatest. I think any reasonable observer would look at that and wonder whether a review is any good if it couldn't separate these clearly different tiers of CPUs into different levels of actual performance. Many other reviewers manage that seemingly quite easily.

Quite aside from all of that, if the top Ryzen is notably slower than the near-bottom i3, then Ryzen is very clearly utter garbage.

The same is still true for 1080p: https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/13.html


----------



## cucker tarlson (Apr 20, 2018)

SIGSEGV said:


> OOT : I am so sick but it's also really fun and makes me lol reading the comment section especially with so many intel fanboys butthurt with this AMD Ryzen 2 release. am i psycho ? lol


No, you're not psycho, just not a great reader.


----------



## Space Lynx (Apr 20, 2018)

SIGSEGV said:


> Great reviews. Thanks wizz
> AMD is able to catch up Intel right now. yes. bring the competition.
> 
> OOT : I am so sick but it's also really fun and makes me lol reading the comment section especially with so many intel fanboys butthurt with this AMD Ryzen 2 release. am i psycho ? lol





cucker tarlson said:


> No, you're not psycho, just not a great reader.



Not butthurt at all... I just enjoy looking at the numbers. If I didn't get my Z370 mobo on sale for $99, I probably would have waited for the 2700X. Just because I enjoy looking at the numbers and reporting them doesn't mean I am butthurt...

Does 10 fps matter? Nope, not at all, and any CPU you buy is going to be great. I didn't name-call anyone, but you guys feel the need to for some reason. Well, take it easy I guess; I have stuff to get to now.


----------



## dyonoctis (Apr 20, 2018)

Some comments here are making PC gaming look like cancer. Just because Ryzen is slower than an i3 in gaming, it's overpriced? Really? Following that twisted logic, the 7820X is also overpriced, since it's slower than the 2700X in gaming.

Geez. Again, the conclusion is simple: while not the fastest in gaming, you can game just fine on Ryzen. It's really a good multipurpose CPU, decent in gaming and decent in multitasking; the 8600K/8700 is a better buy if you value gaming above everything else. Using gaming as the sole measuring point is really too narrow.


----------



## Xaled (Apr 20, 2018)

trparky said:


> I'm just saying that the idea that AMD is somehow drastically cheaper than Intel is patently false, the prices I quoted (even though they may not be as good as the prices you got) prove that.



No, it is true, at least for the launch period, before Intel drops prices or offers a better product for the same price to compete with AMD's products.

Also, don't forget that AMD doesn't change sockets as often as Intel does. All AM4 owners can now upgrade to Zen+ (correct me please if I am wrong).


----------



## Readlight (Apr 20, 2018)

Everything works while it's new; it all comes down to how well programs and games are made. Memory latency is the same. And if I want, I can someday move to an i5-2400; I have no money for it now.
Their cooler is good, but they put less aluminum in it than the old ones.


----------



## Xaled (Apr 20, 2018)

dyonoctis said:


>



Nice slide there, where did you get that from? Can you share it with us? I really want to see how my 2500K performs against other new CPUs. This is exactly what I was searching for.


----------



## W1zzard (Apr 20, 2018)

FeelinFroggy said:


> 720p


720p is immensely useful to estimate what FPS you can get on that specific CPU with the fastest GPU money can buy and what to expect from <next-gen NVIDIA monster GPU>.

But yes, I would consider it a synthetic test and not a real-life benchmark. That's why I benchmark 1080/1440/4K, too.


----------



## Nordic (Apr 20, 2018)

Ryzen wasn't slower than an i3 at 720p; it was about 3% faster. The review even stated that they got worse performance from overclocking manually, while AMD's XFR did a better job on its own. The overclocked numbers are there for consistency with past reviews and should not be taken as representative performance.


----------



## dyonoctis (Apr 20, 2018)

Xaled said:


> nice slide there. where did you get that from? can you share it with us cause i really want to see how my 2500k performs against other new cpus. this exactly what i was searching for..


It's from hardware.fr
https://translate.googleusercontent...700201&usg=ALkJrhhPsyhZDosoKGPeZ2P7dw1zqqp_sw


----------



## FeelinFroggy (Apr 20, 2018)

intrepid3d said:


> Not really advice given you based it entirely on your assumptions.
> 
> I do think his reviews could be better given how oddly clumped together Intel and Ryzen are, the only variation there is lays between Ryzen and Intel, between the low end Ryzens there is little variation from the bottom up, its even worse on the Intel side with low end older gen Intel's pretty much the same performance as the latest and greatest, i think any reasonable observer would look at that and wonder "this isn't a good review if he couldn't separate these clearly different levels of CPU's into different levels of actual performance" many other reviewers do manage that seemingly quite easily.
> 
> ...




The 1080p results I see for cumulative gaming performance show a stock 2700X beating the i3. The 2700X gets about 5% fewer fps than the i3 in BF1. I would not qualify that as notably slower; in fact, for a game with no built-in benchmark, it is closer to the margin of error. Remember our lesson on single-core performance and overclocking Ryzen: overclocking the 2700X hurts gaming performance, so you should not look at the OC 2700X data, as that is not the optimal way to game on the 2700X. The top Intel CPUs are within 1% of each other in BF1. The 8400 costs half as much as the 8700K and gets one tenth of a point higher fps. So does that mean the 8700K is "very clearly utter garbage," as you have repeated several times? Of course not; it is an outlier in one game, across multiple different systems, in one sequence of a game that does not have a built-in benchmark. 

Assumptions? Did you not call W1zzard out twice about Spectre and Meltdown patches? I broke it down to 1,232 individual benchmarks. I am not a math genius, but simple multiplication is not an assumption, it's reality. You are the one making assumptions. Since you are such an expert at CPU reviews, kindly leave some links to reviews you have done to show these amateurs how it is done. 

You are probably too young to have watched Swingers, but look it up on YouTube and watch the scene where he calls the girl 20 times and leaves a message. This is what you are doing now, and all you have accomplished is embarrassing yourself.


----------



## phill (Apr 20, 2018)

@W1zzard - I'd just like to say thank you for the review.  It was decent and showed me enough to want to buy an AMD Ryzen 2, even if just for a trial!  I find the price point for the CPU very compelling and, to be honest, 8 cores is just the sweet spot for everything these days, games and workloads alike; I don't see why anyone would need anything else, really.  Yes, if you want the utmost speed then maybe Intel is where you need to go, but I hope the CPUs keep getting better, and with the next Ryzen coming at some point next year, it'll be better still...  

Exciting times ahead for everyone, I believe.  I, for one, really welcome the competition from AMD and what they are managing to do.  Very pleased.
Now if they could only do the same in their GPU division....


----------



## intrepid3d (Apr 20, 2018)

FeelinFroggy said:


> The 1080p results I see for the cumulative gaming performance shows a stock 2700x beating the i3.  The 2700x is about 5% fewer fps than the i3 if BF1.  I would not qualify that as notably slower, infact, with a game with no built in benchmark, it is closer to the margin of error.  Remember on our lesson on single core performance and overclocking Ryzen.  Overclocking the 2700x hurts gaming performance so you should not look at the OC 2700x data as it is not the optimal way to game on the 2700x.  The top Intel CPUs are within 1% of each other in BF1.  The 8400 cost half as much as the 8700k and has one tenth of a point higher fps.  So does that mean the 8700k is "Is very clearly utter garbage" as you have repeated several times?  Of course not, it is an outlier on one game with multiple different systems in one sequence of a game that does not have a built in benchmark.
> 
> Assumptions?  Did you not call W1zzard out twice about specter and meltdown patches?  I broke it down to 1,232 individual benchmarks.  I am not a math genius, but simple multiplication is not an assumption, its reality.  You are the one making assumptions.  Since you are such an expert at CPU reviews, kindly leave some links to your previous reviews that you have done to show these ametures how it is done.
> 
> You are probably too young to watch Swingers, but look it up on youtube and watch the scene where he calls the girl 20 times and leaves a message.  This is what you are doing now and all you have accomplished is embarrassing yourself.



By 2.6%, with both at stock. As W1zzard showed, if you overclock the 2700X you actually end up with much lower performance, while the i3 overclocks quite a lot.

https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/20.html


----------



## dyonoctis (Apr 20, 2018)

intrepid3d said:


> The problem is its not really a compromise is it?
> 
> if you get the cheaper 4 core Ryzens then you only pay i3 money but end up with really bad gaming performance and only i3 in anything else, so the i3 is much better.
> 
> ...


It all depends on how many threads your applications use. For 3D rendering, for example, the 2700X is faster than an 8700K, and only 10% slower than a 7820X while being 38% cheaper, and faster in gaming.
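The price/performance reasoning here can be made concrete with a small helper. The prices below are placeholders chosen only to illustrate the ratio math, not quotes from the review or any retailer:

```python
def pct_cheaper(price_a: float, price_b: float) -> float:
    """How much cheaper A is than B, in percent of B's price."""
    return (price_b - price_a) / price_b * 100.0

def perf_per_dollar(score: float, price: float) -> float:
    """Benchmark points bought per dollar spent."""
    return score / price

# Placeholder street prices: a $310 chip vs. a $500 chip.
print(pct_cheaper(310.0, 500.0))  # 38.0
```

The same helpers work for any pair of chips once you plug in real street prices and a benchmark score you care about.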


----------



## intrepid3d (Apr 20, 2018)

dyonoctis said:


> It all depend of how much thread your applications are using. For 3d rendering for example the 2700x is faster than a 8700k, and only 10% slower than a 7820x while being 38% cheaper, and faster in gaming.



Yes, it's a good low-cost productivity CPU, but that's all. It's still a sub-i3 gaming chip, so unless all you do is render in Blender, it's pretty crap.

Edit: faster than what in gaming? A Pentium? Congratulations, it beats a $60 CPU.


----------



## FeelinFroggy (Apr 20, 2018)

intrepid3d said:


> By 2.6% with both at stock, as W1zzard proved you overclock the 2700X you actually end up with much lower performance, the i3 overclocks quite a lot.
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/20.html



Yeah dude, whatever you say. The i3 is a better overclocker than the 2700X, so that clearly makes it by far the superior CPU.  

I feel like I am playing cards with my brother's kids or something...


----------



## dyonoctis (Apr 20, 2018)

intrepid3d said:


> Yes its a good low cost productivity CPU, but thats all, its still a sub i3 gaming chip so unless all you do is render in blender its pretty crap.


It's crap to you because all you care about is gaming on an apparently 144+ Hz screen. By that standard a 7920X would also be crap in your eyes, yet awesome for everyone else using multithreaded applications.
And "crap" is a pretty strong word. I thought "crap" was for when something is deficient; Ryzen's fps don't look deficient to me. Slower, yes, but crap?

To your edit: it's faster than a 7820X. Skylake-X doesn't do that well in gaming compared to Coffee Lake and even Ryzen+.


----------



## TheLaughingMan (Apr 20, 2018)

The real fun is what GN did. They got the chip to run at 4.0 to 4.1 GHz on 1.125 V, which gave performance on par with stock most of the time and drastically lowered power draw and heat output. If I upgraded, that is what I would do, since I did the same with my 1800X. I will most likely just wait for Zen 2.
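For anyone curious why a modest undervolt cuts power and heat so sharply, the usual dynamic-power rule of thumb (P ≈ C·V²·f) gives a rough idea. The 1.125 V figure is GN's reported undervolt; the 1.35 V stock voltage is an assumption for illustration, so treat the result as a ballpark, not a measurement:

```python
def relative_dynamic_power(v_new: float, v_old: float,
                           f_new: float, f_old: float) -> float:
    """Dynamic power of the new operating point relative to the old one,
    using P ~ C * V^2 * f; the capacitance term C cancels out."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# 1.125 V undervolt vs. an assumed 1.35 V stock, both at 4.0 GHz:
print(round(relative_dynamic_power(1.125, 1.35, 4.0, 4.0), 3))  # 0.694
```

Under those assumptions the undervolted chip draws roughly 30% less dynamic power at the same clock, which lines up with the "drastically lowered power draw" observation.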


----------



## W1zzard (Apr 20, 2018)

TheLaughingMan said:


> They got the chip to run at 4.0 to 4.1 GHz on 1.125 V which gave performance on par with stock most of the time and drastically lowered power draw and heat output


So, the same thing you can do with Intel? But on Intel you can probably go even lower (assuming OC headroom roughly equals undervolting headroom).


----------



## DeathtoGnomes (Apr 20, 2018)

I vote this thread be closed; it's the same arguing back and forth.


----------



## HTC (Apr 20, 2018)

DeathtoGnomes said:


> i vote this thread be closed. its the same arguing back and forth.



Or move that "arguing back and forth" to another 2700X topic: one that's not a TPU review.


----------



## Nordic (Apr 20, 2018)

To try and move the subject onwards: what improvements will AMD include in Zen 2? Zen+ was a nice improvement on its own, and AMD didn't change much.


----------



## Xaled (Apr 20, 2018)

Does anyone know if there will be such a refresh for Threadripper too, or at least a price cut? The 1950X now has no business being sold at $950-1,000. At launch it was twice the price of the Ryzen 1800X, but now that doesn't make sense.

By the way, while searching for that I found this interesting article, which claims the 1800X is now 10% faster than it was at launch:
https://www.extremetech.com/computing/267915-psa-the-amd-ryzen-7-1800x-is-faster-than-it-used-to-be


----------



## HTC (Apr 20, 2018)

Xaled said:


> Does anyone know if there would be such refreshment for Threadripper too or a price cut at least?.. as 1950x now has no point of being sold at 950$-1000. At launch it has the 2x price of of Ryzen 1800x but now that doesnt make sense..
> 
> *By the way while i searching for that i found this interesting article that claims that 1800x now is 10% faster than 1800x at launch*
> https://www.extremetech.com/computing/267915-psa-the-amd-ryzen-7-1800x-is-faster-than-it-used-to-be



Interesting. Any other sources to back this up?


----------



## xorbe (Apr 20, 2018)

Imho, on that one chart a couple of pages back (purposefully using 1280x720 to exaggerate results...), everything from the 1500X up is fine for gaming, within the 100% +/- 15% range. That's like 2/3 of the chart. We're splitting performance hairs here.

Anyway, I'm planning to build a 2700X system, just not sure when, or what video card I'll put in it.


----------



## HTC (Apr 20, 2018)

Spotted this over at the AnandTech forums, translated from French:

https://translate.google.com/transl...470-r5-2600x-r7-2700x.html?start=0&edit-text=

*They compare a lot of older-generation processors* from both camps to the 2600X and 2700X, and also include Intel's current offerings.


----------



## intrepid3d (Apr 20, 2018)

Interesting: TechSpot did a clock-for-clock comparison, and it seems Ryzen 2 has about a 7% IPC gain in games like BF1, and about 3% in Cinebench.

That's as much as the whole gain from the 1800X to the 2700X in the same game in this review.


----------



## FeelinFroggy (Apr 20, 2018)

That is interesting, considering AMD stated that the IPC increase was 3%. I would have thought AMD knows their CPUs better than a TechSpot reviewer. Boy, was I wrong.

Both reviews below state that AMD informed them of the 3% IPC increase from the 1800X to the 2700X.

https://www.anandtech.com/show/12625/amd-second-generation-ryzen-7-2700x-2700-ryzen-5-2600x-2600/4

https://www.tomshardware.com/reviews/amd-ryzen-7-2700x-review,5571.html


----------



## TheLaughingMan (Apr 20, 2018)

W1zzard said:


> So same thing you can do with Intel? but on Intel you can probably  go even lower (assuming that OC headroom ==~ under voltage headroom)


True. It's just good to see that this is an option for AMD again. Everything from AMD since the original Black Edition line has been power hungry as hell. I don't recall any desktop AMD processor in years running on less than 1.25 V, except the E lines, let alone one with 8 cores at 4.0 GHz+.


----------



## Joss (Apr 20, 2018)

HTC said:


> Spotted this over @ anandtech forums, translated from French:
> https://translate.google.com/translate?sl=fr&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http://www.comptoir-hardware.com/articles/cpu-mobo-ram/36169-test-amd-x470-r5-2600x-r7-2700x.html?start=0&edit-text=
> *They compare a lot of older generation processors*, from both camps to 2600X and 2700X and also include Intel's current offerings.


What a wonderful way to show a chart! When you hover over a certain CPU's score, it shows as 100% with the others relative to it.
That site is in my bookmarks now.


----------



## evernessince (Apr 21, 2018)

W1zzard said:


> yup



Thank you!


----------



## W1zzard (Apr 21, 2018)

FeelinFroggy said:


> AMD stated that the IPC increase was 3%


https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/16.html

3% does sound about right; we got 1.91%. It will depend on what you test, because those measurements also include other gains.

Note the wording "single-threaded improvement"; this will change with multiple threads, too.
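For reference, the clock-normalized comparison behind figures like these can be sketched as follows. The scores and clocks in the example are made up for illustration, not taken from the review:

```python
def ipc_gain(score_new: float, clock_new: float,
             score_old: float, clock_old: float) -> float:
    """Percent IPC gain: per-clock score of the new chip vs. the old one."""
    return ((score_new / clock_new) / (score_old / clock_old) - 1) * 100.0

# Hypothetical single-threaded scores with both chips locked to 4.0 GHz:
print(ipc_gain(163, 4.0, 160, 4.0))  # 1.875
```

Locking both chips to the same clock makes the clock terms cancel, which is exactly why clock-for-clock testing isolates IPC from frequency gains.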



intrepid3d said:


> in games like BF1


Didn't you mention a few pages back that people should test at highest settings?


----------



## HTC (Apr 21, 2018)

Joss said:


> What a wonderful way to show a chart! *When you hover a certain CPU score it shows 100% with the others relative to it*
> That site is on my bookmarks now



As far as I know, ComputerBase has been doing that in their CPU/GPU reviews for quite some time; I don't know of any others that do the same.


----------



## mroofie (Apr 21, 2018)

Slightly off topic

Ryzen second generation Pro coming 2H

Is that Q4???


----------



## Arrakis9 (Apr 21, 2018)

intrepid3d said:


> Interesting, TechSpot made a clock for clock comparison, it seems Ryzen 2 has about a 7% gain in IPC in games like BF1, about 3% in cinebench.
> 
> That's as much as the whole gain from the 1800X to the 2700X in the same game in this review.
> 
> ...



You really should stop the Intel shilling. Every review site has similar but ultimately different hardware for testing (silicon lottery, different motherboards, different memory timings/frequencies); if you're expecting every review site's results to line up 100%, you're truly lost and at this point just being a troll. If you don't like the reviews here, go somewhere else that has more 'respectable' numbers and views; it's not like your constant arguing is going to change the review or the results. Give it a rest already.

Thanks @Wizzard for the time and effort you put into the review.


----------



## Vayra86 (Apr 21, 2018)

This looks really good. Still some considerable gaps in gaming scenarios but regardless, for 8c/16t this is another great step forward. Intel still wins on clocks though, so for gaming Ryzen still isn't the _optimal_ choice.


----------



## Tsukiyomi91 (Apr 21, 2018)

Now this is interesting. Gaming performance is to be expected... especially on 1080p & 1440p, so no surprises there. Price wise it's very competitive, should be able to shake up Intel a little bit.


----------



## Joss (Apr 21, 2018)

I wonder what percentage of people play at high refresh rates (144 Hz and above), because I suspect it's tiny.
And even among those who buy 144 Hz monitors, do they all really demand 144 fps in every possible scenario? I suspect many are happy to play above 100 fps.
I find the 2700X to be an excellent solution for the majority of demanding scenarios, gaming included.


----------



## cucker tarlson (Apr 21, 2018)

What this thread leads me to conclude is not so much about the performance of Ryzen 2, which was pretty much to be expected, as about the fact that people need to learn how to interpret the numbers.

I said early in the thread that "7600K-like performance is mediocre." It is. The 2700X should be at least level with the 8600K when it comes to performance numbers. But saying that an i3 is as good a gaming processor as Ryzen is simply not true; Ryzen wins in a landslide overall. Take 10 games and say that in the end the 7600K and 2700X are within 1-2% of each other. Anyone who has experienced a CPU bottleneck with a fast GPU and a high-refresh monitor will tell you that 55 fps on a CPU running at 50-60% across its cores is going to feel much smoother than 55 fps, or even 70 fps, on a 4c/4t CPU hitting 90%+ on every core. It just is; I experienced it myself and tested it many times with my 4790K, HT on vs. HT off. The only exception is the Broadwell-C family with L4 cache, which runs smooth, without stutter or hiccups, even when the CPU is being hammered.

The only games where an 8350K will feel smoother are those which use 1-2 threads heavily but don't put much load on the other cores, and they're maybe 2 out of 10 these days. Far Cry 5 would be an example; they're still using the Dunia engine. I think Kingdom Come: Deliverance is another one. The rest are either decent or very good at multithreading. I bet you anything that if you could play them side by side, 160 fps in BF1 on the more-threaded CPU would feel smoother than 170 fps in the same game on an 8350K, which would get occasional stutter and hiccups when its cores are heavily loaded. Why is the 8350K getting a higher number, then? Because a faster single core is not equivalent to more threads, and vice versa. What is unquestionably true is that in games that multithread well, a CPU that is 50% loaded will not get the frametime spikes and stutter that a 90%+ loaded CPU will get, even if the latter produces twice as many fps.

So does the Ryzen 2700X have mediocre gaming performance? Yes, it does, because the 8600K is faster and cheaper. Is the 8350K as good a gaming CPU? Hell no.


----------



## intrepid3d (Apr 21, 2018)

W1zzard said:


> https://www.techpowerup.com/reviews/AMD/Ryzen_7_2700X/16.html
> 
> 
> Didn't you mention a few pages back that people should test at highest settings?



Yes, and I stand by that.

I understand and agree with 720p testing, but by reducing image quality settings you are reducing the workload on the CPU. I don't think I need to explain it beyond that; if you think about it for a minute, it should make sense to you.


----------



## cucker tarlson (Apr 21, 2018)

intrepid3d said:


> by reducing image quality settings one is reducing the workload on the CPU


No. You literally learned nothing here.

I am going to have to -1 you for again spreading misinformation about something we already explained to you.

Open any game and look at your CPU usage. Then drop the resolution down to 720p. Does the CPU usage go up or down?

Report back.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> No. You literally learned nothing here.
> 
> I am gonna have to -1 you for again spreading misinformation about something we already explained to you.



Where did you explain it? I must have missed that, and I'm genuinely interested.

PS: Downvote all you like; the rep system is utterly meaningless to me. If anything, I find it slightly amusing how those things on any forum, especially Reddit, get used by immature people hammering the mouse button at people who say things they simply don't like or disagree with.

In fact, what makes you think anyone should worry about the like or dislike buttons? They just make people say what's popular in order not to lose those shiny like points. Go ahead, knock yourself out and downvote all you like.


----------



## cucker tarlson (Apr 21, 2018)

intrepid3d said:


> Where did you explain it? i must have missed that and i'm genuinely interested.


https://www.techpowerup.com/forums/threads/amd-ryzen-7-2700x-3-7-ghz.243209/page-2#post-3830814

You should be pretty embarrassed by now. No wonder you missed it; you've been posting non-stop since the review came out. Take a shower.



1440p low quality: 56% usage on the CPU, 86% usage on the GPU







1440p Ultra: 38% on the CPU, but 97% on the GPU
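A toy model makes usage numbers like these less surprising: treat each frame as needing both a CPU pass and a GPU pass, with the slower of the two setting the frame rate. All millisecond figures below are invented for illustration, not measurements from the screenshots:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower unit limits the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

def busy(own_ms: float, cpu_ms: float, gpu_ms: float) -> float:
    """Fraction of each frame a unit spends working (its 'usage')."""
    return own_ms / max(cpu_ms, gpu_ms)

# Ultra quality: heavy GPU pass -> GPU-bound, CPU looks half idle.
print(round(busy(5, 5, 12), 2), round(busy(12, 5, 12), 2))  # 0.42 1.0

# Low quality: GPU pass shrinks, CPU work barely changes -> CPU usage rises.
print(round(busy(5, 5, 6), 2), round(busy(6, 5, 6), 2))  # 0.83 1.0
```

In this sketch the CPU's per-frame work is unchanged between the two settings; its reported usage rises at low quality only because the GPU stops gating the frame rate, which is the point the screenshots make.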


----------



## W1zzard (Apr 21, 2018)

intrepid3d said:


> Yes, and i stand by that.
> 
> I understand and agree with 720P testing but by reducing image quality settings one is reducing the workload on the CPU. i don't think i need to explain it beyond that, if you think about it for a minute it should make sense to you.


Your TechSpot link uses medium quality.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> Open any game, look at you cpu usage. Then drop the resolution down to 720p. Do you see the cpu usage go up or down ?
> 
> Report back.



That's pseudo-logic: by turning the resolution down, the GPU renders frames faster, which makes the CPU work harder to keep up.
However, games are far more complex than that. Graphics quality settings exist to reduce the workload on the whole system, including the CPU. In games the CPU runs streamed shading and lighting calculations, post-processing filters, soft-body and rigid-body physics math... a whole bunch of things, and by reducing graphics settings you are reducing, or in some cases turning off, those things, so the CPU is doing less work.
Usually, by reducing graphics quality settings you are reducing the need for higher-core-count CPUs, given that a lot of these things, like physics, are very parallelized workloads; you can make a 2- or 4-core CPU look just as fast as a 6- or 8-core, where in fact, when such things are turned on, they overwhelm smaller CPUs with fewer cores.
Observe this video. Why do you think that when looking at the sky the 7600K is faster than the Ryzen 1600, but when looking at the complex scene the Ryzen 1600 is twice as fast? Because in the complex scene there is vastly more work for the CPU to do, and the 4-thread Intel is only half as fast as the 12-thread Ryzen, even at much lower clock speeds.
Turn the graphics settings down and the scene will be far less complex, allowing the 7600K to catch up again. It's why I'm always suspicious of reviews that have 4-thread CPUs and 12-thread CPUs from the same vendor clumped together.

Edit: this forum will not allow time-stamped YouTube links; go to 5m10s.


----------



## cucker tarlson (Apr 21, 2018)

You'd have to have CPU and GPU load figures in that video to prove that. They're not there, so either produce your own results with usage included, like I did, or just give us a rest. You're basically saying that a higher CPU usage number means it has less work to do. If what you're saying were true, then a faster CPU would make a difference at 4K. It never does.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> You'd have to have a cpu and gpu load figures in that video to prove that. They're not there, so either produce your own resluts with usage included like I did or just give us a rest. You're basically saying that a higher cpu usage numer means it has less work to do.



Wow... you didn't even watch it. If you had, you would realize this ^^^^ is nonsense. Did you go to timestamp 5m10s and watch it from there?


----------



## cucker tarlson (Apr 21, 2018)

intrepid3d said:


> Wow... you didn't even whatch it, if you had you would realize this ^^^^ is a nonsense. did you go to time stamp 5m10s  and watch it from there?


Of course I didn't watch it. You just said "when looking at the sky..." and then posted a 17-minute video with no reference to the scene you were talking about.

I've wasted enough time talking to a troll.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> Of course I didn't watch it. You just said "when looking at the sky.." and then posted a 17 minute video with no reference to the scene you were talking about.



This forum removes YouTube timestamps, so you have to go there yourself: 5m10s. Watch two minutes or more from that point if you like. Again: 5m10s.


----------



## cucker tarlson (Apr 21, 2018)

All this proves is that 4c/4t is too few cores to run Crysis, since that game is very heavily multithreaded. It has nothing to do with decreasing resolution and how that impacts CPU usage. You're not comparing identical scenes, like you do with the 1080p vs. 720p tests here; you're comparing sky vs. terrain. The 7600K renders the sky flawlessly but can't cope with the terrain. If you dropped the resolution in that scene from 1080p to 720p, the 7600K would be even more loaded, that is, if it hadn't already hit 100% usage, but we don't know, since there are no CPU or GPU usage figures.

I think you're just drawing the wrong conclusion. You think that if the 7600K can handle rendering the sky but can't cope with the terrain, then increasing image quality will always hammer the CPU. That is not true in 99.9% of cases. It means some scenes in a game are simply easier to render than others; whether it's the GPU or the CPU that has it easier depends on the scene. This is the conclusion you should have drawn.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> All this proves is 4c/4t is too few cores to run crysis since this game is very heavily multithreaded. It has nothing to do with decreasing resolution and how it impacts cpu usage. You're not comparing identical scenes like you do with a 1080p vs 720p tests here, you're comparing sky vs terrain. Think for a second dude !



You're the one fixated on resolution. I have said it many times already, but I'll say it again to be clear: reducing the resolution is perfectly valid, I have no problem with it; in fact, I agree that testing at low resolution should be included in reviews. Did you get that into your head now?

Crysis 3 is a six-year-old game, and these days it is far from unique; almost every modern title has areas that put a lot of emphasis on more threads rather than fewer.

They are not difficult to find if you know what to look for, if you understand how game engines work and how they use your CPU, as Digital Foundry do. In the same video they also identified this in Rise of the Tomb Raider and Assassin's Creed, games that are usually seen as very heavily Intel-biased, _only when you benchmark simple scenes_.

Games the world over now have a mixture of simple and complex scenes that in one instance can make a high-clocked i3 look better and in another can make a low-clocked Ryzen 1600 look better.


----------



## cucker tarlson (Apr 21, 2018)

intrepid3d said:


> in the same video they also identfied this in Ryse of the Tomb Raider and Assassins Creed, games that are usually see as very heavily Intel bias, only when you benchmark simple scenes.


Again, this is because faster IPC and an increased number of threads are not equivalent, and when the Intel CPU is not getting 90-100% loaded on all threads but has some breathing space, it will pull away from AMD.

Yes, I got that into my head now, although I actually never undermined 720p testing, nor did I say it's invalid. I just caught you with your pants down saying stuff that's just plain wrong; that's the origin of this discussion. I bid you farewell.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> Again, this is because faster IPC and an increased number of threads are not equivalent, and when the Intel CPU is not 90-100% loaded on all threads but has some breathing room, it will pull away from AMD.



So what? How is it the fault of the AMD CPU, with its higher thread count, that the Intel CPU cannot keep up with it in complex scenes?

Where or how is the justification that Intel should be given the benefit of not being tested in a way that puts full load on their CPUs, simply because in such a scenario the AMD CPU is clearly faster? How do you reason that?


----------



## cucker tarlson (Apr 21, 2018)

Ugh, takes a lot of explaining, but here's the short version



> Where or how is the justification that Intel should be given the benefit of not being tested in a way that puts full load on their CPUs, simply because in such a scenario the AMD CPU is clearly faster? How do you reason that?



1. Except all that you wrote is actually true. Yes, they do test them under full-load scenarios, and yes, Intel's 4-core/4-thread CPUs tend to get hammered hard when a game requires 6 cores or more.

https://www.purepc.pl/procesory/tes...e_i3_8350k_prawie_jak_core_i5_7600k?page=0,38
https://www.purepc.pl/karty_graficz..._s_creed_origins_problemy_w_egipcie?page=0,12
https://www.purepc.pl/procesory/tes...e_i3_8350k_prawie_jak_core_i5_7600k?page=0,37
https://www.purepc.pl/procesory/tes...e_i3_8350k_prawie_jak_core_i5_7600k?page=0,31

2. And this is the main theme here: you are drawing way too many conclusions from this one Digital Foundry test, and from one oddball scene in particular. I'll tell you, I've run many CPU tests myself; I've seen huge differences between i5 and i7, but I've never been able to reproduce such a huge margin under any circumstances. The 7600K is faster than the 1600 when the game is not using all its threads, and it gets hit hard when the game suddenly requires 12 threads to run smoothly. But that's what I've been trying to get across to you; this is the conclusion you should have arrived at: a 4c/4t Intel is faster than AMD when the particular scene in a game doesn't require all of its resources, and it takes a heavy performance drop when those resources are running at their limit in other scenes, while a Ryzen still has plenty of CPU resources left, so it doesn't take such a drastic drop. Instead, this is the conclusion you arrived at: "I would have thought by now people are clued up enough to know that by turning graphics settings down you're reducing the load on CPUs". Swing and a miss.

I literally ran out of time I should spend talking to you like 20 minutes ago. Take care.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> Ugh, takes a lot of explaining, but here's the short version
> 
> 
> 
> ...



OK, I'll put it a different way. I take it you watched that video now, and you saw the massive frame-rate dips on the 7600K, the stuttering that was happening on it?

None of that shows up on slides. In fact, slides can be very forgiving: taken over a longer period, they dilute performance differences, because the figures are averages over stretches of both low and high frame rates.

What's more, they don't show you that stuttering. It's why I argue slide-only reviews can be misleading these days: they hide the true gaming experience one can expect behind the CPU's raw numbers in text only.
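To illustrate the point with a minimal sketch (the frame-time traces below are made up, not taken from any review): two runs can have identical average FPS while one of them stutters badly, and only a percentile metric such as the 1% low reveals it.

```python
# Made-up frame-time traces: same average FPS, very different experience.

def avg_fps(frame_times_ms):
    # Average FPS over the whole run: total frames / total time.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    # Average of the slowest 1% of frames, expressed as FPS.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:n]) / n)

smooth  = [16.7] * 1000                 # perfectly even frame pacing
stutter = [14.0] * 970 + [104.0] * 30   # same average, periodic 104 ms spikes

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    print(name, round(avg_fps(trace), 1), "avg FPS,",
          round(one_percent_low_fps(trace), 1), "1% low FPS")
```

Both traces average roughly 60 FPS, but the stuttery one collapses to single-digit FPS in its worst 1% of frames; a bar chart of averages would rank them identically.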

This is Insurgency, an old Source-engine game, and this is me playing this very old game. The GPU is a GTX 1070, the CPU a 4.5 GHz 4690K.
Watch closely what happens to the 4 cores in this: they jump around, bouncing off 100% on all four. I can tell you it feels horrible; it's laggy and stuttery, and in some places you can even see the stutter.

Now here's the thing: if I didn't already know about this from owning an Intel 4-core and, as a noob looking to buy a CPU, I looked at W1zzard's slides, I would be led to conclude the 4-core Intel i3 is much better than the 1600, the 2600, even the 2700X, when in fact the experience I would actually get is what I have right now, which is pretty dreadful; I would have been much better off even with the Ryzen 1600.










It's not to have a go at W1zzard; I just hope that, as a good reviewer, he takes this feedback for what it is.

BTW, a lot of the games I play are as bad as that or worse on this Intel 4-core.


----------



## cucker tarlson (Apr 21, 2018)

Why do you lecture me on things I literally wrote a few posts earlier ?

https://www.techpowerup.com/forums/threads/amd-ryzen-7-2700x-3-7-ghz.243209/page-9#post-3831825

Either poor reader or deliberately trolling, don't wanna know which one, all same to me.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> Why do you lecture me on things I literally wrote a few posts earlier ?
> 
> https://www.techpowerup.com/forums/threads/amd-ryzen-7-2700x-3-7-ghz.243209/page-9#post-3831825
> 
> Either poor reader or deliberately trolling, don't wanna know which one, all same to me.



Then we simply agree, and I'm glad of that. I'm not alone in realizing how flawed a lot of mainstream reviews are becoming.


----------



## Joss (Apr 21, 2018)

@*cucker tarlson*
@*intrepid3d*

get a room


----------



## cucker tarlson (Apr 21, 2018)

Joss said:


> @*cucker tarlson*
> @*intrepid3d*
> 
> get a room


I adamantly and vehemently reject and deny any and all implication and allegation that I have ever engaged in any  relations with this person. 

Seriously, he is something else...








intrepid3d said:


> Then we simply agree, and I'm glad of that. I'm not alone in realizing how flawed a lot of mainstream reviews are becoming.


If so, then just say that in one sentence or one post and explain it clearly; don't antagonize or pick on people. You've been PMSing all over this thread since it was created.


----------



## intrepid3d (Apr 21, 2018)

cucker tarlson said:


> I adamantly and vehemently reject and deny any and all implication and allegation that I have ever engaged in any  relations with this person.
> 
> Seriously,he is something else ....
> 
> ...



Point taken


----------



## Totally (Apr 21, 2018)

intrepid3d said:


> The Ryzen CPU is already a massive bottleneck, look at the slide. ^^^^^
> 
> 
> 
> ...



Since when is 3% massive? I could understand it if we were talking about computational work, where that 3% might translate to several hours or a couple of days, but this is gaming performance we're talking about. Could you please get real for a moment?


----------



## heflys20 (Apr 21, 2018)

intrepid3d said:


> Look, it's an i3, a cheap and cheerful CPU, better than AMD's finest; clearly it's Bulldozer 2018.



Good lord. Are you being serious?


----------



## Swift_owl (Apr 22, 2018)

I've spent a long time (7 years) reading W1zzard's reviews and going through the discussions. I finally had to register just to put in my two cents on the discussion. Another great review, W1zzard; you, my friend, are one of the best in the business. 

This discussion (by one person) turned into first shitting all over W1zzard's review, looking for something to attack, then just blatantly attacking the Ryzen CPU itself. You can't keep injecting your opinion over the facts laid out right in front of you. You have all of W1zzard's test results in plain view. I couldn't wait for this review to go live so I could see whether I want to buy the new 2700X. W1zzard did another great job laying down the facts, so I can form my own opinion on which CPU *I want to buy*. 

Thanks W1zzard, keep up the good work, you are a vital part of the tech hardware community.


----------



## Melvis (Apr 22, 2018)

Find me one person on the planet who uses a current 7th/8th-gen or even 6th-gen i7 CPU, or a current Ryzen 7 CPU, with a GTX 1080 Ti and games at 720p. Just one! Then report back. That will tell you how irrelevant 720p gaming benchmarks are...


----------



## HTC (Apr 22, 2018)

W1zzard said:


> *720p is immensely useful to estimate what FPS you can get on that specific CPU with the fastest GPU money can buy and what to expect from <next-gen NVIDIA monster GPU>.*
> 
> But yes, I would consider it a synthetic test and not a real-life benchmark. That's why I benchmark 1080/1440/4K, too.



The problem with this is that there are cases, such as RotTR, where the 8700K goes from being 25.9 FPS ahead @ 720p to 13 FPS behind @ 1080p ("stock" figures).

I figure AMD (or the motherboard makers) pulled an "Intel MCE" on us, which is why stock figures tend to actually be higher than OCed ones (mostly in games, but also in some applications).

Also (if it was already asked and replied to, I missed it): what cooler was used for each CPU? I see no mention of this in the test setup section.


----------



## Super XP (Apr 22, 2018)

Do people not realize this quote, "AMD stated that the IPC increase was 3%", doesn't mean a 3% IPC gain on every single thing that's tested? It's an average; you will achieve less or more than 3%. It varies, just like PC gaming FPS varies. The rest of your components also have a say in how your overall setup performs. 
Anyhow, this Ryzen refresh is just that, a refresh of the original Zen release. It's meant for those who want an upgrade, not for those who already own the original Ryzen CPUs.

And those claiming Ryzen is Bulldozer 2018 need to see Dr. YouKnowJack as soon as possible.


----------



## Vayra86 (Apr 22, 2018)

@Melvis
@intrepid3d

Just get this line into your head when it comes to 720p game CPU tests:

'720p is a resolution that removes any kind of GPU bottleneck and serves as a _synthetic test _to gauge _relative CPU performance'_

Key words: *synthetic*, and *relative* performance.

If you want to know how a game runs with any given CPU in a real-world scenario, find a performance test of that specific game. There really isn't much more to say about it.
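A toy model of why this works (all numbers below are hypothetical, purely to illustrate the idea): if per-frame time is roughly the slower of the CPU's and the GPU's work, and only the GPU's share scales with pixel count, then shrinking the resolution strips away the GPU ceiling and exposes the relative CPU gap.

```python
# Toy bottleneck model: frame time ~ max(CPU work, GPU work); only the GPU
# side scales with resolution. Numbers are invented for illustration.

PIXELS = {"720p": 1280 * 720, "1080p": 1920 * 1080, "4K": 3840 * 2160}

def fps(cpu_ms, gpu_ms_at_4k, res):
    # GPU cost scales (roughly) linearly with pixel count in this sketch.
    gpu_ms = gpu_ms_at_4k * PIXELS[res] / PIXELS["4K"]
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu, slow_cpu = 4.0, 5.0   # ms of CPU work per frame (hypothetical)
gpu = 16.0                      # ms of GPU work per frame at 4K (hypothetical)

for res in ("4K", "1080p", "720p"):
    print(res, round(fps(fast_cpu, gpu, res)), "vs", round(fps(slow_cpu, gpu, res)))
```

At 4K both CPUs land on the same GPU-bound number; at 720p the 25% CPU difference shows up in full, which is exactly the "relative performance" a 720p page is meant to capture.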


----------



## Super XP (Apr 22, 2018)

I game at 2K 144 Hz. Personally, I think all gaming reviews should be done at 1080p, 2K & 4K, but I do understand the reasoning behind this site's reviews.
W1zzard offers a different perspective, of course; hence why I've had TechPowerUp bookmarked for years now. The reviews are rock solid.


----------



## W1zzard (Apr 22, 2018)

Super XP said:


> I think all gaming reviews should be done at 1080p, 2K & 4K


Yup. That's why I go the extra mile to get you 720p on top of 1080p, 1440p, and 4K. 4K is pretty boring, but I think it's important for people to realize that the CPU matters less the lower your FPS is.


----------



## intrepid3d (Apr 22, 2018)

I have argued and would argue again that W1zzard's review, like all 'slide-only' reviews, is misleading.

His review tells the observer the i3 is as good or nearly as good as the i5s, even the i7s, and better than the 2700X in gaming, because it is constrained to a very narrow and unrealistic set of parameters.
Completely disembodied frame-rate numbers from an unknown testing methodology; for all we know, W1zzard could have spent his time benching while looking at walls in game. Certainly it is obvious from the similarity in results between the i3 and the i7 that he didn't do anything that would actually stress the CPU at all, which defeats the point of CPU performance testing.

The fact is, W1zzard is telling his readers the i3 is nearly as good for gaming as the i7 and much better than any Ryzen.

The truth is the i3 is a dreadful CPU to pair with a high-end GPU. An observer choosing the i3 because it's so cheap would actually end up with much lower performance than the lowly Ryzen 1600, yes, much lower performance in the complex game areas that actually matter to the CPU. What's more, the likelihood is they would also experience stutter in games, as I do with my i5 and GTX 1070 combo, because it just doesn't have enough threads to keep up with my GTX 1070, never mind a GTX 1080 or 1080 Ti.

Hardware reviews have moved on, and for good reason: a slide tells you absolutely nothing about the hardware other than what its author might want you to think, while a side-by-side run-through of any given game tells you everything. It's why reviews like that are taking over.


----------



## GoldenX (Apr 22, 2018)

A solid option for a new AM4 build. I like how the automatic clocking with turbo and XFR beats manual overclocking for games: you get better performance at the best power consumption, and you save the time spent testing stability.

Thanks to these Ryzen 2 chips, the Ryzen APUs are finally coming down in price here.


----------



## Shatun_Bear (Apr 23, 2018)

What I would like someone to explain with how CPUs are reviewed/tested:

*CPU tests* - most of these tests are produced under conditions that may be synthetic but are meant to represent real-world use.

But

*Gaming tests* - produced under conditions that are not realistic or are unlikely to be encountered by the user, i.e. benchmarks at 720p resolution, or even 1080p tests using a 1080 Ti.

It seems reviewers jump through hoops in gaming tests to find any difference between processors (in this case the 2700X and 8700K) that essentially offer identical gaming performance (the difference isn't noticeable). Of course, most of the CPU tests show performance gaps that are not noticeable either. But with gaming, the perception seems to be that the engineered performance gap is a deal-breaker, prompting enthusiasts the internet over to proclaim they're not buying Ryzen because of lower gaming performance. This is my issue.

I wish people would actually read the reviews and look at the settings used, and then comprehend that, actually, they're not running any kind of setup where the gaming performance is going to be appreciably less or more with either of the processors being reviewed.


----------



## Vayra86 (Apr 23, 2018)

intrepid3d said:


> I have and would argue again W1zzard's like all 'slide only' reviews are misleading.
> 
> His review tells the observer the i3 is as good or nearly as good as the i5s, even the i7s, and better than the 2700X in gaming, because it is constrained to a very narrow and unrealistic set of parameters.
> Completely disembodied frame-rate numbers from an unknown testing methodology; for all we know, W1zzard could have spent his time benching while looking at walls in game. Certainly it is obvious from the similarity in results between the i3 and the i7 that he didn't do anything that would actually stress the CPU at all, which defeats the point of CPU performance testing.
> ...



No, you just fail at reading comprehension, that is all.

Nobody is saying people should buy a high-end GPU. More specifically, for anything 60 Hz most CPUs do the job fine, so @W1zzard is quite right in that sense. The numbers are not misleading; they tell the simple truth, and it is up to each reader to decide whether or not the extra threads are needed. Thus far, for the gaming scenario on its own, 4 threads 'will cut it'. Is it optimal? Of course not. Similarly, Ryzen's extra cores benefit certain situations while higher clocks benefit others. What is lacking, though, is any kind of linear scaling with core count in gaming. Even today the best scaling occurs across 4 cores. Not 6, not 8. Four. What is NOT lacking is a clear advantage, especially at 60-120 FPS/Hz, from high CPU clocks, which is where neither a Ryzen nor an i3 will be sufficient.

So, really, the conclusion is spot on.


----------



## trparky (Apr 23, 2018)

Shatun_Bear said:


> *Gaming tests* - produced under conditions that are not realistic or unlikely to be encountered by the user i.e benches at 720p resolution, or even 1080p tests using a 1080 Ti.


Yeah, especially since you damn near need to win the lottery to buy a 1080 Ti. Most gamers aren't going to be playing with one unless they're loaded; at most they might be playing with a GTX 1060 or GTX 1070.


----------



## intrepid3d (Apr 23, 2018)

Vayra86 said:


> No, you just fail at reading comprehension, that is all.
> 
> Nobody is saying people should buy a high-end GPU. More specifically, for anything 60 Hz most CPUs do the job fine, so @W1zzard is quite right in that sense. The numbers are not misleading; they tell the simple truth, and it is up to each reader to decide whether or not the extra threads are needed. Thus far, for the gaming scenario on its own, 4 threads 'will cut it'. Is it optimal? Of course not. Similarly, Ryzen's extra cores benefit certain situations while higher clocks benefit others. What is lacking, though, is any kind of linear scaling with core count in gaming. Even today the best scaling occurs across 4 cores. Not 6, not 8. Four. What is NOT lacking is a clear advantage, especially at 60-120 FPS/Hz, from high CPU clocks, which is where neither a Ryzen nor an i3 will be sufficient.
> 
> So, really, the conclusion is spot on.



You're both wrong. Four cores, as I have demonstrated many times in this thread, are not "sufficient"; sufficient for a 1050 Ti perhaps, but not for anything much above that.

Look at the 7600K and the horrendous performance here: not just the fact that it gets half the frame rate of the Ryzen 1600, but look at the blue line; that's some pretty bad stutter.

Yet according to this review, none of that exists on 4-core CPUs; according to this review, they are at least as good as 8-core CPUs. I can pull up a bunch of reviews where you can clearly see stutter in games running on 4-core Intel CPUs, including my own videos where I experience it.

W1zzard's review doesn't demonstrate this. I don't know what he is doing to get these results, but whatever it is, it isn't finding the reality that is low performance and stuttering with 4-core CPUs on high-end GPUs. It's not good enough.



http://imgur.com/jdinejn


----------



## Melvis (Apr 23, 2018)

@Vayra86 

Just get this into your head when it comes to 720p gaming benchmarks:

They are still GAMING BENCHMARKS! No matter how you want to spin it, in the end they are still GAMING BENCHMARKS!

Again, show me one person that uses this res with those specs; so far no one can. Funny, that...


----------



## cucker tarlson (Apr 23, 2018)

Melvis said:


> @Vayra86
> 
> Just get this into your head when it comes to 720p gaming benchmarks.
> 
> ...


You want 1080p or higher? You can't handle the truth. 

My 2 cents, based on my thousands of hours spent gaming on a 3570K + 980 Ti and a 4790K + 1080: even at 1440p you'll find plenty of places across various games that are CPU-heavy. Plenty. It is impossible for a reviewer to recreate them all; considering they bench 10+ games, they'd spend hundreds of hours just looking for good testing spots. I know one guy from a Polish website who is responsible for GPU and CPU tests exclusively, and he always goes through the routine of finding them - look how different his results are from W1zzard's, or from the majority of other reviews for that matter.

https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,34
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,35
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,36
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,37
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,38
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,39
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,40
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,41
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,42
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,43
https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,44

Forget about the 3-10% that most sites usually show; this tells a completely different story. Now you may rightfully wonder how those results can be reproduced without all this time spent picking the right locations: 720p testing. It won't show you what FPS you'll be running at, say, 1440p when you find a CPU-heavy scene, but it will show how CPUs would cope in such a scenario.



intrepid3d said:


> You're both wrong. Four cores, as I have demonstrated many times in this thread, are not "sufficient"; sufficient for a 1050 Ti perhaps, but not for anything much above that.
> 
> Look at the 7600K and the horrendous performance here: not just the fact that it gets half the frame rate of the Ryzen 1600, but look at the blue line; that's some pretty bad stutter.
> 
> ...



I hear you, but you need to stop this.
You've been playing that same Crysis DF card over and over.
It does reflect what happens to 4c/4t sometimes, but the margins this benchmark shows are way higher than anything you'll ever see. Look at two different games that use multithreading very heavily: Crysis 3 and Watch Dogs 2.

This is what usually happens, a 15-20% difference:

https://www.purepc.pl/procesory/tes...e_i3_8350k_prawie_jak_core_i5_7600k?page=0,38
https://www.purepc.pl/karty_graficz..._s_creed_origins_problemy_w_egipcie?page=0,12

This is what happens in Crysis 3, a 50% difference:

https://www.purepc.pl/procesory/tes...e_i3_8350k_prawie_jak_core_i5_7600k?page=0,31

Crysis 3 is an oddball example.


----------



## Vayra86 (Apr 23, 2018)

intrepid3d said:


> You're both wrong. Four cores, as I have demonstrated many times in this thread, are not "sufficient"; sufficient for a 1050 Ti perhaps, but not for anything much above that.
> 
> Look at the 7600K and the horrendous performance here: not just the fact that it gets half the frame rate of the Ryzen 1600, but look at the blue line; that's some pretty bad stutter.
> 
> ...



Again you fail at reading comprehension...
That pic right there shows the 7600K pushing 62 FPS. You're linking sources that prove you wrong, but you still don't want to see it... Very odd.

I did say:



Vayra86 said:


> More specifically, for anything 60 Hz most CPUs do the job fine



So here we have a worst-case scenario that is not only that, but also an outlier in terms of what you usually see in games, and it still does not dip below 60. And even if it dips below 60 momentarily, it still doesn't make a world of difference for any gamer with a 60 Hz panel.

The odd truth is that a Ryzen at relatively low clocks is more likely to dip below 60 even with 6 or 8 cores, because most games still rely on a heavy main thread that is forced to run on a single core. In most games, *that* is the limiting factor, well before the number of cores becomes relevant.
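That single-heavy-thread limit can be sketched with a back-of-the-envelope Amdahl's-law model (the millisecond figures below are assumptions, not measurements): once the unsplittable main-thread work dominates, extra cores give diminishing returns, while higher per-core speed lifts every configuration.

```python
# Amdahl's-law sketch for a game frame: some work is serial (one main thread),
# the rest spreads across cores. All numbers are hypothetical.

def fps_limit(serial_ms, parallel_ms, cores, per_core_speed=1.0):
    # serial_ms: main-thread work that cannot be split across cores.
    # parallel_ms: work that divides evenly across the available cores.
    frame_ms = (serial_ms + parallel_ms / max(1, cores)) / per_core_speed
    return 1000.0 / frame_ms

# Hypothetical frame: 10 ms of unsplittable main-thread work, 24 ms parallel.
for cores in (4, 6, 8, 16):
    print(cores, "cores:", round(fps_limit(10.0, 24.0, cores), 1), "FPS cap")

# 20% more per-core speed lifts the cap for any core count:
print("4 faster cores:", round(fps_limit(10.0, 24.0, 4, per_core_speed=1.2), 1))
```

With these assumed numbers, going from 4 to 16 cores only raises the cap from 62.5 to 87 FPS because the 10 ms serial chunk never shrinks, which is the "heavy single thread" bottleneck described above.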



Melvis said:


> @Vayra86
> 
> Just get this into your head when it comes to 720p gaming benchmarks.
> 
> ...



Going all caps doesn't make gaming benchmarks in a CPU review about the game itself, and it doesn't change a thing about the relative performance gaps you see there. It's fine if you don't (want to) get it; just skip the 720p page and move right on to your preferred resolution to see which CPU fits your use case. No one is stopping you...


----------



## Shatun_Bear (Apr 23, 2018)

Vayra86 said:


> No, you just fail at reading comprehension, that is all.
> 
> Nobody is saying people should buy a high-end GPU. More specifically, for anything 60 Hz most CPUs do the job fine, so @W1zzard is quite right in that sense. The numbers are not misleading; they tell the simple truth, and it is up to each reader to decide whether or not the extra threads are needed. Thus far, for the gaming scenario on its own, 4 threads 'will cut it'. Is it optimal? Of course not. Similarly, Ryzen's extra cores benefit certain situations while higher clocks benefit others. What is lacking, though, is any kind of linear scaling with core count in gaming. Even today the best scaling occurs across 4 cores. Not 6, not 8. Four. What is NOT lacking is a clear advantage, especially at 60-120 FPS/Hz, from high CPU clocks, which is where neither a Ryzen nor an i3 will be sufficient.
> 
> So, really, the conclusion is spot on.



Actually, I would tend to agree with him.

My issue is that the final score of these reviews, and the perception formed from them, is overwhelmingly based (it seems) on the 'gaming performance', despite the fact that performance during _real-world_ use would be nigh on identical to the lauded 8700K, while the difference in CPU grunt would be noticeable for certain real-world tasks (rendering), because there the performance gap isn't artificially magnified by 100% (benching at 720p, or at 1080p with a 1080 Ti, for gaming).

It's like reviewing a new racing car that has more horsepower than a rival, but is slower by 10% when racing at low speeds in the rain (not real-world racing conditions), and the reviewer forming his conclusion largely on that, despite the car being faster on the road. This then leads everyone to a false conclusion about the car.

Reviews (not just TPU's) should include the caveat: 'differences in gaming performance are only visible under artificial conditions; in real-world scenarios and use, the performance gap would be non-existent.'


----------



## Vayra86 (Apr 23, 2018)

Shatun_Bear said:


> Actually, I would tend to agree with him.
> 
> My issue is that the final score of these reviews, and the perception formed from them, is overwhelmingly based (it seems) on the 'gaming performance', despite the fact that performance during _real-world_ use would be nigh on identical to the lauded 8700K, while the difference in CPU grunt would be noticeable for certain real-world tasks (rendering), because there the performance gap isn't artificially magnified by 100% (benching at 720p, or at 1080p with a 1080 Ti, for gaming).
> 
> ...



So, once again: reading comprehension. I'll quote it for you, straight out of the review:

*Game Tests: 720p*
On popular demand from comments over the past several CPU reviews, we are including game tests at 720p (1280x720 pixels) resolution. All games from our CPU test suite are put through 720p using a GTX 1080 graphics card and Ultra settings. _This low resolution serves to highlight theoretical CPU performance because games are extremely CPU-limited at this resolution. Of course, nobody buys a PC with a GTX 1080 to game at 720p,_ but the results _are of academic value because a CPU that can't do 144 frames per second at 720p will never reach that mark at higher resolutions either_. So these numbers could interest high refresh-rate gaming PC builders with fast 120 Hz and 144 Hz monitors. Our 720p tests hence serve as synthetic tests in that they are not real-world (_720p isn't a real-world PC-gaming resolution anymore_) even though the game tests themselves are not synthetic (they're real games, not 3D benchmarks).


It's not anyone else's fault but your own that you skip right to the relative performance summaries, skim over the review conclusion, and fail to read the rest and interpret the data properly. But please, don't bother us with your lack of attention and understanding. You have a YouTube full of crappy reviewers who can cater to your shallow demands.

Another option is that you might want to open your eyes a bit and learn something along the way.

Or translate all of what you're saying to 'real-world use cases'. How many hardcore gamers actually use that same CPU for heavy multithreaded workloads? I'd reckon about as many as there are gaming at 720p... For them, the relative performance summaries are just about as relevant as they are for anyone building a workstation. The only way out of this is creating a bench suite that covers the CPUs' limitations in _every possible way_, so that each reader can extract the data set most relevant to his or her personal use case.

Oh, and about gaming on that 8700K at 'normal settings', seen this?

https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,19

Tell me again you don't want the fattest CPU money can buy when all you care about is FPS. Tell me again cores matter more than clocks in any gaming scenario... We have a 7700K and an 8600K literally spitting out the same FPS as 8c/16t Ryzens at the *same clock*, and the ONLY CPU that pushes over 40 (!) minimum FPS is the one that can push 4.7 GHz on a single core.


----------



## Readlight (Apr 23, 2018)

Can you make a review of the i5 8500 (190€) and the Pentium Gold G5500 (91€) vs the Ryzen 2600?


----------



## intrepid3d (Apr 23, 2018)

@Vayra86, you're the one failing to grasp what we are talking about; you're talking about something entirely different from the rest of us. You're talking about resolution; no one is disputing that using a low resolution is a good idea, because it helps stop the GPU from becoming the bottleneck. It's actually really obvious: what we are talking about is higher-stress, complex scenes in games that cause low performance and sometimes stutter on the 4-core Intel CPUs.

I don't know how many times I have to explain this to you over and over again... If you benchmark your games using scenes that have little to no work for the CPU to do, you can make a 4-core look as fast as or faster than an 8-core; if you are looking at a wall, there is nothing for the CPU to do, and with that a low-threaded, high-clocked CPU will do better than a low-clock-speed, high-core-count CPU.
Take the same CPUs and look at an explosion, or even just a scene with long draw distances, a lot of shading and lighting, physics... and the 4-core i3 will be running at 100%, causing stutter and low performance, while the 6- or 8-core lower-clocked CPUs will have much higher performance and still be smooth.

W1zzard utterly failed to demonstrate that fact. His slides purport that the i3 is not only close to as good as the Intel i7 but also much better than the 2700X, when the real truth is that even the Ryzen 1600 is a far better gaming CPU, including at very low resolution.


----------



## Charcharo (Apr 23, 2018)

lynx29 said:


> https://www.tweaktown.com/reviews/8602/amd-ryzen-7-2700x-5-2600x-review/index9.html
> 
> another review showing 8700k stock beating ryzen 2700x stock at 1440p by 20 fps in multiple games.
> 
> ...




Include the source.



http://imgur.com/a/oTNrz


Anyways, the reason for some of these discrepancies is not Ryzen but the Nvidia driver. How do I know this? I tested the exact same scenes that Jester/Boredgunner tested (we are friends) on a Ryzen 5 1500X and a Fury. It got higher FPS than his 1700X OCed with a 1080 Ti. Let that sink in. 

I honestly do not know how or when this will be fixed, but Maxwell and Pascal + Ryzen is sometimes (not always!) a bad combination.

As for the person who used purepc.pl as an argument... those guys are definitely not testing games to find the most stressful parts. I do not know how or what they are doing, but I suspect it is not super competent. 

As for the discussion about settings: Ultra settings increase CPU load, but by how much depends on the game, scene, and engine. IMHO, testing at low resolution with Ultra settings is a valid CPU benchmark. Testing at low resolution and low settings is valid only for e-sports titles, and even then a difference between 400 and 500 FPS is not going to be noticed by 99.99% of the people who read benchmarks. They are not good enough players for it to matter to them.


----------



## intrepid3d (Apr 23, 2018)

Something else about that slide ^^^^^: it is massively magnified. A 6% difference is made to look like a massive, more-than-2x performance win for the 8700K.

I can't take people who magnify slides like that seriously.
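For anyone curious how a truncated axis produces that effect, here is a quick sketch (the FPS numbers below are invented): starting the axis near the shorter bar inflates the drawn ratio far beyond the true one.

```python
# Sketch of truncated-axis magnification: the drawn bar lengths are measured
# from the axis start, not from zero. Numbers are made up for illustration.

def visual_ratio(a_fps, b_fps, axis_start):
    # Ratio of drawn bar lengths when the y-axis starts at axis_start.
    return (a_fps - axis_start) / (b_fps - axis_start)

a, b = 106.0, 100.0  # a real ~6% FPS difference
print("true ratio:      ", round(a / b, 2))                 # 1.06
print("axis from 0:     ", round(visual_ratio(a, b, 0), 2))
print("axis from 95 FPS:", round(visual_ratio(a, b, 95), 2))
```

With the axis starting at 95 FPS, the 6% gap is drawn as a 2.2x longer bar, which is the kind of magnification being complained about here.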


----------



## Charcharo (Apr 23, 2018)

intrepid3d said:


> Something else about that slide ^^^^^: it is massively magnified. A 6% difference is made to look like a massive, more-than-2x performance win for the 8700K.
> 
> I can't take people who magnify slides like that seriously.



All the tests are serious. His slide skills are irrelevant. BTW, there are more slides than just that one, just FYI.


----------



## Vayra86 (Apr 23, 2018)

intrepid3d said:


> @Vayra86 you're the one failing to grasp what we are talking about, you're talking about something entirely different to what the rest of us are, you're talking about resolution, no one is disputing using low resolution is a good idea because it helps stop the GPU from becoming the bottleneck, its actually really obvious what we are talking about higher stress complex scenes in games that cause low performance and sometimes stutter on the 4 core Intel CPU's.



ERRRN wrong. The games that stutter are limited mostly by a single core performance bottleneck, not a lack of cores. A single core can carry multiple game threads but you cannot split a single game thread across multiple cores ad infinitum.

So adding cores does only one thing, and that is providing more headroom for the heaviest game thread that is forced to that one core. And if a game hammers four cores with significant loads, then yes, anything you run in the background will influence ingame performance. But if all you run is an OS and some lightweight stuff alongside it, four cores are sufficient for the vast majority of games. Exceptions exist, but don't make the rule.

I will agree with you straight away that a hexacore is a sensible purchase for a gaming machine, but it is simply not true that a quad core by definition means you get stutter. The people experiencing stutter on quad cores today are mostly using DDR3-based rigs of Haswell or older - myself included not too long ago. Most CPU-related stutter is actually attributable to RAM in equal or greater measure than to core count. Core counts are rather irrelevant; as long as you have performance left on those cores, it's just fine.

Another important bit of info you keep omitting is the fact that all of these tests are about >60 FPS, not sub-60 FPS, i.e., the realm of Vsync, adaptive sync, Fast Sync, or frame capping. There are tons of ways to avoid stutter at this point without noticeably degrading the experience. When you speak of sub-60 FPS performance, the balance shifts rapidly towards clocks and *sufficient* core counts, and games show no scaling whatsoever beyond a 6-core CPU and only marginal scaling beyond 4-core CPUs.
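
To illustrate the point about the heaviest game thread setting the floor, here's a toy model - the per-thread workloads below are invented numbers, not measurements from any game:

```python
# Toy model: a frame finishes when the busiest core finishes, so the heaviest
# single game thread sets the frame-time floor no matter how many cores you add.
# The per-thread workloads below are invented numbers, not measurements.

def frame_time_ms(thread_work_ms, cores):
    """Greedy longest-first assignment of game threads to cores;
    returns the finish time (ms) of the busiest core."""
    loads = [0.0] * cores
    for work in sorted(thread_work_ms, reverse=True):
        loads[loads.index(min(loads))] += work  # give work to the least-busy core
    return max(loads)

# One heavy render thread (12 ms) plus lighter helper threads:
game = [12.0, 5.0, 4.0, 3.0, 2.0, 2.0]

for n in (2, 4, 6, 8):
    print(n, "cores ->", frame_time_ms(game, n), "ms")
# Past 4 cores nothing improves: the 12 ms thread is the floor.
```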


----------



## Charcharo (Apr 23, 2018)

Vayra86 said:


> ERRRN wrong. The games that stutter are limited mostly by a single core performance bottleneck, not a lack of cores. A single core can carry multiple game threads but you cannot split a single game thread across multiple cores ad infinitum.
> 
> So adding cores does only one thing, and that is providing more headroom for the heaviest game thread that is forced to that one core. And if a game hammers four cores with significant loads, then yes, anything you run in the background will influence ingame performance. But if all you run is an OS and some lightweight stuff alongside it, four cores are sufficient for the vast majority of games. Exceptions exist, but don't make the rule.
> 
> ...



Funny thing is that some engines, like the 4A Engine or id Tech 6 (and likely 7), can saturate 12+ threads under the right conditions. That is how I tested my old i5 4570 vs my Ryzen 5 1500X. In some places, the 1500X was almost 50% faster than the 4-core, 4-thread i5.

If only we had PC Gamers demanding more simulation aspects from their titles . Imagine A-Life 2.0 on a multi-core engine with improved ballistic calculations and armor simulation... that is what lets me sleep at night ...


----------



## intrepid3d (Apr 23, 2018)

Vayra86 said:


> ERRRN wrong. The games that stutter are limited mostly by a single core performance bottleneck, not a lack of cores. A single core can carry multiple game threads but you cannot split a single game thread across multiple cores ad infinitum.
> 
> So adding cores does only one thing, and that is providing more headroom for the heaviest game thread that is forced to that one core. And if a game hammers four cores with significant loads, then yes, anything you run in the background will influence ingame performance. But if all you run is an OS and some lightweight stuff alongside it, four cores are sufficient for the vast majority of games. Exceptions exist, but don't make the rule.
> 
> ...



You say this right after ignoring the 4-core Skylake DDR4 stutter examples; they contradict your argument, so you ignore them, pretend they don't exist, and make the argument anyway.

Your V-Sync arguments are also completely bizarre; again, you're making blanket statements that fly in the face of the facts and basic common sense. These tests are not about 60 FPS. Good grief, don't you think that if these tests had been about V-Sync, they would have actually used V-Sync? And what is the point of CPU performance testing with V-Sync on? That's complete and utter madness.

A 4-core CPU is good as long as you cap your frame rates to 60 FPS? I have a better one for you: if you're going to cap your frame rates to 60, don't bother with a GTX 1080/Ti; get a GTX 1050 instead and you will have exactly the same performance. If that is your answer, then why are you so defensive of this review? V-Sync was off in all of these tests, as it should be.


----------



## trparky (Apr 23, 2018)

Um... @intrepid3d sounds like your typical Intel fanboy.


----------



## intrepid3d (Apr 23, 2018)

A couple more examples of bad Intel 4-core performance that you would never know about just by looking at slides.


----------



## trparky (Apr 23, 2018)

Oy... this guy is going on my ignore list because he's spamming my alerts list.


----------



## cucker tarlson (Apr 23, 2018)

intrepid3d said:


> A couple more examples of bad Intel 4 core performance that you would never know about just looking at slides.
> 
> View attachment 100146
> View attachment 100147
> ...


Are you kidding me? Those are momentary screenshots. Show us the full video; if there's stutter, we should be able to notice at least some of it.
You're proving a point which is valid, but I gotta say the case you're making for it is pretty bad: you're not even basing it on whole benchmark runs, you're basing it on print screens. No wonder people ignore you. You don't read, and you make your case with a passive-aggressive tone more than anything.

I admit my perspective is not very different from yours, but I ran a 3570K with a 980 Ti OC and a 4790K with a 1080 OC and spent hundreds if not thousands of hours playing; I speak from my experience. I don't even know whether watching a video of a CPU bottleneck can really capture what you experience live and how the controls feel in real time - probably not. What I can tell you is that I sure as shit can't see stutter in a print-screen image.


----------



## Vayra86 (Apr 23, 2018)

trparky said:


> Oy... this guy is going on my ignore list because he's spamming my alerts list.



That's not my style, but I get ya. The plank of wood is too thick to get through here and I'm done. Here we have a guy ignoring half my posts, misinterpreting the rest, and then convincing himself he's right. Meanwhile, all of the screenshots above show >60 FPS game performance running uncapped FPS, which supports my statements fully, while no one in any sort of scenario would ever use an i3 OR a Ryzen CPU, because both fall off in the high-refresh scenarios where it counts - which is what the argument started with. It would be hilarious if it wasn't such a sad display of lacking intelligence.

All has been said on the topic, I'm done. When in doubt, re-read and think hard 

One last hint: look at the last screenshot, where a 12-core CPU 'gets 12-thread action' but only manages to push out ~40 more FPS at 84% utilization, while the quad-core CPU is 100% loaded. Both CPUs would offer the exact same gaming performance (remarkably, even on the fastest monitors money can buy right now - everything is over 240 FPS), with the 12-thread CPU being grossly inefficient doing so.

Or take the screenshot of the Ryzen 1700 vs the 7700K, with IDENTICAL FPS regardless of the additional core count.

Circling back to what I started with: reading comprehension.



intrepid3d said:


> You say this right after ignoring 4 core Skylake DDR4 stutter examples, it contradicts your argument, ignore it, pretend it doesn't exist and make the argument anyway.
> 
> Your V-Sync arguments are also completely bizarre, again your making blanket statements that fly in the face of the facts and basic common sense, these tests are not about 60 FPS, good grief don't you think that if these tests would have been about V-Sync they would have actually used V-Sync? with that what is the point in CPU performance testing with V-Sync on? that's complete and utter madness.
> 
> A 4 core CPU is good as long as you cap your Frame Rates to 60 FPS? i have a better one for you, if you're going to cap your Frame Rates to 60 don't bother with a GTX 1080/TI get a GTX 1050 instead you will have exactly the same performance, if that is your answer then why are you so defensive of this review, V-Sync was off in all of them, as it should be.



Stutter can occur due to high frame-rate variance, so in real-world scenarios a frame cap serves as an effective way to remove stutter caused by huge variations in frame time. Other options are available, and almost every gamer would use one of them to improve the in-game experience. For high-refresh-rate gaming, such a method is Fast Sync + uncapped FPS, which is also extremely effective at removing stutter while maintaining optimal button-to-pixel latency. The other cause of stutter in games is RAM or VRAM getting saturated in capacity or bandwidth, a type of stutter that manifests mostly _below 60 FPS and which Haswell and older quad cores fall prey to._
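
To make the variance point concrete, here's a quick sketch with made-up numbers: a run with a high average FPS but occasional 50 ms spikes feels worse than a steady capped run:

```python
# Sketch of the variance argument with made-up numbers: a run with a high
# average FPS but occasional spikes feels worse than a steady capped run.

def frame_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in ms."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return 1000.0 / avg_ms, 1000.0 / worst_ms

# ~100 FPS on average, but every 20th frame spikes to 50 ms:
spiky = [8.0] * 95 + [50.0] * 5
# Capped to a steady 60 FPS (16.7 ms every frame):
capped = [16.7] * 100

print(frame_stats(spiky))   # high average FPS, terrible 1% lows -> perceived stutter
print(frame_stats(capped))  # lower average FPS, perfectly consistent delivery
```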

About the GPU choice: you *nearly got the point* there, but then you went off in the wrong direction once more. Nobody is saying 'pair an i3 with a 1080 Ti'. You're right that if 60 FPS is your goal - which it is for a good 75-85% of the gaming crowd, because they have a 60 Hz monitor - all of these CPUs perform 100% the same, and ONLY the CPUs with the highest clock speeds guarantee that you will never, or almost never (outliers still exist, even with a 5 GHz 8700K; speaking from experience, not a YouTube video), dive below 60 FPS.

So, there is a good CPU for each user and use case. For the vast majority gaming at 60 FPS/Hz, Ryzen or an i3 will net similar results, with the Ryzen 6c/12t offering more headroom than the i3 at the *same clock*, but the i3-K will provide more headroom once you OC it, which is where Ryzen can still fall short in more games than the i3 would due to the lack of cores. And for the high end, Ryzen is again a fine choice, but now we're talking about 'what is the _optimal choice_', and that is where Intel wins the day due to the clocks on their 6-core offerings.

So - that REALLY was my last attempt.


----------



## cucker tarlson (Apr 23, 2018)

Vayra86 said:


> That's not my style, but I get ya. The plank of wood is too thick to get through here and I'm done. Here we have a guy ignoring half my posts, misinterpreting the rest and then convincing himself he's right. Meanwhile, all of the screenshots above show > 60 FPS game performance which supports my statements fully, running uncapped FPS while no one in any sort of scenario would ever use an i3 OR a Ryzen CPU because both fall off in high refresh scenario's where it counts, which is what the argument started with. It would be hilarious if it wasn't such a sad display of lacking intelligence.


I feel sad for him. He actually has a hunch about cpu bottleneck, but he's struggling to prove it sooooo badly. I've been trying to help him but I doubt more than 10% of what I write gets through.



Charcharo said:


> As for the person who used purepc.pl as an argument... those guys are definitely not testing games to find the most stresful parts. I do not know how or what they are doing, but I suspect it is not super competent.


Care to explain how you came to that conclusion? You don't know, but you suspect that what they're doing is not actually what they say they're doing - in fact, you think they rather don't fully know what they're doing. That doesn't seem like a solid case for disproving anything to me. You're not coming across as super competent either with remarks like this.


----------



## Charcharo (Apr 23, 2018)

Percent utilization is not a perfect representation of CPU resources guys. You are forgetting that. All of you  !
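
A toy illustration of why (the numbers and the simple busy-threads model are invented for the sake of argument):

```python
# Toy illustration: the same workload reads as a very different "utilization %"
# depending on how many hardware threads the CPU has. Numbers are invented.

def reported_utilization(busy_threads, total_threads):
    """Task-manager-style utilization: busy threads / total hardware threads."""
    return 100.0 * busy_threads / total_threads

# A game that saturates 1 core and lightly loads 3 more looks "maxed out" on a
# 4-thread CPU and looks like it has "plenty of headroom" on a 12-thread CPU:
print(reported_utilization(4, 4))   # 100.0
print(reported_utilization(4, 12))  # ~33.3
# Either way, frame rate is set by the one saturated thread, not by the percentage.
```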


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> Are you kidding me ? Those are momentary screen shots. Show us full video - if there's stutter we should be able to noticeat least some of it.
> You're proving a point which is valid, but I gotta say the case you're making for it is pretty bad, you're not even basing it on whole benchmark runs, you're basing it on print screens. No wonder people ignore you. You don't read and make your case on passive agressive tone more than anything.
> 
> I admit my perspective is not very different from yours, but I ran 3570K with 980Ti OC and 4790K with 1080 OC, spent hundreds if not thousands of hours playing, I speak from my expereince. I don't even know whether watching a video showing a CPU bottleneck is able to really capture what you experience live and how the controls feel in real time, probably not, what I can tell you is that I sure as shit can't see stutter in a print screen image.



It's not so much about stutter; the 4-core CPU is very heavily loaded, and as a result the performance is simply better on the Ryzen 1600X. Can you not see that in the screenshots? In Metro LL, for example, the performance is around 20 to 40% higher on the 1600X.


----------



## cucker tarlson (Apr 23, 2018)

Sorry, but this is Joker. The guy who claimed +20% performance on Vega with the Fall Win 10 update. And he MEASURED it. I mean, how can a YouTube video lie about numbers? It says right in the title he's unbiased


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> Sorry, but this is Joker. The guy who claimed +20% performance on Vega with the Fall Win 10 update. And he MEASURED it. I mean, how can a YouTube video lie about numbers? It says right in the title he's unbiased



No he didn't. You've just made that up.

You've just had a go at me for not providing proof; now it's your turn.


----------



## cucker tarlson (Apr 23, 2018)

Lol, don't make me pull out a thread I made based on his video and make me look exactly like the sort of fool that you're making of yourself now.

Okay, challenge accepted.


https://www.techpowerup.com/forums/...-performance-improvement-in-game-mode.238018/

He pulled the video after being called out on his crap. But he's a sensationalist more than anything. He likes clicks more than he likes work ethic; that's not hard to tell.


----------



## trparky (Apr 23, 2018)

cucker tarlson said:


> Sorry, but this is joker.


He's acting like your typical hard-headed Intel fanboy. Nothing you say will get through his concrete skull.


----------



## Vayra86 (Apr 23, 2018)

intrepid3d said:


> No he didn't. You've just made that up.
> 
> You've just had a go at me for not providing proof; now it's your turn.



You need to realize this is not a discussion; it's us trying to teach you something about gaming performance, reviews, and reading the numbers right. You need to change your stance, and as long as you don't, you will never gain the understanding that's required here.

Also, this doesn't work in the opposite direction: you don't need to show or tell me anything. I've seen it all - not from a YouTube idiot, but from first-hand experience with the actual hardware.



Charcharo said:


> Percent utilization is not a perfect representation of CPU resources guys. You are forgetting that. All of you  !



BINGO. +10 pts


----------



## cucker tarlson (Apr 23, 2018)

trparky said:


> He's acting like your typical hard-headed Intel fanboy.


I don't know what he is at this point, but he certainly stole the whole show from any CPU in the discussion, be it Intel or AMD.


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> Lol, don't make me pull out a thread I made based on his video and make me look exactly like the sort of fool that you're making of yourself.
> 
> Okay, challenge accepted.
> 
> ...



All I see in that link is you making a claim about 20% higher performance in games with Nvidia and AMD GPUs, with an unknown removed video link.

There is no proof there.


----------



## trparky (Apr 23, 2018)

Vayra86 said:


> You need to realize this is not a discussion


Who? Me?


----------



## cucker tarlson (Apr 23, 2018)

intrepid3d said:


> All I see in that link is you making a claim about 20% higher performance in games, with an unknown removed video link.
> 
> There is no proof there.


Read my whole post, God dammit! And read that thread.


----------



## Vayra86 (Apr 23, 2018)

trparky said:


> Who? Me?



Nope, the user you ignored, which is now causing you to be confused  This is why the ignore feature is quite the nuisance


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> Read my whole post God dammit ! And read that thread.



I read your whole post; there are about 3 lines in it, and none of it says anything at all about JP.

So now you want me to find the proof in a 4-page thread myself?

You made it up. The burden of proof is not with me, it's with you; you prove it to me.


----------



## cucker tarlson (Apr 23, 2018)

Charcharo said:


> Percent utilization is not a perfect representation of CPU resources guys. You are forgetting that. All of you  !


And you're forgetting that print screens of a video are as much proof of stuttering as this is print-screen proof of telekinesis.

https://pbs.twimg.com/media/DbewNhjW0AEVkVi?format=jpg



intrepid3d said:


> I read your whole post; there are about 3 lines in it, and none of it says anything at all about JP.
> 
> So now you want me to find the proof in a 4-page thread myself?
> 
> You made it up. The burden of proof is not with me, it's with you; you prove it to me.



I can't prove anything to a person who either can't read or is too lazy to.
Good thing I don't really have to.


----------



## Shatun_Bear (Apr 23, 2018)

Vayra86 said:


> So, once again, reading comprehension, I'll quote it for you, straight out of the review
> 
> *Game Tests: 720p*
> On popular demand from comments over the past several CPU reviews, we are including game tests at 720p (1280x720 pixels) resolution. All games from our CPU test suite are put through 720p using a GTX 1080 graphics card and Ultra settings. _This low resolution serves to highlight theoretical CPU performance because games are extremely CPU-limited at this resolution. Of course, nobody buys a PC with a GTX 1080 to game at 720p,_ but the results _are of academic value because a CPU that can't do 144 frames per second at 720p will never reach that mark at higher resolutions either_. So these numbers could interest high refresh-rate gaming PC builders with fast 120 Hz and 144 Hz monitors. Our 720p tests hence serve as synthetic tests in that they are not real-world (_720p isn't a real-world PC-gaming resolution anymore_) even though the game tests themselves are not synthetic (they're real games, not 3D benchmarks).
> ...



Do us all a favour and calm down. You're embarrassing yourself. You've now made it clear to everyone that you're upset this CPU is closer to CL than you had hoped. Have you just bought yourself an 8700K and are indignant that the gaming gap is down to only 10% in the worst-case scenario for the 2700X?

Also, reading comprehension: you're so fervent over this that you forgot to use your own eyes to read the conclusion, which is what we're meant to be talking about here:

_"Nevertheless, if you are a 1080p gamer looking to drive your monitor at 144 Hz, Intel is the way to go since the Ryzen 7 2700X won't be able to provide framerates nearly as high."_

Lastly, you trawled the internet to find some cherry-picked benches where the gaming performance gap is bigger. Well done; that's not going to convince anyone of anything, and it's rather juvenile. Why would you not use TPU's very own performance summary at 1440p? Because you're in the middle of a tantrum? This is more 'real-world', and it's the most comprehensive written bench suite out there. It shows gaming at that res with a 1080 has a gap of 3.4%!! So sit and stew on that reality if you must:


----------



## cucker tarlson (Apr 23, 2018)

Lol, because there are other reviews than TPU's; they exist. It seems you're much more desperate than he is to tell us not to use them. Two different results from two different tests don't even have to exclude one another - are you kidding me?

Vayra is spot on in saying people need to learn to interpret benches and numbers, and you're proof.


----------



## Vayra86 (Apr 23, 2018)

Shatun_Bear said:


> Do us all a favour and calm down. You're embarrassing yourself. You've now made it clear to everyone you're upset that this CPU is closer to CL than you had hoped. Have you just bought yourself an 8700K and are indignant the gaming gap is down to only 10% at worse case scenario for the 2700X?
> 
> Also, reading comprehension. You're so fervent over this you forgot to use your own eyes to read the conclusion, which is what we're meant to be talking about here:
> 
> ...



With the minor detail that I am not the one disagreeing with this review at all, nor its conclusion - I'm defending it.

The detail I do not agree on is the difference between an OK CPU for gaming and the optimal one, and the statement that an i3 won't do the job just fine as well, even being a quad core. I will say it again because you obviously missed it: Ryzen is fine for the vast majority of scenarios. And currently the 8600K and 8700K are the optimal choice when it comes to a CPU that has the highest performance in *any* gaming scenario. That's remarkably similar to that italic line you put in there. We came upon this topic because of high-refresh-rate gaming - this is where high clocks matter, and it is also the one scenario that Ryzen still has trouble covering well. I did say something earlier about relying purely on a relative performance summary and how it can blind you to these details. You just confirmed it right there... And you even managed to link the wrong summary, because it's not 1080p or 720p, which you emphasized, but 1440p - thus a GPU-limited scenario.

Also, tantrum? Nah mate. I'm trying to show people the things they missed.


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> And you're forgetting that print screens of a video are as much of a proof of stuttering as this is a print screen proof of telekinesis.
> 
> https://pbs.twimg.com/media/DbewNhjW0AEVkVi?format=jpg
> 
> ...



You don't know how to read frame-time lines, do you?

When someone sends you a screenshot with a frame-time line in it and it looks like a 4-year-old's doodling, that's a graphical or visual representation of stutter; it's very common to use such graphs to illustrate stutter in games.


----------



## cucker tarlson (Apr 23, 2018)

intrepid3d said:


> You don't know how to read frame-time lines, do you?
> 
> When someone sends you a screenshot with a frame-time line in it and it looks like a 4-year-old's doodling, that's a graphical or visual representation of stutter; it's very common to use such graphs to illustrate stutter in games.


The frametimes are the numbers measured in "ms", kiddo.


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> The frametimes are the numbers measured in "ms", kiddo.



Oh wow, you have no idea... Yes, it's a measure of the frames rendered over a given time interval, in this case in ms. If the line is straight and smooth, the frame times are consistent.
If the graph line is up and down like a bad heart-monitor readout, it's because the frame times are not consistent, and when the frame times are not consistent it manifests as stutter.
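
For what it's worth, the "flat line vs. doodle" idea can be put into numbers. One rough way - a hypothetical metric, not how any particular overlay computes it - is the average frame-to-frame change in frame time:

```python
# Hypothetical metric for the "flat line vs. doodle" idea: the average
# frame-to-frame change in frame time. Not how any particular overlay does it;
# the sample numbers are invented.

def jitter_ms(frame_times_ms):
    """Mean absolute change between consecutive frame times (ms).
    Near 0 -> a flat line (smooth); large -> a 'heart monitor' line (stutter)."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

smooth = [16.7, 16.6, 16.8, 16.7, 16.7, 16.6]    # steady ~60 FPS
stutter = [10.0, 35.0, 9.0, 40.0, 11.0, 33.0]    # similar average, wild swings

print(jitter_ms(smooth))   # small: consistent frame delivery
print(jitter_ms(stutter))  # large: visible hitching despite a decent average FPS
```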


----------



## cucker tarlson (Apr 23, 2018)

intrepid3d said:


> Oh wow, you have no idea... Yes, it's a measure of the frames rendered over a given time interval, in this case in ms. If the line is straight and smooth, the frame times are consistent.
> If the graph line is up and down like a bad heart-monitor readout, it's because the frame times are not consistent, and when the frame times are not consistent it manifests as stutter.


There's no graph; you posted static screenshots with no frametime indication other than, once again, a static number - which is fine, by the way; actually even better on Intel than AMD.

Lol, you even highlighted that static frametime value in red!


----------



## Vayra86 (Apr 23, 2018)

cucker tarlson said:


> There's no graph; you posted static screenshots with no frametime indication other than, once again, a static number - which is fine, by the way; actually even better on Intel than AMD.



Thick plank is really quite thick.


----------



## cucker tarlson (Apr 23, 2018)

Vayra86 said:


> Thick plank is really quite thick.


Lol, he highlighted that static frametime value in red and now calls me out for pointing out that it is a static shot, not a graph.


----------



## Charcharo (Apr 23, 2018)

cucker tarlson said:


> And you're forgetting that print screens of a video are as much of a proof of stuttering as this is a print screen proof of telekinesis.
> 
> https://pbs.twimg.com/media/DbewNhjW0AEVkVi?format=jpg
> 
> ...



Why tell me that though? I know how it works  I was on a 4c/4t i5 till early this year. 
Stutter did happen at times, but I do think 4c/4t still has use in low end machines. For now. There is nuance to hardware, guys.


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> There's no graph; you posted static screenshots with no frametime indication other than, once again, a static number - which is fine, by the way; actually even better on Intel than AMD.
> 
> Lol, you even highlighted that static frametime value in red!



I have highlighted the frame times for you in this one, in red now too; the blue line is Intel. That is not just some stutter, that is horrendous.



http://imgur.com/CHmOGdu


----------



## cucker tarlson (Apr 23, 2018)

Charcharo said:


> Why tell me that though? I know how it works  I was on a 4c/4t i5 till early this year.
> Stutter did happen at times, but I do think 4c/4t still has use in low end machines. For now. There is nuance to hardware, guys.


That was just banter.
But now I don't know what the point was of calling crap on purepc. You think a GTX 1080 at 1080p is a low-end machine?



intrepid3d said:


> I have highlighted the frame times for you in this one, in red now too; the blue line is Intel. That is not just some stutter, that is horrendous.
> 
> 
> 
> http://imgur.com/CHmOGdu


Now I know what the problem is.
You are communicating on a different wavelength than other people, aren't you?


----------



## intrepid3d (Apr 23, 2018)

cucker tarlson said:


> That was just banter.
> But now I don't know what the point was of calling crap on purepc. You think a GTX 1080 at 1080p is a low-end machine?
> 
> 
> ...



lol, well at least you have a sense of humor


----------



## cucker tarlson (Apr 23, 2018)

Dude, I told you to lay off that Crysis 3 shot that you've been beating to death for the last couple of days. You're going nowhere with it.
It's not like you have to convince us NOW. Take some time to build your case and get back to us when you're ready. Okay?

Are you trying to explain something to us, or do you want to ask us to explain something to you? Because I am not sure at this point. You can do the latter if you want, you know.


----------



## Charcharo (Apr 23, 2018)

cucker tarlson said:


> That was just banter.
> But now I don't know what the point was of calling crap on purepc. You think a GTX 1080 at 1080p is a low-end machine?
> 
> 
> ...



Egh I just ... do not think they do a good job of finding those most stressful parts of a game.  At least the ones you can use to stress a CPU/GPU.


----------



## Vayra86 (Apr 23, 2018)

Charcharo said:


> Egh I just ... do not think they do a good job of finding those most stressful parts of a game.  At least the ones you can use to stress a CPU/GPU.



As to your comment on the 4c/4t, what CPU was it exactly?

As for PurePC, even though I can't read Polish for shit, their choice of benchmark scenes is ridiculously good - arguably the best on the net right now. Their CPU reviews tell me more than any others I can find.


----------



## cucker tarlson (Apr 23, 2018)

Charcharo said:


> Egh I just ... do not think they do a good job of finding those most stressful parts of a game.  At least the ones you can use to stress a CPU/GPU.


Well, I don't know if I understand you correctly... CPU/GPU? The whole point of a CPU test in games is to find a place that is, and I can't emphasize it enough, NOT a GPU-heavy place.
They're doing a good job on that.

On another note, but somehow related to this, I can't understand people like Shatun Bear, who insist you only look at TPU benchmarks when you're on TPU. No one forces you to rely on picking one review only. Not even W1zzard, who made it! Reviews of CPUs don't have to exclude each other even if they show different results, and that is pretty friggin obvious.


----------



## Charcharo (Apr 23, 2018)

cucker tarlson said:


> Well, I don't know if I understand you correctly... CPU/GPU? The whole point of a CPU test in games is to find a place that is, and I can't emphasize it enough, NOT a GPU-heavy place.
> They're doing a good job on that.


Exactly what I am talking about.




Vayra86 said:


> As to your comment on the 4c/4t, what CPU was it exactly?



I had an i5 4460 and an i5 4570 - basically the same thing, to be frank. I know modern Skylake/Coffee/Kaby ones are faster, but not much faster. It is pretty analogous to a low-end Coffee i3 or a 7400.


----------



## intrepid3d (Apr 23, 2018)

Charcharo said:


> Egh I just ... do not think they do a good job of finding those most stressful parts of a game.  At least the ones you can use to stress a CPU/GPU.



Same. I certainly think that, like most reviewers, TPU could do a lot better. If lone YouTubers can do it and pick up on where 4-core CPUs are saturated... and a whole lot more besides, then I don't see how an establishment like this can knock out such poorly defined information; to some extent it's even misleading.

These lone YouTubers are taking over, and you can see why.

I'm off for some dinner


----------



## Vayra86 (Apr 23, 2018)

Charcharo said:


> Exactly what I am talking about.
> 
> 
> 
> ...



Yeah, see. I had an i5 3570K, and the experience is precisely what you've described for 4c/4t. It's really not so much about it being a quad; it's the DDR3 and the platform. Skylake with DDR4 showed a huge jump back in the day, and this is often ignored. Even RAM OCs on Haswell and earlier are rewarding these days for gaming.


----------



## Charcharo (Apr 23, 2018)

Vayra86 said:


> Yeah, see. I had an i5 3570K, and the experience is precisely what you've described for 4c/4t. It's really not so much about it being a quad; it's the DDR3 and the platform. Skylake with DDR4 showed a huge jump back in the day, and this is often ignored. Even RAM OCs on Haswell and earlier are rewarding these days for gaming.



You have to remember that, whilst DDR4 would for sure help, lower-end mobos and i5s cannot really OC memory like K parts can. But I agree it would be better for sure. Just not 1500X-better (in this context).


----------



## cucker tarlson (Apr 23, 2018)

Charcharo said:


> Exactly what I am talking about.
> 
> 
> 
> ...


They show a lot of gains from a faster CPU as well as fast RAM. How exactly are they managing that while picking poor CPU testing spots at the same time?



intrepid3d said:


> I'm off for some dinner



Just don't stutter while eating, you'll choke, and we don't want that.


----------



## Charcharo (Apr 23, 2018)

cucker tarlson said:


> They show a lot of gains from faster CPU as well as fast RAM. How exactly are they managing it while picking poor CPU testing places at the same time ?


To be 100% certain I would have to test and compare against their results. For example, CPU testing in Metro is best done in the AI Arena with PhysX on (if Redux, as Redux uses multithreaded PhysX). Spawn humans x4 on both sides and watch the carnage as the two forces fight. Reset the level every time so the destructible cover resets too. That is one hell of a CPU workout, IMHO, as the actual arena is light on GPU resources. Best tested with a higher-end AMD GPU, of course, so PhysX runs entirely on the CPU.


----------



## trparky (Apr 23, 2018)

Can I ask a dumb question here? What clock speed was the 8700K running at when many of these tests were being conducted?


----------



## Vayra86 (Apr 23, 2018)

trparky said:


> Can I ask a dumb question here? What clock speed was the 8700K running at when many of these tests were being conducted?



Considering the all-core load, 4.3 GHz.


----------



## Charcharo (Apr 23, 2018)

trparky said:


> Can I ask a dumb question here? What clock speed was the 8700K running at when many of these tests were being conducted?



4.3 GHz boost for all cores. In some games, likely 4.7 GHz boosted single-core.


----------



## cucker tarlson (Apr 23, 2018)

Vayra86 said:


> As to your comment on the 4c/4t, what CPU was it exactly?
> 
> As for PurePC, even though I can't read Polish for shit, their choices in benchmarks to use is ridiculously good, arguably the best on the net right now. Their CPU reviews tell me more than any others I can find.



I was running a 4790K and experiencing a CPU bottleneck in some games. They came up with an updated review of the 5775C, this time on more modern games and, most importantly, a GTX 1080 (the previous tests were on a GTX 980, the fastest card on the market when Broadwell-C launched, which may explain a large part of why it received a lukewarm welcome). I was somewhat sceptical, but I got a good deal on that CPU, installed it, and went "hoooly crap" when I tested it. It lined up almost perfectly with what I saw in the review. I even made a little thread here on TPU, although in that thread I was exploring the limits of the CPU rather than building an "in most cases" scenario.
https://www.techpowerup.com/forums/...c-edrams-impact-on-gaming-performance.236514/
Their CPU testing methodology is top notch.


----------



## Charcharo (Apr 23, 2018)

cucker tarlson said:


> I was running a 4790K and experiencing a CPU bottleneck in some games. They came up with an updated review for 5775c, this time they did it on more modern games on most importantly gtx 1080 (the previous ones were on gtx 980, which was the fastest on the market at the time broadwell-c made it, which may explain a large portion of why it recieved a lukewarm welcome). I was somewhat sceptical,but I got a good deal on that CPU, I installed it and I was like "hoooly crap" when I tested it. It lined up almost perfectly with what I saw in the review. Even made a little thread here on TPU.
> https://www.techpowerup.com/forums/...c-edrams-impact-on-gaming-performance.236514/
> Their cpu testing methodology is top notch.



Well, seems like something I need to check on a game-by-game basis then. 
Of course, I can always go full PCMR and mod my areas to show what CPU benchmarking really means for all the filthy casuals.


----------



## Vayra86 (Apr 23, 2018)

Charcharo said:


> Well, seems like something I need to check on a game by game basis then.
> Of course I can always go full PCMR and mod my areas to show what CPU benchmarking really means for all the filthy casuals.



Look at Total War: Warhammer for a good example of CPU destruction. It's a modern-day Crysis in that sense, and it even uses threads quite well...



cucker tarlson said:


> I'm actually curious what it was running in FC5. Dunia engine is heavily single threaded.



FC5 will load multiple cores and will probably not keep the CPU at 4.7 GHz, but rather 4.3.

I cba to remove my OC to check it for ya... Board makers apply multi-core enhancements from the factory for good reason, because CFL will lose the 4.7 GHz in most gaming scenarios.


----------



## cucker tarlson (Apr 23, 2018)

Charcharo said:


> 4.3 Ghz boost for all cores. In some games, likely 4.7 Ghz boosted single core.


I'm actually curious what it was running in FC5. Dunia engine is heavily single threaded.



Vayra86 said:


> Look at Total War: Warhammer for a good example of CPU destruction. Its a modern day Crysis in that sense - and it even uses threads quite well...


I found that not all games that thread well actually give Ryzen all that big a boost. I guess more threads are not always equivalent to faster IPC, even in games that spread the CPU load evenly across the cores.
I think you should still prioritize IPC and RAM speed, even in 2018, and then find a CPU that offers that with enough cores/threads not to choke performance. Video tests are actually good for that; you can see the CPU load across the cores change. If it stays under 75% it's absolutely fine. 80%+ is higher than I'd like. At 90%+ you may experience FPS dips and stutter; how bad it gets depends on the game.
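For what it's worth, that rule of thumb could be jotted down as a tiny helper. A sketch only: the thresholds are this post's gut feel, not measured cutoffs, and `headroom_verdict` is a made-up name.

```python
def headroom_verdict(avg_core_load_pct: float) -> str:
    """Classify CPU load headroom using the rough thresholds
    from this post (opinion, not measured data)."""
    if avg_core_load_pct < 75:
        return "fine"            # comfortable headroom
    if avg_core_load_pct < 90:
        return "watch for dips"  # higher than I'd like
    return "expect stutter"      # how bad depends on the game

print(headroom_verdict(60))  # fine
print(headroom_verdict(85))  # watch for dips
print(headroom_verdict(95))  # expect stutter
```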


----------



## Vayra86 (Apr 23, 2018)

cucker tarlson said:


> I'm actually curious what it was running in FC5. Dunia engine is heavily single threaded.
> 
> 
> I found that not all games that thread well actually give ryzen all that big a boost. I guess more threads is not always equivalent of faster IPC, even on games that put the CPU load evenly across the cores.



Correct, my experience as well. The constant swapping of threads across cores doesn't help with monitoring either. But there is for sure a tremendous amount of overhead as core counts go up, and the usage you see on HT/SMT threads is almost exclusively overhead, even in DX12. Another thing that happens in virtually every game is that FPS is always limited by the biggest thread; so no matter how well a game uses threads, there is still a single-threaded limitation.
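That "biggest thread" point can be put as a toy critical-path model. The millisecond figures below are invented for illustration, not profiled from any real game:

```python
def frame_time_ms(thread_work_ms):
    """Toy model: a frame cannot finish before its longest-running
    thread does, so the heaviest thread sets the frame time."""
    return max(thread_work_ms)

def fps(thread_work_ms):
    """Frames per second implied by the critical-path frame time."""
    return 1000.0 / frame_time_ms(thread_work_ms)

# A 'well-threaded' game whose main thread still does 12 ms per frame:
six_threads = [12.0, 4.0, 4.0, 3.0, 2.0, 2.0]
# Spreading the lighter jobs across more cores changes nothing,
# because the 12 ms main thread is untouched:
ten_threads = [12.0, 2.0, 2.0, 2.0, 1.5, 1.5, 1.5, 1.0, 1.0, 1.0]

print(round(fps(six_threads)))  # 83
print(round(fps(ten_threads)))  # 83
```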


----------



## Charcharo (Apr 23, 2018)

Vayra86 said:


> Look at Total War: Warhammer for a good example of CPU destruction. Its a modern day Crysis in that sense - and it even uses threads quite well...
> 
> 
> 
> ...



I do not have it. Of the very CPU-heavy and multithreaded games, I have Metro, Witcher 3, BF1, Wolfenstein 2, DOOM 2016, and Crysis 3.

I told you my Metro bench. For Wolfenstein I use either the Panzerhund sequence in New Orleans or one of the late-game arenas, waiting a minute for most of the AI to spawn before going all in on them. That is what I used to test my CPUs, along with STALKER Call of Chernobyl, which uses only 2 threads and is CPU-heavy to a wacky degree. It is superior to Far Cry 5 as a game, though, by a gigantic degree.


----------



## cucker tarlson (Apr 23, 2018)

Vayra86 said:


> . Another thing that happens with virtually every game is that FPS is always limited by the biggest thread; so essentially no matter how well it uses threads, there is essentially still a single threaded limitation.


Good observation, I was thinking the same thing but wasn't sure.


----------



## Vayra86 (Apr 23, 2018)

Charcharo said:


> I do not have it. From the very CPU heavy and multi-threaded games I have Metro, Witcher 3, BF1, Wolfenstein 2, DOOM 2016, Crysis 3.
> 
> I told you my Metro bench. For Wolfenstein I use either the Panzerhund sequence in New Orleance or one of the late game arenas and wait a minute for most of the AI to spawn before going all in on them. That is what I used to test my CPUs, and STALKER Call of Chernobyl. Which uses only 2 threads and is CPU heavy to a wacky degree. Is superior to Far Cry 5 as a game though, by a gigantic degree.



You just play the wrong type of games to see it 

What you need is an isometric ARPG, an RTS, stuff like that. Starcraft 2 hinges almost entirely on single-thread performance and was the bane of the FX processors back in the day. Games like Torchlight, Grim Dawn, and almost every single indie game out there rely heavily on single-thread performance. These are games that cannot push much work to the GPU because the entire battlefield needs to be responsive at all times; there is no option to 'stream' game content based on the player's position in the world, and you can't really use LoD and dynamic asset spawning, etc.

There are also games that eat every thread you throw at them and also want the highest clock: the MMO. Guild Wars 2 and WoW are good examples (though WoW is so well optimized you won't notice in a lot of cases; it runs into network limitations before it hits a CPU wall. I had 50-60 FPS in player hubs even on my 3570K, which is a tremendous feat of coding).


----------



## GoldenX (Apr 23, 2018)

You know, we didn't have such discussions with FX.
Next thing is Threadripper, right?


----------



## Charcharo (Apr 23, 2018)

Vayra86 said:


> You just play the wrong type of games to see it
> 
> What you need is an isometric ARPG, an RTS, stuff like that. Starcraft 2 - it hinges almost entirely on single thread performance and was the bane of the FX processors back in the day. Games like Torchlight, Grim Dawn, and almost every single indie game out there all rely heavily on single thread performance. These are games that cannot push lots of work towards the GPU because the entire battlefield needs to be responsive at all times, there is no option to 'stream' game content based on the player's position in the world, you can't really use LoD and dynamic ways to spawn assets, etc..
> 
> There are also games that eat every thread you throw at it and also want the highest clock, and that is the MMO. Guild Wars 2, WoW are good examples (though WoW is so well optimized you won't notice in a lot of cases and it just runs into network limitations before it hits a CPU wall - I had 50-60 FPS in player hubs even on my 3570k which is a tremendous feat of coding).



Oh... trust me, when STALKER's modified A-Life for 1000 NPCs is turned on (it simulates all of them in real time) and the bullets start flying with my fixed-ballistics addons, the CPU screams. Since grenade shrapnel is also a physical object... you get the idea. Still one of the most WTF-tier CPU benches. I play Starcraft 2, and even it cannot compete. But a good choice still. 

IMHO every game should be as multithreaded as humanly possible, since otherwise it is leaving performance on the table. Let's be real, both Intel and AMD are climbing a mountain now with IPC and clock speeds. They will still get higher, but not much higher unless we switch to a new material for our CPUs. So until that happens, game developers have no real choice but to git gud.


----------



## Vayra86 (Apr 23, 2018)

GoldenX said:


> You know, we didn't have such discusions with FX.
> Next thing is Threadripper, right?



You must have missed them, then 

If anything this shows how incredible the jump was from FX > Ryzen


----------



## GoldenX (Apr 23, 2018)

Man, the Phenom II has aged.


----------



## trparky (Apr 23, 2018)

Vayra86 said:


> Starcraft 2 - it hinges almost entirely on single thread performance and was the bane of the FX processors back in the day.


That's one of the games that really has me questioning which processor to buy. I currently have a 3570K @ 4.4 GHz, and I keep coming back to the question of whether Ryzen is good enough to run that game decently compared to my current processor, which struggles late into the game.


----------



## Charcharo (Apr 23, 2018)

GoldenX said:


> Man, the Phenom II has aged.


We all age. The abyss awaits. But muh ALife and ballistics are eternal.




trparky said:


> That's one of the games that really has me questioning what processor to buy. I currently have a 3570K@4.4GHz and I keep coming back to the question of whether Ryzen is good enough to be able to run that game decently when compared to my current processor that struggles late into the game.



Depends on whether you have an AMD or Nvidia GPU with Starcraft 2.


----------



## Vayra86 (Apr 23, 2018)

trparky said:


> That's one of the games that really has me questioning what processor to buy. I currently have a 3570K@4.4GHz and I keep coming back to the question of whether Ryzen is good enough to be able to run that game decently when compared to my current processor that struggles late into the game.



Decide for yourself, but Ryzen is in a very good place all around; it's not remotely comparable to FX. Note that this is without an OC, so it's a pretty good scenario for gauging relative Ryzen/Intel worst-case gaming performance.
https://www.techporn.ph/amd-ryzen-7-1800x-am4-cpu-review/


----------



## trparky (Apr 23, 2018)

Well alrighty then... since I lock my FPS to 60 (because my monitor is 60 Hz) to reduce screen tearing, it looks to me like it doesn't even matter which CPU I get.

*Edit*
Yeah, I know... I play games like a pleb here at 1080p @ 60 Hz. With GPU prices in the stratosphere and good 1440p monitors still quite expensive compared to 1080p monitors that can be had for under $100 USD, I just can't justify the added cost of going 1440p.

*Edit #2*
I can't be the only gamer who thinks this way.


----------



## Charcharo (Apr 23, 2018)

Pretty much.

Honestly, memes aside, I am pretty happy with how Zen has turned out. It has some funky, weird faults in some titles and does not always play well with Nvidia (or maybe it is the other way round?), but it is fast in everything. Not always the fastest, but generally pretty good engineering: good at everything, really good at some things, not quite the best, small CPU cores, an upgrade path, efficient.

I do hope for some real upgrades with Zen 2 and Ryzen 3000, but CPUs are exciting again. Things are actually interesting, and people can pick any CPU (bar the Pentium and arguably the lowest-end R3 or i3) and expect great performance in 99% of titles, and at least passable performance in the remaining 1%. Is this not what we all want in the end?


----------



## trparky (Apr 23, 2018)

Charcharo said:


> I do hope for some real upgrades with Zen 2 and Ryzen 3000


I have a feeling that Zen 2 is going to be where Ryzen takes off like a rocket. Not only is Zen 2 (the Ryzen 3000 series) going to be based on GloFo's 7 nm process node, but AMD has also partnered with IBM, and they have some seriously smart people at IBM. If there's anybody in the business who knows how to take Zen to the next level, it's the people at IBM.

What's really funny about Intel is what I was reading over at AnandTech's forums regarding Intel's struggles to get to 10 nm. It seems Intel is completely drunk on their own Kool-Aid and refuses to reach out to outside experts, despite the fact that the likes of TSMC and Samsung have both successfully made the leap to 10 nm (and beyond) while Intel is still stuck at 14 nm.

David Schor: Intel 10nm in big problems

This is why I have a very good feeling that AMD is going to thoroughly kick Intel's ass with Zen 2.


----------



## Charcharo (Apr 23, 2018)

trparky said:


> I have a feeling that Zen 2 is going to be where Ryzen will take off like a rocket. Not only is Zen 2 (Ryzen 3000 Series) going to be based upon GloFlo 7nm process node but AMD has also partnered up with IBM and they have some seriously smart people at IBM. If there's anybody in the business that will know how to take Zen to the next level, it will be the people at IBM.
> 
> What's really funny about Intel is what I was reading over at AnandTech's forums regarding Intel's struggles to get to 10nm. It seems that Intel is completely drunk on their own Kool-Aid and refuses to reach out to outside experts despite the fact that the likes of TSMC and Samsung have both successfully made the leap to 10nm (and beyond) while Intel is still struggling at 14nm.
> 
> ...



To be fair, the nm figure does not refer to the node's actual size nowadays; it is just a marketing term. Zen's 14 nm is actually more like 20 nm with FinFETs. Intel's 14 nm is more like 16 nm. TSMC's 16 nm is like 20 nm with FinFETs...


----------



## kanecvr (Apr 23, 2018)

cucker tarlson said:


> Than 7600K ? What ?
> Here Ryzen 2700X is 1350, 8700K is 1400, and 8700K is just better overall.



8700K = 1633 lei // 2700X = 1364 lei. A Z370 board (obligatory for overclocking) is ~550 lei, while a decent X470 board is about 480 lei (not to mention one could go for a B450, which is cheaper and still allows overclocking, or the older X370/B350 chipsets). All in all, the price difference is about the value of a reasonably fast 8 GB kit of dual-channel DDR4. That, coupled with its workstation-level performance, makes it a best buy for me, since I do both games and work on the same computer.

I don't understand why regular consumers would buy the Intel chip. As an enthusiast, it makes sense for me to go for the 8700K, as most samples will reach 5 GHz, and at that speed the performance gap in games widens while the 2700X's lead in productivity apps shrinks. And here comes another BUT: the 8700K gets STUPID HOT even at 4.7 GHz... at 5 GHz you need to delid it, replace the TIM with liquid metal, spend $100-200 on a decent water-cooling solution, and use a mother of a PSU and an expensive motherboard... so... not really my can of worms... 

All in all, I'll stick with my 3930K for a while longer. At least until DDR4 prices settle.


----------



## trparky (Apr 23, 2018)

kanecvr said:


> I don't understand why regular consumers would buy the intel chip. As an enthusiast, it makes sense for me to go for the 8700k as most samples will reach 5GHz, and at that speed the performance gap in games wides, while the x2700's lead in productivity apps shrinks - and here comes another BUT - the 8700k gets STUPID HOT even at 4.7GHz... at 5GHz you need to de-lid it, replace the TIM with liquid metal, spend 100-200$ on a decent water cooling solution, and use a mother of a PSU and an expensive motherboard... so... not really my can of worms...


There's no denying that Intel overclocks like nobody's business, but at what cost? This is the part I've never understood about some people in the Intel camp. At some point the overclocking hits diminishing returns.


----------



## Melvis (Apr 24, 2018)

cucker tarlson said:


> You want 1080p or higher ? You can't handle the truth.



Well, clearly I do, as I live in 2018, where 1080p is the most used and most common res for gaming. This isn't 2003 anymore....



cucker tarlson said:


> My 2 cents, based on my thousands of hours spent gaming on 3570K + 980Ti and 4790K + 1080. Even at 1440p you'll find plenty of places across various games that are CPU heavy. Plenty. It is impossible for the reviewer to recreate those, taking into account they bench 10+ games, they'd spend hundreds of hours just looking for good testing places. I know one guy from a Polish website who is responsible for gpu and cpu tests exclusively, and he always goes through the routine of finding them - look how different his results are from W1zzard's or the majority of other reviews for that matter.
> 
> https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,34
> https://www.purepc.pl/procesory/test_procesorow_amd_ryzen_7_2700x_vs_intel_core_i7_8700k?page=0,35
> ...



So far you have shown me that you actually agree with my statement? With all the links showing exactly what I am talking about: that yes, even at 1080p, the most-used res, there are actual differences in performance because of the stress on the CPU. It's good to see that you got what I meant and agree 




Vayra86 said:


> Going all caps doesn't make gaming benchmarks about the game itself in a CPU review, and it still doesn't change a thing about the relative performance gaps you see there. Its fine if you don't (want to) get it, so skip the 720p page and move right on to your preferred resolution to see what's the CPU for your use case. No one is stopping you...



Seems like I have to, to get the point across to those who clearly don't get it and still live back in 2003. You just said it yourself, it's a CPU review... but we are going to show you gaming benchmarks at a res from 2003? Can't you see how irrelevant that is? It is still gaming benchmarks... at a res no one uses with said hardware, so please stop spreading FUD and get back to the year 2018, for god's sake.

Again, I'm still waiting for that person who games at 720p with an i7/Ryzen 7 and a GTX 1080 Ti.


----------



## cucker tarlson (Apr 24, 2018)

Melvis said:


> Well clearly I do as I live in 2018 where 1080p is the most used and most common res for gaming, this isnt 2003 anymore....
> 
> 
> 
> ...


I don't think 720p represents what you will see at 1080p, except in CPU-limited areas of games, and that is the whole point. It shows which CPU is best, so I don't understand people whining about it; it helps you get the most bang for your buck. You know you don't have to pay AMD for their product if you don't like it. I mean "you know" figuratively, not specifically about you, because you seem to totally discredit the 720p results with an argument you repeat like a broken record. We know no one plays at 720p anymore. We figured it out. But no one looks at 720p tests because they're running a 720p rig, and that is what you don't get. We look at them and see a CPU-limited scenario.

BTW, how stupid is AMD for not letting me use an aftermarket cooler on my CPU without voiding my warranty? I mean, I know they can't tell, but what they're claiming is beyond retarded.

https://support.amd.com/en-us/search/faq/147


----------



## Charcharo (Apr 24, 2018)

I personally think 720p benches are useless unless they are used for e-sports titles. Then they make sense.


----------



## cucker tarlson (Apr 24, 2018)

Charcharo said:


> I personally think 720P benches are useless unless they are used for E-Sport titles. Then they make sense.


Wrong. And personal beliefs should not be a part of this discussion.



Melvis said:


> Well clearly I do as I live in 2018
> 
> Seems like I have to to get the point across for those who clearly dont get it and still live back in 2003.
> 
> stop spreading fud and get back to the yr 2018 for god sake.


"It's 2018, get real," says a guy who runs an FX CPU in 2018. With a friggin' SLI setup. 
You obviously don't know much about any sort of CPU limitation, nor do you know how to test for one.


----------



## Charcharo (Apr 24, 2018)

cucker tarlson said:


> Wrong. And personal beliefs should not be a part of this discussion.



OK. My informed, real-world opinion, as someone who plays old games and the most CPU-demanding titles in existence, and who until late last year was using a 1440x900 screen, is that anything under 1080p/900p (at the least) is de facto irrelevant for high-end modern parts.

If you disagree, you must give me an objective reason that is not about "future performance," as that simply should not depend on single-threaded scenarios this year. It is the developers' issue at that point, as it means their game will likely never run well, no matter what Intel or AMD do. If it is about 240 Hz gaming, I accept e-sports titles. If it is about the insanely tiny minority of people who have ultra-high-end gear and play at 720p because an ugly blob bothers them less than not getting 200+ FPS... egh, OK, that is valid, but damn is it niche.


----------



## trparky (Apr 24, 2018)

Benchmarking at 720p puts game performance squarely on the processor's shoulders rather than the GPU's. It damn near removes the GPU from the equation completely. It's about the closest you can get to a pure CPU benchmark.
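A toy way to see why: treat the CPU and GPU as working in parallel, with the slower side setting the frame time. The millisecond figures are invented for illustration (per-frame CPU cost treated as roughly resolution-independent), not measurements:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: CPU and GPU pipelines overlap, so whichever
    takes longer per frame dictates the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0  # hypothetical per-frame CPU cost, same at any resolution

print(fps(CPU_MS, 16.0))  # heavy 4K-ish GPU load: 62.5 FPS, GPU-bound
print(fps(CPU_MS, 3.0))   # light 720p GPU load: 125.0 FPS, CPU now the limit
```

Drop the resolution and the GPU term shrinks, so the measured FPS tracks the CPU term almost exclusively.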


----------



## Charcharo (Apr 24, 2018)

trparky said:


> Benchmarking at 720p puts the game performance squarely on the processor's shoulders as versus the GPU. It damn near removes the GPU from the equation completely. It's about the closest you can get to a pure CPU benchmark.



I obviously know that. But it is academic; the difference between science and engineering. It tells you almost nothing real and nothing about the future.

Also, I am unsure whether it perfectly benches the CPUs, as different operations in a scene take different time/resources to calculate. CPUs are lightly impacted by resolution, but the "lightly" part still exists, and how they communicate with a GPU/GPU driver matters too.


----------



## Vayra86 (Apr 24, 2018)

Melvis said:


> Seems like I have to to get the point across for those who clearly dont get it and still live back in 2003. You just said it yourself, its a CPU review....but we are going to show you gaming benchmarks on a res from 2003? cant you see how irrelevant that is? it still is gaming benchmarks........at a res no one uses! with said hardware, so please stop spreading fud and get back to the yr 2018 for god sake.
> 
> Again im still waiting for that person that games at 720p with a i7/Ryzen 7 and a GTX 1080 Ti.



Another thick plank here, then... Are you going to repeat the same broken record every few pages?

Spreading 'FUD'? If this is FUD to you, then you suffer from a serious lack of insight and experience. The line about 2003 and actually using said hardware at 'this res' shows you really don't get it. I've already supplied multiple examples of recent or still heavily played games that rely on the same things as they did back in 2003, and that should not surprise anyone who knows how genuinely old most game engines really are, even today. Many of them are just new iterations of the same old stuff. Reread my earlier response on this. You don't have to agree, and you don't have to see it or believe it; that's fine, skip the 720p tests in that case and move on. I have better things to do than keep repeating myself. It's your delusion, not mine. The reality is, you bought an FX at some point, which was evidently one of the worst gaming CPUs of its time, and that was a well-known fact even then; people who bought them used the same arguments you're using now. 'Everything's gonna be multi-threaded anyway.' It's quite hilarious, and even in 2018 it's still completely wrong.

https://www.techpowerup.com/forums/threads/amd-ryzen-7-2700x-3-7-ghz.243209/page-10#post-3832275



Charcharo said:


> I obviously know that. But it is academic, the difference between science and engineering . It tells you almost nothing real and nothing about the future.
> 
> Also I am unsure if it perfectly benches the CPUs as different operations in a scene take different time/resources to be calculated. CPUs are lightly impacted by resolution, but the lightly part still exists, and how they communicate with a GPU /GPU driver matters too.



It's not purely academic; low-res benchmarks are perfect for seeing the *relative performance* between CPUs. Not more than one page ago we looked at Starcraft 2 and worst-case-scenario benchmarks, and we also concluded that even in modern games, the min. FPS hinges on that one big game thread, no matter how many cores the game can use. The vast majority of games and engines still work this way. There are a few exceptions to the rule, DOOM being a good example of an engine that really extracts high FPS from increased core counts as well as clocks, but the reality is that this game also barely uses any CPU whatsoever.

The abundance of shooters and first/third-person games in many reviews paints a twisted reality where most CPUs perform largely the same. But gaming is a lot more than shooters, and I happen to play very few shooters and much more of other game types, many of which can use every GHz I throw at them but hardly benefit from core count. Stellaris is (another) good example of something quite recent that still hinges on the same performance bottleneck games had ten years back.


----------



## Melvis (Apr 24, 2018)

Konceptz said:


> If we can get 4.0-4.2ghz stable or higher while waiting for Ryzen 2 i'll be happy. I still on my FX-8350 waiting for Ryzen +





cucker tarlson said:


> Wrong. And personal beliefs should not be a part of this discussion.
> 
> 
> "It's 2018, get real" says a guy who runs a FX cpu in 2018. With a friggin sli setup.
> You obviously don't know much about any sort of cpu limitations, neither do you know how to test them.



You find it funny that in 2018 I'm running an FX CPU with an SLI setup that's still able to play games at 1440p? I think the joke's on you, kid. 
You obviously have no clue at all how a CPU works with modern software in 2018. Might want to do some research before commenting next time?

Here, I shall give you a few links so you can go look for yourself and find out how even an older CPU is still able to play modern games in 2018.

https://www.google.com.au/
https://www.youtube.com/



Vayra86 said:


> Another thick plank here then... Are you going to repeat the same broken record every few pages?
> 
> Spreading 'FUD'? If this is FUD to you, then you suffer from a serious lack of insight and experience. The line about 2003 and actually using said hardware on 'this res' shows you really don't get it. I've already supplied multiple examples of recent or still heavily played games that still rely on the same things as they did back in 2003 - and that should not be a surprise to anyone if you know how genuinely old most game engines really are - even today. Many of them are just new iterations of the same old stuff. Re read my earlier response on this. You don't have to agree, and you don't have to see it or believe it, that's fine, skip the 720p tests in that case and move on. I have better things to do than to keep repeating myself. Its your delusion not mine. The reality is, you bought an FX at some point, which is evidently one of the worst gaming CPUs for its time and it was a well known fact even then - and people who bought them used the same arguments as you're doing now. 'Everything's gonna be multi threaded anyway'. Its quite hilarious and even in 2018 its still completely wrong.
> 
> https://www.techpowerup.com/forums/threads/amd-ryzen-7-2700x-3-7-ghz.243209/page-10#post-3832275



Just another kid know-it-all we have here, and yes, I will, over and over again, till you finally catch up with reality and get out of the dark ages. 

Look, we know you're new to the computer world; it's OK, we are here to educate you on these things so you can be more respected on forums when replying. If you don't understand that 720p is a dead res from 2003 and that in the real world no one uses it with said hardware, that's just a fact; otherwise you or anyone else would be linking me to someone who actually games at 720p with said hardware, but alas, I'm still waiting. You don't have to agree with or like the facts, that's totally up to you, and you can just skip this and stop responding; it's probably in your best interest, honestly. So you're going to skip the facts and comment on my FX system? Which has no bearing on this subject at all; now I know you're reaching. Maybe before attacking my computer you should go back to when they came out and read the threads (yes, I'll be there) and you might learn something. But again, we can tell you're new to this since you've only been on here since 2014, so if you can't find your answers I will be more than happy to explain them to you, so you can catch up with the rest of us oldies.  FYI, multithreaded programs/games are a lot more common now than back in 2012.


----------



## W1zzard (Apr 24, 2018)

Just had a thought about those YouTube videos. Don't sensor recording, the overlay, and video recording add additional load on the system, which introduces some bias toward systems with more cores?


----------



## Charcharo (Apr 24, 2018)

Vayra86 said:


> Another thick plank here then... Are you going to repeat the same broken record every few pages?
> 
> Spreading 'FUD'? If this is FUD to you, then you suffer from a serious lack of insight and experience. The line about 2003 and actually using said hardware on 'this res' shows you really don't get it. I've already supplied multiple examples of recent or still heavily played games that still rely on the same things as they did back in 2003 - and that should not be a surprise to anyone if you know how genuinely old most game engines really are - even today. Many of them are just new iterations of the same old stuff. Re read my earlier response on this. You don't have to agree, and you don't have to see it or believe it, that's fine, skip the 720p tests in that case and move on. I have better things to do than to keep repeating myself. Its your delusion not mine. The reality is, you bought an FX at some point, which is evidently one of the worst gaming CPUs for its time and it was a well known fact even then - and people who bought them used the same arguments as you're doing now. 'Everything's gonna be multi threaded anyway'. Its quite hilarious and even in 2018 its still completely wrong.
> 
> ...



Oh mate, I think we did not understand each other. I agree, and even stated that games like Starcraft 2 (which I play, pretty well might I add, as long as I am drunk!) do matter. I play such titles; I even mentioned Call of Chernobyl, which when configured for max simulation simply slaughters CPUs. Starcraft 2 looks like an idle test in comparison in single-threaded benches; it is that bad.

My point is that the performance profile at 720p and at 1080p or above is not QUITE the same. People have this idea that CPU resources are unaffected by resolution (incorrect) and that all the things a game needs require similar percentile resources or time to complete. That is not true, unfortunately. Now you will say "OK Alex, fair enough, you are right. I will add that nuance. Still, at 720p or low resolutions in general we see more of the CPU at work even with that nuance added in." and I would agree with you. I just do not think it matters for most titles bar e-sports ones. That was my point. It is not me not knowing or disagreeing with you in general.

As for Starcraft and Stellaris (which I admit I do not play) ... egh, I am divided. You see, we cannot expect old games to run perfectly well on new CPUs. I know that, and that is the reason why I personally, whilst happy with my 1500X (at least it is never slower than my old Haswell i5!), will wait for Ryzen 3000/Zen 2 before I consider an upgrade. I hope for higher clocks and IPC, especially in niche scenarios, from AMD. I play such titles a lot and want good performance in them. What I mean, though, is that the big onus is on those developers to work on their engines and games. In the case of Starcraft, it still gets dev time and is a popular title even to this day. Stellaris I have not played, so I don't know. But if Ashes of the Singularity and Total War can do multi-threading well whilst using MUCH more CPU resources on every single front, and a low-budget game like Men of War can at least use 8 threads well (whilst simulating ballistics, destructible terrain, AI for every soldier, inventory, armor penetration, destructible buildings and more) ... well, there is no excuse. Starcraft 2 should receive an update in general, and that would improve performance for all players. We should demand more as gamers too, because IPC and clocks are at a wall. They cannot improve by much, no matter what Intel or AMD do. At least not with silicon. That was my point.




W1zzard said:


> Just had a thought about those YouTube videos. Doesn't sensor recording, overlay, video recording add additional load on the system which introduces some bias toward systems with more cores?



It should have a small impact, but not for Digital Foundry; there it should be zero. 
IMHO it would be cool if you could do some CPU tests under ... more unclean conditions. I should probably not mention how many things I have open most of the time, else people will mock me!


----------



## cucker tarlson (Apr 24, 2018)

Melvis said:


> You find it funny that in 2018 im running a FX CPU with a SLi setup thats still able to play games at 1440P? I think the jokes on you kid
> You obviously have no clue at all how a CPU works with modern software in 2018. Might need to go do some research before commenting next time?
> 
> Here I shall give you a few links so you can go looking for yourself and find out how even a older CPU is still able to play modern games even in 2018.
> ...


Referring me to YouTube really says it all about how comprehensive your knowledge is. No wonder people build rigs like yours, with all sorts of bottlenecks, if they use YouTube as a reference. It's just a shame that they later go on quality forums like TPU and think they can lecture the reviewer.


----------



## intrepid3d (Apr 24, 2018)

W1zzard said:


> Just had a thought about those YouTube videos. Doesn't sensor recording, overlay, video recording add additional load on the system which introduces some bias toward systems with more cores?



MSI Afterburner: no.
ShadowPlay recording: very little, about 2% if anything.


----------



## Melvis (Apr 24, 2018)

cucker tarlson said:


> Referring me to Youtube.com really says it about how comprehensive your knowledge is. No wonder people build rigs like yours, with all sorts of bottlenecks, if they have Youtube as reference. It's just a shame that they later go on quality forums like tpu and think they can lecture the reviewer



Oh dear, you have got a lot to learn, and this shows us how much ignorance and how little knowledge you have of the tech industry; that's now a given fact. YouTube videos are more credible than reviews done on tech sites: you get to see the tests being done and carried out in front of your eyes, with live testing and benchmarking, etc. That is a lot harder to fake or use to give false info than just taking a screenshot and putting it on a website.  
If this is all you have as a comeback and all you can do is complain about YouTube, then you have already lost.


----------



## cucker tarlson (Apr 24, 2018)

Melvis said:


> Oh dear you have got alot to learn and shows us how much ignorance and little knowledge you do have in the tech industry thats now a given fact. You tube videos are more credible then reviews done on tech sites, you get to see the tests been done and carried out infront of your eyes with live testing and benchmarking etc, alot harder to fake or give false info then just taking a screen shot and putting it on a website.
> IF this is all you have as a comeback and all you can do is complain about youtube then you have already lost


Your purchase is not my problem.
So you're saying no bottlenecks?
Not with an FX CPU? Not with PCIe 2.0 lanes?


----------



## W1zzard (Apr 24, 2018)

intrepid3d said:


> MSI After Burner: no


Did some quick testing










Seems to be around 1-2% for the OSD (values vary a bit, but there's a clear difference)

The impact is larger at higher FPS.


----------



## intrepid3d (Apr 24, 2018)

W1zzard said:


> Did some quick testing
> 
> 
> 
> ...



1-2% is margin of error; that's why values vary. No one run is identical to another.

Oh, and you're using FRAPS with MSI Afterburner; using two sensor-monitoring applications at once is not a good idea, as they can conflict.


----------



## Charcharo (Apr 24, 2018)

Guys, if the man with the FX CPU is happy... then maybe he is right, for himself? Maybe it is OK for him? I mean, it is ugly from the outside to see this, even if you are technically correct. 

Same way I tell people with Pentiums or i3s or FXes that if they are happy... they do not need to change their setup yet. Every single person is unique in what they do and how they do it with gaming.


----------



## intrepid3d (Apr 24, 2018)

Charcharo said:


> Guys, if the man with the FX CPU is happy... then maybe he is correct for himself? Maybe it is OK for him? I mean it is ugly from the outside to see that even if you are technically correct.
> 
> Same way I tell people with Pentiums or i3s or FXes that if they are happy... they do not need to change their set up yet. Every single person is unique in what they do and how they do it with gaming.



This. I mean, for Christ's sake, I have never understood people who attack others for their hardware choices. It's hardware snobbery and outright church mentality of the highest order.


----------



## W1zzard (Apr 24, 2018)

intrepid3d said:


> 1 - 2% is Margin of error, that's why values vary, no one run is identical to another.
> 
> Oh and you're using FRAPS with MSI After Burner, using two sensory monitoring applications is not a good idea as they can conflict.


Values don't vary that much; test it yourself and you'll see. I did not cherry-pick results.

FRAPS was the easiest for me to use for A/B testing. A game with an integrated FPS counter would work too.


----------



## intrepid3d (Apr 24, 2018)

W1zzard said:


> values don't vary that much, test yourself and you'll see. i did not cherry pick results
> 
> fraps was the easiest for me to do a/b testing. a game with integrated fps counter would work too.



Wait, are you saying you don't get a 1 to 2% variation when running multiple game benchmarks, or any benchmark? Because that is the understood norm; most reviewers do multiple runs and take averages of the results. Even Cinebench, which is recognized as one of the most consistent benchmarks, varies 1 or 2% between runs.

No, I don't trust that your runs are 100% consistent; no one's are.


----------



## W1zzard (Apr 24, 2018)

intrepid3d said:


> Wait, are you saying you don't get a 1 to 2% variation when runnig multiple game benchmarks, or any benchmark, because that is the understood norm, most reviewers run multiple runs and take avrages from the results. even cinebench which is recognized as the most consistent varies 1 or 2% between runs.
> 
> No. i don't trust your runs are 100% consistent, no ones are.


I'm not saying that. What I'm saying is that monitoring, OSD and capture add some CPU load (not a random variation).

Of course trying to measure the impact has some variation (a small percentage of the impact itself)
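To separate a small but systematic offset from ordinary run-to-run noise, you can compare the difference between the A/B means against the spread of repeated runs. A minimal sketch in Python; the FPS numbers below are invented for illustration, not measured:

```python
from statistics import mean, stdev

# Hypothetical repeated benchmark runs (FPS), OSD off vs. OSD on.
runs_osd_off = [144.1, 143.8, 144.5, 143.9, 144.2]
runs_osd_on = [141.9, 142.3, 141.7, 142.1, 142.0]

diff = mean(runs_osd_off) - mean(runs_osd_on)        # systematic offset
noise = max(stdev(runs_osd_off), stdev(runs_osd_on))  # run-to-run variation

# If the offset is several times larger than the spread between runs,
# it is a real effect rather than margin of error.
print(f"offset: {diff:.2f} FPS, noise: {noise:.2f} FPS")
print("real effect" if diff > 3 * noise else "within noise")
```

With data like the above, a ~2 FPS offset against a ~0.3 FPS spread is clearly a real effect, even though the individual values vary a bit from run to run.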


----------



## cucker tarlson (Apr 24, 2018)

W1zzard said:


> I'm not saying that. What I'm saying is that monitoring, OSD and capture add some CPU load (not a random variation).
> 
> Of course trying to measure the impact has some variation (a small percentage of the impact itself)


Recording and OSD - negligible
Streaming - yuge


intrepid3d said:


> This, i'm mean for Christ Sake i have never understood pepole who attack others for thier hardware choices, its hardware snobbery and out right church mentality of the highest order.


Lol, I've got nothing against the man's choice; I could not care less. Apologies if I came across too strong, but we've got to draw a line when he tries to discredit the review results just because he's happy living in ignorance. I was pointing out that what he was saying about us living in 2003, and him being so damn forward-thinking, was ridiculous and ironic.

Oh yeah, I missed that it was you again. Well, that explains A LOT, because I was just wondering why anyone would accuse me of hardware snobbery.


----------



## Charcharo (Apr 24, 2018)

W1zzard said:


> I'm not saying that. What I'm saying is that monitoring, OSD and capture add some CPU load (not a random variation).
> 
> Of course trying to measure the impact has some variation (a small percentage of the impact itself)



Egh, if it is the same load for all CPUs, IMHO it doesn't matter.

I dislike clean-install CPU benches too, but only because that is not how I work/play on my PC. I completely understand why they are done and do not want it changed if it is too much workload.


----------



## intrepid3d (Apr 24, 2018)

W1zzard said:


> I'm not saying that. What I'm saying is that monitoring, OSD and capture add some CPU load (not a random variation).
> 
> Of course trying to measure the impact has some variation (a small percentage of the impact itself)



Ah, OK. It's a very low load, a couple of percent at most, and it's the same for all CPUs.


----------



## W1zzard (Apr 24, 2018)

Charcharo said:


> Egh if it is the same load for all CPUs, IMHO it don't matter.


The load should be proportional to FPS, so not exactly constant, but probably fine, because it's just percentages on top of percentages.

My main thought is this though: This load can probably migrate to an unloaded core in a multi-core system. On a system with few cores, that are fully loaded already by the game, this will eat into the CPU time available to the game.
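The "proportional to FPS" point can be put into numbers with a toy model: if the overlay and capture cost a roughly fixed amount of CPU time per frame, the same tool shaves more FPS off a fast system than a slow one. The 0.05 ms per-frame cost below is an assumed figure for illustration, not a measurement:

```python
def fps_with_overhead(fps: float, overhead_ms_per_frame: float) -> float:
    """Add a fixed per-frame cost to each frame time and return the new FPS."""
    frame_time_ms = 1000.0 / fps
    return 1000.0 / (frame_time_ms + overhead_ms_per_frame)

OVERHEAD = 0.05  # assumed per-frame OSD/capture cost in milliseconds

for fps in (60, 144, 300):
    new = fps_with_overhead(fps, OVERHEAD)
    loss = 100 * (1 - new / fps)
    print(f"{fps:3d} FPS -> {new:6.1f} FPS ({loss:.2f}% loss)")
```

At 60 FPS the loss is a fraction of a percent, while at 300 FPS the same per-frame cost eats well over one percent, which matches the "more at higher FPS" observation above.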


----------



## Charcharo (Apr 24, 2018)

W1zzard said:


> The load should be proportional to FPS, so not exactly constant, but probably fine, because it's just percentages on top of percentages.
> 
> My main thought is this though: This load can probably migrate to an unloaded core in a multi-core system. On a system with few cores, that are fully loaded already by the game, this will eat into the CPU time available to the game.



Seems fair enough to me. Won't affect benches noticeably, IMHO, and it is actually closer to real-world results (even if not for the correct reasons). 
Of course, the DF way seems best, but it is harder.


----------



## ORLY (Apr 24, 2018)

Minimum FPS?
0.1% and 1% lows? It would be really cool to know which CPUs give a better gaming experience; average framerate doesn't tell you that.


----------



## Vayra86 (Apr 24, 2018)

Charcharo said:


> Oh mate I think we did not understand each other. I agree and even stated that games like Starcraft 2 (which I play, pretty well might I add as long as I am drunk!) do matter. I play such titles, I even mentioned Call of Chernobyl which when configured for max simulation simply slaughters CPUs. Starcraft 2 looks like an idle test in comparison for single-threaded benches in comparison, it is that bad.
> 
> My point is that the performance profile at 720p and at 1080p or above is not QUITE the same. People have this idea that CPU resources are unaffected by resolution (incorrect) and that all things a game needs require similar percentile resources or time to complete. That is not true, unfortunately. Now you will say "OK Alex, fair enough you are right. I will add that nuance. Still at 720P or low resolutions in general we see more of the CPU at work even with that nuance added in." and I would agree with you. I just do not think it matters for most titles bar E-Sports ones. That was my point. It is not me not knowing or disagreeing with you in general.
> 
> ...



You're absolutely right, I think, about the onus being on the devs, but there are limitations to that as well: the budget, the engine in use and the limitations of that engine, and coding talent too. And wherever the devs fall short, we use hardware to compensate. It's always been that way and always will be. Software is almost never 'perfect'.

The bottom line for the vast majority of games, in my view, doesn't really change: single thread is, and for the foreseeable future will be, the limiting factor when it comes to CPU performance. You're right about Ashes, and there are other examples of new games that use cores extremely well; I see them too. But even in those cases, performance does not scale linearly with core count. There are many more things in the entire pipeline that can limit FPS, some of which even faster hardware cannot overcome. But it does still help in lots of scenarios.

Physics simulations are easy to divide across cores and threads, as is any non-sequential workload in games. New APIs provide easier access to actually implementing code that way, and this will in the longer term (we're not *really* seeing it yet today, let's be honest; almost every bench linked in this topic is proof of that, as almost none of the results show a tangible FPS increase from core/thread counts) provide the headroom to make more complex games.



Melvis said:


> You find it funny that in 2018 im running a FX CPU with a SLi setup thats still able to play games at 1440P? I think the jokes on you kid
> You obviously have no clue at all how a CPU works with modern software in 2018. Might need to go do some research before commenting next time?
> 
> Here I shall give you a few links so you can go looking for yourself and find out how even a older CPU is still able to play modern games even in 2018.
> ...



You keep harping on about a dead res, and if you are not running into a performance bottleneck on your rig, more power to you. I'm not 'skipping' facts, by the way; quite the opposite, I am supplying evidence that supports them, and I will always keep asking questions when it comes to performance. All you've done is repeat yourself a couple of times and link Google.com. And yes, there are a lot more multi-threaded programs, but if you had taken the time to actually get into the supporting evidence on single-thread-limited gaming scenarios, you would see most cores are loaded with nothing substantial, at least nothing that improves FPS. We're at 15 pages now... do scroll through them and take a look. The comment about FX and 2012... you do realize I just upgraded from Ivy Bridge, yes? I was around back then, just not here... I suppose I shouldn't tell you how old I am, but you're quite far off the mark with 'kid'. 

Beyond that, let's agree to disagree, because this is going nowhere substantial, except for getting personal.


----------



## DeathtoGnomes (Apr 25, 2018)

ORLY said:


> Minimum FPS?
> 0.1% and 1% lows? It would be really cool to know what CPUs give better gaming experience, average framerate doesn't tell you that.


It depends on the game; you can't just lump all games into this statement.


----------



## cucker tarlson (Apr 25, 2018)

DeathtoGnomes said:


> it depends on the game, you cant just lump all games into this statement.


Agreed, but they still need to be tested for the majority.


----------



## GoldenX (Apr 25, 2018)

Man this is good, pass the popcorn!


----------



## cucker tarlson (Apr 25, 2018)

trparky said:


> I have a feeling that Zen 2 is going to be where Ryzen will take off like a rocket. Not only is Zen 2 (Ryzen 3000 Series) going to be based upon GloFlo 7nm process node but AMD has also partnered up with IBM and they have some seriously smart people at IBM. If there's anybody in the business that will know how to take Zen to the next level, it will be the people at IBM.
> 
> What's really funny about Intel is what I was reading over at AnandTech's forums regarding Intel's struggles to get to 10nm. It seems that Intel is completely drunk on their own Kool-Aid and refuses to reach out to outside experts despite the fact that the likes of TSMC and Samsung have both successfully made the leap to 10nm (and beyond) while Intel is still struggling at 14nm.
> 
> ...


Intel has had 8 cores hitting 4.5 GHz since Haswell-E on 22 nm, while Ryzen is struggling to hit 4300 MHz on 12 nm. I don't know about Skylake-X, but I think it can do 4700 MHz on 12 or even 14 cores if you replace the pigeon poop they put under the IHS. Coffee Lake is an improvement over Skylake already. I don't think they'll have any problems releasing an 8-core 9700K at 4.5 GHz stock, capable of 5 GHz+. I don't think AMD can leap from 4.2 to 5 GHz that easily; you can see how hard it was to jump from 4100 MHz on Ryzen to 4250 MHz on Ryzen 2.


----------



## GoldenX (Apr 25, 2018)

The Pentium 4 reached 4 GHz while a 2.2 GHz Athlon destroyed it.
Max frequency is not an argument; price/performance is. Or do you prefer a HEDT 8-core? Right now there is no 8-core i7 on the mainstream socket.
If it wasn't for AMD, we would still have $200 dual-core i3s.


----------



## cucker tarlson (Apr 25, 2018)

GoldenX said:


> Pentium 4 reached 4GHz while a 2,2GHz Athlon destroyed it.
> Max frequency is not an argument, price/perfoance is. Or do you preffer a hedt 8 core? Right now there is no 8 core i7 on the mainstream socket.
> If it wasn't for AMD, we would still have $200 dual core i3s.


Well, it clearly wasn't price/performance for Pentium vs Athlon then. Check with @Melvis  what year it is.   The whole point of this Ryzen 2 is the frequency increase, so you pretty much touché'd yourself, sir, lol.


Yeah, 2K dual-core i3s; are there unicorns in that story too?


----------



## Melvis (Apr 25, 2018)

cucker tarlson said:


> Your purchase is not my problem.
> So you're saying no bottlenecks ?
> Not with fx cpu ? Not with 2.0 lanes ?



Well, apparently it is, since you brought it up?
No, I didn't say that. I'm saying it can still play the games I play at 1440p.
PCIe 2.0 lanes don't affect it at all.


----------



## phill (Apr 25, 2018)

Jeez...  I'm glad I only posted once or twice..  Who's had the handbags out??  I think the keyboard warriors are out in force again!!


----------



## Shatun_Bear (Apr 25, 2018)

AMD Pinnacle Ridge: Ryzen 2000 also benefits greatly from memory tuning

The findings on memory speed/timings with respect to FPS and frametimes for the 2700X are pretty awesome. Or too good to be true. Intel doesn't benefit nearly as much, and in some cases the 2700X is faster than the 8700K.


----------



## cucker tarlson (Apr 25, 2018)

In the case of the 8700K, Intel gains about the same as AMD from the leap from stock memory to 3466, but Intel gains an additional 1% from timings while AMD gains something like another 8%. They test 1080p Ultra with a 1080 Ti, so it seems the 8700K with 3466 pretty much maxes out the GPU, as its gain from improved timings is marginal. Ryzen needs those timing tweaks on top of speed to max out GPU utilization.
And come on, dude, 1% is not "faster"; 1% is margin of error when both can drive the GPU at max utilization. I know you're rooting for AMD, but don't do that, don't use double standards; it will come back to bite you. You're the one who said 10% is small when Ryzen is behind the 8700K.

This is a good find, though. It actually means you can go 3400 CL16 with Intel and max out your 8700K's performance, while you've got to go 3400 CL14 to max out Ryzen. People will still claim Ryzen is cheaper because it includes a stock cooler .....


----------



## GoldenX (Apr 25, 2018)

cucker tarlson said:


> Well it clearly wasn't for pentium vs athlon then. Check with @Melvis  what year it is.   the whole point of this ryzen 2 is frequency increase so You pretty much touche'd yourself sir lol.
> 
> 
> Yeah 2K 2 core i3s, are there unicorns in that story too ?



$200, please; if you are going to answer, read carefully.
Check the i3-7350K, and the almost-released i3-7360X.
Ryzen 2 is refining the existing arch, so you get lower power, higher clocks, improved features (XFR2), and fixed problems (the Linux segfaults are fixed from the get-go).
Only you see it as just a frequency increase. Only Intel does that and calls it an improvement.
Ryzen is cheaper because you don't need to spend on a cooler and an expensive chipset, and you get more cores for the same price.
Intel's lineup had been the same joke for years until Coffee Lake, which is just Kaby Lake with two extra cores.


----------



## cucker tarlson (Apr 25, 2018)

Almost released. That wins it for today.



GoldenX said:


> Ryzen 2 is refining the existing arch, so you get lower power
> Only you see it as only a frequency increase. Only Intel does that and calls it an improvement.



looooool
an improvement of -0.1 kJ worse efficiency than 1800X


----------



## GoldenX (Apr 25, 2018)

Same efficiency with higher clocks equals better performance without drawbacks. Not like the i9 line being worse than previous HEDT processors.
No discussion about price, multi-thread performance, delid necessity, or stopping a virtual monopoly?
Of course the i3-7360X is a joke; do you want to pay $200 for a dual core on a platform designed for over 10 cores? Best thing is, Intel even released review samples.


----------



## W1zzard (Apr 25, 2018)

GoldenX said:


> Same eficiency with higher clocks equals better performance without drawbacks


Our new Energy Usage test takes performance into account


----------



## GoldenX (Apr 25, 2018)

W1zzard said:


> Our new Energy Usage test takes performance into account


Point taken, thank you. 
Will you review the non X versions too?


----------



## Shatun_Bear (Apr 25, 2018)

cucker tarlson said:


> In case of 8700K intel gains about the same as AMD when it comes to the leap from stock memory to 3466, but Intel gains additional 1% from timings while AMD gains like another 8%. They test 1080p Ultra with a 1080Ti, so it seems 8700K with 3466 pretty much maxes out the GPU as the gain from improved timings is marginal. Ryzen needs those timing tweaks on top of speed to max out the GPU utilization.
> And come on dude, 1% is not faster, 1% is margin of error when both can drive the GPU at max utilization. I know you're rooting for AMD but don't do that, don't use double standards, it will come back to bite you. You're the one who said 10% is small when Ryzen is behind 8700K.
> 
> This is a good find, though it actually means you can go 3400 CL16 with Intel and max out your 8700K performance, while you gotta go for 3400 CL14 to max out ryzen, though People will still claim Ryzen is cheaper cause it includes a stock cooler .....



No, OK, I should clarify: I mean that Ryzen gaining 16% to overtake the 8700K is very impressive, not that it's 1% faster.


----------



## W1zzard (Apr 25, 2018)

GoldenX said:


> Will you review the non X versions too?


AMD plans to sample them in early May, so yes. I have half a dozen Intel CPUs here that will keep me busy until then.


----------



## GoldenX (Apr 25, 2018)

I would love to see some Pentiums, the G4560 still tops the price/performance chart.


----------



## cucker tarlson (Apr 25, 2018)

GoldenX said:


> Same eficiency with higher clocks equals better performance without drawbacks. Not like the i9 line being worse than previous hedt processors.
> No discussion about price, multi thread performance, delid necesity or stoping a virtual monopoly?
> Of course the i3 7360X is a joke, do you want to pay $200 for a dual core, on a platform designed for over 10 cores? Best thing is, Intel even released review samples.


I agree. Broadwell-E pretty much strolls over Skylake-X when it comes to efficiency. Look at the 7800X vs the 6800K, both at 4500 MHz.

power draw
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,40

performance

https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,30
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,31
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,32
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,33
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,34
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,35
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,36
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,37
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,38
https://www.purepc.pl/procesory/tes...e_i77800x_szesciordzeniowy_skylakex?page=0,39


----------



## GoldenX (Apr 25, 2018)

If they someday release a Coffee Lake-E without that "cache rework", Threadripper will have to lower its prices. I hope this time they don't fragment the HEDT socket with desktop parts.


----------



## cucker tarlson (Apr 25, 2018)

I think the mesh is here to stay, though they'll want to improve it. It works well in workstation stuff, and that's the priority. It sucks what they did with Skylake-X, though. Haswell-E/Broadwell-E had some great gaming SKUs like the 6850K: 6 cores with solder and 40 PCIe lanes for multi-GPU. It was something an enthusiast gamer would buy without having to rob a bank. Now you can only find the increased number of lanes on the 10-core parts.


----------



## W1zzard (Apr 25, 2018)

GoldenX said:


> I would love to see some Pentiums


I even have Celerons


----------



## GoldenX (Apr 25, 2018)

Now you have me really interested. The low end never gets love, and it's what sells the most.
AMD has those horrible 9000-series APUs as the cheap options; the Celeron should destroy them.


----------



## Shatun_Bear (Apr 25, 2018)

GoldenX said:


> Now you have me really interested. The low end never gets love, and it's what sells the most.
> AMD has those horrible 9000 APUs as the cheap options, the Celeron should destroy them.



They also have the impending Ryzen 2400GE and 2200GE low-power APUs. Or do you mean even cheaper than these? Let's not forget I can buy a 2200G for £75 here in the UK; the 2200GE should be cheaper.


----------



## GoldenX (Apr 25, 2018)

No, the low-power options are always more expensive; they are extra binned.
AMD could release a dual-core + SMT Ryzen APU; that would be a lot better than the FX-era 9500-9800 APUs.


----------



## HTC (Apr 25, 2018)

@W1zzard : have you seen this?

Anandtech discovered why their results were so different than other reviews', and the implications of this could affect this review's results.


----------



## W1zzard (Apr 25, 2018)

HTC said:


> @W1zzard : have you seen this?
> 
> Anandtech discovered why their results were so different then other reviews and the implications of this can affect the review's results.


Not sure why anyone would ever want to force HPET in Windows. Apparently no other reviewer did that, so it's just an isolated failure.


----------



## HTC (Apr 25, 2018)

W1zzard said:


> *Not sure why anyone would ever want to force HPET in Windows.* Apparently no other reviewer did that, so just an isolated failure.



You can "thank" Microsoft for that. If they had kept the up-to-Windows-7 way of measuring time, this would not be needed.

As you can see from Anandtech's benches, the difference can be significant, which *IMO* warrants a check to see if your results were in any way affected. Just pick one of the "worst offenders" from their review and do what they did with the timer (assuming you can): if it shows no difference, it's safe to say your review is free of "timer error". If it does show a big difference, then ...


----------



## DeathtoGnomes (Apr 25, 2018)

W1zzard said:


> Not sure why anyone would ever want to force HPET in Windows. Apparently no other reviewer did that, so just an isolated failure.


It has to be enabled in the BIOS (assuming there is an option in your BIOS) as well as in Windows.


----------



## W1zzard (Apr 25, 2018)

DeathtoGnomes said:


> it has to be enabled in the BIOS ( assuming there is an option in your BIOS) as well as Windows.


You're not supposed to "enable" it in Windows. Doing so (bcdedit /set useplatformclock true) forces Windows to use HPET as the ONLY time source on the system, which causes the issues Anandtech experienced. Windows will automatically do the right thing if things are left at default.
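Part of why the timer source matters at all: games and benchmarks query the high-resolution timer constantly, and an HPET read goes through a slow hardware register, unlike the fast TSC-backed path Windows normally prefers. As a rough illustration (not the actual review methodology), you can get a feel for how cheap a single timer query is on your own system with a few lines of Python:

```python
import time

# Time one million high-resolution timer queries. On a TSC-backed
# clock this is typically tens of nanoseconds per call; a system
# forced onto HPET can be an order of magnitude slower per query.
N = 1_000_000
start = time.perf_counter()
for _ in range(N):
    time.perf_counter()
elapsed = time.perf_counter() - start

per_call_ns = elapsed / N * 1e9
print(f"~{per_call_ns:.0f} ns per timer query")
```

Run the same snippet with and without useplatformclock set and the per-call cost difference becomes visible directly.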


----------



## HTC (Apr 25, 2018)

W1zzard said:


> You're not supposed to "enable" it in Windows. Doing so (bcdedit useplatformclock true) forces Windows to use HPET as ONLY time source on the system *which causes the issues Anandtech experienced*. Windows will automatically do the right thing if things are left at default.



It's the other way around, dude: they found out their issues were because the timer was having inconsistencies, which is why they forced the HPET timer to be the ONLY timer, since it's supposedly the most accurate one.

Once they enabled the HPET-only timer, their results were very similar in some cases but very different in others, and this holds for both Intel's and AMD's systems, though in different ways.

EDIT (damn auto merge)



HTC said:


> *It's the other way around, dude: they found out their issues were because the timer was having inconsistencies, which is why they forced the HPET timer to be the ONLY timer, since it's supposedly the most accurate one*.
> 
> Once they enabled the HPET-only timer, their results were very similar in some cases but very different in others, and this holds for both Intel's and AMD's systems, though in different ways.



*And I had a brain fart* ... after reading the new Anandtech review I understood it backwards ... it's once they disabled HPET that their results became more in line with other reviews.

Still, this opens up another can of worms: who's to say that the HPET-only timer results are "the wrong results"? I mean, if benches rely on the timer to accurately give results and the timer is incorrect, the results should also be incorrect, no?
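The worry above can be made concrete with a toy calculation (not from the review): a benchmark's reported score is only as accurate as the clock it divides by, so a timer that over-reports elapsed time deflates the FPS figure even though the hardware did identical work.

```python
# Toy illustration of timer error skewing a benchmark result.
def reported_fps(frames, true_seconds, clock_skew=1.0):
    """clock_skew > 1.0 models a timer that runs fast (over-reports time)."""
    measured_seconds = true_seconds * clock_skew
    return frames / measured_seconds

print(reported_fps(600, 10.0))                   # accurate clock: 60.0 FPS
print(reported_fps(600, 10.0, clock_skew=1.25))  # fast clock: 48.0 FPS
```

The same 600 frames of real work read as 60 FPS or 48 FPS depending purely on the clock, which is exactly why two reviews with different timer configurations can disagree.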


----------



## JRMBelgium (Apr 26, 2018)

The performance per dollar, does that take into account that you need to buy a cooler for the 8700K? Because I don't think it's fair to make a "performance per dollar" chart if you're not comparing apples to apples.
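The point being raised can be sketched with a quick calculation; all prices and scores below are hypothetical placeholders, not figures from the review:

```python
# Sketch: including a required cooler in the CPU price can flip a
# performance-per-dollar comparison. Numbers are made up for illustration.
def perf_per_dollar(score, cpu_price, cooler_price=0.0):
    return score / (cpu_price + cooler_price)

# Hypothetical CPU that ships with a usable stock cooler:
with_stock = perf_per_dollar(100, 320)
# Hypothetical faster CPU that also needs a $40 aftermarket cooler:
needs_cooler = perf_per_dollar(110, 370, cooler_price=40)

print(with_stock, needs_cooler)
```

Depending on whether the cooler is counted, the ranking of the two parts can change, which is why the comparison has to be apples to apples.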


----------



## oli_ramsay (Apr 28, 2018)

I still think my 3770K is OK by today's standards.  I don't see the point in upgrading until next year when Ryzen 2 is out.  I mainly game anyway.

Still good to see competition though!
Now if only the GPU side of things were the same!  I need something better than a 480 for my FreeSync monitor and I'm not paying >£500 for a Vega 56 lol


----------



## Zatanus (May 7, 2018)

Can anyone please tell me how virtualization with VMware has been tested?


----------



## W1zzard (May 7, 2018)

Zatanus said:


> Can anyone please tell me how virtualization with VMware has been tested?


10 VMware instances are started in parallel and the time is measured up to a certain point


----------



## Zatanus (May 7, 2018)

W1zzard said:


> 10 VMware instances are started in parallel and the time is measured up to a certain point



Thanx for the reply.

Can I replicate that, or is it a TechPowerUp custom process?


----------



## W1zzard (May 7, 2018)

Zatanus said:


> Thanx for the reply.
> 
> Can I replicate that, or is it a TechPowerUp custom process?


It's a custom VM with a custom test driver; nothing comes from anything that's available commercially or in public. So no.
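TechPowerUp's actual test driver is private, but the general shape of a "start N VMs in parallel and time them" harness can be sketched. The VM paths below are placeholders; `vmrun` is VMware's real command-line tool, but nothing here reflects the review's actual setup:

```python
# Hedged sketch of a parallel-VM timing harness (not TPU's real driver).
import subprocess
import time
from concurrent.futures import ThreadPoolExecutor

def time_parallel(commands):
    """Run all commands concurrently; return seconds until the last finishes."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=len(commands)) as pool:
        # check=True propagates failures, so a VM that fails to boot is noticed
        list(pool.map(lambda cmd: subprocess.run(cmd, check=True), commands))
    return time.perf_counter() - start

# Hypothetical usage with placeholder .vmx paths:
# elapsed = time_parallel([["vmrun", "start", f"bench{i}.vmx", "nogui"]
#                          for i in range(10)])
```

The interesting part of such a test is that it loads all cores and the storage subsystem at once, which is presumably why it scales well with core count.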


----------



## Charcharo (May 9, 2018)

Vayra86 said:


> You're absolutely right I think about the onus being on the devs, but there are limitations to that as well. In budget, the engine in use and the limitations of that engine, and coding talent as well. And wherever the devs fall short, we use hardware to compensate for it. It's always been that way and always will be that way. Software is almost never 'perfect'.
> 
> The bottom line for a vast majority of games in my view, doesn't really change: single thread is and will for the foreseeable future be the limiting factor when it comes to CPU performance. You're right about Ashes. And there are other examples of new games as well that use cores extremely well, I see them too. And at the same time, even in those cases the performance does not scale linearly with core counts. There are many more things in the entire pipeline that can limit FPS some of which even faster hardware cannot overcome. But it does still help in lots of scenarios.
> 
> ...



Problem is, it is not about average IPC. Ryzen 2000 is close to Intel there, around a 5% difference, so one would expect it to be the same in games at equal clocks, yet the Ryzen 2700X (which is clocked higher) sometimes cannot win even vs the i5 8400. I honestly think it will take time until the architecture in general is used better by the, let's be honest, mostly bad programmers who work on video games. And the Nvidia driver issues with Ryzen do not help in some older game benches.

Ashes has split AI across its threads too, not just physics. Men of War has done neither; hell, it is still 32-bit. Another low-budget game beating Starcraft 2 on technological fronts, though; it makes one wish Blizzard would actually get good at their job.

*I say this as a person who plays Starcraft 2 and likes it a lot, but seriously, it's getting silly.


Anyway, game logic at 720p and at 1080p may have different bottlenecks. CPUs and games are not so simple; that is my problem with the tests. Still, for me personally, as a high-refresh-rate gamer, going from Haswell to a Ryzen 5 1500X was a gigantic jump in all scenarios (though I am using a Fury on top, so no Nvidia/Ryzen driver issues in older games). Now if only I had more cash, I would plan an upgrade to the 2600X.


----------



## Shatun_Bear (Oct 10, 2018)

Seeing as the 9900K releases for £600, in hindsight buying this 8-core Ryzen was one of the best purchases you could have made earlier this year for £280.


----------



## phill (Oct 10, 2018)

Shatun_Bear said:


> Seeing as the 9900K releases for £600, in hindsight buying this 8-core Ryzen was one of the best purchases you could have made earlier this year for £280.



I especially like a 1700X for £150...  Bargain deal if ever there was one


----------



## JRMBelgium (Oct 10, 2018)

phill said:


> I especially like a 1700X for £150...  Bargain deal if ever there was one



The current Price/Performance ratio in Belgium/The Netherlands:





So you are correct. Also in my country the 1700X is a better deal than the 2700X when it comes to performance per euro.


----------

