# AMD Ryzen 7 1800X 3.6 GHz



## W1zzard (Mar 17, 2017)

Arguably the most important product launch for AMD as a processor company, Ryzen 7 1800X is an attempt to get back into the big league with the Intel Core i7 series in what is a textbook David vs. Goliath battle. Besides CPU tests, we also included 14 games tested at 1080p and 1440p.



----------



## The Quim Reaper (Mar 17, 2017)

Superb chips for productivity tasks, good enough for gaming.

The only thing putting me off right now is the buggy ecosystem supporting them and the fact that they're such poor overclockers.

I was really hoping they would at least get to 4.3 GHz, but alas, it looks like we'll have to wait for the first refresh of Ryzen in 2018 before overclocking headroom improves.


----------



## thebluebumblebee (Mar 17, 2017)

Was worth the wait for this review.


----------



## IceScreamer (Mar 17, 2017)

Man, gaming performance aside this is a great CPU. I mean I'm blown away by the power consumption.

When they iron out the annoyances this will be a great platform.


----------



## Joss (Mar 17, 2017)

This chip is more than "good enough" for gaming, as many put it; it's very good.
With the upcoming mobo tweaks and the RAM issues solved, I can imagine a 1600X rig by my desk side for years to come.


----------



## the54thvoid (Mar 17, 2017)

Well I've bought into the Ryzen ecosystem for one reason.

If I buy a KL chip now, it'll be superseded soon enough by another Intel chip, so in 3 years' time there will have been like 5 high-end Intel chips out.  Anyway, at 1440p with max settings on everything and a new 1080 Ti, the 1700X I've gone for should be better than my Sandy-E.  And if Bethesda go full AMD-optimised, maybe it's not so much a gamble for me.


----------



## Jhelms (Mar 17, 2017)

Appreciate the review - a TON of work clearly went into this, and it well defines the current state of Ryzen. I like how you focus more on the raw data and present it as such. I feel the 1800X is simply not a bargain; the 1700 non-X is quite a beast for the cash, and AMD will have my money here soon. Very impressed with what they have done, in the time they have done it, with the budget and manpower the company has.


----------



## ssdpro (Mar 17, 2017)

Good review, consistent with other reviews - probably a bit better since it wasn't rushed and things have settled some.  Even setting aside whatever happened with the gaming performance, it is a great value.  I like to think of it as more of a balance.  The problem remains that if you are a gamer and already have a 6700K/7700K, you won't enhance anything with these.  You might push specific niche tasks better and maintain reasonable gaming performance, but these niche areas are just not something mainstream users or even most enthusiasts take part in.  Time will tell if this does any actual disruption.  The market says it isn't, and the stock has continued lower since launch.


----------



## geon2k2 (Mar 17, 2017)

Can you please disable 4 cores/1 CCX, just to see what the gaming performance would look like?

I'm particularly interested in titles in which the i7 crushed the R7 that are reasonably popular, such as Fallout 4, Hitman, or Total War: Warhammer.
In Batman, Zen is also behind, but it still manages 140 FPS, so that's not much of an issue.

Thank you in advance.


----------



## mcraygsx (Mar 17, 2017)

The Ryzen 1800X is not in the same league as the mainstream 7700K. Why not compare this product to the Broadwell-E series instead? Fantastic review as always, W1zzard. Neat.


----------



## Sempron Guy (Mar 17, 2017)

Were Hitman, Total War: Warhammer, and Rise of the Tomb Raider tested under DX11 or DX12? Same question for Doom: OpenGL or Vulkan?


----------



## DeathtoGnomes (Mar 17, 2017)

Great review @W1zzard ! 

But I wonder, with power consumption during gaming being lower, whether AMD deliberately used a hard limit there. If they did, that would explain lower gaming performance across the board.


----------



## btarunr (Mar 17, 2017)

Sempron Guy said:


> Were Hitman, Total War: Warhammer, and Rise of the Tomb Raider tested under DX11 or DX12? Same question for Doom: OpenGL or Vulkan?



Hitman, TW, and RoTR are DX12 and Doom Vulkan, yes.


----------



## Evo85 (Mar 17, 2017)

Not sure why lack of integrated graphics is a negative here. If you want that, you buy an APU chip.


----------



## dat_boi (Mar 17, 2017)

What a shame, maybe another time AMD.


----------



## Ubersonic (Mar 17, 2017)

Within a few % of the 7700K in games, half the price of an Intel octocore, me likey.


----------



## P4-630 (Mar 17, 2017)

Not all that impressive for gaming; you're still better off with an i7-7700K for gaming.
Its power usage is impressive for a 16-thread AMD chip, though, compared to their older chips and to Intel.


----------



## Enterprise24 (Mar 17, 2017)

Impressive review. I hope their four-core variants can achieve higher clocks, like 4.5 GHz+, for better gaming performance.


----------



## suraswami (Mar 17, 2017)

As usual Great review W1z!!

For some of us here who still use the 8350 (or similar), it would have been a bit more helpful to see what kind of performance jump Ryzen gives.


----------



## ZoneDymo (Mar 17, 2017)

I want one 
I want to put it to work compressing the crap out of my livestream


----------



## Drac (Mar 17, 2017)

I'm wondering what the numbers would be with memory at 3000 MHz CL15 instead of 2666 MHz CL16.
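The benchmark numbers would need testing, but the first-word latency of the two kits can be ballparked from the timings alone. A quick back-of-the-envelope sketch (the `cas_latency_ns` helper is hypothetical, and real-world results also depend on the secondary timings and Ryzen's fabric clock):

```python
# First-word latency in nanoseconds: CAS latency (in cycles) divided by
# the memory clock. For DDR memory, the clock is half the transfer rate.
def cas_latency_ns(ddr_rate_mt_s: float, cas_cycles: int) -> float:
    clock_ghz = ddr_rate_mt_s / 2 / 1000  # e.g. DDR4-2666 -> 1.333 GHz
    return cas_cycles / clock_ghz

print(round(cas_latency_ns(2666, 16), 1))  # DDR4-2666 CL16 -> 12.0 ns
print(round(cas_latency_ns(3000, 15), 1))  # DDR4-3000 CL15 -> 10.0 ns
```

So the 3000 CL15 kit is both higher bandwidth and roughly 2 ns lower latency; any benchmark delta would come on top of that.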


----------



## Shatun_Bear (Mar 17, 2017)

thebluebumblebee said:


> Was worth the wait for this review.



Hardly.

The low score was predictable after AMD stopped sending TPU first-round review samples.

It seems anything Nvidia = 9.7-9.9 score, anything AMD as low as 8.7. Really transparent bias.


----------



## W1zzard (Mar 17, 2017)

Sempron Guy said:


> Were Hitman, Total War: Warhammer, and Rise of the Tomb Raider tested under DX11 or DX12? Same question for Doom: OpenGL or Vulkan?


DX12 / Vulkan



Shatun_Bear said:


> The low score was predictable after AMD stopped sending TPU first-round review samples.


It's a first-round sample. I just didn't have time.



Evo85 said:


> Not sure why lack of integrated graphics is a negative here. If you want that, you buy an APU chip.


I wanted to inform you personally that Ryzen isn't for you, due to lack of IGP. No man, you can't plug anything into the monitor outputs on the Ryzen motherboards! Ryzen APUs are coming out in H2. For the majority of people this is a non-issue, that's why it wasn't mentioned in the conclusion text.


----------



## dat_boi (Mar 17, 2017)

It's funny how many times hardware reviewers get accused of being anti-AMD, while they are doing their damnedest to paint its products favorably as if their livelihood depended on it, because it does.


----------



## newtekie1 (Mar 17, 2017)

> AM4 still has a rectangular cooler mount-hole layout (as opposed to square ones on Intel LGA platforms). AMD should have switched to a square layout, to make it easier to orient tower-type coolers to blow hot air out of the rear of the case. Current AM4-ready tower-type coolers have elaborate retention module kits that let you do that. Most popular cooler vendors are either selling or giving away AM4 retention modules for free. You often also have to remove the plastic retention module motherboards ship with, to install certain kinds of coolers.



This is what annoys the crap out of me (Intel is guilty too).  Why change the mounting hole layout so we have to buy (or the heatsink manufacturers have to give away at a loss) new retention brackets?  Did Intel really need to make the holes on the 115X platform ever so slightly larger than on the 775?  Did AMD really need to do the same between AM3+/FM2+ and AM4?  And it makes even less sense that AMD wouldn't take the opportunity to make their mounting holes square; they should have just matched the Intel spacing already in use.  Make it easier for all of us!



dat_boi said:


> It's funny how many times hardware reviewers get accused of being anti-AMD, while they are doing their damnedest to paint its products favorably as if their livelihood depended on it, because it does.



It is the classic fanboy problem.  If you speak ill of a product by the fanboy's chosen brand, you are instantly biased.  The reality is the reviewer just feels the product isn't as good as the competition that scores higher.  Right now AMD is still behind both Intel and nVidia in terms of overall product quality, so their scores in reviews reflect that.  They are a lot closer than they were last year, that is for sure, but not quite to the point of being equal or better.  Of course, I'm sure I'll get "well, Ryzen is better than Intel in XYZ area," or "Ryzen is cheaper than one single Intel processor, so Ryzen should have better scores," or "AMD GPUs are better than nVidia if you look at a few specific games."  Well, yes, if you look at a few specific areas, AMD is better, but if you look overall, they are not.

People saying W1zzard is biased need to review their history.  This site started largely as an ATI resource.  W1zzard has gone so far as to hide an easter egg in an nVidia review that said "epic fail" because the card was so bad.  And I believe an nVidia card still holds the crown for the lowest score ever received in a review here.


----------



## G33k2Fr34k (Mar 17, 2017)

It would've been nice to see a 6900K alongside the 1800X, just for reference. The comparison against the 7700K is certainly good to see, but the 7700K is not the 1800X's competitor. The power consumption figures are amazing!

I'm waiting for the 1400X 3.9GHz quad core CPU. I'd love to see a review of it compared against the 7700K when it comes out.


----------



## Lightofhonor (Mar 17, 2017)

Went with a 1700X since, when I preordered at least, it wasn't completely known whether the 1700 would be able to achieve similar overclocks.

Overall, great review! I agree the motherboard issues are a pain, but even with those I am still way outperforming my old i5 with 1333 MHz memory.  And it will just get faster.

I do wish we could keep XFR/boost while overclocking. Enjoying my 3.8 GHz, but knowing I am actually slower in single core makes me want to OC more...


----------



## mcraygsx (Mar 17, 2017)

newtekie1 said:


> This is what annoys the crap out of me (Intel is guilty too).  Why change the mounting hole layout so we have to buy (or the heatsink manufacturers have to give away at a loss) new retention brackets?  Did Intel really need to make the holes on the 115X platform ever so slightly larger than on the 775?  Did AMD really need to do the same between AM3+/FM2+ and AM4?  And it makes even less sense that AMD wouldn't take the opportunity to make their mounting holes square; they should have just matched the Intel spacing already in use.  Make it easier for all of us!



The Asus Crosshair VI supports both AM3 and AM4 coolers, since it has 8 mounting holes in total. You can upgrade to a new AM4 cooler or extend the life of your existing AM3 cooler/kit. Motherboard manufacturers can make this transition easier should they choose to do so.

Therefore I wouldn't put much blame on AMD, since Intel does it every generation.


----------



## Ubersonic (Mar 17, 2017)

P4-630 said:


> Not all that impressive for gaming, you're still better off with a i7 7700K for gaming.



To be brutally honest there is zero reason for anyone with a brain to buy a 7700K now.  If a Ryzen chip isn't better than a 7700K for what you're doing, then what you're doing doesn't warrant buying the 7700K over the 7600K either.


----------



## newtekie1 (Mar 17, 2017)

mcraygsx said:


> The Asus Crosshair VI supports both AM3 and AM4 coolers, since it has 8 mounting holes in total.



Yeah, there were some boards that did that with the 775/1156 transition too.  That's even more proof that there was no reason to change the hole layout.  It is nice that some manufacturers recognize this, but it is something AMD and Intel themselves should address.



Ubersonic said:


> To be brutally honest there is zero reason for anyone with a brain to buy a 7700K now.  If a Ryzen chip isn't better than a 7700K for what you're doing, then what you're doing doesn't warrant buying the 7700K over the 7600K either.



Agreed.  If the extra threads of the 7700K are going to help you, then the extra cores of the 1700 will help you even more.  And the gaming performance of the 1700(with an overclock) will be good enough.


----------



## XiGMAKiD (Mar 17, 2017)

Excellent review; the power consumption is impressive. Now if only AMD could use that power consumption advantage to make it perform and overclock better.

Just curious: with Ryzen 5 already announced and launching worldwide on April 11, does TPU already have one?


----------



## Shatun_Bear (Mar 17, 2017)

dat_boi said:


> It's funny how many times hardware reviewers get accused of being anti-AMD, while they are doing their damnedest to paint its products favorably as if their livelihood depended on it, because it does.



So then why is this being compared against a 7700K and nothing else? Last I checked, TPU wasn't a 'gaming' website but a 'hardware' website. AMD didn't compare it to a 7700K for a reason. It is a HEDT competitor, so at least put the 6900K, 6950X, or a 6800K in there.


W1zzard said:


> DX12 / Vulkan
> 
> 
> It's a first-round sample. I just didn't have time.



But you said this:



> Arguably *the most important product launch for AMD as a processor company*



So with a review over 2 weeks after launch, I take it AMD is pretty low down in your estimation?

Yet the Nvidia 1080 Ti gets reviewed a day _before_ it launches.


----------



## Vario (Mar 17, 2017)

Shatun_Bear said:


> Hardly.
> 
> The low score was predictable after AMD stopped sending TPU first-round review samples.
> 
> It seems anything Nvidia = 9.7-9.9 score, anything AMD as low as 8.7. Really transparent bias.


As far as gaming goes, put simply, Ryzen is more money, performs worse, and comes with more hassle. It's scored to account for that.


----------



## dozenfury (Mar 17, 2017)

On the good side, AMD is closer than they've been in many years.  But on the downside, unless you do not game and primarily do heavy productivity/encoding (and are counting every second at that), the 7700K will be the better option.  It's much cheaper than the 1800X and essentially the same price as the 1700X.  Even then, the 7700K can still run the productivity apps with fine performance, and it is far more overclockable, which would widen the gap even further than the results in this review.  I was hopeful for Ryzen, and I don't have a bias either way, since I've had plenty of Intel and AMD systems.  But it's tough to argue for a 1700/X or 1800X as-is.  This is a bit harsh, but realistically, with its performance, lack of OC headroom, and various minor outstanding issues, the 1800X would need to be in the mid $200s to be a buy for me over a $320 7700K.

Not to mention, if I'm encoding something, I'll start it and go do something else for a while, or just let it run in the background.  So the time a passive activity like encoding takes is far less important than gaming or general-use response time, where you are actively interacting and are more directly affected by the performance difference.  FPS and response time are important.  Faster encoding time, on the other hand, is a bit like saying my oven can cook a pizza in 18 minutes instead of 20.  Nice, but something that's tough to get super excited about.


----------



## the54thvoid (Mar 17, 2017)

Shatun_Bear said:


> Hardly.
> 
> The low score was predictable after AMD stopped sending TPU first-round review samples.
> 
> It seems anything Nvida = 9.7-9.9 score, anything AMD as low as 8.7. Really transparent bias.




As someone who has just purchased an entire new Ryzen system, I feel qualified to reply to this.

I've gambled.  I've bought 3200 memory that is on a QVL list.  It should just work.  It better.  I've just bought a £370 CPU that will game worse than a 7700K.  But I'm hopeful some optimising will be done to address the CCX latency (thus the 3200 RAM speed).  I know it won't clock much over 4 GHz (if I'm lucky).  I actually have no idea why I bought a Ryzen CPU.  All I do is game.  I just wanted a change, and you know what?  If it was a mistake and I have issues, I'll come back and ram it down your throat that the Ryzen path is a perilous one.

You know why Nvidia cards often get 9-point-whatever?  Because they are very good.  You did notice AMD used a Titan Xp for its open Ryzen demos?  And in that Fury X-based PC system that never really happened, they used an Intel chip.  Join the dots and don't be so blindly AMD-banner-waving.

In fact - I think I bought an AMD chip so I can get all pious about it and tell the fanboys to go jump off a bridge.


----------



## newtekie1 (Mar 17, 2017)

Shatun_Bear said:


> So with a review over 2 weeks after launch, I take it AMD is pretty low down in your estimation?
> 
> Yet the Nvidia 1080 Ti gets reviewed a day _before_ it launches.



I'm sure the 1080 Ti was a lot quicker to review than Ryzen was.  The test bed was already set up for the 1080 Ti review.  It was just a matter of popping the card in, taking a few pictures, and copy-pasting some information into the review template, and done.  (Sorry, W1zzard.  I know this is a simplification of all the work you put into the reviews.  I'm just trying to make the point that you have the process down to a science for GPU reviews, so they are a lot quicker to do and get published.)

Testing a brand new CPU, when you aren't used to testing CPUs, is a little different.

Plus, W1zzard isn't exactly a CPU reviewer.  Most of the CPU reviews have been done by others, so I can see it taking him a little longer to get the review done and get the process down.  Plus, he is extremely thorough, which is one of the reasons I really like his reviews.  And he is dealing with a brand new platform that several reviewers, some with way more experience in CPU reviews, had problems dealing with.  Hell, just dialing in a stable overclock can take a week of fiddling and stress testing...


----------



## Big_Vulture (Mar 17, 2017)

It took TPU quite a long time...


----------



## efikkan (Mar 17, 2017)

For anyone planning to game on a GTX 1070 or better, this CPU is simply not good enough. An i7-6800K would be a better buy then, even with "only" 6 cores.


----------



## ZeroFM (Mar 17, 2017)

The only problem with the test is that all the games are AAA titles, which is the best-case scenario for AMD. In Arma 3, the difference between the 1800X and the 7700K is 30%+ FPS; the same should hold for ARK, Rust, and The Forest.


----------



## xkm1948 (Mar 17, 2017)

Cons: Lacks integrated graphics?

Is W1zzard smoking something? Who the f*ck in their right mind would want an iGPU on an 8-core HEDT processor? As if anyone actually uses Intel's shit iGPU when they buy a 6700K/7700K.


----------



## W1zzard (Mar 17, 2017)

newtekie1 said:


> I'm sure the 1080 Ti was a lot quicker to review than Ryzen was.  The test bed was already set up for the 1080 Ti review.  It was just a matter of popping the card in, taking a few pictures, and copy-pasting some information into the review template, and done.  (Sorry, W1zzard.  I know this is a simplification of all the work you put into the reviews.  I'm just trying to make the point that you have the process down to a science for GPU reviews, so they are a lot quicker to do and get published.)
> 
> Testing a brand new CPU, when you aren't used to testing CPUs, is a little different.
> 
> Plus, W1zzard isn't exactly a CPU reviewer.  Most of the CPU reviews have been done by others, so I can see it taking him a little longer to get the review done and get the process down.  Plus, he is extremely thorough, which is one of the reasons I really like his reviews.  And he is dealing with a brand new platform that several reviewers, some with way more experience in CPU reviews, had problems dealing with.  Hell, just dialing in a stable overclock can take a week of fiddling and stress testing...


Right on the money; this is my first CPU review. That means selecting and figuring out benchmarks, then building test systems with the hardware that's available, then benching (not exactly a small number of results), then thinking, fixing the bench suite, and rebenching everything (twice for this review), then coming up with structure, layout, texts, and the conclusion.

There will be more CPU reviews from me though.  Just bought an i5-7400, i3-7100, and a Pentium G4560.


----------



## the54thvoid (Mar 17, 2017)

I'm going to crank AA to 64x and put tessellation at warp factor.  That'll reduce my CPU bottleneck.


----------



## xkm1948 (Mar 17, 2017)

If it were me testing, I would probably push it through a whole suite of Linux-based supercomputing performance tests as well as bioinformatics tests. At the least you'd need some Broadwell-E/Haswell-E chips for comparison!


----------



## W1zzard (Mar 17, 2017)

xkm1948 said:


> as well as Bioinformatics test.


Do you have a workload you could share that doesn't run much faster on a GPU?


----------



## xkm1948 (Mar 17, 2017)

W1zzard said:


> Do you have a workload you could share that doesn't run much faster on a GPU?



A lot of the genomic stuff I do cannot be done on a GPU due to lack of VRAM. Protein 3D folding, on the other hand, relies heavily on the GPU (Stanford's Folding@home).

http://www.genome.umd.edu/masurca.html
This is what I use for genome assembly. CPU performance matters a lot in these computation-intensive, Linux-based tests. MaSuRCA is totally free. I think most raw human genome reads are also free to grab on NCBI as sequence read archives. I used to use 30X human genome reads as a performance test for PCs. Assembly of a 30X-coverage human genome takes about 166 hrs on an 8-core/16-thread Ivy Bridge-E Xeon. The same amount of work took 5~6 hrs on my overclocked 5820K. On a Haswell-EP-based 20-core/40-thread supercomputing node it takes ~3 hrs.

And this is de novo (without reference) assembly I am talking about.
So yes, for us bioinformaticians, raw CPU performance is very important.

Example of one human genome sequence read archive (SRA):
https://www.ncbi.nlm.nih.gov/sra/ERX1943173[accn]
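For a rough sense of the speedups those run times imply, a quick sketch using the poster's own approximate figures (variable names are mine):

```python
# Speedups implied by the genome-assembly times quoted above
# (the poster's own approximate figures).
xeon_hours = 166        # 8c/16t Ivy Bridge-E Xeon
oc_5820k_hours = 5.5    # overclocked 5820K, midpoint of "5~6 hrs"
hsw_ep_node_hours = 3   # 20c/40t Haswell-EP node

speedup_5820k = xeon_hours / oc_5820k_hours    # ~30x
speedup_node = xeon_hours / hsw_ep_node_hours  # ~55x
print(f"5820K vs Xeon: ~{speedup_5820k:.0f}x, node vs Xeon: ~{speedup_node:.0f}x")
```

Those ratios are far larger than the core-count and clock differences alone would explain, which suggests factors like memory capacity, AVX2 support, or I/O were also in play.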


----------



## Air (Mar 17, 2017)

I think Ryzen will shake things up in gaming notebooks, where power consumption is crucial and framerates higher than 60 FPS are rarely the target.


----------



## londiste (Mar 17, 2017)

In AnandTech's "Ryzen: Strictly Technical" thread (https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/), The Stilt wrote:


> Note: Current versions of Prime95 (28.10) do not stress Ryzen CPUs properly. The resulting power consumption is abnormally low, and both Firestarter and Linpack result in significantly higher power consumption.


Is there any truth to this?


----------



## 1c3d0g (Mar 17, 2017)

@W1zzard: no distributed computing benchmarks? I'd love to see if AMD's 2x cores beat Intel's higher clocks on SETI@home (or some other BOINC project) work units.


----------



## the54thvoid (Mar 17, 2017)

To be fair, on a gaming front, having looked at the 1440p results, it's good news for me.


----------



## CJCerny (Mar 17, 2017)

If most of your computing time is spent gaming, it sure seems like the 7700K is still the way to go, especially when you consider that it is at least $100 cheaper than this Ryzen CPU at your local Micro Center. If you spend most of your time in apps that like lots of cores, then this is a product to consider. I'm not seeing any bias in this review - buy what suits your workload best. You don't need a $500 CPU to play games well and surf the web.


----------



## birdie (Mar 17, 2017)

An excellent review and a great fanboyism-free conclusion. Thank you, @W1zzard

Several missing pieces though:

You haven't mentioned whether Ryzen is worth the upgrade money (and it's a lot, since you have to buy a new motherboard and RAM) for owners of Intel Core i5-2500/i7-3770 CPUs who game at 1080p on average GPUs (GTX 970/1060/RX 470/480).

Do you think memory support will improve in this generation?
You've complimented AMD for creating this beast of a CPU architecture; however, from what I see, the IPC (even though it's 52% higher than Bulldozer's) is not really there - do you think AMD can actually overtake Intel in IPC with the already-announced Zen 2.0?
A lingering issue is the relatively slow communication between CCX complexes, which kills the performance of applications whose threads need to talk to one another - this doesn't affect rendering/encoding, but it does affect certain games and workflows. So, at least for me, Zen 1.0 is a nice try, and I truly appreciate what AMD has managed to achieve, but I'll be waiting for Coffee Lake (the first hexacore consumer CPU from Intel) / Zen 2.0.


----------



## TheGuruStud (Mar 17, 2017)

Wowzers, only a couple of those tests actually use the cores.


----------



## idx (Mar 17, 2017)

Great review W1zz! 

What is really interesting about this CPU is that if you disable one CCX and do a lil OC, it will perform better in most of these buggy games (which everyone is considering a legit performance measure... sadly).
Plus, it will still beat the i7-7700K in some of the other tests.


----------



## ERazer (Mar 17, 2017)

Would Zen be a good upgrade path from a 2600K? I don't see much comparison out there.


----------



## simlariver (Mar 17, 2017)

I managed to find the review interesting even after reading tons of Ryzen reviews before.
I hope you also get to review the R5 when it comes out.


----------



## PerfectWave (Mar 17, 2017)

Why not compare the 1800X to at least a 6850K?


----------



## etayorius (Mar 17, 2017)

newtekie1 said:


> This is what annoys the crap out of me(Intel is guilty too).  Why change the mounting hole layout so we have to buy(or the heatsink manufacturers have to give away at a loss) new retention brackets?  Did Intel really need to make the holes on the 115X platform ever so slightly larger than the 775?  Did AMD really need to do the same between AM3+/FM2+ and AM4?  And it makes even less sense that AMD wouldn't take the opportunity to make their mounting holes square, and they should have actually just matched the already in use Intel spacing.  Make it easier for all of us!
> 
> 
> 
> ...




Just going to name a few things I've seen over several years:


* Thumbs down to AMD GPUs for not supporting PhysX back in the day. nVidia GPUs never got a thumbs down for not supporting Mantle or FreeSync.

* Thumbs down to AMD GPUs for supporting DX11, claiming it would not be relevant for a while, yet when nVidia released Fermi they got a thumbs up for "substantial performance improvements in DirectX 11". (Fermi at least got lower scores than the HD 5000 series in the end.)

* Not benchmarking Mantle, which did bring extra performance in several games, claiming nVidia didn't support the API and that only a few games used it. But heck, if you're a review site you should review every single feature of a card; if not adding it to the final score, at least leave it there for reference.

* Very, very late benchmarks for DX12/Vulkan, claiming DX12 was unstable and not reliable. This was back when AMD was stomping nVidia into the floor in early DX12 games.

* When DX12/Vulkan benchmarking finally started, the RX 480 got left out of the list, putting the RX 470 against both GTX 1060s (3GB and 6GB).

* W1zz making fun of Ryzen at release on Facebook, going back and forth changing the title.


Those are just a few of the things I've noticed in the past. I only owned one Radeon GPU, the HD 5770, and sold it about a year later - not because it was bad; it actually worked fine with no issues at all. I sold it because I was offered a GTX 470, which was almost twice as powerful. Then I bought a GTX 780 Ti. Prior to my HD 5770 I owned an 8800 GT, a 7600 GT, and an FX 5200, which was my very first one. Hardware-wise I prefer nVidia, but I prefer AMD's ethics, since nVidia sometimes really acts like a d*ck.

Lately I've noticed that TPU has not been reviewing some AMD hardware; they say AMD either did not provide review hardware or provided it late. But if I were AMD, I would totally not send them any more hardware, lol.

I really like AMD, and I want them to succeed to keep both Intel's and nVidia's prices in check; lately that has not been the case, but I don't want to see them go. I plan to support them with Vega IF it offers GTX 1080 performance for less cash.

Anyway, TPU is what it is, and I check every review they do of any brand, but when it's AMD I always expect an 8-8.5 from TPU.

I hope I don't get banned for this.


----------



## thebluebumblebee (Mar 17, 2017)

W1zzard said:


> There will be more CPU reviews from me though.  Just bought an i5-7400, i3-7100, and a Pentium G4560.


Would you please add the i3-7350K to that list?


----------



## xkm1948 (Mar 17, 2017)

etayorius said:


> Just going to name a few stuff i seen from several years ago:
> 
> 
> * Thumbs down to AMD GPUs for not supporting PhysX back in the day. nVidia GPUs never got a thumbs down for not supporting Mantle or FreeSync.
> ...




Pretty sharp observation. I just went back and read some of the old reviews done by W1zzard, and what you said is all true.

Welp, as critical thinkers it should not be that hard for people to connect the dots and figure out the trends now.


----------



## Aenra (Mar 17, 2017)

I pressed 'thanks' before reading it.

When you get an 8c/16t chip for a review, you should compare it to another 8c/16t if one is available, and they have been available for some time now. And you must do so by i) over/underclocking until both have the same frequencies, running the exact same RAM at the exact same frequencies and timings, and ii) pinpointing the performance difference when everything is left at stock defaults. Then and only then do you bother with 'unfair' comparisons between different lines [8c vs 4c and so on]. And should you go down that route, one would expect the wording to specify how problematic (hint!!!) such comparisons are.
While I have no doubt W1zzard's technical knowledge far surpasses even my wildest dreams, his way of thinking and mentality are highly problematic for a reviewer's mind.

Man, was that not a letdown. Page after page about where it's all heading, what with an entire damn market comparing apples to oranges and calling the outcome of such comparisons an 'educated' opinion!!! And yet, here we are, lol.

Removed my thanks, expressed my disappointment, and politely moving on.
What-ever. Like, seriously.


----------



## newtekie1 (Mar 17, 2017)

etayorius said:


> Thumbs down to AMD GPUs for not supporting PhysX back in the day. nVidia GPUs never got a thumbs down for not supporting Mantle or FreeSync.



PhysX added elements to games that AMD had no alternative to.  Mantle was just an alternate rendering path whose only real benefit was improving performance on AMD GPUs, and since nVidia GPUs were already outperforming AMD's, Mantle wasn't necessary for them.

And since nVidia offers a competing solution to FreeSync, again, it isn't a con.  You just have to pick which of the two technologies you want.  If it had no alternative to FreeSync, then it would be a con.  I don't believe "No G-Sync support" was ever a con (except maybe for a brief time after nVidia came out with G-Sync, before AMD scrambled to rebrand someone else's technology as their own so they would have an alternative).



etayorius said:


> Thumbs down to AMD GPUs for supporting DX11, claiming it would not be relevant for a while, yet when nVidia released Fermi they got a thumbs up for "substantial performance improvements in DirectX 11". (Fermi at least got lower scores than the HD 5000 series in the end.)



You know that in the GTX 480 review, "DirectX 11 won't be relevant for quite a while" is listed as a con, right?  Same thing in the GTX 470 and GTX 460 reviews.  So it seems kind of idiotic to say they are biased for putting that in AMD reviews when it is in nVidia reviews too.  But yes, it is a pro that the nVidia GPUs were significantly more capable at DX11.  That isn't bias; that is just the facts.



etayorius said:


> Not benchmarking MANTLE which did brought final Performance in several games, claiming nVidia did not supported the API and that only a few games used it. But heck, if you're a review site you should review every single feature of a card, if not adding to the final score, at least leave it there for reference.



Yeah, and how relevant did Mantle end up becoming? The total number of games that used it ended up being something like 12... Seems the decision not to waste time with it was a good one. And really, it shouldn't have been a concern for people buying the card either, because it ended up not mattering. And if someone did buy the card because they hoped Mantle would be the next big thing, they ended up pretty disappointed.



etayorius said:


> Very very late Benchmarks for DX12/Vulkan claiming DX12 was unstable and not reliable. This was back when AMD was stomping nVidia to the floor with early DX12 games.



By "games" you mean a single game, right? Because Ashes of the Singularity was about the only game using DX12 at that time...



etayorius said:


> When finally started benchmarking DX12/Vulkan the RX480 got left out of the list, putting the RX470 against both GTX1060 (3GB and 6GB).



And which review are you talking about? Because I just went through every single GTX 1060 review, and every single one had the RX 480 in it.



etayorius said:


> Wizz making fun of Ryzen at release in Facebook, going back and forth changing the title.



He's made fun of nVidia plenty in the past. I even mentioned it already.


----------



## Joss (Mar 17, 2017)

efikkan said:


> For anyone planning to game on a GTX 1070 or better, this CPU is simply not good enough



What?





This is basically the same performance as the future 1600X (same chip, same clocks, only two cores disabled) for $249. _Not good enough_?


----------



## 0x4452 (Mar 17, 2017)

etayorius said:


> * Thumbs Down to AMD GPU's for not supporting PhysX back in the day. nVidia GPU's never got a Thumbs Down for not supporting MANTLE or FreeSync



* Mantle is specific to AMD GPUs: the software API matches the HW interface exposed by the AMD GPU. Wanting NVIDIA to support that is nonsense. Similar to Glide back in the day from 3dfx.
* PhysX was not started by NVIDIA, and it also had a CPU fallback code path.
* FreeSync in my book is a rip-off of G-Sync. NVIDIA invested in the R&D and proved the concept; obviously they want the feature to be a premium. And it is.

In general, AMD fanboys keep bashing NVIDIA for not seeing gains from DX12. As if that matters; DX12 doesn't make your games prettier, it's a power move by M$/AMD. NVIDIA invests in GameWorks, PhysX, G-Sync to make gaming and graphics better, and gets accused of being evil. WTF?


----------



## dwade (Mar 17, 2017)

More power efficient than Intel. Did AMD sell their soul to the devil!?


----------



## etayorius (Mar 17, 2017)

newtekie1 said:


> PhysX added elements to the game that AMD had no alternative to.  Mantle was just an alternate rendering path with the only real benefit being to improve performance on AMD GPUs, while nVidia GPUs were already outperforming AMD so mantle wasn't necessary for them.
> 
> And since nVidia offers a competing solution to FreeSync, again it isn't a con.  You just have to pick which of the two technologies you want.  If it had no alternative to FreeSync, then it would be a con.  I don't believe "No Gsync support" was ever a con(except for maybe for a brief time after nVidia came out with Gsync before AMD scrambled to rebrand someone else's technology as their own so they would have an alternative).
> 
> ...



No, just no.

The HD5870 ran cooler with a lower TDP than the GTX480. The only reason the 480 won was that its clock speed was pushed way too high to beat the 5870, which had been out for more than 6 months. It is idiotic to say the 480 had superior DX11 performance. You also missed the point about Mantle/PhysX. Yes, one is an API and the other is middleware, but you can't really blame nVidia for not supporting Mantle, yet it seems it is OK to blame AMD for not supporting PhysX. What sort of mentality is this?


----------



## Shatun_Bear (Mar 17, 2017)

Aenra said:


> I pressed 'thanks' before reading it.
> 
> When you get an 8c/16t for a review, you should compare it to another 8c/16t if they are available; and they have been for some time now. And you must do so by i) over/underclocking until both have same freqs, run the exact same RAM, at the exact same freqs and timings, ii) pinpointing the performance difference if all was left at a default stock. Then and only then do you bother with 'unfair' comparisons between different lines [8c vs 4c and so on]. And should you go down that route, one would expect the wording to specify how problematic (hint!!!) such comparisons are.
> While i have no doubt Wizzard's technical knowledge far surpasses even my wildest dreams, his way of thinking and mentality is highly problematic; for a reviewer mind.
> ...



This is what I said, but no justification from W1zzard on why he only compared it to a 7700K. That's completely not the CPU this should be compared with; it's on a hiding to nothing with all these tests in this review that favor frequency over cores (which is strange, again).

Sure, throw a 7700K in there, but at least put a 6900K alongside it as well for comparison.



etayorius said:


> Just going to name a few stuff i seen from several years ago:
> 
> 
> * Thumbs Down to AMD GPU's for not supporting PhysX back in the day.          nVidia GPU's never got a Thumbs Down for not supporting MANTLE or FreeSync
> ...



The most recent one was perhaps the most telling -

At the 480/1060 launches, the cards were benched with a really, really old suite of games featuring the likes of BF3 and BF4 and no DX12/Vulkan titles. The conclusion was then drawn from the performance summary that the 1060 was significantly faster than the 480, because it was quite a lot faster in those old DX11 titles W1zzard benched with. Bizarrely, just _after_ both cards had been reviewed (and opinions formed from the reviews by TPU readers), he updated his bench suite with lots of newer games plus DOOM Vulkan and DX12 titles (and lo and behold, the 480 is neck-and-neck with the 1060 for the most part). But it was too late by then. Surely with a huge twin videocard release you would update your test suite before release, not a couple of weeks afterwards.


----------



## etayorius (Mar 17, 2017)

0x4452 said:


> * Mantle is specific to AMD GPUs: the software API matches the HW interface exposed by the AMD GPU. Wanting NVIDIA to support that is nonsense. Similar to Glide back in the day from 3dfx.
> * PhysX was not started by NVIDIA, and it also had a CPU fallback code path.
> * FreeSync in my book is a rip-off of G-Sync. NVIDIA invested in the R&D and proved the concept; obviously they want the feature to be a premium. And it is.
> 
> In general, AMD fanboys keep bashing NVIDIA for not seeing gains from DX12. As if that matters; DX12 doesn't make your games prettier, it's a power move by M$/AMD. NVIDIA invests in GameWorks, PhysX, G-Sync to make gaming and graphics better, and gets accused of being evil. WTF?



* AMD offered Mantle to nVidia; they said no.

* nVidia never offered PhysX to AMD.

* Who cares.

I'm neither Red Team nor Green Team. I use whatever fits my needs at the time, which has more often been nVidia for me.



Shatun_Bear said:


> This is what I said, but no justification from W1zzard on why he only compared to a 7700K. That's completely not the CPU this should be compared with - it's on a hiding to nothing as well with all these tests that prefer frequency.
> 
> Sure, throw a 7700K in there, but at least alongside a 6900K as well for comparison.
> 
> ...




Tell that to the fanboys above: how dare AMD not support PhysX software, yet why should nVidia support AMD's API software crap?


----------



## xkm1948 (Mar 17, 2017)

etayorius said:


> No, just no.
> 
> The HD5870 ran cooler with a lower TDP than the GTX480. The only reason the 480 won was that its clock speed was pushed way too high to beat the 5870, which had been out for more than 6 months. It is idiotic to say the 480 had superior DX11 performance. You also missed the point about Mantle/PhysX. Yes, one is an API and the other is middleware, but you can't really blame nVidia for not supporting Mantle, yet it seems it is OK to blame AMD for not supporting PhysX. *What sort of mentality is this?*



Bias, ain't it clear? I thought you had already named it!


----------



## etayorius (Mar 17, 2017)

xkm1948 said:


> Bias, ain't it clear? I thought you have already named it!



I'm just here for the lulz. I already know what to expect from TPU. Intel, nVidia, AMD... whatever, I love them all equally. But sometimes nVidia's d*ckishness is what gets on my nerves. Intel has behaved quite well for the last decade, though.


----------



## kruk (Mar 17, 2017)

W1zzard said:


> Right on the money, this is my first CPU review. Which means selecting and figuring out benchmarks, then building test systems with the hardware that's available, then bench (not exactly few results), then think, fix bench suite, rebench everything (two times for this review), then come up with structure, layout, texts, conclusion.
> 
> There will be more CPU reviews from me though  Just bought i5 7400, i3 7100, Pentium G4560.



Could you please also test stock CPU cooler performance (temp) and noise? (if provided)
Also, would it be possible to measure power consumption of the CPU like you do with the GPUs?


----------



## newtekie1 (Mar 17, 2017)

etayorius said:


> The HD5870 ran cooler with a lower TDP than the GTX480. The only reason the 480 won was that its clock speed was pushed way too high to beat the 5870, which had been out for more than 6 months. It is idiotic to say the 480 had superior DX11 performance.



The GTX480 was on a different performance level from the 5870 in DX11. In DX11 the GTX480 had a good 30% lead over the 5870. So, no, it is not idiotic to say the GTX480 had superior DX11 performance; it is simply a fact. It had massively superior DX11 performance. In the first GTX480 review, in Metro 2033, one of the few DX11 games tested, the HD5870 scored 0.6 FPS in the highest-resolution test while the GTX480 scored 18.1 FPS! At the next-lowest resolution the 5870 scored 18.9 FPS while the GTX480 was getting a solid 31.3 FPS.

You listed a bunch of other reasons the GTX480 ended up getting a bad score, but none of them are reasons the better DX11 performance shouldn't be a positive bullet point for the GTX480. Remember, the HD5870 scored a 9.5; the GTX480 only got an 8.2.



etayorius said:


> You also missed the point of the MANTLE/PhysX. Yes one of them is an API and the other is a middleware, but you can't really blame nVidia for not supporting MANTLE, but it seems it is ok to blame AMD for not supporting PhysX. What sort of mentality is this?



Actually, no I didn't; I addressed it. AMD had no alternative to PhysX. However, nVidia had an alternative to Mantle: it's called DX11, and DX10, and DX9, and their cards performed just fine using those APIs.



etayorius said:


> AMD offered MANTLE to nVidia, they said no.



NVidia didn't need it. Their cards were already beating AMD cards with the APIs already on the market. Mantle was a performance-improving API that really only improved performance for AMD cards. Supporting Mantle would only have pushed it mainstream, and that would only have resulted in AMD closing the performance gap. Why would nVidia bother?



etayorius said:


> nVidia never offered PhysX to AMD



https://www.extremetech.com/computing/82264-why-wont-ati-support-cuda-and-physx

The real fact is that nVidia was happy to let AMD/ATI use and support PhysX and CUDA; AMD/ATI just didn't want to write their drivers to do it.


----------



## 0x4452 (Mar 17, 2017)

etayorius said:


> * nVidia never offered PhysX to AMD



This is the last time I respond to you, but... get your facts straight.

PhysX is on github:

https://developer.nvidia.com/physx-source-github

AMD can go and optimize it for their GPUs (or CPUs) if they want to...


----------



## Kyuuba (Mar 17, 2017)

Shatun_Bear said:


> So then why is this being compared against a 7700K and nothing else? Last I checked TPU wasn't a 'gaming' website but a 'hardware' website. AMD didn't compare it to a 7700K for a reason. It is a HEDT competitor - so at least put the 6900K, 6950K or a 6800K in there.
> 
> 
> But you said this:
> ...


One of the reasons was the troubled platform: memory issues and so on just getting it to work. Wiz said that reviewers have had a very hard time reviewing this CPU; the issues with it made some people leave their reviews unfinished.


----------



## kruk (Mar 17, 2017)

0x4452 said:


> It is the last time I respond to you, but... get your facts straight.
> 
> PhysX is on github:
> 
> ...



Or maybe they can't, because they won't be accepted into the developer program, or because of the EULA (did you read it?)... now what?


----------



## Melmac (Mar 17, 2017)

0x4452 said:


> * FreeSync in my book is a rip off of G-Sync. NVIDIA invested in the RnD, proved the concept, obviously they want the feature to be a premium. And it is.
> 
> In general, AMD fanboys keep on bashing NVIDIA for not seeing gains from DX12. As if that matters - DX12 doesn't make your games prettier, it's a power move by M$/AMD. NVIDIA invests in game works, physx, gsync to make gaming and graphics better, gets accused for being evil. WTF?



Sorry, long-time lurker, but I had to sign up to reply to this. Nvidia was first to bring sync to the market because they wanted their own solution that tied people to their cards. But AMD was first to come up with the idea. They went the open route and had to apply to VESA to get the DisplayPort spec changed to allow adaptive sync to work.

And what's this BS about DX12 being a power move by M$ and AMD? Do you not remember that presentation by Nvidia and Microsoft where they said they had been working together on DX12 for 4 years? Where they claimed they had much better DX12 support than AMD?

I'm not sure about Nvidia being evil, but they are just out for themselves. They invested in G-Sync knowing that AMD was pushing for adaptive sync, an open standard that anybody can use. 3D, PhysX, G-Sync have all been things that tie people to their brand.


----------



## etayorius (Mar 17, 2017)

newtekie1 said:


> The GTX480 was on a different performance level to the 5870 in DX11.  In DX11 the GTX480 had a good 30% lead over the 5870.  So, no, it is not idiotic to say the GTX480 had superior DX11 performance.  It is the fact to say it had superior DX11 performance.  It had massively superior DX11 performance.  In the first GTX480 review, in Metro 2033, one of the few DX11 games tested, the HD5870 scored 0.6FPS in the highest resolution test while the GTX480 scored 18.1FPS!  In the next lowest resolution the 5870 scored 18.9FPS while the GTX480 was getting a solid 31.3FPS.
> 
> You listed a bunch of other reasons that the GTX480 ended up getting a bad score, but none of them are reasons why the better DX11 performance shouldn't be a positive bullet point for the GTX480.  Remember, the HD5870 scored a 9.5, the GTX480 only got 8.2.
> 
> ...




I want some of that stuff you're smoking! Because even in TPU's GTX480 review it is only a whopping 11% faster overall than the HD5870 across all resolutions.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html


----------



## VSG (Mar 17, 2017)

I'll mention a few things here since w1z probably won't. I have had the benefit of seeing this review from scratch to the publication stage, and it took him weeks. He received the press kit late, since he was not present at the AMD press day (AMD could not sponsor travel for much of the press based outside the USA), and it arrived moments before he had to leave to cover the Nvidia press event for the GTX 1080 Ti (Nvidia did sponsor travel for everyone), where he got the GPU for review. So of course he was able to cover the GPU first and the CPU later, and he took a lot of time to understand the platform as well as possible before setting out to figure out the bugs and how to deal with them.

The AMD press kits contained a mix of motherboards, and most reviewers received their kits before the final microcode had even gone out to the motherboard companies. In this instance, the GA-AX370-Gaming 5 arrived with BIOS F3, released Feb 13, which was extremely unstable. Gigabyte then released a newer BIOS over 2 weeks later; beta BIOSes did get out in the meantime, which may have helped, but in my opinion those can't be used for reviews anyway.

As for why there are no Intel HEDT CPUs: I asked him the same thing, and it was pretty simple; he had none to compare with. He bought some mainstream enthusiast-level CPUs himself for comparison, since he was never a CPU reviewer before and had no samples lying around. I thought that was fair myself.

Granted I do not agree with a few things myself such as the title and language on some news/social media posts, but I continue to post reviews and more here because I still think TPU is among the very best when it comes to detailed reviews and this is no exception. If not, I would have just posted everything on my own website.


----------



## thebluebumblebee (Mar 17, 2017)

etayorius said:


> Thumbs Down to AMD GPU's for not supporting PhysX back in the day


Okay, so how do you communicate this fact to the uninformed user? Seriously, how do you let people know that this vendor's card may not support a specific aspect of a game they want to play?


----------



## cadaveca (Mar 17, 2017)

kruk said:


> Could you please also test stock CPU cooler performance (temp) and noise? (if provided)
> Also, would it be possible to measure power consumption of the CPU like you do with the GPUs?



You didn't look closely at the review; those numbers ARE already there, so you're asking for something that was already done.

But since you missed it, here is a handy link for you:

https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/15.html


----------



## kruk (Mar 17, 2017)

cadaveca said:


> You didn't look closely at the review, since those numbers ARE there already, so you're asking for something that was already done.
> 
> But since you missed it, here is a handy link for you :
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/15.html



Am I missing something?

Quote from the article:


> We also measured *wall-socket power draw of the whole system*, which is displayed in the bar graphs below the curve chart



I would like to see CPU only power draw.


----------



## cadaveca (Mar 17, 2017)

kruk said:


> Am I missing something?
> 
> Quote from the article:
> 
> ...


Why? It'll change with each CPU... and by that I mean: buy two 1800Xs and the power draw will be different.

My board reviews will show those figures for a 1700X, because it's not only CPU power consumption but also board VRM efficiency.


----------



## newtekie1 (Mar 17, 2017)

etayorius said:


> I want some of that stuff you're smoking! because even on TPU GTX480 review it is only a whooping 11% faster overall than HD5870 comparing all resolutions.
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html



See, this is how you spot the person losing the argument. The original point was about listing "Significantly better *DX11* performance" and the claim that this wasn't correct. So I responded with "The GTX480 was on a different performance level from the 5870 in *DX11*." I based that statement on the two *DX11* benchmarks included in the very review you linked, seen here and here, which, ignoring the bogus Metro 2560 results, show on average 55% and 50% better GTX480 DX11 performance respectively. You then decided to ignore the fact that we were talking about DX11 performance and tried to argue about overall performance instead.

This is my sign it is time to exit.  Have a nice day.


----------



## Grings (Mar 17, 2017)

Good review; glad you hung on until the SMT bug was known about.

Hope you take another look when new revisions come out and *when/if* 3466+ RAM is more commonly supported.


----------



## Shatun_Bear (Mar 17, 2017)

VSG said:


> *Granted I do not agree with a few things myself such as the title and language on some news/social media posts*, but I continue to post reviews and more here because I still think TPU is among the very best when it comes to detailed reviews and this is no exception. If not, I would have just posted everything on my own website.



I keep seeing this. What language/title are people talking about?


----------



## ssdpro (Mar 17, 2017)

Shatun_Bear said:


> I keep seeing this. What language/title are people talking about?


There was an early title posted on launch day that referenced the reviews coming in a little negative. It was changed quickly.

I still see no reason for excessive drama. The product is a good value and a nice balance, even if it came in a little under the best hopes. The R7 isn't disrupting, but the R5 might. Setting aside opinions: at launch AMD stock was at 14.90. Analysts were bullish that good reviews would pump it to 16-17.5. Reviews got hung up on below-expected gaming performance and the stock dropped to 13 (about 13%). When something is at 15 hoping for 17 and drops to 13 on reality, that is punishing reality. The stock still hasn't recovered to pre-launch levels, which suggests sales are weak.


----------



## xkm1948 (Mar 17, 2017)

Shatun_Bear said:


> I keep seeing this. What language/title are people talking about?



TPU posted something on Facebook during the Ryzen launch that only an amateur-level fanboy would post, and they got burned, hard, on Facebook. It was so humiliating for TPU that they went back and changed the title and content of the Facebook post.


----------



## 3lfk1ng (Mar 17, 2017)

Hello W1zzard,

Knowing that AMD's Infinity Fabric runs off the speed of the memory controller, I would love to see TPU perform a small follow-up to this review using faster memory.

At present, the memory used in this review severely cripples the performance one could expect from a Ryzen CPU.
According to your test setup:
@ 2133 MHz 15-16-16-35 = a fabric link of just 1066 MHz.
@ 2666 MHz 16-16-16-36 = a fabric link of just 1333 MHz.

However, if you had tested with the same RAM used on the competing Intel rigs (7700K/6700K @ 3000 MHz), or even AMD's recommended 3200 MHz, you would effectively see 1500 MHz and 1600 MHz fabric speeds.
This may help present a more accurate review of the processor, because at present it looks like Ryzen was intentionally kneecapped.
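The arithmetic above can be sketched in a couple of lines (a minimal illustration assuming, as the quoted figures imply, that the fabric link simply runs at half the DDR transfer rate; `fabric_clock_mhz` is a hypothetical helper, not an AMD API):

```python
def fabric_clock_mhz(ddr_rate_mts: float) -> float:
    """Infinity Fabric clock under the post's assumption: the fabric
    runs at the memory's actual clock, i.e. half the DDR transfer
    rate expressed in MT/s."""
    return ddr_rate_mts / 2

# The figures quoted above:
for rate in (2133, 2666, 3000, 3200):
    print(f"DDR4-{rate} -> fabric link ~{fabric_clock_mhz(rate):.0f} MHz")
```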

Also, AMD's 1800X was designed to compete with Intel's 8-core offering, the _$1000_ i7-6900K.
Omitting it from this review denies us a fair 1:1 comparison and doesn't do any of the processors used here justice.
As most of us are aware, AMD's R7 processors (1700/1700X/1800X) aren't designed for gaming, and to that end, Intel's 7700K/6700K aren't designed for heavily threaded applications.
On average, this makes the 1800X look slightly inferior in games and makes the 7700K/6700K look slightly inferior in applications.
It would be very welcome to see 8-core vs 8-core to really drive home the price/performance ratio that AMD's $499 processor offers.

Last but not least, the Gigabyte Aorus GA-AX370-Gaming 5 used in the review has received a few BIOS updates since launch that improved memory performance.
Which BIOS version was used in this review?


----------



## Steevo (Mar 17, 2017)

"Hep" thing perhaps should be "hip"?

Now to read the review.


----------



## HammerON (Mar 17, 2017)

Nice review W1zzard  Well worth the wait in my opinion.
As far as this thread goes... can we please stay on topic? The topic is not whether TPU is biased one way or the other. That gets old really quick.
Take it or leave it.


----------



## GoldenX (Mar 17, 2017)

Man, I remember the original "EPIC FAIL" in the cons of the kamikaze GTX 590; that was funny.
Nvidia originally didn't have any problem with ATI cards running CUDA; some dude even made it work (on an HD 4850, I think), and AMD made that driver disappear with the promise of a hardware-accelerated Havok driver that never happened. Now Nvidia doesn't let you use an AMD card for rendering alongside an Nvidia card for PhysX, so yeah.
AMD promotes the open standard OpenCL; why should they implement an alternative heavily optimized for another architecture? Same thing with FreeSync.
We also have DirectCompute and OpenGL's GLSL compute shaders, and nobody uses them; Vulkan also has a compute shader.

On topic, this review really missed the 6900K on the list; a 16-thread CPU without an IGP should be compared to another 16-thread CPU.
What a terrible start for a new platform: it's a mess, with no stable BIOS, memory compatibility problems, bad Windows support, only expensive motherboard offerings... and despite all that, I love the performance and efficiency of Ryzen: socket 2011 performance at a third of the price, with lower power consumption, and overclocking for everybody (let's see how the quads clock).

Is there any word on the X300 ITX motherboards? Can't wait for the APUs.


----------



## fusionblu (Mar 17, 2017)

Nice to see a review for these at last, and one with a Gigabyte motherboard for a change (the ones I have seen so far used Asus boards, and I did not think those reviews were thorough enough).
Motherboard availability is the obvious issue so far.

Now I would like to see a review comparing the Gigabyte AORUS AX370-Gaming K7 and the AX370-Gaming 5 (used for this review) beyond the cosmetic and price differences (they otherwise seem the same).


----------



## HD64G (Mar 17, 2017)

I am glad that @W1zzard waited a bit for new BIOSes to allow easier OC and higher RAM speeds. Great review as usual, but I think the 1800X could have scored a 9.


----------



## nem.. (Mar 17, 2017)

Where are the mins?

From the Linus 1080 Ti review


----------



## 0x4452 (Mar 17, 2017)

nem.. said:


> where are mins?



The 99th percentile is a better metric IMO. A single irrelevant frame, say at the very start, can skew the min score.

W1z, any hopes of getting a FCAT setup?
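As a quick sketch of the difference (a hypothetical `percentile_fps` helper with made-up frame times, not TPU's actual methodology):

```python
def percentile_fps(frame_times_ms, pct=99):
    """FPS corresponding to the pct-th percentile frame time.
    Unlike min FPS, a single stray spike barely moves this number."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return 1000.0 / ordered[idx]

# 999 smooth frames at ~60 FPS plus one 100 ms loading hitch:
times = [16.7] * 999 + [100.0]
min_fps = 1000.0 / max(times)      # dominated entirely by the one spike
p99_fps = percentile_fps(times)    # still reflects the typical experience
print(f"min: {min_fps:.1f} FPS, 99th percentile: {p99_fps:.1f} FPS")
```

The one hitch drags min FPS down to 10 while the 99th-percentile figure stays near 60, which is the robustness the post is pointing at.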


----------



## sweet (Mar 17, 2017)

Thanks for the review. However, this late, I expected a much better one.

1. Too many game benchmarks for a HEDT CPU review. People who just play games normally don't buy these CPUs. Need more production tasks.

2. At this moment many suggested tweaks are available to improve gaming performance. These tweaks aren't needed for "workstation guys", as wrongly insisted in this review.

3. Unable to describe the pros and cons of the CCX and Fabric design.

4. Fails to point out that Ryzen scales better with RAM speed. Also, with the newest BIOS the Aorus board should hit 3000 MHz RAM easily. I doubt it was updated for this review.

5. Lack of comparison with Intel's 8-core 6900K and the similarly priced HEDT 6-core 6800K/6850K.

6. No IPC and SMT analysis. FYI, the IPC is lower than Broadwell's, but Ryzen's SMT is better than HT, and as a result the 1800X can match the 6900K in multithreaded workloads.

To sum up, this review is late and straight-up bad.

Off topic, but to all the people here who said PhysX is open: hell no. nVidia even prevents their cards from running PhysX if an AMD card is the main VGA, for example. They are just that d1ck of a company.


----------



## nem.. (Mar 17, 2017)

Ryzen 7 1700 @ 4.0 GHz running 3 AAA games on ULTRA at the same time!


----------



## notb (Mar 17, 2017)

Evo85 said:


> Not sure why lack of integrated graphics is a negative here. If you want that, you buy an APU chip.



Because an APU will be slower?
Because AMD is likely to use a relatively powerful yet mostly pointless IGP, which will raise the price and the power consumption?
Honestly, most people want an IGP to show their Excel, movies, or webpages. They don't need a powerful IGP; they need anything. And the minority looking for performance will buy a dGPU anyway...

I've already mentioned this issue in one of the Ryzen threads, but it seems worth repeating.
Many buyers will choose Ryzen over Intel for a rendering/simulation/productivity rig, and there's nothing wrong with that; it's a very competent CPU.
But if this is meant as a standalone PC (also used for day-to-day tasks), you'll have to buy a dGPU just to see the results.
Sadly, because we don't have cheap options like we used to (Intel killed them with their IGPs), you'll have to go for proper cards: a GTX 1050 or RX 460 at least.
The simple fact is: both those GPUs will offer similar or better performance in most scenarios (only where GPGPU is available, of course, but that has become fairly common lately).

No room for a simple IGP? Why is this almost an SoC? Why put a USB controller in a CPU?
I understand there most likely was some know-how trade with Samsung along the way, but this has clearly gone too far.


----------



## notb (Mar 17, 2017)

Ubersonic said:


> To be brutally honest there is zero reason for anyone with a brain to buy a 7700K now.  If a Ryzen chip isn't better than a 7700K for what you're doing, then what you're doing doesn't warrant buying the 7700K over the 7600K either.



Unless you don't want to spend a lot, you want good productivity performance (8 threads help), and you prefer (or are even limited to) Intel for whatever reason, like most people on the planet. Then the 7700K is still the best choice out there.


----------



## W1zzard (Mar 17, 2017)

birdie said:


> do you think AMD can actually overturn Intel with IPC performance in Zen 2.0 which was already announced?


AMD doesn't need to overturn Intel; if they can just take 5-10% market share, they'll make billions.



etayorius said:


> I hope i don't get banned for this.


You almost did, lucky I had a great dinner and re-read your post



thebluebumblebee said:


> Would you please add the i3-7350K to that list?


Not for this review, but for future ones


----------



## yogurt_21 (Mar 17, 2017)

etayorius said:


> I want some of that stuff you're smoking! because even on TPU GTX480 review it is only a whooping 11% faster overall than HD5870 comparing all resolutions.
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html


My goodness, it was like a different world back then... W1z ran 5 different resolutions? That must have sucked... 1024x768 was still relevant enough to be included in a flagship card review? How bad was the average Joe's monitor 7 years ago?

3 different 3DMarks, Heaven, AND a proprietary nVidia PhysX bench? How much time did he use to have?

There were only 4 Call of Dutys?

It seems like a damn time portal or something. Can that really be only 7 years ago? Why does it feel like 20?


----------



## L|NK|N (Mar 17, 2017)

This is the best and most informative Ryzen review on the web. Thank you for the intense amount of thought and effort that you put into this article W1zzard.


----------



## Dethroy (Mar 17, 2017)

The Ryzen 7 1800X is by far the least interesting product of its whole family.
The R7 1700 is the most interesting 8c/16t SKU imho, the R5 1600X will probably be the SKU of choice for most gamers who value perf/$, whereas the 1500X will probably deliver impressive dual-core perf/$ for gamers on a tight budget who don't want to gamble on game devs stepping up any time soon to utilize more than 2-4 cores.

Here is a spreadsheet I made that compares most of the Ryzen lineup (1800X all the way down to the 1500X) on a perf/$ basis.






Note: my math assumes 1:1 scaling for both frequency and cores and is therefore not necessarily indicative of real-world performance, but it should still give you a good guesstimate.


----------



## Fluffmeister (Mar 17, 2017)

The 1700 non-X does indeed appear to be the best of the bunch (at least when it comes to the 8C/16T variety). The upcoming 1600(X) chips might turn out to be an even sweeter sweet spot.

Shame about the flaky platform, but no doubt the kinks will be worked out eventually; I thank all you beta testers out there in the meantime.


----------



## toxzl2 (Mar 17, 2017)

W1zzard, why did you use 2666 MHz memory at 1080p?

Look at this 

https://community.amd.com/community...4/tips-for-building-a-better-amd-ryzen-system


----------



## happy medium (Mar 17, 2017)

gtx 1080, i7 7700 @ 4.8 = 123.5 fps.




gtx 1080 with an i7 7700 @ 4.2 = 123.5 fps.





That's a gpu bottleneck.
This review has flaws and Ryzen damage control written all over it.
If it were a Titan P or 1080ti it would have made Ryzen look pathetic.

I see you used DDR4-2666 (the fastest memory for the chip) with the Ryzen.
Why didn't you use DDR4-3400 memory with the 7700?

I see you overclocked the Ryzen chip.
Maybe I missed it, did you overclock the 7700?
Well, I guess it would not have made a difference; some of the games were tested with GPU bottlenecks even at 1080p.
Just look at the GTX 1080 Ti scores: way higher.
Look at the GTX 1080 scores with the 7700K @ 4.2 and 4.8. They are the same!

1440p is worse! You should have used CrossFire/SLI and watched the Ryzen CPU get crushed with its low memory bandwidth.
Why do these reviewers go out of their way to skew the results?

Hey look, my i7 920 @ 4.2 games the same as the 7700K @ 4.2 with a GTX 1060 3GB @ 4K.
Really! Come on!

Grab 2 GTX 1080 Ti's, a few games that SLI well, use medium details, and a 1440p monitor.
Overclock an i7 7700K with DDR4-3000+ to 4.8, and a Ryzen CPU @ 4.0.
Watch the Ryzen cpu get clobbered.

You need to eliminate the bottleneck!



I can't believe you guys don't see through this BS.


----------



## King Banakon (Mar 17, 2017)

nice unbiased review W1zzard... keep it up


----------



## birdie (Mar 17, 2017)

W1zzard said:


> AMD doesn't need to overturn Intel; if they can just take 5-10% market share they'll make billions



I'm sure they'll make billions even with Zen 1.0; it's just that Zen 1.0 is not that attractive to existing Intel CPU owners (aside from those who use their PCs for very specific tasks like rendering/compiling/encoding using x264).


----------



## GreiverBlade (Mar 17, 2017)

good show jolly good show AMD....

now gaming tests prove that it's a worthy "next" on my buy list, but in the form of the R7 1700 instead of trying to stay "blue" and going 7700K, as i will upgrade to 1440p before i look at a CPU/mobo upgrade (a sidegrade from my 6600K tho, but i don't care ... i like to jump ship ) or i'm also waiting on the R5 1600 review ... a 6C/12T for the price of a 6600/7600K? ... i can't say no ...

as i not only do gaming but sometimes encoding and other tasks where Ryzen can shine ... there is no better option for the price asked.


----------



## Divide Overflow (Mar 18, 2017)

I hope it doesn't take long for the platform to mature and support much higher speed DDR4 memory.  Ryzen performance appears to scale extremely well with faster RAM.


----------



## OneMoar (Mar 18, 2017)

well that went pretty much as expected
so we need to buy a 120-dollar kit of RAM to compete with a CPU that's 160 dollars cheaper
deal of the century

once again AMD hedged their bets on having more cores and not a single developer cares


----------



## Iceni (Mar 18, 2017)

Really interesting review. 

I'm dumbfounded by the poor overclocking performance, and by the fact the chip actually loses performance in some tests. I'm pretty sure the platform needs more development time because of that. By the time we see R5, hopefully they'll have the BIOS and drivers functioning a little better.

Also, the results against the 2500K: thank you for those. Performance-wise, once R5 hits, I'm pretty sure it will make for a very hard decision for older Intel users as to which upgrade path to take. Price is going to be a major factor, but the single-thread performance isn't as bad as other reviews led me to believe.

The gaming results, likewise, are not as bad as some would have you think. Intel is still fastest, and has 1080p covered, but for anyone running at a higher resolution the decision-making just got a whole lot tougher. The higher-res results show that AMD has made a great chip, and that on their platform you can game and still have the horsepower for a productive machine in other tasks.

I'm seriously looking forward to the Ryzen R5 1600X hitting the reviewers!


----------



## mastrdrver (Mar 18, 2017)

@W1zzard

Thanks for the review.

Do you think that the bad launch may have been because of the downsizing that has happened over the years? That AMD just didn't have enough people to launch this right? Just a thought I've been having.

Also, do you think there will soon (in less than a year) be board revisions to help solve some of the problems inherent in the motherboard designs?

Thanks


----------



## MrMilli (Mar 18, 2017)

You forgot to mention whether you're testing these games in DX11 or DX12 (at least the ones that support it).
It's especially the DX12 mode of games that is not running too well on Ryzen. Being a low-level API, these games will need optimization to run well on Ryzen.
I already posted about this on the TR forum. ComputerBase tested all games in DX11 and DX12. The difference is noticeable.

https://techreport.com/forums/viewtopic.php?f=2&t=119280

Ryzen easily loses 20-30% of performance in DX12 in many games.

7700K vs 1800X:


Battlefield 1 DX11: 116 vs 122
Battlefield 1 DX12: 127 vs 90
Deus Ex: MD DX11: 87 vs 80
Deus Ex: MD DX12: 83 vs 63
RotTR DX11: 152 vs 135
RotTR DX12: 168 vs 117
Total War: W DX11: 43 vs 40
Total War: W DX12: 42 vs 30


----------



## efikkan (Mar 18, 2017)

idx said:


> What is really interesting about this cpu is that if you disable one CCX and do a lil OC, it will perform better in most of these buggy games (everyone is considering it as a legit performance measure ... sadly).
> In plus, it will still beat the i7 7700k in some of the other tests.


Once again, AMD performance sucks and of course it's the software's fault. Unlike with GPUs, there is actually very little to optimize for specific CPUs in software, especially since Ryzen is more superscalar.

The issue with gaming is that the CPU workload is not optimized; it's actually riddled with cache misses and branch mispredictions, which makes the CPU rely mainly on the prefetcher. Even though Ryzen has more theoretical throughput, it has a vastly inferior prefetcher, and is unable to compete even with Sandy Bridge in this regard. Overclocking wouldn't solve issues caused by cache misses, since the penalty is a constant in _ns_, not in clock cycles. The only resolution is to try to reduce their number.

If you're a gamer or a power user who games, Ryzen is simply not good enough. Even though there are a few programs where Ryzen really shines, that doesn't make up for all the cases where it doesn't. When a product doesn't perform universally well, users need to check carefully whether the product's benefits even apply to their workload. If the product really shines in three benchmarks that have nothing to do with your usage, that's irrelevant.
Let's say a hypothetical user does some gaming, some Photoshop, and of course web browsing; then a quad-core from Intel will actually outperform an octa-core Ryzen. As always, real performance matters, not theoretical figures. Very few Ryzen buyers will ever do enough Blender to experience the advantage at all. CPUs such as the i7-6800K will also give much more value for a number of use cases.



Joss said:


> What ?
> this is basically the same performance of the future 1600x (same chip, same clocks, only two cores disabled) for $249, not good enough


Look at all the graphs; for a number of them it's a 10% loss, some even 15-20%. If you're buying a GTX 1070 or better, you'll then be wasting money.
If they had compared a selection of Intel CPUs, ranging from the i5-6600K to the i7-6800K, i7-6900K, etc., you'd see that there would be no significant gaming difference between them, yet Ryzen is far behind. This is not good enough.



0x4452 said:


> It is the last time I respond to you, but... get your facts straight.
> 
> PhysX is on github:
> 
> ...


You know, facts don't matter to fanboys, only the narrative to make Nvidia look evil…


----------



## Ravenas (Mar 18, 2017)

Thanks for the review Wizzard.

I don't agree with you regarding integrated graphics being a con. A con because Intel offers it for nothing more than marketing? Anyone buying a 7700k to use it with the integrated graphics is completely throwing their money in the wrong direction.

Board revisions... Has anyone followed the state of mobo manufacturers regarding AMD? Pre-Ryzen, has anyone noted a revision of an AM3+ 990 chipset mobo in the last year and a half? How much market share does AMD have to claim before mobo manufacturers pay attention?

I'm just stating I don't agree with the above cons, that is my opinion.

I would rather have seen the 1800X compared to the 6900K. AMD touted the 1800X as a processor that directly competes with the 6900K, and yet this review is strictly aligned against the 7700K. I'm not really sure how this helps me as a reader.

Not what I expected from AMD regarding overclocking performance. Yes, I realize they are aiming for power efficiency and this is not a 1900X, but I honestly am in the small group of consumers who don't care about 65 W vs 125 W... To say the least, I am let down by the overclocking... I will attribute some of this to your test setup, both the motherboard selection and the cooling.

Overall, a great processor, and competition from a company whose market value hovered around $5 billion pre-launch (Intel: $165 billion).


----------



## Nkd (Mar 18, 2017)

happy medium said:


> gtx 1080, i7 7700  @ 4.8 = 123.5fps.
> 
> 
> 
> ...



WTF lol. If anything, I would ask why W1zzard didn't test those games with SMT off, which seem to benefit, at 4.0 GHz instead of 3.6; and the integrated graphics con was not really fair. No one buys this platform for integrated graphics; that is coming with the Zen APUs. This review was mostly balanced.

You sound like you are trolling and I shouldn't even respond to you. Who gets a frickin' Titan X to game at 1080p? Shit, a GTX 1080 is damn good enough for anything 1440p.


----------



## buggalugs (Mar 18, 2017)

I think AMD did an incredible job all things considered. If I wanted to build a workstation I would go AMD, and even for mid-range gaming, which is most of the market.

 When they sort out the motherboard/bios issues, and memory compatibility, performance will only get better


----------



## sweet (Mar 18, 2017)

buggalugs said:


> I think AMD did an incredible job all things considered. If I wanted to build a workstation I would go AMD, and even for mid-range gaming, which is most of the market.
> 
> When they sort out the motherboard/bios issues, and memory compatibility, performance will only get better



I think the problem is we are not pushing Ryzen in the right way at this moment.

Just have a quick look at the ~35% gains after tweaking; very impressive indeed.


----------



## Lionheart (Mar 18, 2017)

8.6? At least a 9. And I don't understand how the motherboards are horrible?


----------



## Basard (Mar 18, 2017)

Great review, with an equally great variety of different benches....  I see in the conclusion that the new platform was a pleasure to work with...  Sad that everybody dropped the ball with that crap; a lot of people need to get fired, that's BS.  I mean, call security and have them kick them all in the asshole on the way out.  So many things could have been done better.  And there's going to be a lot of overtime worked to fix this shit storm.


----------



## MrGenius (Mar 18, 2017)

So nobody saw the 7770K typo yet...besides me?





EDIT: And I see it a bunch more times(21 more by my count) throughout the review. Now that I'm looking for it.

Googling i7-7770K to make sure it's not a thing. Nope. Didn't think so...

I mean it's not right? 

It's not listed as one of the test setups either.
https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/4.html


----------



## FordGT90Concept (Mar 18, 2017)

I tried searching but couldn't find an answer: does this benchmark reflect High Performance profile enabled?  Could explain why Ryzen's power consumption is so low during gaming (or could be the fact it's only using 25-50% of the CPU so the rest just shuts down).


----------



## GoldenX (Mar 18, 2017)

efikkan said:


> ...CPUs such as i7-6800K will also give much more value for a number of use cases.



How can a more expensive processor + motherboard offer more value?

About the PhysX GitHub repo: read the license; you are not allowed to use it, only compile it. That GitHub repo is only for the SDK; it's not for implementing the API in any way.


----------



## jigar2speed (Mar 18, 2017)

dat_boi said:


> What a shame, maybe another time AMD.


Were you reading the AMD FX review and only commenting now?


----------



## Naito (Mar 18, 2017)

Awesome review as always. It seems it may be best to wait for some of the platform bugs to be ironed out, before I look into it. AMD have definitely sparked my interest this round, though. Haven't touched an AMD CPU since the PII days.


----------



## qu4k3r (Mar 18, 2017)

Good job!

I'd like to see 1700 review paired with B350 motherboard.


----------



## saikamaldoss (Mar 18, 2017)

The review would have been more helpful at 1440p and 4K instead of 1080p.
Anyone who pays $499 for a processor will have a decent graphics card that can run 4K or 1440p.

Even if they don't have a 4K monitor, they can still run dynamic resolution.

I run games at 1800p with a tweaked 280X and am now waiting for a mITX X300 or X370...

The wait is killing me


----------



## G33k2Fr34k (Mar 18, 2017)

Looks like the 1800x is throttling in games:


----------



## Kanan (Mar 18, 2017)

G33k2Fr34k said:


> Looks like the 1800x is throttling in games:


No it's simply not using all 8 cores in gaming.

Review is generally okay, but has some mistakes here and there that add up to give me a mixed feeling about the review (not the CPU; I've read reviews of it before anyway). They should've simply shipped him an Asus board and he'd probably have given a higher score, because those boards simply work better and also support the higher RAM frequencies he wanted. Simply put, he basically got the worst "high-end" board there is atm, with ASRock and especially Asus both being better.

Why I have mixed feelings about the review:

On one hand you applaud a total energy efficiency win for AMD, and then you talk about it in the conclusion like it was a "draw" or "near-draw" vs Intel. 

On one hand you say the RAM isn't overclockable to 2933 MHz, and then you say you bought new RAM and it worked, just to say again that it didn't make it to 2933. Strange.

That games only support up to 4 cores is only partially true at the moment (many support more cores) and will change in the near future, but you denied that by stating it will not. That's just an opinion on your side, and I'm pretty sure a wrong one, but you're stating it like it's a fact, and it's not.

CCX problems, latency stuff, etc. are not part of this review.

etc. As I said, the review is "okay", but not an inch more than that.


----------



## G33k2Fr34k (Mar 18, 2017)

Kanan said:


> No it's simply not using all 8 cores in gaming.
> 
> Review is generally okay, but has some mistakes here and there that add up to give me a mixed feeling about the review (not the CPU, I've read reviews about it before anyway). They should've simply shipped him a Asus board and he'd probably given a higher score, because those boards simply work better and also support the higher ram frequencies he wanted. Simply put, he basically got the worst "highend" board there is atm with Asrock and especially Asus both better.



I see. I guess you're right. It looks like most games don't utilize 8 cores properly.


----------



## Rahmat Sofyan (Mar 18, 2017)

Hmmm... 8.6 overall


----------



## Patriot (Mar 18, 2017)

G33k2Fr34k said:


> I see. I guess you're right. It looks like most games don't utilize 8 cores properly.


Best part of the 1700... get an Asus board, push a button in software, and you get 3.8 GHz at stock volts with the stock cooler, for $330... oh, and... mine likes 3200 RAM just fine... the difference between 2133 and 2666 is the same as 2666 to 3200: another 5 fps, give or take.


----------



## FordGT90Concept (Mar 18, 2017)

G33k2Fr34k said:


> I see. I guess you're right. It looks like most games don't utilize 8 cores properly.


Most games target quad cores without SMT.  Ryzen has 8 cores with SMT, so as that video shows, the 1700 is chillaxing while the 7600K is straining.  Sure, the 1700 has a slightly lower framerate, but you're talking 87 FPS instead of 90.  Who cares?  Start something like Handbrake that can use 16 threads and suddenly the 7600K looks pathetic.

I think the score of 8.6 is fair.  Ryzen has compatibility problems right now which deserves a point docked.  The remaining 0.4 can easily be attributed to the asterisks describing how some games take a significant performance hit on it and the relatively small L3 cache.  Overall though, it's the best bang for the buck on the market once you get it running.


----------



## ISI300 (Mar 18, 2017)

Platform issues aside, seems like an amazing CPU and a great architecture to use as baseline for Zen+. 
1700X is the one I'd go with if I eventually decide to ditch my 4670K.
Good job AMD, not very often we get to say that.


----------



## notb (Mar 18, 2017)

FordGT90Concept said:


> Sure 1700 has a little lower framerate but you're talking 87 FPS instead of 90.  Who cares?  Start something like Handbrake that can use 16 threads and suddenly the 7600K looks pathetic.



This whole Ryzen situation - especially the comments under reviews/leaks - really made me wonder how people are using their PCs. 
Seriously, how often do you encode movies?
Looking at the discussions, one could easily get the impression this is a rendering/encoding/physical-simulation forum. Suddenly everyone praises the performance in tasks that are, in general, performed by maybe 1% of PC users.

Also think about the *qualitative* nature of the benchmarks.
Obviously more cores give an advantage in multi-threaded tasks, but that's just improved completion time. You can render/encode/simulate on pretty much any machine. Encoding a movie on a Ryzen 1700 will take maybe 30% less time than on a similarly priced i7. Is that a big deal for a home user who does it occasionally?
Gaming is very different, as it always has a defined "usability" threshold for hardware. A difference of a few fps in the >80 range is totally unimportant, but at ~20 fps it determines whether you can play the game at all.

For example: I'm not really into gaming lately - the list of titles I'd like to play is building up. For that reason I'm still using a very old desktop - with an almost ancient E5400. However, I'm doing a lot of simulations and of course I get the same results as on any other CPU - it just takes a few more hours.

It's very different with productivity tasks that you do "live" - like working with large Excel files. Here a poor CPU can be really frustrating. And yet, I find most reviewers have totally ignored this kind of benchmark. For example, the TPU review shows the result of an unknown Excel test that takes under a second. Maybe opening a file? This is not how professionals use Excel...


----------



## RejZoR (Mar 18, 2017)

I'm gonna ask the obvious question: why is the 6900K, a direct competitor to the R7 1800X, not listed in the charts? Comparing an enthusiast-level cruncher like the 1800X to several mainstream Core i7s is a bit silly. Almost as if the reviewer assumed only gamers would buy Ryzen... What about people who require crunching horsepower but are not willing to pay 1100€ for a 6900K?


----------



## notb (Mar 18, 2017)

W1zzard said:


> I wanted to inform you personally that Ryzen isn't for you, due to lack of IGP. No man, you can't plug anything into the monitor outputs on the Ryzen motherboards! Ryzen APUs are coming out in H2. For the majority of people this is a non-issue, that's why it wasn't mentioned in the conclusion text.



I think it might not be an issue for the typical TPU forum user, but for the majority of people it's actually pretty huge.
The reviews on TPU aren't read just by enthusiasts - the site is hugely popular and well positioned in search engines. Many people will just google "Ryzen review" and find your text in the top 3 results. They might not know much about CPUs and - after years of using Intel - be unaware that not every working computer offers a video signal.
As such, I would put the lack of an IGP as the first con... probably in red bold font.

Also, could you share the specification of the productivity tests? I'm mostly interested in the MS Office stuff, because the times shown suggest some very basic tasks. Weren't you tempted to use something more serious? (Some sites use a Monte Carlo benchmark for Excel.)


----------



## W1zzard (Mar 18, 2017)

Iceni said:


> I'm dumbfounded by the poor overclock performance. And the fact the chip actually looses performance in some tests.


I mentioned it somewhere. It's due to XFR boosting single-threaded workloads to 4.1 GHz, which is higher than the 4.0 GHz manual OC; the manual OC applies to all cores, though. So single-threaded apps will be worse with the 4.0 GHz manual OC, while multi-threaded ones will be better. The big difference is that XFR works out of the box, without manual tuning, at stock voltage.



Ravenas said:


> I don't agree with you regarding integrated graphics being a con. A con because Intel offers it for nothing more than marketing? Anyone buying a 7700k to use it with the integrated graphics is completely throwing their money in the wrong direction.


It is a small con, but worth mentioning in my opinion, if only to educate less experienced users who might look at motherboards and their outputs and think "oh good, it has integrated graphics." For many office, workstation, render farm, server, and scientific tasks the IGP is good enough, and it comes free and integrated on non-HEDT Intel CPUs, which also helps with form factor. I know most of you will use a dedicated graphics card. Still, it wouldn't be the first time a graphics card fails and the IGP can be used while waiting for the RMA to complete, to at least be able to browse the web, do email, and watch YouTube.



MrGenius said:


> So nobody saw the 7770K typo yet...besides me?


Fixed. These charts are generated from a single entry in the database where I had a typo. Amazing indeed that nobody else noticed it (including our staff and proofreader)



FordGT90Concept said:


> does this benchmark reflect High Performance profile enabled?


Yes, enabled for all Ryzen testing



notb said:


> Also, could you share the specification for productivity tests? I'm mostly interested in the MS Office stuff, because the times shown suggest some very basic tasks. Weren't you tempted to use something more serious (some sites use a Monte Carlo benchmark for Excel).


I don't think anyone serious in the real world does Monte Carlo in Excel. You sound like you work in a workstation/science environment; tell me about your tasks (PM is fine, of course). I'd be happy to design and add more benchmarks.


----------



## Vlada011 (Mar 18, 2017)

Does AMD's Turbo work the same as Intel's Turbo?
I mean, all processors are 100% stable at a 4.0 GHz clock but can be overclocked only 200-300 MHz more, correct?


----------



## FordGT90Concept (Mar 18, 2017)

notb said:


> Seriously, how often do you encode movies?


My server? Whenever necessary.



notb said:


> Encoding a movie on Ryzen 1700 will take maybe 30% less time than on a similarly priced i7.


If that 30% is the difference between realtime and buffering, it is worth it.



notb said:


> Difference of few fps in the >80 range is totally unimportant, but at ~20 it implies whether you can play the game or not.


If you're only getting 20 FPS and your computer isn't a dinosaur, consider upgrading the graphics card over, well, everything else.  Witcher 3 was only using ~33% of the 1700, which translates to 5-6 of the 1700's logical cores.  The bottleneck wasn't the CPU; it was the GPU, or the application's inability to use more cores.  We already have a two-pronged solution to that problem: 1) Ryzen itself, which makes 8-core processors affordable to consumers, and 2) next-gen APIs like D3D12 and Vulkan, which gain frames largely because they divert as much work away from the CPU as possible, in addition to multithreading it.

TL;DR: no gamer should rule out a Ryzen 8-core simply because Intel's quad-core is a little bit faster.  Developers have been coding for quad-cores for half a decade now.  They're just as ready as I am to move to reasonable six- and eight-core machines.



notb said:


> It's very different with productivity tasks that you do "live" - like working with large Excel files. Here a poor CPU can be really frustrating. And yet, I find most reviewers have totally ignored this kind of benchmarks. For example the TPU review shows a result of an unknown Excel test that takes under a second. Maybe opening a file? This is not how professionals use Excel...


If you're seeing poor performance using Excel, the problem is that you're using Excel.  Spreadsheets < databases.


----------



## Tsukiyomi91 (Mar 18, 2017)

Finally a good review. AMD did a good job at the power consumption side, which is good. Kinda sad to see it can't push beyond 4.3GHz...


----------



## W1zzard (Mar 18, 2017)

notb said:


> It's very different with productivity tasks that you do "live" - like working with large Excel files. Here a poor CPU can be really frustrating. And yet, I find most reviewers have totally ignored this kind of benchmarks. For example the TPU review shows a result of an unknown Excel test that takes under a second. Maybe opening a file? This is not how professionals use Excel...


The value is the average time for various tasks. In a workbook with 240k cells, also copying to another one with 10k cells. What do you do in Excel? Send me an example, w1zzard@techpowerup.com


----------



## geon2k2 (Mar 18, 2017)

W1zzard said:


> The value is the average time for various tasks. In a workbook with 240k cells, also copying to another one with 10k cells. What do you do in Excel? Send me an example, w1zzard@techpowerup.com



Office stuff hasn't been an issue for 10 years now; for productivity, the best breakthrough was the SSD, and even very old CPUs can handle Excel just fine for most people. If you're talking about databases and Access... then that's a different story, but even then I'd look at a faster SSD first.

What I really want to see is a 1-CCX, 4-core, 8-thread gaming benchmark, pretty please.
I'm sure gaming performance will be better if all the cores used are within the same CCX.


----------



## efikkan (Mar 18, 2017)

GoldenX said:


> How can a more expensive processor + motherboard offer more value?


I think you should check closer, you might be mixing it up:
Ryzen 1800X: $499
i7-6800K: $434



G33k2Fr34k said:


> Looks like the 1800x is throttling in games:


It's called stalling, not throttling. Throttling would be the clock dropping.


----------



## GreiverBlade (Mar 18, 2017)

OneMoar said:


> well that went pretty much as expected
> so we need to buy a 120 dollar kit of ram to compete with a cpu thats 160 dollars cheaper
> deal of the century
> 
> once again amd hedged there bets on having more cores and not a single developer cares


just in case ... you know the counterpart of the 7700K is the R7 1700, which costs the same, not the 1800X ... and all gaming video reviews at 1080p put a 3.7-3.8 GHz 1700 on par, +/-10-15 fps, with a 5.0 GHz 7700K? Yes, I know ... it's still 8C/16T versus 4C/8T ... but hey ... it's Intel's fault if devs don't give a damn about "above 4 cores" (worth noting, frequency-wise: at 3.7 GHz vs 5.0 GHz it's 1.3 GHz slower but still keeps up, even in "4-thread only" games)

in my case, if i sell my 6600K, my Z170X Gaming 7, and my 4x4 2800 C14, i bet i would have enough for a full R7 1700 system; my initial budget was for a 7700K alone to replace my 6600K


GreiverBlade said:


> as i not only do gaming but sometimes encoding and other tasks where Ryzen can shine ... there is no better option for the price asked.


well, since i already have the funds for the CPU, i just need funds for RAM and a mobo ... and i have a CPU, mobo, and RAM to sell to fund it. now ... since Ryzen mobos aren't more expensive than the Intel ones ... and still offer the same advantages
i could even get some money back if i wait for the R5 1600 ( awww shoot ... and getting only 6C/12T instead of 4C/4T at the same price)

also on core count ... it's now 8 real cores + SMT ... and not 4x2 CMT; single-thread performance is not as high as it could have been ... but still enough.

as the review says: "AMD is competitive again" ... and damn yes it is.




FordGT90Concept said:


> TL;DR: no gamer should rule out a Ryzen 8-core simply because Intel's quad-core is a little bit faster.  Developers have been coding for quad-cores for half a decade now.  They're just as ready as I am to move to reasonable six- and eight-core machines.


actually ... nope ... if a 7700K is a good idea, then an R7 1700 is also a good idea (tho sometimes you need to turn HT off on a 7700K ...) and yep, Intel is a "little" bit faster from what i saw ... nothing to fret over as long as you play at 1440p and up (or 1080p ... +/-10-15 fps is not that high ... ok, 20 ... in a select set of games that i never had/liked and would never touch)

all in all, gamers should consider Ryzen's offerings around the R5 1600/R7 1700 if they have the budget for a 6600/7600/6700/7700K ... even if devs have coded games for 4 cores for over a decade (not all ... MMOs and RTS ... would probably benefit from Ryzen), but if they move on (thanks to AMD, probably), well ... they can be ready.

my end word would be: waiting for the issues to get sorted, mobos and BIOSes to get better, and holding out until the R5 line gets released and benched is not a bad idea.

edit... i only now noticed that the R7 1800X consumes less than a 7700K ... yep ... good job AMD. now i wonder about the 1700's power consumption




MrMilli said:


> You forgot to mention if you're testing these games in DX11 or DX12 (at least the ones that support it).
> It's specially the DX12 mode of games that are not running too well on Ryzen. Being a low level API, these games will need optimization to run well on Ryzen.
> I already posed about this on TR forum. ComputerBase tested all games in DX11 and DX12. The difference is noticeable.
> 
> ...


actually, DX12 games are not "ready now"; they lose FPS even on Intel systems ... and not many games are DX12. as for ROTTR ... 152 fps in DX11 and 168 fps in DX12 ... my 6600K with a 1070 does the opposite ... (aka: it loses FPS rather than gaining 16 fps) at almost the same ratio as the 1800X

DX12 needs optimization before being a thing ... actually, DX12 feels like a beta to me ... (Vulkan on the other hand ...)


----------



## efikkan (Mar 18, 2017)

FordGT90Concept said:


> TL;DR: no gamer should rule out a Ryzen 8-core simply because Intel's quad-core is a little bit faster.  Developers have been coding for quad-cores for half a decade now.  They're just as ready as I am to move to reasonable six- and eight-core machines.


You know better than this.

The problem with Ryzen is that the 1800X only achieves great gains in certain, more unusual workloads, while the cheaper i7-6800K would be better at gaming, Photoshop, web browsing, among other things. Buying a product that's super good at 7 applications you never/rarely use is pointless; real-world performance is the only thing that matters. We'll have to see if Zen+ contains a better prefetcher to feed its huge computational throughput.

The "lack of" multithreading in games is not due to quad cores being mainstream for so long, but rather the inability for rendering to scale over multiple cores. Rendering is done in a pipeline, and the CPU is really just building a queue for the driver to feed the GPU. Even though the GPU workload is massively parallel, the creation of the queue is not. Splitting up the queue would be limited to separate rendering stages, like physics/particle simulation, etc. So there would be limited gains here. Granted, Direct3D 12 allows you to have multiple threads working on a queue, but then you'll need to synchronize them which will create more latency than gains, so it's simply pointless. But the creation of the queue is really not the problem here, it's all the rendering thread does besides creating the queue. Typically a game engine iterates a list of objects, and then invokes a render() on each one, which in turn invokes several functions to render each object separately. In these cases each object would result in 2 or more cache misses (each resulting in ~250 clocks wasted), which will result in >95% of CPU cycles stalling while waiting for data. Intel has a better prefetcher so it's able to mitigate these problems to some extent, but the only way to make the game scalable in terms of CPU performance is to make the code cache/branch optimized, which requires a complete redesign of an engine.


----------



## medi01 (Mar 18, 2017)

newtekie1 said:


> People saying W1zzard is biased need to review their history.



Well, this has been called out on reddit:






note that AMD has stated that memory speed defines the "Infinity Fabric" (CCX to CCX connection) speed.


----------



## notb (Mar 18, 2017)

FordGT90Concept said:


> My server? Whenever necessary.


This is not an answer to my question. How often, honestly? How many hours of material a month? How long does it take? Can you estimate it?



FordGT90Concept said:


> If that 30% is the difference between realtime and buffering, it is worth it.


Nope. Real life shows that, for example, if you always run some tasks during sleep or while you're at work, 4 hours is just as good as 6.



FordGT90Concept said:


> If you're only getting 20 FPS and your computer isn't a dinosaur, consider upgrading the graphics card over the, well, everything.



Hmmm... At the moment I don't have a dGPU (occasional gaming on IGP), so replacing the card would be fairly difficult. 
So as you can see for me Ryzen is really bad as a gaming CPU. I would get 0 fps in games. 

But jokes aside: the argument remains valid in my opinion: IMO people are too concerned about benchmarks covering rare tasks.
I don't think a typical user is much affected by having to spend an extra 30% of time encoding movies or zipping files. However, we all rely heavily on web performance, and this kind of test is, IMO, heavily underrated. I've seen reviews without them. This will be very important for the lower Ryzen chips - especially the APUs.
Good multi-thread optimization could make them fine for gaming (for a long time), but potentially poor performance in single-thread tasks will surface at some point in the future (and some people don't replace their CPUs for half a decade or more).



FordGT90Concept said:


> Developers have been coding for quad-cores for half a decade now.  They're just as ready as I am to move to reasonable six and eight core machines.


I'm seeing this theory all the time. Why do you assume game designers will optimize for 6-8 cores now? What new does Ryzen bring us when AMD has been selling 8-core consumer chips (actually cheaper than Ryzen 7) for years?
It's even worse today, because so many people have moved to gaming laptops. Will AMD give us an 8c mobile Ryzen or just a 4c APU?
And if it's an 8c Ryzen, what will the battery life be (as it will have to rely on a dGPU for everything)?
Laptops with Intel CPUs can switch between IGP and dGPU, so they can be used as everyday notebooks or workstations (if you can live with the design/build quality).


If you're seeing poor performance using Excel, the problem is that you're using Excel.  Spreadsheets < databases.


----------



## rtwjunkie (Mar 18, 2017)



G33k2Fr34k said:


> Looks like the 1800x is throttling in games:


It looks to me like gaming is not working it hard enough.  It needs coding for moar cores by developers.


----------



## MrMilli (Mar 18, 2017)

GreiverBlade said:


> actually DX12 games are not "ready now", they lose FPS even on Intel systems ... and not many games are DX12. as for ROTTR ... 152fps in DX11 and 168fps in DX12 ... my 6600K with a 1070 does the opposite ... (aka: it loses FPS instead of gaining 16fps) at almost the same ratio as the 1800X
> 
> DX12 needs to be optimized before being a thing ... actually DX12 feels like a beta to me ... (Vulkan on the other hand ... )



Yes, but Ryzen loses much more performance compared to Intel in DX12. The problem is that most reviewers solely use DX12 for testing, showing Ryzen in a worse light than needed.

Look at the FPS difference again: https://techreport.com/forums/viewtopic.php?f=2&t=119280


----------



## GreiverBlade (Mar 18, 2017)

MrMilli said:


> Yes, but Ryzen loses much more performance compared to Intel in DX12. The problem is that most reviewers solely use DX12 for testing, showing Ryzen in a worse light than needed.
> 
> Look at the FPS difference again: https://techreport.com/forums/viewtopic.php?f=2&t=119280


ahhh i see ... you were actually "ranting" (the quotes are added for obvious reasons) about "most" reviews being mostly in DX12, thus belittling Ryzen more than it deserved? tho "deserved" is not the right word ... Ryzen does mighty fine against Intel in my view (or i am completely out of caffeine and really need to replenish my reserve )

well, as i said, my 6600K also loses the same 35% perf ... a 1700 or 1600 doesn't seem that bad, as long as i get a little above 60fps at 1440p in the games i play the most (of which, iirc, only 2 or 3 are DX12, and a little buggy in that mode) when i upgrade

having the horsepower to do some heavier computational tasks would be the cherry on the cake in my case... if i was only gaming, then yes, i would stay on my 6600K


----------



## MrMilli (Mar 18, 2017)

GreiverBlade said:


> ahhh i see ... ... if i was only doing gaming, then yes i would stay on my 6600K



Not disputing your opinion but most reviewers only use DX12 for their reviews. I think TPU also only used DX12 when possible. I think at this point, it's useful to have both DX11 & DX12 data (just like ComputerBase did). DX12 performance is all over the place while DX11 gives a more stable look into game performance.

I would also like to point out that your 6600K isn't in this review. The 7700K is, and the CB numbers I posted show a clear regression in performance for Ryzen in DX12 compared to the 7700K.


----------



## nemesis.ie (Mar 18, 2017)

notb said:


> I think it might not be an issue for the typical TPU forum user, but for the majority of people it's actually pretty huge.
> The reviews on TPU aren't read just by enthusiasts - the site is hugely popular and well positioned in search engines. Many people will just google "Ryzen review" and find your text in the top 3 results. They might not know much about CPUs and - after years of using Intel - be unaware that not every working computer offers a video signal.
> As such, I would put the lack of an IGP as the first con... probably in red bold font.
> 
> Also, could you share the specification for productivity tests? I'm mostly interested in the MS Office stuff, because the times shown suggest some very basic tasks. Weren't you tempted to use something more serious (some sites use a Monte Carlo benchmark for Excel).



Actually, I think a 3rd list needs to be added in addition to pros and cons - e.g. notes, where something isn't really a pro or con but is important to the buying decision, such as an IGP, the price, and the difference in performance at stock versus optimal/faster RAM speeds.

Those things will allow the buyer to make an informed choice based on their needs and budget but they are not necessarily a pro or con.

Agreed on the Excel/Office tests; a mix of tests here would be good, to see "home user" use versus "professional" use (accountant, scientist, etc.).


----------



## GreiverBlade (Mar 18, 2017)

MrMilli said:


> I would also like to point out that your 6600K isn't in this review. The 7700K is, and the CB numbers I posted show a clear regression in performance for Ryzen in DX12 compared to the 7700K.


yeah i know ... tho a 7700K shows no improvement whatsoever over a 6600K since it's still a quad core (granted, with HT) 

and also, the direct contender to the 1800X is the 6900K ... price to price, the one against the 7700K is the 1700, and the 1700/1700X are like the 1800X but without XFR for one and lower clocked for the other ... so actually it makes the 1700 a better option versus a 7700K (as in my case, since i wanted more threads for tasks other than gaming, and i would be a fool to take 4C/8T over 8C/16T for the same price)




nemesis.ie said:


> Actually I think a 3rd list needs to be added in addition to pros and cons - e.g. notes, where something isn't really a pro or con but important in the buying decision, such as an IGP, the price and difference in performance of stock speed versus optimal/faster RAM and such.
> 
> Those things will allow the buyer to make an informed choice based on their needs and budget but they are not necessarily a pro or con.
> 
> Agreed on the Excel/Office tests; a mix of tests here would be good, to see "home user" use versus "professional" use (accountant, scientist, etc.).



the lack of an IGP is a plus for me (i don't use the one on my 6600K even for vBIOS flashing, since i have enough spare PCIe graphics cards for that) 

agreed on pros/"neutral"/cons; "neutral" would be features that fit in neither pros nor cons, depending on how people view them.


----------



## EarthDog (Mar 18, 2017)

Stale crackers in soup are better than no crackers at all! 

Another well done review!


----------



## nemesis.ie (Mar 18, 2017)

GreiverBlade said:


> 1700 and 1700/1700X are like the 1800X but without XFR for one and lower clocked for the other ... so actually it makes the 1700 a better option versus a 7700K.



All three have XFR; the X models' XFR just goes a bit higher - 50MHz of XFR boost for non-X and 100MHz for X, I think.


----------



## GreiverBlade (Mar 18, 2017)

nemesis.ie said:


> All three have XFR; the X models' XFR just goes a bit higher - 50MHz of XFR boost for non-X and 100MHz for X, I think.


i see, thanks for the heads up


----------



## GoldenX (Mar 18, 2017)

efikkan said:


> I think you should check closer, you might be mixing it up:
> Ryzen 1800X: $499
> i7-6800K: $434



How about the 1700 vs the 6800K or 6900K? Same performance, plus a cheaper motherboard and lower power consumption.
Ryzen can't beat the gaming oriented 7700K, but socket 2011 is officially dead.


----------



## efikkan (Mar 18, 2017)

GoldenX said:


> How about the 1700 vs the 6800K or 6900K? Same performance, plus a cheaper motherboard and lower power consumption.
> Ryzen can't beat the gaming oriented 7700K, but socket 2011 is officially dead.


The i5-7600K, i7-7700K, i7-6800K and i7-6900K should all have been part of such a review.

Socket *2011-3* is not dead at all; the i7-6800K is a fantastic all-round CPU, and even a better deal for most users than Ryzen. This socket will be replaced later this year with Skylake-X.


----------



## GreiverBlade (Mar 18, 2017)

efikkan said:


> i7-6800K is a fantastic all-round CPU


actually an R7 1700 would be a better option than that one ... for all-round use

but the one that goes up against the 6800K will cost a tad less than a 6800K, or than an 1800X/1700X/1700 too (since the 1600 will probably be priced like an i5-6600K/7600K)



GoldenX said:


> Ryzen can't beat the gaming oriented 7700K


it does not beat it, but it does not lag way behind it either... technically it's more than sufficient (even for the base R7 1700)


actually, the more i read that review, the more i think the R7 1700X/1800X are not really worth the premium over a 1700 in the performance department; the 1700 has been shown to be up to the task just fine
so, it means we have a 374chf chip (using my country's pricing) that can compete with its own price counterpart, the 7700K (and offer a little more in specific workloads), get into the 444.55chf territory of the 6800K (while the real counterpart of the 6800K, the future R5 1600/1600X, is cheaper) and the same versus the 1108chf 6900K

from what i've seen at least ... (meaning paper magazine reviews that list a few more than only 2 CPUs in competition, and web reviews)


----------



## horsemama1956 (Mar 18, 2017)

It's a real shame AMD could only launch with the higher end products. It looks like the lower end parts are going to be insane value for the money with solid performance. The 4/6 core + SMT CPUs will have a pretty huge multithreading advantage compared to their i3-i5 counterparts, and the single-threaded power to not gimp users in games and applications that aren't going to use more than a thread or 2.


----------



## L1amrob (Mar 18, 2017)

Still feel like the 1700X is the one to get. I was thinking of holding off for the R5 CPUs, but I just went ahead and went a bit crazy: spent a bit more and got some extra cores 
Now I hope the Fatal1ty X370 doesn't have the same issues as that Gigabyte one..


----------



## springs113 (Mar 18, 2017)

Big_Vulture said:


> It took TPU quite a long time...


Wizz's main problems were memory issues


----------



## yeeeeman (Mar 18, 2017)

Where is the 6900X? Or do you think the Ryzen 1800X's main competitor is the 7700K?


----------



## geon2k2 (Mar 18, 2017)

yeeeeman said:


> Where is the 6900X? Or do you think the Ryzen 1800X's main competitor is the 7700K?


Well, for sure it's not the 6900K either. That costs twice as much 
If you meant the 6950X, then that is three times as much


----------



## GreiverBlade (Mar 18, 2017)

GoldenX said:


> How about the 1700 vs the 6800K or 6900K? Same performance, plus a cheaper motherboard and lower power consumption.
> Ryzen can't beat the gaming oriented 7700K, but socket 2011 is officially dead.


how about waiting for the real counterpart of the 6800K, aka the i5-6600K/7600K-priced R5 1600/1600X, which is 6C/12T like the 6800K 

the 1700 has already been pitted against the 6900K in heavy computational tasks (the 1800X all the same) and it's quite near it



geon2k2 said:


> Well, for sure it's not the 6900K either. That costs twice as much
> If you meant the 6950X, then that is three times as much


well, the counterpart of the 1800X (and 1700/1700X) _*IS*_ the 6900K series; price has nothing to do with it,  especially if you can get nearly the same perf for 2 or 3 times less money


----------



## efikkan (Mar 18, 2017)

i7-6800K *is* the main competitor, until Skylake-X replaces it.


----------



## YautjaLord (Mar 18, 2017)

The question that pops up in my head is how exactly AMD's gonna fix/iron out the issues (for lack of a better word) this CPU has (SMT worth jack sh1t, quirky/dorky IMC behavior, the need to choose High Performance as the Windows power option, etc.)

Linus did a review of the 1080 Ti on both the 7700K & 1800X, & aside from one game (forgot which) it was a tie between the two CPUs. Not much difference in fps between the 2 platforms; mostly the 1080 Ti was faster or on par with the $1200 behemoth on both rigs. He tested the whole games suite in DX11 & 12 & Vulkan (DOOM), in 4K only. I'll see if AMD & the mobo vendors fix this sh1t with a CPU revision & fresh BIOSes respectively by June/July (coming back from Download Open Air 2017 by that time. ) Great review Wiz regardless, thx.


----------



## Ravenas (Mar 18, 2017)

W1zzard said:


> Wouldn't be the first time a graphics card fails and IGP can be used while waiting for the RMA to complete, to at least be able to browse the web, do email and watch YouTube.



I 100% agree with this comment. Failure of the GPU... integrated graphics to the rescue. I am a firm believer in things of this nature. Personally, I have extra graphics cards lying around for just that. I'm just not sure someone with a 7700K wouldn't have this as well... but we both have a point here. I think the whole reason Intel has included integrated graphics is not that they wanted to compete with AMD's APU focus, but that Apple, among others, has demanded it.

I like integrated graphics on the mainboard like the good old days.


----------



## 3lfk1ng (Mar 18, 2017)

medi01 said:


> Well, this has been called out on reddit:
> 
> 
> 
> ...



As they should be!

I get that the early, immature launch BIOSes weren't capable of 3000MHz speeds, but the newer BIOS revisions are capable of at least 3000MHz on nearly all the motherboards. This specific Gigabyte board supports 3200MHz, and the recent BIOS update (2017/03/14) further improves DDR compatibility.

There should definitely be a follow-up to this review to showcase the performance gains one can expect from Ryzen at increased RAM speeds, since the CPU is so dependent on them (users are reporting up to 30% gains in framerate). To elaborate, the CCX interconnect always runs at half of the effective memory speed, so DDR4-3000 would give an Infinity Fabric clock of 1500MHz, versus the 1066/1333MHz that resulted from the speeds used in the review. 

Also, as I mentioned before, comparing the 1800X to a 7700K isn't ideal for a proper review either. We need an i7-6900K on the charts, as that's the processor the 1800X was designed to compete with.

When the R5s drop, TPU can compare those to the i7-7700K.

Just my two coppers.


----------



## nemesis.ie (Mar 18, 2017)

L1amrob said:


> Now I hope the Fatal1ty X370 doesn't have the same issues as that Gigabyte one..



I have one and an 1800X sitting on the desk, but I have to locate the power supply that got "tidied away" before the holidays, and am also waiting on an AM4 mounting kit for a Kraken.

I've got some spanky 3733MHz RAM in it too.  ASRock has been banging out new UEFI updates every few days since launch, so we'll see how the RAM goes. It also has the external refclk generator.

There was also a new AMD AM4 chipset driver pack released on the 13th, @W1zzard, did you use that one? If not, a comparison with it installed would be a really nice addition (assuming any difference is observed in some initial testing so as not to waste any time).


----------



## xkm1948 (Mar 18, 2017)

Kinda fun to see W1zzard rushing over to r/AMD to defend his review of Ryzen. I thought he was too good for reddit.


----------



## EarthDog (Mar 18, 2017)

Link....


----------



## W1zzard (Mar 18, 2017)

xkm1948 said:


> Kinda fun to see W1zzard rushing over to r/AMD to defend his review of Ryzen. I thought he was too good for reddit.


I'm quite active on reddit, have been there for many years. Always trying to be open, not defending.



EarthDog said:


> Link....


https://www.reddit.com/user/wizzardtpu


----------



## EarthDog (Mar 18, 2017)

Oh god... reddit's formatting is even worse on mobile... lol!

I'll check it out at home.


----------



## medi01 (Mar 18, 2017)

yeeeeman said:


> Where is 6900X? Or do you think Ryzen 1800X main competitor is 7700K?



Comparing it to ONLY the 7700K frankly didn't make sense, for a number of reasons, price being one of them.

I am also curious, whether "doesn't have an integrated GPU" was listed as "cons" in reviews of Intel's 8 core chips.




horsemama1956 said:


> It's a real shame AMD could only launch with the higher end products.


The R5s are launching in early April.
I don't see how it is a big deal, to be honest.

They are extremely likely to be the same chips as the 1800X, with faulty cores disabled
(so 2+2 and 3+3).
So they could not have started with the lower end chips, for obvious reasons.

Single-CCX chips are likely the ones targeted at notebooks.




efikkan said:


> Socket *2011-3*


Cool that AMD is back; hopefully mainboard swapping with every CPU upgrade will stop being a thing


----------



## Athlonite (Mar 18, 2017)

"Lacks integrated graphics"
how is that a thumbs down when the CPU was never advertised as an APU? if you want an integrated GPU, wait for the APUs to come out



kruk said:


> Could you please also test stock CPU cooler performance (temp) and noise? (if provided)
> Also, would it be possible to measure power consumption of the CPU like you do with the GPUs?



Neither the 1800X nor the 1700X comes with a stock HSF; only the 1700 comes with the new Wraith cooler. For the other two you'll need to supply your own HSF.

Thanks for a thorough review Wiz, I know these things are a lot of work to get out. Shame you didn't have a 6900K to go against Ryzen, but you can't always have everything.... The only thing holding me back from going out and buying a Ryzen setup is the lack of mobos from the likes of MSI and ASRock here in NZ, and some BIOS issues which I expect to be ironed out over the next few months


----------



## GreiverBlade (Mar 18, 2017)

efikkan said:


> i7-6800K *is* the main competitor, until Skylake-X replaces it.


okay ... 6C/12T vs 8C/16T totally makes sense .... the 6C/12T from AMD will be the R5 1600; the R7 1700/1700X/1800X are meant to target the 6900K while being priced "mainstream" for the 1700, at half the price of the top dog HEDT chip for the 1800X, with the 1700X being an "in-between".

Skylake-X will be Skylake HEDT indeed (wait, aren't we on Kaby Lake for "mainstream"?)


----------



## efikkan (Mar 18, 2017)

GreiverBlade said:


> okay ... 6C/12T vs 8C/16T totally makes sense .... the 6C/12T from AMD will be the R5 1600; the R7 1700/1700X/1800X are meant to target the 6900K while being priced "mainstream" for the 1700, at half the price of the top dog HEDT chip for the 1800X, with the 1700X being an "in-between".
> 
> Skylake-X will be Skylake HEDT indeed (wait, aren't we on Kaby Lake for "mainstream"?)


What matters is real world performance, not "specs" and theoretical figures. The same argument was constantly used for Bulldozer back in the day; it had 8 cores, so it supposedly couldn't be "compared" to quad-cores from Intel, even though those beat it.

With its gaming performance, and great performance in a number of applications, surely the i7-6800K is a better buy for most users than a 1800X.


----------



## dj-electric (Mar 18, 2017)

Can we talk about the huge potential Ryzen has for mobile devices, with that phenomenal power consumption?
4C/8T chips running past 3GHz would be awesome in gaming and workload-oriented laptops.
I also wanna see what mobile Ryzen can do with 2C/4T setups.

About the desktop Ryzens: if OC comes to mind, the R7 1700 is the only relevant choice i can think of. For $329 you have the potential to get 95% of what a $499 chip can provide, granted you can hit 3.9GHz stable all-core


----------



## GreiverBlade (Mar 18, 2017)

efikkan said:


> What matters is real world performance, not "specs" and theoretical figures. The same argument was constantly used for Bulldozer back in the day; it had 8 cores, so it supposedly couldn't be "compared" to quad-cores from Intel, even though those beat it.
> 
> With its gaming performance, and great performance in a number of applications, surely the i7-6800K is a better buy for most users than a 1800X.


and i was talking about real world performance .... in that domain a 1700 to 1800X competes with the 6900K (you know ... reviews and such things ... on paper or web ... there is quite a lot out there)

and for gaming, technically an R5 1600, when out, will be a better option than a 6800K ... if not, well, you can still add 2 cores/4 threads more and get an R7 1700, slightly cheaper than a 6800K but a tad more expensive than an R5 1600 ....

i take it you didn't read any of my previous posts and just took what you wanted to counter  no worries, it's fine.


----------



## happy medium (Mar 19, 2017)

Nkd said:


> You sound like you are trolling and I shouldn't even respond to you. Who gets a frickin' Titan X to game at 1080p? Shit, a GTX 1080 is damn good enough for anything at 1440p.



I'll prove my theory.

Take a look at these benchmarks for Fallout 4.
This is an i7 7700K @ 4.8 with a GTX 1080 Ti vs a GTX 1080.
You will see little to NO gpu bottleneck.





Even at 1440p there's just a slight bottleneck with the GTX 1080.





Now look what happens to the Ryzen CPU in this CPU intensive test, without(!) the GPU bottleneck.









Still think I'm trolling?
I respect the reviewer and never had a problem with this site.
Someone needs to run 2 1080 Tis in SLI, with games that support SLI well, @ 1440p, and compare Ryzen @ 4.0 to the i7 7700K @ 4.8.
I have a feeling this is NOT just a 1080p thing.
Next year when Big Volta is released and it's 40% faster than a 1080 Ti, I bet you will see the Ryzen CPUs falling behind @ 1440p.


----------



## OneCool (Mar 19, 2017)

Good review as always. Not sure how you knew I was looking at that Mobo for a Ryzen build.

Anyway...I will wait to pull the trigger on a new build.


----------



## sweet (Mar 19, 2017)

W1zzard said:


> I'm quite active on reddit, have been there for many years. Always trying to be open, not defending.
> 
> 
> https://www.reddit.com/user/wizzardtpu


It would be more appreciated if you could do a follow-up with a proper BIOS and faster RAM. You can get some tips from this video


----------



## Totally (Mar 19, 2017)

newtekie1 said:


> This is what annoys the crap out of me(Intel is guilty too).  Why change the mounting hole layout so we have to buy(or the heatsink manufacturers have to give away at a loss) new retention brackets?  Did Intel really need to make the holes on the 115X platform ever so slightly larger than the 775?  Did AMD really need to do the same between AM3+/FM2+ and AM4?  And it makes even less sense that AMD wouldn't take the opportunity to make their mounting holes square, and they should have actually just matched the already in use Intel spacing.  Make it easier for all of us!
> 
> 
> 
> ...



Now you mention it, AMD hasn't changed the cooler mounting mechanism since socket 939, yet Intel has changed theirs at a rate that can't be mentioned in the same breath, looking squarely at sockets 2011/2011-3


----------



## OneCool (Mar 19, 2017)

Intel's cash: 0.002% is gaming


I don't think some understand just how big Intel is, and how detached from a gamer's standpoint.


----------



## tungt88 (Mar 19, 2017)

Just finished the review -- and the wait was well worth it! Thanks to Wizzard and the crew. 

My plan to build a new rig this summer gets all the more interesting now, especially since I live near a Micro Center. Next up on the table -- GTX 1080 Ti, or Vega?


----------



## RejZoR (Mar 19, 2017)

Totally said:


> Now you mention it, AMD hasn't changed the cooler mounting mechanism since socket 939, yet Intel has changed theirs at a rate that can't be mentioned in the same breath, looking squarely at sockets 2011/2011-3



Well, it also depends on the motherboard makers, to be honest. On my Rampage II Gene, I had the option between LGA775 and LGA1366 mounting holes. So, despite the all-new socket, I was able to use my LGA775 cooler without ANY changes to it. And I used an LGA1366 AiO cooler on my LGA2011 for a while. The mounting holes for the backplate were spaced the same and fit through the LGA2011 holes; I just had to punch through some insulation layer that was blocking the holes. I also used springs instead of the plastic spacers that originally came with it, which were too wide and got stuck on the 2011 socket frame. But it worked at ZERO cost, just a tiny bit of ingenuity on my end. So, while Intel does change sockets often, it doesn't necessarily mean it affects users.

Also, I can't remember off the top of my head for sure, but hasn't Intel been using the same 75mm mounting hole spacing since, like, Sandy Bridge (2500K/2600K) for the mainstream models? The same coolers should fit the latest Kaby Lake afaik... I've been on the HEDT branch (1366 and now 2011) since 775, so I'm just saying what I roughly remember from that time...


----------



## Jism (Mar 19, 2017)

As for the mounting holes, it might be due to a change in pin layout. Ryzen has a lot more CPU pins and thus a different layout than the AM3/AM3+ CPUs of the past. I wouldn't worry about it; many cooling companies will send you a replacement bracket so you can continue to use your current cooler.

As for the review, very good! The CPU is still $499 and basically wipes the floor with Intel, which charges up to $1000. As for the motherboard, chipset and memory issues: if I'm not mistaken, much of the chipset now lives in the CPU, so it's not really just up to motherboard vendors but AMD itself, who should be testing and verifying various memory kits.

The switch from auto to manual is something not everyone will understand, but it's the first thing I do when I enter a fresh BIOS, to rule out auto values that can disrupt the system's stability. Auto is never a good option; on Vishera your CPU/NB gets a huge voltage kick up to 1.4V, just for the CPU/NB, which is terrible.

Anyway: soon putting my 8320 to rest! Ryzen will be my next chip.


----------



## refillable (Mar 19, 2017)

Not sure if it's just me, but this one is a pretty disappointing review. Let me point out why.

First, leaving out the Broadwell-E CPUs is one hell of a big blunder. Why? You clearly noted that this CPU is designed to tackle Intel's HEDT platform, and not showing a single one of its competitors makes me shake my head.

Second, CPUs are NOT GPUs. Separate, PLEASE, separate single-threaded and multi-threaded apps. DIFFERENTIATE them well, seriously. The way you mix them up all over the place is not wise at all.

Third, the logic of "1080p bottlenecks on faster cards, so if you have a GTX 1070 or faster, play at 1440p or buy Intel" is, err, sorry, stupid. Proposing that 1080p is the standard for a GTX 1070 or faster isn't wise, and to go ahead and make an implication and then a conclusion based on that faulty proposition (the "or buy Intel" conclusion) is super absurd.

I'm not bashing you as an "anti-AMD fanboy" or anything of that rhetoric. The thing is, you're shifting the focus away from an "HEDT 8-core CPU" and have your sights (and even worse, as a reviewer, the sights of your readers) set on a 4-core mainstream CPU. You need to correct that. Feel free to disagree with me (anyone). So W1zzard, if you're reading this, do an overhaul of your CPU reviews, or stick to GPU reviews if you can't figure out where these CPUs are marketed (which is a really simple thing).

The wait was not worth it. Anandtech, Techspot and Guru3D did much better than you, and they did it earlier. And before invoking a straw man fallacy against me: I have no concerns about Fallout 4 (and some others), where Ryzen indeed does fall behind. My objection is that this review is fundamentally flawed, completely ignoring that this CPU is an 8-core HEDT CPU. And that's bad.


----------



## notb (Mar 19, 2017)

W1zzard said:


> I don't think anyone serious in the real world does Monte Carlo in Excel. You sound like you work in a workstation/science environment, tell me about your tasks (pm is fine ofc). I'd be happy to design and add more benchmarks



You'd be very surprised by the things people do in Excel (e.g. www.modeloff.com). 

I know TechSpot uses a MC Excel benchmark in their reviews, but I haven't seen the file (maybe you could obtain it some way):
http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page3.html

Leaving extreme examples aside (like the MC above or some extensive VBA), a typical financial/data analyst is surrounded by Excel files well above 100MB (and I'm talking about the xlsx format - already zip compressed). I have an i5-6500 at work, and many of these files need around 30 seconds to recalculate - without any randomization or VBA (just a lot of data and formulas).

I guess I could make you something from scratch or send an existing file, but I'll have to clean them beforehand (they're full of data/methods I'm not supposed to share). I'll look into it after work.


----------



## notb (Mar 19, 2017)

nemesis.ie said:


> Actually I think a 3rd list needs to be added in addition to pros and cons - e.g. notes, where something isn't really a pro or con but important in the buying decision, such as an IGP, the price and difference in performance of stock speed versus optimal/faster RAM and such.
> 
> Those things will allow the buyer to make an informed choice based on their needs and budget but they are not necessarily a pro or con.



I think you're right. From what we've seen so far: Ryzen's performance is great, but it requires a bit of tweaking, choosing the right components and so on.
Reviews should point out that getting the most out of the platform (i.e. what we see in most benchmarks) needs some preparation and is not that easy. The exact steps should be listed and easy to spot. Instead we get very general remarks like "this is a new platform and it needs some time to mature".
Sure, many of the issues might be fixed in software updates, but until that happens this platform is not as easy to use as Intel's.
I feel like many novice PC users might be disappointed by their Ryzen system at first.



nemesis.ie said:


> Agreed on the Excel/Office, a mix of tests for these would be good to see "home user" use versus "professional (accountant, scientist etc.).



Exactly. We lack a well-defined, standardized set of tests for different real-world scenarios - something a user could download and run on his own machine (like Cinebench). At this point it's all about gaming and synthetic benchmarks.
I assume people tend to think that rendering a webpage is a silly task for our powerful 4K gaming rigs - that's why it is rarely included in tests. But that's simply not true, and you would expect reviewers to know better.

It's much the same with Excel. I totally understand that someone mostly into gaming might not know how Excel is used by finance / data analysts - I don't think I had ever used a sheet larger than 10 MB until I got a job in finance. But that means someone should prepare such a test and give it to reviewers - just like rendering software companies do. And by "someone" I obviously mean Microsoft. Maybe if the biggest PC review sites ask MS, they will respond with a tool.


----------



## notb (Mar 19, 2017)

Dj-ElectriC said:


> Can we talk about the huge potential Ryzen has for mobile devices with that phenomenal power consumption?



Notice that TPU's results are very different from what we've seen at other sites, so they should be taken with a grain of salt.
The general observation from multiple reviews is that Ryzen (especially in Boost/XFR) heavily exceeds its TDP - an obvious thing once you learn AMD's TDP methodology.

Anyway, the 4C/8T Ryzen 5 will be rated at 65 W - exactly like the 7700 (and rumored to have similar performance), so it's unlikely to cause a big commotion in the notebook market.

You also have to remember that Ryzen doesn't have an IGP, so it's already not a chip for the vast majority of notebooks sold. Unless AMD sells it with their very decent Radeon Pro, but I tend to think that's reserved for Apple at this point.

So we must wait for the APU. How will it perform? No idea.
And we know that AMD tends to put fairly powerful graphics chips in APUs, which again makes them not the best choice for a mobile device. Unless they make something specifically for notebooks.



Dj-ElectriC said:


> About the desktop ryzens, if OC comes to mind, the R7 1700 is the only relevant choice i can think of. for 329$ you have the potential to get 95% of what a 499$ can provide, granted you can hit 3.9Ghz stable all-core



Well... that's what OC is about. But not everyone does it, so the 1700X and 1800X are still important - especially with their ability to automatically reach their limits (so you don't have to do any overclocking yourself). We must remember that if these CPUs are meant to win over an enterprise audience as well, they have to be robust and dependable.
If AMD wants to sell Ryzen in business solutions, it will have to do so without OC, the Ryzen Master utility and all the gaming-oriented image they've based the marketing on.


----------



## Basard (Mar 19, 2017)

For home users, Pentium III (sarcasm) will do fine in Excel. Techspot is full of ads and clutter. Guru is nice, in that they put up more pictures with taskman/cpu-z/hwinfo captures. I'd have to take a small vacation to get through Anand's review. Here I click through a buncha charts, read some smart ass (more sarcasm) remarks at the end, and I come away with a new understanding of Ryzen--all in like five mins.  Then I spend an eternity in the forums criticizing it (even more sarcasm).


----------



## W1zzard (Mar 19, 2017)

notb said:


> It's much the same with Excel. I totally understand that someone mostly into gaming might not know how Excel is used by finance / data analysts - I don't think I've ever used a sheet larger than 10 MB until I got a job in finance. But that means someone should prepare such test and give it to the reviewers - just like rendering software companies do. And by "someone" I obviously mean Microsoft. Maybe if the biggest PC review sites ask MS, they would respond with a tool.


http://exceltrader.net/excel-benchmark/
That's what they use. Nothing close to real-life imo.

PLEASE yes, get me some of your sheets. I won't share them, but you should still randomize values and change formulas slightly.

I just made a separate thread for CPU review methodology ideas (non-Ryzen related posts only).


----------



## rippie (Mar 19, 2017)

Great review. The numbers from the game benchmarks already show that memory frequency massively impacts performance, because the Infinity Fabric clocks off the DDR clock.

Could you add tests with DDR4 @ 4200? I bet it will start to fly past the 7700K.
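The relationship this post leans on is simple: as reviews at the time described it, first-gen Ryzen's Infinity Fabric runs 1:1 with the memory clock, which is half the DDR transfer rate. A quick sketch (the helper name is ours, not an AMD tool):

```python
def fabric_clock_mhz(ddr_rating):
    """First-gen Ryzen's Infinity Fabric runs at the memory clock,
    i.e. half the DDR transfer rate (DDR = double data rate)."""
    return ddr_rating / 2

# Fabric clock at common DDR4 ratings, including the 4200 asked about above
for rating in (2133, 2666, 3200, 4200):
    print(f"DDR4-{rating}: fabric ~{fabric_clock_mhz(rating):.0f} MHz")
```

So moving from DDR4-2133 to DDR4-3200 lifts the fabric from roughly 1066 MHz to 1600 MHz, which is why memory speed shows up in game benchmarks beyond plain bandwidth.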


----------



## efikkan (Mar 19, 2017)

GreiverBlade said:


> and i was talking about real world performance .... in the same domain a 1700 to 1800X compete with the 6900K (you know ... reviews and such things ... on paper or web... there is quite a lot on it)
> 
> and for gaming technically a R5 1600, when out, will be a better option than a 6800K if not ... well you can still add 2 core, 4 threads more and get a slightly cheaper than a 6800K, but a tad more expensive than a R5 1600, R7 1700 ....
> 
> i take you didn't read any of my previous post and just took what you wanted to counter  no worries, it's fine.


If you look at the benchmarks, the only scenarios where the 1800X gives you an advantage are AES, prime numbers, H.264/H.265, 7-Zip, Blender and CineBench. How many of these are purely theoretical, and how many correspond to an actual user's workload? And if the cheaper i7-6800K were in the benchmark, it would have beaten the 1800X in half of these; there are several where AMD's eight cores barely beat four cores from Intel - so much for superior multitasking… Meanwhile the Intel CPU with half the cores still wins all the gaming benchmarks, Photoshop, Office (doesn't matter that much) and all the web browser benchmarks. So unless you spend all day doing AES, crunching prime numbers and rendering in Blender, the i7-6800K will give better overall performance. I bet most Ryzen buyers will never live to experience the benefits over an i7-6800K.


----------



## notb (Mar 19, 2017)

efikkan said:


> If you look at the benchmarks, the only scenarios where 1800X gives you and advantage is AES, prime numbers, H.264/H.265, 7-Zip, Blender, CineBench. And how many of these are purely theoretical, and how many correspond to actual workload for a user? And if the cheaper i7-6800K was in the benchmark, it would have beaten 1800X is half of these, there are several of these where AMD's eight cores barely beat fore cores from Intel; so much for superior multitasking… While the Intel CPU with half the cores still wins all the gaming benchmarks, Photoshop, Office(doesn't matter that much), all web browser benchmarks. So unless you spend all your day doing AES, crunching prime numbers and rendering in Blender, the i7-6800K will give better overall performance. I bet most Ryzen buyers will never live to experience the benefits over an i7-6800K.



While I wouldn't be so sure about the advantage over the 6800K, I am worried by the general trends in benchmarks that you've mentioned...
We're getting to a ridiculous situation where a *typical user* is persuaded by both reviewers and the community that he should get a Ryzen because of its many-core superiority in very specific tasks.

Aside from the science/work-related stuff (which is heavy simulations and data analysis), I find myself to be a very ordinary PC owner.
I spend most of my free computer time on the web, watching movies or gaming (very rarely lately). And having really thought this through, I don't think getting 8 cores (be it AMD or Intel) over an i7 (or even an i5) would change my life significantly.
And keep in mind the "pro stuff" I do is actually fantastically parallelized, so it would benefit greatly from the extra threads.
It's just that I don't do it on a daily basis, so it's not like a week is too short for running 20 or 30 hours of computation (moving to Ryzen would save me maybe 10 hours a week of PC time).

As for video encoding, it's mid-March and it seems I've used Handbrake on exactly 5 movies this year. I only use it to transcode movies from 1080p (how I buy them) to something more smartphone-friendly (max 720p).
It's really hard to understand why it's so important to people who are not in the video industry (or at least streamers). I actually asked one of the Ryzen worshipers here how often he does movie encoding, given how much time he spends praising Ryzen's performance. The answer I got was: "whenever necessary".


----------



## GreiverBlade (Mar 19, 2017)

efikkan said:


> If you look at the benchmarks, the only scenarios where 1800X gives you and advantage is AES, prime numbers, H.264/H.265, 7-Zip, Blender, CineBench. And how many of these are purely theoretical, and how many correspond to actual workload for a user? And if the cheaper i7-6800K was in the benchmark, it would have beaten 1800X is half of these, there are several of these where AMD's eight cores barely beat fore cores from Intel; so much for superior multitasking… While the Intel CPU with half the cores still wins all the gaming benchmarks, Photoshop, Office(doesn't matter that much), all web browser benchmarks. So unless you spend all your day doing AES, crunching prime numbers and rendering in Blender, the i7-6800K will give better overall performance. I bet most Ryzen buyers will never live to experience the benefits over an i7-6800K.


Other than that the differences are minimal ... and a 1700 would suffice ... so why would a 6800K be the better option? 

Waiting on the R5 1600 benchies before anything else  if it lands in the same ballpark ... well, we will have a 6800K opponent at an i5 6/7600K price. Even the 1700 is adequately priced, and the 2011-3 X99 platform has less appeal than AM4 X370 right now ... in terms of the future ... and Skylake-X will be very ... "Intel'ish" ... aka: overpriced


----------



## efikkan (Mar 19, 2017)

GreiverBlade said:


> other than that the differences are minimal ... and a 1700 would suffice ... sooo why taking a 6800K would be a better option?


Because it gives better all-round performance. Minimal differences? There are a number of games losing 15-20% performance on a GTX 1080, and it will be even more on a GTX 1080 Ti.
When you can choose between a product which performs better in typical real-world applications and one which performs better in rarely used applications, why would you be stupid enough to buy the latter?


----------



## MrMilli (Mar 19, 2017)

W1zzard, I don't know if you've read the original post I made, but it concerns this forum topic: https://techreport.com/forums/viewtopic.php?f=2&t=119280

Any chance you could test one of these games to see if Ryzen's performance really tanks so much in DX12?


----------



## GreiverBlade (Mar 19, 2017)

efikkan said:


> Because it gives better all-round performance. Minimal differences? There are a number of games losing 15-20% performance on a GTX 1080, which will be even more for a GTX 1080 Ti.
> When you can choose a product which performs better in typical real world applications, and one which performs better in rarely used applications


Though... why would I pay $200 more ... ($100 in the case of a 1700) ... for 15-20% ... oh well, as long as it's not "overall"  (well, losing 10-15 fps at 1080p is horrible indeed ... I almost cried when I saw the R7 1700 @ 3.7 being beaten by a 7700K @ 5.0 GHz ... though only in quad-core games ... in "more than 4 core" games it was on par or sometimes slightly ahead ... oh wait ... it wasn't literally beaten ... damn, I got sarcastic ... sorry)

yeah thanks


efikkan said:


> why would you be stupid enough to buy the last option?


Probably to give the market a chance ... or to reward the competitor for an excellent product for once (almost a Slot A Athlon feel ... nearly Socket A Athlon/Athlon XP  almost A64 ... we need that again ... )

at a 7700K price i choose R7 1700
at a 6800K price i choose R7 1700X
at a 6900X price i choose any of the R7 over that one ...

Why would I pay more to get ... well ... not really more for what I need? That would be stupid. (Well, quad-channel RAM is a little more expensive to set up than dual ... nonetheless  )
Also, if I go 6800K, Intel will force me to change socket/mobo/CPU for the next gen (well ... they need money, right? they're short on it, I've heard recently)... AMD, on the other hand, usually does things differently

Remember, Ryzen is new and needs some "patching" ... for now the best price/perf ratio is on AMD

You can keep cheering for Intel, no worries (after all, my sys specs are blue/green actually ) but nope ... my opinion has just as much value as yours does.
As I wrote: the X99 platform has less and less appeal with each passing day ... and Z170/270 holds a minimal edge for now

Yep, my vote goes to AMD ... otherwise, if people keep "blindly" following Intel even when a worthy alternative is here, Intel will keep doing what they've done recently ... aka: +5-15% improvements and overpricing

Also ... power draw ... gotta laugh ... if the 7700K is that high, I wonder where my 6600K @ 4.4 lands (and a 6800K or 6900X )


----------



## nemesis.ie (Mar 19, 2017)

Progress! I found the missing PSU and got the machine (ASRock Gaming Pro X370, R7 1800X) up and running with a small AM3 cooler.

Good news on the RAM compatibility front: using the shipped ASRock UEFI 1.2, I selected XMP and was shown the (only) XMP profile, for 3733 MHz. I selected it, rebooted, and the machine booted fine. On inspection it's running at 3200 MHz.

So a good start. Windows 10 installed in around 5 minutes to the Samsung 960 in the "Ultra" M.2 slot. Sadly they only supplied 1 M.2 screw with the board and the accessories were a little light, which is the only minus so far.

Seeing 64 °C in UEFI with this tiny cooler, which seems about right based on the "finger on the heatsink" test - it's just starting to burn, like touching a household radiator. 

I've just flashed UEFI 1.55. On to base setup with that and the driver install.

I hope the Kraken mount arrives soon.

Let me know if you'd like me to update this thread or if there is interest I can create a new one?


----------



## efikkan (Mar 19, 2017)

GreiverBlade said:


> though... why would i pay 200$ more ... (100$ in case of a 1700) ... 15-20% ... oh well as long as it's not a "overall"  (well loosing 10-15 fps in 1080p is horrible indeed ... i almost cried when i saw the R7 1700 @3.7 being beaten by a 7700K @5.0ghz ... tho on quadcore games ... on "more than 4 core" games it was on par or slightly above sometime ... oh wait ... it wasn't beaten literally ... damn, i got sarcastic ... sorry)


$200 more? R7 1800X is more expensive than i7-6800K.

On Intel CPUs gains in games flatten out around ~4 GHz, so an i5-7600K or any of the models above will perform roughly the same in gaming, while Ryzen is clearly behind.

If you're paying for a decent graphics card then this is a total waste of money, since losing 15-20% in a number of games defeats the purpose of a decent graphics card in the first place. If you're playing on a RX 480 though, it makes less of a difference.



GreiverBlade said:


> at a 7700K price i choose R7 1700
> at a 6800K price i choose R7 1700X
> at a 6900X price i choose any of the R7 over that one ...


Why? In most normal power-user tasks an i7-6800K outperforms an R7 1800X at a lower price. This is clear bias.



GreiverBlade said:


> why would i pay more to have ... well ... not really more for what i need ? that would be stupid. (well quad channel RAM is a little more expensive to setup than dual ... nonetheless  )


No one is forcing you to use all of the memory channels. But more memory channels allow you to get more bandwidth at a lower price.



GreiverBlade said:


> also if i go 6800K, intel will force me to change socket mobo cpu for the next gen (well ... they need money, right? they're short on it, i've heard recently)... amd,on the other hand, is used to do differently


Now you're not even trying to stay serious. No one is buying a $400-500 CPU and then upgrading it two years later.



GreiverBlade said:


> remember Ryzen is new and need some "patching" ... for now the best price/perf ratio is on AMD


There is no need for "patching", and what is there really to "patch"? You fools said the same thing about Bulldozer back in the day: it has more cores and will perform better down the road. Of course that never happened.
Buy stuff based on real-world performance rather than hypothetical dream scenarios!

Ryzen is the best performance/price if you sit all day rendering in Blender or crunching prime numbers. But for most productive tasks Intel offers the same or more value.



GreiverBlade said:


> you can keep cheering on Intel no worries (after all my sys spec are blue green actually ) but nope ... my opinion has just as much value as your has.
> as i wrote: X99 platform has less and less appeal with the days passing ... and Z170/270 has a minimal edge for now
> 
> yep my vote goes to AMD ... otherwise if people keep "blindly" following Intel, even when a worthy enough alternative is here, Intel will keep doing what they did recently ... aka: +5-15% improvement and overpricing


As spoken by a true fanboy…


----------



## jaggerwild (Mar 19, 2017)

FINALLY....................................Nice work Wizard!!! TY!


----------



## mrg666 (Mar 19, 2017)

I always check W1zzard's reviews as my first reference for graphics cards. The Ryzen review is very informative as well. Just a small suggestion: it would be helpful if the single-thread and multi-thread benchmarks were clearly marked - I am able to figure it out, but labels would help. 

My understanding is that although AMD is much more competitive with Intel in this release compared to the Bulldozer disaster, AMD is still behind Intel in single-thread performance. The review clearly shows that, for a high-end entertainment/gaming desktop, the i7-7700K is the better (and cheaper) processor than the 1800X. Ryzen can only be better when all threads are needed and fully employed. I was excited by the Ryzen hype before release and I am happy I didn't make the mistake of pre-ordering one. I will go with the i7-7700K. Thanks, w1zzard!


----------



## GreiverBlade (Mar 19, 2017)

efikkan said:


> $200 more? R7 1800X is more expensive than i7-6800K.


OK, proven that you don't read my posts ... no need to write more on the subject; have a nice evening/afternoon/day.


----------



## TheHunter (Mar 19, 2017)

medi01 said:


> Comparing it to ONLY 7700k frankly didn't make sense for a number of reasons, price being one of them.
> 
> I am also curious, whether "doesn't have an integrated GPU" was listed as "cons" in reviews of Intel's 8 core chips.
> 
> ...


@guru3d everyone compared it to the 7700K, and only in gaming.. lol, there were the 5960X and 6900K too, but no one cared.
http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,16.html

But here everyone wants the 6900K xD

Hitman, Total War and Fallout 4 are all known to have bad CPU/API code; most are probably fine-tuned for Intel - remember those Intel logo flashes? The MT engine, or Codemasters' EGO, or the older Hitman... Fallout 3 had it too, if I'm not mistaken.


----------



## xkm1948 (Mar 19, 2017)

Somebody over ROG forum managed to bench at DDR4-3500. 

https://rog.asus.com/forum/showthread.php?91820


----------



## Nkd (Mar 19, 2017)

happy medium said:


> I'll prove my theory.
> 
> take a look at these benchmarks for Fallout 4.
> This is a i7 7700k @ 4.8 with a gtx1080ti vs a gtx1080.
> ...



You seem to have cherry-picked the game Ryzen clearly does worst in. I already saw that, so this is not news to me. There is clearly something in that game that needs to be fixed, because it was an outlier in the review. 

As far as Volta goes, we will see what happens. But picking Fallout 4, which clearly does badly on Ryzen, is not a great example to go by.


----------



## ManofGod (Mar 19, 2017)

ssdpro said:


> Good review consistent with other reviews.  Probably a bit better since not rushed and things have settled some.  Even setting aside whatever happened with the gaming performance, it is a great value.  I like to think of it as more of a balance.  The problem remains if you are a gamer, and already have a 6700/7700k, you won't enhance anything with these.  You might push specific niche tasks better and maintain reasonable gaming performance.  The problem is these niche areas are just not something mainstream or even most enthusiasts take part in.  Time will tell if this does any actual disruption.  The market says it isn't and the stock continues lower since launch.



Not sure where you are getting your information. However, I just looked at AMD's stock and from what I can see, it is stable.



suraswami said:


> As usual Great review W1z!!
> 
> For some of us here who still uses the 8350 (or similar), would have been bit more helpful to see what kind of performance jump the Ryzen gives.



The jump is huge! However, I recommend that if you are going to overclock, even eventually, you buy the 1700 non-X, since it comes with a good cooler out of the box. I have both the 1700X and the 1700 non-X and I love them both; I came from an FX 8320 and FX 8350.



Ubersonic said:


> To be brutally honest there is zero reason for anyone with a brain to buy a 7700K now.  If a Ryzen chip isn't better than a 7700K for what you're doing, then what you're doing doesn't warrant buying the 7700K over the 7600K either.



That may be the case. However, the previous releases do not really cost any less than the 7700K or 7600K, for the most part. Unless you can go to Microcenter and get a fantastic deal, of course. I am an AMD fan, and with this release I think these chips are worth buying even for non-AMD fans. However, the R5 1600 and 1600X will probably be the better deal.

Speaking of overclocking, even Intel chips usually do not overclock very well, at least in my experience.

Edit: And no, I am not going to hunt down everything I want to reply to and then do so later in a multi-quote. I see something I want to reply to and do so right then and there. After all, I do not wait until I gather up a bunch of stuff people said to me and reply all at once at the end of the day.


----------



## happy medium (Mar 20, 2017)

Nkd said:


> You seemed to have cherry picked the game that clearly ryzen does worst in. I already saw that. So this is not news to me. There clearly is something with that game that needs to fixed, because it was an outlier in the review.
> 
> As far as volta goes, we will see what happens. But picking fallout 4 that clearly does bad on ryzen is not a great example to go by.



Actually, I picked the only game where there was the LEAST GPU bottleneck at both 1080p AND 1440p.
Look for yourself.
Look at the GTX 1080 Ti review: just about every game's fps increased when going from a GTX 1080 to a GTX 1080 Ti.
That is a GPU bottleneck, my friend.

Take a look at Hitman in the GTX 1080 Ti review vs the GTX 1080. Only a slight GPU bottleneck with the 1080 Ti vs the 1080.

Once again: the lower the GPU bottleneck (see above), the more Ryzen gets its ass kicked. (see below)

I have a friend on another tech site who will run the tests for me.
He has two 1080 Tis, a Ryzen system and an Intel system.
The only reason the Ryzen CPU looks as good as it does is because every site is testing it with a slight or moderate GPU bottleneck.

I'm not saying Ryzen is trash; it's actually a good advancement for AMD. I'm saying reviewers are hiding the truth about its gaming performance, or are not putting effort into finding it.

Edit: Look at Total War's GPU bottleneck.
Look at the 7700K scores: they go way down due to the 1080 bottlenecking at 1440p.
But look at the Ryzen scores at 1080p, where there is less GPU bottleneck.


edit 2:

Here is another game where they use a GTX 1080 with less GPU bottlenecking.
Look at the 1080p vs 1440p scores for the 7700K: 99 @ 1080p vs 97 @ 1440p.
Then look at the Ryzen scores @ 1440p. Pathetic.
AGAIN: the less GPU-bound the game is, the worse Ryzen scores.
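The heuristic this post applies can be sketched as a tiny check: if swapping in a faster GPU barely lifts the fps, the CPU was the limiter. A rough sketch; the function names, threshold and fps figures are illustrative assumptions, not measured data:

```python
def fps_scaling(fps_slower_gpu, fps_faster_gpu):
    """Ratio of fps after swapping in a faster GPU. A ratio close to 1.0
    means the CPU (not the GPU) was the limiter in the original test."""
    return fps_faster_gpu / fps_slower_gpu

def limiter(fps_slower_gpu, fps_faster_gpu, threshold=1.05):
    """Crude classification: if the faster GPU barely helps, the run
    was CPU-bound; otherwise it was (at least partly) GPU-bound."""
    ratio = fps_scaling(fps_slower_gpu, fps_faster_gpu)
    return "CPU-bound" if ratio < threshold else "GPU-bound"

# Hypothetical numbers in the spirit of the post, not real benchmark data:
print(limiter(99, 101))   # ~2% gain from a faster GPU
print(limiter(80, 104))   # 30% gain: the GPU was holding things back
```

The argument above is then: CPU comparisons only mean something in the first case, where the GPU is taken out of the equation.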


----------



## Tran Hong Duc (Mar 20, 2017)

another hype train fail/blooper for AMD?


----------



## sweet (Mar 20, 2017)

Tran Hong Duc said:


> another hype train fail/blooper for AMD?


If you think a $329 1700 which can match a $1000 6900K is a fail, then yes...


----------



## happy medium (Mar 20, 2017)

Tran Hong Duc said:


> another hype train fail/blooper for AMD?




For gaming it's not so good unless you're GPU-bound.
When the 6-core Ryzen comes out and there are no more excuses - BIOS fixes, SMT, scheduler problems etc. etc. -
and people see it has slightly lower IPC than a Haswell chip while gaming, people will wake up and smell the silicon.

For now it's a good reason to upgrade for those with last-gen AMD crap CPUs.
If you have an Intel chip made in the past 5 years, and everyone should, there is no real reason to upgrade.

I think most of the excitement is just because, after 6 years, there is finally a chip that gives you a reason to even look at something other than an Intel chip.


----------



## sweet (Mar 20, 2017)

happy medium said:


> For gaming its not so good unless your gpu bound.
> When the 6 core Ryzen comes out and there is no more excuses, bios fixes, SMT , scheduler problems ec ect ect.
> and people see it has a little lower IPC  of a Haswell chip while gaming, people will wake up and smell the silicon.
> 
> ...



Ryzen has higher IPC than Haswell; it is a bit lower than Broadwell. And SMT is much better than HT in production tasks.

FYI, the 1800X holds the record Cinebench MT run for 8-core CPUs. The OC scene is very excited about it, not only because it is new, but because it is more powerful than Intel's 8-cores. People are targeting the $1500 6950X with a $500 CPU; it's really crazy.

Of course, for people who just want higher fps in *current* games, the 7700K is still the best choice. The masses on a budget can wait for the 4-core parts, which will be much cheaper and offer mostly the same gaming performance as 8-core Ryzen.

Note that I emphasized *current* games.


----------



## GoldenX (Mar 20, 2017)

Broadwell-E IPC in all tasks except games, where the IPC seems to sit between Sandy Bridge and Nehalem, at half or a third of the price, with lower power consumption, no overclocking restrictions and a cheaper platform overall (one that is going to get even cheaper once the low/mid-end CPUs and motherboards appear).

And it's a bad product? Either Intel users are really hurt, or suddenly the only important task for a PC is running games, like a console.


----------



## lexluthermiester (Mar 20, 2017)

The Quim Reaper said:


> Superb chips for productivity tasks, good enough for gaming.
> 
> The only thing putting me off right now is the buggy ecosystem that is supporting them and the fact they're such poor overclockers.
> 
> I was really hoping they would at least get to 4.3Ghz but alas that looks like we'll have to wait for the first refresh of Ryzen in 2018 before overclocking headroom improves.


You could look at this another way: get a 1700. Testing from other sites has shown that the 1700 overclocks well enough to match the 1700X and 1800X once overclocked, for less money. The BIOS, memory and setup difficulties are being worked out and seem to be making good progress.

The Ryzen 5 series is going to be just as interesting as this launch has been. 

As for the bias noted by other commenters: the reviewer is clearly excited by the results, but the results are in line with those posted by other reviewers on other sites. The range of tests done was interesting, which kind of proves W1zzard put some thought into the process. So even with the bias, it's still a valid review that provides value to the readers.


----------



## notb (Mar 20, 2017)

sweet said:


> If you think a $329 1700 which can match $1000 6900K is a fail then yes...



Well... in some tasks it can match a $1000 6900K; in others it loses to a $300 i5.
Each of us has to decide whether this is the CPU for him.

Of course I have nothing against people buying Ryzen even when their actual workflow would benefit more from something from Intel. It's their choice.

Plus, it will help AMD financially.
Let's be honest: if Ryzen turns out to be a major flop, it would be a disaster for AMD. Maybe not the first one, but very likely the last.



Nkd said:


> You seemed to have cherry picked the game that clearly ryzen does worst in.



That's actually not true, but whatever.
Fallout 4 is a bestseller built on one of the most popular engines. Saying that it favors Intel, so it's not the best game to use for comparison, is just silly.
It's as if we compared a hatchback and a tractor for best shopping vehicle and you protested because most roads clearly favor cars.

Maybe AMD should start making stuff that works well in major titles? (AFAIK they've just started working with Bethesda, so it's not against their corporate strategy...)

How does this compare to people constantly moaning that reviews don't include Ashes of the Singularity - a mediocre game that basically no one plays?

Number of Steam users' reviews:
Fallout: 54 402
AoS: 354


----------



## laszlo (Mar 20, 2017)

I have read the review - thanks @W1zzard!

I also ran through the ~9 pages of comments....

With some comments I agree, with some I don't... this is how a tech site works..

My personal feelings about the review are mixed... I didn't see an 8-core Intel in it.... and the 7700K is faster even than its bigger 8-core brother in many areas.... it seems that 4 cores communicate better & faster than 8, depending on the workload type too.. 

I think it would be fair to compare different CPUs (2, 4, 6, 8... cores) using 1 core to find out the base performance; more cores on a die don't always mean faster, due to each producer's architecture.

For productivity & gaming I would compare side-by-side CPUs with a similar # of cores ....


----------



## medi01 (Mar 20, 2017)

TheHunter said:


> everyone compared it to 7700K





TheHunter said:


> lol, there was 5960x and 6900K too


Mind blown.



happy medium said:


> I'll prove my theory.


You need more than one game to prove your theory. Here are benchmarks at 1080p across a bunch of games; it's interesting to note that going down to 720p the gap shrinks rather than grows:

Games were:

Assassin's Creed Unity 
Call of Duty: Advanced Warfare 
F1 2015
GTA V
The Witcher 3
Total War: Attila 

https://www.computerbase.de/thema/prozessor/rangliste/#diagramm-gesamtrating-spiele-720p


----------



## rtwjunkie (Mar 20, 2017)

notb said:


> I actually asked one of Ryzen worshipers here how often he does movie encoding, if he's spending so much time praising Ryzen's performance. The answer I got was: "whenever necessary"


You're being a little disingenuous. The member whose answer you dislike is @FordGT90Concept.  I would hardly call him a Ryzen worshipper, but whatever.

Someone who has a large movie collection will in all likelihood watch those movies pretty frequently. So the answer to how many times may not be exact, and his "whenever necessary" is no different from mine.


----------



## BiggieShady (Mar 20, 2017)

Multi threading in Fallout 4 is buggy but can be enabled: https://www.reddit.com/r/pcmasterrace/comments/3t5jn7/fallout_4_multithreading_console_commands/

The gap in the Fallout 4 graph is huge because it's not a very multithreaded engine, so the difference in boost clock shows the most.


----------



## Capitan Harlock (Mar 20, 2017)

Great review as always but why no 4K benchmarks?


----------



## notb (Mar 20, 2017)

rtwjunkie said:


> Someone who has a large movie collection will in all likelihood watch those movies pretty frequently. So the answer to how many times may not be exact, and his "whenever necessary" is no different from mine.



I think it's no different with everyone - not just you. 

How often do you transcode a movie before watching it?
I've also seen many movies / series episodes / documentaries this year - I guess around 4/week. And yet only 5 of the files needed editing.

Should we really base our evaluation of Ryzen's usefulness on the fact that it is superior at some tasks that we do "whenever necessary"?

I understand that you mean "we do it sometimes and Ryzen will make things faster", but again: this is not a qualitative change. Going from 4 cores to 8 does not make new things possible. It just makes them less time-consuming and - as such - we must take into consideration how often these tasks are performed.
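The diminishing return being described here is essentially Amdahl's law: extra cores only shrink the parallel part of a job. A quick sketch - the 90% parallel fraction is illustrative, not a measured number for any real workload:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when only part of a workload scales with core count."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical 90%-parallel encode job: doubling 4 cores to 8
# helps, but nowhere near 2x, because the serial 10% never shrinks.
print(round(amdahl_speedup(0.9, 4), 2))  # 3.08
print(round(amdahl_speedup(0.9, 8), 2))  # 4.71
```

So even a workload that parallelizes well sees 8 cores deliver only ~1.5x over 4, which is why how often you run such tasks matters.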


----------



## FordGT90Concept (Mar 20, 2017)

BiggieShady said:


> Multi threading in Fallout 4 is buggy but can be enabled: https://www.reddit.com/r/pcmasterrace/comments/3t5jn7/fallout_4_multithreading_console_commands/
> 
> The gap in Fallout 4 graph is huge because it's not very multi threaded engine so the difference in boost clock shows the most.


Seriously?  I thought BethSoft fixed their shit.  Apparently not.  I hope they're working on a modern D3D12/multithreaded/64-bit engine now.  It's long overdue.



Capitan Harlock said:


> Great review as always but why no 4K benchmarks?


4K strains the GPU more than the CPU.


----------



## Footman (Mar 20, 2017)

Great review. Hit the nail on the head. I believe there is better value to be had in the 1700, which is the CPU I purchased last week as a replacement for Devil's Canyon. Multi-threaded performance is through the roof, but gaming at 2560x1440 is similar. Optimizing will take time.

Motherboard support is horrible. My 3200 RAM won't boot at 3200; the best I have got with it to date is 3000, and that has been a struggle. Response from the motherboard maker has been sporadic. Perhaps this CPU needs some time.

Good news is overclocking the 1700 is pretty easy. I have mine at 4.0 GHz and 1.3785 V, and as cool as 54C with OCCT (under water). Not really a gamer's chip though, until games take advantage of more cores or game makers optimize the current crop of games.
Pretty good value for an 8-core, 16-thread chip....


----------



## MrMilli (Mar 20, 2017)

happy medium said:


> Edit : Look at Total War's gpu bottleneck
> Look at the 7700k scores. they go way down due to the 1080 bottlenecking at 1440p
> But look at the Ryzen scores at 1080p when there less gpu bottleneck.



Look at these:
https://www.computerbase.de/2017-03...test/4/#diagramm-total-war-warhammer-dx11-fps
https://www.computerbase.de/2017-03...test/4/#diagramm-total-war-warhammer-dx12-fps

7700K
DX11: 43 fps
DX12: 42 fps

1800X
DX11: 40 fps
DX12: 30fps

While the 7700K only loses about 2% of its performance in DX12, the 1800X loses 25%. The DX11 results of both chips are actually very comparable.
As I've pointed out in my previous posts (pretty much ignored on all forums), DX12 games are lacking optimization for Ryzen and should be tested in DX11 till they're updated.
The whole issue at the moment with Ryzen's game performance is that almost all reviews solely focus on DX12 performance, which is a mistake in my opinion, DX12 being a low-level API.

In that CB review, Ryzen beats/matches the 7700K in Dishonored 2, F1 2016, Shadow Warrior 2 and Watch Dogs 2. Its game performance is not as bad as some reviews make it out to be.
It seems to really lack performance in Project Cars - some 20% behind the 7700K, but around 13% faster than the 4770K. Then again, in the same vein, the 7700K is 20% slower in F1 2016.
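For reference, the loss percentages follow directly from the quoted fps numbers:

```python
def dx12_loss_pct(dx11_fps: float, dx12_fps: float) -> float:
    """Performance lost moving from DX11 to DX12, as a percentage of DX11 fps."""
    return (dx11_fps - dx12_fps) / dx11_fps * 100

# ComputerBase Total War: Warhammer figures quoted above
print(round(dx12_loss_pct(43, 42), 1))  # 7700K: 2.3
print(round(dx12_loss_pct(40, 30), 1))  # 1800X: 25.0
```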


----------



## EarthDog (Mar 20, 2017)

funny.. they 'should' be tested in DX11... so we should test (anything) in a more favorable light than how people would actually run it? Why would I run a DX12 game in DX11 unless there were some serious issues? It is what it is right now... sorry AMD couldn't seem to get samples/information out to some of the most important people to ensure better success for the chip...


----------



## BiggieShady (Mar 20, 2017)

FordGT90Concept said:


> Seriously? I thought BethSoft fixed their shit. Apparently not. I hope they're working on a modern D3D12/multithreaded/64-bit engine now. It's long overdue.


My reaction exactly .... it'll be hilarious when we see Gamebryo one more time


----------



## dirtyferret (Mar 20, 2017)

a few quick points

1. a much better effort than the disappointing FX series, especially in IPC
2. a bit overpriced for the casual user and gamer; there is obviously a certain market that can use that multi-core performance now, but it's strange AMD did not launch something in the $200-250 price range out of the gate.
3. That said, I am very interested in seeing the Ryzen 5 family's performance when they launch in mid-April (hopefully not a paper launch)
4. If anyone replies to this stating they are making a Ryzen purchase to "future proof" their PC, allow me to post my auto reply

5. AMD really needs to move off the more-cores-for-your-money bandwagon. I know the markup is much greater on CPUs beyond mid-range, but they really need to attain more market share, and that will occur in the mid-range market.
6. interesting article over at hardocp
_
For those of you looking to save a few bucks and build a budget system with as many cores as could previously be had for $1000, the Ryzen 1700 processor is looking to be the best value in Ryzen CPUs for the overclocker. For all intents and purposes, the Ryzen 1700 is the same CPU as the 1700X and the 1800X at quite a cost savings. The one caveat may be, and this is a guess based on very little testing so far, that the 1700 may not show the same overclocking prowess as the X models. Even then it was less than 100MHz, which is something that you would never be able to identify in everyday usage and gaming.

https://www.hardocp.com/article/2017/03/08/amd_ryzen_1700_cpu_vs_1700x_review/3_


----------



## londiste (Mar 20, 2017)

MrMilli said:


> The whole issue at the moment with Ryzen's game performance is that almost all reviews solely focus on DX12 performance which is a mistake in my opinion, with DX12 being a low lever API.


DX12 is only a lower-level API when it comes to the GPU. Basically, as far as the CPU is concerned, it is not that different.


----------



## GWComputers (Mar 20, 2017)

WE DON'T CARE AT ALL FOR EITHER INTEL OR AMD BUT...
This post should have had a dislike button instead.
Are you being paid by Intel??
The R7 1800X has from the beginning been compared to the i7 6900K, which is a 2011v3 CPU (the i7 name makes no sense here). SO MY QUESTION IS, WHERE ARE THE i7 6800K, 6900K and i7 6950K IN THE BENCHMARKS??
We are fans of neither Intel nor AMD, but this post is biased or made by a mediocre member of your staff.
You could make the RIGHT post and say that the software used today is not optimized for multiple cores, and that's the reason why high-end CPUs like the R7 1700X/1800X and i7 6800K/6900K/6950K sometimes get lower results than a "mediocre" i7 7700.
This is not the first time that I have seen terrible reviews made by you and other sites like yours. Tests in our lab are different, and I can assure you we know what we are doing.

You keep being biased and making these "reviews", and in return we still have these bad, overpriced 4-core CPUs around, and software companies forced or paid to develop garbage software, including games


----------



## MrMilli (Mar 20, 2017)

londiste said:


> dx12 is only lower level api when it comes to gpu. basically as far as cpu is concerned, it is not that different.



Firstly, that's not true, and secondly, how do you explain the massive regression in performance for Ryzen in DX12 if it were true?


----------



## Shatun_Bear (Mar 20, 2017)

dirtyferret said:


> a few quick points
> 
> 1. a much better effort then the disappointing FX series especially in IPC
> 2. *a bit over priced for the casual user and gamer*, there is obviously a certain market that can use that multi-core performance now but it's strange AMD did not launch something in the $200-250 price range out the gate.
> ...



This is the silliest post I have read all day. That's like saying 'the Porsche is over-priced for the working-class buyer'.

The second bolded point is equally crazy. You covered it in your own post - Ryzen 5 is coming April 11. But regardless, AMD is giving us too many cores for an over-inflated price, according to you, with those pesky R7s

I don't know whether you are being serious with the whole list or not, in all honesty.


----------



## notb (Mar 21, 2017)

Shatun_Bear said:


> This is the silliest post I have read all day. That's like saying 'the Porsche is over-priced for the working-class buyer'.



Maybe he meant "too expensive"? That would be correct.

Ryzen 7 is great value, but clearly priced way above what most people are willing to spend on a CPU.

R3/R5 is the mainstream CPU we're waiting for, but again - what about the price?
It seems the R5 1500 (6C) will have performance similar to an Intel i7 (4C).
So it's $230 vs $300 - clearly AMD is cheaper IF you're going to buy a dGPU in both cases - not necessarily otherwise.
The gaps are even smaller down the line, and in the low-end R3 / i3 segment it's neck and neck.

The cheapest Ryzen 3 is rumored to cost $130 - again, quite a lot. What about cheaper stuff?
Will the Ryzen-based APUs compete with Intel's Pentium lineup?
Until now AMD has only told us something about the high-end models (4C/8T), but they will be more expensive than the non-IGP variants (makes sense, doesn't it?).
The G4560 is 2C/4T + IGP for $80, and it is REALLY fast.
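The "cheaper IF you're buying a dGPU" point is just addition; a toy sketch using the CPU prices quoted above, with the entry dGPU price being an assumed figure for illustration:

```python
def build_cost(cpu: int, dgpu: int = 0) -> int:
    """Total CPU + optional discrete GPU cost in USD."""
    return cpu + dgpu

R5_1500, I7 = 230, 300   # CPU prices quoted above
ENTRY_DGPU = 100         # assumed price of a cheap discrete card

# Both builds need a dGPU (Ryzen has no IGP): AMD stays $70 cheaper.
print(build_cost(I7, ENTRY_DGPU) - build_cost(R5_1500, ENTRY_DGPU))  # 70

# If Intel's IGP suffices, the i7 build comes out $30 cheaper instead.
print(build_cost(R5_1500, ENTRY_DGPU) - build_cost(I7))  # 30
```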



Shatun_Bear said:


> The second-bolded is equally crazy. You covered it in your own post - Ryzen 5 is coming April 11. But regardless, AMD is giving us too many cores for an over-inflated priced according to you with those pesky R7s



Again, what's wrong?
The last time AMD had market share close to Intel's in the consumer CPU segment, they weren't trying to be different, interesting, futuristic or anything like that. They were making a very similar product, yet with some interesting features and at a lower price point.

Intel is trying to be innovative in other segments - spending a lot on AI, embedded solutions, IoT and so on. But in consumer CPUs they're simply answering current needs - not trying to be clever.
Wouldn't this also be better for AMD? Their battle for high-core-count gaming has already been going on for over 5 years (10 if you count the R&D time for Bulldozer). It's been a huge failure since the beginning.


----------



## londiste (Mar 21, 2017)

MrMilli said:


> Firstly, that's not true and secondly, how do you explain the massive regres in performance for Ryzen in DX12 if it would be true?


would you care to elaborate on why it's not true?


----------



## medi01 (Mar 21, 2017)

FordGT90Concept said:


> Strains GPU more than CPU.


So god forbid a dude with a Titan would see what performance he would get with this CPU.
He should deduce it from the very reasonable SLI Titan benchmarks at 720p.



EarthDog said:


> Why would I run a DX12 game in DX11


Because it is... faster that way? (for all nvidia gpus and most games)


----------



## Shatun_Bear (Mar 21, 2017)

notb said:


> Again, what's wrong?
> *The last time AMD had a market share close to Intel in consumer CPU segment, they weren't trying to be different, interesting, futuristic or something like that. They were making a very similar product, yet with some interesting features and at a lower price point.*
> 
> *Intel is trying to be innovative in other segments - spending a lot on AI, embedded solutions, IoT and so on. But in the consumer CPU they're simply answering current needs - not trying to be clever.
> Wouldn't this be also better for AMD?* Their battle for high-core gaming is already going on for over 5 years (10 if you consider the R&D time for Bulldozer). It's been a huge failure since the beginning.



You've contradicted yourself there. First you want AMD to offer something 'different', then you say they would be better off 'answering current needs' like Intel. So which is it?

The problem with TechPowerUp is they review products without considering the price, especially when it comes to Intel and Nvidia products. So I don't blame you if the incredible value proposition of the 1700 is lost on you - offering an 8-core CPU for $330 that gets very close to Intel's 3x-as-expensive 8-core HEDT is very interesting and 'futuristic'. I don't know how anyone can believe that more cores is not the future - anyone will tell you that's the way the industry is going. It would be incredibly backward if AMD offered only high-frequency, low-core-count CPUs with Ryzen, which is what you and the other guy seem to be proposing they should have done. What's silly is they _are_ offering 4 and 6-core CPUs, but the R7s released first. Just be bloody patient.


----------



## Warrgarbl (Mar 21, 2017)

If I hadn't had a need to buy a new motherboard and CPU a year back I would totally buy a Ryzen. Performance is great across the board, and the gaming performance isn't enough to deter me. Granted, my focus shifted. Then again, I can't wait to see what software-side optimizations might bring (games, especially).


----------



## msroadkill612 (Mar 21, 2017)

Is this correct? It seems too simple.

The impression I get is that Ryzen's auto overclocking works well, but only within strictly monitored CPU temperature ranges.

So, instead of the limited-benefit hassle of manual overclocking, why not spend on better cooling (liquid, e.g.) and let the auto overclock do the hard yards?


----------



## msroadkill612 (Mar 21, 2017)

Steevo said:


> "Hep" thing perhaps should be "hip"?
> 
> Now to read the review.


FTR, "a real hep cat" was actual "beatnik" usage of yore, & meant "hip" of course.


----------



## msroadkill612 (Mar 21, 2017)

LiNKiN said:


> This is the best and most informative Ryzen review on the web. Thank you for the intense amount of thought and effort that you put into this article W1zzard.



Maybe it's just my searching, but I was keen for any Ryzen stuff out there, & didn't find much that wasn't dated Mar 2.

News is new by definition, but it's also a crap way to learn - history, for example. Imagine a view of the Vietnam war formed by reading the papers at the time, and nothing since?

Let's not forget what utter BS some of the early reviewers spouted. A lot of AMD stockholders were stung by believing them.

check this - 1 month AMD stock chart; look what happens on Mar 2 based on "expert" opinions in the press, then folks wised up:




So yeah, me, I prefer information, and waiting a while to see what others discovered the hard way seems intelligent husbanding of resources.

Kudos. I shall pay more attention to TPUs site in future.


----------



## dirtyferret (Mar 21, 2017)

Shatun_Bear said:


> This is the silliest post I have read all day. That's like saying 'the Porsche is over-priced for the working-class buyer'.
> 
> The second-bolded is equally crazy. You covered it in your own post - Ryzen 5 is coming April 11. But regardless, AMD is giving us too many cores for an over-inflated priced according to you with those pesky R7s
> 
> I don't know whether you are being serious with the whole list or not in all honesty.



I'm sorry my post was too complicated for you; I will attempt to educate you on the basics of CPUs if need be. Please let me know if any term goes over your head, but here is a good start for you

https://en.wikipedia.org/wiki/Central_processing_unit

Your analogy is not apt, as the Ryzen CPU is not a Porsche, nor does a single person in the tech world consider it one. The Ryzen 7 is more like a Subaru Baja: overpriced against sedans and lacking the full utility of a true pickup. Obviously there was a market for such a vehicle, yet it only lasted four years due to low sales.


----------



## GhostRyder (Mar 21, 2017)

the54thvoid said:


> Well I've bought into the Ryzen ecosystem for one reason.
> 
> If I buy a KL chip now, it'll be superseded soon enough by another Intel chip so in 3 years time there will have been like 5 Intel high end chips out.  Anyway, at 1440p with max settings on everything and new 1080ti, the 1700X I've gone for should be better then my Sandy-E.  And if Bethesda go full AMD optimised, maybe it's not so much a gamble for me.


I am shocked you're retiring that system - such an awesome chip.

Well, the best thing about these chips is the power consumption, considering the number of cores mixed in with the performance. Still disappointing overclocking though, but that's a caveat of a low-power design. Personally I would choose better overclocking over low power, but that's me.


----------



## Frick (Mar 21, 2017)

W1zzard said:


> Right on the money, this is my first CPU review. Which means selecting and figuring out benchmarks, then building test systems with the hardware that's available, then bench (not exactly few results), then think, fix bench suite, rebench everything (two times for this review), then come up with structure, layout, texts, conclusion.
> 
> There will be more CPU reviews from me though  Just bought i5 7400, i3 7100, Pentium G4560.



Didn't you review the E6600, or was it the Q6600?

Anyway, hoping someone will sell me their Haswell i3s/i5s for cheap!


----------



## notb (Mar 21, 2017)

Shatun_Bear said:


> You've contradicted yourself there. First you want AMD to offer something 'different', then you say they would be better 'answering current needs' like Intel. So which is it?


Misunderstanding.
I've never said that I want AMD to be "different". Sorry if you took it that way.
I precisely think AMD should stop building their strategy on a bet. For 10 years they've been betting that a highly multi-threaded future is just around the corner.

Intel is giving us CPUs optimized for the actual tasks that are performed at the moment.
AMD gave us very expensive CPUs that are superior in fairly niche situations...



Shatun_Bear said:


> The problem with TechPowerUp is they review products without considering the price, especially when it comes to Intel and Nvidia products. So I don't blame you if the incredible value proposition of the 1700 is lost on you - offering an 8-core CPU for $330 that gets very close to Intel's 3X as expensive 8-core HEDT is very interesting and 'futuristic'.



It doesn't matter. $300 is too much for a mainstream CPU, no matter how good the performance is. People want a CPU that will let them use a browser, an office suite, Skype, some games and so on.
An i3 is already good for that. An i5 is for those gaming at high resolution. An i7 is already overkill for most.
You're praising Ryzen as if everyone on the planet was doing WCG as a hobby.

Ryzen has multi-thread potential that - outside of specific tasks like movie encoding or simulations - is very difficult to use. This won't change fast enough for AMD to win a big audience. Intel will catch up - even just on their small but regular improvements with each generation.



Shatun_Bear said:


> I don't know how anyone can believe that more cores is not the future



It's not about whether it is or not (and are you so sure?). It's about how far we are from that happening (assuming it will).


----------



## dirtyferret (Mar 21, 2017)

notb said:


> Misunderstanding.
> I've never said that I want AMD to be "different". Sorry if you got it that way.
> I precisely think AMD should stop building their strategy on a bet. For 10 years they've been betting that just around the corner is a highly multi-thread future.
> 
> ...



I don't think you could have laid it out any better.


----------



## CounterSpell (Mar 22, 2017)

Buggy hardware ecosystem. Waiting for fixes... the CPU is not mature enough.

But a big jump for AMD CPUs


----------



## friocasa (Mar 22, 2017)

GWComputers said:


> WE DON'T CARE AT ALL NEITHER FOR INTEL OR AMD BUT...
> This post should have had a dislike button instead.
> Are you being paid by Intel??
> The R7 1800X since the beginning was being compared to i7 6900K which is an 2011v3 CPU and the ( i7 name makes no sense here) SO MY QUESTION IS, WHERE ARE THE i7 6800K, 6900K and i7 6950K in the benchmarks??
> ...


This is one of the worst reviews I've ever seen on this website: bad tests, a bad and very limited choice of CPU models, worse RAM speed and latency on one side (Ryzen, the most dependent on it)...

I truly expect an update of this review in the near future


----------



## medi01 (Mar 22, 2017)

LiNKiN said:


> This is the best and most informative


Oh please...
No 4K (yeah, god forbid people actually using the GPUs used in the test would know how it runs on their monitors), no 8-core from Intel, no DX11 vs DX12, and Ryzen using slower RAM than the 7700K (after AMD said Infinity Fabric uses the RAM clock)

"Most informative" right.


----------



## notb (Mar 22, 2017)

medi01 said:


> Oh please...
> No 4k (yeah, god forbid people actually using GPUs used in the test would know how it runs on their monitors),



You assume people actually have 4K LCDs?



medi01 said:


> no 8 cores from Intel



AMD categorized Ryzen as gaming segment. TPU is primarily a gaming-oriented site, and so is the review.
In the blue camp, HEDT CPUs are not in the gaming segment. Intel promotes LGA1151 stuff for that.

Why would we compare Ryzen to Intel HEDT stuff? Because they have a similar number of some weird elements called "cores"? Why not the physical size of the die?



medi01 said:


> , no DX11 vs 12


At this point everything should be tested on DX12 anyway. Why bother with both?



medi01 said:


> Ryzen using slower RAM than 7700k (after AMD said infinity fabric uses RAM clock)


The Ryzen platform has compatibility issues with faster DDR4 at this point; Intel's doesn't. We're comparing platforms' performance, not just raw design superiority.

Think about it. If it turned out that Intel chips work with every GPU available, but Ryzen (for whatever reason) is not compatible with anything faster than a GTX1050, would this be an important factor while evaluating their gaming potential?
Or would you also expect reviewers to use GTX1060 for both and totally omit the tiny "detail"?


----------



## Rowsol (Mar 22, 2017)

4k is a pointless test for CPU testing and mostly a waste of GPU power.


notb said:


> AMD categorized Ryzen as gaming segment


They compared it to the $1,000 chip and said same performance, half price.  They also tested it against a Core i5 or i7 (don't remember which) while streaming - again, showing the gains of extra cores.  Anyone with a brain knows a 16-thread chip isn't built for gaming.


----------



## refillable (Mar 22, 2017)

notb said:


> You assume people actually have 4K LCDs?



What's the difference between that and assuming the opposite for a (let's assume) GTX 1080 Ti and this CPU? Anecdotal evidence.



notb said:


> AMD categorized Ryzen as gaming segment. TPU is primarly a gaming-oriented site and so is the review.
> In the blue camp HEDT CPUs are not in gaming segment. Intel promotes LGA1151 stuff for that.
> 
> Why would we compare Ryzen to Intel HEDT stuff? Because they have similar number of some weird elements called "cores"? Why not the physical size of a die?



Wrong. AMD categorized Ryzen as a content-creator processor as well. This part of your post has nothing right in it.



notb said:


> Think about it. If it turned out that Intel chips work with every GPU available, but Ryzen (for whatever reason)* is not compatible with anything faster than a GTX1050*, would this be an important factor while evaluating their gaming potential?
> Or would you also expect reviewers to use GTX1060 for both and totally omit the tiny "detail"?



It isn't. What now - cherry-pick Hitman, Total War and Fallout again, perhaps?

*And I SERIOUSLY consider giving a free i3 or Celeron to an "average customer" like you.*


----------



## happy medium (Mar 22, 2017)

AMD Ryzen 7 1800X review: what's the real story with gaming?
http://www.eurogamer.net/articles/digitalfoundry-2017-amd-ryzen-7-1800x-review

This is how you do a CPU review, you take away the gpu bottleneck as much as possible.
quote:
"We believe that a CPU purchase should last for years, so rather than test processors with a particular GPU at standard gaming conditions, we opt instead to take the graphics hardware out of the test results as best we can and to attempt to concentrate more closely on a processor's gaming potential. The aim here is to ascertain relative performance between CPUs when running game engine code - this gives a better idea of how 'lastable' a potential processor may be."

Looks like an i5 7600K trades blows with a 1800X.

Guys, PLEASE read the whole review!

The fact is, if you're NOT GPU LIMITED, a Ryzen CPU is about 15-20% slower than an Intel CPU when gaming.
The review even states that an overclocked GTX 1080 Ti is sometimes limited at 1080p.
AND if you overclock, the Intel CPUs will beat a Ryzen CPU by even more.

The good news is, if you are encoding movies all day, Ryzen is a good buy.


----------



## refillable (Mar 22, 2017)

"We believe that a CPU purchase should last for years, so rather than test processors with a particular GPU at standard gaming conditions, we opt instead to take the graphics hardware out of the test results as best we can and to attempt to concentrate more closely on a processor's gaming potential. The aim here is to ascertain relative performance between CPUs when running game engine code - this gives a better idea of how 'lastable' a potential processor may be."

Has been refuted countless times. Back in the days of Bulldozer, people were showing Civ 5 benches at 800x600 that put Bulldozers at half the speed of their Sandy Bridge counterparts. And they simply inferred the same thing - Sandy Bridge is more "lastable" - but it turns out that's flawed logic. Even now those crappy faildozers (or Piledrivers) can still keep up (in games), with around 80-90% of the performance of the 2500K (according to computerbase.de).


----------



## notb (Mar 22, 2017)

Rowsol said:


> They compared it to the 1k chip and said same performance, half price.  They also tested it against a core i5 or i7 (don't remember which) while streaming, again, showing the gains of extra cores.  Anyone with a brain knows a 16t chip isn't built for gaming.



But now suddenly we're talking about what clients should know? 

AMD is selling Ryzen as general consumer / gaming solution. *End of story.* This is all over both the platform specs and the marketing surrounding it.
Naples is the upcoming server solution. We don't know about a workstation solution at the moment.
Go to their website. Ryzen is in desktop solutions. No CPU is mentioned in the workstation part - just GPUs:
http://www.amd.com/en/products/workstations

AMD knows very well that they can make a workstation-grade platform and ask $1000 for a CPU - just like Intel does.
Have you seen the latest rumors?
http://wccftech.com/amd-working-16-core-ryzen-cpu

Don't get me wrong: Ryzen 7's multi-thread performance is up there with Intel HEDT, but the whole platform isn't - it lacks robustness and features (ECC support being the most obvious).
Comparing Ryzen and Broadwell-E simply by looking on the performance and price is naive at least (I don't want to call it "biased" or "fanboyish"). For the extra $500 Intel sells you a lot more than the 8C/16T setup. 

If you're just after performance, check the i7-6800K. It has basically the same performance as the Ryzen 7 1700 and costs "just" $400.
Sure, the Ryzen 1700 is still cheaper, but that's $330 vs $400 - something we've been used to with AMD vs Intel.
And of course it's just 6C/12T going neck and neck with 8C/16T from AMD.


----------



## Capitan Harlock (Mar 22, 2017)

notb said:


> You assume people actually have 4K LCDs?



I have a 43" 4K IPS LCD monitor with an LG panel XD.


----------



## nemesis.ie (Mar 22, 2017)

Update: running tests at 3850 MHz / 1.3 V (Ryzen Master); CPU-Z says 1.25 V. RAM @ 3200 MHz, with AIDA64 reporting a little over 50,000 MB/s RAM read speed.

Cracked 20,000 in CPU-Z bench.

Running with a ghetto TT copper cooler with 120mm fan on top (via a funnel) pulling air off the chip ... kind of works but the Kraken will be better whenever I get the mount. 

AIDA64 reports 55°C, Ryzen Master about 75 ... which ties in with the 20°C offset nonsense.

Regarding ECC, I don't have any to test but all the ECC options (and indeed server platform options like socket/die RAM interleaving) are present in the UEFI settings ...


----------



## refillable (Mar 23, 2017)

notb said:


> Don't get me wrong: Ryzen 7's multi-thread performance is up there with Intel HEDT, but the whole platform isn't - it lacks robustness and features (ECC support being the most obvious).



Rubbish, and even if some of it is true, relating it to the perspective of gamers and content creators (which AMD clearly marketed Ryzen 7 to) is still dumb.


notb said:


> Comparing Ryzen and Broadwell-E simply by looking on the performance and price is naive at least (I don't want to call it "biased" or "fanboyish"). For the extra $500 Intel sells you a lot more than the 8C/16T setup.
> 
> If you're just after performance, check the i7-6800K. It's has basically the same performance as the Ryzen 1700 and it costs "just" $400.
> Sure, Ryzen 1700 is still cheaper, but that's $330 vs $400 - something we've been used to with AMD vs Intel.
> And of course it's just 6C/12T going neck-and-neck with 8C/16T from AMD.



Not at all. 6 cores are still too weak for me. The R7 still wins almost all of the multithreaded benches out there.


----------



## nem.. (Mar 23, 2017)

happy medium said:


> AMD Ryzen 7 1800X review: what's the real story with gaming?
> http://www.eurogamer.net/articles/digitalfoundry-2017-amd-ryzen-7-1800x-review
> 
> This is how you do a CPU review, you take away the gpu bottleneck as much as possible.
> ...



Man, since we are credulous enough to believe Digital Foundry (which has been known for years for who they serve), why not also believe TOM$HARDWARE, where a 6900K with Broadwell IPC can defeat the 7700K Kaby Lake in gaming at 1080p - something that has not been seen in other reviews, and similar to Digital Foundry's result. What fools the Intel engineers must be, if even their Broadwell CPU defeats their fastest Kaby Lake CPU (this cannot be true) - I guess only in titles where more than four cores are used, but not in all titles, as they show it. :/


btw, according to this Japanese review, this is the IPC of these CPUs.
link http://ascii.jp/elem/000/001/445/1445029/


----------



## lexluthermiester (Mar 24, 2017)

friocasa said:


> This is one of the worst review i've ever seen in this website: Bad tests, bad choice and very limited amount of CPU models, worse RAM speed and latency on one side(Ryzen, the most dependent)...
> 
> I truly expect an update of this review in the near future


Intel fanboy nonsense. I have read/watched MANY reviews and this one was well thought out. The tests chosen show a very clear methodology of careful, thoughtful planning with a focus on being fair to both platforms. The results and insights offered clearly show the reviewer's excitement for AMD's new line of CPUs, which are good performers compared to Intel's BEST lineup and are even more competitive in pricing. 

Your rather sad little opinion comes off childish at best. Go boil your head.

Oh, and for the record, I own one of Intel's best CPUs available and will not be going to AMD. Still, no one but arrogant, prideful Intel fanboys is going to say that AMD has not done very well with these CPU offerings.

AMD's Ryzen 5 and 3 lineups are going to be just as interesting and exciting!


----------



## nem.. (Mar 26, 2017)

*RYZEN + 3600MHZ RAM - closing the 7700K gap in gaming? *


----------



## EarthDog (Mar 26, 2017)

I guess it's this video's turn to be posted 50 times at TPU, lol!


----------



## medi01 (Mar 29, 2017)

EarthDog said:


> I guess it's this video's turn to be posted 50 times at TPU, lol!


I don't find it funny.
And I think we need screenshots from it, the difference is MASSIVE:


----------



## medi01 (Mar 29, 2017)

Practical examples:





http://www.neogaf.com/forum/showthread.php?t=1348347&page=44


----------



## EarthDog (Mar 29, 2017)

It's funny when you see it 10 times in 4 threads. 

I'd like to see TW3 empirically tested instead of relying on some random dude in a forum and his results. I'd also like to see the same comparison with a modern Intel CPU. If the fps rates are similar, well, you can finish the story.


----------



## happy medium (Mar 29, 2017)

medi01 said:


> I don't find it funny.
> And I think we need screenshots from it, the difference is MASSIVE:
> 
> 
> ...



Dude, he is using a GTX 1070: GPU bottleneck, end of story.
He needs to use an overclocked GTX 1080 Ti to relieve the GPU bottleneck.

He should also use DDR4-4000 RAM with the Intel system; it also makes a difference.


----------



## medi01 (Apr 4, 2017)

happy medium said:


> gpu bottleneck, end of story.


Oh, FFS, how is it "just a GPU bottleneck" if performance varies so wildly between CPUs?

Not to mention that the 1070 outsells the Titan, what, 50 to 1, and is thus more relevant?


----------



## nemesis.ie (Apr 4, 2017)

I'd love to see this same 3600 MHz test with 2x RX 480s, which would also be cheaper than a single 1080 Ti.


----------



## GoldenX (Apr 5, 2017)

The reactions from the fanboys are truly something. Intel boys are BUT MUH GAMES FPS, while AMD boys are angry because this review or any other doesn't show Ryzen winning on everything.
Was it like this on the Athlon 64 vs Pentium 4 days?


----------



## nemesis.ie (Apr 5, 2017)

You have to ask?


----------



## notb (Apr 5, 2017)

GoldenX said:


> Was it like this on the Athlon 64 vs Pentium 4 days?


Not really. Back then Intel and AMD made similar CPUs, with a similar number of cores (1-2...). Actually, back then Intel was (slightly) leading in multi-core tasks (thanks to HT).
Thing is though... the most recent CPU was the fastest. It changed all the time.

On one hand, the fanboy war was fairly constant, but on the other, AFAIR it was less intense than today. 
AMD guys were living in frustration for 5 years - now they finally have a Ryzen to live for!
As for the Intel crowd... I guess they simply missed arguing.


----------



## happy medium (Apr 6, 2017)

medi01 said:


> Oh, FFS, how is it "just a GPU bottleneck" if performance varies so wildly between CPUs?
> 
> Not to mention that the 1070 outsells the Titan, what, 50 to 1, and is thus more relevant?



How does it matter how many people buy the faster GPUs?
Next year, when a $350 mainstream upper-midrange GPU is as fast as a GTX 1080 Ti and the Ryzen CPUs are falling even further behind, I won't say I told you so.

The way to test a CPU's performance is to have no GPU bottleneck; it's always been that way.
What, now we make special rules for Ryzen CPUs?


----------



## medi01 (Apr 6, 2017)

happy medium said:


> How does it matter how many people buy the faster GPUs?


Because, as seen with the 1070, you actually can NOT freaking deduce how a certain CPU + GPU combination will actually work out in games?


----------



## happy medium (Apr 6, 2017)

medi01 said:


> Because, as seen with the 1070, you actually can NOT freaking deduce how a certain CPU + GPU combination will actually work out in games?


Yet we have been doing it this way for 20 years, but now we can't. OK, if that's the best answer you can come up with...

Here, I give you a 7700K @ 4.0 vs. an 1800X @ 4.0 with an RX 480 and a GTX 1060.
Ryzen gets its azz kicked.


----------



## happy medium (Apr 6, 2017)

If the 7700k was overclocked, it would destroy a Ryzen even worse.
Wake up guys! Stop making excuses.

I rest my case.


----------



## notb (Apr 6, 2017)

medi01 said:


> Because, as seen with the 1070, you actually can NOT freaking deduce how a certain CPU + GPU combination will actually work out in games?



Honestly, you can't anyway. It's just way too sensitive to other variables.
You can, in general, take review results as more or less optimal and repeatable. They're great for comparing cards - but by no means a guarantee that you'll get close to identical results (even with identical gear...).


----------



## GoldenX (Apr 7, 2017)

happy medium said:


> If the 7700k was overclocked, it would destroy a Ryzen even worse.
> Wake up guys! Stop making excuses.
> 
> I rest my case.



In games, a higher clocked processor of a very mature architecture, beating a lower clocked one, of a new architecture, with bugs to solve. Shocking.

How are the multithreaded performance, overall platform cost and power consumption of the glorious 7700K, 6800K, 6850K and 6900K doing compared to the poor, useless R7 1700, 1700X and 1800X?


----------



## happy medium (Apr 7, 2017)

GoldenX said:


> In games, a higher clocked processor of a very mature architecture, beating a lower clocked one, of a new architecture, with bugs to solve. Shocking.
> 
> How are the multithreaded performance, overall platform cost and power consumption of the glorious 7700K, 6800K, 6850K and 6900K doing compared to the poor, useless R7 1700, 1700X and 1800X?



But in the benchmarks I listed, they were both clocked to 4 GHz and Ryzen lost big time.
I said "if" the 7700K was overclocked; in this case only the 1800 was overclocked, and it still lost.

Power consumption for the 7700K @ 4 GHz would be lower than for Ryzen at 4 GHz.


----------



## medi01 (Apr 7, 2017)

happy medium said:


> Yet we have been doing it this way for 20 years


People believed the Earth was flat for thousands of years; you still have time.



happy medium said:


> Here, I give you a 7700K @ 4.0 vs. an 1800X @ 4.0 with an RX 480 and a GTX 1060.
> Ryzen gets its azz kicked.



Did you actually LOOK at what you posted?
Namely, at the SHRINKING difference between the 7700K and the 1800X when you switch to an AMD GPU?

Your slide shows exactly the opposite of what you are trying to say, basically confirming AdoredTV's conclusion.


----------



## mcraygsx (Apr 8, 2017)

In those benchmarks the 7700K is already being exhausted, at close to max usage. Ryzen shines when you are doing other tasks while gaming, e.g. multiple IE tabs, game recording/uploading. The 7700K only shines in games and stutters while handling multiple tasks. The 7700K is perfect for games only, but Ryzen is better at everything else. Just give it time and it will catch up. Remember the good old days of the X99 platform, riddled with bugs.

As enthusiasts we should all admit AMD has a fantastic product on hand for us at a fantastic price. As of today there is zero upgrade path for Z270 or X99, while AM4 still has a future.


----------



## notb (Apr 8, 2017)

mcraygsx said:


> In those benchmarks the 7700K is already being exhausted, at close to max usage. Ryzen shines when you are doing other tasks while gaming, e.g. multiple IE tabs, game recording/uploading. The 7700K only shines in games and stutters while handling multiple tasks. The 7700K is perfect for games only, but Ryzen is better at everything else. Just give it time and it will catch up. Remember the good old days of the X99 platform, riddled with bugs.



I don't think people actually complained that the latest Intel i7s "stutter while handling multiple tasks" before the Ryzen marketing blitzkrieg happened.
Now suddenly people have decided that what they want is gaming with multiple browser tabs and movie encoding in the background. Honestly, what's the point?
There is always a limit to how many things a CPU can handle. Why not open 100 tabs? Why not run 10 Handbrake sessions? 

Also, while we should give AMD some time to polish the platform, I don't see how this could be an argument for buying current Ryzen stuff. If it's "riddled with bugs" like the early X99, how can we call it "a fantastic product"? It may be a great architecture, but the currently available product is one of huge potential and uncertain quality (at best).

Also, while AMD and Intel both have a history of releasing buggy architectures, they have a very different way of solving that. Intel updates often - fixing issues on the way. However, AMD is a mixed bag here. If Zen architecture turns out to be hard to live with, they might as well drop it and start another 5-year R&D program.



mcraygsx said:


> As enthusiasts we should all admit AMD has a fantastic product on hand for us at a fantastic price. As of today there is zero upgrade path for Z270 or X99, while AM4 still has a future.



I don't think every "enthusiast" has to be an enthusiast of tweaking, OC, worrying about RAM compatibility and so on. What about enthusiasts of having powerful, yet stable and trouble-free machines? Is AMD going to take care of them?

I'm sure Ryzen is an excellent purchase for people who love benchmarking, uploading their results on forums and so on.
But would you also recommend a Ryzen (including the current mobo choice) to someone who only has a single PC and needs it to be operational?


----------



## RejZoR (Apr 8, 2017)

The above 720p tests are funny. RYZEN GETS ITS ASS KICKED! [Games run at 190+ fps]


----------



## notb (Apr 8, 2017)

RejZoR said:


> The above 720p tests are funny. RYZEN GETS ITS ASS KICKED! [Games run at 190+ fps]


But then we see people claiming that Ryzen would be faster if clocked identically. Or that it's NVIDIA cards' fault that Ryzen lags behind the competition in games.
The uploaded test results simply verify these claims.
It's by no means a real-life gaming performance evaluation, because obviously not many people game at 720p (well... I do, actually).
As we all know, 1080p is the ruling resolution today. However, 1080p reviews are still criticized by Ryzen users (I don't understand why...).

Moreover, I still find it funny how you can mock a conclusion as irrelevant ("games run at 190+ fps") while praising tests of tasks that are hardly real-life / mainstream (movie encoding, streaming, synthetic benchmarks...). 

As for the 720p graphs: the same CPUs paired with a cheaper card (e.g. a GTX 1050) would halve the fps shown here (if not more). Increasing game requirements would result in another halving in 2 years or so. Before you know it, we're in a ~50 fps range, and at that point every fps counts.
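The halving argument above can be put in rough numbers. A back-of-the-envelope sketch (the starting fps and both halving factors are hypothetical, taken only for illustration):

```python
# Start from the ~190 fps seen in the 720p charts, then halve once for a
# cheaper GPU (e.g. a GTX 1050-class card) and halve again for heavier
# games in ~2 years. All factors are illustrative assumptions.
fps = 190.0
for _ in range(2):  # two successive halvings
    fps /= 2.0

print(fps)  # 47.5 -- already in the ~50 fps range where every fps counts
```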


----------



## RejZoR (Apr 8, 2017)

The fact is, Ryzen has IDENTICAL IPC to Intel's offerings. The fact that games don't perform the same is solely about market share and the dominance of Intel. Do you really think game studios would optimize games for AMD when AMD hadn't released anything worthwhile for 5+ years? Even when they had a presence, no one bothered optimizing for something that was in the minority. It's how it is. Now that AMD has released something good, it won't take long for optimizations to get back on track with AMD.

And if you stop obsessing over framerate and look at it objectively, with ANY Ryzen you buy, you'll be able to fully enjoy ALL games no matter what. It just doesn't matter.


----------



## notb (Apr 8, 2017)

RejZoR said:


> The fact is, Ryzen has IDENTICAL IPC to Intel's offerings. The fact that games don't perform the same is solely about market share and the dominance of Intel. Do you really think game studios would optimize games for AMD when AMD hadn't released anything worthwhile for 5+ years?


Now that argument is even worse for AMD. Now you're saying that AMD can't improve Ryzen performance, because game studios are not interested in optimizing for a niche processor.
So what if AMD Ryzen doesn't change AMD's market share? What if it's still a user group too small for gaming studios to care about?
It's even worse. Game studios should actually want AMD Ryzen to fail, because it's easier for them to optimize for a single architecture. So would other software makers.



RejZoR said:


> Even when they had a presence, no one bothered optimizing for something that was in the minority. It's how it is. Now that AMD has released something good, it won't take long for optimizations to get back on track with AMD.


I call this wishful thinking. Let's wait for some official figures on shipped CPUs. I really doubt the desktop Ryzen CPUs will change much.

Moreover, if AMD is about to gain a lot of market share, it will have to be thanks to APU/mobile solutions. But those will only have a single CCX, max 8 threads and possibly a lot fewer optimization problems.
Even if game studios try to support the Zen APUs, how can you be sure they will care about the desktop niche as well? 



RejZoR said:


> And if you stop obsessing over framerate and look at it objectively, with ANY Ryzen you buy, you'll be able to fully enjoy ALL games no matter what. It just doesn't matter.


This is also true for all Intel CPUs in the same price range. So if a CPU doesn't matter, why do you insist Ryzen is better? The dichotomy is killing me!


----------



## RejZoR (Apr 8, 2017)

I'm not talking about Ryzen being a minority, I'm talking about the entire timeframe from the Bulldozer launch until the Ryzen launch. AMD had nothing serious to offer and no one bothered with AMD.

With the amount of interest and existing sales, Ryzen is doing great despite all the nonsense panic from people running around about the "problems" with Ryzen. Things WILL change, because Ryzen is a more than competitive product now (unlike Bulldozer and its derivatives). If I weren't on an already capable system, I'd jump on Ryzen without much thinking.

Also, I never said it's BETTER. At worst, it's equally as good as any Intel CPU. One is better at one thing, the other at another. But when you draw a line, the differences are tiny compared to how massive they were during the Bulldozer era... It is better in terms of value for money, that's for sure. The 1800X beating the 6900K in some scenarios for half the money - that's a deal that's hard to refuse.


----------



## notb (Apr 9, 2017)

RejZoR said:


> I'm not talking about Ryzen being a minority, I'm talking about the entire timeframe from the Bulldozer launch until the Ryzen launch. AMD had nothing serious to offer and no one bothered with AMD.


When Bulldozer arrived it was also a new architecture that everyone was talking about for a while, but in the end it changed nothing.
Honestly, AMD today is repeating the same process - they bet that the world has changed and it's finally in need of more cores. But is it?



RejZoR said:


> With the amount of interest and existing sales, Ryzen is doing great despite all the nonsense panic from people running around about the "problems" with Ryzen.


What's your source on the sales figures? Are you an AMD insider or what?



RejZoR said:


> If I weren't on an already capable system, I'd jump on Ryzen without much thinking.


I wonder if this is actually the case: people buying Ryzen without much thinking.
AMD says that Ryzen is a great all-round / gaming platform, but the attached graphs show video encoding performance.
AMD says that Ryzen is cheaper, but most current Intel CPU users don't have a dGPU - adding that to the equation changes everything.



RejZoR said:


> Also, I never said it's BETTER. At worst, it's equally as good as any Intel CPU. One is better at one thing, the other at another. But when you draw a line, the differences are tiny compared to how massive they were during the Bulldozer era... It is better in terms of value for money, that's for sure. The 1800X beating the 6900K in some scenarios for half the money - that's a deal that's hard to refuse.



It's *very easy to refuse* once you actually think about the price point. $500 for a CPU? That's way too much. It really doesn't matter if a $500 CPU is faster than another niche CPU costing $1000.

As for Bulldozer: it launched after Sandy Bridge and was humiliated by the latter's single-thread performance.
5 years later, Ryzen has actually matched SB in single-thread performance and - because Intel hasn't shown a new architecture yet - is also fairly competitive against Kaby Lake.
Who knows what will happen next? Today Ryzen might seem like good value, but in a year?
We know that Intel will show something new, while AMD will have to live with Zen for the next few years.

And once again: desktop gaming market is too small to have a big impact on the general share statistics. AMD needs to offer competitive APU and mobile solutions. This hasn't happened yet.


----------



## RejZoR (Apr 9, 2017)

If $500 for a CPU is too much, $350 will be too much for a 7700K as well. That kind of person buys a Core i3 for 120 bucks or something...


----------



## GoldenX (Apr 9, 2017)

Intel's answer to Ryzen so far is giving HT to Pentiums (I can't see how that can compete against Ryzen APUs), the (possible) 2011-3 i5, and the i7-7740K for 2011-3, i.e. a more expensive 7700K with a 110 W TDP and no iGPU, without any of the benefits of the 2011-3 socket.
By the time Intel has something really new, AMD will have Zen+; the war (finally) continues.


----------



## refillable (Apr 10, 2017)

This thread is plagued with red herring debates. Glad I can simply ignore them.


----------



## oblivionlord (Apr 17, 2017)

Crazy question, but why is the Fallout 4 performance virtually identical at 1080p and 1440p on both charts? Civilization VI and Hitman also appear to show virtually the same performance at 1080p and 1440p. Hitman, however, shows the Intel chip losing performance at 1440p. It seems odd that, with nearly 80% added GPU demand, these 3 games show virtually no performance drop from 1080p to 1440p.


----------



## W1zzard (Apr 17, 2017)

oblivionlord said:


> Crazy question, but why is the Fallout 4 performance virtually identical at 1080p and 1440p on both charts? Civilization VI and Hitman also appear to show virtually the same performance at 1080p and 1440p. Hitman, however, shows the Intel chip losing performance at 1440p. It seems odd that, with nearly 80% added GPU demand, these 3 games show virtually no performance drop from 1080p to 1440p.


CPU limited
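"CPU limited" can be captured in a minimal model (all names and numbers below are illustrative, not from the review): the reported frame rate is roughly the minimum of the CPU's frame-preparation rate and the GPU's render rate at a given resolution, so once the CPU is the slower side, raising the resolution doesn't change the number.

```python
# Illustrative model of a CPU-limited benchmark (hypothetical numbers).
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower of the two components caps the observed frame rate."""
    return min(cpu_fps, gpu_fps)

# Suppose the CPU can prepare ~90 frames/s regardless of resolution, while
# the GPU manages ~140 fps at 1080p but only ~95 fps at 1440p.
fps_1080p = observed_fps(cpu_fps=90.0, gpu_fps=140.0)
fps_1440p = observed_fps(cpu_fps=90.0, gpu_fps=95.0)
print(fps_1080p, fps_1440p)  # 90.0 90.0 -- identical at both resolutions
```

The GPU load rises with resolution, but the chart doesn't move until the GPU cap drops below the CPU cap.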


----------



## oblivionlord (Apr 17, 2017)

Hitman and Fallout 4 are not CPU intensive enough to max out these chips, as is evident from other benchmark sites that show higher framerates at the highest graphics settings.


----------



## Totally (Apr 17, 2017)

notb said:


> Now that argument is even worse for AMD. Now you're saying that AMD can't improve Ryzen performance, because game studios are not interested in optimizing for a niche processor.
> So what if AMD Ryzen doesn't change AMD market share? What if it's still a user group too small for gaming studios to care about?
> It's even worse. Game studios should actually want AMD Ryzen to fail, because it's easier for them to optimize for a single architecture. So would other software makers.



That is false. A scenario where coding only needs to be done for a single architecture sounds nice on paper but is horrible for studios in reality: it means they get less support and money from the hardware manufacturer, since it has much less incentive to provide them. How many studios do you think could survive shouldering that burden?


----------



## medi01 (Apr 17, 2017)

Crazy stuff...


----------



## oblivionlord (Apr 17, 2017)

medi01 said:


> Crazy stuff...



I like this hahha. It raises further suspicions hehe. I've also noticed the nice 3000 MHz RAM with tighter timings used on the Intel system, whereas the AMD one used looser timings and slower memory. Quite fair.


----------



## notb (Apr 17, 2017)

Totally said:


> That is false. A scenario where coding only needs to be done for a single architecture sounds nice on paper but is horrible for studios in reality: it means they get less support and money from the hardware manufacturer, since it has much less incentive to provide them. How many studios do you think could survive shouldering that burden?


1) Pity the game design studios that NEED money from hardware manufacturers. Are games too cheap or what?
You're telling me I have to pay more for my CPU because the manufacturer has to support unprofitable game creators?!

2) Why would a single-architecture manufacturer not offer support to studios? (but not with money, for Zeus' sake!) I find this hard to understand. This company would still want its CPUs to work well with software. IMO you place too much stress on economic principles (the importance of competition) and too little on empirical observations.
For 5 years Intel was ruling the gaming scene, and yet it cooperated with studios and was hugely active as a sponsor of gaming events.

3) I'm pretty sure all gaming studios that make money on their products would survive. Why should we care about the rest if we clearly don't care about their products?


----------



## notb (Apr 17, 2017)

oblivionlord said:


> I like this hahha. It raises further suspicions hehe. I've also noticed the nice 3000 MHz RAM with tighter timings used on the Intel system, whereas the AMD one used looser timings and slower memory. Quite fair.


Most early reviewers didn't have a choice. Ryzen RAM compatibility is awful. As more and more people have noticed lately, we're getting back to almost forgotten PC issues: who made the memory dies, how they are configured and so on. This is so sad...
And keep in mind even the Ryzen 7 testing bundles that AMD sent to reviewers had compatibility issues. 

IMO, testing Ryzen with the same RAM that is run on the Intel platform is actually quite fair. Ryzen is marketed as a cheaper alternative, so let's talk about the real-world cost of moving from Intel to Ryzen, assuming you already have high-end DDR4 (Intel has supported it since late 2014). It's very probable that it won't work with Ryzen (or will ruin its performance), so you'll need to replace it. 16 GB of high-quality DDR4 costs almost as much as a Ryzen 5...
But if you're into "productivity" tasks, rendering, encoding etc. (which Ryzen 7 is clearly great for!), it might happen that you'll have to replace 32 or even 64 GB... And suddenly it might turn out that Ryzen is more expensive than Intel's X99 platform...


----------



## oblivionlord (Apr 17, 2017)

Yes, early reviews using engineering samples really aren't ideal, also considering early adopter issues, especially with Ryzen RAM; Corsair, I believe, released the first officially compatible RAM kits above 2666 MHz shortly after the Ryzen debut. Now the story has changed a bit and we can see a difference in performance due to maturity.

I'm quite certain this review was conducted quite some time before the publishing date, so it would be impractical to retest everything soon after the initial testing. Then again, to maintain a loyal fanbase, you have to provide updates and be loyal to your audience.

Still, I am skeptical about the 3 games mentioned, since their numbers are quite different from other sites'.


----------



## Totally (Apr 18, 2017)

notb said:


> 1) Pity the game design studios that NEED money from hardware manufacturers. Are games too cheap or what?
> You're telling me I have to pay more for my CPU because the manufacturer has to support unprofitable game creators?!



Studios would have to charge more because they'd have to take on more engineers to handle the stuff that was being spoon-fed to them by hardware manufacturers, or go without the free money and hardware.



> 2) Why would a single-architecture manufacturer not offer support to studios? (but not with money, for Zeus' sake!)



Ask yourself: in a scenario with only one player, what would the benefits/consequences be of supporting studios vs. not supporting them? Their hardware is the only platform; why would they care if the software runs faster, to the benefit of the consumer? People will be buying the hardware anyway.



> I find this hard to understand. This company would still want its CPUs to work well with software. IMO you place too much stress on economic principles (the importance of competition) and too little on empirical observations.
> For 5 years Intel was ruling the gaming scene, and yet it cooperated with studios and was hugely active as a sponsor of gaming events.



Being the top dog on the block is completely different from being the ONLY dog on the block: having to exert some effort to maintain a lead vs. having a lead due to no contest. Contrary to what you are saying, Intel has shown time and time again that they are not above dragging their feet when they have a comfortable lead. I'm not faulting them for this; they are a business after all. If there is no competition, anything beyond the bare minimum (helping with debugging, providing devs with support such as hardware, or more efficient code to make software run faster on their hardware) is all goodwill. Goodwill without reason is bad for business, since it hits the bottom line. In the example you give, Intel only has a lead; it's not "the only guy in town", so there is no such empirical evidence.



> 3) I'm pretty sure all gaming studios that make money on their products would survive. Why should we care about the rest if we clearly don't care about their products?



Pretty sure everything would then just be made by EA, Activision and/or Bethesda. I get the feeling you'd say you're cool with that.


----------



## notb (Apr 18, 2017)

Totally said:


> Studios would have to charge more because they'd have to take on more engineers to handle the stuff that was being spoon-fed to them by hardware manufacturers, or go without the free money and hardware.


Possibly yes.



Totally said:


> Ask yourself: in a scenario with only one player, what would the benefits/consequences be of supporting studios vs. not supporting them? Their hardware is the only platform; why would they care if the software runs faster, to the benefit of the consumer? People will be buying the hardware anyway.


Because with more novelties and better performance, people would have a reason to upgrade more often.
This kind of strategy works - think about Apple. They don't have any serious competition, i.e. most Apple users would not consider jumping to another company, even for better specs or a lower price.
Yet Apple manages to update its products in such a way that even sensible people (so, not those sleeping in front of stores) usually upgrade e.g. their iPhones every 2 years at most.
And while prices would most likely go up (again: Apple...), a monopoly does not imply a lack of progress.

Going back to Intel/AMD: just look at what happened in mobile CPUs. AMD is totally absent from this segment, and yet the speed of Intel mobile CPUs has doubled since Sandy Bridge (progress in desktops was much smaller). The monopoly worked.
Someone could say that Intel invests in mobile CPUs to defend against ARM smartphones and tablets - I might even agree with that theory. But if that's correct, what stops smartphones and consoles from forcing the development of desktop PCs? Again: we don't need another CPU manufacturer. 



Totally said:


> Being the top dog on the block is completely different from being the ONLY dog on the block: having to exert some effort to maintain a lead vs. having a lead due to no contest. Contrary to what you are saying, Intel has shown time and time again that they are not above dragging their feet when they have a comfortable lead.


There is another side to this story. Intel has a huge hunger for R&D (that's what happens when you employ so many R&D people).
You're right that every time Intel feels safe in consumer PC territory, desktop CPU improvement slows down. But they're not going on vacation.
Intel has been doing some really important stuff lately: AI (including autonomous cars), IoT, faster memory tech and so on.

Just compare the main websites:
I go to *www.intel.com* and it's all about business, AI, drones. Not a word about gaming. Actually not much about CPUs. Here a CPU is just part of a bigger solution.
Then I open *www.amd.com* and it's *"RYZEN POWERS. YOU FIGHT."*
Of course I'm not bashing AMD for not concentrating more on practical real-world issues, but it is important to remember that Intel and AMD are two very different companies - even if their most important products do the same thing.
To be honest, I find Intel's image very appealing: professional, mature... IMO what AMD does lately is a bit too teenager-oriented...


----------



## Rash-Un-Al (Apr 21, 2017)

What I'm surprised to see missing... in virtually all reviews across the web... is any discussion (by a publication or its readers) on the AM4 platform's longevity and upgradability (in addition to its cost, which is readily discussed).

Any Intel Platform - is almost guaranteed to not accommodate a new or significantly revised micro-architecture... beyond the mere "tick".  In order to enjoy a "tock", one MUST purchase a new motherboard (if historical precedent is maintained).

AMD AM4 Platform - is almost guaranteed to, AT LEAST, accommodate Ryzen "II" and quite possibly Ryzen "III" processors.  And, in such cases, only a new processor and BIOS update will be necessary to do so.

This is not an insignificant point of differentiation.


----------



## notb (Apr 22, 2017)

Rash-Un-Al said:


> What I'm surprised to see missing... in virtually all reviews across the web... is any discussion (by a publication or its readers) on the AM4 platform's longevity and upgradability (in addition to its cost, which is readily discussed).


Possibly because it's just an optimistic guess, with no actual guarantee from AMD?

We should not expect AM4 to last as long as AM3/AM3+ (over 6 years) if AMD is supposed to become a competitive player (like it used to be).
Before AM3, AMD was also replacing their mainstream socket every 2-3 years.

In 3 years from now AM4 will become fairly outdated anyway (DDR5 etc).



Rash-Un-Al said:


> AMD AM4 Platform - is almost guaranteed to, AT LEAST, accommodate Ryzen "II"


"Ryzen II" or "Ryzen+" will merely be a "tock" generation. What's special in supporting 2 CPU generations? You've just bashed Intel for that.


----------



## Rash-Un-Al (Apr 22, 2017)

notb said:


> In 3 years from now AM4 will become fairly outdated anyway (DDR5 etc).



This, too, may be an optimistic guess.     For example, DDR3 has been with us since 2007... and it wasn't until 2014 that the first motherboards supported DDR4... 7 years.   There's little doubt that technology and motherboard features will continue to progress.  However, one who purchases an AM4 motherboard today... is likely to have the choice to upgrade their CPU only (to the next generation)... or CPU + Motherboard (if they want the latest feature set).  



notb said:


> "Ryzen II" or "Ryzen+" will merely be a "tock" generation. What's special in supporting 2 CPU generations? You've just bashed Intel for that



Did you mean tick?   If you meant tock, you're helping to make the same point made in my prior post... as Intel requires a new motherboard with each tock.   I understand your overall point, however... and to that, more specifically, Intel technically doesn't support 2 generations per socket... as much as it supports 1 (major) micro-arch revision and its equivalent die-shrink (and tweaks).  Although this is likely to have changed a bit more recently... with the pause of Intel's tick-tock cadence (the pathway to future node miniaturization appears to be increasingly difficult).

Also, there was no bashing of Intel. Rather, only mention of a differentiator. It's interesting how highlighting a potential difference in one brand can be perceived as bashing another brand... that was certainly not the intent. (I'm typing this on an Intel Haswell-based system, by the way.)

You're probably right... AMD may shift its strategy and not support AM4 for 6+ years, as it did with AM3/AM3+... we'll just have to wait and see. There is, however, at least some historical AMD precedent to discuss on this front... while this has not been the case with Intel (which makes sense, too... purely from a business/profitability standpoint, Intel has likely had no incentive to do so). AMD, on the other hand, may not have that luxury... and is probably all too happy to snip at any bit of market share that it can, in any way that it can... and one tactic (of many) might be to provide a broader, more flexible upgrade framework... as it has in the past.


----------



## notb (Apr 22, 2017)

Rash-Un-Al said:


> Did you mean tick?   If you meant tock, you're helping to make the same point made in my prior post...


Well, it's not that easy to compare. Maybe I shouldn't have used the word at all.
I basically meant that Ryzen+ will be the second generation of AM4 CPUs, which is nothing special. We get 2 (at least) gens for each Intel socket as well.
And from what we've heard, Ryzen+ will not be a new arch, but merely a refresh: minor fixes, maybe an improved process (allowing higher clocks) - so possibly an even smaller change than what Intel offers in "ticks".

Also, AMD said that AM4 will be supported for 5 years, but AFAIK they didn't say it will be their main socket until the end.
And even if it is, that's 5 years... Some people react like "great, I'll be able to upgrade the CPU!", but my reaction is "this will be really old tech by then - I would not invest in it".

However, I guess I would be fine, as I usually use desktops for a long time (I still don't have USB 3.0!)
But when I see a hardware geek with all the latest stuff in his System Specs using this argument, I'm like "oh man... who are you trying to fool?"


----------



## RejZoR (Apr 22, 2017)

DDR5 may arrive soon, but it'll be unobtainable and expensive at first. It took a few years before DDR4 became affordable.


----------



## happy medium (Apr 25, 2017)




----------



## systemBuilder (Jun 20, 2017)

You have to give AMD some credit for honesty in naming their previous-generation CPUs "Bulldozer".  After all, who would name a *fast* cpu after a bulldozer?  That's right, nobody at all would pick that name if the CPU were fast ...


----------



## lexluthermiester (Jun 20, 2017)

happy medium said:


>


Do you actually think you're winning the debate with all of your whining, stat posts, and the above video? Let me help you with that: *you're NOT!*
All you're doing is showing that some anon came out of nowhere with a very clear agenda to bad-mouth Ryzen with weak arguments, modified stat graphs, and an iffy video that only proves you're either a fanboy or a marketing chump from Intel.

And before you call ME an AMD fanboy, I run Intel CPUs in all of my PCs. However, I'm an objective person who looks at the big picture. And right now AMD is giving Intel something to be very worried about. They are providing quality CPUs that perform *VERY* well, at price points that are making the whole industry take notice. Your fanboyism is a wasted effort here; it makes you look like an overgrown child and makes Intel look bad by association.


systemBuilder said:


> You have to give AMD some credit for honesty in naming their previous-generation CPUs "Bulldozer".  After all, who would name a *fast* cpu after a bulldozer?  That's right, nobody at all would pick that name if the CPU were fast ...


You think that was clever? It wasn't.


----------



## happy medium (Jun 21, 2017)

lexluthermiester said:


> Do you actually think you're winning the debate with all of your whining, stat posts, and the above video? Let me help you with that: *you're NOT!*
> All you're doing is showing that some anon came out of nowhere with a very clear agenda to bad-mouth Ryzen with weak arguments, modified stat graphs, and an iffy video that only proves you're either a fanboy or a marketing chump from Intel.
> 
> And before you call ME an AMD fanboy, I run Intel CPUs in all of my PCs. However, I'm an objective person who looks at the big picture. And right now AMD is giving Intel something to be very worried about. They are providing quality CPUs that perform *VERY* well, at price points that are making the whole industry take notice. Your fanboyism is a wasted effort here; it makes you look like an overgrown child and makes Intel look bad by association.
> ...



That's right, argue with personal attacks, but DON'T argue the facts.
You are the poster-boy AMD fanatic.
Yeah, I have been a member here for 6 years (and many years on other sites) waiting for Ryzen's release... you got me. lol


----------



## lexluthermiester (Jun 21, 2017)

happy medium said:


> That's right, argue with personal attacks, but DON'T argue the facts.
> You are the poster-boy AMD fanatic.
> Yeah, I have been a member here for 6 years (and many years on other sites) waiting for Ryzen's release... you got me. lol


Thank you for proving my point so overwhelmingly. Outstanding!


----------



## Athlonite (Jun 21, 2017)

Now now, girls, if you want to argue with one another, take it to PMs and stop clogging up the thread with trash talk.


----------



## lexluthermiester (Jun 21, 2017)

Athlonite said:


> Now now, girls, if you want to argue with one another, take it to PMs and stop clogging up the thread with trash talk.


...yes mommy...


----------

