# AMD Ryzen 3 1200 3.1 GHz



## W1zzard (Jul 27, 2017)

AMD's Ryzen 3 1200 is aggressively priced at $110, which makes it the most affordable Ryzen available. With clock frequencies ranging from a 3.1 GHz base to a 3.4 GHz boost, it is clocked not much differently than the Ryzen 5 1400, which also has four physical cores but adds SMT on top.



----------



## TheLostSwede (Jul 27, 2017)

More negative points than positive ones, yet gets an award?


----------



## xenocide (Jul 27, 2017)

TheLostSwede said:


> More negative points than positive ones, yet gets an award?



A few of those negatives are kinda fluffed up. Easily could have fluffed up the pros as well, but it seems unnecessary. Surprised it doesn't have 2-3 that basically equate to: "Good with multithreaded workloads."


----------



## Thalles Adorno (Jul 27, 2017)

There was a video by Actually Hardcore Overclocking saying that with XMP a timing could go very wrong, hurting performance significantly. I don't understand much about it and I don't remember the exact timing, but is that tRC in CPU-Z right? It looks like the one that was problematic in his video.


----------



## Manu_PT (Jul 27, 2017)

Very good review, and I absolutely loved the 720p benchmarks part. That's what I look for in every CPU review and rarely can find! (HardOCP does it too.) I'm a high-refresh gamer (240 Hz) and we can clearly see AMD just can't keep up. In some cases the Intel i5 has almost double the framerate!! That is crazy really.


----------



## DeathtoGnomes (Jul 27, 2017)

Thanks @W1zzard for the great review. The 720p tests seem to affirm the limitations of these budget chips.


----------



## GoldenX (Jul 27, 2017)

Finally, after so many years, we have proper $100 quads again.
We only need the sub-$100 offerings, something to compete against the perfect G4560.

I still don't understand the need for 720p benchmarks; that would be useful for comparing APUs and IGPs. Nobody is going to buy an R3 1200 for gaming and play at 1280x720. If the idea is to see the difference in CPU performance, we have CPU tests for that.


----------



## Manu_PT (Jul 27, 2017)

Because at 720p the only bottleneck is the CPU, so you can clearly know what to expect in the future for high-refresh gaming. People keep saying "oh, the difference at 4K is only 3%". It is only 3% NOW; in two years, when you have a beefy, more-than-capable 4K GPU, you will notice the differences more and more, and 720p shows you the exact difference in performance between the CPUs in gaming tasks.

Btw, I play a lot of e-sports games at 1366x768 to get 200 fps. Now I know I couldn't do it with a Ryzen chip. So yeah, they don't interest me at all.


----------



## EarthDog (Jul 27, 2017)

Ugh, let's not drag up the 720p testing BS again.. there was a MONUMENTAL thread on it here already.


----------



## DeathtoGnomes (Jul 27, 2017)

EarthDog said:


> Ugh, let's not drag up the 720p testing BS again.. there was a MONUMENTAL thread on it here already.


LOL, as if you didn't have enough nightmares?


----------



## Durvelle27 (Jul 27, 2017)

Good for the price


----------



## biffzinker (Jul 27, 2017)

An overclock to 3.8 GHz sure helps scale performance up assuming you can get a majority of the 1200s to overclock that high.


----------



## Manu_PT (Jul 27, 2017)

EarthDog said:


> Ugh, let's not drag up the 720p testing BS again.. there was a MONUMENTAL thread on it here already.



It's the most important CPU gaming test to me. We're all different, and we all want different things from our hardware.

The problems start when someone can't respect how other people use their PCs. Like, I don't really care about eye-candy ultra doopa Witcher 3 4K settings, so I don't jump at everyone who wants it, and I don't say that 4K tests are useless. It's all about respect.


----------



## meirb111 (Jul 27, 2017)

It still bugs me how a 4-core CPU (R3 1200) and an 8-core CPU (Ryzen 1700) of the same gen both have a 65 W TDP.


----------



## Nuckles56 (Jul 27, 2017)

@W1zzard How come there were no OC'd CPU gaming results?


----------



## notb (Jul 27, 2017)

meirb111 said:


> It still bugs me how a 4-core CPU (R3 1200) and an 8-core CPU (Ryzen 1700) of the same gen both have a 65 W TDP.


Poor multi-core utilization and impact of binning.
Remember this TDP is for "typical load", not "maximum".
You run a game on a Ryzen 7 1700 and it uses 8 cores, but only at 50-60%. You run it on a 4-core Ryzen 3 1200 and it's reaching 100%. So the CPUs do the same work, just handle it differently (they end up with similar fps). Hence, a similar "typical load" and a similar TDP. And it's the same with many other applications, which often don't utilize 8 cores; e.g., many tasks in Photoshop can hardly go past 3 threads. Again: "typical load", low power usage.
It's all just marketing rubbish. What AMD actually says: we're giving you 8 cores, but because you'll usually utilize half of them, you'll save a lot of electricity.

You run a CPU-heavy, multithread benchmark - like wPrime - and the actual power draw becomes obvious.
In TPU's test, the Ryzen 3 1200 ate 71 W, which is what you'd expect from a 65 W TDP CPU (plus a few watts for mobo and RAM).
The Ryzen 7 1700 needed 108 W, so it's way past its TDP.
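The wPrime numbers above can be turned into a quick back-of-the-envelope check. A minimal Python sketch, using the wattage figures quoted from TPU's test; the flat ~10 W platform allowance for motherboard and RAM is my assumption, not a figure from the review:

```python
# Rough check of measured system power draw against a CPU's rated TDP.
# PLATFORM_OVERHEAD_W is an assumed allowance for mobo, RAM, and VRM losses.

PLATFORM_OVERHEAD_W = 10  # assumption, not from the review

def exceeds_tdp(measured_system_w, tdp_w, overhead_w=PLATFORM_OVERHEAD_W):
    """Estimated watts the CPU draws beyond its rated TDP.

    Negative means the CPU stayed within TDP once overhead is subtracted.
    """
    return measured_system_w - overhead_w - tdp_w

# Ryzen 3 1200 under wPrime: 71 W system draw vs. 65 W TDP
print(exceeds_tdp(71, 65))   # prints -4: within TDP

# Ryzen 7 1700 under wPrime: 108 W system draw vs. 65 W TDP
print(exceeds_tdp(108, 65))  # prints 33: well past its rated TDP
```

The point of the toy check is only that the same subtraction puts one chip under its rating and the other far over it, which is why "65 W TDP" on both parts describes a typical load, not a worst case.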


----------



## EarthDog (Jul 27, 2017)

Manu_PT said:


> It's the most important CPU gaming test to me. We're all different, and we all want different things from our hardware.
> 
> The problems start when someone can't respect how other people use their PCs. Like, I don't really care about eye-candy ultra doopa Witcher 3 4K settings, so I don't jump at everyone who wants it, and I don't say that 4K tests are useless. It's all about respect.


It has nothing to do with respect, honestly. It's about what people take from the results and the opinions they form from them.

That said, since (you JUST told us) you play games at 720p, testing there is great for you!!! Had I known you do that, I wouldn't have posted anything (maybe fill out your system specs - you aren't new here!!). They just don't mean much to anyone playing games at a higher res (where we/many disagree).

I digress... said I wouldn't and I did.. lol! If you want to rehash this poor beaten horse, find the thread and bump it..


----------



## deu (Jul 28, 2017)

Looking at the 4K overall scores, it's ironic that the 7700K is only 4% faster. I mean: if you wanna game in 4K, well, the 1200 is your go-to CPU... or the G4560!


----------



## Manu_PT (Jul 28, 2017)

That's the thing I've said countless times, and I don't want you to take this as an offense or as me being against AMD. Today, July 2017, 4K gaming on the 7700K is only 4% faster because the GPU power isn't there yet. In 2-3 years, July 2019/2020, with who knows, a GTX 2080 Ti or some beefy Vega, what will it be at 4K? A 20% difference? It's all because right now the GPU is the obvious bottleneck.

As for 1366x768, I play at this resolution because most of the games I play are competitive ones, so I prefer a high framerate/high refresh rate for better aiming and lower input latency/better motion clarity. It's just an option. I have no problem playing a single-player game like Batman with all the eye candy maxed out at high resolutions. But if we are talking about a more-than-capable 4K 60 fps CPU, then you can just go for a 50-buck Pentium G4560 on a 50-buck H110 motherboard with 50 bucks of 8 GB DDR4 RAM. Why spend more than double the money on a quad? I know why, and I agree with you -> because it assures better performance for the future. And you got my point.


----------



## EarthDog (Jul 28, 2017)

EarthDog said:


> If you want to rehash this poor beaten horse, find the thread and bump it..


----------



## Melvis (Jul 28, 2017)

Thanks for the review, but I find this part of your conclusion just a little odd:

> You should also pay attention to the sub-60 frame rates noticed in games such as "Fallout 4" and "Watch_Dogs 2" at 720p. If it's sub-60 fps in 720p, it won't be over 60 fps in higher resolutions, no matter the graphics card you use.

I noticed that the FPS at all resolutions in Fallout 4 was exactly the same at 51 FPS. This seems very odd to me and would suggest a software issue more so than a lack of CPU performance (Fallout 4 is horrifically unoptimised). Also, you said "it won't be over 60 fps in higher resolutions, no matter the graphics card you use", which isn't the case in Watch_Dogs 2, as the FPS is 79/52/32 FPS at 1080/1440/2160p.

Can you please clarify this for me?


----------



## ps000000 (Jul 28, 2017)

I think the R5 1600 is a nice point to upgrade to.

I am glad that my i5-2500K @ 4.8 GHz @ 1.31 VCore still rocks a better CB R15 single/multi score than this.


----------



## notb (Jul 28, 2017)

Melvis said:


> I noticed that the FPS at all resolutions in Fallout 4 was exactly the same at 51 FPS. This seems very odd to me and would suggest a software issue more so than a lack of CPU performance (Fallout 4 is horrifically unoptimised). Also, you said "it won't be over 60 fps in higher resolutions, no matter the graphics card you use", which isn't the case in Watch_Dogs 2, as the FPS is 79/52/32 FPS at 1080/1440/2160p.
> 
> Can you please clarify this for me?


This is simply what a CPU bottleneck looks like.
What you have to understand is that CPU tasks are resolution-invariant (at least not directly affected by resolution). The actual frame render, i.e. the moment where _resolution_ appears, is at the end of the pipeline; afterwards the frame is pushed directly to the video output (not coming back to the CPU).

So when the same CPU+GPU setup results in similar fps at every resolution, it means that this is how quickly the CPU can provide its part.
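A toy model makes this concrete: per-frame CPU cost is (roughly) fixed, per-frame GPU cost scales with pixel count, and the slower of the two sets the frame rate. Every millisecond figure below is invented purely for illustration (chosen so the CPU cap lands near the 51 fps discussed above); none of it is measured data:

```python
# Toy frame-time model: delivered fps = 1000 / max(CPU time, GPU time).
# CPU work per frame is resolution-invariant; GPU work scales with
# pixel count. All timing constants are hypothetical, for illustration.

def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)  # slower side sets the pace
    return 1000.0 / frame_ms

CPU_MS = 19.6           # hypothetical CPU cost per frame (caps fps near 51)
GPU_MS_PER_MPIX = 7.0   # hypothetical GPU cost per megapixel

for w, h in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {fps(CPU_MS, GPU_MS_PER_MPIX, w, h):.0f} fps")
```

With these made-up constants, 720p and 1080p both come out pinned at the same ~51 fps (CPU-bound, the Fallout 4 pattern), while 1440p and 4K drop off as the GPU term takes over (the Watch Dogs 2 pattern).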


----------



## Melvis (Jul 29, 2017)

notb said:


> This is simply what a CPU bottleneck looks like.
> What you have to understand is that CPU tasks are resolution-invariant (at least not directly affected by resolution). The actual frame render, i.e. the moment where _resolution_ appears, is at the end of the pipeline; afterwards the frame is pushed directly to the video output (not coming back to the CPU).
> 
> So when the same CPU+GPU setup results in similar fps at every resolution, it means that this is how quickly the CPU can provide its part.



No, it is not, and if you read the post you will see this isn't the case: the Watch Dogs 2 FPS clearly changes between resolutions, which is completely different from what he states.

Something is either wrong or Vsync was turned on. The difference between the two CPUs is only a few hundred MHz, and you will still see a change in FPS.


----------



## BiggieShady (Jul 29, 2017)

EarthDog said:


> Ugh, let's not drag up the 720p testing BS again.. there was a MONUNMENTAL thread on it here already.


You just did ... dragged it up ... monumentally


----------



## notb (Jul 29, 2017)

Melvis said:


> No, it is not, and if you read the post you will see this isn't the case: the Watch Dogs 2 FPS clearly changes between resolutions, which is completely different from what he states.


I was referring to a benchmark resulting in similar FPS at different resolutions (like Fallout 4).
Watch Dogs 2 is exactly the opposite situation: FPS differs between resolutions, but not between CPUs. That's a GPU bottleneck. Only when you lower the resolution will you see a difference between CPUs (look at 1920x1080). This is why TPU tests 720p as well. It's all very understandable and logical if you take a moment to contemplate...


----------



## hat (Jul 29, 2017)

Looks like it doesn't do so well in CPU-heavy games... but it would have been interesting to see a quick 1080p test run of one of the games it fell behind in (like Civ 6, Hitman, or Fallout 4) when overclocked.

Remember, back in the day, one major aspect of overclocking was buying cheaper, lower-end hardware and cranking it up to run at the level of the more expensive hardware. There is significant value in that.


----------



## Melvis (Jul 30, 2017)

notb said:


> I was referring to a benchmark resulting in similar FPS at different resolutions (like Fallout 4).
> Watch Dogs 2 is exactly the opposite situation: FPS differs between resolutions, but not between CPUs. That's a GPU bottleneck. Only when you lower the resolution will you see a difference between CPUs (look at 1920x1080). This is why TPU tests 720p as well. It's all very understandable and logical if you take a moment to contemplate...



Ok, I think I get what you're saying here, and if this is the case then the FPS would therefore be the same on any other CPU tested, but it isn't; it's the exact same 51 FPS only with the Ryzen 1200.

Also, why is it then that weaker CPUs (with the same GTX 1080) get better or different FPS than this Ryzen 1200? This just doesn't add up in my eyes.


----------



## Russell Harris (Aug 4, 2017)

I think people forget that your eyes can't see past 30 fps anyway, and high frame rates in a game without a monitor that supports them do you no good either. You literally can't outdo your physical limitations as a human, but hey, you're free to waste your money. The integrated graphics is really a moot point for this processor; we all know they are releasing a line of APUs, so that will be the real test for Zen, and since AMD generally has better low-end CPUs with graphics, I think they'll win this round, especially with AMD Dual Graphics.


----------



## Durvelle27 (Aug 4, 2017)

Russell Harris said:


> I think people forget that your eyes can't see past 30 fps anyway, and high frame rates in a game without a monitor that supports them do you no good either. You literally can't outdo your physical limitations as a human, but hey, you're free to waste your money. The integrated graphics is really a moot point for this processor; we all know they are releasing a line of APUs, so that will be the real test for Zen, and since AMD generally has better low-end CPUs with graphics, I think they'll win this round, especially with AMD Dual Graphics.


The human eye can see as much as your brain can handle


----------



## notb (Aug 4, 2017)

Melvis said:


> Also, why is it then that weaker CPUs (with the same GTX 1080) get better or different FPS than this Ryzen 1200? This just doesn't add up in my eyes.


"Why weaker CPUs get better FPS than Ryzen 1200?"
Do you see the issue in this question?


----------



## Melvis (Aug 5, 2017)

notb said:


> "Why weaker CPUs get better FPS than Ryzen 1200?"
> Do you see the issue in this question?



Nope!

So why is it then?


----------



## Russell Harris (Aug 5, 2017)

It's because of how the program is written. With one and up to four threads, Intel has an advantage in gaming and some other tasks, whereas Ryzen shines at moving large amounts of data quickly, which is why it performs better in higher-resolution applications. What would have helped the Ryzen 3 is having left the cores on a single CCX; the cross-CCX talk slows things down. They are still great CPUs.


----------



## Russell Harris (Aug 5, 2017)

Durvelle27 said:


> The human eye can see as much as your brain can handle



Sorry, you're wrong. There is a reason film is shot at 30 fps, and it's because the human eye only sees at a little over 29 fps.


----------



## GoldenX (Aug 6, 2017)

Russell Harris said:


> Sorry, you're wrong. There is a reason film is shot at 30 fps, and it's because the human eye only sees at a little over 29 fps.



It can see only 30 fps when the consecutive frames are smoothed; check frames in any movie, then check screenshots of your games. On still frames, like in games, the brain can process much more information. 24 and 30 frames per second were used to save on resources; film was expensive back then. Why do you think monitors use 60 Hz as standard? Try using your monitor at 30 or 40 Hz with a custom resolution and tell me what happens.


----------



## Deleted member 163934 (Aug 21, 2017)

Russell Harris said:


> I think people for get that your eyes can't see past 30fps anyway and high frame rates in a game without a monitor that supports it does you no good either.



Please point to a couple of scientific studies that clearly prove that the human eyes just can't send the information needed for the brain to process more than 30 fps.

As far as I know, human eyes + brain can handle hundreds of fps.

This 30 fps myth shows up constantly, but I have never seen a scientific study to back it up.


----------



## notb (Aug 21, 2017)

thedukesd1 said:


> Please point to a couple of scientific studies that clearly prove that the human eyes just can't send the information needed for the brain to process more than 30 fps.
> 
> As far as I know, human eyes + brain can handle hundreds of fps.
> 
> This 30 fps myth shows up constantly, but I have never seen a scientific study to back it up.


But can you show a paper about how human "eyes + brain can handle hundreds of fps"? Because it's not true, you know...

The typical studies suggesting that people see hundreds of fps are very simple: you're flashed with very short impulses and asked whether you've seen anything or not.
One only needs a tiny bit of physics knowledge (or a general understanding of digital photography) to get why we can see flashes that take much less than 0.01 s.

On the other hand, in a different experiment - observing a flickering diode with a controlled frequency - most people state that they don't see the flickering above 40-50 Hz. At that point the brain is already simplifying the image. And it's just a diode - not a very complicated thing to interpret.


----------



## EarthDog (Aug 21, 2017)

GoldenX said:


> It can see only 30 fps when the consecutive frames are smoothed; check frames in any movie, then check screenshots of your games. On still frames, like in games, the brain can process much more information. 24 and 30 frames per second were used to save on resources; film was expensive back then. Why do you think monitors use 60 Hz as standard? Try using your monitor at 30 or 40 Hz with a custom resolution and tell me what happens.


Exactly.. PC gaming doesn't have natural motion blur like real life, hence why 24 FPS isn't choppy to humans in film.


----------



## notb (Aug 21, 2017)

EarthDog said:


> Exactly.. PC gaming doesn't have natural motion blur like real life, hence why 24 FPS isn't choppy to humans in film.


I think we can all agree that the switch from 24-30 fps to 60 fps is visible and welcome.
But 144 fps or even above... I'm sure almost all people with 144 Hz LCDs play with the fps counter visible on the screen - the only way for them to feel that spending $2000 on a gaming PC was a good idea...


----------



## GoldenX (Aug 21, 2017)

notb said:


> I think we can all agree that the switch from 24-30 fps to 60 fps is visible and welcome.
> But 144 fps or even above... I'm sure almost all people with 144 Hz LCDs play with the fps counter visible on the screen - the only way for them to feel that spending $2000 on a gaming PC was a good idea...



I went from 60 to 94 Hz after overclocking the display, and the jump is noticeable even without an fps counter.


----------

