
AMD Ryzen 3 1200 3.1 GHz

W1zzard

AMD's Ryzen 3 1200 is aggressively priced at $110, which makes it the most affordable Ryzen available. With clock frequencies ranging from a 3.1 GHz base to a 3.4 GHz boost, it is clocked not much differently than the Ryzen 5 1400, which also has four physical cores but adds SMT technology on top.

 
More negative points than positive ones, yet gets an award?
 
More negative points than positive ones, yet gets an award?

A few of those negatives are kinda fluffed up. The pros could easily have been fluffed up as well, but it seems unnecessary. Surprised it doesn't have 2-3 that basically equate to "Good with multithreaded workloads."
 
There was a video by Actually Hardcore Overclocking saying that with XMP a timing could go very wrong, hurting performance significantly. I don't understand much about it and I don't remember the exact timing, but is that the tRC in CPU-Z? It looks like the one that was problematic in his video.
 
Very good review, and I absolutely loved the 720p benchmarks part. That's what I look for in every CPU review and rarely can find! (HardOCP does it too.) I'm a high-refresh gamer (240 Hz), and we can clearly see AMD just can't keep up. In some cases the Intel i5 has almost double the framerate!! That is crazy, really.
 
Thanks @W1zzard for the great review. The 720p tests seem to confirm the limitations of these budget chips.
 
Finally, after so many years, we have proper $100 quads again.
Now we only need the sub-$100 offerings, something to compete against the perfect G4560.

I still don't understand the need for 720p benchmarks; those would be useful for comparing APUs and IGPs. Nobody is going to buy an R3 1200 for gaming and then play at 1280x720. If the idea is to see the difference in CPU performance, we have CPU tests for that.
 
Because at 720p the only bottleneck is the CPU, so you can clearly see what to expect in the future for high-refresh gaming. People keep saying "oh, the difference at 4K is only 3%". It is only 3% NOW; in two years, when you have a beefy, more-than-capable 4K GPU, you will notice the differences more and more, and 720p shows you the exact difference in gaming performance between the CPUs.

Btw, I play a lot of e-sports games at 1366x768 to get 200 fps. Now I know I couldn't do it with a Ryzen chip. So yeah, they don't interest me at all.
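
To picture the argument, here is a toy model (my own sketch, not from the review; all fps numbers are made up for illustration): the frame rate you actually see is capped by whichever of the CPU or GPU is slower, so a gap the GPU hides today reappears once the GPU gets faster.

```python
# Toy bottleneck model: a frame needs both CPU and GPU work, so the
# slower side sets the frame rate. All numbers are illustrative.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The observed fps is limited by the slower of the two stages."""
    return min(cpu_fps, gpu_fps)

cpu_a = 100  # hypothetical CPU-limited fps (what a 720p test exposes)
cpu_b = 140  # a faster CPU, also measured at 720p

for gpu_fps, label in [(60, "today's GPU at 4K"), (150, "future GPU at 4K")]:
    fps_a = effective_fps(cpu_a, gpu_fps)
    fps_b = effective_fps(cpu_b, gpu_fps)
    gap = (fps_b - fps_a) / fps_a * 100
    print(f"{label}: {fps_a:.0f} vs {fps_b:.0f} fps -> {gap:.0f}% difference")

# today's GPU at 4K: 60 vs 60 fps -> 0% difference
# future GPU at 4K: 100 vs 140 fps -> 40% difference
```

Same two CPUs: zero difference behind a slow GPU, a 40% difference behind a fast one. That is exactly the gap a 720p test makes visible today.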
 
Ugh, let's not drag up the 720p testing BS again.. there was a MONUMENTAL thread on it here already.
LOL, as if you didn't have enough nightmares?
 
Good for the price
 
An overclock to 3.8 GHz sure helps scale performance up assuming you can get a majority of the 1200s to overclock that high.
 
Ugh, let's not drag up the 720p testing BS again.. there was a MONUMENTAL thread on it here already.

It's the most important CPU gaming test to me. We're all different, and we all want different things from our hardware :)

The problems start when someone can't respect how other people use their PCs. Like, I don't really care about eye-candy ultra doopa Witcher 3 4K settings, so I don't jump at everyone who wants that, and I don't say that 4K tests are useless. It's all about respect :)
 
It still bugs me how a 4-core CPU (R3 1200) and an 8-core CPU (Ryzen 1700) of the same gen both have a 65 W TDP.
 
@W1zzard How come there were no OC'd CPU gaming results?
 
It still bugs me how a 4-core CPU (R3 1200) and an 8-core CPU (Ryzen 1700) of the same gen both have a 65 W TDP.
Poor multi-core utilization and the impact of binning.
Remember, this TDP is for "typical load", not "maximum".
You run a game on a Ryzen 7 1700 and it uses 8 cores, but only at 50-60%. You run it on a 4-core Ryzen 3 1200 and it's hitting 100%. So the CPU does the same work, it just handles it differently (they end up with similar fps). Hence, "typical load" yields a similar TDP. And it's the same with many other applications, which often don't utilize 8 cores; e.g., many tasks in Photoshop can hardly go past 3 threads. Again: "typical load", low power usage.
It's all just marketing rubbish. What AMD is actually saying: we're giving you 8 cores, but because you'll usually utilize half of them, you'll save a lot of electricity.

Run a CPU-heavy, multithreaded benchmark - like wPrime - and the actual power draw becomes obvious.
In TPU's test the Ryzen 3 1200 ate 71 W, which is what you'd expect from a 65 W TDP CPU (plus a few watts for the motherboard and RAM).
The Ryzen 7 1700 needed 108 W, so it's way past its TDP.
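
Putting rough numbers on that (a back-of-the-envelope sketch; the 71 W and 108 W figures are from the review, while the assumed platform overhead is mine):

```python
# Back-of-the-envelope: how far each chip lands from its 65 W rating
# under a sustained all-core load (wPrime). Measured draws are the
# figures quoted above; the platform overhead is an assumption.

TDP = 65       # W, AMD's rating for both chips
PLATFORM = 6   # W, assumed motherboard/RAM share of the measurement

for name, measured in [("Ryzen 3 1200", 71), ("Ryzen 7 1700", 108)]:
    cpu_draw = measured - PLATFORM
    delta = cpu_draw - TDP
    print(f"{name}: ~{cpu_draw} W CPU draw, {delta:+d} W vs. the 65 W TDP")

# Ryzen 3 1200: ~65 W CPU draw, +0 W vs. the 65 W TDP
# Ryzen 7 1700: ~102 W CPU draw, +37 W vs. the 65 W TDP
```

Whatever exact overhead you assume, the quad stays near its rating while the octa-core blows well past it once all cores are actually loaded.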
 
It's the most important CPU gaming test to me. We're all different, and we all want different things from our hardware :)

The problems start when someone can't respect how other people use their PCs. Like, I don't really care about eye-candy ultra doopa Witcher 3 4K settings, so I don't jump at everyone who wants that, and I don't say that 4K tests are useless. It's all about respect :)
It has nothing to do with respect, honestly. The concern is what people take from the results and the opinions they form from them.

That said, since (you JUST told us) you play games at 720p, testing there is great for you!!! Had I known you do that, I wouldn't have posted anything (maybe fill out your system specs - you aren't new here!!). The results just don't mean much to anyone playing games at a higher res (where we/many disagree). :)

I digress... I said I wouldn't, and I did.. lol! If you want to rehash this poor beaten horse, find the thread and bump it.. :)
 
Looking at the 4K overall scores, it's ironic that the 7700K is only 4% faster. I mean, if you wanna game in 4K, the 1200 is your go-to CPU... or the G4560! :D
 
That's the thing I've said countless times, and I don't want you to take this as an offense or as me being against AMD. Today, in July 2017, the 7700K is only 4% faster at 4K because the GPU power isn't there yet. In two or three years, July 2019/2020, with who knows what, a GTX 2080 Ti or some beefy Vega, what will it be at 4K? A 20% difference? It's all because right now the GPU is the obvious bottleneck.

As for 1366x768, I play at this resolution because most of the games I play are competitive ones, so I prefer a high framerate/high refresh rate for better aiming, lower input latency, and better motion clarity. It's just an option. I have no problem playing a single-player game like Batman with all the eye candy maxed out at high resolutions. But if we are talking about a CPU that is more than capable of 4K 60 fps, then you can just go for a 50-buck Pentium G4560 on a 50-buck H110 motherboard with 50 bucks of 8 GB DDR4 RAM. Why spend more than double the money on a quad? I know why, and I agree with you -> because it assures better performance in the future. And you got my point :D
 
Thanks for the review, but I find this part of your conclusion just a little odd > You should also pay attention to the sub-60 frame rates noticed in games such as "Fallout 4" and "Watch_Dogs 2" at 720p. If it's sub-60 fps at 720p, it won't be over 60 fps at higher resolutions, no matter the graphics card you use.

I noticed that the FPS in Fallout 4 was exactly the same at all resolutions, 51 FPS. This seems very odd to me and would suggest a software issue more so than a lack of CPU performance (Fallout 4 is horrifically unoptimised). Also, considering you said "it won't be over 60 fps at higher resolutions, no matter the graphics card you use", that isn't the case in Watch_Dogs 2, as the FPS is 79/52/32 at 1080p/1440p/2160p.

Can you please clarify this for me?
 
I think the R5 1600 is a nice point to upgrade to.

I am glad that my i5-2500K @ 4.8 GHz @ 1.31 V VCore still rocks a better CB R15 single/multi score than this.
 
I noticed that the FPS in Fallout 4 was exactly the same at all resolutions, 51 FPS. This seems very odd to me and would suggest a software issue more so than a lack of CPU performance (Fallout 4 is horrifically unoptimised). Also, considering you said "it won't be over 60 fps at higher resolutions, no matter the graphics card you use", that isn't the case in Watch_Dogs 2, as the FPS is 79/52/32 at 1080p/1440p/2160p.

Can you please clarify this for me?
This is simply what a CPU bottleneck looks like.
What you have to understand is that CPU tasks are resolution-invariant (at least not directly affected by resolution). The actual frame render, the point where resolution comes into play, is at the end of the pipeline - afterwards the frame is pushed directly to the video output (it doesn't come back to the CPU).

So when the same CPU+GPU setup produces similar fps across resolutions, it means that's how quickly the CPU can provide its part.
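
A minimal frame-time sketch of that idea (the per-frame timings are hypothetical, chosen to mimic a ~51 fps CPU ceiling; none of them come from the review): the CPU cost per frame stays fixed while the GPU cost grows with resolution, so fps stays flat until the GPU side finally becomes the slower stage.

```python
# Per-frame pipeline model: CPU time (game logic, draw calls) is the
# same at every resolution; only the GPU's shading time scales with it.
# Timings are hypothetical, picked to mimic a ~51 fps CPU ceiling.

CPU_MS = 19.6  # ms of CPU work per frame -> caps fps at ~51

gpu_ms = {"720p": 4.0, "1080p": 9.0, "1440p": 16.0, "2160p": 36.0}

for res, g in gpu_ms.items():
    # Throughput is limited by the slower stage of the pipeline.
    fps = 1000 / max(CPU_MS, g)
    print(f"{res}: {fps:.0f} fps")

# 720p: 51 fps, 1080p: 51 fps, 1440p: 51 fps, 2160p: 28 fps
```

The flat 51 fps across resolutions is the CPU ceiling (the Fallout 4 case), while the drop at 2160p is the GPU taking over as the bottleneck (the Watch_Dogs 2 case).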
 
This is simply what a CPU bottleneck looks like.
What you have to understand is that CPU tasks are resolution-invariant (at least not directly affected by resolution). The actual frame render, the point where resolution comes into play, is at the end of the pipeline - afterwards the frame is pushed directly to the video output (it doesn't come back to the CPU).

So when the same CPU+GPU setup produces similar fps across resolutions, it means that's how quickly the CPU can provide its part.

No, it is not. If you read the post, you will see this isn't the case: the Watch_Dogs 2 FPS clearly changes, which is completely different from what he states.

Something is either wrong or Vsync was turned on. The difference between the two CPUs is only a few hundred MHz, and you would still see a change in FPS.
 
Ugh, let's not drag up the 720p testing BS again.. there was a MONUMENTAL thread on it here already.
You just did ... dragged it up ... monumentally
 