Wednesday, September 5th 2018

AMD Athlon Pro 200GE Detailed: An Extremely Cut-down "Raven Ridge" at $55

AMD is putting the finishing touches on its Athlon Pro 200GE socket AM4 SoC, which it could position against Intel's $50-ish Celeron LGA1151 SKUs. Leaked slides posted by PCEva reveal that it's a heavily cut-down 14 nm "Raven Ridge" die. For starters, unlike previous-generation Athlon-branded products on platforms such as FM2, the Athlon 200GE won't lack integrated graphics. Only 3 of the die's 11 Vega NGCUs are enabled, translating to 192 stream processors, which should be enough for desktop, 2D, and video acceleration, but not serious gaming, even at low resolutions.
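For readers who want to sanity-check the shader math, here is a minimal Python sketch; the 64 stream processors per NGCU figure is standard for GCN-based Vega:

```python
# Vega (GCN) packs 64 stream processors into each NGCU, so 3 enabled
# CUs out of the die's 11 give the 192 stream processors in the slides.
STREAM_PROCESSORS_PER_CU = 64

def stream_processors(enabled_cus: int) -> int:
    return enabled_cus * STREAM_PROCESSORS_PER_CU

print(stream_processors(3))   # 192 (Athlon 200GE)
print(stream_processors(11))  # 704 (full "Raven Ridge" die, e.g. Ryzen 5 2400G)
```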

The CPU configuration is 2-core/4-thread, with 512 KB of L2 cache per core and 4 MB of shared L3 cache. The CPU is clocked at 3.20 GHz, with no Precision Boost features. You still get GuardMI commercial-grade hardware security features. There is a big catch with one of the uncore components: the PCIe root complex only supports PCI-Express 3.0 x4 out of your motherboard's topmost x16 slot, not even x8. Ryzen "Raven Ridge" APUs already offer only crippled x8 connectivity through this slot. AMD claims that the Athlon 200GE will be "up to 19 percent faster" than the Intel Pentium G4560 at productivity work. When it launches on 6th September, with market availability from 18th September, the Athlon Pro 200GE will be priced at USD $55.
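To put the x4 limitation in perspective, here is a back-of-the-envelope Python sketch of usable PCI-Express 3.0 bandwidth per direction (8 GT/s per lane with 128b/130b encoding):

```python
# PCIe 3.0 runs at 8 GT/s per lane; 128b/130b encoding leaves roughly
# 0.985 GB/s of usable bandwidth per lane in each direction.
GT_PER_SECOND = 8
ENCODING_EFFICIENCY = 128 / 130

def pcie3_bandwidth_gb_s(lanes: int) -> float:
    """Approximate usable PCIe 3.0 bandwidth in GB/s, per direction."""
    return lanes * GT_PER_SECOND * ENCODING_EFFICIENCY / 8

for lanes in (4, 8, 16):
    print(f"x{lanes}: ~{pcie3_bandwidth_gb_s(lanes):.2f} GB/s")
# x4: ~3.94 GB/s, x8: ~7.88 GB/s, x16: ~15.75 GB/s
```

A discrete graphics card in the top slot would thus get a quarter of the usual bandwidth, though the entry-level cards likely to be paired with this chip seldom saturate even that.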
Sources: PCEva, HD-Technologica, VideoCardz

56 Comments on AMD Athlon Pro 200GE Detailed: An Extremely Cut-down "Raven Ridge" at $55

#26
GoldenX
It's more than enough for office work and media playback, at only 35 W, and with a good, upgradeable platform.
And I'm sure it can game better than a UHD IGP. Some light gaming on a 1366x768 display seems possible.
Posted on Reply
#27
carex
It will match the G4560's single-thread performance for sure, as Zen at 3.2 GHz is faster than Intel at 3.5 GHz.
Posted on Reply
#28
Prima.Vera
Can this play compressed 4K 10-bit HDR media? That's all I'm asking.
Posted on Reply
#29
GoldenX
Via CPU? Try it limiting your own cores.
Posted on Reply
#30
Rauelius
GoldenXSingle core with SMT, and a single Vega core.
What would the "FX" be? Single-Core no SMT single Vega Core?
Posted on Reply
#31
newtekie1
Semi-Retired Folder
Prima.VeraCan this play compressed 4K 10-bit HDR media? That's all I'm asking.
I'm pretty sure the GPU has a built in HEVC decoder, so it should be able to.
Posted on Reply
#32
Valantar
RaueliusWhat would the "FX" be? Single-Core no SMT single Vega Core?
FX would be a massively overclocked Ryzen (but with half the cache and IF speed to limit IPC), no SMT, no GPU, TDPs in the 200 W range, and a bundled AIO water cooler. For those wanting a space heater ;)

I want them to bring back the Duron brand. My first CPU was a Duron! It was "great" (read: I could afford it, and I was like 12 at the time!). It also overclocked by the exact amount of 0%. Not that I had even the slightest clue what I was doing, but I couldn't get it above stock clocks whatsoever. IIRC my brother's Athlon was a much better overclocker. Oh, how the world has moved on.
Posted on Reply
#33
GoldenX
newtekie1I'm pretty sure the GPU has a built in HEVC decoder, so it should be able to.
Some prefer to use the CPU for decoding, it's less bug prone and allows for better personalization.
Posted on Reply
#34
ikeke
With cTDP down (set in the BIOS) you can make any Ryzen very power efficient.

R7 2700X @ 45 W TDP
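Rough math on why a lower TDP ceiling costs relatively little performance, using the classic dynamic-power relation P ∝ f·V²; a back-of-the-envelope Python sketch, where the ratios are illustrative guesses rather than measured 2700X values:

```python
# Dynamic CPU power scales roughly with frequency times voltage squared.
# The 15% figures below are illustrative, not measured 2700X numbers.
def relative_power(freq_ratio: float, volt_ratio: float) -> float:
    return freq_ratio * volt_ratio ** 2

# Dropping clocks ~15% often permits ~15% lower voltage:
print(f"{relative_power(0.85, 0.85):.2f}")  # ~0.61 -> ~61% of stock power
```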
Posted on Reply
#35
Valantar
GoldenXSome prefer to use the CPU for decoding, it's less bug prone and allows for better personalization.
Are there any CPUs at all that can decode 10-bit HDR HEVC @4K60 without dedicated hardware or 'hybrid' solutions?
Posted on Reply
#36
GoldenX
I don't know. Try it, if you have media like that.
Posted on Reply
#37
silentbogo
ValantarAre there any CPUs at all that can decode 10-bit HDR HEVC @4K60 without dedicated hardware or 'hybrid' solutions?
Kaby Lake [+refresh], Coffee Lake? All of those embedded Pentium Gold/Silver SoCs that came out in the past 1.5 years? Ryzen APUs? Snapdragon 820/835/845? The list goes on...
Pretty sure this new Athlon and upcoming low-power entry level mobile chips will also support 4K HDR.
Posted on Reply
#38
newtekie1
Semi-Retired Folder
silentbogoKaby Lake [+refresh], Coffee Lake? All of those embedded Pentium Gold/Silver SoCs that came out in the past 1.5 years? Ryzen APUs? Snapdragon 820/835/845? The list goes on...
Pretty sure this new Athlon and upcoming low-power entry level mobile chips will also support 4K HDR.
Most of those support it through a hardware decoder. He's asking for software only, using pure CPU power, and for that I'd be inclined to think anything below a modern powerful quad-core would struggle.
Posted on Reply
#39
silentbogo
newtekie1Most of those support it through a hardware decoder. He's asking for software only, using pure CPU power, and for that I'd be inclined to think anything below a modern powerful quad-core would struggle.
Oh... that... Well, that's stupid. I'm not sure if even the most powerful consumer CPU can do it in real time this way. My i3-6100 can barely pull 4K@23 FPS 8-bit with half the frames dropped, and my older X5650 would do around 1-2 FPS in software mode. Those were just curiosity experiments.
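For anyone wanting to repeat that curiosity experiment, here is a minimal Python sketch that times ffmpeg's pure software decode path (assumes ffmpeg is on the PATH; "sample.mkv" stands in for a local 4K 10-bit HEVC clip):

```python
import subprocess
import time

# No -hwaccel flag, so decoding runs entirely on the CPU; the null
# muxer discards the output, leaving only decode speed to measure.
start = time.time()
subprocess.run(
    ["ffmpeg", "-threads", "0", "-i", "sample.mkv", "-f", "null", "-"],
    check=True,
)
print(f"decoded in {time.time() - start:.1f}s")
# ffmpeg also reports a "speed=" multiplier on stderr; anything below
# 1.0x means real-time playback would be dropping frames.
```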
Posted on Reply
#40
Valantar
silentbogoOh... that... Well, that's stupid. I'm not sure if even the most powerful consumer CPU can do it in real time this way. My i3-6100 can barely pull 4K@23 FPS 8-bit with half the frames dropped, and my older X5650 would do around 1-2 FPS in software mode. Those were just curiosity experiments.
I completely agree that that is a rather silly requirement, but people here seemed to want it:
Prima.VeraCan this play compressed 4K 10-bit HDR media? That's all I'm asking.
GoldenXSome prefer to use the CPU for decoding, it's less bug prone and allows for better personalization.
Hence me asking if it was at all possible.
My initial guess aligns with yours @silentbogo , that at the very least you'd need an OC'd 8700K or similar, though probably even that couldn't do it. And by then you're likely pushing more than 100 W just to run a single-threaded decoder, which is, as you say, stupid. But to each their own, I suppose.
Posted on Reply
#41
silentbogo
ValantarHence me asking if it was at all possible.
My initial guess aligns with yours @silentbogo , that at the very least you'd need an OC'd 8700K or similar, though probably even that couldn't do it. And by then you're likely pushing more than 100 W just to run a single-threaded decoder, which is, as you say, stupid. But to each their own, I suppose.
What the others meant was not software decoding, but the built-in decoder ASIC, like AMD's UVD or Intel's Quick Sync. It's dedicated hardware solely responsible for video decoding.
Just like GPUs (or graphics accelerators, as they were called at the time) were created to avoid the difficulties of software rendering, all hardware now includes some sort of specialized video encoding/decoding accelerator. It's not only faster, it's also more efficient.
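By way of illustration, the same null-muxer test can be routed through the fixed-function block instead; a Linux/VAAPI sketch, where the render-node path is the usual default and "sample.mkv" is again a stand-in clip:

```python
import subprocess

# Route decoding through the iGPU's fixed-function block via VAAPI.
# /dev/dri/renderD128 is typically the first render node on Linux;
# adjust the path if your system enumerates devices differently.
subprocess.run(
    [
        "ffmpeg",
        "-hwaccel", "vaapi",
        "-hwaccel_device", "/dev/dri/renderD128",
        "-i", "sample.mkv",
        "-f", "null", "-",
    ],
    check=True,
)
# On UVD/VCN or Quick Sync hardware this typically runs well above
# real-time speed at a few watts, versus tens of watts in software.
```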
Posted on Reply
#42
Valantar
silentbogoWhat the others meant was not software decoding, but the built-in decoder ASIC, like AMD's UVD or Intel's Quick Sync. It's dedicated hardware solely responsible for video decoding.
Just like GPUs (or graphics accelerators, as they were called at the time) were created to avoid the difficulties of software rendering, all hardware now includes some sort of specialized video encoding/decoding accelerator. It's not only faster, it's also more efficient.
You're preaching to the choir here man, this is not news to me. Shouldn't be to anybody.
Posted on Reply
#43
GoldenX
Of course it works with the GPU decoder. That's what I use too. But some people like to do post-processing, and doing it with GPU acceleration is buggy. Kinda weird to ask that of a $55 dual-core Ryzen, but I've seen people do renders on Atoms.

By the way, all decoders are very good at using all threads.

We need cheaper ITX boards for this baby. I want one for my file-sharing PC; the 5150 is a bit old now.
Posted on Reply
#44
Prima.Vera
ValantarI completely agree that that is a rather silly requirement, but people here seemed to want it:
Why silly requirements?? I want to build a mini multimedia box to play 4K Netflix/Amazon/offline videos on my 4K HDR TV, what's so silly about that?? I don't want to buy dedicated crap from Amazon or Roku, since those can't properly play 4K HDR 10-bit MKV or H.265 video files. I also have a USB Blu-ray drive capable of playing 4K HDR discs.
Posted on Reply
#45
Valantar
Prima.VeraWhy silly requirements?? I want to build a mini multimedia box to play 4K Netflix/Amazon/offline videos on my 4K HDR TV, what's so silly about that?? I don't want to buy dedicated crap from Amazon or Roku, since those can't properly play 4K HDR 10-bit MKV or H.265 video files. I also have a USB Blu-ray drive capable of playing 4K HDR discs.
We were talking about software/CPU decoding, i.e. not using fixed-function hardware in the chip. Please read more carefully.
Posted on Reply
#46
silentbogo
ValantarWe were talking about software/CPU decoding, i.e. not using fixed-function hardware in the chip. Please read more carefully.
I don't understand, was it you who was interested in software decoding, or not?
Cause no one else brought up or mentioned software decoding, yet you say:
ValantarYou're preaching to the choir here man, this is not news to me. Shouldn't be to anybody.
Kinda weird and pointless, don't you think?
Posted on Reply
#47
Valantar
silentbogoI don't understand, was it you who was interested in software decoding, or not?
Cause no one else brought up or mentioned software decoding, yet you say:
I brought it up? Huh? Let's see:
Prima.VeraCan this play compressed 4K 10-bit HDR media? That's all I'm asking.
GoldenXVia CPU? Try it limiting your own cores.
newtekie1I'm pretty sure the GPU has a built in HEVC decoder, so it should be able to.
GoldenXSome prefer to use the CPU for decoding, it's less bug prone and allows for better personalization.
ValantarAre there any CPUs at all that can decode 10-bit HDR HEVC @4K60 without dedicated hardware or 'hybrid' solutions?
If that's "no one but [me]" bringing it up, then ... yeah. I didn't bring it up, I asked if [thing that other people seemed to be discussing] was at all possible.
Posted on Reply
#48
silentbogo
ValantarI asked if [thing that other people seemed to be discussing] was at all possible.
All of it refers to using UVD or Quick Sync, cause it's more stable. Not software decoding on the CPU.
Once again, to be even more specific: HW decoding on the iGPU, not a dGPU. No one mentioned software decoding.
Posted on Reply
#49
Valantar
silentbogoAll of it refers to using UVD or Quick Sync, cause it's more stable. Not software decoding on the CPU.
Once again, to be even more specific: HW decoding on the iGPU, not a dGPU. No one mentioned software decoding.
Again: preaching to the choir. Fixed-function video decoding hardware integrated into iGPUs is well known, at least to me. I don't see how you got so stuck on that point. I just asked for clarification on something tangentially related to it. All in all, I guess this is just a circle of misunderstandings. No idea why you're dead set on blaming this misunderstanding on me (or why someone needs to be singled out for it at all), but I'll leave that to you. It's pretty clear from the posts I quoted above that that wasn't the case. Now can we stop beating this dead horse?
Posted on Reply
#50
GoldenX
I think this mess originated from a guess of mine.
Prima.Vera asked if this CPU can decode H.265 10-bit HDR sources, and as it's obvious Vega has H.265 decoding, I guessed he was referring to CPU decoding. It kinda went from there.
So, sorry, mea culpa.

Where are the A300 and X300 chipset-less motherboards? We need them for these Athlons.
Posted on Reply