Monday, February 12th 2018

AMD Ryzen "Raven Ridge" Comes with a Limited PCIe Interface

AMD today launched its first desktop Ryzen "Raven Ridge" APUs that combine quad-core "Zen" CPUs with "Vega" based integrated graphics solutions. One of the key specifications that caught our eye is the PCI-Express interface. Apparently, these chips feature just 8 PCI-Express gen 3.0 lanes for discrete graphics, in addition to 4 lanes dedicated to the chipset bus and 4 more lanes driving a 32 Gbps M.2 NVMe slot. What this means for end users is that any discrete graphics card plugged into the PCI-Express 3.0 x16 slot will run at half the bandwidth - PCI-Express 3.0 x8.
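
For context, here's a quick back-of-the-envelope sketch (our own Python illustration, not anything from AMD's documentation) of what these lane counts work out to in raw bandwidth, and where the "32 Gbps" M.2 figure comes from:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth math (illustrative only).
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b line encoding.

GT_PER_S = 8.0          # gigatransfers per second, per lane
ENCODING = 128 / 130    # 128b/130b encoding efficiency

def pcie3_gbps(lanes: int) -> float:
    """Usable bandwidth of a PCIe 3.0 link in Gbps (one direction)."""
    return lanes * GT_PER_S * ENCODING

for lanes in (4, 8, 16):
    print(f"x{lanes}: {pcie3_gbps(lanes):.1f} Gbps (~{pcie3_gbps(lanes) / 8:.1f} GB/s)")
# x4  -> ~31.5 Gbps, the "32 Gbps" M.2 link in marketing terms
# x8  -> ~63.0 Gbps, what "Raven Ridge" gives a discrete GPU
# x16 -> ~126.0 Gbps, what "Summit Ridge" offers the PEG slot
```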

Our various PCI-Express scaling articles, which we regularly redo with the latest high-end GPUs, should tell you that the performance loss between gen 3.0 x16 and gen 3.0 x8 is negligible. This, however, becomes a problem for the small minority of users who combine these processors with AMD X370 chipset motherboards. The second x16 slot (which draws its PCIe lanes by splitting them off from the first x16 slot) won't work, and without at least x8 bandwidth per card, NVIDIA SLI cannot function, even on X370 motherboards that have SLI certification. One can't even argue that some internal PCIe lane allocation to the iGPU permanently locks 8 lanes away from the PEG slot: AMD confirms that the "Zen" CCX and the "Vega 11" iGPU talk to each other over Infinity Fabric, not PCIe.
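
If you want to verify what link width your card has actually negotiated, here is a minimal sketch (our illustration, assuming a Linux box with sysfs; `lspci -vv` reports the same information under LnkSta):

```python
# Minimal sketch: read the kernel's sysfs PCIe link attributes on Linux.
from pathlib import Path

def link_status(bdf: str) -> str:
    """Return the negotiated PCIe speed/width for a device address like '0000:01:00.0'."""
    dev = Path("/sys/bus/pci/devices") / bdf
    speed = (dev / "current_link_speed").read_text().strip()
    width = (dev / "current_link_width").read_text().strip()
    return f"{bdf}: {speed}, x{width}"

# The address below is hypothetical - find your GPU's with `lspci | grep VGA`.
print(link_status("0000:01:00.0"))
```
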
AMD responded ahead of our story with this statement:

The target market for Raven Ridge, PC builders or DIYers who value the presence of SoC graphics, will select B350 or A320 motherboards, which do not feature the ability to split PCIe lanes. X370 buyers are typically buying the high-end Ryzen, such as the Ryzen 5 6-core and Ryzen 7 8-core, to go with that class and price of motherboard. For the majority of the market, upgrading from the class-leading processor graphics inside the Ryzen 5 2400G or Ryzen 3 2200G to a single discrete GPU will be more than enough, given the performance on offer today from discrete graphics cards such as the Radeon RX VEGA64 and Radeon RX 580. And today, those buyers can select from our 1st Gen Ryzen desktop processors, including the Ryzen 5 1500X and Ryzen 3 1300X, which remain in the product stack for those buyers who value the extra PCIe lanes.

Additionally, we typically expect that buyers who want to run mGPU are doing so from day one. These consumers are finding the lure of the Threadripper platform very compelling - massive performance, class-leading PCIe lanes, and massive memory bandwidth are all the perfect complementary features to go with multi-GPU compute and gaming.

Raven Ridge was created for the ultra-thin-and-light performance notebook segment, but also to scale into the mainstream performance desktop. It does this through the scalability of "Zen" and "Vega" IP, connected by AMD's Infinity Fabric, offering outstanding graphics performance and features for the price point in the desktop market. The scalable nature of the Socket AM4 platform means that there is a path for users to begin with an entry-level motherboard and mainstream processor, and then upgrade processor and platform features the same way that people upgrade discrete graphics cards.

29 Comments on AMD Ryzen "Raven Ridge" Comes with a Limited PCIe Interface

#1
Durvelle27
Yeah, honestly, for the market they are targeting I don't think it's much of a big deal, and if it was, they wouldn't be buying an APU
#2
Valantar
Hasn't this been well known since the announcement of the AM4 platform? IIRC, AMD have consistently had separate spec charts for PCIe on AM4 depending on whether you're running an APU or a CPU, with no indication that this would change for RR compared to Bristol Ridge. Of course it's good to have this cut and dried, but this shouldn't be surprising to anyone paying attention to AM4 specs.
#3
Imperium77
This is a very good thing. This way miners won't buy all of these up to get the small boost from the on-die graphics, and it will give people a way to upgrade to Ryzen with graphics already included and wait until prices drop to fill the one slot they can use on the mobo.
#4
dj-electric
Mining resistant. Ingenious.
That said, for many, 8X1 is enough, so...
#5
trickson
OH, I have such a headache
YEAH keep producing them sub-par CPUs, Intel loves it. I will NEVER buy AMD again, a complete waste of time and money! They need to FIRE their entire R&D department and get some real talent in there!
Sub-par performance at a premium price, what a JOKE!
Just get a real Intel system and avoid the headache and wasted time and money! AMD is GARBAGE! I know, I have them and they BOTH SUCK! My Q9650 of 9 years of age can not only keep up but is in many tasks faster! OMG I GOT SUCKERED!
#6
Valantar
dj-electricMining resistant. Ingenious.
That said, for many, 8X1 is enough, so...
The question is whether the CPU will allow splitting this off into single lanes. There's no guarantee that lane bifurcation is supported on consumer platforms, even if it's technically possible.
#7
Unregistered
People such as myself who buy this stuff aren't expecting to run SLI... The only CrossFire with past models was with cards that used a 64-bit memory interface...
I don't think they are even doing CrossFire with any card and this APU series... Unless they have a low-end Vega I don't know about.
#8
trickson
OH, I have such a headache
ValantarThe question is whether the CPU will allow splitting this off into single lanes. There's no guarantee that lane bifurcation is supported on consumer platforms, even if it's technically possible.
Not when there is so much sharing within the core and dependence on RAM. Like always, AMD misses the mark once again! If it comes out, best save up for 32GB of RAM at the fastest speed possible too; then you will need to push that CPU BEYOND ALL engineered limits just to reach the level of performance a stock Intel Core i5 will give!
#9
Disparia
I agree with the target market, though I'm not going to get a 300 series board this close to the 400-series release. I'll buy a 2400G with an A420 or B450 board in a couple months.
#10
Valantar
tricksonYEAH keep producing them sub-par CPUs, Intel loves it. I will NEVER buy AMD again, a complete waste of time and money! They need to FIRE their entire R&D department and get some real talent in there!
Sub-par performance at a premium price, what a JOKE!
Just get a real Intel system and avoid the headache and wasted time and money! AMD is GARBAGE! I know, I have them and they BOTH SUCK! My Q9650 of 9 years of age can not only keep up but is in many tasks faster! OMG I GOT SUCKERED!
Wow, maybe you should chill out a little bit? From the looks of your post and your sig, you went from an ultra-high-end CPU (probably OC'd too, given how long you kept it) to a low-end, budget part. While the Ryzen should still beat out the C2Q in 99% of tasks, it does sound like more of a sidegrade. I went from a Q9450 (@3.5GHz) to an R5 1600X last June, and haven't regretted it for a second. It absolutely trounces my old CPU and gave me a serious GPU performance lift, all while running far cooler and quieter. IF being dependent on RAM speed is a bit of a downer, but the price difference between ~3000 and 2400 MT/s RAM is pretty negligible (and not only because of the crazy prices these days - it was the same back when I bought my 16GB kit a few months before prices jumped, too).

I get that you had a bad experience, but it sounds like you made some poor calls when buying hardware and should probably have read more reviews before pulling the trigger. Ryzen has made AMD competitive with Intel (still behind, but competitive) for IPC. They're still lagging in pure clock speeds, and their platform has some idiosyncrasies that might hit you badly given your part selection - but that's the nature of not being the de facto industry standard. Ryzen still works amazingly for pretty much everyone who bought it.
tricksonNot when there is so much sharing within the core and dependence on RAM. Like always, AMD misses the mark once again! If it comes out, best save up for 32GB of RAM at the fastest speed possible too; then you will need to push that CPU BEYOND ALL engineered limits just to reach the level of performance a stock Intel Core i5 will give!
What you're saying here has absolutely no relation to what you quoted.
JizzlerI agree with the target market, though I'm not going to get a 300 series board this close to the 400-series release. I'll buy a 2400G with an A420 or B450 board in a couple months.
Kinda-sorta agree, although I doubt there'll be any major differences between them (other than, it seems, X470 being a die shrink or similar redesign for lower power). If I had the money and RAM prices weren't stupid, I'd be getting a 2400G + a fast 16GB RAM kit for an HTPC upgrade. I'll hold off for now, though, as it seems the RAM situation might improve somewhat throughout the year.
#11
Disparia
ValantarKinda-sorta agree, although I doubt there'll be any major differences between them (other than, it seems, X470 being a die shrink or similar redesign for lower power). If I had the money and RAM prices weren't stupid, I'd be getting a 2400G + a fast 16GB RAM kit for an HTPC upgrade. I'll hold off for now, though, as it seems the RAM situation might improve somewhat throughout the year.
Well, I used to buy SIMMs, so I'm pretty numb to RAM price fluctuations. Let's talk GPUs instead (or not!).

It's more that I've waited since Ryzen's release to get at these APUs, so there's a bit of stubbornness in there as well. "Hey AMD, I didn't need a new desktop or workstation, I needed a fresh APU." So I've been biding my time since then... and I'll wait a little longer for good measure. Also, I'll probably get a better out-of-box experience, which may be the 400-series' best feature.
#13
IceShroom
So this means Raven Ridge has a PCIe Gen 3 configuration of x8+x4+x4, compared to x16+x4+x4 for Summit Ridge.
#14
windwhirl
It makes sense. PCIe 3.0 x8 is almost the same as 2.0 x16, and unless you were pushing it to the limit (I haven't read of anything like that for a single discrete GPU, perhaps with Crysis 3 at max settings), you probably wouldn't feel any difference in almost all games. Also, anyone buying AMD's APU processors probably has no plans to use any kind of add-on card besides a single graphics card. And for that market, the I/O provided by the motherboard and the chipset should be enough in most cases.
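
For reference, here's the arithmetic behind that comparison as a small sketch (our own illustration, using each generation's standard line-code overhead):

```python
# Sketch: usable one-directional bandwidth per PCIe link.
# PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding;
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.

def usable_gb_s(lanes: int, gt_per_s: float, efficiency: float) -> float:
    """Usable bandwidth in GB/s after encoding overhead."""
    return lanes * gt_per_s * efficiency / 8  # 8 bits per byte

gen2_x16 = usable_gb_s(16, 5.0, 8 / 10)      # ~8.0 GB/s
gen3_x8 = usable_gb_s(8, 8.0, 128 / 130)     # ~7.9 GB/s
print(f"PCIe 2.0 x16: {gen2_x16:.2f} GB/s | PCIe 3.0 x8: {gen3_x8:.2f} GB/s")
```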
#15
bug
ValantarHasn't this been well known since the announcement of the AM4 platform? IIRC, AMD have consistently had separate spec charts for PCIe on AM4 depending on whether you're running an APU or a CPU, with no indication that this would change for RR compared to Bristol Ridge. Of course it's good to have this cut and dried, but this shouldn't be surprising to anyone paying attention to AM4 specs.
Yeah, you can always count on the odd idiot out there to buy a cheap CPU only to be surprised it's not as fully fledged as its more expensive siblings. And when I said "odd", I meant enough of them to start a lawsuit at some point.
So yeah, it's OK that people are highlighting this.
#16
Patriot
bugYeah, you can always count on the odd idiot out there to buy a cheap CPU only to be surprised it's not as fully fledged as its more expensive siblings. And when I said "odd", I meant enough of them to start a lawsuit at some point.
So yeah, it's OK that people are highlighting this.
No need to look far, just scroll up a bit.
#17
Unregistered
I really hope they come out with a 960 SP version... Really make it a true 1080p gaming platform.
#18
cadaveca
My name is Dave
IceShroomSo this means Raven Ridge has a PCIe Gen 3 configuration of x8+x4+x4, compared to x16+x4+x4 for Summit Ridge.
No. It has an x16/x4/x4 link, but 8 lanes of that x16 port are switched into Infinity Fabric mode. This is also how the EPYC/Threadripper CPUs communicate with each other. So really, whenever we have these devices connected to each other, they are technically using PCIe to connect, but in a higher-speed interface mode than what PCIe 3.0 offers.
#19
R-T-B
tricksonYEAH keep producing them sub-par CPUs, Intel loves it. I will NEVER buy AMD again, a complete waste of time and money! They need to FIRE their entire R&D department and get some real talent in there!
Sub-par performance at a premium price, what a JOKE!
Just get a real Intel system and avoid the headache and wasted time and money! AMD is GARBAGE! I know, I have them and they BOTH SUCK! My Q9650 of 9 years of age can not only keep up but is in many tasks faster! OMG I GOT SUCKERED!
You are comparing a 2008 $339 processor (~$380 inflation considered) to an almost-$125 Ryzen in 2017. They SHOULD perform about equal. You're comparing 2008's finest to 2017's garbage tier.

You then went backwards technologically, to Piledriver, which has been known to be shit since forever (part of why Ryzen was made), and you think AMD is the problem?

Confusing.

This is the AMD CPU you should have started with at minimum, to see how technology has advanced:

www.newegg.com/Product/Product.aspx?Item=N82E16819113428&cm_re=Ryzen-_-19-113-428-_-Product
#20
Casecutter
The only one building this up as an issue is Btarunr. As we've seen, AMD detailed some time back how APUs with Infinity Fabric would use PCIe to connect. If you're buying an APU intending to add a single discrete mid-range or higher GPU down the road, this was never the right product to purchase! Would anyone think of pairing one of these with a 1080 Ti as a smart combo? If you've been following along, APU plus multi-GPU never held much merit, and at least now, right up front, you have an even better reason not to do something stupid.
#21
bug
CasecutterThe only one building this up as an issue is Btarunr. As we've seen, AMD detailed some time back how APUs with Infinity Fabric would use PCIe to connect. If you're buying an APU intending to add a single discrete mid-range or higher GPU down the road, this was never the right product to purchase! Would anyone think of pairing one of these with a 1080 Ti as a smart combo? If you've been following along, APU plus multi-GPU never held much merit, and at least now, right up front, you have an even better reason not to do something stupid.
Those that buy into the "CPU is no longer a bottleneck, especially at high resolutions" might.
#22
windwhirl
R-T-BYou are comparing a 2008 $339 processor (~$380 inflation considered) to an almost-$125 Ryzen in 2017. They SHOULD perform about equal. You're comparing 2008's finest to 2017's garbage tier.

You then went backwards technologically, to Piledriver, which has been known to be shit since forever (part of why Ryzen was made), and you think AMD is the problem?

Confusing.

This is the AMD CPU you should have started with at minimum, to see how technology has advanced:

www.newegg.com/Product/Product.aspx?Item=N82E16819113428&cm_re=Ryzen-_-19-113-428-_-Product
Ryzen processors are slightly slower than their Intel Haswell counterparts in terms of single-threaded performance (according to data from Passmark benchmarks) when running at the same frequency, although your mileage may vary somewhat. Against a Q9650 or any FX-series CPU, however, Ryzen pretty much stomps all over them. There has to be something else going on for it to perform that badly.
#23
R-T-B
windwhirlRyzen processors are slightly slower than their Intel Haswell counterparts in terms of single-threaded performance (according to data from Passmark benchmarks) when running at the same frequency, although your mileage may vary somewhat. Against a Q9650 or any FX-series CPU, however, Ryzen pretty much stomps all over them. There has to be something else going on for it to perform that badly.
I'd start by blaming his single-channel RAM and poor bench choice (CPU-Z).
#24
lexluthermiester
Durvelle27Yeah, honestly, for the market they are targeting I don't think it's much of a big deal, and if it was, they wouldn't be buying an APU
While I agree with you, this does limit the upgrade paths available to purchasers of this line of APUs. Upgrading to a single GPU? No problem. But SLI and CrossFire? Not gonna happen. Still, the audience this product will appeal to will very likely be happy with a single GPU. And anyone who does want to go multi-GPU will likely know they need to upgrade to a better CPU anyway. So this limitation won't have much impact in practice.
R-T-BYou're comparing 2008's finest to 2017's economy tier. You then went backwards technologically, to Piledriver, which has been known to be under-performing since forever
That was begging for corrections. However, those are very good points.

@trickson AMD's Ryzen is anything but sub-par. In fact, Ryzen is hitting Intel hard, which can only ever be good for consumers as it drives market competition. And for the record;
AMD Ryzen 3 1300X 3.72GHz. GA-AB350M-Gaming3 MB. 1 stick of Viper DDR4 8GB RAM. 3TB WD HDD. GeForce GTX 1050 video. (LOOKING TO GET A RYZEN DON'T BE A FOOL LIKE ME!)
THAT is why your Ryzen 3 is under-performing. Get another stick of RAM for that bad-boy and watch it shine!
#25
Shirley Marquez
lexluthermiesterAnd for the record;

THAT is why your Ryzen 3 is under-performing. Get another stick of RAM for that bad-boy and watch it shine!
That goes at least double if you buy an APU like the 2400G. The chip uses main memory for graphics, so memory bandwidth is key. ALWAYS use dual channel, and spend the little bit of extra money for DDR4-3200 RAM with an APU. (Note that this means you'll need a B350 or B450 motherboard rather than an A320, because that's technically a memory overclock.) And experiment with the amount of RAM set aside for graphics. Some benchmarks have shown it doesn't matter, but I have tried some programs where it DOES matter, a lot.
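
As a rough illustration of why that advice matters, here's a sketch of our own (peak theoretical numbers only; real-world figures will be lower):

```python
# Peak theoretical DDR4 bandwidth: channels * 8 bytes per transfer * MT/s.
def ddr4_peak_gb_s(channels: int, mt_per_s: int) -> float:
    """Theoretical bandwidth in GB/s; each DDR4 channel is 64 bits (8 bytes) wide."""
    return channels * 8 * mt_per_s / 1000

print(ddr4_peak_gb_s(1, 2400))  # 19.2 GB/s - one stick of DDR4-2400
print(ddr4_peak_gb_s(2, 3200))  # 51.2 GB/s - dual-channel DDR4-3200
# The iGPU shares this pool with the CPU cores, so the difference
# shows up directly in integrated-graphics frame rates.
```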