Sunday, May 24th 2020

Possible 3rd Gen AMD Ryzen "Matisse Refresh" XT SKU Clock Speeds Surface

Last week, we brought you reports of AMD inching closer to launching its 3rd-generation Ryzen "Matisse Refresh" processor lineup to ward off the 10th-gen Intel Core "Comet Lake" threat by giving the "Zen 2" chips possible clock-speed bumps to shore up performance. The lineup included the Ryzen 9 3900XT, the Ryzen 7 3800XT, and the Ryzen 5 3600XT. We now have a first look at their alleged clock speeds, courtesy of an anonymous tipster on the ChipHell forums, seconded by HXL @9550pro.

The XT SKUs indeed revolve around 200-300 MHz increments in base and boost clock speeds, as many of our readers predicted in the comments section of the "Matisse Refresh" article. The 3900XT comes with a 4.10 GHz base clock and a 4.80 GHz maximum boost clock, compared to the 3.80 GHz base and 4.60 GHz boost clocks of the 3900X. Likewise, the 3800XT notches up to a 4.20 GHz base clock (the highest in the lineup) and a 4.70 GHz maximum boost, compared to 3.90/4.50 GHz for the 3800X. The 3600XT offers the same 4.70 GHz maximum boost, a step up from the 4.40 GHz of the 3600X, but has its base clock set at 4.00 GHz, compared to 3.80 GHz on the 3600X. It appears that AMD's design focus is to reduce, if not erase, Intel's gaming performance lead. The 10th-generation Core "Comet Lake" leads AMD's offerings in gaming performance by mid-to-high single-digit percentages, and AMD could bring that down to low single-digit percentages with the XT family.
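As a rough sanity check, here is a minimal sketch (Python, written for this article rather than taken from any source) that turns the leaked figures quoted above into relative uplifts. The clock values are the rumored ones from the ChipHell post, not AMD-confirmed specifications:

```python
# Minimal sketch: convert the rumored X -> XT clock bumps into percentage uplifts.
# All figures are the leaked values quoted above, not AMD-confirmed specs.
LEAKED_CLOCKS = {
    # SKU pair: (X base, X boost, XT base, XT boost) in GHz
    "3900X -> 3900XT": (3.8, 4.6, 4.1, 4.8),
    "3800X -> 3800XT": (3.9, 4.5, 4.2, 4.7),
    "3600X -> 3600XT": (3.8, 4.4, 4.0, 4.7),
}

for pair, (base_x, boost_x, base_xt, boost_xt) in LEAKED_CLOCKS.items():
    base_gain = 100 * (base_xt - base_x) / base_x    # relative base-clock uplift
    boost_gain = 100 * (boost_xt - boost_x) / boost_x  # relative boost-clock uplift
    print(f"{pair}: base +{base_gain:.1f}%, boost +{boost_gain:.1f}%")
```

On those figures, the boost-clock gains work out to roughly 4-7%; actual gaming gains would likely be smaller, since games rarely scale linearly with boost clock.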
Sources: ChipHell forums, HXL aka 9550pro (Twitter)

113 Comments on Possible 3rd Gen AMD Ryzen "Matisse Refresh" XT SKU Clock Speeds Surface

#26
Bwaze
The problem I see is this: we don't really see the clock speed at which the processors actually do the work; the advertised boost clock speed is mostly theoretical.

So will the increase in max turbo clock speed from 4.5 to 4.7 GHz actually yield a ~4.4% increase in single-core and gaming performance? I doubt it - in gaming and single-core workloads, the various Ryzen 3000 processors are much closer together than their advertised boost clocks would suggest.
Posted on Reply
#27
yeeeeman
EzioAs: This should give it a significant enough boost to close the gap or equalize with Intel's offering (in games) if these leaks are true. Those really looking for Zen 2 CPUs should be glad they have more choices now.
I doubt it. AMD's problem in games is inter-CCX latency, which an extra 200 MHz won't do much about.
Nevertheless, the extra megahertz will sweeten the deal for some buyers and push them toward AMD instead of Intel, so yes, it is a good idea.
Also, Zen 3 will not be here till October/November, so they need something fresh.
Posted on Reply
#28
ARF
AMD has been binning the golden samples for EPYC. I wonder how high those can go.

AMD has known for months that Comet Lake was coming, and an extra 200 MHz is too little, too late.

And this "XT" naming is so inappropriate.
Posted on Reply
#29
Wilson
Fouquin: I ran my 1600X at 4.2 GHz for two years... I got the 1800X to 4.3 GHz three days before launch, and my 3600X is currently running at 4.4 GHz on the stock Wraith Spire in an ITX chassis.

So I'm not sure where your info is coming from.
I'm talking about absolute stability at safe voltages (1.42/1.38/1.32 V), not barely being able to get into Windows. Most 1st-gen chips do 3800-3900 MHz, 2nd gen does 3900-4000 MHz, and 3rd gen about 4150-4300 MHz.
Posted on Reply
#30
Fouquin
Wilson: I'm talking about absolute stability at safe voltages (1.42/1.38/1.32 V), not barely being able to get into Windows.
I guess you missed the "ran for two years" part. That was my 24/7 daily OC at 1.4 V. I'm coming up on a full year of 100% stability on my 3600X too.

But hey, thanks for blasting right out of the gate with more unsubstantiated numbers and insinuating that I'm somehow lying because it "barely gets into Windows". Love that attitude.
Posted on Reply
#31
zo0lykas
Not everyone is lucky!

My 1800X isn't stable over 4.1 GHz. I've had it since the beginning, when people were waiting on pre-orders, and I paid $499, lol.
By the way, it's on a water block, so temps are always low.
Fouquin: I ran my 1600X at 4.2 GHz for two years... I got the 1800X to 4.3 GHz three days before launch, and my 3600X is currently running at 4.4 GHz on the stock Wraith Spire in an ITX chassis.

So I'm not sure where your info is coming from.
Posted on Reply
#32
dyonoctis
AddSub: The way "boost" works on Ryzen vs. Kaby/Coffee/Comet Lake is much different. (Ryzen owner here, and owner of just about every CPU platform dating back to the Z80.) Ryzen boosting is almost cosmetic most of the time, is how I would describe it. Thermals, chip binning, the "voodoo" of it all has to be perfect, and even then you probably won't get what is advertised.

Ryzen "boosting" lasts less time (fractions of a millisecond, if that), and Ryzen CPUs will refuse to boost when REALLY needed (a modern Win7/8/10/Linux system running its usual arsenal of 100+ services and 100+ processes will cause your average Ryzen CPU to say "No, yeah, thanks, not gonna boost now!"). Even when all the thermals are right AND the binnin' lottery was won AND all the "voodoo" is right AND all the stars are lined up, your average Ryzen will maybe get close to that advertised MHz for a fraction of a second and then... bail. Versus your average Coffee Lake out of the box, where the CPU will hang for HOURS at boost speeds if the thermals are right.

Besides, one major factor to consider here... legacy x87 32-bit code is where a lot of your gaming still happens, and this is where AMD is still stuck in the mid-2000s, behind by about 50% to even 100% in some legacy gaming scenarios.

Yeah, this will NOT narrow the difference by a whole lot, much less have AMD overtake Intel in gaming (all in all). They need a solid extra GHz or even a bit more to secure that overwhelming gaming superiority. That is not happening with AM4, probably not even AM5.

...
Your test was made with first-gen Ryzen, which is far slower than Zen 2 in gaming. Reviews of the 3300X and of the mobile 4800H show that Zen 2's gaming potential is great once the CCX-to-CCX latency has been taken care of. Ryzen boost works well enough that these chips are not that far behind all those Intel chips running 5 GHz single-core / 4.7 GHz all-core.
Do you have a spy at AMD to know that AM5 (or even Zen 3) won't bring any improvement in gaming?
Posted on Reply
#33
k_9virus
When I first saw the "XT" I thought it was a new GPU. So what's next, "XTX"?
Posted on Reply
#34
las
Not sure why AMD is even going to bother with this release, unless it's because the 4000 series is delayed. Q4 is close.

Always glad to see higher clock speeds on Ryzen, though. I hope the 4000 series gets an IPC increase and maybe does 4.5 GHz all-core.
Posted on Reply
#35
MAXLD
las: Not sure why AMD is even going to bother with this release,
Because they can. Because it's very easy for them to do at this point in time. 7 nm is more mature now, so their yields have been stellar in both quantity and quality. Without major supply problems, you can ease up a bit on chip priority (for example, Threadripper/3950X), and you can more easily spare and bin some killer-quality chips to push marketing and competition in the desktop lineup against the new Intel 10th-gen launch.

Even with the new 10th gen, Intel only really excels in the usual limited set of Intel-friendly single-core apps and in Intel's marketing flag: 1080p gaming. So, true, AMD doesn't urgently need to put out new CPUs now, since Zen 3 is coming soon. But since they have a ton more of the best chips to spare right now, they can easily put these binned ones out to counter the small bump Intel got with 10th gen. It's that easy for AMD, and it keeps the lineup refreshed and attractive to buyers who need the hardware right now and can't or won't wait until Q4.

To sum up, even without the "XT" parts, until Zen 3 arrives in Q4:
- AMD Zen 2 still offers a better all-round performance proposition than Intel 10th gen (especially in multi-core)
- AMD Zen 2 still has better prices, so when you factor in the better all-round performance, it's unquestionable killer value
- AMD Zen 2 has plenty of 7 nm supply, so you won't have much trouble finding and buying a Ryzen CPU
- Intel is expected to have some supply hiccups for these 10th-gen parts

So, on top of all that, given the chance to easily launch these binned "XT" chips, there's not much arguing against AMD doing it. Especially if it helps close the gap a bit in 1080p gaming, where the big number differences pop up in reviews.
Posted on Reply
#36
Wilson
Fouquin: I guess you missed the "ran for two years" part. That was my 24/7 daily OC at 1.4 V. I'm coming up on a full year of 100% stability on my 3600X too.

But hey, thanks for blasting right out of the gate with more unsubstantiated numbers and insinuating that I'm somehow lying because it "barely gets into Windows". Love that attitude.
Well, your 1st-gen numbers lie in the top 1-5% of silicon quality; you can see that from the OC leaderboards at OCN, oc.ru, Luxx, etc. For 3rd gen, they're in the top 20-30%.
Posted on Reply
#37
las
MAXLD: Because they can. Because it's very easy for them to do at this point in time. 7 nm is more mature now, so their yields have been stellar in both quantity and quality. Without major supply problems, you can ease up a bit on chip priority (for example, Threadripper/3950X), and you can more easily spare and bin some killer-quality chips to push marketing and competition in the desktop lineup against the new Intel 10th-gen launch.

Even with the new 10th gen, Intel only really excels in the usual limited set of Intel-friendly single-core apps and in Intel's marketing flag: 1080p gaming. So, true, AMD doesn't urgently need to put out new CPUs now, since Zen 3 is coming soon. But since they have a ton more of the best chips to spare right now, they can easily put these binned ones out to counter the small bump Intel got with 10th gen. It's that easy for AMD, and it keeps the lineup refreshed and attractive to buyers who need the hardware right now and can't or won't wait until Q4.

To sum up, even without the "XT" parts, until Zen 3 arrives in Q4:
- AMD Zen 2 still offers a better all-round performance proposition than Intel 10th gen (especially in multi-core)
- AMD Zen 2 still has better prices, so when you factor in the better all-round performance, it's unquestionable killer value
- AMD Zen 2 has plenty of 7 nm supply, so you won't have much trouble finding and buying a Ryzen CPU
- Intel is expected to have some supply hiccups for these 10th-gen parts

So, on top of all that, given the chance to easily launch these binned "XT" chips, there's not much arguing against AMD doing it. Especially if it helps close the gap a bit in 1080p gaming, where the big number differences pop up in reviews.
I can tell you very firmly that it's easy to be CPU-limited at 1440p. High fps means you're CPU-bound, regardless of resolution.

Nah, Zen 2 does not have better all-round performance. AMD hardware is mostly hit or miss, and this continues to be the case. Go look at the performance in emulators, early launch titles, or niche programs and you'll see that AMD is often left in the dust.

It's not all about Cinebench, you know. Real-world performance is a whole different ballpark, especially when looking at the overall picture instead of cherry-picking. You won't find a single game or program that runs badly on a newer Intel/Nvidia system. You will find plenty that run badly on a full AMD setup, though. Hell, this site has tons of proof. Go through the game performance reviews and you'll see.

Bannerlord, for example, literally runs like poo on my friend's system compared to mine. His is a 3700X at 4.3 GHz with 16 GB of 3600/C16 memory and a 5700 XT. He has tons of fps dips and spikes, especially during very large battles.
Posted on Reply
#38
ARF
las: I can tell you very firmly that it's easy to be CPU-limited at 1440p. High fps means you're CPU-bound, regardless of resolution.

Nah, Zen 2 does not have better all-round performance. AMD hardware is mostly hit or miss, and this continues to be the case. Go look at the performance in emulators, early launch titles, or niche programs and you'll see that AMD is often left in the dust.

It's not all about Cinebench, you know. Real-world performance is a whole different ballpark, especially when looking at the overall picture instead of cherry-picking. You won't find a single game or program that runs badly on a newer Intel/Nvidia system. You will find plenty that run badly on a full AMD setup, though. Hell, this site has tons of proof. Go through the game performance reviews and you'll see.

Bannerlord, for example, literally runs like poo on my friend's system compared to mine. His is a 3700X at 4.3 GHz with 16 GB of 3600/C16 memory and a 5700 XT. He has tons of fps dips and spikes, especially during very large battles.
That is true, and it is a problem that originates in the way Intel and Nvidia operate. It has nothing to do with AMD itself, but with the lack of development support in quite a few software organisations.
Posted on Reply
#39
Bruno Vieira
Cheeseball: I was hoping for a 4.8 GHz or 4.9 GHz boost for the 3800XT to push up the minimum framerate. It would've given me a reason to relegate my 3800X to another PC.

3900XT looks really nice though.
The Ryzen 4000 6-8 core parts will easily put your 3800X to rest in gaming; for now, the better upgrade is probably memory.
Posted on Reply
#40
las
ARF: That is true, and it is a problem that originates in the way Intel and Nvidia operate. It has nothing to do with AMD itself, but with the lack of development support in quite a few software organisations.
Yep, that is true, and we can only hope this will change as AMD gains more market share. Nonetheless, this is how it is right now.
Software is always the bottleneck. Transitioning is slow.
Bruno Vieira: The Ryzen 4000 6-8 core parts will easily put your 3800X to rest in gaming; for now, the better upgrade is probably memory.
I'd personally not buy a 6-core CPU for gaming these days, considering the next-gen consoles get 8C/16T, compared to 8C/8T now (which is why 4C/8T still holds up in many games). Six cores simply won't age well, except as a temporary solution (i.e. buying an R5 3600 now and replacing it with a 4000-series 8+ core part later).

8C/16T is the perfect sweet spot for PC gamers who want their rig to last at least a few years, with clock speeds as high as possible. More cores are not needed; fewer can pose a problem, unless you mainly play older and less demanding games.

When new consoles come out, PC gaming is affected. Last time, when the current-gen consoles came out, 4C/4T started to have issues.
Posted on Reply
#41
Elysium
Well, it's about what we expected. RIP owners of the base versions of these chips? If you knew XT bins were coming, would you have waited? I can't see these being moved around too much in the price bracket: too high and you forfeit ground to the i5, too low and you get considerable PR blowback. But it could push base-version prices lower, which, combined with the B550 launch, is probably a good stock-clearing move.
Posted on Reply
#42
hurakura
A refresh only makes sense if they boost the clocks to be more competitive in games. Otherwise, why bother when the Ryzen 4xxx release is near? AMD is winning on every other front, only lacking a bit more oomph in games.
Posted on Reply
#43
Chrispy_
The i5-10600KF has just made the rest of Intel's 10th gen pointless; if you're gaming, that's the best chip you can buy.

If you're after actual multi-threaded performance, these new Ryzen XT variants will beat the i7 and i9 to a pulp even harder than the old X variants already do. More cores, more cache, more PCIe, more performance, more efficient, cheaper, more stable BIOSes. There is simply zero reason to buy Intel outside of gaming at the moment, and until the 10400 and cheaper motherboards come out, Intel still can't compete with a B450 and a 3600.
Posted on Reply
#44
Bruno Vieira
hurakura: A refresh only makes sense if they boost the clocks to be more competitive in games. Otherwise, why bother when the Ryzen 4xxx release is near? AMD is winning on every other front, only lacking a bit more oomph in games.
Maybe that's the thing: Ryzen 4000 is not near (I mean not September, but November-January), and they need a refresh now.
Posted on Reply
#45
MAXLD
las: I can tell you very firmly that it's easy to be CPU-limited at 1440p. High fps means you're CPU-bound, regardless of resolution.

Nah, Zen 2 does not have better all-round performance. AMD hardware is mostly hit or miss, and this continues to be the case. Go look at the performance in emulators, early launch titles, or niche programs and you'll see that AMD is often left in the dust.

It's not all about Cinebench, you know. Real-world performance is a whole different ballpark, especially when looking at the overall picture instead of cherry-picking.
If you already have high fps (100-120+) at 1440p in a certain specific game that isn't the 1440p norm, then what's the real gain of having 130 or 140? I surely wouldn't pay 40% more for that, not unless my income depended on it (e-sports).

When I said "all-round" I meant overall, considering all aspects: the overall package is better if you account for multi-core performance plus gaming. In this case, you get a CPU that does really well in multi-core situations and also performs more than well enough in gaming in general (more of a difference at 1080p, sure, but equally good in the usual GPU-bound 1440p games, to the point of there being no difference; and where there is a bigger difference, it's a specific case, and already at more than enough FPS to not really matter).

So, basically, you only get the biggest advantages when the Intel chip is used for:
- most 1080p games
- only some 1440p games that are not so GPU-bound
- specific single-core and/or GHz-friendly and/or older apps
las: Go look at the performance in emulators, early launch titles, or niche programs and you'll see that AMD is often left in the dust.
Well, you've gotta admit, that also sounds kinda like... "cherry-picking".

Again, I'm not saying the Intel CPUs are bad, but you only get the most significant performance delta in certain scenarios, 1080p being the most mainstream/noticeable. And to get that, you have to pay more (as in, about 40% more, and that's only taking the CPU into account).
Posted on Reply
#46
las
MAXLD: If you already have high fps (100-120+) at 1440p in a certain specific game that isn't the 1440p norm, then what's the real gain of having 130 or 140? I surely wouldn't pay 40% more for that, not unless my income depended on it (e-sports).

When I said "all-round" I meant overall, considering all aspects: the overall package is better if you account for multi-core performance plus gaming. In this case, you get a CPU that does really well in multi-core situations and also performs more than well enough in gaming in general (more of a difference at 1080p, sure, but equally good in the usual GPU-bound 1440p games, to the point of there being no difference; and where there is a bigger difference, it's a specific case, and already at more than enough FPS to not really matter).

So, basically, you only get the biggest advantages when the Intel chip is used for:
- most 1080p games
- only some 1440p games that are not so GPU-bound
- specific single-core and/or GHz-friendly and/or older apps

Well, you've gotta admit, that also sounds kinda like... "cherry-picking".

Again, I'm not saying the Intel CPUs are bad, but you only get the most significant performance delta in certain scenarios, 1080p being the most mainstream/noticeable. And to get that, you have to pay more (as in, about 40% more, and that's only taking the CPU into account).
I can easily see the difference between 100 and 150 fps. A lot of Ryzen CPUs have fps dips below 100 fps in games, and I don't accept that on a 1440p/165 Hz monitor. I want my games to be buttery smooth; when fps drops below 100, I see it instantly, without an fps counter enabled.

Also, most emulators run noticeably worse on AMD, pretty much a night-and-day difference. Most of these are 100% optimized for Intel + Nvidia.

Many software suites run much better on Intel too, Adobe for example, as does HandBrake x265/HEVC encoding. There are literally tons of examples where AMD is behind, maybe because of software, maybe not. It depends on the feature sets used.

You don't really have to pay more, since lower-end Intel chips beat higher-end AMD chips in gaming. The 3950X is way more expensive than the i5-10600K, yet loses in gaming. Different use cases.

The average consumer does not need tons of cores and threads. They need fewer cores and threads with high clock speeds, because that's what programs and games prefer, and it's not going to change anytime soon.
Posted on Reply
#47
Totally
Cheeseball: I was hoping for a 4.8 GHz or 4.9 GHz boost for the 3800XT to push up the minimum framerate. It would've given me a reason to relegate my 3800X to another PC.

3900XT looks really nice though.
Yeah, I was already looking to replace the 3800X with a 3900X when prices go down once the 4000 series comes out, but if these specs are true, the 3900XT is a direct upgrade. I'll set my sights on that instead.
Posted on Reply
#48
MAXLD
las: I can easily see the difference between 100 and 150 fps. A lot of Ryzen CPUs have fps dips below 100 fps in games, and I don't accept that on a 1440p/165 Hz monitor. I want my games to be buttery smooth; when fps drops below 100, I see it instantly, without an fps counter enabled.

Also, most emulators run noticeably worse on AMD, pretty much a night-and-day difference. Most of these are 100% optimized for Intel + Nvidia.

Many software suites run much better on Intel too, Adobe for example, as does HandBrake x265/HEVC encoding. There are literally tons of examples where AMD is behind, maybe because of software, maybe not. It depends on the feature sets used.

You don't really have to pay more, since lower-end Intel chips beat higher-end AMD chips in gaming. The 3950X is way more expensive than the i5-10600K, yet loses in gaming. Different use cases.

The average consumer does not need tons of cores and threads. They need fewer cores and threads with high clock speeds, because that's what programs and games prefer, and it's not going to change anytime soon.
In those specific 1440p cases where you do see a gap, you don't have a 50 fps difference; you have 10-15. That's not noticeable if you're already well beyond 100 fps.

I would say emulators fall squarely into that niche category.

Even considering Adobe, the 10700K, for example, is around the same as or worse than the 3700X in Premiere, Photoshop, and After Effects. Those gaps have been closing lately.

I wouldn't be so sure. Clock speed is still key in some specific apps and games, but IPC is the most important factor nowadays, and core counts are also becoming more relevant. And, like you said, new consoles are also coming out with 8 cores, so we'll see how console-port influence changes the mainstream scenario. The clock-speed difference is still relevant, but not as much as IPC optimizations; Zen 2's improvements proved that.
Posted on Reply
#49
Totally
lynx29: If that 3800XT can hit around $359.99 or under, that would be one hell of a CPU with 4.7 GHz single-core boosts. I actually think I would just dive in with that and an MSI X570 Tomahawk instead of waiting for Zen 3, especially if a big COVID-19 wave hits in the fall. I have a feeling fall will be a paper launch for Zen 3, because Ampere, RDNA 2, and Zen 3 will all be fighting for TSMC time slots alongside the new iPhones and two next-gen consoles... all at the same time...
I'm predicting that it's going to fall victim to the same problem the current 3800X did, and somewhat still does: the 3700X matching or nearly matching its performance, making the chip seem a bit redundant and not worth its asking price. I'm certain that's why there isn't a 3700XT.
Posted on Reply
#50
Turmania
If these refreshes are true: A) it's a great, much-needed increase in speed, and B) the launch of Zen 3 is probably next year.
Posted on Reply