
AMD Ryzen 5000 "Zen 3" "Vermeer" Launch Liveblog

We can't say it matches the 3070, as we don't even have rumours about its performance (and we don't have any non-AMD numbers for the RX 6000 either). All we have is NVIDIA saying "faster than the 2080 Ti", and even there the Galax slides showed it coming in under it.

6000 will be 15% slower than a 3080. I've been through enough of these launches to tell you their goal is to undercut the 3070/3060 on price.
 

Attachments

  • fghdfhdtyhdyt-100861401-orig.jpg (210 KB)
  • Screenshot_20201008-122644_Firefox.jpg (324.1 KB)
The price of the 5800X makes no sense: +$150 for two extra cores over the 5600X, then only +$100 for four more cores on the 5900X, and an MSRP higher than the current street price of the 3900X. The IPC gain looks great, but it doesn't seem all that relevant in games that aren't already running at 150+ FPS.

The 5800X should have been the rumoured 10-core CPU, with an 8-core 5700X in the middle.
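To put that pricing gap in numbers, here is a quick per-core calculation using the MSRPs from the announcement (a rough sketch, ignoring the bundled-cooler question):

```python
# Announced Ryzen 5000 MSRPs (USD) and core counts
lineup = {"5600X": (300, 6), "5800X": (450, 8), "5900X": (550, 12), "5950X": (800, 16)}

for name, (price, cores) in lineup.items():
    print(f"{name}: ${price / cores:.2f} per core")

# 5600X: $50.00 | 5800X: $56.25 | 5900X: $45.83 | 5950X: $50.00
# The 5800X is the only SKU above $50 per core, which is exactly the gap being complained about.
```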
 
6000 will be 15% slower than a 3080. I've been alive for enough of these launches to tell you their goal is to out price the 3070/3060.
Yes, and check the results of the linked Borderlands 3 benchmark: the RX 5700 XT comes out slower than a 2070, when in reality it sits in the middle between the 2070 and 2070 Super on average.
 
On the plus side of these new chips, you have eight cores in a single CCX, which should hopefully allow for a bit better utilization of the cores, whereas in the Zen 2, the first CCX seems to be utilised more than the second CCX. Right now, 3/4 cores in my second CCX are asleep, with all the cores in the first CCX being active.
This is why I'm excited: the 3300X (one full CCX) is faster than the 3100 in every way because of this. You can feel it too.
 
6000 will be 15% slower than a 3080. I've been alive for enough of these launches to tell you their goal is to out price the 3070/3060.

Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and Direct X 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.

This is from the TechPowerUp test setup notes used across all of those 3080 reviews.

So it doesn't compare against DX12.
 
Let me be brutally honest. AMD is no different than Intel in terms of dictating prices when they have the performance crown.

The pricing for the Ryzen 5000 series:

5600X: $300 - very close to the price of the 3700X which featured two more cores.
5800X: $450
5900X: $550
5950X: $800

All priced $50 higher than their Ryzen 3000 counterparts. What's more, there's no sign of a 5700X, which was the sweet spot of the previous-gen Ryzen lineup. Either you pay $50 more for the 3600X alternative, or you have to pay a whopping $120 more to get just two extra cores.

Customers first, my ass. More like profits first now that Intel still cannot solve their 10nm node.
Expect an XT refresh and 3100/3300X/3500X-style cut-downs eventually, which is why I'm waiting patiently for AM5...
 
Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and Direct X 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.

This from the techpowerup reviews for all of those 3080.

doesn't compare vs DX12.
Nice observation.
 
Firstly, Intel normally allows two generations of CPUs for the same socket/chipset, so you're basically lying.
No, you should read up before posting.
8700K = new Z370 board
9900K = new Z390 board
10900K = new Z490 board
(or budget variants)

"Normally" doesn't apply here: as soon as the core count went beyond four, a new board was needed every time. What you're referring to only held up to the 7700K in January 2017, which could still be used with the older Z170 chipset.
And stop calling people liars, you can do better than that.
 
Everything you said is true. But I think another way we can look at this is that AMD's CPU division is finally in a position to where they can charge enough to not only cover their costs on this gen, but also have funding for R&D to actually keep pushing forward generation over generation. As a consumer, sure, I don't want to pay higher prices, but I also don't want the only company that can put Intel in check to be stagnant or dragging behind (i.e. Bulldozer days). I want to see a true fight between them, not a "good enough" option. To me it looks like they are doing this exactly.
Secondly, as a long-time ATi customer, after the buyout I watched that GPU division prop up the CPU division through the Bulldozer days to get them to Zen, at the cost of the GPUs falling behind. AMD can now take the profits from a successful Zen 2/3 and use them to boost the GPU division, and hopefully become as competitive as their CPUs are today (at all performance tiers).

TL;DR: I don't see the prices as a negative. Actually, I think it's long overdue for AMD to stop being Generation Entitlement's best friend at their own detriment, and to start charging what they need to charge in order to thrive and outpace the competition. We also have to remember that whether we're talking about AMD, Intel, or NVIDIA, the closer we get to the physical limits of silicon, the more the cost of development and engineering skyrockets, since all the low-hanging performance fruit was picked long ago.

Very relaxing, thanks. This holiday season will be awesome for both console and PC gamers, so many choices.
 
I guess there wouldn't be any way to tell from the descriptions in eshops, so the new revisions will simply eventually replace the old stock.
(Attachment: 1 Slide 23.jpg)
 
No, you should read up before posting.
8700K = new Z370 board
9900K = new Z390 board
10900K = new Z490 board
(or budget variants)

Normally doesn't apply here, as soon as the core count started to go beyond four, a new board was needed every time. What you're referring to happened up until the 7700K in January 2017, which could be used with the older Z170 chipset.
And stop calling people liars, you can do better than that.

Socket 1156: supports both Lynnfield and Clarkdale CPUs.
Socket 1155: supports both Sandy Bridge and Ivy Bridge CPUs.
Socket 1151 revision 1: supports both Skylake and Kaby Lake CPUs.
Socket 1200: supports Comet Lake and Rocket Lake CPUs.

Now, there are outliers, which you've shown, but your original blanket statement claimed that each new generation of Intel CPUs requires a new socket. Sorry, you lied.
 
Looks like it's 10 fps down on the 3080 in two of the three games. It'll come down to pricing. Probably another 5700 XT vs 2070 Super situation (in terms of price/perf).

And this is fine, tbh.

The RX 5700 XT is a really good chip at $400. It may be 5% to 15% slower than the RTX 2070 Super, but it is $100 cheaper.
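As a rough sanity check on that claim, here's the perf-per-dollar math under those assumed numbers ($400 vs $500, 5-15% deficit); just a back-of-the-envelope sketch:

```python
# 5700 XT vs 2070 Super performance per dollar, using the prices above
xt_price, super_price = 400, 500

for deficit in (0.05, 0.15):
    rel_perf = 1 - deficit                                  # 5700 XT perf relative to the 2070S
    ratio = (rel_perf / xt_price) / (1.0 / super_price)     # perf/$ advantage over the 2070S
    print(f"{deficit:.0%} slower -> {ratio:.2f}x the perf per dollar")

# 5% slower  -> ~1.19x the perf per dollar
# 15% slower -> ~1.06x the perf per dollar
```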

I'm going to throw in my guess that the RX 6900 XT will be $599 just to undercut the RTX 3080.

Also, they should've aimed at just adding $50 to each CPU compared to the previous generation. My 3800X is doing pretty well, but single-core performance is what I'm after. The 5800X (8-core) at $450 is quite overpriced, especially since I got the 3800X for only $320 during last year's Black Friday. I was planning on the 5900X as a reasonable upgrade, but not at $550.
 
sooo shall I finally drop my 2600k? think I might just do that

Also, the Radeon bit is just epic; the background music makes it feel like I'm watching a Halo trailer. Good stuff.
But yeah, as we can see, it's not quite up there with the 3080 but beats the 2080 (Ti); now it's just a matter of power consumption and price and we can have a winner.
We seem to be in the same boat. I was thinking of buying a 5900X or 5950X since it will mainly be used for programming (my day job), and I will finally get to replace this old trash (2600K). Since this will be the last platform to support DDR4, and since DDR4 is pretty cheap nowadays, I was even thinking of going 128 GB, since I easily burn through 32 GB when running 20-30 microservice Docker instances during development.
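If you want to see where that 32 GB actually goes before committing to 128 GB, something like this works as a quick check. A minimal sketch using the Docker SDK for Python (pip install docker; assumes a local daemon with a Linux/WSL2 backend, as the memory-stats keys can differ on other backends):

```python
import docker  # Docker SDK for Python

client = docker.from_env()
total_bytes = 0

for c in client.containers.list():                     # running containers only
    stats = c.stats(stream=False)                      # one-shot stats snapshot
    usage = stats.get("memory_stats", {}).get("usage", 0)
    total_bytes += usage
    print(f"{c.name:<40} {usage / 2**20:8.0f} MiB")

print(f"{'total':<40} {total_bytes / 2**30:8.1f} GiB")
```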
 
Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and Direct X 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.

This from the techpowerup reviews for all of those 3080.

doesn't compare vs DX12.

This is true. The RX 6000 series should have a slight advantage in DX12 in Borderlands 3, but note that the DX12 renderer is still incomplete even today. I know this because I actively play it on an RX 5700 XT at 3880x1440. There is stuttering when travelling through the world, and the load times are longer than with the DX11 renderer. Not sure why Gearbox isn't working with Epic to fix this yet.
 
And this is fine, tbh.

The RX 5700 XT is a really good chip at $400. It may be within 5% to 15% slower than the RTX 2070 Super, but it is $100 cheaper.

I'm going to throw in my guess that the RX 6900 XT will be $599 just to undercut the RTX 3080.

Also they should've aimed at just adding $50 more to all the CPUs compared to the previous generation. My 3800X is doing pretty well but the single-core performance is what I'm after. 5800XT (8-core) at $450 is quite overpriced, especially since I got the 3800X at only $320 during last year's Black Friday. I was planning to go 5900XT as a reasonable upgrade, but not at $550.

Black Friday isn't a normal selling situation. Seems weird to use that as a pricing comparison.

This is true. The RX 6000 series should have a slight advantage at DX12 in Borderlands 3, but please take note that the DX12 renderer is still incomplete even today. I know this because I actively play using a RX 5700 XT at 3880x1440p. There is stuttering when traveling through the world and the load times are longer than using the DX11 renderer. Not sure why Gearbox isn't working with Epic to fix this yet.

Similar issue with Battlefield and Battlefront in DX12, and that's with Frostbite.
 
Black Friday isnt a normal selling situation. Seems wierd to use that for comparison on pricing.

Well, that's the thing. If I'm not mistaken, the original launch price of the 3800X was $399, but even then this is still $50 over the part it was supposed to replace.

Then again (I almost forgot about this one), it's technically replacing the refreshed 3800XT, which is $399, so I guess the $50 uplift is fine? I can always wait for a sale for the 5900X to drop to $499 or maybe even $450.
 
Well, that's the thing. If I'm not mistaken the original launch price of the 3800X was $399, but even then this is still $50 over the part it was supposed to replace.

Then again (I almost forgot about this one) its technically replacing the refresh 3800XT which is $399, so I guess the $50 uplift is fine? I can always wait for a sale on the 5900X to go down to $499 or maybe even $450.

I don't see an issue with the pricing. If I was okay with Intel doing basically the same thing for years while they had the performance crown, I'm okay with AMD doing the same. And it sounds like AMD will quite literally have the performance crown in more than just non-gaming workloads.
 
Unfortunately not. The core allocation works just like on any other CPU, as you can't set the fastest core to be the "default" core that kicks in.
What you want to hope for is core 1 being the fastest one, but most people aren't that lucky.
In the case of my current CPU, core 8 is the fastest one, followed by core 4 and then 2 and 7, at least according to Ryzen Master.
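One partial workaround, if you really want a specific workload on your best cores, is to pin that process's affinity yourself. A minimal sketch with Python and psutil (pip install psutil); the core indices below are purely an example, so check Ryzen Master for your own chip's preferred cores and account for SMT pairing:

```python
import psutil

# Example only: treating "core 8" and "core 4" as logical CPUs 7 and 3
# (zero-based, SMT ignored) -- substitute your own chip's best cores.
BEST_CORES = [7, 3]

p = psutil.Process()            # current process; pass a PID to target another one
p.cpu_affinity(BEST_CORES)      # the OS will now only schedule this process on those cores
print("pinned to logical CPUs:", p.cpu_affinity())
```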
On the plus side of these new chips, you have eight cores in a single CCX, which should hopefully allow for a bit better utilization of the cores, whereas in the Zen 2, the first CCX seems to be utilised more than the second CCX. Right now, 3/4 cores in my second CCX are asleep, with all the cores in the first CCX being active.
Zen 2 is indeed a bit of a mess with those high/medium/low "quality" cores. The lucky users got the best ones in a single CCX; the unlucky ones got them scattered across two or even four CCXs. On top of that bad luck comes the core-quality-unaware Windows scheduler, which loads all cores almost equally under any given load, not just single/low-thread work. In theory, 1usmus's Universal power plan helps with this, but it still can't do much for the unlucky ones whose "high quality" cores are scattered across all CCXs. By giving the Windows scheduler knowledge of core quality, it tries to load the best cores first while also keeping the most loaded threads on the same CCX. And this applies to any workload from 1% to 100%, with the benefit being largest at light loads and shrinking as the load rises.

Red: Best cores
Yellow: CCXs
Watch the effective clocks and the thread loading closely. The plain clock reading, which doesn't account for C-states, doesn't really matter; the effective clock is the one that does.
This is after 5+ hours of light workload, everyday simple internet and video use. I can also show gaming and 100% loads.

(Attachment: Untitled28.png)
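For anyone who wants to watch this scheduler behaviour on their own machine rather than in a screenshot, here's a minimal sketch using psutil (assuming it's installed) that prints per-logical-CPU load once a second; during light use you can see which cores Windows keeps waking up:

```python
import psutil

# Print per-logical-CPU utilisation every second; Ctrl+C to stop.
while True:
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"cpu{i}:{load:3.0f}%" for i, load in enumerate(loads)))
```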
 
6000 will be 15% slower than a 3080. I've been alive for enough of these launches to tell you their goal is to out price the 3070/3060.
It might be in 4K, but it might not be in 1440p. The 3080 gains a lot in 4K due to its high shader count.
 
The price of 5800X makes not sense: +$150 for 2 extra cores over 5600X, then only +$100 for 4 more cores on 5900X and the MSRP higher than the current prices of 3900X. That IPC gain looks great, but it does not seem to be all that relevant in games that aren't already running at 150+ FPS.

5800X should have been the rumored 10core CPU, with 8core 5700X in the middle.
$150? Um, reports are that the 5800X and up don't even come with a cooler, so that $150 is closer to $250.
 
The price of 5800X makes not sense: +$150 for 2 extra cores over 5600X, then only +$100 for 4 more cores on 5900X and the MSRP higher than the current prices of 3900X. That IPC gain looks great, but it does not seem to be all that relevant in games that aren't already running at 150+ FPS.

5800X should have been the rumored 10core CPU, with 8core 5700X in the middle.
If you look at how the CCDs are shaped and how these CPUs are made, this makes perfect sense. The 5600 can use defective CCDs with two cores disabled, while the 5800 requires flawless ones. Same with the 5900, which is basically two 5600s duct-taped together.

Honestly, either a 7-core or a 2-CCD 8-core 5700 would make the most sense from an engineering point of view.
 
Expensive.

But if it beats Intel hands down in all scenarios at lower power draw, then AMD are right to ask at least what Intel have been ripping people off for over the last decade.

On top of superior performance and energy efficiency, it's also not subject to the continual, repeated performance degradation from speculative-execution attack mitigations that Intel's horribly dated architecture still suffers from.

I'll wait for independent reviews of course, but they'll be out before any of us can actually buy these anyway....
 
No, you should read up before posting.
8700K = new Z370 board
9900K = new Z390 board
10900K = new Z490 board
(or budget variants)

Normally doesn't apply here, as soon as the core count started to go beyond four, a new board was needed every time. What you're referring to happened up until the 7700K in January 2017, which could be used with the older Z170 chipset.
And stop calling people liars, you can do better than that.

Except that you're wrong.

This all assumes the board maker provides a BIOS update, but that would be true for AMD boards as well:

The 6700K and 7700K will work on a Z170 or Z270.
The 8700K and 9900K will work on a Z370.
8th and 9th gen work on Z3XX, and so on.
10th gen and the upcoming 11th gen Rocket Lake will work on Z4XX.

Intel has a sustained record, across the previous four processor generations, of each chipset supporting two CPU generations.

If you really want to get technical, it's even possible to make a 9900K work on a Z170, with an overclock; that's four generations (there are many such guides out there).

 
150$? Um reports are that 5800x and up doesn't even come with a cooler so that 150$ is more closer to 250$
It was a $30 cooler at best that was merely adequate, and now you want to make up for it with a high-end air cooler or a 240 mm AIO at $100??? :laugh: Add $100 to every previous Intel CPU that shipped with no cooler, then talk about how much more expensive this is; fml, you can't please some people :kookoo:
 