
AMD Ryzen 5000 "Zen 3" "Vermeer" Launch Liveblog

Joined
Sep 15, 2016
Messages
484 (0.16/day)
It can't match the 3070, as we don't even have rumours about its performance (and we don't even have non-AMD numbers for the RX 6000). All we have is NVIDIA saying "faster than the 2080 Ti"; however, the Galax pictures showed it below that.

The 6000 will be 15% slower than a 3080. I've been around for enough of these launches to tell you their goal is to undercut the 3070/3060 on price.
 

Attachments

  • fghdfhdtyhdyt-100861401-orig.jpg
  • Screenshot_20201008-122644_Firefox.jpg
Joined
Jun 21, 2013
Messages
606 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
The price of the 5800X makes no sense: +$150 for two extra cores over the 5600X, then only +$100 for four more cores on the 5900X, and an MSRP higher than the current price of the 3900X. That IPC gain looks great, but it doesn't seem all that relevant in games that aren't already running at 150+ FPS.

The 5800X should have been the rumoured 10-core CPU, with an 8-core 5700X in the middle.
 
Joined
Feb 18, 2017
Messages
688 (0.24/day)
The 6000 will be 15% slower than a 3080. I've been around for enough of these launches to tell you their goal is to undercut the 3070/3060 on price.
Yes, and check the results of the linked Borderlands 3 benchmark: the RX 5700 XT is slower than a 2070 there, whereas in reality it sits between the 2070 and the 2070 Super on average.
 
Joined
Jun 2, 2017
Messages
9,373 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
On the plus side of these new chips, you get eight cores in a single CCX, which should hopefully allow a bit better utilisation of the cores, whereas on Zen 2 the first CCX seems to be utilised more than the second. Right now, three of the four cores in my second CCX are asleep, while all the cores in the first CCX are active.
This is exactly why the 3300X is faster than the 3100 in every way, and why I'm excited about it. You can feel it, too.
 
Joined
Apr 30, 2020
Messages
999 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
The 6000 will be 15% slower than a 3080. I've been around for enough of these launches to tell you their goal is to undercut the 3070/3060 on price.

Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and DirectX 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.

This is from the TechPowerUp reviews of all of those 3080 cards.

It doesn't compare DX12.
 
Joined
Mar 7, 2019
Messages
144 (0.07/day)
Let me be brutally honest. AMD is no different than Intel in terms of dictating prices when they have the performance crown.

The pricing for the Ryzen 5000 series:

5600X: $300 - very close to the price of the 3700X which featured two more cores.
5800X: $450
5900X: $550
5950X: $800

All priced $50 higher than their Ryzen 3000 counterparts. What's more, there's no sign of a 5700X, which was the sweet spot of the previous-gen Ryzen lineup. Either you pay $50 more for the 3600X alternative, or you pay a whopping $120 more to get just two more cores.

Customers first, my ass. More like profits first, now that Intel still can't fix its 10 nm node.
Expect XT refresh and 3100/3300x/3500x cutdowns eventually, hence why I'm waiting patiently for AM5...
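To put numbers on the complaint above, here's a quick per-core cost check; it's a sketch using $299/$449/$549/$799 as the exact MSRPs behind the rounded prices in the post, and Ryzen 3000 launch prices recalled from memory ($249/$399/$499/$749), so treat the inputs as assumptions:

```python
# Per-core cost of the Zen 3 lineup and the uplift over the Zen 2
# launch prices (both price lists are assumptions, see above).
zen3 = {"5600X": (299, 6), "5800X": (449, 8), "5900X": (549, 12), "5950X": (799, 16)}
zen2_launch = {"5600X": 249, "5800X": 399, "5900X": 499, "5950X": 749}  # 3600X/3800X/3900X/3950X

per_core = {name: price / cores for name, (price, cores) in zen3.items()}
uplift = {name: zen3[name][0] - zen2_launch[name] for name in zen3}

for name in zen3:
    print(f"{name}: ${per_core[name]:.2f}/core, +${uplift[name]} over its Zen 2 counterpart")
```

On these numbers the 5800X is the worst value per core (~$56) and the 5900X the best (~$46), which is exactly the gap a 5700X would fill.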
 
Joined
Nov 3, 2013
Messages
2,141 (0.53/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and DirectX 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.

This is from the TechPowerUp reviews of all of those 3080 cards.

It doesn't compare DX12.
Nice observation.
 

SL2

Joined
Jan 27, 2006
Messages
2,460 (0.36/day)
Firstly, Intel normally allows two generations of CPUs for the same socket/chipset, so you're basically lying.
No, you should read up before posting.
8700K = new Z370 board
9900K = new Z390 board
10900K = new Z490 board
(or budget variants)

"Normally" doesn't apply here: as soon as the core count went beyond four, a new board was needed every time. What you're referring to happened up until the 7700K in January 2017, which could still be used with the older Z170 chipset.
And stop calling people liars, you can do better than that.
 
Joined
Aug 11, 2012
Messages
20 (0.00/day)
Location
South Borneo
Processor 9900k, 3950x
Motherboard Z370 Strix, Max Hero
Cooling NZXT X72, TT M360 Plus
Memory Trident Z RGB 32g, Royal 32gb
Video Card(s) 2080Ti FE, 3080 Strix Gundam, 3090 Strix
Display(s) Dell S2716DG, 34" Ultrawide
Case Lian-Li O11DW, TT Core P3 White
Power Supply Seasonic
Everything you said is true. But another way to look at this is that AMD's CPU division is finally in a position where it can charge enough to not only cover its costs this generation, but also fund the R&D to actually keep pushing forward generation over generation. As a consumer, sure, I don't want to pay higher prices, but I also don't want the only company that can put Intel in check to be stagnant or dragging behind (i.e. the Bulldozer days). I want to see a true fight between them, not a "good enough" option. To me, it looks like that's exactly what they're doing.
Secondly, as a long-time ATi customer, after the buyout I watched the GPU division prop up the CPU division through the Bulldozer days to get them to Zen, at the cost of GPUs falling behind. AMD can now take the profits from a successful Zen 2/3 and use them to boost the GPU division, hopefully becoming as competitive as their CPUs are today (at all performance tiers).

TL;DR: I don't see the prices as a negative. Actually, I think it's long overdue for AMD to stop being generational entitlement's best friend at its own detriment, and to start charging what it needs to charge in order to thrive and outpace the competition. We also have to remember that whether we're talking about AMD, Intel, or Nvidia, the closer we get to the physical limits of silicon, the more the cost of development and engineering skyrockets, as all the low-hanging performance fruit was picked long ago.

Very relaxing, thanks. This holiday season will be awesome for both console and PC gamers; so many choices.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,772 (2.42/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
I guess there wouldn't be any way to tell from the descriptions in eshops, so the new revisions will simply eventually replace the old stock.
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Artem S. Tashkinov
No, you should read up before posting.
8700K = new Z370 board
9900K = new Z390 board
10900K = new Z490 board
(or budget variants)

"Normally" doesn't apply here: as soon as the core count went beyond four, a new board was needed every time. What you're referring to happened up until the 7700K in January 2017, which could still be used with the older Z170 chipset.
And stop calling people liars, you can do better than that.

Socket 1156: supports both Lynnfield and Clarkdale CPUs.
Socket 1155: supports both Sandy Bridge and Ivy Bridge CPUs.
Socket 1151 revision 1: supports both Skylake and Kaby Lake CPUs.
Socket 1200: supports both Comet Lake and Rocket Lake CPUs.

Now, there are outliers, which you've shown, but your original blanket statement claimed that each new generation of Intel CPUs requires a new socket. Sorry, you lied.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
Looks like it's 10 fps down on the 3080 in two of the three games. It'll come down to pricing. Probably another 5700 XT vs 2070 Super situation (in terms of price/perf).

And this is fine, tbh.

The RX 5700 XT is a really good chip at $400. It may be 5% to 15% slower than the RTX 2070 Super, but it is $100 cheaper.

I'm going to throw in my guess that the RX 6900 XT will be $599, just to undercut the RTX 3080.

Also, they should've aimed at adding just $50 to each CPU compared to the previous generation. My 3800X is doing pretty well, but single-core performance is what I'm after. The 5800X (8-core) at $450 is quite overpriced, especially since I got my 3800X for only $320 during last year's Black Friday. I was planning on the 5900X as a reasonable upgrade, but not at $550.
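That 5700 XT vs 2070 Super comparison can be made concrete with a toy perf-per-dollar calculation; the ~10% performance gap is just the midpoint of the 5-15% range above, not a measured number:

```python
# Toy price/perf comparison: assume the 2070 Super is ~10% faster
# (midpoint of the 5-15% range above) at its $500 MSRP vs. $400.
PRICE_2070S, PRICE_5700XT = 500, 400
rel_perf_2070s = 1.00
rel_perf_5700xt = 1.00 / 1.10       # ~9% slower overall

ppd_2070s = rel_perf_2070s / PRICE_2070S
ppd_5700xt = rel_perf_5700xt / PRICE_5700XT
advantage = ppd_5700xt / ppd_2070s - 1
print(f"5700 XT delivers {advantage:.0%} more performance per dollar")
```

With these toy inputs the 5700 XT comes out around 14% ahead on performance per dollar, which is why "slower but cheaper" can still be the better value.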
 
Joined
Jul 5, 2019
Messages
318 (0.16/day)
Location
Berlin, Germany
System Name Workhorse
Processor 13900K 5.9 Ghz single core (2x) 5.6 Ghz Allcore @ -0.15v offset / 4.5 Ghz e-core -0.15v offset
Motherboard MSI Z690A-Pro DDR4
Cooling Arctic Liquid Cooler 360 3x Arctic 120 PWM Push + 3x Arctic 140 PWM Pull
Memory 2 x 32GB DDR4-3200-CL16 G.Skill RipJaws V @ 4133 Mhz CL 18-22-42-42-84 2T 1.45v
Video Card(s) RX 6600XT 8GB
Storage PNY CS3030 1TB nvme SSD, 2 x 3TB HDD, 1x 4TB HDD, 1 x 6TB HDD
Display(s) Samsung 34" 3440x1400 60 Hz
Case Coolermaster 690
Audio Device(s) Topping Dx3 Pro / Denon D2000 soon to mod it/Fostex T50RP MK3 custom cable and headband / Bose NC700
Power Supply Enermax Revolution D.F. 850W ATX 2.4
Mouse Logitech G5 / Speedlink Kudos gaming mouse (12 years old)
Keyboard A4Tech G800 (old) / Apple Magic keyboard
Sooo, shall I finally drop my 2600K? I think I might just do that.

Also, the Radeon bit is just epic; the background music makes it feel like I'm watching a Halo trailer. Good stuff.
But yeah, as we can see, it's not quite up there with the 3080, but it beats the 2080 (Ti); now it's just a matter of power consumption and price, and we can have a winner.
We seem to be in the same boat. I was thinking of buying a 5900X or 5950X, since it will mainly be used for programming (my day job), and I'll finally get to replace this old trash (2600K). Since this will be the last platform to support DDR4, and DDR4 is pretty cheap nowadays, I was even thinking of going 128 GB, since I easily burn through 32 GB when running 20-30 microservice Docker instances during development.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and DirectX 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.

This is from the TechPowerUp reviews of all of those 3080 cards.

It doesn't compare DX12.

This is true. The RX 6000 series should have a slight advantage in DX12 in Borderlands 3, but please note that the DX12 renderer is still incomplete even today. I know this because I actively play on an RX 5700 XT at 3880x1440. There is stuttering when travelling through the world, and load times are longer than with the DX11 renderer. Not sure why Gearbox isn't working with Epic to fix this yet.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,009 (2.49/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
And this is fine, tbh.

The RX 5700 XT is a really good chip at $400. It may be 5% to 15% slower than the RTX 2070 Super, but it is $100 cheaper.

I'm going to throw in my guess that the RX 6900 XT will be $599, just to undercut the RTX 3080.

Also, they should've aimed at adding just $50 to each CPU compared to the previous generation. My 3800X is doing pretty well, but single-core performance is what I'm after. The 5800X (8-core) at $450 is quite overpriced, especially since I got my 3800X for only $320 during last year's Black Friday. I was planning on the 5900X as a reasonable upgrade, but not at $550.

Black Friday isn't a normal selling situation. It seems weird to use it as a pricing comparison.

This is true. The RX 6000 series should have a slight advantage in DX12 in Borderlands 3, but please note that the DX12 renderer is still incomplete even today. I know this because I actively play on an RX 5700 XT at 3880x1440. There is stuttering when travelling through the world, and load times are longer than with the DX11 renderer. Not sure why Gearbox isn't working with Epic to fix this yet.

There's a similar issue with Battlefield and Battlefront in DX12, and that's with Frostbite.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
Black Friday isn't a normal selling situation. It seems weird to use it as a pricing comparison.

Well, that's the thing. If I'm not mistaken, the original launch price of the 3800X was $399, but even then, this is still $50 over the part it's supposed to replace.

Then again (I almost forgot about this one), it's technically replacing the refreshed 3800XT, which is $399, so I guess the $50 uplift is fine? I can always wait for a sale that brings the 5900X down to $499 or maybe even $450.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,009 (2.49/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Well, that's the thing. If I'm not mistaken, the original launch price of the 3800X was $399, but even then, this is still $50 over the part it's supposed to replace.

Then again (I almost forgot about this one), it's technically replacing the refreshed 3800XT, which is $399, so I guess the $50 uplift is fine? I can always wait for a sale that brings the 5900X down to $499 or maybe even $450.

I don't see an issue with the pricing. If I was okay with Intel doing basically the same thing for years while they had the performance crown, I'm okay with AMD doing the same. And it sounds like AMD will quite literally have the performance crown for more than just non-gaming workloads.
 
Joined
Sep 3, 2019
Messages
3,580 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Unfortunately not. Core allocation works just like on any other CPU; you can't set the fastest core to be the "default" core that kicks in.
What you want to hope for is core 1 being the fastest one, but most people aren't that lucky.
In the case of my current CPU, core 8 is the fastest, followed by core 4, then 2 and 7, at least according to Ryzen Master.
On the plus side of these new chips, you get eight cores in a single CCX, which should hopefully allow a bit better utilisation of the cores, whereas on Zen 2 the first CCX seems to be utilised more than the second. Right now, three of the four cores in my second CCX are asleep, while all the cores in the first CCX are active.
Zen 2 is indeed a bit of a mess with those high/medium/low "quality" cores. Lucky users got the best ones in a single CCX; the unlucky got them scattered across two or even four CCXs. On top of that bad luck comes the ignorant Windows scheduler, which loads all cores almost equally under any given load, not just single/low-threaded work. In theory, 1usmus's universal power plan helps here, but still not much for the unlucky ones with "high quality" cores scattered across all CCXs. By giving the Windows scheduler knowledge of core quality, it tries to load the highest-quality cores first while also trying to keep the most loaded threads on the same CCX. And this applies to any workload from 1% to 100%, though the benefit margin shrinks as the load rises.

Red: best cores
Yellow: CCXs
Watch the effective clocks and the thread loading closely. Raw clock readings that don't account for C-states don't really tell the story; effective clock does include them.
This is after 5+ hours of light workload, i.e. everyday internet and video use. I can also show gaming and 100% loads.

Untitled28.png
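The placement policy being described (fill the CCX holding the best core first, best cores first within it) can be sketched as a toy model; the quality scores and CCX layout below are invented for illustration, not read from any real chip:

```python
def place_threads(core_quality, ccx_of, n_threads):
    """Toy quality-aware placement: fill the CCX holding the best core
    first, best cores first within each CCX (roughly what a
    quality-aware scheduler/power plan is described as doing above)."""
    ccxs = {}
    for core, ccx in ccx_of.items():
        ccxs.setdefault(ccx, []).append(core)
    # rank CCXs by their best core, then cores by quality within each CCX
    ranked = sorted(ccxs, key=lambda c: max(core_quality[k] for k in ccxs[c]),
                    reverse=True)
    order = []
    for ccx in ranked:
        order += sorted(ccxs[ccx], key=lambda k: core_quality[k], reverse=True)
    return order[:n_threads]

# Made-up quality scores for an 8-core, 2-CCX part: the best core (4)
# sits in the second CCX, so light loads should land there first.
quality = {0: 2, 1: 1, 2: 3, 3: 1, 4: 9, 5: 4, 6: 5, 7: 3}
ccx = {c: (0 if c < 4 else 1) for c in range(8)}
print(place_threads(quality, ccx, 3))  # -> [4, 6, 5]
```

The point of the model: with three light threads, a naive scheduler might spread them across both CCXs, while quality-aware placement keeps them together on the CCX with the strongest cores.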
 
Joined
Feb 23, 2019
Messages
6,106 (2.87/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
The 6000 will be 15% slower than a 3080. I've been around for enough of these launches to tell you their goal is to undercut the 3070/3060 on price.
It might be in 4K; it might not be in 1440p. The 3080 gains a lot in 4K due to its high shader count.
 
Joined
Jun 13, 2012
Messages
1,409 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
The price of the 5800X makes no sense: +$150 for two extra cores over the 5600X, then only +$100 for four more cores on the 5900X, and an MSRP higher than the current price of the 3900X. That IPC gain looks great, but it doesn't seem all that relevant in games that aren't already running at 150+ FPS.

The 5800X should have been the rumoured 10-core CPU, with an 8-core 5700X in the middle.
$150? Um, reports are that the 5800X and up don't even come with a cooler, so that $150 is closer to $250.
 
Joined
Aug 4, 2020
Messages
1,624 (1.01/day)
Location
::1
The price of the 5800X makes no sense: +$150 for two extra cores over the 5600X, then only +$100 for four more cores on the 5900X, and an MSRP higher than the current price of the 3900X. That IPC gain looks great, but it doesn't seem all that relevant in games that aren't already running at 150+ FPS.

The 5800X should have been the rumoured 10-core CPU, with an 8-core 5700X in the middle.
If you look at how the CCDs are laid out and how these CPUs are made, this makes perfect sense. The 5600X can use defective CCDs with two cores disabled, while the 5800X requires flawless ones. Same with the 5900X, which is essentially two 5600Xs duct-taped together.

Honestly, either a 7-core part or a 2-CCD 8-core 5700 would make the most sense from an engineering PoV.
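That harvesting argument can be written down as a tiny sketch; the thresholds are the post's reasoning, not AMD's actual binning rules:

```python
def bin_ccd(defective_cores: int) -> str:
    """Toy SKU binning for one 8-core Zen 3 CCD: a flawless die can be an
    8-core part (5800X, or half a 5950X); up to two dead cores can still
    ship as a 6-core part (5600X, or half a 5900X); worse dies are set aside."""
    if defective_cores == 0:
        return "8-core SKU (5800X / half a 5950X)"
    if defective_cores <= 2:
        return "6-core SKU (5600X / half a 5900X)"
    return "set aside (future cut-down SKU or scrap)"

print(bin_ccd(0))  # flawless die
print(bin_ccd(2))  # two dead cores still make a 6-core part
```

This is why the pricing "makes sense" from a yield standpoint: the 8-core SKU can only use perfect dies, so it carries a binning premium the 6- and 12-core parts don't.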
 
Joined
Feb 20, 2019
Messages
8,339 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Expensive.

But if it beats Intel hands down in all scenarios at lower power draw, then AMD are right to ask at least as much as Intel has been ripping people off for over the last decade.

On top of superior performance and energy efficiency, it's also not subject to the continual, repeated performance degradation from speculative-execution mitigations that Intel's horribly dated architecture still suffers.

I'll wait for independent reviews, of course, but they'll be out before any of us can actually buy these anyway...
 
Joined
Jan 27, 2015
Messages
1,746 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
No, you should read up before posting.
8700K = new Z370 board
9900K = new Z390 board
10900K = new Z490 board
(or budget variants)

"Normally" doesn't apply here: as soon as the core count went beyond four, a new board was needed every time. What you're referring to happened up until the 7700K in January 2017, which could still be used with the older Z170 chipset.
And stop calling people liars, you can do better than that.

Except that you're wrong.

This all assumes the board maker provides a BIOS update, but that would be true for AMD boards as well:

The 6700K and 7700K work on a Z170 or Z270.
The 8700K and 9900K work on a Z370.
8th and 9th gen work on Z3xx boards, and so on.
10th gen and the upcoming 11th-gen Rocket Lake will work on Z4xx.

Intel has a sustained record, across the previous four processor generations, of motherboard chipsets working for two generations.

If you really want to get technical, it's even possible to make a 9900K work on a Z170, with an overclock; that's four generations (there are many such guides out there).
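The two-generations-per-chipset pattern being argued here can be captured in a simple lookup (official support only; the modded 9900K-on-Z170 case deliberately excluded):

```python
# Official CPU-generation support per mainstream Intel chipset family,
# as laid out in the post above (mods/BIOS hacks excluded).
SUPPORTED_GENS = {
    "Z170": {6, 7},    # Skylake, Kaby Lake
    "Z270": {6, 7},
    "Z370": {8, 9},    # Coffee Lake, Coffee Lake Refresh
    "Z390": {8, 9},
    "Z490": {10, 11},  # Comet Lake, Rocket Lake
}

def supports(chipset: str, gen: int) -> bool:
    """Return True if the given chipset officially runs that CPU generation."""
    return gen in SUPPORTED_GENS.get(chipset, set())

print(supports("Z370", 9))   # 9900K on Z370: True
print(supports("Z170", 8))   # 8700K on Z170: False (officially)
```

Every chipset family in the table covers exactly two generations, which is the "sustained record" the post is pointing at.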

 
Joined
Jan 25, 2006
Messages
1,470 (0.21/day)
Processor Ryzen 1600AF @4.2Ghz 1.35v
Motherboard MSI B450M PRO-A-MAX
Cooling Deepcool Gammaxx L120t
Memory 16GB Team Group Dark Pro Sammy-B-die 3400mhz 14.15.14.30-1.4v
Video Card(s) XFX RX 5600 XT THICC II PRO
Storage 240GB Brave eagle SSD/ 2TB Seagate Barracuda
Display(s) Dell SE2719HR
Case MSI Mag Vampiric 011C AMD Ryzen Edition
Power Supply EVGA 600W 80+
Software Windows 10 Pro
$150? Um, reports are that the 5800X and up don't even come with a cooler, so that $150 is closer to $250.
It was a $30 cooler at best, and barely adequate; now you want to mitigate that with a high-end air cooler or a 240 mm AIO at $100? :laugh: :laugh: :laugh: :laugh: Add $100 to all the previous Intel CPUs that shipped with no cooler, then talk about how much more expensive this is. FML, you can't please some people :kookoo:
 