
AMD Ryzen 7 5800X3D

Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
And?
It still goes against everything you just said

  • Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.
Against everything I said? You, sir, are lying or blatantly exaggerating. Didn't expect it from a moderator but whatever. Underdog's mentality just refuses to die.

@W1zzard, nowhere in your review do you mention Intel Broadwell, which is quite sad actually. I expected you to be a little bit better versed in the history of computing.
 
Joined
Jan 14, 2019
Messages
12,349 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I was itching to comment a big "BOOO" until I got to the game tests. Then I went "hmm". Not bad. Not bad at all!
 
Joined
Aug 9, 2019
Messages
1,695 (0.87/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
Curve Optimizer is a glorified per-core Load Line Calibration.
That's one way of putting it, but the actual results can be quite impressive. Going from stock to -30 all-core on my 5600X yielded 6% better multicore CB23 at the same 76 W PPT. Temps in single-core scenarios dropped by 5 °C because load voltage is 90 mV lower than stock.
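If anyone wants the efficiency math spelled out, here's a quick back-of-the-envelope sketch. The stock score below is just a placeholder; only the ~6% gain and the 76 W PPT come from my run:

[CODE]
# Back-of-the-envelope perf/W check for Curve Optimizer at a fixed power limit.
# stock_score is a made-up placeholder; the 6% gain and 76 W PPT are from my 5600X run.
stock_score = 11000                # hypothetical CB23 multicore score at stock
co_score = stock_score * 1.06      # ~6% higher with -30 all-core Curve Optimizer
ppt_watts = 76                     # identical package power limit in both runs

stock_perf_per_watt = stock_score / ppt_watts
co_perf_per_watt = co_score / ppt_watts

print(f"stock:  {stock_perf_per_watt:.1f} pts/W")
print(f"CO -30: {co_perf_per_watt:.1f} pts/W "
      f"(+{(co_perf_per_watt / stock_perf_per_watt - 1) * 100:.0f}% at the same power)")
[/CODE]

Same power in, more work out - that's the whole appeal.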

And?
It still goes against everything you just said

  • Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.
Actually, many people game at 1080p or 720p without knowing it (DLSS/FSR ;)).
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700 MHz 0.750 V (375 W down to 250 W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Against everything I said? You, sir, are lying or blatantly exaggerating. Didn't expect it from a moderator but whatever. Underdog's mentality just refuses to die.

@W1zzard, nowhere in your review do you mention Intel Broadwell, which is quite sad actually. I expected you to be a little bit better versed in the history of computing.
I give up - you're just here to troll and waste time.
 
Joined
May 11, 2018
Messages
1,258 (0.53/day)
So, is this the "20% uplift" in gaming that AMD teased? I know it was worded "up to", and at equivalent frequency to the 5800X (which it doesn't achieve), and we all knew this would only be the case in non-GPU-limited scenarios (so ultra low resolution).

So yeah, it does what AMD was teasing. In some games at ultra low resolution it has an incredible performance uplift - mostly in games that don't need it, since they were at very high FPS even on the 5800X.

But I still think a steep price increase would indicate an all-around faster processor, and here we have a processor with mostly the same performance as, or even lower than, the 5800X in productivity; and even if we focus purely on gaming, there are lots of games that don't benefit much. Focusing more on games that really tax the GPU would show somewhat different results, even at lower resolutions.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
So, is this the "20% uplift" in gaming that AMD teased? I know it was worded "up to", and at equivalent frequency to the 5800X (which it doesn't achieve), and we all knew this would only be the case in non-GPU-limited scenarios (so ultra low resolution).

So yeah, it does what AMD was teasing. In some games at ultra low resolution it has an incredible performance uplift - mostly in games that don't need it, since they were at very high FPS even on the 5800X.

But I still think a steep price increase would indicate an all-around faster processor, and here we have a processor with mostly the same performance as, or even lower than, the 5800X in productivity; and even if we focus purely on gaming, there are lots of games that don't benefit much. Focusing more on games that really tax the GPU would show somewhat different results, even at lower resolutions.
How's this for an "up to"?
1649833902337.png


That's a 43% gain over the 5800X, and at a resolution that can be GPU-limited for every graphics card priced under $1000.
 
Joined
May 11, 2018
Messages
1,258 (0.53/day)
Yes, certain games show great uplifts even at higher resolutions. Borderlands 3, Far Cry 5. But some games show almost zero uplift, and they aren't GPU bound - RDR2 for instance.
 
Joined
Aug 9, 2019
Messages
1,695 (0.87/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
Yes, certain games show great uplifts even at higher resolutions. Borderlands 3, Far Cry 5. But some games show almost zero uplift, and they aren't GPU bound - RDR2 for instance.
It depends a lot on what games prefer: many games love cache (BL3 and FC), some love latency (FC), some love bandwidth (Cyberpunk, Total War), some love cores (RSS), and some like everything (SOTTR).
 
Joined
May 11, 2018
Messages
1,258 (0.53/day)
I wonder if it would do anything for Microsoft Flight Simulator and DCS - two games that are often CPU bound, but I don't think they are the type that would "fit in cache".
 
Joined
Oct 2, 2015
Messages
3,147 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
A friend is very interested in it; he would go from an 1800X to this. No need to replace the X370 board thanks to the new BIOSes, so it's a solid five-year upgrade.
 
Joined
Feb 23, 2019
Messages
6,074 (2.88/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Ah, just what I expected - Intel cherrypickers vs AMD apologists. I need a fresh batch of popcorn.
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
It depends a lot on what games prefer: many games love cache (BL3 and FC), some love latency (FC), some love bandwidth (Cyberpunk, Total War), some love cores (RSS), and some like everything (SOTTR).

It also depends on the scene. Games like CP:2077 and Tomb Raider are pretty large, so it's very easy for different reviewers to test different scenes and see different results depending on how CPU- or GPU-intensive that scene is.

EDIT: I also find it a shame that no one really tests non-FPS metrics. Some places do Factorio UPS, a few do Civ 6 turn time, and Anandtech used to do something with Dwarf Fortress world building, but where are the tick rate tests for the Paradox grand strategy games, where are the simulation rate tests for Cities: Skylines, or the AI turn time tests for turn-based games? I don't get the lack of testing for this. A lot of these games run fine at 4K on a pretty low-end GPU, but by the end game the CPU is the part that is crying out for help, and nobody seems to want to come up with a test for them. It's kind of irritating really, because I have to guess based on the FPS of games with high unit counts and a lot of background calculation going on at the same time, like RTS games.
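To make it concrete, this is the kind of "updates per second" measurement I mean - a minimal sketch where the tick function is a toy stand-in, not any game's actual simulation logic:

[CODE]
# Minimal UPS-style benchmark sketch: run a fixed number of simulation ticks
# and report how many the CPU can process per second. The tick below is a toy
# stand-in for a real game-logic update (a Factorio tick, a Paradox daily tick, etc.).
import time

def simulate_tick(step: int) -> int:
    # Placeholder workload; a real test would advance actual game state here.
    base = step % 1000
    return sum(x * x for x in range(base, base + 5000))

def measure_ups(ticks: int = 2000) -> float:
    start = time.perf_counter()
    for step in range(ticks):
        simulate_tick(step)
    elapsed = time.perf_counter() - start
    return ticks / elapsed

if __name__ == "__main__":
    print(f"~{measure_ups():.0f} updates per second")
[/CODE]

Something this simple, pointed at real game logic instead of a dummy loop, would tell you far more about late-game strategy performance than another 300+ FPS bar chart.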
 
Last edited:
Joined
Feb 15, 2019
Messages
1,659 (0.78/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
It is so funny that
So, is this the "20% uplift" in gaming that AMD teased? I know it was worded "up to", and at equivalent frequency to the 5800X (which it doesn't achieve), and we all knew this would only be the case in non-GPU-limited scenarios (so ultra low resolution).

So yeah, it does what AMD was teasing. In some games at ultra low resolution it has an incredible performance uplift - mostly in games that don't need it, since they were at very high FPS even on the 5800X.

But I still think a steep price increase would indicate an all-around faster processor, and here we have a processor with mostly the same performance as, or even lower than, the 5800X in productivity; and even if we focus purely on gaming, there are lots of games that don't benefit much. Focusing more on games that really tax the GPU would show somewhat different results, even at lower resolutions.

By "ultra low resolution" you mean 1440p?
What "normal resolution" are you gaming on? 16K?
 
Joined
Dec 14, 2011
Messages
1,044 (0.22/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Redragon K618 RGB PRO
Software Microsoft Windows 11 - Enterprise (64-bit)
I am really looking forward to the next-gen CPUs. AMD has done something fantastic this time around, and it can only spell good things going forward. I mean, look at the power draw compared against Intel's best CPU, and look at the FPS gains - it's simply amazing.
 
Joined
May 11, 2018
Messages
1,258 (0.53/day)
It is so funny that


By "ultra low resolution" you mean 1440p?
What "normal resolution" are you gaming on? 16K?


Of course I was referring to the huge uplift at lower resolutions. Some games show large gains even at higher resolutions, but by no means all of them - the 5800X is just 6.7% slower at that resolution on average in this set of games.
 
Joined
Oct 8, 2015
Messages
772 (0.23/day)
Location
Earth's Troposphere
System Name 3 "rigs"-gaming/spare pc/cruncher
Processor R7-5800X3D/i7-7700K/R9-7950X
Motherboard Asus ROG Crosshair VI Extreme/Asus Ranger Z170/Asus ROG Crosshair X670E-GENE
Cooling Bitspower monoblock ,custom open loop,both passive and active/air tower cooler/air tower cooler
Memory 32GB DDR4/32GB DDR4/64GB DDR5
Video Card(s) Gigabyte RX6900XT Alphacooled/AMD RX5700XT 50th Aniv./SOC(onboard)
Storage mix of sata ssds/m.2 ssds/mix of sata ssds+an m.2 ssd
Display(s) Dell UltraSharp U2410 , HP 24x
Case mb box/Silverstone Raven RV-05/CoolerMaster Q300L
Audio Device(s) onboard/onboard/onboard
Power Supply 3 Seasonics, a DeltaElectronics, a FractalDesing
Mouse various/various/various
Keyboard various wired and wireless
VR HMD -
Software W10.someting or another,all 3
The drop-in replacement option is magnificent for just about anyone rocking an AM4 platform, provided appropriate CPU support from motherboard manufacturers.

Naysayers can boo all they want because, most likely, this CPU is not for them:
Where did it score in gaming?
Right in the top echelon.
 
Joined
Jul 19, 2016
Messages
482 (0.16/day)
At this point 7 nm is an old node, certainly less dense than Intel's brand new Intel 7 (10nm++), yet at lowish clocks on this old node it's matching a 5.5 GHz behemoth 12900KS at roughly half the power draw in gaming. It seems to me that Intel pushing CPUs to quite frankly absurd power draw levels to keep up is foolish when their rival only has to stack some cache on top of the die to match them.

At this point, Ryzen 6000 with 3D cache (this will likely be a refresh of what's coming later this year) is going to wipe the floor with Intel even if they push clocks to 5.7 GHz at the top end and draw 350 W!
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
@W1zzard Sorry to be a pain but is there a reason why the bar chart scores for the 5800X in Metro Exodus are different from the frame time analysis scores?

VS



Is this a GPU limit restricting maximum FPS, with the difference coming down to the scene in the bar chart being less CPU-intensive than the one in the frame time analysis? If so, wouldn't it make more sense to use CPU-heavy scenes for a CPU test?
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Read the review. It was clustered at the top of the charts for 4k also. All you're really saying is "NO CPU MATTERS that much when playing at 4k" which is true. But the entire point of buying the 12900ks or 5800x3d is to play at high refresh rates. 4k at minimum settings. 1440p 240hz. 1080p 360hz etc. Don't you think it is silly to say "no cpu matters at 4k" as a way to attack the best cpu? LOL

I don't think anyone attacks it. Here we are discussing the pros and cons, and my ultimate goal is to find a reason not to buy it, which is pretty good because it saves a decent amount of cash. Don't you want to save some nice cash now, in this worldwide economic crisis?
 
Joined
Jul 21, 2016
Messages
4 (0.00/day)
AMD is not the first company to have a massive L3 cache: Intel did that seven years ago with Broadwell which featured a massive 128MB L4 cache and showed crazy improvements in certain applications and games as well.
L4, not L3. Of course a massive cache is not a new thing for CPUs, but now we are at a point where especially the latency is extremely good with AMD's L3.
5800X3D is a single experimental CPU which rocks in some games but loses in others, and also loses in general applications that don't require enormous L3 cache, due to decreased frequencies.
It's not an experimental CPU - just look at what the massive L3 is capable of in Milan-X. That's the future, because you get "free" performance boosts without launching a whole new architecture.
Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU bound and don't care about your L3 cache.
It doesn't matter how many people game at 720p; that's not the point of CPU benchmarks. You want to know how many FPS a CPU can deliver, and for that you need a lower resolution to eliminate the GPU as a limiting factor. Although I don't think that's the case in the benchmarks here, since for a lot of the games the CPUs are all too close to each other. Smells like GPU-bound even at 720p, which is... not a good way to test a CPU.

And for benchmarks you should never use the integrated ones, since they suck (which is what I think happened here a few times), and if you're in-game the FPS is always worse since the games themselves are way more demanding.

The thing with lower resolutions is: if your 3090 is capable of delivering 80 FPS at 2160p and the bar shows 80 FPS, you don't know how good the CPUs are. If the bar shows 115 FPS and you know the 3090 can only deliver 80 FPS, you know how good the CPUs are and where the GPU is the bottleneck. That is especially important for future GPUs, and if the rumors are true we are getting something like >2x performance with the next GPU generations coming this year already.
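As a toy illustration of that point (all the FPS numbers below are made up): what shows up on a bar chart is capped by whichever side is slower, so a GPU-bound resolution hides the CPU difference entirely.

[CODE]
# Made-up numbers: the chart result is the minimum of what the CPU and the GPU can each deliver.
def observed_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    return min(cpu_limited_fps, gpu_limited_fps)

gpu_fps_2160p = 80            # what the GPU can push at 4K in this hypothetical game
gpu_fps_720p = 400            # same GPU with the resolution dropped
cpu_a, cpu_b = 115, 140       # what two CPUs could deliver with an unlimited GPU

print(observed_fps(cpu_a, gpu_fps_2160p))  # 80  -> both CPUs look identical at 4K
print(observed_fps(cpu_b, gpu_fps_2160p))  # 80
print(observed_fps(cpu_a, gpu_fps_720p))   # 115 -> at 720p the CPU gap finally shows
print(observed_fps(cpu_b, gpu_fps_720p))   # 140
[/CODE]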

Besides that, I don't like testing games that run at 300+ FPS in CPU benchmarks. Who cares about 340 FPS vs 370 FPS? There are heavily CPU-bound games out there like Anno, Total War, Cities, hell, even Elden Ring has you looking at something way below 100 FPS on the CPU side. Why not test those games, where you actually need a lot more FPS?
RPL, according to the leaked information, will feature increased IPC for both its P and E cores as well as significantly increased caches, which means Intel will swiftly catch up and overtake this CPU and maybe even Zen 4.
According to the leaked information the IPC increase won't be large, and the increased cache is "only" L2. It will help a bit here and there, but nothing special.

This is the last hurrah of AM4; there's no future upgrade path.
And yet, people with old AM4 boards and Zen 2 or earlier get a big last upgrade if they're mainly into gaming. It looks exciting enough to me, comparing the original Zen (1800X etc.) with the 5800X3D and seeing the huge performance difference - all requiring no new socket or motherboard.
 
Joined
Jan 26, 2020
Messages
416 (0.23/day)
Location
Minbar
System Name Da Bisst
Processor Ryzen 5800X
Motherboard GigabyteB550 AORUS PRO
Cooling 2x280mm + 1x120 radiators, 4xArctic P14 PWM, 2xNoctua P12, TechN CPU, Alphacool Eisblock Auror GPU
Memory Corsair Vengeance RGB 32 GB DDR4 3800 MHz C16 tuned
Video Card(s) AMD PowerColor 6800XT
Storage Samsung 970 Evo Plus 512GB
Display(s) BenQ EX3501R
Case SilentiumPC Signum SG7V EVO TG ARGB
Audio Device(s) Onboard
Power Supply ChiefTec Proton Series 1000W (BDF-1000C)
Mouse Mionix Castor
This 5800X3D would be super interesting with a 1900 MHz+ capable IMC and very fast RAM with optimized timings.
Also, not sure, but does it support negative voltage adjustment?
 
Joined
Sep 8, 2020
Messages
216 (0.14/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
This is the stupidest contest between Intel and AMD to have: "best gaming CPU" at 1280x720.
It's like Windows users are in another reality, with 1000 W power supplies and extreme water cooling, while Apple users now have those stylish tiny computers with ~400 W power supplies and whisper-quiet operation.
Intel, AMD and Nvidia should be thankful Apple is not interested in gaming and crypto mining scheming; they would be screwed if Apple ever wanted a piece of this nasty market.
 
Joined
Dec 14, 2011
Messages
275 (0.06/day)
Processor 12900K @5.1all Pcore only, 1.23v
Motherboard MSI Edge
Cooling D15 Chromax Black
Memory 32GB 4000 C15
Video Card(s) 4090 Suprim X
Storage Various Samsung M.2s, 860 evo other
Display(s) Predator X27 / Deck (Nreal air) / LG C3 83
Case FD Torrent
Audio Device(s) Hifiman Ananda / AudioEngine A5+
Power Supply Seasonic Prime TX 1000W
Mouse Amazon finest (no brand)
Keyboard Amazon finest (no brand)
VR HMD Index
Benchmark Scores I got some numbers.
Nice processor if you want to game at 720p/1080p
Or 1440p. And even 4K, tbh - the difference is so small it's not worth mentioning, unless you specifically and exclusively game at that (which I do). Intel's small edge is only realised if your target is 4K and you can get a good DDR5 kit (or I guess good DDR4, since you're likely to get one faster that'll play nice with Intel over AMD).

So AMD delivered what it said it would, and it's a no-brainer at this point in time to get a 5800X3D in most cases.

I still wish TPU would add a gaming power chart, the 12900 looks insane on those charts when it in no way reflects a typical gaming scenario.

I'm underwhelmed for the price. With the exception of a few odd ducks, it's less than 10% faster than 5800x for 20% more cost.
That's one way to look at it. I think it's great to see how much extra performance they got out of a year-and-a-half-old CPU, while matching the launch price of the older chip rather than price creeping.
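Even taking the quoted "10% faster for 20% more" at face value, the value gap is smaller than it sounds. A rough sketch, with assumed round-number prices rather than exact street pricing:

[CODE]
# Rough perf-per-dollar comparison using the figures from the quote above.
# Prices here are assumptions for illustration only.
base_perf, base_price = 1.00, 375   # 5800X as the baseline, assumed ~$375 street price
x3d_perf, x3d_price = 1.10, 450     # ~10% faster, ~20% more expensive ($449 MSRP)

base_value = base_perf / base_price
x3d_value = x3d_perf / x3d_price

print(f"5800X   : {base_value * 1000:.2f} perf per $1000")
print(f"5800X3D : {x3d_value * 1000:.2f} perf per $1000 "
      f"({(x3d_value / base_value - 1) * 100:+.0f}% vs the 5800X)")
[/CODE]

Call it single-digit percent worse gaming perf-per-dollar for the fastest gaming CPU you can drop into an existing AM4 board - not exactly a scandal.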
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
I give up - you're just here to troll and waste time.

This is called ad hominem - attacking the person instead of addressing what he says. And thank you for not admitting that you greatly exaggerated or lied about my statement.

What a nice discussion we have here. Congratulations on the high standards of the TPU forums. Nowhere in my posts on TPU have I ever trolled or wasted anyone's time. I'm not known for trolling, but I am well known for crushing the red-tinted glasses people love to put on when they talk about particular companies.

L4, not L3. Of course a massive cache is not a new thing for CPUs, but now we are at a point where especially the latency is extremely good with AMD's L3.

It's not an experimental CPU - just look at what the massive L3 is capable of in Milan-X. That's the future, because you get "free" performance boosts without launching a whole new architecture.

It doesn't matter how many people game at 720p; that's not the point of CPU benchmarks. You want to know how many FPS a CPU can deliver, and for that you need a lower resolution to eliminate the GPU as a limiting factor. Although I don't think that's the case in the benchmarks here, since for a lot of the games the CPUs are all too close to each other. Smells like GPU-bound even at 720p, which is... not a good way to test a CPU.

And for benchmarks you should never use the integrated ones, since they suck (which is what I think happened here a few times), and if you're in-game the FPS is always worse since the games themselves are way more demanding.

The thing with lower resolutions is: if your 3090 is capable of delivering 80 FPS at 2160p and the bar shows 80 FPS, you don't know how good the CPUs are. If the bar shows 115 FPS and you know the 3090 can only deliver 80 FPS, you know how good the CPUs are and where the GPU is the bottleneck. That is especially important for future GPUs, and if the rumors are true we are getting something like >2x performance with the next GPU generations coming this year already.

Besides that, I don't like testing games that run at 300+ FPS in CPU benchmarks. Who cares about 340 FPS vs 370 FPS? There are heavily CPU-bound games out there like Anno, Total War, Cities, hell, even Elden Ring has you looking at something way below 100 FPS on the CPU side. Why not test those games, where you actually need a lot more FPS?

According to the leaked information the IPC increase won't be large, and the increased cache is "only" L2. It will help a bit here and there, but nothing special.


And yet, people with old AM4 boards and Zen 2 or earlier get a big last upgrade if they're mainly into gaming. It looks exciting enough to me, comparing the original Zen (1800X etc.) with the 5800X3D and seeing the huge performance difference - all requiring no new socket or motherboard.

What AMD fans absolutely love to do is to talk about the future. Almost the entirety of your post talks about the future.

Speaking of the future: RPL will have massively increased L2/L3 caches. OK? And considering Intel has ample time to redesign them they can go ahead and add even more L3 cache because I'm sure as hell Intel will do anything to retain their performance crown.

Speaking of this CPU being experimental: what the hell are you talking about with Milan-X? Is that a desktop CPU? Is TPU about servers or about gaming PCs? Where are the other Zen 3 desktop SKUs with 3D V-Cache? Why is the 5800X3D priced so high?

People with old AM4 boards and Zen 2 or earlier will be better served by the regular 5800X/5900X, which show much better overall performance. People who have been waiting to upgrade normally don't rock an RTX 3090 and game at 720p or 1080p.

TL;DR: not a single argument for the 5800X3D outside of some very specific games at very unusual resolutions.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,852 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
EDIT: I also find it a shame that no one really tests non-FPS metrics. Some places do Factorio UPS, a few do Civ 6 turn time, and Anandtech used to do something with Dwarf Fortress world building, but where are the tick rate tests for the Paradox grand strategy games, where are the simulation rate tests for Cities: Skylines, or the AI turn time tests for turn-based games? I don't get the lack of testing for this. A lot of these games run fine at 4K on a pretty low-end GPU, but by the end game the CPU is the part that is crying out for help, and nobody seems to want to come up with a test for them. It's kind of irritating really, because I have to guess based on the FPS of games with high unit counts and a lot of background calculation going on at the same time, like RTS games.
Can you start a new thread with ideas? I definitely want to add something like this in the next rebench

I still wish TPU would add a gaming power chart, the 12900 looks insane on those charts when it in no way reflects a typical gaming scenario.
Will definitely be included in next rebench, just too complicated to add this now for 30 CPUs.

I've been recording gaming power already for a few months to get a feel for it, don't take this as gospel and rather consider it experimental and preliminary:
power-gaming.png


@W1zzard, nowhere in your review do you mention Intel Broadwell, which is quite sad actually. I expected you to be a little bit better versed in the history of computing.
trolled or wasted anyone's time
That's exactly what I thought when I read your first statement, could be language differences
 
Last edited: