
DOOM Eternal Benchmark Test & Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,707 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
What are the graphics driver settings, though? Radeon Software and Nvidia Control Panel?
There is quite a noticeable difference in the results if you tweak one or two settings.
For all my testing I use out-of-the-box settings, as that represents what 99.9% or more of people actually use.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.68/day)
Location
Ex-usa | slava the trolls
For all my testing I use out-of-the-box settings, as that represents what 99.9% or more of people actually use.

Well, I always change the settings; on AMD, historically, setting Texture Filtering Quality to High gives better performance.
One needs to test the settings and see where the gains are. I always do it, because every frame is precious on lower-end hardware.

But for an apples-to-apples comparison the default might be right. :)

 
Joined
Jul 18, 2017
Messages
575 (0.22/day)
We don't even need to read CPU/GPU game tests to know which brands will top the charts lolz
 
Joined
Mar 20, 2019
Messages
427 (0.21/day)
Location
Australia
System Name Ryzen
Processor AMD Ryzen 7 5700X
Motherboard Asus TUF Gaming B550-Plus (Wi-Fi)
Cooling Cryorig H7
Memory Kingston Fury Beast DDR4 3200MHz 2x8GB + 2x16GB
Video Card(s) Sapphire NITRO+ Radeon RX 6700 XT GAMING OC
Storage WD_Black SN850 500GB NVMe SSD + Adata XPG SX8200 Pro 512GB NVMe SSD
Display(s) Gigabyte G27QC
Case NZXT H510 Flow
Audio Device(s) SteelSeries Arctis Prime
Power Supply Corsair RM650x Gold 650W
Mouse Logitech G502 X
Keyboard HyperX Alloy FPS Cherry MX Blue
Software Windows 11 Pro
Is anyone else having trouble with DE flicking their 144Hz monitor down to 60Hz? It doesn’t seem to matter what I do, every time I launch the game in either full screen or borderless window modes it changes my refresh rate to 60Hz.

I’m using the latest Radeon drivers which came out the other day with Windows 10 all patched up.
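
One way to narrow it down: check what mode Windows is actually running and try to force the refresh rate back from a tiny program. A rough Win32 sketch (it assumes the primary display, and 144 is just my panel's rate), which at least tells you whether the game changed the OS display mode or only its own swap chain:

#include <windows.h>
#include <cstdio>

// Diagnostic sketch, not a fix: print the current display mode,
// then try to force the refresh rate back up.
int main()
{
    DEVMODE dm{};
    dm.dmSize = sizeof(dm);
    EnumDisplaySettings(nullptr, ENUM_CURRENT_SETTINGS, &dm);
    std::printf("Current mode: %lux%lu @ %lu Hz\n",
                dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    dm.dmDisplayFrequency = 144;          // desired refresh rate
    dm.dmFields = DM_DISPLAYFREQUENCY;    // change only this one field
    if (ChangeDisplaySettings(&dm, 0) == DISP_CHANGE_SUCCESSFUL)
        std::printf("Refresh rate set to 144 Hz\n");
    else
        std::printf("Mode change failed\n");
    return 0;
}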
 
Joined
Mar 23, 2005
Messages
4,082 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Can't find any Ryzen core-scaling analysis for Doom Eternal, but it's clear the game doesn't need more than a 4-core/8-thread CPU, which is just disappointing :eek:
Doom 2016 didn't run badly even on the old FX processors back in 2016.

I am sure a patch will rectify that core count. Next-gen gaming consoles WILL utilize more than 8 cores, and up to 16 threads if they can.
 
Joined
Jun 2, 2014
Messages
536 (0.14/day)
Location
Midwest USA
System Name Core
Processor Intel 12700k @ 5.2/4.0
Motherboard ASRock z690 Steel Legend
Cooling Arctic Cooling Freezer 420 AiO
Memory GSkill 64GB 3200 cas 14 b die
Video Card(s) ASRock Intel ARC a750
Storage Optane 900p x2, SK Hynix p41 Pro, p31 Pro
Display(s) ACER 250hz 1080p 25" IPS display/AOC 22" display
Case Phanteks p500a with all Arctic/Thermaltake fans
Audio Device(s) Focusrite interface, Presonus Studio Monitors and Subwoofer
Power Supply Seasonic 850w plat with cable mod cables
Mouse Logitech G502 Hero
Keyboard Corsair mech k65
Software Win 11 Pro
I'm sure this game is great and I plan to play it, but honestly, I'm more interested in Doom 64 on PC!
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,590 (2.90/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3466
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage A pack of SSDs totaling 3.2TB + 3TB HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus ROG Strix Edge Nordic
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
So it seems the 980 Ti will run it more than fine at 1080p. I need to get this game soon, since the last Doom was hella great!
 
Joined
Jan 31, 2010
Messages
5,535 (1.03/day)
Location
Gougeland (NZ)
System Name Cumquat 2021
Processor AMD RyZen R7 7800X3D
Motherboard Asus Strix X670E - E Gaming WIFI
Cooling Deep Cool LT720 + CM MasterGel Pro TP + Lian Li Uni Fan V2
Memory 32GB GSkill Trident Z5 Neo 6000
Video Card(s) Sapphire Nitro+ OC RX6800 16GB GDDR6 2270Cclk / 2010Mclk
Storage 1x Adata SX8200PRO NVMe 1TB gen3 x4 1X Samsung 980 Pro NVMe Gen 4 x4 1TB, 12TB of HDD Storage
Display(s) AOC 24G2 IPS 144Hz FreeSync Premium 1920x1080p
Case Lian Li O11D XL ROG edition
Audio Device(s) RX6800 via HDMI + Pioneer VSX-531 amp Technics 100W 5.1 Speaker set
Power Supply EVGA 1000W G5 Gold
Mouse Logitech G502 Proteus Core Wired
Keyboard Logitech G915 Wireless
Software Windows 11 X64 PRO (build 23H2)
Benchmark Scores it sucks even more less now ;)
So it seems the 980 Ti will run it more than fine at 1080p. I need to get this game soon, since the last Doom was hella great!

Not just the 980 Ti; the RX 580 & RX 590 as well.
 
Joined
Feb 18, 2013
Messages
2,181 (0.51/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) Zotac RTX2060 Twin Fan 6GB GDDR6 (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
Software Windows 11 Professional x64 (Update 23H2)
Good to know that an RTX 2060 is plenty for Ultra Nightmare at both 1080p & 1440p.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,981 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
Is anyone else having trouble with DE flicking their 144Hz monitor down to 60Hz? It doesn’t seem to matter what I do, every time I launch the game in either full screen or borderless window modes it changes my refresh rate to 60Hz.

I’m using the latest Radeon drivers which came out the other day with Windows 10 all patched up.

If you bought the game on Steam and have the built-in Steam Overlay FPS counter enabled, disable it.
 
Joined
Mar 20, 2019
Messages
427 (0.21/day)
Location
Australia
System Name Ryzen
Processor AMD Ryzen 7 5700X
Motherboard Asus TUF Gaming B550-Plus (Wi-Fi)
Cooling Cryorig H7
Memory Kingston Fury Beast DDR4 3200MHz 2x8GB + 2x16GB
Video Card(s) Sapphire NITRO+ Radeon RX 6700 XT GAMING OC
Storage WD_Black SN850 500GB NVMe SSD + Adata XPG SX8200 Pro 512GB NVMe SSD
Display(s) Gigabyte G27QC
Case NZXT H510 Flow
Audio Device(s) SteelSeries Arctis Prime
Power Supply Corsair RM650x Gold 650W
Mouse Logitech G502 X
Keyboard HyperX Alloy FPS Cherry MX Blue
Software Windows 11 Pro
If you bought the game on Steam and have the built-in Steam Overlay FPS counter enabled, disable it.
Thanks, yes I bought it on Steam. I always have the overlay set to off though, so no such luck for me yet.
 
Joined
Jun 27, 2019
Messages
2,105 (1.07/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
good to know that a RTX2060 is plenty enough for Ultra Nightmare in both 1080p & 1440p.

There is a VRAM limiter in the game though, so you might not be able to max out the textures.
At least I don't see any way to bypass it: the game won't let you apply the settings unless they fit within the VRAM limit, even if they're over by only a few MB.


This is why I asked how the game was tested on supposedly 'highest' settings on 3-4GB cards.
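
For the curious, here's roughly what such a limiter could look like under the hood. This is a speculative sketch, not id's actual code; it assumes Vulkan 1.1+ with the VK_EXT_memory_budget device extension enabled, sums the budgets of the device-local (VRAM) heaps, and hard-rejects any settings total that doesn't fit, even by a few MB:

#include <vulkan/vulkan.h>

// Speculative sketch of a VRAM limiter -- not id Tech's actual code.
// Assumes Vulkan 1.1+ and the VK_EXT_memory_budget device extension.
bool fitsInVramBudget(VkPhysicalDevice gpu, VkDeviceSize requestedBytes)
{
    VkPhysicalDeviceMemoryBudgetPropertiesEXT budget{};
    budget.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MEMORY_BUDGET_PROPERTIES_EXT;

    VkPhysicalDeviceMemoryProperties2 props{};
    props.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MEMORY_PROPERTIES_2;
    props.pNext = &budget;
    vkGetPhysicalDeviceMemoryProperties2(gpu, &props);

    // Sum the OS-reported budget of every device-local (VRAM) heap.
    VkDeviceSize available = 0;
    for (uint32_t i = 0; i < props.memoryProperties.memoryHeapCount; ++i)
        if (props.memoryProperties.memoryHeaps[i].flags & VK_MEMORY_HEAP_DEVICE_LOCAL_BIT)
            available += budget.heapBudget[i];

    // Hard reject, even when the overshoot is only a few MB --
    // consistent with the behavior described above.
    return requestedBytes <= available;
}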
 
Joined
Feb 18, 2009
Messages
1,825 (0.32/day)
Location
Slovenia
System Name Multiple - Win7, Win10, Kubuntu
Processor Intel Core i7 3820 OC@ 4.0 GHz
Motherboard Asus P9X79
Cooling Noctua NH-L12
Memory Corsair Vengeance 32GB 1333MHz
Video Card(s) Sapphire ATI Radeon RX 480 8GB
Storage Samsung SSD: 970 EVO 1TB, 2x870 EVO 250GB,860 Evo 250GB,850 Evo 250GB, WD 4x1TB, 2x2TB, 4x4TB
Display(s) Asus PB328Q 32" 1440p@75Hz
Case Cooler Master CM Storm Trooper
Power Supply Corsair HX750, HX550, Galaxy 520W
Mouse Multiple, Razer Mamba Elite, Logitech M500
Keyboard Multiple - Lenovo, HP, Dell, Logitech
I thought that Vulkan and DX12, being modern APIs that supposedly require only a thin driver, would make GPU-manufacturer driver-game babysitting a thing of the past!

But no, that doesn't look to be completely the case. The down-to-the-metal optimizations should ALL be handled by the game developers; that's exactly the access and responsibility the newer APIs promised to hand over, but it apparently didn't fully happen, so it's still an abstraction layer, just a "better" one.
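
As a minimal illustration of what "thin driver" is supposed to mean in practice (a sketch in plain Vulkan, nothing game-specific): under the new APIs it's the application, not the driver, that inspects the hardware and picks a memory type for every allocation, a decision the old driver stacks made behind your back.

#include <vulkan/vulkan.h>
#include <cstdint>

// Under Vulkan the application chooses the memory type itself:
// the driver just reports what the hardware offers.
uint32_t findMemoryType(VkPhysicalDevice gpu,
                        uint32_t allowedTypeBits,      // from VkMemoryRequirements
                        VkMemoryPropertyFlags wanted)  // e.g. DEVICE_LOCAL for VRAM
{
    VkPhysicalDeviceMemoryProperties mem{};
    vkGetPhysicalDeviceMemoryProperties(gpu, &mem);
    for (uint32_t i = 0; i < mem.memoryTypeCount; ++i)
        if ((allowedTypeBits & (1u << i)) &&
            (mem.memoryTypes[i].propertyFlags & wanted) == wanted)
            return i;
    return UINT32_MAX; // caller must handle "no suitable memory type"
}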

It's so weird when GPU driver release notes contain fixes for some edge case in some game: "corruption is seen in XYZ game when opening a menu". Why on earth would that ever be a driver problem? If it is, the approach is wrong; if it isn't, the fix shouldn't live in the driver but in whichever component the root cause actually sits in, which is most likely the game. Yet with the current system it's always framed as "the driver isn't doing enough", missing the point and never asking whether the driver should be doing this at all. Even when everyone knows it's a game fault, we choose to fix it in the driver. That's a cheap way to get past the problem, and it makes devs less motivated because they expect the GPU manufacturer to fix things, while the system is built so that for many issues the driver is the only place a fix can land. Weird on so many levels: the driver shouldn't have such wide-ranging responsibilities, and there are no rules in the industry about what can and cannot go into a driver to keep things simple. GPU drivers are among the biggest of all; just look at how many megabytes, MEGABYTES, the DLLs weigh in at: 20-40 MB, a freakshow compared to everything else.
Ask instead what API calls the game is sending, or what else it's doing, that triggers the bug. GAMES ARE THE BIG BULK; games are the CARGO of weight and complexity. It should always be the game checking its compatibility with the OS/API/driver/hardware, not the other way around. Is the cargo strapped into the airplane correctly? If not, fix the cargo; don't add another engine, extend a wing, or bolt on counterweights to balance a badly loaded hold. If it's compatible, it JUST WORKS. If the GPU freezes or crashes, it should never be the API/OS/driver's fault, because those components have to be DESIGNED to be as reliable, smooth, and simple as possible; the only room for nitty-gritty, down-to-the-metal optimization should be in the game itself, nowhere else.

Obviously one company can't serve 1,000 games out there to their fullest potential; this whole driver-babysitting model is a fundamentally suboptimal approach for practical end-user usage.

The industry keeps chugging along with this terrible method of piling so much responsibility onto the driver, babysitting each and every game, having to "support" every new release. Give me a break: the game supports the API, the game supports the OS, the game supports the GPU, the GPU supports the API, and the driver translates the game's instructions through the API into GPU instructions. It ought to JUST WORK, right?! Why so much fiddling and diddling with the freaking driver, the mystery middleman? Why so much drama around the transporter? From an outside, practical point of view this makes no sense, but sometimes it takes that kind of view from afar, rather than expertise in the details, to see it; the individual experts may not realize it and just go along as if that's how it's supposed to be. The transport/conversion layer should always be smooth, fast, reliable, and simple.


It could also be a GPU hardware problem: if a game or two coincidentally uses some pattern of commands that causes the GPU to produce corrupted output... guess what, IT'S THE GPU'S FAULT, not the driver's. Leave the driver alone and fix your broken hardware, you cheapskates. In the practical economic world, of course, they'd rather poke the driver to patch around it, and users who bought the card recently don't want to replace it either; that's just the reality.
If such things did happen, it would be a failure of quality assurance: not testing the combinations of commands fed into the GPU. With today's AI and supercomputer automation that's close to a non-issue to test for, so such fixes would be very rare.

Continuing: nothing else requires such an insane amount of driver maintenance as the graphics department does. This has been plaguing the field, and I think it's why there's so much drama around benchmarks and performance.

The only things a GPU manufacturer's thin driver would handle are general concerns: HW/OS/API compatibility, so that fullscreen modes, super resolution, scaling, support for newer API versions, and other infrastructural things keep working as the OS and hardware evolve. You would update it far less often than now, to support a new API version and that's it, not each new GAME!!! :/ And that one update should suffice for every game using the updated API.
Done properly and tested right, there wouldn't be much room for bugs anyway, and the bugs that did exist wouldn't hit specific games in such specific ways; general, larger bugs would be very noticeable, affect a lot of people, and get traced down and fixed relatively fast. The driver should never go into nitty-gritty, extremely game-specific details, which is what turns this from a GPU war into a DRIVER WAR!!!



Do you have to update the mouse driver to make the mouse "support" a game that runs over 300 FPS?
Do you have to optimize the mouse driver when you choose a new mouse-pointer style?
Do you have to optimize the keyboard driver so you can press 10x more keys in a highly competitive FPS game?
Do you have to upgrade the CPU driver when you load a new program that uses modern instructions?
Do you have to update the network driver to support a new Cat 7 Ethernet cable?
Do you have to...

No, you don't! Everywhere else, IT JUST WORKS for what it's designed for, unless the driver is simply badly made by low-paid devs, usually on cheap peripherals.
 
Last edited:
Joined
Dec 30, 2010
Messages
2,194 (0.43/day)
It's so weird when there are fixes in the driver for some edge case in some game, "corruption is seen in XYZ game when opening a menu"... why is that a driver problem? It should be a game problem, or the OS's, or whatever it is; a thin driver shouldn't have that kind of responsibility, IMO. And because GPU manufacturers take it upon themselves, everyone sits and waits for their fixes, and obviously one company can't serve 1,000 games to their fullest potential; this whole driver babysitting is a fundamentally wrong approach.

Yes, but it would give AMD or Nvidia a bad rep if a newly released game doesn't function properly, wouldn't it? I mean, you read everywhere about "5700 XT driver issues" blah blah; no sir, it's the game that was built badly, and the drivers end up fixing issues that were caused by the game in the first place.

Vulkan, Mantle, DX12: it's nothing new, really. Back in the C64 days they were already applying "tactics" to get the utmost from that tiny base hardware:


With all the computational power a GPU such as the 580/590 has, you'd think you could do even better than what Eternal now does at Ultra / WQHD. It all depends on how far a programmer is willing to go. But they don't, really, because they have to account for so many different PC configurations just to make the game run in the first place.

Console gaming can actually look better than PC in a way, because consoles have a fixed set of hardware, and to get the best out of it you program as if you're talking to the chip itself. This is why Vulkan is such a wonderful concept: you can simply extract more from the hardware, and AMD chips tend to perform best under it.


The PS2 only had a 4 MB, 150 MHz GPU, but once devs put in the work, they extracted everything that was possible from such a tiny, 32 MB console.


PS3, same story: a G70-based GPU, basically a 7800. But once devs got down into it, they really pulled out the potential the GPU had.

Bottom line: game devs have schedules, targets, and timeframes in which profit has to be made, so they usually go for a generic approach, leaving lots of potential behind or to be patched in later. PUBG was a good example: ran terribly at the beginning, runs perfectly now.
 
Joined
Dec 5, 2013
Messages
633 (0.16/day)
Location
UK
Can't find any Ryzen core-scaling analysis for Doom Eternal, but it's clear the game doesn't need more than a 4-core/8-thread CPU, which is just disappointing :eek:
Why is it "disappointing" for a developer to be so good at coding that they can hit 200fps in 2020 games with just a 4/8 CPU? A genuinely well optimised game is one that "does the most with the least", not one which has 16x threads filled with cr*ppy code or because the publisher wanted 10x layers of CPU-heavy virtualisation based DRM in. I have far more respect for id Software who produce amazingly well optimized 200fps Vulkan games, seem to consistently get +2.0-2.5x fps per core and end up universally GPU bottlenecked than I do certain other lazy developers like Ubisoft who can't even hit half that frame-rate given twice the horsepower even when reusing the same engine they're supposed to have a decade's worth of 'experience' with...
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.68/day)
Location
Ex-usa | slava the trolls
Why is it "disappointing" for a developer to be so good at coding that they can hit 200fps in 2020 games with just a 4/8 CPU? A genuinely well optimised game is one that "does the most with the least", not one which has 16x threads filled with cr*ppy code or because the publisher wanted 10x layers of CPU-heavy virtualisation based DRM in. I have far more respect for id Software who produce amazingly well optimized 200fps Vulkan games, seem to consistently get +2.0-2.5x fps per core and end up universally GPU bottlenecked than I do certain other lazy developers like Ubisoft who can't even hit half that frame-rate given twice the horsepower even when reusing the same engine they're supposed to have a decade's worth of 'experience' with...

Because using more cores means more realism, more AI, more physics.
Because the mainstream is at least 6-core/12-thread today, with many people already rocking 12-core/24-thread and 16-core/32-thread.
 
Joined
Dec 30, 2010
Messages
2,194 (0.43/day)
Because using more cores means more realism, more AI, more physics.
Because the mainstream is at least 6-core/12-thread today, with many people already rocking 12-core/24-thread and 16-core/32-thread.

Point is: with a game like Doom there's supposedly so much potential to extract from all those cores, and a race over who has the most of them, yet once properly optimized you can get away with a 4-core/8-thread and still hit 200 FPS in-game.

This is why Mantle was created in the first place.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.68/day)
Location
Ex-usa | slava the trolls
Point is: with a game like Doom there's supposedly so much potential to extract from all those cores, and a race over who has the most of them, yet once properly optimized you can get away with a 4-core/8-thread and still hit 200 FPS in-game.

This is why Mantle was created in the first place.

With outdated graphics. This engine is like 5-year-old technology.
 

Just4Gamerstube1991

New Member
Joined
Mar 21, 2020
Messages
3 (0.00/day)
TechPowerUp remains my go-to for benchmarks; its numbers are always right on the mark in terms of FPS. Congratulations to the id Tech team for making an amazing gem of a series. As for other game developers: you need to take some notes, because this is how you make a game for the PC platform. 45 fps at 4K max settings on a 1660 Ti is amazing.

I honestly can't believe how well both Eternal and Doom 2016 perform; they're both excellent PC versions. It just goes to show you don't need expensive hardware to pull in good numbers: if a PC port is in working condition both inside and out, it should perform well across a wide range of hardware.
 
Joined
Feb 18, 2009
Messages
1,825 (0.32/day)
Location
Slovenia
System Name Multiple - Win7, Win10, Kubuntu
Processor Intel Core i7 3820 OC@ 4.0 GHz
Motherboard Asus P9X79
Cooling Noctua NH-L12
Memory Corsair Vengeance 32GB 1333MHz
Video Card(s) Sapphire ATI Radeon RX 480 8GB
Storage Samsung SSD: 970 EVO 1TB, 2x870 EVO 250GB,860 Evo 250GB,850 Evo 250GB, WD 4x1TB, 2x2TB, 4x4TB
Display(s) Asus PB328Q 32' 1440p@75hz
Case Cooler Master CM Storm Trooper
Power Supply Corsair HX750, HX550, Galaxy 520W
Mouse Multiple, Razer Mamba Elite, Logitech M500
Keyboard Multiple - Lenovo, HP, Dell, Logitech
Why don't the GPU manufacturers pay or send people over to game devs to get it right in the first place then? :p
 
Joined
Dec 31, 2009
Messages
19,371 (3.57/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Well, I always change the settings; on AMD, historically, setting Texture Filtering Quality to High gives better performance.
One needs to test the settings and see where the gains are. I always do it, because every frame is precious on lower-end hardware.

But for an apples-to-apples comparison the default might be right. :)
Apples to apples, now you're thinking! Reviews can't cover settings for every user; most people leave these things at default. Me, I actually set texture filtering from High to High Quality (Nvidia), but that performance impact is negligible anyway.
 
Joined
Dec 5, 2013
Messages
633 (0.16/day)
Location
UK
Because using more cores means more realism, more AI, more physics.
It doesn't, though. It has the potential to mean that, but hardly any devs code for it; modern gaming is the same "lowest common denominator" it's been since PC exclusives turned into console-first cross-platform titles in the 2000s, and half the time it's a case of "the more you give them, the more they waste", variable-quality ports, or simply conflicting priorities. Even today, ask people who've been gaming on PC since the 90s which games are memorable for great AI (or for cleverly done scripts spoofing the feel of enemies doing clever stuff) and you still hear "FEAR 1" or "Half-Life 2" more than the latest titles. Even No One Lives Forever (2000, same LithTech engine family as FEAR) had enemies flipping over tables and hiding behind them, reacting to lights being turned on in adjacent rooms and doors left open, tracking your footprints in the snow, etc., on one 1GHz Pentium 3 core. Thief (1998) had 11 visibility states and some of the most accurate sound-propagation physics in PC gaming history; Thief (2014), in comparison, was dumbed down to 3 visibility states and half-sized levels despite having 128x more RAM to play with, plus a super-buggy audio engine. Fully destructible environment physics? Red Faction (2001) did that on P3s and 256MB of RAM...

Likewise, the real bottleneck to "more realism", like having 1,000 unique NPCs each with their own personality, isn't the CPU; it's development time and budget: paying for 1,000 voice actors, quadruple the writers, more mo-cap actors (to avoid a "crowd of clones" all moving the same way at once), etc., versus 10% of the effort bringing in +200% more profit by churning out skins, lootboxes, pay-to-win "booster packs", DLC, and so on. This comment isn't aimed at you personally, but people who've just bought themselves a new 8C/16T toy thinking it'll magic up some super-AI out of thin air to fill those 50-75% idling cores are being staggeringly naive about what really drives game development. We're not short of CPU horsepower; we're short of quality, non-lazy developers, and all the Threadrippers in the world won't cure that...

As for Doom Eternal, if even a 4C/8T CPU hits a GPU bottleneck, it may well mean an 8C/16T could potentially get more fps, but you simply can't test for that until GPUs twice as powerful appear. 200 fps on lower-end hardware is literally the exact opposite of a "poorly optimized" game, though, and a lot of people who've just bought an enthusiast CPU fall into the trap of thinking that "a rising tide lifts all boats" (a game so efficient it simply doesn't need more cores to hit 144Hz) is somehow a "bad thing", simply because it doesn't "demo" their new purchase against older hardware that well to other enthusiasts.
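
To put rough numbers on that, here's a toy frame-time model (every figure below is invented purely for illustration): per-frame CPU cost has a serial part plus a part that divides across cores, and delivered fps is capped by the slower of CPU and GPU. Once the CPU drops below the GPU's frame time, extra cores buy nothing:

#include <algorithm>
#include <cstdio>
#include <initializer_list>

// Toy model only -- every number here is made up for illustration.
// CPU cost per frame = serial part + parallel part / cores (Amdahl's law);
// delivered fps is limited by the slower of CPU and GPU.
int main()
{
    const double serial_ms   = 2.0;  // CPU work that can't be spread across cores
    const double parallel_ms = 8.0;  // CPU work that scales with core count
    const double gpu_ms      = 5.0;  // GPU time per frame, fixed by the graphics card

    for (int cores : {2, 4, 8, 16}) {
        double cpu_ms = serial_ms + parallel_ms / cores;
        double fps    = 1000.0 / std::max(cpu_ms, gpu_ms);
        std::printf("%2d cores: CPU %.1f ms/frame -> %3.0f fps delivered\n",
                    cores, cpu_ms, fps);
    }
    return 0;
}

With these made-up numbers a 4-core CPU already ducks under the 5 ms GPU time, so 4, 8, and 16 cores all deliver the same 200 fps; only a faster GPU moves the cap.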
 
Last edited:
Joined
Jun 1, 2011
Messages
4,559 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
I am sure a patch will rectify that core count. Next-gen gaming consoles WILL utilize more than 8 cores, and up to 16 threads if they can.
How can next-gen consoles utilize "more than 8 cores" if they only come with eight cores? :cool:

Because using more cores means more realism, more AI, more physics.
Because the mainstream is at least 6-core/12-thread today, with many people already rocking 12-core/24-thread and 16-core/32-thread.
I don't think "mainstream" means what you think it means, and more cores do not automatically equal more realism, AI, or physics.
 
Last edited: