# DOOM Eternal Benchmark Test & Performance Analysis



## W1zzard (Mar 20, 2020)

DOOM Eternal is the long-awaited sequel to the epic DOOM series. There's even more carnage, and gameplay is super fast-paced. Built upon the id Tech 7 engine, visuals are excellent, and graphics performance is outstanding. We tested the game on all modern graphics cards at Full HD, 1440p and 4K Ultra HD.

*Show full review*


----------



## newtekie1 (Mar 20, 2020)

Another game that fills up 8GB of VRAM but doesn't actually use it.


----------



## IceShroom (Mar 20, 2020)

Why are the 2 most popular cards (GTX 1650/Super) missing?


----------



## W1zzard (Mar 20, 2020)

IceShroom said:


> Why are the 2 most popular cards (GTX 1650/Super) missing?


Just not part of my benchmarking routine; I didn't think they were that popular. Let me see if I can get some runs in for those.


----------



## ribizly (Mar 20, 2020)

As I see it, there is 19-21% between the RTX 2080 and the GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through the driver.


----------



## xkm1948 (Mar 20, 2020)

ribizly said:


> As I see it, there is 19-21% between the RTX 2080 and the GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through the driver.



People always forget there are architecture improvements from Pascal to Turing.


----------



## W1zzard (Mar 20, 2020)

ribizly said:


> As I see it, there is 19-21% between the RTX 2080 and the GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through the driver.


Turing shaders can do FP + INT at the same time


----------



## ribizly (Mar 20, 2020)

xkm1948 said:


> People always forget there are architecture improvements from Pascal to Turing.



That is okay. But then it should have been there at release as well. It is simply a driver tweak against the "older" cards.



W1zzard said:


> Turing shaders can do FP + INT at the same time



What are FP and INT?


----------



## EarthDog (Mar 20, 2020)

ribizly said:


> What are FP and INT?


Floating Point and Integer.


----------



## W1zzard (Mar 20, 2020)

ribizly said:


> What are FP and INT?


Floating point + integer calculations.

So you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations at once on each GPU core. If game (or driver) code is properly crafted to exploit that capability, you can gain a lot of performance. That's why most recent games run much better on Turing.
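To picture why that helps, here's a toy issue-slot model (illustrative only; the mix of 36 INT ops per 100 FP ops is the average NVIDIA quoted for shader workloads when introducing Turing — real gains depend on dependencies, occupancy, and memory stalls):

```python
# Toy model: a Pascal-style core issues FP *or* INT each cycle,
# while a Turing-style core can issue one FP and one INT together.
def cycles(fp_ops, int_ops, dual_issue):
    if dual_issue:
        # FP and INT pipes run side by side; the longer stream dominates.
        return max(fp_ops, int_ops)
    # A single shared issue slot: every op takes its own cycle.
    return fp_ops + int_ops

fp, integer = 100, 36  # ~36 INT ops per 100 FP ops in typical shaders
pascal_like = cycles(fp, integer, dual_issue=False)  # 136 cycles
turing_like = cycles(fp, integer, dual_issue=True)   # 100 cycles
print(f"speedup: {pascal_like / turing_like:.2f}x")  # prints: speedup: 1.36x
```

A real GPU is far messier than this, which is why the actual gain varies from game to game in the benchmarks.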


----------



## hamstertje (Mar 20, 2020)

The charts show a 5600 XT with 8 GB?
Wondering how the 5500 XT with 8 GB compares to the 4 GB version and to the 580 and 590 in performance.


----------



## EarthDog (Mar 20, 2020)

@W1zzard - Is there an integrated benchmark here? If not, how did you test? Apologies if I missed it in glancing over the article.


----------



## ribizly (Mar 20, 2020)

W1zzard said:


> Floating point + integer calculations.
> 
> So you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations at once on each GPU core. If game (or driver) code is properly crafted to exploit that capability, you can gain a lot of performance. That's why most recent games run much better on Turing.



Now I see. Thanks. That explains a lot.


----------



## londiste (Mar 20, 2020)

ribizly said:


> As I see there is 19-21% between RTX2080 and GTX1080Ti. How? It looks like Nvidia breaks GTX1080Ti from driver.


In addition to FP+INT, there is also Variable Rate Shading that idTech definitely supports and very likely uses.

Edit:
There might be other features they are using, Rapid Packed Math (2*FP16 in place of FP32) comes to mind.


----------



## MKRonin (Mar 20, 2020)

"The good thing is that our results show no major loss of performance (due to VRAM) for GTX 1060 3 GB and RX 570 4 GB. What's surprising though is that RX 5500 XT 4 GB is doing much worse than expected. My best guess is that AMD's VRAM management for Navi isn't as refined yet as that for Polaris. At least the game doesn't crash when VRAM is exceeded, and continues to run fine."

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 cards, so they can run PCIe 3.0 x16.


----------



## mouacyk (Mar 20, 2020)

ribizly said:


> Now I see. Thanks. That explains a lot.


The technical term for it is concurrent execution of floating-point and integer operations. It was only made possible by a hardware change in Turing: moving the INT32 blocks into separate units. https://hexus.net/tech/reviews/grap...g-architecture-examined-and-explained/?page=2

As support for new hardware features matures, performance will pull away from the last generation -- which leaves a bit of a bitter aftertaste too.


----------



## W1zzard (Mar 20, 2020)

hamstertje said:


> The charts show a 5600 XT with 8 GB?


Whoops, fixed



hamstertje said:


> Wondering how the 5500 XT with 8 GB compares to the 4 GB version and the 580 and 590 in perfomance


Should be roughly between RX 580 and RX 590 I'd say



londiste said:


> Variable Rate Shading that idTech definitely supports and very likely uses.


I doubt they would secretly enable that as it would reduce image quality (if only a small bit)



EarthDog said:


> Is there an integrated benchmark here? If not, how did you test?


No integrated benchmark, just play the game, find a good scene and keep playing that.



MKRonin said:


> The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, where as the 1060 and 570 are x16 lane cards, so they can run PCIe 3.0 x16.


Very good point, let me mention that in the review


----------



## Sithaer (Mar 20, 2020)

I have a question about the VRAM limit when the game was tested on Ultra Nightmare.

Was the texture quality lowered on 3-4GB cards with everything else left on max?

I'm playing the game on an RX 570 4GB at 2560x1080, and with that I'm unable to use High textures because the in-game counter goes over the VRAM limit by 11 _'yes, 11...'_ MB and it tells me to lower stuff, or else it won't let me apply the settings.
So now I'm playing with Medium textures. I could lower Shadows to Low and use High textures, but I kinda prefer a more balanced setup. _'Luckily I can't really see a difference between Medium and High, but still.'_
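The behaviour described above suggests a hard budget check. Here's a hypothetical sketch (the setting names and MB costs are made up for illustration; this is not id's actual code):

```python
# Hypothetical VRAM-budget check: the menu refuses to apply settings whose
# estimated cost exceeds the card's budget, even by a few MB.
BUDGET_MB = 4096  # e.g. an RX 570 4 GB

ESTIMATED_COST_MB = {          # made-up per-setting estimates
    ("base", "-"): 1300,       # framebuffers etc. at 2560x1080
    ("textures", "high"): 1907,
    ("textures", "medium"): 1500,
    ("shadows", "high"): 900,
    ("shadows", "low"): 600,
}

def can_apply(settings):
    """Return (allowed, overshoot_mb); allowed only if the estimate fits the budget."""
    total = sum(ESTIMATED_COST_MB[s] for s in settings)
    return total <= BUDGET_MB, total - BUDGET_MB

# High textures + high shadows: over budget by just 11 MB -> blocked anyway.
print(can_apply([("base", "-"), ("textures", "high"), ("shadows", "high")]))   # (False, 11)
# Dropping textures to medium brings it back under budget.
print(can_apply([("base", "-"), ("textures", "medium"), ("shadows", "high")])) # (True, -396)
```

The point is that the comparison is strict: 11 MB over is treated exactly the same as 1 GB over.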


----------



## Makaveli (Mar 20, 2020)

MKRonin said:


> "The good thing is that our results show no major loss of performance (due to VRAM) for GTX 1060 3 GB and RX 570 4 GB. What's surprising though is that RX 5500 XT 4 GB is doing much worse than expected. My best guess is that AMD's VRAM management for Navi isn't as refined yet as that for Polaris. At least the game doesn't crash when VRAM is exceeded, and continues to run fine."
> 
> The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 cards, so they can run PCIe 3.0 x16.



Would be nice to see the numbers for the 5500 XT in a Ryzen system, since PCIe 4.0 x8 = PCIe 3.0 x16 in bandwidth.

But the test rig is Intel, so hopefully another site runs this on a PCIe 4.0 board.
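The bandwidth equivalence is easy to check from the per-lane rates (8 GT/s for Gen3 and 16 GT/s for Gen4, both with 128b/130b encoding; a rough calculation ignoring protocol overhead):

```python
# Approximate usable bandwidth per lane, per direction, in GB/s:
# raw GT/s * 128/130 encoding efficiency, divided by 8 bits per byte.
PER_LANE_GBPS = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

def link_bw(gen, lanes):
    return PER_LANE_GBPS[gen] * lanes

print(f"Gen3 x16: {link_bw('3.0', 16):.1f} GB/s")  # 15.8 GB/s
print(f"Gen4 x8:  {link_bw('4.0', 8):.1f} GB/s")   # 15.8 GB/s -- same link bandwidth
print(f"Gen3 x8:  {link_bw('3.0', 8):.1f} GB/s")   # 7.9 GB/s -- the 5500 XT on this rig
```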


----------



## Cheeseball (Mar 20, 2020)

@W1zzard Looks like your benchmark results are in line with what I'm getting on my 2080 Super (442.74) and RX 5700 XT (Pro 20.Q1.1).

I'm sure if I were on the latest Adrenalin I'd see more FPS from driver optimizations.


----------



## Vya Domus (Mar 20, 2020)

W1zzard said:


> Floating point + integer calculations.
> 
> So you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations at once on each GPU core. If game (or driver) code is properly crafted to exploit that capability, you can gain a lot of performance. That's why most recent games run much better on Turing.



Just for clarification: it's not that Turing does an extra integer operation for every floating-point operation; it's that the two can occur concurrently within the same clock cycle. Before, the scheduling logic was simpler and allowed either floating-point or integer computations within one clock cycle.

To be fair, I reckon the real-world gain from this is modest, because usually after one clock cycle of doing something floating-point related you probably had to compute a set of addresses in the next cycle anyway, which is why they never bothered with this until now.


----------



## IceShroom (Mar 20, 2020)

W1zzard said:


> Just not part of my benchmarking routine; I didn't think they were that popular. Let me see if I can get some runs in for those.


According to the Steam survey, the GTX 1650 alone is more popular than the RX 570, even the RX 580. So it deserves its place in the benchmark chart.



ribizly said:


> As I see it, there is 19-21% between the RTX 2080 and the GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through the driver.


Because Turing is the first Nvidia architecture to fully support low-level APIs like D3D12/Vulkan, and as a result it doesn't have a performance penalty like Maxwell/Pascal.


----------



## Flanker (Mar 20, 2020)

Seems... optimized? At 1920x1200 60 Hz, it looks like my GTX 1080 will be good enough for a long time yet.


----------



## renz496 (Mar 20, 2020)

ribizly said:


> As I see it, there is 19-21% between the RTX 2080 and the GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through the driver.


Turing is better in the id Tech engine than Pascal; that's not that surprising. You can see a similar performance difference with Doom 2016 as well.


----------



## ARF (Mar 20, 2020)

RX 5700 XT is 10% faster than Radeon VII at 1920x1080.
While the Radeon VII is 5% faster than RX 5700 XT at 3840x2160.





----------



## r.h.p (Mar 20, 2020)

I always think your reviews are pretty spot on, @W1zzard, so I'm a little bit disappointed with this result. I think I'll wait for a $35 sale on Steam.


----------



## thepath (Mar 20, 2020)

The game does not really need more than 6GB of video memory (even if it consumes more than that when you have a 2080 Ti).

There is no issue or stutter on 6GB, regardless of what resolution or settings you play at.

On 3GB/4GB cards, it only causes issues when you play at the Ultra Nightmare setting and high resolution. But let's be honest: who is going to play above 1080p on 3GB/4GB cards? Even if you play at 1440p, you can always lower some settings that don't have much impact on visuals.
There's no big difference between the GTX 1060 3GB and 6GB at 1080p (keep in mind the 6GB model has a few more CUDA cores); just lowering a few unimportant settings should put it easily on par with the GTX 1060 6GB frame rate.


----------



## nguyen (Mar 20, 2020)

Now I'm torn between finishing the game and waiting for the RTX patch... knowing that I will have nothing to do for the next 4-6 weeks...


----------



## ARF (Mar 20, 2020)

bigguns said:


> why not CPU performance ? i want to know how much faster is Intel over AMD junkzen in this game



Can't find any Ryzen-vs-Core CPU analysis for Doom Eternal, but it can be seen that the game doesn't need more than a 4-core/8-thread CPU, which is just disappointing.
Doom 2016 ran decently on the old FX processors back in 2016.













Doom Eternal PC Performance Analysis (www.dsogaming.com): Doom Eternal is powered by id Tech 7 and uses the Vulkan API, and it's time to benchmark it and see how it performs on the PC platform.

DOOM Benchmarked: Graphics & CPU Performance Tested (www.techspot.com): Benchmarks: CPU Performance. The Doom reboot is a gift to the PC Master Race. The 4K visuals are amazing, excellent high resolution textures cover every surface, the lighting and shadows...


----------



## puma99dk| (Mar 20, 2020)

I guess these tests are with the protected exe and not the unprotected exe that Bethesda actually forgot to remove.

I can see that my GTX 1080 Ti should still do well here, so maybe I should try this game soon.


----------



## r.h.p (Mar 20, 2020)

bigguns said:


> why not CPU performance ? i want to know how much faster is Intel over AMD junkzen in this game


lol, I'm no brainiac, pal, but what the ….


----------



## ARF (Mar 20, 2020)

r.h.p said:


> lol, I'm no brainiac, pal, but what the ….



He has a point, though. It's interesting to see if Ryzen is faster than Core in this game. It depends on the particular optimisation and IPC here.


----------



## EarthDog (Mar 20, 2020)

ARF said:


> Doom 2016 ran decently on the old FX processors back in 2016.


I think we have different definitions of not bad, lol. The 9590 is at 4.7 GHz and 8c/8t, while a 2c/4t CPU is 3 fps / less than 3% faster (and its minimums are higher). It will play just fine, but when any kind of perspective is put up against them, they look pretty pathetic. A Sandy Bridge quad core beats them.


----------



## r.h.p (Mar 20, 2020)

ARF said:


> He has a point, though. It's interesting to see if Ryzen is faster than Core in this game. It depends on the particular optimisation and IPC here.





bigguns said:


> i want to know how much faster is Intel over AMD junkzen in this game


Spam Comment in my opinion


----------



## infrared (Mar 20, 2020)

Just had an hour or so messing around in this game, it runs weird on my system. Acts like it's CPU bottlenecked pretty badly, and yet CPU usage is very low (about 25%) and spread across all threads nicely. I'm getting 100-120fps almost regardless of graphics settings, 65-70% GPU utilization.

I wonder if this is Denuvo DRM related perhaps, or a quirk with how it's coded and 1st gen ryzen not handling it well? idk.

1800X @ 4.1 GHz all-core, with 3466 MHz 14-14-14-28 (and tight sub-timings) and a 1080 Ti. All other games run beautifully on this combo; I didn't expect performance issues with Doom.

edit - I guess it runs silky smooth at least; I'll turn off the Afterburner overlay and just enjoy it, I think.

*Ignore the above... I think it was an overlay causing the weird problems; my bro was playing on my rig without anything else running and it ran really well. 1440p Ultra Nightmare and it's frequently 130-165 fps in gameplay now. Very happy.*


----------



## EarthDog (Mar 20, 2020)

infrared said:


> Just had an hour or so messing around in this game, it runs weird on my system. Acts like it's CPU bottlenecked pretty badly, and yet CPU usage is very low (about 25%) and spread across all threads nicely. I'm getting 100-120fps almost regardless of graphics settings, 65-70% GPU utilization.
> 
> I wonder if this is Denuvo DRM related perhaps, or a quirk with how it's coded and 1st gen ryzen not handling it well? idk.
> 
> ...



I'd guess the slightly slower IPC and lower clocks are somehow managing to limit things? No idea until we see a CPU comparison.


----------



## spectatorx (Mar 20, 2020)

ARF said:


> Can't find any Ryzen-vs-Core CPU analysis for Doom Eternal, but it can be seen that the game doesn't need more than a 4-core/8-thread CPU, which is just disappointing.
> Doom 2016 ran decently on the old FX processors back in 2016.
> ...




I can confirm from my own experience: running Doom on an FX-6300 + Radeon R9 380 was a more than playable experience, especially on Vulkan. I enjoyed the game a lot and will wait for a decent discount on Eternal.


----------



## Makaveli (Mar 20, 2020)

ARF said:


> RX 5700 XT is 10% faster than Radeon VII at 1920x1080.
> While the Radeon VII is 5% faster than RX 5700 XT at 3840x2160.
> 
> 
> ...



Not really that surprising.

The Radeon VII has more VRAM, and the RX 5700 has higher clocks.

But it's also Vega vs. Navi, so two different architectures.


----------



## sujauktas (Mar 20, 2020)

Vega 56 on par with GTX 1080 on 1440p!  Vega 56 still puffin those leaves, still not loving 1080!


----------



## W1zzard (Mar 20, 2020)

IceShroom said:


> Why are the 2 most popular cards (GTX 1650/Super) missing?


Added GTX 1650 Super


----------



## ARF (Mar 20, 2020)

sujauktas said:


> Vega 56 on par with GTX 1080 on 1440p!  Vega 56 still puffin those leaves, still not loving 1080!



The performance is still where it has been all along since 2017.

Doom 2016 at the RX Vega 64's release in 2017:











AMD Radeon RX Vega 64 8 GB Review (www.techpowerup.com): Our AMD Radeon RX Vega 64 review confirms that the company achieved major performance improvements over their last-generation Polaris and Fiji cards: Vega is faster than the GTX 1080. We tested six different performance configurations of the Vega 64, with surprising results.

Doom 2020:

DOOM Eternal Benchmark Test & Performance Analysis - 26 Graphics Cards Compared (www.techpowerup.com): DOOM Eternal is the long-awaited sequel to the epic DOOM series. There's even more carnage, and gameplay is super fast-paced. Built upon the id Tech 7 engine, visuals are excellent, and graphics performance is outstanding. We tested the game on all modern graphics cards at Full HD, 1440p and 4K...


----------



## Manoa (Mar 20, 2020)

hardware and performance are nice but the game suckx


----------



## mouacyk (Mar 20, 2020)

Manoa said:


> hardware and performance are nice but the game suckx


Yeah, way too much extra for a Doom game. It's trying to be a Borderlands game.


----------



## ARF (Mar 20, 2020)

It would be nice if it has Quake 3 style of multi-player maps with this year graphics.


----------



## GurthBrooks (Mar 20, 2020)

ribizly said:


> As I see it, there is 19-21% between the RTX 2080 and the GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through the driver.


Looks like a 2070 performs the same as a 1080 Ti in Eternal!


----------



## Deleted member 190774 (Mar 20, 2020)

mouacyk said:


> As support for new hardware features matures, performance will pull away from the last generation -- which leaves a bit of a bitter aftertaste too.


Only amongst those who flocked to buy a 1080 Ti while it seemed to offer similar performance to the 2080 for less money in the then-current raft of games. I know there was a lot of negative sentiment around the 2080.

Those that went 2080 will now feel relief that their investment is starting to show some benefits.


----------



## brutlern (Mar 20, 2020)

I can get anywhere between 100 and 250 fps at 1440p Ultra Nightmare (2080 Super). That's a variation of 150 fps. The charts show 160 fps average for a 2080 Super, but unless everyone can run the same benchmark, getting anything close to a meaningful result is an exercise in futility. Is there a proper in-game benchmark? Maybe some console commands?


----------



## EarthDog (Mar 20, 2020)

brutlern said:


> I can get anywhere between 100 and 250 fps at 1440p Ultra Nightmare (2080 Super). That's a variation of 150 fps. The charts show 160 fps average for a 2080 Super, but unless everyone can run the same benchmark, getting anything close to a meaningful result is an exercise in futility. Is there a proper in-game benchmark? Maybe some console commands?


So long as it's the same section repeated for each card, it is meaningful; the takeaway is RELATIVE performance between cards, not absolute fps. Even an integrated benchmark doesn't do a great job of reflecting actual in-game fps either.
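In other words, the chart is about ratios. A quick sketch of how relative standings fall out of any repeatable scene (the fps numbers here are illustrative, not TPU's):

```python
# Divide every card's average fps by a baseline card's fps: the scene's
# absolute difficulty cancels out, leaving relative performance.
fps = {"RTX 2080 Super": 160, "RTX 2070": 128, "GTX 1080 Ti": 126}

baseline = fps["GTX 1080 Ti"]
relative = {card: f / baseline for card, f in fps.items()}

for card, r in sorted(relative.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {r:.0%}")  # e.g. "RTX 2080 Super: 127%"
```

A harder or easier test scene scales every card's fps together, so the percentages stay comparable even when absolute numbers don't match anyone's gameplay.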


----------



## ARF (Mar 20, 2020)

What are the graphics driver settings, though? Radeon Software and Nvidia Control Panel?
There is quite a noticeable difference in the results if you tweak one or two settings.


----------



## Non_NPC (Mar 20, 2020)

Sorry if I missed it, but is there SLI support? Seems like two 2080s would be real sweet at 4K.


----------



## W1zzard (Mar 20, 2020)

ARF said:


> What are the graphics driver settings, though? Radeon Software and Nvidia Control Panel ?
> There is quite noticeable difference in the results if you tweak one or two settings.


For all my testing I use out-of-the-box settings, as that represents 99.9% or more of what people use.


----------



## ARF (Mar 20, 2020)

W1zzard said:


> For all my testing I use out-of-the-box settings, as that represents 99.9% or more of what people use.



Well, I always change the settings; for AMD, historically, setting Texture Filtering Quality to High gives higher performance.
One needs to test the settings and see where the gains are. I always do it, because every frame is precious with lower-end hardware.

But for an apples-to-apples comparison, the defaults might be right.


----------



## dicktracy (Mar 21, 2020)

We don't even need to read CPU/GPU game tests to know which brands will top the charts lolz


----------



## s3thra (Mar 21, 2020)

Is anyone else having trouble with DE flicking their 144Hz monitor down to 60Hz? It doesn’t seem to matter what I do, every time I launch the game in either full screen or borderless window modes it changes my refresh rate to 60Hz.

I’m using the latest Radeon drivers which came out the other day with Windows 10 all patched up.


----------



## Super XP (Mar 21, 2020)

ARF said:


> Can't find any Ryzen-vs-Core CPU analysis for Doom Eternal, but it can be seen that the game doesn't need more than a 4-core/8-thread CPU, which is just disappointing.
> Doom 2016 ran decently on the old FX processors back in 2016.
> ...


I am sure a patch will rectify that core count. Next-gen gaming consoles WILL utilize more than 8 cores, and even up to 16 threads if they can.


----------



## jallenlabs (Mar 21, 2020)

I'm sure this game is great and I plan to play it, but honestly, I'm more interested in Doom 64 on PC!


----------



## Kissamies (Mar 21, 2020)

So it seems that a 980 Ti will run it more than fine at 1080p. I need to get this game soon, since the last Doom was hella great!


----------



## Athlonite (Mar 21, 2020)

Chloe Price said:


> So it seems that a 980 Ti will run it more than fine at 1080p. I need to get this game soon, since the last Doom was hella great!



Not just the 980 Ti; the RX 580 & RX 590 as well.


----------



## Tsukiyomi91 (Mar 21, 2020)

Good to know that an RTX 2060 is plenty for Ultra Nightmare at both 1080p and 1440p.


----------



## Cheeseball (Mar 21, 2020)

s3thra said:


> Is anyone else having trouble with DE flicking their 144Hz monitor down to 60Hz? It doesn’t seem to matter what I do, every time I launch the game in either full screen or borderless window modes it changes my refresh rate to 60Hz.
> 
> I’m using the latest Radeon drivers which came out the other day with Windows 10 all patched up.



If you bought the game on Steam and have the built-in Steam Overlay FPS counter enabled, disable it.


----------



## s3thra (Mar 21, 2020)

Cheeseball said:


> If you bought the game on Steam and if you have the built-in Steam Overlay FPS counter enabled, disable it.


Thanks, yes I bought it on Steam. I always have the overlay set to off though, so no such luck for me yet.


----------



## Sithaer (Mar 21, 2020)

Tsukiyomi91 said:


> Good to know that an RTX 2060 is plenty for Ultra Nightmare at both 1080p and 1440p.



There is a VRAM limiter in the game though, so you might not be able to max out the textures.
At least I don't see any way to bypass it; it won't let you apply the settings unless they're within the VRAM limit, even if they're only a few MBs over.

This is why I asked how the game was tested on supposedly 'highest' settings on 3-4GB cards.


----------



## ador250 (Mar 21, 2020)

Man, Pascal really fucked up. The GTX 1060 is now an RX 570 competitor.


----------



## RoutedScripter (Mar 21, 2020)

I thought that Vulkan and DX12, being modern APIs supposedly requiring only a *thin driver*, would make GPU-manufacturer driver-game babysitting a thing of the past!

But it doesn't look like that's completely the case. The down-to-the-metal optimizations should ALL be handled by the game developers, just as the newer APIs promised more access and responsibility, but it seems that wasn't fully it; there's still abstraction, just "better at it".

It's so weird when there are fixes in GPU drivers for some edge case in some game, like "_corruption is seen in XYZ game when opening a menu_". Why on earth would that ever be a driver problem? If it is, that's the wrong approach; and if it's not, it shouldn't be the driver trying to fix it, but the component where the root cause lies. The game is most likely doing something wrong, but with the current system everyone looks at it as if "*the driver isn't doing enough*", forgetting to ask whether the driver should be doing this at all. Even when everyone knows it's a game fault, we choose to fix it in the driver. It feels like a cheap way to get past the problem, and then the devs become less motivated because they expect the GPU manufacturer to fix it; but the system is made so that for many things the only way to fix it IS in the driver. There are no rules in the industry about what can and cannot go in a driver to keep things simple, and GPU drivers are among the biggest around. Just look at how many megabytes the DLLs are: 20-40 MB, a freakshow compared to everything else.

Games are the big bulk, the CARGO of weight and complexity. It should always be the game ensuring its compatibility with the OS/API/driver/hardware, not the other way around. Is the cargo strapped into the airplane correctly? If not, fix the cargo; don't add another engine, extend the wing, or add counterweight to balance the badly loaded cargo. If it's compatible, it JUST WORKS. If the GPU freezes or crashes, it should never be the API/OS/driver's fault, because those should be designed to be as reliable, smooth, and simple as possible; the only room for nitty-gritty down-to-the-metal optimization should be in the game itself.

Obviously one company can't serve 1000 games out there to their fullest potential; this whole driver-babysitting model is a fundamentally suboptimal approach for practical end-user use.

The industry keeps chugging along with this terrible method of giving the driver so much responsibility, babysitting each and every game, having to "*support*" each new game. Give me a break: the game supports the API, the game supports the OS, the GPU supports the API, and the driver translates the game's instructions through the API into GPU instructions. It ought to JUST WORK, right?! Why so much fiddling and diddling with the driver, the mystery middleman? From an outside practical point of view this makes no sense, but sometimes it takes that kind of distant view rather than expertise in the details; the individual experts may not realize it and just go along as if that's how it's supposed to be. The transport/conversion should always be smooth, fast, reliable, and simple.

It could also be a GPU hardware problem: if a game happens to use some pattern of commands that causes the GPU to produce corrupted output, guess what, IT'S THE GPU'S FAULT, not the driver's. Leave the driver alone and fix your broken hardware. Of course, in the practical economic world they'd rather poke the driver to fix it, and users wouldn't want to replace a recently bought GPU, but that's just the reality. If such things happened, it would be a failure of quality assurance and testing: not testing all the combinations of commands fed into the GPU. With today's automation, such fixes should be very rare.

Continuing: nothing else requires such an insane amount of driver maintenance as the graphics department. This has been plaguing the field, and I think it's why there's so much drama around benchmarks and performance.

The only things a thin GPU-manufacturer driver would do are general ones: HW/OS/API compatibility, so that fullscreen modes, super resolution, scaling, support for newer API versions, and other infrastructural things work properly as the OS and hardware develop. You would update it much less frequently than now: you'd update it to support a new API version and that's it, not for each new GAME! And that update should be fine for all new games using the updated API version.
Done properly and tested right, there wouldn't be much room for bugs anyway, and any bugs wouldn't affect specific games in such a specific manner; these general, larger bugs would be very noticeable, affect a lot of people, and get traced down and fixed relatively fast. The driver shouldn't ever go into nitty-gritty, extremely-game-specific details, which pretty much makes this world not a GPU war but a *DRIVER WAR!!!*

Do you have to update the mouse driver to make the mouse "support" a game that runs over 300 FPS?
Do you have to optimize the mouse driver when you choose a new pointer style?
Do you have to optimize the keyboard driver so you can press 10x more keys in a highly competitive FPS game?
Do you have to upgrade the CPU driver when you load a new program that uses modern instructions?
Do you have to update the network driver to support a new Cat 7 Ethernet cable?
Do you have to...

No, you don't! Everywhere else, IT JUST WORKS for what it is designed for, unless the driver is just badly made by low-paid devs, usually in cheap peripherals.


----------



## Jism (Mar 21, 2020)

> It's so weird when there are fixes in the driver for some edge case in some game, "corruption is seen in XYZ game when opening a menu" ... why is that a driver problem? It should be a game problem, either the game or the OS or whatever it is; a thin driver shouldn't have that kind of responsibility IMO. And because GPU manufacturers take it upon themselves, everyone sits and waits for their fixes, and obviously you can't have one company serving 1000x games out there to their fullest potential; this whole driver babysitting is a fundamentally wrong approach.



Yes, but it would give AMD or Nvidia a bad rep if a newly released game doesn't function properly, wouldn't it? I mean, you read everywhere about "the 5700 XT drivers" blah blah; no sir, it is the game that was done badly, where the drivers have to fix issues that were initially caused by the game.

Vulkan, Mantle, DX12: it's nothing new, really. Back in the C64 days they already applied 'tactics' to get the utter best from that tiny base hardware:










With all the computational power a GPU such as the 580/590 has, you'd say you could even make it look better than what Eternal now does on Ultra / WQHD or so. It all depends on how far a programmer is willing to go. But they don't, really, because they have to take into account so many different PC configs just to make it run in the first place.

Console games could actually look better than PC in a way, because consoles have a fixed set of hardware, and to get the best out of it you have to program as if you're talking to the chip itself. This is why Vulkan is such a wonderful concept: you can simply extract more out of the hardware, and AMD chips tend to perform best with it.










The PS2 only had a 4 MB, 150 MHz GPU, but once devs put their work into it, they really extracted whatever was possible from such a tiny, 32 MB console.










PS3, same story. A G70-based GPU, aka a 7800 or so. But once devs started digging down into it, they really pulled out the potential those GPUs have.

Bottom line: game devs have schedules, targets, and timespans in which there's profit to be made. So they usually go for a generic approach, leaving lots of potential behind or to be patched in later. PUBG was a good example: it ran like shit at the beginning, and runs perfectly now.


----------



## BSim500 (Mar 21, 2020)

ARF said:


> Can't find any CPU Ryzen-Core analysis on Doom Eternal but it can be seen that the game doesn't need more than a 4-core/8-thread which is just disappointing


Why is it "disappointing" for a developer to be so good at coding that they can hit 200fps in 2020 games with just a 4C/8T CPU? A genuinely well optimised game is one that "does the most with the least", not one which has 16 threads filled with cr*ppy code because the publisher wanted 10 layers of CPU-heavy virtualisation-based DRM. I have far more respect for id Software, who produce amazingly well optimised 200fps Vulkan games, seem to consistently get 2.0-2.5x fps per core and end up universally GPU bottlenecked, than I do for certain other lazy developers like Ubisoft, who can't even hit half that frame rate given twice the horsepower, even when reusing the same engine they're supposed to have a decade's worth of 'experience' with...


----------



## ARF (Mar 21, 2020)

BSim500 said:


> Why is it "disappointing" for a developer to be so good at coding that they can hit 200fps in 2020 games with just a 4/8 CPU? A genuinely well optimised game is one that "does the most with the least", not one which has 16x threads filled with cr*ppy code or because the publisher wanted 10x layers of CPU-heavy virtualisation based DRM in. I have far more respect for id Software who produce amazingly well optimized 200fps Vulkan games, seem to consistently get +2.0-2.5x fps per core and end up universally GPU bottlenecked than I do certain other lazy developers like Ubisoft who can't even hit half that frame-rate given twice the horsepower even when reusing the same engine they're supposed to have a decade's worth of 'experience' with...



Because putting more cores to use means more realism, more AI, more physics.
Because the mainstream is at least 6-core/12-thread today, with many people already rocking 12-core/24-thread and 16-core/32-thread.


----------



## Jism (Mar 21, 2020)

ARF said:


> Because more used cores means more realism, more AI, more physics.
> Because the mainstream is at least 6-core/12-thread today, with many people already rocking 12-core/24-thread and 16-core/32-thread.



Point is: with a game like Doom there's so much potential to be extracted from all those cores, and a race over who has the most cores, while once properly optimized you can get away with a 4-core/8-thread and still get 200 FPS in-game.

This is why mantle was created in the first place.


----------



## ARF (Mar 21, 2020)

Jism said:


> Point is; with a game like doom, there's so much potential to be extracted from all those cores and race in who has the most cores, while once right optimized you can get away with a 4 core 8 thread and still get 200 FPS ingame.
> 
> This is why mantle was created in the first place.



With outdated graphics. This engine is like 5-year-old technology.


----------



## Just4Gamerstube1991 (Mar 21, 2020)

TechPowerUp remains my go-to for benchmarks; its numbers are always right on the mark in terms of FPS. Also, congratulations to the id Tech team for making an amazing gem of a series, and to other game developers: you guys need to take some notes, because this is how you make a game for the PC platform. 45 fps at 4K max settings on a 1660 Ti is amazing.

I honestly can't believe how well both Eternal and DOOM (2016) perform; they are both excellent ports. It just goes to show you don't need expensive hardware to pull in some good numbers. If a PC port is in working condition both inside and out, it should perform well on a variety of hardware.


----------



## RoutedScripter (Mar 21, 2020)

Why don't the GPU manufacturers pay or send people over to game devs to get it right in the first place then?


----------



## EarthDog (Mar 21, 2020)

ARF said:


> Well, I always change the settings, for AMD historically Texture Filtering Quality set to High gives higher performance.
> One needs to test the settings and see where there are gains. I always do it because every frame is precious with lower end hardware.
> 
> But for apples-to-apples comparison the default might be right.


Apples to apples. Now you're thinking! Reviews can't cover settings for every user. Most leave these things at default. Me, I actually set the texture filtering to high quality from high (Nvidia), but that performance impact is negligible anyway.


----------



## BSim500 (Mar 21, 2020)

ARF said:


> Because more used cores means more realism, more AI, more physics.


It doesn't though. It has the potential to mean that, but hardly any devs code for it; modern gaming is the same "Lowest Common Denominator" it's been since PC exclusives turned into "console-first cross-platforms" in the 2000s, and half the time it's a case of "the more you give them, the more they waste", variable-quality ports or simply conflicting priorities. Even today, ask people who've been gaming on PC since the 90s which games are memorable for great AI (or for cleverly done scripts spoofing the feel of enemies doing clever stuff) and you still hear "FEAR 1" or "Half-Life 2" more than the latest titles. Even No One Lives Forever (2000, same LithTech engine family as FEAR) had enemies flipping over tables & hiding behind them, reacting to lights being turned on in adjacent rooms / doors left open, tracking your footprints in the snow, etc, on one 1GHz Pentium 3 core. Thief (1998) had 11 visibility states and some of the most accurate sound-propagation physics in PC gaming history. Thief (2014) in comparison was dumbed down to 3 visibility states, half-sized levels despite 128x more RAM to play with, and a super buggy audio engine. Fully destructible environment physics? Red Faction (2001) did that on P3s & 256MB RAM...

Likewise, the real bottleneck to "more realism" like having 1,000 unique NPCs each with their own personality isn't CPU, it's the development time / budget: paying for 1,000 voice actors, quadruple the writers, more mo-cap actors (to avoid having 'a crowd of clones' all moving the same way at once), etc, vs 10% of the effort that brings in +200% more profit churning out skins, lootboxes / pay2win "Booster Packs" / DLC, etc. This comment isn't aimed at you personally, but people who've just bought themselves a new 8C/16T toy to play with, thinking it'll magic up some Super AI out of thin air to fill up those 50-75% idling cores, are being staggeringly naive in not grasping what really drives game development. We're not short of CPU horsepower, we're short of quality non-lazy developers, and all the Threadrippers in the world won't cure that...

As for Doom Eternal, if even a 4C/8T CPU hits a GPU bottleneck, it may well mean an 8C/16T could potentially get more fps, but you simply can't test for that until future 2x more powerful GPUs appear. 200fps on lower-end hardware is literally the exact opposite of a "poorly optimized" game though, and a lot of people who've just bought an enthusiast CPU fall into the trap of thinking "a rising tide lifts all boats", i.e., that a game so efficient it simply doesn't need more cores to hit 144Hz is somehow a "bad thing", simply because it doesn't "demo" their new purchase vs older hardware that well to other enthusiasts.


----------



## dirtyferret (Mar 21, 2020)

Super XP said:


> I am sure a patch will rectify that core count. Next Gen Gaming Consoles WILL utilize more than 8 cores and up to even 16 threads if they can.


How can next-gen consoles utilize "more than 8 cores" if they only come with eight cores? 



ARF said:


> Because more used cores means more realism, more AI, more physics.
> Because the mainstream is at least 6-core/12-thread today, with many people already rocking 12-core/24-thread and 16-core/32-thread.


I don't think mainstream means what you think it means and more cores do not equal more realism, ai, or physics.


----------



## EarthDog (Mar 21, 2020)

dirtyferret said:


> How can next gen consoles utilize "more then 8 cores" if they only come with eight cores?


Be prepared for a walkback... where he didn't post the full thought in his head to make it complete.


----------



## dirtyferret (Mar 21, 2020)

EarthDog said:


> be prepared for a walk back... where he didnt post the full thought in his head to make it complete.


Maybe he knows something we don't and you can upgrade the CPU in the new consoles?  Although that would bring up power and cooling questions.  Luckily in console land those issues don't exist.


----------



## Non_NPC (Mar 21, 2020)

Thanks for the clarification!


----------



## nickbaldwin86 (Mar 21, 2020)

The game is amazing and plays so well, butter smooth on my system.

The MP gameplay is meh at best, but I bought it as an SP game, and the storyline brings me back to when I was a kid playing DOOM / DOOM 2 way back when.


----------



## Cheeseball (Mar 21, 2020)

The Slayer challenges on Nightmare are satisfyingly difficult. If enemies get two hits on you without you doing a Glory Kill to recover health, you're gonna be in trouble. You have to strategize between burning them to build armor and giving up Glory Kills to avoid getting hit.


----------



## Makaveli (Mar 21, 2020)

ARF said:


> With outdated graphics. This engine is like 5-year-old technology.



Were you expecting them to create a brand new engine for a sequel?


----------



## nickbaldwin86 (Mar 22, 2020)

Makaveli said:


> Were you expecting them to create a brand new engine for a sequel?



Agreed, and who cares? It is the story that is amazing, and it honestly looks great! For a 5-year-old engine it is aging really well.


----------



## Badelhas (Mar 22, 2020)

"Graphical fidelity of DOOM at the highest "Ultra Nightmare" setting is "good", maybe even "very good", but I'm not seeing anything that looks "next-gen". Many textures are blurry and some models definitely lack geometry. While the structural level design is amazing, I'd definitely have wished for more love when it comes to floor geometry and textures. It also puzzles me why they gave us six graphics presets yet built them in such a way that high-end hardware will be dishing out super high FPS (which could be traded for better graphics quality)."

Exactly my thoughts. Consoles are driving games nowadays and it's a pity. The last game that was really a breakthrough in terms of graphics and destructible physics was Crysis, a PC-only game released in 2007, 13 years ago!


----------



## Frutika007 (Mar 22, 2020)

ribizly said:


> As I see there is 19-21% between RTX2080 and GTX1080Ti. How? It looks like Nvidia breaks GTX1080Ti from driver.





ribizly said:


> That is okay. But then it should have been there at the release as well. It is simply driver tweak against the "older".



I swear to God, there's always someone complaining about the GTX 1080/1080 Ti being nerfed by Nvidia drivers in every single benchmark review. You are dead wrong, dude. The reason the gap wasn't there at release is that no game back then could take full advantage of Turing. As time went by, Turing matured via driver updates, and more games started to take full advantage of its architecture. The GTX 1080/1080 Ti was already at its peak maturity, being two years older than Turing. So as time went by, the performance gap between Pascal and Turing became more and more prominent. So it was NOT A DRIVER TWEAK. NVIDIA DIDN'T NERF ANYTHING.


----------



## lexluthermiester (Mar 22, 2020)

W1zzard said:


> Overall, DOOM Eternal is a solid sequel to the epic series, but I'm not convinced it will achieve as legendary a status as earlier DOOM releases.


But you have to admit, it's up there in both quality and fun. I'm not easily impressed, but this game has me impressed. The original Doom and Doom 2 are classics for sure, but they haven't aged well. Without a source port engine like GZDoom, they're not as impressive as they once were.

Personally, Doom3 is my fav, but this one is quickly rivaling it.



Frutika007 said:


> I swear to God,there's always someone complaining about gtx 1080/1080Ti being nerfed by nvidia driver in every single benchmark test review. You are dead wrong dude. And as the reason of it not being there at the release as well is because there wasn't any game that could take full advantage of turing. As time went by,turing got more matured via driver updates and more games started to take full advantage of turing's architecture. GTX 1080/1080Ti was already at it's peak maturity as it was 2 years older than turing. So as the time went by,the performance gap between the pascal and the turing became more and more prominent. So it was NOT DRIVER TWEAK. NVIDIA DIDN'T NERF ANYTHING.


Let's be cool. W1zzard already explained things to that user and everything is resolved. No need for the yelling and such.


----------



## jihadjoe (Mar 22, 2020)

Ashes of the Benchmark showed us early on that Pascal wasn't very good at async compute. Thing is, no one else was using it back then, so it sort of masked Pascal's weakness in this aspect. Now that we have more games built natively for DX12 and Vulkan, that gap is only going to get bigger. Sucks for those who went for "bargain" 1080 Tis over 2080s.
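W1zzard's point earlier in the thread, that Turing shaders can do FP + INT at the same time, can be sketched with a toy throughput model (the function and cycle counts below are illustrative, not how a real SM schedules work):

```python
def fp_int_speedup(fp_ops, int_ops):
    """Toy model: a Pascal-style shader core issues FP and INT work serially,
    while a Turing-style core can run them on separate pipes concurrently."""
    serial_cycles = fp_ops + int_ops          # one pipe: every op takes a slot
    concurrent_cycles = max(fp_ops, int_ops)  # two pipes: INT hides under FP
    return serial_cycles / concurrent_cycles

# With the roughly 36 integer instructions per 100 FP instructions NVIDIA
# cited for shader workloads, the upper bound from this feature alone is
# 136 / 100 = 1.36x.
```

Real games sit below that bound, since memory and scheduling limits apply too, but it lines up with the ~20% Pascal-to-Turing gap discussed above.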


----------



## harm9963 (Mar 22, 2020)

Doom Eternal runs well on my 1080Ti.


----------



## Ikaruga (Mar 22, 2020)

newtekie1 said:


> Another game that fills up 8GB of VRAM but doesn't actually use it.


I don't understand this statement. Every game should use all the RAM and all the VRAM in the system, at least for caching streamed-in assets if it can't be used for anything else. Leaving VRAM unused is simply bad.
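The "VRAM as cache" idea boils down to an LRU pool with a byte budget: keep recently used assets resident, evict the oldest when a new one needs room. A minimal sketch (the class, names, and sizes are made up for illustration, not id Tech 7 code):

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache for streamed assets under a fixed VRAM budget."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.entries = OrderedDict()  # texture id -> size in bytes

    def request(self, tex_id, size):
        # Cache hit: mark as recently used, no upload needed.
        if tex_id in self.entries:
            self.entries.move_to_end(tex_id)
            return "hit"
        # Miss: evict least-recently-used textures until the new one fits.
        while self.used + size > self.budget and self.entries:
            _, evicted_size = self.entries.popitem(last=False)
            self.used -= evicted_size
        self.entries[tex_id] = size
        self.used += size
        return "miss"
```

With a bigger budget more assets stay resident, so monitoring tools show VRAM "full" on an 8GB card even though the same game runs fine with a smaller budget, just with more re-uploads.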



jihadjoe said:


> Ashes of the Benchmark showed us early on that Pascal wasn't very good at ASYNC compute. Thing is no one else was using it back then so it sort of masked Pascal's weakness in this aspect. Now that we have more games built natively for DX12 and Vulkan that gap is only going to get bigger. Sucks for those who went for "bargain" 1080Tis over 2080s.


Turing has many new features, a better Async compute implementation is just one of those.


----------



## TheoneandonlyMrK (Mar 22, 2020)

Just wondering how to get Ultra Nightmare mode working on an RTX 2060 6GB; I tried, but it wouldn't let me.


----------



## Palladium (Mar 22, 2020)

jihadjoe said:


> Ashes of the Benchmark showed us early on that Pascal wasn't very good at ASYNC compute. Thing is no one else was using it back then so it sort of masked Pascal's weakness in this aspect. Now that we have more games built natively for DX12 and Vulkan that gap is only going to get bigger. Sucks for those who went for "bargain" 1080Tis over 2080s.



I don't play RDR2, but I decided it was time to offload my 1070 at $150 for a $350 2060 Super when I saw Pascal get butchered by Turing/AMD in RDR2. I predicted future games would very likely follow RDR2's performance trends, and looking at Doom Eternal now tells me I made quite the right call.


----------



## harm9963 (Mar 22, 2020)

theoneandonlymrk said:


> Just wondering how to get ultra nightmare mode working on an rtx2060 6Gb , I tried it wouldn't let me.


 Not with 6GB.


----------



## Jism (Mar 22, 2020)

Cheeseball said:


> The Slayer challenges on Nightmare are satisfyingly difficult. If enemies get two hits on you without you doing a Glory Kill to recover health, you're gonna be in trouble. You have to strategize between burning them to build armor and giving up Glory Kills to avoid getting hit.



This ^ I instantly went for Ultra, as I like a challenge, but damn, finally an AI that is very well done.

Runs butter smooth on my 2700X / 64GB / RX 580 / WQHD, using approx 5.5GB of VRAM with sync/fps locked at 72. Not one stutter.

But these games demand so much free space on your hard drive, 40GB, and there's a folder with various sound packs in Spanish, Japanese, French, Polish, Portuguese, Russian and more, taking up unneeded space. If you have an installer, just select the language and download only the packs required instead of dumping everything into one download. We're talking 6GB of unused sound files here.


----------



## Makaveli (Mar 23, 2020)

Ikaruga said:


> I don't understand this statement. Every game should use all the RAM and all VRAM in the system, at least for a caching streamed in stuff if it can't be used for anything else. Leaving VRAM unused is simply bad.
> 
> Turing has many new features, a better Async compute implementation is just one of those.



Agreed, unused RAM is wasted RAM.


----------



## Mister300 (Mar 23, 2020)

Hell, my 5820K with a 5-year-old 390X hits 70-100 fps on my LG 2K 144 Hz VA panel at Ultra Nightmare settings; runs like butter. Vulkan runs great.


----------



## wolf (Mar 23, 2020)

Runs fantastic on my setup, generally 100+ FPS, but the most intense scene/map/fight so far, in Super Gore Nest, has it drop to, at worst, the mid-70s when the sh*t is hitting the fan. And my lord, the load times... what load times? Seems iterative over id Tech 6 rather than transformative, but major kudos for getting it to look better, pack in a lot more detail and content, and remain almost excessively performant. Well done.

3700X, 2x8GB 3200, GTX 1080 UV/OC, NVMe SSD, 2560x1080 @ 200 Hz, FOV 95

Bonus points for excellent 21:9 support, even cutscenes are flawlessly done with the right aspect and full FPS.


----------



## nickbaldwin86 (Mar 23, 2020)

Played last night:
Anyone else having issues with it locking up and going black?

If it goes black, you can hit Esc and it goes right back to normal.

If it locks up, well, CTRL-ALT-DEL and you get to start ALL OVER.

Currently I can't progress because it keeps locking up so much.

Didn't have these issues before the patch that came out yesterday and broke the game.


----------



## TheGuruStud (Mar 23, 2020)

nickbaldwin86 said:


> Played last night:
> Anyone else having issues with it locking up and going black.
> 
> If it goes black you can esc and goes right back to normal.
> ...



I just changed resolution on fresh start up and it locked up lol. 
Never change, Bethesda.


----------



## nickbaldwin86 (Mar 23, 2020)

TheGuruStud said:


> I just changed resolution on fresh start up and it locked up lol.
> Never change, Bethesda.



RIP


----------



## Badelhas (Mar 23, 2020)

nickbaldwin86 said:


> Played last night:
> Anyone else having issues with it locking up and going black.
> 
> If it goes black you can esc and goes right back to normal.
> ...


What update are you referring to?


----------



## TheGuruStud (Mar 23, 2020)

nickbaldwin86 said:


> RIP



Locked up multiple times so far. Didn't make it through the first door LOL. It's a stuttery mess when going over refresh. Vsync doesn't even limit it most of the time. Gonna update drivers and see what happens.


----------



## nickbaldwin86 (Mar 23, 2020)

Badelhas said:


> What update are you referring to?



Well, here is the history... it even says it is to fix the VERY issue I am having... was working great, now totally broken.








TheGuruStud said:


> Locked up multiple times so far. Didn't make it through the first door LOL. It's a stuttery mess when going over refresh. Vsync doensn't even limit it most of the time. Gonna update drivers and see what happens.



I'm on the newest NV drivers. Hate having to play with drivers, uninstall, reinstall and so on...

Will just wait for the next update to hopefully fix the game again.


----------



## TheGuruStud (Mar 23, 2020)

nickbaldwin86 said:


> Well here is the history... even says it is to fix the VERY issue I am having... was working great, now totally broke
> 
> View attachment 149026
> 
> ...


Well... I'm running the launch release... 5700 XT, though.

Wow, updated drivers fixed the crashing/freezing. WTF did you do, Bethesda? More proof hardware vendors have to fix dev BS.
Chill works to limit fps, and the stuttering stopped. Oh, and the game was literally broken: I had to start a new game. Demons didn't spawn and nothing happened lol.
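A software fps cap like Chill (or RTSS) boils down to sleeping away the remainder of each frame interval when vsync can't hold the rate. A minimal sketch of the idea, with made-up names, not any vendor's actual implementation:

```python
import time

def frame_limiter(target_fps):
    """Generator that sleeps just enough each frame to cap the frame rate."""
    frame_time = 1.0 / target_fps
    next_deadline = time.monotonic()
    while True:
        next_deadline += frame_time
        sleep_for = next_deadline - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)  # wait out the rest of this frame's budget
        else:
            next_deadline = time.monotonic()  # fell behind; reset the deadline
        yield

# Game-loop sketch: render, then wait on the limiter before the next frame.
limiter = frame_limiter(72)  # e.g. a 72 fps cap like the post above
```

Capping below the monitor's refresh keeps frame pacing even, which is why the stutter stops when the limiter is on.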


----------



## Cheeseball (Mar 23, 2020)

nickbaldwin86 said:


> Well here is the history... even says it is to fix the VERY issue I am having... was working great, now totally broke
> 
> View attachment 149026
> 
> ...



Try 445.75. Disable Super Resolution, the Steam overlay FPS counter and Windows 10 HDR if you're using any of these.

FWIW, I haven't had any crashes on both the 5700 XT and the 2080 Super on the latest update. I had FPS drops only because I had Steam's overlay FPS counter enabled.


----------



## Super XP (Mar 24, 2020)

harm9963 said:


> Doom Eternal runs well on my 1080Ti.


The Graphics & Game play look Awesome.


----------



## mouacyk (Mar 24, 2020)

Too colorful and bright for Doom.  Feels and plays more like Serious Sam now, than a gloomy doomy game.


----------



## Super XP (Mar 24, 2020)

mouacyk said:


> Too colorful and bright for Doom.  Feels and plays more like Serious Sam now, than a gloomy doomy game.


You can set the color tone as you wish. But looking at it right now, it's a nice, clean-looking game. This DOOM version has realistic picture quality too.


----------



## nickbaldwin86 (Mar 24, 2020)

Cheeseball said:


> Try 445.75. Disable Super Resolution, the Steam overlay FPS counter and Windows 10 HDR if you're using any of these.
> 
> FWIW, I haven't had any crashes on both the 5700 XT and the 2080 Super on the latest update. I had FPS drops only because I had Steam's overlay FPS counter enabled.



Just upgraded to those drivers this morning. I don't have any overlays enabled, but I will try disabling the others.

I'll see if I can play through a level.


----------



## kapone32 (Mar 24, 2020)

My Rock Candy controller gave up the ghost and I am waiting to get into this ordered a new controller on Sunday but won't get it until tomorrow. I can't wait to get into this and a few others with the time I have now.


----------



## newtekie1 (Mar 24, 2020)

Ikaruga said:


> I don't understand this statement. Every game should use all the RAM and all VRAM in the system, at least for a caching streamed in stuff if it can't be used for anything else. Leaving VRAM unused is simply bad.




It's lazy programming, especially in a game like this that is largely linear. And it just leads to people thinking they need more powerful graphics cards than they really do. We'll be seeing people quote this game using 8GB+ of VRAM at 1080p as an excuse for needing 16GB of VRAM in the next generation of graphics cards.


----------



## nickbaldwin86 (Mar 24, 2020)

Cheeseball said:


> Try 445.75. Disable Super Resolution, the Steam overlay FPS counter and Windows 10 HDR if you're using any of these.
> 
> FWIW, I haven't had any crashes on both the 5700 XT and the 2080 Super on the latest update. I had FPS drops only because I had Steam's overlay FPS counter enabled.



Well, I didn't change anything, but the NV drivers updated. Seems to be running great... nothing in the release notes from NV, and the game hasn't updated. Oh well, I will take it either way; it is working great.


----------



## Cheeseball (Mar 25, 2020)

nickbaldwin86 said:


> Well didn't change anything but the NV drivers updated.  seems to be running great... nothing in the release notes from NV. game hasn't updated. Oh well I will take it either way, it is working great so.



Not sure what else to suggest man. Do you have any aggressive anti-virus or background software that could be conflicting with the game/Vulkan? Are there any monitoring software that you're running at the same time? Any special configuration that you have for Windows? Is your overclock still stable in other games?

I'd think *I* would have problems considering I have both AMD and NVIDIA drivers on the same Windows install, but "It Just Works" (tm) for me.


----------



## nickbaldwin86 (Mar 25, 2020)

Cheeseball said:


> Not sure what else to suggest man. Do you have any aggressive anti-virus or background software that could be conflicting with the game/Vulkan? Are there any monitoring software that you're running at the same time? Any special configuration that you have for Windows? Is your overclock still stable in other games?
> 
> I'd think *I* would have problems considering I have both AMD and NVIDIA drivers on the same Windows install, but "It Just Works" (tm) for me.



Played through another level; it is stable, running great now. I see another game update came out. I am sure it will only get better and better; the game has only been out a week.

My clocks have been stable in every game, and this PC build is getting aged, I think going on 3 years :| Longest I have ever kept a single system, but I can't BS a reason to "upgrade".

Thanks for all the suggestions!


----------



## EarthDog (Mar 25, 2020)

nickbaldwin86 said:


> but I can't BS a reason to "upgrade".


You're running a 2c/4t CPU... that's reason to upgrade: more fps in a lot of games.


----------



## nickbaldwin86 (Mar 25, 2020)

EarthDog said:


> you're running a 2c/4t cpu...that's reason to upgrade..more fps in a lot of games.



It is 4c/4t.
My CPU running at 5 GHz is not a bottleneck.

Look at reviews. This proc does great at 5 GHz and can stand with procs that are double the price.


----------



## wolf (Mar 25, 2020)

nickbaldwin86 said:


> It is 4c/4t
> My cpu running at 5Ghz is not a bottleneck
> 
> Look at reviews.



It appears you'll be OK in DOOM, but I'd expect varying results in a lot of titles, and it'll get progressively worse. I recently went from 4c/8t in the mid-4 GHz region to 8c/16t at ~4.2 GHz, and the uplift is major across the board, including testing DOOM 2016 before/after at the time of the upgrade. I'm all for squeezing the most out of hardware before upgrading, and I'm not saying go buy today or anything, but make no mistake: a fast 6+ core (Intel 8th gen+ or Ryzen 3000+) will make your PC universally more performant in games, bar the odd cherry-picked exception.


----------



## EarthDog (Mar 25, 2020)

nickbaldwin86 said:


> Look at reviews. this proc does great at 5Ghz and can stand with the procs that are double the price


Depends on the title. 4c/4t is long in the tooth today and puts a glass ceiling on some games: any that can use more than 4 threads, which is already several and growing monthly.


----------



## nickbaldwin86 (Mar 25, 2020)

Thanks for the concern, guys... No CPU out today is worth the upgrade just to pull a few more frames.

I plan to wait for the 10th-gen CPUs and see what sub-$200 i3 they have to offer that I can squeeze the most out of. Hoping it is 6-core, but the fake news says it will be 4/8, which seems like a whole new game if they are putting hyperthreading on an i3... who knows what Intel is up to these days.

Ultimately I don't upgrade until the need occurs, and I have yet to see a need. 

Doom Eternal 100% doesn't need more CPU... I watched my CPU usage after a level of playing, and it hardly stresses my CPU at all; doesn't even warm it up like most other games do.


----------



## Ikaruga (Mar 28, 2020)

newtekie1 said:


> It's lazy programming, especially in a program like this that is largely linear.  And it just leads to people thinking they need more powerful graphics cards than they really do.  We'll be seeing people quoting this game using 8GB+ of VRAM at 1080p as an excuse for needing 16GB of VRAM during the next generation of graphics cards.


I would give it a 2/10 if it's not serious, but if you were indeed serious, then I would say: on the contrary!

When it comes to game engines, I consider anything "lazy programming" that does not use all the VRAM and prefers system RAM and/or storage instead. It is a no-brainer really, and props to the team for not going with a lazy approach and actually filling up and using all the available VRAM.

Moreover, it is not that easy nowadays either, because the "border" between VRAM and system RAM is not that distinguishable anymore from the POV of game engines, so today it requires actual design and effort to use all the VRAM efficiently.

In addition, I disagree about what people will think, because the game also works perfectly fine on older/lower-spec cards with less VRAM.

In the end (in my opinion) we can argue about whether Doom Eternal is a good or a bad video game, because that is about personal preference, what one might consider exciting or boring, etc... but I think it's universally safe to say that the engine itself is a very good one, one of the very best in the world.


----------



## John Naylor (Mar 29, 2020)

newtekie1 said:


> Another game that fills up 8GB of VRAM but doesn't actually use it.



Better said: yet another game that reserves 8 GB of VRAM and never gets near using it. It's blatantly obvious that as resolution increases, VRAM requirements barely budge. Merely a case of "Oh look, we have 11 GB of VRAM, let's reserve 8 GB of it." AlienBabelTech blew this fake VRAM requirement outta the water way back in 2013.









GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech (alienbabeltech.com): "Do you need 4GB of ram? We tested EVGA's GTX 770 4GB versus Nvidia's GTX 770 2GB version, at 1920x1080, 2560x1600 and 5760x1080."
				




The best evidence here was this:



> _There is one last thing to note with Max Payne 3:  It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB.  However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting.  And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. _



The game claims to need 2750 MB of VRAM or it won't allow the setting, but if you fool it by enabling the setting with a 4 GB card installed, then swap in the 2 GB card, the fps and user experience are unchanged.

This result was echoed by Puget Systems (600 series), Guru3D (900 series), ExtremeTech (900 series) and TPU with the 3 GB / 6 GB 1060s.

_Nvidia's Brandon Bell weighed in on this topic, telling us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."_

If the game grabs 6 GB on an 8 GB card and then plays at the same resolution, same fps and same user experience on an identical card with 4 GB... it clearly does not **need** 6 GB.
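The requested-vs-used distinction Bell describes can be sketched as a hypothetical engine heuristic that sizes its streaming pool from whatever VRAM the card reports, while the frame's actual working set barely moves with resolution (all function names and numbers below are made up for illustration):

```python
def streaming_pool_request_mb(total_vram_mb, reserve_fraction=0.8):
    """Hypothetical heuristic: request a fixed fraction of whatever VRAM the
    card reports, so monitoring tools see a bigger number on a bigger card."""
    return int(total_vram_mb * reserve_fraction)

def scene_working_set_mb(resolution):
    """Made-up working-set sizes: what a frame actually touches."""
    return {"1080p": 2900, "1440p": 3200, "2160p": 3800}[resolution]
```

By this heuristic an 11 GB card "uses" ~9 GB while a 4 GB card requests ~3.2 GB, yet the frame touches only a few GB in either case, which matches the identical fps seen when swapping the 4 GB card for the 2 GB one.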


----------



## simlife (Apr 8, 2020)

I am really struggling to see why a 24-inch monitor is used... on a tech site, no less. Please look up pixel density: at 4K on a screen that size, the human eye will not discern anything extra unless you are about 4 inches from it. Even though this is about the raw numbers for the specs, this bugs me to an insane level. I have a 75-inch TV and own a Switch, so at times I'm at 720p and lower... so again, 4K on a tiny monitor, why?
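For reference, the pixel density being argued about is just the diagonal resolution in pixels over the diagonal size in inches:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 4K panel works out to roughly 184 PPI at 24" and roughly 59 PPI at 75",
# but the GPU renders the same 3840x2160 pixels either way.
```

That last point is why panel size doesn't matter for a benchmark: the render load depends only on the pixel count.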


----------



## W1zzard (Apr 8, 2020)

simlife said:


> i am super super struggling to see why a 24 inch monitor is used


because i need to save space .. have too many monitors and computers here


----------



## EarthDog (Apr 8, 2020)

simlife said:


> i am super super struggling to see why a 24 inch monitor is used... on a tech site non the less.. plz look up pixel density....  the human eye will not discern anything unless you are about 4 inches from the screen at 4k.. even though this is about the raw numbers for the specs this bugs me to a insane lvl.. i  have a 75 inch tv and own a switch to at times im at 720p and lower.. so again 4k on a tiny monitor why....


I'm curious why inches matter in the first place. This is a performance test: 1080p, 2560x1440 and 4K are 1080p, 2560x1440 and 4K UHD at any size. Pixel density isn't relevant here in any way, is it? Reviewers benchmark; they're not chasing an immersive gaming experience.


----------



## cox (Apr 12, 2020)

s3thra said:


> Is anyone else having trouble with DE flicking their 144Hz monitor down to 60Hz? It doesn’t seem to matter what I do, every time I launch the game in either full screen or borderless window modes it changes my refresh rate to 60Hz.
> 
> I’m using the latest Radeon drivers which came out the other day with Windows 10 all patched up.


Same bug here. You need to set your refresh rate to 144 Hz, restart your rig, and it should run at 144 Hz.


----------

