
Resident Evil 8 Village Benchmark Test & Performance

Looks like 8GB VRAM cards aren't going to last long for mainstream 1440p.
Much faster cards will be "released" by the time that happens.
 
W1z's conclusion said it all: RT is the future, just not in this game. A total afterthought and a detriment to the game's visual style and presentation.

Fuels the thought that it's a gimmick, which is a shame, because the hardware is so young and versatile, and there is so much visual and performance headroom yet to be squeezed out of the hardware already in gamers' hands. Just look at the gains in performance and visuals between release and now with Metro Exodus.

The RT effects in RE8 are deliberately toned down to accommodate weaker ray tracing capability, to the point that it's not even worth using at all.

Perf cost of ray tracing at 4K:
RE8: 25% with the 3090 and 39% with the 6900 XT (RTGI + RT Reflection)
WD Legion: 25% with the 3090 and 41% with the 6900 XT (RT Reflection)
Metro Exodus: 41% with the 3090 and 54% with the 6900 XT (RTGI)
Control: 41% with the 3090 and 60% with the 6900 XT (RT Reflection + RT Shadow + RT Lighting)

So RTGI + RT Reflection in RE8 costs about as many clock cycles as RT Reflection alone in WD Legion.
Surely there is plenty of room to raise the quality of ray tracing in RE8, but we all know why it isn't raised :D
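Those per-game costs can be compared directly. A quick sketch, using only the percentages quoted above, of how much heavier the RT hit is on the 6900 XT relative to the 3090:

```python
# RT frame-rate cost at 4K, as quoted above: (3090 %, 6900 XT %) per game.
rt_cost = {
    "RE8":          (25, 39),
    "WD Legion":    (25, 41),
    "Metro Exodus": (41, 54),
    "Control":      (41, 60),
}

for game, (nv, amd) in rt_cost.items():
    # Ratio > 1 means the 6900 XT pays a proportionally heavier RT penalty.
    print(f"{game}: 6900 XT pays {amd / nv:.2f}x the RT cost of the 3090")
```

For RE8 that works out to about 1.56x; across the four titles the ratio stays between roughly 1.3x and 1.6x.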
 
Much faster cards will be "released" by the time that happens.

Not everyone can afford to waste money on every card release. Unlike many others, I only upgrade every couple of years, so "future proofing" plays a role in my selections.
 
There is no such thing as future proofing. You can get a card with lots of VRAM, but it will be too slow by the time you need that VRAM.

Since you will have to lower your settings anyway, you might as well lower texture quality too, which makes little difference at the upper settings.

No game really needs 8 GB even at 4K; most games just fill up as much as they can. And with DirectStorage coming, the difference will be even smaller.


I have tried future proofing a few times in the past and it never worked. You are spending more for something you do not need right now. You might as well upgrade later (unless you never sell your used components).
 

Just look at the Vega 64: it used to be slower than the RTX 2060, and now it's faster. You could argue faster cards are always better, but the amount of VRAM plays a role, even if only a small one.

[Charts: relative performance and average FPS at 2560x1440]
 
This game is optimized for AMD, which can be seen in all results. I doubt VRAM makes any impact in this case.

In the case of AMD, you never really pay extra for more VRAM compared to NVIDIA equivalents. But then again, the cards could be cheaper with less VRAM, which you will never really need by the time the cards become too slow.
 

Can you back that statement up further, with a source maybe? AFAIK, all Capcom games are based on RE Engine, Capcom's in-house engine.
 
The implementation of FidelityFX and no NVIDIA-specific features. There were many articles about Capcom working with AMD on this one.

Most big games tend to be NVIDIA-focused, but this is one of the uncommon AMD examples. To be honest, if all games ran this well, we would not need DLSS or other gimmicks, which seem to encourage developers to ignore optimization.
 
The RT effects in RE8 are deliberately toned down to accommodate weaker ray tracing capability, to the point that it's not even worth using at all.
It does appear that way, to the detriment of the tech :( I mean, look how well optimized it is for AMD; then with RT on, that lead evaporates fast.

4K results

6800XT 107.9 fps
6800XT+RT 67.6 fps -37.4%

3080 102.6 fps
3080+RT 76.1 fps -25.8%

Effectively meaning the 3080 can handle the RT load ~44% better, a fairly typical result for RDNA2 vs Ampere.

Having said all that, the results here and in Metro Exodus EE are certainly very promising for AMD, and I am extremely glad they added hardware RT capability. They just need to run weaker effects/sample counts vs the Ampere counterpart - and they need their DLSS competitor to arrive in a big way.
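The percentages above fall straight out of the quoted frame rates; a minimal sketch to verify the arithmetic, using the same 4K numbers:

```python
def rt_cost(fps_off: float, fps_on: float) -> float:
    """Fractional frame-rate loss from enabling RT."""
    return 1 - fps_on / fps_off

amd = rt_cost(107.9, 67.6)  # 6800 XT at 4K
nv = rt_cost(102.6, 76.1)   # 3080 at 4K

print(f"6800 XT RT cost: {amd:.3f}")     # ≈ 0.373
print(f"3080 RT cost: {nv:.3f}")         # ≈ 0.258
print(f"relative hit: {amd / nv:.2f}x")  # ≈ 1.45x, i.e. ~44-45% heavier on RDNA2
```

The ratio of the two losses is where the "~44% better" figure comes from, give or take rounding.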
 
The RT effects in RE8 are deliberately toned down to accommodate weaker ray tracing capability, to the point that it's not even worth using at all.
So RTGI + RT Reflection in RE8 costs about as many clock cycles as RT Reflection alone in WD Legion.
Surely there is plenty of room to raise the quality of ray tracing in RE8, but we all know why it isn't raised :D
Because the most common RT-capable GPUs on Steam would not be able to run it at 1080p 60 FPS+ otherwise? ;)
It does appear that way, to the detriment of the tech :( I mean look how well optimized it is for AMD, then with RT on that lead evaporates fast.
IMO the biggest detriment to the tech is the lack of more affordable hardware capable of running it.
It doesn't matter how good something looks; if most users cannot use it, it is often perceived as enthusiasts being snobby. :(

If AMD had the influence that people think they do, they would not be in their current position in the GPU market.
Fact is, the vast majority of cards on Steam right now don't have hardware RT at all (Pascal),
and the top RTX GPU is the 2060, followed by the 2060 Super / 2070 Super, which would sit at the bottom of the 1080p chart right now. Not much above 60 FPS.
 
Looks like 8GB VRAM cards aren't going to last long for mainstream 1440p.

Haha, what makes you think that? The 3070 beats the 6700 XT at 4K, and the 3060 Ti is just behind the 6700 XT, like it's supposed to be.

You can't use the memory usage on a 3090 with 24 GB VRAM for anything; it's pointless, actually. A higher amount of VRAM will always result in higher allocation and usage. Put a 3080 in, run the same game at the same settings, and VRAM usage will be lower, I guarantee it. The same is true for system memory: more memory, higher allocation.

8GB is plenty for 1440p, and this won't change anytime soon. 4-6GB is what most newer AAA games at high/ultra settings actually need at this resolution.

Looking at VRAM usage is useless, especially when some game engines just allocate all VRAM. You need to look for fps dips to see if VRAM is a problem or not. With not enough VRAM, fps drops to 0-1 fps for a split second. You will know something is up; it won't be smooth.

In 4 years, 8GB might not be enough for 1440p maxed out in true next-gen titles, but you know what? Current GPUs will be too slow anyway. A high amount of VRAM is not going to save a weak GPU. So you will need to LOWER SETTINGS anyway, and when you lower these, the VRAM requirement drops too. So, yeah...

Even the best GPU today will be beaten by mid-range cards in 4 years. Upgrading more often is better than trying to future-proof.
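On the "look for fps dips" point: one way to spot those split-second drops is to scan a frame-time log for outliers. A minimal sketch, where the 8x-median threshold and the example log are purely illustrative, not any specific tool's format:

```python
from statistics import median

def find_hitches(frame_times_ms, factor=8.0):
    """Return (frame index, frame time) pairs far above the median frame time.

    A frame taking hundreds of ms is the "0-1 fps for a split second" dip
    described above; the 8x-median threshold is an arbitrary illustration.
    """
    med = median(frame_times_ms)
    return [(i, ft) for i, ft in enumerate(frame_times_ms) if ft > factor * med]

# Example: a steady ~16.7 ms (60 fps) run with one 500 ms hitch at frame 50.
log = [16.7] * 50 + [500.0] + [16.7] * 50
print(find_hitches(log))  # -> [(50, 500.0)]
```

A log that comes back empty under a scan like this, while VRAM "usage" reads high, is consistent with allocation rather than an actual shortage.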
 
You can't use the memory usage on a 3090 with 24 GB VRAM for anything; it's pointless, actually. A higher amount of VRAM will always result in higher allocation and usage. Put a 3080 in, run the same game at the same settings, and VRAM usage will be lower, I guarantee it.
Tested it just for you, RTX 3080, same everything, 4K

RT off = 6892 MB
RT on = 7675 MB

So only around 150 MB difference
 
These performance numbers AMD cards are putting up in newer titles have me thinking I should get a 6900 XT instead of waiting for the 3080 Ti, if I can get my hands on either. It would be kind of cool to have an all-AMD rig, as I plan to get a 5800X soon too.

AMD cards do extremely well in the new Assassin's Creed as well.
 
This is all proprietary crap imo. If I can't Iray render with it in the 3D program that I use, it's useless to me. The idea that I have to sacrifice a PCIe slot strictly for gaming if I wanted to put an AMD card in is just ridiculous. It would benefit "all" if they would work together in some areas and adopt some kind of standard instead of all this G-Sync, FreeSync, Money-sink, Kitchen-sink, RT, NasT, whatever business-gimmick nonsense.

We're all being played like instruments at an opera being orchestrated by multiple conductors.
 
There is not a single thing about this game that screams proprietary. What are you going on about?

Ray tracing, for one, is definitely not proprietary, as it's part of the DX12 API, which both AMD and Nvidia make use of. Nvidia's RTX was proprietary with its own ray tracing library, but that's all been largely adopted by DX12. Now it's just Nvidia nomenclature to distinguish GPUs in their lineup that have dedicated RT hardware in silicon from those that don't.

Also we are talking about a game in this thread, not a CAD program.
 
Ray tracing, for one, is definitely not proprietary, as it's part of the DX12 API, which both AMD and Nvidia make use of. Nvidia's RTX was proprietary with its own ray tracing library, but that's all been largely adopted by DX12. Now it's just Nvidia nomenclature to distinguish GPUs in their lineup that have dedicated RT hardware in silicon from those that don't.
The DirectX Raytracing API was announced well before Turing launched; it's the Vulkan extensions that were adopted from Nvidia's library.
RTX is mostly an umbrella term for Nvidia's DXR implementation and DLSS. Games don't even need RT to carry RTX branding, only DLSS.
 
Looking at VRAM usage is useless, especially when some game engines just allocate all VRAM. You need to look for fps dips to see if VRAM is a problem or not.

Just wanted to say I'm nearing the end of the game right now, playing on a GTX 1080 (non-Ti) and an 8700K. Running the game at 1440p with everything maxed out (no RT, of course). No VRAM issues at all. Like you said, apps like Afterburner just show what the game is REQUESTING, not what it's actually making use of. I've had no stuttering, pop-in, etc. It's been a 99-percent-locked 60 FPS experience.
 
Tested it just for you, RTX 3080, same everything, 4K

RT off = 6892 MB
RT on = 7675 MB

So only around 150 MB difference

Wut, that is 780MB difference
 
Wut, that is 780MB difference
Ias's argument was that an RTX 3090 vs an RTX 3080 will see vastly different memory usage because of 24 GB vs 10 GB. I tested that, and there's only about a 150 MB difference (not between RT on and off, but between the 3090 in the article and the 3080 in my post).
 
I can feel the annoyance of all the 3090 buyers on this title: pay the price of a decent second-hand car for a graphics card and have nothing to show for it :D
On the other hand, I feel no sympathy for them; they are the reason each generation gets more expensive.

The RT in this game is meh at best, but it looks beautiful with RT or without; one could argue it looks better without RT. I like that it's well enough optimized that a lot of gamers can actually have a good experience, and not a 20 FPS slideshow.
Thx W1zzard, good review!
 
These performance numbers AMD cards are putting up in newer titles have me thinking I should get a 6900 XT instead of waiting for the 3080 Ti, if I can get my hands on either. It would be kind of cool to have an all-AMD rig, as I plan to get a 5800X soon too.

AMD cards do extremely well in the new Assassin's Creed as well.

Well, it depends on the game; tons of games still favor Nvidia.
This and AC Valhalla are pretty much the AMD titles - you don't see this in many other games.

Keep in mind that several of the games here have the option for DLSS, but it's not being used.


[Chart: 1440p performance]

 
I can feel the annoyance of all the 3090 buyers on this title: pay the price of a decent second-hand car for a graphics card and have nothing to show for it :D
On the other hand, I feel no sympathy for them; they are the reason each generation gets more expensive.
If someone can afford a 3090, I do not think they really care about getting actual value in games from it. They just want the fastest card, for no reason other than simply to have it.

It makes zero sense to buy that card for actual gaming benefits, because you will never see them.

The first Titan was a good card, because it offered a huge performance boost and 700 series cards came after it. Even the Maxwell and Pascal Titans offered decent value. But the 3090 is one of the worst graphics cards in history.
 
The RT effects in RE8 are deliberately toned down to accommodate weaker ray tracing capability, to the point that it's not even worth using at all.
This is a pretty miserable attitude you've got there.

What's wrong with subtlety? We don't need everything to be completely oversaturated with effects, like flooding (excuse the pun) the environment with puddles just so you can see a million ray-traced reflections everywhere.
 

If you think only puddles have reflections, then you have a lot to learn about RT ;)
 