
Atomic Heart Benchmark Test & Performance Analysis

You can see how this is an Nvidia game. Nvidia was using this game as a showcase for RTX 3 years ago. Looks like some deliberate AMD crippling going on in this game by the devs. Performance is garbage on AMD, IMO, and it's great to see the new drivers lifted performance by a mind-blowing 1-2%.
 
You can see how this is an Nvidia game. Nvidia was using this game as a showcase for RTX 3 years ago. Looks like some deliberate AMD crippling going on in this game by the devs. Performance is garbage on AMD, IMO, and it's great to see the new drivers lifted performance by a mind-blowing 1-2%.

Weird that UE4 games run better on Nvidia, isn't it? :rolleyes:

AMD has always been garbage in UE4 games, that's just the way it is.
 
The RTX 4070 Ti with 12 GB of VRAM gets more FPS than the RX 7900 XTX with 24 GB of VRAM?

I'm wondering when the popular argument that "it has more VRAM so it's better" will finally die.

It is not a problem with the amount of VRAM. It is known that Unreal Engine 4 doesn't run very well on AMD GPUs. With Unreal Engine 5, looking at Fortnite performance, this appears to be a non-issue. At the end of the day, regardless of Nvidia, AMD or Intel GPUs, a card may not always run well with a certain game engine.

I don't know why you insist on derailing the thread with this nonsense. For one thing, the game is using about 8 GB of VRAM, so why you would want to prove anything with a game that is barely even close to maxing out a lot of high-end cards is beyond me. Memory optimization means removing game objects, that's about it; developers simply try to tune their game to the lowest common denominator most of the time. Less memory to work with means a more simplistic game, and that's all there is to the optimization process.

Anyway, you are aware that both AMD and Nvidia have drastically increased their memory capacities this generation, with Nvidia doing so much more than AMD. The 4080 has 60% more memory than the 3080, and the 4070 Ti 50% more than the 3070 Ti. You think they did that for no reason, or because they figured it was necessary? Use your head.

And by the way, the highest performing cards are still the ones with the most VRAM, to your dismay, not that it matters.
I feel the reason for the VRAM bump is less about boosting performance (at least not the main reason); rather, it's a decision based on how they want to segment their GPUs. For Ampere, the RTX 3080 for example had 10GB likely because it uses the GA102 chip, which comes with a 384-bit bus, meaning they could release either a 12GB or a 24GB version. Noting that the performance difference is actually not that big, they likely chose to cut the bus down to 320-bit and go with the lowest possible VRAM config, both to cope with low GDDR6X supply and to justify the price of the RTX 3090. The secondary reason is likely that there really is insufficient VRAM for cards targeted to run at 1440p and upwards. In this case, all looks good, i.e. within 8GB of VRAM, but we should reassess the VRAM usage when they enable RT.
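For a rough idea of how the bus width dictates those capacities, here's a quick back-of-the-envelope sketch (assuming the standard GDDR6/GDDR6X chip densities of 1 GB or 2 GB per 32-bit channel, and ignoring clamshell mode):

```python
# Possible VRAM configs from bus width: each GDDR6/GDDR6X chip sits on a
# 32-bit channel and comes in 1 GB or 2 GB densities (clamshell ignored).
for bus_bits in (384, 320, 256, 192):
    chips = bus_bits // 32
    configs = [chips * density for density in (1, 2)]
    print(f"{bus_bits}-bit bus -> {chips} chips -> {configs} GB")

# 384-bit -> 12 chips -> 12 or 24 GB  (RTX 3090: 24 GB)
# 320-bit -> 10 chips -> 10 or 20 GB  (RTX 3080: 10 GB)
```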

In my opinion, spamming VRAM will just result in games getting less optimized. Game consoles can last a long time with a limited amount of VRAM, and to me, games are more likely starved of GPU cores than of VRAM. But it does show that VRAM usage can be optimized, likely with minimal degradation in image quality.
 
So the regular DLSS 2.x is now 3.x, even without frame generation?
Correct, the DLSS 2 version number is 3.x and the DLSS 3 technology exists separately, currently at version 1.x.

Their mistake was calling it DLSS "3", or even "DLSS"; Frame Generation is not "scaling", at least not IMO. It's still an awesome tech that works really well.
 
Everyone is pointing out how bad DLSS is in Forspoken (an AMD-sponsored game), but in this case, Nvidia paid the devs and they shipped FSR 1. :laugh: The FSR source code is available to download, so they could have updated it themselves.

The RTX 4070 Ti with 12 GB of VRAM gets more FPS than the RX 7900 XTX with 24 GB of VRAM?

I'm wondering when the popular argument that "it has more VRAM so it's better" will finally die.


Because it's not an AMD-sponsored game. Even A Plague Tale: Requiem with its Quixel Megascans uses a reasonable amount of VRAM.

[image: VRAM usage chart]
 
So you need at least a 500-euro graphics card to barely reach 60 FPS at FHD in 2023. Well, good times...
I'm not gonna say graphics card prices are right, but all those tests are on the Atomic preset. Put the game on ultra or high and you will gain a lot more FPS without an ugly gaming experience. I didn't see a difference between Atomic and the high preset while playing, maybe some light here and there, but everything was great. Maybe the Atomic preset is more noticeable at 4K, but at 1440p I didn't see that much of a difference. Oh, and one thing: my graphics card drops from 99% to 65% usage.
 
I didn't see a difference between Atomic and the high preset while playing, maybe some light here and there, but everything was great
Indeed, check the settings comparison screenshots; the differences are not big, especially not when you're actually playing the game. Only low is a lot different in terms of lighting, and still very playable.
 
Indeed, check the settings comparison screenshots; the differences are not big, especially not when you're actually playing the game. Only low is a lot different in terms of lighting, and still very playable.
The low-settings screenshot is really impressive; it gains a lot of FPS and doesn't look that ugly. For people with old components or lower performance, it is really good.
 
I feel the reason for the VRAM bump is less about boosting performance (at least not the main reason)

An increase in VRAM capacity is never about performance, it's about ensuring usability. It's binary: either you have sufficient memory and there are no issues, or you get massive performance drops. Memory costs money; if they can help it, Nvidia and AMD would rather not add more of it. They do so when they see that it will be necessary in the near future, because you can't wait to add more memory until you need it; by then it's already too late.
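To illustrate why it's binary, here's a naive little model (the bandwidth numbers are ballpark assumptions, not measurements): once the working set spills past VRAM, the overflow has to come over PCIe, which is an order of magnitude slower, so effective bandwidth falls off a cliff.

```python
# Naive model of the VRAM "cliff". Bandwidth figures are assumptions:
VRAM_BW = 900.0  # GB/s, high-end GDDR6X (ballpark)
PCIE_BW = 32.0   # GB/s, PCIe 4.0 x16 (theoretical peak)

def effective_bw(working_set_gb: float, vram_gb: float) -> float:
    """Weight time-per-byte: data that fits stays in VRAM, the rest
    is fetched over PCIe every time it's touched (worst case)."""
    if working_set_gb <= vram_gb:
        return VRAM_BW
    in_vram = vram_gb / working_set_gb
    return 1.0 / (in_vram / VRAM_BW + (1.0 - in_vram) / PCIE_BW)

for ws in (7.5, 8.0, 8.5, 10.0):
    print(f"{ws:4.1f} GB working set on an 8 GB card -> "
          f"~{effective_bw(ws, 8.0):.0f} GB/s effective")
# Spilling just 0.5 GB (8.5 GB on an 8 GB card) already drops you to
# ~350 GB/s, well under half of full bandwidth.
```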

In my opinion, spamming VRAM will just result in games getting less optimized. Game consoles can last a long time with a limited amount of VRAM, and to me, games are more likely starved of GPU cores than of VRAM. But it does show that VRAM usage can be optimized, likely with minimal degradation in image quality.

As I already said, the way you optimize memory usage in a game is by simply cutting out stuff and making the game world less complex. Consoles last a long time because developers have no choice but to always adhere to this lowest common denominator, but make no mistake: less VRAM = less complex rendering.
 
Correct, the DLSS 2 version number is 3.x and the DLSS 3 technology exists separately, currently at version 1.x.

Their mistake was calling it DLSS "3", or even "DLSS"; Frame Generation is not "scaling", at least not IMO. It's still an awesome tech that works really well.
Hmm, didn't realise that, guess I'll be grabbing the latest DLL then!
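If you want to check what a game actually ships before swapping, you can read the version resource out of the DLLs. A rough sketch using the pefile library; the paths are hypothetical, and the DLL names are the ones Nvidia ships (nvngx_dlss.dll for the upscaler, nvngx_dlssg.dll for Frame Generation):

```python
# Print the file version embedded in a game's DLSS DLLs.
# pip install pefile
import pefile

def dll_version(path: str) -> str:
    pe = pefile.PE(path)
    # VS_FIXEDFILEINFO is exposed as a list in recent pefile versions.
    info = pe.VS_FIXEDFILEINFO[0]
    ms, ls = info.FileVersionMS, info.FileVersionLS
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

# Hypothetical install path; point it at the game's binaries folder.
print("Upscaler:  ", dll_version(r"C:\Game\nvngx_dlss.dll"))   # DLSS 2, versioned 3.x
print("Frame Gen: ", dll_version(r"C:\Game\nvngx_dlssg.dll"))  # DLSS 3, versioned 1.x
```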
 
NGL, I'm trying really hard to even see the difference between high/medium and Atomic...
Still, the game looks good and VRAM usage looks reasonable.
 
Damn, this seems to kill my 6700 XT :eek:
 
It's the old "if it isn't true in my one example, it must not be true at all" logic he's using.
This must be satire. When the vast majority of games run fine on 8/10 GB cards, and half the internet loses their mind over a single title or two...

Pot calling the kettle black if I've ever seen it, with so many people willing to hang their hat on obsolescence based on a tech demo and one poorly optimized title with multiple issues. Oh, and let's not forget AMD-sponsored titles with texture packs.

Mind you, you personally might not have jumped on that bandwagon, but as an active member you can't pretend you missed it.
 
Plus the fact it's an Nvidia title
It's an Nvidia RTX Showcase Title.
Oh wait, hang on, that's incorrect.

It's an Nvidia RTX Showcase Title.
There we go I think. Or wait, perhaps not - if AMD have beaten them to game-ready drivers...

It's an Nvidia RTX Showcase Title.
Much better, still room for improvement though.

It's an Nvidia RTX Showcase Title.
Perfection.
 
It's sad to see the arguments and hate about companies and VRAM.

Look, it's very simple: having little VRAM is not a problem if the game doesn't need more than that; it doesn't affect the performance.

Now, games could use more VRAM while staying optimized, but developers choose to sacrifice visuals so the game requires less VRAM and runs on more computers. Developers will release games that use more VRAM in the future, depending on market adoption of GPUs with more VRAM.

So why does anyone need 24 GB of VRAM right now? Well, there's life outside gaming, like game development, modeling, rendering.

Personally, for gaming I would go for at least 8 GB of VRAM, but going beyond 12 GB doesn't make much sense; the GPU will become obsolete sooner than that quantity of VRAM would actually be used in gaming, except for maybe one weird exception here and there.
 
It's sad to see the arguments and hate about companies and VRAM.

Look, it's very simple: having little VRAM is not a problem if the game doesn't need more than that; it doesn't affect the performance.

Now, games could use more VRAM while staying optimized, but developers choose to sacrifice visuals so the game requires less VRAM and runs on more computers. Developers will release games that use more VRAM in the future, depending on market adoption of GPUs with more VRAM.

So why does anyone need 24 GB of VRAM right now? Well, there's life outside gaming, like game development, modeling, rendering.

Personally, for gaming I would go for at least 8 GB of VRAM, but going beyond 12 GB doesn't make much sense; the GPU will become obsolete sooner than that quantity of VRAM would actually be used in gaming, except for maybe one weird exception here and there.
An adequate retort to your ideology that beyond 12GB doesn't matter is already in the market: the 1080 Ti. Had it been given 6GB, it wouldn't have been half the card it turned out to be, longevity-wise.

And it also schooled Nvidia on obsolescence; they don't over-provision VRAM on cards anymore unless you pay big money. They learned how wrong your ideology can be, since some normies will only upgrade when an issue stops the latest CoD from working and not before, and going future-proof(ish) on VRAM costs the brand sales in two to three years.

From some perspectives it won't matter, true; if you're upgrading in two years, so be it, but most don't.
 
Update Feb 22 10 PM UTC: AMD has released their Radeon 23.2.2 drivers this evening, which add support for Atomic Heart. All AMD results have been retested on that driver, the performance gain is around 1-2% for RX 7900 XT/XTX and 1% for 6800/6900 at 4K.
Any difference in the 1% and 0.1% lows? Frame-rate stability?
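For anyone unfamiliar, the lows come from the frametime log rather than the FPS counter. A minimal sketch of how they're usually computed (exact methods vary between reviewers; this one averages the slowest 1% / 0.1% of frames):

```python
# Average FPS plus 1% / 0.1% lows from a list of frametimes in ms.
def stats(frametimes_ms: list[float]) -> dict[str, float]:
    n = len(frametimes_ms)
    worst_first = sorted(frametimes_ms, reverse=True)
    def avg_fps(times): return 1000.0 * len(times) / sum(times)
    return {
        "avg":      avg_fps(frametimes_ms),
        "1% low":   avg_fps(worst_first[:max(1, n // 100)]),
        "0.1% low": avg_fps(worst_first[:max(1, n // 1000)]),
    }

# Toy run: mostly 10 ms frames with five 30 ms stutters.
print(stats([10.0] * 995 + [30.0] * 5))
# -> avg ~99 FPS, 1% low = 50 FPS, 0.1% low ~33 FPS
```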
 
An adequate retort to your ideology that beyond 12GB doesn't matter is already in the market: the 1080 Ti. Had it been given 6GB, it wouldn't have been half the card it turned out to be, longevity-wise.

And it also schooled Nvidia on obsolescence; they don't over-provision VRAM on cards anymore unless you pay big money. They learned how wrong your ideology can be, since some normies will only upgrade when an issue stops the latest CoD from working and not before, and going future-proof(ish) on VRAM costs the brand sales in two to three years.

From some perspectives it won't matter, true; if you're upgrading in two years, so be it, but most don't.

Framing a personal opinion as an ideology seems worrying. Plus, the longevity of the 1080 Ti was precisely due to its performance, not the VRAM.
 
12 GB isn't enough for 8K in some games, but it's more than enough for 4K in every game. Even 8 GB is enough.

The 11 GB 2080 Ti and the 8 GB 3070 scale evenly from 1080p to 4K.
 
It's sad to see the arguments and hate about companies and VRAM.

Look, it's very simple: having little VRAM is not a problem if the game doesn't need more than that; it doesn't affect the performance.

Now, games could use more VRAM while staying optimized, but developers choose to sacrifice visuals so the game requires less VRAM and runs on more computers. Developers will release games that use more VRAM in the future, depending on market adoption of GPUs with more VRAM.

So why does anyone need 24 GB of VRAM right now? Well, there's life outside gaming, like game development, modeling, rendering.

Personally, for gaming I would go for at least 8 GB of VRAM, but going beyond 12 GB doesn't make much sense; the GPU will become obsolete sooner than that quantity of VRAM would actually be used in gaming, except for maybe one weird exception here and there.
Having more VRAM than you need is pointless, but having insufficient VRAM is pretty bad. I ran into multiple issues at 1440p with 6GB over three years ago. 6GB is simply unacceptable in 2023 and laptops with the 3060 6GB are suffering in several popular titles now.

8GB and 10GB have been enough so far for the resolutions that those cards target, respectively, but it looks like 8GB is becoming a bigger problem every month for cards like the 3070 and 3070Ti which can actually drive 4K well enough to need more VRAM. Meanwhile the 3080 10GB has just encountered its first "insufficient VRAM" game so its days as a no-compromise card are definitely over.

The 12GB of the 4070Ti looks to be enough for the next generation of game engines, but the 8GB of the upcoming 4070 and 4060Ti are definitely not going to be enough, because it's already a problem for the current-gen cards in 2023. I was really hoping that 8GB was "entry level" this generation, and that 12GB would be the mainstream VRAM amount for the popular 1440p sweet-spot. Clearly, Jensen has other ideas...
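For what it's worth, anyone can check how close their own card is by logging usage while playing. A minimal sketch via nvidia-smi (wrapped in Python here), with the caveat that what games allocate isn't necessarily what they need, so treat the numbers as an upper bound:

```python
# Log VRAM usage once per second while the game runs (NVIDIA driver required).
import subprocess

subprocess.run([
    "nvidia-smi",
    "--query-gpu=timestamp,memory.used,memory.total",
    "--format=csv",
    "-l", "1",  # repeat every second; Ctrl+C to stop
])
```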
 
Framing a personal opinion as an ideology seems worrying. Plus, the longevity of the 1080 Ti was precisely due to its performance, not the VRAM.
No need to worry, but the VRAM helps keep it viable to use. As I said, if it had 6 GB, or worse 4 GB, then regardless of its performance power it would have become insufficient sooner in many PCs.

And not precisely even then, because after it released GPU performance stagnated a bit, VRAM sizes froze, and prices stepped up while availability vanished, so it had a good run with assistance from the likes of Raja :) :D
 
Another "Crippled by Nvidia" title.
 
Having more VRAM than you need is pointless, but having insufficient VRAM is pretty bad. I ran into multiple issues at 1440p with 6GB over three years ago. 6GB is simply unacceptable in 2023 and laptops with the 3060 6GB are suffering in several popular titles now.

8GB and 10GB have been enough so far for the resolutions that those cards target, respectively, but it looks like 8GB is becoming a bigger problem every month for cards like the 3070 and 3070Ti which can actually drive 4K well enough to need more VRAM. Meanwhile the 3080 10GB has just encountered its first "insufficient VRAM" game so its days as a no-compromise card are definitely over.

The 12GB of the 4070Ti looks to be enough for the next generation of game engines, but the 8GB of the upcoming 4070 and 4060Ti are definitely not going to be enough, because it's already a problem for the current-gen cards in 2023. I was really hoping that 8GB was "entry level" this generation, and that 12GB would be the mainstream VRAM amount for the popular 1440p sweet-spot. Clearly, Jensen has other ideas...
The RTX 4070 will have 12GB, same as the Ti version, not 8GB.
And yes, the RTX 4060/4060 Ti with 8GB is a joke in 2023.
 
They are not intended for 4K, so it's enough. Not a single game requires more than 8 GB at 1440p.
What? Since when? A lot of games use more than 8 GB at 1440p.
 