
Atomic Heart Benchmark Test & Performance Analysis

Did you complete the campaign?
No, roughly half of it

and did you think it worth the effort and money?
Not happy with some parts that feel like they're designed just to slow you down so you end up with more playtime... you might disagree with me, though

Really odd numbers here. My 6900 XT gets 130+ FPS at 1440p/240 Hz. Just in general, really odd AMD numbers on TPU compared to the rest of the benchmarkers
Open world? or indoors?
 
This game looks great. Can't wait for a sale.
 
Who are you arguing with? The VRAM requirements for this game are low so nobody is making that claim:

[attachment: vram.png]

It's the old "if it isn't true in my one example, it must not be true at all" logic he's using.

RTX 4070 Ti with 12 GB VRAM has more FPS than the RX 7900 XTX with 24 GB VRAM?

I'm wondering when the popular argument that "it has more VRAM so it's better" will finally die?

[attachment 285024: benchmark chart]

The chart above illustrates that any modern graphics card with 8 GB+ is enough to run this game. It does not illustrate that the same applies to every game, as Hogwarts Legacy and Portal RTX will show you.
 
It's the old "if it isn't true in my one example, it must not be true at all" logic he's using.



The chart above illustrates that any modern graphics card with 8 GB+ is enough to run this game. It does not illustrate that the same applies to every game, as Hogwarts Legacy and Portal RTX will show you.
Hmmmm... not seeing it in the Hogwarts review. The 3080 is consistently kicking it with the 6800 XT, right where it's supposed to be.

 
No, roughly half of it


Not happy with some parts that feel like they're designed just to slow you down so you end up with more playtime... you might disagree with me, though


Open world? or indoors?
If you get through it I wouldn't mind knowing what you think, shit I'll sign up for game pass, it's a quid.

@ThrashZone you were wrong though, less than a day :):p:D
 
Hmmmm... not seeing it in the Hogwarts review. The 3080 is consistently kicking it with the 6800 XT, right where it's supposed to be.


On that review Wizard commented:

"'Allocated' is not 'used'. You can see cards with 8 GB dropping down in their relative positioning, that's when they are actually running out of VRAM (enough to make a difference in FPS)."

The drop-off in performance isn't dramatic for cards with 8 GB or more, as the most frequently used data can still fit in VRAM, but there is some performance loss because more fetches to main system memory are required.

For Portal RTX the difference is huge, the 3080 gets 1 FPS at 4K.

VRAM bottlenecks will not always be that obvious, though, unless there is frequently used data that cannot fit into VRAM.
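For anyone curious why even a small spill past VRAM hurts so much: here's a back-of-the-envelope sketch. This is my own toy model, not TPU's methodology, and the bandwidth figures are assumed round numbers for a 3080-class card on a PCIe 4.0 x16 link.

```python
# Toy model (my assumption, not from the review): effective memory
# bandwidth when part of the working set spills over PCIe to system RAM.

def effective_bandwidth_gbps(working_set_gb, vram_gb,
                             vram_bw_gbps=760.0,   # ~RTX 3080 GDDR6X
                             pcie_bw_gbps=32.0):   # PCIe 4.0 x16, one way
    """Weighted-harmonic-mean bandwidth across VRAM and PCIe accesses."""
    if working_set_gb <= vram_gb:
        return vram_bw_gbps                 # everything fits, no penalty
    hit = vram_gb / working_set_gb          # share of data served from VRAM
    miss = 1.0 - hit                        # share fetched over PCIe
    # Time per byte is the weighted sum of both paths, so the mean
    # bandwidth is harmonic, not arithmetic -- the slow path dominates.
    return 1.0 / (hit / vram_bw_gbps + miss / pcie_bw_gbps)

print(round(effective_bandwidth_gbps(8, 10)))   # fits entirely: 760
print(round(effective_bandwidth_gbps(11, 10)))  # ~9% spill: 248
```

The point is that spilling just 1 GB out of 11 already cuts the effective bandwidth to roughly a third, because the PCIe path is more than 20x slower than VRAM.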
 
It's hard to judge graphics from screenshots alone, but I sure miss some ray tracing features to make it look better.
I think the developers planned for an RTX implementation and skimped on baked lighting, and that's why it looks so bad in most of the pictures.
But as always, what game works in its first month after release anyway?...
 
RTX 4070 Ti with 12 GB VRAM has more FPS than the RX 7900 XTX with 24 GB VRAM?

I'm wondering when the popular argument that "it has more VRAM so it's better" will finally die?
IMO it's thought processes like yours that perfectly demonstrate why the western world is going down the gurgler.
 
RTX 4070 Ti with 12 GB VRAM has more FPS than the RX 7900 XTX with 24 GB VRAM?

I'm wondering when the popular argument that "it has more VRAM so it's better" will finally die?

They had three years working with Nvidia.

It's kind of how Nvidia would have planned it, no?

Loving the day-one, no-update performance of all these cards, especially Arc.
 
RTX 4070 Ti with 12 GB VRAM has more FPS than the RX 7900 XTX with 24 GB VRAM?

I'm wondering when the popular argument that "it has more VRAM so it's better" will finally die?


Sadly we live in a world of absolutes... doesn't help when a certain unboxed review site suggests a card is obsolete because of one game. Again, I can take a crumb of comfort from the fact that my card isn't actually aging worse than its 16 GB competition and falling faster down the graphs than a Russian oligarch next to a window.
 
Sadly we live in a world of absolutes... doesn't help when a certain unboxed review site suggests a card is obsolete because of one game. Again, I can take a crumb of comfort from the fact that my card isn't actually aging worse than its 16 GB competition and falling faster down the graphs than a Russian oligarch next to a window.
I wouldn't be so dismissive. Lower VRAM capacity has bitten both AMD and Nvidia in the past. A big part of the Fury X aging faster than the 980 Ti was its limited VRAM pool.
 
Despite the lack of driver optimisation, the performance looks pretty good as it is. The cards meant for 1440p can do 1440p at max settings and could probably do a good job at 2160p without max settings. The RX 6800 XT sits at ~48 FPS and the RTX 3080 sits at ~53 FPS when running 2160p max settings. It wouldn't take much to push those over 60 FPS.

All in all, this game doesn't look that hard to play on modern hardware, unlike some other titles (cough! HOGWARTS! cough!). :laugh:
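Quick arithmetic on that "wouldn't take much" claim, using the ~48 and ~53 FPS figures quoted above (the FPS values are just the rough numbers from the post, not exact review data):

```python
# How much extra performance each card needs to hit 60 FPS at 4K max,
# based on the approximate figures quoted above.

def uplift_needed(current_fps, target_fps=60):
    """Percentage speedup required to reach the target frame rate."""
    return (target_fps / current_fps - 1) * 100

print(f"RX 6800 XT: {uplift_needed(48):.0f}%")  # 25% short of 60 FPS
print(f"RTX 3080:   {uplift_needed(53):.0f}%")  # ~13% short of 60 FPS
```

So "not much" is about 13-25%, which is roughly what dropping one or two settings, or enabling upscaling at Quality, typically buys.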

I wouldn't be so dismissive. Lower VRAM capacity has bitten both AMD and Nvidia in the past. A big part of the Fury X aging faster than the 980 Ti was its limited VRAM pool.
I couldn't agree more. The 4 GB VRAM buffer is definitely the limiting factor on my old R9 Furies, the same 4 GB of HBM that was on the Fury X.
 
And definitely another game where the only DLSS setting that makes sense is Quality.
 
I wouldn't be so dismissive. Lower VRAM capacity has bitten both AMD and Nvidia in the past. A big part of the Fury X aging faster than the 980 Ti was its limited VRAM pool.

Indeed, I certainly remember the bullshit hype around the Fury X and its "future proof" HBM memory.

The fact remains most half decent cards will run out of pure GPU grunt before VRAM becomes the limiting factor.
 
The fact remains most half decent cards will run out of pure GPU grunt before VRAM becomes the limiting factor.
You're right in the general case, but as the Fury X shows, it's possible for a GPU to run out of VRAM before running out of graphics or compute throughput. In the case of the 3080, Nvidia didn't have good options to increase the VRAM as GDDR6X was only available in 8 Gb capacities at the time.
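To put numbers on the "no good options" point: GDDR6X capacity is fixed by the bus width (one 32-bit channel per chip) and the per-chip density, so with only 8 Gb chips available, the 3080's realistic configurations were limited. A quick sketch; the formula is the standard bus-width arithmetic, the example configs are mine:

```python
# Why the 3080 shipped with 10 GB: capacity = (bus width / 32 bits per
# chip) * per-chip density. With only 8 Gb (1 GB) GDDR6X chips at launch,
# the only way to add memory was clamshell (two chips per channel).

def vram_gb(bus_width_bits, chip_density_gbit, clamshell=False):
    chips = bus_width_bits // 32            # one 32-bit channel per chip
    if clamshell:
        chips *= 2                          # two chips share each channel
    return chips * chip_density_gbit / 8    # gigabits -> gigabytes

print(vram_gb(320, 8))                   # 10.0 -> the 3080 as shipped
print(vram_gb(320, 8, clamshell=True))   # 20.0 -> only way to double it
print(vram_gb(384, 8))                   # 12.0 -> the later 3080 12 GB
```

Clamshell doubles board cost and power for the memory subsystem (it's what the 3090 did for 24 GB), so the jump from 10 GB wasn't a small tweak.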
 
Update Feb 22 10 PM UTC: AMD has released their Radeon 23.2.2 drivers this evening, which add support for Atomic Heart. All AMD results have been retested on that driver, the performance gain is around 1-2% for RX 7900 XT/XTX and 1% for 6800/6900 at 4K.
 
Update Feb 22 10 PM UTC: AMD has released their Radeon 23.2.2 drivers this evening, which add support for Atomic Heart. All AMD results have been retested on that driver, the performance gain is around 1-2% for RX 7900 XT/XTX and 1% for 6800/6900 at 4K.
This has to be a first: a game-ready driver that doesn't actually improve performance.
 
Even though it is not yet tested by Valve as Playable or Verified, it runs perfectly at 30 FPS with the Low preset on the Steam Deck. CPU is capped at 10 W, frame rate capped at 30. After playing for an hour, battery time remaining is 3 hrs 48 minutes at 72%.

EDIT: FSR disabled
 
Update Feb 22 10 PM UTC: AMD has released their Radeon 23.2.2 drivers this evening, which add support for Atomic Heart. All AMD results have been retested on that driver, the performance gain is around 1-2% for RX 7900 XT/XTX and 1% for 6800/6900 at 4K.
You are a legend Sir thank you.
 
I've been enjoying the game. I have been getting far higher FPS with pretty much the same settings @ 1440p on the RX 6800 XT, usually sitting at about 100+.
Same here; at 4K max settings (FSR off) on a 6800/5800X rig I'm averaging around 75 FPS. The game runs very smoothly so far. The lowest FPS I've seen so far is 55. Not sure what part of the game TPU tested, but the performance they are showing isn't anywhere close to actual gameplay. At least on AMD; not sure about Nvidia.
 
So the regular DLSS 2.x is now 3.x, even without frame generation?

Any changes in this version?
 
I'm starting to understand why no one's talking about the story now. Kinda funny, I didn't realise just how on point the story was (consider CoD, tbf).
Shame really, since it's normally perfectly fine to talk about it, but it's a topic and a half in any forum.

Shame there's no FSR 2.1 too.
 
That's a relief. I thought for a moment that 24 GB of VRAM was no advantage at all in gaming. Turns out it's all the fault of these stupid Atomic Heart devs who dared to optimize VRAM usage in their game.

I don't know why you insist on derailing the thread with this nonsense. For one thing, the game is using about 8 GB of VRAM, so why you would want to prove anything with a game that doesn't even come close to maxing out a lot of high-end cards is beyond me. Memory optimization means removing game objects, that's about it; developers simply try to tune their game to the least common denominator most of the time. Less memory to work with means a more simplistic game, that's all there is to the optimization process.

Anyway, you are aware both AMD and Nvidia have drastically increased their memory capacities this generation, with Nvidia doing so much more than AMD. The 4080 has 60% more memory than the 3080, and the 4070 Ti 50% more than the 3070 Ti. You think they did that for no reason, or because they figured it was necessary? Use your head.

And by the way, the highest-performing cards are still the ones with the most VRAM, to your dismay, not that it matters.
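For the record, those percentages do check out against the actual capacities (16 GB vs 10 GB, and 12 GB vs 8 GB):

```python
# Sanity-checking the generational VRAM increases quoted above.

def capacity_gain_pct(new_gb, old_gb):
    """Percentage increase in memory capacity between two cards."""
    return (new_gb / old_gb - 1) * 100

print(round(capacity_gain_pct(16, 10)))  # 4080 vs 3080: 60
print(round(capacity_gain_pct(12, 8)))   # 4070 Ti vs 3070 Ti: 50
```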
 