
Far Cry 6 Benchmark Test & Performance

I've always been of the opinion that ray tracing should have been absent from this generation of consoles. Rather,

  • reduced texture/object pop-in, greater draw distance and higher frame rate
  • leveraging SSDs to eliminate loading screens

should've been the focus. Ray Tracing simply isn't ready for mainstream use yet in my opinion.

As someone who adores Doom Eternal (which has a very optimised storage stack and is a very optimised game in general), it is very nice to have fast loading times, and it is jaw-dropping that such a beautiful game can run so damn fast. I never found myself wishing for ray tracing while playing it.

I am really looking forward to having single player games in the future where there are no loading screens when transitioning between missions/levels. It would be such a seamless and cohesive experience.
 
Maybe they used the integrated benchmark?

Am I seeing that right? The RX 6700 XT is faster than the RTX 3090 according to their results?

 
Yes. Btw, I have the same feeling about this (3060 Ti faster than 6900 XT):

[attached: performance chart, 1920x1080]
 
@W1zzard
What's really interesting is VRAM usage. I measured well over 10 GB with the 4K, HD Texture pack, and ray tracing combo, which does stutter sometimes on the 10 GB GeForce RTX 3080, but runs perfectly fine with cards offering 12 GB VRAM or higher. I guess I was wrong when I said that the RTX 3080's 10 GB will suffice for the foreseeable future.
Out of curiosity, when you say it runs perfectly fine on 12+ GB cards, it would be interesting to see what the FPS numbers actually are. Additionally, image comparisons with the HD textures on and off would be great. I think the former was already requested, but I realize you've likely got your hands full.

While I can certainly see the perspective you've laid out, where you feel you were wrong in saying the 10 GB would suffice, I don't think many, if any, will be running the game at 4K + HD textures + RT, not only because of the lackluster outright performance you'd get, but because of the virtually non-existent visual returns for enabling RT in this title.

Can I deny this scenario exists and is possible? Absolutely not. Can I see many gamers actually using this exact combination of settings? Very unlikely, IMO.
 
HUB (using a custom benchmark scene) is not seeing any VRAM-related issue with the 3080's 10 GB though? Probably because they use 32 GB of system RAM instead of 16 GB?
Even the 3070 Ti with 8 GB of VRAM is closing in on the 6800 XT with RT reflections + the HD texture pack.
32 GB of dual-channel, quad-rank memory FTW :rockout:
 
Indeed, it's a very interesting one; they don't mention any stuttering either. I guess my point is more that while it's possible to create VRAM-limited situations in this game, I don't think this scenario and combo of settings is something many people with any graphics card will go for. I.e., the vast majority would either not run the underwhelming RT at all, not run 4K native, not run the HD textures, or use FSR to bring frame rates up to playable levels, any of which would alleviate the VRAM bottleneck.

Unfortunately, it's another case of RT effects tacked on at the 11th hour 'just because', very much in the disproportionate-performance-impact-to-visual-gain category. So the vast majority of people likely won't enable it anyway, no matter the resolution or card. It's an interesting topic overall, IMO, and I'll be on the lookout for situations where the 10 GB is a genuine hindrance under what I'd call more typical/common use cases.
 

I don't think RT was tacked on at the 11th hour in this game; the reduced number of RT-reflective surfaces was a deliberate move. This is an AMD-sponsored game, after all; the devs have to make sure it runs sufficiently well on RX 6000 with RT on, just like RE8 Village. So yeah, RT is not even worth using in this title, unless you are hitting a CPU bottleneck and RT on makes no difference to FPS.

Though I'm happy that this game runs well on Nvidia hardware, and the visuals justify the performance figures (as opposed to AC Valhalla; funny that HUB thinks the same thing I do). I'm pondering whether to buy this game on EGS now or wait for it to show up on Steam :rolleyes:

edit: Checking out reviews, this game probably needs a year to build up enough content, so Steam it is then.
 
Nice to see FSR has a promising future. I am an advocate of both DLSS and FSR, same as I was with FreeSync and G-Sync; it's good for the immersive experience.
 
Yes. Btw, I have the same feeling about this (3060 Ti faster than 6900 XT):
The TPU test rig simply has a significantly lower CPU-imposed frame-rate ceiling. HUB has a faster processor (5950X) and dual-rank memory, which together give something like a 10% uplift, which in turn provides better separation between the GPUs at low resolutions.
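A toy min(GPU limit, CPU limit) model illustrates the point; the numbers below are made up purely for illustration, not anyone's actual data:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical, made-up numbers: what each card could render at 1080p
    // with an infinitely fast CPU, and each rig's CPU-imposed FPS ceiling.
    const double fps_3090 = 160.0, fps_6700xt = 130.0;
    const double tpu_cpu_cap = 125.0;        // slower CPU/memory combo
    const double hub_cpu_cap = 125.0 * 1.1;  // ~10% higher ceiling (5950X, dual-rank)

    // Delivered frame rate is roughly the lower of the two limits.
    auto fps = [](double gpu, double cap) { return std::min(gpu, cap); };

    std::printf("TPU rig: 3090 %.1f vs 6700 XT %.1f\n",
                fps(fps_3090, tpu_cpu_cap), fps(fps_6700xt, tpu_cpu_cap));
    std::printf("HUB rig: 3090 %.1f vs 6700 XT %.1f\n",
                fps(fps_3090, hub_cpu_cap), fps(fps_6700xt, hub_cpu_cap));
    // TPU rig: both cards pile up at the 125 FPS cap (run-to-run noise can
    // even flip their order); HUB rig: 137.5 vs 130.0, visible separation.
}
```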
 
They said next-gen consoles would bring real multithreading to games. Here it is:
[attached PCGH screenshot: Far Cry 6 HD textures not loading on a GeForce RTX 3080 10 GB]



Some of you may have read about the stuttering and low performance in BF 2042 with a 3950X. Another single-threaded game.
And they keep promoting useless (game-wise) features like DirectStorage when these games are terrible.
An update on the topic:

[attached: ubisoft.png]


Well, it's Ubisoft.
 
An update on the topic
Yeah, the game is CPU-limited at lower resolutions. I mentioned this in the conclusion and on the FSR testing page.
 
The game seems to like bandwidth. This could be a repeat of Halo Infinite, where the One X gets a higher resolution than the Series S while still managing 60 FPS.
 
The PS4 also has 8 cores, and games have been able to use 7 of them since an update last year.

What's the point: people bought 12- and 16-core AMD chips and thought they were the PCMR in games? :laugh:

The sweet spot is ATM still a 6-core with HT, or an 8-core without HT like the 9700.

But the i7-5775C is insane: Broadwell nearly takes out a 3600 that is clocked 10% higher and has 50% more threads. An overclocked Broadwell would crush the 3600.

I hit 100% total CPU usage all the time in BF5 multiplayer with a 5.2 GHz 9900K when running a mix of high and ultra settings to get 180-200 FPS. Even 8 cores isn't enough for everything.
 
I think the better solution would be to improve per-core performance (think Zen 3 with 8 cores and 96 MB of L3 cache) rather than to increase the core count any further (at this point in time).
 
But I just started FC5 yesterday ...
 
I don't think RT was tacked on at the 11th hour in this game; the reduced number of RT-reflective surfaces was a deliberate move. This is an AMD-sponsored game, after all; the devs have to make sure it runs sufficiently well on RX 6000 with RT on, just like RE8 Village. So yeah, RT is not even worth using in this title, unless you are hitting a CPU bottleneck and RT on makes no difference to FPS.
In my experience, having RT on increases the load on the CPU, so it might even make the CPU bottleneck appear at a lower FPS than with RT off. I experienced this myself when I had a 3600 paired with a 3080 in games such as CP2077 and WD Legion: some areas in those games had the FPS drop lower with RT on, and in those instances the GPU usage was in the 65-80% range.
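If you want to check for this yourself, the usual diagnostic is watching GPU utilization while playing; here is a minimal sketch using NVML (assumes an Nvidia card with the NVML library available, link with -lnvidia-ml; the 30-sample loop is an arbitrary choice):

```cpp
#include <nvml.h>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;
    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) return 1;

    for (int i = 0; i < 30; ++i) {  // sample once a second while the game runs
        nvmlUtilization_t util;
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS) {
            // Sustained readings well under ~95-100% while FPS is low are the
            // classic sign the CPU (not the GPU) is holding the frame rate back.
            std::printf("GPU %u%%  memory controller %u%%\n", util.gpu, util.memory);
        }
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    nvmlShutdown();
    return 0;
}
```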
 
OK, after playing for a bit and trying to get used to FSR... I just can't. It looks terrible; the image is badly degraded. A step in the right direction, maybe, but it's kind of not worth using at 4K at the moment. I would rather take the FPS hit in this title, or just turn it down to medium, which actually looks pretty much the same to me.

It just introduces so much blur in the mid-distance, and in a game that's played at range, it's far too distracting versus native or something like DLSS 2 in Cyberpunk. I can't wait until we stop adding blur and motion blur to everything... things should be blurry because they move fast, not because we blur them.
 

Did you spot more shimmering with FSR? I think the devs use a negative LOD bias with FSR (which is the correct thing to do with FSR/DLSS), but it will cause more shimmering. That's not a problem with DLSS because of its inherent SSAA, though.
 
As a long-term Far Cry series fan, I think I'll sit this one out for the time being and wait for a few patches and a sale.
 
Negative LOD bias doesn't make much sense with FSR, since FSR sharpens. At the very least, depending on the sharpening strength, it actually negates the need for a negative LOD bias. It's DLSS/TAA that has a stronger need for a negative LOD bias, because they blur, so some of the added texture sharpness from larger mipmaps helps to offset it.
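For reference, the baseline most upscaler integration guides quote (the DLSS programming guide uses essentially this, if I remember right) is a mip bias of log2(render width / display width), negative whenever you render below output resolution. A quick sketch with FSR's scale factors plugged in:

```cpp
#include <cmath>
#include <cstdio>

// Rule-of-thumb mip LOD bias for an upscaler: log2(render / display).
// More negative = larger (sharper) mips = more texture detail, but also
// more shimmer in motion, which is the trade-off discussed above.
double MipBias(double renderWidth, double displayWidth) {
    return std::log2(renderWidth / displayWidth);
}

int main() {
    // FSR scale factors at 4K: Ultra Quality is 1.3x per axis, Performance 2.0x.
    std::printf("Ultra Quality: %.2f\n", MipBias(3840.0 / 1.3, 3840.0)); // ~ -0.38
    std::printf("Performance:   %.2f\n", MipBias(3840.0 / 2.0, 3840.0)); // -1.00
}
```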
 
Indeed, it's a very interesting one; they don't mention any stuttering either. I guess my point is more that while it's possible to create VRAM-limited situations in this game, I don't think this scenario and combo of settings is something many people with any graphics card will go for. I.e., the vast majority would either not run the underwhelming RT at all, not run 4K native, not run the HD textures, or use FSR to bring frame rates up to playable levels, any of which would alleviate the VRAM bottleneck.
For larger variations, measurements such as the 99th percentile will tell you something if it's causing major problems, but usually not for sporadic microstutter; for that, you need to analyze the timing of every single frame.
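A quick illustration of that point with synthetic numbers (the frame times are made up): five isolated 100 ms hitches in a thousand 16.7 ms frames vanish from the 99th-percentile figure but jump out of a per-frame scan:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Nearest-rank percentile over a copy of the data.
double Percentile(std::vector<double> v, double p) {
    std::sort(v.begin(), v.end());
    return v[static_cast<size_t>(p / 100.0 * (v.size() - 1))];
}

int main() {
    // 1000 smooth frames at 16.7 ms, with five sporadic 100 ms hitches.
    std::vector<double> ft_ms(1000, 16.7);
    for (size_t i : {100u, 300u, 500u, 700u, 900u}) ft_ms[i] = 100.0;

    // Only 0.5% of frames are bad, so the headline metric stays clean: 16.7 ms.
    std::printf("99th percentile: %.1f ms\n", Percentile(ft_ms, 99.0));

    // A per-frame pass exposes every hitch immediately.
    for (size_t i = 1; i < ft_ms.size(); ++i)
        if (ft_ms[i] > 3.0 * ft_ms[i - 1])
            std::printf("spike at frame %zu: %.1f ms\n", i, ft_ms[i]);
}
```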

What VRAM bottleneck? If a game is actually running out of VRAM, the symptoms will be severe: either resource pop-in or severe stutter with a sizeable drop in frame rate (which one depends on the game engine's design). "VRAM usage" is a measurement of how much VRAM is allocated, not how much is actually in use at any moment. A lot of the allocated memory may be temporary buffers which are used in some rendering passes but not others, which means the data can be heavily compressed when not in use. The true VRAM usage in a game varies from millisecond to millisecond. In addition, a lot of buffers are mostly emptiness and can be heavily compressed regardless. So unless you somehow have low-level hardware debugging tools for the GPU, the only way to detect a VRAM (capacity) bottleneck is by measuring its impact.
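For what it's worth, this is the kind of number monitoring overlays report; a minimal sketch of the DXGI budget/usage query (Windows-only, link dxgi.lib; note CurrentUsage is per-process allocation, so a game or an injected overlay would run this in-process):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter> adapter;
    if (FAILED(factory->EnumAdapters(0, &adapter))) return 1;
    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // CurrentUsage counts every live allocation this process holds, including
    // transient buffers that may sit idle most of the frame. It says nothing
    // about which bytes the GPU actually touches, which is the point above.
    std::printf("allocated: %llu MB of %llu MB budget\n",
                info.CurrentUsage >> 20, info.Budget >> 20);
}
```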

As a long-term Far Cry series fan, I think I'll sit this one out for the time being and wait for a few patches and a sale.
With the quality of software these days, I'd say this should be the recommendation for everyone.
 
Negative LOD bias doesn't make much sense with FSR, since FSR sharpens. At the very least, depending on the sharpening strength, it actually negates the need for a negative LOD bias. It's DLSS/TAA that has a stronger need for a negative LOD bias, because they blur, so some of the added texture sharpness from larger mipmaps helps to offset it.

Sounds about right. I prefer the softer image of DLSS and just apply a "correct" amount of sharpening on top.
Now what can I do with FSR when it appears oversharpened? (Probably just not use it, LOL.)

Btw, it does look like FC6 uses a negative LOD bias with FSR, though; some texture detail that isn't even in the native image shows up with FSR. Maybe it's just noise from too much sharpening?
 