
AMD Radeon RX 6800 XT Tested on Z490 Platform With Resizable BAR (AMD's SAM) Enabled

You have multiple graphs where the difference is a few percent and then one where it triples - obviously that's a large red flag. If enabling Resizable BAR triggers something in the driver that makes it work correctly and keeps the FPS more stable, that doesn't mean Resizable BAR improves the minimum FPS threefold. First you wait for AMD to fix the drivers so they work properly, and then you test again. If AMD can't fix the drivers, then sure, your theory is correct and Resizable BAR can also fix game bugs.

Also, you don't just publish min FPS results for the obvious outlier, as this will raise questions (everyone who has ever done benchmarking will be skeptical of the results ...). You publish frametime graphs, which explain in detail what changed when Resizable BAR got turned on. Isn't it obvious? It doesn't seem to be for them ...
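For anyone who wants to see why a frametime graph tells you more than a single min-FPS number, here is a minimal sketch in Python with made-up frame times (illustrative values only, not numbers from the review): a handful of long frames barely move the average but collapse the reported minimum, which is exactly the pattern a stall or driver hiccup produces.

```python
# Minimal sketch: a few long frames tank "min FPS" while the average barely moves.
# The frame times below are made-up illustration data, not numbers from the review.

frame_times_ms = [16.7] * 95 + [50.0] * 5   # 95 smooth frames + 5 stutter spikes

def avg_fps(times_ms):
    # Average FPS = total frames / total elapsed time
    return 1000.0 * len(times_ms) / sum(times_ms)

def min_fps(times_ms):
    # "Min FPS" as usually reported: the single slowest frame
    return 1000.0 / max(times_ms)

def one_percent_low(times_ms):
    # Average FPS over the worst 1% of frames, a more robust stutter metric
    worst = sorted(times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 * n / sum(worst[:n])

print(f"avg FPS: {avg_fps(frame_times_ms):.1f}")          # ~54 FPS
print(f"min FPS: {min_fps(frame_times_ms):.1f}")          # 20 FPS
print(f"1% low:  {one_percent_low(frame_times_ms):.1f}")  # 20 FPS
```

Remove the five spikes and the minimum jumps from 20 to ~60 FPS while the average only moves from ~54 to ~60, which is why a tripled min FPS usually points at stalls rather than raw throughput.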

Min FPS is the result of bottlenecking. SAM removes the bottleneck, so you get more of the performance you are already receiving (the average FPS).

This isn't complicated.
 
How about the obvious red flag that such low minimum FPS would suggest a driver bug or some issue with the game itself? Like many others have said ~ it ain't so hard :shadedshu:
 


The driver doesn't control how big the addressable window for moving data is, or when the game engine makes a call for those resources to be transferred.

It's not hard to understand.
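To make the "addressable window" concrete: on Linux, every PCI device exposes its BAR ranges in sysfs, so you can read back how large the CPU-visible VRAM aperture actually is. A minimal sketch, assuming a Linux system; the PCI address below is an example placeholder (find your GPU's with `lspci | grep VGA`).

```python
# Minimal sketch (Linux only): read a GPU's BAR sizes from sysfs to see how large
# the CPU-visible aperture is. "0000:03:00.0" is an example address -- replace it
# with your own GPU's address from `lspci`.

from pathlib import Path

PCI_ADDR = "0000:03:00.0"  # assumed example, adjust for your system

def bar_sizes(pci_addr):
    # Each line of the 'resource' file is "start end flags" in hex;
    # the first six lines correspond to BAR0..BAR5 (end == 0 means unused).
    lines = Path(f"/sys/bus/pci/devices/{pci_addr}/resource").read_text().splitlines()
    sizes = []
    for line in lines[:6]:
        start, end, _flags = (int(x, 16) for x in line.split())
        sizes.append(end - start + 1 if end else 0)
    return sizes

for i, size in enumerate(bar_sizes(PCI_ADDR)):
    if size:
        print(f"BAR{i}: {size / 1024**2:.0f} MiB")
```

Without Resizable BAR the VRAM aperture shows up as the legacy 256 MiB window; with it enabled the BAR can cover the card's entire VRAM, which is the whole point of the feature.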
 

Ok, ok, sheesh. An attack from all sides, just because someone is skeptical. You all win. I'm out of this topic.

But before I go I would like to just say one more thing: because of graphs like this, people make false extrapolations and are then disappointed with the results when the reality is different. We have seen this countless times, and it will happen again. Just wait ...
 
So nothing new here, no magic - they are just using the PCIe bus the way it's meant to be used, making use of a capability that was already there but that no one thought of actually utilizing until now. Because no performance gain should be free - just buy a better GPU, right?
 


I want a full-on review of what it does with different chips, to see what gains and losses it provides - for example, in games like CIV it may negatively impact performance if all cores are already busy and one has to queue its work to run driver code and perform a transfer and decompression.

Once game engines are coded to take advantage of the feature, we will see how it changes the landscape.
 
If these are accurate results, this is very promising! I wonder if the massive boost RDR2 is getting has anything to do with the loading of assets in wider open-world games. I can attest that no matter what I do in some games, I cannot get away from those micro stutters whenever I am exploring the world - as new items load in, there is a slight but noticeable stutter. If this fixes that, I will kiss some feet! This is one of the biggest things that keeps driving me crazy in games and that I have not yet been able to fix.
 
This. If it's part of the PCIe spec, why is it being limited to only the most recent hardware? I'd like a boost for my 2600/5700XT.
I'd also like a boost for my 3900X on an X370 motherboard, and for my future 6800XT as well. If they don't do so while Intel systems have it, I'd be really disappointed.
 
Yeah nice gains for Intel users, and a nice kick in the teeth for AMD users without a shiny new 5000 series CPU.

Come on AMD, play it again SAM.
 
Why didn't Nvidia or Intel figure this out earlier? I think regardless of everything else, everyone owes AMD a thank you, because I'm sure this will eventually be enabled on everything.

Seriously though, Nvidia has 10x the financial resources of AMD and way more employees... How come they didn't get to this sooner?
 
Look at that lowest-FPS gain for RDR2!!! o_O o_O
If this does end up working on Intel CPUs, you'd better be damn sure they support it on previous-gen Ryzen CPUs...
 
Pardon my ignorance, but what is the difference between GPU avg and FPS avg? Also, what is the point of testing max FPS?
 
So why would that be remedied with BAR enabled? Or are you suggesting that they ran the "SAM off" test in borderless and then the "SAM on" test in fullscreen?

I don't know if it's a game issue or an AMD driver issue, but I would assume they did the test in borderless, as the game is very insistent on using that mode (at least it is for me).

As to why having SAM on made a difference (assuming they used borderless both times): there is some stalling from the driver/game that is causing the game to lag.

Could be an API issue too, as I think I remember some people with Nvidia cards commenting on that issue with Vulkan.
 
Boosts performance to almost a different class of product, wow!

I'm kinda sad now that SAM might not come to Ryzen 3000 and older. From a comment on videocardz:
Irata said:
The reason could come down to Intel CPUs supporting PDEP and PEXT in hardware since Haswell, whereas Ryzen only supports them in hardware since Zen 3. Previous-gen Ryzen had to run them in microcode, so latency and performance will not be good enough.
It will probably come to 400 series mainboards but again with Zen 3.
But you know, you could be happy that this feature is now available on more platforms (the latest Intel CPUs on ASUS 400 series boards) and appreciate the technology, rather than make a stupid troll post. This looks like a free performance boost, and the more people who can take advantage of it, the better.

Wikipedia source
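For those wondering what PDEP/PEXT actually are: they are the BMI2 "parallel bit deposit/extract" instructions, a fast hardwired bit scatter/gather on Haswell and Zen 3 but microcoded (much slower) on earlier Ryzen, which is the gist of the quoted explanation. The sketch below only emulates the instruction semantics in Python to show what they compute; it is not a claim about how AMD's driver uses them for SAM.

```python
# Illustration of what the BMI2 PDEP/PEXT instructions compute (bit scatter/gather).
# Pure-Python emulation of the semantics only -- it says nothing about how (or
# whether) the Radeon driver actually relies on them for Smart Access Memory.

def pext(value: int, mask: int) -> int:
    """Parallel bit extract: gather the bits of `value` selected by `mask`
    and pack them into the low bits of the result."""
    result, out_pos, bit = 0, 0, 0
    while mask >> bit:
        if (mask >> bit) & 1:
            result |= ((value >> bit) & 1) << out_pos
            out_pos += 1
        bit += 1
    return result

def pdep(value: int, mask: int) -> int:
    """Parallel bit deposit: the inverse scatter -- spread the low bits of
    `value` out to the bit positions selected by `mask`."""
    result, in_pos, bit = 0, 0, 0
    while mask >> bit:
        if (mask >> bit) & 1:
            result |= ((value >> in_pos) & 1) << bit
            in_pos += 1
        bit += 1
    return result

mask = 0b0101_0101
packed = pext(0b1111_0000, mask)          # bits 4 and 6 are selected -> 0b1100
assert pdep(packed, mask) == 0b0101_0000  # scattering them back
```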

 
Ahh yes, I completely forgot that the ASUS ROG Maximus XII EXTREME is a PCIe 3.0 board, my bad. Same goes for the CPU. I'm sorry I forgot to take that into consideration, therefore making the board useless for Rocket Lake as well by not being able to utilize PCIe 4.0 either. Thank you for pointing that out. Now I can go back to making food. And once again, thank you for pointing out the flaw in my logic. Now I know that I won't be buying an ASUS ROG Maximus XII EXTREME because it'll never support PCIe 4.0.

Actually, the Maximus XII Extreme is the only board in ASUS's Z490 lineup that's confirmed to be Gen 4 capable at the hardware level (meaning it's got all the necessary circuitry built into the board itself). All their other boards seem to be lacking it.
 
Why didn't Nvidia or Intel figure this out earlier? I think regardless of everything else, everyone owes AMD a thank you, because I'm sure this will eventually be enabled on everything.

Seriously though, Nvidia has 10x the financial resources of AMD and way more employees... How come they didn't get to this sooner?
Because Nvidia doesn't make motherboard chipsets any more? As such they have no control over features like this.
 
Gonna be some bullshit if AMD doesn't enable this on Ryzen 3000 CPUs as well.
 
Please read the thread at a very minimum before commenting.
 

I did?
 
I'm kinda sad now that SAM might not come to Ryzen 3000 and older. From a comment on videocardz:
Sounds all very scientific, but what is the connection between accelerated bitmasking+compression and wide addressing of the GPU memory?
 
Clearly not, because if you had, you would've known why it won't come to older AMD CPUs: they only support the needed instructions in microcode, not in hardware.
Maybe read the full Twitter thread linked above if the explanation here wasn't good enough, or the Wikipedia link provided above.

From the above Twitter thread:
[attached screenshot from the linked Twitter thread]
 
Hella good boost on minimum framerate, as said above. :eek:
 
So SAM works with Intel CPUs. That surely is good news. It shows very good scaling numbers as well. It would have been a comedy if SAM worked better on Intel than on AMD processors. :D
 