# AMD Radeon Resizable BAR / Smart Access Memory



## W1zzard (Nov 18, 2020)

AMD's Smart Access Memory feature is highly interesting. It promises a performance boost when the new Radeon RX 6800 Series cards are paired with an AMD Ryzen Zen 3 processor. We test this extensively in this article, across 22 games at three resolutions, at up to 4K Ultra HD.

*Show full review*


----------



## Rais (Nov 18, 2020)

> If you're one of the lucky few who flew to the horn of Africa, joined a pirate gang, hijacked a Maersk superheavy, broke into the right container, and pulled out a Ryzen 5000 processor, then Smart Access Memory is a cool feature to have (no, don't do that)



Too late, I'm already in Black Lagoon mode.


----------



## Ravenas (Nov 18, 2020)

It should have been enabled on Ryzen 3000. I agree it should be enabled on Intel chips as well. It should be considered a selling point for the graphics card, not a selling point for platform selection.


----------



## Sandbo (Nov 18, 2020)

Tbh, that's a difference way too small to even be mentioned, or am I missing something?


----------



## ZeppMan217 (Nov 18, 2020)

Sandbo said:


> Tbh, that's a difference way too small to even be mentioned, or am I missing something?


SAM makes the 6800 XT marginally faster (3-4%) at sub-4K resolutions, with nothing changing at 4K.
I'm more curious whether the plain 6800 is any better with SAM enabled.


----------



## W1zzard (Nov 18, 2020)

Sandbo said:


> Tbh, that's a difference way too small to even be mentioned, or am I missing something?


It is way bigger than that, and what makes it interesting is that it's free and available on all platforms. Of course, in a subjective scenario without an FPS counter you'd never notice the difference, but all the little gains add up eventually.


----------



## birdie (Nov 18, 2020)

> NVIDIA has already announced that they will add a similar feature to their GeForce graphics cards, probably with wider platform support. I'm sure this will lead AMD to open their discovery to more chipsets and hardware combinations.



LMAO.

AMD sure loves a taste of proprietary just like NVIDIA.


----------



## bug (Nov 18, 2020)

Nvidia red and AMD green. Were you intentionally trying to make our brains hurt reading those graphs?


----------



## W1zzard (Nov 18, 2020)

bug said:


> Nvidia red and AMD green. Were you intentionally trying to make our brains hurt reading those graphs?


It's the default coloring of my charts. Let me fix this


----------



## L'Eliminateur (Nov 18, 2020)

weeeezard (ever since watching Hoovie's Garage bring cars to the Car Wizard, I can't help it), ¿can you test the effects of enabling HW scheduling on these new cards in a separate article? It's been several months, a new Windows build, and several drivers (not to mention game patches) since this feature was introduced, but everything has been mum on that front apart from the initial benchmarks done back then (on 2004 and initial drivers).


----------



## W1zzard (Nov 18, 2020)

bug said:


> Nvidia red and AMD green. Were you intentionally trying to make our brains hurt reading those graphs?


Fixed, better?


----------



## ZeppMan217 (Nov 18, 2020)

W1zzard said:


> Fixed, better?


Yas, big thank!


----------



## Space Lynx (Nov 18, 2020)

I got everything I need to enable this now. I'm excited. I'm glad AMD didn't enable it for older CPUs just yet, though; AMD has a bad history of bugs and stability, so I'd rather they focus all their energy on getting the latest and greatest working first, then work their way down the line... mainly because I just don't trust them to be stable otherwise. I can't wait to game!


----------



## claylomax (Nov 18, 2020)

What's with the F1 2020 and Project Cars 3 charts compared to the original 6800XT review?
That Intel test system is definitely outdated.


----------



## W1zzard (Nov 18, 2020)

claylomax said:


> What's with the F1 2020 and Project Cars 3 charts compared to the original 6800XT review?
> That Intel test system is definitely outdated.


Different settings. When those games detect a change in CPU, they reset to different detail levels, which I noticed only after I had gathered all the data. The data is still valid, just not comparable to the Intel rig.


----------



## bug (Nov 18, 2020)

W1zzard said:


> Fixed, better?


Not ideal, but yes, better.

Ideally, I'd have AMD stick to red, Nvidia to green and Intel to blue. And then I'd use different shades for variations of the same product.


----------



## Space Lynx (Nov 18, 2020)

I am going to leave SAM enabled 24/7, personally. The 2-3 FPS hit in certain games is not a big deal to me; the gains in games like Gears 5, though, are definitely worth it. I won't have to bother with it: set it and forget it, leave almost all other driver settings at default, and game time!


----------



## bug (Nov 18, 2020)

lynx29 said:


> I am going to leave SAM enabled 24/7 personally. the 2-3 fps hit on certain games not a big deal to me, the gains in games like gears 5 though is def worth it. I won't have to bother with it, set it and forget it, leave almost all other settings at default for drivers, and game time!


Which gain is that? 182fps not enough for you @FHD? Or do the extra 5fps @4k make a difference to you?
It's a slight improvement, but really, nothing to rave about.


----------



## claylomax (Nov 18, 2020)

W1zzard said:


> Different settings. When those detect a change in CPU they will reset to different detail levels, which I noticed only after I got all the data. The data is still valid, just not comparable to the Intel rig


Ok, I see. Thanks.


----------



## Space Lynx (Nov 18, 2020)

bug said:


> Which gain is that? 182fps not enough for you @FHD? Or do the extra 5fps @4k make a difference to you?
> It's a slight improvement, but really, nothing to rave about.



Not raving about it... it will just be fun to enable... cause why not?

I am raving about getting my hands on a 5600X and a non-XT 6800 on their respective launch days; can't believe I got so lucky, it was insane trying to check out. I expect we won't see steady stock until April 2021... and I am also raving that I just paid $579 for an AMD card, in the year 2020, that beats a 2080 Ti... are you kidding me? This is a great day for everyone; competition is back on all fronts. This is awesome for the future of our hobby. I can't wait to play Cyberpunk 2077 now. I couldn't care less about enabling ray tracing, so Nvidia has that win for sure, but meh. I'm still quite happy.


----------



## suqd (Nov 18, 2020)

Will SAM work if I'm running the GPU in Gen4 x8 mode? I'll be using the other x8 lanes for M.2 RAID 1.


----------



## W1zzard (Nov 18, 2020)

suqd said:


> Will SAM work if I'm running the GPU in Gen4 x8 mode? I'll be using the other x8 lanes for m.2 Raid1.


No reason to assume it wouldn't. It has nothing to do with PCIe lanes


----------



## chodaboy19 (Nov 18, 2020)

> If you're one of the lucky few who flew to the horn of Africa, joined a pirate gang, hijacked a Maersk superheavy, broke into the right container, and pulled out a Ryzen 5000 processor, then Smart Access Memory is a cool feature to have (no, don't do that).



I enjoyed that closing.


----------



## okbuddy (Nov 18, 2020)

Why does everything come loose under 4K? Isn't that 128 MB cache snake oil?


----------



## kruk (Nov 18, 2020)

birdie said:


> LMAO.
> 
> AMD sure loves a taste of proprietary just like NVIDIA.



It isn't proprietary, stop making things up: https://www.pcgamer.com/amd-smart-access-memory-not-proprietary-promise/



> AMD says its Smart Access Memory isn't proprietary and isn't only locked into working with its own Ryzen 5000-series CPUs and Radeon RX 6000-series GPUs. It's just that it hasn't yet worked with any other hardware vendors to enable it, though it welcomes the opportunity to do so.


----------



## Julhes (Nov 18, 2020)

Thanks to my RTX 2080 Ti, bought for €429 (second-hand).


----------



## bug (Nov 18, 2020)

kruk said:


> It isn't proprietary, stop making things up: https://www.pcgamer.com/amd-smart-access-memory-not-proprietary-promise/


Well, tbh, they haven't even worked with themselves to bring this to Zen 2 CPUs.
But I suspect that's a temporary situation and they just focused on one CPU gen to get to market faster.

Plus, now that we see it doesn't actually make much difference IRL, we can really stop worrying about it. Those who can get it will need to turn on their monitoring HUD to see the few extra FPS they enjoy; the rest can rest assured they're not missing much.

And while typing this, it just hit me that not everything is about FPS: I wonder if enabling SAM affects power draw somehow. I'd be more psyched if SAM would let you shave off a few watts in memory-intensive scenarios, rather than gain a few FPS.


----------



## mechtech (Nov 18, 2020)

W1zzard said:


> It's the default coloring of my charts. Let me fix this



"Red–green color blindness affects up to 8% of males and 0.5% of females of Northern European descent. "








Color blindness - Wikipedia (en.wikipedia.org)
				




May I suggest orange and blue please/thanks


----------



## defaultluser (Nov 18, 2020)

We have been using the same 256 MB PCIe memory window since the 32-bit days, for the last 15 years, without a problem.

Letting the window cover all of VRAM is not going to suddenly, magically make the PCIe bus any faster; you just get slightly higher throughput (due to less protocol overhead). It's like all the hype around Ethernet jumbo frames, which in practice do very little (unless you are processor-limited).

Once drivers and games get optimized for this, it will be a more reliable performance improvement... but it's still going to be 5% or less.
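For anyone curious what that "memory window" looks like in practice: on Linux, the BAR apertures a GPU exposes can be read from sysfs, where each line of a device's `resource` file is `start end flags` in hex. A minimal sketch of the arithmetic; the path and the sample values below are made up for illustration, not taken from the review:

```python
# Sketch: compute PCI BAR region sizes from the text format of a Linux
# sysfs "resource" file (e.g. /sys/bus/pci/devices/0000:0b:00.0/resource).
# Each line holds "start end flags" in hex; region size = end - start + 1.

def bar_sizes(resource_text: str) -> list[int]:
    """Return the size in bytes of each populated BAR region."""
    sizes = []
    for line in resource_text.splitlines():
        start, end, _flags = (int(x, 16) for x in line.split())
        if end:  # unpopulated BARs read back as all zeros
            sizes.append(end - start + 1)
    return sizes

# Made-up sample: a classic 256 MiB aperture and a resized 16 GiB one.
sample = (
    "0x00000000e0000000 0x00000000efffffff 0x0000000000140204\n"
    "0x0000000000000000 0x0000000000000000 0x0000000000000000\n"
    "0x0000007c00000000 0x0000007fffffffff 0x000000000014220c\n"
)
print([s // (1 << 20) for s in bar_sizes(sample)])  # -> [256, 16384]
```

With Resizable BAR active, `lspci -vv` (or this file) shows the large aperture in place of the 256 MB one.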


----------



## Steevo (Nov 18, 2020)

defaultluser said:


> We have been using the same 256-MB PCIe memory window from the 32-bit days for the last 15 years without a problem.
> 
> Making this infinite-sized is not going to suddenly magically make the PCIe bus any faster, you just get slightly higher throughput (due to less protocol overhead).  It's like all the  hype of Ethernet Jumbo Frames, and yet it does very little (unless you are processor limited)
> 
> Once drivers and games get optimized for this, it will be a more  reliable performance improvement...but it's still going to be 5% or less.



Less than 5% is all the 3080 is faster than the 6800 XT by, and SAM is free, so......

It may improve things like F@H and, unfortunately, also mining efficiency; the former is welcome, the latter can light themselves on fire for all I care. It would also be interesting to see how SQL lookup times would change with the CPU being able to use GPU memory and hardware.


----------



## defaultluser (Nov 18, 2020)

Steevo said:


> Less than 5% is all the 3080 is faster than the 6800XT, and SAM is free, so......




This is  not some proprietary solution.  I would expect both implementations of this standard feature to give you similar increases in FPS. We were just waiting for the 64-bit conversion to finish before this increase in memory-window size  made any sense.

You are not going to magically unlock more FPS from the 6800 than it has today.


----------



## Steevo (Nov 18, 2020)

defaultluser said:


> This is  not some proprietary solution.  I would expect both implementations of this standard feature to give you similar increases in FPS. We were just waiting for the 64-bit conversion to finish before this increase in memory-window size  made any sense.
> 
> You are not going to magically unlock more FPS from the 6800 than it has today.



Did you read the review? It showed a 2% performance improvement being "unlocked", and with further implementation I'm sure it will gain a little more. So, other than one game engine, where is your data to back up the claim that it doesn't improve performance today?

It will be interesting to see how Nvidia implements this and what performance increase they gain too; none of these companies are adding features to slow down their hardware....


----------



## zlobby (Nov 18, 2020)

Holly schnitzel! Eat $hit, nvidia! 



W1zzard said:


> It's the default coloring of my charts. Let me fix this


Where are the comparisons between frame times and input latencies?


----------



## W1zzard (Nov 18, 2020)

zlobby said:


> Where are the comparisons between frame times and input latencies?


Thanks for the reminder; frametimes have been added. Nothing to see there other than just lower frametimes with BAR resizing.


----------



## zlobby (Nov 18, 2020)

W1zzard said:


> thanks for reminding, frametimes have been added. Nothing to see there other than just lower frametime with BAR resizing


Sweet! Yer a wizard, W1zzard!



bug said:


> Not ideal, but yes, better.
> 
> Ideally, I'd have AMD stick to red, Nvidia to green and Intel to blue. And then I'd use different shades for variations of the same product.


Different shading and fill patterns can also be used.


----------



## Space Lynx (Nov 18, 2020)

@W1zzard 

just an FYI, minimum frame rates increase a lot with SAM enabled; timestamped Linus video for you where he talks about it


----------



## HD64G (Nov 18, 2020)

So, when using the Intel setup, the 6800 XT loses to the 3080 by 4%; when using the 5900X setup, it closes the gap to 1% without SAM and wins by 1% with SAM, at 1440p on average. Good to know. PCIe 4 could play its role there, methinks.


----------



## Space Lynx (Nov 18, 2020)

HD64G said:


> So, when using the Intel setup 6800XT loses to 3080 by 4% and when using the 5900X setup it closes the gap to 1% without SAM and wins by 1% using SAM at 1440P on average. Good to know. PCIE4 could play its role there me thinks.




Hardware Unboxed's review of the 6800 XT is the best of all of them, I think; I watched them all. Ray tracing really doesn't matter to me, I just want high-framerate 1080p/1440p gaming. So yeah, rock on.


----------



## arbiter (Nov 18, 2020)

So AMD claims big gains, near 10%, but an independent review shows at most 2% on average. Yeah, sounds like AMD marketing is still what it used to be.


----------



## crazyeyesreaper (Nov 18, 2020)

arbiter said:


> So AMD claims big gains like near 10% but in independent review only gives at most 2%. Yea sounds like AMD marketing is still like it used to be.


Well, to be honest, there are games that do see such a boost.

*Detroit: Become Human*
1080p: 8%
1440p: 7%
4K: 4%

*Gears 5*
1080p: 9%
1440p: 11%
4K: 6%

*Hitman 2*
1080p: 14%
1440p: 11%
4K: 7%

Granted, 3 games out of many is pretty limited. However, it's likely that with time, drivers, and game-engine enhancements, especially with the consoles being AMD CPU/GPU, we will see SAM offer a more sustained boost. For now it's extremely limited, but in some situations a 4-14% boost in performance is nothing to sneeze at, especially in an age where GPU and CPU overclocking are becoming increasingly irrelevant due to power limits and other factors. I imagine with the Xbox essentially being a pre-spec PC, we will see more titles use SAM, and NVIDIA's implementation as well. A free performance bump is a free bump.


----------



## Space Lynx (Nov 18, 2020)

crazyeyesreaper said:


> Well to be honest there are games that do see such a boost.
> 
> *Detroit Become Human*
> 1080p 8% performance gain.
> ...




you haven't added in SAM or highly tuned RAM yet either... I will be getting 20-40% gains from my RAM and SAM... so yeah... I am aware not everyone will be able to activate SAM, but that's not my problem.


----------



## crazyeyesreaper (Nov 18, 2020)

lynx29 said:


> you haven't added in SAM or highly tuned RAM yet either... I will be getting 20-40% gains from my RAM and SAM... so yeah... I am aware not everyone will be able to activate SAM, but that's not my problem.


Point remains: across 22 games at 4K, most of the time there's zero change, but the change we do see is a good indicator of the future in those select titles; no one will turn down free performance, after all. And considering the AMD cards can actually OC a bit, I'd say they're sitting in a good spot. Seeing an OC increase performance by 9% is a nice change of pace; add in Rage Mode for another 1-2%, add in SAM for even just 2% on average, and that's still a nice uptick of around 12% with the capacity to push further. For instance, in a title already seeing a 10-11% boost at 1440p via SAM, add in a 9% OC and Rage Mode and, oh look, you gained 20%. That's a nice feather in AMD's cap, but it needs to become a mainstream aspect, not a bonus limited to select scenarios.
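A small aside on the arithmetic of stacking gains: independent speedups compound multiplicatively, not additively, which is why 9% + ~1.5% + 2% lands near 12.8% rather than a flat 12.5%. A quick check (the 1.5% figure is a stand-in for the "1-2%" Rage Mode estimate above):

```python
# Independent percentage gains compound multiplicatively, not additively.
# Illustrative numbers from the discussion: ~9% OC, ~1.5% Rage Mode, ~2% SAM.
gains = [0.09, 0.015, 0.02]

additive = sum(gains)            # the naive sum of the percentages
compound = 1.0
for g in gains:
    compound *= 1.0 + g          # apply each gain on top of the previous
compound -= 1.0

print(f"additive: {additive:.1%}, compound: {compound:.1%}")
# prints: additive: 12.5%, compound: 12.8%
```

At these small percentages the two barely differ, but the gap widens as individual gains grow.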


----------



## zlobby (Nov 19, 2020)

@W1zzard , I know you are a skilled coder and have good knowledge of hardware and how it works at a low level, but can you please elaborate on why you are a fan of CSM in 2020? I know UEFI is far from perfect, but still?



bug said:


> Well, tbh, they haven't even worked with themselves to bring this to Zen2 CPUs.
> But I was suspecting that's a temporary situation and they just focused on one CPU gen to get to the market faster.
> 
> Plus, now that we see it doesn't actually do much difference irl, we can really stop worrying about it. Those who can get it will need to turn on their monitoring HUD to see the few fps they enjoy, the rest can rest assured they're not missing much.
> ...


I'm more worried that SAM will bring tons of new vulnerabilities for the hax0rz to play with. 



lynx29 said:


> you haven't added in SAM or highly tuned RAM yet either... I will be getting 20-40% gains from my RAM and SAM... so yeah... I am aware not everyone will be able to activate SAM, but that's not my problem.


It'd be interesting to see how SAM performs with heavily-OC'd VRAM, as well as with speedy system RAM. W1zz, get cracking!


----------



## Space Lynx (Nov 19, 2020)

zlobby said:


> It'd be interesting to see how SAM performs with heavily-OC'd VRAM, as well as with speedy system RAM. W1zz, get cracking!



@W1zzard it's none of my business, but I just don't want to see you waste your limited time. IMO any RAM OC'ing tests should wait for the next BIOS; AMD will make 4000 1:1 much easier to obtain, etc. I have my RAM stable at 3600 CL14-14-14 with my 5600X, and it is dual-rank, so I think that's as good as it gets right now until the next BIOS. /shrug



crazyeyesreaper said:


> Point remains 22 games 4K res settings most of the time its 0 change but the change we do see is a good indicator of the future in those select titles. no one will turn down free performance after all. And considering the AMD cards can actually OC a bit... well i would say they are sitting in a good spot seeing an OC increase performance by 9% is a nice change of pace, add in rage mode for another 1-2% add in SAM for even if its just 2% on average thats still a nice up tick of 12% with the capacity to push up further. for instance in a title at 1440p already seeing a boost of 10-11% via SAM, add in 9% OC and rage mode and oh look you gained 20%. Thats a nice feather in AMD's cap but it needs to become a mainstream aspect not a limited to select scenarios bonus.



The Hardware Unboxed and Ars Technica reviews (I think) had much different performance gains for SAM than W1zz did. Lots of variables this time around; who knows what the real answer is. Either way, I'm enabling SAM when my card gets here; meh, why not. Also, it's not about the fastest FPS with SAM: Linus Tech Tips shows minimum FPS increases a lot with SAM on.


----------



## crazyeyesreaper (Nov 19, 2020)

lynx29 said:


> @W1zzard it's none of my business, but I just don't want to see you waste your limited time, imo any ram oc'ing tests should wait for the next BIOS, AMD will be make 4000 1:1 much easier to obtain, etc.  i have my ram stable at 3600 cas 14-14-14 with my 5600x, and it is dual ranked. so i think thats as good as it gets right now until next bios. /shrug
> 
> 
> 
> hardware unboxed and ars technica review (i think) had much different performance gains for SAM than w1zz did. lot of variables this time around, who knows what the real answer is. either way im enabling SAM when my card gets here, meh why not. also its not about fastest fps with SAM, linustechtips shows us min fps increases a lot with SAM on


True; Gamers Nexus, on the other hand, found that SAM had next to no real benefit outside of one or two titles. So it's likely highly dependent on the title/game engine and driver optimization.


----------



## alres3 (Nov 19, 2020)

I have a 2700 + B450 and pre-ordered a 3080 (maybe I'll get it in a week). To really benefit from SAM I'd need to upgrade my motherboard to X570 and a new CPU, correct? Probably better off with my incoming 3080, gaming at 4K 60+ FPS.


----------



## ratirt (Nov 19, 2020)

@W1zzard Is everything OK with the charts? For instance, F1 2020 shows some weird numbers.
Is this the lowest quality possible? 486.2 FPS for the 3080 at 4K? In the 6800 XT review it had 145.5 FPS.


----------



## W1zzard (Nov 19, 2020)

ratirt said:


> @W1zzard Is everything OK with the charts? For instance the F1 2020 shows some weird numbers.
> Is this the lowest quality possible? 486.2 FPS for 3080 at 4k? In the 6800XT review it had 145.5 FPS.


I copied over my settings from the Intel rig, but when F1 detects a change in CPU, it will reset to different settings. I noticed this only after testing.

Results within this article are perfectly comparable, just not comparable to my other reviews


----------



## ratirt (Nov 19, 2020)

W1zzard said:


> I copied over my settings from the Intel rig, but when F1 detects a change in CPU, it will reset to different settings. I noticed this only after testing.
> 
> Results within this article are perfectly comparable, just not comparable to my other reviews


Over 480 FPS at 4K? That is crazy.


----------



## Spanners (Nov 19, 2020)

birdie said:


> LMAO.
> 
> AMD sure loves a taste of proprietary just like NVIDIA.



Thanks for the reminder I wasn't logged in.


----------



## ratirt (Nov 19, 2020)

@W1zzard why are you still not using 4 RAM modules? It does boost performance for both Intel and AMD.


----------



## W1zzard (Nov 19, 2020)

ratirt said:


> @W1zzard why are you still not using 4 RAM modules? It does boost performance for both Intel and AMD.


Why is everybody suddenly talking about this? It's been known for a long time, but it just isn't practical for most.


----------



## Woomack (Nov 19, 2020)

ratirt said:


> @W1zzard why are you still not using 4 RAM modules? It does boost performance for both Intel and AMD.



It's not really like it boosts performance. Depending on the application and many other factors, it gives a 0-5% performance gain; in most cases, it's close to 0%. Dual-rank memory has been around for long years, but suddenly people start talking and making up stories (most don't test anything, just repost what they saw around the web). Then forums get flooded with "up to 15% boost" claims.

Btw, nice article W1zzard


----------



## ratirt (Nov 19, 2020)

W1zzard said:


> Why is everybody suddenly talking about this? It's been known for a long time, but just isn't practical for most?


Why not practical? I know it's been known for some time, but aren't CPUs now struggling to keep up at 1080p, considering the current GPUs? Wouldn't it be better to get some extra performance out of the CPUs when evaluating and testing current graphics cards? Considering the trend in GPU performance increases, I think it's best to get the most out of the CPUs to better reflect what you can expect of the card.



Woomack said:


> It's not really like it boosts performance. Depending on the application and many other factors, it gives a 0-5% performance gain; in most cases, it's close to 0%. Dual-rank memory has been around for long years, but suddenly people start talking and making up stories. Then forums get flooded with "up to 15% boost" claims.
> 
> Btw, nice article W1zzard
> 
> Btw. nice article W1zzard


I'm talking about games only here, and yes, it does boost CPU performance in both cases; the graphics cards achieve more FPS just by adding 2 RAM modules to have 4 instead of 2. People started talking about it for a reason, and that would be CPU bottlenecks at 1080p.


----------



## W1zzard (Nov 19, 2020)

ratirt said:


> Why not practical? I know it's been known for some time but aren't the CPUs now struggling with performance at 1080p considering the current GPUs? Wouldn't it be better to get some extra performance out of the CPUs to evaluate and test the current graphics cards? Considering the trend of how the GPUs performance increase I think it is best to get the most of the CPUs to have a better reflection, what you can expect of the card.


Yeah, I'll definitely look more into this when I have time. Right now I have 10x SSDs from the last months, 2x NVIDIA NDA custom designs for tomorrow, a Radeon custom design soon, new NVIDIA cards next month, the 6900 XT next month, and the 3080 Ti next month. Also reevaluating the GPU power-testing methodology and GPU noise testing; I need a new CPU test methodology and new GPU test benchmarks, possibly on Zen 3.


----------



## Woomack (Nov 19, 2020)

ratirt said:


> I'm talking about games here only and yes, it does boost performance of the CPUs in both cases and the graphics cards achieve more FPS just by adding 2 additional ram modules to have 4 instead of 2. They started talking about it for a reason and that would be CPU bottlenecks at 1080p.



I actually tested that recently, and at 1080p there is up to a ~3-5% difference when games already run at 100+ FPS. At 1440p and above there is barely any gain, typically within the margin of error. I simply can't confirm what some others spread around the web.

So yes, it helps a bit, but it's not really a performance boost. Results with 2x dual-rank modules and 4x single-rank are about the same. Results with 4x single-rank modules are usually a bit better than with 4x dual-rank (a matter of memory timings and what the motherboard sets on auto). There's also the matter of the motherboard and its topology. Simply many variables, so you can't always say that 4 memory modules are better.


----------



## ratirt (Nov 19, 2020)

W1zzard said:


> Yeah I'll definitely look more into this when I have time, right now I have 10x SSD for last months, 2x NVIDIA NDA custom design for tomorrow, Radeon custom design soon, new NVIDIA cards next month, 6900 XT next month, 3080 Ti next month. Also reevaluation power testing methodology on GPU, noise testing on GPU, need new CPU test methodology, new GPU test benchmarks, possibly on Zen 3.


I don't wanna hassle you about this, just pointing out that maybe this is a good approach for testing. I know you are busy, and I really appreciate your work.
As a token of appreciation, I have a wizz pick in my signature 



Woomack said:


> I actually tested that recently and at 1080p there is up to ~3-5% difference when games already run at 100FPS+. At 1440p+ there is barely any gain, typically in the error margin range. I simply can't confirm what some others spread around the web.
> 
> So yes, it helps a bit but it's not really a performance boost. Results at 2x dual-rank modules and 4x single-rank are about the same. Results at 4x single-rank modules are usually a bit better than 4x dual-rank (a matter of memory timings and what motherboard sets under auto). There is also a matter of motherboard and its topology. Simply many variables so can't always say that 4 memory modules are better.


There are also other reviewers who have tested it. Maybe you should look into that as well.


----------



## Super XP (Nov 19, 2020)

Sandbo said:


> Tbh, that's a difference way too small to even be mentioned, or am I missing something?


SAM isn't fully optimized, in my opinion; I think over time it will end up giving better results.


----------



## W1zzard (Nov 19, 2020)

Super XP said:


> SAM isn't fully optimized, in my opinion; I think over time it will end up giving better results.


What is there to optimize? It's just one chunk of memory in address space now
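For the curious, the "chunk of memory in address space" is sized through the standard PCI BAR handshake: firmware writes all-1s to the BAR register, reads it back, and the bits the device leaves clear encode the aperture's size and alignment. A toy model of just that arithmetic (not AMD's driver code; real code does this through PCI config space):

```python
# Model of PCI BAR sizing: after software writes 0xFFFFFFFF to a memory
# BAR and reads it back, the device keeps the low "info" bits (type,
# prefetchable) and clears the address bits below its natural alignment.
# The region size falls out as two's complement of the masked readback.

MEM_BAR_INFO_MASK = 0xF  # low bits encode type/prefetch, not address

def bar_size_from_readback(readback: int, width: int = 32) -> int:
    """Size in bytes implied by a post-all-1s-write BAR readback."""
    mask = readback & ~MEM_BAR_INFO_MASK
    return (~mask + 1) & ((1 << width) - 1)

# A 256 MiB BAR reads back as 0xF0000000 after the all-1s write:
print(bar_size_from_readback(0xF0000000) // (1 << 20), "MiB")  # 256 MiB
```

Resizable BAR adds a capability that lets the OS pick a larger supported size before this mapping is finalized; after that, as W1zzard says, it's just one flat region with nothing left to tune.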


----------



## HTC (Nov 19, 2020)

Apparently, the biggest gains this technology brings are with the 1% lows and 0.1% lows: not so much the average FPS.


----------



## THU31 (Nov 19, 2020)

Just a gimmick for now. Maybe it will have more of an impact when devs start optimizing for it, but right now you will not notice the difference, whether positive or negative.


----------



## Sora (Nov 20, 2020)

> you'll have to reinstall Windows on a GPT partition. There's also a conversion mechanism between MBR and GPT, but I haven't tested that.



Use MBR2GPT, included with Windows, to convert the partition table and install the UEFI boot loader; no need to reinstall. The only limitations are that you can't have more than 3 partitions on the disk and it can't be a dynamic disk.
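For reference, the conversion Sora describes comes down to two documented invocations, run from an elevated command prompt (the `/allowFullOS` flag is needed when running from inside Windows rather than from WinPE):

```shell
# Validate that the system disk meets MBR2GPT's requirements first.
mbr2gpt /validate /allowFullOS

# If validation passes, perform the actual MBR -> GPT conversion.
mbr2gpt /convert /allowFullOS
```

After converting, switch the firmware from CSM/legacy boot to UEFI before rebooting, or the system won't start.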


----------



## AsRock (Nov 20, 2020)

W1zzard said:


> It is way bigger than that, and what makes it interesting, it's free and available on all platforms. Of course in a subjective scenario without FPS counter you'd never notice the difference, but all the little gains add up eventually



It's free? Don't you need to buy an AMD video card, mobo, and CPU? I'd hardly call it free, but yes, a somewhat interesting feature.

I think I will skip the FREE feature for now.


----------



## olstyle (Nov 20, 2020)

kruk said:


> It isn't proprietary, stop making things up: https://www.pcgamer.com/amd-smart-access-memory-not-proprietary-promise/


AMD only said this now BECAUSE people were hating on them for restricting it to a Zen 3 + PCIe 4 platform. Please don't confuse effect and cause.
@SAM/Resizable BAR: so activate it if you can, and don't be too disappointed if you can't. No-brainer, really.


----------



## W1zzard (Nov 20, 2020)

Sora said:


> Use MBR2GPT included with windows to convert the partition table and install the UEFI boot loader, no need to reinstall - only limitation is you can't have more than 3 partitions on the disk and it can't be dynamic.


This won't always work; I've tried, and there are a few more gotchas, but yeah. Definitely try that first.



AsRock said:


> It's free ?, don't you need to buy a AMD V card, mobo and CPU.  I hardly call it free but yes a some what interesting feature.
> 
> I think i will skip the FREE feature for now .


No, it will (eventually) work on EVERY chipset and EVERY graphics card (that's PCIe). This is a standard PCI feature.


----------



## bug (Nov 20, 2020)

AsRock said:


> It's free ?, don't you need to buy a AMD V card, mobo and CPU.  I hardly call it free but yes a some what interesting feature.
> 
> I think i will skip the FREE feature for now .


As a rule of thumb, free hardware features require you to buy hardware. Bummer, I know.

This one is free as in it doesn't incur additional costs (e.g. adding a G-Sync module to the mix).


----------



## kruk (Nov 20, 2020)

olstyle said:


> AMD only said this now BECAUSE people were hating on them for restricting it to a Zen 3 + PCIe 4 platform. Please don't confuse effect and cause.



Please provide proof that Smart Access Memory was meant to be proprietary. They have limited resources and not investing time and cash into supporting (and testing) all possible combinations of hardware makes sense to me.


----------



## kapone32 (Nov 20, 2020)

Just imagine for a moment: what if they release new TR4 chips that have Smart Access but also enough PCIe lanes to run up to 11 PCIe 4 NVMe drives in RAID 0?


kruk said:


> Please provide proof that Smart Access Memory was meant to be proprietary. They have limited resources and not investing time and cash into supporting (and testing) all possible combinations of hardware makes sense to me.



Nvidia claimed that this is an API feature that they are working on. Just because AMD is first doesn't mean it's going to stay that way. This feature is available in all games, no? It's like AMD's ray tracing, which will natively work with any game that has ray tracing enabled, vs. Nvidia, who not only have PhysX and Hairworks, which they have been very selfish with, but also their own flavor of ray tracing. I notice that the community (online) generally praises DLSS and RTX but laments anything AMD does, like the many Nvidia fans who are crying foul in this forum. I wonder what anyone will have to say when the 6900 XT reviews land?


----------



## W1zzard (Nov 20, 2020)

kapone32 said:


> This feature is available on all Games no?


Correct, but it's more like it's irrelevant for games. It only changes the interaction between graphics driver and hardware, it's completely transparent and invisible to the rest of the system


----------



## kapone32 (Nov 20, 2020)

W1zzard said:


> Correct, but it's more like it's irrelevant for games. It only changes the interaction between graphics driver and hardware, it's completely transparent and invisible to the rest of the system


I would love to see how this will be with CPU intensive Games like TWWH or Civ6.


----------



## AsRock (Nov 20, 2020)

W1zzard said:


> This won't always work, I've tried, there's a few more gotchas, but yeah. Definitely try that first.
> 
> 
> No, it will (eventually) work on EVERY chipset and EVERY graphics card (that's PCI-E). This is a standard PCI feature



Did AMD say they were going to? I guess they will at some point, but by not saying so, or at least not making it known that they plan to, they've only made themselves look a bit of an ass, with no help from Nvidia piping in that they are making it as widespread as possible.

So no, for now it's not a FREE feature, imo.


----------



## W1zzard (Nov 20, 2020)

AsRock said:


> Did AMD say they were going too


Not aware that they made any promises. You guys just need to make enough noise, just like with Zen 3 support on older boards and they'll say "ah fuck this, there's too much drama, *point at random engineers* solve this problem"


----------



## dragontamer5788 (Nov 20, 2020)

W1zzard said:


> Why is everybody suddenly talking about this? It's been known for a long time, but just isn't practical for most?



You're not keeping up with the Youtube generation, W1zzard!

*(embedded YouTube video)*

That video started this whole "test 4 sticks" thingy that you're seeing a lot of now.


----------



## igralec84 (Nov 21, 2020)

Funny, I got a 5600X 8 days after launch; went to a local computer parts store and it was in stock at a not-too-marked-up price (now it's already 35€ more, but still in stock). But getting a 6800/XT, judging by my two attempts to buy directly from AMD at that sweet MSRP on Wednesday and yesterday, might be a problem.


----------



## jesdals (Dec 15, 2020)

I haven't been able to activate it in the BIOS without losing all my drives (3x NVMe).



Sora said:


> Use MBR2GPT included with windows to convert the partition table and install the UEFI boot loader, no need to reinstall - only limitation is you can't have more than 3 partitions on the disk and it can't be dynamic.



What is the UEFI boot loader?
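The "UEFI boot loader" Sora mentions is the Windows Boot Manager files that MBR2GPT places on the EFI System Partition it creates during conversion. For anyone trying the suggestion, the procedure is roughly as follows — a sketch, not a guarantee: disk 0 as the boot disk is an assumption (check yours with `diskpart`'s `list disk`), and you should back up first.

```shell
:: Run from an elevated Command Prompt on Windows 10.
:: Validate first; /allowFullOS lets the tool run from the booted OS
:: instead of requiring WinPE. Disk 0 is assumed to be the boot disk.
mbr2gpt /validate /disk:0 /allowFullOS

:: If validation passes, convert MBR -> GPT in place.
mbr2gpt /convert /disk:0 /allowFullOS
```

After converting, switch the board from CSM/Legacy to UEFI boot in the BIOS, or the system won't find the new boot loader.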


----------



## jesdals (Dec 17, 2020)

I had to do the conversion above to make it work in the BIOS, but how does one see in Windows if it's enabled?


----------



## jesdals (Dec 18, 2020)

W1zzard, did you test if SAM works on other drives, e.g. a secondary NVMe?


----------



## W1zzard (Dec 18, 2020)

jesdals said:


> W1zzard, did you test if SAM works on other drives, e.g. a secondary NVMe?


SAM has nothing to do with storage. Or am I misunderstanding your question?


----------



## jesdals (Dec 18, 2020)

W1zzard said:


> SAM has nothing to do with storage. Or am I misunderstanding your question?


I just wondered, since it needs the special boot partition to show the drives. Before I did the conversion of my boot disk, the NVMe drives were not accessible. But it might not be an issue then.


----------



## W1zzard (Dec 18, 2020)

SAM requires the system to boot in UEFI mode, i.e. not CSM. This only works when you have a UEFI boot partition in your system.
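A quick, generic way to confirm which mode you actually booted in (not SAM-specific): on Windows, `msinfo32` shows "BIOS Mode" as either UEFI or Legacy. On Linux, the firmware leaves a sysfs directory behind when the kernel was started via UEFI:

```shell
#!/bin/sh
# /sys/firmware/efi only exists when the kernel was booted via UEFI;
# under CSM/legacy BIOS boot the directory is absent.
if [ -d /sys/firmware/efi ]; then
    echo "Booted in UEFI mode"
else
    echo "Booted in legacy BIOS/CSM mode"
fi
```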


----------



## jesdals (Dec 19, 2020)

OMG, this thing is buggy on the Aorus Master X570. I did some memory changes in the BIOS and suddenly I was back to not being able to see my NVMe drives in the boot menu. It took several resets before it would boot; for a minute there I thought I would have to convert the C: drive again.


----------



## HD64G (Feb 12, 2021)

After MSI published the BIOS for the Tomahawk B450 Max in December enabling the SAM feature, I thought I should test it, even though I have a 2600X with an RX 5700. Today I did so, with the results below:

Rome2TW SAM on: 175FPS maximum, 96FPS average, 65FPS minimum
Rome2TW SAM off: 149FPS maximum, 94FPS average, 64FPS minimum

DirtRally SAM on: 178FPS maximum, 150FPS average, 110FPS minimum
DirtRally SAM off: 175FPS maximum, 142FPS average, 89FPS minimum

Games such as Ghost Recon Wildlands and AC Odyssey (Vulkan-ized through DXVK) didn't show any performance differences.

Another change from enabling this feature is that the loading screens when getting into gameplay take significantly less time.
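For context, these figures work out to the following relative uplifts — just arithmetic on the numbers quoted above:

```python
# FPS figures from the post above: (max, avg, min), SAM on vs. off.
results = {
    "Rome2TW":   {"on": (175, 96, 65),   "off": (149, 94, 64)},
    "DirtRally": {"on": (178, 150, 110), "off": (175, 142, 89)},
}

for game, r in results.items():
    for label, on, off in zip(("max", "avg", "min"), r["on"], r["off"]):
        uplift = (on / off - 1) * 100  # percentage gain with SAM on
        print(f"{game} {label}: {uplift:+.1f}%")
```

The averages barely move (about +2% in Rome2TW, +6% in Dirt Rally), but the maximums and especially the Dirt Rally minimum (+23.6%) shift noticeably, which fits the review's overall "small but free" verdict.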


----------



## Space Lynx (Feb 12, 2021)

HD64G said:


> After MSI published the BIOS for the Tomahawk B450 Max in December enabling the SAM feature, I thought I should test it, even though I have a 2600X with an RX 5700. Today I did so, with the results below:
> 
> Rome2TW SAM on: 175FPS maximum, 96FPS average, 65FPS minimum
> Rome2TW SAM off: 149FPS maximum, 94FPS average, 64FPS minimum
> ...



I have had SAM on since December. No issues, but I didn't do any before-and-after testing. ^^


----------

