
AMD Brings Smart Access Memory (Resizable BAR) Support to Ryzen 3000 Series

Hmm, I'd say fluctuations in the 150-165 fps range are about perfect to my eyes. When I get 103-120 fps fluctuations in the Shadowlands cities it ruins my immersion a bit; it's not horrible, don't get me wrong, but it's not as smooth looking. I have seen a 240Hz monitor btw, and I actually didn't like 240Hz gaming, it almost feels like a soap opera. The 150-180Hz range is the ultimate sweet spot I think, and 140-190 is probably my perfect target area, no higher, no lower. I'm hoping to someday upgrade my 1080p to a 27" 1440p ~190Hz panel... it might be coming soon; I know Asus has a 180Hz one coming, so maybe I will look into that one.
Got it, I did not realize you meant 1080p. Once you go to 1440p, 1080p will look like 720p does today.
 

I have seen 1440p; I used to own a 27" 1440p QNIX monitor but sold it to someone, and I honestly regret selling it. BUT this is a really high-quality 1080p: 12-bit IPS, 0.5 ms, lots of new tech in it. Got it at launch from Best Buy for $169 with free shipping. It's the Acer version of those new 23.8" IPS panels, and it looks gorgeous, way better than most 1080p. But yes, I know what you mean as far as clarity goes.
 
The QNIX 2710? That monitor was truly epic! A really good 1080P IPS does look sweet too though.
 

Yep (it might have been the X-Star brand, I can't remember; they were the same panel though). Plus keep in mind I will be taking an fps hit at 1440p, and I have a target fps range I intend to hit. For example, AC Valhalla I will play on my 1080p IPS even when I do have 1440p again, so I can turn down settings a little and hit at least the 130 fps range or so.

But games that can hit 130+ at 1440p, and there are a lot, I will play at 1440p. Plus it will be nice having two monitors, so it's a win-win. :)
 

You got it. I love gaming on my 1440p monitor, but visuals are sweet on my 4K monitor, so I use that to watch videos.
 

Yep, and I intend to get a nice 4K OLED TV someday (probably a couple of years from now, hopefully alongside an updated PS5 model whose VRMs don't reach 95 °C...), and games on PC that are capped to 30 or 60 fps I will play on the 4K TV. So I am going to have the trifecta of displays going on, so to speak. For now I am holding off on the PS5 until they fix the controller drift and the 95 °C VRMs.
 
No, it looks like 3000 series only, which is reasonable enough. Two generations of support seems fair.

I've seen results of ReBAR on Zen 1; there is no missing capability on the older processors, it's an artificial limitation.

Nvidia is only enabling it on a per-game basis as well, so it's going to have a pretty limited effect regardless.

Feel free to try it on more games.

[Attached screenshot: Screenshot2021022602.png]


SAM does not work across the board; some titles see negative results. It's blocked at the driver level and enabled on games once they've been confirmed to get a positive result.

"SAM" is all or nothing, and that All implies the downsides and a reboot required to turn it off when performance is negatively affected.

Nvidia isn't doing "SAM", they went one better and tied the capability into their profile system so that games not whitelisted use traditional 256MB uploads.

Another unfortunate situation where AMD is first to something, but their implementation is half-arsed and done better by the competition.
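
Anyway, if anyone wants to check what the BAR situation actually looks like on their own card, one rough way is to query the GPU's memory heaps through Vulkan: with Resizable BAR / SAM active there is normally a heap that is both device-local and host-visible covering most of the VRAM, while without it that CPU-visible window stays at the traditional 256 MB. Minimal sketch below, assuming you have the Vulkan SDK and headers installed; the 256 MB threshold and the "ReBAR on/off" reading of it are just my interpretation, not anything official from either vendor.

Code:
/* Rough sketch: print each GPU's largest memory heap that is both
 * DEVICE_LOCAL and HOST_VISIBLE. Much bigger than 256 MB usually means
 * Resizable BAR / SAM is active; 256 MB is the traditional BAR window. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app = {0};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ici = {0};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance inst;
    if (vkCreateInstance(&ici, NULL, &inst) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(inst, &count, NULL);
    VkPhysicalDevice gpus[8];
    if (count > 8) count = 8;
    vkEnumeratePhysicalDevices(inst, &count, gpus);

    for (uint32_t i = 0; i < count; ++i) {
        VkPhysicalDeviceMemoryProperties mp;
        vkGetPhysicalDeviceMemoryProperties(gpus[i], &mp);

        VkDeviceSize cpuVisibleVram = 0;
        for (uint32_t t = 0; t < mp.memoryTypeCount; ++t) {
            VkMemoryPropertyFlags f = mp.memoryTypes[t].propertyFlags;
            if ((f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
                (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT)) {
                VkDeviceSize heap = mp.memoryHeaps[mp.memoryTypes[t].heapIndex].size;
                if (heap > cpuVisibleVram)
                    cpuVisibleVram = heap;
            }
        }
        printf("GPU %u: CPU-visible VRAM heap = %llu MB (%s)\n",
               i, (unsigned long long)(cpuVisibleVram >> 20),
               cpuVisibleVram > ((VkDeviceSize)256 << 20)
                   ? "looks like ReBAR/SAM is on"
                   : "looks like the plain 256 MB window");
    }

    vkDestroyInstance(inst, NULL);
    return 0;
}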
 
I have the Sony XBR65X900H in my Amazon cart. It is supposed to be a housewarming gift for my sister. It literally dropped $100 overnight to $1598. I know it's not OLED, but it is better than any TV I currently own. I might make it two, but that would kill my budget.
 


Costco has the 55" LG CX OLED flagship for $1349 with extra warranty included free. i'd rather move my couch up a little bit and get the 55" over 65". OLED is just too lovely imo
 
I should have quoted CAD. That Sony TV would be $1262 US. That is still a sweet deal though.
 
Well, not entirely. If they put a window somewhere, the sun is out, and no light shines through the window... I mean, yeah, unless it's really meant to be designed that way, with your main character going nuts, or it being horror or something, sure... but otherwise it's simply just wrong.

No, that'd just be an oversight from the developers. Forgetting to place a light source would have the same result if they were using RT as well. Of course it's wrong; it'd be wrong no matter what lighting technology you are using. I don't get the point of your comment; it's like you are implying that an accident on the devs' end somehow makes rasterization wrong. Makes no sense.
 

Well, ok, let's take a step back: games in general are meant to look realistic, right? Even in a fantasy game like Zelda, the sun emits light, and when that light is blocked by, say, a mountain, it should cast a shadow, etc. We get this.
RT makes that lighting realistic, at the cost of performance, yes, but it's correct and realistic.
Rasterization is faked lighting (so to speak) and thus can actually be unrealistic; we accept how it looks in general, but that does not make it correct.

My original question was: can't the devs do an RT pass before the lighting has to be done, just to see how it should look if it were realistic, then use that as a reference when doing the faked lighting in rasterization, so it looks as realistic as RT would without the performance penalty?

(Now, obviously I get that RT lighting is also dynamic and you lose that, sure, but there is plenty of static lighting that is revealed to be more realistic with RT and seemingly could easily be faked. In Cyberpunk there is a bench with a light above it, used in Digital Foundry's video to illustrate RT on and off, that shows what I'm talking about.)
 

Zelda's lighting is certainly not designed to be realistic. Basic things like the sun emitting light and objects casting shadows are not evidence to the contrary.

You are mistaken in your assumption that rasterization effects can't be accurate as well, especially given that many modern game engines are adding lighting features that exceed what is currently possible with RT on modern video cards, at a fraction of the processing budget. Go and look at the Unreal Engine 5 demo.

"My original question was: cant the devs do an RT pass before the lighting has to be done, just to see how it should look if it was realistic, then use that as a reference when doing the faked lighting in rasterization so it looks as realistic as RT would, without the performance penalty?"

In order to match the RT version, though, your rasterized lighting would have to support indirect lighting. And if it does, there's really no point in bothering with an RT pre-pass, as your rasterized lighting is already as good as your RT lighting. If you are using CryEngine, Unreal Engine, or Unity, you can already get this with rasterized lighting.
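
To make the "RT pass ahead of time" idea concrete, that is essentially what lightmap baking has done for years: cast the expensive rays once, offline, store the result, and at runtime only do a cheap lookup. Below is a toy sketch of the concept; the one-wall scene, light position, grid size, and falloff are all invented for illustration, and a real baker traces many rays per texel and stores indirect light too. The trade-off is exactly the one mentioned above: the baked result is frozen, so anything dynamic still needs another solution.

Code:
/* Toy illustration of "bake expensive lighting once, look it up at runtime".
 * The scene (one wall occluder), light position and grid size are invented
 * purely for this example. */
#include <stdio.h>

#define W 16
#define H 16

static float lightmap[H][W];            /* the baked result shipped with the game */

/* Invented visibility test: points behind our single "wall" never see the light. */
static int light_visible(float x, float y)
{
    return !(x > 0.5f && y < 0.4f);
}

/* Offline step: the expensive pass. A real baker would trace many rays per texel;
 * here one visibility check plus a distance falloff stands in for it. */
static void bake(void)
{
    const float lx = 0.1f, ly = 0.9f;   /* invented light position in [0,1]^2 */
    for (int j = 0; j < H; ++j) {
        for (int i = 0; i < W; ++i) {
            float x = (i + 0.5f) / W;
            float y = (j + 0.5f) / H;
            float d2 = (x - lx) * (x - lx) + (y - ly) * (y - ly);
            float direct = light_visible(x, y) ? 1.0f / (1.0f + 4.0f * d2) : 0.0f;
            lightmap[j][i] = direct + 0.05f;   /* flat ambient term as a fake bounce */
        }
    }
}

/* Runtime step: no rays at all, just a lookup (a texture fetch in a real engine). */
static float shade(float x, float y)
{
    return lightmap[(int)(y * H)][(int)(x * W)];
}

int main(void)
{
    bake();
    printf("open area: %.2f   behind the wall: %.2f\n",
           shade(0.2f, 0.8f), shade(0.7f, 0.2f));
    return 0;
}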
 

Rasterization cannot be as accurate without incurring a significant penalty on the CPU to simulate dynamic shadows and lighting without restraint.
 