
So what does the Ada release by Nvidia mean to you

Has the Ada release by Nvidia been good, in your opinion?


  • Total voters: 101

ixi

Joined
Aug 19, 2014
Messages
1,451 (0.38/day)
Yeah DLSS is so bad -> https://www.techpowerup.com/review/starfield-dlss-community-patch/

"The default anti-aliasing method in Starfield is TAA (Temporal Anti-Aliasing), which results in a very blurry image at all resolutions, including 4K. It also fails to render small object details like thin steel structures, power lines, transparent materials, tree leaves, and vegetation well. Additionally, there are noticeable shimmering issues across the entire image, even when you're not moving, especially at lower resolutions like 1080p."

"The official implementation of FidelityFX Super Resolution (FSR) in Starfield addresses most of these problems but still exhibits excessive shimmering on thin steel objects, transparent materials, tree leaves, and vegetation. Enabling DLSS resolves these shimmering problems and ensures a stable image quality, even at lower resolutions like 1080p."

"The community patch also supports DLAA (Deep Learning Anti-Aliasing), which takes image quality a step further, surpassing the quality of native in-game TAA, FSR, or DLSS. Enabling DLSS/DLAA also enhances the quality of in-game particle effects, providing a more comprehensive and stable appearance, particularly during motion, compared to FSR."

Also, AMD GPUs don't even render the sun, as fevgatos stated. Even though the game is AMD-sponsored and optimized for AMD CPUs and GPUs (which is why DLSS support is absent: AMD hates it when FSR is compared with DLSS, because FSR always loses).

The game simply looks best on Nvidia, especially with the DLSS/DLAA mod, easily beating native-resolution visuals.

I bet you don't even use an Nvidia GPU :laugh:

Thanks for the long sentences. AMD GPUs not rendering the sun is indeed funny. Is it possible to fly into the sun and go kaboom while you can't see it? :D

Haven't played Starfield. Well, if the devs shipped native res with that blurriness, I guess that's how they want you to see the game :D. With DLSS/FSR you're changing how the devs want you to see the game, but maybe there's another reason: the devs don't know how to make an optimized game, and that's why they lower image quality by adding blurriness. That's sad.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Thanks for the long sentences. AMD GPUs not rendering the sun is indeed funny. Is it possible to fly into the sun and go kaboom while you can't see it? :D

Haven't played Starfield. Well, if the devs shipped native res with that blurriness, I guess that's how they want you to see the game :D. With DLSS/FSR you're changing how the devs want you to see the game, but maybe there's another reason: the devs don't know how to make an optimized game, and that's why they lower image quality by adding blurriness. That's sad.

No, it's not. Starfield has FSR2 enabled by default in all presets.

DLSS/FSR is used in all new games, and most devs even have it enabled by default now, like in Starfield and Remnant 2.

TAA is crap compared to DLAA anyway: tons of shimmering and jaggies while inducing blur most of the time. DLAA is crisp and sharp. An insanely good AA solution, best in class.

People who claim native is always better don't have access to DLSS/DLAA, for sure. It makes the image look cleaner, especially when you tweak sharpening. FSR is hit or miss, mostly miss. DLSS/DLAA easily beats FSR/TAA.
 
Joined
Aug 20, 2007
Messages
21,615 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon, Phanteks and Corsair Maglev blower fans...
Memory 64GB (2x 32GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Sure, but what about scrapping the Tensor cores, and using the now freed up die real estate for more traditional shaders and/or RT cores? Then we wouldn't need DLSS in the first place, and could have decent enough performance for native resolutions.

DLSS is like buying a car with square wheels, then redesigning the world's road infrastructure to fit cars with square wheels - a solution to a problem that shouldn't even exist.
Welcome to being a second class citizen to machine learning: be glad they bothered to think of a gaming use for the tensor cores at all. They certainly did not have to.
 
Joined
Sep 17, 2014
Messages
22,859 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
And even now, AMD is doing great with GPU performance... but it isn't all sunshine and lollipops with them, even for just daily stuff.
I disagree, my PC is shitting rainbows & unicorns every day
Might be my daughter tho
 
Joined
Aug 20, 2007
Messages
21,615 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon, Phanteks and Corsair Maglev blower fans...
Memory 64GB (2x 32GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Why, little voicey... because they used half as many memory modules, so more of that power budget went to the GPU and not the VRAM.
The difference on the Ti is that each memory chip has twice the capacity, so only half as many are required, which means no more memory chips on the back side of the card.
During the ETH mining boom, my 3090 would hit its 375 W limit with the GPU clocked at 1.4 GHz, far below the 2.1 GHz it can boost to when the RAM is hardly in use.
Undervolting the GPU let me reach 1.6 GHz at 370 W, so it's easy to imagine the 3090 Ti's higher-capacity VRAM would have let it clock even higher with no other changes.
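The arithmetic behind that point can be sketched roughly. The per-chip wattage and chip counts below are illustrative assumptions for the sake of the example, not measured or official figures:

```python
# Rough power-budget sketch: how halving the GDDR6X chip count frees board
# power for the GPU core. Per-chip draw is an assumed ballpark, not a spec.

BOARD_LIMIT_W = 375        # 3090 reference board power limit
GDDR6X_CHIP_W = 3.0        # assumed per-module draw under heavy memory load

def gpu_core_budget(board_limit_w: float, chip_count: int,
                    per_chip_w: float) -> float:
    """Power left for the GPU core after the VRAM takes its share."""
    return board_limit_w - chip_count * per_chip_w

# 3090-style board: 24 x 1 GB modules (half of them on the back side)
core_24_chips = gpu_core_budget(BOARD_LIMIT_W, 24, GDDR6X_CHIP_W)
# 3090 Ti-style board: 12 x 2 GB modules, same capacity, half the chips
core_12_chips = gpu_core_budget(BOARD_LIMIT_W, 12, GDDR6X_CHIP_W)

print(core_24_chips, core_12_chips)  # 303.0 339.0
```

Under these assumed numbers the denser modules free about 36 W for the core, which points in the same direction as the undervolting result above.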
Mussels you are telling me your "Ti" sticker on your 3090 didn't work? Shame.
 
Joined
Sep 17, 2014
Messages
22,859 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Peeps complain about Ada pricing but things were so bad w/ampere and the cryptocurrency mining shortage that even the pricing for the 4090 looks cheap by comparison.
Sure, but then you could get a return on investment when crypto priced GPUs higher. You'd mine away the difference.

It gives you moar frames when or if you need them.

That's merely fanboi talk, which has nothing whatsoever to do with the technology. I didn't ask what you think about that horrid Green Monster's money-grabbing; I asked why DLSS is useless. Not the same thing.

Indeed it is, just as long as that particular aspect is all you care about.

Again, that has nothing whatsoever to do with the technology in itself. All extraneous factors excepted, is DLSS a good thing or a bad thing?
DLSS isn't a good thing, because it's proprietary and Nvidia actively uses it as a tool to sell you a new GPU.
AMD is not pulling that trick with FSR. I ran FSR on my GTX 1080, and it allowed me to run Cyberpunk quite well too.

These are not external factors. They're part of the package deal with these technologies. I don't like being on Nvidia's agenda with regard to 'something great' that is indeed very useful. They already pulled their first customer squeeze with DLSS3, and you can wait for more; it has been Nvidia's MO for decades now. Most of the time, though, this fails and slowly dies off, or the technology isn't quite as groundbreaking as it seems, as G-Sync now shows versus FreeSync, or the way PhysX is barely there with GPU acceleration anymore, even on Nvidia. When it stops making them money, they stop giving it money. Common sense. Why wouldn't they? They already own the market. The reason they push DLSS3 so hard now is that they know they have to sell Ada, and without the DLSS technology constantly ramping up, the offer is absolutely horrible. They know gamers like DLSS. They know they can tie gamers to their brand with DLSS. And fools & money shall be parted because of DLSS. And this is also why it's useless: it relies fully on Nvidia's game plan going forward.

I think the definition of 'free frames' is not quite right :) DLSS is G-frames. FSR is free frames.
So I conclude DLSS is indeed a bad thing. It's great we have the tech; now unify it.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Sure, but then you could get a return on investment when crypto priced GPUs higher. You'd mine away the difference.


DLSS isn't a good thing, because it's proprietary and Nvidia actively uses it as a tool to sell you a new GPU.
AMD is not pulling that trick with FSR. I ran FSR on my GTX 1080, and it allowed me to run Cyberpunk quite well too.

These are not external factors. They're part of the package deal with these technologies. I don't like being on Nvidia's agenda with regard to 'something great' that is indeed very useful. They already pulled their first customer squeeze with DLSS3, and you can wait for more; it has been Nvidia's MO for decades now. Most of the time, though, this fails and slowly dies off, or the technology isn't quite as groundbreaking as it seems, as G-Sync now shows versus FreeSync, or the way PhysX is barely there with GPU acceleration anymore, even on Nvidia. When it stops making them money, they stop giving it money. Common sense. Why wouldn't they? They already own the market. The reason they push DLSS3 so hard now is that they know they have to sell Ada, and without the DLSS technology constantly ramping up, the offer is absolutely horrible. They know gamers like DLSS. They know they can tie gamers to their brand with DLSS. And fools & money shall be parted because of DLSS. And this is also why it's useless: it relies fully on Nvidia's game plan going forward.

I think the definition of 'free frames' is not quite right :) DLSS is G-frames. FSR is free frames.
So I conclude DLSS is indeed a bad thing. It's great we have the tech; now unify it.
There must be a reason DLSS is better than FSR, though, right? What the reason is, I don't know. It might be that Nvidia spent more money on R&D, it might be because it only works on cards with Tensor cores, it might be that their software department is more competent than AMD's. Whatever the reason, asking them to hand out something they spent money on and are better at to their competitors doesn't sound like a great business plan.

Nvidia already spent money to unify it; it's called Nvidia Streamline. Which begs the question of why AMD-sponsored games don't use it...
 
Joined
Sep 17, 2014
Messages
22,859 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
There must be a reason DLSS is better than FSR, though, right? What the reason is, I don't know. It might be that Nvidia spent more money on R&D, it might be because it only works on cards with Tensor cores, it might be that their software department is more competent than AMD's. Whatever the reason, asking them to hand out something they spent money on and are better at to their competitors doesn't sound like a great business plan.
Sure, but one thing does not invalidate the other. I just think Nvidia pours a lot more money into it, as they historically do on everything gaming, simply because they can justify it with their larger market share.

Also, it's not all negative; some GameWorks technologies also made it into the wild for everyone. So we know they can, and might eventually, share a thing or two. Tensor cores only? Don't make me laugh. That's marketing with a technology barrier 'because I said so'. Remember the initial story about DLSS? 'We're training your GPU.' 'We're using AI to create DLSS.' But now a modder can effectuate it with a .dll. And today they're selling the story that DLSS3 can only happen on Ada, while AMD is allegedly incorporating similar tech everywhere, even prior to RDNA. Without AI.

I don't like being lied to continuously to further a company's bottom line ;)
Also take a look at XeSS, which seems far more versatile and usable than DLSS. Two out of three companies manage to create a virtually identical technology that isn't tied to hardware.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Sure, but one thing does not invalidate the other. I just think Nvidia pours a lot more money into it, as they historically do on everything gaming, simply because they can justify it with their larger market share.

Also, it's not all negative; some GameWorks technologies also made it into the wild for everyone. So we know they can, and might eventually, share a thing or two. Tensor cores only? Don't make me laugh. That's marketing with a technology barrier 'because I said so'. Remember the initial story about DLSS? 'We're training your GPU.' Now they can effectuate it with a .dll.

I don't like being lied to continuously to further a company's bottom line ;)
Well, it doesn't really matter. You're complaining that FG only made it to Ada. Well, you realize AMD's FG hasn't made it to ANY card? At all? How the heck is that better? On top of that, dunno if you heard, Hypr-RX FG (the one that works in all games through drivers) is going to be RX 7000+ exclusive. It won't work on older cards either.

Personally I'm glad both FSR and DLSS exist. FSR is great on smaller screens (like laptops) because you don't really notice its shortcomings, but I wouldn't get a GPU without DLSS on a desktop. It's a deal breaker. I don't care THAT much about FG, but no DLSS = no buy until FSR is just as good.

Btw, Nvidia has Streamline, an open-source solution allowing developers to implement all three upscaling methods supposedly pretty easily. Now why don't AMD-sponsored games use it... yeah, we all know the answer to that.
 
Joined
Sep 17, 2014
Messages
22,859 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Well, it doesn't really matter. You're complaining that FG only made it to Ada. Well, you realize AMD's FG hasn't made it to ANY card? At all? How the heck is that better? On top of that, dunno if you heard, Hypr-RX FG (the one that works in all games through drivers) is going to be RX 7000+ exclusive. It won't work on older cards either.

Personally I'm glad both FSR and DLSS exist. FSR is great on smaller screens (like laptops) because you don't really notice its shortcomings, but I wouldn't get a GPU without DLSS on a desktop. It's a deal breaker. I don't care THAT much about FG, but no DLSS = no buy until FSR is just as good.

Btw, Nvidia has Streamline, an open-source solution allowing developers to implement all three upscaling methods supposedly pretty easily. Now why don't AMD-sponsored games use it... yeah, we all know the answer to that.

We'll see it when it's there, but most of what you said up here seems to be nonsense; the fact that a catch-all FG solution isn't going to land everywhere isn't the same thing as needing hardware X/Y to use FG. The feature is still there, but you're tied to the per-game implementation with FSR3. Still a whole lot better than a per-hardware-gen requirement. Plus, you're basically confirming what I said: DLSS forces you to Nvidia. Great? Neither FSR3 nor FG will force you to AMD.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%

We'll see it when it's there, but most of what you said up here seems to be nonsense.
Is it now? From the article

"On that note, it bears mentioning that Hypr-RX requires an RDNA 3 GPU, meaning it’s only available for Radeon RX 7000 video cards as well as the Ryzen Mobile 7040HS CPU family"



Nobody is talking about it because it's AMD, as per usual. Hypr-RX will feature Anti-Lag+, a competitor to Reflex. Reflex works on cards as old as Maxwell (2014); Anti-Lag+ will only work on RDNA3.

FMF (Fluid Motion Frames), the FG competitor at the driver level, will also be added to Hypr-RX. You know, as an RDNA 3-only feature.
 
Joined
Sep 17, 2014
Messages
22,859 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Is it now? From the article

"On that note, it bears mentioning that Hypr-RX requires an RDNA 3 GPU, meaning it’s only available for Radeon RX 7000 video cards as well as the Ryzen Mobile 7040HS CPU family"



Nobody is talking about it because it's AMD, as per usual. Hypr-RX will feature Anti-Lag+, a competitor to Reflex. Reflex works on cards as old as Maxwell (2014); Anti-Lag+ will only work on RDNA3.
Low-latency frames aren't new to any card. Max pre-rendered frames set to 1 or 0, yawn.
Don't drown in all the buzzwords, mate.

Hypr-RX is just the fancy catch-all name AMD uses as a vehicle to market RDNA3. But you get every individual feature on prior cards too, or at the very least, there is nothing stopping a dev from incorporating it. It's not quite relevant to the DLSS discussion. FMF isn't tied to Hypr-RX.

You're trying to frame it as if AMD is at fault for trying the thing we all want: driver level FG.
Nvidia isn't even moving there.

Just stahp man, it's sad.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Low-latency frames aren't new to any card. Max pre-rendered frames set to 1 or 0, yawn.
Don't drown in all the buzzwords, mate.
Max pre-rendered frames is not Reflex. If it were, it wouldn't be an RDNA 3 exclusive, would it? And FMF is RDNA 3-only as well.

FMF isn't tied to Hypr-RX.
According to everything we know, yes it is.
You're trying to frame it as if AMD is at fault for trying the thing we all want: driver level FG.
No, I did no such thing. I'm telling you both AMD and Nvidia have features that are not supported by older cards, yet nobody complains when AMD does it, and everyone complains when Nvidia does. I don't complain about either; I couldn't care less. I'm just pointing out the double standard.
 
Joined
Oct 24, 2020
Messages
467 (0.30/day)
Location
Belgium
System Name MSi Coffee Lake
Processor i7-8700k
Motherboard MSI Z370 GAMING PRO CARBON AC
Cooling NZXT something AIO loop
Memory 16GB Kingston HyperX 2133 C14 Fury Black
Video Card(s) TITAN Xp Jedi Order Edition
Storage Samsung 960 Evo NVMe
Display(s) Medion 23'
Case Cooler Master Stryker
Audio Device(s) onboard
Power Supply BeQuiet 600W
Mouse Logitech Trackman T-BB18
Keyboard Generic hp
Software Windows 10
Like in state-machine logic, "don't care" should have been an option here.
 
Joined
Sep 17, 2014
Messages
22,859 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Max pre-rendered frames is not Reflex. If it were, it wouldn't be an RDNA 3 exclusive, would it? And FMF is RDNA 3-only as well.


According to everything we know, yes it is.

No, I did no such thing. I'm telling you both AMD and Nvidia have features that are not supported by older cards, yet nobody complains when AMD does it, and everyone complains when Nvidia does. I don't complain about either; I couldn't care less. I'm just pointing out the double standard.
Maybe I can't read, so please correct me if I'm wrong, but FMF doesn't seem tied to anything.

As for the technical underpinnings, AMD has answered a few questions relating to FSR3/Fluid Motion Frames, but the company is not doing a deep dive on the technology at this time. So there remains a litany of unanswered questions about its implementation.

With regards to compatibility, according to AMD FSR3 will work on any RDNA (1) architecture GPU or newer, or equivalent hardware. RDNA (1) is a DirectX Feature Level 12_1 architecture, which means equivalent hardware spans a pretty wide gamut of hardware, potentially going back as far as NVIDIA’s Maxwell 2 (GTX 900) architecture. That said, I suspect there’s more to compatibility than just DirectX feature levels, but AMD isn’t saying much more about system requirements. What they are saying, at least, is that while it will work on RDNA (1) architecture GPUs, they recommend RDNA 2/RDNA 3 products for the best performance.

Along with targeting a wide range of PC video cards, AMD is also explicitly noting that they’re targeting game consoles as well. So in the future, game developers will be able to integrate FSR3 and have it interpolate frames on the PlayStation 5 and Xbox Series X|S consoles, both of which are based on AMD RDNA 2 architecture GPUs.

Underpinning FSR 3’s operation, in our briefing AMD made it clear that it would require motion vectors, similar to FSR 2’s temporal upscaling, as well as rival NVIDIA’s DLSS 3 interpolation. The use of motion vectors is a big part of why FSR 3 requires per-game integration – and a big part of delivering high quality interpolated frames. Throughout our call “optical flow” did not come up, but, frankly, it’s hard to envision AMD not making use of optical flow fields as well as part of their implementation. In which case they may be relying on D3D12 motion estimation as a generic baseline implementation, since it wouldn’t require accessing each vendor’s proprietary integrated optical flow engine.

What’s not necessary, however, is AI/machine learning hardware. Since AMD is targeting the consoles, they wouldn’t be able to rely on it anyhow.

HYPR RX:
Second up, we have Hypr-RX. AMD’s smorgasbord feature combines the company’s Radeon Super Resolution (spatial upscaling), Radeon Anti-Lag+ (frame queue management), and Radeon Boost (dynamic resolution scaling). All three technologies are already available in AMD’s drivers today, however they can’t all currently be used together. Super Resolution and Boost both touch the rendering resolution of a game, and Anti-Lag steps on the toes of Boost’s dynamic resolution adjustments.
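The motion-vector frame interpolation the quoted article describes can be illustrated with a toy sketch. To be clear, this is not FSR3's or DLSS3's actual algorithm, just a minimal forward-warp of one frame along assumed per-pixel motion vectors:

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray,
                      motion_vectors: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Toy forward-warp: splat prev_frame t of the way along its motion vectors.

    prev_frame:      (H, W) grayscale image
    motion_vectors:  (H, W, 2) per-pixel (dy, dx) displacement to the next frame

    Real interpolators also handle occlusion, holes, and blending with the
    next frame; this sketch leaves uncovered pixels black.
    """
    h, w = prev_frame.shape
    out = np.zeros_like(prev_frame)
    ys, xs = np.mgrid[0:h, 0:w]
    # Destination of each pixel after moving t of the way along its vector.
    ty = np.clip(np.rint(ys + t * motion_vectors[..., 0]).astype(int), 0, h - 1)
    tx = np.clip(np.rint(xs + t * motion_vectors[..., 1]).astype(int), 0, w - 1)
    out[ty, tx] = prev_frame  # forward splat; colliding writes pick one value
    return out

# A single bright pixel moving 2 px to the right lands 1 px over at t = 0.5.
frame = np.zeros((4, 4))
frame[0, 0] = 1.0
vectors = np.zeros((4, 4, 2))
vectors[..., 1] = 2.0  # uniform motion: dx = 2 for every pixel
mid = interpolate_frame(frame, vectors)
print(mid[0, 1])  # 1.0
```

This also makes the article's point concrete: the vectors come from the game engine, which is why FSR3 (like DLSS) needs per-game integration rather than being a pure driver-side effect.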
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Maybe I can't read, please do correct me if wrong, but FMF doesn't seem tied to anything.

As for the technical underpinnings, AMD has answered a few questions relating to FSR3/Fluid Motion Frames, but the company is not doing a deep dive on the technology at this time. So there remains a litany of unanswered questions about its implementation.

With regards to compatibility, according to AMD FSR3 will work on any RDNA (1) architecture GPU or newer, or equivalent hardware. RDNA (1) is a DirectX Feature Level 12_1 architecture, which means equivalent hardware spans a pretty wide gamut of hardware, potentially going back as far as NVIDIA’s Maxwell 2 (GTX 900) architecture. That said, I suspect there’s more to compatibility than just DirectX feature levels, but AMD isn’t saying much more about system requirements. What they are saying, at least, is that while it will work on RDNA (1) architecture GPUs, they recommend RDNA 2/RDNA 3 products for the best performance.

Along with targeting a wide range of PC video cards, AMD is also explicitly noting that they’re targeting game consoles as well. So in the future, game developers will be able to integrate FSR3 and have it interpolate frames on the PlayStation 5 and Xbox Series X|S consoles, both of which are based on AMD RDNA 2 architecture GPUs.

Underpinning FSR 3’s operation, in our briefing AMD made it clear that it would require motion vectors, similar to FSR 2’s temporal upscaling, as well as rival NVIDIA’s DLSS 3 interpolation. The use of motion vectors is a big part of why FSR 3 requires per-game integration – and a big part of delivering high quality interpolated frames. Throughout our call “optical flow” did not come up, but, frankly, it’s hard to envision AMD not making use of optical flow fields as well as part of their implementation. In which case they may be relying on D3D12 motion estimation as a generic baseline implementation, since it wouldn’t require accessing each vendor’s proprietary integrated optical flow engine.


What’s not necessary, however, is AI/machine learning hardware. Since AMD is targeting the consoles, they wouldn’t be able to rely on it anyhow.

HYPR RX:
Second up, we have Hypr-RX. AMD’s smorgasbord feature combines the company’s Radeon Super Resolution (spatial upscaling), Radeon Anti-Lag+ (frame queue management), and Radeon Boost (dynamic resolution scaling). All three technologies are already available in AMD’s drivers today, however they can’t all currently be used together. Super Resolution and Boost both touch the rendering resolution of a game, and Anti-Lag steps on the toes of Boost’s dynamic resolution adjustments.
I think you're confusing FMF with FSR3. FSR3 will indeed work with all RDNA cards, but that's not the driver-level FG; that needs a per-game implementation. FMF is the driver-level one, and it's only for RDNA 3 cards, at least for now.


Scott Herkelman, you know, the AMD Radeon boss, said that for now it's an RDNA 3-only feature:

“…if there is good reception of AFMF and gamers believe it to be worthwhile, we’ll take it to the next step and see if we can enable it on RDNA 2”

HYPR RX:
Second up, we have Hypr-RX. AMD’s smorgasbord feature combines the company’s Radeon Super Resolution (spatial upscaling), Radeon Anti-Lag+ (frame queue management), and Radeon Boost (dynamic resolution scaling). All three technologies are already available in AMD’s drivers today, however they can’t all currently be used together. Super Resolution and Boost both touch the rendering resolution of a game, and Anti-Lag steps on the toes of Boost’s dynamic resolution adjustments.
It's in the drivers, but you can't use it. Anti-Lag+ is RDNA 3-exclusive as well. From their very own website:


"AMD Radeon™ Anti-Lag+ product support is limited to AMD RDNA™ 3 products, including the AMD Radeon™ RX 7000 Series or newer."
 
Joined
Jan 14, 2019
Messages
13,395 (6.11/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Inflation is a big reason the 4000 series was priced higher. AMD's 7900 series was priced high as well, so stop acting like it's only Nvidia. Last-gen cards also still had to be sold. Both Nvidia and AMD had huge stock because of the mining demand and then the mining crash; AMD especially. Which is why they waited a loooong time with the 7700 and 7800 series, still selling 6000-series cards deep into 2023.

People are screaming about prices because they feel the inflation as well, on all fronts. People can't afford new hardware, so they hold on to their dated hardware while talking crap about new stuff, trying to justify waiting. Human nature, nothing new.


The only meh cards this generation are the 4060 series and the 7600 series. These were almost pointless, at least until last-gen cards sell out.
The 4060 series will probably beat the entire Radeon 7000 series on the Steam HW Survey anyway in terms of market share. Why? Because x60 cards always sell like hotcakes, and 9 out of 10 people who are not into hardware won't even consider AMD because of its reputation or a bad experience earlier.
Inflation doesn't explain why the 7800 XT costs roughly the same as the 4060 Ti while being 25% faster (and a way more complex design). This is not inflation. This is Nvidia's price gouging.
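The perf-per-money claim here is simple arithmetic. A quick sanity check, using launch-era MSRPs as illustrative figures (not exact street prices):

```python
# Rough price/performance comparison; prices and relative performance
# are illustrative launch-era figures taken from the discussion.
price_4060ti_16gb = 499   # USD MSRP (assumed for illustration)
price_7800xt = 499        # USD MSRP (assumed for illustration)
perf_4060ti = 100         # normalized baseline
perf_7800xt = 125         # "25% faster", per the post

ppd_4060ti = perf_4060ti / price_4060ti_16gb
ppd_7800xt = perf_7800xt / price_7800xt
advantage = ppd_7800xt / ppd_4060ti - 1
print(f"7800 XT perf-per-dollar advantage: {advantage:.0%}")
```

At equal prices, the perf-per-dollar gap simply equals the performance gap, which is the poster's point.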
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Inflation doesn't explain why the 7800 XT costs roughly the same as the 4060 Ti while being 25% faster (and a way more complex design). This is not inflation. This is Nvidia's price gouging.
I assume you are talking about the 16 GB 4060 Ti, cause the 8 GB 4060 Ti has the same performance per dollar as the 7800 XT.

The 16 GB 4060 Ti is not for gamers; it's very attractive for people who want CUDA and VRAM. 3060 12 GB -> 4060 Ti 16 GB -> 3090 -> 4090 is what these people are buying.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.89/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Are you going to play at 4K with a 4060 or a 4060 Ti? Even if they had a 512-bit bus and 48 GB of VRAM, they would still be kinda terrible for that resolution. I mean, my 4090 struggles in the newest games at 4K, so...


That's pretty insane considering the 4060 Ti is the 2nd-best card in terms of price to performance, only losing to the just-released 7800 XT, and by less than 9%, mind you.
I play at 4K with a GTX 1070 Ti on my TV, and at 4K with a 3090.
You don't sit there and say "but that GPU isn't for THAT resolution, user error!"

You have a 13900K on air; who's going to do that? Let's just disregard anything to do with your PC's performance based on that assumption, as that follows your logic train.
Somehow you bring Starfield, the buggy mess, in and change topics entirely? The game's never been tested properly on PC. It's a shitshow console cash grab.
 
I play at 4K with a GTX 1070Ti on my TV, and at 4K with a 3090
You don't sit there and say "but that GPU isn't for THAT resolution, user error!"

You have a 13900K on air, who's going to do that? Let's just disregard anything to do with your PC's performance based on that assumption, as that follows your logic train.
Somehow you bring starfield, the buggy mess in and change topics entirely? The games never been tested properly on anything PC. It's a shitshow console cash grab

If you can play at 4K with your 1070 Ti, then I see no problem doing so with a 4060 Ti either. It's much, much faster than the 1070 Ti; you will have a way better experience. Enjoy, I guess?
 
Welcome to being a second-class citizen to machine learning: be glad they bothered to think of a gaming use for the tensor cores at all. They certainly did not have to.
5 years ago at Nvidia:
"Look, boss, the new Tensor cores are awesome for machine learning."
"Oh, really!"
"And now they're part of our new architecture all across the board!"
"Cool! But how do we sell this to gamers?"
"Oops... err... lemme think... dunno... don't worry, we'll come up with something."

Thus, DLSS was born.

Is it now? From the article

"On that note, it bears mentioning that Hypr-RX requires an RDNA 3 GPU, meaning it’s only available for Radeon RX 7000 video cards as well as the Ryzen Mobile 7040HS CPU family"



Nobody is talking about it because it's AMD, as per usual. Hypr-RX will feature Anti-Lag+, a competitor to Reflex. Reflex works on cards as far back as Maxwell (2014); Anti-Lag+ will only work on RDNA 3.
No. Anti-Lag (non-plus) is a competitor to Reflex, and it has been with us since the RX 500 days, as far as I know. Anti-Lag+ is just an upgraded version for RDNA 3 in DX12 games.

HYPR-RX is a combination of Radeon Boost, Anti-Lag+ and RSR/FSR, features that can be enabled or disabled separately if you want to. Like others said, don't get hung up on buzzwords.

I assume you are talking about the 16 GB 4060 Ti, cause the 8 GB 4060 Ti has the same performance per dollar as the 7800 XT.

The 16 GB 4060 Ti is not for gamers; it's very attractive for people who want CUDA and VRAM. 3060 12 GB -> 4060 Ti 16 GB -> 3090 -> 4090 is what these people are buying.
Yes, the 16 GB card. Price-to-performance of the 8 GB model might be similar in current games, but paying nearly 400 quid for an 8 GB card anno 2023 with any kind of longevity in mind is an absolute waste.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Inflation doesn't explain why the 7800 XT costs roughly the same as the 4060 Ti while being 25% faster (and a way more complex design). This is not inflation. This is Nvidia's price gouging.
What are you talking about? Here in the EU, the 4070 and 7800 XT are priced similarly; the 7800 XT has slightly more performance (4% at 1440p) but lacks features and has much lower RT performance. The 4060 Ti is way cheaper.

The 4060 Ti had an MSRP of 390 dollars. The 7800 XT is 500 dollars. The 4070 was 599 dollars at launch but has dropped numerous times to 550 and below. Besides, the 4070 launched half a year ago. Obviously AMD will price the card lower, or it would be pointless to even launch it. AMD were late to the party, once again.

The 4060 series is mediocre, just like the Radeon 7600 series. Who cares about them? However, the 4060 will probably outsell the entire Radeon 7000 lineup alone. Watch the Steam HW Survey in a few years...
 
5 years ago at Nvidia:
"Look, boss, the new Tensor cores are awesome for machine learning."
"Oh, really!"
"And now they're part of our new architecture all across the board!"
"Cool! But how do we sell this to gamers?"
"Oops... err... lemme think... dunno... don't worry, we'll come up with something."

Thus, DLSS was born.
And that's a problem because...?
No. Anti-Lag (non-plus) is a competitor to Reflex
No, it is not. Reviewers that tested them say otherwise. Let's stop with the misinformation campaign.



Yes, the 16 GB card. Price-to-performance of the 8 GB model might be similar with current games, but paying nearly 400 quid for an 8 GB card anno 2023 with any kind of longevity in mind is an absolute waste.
And paying 549€ anno 2023 for a card with the RT performance of a three-year-old 500€ 3070 is an absolute waste in my opinion, but as per usual, YMMV. You know, not everyone has the same needs and wants.
 

las

Yes, the 16 GB card. Price-to-performance of the 8 GB model might be similar with current games, but paying nearly 400 quid for an 8 GB card anno 2023 with any kind of longevity in mind is an absolute waste.
Why? 16 GB made no difference, not even in 4K gaming -> https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/31.html
Shows just how irrelevant VRAM is when the GPU is the limiting factor.

A slow GPU won't need more than 8 GB, even in 2023. The 4060 series are considered slow GPUs, meant mostly for 1080p gaming. They can do 1440p in some games on lower presets, but they're not meant to max out demanding games at 1440p at all. Meaning 8 GB is plenty.

The 3070 8 GB very easily beats the 6700 XT 12 GB in 4K gaming, FYI -> https://www.techpowerup.com/review/asus-radeon-rx-7700-xt-tuf/35.html
So where is the magic of more VRAM here? A slow GPU will be slower, and none of those cards are suited for 4K on the high preset anyway.

You ramble about VRAM; meanwhile, most people look at performance numbers. The only games that caused 8 GB cards problems were a few rushed console ports like TLOU and RE4, and they ran fine after a patch or two, even on 8 GB.

PC games won't require a lot more VRAM anytime soon. The PS5 and XSX are out, and the UE5 engine will only require less VRAM over time as devs learn how to optimize it properly.

You won't see a VRAM requirement bump before the PS6 and the next Xbox, or UE6, sometime in 3-5 years. You can act like it all you want, but more VRAM never helped a weak GPU, because you will be forced to lower settings anyway, hence lowering VRAM usage.

The only high-end card I can think of that had VRAM issues very soon after release was the AMD Fury line, you know, the "overclocker's dream" (as stated by Lisa Su) that barely overclocked 1% while power usage skyrocketed. Meanwhile, the 980 Ti overclocked like absolute crazy and destroyed the Fury X completely, while having 2 GB more VRAM. I remember my 980 Ti gaining like 35-40% performance from the 1200 -> 1580 MHz speed bump.
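For what it's worth, the core-clock bump alone works out to roughly 32%, so a remembered 35-40% gain presumably includes a memory overclock or boost behaviour on top. A quick check, using the numbers from the post:

```python
# Core-clock overclock gain on the 980 Ti figures quoted above.
base_mhz = 1200
oc_mhz = 1580
clock_gain = oc_mhz / base_mhz - 1
print(f"core clock gain: {clock_gain:.1%}")  # ~31.7%
```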
 
What are you talking about? Here in the EU, the 4070 and 7800 XT are priced similarly; the 7800 XT has slightly more performance (4% at 1440p) but lacks features and has much lower RT performance. The 4060 Ti is way cheaper.

The 4060 Ti had an MSRP of 390 dollars. The 7800 XT is 500 dollars. The 4070 was 599 dollars at launch but has dropped numerous times to 550 and below. Besides, the 4070 launched half a year ago. Obviously AMD will price the card lower, or it would be pointless to even launch it. AMD were late to the party, once again.

The 4060 series is mediocre, just like the Radeon 7600 series. Who cares about them? However, the 4060 will probably outsell the entire Radeon 7000 lineup alone. Watch the Steam HW Survey in a few years...
I have no idea what "here in the EU" means, but here in the UK, the cheapest 4070 is 90 quid more expensive than the cheapest 7800 XT. That's 18%! I don't care what gimmicks (sorry, "features") Nvidia is trying to push down my throat; I'm not paying that much more for identical performance. I wouldn't even compare the 4060 Ti, as it's in an entirely different performance segment altogether, yet the 16 GB version costs as much as the 7800 XT does.
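Those two numbers are at least self-consistent: a £90 premium that amounts to 18% implies a 7800 XT street price of around £500. This is just a back-calculation from the post's own figures, not a quoted price:

```python
# Back-of-envelope check on the UK pricing claim above.
premium_gbp = 90
premium_fraction = 0.18
implied_7800xt_gbp = premium_gbp / premium_fraction
print(f"implied 7800 XT price: £{implied_7800xt_gbp:.0f}")  # ~£500
```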

Like I said above, the only card I'd consider buying over its direct competitor is the 4060, because that's the performance level that needs DLSS and FG the most.
 