Thursday, January 11th 2024

AMD Believes NVIDIA is Behind in Driver-Based Upscaler Development

AMD is readying its Fluid Motion Frames (AFMF) technology for public release later this month (January 24, to be exact). Aaron Steinman, a Senior Radeon Manager, believes that arch-rival NVIDIA will need to take some drastic steps once AFMF arrives, due to its more open nature. He stated in a short interaction with PC Gamer: "I would be curious to know if NVIDIA feels now they have to match what we've done in making some of these solutions driver-based." His colleagues in software engineering have already released the Radeon Super Resolution (RSR) technology, which functions entirely within the driver.

Unlike Team Red's heavily marketed FidelityFX Super Resolution (FSR) system, AFMF and RSR do not rely on official support from game developers. The driver-based solutions will be packaged within an upcoming version of AMD's HYPR-RX feature set. Steinman continued: "I think what we're gonna start seeing, DLSS is only available on certain solutions, so either NVIDIA is going to have to benefit from our solution because we did make it open-source and cross-vendor, or they're probably going to need to do something similar." The publication points out that Team Green has something in the same ballpark (NVIDIA Image Scaling), but it's nowhere near as advanced as its headlining "AI-infused" DLSS tech. Steinman conceded to PC Gamer that the back-and-forth between the two rivals will never truly settle: "I mean, the competition will never end, right? We'll have new technologies, they (NVIDIA) will have new technologies."
Sources: PC Gamer, VideoCardz

84 Comments on AMD Believes NVIDIA is Behind in Driver-Based Upscaler Development

#51
Macro Device
CyberPomPom: 15 years ago, almost to the day.
1600p60 with FSAA was already a thing in many games, too. And by the end of that year, some cards had the same performance level as the best in this graph, but at 1600p.
6 outta 9 games are way slower than 60 FPS on a single top-tier GPU. SLI and CF have never been ideal and playability is questionable. 4K runs faster now than 1080p back then.
Posted on Reply
#52
Prima.Vera
mb194dc: Never understood upscaling, if I'm paying thousands for a graphics card and screen... I want the best image quality. That is native resolution. No interest in DLSS, FSR or whatever.
Bingo.
That's exactly what I've been saying all along. I want the native sharp image, no fake upscaling garbage. Frame generation is OK if it works well and doesn't create a stutter and lag mess.
Assimilator: LMAO what utter nonsense. What's crippling hardware performance are the high resolutions people expect now... 4k is 4x 1080p, 8k is 4x that again, so you are asking the GPU to render sixteen times more pixels. Graphics cards could do that today... if they were 16x larger, consumed 16x more power, and cost 16x as much. Since that's stupid and nobody would be able to afford it, upscaling and frame generation are literally the only way that GPUs can keep up with high-resolution displays. It's almost like the people who design GPUs are a lot smarter than you, and have thought of this.
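For what it's worth, the raw pixel arithmetic in that quote does check out. A quick sanity-check sketch (the resolution figures are the standard specs, not numbers from the article):

```python
# Pixel counts for common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")

# Output:
# 1080p: 2,073,600 pixels (1.0x 1080p)
# 1440p: 3,686,400 pixels (1.8x 1080p)
# 4K: 8,294,400 pixels (4.0x 1080p)
# 8K: 33,177,600 pixels (16.0x 1080p)
```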


Because performance isn't a feature :rolleyes:


Since @las has posted evidence that contradicts this claim, how about you provide evidence to back it up?
Not entirely true. I game at 1440p, and in basically ALL recent games, even ones with mediocre graphics, you HAVE TO use the garbage DLSS or, if not, the even bigger piece of junk, TAA.
Basically I am forced to use one of those, and what's funny and absurd is that DLSS on High looks a little better than native with TAA. And that's the reason people say DLSS is good-looking.
I miss my MSAA, SMAA (I don't care about shimmering, tbh), or any other anti-aliasing technologies that DO NOT involve blurring stuff.
P.S.
nGreedia's DLAA is good, but not as good as MSAA. Still a bit blurry.
Posted on Reply
#53
ratirt
Assimilator: LMAO what utter nonsense. What's crippling hardware performance are the high resolutions people expect now... 4k is 4x 1080p, 8k is 4x that again, so you are asking the GPU to render sixteen times more pixels. Graphics cards could do that today... if they were 16x larger, consumed 16x more power, and cost 16x as much. Since that's stupid and nobody would be able to afford it, upscaling and frame generation are literally the only way that GPUs can keep up with high-resolution displays. It's almost like the people who design GPUs are a lot smarter than you, and have thought of this.
Nonsense? You are not seeing my point of view, and nonsense has nothing to do with it. Just because you don't believe one thing can follow from another doesn't mean it's nonsense.
Who said crippling? You, not me. Who said anything about resolution? You, not me.
I'm saying that instead of hardware performance going up, features will be advertised so that a product counts as faster than the previous one just because it has a feature that lets you run games faster.
That is what I'm talking about, not 4K or whatever "crippling" of gaming you are talking about.
Upscaling and FG are not the only way, and you can read extensively about it.
Hardware progress is not solely dependent on making graphics chips bigger, more expensive, and more power-hungry.

I'm not paying for features, it's that simple. :)
Posted on Reply
#54
MicroUnC
"Driver based upscaler"?
NVIDIA doesn't need it. NV is going full AI
Posted on Reply
#55
CyberPomPom
Beginner Micro Device: 6 outta 9 games are way slower than 60 FPS on a single top-tier GPU. SLI and CF have never been ideal and playability is questionable. 4K runs faster now than 1080p back then.
I daily drove SLI and CrossFire from mid-2008 to the end of 2015; it was a thing. I don't see why we should discard it. It had some caveats but mostly worked. Also remind yourself that, price-wise, the GTX 295 was equivalent (inflation included) to an RTX 4070, and in TDP to a 4070 Ti. Very much accessible.
The reason multi-GPU was left behind in 3D rendering has more to do with the evolution of rendering technologies than with playability.

But I do fully agree with the rest of your argument. 4K has no reason to be the end goal of display technologies, and compute power will probably lead us to 8K quite soon in the grand scheme of things.
Posted on Reply
#57
ToTTenTranz
Vya Domus: Well, technically correct since Nvidia doesn't offer anything at all on this front.
That's... pretty much the guy's point.
Posted on Reply
#58
kapone32
andreiga76: I was talking about the PC market.
It's not about how many units you sell or how much revenue you get; it's all about how much profit you get from each unit sold. AMD is not getting much profit back on those; in fact, MS and Sony chose them because NVIDIA didn't want to be involved in that market.
You're talking about the booming handheld market, but you forgot Nintendo makes a handheld device and sells more units than the PS5; this year, with the new Nintendo console/handheld, it will be a clear winner in units sold.
NVIDIA is OK with selling its GPUs to Nintendo because of a specially negotiated agreement and because it can sell them at high margins, since they are much-reduced versions compared to the current generation; based on some info, they are even based on the last generation.
Do you realize that the Steam Deck has been in the top 10 global sellers on the entire platform since release? Have you seen the number of handhelds that Oyamomi (you know who I mean) have released? Have you seen how there are 2230 and 2242 M.2 drives that you can buy after years of 2280 alone? Do you think that the Switch contributed to that? The fact that the Switch is not as expensive as any of those handhelds also speaks to your profit argument.
Posted on Reply
#59
mouacyk
Everything should be a nail when all you have is a hammer.
Posted on Reply
#60
lexluthermiester
T0@st: AMD Believes NVIDIA is Behind in Driver-Based Upscaler Development
While we have to take that statement with a grain of salt, it is possible.
Posted on Reply
#61
kapone32
I really don't care if you call me a fanboy; what AMD has done is so similar to how G-Sync vs FreeSync evolved. The narrative made upscaling seem better than raster. Nvidia makes software that only works with current cards and restricts it from the others. Now AMD tells us that it will be at the driver level. That statement is huge for the potential, as we are going to higher resolutions and are already at higher refresh rates. Just look at this year's CES, with 500 Hz IPS 1440p and 480 Hz OLED 4K monitors. Indeed, this could turn into a serious mic drop.
Posted on Reply
#62
Denver
People turn everything into AMD vs Nvidia here, and start raising flags and guessing what people prefer. To begin with, we already had a poll about this, and it was clear that the majority prefer to run games at native resolution without any upscaling or fake frames, green or red or whatever. The same goes for RT.

Don't underestimate people's intelligence. More raw performance in rasterization is what most people want; these extra tricks and artifact generators will be used in the future to cover up mediocre performance gains. You can write that down and wait.
Posted on Reply
#63
Chrispy_
AMD has been supporting the GeForce 10- and 20-series where Nvidia won't, for the last 3 years at least.
Bringing frame-gen to the 30-series is just icing on the cake.

I've also been a mixed-brand household for as long as I can remember and it's always stark how shit the Nvidia driver is compared to the AMD one. I basically have to plug in a bunch of third-party tools to bodge things on Nvidia where AMD stuff just works without additional software.

If we're talking day-zero driver issues, I suspect Nvidia are ahead of AMD there, simply because more developer workstations are Nvidia-sponsored or Nvidia-powered - though these issues are almost always resolved by AMD, Intel, and Nvidia within a few days of a new AAA title launching with any kind of driver issues. If we're talking about the quality of the driver control panel and features it offers, Nvidia have been lagging for a decade or more...
Posted on Reply
#64
Beermotor
Upscaling allows hardware vendors to sell weak, crippled parts at premium prices.
Posted on Reply
#65
Assimilator
Denver: More raw performance in rasterization is what most people want
Hey guess what, you don't always get what you want because of things like, IDK, physics.
Denver: these extra tricks and artifact generators will be used in the future to cover up mediocre performance gains
Do you know what else uses all sorts of tricks to gain better performance? Rasterisation. Tricks like occlusion culling have been used for as long as rasterisation has existed, and sometimes they give the wrong results. Do I hear you complaining about those? Nope.

Really, this anti-frame-generation nonsense is exactly that: reactive nonsense from people who don't understand the limits of silicon or the future of graphics. Thankfully, that future is going to happen regardless of your inability or unwillingness to comprehend it.
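To make the "tricks that are sometimes wrong" point concrete, here is a toy sketch of conservative frustum culling, a simpler cousin of the occlusion culling mentioned above. Purely illustrative, not any engine's actual code:

```python
from dataclasses import dataclass

@dataclass
class Plane:
    # Plane nx*x + ny*y + nz*z + d = 0, normal pointing into the view volume.
    nx: float; ny: float; nz: float; d: float

@dataclass
class Sphere:
    # Bounding sphere around a mesh.
    x: float; y: float; z: float; r: float

def is_visible(s: Sphere, frustum: list[Plane]) -> bool:
    """Keep the object unless its bounding sphere is fully outside a plane.

    The test is conservative: objects straddling a plane are drawn even if
    the actual mesh is off-screen (wasted work), and a bounding volume that
    is too tight can cull geometry that pokes into view (visible popping).
    That is the "sometimes wrong" trade-off of rasterisation tricks.
    """
    for p in frustum:
        dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d
        if dist < -s.r:  # fully on the outside of this plane: cull it
            return False
    return True

# Camera looks down -z; near plane at z = -1, inside region is z < -1.
near = Plane(0.0, 0.0, -1.0, -1.0)
print(is_visible(Sphere(0, 0, -5, 1), [near]))  # True: in front of the camera
print(is_visible(Sphere(0, 0,  3, 1), [near]))  # False: behind the camera, culled
```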
Posted on Reply
#66
Denver
Assimilator: Hey guess what, you don't always get what you want because of things like, IDK, physics.


Do you know what else uses all sorts of tricks to gain better performance? Rasterisation. Tricks like occlusion culling have been used for as long as rasterisation has existed, and sometimes they give the wrong results. Do I hear you complaining about those? Nope.

Really, this anti-frame-generation nonsense is exactly that: reactive nonsense from people who don't understand the limits of silicon or the future of graphics. Thankfully, that future is going to happen regardless of your inability or unwillingness to comprehend it.
You can't maintain the coherence of your own argument. How can you claim to understand that physics makes each new manufacturing process exponentially more difficult and expensive, while at the same time defending Nvidia wasting transistors/die area on something as remarkably impractical as RT?

- Oh, I know, but the $2000 4090 can run CB2077 from half a decade ago at 20-30 fps with PT.
- No, it's not playable... Then what are we going to do? Blur it with upscaling (the realistic graphics I was trying to create) and then insert fake frames, artifacts, and extra latency... Wow, look, look, now it's playable. Yay!

Occlusion culling and fake frames/upscaling are not even comparable. One works on the part of the scene that is not visible and can sometimes cause artifacts, depending on the efficiency of the implementation; fake frames/upscaling, on the other hand, cause artifacts and quality degradation 100% of the time.

Yes, let's be thankful for a future of buggy games with upscaling, artifacts, expensive hardware, and minimal advancements; it's incredible.
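On the latency point specifically, a rough back-of-the-envelope sketch; simplified, real pipelines differ, and the numbers are purely illustrative:

```python
# Interpolation-based frame generation shows frame N+0.5 between frames N
# and N+1, so it must hold frame N back until N+1 has been rendered.
# That delay is roughly one real frame time, on top of the render cost.

render_fps = 30.0                       # what the GPU actually renders
frame_time_ms = 1000.0 / render_fps     # ~33.3 ms per real frame

displayed_fps = render_fps * 2          # motion looks like 60 fps...
added_latency_ms = frame_time_ms        # ...but input lags ~one extra frame

print(f"shown: {displayed_fps:.0f} fps, extra latency: ~{added_latency_ms:.0f} ms")
# -> shown: 60 fps, extra latency: ~33 ms (smoother motion, not smoother feel)
```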
Posted on Reply
#67
Steevo
Beginner Micro Device"640 kB is enough for everyone."

The difference in fidelity is real. Computing power grows, upscaling technology improves. It's a matter of time. Maybe in 10 or 15 years we will have 8K monitors for sub-800 USD and GPUs capable of reasonable 8K gaming at High, or at least Medium, settings at the sub-1000 USD mark. Doesn't sound insane to me (15 years ago, 1080p60 was not a thing, just FYI).
Alright youngsters, when I started I was excited to get an 800x600 screen and 16-bit color.

I bought a 9600 XT to play HL2 and bought a new CRT monitor at a whopping 1280x1024.
Posted on Reply
#68
kapone32
Steevo: Alright youngsters, when I started I was excited to get an 800x600 screen and 16-bit color.

I bought a 9600 XT to play HL2 and bought a new CRT monitor at a whopping 1280x1024.
Remember when 720p was all the rage? I loaded up Flashback via 3DO emulation on my 4K TV and wow.
Posted on Reply
#69
Macro Device
Steevo: Alright youngsters, when I started I was excited to get an 800x600 screen and 16-bit color.

I bought a 9600 XT to play HL2 and bought a new CRT monitor at a whopping 1280x1024.
When all that was a thing, I was an elementary-school nonsense. Now, being a 29-year-old nonsense, I just enjoy the fact that I don't need to lower my resolution to 320x240 for games to hit a double-digit framerate.
Posted on Reply
#70
kapone32
Beginner Micro Device: When all that was a thing, I was an elementary-school nonsense. Now, being a 29-year-old nonsense, I just enjoy the fact that I don't need to lower my resolution to 320x240 for games to hit a double-digit framerate.
No doubt high-resolution and high-refresh monitors are part of the pie that makes gaming so sweet today. If you pair the right equipment with the right hardware, you are going to get the compelling smile that is the fulcrum of the Matchbox vs Hot Wheels argument that persists. I used to be one of those people who believed that higher refresh rates were snake oil. I have a 4K Mini-LED VA panel (FV43U) that I am in love with. It actually supplants the Korean 1440p no-scaler monitor I got about 10 years ago.

I am addicted to gaming and do not care. When you see Everspace 2 in 4K with its arcade gameplay, or Redout 2 with its homage to and evolution of F-Zero... Even Division 2 is ultra-immersive doing Strongholds or Main Missions, and how old is that game? Indeed, even games like Torchlight 2 benefit from modern monitors. If you want to be blown away, Avatar and Spider-Man are visual tour-de-force games.
Posted on Reply
#71
Steevo
kapone32: Remember when 720p was all the rage? I loaded up Flashback via 3DO emulation on my 4K TV and wow.
I thought it was the next step; GPUs couldn't push 1080p at decent framerates, and a LOT of projectors were 720p.
Posted on Reply
#72
Dirt Chip
Beermotor: Upscaling allows hardware vendors to sell weak, crippled parts at premium prices.
And it allows game developers to ship games in a beta (or alpha) state, optimisation-wise.
Denver: You can't maintain the coherence of your own argument. How can you claim to understand that physics makes each new manufacturing process exponentially more difficult and expensive, while at the same time defending Nvidia wasting transistors/die area on something as remarkably impractical as RT?

- Oh, I know, but the $2000 4090 can run CB2077 from half a decade ago at 20-30 fps with PT.
- No, it's not playable... Then what are we going to do? Blur it with upscaling (the realistic graphics I was trying to create) and then insert fake frames, artifacts, and extra latency... Wow, look, look, now it's playable. Yay!

Occlusion culling and fake frames/upscaling are not even comparable. One works on the part of the scene that is not visible and can sometimes cause artifacts, depending on the efficiency of the implementation; fake frames/upscaling, on the other hand, cause artifacts and quality degradation 100% of the time.

Yes, let's be thankful for a future of buggy games with upscaling, artifacts, expensive hardware, and minimal advancements; it's incredible.
Anyone can choose to play at 1080p60. With that, a top-tier GPU will give you what GPUs from 10 years ago couldn't dream of, obviously. Compromises and tricks were used back then, just like today. If you turn off RT, you will have no problem playing any game maxed out on a 1080p monitor, even with a mid-level GPU.
You want more resolution and higher max fps? Then you need to move to the new age of compromise: AI. You can choose not to, of course.
Ranting about AMD and NV (and Intel..) choosing to invest in RT and AI-assisted calculations, and to allocate precious wafer area to them, is close to futile, as this is what all of them have targeted.
Do you see any prospect of a new company or line of products that does only raster? I don't.
For that you would also need a whole generation of new game developers coding raster-only games.

In 5 years or less, non-AI-driven GPU hardware (and just about everything else PC-related) will be devastatingly inferior, if not non-existent.
We're in a transition period; some polishing is still to be done, but we will get there sooner rather than later. Do join along.
Posted on Reply
#73
Denver
Dirt Chip: And it allows game developers to ship games in a beta (or alpha) state, optimisation-wise.


Anyone can choose to play at 1080p60. With that, a top-tier GPU will give you what GPUs from 10 years ago couldn't dream of, obviously. Compromises and tricks were used back then, just like today. If you turn off RT, you will have no problem playing any game maxed out on a 1080p monitor, even with a mid-level GPU.
You want more resolution and higher max fps? Then you need to move to the new age of compromise: AI. You can choose not to, of course.
Ranting about AMD and NV (and Intel..) choosing to invest in RT and AI-assisted calculations, and to allocate precious wafer area to them, is close to futile, as this is what all of them have targeted.
Do you see any prospect of a new company or line of products that does only raster? I don't.
For that you would also need a whole generation of new game developers coding raster-only games.

In 5 years or less, non-AI-driven GPU hardware (and just about everything else PC-related) will be devastatingly inferior, if not non-existent.
We're in a transition period; some polishing is still to be done, but we will get there sooner rather than later. Do join along.
Yes, it is futile to think within reality and logic; the important thing is to create an irrational trend and make money. I'm entirely wrong, it makes perfect sense for companies to slap a sticker touting supposed RT capabilities even on smartphone ARM SoCs with just a 5 W TDP, on APUs, and on low-end GPUs. People don't have brains; just put the "RT ready" sticker on and increase sales.

The solution is to simply admit that the gaming and hardware market is no longer attractive, get out and look for another hobby. There are so many options, including much healthier ones.
Posted on Reply
#74
Dirt Chip
Denver: Yes, it is futile to think within reality and logic; the important thing is to create an irrational trend and make money. I'm entirely wrong, it makes perfect sense for companies to slap a sticker touting supposed RT capabilities even on smartphone ARM SoCs with just a 5 W TDP, on APUs, and on low-end GPUs. People don't have brains; just put the "RT ready" sticker on and increase sales.

The solution is to simply admit that the gaming and hardware market is no longer attractive, get out and look for another hobby. There are so many options, including much healthier ones.
I agree with your conclusion; classic gaming is dead.
Having said that, I myself keep on gaming happily without RT, at 60 fps, compromising on every setting when needed, and I choose to play games that are both free and good, where graphics aren't the key reason to play them.
Moreover, those free games (once full-price AAA titles) are well optimised by the time they go free.

I also advise everyone to keep their hardware until it breaks or won't launch. There is no other way to make a change in the industry.
Knowing it can't be done, reflecting on human nature, I choose to welcome the AI progress, with all its defects. That is, I'd rather play my old games happy than not play new games angry.
Posted on Reply
#75
Suspecto
Considering that modders successfully implemented DLSS in games that do not officially support it, I give Nvidia a month or two until they come up with something similar.
Posted on Reply