Tuesday, February 28th 2023

NVIDIA RTX Video Super Resolution Tested, AI Enhanced Streaming That Barely Makes a Difference

NVIDIA has leveraged their expertise in neural networks and deep learning to release an interesting new feature with their R530 driver branch: an AI video stream upscaler designed to take advantage of RTX Tensor Cores when playing video content within Chromium-based browsers. Our previous news article on RTX Video Super Resolution (VSR) covered the release of Chrome 110 stable, which included support for this technology. The latest version of Microsoft Edge, also based on Chromium, officially supports RTX VSR as well. Owners of NVIDIA RTX graphics cards may have been puzzled by exactly how to enable this feature, however, either in Chrome 110 or in the NVIDIA Control Panel, since the relevant 'NvidiaVpSuperResolution' setting is enabled by default within Chrome, but the required accompanying driver has only just been released, three weeks later.
To use RTX VSR, you'll need an RTX 30 or 40-series graphics card, the latest NVIDIA GeForce graphics driver, and the "RTX Video Enhancement" option enabled within the NVIDIA Control Panel, under the "Adjust video image settings" submenu. There are four quality presets, with "1" being the lowest and "4" being the highest, which also uses the most GPU resources. Owners of RTX 20-series cards will have to wait for NVIDIA to enable this functionality for their GPUs once the engineering work is completed for that architecture.
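On the browser side, the 'NvidiaVpSuperResolution' setting mentioned above can also be forced on at launch via Chromium's feature-flag mechanism. Below is a minimal sketch in Python, assuming a default Windows install path (adjust for your system); since the feature ships enabled in Chrome 110+, this only matters if a policy or an earlier experiment turned it off.

```python
# Minimal sketch, not an official NVIDIA/Google workflow: launch Chrome with
# the 'NvidiaVpSuperResolution' feature flag named in the article.
import subprocess

CHROME = r"C:\Program Files\Google\Chrome\Application\chrome.exe"  # assumed path

subprocess.Popen([
    CHROME,
    "--enable-features=NvidiaVpSuperResolution",  # Chromium feature flag from the article
    "https://www.youtube.com/",                   # any video site to test against
])
```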
Judging by comparison screenshots taken on my personal system with a 3080 Ti and a 1440p monitor, the technology is most noticeable when applied to videos at 720p resolution and below.

Similar to the well-received NVIDIA Shield TV, which could take 720p or 1080p content and upscale it to 4K at up to 30 frames per second using the AI hardware within the Tegra X1+, RTX VSR is a further, more advanced development. Using the more powerful hardware on modern RTX graphics cards, RTX VSR automatically upscales content played within your browser, from anywhere between 360p and 1440p up to 4K, improving detail and removing the compression artifacts streamed content is known for.
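NVIDIA hasn't published the network behind RTX VSR, so the following is not their method; as a rough illustration of what DNN-based upscaling does compared with classic interpolation, here is a sketch using OpenCV's dnn_superres module (from opencv-contrib-python) with the public ESPCN model as a stand-in.

```python
# Rough illustration only: contrast DNN upscaling against plain bicubic.
# ESPCN_x4.pb is available from the OpenCV model zoo; the input filename
# is hypothetical.
import cv2

frame = cv2.imread("frame_360p.png")  # a low-resolution video frame

# Classic bicubic interpolation, 4x, for reference
bicubic = cv2.resize(frame, None, fx=4, fy=4, interpolation=cv2.INTER_CUBIC)

# DNN-based 4x upscale via OpenCV's dnn_superres module
sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x4.pb")
sr.setModel("espcn", 4)  # model name and scale factor
dnn = sr.upsample(frame)

cv2.imwrite("bicubic_4x.png", bicubic)
cv2.imwrite("dnn_4x.png", dnn)
```

Viewing the two outputs side by side gives a feel for why learned upscalers recover edges and text that bicubic smears.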

NVIDIA's RTX VSR FAQ and blog post answer some common questions and provide further details on how the technology works.

You can take a look at NVIDIA's comparison video or try enabling the feature yourself to decide how well NVIDIA's efforts have paid off. As we've seen with other AI-based deep learning solutions, the technology will continue to improve with time. In its current state, RTX VSR seems particularly well suited to increasing the clarity of videos uploaded at lower resolutions or bitrates, such as older videos or live-streamed content from Twitch or YouTube. Those on capped or slower network connections that limit their streaming options should also appreciate being able to consume content efficiently without sacrificing too much image quality. I can't wait to see where the iterative path leads, as this technology could be as impactful for video media as AI-based upscalers were for gaming!


Update: After further testing with a YouTube stream of in-game content, set to 480p and 720p, the differences between RTX VSR enabled at setting '4' and disabled can be isolated.
Comparison screenshots: 480p with VSR enabled vs. 480p disabled; 720p with VSR enabled vs. 720p disabled.
Looking closely at the barrels, trees, surface textures, text on the container, and straight lines (for example, the roof of the service station), we can see image quality improvements with RTX VSR enabled and set to '4' quality.
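For anyone who wants to go beyond pixel peeping, captured screenshots like these can be compared numerically. A small sketch follows, with hypothetical filenames, using PSNR and SSIM from scikit-image; note these metrics measure how much VSR changed the image relative to the off state, not absolute quality.

```python
# Quantify the difference between VSR-on and VSR-off captures of the same frame.
import cv2
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

off = cv2.imread("720p_vsr_off.png")  # hypothetical filenames
on = cv2.imread("720p_vsr_on.png")
assert off.shape == on.shape, "screenshots must be captured at the same size"

psnr = peak_signal_noise_ratio(off, on)
ssim = structural_similarity(off, on, channel_axis=2)  # color images
print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")  # lower SSIM = larger VSR effect
```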

While these image quality improvements certainly exist, I have some questions as to how many owners of RTX 30 and 40-series graphics cards spend their time watching low resolution streams, since the improvements are much less obvious with higher resolution source material. This technology seems ideally suited to portable applications where limited internet bandwidth is available, such as smartphones on mobile networks or laptops on the go using slow wireless connections. Unfortunately, NVIDIA requires laptops to be plugged into mains power to use RTX VSR, due to the additional power drawn by the Tensor Cores required for image processing (most laptops would use the iGPU via Optimus under light graphics loads, and RTX VSR requires the discrete GPU to be active), and no smartphones have RTX features. The way I see it, it's a zero-effort (after initial setup, which takes a minute or so) way to get slightly better image quality, with diminishing returns as source resolution increases, and a negligible draw on system resources. Many older videos from the earlier days of the internet are also only available at relatively low resolutions, so this technology can certainly come into play there to offer more contemporary image quality.

114 Comments on NVIDIA RTX Video Super Resolution Tested, AI Enhanced Streaming That Barely Makes a Difference

#52
sLowEnd
Minus Infinity: AI AI AI, the cure-all for earth's woes.
All you need is love, love, love :)
#53
Verpal
I hope this is another case of DLSS 1, where the initial application is kind of bad and it improves in new iterations.
#54
R0H1T
sLowEnd: All you need is love, love, love :)
Verpal: I hope this is another case of DLSS 1, where the initial application is kind of bad and it improves in new iterations.
Hopefully, but this would also need substantial time/investment from Google/Amazon/FB/Netflix et al.

As an aside, it also probably works up to 1440p w/120 fps; I ran a few streams at 2x speed on YT and the power consumption on my RTX 3080 shot up to nearly 245W.
#55
nguyen
720p VSR ON vs 720p VSR OFF vs 1080p VSR OFF

720p VSR ON looks close to 1080p VSR OFF, which is kinda neat, but power consumption is a little high:
720p - 70W
1080p - 130W
1440p - 240W

Edit: using a lower quality setting reduces power consumption significantly without any visible difference.
With Quality at level 1:
720p: 28W
1080p: 40W
1440p: 50W
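For reference, numbers like these can be collected by polling board power with NVML while the stream plays; a rough sketch, assuming the nvidia-ml-py bindings and the RTX card at GPU index 0:

```python
# Poll GPU board power once per second for ~60 seconds of playback.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
for _ in range(60):
    samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)  # mW -> W
    time.sleep(1)

print(f"avg {sum(samples) / len(samples):.0f} W, peak {max(samples):.0f} W")
pynvml.nvmlShutdown()
```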
#56
Divide Overflow
nVidia's AI isn't doing what it's supposed to. What is it really doing then? Hmmm....
#57
deathlessdd
Divide Overflow: nVidia's AI isn't doing what it's supposed to. What is it really doing then? Hmmm....
Clearly there's no AI doing anything; it's just trying to upscale video. Topaz AI does a much better job at making bad-quality videos look better.

I don't even think the Tensor Cores are doing anything to help this cause; it doesn't require AI to upscale video…
#58
n-ster
For that kind of power consumption, it's a pretty poor effort considering better outcomes have been possible for years. I remember using MPC-HC and some upscaling thing; I forget if it required anything beyond ffmpeg, but the results were much more impressive.
#59
Jun
Always cool to see new features getting added to older hardware; it keeps things exciting. But it uses too much power to just keep it on all the time. Maybe a toggle or a shortcut key would be more friendly. I hope they keep improving it.
#60
mama
Oh dear. Tell me this isn't the final product.
#61
watzupken
Ware: It definitely works.
The screenshots in the article are almost worthless for demonstration.
It's no miracle, but it does about what I expected.
These pics are 360p video scaled to 1440p.

Some people had unrealistic expectations, and some just complain about everything.
I think unless one starts pixel peeping, the benefit is not going to be obvious. It may be good if one has limited data per month and so needs to stream at a lower resolution. Otherwise, I see limited use for it. It may auto-enable, but I don't think people will notice the difference.
Verpal: I hope this is another case of DLSS 1, where the initial application is kind of bad and it improves in new iterations.
I feel they are for different purposes. DLSS makes a lot of sense because you are trying to deliver better image quality while enabling higher FPS. Here, I see little benefit. It is a passive way to improve video quality, but anyone with an RTX 3000 or 4000 series GPU is more than capable of streaming high quality video, which also means you can stream at 1440p and let it downscale to your 1080p display for a sharper image. So the only benefit is likely saving data.
#62
Pumper
nvidia fanboys when AMD is using 50W while playing YouTube videos: "what a piece of shit"
nvidia fanboys when their RTX is using 200W while playing YouTube videos because they are using blur/sharpen "AI": "a game changer, RIP AMD"
#63
Steevo
Unsharp masking and a few other non-AI image enhancements look as good, ATI/AMD had great video options for those of us that cared, and my Blu-ray software already does 60 FPS frame generation with vector adapt using 10-year-old hardware.

Kinda feels like being resold something we already have or had.
#64
nguyen
So I tested again with Quality level at 1:
1080p VSR OFF vs 1080p VSR ON vs 1440p VSR OFF

VSR looks slightly better and uses around 20-30W more (VSR OFF: 20W, VSR ON: 40W at 1080p and 50W at 1440p). It's useful for old YouTube videos and people with slow Internet :D
#65
Jun
Pumper: nvidia fanboys when AMD is using 50W while playing YouTube videos: "what a piece of shit"
nvidia fanboys when their RTX is using 200W while playing YouTube videos because they are using blur/sharpen "AI": "a game changer, RIP AMD"
I don't think anyone said that.
#66
LupintheIII
It doesn't look much better compared to AMD's sharpening filter, which is usable on every video (including locally saved files) in every browser and has been there for 4 years now (even though no one talked about it).

I guess the difference is the use of the buzzword AI...
R0H1T: Hopefully, but this would also need substantial time/investment from Google/Amazon/FB/Netflix et al.

As an aside, it also probably works up to 1440p w/120 fps; I ran a few streams at 2x speed on YT and the power consumption on my RTX 3080 shot up to nearly 245W.
245W to stream a YT video??
And people complained about Intel Arc idle power consumption...
#67
Euphorical
So I did notice a difference; however, I didn't like the "look" of the end result. It's hard to explain, but there are visual artifacts that make videos look unnatural or odd, like adding grid lines to skin on faces, etc.
Not a fan, too distracting; I turned it off.

1440p/240 Hz 27-inch HP Omen X 27.
3080.
Tested 360p/720p on 5 different random vids on YouTube.
#68
loracle706
Absolute bullshit, no difference at all, not even a placebo feeling. NVIDIA, fix the important things and stop leading us toward bullshit ones!!
#69
R0H1T
LupintheIII: 245W to stream a YT video??
Quality was set to level 4 and the 1080p/1440p videos ran at 2x the rate, so basically around 120 fps. Not sure which stream had the highest power consumption, but it peaked just shy of 245W in some of them.
#70
Vya Domus
regs: 480p? It can't do magic. It's not intended for that ultra-low resolution. It's more intended for watching 1080p on a 4K screen.
I know, but if you look at the demo video, it looks night and day between 1080p and upscaled 4K. My guess is that what they showcased was rendered offline; I mean, you can get that kind of quality from ML upscaling, just not in real time. It looks like it does more denoising than actual upscaling.
#71
mrnagant
On the personal pictures, 1080p and 1440p have like no difference. At 360p/480p, you amazingly lose a lot of detail when upscaled. In the 360p upscale, for example, the beard looks like it was painted on you. In the 480p upscale, the mustache looks like it was penciled on. Reminds me of those Snapchat filters that can change how your face looks. I prefer the low-res grainy video in this use case.
#73
Vayra86
Ware: It definitely works.
The screenshots in the article are almost worthless for demonstration.
It's no miracle, but it does about what I expected.
These pics are 360p video scaled to 1440p.

Some people had unrealistic expectations, and some just complain about everything.
It's still a blurry mess.

This is still a convincingly low-quality video, which kinda defeats the point of the technology. So they smoothed out some high-contrast edges... yay? The whole boxer is still the same garbled mess. It's video they're upscaling, not a spreadsheet - unless watching the TV logo is your favorite pastime.
sLowEnd: All you need is love, love, love :)
Bing won't talk to us like that anymore, I heard :D
Pumper: nvidia fanboys when AMD is using 50W while playing YouTube videos: "what a piece of shit"
nvidia fanboys when their RTX is using 200W while playing YouTube videos because they are using blur/sharpen "AI": "a game changer, RIP AMD"
LOL!
#74
rolachaz
We need some CSI-style tech-enhanced video streaming!

#75
Steevo
rolachaz: We need some CSI-style tech-enhanced video streaming!

YEAHHHHHHHHHHHHH!!!! EWWWWWwwwwwwww