
F1 2020 Gets NVIDIA DLSS Support, 4K-60 Max Details Possible on RTX 2060 SUPER

Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Honestly, this sort of tech is something I would have expected from AMD rather than Nvidia to begin with, but here we are.
Not sure why you bring AMD into this. DLSS is NV's, and NV invented it not to have better graphics (even though it is advertised as such) but to have more FPS, because 4K gaming with RT is unreachable at this point. Is DLSS 2.0 bad or not worth it? No, it is a nice trick to speed things up, and that is the main reason NV came up with it: even though new cards are just about to be released, they still won't be able to pull native 4K 60 FPS with RT on. That is what DLSS 2.0 is for. You get an image upscaled to 4K in combination with more FPS.
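For context on where the extra FPS comes from: DLSS shades far fewer pixels per frame and reconstructs the rest. A back-of-the-envelope sketch, assuming the commonly cited internal render scales (2/3 per axis for Quality, 1/2 for Performance; treat the exact figures as illustrative):

```python
# Rough pixel-count arithmetic behind the FPS gain (illustrative only;
# the real speedup is smaller, since not all frame cost scales with resolution).
def internal_pixels(out_w, out_h, scale):
    # DLSS renders internally at a fraction of the output resolution,
    # then reconstructs up to the output size.
    return round(out_w * scale) * round(out_h * scale)

native = 3840 * 2160                              # native 4K output
quality = internal_pixels(3840, 2160, 2 / 3)      # "Quality" mode: ~1440p internal
performance = internal_pixels(3840, 2160, 1 / 2)  # "Performance" mode: 1080p internal

print(native / quality)      # 2.25x fewer pixels shaded per frame
print(native / performance)  # 4.0x fewer
```

That pixel ratio is the headroom the reconstruction pass trades against; the FPS numbers in reviews come in well under it.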
 
Joined
Aug 5, 2019
Messages
812 (0.41/day)
System Name Apex Raptor: Silverback
Processor Intel i9 13900KS Allcore @ 5.8
Motherboard z790 Apex
Cooling LT720 360mm + Phanteks T30
Memory 32GB @8000MT/s CL36
Video Card(s) RTX 4090
Storage 990 PRO 4TB
Display(s) Neo G8 / C1 65"
Case Antec Performance 1
Audio Device(s) DT 1990 Pro / Motu M2
Power Supply Prime Ultra Titanium 1000w
Mouse G502 X Plus
Keyboard K95 Platinum
Quite a statement to call Digital Foundry "paid propaganda"; I do hope you have something to back that up with....
And again, HAVE you tried it with DLSS 2.0? Or are you just remembering when DLSS first came out: you tried it, thought it looked blurry, turned it off, and never looked again?

I've already stated I've played every single DLSS 1.0 and 2.0 game, just not Control on 2.0, and I'm not a fan.
Not sure why all DLSS fanatics have this need to convert everyone to their religion. I like image quality and despise aliasing. At 1440p no DLSS implementation exists that makes me want to use it, let alone convert to the DLSS gods. Perhaps it would be tolerable at 4K, but I won't be going for a 4K monitor in the foreseeable future, as I'm more than happy with the GL850.
Glad to see options; it's just not good enough for me.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
I love how upscaled 4k is simply called "4k" nowadays.

Can I have 8k too?
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Imagine being able to maintain 95% of the visual quality with a 30-40% performance uplift and getting upset about that.
Cyberpunk 2077 is on the way with DLSS support, and hopefully more people will see the benefits of this incredible reconstruction technique.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
95% of the visual quality
Oh, you are so behind, my friend.
It's 105%.
By a metric that I've just made up.
Come and beat this.

And if you go further, upscaling to 8k that is, it gets to 125%.
And if you wonder how, here is how: machine learning, tensor cores, karate, cognitive dissonance, bit chain.
In short: it's magic.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I love how upscaled 4k is simply called "4k" nowadays.

Can I have 8k too?
Not to mention showcasing DLSS 2.0 on a PC while comparing it to a PS4 Pro?
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Not to mention showcasing DLSS 2.0 on a PC while comparing it to a PS4 Pro?
That 7870 in the PS4, multiplied by "coded to the metal", makes it an RTX-level GPU.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Oh, you are so behind, my friend.
It's 105%.
By a metric that I've just made up.
Come and beat this.

And if you go further, upscaling to 8k that is, it gets to 125%.
And if you wonder how, here is how: machine learning, tensor cores, karate, cognitive dissonance, bit chain.
In short: it's magic.


At some point in the future there will be blind tests done by some big tech channels on YouTube, and you'll be shocked to see how many people won't be able to tell the difference between native 4K and DLSS 2.0.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
At some point in the future there will be blind tests done by some big tech channels on YouTube, and you'll be shocked to see how many people won't be able to tell the difference between native 4K and DLSS 2.0.
You are joking, right? They are comparing DLSS 2.0 to PS4 Pro image quality (which, compared to native 4K on PC, looks like a joke). Try native vs DLSS 2.0 and understand that the DLSS technique is a lower-res upscale to improve FPS. Which part of that don't you understand? How is an upscaled image better than native? Nobody says DLSS 2.0 is stupid, but it does not make the image look better than native 4K, because that's just nonsense.
It won't be better than native, or the same. The resolution is still lower, for DLSS 2.0 to achieve more FPS.
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
I've probably watched them long before you did, don't worry about that. The only one willingly ignoring the glaring issues of this feature is you; I mean, you straight up believe DLSS produces quality higher than native. Here, your quote:

Which means you didn't even watch the videos you told me to look at a million times, where you can clearly see the deficiencies of DLSS. How can I even argue with someone who believes that over-sharpened images with ghosting look better than native 4K? You're akin to those people who think over-saturated pictures from cameras are of higher quality.

I am going to say it again: Nvidia is very good at manipulating consumers and selling them subpar features and products, and you are a prime example of that, with your conviction that this is the best thing since sliced bread. Good on you; we've got nothing to discuss any further.

Sooo, did you miss the part at 13:22? Maybe because you "watched them long before I did" you have forgotten about it, but yeah, check it out, and don't get so upset so quickly.

Not sure why you bring AMD into this. DLSS is NV's, and NV invented it not to have better graphics (even though it is advertised as such) but to have more FPS, because 4K gaming with RT is unreachable at this point. Is DLSS 2.0 bad or not worth it? No, it is a nice trick to speed things up, and that is the main reason NV came up with it: even though new cards are just about to be released, they still won't be able to pull native 4K 60 FPS with RT on. That is what DLSS 2.0 is for. You get an image upscaled to 4K in combination with more FPS.

Well, it's pretty obvious, I would think, why I would bring up AMD: they are the one other company making graphics cards, and here is a technology, next to CUDA/NVENC, that I truly do believe will be hard for them to compete with if they don't get on it.

The RT reasoning aside, Nvidia cards can soon produce near-as-makes-no-difference 4K-quality images while running the game at a lower res, thus producing higher FPS; that is going to be a hard thing to compete with (I'm repeating myself now)

I've already stated I've played every single DLSS 1.0 and 2.0 game, just not Control on 2.0, and I'm not a fan.
Not sure why all DLSS fanatics have this need to convert everyone to their religion. I like image quality and despise aliasing. At 1440p no DLSS implementation exists that makes me want to use it, let alone convert to the DLSS gods. Perhaps it would be tolerable at 4K, but I won't be going for a 4K monitor in the foreseeable future, as I'm more than happy with the GL850.
Glad to see options; it's just not good enough for me.

I have not seen a lot of DLSS fanatics, but then I'm not on many tech forums myself, so I'll take your word for it that they exist (Big N has a lot of loyal fanboys, after all).
I was merely checking your experience. If you say you tried DLSS 2.0 in the games you own, I take it Battlefield, then so be it. I don't even have an Nvidia card, so I can't check it myself; I have just heard that DLSS 2.0 is a massive leap, and Control and Death Stranding look excellent from what I can see with it on.

Not to mention showcasing DLSS 2.0 on a PC while comparing it to a PS4 Pro?

I take it that is a response to the vid I posted? Idk, maybe watch it, because that remark is a tad silly.

You are joking, right? They are comparing DLSS 2.0 to PS4 Pro image quality (which, compared to native 4K on PC, looks like a joke). Try native vs DLSS 2.0 and understand that the DLSS technique is a lower-res upscale to improve FPS. Which part of that don't you understand? How is an upscaled image better than native? Nobody says DLSS 2.0 is stupid, but it does not make the image look better than native 4K, because that's just nonsense.
It won't be better than native, or the same. The resolution is still lower, for DLSS 2.0 to achieve more FPS.

They are comparing them as just two different techniques of image reconstruction, with the addition (honestly, just watch the damn vid) that the PC has a lot more horsepower to do this....
If you look at 13:22 as well, and at other times, it does compare it to native, and no, it does not look like a joke; the only joke is you convincing yourself of what you want to be true rather than just looking objectively.
A lower-res image, 1080p or 1440p, reconstructed based on a native 16K image... that is hardly the same as just "upscaling".
 
Joined
Jan 8, 2017
Messages
9,505 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
At some point in the future there will be blind tests done by some big tech channels on YouTube, and you'll be shocked to see how many people won't be able to tell the difference between native 4K and DLSS 2.0.

I am sure they could do the same with frame rate; most people would never be able to tell the difference between, say, 80-90 FPS and 110-120. But if I were to go out there and say it is irrelevant that an Nvidia GPU is 30% faster than an AMD one, because you could never tell the difference at frame rates that high, all hell would break loose.

You know, there is an Overton window with technology as well. Nvidia (and in some part Sony and Microsoft) managed to make lower resolutions acceptable; what will be next, I wonder?
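The frame-rate half of that comparison is easy to put in numbers: an FPS gap corresponds to a per-frame time delta, and equal-looking FPS gaps shrink in frame-time terms as the rate rises. A quick sketch using the figures from this thread:

```python
# Convert the FPS gaps debated above into per-frame time deltas.
def frame_time_ms(fps):
    return 1000.0 / fps  # milliseconds spent on one frame

# 90 vs 120 FPS: the gap people supposedly can't see
high = frame_time_ms(90) - frame_time_ms(120)  # ~2.78 ms per frame
# 50 vs 60 FPS: the gap at demanding settings
low = frame_time_ms(50) - frame_time_ms(60)    # ~3.33 ms per frame

print(round(high, 2), round(low, 2))
```

The deltas are of similar magnitude, which is exactly why the two sides of this argument talk past each other: perceptibility depends on where on the frame-time curve you sit, not on the raw FPS difference.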
 
Joined
Oct 15, 2019
Messages
588 (0.31/day)
At some point in the future there will be blind tests done by some big tech channels on YouTube, and you'll be shocked to see how many people won't be able to tell the difference between native 4K and DLSS 2.0.
Yup, I can totally see this coming. It'll be fun to see what percentage of people are blind or prefer the over-sharpened look. A bunch of people use motion interpolation when watching movies as well, so I would not bet that even the majority are going to choose the image that reflects the creator's vision.
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
I am sure they could do the same with frame rate; most people would never be able to tell the difference between, say, 80-90 FPS and 110-120. But if I were to go out there and say it is irrelevant that an Nvidia GPU is 30% faster than an AMD one, because you could never tell the difference at frame rates that high, all hell would break loose.

You know, there is an Overton window with technology as well. Nvidia (and in some part Sony and Microsoft) managed to make lower resolutions acceptable; what will be next, I wonder?

Not a fair comparison, because where a difference between 90 and 120 might be hard to see, a difference between 50 and 60 is noticeable, and that is the difference you would get at higher settings or with a more demanding future game.
Whereas 4K vs DLSS will still be the same visual fidelity.

Yup, I can totally see this coming. It'll be fun to see what percentage of people are blind or prefer the over-sharpened look. A bunch of people use motion interpolation when watching movies as well, so I would not bet that even the majority are going to choose the image that reflects the creator's vision.

Most don't calibrate their monitors, and use cheap headphones as well, so the "creator's vision" at that level of detail is long gone anyway. How much are vinyl records these days, anyway? And those reference monitors?
 
Joined
Oct 15, 2019
Messages
588 (0.31/day)
Most don't calibrate their monitors, and use cheap headphones as well, so the "creator's vision" at that level of detail is long gone anyway.
Yup, or use TN-panel monitors. Bunch of idiots, whose opinions I give a whole 0% of weight.
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Yup, or use TN-panel monitors. Bunch of idiots, whose opinions I give a whole 0% of weight.

So what hardware do you use, then? Monitor/screen, headphones, speakers, sound card, all that jazz.
 
Joined
Jan 8, 2017
Messages
9,505 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Not a fair comparison

Of course it is; it's the exact same idea, except that a different metric is used. The reason you don't think it's fair is that you didn't have someone like Nvidia come up with some frame-interpolation thingy and give it a cool name.

Whereas 4K vs DLSS will still be the same visual fidelity.

Objectively and subjectively wrong. I can't believe you're still at it.
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Of course it is, it's the exact same idea except that a different metric is used. The reason you don't think it's fair is because you didn't have someone like Nvidia come up with some frame interpolation thingy and give it a cool name.

Objectively and subjectively wrong. I can't believe you're still at it.

Ok, now you are just being silly, man. I clearly explained why you are wrong; I really cannot possibly make it any simpler, but for you I will try:
Card A in an older title/lower settings: 90 FPS vs Card B's 110 FPS. A small difference, barely noticeable; it does not really matter whether you buy A or B.
Card A in a newer title/higher settings: 50 FPS vs Card B's 60 FPS. A big difference, very noticeable; it matters whether you buy Card A or Card B.

4K in an older title has look X; DLSS in an older title has look Y.
4K in a newer title has look X; DLSS in a newer title has look Y.

Nothing changes in the visual-fidelity difference between a newer or older title using 4K vs DLSS; it stays the same.

If you still can't fathom this, then I'm just out of ideas, sorry.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Well, it's pretty obvious, I would think, why I would bring up AMD: they are the one other company making graphics cards, and here is a technology, next to CUDA/NVENC, that I truly do believe will be hard for them to compete with if they don't get on it.

The RT reasoning aside, Nvidia cards can soon produce near-as-makes-no-difference 4K-quality images while running the game at a lower res, thus producing higher FPS; that is going to be a hard thing to compete with (I'm repeating myself now)
Maybe AMD won't have to do anything. Just leave native 4K, if AMD's cards have enough horsepower to pull it off.
I take it that is a response to the vid I posted? Idk, maybe watch it, because that remark is a tad silly.
Yes, it is.
They are comparing them as just two different techniques of image reconstruction, with the addition (honestly, just watch the damn vid) that the PC has a lot more horsepower to do this....
If you look at 13:22 as well, and at other times, it does compare it to native, and no, it does not look like a joke; the only joke is you convincing yourself of what you want to be true rather than just looking objectively.
A lower-res image, 1080p or 1440p, reconstructed based on a native 16K image... that is hardly the same as just "upscaling".
Everyone has seen the video. Stop using "reconstructed" as a marketing scheme.
You compare a 6-7 year old PS4 Pro to something that just showed up. It is an improvement on a technique that made the PS4 Pro run faster by reducing image quality and resolution. DLSS 2.0 does it slightly better, by upscaling the image without reducing the detail level as much. Still, it is a technique to get more FPS by reducing the image resolution compared to native, to make the game run faster on a graphics card. It has less impact on the detail level, balancing it in a way that doesn't look hideous.

DLSS 2.0 = a balanced image-resolution reduction to make games run faster without losing all the details and visual fidelity of the game. PERIOD, that is what DLSS is for, and it does it well enough, but you can still see the difference, plus the glitches that occur, which have been brought up by people here.
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Maybe AMD won't have to do anything. Just leave native 4K, if AMD's cards have enough horsepower to pull it off.

Yes, it is.

Everyone has seen the video. Stop using "reconstructed" as a marketing scheme.
You compare a 6-7 year old PS4 Pro to something that just showed up. It is an improvement on a technique that made the PS4 Pro run faster by reducing image quality and resolution. DLSS 2.0 does it slightly better, by upscaling the image without reducing the detail level as much. Still, it is a technique to get more FPS by reducing the image resolution compared to native, to make the game run faster on a graphics card. It has less impact on the detail level, balancing it in a way that doesn't look hideous.

Omg, seriously man, the PS4 part is completely irrelevant to the argument here.
This is checkerboard reconstruction (which the PS4 employs) vs DLSS (which the PC employs), and just how they compare; a technology for, oh sorry, here comes that "marketing scheme", image reconstruction.
But forget that comparison; I linked it with an eye on how well DLSS (seems to) work nowadays. They did a vid earlier with Control showing it's quite amazing as well:

Perhaps that is a better (less confusing) vid to link.

And yes, "it is a technique to get more FPS by reducing the image resolution compared to native, to make the game run faster on a graphics card."
Exactly, so my point is, AMD needs to compete with this, because before you know it Nvidia can give you 4K QUALITY at 1080p PERFORMANCE; that is what worries me with an eye on competition.

Also, it is image reconstruction; that is what it does, and why games don't just support DLSS out of the box: there are 16K reference images that it tries to mimic as well as it can at a lower resolution. It tries to reconstruct that 16K image with far fewer pixels available; that is what the tech does, and needs to do per game on Nvidia's end, hence it's not some out-of-the-box overlay/filter like AMD's CAS sharpening.
 
Joined
Oct 15, 2019
Messages
588 (0.31/day)
So what hardware do you use, then? Monitor/screen, headphones, speakers, sound card, all that jazz.
An Eizo monitor, Beyerdynamic studio headphones, etc. I do some music-video production work on the side, so that kind of equipment is a must. For fast-paced FPS games, where your performance is more important, I have an LG 27GL850. It has really awful colours, though; really hoping that someone at some point makes a fast monitor that doesn't have subpar visuals.
 
Joined
Jan 8, 2017
Messages
9,505 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I clearly explained why you are wrong; I really cannot possibly make it any simpler, but for you I will try:
Card A in an older title/lower settings: 90 FPS vs Card B's 110 FPS, a small difference
Card A in a newer title/higher settings: 50 FPS vs Card B's 60 FPS, a big difference, very noticeable

4K in an older title has look X; DLSS in an older title has look Y.
4K in a newer title has look X; DLSS in a newer title has look Y.

What I can't fathom is how you do not understand that you can't convince me that a frame-rate difference is very noticeable while one in image quality isn't; you can't quantify that. It is entirely within the realm of subjectivity, and no matter how you spin it, "4K vs DLSS will still be the same visual fidelity" will never be a valid point that you can use. You simply can't prove it to be true.

The native resolution is not 4K; no amount of reconstruction/machine learning will ever bring back that lost detail. You can interpolate the missing pixels, sharpen the image, etc.; nothing will bring back the original 3840x2160 image. So I can't say whether the difference would be noticeable or not, but what I can say with absolute certainty is that an image reconstructed with DLSS will look worse in every measurable way.
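The "lost detail" point can be shown with a toy example: once samples are thrown away, interpolation recovers smooth content almost perfectly but cannot recover high-frequency content. A minimal 1-D sketch (a sine signal standing in for image rows; nothing DLSS-specific):

```python
# Toy illustration: discard every other sample, then linearly interpolate back.
# Smooth (low-frequency) signals survive; fine (high-frequency) detail does not.
import math

def upsample_linear(xs):
    # Rebuild a signal of length 2n-1 from n samples via linear interpolation.
    out = []
    for a, b in zip(xs, xs[1:]):
        out.append(a)
        out.append((a + b) / 2)  # interpolated "missing" sample
    out.append(xs[-1])
    return out

def max_error(freq, n=64):
    orig = [math.sin(freq * i) for i in range(n)]
    rec = upsample_linear(orig[::2])  # downsample by 2, then interpolate back
    return max(abs(o - r) for o, r in zip(orig, rec))

print(max_error(0.1))  # low-frequency signal: tiny reconstruction error
print(max_error(1.5))  # high-frequency signal: large error, detail is gone
```

DLSS mitigates this by accumulating samples over multiple frames, which is why it beats plain spatial interpolation; it does not change the underlying sampling-theory limit the post describes.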
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
And yes, "it is a technique to get more FPS by reducing the image resolution compared to native, to make the game run faster on a graphics card."
Exactly, so my point is, AMD needs to compete with this, because before you know it Nvidia can give you 4K QUALITY at 1080p PERFORMANCE; that is what worries me with an eye on competition.
AMD doesn't have to do anything if the FPS is 60 at 4K. No DLSS needed to speed anything up, so native 4K can be left without any reconstruction. Can AMD pull this off? Time will tell with the new graphics release.
We all know what DLSS does, but you keep forgetting that this DLSS of NV's is not there to make the image quality better, but to gain FPS by sacrificing image quality in one way or another. The reconstruction, when it happens, sacrifices image quality to gain FPS, and it doesn't look the same as before the reconstruction happened. That's why the reconstruction takes place: a sacrifice to get FPS; otherwise, what would have been the point? What is wrong with you, man?
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
What I can't fathom is how you do not understand that you can't convince me that a frame-rate difference is very noticeable while one in image quality isn't; you can't quantify that. It is entirely within the realm of subjectivity, and no matter how you spin it, "4K vs DLSS will still be the same visual fidelity" will never be a valid point that you can use. You simply can't prove it to be true.

The native resolution is not 4K; no amount of reconstruction/machine learning will ever bring back that lost detail. You can interpolate the missing pixels, sharpen the image, etc.; nothing will bring back the original 3840x2160 image. So I can't say whether the difference would be noticeable or not, but what I can say with absolute certainty is that an image reconstructed with DLSS will look worse in every measurable way.

It's not about 4K vs DLSS; it's about 4K vs DLSS in an older game or a newer game. THAT makes no difference: DLSS on an older game looks the same as DLSS on a newer game; it's the same technique applied to both.

Let's go back a tad, shall we:
M2B said that in the future he expects blind tests to be done, and that we will see many people who can't see a difference between native 4K and DLSS.

You then responded saying the same can be done for frame rate, that the difference between 90 and 120 is hardly noticeable for the vast majority, but that if you were to therefore claim the cards producing those numbers are of the same value, "all hell would break loose",

implying that you can't just gloss over the minor differences, because they do matter.

Then I responded that it's not a fair comparison, as that frame-rate difference indeed won't matter for that made-up game. You can buy, for example, an RX 580 to play CS:GO, or an RTX 2080 Ti; both will easily run that game at 200 FPS, so it won't matter which you get. BUT if you were to play something a tad more demanding, suddenly it does matter.
If one only played CS:GO, nobody would mind someone claiming that the 200 FPS RX 580 and the 300 FPS RTX 2080 Ti don't matter and that they should just buy the cheaper of the two cards.
But if people do play more, suddenly that difference in performance will matter, and then you can't make the previous claim, because you would be giving bad advice.

However, when it comes to the difference between 4K and DLSS visual fidelity, for CS:GO or for a "tad more demanding" game, the difference would remain the same; the difference does not scale like FPS can and does, and so the "all hell breaking loose" argument falls away.

Either way, we're getting massively off track here. Or are we?

It's not entirely in the realm of subjectivity, though, is it: if a pixel is black, it's black; there is nothing subjective about that.
A 4K image and a 1440p DLSS 2.0 image can be compared pixel for pixel and analysed for how "the same" they are, and that removes the idea of subjectivity.

How much worse does B look than A? Subjective. But if some program analyses them and says they are 99.9% the same, then, well, it becomes hard to argue subjectivity.

And on the last thing, remember that this is also about looks in motion, not just a fixed image, AND that the DLSS image comes from a 16K reference image, which is quite a bit higher res, with more detail, than 4K.
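A pixel-for-pixel comparison of the kind described above is typically scored with MSE/PSNR. A minimal sketch on tiny made-up grayscale "images" as nested lists (real comparisons run this on full RGB frames, often alongside perceptual metrics like SSIM):

```python
# Minimal per-pixel similarity scoring: mean squared error and PSNR.
import math

def mse(img_a, img_b):
    total, n = 0.0, 0
    for row_a, row_b in zip(img_a, img_b):
        for a, b in zip(row_a, row_b):
            total += (a - b) ** 2
            n += 1
    return total / n

def psnr(img_a, img_b, peak=255.0):
    # Peak signal-to-noise ratio in dB; identical images score infinity.
    m = mse(img_a, img_b)
    return float("inf") if m == 0 else 10 * math.log10(peak * peak / m)

# Two hypothetical 2x2 8-bit patches that differ slightly.
a = [[10, 200], [30, 40]]
b = [[12, 198], [30, 41]]
print(psnr(a, b))  # higher dB = closer to identical
```

Note the caveat both sides raise: a high PSNR says the pixels are numerically close, not that viewers perceive the images as identical, which is why sharpening and ghosting artifacts can hide behind a good score.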
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
blind tests done by some big tech channels
Do you mean tech channels run by cyborg reviewers who are capable of doing a reliable "% of 4K" assessment?
Or tech channels that pretend to be cyborgs?
Or tech channels that sound like the NV marketing team would?
Be more specific, please.

we will see many people who cant see a difference between Native 4k and DLSS
I personally know people who cannot see the difference between 4K and 1080p when watching a 65" screen from a 4.5-meter distance.
What does that prove again? Oh, resolution doesn't matter much, mkay.
Why did we bring in DLSS to figure that out again?

Subjective, but if some program analyses them and says they are 99.9%
Then I want to know what the same programs say about images upscaled using other methods. Just for reference.

And that 99.9% is BS, unless you add that wonderful "threshold" thing, which would also eagerly swallow plain old upscaling.

Either way, we're getting massively off track here. Or are we?
We are. Next-gen consoles are targeting 4K. With a decent PC you wouldn't need to dumb down and then upscale your images, unless you wanted to go 8K for some reason. And there we would need "blind tests" of how many people can spot 8K.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
The deeper you go into the woods, the darker it gets.
I'm done here.
 