System Name | potato |
---|---|
Processor | Ryzen 9 5950X |
Motherboard | MSI MAG B550 Tomahawk |
Cooling | Custom WC Loop |
Memory | 2x16GB G.Skill Trident Z Neo 3600 |
Video Card(s) | RTX3090 |
Storage | 512GB + 2TB NVMe, 2TB SATA, 32TB spinning rust |
Display(s) | XIAOMI Curved 34" 144Hz UWQHD |
Case | be quiet dark base pro 900 |
Audio Device(s) | Edifier R1800T, Logitech G733 |
Power Supply | Corsair HX1000 |
Mouse | Logitech G Pro |
Keyboard | Logitech G913 |
Software | win 11 amd64 |
Well, a final output image is nuanced, especially in motion; there is lots to unpack there. For instance, shimmering on straight(er) edges is a nit-pick of mine where DLSS helps immensely, along with intricate details. Perhaps other parts of the image matter more to you? I'd encourage you to come join the conversation here if you have extended thoughts to share, or anything you want to discuss with other people that use DLSS 2.0.
System Name | MightyX |
---|---|
Processor | Ryzen 9800X3D |
Motherboard | Gigabyte B650I AX |
Cooling | Scythe Fuma 2 |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | Asus TUF RTX3080 Deshrouded |
Storage | WD Black SN850X 2TB |
Display(s) | LG 42C2 4K OLED |
Case | Coolermaster NR200P |
Audio Device(s) | LG SN5Y / Focal Clear |
Power Supply | Corsair SF750 Platinum |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Glorious GMMK Compact w/pudding |
VR HMD | Meta Quest 3 |
Software | case populated with Arctic P12s |
Benchmark Scores | 4k120 OLED Gsync bliss |
> Does it really need ray tracing to be on as some old rumors said?

I can't see this being required at all; if DLSS doesn't need RT to be on, I would put money on AMD not requiring it for FSR.
System Name | M3401 notebook |
---|---|
Processor | 5600H |
Motherboard | NA |
Memory | 16GB |
Video Card(s) | 3050 |
Storage | 500GB SSD |
Display(s) | 14" OLED screen of the laptop |
Software | Windows 10 |
Benchmark Scores | 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling. |
> The key difference is those Nvidia users get to enjoy both.

That surely helped G-Sync to get traction.

> Does it really need ray tracing to be on as some old rumors said?

No, definitely not.
System Name | Windforce |
---|---|
Processor | i7-4790K @4.4ghz |
Motherboard | Asus ROG Maximus Gene VII |
Cooling | Swiftech H220X |
Memory | 12GB Corsair Vengeance Pro |
Video Card(s) | Gigabyte GTX 970 G1 Gaming |
Storage | 1x Western Digital 160GB, 1x OCZ ARC 240GB SSD |
Case | Corsair Carbide Air 240 |
Power Supply | Corsair TX750 |
Mouse | Roccat Kone Pure |
Keyboard | Coolermaster CM Storm Quickfire TK |
Software | Windows 10 |
System Name | Cyberline |
---|---|
Processor | Intel Core i7 2600k -> 12600k |
Motherboard | Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4 |
Cooling | Tuniq Tower 120 -> custom water cooling loop |
Memory | Corsair (4x2) 8GB 1600MHz -> Crucial (8x2) 16GB 3600MHz |
Video Card(s) | AMD RX480 -> RX7800XT |
Storage | Samsung 750 Evo 250GB SSD + WD 1TB x 2 + WD 2TB -> 2TB NVMe SSD |
Display(s) | Philips 32-inch LPF5605H (television) -> Dell S3220DGF |
Case | Antec 600 -> Thermaltake Tenor HTPC case |
Audio Device(s) | Focusrite 2i4 (USB) |
Power Supply | Seasonic 620watt 80+ Platinum |
Mouse | Elecom EX-G |
Keyboard | Rapoo V700 |
Software | Windows 10 Pro 64bit |
If they bring this to the new consoles... imagine those 720p games being upscaled.
System Name | [H]arbringer |
---|---|
Processor | 4x 61XX ES @3.5Ghz (48cores) |
Motherboard | SM GL |
Cooling | 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump. |
Memory | 16x gskill DDR3 1600 cas6 2gb |
Video Card(s) | blah bigadv folder no gfx needed |
Storage | 32GB Sammy SSD |
Display(s) | headless |
Case | Xigmatek Elysium (whats left of it) |
Audio Device(s) | yawn |
Power Supply | Antec 1200w HCP |
Software | Ubuntu 10.10 |
Benchmark Scores | http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww |
> It's funny because I found some forums, and this one in particular, to be an anti-DLSS (among other things) echo chamber, and a theme was that the vast majority, if not all, of the users 'hating' don't own or use RTX cards.
> So I can see where the distinct chance for the opposite exists, and people just rave about it with no constructive criticism whatsoever, but so far it actually seems more balanced than the free-for-all, which I find overwhelmingly negative, where everyone that has read a review, or seen a compressed YouTube video, or some cherry-picked side-by-sides, comes to the party with their bias and just dumps on it.
> I SO want AMD to succeed with FSR, it's only going to be good for us all if they do, but as it stands I have serious doubts.

I would postulate because DLSS 1.0 was like rubbing Vaseline on your screen and DLSS 2.0 is black magic.
> DLSS is far from useless; I've used it in many games so far, especially when I output to my 4K OLED at 120Hz/G-Sync, and it looks and runs awesome. Death Stranding looked better than native with DLSS enabled using the quality preset. Text was sharper. Textures looked better. DLSS is very good for eliminating jaggies.
> DLSS support will explode over time; native support in the most popular game engines is going to happen. Already confirmed: Unreal Engine, Unity. Also, DLSS 3.0 should allow all games that support TAA to force DLSS instead, which is hundreds of titles, even older games.
> DLSS is the true magic of the RTX series. It allows for a huge fps boost, or RT without a huge fps drop. Ray tracing is a joke without DLSS, but even without DLSS, Nvidia's 3000 series beats AMD's 6000 series with ease in RT scenarios. Ray tracing can be great in some titles, single-player ones; in multiplayer it's all about performance for me though.
> If DLSS is implemented well (most DLSS 2.x games), the tech is insanely good. Free performance and pretty much identical image quality, sometimes better; there are several videos and tests with side-by-side comparisons. Why say no to 50-75% more fps _and_ improved visuals? Just never use motion blur with DLSS (who uses motion blur anyway... sigh; motion blur is only something you use when you try to mask a low framerate, aka consoles and low-end PCs).
> DLSS 1.0 pretty much sucked, blurred crap, but DLSS 2.x is nothing like 1.0; some people still think DLSS means blur though, haha.
> It's funny how people with AMD GPUs or GTX cards always seem to think DLSS is useless. I wonder why.

I do have an RTX card and, like yourself, I use a 4K OLED TV with G-Sync. The issue with DLSS is it's limited to games that support it, which renders it useless otherwise, and the same goes for RT; it isn't supported in any old games.
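For a rough sense of where that quoted 50-75% fps headroom comes from: DLSS renders internally at a reduced resolution and reconstructs the output, so the GPU shades far fewer pixels per frame. A back-of-the-envelope sketch at a 4K output, using the commonly cited approximate per-axis scale factors for each quality mode (treat the exact factors as assumptions, not official specs):

```python
# Back-of-the-envelope pixel counts behind DLSS performance gains at a 4K output.
# The per-axis scale factors below are commonly cited approximations.
OUTPUT_W, OUTPUT_H = 3840, 2160
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

native_pixels = OUTPUT_W * OUTPUT_H
for mode, s in MODES.items():
    w, h = int(OUTPUT_W * s), int(OUTPUT_H * s)
    print(f"{mode}: {w}x{h} internal = {w * h / native_pixels:.0%} of native 4K")
# Quality: 2560x1440 internal = 44% of native 4K
# Balanced: 2227x1252 internal = 34% of native 4K
# Performance: 1920x1080 internal = 25% of native 4K
```

Shading a quarter to a half of the pixels, plus a comparatively fixed reconstruction cost per frame, is roughly what the fps boost quoted above translates to in practice.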
System Name | MightyX |
---|---|
Processor | Ryzen 9800X3D |
Motherboard | Gigabyte B650I AX |
Cooling | Scythe Fuma 2 |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | Asus TUF RTX3080 Deshrouded |
Storage | WD Black SN850X 2TB |
Display(s) | LG 42C2 4K OLED |
Case | Coolermaster NR200P |
Audio Device(s) | LG SN5Y / Focal Clear |
Power Supply | Corsair SF750 Platinum |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Glorious GMMK Compact w/pudding |
VR HMD | Meta Quest 3 |
Software | case populated with Arctic P12s |
Benchmark Scores | 4k120 OLED Gsync bliss |
> I would postulate because DLSS 1.0 was like rubbing Vaseline on your screen and DLSS 2.0 is black magic.

Indeed, 1.0 was below average; you might as well have just run a 70% render scale. 2.0 is legit black magic lol, sometimes I really can't believe my eyes.
System Name | Chip |
---|---|
Processor | Amd 5600X |
Motherboard | MSI B450M Mortar Max |
Cooling | Hyper 212 |
Memory | 2x 16GB DDR4 3200MHz |
Video Card(s) | RX 6700 |
Storage | 5.5TB HDD + 220GB SSD |
Display(s) | Normal monitor |
Case | something cheap |
VR HMD | Vive |
> motion blur is only something you use when you try and mask a low framerate, aka consoles and low end PCs

This is incorrect and wrong.
System Name | MightyX |
---|---|
Processor | Ryzen 9800X3D |
Motherboard | Gigabyte B650I AX |
Cooling | Scythe Fuma 2 |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | Asus TUF RTX3080 Deshrouded |
Storage | WD Black SN850X 2TB |
Display(s) | LG 42C2 4K OLED |
Case | Coolermaster NR200P |
Audio Device(s) | LG SN5Y / Focal Clear |
Power Supply | Corsair SF750 Platinum |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Glorious GMMK Compact w/pudding |
VR HMD | Meta Quest 3 |
Software | case populated with Arctic P12s |
Benchmark Scores | 4k120 OLED Gsync bliss |
> I do have an RTX card and like yourself I use an OLED 4K G-Sync; the issue with DLSS is it's only available in games that support it, which renders it useless otherwise, and the same goes for RT.

Granted, adoption is one of the bigger gripes about it, but if the game you were playing supported it, would you not enable it? I'd call it limited, not useless. And I don't think we're at the end of the road for it; NVIDIA know how amazingly it works, and that there's a massive appetite for it, so it's very much in their interest to vastly increase per-game adoption and look at ways to enable it universally. They'd be foolish not to be working on that right now.

> Granted, adoption is one of the bigger gripes about it...

The technology seems great, but as of now it is useless for someone who doesn't play compatible games, which represents the majority of gamers.
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> I do have an RTX card, and like yourself I use a 4K OLED G-Sync TV; what makes DLSS useless is it's limited to games that support it, and it isn't supported in any old games. The only way to make it useful is to be able to apply it to all games like other AA solutions. Only then will it make sense for me.

Why do you want DLSS in old games? You get a million fps at native resolution anyway. IMO, the main function of DLSS is to counter the performance hit you suffer when you turn ray tracing on. Old games don't need it. New games wouldn't need it either, if not for the enormous performance costs that come with RT.
System Name | Chip |
---|---|
Processor | Amd 5600X |
Motherboard | MSI B450M Mortar Max |
Cooling | Hyper 212 |
Memory | 2x 16GB DDR4 3200MHz |
Video Card(s) | RX 6700 |
Storage | 5.5TB HDD + 220GB SSD |
Display(s) | Normal monitor |
Case | something cheap |
VR HMD | Vive |
> Why do you want DLSS in old games? You get a million fps at native resolution anyway. IMO, the main function of DLSS is to counter the performance hit you suffer when you turn ray tracing on. Old games don't need it. New games wouldn't need it either, if not for the enormous performance costs that come with RT.

Yeah lol.
System Name | M3401 notebook |
---|---|
Processor | 5600H |
Motherboard | NA |
Memory | 16GB |
Video Card(s) | 3050 |
Storage | 500GB SSD |
Display(s) | 14" OLED screen of the laptop |
Software | Windows 10 |
Benchmark Scores | 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling. |
> So for Nvidia RTX users, you can enable both DLSS + FSR in games? Wow, huge performance...

In fact, as a user of any GPU, you can simply run the game at 1080p, call it "Bazinga 4K" (print "Bazinga 4K" in Comic Sans, with the face of Jensen Huang next to it, he deserves it) and have it on the wall next to your monitor, in case someone tells you you gained perf from simply lowering your resolution (1080p is 4 times fewer pixels than 4K; 1440p is 2.25 times fewer).
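Those ratios check out; here is the plain arithmetic for anyone who wants to verify it:

```python
# Verify the resolution pixel ratios quoted above.
pixels = {name: w * h for name, (w, h) in {
    "4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}.items()}
print(pixels["4K"] / pixels["1080p"])  # 4.0  -> 1080p is a quarter of 4K's pixels
print(pixels["4K"] / pixels["1440p"])  # 2.25 -> 1440p is 2.25x fewer pixels
```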
> Why do you want DLSS in old games? You get a million fps at native resolution anyway. IMO, the main function of DLSS is to counter the performance hit you suffer when you turn ray tracing on. Old games don't need it. New games wouldn't need it either, if not for the enormous performance costs that come with RT.

Not many games, old or new, support DLSS 2.0, plus 4K120 isn't easy, as few have the hardware for it.

As for 4K TVs, I also have one, but I watch 90% of my stuff in 1080p as the difference is barely noticeable.
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> Not many games, old or new, support DLSS 2.0, plus 4K120 isn't easy, as few have the hardware for it.

My argument to that is:
1. 4K 120 fps is a luxury even with modern hardware. You need a couple more GPU generations until it really becomes available with modern games.
2. You really can't blame game developers for not including a proprietary Nvidia technology in game X when that technology and the hardware for it hadn't even been invented yet.
3. A "noticeable difference" doesn't mean it's crap. Finding a good compromise between performance and visual quality isn't a concept from the devil - just sayin'.

> My argument to that is...

I agree the new TVs do a very good job upscaling, but when it comes to games there is a noticeable difference between 4K and 1440p. I guess we have to agree to disagree.
System Name | [H]arbringer |
---|---|
Processor | 4x 61XX ES @3.5Ghz (48cores) |
Motherboard | SM GL |
Cooling | 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump. |
Memory | 16x gskill DDR3 1600 cas6 2gb |
Video Card(s) | blah bigadv folder no gfx needed |
Storage | 32GB Sammy SSD |
Display(s) | headless |
Case | Xigmatek Elysium (whats left of it) |
Audio Device(s) | yawn |
Power Supply | Antec 1200w HCP |
Software | Ubuntu 10.10 |
Benchmark Scores | http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww |
> My argument to that is:
> 1. 4K 120 fps is a luxury even with modern hardware. You need a couple more GPU generations until it really becomes available with modern games.
> 2. You really can't blame game developers for not including a proprietary Nvidia technology in game X when that technology and the hardware for it hadn't even been invented yet.
> 3. A "noticeable difference" doesn't mean it's crap. Finding a good compromise between performance and visual quality isn't a concept from the devil - just sayin'.

I was about to disagree with 4K120 being a luxury because my 3090 can pump it out... but... that's kinda a luxury item.
System Name | MightyX |
---|---|
Processor | Ryzen 9800X3D |
Motherboard | Gigabyte B650I AX |
Cooling | Scythe Fuma 2 |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | Asus TUF RTX3080 Deshrouded |
Storage | WD Black SN850X 2TB |
Display(s) | LG 42C2 4K OLED |
Case | Coolermaster NR200P |
Audio Device(s) | LG SN5Y / Focal Clear |
Power Supply | Corsair SF750 Platinum |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Glorious GMMK Compact w/pudding |
VR HMD | Meta Quest 3 |
Software | case populated with Arctic P12s |
Benchmark Scores | 4k120 OLED Gsync bliss |
> A "noticeable difference" doesn't mean it's crap. Finding a good compromise between performance and visual quality isn't a concept from the devil - just sayin'.

Many people on this and other forums need to hear and accept that.
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> Many people on this and other forums need to hear and accept that.

Totally. Like I said, I used to be a native-resolution-above-all-AA kind of guy, but the way FidelityFX works in Cyberpunk 2077 really impressed me. Being able to play it in full HD, all high settings, on a GTX 1650 with a minor loss of sharpness is amazing. I'd get around 20-25 fps running all native, but with 75% scaling I get between 35-40, which I'm fine with. Obviously all-low 60 fps would be possible too, I just don't want it.

One thing I like about this new wave of upscaling technologies and methods is the ability to disproportionately retain image quality, detail, and sharpness beyond what the base/input resolution would suggest the quality should be. Some cannot seem to wrap their head around or accept that it's even possible, let alone that people might find it desirable to use rather than native.
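The fps jump in that Cyberpunk example lines up with the pixel math: a render scale applies to each axis, so 75% scaling shades only about 56% of the native pixels. A minimal check:

```python
# Render scale applies per axis, so pixel (and roughly shading) cost
# falls with the square of the scale factor.
scale = 0.75
native = 1920 * 1080                            # full HD, as in the example above
scaled = int(1920 * scale) * int(1080 * scale)  # 1440 x 810
print(f"{scaled / native:.2%}")                 # 56.25% of native pixels
```

A ~1.8x cut in shaded pixels producing a jump from ~20-25 to ~35-40 fps is about what you'd expect once fixed per-frame costs are factored in.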
System Name | MightyX |
---|---|
Processor | Ryzen 9800X3D |
Motherboard | Gigabyte B650I AX |
Cooling | Scythe Fuma 2 |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | Asus TUF RTX3080 Deshrouded |
Storage | WD Black SN850X 2TB |
Display(s) | LG 42C2 4K OLED |
Case | Coolermaster NR200P |
Audio Device(s) | LG SN5Y / Focal Clear |
Power Supply | Corsair SF750 Platinum |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Glorious GMMK Compact w/pudding |
VR HMD | Meta Quest 3 |
Software | case populated with Arctic P12s |
Benchmark Scores | 4k120 OLED Gsync bliss |
> Totally. Like I said, I used to be a native-resolution-above-all-AA kind of guy, but the way FidelityFX works in Cyberpunk 2077 really impressed me. Being able to play it in full HD, all high settings, on a GTX 1650 with a minor loss of sharpness is amazing. I'd get around 20-25 fps running all native, but with 75% scaling I get between 35-40, which I'm fine with. Obviously all-low 60 fps would be possible too, I just don't want it.
> The other thing is, I don't think a PC enthusiast should be freaked out by a bit of tweaking to find the sweet spot between image quality and performance (yes, I said PC enthusiast, not high-end enthusiast or RTX 3090 enthusiast). There's no such thing as a bad computer, only mismatched expectations.

Absolutely, FidelityFX CAS helped me tremendously in Horizon Zero Dawn, and there is very little visual sacrifice with a 70-90% render scale and a touch of sharpening, whichever 'flavour' you have. I use a sharpen filter in the vast majority of games I play these days, no matter what res/render scale. I eagerly await these various technologies becoming more dynamic, like a resolution scaler with an FPS target plus some work put into a dynamic sharpening filter on top. Or the same principle with FSR or DLSS: static output resolution/FPS target, dynamic input resolution perhaps.
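That fps-targeted dynamic resolution idea is essentially a small feedback loop. A minimal sketch of the concept (purely hypothetical names and gain values, not any engine's actual API; real implementations smooth frame times over many frames and clamp step sizes):

```python
# Hypothetical sketch of a dynamic-resolution feedback loop with an fps target.
# Each frame, nudge the render scale so measured frame time converges on the budget.
TARGET_FPS = 90
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # keep the internal resolution within sane bounds
GAIN = 0.1                        # small gain keeps scale changes gradual

def update_render_scale(scale: float, frame_time_ms: float) -> float:
    budget_ms = 1000.0 / TARGET_FPS
    # Positive error -> frame too slow -> lower the render scale (and vice versa).
    error = (frame_time_ms - budget_ms) / budget_ms
    return max(MIN_SCALE, min(MAX_SCALE, scale - GAIN * error))

# Example: 14 ms frames against an ~11.1 ms budget pull the scale down.
print(update_render_scale(0.90, 14.0))  # ~0.874
```

A sharpening pass with strength tied to the current scale, as suggested above, would slot in right after the upscale.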
System Name | Meh |
---|---|
Processor | 7800X3D |
Motherboard | MSI X670E Tomahawk |
Cooling | Thermalright Phantom Spirit |
Memory | 32GB G.Skill @ 6000/CL30 |
Video Card(s) | Gainward RTX 4090 Phantom / Undervolt + OC |
Storage | Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server |
Display(s) | 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR |
Case | Fractal Design North XL |
Audio Device(s) | FiiO DAC |
Power Supply | Corsair RM1000x / Native 12VHPWR |
Mouse | Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro |
Keyboard | Corsair K60 Pro / MX Low Profile Speed |
Software | Windows 10 Pro x64 |
> this is incorrect
> and wrong

If you are using the PC in your system info, I understand why. I hate motion blur in most games, BUT RACING games REALLY take advantage of it; in first person it REALLY makes the car feel like it's going faster.