
AMD FSR 2.0 Quality & Performance

Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Only "ray-traced" reflections... though.
Nope. All RT effects. Reflections are what devs USUALLY use, but not always.
For Metro it's RTGI; for other games it's shadows or RTAO.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.57/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
I hope both technologies are here to stay and become an industry standard, so that all games ship with them out of the box.
 
Joined
Dec 22, 2011
Messages
286 (0.06/day)
Processor Ryzen 7 5800X3D
Motherboard Asus Prime X570 Pro
Cooling Deepcool LS-720
Memory 32 GB (4x 8GB) DDR4-3600 CL16
Video Card(s) Gigabyte Radeon RX 6800 XT Gaming OC
Storage Samsung PM9A1 (980 Pro OEM) + 960 Evo NVMe SSD + 830 SATA SSD + Toshiba & WD HDD's
Display(s) Samsung C32HG70
Case Lian Li O11D Evo
Audio Device(s) Sound Blaster Zx
Power Supply Seasonic 750W Focus+ Platinum
Mouse Logitech G703 Lightspeed
Keyboard SteelSeries Apex Pro
Software Windows 11 Pro
Only "ray-traced" reflections... though.
Huh? They support literally anything you can do with RT, just like AMD or NVIDIA (or Intel Arc) PC cards can.
The reason they might seem more "limited" is simply a performance issue; consoles aren't top-end PC hardware, where the GPU alone consumes more power than the whole console.
 
Joined
Feb 20, 2019
Messages
7,487 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Some games are starting to add a sharpening slider for DLSS, and honestly this is what DLSS has sorely needed. At least in Deathloop, FSR 2.0 + sharpness is the best output here by a wide margin.

I think FSR 2.0 is close to DLSS 2.3 in terms of output quality; it doesn't really matter if there are minor differences because the effectiveness of FSR and DLSS varies from game to game and from scene to scene. The basic mechanics of FSR 2.0 much more closely match DLSS now - temporal sampling of jittered camera positions, with a couple of features designed to combat the two worst drawbacks of this technique (thin-feature shimmer and motion-trail artifacts).
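To make the "jittered camera positions" part concrete: AMD's public FSR 2.0 documentation describes a Halton(2,3) sub-pixel jitter sequence, and something in this spirit is all it takes to generate one. This is only an illustrative sketch; the helper names are mine rather than the SDK's:

```cpp
#include <cstdint>

// Radical-inverse (Halton) sequence: a low-discrepancy series commonly used
// for the sub-pixel camera jitter that temporal upscalers accumulate over frames.
static float Halton(uint32_t index, uint32_t base)
{
    float result = 0.0f;
    float f = 1.0f;
    while (index > 0)
    {
        f /= static_cast<float>(base);
        result += f * static_cast<float>(index % base);
        index /= base;
    }
    return result;
}

struct Jitter { float x, y; };

// Per-frame jitter offset in pixels, roughly in [-0.5, 0.5). The phase count
// (frames before the pattern repeats) typically grows with the upscale ratio
// so that more unique sub-pixel positions get sampled.
static Jitter GetJitter(uint32_t frameIndex, uint32_t phaseCount)
{
    const uint32_t i = (frameIndex % phaseCount) + 1; // Halton is usually indexed from 1
    return { Halton(i, 2) - 0.5f, Halton(i, 3) - 0.5f };
}
```

The offset gets folded into the projection matrix each frame, so successive low-resolution frames sample slightly different sub-pixel positions; the upscaler then re-projects its history with the motion vectors and accumulates those samples into the full-resolution output, which is exactly where the shimmer and trailing countermeasures come in.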

Where I think FSR is vastly superior to DLSS is the adaptive resolution. Finally you can run something demanding at a target framerate and not have to pause, sacrifice some graphics options, potentially restart the game to apply them, and then wait until the next big firefight and hope it's enough. You'll (maybe) notice it getting a bit blurry in the heat of the moment, but it's only temporary and you don't have to sacrifice those higher quality settings or resolution for 99% of the gameplay just to cover those 1% peak demands.
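As a rough sketch of what the engine does under the hood with that kind of dynamic resolution (names and constants are my own illustration, not AMD's code), the idea is just a small feedback loop on the internal render scale:

```cpp
#include <algorithm>
#include <cmath>

// Toy dynamic-resolution controller, assuming a renderer that can change its
// internal render scale every frame while the output resolution stays fixed.
float UpdateRenderScale(float currentScale, float gpuFrameTimeMs, float targetFrameTimeMs)
{
    const float headroom = 0.95f;  // aim slightly under the frame-time budget
    const float error = gpuFrameTimeMs / (targetFrameTimeMs * headroom);

    // GPU cost scales roughly with pixel count (scale squared), so correct by
    // sqrt(error), and damp the step so the resolution doesn't visibly pump.
    const float desired = currentScale / std::sqrt(error);
    const float next = currentScale + 0.1f * (desired - currentScale);

    // Keep the scale in a sensible window, e.g. between FSR 2.0's
    // "Performance" (50% per axis) and native (100%) resolution.
    return std::clamp(next, 0.5f, 1.0f);
}
```

Because the output resolution stays fixed and the upscaler keeps re-using its temporal history, scale changes like this are far less jarring than swapping quality presets mid-game.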
 
Joined
Apr 1, 2017
Messages
420 (0.16/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W
I really enjoyed the introduction to the technology. It was a great write up.

It is too bad that the minimum requirements are so high. I could really use this with my 1060, but it is below the minimum spec for 1080p upscaling. Those who need it the most can't even use it.
Why not try it first before getting upset?
 
Joined
Feb 20, 2019
Messages
7,487 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I hope both technologies are here to stay and become an industry standard, so that all games ship with them out of the box.
I am confident that FSR will become the industry standard fast, as it's the only tech that runs on all platforms. Game developers may continue to implement both as long as Nvidia makes it easy for them and incentivises them to do so, but look at it from the game developer's perspective:

Do you:
  1. Implement a single solution (FSR) which works for all three of your target markets with no restrictions, and is officially endorsed by the exclusive GPU vendor for the two console markets.

    OR

  2. Do all the work to implement FSR, but also do additional work to add DLSS that is only usable by about one quarter of one of your three target markets, and adding it is largely redundant because it doesn't really do anything special that FSR doesn't already do.
Please, find me a good reason why a dev would pick the second option from now on. Outside of financial incentives from Nvidia, you just wouldn't.
 
Joined
Dec 12, 2012
Messages
724 (0.17/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Where I think FSR is vastly superior to DLSS is the adaptive resolution. Finally you can run something demanding at a target framerate and not have to pause, sacrifice some graphics options, potentially restart the game to apply them, and then wait until the next big firefight and hope it's enough. You'll (maybe) notice it getting a bit blurry in the heat of the moment, but it's only temporary and you don't have to sacrifice those higher quality settings or resolution for 99% of the gameplay just to cover those 1% peak demands.
DLSS in Deathloop does support dynamic resolution scaling. It was tested by someone on TPU months ago. It seems that is a game specific feature.
 
Joined
Feb 20, 2019
Messages
7,487 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
DLSS in Deathloop does support dynamic resolution scaling. It was tested by someone on TPU months ago. It seems that is a game specific feature.
Ah okay, I've not seen it as a feature in any games yet, though I have seen games that let you combine the in-game dynamic resolution scaling with DLAA.

If the dynamic scaling actually affects the internal render resolution of DLSS, then that's good. Otherwise, DLAA or DLSS combined with the game's own dynamic resolution is just a simple per-frame upscale on top, without any of the motion-vector or temporal history buffers that make DLSS 2.3 and FSR 2.0 better.
 
Joined
Oct 12, 2005
Messages
682 (0.10/day)
I thought that was only for DLSS 1.0? You can use DLSS in the real-time preview of Unity and Unreal Engine, and I really doubt that Nvidia's servers are computing every single project that is being made.

DLSS 2.x still uses a neural network, but it's not trained per game. Nvidia ships the trained network, and inference is what runs on the tensor cores.

One of the things is that people think you need AI for a lot of things where a good algorithm can do the work just fine and be much easier and cheaper to run. But writing an algorithm requires more work than training an AI. AI is being used right now to brute-force so many things that could just have been done with good programming.

There are areas where AI is really useful and cannot be beaten by a clever algorithm, but those areas are just a small subset of what people try to apply AI to.
 
Joined
Jan 24, 2011
Messages
272 (0.06/day)
Processor AMD Ryzen 5900X
Motherboard MSI MAG X570 Tomahawk
Cooling Dual custom loops
Memory 4x8GB G.SKILL Trident Z Neo 3200C14 B-Die
Video Card(s) AMD Radeon RX 6800XT Reference
Storage ADATA SX8200 480GB, Inland Premium 2TB, various HDDs
Display(s) MSI MAG341CQ
Case Meshify 2 XL
Audio Device(s) Schiit Fulla 3
Power Supply Super Flower Leadex Titanium SE 1000W
Mouse Glorious Model D
Keyboard Drop CTRL, lubed and filmed Halo Trues
It is too bad that the minimum requirements are so high. I could really use this with my 1060, but it is below the minimum spec for 1080p upscaling. Those who need it the most can't even use it.

It will still work. Unlike NVIDIA, AMD doesn't stop you from trying things that aren't explicitly supported.
 
Joined
May 8, 2021
Messages
1,978 (1.76/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Instead of wasting precious developer time on mimicking lower settings, why don't you simply change the settings from ultra high to very high? The result will be the same with regards to the FPS improvement... :D
Do you really think that everyone does that or has the hardware for it? I don't even use presets, but I manage with low-to-high settings, with most set to medium. The technology was more interesting for low-end gamers, who may be able to use weaker hardware that otherwise might not run the game they want. The problem is still picture quality. The main problem with gaming, and it has been a problem for at least a decade, is that games need faster and faster hardware to run, yet very often there's nearly no visual quality gain in newer titles. You can run a 10-year-old game at very high settings and it will look better than a new game at low, but the old game could run well on a GTX 650 while the new game will be a slideshow. I frankly don't want games to look awful, but I also don't care too much about visual quality. However, I hate when newer games are more demanding and look worse or run worse for no obvious reason. It's a damn shame that many devs don't know how to develop properly.
 
Joined
Jun 18, 2019
Messages
124 (0.07/day)
DLSS and RT are proprietary "features" by nvidia with no value for the user who can think.
The PS5 and new XBox do not support RT, so the gamers actually do not need it.

AMD's mistake is that it follows instead of thinking proactively about new unique features with real value.
Jesus you're delusional.
RT is not proprietary; it has existed since the 1980s.
Ray Tracing is the future of graphics.
Xbox and PS5 DO support Ray Tracing.
 

Lateshow

New Member
Joined
Aug 26, 2020
Messages
9 (0.01/day)
Does sound like a hater, plus, everyone seems to forget that for all intents and purposes, Nvidia has limitless financial resources when compared to AMD, but expect AMD to not only compete, but to do better (while also not seeking profit in the same way as Nvidia....so many people think AMD should be a non-profit company and hold them to standards they hold nobody else to)....this is a great big step, and should only get better as long as Nvidia doesn't pull an intel and instead of innovating, just use vast amounts of money to box AMD out and get developers to be exclusive to Nvidia IP....which they will
Yeah, it's like the pot and the kettle around here at times... His comment was typical of an NVIDIA fanboy, but you trying to say people want AMD to run as a non-profit is just as fanboyish.

AMD, and their record-setting quarters, clearly show they are as focused on profits as NVIDIA or Intel. Digging deeper, they got rid of their sub-$300 CPUs last generation and sold a silly amount of the slower 3000-series chips to those who couldn't budget $300+ for the improved 5000-series chips. Yesterday's GPU refresh offering around 5% more performance for 10%+ more money is as bad as any of their competitors' tactics. Well, NVIDIA's original MSRP on the 20 series being super high to help sell excess 10-series GPUs was worse, but it's along the same lines. The AMD 6500 XT was a bad joke, right? There's more if you want.

The lack of competition is the only thing keeping tech prices at these inflated levels despite the mining and PC booms being over. I'm hoping Intel jumps into the GPU game soon and succeeds hard. Both GPU makers took advantage of their customers, and any decent third option would help out us consumers.
 
Joined
Apr 11, 2021
Messages
214 (0.19/day)
It will still work. Unlike NVIDIA, AMD doesn't stop you from trying things that aren't explicitly supported.
If it "works" but the impact is severe enough to leave you with (nearly) unplayable frame rate, then it doesn't work. I mean, those minimum recommendations weren't thrown out just for fun.

EDIT: Talking about 1080p, is there going to be a 1080p comparison?
 

ARF

Joined
Jan 28, 2020
Messages
4,110 (2.59/day)
Location
Ex-usa | slava the trolls
RT... it existed since the 1980's
Ray Tracing is the future of graphics.

It has been the "future" since the 80s and it never came :D
You don't have the technology to make ray tracing work for real gaming, unless every computer is connected via the internet to many powerful ray-tracing supercomputers in order to give you the constant framerate you want, between 60 FPS and 144 FPS (for example).

you're delusional.

Am I delusional, or do you believe in unicorns? :D
 
Joined
Jun 18, 2019
Messages
124 (0.07/day)
It has been the "future" since the 80s and it never came :D
You don't have the technology to make ray tracing work for real gaming, unless every computer is connected via the internet to many powerful ray-tracing supercomputers in order to give you the constant framerate you want, between 60 FPS and 144 FPS (for example).



I'm delusional or you believe in unicorns? :D
Dude, you didn't even understand what I said. You said that RT is useless and proprietary, when in reality it's a technology that has existed since the 1980s and it's the only way to get photorealistic graphics. What does real-time RT in games have to do with this?
 
Joined
Feb 20, 2022
Messages
175 (0.21/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
Jesus you're delusional.
RT is not proprietary; it has existed since the 1980s.
Ray Tracing is the future of graphics.
Xbox and PS5 DO support Ray Tracing.
DirectX 12 has been expanded to cover ray tracing, machine learning and faster storage. This is why these features should be normal parts of any benchmarking, not a special sub-section of tests. The reason these features are treated differently is the perception that NVIDIA supports them better and delivers more performance, and thus AMD is to be protected from negative results in benchmarks.
 
Joined
Sep 2, 2014
Messages
259 (0.07/day)
Location
Emperor's retreat/Naboo Moenia
System Name Order66
Processor Ryzen 7 3700X
Motherboard Asus TUF GAMING B550-PLUS
Cooling AMD Wraith Prism (BOX-cooler)
Memory 16GB DDR4 Corsair Desktop RAM Vengeance LPX 3200MHz Red
Video Card(s) GeForce RTX 3060Ti
Storage Seagate FireCuda 510 1TB SSD
Display(s) Asus VE228HR
Case Thermaltake Versa C21 RGB
Audio Device(s) onboard Realtek
Power Supply Corsair RM850x
Software Windows10 64bit
For developers that already support DLSS 2.0, adding FSR 2.0 support will be easy; AMD talks about days.

Exactly that!!
When DLSS is present, AMD has stated that implementing FSR 2.0 is very easy. Deathloop is a game that already has DLSS implemented, so what we are seeing here is the best-case scenario for FSR 2.0.
Of course, the next logical question is how FSR 2.0 performs when DLSS is not present.
AMD has stated that this will be a much lengthier procedure (05:17 in the following video), and of course we don't know yet what kind of quality this implementation will have.
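To put the "very easy when DLSS is already there" point in concrete terms: a DLSS 2.x integration already produces essentially every per-frame input FSR 2.0 wants. The struct below is a deliberately simplified stand-in for the data both upscalers consume, not the actual FSR 2.0 SDK types:

```cpp
// Simplified illustration of the per-frame inputs shared by DLSS 2.x and FSR 2.0.
// A game that already wires these up for DLSS mostly just has to route the same
// resources into FSR 2.0's dispatch call.
struct TemporalUpscaleInputs
{
    void* colorLowRes;       // jittered, anti-aliasing-free color at the internal resolution
    void* depth;             // scene depth buffer
    void* motionVectors;     // per-pixel screen-space motion vectors
    void* outputHighRes;     // target texture at display resolution
    float jitterX, jitterY;  // sub-pixel camera jitter applied this frame
    float frameTimeDeltaMs;  // used by the upscaler's history heuristics
    bool  resetHistory;      // set on camera cuts/teleports to avoid ghosting
};
```

The lengthier path AMD mentions for games without DLSS is mostly about generating these inputs in the first place: motion vectors for everything that moves, a jittered projection matrix, and making sure post-processing runs after the upscale rather than before it.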
 
Joined
Feb 20, 2022
Messages
175 (0.21/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
Exactly that!!
When DLSS is present, AMD has stated that implementing FSR 2.0 is very easy. Deathloop is a game that already has DLSS implemented, so what we are seeing here is the best-case scenario for FSR 2.0.
Of course, the next logical question is how FSR 2.0 performs when DLSS is not present.
AMD has stated that this will be a much lengthier procedure (05:17 in the following video), and of course we don't know yet what kind of quality this implementation will have.
The customer wins, I guess. Now that FSR 2 is out, benchmarks will have to accept DLSS/FSR 2 results. There is no reason not to accept RT and upscaling now.
 
Joined
Oct 12, 2005
Messages
682 (0.10/day)
If it "works" but the impact is severe enough to leave you with (nearly) unplayable frame rate, then it doesn't work. I mean, those minimum recommendations weren't thrown out just for fun.
There is still a big performance hit versus running the game at the lower resolution directly.

For example, in the previous test:
1440p native with TAA: 70 FPS
4K with FSR 2.0 Quality (1440p internal resolution): 52 FPS, a 26% loss vs native 1440p
4K with DLSS 2.0 Quality (1440p internal resolution): 54 FPS, a 23% loss vs native 1440p

So if you can take a roughly 25% hit relative to running at the internal resolution natively, you can probably run FSR 2.0, from what I see.
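For what it's worth, the same numbers expressed as frame time make the fixed cost of the 4K-output path clearer (a quick back-of-the-envelope only, and it lumps the upscale pass together with anything else that differs between the runs, such as 4K post-processing):

```cpp
#include <cstdio>

// Convert the quoted FPS figures to frame time to estimate the extra per-frame
// cost of rendering at 1440p and upscaling to 4K versus presenting native 1440p.
int main()
{
    const double native1440pMs = 1000.0 / 70.0; // ~14.3 ms
    const double fsr2QualityMs = 1000.0 / 52.0; // ~19.2 ms
    const double dlssQualityMs = 1000.0 / 54.0; // ~18.5 ms

    std::printf("FSR 2.0 extra cost: %.1f ms/frame\n", fsr2QualityMs - native1440pMs);
    std::printf("DLSS extra cost:    %.1f ms/frame\n", dlssQualityMs - native1440pMs);
}
```

So in this test the 4K-output path adds roughly 4-5 ms per frame over native 1440p; whether that's worth it depends on how much slower the card would be rendering native 4K instead.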
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
So wtf is DL in DLSS these days? Apparently AMD is doing it without any neural network shenanigans.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,955 (2.58/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
DLSS and RT are proprietary "features" by nvidia with no value for the user who can think.
The PS5 and new XBox do not support RT, so the gamers actually do not need it.

AMD's mistake is that it follows instead of thinking proactively about new unique features with real value.
RT is not proprietary to Nvidia. Nvidia RTX came before RT became a feature set of DX12.
 

ARF

Joined
Jan 28, 2020
Messages
4,110 (2.59/day)
Location
Ex-usa | slava the trolls
RT is not proprietary to Nvidia. Nvidia RTX came before RT became a feature set of DX12.

Okay, so Nvidia RTX is proprietary, sorry for missing the "X" at the end...

AMD said that you can get your ray-tracing only in the cloud. Good luck!

 