
AMD Patent Shines Raytraced Light on Post-Navi Plans

Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
If you ask me, AMD is thinking this through and isn't rushing like NV did just to release something new and be first. AMD may be second, but I think with a better result. Of course we'll all see how it pans out in the long term, but AMD seems confident in what they are presenting.
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
Sutyi, I guess they will double the RT cores in 7nm Ampere, and 1-2 years of real-world experience and developer feedback is worth a lot for NVIDIA. More and more games are using their approach to RT, which is much better than screen-space reflections; you just need to be less biased and open your eyes.
 
Joined
Feb 3, 2017
Messages
3,667 (1.31/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Joined
Jul 28, 2014
Messages
190 (0.05/day)
Location
Denmark
System Name NorthBlackGoldDream
Processor Ryzen 7600X
Motherboard Gigabyte B650M-DS3H
Cooling Arctic Freezer II 240
Memory 16 GB DDR5-5200C40
Video Card(s) GTX 1080 Ti 11 GB
Storage 1 TB NVMe PCIe 3.0
Display(s) 24.5" 240 Hz TN
Case Fractal North Black Mesh
Power Supply 650W
This ray tracing is FAKE! You want to feel REAL? Here, listen:
In 2001, Alias|Wavefront announced Maya 4. Along with Maya 4 there was an add-on called Mental Ray, which was later bought by NVIDIA. I was quite interested in Mental Ray rendering. I drew some geometry and rendered it in Mental Ray, and I was like, wow, my god, it was damn beautiful. 18 years later I saw the first ray tracing tech in BF V / Metro Exodus, and it didn't feel anything like the ray tracing from 18 years ago. You want it? All right, feel like this:


Biased and unbiased renderers, bro. Mental Ray is slow AF and was a CPU-only renderer at the time. But yes, very pretty results. I do prefer GPU unbiased renderers like Iray or V-Ray.
 
Joined
Apr 16, 2019
Messages
632 (0.32/day)
If I were an AMD fan I would skip the Radeon RX 5XXX generation altogether since AMD is again dedicating most of its resources to next-gen MS/Sony consoles with HW accelerated Ray Tracing while gamers will receive half-baked products which will be rendered obsolete less than a year from now.

I'm not saying NVIDIA RTX is worth buying - I'm saying if you can wait, do wait. In a year from now we'll have proper RDNA (2.0?) for PC and Turing Refresh/Ampere on 7nm.
Navi is already obsolete due to the Nvidia Super line-up, and that's without even considering ray tracing, lol!
 
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Navi is already obsolete due to the Nvidia Super line-up, and that's without even considering ray tracing, lol!
I don't get people sometimes. When others release a product, it's always "wait for the proper reviews to see exactly." NV releases "Super" for a super price with one lousy benchmark leak using Final Fantasy, which everyone knows is NV-sponsored, and yet AMD's cards are already obsolete. :) This is kinda funny, even though AMD provided more benchmarks for a slightly more accurate comparison and a broader spectrum of games. :) Anyway...
Ray tracing is useless right now btw, with its DLSS feature. You can have it and yet you can't play properly because of the performance impact. What's the point of having it anyway?
It is good that it is there, and maybe at some point you will be able to play 2K with RT on with a card at a reasonable price, but not this year or next, my friend.
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
It's not useless for NVIDIA; sure, it's not costless, but it's profitable in the long term.
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
If I were an AMD fan I would skip the Radeon RX 5XXX generation altogether since AMD is again dedicating most of its resources to next-gen MS/Sony consoles with HW accelerated Ray Tracing while gamers will receive half-baked products which will be rendered obsolete less than a year from now.

I'm not saying NVIDIA RTX is worth buying - I'm saying if you can wait, do wait. In a year from now we'll have proper RDNA (2.0?) for PC and Turing Refresh/Ampere on 7nm.

You got Sony/MS involvement all wrong.
They are, in fact, FUNDING research and development at AMD.
Whatever comes out, will be used in other products.

Obviously, it's not done yet, so it won't appear in the products being released in 5 days.

As for waiting... Wait and see if RT is used in consoles first. But then it's quite a long wait.

As AMD's BVH trees are flexible, unlike NVIDIA's, one might discover NV's implementation struggling with whatever comes later.
No point in "waiting", anyhow.

Navi is already obsolete due to the Nvidia Super line-up
The $380 5700 is obsolete because nVidia will release the 2060 Super, which will be just as fast and cost $20 more.

An illustration of "what's wrong with green brains".
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
It will be faster, lol
 
Joined
Feb 18, 2017
Messages
688 (0.25/day)
I just have one doubt, a pretty big question, if I'm able to ask it:
Why do Sony/Microsoft keep using AMD GPUs when, from my POV, NVIDIA has the upper hand in power efficiency and performance? I mean, Sony could have used GP106 to build the PS4 Pro and gotten better power management and, at the same time, more performance. I've heard the PS4 Pro GPU is at the level of an RX 470, which has the same TDP as the GTX 1060, and there is a huge difference.


P.S. All I can imagine keeping them on AMD is backwards compatibility.


P.S. P.S. Sorry if this is not the place to ask this kind of question.



I think the performance is more like an RX 570 than an RX 470. And the GTX 1060 is $199/249, while the RX 470 is $169? Plus, what "huge" difference are you talking about?


And yeah, AMD has a CPU to offer too, which would for sure mean even cheaper prices.
 
Joined
Feb 3, 2017
Messages
3,667 (1.31/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
When others release a product, it's always "wait for the proper reviews to see exactly." NV releases "Super" for a super price with one lousy benchmark leak using Final Fantasy, which everyone knows is NV-sponsored, and yet AMD's cards are already obsolete. :) This is kinda funny, even though AMD provided more benchmarks for a slightly more accurate comparison and a broader spectrum of games. :)
You have a point, but the train of thought here is not completely wrong. Final Fantasy may be a bad benchmark, but even a bad benchmark will provide reasonably reliable results for GPUs of the same architecture. The RTX 2070 Super is 11-12% faster than the RTX 2070, which is a bigger difference than the one between the RX 5700 XT and RTX 2070 on AMD's slide. Even with a slight price premium it should end up at roughly equal perf/$.
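To make that perf/$ estimate concrete, here is a quick back-of-the-envelope check. This is only a sketch: the ~12% uplift is the figure from the post above, but the dollar prices are assumed round numbers for illustration, not quoted MSRPs.

```python
# Hypothetical illustration of the perf/$ reasoning; the prices below are
# assumptions, only the ~12% performance uplift comes from the discussion.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Performance units per dollar for a card."""
    return relative_perf / price_usd

base = perf_per_dollar(1.00, 500)    # RTX 2070 as baseline, assumed ~$500
super_ = perf_per_dollar(1.12, 550)  # 2070 Super: ~12% faster, ~10% pricier

print(f"{super_ / base:.3f}")  # prints 1.018 -> roughly equal perf/$
```

So a ~12% performance gain at a ~10% price premium lands within about 2% of the baseline's perf/$, which is what "roughly equal" means here.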
As AMD's BVH trees are flexible, unlike NVIDIA's, one might discover NV's implementation struggling with whatever comes later.
What exactly do you mean by being flexible?
 
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Its not useless for nvidia, sure its not costless but its profitable in long term.
NV is not the one using it; you are, the customers are. If you're OK with a slideshow or crappy DLSS and RT, then go for it. This is an individual perception. But calling the cards cheaper and already obsolete without any extensive reviews is just not right.

You have a point but the train of thought here is not completely wrong. Final Fantasy may be a bad benchmark but even a bad benchmark will provide reasonably reliable results for GPUs of same architecture. RTX 2070 Super is 11-12% faster than RTX 2070 which is a bigger difference than the performance difference between RX 5700XT and RTX2070 on AMD's slide. Even with a slight price premium it should end up at roughly equal perf/$.
I'm not saying it's bad, but drawing conclusions from one benchmark isn't right if you consider yourself a person who knows this stuff.
AMD could take Strange Brigade as a benchmark and make NV look short on performance. Is that the way to go? I don't think so. A larger spectrum of games is needed to more or less evaluate the performance and value of a given card, not just one game. Not saying it shouldn't be in the mix.
 
Joined
Dec 31, 2009
Messages
19,371 (3.60/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Ray tracing is useless right now btw, with its DLSS feature.
que?

RT and DLSS run on two different pieces of hardware on the card. DLSS, though it typically has a small negative IQ impact, helps boost FPS back up while RT adds the reflections.

What people think of it is a different story. But with AMD coming out with what amounts to the same thing, it's hardly useless.

I love how NVIDIA (read: any company) gets shit on for being an innovator.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,933 (2.36/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
This ray tracing is FAKE! You want to feel REAL? Here, listen:
In 2001, Alias|Wavefront announced Maya 4. Along with Maya 4 there was an add-on called Mental Ray, which was later bought by NVIDIA. I was quite interested in Mental Ray rendering. I drew some geometry and rendered it in Mental Ray, and I was like, wow, my god, it was damn beautiful. 18 years later I saw the first ray tracing tech in BF V / Metro Exodus, and it didn't feel anything like the ray tracing from 18 years ago. You want it? All right, feel like this:

That’s great, but you are comparing your oranges (pictured) to apples. Still-frame ray tracing has been around for years. The subject here is Real-Time Ray Tracing (RTRT), which is obviously going to be limited at this point in time.
 
Joined
Feb 3, 2017
Messages
3,667 (1.31/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
AMD could take Strange Brigade as a benchmark and make NV look short on performance. Is that the way to go? I don't think so. A larger spectrum of games is needed to more or less evaluate the performance and value of a given card, not just one game. Not saying it shouldn't be in the mix.
You missed my point. A benchmark being heavily biased towards one vendor does not enter the equation here.

For example, let's say the RX 570 gets 82.5 fps and the RX 580 gets 91.1 fps at 1080p in Strange Brigade. That makes the RX 580 10% faster in this game at this resolution, which might be enough to suspect the same holds in general. Based on this, and knowing that the RX 570 and GTX 1060 3GB are roughly equal, we may conclude that the RX 580 should be that ~10% faster. In this case the eventual performance difference across many games in the same review ends up slightly larger, at 13%. You can see that the GTX 1060 3GB is 20% slower than the RX 570 in Strange Brigade, but that is irrelevant to the comparison we are making.

The final truth will be there when NDAs are gone and reviews are out but until then we only have incomplete data to analyze and try to make educated guesses from :)
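The inference described above can be sketched in a few lines. The fps figures are the hypothetical ones from the example, and the RX 570 == GTX 1060 3GB parity is the stated premise, not measured data:

```python
# Cross-vendor estimate from a single same-architecture benchmark.
rx570_fps = 82.5   # hypothetical Strange Brigade result, 1080p
rx580_fps = 91.1   # hypothetical Strange Brigade result, 1080p

# Step 1: intra-vendor ratio, which even a vendor-biased benchmark
# still measures fairly, since both cards share the architecture.
rx580_vs_rx570 = rx580_fps / rx570_fps  # ~1.10, i.e. ~10% faster

# Step 2: carry the ratio across the assumed RX 570 == GTX 1060 3GB parity.
rx580_vs_gtx1060_3gb = rx580_vs_rx570 * 1.0

print(f"RX 580 estimated ~{100 * (rx580_vs_gtx1060_3gb - 1):.0f}% faster than GTX 1060 3GB")
```

The point of the sketch: the GTX 1060's own Strange Brigade score never enters the calculation, which is why the benchmark's vendor bias doesn't matter for this particular estimate.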
 
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
You missed my point. A benchmark being heavily biased towards one vendor does not enter the equation here.

For example, let's say the RX 570 gets 82.5 fps and the RX 580 gets 91.1 fps at 1080p in Strange Brigade. That makes the RX 580 10% faster in this game at this resolution, which might be enough to suspect the same holds in general. Based on this, and knowing that the RX 570 and GTX 1060 3GB are roughly equal, we may conclude that the RX 580 should be that ~10% faster. In this case the eventual performance difference across many games in the same review ends up slightly larger, at 13%. You can see that the GTX 1060 3GB is 20% slower than the RX 570 in Strange Brigade, but that is irrelevant to the comparison we are making.
I didn't miss it. I simply disagree with having one benchmark from one game decide the value and performance of a graphics card.
 
Joined
Dec 31, 2009
Messages
19,371 (3.60/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I didn't miss it. I simply disagree with having one benchmark from one game decide the value and performance of a graphics card.
I think you did miss it. I didn't take away from his posts that he was all-in on it; rather, he was using what was available. ;)
 
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
que?

RT and DLSS run on two different pieces of hardware on the card. DLSS, though it typically has a small negative IQ impact, helps boost FPS back up while RT adds the reflections.

What people think of it is a different story. But with AMD coming out with what amounts to the same thing, it's hardly useless.

I love how NVIDIA (read: any company) gets shit on for being an innovator.
Well, it is useless. I'm not saying it's bad that it's there; it is great, but it shouldn't be counted as added value when you can't use the feature properly because of the lack of performance. On the other hand, DLSS isn't a great feature. It reduces image quality very much; the blurriness is just unbearable, which is like moving back in time in image quality. AMD is going to have RT, that's for sure, but the one difference between AMD and NV at this point is that AMD didn't rush RT as much, knowing the performance impact (that's just what I think). So I disagree with some of the forum members that RT adds value to a card or that it should have been implemented in every card. RT won't work as it should right now; it eats resources that could have been used to lower costs (I assume RT cores are expensive), and the die space could have gone to cores that would boost performance. If that had happened, then maybe you could play 4K Ultra on a 2070 at 60 FPS with no problem (that's just me guessing, btw).
I think you did miss it. I didnt take away from his posts that he was all in, but using what was available. ;)
Well, you think I missed it; I'm telling you I didn't, I just disagree with this logic. But whatever suits you :)
 
Joined
Feb 3, 2017
Messages
3,667 (1.31/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
I assume RT cores are expensive
I can't find the exact posts right now, but the estimate is 7-8% of the die or less. Tensor cores take more, but those are apparently useful for the concurrent FP16 stuff.
If this would happen then maybe you could play 4k ultra on a 2070 with 60FPS with no problem ( that's just me guessing btw)
This is too optimistic. We are looking at about a 40% performance deficit here. Compared to the RTX 2070, the RTX 2080 is about 20% faster and the RTX 2080 Ti is 45% faster at 1440p. The RTX 2080 is not enough for 4K Ultra; the RTX 2080 Ti usually is.
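A quick sanity check on that arithmetic (the 20% and 45% tier gaps are the figures quoted above, taken as given):

```python
# Relative 1440p performance tiers, normalized to the RTX 2070.
rtx2070 = 1.00
rtx2080 = rtx2070 * 1.20    # ~20% faster, still short of 4K Ultra
rtx2080ti = rtx2070 * 1.45  # ~45% faster, usually enough for 4K Ultra

# Uplift an RTX 2070 would need to reach the first "4K Ultra capable" tier:
needed = rtx2080ti / rtx2070 - 1
print(f"~{needed:.0%} more performance needed")  # prints ~45% more performance needed
```

That ~45% gap is why freeing up a few percent of die area is unlikely to turn a 2070-class chip into a 4K Ultra card.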
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
I can't find the exact posts right now, but the estimate is 7-8% of the die or less.
The estimate I have seen was around 22% of the die.
 
Joined
Feb 3, 2017
Messages
3,667 (1.31/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Jan 13, 2015
Messages
51 (0.01/day)
I really do hope that the majority of posters know what ray tracing is (since I saw a lot of "educated" opinions that... mean nothing).

It's MATH. Nothing more. The first algorithms appeared in the late '60s, with improvements coming later.

It's nothing "invented" by NVIDIA or AMD; CPUs did it in old versions of 3D Studio (now MAX) and other animation software.

So there's really nothing holding up the "implementation" of ray tracing, Mental Ray, radiosity, or any rendering type you can name, on any GPU (or CPU); the question is just how successful it will be at it.

There are also no "magical" ray-tracing hardware or software improvements.
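For anyone wondering what "it's just math" looks like in practice: the core primitive of every ray tracer, from Mental Ray to RTX, is a ray-object intersection test solved with ordinary algebra. A minimal illustrative sketch (nobody's actual implementation) of the classic ray-sphere test:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Nearest t >= 0 where origin + t*direction hits the sphere, else None.
    Solves the quadratic |o + t*d - c|^2 = r^2, which is all the 'magic' is."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                         # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t >= 0 else None

# A ray from the origin along +z hits a unit sphere centered at (0, 0, 5) at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # prints 4.0
```

What dedicated RT hardware accelerates is doing billions of tests like this (plus BVH traversal to skip most of them), not some new kind of math.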
 
Joined
Feb 3, 2017
Messages
3,667 (1.31/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I can't find the exact posts right now, but the estimate is 7-8% of the die or less. Tensor cores take more, but those are apparently useful for the concurrent FP16 stuff.
This is too optimistic. We are looking at about a 40% performance deficit here. Compared to the RTX 2070, the RTX 2080 is about 20% faster and the RTX 2080 Ti is 45% faster at 1440p. The RTX 2080 is not enough for 4K Ultra; the RTX 2080 Ti usually is.
You are right, it might have been optimistic, but anyway, you get my point. The 2080 is OK for 4K; at 4K resolution you can easily turn the AA down or even switch it off. I don't use it when I play at 4K and it's fine for most games, and I've got a V64. Although 22% more cores (if that figure is correct) might be sufficient for 4K gaming if you add them on top of what the 2070 has.
 