
EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT

nobody is forcing
Nobody is forcing you to read everyone's comments, right?


The thread is about NV's alleged answer to AMD's product.
The main differentiating factor here is RTX.
If you think RTX on the 2060 is viable, good for you.
If it isn't, heck, good for you.

Why shut people up?
 
rtx 2060 runs RT at:
57 fps in BF5
39 fps in Exodus
47.5 fps in SOTR
85 fps in CoD
all at stock clocks on the highest RTX setting available, so yeah, basically the same as an RX 5700 XT, which doesn't support it at all.
 
So the RTX 2060 KO KOs their 1660 Ti? :D



How do you know it is going mainstream this year? All I see is that in the nearly 1.5 years RTX has been on the market, we got only 2 games (if I'm right) with RTX support at launch. These are Metro: Exodus and Control. The RTX-announced titles like BFV and SotTR got their RTX support in a later patch. It was also promised that FFXV would get RTX support, but it was cancelled. And you have Quake 2 with RTX... This is the RTX lineup so far. BFV had to be updated because, as a competitive game, it couldn't hold a fixed 60 fps at FHD on a 2080 Ti with RTX set to high/ultra, so they lowered graphics quality in a later patch to provide better performance. And that is FHD with a $1,000 GPU (which is $1,100 in reality). Metro and Tomb Raider both had minimums halved compared to the average, meaning a 2080 Ti provided 40-45 fps minimums at FHD... Same for Control.

Yeah, mainstream is really the wrong term for it. 'Gets more widely used' fits better. It's completely abstract to us, and even next year, buying into the big RT push is definitely going to leave a sour taste in your mouth. This is not going mainstream because a new console is announced; it goes mainstream when it can show its merit on console AND PC through multiple killer apps, or by just being everywhere, and the latter won't happen anytime soon; any half-decent dev cycle is more than one year.
 
So the "KO" means regular 2060 and 1660Ti being "KOed" by this one selling at 1660Ti prices ?

The 1660 Ti was a dead card walking ever since the 1660 Super came out. This should let Nvidia "simplify" the stack to 1650, 1650 Super, 1660, 1660 Super, 2060. That's still way too many cards, but I guess Nvidia wanted to make sure that AMD didn't undercut them at any price point below $350.
 
I'll call rtrt mainstream when all the games set it as default and I have to lower it to get higher fps.
Or do you think it'll become mainstream before even DirectX 12 becomes mainstream?
More advanced graphics will probably always come at a performance penalty. In a few years ray tracing will become "mainstream"; all it takes is a few games that do it well.

I have yet to see any game that uses DirectX 12 properly; the ones I've seen so far use an abstraction layer to emulate DirectX 11, which defeats the point of a "low overhead" API to begin with, and is the reason why we see performance drops in many games.

Because even a 2080 Ti is pathetically underpowered at it, unable to deliver anything beyond basic reflection/shadow gimmicks.
And that is not going to change any time soon.
Clearly you know how graphics works. :rolleyes:
Once there are one or two "killer games" that do ray tracing well, providing a level of immersion that non-RT can never achieve, you'll have to eat those words. In most scenes, diffuse lighting (including soft shadows) is much more important than specular lighting (sharp reflections). Even with the capabilities of an RTX 2080 Ti, a well-crafted game should be able to do a good hybrid approach, doing diffuse lighting with ray tracing and "faking" much of the specular stuff. The problem here is that most games' source code is a pile of junk stitched together in a hurry, often on top of an "off-the-shelf" game engine with some light scripting. Ray tracing needs to be deeply integrated into the core game engine to be done well.

It's true that there isn't very much support for RTRT from developers right now, but that will change when the next-gen consoles roll out with hardware support for accelerated RTRT.

Remember that most games are still made for the console and ported (sometimes badly) to PC. Console gamers would have a shit-fit if developers didn't make some use of RTRT in their new console games, especially if these next-gen consoles end up more expensive than the present generation.

With AMD on board the RTRT train, there is nothing standing in the way of RTRT, even though it can only be implemented in small ways right now.
It will of course help a lot to gain traction.
But I'm not convinced that we will see many well-crafted software marvels given the current state of the game industry; hopefully a handful of okay ones.

RTRT doesn't provide enough IQ improvement and brings too much of a performance hit (at this point in time, anyway), in my opinion, to warrant it being a deciding factor when buying a GPU.
Ray tracing as a technology has the potential to achieve graphics no rasterization can ever come close to. It all comes down to the application of the technology, and so far the usage in games is trash, so don't judge the technology based on poor utilization. The public perception of ray tracing will change radically once a few good titles come out.

… but tessellation's performance hit was so big it took 7 years between ATI's first implementation and DX adding support for it. To this day, we still cringe when we hear about HairWorks or TressFX
If I may add, regarding tessellation and the performance hit: it depends on what you compare it to. If you compare it to drawing the same model in high detail without tessellation, then there is a massive performance gain. Tessellation allows for substantial savings in memory bandwidth, as you can have a fairly low-detail mesh and a high-detail displacement map to render a high-detail mesh. It also simplifies the vertex shading, since the first step is performed on the low-detail mesh before subdivision. Hardware tessellation also allows for smooth interpolation between two levels of detail, which is practically impossible without tessellation, as mesh structures in GPU memory are complex and changing them is a costly operation. The big problem with tessellation is that it's hard to use well, as with all advanced graphics techniques. Tessellation can only work on certain types of meshes; they have to be "subdividable". To this day, I've yet to notice any game utilize it well. But like other good stuff, such as Vulkan or DirectX 12, the tech is good, but "nobody" is using it right.
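
To put rough numbers on the bandwidth argument, here's a back-of-the-envelope sketch in Python; the vertex layout, mesh sizes, and displacement-map resolution are illustrative assumptions, not measurements:

```python
# Illustrative comparison: a densely pre-tessellated mesh stored in full,
# versus a coarse control mesh plus a displacement map refined by the
# hardware tessellator. All sizes are assumptions, chosen for the arithmetic.

BYTES_PER_VERTEX = 32          # position + normal + UV, a common layout

dense_vertices = 1_000_000     # high-detail mesh stored directly
dense_bytes = dense_vertices * BYTES_PER_VERTEX

coarse_vertices = 10_000                 # low-detail control mesh
disp_map_bytes = 1024 * 1024 * 1         # 1K x 1K, 8-bit displacement map
tess_bytes = coarse_vertices * BYTES_PER_VERTEX + disp_map_bytes

print(f"dense mesh:        {dense_bytes / 1e6:.1f} MB")
print(f"coarse mesh + map: {tess_bytes / 1e6:.1f} MB")
print(f"savings factor:    {dense_bytes / tess_bytes:.1f}x")
```

Even with generous assumptions, the point is the order of magnitude of the savings, not the exact factor.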
 

64K

It will of course help a lot to gain traction.
But I'm not convinced that we will see many well-crafted software marvels given the current state of the game industry; hopefully a handful of okay ones.

Time takes time. While RTRT is moving at a snail's pace right now, with the next-gen consoles having a hardware solution for accelerating ray tracing, AMD stepping into the ring, and possibly Intel as well, RTRT advancement should pick up the pace. But it's still going to take a while to get developers fully up to speed with the future of gaming.

As for fully implemented RTRT in games, that will no doubt be years away, at least 2 generations after this year's releases, but it will come eventually, imo.
 
Time takes time. While RTRT is moving at a snail's pace right now, with the next-gen consoles having a hardware solution for accelerating ray tracing, AMD stepping into the ring, and possibly Intel as well, RTRT advancement should pick up the pace. But it's still going to take a while to get developers fully up to speed with the future of gaming.
I'm fairly sure that, like with practically every CPU or GPU feature, software will continue to lag behind while hardware advances rapidly. On the hardware side, I think we should expect more than just increasing the RT core count over time; with three major players in the game, there will be new approaches that accelerate throughput.

As for fully implemented RTRT in games, that will no doubt be years away, at least 2 generations after this year's releases, but it will come eventually, imo.
I'm not sure what you mean.
If you mean full-scene ray tracing, that would probably require about 10-50x the ray tracing performance of an RTX 2080 Ti in more challenging games.
If you mean games requiring ray tracing, then that may come in a couple of generations.
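
To put that 10-50x in perspective, a back-of-the-envelope sketch; the ~10 gigarays/s figure is NVIDIA's peak marketing number for the 2080 Ti, and the per-pixel ray counts are my assumptions for a path-traced scene:

```python
# Back-of-the-envelope: rays/s needed for full-scene path tracing at 4K60,
# versus the ~10 gigarays/s NVIDIA quotes for the RTX 2080 Ti (a peak
# marketing figure, not sustained in-game throughput).

PIXELS_4K = 3840 * 2160
FPS = 60
GIGARAYS_2080TI = 10e9

# A few samples per pixel, several bounces each, plus shadow rays per hit
# quickly adds up to hundreds of rays per pixel in a challenging scene.
for rays_per_pixel in (200, 1000):
    needed = PIXELS_4K * FPS * rays_per_pixel
    print(f"{rays_per_pixel:>5} rays/px -> {needed / 1e9:6.0f} gigarays/s "
          f"(~{needed / GIGARAYS_2080TI:.0f}x a 2080 Ti)")
```

Under those assumptions the shortfall lands right in that 10-50x range.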
 
Clearly you know how graphics works. :rolleyes:
Yes, and quite well, for this context.

Once there are one or two "killer games" that do ray tracing well, providing a level of immersion that non-RT can never achieve, you'll have to eat those words. In most scenes, diffuse lighting (including soft shadows) is much more important than specular lighting (sharp reflections). Even with the capabilities...<something very very great>...
Where would those "killer games" come from?
Which insane manager would invest money into an AAAA (yes, four A's) title that would not run on the majority of gamers' PCs?

As Crytek has demonstrated with hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off:


The problem here is that ... <game developers suck>...
There are no abstract developers with endless sources of money. It makes no sense whatsoever to spend too much time optimizing for PCs, as there are too many hardware combinations. For consoles, on the other hand... Compare GoW on the PS4's 7870-class GPU to The Witcher on a 1080.

The problem here is The Leather Man. The guy who has ruined OpenGL and should not be allowed next to any industry-wide specification, the guy who has managed to piss off major players to the point that they act as if NV didn't exist.

RT will take off when AMD, who supplies 35% of the discrete GPU market and 100% of the performant console market, goes for it, and there will be little to no chance for TLM to influence it.
 

64K

I'm fairly sure that, like with practically every CPU or GPU feature, software will continue to lag behind while hardware advances rapidly. On the hardware side, I think we should expect more than just increasing the RT core count over time; with three major players in the game, there will be new approaches that accelerate throughput.


I'm not sure what you mean.
If you mean full-scene ray tracing, that would probably require about 10-50x the ray tracing performance of an RTX 2080 Ti in more challenging games.
If you mean games requiring ray tracing, then that may come in a couple of generations.

Yes, I'm talking about fully implemented RTRT. I'm probably being too optimistic in saying it will come at least 2 generations after the release of the new generation this year. I have no idea what it would take to do that, but if it will take 10x to 50x then it may not even come in my lifetime.

Still, we don't know what Nvidia, AMD, and Intel are working on for the future. Frankly, Nvidia surprised me with the Turings and their capacity for RTRT, even as small as the implementation is. I saw a video a while back of a Star Wars demo running fully implemented RTRT on 4 RTX Titans. When a single high-end GPU can match that, I guess we will have the potential for fully implemented RTRT. Obviously the vast majority of gamers will have to turn down the ray tracing settings even then, as they will be, like always, on entry-level or midrange GPUs.

I've also seen articles on possible new materials to replace silicon, like carbon nanotubes, which scientists believe could be 5 times faster than silicon at the same wattage. But I'm going pretty far off topic, so I won't post any more about RTRT.
 
As Crytek has demonstrated with hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off:

Funny how you find RTX games poor quality, but a synthetic benchmark doing half-decent reflections with one ray per four pixels, running at 43 fps avg / 26 fps min at 1440p on a Radeon VII, is fine.
A 2070 does RT reflections in BFV twice as fast with one ray per two pixels when it's using RT hardware.
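
The arithmetic behind that comparison, as a rough sketch; I'm assuming 1440p for both cases and reading "twice as fast" as double the frame rate, which the posts don't state exactly:

```python
# Rough effective-ray-throughput comparison implied by the two claims above:
# the Crytek demo at one ray per four pixels, 43 fps avg at 1440p, versus a
# 2070 doing one ray per two pixels roughly twice as fast with RT hardware.

PIXELS_1440P = 2560 * 1440

crytek_rays_s = PIXELS_1440P / 4 * 43        # 1 ray per 4 px at 43 fps
rtx_rays_s = PIXELS_1440P / 2 * (43 * 2)     # 1 ray per 2 px at ~2x the fps

print(f"Crytek (compute shaders): {crytek_rays_s / 1e6:6.1f} Mrays/s")
print(f"2070 (RT cores):          {rtx_rays_s / 1e6:6.1f} Mrays/s")
print(f"ratio:                    {rtx_rays_s / crytek_rays_s:.0f}x")
```

So "twice as fast at twice the ray density" works out to roughly 4x the effective ray throughput, under those assumptions.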
 
A nice set of presentation slides from 2018 (I wish there were sound) on where we are with RT & games:

Note slide 60. We are at least a decade away from full RT (and I'm being optimistic).

Funny how you find RTX games poor quality
I have never said that.
The "noisy" bit I'm mentioning is a reference to this:

[attached screenshot: 1578751031770.png]
 

bug

If I may add, regarding tessellation and the performance hit: it depends on what you compare it to. <snip> But like other good stuff, such as Vulkan or DirectX 12, the tech is good, but "nobody" is using it right.
I'm not disagreeing with any of that. I'll just note that RT is mostly in the same spot: it can simplify things if used correctly, and it can do things that were problematic (I don't want to say impossible) without it. But RT also has the inevitable teething issues of a first-generation implementation.
 
It's really simple... as long as there is no killer app, it won't fly. You can look at VR, you can look at electric cars (they never took off until Tesla, which is clearly a killer app by definition, i.e. autopilot and all the other features compared to ICE cars), and we can go on like that throughout history.

@efikkan, I feel, was on the right track pushing that task onto developers and funding. That is exactly the core issue here, and the reason why consoles are the early beginnings and not Nvidia's RT tryout with Turing.

This needs a big-budget game that is not only graphically interesting; it also needs to show us that RT enables new or emergent gameplay. Just shiny graphics are not enough; despite the technicalities, raster has already produced such fantastic worlds that it will be nearly impossible to be blown away by graphical prowess alone. We need interactivity; that is where the dynamic aspect of RT comes to light (lol). Sightseeing beautiful scenery is not enough; we need to manipulate it and interface with it. Many recent techs are moving that way: AR, VR, and RT really is at its core exactly the same: a tool to create more dynamic scenes and increase interactivity and realism/immersion.
 
@cucker tarlson

This is a thread about the EVGA GeForce RTX 2060 KO. So why always the AMD vs Nvidia trolling?

Nvidia supports Microsoft's ray tracing and AMD doesn't (yet); what exactly is the problem here?
 
you can look at electric cars (they never took off until Tesla, which is clearly a killer app by definition, i.e. autopilot and all the other features compared to ICE cars)
This is a very bad example, although I understand where you are coming from.
Car manufacturers are under major pressure that stems from the CO2 emission commitments that most Western countries (including the US) have made.
Starting 2020-2021, it will be basically impossible for anyone selling cars in Europe to keep selling them without paying hefty fines, unless they mix in hybrid/pure electric vehicles.
On top of that, countries like the Netherlands have major emission taxes that, for instance, essentially double the price of cars like the Ford Mustang.

That, and not Musk joining Tesla, is why electric cars (which frankly suck as universal vehicles) are viable: they will be forced down the throats of customers, one way or another.


As for "killer app" there is that chicken and an egg problem. Nobody is going to make an AAA+ title for a non-existing market.
 
Where would those "killer games" come from?
Which insane manager would invest money into an AAAA (yes, four A's) title that would not run on the majority of gamers' PCs?
Such a game wouldn't have to be ray-tracing-only. I'm just thinking of a game that amazes people enough that it becomes a "killer app".

As Crytek has demonstrated with hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off:
I know, and it was kind of my point: you can get pretty good results in certain conditions with what we already have, and as you say, even without RT cores if used very cleverly. I've seen demos where ray tracing has been used to do the lighting for a low-resolution "voxel model" of the world, and then the rasterization uses this as a light map, giving nice sharp textures and "realistic" room lighting and shadows without any huge performance requirements. You can probably get away with 1 ray per 10 pixels for many scenes. It's for the specular lighting that we need the incredible performance to do ray tracing, e.g. for explosions, flames, sparks, sun gloss in water, etc.
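
A quick sketch of why that sparse diffuse pass is so cheap compared to per-pixel specular tracing; the resolution and frame rate are assumed for illustration:

```python
# Per-frame ray budgets: a sparse diffuse-GI pass at 1 ray per 10 pixels
# versus per-pixel specular reflections. Resolution and frame rate are
# assumptions for illustration.

PIXELS_1440P = 2560 * 1440
FPS = 60

diffuse_rays_s = PIXELS_1440P / 10 * FPS   # ~1 ray per 10 px feeding the light map
specular_rays_s = PIXELS_1440P * FPS       # 1 ray per pixel for reflections

print(f"diffuse @ 1 ray/10 px: {diffuse_rays_s / 1e6:6.1f} Mrays/s")
print(f"specular @ 1 ray/px:   {specular_rays_s / 1e6:6.1f} Mrays/s")
print(f"specular needs {specular_rays_s / diffuse_rays_s:.0f}x the rays")
```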

There are no abstract developers with endless sources of money. It makes no sense whatsoever to spend too much time optimizing for PCs, as there are too many hardware combinations. For consoles, on the other hand... Compare GoW on the PS4's 7870-class GPU to The Witcher on a 1080.
I think you missed the point.
Most game studios focus on quantity, not quality; they want quick, rapid development cycles, and they usually use plenty of money, possibly too much money. But the focus is on getting it done quickly, not done right. Software development takes time, and as any skilled programmer can tell you, if you don't do it properly and just stitch it together, it's going to be nearly impossible to "fix later". What they should do is get the engine and core game properly working before they throw all the "content people" into the project. This requires more time, but not necessarily more money in the long term. But shareholders and management usually want quick turnover.

The problem here is The Leather Man. The guy who has ruined OpenGL and should not be allowed next to any industry-wide specification, the guy who has managed to piss off major players to the point that they act as if NV didn't exist.
You're way out of line here. That's not even remotely true.
Nvidia has been pretty much the sole contributor to OpenGL since version 2.1; AMD has been limping behind and to this date hasn't added proper OpenGL support.

RT will take off when AMD, who supplies 35% of the discrete GPU market and 100% of the performant console market, goes for it, and there will be little to no chance for TLM to influence it.
As you can see for yourself in the Steam hardware survey, AMD's market share among PC gamers is ~14-15%, including APUs. While AMD sells about ~30% of discrete GPUs, about half of these are low-end OEM GPUs for home or office PCs, which is why they don't show up in the gaming statistics. Their console foothold is substantial, though not 100% (don't forget that thing from Nintendo), and the PC market is getting more dominant every year.

Yes, I'm talking about fully implemented RTRT. <snip> I have no idea what it would take to do that, but if it will take 10x to 50x then it may not even come in my lifetime.

Still, we don't know what Nvidia, AMD, and Intel are working on for the future. Frankly, Nvidia surprised me with the Turings and their capacity for RTRT, even as small as the implementation is. <snip>
I absolutely think it will come in your lifetime (I seriously hope you don't die prematurely ;)).
Hardware-accelerated ray tracing is still in its infancy. While I'm only speculating here, based on a deep understanding of GPU technology, I think there is a good chance of a breakthrough in a 10-15 year timeframe, and not just from node shrinks and more RT cores, but from ways to process batches of rays together, similar to how tensor cores are amazingly good at one thing.
 
This is a very bad example, although I understand where you are coming from.
Car manufacturers are under major pressure that stems from the CO2 emission commitments that most Western countries (including the US) have made.
Starting 2020-2021, it will be basically impossible for anyone selling cars in Europe to keep selling them without paying hefty fines, unless they mix in hybrid/pure electric vehicles.
On top of that, countries like the Netherlands have major emission taxes that, for instance, essentially double the price of cars like the Ford Mustang.

That, and not Musk joining Tesla, is why electric cars (which frankly suck as universal vehicles) are viable: they will be forced down the throats of customers, one way or another.


As for "killer app" there is that chicken and an egg problem. Nobody is going to make an AAA+ title for a non-existing market.

Car manufacturers under major pressure? Yes, we buy new cars because they are economically viable, now or made so by tax and rule changes. But just looking at the technology here: Tesla made a car that is light-years ahead of the competition, and all the rest can do is follow suit. That also applies to autopilot in a big way; Musk has been collecting/mining fleet data since day one, so you can already guess who's going to win the autonomous driving race. And the customer feels that as a killer app; this car does things in ways they were not done before. It integrates features in ways we've not seen yet, it simplifies a great many things, and it is a poster child for the desire and the necessity to go greener.

Keep in mind that many attempts have been made to push electric vehicles in the past. The only reason we chose ICE over them is that it was economically more interesting; or put differently, we could foot the bill to the environment, and it never asked us to pay that money... Today we pay more tax to keep natural disasters at bay... who's really paying this bill now? Your story isn't any different when placed in a different age and related to the birth of ICE cars. Any car manufacturer today that still tells you they will keep doing ICEs indefinitely is lying to you, and to themselves.

Many of these aspects also apply to RT and implementing it in games. It must be made economically viable. Do you know why we have barely seen perf/dollar shifts in the past few gens? To make RT more economically viable. Slow down progress so any new progress seems more valuable. As if the current GPUs could not be scaled further as they are ;)

Anyway let's not drift off topic :D
 
bug

It's really simple... as long as there is no killer app, it won't fly. You can look at VR, you can look at electric cars (they never took off until Tesla, which is clearly a killer app by definition, i.e. autopilot and all the other features compared to ICE cars), and we can go on like that throughout history.

I don't think it will be one killer app. I think it will be more like people noticing they can hide and watch for incoming enemies in a glass reflection, and stuff like that, that will make people see an advantage in RTRT.
But yes, as I have said countless times before, Turing is not the future of RTRT. Turing is squarely aimed at developers (who tend not to program for a non-existent hardware install base) and enthusiasts like me who dig RTRT and are willing to pay a premium to get to kick its tires.
 