Raevenlord
News Editor
- Joined
- Aug 12, 2016
- Messages
- 3,755 (1.24/day)
- Location
- Portugal
System Name | The Ryzening |
---|---|
Processor | AMD Ryzen 9 5900X |
Motherboard | MSI X570 MAG TOMAHAWK |
Cooling | Lian Li Galahad 360mm AIO |
Memory | 32 GB G.Skill Trident Z F4-3733 (4x 8 GB) |
Video Card(s) | Gigabyte RTX 3070 Ti |
Storage | Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB |
Display(s) | Acer Nitro VG270UP (1440p 144 Hz IPS) |
Case | Lian Li O11DX Dynamic White |
Audio Device(s) | iFi Audio Zen DAC |
Power Supply | Seasonic Focus+ 750 W |
Mouse | Cooler Master Masterkeys Lite L |
Keyboard | Cooler Master Masterkeys Lite L |
Software | Windows 10 x64 |
NVIDIA's Turing-based RTX 20-series graphics cards are set to begin shipping on September 20th. Their most compelling argument for buyers is the leap in ray-tracing performance, enabled by hardware-based acceleration via the RT cores added to NVIDIA's core design. NVIDIA has been pretty bullish about how this development reinvents graphics as we know it, and is quick to point out the benefits of this approach over other, shader-based approximations of real, physics-based lighting. In a Q&A at the Citi 2018 Global Technology Conference, NVIDIA's Colette Kress expounded on the new architecture's strengths - but also touched upon a possible segmentation of graphics cards by raytracing capability.
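For readers wondering what those RT cores actually accelerate: the fundamental operation of raytracing is testing a ray against scene geometry for intersection. A minimal sketch of that operation in Python, using a sphere as the simplest possible primitive (real hardware traverses acceleration structures over triangles, but the core math is the same flavor):

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return distance t to the nearest hit, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None  # hits behind the origin don't count

# A ray fired from the origin along +z hits a unit sphere centered at z=5 at t=4.
print(ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
```

Billions of such tests per frame is exactly the workload RT cores offload from the shader units, which is why shader-based approximations were the only practical option before dedicated hardware.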
During that Q&A, NVIDIA's Colette Kress put Turing's performance at a cool 2x improvement over the 10-series graphics cards, discounting any raytracing uplift - and when raytracing is brought into consideration, she said performance has increased by up to 6x compared to NVIDIA's last generation. There's some interesting wording when it comes to NVIDIA's 20-series lineup, though; as Kress puts it, "We'll start with the ray-tracing cards. We have the 2080 Ti, the 2080 and the 2070 overall coming to market," which, in context, seems to point towards a lack of raytracing hardware in lower-tier graphics cards (presumably those based on the rumored TU106 silicon and smaller variants).
This is just speculation based on Kress's comments, but if it translates to reality, this would be a tremendous misstep for NVIDIA and raytracing in general. The majority of the market games on sub-x70-tier graphics cards (and the 20-series has even seen a price hike, up to $499 for the RTX 2070), so failing to add RT hardware to lower-tier graphics would exclude a huge portion of the playerbase from raytracing effects. This would mean that developers adopting NVIDIA's RTX technologies and implementing Microsoft's DXR would be spending development resources catering to the smallest portion of gamers - the ones with high-performance discrete solutions. And we've seen in the past what developers think of devoting their precious time to such features.
Additionally, if this segregation of graphics cards by RTX support (or lack of it) were to happen, what would become of NVIDIA's lineup? GTX graphics cards up to the GTX 2060 (and maybe a 2060 Ti), and RTX from there upwards? Diluting NVIDIA's branding across GTX and RTX doesn't seem like a sensible choice, but if it came to that, it would still be much better than keeping the RTX prefix across the board on cards that lack the hardware.
It could also be a simple case of it not being feasible to include RT hardware on smaller, lower-performance GPUs. As performance leaks and previews have been showing us, even NVIDIA's top-of-the-line RTX 2080 Ti can only deliver 40-60 FPS at 1080p in games such as the upcoming Shadow of the Tomb Raider and Battlefield V (DICE has even said they had to tone down levels of raytracing to achieve playable performance). Driver and game improvements before release could bring FPS up to a point, but all signs point towards a needed decrease in rendering resolution for NVIDIA's new 20-series to cope with the added raytracing compute. And if performance looks like this on NVIDIA's biggest (revealed) Turing die, with its full complement of RT cores, we can only extrapolate what raytracing performance would look like on cut-down dies with a lower number of RT execution units. Perhaps it really wouldn't make much sense to add the extra cost and die area of this dedicated hardware if raytracing could only run at playable levels at 720p.
All in all, it seems to this editor that segregating graphics cards by RTX capability would be a mistake, not only because of userbase fracturing, but also because the majority of players game on x60-tier and lower cards. Developers wouldn't be inclined to add RTX to their games for such a small userbase, and NVIDIA would be looking at diluting its gaming brand across RTX and GTX - or risk confusing customers by slapping the RTX branding on a non-RTX card. If any of these scenarios comes to pass, I'd risk saying it might have been too soon for the raytracing push - even as I applaud NVIDIA for making it anyway and pushing graphics rendering forward. But perhaps the timing and technology could have been better? I guess we'd all better wait for actual performance reviews, right?
View at TechPowerUp Main Site