Tuesday, January 7th 2020
EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT
At CES, we went hands-on with the EVGA GeForce RTX 2060 KO graphics card, and its price came as the biggest surprise: USD $299. This could very well be NVIDIA's first response to AMD's Radeon RX 5600 XT: a new line of RTX 2060 graphics cards under $300, with RTX support as the clincher. The EVGA card looks clearly built down to a cost: roughly 20 cm long, with a simple twin-fan cooling solution and just three display connectors, including a legacy DVI-D. It still gets a full-length backplate. The KO ticks at NVIDIA-reference clock speeds for the RTX 2060. EVGA is also planning a KO Ultra SKU with factory-overclocked speeds comparable to those of the RTX 2060 iCX, at a small price premium. EVGA says the RTX 2060 KO will launch next week (January 13 or later).
57 fps in BF5
www.purepc.pl/karty_graficzne/geforce_rtx_2000_test_kart_graficznych_w_ray_tracingu_i_dlss?page=0,4
39 fps in Exodus
www.purepc.pl/karty_graficzne/geforce_rtx_2000_test_kart_graficznych_w_ray_tracingu_i_dlss?page=0,7
47.5 fps in SOTR
www.purepc.pl/karty_graficzne/geforce_rtx_2000_test_kart_graficznych_w_ray_tracingu_i_dlss?page=0,10
85 fps in CoD
www.purepc.pl/karty_graficzne/call_of_duty_modern_warfare_2019_test_wydajnosci_raytracingu?page=0,4
All of those at stock clocks on the highest RTX setting available, so yeah, basically the same as an RX 5700 XT that doesn't support it at all.
:)
I have yet to see any game that uses DirectX 12 properly; the ones I've seen so far use an abstraction layer to emulate DirectX 11, which defeats the point of a "low overhead" API to begin with, and is the reason why we see performance drops in many games. Clearly you know how graphics works. :rolleyes:
Once there are one or two "killer games" that do ray tracing well, providing a level of immersion that non-RT can never achieve, you'll have to eat those words. In most scenes, diffuse lighting (including soft shadows) is much more important than specular lighting (sharp reflections). Even with the capabilities of the RTX 2080 Ti, a well-crafted game should be able to take a good hybrid approach, doing diffuse lighting with ray tracing and "faking" much of the specular stuff. The problem here is that most games' source code is a pile of junk stitched together in a hurry, often on top of an "off-the-shelf" game engine with some light scripting. Ray tracing needs to be deeply integrated into the core game engine to be done well. That will of course help it gain traction.
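To make that hybrid split concrete, here is a minimal, self-contained C++ sketch: diffuse lighting gathered with a handful of traced rays, specular "faked" with a cheap prefiltered-environment-style lookup. Everything here (the sky function, traceDiffuseRay, fakedSpecular) is a toy stand-in for what a real engine would provide, not code from any actual game:

```cpp
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x = 0, y = 0, z = 0; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 reflect(Vec3 v, Vec3 n) { return v + n * (-2.0f * dot(v, n)); }

// Toy sky used both as the "traced" indirect light and as the environment map.
static Vec3 sky(Vec3 d) {
    float t = 0.5f * (d.y + 1.0f);
    return Vec3{1.0f, 0.9f, 0.8f} * (1.0f - t) + Vec3{0.4f, 0.6f, 1.0f} * t;
}

// Stand-in for a real ray cast; in a game this would be the RT-hardware path.
static Vec3 traceDiffuseRay(Vec3 /*origin*/, Vec3 dir) { return sky(dir); }

// Stand-in for a roughness-prefiltered environment lookup (the "faked" specular).
static Vec3 fakedSpecular(Vec3 r, float roughness) {
    return sky(r) * (1.0f - 0.5f * roughness);  // crude proxy for mip-level blur
}

// Uniform direction on the unit sphere, flipped into the normal's hemisphere.
static Vec3 sampleHemisphere(Vec3 n, std::mt19937& rng) {
    std::normal_distribution<float> g(0.0f, 1.0f);
    Vec3 d{g(rng), g(rng), g(rng)};
    d = d * (1.0f / std::sqrt(dot(d, d)));
    return dot(d, n) < 0.0f ? d * -1.0f : d;
}

// Diffuse: a handful of traced rays (the part RT does best); specular: faked.
static Vec3 shade(Vec3 pos, Vec3 n, Vec3 viewDir, float roughness) {
    std::mt19937 rng(42);
    Vec3 diffuse;
    const int kRays = 8;
    for (int i = 0; i < kRays; ++i) {
        Vec3 dir = sampleHemisphere(n, rng);
        // Weight by cos(theta); the 2x compensates the uniform-hemisphere pdf.
        diffuse = diffuse + traceDiffuseRay(pos, dir) * (2.0f * dot(dir, n));
    }
    diffuse = diffuse * (1.0f / kRays);
    return diffuse + fakedSpecular(reflect(viewDir, n), roughness);
}

int main() {
    Vec3 c = shade({0, 0, 0}, {0, 1, 0}, {0.577f, -0.577f, 0.577f}, 0.3f);
    std::printf("shaded color: %.3f %.3f %.3f\n", c.x, c.y, c.z);
}
```

The point of the split: a few rays per pixel are enough for low-frequency diffuse light, while sharp reflections would need many more, which is exactly why you fake those first.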
But I'm not convinced that we will see many well-crafted software marvels in the current state of the game industry; hopefully a handful of okay ones. Ray tracing as a technology has the potential to achieve graphics that no rasterization can ever come close to. It all comes down to the application of the technology, and so far the usage in games is trash, so don't judge the technology by its poor utilization. The public perception of ray tracing will change radically once a few good titles come out.

If I may add, regarding tessellation and its performance hit: it depends on what you compare it to. If you compare it to drawing the same model in high detail without tessellation, there is a massive performance gain. Tessellation allows for substantial savings in memory bandwidth, as you can combine a fairly low-detail mesh with a high-detail displacement map to render a high-detail result. It also simplifies the vertex shading, since that first step is performed on the low-detail mesh before subdivision. Hardware tessellation also allows for smooth interpolation between two levels of detail, which is practically impossible without it, as mesh structures in GPU memory are complex and changing them is a costly operation. The big problem with tessellation is that it's hard to use well, as with all advanced graphics techniques; it only works on certain types of meshes, which have to be subdividable. To this day I've yet to see a game utilize it well. But as with other good stuff, like Vulkan or DirectX 12, the technology is good, but "nobody" is using it right.
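A rough CPU-side sketch of what tessellation buys you, per the description above: store only a coarse edge plus a displacement map, expand the detail on demand, and fade newly inserted vertices in and out for smooth LOD. The displacement function and the morph logic here are made-up toys, not any real API (actual hardware does this in the tessellator with fractional spacing):

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Vtx { float x, y; };

// Stand-in for sampling a displacement map along the edge (u in [0,1]).
static float displacement(float u) { return 0.1f * std::sin(20.0f * u); }

// Expand one coarse edge (a..b) into `level` segments. Odd vertices don't
// exist at the next-coarser (half) level, so their displacement is scaled
// by `blend` in [0,1] to morph smoothly between the two LODs.
static std::vector<Vtx> tessellateEdge(Vtx a, Vtx b, int level, float blend) {
    std::vector<Vtx> out;
    for (int i = 0; i <= level; ++i) {
        float u = float(i) / float(level);
        Vtx p{a.x + (b.x - a.x) * u, a.y + (b.y - a.y) * u};
        bool newAtThisLevel = (i % 2 != 0);
        p.y += displacement(u) * (newAtThisLevel ? blend : 1.0f);
        out.push_back(p);
    }
    return out;
}

int main() {
    // Only two coarse vertices are stored; seventeen detailed ones are
    // generated on the fly, which is the bandwidth saving described above.
    float tessFactor = 16.7f;                  // fractional factor from LOD logic
    int   level = int(tessFactor) & ~1;        // even base level (here 16)
    float blend = tessFactor - float(level);   // morph weight for new vertices
    for (const Vtx& v : tessellateEdge({0, 0}, {1, 0}, level, blend))
        std::printf("(%.3f, %.3f)\n", v.x, v.y);
}
```

As the tessellation factor crosses integer levels, vertices grow in gradually instead of popping, which is the "smooth interpolation between two levels of detail" mentioned above.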
As far as fully implemented RTRT in games goes, that is no doubt years away, at least two generations after this year's releases, but it will come eventually, imo.
If you mean full-scene ray tracing, that would probably require about 10-50x the ray tracing performance of an RTX 2080 Ti in more challenging games.
If you mean games requiring ray tracing, then that may come in a couple of generations.
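On that first point, a quick back-of-the-envelope check on the 10-50x figure, using NVIDIA's own ~10 Gigarays/s marketing number for the 2080 Ti. The rays-per-pixel budgets are my assumptions, and real effective throughput in divergent scenes is lower than the quoted peak, so these multiples are if anything optimistic:

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    const double pixels = 3840.0 * 2160.0;  // 4K frame
    const double fps = 60.0;
    const double quotedRays = 10e9;         // 2080 Ti's marketing "10 Gigarays/s"

    // Full-scene path tracing needs several samples per pixel, each carrying
    // multiple bounce and shadow rays; a few hundred to a thousand rays per
    // pixel is an assumed budget for tolerable noise before denoising.
    for (double raysPerPixel : {200.0, 500.0, 1000.0}) {
        double needed = pixels * fps * raysPerPixel;
        std::printf("%6.0f rays/px -> %6.1f Gigarays/s, %4.1fx a 2080 Ti\n",
                    raysPerPixel, needed / 1e9, needed / quotedRays);
    }
}
```

That lands at roughly 10x, 25x, and 50x the quoted throughput, which is where the 10-50x range comes from.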
Which insane manager would invest money in an AAAA (yes, four As) title that would not run on the majority of gamers' PCs?
As Crytek has demonstrated, for hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off.
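For context, the core operation really is just arithmetic that any shader core (or CPU) can execute; Crytek's Neon Noir demo did its hybrid RT in compute shaders on non-RTX hardware. Here is a bare-bones ray-sphere intersection, nothing like their actual production code, just the primitive operation:

```cpp
#include <cmath>
#include <cstdio>

// Solve |o + t*d - c|^2 = r^2 for the nearest hit distance t, or return -1.
// d is assumed normalized. This is the whole "trick": ray tracing is plain
// arithmetic, so dedicated RT units accelerate it but aren't required.
static float raySphere(const float o[3], const float d[3],
                       const float c[3], float r) {
    float oc[3] = {o[0] - c[0], o[1] - c[1], o[2] - c[2]};
    float b = oc[0] * d[0] + oc[1] * d[1] + oc[2] * d[2];
    float cc = oc[0] * oc[0] + oc[1] * oc[1] + oc[2] * oc[2] - r * r;
    float disc = b * b - cc;
    if (disc < 0.0f) return -1.0f;            // ray misses the sphere
    float t = -b - std::sqrt(disc);           // nearer of the two roots
    return t >= 0.0f ? t : -1.0f;
}

int main() {
    const float origin[3] = {0, 0, 0}, dir[3] = {0, 0, 1};
    const float center[3] = {0, 0, 5};
    std::printf("hit at t = %.2f\n", raySphere(origin, dir, center, 1.0f));
}
```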
The problem here is The Leather Man. The guy who has ruined OpenGL and should not be allowed near any industry-wide specification; the guy who has managed to piss off major players to the point that they act as if NV didn't exist.
RT will take off when AMD, who supplies 35% of the discrete GPU market and 100% of the performant console market, goes for it, and there will be little to no chance for TLM to influence it.
Still, we don't know what Nvidia, AMD, and Intel are working on for the future. Frankly, Nvidia surprised me with Turing and its capacity for RTRT, even as small as the implementation is. I saw a video a while back of a Star Wars game running fully implemented RTRT on four RTX Titans. When a single high-end GPU can match that, I guess we will have the potential for fully implemented RTRT. Obviously the vast majority of gamers will have to turn down the ray tracing settings even then, as they will be, like always, on entry-level or midrange GPUs.
I have also seen articles on possible new materials to replace silicon, such as carbon nanotubes, which scientists believe could be five times faster than silicon at the same wattage. But I'm going pretty far off topic, so I won't post any more about RTRT.
A 2070 does RT reflections in BF1 twice as fast, with one ray per two, when it's using RT hardware.
www.slideshare.net/DICEStudio/hpg-2018-game-ray-tracing-stateoftheart-and-open-problems
Note slide 60. We are at least a decade away from full RT (and I'm being optimistic).

I have never said that.
The "noisy" bit I'm mentioning is a reference to this:
I feel @efikkan was on the right track pushing that task onto developers and funding. That is exactly the core issue here, and the reason why consoles are the early beginnings, not Nvidia's RT tryout with Turing.
This needs a big-budget game that is not only graphically interesting; it also needs to show us that RT enables new or emergent gameplay. Just shiny graphics are not enough; despite the technicalities, raster has already produced such fantastic worlds that it will be nearly impossible to be blown away by graphical prowess alone. We need interactivity; that is where the dynamic aspect of RT comes to light (lol). Sightseeing beautiful scenery is not enough; we need to manipulate it and interface with it. Many recent techs are moving that way: AR, VR, and RT really is, at its core, exactly the same. A tool to create more dynamic scenes and increase interactivity and realism/immersion.
This is a post about the EVGA GeForce RTX 2060 KO. Why, then, always the AMD vs Nvidia trolling?
Nvidia does support Microsoft's ray tracing and AMD does not (yet); what exactly is the problem here?
Car manufacturers are under major pressure stemming from the CO2 emission commitments that most Western countries (including the US) have made.
Starting in 2020-2021, it will be basically impossible for anyone selling cars in Europe to keep doing so without paying hefty fines, unless they mix in hybrid/pure electric vehicles.
On top of that, countries like the Netherlands have major emission taxes that, for instance, essentially double the price of cars like the Ford Mustang.
That, and not Musk joining Tesla, is why electric cars (which frankly suck as universal vehicles) are viable: they will be forced down customers' throats, one way or another.
As for "killer app" there is that chicken and an egg problem. Nobody is going to make an AAA+ title for a non-existing market.
Most game studios focus on quantity, not quality; they want quick, rapid development cycles, and they usually have plenty of money, possibly too much. But the focus is on getting it done quickly, not done right. Software development takes time, and as any skilled programmer can tell you, if you don't do it properly and just stitch things together, it's going to be nearly impossible to "fix later". What they should do is get the engine and core game working properly before they throw all the "content people" onto the project. This requires more time, but not necessarily more money in the long term. But shareholders and management usually want a quick turnover.

You're way out of line here. That's not even remotely true.
Nvidia has been pretty much the sole contributor to OpenGL since version 2.1, while AMD has been limping behind and to this day hasn't added proper OpenGL support. As you can see for yourself in the Steam hardware survey, AMD's market share among PC gamers is ~14-15%, including APUs. While AMD sells about ~30% of discrete GPUs, about half of those are low-end OEM GPUs for home or office PCs, which is why they don't show up in gaming statistics. Their console foothold is substantial, but not 100% (don't forget that thing from Nintendo), and the PC market is getting more dominant every year. I absolutely think it will come in your lifetime (I seriously hope you don't die prematurely ;)).
Hardware-accelerated ray tracing is still in its infancy. While I'm only speculating here, based on a deep understanding of GPU technology, I think there is a good chance of a breakthrough within a 10-15 year timeframe, and not just from node shrinks and more RT cores, but from ways to process batches of rays together, similar to how tensor cores are amazingly good at one thing.
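Purely as an illustration of what "processing batches of rays together" could mean (speculation in the same spirit as the above, not any shipping algorithm): bin rays by direction octant so each batch walks the scene with similar memory-access patterns, the way wavefront/stream tracers reorder work:

```cpp
#include <array>
#include <cstdio>
#include <vector>

struct Ray { float ox, oy, oz, dx, dy, dz; };

// The three sign bits of the direction pick one of eight octants.
static int octant(const Ray& r) {
    return int(r.dx < 0) | (int(r.dy < 0) << 1) | (int(r.dz < 0) << 2);
}

// In a real system the whole batch would walk the BVH together, amortizing
// node fetches across rays; here we only report the batch size.
static void traceBatch(const std::vector<Ray>& batch) {
    std::printf("tracing a batch of %zu coherent rays\n", batch.size());
}

int main() {
    std::vector<Ray> rays = {
        {0, 0, 0,  1, 1, 1}, {0, 0, 0,  1, 2, 3}, {0, 0, 0, -1, 1, 1},
        {0, 0, 0, -2, 1, 4}, {0, 0, 0,  1, -1, 1}, {0, 0, 0,  2, 5, 1},
    };
    std::array<std::vector<Ray>, 8> bins;
    for (const Ray& r : rays) bins[octant(r)].push_back(r);  // sort into bins
    for (const auto& b : bins)
        if (!b.empty()) traceBatch(b);                       // trace per batch
}
```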
Keep in mind that many attempts have been made to push electric vehicles in the past. The only reason we chose ICE over them is that it was economically more interesting; or, put differently, we could foot the bill to the environment, and it never asked us to pay that money... Today we pay more tax to keep natural disasters at bay... who's really paying this bill now? Your story isn't any different when placed in a different age, at the birth of the ICE car. Any car manufacturer today that still tells you they will keep doing ICEs indefinitely is lying to you, and to themselves.
Many of these aspects also apply to RT and implementing it in games: it must be made economically viable. Do you know why we have barely seen perf/dollar shifts in the past few generations? To make RT more economically viable. Slow down progress so any new progress seems more valuable. As if the current GPUs could not be scaled further as they are ;)
Anyway let's not drift off topic :D
But back on topic, KO cards are now listed on the EVGA store:
www.evga.com/products/product.aspx?pn=06G-P4-2068-KR
www.evga.com/products/product.aspx?pn=06G-P4-2066-KR
Wonder what European prices will be. Probably sub-€300 a couple of months from now.
It's what the 1660 Ti sold for not too long ago.
But yes, as I have said countless times before, Turing is not the future of RTRT. Turing is squarely aimed at developers (who tend not to program for a non-existent hardware install base) and enthusiasts like me who dig RTRT and are willing to pay a premium to kick its tires.