Tuesday, January 7th 2020

EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT

At CES, we went hands-on with the EVGA GeForce RTX 2060 KO graphics card, and its price came as the biggest surprise: USD $299. This could very well be NVIDIA's first response to AMD's Radeon RX 5600 XT: a new line of RTX 2060 graphics cards under $300, with RTX support as the clincher. The EVGA card looks like it has been built firmly to a cost: a roughly 20 cm length, a simple twin-fan cooling solution, and just three display connectors, including a legacy DVI-D. It still has a full-length backplate. The KO runs at NVIDIA's reference clock speeds for the RTX 2060. EVGA is planning a premium KO Ultra SKU with factory-overclocked speeds comparable to the RTX 2060 iCX, priced at a small premium. EVGA says the RTX 2060 KO will launch next week (January 13 or later).

95 Comments on EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT

#76
cucker tarlson
rtx 2060 runs RT at

57 fps in BF5

www.purepc.pl/karty_graficzne/geforce_rtx_2000_test_kart_graficznych_w_ray_tracingu_i_dlss?page=0,4

39 fps in Exodus

www.purepc.pl/karty_graficzne/geforce_rtx_2000_test_kart_graficznych_w_ray_tracingu_i_dlss?page=0,7

47.5 fps in SOTR

www.purepc.pl/karty_graficzne/geforce_rtx_2000_test_kart_graficznych_w_ray_tracingu_i_dlss?page=0,10

85 fps in CoD

www.purepc.pl/karty_graficzne/call_of_duty_modern_warfare_2019_test_wydajnosci_raytracingu?page=0,4

All of those are at stock clocks on the highest RTX setting available, so yeah, basically the same as an RX 5700 XT, which doesn't support it at all.
Posted on Reply
#77
Vayra86
B-RealSo the RTX 2060 KO KOs their 1660 Ti? :D



How do you know it is going mainstream this year? All I see is that in the nearly 1.5 years RTX has been on the market, we got only two games (if I'm right) with RTX support at launch: Metro Exodus and Control. Announced titles like BFV and SotTR got their RTX support in a later patch. It was also promised that FFXV would get RTX support, but that was cancelled. And you have Quake II with RTX... That is the RTX lineup so far. BFV had to be updated because, as a competitive game, it couldn't hold a fixed 60 fps at FHD on a 2080 Ti with RTX set to high/ultra, so a later patch lowered graphics quality to improve performance. And that is FHD with a $1,000 GPU (which is $1,100 in reality). Metro and Tomb Raider both had minimums halved compared to the average, meaning a 2080 Ti provided 40-45 fps minimums at FHD... Same for Control.
Yeah, mainstream is really the wrong term for it; 'gets more widely used' fits better. It's completely abstract to us, and even next year, buying into the big RT push is definitely going to leave a sour taste in your mouth. This is not going mainstream just because a new console is announced; it goes mainstream when it can show its merit on console AND PC through multiple killer apps, or by simply being everywhere, and the latter won't happen anytime soon, since any half-decent dev cycle is more than one year.
Posted on Reply
#78
danbert2000
CrackongSo the "KO" means the regular 2060 and the 1660 Ti being "KOed" by this one selling at 1660 Ti prices?
The 1660 Ti was a dead card walking ever since the 1660 Super came out. This should let Nvidia simplify the stack to 1650, 1650 Super, 1660, 1660 Super, and 2060. That's still way too many cards, but I guess Nvidia wanted to make sure AMD didn't undercut them at any price point below $350.
Posted on Reply
#80
efikkan
KhonjelI'll call rtrt mainstream when all the games set it as default and I have to lower it to get higher fps.
Or do you think it'll become mainstream before even DirectX 12 does?
More advanced graphics will probably always come at a performance penalty. In a few years ray tracing will become "mainstream"; all it takes is a few games that do it well.

I have yet to see any game use DirectX 12 properly; the ones I've seen so far use an abstraction layer to emulate DirectX 11, which defeats the point of a "low overhead" API to begin with, and is the reason we see performance drops in many games.
medi01Because even 2080Ti is pathetically underpowered at it to deliver anything beyond basic reflection/shades gimmicks.
And that is not going to change any time soon.
Clearly you know how graphics works. :rolleyes:
Once there are one or two "killer games" that do ray tracing well, providing a level of immersion that non-RT can never achieve, you'll have to eat those words. In most scenes, diffuse lighting (including soft shadows) is much more important than specular lighting (sharp reflections). Even with the capabilities of the RTX 2080 Ti, a well-crafted game should be able to do a good hybrid approach, doing diffuse lighting with ray tracing and "faking" much of the specular stuff. The problem here is that most games' source code is a pile of junk stitched together in a hurry, often an "off-the-shelf" game engine with some light scripting on top. Ray tracing needs to be deeply integrated into the core game engine to be done well.
64KIt's true that there isn't very much support for RTRT from developers right now, but that will change when the next-gen consoles roll out with hardware support for accelerated RTRT.

Remember that most games are still made for consoles and ported (sometimes badly) to PC. Console gamers would have a shit-fit if developers didn't make some use of RTRT in their new console games, especially if these next-gen consoles end up more expensive than the present generation.

With AMD on board the RTRT train, there is nothing standing in the way of RTRT, even though it can only be implemented in small ways right now.
It will of course help a lot to gain traction.
But I'm not convinced that we will see many well-crafted software marvels given the current state of the game industry, though hopefully a handful of okay ones.
milewski1015RTRT doesn't provide enough IQ improvement and brings with it too much of a performance hit (at this point in time anyway) in my opinion to warrant it being a deciding factor when buying a GPU.
Ray tracing as a technology has the potential to achieve graphics that no rasterization can ever come close to. It all comes down to the application of the technology, and so far the usage in games is trash, so don't judge the technology based on poor utilization. The public perception of ray tracing will change radically once a few good titles come out.
bug… but tessellation's performance hit was so big it took 7 years between ATI's first implementation and DX adding support for it. To this day, we still cringe when we hear about HairWorks or TressFX
If I may add, regarding tessellation and its performance hit: it depends on what you compare it to. If you compare it to drawing the same model in high detail without tessellation, there is a massive performance gain. Tessellation allows for substantial savings in memory bandwidth, as you can combine a fairly low-detail mesh with a high-detail displacement map to render a high-detail result. It also simplifies vertex shading, since that first step is performed on the low-detail mesh before subdivision. Hardware tessellation also allows smooth interpolation between two levels of detail, which is practically impossible without it, as mesh structures in GPU memory are complex and changing them is a costly operation. The big problem with tessellation is that it's hard to use well, as with all advanced graphics techniques; it only works on certain types of meshes, which have to be "subdividable". To this day I have yet to see any game utilize it well. But as with other good stuff, like Vulkan or DirectX 12, the technology is sound, yet "nobody" is using it right.
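A back-of-envelope sketch of that bandwidth argument (all the mesh sizes, vertex layout, subdivision factor, and map resolution below are my own illustrative assumptions, not figures from this thread):

```python
# Rough sketch: a coarse mesh plus a displacement map standing in for a
# dense mesh. All sizes are illustrative assumptions.

DENSE_VERTS = 1_000_000          # high-detail mesh, vertex count
BYTES_PER_VERT = 32              # position + normal + UV, a typical layout

SUBDIV_FACTOR = 64               # hardware subdivides each coarse patch ~64x
COARSE_VERTS = DENSE_VERTS // SUBDIV_FACTOR

DISP_MAP_BYTES = 1024 * 1024 * 1  # 1K x 1K single-channel 8-bit displacement map

dense_bytes = DENSE_VERTS * BYTES_PER_VERT
coarse_bytes = COARSE_VERTS * BYTES_PER_VERT + DISP_MAP_BYTES

print(f"dense mesh:  {dense_bytes / 1e6:.1f} MB")
print(f"coarse+map:  {coarse_bytes / 1e6:.1f} MB")
print(f"saving:      {dense_bytes / coarse_bytes:.1f}x")
```

Under these made-up numbers the coarse mesh plus map is roughly twenty times smaller than the dense mesh, which is the bandwidth saving being described.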
Posted on Reply
#81
64K
efikkanIt will of course help a lot to gain traction.
But I'm not convinced that we will see many well-crafted software marvels given the current state of the game industry, though hopefully a handful of okay ones.
Time takes time. While RTRT is moving at a snail's pace right now, with the next-gen consoles having a hardware solution for accelerating ray tracing, AMD stepping into the ring, and possibly Intel as well, RTRT advancement should pick up the pace. But it's still going to take a while to get developers fully up to speed with the future of gaming.

As for fully implemented RTRT in games, that is no doubt years away, and at least two generations after this year's releases, but it will come eventually imo.
Posted on Reply
#82
efikkan
64KTime takes time. While RTRT is moving at a snail's pace right now, with the next-gen consoles having a hardware solution for accelerating ray tracing, AMD stepping into the ring, and possibly Intel as well, RTRT advancement should pick up the pace. But it's still going to take a while to get developers fully up to speed with the future of gaming.
I'm fairly sure that, as with practically every CPU or GPU feature, software will continue to lag behind while hardware advances rapidly. On the hardware side, I think we should expect more than just increasing RT core counts over time; with three major players in the game, there will be new approaches that accelerate throughput.
64KAs for fully implemented RTRT in games, that is no doubt years away, and at least two generations after this year's releases, but it will come eventually imo.
I'm not sure what you mean.
If you mean full-scene ray tracing, that would probably require about 10-50x the ray tracing performance of an RTX 2080 Ti in more challenging games.
If you mean games requiring ray tracing, then that may come in a couple of generations.
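As a rough sanity check of that 10-50x figure, one can turn it into a timeline; the per-generation speedup and release cadence below are purely illustrative assumptions on my part:

```python
import math

# How many GPU generations until ray-tracing throughput grows 10x or 50x,
# assuming a (made-up) uniform speedup per generation?

PER_GEN_SPEEDUP = 1.5   # assumption: each generation lifts RT throughput by 50%
YEARS_PER_GEN = 2       # assumption: a new architecture roughly every two years

estimates = {}
for target in (10, 50):
    gens = math.ceil(math.log(target, PER_GEN_SPEEDUP))
    estimates[target] = (gens, gens * YEARS_PER_GEN)
    print(f"{target}x: ~{gens} generations (~{gens * YEARS_PER_GEN} years)")
```

With those assumptions, 10x lands around six generations (~12 years) and 50x around ten (~20 years), which is at least the right order of magnitude for a "not any time soon" answer.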
Posted on Reply
#83
medi01
efikkanClearly you know how graphics works. :rolleyes:
Yes, and quite well, for this context.
efikkanOnce there are one or two "killer games" that do ray tracing well, providing a level of immersion that non-RT can never achieve, you'll have to eat those words. In most scenes, diffuse lighting (including soft shadows) is much more important than specular lighting (sharp reflections). Even with the capabilities...<something very very great>...
Where would those "killer games" come from?
Which insane manager would invest money into an AAAA (yes, four As) title that would not run on the majority of gamer PCs?

As Crytek has demonstrated with hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off.

efikkanThe problem here is that ... <game developers suck>...
There are no abstract developers with endless sources of money. It makes no sense whatsoever to spend too much time optimizing for PCs, as there are too many combinations. For consoles, on the other hand... Compare GoW on the PS4's 7870 to The Witcher on a 1080.

The problem here is The Leather Man. The guy who has ruined OpenGL and should not be allowed near any industry-wide specification; the guy who has managed to piss off major players to the point they act as if NV didn't exist.

RT will take off when AMD, which supplies 35% of the discrete GPU market and 100% of the performant console market, goes for it, and there will be little to no chance for TLM to influence it.
Posted on Reply
#84
64K
efikkanI'm fairly sure that, as with practically every CPU or GPU feature, software will continue to lag behind while hardware advances rapidly. On the hardware side, I think we should expect more than just increasing RT core counts over time; with three major players in the game, there will be new approaches that accelerate throughput.


I'm not sure what you mean.
If you mean full-scene ray tracing, that would probably require about 10-50x the ray tracing performance of an RTX 2080 Ti in more challenging games.
If you mean games requiring ray tracing, then that may come in a couple of generations.
Yes, I'm talking about fully implemented RTRT. I'm probably being too optimistic in saying it will be at least two generations after the release of the new generation this year. I have no idea what it would take to do that, but if it will take 10x to 50x, then it may not even come in my lifetime.

Still, we don't know what Nvidia, AMD, and Intel are working on for the future. Frankly, Nvidia surprised me with Turing and its capacity for RTRT, even as small as the implementation is. I saw a video a while back of a Star Wars game running fully implemented RTRT using four RTX Titans. When a single high-end GPU can match that, I guess we will have the potential for fully implemented RTRT. Obviously the vast majority of gamers will have to turn down the ray tracing settings even then, as they will be, like always, on entry-level or midrange GPUs.

I have also seen articles on possible new materials to replace silicon, like carbon nanotubes, which scientists believe could be five times faster than silicon at the same wattage. But I'm going pretty far off topic, so I won't post any more about RTRT.
Posted on Reply
#85
cucker tarlson
medi01As Crytek has demonstrated with hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off.

Funny how you find RTX games poor quality, but a synthetic benchmark doing half-decent reflections with one ray per four pixels, running at 43 fps avg./26 fps min. at 1440p on a Radeon VII, is fine.
A 2070 does RT reflections in BF1 twice as fast, with one ray per two pixels, when it's using RT hardware.
Posted on Reply
#87
bug
efikkanIf I may add, regarding tessellation and its performance hit: it depends on what you compare it to. If you compare it to drawing the same model in high detail without tessellation, there is a massive performance gain. Tessellation allows for substantial savings in memory bandwidth, as you can combine a fairly low-detail mesh with a high-detail displacement map to render a high-detail result. It also simplifies vertex shading, since that first step is performed on the low-detail mesh before subdivision. Hardware tessellation also allows smooth interpolation between two levels of detail, which is practically impossible without it, as mesh structures in GPU memory are complex and changing them is a costly operation. The big problem with tessellation is that it's hard to use well, as with all advanced graphics techniques; it only works on certain types of meshes, which have to be "subdividable". To this day I have yet to see any game utilize it well. But as with other good stuff, like Vulkan or DirectX 12, the technology is sound, yet "nobody" is using it right.
I'm not disagreeing with any of that. I'll just note that RT is mostly in the same spot: it can simplify things if used correctly, and it can do things that were problematic (I don't want to say impossible) without it. But RT also has the inevitable teething issues of a first-generation implementation.
Posted on Reply
#88
Vayra86
It's really simple... as long as there is no killer app, it won't fly. You can look at VR, you can look at electric cars (they never took off until Tesla, which is clearly a killer app by definition, i.e. autopilot and all the other features compared to ICE cars), and we can go on like that throughout history.

@efikkan I feel was on the right track pushing that task onto developers and funding. That is exactly the core issue here, and the reason why consoles are the early beginnings and not Nvidia's RT tryout with Turing.

This needs a big-budget game that is not only graphically interesting; it also needs to show us that RT enables new or emergent gameplay. Just shiny graphics are not enough; despite the technicalities, raster has already produced such fantastic worlds that it will be nearly impossible to be blown away by graphical prowess alone. We need interactivity; that is where the dynamic aspect of RT comes to light (lol). Sightseeing beautiful scenery is not enough; we need to manipulate it and interface with it. Many recent techs are moving that way: AR, VR, and RT really is at its core exactly the same, a tool to create more dynamic scenes and increase interactivity and realism/immersion.
Posted on Reply
#89
Lindatje
@cucker tarlson

This is a post about the EVGA GeForce RTX 2060 KO, so why always the AMD vs. Nvidia trolling?

Nvidia does support Microsoft's ray tracing and AMD doesn't (yet); what exactly is the problem here?
Posted on Reply
#90
medi01
Vayra86you can look at electric cars (they never took off until Tesla, which is clearly a killer app by definition, i.e. autopilot and all the other features compared to ICE cars)
This is a very bad example, although, I understand where you are coming from.
Car manufacturers are under major pressure that stems from CO2 emission commitments that most of the Western countries (including US) have made.
Starting in 2020-2021, it will be basically impossible for anyone selling cars in Europe to keep selling them without paying hefty fines, unless they mix in hybrid/pure electric vehicles.
On top of that, countries like the Netherlands have major emission taxes that, for instance, essentially double the price of cars like the Ford Mustang.

That, and not Musk joining Tesla, is why electric cars (which frankly suck as universal vehicles) are viable: they will be forced down customers' throats one way or another.


As for the "killer app", there is a chicken-and-egg problem. Nobody is going to make an AAA+ title for a non-existent market.
Posted on Reply
#91
efikkan
medi01Where would those "killer games" come from?
Which insane manager would invest money into an AAAA (yes, four As) title that would not run on the majority of gamer PCs?
Such a game wouldn't have to be ray-tracing-only. I'm just thinking of a game that amazes people enough that it becomes a "killer app".
medi01As Crytek has demonstrated with hybrid ray tracing, as in "certain things are much easier to do with an RT approach than with rasterization", one doesn't need dedicated RT hardware to pull it off.
I know, and that was kind of my point: you can get pretty good results in certain conditions with what we already have, and as you say, even without RT cores if used very cleverly. I've seen demos where ray tracing is used to light a low-resolution "voxel model" of the world, and the rasterization pass then uses this as a light map, giving sharp, nice textures and "realistic" room lighting and shadows without any huge performance requirements. You can probably get away with 1 ray per 10 pixels for many scenes. It's for the specular lighting that we need the incredible ray tracing performance, e.g. for explosions, flames, sparks, sun glints on water, etc.
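The ray-budget point above can be sketched with some quick arithmetic (the resolution and frame rate are my own illustrative assumptions):

```python
# Sketch: a sparse 1-ray-per-10-pixels diffuse pass needs an order of
# magnitude fewer rays than one ray per pixel at the same resolution
# and frame rate. Numbers are illustrative.

PIXELS_1440P = 2560 * 1440
FPS = 60

full_rays_per_sec = PIXELS_1440P * FPS            # 1 ray per pixel
sparse_rays_per_sec = PIXELS_1440P // 10 * FPS    # 1 ray per 10 pixels

print(f"1 ray/pixel:     {full_rays_per_sec / 1e6:.0f} M rays/s")
print(f"1 ray/10 pixels: {sparse_rays_per_sec / 1e6:.1f} M rays/s")
```

At 1440p/60, going from one ray per pixel to one per ten pixels drops the budget from ~221 M to ~22 M rays per second, which is why the sparse diffuse pass is affordable while full specular tracing is not.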
medi01There are no abstract developers with endless sources of money. It makes no sense whatsoever to spend too much time optimizing for PCs, as there are too many combinations. For consoles, on the other hand... Compare GoW on the PS4's 7870 to The Witcher on a 1080.
I think you missed the point.
Most game studios focus on quantity, not quality; they want quick, rapid development cycles, and they usually spend plenty of money, possibly too much. But the focus is on getting it done quickly, not done right. Software development takes time, and as any skilled programmer can tell you, if you don't do it properly and just stitch it together, it's going to be nearly impossible to "fix later". What they should do is get the engine and core game properly working before they throw all the "content people" at the project. This requires more time, but not necessarily more money in the long term. But shareholders and management usually want quick turnover.
medi01The problem here is The Leather Man. The guy who has ruined OpenGL and should not be allowed near any industry-wide specification; the guy who has managed to piss off major players to the point they act as if NV didn't exist.
You're way out of line here. That's not even remotely true.
Nvidia has been pretty much the sole contributor to OpenGL since version 2.1; AMD has been limping behind and to this date has not added proper OpenGL support.
medi01RT will take off when AMD, which supplies 35% of the discrete GPU market and 100% of the performant console market, goes for it, and there will be little to no chance for TLM to influence it.
As you can see for yourself in the Steam hardware survey, AMD's market share among PC gamers is ~14-15%, including APUs. While AMD sells ~30% of discrete GPUs, about half of those are low-end OEM GPUs for home or office PCs, which is why they don't show up in gaming statistics. Their console foothold is substantial, but not 100% (don't forget that thing from Nintendo), and the PC market is getting more dominant every year.
64KYes, I'm talking about fully implemented RTRT. <snip> I have no idea what it would take to do that, but if it will take 10x to 50x, then it may not even come in my lifetime.

Still, we don't know what Nvidia, AMD, and Intel are working on for the future. Frankly, Nvidia surprised me with Turing and its capacity for RTRT, even as small as the implementation is.<snip>
I absolutely think it will come in your lifetime (I seriously hope you don't die prematurely ;)).
Hardware-accelerated ray tracing is still in its infancy. While I'm only speculating here, based on a deep understanding of GPU technology, I think there is a good chance of a breakthrough in a 10-15 year timeframe, and not just from node shrinks and more RT cores, but from ways to process batches of rays together, similar to how tensor cores are amazingly good at one thing.
Posted on Reply
#92
Vayra86
medi01This is a very bad example, although, I understand where you are coming from.
Car manufacturers are under major pressure that stems from CO2 emission commitments that most of the Western countries (including US) have made.
Starting in 2020-2021, it will be basically impossible for anyone selling cars in Europe to keep selling them without paying hefty fines, unless they mix in hybrid/pure electric vehicles.
On top of that, countries like the Netherlands have major emission taxes that, for instance, essentially double the price of cars like the Ford Mustang.

That, and not Musk joining Tesla, is why electric cars (which frankly suck as universal vehicles) are viable: they will be forced down customers' throats one way or another.


As for the "killer app", there is a chicken-and-egg problem. Nobody is going to make an AAA+ title for a non-existent market.
Car manufacturers under major pressure? Yes, we buy new cars because they are economically viable, now or made so by tax and rule changes. But just looking at the technology here: Tesla made a car that is light-years ahead of the competition, and all the rest can do is follow suit. That also applies to autopilot in a big way; Musk has been collecting/mining fleet data since day one, so you can guess who's going to win the autonomous-driving race already. And the customer feels that as a killer app; this car does things in ways they were not done before. It integrates features in ways we've not seen yet, it simplifies a great many things, and it is a poster child for both a desire and a necessity to go greener.

Keep in mind many attempts have been made to push electric vehicles in the past. The only reason we chose ICE over them is that it was economically more interesting; or put differently, we could foot the bill to the environment, and it never asked us to pay that money... Today we pay more tax to keep natural disasters at bay... who's really paying this bill now? Your story isn't any different when placed in a different age and related to the birth of ICE cars. Any car manufacturer today that still tells you they will keep doing ICEs indefinitely is lying to you, and to themselves.

Many of these aspects also apply to RT and implementing it in games. It must be made economically viable. Do you know why we have barely seen perf/dollar shifts in the past few generations? To make RT more economically viable: slow down progress so any new progress seems more valuable. As if the current GPUs could not be scaled further as they are ;)

Anyway let's not drift off topic :D
Posted on Reply
#95
bug
Vayra86It's really simple... as long as there is no killer app, it won't fly. You can look at VR, you can look at electric cars (they never took off until Tesla, which is clearly a killer app by definition, i.e. autopilot and all the other features compared to ICE cars), and we can go on like that throughout history.
I don't think it will be a killer app. I think it will be more that people notice they can go into hiding and watch for incoming enemies in a glass reflection, and things like that, which will make people see an advantage in RTRT.
But yes, as I have said countless times before, Turing is not the future of RTRT. Turing is squarely aimed at developers (who tend not to program for a non-existent hardware install base) and enthusiasts like me who dig RTRT and are willing to pay a premium to kick its tires.
Posted on Reply