Tuesday, January 7th 2020
EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT
At CES, we went hands-on with the EVGA GeForce RTX 2060 KO graphics card, and its price came as the biggest surprise: USD $299. This could very well be NVIDIA's first response to AMD's Radeon RX 5600 XT: a new line of RTX 2060 graphics cards under $300, with RTX support being the clincher. The EVGA card looks like it's built strictly to a cost: a roughly 20 cm length, a simple twin-fan cooling solution, and just three display connectors, including a legacy DVI-D. It still gets a full-length back-plate. The KO ticks at NVIDIA-reference clock speeds for the RTX 2060. EVGA is also planning a KO Ultra SKU with factory-overclocked speeds comparable to the RTX 2060 iCX, at a small price premium. EVGA says the RTX 2060 KO will launch next week (January 13 or later).
(heads-up, not everything in there is released, but there are more titles than cards ;) )
Now watch: once next-gen consoles come out with RT hardware and RDNA 2 arrives, AMD fans will be praising it as the next innovation in immersive gameplay. They only dis RT now because their hardware can't do it at all. It's the reverse of DX12: when only AMD cards could do it, it was the next messiah of the gaming industry, and Nvidia would surely be so far behind that AMD would dominate. Give it a year, and the anti-ray-tracing arguments will mysteriously disappear.
Funny thing, quite a lot of AMD fans here actually sport the "evil Nvidia GPU".
When Nvidia prices high: evil, money-grabbing Nvidia!
When Nvidia prices low: evil Nvidia stealing RTG's thunder, RTX useless, etc.
Some folks always find negativity in anything Nvidia-related.
You know, I do have to wonder what sort of brain-dead world you live in where you really believe this sort of thing would actually happen: that people would radically change their opinion in a split second simply because of the color of their favorite company, or that it would matter in any conceivable way. As if the success of some supposedly amazing technology would be somehow forever ruined by a bunch of fanboys.
It's actually pretty sad that you're bothered enough by what fans of some company say to tell me this. I'm sorry, really, this is strangely depressing in a way.
Remember that most games are still made for the consoles and ported (sometimes badly) to PC. Console gamers would have a shit-fit if developers didn't make some use of RTRT in their new console games, especially if these next-gen consoles end up more expensive than the present generation.
With AMD on board the RTRT train, there is nothing standing in the way of RTRT, even though it can only be implemented in small ways right now.
Yes, RTX cards are too slow to run more complex light effects. But what they do now makes such a huge difference already.
The goal today is not to make games look like movies looked 10 years ago. It's not even a goal for the next decade or two.
It's just to make games look better - just like with every major change in the GPU feature set.
It always means higher hardware requirements.
It's 2020.
You can buy a $200 card for a decent 1080p 60fps gaming. It's enough for most.
Or you can pay a few times more and get a bonus. Before RTX, that bonus was mostly about resolution.
Today you have another choice: more realistic picture.
It's a choice. It's not mandatory. Why are you so much against it?
Also, think about 4K.
We take it for granted today. It's a reference point for high-end GPUs.
But it hasn't always been like that. When 4K came out, many people were against it: it decimates fps, it doesn't change a lot - the same arguments you're using now.
And to some extent they were right. In many games moving from 1080p to 4K doesn't impact gaming experience very much.
Many people still game at 1080p. We have high-end 1080p gaming monitors and so on.
And let's look at the total cost of 4K vs RTX - something people here seldom think about (maybe because most of you have very expensive PCs anyway).
For 4K you need a more expensive graphics card, but also a more potent CPU, a 4K monitor, maybe more RAM. You end up with an expensive PC.
For RTX you only need an RTX GPU. The rest can remain pretty basic and cheap.
It needs to mature. I believe next-gen consoles will be the answer, because they really have no choice: both M$ and Sony have been touting RT support.
M$'s latest info claims 4K/120 with RT enabled. They also spoke about 8K support; 8K is useless now and for the foreseeable future.
Sony's PlayStation 5 claims 4K/60 with RT enabled.
I suppose we will eventually find out in 2020, as RT development is going quite well. 1440p seems to be the best resolution for PC gaming, because you get a huge bump in image quality over 1080p with very little hit to performance.
Going from 1080p to 4K is not only expensive, it also hurts performance a lot, even accounting for the superior image quality.
1440p wins on performance, image quality and price. Nobody is against Nvidia's ray tracing implementation; what people are upset with is the premium price Nvidia charged for a product that does not work as advertised and doesn't have enough in-game support.
In many scenarios, enabling NV ray tracing didn't improve picture quality all that much. Not to mention the massive performance hit.
You want to launch a beta product? By all means go right ahead, but don't charge a premium for it lol..
Fair enough, then why are games still getting half the FPS when RT is enabled?
In other words, dedicated RT cores do nothing special but massively decrease performance for very little gain in PQ.
That's why you can't dis people who will praise AMD for offering RT at a lower price. I guess they did it on purpose, so that people who want to fully benefit from RT have to buy their most expensive or next-gen cards.
The claim that RT causes additional cost is just misleading. Yes, RT cores add cost, but the other, older parts are getting cheaper, especially when you add RT cores to a barely improved three-year-old GPU; maybe it's even cheaper overall!
The image just gets sharper, which can lead to an unrealistically crisp image.
RTRT does exactly the opposite: it leads to a softer, often darker image. So it's just not for every game and not for every gamer. Yeah, people are just against Nvidia. :) And once again: RTRT makes the image more realistic. Some people will see that as a decrease in picture quality. Not everything is sharp, bright and clear. That's the point.
This also means that RTRT is more worthwhile in games built around mood (Metro, SOTR) than in something bright and flashy like Final Fantasy. Most critics are not targeting RTX value, but RTRT in general.
Almost all of them on this forum are known for supporting AMD. And most will start to praise RTRT when AMD starts supporting it. That's the point.
Yes, RTX adds cost and makes cards more expensive. Yes, the result isn't for everyone. These are objective arguments.
Frankly, after 20 years of 3D gaming, I still think isometric perspective is more enjoyable than what dominates today. And imagine the enormous savings on hardware.
In a fully path-traced game, the 2060 is 2.5x faster than the 1080 Ti:
www.purepc.pl/karty_graficzne/test_wydajnosci_quake_ii_path_tracing_na_api_vulkan_wstrzasa?page=0,6
Did you know SSR in RDR2 takes a 20% performance hit on Ultra, and the reflections are still screen-space?
Or rasterized soft shadows? The performance hit is the same, the quality is worse.
Here's rasterized soft shadows vs. Ultra.
Notice the performance hit: 36%.
AMD, likely because of pressure from Sony and Microsoft, is really focusing on technologies that can get higher image quality with fewer transistors.
On topic, KO is just EVGA branding.
Going from medium to high/ultra doesn't change much in many AAA titles, while fps can drop by 30% or even more.
Imagine a situation where "medium" is the best setting we're used to, and suddenly Nvidia adds a "magic feature" that provides higher modes (with the performance cost we observe today).
Bloodbath on forums.
Are you one of those people who think upscaling photos makes them look better? :)
And once again: you're talking about quantitative stuff (like resolution and fps), not about actual quality.
Think about orcs in Hobbit movies.
Now imagine they look like in Shadow of War (4K, highest settings!).
So: would you rather go after the actual movie image, or go 16K? :)
IMO, when RT gets adopted and improved over a few years, we'll look at rasterized games with disgust.