
EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT

Nvidia has not released anything new; the RTX 2060 is a card that has been around for a year. It just got a little forgotten because of the 2060 Super.

You are absolutely right, so I guess they have just updated the pricing.
 
Image sharpening improves image quality?
Are you one of those people who think upscaling photos makes them look better? :)
RIS makes rendered edges sharper, which counters the graininess from upscaling. Look up reviews from people who tested the feature: it works great and will likely be standard in Scarlett/PS5.


This is off topic so I'm dropping it.
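As a rough illustration of the RIS point above: RIS itself is AMD's contrast-adaptive sharpening, but the general post-process sharpening idea can be sketched with a classic unsharp mask. This is my own toy Python/NumPy example, not AMD's actual algorithm: boosting local contrast around edges makes an upscaled, slightly soft image read as more detailed.

```python
# Toy sketch only (not AMD's CAS/RIS): classic unsharp masking.
import numpy as np

def box_blur_3x3(img):
    """Cheap 3x3 box blur on a 2D float image (edges padded by replication)."""
    p = np.pad(img, 1, mode="edge")
    return sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

def unsharp_mask(img, amount=0.6):
    """sharpened = original + amount * (original - blurred)."""
    return np.clip(img + amount * (img - box_blur_3x3(img)), 0.0, 1.0)

# Toy example: a soft horizontal gradient gets visibly steeper edges after sharpening.
soft_edge = np.tile(np.linspace(0.2, 0.8, 8), (8, 1))
print(unsharp_mask(soft_edge)[4])
```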
 
I can vouch for that, DLSS with sharpening produces a much better image overall.
 
As opposed to other techniques that speed it up?
Tessellation incurs a performance hit. Shading incurs a performance hit. Lighting incurs a performance hit. If it weren't for all these pesky techniques, we'd be enjoying Wolfenstein at 1,000,000 fps by now.

Edit: More on topic, I think Nvidia has squeezed all there was from Turing by now. Going forward it's Ampere or bust (i.e. whoever didn't buy into Turing by now, most likely never will).
Tessellation was actually introduced to improve performance by reducing the number of draw calls for a given polygon count;
it also makes it easier to change the subdivision level of a mesh in real time.
As for shading, there are many techniques that are meant to improve performance, such as giving the illusion of surface texture or unevenness without exploding the polygon count with real geometry.
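To make the "illusion of surface detail without extra geometry" point concrete, here is a toy normal-mapping sketch. It's my own illustration, not from the thread, and it assumes a flat quad facing +Z so tangent space and world space coincide and no TBN matrix is needed.

```python
# Toy sketch: normal mapping fakes bumpiness by perturbing the shading
# normal per pixel from a texture, instead of adding real geometry.
import numpy as np

def lambert_with_normal_map(normal_map, light_dir):
    """Diffuse (Lambert) term using per-pixel normals decoded from a normal map."""
    n = normal_map * 2.0 - 1.0                       # [0,1] texels -> [-1,1] normals
    n = n / np.linalg.norm(n, axis=-1, keepdims=True)
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    return np.clip(n @ l, 0.0, 1.0)                  # N . L, clamped

# 1x2 "normal map": a flat texel next to one tilted toward +X.
nm = np.array([[[0.5, 0.5, 1.0],
                [0.85, 0.5, 0.6]]])
print(lambert_with_normal_map(nm, light_dir=[1.0, 0.0, 1.0]))
# The tilted texel catches more of the light coming from +X/+Z, so the flat
# surface shades as if it had a slope there, with zero extra polygons.
```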

The IQ improvement is there, but not every scene will showcase it.
IMO, when RT gets adopted and improved over a few years, we'll look at rasterized games with disgust.

TBH, right now what disgusts me the most is that so many "influenza" (yes, they might as well be a virus) like Digital Foundry keep praising Metro as the best-looking game ever just because it has RT in it,
when the texture quality in many places is just facepalm-worthy, and it is literally right in the foreground compared to the RT global illumination and shadows further away.
RT alone does not make a good-looking game; it makes lighting more accurate.
The notion has been that because something is more accurate it must be better, when games themselves are artistic representations.
 

The worst thing is that with stuff such as GI you simply have no sense of how exactly something is supposed to look, because we've had, what, 30 years of progress in traditional lighting techniques that have gotten us extremely close to the real thing.

Not only do you have to live with the massive performance hit, but you'll also have to convince yourself you're getting something that's actually better.
 
Performance hit aside.
When I look at a game, I notice the texture quality, the details, etc. that the developers put into the environment.
Many great-looking games were already made without RT, such as the RE2 remake or MHW.
I can tell the developers put the effort in to craft the environment to make it look just as they want it.

What I want to say is:
How often do I care that X lighting is off by 10 degrees to the right?
Or how long do I stare at a character in the mirror that I won't see except in some cut scenes in First Person games?
I am not going to do all the math in my head to see if the lighting is exactly X degrees and reflected Y times.
There is far more important work done in games to make them look amazing; RT itself is just a means to an end.
 
I think it's not a good idea to judge the entire value of RTRT right now. It's fairly new stuff for developers, and they need time to learn how to implement it effectively and how not to use it to the extent that it destroys performance. Probably a lot of developers have no experience with it at all. There will be a time when they do, and the RTRT experience will be better than it is today.

The other big complaint about RTX cards is the expense. Well, when has new hardware not been expensive? The early adopters of GPUs with Tensor and RT cores will pay the price for the rest of us, and prices should come down over time just like they did with SSDs and 4K monitors. R&D costs have to be recouped from some customers.

Having said that, I think Nvidia has also been overcharging due to the lack of competition. Hopefully that will change this year.
 
And most will start to praise RTRT when AMD starts supporting it. That's the point.

I couldn't care less about it, regardless of which company is supporting it.

In general: people criticize RTRT for not providing enough IQ improvement, but at the same time many assume games should only be played at highest settings.
Going from medium to high/ultra isn't changing much in many AAA titles, while fps can drop by 30% or even more.

Imagine a situation where "medium" is the best setting we're used to, and suddenly Nvidia adds a "magic feature" that provides higher modes (with the performance cost we observe today).
Bloodbath on the forums.

RTRT doesn't provide enough IQ improvement and brings with it too much of a performance hit (at this point in time, anyway), in my opinion, to warrant being a deciding factor when buying a GPU. Similar things could be said about other graphics settings too - I'll happily sacrifice things like uber-realistic shadows and volumetric lighting for better performance. As you mention, everybody feels like they have to play on ultra, when the majority of the time you can have a game that looks essentially the same as ultra while playing on a combo of medium/high and performing significantly better. I guess it just depends on each person's preferences.
 
EVGA is Nvidia's most prominent AIB, so they have some leeway to make their own products outside Nvidia's product stack, like this 2070 Super with 15.5 Gbps GDDR6.
I guess this version of the 2060 exists within EVGA only.
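For context on what 15.5 Gbps buys, a quick back-of-the-envelope calculation (my own arithmetic, assuming the 2070 Super's standard 256-bit memory bus):

```python
# Memory bandwidth in GB/s = data rate per pin (Gbps) * bus width (bits) / 8.
def gddr6_bandwidth_gbs(data_rate_gbps, bus_width_bits=256):
    return data_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(14.0))   # ~448 GB/s, reference 2070 Super
print(gddr6_bandwidth_gbs(15.5))   # ~496 GB/s, the 15.5 Gbps EVGA variant
```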

There was a 2070 Super with the faster GDDR6? Where the hell was this 6 months ago when I was deciding between the 5700 XT and this Super?

And what the hell is this name: EVGA GeForce RTX 2070 SUPER FTW3 ULTRA+

Wait, nevermind. Screw this pricing of $609 (the MSRP of the 2070 Super is $499), I would rather get the 2080 Super instead.
 
Is this the weekly "RTRT is a worthless technology by evil NVIDIA to artificially increase graphics card prices, waaaah" thread?

Allow me to repeat for the millionth time, nobody is forcing you to buy NVIDIA's products so why the f**k do you crybabies care about the cost? Buy an AMD GPU and in the process you'll solve 3 problems: you won't be giving money to "NGREEDIA", you won't be paying more for features that are supposedly useless (since AMD cards don't have them), AND you'll no longer have a reason to whine endlessly on forums. Everyone wins - especially the people who are tired of literally every GPU thread getting drowned by a fecal matter torrent of AMD fanboys telling us that RTRT is useless for the thousandth time.
 
So, anyone who criticizes nshitia's RTRT is an AMD fanboy... but you, the wise and objective judge of character, support a feature that even the top-of-the-line 2080 Ti can barely use. But the critics are the "fanboys"... sad.

(On topic) The segmentation of the GPU market is becoming more ridiculous by the day. It will be very funny when we reach the point of each GPU offering a 1 fps difference from the previous or the next one in line.
 
I agree. These image enhancements do take a performance hit, just not as deep as Ray Tracing. RT ain't polished yet.
We must remember things differently then. Pixel shading didn't incur as big a performance hit, because it was adopted gradually over 10 years or so, but tessellation's performance hit was so big it took 7 years between ATI's first implementation and DX adding support for it. To this day, we still cringe when we hear about HairWorks or TressFX.
It needs to mature. I believe next gen consoles will be the answer, because they really have no choice: both M$ and Sony have been touting RT support.
No arguing there, but saying "it needs to mature" about a tech at its first generation is a truism.
M$'s latest info is 4K/120 w/ RT enabled. They also spoke about 8K support; 8K is useless now and for the foreseeable future.
Sony's PlayStation 5 claims 4k/60 w/ RT enabled.
Despite what marketing would have you believe, consoles won't do anything remotely resembling 4k. They'll do what they always do: upscale.

Image Sharpening and Boost. Boost dynamically lowers the render resolution to get more FPS. Image Sharpening can make a lower-resolution render look like it is higher at little frame-time cost.
I was talking about stuff that improves image quality. It would be shocking to see something that downgrades quality to incur any kind of performance hit.
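For the "Boost dynamically lowers render resolution" point quoted above, the general idea looks something like this toy controller. It's my own sketch, not NVIDIA's actual implementation: drop the render scale when frame time goes over budget, creep back up when there is headroom, and let sharpening hide some of the resulting softness.

```python
# Toy dynamic-resolution controller (illustration only).
def adjust_render_scale(scale, frame_time_ms, target_ms=16.7,
                        step=0.05, lo=0.5, hi=1.0):
    if frame_time_ms > target_ms * 1.05:      # over budget -> render fewer pixels
        scale -= step
    elif frame_time_ms < target_ms * 0.90:    # comfortable headroom -> raise quality
        scale += step
    return min(hi, max(lo, scale))

scale = 1.0
for ft in [18.5, 19.0, 17.5, 16.0, 14.2, 13.9]:   # simulated frame times (ms)
    scale = adjust_render_scale(scale, ft)
    print(f"frame {ft:.1f} ms -> render scale {scale:.2f}")
```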
 
RIS makes rendered edges sharper, which counters the graininess from upscaling. Look up reviews from people who tested the feature: it works great and will likely be standard in Scarlett/PS5.
OMG. So now we'll improve quality by up-scaling and sharpening edges. That is just sad. :o
Have you ever (I mean: ever) read anything about digital photography editing? Even an article in Playboy?

You see, that's why RTRT has such a hard time being understood: it's just impossible to convince some people that more pixels, more sharpness and more saturation don't improve image quality.
It's not like I'm that surprised since many people tend to prefer photos from smartphones over those from high-end cameras for the same reason.

So as I said: RTRT is just not for everyone. But it's also not compulsory, so no harm done, right? :)
 
There was a 2070 Super with the faster GDDR6? Where the hell was this 6 months ago when I was deciding between the 5700 XT and this Super?

And what the hell is this name: EVGA GeForce RTX 2070 SUPER FTW3 ULTRA+

Wait, nevermind. Screw this pricing of $609 (the MSRP of the 2070 Super is $499), I would rather get the 2080 Super instead.


The SC2 Ultra+ model is currently being sold for 560 USD (after 20 USD off), so it's only 40 USD more expensive than the normal SC2. This model could very well compete with the 2080 on equal footing.
 
OMG. So now we'll improve quality by up-scaling and sharpening edges. That is just sad. :eek:
Have you ever (I mean: ever) read anything about digital photography editing? Even an article in Playboy?

You see, that's why RTRT has such a hard time being understood: it's just impossible to convince some people that more pixels, more sharpness and more saturation don't improve image quality.
It's not like I'm that surprised since many people tend to prefer photos from smartphones over those from high-end cameras for the same reason.

So as I said: RTRT is just not for everyone. But it's also not compulsory, so no harm done, right? :)
There is a bit of a misconception about RIS: it doesn't actually reduce the native resolution, unlike DLSS.
Some people using RIS to offset the blurriness of rendering the game at a lower resolution is another story; RIS does not change the native resolution on its own.
Right now there is no way to stop DLSS from rendering the game at a lower resolution and then upscaling it.
 
wtf. KO?
Ti, Super, KO, 11, 10, 20, 21.... :confused::confused::confused:
I expect 2960 WO edition, then I am gonna buy it. :laugh:
 

The SC2 Ultra+ model is currently being sold for 560 USD (after 20 USD off), so it's only 40 USD more expensive than the normal SC2. This model could very well compete with the 2080 on equal footing.

I'm avoiding buying stuff from NewEgg due to their warranty and returns policies.

EDIT: I found an Amazon listing. God damn it @nguyen this is tempting.
 
We must remember things differently then. Pixel shading didn't incur as big a performance hit, because it was adopted gradually over 10 years or so, but tessellation's performance hit was so big it took 7 years between ATI's first implementation and DX adding support for it. To this day, we still cringe when we hear about HairWorks or TressFX.

No arguing there, but saying "it needs to mature" about a tech at its first generation is a truism.

Despite what marketing would have you believe, consoles won't do anything remotely resembling 4k. They'll do what they always do: upscale.


I was talking about stuff that improves image quality. It would be shocking to see something that downgrades quality to incur any kind of performance hit.
Next Generation Consoles WILL do true 4K, that I'm 110% sure of.
4k TVs are a dime a dozen nowadays.
 
Next Generation Consoles WILL do true 4K, that I'm 110% sure of.
Because if a 2080Ti barely handles 4k and HDR at the same time, consoles will totally have no trouble breezing through that :kookoo:
4k TVs are a dime a dozen nowadays.
Totally unrelated, but ok. And you're probably thinking TVs without proper HDR support, those aren't that cheap.
 
Because if a 2080Ti barely handles 4k and HDR at the same time, consoles will totally have no trouble breezing through that :kookoo:
They will:
upscaled, at medium-high PC settings.

The new PS5 GPU is a 9 TFLOP RDNA one, so basically a 5700 XT with RT support.
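A quick sanity check on that comparison (my own arithmetic using the usual peak-FP32 formula, not an official spec):

```python
# Peak FP32 TFLOPS = 2 ops per clock (FMA) * shader count * clock (GHz) / 1000.
def fp32_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000.0

print(fp32_tflops(2560, 1.905))  # ~9.75 TFLOPS for a 5700 XT at its boost clock,
                                 # in the same ballpark as the ~9 TFLOP figure quoted for the PS5
```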
 
That's what I said, but then SuperXP felt the need to post he's 110% sure next gen consoles will do "real 4k".
 
He's 100% sure AMD will do 4K/60 RT with no RT hardware, 'cause RT cores do nothing.
 