
NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

This isn't difficult. Let's even remove an exact refresh rate to make it easier. When two cards are vsynced to the same refresh rate, you can't tell them apart.
 
This isn't difficult. Let's even remove an exact refresh rate to make it easier. When two cards are vsynced to the same refresh rate, you can't tell them apart.
Was that the point of the blind test? You seem to have no idea what I'm referring to.
 
The point of the test was that you couldn't tell which card was in which machine. That is what a blind test is used for. I still don't see why this is so difficult.
 
Wrong, you missed the point: the prices must be adjusted for inflation to give a fair comparison.

That does not mean that a price increase is pure inflation; there are many factors here, including varying production costs, competition, etc. But these can only be weighed after we have the prices corrected for inflation, otherwise any comparison is pointless.
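As an aside, here is a minimal sketch of the inflation correction being argued for above, in Python; the launch price and CPI values are placeholders, not real data.

Code:
# Minimal sketch: converting a historical launch price into today's dollars
# so that two generations can be compared in the same terms.
def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a past price by the ratio of consumer price indexes."""
    return price * (cpi_now / cpi_then)

# Hypothetical example: a $499 launch when the CPI stood at 218,
# measured against a CPI of 252 today.
adjusted = adjust_for_inflation(499.0, cpi_then=218.0, cpi_now=252.0)
print(f"$499 then is roughly ${adjusted:.0f} in today's dollars")  # ~$577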

You missed the entire point of the comparison. It was a chart simply showing the launch price of each flagship. It was not a comparison across multiple generations (i.e., it was not meant to compare the inflation-adjusted value of products several generations apart).

The entire point of the chart was to show that prices did not increase with each new generation. No adjusted value required.
 
Not for sure, we don't. Waiting to see the benchmarks and reviews.
Sure, here are some wild predictions for when it happens: the 4GB 128-bit variant will be crap, and so will Navi.
 
The point of the test was that you couldn't tell which card was in which machine. That is what a blind test is used for. I still don't see why this is so difficult.
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the fps when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse, if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 fps? You'd both be the first people to heckle it, and rightly so.

Sure, here are some wild predictions for when it happens: the 4GB 128-bit variant will be crap, and so will Navi.
:laugh:
The 4GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6GB leak is true. And how is Navi gonna be crap? Even if they just took Polaris to 7nm with GDDR6, that would make a good card.
 
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the fps when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse, if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 fps? You'd both be the first people to heckle it, and rightly so.


:laugh:
The 4GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6GB leak is true. And how is Navi gonna be crap? Even if they just took Polaris to 7nm with GDDR6, that would make a good card.
The point of it was that 100 FPS of smooth gaming is possible. How much more is needed? I don't worry that there are cards that can get 10-20 FPS more than I can, because I get my "100 FPS" of smooth FreeSync gaming. Diminishing returns for e-peen after that.
 
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really?

I mean, I really don't know what else to say because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing. If anyone is concerned about extra power draw but then chases fps, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
 
I mean, I really don't know what else to say because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing. If anyone is concerned about extra power draw but then chases fps, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!

I think we can all agree on that, if you meant "once your min fps* hits your monitor's refresh rate". But even that only holds for one title at a time.

*or at least the 99th or 90th percentile, because the absolute min fps is usually just a freak occurrence
 
I think we can all agree on that, if you meant "once your min fps* hits your monitor's refresh rate". But even that only holds for one title at a time.

*or at least the 99th or 90th percentile, because the absolute min fps is usually just a freak occurrence

Considering we couldn't get the basics down, I didn't want to go advanced yet. But yes, the extra processing power is much better for your minimums.
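To make the footnote about percentiles concrete, here is a minimal sketch in Python; the frame times are invented for illustration (mostly 8-10 ms with a few freak hitches).

Code:
import random

random.seed(0)
frame_times_ms = [random.uniform(8.0, 10.0) for _ in range(997)] + [45.0, 50.0, 60.0]

def percentile(values, pct):
    """Nearest-rank percentile, no external libraries needed."""
    ordered = sorted(values)
    k = min(len(ordered) - 1, round(pct / 100 * (len(ordered) - 1)))
    return ordered[k]

worst = max(frame_times_ms)           # the single 60 ms hitch
p99 = percentile(frame_times_ms, 99)  # 99th-percentile frame time ("1% low")

print(f"absolute minimum: {1000 / worst:.0f} FPS")  # ~17 FPS, a freak occurrence
print(f"1% low:           {1000 / p99:.0f} FPS")    # ~100 FPS, the representative floor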
 
I mean, I really don't know what else to say because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing. If anyone is concerned about extra power draw but then chases fps, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and are not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a v-sync cap? Who even heard of something that stupid?

What is the exact point at which we should stop "fps chasing"? 90 fps? 100 fps? 85 fps? Can you please tell us? 'Cause I was under the impression that it's subjective in every case. I absolutely can feel 100 vs 130 fps instantly. The blind test was pointless, just pulling the wool over the public's eyes.
 
ermahgerd.....

You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and are not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a v-sync cap? Who even heard of something that stupid?

This really is black and white. They weren't comparing GPUs, they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor, any GPU will likely work? Yes.

What is the exact point at which we should stop "fps chasing"? 90 fps? 100 fps? 85 fps? Can you please tell us? 'Cause I was under the impression that it's subjective in every case. I absolutely can feel 100 vs 130 fps instantly. The blind test was pointless, just pulling the wool over the public's eyes.

This is perhaps the most black-and-white and the easiest. The correct answer is, drum roll please: your monitor's refresh rate!
 
ermahgerd.....



This really is black and white. They weren't comparing GPUs, they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor, any GPU will likely work? Yes.



This is perhaps the most black-and-white and the easiest. The correct answer is, drum roll please: your monitor's refresh rate!
Why didn't they go with a 144/165 Hz monitor for the tests then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the fps difference immediately. Imagine you've played hundreds of hours of BF1 at 100 fps and the same amount at 130 fps. You'll see the difference right away.
 
your monitor's refresh rate!
Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort and energy.

EDIT: Realistically, 120 Hz is roughly the limit of what the human eye can distinguish in real time. While we can still see a difference in "smoothness" above 120 Hz, it's only a slight perceptual difference and can be very subjective from person to person.
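As a rough illustration of why headroom above the refresh rate goes unused, here is a minimal sketch of an FPS cap in Python; the 100 Hz figure mirrors the monitors used in the blind test, and render_frame() is a made-up stand-in, not a real engine call.

Code:
import time

REFRESH_HZ = 100
FRAME_BUDGET = 1.0 / REFRESH_HZ  # 10 ms per frame at 100 Hz

def render_frame():
    """Stand-in for real rendering work; pretend it takes 6 ms."""
    time.sleep(0.006)

def run_capped(frames=100):
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # A faster GPU only makes this sleep longer; what reaches the
        # screen is identical, frame for frame.
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

run_capped()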
 
Why didn't they go with a 144/165 Hz monitor for the tests then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the fps difference immediately. Imagine you've played hundreds of hours of BF1 at 100 fps and the same amount at 130 fps. You'll see the difference right away.

We both know the answer to that question: the Vega likely can't do it. "Wow, that is disingenuous," you are probably telling yourself. The reason they did so is that it is very likely many more people play at 100 Hz and below. So that is who they targeted - where it would benefit them the most.

And I don't even want to imagine playing any BF title for hundreds of hours. It only brings up images of my own suicide.

Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort and energy.

I don't play for sides. There is really nothing Cucker has said that is wrong, just that it doesn't apply to what started this debate about the blind test.
 
Everyone is complaining about the price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?

Seems to be at the end of the card.

You don't seem to get it, do you? Nvidia has been steadily increasing the cost of its graphics cards for the last 4-5 generations, and swapping products around. With the 6xx series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan. So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 104 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104 - and the high-end model was the GF100. Currently, the GP106 card is the GTX 1060 - but what bore the 106 moniker a few generations ago? The GTX 450! Yeah. The GF106 is the GTX 450, which means the GP106 SHOULD have been the GTX 1050, NOT the 1060. Nvidia keeps swapping names, making even more money off the backs of people like you, who are willing to spend the ludicrous amounts of money this bottomless pit of a company is asking for.

The 680 should have been the GK100, which Nvidia later renamed to GK110 because of the 700 series, or Kepler refresh - and sold as the original Titan - a $1000 card! No other high-end video card sold for that much before - the GTX 480 had an MSRP of $499, and the 580 was $450 - so Nvidia doubled their profits with Kepler by simply moving the high-end and mid-range cards around their lineup and "inventing" two new models - the Titan, and later the GK110B, the 780 Ti - but not before milking consumers with their initial run of defective GK110 chips, the GTX 780, or GK110-300.

Now Nvidia is asking $400 for a mainstream card, almost as much as the 200, 400 and 500 series high-end models cost. But wait - the current flagship, the Titan RTX, is now 2500 bloody dollars, and the 2080 Ti was $1299 at launch, with prices reaching $1500 in some places. In fact, it's still $1500 at most online retailers in my country. F#(k that! $1500 can buy you a lot of nice stuff - a decent car, a good bike, a boat, loads of clothes, a nice vacation - I'm not forking it over to Nvidia for a product which has cost around the $400 mark for the better part of 18 freakin' years.

Don't you realize we're being taken for fools? Companies are treating us like idiots, and we're happy to oblige by forking out more cash for shittier products...
 
So... you'd rather have a "high-end" card from a couple of generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos, and less power use given the same cost and performance.
 
So... you'd rather have a "high-end" card from a couple of generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos, and less power use given the same cost and performance.
Amen to that. Plus warranty.
 
With the 6xx series, the GTX 680 was NOT actually their high-end product - no - that was the original titan.
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, which had two GK104 chips.

So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 104 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104 - and the high-end model was the GF100. <snip>

The 680 should have been the GK100, which Nvidia later renamed to GK110 because of the 700 series, or Kepler refresh - and sold as the original Titan - a $1000 card!
I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of the "GTX 670 Ti" into the "GTX 680". (I remember my GTX 680 box had stickers over all the product names.) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
 
$400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.
$400 for a mid-range card (see what a 2080 Ti can do) is not good.

Nvidia doubled their prices because AMD is waiting until next year to propose something; they are focusing on CPUs and don't have Nvidia's or Intel's firepower. They can't do both GPUs and CPUs.
So you are bending over, waiting "a month or two" for the theft.

Here is your invalidation.
You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where the problems come in. It's NOT double; at most I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's how customer satisfaction gets skewed: by misleading the consumer market and saying it's absolutely not worth it, when in reality it's not like what you said.
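For what it's worth, a minimal sketch of that percentage arithmetic, with placeholder prices rather than actual MSRPs:

Code:
previous_flagship = 700.0  # hypothetical prior-gen flagship price
current_flagship = 1000.0  # hypothetical current flagship price

increase_pct = (current_flagship - previous_flagship) / previous_flagship * 100
print(f"price increase: {increase_pct:.0f}%")             # ~43% in this example
print(f"doubled would be: ${previous_flagship * 2:.0f}")  # $1400, well above $1000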

No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, which had two GK104 chips.


I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of the "GTX 670 Ti" into the "GTX 680". (I remember my GTX 680 box had stickers over all the product names.) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
The GTX 670 Ti doesn't exist, so it's not a rebrand. Also, if I recall correctly, the 680 has the fully enabled GK104 chip (with the 690 being the same but with two GK104 chips). Kepler 1st gen capped at 1536 cores; Kepler 2nd gen capped at 2880 cores. That's just the architecture, not an intent to rip people off.
 
It's a 750 mm² die with 11 GB of GDDR6 and hardware RT acceleration. You can have one at $1000 or you can have none; they'll not sell it to you at $600 if Vega 64 is $400 and the 2080 Ti is 1.85x Vega's performance.

[chart: relative GPU performance at 3840x2160]



2080 Ti with RTX on = Vega 64 rasterization

[charts: RTX 2080 Ti benchmark results at 2160p]
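Spelling out the pricing arithmetic above as a minimal sketch; the figures ($400, $1000, 1.85x) are the ones quoted in the post, not benchmark data.

Code:
vega64_price, vega64_perf = 400.0, 1.00
rtx2080ti_price, rtx2080ti_perf = 1000.0, 1.85

print(f"Vega 64: {vega64_perf / vega64_price * 100:.3f} perf units per $100")
print(f"2080 Ti: {rtx2080ti_perf / rtx2080ti_price * 100:.3f} perf units per $100")

# Scaling Vega 64's price linearly with performance puts 1.85x at about $740,
# which is why (per the argument above) a $600 price was never on the table.
print(f"linear-scaled price for 1.85x Vega: ${vega64_price * rtx2080ti_perf:.0f}")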
 
They'll sell it for what people will pay. Which is apparently what it is listed at.
 