# NVIDIA RTX SUPER Lineup Detailed, Pricing Outed



## Raevenlord (Jun 28, 2019)

NVIDIA has officially confirmed pricing and SKU availability for its refreshed Turing lineup featuring the SUPER graphics cards we've been talking about for ages now. Primed as a way to steal AMD's Navi release thunder, the new SUPER lineup means previously released NVIDIA graphics cards hit EOL status as soon as their souped-up SUPER versions become available, come July 2nd.

The RTX 2060 and RTX 2080 Ti will live on, for now, as the cheapest and most powerful entries into the world of hardware-based raytracing acceleration, respectively. The RTX 2070 and RTX 2080, however, will be superseded by the corresponding 2070 SUPER and 2080 SUPER offerings, with an additional RTX 2060 SUPER being offered so as to compete with AMD's RX 5700 ($399 for NVIDIA's new RTX 2060 SUPER vs. $379 for the AMD RX 5700, with the RTX 2060 at $349 sandwiching it on the low end).






The RTX 2070 SUPER will be positioned at a higher price point than AMD's upcoming RX 5700 XT ($499 vs. $449), which should put it mildly ahead in performance; just today we've seen benchmarks that showed AMD's RX 5700 XT trading blows with the non-SUPER RTX 2070. The NVIDIA RTX 2080 SUPER will get improved performance as well as a price drop, down to $699 from the original's $799 (itself exorbitant compared to the GTX 1080's $549).

*View at TechPowerUp Main Site*


----------



## Vayra86 (Jun 28, 2019)

Hahahahahaha yeah

It's even worse than I had imagined. MSRP is basically 1:1 with Turing relative to performance. Another full year of zero progress it is. Nice.

Super.
Hard.
Pass.



moproblems99 said:


> With aibs much higher.
> 
> https://wccftech.com/nvidia-evga-geforce-rtx-2070-super-and-rtx-2060-super-graphics-card-leak/amp/



Pinch me please, I must be in some weird dream.

Are people really this stupid? Nvidia seems to think so... and they're probably right


----------



## moproblems99 (Jun 28, 2019)

With aibs much higher.

https://wccftech.com/nvidia-evga-geforce-rtx-2070-super-and-rtx-2060-super-graphics-card-leak/amp/


----------



## Xaled (Jun 28, 2019)

A 2060 for $399? No, thanks.


----------



## 64K (Jun 28, 2019)

moproblems99 said:


> With aibs much higher.
> 
> https://wccftech.com/nvidia-evga-geforce-rtx-2070-super-and-rtx-2060-super-graphics-card-leak/amp/


Not only that but if the cards are in short supply then expect retailer gouging on top of that. We will see what the real world prices are after the dust settles from the launch.


----------



## TesterAnon (Jun 28, 2019)

Xaled said:


> A 2060 for $399? No, thanks.



And that is the reference version; the EVGA version is $500, for example.
Yes, $500 for a 2060.

Well, see you again next gen; time to wait another 2 years for Pascal prices.


----------



## Vayra86 (Jun 28, 2019)

TesterAnon said:


> And that is the reference version, the EVGA version is $500 for example.
> Yes, $500 for a 2060.



Makes sense for EVGA, they're forced to redesign their cooling every other generation


----------



## B-Real (Jun 28, 2019)

Yeah, so much for the first rumoured $100 cut on the normal versions. Be honest, who believed that? The RTX 2060 would cost $250 that way, meaning they would need to cut the price of the $280 1660 Ti, then the 1660, then the 1650. And this way, the Navi lineup will have a good chance to prove itself.


Vayra86 said:


> Hahahahahaha yeah
> 
> It's even worse than I had imagined. MSRP is basically 1:1 with Turing relative to performance. Another full year of zero progress it is. Nice.
> 
> ...



Interesting thing is that they didn't list the 2070 and 2080. As the Super versions of both cost the same as the normal versions, I think they may run out of current supplies of the RTX 2070 and 2080 and then replace them with the Super versions. Except for the RTX 2060.


----------



## 64K (Jun 28, 2019)

B-Real said:


> Interesting thing is that they didn't list the 2070 and 2080. As the Super versions of both cost the same as the normal versions, I think they may run out of current supplies of the RTX 2070 and 2080 and then replace them with the Super versions. Except for the RTX 2060.


From the article sourced from VideoCardz, the reason for that is the rumor that the regular 2070 and 2080 are now EOL.


----------



## Markosz (Jun 28, 2019)

They keep pushing up the price...
An xx60 card for $400??? The GTX 960 was $200 MSRP. Soon they'll be asking $500 for entry/mid-grade cards.


----------



## rtwjunkie (Jun 28, 2019)

Vayra86 said:


> Hahahahahaha yeah
> 
> It's even worse than I had imagined. MSRP is basically 1:1 with Turing relative to performance. Another full year of zero progress it is. Nice.
> 
> ...


I believe I may be remaining on my 1080 Ti until the NEXT gen, whenever that might be. We'll see if I can break a record for time on one GPU. The previous one was a little over 3 years on an 8800 GTX until I got the GTX 285.


----------



## Razrback16 (Jun 28, 2019)

Vayra86 said:


> Hahahahahaha yeah
> 
> It's even worse than I had imagined. MSRP is basically 1:1 with Turing relative to performance. Another full year of zero progress it is. Nice.
> 
> ...



Boy, you said it. To be fair, disappointment is what I expected from NVidia and they definitely delivered. I'll still wait and see what they do with a 2080 Ti Super model, but the price-to-performance ratio is still completely unacceptable. $1,000 for a 2080 Ti that's only ~30-35% faster than a 1080 Ti... get outta here. The only way I'd buy Turing at this point is on the 2nd-hand market, cheap. Another $250-300 off the 2080 Ti and then we'll chat, Nvidia.


----------



## puma99dk| (Jun 28, 2019)

I currently own a GTX 1080 Ti FE Hybrid, and I was hoping at first that the RTX 2080 would beat it, as a new gen usually does with the previous Ti card. Not really: at 1440p the 1080 Ti still gives the RTX 2080 a run for its money, and since there still aren't many games with ray tracing, it doesn't matter to me.

I don't see myself paying something like $850 for an RTX 2080 when I've got a fully working GTX 1080 Ti.


----------



## moproblems99 (Jun 28, 2019)

Razrback16 said:


> Boy, you said it. To be fair, disappointment is what I expected from NVidia and they definitely delivered. I'll still wait and see what they do with the 2080 Ti Super model, but the pricing to performance ratio is still completely unacceptable. $1000 for a 2080 Ti that's only ~30-35 % faster than a 1080 Ti...get outta here. Only way I'd buy Turing at this point is 2nd hand market, cheap. Another $250-300 off the 2080 Ti and then we'll chat, Nvidia.



I'd pay 8 bills for the 2080 ti max.  3 years ago I wouldn't have spent over 6.

Edit: The poll sucks, by the way. There isn't an option for "stupid no matter what galaxy you live in."


----------



## Razrback16 (Jun 28, 2019)

moproblems99 said:


> I'd pay 8 bills for the 2080 ti max.  3 years ago I wouldn't have spent over 6.



Ya, I'd probably go $750 max on the reference and $850-900 per unit if they are pre-blocked. Sounds like I'm gonna be waiting a while though.  Which is fine, I'm hoping Intel will come out with something competitive on the high end.



moproblems99 said:


> Poll sucks by the way. There isn't an option for stupid no matter what Galaxy you live in.



Agreed. They absolutely need a 3rd option because the two options they have are basically N/A for me.  As Vayra86 said - SUPER. HARD. PASS. That is perfectly accurate for me as well at this point.


----------



## Raevenlord (Jun 28, 2019)

Updated the poll for all you non-believers.

And that's a joke, by the way.


----------



## Razrback16 (Jun 28, 2019)

Raevenlord said:


> Updated the poll



Thank you sir. Vote cast.


----------



## ZoneDymo (Jun 28, 2019)

Ngreedia strikes again.

Seriously, why even release these cards at this point? This is nonsense.


----------



## xkm1948 (Jun 28, 2019)

Wait for W1zzard’s review first


----------



## atomicus (Jun 28, 2019)

This is what happens when you have no competition at the top end. AMD and/or Intel need to pull their fingers out. It's not even funny anymore.


----------



## Recus (Jun 28, 2019)

ZoneDymo said:


> Ngreedia strikes again.
> 
> Seriously, why even release these cards at this point, this is nonsense.



Only the RTX 2060 Super's price increased, and that's only because AMgreeD priced the RX 480's ($199) successor at $449. The rest of the Supers, aka the nails in Navi's coffin, are cheaper and faster than the originals.


----------



## Manu_PT (Jun 28, 2019)

If the RTX 2070 Super has the same or similar performance to the 1080 Ti, €500-550 isn't that bad. But the rest... not good, tbh.


----------



## the54thvoid (Jun 28, 2019)

I bought a 2080ti because I had money spare. But even with that, I'm aware it was a silly price. As an Nvidia owner, I say JSH is a total asswipe.


----------



## neatfeatguy (Jun 28, 2019)

rtwjunkie said:


> I believe I may be remaining on my 1080Ti until the NEXT gen, whenever that might be. We’ll see if I can break a record for time on one GPU. The previous one was a little over 3 years on a 8800GTX until I got the GTX 285.



That's it, just 3 years?

I ran with my GTX 280s for just over 3 years (well, one was a bad 285 flashed with a 280 BIOS), then I moved to GTX 570s and had those for just about 4.5 years. I'm coming up on year 4 with my 980Ti cards. I haven't run them in SLI since the first 3 months of owning them; it's just not necessary for me. One does well enough for my gaming needs.

Usually my move in GPUs depended on 1 of 2 things:
1) Card dies and there is no viable replacement to keep SLI setup going
2) A single, new gen, card comes along and can deliver the same or better performance than my current SLI setup.

Going off how I upgraded in the past, I haven't seen a reasonably priced single card that can deliver the same or better performance than my 980Ti cards in SLI. I haven't looked for a while, but I believe the 2080Ti is twice the performance of a single 980Ti, but price-wise, it's absolutely a 'goram' joke. I don't plan on upgrading anytime soon - might be another gen or two at the rate prices are going.


----------



## TheMadDutchDude (Jun 28, 2019)

Pahahah. I’m still glad I hopped on the 1080 Ti wagon and got it with the water block for under $575 almost a year ago.


----------



## GoldenX (Jun 28, 2019)

What a joke; this will only help validate the already-high launch price of Navi. So, another year without innovation, only price increases.
The best RTX game is a remake of a decades-old FPS, and the competition doesn't even offer mesh shaders, let alone RTRT. And I thought Pascal was a boring release.


----------



## Manu_PT (Jun 28, 2019)

the54thvoid said:


> I bought a 2080ti because I had money spare. But even with that, I'm aware it was a silly price. As an Nvidia owner, I say JSH is a total asswipe.



I would never pay €1300 for a GPU. I'm not judging you; everyone knows what works for them. But when I was looking for a new GPU I had (and still have) enough money, and I still refused the idea of spending so much. Got a 1080 Ti for €490.

The problem here is that modern games are starting to demand more and more, and the mid-range is now €400-500, which is too much for mid-range cards. Soon people will enter "panic mode" when their RX 570s, RX 580s, GTX 1060s and GTX 1070s can't deliver 60 fps at 1080p anymore unless they use low settings. Then they will find out they have to pay €500 for a decent upgrade...

Shocking and worrying, IMO. CPU prices are better thanks to AMD, and RAM prices have stabilized, but GPUs, which are the most important part of a gaming PC, carry these shocking prices.

Idk man. I can see a lot of people going the console route soon. Paying a freaking €1000-1200 for a 60 fps 1080p rig? Dude... Mental.


----------



## Razrback16 (Jun 28, 2019)

Manu_PT said:


> Soon people will enter "panic mode" when their RX 570s, RX 580s, GTX 1060s and GTX 1070s can't deliver 60 fps at 1080p anymore unless they use low settings. Then they will find out they have to pay €500 for a decent upgrade...



Hopefully people don't panic - the 2nd hand market is very healthy. I'd hope people would go to ebay to get some good used gear and give NVidia a big fat middle finger.


----------



## Manu_PT (Jun 28, 2019)

Razrback16 said:


> Hopefully people don't panic - the 2nd hand market is very healthy. I'd hope people would go to ebay to get some good used gear and give NVidia a big fat middle finger.



Idk man, here people are still selling GTX 1080s for €350 at the lowest, and that's already the best price. But 2nd hand is defo the best thing to do. I got my 1080 Ti for €490 on the 2nd-hand market.
Gotta hunt for good deals, I guess.


----------



## Fluffmeister (Jun 28, 2019)

Yeah, last time I checked no one is forced to buy anything... and it's sadly no surprise the 1080 Ti gets a lot of love; even the Radeon VII fell short: same price, 7nm, years later.


----------



## ensabrenoir (Jun 28, 2019)

......meanwhile at Nvidia: Hey guys, watch this... boom, RTX 2060 "Super" for $400 and every other ridiculous price I can come up with, and... 5...4...3...2...1... (pre-orders flooding in, bank account swelling)... baaaahhhh hhhhaaaa!!!


......as much as we laugh, rant, and rave... Nvidia is doing the same thing... as they watch their bank account swell.


----------



## illli (Jun 29, 2019)

Markosz said:


> They keep pushing up the price...
> An xx60 card for $400??? The GTX 960 was $200 MSRP. Soon they'll be asking $500 for entry/mid-grade cards.




NV controls like 70% of the discrete market, so they set the market... This is what they want to do, and it'll keep getting worse. I saw this back when they had a glut of product last gen and didn't budge much on price; they were content to just let the supply run out however long it took. For this gen NV went full greed, pushing the x60 up a tier (to where the 2070 should have been in price), the 2070 up to the x80 tier, and so on. This will continue until people get fed up and stop buying their products. Otherwise, what good is it for people to complain but keep buying NV cards? I mean, people have been complaining for a while, but AMD hasn't gained much market share in 2 years.


----------



## Juankato1987 (Jun 29, 2019)

Xaled said:


> A 2060 for $399? No, thanks.


Right now you can find used 1080s for about $350, but instead you could throw in another $50 to get RTX support, a bit more performance,
and better power efficiency; that doesn't seem a bad deal, but it's sad to see an xx60 above $400.
One can dream about an RTX 2060 for less than $300.


----------



## Fluffmeister (Jun 29, 2019)

I haven't followed the market that much, but presumably AMD's upcoming Navi cards are going to massively undercut the evil Nvidians?


----------



## illli (Jun 29, 2019)

Fluffmeister said:


> I haven't followed the market that much, but presumably AMD's upcoming Navi cards are going to massively undercut the evil Nvidians?



I wouldn't say massive. I think we've seen AMD try to compete on price and offering better gaming bundles, but people just keep buying NV. In other words, being cheaper doesn't really gain AMD market share.


----------



## Deleted member 158293 (Jun 29, 2019)

Funny pricing...  that's a lot of Stadia subscription YEARS...


----------



## Hotobu (Jun 29, 2019)

I've always thought of the RTX line as fairly priced for what it is, yet too expensive. The R&D on RTX had to be astronomical, and true real-time lighting is without a doubt a welcome addition to gaming. A ~30% performance boost over last gen plus ray tracing is worth a premium; the problem is that the FPS drop doesn't make ray tracing worth using.


----------



## GoldenX (Jun 29, 2019)

Hotobu said:


> I've always thought of the RTX line as fairly priced, but too expensive. The R&D on RTX had to be astronomical, and true real time lighting is without a doubt a welcome addition to gaming. A ~30% performance boost over last gen with ray tracing is worth a premium, the problem is that the FPS drop doesn't make it worth using.


It's not true RTRT yet. Even the RTX cards crumble with full real ray tracing.


----------



## Eskimonster (Jun 29, 2019)

neatfeatguy said:


> That's it, just 3 years?
> 
> I ran with my GTX 280s for just over 3 years (well, one was a bad 285 flashed with a 280 BIOS), then I moved to GTX 570s and had those for just about 4.5 years. I'm coming up on year 4 with my 980Ti cards. I haven't run them in SLI since the first 3 months of owning them; it's just not necessary for me. One does well enough for my gaming needs.
> 
> ...



I've had all my GPUs 5 years or more. My current EVGA 1080 was pre-ordered at launch, 08/02/2017, for $375; I'll keep it until it dies.


----------



## Space Lynx (Jun 29, 2019)

Eh, I'll pass. I'll wait for the next round of Navi next year and 7nm NVIDIA next year. Enjoy your continued lower sales, Nvidia.


----------



## sam_86314 (Jun 29, 2019)

Nice to see price to performance has barely changed since 2016.

I really hope Intel manages to shake up the dGPU market next year, since AMD seems incapable of doing that. Or hell, maybe Matrox or SiS will burst back onto the scene out of nowhere (I can dream).


----------



## Nkd (Jun 29, 2019)

How much do we wanna bet this is Nvidia's phantom MSRP and the Founders Edition will be $100 more for each, except the 2060 Super, which will be $50 more, I think? Lol.


----------



## Mamya3084 (Jun 29, 2019)

Well... RTX is out of reach. Never in my life did I think I'd see graphics cards cost the full amount of a whole computer minus the GPU.

I just bought a 2nd Vega FE card for $300. Sure, CrossFire is hit and miss for newer games, but it works fine for DX11 games.


----------



## Berfs1 (Jun 29, 2019)

15500 MHz (effective) GDDR6 on a 256-bit bus is 496 GB/s, not 512 GB/s; 16000 MHz GDDR6 is 512 GB/s. Looks like the RTX 2060 will remain on TU106, and the RTX 2060 Super through RTX 2080 Super will use TU104, with the RTX 2080 Ti and Titan RTX on TU102.
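Those figures follow from the standard formula, bandwidth = effective data rate × bus width ÷ 8. A minimal sketch, assuming the rumored 256-bit bus (the function name is mine, not an official spec):

```python
def gddr6_bandwidth_gbps(effective_mhz: int, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s.

    effective_mhz is the effective data rate in MT/s; each transfer moves
    bus_width_bits / 8 bytes, and dividing by 1000 converts MB/s to GB/s.
    """
    return effective_mhz * (bus_width_bits / 8) / 1000

print(gddr6_bandwidth_gbps(15500, 256))  # 496.0 -> 15500 MHz gives 496 GB/s
print(gddr6_bandwidth_gbps(16000, 256))  # 512.0 -> 16000 MHz is needed for 512 GB/s
```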


----------



## mcraygsx (Jun 29, 2019)

Mamya3084 said:


> Well...RTX is out of reach. Never in my life did I think I'd see prices of cards cost the full amount of a whole computer minus the GPU.
> 
> I just bought a 2nd Vega Fe card for $300. Sure, crossfire is a hit and miss for newer games, but it works fine for DX 11 games.



Exactly the reason I refuse to upgrade this time around. I was blown away when I saw the prices of RTX at launch, and AMD never grabbed my attention with their mediocre GPUs. I used to upgrade each time NVIDIA released a new set of GPUs, but not this time. The TITAN XP was my last upgrade and I will be utilizing it for some time to come. A non-overclocked TITAN XP with 12 GB of GDDR5X performs great at 144 Hz.


----------



## Space Lynx (Jun 29, 2019)

mcraygsx said:


> Exactly the reason I refuse to upgrade this time around. I was blown away when I saw the prices of RTX at launch, and AMD never grabbed my attention with their mediocre GPUs. I used to upgrade each time NVIDIA released a new set of GPUs, but not this time. The TITAN XP was my last upgrade and I will be utilizing it for some time to come. A non-overclocked TITAN XP with 12 GB of GDDR5X performs great at 144 Hz.



yep just wait until 7nm Nvidia next year.


----------



## Tsukiyomi91 (Jun 29, 2019)

And I thought the 2060 SUPER would have faster memory than the 2060 FE. Guess there's no difference there, only a slight bump in CUDA cores, and that's all. For a $50 increment, too. Now I wanna see how it performs over the original RTX 2060 in benchmarks and real-world use.


----------



## 64K (Jun 29, 2019)

Tsukiyomi91 said:


> and I thought the 2060 SUPER will have a faster memory over the 2060 FE. Guess there's no difference there, only slight bump in CUDA cores & that's all. For $50 increment too. Now I wanna see how it performs over the original RTX2060 in benchmarks & real world use.



According to the leaks, the 2060 Super will also gain 2 GB of VRAM and the larger 256-bit memory bus over the 2060 FE, but I agree, until we see benchmarks we won't have the full picture of the performance gain. WhyCry over at VideoCardz speculates that the 2060 Super will be around 15% faster than the 2060 FE. We'll see soon.


----------



## Tsukiyomi91 (Jun 29, 2019)

I'm just saying. owo Seeing how it performs with minor improvements would be interesting.


----------



## cyneater (Jun 29, 2019)

They should have called it the Big Gay El Edition .. Super


----------



## Pumper (Jun 29, 2019)

Should have been $329 for 2060, $379 for 2060 Super, $479 for 2070 Super, 2080 Super at $629, but keep 2080 for $549.


----------



## r.h.p (Jun 29, 2019)

AU$600 for an entry-level card seems a lot.


----------



## TheGuruStud (Jun 29, 2019)

64K said:


> According to the leaks, the 2060 Super will also gain 2 GB of VRAM and the larger 256-bit memory bus over the 2060 FE, but I agree, until we see benchmarks we won't have the full picture of the performance gain. WhyCry over at VideoCardz speculates that the 2060 Super will be around 15% faster than the 2060 FE. We'll see soon.



That would make it the same perf as 2070 super....   Nvidia is tarded, but that tarded?


----------



## bug (Jun 29, 2019)

So we have here:
1. 2080 Super, slightly faster than the 2080, $100 less
2. 2070 Super, slightly faster than the 2070, same price
3. 2060 Super, significantly better than the 2060, $50 more

And on a supposedly tech-enthusiast, unbiased forum, just over half of voters chose "hate it".
I mean, I'm not buying any of these. But hate getting more bang for the buck? That'll be the day...


----------



## efikkan (Jun 29, 2019)

Nvidia released Turing; portrayed as evil and greedy.
Nvidia releases refreshed Turing improving performance per dollar; portrayed as evil, greedy and out to kill AMD.

I do wonder if there is anything Nvidia could do that would satisfy these people.


----------



## bug (Jun 29, 2019)

efikkan said:


> Nvidia released Turing; portrayed as evil and greedy.
> Nvidia releases refreshed Turing improving performance per dollar; portrayed as evil, greedy and out to kill AMD.


Nvidia isn't completely innocent in this, tho. If they had simply named their products properly (i.e. 2060 -> 2070, 2070 -> 2080, and 2080 -> 2090) they would have avoided a lot of badmouthing. I'm not sure why they went the route they did, but I'm pretty sure they're regretting it now.


----------



## 64K (Jun 29, 2019)

TheGuruStud said:


> That would make it the same perf as 2070 super....   Nvidia is tarded, but that tarded?



From the benches done here, the regular 2070 is around 15% faster than the regular 2060. If the speculation on VideoCardz is correct, the 2060 Super will be around 15% faster than the regular 2060, which puts the 2060 Super at around the same performance as the regular 2070. But WhyCry also speculates that the 2070 Super will be around 15% faster than the regular 2070, which would keep the 2070 Super around 15% ahead of the 2060 Super.

Perhaps it makes some sense to Nvidia if the rumor that the regular 2070 and 2080 are both EOL now is also correct.

In any case, we need to see how well the 2060 Super and 2070 Super overclock to see the real-world performance increase over the regular 2060 OC and 2070 OC. Obviously not everyone overclocks, so that is a consideration for a lot of buyers as well.
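The chain of speculated ~15% steps is easier to follow as plain arithmetic. A quick sketch with everything normalized to the regular 2060 = 100 (all deltas are rumors from this thread, not benchmark results):

```python
# Normalize the regular RTX 2060 to 100; every step below is a rumored ~15% gain.
rtx_2060 = 100.0
rtx_2070 = rtx_2060 * 1.15        # regular 2070: ~15% faster than the regular 2060
rtx_2060_super = rtx_2060 * 1.15  # rumored: ~15% over the regular 2060
rtx_2070_super = rtx_2070 * 1.15  # rumored: ~15% over the regular 2070

# The 2060 Super lands on the regular 2070...
print(round(rtx_2060_super, 2), round(rtx_2070, 2))  # 115.0 115.0
# ...while the 2070 Super stays ~15% ahead of the 2060 Super.
print(round(rtx_2070_super / rtx_2060_super, 2))     # 1.15
```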


----------



## ZoneDymo (Jun 29, 2019)

bug said:


> So we have here:
> 1. 2080 Super, slightly faster than the 2080, $100 less
> 2. 2070 Super, slightly faster than the 2070, same price
> 3. 2060 Super, significantly better than the 2060, $50 more
> ...



In the 2060's case you don't get more bang for the buck.
And really, these prices are just ridiculously high and have been from the start.
The GTX 1080, as the article mentioned, launched at $550, and this 2080 is $700... why not, oh I don't know, $600? Still more, but a more agreeable price.

An RTX 2060 for $400, when we all agree it should be more along the lines of $250, is what rubs people the wrong way.


----------



## medi01 (Jun 29, 2019)

atomicus said:


> This is what happens when you have no competition at the top end.


The 1650 sucks balls in the lower end, along with the 1050 and 1050 Ti, where there is a very strong AMD product (it's called the 570, in case you're not from planet Earth or live in a green bubble).

This happens because people swallow the bait and, according to ngreed's own slides, stick with the number, paying more.

It is essentially pushing people to tier up, disguised as a "new generation of cards".


I enjoy both the prices going up (you deserve what you get) and the surreal mental gymnastics used to justify the crap.


----------



## 64K (Jun 29, 2019)

ZoneDymo said:


> In the 2060's case you don't get more bang for the buck.



That's true. If the rumors are true, you will see a 15% increase in performance for the 2060 Super, but with a 15% increase in price, and we still don't have a side-by-side comparison of the regular and Super OC potential. From the sample 2060 FE tested here, W1zzard got a 9.5% performance increase in Unigine Heaven. What if the 2060 Super is already so close to its limits that overclocking only brings a 5% increase? Then you would end up with less real-world performance per dollar from an OC'd 2060 Super than from an OC'd 2060 FE.

Now, if Nvidia had set the MSRP for the 2060 Super at the old 2060 FE price, one could argue that you are getting more bang for the buck with the 2060 Super.
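That perf-per-dollar argument can be sketched with the numbers floated in this thread: a rumored 15% performance bump for a $50 price bump, W1zzard's 9.5% Heaven OC result for the 2060 FE, and a purely hypothetical 5% OC headroom for the Super:

```python
def oc_perf_per_dollar(base_perf: float, oc_gain: float, price_usd: float) -> float:
    """Overclocked performance per dollar (arbitrary perf units)."""
    return base_perf * (1 + oc_gain) / price_usd

# Regular 2060 FE: perf 100, 9.5% OC headroom (W1zzard's Heaven result), $349
regular = oc_perf_per_dollar(100.0, 0.095, 349.0)
# 2060 Super: rumored +15% perf for +$50, with an assumed (not measured) 5% OC headroom
super_ = oc_perf_per_dollar(115.0, 0.05, 399.0)

print(round(regular, 4))  # 0.3138
print(round(super_, 4))   # 0.3026
print(super_ < regular)   # True: under these assumptions the OC'd Super gives less per dollar
```

Change the assumed 5% headroom and the comparison flips, which is exactly why the side-by-side OC tests matter.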


----------



## medi01 (Jun 29, 2019)

Fluffmeister said:


> Yeah last time I checked no one is forced to buy anything...








Don't get me wrong, someone from the green bubble already went far enough to justify Standard Oil; you are in no way the first.


----------



## sam_86314 (Jun 29, 2019)

The Turing lineup should've been (based on previous generations):

$699 for the RTX 2080 Ti
$499 for the RTX 2080 with 3072 shaders
$379 for the RTX 2070 with 2560 shaders
$249 for the RTX 2060 with 2176 shaders

None of this "Super" nonsense. That just makes the lineup even more confusing.

But thanks to AMD's complacency, NVIDIA can do whatever they want.

If they keep this up, and we never get a third player in the market, my GTX 1070 will last a while.


----------



## Vayra86 (Jun 29, 2019)

GoldenX said:


> What a joke; this will only help validate the already-high launch price of Navi. So, another year without innovation, only price increases.
> The best RTX game is a remake of a decades-old FPS, and the competition doesn't even offer mesh shaders, let alone RTRT. And I thought Pascal was a boring release.



Pascal kicked GPU clocks up by 500 MHz (25%!) across the board _while improving on perf/watt_, and it was the largest generational jump in many years, yet you call it boring. Some people are hard to please, I guess.


----------



## bug (Jun 29, 2019)

ZoneDymo said:


> In the 2060's case you don't get more bang for the buck.


You don't know that. The 2060 was slammed for having "just" 6 GB of VRAM and a 192-bit memory interface. The Super fixes both of these. It will be a tad slower than the current 2070.


ZoneDymo said:


> and really, these prices are just ridiculously high and have been from the start.


Yes, I'm not rushing into buying any of these.


ZoneDymo said:


> The GTX 1080, as the article mentioned, launched at $550, and this 2080 is $700... why not, oh I don't know, $600? Still more, but a more agreeable price.


Again, the 2080 is not the successor of the 1080 in anything other than name; performance and power draw say the 2070 is. The whole line has shifted. The 1660 and 2060 can't both be successors to the same midrange card.


ZoneDymo said:


> An RTX 2060 for $400, when we all agree it should be more along the lines of $250, is what rubs people the wrong way.


I would love to see how you came to the conclusion that it _should_ (kudos for not saying _must_) cost that much, because you obviously missed Economics 101, where the right price is the one buyers are willing to pay and sellers to accept. As much as you and I hate the Turing prices, they are obviously what the market will bear, since Nvidia doesn't seem pressured to lower them a year later, even with Navi knocking on our doors.


----------



## GoldenX (Jun 29, 2019)

Vayra86 said:


> Pascal kicked GPU clocks up by 500 mhz (25%!) across the board _while improving on perf/watt_ and was the largest generational jump in many years and you call it boring. Some people are hard to please I guess


Same thing as Intel Core: a big jump, then nothing for 3 years, then a boring release that's even more expensive.


----------



## Vya Domus (Jun 29, 2019)

xx06-class chips have pretty much been successfully lifted into xx04 price ranges. Cool, we are now looking at a second product-stack shift, the first dating back to Kepler, when they moved xx04-class silicon into the xx00/02 price range.


----------



## bug (Jun 29, 2019)

GoldenX said:


> Same thing as Intel Core, big jump, then nothing for 3 years, then a boring release even more expensive.
> Edit: My mistake, I wrote Pascal, but was thinking Maxwell.


Pascal was a black swan, an architecture released after TSMC's 20nm fiasco. And yes, architectures aren't completely remade every generation. Never have been, never will be.


Vya Domus said:


> xx06 class chips have pretty much been successfully lifted into xx04 price range. Cool, we are now looking at a second product stack shift, the first dating back to Kepler when they moved xx04 class silicon into xx00/02 price range.


Congratulations, another one judging a book by its cover. Man, I don't know what is up with some people; they look at a number in a product's or chip's name and it's like all reason just gets thrown under the bus.


----------



## medi01 (Jun 29, 2019)

sam_86314 said:


> But thanks to AMD's complacency, NVIDIA can do whatever they want.


Yeah. All AMD's fault.
God forbid people use their brains and start voting with their wallet, or god forbid we blame greedia.


----------



## bug (Jun 29, 2019)

medi01 said:


> Yeah. All AMD's fault.


It's not AMD's fault, but AMD has (indirectly) contributed to the high prices. Everyone will charge as much as they can when they have no competition. Including AMD.


----------



## Fluffmeister (Jun 29, 2019)

bug said:


> It's not AMD's fault, but AMD has contributed to the high prices. Everyone will charge as much as they can when they have no competition. Including AMD.



Indeed, Navi is hitting the market at $379 to $499, but naturally that doesn't stop him "waiting for Navi"... whilst ignoring the same-priced evil Nvidia options.


----------



## bug (Jun 29, 2019)

Fluffmeister said:


> Indeed, Navi is hitting the market at $379 to $499, but naturally that doesn't stop him "waiting for Navi"... whilst ignoring the same-priced evil Nvidia options.


Yes, but Navi will be much faster and draw much less power at those price points. Oh wait...


----------



## rtwjunkie (Jun 29, 2019)

bug said:


> Again, 2080 is not the successor of 1080 in anything other than name. Performance and power draw say 2070 is.


Not this load of Krapp again. Nvidia's new model numbers are always the direct replacement for the previous edition and model number. Of course the performance of a lower model number will beat older models one tier up.

If there were no improvement, there would never be any need for new models.


----------



## bug (Jun 29, 2019)

rtwjunkie said:


> Not this load of Krapp again. Nvidia's new model numbers are always the direct replacement for the previous edition and model number. Of course the performance of a lower model number will beat older models one tier up.
> 
> If there were no improvement, there would never be any need for new models.


And guess what, 2070 improves performance over its 1080 predecessor: https://www.techpowerup.com/review/nvidia-geforce-rtx-2070-founders-edition/33.html


----------



## rtwjunkie (Jun 29, 2019)

bug said:


> And guess what, 2070 improves performance over its 1080 predecessor: https://www.techpowerup.com/review/nvidia-geforce-rtx-2070-founders-edition/33.html


Exactly. I JUST said that. It means it is an improvement over its predecessor, at the same point in the lineup. The rest is a friggin bonus.


----------



## bug (Jun 29, 2019)

rtwjunkie said:


> Exactly. I JUST said that. It means it is an improvement of its predecessor, at the same point in the lineup. The rest is a friggin bonus.


Yeah, but you also said 1080 is not its predecessor.


----------



## rtwjunkie (Jun 29, 2019)

bug said:


> Yeah, but you also said 1080 is not its predecessor.


Of course not!!! 1070 is. It is the SAME POINT IN NVIDIA's lineup! Even Nvidia, whose products these are, tells you that each model replaces itself: 60 to 60, 70 to 70, 80 to 80. It's not a hard concept to grasp.

If they did not get better than higher level models of previous gens, there would be no need to ever produce anything new.


----------



## bug (Jun 29, 2019)

rtwjunkie said:


> Of course not!!! 1070 is. It is the SAME POINT IN NVIDIA’s lineup! Even Nvidia, whose products they are tell you that each model replaces itself. 60 to 60, 70 to 70, 80 to 80. Its not a hard concept to grasp.
> 
> If they did not get better than higher level models of previous gens, there would be no need to ever produce anything new.


Ah, another one that can't see past numbers on the box. I can't fix that buddy.


----------



## rtwjunkie (Jun 29, 2019)

bug said:


> Ah, another one that can't see past numbers on the box. I can't fix that buddy.


Let me make it easy for your obtuseness.

A BMW 3 series gets a new model year. The new model year is as powerful as last year's 5 series. Does that make the new one a 5 series, or a replacement for last year's 5 series? No! It makes it a new and more powerful 3 series.

Same thing applies to video cards.


----------



## bug (Jun 29, 2019)

rtwjunkie said:


> Let me make it easy for your obtuseness.
> 
> A BMW 3 series gets a new model year. The new model year is as powerful as last years 5 series. Does that make the new one a 5 series? No! It makes it a new  and more powerful 3 series.
> 
> Same thing applies to video cards.



That's a pretty poor analogy. Cars are segmented by features and options, something that doesn't really apply to video cards.

Still, following your line of thought, if the 2060 is the successor to the 1060, what does that make the 1660 (Ti)? All your reasoning seems to be built around ignoring that those cards even exist.
We now have more model numbers. In my opinion, it's the 1660 (Ti) that makes all the other cards look like they were mislabeled.


----------



## freeagent (Jun 29, 2019)

I only paid $200 CAD for my 980 Classified that had a loose fan, but someone paid $700 USD for it. The 8800 GTX launched at $600-650 USD, and the 8800 Ultra was around $850. We had it easy from the 480 series up until the 1080 series, non-Ti of course.


----------



## 64K (Jun 29, 2019)

bug said:


> That's a pretty poor analogy. Cars are segmented by features and options, something that doesn't really apply to video cards.
> 
> Still, following your line of thought, if 2060 is the successor to 1060, what does that make the 1660(Ti). Because all your reasoning seems to be built around ignoring those cards even exist.
> We now have more model numbers. In my opinion, it's the 1660(Ti) that makes all other cards look like they were mislabeled.



But you do get features and options with cards.

For example, the GTX 1060: every variant had the same 200 mm² die with 4.4 billion transistors and a 192-bit memory bus. You could choose between:

- 1060 3GB: 1152 shaders, GDDR5, 192.2 GB/s bandwidth
- 1060 6GB: 1280 shaders, GDDR5, 192.2 GB/s bandwidth
- 1060 6GB: 1280 shaders, GDDR5, 216.7 GB/s bandwidth (higher-clocked 9 Gbps VRAM)

Nothing changed in the die size, transistor count, or memory bus width. In short, none of these options turned a 1060 into a 1070, even overclocked versus overclocked.

The 1070 8GB, by contrast: 1920 shaders, GDDR5, 256.3 GB/s bandwidth on a 256-bit memory bus, with a 314 mm² die and 7.2 billion transistors.

Nvidia has turned everything into a confusing mess for consumers with the SUPER series and its naming conventions, and they have done this before, with the Kepler series 7 years ago.
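For what it's worth, those bandwidth figures follow almost directly from bus width and per-pin data rate; a quick sketch using the nominal 8/9 Gbps rates (the effective numbers quoted above are marginally higher because the actual memory clocks sit slightly above nominal):

```python
# Peak GDDR bandwidth = (bus width in bytes) * per-pin data rate.
# Specs below are the public GTX 1060 / 1070 figures discussed above.

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1060 (192-bit bus): original 8 Gbps vs. the refreshed 9 Gbps VRAM
print(mem_bandwidth_gbs(192, 8))  # 192.0 GB/s
print(mem_bandwidth_gbs(192, 9))  # 216.0 GB/s

# GTX 1070 (256-bit bus, 8 Gbps)
print(mem_bandwidth_gbs(256, 8))  # 256.0 GB/s
```

The 9 Gbps refresh was a pure memory-speed bump: same die, same bus, ~12% more bandwidth.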


----------



## bug (Jun 29, 2019)

64K said:


> But you do get features and options with cards.
> 
> For example the GTX 1060. All of which had the same die size of 200 mm²  and 4.4 billion transistors and 192 bit Memory Bus. You could choose between:
> 
> ...


Ok, that's just dumber to the rescue of dumb.
You only get different GPUs, which is akin to different engines. You don't get the equivalent of trims, safety features and whatnot.


----------



## 64K (Jun 29, 2019)

bug said:


> Ok, that's just dumber to the rescue of dumb.



When you start calling people dumb and dumber.......that's when the discussion starts to derail into shit.

Be better than that bug.


----------



## bug (Jun 29, 2019)

64K said:


> When you start calling people dumb and dumber.......that's when the discussion starts to derail into shit.
> 
> Be better than that bug.


That may have been uncalled for.
But that doesn't change my argument. With Pascal, starting with the midrange at ~$200 we had:
1060 ($200-250)-> 1070 ($380-450) -> 1080 ($500-700)
With Turing, excluding the Supers, starting with the midrange at ~$200 we have:
1660 ($220-280) -> 2060 ($350) -> 2070 ($500) -> 2080 ($800)

The obvious nomenclature shift (and major snafu on Nvidia's part) seems to be something people try hard to miss.


----------



## GoldenX (Jun 29, 2019)

The whole line is a mess of expensive products and cut down features. And the worst part is that it still is the best choice right now.


----------



## bug (Jun 29, 2019)

GoldenX said:


> The whole line is a mess of expensive products and cut down features. And the worst part is that it still is the best choice right now.


No, the worst part is they don't drop in price because of Navi, which means a year from now they'll still be the best option. Imho, the line-up as I have described it above is OK. It's just that the 2080 Ti should never have been made, and probably neither should the plain 2080 (though the 2080 Super will shave $100 off the MSRP, and that will make it more attractive to whoever buys high-end).


----------



## 64K (Jun 29, 2019)

bug said:


> That may have been uncalled for.
> But that doesn't change my argument. With Pascal, starting with the midrange at ~$200 we had:
> 1060 ($200-250)-> 1070 ($380-450) -> 1080 ($500-700)
> With Turing, excluding the Supers, starting with the midrange at ~$200 we have:
> ...



It's the RTX thing. The Tensor cores and RT cores made the dies really large on Turing, except for the 1660 and 1660 Ti, which have neither. Since Nvidia pays per wafer, they get fewer GPUs per wafer, and that cost increase is passed on to consumers, along with the more expensive, faster GDDR6.

I can't say at this time whether Nvidia was right or wrong to push ray tracing. Time will tell, but for now the prices are painful for most gamers looking to upgrade from Maxwell or even Kepler. Pascal owners should probably wait it out until next year if they can.
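The pays-per-wafer point can be sketched with the usual first-order dies-per-wafer estimate. The die areas below are the public TU104 (RTX 2080) and TU116 (GTX 1660) figures; defect yield is ignored, so these are optimistic candidate counts, not good dies:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate: wafer area / die area, minus an edge-loss term.
    Ignores defect density and scribe lines, so real yields are lower."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

# TU104 (RTX 2080) is ~545 mm2; TU116 (GTX 1660) is ~284 mm2
print(dies_per_wafer(545))  # 101 candidate dies
print(dies_per_wafer(284))  # 209 candidate dies
```

Roughly twice as many candidate chips per wafer for the non-RTX die, before yield differences, which is the cost gap being passed along.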


----------



## bug (Jun 29, 2019)

64K said:


> It's the RTX thing. The Tensor cores and RT cores made the dies really large on Turing, except for the 1660 and 1660 Ti, which have neither. Since Nvidia pays per wafer, they get fewer GPUs per wafer, and that cost increase is passed on to consumers, along with the more expensive, faster GDDR6.
> 
> I can't say at this time whether Nvidia was right or wrong with pushing Ray Tracing. Time will tell but for now the prices are painful for most gamers looking to upgrade from Maxwells or even Keplers. Pascal owners should probably wait it out until next year if they can.


I know about the die size.
But I would have named everything differently:
1660 -> 2060*
2060 -> 2070
2070 -> 2080
2080 -> 2080Ti

And I'm sure there would have been far fewer complaints. Hell, I've written before that if I were Nvidia, I would have introduced RTX in Quadro cards first because of the large dies. But with AMD such a no-show, they must have seen an opportunity to milk the market.

*this would have been a GTX, of course


----------



## Vya Domus (Jun 29, 2019)

GoldenX said:


> The whole line is a mess of expensive products and cut down features. And the worst part is that it still is the best choice right now.



I can't see how the ultra mega super RTX 2060 at $400 would be the better choice versus a $380 5700.


----------



## bug (Jun 29, 2019)

Vya Domus said:


> I can't see how the ultra mega super RTX 2060 at 400$ would be the better choice versus a 380$ 5700.


It's going to be faster, that's how: https://www.notebookcheck.net/Vague...e-NVIDIA-GeForce-RTX-2060-Super.426572.0.html

But that's just MSRP, we'll have to see where the prices settle.


----------



## Xzibit (Jun 30, 2019)

bug said:


> I know about the die size.
> But I would have named everything differently:
> 1660 -> 2060*
> 2060 -> 2070
> ...



They did.


----------



## Fluffmeister (Jun 30, 2019)

So is the RX 5700 replacing the RX 570?


----------



## GoldenX (Jun 30, 2019)

Vya Domus said:


> I can't see how the ultra mega super RTX 2060 at 400$ would be the better choice versus a 380$ 5700.


The fact that the Navi cards should have gone up against Pascal, not Turing.


----------



## Vya Domus (Jun 30, 2019)

GoldenX said:


> The fact that the Navi cards should have gone up against Pascal, not Turing.



So, the history of what should have gone against what is more relevant than the here and now? I don't think this is about what's the better choice any longer, but AMD seems to have understood that, thankfully.


----------



## GoldenX (Jun 30, 2019)

Vya Domus said:


> So, the history of what should have gone against what is more relevant than the here and now? I don't think this is about what's the better choice any longer, but AMD seems to have understood that, thankfully


They want to enter the battle without being on par on features (mesh shaders, any sort of RT). Nvidia could do that before because they had the market share, AMD doesn't.


----------



## Vya Domus (Jun 30, 2019)

GoldenX said:


> Nvidia could do that before because they had the market share, AMD doesn't.



And they don't plan to. Undercutting your competitor while trying to offer a better product has been a losing strategy for them. Whether or not they've done all that was possible with it doesn't matter; they've opted out of that battle, and the $450 5700 XT is a clear indication of that. We are looking at a complete reversal of mindset from AMD: they are letting Nvidia fight itself, trying to sell more cards to the masses of people who already have them.



GoldenX said:


> They want to enter the battle without being on par on features (mesh shaders, any sort of RT).



We both know stuff such as mesh shaders and RT mean jack shit if market share is your goal.


----------



## GoldenX (Jun 30, 2019)

Vya Domus said:


> We both know stuff such as mesh shaders and RT mean jack shit if market share is your goal.


If you aren't at feature parity in the long run, you end up like SiS, VIA, Matrox, etc.


----------



## Vya Domus (Jun 30, 2019)

GoldenX said:


> If you aren't at feature parity in the long run, you end up like SiS, VIA, Matrox, etc.



Not when the features are found in <1% of the software recently released (being generous here).


----------



## Razrback16 (Jun 30, 2019)

ZoneDymo said:


> and really, these prices are just ridiculously high and have been from the start.



For the folks questioning why the quick poll is showing so many people in disfavor of the new lineup, just see above ^; that is why people dislike it. Turing is just a bad deal. It has been since launch. The performance does not even come close to justifying the astronomical price tag.

The 2nd-hand market is where it's at right now for folks who desperately need an upgrade from Maxwell / Kepler. I myself have been ready to upgrade for about a year, but the pricing is just so bad that even though I have borderline unlimited funds for an upgrade, I simply will not endorse what Nvidia is doing by pulling the trigger on a pair of 2080 Ti cards.


----------



## Anymal (Jun 30, 2019)

Vya Domus said:


> Not when the features are found in <1% of the software recently released (being generous here).


Similar situation as in 1999 with T&L and the GeForce 256. Maybe Nvidia is not that stupid. People hate what they don't understand. BTW, the 1660 for 220€ and the 1660 Ti in DE are great p/p against the 1060 3GB and 1060 6GB. Peace out!


----------



## Xzibit (Jun 30, 2019)

Anymal said:


> Similar situation as in 1999 with T&L and the GeForce 256. Maybe Nvidia is not that stupid. *People hate what they don't understand.* BTW, the 1660 for 220€ and the 1660 Ti in DE are great p/p against the 1060 3GB and 1060 6GB. Peace out!



Apparently so did Quadro buyers. Just 5 months after launch, the RTX 8000 was reduced by 45% and the RTX 6000 by more than 30%.


----------



## Pumper (Jun 30, 2019)

Fluffmeister said:


> So is the RX 5700 replacing the RX 570?



If the "leaked" trademarks are correct, then yes, 5700 should be a 570 replacement as AMD is working on RX 58xx and RX 59xx NAVI cards.


----------



## Fluffmeister (Jun 30, 2019)

Okay, it's just that it's quite a jump in price too. I guess that is the market these days.


----------



## Vayra86 (Jun 30, 2019)

Vya Domus said:


> And they don't plan to. Undercutting your competitor while trying to offer a better product has been a losing strategy for them. Whether or not they've done all that was possible with that doesn't matter, they've opted out of that battle, the 450$ 5700XT is a clear indication of that. _*We are looking a complete reversal of mindset from AMD, they are letting Nvidia fight itself trying to sell more cards*_ to the masses of people who already have them.
> 
> 
> 
> We both know stuff such as mesh shaders and RT mean jack shit if market share is your goal.



I'm not really seeing that, to be honest. All I'm seeing is AMD trying to catch up with minimal resources and failing time and time again. Nothing's changed, and it's not like AMD was the value option all the time in past years either. They weren't, despite endless back and forth about driver quality, promised features, and a whole lot of wishful thinking about MSRP versus real-world pricing with a constant supply problem - from Vega interposers to HBM to even something as stupidly simple as a Polaris card... You can blame mining for that, or you can blame Nvidia for that, but the fact remains the value proposition was never really there, even on the most basic product. And if it ever was, you'd get a hot, noisy card in return, more often than not. Funny how that works, huh... while in the meantime Nvidia kept shelves stocked with x50s and x60s and even pushed optimized Pascal 9/11 Gbps VRAM versions out for good measure (ring a bell?). Price? It was always much closer to MSRP because of good supply. The bottom line was that Nvidia was the value option in the end, in many regions and at many moments. And the result is indeed Nvidia fighting itself. Is that a reversal of mindset? I think it's just an inevitable conclusion to events that have occurred, and to AMD/RTG focusing on CPU and console.

Sure, there are a few Nvidia halo cards that have pushed prices up, but below that, Nvidia and AMD have been toe to toe, like they are now, all the time. What is really happening here is that AMD is riding along on Nvidia's price hikes with _much smaller GPUs_, and even though that might help their bottom line a bit, it certainly does not help us, and it's actually *a polar opposite of what Nvidia does with the larger Turing dies*. AMD right now does _not_ innovate, does _not_ bring absolute performance up to a new level, and does _not_ have a value option except in its 2/3-year-old leftovers - and probably has a better margin on Navi than Nvidia has on Turing from the 2060 and up.

It's easy to shit on Nvidia (not you per se) for pushing the envelope, but really? And you know that even I don't like RT nonsense in GPUs...



Anymal said:


> Similar situation as in 1999 with T&L and the GeForce 256. Maybe Nvidia is not that stupid. People hate what they don't understand. BTW, the 1660 for 220€ and the 1660 Ti in DE are great p/p against the 1060 3GB and 1060 6GB. Peace out!



Small difference: it's not 1999 anymore. We have 20 years of graphics development to work with and can get almost similar results with much less horsepower. In those 20 years we also saw production costs for games explode, and market demand did the same. With that demand, the current state of graphics is _really good_ already. Any new technology is fighting an uphill battle, while back in 1999 even a blind man could see there was a lot to improve. And then there's that nasty little bugger called Moore's Law and the limited potential for shrinks.


----------



## Vya Domus (Jun 30, 2019)

Anymal said:


> Similar situation as in 1999 with T&L and Geforce 256.



You mean hardware that was used to accelerate something which had already been implemented in software for years by that point? Yeah, that's definitely the same thing. Peace out!



Vayra86 said:


> but really?



But really what? I am calling it for what it is: AMD has been doing the same thing over and over for the past 7-8 years while Nvidia pulled ahead in revenue and market share, and clearly they can't and won't keep doing that forever. The technology they have at their disposal right now is fine; a large Navi chip clocked in its optimal power envelope would be plenty fast, and there is no need to catch up anymore. While I don't think we'll see one, should something like that be released, its price tag is going to be eye-watering.


----------



## Aquinus (Jun 30, 2019)

I'm pretty sure this makes me want to sell off my remaining stock in nVidia and put it somewhere else.


----------



## Vayra86 (Jun 30, 2019)

Vya Domus said:


> You mean hardware that was used to accelerate something which was already implemented in software for years by that point in time ? Yeah that's definitely the same thing. Peace out!
> 
> 
> 
> But really what? I am calling it for what it is: AMD has been doing the same thing over and over for the past 7-8 years while Nvidia pulled ahead in revenue and market share, and clearly they can't and won't keep doing that forever. The technology they have at their disposal right now is fine; a large Navi chip clocked in its optimal power envelope would be plenty fast, and there is no need to catch up anymore. While I don't think we'll see one, should something like that be released, its price tag is going to be eye-watering.



While I get what you are saying about Navi, the key point is timing. Nvidia can still move to 7nm, and Navi cannot catch Nvidia's top end even today. Unless you believe Navi 20 is capable of topping the 2080 Ti...

Honestly, they can price that halo card up to the moon; it's still better than nothing. Even the 2080 Ti is helping the trickle-down of performance. But releasing 'plenty fast' sub-top-end cards does not, and we see proof of that right now.


----------



## Vya Domus (Jun 30, 2019)

Vayra86 said:


> Unless you believe Navi 20 is capable of topping 2080ti...



I don't know what Navi 20 is or will be. I look at die sizes: 400mm^2 would easily boost Navi into 2080 Ti territory. Turing won't scale well size-wise on 7nm because it's already huge. You are forgetting AMD is now on an even playing field, process-wise.


----------



## efikkan (Jun 30, 2019)

With Navi 10's 40 CUs reaching a TDP of 225W, Navi 2x could get really hot. Navi 2x might reach beyond RTX 2080 in performance, but at what cost? And by the time it arrives, Nvidia's next gen is right around the corner.


----------



## medi01 (Jun 30, 2019)

efikkan said:


> With Navi 10's 40 CUs reaching a TDP of 225W, Navi 2x could get really hot. Navi 2x might reach beyond RTX 2080 in performance, but at what cost?



Remind me, *why do people who are after a $300-400-ish card need to wait for the release of a $700-800 card?*

TDP of the 5700 is around 180W (a 250mm^2 chip).
The 2080 is what, 30%-ish faster than that, at 545mm^2 (minus the node advantage)?
Hardly something unreachable, even ignoring high-yield rumors.



Vayra86 said:


> Nvidia can still move to 7nm and Navi cannot catch Nvidia top end even today


Nvidia cannot move to 7nm overnight, for starters.
Elaborate why AMD "can't" catch Nvidia's "top end", please.


----------



## efikkan (Jun 30, 2019)

medi01 said:


> Remind me, *why do people who are after a $300-400-ish card need to wait for the release of a $700-800 card?*
> TDP of the 5700 is around 180W (a 250mm^2 chip).
> The 2080 is what, 30%-ish faster than that, at 545mm^2 (minus the node advantage)?
> Hardly something unreachable, even ignoring high-yield rumors.


No, I was more thinking along the lines of which sacrifices they have to make to achieve it, like a >300W TDP, noise etc., not monetary cost.
Right now Nvidia offers RTX 2080 at $700 and 215W, and RTX 2080 Ti costing >$1000 and 250W.
Competing with these will be hard enough, but remember that Navi 2x will primarily compete with the successor of Turing on 7nm, and I assume by that time Nvidia will push down those performance tiers and improve efficiency further.


----------



## Vayra86 (Jun 30, 2019)

medi01 said:


> Remind me, *why do people who are after a $300-400-ish card need to wait for the release of a $700-800 card?*
> 
> TDP of the 5700 is around 180W (a 250mm^2 chip).
> The 2080 is what, 30%-ish faster than that, at 545mm^2 (minus the node advantage)?
> ...



The bigger you go, the lower the payoff for additional die size and shaders because clocks tend to suffer. This may be less apparent on 7nm and depends on architecture as well, but still, it won't ever be the reverse of that, at least - even on Pascal and Turing you do see the midrange SKUs clock somewhat higher than the top-end.

But as @efikkan points out, TDP budget is going to be a problem once again. 7nm doesn't change that all that much, and if you remove the node and just look at architecture AMD still has work to do. Perf/watt is still a thing and again, we're only even comparing this all to OLD Nvidia stuff - while Navi 20 is yet to release. Timing. Time to market. Relevance. Did you seriously think Nvidia is just now taking a look at what to do with 7nm? I surely hope not... If you were, be ready for another Kepler refresh >>> Maxwell curb stomp because that is very likely the jump we will see there.

The reason people with a $300-400 card budget are looking up at the halo cards is that they indicate how worthwhile that $300-400 purchase really is. After all, if performance just about flatlines after, say, a 2060, why would you spend $700-800 on the 2080? At the same time, today's $700-800 card is tomorrow's $300-400 card (simply put).

Progress in the high end matters; it is essential to keep the market moving forward. What we are seeing since Turing is _not that_, and the result is price and performance stagnation. Since Navi will be too late to even matter in that sense, even Navi 20 catching up to the 2080 Ti is unlikely to make a difference, unless, again, AMD is willing to play the value game they really could play with these GPUs due to their size.


----------



## bug (Jun 30, 2019)

Vayra86 said:


> The bigger you go, the lower the payoff for additional die size and shaders because clocks tend to suffer. This may be less apparent on 7nm and depends on architecture as well, but still, it won't ever be the reverse of that, at least - even on Pascal and Turing you do see the midrange SKUs clock somewhat higher than the top-end.
> 
> But as @efikkan points out, TDP budget is going to be a problem once again. 7nm doesn't change that all that much, and if you remove the node and just look at architecture AMD still has work to do. Perf/watt is still a thing and again, we're only even comparing this all to OLD Nvidia stuff - while Navi 20 is yet to release. Timing. Time to market. Relevance. Did you seriously think Nvidia is just now taking a look at what to do with 7nm? I surely hope not... If you were, be ready for another Kepler refresh >>> Maxwell curb stomp because that is very likely the jump we will see there.
> 
> ...


What he said. I always buy cards around the $250 mark, but at the same time I always read about the high end, just so I know what to expect in a generation or two. And yes, I don't expect the high end to automagically transform into next year's mid-range, because experience has taught me that doesn't always happen (no matter how much I whine about it).


----------



## Mescalamba (Jun 30, 2019)

They might be worth it over the current cards, simply because of later batches. My very late revision of the 1080 core is quite impressive when it comes to OC, while early batches were much less impressive.


----------



## erixx (Jul 1, 2019)

I'd love to turbo-charge my build, as always, but the basic info, contrasted with the GTX 1080 Ti, does not look revolutionary.


----------



## medi01 (Jul 1, 2019)

Vayra86 said:


> The bigger you go, the lower the payoff for additional die size and shaders because clocks tend to suffer. This may be less apparent on 7nm and depends on architecture as well, but still, it won't ever be the reverse of that, at least - even on Pascal and Turing you do see the midrange SKUs clock somewhat higher than the top-end.
> 
> But as @efikkan points out, TDP budget is going to be a problem once again. 7nm doesn't change that all that much, and if you remove the node and just look at architecture AMD still has work to do. Perf/watt is still a thing and again, we're only even comparing this all to OLD Nvidia stuff - while Navi 20 is yet to release. Timing. Time to market. Relevance. Did you seriously think Nvidia is just now taking a look at what to do with 7nm? I surely hope not... If you were, be ready for another Kepler refresh >>> Maxwell curb stomp because that is very likely the jump we will see there.



Sure, performance doesn't grow 1:1 with chip size, but we are talking about the 30%-ish performance gap that the 180W-ish 250mm^2 card has (I'm talking about the non-XT 5700!).
Going from 180W to 280-ish W and doubling the chip size should get one way past a 30%-ish performance bump.




efikkan said:


> Navi 2x will primarily compete with the successor of Turing on 7nm,


I think we'll see the 5800 and 5900 by the end of the year, while Turing's successor would come no earlier than Q2 next year, given Huang's comments.
Besides, even if Turing's successor is good, its pricing hardly will be.


----------



## Vayra86 (Jul 1, 2019)

medi01 said:


> Sure, performance doesn't grow 1:1 with chip size increase, but we are talking about 30%-ish performance gap that 180w-ish 250mm^2 card has. (I'm talking about non-XT 5700!)
> Going from 180w to 280-ish w and doubling chip size should get one way past 30%-ish performance bump.



Yes. But there are a few caveats
- memory bandwidth: AMD's delta compression is still behind the curve, and they will need a lot of bandwidth to handle 2080 Ti levels of data transfer - something that even the Radeon VII with 16GB of HBM hasn't had to do yet, even though it should be more than capable. Navi carries GDDR6, and we've seen that even HBM-equipped Vega benefits from memory tweaks... The best thing AMD could achieve on GDDR5 was GTX 1060 6GB performance, give or take. Not exactly a feat.
- we have yet to see a proper GPU Boost implementation, though I believe Navi does offer that, or at least improves on it. But as good as GPU Boost 3.0? Fingers crossed.
- if they go very big and lower clock rates as a result, that will rapidly destroy their die-size advantage and therefore margins; ideally they'd go the other way around: higher clock rates while keeping die size under control. They've only just begun on the 7nm node. Exploding die size this early is a huge long-term problem if you intend to remain competitive. Turing's large dies are built on a 12nm node with no future. On 7nm, they will have a lot of breathing room even _with_ dedicated RT hardware.
- time to market. Nvidia is already releasing the Super cards now... and they still have 7nm to work with. So by then, AMD once again has a 280-300W (OC) card with probably a large die fighting Nvidia's sub-top end, which will probably need about 180-210W. History repeats...

I'm finding it hard to be optimistic about this. The numbers don't lie, and unless AMD pulls an architectural rabbit out of the hat, they're always going to lag behind. And note: that is even _while completely lacking RT hardware._ If the shit really hits the fan, Nvidia could just shrink their die by 20% and nobody would ever notice.


----------



## medi01 (Jul 1, 2019)

Vayra86 said:


> AMD's delta compression is still behind the curve.


Possibly, but we haven't seen how well Navi handles it yet and we know Navi isn't Vega.




Vayra86 said:


> The best thing AMD could achieve on GDDR5 was GTX 1060 6GB performance, give or take. Not exactly a feat.


Hold on, AMD simply didn't try to develop a bigger Polaris, instead focusing on Vega.
That doesn't at all mean AMD could not do it; for that matter, it could simply have used a wider memory bus.



Vayra86 said:


> - if they go very big and lower clockrate as a result, that will rapidly destroy their die size advantage and therefore margins; and ideally they'd go the other way around: higher clockrates while keeping die size under control.


Keep in mind that a sizable part of the mentioned 180W is consumed by the memory/memory controller. So doubling the chip size should land at around 280-300W, I think (as it was with Vega 64 vs Polaris; in fact, Vega 64 is more than twice the size of Polaris).



Vayra86 said:


> - Time to market. Nvidia already releases the Super cards now... and they still have 7nm to work with. So by then, AMD once again has a 280~300W(OC) card with probably a large die fighting Nvidia's sub-top end that will probably need about 180-210W. History repeats...


The 5700/5700 XT will be available starting 7.7.2019, the bigger guys probably later on, but the most attractive thing about them will be the price.
Obscene margins on the 2080 and beyond mean AMD has lots of room to maneuver: downclocking, bigger size, dropping the price.

Heck, anything will be better than having that 16GB HBM2 Vega VII at $699.


----------



## bug (Jul 1, 2019)

medi01 said:


> Hold on, AMD simply didn't try to develop a bigger Polaris, instead focusing on Vega.
> That doesn't at all mean AMD could not do it; for that matter, it could simply have used a wider memory bus.


Yes, with the RX 590 drawing almost as much power as a 2080, there was definitely room for more powerful Polaris chips.


----------



## Vayra86 (Jul 1, 2019)

medi01 said:


> Hold on, AMD simply didn't try to develop a bigger Polaris, instead focusing on Vega.
> That doesn't at all mean AMD could not do it; for that matter, it could simply have used a wider memory bus.
> 
> Keep in mind that a sizable part of the mentioned 180W is consumed by the memory/memory controller. So doubling the chip size should land at around 280-300W, I think (as it was with Vega 64 vs Polaris.



We'll have to see, but can you see the problem in this combination of statements? I can... Pre-Polaris we had a Fury X that used *HBM1* to reach GTX 1070 (980 Ti) performance levels. They could have used a wider bus for Polaris... but then what do you really have? _Hawaii (XT)_ with a new name and a problem with power and perf/watt. Not something you can scale further - not viable for iterative improvement. If AMD could really have made a viable GPU with a wider GDDR5 bus, they would have, but we have absolutely not a shred of evidence they were ever capable of doing so - the performance simply wouldn't be there.

So yes, I would agree that Navi's (5700/XT) selling point will be price. And that is another case of history repeating, unfortunately.


----------



## medi01 (Jul 1, 2019)

Vayra86 said:


> Not something you can scale further - not viable for iterative improvement. If AMD could really make a viable GPU with a wider GDDR5 bus, they would have, but we have absolutely not a shred of evidence they were ever capable of doing so - the performance simply wouldn't be there.


I don't see how any magic is needed to do a bigger Polaris.
It's just a matter of allocating money to the project.

Back to the bigger-chip discussion: the 5700/XT are 40 CU.
A 60 CU chip would have a die size of about 350 mm² (with a couple of CUs disabled).

40 CU @ 1700 MHz = 8.7 TF
60 CU @ 1700 MHz = 13 TF   (+50% vs 5700 XT) - at around 250 W, perhaps?
60 CU @ 1600 MHz = 12.2 TF (+40% vs 5700 XT)
60 CU @ 1500 MHz = 11.5 TF (+32% vs 5700 XT)

Ain't outlook quite rosy in team red?
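The numbers above follow from peak FP32 = CUs × 64 stream processors × 2 ops per clock (FMA) × clock. A quick sketch to double-check them (the formula is the usual one assumed for RDNA-style parts, not an official spec):

```python
# Peak FP32 throughput: CUs x 64 SPs x 2 ops/clock (FMA) x clock
def peak_tflops(cus: int, clock_mhz: int) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(round(peak_tflops(40, 1700), 1))  # 8.7 TF (5700 XT-class)
print(round(peak_tflops(60, 1700), 1))  # 13.1 TF
print(round(peak_tflops(60, 1500), 1))  # 11.5 TF
```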


----------



## bug (Jul 1, 2019)

medi01 said:


> I don't see how any magic is needed to do a bigger Polaris.
> It's just a matter of allocating money to the project.
> 
> Back to bigger chip discussion. 5700/XT are 40CU.
> ...


It's so rosy you'll have to remind me how much a 50% bigger chip will cost.


----------



## Vayra86 (Jul 1, 2019)

medi01 said:


> I don't see how any magic is needed to do a bigger Polaris.
> It's just a matter of allocating money to the project.
> 
> Back to bigger chip discussion. 5700/XT are 40CU.
> ...



It would look rosy if there were no competitor, yes. What we're missing is the time to market and the aforementioned bandwidth constraints. I used the GDDR5 Polaris example because it shines a light on the GDDR6 Navi situation - a repeat of that is on the horizon, and Navi is on a fast track to becoming the next Hawaii XT: a card that has the performance but falls short in everything else (noise, heat, die size). And note: that is not a huge problem when a release lands at the END of a node (like 28 nm Hawaii), but when you're right at the start....



medi01 said:


> I simply don't see it. There is *transistor-for-transistor parity* with Nvidia, to begin with.
> How come a 350 mm² chip taking on the 2080 with roughly the same power consumption will "fall short"?
> It can well fall short in sales, because the clueless buy green.
> 
> ...



People perceived Fury quite rightly, though - "power hog" maybe not so much, but the 980 Ti was all-round the better card, and it is leagues more relevant today than a Fury X thanks to its 6 GB, even surpassing Fury at 4K now. Fury X aged horribly: it didn't perform as well at 1080p and lost its high-resolution performance advantage over time. And yes, it did also use more power.

A difference in perception is fine. Time will tell... but I will say my crystal ball has a pretty decent hit rate.


----------



## medi01 (Jul 1, 2019)

Vayra86 said:


> A card that has the performance, but falls short in everything else (noise, heat, die size).


I simply don't see it. There is *transistor-for-transistor parity* with Nvidia, to begin with.
How come a 350 mm² chip taking on the 2080 with roughly the same power consumption will "fall short"?
It can well fall short in sales, because the clueless buy green.

It's more a matter of perception than anything else - just look at how many refer to Fury as a "power hog" when, in fact, it was on par with the 980 Ti.

More to it, both Sony and Microsoft have promised that upcoming consoles will support RT.
They are very likely to go with 7 nm EUV, which further lowers power consumption.


----------



## efikkan (Jul 1, 2019)

medi01 said:


> 40 CU @ 1700 MHz = 8.7 TF
> 60 CU @ 1700 MHz = 13 TF   (+50% vs 5700 XT) - at around 250 W, perhaps?
> 60 CU @ 1600 MHz = 12.2 TF (+40% vs 5700 XT)
> 60 CU @ 1500 MHz = 11.5 TF (+32% vs 5700 XT)
> ...


I'm just curious, how do you figure that a 50% larger chip at similar clocks would only consume 11% more power?


----------



## bug (Jul 1, 2019)

efikkan said:


> I'm just curious, how do you figure that a 50% larger chip at similar clocks would only consume 11% more power?


He doesn't figure anything, but since he set out to paint a rosy picture of Navi, there simply wasn't room for a bigger number there.


----------



## chinmi (Jul 1, 2019)

Markosz said:


> They keep pushing the price up...
> An xx60 card for $400??? The GTX 960 was $200 MSRP. Soon they will be asking $500 for entry-to-mid-grade cards.


You're right. Scary sh*t indeed.


----------



## medi01 (Jul 1, 2019)

efikkan said:


> I'm just curious, how do you figure that a 50% larger chip at similar clocks would only consume 11% more power?


How much would it consume in your opinion?


----------



## bug (Jul 1, 2019)

medi01 said:


> How much would it consume in your opinion?


Let's see: the RTX 2080's die is 1.7x the size of the RTX 2060's. That pushes the TDP from 160 W to 250 W (1.6x).
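Extrapolating that same area-to-TDP ratio to a hypothetical 50% larger Navi gives a rough ceiling (a sketch only; the 1.7x/1.6x Turing ratio and the 225 W 5700 XT figure are the numbers quoted in this thread, not measurements of mine):

```python
# Scale TDP with die area, assuming TDP grows slightly sub-linearly
# (per the Turing data point above: 1.7x area -> 1.6x TDP).
def scaled_tdp(base_tdp_w: float, area_scale: float,
               ref_area_scale: float = 1.7, ref_tdp_scale: float = 1.6) -> float:
    tdp_scale = 1 + (area_scale - 1) * (ref_tdp_scale - 1) / (ref_area_scale - 1)
    return base_tdp_w * tdp_scale

print(round(scaled_tdp(225, 1.5)))  # ~321 W for a 50% bigger 5700 XT
```

which lands right at the top of the ~275-325 W range being guessed at elsewhere in the thread.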


----------



## HenrySomeone (Jul 1, 2019)

Well, there is zero reason for Nvidia to sell any cheaper, as even the "lowly" 2060 Super will smash the best Navi has to offer, and when properly OCed it will rival even the Crapeon 7, so... take your complaints to AMD for not being able to compete at all for the last 5 years.


----------



## bug (Jul 1, 2019)

HenrySomeone said:


> Well, there is zero reason for Nvidia to sell any cheaper, as even the "lowly" 2060 Super will smash the best Navi has to offer, and when properly OCed it will rival even the Crapeon 7, so... take your complaints to AMD for not being able to compete at all for the last 5 years.


Yeah, it's got to the point where I have more hope in Intel putting pressure on Nvidia over the next few years.


----------



## efikkan (Jul 1, 2019)

medi01 said:


> How much would it consume in your opinion?


It would be just a guess, but considering the RX 5700 XT consumes 225 W and the slightly higher-clocked anniversary edition consumes 235 W, that's a good indicator that AMD has pushed the chip beyond its "sweet spot". So unless a ~50% larger chip ran at much lower clocks, I would assume anywhere from ~275-325 W, especially if they put a 384-bit memory controller on it.


----------



## HenrySomeone (Jul 1, 2019)

bug said:


> Yeah, it's got to the point where I have more hope in Intel putting pressure on Nvidia over the next few years.


Yeah, I mean, they aren't about to close in on Nvidia anytime soon, but I'm willing to bet they will overtake puny RTG by 2021, after which point it will probably be relegated to APUs only, as history shows that no more than two major players can survive. And I suspect it won't be much better in the CPU space by then, as Intel will probably start rolling out their 7 nm chips, which will likely obliterate AMD's 5 nm and even 3 nm... Team red fanbois should cherish these days, as they won't last...


----------



## bug (Jul 1, 2019)

HenrySomeone said:


> Yeah, I mean, they aren't about to close in on Nvidia anytime soon, but I'm willing to bet they will overtake puny RTG by 2021, after which point it will probably be relegated to APUs only, as history shows that no more than two major players can survive. And I suspect it won't be much better in the CPU space by then, as Intel will probably start rolling out their 7 nm chips, which will likely obliterate AMD's 5 nm and even 3 nm... Team red fanbois should cherish these days, as they won't last...


I wouldn't make those predictions, there are too many variables at play here.



efikkan said:


> It would be just a guess, but considering RX 5700 XT consumes 225W and the slightly higher clocked anniversary edition consumes 235W, it's a good indicator that AMD have pushed the chip beyond the "sweetspot". So unless a ~50% larger chip runs at much lower clocks, I would assume anywhere from ~275-325W, especially if they put a 384-bit memory controller on it.


It's possible the anniversary edition just has its TDP set higher to allow for more legroom - not necessarily an indicator that the XT is past the sweet spot.
The real problem we know of is price: even if Navi could be made into bigger chips, we'd still be in uninteresting $600+ territory.


----------



## Vayra86 (Jul 2, 2019)

bug said:


> Let's see: the RTX 2080's die is 1.7x the size of the RTX 2060's. That pushes the TDP from 160 W to 250 W (1.6x).



And that is not even including the fact that at least 15% of the die is not dedicated to raw gaming performance and barely weighs on the TDP budget. This works against Navi: as size goes up, I reckon 250 W won't be enough. An interesting calculation, then, would be: how large would a shrunk-down 7 nm 2080 be? I'm too lazy for that atm.
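For the "too lazy" bit, a very rough sketch: take TU104 (RTX 2080) at ~545 mm² on 12 nm and assume a blended ~2x effective density gain at 7 nm (logic shrinks much better than SRAM and analog/I/O; the 2x is my assumption, not a vendor figure):

```python
# Hypothetical 7 nm shrink of TU104 (RTX 2080, ~545 mm^2 on 12 nm)
TU104_AREA_MM2 = 545
EFFECTIVE_SHRINK = 2.0  # assumed blended density gain, not an official number

shrunk_area = TU104_AREA_MM2 / EFFECTIVE_SHRINK
print(f"~{shrunk_area:.1f} mm^2")  # ~272.5 mm^2
```

That would put a shrunk 2080 in the same ballpark as Navi 10's 251 mm².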


----------



## Anymal (Jul 2, 2019)

Vya Domus said:


> You mean hardware that was used to accelerate something which had already been implemented in software for years by that point in time? Yeah, that's definitely the same thing. Peace out!


Not the same, but similar: there were no games to use the T&L the GeForce 256 offered in August '99. Come in peace!

Vayra, no shrinking of old tech like AMD likes to do. Ampere will be 7 nm or 5 nm and new tech. Well, AMD patented something hybrid; Papermaster is good at papers.


----------



## HenrySomeone (Jul 2, 2019)

Vayra86 said:


> And that is not even including the fact that at least 15% of the die is not dedicated to raw gaming performance and barely weighs on the TDP budget. This works against Navi: as size goes up, I reckon 250 W won't be enough. An interesting calculation, then, would be: how large would a shrunk-down 7 nm 2080 be? I'm too lazy for that atm.


Yeah, they could probably fabricate a "monster" die-size Navi which would land somewhere between the 2080 and 2080 Ti while guzzling upwards of 300 W and pretty much needing watercooling out of the box, just like Fury X, but unlike the latter (which was also a failure, but at least remained somewhat competitive for a while - until you OCed the 980 Ti, of course, lol), it would be rendered obsolete almost immediately by the 2080 Super.


----------



## medi01 (Jul 2, 2019)

HenrySomeone said:


> ...need watercooling out of the box, just like Fury X...


Fury X's power consumption is only 10-15% higher than that of the 980 Ti, contrary to what you seem to think.


----------



## Vayra86 (Jul 2, 2019)

Anymal said:


> Not the same, but similar: there were no games to use the T&L the GeForce 256 offered in August '99. Come in peace!
> 
> Vayra, no shrinking of old tech like AMD likes to do. Ampere will be 7 nm or 5 nm and new tech. Well, AMD patented something hybrid; Papermaster is good at papers.



Architecture development has been iterative since forever. Define "new tech", please... are they going to ditch CUDA? Since Kepler we haven't seen a new architecture from Nvidia - as in, no radical changes. Turing is not a huge change either; it adds new bits to what was already there.


----------



## bug (Jul 2, 2019)

medi01 said:


> Fury X power consumption is only 10-15% higher than that of 980Ti, contrary to what you seem to think


The 980 Ti has 50% more VRAM (and not of the HBM variety). The difference is greater if you compare just the GPUs.


----------



## Anymal (Jul 2, 2019)

Vayra, true. But, but, 480, 580, 590.


----------



## Vya Domus (Jul 2, 2019)

Anymal said:


> Not the same, similar as no games to use T&L Gwforce 256 offered in august 99.



Let me put it another way: people were already implementing the exact same things in game engines, just not for dedicated hardware. Now let's see, how many engines were doing software ray tracing before DXR?

You can't get 3D graphics on screen without projection matrices and whatnot, with or without hardware acceleration. Explain to me how this stuff wasn't the same before the amazing people at Nvidia released hardware T&L.



Anymal said:


> Vayra, no shrinking as AMD like to do with old tech.



Pascal was pretty much a straight up shrink, change my mind.


----------



## bug (Jul 2, 2019)

Vya Domus said:


> Let me put it another way: people were already implementing the exact same things in game engines, just not for dedicated hardware. Now let's see, how many engines were doing software ray tracing before DXR?



All of them, if they could. Unlike T&L, RTRT cannot be done in software on the silicon we have today.




Vya Domus said:


> Pascal was pretty much a straight up shrink, change my mind.



It's impossible to change your mind. If you were open to change/information, you would have found this on your own: https://en.wikipedia.org/wiki/Pascal_(microarchitecture)#Details


----------



## qwerty_lesh (Jul 2, 2019)

GPUs marketed at hype-beast millennials... SMH


----------



## medi01 (Jul 5, 2019)

efikkan said:


> It would be just a guess, but considering RX 5700 XT consumes 225W and the slightly higher clocked anniversary edition consumes 235W, it's a good indicator that AMD have pushed the chip beyond the "sweetspot".


Welp:

> AMD Radeon RX 5700 XT and RX 5700 review leaks out - VideoCardz.com
>
> A review by Benchmark.pl was published prematurely. As you might know, Radeon RX 5700 series reviews will be available on July 7th, alongside the AMD Ryzen 3000 series and X570 motherboards. Quite a day for tech enthusiasts and a lot...

----------



## bug (Jul 5, 2019)

Aand now we have graphs about Navi power draw in a thread about Turing Super. Neat.


----------



## jabbadap (Jul 5, 2019)

bug said:


> Aand now we have graphs about Navi power draw in a thread about Turing Super. Neat.



So? It covers the released Super cards too, thus it's not really off topic.


----------



## bug (Jul 5, 2019)

jabbadap said:


> So? It covers the released Super cards too, thus it's not really off topic.


It doesn't address the original assertion that Navi may be pushed beyond its sweet spot, either.


----------



## medi01 (Jul 5, 2019)

jabbadap said:


> So? It covers the released Super cards too, thus it's not really off topic.


So... 
The AMD 5700 and 5700 XT beat Nvidia on perf, perf/watt, and perf/$.
The AMD 5700 beats the 2060 Super (and costs 5% less).
The AMD 5700 XT beats the 2070 Super (and costs 10% less).

You were asking?


----------



## jabbadap (Jul 5, 2019)

bug said:


> It doesn't address the original assertion that Navi may be pushed beyond its sweet spot, either.



Well, you can't really make that kind of assertion until thorough reviews are out. There's not much detail on those leaked power figures either - "whole system gameplay"? Right. That might vary from game to game, temps might throttle clocks down, etc.




medi01 said:


> So...
> The AMD 5700 and 5700 XT beat Nvidia on perf, perf/watt, and perf/$.
> The AMD 5700 beats the 2060 Super (and costs 5% less).
> The AMD 5700 XT beats the 2070 Super (and costs 10% less).
> ...



I did ask @bug for some input, got my answer, and thanked him for it. Nothing to do with you.


----------



## efikkan (Jul 5, 2019)

Discussing competitors in a thread about Nvidia GPUs can be on topic as long as it relates to comparing them.
Power consumption under load varies a lot. Let's wait for proper reviews.


----------



## medi01 (Jul 5, 2019)

jabbadap said:


> Well can't really make that kind of assertions until thorough reviews are out.


Assertions can be made before, as soon as information is available.
Final conclusions can be made later.



Spoiler



Next time you want to communicate specifically with certain guys (whatever your reason for fancying people who claim cartels are OK just to make Nvidia look better), do it using the appropriate forum means, namely a personal message.

The forum also has a wonderful feature, to be used on buggy users, called "ignore".





efikkan said:


> Power consumption under load varies a lot.


What is that supposed to mean - undermining the power consumption figures on all the review sites?


----------



## Deleted member 178884 (Jul 5, 2019)

I can just say that I'm glad I moved from a 980 Ti to a 1080 Ti instead of waiting.


----------

