# NVIDIA GeForce GTX 980 and GTX 970 Pricing Revealed



## btarunr (Sep 16, 2014)

Apparently, NVIDIA is convinced that it has a pair of winners on its hands, with its upcoming GeForce GTX 980 and GTX 970 graphics cards, and is preparing to price them steeply. The GeForce GTX 980 is expected to start at US $599, nearly the same price as the GeForce GTX 780 Ti. The GTX 970, on the other hand, will start at US $399, danger-close to cannibalizing the GTX 780. 

Across the brands, the GTX 980 is launching at the same price at which AMD's Radeon R9 290X launched; the GTX 970 at that of the R9 290. AMD's cards have since settled down to $449 for the R9 290X, and $350 for the R9 290. Both the GTX 980 and GTX 970 will be available in non-reference board designs, although reference-design GTX 980 cards will dominate day-one reviews. Based on the 28 nm GM204 silicon, the GTX 980 features 2,048 CUDA cores, 128 TMUs, and 64 ROPs; the GTX 970 features 1,664 CUDA cores and 104 TMUs. Both feature 256-bit wide memory interfaces holding 4 GB of GDDR5 memory.
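For a sense of scale, the 256-bit interface implies the theoretical bandwidth below. This is a back-of-the-envelope sketch: the 7 Gbps effective GDDR5 data rate is an assumption taken from pre-launch leaks, not a confirmed spec.

```python
# Theoretical memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps)
bus_width_bits = 256
effective_rate_gbps = 7.0  # assumed from pre-launch leaks, not a confirmed spec

bandwidth_gbs = bus_width_bits / 8 * effective_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")  # 224 GB/s
```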

*View at TechPowerUp Main Site*


----------



## matar (Sep 16, 2014)

GTX 980 Should be $499


----------



## 64K (Sep 16, 2014)

This card is looking better to me for about the same price.

http://www.amazon.com/dp/B00GMTGJIU/?tag=tec06d-20

I will wait for reviews of the GTX 980 though.


----------



## manofthem (Sep 16, 2014)

The GTX980 looks pretty sexy, but at that price....all I can do is


----------



## The Von Matrices (Sep 16, 2014)

btarunr said:


> The GeForce GTX 980 is expected to start at US $599





btarunr said:


> the GTX 980 is launching at the same pricing AMD's Radeon R9 290X launched at


The launch price of the R9 290X was $549, not $599.


----------



## revanchrist (Sep 16, 2014)

980 at the price of the 780 Ti;
970 at the price of the 780;
probably a 960 at the price of the 770?
And maybe a future 980 Ti will set a new pricing milestone? And not to mention the Titans...
Performance aside, I can see a continuous trend of price increases with each new generation.


----------



## ISI300 (Sep 16, 2014)

nVidia is reaching new levels of greed. FYI, this is not how it usually plays out. There's no way GM204, with that 4-phase VRM and 8 memory chips on a 256-bit bus, warrants that high a launch price. It may perform well comparatively, but that's not enough justification for that price. Call me an AMD fanboy all day, but AMD launched the 290X at the same price as they launched the 7970 back in 2012, even though it had a considerably bigger die and a more complex 512-bit PCB.


----------



## erocker (Sep 16, 2014)

Price isn't bad for the 970, but that is quite a price premium for the 980 for what seems to be a less-than-premium performance boost. Maybe the leaked reviews have been wrong?


----------



## GhostRyder (Sep 16, 2014)

Something is not right here... 200 bucks more for not much of a performance boost, if the leaks are right.  If the 970, though, is at that price, it's going to be the best deal around!

I guess launch day will tell all and then we can see the actual performance numbers.


----------



## HumanSmoke (Sep 16, 2014)

GhostRyder said:


> Something is not right here... 200 bucks more for not much of a performance boost, if the leaks are right.  If the 970, though, is at that price, it's going to be the best deal around!


There is a huge difference between the 980 and 970 specs-wise. A salvage part usually has one (possibly two) compute units disabled; the 970 has *three* (13 SMM vs 16 SMM for the full-fat 980). By rights the 970 should be the 960 Ti, and the real 970 should be 1920 cores (15 SMM) or 1792 (14 SMM). If yields were bad there wouldn't be any 2048-shader GM204s, so the obvious rationale is a GTX 970 Ti (975?) at a later date, once it isn't in danger of cannibalizing GTX 980 sales too badly. The ~20% hardware disparity is pretty much the same as that between the GTX 780 Ti and GTX 780 - except that the full-fat GK110 didn't launch at the same time the 780 did.

Curious.

The GTX 980 should be $100 cheaper. The GTX 970 at $399 is more expected, but ideally it should be $50 less. I still wouldn't plump for either over a GK110 based on what I've seen.
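The SMM arithmetic behind those shader counts checks out; here is a quick sketch. The only assumed constant is Maxwell's 128 CUDA cores per SMM.

```python
CORES_PER_SMM = 128  # Maxwell groups 128 CUDA cores per SMM

smm_980, smm_970 = 16, 13
cores_980 = smm_980 * CORES_PER_SMM  # 2048, matching the full-fat 980
cores_970 = smm_970 * CORES_PER_SMM  # 1664, matching the 970

# The hypothetical 15- and 14-SMM salvage parts mentioned above:
print(15 * CORES_PER_SMM, 14 * CORES_PER_SMM)  # 1920 1792

disparity = 1 - cores_970 / cores_980
print(f"{disparity:.1%}")  # 18.8%, i.e. the "roughly 20%" disparity
```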


----------



## ZoneDymo (Sep 16, 2014)

2expensive


----------



## RejZoR (Sep 16, 2014)

From the looks of it, I'll be buying Radeon once again...


----------



## RCoon (Sep 16, 2014)

revanchrist said:


> 980 at the price of 780 Ti;
> 970 at the price of 780;
> probably 960 at the price of 770?
> and maybe a future 980 Ti will set a new pricing mileage? And not to mention the Titans...
> Performance aside, i can see a continuous trend of price increase with each new generation.



Wrong. The 780 Ti's price was once the 780's price when the 780 first came out. The 780 moved down the stack when the 780 Ti released, and the Ti card adopted its price tag.

Nobody else saw this coming? If anybody thought NVidia would release their top-tier card (which is still cut-down silicon and will be replaced by a Ti anyway) at a reasonable price, then they've obviously never bought a flagship card on release. I'll remind everyone that the 780 was £550 on launch day; just look at it now. If you don't like the price, don't buy it. Wait a fair few months and prices will naturally become slightly more reasonable.


----------



## the54thvoid (Sep 16, 2014)

If you add about 20% for UK prices, that equates to a shade under £450. That's higher than the GTX 680's release price but lower than the GTX 780's, although the architectural model is x04, not x10.

Sub £400 would be far better but hey, let's see what happens....

And let's not forget, the release price of its GTX 680 predecessor was dictated by market forces. Two companies play the price game, not just one.  It's just that one doesn't lower much after release while the other does.
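The "shade under £450" estimate above can be sanity-checked. The exchange rate below is an assumed, illustrative mid-2014 figure, not an official one:

```python
usd_msrp = 599
usd_per_gbp = 1.60  # assumed, illustrative mid-2014 exchange rate
uk_uplift = 1.20    # the ~20% VAT/regional markup cited above

gbp_estimate = usd_msrp / usd_per_gbp * uk_uplift
print(f"~£{gbp_estimate:.0f}")  # ~£449, "a shade under £450"
```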


----------



## dwade (Sep 16, 2014)

There is no reason to get new GPUs anyway when crappy MOBAs are the most played games on the PC. Meanwhile, console peasants continue to get AAA GOTY games nonstop.


----------



## HalfAHertz (Sep 16, 2014)

That leaves a huge $200 gap in between. What will they put in there? The 780 Ti?


----------



## Maban (Sep 16, 2014)

Let's all bitch and moan about speculation without knowing any full truths.


----------



## Sony Xperia S (Sep 16, 2014)

matar said:


> GTX 980 Should be $499



Even $529/$539/$549 would do better, I guess. 

The only thing that can stop nvidia from being so greedy is AMD: their pricing structure and new, faster cards as soon as possible.

As far as I understand, AMD will launch 20 nm cards next year, while nvidia will skip that node and go to 16 nm, perhaps in the beginning of 2016.


----------



## RCoon (Sep 16, 2014)

HalfAHertz said:


> That leaves a huge 200$ gap in between? What will they put in there? The 780Ti?



The same chip as the full-fat 980 with just a few things disabled; the 970 itself is actually quite cut down.


----------



## Sony Xperia S (Sep 16, 2014)

RCoon said:


> The same chip as the full fat 980 with just a few things disabled, the 970 itself is actually quite cut down.



Don't you worry. AMD will put nvidia in their place in Q1 2015 with new much faster and cheaper video cards.


----------



## pidgin (Sep 16, 2014)

what a shocker


----------



## THU31 (Sep 16, 2014)

Pathetic, absolutely pathetic. I love NVIDIA cards, but I hate NVIDIA; they have gone insane since the launch of the Titan.


----------



## adulaamin (Sep 16, 2014)

I'll stick with my 780 Ti for a while and see what AMD brings out. I've never really bought a card on release date, and it's been good for me. I hope the cards perform well enough to justify the prices, especially the 980.


----------



## TheoneandonlyMrK (Sep 16, 2014)

Seems the early reports of nvidia releasing conservatively priced GPUs were wrong again. A bit too expensive again for the 980 IMHO, but these are day-one prices I suppose.


----------



## rainzor (Sep 16, 2014)

3DCenter is just speculating on the price; they don't have it confirmed by nvidia or any other source. What the... do you guys know that this price is actually correct, and are just posting it this way to avoid NDA-related problems? I mean, I have some trouble believing $600 for the 980. That's 50% more money for 25% extra cores... nah man, it just doesn't add up. Sure is awful if true.
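The price-versus-cores complaint roughly checks out; a quick sketch using the rumored MSRPs and the core counts from the article (the core premium comes out closer to 23% than 25%):

```python
price_980, price_970 = 599, 399      # rumored launch MSRPs (USD)
cores_980, cores_970 = 2048, 1664    # CUDA core counts from the article

price_premium = price_980 / price_970 - 1  # ~0.50
core_premium = cores_980 / cores_970 - 1   # ~0.23

print(f"{price_premium:.0%} more money for {core_premium:.0%} more cores")
# 50% more money for 23% more cores
```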


----------



## Prima.Vera (Sep 16, 2014)

OK, we have the cards and all, but where are the games? I mean, seriously. Except for 2 or 3 games, this year was a complete disaster for the gaming industry. 
Absolutely no new game worth investing in a GPU for.


----------



## RCoon (Sep 16, 2014)

Prima.Vera said:


> OK, we have the cards and all, but where are the games? I mean, seriously. Except for 2 or 3 games, this year was a complete disaster for the gaming industry.
> Absolutely no new game worth investing in a GPU for.



BF4 still runs like shit for a select few unlucky people.


----------



## rtwjunkie (Sep 16, 2014)

I believe NDAs probably prohibit pricing releases too, so the price quotes here are speculation.  That being said, I honestly cannot fathom Nvidia charging top-level prices for what is a mid-level chip. The 780 and 780 Ti had those prices because both are based on the top-level chip, not the mid-level one, as the GM204 is. 

No matter the performance, pricing should reflect the chip being used, not the artificial image created by calling a GM204-based graphics card your "Flagship."


----------



## Sony Xperia S (Sep 16, 2014)

RCoon said:


> BF4 still runs like shit for a select few unlucky people.



You will be very badly disappointed, since GM204 doesn't offer anything new performance-wise compared to GK110. What GK110 was capable of delivering will now simply be delivered with lower power requirements. And nothing more.

nvidia is a really very very shitty company.

the 970 should be $250 and the 980 not more than $400.

nothing to see here, folks, move on and forget it. meh


----------



## 1d10t (Sep 16, 2014)

dwade said:


> There is no reason to get new GPUs anyway when crappy MOBAs are the most played games on the PC. Meanwhile, console peasants continue to get AAA GOTY games nonstop.



There's always a market for this. Like all that 4K craze, or Eyefinity like I do. 
No reason to buy a GPU stronger than a GTX 770 if you're still playing at 1080p, unless you're a numbers worshiper.


----------



## utengineer (Sep 16, 2014)

1d10t said:


> There's always a market for this. Like all that 4K craze, or Eyefinity like I do.
> No reason to buy a GPU stronger than a GTX 770 if you're still playing at 1080p, unless you're a numbers worshiper.



Agreed.  Since 4K monitors are going down in price, I imagine NVIDIA is targeting 4K performance with this launch by starting the cards at 4 GB of VRAM.  Hopefully the power efficiency will give GPU manufacturers room to sell cards with high factory overclocks.

I gave you Thanks as well due to your 1080p comment.  I have been running GTX580 3GB SLI since they launched.  I run close to Ultra in all games at 1080p.  Love these cards...great investment.


----------



## Recus (Sep 16, 2014)

Sony Xperia S said:


> You will be very badly disappointed, since GM204 doesn't offer anything new performance-wise compared to GK110. What GK110 was capable of delivering will now simply be delivered with lower power requirements. And nothing more.
> 
> nvidia is a really very very shitty company.
> 
> ...



So you are a fan of a phone priced at £467.99 but not a fan of a $500 GPU. What?
-------------
Final GTX 980 specs: http://videocardz.com/52362/only-at-vc-nvidia-geforce-gtx-980-final-specifications


----------



## Frick (Sep 16, 2014)

Recus said:


> So you are fan of phone priced at £467.99 but not a fan of $500 GPU. What?
> -------------



You can't be serious. He longs for older days, which is pretty daft too.


----------



## Sony Xperia S (Sep 16, 2014)

Recus said:


> So you are fan of phone priced at £467.99 but not a fan of $500 GPU. What?



Hmm, interesting price. When I bought my Xperia S, I gave them something around € 270-280.

And yes, my sweet spot for a new videocard is approximately that but preferably around $250 max.


----------



## RejZoR (Sep 16, 2014)

utengineer said:


> Agreed.  Since 4K monitors are going down in price, I imagine NVIDIA is addressing/targeting 4K performance with this launch having started the cards with 4GB VRAM.  Hopefully the efficiency in power will give GPU manufacturers the ability to sell cards with high factory overclocks.
> 
> I gave you Thanks as well due to your 1080p comment.  I have been running GTX580 3GB SLI since they launched.  I run close to Ultra in all games at 1080p.  Love these cards...great investment.



You can have 64 GB of VRAM, but if the card can't process enough pixels and texels to actually benefit from it, all that VRAM is a waste of the world's resources, our time, and money. But people buy them because more is always better, even if it really doesn't do anything for performance.


----------



## GhostRyder (Sep 16, 2014)

1d10t said:


> There's always a market for this. Like all that 4K craze, or Eyefinity like I do.
> No reason to buy a GPU stronger than a GTX 770 if you're still playing at 1080p, unless you're a numbers worshiper.


Indeed, and that is where these cards are targeted even with the 256bit bus.  The extra ram is going to bring these cards in line for high resolution (Err Ultra HD) gaming with better efficiency.



rtwjunkie said:


> I believe also that NDA's probably prohibit pricing releases too, so the price quotes here are speculation.  That being said, I honestly do not fathom Nvidia charging top-level prices for what is a mid-level chip. The 780 and 780Ti had those prices because both are based on the top-level chip, not the mid-level, as the GM204 is.
> 
> ...


Could not agree more; it's still a little early to be judgmental, because for all we know the pricing could end up being 100 bucks less, or even 50 more.  I also agree that the pricing should reflect the card's performance and not a number in its name.

Indeed, heck, there are many cards you can get away with to keep Ultra or near-Ultra at 1080p without breaking the bank.  I have a friend still running 1.5 GB GTX 580s, which are finally showing their age for him, but he can still play most games near or at Ultra.


RejZoR said:


> You can have 64GB of VRAM, but if card can't process enough pixels and texels to actually benefit from it, all that VRAM is a waste of worlds resources, our time and money. But people buy them because more is always better, even if it really doesn't do anything to performance.


Well, yes, but looking at the performance spots of these cards based on the leaked information, they have enough raw power in SLI to run many (if not most) games at 60 FPS at 4K; VRAM, however, was and would be an issue, as 3 GB was just not enough.  These two will have enough, and should be decent cards for 4K among the options out there.  In all honesty, we are not going to see single-GPU 4K (well, above 30 FPS) until probably the next series, so it's good to have some cards that can run together and achieve it.  My 290X trio may be a bunch, but together they can run 4K pretty effectively, and I believe these will do the same based on the numbers.  But I guess time will tell...


----------



## Casecutter (Sep 16, 2014)

*Btarunr, thanks for the revisionist history!*

Let's go in the way-back machine to Nov 7th, 2013, and what W1zzard said: *"Here you are: The GeForce GTX 780 Ti, NVIDIA's gag-reflex to AMD's Radeon R9 290X.  It took the $549 R9 290X and the humble $399 R9 290, launched over the past fortnight, to kick NVIDIA in its rear behind hard enough for price slashes anywhere between 17 and 23 percent to its then $650 GeForce GTX 780... It's on this back-drop that the GeForce GTX 780 Ti is coming to town, a $699 graphics card that almost maxes out the GK110 silicon."*

So business... as usual.  What we get, at this point, is a reportedly smaller die with the bus cut to 256-bit, while finally including 4 GB. All that when three months ago the narrative was "compete with better pricing", which on the surface a 970 will do in the meantime. Though consider there are still 15/14-SMM parts that almost assuredly will appear; safe bet there's a 970 Ti coming, and given the wide breadth between $400-600, its price is anyone's guess. Nvidia could let the 770 fade much like the 660 Ti did, just perhaps quicker. Or it could be more of a replay of the 780/780 Ti releases, but quicker; on that note, a 970 with a $100 drop in 3-4 months would be interesting.  That then begs the question of the 960 (which I understand is also from GM204): its release and price ($250?).

I see the 970 as one-upmanship against the 290/780, sure, with improved efficiency.  The 980 is a good replacement for OC custom 780 Tis (much like what a 285 did for a 280); so not much, if any, movement on price, all the while cultivating better margins than the use of a GK110 would allow.


----------



## utengineer (Sep 16, 2014)

RejZoR said:


> You can have 64GB of VRAM, but if card can't process enough pixels and texels to actually benefit from it, all that VRAM is a waste of worlds resources, our time and money. But people buy them because more is always better, even if it really doesn't do anything to performance.



I am not a GPU expert, but it appears NVIDIA is addressing that by giving us a GTX 980 with 64 ROPs.  The 780 Ti only has 48.  That is a huge increase, and it should benefit 4K, since the ROPs do the heavy lifting for the final pixels going to the monitor.  I think a lot of people are underestimating this series.
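The ROP increase works out to a third more; a quick check using the figures in the post:

```python
rops_980, rops_780ti = 64, 48  # ROP counts cited in the post above

increase = rops_980 / rops_780ti - 1
print(f"{increase:.0%} more ROPs")  # 33% more ROPs
```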


----------



## kdawgmaster (Sep 16, 2014)

I can already tell you guys this is false. I work at a computer store, and it's already in our system at way higher. In our system the GTX 980 reads at $800 and the GTX 970 at $500. Now, sometimes they place it a little higher to be on the safe side, but we can expect the GTX 980 to be closer to the $650-700 mark.


----------



## Hilux SSRG (Sep 16, 2014)

What is NVidia smoking to think 980 will sell at $599?  If reviews put the 980 between 780 and 780ti performance, good luck to them.

And lastly, it's not even their top tier, that would be the GM210.


----------



## N3M3515 (Sep 16, 2014)

HumanSmoke said:


> There is a huge difference between the 980 and 970 specs-wise. A salvage part usually has one (possibly two) compute units disabled; the 970 has *three* (13 SMM vs 16 SMM for the full-fat 980). By rights the 970 should be the 960 Ti, and the real 970 should be 1920 cores (15 SMM) or 1792 (14 SMM). If yields were bad there wouldn't be any 2048-shader GM204s, so the obvious rationale is a GTX 970 Ti (975?) at a later date, once it isn't in danger of cannibalizing GTX 980 sales too badly. The ~20% hardware disparity is pretty much the same as that between the GTX 780 Ti and GTX 780 - except that the full-fat GK110 didn't launch at the same time the 780 did.
> 
> Curious.
> 
> The GTX 980 should be $100 cheaper. The GTX 970 at $399 is more expected, but ideally it should be $50 less. I still wouldn't plump for either over a GK110 based on what I've seen.



+1
Do you remember when a new-generation card was like 70% more performance than the one it was replacing, at the same price? Good old times... (Add to that, no more than a year and a half passed between generations. Right now they have their mouths full saying "new generation" when in reality these are no more than miserable refreshes and incremental updates.)


----------



## N3M3515 (Sep 16, 2014)

I would prefer they released video cards every 4 years and presented us with a 100% performance increase, rather than these 15% yearly increases, fucking milking customers and stupid people. For every 15% increase they charge like a 200-buck premium and call it a "new generation". BS.


----------



## 64K (Sep 16, 2014)

N3M3515 said:


> +1
> Do you remember when a new-generation card was like 70% more performance than the one it was replacing, at the same price? Good old times... (Add to that, no more than a year and a half passed between generations. Right now they have their mouths full saying "new generation" when in reality these are no more than miserable refreshes and incremental updates.)



Well, the card that the GTX 980 is actually replacing is the GTX 680 and it will no doubt be considerably faster. This is the first release of the GM204 replacing the first release of the GK104. The refresh of the GTX 980 (whatever they call it) will replace the refresh of the GTX 770. Nvidia has caused some confusion with their naming since GTX 680 came out. The big Maxwell will be the GM210 which will replace the GTX 780 and 780 Ti.


----------



## Sony Xperia S (Sep 16, 2014)

N3M3515 said:


> I would prefer they released video cards every 4 years and presented us with a 100% performance increase, rather than these 15% yearly increases, fucking milking customers and stupid people. For every 15% increase they charge like a 200-buck premium and call it a "new generation". BS.



That's why I am still with my oldie but goldie R6870. It serves me perfectly well and I don't find a need to be fed by those consumer-trolls from nvidia. 

Buy a new card once in 4-5-6 years and you will be doing great! But, of course, they can release meanwhile whatever they want and consider appropriate.

What I'm saying is basically that you do NOT need a new card every year or two!


----------



## GhostRyder (Sep 16, 2014)

64K said:


> Well, the card that the GTX 980 is actually replacing is the GTX 680 and it will no doubt be considerably faster. This is the first release of the GM204 replacing the first release of the GK104. The refresh of the GTX 980 (whatever they call it) will replace the refresh of the GTX 770. Nvidia has caused some confusion with their naming since GTX 680 came out. The big Maxwell will be the GM210 which will replace the GTX 780 and 780 Ti.


Indeed, but even comparing year to year, GTX 580 to 680, the new 680 was much faster than its predecessor (heck, the 670 was faster), while this one is actually turning out to be more of a match, just with lower power consumption.  Of course, that's still based on leaks; either way it will come down to the end results.

I am also curious, after seeing the full specs, how far the clocks have been pushed on this card.  The base and boost speeds are significantly higher than those of any of the previous cards, which makes me wonder how far overclocking will go.  This could be a great card that pushes up to 1400+ MHz, or one that only reaches 1300 MHz, which would lead me to believe they heavily overclocked it just to match the performance of previous-gen cards.


----------



## Roel (Sep 16, 2014)

N3M3515 said:


> I would prefer they released video cards every 4 years and presented us with a 100% performance increase, rather than these 15% yearly increases, fucking milking customers and stupid people. For every 15% increase they charge like a 200-buck premium and call it a "new generation". BS.



Or just buy a new card every 4 years, it would basically be the same. You make the choice to buy something.


----------



## Fluffmeister (Sep 16, 2014)

Roel said:


> Or just buy a new card every 4 years, it would basically be the same. You make the choice to buy something.



Indeed, if people are getting milked it's their fault.


----------



## 64K (Sep 16, 2014)

GhostRyder said:


> Indeed, but even comparing year to year the GTX 580 - 680 the new 680 was much faster than its predecessor (Heck the 670 was faster) while this is actually turning out to be more of a match but with lower power consumption.  Of course that's based on leaks still but either way it still is going to come down to the end results.
> 
> I am also curious after seeing the full specs how far the clocks have been pushed for this card as well.  It is significantly higher speed base and boost than that of any of the previous cards which makes me wonder how far overclocking this card will hit.  We could run into this being pushed up to 1400+mhz, or just up to 1300mhz on a great card which would lead me to believe they heavily overclocked this card to match performance of previous gen cards.



But bear in mind that the GTX 680 was a die shrink as well as a new architecture, so the increased efficiency allowed room to increase performance per watt over the GTX 580. Maxwell is a more efficient architecture, but it's still going to be on the 28 nm process, so this is a little different from the transition from the 580 to the 680. Really, if you want to look at it the way it should have been: the GTX 680 (GK104) replaced the GTX 460 (GF104).

The GTX 680 was quite a leap in performance. It took a lot of people by surprise including AMD.


----------



## GhostRyder (Sep 16, 2014)

64K said:


> But bear in mind that the GTX 680 was a die shrink as well as a new architecture so the increased efficiency allowed room to increase the performance per watt over the GTX 580. Maxwell is a more efficient architecture but it's still going to be on the 28nm process so this is a little different from the transition from the 580 to the 680. Really if you want to look at it the way it should have been. The GTX 680 (GK104) replaced the GTX 460 (GF104).
> 
> The GTX 680 was quite a leap in performance. It took a lot of people by surprise including AMD.


True, but I am also looking at it from the point of view of a new-generation card (on top of the name jump as well).  I do not think this card is bad: it has more VRAM, excellently clocked VRAM, high boost and base clocks, and, to top it off, greater efficiency while performing well.  It's just that, with the price and where the leaks lie (which I still believe could end up being wrong, and we will all get a shock), it's not really winning any wars or providing excellent value, even viewing it as a top-dog card with a price knock-up.  The 970 seems to be the real deal here, with its performance spot and price point on top of all the aforementioned upgrades.

I still think of it as a next-gen card no matter how the chip is labeled (or where it is supposed to fall), because this is still the GTX 980 and it should perform better than the previous highest contender.  I guess, because the GTX 780 Ti is a wildcard, we could say it does outperform the GTX 780, but that depends on your viewpoint.  I will still reserve my final judgment until we actually see it in the full light.


----------



## Slizzo (Sep 16, 2014)

As long as I will be able to get another 780 for around $300 I'll be happy.


----------



## 64K (Sep 16, 2014)

GhostRyder said:


> True, but I am also looking at it from the point of a new generation card (On top of the name jump as well).  I do not think that this card is bad as it has more VRAM,  excellent clocked VRAM, high boost and base clocks, and to top it off greater efficiency while performing well.  It just with the price and where the leaks lie (Which I still believe could end up being wrong and we will all get a shock) its not really winning any wars or really providing an excellent value even with viewing it as a top dog card with a price knock up.  The 970 seems to be the real deal here with its performance spot and price point on top of all the aforementioned upgrades.
> 
> I still think your next gen card no matter what the chip is labeled (Or where it is supposed to fall) because this is still the GTX 980 and it should perform better than the previous highest contender.  I guess though because the GTX 780ti is a wildcard we could then say well this does outperform better than the GTX 780, but it depends on your viewpoint.  I still will reserve my final judgment when we actually see it in the full light.



Yeah, the $600 price tag on the GTX 980, if true, is too high. The only point I was trying to make is that the GTX 980 isn't a top end card. It's a midrange Maxwell GPU that won't even have the benefit of greater efficiency with a die shrink. The top end Maxwells will be the GM210. If Nvidia were releasing a die shrink 250 watt GM210 it would run all over a GTX 780Ti (GK110). That won't come until next year some time though.


----------



## rtwjunkie (Sep 16, 2014)

Slizzo said:


> As long as I will be able to get another 780 for around $300 I'll be happy.


 
Indeed!  I'm awaiting the mad rush of 780s and 770s dumping into used-for-sale threads and onto eBay as the masses can't wait to hand over their cash for a "two-generational-upgrade-so-it-must-be-good" video card.


----------



## Slizzo (Sep 16, 2014)

64K said:


> Yeah, the $600 price tag on the GTX 980, if true, is too high. The only point I was trying to make is that the GTX 980 isn't a top end card. It's a midrange Maxwell GPU that won't even have the benefit of greater efficiency with a die shrink. The top end Maxwells will be the GM210. If Nvidia were releasing a die shrink 250 watt GM210 it would run all over a GTX 780Ti (GK110). That won't come until next year some time though.



Same thing happened with the GTX680 though. GK104, not top range which would have been GK100, but still performed quite well, and was still quite an upgrade from GF110.


----------



## GhostRyder (Sep 16, 2014)

64K said:


> Yeah, the $600 price tag on the GTX 980, if true, is too high. The only point I was trying to make is that the GTX 980 isn't a top end card. It's a midrange Maxwell GPU that won't even have the benefit of greater efficiency with a die shrink. The top end Maxwells will be the GM210. If Nvidia were releasing a die shrink 250 watt GM210 it would run all over a GTX 780Ti (GK110). That won't come until next year some time though.


Could not agree more, I just wish for more performance and power for the top tier labeled GPUs.  But I am still reserving my full judgment until everything is out in the open.


----------



## ironwolf (Sep 16, 2014)

Now an equally important question: how much of a price premium (read price hike) will places like Newegg, etc. put on the cards the first few days/weeks?


----------



## 64K (Sep 16, 2014)

ironwolf said:


> Now an equally important question: how much of a price premium (read price hike) will places like Newegg, etc. put on the cards the first few days/weeks?



Depends on supply and demand. If the reviews are favorable I expect demand will be high.


----------



## claes (Sep 16, 2014)

Since when did anything sell at MSRP? Doesn't a $600 MSRP mean ~$500-$520 for reference at launch, $550 for aftermarket?


----------



## GhostRyder (Sep 16, 2014)

claes said:


> Since when did anything sell at MSRP? Doesn't a $600 MSRP mean ~$500-$520 for reference at launch, $550 for aftermarket?


If Nvidia says the price is $600 on a card, then that is the reference model's price, while aftermarket cards will carry a markup, depending on what the board partner decides its overclock and cooler should cost on top of the base price.  Things are always subject to change, and of course this is still just a leak.


----------



## HumanSmoke (Sep 16, 2014)

N3M3515 said:


> +1
> Did you remember when a new generation vcard was like 70% more perf than the one it was replacing, and the price the same? good old times... (add to that no more than a year and a half passed between generations, right now they have their mouths full saying "new generation" when in reality are no more than miserable refreshes and incremental updates)


Big improvements usually accompany a new process node *and* a new architecture; the reality is that this combination seldom eventuates. The other major point is that silicon gains become more incremental as process prices increase. You also won't see the vast improvements of the past, for the simple reason that there was much more to learn and implement back in the day. CPUs today, for instance, don't show the huge leaps of the 8086 -> 80286 era.
GK110 > GF100.........104% improvement
GF100 > GT200.........58.3% improvement
GT200 > G80.............38.2% improvement

GK104 > GF104.........104% improvement
GF104 > G92.............56.8% improvement

Hawaii > Tahiti..........33.3% improvement
Tahiti > Cayman........38.9% improvement
Cayman > Cypress....16.3% improvement
Cypress > RV770......100% improvement

Pitcairn > Barts.........51.5% improvement
Barts > Juniper.........66.7% improvement
Juniper > RV740.......51.5% improvement


----------



## mcraygsx (Sep 16, 2014)

Sony Xperia S said:


> You will be very badly disappointed, since GM204 doesn't offer anything new performance-wise compared to GK110. What it was capable of delivering will now simply be delivered with lower power requirements. And nothing more.
> 
> nvidia is a really very very shitty company.
> 
> ...



What this person said!

They are asking a lot of money for a mid-tier product. Within the next 6 months they will be introducing GM210.


----------



## Scrizz (Sep 17, 2014)

The Von Matrices said:


> The launch price of the R9 290X was $549, not $599.



I was getting ready to post the same thing.


----------



## N3M3515 (Sep 17, 2014)

HumanSmoke said:


> Big improvements usually accompany a new process node *and* new architecture. The reality is this actually seldom eventuates. The other major point is that silicon gains become more incremental as the process price increases. You also won't see vast improvements as in the past for the simple reason that there was much more to learn and implement back in the day. CPUs today for instance don't show the huge leaps shown by the 8086 -> 80286.
> 
> GK 110 > GF100.........104% improvement
> GF100  > GT200.........58.3% improvement
> ...



You're right about node and architecture; things advance at a much slower pace nowadays...

About the chart though...
The way I saw the NVIDIA part (maybe I'm crazy, I don't know):

*GK110(GTX780) > GK104*.........19% improvement -msrp 650 vs 500 - year launched 2013 (may)
*GK104(GTX680) > GF110*.........19% improvement -msrp 500 vs 500 - year launched 2012 (march) (you could say that gk104 is a midrange part, and you're right, but to US customers it was sold at high end price, sorry.)
*GF110(GTX580) > GF100*.........11% improvement -msrp 500 vs 500 - year launched 2010 (november)
*GF100(GTX480)  > GT200*.........49.2% improvement -msrp 500 vs 650 - year launched 2010 (march)
*GT200(GTX280)  > G80*.............37% improvement -msrp: 650 vs 520 - year launched 2008 (june)

Now Radeon

*Hawaii(R9290X) > Tahiti*.........41.3% improvement -msrp: 550 vs 550 - year launched 2013 (october)
*Tahiti(HD7970) > Cayman*....29.8% improvement -msrp: 550 vs 370 - year launched 2011 (december)
*Cayman(HD6970) > Cypress*...13.4% improvement -msrp: 370 vs 400 - year launched 2010 (december)
*Cypress(HD5870) > RV770*....49% improvement -msrp: 400 vs 300 - year launched 2009 (september)
*RV770(HD4870) > RV670*....57% improvement -msrp: 300 vs 250 - year launched 2008 (june)


----------



## HumanSmoke (Sep 17, 2014)

N3M3515 said:


> You're right about node and architecture, things advance at a much slower pace nowadays...
> 
> About the chart thou...
> The way i saw the nvidia part (maybe i'm crazy i don't know)
> ...


The GTX 780 isn't a full-die part. I also specified process node and architecture. Your post that I quoted made no mention of pricing, which can be subjective depending upon availability, geographic distribution area, and competition.
Even if you could compare a 20%-castrated GK110 (GTX 780), the closest analogue in GK104 is the OEM GTX 660 or 760 (1,152 shaders active of 1,536).


N3M3515 said:


> * GK104(GTX680) > GF110*.........19% improvement -msrp 500 vs 500 - year launched 2012 (march) (you could say that gk104 is a midrange part, and you're right, but to US customers it was sold at high end price, sorry.)


Regardless, I was talking about GPU hierarchy, not price segment


N3M3515 said:


> .........11% improvement -msrp 500 vs 500 - year launched 2010 (november)
> *GF100(GTX480)  > GT200*.........49.2% improvement -msrp 500 vs 650 - year launched 2010 (march)


The GTX 480 also isn't a full-die part


N3M3515 said:


> Now Radeon
> 
> *Hawaii(R9290X) > Tahiti*.........41.3% improvement -msrp: 550 vs 550 - year launched 2013 (october)
> *Tahiti(HD7970) > Cayman*....29.8% improvement -msrp: 550 vs 370 - year launched 2011 (december)
> ...


I used the highest resolution for the percentages (2560x1440) where possible - overall figures include resolutions like 1024x768...hardly indicative of the GPUs being discussed. I also only looked at fully enabled GPUs. Salvage parts aren't indicative since the degree that they are castrated will differ between architectures.
Nice use of colour - very vibrant.


----------



## Casecutter (Sep 17, 2014)

N3M3515 said:


> You're right about node and architecture, things advance at a much slower pace nowadays...
> 
> About the chart thou...
> The way i saw the nvidia part (maybe i'm crazy i don't know)


 
I see the argument of MSRPs for "like" replacements (segments) as very valid.  Even though the GTX 480 was not a full part, it was still the utmost replacement offered to the gaming market for the previous high end.  You can say you have a higher-HP motor back at the garage, but _you have to race what you brung_.

There's also a part of this that isn't taken into context: the price increases both incur from TSMC, like the rumored 20-25% increase enacted on 28nm wafers. Sure, it's moot in the pricing between the two of them, but it's still a factor passed on to us, so the escalation in pricing isn't just AMD/Nvidia.

Another piece that needs consideration: up until Kepler, Nvidia purchased individually the chips that achieved their requirements.  Whether or how that helped or hindered, I can't say.  Now they buy whole wafers and harvest many more variants, as was the case with GK104.  I'm not saying there's anything wrong with it...  It's just a different business structure, and it would lead one to figure it's enhanced the margins they can extract from each wafer.  It's also why the GTX 680's price was maintained even though it was the midrange part and 28nm costs went up.


----------



## The Von Matrices (Sep 17, 2014)

Casecutter said:


> Another piece that needs consideration: up until Kepler, Nvidia purchased individually the chips that achieved their requirements.  Whether or how that helped or hindered, I can't say.  Now they buy whole wafers and harvest many more variants, as was the case with GK104.  I'm not saying there's anything wrong with it...  It's just a different business structure, and it would lead one to figure it's enhanced the margins they can extract from each wafer.  It's also why the GTX 680's price was maintained even though it was the midrange part and 28nm costs went up.


Remember that this pricing structure also disincentivizes NVidia from being an early adopter of a new process node (with its higher defect rate), since NVidia now directly pays for all the defects; that's one of the reasons we have a 28nm GM204.


----------



## Casecutter (Sep 17, 2014)

The Von Matrices said:


> Remember that this pricing structure also disincentives NVidia from being an early adopter of a new process node (with higher defect rate) since NVidia directly pays for all the defects, thus one of the reasons we have a 28nm GM204.


I might be missing your point.
I thought in the old arrangement Nvidia didn't pay for parts that couldn't meet the specification?  They went later on shrinks because TSMC would work through "risk production" before getting Nvidia's stuff going?

The RV670 of the HD 3870 was 55nm, Nov 2007; wasn't the G92+ of the 9800GTX+ the first 55nm NVIDIA release, July 2008?
The RV740 (pipe cleaner) of the 4770 was 40nm, May 2009, and even the 5870 (RV870) was Sept 2009; while wasn't the GTX 480 Nvidia's first big iteration on 40nm, and that was March 2010?
And for 28nm, the 7970 was out a good 3 months before the GTX 680 (both being afflicted by what I understood as TSMC teething pains).  That was the first time Nvidia took the reins to test and bin each chip themselves.

I could be mistaken on those releases, and others may show different GPUs (professional) or dates, but for gamers this is the trend I recall.


----------



## GhostRyder (Sep 17, 2014)

Casecutter said:


> I might be missing your point.
> I thought in the old arrangement Nvidia didn't pay for parts that couldn't meet the specification?  They went later on shrinks because TSMC would work through "risk production" before getting Nvidia's stuff going?
> 
> The RV670 was 55nm, Nov 2007; wasn't the G92+ of the 9800GTX+ the first 55nm NVIDIA release, July 2008?
> ...


Nvidia and AMD do not, to an extent, pay for parts that fail the full quality standards; that is why we sometimes end up with cut-down variants of chips, and why some cards have so many variants using the same part (GK110, for instance).  Normally each chip receives rigorous testing by AMD or Nvidia to see if it meets (insert requirements here), and depending on the result it becomes the higher chip or moves down the line.  If it does not meet the first set of requirements, it moves on to the next round of testing against the next set of requirements, and so on down the line.  It's a rinse-and-repeat cycle, which is why we end up with cards like the R9 290, GTX 780, and what not.

There are exceptions to that rule.  At times chips are simply turned into lower parts, or tested against only the lower requirements, because demand for the lower part is so high.  For example, early in the run some R9 290 variants did not have their disabled CUs laser-cut and could later be unlocked to the full core; I believe that was done because people were buying the 290 like hotcakes, more so than the 290X, and while yields may not have been as good, they ended up selling a lot more.  That's a minute situation, of course, not something that happens every generation, but even the 6950 had a similar thing, if you will.

There is an article somewhere that refers to this, let me see if I can find it...
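The fall-through testing cascade described above can be sketched in code. A minimal, hypothetical bin function for a GK110-style die: the SKU names are real products, but every spec number here is made up purely for illustration.

```python
# Hypothetical sketch of a bin cascade: each die is tested against the
# most demanding SKU's requirements first, then falls through to
# progressively cut-down SKUs. Spec numbers are illustrative only.

def bin_die(working_smx: int, stable_mhz: int) -> str:
    """Assign a die to a SKU based on test results (illustrative thresholds)."""
    # Ordered from most to least demanding: (SKU, min working SMX, min stable MHz)
    sku_requirements = [
        ("GTX TITAN", 14, 837),   # fully enabled (for retail purposes) part
        ("GTX 780",   12, 863),   # salvage part: extra SMX units fused off
    ]
    for sku, min_smx, min_mhz in sku_requirements:
        if working_smx >= min_smx and stable_mhz >= min_mhz:
            return sku
    return "set aside / future salvage SKU"

print(bin_die(15, 900))  # -> GTX TITAN
print(bin_die(12, 900))  # -> GTX 780
print(bin_die(10, 900))  # -> set aside / future salvage SKU
```

The "set aside" bucket is exactly where later cards like the GTX 780 Ti's lesser siblings come from: once enough dies accumulate at some lower spec, a new SKU can be slotted in for them.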


----------



## HumanSmoke (Sep 17, 2014)

Casecutter said:


> I thought in the old arrangement Nvidia didn’t pay for parts that couldn’t meet the specification?  They went later on shrinks because TSCM would work through "risk production" before getting Nvidia stuff going?


Pretty much correct. This applied for the era of 80nm to 40nm, where ATI (AMD) led the process. Most of that stemmed from the fact that Nvidia used to lead the process prior to this; but when TSMC screwed up their 110nm process, Nvidia had to jump onto IBM's 130nm FSG, while ATI persevered with TSMC's existing 130nm Lo-K and transitioned to the 110nm process when it was fixed (R430 I think, so maybe early-mid 2004). ATI thus had a lead of a couple of months or so on Nvidia - who have taken a more cautious approach with TSMC since then, although both companies had 28nm wafer starts in the same time frame.


----------



## Casecutter (Sep 18, 2014)

This got much more long-winded than I originally intended. I could be off in left field, but here's what I see; straighten me out as needed.

They both purchase wafers at basically a set price, no matter the number of parts or the complexity.  Traditionally there's been a predetermined yield percentage for tier 1 and tier 2 parts once risk production is satisfied.  There are cost allocations for at-risk wafers (they may not pay full price, depending on the problems), but once projected production yields hold to "X" percentages, things start.  There's the unwritten objective that more tier 1 and 2 parts should be "harvested" as production matures.  Parts outside the agreed yields aren't so much "defects" as parts that can be salvaged/cut back/fused/gelded to produce other, lower variants.

I believe that early in the old deal, TSMC would go to Nvidia and show they had found "tier 3/4" chips in volumes that could be used for a GSO or some other variant.  Nvidia wasn't contractually obliged to take them, but given the margins they paid for tier 1 and 2 parts, it was very lucrative to find homes for them.  However, with each shrink the sorting became less worthwhile for TSMC, and Nvidia was compelled to make use of remnants or see their part costs skyrocket.  With Fermi I see Nvidia embracing a thoughtful architecture layout to give them more flexibility past the traditional two tiers, working toward the imminent day when the arrangement with TSMC wasn't flexible enough.  The arrangement had long since morphed into basically how AMD toiled, and my synopsis is still... basically just two tiers.  Don't get me wrong, the old arrangement was shrewd and aided Nvidia for a good many years, but TSMC no longer wanted to sort chips, and Nvidia had long recognized they needed to change.

Now this is where Nvidia gets accolades. When transitioning to 28nm (the point at which TSMC not only raised prices but started true parity in wafer cost for both), they smashed the "just two tiers" yield concept with Kepler, fully embracing the notion that multiple specs can be discerned from a wafer.  I think Nvidia also worked very studiously on developing apparatuses (machines) that rapidly test, sort, and bin in one quick operation.  The long-established method was (is) to segregate tiers 1 and 2, set the rest aside, and come back later to see what the remnants might give; that is not what Nvidia does now.  I believe Nvidia not only designs this in "by design" but, more importantly, identifies almost immediately the various iterations they could release, recognizing when a particular spec can be offered in volume and priced to slot into the market.  They realize their potential product stack much earlier, meaning they can be much more forward-thinking from the first good wafers, rather than reactionary weeks later.
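The wafer-economics argument above can be made concrete with a toy calculation. Everything here - the wafer price, die count, and yield percentages - is hypothetical, but it shows why harvesting extra tiers matters once you pay a fixed price per wafer:

```python
# Illustrative sketch: a fixed wafer price spread over every die that can
# be sold in *some* tier. All numbers are made up for illustration.

def cost_per_sellable_die(wafer_price: float, dies_per_wafer: int,
                          tier_yields: dict) -> float:
    """Effective silicon cost per sellable die, given per-tier yield fractions."""
    sellable = dies_per_wafer * sum(tier_yields.values())
    return wafer_price / sellable

WAFER_PRICE = 5000.0   # hypothetical 28nm wafer price
DIES = 200             # hypothetical dies per wafer for a GM204-sized chip

# Selling only a full (tier 1) and one salvage (tier 2) SKU...
two_tier = cost_per_sellable_die(WAFER_PRICE, DIES,
                                 {"tier1": 0.40, "tier2": 0.25})
# ...versus also harvesting further cut-down tiers from the same wafers.
four_tier = cost_per_sellable_die(WAFER_PRICE, DIES,
                                  {"tier1": 0.40, "tier2": 0.25,
                                   "tier3": 0.15, "tier4": 0.10})

print(f"2-tier cost per sellable die: ${two_tier:.2f}")   # $38.46
print(f"4-tier cost per sellable die: ${four_tier:.2f}")  # $27.78
```

With the wafer price fixed, every additional die that finds a home in some lower SKU dilutes the cost of all the others - which is the margin incentive to design chips so that partially defective dies are still sellable.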


----------



## HumanSmoke (Sep 18, 2014)

^^^^ That is part of the scenario that played out for sure. The other part is that when GPUs first started out there were a lot of vendors, and a lot of pure-play foundry companies catering to them. TSMC's growth allied with the ever decreasing range of vendors basically cut the options of both Nvidia and ATI.
Back in the day (we're talking 800nm - 250nm here), TSMC had competition - lots of it - ST Micro (Nvidia, 3DLabs, ATI, VideoLogic/PowerVR), LSI (Number Nine), LG/Toshiba (Chromatic Research), UMC (S3/XGI, ATI, Matrox, Nvidia), Mitsubishi (Rendition), NEC (Matrox, VideoLogic/PowerVR), IBM (3DLabs, Nvidia), UICC (Trident), MiCRUS (Cirrus Logic), Fujitsu (S3), SMC (SiS), Texas Instruments (Chromatic, 3DLabs) - those are the main ones I remember, with the GPU vendors usually associated with them, but there are quite a few others (Lockheed-Martin I think only produced for Intel).

As the vendors disappeared, the foundries specializing in large ICs decreased also - some got out of the business by choice, some because their contracts weren't sufficient to remain competitive - but basically TSMC ended up ruling the pure-play foundry business, and ATI/AMD and Nvidia became hamstrung. That's what I was talking about earlier, when TSMC's 110nm process got into difficulties. The previous 130nm node had ATI and Nvidia launch at the same time (within a week of each other). 110nm was delayed, forcing a choice between IBM's 130nm FSG process and TSMC's 130nm Lo-K - Nvidia chose IBM for the FX 5700s, and ATI chose Lo-K for the 9600 XT as a stop-gap until 110nm came onstream. Basically, as soon as TSMC slipped, Nvidia and ATI were scrambling, and it's been that way since early 2003.


----------



## Casecutter (Sep 18, 2014)

HumanSmoke said:


> ^^^^ Basically as soon as TSMC slipped, Nvidia and ATI were scrambling, and its been that way since early 2003.


This is why I had so hoped, almost blindly, that AMD had gone to GloFo with Tonga. If that part had been produced by GloFo and come out like that, I could've gotten behind it more, if for no other reason than that it could have been perhaps the first true change in the market, something I know I'm waiting for. But there's tomorrow...


----------



## Slizzo (Sep 19, 2014)

Casecutter said:


> This is why I had so hoped, almost blindly, that AMD had gone to GloFo with Tonga. If that part had been produced by GloFo and come out like that, I could've gotten behind it more, if for no other reason than that it could have been perhaps the first true change in the market, something I know I'm waiting for. But there's tomorrow...



Problem with Global Foundries is that they're consistently behind the curve in process technology, and AMD can ill afford to have their GPUs wait for GloFo to get on board with smaller processes while nVidia soaks up TSMC's capacity at the reduced sizes.  AMD's CPU business is getting beat up because of this as well; Intel can afford to work on process technology at an alarming pace, while GloFo just doesn't have the cash to sink into that kind of development cycle.


----------

