
NVIDIA GeForce GTX 980 4 GB

Well, there is no doubt that a 290X uses much more power; it's plainly obvious given that the 980 uses less power than the 780 Ti, which uses less power than the 290X. That being said, I find it hard to see a second 980 only adding 100 watts, as that seems like a lowball figure considering power usage under high stress is around ~180 watts for a single card on its own, depending of course on clocks and cooler. However, it's still significantly less in the long run...
Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage comparison needs to take workload into account.
Tonga is similar to the GM 107 chip in that it was a new chip meant to give a taste of something to come. The difference here is that it was designated to replace the aging Tahiti architecture while showing us a few improvements and offering a decent midrange price.
I think you'll find that Tonga isn't Tahiti's successor (just as GM 204 isn't GK 110's successor - successors don't usually have the same ballpark performance as the chip they're replacing); it is Pitcairn/Curacao's successor. Tahiti's successor will be Bermuda (Pirate Islands). Fiji, AMD's large-die answer to GM 200, has no current analogue in AMD's lineup. BTW: Tonga is Volcanic Islands, not Pirate Islands. There is an overlap of architectural tweaks that crosses GPU series with AMD (Hawaii, Bonaire - Sea Islands; Curacao - Southern Islands; Tonga, Iceland(?) - Volcanic Islands; Bermuda, Fiji - Pirate Islands).
AMD's R&D is not in danger of losing everything just because of this...
No, but it certainly won't help matters. Bear in mind that this also has a knock-on effect:
1. Mobile GM 204 (and likely GM 206, which looks to be mobile-centric) will definitely be on OEMs' radars.
2. A 10-11 SMM GTX 960 is guaranteed to put pressure on the lower, high-volume segment of the product stack, and I'm also betting that Nvidia is holding a 14-15 SMM GTX 970 Ti in reserve just in case AMD bring out a fully enabled Tonga SKU. A pricing realignment would just compound the present situation.
3. With a wider uptake of GM 204 cards - and I've seen a number of forumers here and elsewhere looking to change camps if they haven't already done so - it further marginalizes Mantle and AMD's other features as the installed user base shrinks relative to the opposition's (and IMO AMD's Mantle/Gaming Evolved growth led directly to the GTX 970's aggressive pricing). AMD have already invested time, money, and effort into making Radeon a more saleable proposition. A large part of that is being obliterated by a shift in current sales. What is the point of Mantle if the opposition's DX11 cards peg performance equal or higher? Without the hardware to walk the walk, the features that talk the talk become rather insignificant.
 
Well, there is no doubt that a 290X uses much more power; it's plainly obvious given that the 980 uses less power than the 780 Ti, which uses less power than the 290X. That being said, I find it hard to see a second 980 only adding 100 watts, as that seems like a lowball figure considering power usage under high stress is around ~180 watts for a single card on its own, depending of course on clocks and cooler. However, it's still significantly less in the long run...



Tonga is similar to the GM 107 chip in that it was a new chip meant to give a taste of something to come. The difference here is that it was designated to replace the aging Tahiti architecture while showing us a few improvements and offering a decent midrange price (versus just putting the chip in a card slot that was not available yet). It was literally advertised as competing directly with the GTX 760, not the GTX 970 or 980, and was never intended to be such. Even its model number is still in the 2XX series, which puts it in line with the other cards in that series rather than the new next-generation cards that will have more power to spare. The R9 390X is a ways off at this point, but it is going to be the competition for the 980 when it's ready, in the same way that the HD 7970 came out and then, after a few months, the GTX 680 came out to fight back. It's not any different from normal; it's just how things work in this game.

AMD got hit with a curve ball not by the GTX 980, but specifically by the GTX 970 having such a low price point. The overall average still says the R9 290X is a bit ahead of the GTX 970, depending on clocks, especially at high resolutions, while the GTX 980 goes beyond it. The 290X will likely fall to the $350-$400 range and the 290 will fall to $300, while the rest of the cards get knocked into different areas to follow suit. As for the R&D discussion, that is a different argument altogether and better suited to a different topic, since pushing much further on such things gets a thread about a GPU review spammed with the wrong type of discussion; but AMD's R&D is not in danger of losing everything just because of this...

Tonga is not a GM107-style chip, and GM107's purpose was not to give consumers a taste of things to come. There are no concept chips that give consumers a taste of things to come, as that would serve no purpose (this is not a car show where concept cars are shown off). The reason GM107 was released was to steal the discrete laptop market and to plug a weak point in Nvidia's low-end portfolio.

Occasionally there are test chips that trial a new process node and get released, but these chips are usually small because the process is immature.

Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big to be a test chip (and no new node is being tested anyway). It seems out of place; if Tonga were 200mm2 rather than 360mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.

The fact that it didn't drop power consumption much, is slightly larger than its predecessor, and added only 5% more performance means it does not accomplish what Nvidia did with GM107. The only thing it did was add the AMD-specific features that Bonaire and Hawaii already had (TrueAudio, adaptive sync) at the same performance and die size as Tahiti.

Tonga underperforms for its die size, and this is why it competes with the GTX 760 and not something higher-end where they could sell it for more and make more money. I think AMD knows this, and it's why there was generally less fanfare for this launch: no huge press conference, a slow trickle of reviews, and generally less excitement from fans.
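To put the die-size argument in rough numbers, here's a quick sketch. Only the ~360 mm2 vs ~200 mm2 contrast and the ~5% gain over Tahiti come from the posts above; the Tahiti die area and the exact figures are illustrative assumptions, not review data:

```python
# Toy perf-per-area comparison (illustrative numbers only).
# Argument: at ~360 mm2 with roughly Tahiti-level performance, Tonga's
# perf/mm2 barely moves; the same performance from a ~200 mm2 die would
# have looked like real progress, the way GM107 did.

chips = {
    "Tahiti (baseline, ~365 mm2)":    {"area_mm2": 365, "rel_perf": 1.00},
    "Tonga (actual, ~360 mm2)":       {"area_mm2": 360, "rel_perf": 1.05},
    "Tonga (what-if, ~200 mm2)":      {"area_mm2": 200, "rel_perf": 1.05},
}

base = chips["Tahiti (baseline, ~365 mm2)"]
base_density = base["rel_perf"] / base["area_mm2"]

for name, c in chips.items():
    density = c["rel_perf"] / c["area_mm2"]
    print(f"{name:30s} perf/mm2 vs Tahiti: {density / base_density:.2f}x")

# Actual Tonga: ~1.06x -> barely better per mm2 than a 2.5-year-old chip.
# The 200 mm2 what-if: ~1.92x -> the kind of jump that signals a new design.
```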
 
To W1zzard:

I just wanted to ask whether you noticed any coil whine coming from the reference model.

I know light coil whine is normal on powerful video cards.
 
Tonga is not a GM107-style chip, and GM107's purpose was not to give consumers a taste of things to come. There are no concept chips that give consumers a taste of things to come, as that would serve no purpose (this is not a car show where concept cars are shown off). The reason GM107 was released was to steal the discrete laptop market and to plug a weak point in Nvidia's low-end portfolio.
GM 107 was intended to be a taste of things to come; it's the best marketing around to show a low-end version of your next-generation work early and let people gawk at it. It worked, too, because everyone was talking for quite some time about how little power the card used and how much better it was than the previous GTX 650 Ti.

Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big to be a test chip (and no new node is being tested anyway). It seems out of place; if Tonga were 200mm2 rather than 360mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.
Not every review agrees with you, and it outperforms the card it replaces while consuming less power, which was the point...

The fact that it didn't drop power consumption much, is slightly larger than its predecessor, and added only 5% more performance means it does not accomplish what Nvidia did with GM107. The only thing it did was add the AMD-specific features that Bonaire and Hawaii already had (TrueAudio, adaptive sync) at the same performance and die size as Tahiti.
It's more than that performance-wise... However, the point was never to be the greatest thing since sliced bread, but to bring more GCN 1.1-feature cards to the middle ground for future support of certain game features, and to bide time until the next generation is ready while also showing they are making improvements.

Tonga underperforms for its die size, and this is why it competes with the GTX 760 and not something higher-end where they could sell it for more and make more money. I think AMD knows this, and it's why there was generally less fanfare for this launch: no huge press conference, a slow trickle of reviews, and generally less excitement from fans.
Because it's the R9 285, not the R9 3XX... It got some press time, but nowhere was it billed as the newest card that would bring everyone to their knees in awe. It does what they said it was supposed to: beat the GTX 760 and consume less power than the card it replaces, all while bringing the GCN 1.1 features to the middle ground.

In the end, who cares; its point is well made, and at the end of the day it's not for everyone. It's a middle-ground card designed for 1080p ultra settings, not high-resolution ultra gaming. The GTX 970 and 980 are next generation, and in time we will get AMD's response; until then, prices will change to reflect that. For now this is the way things are, and the argument never changes no matter which company comes out with their next-generation card first.
 
I was curious if someone here had tested the GTX 980's power draw with a full GPGPU compute load (perhaps something like Scrypt). Tom's Hardware claims that the reference 980 draws 285 watts under this kind of load, but that can't be right, can it? Nvidia cards generally don't overshoot their TDP, certainly not by a full 100 watts (that sort of thing would get them in big trouble with OEMs if it were true). I suspect it's more likely that the reviewer was simply interpreting the readings from their shiny new oscilloscope incorrectly, but some confirmation would be nice - on one other message board I frequent, there's already a lot of FUD being spread on this subject.
 
I was curious if someone here had tested the GTX 980's power draw with a full GPGPU compute load (perhaps something like Scrypt). Tom's Hardware claims that the reference 980 draws 285 watts under this kind of load, but that can't be right, can it? Nvidia cards generally don't overshoot their TDP, certainly not by a full 100 watts (that sort of thing would get them in big trouble with OEMs if it were true). I suspect it's more likely that the reviewer was simply interpreting the readings from their shiny new oscilloscope incorrectly, but some confirmation would be nice - on one other message board I frequent, there's already a lot of FUD being spread on this subject.
Until Tom's actually tells everyone what they are using for their GPGPU testing and is a little more transparent about how they arrive at their readings, it might be something to keep an eye on (if you use the card for GPGPU), but I wouldn't take it as gospel. The Beyond3D forum is discussing the same information with people better versed in electrical measurement than most, so it might pay to bookmark it.
As for it being a hot topic... as is the case when any new dominant card arrives, there will be a certain percentage of people desperate to highlight any flaw it has. In this case, whether they're right or wrong, I think you'll have to wait for compute-centric reviews (F@H, CG rendering, etc.) to arrive. It seems a little strange that mining (a fairly intensive workload) doesn't peg the card above its TDP until it's overclocked.
 
I think you'll find that Tonga isn't Tahiti's successor (just as GM 204 isn't GK 110's successor - successors don't usually have the same ballpark performance as the chip they're replacing); it is Pitcairn/Curacao's successor. Tahiti's successor will be Bermuda (Pirate Islands). Fiji, AMD's large-die answer to GM 200, has no current analogue in AMD's lineup. BTW: Tonga is Volcanic Islands, not Pirate Islands. There is an overlap of architectural tweaks that crosses GPU series with AMD (Hawaii, Bonaire - Sea Islands; Curacao - Southern Islands; Tonga, Iceland(?) - Volcanic Islands; Bermuda, Fiji - Pirate Islands).
I have to wonder if the reason AMD continues to use the "groups of islands" theme is to intentionally make it confusing which generation each GPU is part of. AMD has in the past admitted to switching to names instead of numbers in order to make leaks less useful, but at least we could tell a GPU was part of the 5000 series since it had to be named after a tree, even if that information by itself provided no hint of the GPU's performance tier. Since the 6000 series it's all been islands, and if AMD's goal is to confuse everyone, they're doing a mighty good job of it. It's too bad they won't run out of islands any time soon.
 
I have to wonder if the reason AMD continues to use the "groups of islands" theme is to intentionally make it confusing which generation each GPU is part of. AMD has in the past admitted to switching to names instead of numbers in order to make leaks less useful, but at least we could tell a GPU was part of the 5000 series since it had to be named after a tree, even if that information by itself provided no hint of the GPU's performance tier. Since the 6000 series it's all been islands, and if AMD's goal is to confuse everyone, they're doing a mighty good job of it. It's too bad they won't run out of islands any time soon.
I think it stems from the fact that AMD's R&D is spread fairly thin. They seem to have an internal roadmap of what they want to achieve with GCN, but the parts that make up the whole are evolving at different rates thanks to R&D prioritization. They went full bore on the high end to match Nvidia, but the architecture isn't well suited to being applied down the product stack in its present form. The sad indictment of this prioritization is that AMD's mobile segment is held together by Pitcairn- and Cape Verde-based SKUs, which look likely to have to soldier on for a while yet... maybe into their third year (Feb/Mar 2015). One thing is for certain: I don't think AMD can afford to keep trimming the R&D budget.
 
Nice, we're back to the 680 days.
 
Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage comparison needs to take workload into account.

Workload is already taken into account because power draw is measured with the same workload.
 
Nice and awesome card.
Hope AMD brings the 20 nm process in early 2015 with the 300 series as they promised (since they have already taped out 20 nm on both APU and GPU).
As a consumer, I love competition.
 
This might be the card that lets me move away from a dual-card setup to run 2560x1600 at a decent/average 60 fps...
For a decent price. :)

I've been using an EVGA Titan SC for 25X16, and I've just bought an EVGA 980 SC. The numbers at 25X16 in this site's review, compared to the 690 and 7990, convinced me.

(the 7990 is 7% higher, the 690 8% higher)

That level of performance from a single chip, let alone one as cool and quiet as this one, is pretty amazing. The factory-OC versions of this card should offer performance indistinguishable from the 7990/690, on one GPU, for $600 or less.

Good times to be a gamer; this is one of those pivotal moments in GPU history (e.g., the 9700 Pro, the 8800 GTX).
 
Nice and efficient card, nicely done. It just leaves a bitter taste that GM107 did not get HDMI 2.0, the new NVENC, and HEVC/H.265; buying a card with GM204 in it for HTPC use only is kind of dumb. I hope Nvidia will release a GM207 just to catch up with the added Maxwell 2 features.

offtopic...

*snip*
Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big to be a test chip (and no new node is being tested anyway). It seems out of place; if Tonga were 200mm2 rather than 360mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.
*snip*

While I agree on most points about Tonga being a disappointment, I think the one reason is that it is so horribly late. If AMD had managed to release Tonga shortly after Hawaii, well before GM107, under the names R9 280X and R9 280, I think it could have been a good chip at that time. Back then it would have made more sense; think about Cayman and Barts:
hd6970 -> r9-290x
hd6950 -> r9-290
hd6870 -> Tonga XT as r9-280x
hd6850 -> Tonga Pro as r9-280

...offtopic
 
Just what I thought: no need to upgrade from my GTX 760 this year or next*. But I loved how this thing performed in Crysis 3 & Wolfenstein: The New Order; why Carma: R & Serious Sam 3 weren't included bugged me for a moment, but the inclusion of Wolfenstein in the benchmark suite fixed it for me (one of the games I'm willing to have). Thanks, Wiz.

P.S. Loved the "Oh, and AMD seems f*cked" at the end of the review. lol

P.P.S. *Definitely no need to upgrade from 2x GTX 760s this year or next either; I'm getting the 2nd one by November.
 
Will the real GTX 980 Please Stand Up!

More of NVIDIA charging flagship prices for a mid-range chip. Yes, the performance is good, but the reality is that in terms of chip size this card is a GTX 660/560/460.

Imagine if Intel did business the same way NVIDIA has since the GTX 6 series. The 4790K would be a $1,000 processor, and Intel's flagship processors would carry pie-in-the-sky prices just because there was no competition.

To prove my point: the GTX 680 and 660 Ti were the same chip. How can a company offer a flagship chip at such a price difference? Because even the 680 was a mid-range chip.

Until the rest of you Nvidia fanboys catch on to this, you will continue to get the second-best chip for a premium price.
 
Will the real GTX 980 Please Stand Up!

More of NVIDIA charging flagship prices for a mid-range chip. Yes, the performance is good, but the reality is that in terms of chip size this card is a GTX 660/560/460.

Imagine if Intel did business the same way NVIDIA has since the GTX 6 series. The 4790K would be a $1,000 processor, and Intel's flagship processors would carry pie-in-the-sky prices just because there was no competition.

To prove my point: the GTX 680 and 660 Ti were the same chip. How can a company offer a flagship chip at such a price difference? Because even the 680 was a mid-range chip.

Until the rest of you Nvidia fanboys catch on to this, you will continue to get the second-best chip for a premium price.

You may not have noticed, but NVIDIA hasn't been charging $550 for their ultra-high-end chips for a while now. While it's probably true that higher-performing Maxwell variants will follow, it's also probably true they won't cost $550.

There are two kinds of buyers:

Guys like me who buy based on price/performance in the current market.

Guys like you who say "WAIT! I think NVIDIA should be charging less for this because they're going to release some more expensive chips later!"

I'd be more surprised if NVIDIA didn't price this at $550:
A. 13% faster than AMD's best at 25X16
B. Either 6 or 16(!) dB lower noise than AMD's best, depending which mode you run it in (BTW - 50 dB?! That's FX 5800 Ultra Dustbuster level)
C. 110 W less peak power consumption/heat dumped in the case
D. Better multi-GPU
E. Better 3D solution
F. Better feature set (AA, etc.)

The cheapest 290Xs on Newegg today are $489; 980s would be worth the extra $60 on point A alone.

AMD may well release a water-cooled, OC'd card, but my guess is it will be either HIGHLY binned or close to 300 W. I'm expecting a 9590-like product personally (where they do the OCing and charge a premium for giving you the water cooler).
 
To W1zzard:

I just wanted to ask whether you noticed any coil whine coming from the reference model.

I know light coil whine is normal on powerful video cards.

Did anyone check in any reviews for possible coil whine? I am curious, despite having one on the way. When mine arrives, what is the best way to get coil whine to present itself? Isn't the best way to have it run an older game at super-high FPS? I can't remember.
 
I'm OK then. I never use the word f***, and seemingly you don't either

I promise to only use "fuck" and "shit" and not those weird "f***" and "s&!*" notations. Two thumbs up Sony, F*** and s&!* is for pussies!

I highly recommend you go and get your health checked at a hospital!

The reason I wrote it was that I realised how stupid the conversation had gotten and that it should be improved (and I didn't intend to insult that guy by any means; it was just dirty but innocent wording)!

Why are you so annoyingly arrogant and sarcastic?
 
Workload is already taken into account because power draw is measured with the same workload.
Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage comparison needs to take workload into account.
Not sure what you're talking about here. GPUs load-balance in SLI/CFX - driver profile, vRAM and/or CPU limitation, vSync, and app coding will all affect usage (as will GPUs with differing clock/dynamic boost rates). Just because a game can peg GPU usage to near 100%...
[screenshot: single card, GPU usage pegged near 100%]

...doesn't mean that adding a second card will automatically have both running at 100% in the same circumstances. Same system, same application, with a second card added... < 70% GPU usage:
[screenshot: same system with a second card, GPU usage below 70%]

And of course, depending upon app/driver coding and efficiency, and vRAM and GPU requirements, not all applications are created equal:
[chart: GPU usage varying across applications]
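To put rough numbers on why utilization matters, here's a toy model. The idle/load wattages and the 70% load figure are invented for illustration (loosely matching the ~180 W single-card figure mentioned earlier in the thread), not measurements:

```python
# Toy model (invented numbers, just to illustrate the point above):
# board power scales with GPU utilization, so a second card that only
# reaches ~70% load adds far less than a second fully loaded card would.

def card_power(utilization, idle_w=15, full_load_w=180):
    """Crude estimate: power rises linearly from idle to full load."""
    return idle_w + (full_load_w - idle_w) * utilization

single = card_power(1.0)       # one card pegged near 100% in a GPU-bound game
sli = 2 * card_power(0.70)     # CPU limit / SLI scaling drops both to ~70%

print(f"single card: {single:.0f} W")
print(f"SLI pair:    {sli:.0f} W (only +{sli - single:.0f} W)")
# -> single card: 180 W, SLI pair: 261 W, i.e. +81 W rather than +180 W,
#    which is why measured SLI power draw is so workload-dependent.
```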


I highly recommend you go and get your health checked at a hospital!
Oh, Sonny, I didn't know you cared! I'll pass, though; my super-hypocrite-sense is tingling, so I think I'm good to go.
(and I didn't intend to insult that guy by any means; it was just dirty but innocent wording)!
Well, the poster (the54thvoid) you aimed it at certainly didn't see it that way, either then or now (judging by the fact that he thanked my post and clarified his thoughts just afterwards).
Why are you so annoyingly arrogant and sarcastic?
Check my sig - can't say you weren't warned. Maybe if you applied the same rules to yourself that you're trying to impress upon others, and showed some degree of argumentative consistency, we could put it all behind us and be friends! You have a nice day now.
 
You may not have noticed, but NVIDIA hasn't been charging $550 for their ultra-high-end chips for a while now. While it's probably true that higher-performing Maxwell variants will follow, it's also probably true they won't cost $550.

There are two kinds of buyers:

Guys like me who buy based on price/performance in the current market.

Guys like you who say "WAIT! I think NVIDIA should be charging less for this because they're going to release some more expensive chips later!"

I'd be more surprised if NVIDIA didn't price this at $550:
A. 13% faster than AMD's best at 25X16
B. Either 6 or 16(!) dB lower noise than AMD's best, depending which mode you run it in (BTW - 50 dB?! That's FX 5800 Ultra Dustbuster level)
C. 110 W less peak power consumption/heat dumped in the case
D. Better multi-GPU
E. Better 3D solution
F. Better feature set (AA, etc.)

The cheapest 290Xs on Newegg today are $489; 980s would be worth the extra $60 on point A alone.

AMD may well release a water-cooled, OC'd card, but my guess is it will be either HIGHLY binned or close to 300 W. I'm expecting a 9590-like product personally (where they do the OCing and charge a premium for giving you the water cooler).


I did not dispute the performance of the card, just stating we are getting half of a chip for the price of a flagship. Based on your reasoning, anyone who has an Intel 4790 should be sending in extra cash to Intel to make up for the price/performance ratio.

Since the 6 series, all Nvidia has been doing is protecting its product stack.

Case in point: the GTX 780 could have been released long ago, but it wasn't because they didn't feel they had to. When they finally did release it, it came in at $650, only to be slashed to $500 a month later. How about all those who bought at $650?

I'm just sick of Nvidia giving us "good enough." Had the Titan been $650, or the 780 released alongside the Titan, would AMD have come to market with the 290 series sooner? I don't know; what I do know is that it would have put more pressure on AMD to come to market with a competing product. Had that been the case, everyone wins: faster product turnover, more reason to upgrade, and 4K becoming more available on a mass-market scale. Not to mention more GPU resources for game developers to give us richer, more detailed games.

Basically, this "good enough" attitude from Nvidia is holding up progress on many fronts. It would be nice for Nvidia to give us their best from the get-go, something they haven't done since Fermi.

Like I said, if Intel did business this way, the most powerful CPU available would be the 4790K.
 
Case in point: the GTX 780 could have been released long ago, but it wasn't because they didn't feel they had to. When they finally did release it, it came in at $650, only to be slashed to $500 a month later. How about all those who bought at $650?
That's pretty much the way of the world with high-end cards - you should also check your facts: the GTX 780 price cut came 5 months after its launch, not one month. Care to work out the likely depreciation rate on the 290X? Prices are starting to fall rather rapidly. A card that sold for ~$500 a month ago can be had for 20% less now. That price is an outlier, but I'm guessing that AMD won't be selling too many cards at the current revised MSRP at the moment. The HD 7970 received a 10% price cut a few weeks after it launched thanks to the GTX 680's arrival, and the R9 280, which launched at $280, has now cratered (~$200, or nearly a 30% price cut) thanks to AMD EOL'ing the card after a three-month lifespan to make way for the R9 285. BTW: that 30% price cut actually exceeds the GTX 780's.
I'm just sick of Nvidia giving us "good enough." Had the Titan been $650, or the 780 released alongside the Titan, would AMD have come to market with the 290 series sooner?
No, they wouldn't have. Chip design takes years.
I don't know; what I do know is that it would have put more pressure on AMD to come to market with a competing product.
Really? If all it takes is a competing product to get AMD to shift gears, why haven't they updated their server/enthusiast desktop platform? R&D is very much a finite commodity for AMD; don't expect them to keep pace when they're fighting a three-front war (x86, ARM, discrete graphics).
Basically, this "good enough" attitude from Nvidia is holding up progress on many fronts. It would be nice for Nvidia to give us their best from the get-go, something they haven't done since Fermi.
Odd viewpoint, bearing in mind that both IHVs are dependent upon TSMC's manufacturing process, and both IHVs maximize ROI by respinning product (GTX 680 -> GTX 770, and HD 7970 -> HD 7970 GHz Edition -> R9 280X, for example).
 
Hard to really question NVIDIA's business methods.

They currently have a market cap of over $10B, and their only competitor has a market cap under $3B (and a big chunk of that is the CPU business).

I see a lot of guys on forums saying "Darn you NVIDIA! Give us your best products for $500 as soon as you can get them out the door!".

I sure wouldn't. If I were in charge at NVIDIA and currently had chips 5X more powerful than what we see here, I'd bleed them out just fast enough to keep stomping on AMD and bringing in the high-profit quarters. NVDA answers to its board of directors/stockholders, not gamer whim.
 
That's pretty much the way of the world with high-end cards - you should also check your facts: the GTX 780 price cut came 5 months after its launch, not one month. Care to work out the likely depreciation rate on the 290X? Prices are starting to fall rather rapidly. A card that sold for ~$500 a month ago can be had for 20% less now. That price is an outlier, but I'm guessing that AMD won't be selling too many cards at the current revised MSRP at the moment. The HD 7970 received a 10% price cut a few weeks after it launched thanks to the GTX 680's arrival, and the R9 280, which launched at $280, has now cratered (~$200, or nearly a 30% price cut) thanks to AMD EOL'ing the card after a three-month lifespan to make way for the R9 285. BTW: that 30% price cut actually exceeds the GTX 780's.

No, they wouldn't have. Chip design takes years.

Really? If all it takes is a competing product to get AMD to shift gears, why haven't they updated their server/enthusiast desktop platform? R&D is very much a finite commodity for AMD; don't expect them to keep pace when they're fighting a three-front war (x86, ARM, discrete graphics).

Odd viewpoint, bearing in mind that both IHVs are dependent upon TSMC's manufacturing process, and both IHVs maximize ROI by respinning product (GTX 680 -> GTX 770, and HD 7970 -> HD 7970 GHz Edition -> R9 280X, for example).


Product progression is one thing; the 480 to 580 would be another example, or how about the 8800 GT and its progression? In those examples we still got the full chip, with AMD/Nvidia putting their best chip to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20nm.
Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
Bottom line: consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us consumers who will continue to be shortchanged.
Again, if Intel did business this way, the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
Instead of defending Nvidia, maybe you should defend your pocketbook.
 
Thank you @W1zzard for a really good review. To me, though, the performance-per-dollar analysis is incomplete. Due to display restrictions I play at 1080p.

At 1080p the GTX 980 has a 19% performance advantage and a 138-watt advantage over the R9 290X. According to a survey of 34M gamers conducted in May this year, as reported by VentureBeat, the average hardcore gamer plays for 22 hours/week. If you live in NY, that 138 W translates to $26.70/year. The current prices for the GTX 980 and the R9 290X are $550 and $500 respectively. Most manufacturers will give you at least a 2-year warranty, and usually a 3-year warranty on their top-end cards, making the total cost of ownership for the lifespan of the card equal to the capex + (opex for 3 years).

For 22 hours a week, with the R9 290X consuming 138 W more than the GTX 980, that comes to an extra 157 kWh/year. In NY that amounts to $26.70 in additional power bills; over 3 years that comes to $80.10, making the GTX 980 approximately $30 cheaper to own over the lifespan of the card. In fact, even the areas with the cheapest power in the continental US would pay an additional $47.50. So the question is: who in their right mind would be willing to save (in the very best possible case) $2.50 and buy a device with a 19% performance disadvantage? Simple: those who have not been properly informed of the total cost of ownership.
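For anyone who wants to plug in their own electricity rate, here's a quick sketch of the same arithmetic (the $/kWh rates below are my own illustrative assumptions, not figures from the review):

```python
# Sketch of the total-cost-of-ownership math above (electricity rates
# are illustrative assumptions, not figures from the review).

def net_cost_delta(power_delta_w=138, hours_per_week=22, years=3,
                   price_gap_usd=50, rate_usd_per_kwh=0.169):
    """Extra energy cost of the hungrier card minus its price advantage.

    Positive => the card that is cheaper to buy is pricier to own.
    """
    kwh_per_year = power_delta_w / 1000 * hours_per_week * 52
    extra_energy = kwh_per_year * rate_usd_per_kwh * years
    return extra_energy - price_gap_usd

# ~157.9 kWh/year; at a NY-like ~$0.169/kWh that's ~$80 over 3 years,
# so the $550 card comes out ~$30 ahead despite costing $50 more.
print(f"NY-like rate: net ${net_cost_delta():+.2f}")
# At cheap ~$0.10/kWh power the gap nearly closes (about -$2.60).
print(f"Cheap power:  net ${net_cost_delta(rate_usd_per_kwh=0.10):+.2f}")
```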

I really enjoyed the article, and especially the extra effort taken to really get into some useful usage metrics. But I did find the Performance per Dollar section to be an incomplete analysis and borderline misleading (even though this oversight is obviously not intentional).
 
Product progression is one thing; the 480 to 580 would be another example, or how about the 8800 GT and its progression? In those examples we still got the full chip, with AMD/Nvidia putting their best chip to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20nm.
Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
Bottom line: consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us consumers who will continue to be shortchanged.
Again, if Intel did business this way, the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
Instead of defending Nvidia, maybe you should defend your pocketbook.

So what position do you hold at NVIDIA?

The reason I ask is that the only way you could actually know any of what you allege is if you worked in NVIDIA engineering or management.

If you don't, everything you have postulated is nothing more than conspiracy theory and speculation.
 