
NVIDIA GeForce GTX 980 4 GB

Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Well, there is no doubt that a 290X uses much more power; it's plainly obvious, since the 980 uses less power than the 780 Ti, which uses less power than the 290X. That being said, I find it hard to see a second 980 adding only 100 watts, as that seems like a lowball figure, seeing as the power usage at high stress is around ~180 watts for a single card on its own, depending of course on clocks and cooler. However, it's still significantly less in the long run...
Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage figure needs to take the workload into account.
Tonga is similar to the GM 107 chip in that it was a new chip meant to give a taste of something to come. The difference here is that it was designated to replace the aging Tahiti architecture while showing us a few improvements and offering a decent midrange price.
I think you'll find that Tonga isn't Tahiti's successor (just as GM 204 isn't GK 110's successor; successors don't usually land in the same performance ballpark as the chip they're replacing): it is Pitcairn/Curacao's successor. Tahiti's successor will be Bermuda (Pirate Islands). Fiji, AMD's large-die answer to GM 200, has no current analogue in AMD's lineup. BTW: Tonga is Volcanic Islands, not Pirate Islands. There is an overlap of architectural tweaks that crosses GPU series with AMD (Hawaii, Bonaire - Sea Islands; Curacao - Southern Islands; Tonga, Iceland(?) - Volcanic Islands; Bermuda, Fiji - Pirate Islands).
AMD's R&D is not in danger of losing everything just because of this...
No, but it certainly won't help matters. Bear in mind that this also has a knock-on effect:
1. Mobile GM 204 (and likely GM 206 will be mobile-centric) will definitely be on OEM's radars.
2. A 10-11 SMM GTX 960 is all but guaranteed, which creates pressure on the high-volume lower segment of the product stack, and I'm also betting that Nvidia is holding a 14-15 SMM GTX 970 Ti in reserve just in case AMD brings out a fully enabled Tonga SKU. A pricing realignment would just compound the present situation.
3. With a wider uptake of GM 204 cards - and I've seen a number of forumers here and elsewhere looking to change camps if they haven't already done so - Mantle and AMD's other features are marginalized even further as the installed user base shrinks relative to the opposition's (and IMO AMD's Mantle/Gaming Evolved growth led directly to the GTX 970's aggressive pricing). AMD have already invested time, money and effort into making Radeon a more saleable proposition. A large part of that is being obliterated by a shift in current sales. What is the point of Mantle if the opposition's DX11 cards peg performance equal or higher? Without the hardware to walk the walk, the features that talk the talk become rather insignificant.
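For reference, the SMM counts above translate directly into CUDA core counts, since Maxwell pairs 128 CUDA cores with each SMM. A quick sketch (the GTX 960 and 970 Ti configurations are this post's speculation, not announced products):

```python
# Maxwell couples 128 CUDA cores to each SMM, so core counts follow
# directly from SMM counts. The GTX 960 / 970 Ti entries below are the
# post's speculation, not announced SKUs.
CORES_PER_SMM = 128

skus = {
    "GTX 980 (full GM204, 16 SMM)": 16,
    "GTX 970 (13 SMM)": 13,
    "GTX 970 Ti (speculative, 15 SMM)": 15,
    "GTX 960 (speculative, 11 SMM)": 11,
}

for name, smm in skus.items():
    print(f"{name}: {smm * CORES_PER_SMM} CUDA cores")
```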
 
Joined
Sep 20, 2014
Messages
38 (0.01/day)
Well, there is no doubt that a 290X uses much more power; it's plainly obvious, since the 980 uses less power than the 780 Ti, which uses less power than the 290X. That being said, I find it hard to see a second 980 adding only 100 watts, as that seems like a lowball figure, seeing as the power usage at high stress is around ~180 watts for a single card on its own, depending of course on clocks and cooler. However, it's still significantly less in the long run...



Tonga is similar to the GM 107 chip in that it was a new chip meant to give a taste of something to come. The difference here is that it was designated to replace the aging Tahiti architecture while showing us a few improvements and offering a decent midrange price (versus just putting the chip in a market slot that didn't exist yet). It was literally advertised as competing with the GTX 760, not the GTX 970 or 980, and was never intended to be such. Even its number is still in the 2XX series, which puts it in line with the other cards in that series rather than the new next-generation cards that will have more power to spare. The R9 390X is a ways off at this point, but it's going to be the competition for the 980 when it's ready, the same way the HD 7970 came out and then, after a few months, the GTX 680 came out to fight back. It's not any different from normal; it's just how things work in this game.

AMD got hit with a curveball not by the GTX 980 but specifically by the GTX 970 having such a low price point. The overall average still says the R9 290X is a bit ahead (depending on its clocks), especially at high resolutions, while the GTX 980 goes beyond it. The 290X will likely fall to the $350-$400 range and the 290 to $300, while the rest of the cards get knocked into different brackets to follow suit. As for the R&D discussion, that is a different argument altogether and better suited to a different topic, since going much further on it would leave a thread about a GPU review spammed with the wrong type of discussion. But AMD's R&D is not in danger of losing everything just because of this...

Tonga is not a GM107-style chip, and GM107's purpose was not to give consumers a taste of things to come. There are no concept chips that give consumers a taste of things to come, as that would serve no purpose (this is not a car show where concept cars are shown off). The reason GM107 was released was to steal the discrete laptop market and to plug a weak point in Nvidia's low-end portfolio.

Occasionally there are test chips which trial a new process node, and these get released, but such chips are usually small while the process is immature.

Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big a chip to be a test chip (and no new node is being tested). It seems out of place: if Tonga were 200 mm2 rather than 360 mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.

The fact that it didn't drop power consumption much, was slightly larger than its predecessor, and added only 5% more performance means it does not accomplish what Nvidia did with GM107. The only thing it did was add the AMD-specific features that Bonaire and Hawaii already had (TrueAudio, adaptive sync) at the same performance and die size as Tahiti.

Tonga underperforms for its die size, and this is why it competes with the GTX 760 and not something higher-end where they could sell it for more and make more money. I think AMD knows this, and it's why there was generally less fanfare for this launch: no huge press conference, a slow trickle of reviews, and generally less excitement from fans.
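The die-size argument can be put in rough numbers. The die areas below are commonly cited figures for these chips, and the relative-performance values are illustrative placeholders (Tonga's ~5% over Tahiti from the discussion above; GM107's figure is a guess), so treat this as a sketch of the method, not review data:

```python
# Rough performance-per-area comparison. Die areas are commonly cited
# figures; relative performance values are illustrative placeholders.
chips = {
    # name: (die_area_mm2, relative_performance, Tahiti = 1.00)
    "Tahiti (R9 280X)":   (352, 1.00),
    "Tonga (R9 285)":     (359, 1.05),  # ~5% over Tahiti, per the post
    "GM107 (GTX 750 Ti)": (148, 0.55),  # placeholder guess
}

for name, (area, perf) in chips.items():
    print(f"{name}: {1000 * perf / area:.2f} perf per 1000 mm^2")
```

On numbers like these, GM107 extracts noticeably more performance per square millimetre than Tonga, which is the crux of the complaint.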
 
Joined
Feb 8, 2008
Messages
2,667 (0.43/day)
Location
Switzerland
Processor i9 9900KS ( 5 Ghz all the time )
Motherboard Asus Maximus XI Hero Z390
Cooling EK Velocity + EK D5 pump + Alphacool full copper silver 360mm radiator
Memory 16GB Corsair Dominator GT ROG Edition 3333 Mhz
Video Card(s) ASUS TUF RTX 3080 Ti 12GB OC
Storage M.2 Samsung NVMe 970 Evo Plus 250 GB + 1TB 970 Evo Plus
Display(s) Asus PG279 IPS 1440p 165Hz G-sync
Case Cooler Master H500
Power Supply Asus ROG Thor 850W
Mouse Razer Deathadder Chroma
Keyboard Rapoo
Software Win 10 64 Bit
To Wizzard:

I just wanted to ask if you noticed any coil whine coming from the reference model?

I know light coil whine is normal on powerful video cards.
 
Joined
Apr 29, 2014
Messages
4,304 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Tonga is not a GM107-style chip, and GM107's purpose was not to give consumers a taste of things to come. There are no concept chips that give consumers a taste of things to come, as that would serve no purpose (this is not a car show where concept cars are shown off). The reason GM107 was released was to steal the discrete laptop market and to plug a weak point in Nvidia's low-end portfolio.
GM 107 was intended to be a taste of things to come; it's the best marketing around to show a cut-down version of your next-generation work early and let people gawk at it. It worked, too, because everyone was talking for quite some time about how little power the card used and how much better it was than the previous GTX 650 Ti.

Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big a chip to be a test chip (and no new node is being tested). It seems out of place: if Tonga were 200 mm2 rather than 360 mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.
Not every review agrees with you, and it outperforms the card it replaces while consuming less power, which was the point...

The fact that it didn't drop power consumption much, was slightly larger than its predecessor, and added only 5% more performance means it does not accomplish what Nvidia did with GM107. The only thing it did was add the AMD-specific features that Bonaire and Hawaii already had (TrueAudio, adaptive sync) at the same performance and die size as Tahiti.
It's more than that performance-wise... However, the point of it was not to be the greatest thing since sliced bread but to bring more GCN 1.1 feature cards to the middle ground, for future support of certain game features, and to bide time until the next generation is ready while also showing they are making improvements.

Tonga underperforms for its die size, and this is why it competes with the GTX 760 and not something higher-end where they could sell it for more and make more money. I think AMD knows this, and it's why there was generally less fanfare for this launch: no huge press conference, a slow trickle of reviews, and generally less excitement from fans.
Because it's the R9 285, not the R9 3XX... It got some press time, but nowhere was it claimed to be the newest card that would bring everyone to their knees in awe. It does what they said it was supposed to: beat the GTX 760 and consume less power than the card it replaces, all while bringing the GCN 1.1 features to the middle ground.

In the end, who cares; its point is well made, and at the end of the day it's not for everyone. It's a middle-ground card designed for 1080p ultra settings, not high-resolution ultra gaming. The GTX 970 and 980 are next generation, and in time we will get AMD's response; until then, prices will change to reflect that. For now this is the way things are, and the argument never changes no matter which company comes out with its next-generation card first.
 
Joined
May 8, 2013
Messages
84 (0.02/day)
I was curious if someone here had tested the GTX 980's power draw with a full GPGPU compute load (perhaps something like Scrypt). Tom's Hardware claims that the reference 980 draws 285 watts under this kind of load, but that can't be right, can it? Nvidia cards generally don't overshoot their TDP, certainly not by a full 100 watts (that sort of thing would get them in big trouble with OEMs if it were true). I suspect it's more likely that the reviewer was simply interpreting the readings from their shiny new oscilloscope incorrectly, but some confirmation would be nice - on one other message board I frequent, there's already a lot of FUD being spread on this subject.
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
I was curious if someone here had tested the GTX 980's power draw with a full GPGPU compute load (perhaps something like Scrypt). Tom's Hardware claims that the reference 980 draws 285 watts under this kind of load, but that can't be right, can it? Nvidia cards generally don't overshoot their TDP, certainly not by a full 100 watts (that sort of thing would get them in big trouble with OEMs if it were true). I suspect it's more likely that the reviewer was simply interpreting the readings from their shiny new oscilloscope incorrectly, but some confirmation would be nice - on one other message board I frequent, there's already a lot of FUD being spread on this subject.
Until Tom's actually tells everyone what they are using for their GPGPU testing and is a little more transparent about how they arrive at their readings, it might be something to keep an eye on (if you use the card for GPGPU), but I wouldn't take it as gospel. The Beyond3D forum is discussing the same information with people better versed in electrical measurement than most, so it might pay to bookmark it.
As for it being a hot topic... as is the case whenever a new dominant card arrives, there will be a certain percentage of people desperate to highlight any flaw it has. In this case, whether they're right or wrong, I think you'll have to wait for compute-centric (F@H, CG rendering, etc.) reviews to arrive. It seems a little strange that mining (a fairly intensive workload) doesn't peg the card above its TDP until overclocked.
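One common way scope-based power figures go wrong: average power is the mean of the instantaneous voltage-current products, not the product of the averages, and a spiky load's peak reading can sit far above its sustained draw. A minimal illustration with synthetic samples (not real GTX 980 measurements):

```python
# Average power must be computed as mean(v * i) over simultaneous
# samples; quoting the peak of a spiky load instead of the average
# badly overstates sustained draw. The samples below are synthetic.
voltage = [12.0] * 8                          # volts on the 12 V rail
current = [12, 13, 12, 30, 12, 13, 12, 12]    # amps, one brief spike

instantaneous = [v * i for v, i in zip(voltage, current)]
avg_power = sum(instantaneous) / len(instantaneous)
peak_power = max(instantaneous)

print(f"average: {avg_power:.0f} W, peak: {peak_power:.0f} W")
```

Here the average works out to 174 W while the single spike pegs the peak at 360 W; a review quoting the latter as "power draw" would overstate sustained consumption by roughly a factor of two.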
 
Last edited:
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I think you'll find that Tonga isn't Tahiti's successor (just as GM 204 isn't GK 110's successor - successors don't usually have the same ballpark performance as the chip their replacing) it is Pitcairn/Curacao's successor. Tahiti's successor will be Bermuda (Pirate Islands). Fiji, AMD large die answer to GM 200, does not have current analogue in AMD's lineup. BTW: Tonga is Volcanic Islands not Pirate Islands. There is an overlap of architectural tweaks that crosses GPU series with AMD (Hawaii, Bonaire - Sea Islands, Curacao - Southern Islands, Tonga, Iceland(?) - Volcanic Islands, Bermuda, Fiji - Pirate Islands)
I have to wonder if AMD continues to use the "groups of islands" theme to intentionally make it confusing which generation each GPU is part of. AMD has in the past admitted to switching to names instead of numbers in order to make leaks less useful, but at least we could tell a GPU was part of the 5000 series, since it had to be named after a tree, even if that information by itself provided no hint of the GPU's performance tier. Since the 6000 series it's all been islands, and if AMD's goal is to confuse everyone, they're doing a mighty good job of it. It's too bad they won't run out of islands any time soon.
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
I have to wonder if AMD continues to use the "groups of islands" theme to intentionally make it confusing which generation each GPU is part of. AMD has in the past admitted to switching to names instead of numbers in order to make leaks less useful, but at least we could tell a GPU was part of the 5000 series, since it had to be named after a tree, even if that information by itself provided no hint of the GPU's performance tier. Since the 6000 series it's all been islands, and if AMD's goal is to confuse everyone, they're doing a mighty good job of it. It's too bad they won't run out of islands any time soon.
I think it stems from the fact that AMD's R&D is spread fairly thinly. They seem to have an internal roadmap of what they want to achieve with GCN, but the parts that make up the whole are evolving at different rates thanks to R&D prioritization. They went full bore on the high end to match Nvidia, but the architecture isn't that suited to be applied down the product stack in its present form. The sad indictment of this prioritization is that AMD's mobile segment is held together by Pitcairn and Cape Verde based SKUs which look likely to have to soldier on for a while yet...maybe into their third year (Feb/Mar 2015). One thing is for certain, I don't think AMD can afford to keep trimming the R&D budget.
 
Joined
Apr 21, 2008
Messages
5,250 (0.86/day)
Location
IRAQ-Baghdad
System Name MASTER
Processor Core i7 3930k run at 4.4ghz
Motherboard Asus Rampage IV extreme
Cooling Corsair H100i
Memory 4x4G kingston hyperx beast 2400mhz
Video Card(s) 2X EVGA GTX680
Storage 2X Crusial M4 256g raid0, 1TbWD g, 2x500 WD B
Display(s) Samsung 27' 1080P LED 3D monitior 2ms
Case CoolerMaster Chosmos II
Audio Device(s) Creative sound blaster X-FI Titanum champion,Creative speakers 7.1 T7900
Power Supply Corsair 1200i, Logitch G500 Mouse, headset Corsair vengeance 1500
Software Win7 64bit Ultimate
Benchmark Scores 3d mark 2011: testing
Nice, we're back to GTX 680 times.
 
Joined
Feb 8, 2012
Messages
3,014 (0.64/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage figure needs to take the workload into account.

Workload is already taken into account because power draw is measured with the same workload.
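That is the key point: efficiency comparisons only hold at a fixed workload. One way to make it concrete is to divide board power by frame rate in the same test, giving energy per frame. The figures below are illustrative placeholders, not numbers from the review:

```python
# Watts divided by frames per second gives joules per frame, a
# workload-normalized efficiency metric. The figures are placeholders,
# not measurements from any review.
cards = {
    # name: (board_power_watts, fps_in_the_same_benchmark)
    "Card A": (165, 60),
    "Card B": (250, 55),
}

for name, (watts, fps) in cards.items():
    print(f"{name}: {watts / fps:.2f} J per frame")
```

Comparing joules per frame across different games or settings would be meaningless, which is exactly why reviews measure power with one fixed workload.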
 
Joined
Mar 31, 2012
Messages
862 (0.19/day)
Location
NL
System Name SIGSEGV
Processor AMD Ryzen 9 9950X
Motherboard MSI MEG ACE X670E
Cooling Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Fury Beast 64 Gb CL30
Video Card(s) TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 27" /34"
Case O11 EVO XL
Audio Device(s) Realtek
Power Supply FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores
Nice and awesome card.
I hope AMD brings its 20 nm process in early 2015 with the 300 series as promised (since they have already taped out 20 nm on both APU and GPU).
As a consumer, I love competition.
 
Joined
Nov 7, 2007
Messages
32 (0.01/day)
This might be the card that lets me move away from a dual-card setup to run 2560x1600 and still average a decent 60 fps...
For a decent price :)

I've been using an EVGA Titan SC for 25x16, and I just bought an EVGA 980 SC. The numbers in this site's review at 25x16 compared to the 690 and 7990 convinced me.

(the 7990 is 7% higher, the 690 8% higher)

That level of performance from a single chip, let alone one as cool and quiet as this, is pretty amazing. The factory-OC versions of this card should offer performance indiscernible from the 7990/690 on one GPU, for $600 or less.

Good times to be a gamer; this is one of those pivotal moments in GPU history (e.g. 9700 Pro, 8800 GTX).
 
Joined
Mar 10, 2014
Messages
1,793 (0.45/day)
Nice and efficient card, nicely done. It just leaves a bitter taste that GM107 did not get HDMI 2.0, the new NVENC, and HEVC/H.265; buying a GM204 card for an HTPC only is kind of dumb. I hope Nvidia will release a GM207 just to catch up with the added Maxwell 2 features.

offtopic...

*snip*
Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big a chip to be a test chip (and no new node is being tested). It seems out of place: if Tonga were 200 mm2 rather than 360 mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.
*snip*

While I agree on most points about Tonga being a disappointment, I think the one reason why is that it is so horribly late. If AMD could have managed to release Tonga shortly after Hawaii, well before GM107, under the names R9 280X and R9 280, I think it could have been a good chip at that time. Back then it would have made more sense; think about Cayman and Barts:
hd6970->r9-290x
hd6950->r9-290
hd6870->tonga xt as r9-280x
hd6850->tonga pro as r9-280

...offtopic
 
Joined
May 12, 2009
Messages
973 (0.17/day)
System Name YautjaLord
Processor Ryzen 9 5900x @ 3700MHz
Motherboard Gigabyte X570 Aorus Xtreme rev. 1.1
Cooling EK-XE360mm front/SE360mm top/3xVardar 120mm top/3xbe quiet! Light Wings 120mm High Speed PWM/etc....
Memory HyperX Predator 4x8GB 4000MHz @ 4000MHz
Video Card(s) 1xGigabyte RTX 3080 Master 10GB rev. 3.0 (watercooled)
Storage Samsung 980 Pro 1TB (system)/Samsung 960 Pro M.2 NVMe (system) |Samsung T7 Shield 1TB USB Type-C
Display(s) LG 27GN950-B 4k IPS 1ms 144Hz HDR
Case be quiet! Dark Base Pro 900 rev. 2 Black/Orange
Audio Device(s) Integrated
Power Supply Corsair HX1200
Mouse Dragonwar ELE-G4.1 laser gaming mouse 9500dpi
Keyboard Corsair K60 Pro Low Profile
Software Windows 10 Pro 64-bit 21H2
Just what I thought: no need to upgrade from my GTX 760 this year or next*. But I loved how this thing performed in Crysis 3 and Wolfenstein: The New Order; why you haven't included Carma: R and Serious Sam 3 bugged me for a moment, but the inclusion of Wolfenstein in the benchmark suite fixed it for me (one of the games I'm willing to have). Thanx Wiz.

P.S. Loved the "Oh, and AMD seems f*cked" at the end of the review. lol

P.P.S. *Definitely no need to upgrade from 2x GTX 760s this year or next either; gonna have the 2nd one by November.
 
Joined
Sep 21, 2014
Messages
25 (0.01/day)
Will the real GTX 980 Please Stand Up!

More of NVIDIA charging flagship prices for a mid-range chip. Yes, the performance is good, but the reality is that in terms of chip size this card is a GTX 660/560/460.

Imagine if Intel did business the same way NVIDIA has since the GTX 600 series. The 4790K would be a $1,000 processor, and Intel's flagship processors would carry pie-in-the-sky prices just because there was no competition.

To prove my point: the GTX 680 and 660 Ti were the same chip. How can a company offer a flagship chip at such a price difference? Because even the 680 was a mid-range chip.

Until the rest of you Nvidia fanboys catch on to this, you will continue to get the second-best chip for a premium price.
 
Joined
Nov 7, 2007
Messages
32 (0.01/day)
Will the real GTX 980 Please Stand Up!

More of NVIDIA charging flagship prices for a mid-range chip. Yes, the performance is good, but the reality is that in terms of chip size this card is a GTX 660/560/460.

Imagine if Intel did business the same way NVIDIA has since the GTX 600 series. The 4790K would be a $1,000 processor, and Intel's flagship processors would carry pie-in-the-sky prices just because there was no competition.

To prove my point: the GTX 680 and 660 Ti were the same chip. How can a company offer a flagship chip at such a price difference? Because even the 680 was a mid-range chip.

Until the rest of you Nvidia fanboys catch on to this, you will continue to get the second-best chip for a premium price.

You may not have noticed, but NVIDIA hasn't been charging $550 for their ultra-high-end chips for a while now. While it's probably true higher-performing Maxwell variants will follow, it's also probably true they won't cost $550.

There are two kinds of buyers:

Guys like me who buy based on price/performance in the current market.

Guys like you who say "WAIT! I think NVIDIA should be charging less for this because they're going to release some more expensive chips later!"

I'd be more surprised if NVIDIA didn't price this at $550:
A. 13% faster than AMD's best at 25x16
B. Either 6 or 16(!) dB lower noise than AMD's best, depending on which mode you run it in (BTW, 50 dB?! That's FX 5800 Ultra "Dustbuster" level)
C. 110 W less peak power consumption/heat dumped in the case
D. Better multi-GPU support
E. Better 3D solution
F. Better feature set (AA, etc.)

The cheapest 290Xs on Newegg today are $489; 980s would be worth the extra $60 on point A alone.

AMD may well release a water-cooled, overclocked card, but my guess is it will be either highly binned or close to 300 W. I'm expecting a 9590-like product, personally (where they do the overclocking and charge a premium for giving you the water cooler).
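The value claim in point A can be checked with quick arithmetic using the post's own numbers ($489 for the cheapest 290X, $549 for the 980, ~13% faster at 2560x1600):

```python
# Price per unit of relative performance, using the figures quoted
# in the post above ($489 vs. $549, 980 ~13% faster at 25x16).
cards = [
    # (name, price_usd, relative_performance)
    ("R9 290X", 489, 1.00),
    ("GTX 980", 549, 1.13),
]

for name, price, perf in cards:
    print(f"{name}: ${price / perf:.0f} per unit of relative performance")
```

On these figures the 980 works out to roughly $486 per unit of performance against the 290X's $489, so it edges ahead on value even before noise and power are counted.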
 
Last edited:
Joined
Feb 21, 2008
Messages
5,004 (0.81/day)
Location
NC, USA
System Name Cosmos F1000
Processor Ryzen 9 7950X3D
Motherboard MSI PRO B650-S WIFI AM5
Cooling Corsair H100x, Panaflo's on case
Memory G.Skill DDR5 Trident 64GB (32GBx2)
Video Card(s) MSI Gaming Radeon RX 7900 XTX 24GB GDDR6
Storage 4TB Firecuda M.2 2280
Display(s) 32" OLED 4k 240Hz ASUS ROG Swift PG32UCD
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR RM1000e 1000watt
Mouse G400s Logitech, white Razor Mamba
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
VR HMD Steam Valve Index
Software Win10 Pro, Win11
To Wizzard :

I just wanted to ask if you noticed on the refernce model some coil whine coming from the card ?

I know a light coil whine on powerful videocards are normality.

Did anyone check in any reviews for possible coil whine? I am curious despite having one on the way. When mine arrives, what is the best way to get coil whine to present itself? Isn't the best way is have it run a older game super high FPS? I can't remember.
 
Joined
Mar 28, 2014
Messages
586 (0.15/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
I'm OK then. I never use the word f***, and seemingly you don't either

I promise to only use "fuck" and "shit" and not those weird "f***" and "s&!*" notations. Two thumbs up Sony, F*** and s&!* is for pussies!

I highly recommend you to go and check your health in a hospital!

The reason why I wrote it was that I realised how stupid the conversation went, and it should be improved ( and I didn't intend to insult by any means that guy in that case, it was just dirty but innocent wording)!

Why are you so annoyingly arrogant and sarcastic?
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Workload is already taken into account because power draw is measured with the same workload.
Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage figure needs to take the workload into account.
Not sure what you're talking about here. GPUs load-balance in SLI/CFX; driver profile, vRAM and/or CPU limitation, vSync, and app coding will all affect usage (as will GPUs with differing clock/dynamic boost rates). Just because a game can peg GPU usage to near 100%...

...doesn't mean that adding a second card will automatically mean that both are running at 100% in the same circumstances. Same system, same application with a second card added....< 70% GPU usage

And of course, depending on the same app/driver coding and efficiency, and the vRAM and GPU requirements, not all applications are created equal.


I highly recommend you to go and check your health in a hospital!
Oh, Sonny I didn't know you cared! I'll pass though, my super-hypocrite-sense is tingling, so I think I'm good to go.
( and I didn't intend to insult by any means that guy in that case, it was just dirty but innocent wording)!
Well, the poster (the54thvoid) you aimed it at certainly didn't see it that way, either then or now (judging by the fact that he thanked my post and clarified his thoughts just afterwards).
Why are you so annoyingly arrogant and sarcastic?
Check my sig - can't say you weren't warned. Maybe if you applied the same rules to yourself that you are trying to impress upon others, and showed some degree of argumentative consistency, we could put it all behind us and be friends! You have a nice day now.
 
Last edited:
Joined
Sep 21, 2014
Messages
25 (0.01/day)
You may not have noticed, but NVIDIA hasn't been charging $550 for their ultra-high-end chips for a while now. While it's probably true higher-performing Maxwell variants will follow, it's also probably true they won't cost $550.

There are two kinds of buyers:

Guys like me who buy based on price/performance in the current market.

Guys like you who say "WAIT! I think NVIDIA should be charging less for this because they're going to release some more expensive chips later!"

I'd be more surprised if NVIDIA didn't price this at $550:
A. 13% faster than AMD's best at 25X16
B. Either 6 or 16(!) dB lower noise than AMD's best, depending on which mode you run it in. (BTW: 50 dB?! That's FX 5800 Ultra Dustbuster level)
C. 110W less peak power consumption/heat dumped in case
D. Better multi GPU
E. Better 3D solution
F. Better feature set (AA, etc)

The cheapest 290Xs on Newegg today are $489; 980s would be worth the extra $60 on point A alone.

AMD may well release a water-cooled, overclocked card, but my guess is it'll be either highly binned or close to 300 W. I'm expecting a 9590-like product personally (where they do the overclocking and charge a premium for giving you the water cooler).


I did not dispute the performance of the card. I'm just stating that we are getting half of a chip for the price of a flagship. Based on your reasoning, anyone who has an Intel 4790 should be sending extra cash to Intel to make up for the price/performance ratio.

Since the 600 series, all Nvidia has been doing is protecting its product stack.

Case in point: the GTX 780 could have been released long ago, but it wasn't because they didn't feel they had to. When they finally did, it came in at $650, only to be slashed to $500 a month later. How about all those that bought at $650?

I'm just sick of Nvidia giving us "good enough". Had the Titan been $650, or had the 780 been released alongside the Titan, would AMD have come to market with the 290 series sooner? I don't know; what I do know is it would have put more pressure on AMD to come to market with a competing product. Had that been the case, everyone wins: faster product turnover, more reason to upgrade, and 4K becoming available on a mass-market scale. Not to mention more GPU resources for game developers to give us richer, more detailed games.

Basically, this good-enough attitude from Nvidia is holding up progress on many fronts. It would be nice for Nvidia to give us their best from the get-go, something they haven't done since Fermi.

Like I said if Intel did business this way the most powerful CPU available would be the 4790K.
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Case in point: the GTX 780 could have been released long ago, but it wasn't because they didn't feel they had to. When they finally did, it came in at $650, only to be slashed to $500 a month later. How about all those that bought at $650?
That's pretty much the way of the world with high-end cards - you should also check your facts: the GTX 780 price cut came 5 months after its launch, not one month. Care to work out the likely depreciation rate on the 290X? Prices are starting to fall rather rapidly: a card that sold for ~$500 a month ago can be had for 20% less now. That price is an outlier, but I'm guessing that AMD won't be selling too many cards at the current revised MSRP. The HD 7970 received a 10% price cut a few weeks after it launched thanks to the GTX 680's arrival, and the R9 280, launched at $280, has now cratered (~$200, or nearly a 30% price cut) thanks to AMD EOL'ing the card after a three-month lifespan to make way for the R9 285. BTW: that 30% price cut actually exceeds the GTX 780's.
I'm just sick of Nvidia giving us "good enough". Had the Titan been $650, or had the 780 been released alongside the Titan, would AMD have come to market with the 290 series sooner?
No they wouldn't have. Chip design takes years.
I don't know, what I do know is it would have put more pressure on AMD to come to market with a competing product.
Really? If all it takes is a competing product to get AMD to shift gears, why haven't they updated their server/enthusiast desktop platform? R&D is very much a finite commodity for AMD; don't expect them to keep pace when they're fighting a three-front war (x86, ARM, discrete graphics).
Basically, this good-enough attitude from Nvidia is holding up progress on many fronts. It would be nice for Nvidia to give us their best from the get-go, something they haven't done since Fermi.
Odd viewpoint, bearing in mind that both IHVs are dependent upon TSMC's manufacturing process, and both maximize ROI by respinning product (GTX 680 -> GTX 770, and HD 7970 -> HD 7970 GHz Edition -> R9 280X, for example).
 
Last edited:
Joined
Nov 7, 2007
Messages
32 (0.01/day)
Hard to really question NVIDIA's business methods.

They currently have a market cap of over $10b, and their only competitor has a market cap under $3b (and a big chunk of that is the CPU business).

I see a lot of guys on forums saying "Darn you NVIDIA! Give us your best products for $500 as soon as you can get them out the door!".

I sure wouldn't. If I were in charge at NVIDIA and currently had chips 5X more powerful than what we see here, I'd bleed them out just fast enough to keep stomping on AMD and bringing in the high profit quarters. NVDA answers to their board of directors/stockholders, not gamer whim.
 
Joined
Sep 21, 2014
Messages
25 (0.01/day)
That's pretty much the way of the world with high-end cards - you should also check your facts: the GTX 780 price cut came 5 months after its launch, not one month. Care to work out the likely depreciation rate on the 290X? Prices are starting to fall rather rapidly: a card that sold for ~$500 a month ago can be had for 20% less now. That price is an outlier, but I'm guessing that AMD won't be selling too many cards at the current revised MSRP. The HD 7970 received a 10% price cut a few weeks after it launched thanks to the GTX 680's arrival, and the R9 280, launched at $280, has now cratered (~$200, or nearly a 30% price cut) thanks to AMD EOL'ing the card after a three-month lifespan to make way for the R9 285. BTW: that 30% price cut actually exceeds the GTX 780's.

No they wouldn't have. Chip design takes years.

Really? If all it takes is a competing product to get AMD to shift gears, why haven't they updated their server/enthusiast desktop platform? R&D is very much a finite commodity for AMD; don't expect them to keep pace when they're fighting a three-front war (x86, ARM, discrete graphics).

Odd viewpoint, bearing in mind that both IHVs are dependent upon TSMC's manufacturing process, and both maximize ROI by respinning product (GTX 680 -> GTX 770, and HD 7970 -> HD 7970 GHz Edition -> R9 280X, for example).


Product progression is one thing. The 480 to 580 would be another example, or how about the 8800 GT and its progression? In those examples we still got the full chip, with AMD/Nvidia putting their best chip to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20 nm.
Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
Bottom line: consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us consumers who will continue to be shortchanged.
Again, if Intel did business this way, the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
Instead of defending Nvidia, maybe you should defend your pocketbook.
 

thomashrb

New Member
Joined
Sep 22, 2014
Messages
1 (0.00/day)
Thank you @W1zzard for a really good review. To me the performance-per-dollar analysis is incomplete. Due to display restrictions I play at 1080p.

At 1080p the GTX 980 has a 19% performance advantage and a 138 W power advantage over the R9 290X. According to a survey of 34M gamers conducted in May this year, as reported by VentureBeat, the average hardcore gamer plays for 22 hours/week. If you live in NY, that 138 W translates to $26.70/year. Current prices for the GTX 980 and the R9 290X are $550 and $500 respectively. Most manufacturers give you at least a 2-year, and usually a 3-year, warranty on their top-end cards, making the total cost of ownership over the lifespan of the card equal to the capex plus the opex for 3 years.

For 22 hours a week, with the R9 290X consuming 138 W more than the GTX 980, that comes to an extra 157 kWh/year. In NY that amounts to $26.70 in additional power bills; over 3 years that comes to $80.10, making the GTX 980 approximately $30 cheaper to own over the lifespan of the card. In fact, even the areas with the cheapest power in the continental US would pay $47.50 extra. So the question is: who in their right mind would be willing to save (in the very best possible case) $2.50 and buy a device with a 19% performance disadvantage? Simple: those who have not been properly informed of the total cost of ownership.
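The arithmetic above can be sketched out as follows. The 138 W delta, 22 hours/week figure, and $50 price gap come straight from this post; the ~$0.169/kWh NY electricity rate is an assumption back-calculated from the quoted $26.70/year.

```python
# Total-cost-of-ownership sketch for GTX 980 vs R9 290X.
# Assumption: NY rate of ~$0.169/kWh, inferred from $26.70/year.

extra_watts = 138        # 290X power draw above the GTX 980
hours_per_week = 22      # average hardcore gamer (VentureBeat survey)
ny_rate = 0.169          # $/kWh, assumed NY residential rate
price_gap = 550 - 500    # GTX 980 premium over the 290X

# W * h/week * 52 weeks -> kWh/year
extra_kwh_per_year = extra_watts * hours_per_week * 52 / 1000
cost_per_year = extra_kwh_per_year * ny_rate
cost_3_years = cost_per_year * 3

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")    # ~158 kWh
print(f"Extra power cost over 3 years: ${cost_3_years:.2f}")  # ~$80
print(f"290X net position after power: ${price_gap - cost_3_years:.2f}")
```

Under these assumptions the 290X's $50 sticker saving is wiped out by roughly $80 of extra electricity over a 3-year warranty period, which is where the "~$30 cheaper to own" figure comes from.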

I really enjoyed the article, and especially the extra effort taken to really get into some useful usage metrics. But I did find the Performance per Dollar section to be an incomplete analysis and borderline misleading (even though the oversight is obviously not intentional).
 
Joined
Nov 7, 2007
Messages
32 (0.01/day)
Product progression is one thing. The 480 to 580 would be another example, or how about the 8800 GT and its progression? In those examples we still got the full chip, with AMD/Nvidia putting their best chip to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20 nm.
Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
Bottom line: consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us consumers who will continue to be shortchanged.
Again, if Intel did business this way, the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
Instead of defending Nvidia, maybe you should defend your pocketbook.

So what position do you hold at NVIDIA?

The reason I ask is that the only way you could actually know any of what you allege is if you worked in NVIDIA engineering or management.

If you don't, everything you have postulated is nothing more than conspiracy theory and speculation.
 