# GeForce GTX 960 3DMark Numbers Emerge



## btarunr (Jan 16, 2015)

Ahead of its January 22nd launch, members of Chinese PC community PCEVA leaked performance figures for NVIDIA's GeForce GTX 960. The card was installed in a test-bed driven by a Core i7-4770K overclocked to 4.50 GHz, and the card itself appears to be factory-overclocked, if these specs are to be believed. The card scored P9960 and X3321 in the Performance and Extreme presets of 3DMark 11, respectively. On standard 3DMark FireStrike, the card scored 6636 points; with some manual overclocking thrown in, it managed 7509 points in the same test. FireStrike Extreme (1440p) was harsh on this card, which scored 3438 points, and FireStrike Ultra was too much for the card to chew on, at just 1087 points. Looking at these numbers, the GTX 960 could be an interesting offering for Full HD (1920 x 1080) gaming, and not a pixel more.






*View at TechPowerUp Main Site*


----------



## GAR (Jan 16, 2015)

gotta say, I'm a bit disappointed here by the 128 bit bus, weak, nvidia, weak.


----------



## dj-electric (Jan 16, 2015)

"256bit? how the hell are these gtx 900 cards going to beat R9 290 series? no way in hell".

Stop thinking in numbers, start thinking in results.


----------



## the54thvoid (Jan 16, 2015)

GAR said:


> gotta say, I'm a bit disappointed here by the 128 bit bus, weak, nvidia, weak.





Sony Xperia S said:


> nvidia sucks.



Just go home.  Or go to school.  Or maybe go live in that cave forever.

My 780 Ti has a 384-bit bus.  The 980 has a 256-bit bus and only 1 GB of extra memory.  Its bus is 128 bits narrower than my card's (only 2/3 of a 780 Ti's), yet it's faster than my card.  Yes, it uses higher clocks, but it also performs better at 4K, and that's not all down to the extra 1 GB of memory.  The Maxwell cards feature an effective texture compression algorithm that offsets the need for a larger bus.  When folk bitch about it having a small bus they simply don't understand the engineering or the market.   This card only has 2 GB of memory and its GPU isn't powerful enough for 1440p+ gaming; it doesn't need (in fact, the GPU chip itself can't handle) lots of bandwidth.

Stop being so ignorant about the technologies, it's tiresome and childish.  Maxwell doesn't need as large a memory bus because of other technology developments - get over it.
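If anyone wants to sanity-check the arithmetic rather than argue specs, here's a rough sketch. The ~1.3x compression factor below is an illustrative assumption for the sake of the example, not a published NVIDIA figure:

```python
# Raw GDDR5 bandwidth: (bus width / 8) bytes per transfer * effective data rate.
def raw_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Effective bandwidth once lossless delta colour compression is factored in.
def effective_bandwidth_gbs(raw_gbs: float, compression_ratio: float) -> float:
    return raw_gbs * compression_ratio

gtx_780ti = raw_bandwidth_gbs(384, 7.0)        # 336.0 GB/s, no compression
gtx_980_raw = raw_bandwidth_gbs(256, 7.0)      # 224.0 GB/s raw
# Assumed ~1.3x gain from compression (illustrative only):
gtx_980_eff = effective_bandwidth_gbs(gtx_980_raw, 1.3)  # ~291 GB/s effective
```

Under that assumption the 980's narrower bus ends up within spitting distance of the 780 Ti's raw number, which is the whole point of the compression argument.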


----------



## Bjorn_Of_Iceland (Jan 16, 2015)

This is like a GTX 680 sipping even less current.


----------



## buggalugs (Jan 16, 2015)

Yep, disappointing. A mid range card should be able to run 4K in 2015, or at least 1440p.


----------



## Recus (Jan 16, 2015)

The AMD Defense Force can't handle the truth that a 128-bit card is almost as fast as the 384-bit 280X.


----------



## Nullifier (Jan 16, 2015)

Ughhh, 7k graphics score on FireStrike normal.
Not even 1k on Ultra.
Would need 2 to compete with a single 970 at 1080p.
And like 6 to compete with a single 970 at 4K, LOL


----------



## john_ (Jan 16, 2015)

Dj-ElectriC said:


> "256bit? how the hell are these gtx 900 cards going to beat R9 290 series? no way in hell".
> 
> Stop thinking in numbers, start thinking in results.



At higher resolutions, that 512-bit data bus on Radeon cards gives them the edge in many cases. So, yes. You should ALSO take into consideration those numbers.


----------



## Overclocker_2001 (Jan 16, 2015)

Those numbers aren't that bad. The card can handle 1080p quite well (over 4x AA will kill the fps, but really, can't you live with 2x/4x AA?).

The biggest part of the card's potential is its price. I think a launch price of $199 is quite good; if after 4-6 weeks the card settles into the $170-180 price range, then it's a very good buy. And if the card supports SLI (judging from the Gigabyte G1, I will say yes), then SLI on the midrange is quite a good option for eye-candy detail at 1080p (assuming a price of $175 per card and a total power consumption of about 90-105% of a single GTX 970).

A card that performs like a GTX 680 but with 2/3 of its TDP is good (if the price is right)!


----------



## Xzibit (Jan 16, 2015)

So if it's competing against the AMD R9 280 it should be $159-$199. If it's replacing the GTX 770, $279-$329.


----------



## dj-electric (Jan 16, 2015)

john_ said:


> At higher resolutions, that 512-bit data bus on Radeon cards gives them the edge in many cases. So, yes. You should ALSO take into consideration those numbers.



http://tpucdn.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/images/perfrel_3840.gif
How much higher? 5K? 8K?


----------



## the54thvoid (Jan 16, 2015)

Xzibit said:


> So if it's competing against the AMD R9 280 it should be $159-$199. If it's replacing the GTX 770, $279-$329.



It's Nvidia, expect a premium over whatever it's meant to compete with.


----------



## HumanSmoke (Jan 16, 2015)

the54thvoid said:


> Just go home.  Or go to school.  Or maybe go live in that cave forever.
> My 780 Ti has a 384-bit bus.  The 980 has a 256-bit bus and only 1 GB of extra memory.  Its bus is 128 bits narrower than my card's (only 2/3 of a 780 Ti's), yet it's faster than my card.  Yes, it uses higher clocks, but it also performs better at 4K, and that's not all down to the extra 1 GB of memory.  The Maxwell cards feature an effective texture compression algorithm that offsets the need for a larger bus.  When folk bitch about it having a small bus they simply don't understand the engineering or the market.


Hey, it's Nvidia card launch week - welcome to the AMD Troll-a-thon!  I think they're just getting their frustrations out because of the lack of action from Team Red (Ink) and some impending bad news regarding AMD's Q4/yearly financials and graphics market share numbers - coincidentally due at the same time Nvidia launches the 960 (and possibly the M6000). Schadenfreude looms large.
Having said that, I'm sure true enthusiasts would recall that in addition to the delta colour compression, Nvidia stated that a Maxwell shader module has 90% of the performance of the Kepler SM thanks to the rejigged resources and additional cache. In the 780 Ti's case, its 15 SMXs would equate fairly well with the GTX 980's 16 SMMs, with the latter pulling away in demanding situations thanks to the higher ROP resources.
With a few salient facts at hand, it should be a relatively easy matter to deduce the performance parameters without resorting to hyperbole.
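The SM parity claim is easy enough to check on the back of an envelope, taking the 90% per-module figure at face value and ignoring clocks and ROPs:

```python
# Per-clock throughput comparison, assuming a Maxwell SMM delivers ~90% of a
# Kepler SMX (the figure quoted above); clock speeds and ROPs are ignored here.
KEPLER_SMX_780TI = 15     # shader modules on the GTX 780 Ti
MAXWELL_SMM_980 = 16      # shader modules on the GTX 980
SMM_PER_SMX = 0.90        # assumed per-module throughput ratio

smx_equivalents = MAXWELL_SMM_980 * SMM_PER_SMX  # 14.4 "SMX equivalents"
# 14.4 vs 15.0: rough per-clock parity, before the 980's higher clocks and
# greater ROP throughput pull it ahead in demanding situations.
```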

I'll stop now, so that some random can explain why this card is a fail because it can't deliver playable framerates at 1080p with 8xSSAA enabled.


----------



## Fluffmeister (Jan 16, 2015)

GM200 does indeed seem to be ready to drop too (no doubt in the guise of the M6000 initially).

http://videocardz.com/54358/nvidia-maxwell-gm200-pictured


----------



## RCoon (Jan 16, 2015)

HumanSmoke said:


> I'll stop now, so that some random can explain why this card is a fail because it can't deliver playable framerates at 1080p with 8xSSAA enabled.



I'm currently prepping an article that covers memory bandwidth usage figures for a couple of AAA titles and a couple of "generic" titles: how it correlates with GPU usage, VRAM usage, and PCIe bus usage, and how much (approximately) memory bandwidth said games actually use at Very High presets at 1080p and 1440p. It will probably take a few days though; I've only done two benchmarks at 1440p, but I should have something concrete before the 960 releases.

All I'm saying after these first couple of tests is, without Maxwell's compression methods, the 970's memory bandwidth would have been totally saturated at a Very High preset at 1440p. But _with_ the compression, it's got a butt-load of bandwidth to spare. The 770 has identical bandwidth, so if I can safely assume Maxwell compression is precisely 30%, a 4GB 770 would have too little memory bandwidth available for this one game at 1440p.
I'm being as vague as possible for now so my article doesn't become entirely worthless, and I've not done enough tests to give you a 100% certain answer. Not to mention Nvidia are currently the only people that let you measure PCIe bus usage; AMD don't have it nailed down yet.
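In pseudo-numbers, the 770-vs-970 claim works out like this. The 30% compression figure is the working assumption above, and the demand number below is a made-up placeholder, not a measured result:

```python
# Both cards use a 256-bit interface at 7 Gbps, so raw bandwidth is identical.
RAW_GBS = 256 / 8 * 7.0          # 224.0 GB/s for both the GTX 770 and GTX 970
MAXWELL_COMPRESSION = 1.30       # assumed ~30% gain, Maxwell (970) only

def saturated(demand_gbs: float, raw_gbs: float, compression: float = 1.0) -> bool:
    """True if a game's bandwidth demand exceeds what the card can deliver."""
    return demand_gbs > raw_gbs * compression

demand = 250.0  # placeholder demand in GB/s, NOT measured data
kepler_saturated = saturated(demand, RAW_GBS)                        # True
maxwell_saturated = saturated(demand, RAW_GBS, MAXWELL_COMPRESSION)  # False
```

Any measured demand that lands between the raw and the compressed figure would saturate the 770 while leaving the 970 headroom, which is the scenario described above.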

I took a day off my day-job to do this, and now I'm starting to realise it's going to take a lot longer.


----------



## GhostRyder (Jan 16, 2015)

Recus said:


> The AMD Defense Force can't handle the truth that a 128-bit card is almost as fast as the 384-bit 280X.


Because the bus is all that matters? By your logic it's almost as fast as a 256-bit GTX 770, which is slower than an R9 280X, so what was the point of the comment?



Xzibit said:


> So if it's competing against the AMD R9 280 it should be $159-$199. If it's replacing the GTX 770, $279-$329.


Most likely it's supposed to sit above a GTX 760 and below a GTX 770 overall, while eventually there will probably be a GTX 960 Ti or similar to fill the gap.  But that's just my guess...


GAR said:


> gotta say, I'm a bit disappointed here by the 128 bit bus, weak, nvidia, weak.


RAM speed can alleviate a low bus width, and on top of that, when we mix in the new techs (colour compression, etc.) we get a little more strength out of something like that.  It's more than enough for a card of this magnitude.  Having a 256-bit bus on this card would not really have made much sense based on where it's aimed: it would just add more power to a card that would potentially undermine the GTX 970, which undermines the GTX 980, resulting in harder sells the higher they go.  On top of that, I am betting there is a chance of a GTX 960 Ti that might incorporate something like a 192-bit bus later on, but that is just a guess.


john_ said:


> At higher resolutions, that 512-bit data bus on Radeon cards gives them the edge in many cases. So, yes. You should ALSO take into consideration those numbers.


RAM speeds do alleviate this, as that's part of what gives a 256-bit bus the feel of a much larger bus width.  Though the 512-bit bus really has helped make the R9 290/X very well suited to high-resolution gaming (Eyefinity, 4K, etc.).

Not sure why every thread regarding NVidia, AMD, or Intel has to be a fest of who can make the most ignorant comment that starts a fan war.  But I guess certain people constantly cooking up AMD hatred in each thread, and the people automatically complaining about NVidia not overpowering a middle-ground card, are the biggest reasons these threads end up as spam fests.


----------



## MxPhenom 216 (Jan 16, 2015)

buggalugs said:


> Yep, disappointing. A mid range card should be able to run 4K in 2015, or at least 1440p.



A single flagship card can barely get away with 30fps at 4K... Either you're serious or trolling; I hope it's the latter, or my faith in this community has just dropped.


----------



## Blue-Knight (Jan 16, 2015)

someone said:

> gotta say, I'm a bit disappointed here by the 128 bit bus, weak, nvidia, weak.


This is like saying my AMD is better because it runs at 5GHz while your Intel runs at only 3GHz.



someone said:

> Yep, disappointing. A mid range card should be able to run 4K in 2015, or at least 1440p.


Sarcasm!? 

If not: it should not be able to run 4K in 2015, maybe not even in 2016, or, more seriously, 2017.
Why? I can count on one hand the people who have a 4K monitor but can only afford a mid range card (if any at all).


----------



## HisDivineOrder (Jan 16, 2015)

MxPhenom 216 said:


> A single flagship card can barely get away with 30fps at 4K... Either you're serious or trolling; I hope it's the latter, or my faith in this community has just dropped.




I think the answer is... neither.

I think he was merely being sarcastic to illustrate the comments to come and the absurdity therein.


----------



## MxPhenom 216 (Jan 16, 2015)

HisDivineOrder said:


> I think the answer is... neither.
> 
> I think he was merely being sarcastic to illustrate the comments to come and the absurdity therein.



I hope so. That is why I gave the option of trolling.


----------



## _larry (Jan 16, 2015)

AMD destroys the low/mid-range sector, and this confirms it. Some may argue that the 970 is a mid-range card... for $300+, that is hardly mid-range.


----------



## Baffles (Jan 16, 2015)

Blue-Knight said:


> I can count on one hand the people who have a 4K monitor but can only afford a mid range card (if any at all).



You rang? Running my Samsung 4K on an R9 270X (the buyer's remorse I have for that monitor is immeasurable, but I got it, so now I use it). 

A lot of people forget that you don't need to run max settings to enjoy a game; just running the native resolution on low/medium is good enough. 


The 960 seems to be scoring a reasonable amount higher than the 760, and with Maxwell it should use significantly less power. As long as it's priced competitively with the outgoing 760, I see no problem here. 

Also, people: if you don't like it, don't buy it. It's simple.


----------



## Casecutter (Jan 16, 2015)

btarunr said:


> The card itself appears to be factory-overclocked, if these specs are to be believed.  On _*standard*_ 3DMark FireStrike, the card scored 6636 points.


 


Recus said:


> AMD Defense Force can't handle the truth that 128bit card is almost fast as 280X 384bit.


 
Is there some difference between "standard" and "extreme" FireStrike?  I have never heard it termed "standard", just good old "FireStrike Extreme".


----------



## ManofGod (Jan 16, 2015)

WOW, just WOW. The 128-bit defense force is out in force today.  They have to justify the decrepit 128-bit bus by calling others AMD fanboys. Shit, crap is crap whether someone likes it or not. The 128-bit bus is from last decade and needs to be dropped on anything above $150. This card is low end at best compared to everything in its price range.

The 128-bit bus will quickly get overwhelmed by anything coming out in the next year or so. Sorry, but Nvidia pushed this crap just so people would buy their GTX 970s instead. Otherwise, with a 256-bit bus, 970 sales would be eaten into by the 960.

As Baffles said, if you do not like it, do not buy it. For what it is, it is probably fine, but a 128-bit bus is still very limiting.


----------



## bogmali (Jan 16, 2015)

Play nice folks and keep this going without the name-calling/insults


----------



## ManofGod (Jan 16, 2015)

bogmali said:


> Play nice folks and keep this going without the name-calling/insults



You are correct, thanks.  I just wish folks would see computer hardware as it is and not with red, green or blue tinted glasses.


----------



## thebluebumblebee (Jan 16, 2015)

When Nvidia introduced the Kepler (GTX 680) line, they did something different than they had in the past - they introduced the mid-range GPU first, albeit at the price of the previous high-end GPU.  Brilliant marketing move.  It allowed them to move the price scale up.  No longer were they selling Gx204 GPU's for $260.  Now they're selling GM204's for $550.  When I read through a thread like this, I see that a lot of people don't seem to understand this.  This GTX 960's "grandfather" is the GTS 450.
(please don't get picky with this list - I know there are some minor factual errors, but I've tried to compress as much as possible)
Gx200/210 - Nvidia high end: GTX 285 - 480/580 - 780/780 Ti
Gx204/214 - Nvidia mid range: GTX 260 - 460/560 - 680/770 - 970/980
Gx206 - Nvidia entry: GTS 450 - GTX 550 - 660 - 960
... and that leaves the GTX 750/Ti.  Notice that it's not SLI compatible?  It belongs to the group that had the 8400GS - GT 210/220 - GT 440/530/630/730 (the 750 Ti is my pet peeve - it should cost less than $100)

It's easy to get lost in the numbers, so maybe think of it this way.  What's been the difference between the mid-range cards and the high end cards since the GTX 2xx days?  In most cases, it's been the settings that you could run a game at.  You could get nearly identical FPS, just not at the same detail settings. 
So,


buggalugs said:


> Yep, disappointing. A *mid range card* should be able to run 4K in 2015. or at least 1440p.


..is right, but the GTX 960 is *not* a mid-range card.  The 970/980 are, and can.


----------



## Fluffmeister (Jan 16, 2015)

Presumably then it's the competition that need to up their game?


----------



## 64K (Jan 16, 2015)

thebluebumblebee said:


> When Nvidia introduced the Kepler (GTX 680) line, they did something different than they had in the past - they introduced the mid-range GPU first, albeit at the price of the previous high-end GPU.  Brilliant marketing move.  It allowed them to move the price scale up.  No longer were they selling Gx204 GPU's for $260.  Now they're selling GM204's for $550.  When I read through a thread like this, I see that a lot of people don't seem to understand this.  This GTX 960's "grandfather" is the GTS 450.
> (please don't get picky with this list - I know there are some minor factual errors, but I've tried to compress as much as possible)
> Gx200/210 - Nvidia high end: GTX 285 - 480/580 - 780/780 Ti
> Gx204/214 - Nvidia mid range GTX 260 - 460/560 - 680/770 - 970/980
> ...




Well said. Nvidia confused a lot of people with the GTX 680 release and the fallout seems to be lingering even to this day.


----------



## rtwjunkie (Jan 16, 2015)

ManofGod said:


> WOW, just WOW. The 128-bit defense force is out in force today.  They have to justify the decrepit 128-bit bus by calling others AMD fanboys. Shit, crap is crap whether someone likes it or not. The 128-bit bus is from last decade and needs to be dropped on anything above $150. This card is low end at best compared to everything in its price range.
> 
> The 128-bit bus will quickly get overwhelmed by anything coming out in the next year or so. Sorry, but Nvidia pushed this crap just so people would buy their GTX 970s instead. Otherwise, with a 256-bit bus, 970 sales would be eaten into by the 960.
> 
> As Baffles said, if you do not like it, do not buy it. For what it is, it is probably fine, but a 128-bit bus is still very limiting.


 
That's just it, you're living in the last decade.  This is new technology, and you really have to forget what you knew about bus-width and performance.  The compression means that a 128 bit bus now acts the same as at LEAST a 192 bit bus.  There's nothing wrong with this where it's aimed.  The performance numbers so far show it between 760 and 770 performance.  If it's also more energy efficient, it's a complete win, and they can replace the 760 in their lineup.


----------



## MxPhenom 216 (Jan 16, 2015)

rtwjunkie said:


> That's just it, you're living in the last decade.  This is new technology, and you really have to forget what you knew about bus-width and performance.  The compression means that a 128 bit bus now acts the same as at LEAST a 192 bit bus.  There's nothing wrong with this where it's aimed.  The performance numbers so far show it between 760 and 770 performance.  If it's also more energy efficient, it's a complete win, and they can replace the 760 in their lineup.



There are 960 Ti rumors as well that could fill in the gap between the 960 and 970 even further.


----------



## john_ (Jan 16, 2015)

Dj-ElectriC said:


> http://tpucdn.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/images/perfrel_3840.gif
> How much higher? 5K? 8K?


Calm down. I said in many cases. Maybe I should have said "some cases" so you don't explode. (I am just kidding.)

Anyway, the fact is that moving higher in resolution narrows the difference between the 980 and 290X. If you compare the 970 and 290X it's even more obvious that the extra bandwidth helps the Radeon card: the 970 is on top at the two lower resolutions and loses at 1440p and 2160p. That's from the same page of results your chart is from. 

Now....

These numbers come from Tom and I think everyone knows how much they love AMD there.

Have a look at this
AnandTech | The NVIDIA GeForce GTX 970 Review: Featuring EVGA - Print View
If you look at the results there, the higher the resolution, the smaller the advantage of the Nvidia cards over the AMD cards. More than that, in a couple of cases the AMD cards are on top, and in a few more, the Radeon cards have higher minimum framerates, which is maybe more important than the average. 
I won't start spamming charts; you can follow the link.


----------



## thebluebumblebee (Jan 16, 2015)

This argument about bus width is - well - silly.  I believe that bus width is a major factor in determining the cost of a video card.  Look at my list above and (if you research it) you will notice similarities in the bus width for the different performance segments.  Is Nvidia getting more performance out of the same bus width?  I sure hope so!  Think of it this way: if you were to go and buy a Ford F150 today, would you want a 1980's-spec 5.0l or a current Eco-boost?

Just noticed that the 960 and 750 Ti share the same bus width.


----------



## GhostRyder (Jan 16, 2015)

thebluebumblebee said:


> When Nvidia introduced the Kepler (GTX 680) line, they did something different than they had in the past - they introduced the mid-range GPU first, albeit at the price of the previous high-end GPU.  Brilliant marketing move.  It allowed them to move the price scale up.  No longer were they selling Gx204 GPU's for $260.  Now they're selling GM204's for $550.  When I read through a thread like this, I see that a lot of people don't seem to understand this.  This GTX 960's "grandfather" is the GTS 450.
> (please don't get picky with this list - I know there are some minor factual errors, but I've tried to compress as much as possible)
> Gx200/210 - Nvidia high end: GTX 285 - 480/580 - 780/780 Ti
> Gx204/214 - Nvidia mid range GTX 260 - 460/560 - 680/770 - 970/980
> ...


I guess it really comes down to what is defined as a "high end" card.  The problem is that the definition has shifted from what we were used to in the past, and now it can get quite confusing.  Your chart is very accurate, and it shows how NVidia has basically changed the definition of what they call high end starting with the GTX 680.  The question is how we define it anymore: by the chips themselves, by the performance, or by something else?

Mostly I feel it has come down to a performance point: if they make a chip that can best their previous chip, then they define it as the next high-end chip and market it as such.  The chips in the GTX 980/970 are the successors to the GTX 680/770 chip, but they best the previous high-end chip (well, the GTX 980 does), so to them that is now what is king.  To add to that, I think it's also primarily because they do not want to release a new supremely high-end GPU (GM200, which they would rather keep working on and tweaking to make better, with less chance of problems) that would smoke their previous high-volume GPUs.  If they did, and released the GTX 980 as the 970 at a $300 price point (just a random guesstimate) and GM200 as a GTX 980/980 Ti/Titan in the $500-$1000 range, then who would buy the GTX 780s and below for any reasonable amount of money?  I think it's so they do not lose much profit on the previous cards while they clear them out, and gain additional profit from chips that may not be the "highest end" of the new generation.  Again, that is just my speculation based on what I have read/seen.



Fluffmeister said:


> Presumably then it's the competition that need to up their game?


How about I retort with the same thing?  By your logic, AMD didn't have much reason to best anything either: the HD 7970 beat the GTX 580 by a significant amount, and the HD 7970/R9 280X bested the GTX 680/770, so why would they bother?



64K said:


> Well said. Nvidia confused a lot of people with the GTX 680 release and the fallout seems to be lingering even to this day.


Yeah, because the problem is that numbers/names can be more powerful to people than actual facts/specs.  Most PC gamers I see at LAN parties and such judge their cards (and other components) more on higher numbers than anything else.  I think names play one of the most important roles here, and that leads people to conclude that "because the number/name is higher/better, it must be better".



MxPhenom 216 said:


> There are 960 Ti rumors as well that could fill in the gap between the 960 and 970 even further.


That is what I am betting on, especially if they price this at $200; I would then guess a $250 960 Ti would be next.

The 960 is what it is; it may not have glorious specs, and there are probably cards out there for the same money that will best it.  But the fact is it's being marketed as a middle-ground card and will be judged as such, whether or not the specs say otherwise, because it will perform well in that area we define as the middle.  This is just my opinion, of course, but it's going to be a good card for the 1080p gamer.


----------



## the54thvoid (Jan 16, 2015)

The issue of high end versus mid end is not one of technology. It's chronological and price-based. If brand A releases a product that is superior to brand B's, then as long as both are at market simultaneously, it is 'high' end. You cannot define the contemporary leading product as mid end. It is high end.
When that manufacturer releases a faster version, the previous product can become mid range. A perfect example is the 680: it was high end, but with the 780 (or arguably the Titan), it was rebadged as the 770 and became mid range.
High end is a market definition, NOT a technological one.


----------



## ManofGod (Jan 16, 2015)

thebluebumblebee said:


> This argument about bus width is - well - silly.  I believe that bus width is a major factor in determining the cost of a video card.  Look at my list above and (if you research it) you will notice similarities in the bus width for the different performance segments.  Is Nvidia getting more performance out of the same bus width?  I sure hope so!  Think of it this way: if you were to go and buy a Ford F150 today, would you want a 1980's-spec 5.0l or a current Eco-boost?
> 
> Just noticed that the 960 and 750 Ti share the same bus width.



Well, considering the responses to folks saying they were disappointed in the 128-bit bus, I am not surprised this started turning into a crap fest. A hardware limit is still a hardware limit no matter what is done. Well, they had to cut costs somewhere, and I guess this is where it happened.


----------



## Blue-Knight (Jan 16, 2015)

I would still wait to see real benchmarks, with real games, on the actual graphics card in question before saying anything. Obscure 3DMark results do not satisfy me in any way.

But that is me.


----------



## 64K (Jan 16, 2015)

MxPhenom 216 said:


> There are 960 Ti rumors as well that could fill in the gap between the 960 and 970 even further.



I'm almost certain we will see another card between the GTX 960 and GTX 970. The 960 sits between a 760 and a 770, and the 970 sits between a 780 and a 780 Ti, leaning towards the 780 Ti side. There is a gap between the 770 and 780 that needs to be filled. They may call it a 960 Ti or a 965.


----------



## Fluffmeister (Jan 16, 2015)

GhostRyder said:


> How about I retort with the same thing?  By your logic, AMD didn't have much reason to best anything either: the HD 7970 beat the GTX 580 by a significant amount, and the HD 7970/R9 280X bested the GTX 680/770, so why would they bother?



I'm not the one getting hung up on chip code names; if the performance fits, I couldn't give two shits where it sits in their current chip hierarchy.

Besides, the GTX 580 was released in November 2010, with the 6970 failing to take the crown, so it's no surprise a card coming over a year later and benefiting from a node shrink should be faster. Wizz concluded ~15% on average, so really not that exciting in the grand scheme of things.

The 280X and 770 appear neck and neck to me even now in Wizz's latest reviews, so yeah. /shrug


----------



## john_ (Jan 16, 2015)

$250-$300 That's the latest rumors. 
Various AIB's Geforce GTX 960 Pictures and Preliminary Pricing Leaked - ASUS, Zotac and EVGA Included


----------



## MxPhenom 216 (Jan 16, 2015)

thebluebumblebee said:


> This argument about bus width is - well - silly.  I believe that bus width is a major factor in determining the cost of a video card.  Look at my list above and (if you research it) you will notice similarities in the bus width for the different performance segments.  Is Nvidia getting more performance out of the same bus width?  I sure hope so!  Think of it this way: if you were to go and buy a Ford F150 today, would you want a 1980's-spec 5.0l or a current Eco-boost?
> 
> Just noticed that the 960 and 750 Ti share the same bus width.



And the 750 Ti is fairly decent at 1080p for its price point.


----------



## Xzibit (Jan 16, 2015)

john_ said:


> $250-$300 That's the latest rumors.
> Various AIB's Geforce GTX 960 Pictures and Preliminary Pricing Leaked - ASUS, Zotac and EVGA Included



At those prices it's competing with the AMD R9 280X & R9 290 and just replacing the GTX 770 in pricing. It has a better chance of replacing the GTX 760 in pricing at $229-$259, but that still puts it up against the R9 280X.

I still think it needs to be priced at $199 to compete with the R9 280, if those scores are any indication of the performance. 7 days.


----------



## HumanSmoke (Jan 16, 2015)

john_ said:


> $250-$300 That's the latest rumors.
> Various AIB's Geforce GTX 960 Pictures and Preliminary Pricing Leaked - ASUS, Zotac and EVGA Included


Well, at those prices it wouldn't be very competitive to say the least, especially when another $40 at the top end buys a pretty well-specced 970.
Having said that, I seem to recall that the GTX 970's rumoured MSRP was $400 almost right up until the card actually launched.


----------



## rtwjunkie (Jan 16, 2015)

HumanSmoke said:


> Well, at those prices it wouldn't be very competitive to say the least, especially when another $40 at the top end buys a pretty well-specced 970.
> Having said that, I seem to recall that the GTX 970's rumoured MSRP was $400 almost right up until the card actually launched.


 
I think we're best just waiting until next week to see.


----------



## GhostRyder (Jan 16, 2015)

Fluffmeister said:


> I'm not the one that is getting hung up on chip code names, if the performance fits I couldn't give two shits where it sits in their current chip hierarchy.
> 
> Besides the GTX 580 was released in November 2010, with the 6970 failing to take the crown, it's no surprise a card coming over a year later and benefiting from a node shrink should be faster. Wizz concluded ~15% on average, so really not that exciting in the grand scheme of things.
> 
> The 280X and 770 appear neck and neck to me even now in Wizz latest reviews, so yeah. /shrug


So again, comparing a card released a year later is not OK, yet you talk about it constantly and make references to cards spaced a year apart?  On top of that, if a 15% average is not that interesting, then why is less than 15% interesting now?



Xzibit said:


> At those prices it's competing with the AMD R9 280X & R9 290 and just replacing the GTX 770 in pricing. It has a better chance of replacing the GTX 760 in pricing at $229-$259, but that still puts it up against the R9 280X.
> 
> I still think it needs to be priced at $199 to compete with the R9 280, if those scores are any indication of the performance. 7 days.


I agree, though to be fair, benchmarks only go so far in my book, as I prefer seeing the actual games running before I make judgments.  As long as it's priced accordingly it's going to be a good value, but who knows until the actual release.



john_ said:


> $250-$300 That's the latest rumors.
> Various AIB's Geforce GTX 960 Pictures and Preliminary Pricing Leaked - ASUS, Zotac and EVGA Included


I hope it is not that close to $300... dang, that would be a terrible value pitted against the GTX 970.  $250 to me would be pushing it, honestly, though I may be judging too early before seeing its actual performance.


----------



## 64K (Jan 16, 2015)

Just my opinion, but $250 is too much for a card coming in between a GTX 760 and a GTX 770, the reason being that you can pick up an R9 290 for $10 or $15 more that outperforms a GTX 770 by ~15%.


----------



## Xzibit (Jan 16, 2015)

You so silly, that's all resolutions; we don't even know if it will be able to handle anything above 1080p.



64K said:


> Just my opinion but $250 is too much for a card coming in between a GTX 760 and a GTX 770. Reason being that you can pick up a R9 290 for $10 or $15 more that outperforms a GTX 770 by ~15%



You know how the replies will go.



Fluffmeister said:


> Wizz concluded ~15% on average, so really not that exciting in the grand scheme of things.


----------



## Casecutter (Jan 16, 2015)

Fluffmeister said:


> ... the GTX 580 was released in November 2010, with the 6970 failing to take the crown, it's no surprise a card coming over a year later and benefiting from a node shrink should be faster. Wizz concluded ~15% on average, so really not that exciting in the grand scheme of things.


To that history... the database shows both at 40 nm!  The GTX 580 showed up Nov 9th, 2010 (520 mm²); the 6970 on Dec 14th, 2010 (389 mm²). While the 6970 arrived a month after the GTX 580 and, yes, didn't take the "crown", it took its thunder. At 2560x1600 it was around 10% slower, but MSRP'd for 25% less, offering 12% better perf/W and 15% better perf/$.  So it did more to "raise the bar" in ways that folk at the time didn't seem as eager to tout; in today's thinking, many would consider that a win!
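For anyone who wants to check that kind of perf-per-dollar / perf-per-watt arithmetic, here's a minimal sketch in Python. The input figures (relative performance at 2560x1600, launch MSRPs, approximate average gaming power draw) are rough period approximations for illustration, not measured data:

```python
# Rough perf/$ and perf/W comparison, HD 6970 vs GTX 580.
# All figures below are illustrative approximations, not measured data.
def relative_metric(perf_a, cost_a, perf_b, cost_b):
    """How much more perf-per-cost card A delivers vs card B (1.0 = equal)."""
    return (perf_a / cost_a) / (perf_b / cost_b)

perf_6970, perf_580 = 0.90, 1.00        # 6970 ~10% slower at 2560x1600
price_6970, price_580 = 369.0, 499.0    # launch MSRPs, USD
power_6970, power_580 = 186.0, 230.0    # assumed average gaming draw, watts

perf_per_dollar = relative_metric(perf_6970, price_6970, perf_580, price_580)
perf_per_watt = relative_metric(perf_6970, power_6970, perf_580, power_580)

print(f"6970 vs 580 perf/$: {perf_per_dollar:.2f}x")  # > 1.0 favors the 6970
print(f"6970 vs 580 perf/W: {perf_per_watt:.2f}x")
```

Small differences from the 15%/12% quoted above come down to which review numbers you plug in.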


----------



## 64K (Jan 16, 2015)

Xzibit said:


> You so silly that's all resolutions we don't even know if it will be able to handle anything above 1080p
> 
> 
> 
> You know how the replys will go.



I doubt that this card is targeted at anything above 1080p.

It's a bit more tedious to post multiple resolutions but ok here you go


The GTX 960 will undoubtedly fall somewhere between the GTX 760 and GTX 770. If the price point is $250 then the R9 290 for $10 to $15 more will be the better deal.


----------



## Xzibit (Jan 16, 2015)

64K said:


> I doubt that this card is targeted at anything above 1080p.
> 
> It's a bit more tedious to post multiple resolutions but ok here you go
> 
> The GTX 960 will undoubtedly fall somewhere between the GTX 760 and GTX 770. If the price point is $250 then the R9 290 for $10 to $15 more will be the better deal.



Sorry, I know what you meant, I was just making light of it. Sarcasm and a dry sense of humor don't come across in text.

If the price is $200+ it will be the 750/Ti again: focus will be on power consumption rather than price/performance.  On the Nvidia side it will be seen as a bargain because the performance/prices of comparable 600 & 700 series cards are much higher.


----------



## Fluffmeister (Jan 17, 2015)

GhostRyder said:


> So again comparing a card released a year later is not ok yet you talk about it constantly and make references to cards spaced a year out???  On top of that if 15% average is not that interesting then why is less than 15% interesting now?



By all means compare what you want; it doesn't change the fact that the 7970 wasn't quite as special as you thought.

I could bang on about the time scale differences, using the same node, power savings, more performance, and cherry pick benchmarks till the cows come home.
*So it did more to "raise the bar" in ways that at that time folk didn't seem as eager to tout, in today's thinking many are consider[ing] that as a Win!* - Thanks Casecutter



Casecutter said:


> To that History... The database is showing both at 40nm!  The GTX 580 showed Nov 9th, 2010 (520 mm²)/ 6970 Dec 14th, 2010 (389 mm²). While the 6970 showed a month after the GTX580, yes didn't take the "crown", but it took its thunder. At 2560x1600 it was like 10% less, although MSRP'd for 25% less, offering 12% better Perf/W, and 15% Perf/$.  So it did more to "raise the bar" in ways that at that time folk didn't seem as eager to tout, in today’s thinking many are consider that as a Win!



I love reading the comments on that 6970 review, one of our resident AMD fans predicted Cayman was gonna be 35% faster than GF110.


----------



## mxp02 (Jan 17, 2015)

buggalugs said:


> Yep, disappointing. A mid range card should be able to run 4K in 2015. or at least 1440p.



If the 980 is the mid-size die of Maxwell, then compare the specs between the 980 and the 960 and you'll find the relationship mirrors the 680/650 Ti. That means the 960 is actually a 950 Ti by the former naming method; it's a low-end card.
Once upon a time the GTS 250 was mid-range; maybe in the future the only thing an x50 can do is video playback, like the GT 720.


----------



## xorbe (Jan 17, 2015)

> Looking at these numbers, the GTX 960 could be an interesting offering for Full HD (1920 x 1080) gaming, not a pixel more.



Told ya that 128-bit is gonna hurt... it's like when a SandForce SSD runs into random data - the compression doesn't help.  128 is 1/3 of the big guy.


----------



## GhostRyder (Jan 17, 2015)

64K said:


> I doubt that this card is targeted at anything above 1080p.
> 
> It's a bit more tedious to post multiple resolutions but ok here you go
> 
> ...


For some reason, based on what I am seeing, this card sounds like it's going to be in GTX 670 (760 Ti) territory power-wise.  But that is just a guess of course, based on the preliminary results and what is known so far.

I think it's really going to come down to the price in the end, and whether it can justify itself among the other cards in the standings. I am more worried about that than anything, since people seem to be pointing it towards $250+, which to me, given the performance drop, is too close to a GTX 970 (price-wise), not even counting the R9 280X and R9 290, which are around or cheaper than that.  But of course that can change, and it may end up being $200, which would make it a decent value.


----------



## revin (Jan 17, 2015)

RCoon said:


> I'm currently prepping an article
> 
> I took a day off my day-job to do this, and now I'm starting to realize it's going to take a lot longer.



Thanks @RCoon  THAT is extremely considerate of you to do this !


----------



## TheGuruStud (Jan 17, 2015)

Recus said:


> AMD Defense Force can't handle the truth that 128bit card is almost fast as 280X 384bit.



Troll. The 7970 is 3 yrs old lol


----------



## sergionography (Jan 17, 2015)

thebluebumblebee said:


> When Nvidia introduced the Kepler (GTX 680) line, they did something different than they had in the past - they introduced the mid-range GPU first, albeit at the price of the previous high-end GPU.  Brilliant marketing move.  It allowed them to move the price scale up.  No longer were they selling Gx204 GPU's for $260.  Now they're selling GM204's for $550.  When I read through a thread like this, I see that a lot of people don't seem to understand this.  This GTX 960's "grandfather" is the GTS 450.
> (please don't get picky with this list - I know there are some minor factual errors, but I've tried to compress as much as possible)
> Gx200/210 - Nvidia high end: GTX 285 - 480/580 - 780/780 Ti
> Gx204/214 - Nvidia mid range GTX 260 - 460/560 - 680/770 - 970/980
> ...



Now that's a very sound argument, however there are a few things you overlooked: one is the increased cost of wafers/yields, two is die size.

When Nvidia released Kepler at 28nm, a 300-350mm² die was fairly expensive. But regardless, if we ignore point 1 and move to point 2, then the argument becomes about die size and how much computational real estate you are getting for your money. The GTX 680/770 at 300mm² is solidly mid-range in my opinion, similar to a GTX 460; the GTX 980 is a 400mm² die, so not totally mid-range yet far from Nvidia's best; and then you have the GTX 960 at about 200mm², which is low-to-mid. One thing that really upsets me is that this whole competition between AMD and Nvidia is not going in the right direction, because all Nvidia does is release cards with very familiar performance at higher efficiency, and then AMD releases a chip with slightly better performance on a smaller die and therefore a lower price (but clocked too high, sacrificing some efficiency in the process), because all they seem to care about is the superficial title of fastest single-GPU maker by slightly one-upping the competitor's best. Overall it only makes the market barely move forward. What makes it worse is how excited people were about Nvidia and how mighty their engineering is with Maxwell, because they achieved slightly better performance than the GTX 780 Ti with a die that is 400mm² instead of 550mm² and was by default way more power efficient. But for God's sake, this isn't an engineering Nobel prize, nor are we here to hand out best-engineering-masterpiece awards. Just give me a faster card at the same power envelope: if I'm running a GTX 780 Ti and you're targeting me as a customer, then obviously I have a PSU and a case that accommodate the power use and size of the card. But no way in hell am I getting a 980, because I'm not buying electricity from you, Nvidia; I'm buying performance.

So to summarize my rant: efficiency is important for sure, but clearly it's being used to milk money out of customers, because it's not being used to push performance to the limit. Everyone who bought a GTX 980 and feels superior because of how efficient their card is needs to remember that this is pretty much last year's performance, which is supposed to cost less today; and if they think they saved on electricity, in reality they didn't, they just paid the bill to Nvidia instead of the electric company.


----------



## Pumper (Jan 17, 2015)

thebluebumblebee said:


> When Nvidia introduced the Kepler (GTX 680) line, they did something different than they had in the past - they introduced the mid-range GPU first, albeit at the price of the previous high-end GPU. .



This conspiracy theory is still alive and well I see. I like the part where you provide sources from nvidia that 680 was supposed to be 660. Oh, wait, you did not do that. Strange, considering that you present it as fact.


----------



## repman244 (Jan 17, 2015)

Reading this topic makes me think that the Radeon HD 2900 XT is still the fastest GPU out there, since it has a 512-bit memory bus.

Seriously, there are so many other factors that come into play, yet most people are looking at that one number... I thought this was a tech forum.


----------



## Recus (Jan 17, 2015)

GhostRyder said:


> Because the bus is all that matters, but by your logic its almost as fast as a 256bit GTX 770 which is slower than a R9 280X so what was the point of the comment?.



Ok. 128-bit is almost as fast as 256/384-bit. Problem?



Casecutter said:


> Is there some difference between "standard" and "extreme" FireStrike?  I have never heard of it termed as "standard", just good old "FireStrike Extreme".



Looks Extreme to me. 960 - 3438, 280X - 3560.


rtwjunkie said:


> That's just it, you're living in the last decade.  This is new technology, and you really have to forget what you knew about bus-width and performance.  The compression means that a 128 bit bus now acts the same as at LEAST a 192 bit bus.  There's nothing wrong with this where it's aimed.  The performance numbers so far show it between 760 and 770 performance.  If it's also more energy efficient, it's a complete win, and they can replace the 760 in their lineup.



Fully agree. I bet everyone who won't agree that 128bit can deliver performance in 2015 won't agree with this chart. 


TheGuruStud said:


> Troll. The 7970 is 3 yrs old lol



Why so desperate? By your logic you won't be able to buy R9 3x0 because your HD 7950 is 3 years old and you can't compare old vs new?


----------



## rtwjunkie (Jan 17, 2015)

Pumper said:


> This conspiracy theory is still alive and well I see. I like the part where you provide sources from nvidia that 680 was supposed to be 660. Oh, wait, you did not do that. Strange, considering that you present it as fact.


What about his post was conspiracy? Are you the only one who is unaware that the 680 was sold as the top-end chip of the 600 series, but was in actuality their midline Kepler? We didn't get top-flight Kepler until the 780.

Nvidia have done the same, exact thing this time around. I hope you DO know the 980 and 970 are not the top of the line Maxwell chips?


----------



## Blue-Knight (Jan 17, 2015)

rtwjunkie said:


> I hope you DO know the 980 and 970 are not the top of the line Maxwell chips?


They should not be. They are not $1000+, and the GTX Titan X is placed above them: http://www.geforce.com/hardware/desktop-gpus. Those cost a lot more than the GTX 980s.

They will certainly make other cards to put on top of that, unless NVIDIA has changed its mind.


----------



## rtwjunkie (Jan 17, 2015)

Blue-Knight said:


> They should not be. It is not $1000+. And it is placed above the GTX Titan X: http://www.geforce.com/hardware/desktop-gpus. And those cost a lot more than the GTX 980s.
> 
> They will certainly make others to put on top of that, unless NVIDIA has changed their mind.



You're right, they shouldn't do this. It confuses people. But the fact is, they chose to use the mid-grade Maxwell chip under their top-grade numbering: GM204 on the 980.

You won't see the 9 series with the full-bodied GM200 chip. It is happening almost as an exact repeat of the 6 series.


----------



## 64K (Jan 17, 2015)

Blue-Knight said:


> They should not be. It is not $1000+. And it is placed above the GTX Titan X: http://www.geforce.com/hardware/desktop-gpus. And those cost a lot more than the GTX 980s.
> 
> They will certainly make others to put on top of that, unless NVIDIA has changed their mind.



It seems that Nvidia is following the same playbook with Maxwell that they did with Kepler. The GM210 Titan will drop first and be somewhere around $1,000. According to Jen-Hsun Huang they "sold like hotcakes" at that price. Then the GM210 gaming card (not sure what they will call it) which will beat the GTX 980 by a good bit and then the Ti version of that card which will smoke a GTX 980. The Ti version should come in somewhere around $700. All of this is just speculation on my part.


----------



## eroldru (Jan 17, 2015)

128-bit was such a bad move NVIDIA!


----------



## repman244 (Jan 17, 2015)

eroldru said:


> 128-bit was such a bad move NVIDIA!



Why?


----------



## rtwjunkie (Jan 17, 2015)

repman244 said:


> Why?



Don't worry about it.  He's not kept up to speed with the new technology, and doesn't understand that 128-bit now, on Maxwell, is not the 128-bit of old.


----------



## rruff (Jan 17, 2015)

thebluebumblebee said:


> Just noticed that the 960 and 750 Ti share the same bus width.



Even the 750 has the same bus width. But RAM speed goes 5 GHz, 5.4 GHz, and 7 GHz (effective).

*In shaders, TMUs, ROPs, and GB of VRAM, the 960 is 2x a 750*. Only in bandwidth is there a mere 40% increase. *The 960 is exactly 1/2 a 980 in all 5 metrics*.

I don't believe this is the card that Nvidia held back last fall... rather, that would have been a further cut-down GM204-based card. The story was that it would have cut into 970 sales, but there is no way *this* 960 would have done that. So I'm sure we will see another model to fill that space, and possibly even a 1280-shader GM206 card. Any of them *could* have been called a GTX 960, but it's just naming/marketing, and doesn't have any bearing on price/performance regardless. As it stands there is still a huge gap between the 750 Ti and 960 to fill as well. This 960 should be $200 or less and I hope it is, but it really comes down to AMD. Nvidia currently dominates this market well enough that they can toy with their competition. A 960 that performs as well as an R9 285 will be able to command a price premium just because it is new, it's Nvidia, and it uses a lot less power.
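The ratio argument above is easy to verify from the spec sheets. A quick sketch, taking the leaked/contemporary figures (shaders, TMUs, ROPs, VRAM in GB, bandwidth in GB/s) as assumptions:

```python
# Sanity-checking the claimed spec ratios. Tuple order:
# (shaders, TMUs, ROPs, VRAM in GB, peak bandwidth in GB/s).
specs = {
    "GTX 750": (512, 32, 16, 1, 80.0),    # 128-bit @ 5.0 Gbps
    "GTX 960": (1024, 64, 32, 2, 112.0),  # 128-bit @ 7.0 Gbps (leaked)
    "GTX 980": (2048, 128, 64, 4, 224.0), # 256-bit @ 7.0 Gbps
}

def ratios(card_a, card_b):
    """Element-wise ratio of card A's specs to card B's."""
    return [a / b for a, b in zip(specs[card_a], specs[card_b])]

print(ratios("GTX 960", "GTX 750"))  # 2x everywhere except bandwidth (1.4x)
print(ratios("GTX 960", "GTX 980"))  # exactly half across the board
```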


----------



## GhostRyder (Jan 17, 2015)

Pumper said:


> This conspiracy theory is still alive and well I see. I like the part where you provide sources from nvidia that 680 was supposed to be 660. Oh, wait, you did not do that. Strange, considering that you present it as fact.


There is no conspiracy; it's a fact that the mid-range chip was sold as top end, based on the chip designations and how things were done in the past.  The problem/debate is more about whether this is an OK strategy, because depending on how much of a performance difference there is, it causes problems with people being lured by "higher numbers", or forces us to wait years for actual performance changes.  Not everyone is affected by this, but it does slow things down, which is where many of the problems lie.


rtwjunkie said:


> What about his post was comspiracy? Are you the only one who is unaware that the 680 was sold as the top end chip for the 6 series, but was in actuality their midline Kepler. We didnt get topflight kepler until the 780.
> 
> Nvidia have done the same, exact thing this time around. I hope you DO know the 980 and 970 are not the top of the line Maxwell chips?


^Bingo


64K said:


> It seems that Nvidia is following the same playbook with Maxwell that they did with Kepler. The GM210 Titan will drop first and be somewhere around $1,000. According to Jen-Hsun Huang they "sold like hotcakes" at that price. Then the GM210 gaming card (not sure what they will call it) which will beat the GTX 980 by a good bit and then the Ti version of that card which will smoke a GTX 980. The Ti version should come in somewhere around $700. All of this is just speculation on my part.


Yep, but hopefully this round, from what I am hearing, people are more aware of the truth about the Titan branding, so I am hoping people pay more attention and push back so we can end this norm for everyone's sake.  $1000 is something I think everyone would rather not have to invest in a single GPU card at this point.



eroldru said:


> 128-bit was such a bad move NVIDIA!


Nothing wrong with a 128-bit bus as long as everything else is up to speed (no pun intended).  We have jumped back and forth on bus widths constantly, and things like higher VRAM speed make up for a narrower bus, on top of new technologies that help as well.  A 128-bit bus is more than enough for a card like this aimed at the area around the 760 and 770, because those cards target 1080p with 2GB of VRAM, and with the higher RAM speed the memory bandwidth difference is not enough to cause problems.  Even if they had put a 192-bit or 256-bit bus on it, it would just end up being wasted, because most people who look at this card are probably not considering gaming above 1080p; 2GB is generally what is recommended for 1080p and below, not to mention the price of a higher-resolution monitor.

The card's performance is where it should be, honestly, especially if we agree there will probably be a GTX 960 Ti; a bigger bus would bring the cost up, and on cards toward the lower end every penny counts on price.  I don't see the card's specs as a bad thing, and honestly worry more about the price if it really is higher than the original $200 that was on our minds.
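The bandwidth trade-off being described works out like this: peak bandwidth is just bus width times effective memory data rate, so a narrow bus with fast GDDR5 can nearly match a wider bus with slower memory. A small sketch (the 1.3x compression factor is an illustrative assumption, not a published spec):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

gtx960_raw = bandwidth_gbs(128, 7.0)  # 112 GB/s raw on the leaked specs
gtx660_raw = bandwidth_gbs(192, 6.0)  # 144 GB/s raw on a typical 192-bit card

# Maxwell's delta color compression stretches the effective figure further;
# the 1.3x multiplier here is an assumed ballpark, not an NVIDIA number.
gtx960_effective = gtx960_raw * 1.3

print(gtx960_raw, gtx660_raw, gtx960_effective)
```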


----------



## thebluebumblebee (Jan 17, 2015)

Pumper said:


> This conspiracy theory is still alive and well I see. I like the part where you provide sources from nvidia that 680 was supposed to be 660. Oh, wait, you did not do that. Strange, considering that you present it as fact.


Then, apparently, w1zzard's in on it too:


> NVIDIA clearly has a winner on their hands with the GTX 680. The new card, which is based on NVIDIA's *GK104* graphics processor, that introduces the Kepler architecture, is a significant leap forward both in terms of performance and GPU technology. Technically GK104, as its name reveals is an *upper mid-range GPU*, not a pure high-end part. Following NVIDIA's naming convention such a chip would be called GK100.


----------



## Xzibit (Jan 17, 2015)

Recus said:


> Looks Extreme to me. 960 - 3438, 280X - 3560.
> 
> 
> 
> ...



If Firestrike scores are the only measure, heck, that 1552 MHz run should be enough for a 960 to replace 780s.

The AIBs' 750 Ti OC cards also had similar scores to reference 660s but failed to even keep up with the 650 Ti Boost. Out of the 5 W1zzard reviewed, only 1 managed to outperform the 650 Ti Boost, and it needed a base OC of 182 MHz to do it. The 650 Ti Boost was cheaper too at $130, compared to a 750 Ti which ranged from $150 reference to $200.

*EDIT:
Didn't even mention the prices of the other superior performing products that were in that price window at the time.

Nvidia
660 = $190

AMD
265 = $150
7870 = $190
270X = $200

The only 750 Ti OC to beat a 650 Ti Boost*


----------



## HumanSmoke (Jan 18, 2015)

Xzibit said:


> If Firestrike scores are the only measure heck that 1552mhz run should be enough for a 960 to replace 780s.


Only if you live in a bizarro world where OC gains equate to real-world performance increases. Even the most casual tech reader would realize that OC'ing becomes a case of diminishing returns.


Xzibit said:


> The AiBs 750 Ti OC also had similar scores to reference 660s


W1zzard's MSI GTX 750 Ti OC fell 17% shy of the 660 in his review. Not that dissimilar to the Firestrike Extreme scores: with the same 4770K, the MSI 750 Ti OC scores 2053 while the 660 scores 2282 - an 11% deficit, and ballpark considering it is a single benchmark rather than the aggregate of sixteen games.


Xzibit said:


> but failed to even keep up with 650 Ti Boost.


And? W1zzard's latest review pegs the 650 Ti Boost at 9.4% faster than the 750 Ti at 1920x1080, while Firestrike Extreme pretty much mirrors the same differential with the 650 Ti Boost at 2265, which gives it a 10.3% lift (compared to the 2053 score linked above). That's a whole 0.9% difference between the review and a single benchmark.


Xzibit said:


> Out of the 5 W1zzard reviewed only 1 managed to outperform the 650 Ti Boost, which needed a base OC of 182. The 650 Ti Boost was cheeper too at $130 compared to a 750 Ti which ranged from reference $150 - $200.


Wow, that's a shocker! Never would have guessed that a brand new model would sell at a premium over an outgoing card. You might be imparting worthwhile information except:
1. Not news. For example, the R9 285 produced ~9% less performance than the lower numbered 280X, but was only 4% lower in price.
2. The 750 Ti's price realigned (as did the 285's) once the NEWCARDOMG!!! factor had worn off, to the point where you can buy one for $100, while the aforementioned GTX 660 will set you back $130 (or 30% more cost for ~20% more performance over the 750 Ti), or $120+ for the 650 Ti Boost (that's 20% more cost, 10% more performance to save you having to break out the calculator).

So, no, IMO the 960 won't replace the 780, and Firestrike is a pretty decent indicator of performance - a performance that is predicated upon the clocks, cooling, and system being run.
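The percentage gaps quoted above are straightforward to recompute from the cited Firestrike Extreme scores (a quick sketch; the convention here is "how much faster card B is than card A"):

```python
# Recomputing the quoted gaps from the Firestrike Extreme scores
# (same 4770K test bed in both runs).
def lift_pct(score_a, score_b):
    """Percent by which score_b exceeds score_a."""
    return (score_b / score_a - 1) * 100

gap_660 = lift_pct(2053, 2282)   # GTX 660 over MSI 750 Ti OC, ~11%
gap_650b = lift_pct(2053, 2265)  # 650 Ti Boost over 750 Ti OC, ~10.3%

print(f"{gap_660:.1f}% / {gap_650b:.1f}%")
```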


----------



## rruff (Jan 18, 2015)

Xzibit said:


> The 650 Ti Boost was cheeper too at $130 compared to a 750 Ti which ranged from reference $150 - $200.



I agree that the 750 Ti is very poor value at the inflated MSRPs, but who buys cards at those prices? You can buy a nice model pretty much any day of the week for $120, and an EVGA SC was $85 on BF. I haven't seen any 650 Ti Boosts for sale lately, at least not for a good price. There have been 660s for $100 or less several times recently. And you can get 650 Ti right now for $60. 750s can normally be had for $80 and I got one for $45 on BF. Those are ~70% faster than a GTX 650, and 5-10% faster than a 650 Ti. The 750 Ti only adds 10-15% in performance over the 750 and isn't worth the >30% typical price premium. 

It's all supply and demand and marketing. I expect the 960 price/performance to be unimpressive initially, since it will sell anyway, and once things get settled down, and especially if AMD does something this year, you'll be able to pick up 960s for <$150 in the fall, with at least one dip to the $120 range.


----------



## john_ (Jan 18, 2015)

Fortunately it seems that NVidia will not shoot itself in the foot with the price. Or at least it's not going to shoot both feet.
This latest rumor does look confirmed, as the title says. 

Nvidia Geforce GTX 960 Final Pricing Update: MSRP More or Less Confirmed at $200 Retail


----------



## HumanSmoke (Jan 18, 2015)

john_ said:


> Fortunately it seems that NVidia will not shoot itself in the foot with the price. Or at least it's not going to shoot both feet.
> This latest rumor does look like confirmed as the title says.
> 
> Nvidia Geforce GTX 960 Final Pricing Update: MSRP More or Less Confirmed at $200 Retail


Standard operating procedure for WCCF. If they publish enough prices they'll get it right eventually - and as per usual lead with the clickbait hysteria-inducing numbers first. I was kind of hoping that the price might stay secret up until launch as pricing realignment done in a panic always benefits the consumer if the incoming model is available in quantity from day one.


----------



## john_ (Jan 18, 2015)

HumanSmoke said:


> Standard operating procedure for WCCF. If they publish enough prices they'll get it right eventually - and as per usual lead with the clickbait hysteria-inducing numbers first. I was kind of hoping that the price might stay secret up until launch as pricing realignment done in a panic always benefits the consumer if the incoming model is available in quantity from day one.


I can't argue with that comment about Wccftech. They are like a search machine that just reposts whatever someone put on the internet as an important rumor. But in both articles they had a little more concrete evidence than the usual "someone who is unknown even to his own mother just posted this info from nowhere". Today's article does look to offer valid info.


----------



## Pumper (Jan 18, 2015)

GhostRyder said:


> There is no conspiracy, its a fact that the mid range was sold as top end based on the chip designations and how things were done in the past.



Exactly, the key word here is "past". When exactly did nvidia state that they will never change the naming of their GPU chips? Never, is my guess.


----------



## HumanSmoke (Jan 18, 2015)

john_ said:


> I can't argue with that comment about Wccftech. They are like a searching machine that just reposts whatever someone posted on the internet as important rumor. But in both articles they had a little more concrete evidence than the usual "someone who is unknown even to his own mother, just posted from nowhere this info". Today's article does look to offer valid info.


I have little doubt that the $200 number is closer to the mark. But whereas other sites had access to the very same pre-launch price (gouging), WCCF tend to attach a certain certainty to their stories. Another example would be the die estimate of GM200 based off a low-res, slightly oblique snap. The guy leads off with


> * Before we begin, in all fairness, I should point out some things that could make this experiment inaccurate ; *lens distortion, inaccurate perspective correction and warping due to rolling shutter just to name a few.


But still claims an accuracy of ±2.5%! That's an accuracy of around half a millimetre per side. Oddly enough, if the claims from generally more reliable sources pan out (570-580 mm²), he can actually claim he was out by just 0.51 mm per side.


----------



## bpgt64 (Jan 18, 2015)

It's an interesting offering; depending on where it lands price-wise, it will make for some interesting competition.  I expect it to land at about $175-225, stock to aftermarket.  I don't think anyone should worry about the bus width.  If you're looking at the GTX 960 for 4K, you're looking in the wrong place.  I think a pair of 970s is the sweet spot for that, or a single 980 (HOF or otherwise).  The question is whether a pair of 960s will hold up well at 1440p, 60 fps+ -> 120 fps...


----------



## rruff (Jan 18, 2015)

bpgt64 said:


> I don't think anyone should worry about the Bus-Speed.



I'm a little concerned because of something I posted earlier: *in shaders, TMUs, ROPs, and GB of VRAM, the 960 is 2x a 750*. Only in bandwidth is there a mere 40% increase. *The 960 is exactly 1/2 a 980 in all 5 metrics*.

Based on TPU's performance summary charts, 1/2x a 980 would be right at 760 level performance. 2x a 750 would be right at 770 level... but the bandwidth is only 40% higher. It's possible that the 750 has more bandwidth than it needs, but I know that overclocking the ram results in a significant improvement in fps. I don't understand how video cards work that well, but unless Nvidia has done additional optimizations on the GM206, seems like it would be close to a 760... which would be lame for $200. I also believe it will consume a lot less than 120W with base clocks... more like 90W, based on the 980 and 750.

I think the RAM quantity of 2GB will make it a poor SLI choice, but if you can get a 4GB model, a pair should land between a 970 and 980.


----------



## bpgt64 (Jan 18, 2015)

IMO they did some magical shit with the 9 series; unlike with the R9 290X or 295X2, my room doesn't become a sauna when gaming at 4K.  Let's see some benchmarks first, then complain about resource allocation or applicability to higher resolutions.  My guess is this card is going to be a beast with a good cooler.


----------



## Sony Xperia S (Jan 18, 2015)

bpgt64 said:


> unlike the r9 290x or 295x2 my room doesn't become a sauna when gaming at 4k.



Your room is a sauna regardless of whether you introduce several more hundreds of watts to it. 

You cannot warm a room up with 300 W or even a 600 W heating.


----------



## GhostRyder (Jan 18, 2015)

Pumper said:


> Exactly, the key word here is "past". When exactly did nvidia state that they will never change the naming of their GPU chips? Never, is my guess.


The fact is you're not getting it, and most people here understand how it works... They didn't just decide to put a name tag on a chip and call it a day; there is a distinct *reason* for that name in how NVidia does its GPU hierarchy.  Believe what you want, but the facts don't change... Part of it could be that they thought the GTX 680 and 670 had "enough" power for the time, though.



bpgt64 said:


> Imo they did some magical shit with the 9 series,  unlike the r9 290x or 295x2 my room doesn't become a sauna when gaming at 4k.  Let's see some bench marks first then complain about resource allocation or applicability yo higher resolution.  My guessay is this card is going to be a beast with a good cooler.


My room is not a sauna lol, and I have 3 of them.



bpgt64 said:


> It's an interesting offering, depending on where it lands price point wise will be make for some interesting competition.  I expect it to land at about 175-225 stock-> aftermarket.  I don't think anyone should worry about the Bus-Speed.   If your look at GTX 960 for 4k, your looking in the wrong place.  I think a pair of 970s is the sweet spot for that, or a single 980(HOF or otherwise).  The question is will a pair of 960s hold up well at 1440p  60 fps+  -> 120fps....


I do not think this card is going to hold up well at 1440p, especially if it does fall between a GTX 760 and GTX 770 in performance.  A pair might do a decent job, but so would a single 970, which would probably end up being a better buy for most to avoid dealing with SLI, though I am just speculating based on where the performance looks like it will fall.



john_ said:


> Fortunately it seems that NVidia will not shoot itself in the foot with the price. Or at least it's not going to shoot both feet.
> This latest rumor does look like confirmed as the title says.
> Nvidia Geforce GTX 960 Final Pricing Update: MSRP More or Less Confirmed at $200 Retail


That is more like what it should be, and it seems to be where it was headed. I didn't think it would be $250+, because that would make fitting the 960 into the market a little hard given the performance drop, not to mention make it hard, without price changes down the line, to fit in a 960 Ti (if they make one) without early adopters feeling played.  Though buying right out of the gate is normally something to avoid anyway...



Xzibit said:


> If Firestrike scores are the only measure, heck, that 1552 MHz run should be enough for a 960 to replace 780s.
> The AIB 750 Ti OCs also had similar scores to reference 660s but failed to even keep up with the 650 Ti Boost. Out of the 5 W1zzard reviewed, only 1 managed to outperform the 650 Ti Boost, and it needed a base OC of 182. The 650 Ti Boost was cheaper too, at $130, compared to a 750 Ti which ranged from $150 (reference) to $200.
> *EDIT:
> Didn't even mention the price of the other superior performing products that were in that price window at the time.
> ...


Prices out of the gate tend to be hit or miss; there is often a better offering for cheaper, or one at the same price that performs better, sometimes even from the same company.  Most of the time they are just going on the "higher number = better" routine: most consumers do not pay much attention to the details and just buy something based on the number attached to it, thinking it must be better.  That is why prices can be set that way - the cards will sell as long as there are people only looking at names. Guess, as they say, you should not judge a book by its cover.


----------



## HumanSmoke (Jan 18, 2015)

Sony Xperia S said:


> You cannot warm a room up with 300 W, or even 600 W, of heating.


"Random troll disproves the Laws of Thermodynamics" - said no one ever.

Meanwhile, site owner personal experience with 290X Crossfire:


> These 290X cards are HOT! If your computer is in a small room that sees ambient temperatures above 80F, you will not want a pair of these cards. I think you could live with one 290X but the heat that comes off these cards is insane. Luckily when I started testing these the temperatures were still warm here in Texas, so getting my office up to an ambient temperature of 78F was easy to do. A pair of these 290X in CrossFire can easily warm the room you are in up a few degrees. Under full load in Uber Mode, the exhaust temperature of these cards is over 150F. Yes, you can burn yourself on the exhaust ports of the cards should you be so inclined.
> A 300 watt delta between 290X CrossFire and 980 SLI is a huge number. It is easily recognizable when sitting next to the system. After a few hours of gaming with 290X CrossFire, you certainly had that sweaty gamer feeling about you.


----------



## Sony Xperia S (Jan 18, 2015)

HumanSmoke said:


> "Random troll disproves the Laws of Thermodynamics" - said no one ever.



Ooo, we begin with the stupid insults! 

If you are so inclined, I will invite you to my home; we will take a random room, cool it to an ambient temperature of 17-18 degrees Celsius, and I will give you permission to use my rig with two R9 290Xs.

If you succeed in warming the room, I will admit I was wrong.

Until then, I will laugh at you and not even take into consideration those stupid comparisons made in hot climates.

In hot climate, even smoking a cigarette feels unpleasant.


----------



## bpgt64 (Jan 18, 2015)

I speak of the heat difference because my wife uses an R9 295X2 and I use a pair of 980s, both driving 4K panels.  I can tell when she is playing a video game.  We leave a window open in winter to counterbalance the heat coming off both our rigs.  Like HumanSmoke said, the TDP difference is very big, and very noticeable.  It's not meant to be an insult; it's just a fact.


----------



## Fluffmeister (Jan 18, 2015)

Sony Xperia S said:


> Ooo, we begin with the stupid insults!
> 
> If you are so inclined, I will invite you to my home; we will take a random room, cool it to an ambient temperature of 17-18 degrees Celsius, and I will give you permission to use my rig with two R9 290Xs.
> 
> ...



I'm up for popping over, I'll bring a towel to wear because I know you're in denial.


----------



## HumanSmoke (Jan 18, 2015)

Sony Xperia S said:


> If you are so inclined, I will invite you to my home; we will take a random room, cool it to an ambient temperature of 17-18 degrees Celsius, and I will give you permission to use my rig with two R9 290Xs.


So, you need to pre-cool the room before using the 290X. This was never mentioned in the card's specifications fine print: "DANGER: USE OF THIS CARD IN AMBIENT TEMPERATURES EXCEEDING 18°C CAN LEAD TO EXCESS SWEATING AND DEHYDRATION". Why would you need to do this? Won't your mother be pissed off with you fooling with the thermostat?

In a closed system the addition of heat energy will elevate the temperature of the space it occupies - this is (very) basic physics.


Fluffmeister said:


> I'm up for popping over, I'll bring a towel to wear because I know you're in denial.


You're assuming that the offer doesn't include a personal air conditioning fan?


----------



## Xzibit (Jan 18, 2015)

bpgt64 said:


> IMO they did some magical shit with the 9 series; unlike the R9 290X or 295X2, my room doesn't become a sauna when gaming at 4K.  Let's see some benchmarks first, then complain about resource allocation or applicability to higher resolutions.  My guess is this card is going to be a beast with a good cooler.



Here is how the magic happens



			
				Toms Hardware said:
			
		

> *Gaming Power Consumption*
> These findings further illustrate what we said on the previous page about Maxwell and its ability to regulate GPU voltage faster and more precisely. The gaming-based power consumption numbers show just how much efficiency can be increased if the graphics card matches how much power is drawn to the actual load needed. The version of the GeForce GTX 980 that comes overclocked straight from the factory manages to use significantly less power than the reference version, while offering six percent more performance at the same time.
> 
> *Stress Test Power Consumption*
> If the load is held constant, then the lower power consumption measurements vanish immediately. There’s nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980’s power consumption to an overclocked GeForce GTX Titan Black, there really aren’t any differences between them. This is further evidence supporting our assertion that the new graphics card’s increased efficiency is largely attributable to better load adjustment and matching.



You get better efficiency with non-reference cards during gaming, but it's still higher than the advertised TDP.  Stress testing tells a different story, with non-reference 970 & 980 cards sucking up 242 W & 280 W.


----------



## mxp02 (Jan 19, 2015)

Fluffmeister said:


> I'm up for popping over, I'll bring a towel to wear because I know you're in denial.


You're right! 2x 290X will definitely make a room a sauna. I've got 970s in tri-SLI, which consume an equal amount of power to 2x 290X, and my room is hot as hell. Under these circumstances wearing any clothes would be suicidal, so I just have a towel around my neck. Those 290X CF owners are terrible liars.


----------



## overpass (Jan 19, 2015)

Hmm, if it is around $180, I'd say it is worth it!  Pretty sure the stock of 760 and 770 cards is running rather low? It will be closer to the 770 than the 760 due to the optimizations that nVidia will bring to Maxwell.


----------



## Sony Xperia S (Jan 19, 2015)

HumanSmoke said:


> So, you need to pre-cool the room before using the 290X



Ahahaha  This is because when you turn the heating off, rooms quickly drop to that ambient temperature.



HumanSmoke said:


> In a closed system the addition of heat energy will elevate the temperature of the space it occupies - this is (very) basic physics.



Just saying that you won't succeed with such a small addition of energy; you will need something much more serious to make a change.

Two 290Xs won't be enough. 



HumanSmoke said:


> your mother



Ooo, and please, do not mention my mom!!! Because, as far as I remember, I haven't mentioned yours!


----------



## HumanSmoke (Jan 19, 2015)

Sony Xperia S said:


> Just saying that you won't succeed with so small energy addition, you will need something much more serious for a change.


Meanwhile, in the real world: "If a 40-watt fan runs in a small (12' x 12' x 8') room for one hour, it generates enough heat to raise the temperature of the air by about 9 degrees Fahrenheit" - U.S. Department of Energy. You could actually ask a scientist to do the calculation for you, but as you're just trolling I guess that won't happen. It must be about time you switched back to trolling AMD, or the site in general, if your usual M.O. holds.


----------



## Tatty_One (Jan 19, 2015)

This is becoming interesting, if a little tiresome. If the two of you cannot disagree without the petty jibes, childish tit-for-tat and constant bickering, then I will shut you both down for a spell..... thank you.


----------



## rruff (Jan 19, 2015)

HumanSmoke said:


> You could actually ask a scientist to do the calculation for you



The problem is that both of you are correct.

It will vary hugely depending on how thermally isolated the room is, its thermal capacitance, its size, how long you are heating it, the starting temperature, and whether you will perceive a temperature rise as an issue (if it's winter and you keep your house at 65, then no problem... if it's summer and it's already 90, then yes). If the room were perfectly insulated, even 1 W would eventually send the temperature to infinity. But most rooms are not well insulated from each other, doors are not sealed (or are even open), and air is circulated via heating and air conditioning systems.

My wife has a little electric resistance heater. It's 1500 W, and it can heat a small sealed room by more than 10 °F in an hour; 500 W for 3 hrs would be similar. My dad's 3000 sq ft, 50-year-old house uses electric resistance heat. In winter his average consumption is 5000 W - that's to heat the whole house 40+ °F over ambient, steady state. If he had 10 x 500 W systems cranking 24/7, he could get rid of the furnace.

So yes, the heat given off by a computer can be significant. It is also usually not that hard to make it a non-issue.

Getting back to the 960, I don't think it will be near 120 W unless it is overclocked a lot. It's half of a 980 and a little less than double a 750. Should be <100 W.


----------



## Casecutter (Jan 19, 2015)

HumanSmoke said:


> "If a 40-watt fan runs in a small (12' x 12' x 8') room for one hour, it generates enough heat to raise the temperature of the air by about 9 degrees Fahrenheit" - U.S. Department of Energy.


I went searching to see what the actual parameters of that DOE test were; do you have a link?

I'd want to understand the construction and the outside ambient they factored in.  While a fan in the right conditions could create an increase, 9 °F in an hour (about 13%) is hard to fathom, especially with nothing more than the air movement of a table fan; those are about 10-25 W. Most modern ceiling fans use less than an amp, averaging between 0.5 and 1 amp, with 10-50 W being customary. And beyond the air circulation within a completely sealed room, are there other aspects to account for, like a human body... since why leave the fan on in the first place?

I'm not sure how they arrived at a 13% increase over a normal [72 °F] room in just one hour unless the room is affected by additional conduction from the outside ambient, or body heat is adding additional thermal load.  By sheer calculation you might say the volume of air within a sealed "room" with a 40 W load (a light bulb; 40 W is 40 joules/sec) would produce a rise, but you'd be lucky to see 4-5 °F in an hour.  And considering there's air movement from the fan, that influences the energy/heat that would be dissipated/absorbed by the surfaces of the walls, as affected by the outside ambient.


----------



## rruff (Jan 19, 2015)

Casecutter said:


> While a fan in the right conditions could create an increase, 9°F in an hour or 13% hard to fathom?



Don't worry about it. That isn't going to happen in any practical situation. It was probably a completely sealed and insulated room, so the figure only accounts for the energy needed to raise the temperature of the air. Air doesn't have much thermal capacitance, so it doesn't take much to heat it up. It's the exchange with other objects, the outdoors, and the rest of the house that matters. 

OK, might as well just calculate it for a perfectly sealed and insulated room. Specific heat of air: ~1.00 kJ/(kg·K). Density: ~1.2 kg/m^3. The 12x12x8 room will have 1152 ft^3, or about 33 m^3. Total mass of air is 33 x 1.2 = 40 kg. 40 W for 1 hr is 40 x 60 x 60 = 144 kJ. Temperature rise = 144 / (40 x 1.00) = 3.6 K, or 6.5 °F. 

I don't get a 9 F rise even with bogus idealized assumptions.
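That idealized sealed-room figure is easy to sanity-check in a few lines of Python (same bogus assumptions as above: a perfectly insulated room, all the energy going into the air, nothing else absorbing heat; the unit-conversion constant is the only thing added here):

```python
# Idealized sealed-room estimate: 40 W dumped into the air of a
# 12' x 12' x 8' room for one hour, no losses to walls or contents.
FT3_TO_M3 = 0.0283168                       # cubic feet -> cubic metres

room_volume_m3 = 12 * 12 * 8 * FT3_TO_M3    # ~32.6 m^3
air_density = 1.2                           # kg/m^3
cp_air = 1.0                                # kJ/(kg*K), specific heat of air

air_mass = room_volume_m3 * air_density     # ~39 kg of air
energy_kj = 40 * 3600 / 1000                # 40 W for 1 hour = 144 kJ

delta_k = energy_kj / (air_mass * cp_air)   # temperature rise in kelvin
delta_f = delta_k * 9 / 5                   # kelvin -> degrees Fahrenheit

print(f"{delta_k:.1f} K rise (~{delta_f:.1f} °F)")
```

Same answer: roughly 6.5 °F, still short of the DOE's 9 °F figure.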


----------



## HumanSmoke (Jan 19, 2015)

rruff said:


> The problem is that both of you are correct.
> It will vary hugely depending on how thermally isolated the room is, its thermal capacitance, its size, how long you are heating it, the starting temperature, and *whether you will perceive a temperature rise as an issue* (if it's winter and you keep your house at 65, then no problem... if it's summer and it's already 90, then yes).


Yes, and there is also another variable to take into consideration - proximity of the user to the heat source. People as a general rule don't sit within a couple of feet of a heater, but they may well do that with their computer system. And as you've alluded to, people are unlikely to fire up a heater on a warm day _and_ have it parked next to them.
Having AC or airflow in the room might alleviate or entirely negate the effects, but I'm pretty certain that having the system parked under the desk amplifies them. Airflow and AC also depend on actually being available - which might not be a given for a lot of users.
The user experience is of course particular to the person, but I'm pretty sure that anecdotal evidence points to a significant percentage of users being affected. My 780 SLI system draws close to what a 290X Crossfire setup does (~700-750 W), and I'm pretty sure I notice the difference between idle and 3D load. BTW, the local ambient temp here is 18°C at 9:10 a.m. with winds of <2 km/h, which is about average for late night/overnight/early morning during summer and autumn.


rruff said:


> Getting back to the 960, I don't think it will be near 120W unless it is over clocked a lot. It's half of a 980 and a little less than double a 750. Should be <100W.


That might be a little conservative. The GTX 970/980's TDP seems to be based on the base clock, which in actuality is rarely, if ever, the limit of its reference core frequency. Two other factors to take into consideration are that the 960 is clocked at least 100 MHz higher (core) and uses 4 GB of 7 Gbps GDDR5 (@ 1.55 V) rather than the 2 GB of 5 Gbps (@ 1.35 V) found in the GTX 750/750 Ti.
I have no doubt that Nvidia will quote the base clock TDP for the card for marketing purposes, but I wouldn't be overly surprised to see the card use 20W or so more in real-world usage for 3D loading.
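As a back-of-envelope sketch of that bracket (the reference TDP figures here - 165 W for the GTX 980, 60 W for the GTX 750 Ti - are assumed from the usual spec sheets, not taken from this thread):

```python
# Rough power bracket for the GTX 960 from the scaling arguments in
# this thread: "half of a 980" on one side, "double a 750 Ti" plus
# some real-world headroom on the other. TDPs are assumed spec-sheet
# values, not measurements.
gtx980_tdp = 165.0    # W, reference GTX 980 (assumed)
gtx750ti_tdp = 60.0   # W, reference GTX 750 Ti (assumed)

low = gtx980_tdp / 2           # "half of a 980" -> ~82 W
high = gtx750ti_tdp * 2 + 20   # "double a 750 Ti" + ~20 W real-world headroom

print(f"plausible 3D board power: {low:.0f}-{high:.0f} W")
```

Which is why a quoted sub-100 W TDP and a noticeably higher real-world 3D draw wouldn't necessarily be in conflict.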


----------

