# NVIDIA GeForce GTX 960 Specs Confirmed



## btarunr (Jan 15, 2015)

Here's what NVIDIA's upcoming performance-segment GPU, the GeForce GTX 960, could look like under the hood. Key slides from its press-deck were leaked to the web, revealing its specs. To begin with, the card is based on NVIDIA's 28 nm GM206 silicon. It packs 1,024 CUDA cores based on the "Maxwell" architecture, 64 TMUs, and possibly 32 ROPs, despite its 128-bit wide GDDR5 memory interface, which holds 2 GB of memory. The bus may seem narrow, but NVIDIA is using lossless texture compression that effectively improves bandwidth utilization.

The core is clocked at 1127 MHz, with a 1178 MHz GPU Boost clock, and the memory at 7.00 GHz (112 GB/s real bandwidth). Counting its texture compression mojo, NVIDIA is quoting an "effective bandwidth" figure of 9.3 GHz. The card draws power from a single 6-pin PCIe power connector; the chip's TDP is rated at just 120 W. Display outputs will include two dual-link DVI, and one each of HDMI 2.0 and DisplayPort 1.2. In its slides, NVIDIA claims that the card will be an "overclocker's dream" in its segment, and will offer close to double the performance over the GTX 660. NVIDIA will launch the GTX 960 on the 22nd of January, 2015.


----------



## 64K (Jan 15, 2015)

Looks like a very nice card for 1080p gaming. I hope the rumored $200 price point is true. It will be very successful if so.


----------



## rtwjunkie (Jan 15, 2015)

So, double the 660 performance means Nvidia is saying hands-down this beats a 770, correct?


----------



## Eternalchaos (Jan 15, 2015)

rtwjunkie said:


> So, double the 660 performance means Nvidia is saying hands-down this beats a 770, correct?


I think it will probably match the GTX770 @ 1080p


----------



## Tsukiyomi91 (Jan 15, 2015)

This new card is very promising for both NVIDIA & new gamers who are looking for a card that's affordable & runs most titles at 1080p/60fps. The $200 price point is VERY compelling; hopefully they're right about the pricing.


----------



## dj-electric (Jan 15, 2015)

rtwjunkie said:


> So, double the 660 performance



Nobody said it's double the performance, just double the efficiency.

Judging by the graphs here, it looks like about 60% more performance.
According to TPU, that would put the GTX 960 at about 64% here:
http://tpucdn.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/images/perfrel_1920.gif
My guess? It will land at 60-61% relative to this graph, at about GTX 770 performance.


----------



## ZoneDymo (Jan 15, 2015)

getting so sick of these terrible terrible presentation chart images


----------



## jabbadap (Jan 15, 2015)

> The core is clocked at 1127 MHz, with 1178 MHz GPU Boost, and the memory at 7.00 GHz (112 GB/s real bandwidth). Counting its texture compression mojo, NVIDIA is beginning to mention an "effective bandwidth" figure of 9.3 GHz.



So in other words it has 9.3*128/8 = 148.8 GB/s of "effective bandwidth", comparable to the GTX 660's 144 GB/s but falling way short of the GTX 770's 224 GB/s, or even the GTX 760's 192 GB/s. How this affects AA and higher-resolution textures will be interesting to see.

Any word about video decoding options? Tegra X1 has full H.265/VP9 decoding. Really hope this has it too.
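The bandwidth arithmetic above is easy to sanity-check: peak GDDR5 bandwidth is the effective memory clock (in GT/s) times the bus width in bytes. A minimal sketch, with the clock/bus pairings inferred from the GB/s figures quoted in this thread (not from NVIDIA's slides):

```python
# Peak GDDR5 bandwidth = effective memory clock (GT/s) x bus width (bytes).
def bandwidth_gbs(clock_gtps: float, bus_bits: int) -> float:
    return clock_gtps * bus_bits / 8

cards = {
    "GTX 960 (real)":      (7.0, 128),  # 112 GB/s, as quoted in the article
    "GTX 960 (effective)": (9.3, 128),  # 148.8 GB/s with delta compression
    "GTX 660":             (6.0, 192),  # 144 GB/s
    "GTX 760":             (6.0, 256),  # 192 GB/s
    "GTX 770":             (7.0, 256),  # 224 GB/s
}
for name, (clock, bits) in cards.items():
    print(f"{name}: {bandwidth_gbs(clock, bits):.1f} GB/s")
```

The numbers reproduce every figure cited in the posts above, which is why the "effective" 148.8 GB/s only brings the 960 back to roughly GTX 660 territory on paper.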


----------



## RCoon (Jan 15, 2015)

It's a 770 with lower power requirements, Maxwell features (à la the texture compression method), and quite possibly a lower cost.
Seems reasonable for those looking for a cheap entry into 1080p gaming without spending money on fancy PSUs.

The point of this card is to make its competition in the same price range irrelevant, in both performance and power consumption.

As always, the performance graphs from the manufacturer are the most obscure and pointless arrangement possible, so wait for reviews.


----------



## gigantor21 (Jan 15, 2015)

I really want this, but 2GB of RAM on a 128-bit bus...I'll have to wait for reviews, or possibly a 960Ti. :/


----------



## rtwjunkie (Jan 15, 2015)

Dj-ElectriC said:


> Nobody said its double the performance, just double the efficiency.
> 
> Judging by the graphs here, looks like about 60% more performance.
> According to TPU, that will put the GTX 960 on 64% here
> ...


 
Actually, in the second-to-last sentence it says "close to double the *performance* over the 660."


----------



## Fluffmeister (Jan 15, 2015)

As with other Maxwell-based cards, these should indeed overclock like a trooper. A nice refresh for their existing lineup, and it should give aging Tahiti-based cards a great run for their money whilst using half the power.


----------



## Octavean (Jan 15, 2015)

jabbadap said:


> Any word about video decoding options? Tegra X1 has full H.265/VP9 decoding. Really hope this has it too.




I haven't heard anything credible yet on that front.  

I've been trying to decide between the GTX 970 and the upcoming GTX 960. The ~$200 price point of the GTX 960 is attractive, but so are the prowess and current availability of the GTX 970.

The introduction of hardware HEVC/H.265 and VP9 decode support on the GTX 960 would be a very nice addition, though. It could make a hard decision very easy.


----------



## W1zzard (Jan 15, 2015)

rtwjunkie said:


> Actually, the 2nd to last sentance it says "close to double the performance over the 660."


I'd guess Legal would define "close to double" as 50.1% faster or more, which is what the 3rd slide supports.

For the record I have not done any testing on these cards, so I have no idea about actual numbers.


----------



## DarkOCean (Jan 15, 2015)

only 2gb of vram for a 2015 card... LOL


----------



## rtwjunkie (Jan 15, 2015)

DarkOCean said:


> only 2gb of vram for a 2015 card... LOL


 
That should be plenty in the 1080p segment this card is meant for.


----------



## Blue-Knight (Jan 15, 2015)

someone said:

> Key slides from its press-deck were leaked to the web, revealing its specs.


People are so impatient. Knowing its specifications is just a matter of time, but people cannot wait for some reason. 



			
someone said:

> for a card that's affordable & runs most titles at 1080p/60fps.


At maximum settings, as defined by most people on this forum: 16x/32x AA, 16x AF, ambient occlusion, vsync, everything in the NVIDIA control panel at maximum, and every in-game slider at maximum, obviously.

No card configuration can do that with current games.



			
someone said:

> NVIDIA claims that the card will be an "overclocker's dream" in its segment, and will offer close to double the performance over the GTX 660.


Performance in what? Overclocking or gaming performance?

I guess it is overclocking.

Because if it is gaming performance, an increase in performance compared to the ancient GTX 660 is more than expected; otherwise, what is the point in buying it? It should be faster than the GTX 760 first, to make any sense. If it matches or outperforms the GTX 770, and I hope it outperforms it (in which case the difference will not be large), it will be good.

But there are other improvements to care more about; people seem to care only about performance and forget other things.


----------



## john_ (Jan 15, 2015)

The price will decide if this card holds any value.

At $150, it is a true 2015 card. Many reasons to buy it. The only mistake is the name: it should have been called the 950 Ti.

$200 is just too high for a 128-bit card in 2015, even with the better memory efficiency.

More than $200, let's say $250, and it is a "no go".

As I said, this is a 950 Ti in my opinion, NOT a 960, based on the specs and the fact that this is not 2012. OK, Nvidia calls it a 960 to justify a higher price, possibly that $200, but in my opinion those specs scream 950 Ti.
And although it might look like a good option today for 1080p, a couple more AAA games in 2015 that ask for higher bandwidth and more memory will send many people who buy this card today searching for a better card six months from now. I guess that 128-bit also kills any ideas about SLI, IF this card supports SLI.


----------



## RCoon (Jan 15, 2015)

DarkOCean said:


> only 2gb of vram for a 2015 card... LOL



You'd be surprised. Here are my total figures for VRAM usage on *1440p*. I like to think I review a spread of varied games of all styles. Some of those include Early Access titles which are optimised horribly.


----------



## LiveOrDie (Jan 15, 2015)

WTF, they compare it to a GTX 660? Why not the GTX 760?


----------



## Tsukiyomi91 (Jan 15, 2015)

If you guys have issues with a card that has 2GB of VRAM & a 128-bit bus, then I think you're looking at the wrong card. NVIDIA has positioned this 960 for those who use 1080p monitors & don't play at the highest possible settings in games like FC4, Shadow of Mordor & others, wanting a balance of eye candy & decent frame rates. Also, not everyone in the world can afford a GTX 970, so the GTX 960 is here for that reason. This card IMO will be considered the new budget VGA card in the market, superseding the GTX 750 & its Ti variant. My GTX 760 may not be the fastest Kepler-based card, but it's enough to run all the demanding games at High @ 1080p without giving me headaches or eye strain. Keep your negative thoughts to yourself.


----------



## RCoon (Jan 15, 2015)

Tsukiyomi91 said:


> This card IMO will be considered as the new budget VGA card in the market



It's also liable to become the top card in the Chinese market for Colorful and similar manufacturers from that ilk.


----------



## Cheeseball (Jan 15, 2015)

You guys think this would be a good replacement for my aging HD 7870 XT (Tahiti LE, 1536 cores) on my secondary PC?


----------



## RCoon (Jan 15, 2015)

Cheeseball said:


> You guys think this would be a good replacement for my aging HD 7870 XT (Tahiti LE, 1536 cores) on my secondary PC?



Wait for W1zzard's review. Nobody knows how it will perform yet.


----------



## GhostRyder (Jan 15, 2015)

The card is designed for 1080p and the midrange, so I see the 2GB as enough to get by and not really a downside; I mean, 4GB would be overkill, and I doubted they would do 3GB with this card anyway. It will be nice for people at a $200 price point, since the consensus is that is where it will be (which honestly would make the most sense, especially if we factor in the possibility of a 960 Ti variant).

Honestly, it's going to be a great card for those on a budget. I like the single 6-pin connector, as it opens up even the smaller PSUs and gives a very wide range of PSU options.



Cheeseball said:


> You guys think this would be a good replacement for my aging HD 7870 XT (Tahiti LE, 1536 cores) on my secondary PC?


You'd probably get an upgrade, of course, but that would depend on how much you're hoping for, because I don't think it's going to be night and day - more of a decent card-to-card improvement. But this is just early speculation, of course; it's guaranteed to be better, but just how much has yet to be seen.


----------



## Cheeseball (Jan 15, 2015)

If the GTX 960 is 10% better performance-wise and only drains half the power of my HD 7870 XT, it's an instant buy for me.


----------



## Tsukiyomi91 (Jan 15, 2015)

RCoon said:


> It's also liable to become the top card in the Chinese market for Colorful and similar manufacturers from that ilk.


It sure is. Bet vendors like them will make custom cards based on it with little price impact.


----------



## Tsukiyomi91 (Jan 15, 2015)

@Cheeseball seems that most of us will just have to wait & see how good this card will be once it's out.


----------



## hat (Jan 15, 2015)

I wonder how it compares to the 660 ti, in terms of raw performance?

This would be a good card to swap the 660 ti out with in this machine. I could then replace the 5870 in another machine with my 660 ti.


----------



## john_ (Jan 15, 2015)

Cheeseball said:


> You guys think this would be a good replacement for my aging HD 7870 XT (Tahiti LE, 1536 cores) on my secondary PC?


Tahiti LE?
Nope, it will not. It will be better in some areas (lower power consumption and noise), but for $200 it will not be an upgrade that really makes a difference - except if your favorite games use PhysX effects. In that case it could look like a serious upgrade to you.


----------



## HisDivineOrder (Jan 15, 2015)

RCoon said:


> You'd be surprised. Here are my total figures for VRAM usage on *1440p*. I like to think I review a spread of varied games of all styles. Some of those include Early Access titles which are optimised horribly.




Now include the chart with Watch_Dogs, Assassin's Creed Unity, Dragon Age Inquisition, or any other title that represents more of what people who play at 1440p (or higher) will see when going from PS4 or Xbox One exclusives to PC ports.

A bunch of indies and some 360-built ports - might as well include the Saints Row: Gat out of Hell benchmarks too, right? - don't really demonstrate why having only 2GB is unwise going forward.


----------



## Chaitanya (Jan 15, 2015)

64K said:


> Looks like a very nice card for 1080p gaming. I hope the rumored $200 price point is true. It will be very successful if so.


If nVidia goes aggressive with pricing, down to $180, that card will sell like hot cakes and do what the 7850 did for AMD a few years back.


----------



## RCoon (Jan 15, 2015)

HisDivineOrder said:


> Now include the chart with Watch_Dogs, Assassin's Creed Unity, Dragon Age Inquisition, or any other one that represents more of what people who play at 1440p (or higher) will have when going from PS4 or Xbox One exclusives to PC port.
> 
> A bunch of indies and some 360-built ports--might as well include the Saints Row Gat out of Hell benchmarks too right?--don't really represent why having less than 2GB is unwise going forward.



I'll run an article later tonight or tomorrow with a couple of the AAA titles, detailing VRAM as well as memory bandwidth usage and PCIe bus usage on Maxwell. Hoping to get hold of a card without Maxwell compression with similar GB/s memory bandwidth figures (looks like the 770 is a match) to note differences not just in VRAM usage, but also memory bandwidth usage. But that depends on whether I can source a card for tests.


----------



## darkangel0504 (Jan 15, 2015)

Dragon Age Inquisition takes 2400MB of VRAM at 1440x900.


----------



## Sanhime (Jan 15, 2015)

What would this say about the mobile solution? The 860M is already based on Maxwell; if a "960M" is coming along, would it be any better than the 860M?


----------



## ap4lifetn (Jan 15, 2015)

That's strange: NVIDIA uses a cut-down GM204 with 1024 cores/128-bit for their mobile GTX 965M, but it seems that their GM206 (this GTX 960) has the same configuration?


They are stockpiling chips better than 1024/128 for their GTX 960 Ti


----------



## mroofie (Jan 15, 2015)

rtwjunkie said:


> So, double the 660 performance means Nvidia is saying hands-down this beats a 770, correct?


Is this stock or OC (the double-the-660 claim)?
If it's stock, that would mean I could reach GTX 970 levels of performance :0


----------



## rtwjunkie (Jan 15, 2015)

mroofie said:


> is this stock or oc (the double 660 claim) ?
> if its stock that would mean I could reach gtx 970 levels of performance :0


 
Who knows? The whole thing is vague.


----------



## CrAsHnBuRnXp (Jan 15, 2015)

Really curious how this performs up against a 780.


----------



## MxPhenom 216 (Jan 15, 2015)

darkangel0504 said:


> Dragon Age Inquisition takes 2400 VRAM on res 1440 x 900



That game is also terrible, so who cares?


----------



## Nabarun (Jan 15, 2015)

All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.


----------



## MxPhenom 216 (Jan 15, 2015)

Nabarun said:


> All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.



Just buy the best you can get right now. "future proof" really needs to stop "trending" when it comes to this stuff.


----------



## Xzibit (Jan 15, 2015)

jabbadap said:


> Any word about video decoding options? Tegra X1 has full H.265/VP9 decoding. Really hope this has it too.



If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit; decoding and encoding will still work.

For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro or Quadro card,
through DP 1.2+ or HDMI 1.4+.

Content -> Processing -> Panel


----------



## 64K (Jan 15, 2015)

Nabarun said:


> All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.



You obviously keep a card for a long time, so maybe a GTX 970 would be best for you, but I think it's a pretty safe bet that the GTX 960 is going to come in between a 760 and a 770. Which side it leans toward is unknown - probably the 770 side. We'll know pretty soon, but a GTX 960 would be a heck of a nice upgrade for you from that GTS 250.


----------



## Nabarun (Jan 15, 2015)

64K said:


> You obviously keep a card for a long time so maybe a GTX 970 would be best for you but I think it's a pretty safe bet that the GTX 960 is going to come in between a 760 and a 770. To which side it leans to is unknown. Probably towards the 770 side. We'll know that pretty soon but a GTX 960 would be a heck of a nice upgrade for you from that GTS 250.


Yeah, I know, the 960 would definitely be a great upgrade *NOW*, but the card I want should be able to tackle, say, Crysis 4 - at least minimally (>30fps @ lowest settings) @ 1080p. Is that too naive to expect? I hope W1zzard includes FC4 and the last 2 COD titles in the review. Will get a pretty good idea then. Not expecting too much though.


----------



## MxPhenom 216 (Jan 15, 2015)

Nabarun said:


> Yeah, I know, the 960 would definitely be a great upgrade *NOW*, but the card I want should be able to tackle, say Crysis 4 - at least minimally (>30fps @ lowest settings) @1080p. Is that too naive to expect? I hope W1zzard includes the FC4 and last 2 COD stuff in the review. Will get a pretty good idea then. Not expecting too much though.




Wait for Wizz's review before making a decision.


----------



## HumanSmoke (Jan 15, 2015)

Xzibit said:


> If its like all other GeForce cards it wont.  Tegra X1 supports 10bit making it possible to be fully 4k compliant.  GeForces are all 8bit. Decoding and encoding will still work.
> For* full H265*/VP9 you have to get *Radeon HD 6xx0 or newer*, FirePro or Quadro card


Say what???? That's news, considering even AMD's latest Tonga offering doesn't have H.265 decode support.


----------



## rruff (Jan 15, 2015)

RCoon said:


> I'll run an article later tonight or tomorrow with a couple of the AAA titles, detailing VRAM as well as memory bandwidth usage and PCIe bus usage on Maxwell. Hoping to get hold of a card without Maxwell compression with similar GB/s memory bandwidth figures (looks like the 770 is a match) to note differences not just in VRAM usage, but also memory bandwidth usage. But that depends on whether I can source a card for tests.



I'd be very interested in the results! In the past, people have compared the 770 2GB and 770 4GB and not found any benefit to the increased VRAM, except in SLI, and barely then. VRAM requirements are a very hot topic right now. Many are *claiming* that a lack of VRAM is hurting performance in new games, but they always have cards that are slow as well as low on VRAM, and assume VRAM is the culprit when it probably isn't.

Would be great to have a testbed set up with two cards identical except for double the VRAM on one. The 770s are pretty ideal, or maybe the 960s once the 4GB version comes out.


----------



## jabbadap (Jan 15, 2015)

Xzibit said:


> If its like all other GeForce cards it wont.  Tegra X1 supports 10bit making it possible to be fully 4k compliant.  GeForces are all 8bit. Decoding and encoding will still work.
> 
> For full H265/VP9 you have to get Radeon HD 6xx0 or newer, FirePro or Quadro card
> through DP 1.2+ or HDMI 2.0.
> ...



I have to admit I have no idea what you mean. But I can guarantee that you can't hardware-decode H.265/VP9 with an HD 6xxx; heck, the Tonga R9 285 was the first AMD GPU that could decode H.264 4K60p video.


----------



## Xzibit (Jan 15, 2015)

jabbadap said:


> I have to admit I have no idea what do you mean. But I can guarantee that you can't hardware decode h265/vp9 with hd6xxx, heck tonga r9-285 was the first amd gpu that could decode h.264 4k60p video.



H265/VP9 are mainly for the 4K 10-bit 4:2:0+ standard. You do have lower- and higher-quality options.

You could let the CPU do the workload, if it's fast enough, and still be crippled by your 8-bit GPU.

Content (true H265/VP9, 4K 10-bit 4:2:0+) -> processing (CPU/GPU): if your GPU is putting it out at 8-bit, you're already downgrading the quality before it gets to your panel.
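The downgrade being described is simple quantization: 10-bit content carries 1024 levels per channel, and an 8-bit output stage keeps only 256 of them. A toy sketch (the `truncate_to_8bit` helper is hypothetical, not any real driver API):

```python
# Toy illustration of the 8-bit output bottleneck: a 10-bit sample (0..1023)
# forced through an 8-bit pipeline loses its two least-significant bits.
def truncate_to_8bit(v10: int) -> int:
    """Map a 10-bit value (0..1023) onto the 8-bit range (0..255)."""
    return v10 >> 2

distinct_in = len({v for v in range(1024)})                     # 1024 input levels
distinct_out = len({truncate_to_8bit(v) for v in range(1024)})  # 256 output levels
print(distinct_in, "levels in ->", distinct_out, "levels out")
```

So even if the decode itself succeeds, a 75% reduction in per-channel tonal levels happens before the signal ever reaches a 10-bit panel.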


----------



## HumanSmoke (Jan 15, 2015)

Xzibit said:


> H265/VP9 are mainly for the 4k 10bit 4:2:0+ standard.  You do have the option for lower or higher quality options.
> You could let the CPU do the work load and still be crippled by your 8bit GPU.
> Content True H265/VP9 4k 10bit 4:2:0+ -> Processing CPU/GPU if your GPU is processing it at 8bit out your already downgrading the quality before it gets to your panel.



So, you're still sticking with your assertion then?


Xzibit said:


> For full H265/VP9 you have to get Radeon HD 6xx0 or newer....



As for the GTX 960... not on my shopping list, but hopefully it causes some price realignments across both vendors' cards that benefit the consumer.
An interesting snippet in the source article: one million GTX 970/980s sold so far. An impressive number given their pricing and a sales window of barely three months.


----------



## Petey Plane (Jan 15, 2015)

DarkOCean said:


> only 2gb of vram for a 2015 card... LOL




You do realize this is a 1080p card, right? 2GB is more than enough for that resolution.


----------



## MxPhenom 216 (Jan 15, 2015)

Petey Plane said:


> you do realize this is a 1080p card right?  2gb is more than enough for that resolution.



Not only that, but it's a midrange GPU.


----------



## Petey Plane (Jan 15, 2015)

MxPhenom 216 said:


> Not only that but its a mid range GPU.



Yeah, people complaining that a sub-$200 card won't be able to hit 60fps in FarCry 4 on ultra on a 4K screen really should maybe find another hobby, because they obviously don't get this one.


----------



## dj-electric (Jan 15, 2015)

Personally, I'm already hitting the wall at 1440p with my 3GB GPU, with Arma, BF4 and many games going over 2750MB in use.

I would like to see mid-range cards with 3GB or more. I guess a 4GB GTX 960 is just a question of time for those SLIing people.


----------



## rruff (Jan 15, 2015)

Dj-ElectriC said:


> Personally - im already hitting the wall in 1440P with my 3GB GPU. With arma, BF4 and many games going over 2750MB in use.



Vram used and vram *needed* are two different things.


----------



## GhostRyder (Jan 15, 2015)

Xzibit said:


> If its like all other GeForce cards it wont.  Tegra X1 supports 10bit making it possible to be fully 4k compliant.  GeForces are all 8bit. Decoding and encoding will still work.
> 
> For full H265/VP9 you have to get Radeon HD 6xx0 or newer, FirePro or Quadro card
> through DP 1.2+ or HDMI 1.4+.
> ...


Might this be what you're referring to? I had not thought about it yet, and had not seen full support for it until you mentioned it, but now I am interested. Otherwise, let me know what you're referring to, as I would like to know and give it a whirl.

The GTX 960 is a midrange card, and as I said before, having 2GB with the new compression system makes up for it and should be enough for anyone playing 1080p games. Not sure I would expect it to contain a vast feature set and an extreme amount of RAM for the price, though I guess having at least 1GB more VRAM would be better if you're looking to SLI, though I doubt it would make much of a difference.


----------



## dj-electric (Jan 15, 2015)

rruff said:


> Vram used and vram *needed* are two different things.



I choose to need more.


----------



## Fluffmeister (Jan 15, 2015)

Dj-ElectriC said:


> I choose to need more.



Good for you, clearly the GTX 960 isn't for you.


----------



## Casecutter (Jan 15, 2015)

Live OR Die said:


> WTF they compare it to a GTX660 why not the GTX760?


Yea, the GTX 660 (GK106) was a dud on so many levels, so it's not working against any high bar. When they say it has "great OC'ing", that means all you'll see are custom specials (perhaps 2x 6-pins) for the normal increase. Is anyone holding out any hope there will be reference cards? And when did folks start considering $200 the price point for "budget" gaming?

That said, I think it will find many homes, only because AMD has nothing new on the horizon, and at best lower prices on the 285/280X won't stir people. AMD seems real late in thwarting the frenzy, and that's clearly their problem.


----------



## rruff (Jan 15, 2015)

Dj-ElectriC said:


> I choose to need more.



What I mean is that it's gratuitous usage. It's used simply because you have it available, and it would probably run fine on the same settings if you had less. Like when I open Firefox and Chrome together: they suck up most of my 8GB of RAM, but they don't need it.


----------



## Tonduluboy (Jan 15, 2015)

In my country, one store listed the Gigabyte GTX 960 at only $30 cheaper than a reference 970. I feel bad for those who don't have the extra $30 to buy a 970.


----------



## HumanSmoke (Jan 15, 2015)

rruff said:


> What I mean is that it's gratuitous usage. It's used simply because you have it available, and it would run fine on the same settings if you had less... probably. Like when I open Firefox and Chrome together they suck up most of my 8GB of ram... but they don't need it.


Yup. There can be a big difference in memory allocation vs actual memory usage. There are also plenty of instances where memory allocation not only reserves all the vRAM (minus required buffers) but exceeds the capacity of the vRAM since OGL seems quite happy to reserve system RAM as well as vRAM.


GhostRyder said:


> Might this be what your referring to?  Have not thought about it yet but I had not seen full support for it until you mentioned it but now I am interested in it.  Otherwise let me know what your referring to as I would like to know and give it a whirl.


It's a third party plug-in that can work but doesn't have full support (and no VP9 support). Incidentally, the AMD download states " This version supports the OpenCL devices *like* AMD HD 5000 and above discrete GPUs..." that also includes Nvidia cards (Kepler and Maxwell at least), but like most (if not all) OCL based H.265 encode at present, is as slow as blood in a dead man's veins.


----------



## Xzibit (Jan 15, 2015)

GhostRyder said:


> Might this be what your referring to?  Have not thought about it yet but I had not seen full support for it until you mentioned it but now I am interested in it.  Otherwise let me know what your referring to as I would like to know and give it a whirl.
> 
> The GTX 960 is a mid range card and as I said before having 2gb with the new compression system makes up for it any how and should be enough for anyone playing 1080p games.  Not sure I would expect it to contain a vast feature set and extreme amount of ram for the price though I guess having at least 1gb more VRAM would be better if your looking to SLI though I doubt it would make much of a difference.



I was going off the assumption that if you had a GeForce in the system, even an older one, you would be able to decode or encode through software or hybrid (be it your CPU, if it's fast enough, or the GPU, if supported), but the output would be dumbed down to 8-bit even if the original content was 10-bit and you had a 10-bit panel.

There is also this: *PCWorld - New Intel graphics driver adds 4K video support, Chrome video acceleration and more*

The only one not doing 10-bit out is Nvidia GeForce.

*EDIT:*
_*added GeForce before someone decides to try and troll like usual. _


----------



## xorbe (Jan 15, 2015)

Opportunity missed: I think "170.6-bit" sounds more impressive than lying about 9.3 GHz VRAM. (You either get this joke, or you don't...)


----------



## GhostRyder (Jan 16, 2015)

HumanSmoke said:


> It's a third party plug-in that can work but doesn't have full support (and no VP9 support). Incidentally, the AMD download states " This version supports the OpenCL devices *like* AMD HD 5000 and above discrete GPUs..." that also includes Nvidia cards (Kepler and Maxwell at least), but like most (if not all) OCL based H.265 encode at present, is as slow as blood in a dead man's veins.


Was I speaking to you?


Xzibit said:


> I was going off the assumption that if you had a GeForce in the system even an older one you would be able to decode or encode thru software or hybrid be it if your CPU is fast enough or GPU is supported but the output would be dumb down to 8bit output even if the original content was 10bit and you had a 10bit panel.
> 
> There is also this.. *PCWorld - New Intel graphics driver adds 4K video support, Chrome video acceleration and more*
> 
> ...


Oh, I see what you're saying now; sorry, misinterpretation on my part. I had actually forgotten about that, to be honest, as it was something I just did not have to think about on a daily basis.




Tonduluboy said:


> In my country 1 store listed this card price 960 gigabyte $30 cheaper than Ref 970, feel bad for those who dont have $30 to buy 970


Well, I hope it's at least a bit cheaper than that, otherwise I think the obvious choice would be a GTX 970, lol.


----------



## Rowsol (Jan 16, 2015)

DarkOCean said:


> only 2gb of vram for a 2015 card... LOL



At 1080p there's no need for more.

This card is going to be killer. If the price/performance is better than the 970's, it will be insane.


----------



## ptmmac (Jan 16, 2015)

I am wondering whether the switch to PCIe 3.0 is part of why there has been such a drop in required bus width in video cards. Is 128-bit on PCIe 3.0 equal to 256-bit, or is it just the video compression they are running? Video compression is not necessarily a bad thing, especially if it greatly reduces the power and expense of running an nVidia card. The other question here is: will there be support for SLI on this card? Good overclocking, low power requirements and good SLI support would make this card a low-cost and upgradeable path for the next 3 years.


----------



## Winston_008 (Jan 16, 2015)

Casecutter said:


> Yea, the GTX 660 (GK106) was a dud on so many levels, so not working against any High bar.  When they say it "great OC'n" that means all you'll see are custom specials (perhaps 2x 6-pins) for the normal increase.  Is anyone holding any hope there will be reference cards? And when did folks start considering $200 the point as "Budget" Gaming?
> 
> That said I think it will find many homes only because AMD has nothing new on the horizon, and at best lower price of 285/280X won't stir people.  AMD seems real late in thwarting the frenzy and that's clearly their problem.



How was the gtx 660 a dud on so many levels?


----------



## HumanSmoke (Jan 16, 2015)

ptmmac said:


> I am wondering whether the switch to PCI 3.0 is part of why there has been so much of a drop in required bit width in Video cards.  Is 128 bit on a 3.0 PCI equal to 256 bit or is it just the video compression they are running?


It's the latter - the delta (colour) compression. PCI-E bandwidth for single cards covers communication between the graphics card and CPU computation/system memory. Data movement depends upon the app/game's CPU requirement, but the PCI-E lanes wouldn't become saturated before CPU coding stalls or writing to/retrieving from system memory become the limiting factor. The internal bus width (GPU <-> vRAM) is the more important factor. Colour compression, like any other form of data compression, allows for faster data transfer.
As for bus width drops, that isn't necessarily the case. Third/fourth-tier GPUs have historically been 128-bit for some time (AMD's Juniper HD 57x0/67x0, Bonaire, and Cape Verde HD 77xx/R7 260), while Nvidia often compromised with 192-bit to offset slower GDDR3/GDDR5 frequencies before they got their memory controller act together.
As the low-end discrete graphics market basically evaporates, it also puts more pressure on the next tier up the product stack to remain cost effective, so die size becomes a significant factor, as does getting a good return on investment - which is why both AMD's and Nvidia's product stacks look less than easy to categorize. Nvidia's present range spans three architectures (Fermi, Kepler, Maxwell), and AMD's spans five.
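As a rough illustration of the delta (colour) compression idea: store one base value per tile plus small per-pixel differences, which pack into fewer bits in smooth image regions. This is a toy sketch only - the actual GM206 hardware scheme, tile size, and encoding are not public, so everything here is invented for illustration:

```python
# Toy lossless delta compression over one colour channel of a tile.
# Decompression restores the exact original values (hence "lossless");
# the saving comes from deltas needing fewer bits than full values.

def compress_tile(pixels):
    """pixels: list of ints (one colour channel of a tile)."""
    base = pixels[0]
    deltas = [p - base for p in pixels[1:]]
    return base, deltas

def decompress_tile(base, deltas):
    return [base] + [base + d for d in deltas]

def compressed_bits(base, deltas, full_bits=8):
    # In smooth regions the deltas are tiny, so they fit in a handful
    # of bits each - that is where the bandwidth saving comes from.
    if not deltas:
        return full_bits
    span = max(abs(d) for d in deltas)
    delta_bits = max(1, span.bit_length() + 1)  # +1 bit for the sign
    return full_bits + delta_bits * len(deltas)

tile = [200, 201, 199, 202, 200, 198, 201, 200]    # a smooth 8-pixel run
base, deltas = compress_tile(tile)
assert decompress_tile(base, deltas) == tile        # lossless round-trip
print(compressed_bits(base, deltas), "bits vs", 8 * len(tile))  # 29 bits vs 64
```

A tile with wildly varying colours would compress much worse, which is why the real-world gain is an average, not a guarantee.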


ptmmac said:


> The other question here is will there be support for SLI on this card?


Yes. The SLI finger can be clearly seen in this MSI GTX 960








GhostRyder said:


> Was I speaking to you?


Well, if you were asking for information from just a single individual, why post on a public forum rather than PM the person concerned? Sorry I provided the information as opposed to your BFF - no need to go all...


----------



## darkangel0504 (Jan 16, 2015)

MxPhenom 216 said:


> that game is also terrible so who cares.


game of the year is terrible ???


----------



## silapakorn (Jan 16, 2015)

It hits the shelves today in my country, at 300$ a piece.
I feel bad for those who can't afford 970.


----------



## xorbe (Jan 16, 2015)

HumanSmoke said:


> It's the latter - the delta (colour) compression.



They might even be keeping the textures compressed on the host side, resulting in faster host to gfx card transfers.  (Probably are.)


----------



## HumanSmoke (Jan 16, 2015)

xorbe said:


> They might even be keeping the textures compressed on the host side, resulting in faster host to gfx card transfers.  (Probably are.)


Yeah, I think it works both with writing/retrieving from system RAM, and also from client vRAM to the texture address units of the GPU.


silapakorn said:


> It hits the shelves today in my country, at 300$ a piece.
> I feel bad for those who can't afford 970.


Ouch! Sounds like some serious pre-release price gouging (unless all other cards are carrying the same kind of mark-up).
If they're on the shelves, how about some quick phone pictures for us?


----------



## GhostRyder (Jan 16, 2015)

HumanSmoke said:


> Well, if you were asking for information from a just a single individual, why post on a public forum rather than PM the person concerned? Sorry I provided the information as opposed to your BFF - no need to go all...


Don't care, stop obsessing and throwing a tantrum.



Rowsol said:


> @ 1080 there's no need for more.
> 
> This card is going to be killer.  If the price/perfrmance is better than the 970 it will be insane.


I agree; mostly it's the color compression that makes the 2GB enough for a card like this, which is going to be a sweet 1080p card.  I do not doubt there will be 4GB variants for those who want to make sure, or who go for SLI and 1440p on a budget (so long as the price stays with predictions), but one of these is what I look forward to seeing in action.


----------



## john_ (Jan 16, 2015)

Color compression is only used for transfers to and from memory; it doesn't have an effect on memory capacity, or am I missing something?


----------



## silapakorn (Jan 16, 2015)

HumanSmoke said:


> Yeah, I think it works both with writing/retrieving from system RAM, and also from client vRAM to the texture address units of the GPU.
> 
> Ouch! Sounds like some serious pre-release price gouging (unless all other cards are carrying the same kind of mark-up).
> If they're on the shelves, how about some quick phone pictures for us?









I didn't take it myself. Grabbed it from the store's FB page. No unboxed pictures though.

PS. The same store sells Gigabyte GTX970 G1 at around 410$.


----------



## HumanSmoke (Jan 16, 2015)

john_ said:


> Color compression is only used for transfers to and from memory; it doesn't have an effect on memory capacity, or am I missing something?


Not sure who you're addressing, but xorbe and I have both asserted that delta colour compression is used in the former (transferring to/from memory). Not sure how colour compression would be confused with capacity, since the measure of its effectiveness is a percentage of GB*/sec* or Gbps (bandwidth), not GB (capacity). This was explained at GM204's launch.






silapakorn said:


> The same store sells Gigabyte GTX970 G1 at around 410$.


Well, the 970 G1 is a $360 part at Newegg, so assuming no price gouging on pre-launch sales (might be unlikely) that would make the 960 G1 a $260 card by comparison. Not overly scientific I'll grant.
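The back-of-envelope scaling is just a price ratio; as a sanity check (shop prices from silapakorn's post, Newegg price as quoted above - none of this is scientific):

```python
# Scale the local GTX 960 shelf price by the ratio of Newegg's
# GTX 970 G1 price to the local GTX 970 G1 price.
local_960 = 300      # local GTX 960 shelf price, USD (silapakorn's figure)
local_970 = 410      # local Gigabyte GTX 970 G1 price, USD
newegg_970 = 360     # Newegg GTX 970 G1 price, USD

estimate_960 = local_960 * newegg_970 / local_970
print(round(estimate_960))   # -> 263, roughly the $260 figure above
```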


----------



## xorbe (Jan 16, 2015)

Sure it has an effect on effective vRAM size.  The textures are 75% of their original size, so 1.33x transfer rate, 1.33x texture storage (max - some vRAM is used for frame buffers, of course).
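The arithmetic behind that 1.33x is just the reciprocal of the compressed size, and it is a best case (the 75% figure is an assumed average, not a measurement):

```python
# Best case: data at 75% of original size means the same bus moves
# 1/0.75 of the payload per second.
compressed_fraction = 0.75
speedup = 1 / compressed_fraction
print(round(speedup, 2))        # -> 1.33

# Applied to the GTX 960's 112 GB/s of raw bandwidth:
print(round(112 * speedup, 1))  # -> 149.3 GB/s effective, best case
```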


----------



## HumanSmoke (Jan 16, 2015)

xorbe said:


> Sure it has an effect on effective vram size.  The textures are 75% of original size, so 1.33x transfer rate, 1.33x texture storage (max -- some vram is used for frame buffers of course).


That 1.33 is a very variable number, as I'm sure you're aware, since not all textures can be compressed - and that's aside from the vRAM capacity set aside for storage buffers, post-process effects etc. Compressing the data surely adds to the effective capacity of the framebuffer, but the fluidity of the workload would make the actual gains variable. At the opposite end of the scale, a highly compressed workload might meet a bottleneck at the texture address units. Fillrate seems to fall dramatically compared to Kepler due to the reduced TMUs available (compared with an otherwise ~equally performing GK110).


----------



## john_ (Jan 16, 2015)

HumanSmoke said:


> Not sure who you're addressing



I wasn't addressing anyone specifically. It was a general question because of what GhostRyder wrote, which puzzled me:



GhostRyder said:


> I agree, mostly *it's the color compression that makes the 2gb enough* for a card like this which is going to be a sweet 1080p card.


----------



## HumanSmoke (Jan 16, 2015)

john_ said:


> I wasn't addressing to someone specifically. It was a general question because of what GhostRyder wrote that puzzled me


I see why you made the query. The compressibility would allow more to be stored in the existing vRAM. Assuming the gains are what Nvidia says they are (!), the extra ~700MB of effective capacity might make a difference, but I'm figuring the low TMU count (64) might end up wiping out some of that theoretical gain. Adding vRAM doesn't always translate into tangible benefit (see 4GB vs 8GB R9 290X, 1.5GB vs 3GB GTX 580, 3GB vs 6GB GTX 780, for example*); the graphics pipeline just moves to the next choke point.

Having said that, a 2GB card should still suffice at 19x10 resolution for the image quality setting likely being used. Unless you're a masochist, I doubt many people would deliberately amp up the game I.Q. and play at sub-optimal framerates just to make a point.

* The larger framebuffers only create separation when the smaller vRAM capacity cards are deliberately overwhelmed (e.g. use of high res texture packs) - so less a gain by the large framebuffer card than a deliberate hobbling of the standard offering.


----------



## Lionheart (Jan 16, 2015)

MxPhenom 216 said:


> that game is also terrible so who cares.



Uuh what? Many ppl like that game including myself so ppl will care.


----------



## GhostRyder (Jan 16, 2015)

john_ said:


> I wasn't addressing to someone specifically. It was a general question because of what GhostRyder wrote that puzzled me





john_ said:


> Color compression is only used for transfers to and from memory; it doesn't have an effect on memory capacity, or am I missing something?


Because that's exactly what I meant by it: the color compression technique now employed by both AMD and Nvidia in their new-generation desktop GPUs helps alleviate a little of the vRAM bottleneck that can surface on top of that.  It's not the most significant amount, but it helps, and at 1080p games do not really blow away the vRAM, so 2GB is normally enough for a card aimed at the mid-range. With that help, 2GB will feel like a little more, or at least enough to alleviate some potential bottlenecks (not all, mind you, just some).


----------



## rruff (Jan 16, 2015)

HumanSmoke said:


> Having said that, a 2GB card should still suffice at 19x10 resolution for the image quality setting likely being used. Unless you're a masochist, I doubt many people would deliberately amp up the game I.Q. and play at sub-optimal framerates just to make a point.



I don't understand very well how video cards work. Is the complaint about low vRAM quantity coming from new games storing higher-res textures in vRAM? Is this somehow separable from the card's processing speed? And if so, is it *necessary* to store so much in vRAM, or is it just the way the game coders did it... i.e. assuming that the card would have x GB of vRAM available?

I have a GTX 750 1GB. So many were saying the low vram would hobble it, but I researched it before buying and decided this wouldn't be the case... and I've not experienced a problem yet. Possibly I will. As you said, if I turned up the settings, it might be an issue, but no one would want to play at 10-15 fps anyway.

The GTX 960 looks to be coming in ~2x the speed of a GTX 750, so I don't expect the 2GB to necessarily be a problem, but the bandwidth is only 40% greater. And overclocking my vram helped quite a lot, so I think that will be a restriction on the 960... ie it could run considerably faster if it had more bandwidth. Well... on the other hand maybe not, since it is already twice as fast using 2x the shaders, TMUs, and ROPs.


----------



## LiveOrDie (Jan 16, 2015)

Lionheart said:


> Uuh what? Many ppl like that game including myself so ppl will care.



This card will be less than 10% better than the 760, which is why they used a 660 in the comparison.


----------



## Tsukiyomi91 (Jan 16, 2015)

Like I said, the GTX960 WILL supersede both the GTX750 & GTX750Ti, as those two cards somewhat failed to deliver on expectations when they came out. In my opinion, I can safely say that the GTX960 and its $200-or-below price point are going to make some heads turn among both PC builders & budget-conscious PC gamers around the world. Over here in Malaysia, the GTX960 will become a sensational product of the year, as many PC users see the GTX970 as a "high-end VGA card" thanks to its $350 price tag, with vendor-based kits like the ASUS Strix, Gigabyte G1 Gaming, MSI Gaming & others hitting nearly MYR1800 a piece.


----------



## MxPhenom 216 (Jan 16, 2015)

xorbe said:


> Sure it has an effect on effective vram size.  The textures are 75% of original size, so 1.33x transfer rate, 1.33x texture storage (max -- some vram is used for frame buffers of course).



If it drops the quality of the textures, how are they able to advertise it as lossless?


----------



## Tsukiyomi91 (Jan 16, 2015)

Remember: the GTX960 IS A BUDGET 1080p VGA CARD, catered specifically to those who can't afford the GTX970 or even the top-of-the-line GTX980. Besides, not everyone has the money to build a rig that pushes games at max settings @ 1080p, let alone 1440p or 4K. 2GB on a 128-bit bus is no issue, as we're talking about a Maxwell chip running under the hood, not a full-blown GK110 like the GTX780Ti uses. A single 6-pin connector is also no problem, as this card is not like those power-hungry cards whose requirements are a little too high for budget gamers. Couple the 960 with a cool-running Core i3 & a cheap 8GB DDR3 kit and you'll get a very efficient rig that uses very little energy while playing games comfortably at 1080p on High without running into problems like low fps.


----------



## rruff (Jan 16, 2015)

Tsukiyomi91 said:


> Like I said, the GTX960 WILL supersede both the GTX750 & GTX750Ti as these 2 cards somewhat did not deliver it's expectations when it came out.



Actually it's my impression that these cards sold like crazy. And check the reviews from people who bought them... overwhelmingly positive. Probably more so than any other model. Which likely is due to good QC and drivers, but still... many are very satisfied with the performance.

The 960 is in a different league altogether. I'm sure a GTX 960 looks pathetic to someone with SLI 980s, but it is reported to be *literally 2x the speed of a 750, and ~1.8x the speed of a 750 Ti*. At a retail price of $199, the 960 will sell like crazy.


----------



## HumanSmoke (Jan 16, 2015)

rruff said:


> I don't understand how video cards work very well. Is the complaint about low vram quantity coming from new games storing higher res textures in vram? Is this somehow separable from the card's processing speed? And if so, is it *necessary* to store so much in vram, or is it just the way the game coders did it... ie assuming that the card would have xGB of vram available?


As an analogy: the more complicated and larger the picture the GPU has to paint, the more paint and the wider range of colours required (textures, geometry, tessellation etc). The picture has to be painted in one sitting, and you can only use the paint you can fit on your palette. You can increase your painting speed (framerate), but the amount* and colour range of paint you can put on your palette depends upon the palette's size (vRAM framebuffer).
* Using a higher-quality (denser) paint would allow for more coverage (delta compression).


rruff said:


> I have a GTX 750 1GB. So many were saying the low vram would hobble it, but I researched it before buying and decided this wouldn't be the case... and I've not experienced a problem yet. Possibly I will. As you said, if I turned up the settings, it might be an issue, but no one would want to play at 10-15 fps anyway.


Yup. Just a simple case of adjusting the workload ( screen resolution, gaming image quality settings) to fit the available hardware.


----------



## RCoon (Jan 16, 2015)

Far Cry 4, Very High preset @1440p





The GTX 960 has 112GB/s, so it's not quite enough to run it at *1440p* during peak gameplay. Once I've done my 1080p benchmarks, we'll see what the figures are.

These are *rough* and *approximate* figures based on some educated extrapolation with a test version of GPU-Z W1zzard sorted for me. The values _could_ be entirely wrong. I'll explain in detail once the full article is up.


----------



## Xzibit (Jan 16, 2015)

Isn't it tricky, since one of the benefits is delta-based, leading to variable outcomes depending on the source being compressed and the output?

Running benchmark X / game X / scene X as opposed to some other combination: games A-Z will never share similar results, while scenes A-Z within the same game will always vary the outcome.

I don't know how you're running it, but wouldn't it be
a Kepler vs. Maxwell comparison at similar performance with the same frame buffer?

Am I totally misinterpreting it?


----------



## RCoon (Jan 16, 2015)

Xzibit said:


> Isn't tricky since one of the benefits is delta based leading to variable outcomes depending on the source compressed and the output.
> 
> Running benchmark X Game X Scene X as appose to running Bx,Cx,Sy. Game A-Z will never share similar while Game X Scene A-Z will always vary outcome.
> 
> ...



You're correctly interpreting the fact that it's altogether a nightmare to measure accurately. But somebody needs to do it, to prove or disprove once and for all whether this 128-bit, 112GB/s memory bandwidth is actually an issue at 1080p.
I've got a separate benchmark for non-compressed memory bandwidth usage, as well as some graphs to show how bandwidth usage correlates (or doesn't) with other usage on GPU hardware (VRAM, PCIe bus, GPU load). The best possible thing to do is to take the _highest_ bandwidth usage figure and go by that, to be utterly and completely sure it won't be a bottleneck. I'm also attempting to cover 4 games that represent a couple of different types, including a VRAM hog, a CPU-limited title, a GPU-limited title, and a generally well-rounded title. Once I've finished the full write-up, people are welcome to request benchmarks on games.

I'd ideally like a 770, as it shares identical bandwidth with the 970, the difference being the Maxwell compression technique. As it stands, I'm having to _assume_ it's 30%. In reality it varies a lot.

I must stress at this point, though, that even NVidia has mentioned the available tools for measuring such a thing are _not particularly accurate_. (They actually said the software application available is not 100% representative, just that the values are similar by proxy.)
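The worst-case check described above boils down to something like this (sample numbers and the 30% compression gain are placeholders, not measured values - the real test will use logged GPU-Z figures):

```python
# Pessimistic headroom test: compare the single highest logged
# bandwidth-usage sample against the card's effective peak.

RAW_BANDWIDTH_GBS = 112.0    # GTX 960 spec: 7.0 GHz GDDR5 on a 128-bit bus
COMPRESSION_GAIN = 1.30      # assumed Maxwell delta-compression benefit

def bandwidth_headroom(samples_gbs):
    """Return (peak usage, effective peak, True if no bottleneck)."""
    peak_usage = max(samples_gbs)                 # be pessimistic: the spike
    effective = RAW_BANDWIDTH_GBS * COMPRESSION_GAIN
    return peak_usage, effective, peak_usage < effective

samples = [61.2, 74.8, 96.5, 88.1, 102.3]         # hypothetical per-second logs
peak, effective, ok = bandwidth_headroom(samples)
print(f"peak {peak} GB/s vs {effective:.1f} GB/s -> {'OK' if ok else 'bottleneck'}")
```

If the compression gain turns out lower for a given scene, the `effective` figure shrinks accordingly, which is exactly why the variability matters.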


----------



## rruff (Jan 17, 2015)

HumanSmoke said:


> The picture has to be painted in one sitting, and you can only use the paint you can fit on your palette.



Thanks, that makes sense to me. For instance, say I was able to get 30fps max, and I happen to be at 99.9% of my 1GB of vram capacity. If I doubled the speed (say with 100% efficient SLI), I could double my fps, but I wouldn't be able to increase the quality settings at all without causing problems, like swapping to system memory and causing stutters. True? Is the issue as simple as having more digital information in a _single frame_ than the vram capacity, or is it more complicated? Because 1GB seems like quite a lot for one frame.


----------



## HumanSmoke (Jan 17, 2015)

rruff said:


> Thanks, that makes sense to me. For instance, say I was able to get 30fps max, and I happen to be at 99.9% of my 1GB of vram capacity. If I doubled the speed (say with 100% efficient SLI), I could double my fps, but I wouldn't be able to increase the quality settings at all without causing problems, like swapping to system memory and causing stutters. True?


Correct insofar as vRAM is concerned. You can still increase image quality settings that aren't directly texture based. In SLI (and CrossfireX) the cards work in parallel with the same resources mapped into their individual on-board graphics memory - that is to say the vRAM is mirrored across each card. Quick illustration: These benchmarks run by the system builder Digital Storm show that the usage for two GTX 780 Ti's in SLI is the same as that for a single card:









rruff said:


> Is the issue as simple as having more digital information in a _single frame_ than the vram capacity, or is it more complicated? Because 1GB seems like quite a lot for one frame.


Not all the vRAM is allocated to a single frame. The vRAM has portioned buffers (size is dependent upon the application) holding multiple frames at varying stages of completion: one frame being sent to the monitor (or to the primary graphics card and then the monitor, if the card is the 2nd, 3rd, or 4th in an SLI setup) from the front buffer, the next held in the back buffer - which then becomes the new front buffer as the newly vacated former front buffer assumes back-buffer duty (there are also triple-buffering options). There are many other buffers to take into consideration as well, such as the depth buffer. Basically, the 1GB of vRAM you have isn't dedicated to drawing a single frame at a time.
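The front/back buffer flip can be modelled in a few lines (a minimal software sketch; real flips are performed by the display hardware, and the buffer "contents" here are just frame numbers):

```python
# Minimal model of double buffering: the GPU renders into the back
# buffer while the display scans out the front buffer; at the flip,
# the two swap roles.

class DoubleBuffer:
    def __init__(self):
        self.front = None   # frame being scanned out to the monitor
        self.back = None    # frame currently being rendered

    def render(self, frame_id):
        self.back = frame_id

    def flip(self):
        # Finished back buffer becomes the new front buffer; the old
        # front buffer is reused for rendering the next frame.
        self.front, self.back = self.back, self.front

fb = DoubleBuffer()
for frame in range(3):
    fb.render(frame)    # draw frame N into the back buffer
    fb.flip()           # present it
print(fb.front)         # -> 2 (last completed frame on screen)
```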


----------



## rruff (Jan 17, 2015)

HumanSmoke said:


> You can still increase image quality settings that aren't directly texture based.



What settings would those be?



> In SLI (and CrossfireX) the cards work in parallel with the same resources mapped into their individual on-board graphics memory - that is to say the vRAM is mirrored across each card.



My understanding of SLI was that the cards sort of took turns rendering frames. Yes, no? 

And the main thing I've been wondering about is all the buzz about how new games need 4GB+ vram, if not this year then the next. I can fully believe that a top card needs that kind of vram because it is capable of processing information _fast_ enough to be limited with less. I've not seen anything yet that convinces me that things have changed. Vram requirements seem to scale with processing speed about the same as several years ago. Are newer higher-res textures something that will impact vram quantity more than processing speed? Even if so, would turning down the texture detail enough to solve the problem make the game look ugly?


----------



## mxp02 (Jan 17, 2015)

rtwjunkie said:


> That should be plenty in the 1080p segment this card is meant for.


New games in 2015 will use 2GB or more of vRAM @720p. There were already several games using more than 2GB of vRAM @1080p last year.


----------



## HumanSmoke (Jan 17, 2015)

rruff said:


> What settings would those be?


Well, for starters, any compute-shader post-process - effects applied after the scene has been rendered. Common examples would be depth of field and motion blur.


rruff said:


> My understanding of SLI was that the cards sort of took turns rendering frames. Yes, no?


Yes. Each card holds its frame in its vRAM's front buffer and flips the contents to the primary card - the one that is connected to the video display.


rruff said:


> And the main thing I've been wondering about is all the buzz about how new games need 4GB+ vram. If not this year then the next. I can fully believe that a top card needs that kind of vram because it is capable of processing information _fast_ enough to be limited with less. I've not seen anything yet that convinces me that things have changed. Vram requirements seem to scale with processing speed about the same as several years ago. Are newer higher res textures something that will impact vram quantity more than processing speed? Even if so, would turning down the texture detail enough to solve the problem, make the game look ugly?


Voxel based Global Illumination
Path tracing
Larger texture packs as screen resolution increases
Improved physics particle models (fog, water, smoke, interactive/destructible environments) and a host of other graphical refinements. For further reading I'd suggest Googling upcoming/future rendering techniques. The yearly SIGGRAPH is a good place to start being as it is independent.

You'll always have the opportunity to lower game image quality. How good/bad it looks will depend on the game engine, and how far the options are dialled down (few PC gamers would willingly choose static lighting for instance).

At this point we're straying pretty far from the actual topic at hand, the GTX 960.


----------



## rruff (Jan 17, 2015)

But do those things increase vram requirements *more* than the card's computing requirements? Will we need larger amounts of vram even on slow cards? That's what many people seem to believe, but I don't know if there is any truth to it. 



> At this point we're straying pretty far from the actual topic at hand, the GTX 960.



Not necessarily... because the 960 is the fastest new card to come out that still uses 2GB of vram. I've been looking at some of the reports where the specs are listed, and the comments are overwhelmingly of this variety: "This card is an immediate fail with 2GB, games need 4GB, Nvidia are idiots, shouldn't cost more than $100, it's already obsolete, I feel sorry for anyone dumb enough to buy it" etc. I tend to think that Nvidia knows what they are doing and the card will be balanced and perform well, but I seem to be in the minority.


----------



## HumanSmoke (Jan 17, 2015)

rruff said:


> But do those things increase vram requirements *more* than the cards computing requirements?


It's not a case of either/or. GPU processing and vRAM increases are linked. If a GPU has power but is hamstrung by a lack of framebuffer, what good is the GPU gain? Likewise, why push games that require more vRAM if the GPUs aren't able to run the settings that take advantage of it?
As it stands now, both AMD's and Nvidia's gaming programs push the software to a point where a game at its maximum quality/resolution levels is around two generations of GPUs removed from being playable on a single GPU. This is not by accident. Having the games outstrip the cards' ability ensures a market for SLI/Crossfire.


rruff said:


> Will we need larger amounts of vram even on slow cards?


Nothing extreme. Larger framebuffers are as much a marketing tool as a requirement. As I said before, there is always the option of dialling down image quality and/or playing at a lower resolution...and as should be apparent, not every game is a GPU-killing resource hog and the console market dictates to a large degree how much graphics horsepower is required.
As a trend, will memory capacity get larger? Of course, unless you expect the quality of gaming images and the game environment to remain unchanged. If you'd followed up any of the links I pointed you towards, the message is pretty clear - the resources to make gaming more realistic are available, but among the biggest stumbling blocks to implementation (aside from consolitis) are memory capacity and bandwidth. Read through any next-gen 3D article or paper and count how often the words memory/bandwidth limitation (or similar) pop up.


rruff said:


> Not necessarily... because the 960 is the fastest new card to come out that still uses 2GB of vram. I've been looking at some the reports where the specs are listed, and the comments are overwhelmingly of this variety "This card is an immediate fail with 2GB, games need 4GB, Nvidia are idiots, shouldn't cost more than $100, it's already obsolete, I feel sorry for anyone dumb enough to buy it" etc. I tend to think that Nvidia knows what they are doing and the card will be balanced and perform well, but I seem to be in the minority.


This is the internet - your choice as to what to take on board and what to leave aside. People who aim broadsides at a vendor usually have some weird attachment to another brand. The view from the other side of the fence isn't much different - "Don't buy Radeon, their drivers suck, their support sucks" etc. Filter the opinion, and question and evaluate the facts. The fun part is separating out the comments that are opinion (or trolling/shilling) masquerading as fact... but no one ever said the quest for knowledge was easy.


----------



## rtwjunkie (Jan 17, 2015)

mxp02 said:


> New games in 2015 will cost 2GB or more vram @720p.There were  several games cost more than 2GB vram @1080p already last year.


Read up, and read the other 960 thread. It's not looking like it. RCoon is doing an in depth test. Some of you guys are way too pessimistic. Despite what you think, 2GB cards are not really using all their RAM at 1080p. Not even close on most of them.

And welcome to TPU!


----------



## mxp02 (Jan 17, 2015)

rtwjunkie said:


> Read up, and read the other 960 thread. It's not looking like it. RCoon is doing an in depth test. Some of you guys are way too pessimistic. Despite what you think, 2GB cards are not really using all their RAM at 1080p. Not even close on most of them.
> 
> And welcome to TPU!



I'm pretty sure that if you want to run games that look better than 2014's at 1080p high (not highest) smoothly (avg 50+ fps, min 30 fps, maybe minor lag/stutter) this year, a card like a GTX 780 3GB is required. For highest settings at 1080p~1440p, very smooth (min 50 fps), a full-size Maxwell (maybe a 990 Ti) is required.


----------



## rtwjunkie (Jan 17, 2015)

mxp02 said:


> I'm pretty sure that if you want to run games that look better than 2014's at 1080p high (not highest) smoothly (avg 50+ fps, min 30 fps, maybe minor lag/stutter) this year, a card like a GTX 780 3GB is required. For highest settings at 1080p~1440p, very smooth (min 50 fps), a full-size Maxwell (maybe a 990 Ti) is required.


You are missing the whole point of this card. It's meant to be an affordable part of an affordable build. It's meant to be "pretty good" graphics-wise. There have always been very affordable models that will not play everything at max.

And that's ok! Those are the volume models that bring people into PC gaming, because that's all they can afford, while giving them a fairly decent experience. These are where the money is for both companies.


----------



## 64K (Jan 17, 2015)

rtwjunkie said:


> You are missing the whole point of this card. It's meant to be an affordable part of an affordable build. It's meant to be "pretty good" graphics-wise. There have always been very affordable models that will not play everything at max.
> 
> And that's ok! Those are the volume models that bring people into PC gaming, because that's all they can afford, while giving them a fairly decent experience. These are where the money is for both companies.



I'm jumping the gun a bit (need to wait for W1zzard's review), but I think this GTX 960, if priced at $200, will be a very nice GPU for 1080p gaming. No, it won't do ultra settings on every game, but I think it will deliver solid performance at that price point.


----------



## Tsukiyomi91 (Jan 17, 2015)

To put this simply, because there are some who just can't get the full picture: the GTX960 fills the gap where buying a $350 video card proves too much for the majority of PC gamers who don't have the money and want a decent card that runs all their favourite games comfortably at High with no AA at 1920x1080, without compromise. Its sub-$200 price tag IS VERY competitive, its performance is also very competitive for its class, and best of all, it doesn't kill your wallet. It's now the best bang-for-your-buck VGA card, just like the GTX760 when it came out back in June 2013. Remember, there are folks who want good performance at 1080p while keeping a decent level of eye candy without spending much.


----------



## rruff (Jan 18, 2015)

The price of $200 is looking more solid. For an Asus Strix also, which is not a cheap model: http://wccftech.com/nvidia-geforce-gtx-960-msrp-200/


----------



## Tsukiyomi91 (Jan 18, 2015)

The ASUS Strix variant is quite on the expensive side IMO. If you want a really affordable card, aim for EVGA, Leadtek or Gigabyte, as they offer their cards at much better prices, depending on availability.


----------



## mxp02 (Jan 19, 2015)

rtwjunkie said:


> You are missing the whole point of this card. It's meant to be an affordable part of an affordable build. It's meant to be "pretty good" graphics-wise. There have always been very affordable models that will not play everything at max.
> 
> And that's ok! Those are the volume models that bring people into PC gaming, because that's all they can afford, while giving them a fairly decent experience. These are where the money is for both companies.




No, not me. I didn't and won't expect everything maxed out on this card. 1080p gaming at 60 fps on high settings is what they said in the official PPT, which looks kind of overrated, as always. Even supposing the PPT is not exaggerated, read the tiny remarks below it carefully: those games were released in 2014, some even in 2013. Will this card be able to maintain 1080p/60fps at high settings in 2015? In my opinion, no way is that going to happen. What about 50 fps without severe lag/stutter? I seriously doubt it.


----------



## rtwjunkie (Jan 20, 2015)

mxp02 said:


> No, not me. I don't and won't expect everything maxed out on this card. 1080p gaming at 60 fps on high settings is what they claimed in the official presentation, which looks overrated as always. Even supposing the presentation isn't exaggerated, read the tiny remarks below carefully: those games were released in 2014, some even in 2013. Will this card be able to maintain 1080p at 60 fps on high settings in 2015? In my opinion, no way. What about 50 fps without severe lag/stutter? I seriously doubt that.



You're right, but it's not meant to be "future game proof."  It's meant to serve the VAST majority of people who game on average hardware and who expect to replace a mid-grade card every year. And those people are just fine with how they play, anywhere from 20 up to 60 fps. 30 is still pretty darned playable.


----------



## rruff (Jan 20, 2015)

rtwjunkie said:


> You're right, but it's not meant to be "future game proof."



Serious gamers must be a small percentage of the market. More are probably like me: they don't game all that much, and are fine with picking up games that are a few years old, almost free, and thoroughly patched and modded. If I were spending $500/yr on games, then sure, it would make sense to spend a similar amount on hardware, but I don't. I bet Nvidia will sell a lot more 960s than 980s, and a lot more 750s than 960s. With a 750 you can play most 2014 games well enough; in one 2014 title (Divinity: Original Sin) I have everything on Ultra at 1080p and get 30-60 fps.

The 960 is half a GTX 980 (Nvidia's top card) and 2x a GTX 750 (bottom of the "gaming" range), so it seems pretty "midrange" to me. If it gets cheap by next BF it will be a big upgrade, but currently I'm not lusting after anything that my 750 can't handle.
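That "half a 980, double a 750" framing lines up with the published CUDA core counts (2048 / 1024 / 512). A throwaway Python sanity check, with the obvious caveat that shader count is only a crude proxy for performance (clocks, ROPs, and memory bandwidth all differ):

```python
# Published CUDA core counts for the cards in question.
# Core count alone is a rough proxy; real performance scaling differs.
cores = {"GTX 980": 2048, "GTX 960": 1024, "GTX 750": 512}

ratio_vs_980 = cores["GTX 960"] / cores["GTX 980"]
ratio_vs_750 = cores["GTX 960"] / cores["GTX 750"]

print(f"960 vs 980: {ratio_vs_980}")  # 0.5  -> half a 980
print(f"960 vs 750: {ratio_vs_750}")  # 2.0  -> double a 750
```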


----------



## rtwjunkie (Jan 20, 2015)

rruff said:


> Serious gamers must be a small percentage of the market. More are probably like me: they don't game all that much, and are fine with picking up games that are a few years old, almost free, and thoroughly patched and modded. If I were spending $500/yr on games, then sure, it would make sense to spend a similar amount on hardware, but I don't. I bet Nvidia will sell a lot more 960s than 980s, and a lot more 750s than 960s. With a 750 you can play most 2014 games well enough; in one 2014 title (Divinity: Original Sin) I have everything on Ultra at 1080p and get 30-60 fps.
> 
> The 960 is half a GTX 980 (Nvidia's top card) and 2x a GTX 750 (bottom of the "gaming" range), so it seems pretty "midrange" to me. If it gets cheap by next BF it will be a big upgrade, but currently I'm not lusting after anything that my 750 can't handle.


You are quite correct! Serious gamers, or those that buy top-end hardware are only a small percentage of the market. Most are like you and quite content with average, but still worthwhile performance.

You are exactly the kind of person Nvidia is marketing the 960 to. Cheers!


----------



## Octavean (Jan 22, 2015)

jabbadap said:


> Any word about video decoding options? Tegra X1 has full H.265/VP9 decoding. Really hope this has it too.



Reviews I have been skimming seem to indicate HEVC / H.265 encode and decode support for the GTX 960.


----------



## Xzibit (Jan 22, 2015)

Octavean said:


> Reviews I have been skimming seem to indicate HEVC / H.265 encode and decode support for the GTX 960.



Yes, limited to 8-bit or lower.


----------



## Nabarun (Jan 22, 2015)

Apparently there's going to be a 4GB variant of this card (from JJ's overview). I wonder if that will make any difference in the 1080p segment. Anyway, please stop calling it a "$200 card," because it's not. Not here in India, at least. Here all the 960s are priced around USD 300. Price/performance-wise, that's where I see the R9s winning. And with the cost of electricity being equally high in this hell hole, we're f*cked both ways.


----------



## HumanSmoke (Jan 22, 2015)

Nabarun said:


> Anyway, please stop calling it a "$200 card", because it's not. Not here in India at least. Here all the 960s are priced around USD 300


Well, you call it a $300 card, and everyone else who lives in a country where it is $200 can call it a $200 card. Would that make you feel better?

I could say exactly the same thing about the R9 285 ($NZ415, or $US311 with free delivery! w00t), but I am well aware that my local pricing doesn't reflect that of the majority of markets.


----------



## rruff (Jan 22, 2015)

Nabarun said:


> Anyway, please stop calling it a "$200 card", because it's not.



You... well maybe not you, but those of us in the US can choose from several at Newegg, ~$180 shipped on day one. So I call it a $180 card, and that will likely decline further very soon.


----------



## Nabarun (Jan 22, 2015)

Well, I rest my case. It's priced <= USD 200 in the US ONLY - not anywhere else in the world. So stop putting a false price tag on it.


----------



## HumanSmoke (Jan 22, 2015)

Nabarun said:


> Well, I rest my case. It's priced <= USD 200 in the US ONLY - not anywhere else in the world. So stop putting a false price tag on it.


In a lot of cases that comes down to the import tax structure of the countries concerned and how the AIBs allot initial shipments to the various geographic distribution areas. Even with 19% tax, these German/Austrian sellers have the cards at nowhere near US$300, and the R9 285 isn't any cheaper.
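The tax arithmetic backs this up. A back-of-envelope Python sketch, using a $199 MSRP and illustrative rates (19% matching the German VAT mentioned above, ~30% as a rough upper bound for a high-tax market):

```python
# How far does tax alone push a $199 MSRP? (Rates are illustrative.)
msrp = 199.0

price_19pct = round(msrp * 1.19, 2)  # e.g. German VAT
price_30pct = round(msrp * 1.30, 2)  # roughly the worst-case import tax

print(f"19% tax: ${price_19pct}")  # well short of $300
print(f"30% tax: ${price_30pct}")  # still short of $300
```

Even the 30% case lands around $259, so a ~$300 street price implies markup beyond tax alone.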


----------



## Nabarun (Jan 22, 2015)

Well, I appreciate the time you put into market research. The tax rates here are ABNORMALLY high, but even at the extreme they're only around 30%. And if you are really interested in the current prices, look at my earlier post, ... or THIS.

*"Import tax structure"*

And, just to remind you, read btarunr's last two lines in his 960 SLI review.


----------

