Thursday, January 15th 2015

NVIDIA GeForce GTX 960 Specs Confirmed

Here's what NVIDIA's upcoming performance-segment GPU, the GeForce GTX 960, could look like under the hood. Key slides from its press deck were leaked to the web, revealing its specs. To begin with, the card is based on NVIDIA's 28 nm GM206 silicon. It packs 1,024 CUDA cores based on the "Maxwell" architecture, 64 TMUs, and possibly 32 ROPs, despite its 128-bit wide GDDR5 memory interface, which holds 2 GB of memory. The bus may seem narrow, but NVIDIA is using a lossless texture-compression technology that effectively improves bandwidth utilization.
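
NVIDIA hasn't detailed the algorithm, but the general idea behind lossless delta (color) compression is easy to sketch: store one reference value per block, then only the small differences between neighboring pixels. A toy Python illustration of that idea follows; it is not NVIDIA's actual block format.

    # Toy sketch of delta (difference) compression, the general idea behind
    # lossless color compression. NVIDIA's real algorithm and block format
    # are not public; this is illustrative only.

    def delta_compress(pixels):
        # Keep the first value, then store per-pixel differences. Neighboring
        # pixels are often similar, so the deltas are small numbers that fit
        # in far fewer bits than full color values.
        base = pixels[0]
        deltas = [b - a for a, b in zip(pixels, pixels[1:])]
        return base, deltas

    def delta_decompress(base, deltas):
        # Reverse the transform exactly -- the scheme is lossless.
        out = [base]
        for d in deltas:
            out.append(out[-1] + d)
        return out

    row = [200, 201, 201, 203, 202, 202, 204, 205]  # a smooth gradient
    base, deltas = delta_compress(row)
    print(deltas)                                   # [1, 0, 2, -1, 0, 2, 1]
    assert delta_decompress(base, deltas) == row    # round-trips exactly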

The core is clocked at 1127 MHz, with 1178 MHz GPU Boost, and the memory at 7.00 GHz (112 GB/s real bandwidth). Counting its texture-compression mojo, NVIDIA is quoting an "effective bandwidth" figure of 9.3 GHz. The card draws power from a single 6-pin PCIe power connector; the chip's TDP is rated at just 120 W. Display outputs include two dual-link DVI, and one each of HDMI 2.0 and DisplayPort 1.2. In its slides, NVIDIA claims that the card will be an "overclocker's dream" in its segment, and will offer close to double the performance of the GTX 660. NVIDIA will launch the GTX 960 on the 22nd of January, 2015.
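
The headline figures are plain arithmetic over the quoted specs; a quick Python check (note the "effective" numbers rest on NVIDIA's claimed average compression gain, not on independent measurement):

    # Bandwidth math from the specs quoted above.
    bus_width_bits  = 128
    mem_clock_ghz   = 7.0        # GDDR5 effective data rate
    effective_clock = 9.3        # NVIDIA's compression-adjusted figure

    real_bw      = bus_width_bits / 8 * mem_clock_ghz    # 112.0 GB/s
    effective_bw = bus_width_bits / 8 * effective_clock  # 148.8 GB/s
    gain         = effective_clock / mem_clock_ghz - 1   # ~33% claimed gain

    print(f"Real:      {real_bw:.1f} GB/s")
    print(f"Effective: {effective_bw:.1f} GB/s ({gain:.0%} claimed gain)")
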
Source: VideoCardz

119 Comments on NVIDIA GeForce GTX 960 Specs Confirmed

#51
Petey Plane
DarkOCean: only 2 GB of VRAM for a 2015 card... LOL
You do realize this is a 1080p card, right? 2 GB is more than enough for that resolution.
#52
MxPhenom 216
ASIC Engineer
Petey Plane: You do realize this is a 1080p card, right? 2 GB is more than enough for that resolution.
Not only that, but it's a mid-range GPU.
#53
Petey Plane
MxPhenom 216: Not only that, but it's a mid-range GPU.
Yeah, people complaining that a sub-$200 card won't hit 60 fps in Far Cry 4 on ultra at 4K should maybe find another hobby, because they obviously don't get this one.
#54
dj-electric
Personally, I'm already hitting the wall at 1440p with my 3 GB GPU, with Arma, BF4, and many other games going over 2750 MB in use.

I would have liked to see mid-range cards with 3 GB or more. I guess a 4 GB GTX 960 is just a question of time for the SLI crowd.
#55
rruff
Dj-ElectriC: Personally, I'm already hitting the wall at 1440p with my 3 GB GPU, with Arma, BF4, and many other games going over 2750 MB in use.
VRAM used and VRAM *needed* are two different things.
#56
GhostRyder
Xzibit: If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.

For full H.265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card through DP 1.2+ or HDMI 1.4+.

Content -> Processing -> Panel
Might this be what you're referring to? I hadn't thought about it yet, but I hadn't seen full support for it until you mentioned it, and now I'm interested. Otherwise, let me know what you're referring to, as I'd like to give it a whirl.

The GTX 960 is a mid-range card, and as I said before, having 2 GB with the new compression system makes up for it and should be enough for anyone playing 1080p games. I wouldn't expect it to contain a vast feature set and an extreme amount of RAM at this price, though having at least 1 GB more VRAM would be better if you're looking to SLI; even then, I doubt it would make much of a difference.
#57
dj-electric
rruff: VRAM used and VRAM *needed* are two different things.
I choose to need more.
#58
Fluffmeister
Dj-ElectriC: I choose to need more.
Good for you; clearly the GTX 960 isn't for you.
#59
Casecutter
Live OR Die: WTF, they compare it to a GTX 660? Why not the GTX 760?
Yeah, the GTX 660 (GK106) was a dud on so many levels, so it's not working against any high bar. When they say it's a "great OC'er", that means all you'll see are custom specials (perhaps 2x 6-pins) for the usual increase. Is anyone holding out hope for reference cards? And when did folks start considering $200 the price point for "budget" gaming?

That said, I think it will find many homes only because AMD has nothing new on the horizon, and at best a lower price on the 285/280X won't stir people. AMD seems really late in thwarting the frenzy, and that's clearly their problem.
#60
rruff
Dj-ElectriC: I choose to need more.
What I mean is that it's gratuitous usage. It's used simply because you have it available, and it would run fine on the same settings if you had less... probably. Like when I open Firefox and Chrome together, they suck up most of my 8 GB of RAM... but they don't need it.
#61
Tonduluboy
In my country, one store listed the Gigabyte 960 just $30 cheaper than a reference 970. Feel bad for those who don't have the extra $30 to buy a 970 :)
#62
HumanSmoke
rruff: What I mean is that it's gratuitous usage. It's used simply because you have it available, and it would run fine on the same settings if you had less... probably. Like when I open Firefox and Chrome together, they suck up most of my 8 GB of RAM... but they don't need it.
Yup. There can be a big difference in memory allocation vs actual memory usage. There are also plenty of instances where memory allocation not only reserves all the vRAM (minus required buffers) but exceeds the capacity of the vRAM, since OpenGL seems quite happy to reserve system RAM as well as vRAM.
GhostRyder: Might this be what you're referring to? I hadn't thought about it yet, but I hadn't seen full support for it until you mentioned it, and now I'm interested. Otherwise, let me know what you're referring to, as I'd like to give it a whirl.
It's a third-party plug-in that can work but doesn't have full support (and no VP9 support). Incidentally, the AMD download states "This version supports the OpenCL devices like AMD HD 5000 and above discrete GPUs...", which also includes Nvidia cards (Kepler and Maxwell at least), but like most (if not all) OpenCL-based H.265 encoders at present, it is as slow as blood in a dead man's veins.
#63
Xzibit
GhostRyder: Might this be what you're referring to? I hadn't thought about it yet, but I hadn't seen full support for it until you mentioned it, and now I'm interested. Otherwise, let me know what you're referring to, as I'd like to give it a whirl.

The GTX 960 is a mid-range card, and as I said before, having 2 GB with the new compression system makes up for it and should be enough for anyone playing 1080p games. I wouldn't expect it to contain a vast feature set and an extreme amount of RAM at this price, though having at least 1 GB more VRAM would be better if you're looking to SLI; even then, I doubt it would make much of a difference.
I was going off the assumption that if you had a GeForce in the system, even an older one, you would be able to decode or encode through software or hybrid (provided your CPU is fast enough or your GPU is supported), but the output would be dumbed down to 8-bit even if the original content was 10-bit and you had a 10-bit panel.

There is also this: PCWorld - New Intel graphics driver adds 4K video support, Chrome video acceleration and more

The only one not doing 10-bit out is Nvidia GeForce.

EDIT:
*added GeForce before someone decides to try and troll like usual. :p
#64
xorbe
Opportunity missed, I think 170.6-bit sounds more impressive than lying about 9.3 GHz VRAM. :laugh: (You either get this joke, or you don't ...)
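
(For anyone who doesn't: the gag is to quote the compression gain as a wider bus instead of a faster clock. A quick sketch; 9.3 GHz gives about 170.1-bit, so the 170.6 figure presumably came from a slightly finer 9.33 GHz:)

    # The joke, worked out: apply NVIDIA's effective/real clock ratio to the
    # bus width instead of the memory clock.
    real_clock, bus_bits = 7.0, 128
    print(f"{bus_bits * 9.3 / real_clock:.1f}-bit")    # 170.1-bit
    print(f"{bus_bits * 9.33 / real_clock:.1f}-bit")   # 170.6-bit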
#65
GhostRyder
HumanSmoke: It's a third-party plug-in that can work but doesn't have full support (and no VP9 support). Incidentally, the AMD download states "This version supports the OpenCL devices like AMD HD 5000 and above discrete GPUs...", which also includes Nvidia cards (Kepler and Maxwell at least), but like most (if not all) OpenCL-based H.265 encoders at present, it is as slow as blood in a dead man's veins.
Was I speaking to you?
Xzibit: I was going off the assumption that if you had a GeForce in the system, even an older one, you would be able to decode or encode through software or hybrid (provided your CPU is fast enough or your GPU is supported), but the output would be dumbed down to 8-bit even if the original content was 10-bit and you had a 10-bit panel.

There is also this: PCWorld - New Intel graphics driver adds 4K video support, Chrome video acceleration and more

The only one not doing 10-bit out is Nvidia GeForce.

EDIT:
*added GeForce before someone decides to try and troll like usual. :p
Oh, I see what you're saying now; sorry, misinterpretation on my part. I had actually forgotten about that, to be honest, as it's not something I have to think about on a daily basis.
Tonduluboy: In my country, one store listed the Gigabyte 960 just $30 cheaper than a reference 970. Feel bad for those who don't have the extra $30 to buy a 970 :)
Well, I hope it's at least a little cheaper than that; otherwise I think the obvious choice would be a GTX 970, lol.
#66
Rowsol
DarkOCean: only 2 GB of VRAM for a 2015 card... LOL
At 1080p there's no need for more.

This card is going to be a killer. If the price/performance is better than the 970's, it will be insane.
#67
ptmmac
I am wondering whether the switch to PCIe 3.0 is part of why required bus widths have dropped so much in video cards. Is 128-bit on PCIe 3.0 equal to 256-bit, or is it just the compression they are running? Compression is not necessarily a bad thing, especially if it greatly reduces the power and expense of running an NVIDIA card. The other question is whether there will be SLI support on this card. Good overclocking, low power requirements, and good SLI support would make this a low-cost, upgradeable path for the next 3 years.
#68
Winston_008
Casecutter: Yeah, the GTX 660 (GK106) was a dud on so many levels, so it's not working against any high bar. When they say it's a "great OC'er", that means all you'll see are custom specials (perhaps 2x 6-pins) for the usual increase. Is anyone holding out hope for reference cards? And when did folks start considering $200 the price point for "budget" gaming?

That said, I think it will find many homes only because AMD has nothing new on the horizon, and at best a lower price on the 285/280X won't stir people. AMD seems really late in thwarting the frenzy, and that's clearly their problem.
How was the GTX 660 a dud on so many levels?
#69
HumanSmoke
ptmmac: I am wondering whether the switch to PCIe 3.0 is part of why required bus widths have dropped so much in video cards. Is 128-bit on PCIe 3.0 equal to 256-bit, or is it just the compression they are running?
It's the latter - the delta (colour) compression. PCI-E bandwidth for single cards is for communication between the graphics card and CPU computation/system memory. Data movement depends upon the app/game's CPU requirement, but the PCI-E lanes wouldn't become saturated before CPU coding stalls, or writing to/retrieving from system memory, become the limiting factor. The internal bus width (GPU <-> vRAM) is the more important factor. Colour compression, like any other form of data compression, allows for faster data transfer.
As for bus width drops, that isn't necessarily the case. Third/fourth-tier GPUs have historically been 128-bit for some time (AMD's Juniper HD 57x0/67x0, Bonaire and Cape Verde HD 77xx/R7 260), while Nvidia often compromised with 192-bit to offset slower GDDR3/GDDR5 frequencies before they got their memory controller act together.
As the low-end discrete graphics market basically evaporates, it also puts more pressure on the next tier up the product stack to remain cost effective, so die size becomes a significant factor, as does getting a good return on investment - which is why both AMD's and Nvidia's product stacks look less than easy to categorize. Nvidia's present range spans three architectures (Fermi, Kepler, Maxwell), and AMD's five.
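
Back-of-envelope numbers for the two buses being conflated (a rough sketch; real transfer rates vary with protocol and driver overhead):

    # Rough comparison of the two buses: PCIe 3.0 x16 (card <-> system) vs.
    # the card's own 128-bit GDDR5 interface (GPU <-> vRAM).
    # PCIe 3.0 is 8 GT/s per lane with 128b/130b encoding, ~0.985 GB/s/lane.
    pcie3_x16_gbs = 16 * 8 * (128 / 130) / 8   # ~15.8 GB/s
    vram_bus_gbs  = 128 / 8 * 7.0              # 112 GB/s

    print(f"PCIe 3.0 x16:          {pcie3_x16_gbs:.1f} GB/s")
    print(f"128-bit GDDR5 @ 7 GHz: {vram_bus_gbs:.0f} GB/s")
    print(f"vRAM bus is ~{vram_bus_gbs / pcie3_x16_gbs:.0f}x the PCIe link")
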
ptmmac: The other question is whether there will be SLI support on this card.
Yes. The SLI finger can be clearly seen on this MSI GTX 960.
GhostRyder: Was I speaking to you?
Well, if you were asking for information from just a single individual, why post on a public forum rather than PM the person concerned? Sorry I provided the information instead of your BFF - no need to go all...
#70
darkangel0504
MxPhenom 216: That game is also terrible, so who cares.
Game of the year is terrible???
#71
silapakorn
It hits the shelves today in my country, at $300 a piece.
I feel bad for those who can't afford a 970.
#72
xorbe
HumanSmoke: It's the latter - the delta (colour) compression.
They might even be keeping the textures compressed on the host side, resulting in faster host to gfx card transfers. (Probably are.)
#73
HumanSmoke
xorbe: They might even be keeping the textures compressed on the host side, resulting in faster host to gfx card transfers. (Probably are.)
Yeah, I think it works both with writing/retrieving from system RAM, and also from client vRAM to the texture address units of the GPU.
silapakorn: It hits the shelves today in my country, at $300 a piece.
I feel bad for those who can't afford a 970.
Ouch! Sounds like some serious pre-release price gouging (unless all other cards are carrying the same kind of mark-up).
If they're on the shelves, how about some quick phone pictures for us?
#74
GhostRyder
HumanSmoke: Well, if you were asking for information from just a single individual, why post on a public forum rather than PM the person concerned? Sorry I provided the information instead of your BFF - no need to go all...
Don't care; stop obsessing and throwing a tantrum.
Rowsol: At 1080p there's no need for more.

This card is going to be a killer. If the price/performance is better than the 970's, it will be insane.
I agree; mostly it's the color compression that makes 2 GB enough for a card like this, which is going to be a sweet 1080p card. I don't doubt there will be 4 GB variants for those who want to play it safe or go for SLI and 1440p on a budget (so long as the price stays in line with predictions), but one of these is what I look forward to seeing in action.
#75
john_
Color compression is only used for transfers to and from memory; it doesn't affect the memory's capacity, or am I missing something?