Thursday, January 15th 2015

NVIDIA GeForce GTX 960 Specs Confirmed

Here's what NVIDIA's upcoming performance-segment GPU, the GeForce GTX 960, could look like under the hood. Key slides from its press deck were leaked to the web, revealing its specs. To begin with, the card is based on NVIDIA's 28 nm GM206 silicon. It packs 1,024 CUDA cores based on the "Maxwell" architecture, 64 TMUs, and possibly 32 ROPs, despite its 128-bit wide GDDR5 memory interface, which holds 2 GB of memory. The bus may seem narrow, but NVIDIA is using a lossless texture-compression technology that effectively improves bandwidth utilization.

The core is clocked at 1127 MHz, with 1178 MHz GPU Boost, and the memory at 7.00 GHz (112 GB/s real bandwidth). Counting its texture-compression mojo, NVIDIA has begun quoting an "effective bandwidth" figure of 9.3 GHz. The card draws power from a single 6-pin PCIe power connector, and the chip's TDP is rated at just 120 W. Display outputs include two dual-link DVI, and one each of HDMI 2.0 and DisplayPort 1.2. In its slides, NVIDIA claims that the card will be an "overclocker's dream" in its segment, and will offer close to double the performance of the GTX 660. NVIDIA will launch the GTX 960 on the 22nd of January, 2015.
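The quoted numbers are internally consistent: GDDR5 bandwidth is the effective data rate times the bus width in bytes. A quick sanity check (our arithmetic, not NVIDIA's slides):

```python
# Sanity-check the leaked GTX 960 bandwidth figures.
# GDDR5 bandwidth (GB/s) = effective data rate (GT/s) x bus width in bytes.

def bandwidth_gbs(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a GDDR5 interface."""
    return effective_clock_ghz * bus_width_bits / 8

print(bandwidth_gbs(7.0, 128))  # 112.0 -> matches the quoted real bandwidth
print(bandwidth_gbs(9.3, 128))  # 148.8 -> implied by the 9.3 GHz "effective" figure
```

By that reckoning, the compression claim amounts to roughly a third more usable bandwidth out of the same 128-bit bus.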
Source: VideoCardz

119 Comments on NVIDIA GeForce GTX 960 Specs Confirmed

#26
Tsukiyomi91
RCoon: It's also liable to become the top card in the Chinese market for Colorful and similar manufacturers from that ilk.
It sure is. Bet vendors like them will make custom cards based on it with little price impact.
#27
Tsukiyomi91
@Cheeseball seems that most of us just have to wait & see how good this card will be once it's out.
#28
hat
Enthusiast
I wonder how it compares to the 660 Ti in terms of raw performance?

This would be a good card to swap in for the 660 Ti in this machine. I could then replace the 5870 in another machine with my 660 Ti.
#29
john_
Cheeseball: You guys think this would be a good replacement for my aging HD 7870 XT (Tahiti LE, 1536 cores) on my secondary PC?
Tahiti LE?
Nope, it will not. It will be better in some areas (lower power consumption and noise), but for $200 it will not be an upgrade that will really make a difference. Except if your favorite games use PhysX effects; in that case it could look like a serious upgrade to you.
#30
HisDivineOrder
RCoon: You'd be surprised. Here are my total figures for VRAM usage at 1440p. I like to think I review a spread of varied games of all styles. Some of those include Early Access titles which are optimised horribly.
Now include the chart with Watch_Dogs, Assassin's Creed Unity, Dragon Age Inquisition, or any other one that represents more of what people who play at 1440p (or higher) will have when going from PS4 or Xbox One exclusives to PC ports.

A bunch of indies and some 360-built ports--might as well include the Saints Row Gat out of Hell benchmarks too right?--don't really represent why having less than 2GB is unwise going forward.
#31
Chaitanya
64K: Looks like a very nice card for 1080p gaming. I hope the rumored $200 price point is true. It will be very successful if so.
If nVidia goes aggressive with pricing down to $180, that card will sell like hot cakes and will do what the 7850 did for AMD a few years back.
#32
RCoon
HisDivineOrder: Now include the chart with Watch_Dogs, Assassin's Creed Unity, Dragon Age Inquisition, or any other one that represents more of what people who play at 1440p (or higher) will have when going from PS4 or Xbox One exclusives to PC ports.

A bunch of indies and some 360-built ports--might as well include the Saints Row Gat out of Hell benchmarks too right?--don't really represent why having less than 2GB is unwise going forward.
I'll run an article later tonight or tomorrow with a couple of the AAA titles, detailing VRAM as well as memory bandwidth usage and PCIe bus usage on Maxwell. Hoping to get hold of a card without Maxwell compression with similar GB/s memory bandwidth figures (looks like the 770 is a match) to note differences not just in VRAM usage, but also memory bandwidth usage. But that depends on whether I can source a card for tests.
#33
darkangel0504
Dragon Age Inquisition takes 2400 MB of VRAM at 1440 x 900 :)
#34
Sanhime
What would this say about the mobile solutions? The 860M is already based on Maxwell; if there's a "960M" coming along, would it be any better than the 860M?
#35
ap4lifetn
That's strange: NVIDIA uses a cut-down GM204 with 1024 cores/128-bit for their mobile GTX 965M, but it seems that their GM206 (this GTX 960) has the same configuration?

They are stockpiling chips better than 1024/128-bit for their GTX 960 Ti.
#36
mroofie
rtwjunkie: So, double the 660 performance means Nvidia is saying hands-down this beats a 770, correct?
Is this stock or OC (the double-660 claim)?
If it's stock, that would mean it could reach GTX 970 levels of performance :0
#37
rtwjunkie
PC Gaming Enthusiast
mroofie: Is this stock or OC (the double-660 claim)?
If it's stock, that would mean it could reach GTX 970 levels of performance :0
Who knows? The whole thing is vague.
#38
CrAsHnBuRnXp
Really curious how this performs against a 780.
#39
MxPhenom 216
ASIC Engineer
darkangel0504: Dragon Age Inquisition takes 2400 MB of VRAM at 1440 x 900 :)
That game is also terrible, so who cares.
#40
Nabarun
All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course, FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.
#41
MxPhenom 216
ASIC Engineer
Nabarun: All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course, FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.
Just buy the best you can get right now. "future proof" really needs to stop "trending" when it comes to this stuff.
#42
Xzibit
jabbadap: Any word about video decoding options? Tegra X1 has full H265/VP9 decoding. Really hope this has it too.
If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.

For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card through DP 1.2+ or HDMI 1.4+.

Content -> Processing -> Panel
#43
64K
Nabarun: All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course, FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.
You obviously keep a card for a long time, so maybe a GTX 970 would be best for you, but I think it's a pretty safe bet that the GTX 960 is going to come in between a 760 and a 770. Which side it leans to is unknown; probably towards the 770. We'll know pretty soon, but a GTX 960 would be a heck of a nice upgrade for you from that GTS 250.
#44
Nabarun
64K: You obviously keep a card for a long time, so maybe a GTX 970 would be best for you, but I think it's a pretty safe bet that the GTX 960 is going to come in between a 760 and a 770. Which side it leans to is unknown; probably towards the 770. We'll know pretty soon, but a GTX 960 would be a heck of a nice upgrade for you from that GTS 250.
Yeah, I know, the 960 would definitely be a great upgrade *NOW*, but the card I want should be able to tackle, say, Crysis 4 at least minimally (>30 fps at lowest settings) at 1080p. Is that too naive to expect? I hope W1zzard includes FC4 and the last 2 COD titles in the review. We'll get a pretty good idea then. Not expecting too much though.
#45
MxPhenom 216
ASIC Engineer
Nabarun: Yeah, I know, the 960 would definitely be a great upgrade *NOW*, but the card I want should be able to tackle, say, Crysis 4 at least minimally (>30 fps at lowest settings) at 1080p. Is that too naive to expect? I hope W1zzard includes FC4 and the last 2 COD titles in the review. We'll get a pretty good idea then. Not expecting too much though.
Wait for Wizz's review before making a decision.
#46
HumanSmoke
Xzibit: If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.
For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card
Say what???? That's news, considering even AMD's latest Tonga offering doesn't have H.265 decode support.

#47
rruff
RCoon: I'll run an article later tonight or tomorrow with a couple of the AAA titles, detailing VRAM as well as memory bandwidth usage and PCIe bus usage on Maxwell. Hoping to get hold of a card without Maxwell compression with similar GB/s memory bandwidth figures (looks like the 770 is a match) to note differences not just in VRAM usage, but also memory bandwidth usage. But that depends on whether I can source a card for tests.
I'd be very interested in the results! In the past people have compared the 770 2GB and 770 4GB and not found any benefit to the increased VRAM, except in SLI, and barely then. VRAM requirements are a very hot topic right now. Many are *claiming* that lack of VRAM is hurting performance in new games, but they always have cards that are slow as well as low on VRAM, and assume VRAM is the culprit when it probably isn't.

Would be great to have a testbed set up with two fast cards identical except for double the VRAM on one. The 770s are pretty ideal, or maybe the 960s once the 4GB version comes out.
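For anyone wanting to reproduce the kind of VRAM logging RCoon describes, per-GPU memory usage can be polled through nvidia-smi's CSV query mode. A minimal sketch, assuming an NVIDIA card with the driver's nvidia-smi tool on the PATH (the "1523 MiB" sample line is illustrative, not a measurement):

```python
import subprocess

def parse_vram_mib(csv_line: str) -> int:
    """Turn a memory.used CSV value like '1523 MiB' into an integer MiB count."""
    return int(csv_line.strip().split()[0])

def query_vram_mib() -> list:
    """Return memory.used for each installed GPU, via nvidia-smi's CSV output."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_vram_mib(line) for line in out.splitlines() if line.strip()]

print(parse_vram_mib("1523 MiB"))  # prints 1523
```

Polling `query_vram_mib()` in a loop while a game runs gives a rough usage-over-time trace, though driver-reported "used" memory includes caching and is not a strict requirement figure.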
#48
jabbadap
Xzibit: If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.

For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card through DP 1.2+ or HDMI 2.0.

Content -> Processing -> Panel
I have to admit I have no idea what you mean. But I can guarantee that you can't hardware-decode H265/VP9 with the HD 6xxx; heck, Tonga (R9 285) was the first AMD GPU that could decode H.264 4K60p video.
#49
Xzibit
jabbadap: I have to admit I have no idea what you mean. But I can guarantee that you can't hardware-decode H265/VP9 with the HD 6xxx; heck, Tonga (R9 285) was the first AMD GPU that could decode H.264 4K60p video.
H265/VP9 are mainly for the 4K 10-bit 4:2:0+ standard. You do have options for lower or higher quality.

You could let the CPU do the workload if it's fast enough and still be crippled by your 8-bit GPU.

Content (true H265/VP9 4K 10-bit 4:2:0+) -> Processing (CPU/GPU): if your GPU is processing it at 8-bit out, you're already downgrading the quality before it gets to your panel.
#50
HumanSmoke
Xzibit: H265/VP9 are mainly for the 4K 10-bit 4:2:0+ standard. You do have options for lower or higher quality.
You could let the CPU do the workload and still be crippled by your 8-bit GPU.
Content (true H265/VP9 4K 10-bit 4:2:0+) -> Processing (CPU/GPU): if your GPU is processing it at 8-bit out, you're already downgrading the quality before it gets to your panel.
So, you're still sticking with your assertion then?
Xzibit: For full H265/VP9 you have to get Radeon HD 6xx0 or newer....
As for the GTX 960... not on my shopping list, but hopefully it causes some price realignments across both vendors' cards that benefit the consumer.
An interesting snippet in the source article: one million GTX 970/980s sold so far. An impressive number given their pricing and barely three months of sales.