Wednesday, December 10th 2014

NVIDIA to Launch GeForce GTX 960 in January

NVIDIA is reportedly preparing to launch its mid-range GeForce GTX 960 graphics card some time in January 2015, according to a SweClockers report. The card could be launched on the sidelines of the 2015 International CES. It will be based on the company's new GM206 silicon, not a cut-down GM204. The only specifications doing the rounds are a 128-bit memory bus width and a standard memory amount of 2 GB. Out of the box, the card could offer performance comparable to a GeForce GTX 770, with much lower power draw and a $200-ish price.
Source: SweClockers

62 Comments on NVIDIA to Launch GeForce GTX 960 in January

#26
Disparia
Cool. I have a GTX 660 and the wife is on an HD 6850, so the GTX 960 is potentially a decent upgrade for us.
Posted on Reply
#27
nunomoreira10
2 GB definitely is not enough.
I play with a GTX 750 Ti 2 GB and have to reduce textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another ~200 MB of memory, and it will only get worse with new games, so no, 2 GB is not enough.
Posted on Reply
#28
EarthDog
What does Firefox have to do with VRAM?
Posted on Reply
#29
64K
nunomoreira102 GB definitely is not enough.
I play with a GTX 750 Ti 2 GB and have to reduce textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another ~200 MB of memory, and it will only get worse with new games, so no, 2 GB is not enough.
Your GPU is fine for a lot of games at high settings but it's underpowered for the most demanding games. More VRAM won't help with your GPU.
Posted on Reply
#30
Petey Plane
nunomoreira102 GB definitely is not enough.
I play with a GTX 750 Ti 2 GB and have to reduce textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another ~200 MB of memory, and it will only get worse with new games, so no, 2 GB is not enough.
Yeah, that has nothing to do with video card RAM. My two-year-old GTX 670 with 2 GB can run all those games on high to ultra at 1080p, no problem. Also, freeing up system memory by closing apps has nothing to do with the VRAM on the video card.
EarthDogWhat does Firefox have to do with VRAM?
Nothing.
Edit: nothing, unless he/she is also trying to play a Unity-based browser game at the same time, or maybe running Firefox on a second monitor while gaming on the primary. Either way, system memory and VRAM are two different things.
Posted on Reply
#32
Blue-Knight
EarthDogWhat does firefox have to do wit vram?
I guess it utilizes the GPU to accelerate page rendering. Not sure.
Posted on Reply
#33
Petey Plane
Blue-KnightI guess it utilizes the GPU to accelerate page rendering. Not sure.
Major edit: OK, ignore the previous screenshots; I noticed GPU memory usage was set to "min." because I'm an idiot. Checked it again after 30 minutes of Firefox and typical browsing; max memory usage was 300 MB. So yeah, Windows combined with a browser does use some VRAM, but 2 GB still leaves plenty of overhead for almost every game at 1080p.

That being said, some games (Titanfall, The Witcher 2, etc.) have "insane" texture settings specifically for cards with 4 GB+ of VRAM, but Ultra still looks great in those games, so :rolleyes:
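For anyone who wants to repeat the check, here's a quick Python sketch (assuming an NVIDIA card with the driver's nvidia-smi tool on the PATH; it just dumps the tool's output rather than parsing it):

# Rough sketch: dump nvidia-smi output; the "Processes" table at the bottom lists
# per-process GPU memory use, so a GPU-accelerated browser shows up next to games.
import subprocess

result = subprocess.run(["nvidia-smi"], capture_output=True, text=True, check=True)
print(result.stdout)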
Posted on Reply
#34
Blue-Knight
Petey PlaneIf it does use GPU rendering, it can't be more than 100 MB
It depends on what kind of pages you are browsing; not sure.

Posted on Reply
#35
ZoneDymo
Blue-KnightBlame the game developers for that.
And I do, but that does not change the situation we are currently in.
Posted on Reply
#36
BorisDG
It will be a great budget card for sure. Enough for the average casual gamer. For the others, there are already cards to go for. :)
Posted on Reply
#37
Prima.Vera
nunomoreira102 GB definitely is not enough.
I play with a GTX 750 Ti 2 GB and have to reduce textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another ~200 MB of memory, and it will only get worse with new games, so no, 2 GB is not enough.
You get stuttering in those games not because of insufficient VRAM but because of the bus. A 128-bit bus is not wide enough to handle large textures, any AA and AF, etc. ;)
Posted on Reply
#38
mroofie
btarunrI remember telling someone in some thread not to hold their breath over this card.
I'm sorry, but there is no way these specs are real. With these specs it won't achieve GTX 760 or 770 performance.

This Rumor = my ass ;)
Posted on Reply
#39
The Von Matrices
hastalabsWTF 128 bit !!!!
Prima.VeraA 128-bit bus is not wide enough to handle large textures, any AA and AF, etc. ;)
Bus width is only one of many factors affecting memory bandwidth, which in turn is only one of many factors affecting overall performance.

You can't just say "it's 128 bit so it's not wide enough" or else you're no different than the people who say "my processor has more GHz so it's better!"

I feel like this thread has devolved into an extension of "Choose R9 290 Series for its 512-bit Memory Bus: AMD"
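To put rough numbers on that (a back-of-the-envelope sketch only; the 128-bit figure is just the rumor under discussion, not a confirmed spec): raw memory bandwidth is simply bus width times effective data rate, and how far that raw figure stretches then depends on things like Maxwell's delta color compression.

# Back-of-the-envelope raw memory bandwidth: bus width (bits) / 8 * data rate (GT/s).
def raw_bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    return bus_width_bits / 8 * data_rate_gt_s

print(raw_bandwidth_gb_s(512, 5.0))  # R9 290X: 512-bit @ 5 GT/s -> 320 GB/s
print(raw_bandwidth_gb_s(256, 7.0))  # GTX 770: 256-bit @ 7 GT/s -> 224 GB/s
print(raw_bandwidth_gb_s(128, 7.0))  # rumored GTX 960: 128-bit @ 7 GT/s -> 112 GB/s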
Posted on Reply
#40
Bunjomanjoman
Wow, this website is becoming the old WCCFtech. All rumors > full BS.
So what is your source, OP? You dreamed it last night and made it into news?

You guys should use your brains, and if you don't have a brain, such as the OP, then use LOGIC. 128-bit is for low-end cards. The GTX 960 will have a 256-bit bus just like the GTX 760, and 4 GB of RAM, while 128-bit will be left for GT cards or something like the GTX 750 Ti.

And at least I can provide proof. It's right there on Zauba. That's an official shipment listing stating 256-bit and 4 GB of RAM versus your stupid rumor. Which one shall we believe, hmmm?

Posted on Reply
#41
mroofie
BunjomanjomanWow, this website is becoming the old WCCFtech. All rumors > full BS.
So what is your source, OP? You dreamed it last night and made it into news?

You guys should use your brains, and if you don't have a brain, such as the OP, then use LOGIC. 128-bit is for low-end cards. The GTX 960 will have a 256-bit bus just like the GTX 760, and 4 GB of RAM, while 128-bit will be left for GT cards or something like the GTX 750 Ti.

And at least I can provide proof. It's right there on Zauba. That's an official shipment listing stating 256-bit and 4 GB of RAM versus your stupid rumor. Which one shall we believe, hmmm?

Finally, someone logical :)
Tell 'em, Goku :D
Methinks this article is clickbait in its purest form :0
Posted on Reply
#42
Prima.Vera
The Von MatricesBus width is only one of many factors affecting memory bandwidth, which in turn is only one of many factors affecting overall performance.

You can't just say "it's 128 bit so it's not wide enough" or else you're no different than the people who say "my processor has more GHz so it's better!"

I feel like this thread has devolved into an extension of "Choose R9 290 Series for its 512-bit Memory Bus: AMD"
Then what are those other factors affecting memory bandwidth?
Anyone can criticize; very few can provide explanations... ;)
Posted on Reply
#43
rtwjunkie
PC Gaming Enthusiast
Are those who act as if @btarunr just makes stuff up aware that he is the moderator who does the news, and has been for quite a while? Very little of what he posts is pure speculation. Most of what he posts comes to fruition pretty closely. He rarely posts mere rumors.

That being said, if Bunjomanjoman would like to spend some research time and get something more concrete from the notoriously secretive Nvidia, please do, and then share with us.

Maybe the original post will not come true and maybe it will be like the manifest listed above. That's still not a reason to blatantly attack.
Posted on Reply
#44
mroofie
rtwjunkieAre those who act as if @btarunr just makes stuff up aware that he is the moderator who does the news, and has been for quite a while? Very little of what he posts is pure speculation. Most of what he posts comes to fruition pretty closely. He rarely posts mere rumors.

That being said, if Bunjomanjoman would like to spend some research time and get something more concrete from the notoriously secretive Nvidia, please do, and then share with us.
So NVIDIA has, in this short span of time, increased Maxwell's performance and efficiency enough to replace a GTX 770 with a 128-bit bus and 2 GB of VRAM??

I will admit, if this rumor had said the bus was 192-bit I would have believed it, but 128? NEVER :) (That's half the performance.)

This rumor is complete BS.
Posted on Reply
#45
rtwjunkie
PC Gaming Enthusiast
mroofieSo NVIDIA has, in this short span of time, increased Maxwell's performance and efficiency enough to replace a GTX 770 with a 128-bit bus and 2 GB of VRAM??

I will admit, if this rumor had said the bus was 192-bit I would have believed it, but 128? NEVER :) (That's half the performance.)

This rumor is complete BS.
Possibly so, and a well-spoken rebuttal. My point is for intelligent discussion and counterpoint. I actually tend to agree speculatively with you on the bus width. My objection is that instead of logically providing a counter, Bunjomanjoman decided to use his first post on TPU to blatantly and rudely attack. There's a right way and a wrong way to argue.

As to the 960, again, I'm inclined to agree with you about a 192-bit bus, but since I have not been able to find anything more concrete myself, because NVIDIA is so secretive, I will treat btarunr's report as the absolute minimum for the specs, so it's not all bad.
Posted on Reply
#46
mroofie
rtwjunkiePossibly so, and a well-spoken rebuttal. My point is for intelligent discussion and counterpoint. I actually tend to agree speculatively with you on the bus width. My objection is that instead of logically providing a counter, Bunjomanjoman decided to use his first post on TPU to blatantly and rudely attack. There's a right way and a wrong way to argue.

As to the 960, again, I'm inclined to agree with you about a 192-bit bus, but since I have not been able to find anything more concrete myself, because NVIDIA is so secretive, I will treat btarunr's report as the absolute minimum for the specs, so it's not all bad.
We shall see next year :D
Hopefully NVIDIA doesn't cut down on the VRAM; me wants 4 GB of VRAM :D
Posted on Reply
#47
GhostRyder
The Von MatricesBus width is only one of many factors affecting memory bandwidth, which in turn is only one of many factors affecting overall performance.

You can't just say "it's 128 bit so it's not wide enough" or else you're no different than the people who say "my processor has more GHz so it's better!"

I feel like this thread has devolved into an extension of "Choose R9 290 Series for its 512-bit Memory Bus: AMD"
Yes, another big factor is the memory speed, which can be clocked higher to make up for a narrower bus. However, as we know, the limit right now is around 7 GHz effective, and even with some overclockers hitting up to 8 GHz on the memory, that will not fully make up for it. That being said, it does not matter much, because this card is clearly marketed at the low end and will probably be more about hitting 1080p on high to ultra settings.

Having a bigger bus to match the memory makes a card run a lot better at higher resolutions, which is why the R9 290/290X are very good at 4K, delivering at least somewhat playable performance. That article is mostly in reference to that: if you're choosing to buy one or two 290/290X or 970/980 cards (even 780/780 Ti), you are buying them for high resolutions and not 1080p. Buying SLI/CrossFire setups of these cards at this point is way overkill for anything below 1440p.

This card is not for those resolutions; it is targeted at the lower mid-range of gaming, handling 1080p. Even if you SLI them, you will probably only be able to do 1440p at reasonable settings. I actually think that, while it seems like it's on the low end, it's in the appropriate spot.
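Quick arithmetic on that memory-overclock point (a rough sketch; the 128-bit and 7-8 GT/s figures are the rumored and typical values being tossed around here, not confirmed specs):

# Rough sketch: does an 8 GT/s memory overclock make up for halving the bus to 128-bit?
oc_128   = 128 / 8 * 8.0  # 128 GB/s: 128-bit bus with memory overclocked to 8 GT/s
wide_256 = 256 / 8 * 7.0  # 224 GB/s: 256-bit bus at the common 7 GT/s
print(oc_128, wide_256, oc_128 / wide_256)  # ~0.57 -- the overclock does not close the gap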
Posted on Reply
#48
Casecutter
EarthDogThese cards are really not meant for anything over 1080 in the first place...
That's what I see: a card for mainstream 1080p. Sure, more than a 750 Ti, which offers "better than entry" 1080p, though nowhere near superlative 1080p. If priced under $200, I might see NVIDIA place it very close to the R9 285 and offer good $/perf. It's not going to deliver if you're thinking of moving to a 2560x upgrade with it. As others see it, the gap from this GTX 960 to the 970 will need a 960 Ti (192-bit) at some point.
IkarugaBased on the performance of the 750 Ti and the 970, a middle-ground 960 Ti with a 192-bit bus width and 3 GB of memory could easily be a "best buy" of its time.
I'd say that is spot-on; why cannibalize 970 sales? I don't see NVIDIA looking to offer any further cut-down GM204 (until, or if, at any time), only perhaps if AMD came out with FinFET first, and they have said in so many words they aren't...
Posted on Reply
#49
Easy Rhino
Linux Advocate
Today's games do not really benefit from resolutions higher than 1080p because they were never designed for resolutions higher than that. Sure, you can crank the resolution of a game up to 4K, but you are not improving the graphics in any way; you are just increasing the resolution. You need games to be developed with higher-resolution TEXTURES first before you can really enjoy the visual experience of 1440p or even 4K gaming.
Posted on Reply
#50
xorbe
Easy RhinoToday's games do not really benefit from resolutions higher than 1080p because they were never designed for resolutions higher than that. Sure, you can crank the resolution of a game up to 4K, but you are not improving the graphics in any way; you are just increasing the resolution. You need games to be developed with higher-resolution TEXTURES first before you can really enjoy the visual experience of 1440p or even 4K gaming.
I game at 1200p, but even I can understand that effective draw distance is enhanced by higher resolutions.
Posted on Reply