Friday, January 16th 2015

GeForce GTX 960 3DMark Numbers Emerge

Ahead of its January 22nd launch, members of Chinese PC community PCEVA leaked performance figures for NVIDIA's GeForce GTX 960. The card was installed in a test bed driven by a Core i7-4770K overclocked to 4.50 GHz. The card itself appears to be factory-overclocked, if these specs are to be believed. It scored P9960 and X3321 in the performance and extreme presets of 3DMark 11, respectively. In standard 3DMark FireStrike, the card scored 6636 points; with some manual overclocking thrown in, it managed 7509 points in the same test. FireStrike Extreme (1440p) was harsh on this card, which scored 3438 points, and FireStrike Ultra proved more than it could chew, managing only 1087 points. Looking at these numbers, the GTX 960 could be an interesting offering for Full HD (1920 x 1080) gaming, and not a pixel more.
Source: PCEVA Forums

98 Comments on GeForce GTX 960 3DMark Numbers Emerge

#76
john_
HumanSmokeStandard operating procedure for WCCF. If they publish enough prices they'll get it right eventually - and as per usual lead with the clickbait hysteria-inducing numbers first. I was kind of hoping that the price might stay secret up until launch as pricing realignment done in a panic always benefits the consumer if the incoming model is available in quantity from day one.
I can't argue with that comment about Wccftech. They are like a search engine that just reposts whatever someone posted on the internet as an important rumor. But in both articles they had a little more concrete evidence than the usual "someone who is unknown even to his own mother just posted this info from nowhere". Today's article does look to offer valid info.
Posted on Reply
#77
Pumper
GhostRyderThere is no conspiracy, its a fact that the mid range was sold as top end based on the chip designations and how things were done in the past.
Exactly, the key word here is "past". When exactly did nvidia state that they would never change the naming of their GPU chips? Never, is my guess.
Posted on Reply
#78
HumanSmoke
john_I can't argue with that comment about Wccftech. They are like a search engine that just reposts whatever someone posted on the internet as an important rumor. But in both articles they had a little more concrete evidence than the usual "someone who is unknown even to his own mother just posted this info from nowhere". Today's article does look to offer valid info.
I have little doubt that the $200 number is closer to the mark. But whereas other sites with access to the very same pre-launch price (gouging) hedge their reporting, WCCF tend to attach a certain certainty to their stories. Another example would be the die estimate of GM200, based off a low-res, slightly oblique snap. The guy leads off with
Before we begin, in all fairness, I should point out some things that could make this experiment inaccurate; lens distortion, inaccurate perspective correction and warping due to rolling shutter, just to name a few.
but still claims an accuracy of ±2.5%! That's an accuracy of around half a millimetre per side. Oddly enough, if the claims from generally more reliable sources pan out (570-580 mm²), he can actually claim he was out by just 0.51 mm per side.
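For anyone who wants to check the arithmetic, here's a quick sketch (the ~600 mm² figure for his original estimate is my own assumption for illustration; only the error margins were actually quoted):
```python
import math

# Assumed WCCF estimate for the GM200 die area (~600 mm^2 is an
# illustrative assumption), versus the 570-580 mm^2 range from more
# reliable sources.
wccf_area = 600.0      # mm^2, assumption
actual_area = 575.0    # mm^2, midpoint of 570-580

wccf_side = math.sqrt(wccf_area)      # side of a square die, ~24.5 mm
actual_side = math.sqrt(actual_area)  # ~24.0 mm

claimed_margin = 0.025 * wccf_side    # +/-2.5% of the side length
print(f"Claimed accuracy: +/-{claimed_margin:.2f} mm per side")          # ~0.61 mm
print(f"Error vs 575 mm^2: {wccf_side - actual_side:.2f} mm per side")   # ~0.52 mm
```
Which indeed works out to roughly half a millimetre either way.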
Posted on Reply
#79
bpgt64
It's an interesting offering; depending on where it lands price-point-wise, it will make for some interesting competition. I expect it to land at about $175-225, stock to aftermarket. I don't think anyone should worry about the bus speed. If you're looking at the GTX 960 for 4K, you're looking in the wrong place. I think a pair of 970s is the sweet spot for that, or a single 980 (HOF or otherwise). The question is whether a pair of 960s will hold up well at 1440p at 60+ to 120 fps....
Posted on Reply
#80
rruff
bpgt64I don't think anyone should worry about the bus speed.
I'm a little concerned, because of something I posted earlier. In shaders, TMUs, ROPs, and GB of VRAM, the 960 is 2x a 750. Only in bandwidth is there a mere 40% increase. The 960 is exactly half a 980 in all 5 metrics.

Based on TPU's performance summary charts, half a 980 would be right at 760-level performance. 2x a 750 would be right at 770 level... but the bandwidth is only 40% higher. It's possible that the 750 has more bandwidth than it needs, but I know that overclocking the RAM results in a significant improvement in fps. I don't understand how video cards work that well, but unless Nvidia has done additional optimizations on the GM206, it seems like it would be close to a 760... which would be lame for $200. I also believe it will consume a lot less than 120W at base clocks... more like 90W, based on the 980 and 750.

I think the 2GB of RAM will make it a poor SLI choice, but if you can get a 4GB model, a pair should land between a 970 and a 980.
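To put the ratios in one place, a quick sketch (the GTX 960 figures are the rumored specs, not confirmed; the 750 and 980 numbers are their reference designs):
```python
# Reference specs for the GTX 750 and GTX 980; the GTX 960 figures are
# the rumored/leaked specs, not confirmed by NVIDIA.
specs = {
    #           shaders  TMUs  ROPs  VRAM(GB)  bandwidth(GB/s)
    "GTX 750": (512,     32,   16,   1,        80),
    "GTX 960": (1024,    64,   32,   2,        112),
    "GTX 980": (2048,    128,  64,   4,        224),
}

labels = ("shaders", "TMUs", "ROPs", "VRAM", "bandwidth")
for label, v750, v960, v980 in zip(labels, *specs.values()):
    print(f"{label:9}: 960 = {v960 / v750:.2f}x a 750, "
          f"{v960 / v980:.2f}x a 980")
```
Every metric comes out 2.00x a 750 and 0.50x a 980, except bandwidth at 1.40x a 750.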
Posted on Reply
#81
bpgt64
Imo they did some magical shit with the 9 series; unlike the R9 290X or 295X2, my room doesn't become a sauna when gaming at 4K. Let's see some benchmarks first, then complain about resource allocation or applicability to higher resolutions. My guess is this card is going to be a beast with a good cooler.
Posted on Reply
#82
Sony Xperia S
bpgt64unlike the R9 290X or 295X2, my room doesn't become a sauna when gaming at 4K.
Your room is a sauna regardless of whether you introduce several hundred more watts to it. :)

You cannot warm a room up with 300 W or even 600 W of heating. :laugh: :D
Posted on Reply
#83
GhostRyder
PumperExactly, the key word here is "past". When exactly did nvidia state that they would never change the naming of their GPU chips? Never, is my guess.
The fact is you're not getting it, and most people here understand how it works... They didn't just decide to put a name tag on a chip and call it a day; there is a distinct reason for that name in how NVidia does its GPU hierarchy. Believe what you want, but the facts don't change... Part of it could be that they thought the GTX 680 and 670 had "enough" power for the time, though.
bpgt64Imo they did some magical shit with the 9 series; unlike the R9 290X or 295X2, my room doesn't become a sauna when gaming at 4K. Let's see some benchmarks first, then complain about resource allocation or applicability to higher resolutions. My guess is this card is going to be a beast with a good cooler.
My room is not a sauna lol, and I have 3 of them.
bpgt64It's an interesting offering; depending on where it lands price-point-wise, it will make for some interesting competition. I expect it to land at about $175-225, stock to aftermarket. I don't think anyone should worry about the bus speed. If you're looking at the GTX 960 for 4K, you're looking in the wrong place. I think a pair of 970s is the sweet spot for that, or a single 980 (HOF or otherwise). The question is whether a pair of 960s will hold up well at 1440p at 60+ to 120 fps....
I do not think this card is going to hold up well at 1440p, especially if it does fall between a GTX 760 and GTX 770 in performance. A pair might do a decent job, but so would a single 970, which would probably end up being a better buy for most to avoid dealing with SLI, though I am just speculating based on where the performance looks like it will fall.
john_Fortunately it seems that NVidia will not shoot itself in the foot with the price. Or at least it's not going to shoot both feet.
This latest rumor does look confirmed, as the title says.
Nvidia Geforce GTX 960 Final Pricing Update: MSRP More or Less Confirmed at $200 Retail
That is more like what it should be, and seems to be where it was headed. I didn't think it would be $250+, because that would make fitting the 960 into the market a little hard given the performance drop, not to mention make it hard, without price changes down the line, to fit in a 960 Ti (if they make one) without feeling like they played early adopters. Though buying right out of the gate is normally something to avoid anyways...
XzibitIf Firestrike scores are the only measure, heck, that 1552 MHz run should be enough for a 960 to replace 780s.
The AIBs' 750 Ti OCs also had scores similar to reference 660s, but failed to even keep up with the 650 Ti Boost. Out of the 5 W1zzard reviewed, only 1 managed to outperform the 650 Ti Boost, and it needed a base OC of 182 MHz. The 650 Ti Boost was cheaper too, at $130, compared to the 750 Ti, which ranged from $150 (reference) to $200.
EDIT:
Didn't even mention the prices of the other superior-performing products that were in that price window at the time.

Nvidia
660 = $190

AMD
265 = $150
7870 = $190
270X = $200

*The only 750 Ti OC to beat a 650 Ti Boost
Prices out of the gate tend to be one of those hit-or-miss things, with better or equal offerings available for cheaper that will perform better, even from the same company. Most of the time they are just going on the "higher number = better" routine, where most consumers do not pay much attention to extreme detail and just buy something based on the number associated with it, thinking that it must be better. That is why prices can be set that way: they will sell as long as there are people only looking at names. Guess, as they say, you should not judge a book by its cover.
Posted on Reply
#84
HumanSmoke
Sony Xperia SYou cannot warm a room up with 300 W or even 600 W of heating. :laugh: :D
"Random troll disproves the Laws of Thermodynamics" - said no one ever.

Meanwhile, a site owner's personal experience with 290X Crossfire:
These 290X cards are HOT! If your computer is in a small room that sees ambient temperatures above 80F, you will not want a pair of these cards. I think you could live with one 290X but the heat that comes off these cards is insane. Luckily when I started testing these the temperatures were still warm here in Texas, so getting my office up to an ambient temperature of 78F was easy to do. A pair of these 290X in CrossFire can easily warm the room you are in up a few degrees. Under full load in Uber Mode, the exhaust temperature of these cards is over 150F. Yes, you can burn yourself on the exhaust ports of the cards should you be so inclined.
A 300 watt delta between 290X CrossFire and 980 SLI is a huge number. It is easily recognizable when sitting next to the system. After a few hours of gaming with 290X CrossFire, you certainly had that sweaty gamer feeling about you.
Posted on Reply
#85
Sony Xperia S
HumanSmoke"Random troll disproves the Laws of Thermodynamics" - said no one ever.
Ooo, we begin with the stupid insults! :D

If you are so inclined, I invite you to my home; we will take a random room, cool it to an ambient temperature of 17-18 degrees Celsius, and I will give you permission to use my rig with two R9 290Xs.

If you succeed in warming the room, I will admit I was wrong.

Until then, I will laugh at you, and not even take into consideration those stupid comparisons in hot climates.

In a hot climate, even smoking a cigarette feels unpleasant. :rolleyes:
Posted on Reply
#86
bpgt64
I speak of the heat difference because my wife uses an R9 295X2, and I use a pair of 980s, both driving 4K panels. I can tell when she is playing a video game. We leave a window open in winter to counterbalance the heat coming off both our rigs. Like HumanSmoke said, the TDP difference is very big, and very noticeable. It's not meant to be an insult; it's just a fact.
Posted on Reply
#87
Fluffmeister
Sony Xperia SOoo, we begin with the stupid insults! :D

If you are so inclined, I invite you to my home; we will take a random room, cool it to an ambient temperature of 17-18 degrees Celsius, and I will give you permission to use my rig with two R9 290Xs.

If you succeed in warming the room, I will admit I was wrong.

Until then, I will laugh at you, and not even take into consideration those stupid comparisons in hot climates.

In a hot climate, even smoking a cigarette feels unpleasant. :rolleyes:
I'm up for popping over, I'll bring a towel to wear because I know you're in denial.
Posted on Reply
#88
HumanSmoke
Sony Xperia SIf you are so inclined, I invite you to my home; we will take a random room, cool it to an ambient temperature of 17-18 degrees Celsius, and I will give you permission to use my rig with two R9 290Xs.
So, you need to pre-cool the room before using the 290X. This was never mentioned in the card's specifications fine print: "DANGER: USE OF THIS CARD IN AMBIENT TEMPERATURES EXCEEDING 18°C CAN LEAD TO EXCESS SWEATING AND DEHYDRATION". Why would you need to do this? Won't your mother be pissed off with you fooling with the thermostat?

In a closed system the addition of heat energy will elevate the temperature of the space it occupies - this is (very) basic physics.
FluffmeisterI'm up for popping over, I'll bring a towel to wear because I know you're in denial.
You're assuming that the offer doesn't include a personal air conditioning fan?
Posted on Reply
#89
Xzibit
bpgt64Imo they did some magical shit with the 9 series; unlike the R9 290X or 295X2, my room doesn't become a sauna when gaming at 4K. Let's see some benchmarks first, then complain about resource allocation or applicability to higher resolutions. My guess is this card is going to be a beast with a good cooler.
Here is how the magic happens:
Toms HardwareGaming Power Consumption
These findings further illustrate what we said on the previous page about Maxwell and its ability to regulate GPU voltage faster and more precisely. The gaming-based power consumption numbers show just how much efficiency can be increased if the graphics card matches how much power is drawn to the actual load needed. The version of the GeForce GTX 980 that comes overclocked straight from the factory manages to use significantly less power than the reference version, while offering six percent more performance at the same time.

Stress Test Power Consumption
If the load is held constant, then the lower power consumption measurements vanish immediately. There’s nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980’s power consumption to an overclocked GeForce GTX Titan Black, there really aren’t any differences between them. This is further evidence supporting our assertion that the new graphics card’s increased efficiency is largely attributable to better load adjustment and matching.
You get better efficiency with non-reference cards during gaming, but it's still higher than the advertised TDP. Stress testing tells a different story, with non-reference 970 & 980 cards sucking up 242W & 280W.
Posted on Reply
#90
mxp02
FluffmeisterI'm up for popping over, I'll bring a towel to wear because I know you're in denial.
You're right! 2x 290X will definitely make a room a sauna. 'Cause I've got 970 tri-SLI, which consumes an equal amount of power to 2x 290X, and my room is hot like hell. Under these circumstances wearing any clothes would be suicidal, so I just have a towel around my neck. Those 290X CF owners are terrible liars.
Posted on Reply
#91
overpass
Hmm, if it is around $180, I'd say it is worth it! :) Pretty sure the stock of 760 and 770 cards is running rather low? It will be closer to the 770 than the 760 due to the optimizations that nVidia will bring with Maxwell.
Posted on Reply
#92
Sony Xperia S
HumanSmokeSo, you need to pre-cool the room before using the 290X
Ahahaha :laugh: That's because when you turn the heating off, rooms tend to go quickly back to that ambient temperature.
HumanSmokeIn a closed system the addition of heat energy will elevate the temperature of the space it occupies - this is (very) basic physics.
Just saying that you won't succeed with so small an energy addition; you will need something much more serious for a change.

Two 290Xs won't be enough. :D
HumanSmokeyour mother
Ooo, and please, do not mention my mom!!! Because, as far as I remember, I haven't mentioned yours! :rolleyes:
Posted on Reply
#93
HumanSmoke
Sony Xperia SJust saying that you won't succeed with so small an energy addition; you will need something much more serious for a change.
Meanwhile, in the real world: "If a 40-watt fan runs in a small (12' x 12' x 8') room for one hour, it generates enough heat to raise the temperature of the air by about 9 degrees Fahrenheit" - U.S. Department of Energy. You could actually ask a scientist to do the calculation for you, but as you're just trolling I guess that won't happen. It must be about time you switched back to trolling AMD, or the site in general, if your usual M.O. holds.
Posted on Reply
#94
Tatty_Two
Gone Fishing
This is becoming interesting, if a little tiresome. If the two of you cannot disagree without the petty jibes, childish tit-for-tat and constant bickering, then I will shut you both down for a spell..... thank you.
Posted on Reply
#95
rruff
HumanSmokeYou could actually ask a scientist to do the calculation for you
The problem is that both of you are correct.

It will vary hugely depending on how thermally isolated the room is, its thermal capacitance, its size, how long you are heating it, the starting temperature, and whether you will perceive a temperature rise as an issue (if it's winter and you keep your house at 65, then no problem... if it's summer and it's already 90, then yes). If the room were perfectly insulated, even 1W would eventually send the temperature to infinity. But most rooms are not well insulated from each other, doors are not sealed (or are even open), and air is circulated via heating and air conditioning systems.

My wife has a little electric resistance heater. It's 1500W. It can heat a small sealed room by more than 10°F in an hour; 500W for 3 hours would be similar. My dad's 3000 sq ft, 50-year-old house uses electric resistance heat. In winter his average consumption is 5000W... that's to heat the whole house 40+°F over ambient, steady state. If he had 10 x 500W systems cranking 24/7 he could get rid of the furnace.

So yes, the heat given off by a computer can be significant. It is also usually not that hard to make it a non-issue.

Getting back to the 960, I don't think it will be near 120W unless it is overclocked a lot. It's half of a 980 and a little less than double a 750. It should be <100W.
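Rough numbers behind that guess, treating power as scaling linearly with the unit counts (a crude assumption, and the 960's TDP is still unconfirmed):
```python
# Reference TDPs for the known cards; scaling power linearly with unit
# counts is a crude assumption, but it brackets the guess.
tdp_980 = 165  # W, reference GTX 980
tdp_750 = 55   # W, reference GTX 750

print(f"Half a 980:   {tdp_980 / 2:.1f} W")   # 82.5 W
print(f"Double a 750: {tdp_750 * 2:.0f} W")   # 110 W
# "A little less than double a 750", with only 1.4x the bandwidth,
# lands around 90-100 W -- hence the <100 W guess.
```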
Posted on Reply
#96
Casecutter
HumanSmoke"If a 40-watt fan runs in a small (12' x 12' x 8') room for one hour, it generates enough heat to raise the temperature of the air by about 9 degrees Fahrenheit" - U.S. Department of Energy.
I went searching to see what the actual parameters of that DOE test were; do you have a link?

I'd want to understand the construction and outside ambient they factor in. While a fan in the right conditions could create an increase, 9°F in an hour (about 13%) is hard to fathom, especially from something with no more air movement than a table fan; those are about 10-25W. Most modern ceiling fans use less than an amp, averaging between 0.5 and 1 amp, with 10-50W being customary. And even granting air circulation within a completely sealed room, there are other aspects to account for, like a human body... which is why you'd leave the fan on in the first place.

I'm not sure how they arrived at a 13% increase over a normal [72°F] room in just one hour, unless the room is affected by additional conduction from the outside ambient, or body heat is adding thermal load. By sheer calculation you might say the volume of air within a sealed "room" with a 40W load (a light bulb, ~36 joules/sec) would produce a rise, but you'd be lucky to see 4-5°F in an hour. And considering there's air movement from the fan, that influences the energy/heat that would be dissipated/absorbed by the surface of the walls, as affected by the outside ambient.
Posted on Reply
#97
rruff
CasecutterWhile a fan in the right conditions could create an increase, 9°F in an hour (about 13%) is hard to fathom
Don't worry about it. That isn't going to happen in any practical situation. It was probably a completely sealed and insulated room, so the figure only accounts for the energy needed to raise the temperature of the air. Air doesn't have much thermal capacitance, so it doesn't take much to heat it up. It's the exchange with other objects, the outdoors, and the rest of the house that matters.

OK, might as well just calculate it for a perfectly sealed and insulated room. Specific heat of air: ~1.00 kJ/kg·K. Density: ~1.2 kg/m³. The 12x12x8 room will have 1152 ft³, or 33 m³. Total mass of air is 33 x 1.2 = 40 kg. 40W for 1 hr is 40 x 60 x 60 = 144 kJ. Temperature rise = 144/40 = 3.6 K, or 6.5°F.

I don't get a 9°F rise even with bogus idealized assumptions.
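The same calculation in code, for anyone who wants to plug in their own room size or wattage (same idealized sealed-and-insulated assumptions):
```python
def sealed_room_temp_rise_f(power_w, hours, room_ft3,
                            cp_air_kj=1.0, rho_air=1.2):
    """Idealized temperature rise (deg F) in a perfectly sealed,
    insulated room: all the energy goes into the air alone. Real rooms
    leak heat into walls, furniture and the rest of the house."""
    volume_m3 = room_ft3 * 0.0283168             # ft^3 -> m^3
    mass_kg = volume_m3 * rho_air                # mass of the air
    energy_kj = power_w * hours * 3600 / 1000    # W over time -> kJ
    delta_k = energy_kj / (mass_kg * cp_air_kj)  # Q = m * c * dT
    return delta_k * 9 / 5                       # K -> deg F

# The DOE fan example: 40 W for 1 hour in a 12' x 12' x 8' room
print(f"{sealed_room_temp_rise_f(40, 1, 12 * 12 * 8):.1f} F")  # ~6.6 F
```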
Posted on Reply
#98
HumanSmoke
rruffThe problem is that both of you are correct.
It will vary hugely depending on how thermally isolated the room is, its thermal capacitance, its size, how long you are heating it, the starting temperature, and whether you will perceive a temperature rise as an issue (if it's winter and you keep your house at 65, then no problem... if it's summer and it's already 90, then yes).
Yes, and there is also another variable to take into consideration: proximity of the user to the heat source. People as a general rule don't sit within a couple of feet of a heater, but they may well do that with their computer system. And as you've alluded to, people are unlikely to fire up a heater on a warm day and have it parked next to them.
Having AC or airflow in the room might alleviate or entirely negate the effects, but I'm pretty certain that having the system parked under the desk amplifies them. Airflow and AC also depend on actually being available, which might not be a given for a lot of users.
The user experience is of course particular to the person, but I'm pretty sure the anecdotal evidence points to a significant percentage of users being affected. My 780 SLI system uses close to what a 290X Crossfire setup does (~700-750W), and I'm pretty sure I notice the difference between idle and 3D load. BTW, the local ambient temp here is 18°C at 9:10 a.m. with winds of <2 km/h, which is about average for late evening/overnight/early morning during summer and autumn.
rruffGetting back to the 960, I don't think it will be near 120W unless it is overclocked a lot. It's half of a 980 and a little less than double a 750. It should be <100W.
That might be a little conservative. The GTX 970/980's TDP seems based on the base clock, which in actuality is rarely, if ever, the limit of its reference core frequency. Two other factors to take into consideration are that the 960 is clocked at least 100MHz higher (core) and uses 4GB of 7Gbps GDDR5 (@ 1.55V) rather than the 2GB of 5Gbps (@ 1.35V) found in the GTX 750/750 Ti.
I have no doubt that Nvidia will quote the base-clock TDP for the card for marketing purposes, but I wouldn't be overly surprised to see the card use 20W or so more in real-world 3D loading.
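As a rough illustration of the memory point, a first-order power-scaling sketch (P ∝ V²·f is a simplification, and the 960 memory figures here are the rumored specs):
```python
# First-order dynamic power scaling, P ~ V^2 * f. This is a
# simplification (GDDR5 I/O and termination power don't scale this
# cleanly), and the GTX 960 memory specs are the rumored ones.
v_750, gbps_750 = 1.35, 5.0   # GTX 750/750 Ti: 5 Gbps GDDR5 @ 1.35 V
v_960, gbps_960 = 1.55, 7.0   # GTX 960 (rumored): 7 Gbps GDDR5 @ 1.55 V

scale = (v_960 / v_750) ** 2 * (gbps_960 / gbps_750)
print(f"Relative per-chip memory power: ~{scale:.2f}x")  # ~1.85x
```
Nearly double the per-chip memory power before the higher core clock is even counted, which is why the 750 Ti's TDP is a poor baseline.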
Posted on Reply