Saturday, May 23rd 2015

NVIDIA GeForce GTX 980 Ti Smiles for the Camera

Here are some of the first pictures of an NVIDIA GeForce GTX 980 Ti graphics card, in the flesh. As predicted, the reference-design board reuses the PCB of the GeForce GTX TITAN X, and its cooler is a silver version of its older sibling's. According to an older report, the GTX 980 Ti will be carved out of the 28 nm GM200 silicon by disabling 2 of its 24 SMM units, resulting in a CUDA core count of 2,816. The card retains the 384-bit GDDR5 memory bus width, but holds 6 GB of memory, half that of the GTX TITAN X. The card is expected to launch in early June 2015. NVIDIA's add-in card (AIC) partners will be free to launch custom-design boards with this SKU, so you could hold out for the MSI Lightnings, the EVGA Classifieds, the ASUS Strixes, the Gigabyte G1s, and the like.
Source: VideoCardz
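
The core-count math checks out against GM200's published layout: each Maxwell SMM carries 128 CUDA cores, and the full die has 24 SMMs. A quick illustrative sketch of the arithmetic (Python, purely for reference):

```python
# Sanity check of the GTX 980 Ti core count quoted above.
# Each Maxwell SMM carries 128 CUDA cores; the full GM200 die has 24 SMMs.
CORES_PER_SMM = 128
TOTAL_SMM = 24

titan_x_cores = TOTAL_SMM * CORES_PER_SMM            # 3,072 (full GM200, GTX TITAN X)
gtx_980_ti_cores = (TOTAL_SMM - 2) * CORES_PER_SMM   # 2,816 (2 SMMs disabled)

print(titan_x_cores, gtx_980_ti_cores)  # 3072 2816
```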

118 Comments on NVIDIA GeForce GTX 980 Ti Smiles for the Camera

#51
MxPhenom 216
ASIC Engineer
radrok: I used the OG Titan reference cooler for a couple of days before dismantling it; while it was quiet and good for stock clocks, it couldn't cope with overclocking unless it was only a couple of tens of megahertz.

It's good for bone-stock quiet operation, but if you want some more, be prepared to have a jet engine in your case.
Maybe on the Titan, but the 780 is the quietest and coolest-running reference card I have ever had, and I overclocked it 100-150 MHz.
Posted on Reply
#52
HumanSmoke
qubit: The gold standard in low-noise, lower-temperature high-end graphics cards are the MSI Gaming and the ASUS ones (Strix, I think?). If they can do it, then so can NVIDIA.
The difference is that reference cooling is more or less predicated on a single fan that pushes most (if not all) of the hot air outside the chassis. The ASUS (Strix/Matrix/DCII etc.) and MSI (TwinFrozr) coolers, along with Gigabyte's WindForce, EVGA's ACX, and GALAX's whatever-the-hell-that-pseudo-shoebox-is, are all at least twin-fan designs that exhaust into the chassis. Unless you can guarantee compatibility with every chassis conforming to the form-factor standards, such a design cannot be applied universally as a reference design, and hence we see virtually every reference cooler conforming to the blower/shroud configuration. Those that do not (like the one on the HD 7990) are limited in application and uptake.
qubit: ...and I'm sure it would help to boost their sales by having such a cutting-edge cooler as standard.
The same could be said for AMD's offerings. If a custom (vendor) licensed design could be applied as reference, I'm pretty sure AMD would jump at the chance to decrease their bill of materials from the AIO to, say, Sapphire's Vapor-X cooler or MSI's TF5, which could undoubtedly handle the heat-dissipation duties required.
Posted on Reply
#53
arbiter
pidgin: If the 390X is neck and neck in price/performance with the 980 Ti, I think the 390X will sell a looot easier.
Even if it's neck and neck, if that 390X sells for $850 and the 980 Ti is a good $100-200 cheaper, I'm not sure it will sell that easily. AMD has admitted that supplies could be limited at the launch of their card, so prices could even go up.
MxPhenom 216: Ever since the original Titan, NVIDIA cards have been the sexiest reference cards I have ever seen.
Let's also remember NVIDIA's cooler also works at what it's meant to do, unlike the AMD one that couldn't.
newtekie1I doubt it, AMD is still having heat and power issues
If AMD wasn't having heat issues, why did they go with a water cooler on the card as reference?
MxPhenom 216Why should they redesign the cooler when it works well and about 99.99% of people really like the look?
No need to change a cooler design that works. Let the third-party makers worry about design.
qubit: The gold standard in low-noise, lower-temperature high-end graphics cards are the MSI Gaming and the ASUS ones (Strix, I think?). If they can do it, then so can NVIDIA.
I would argue that point. The Gigabyte WindForce cooler is usually one of the best-performing coolers in most reviews. I had one on a GTX 670 with about a +200 MHz offset and never had that card top 65°C. I would have bought a GTX 980 WindForce, but they made it an inch or two longer, so it wouldn't have fit in my case without some modification.
radrok: I used the OG Titan reference cooler for a couple of days before dismantling it; while it was quiet and good for stock clocks, it couldn't cope with overclocking unless it was only a couple of tens of megahertz.

It's good for bone-stock quiet operation, but if you want some more, be prepared to have a jet engine in your case.
At least the card runs at the clocks they advertise rather than dropping ~20% of the GPU's clock speed after 5 minutes.
Posted on Reply
#54
HTC
qubit: I recently saw an NVIDIA card which had this reference cooler. First time I've seen it in real life, and while it was nice, I think it's overrated. Noise in 3D mode was reasonable, but I thought it could have been better, too.

I'm beginning to get the feeling that NVIDIA are getting lazy with designing reference coolers nowadays, with all their cards looking the same. This design is now two years old and I'm sure they could do better if they could be bothered.
Is the reference cooler butt-ugly or hideous-looking? If so, then I can see the need for a visual improvement, so long as it doesn't compromise performance. If not, why is there a need to waste money trying to "fix" something in perfectly working condition AND suited to the job (for now, anyway, on the cards it has been placed on)? It would be a different tune if the cooler were inadequate for the job, which it clearly isn't. When AMD's 290 family was launched, I voiced my disappointment at their cooler quite loudly, and I'm sure many did the same: now there's a reference cooler in dire need of an overall update!!!!

I get this sort of thing at work too, with the engineers "trying" to come up with ways to improve things visually and ... forgetting ... that actually doing the job properly is WAY more important than looks. Stupid, IMO!!!!
Posted on Reply
#55
newtekie1
Semi-Retired Folder
arbiter: If AMD wasn't having heat issues, why did they go with a water cooler on the card as reference?
That is my point.
Posted on Reply
#56
arbiter
HTC: It would be a different tune if the cooler were inadequate for the job, which it clearly isn't. When AMD's 290 family was launched, I voiced my disappointment at their cooler quite loudly, and I'm sure many did the same: now there's a reference cooler in dire need of an overall update!!!!
You and just about EVERY review site, when they realized the 290(X) ref cooler was crap. Almost every site got on AMD over it, and rightfully so. You shouldn't need to run the fan near max just to keep advertised clocks, and that's on an open-air bench in a temperature-controlled office to boot, which most people that buy them won't be using.
Posted on Reply
#57
buildzoid
@Steevo

Those are SMD tantalum caps, not inductors. You can tell from the markings: inductors come with an R rating on them; caps don't.
Posted on Reply
#58
Solaris17
Super Dainty Moderator
newtekie1: That's an AMD gimmick.
/chuckles To be fair, that has saved some asses when it came to bad flashes. I agree though :toast:
Posted on Reply
#59
newtekie1
Semi-Retired Folder
arbiter: You and just about EVERY review site, when they realized the 290(X) ref cooler was crap. Almost every site got on AMD over it, and rightfully so. You shouldn't need to run the fan near max just to keep advertised clocks, and that's on an open-air bench in a temperature-controlled office to boot, which most people that buy them won't be using.
The AMD cooler actually isn't a bad design. In fact, it is better in cooling ability than the NVIDIA reference cooler. AMD's problem is that their GPUs put out a stupid amount of heat. A 290X puts out a good 50 W more heat than the TITAN X; that's basically 20% more heat.
Posted on Reply
#60
nunyabuisness
newtekie1: Have you actually tried any 4K gaming? I have yet to find a title where 4 GB wasn't enough to handle 4K.
Well, I am at 1440p and I run at 200% sampling, and it definitely fills up in most games, mate!
The Witcher doesn't count anyway; it's a heavily optimized game, I'll give you that, but most games aren't! They are buggy memory hogs! And it's best to have at least 6-8 GB for 4K (980 Ti sweet spot).
And I have an Intel i7 and a TITAN X in my PC, and my two kids have FX-6300s and 290s, so I'm not a biased person and I want AMD to do well, but it's making some bad decisions.
Posted on Reply
#61
Solaris17
Super Dainty Moderator
nunyabuisness: Well, I am at 1440p and I run at 200% sampling, and it definitely fills up in most games, mate!
The Witcher doesn't count anyway; it's a heavily optimized game, I'll give you that, but most games aren't! They are buggy memory hogs! And it's best to have at least 6-8 GB for 4K (980 Ti sweet spot).
And I have an Intel i7 and a TITAN X in my PC, and my two kids have FX-6300s and 290s, so I'm not a biased person and I want AMD to do well, but it's making some bad decisions.
Why do you run 200% SS at that resolution? The image-quality benefit of AA and AF diminishes drastically depending on monitor size. Hell, even the still-standard 1080p doesn't need drastic amounts, if any at all, in most games on 23-24" monitors and smaller.
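
For context, a back-of-the-envelope sketch of what 200% supersampling implies at 1440p. This assumes "200%" scales each axis by 2x (as NVIDIA's DSR 4x does) and a plain RGBA8 color buffer; both are assumptions rather than anything the posters specified:

```python
# Back-of-the-envelope: what 200% supersampling implies at 1440p.
# Assumption: "200%" scales each axis by 2x (4x the pixel count); some
# games scale total pixel count instead.
BYTES_PER_PIXEL = 4  # a plain RGBA8 color buffer, nothing else

base_w, base_h = 2560, 1440
ss_w, ss_h = base_w * 2, base_h * 2          # 5120 x 2880 render target

pixels = ss_w * ss_h
buffer_mb = pixels * BYTES_PER_PIXEL / 2**20
print(f"{ss_w}x{ss_h} = {pixels/1e6:.1f} MP, ~{buffer_mb:.0f} MB per color buffer")
# -> 5120x2880 = 14.7 MP, ~56 MB per color buffer; single buffers are small,
#    so it's textures, G-buffers, and shadow maps that actually eat VRAM.
```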
Posted on Reply
#62
HTC
newtekie1: The AMD cooler actually isn't a bad design. In fact, it is better in cooling ability than the NVIDIA reference cooler. AMD's problem is that their GPUs put out a stupid amount of heat. A 290X puts out a good 50 W more heat than the TITAN X; that's basically 20% more heat.
Regardless: the fact is, it's inadequate (sound-wise) for those cards. Have AMD stick it in cards that produce less heat, thus "miraculously" making the cooler good.
Posted on Reply
#63
arbiter
newtekie1: The AMD cooler actually isn't a bad design. In fact, it is better in cooling ability than the NVIDIA reference cooler. AMD's problem is that their GPUs put out a stupid amount of heat. A 290X puts out a good 50 W more heat than the TITAN X; that's basically 20% more heat.
Um, where your argument falls apart is that the NVIDIA cooler keeps the chip at 80°C most of the time (that's the set point before it backs the boost down), and it mostly manages it. AMD's cooler couldn't even keep theirs under 95°C, so really the argument is kind of flawed there; that's 20% hotter. I bet if you underclocked a 290X to fit the 250 W range, it still couldn't keep that GPU at 80°C; it would probably still hit 95°C with the cooler they used.
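
The underclocking hypothetical can be roughed out with the classic CMOS dynamic-power approximation, P scaling with frequency times voltage squared; the clocks and voltages below are illustrative guesses, not measured 290X values:

```python
# Rough dynamic-power scaling: P_new ~ P_old * (f_new/f_old) * (V_new/V_old)^2.
# All numbers here are illustrative assumptions, not measurements.
def scaled_power(p_old, f_old, f_new, v_old, v_new):
    """Classic CMOS dynamic-power approximation."""
    return p_old * (f_new / f_old) * (v_new / v_old) ** 2

# Hypothetical 290X: ~295 W at 1000 MHz / 1.20 V, downclocked to 900 MHz / 1.10 V.
print(round(scaled_power(295, 1000, 900, 1.20, 1.10)))  # ~223 W
```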
Posted on Reply
#64
Steevo
HTC: Regardless: the fact is, it's inadequate (sound-wise) for those cards. Have AMD stick it in cards that produce less heat, thus "miraculously" making the cooler good.
They have to stay with it for the people who already have hot cases; when you stick in another 200 W of power dissipation with no venting, it literally becomes the oven that kills. Water was the only way out, but even that has its limitations, and a smaller process node wasn't happening for them.
Posted on Reply
#65
AsRock
TPU addict
arbiter: Um, where your argument falls apart is that the NVIDIA cooler keeps the chip at 80°C most of the time (that's the set point before it backs the boost down), and it mostly manages it. AMD's cooler couldn't even keep theirs under 95°C, so really the argument is kind of flawed there; that's 20% hotter. I bet if you underclocked a 290X to fit the 250 W range, it still couldn't keep that GPU at 80°C; it would probably still hit 95°C with the cooler they used.
Quality control/finish screwed the reference cooler, but with some love it performed much better.
Posted on Reply
#66
newtekie1
Semi-Retired Folder
arbiter: Um, where your argument falls apart is that the NVIDIA cooler keeps the chip at 80°C most of the time (that's the set point before it backs the boost down), and it mostly manages it. AMD's cooler couldn't even keep theirs under 95°C, so really the argument is kind of flawed there; that's 20% hotter. I bet if you underclocked a 290X to fit the 250 W range, it still couldn't keep that GPU at 80°C; it would probably still hit 95°C with the cooler they used.
Coolers can only handle up to a certain amount of heat; once you go beyond that, you get thermal runaway and temps go uncontrolled. So the reference NVIDIA cooler might be able to dissipate 260 W, and the TITAN X puts out 245 W, so all is good and the cooler keeps the card at acceptable temps. However, the AMD reference cooler might be capable of dissipating 275 W, but since the 290X puts out 295 W, the cooler can't dissipate all the heat the 290X is generating and temps go out of control.
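
This argument amounts to a simple steady-state thermal model: die temperature settles at ambient plus power times the cooler's effective thermal resistance. A minimal sketch with an assumed (not measured) resistance figure:

```python
# Minimal steady-state model: T_gpu = T_ambient + P * R_thermal.
# R_thermal (degC per watt) is the cooler's effective resistance at a given
# fan speed; the value below is illustrative, not measured.
def die_temp(ambient_c, power_w, r_thermal):
    return ambient_c + power_w * r_thermal

R_AT_MAX_FAN = 0.24  # assumed best case for a blower cooler, degC/W

for name, watts in [("TITAN X (~245 W)", 245), ("290X (~295 W)", 295)]:
    print(name, "->", round(die_temp(25, watts, R_AT_MAX_FAN)), "degC")
# -> ~84 degC vs ~96 degC at the same 25 degC ambient: the extra 50 W alone
#    is enough to push the hotter card past a 95 degC throttle point.
```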
Posted on Reply
#67
haswrong
Seems like a cut-down card.. so let's say for $200 I could be persuaded to buy.. otherwise, let me climb the Fiji hill..
Posted on Reply
#68
GhostRyder
arbiter: Um, where your argument falls apart is that the NVIDIA cooler keeps the chip at 80°C most of the time (that's the set point before it backs the boost down), and it mostly manages it. AMD's cooler couldn't even keep theirs under 95°C, so really the argument is kind of flawed there; that's 20% hotter. I bet if you underclocked a 290X to fit the 250 W range, it still couldn't keep that GPU at 80°C; it would probably still hit 95°C with the cooler they used.
First of all, you have no idea what you're talking about in regards to the reference cooler. I have three cards that had the reference cooler; you can keep the temps in a normal case below 95°C at the 55% fan-speed setting, even under gaming load with normal airflow (yes, reviewers had some trouble keeping temps down, we are aware of that, but a patch among other things helped out). Bump it to 60% and it's in the mid-80s. I own these cards, and ALL THREE of them acted the same, with one of them being roughly a degree higher.
arbiter: If AMD wasn't having heat issues, why did they go with a water cooler on the card as reference?
Probably because they went to the next level to give people a quiet reference cooler. Not much more could easily be done with a fan cooler, so what's next, short of making a cooler that dumps air into the case?

To top it off, from what I've read, the TITAN X even now throttles a bit, so it's not the perfect cooler...

Anyway, the 980 Ti's Titan-style cooler will be replaced by aftermarket ones, same as normal. The reference cooler does its job fine, but it is getting a bit dated. Though it still looks nice to me!
Posted on Reply
#69
HTC
newtekie1: Coolers can only handle up to a certain amount of heat; once you go beyond that, you get thermal runaway and temps go uncontrolled. So the reference NVIDIA cooler might be able to dissipate 260 W, and the TITAN X puts out 245 W, so all is good and the cooler keeps the card at acceptable temps. However, the AMD reference cooler might be capable of dissipating 275 W, but since the 290X puts out 295 W, the cooler can't dissipate all the heat the 290X is generating and temps go out of control.
The problem is that you aren't thinking of differences in ambient temps: the very same card (I mean literally the same) can behave differently at different ambient temps. Even if the card "gets away with it" at some ambient temps, that doesn't mean it will if the ambient temps are, say, 8-10°C higher. The difference between summer and winter ambient temps is bigger than that, and the card must be able to function properly in either one.
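
In the steady-state model sketched a few posts up, ambient adds directly to die temperature, so an 8°C warmer room means 8°C more on the die, all else being equal; a tiny illustrative check with the same assumed numbers:

```python
# Ambient adds directly to die temperature in the steady state:
# T_gpu = T_ambient + P * R_thermal, so +8 degC ambient = +8 degC on the die.
def die_temp(ambient_c, power_w=295, r_thermal=0.24):  # same assumed values as above
    return ambient_c + power_w * r_thermal

print(round(die_temp(25)), round(die_temp(33)))  # ~96 vs ~104 degC: winter pass, summer throttle
```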
Posted on Reply
#70
kiddagoat
I dunno wtf you all are on about with the AMD stock cooler... this guy owns a Sapphire 290 reference... I flashed it with a Tri-X OC BIOS... the card never gets over 75°C... I haven't taken it apart or anything... And yes, I do use MSI Afterburner to run a custom fan profile...

That 50% fan speed is alright depending on chassis configuration and your ambient... you open it up to about 65%-70% and it works just fine... I have headphones and I still don't hear the card... I use my 2.1 Klipsch ProMedia... still don't hear the card...

I think some people just like to be anal and cling to every little "negative" thing about a product... weighing them all equally...

I have owned both AMD and NVIDIA cards... just get whatever is better for the money at the time... I mean, at least AMD doesn't release a driver that smokes their cards by turning the fans off or not letting them work properly, then say it is your fault for installing the driver.

Just saying... both sides have blows against them... damn fanboys...

All the more reason... if you are just going off reviews and don't actually own the product, your opinion should actually hold less weight... Not every configuration is the same... even if all the hardware is the same revision, BIOS, and all that nitty-gritty stuff... it is going to have some uniqueness to it...
Posted on Reply
#71
HumanSmoke
haswrong: Seems like a cut-down card.. so let's say for $200 I could be persuaded to buy.. otherwise, let me climb the Fiji hill..
Wow, you really are rooting for AMD to die, aren't you?

Just for the sake of argument, I'll spend a couple of minutes evaluating the scenario you put all of ten seconds of thought into... just for laughs...

A GTX 980 Ti for $200 makes a mainstream 960 what, $50? A 750 Ti, $20?
Nvidia might be able to absorb those losses for a while with $4.8 billion in cash and short-term securities, but AMD, with their nosediving assets, maybe not so much. The company isn't sustainable long-term at its current level, and you're hoping they breach the $600M barrier in short order, whereby the company is untenable as a going concern. If you're expecting Fiji to dig them out of the hole, I have news for you: the mainstream volume markets are where the revenue really rolls in. In your scenario, AMD are combating Nvidia cards priced at pocket change with a slew of rebranded cards, some of which may not even support all of AMD's marketing features.
Posted on Reply
#72
newtekie1
Semi-Retired Folder
HTC: The problem is that you aren't thinking of differences in ambient temps: the very same card (I mean literally the same) can behave differently at different ambient temps. Even if the card "gets away with it" at some ambient temps, that doesn't mean it will if the ambient temps are, say, 8-10°C higher. The difference between summer and winter ambient temps is bigger than that, and the card must be able to function properly in either one.
You are absolutely correct, but I'm talking in terms of all other factors being the same, the only difference being the coolers. The AMD cooler is the better-performing cooler, but even still, it just can't keep up with the heat load from the 290X GPU.
Posted on Reply
#73
rtwjunkie
PC Gaming Enthusiast
HTC: The problem is that you aren't thinking of differences in ambient temps: the very same card (I mean literally the same) can behave differently at different ambient temps. Even if the card "gets away with it" at some ambient temps, that doesn't mean it will if the ambient temps are, say, 8-10°C higher. The difference between summer and winter ambient temps is bigger than that, and the card must be able to function properly in either one.
It works great, summer or winter, for me with a very progressive fan profile using all 8 allowed points on a 780. The one game that has made it run the hottest in 2 years is The Witcher 3. For that, the GPU temp reaches 70°C and is at 65% fan speed.

Even at that speed, I sometimes have to look through the window at the intensity of the lighted GeForce logo, which is attuned to the fan, to tell the fan is even running, because it's the quietest thing in my case. So, in addition to being great to look at, rigid enough to support the PCB, and quiet, it is excellent at cooling.
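
A fan profile like this is just a piecewise-linear map from temperature to fan speed; a minimal sketch of how an 8-point curve could be interpolated, Afterburner-style (the points themselves are invented, except the 70°C/65% one mentioned above):

```python
# A hypothetical 8-point fan profile (temp degC -> fan %), interpolated
# linearly between points the way Afterburner-style tools do.
CURVE = [(30, 20), (40, 25), (50, 35), (60, 45),
         (70, 65), (75, 75), (80, 90), (85, 100)]

def fan_percent(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(70))  # 65.0 -- matches the 70 degC / 65% point above
```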
Posted on Reply
#74
HTC
rtwjunkie: It works great, summer or winter, for me with a very progressive fan profile using all 8 allowed points on a 780. The one game that has made it run the hottest in 2 years is The Witcher 3. For that, the GPU temp reaches 70°C and is at 65% fan speed.

Even at that speed, I sometimes have to look through the window at the intensity of the lighted GeForce logo, which is attuned to the fan, to tell the fan is even running, because it's the quietest thing in my case. So, in addition to being great to look at, rigid enough to support the PCB, and quiet, it is excellent at cooling.
That's not what I was trying to say: the reason I mentioned seasons at all was to point out their difference, temp-wise.

Please look at this page: www.techpowerup.com/reviews/AMD/R9_290X/30.html

Now imagine the ambient temp was, say, 8°C higher: how do you think the results would look then?

Not everyone lives in moderate climate areas, dude: the card should be able to perform as advertised in Sweden's winter as well as in Ethiopia's summer. Clearly, this card does not, even in a moderate climate (referring to the reference model only), since it throttles so damn much: the cooler isn't adequate for the job.
Posted on Reply
#75
RejZoR
newtekie1: That's an AMD gimmick.
Gimmick? It's an enthusiast-grade feature sent from the heavens. You can flash Radeons pretty much carelessly and you can't go wrong unless you fry them. But other than that, that switch does wonders...
Posted on Reply