# NVIDIA GeForce GTX 980 Ti Smiles for the Camera



## btarunr (May 23, 2015)

Here are some of the first pictures of an NVIDIA GeForce GTX 980 Ti graphics card, in the flesh. As predicted, the reference design board reuses the PCB of the GeForce GTX TITAN-X, and its cooler is a silver version of its older sibling's. According to an older report, the GTX 980 Ti will be carved out of the 28 nm GM200 silicon by disabling 2 of its 24 SMM units, resulting in a CUDA core count of 2,816. The card retains its 384-bit GDDR5 memory bus width, but holds 6 GB of memory, half that of the GTX TITAN-X. The card is expected to launch in early June 2015. NVIDIA's add-in card (AIC) partners will be free to launch custom-design boards with this SKU, so you could hold out for the MSI Lightnings, the EVGA Classifieds, the ASUS Strixes, the Gigabyte G1s, and the like.



 

 



*View at TechPowerUp Main Site*


----------



## pidgin (May 23, 2015)

Looks as boring as ever. I'd choose the 390X just for HBM, not to mention the superior looks and probably better value.


----------



## xkm1948 (May 23, 2015)

If it is within $600, I am getting one. The Fiji card will probably go for the price of a Titan.


----------



## newtekie1 (May 23, 2015)

pidgin said:


> Looks as boring as ever. I'd choose 390X just for HBM already not to mention the superior looks and probably better value.


Reference cards always look boring. 

And we really don't know how much HBM will help the 390X. AMD had far more memory bandwidth than nVidia last generation and still couldn't manage to outperform them, so obviously just piling on more memory bandwidth isn't the solution to their problems.

I'm also not quite sure why people expect it to offer a better value. If it really does outperform nVidia, you can bet it won't offer a better value in any way; AMD will price it just as high as the Titans (they've done it in the past). If it doesn't outperform nVidia, then AMD will price the 390X right in line with whatever it matches from nVidia, just like they currently do.


----------



## pidgin (May 23, 2015)

Well, memory bandwidth is one thing. It's a new shiny piece of tech that is more intriguing to PC lovers than just numbers.

If the 390X is neck and neck in price/performance with the 980 Ti, I think the 390X will sell a looot easier.


----------



## NC37 (May 23, 2015)

Brace yourselves fellas...more sexy PCB pictures with no useful benchmarks or info coming!


----------



## MxPhenom 216 (May 23, 2015)

pidgin said:


> Looks as boring as ever. I'd choose 390X just for HBM already not to mention the superior looks and probably better value.


Ever since the original Titan, NVIDIA cards have been the sexiest reference cards I have ever seen.


----------



## newtekie1 (May 23, 2015)

pidgin said:


> Well memory bandwidth is one thing. Its a new shiny piece of tech that is more intriguing to PC lovers than just numbers.
> 
> If 390X is neck to neck in price/performance with 980 Ti I think 390X will sell a looot easier.



I doubt it. AMD is still having heat and power issues, and we already basically know the 980 Ti's power and heat, which are pretty damn good.

Just having a buzzword isn't going to help them sell cards if their card still runs hotter, uses more power, and doesn't actually outperform the competition.


----------



## ZoneDymo (May 23, 2015)

The original, maybe, but now it's just the same damn cooler every time.
Very generic and repetitive.


----------



## MxPhenom 216 (May 23, 2015)

newtekie1 said:


> I doubt it, AMD is still having heat and power issues, we already basically know the 980 Ti's power and heat and it is pretty damn good.
> 
> Just having a buzz word isn't going to help them sell cards if their card still runs hotter, uses more power, and doesn't actually outperform the competition.


Not to mention HBM is limited to 4GB right now. If they release an 8GB card, it will be GDDR5.


----------



## MxPhenom 216 (May 23, 2015)

ZoneDymo said:


> The original, maybe, but now it's just the same damn cooler every time.
> Very generic and repetitive.


They are damn good reference coolers, too. Before I took mine off my 780 to put a waterblock on it, it was definitely the quietest blower-style reference cooler I've ever experienced.


----------



## Debat0r (May 23, 2015)

MxPhenom 216 said:


> Not to mention HBM is limited to 4GB right now. If they release an 8GB card, it will be GDDR5.


Still, I'd rather have 4GB of RAM with a max of 128GB/s throughput than 8GB with less than a quarter of that...


----------



## btarunr (May 23, 2015)

Debat0r said:


> Still, I'd rather have 4GB of RAM with a max of 128GB/s throughput than 8GB with less than a quarter of that...



512 GB/s, not 128 GB/s. Funny how 4 GB wasn't a problem during "high-end" GTX 980 launch, just a few months ago.
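For what it's worth, the arithmetic behind those figures is simple: peak memory bandwidth is just bus width times per-pin data rate. A quick sketch; the 4096-bit/1 Gbps HBM configuration and the 512-bit/5 Gbps GDDR5 configuration used here are the commonly rumored first-gen HBM and 290X-class figures, not confirmed specs:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) * per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# First-gen HBM as rumored: four 1024-bit stacks at ~1 Gbps per pin.
print(bandwidth_gb_s(4 * 1024, 1.0))   # 512.0 GB/s
# A 290X-class GDDR5 setup: 512-bit bus at 5 Gbps effective.
print(bandwidth_gb_s(512, 5.0))        # 320.0 GB/s
```

The wide-but-slow HBM bus is how a ~1 Gbps pin rate still lands at 512 GB/s.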


----------



## ZoneDymo (May 23, 2015)

You can keep the, erm, let's call it "tech" underneath the same (meaning same performance) and still change the look.
This is about looks: all the cards since the Titan looking the same is unimaginative and boring.


----------



## bogami (May 23, 2015)

Unfortunately, people are so poorly educated that they think anything with "Titan" written on the box must be the best model. They don't realize that what's being offered here as the best gaming model is an incompletely cut-down processor, a chip that would otherwise belong in the junkyard, for a large sum of money.
This is never written anywhere. A partially defective chip, made to work by turning off an SMM unit, is what is being sold here! The same goes for all the cut-down parts on offer, regardless of the manufacturer. I have no problem with that, but it should lower the price, because it is salvage (a reject)!
We can expect the same problems as with the GTX 970!


----------



## the54thvoid (May 23, 2015)

btarunr said:


> Here are some of the first pictures of an NVIDIA GeForce GTX 980 Ti graphics card, in the flesh. As predicted, the reference design board reuses the PCB of the GeForce GTX TITAN-X, and its cooler is a silver version of its older sibling. According to an older report, the GTX 980 Ti will be carved out of the 28 nm GM200 silicon, by disabling 2 of its 24 SMM units, resulting in a CUDA core count of 2,816. The card retains its 384-bit GDDR5 memory bus width, but holds 6 GB of memory, half that of the GTX TITAN-X. The card is expected to launch in early June, 2015. NVIDIA's add-in card (AIC) partners will be free to launch custom-design boards with this SKU, so you could hold out for the MSI Lightnings, the EVGA Classifieds, the ASUS Strixes, the Gigabyte G1s, and the likes.
> 
> 
> 
> ...



This suggests only 1 SMM disabled

http://www.overclock3d.net/articles/gpu_displays/nvidia_gtx_980ti_specs_and_performance_leaked/1

and this backs it up:

http://forums.overclockers.co.uk/showpost.php?p=28074792&postcount=542

Within 7% of the Titan with half the memory makes it a more worthwhile card. That 7% will evaporate on a modified dual-BIOS card with water and a good PCB (Lightning, Classified, etc.).


----------



## HumanSmoke (May 23, 2015)

the54thvoid said:


> Within 7% of Titan with half memory, makes it a more worth card.  That 7% will evaporate on a modified dual BIOS card with water and a good PCB (Lightning, Classified etc)


7% might evaporate with (non-reference) air cooling, depending upon what clocks AIBs settle on. With an extra 30-40W of power budget to play with from halving the vRAM, I'd think the boost clocks should be quite reasonable, considering the actual in-app clocks the Titan X posts are close to 1200MHz, and that seems more a heat/throttling limitation of the reference cooler than anything else, given the board generally pulls somewhat less than 300W.


----------



## btarunr (May 23, 2015)

the54thvoid said:


> This suggests only 1 SMM disabled
> 
> http://www.overclock3d.net/articles/gpu_displays/nvidia_gtx_980ti_specs_and_performance_leaked/1



Care to count the number of SMMs in the red-shaded portion of their image? It takes 2 SMMs to go from 3072 to 2816 (128 cores per SMM). Maybe a typo on OC3D's part.
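The core-count arithmetic can be checked in a couple of lines; GM200's 128 CUDA cores per SMM is the only assumption here:

```python
CORES_PER_SMM = 128  # Maxwell GM200: 128 CUDA cores per SMM

def smms_disabled(full_cores: int, cut_cores: int) -> int:
    """Number of SMMs that must be fused off to go from full_cores to cut_cores."""
    diff = full_cores - cut_cores
    assert diff % CORES_PER_SMM == 0, "core counts must differ by whole SMMs"
    return diff // CORES_PER_SMM

print(smms_disabled(3072, 2816))  # 2 -> two SMMs disabled, not one
```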


----------



## radrok (May 23, 2015)

This card will basically be what the OG Titan was compared to the Titan Black, negligible performance difference, if any.

What's good is that this will probably have AIB custom PCBs (higher voltage) and with 6GB being more than enough for any humanly conceivable resolution it will be a good chunk better than the Titan X.


----------



## rtwjunkie (May 23, 2015)

pidgin said:


> Looks as boring as ever. I'd choose 390X just for HBM already not to mention the superior looks and probably better value.


 
To each his own, I guess.  That metal high-end reference cooler which was first used with Titan and 780 is exceptionally good looking IMO, and I know from hard use of my 780 that it cools it phenomenally well and is extremely quiet.  It's how to do a blower cooler properly.


----------



## nunyabuisness (May 23, 2015)

Then you are really silly.
Let's look at this technically:
it's obviously got a better core than a 290X, yes? So that means 4K gaming, right? WRONG. 4GB of VRAM is already maxed out at 1080p and 1440p ultra settings.

The HBM in this first release is limited to 4GB, which is not enough for 4K. So that means the chip is essentially being wasted! V2 will be a worthwhile choice, with 8, 12, or 16GB being possible.
But until then it's a bragging-rights card, as it's going to be fast at 1080p and 1440p... lol, the 290X with 8GB does that, meaning you will be paying $899 for what a 290X will do at a quarter of the price!


----------



## newtekie1 (May 23, 2015)

nunyabuisness said:


> WRONG. 4GB of VRAM is already maxed out at 1080p and 1440p ultra settings.



Have you actually tried any 4K gaming?  I have yet to find a title where 4GB wasn't enough to handle 4K.


----------



## radrok (May 23, 2015)

newtekie1 said:


> Have you actually tried any 4K gaming?  I have yet to find a title where 4GB wasn't enough to handle 4K.



But man 4K is viewed best with 64X MSAA, you need 24GB of VRAM minimum.


----------



## Caring1 (May 23, 2015)

No dual BIOS switch?


----------



## newtekie1 (May 23, 2015)

Caring1 said:


> No dual BIOS switch?


That's an AMD gimmick.


----------



## Caring1 (May 23, 2015)

newtekie1 said:


> That's an AMD gimmick.


And a very handy one too, judging by the number of people that stuff up flashing a BIOS.


----------



## newtekie1 (May 23, 2015)

Caring1 said:


> And a very handy one too judging by the amount of people that stuff up flashing a BIOS.


A minuscule fraction of a percent of the people buying the cards.


----------



## JMccovery (May 23, 2015)

newtekie1 said:


> That's an AMD gimmick.





newtekie1 said:


> A minuscule fraction of a percent of the people buying the cards.



Maybe AMD realized that it's better for that 'minuscule fraction' to have a working card if they mess something up, rather than having to wait on an RMA.

Why else do a lot of motherboards have similar functionality? Better to spend the money on this kind of system than spend more money on all the steps involved in a single RMA.


----------



## radrok (May 23, 2015)

I don't think bricking a card with a BIOS flash is covered under warranty, tbh.

I may be wrong though; I know of some AIBs that provide BIOS updates, even for GPUs.


----------



## MxPhenom 216 (May 23, 2015)

bogami said:


> Unfortunately, people are so poorly educated that they think if it says on the Dimitrovgrad titanium supposed to be the best model. They do not know that we here offer the best gejming model incompletely cut-out processor which should be in the junkyard for a large sum of money.
> This never written anywhere . So incompletely excised defective cut processor that would have operate by turning off the SSM unit is sold here! Thus is  in all inferior units on offer regardless of the manufacturer. I have no problem with this, but this should lower the price because it is the waste (reject, skart)!
> We can expect the same problems as with GTX970!


Most of the time I don't think anyone knows wtf you are saying.


----------



## 15th Warlock (May 23, 2015)

I wonder, and this is just a genuine concern, not a desire to beat a dead horse, if because of the way Maxwell is engineered, cutting off a few units on the GPU will effectively create separate partitions of memory à la the 970 for this card, as well as reducing the effective ROPs and cache.

Not that it really matters with 6GB of VRAM, but I hope this time NVIDIA clearly informs consumers about the real specs of this card if they want to avoid the PR shitstorm they brewed with the 970 "fiasco".


----------



## Steevo (May 23, 2015)

What are the big things on the back side of the core? Surface-mount caps?


----------



## newtekie1 (May 23, 2015)

15th Warlock said:


> I wonder - and this is just a genuine concern, not a desire to beat a dead horse - if because of the way Maxwell is engineered, cutting off a few units on the GPU will effectively create separate partitions of memory a là 970 for this card, as well as reducing the effective ROPs and cache.
> 
> Not that it really matters with 6GBs of VRAM, but I hope this time Nvidia clearly informs consumers about the real specs of this card, if they want to avoid the PR shitstorm they brewed with the 970 "fiasco"



That only happens if they disable part of the L2, AFAIK.  So we'll have to see if they disable some L2 or not.


----------



## AsRock (May 23, 2015)

nunyabuisness said:


> Then you are really silly.
> Let's look at this technically:
> it's obviously got a better core than a 290X, yes? So that means 4K gaming, right? WRONG. 4GB of VRAM is already maxed out at 1080p and 1440p ultra settings.
> 
> ...




Only a few games get near 4GB of usage. One is Arma 3: if you play it long enough, you might see 3.5GB of video RAM in use. Another, if I remember correctly, would be Watch Dogs.

I'm getting sick of people saying nothing uses that amount of RAM, and sick of those who try to claim most games do.

As for the cooler, I like the window and that's about it, but as for the PCB, both sides look sexy.


----------



## alchemist83 (May 23, 2015)

Shame that ATI/AMD driver updates are few and far between though, which is why NVIDIA outranks them. Constant driver updates are a must, and until ATI/AMD fix that I'm staying green! A Zotac GTX 970 AMP! Extreme is what I've got; tell me which similarly priced equivalent at the same performance is available... It doesn't exist.


----------



## ensabrenoir (May 23, 2015)

Even knowing little... we know what to expect of this card. AMD will be forced to a lower price point on a product that's almost as good as a (insert name here), killing their profit margin and further weakening the company. I predict the Titan will keep its crown, and this will answer the "almost as good, but for less" launch from the red team. There is no mindless mining bubble this time, so this could get ugly... especially if NVIDIA releases it at an aggressive price point.


----------



## HumanSmoke (May 23, 2015)

Steevo said:


> What are the big things on the back side of the core? surface mount caps?


Surface-mount inductors, I believe. (You can see a pretty good close-up in this post about Frankensteining the Titan X for sub-zero benching; see the first picture under section "7: Attach measuring cables".)
The GM200 has the inductors in basically the same place.


----------



## GhostRyder (May 23, 2015)

Well, I am happy to see it, but it does not look any different from normal.


All my interest is in where the performance will lie and how well it will overclock compared to the Titan x.


----------



## qubit (May 23, 2015)

I recently saw an NVIDIA card which had this reference cooler. First time I've seen it in real life and while it was nice, I think it's overrated. Noise in 3D mode was reasonable, but I thought it could have been better, too.

I'm beginning to get the feeling that NVIDIA are getting lazy with designing reference coolers nowadays with all their cards looking the same. This design is now two years old and I'm sure they could do better if they were bothered.


----------



## rtwjunkie (May 23, 2015)

@qubit, I have this cooler.  There is really nothing to improve. It does a fantastic job and is quiet.


----------



## Steevo (May 23, 2015)

alchemist83 said:


> Shame that ATI/AMD driver updates are few and far between though, which is why NVIDIA outranks them. Constant driver updates are a must, and until ATI/AMD fix that I'm staying green! A Zotac GTX 970 AMP! Extreme is what I've got; tell me which similarly priced equivalent at the same performance is available... It doesn't exist.




Go away paid shill.



HumanSmoke said:


> Surface Mount Inductors I believe ( You can see a pretty good close-up in this post about Frankensteining the Titan X for sub-zero benching - see the first picture under section "7: Attach measuring cables" )
> The GK200 has the inductors in basically the same place



My god that is crazy, and ballsy. I have screwed with some cards before but don't know if I would have the stones to try that.


----------



## MxPhenom 216 (May 23, 2015)

qubit said:


> I recently saw an NVIDIA card which had this reference cooler. First time I've seen it in real life and while it was nice, I think it's overrated. Noise in 3D mode was reasonable, but I thought it could have been better, too.
> 
> I'm beginning to get the feeling that NVIDIA are getting lazy with designing reference coolers nowadays with all their cards looking the same. This design is now two years old and I'm sure they could do better if they were bothered.


Why should they redesign the cooler when it works well and about 99.99% of people really like the look?


----------



## HumanSmoke (May 23, 2015)

qubit said:


> I'm beginning to get the feeling that NVIDIA are getting lazy with designing reference coolers nowadays with all their cards looking the same.


Amortization of tooling/manufacturing costs, and I'm pretty sure that the cooler is now readily identifiable with the Nvidia brand. While some people yearn for constant change (so long as it doesn't take them out of their comfort zone; see Win8 as an example), branding relies upon long-term continuity, which is why logos, colours, and naming conventions don't change every year.


qubit said:


> This design is now two years old and I'm sure they could do better if they were bothered.


Quite possibly they are trying, but it must be a tall order to improve on a pretty successful design. Of the three vendors offering add-in graphics boards of the 300W class, only one reference design, the NVTTM, handles the job with anything approaching satisfaction. AMD's offerings have been so well received that they have turned to AIOs, and Intel's Xeon Phi 5000/7000 series aren't noted for their quietness.


Steevo said:


> Go away paid shill.


C'mon, who would actually pay for that level of effort??


Steevo said:


> My god that is crazy, and ballsy. I have screwed with some cards before but don't know if I would have the stones to try that.


Yep, there are some people out there with a completely different approach to their hardware purchases. Spending $1K on a card could be the least of the expense.


----------



## radrok (May 23, 2015)

MxPhenom 216 said:


> Why should they redesign the cooler when it works well and about 99.99% of people really like the look?



I can vouch that it works pretty well as a furniture piece.


----------



## bpgt64 (May 23, 2015)

Either way you look at this, it's good news for gamers. AMD needs to take Nvidia down a peg. I have been on the green team since the GTX 600 series, and would love to see an AMD card that beats the s*&^ out of Nvidia. I am not saying AMD hasn't done competitive/impressive things with the R9 290 or 7970, but the heat generation from their cards has become an issue for me, like the old Intel Prescotts back in the day. There hasn't been enough innovation in recent generations, but all indicators point to some impressive features in this next gen of AMD cards.


----------



## qubit (May 23, 2015)

I'm surprised at how many of you are sticking up for NVIDIA regarding the cooler.

I don't really have anything to add except that this was my impression on seeing it, and I think the noise could be reduced a little more. The gold standard in low-noise, low-temperature high-end graphics cards are the MSI Gaming and the Asus ones (Strix, I think?). If they can do it, then so can NVIDIA, and I'm sure it would help boost their sales to have such a cutting-edge cooler as standard.

I have an MSI GTX 780 Ti Gaming and it's really quiet, even when being pushed hard. This is the noise performance I'm looking for in my next card, whatever brand it is.

You should see the superlative review it got on TPU. So good that W1z put it in his own rig.


----------



## bpgt64 (May 23, 2015)

qubit said:


> I'm surprised at how many of you are sticking up for NVIDIA regarding the cooler.
> 
> I don't really have anything to add except that this was my impression on seeing it and I think the noise could be reduced a little more. The gold standard in low noise and lower temperature high end graphics cards are the MSI Gaming and the Asus ones (Strix I think?) If they can do it, then so can NVIDIA - and I'm sure it would help to boost their sales by having such a cutting-edge cooler as standard.



I like nvidia's stock cooler for the Titan/GTX 780/980 lineup; it permits very impressive overclocking without much noise. It's a shite load better than AMD's.

Bottom line to all this is that it's a good thing to see Nvidia and AMD play ro-sham-bo:

http://www.urbandictionary.com/define.php?term=roshambo


----------



## qubit (May 23, 2015)

@bpgt64 It's certainly a lot better than AMD's, agreed, and that's the problem, isn't it? Not enough competition.

The reviews back up your assertion about overclocking, though.


----------



## rtwjunkie (May 23, 2015)

qubit said:


> I'm surprised at how many of you are sticking up for NVIDIA regarding the cooler.
> 
> I don't really have anything to add except that this was my impression on seeing it and I think the noise could be reduced a little more. The gold standard in low noise and lower temperature high end graphics cards are the MSI Gaming and the Asus ones (Strix I think?) If they can do it, then so can NVIDIA - and I'm sure it would help to boost their sales by having such a cutting-edge cooler as standard.
> 
> ...


You are correct, those superlative vendor coolers do an excellent job, but they dump the heat into the case. I've also got a great MSI one. But after extended gaming, the metal reference cooler wins out on cooling because it isn't bathing itself in hot air and sucking it back into its heatsink. And I can tell you the high-end reference heatsink is the quietest thing in my case.


----------



## MxPhenom 216 (May 23, 2015)

radrok said:


> I can vouch it works pretty well as furniture piece


Right? I took it off my reference 780 after a couple of months to put a waterblock on it. Loved it while it was on my card, though.


----------



## radrok (May 23, 2015)

I used the OG Titan reference cooler for a couple of days before dismantling it; while it was quiet and good for stock clocks, it couldn't cope with overclocking beyond a couple of tens of megahertz.

It's good for bone-stock quiet operation, but if you want some more, be prepared to have a jet engine in your case.


----------



## MxPhenom 216 (May 23, 2015)

radrok said:


> I used the OG Titan reference cooler for a couple of days before dismantling it, while it was quiet and good for stock clocks it couldn't cope with overclocking, unless it was a couple tens of megahertz.
> 
> It's good for bone stock quiet operation, but if you want some more be prepared to have a jet engine in your case.


Maybe on the Titan, but the 780 was the quietest and coolest-running reference card I have ever had, and I overclocked it 100-150MHz.


----------



## HumanSmoke (May 23, 2015)

qubit said:


> The gold standard in low noise and lower temperature high end graphics cards are the MSI Gaming and the Asus ones (Strix I think?) If they can do it, then so can NVIDIA


The difference is that reference cooling is more or less predicated on a single fan that pushes most (if not all) of the hot air outside the chassis. The Asus (Strix/Matrix/DCII etc.) and MSI (TwinFrozr) coolers, along with Gigabyte's WindForce, EVGA's ACX, and GALAX's whatever-the-hell-that-pseudo-shoebox-is, are all twin-fan at the very least and exhaust into the chassis. Unless you can guarantee compatibility with every chassis conforming to the form-factor standards, such a design cannot be applied universally as reference, and hence we see virtually every reference cooler conforming to the blower/shroud configuration. Those that do not (like that on the HD 7990) are limited in application and uptake.


qubit said:


> and I'm sure it would help to boost their sales by having such a cutting-edge cooler as standard.


The same could be said for AMD's offerings. If a custom (vendor) licensed design could be applied as reference, I'm pretty sure AMD would jump at the chance to decrease their bill of materials from the AIO to, say, Sapphire's Vapor-X cooler, or MSI's TF5, either of which could undoubtedly handle the heat-dissipation duties required.


----------



## arbiter (May 24, 2015)

pidgin said:


> If 390X is neck to neck in price/performance with 980 Ti I think 390X will sell a looot easier.



Even if it's neck and neck, if the 390X sells for $850 and the 980 Ti is a good $100-200 cheaper, I'm not sure it will sell that easily. AMD has admitted that supplies could be limited at the launch of their card, so prices could even go up.



MxPhenom 216 said:


> Ever since the original Titan,NVIDIA cards have been the sexiest reference cards I have ever seen.


Let's also remember the nvidia cooler also works at what it's meant to do, unlike the AMD one that couldn't.



newtekie1 said:


> I doubt it, AMD is still having heat and power issues


If AMD wasn't having heat issues, why did they go with a water cooler on the card as reference?



MxPhenom 216 said:


> Why should they redesign the cooler when it works well and about 99.99% of people really like the look?


No need to change a cooler design that works. Let the third-party makers worry about design.



qubit said:


> The gold standard in low noise and lower temperature high end graphics cards are the MSI Gaming and the Asus ones (Strix I think?) If they can do it, then so can NVIDIA


I would argue that. The Gigabyte WindForce cooler is usually one of the best-performing coolers in most reviews. I had one on a GTX 670 with something like a +200MHz offset, and I never saw that card top 65C. I would have bought a GTX 980 WindForce, but they made it an inch or two longer, so I wouldn't have been able to fit it in my case without some modification.



radrok said:


> I used the OG Titan reference cooler for a couple of days before dismantling it, while it was quiet and good for stock clocks it couldn't cope with overclocking, unless it was a couple tens of megahertz.
> 
> It's good for bone stock quiet operation, but if you want some more be prepared to have a jet engine in your case.



At least the card runs at the clocks they say, rather than dropping ~20% of the GPU's clock speed after five minutes.


----------



## HTC (May 24, 2015)

qubit said:


> I recently saw an NVIDIA card which had this reference cooler. First time I've seen it in real life and while it was nice, I think it's overrated. Noise in 3D mode was reasonable, but I thought it could have been better, too.
> 
> I'm beginning to get the feeling that NVIDIA are getting lazy with designing reference coolers nowadays with all their cards looking the same. This design is now two years old and I'm sure they could do better if they were bothered.



Is the *reference cooler* butt ugly or hideous looking? If so, then I can see the need for a visual improvement, so long as it doesn't compromise performance. If not, why is there a need to waste money trying to "fix" something in perfectly working condition AND suited to the job (for now, anyway, on the cards it has been placed on)? It would be a different tune if the cooler were inadequate for the job, which it clearly isn't. When AMD's 290 family was launched, I voiced my disappointment at their cooler quite loudly, and I'm sure many did the same: now there's a reference cooler in dire need of an overall update!!!!

I get this sort of thing at work too, with the engineers "trying" to come up with ways to improve things visually and... forgetting... that actually doing the job *properly* is WAY more important than looks. Stupid, IMO!!!!


----------



## newtekie1 (May 24, 2015)

arbiter said:


> If AMD wasn't having heat issues why did they go with a water cooler on the card as reference?



That is my point.


----------



## arbiter (May 24, 2015)

HTC said:


> It would be a different tune if the cooler were inadequate for the job which it clearly isn't. When AMD's 290 family was launched, i voiced my disapointment @ their cooler quite loudly, and i'm sure many did the same: now there's a reference cooler in dire need for a overall update!!!!


You and just about EVERY review site, when they realized that the 290(X) reference cooler was crap. Almost every site got on AMD over it, and rightfully so. You shouldn't need to run the fan near max just to keep advertised clocks, and that was on an open-air bench in a temperature-controlled office to boot, which is not how most people that buy them will be using them.


----------



## buildzoid (May 24, 2015)

@Steevo 

Those are SMD tantalum caps, not inductors. You can tell from the markings: inductors come with an R rating on them; caps don't.


----------



## Solaris17 (May 24, 2015)

newtekie1 said:


> That's an AMD gimmick.



/chuckles To be fair, that has saved some asses when it came to bad flashes. I agree, though.


----------



## newtekie1 (May 24, 2015)

arbiter said:


> You and just about EVERY review site, when they realized that the 290(X) reference cooler was crap. Almost every site got on AMD over it, and rightfully so. You shouldn't need to run the fan near max just to keep advertised clocks, and that was on an open-air bench in a temperature-controlled office to boot, which is not how most people that buy them will be using them.



The AMD cooler actually isn't a bad design. In fact, it is better in raw cooling ability than the nVidia reference cooler. AMD's problem is that their GPUs put out a stupid amount of heat. A 290X puts out a good 50W more heat than the Titan X; that's basically 20% more heat.
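For the record, that 20% figure follows from treating the Titan X's 250W rated board power as its heat output, which is the assumption behind the comparison:

```python
titan_x_heat_w = 250  # Titan X rated board power, assumed fully dissipated as heat
extra_heat_w = 50     # claimed additional heat output of a 290X over the Titan X

print(extra_heat_w / titan_x_heat_w * 100)  # 20.0 (% more heat)
```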


----------



## nunyabuisness (May 24, 2015)

newtekie1 said:


> Have you actually tried any 4K gaming?  I have yet to find a title where 4GB wasn't enough to handle 4K.


Well, I am at 1440p and I ran at 200% sampling, and it definitely fills up in most games, mate!
The Witcher doesn't count anyway; it's a heavily optimized game, I'll give you that. But most games aren't! They are buggy and memory hogs! And it's best to have at least 6-8GB for 4K (the 980 Ti's sweet spot).
I have an Intel i7 and a Titan X in my PC, and my two kids have FX-6300s and 290s, so I'm not a biased person and I want AMD to do well. But it's making some bad decisions.


----------



## Solaris17 (May 24, 2015)

nunyabuisness said:


> Well, I am at 1440p and I ran at 200% sampling, and it definitely fills up in most games, mate!
> The Witcher doesn't count anyway; it's a heavily optimized game, I'll give you that. But most games aren't! They are buggy and memory hogs! And it's best to have at least 6-8GB for 4K (the 980 Ti's sweet spot).
> I have an Intel i7 and a Titan X in my PC, and my two kids have FX-6300s and 290s, so I'm not a biased person and I want AMD to do well. But it's making some bad decisions.



Why do you run at 200% SS at that resolution? The IQ benefit of AA and AF drops off drastically depending on monitor size. Hell, even the still-standard 1080p doesn't need drastic amounts, if any, in most games on 23-24" and smaller monitors.


----------



## HTC (May 24, 2015)

newtekie1 said:


> The AMD cooler actually isn't a bad design. In fact it is better in cooling ability than the nVidia reference cooler. AMD's problem is their GPUs put out a stupid amount of heat. A 290X puts out a good 50w more heat than the Titan-X, that's basically 20% more heat.



Regardless: the fact is it's inadequate (soundwise) for those cards. Have AMD stick it in cards that produce less heat thus "miraculously" making the cooler good.


----------



## arbiter (May 24, 2015)

newtekie1 said:


> The AMD cooler actually isn't a bad design. In fact it is better in cooling ability than the nVidia reference cooler. AMD's problem is their GPUs put out a stupid amount of heat. A 290X puts out a good 50w more heat than the Titan-X, that's basically 20% more heat.



um, where your argument falls apart is that the Nvidia cooler keeps the chip at 80°C most of the time. That's the set point before it backs down the boost, and it mostly holds it. AMD's cooler couldn't even keep theirs from hitting 95°C, so the argument is kind of flawed there; that's 20% hotter. I bet that even if you underclocked a 290X to fit in the 250 W range, the cooler they used couldn't keep that GPU at 80°C; it would probably still hit 95°C.


----------



## Steevo (May 24, 2015)

HTC said:


> Regardless: the fact is it's inadequate (soundwise) for those cards. Have AMD stick it in cards that produce less heat thus "miraculously" making the cooler good.




They have to stay with it for the people who already have hot cases; when you stick in another 200 W of power dissipation with no venting, it literally becomes the oven that kills. Water was the only way out, but even that has its limitations, and a smaller process node wasn't happening for them.


----------



## AsRock (May 24, 2015)

arbiter said:


> um, where your argument falls apart is that the Nvidia cooler keeps the chip at 80°C most of the time. That's the set point before it backs down the boost, and it mostly holds it. AMD's cooler couldn't even keep theirs from hitting 95°C, so the argument is kind of flawed there; that's 20% hotter. I bet that even if you underclocked a 290X to fit in the 250 W range, the cooler they used couldn't keep that GPU at 80°C; it would probably still hit 95°C.



Quality control/finish screwed the reference cooler, but with some love it performed much better.


----------



## newtekie1 (May 24, 2015)

arbiter said:


> um, where your argument falls apart is that the Nvidia cooler keeps the chip at 80°C most of the time. That's the set point before it backs down the boost, and it mostly holds it. AMD's cooler couldn't even keep theirs from hitting 95°C, so the argument is kind of flawed there; that's 20% hotter. I bet that even if you underclocked a 290X to fit in the 250 W range, the cooler they used couldn't keep that GPU at 80°C; it would probably still hit 95°C.



Coolers can only handle up to a certain amount of heat; once you go beyond that, you get thermal runaway and temps climb uncontrolled.  So, the reference nVidia cooler might be able to dissipate 260 W, and the Titan X puts out 245 W, so all is good and the cooler keeps the card at acceptable temps. However, the AMD reference cooler might be capable of dissipating 275 W, but since the 290X puts out 295 W, the cooler can't dissipate all the heat the 290X is generating and temps go out of control.
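For what it's worth, that headroom argument can be sketched as a toy model. The wattages are the hypothetical figures from the post above, not measured specs, and the thermal constants (`thermal_mass`, the 60°C delta-T scaling) are made up purely for illustration:

```python
# Toy thermal model (hypothetical numbers, not real specs): the cooler
# removes heat in proportion to how far the die sits above ambient,
# saturating at its maximum dissipation. If the GPU's output exceeds
# that cap, the die temperature keeps climbing until the card throttles.

def simulate(gpu_watts, cooler_max_watts, throttle_c=95.0,
             ambient_c=25.0, thermal_mass=50.0, steps=600):
    """Return (final_temp_c, throttled) after `steps` time steps."""
    temp = ambient_c
    for _ in range(steps):
        # Heat removed grows with delta-T, capped at the cooler's limit.
        removed = min(cooler_max_watts,
                      cooler_max_watts * (temp - ambient_c) / 60.0)
        temp += (gpu_watts - removed) / thermal_mass
        if temp >= throttle_c:
            return temp, True   # thermal runaway -> card throttles
    return temp, False          # settled below the throttle point

print(simulate(245, 260))  # "Titan X-like": output below cooler cap, settles
print(simulate(295, 275))  # "290X-like": output above cooler cap, throttles
```

With these made-up constants, the first card settles in the low 80s while the second climbs past the throttle point, which is exactly the "thermal runaway" behavior described above.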


----------



## haswrong (May 24, 2015)

seems like a cut down card.. so lets say for $200 i could be persuaded to buy.. otherwise let me climb the fiji hill..


----------



## GhostRyder (May 24, 2015)

arbiter said:


> um, where your argument falls apart is that the Nvidia cooler keeps the chip at 80°C most of the time. That's the set point before it backs down the boost, and it mostly holds it. AMD's cooler couldn't even keep theirs from hitting 95°C, so the argument is kind of flawed there; that's 20% hotter. I bet that even if you underclocked a 290X to fit in the 250 W range, the cooler they used couldn't keep that GPU at 80°C; it would probably still hit 95°C.


First of all, you have no idea what you're talking about in regards to the reference cooler.  I have 3 cards that had the reference cooler; you can keep the temps below 95°C in a normal case with normal airflow at the 55% fan speed setting, even under gaming load (yes, reviewers had some trouble keeping temps down, we are aware of that, but a patch among other things helped out).  Bump it to 60% and it's in the mid 80s. I own these cards, and ALL THREE of them acted the same, with one of them being roughly 1 degree higher.



arbiter said:


> If AMD wasn't having heat issues why did they go with a water cooler on the card as reference?


Probably because they went to the next level to give people a quiet reference cooler.  Not much more could easily be done with an air cooler, so what's next, short of making a cooler that dumps air into the case?

To top it off, from what I've read the Titan X even now throttles a bit, so it's not the perfect cooler...

Anyways, the 980 Ti's Titan-style cooler will be replaced by aftermarket designs, same as always.  The reference cooler does its job fine, but it is getting a bit dated.  Though it still looks nice to me!


----------



## HTC (May 24, 2015)

newtekie1 said:


> Coolers can only handle up to a certain amount of heat; once you go beyond that, you get thermal runaway and temps climb uncontrolled.  So, the reference nVidia cooler might be able to dissipate 260 W, and the Titan X puts out 245 W, so all is good and the cooler keeps the card at acceptable temps. However, the AMD reference cooler might be capable of dissipating 275 W, but since the 290X puts out 295 W, the cooler can't dissipate all the heat the 290X is generating and temps go out of control.



The prob is that you aren't thinking of differences in ambient temps: the very same card (i mean literally the same) can behave differently with different ambient temps. Even if the card "gets away with it" @ some ambient temps doesn't mean it will if the ambient temps are ... say ... 8º - 10º higher. The difference between summer and winter ambient temps is bigger than that, and the card must be able to function properly in either one.


----------



## kiddagoat (May 24, 2015)

I dunno wtf you all are on about the AMD stock cooler..... this guy owns a Sapphire 290 reference...... I flashed it with a Tri-X OC BIOS ..... card never gets over 75C......   I haven't taken it apart or anything.... And yes I do use MSI Afterburner to run a custom fan profile.... 

that 50% fan speed is alright depending on chassis configuration and your ambient... you open it up to about 65%-70% and it works just fine....  I have headphones and I still don't hear the card.... I use my 2.1 Klipsch Pro Media... still don't hear the card..

I think some people just like to be anal and cling to every little "negative" thing about a product....  weighing them all equally.....  

I have owned both AMD and Nvidia cards..... just get whatever is better for money at the time.....  I mean at least AMD doesn't release a driver that smokes their cards by turning the fans off or not letting them work properly then says it is your fault for installing the driver.  

Just saying... both sides have blows against them.... damn fanboys.... 

All the more reason.... if you are just going off reviews and don't actually own the product... your opinion should actually hold less weight....  Not every configuration is the same.... even if all the hardware is the same revision, BIOS, and all that nitty gritty stuff.... it is going to have some uniqueness to it...


----------



## HumanSmoke (May 24, 2015)

haswrong said:


> seems like a cut down card.. so lets say for $200 i could be persuaded to buy.. otherwise let me climb the fiji hill..


Wow, you really are rooting for AMD to die aren't you?

Just for the sake of argument - I'll spend a couple of minutes evaluating the scenario you put all of ten seconds putting some thought into...just for laughs...

A GTX 980 Ti for $200, makes a mainstream 960 what? $50?...a 750Ti..$20?
Nvidia might be able to absorb those losses for a while with $4.8 billion in cash and short term securities, but AMD with their nosediving assets, maybe not so much. The company isn't sustainable long term at its current level, and you're hoping they breach the $600m barrier in short order whereby the company is untenable as a going concern. If you're expecting Fiji to dig them out of the hole, I have news for you. The mainstream volume markets are where the revenue really rolls in. In your scenario, AMD are combatting Nvidia cards priced at pocket change with a slew of rebranded cards - some of which may not even support all of AMD's marketing features.


----------



## newtekie1 (May 24, 2015)

HTC said:


> The prob is that you aren't thinking of differences in ambient temps: the very same card (i mean literally the same) can behave differently with different ambient temps. Even if the card "gets away with it" @ some ambient temps doesn't mean it will if the ambient temps are ... say ... 8º - 10º higher. The difference between summer and winter ambient temps is bigger than that, and the card must be able to function properly in either one.



You are absolutely correct, but I'm talking in terms of all other factors being the same, the only difference being the coolers.  The AMD cooler is the better performing cooler, but even still it just can't keep up with the heat load from the 290X GPU.


----------



## rtwjunkie (May 24, 2015)

HTC said:


> The prob is that you aren't thinking of differences in ambient temps: the very same card (i mean literally the same) can behave differently with different ambient temps. Even if the card "gets away with it" @ some ambient temps doesn't mean it will if the ambient temps are ... say ... 8º - 10º higher. The difference between summer and winter ambient temps is bigger than that, and the card must be able to function properly in either one.



It works great, summer or winter, for me with a very progressive fan profile using all 8 allowed points on a 780.  The one game that has made it run the hottest in 2 years is The Witcher 3.  For that, the GPU temp reaches 70 Celsius at 65% fan speed.  

Even at that speed, I have to look through the window sometimes at the lighted GeForce logo, whose intensity is tied to fan speed, because it's the quietest thing in my case.  So, in addition to being great to look at, rigid enough to support the PCB, and quiet, it is excellent in its cooling.


----------



## HTC (May 24, 2015)

rtwjunkie said:


> It works great, summer or winter, for me with a very progressive fan profile using all 8 allowed points on a 780.  The one game that has made it run the hottest in 2 years is The Witcher 3.  For that, the GPU temp reaches 70 Celsius at 65% fan speed.
> 
> Even at that speed, I have to look through the window sometimes at the lighted GeForce logo, whose intensity is tied to fan speed, because it's the quietest thing in my case.  So, in addition to being great to look at, rigid enough to support the PCB, and quiet, it is excellent in its cooling.



That's not what i was trying to say: the reason i mentioned seasons @ all was to refer their difference, temp wise.

Please look @ this page: http://www.techpowerup.com/reviews/AMD/R9_290X/30.html

Now: imagine the ambient temp was ... say ... 8º C higher: how do you think the results would be then?

Not everyone lives in moderate climate areas, dude: the card should be able to perform as advertised in Sweden's winter as well as in Ethiopia's summer. Clearly, this card does not, even in moderate climate (referring to the reference model only), since it throttles so damn much: the cooler isn't adequate for the job.


----------



## RejZoR (May 24, 2015)

newtekie1 said:


> That's an AMD gimmick.



Gimmick? It's an enthusiast-grade feature sent from the heavens. You can flash Radeons pretty much absolutely carelessly and you can't go wrong, short of frying them. That switch does wonders...


----------



## Vayra86 (May 24, 2015)

btarunr said:


> 512 GB/s, not 128 GB/s. Funny how 4 GB wasn't a problem during "high-end" GTX 980 launch, just a few months ago.



We all knew that was GM204 and not the big chip, right?

This is a big chip, and 6GB is on the low end for this price/performance range. Consider the SLI options as well. For now 6GB may be enough, but you can go a long way with this card especially in SLI, and I can imagine the VRAM won't cut it then anymore.


----------



## rtwjunkie (May 24, 2015)

@HTC I live in a very extreme climate, which here lasts 8 months of the year.  Very hot, very humid.  We have two seasons: Super Hot, which severely affects ambient indoor temps (air conditioning can't keep up), and "Almost Fall/Spring" (which would be a summer day up North).  

That's why I responded originally, since I definitely am not in a mild climate.  I do agree though, any cooler should be able to work anywhere.


----------



## L.ccd (May 24, 2015)

Regarding the reference cooler discussion: for what it's worth, not all recent nvidia reference blowers are the same. The 980 uses a downgraded version (no vapor chamber) of the high-end blower used on the 780, 780ti, Titan and Titan X. Probably good enough for the gm204, but I'm curious to see if they'll go back to the better variant on the 980ti.


----------



## rtwjunkie (May 24, 2015)

It certainly looks like a vapor chamber heatsink in there.  What are they using? I see no heatpipes.


----------



## L.ccd (May 24, 2015)

This is what is used on the reference 980: http://www.hardware.fr/articles/928-5/specifications-geforce-gtx-980-reference.html (also noted in TPU's own 980 review).

Given the fact that the 980ti will use the gm200 and not be cheap, one can expect nvidia to go back to the vapor chamber version.


----------



## newtekie1 (May 24, 2015)

RejZoR said:


> Gimmick? It's an enthusiast-grade feature sent from the heavens. You can flash Radeons pretty much absolutely carelessly and you can't go wrong, short of frying them. That switch does wonders...


You make it sound like it is a game-changing feature.  It's not.  Almost no one flashes the BIOS on their video cards. Sure, enthusiasts do it more often, but like I said, that's a fraction of a percent of the overall market.

And heck, a lot of the aftermarket cards omit the dual BIOS.


----------



## radrok (May 24, 2015)

newtekie1 said:


> You make it sound like it is a game-changing feature.  It's not.  Almost no one flashes the BIOS on their video cards. Sure, enthusiasts do it more often, but like I said, that's a fraction of a percent of the overall market.
> 
> And heck, a lot of the aftermarket cards omit the dual BIOS.



Completely agree on this. 

I also flashed my current GPUs like 10+ times and it's always been a success, if you know what you are flashing and HOW you have to do it then there's like 0.001% risk.


----------



## ensabrenoir (May 24, 2015)

haswrong said:


> seems like a cut down card.. so lets say for $200 i could be persuaded to buy.. otherwise let me climb the fiji hill..




.........you must be george's cousin.  If you got your wish though.....this level of performance for $200 would make Amd's (and most of Nvidia's) entire line up irrelevant and hasten their demise.


----------



## HTC (May 24, 2015)

rtwjunkie said:


> @HTC I live in a very extreme climate, which here is 8 months of the year.  Very hot, very humid.  We have two seasons: Super Hot, which severely affects ambient indoor temps (air conditioning cant keep up), and "Almost Fall/Spring" (which would be a summer day up North.
> 
> Thats why I responded originally, since i definately am not in a mild climate.  I do agree though, any cooler should be able to work anywhere.



Exactly. Now: enough of the side topic (reference cooler). I'm to blame for part of that, anyways ...


This is a 980 Ti card topic: we should be discussing that instead!


----------



## FrustratedGarrett (May 24, 2015)

newtekie1 said:


> Reference cards always look boring.
> 
> And we really don't know how much HBM will help the 390X, AMD has had double the memory bandwidth of nVidia in the last generation and couldn't manage to outperform nVidia.  So obviously just piling on more memory bandwidth isn't the solution to their problems.



HBM improves read/write access latencies dramatically, which is a key factor in improving GPU performance. I think that's why AMD decided to use HBM in its Fiji-based cards and go through this long delay. 



newtekie1 said:


> I'm also not quite sure why people expect it to offer a better value.  If it really does outperform nVidia you can bet it won't offer a better value in any way, AMD will price it just as high as the Titans(they've done it in the past).  If it doesn't outperform nVidia, then AMD will price the 390X right in line with what it matches from nVidia, just like they currently do.



Nah, AMD will probably price it around $550. Their dual-GPU 295X2 is selling for less than $800.


----------



## the54thvoid (May 24, 2015)

FrustratedGarrett said:


> Nah, AMD will probably price it around $550. Their dual-GPU 295X2 is selling for less than $800.



The 295X2 is priced that way because of poor sales. The wealth of internet rumours (and they are but rumours) has pretty much pointed to a higher price than you anticipate.  You have to remember that the 7970 came out at a surprisingly high price for AMD.  This Fiji card carries new technology that has cost a lot to pursue.  Fiji will not be cheap.


----------



## rtwjunkie (May 24, 2015)

HTC said:


> Exactly. Now: enough of the side topic (reference cooler). I'm to blame for part of that, anyways ...
> 
> 
> This is a 980 Ti card topic: we should be discussing that instead!


Absolutely! Onward.


----------



## BiggieShady (May 24, 2015)

Leaked benches claim that at stock it's on par with the Titan X, and factory boost clocks could push it up by 20%. Soon it will be time to read about all those Strix, G1 and Gaming model announcements again. I kinda know what to expect on the green side, and for once I'm interested in what AMD will show memory-, heat- and power-consumption-wise.


----------



## the54thvoid (May 24, 2015)

BiggieShady said:


> *Leaked benches* claim that at stock it's on par with the Titan X, and factory boost clocks could push it up by 20%. Soon it will be time to read about all those Strix, G1 and Gaming model announcements again. I kinda know what to expect on the green side, and for once I'm interested in what AMD will show memory-, heat- and power-consumption-wise.



Do share!


----------



## BiggieShady (May 24, 2015)

the54thvoid said:


> Do share!


LOL, it's the Firestrike benches you originally shared: http://www.overclock3d.net/articles/gpu_displays/nvidia_gtx_980ti_specs_and_performance_leaked/1
What goes around...


----------



## radrok (May 24, 2015)

Like I said, 1 or 2 SMMs do not make any noticeable difference unless you are benching.

What's gonna make the difference is the availability of higher-voltage-enabled SKUs for the 980 Tis; that alone kills the Titan X, which will be reference-only. That's stupid from Nvidia, but hey, they do what they want.

The only advantages the Titan X has are being a full-fledged core, which may appeal to some just for that reason, and the 12 GB of VRAM, which is immensely oversized for now.


----------



## net2007 (May 25, 2015)

12 GB is not too much considering games in 4K are exceeding 6 GB. In my opinion the 980 Ti should come with 8 GB.


----------



## chinmi (May 25, 2015)

totally gonna upgrade my obsolete gtx 970. thank god i waited... getting tired of the 3.5 GB + 0.5 GB VRAM. 
and this card's gonna answer my prayer..... thanks nvidia


----------



## 64K (May 25, 2015)

GhostRyder said:


> First of all, you have no idea what you're talking about in regards to the reference cooler.  I have 3 cards that had the reference cooler; you can keep the temps below 95°C in a normal case with normal airflow at the 55% fan speed setting, even under gaming load (yes, reviewers had some trouble keeping temps down, we are aware of that, but a patch among other things helped out).  Bump it to 60% and it's in the mid 80s. I own these cards, and ALL THREE of them acted the same, with one of them being roughly 1 degree higher.
> 
> 
> Probably because they went to the next level to give people a quiet reference cooler.  Not much more could easily be done with an air cooler, so what's next, short of making a cooler that dumps air into the case?
> ...



From the review here, the Titan X does downclock quickly under load. The cooler had trouble keeping the GPU under 84 degrees, which is the point where it cuts back on the boost. That was disappointing for 2 reasons imo: the downclocking obviously, and setting it to kick in at 84 degrees. You can run a higher temp on GPUs if performance is top priority. I don't know why Nvidia set it to downclock at 84 degrees.

It's just a guess on my part, but I think a 980 Ti, even though it will have fewer cores than the Titan X, will possibly outperform the Titan X with a great non-reference cooler.

If it comes in at $650 or less then it will sell very well. That was the price for the GTX 780 at release.


----------



## Prima.Vera (May 25, 2015)

net2007 said:


> 12 GB is not too much considering games in 4K are exceeding 6 GB. In my opinion the 980 Ti should come with 8 GB.


Show me the proof for only 1 game that exceeds 6GB. Just one!


----------



## radrok (May 25, 2015)

Prima.Vera said:


> Show me the proof for only 1 game that exceeds 6GB. Just one!



It's COD:AW, and it's because it's poorly coded; it leaves loaded memory in the frame buffer even when it no longer utilizes it. It has been shown that 3 GB cards play it without an issue.


----------



## 64K (May 25, 2015)

radrok said:


> It's COD:AW, and it's because it's poorly coded; it leaves loaded memory in the frame buffer even when it no longer utilizes it. It has been shown that 3 GB cards play it without an issue.



That's the thing that I wonder about when I see VRAM usage going ridiculously high. Is the game loading up VRAM just because it's there or does it really improve performance? The only way to know is to have 2 identical cards except more VRAM in one and test the same part of the game on both to see if performance is better on the card with the greater VRAM. I'm leaning towards the side that it doesn't improve performance in most cases because developers are notorious for wasting resources on the PC version of the game.


----------



## GhostRyder (May 25, 2015)

64K said:


> From the review here the Titan X does down clock quickly under load. The cooler had trouble keeping the GPU under 84 degrees. That's the point when it cuts back on the boost. That was disappointing for 2 reasons imo. The down clocking obviously and setting it to do so at 84 degrees. You can run a higher temp on GPUs if performance is top priority. I don't know why Nvidia set it to down clock at 84 degrees.
> 
> It's just a guess on my part but I think a 980 Ti even though it will have less cores than the Titan X with a great non reference cooler will possibly outperform the Titan X.
> 
> If it comes in at $650 or less then it will sell very well. That was the price for the GTX 780 at release.


I agree, it's kind of a low point for the throttle, but it's what they chose.  The 980 Ti might just outperform it by a good amount, depending on whether they allow for higher clocking, especially on Classified or Lightning versions of the card.

As far as the price goes, I am betting it's going to be in the $750 area based on its performance point.  At $150 more than the 980, such a hike in performance would possibly overshadow the 980 at its price point (IMHO).



radrok said:


> It's COD:AW, and it's because it's poorly coded; it leaves loaded memory in the frame buffer even when it no longer utilizes it. It has been shown that 3 GB cards play it without an issue.


Or they did it to pretend their game needs that much


----------



## TheinsanegamerN (May 25, 2015)

radrok said:


> It's COD:AW, and it's because it's poorly coded; it leaves loaded memory in the frame buffer even when it no longer utilizes it. It has been shown that 3 GB cards play it without an issue.


interestingly, COD: AW, DR3, and SoM all go past 4GB. SoM is getting close to 6GB, and Watch Dogs, Ryse, FC4, and AC Unity are close to 4GB. Given another year, 4GB probably isn't going to be enough. Even 6GB may not be enough, if SoM is any indication.


----------



## net2007 (May 25, 2015)

chinmi said:


> totally gonna upgrade my obsolete gtx 970. thank god i waited... getting tired of the 3.5 GB + 0.5 GB VRAM.
> and this card's gonna answer my prayer..... thanks nvidia



Proof? Do you even play on 4k?


----------



## Casecutter (May 25, 2015)

ZoneDymo said:


> The original maybe, now its just the same damn cooler every time.
> Very generic and repetative


At least this time they figured they would sell enough that they could tool a cover to add "Ti" on this one.  With the Titan they just reused the old Titan shroud, figuring crinkle paint would gussy it up enough.

And on the Titan X this cooler appears to be pushed to its limit: it caused throttling, and it's similar in dBA to the quiet mode of the reference 290X.  The Ti, having 6 GB, should do better heat-wise, as from W1zzard's thermals it seems the heat off the memory saturated the PCB and to some extent placed stress on the GPU itself.


----------



## HumanSmoke (May 25, 2015)

You can add the latest poster boy, GTA V to the list. Here's a little vid showing vRAM usage at 4K


----------



## radrok (May 26, 2015)

HumanSmoke said:


> You can add the latest poster boy, GTA V to the list. Here's a little vid showing vRAM usage at 4K



Last time I used Afterburner with a Radeon card, it gave me the sum of the two cards' video memory when running a CFX configuration; I wouldn't be surprised if this was the same case.


----------



## net2007 (May 26, 2015)

radrok said:


> Last time I used afterburner with a Radeon card it gave me the sum of the two cards video memory when running a CFX configuration, I wouldn't be surprised if this was the same case.


Are you serious? Is that with DX12? This is something I'm most looking forward to seeing, besides the other interesting technologies associated with DX12. I really want to know about VRAM stacking, because this could open up the door for so much potential. As of right now you wouldn't even really have to buy a 980 Ti in order to play 4K at 60 frames a second; you could get away with buying 3 970s. There are definitely some interesting possibilities there


----------



## xenocide (May 26, 2015)

Witcher 3 uses under 2GB of vRAM even on 4K.  I assume other devs are just not doing a good job coding and being very liberal with their vRAM allocation.  I'll also add that vRAM has never been a legitimate bottleneck--just because you max out the vRAM on your card doesn't mean any performance is actually lost.


----------



## net2007 (May 26, 2015)

xenocide said:


> Witcher 3 uses under 2GB of vRAM even on 4K.  I assume other devs are just not doing a good job coding and being very liberal with their vRAM allocation.  I'll also add that vRAM has never been a legitimate bottleneck--just because you max out the vRAM on your card doesn't mean any performance is actually lost.




There is performance lost when you use up all the VRAM.  Crysis 2, for example: shooting the water resulted in 1 FPS


----------



## xorbe (May 26, 2015)

net2007 said:


> 12 GB is not too much considering games in 4K are exceeding 6 GB. In my opinion the 980 Ti should come with 8 GB.



And you propose that be done how with a 384-bit bus?


----------



## net2007 (May 26, 2015)

xorbe said:


> And you propose that be done how with a 384-bit bus?


I don't propose anything. I'm just saying 6 GB is not enough, but it will have to make do, I guess.


----------



## xenocide (May 26, 2015)

net2007 said:


> There is performance lost when you use up all the Vram.  Crysis 2 for example, shooting the water resulted in 1 fps


 
Crysis 2 also tessellated an entire ocean under the visible game world.  Not exactly the pinnacle of great design choices.  That's one where I'll assume the problem was on the developer end.  Also, why would shooting water be affected directly by filled VRAM?  That sounds more like it has to do with the way the game handled water than with filled VRAM.


----------



## rodneyhchef (May 26, 2015)

Since this is going to be a cut-down TITAN X - will it have the same memory allocation controversy as the GTX 970 I wonder?

I'd be interested in one of these if the price is right, needs to be cheaper than 2x970s though.


----------



## 64K (May 26, 2015)

xenocide said:


> Witcher 3 uses under 2GB of vRAM even on 4K.  I assume other devs are just not doing a good job coding and being very liberal with their vRAM allocation.  I'll also add that vRAM has never been a legitimate bottleneck--just because you max out the vRAM on your card doesn't mean any performance is actually lost.



More VRAM does nothing for performance unless the game actually needs it. If the game does need it and you don't have it, the engine starts using system RAM, which is slower, so performance takes a hit.


----------



## net2007 (May 26, 2015)

64K said:


> More VRAM does nothing for performance unless the game actually needs it. If the game does need it and you don't have it, the engine starts using system RAM, which is slower, so performance takes a hit.


that's the thing though, games are getting more demanding, especially at higher resolutions


----------



## 64K (May 26, 2015)

net2007 said:


> that's the thing though, games are getting more demanding, especially at higher resolutions



Agreed.


----------



## newtekie1 (May 26, 2015)

HumanSmoke said:


> You can add the latest poster boy, GTA V to the list. Here's a little vid showing vRAM usage at 4K



It's really hard to see because his video is so blurry and poorly recorded (seriously, he has Afterburner installed; why isn't he just using that to capture video?), but anyway, he has MSAA on.  I'm guessing 8x MSAA.  Turning MSAA on when running at 4K is stupid and will use stupid amounts of vRAM.


----------



## net2007 (May 26, 2015)

Far Cry 4 in 4K uses 6 GB easy, not to mention Shadow of Mordor.


----------



## xenocide (May 27, 2015)

net2007 said:


> Far Cry 4 in 4K uses 6 GB easy, not to mention Shadow of Mordor.


 
And Witcher 3 uses less than 2GB in 4K.  It seems more likely that these companies were being very negligent when it came to resource allocation than anything.  I mean shit, CoD: AW uses as much vRAM as it has access to, not because it's demanding but because the devs didn't care to optimize it.  As resolution goes up developers will just have to be more careful with how they allocate resources.

EDIT:  Even the consoles are limited to like 5GB-6GB of RAM, so I imagine not many games will use drastically more than 6GB.  6GB is plenty these days.


----------



## net2007 (May 27, 2015)

If this Windows 10 tech preview wasn't messing up, I would post screens of VRAM usage right now. I haven't even had a chance to look.


----------



## xorbe (May 27, 2015)

xenocide said:


> And Witcher 3 uses less than 2GB in 4K.  It seems more likely that these companies were being very negligent when it came to resource allocation than anything.  I mean shit, CoD: AW uses as much vRAM as it has access to, not because it's demanding but because the devs didn't care to optimize it.  As resolution goes up developers will just have to be more careful with how they allocate resources.
> 
> EDIT:  Even the consoles are limited to like 5GB-6GB of RAM, so I imagine not many games will use drastically more than 6GB.  6GB is plenty these days.



VRAM is technically wasted if not put to use caching things.  So, good on CoD:AW for aggressive caching if the VRAM is there.  It's not inefficient, it's opportunistic.
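That "opportunistic, not inefficient" idea can be sketched in a few lines. This is a purely illustrative texture cache (the names, sizes, and structure are invented; it's not how CoD:AW or any real engine actually works): keep already-uploaded textures resident as long as free VRAM exists, and only truly evict under memory pressure.

```python
# Sketch of opportunistic VRAM caching (illustrative only): evicted-but-
# cached textures make VRAM usage read as "full" without costing anything,
# because they can be dropped the moment something else needs the space.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb        # VRAM available for caching, in MB
        self.used = 0
        self.lru = OrderedDict()       # name -> size_mb, oldest entry first

    def load(self, name, size_mb):
        if name in self.lru:           # cache hit: no re-upload needed
            self.lru.move_to_end(name)
            return "hit"
        # Evict least-recently-used entries only under actual pressure.
        while self.used + size_mb > self.budget and self.lru:
            _, evicted_size = self.lru.popitem(last=False)
            self.used -= evicted_size
        self.lru[name] = size_mb
        self.used += size_mb
        return "miss"

cache = TextureCache(budget_mb=6144)   # e.g. a 6 GB card
cache.load("level1/rocks", 512)        # miss: texture uploaded
cache.load("level1/rocks", 512)        # hit: the "full" VRAM was cache
```

Under this scheme a monitoring tool would report high VRAM usage even though most of it is droppable cache, which is one reason raw usage numbers overstate what a game actually *needs*.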


----------

