Tuesday, May 26th 2015

NVIDIA GeForce GTX 980 Ti Clock Speeds Revealed

NVIDIA's upcoming GeForce GTX 980 Ti graphics card is shaping up to be the "almost Titan X for two-thirds the price" product the company wants out in the market. A leaked GPU-Z screenshot of the card, posted by Korean tech publication HardwareBattle (the same site that broke the card's core configuration), reveals its reference clock speeds. All the values displayed by GPU-Z 0.8.2 in the screenshot are read from the hardware rather than from an internal lookup table (the LUT-based fields are grayed out, because version 0.8.2 lacks entries for the GTX 980 Ti). The card offers clock speeds similar to those of the GTX Titan X: the core is clocked at 1000 MHz, with a maximum GPU Boost frequency of 1076 MHz (1089 MHz on the GTX Titan X), while the memory ticks at 7012 MHz (GDDR5-effective).
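For context, the quoted memory clock together with the card's 384-bit memory interface implies Titan X-class bandwidth. A quick back-of-the-envelope sketch, using only figures from the leak:

```python
# Sanity check of the leaked clocks: GDDR5-effective data rate times bus
# width (in bytes) gives the theoretical memory bandwidth.
mem_effective_mts = 7012        # MT/s (the "7012 MHz GDDR5-effective" figure)
bus_width_bits = 384

bandwidth_gbs = mem_effective_mts * (bus_width_bits // 8) / 1000
print(f"{bandwidth_gbs:.1f} GB/s")  # 336.6 GB/s, identical to the GTX Titan X
```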

From our older article, it's known that the GTX 980 Ti will feature a lower CUDA core count of 2,816, compared to 3,072 on the GTX Titan X. The TMU count is proportionately lower, at 176. The ROP count is a bigger mystery than Nessie. The card features 6 GB of GDDR5 memory across a 384-bit wide memory interface. While the reference board design is beginning to look dated, NVIDIA will allow its AIC (add-in card) partners to launch custom-design boards, factory-overclocked to kingdom come, from day one. The GeForce GTX 980 Ti is expected to launch on the sidelines of Computex 2015, in the first week of June.
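The "proportionately lower" TMU count checks out: Maxwell pairs 8 TMUs with each 128-core SMM, so the core count alone determines the TMU figure. A minimal sketch:

```python
# On Maxwell, each SMM is a block of 128 CUDA cores with 8 TMUs attached,
# so the TMU count scales directly with the enabled SMM count.
gtx_980ti_cores = 2816

smms = gtx_980ti_cores // 128    # 22 SMMs
expected_tmus = smms * 8
print(smms, expected_tmus)       # 22 176 -- matches the leaked 176 TMUs
```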
Sources: HardwareBattle, VideoCardz

75 Comments on NVIDIA GeForce GTX 980 Ti Clock Speeds Revealed

#51
HumanSmoke
the54thvoidA lot of folk rubbish brands, but it's really an availability heuristic. I've never had a card fail; my most recent AMD cards were 5850s and 7970s, and they didn't fail either. I'm sure you could browse selectively, but this is the first thing I came across:

www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/#NVIDIAFailureRates

No mention of specific brands, but the general finding is that AMD cards of late actually fare worse.
Hardware.fr publishes return rates every six months from one of Europe's biggest e-tailers (see other editions on the site for a 3-4 year overview). The minimum sample size is fairly large. It makes for sobering reading for those AMD cheerleaders taking potshots at EVGA.

www.hardware.fr/articles/934-5/cartes-graphiques.html

Bear in mind that these are total returns. Failures, buyer's remorse, and customer dissatisfaction (coil whine, overly high expectations, etc.) are all included.
My personal experience with EVGA has been exemplary: 8800U (3), GTX 280 (2), GTX 580, GTX 780 - all overclocked, no complaints.
Posted on Reply
#52
wickedcricket
RejZoRIs it just me, or is NVIDIA desperately trying to steer attention away from the Radeon R9-390? Does anyone even care about the 980 Ti at the moment? Especially with the R9-390 being right around the corner...
You mean the refreshed GPUs they're rebranding and going to sell under the RX 300 flag, the same ones that have been right around the corner for years?

AMD6658.1 AMD Radeon R7 360 Graphics Bonaire XTX = Radeon R7 260X
AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT = Radeon R9 290X
AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO = Radeon R9 290
AMD6810.1 AMD Radeon R7 370 Graphics Curacao XT = Radeon R9 270X
AMD6810.2 AMD Radeon R7 300 Series Curacao XT = Radeon R9 270X
AMD6811.1 AMD Radeon R7 300 Series Curacao PRO = Radeon R9 270
AMD6939.1 AMD Radeon R9 300 Series Tonga PRO = Radeon R9 285
Posted on Reply
#53
RejZoR
Maybe because all AMD cards with GCN already fully support DX12, and AMD can actually afford to do that - unlike NVIDIA, which can't even say that of last-gen Maxwell... Just sayin'

I'm not saying that's the best way to do things from a consumer perspective, but if the prices are right, no one really cares in the end. If people can get a rebranded R9-290X for around €200, I think many will grab it. After all, this card still takes on the GTX 970 despite its age.

Besides, don't be daft and assume they'll rebrand the same GPUs for a third time. The R9-370X will never be based on the R9-270X; it will most likely be based on an R9-280X variant (most likely the R9-285X).
Posted on Reply
#54
EarthDog
LOL, 'let their board partners overclock it until kingdom come'...

... don't you mean until you reach NVIDIA's abhorrently low power limit? :p
Posted on Reply
#55
Prima.Vera
This is interesting.
So compared to a 780 Ti, it has double the VRAM and double the pixel fill rate (really??), but 64 fewer shaders and approximately the same texture fill rate and memory bandwidth.
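For what it's worth, the doubling is plausible on paper: theoretical pixel fill rate is ROPs times core clock, and the rumored 96-ROP configuration (the ROP count is still unconfirmed, as the article notes) at 1000 MHz lands well above the 780 Ti's 48 ROPs at 875 MHz. A rough sketch, treating the 96-ROP figure as an assumption:

```python
# Theoretical pixel fill rate = ROPs * core clock. The 980 Ti's 96 ROPs
# are an assumption (the count is unconfirmed); the 780 Ti numbers are
# its reference specs.
def fillrate_gpixel_s(rops, core_clock_mhz):
    return rops * core_clock_mhz / 1000

gtx_780ti = fillrate_gpixel_s(48, 875)    # 42.0 GPixel/s
gtx_980ti = fillrate_gpixel_s(96, 1000)   # 96.0 GPixel/s
print(gtx_780ti, gtx_980ti)
```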
Posted on Reply
#56
EarthDog
I am guessing (hoping) that texture fill rates and memory bandwidth are not saturated, hence the choice?
Posted on Reply
#57
Shigawire
RejZoRIs it just me, or is NVIDIA desperately trying to steer attention away from the Radeon R9-390? Does anyone even care about the 980 Ti at the moment? Especially with the R9-390 being right around the corner...
I think you could be right. The truth is, with DirectX 12 looming on the horizon, NVIDIA will lose its long-held competitive edge over AMD, and there will be a lot more competition. DX12, beyond being a godsend for developers and gamers, is also an equalizer between the GPU giants.
All in all, good times are coming for PC gamers, performance-wise. With DX12, there are no more issues with providing SLI support in games, and we can for the first time consider SLI VR, with split-frame rendering a realistic prospect.
Posted on Reply
#58
Casecutter
wickedcricketYou mean the refreshed GPUs they're rebranding and going to sell under the RX 300 flag, the same ones that have been right around the corner for years?

AMD6658.1 AMD Radeon R7 360 Graphics Bonaire XTX = Radeon R7 260X
AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT = Radeon R9 290X
AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO = Radeon R9 290
AMD6810.1 AMD Radeon R7 370 Graphics Curacao XT = Radeon R9 270X
AMD6810.2 AMD Radeon R7 300 Series Curacao XT = Radeon R9 270X
AMD6811.1 AMD Radeon R7 300 Series Curacao PRO = Radeon R9 270
AMD6939.1 AMD Radeon R9 300 Series Tonga PRO = Radeon R9 285
What I find odd in that derived list is how they devalued the X70-class card to just an "R7". This whole model-matrix info is really "corrupted" compared to what we knew previously. I'm not putting much stock in that matrix; heck, this could simply be back-filling spots for OEM and mobile parts, with the new discrete aftermarket stuff being a 4XX series... I doubt the last driver that offered these tell-tale tidbits is even the reviewers' driver for the release; AMD probably won't see a need to post that until the reveal at Computex.
The crystal balls are quite anxious about all this, making anything they can out of it.
Posted on Reply
#59
RejZoR
Not to mention that in DX12, older-generation Radeons (like the R9-290) are serious competition even against the latest GeForce cards...

Though I hope multi-card setups will become more user-friendly than the current hacked-together options with game profiles. It's one of the reasons I've always strictly used just one GPU: it's guaranteed problem-free. No one can say the same for any SLI or CrossFire setup.

Also, stacking more GPUs on a single card should be a lot easier with DX12...
Posted on Reply
#60
Shigawire
RejZoRNot to mention that in DX12, older-generation Radeons (like the R9-290) are serious competition even against the latest GeForce cards...

Though I hope multi-card setups will become more user-friendly than the current hacked-together options with game profiles. It's one of the reasons I've always strictly used just one GPU: it's guaranteed problem-free. No one can say the same for any SLI or CrossFire setup.

Also, stacking more GPUs on a single card should be a lot easier with DX12...
Absolutely right. I have a twin-980 SLI setup, but even that has (at times) mediocre SLI support; it all depends on the game I'm playing. But with DX12, at least I know that SLI support will be near flawless and NOT wonky like it's been hitherto.

Under DX12, AMD's already strong flagships will suddenly find strength their users never knew they had. DX12 not only equalizes the playing field between the GPU giants, it also prolongs the lifetime of each gamer's graphics card! I'm so excited for this change, it's hard not to get carried away. ;)
Posted on Reply
#61
RejZoR
AMD doesn't exactly like it (since it won't sell more of their new cards), but gamers who have R9-290s will be super happy.
Posted on Reply
#62
haswrong
chinmiSo compared to the Titan X, this card is probably 35% cheaper and 10% slower? Nice...
It's definitely not nice... for 4K, I need something faster, not slower, and 50% cheaper, not 35%...
Posted on Reply
#63
EarthDog
Gotta pay to play 4K, friend. ;)
Posted on Reply
#64
lukesky
HumanSmokeHardware.fr publishes return rates every six months from one of Europe's biggest e-tailers (see other editions on the site for a 3-4 year overview). The minimum sample size is fairly large. It makes for sobering reading for those AMD cheerleaders taking potshots at EVGA.

www.hardware.fr/articles/934-5/cartes-graphiques.html

Bear in mind that these are total returns. Failures, buyer's remorse, and customer dissatisfaction (coil whine, overly high expectations, etc.) are all included.
My personal experience with EVGA has been exemplary: 8800U (3), GTX 280 (2), GTX 580, GTX 780 - all overclocked, no complaints.
EVGA has never appeared in the hardware.fr component return pages in all the years I have looked. It could be because (I speculate) the volume of sales in the EU channel is too low, or hardware.fr simply cannot measure EVGA sales due to some restriction.

Edit: If you read the middle of the page, it lists the cards that have sold more than 200 units, or more than 100 units (shown in italics), which hardware.fr tracks to build the return rankings.
www.hardware.fr/articles/934-5/cartes-graphiques.html
Posted on Reply
#65
HumanSmoke
lukeskyEVGA has never appeared in the hardware.fr component return pages in all the years I have looked. It could be because (I speculate) the volume of sales in the EU channel is too low, or hardware.fr simply cannot measure EVGA sales due to some restriction.
Probably the brand isn't high volume in France and Belgium, although it is sold through the major (r)etailers.
lukeskyEdit: If you read the middle of the page, it lists the cards that have sold more than 200 units, or more than 100 units (shown in italics), which hardware.fr tracks to build the return rankings.
I am actually aware of this fact, thanks. I have been following the return figures since their inception on Hardware.fr/BeHardware. The only reason I included the link (and the associated links accessible through it) is that a minimum sample size from a major e-tailer should provide a better factual basis than some anecdotal posting by an anonymous forum member - one who seems to be waging a war on EVGA at every opportunity. Even a casual glance at EVGA's Newegg verified-ownership reviews should be accorded better status than such a self-admittedly small sample size.
Posted on Reply
#67
lukesky
It was already known it would have 96 ROPs to go with the 384-bit memory bus. Now the million-dollar question is whether any L2 cache is disabled, bringing the 0.5 GB penalty with it.
Posted on Reply
#68
xorbe
lukeskyIt was already known it would have 96 ROPs to go with the 384-bit memory bus. Now the million-dollar question is whether any L2 cache is disabled, bringing the 0.5 GB penalty with it.
No, that was the whole rub with the 970: 256-bit, but with ROPs disabled ("enabled but unused"). If the 980 Ti has all its ROPs, then it probably has all its L2.
Posted on Reply
#69
Xzibit
xorbeNo, that was the whole rub with the 970: 256-bit, but with ROPs disabled ("enabled but unused"). If the 980 Ti has all its ROPs, then it probably has all its L2.
Nope, he is right. It's a matter of whether any L2 is disabled.

Posted on Reply
#70
xorbe
Huh, then I need to go re-read, I must have gotten confused again. =/
Posted on Reply
#71
HumanSmoke
xorbeNo, that was the whole rub with the 970: 256-bit, but with ROPs disabled ("enabled but unused"). If the 980 Ti has all its ROPs, then it probably has all its L2.
That is my understanding also. ROPs and L2 are linked. Disabling ROPs/L2 (as in the 970) while keeping the memory controller enabled is what causes the slowdown of the second partition. As the AnandTech article (the one Xzibit's attached diagram comes from) clearly states:
In doing this, each and every 32 bit memory channel needs a direct link to the crossbar through its partner ROP/L2 unit. However in the case of the GTX 970 a wrench is thrown into the works, as there are 7 crossbar ports and 8 memory channels
If the ROPs are fully enabled, then each associated L2 slice should also be enabled... and if the memory controller configuration is fully enabled, then the issue that affected the 970 should not be relevant to the 980 Ti.
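The 7-ports-versus-8-channels mismatch in that quote is exactly what produces the 970's infamous 3.5 GB + 0.5 GB split. A tiny sketch of the arithmetic, using the port and channel counts as reported for the GTX 970:

```python
# GTX 970 memory partitioning per the quoted AnandTech explanation:
# 8 x 32-bit channels (0.5 GB each) but only 7 ROP/L2 crossbar ports,
# so one channel lacks a dedicated L2 slice and becomes a slow partition.
channels, crossbar_ports, gb_per_channel = 8, 7, 0.5

fast_gb = crossbar_ports * gb_per_channel               # full-speed pool
slow_gb = (channels - crossbar_ports) * gb_per_channel  # penalized pool
print(fast_gb, slow_gb)  # 3.5 0.5

# A fully enabled 980 Ti (matching port and channel counts on its
# 384-bit bus) would show no such split.
```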
Posted on Reply
#73
Xzibit
<-_->EVGA GTX 980 Ti list from ShopBLT.

06G-P4-4990-KR NVIDIA REFERENCE FAN : $798.77
06G-P4-4991-KR EVGA ACX2.0+ COOLING : $798.77
06G-P4-4992-KR SC NVIDIA REFERENCE FAN : $815.85
06G-P4-4993-KR SC EVGA ACX2.0+ COOLING : $810.16
06G-P4-4995-KR SC+ WITH BP EVGA ACX2.0+ : $827.25

www.shopblt.com/search/order_id=%21ORDERID%21&s_max=25&t_all=1&s_all=GTX980TI
Not surprised, if those turn out to be true.

The bright side, at least you get Titan X performance for $200 less.
Posted on Reply
#74
xorbe
Ooh, I don't know about that. Someone with $800 to blow on a gfx card probably has $999 for a Titan X. Not sure this is going to fly with people - it's too close in pricing to the top dog.
Posted on Reply
#75
EarthDog
I see these prices as pretty fluid, too... If the 390X comes in at $600 and beats it, I can see the 980 Ti coming down in price.
Posted on Reply