
NVIDIA's Latest Titan V GPU Benchmarked, Shows Impressive Performance

I highly doubt it, since AMD is making mistake after mistake and not taking any steps forward to rival NVIDIA.

Of course they won't; they don't need to. The 1080 Ti has no competition, as you say.

GV100 powers supercomputers like Summit and is aimed at research institutions.

All else is tears in the rain.
 
Well, Nvidia managed to cram 73% more transistors into Volta than Pascal while keeping the same power envelope. If Ampere has none of the compute stuff and maintains the same transistor count, we can expect the same jump as from the 980 Ti to the 1080 Ti (60%).
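
For what it's worth, that figure roughly checks out if the comparison is GV100 against GP102 (the chip in the 1080 Ti and Titan Xp). A quick sanity check, using NVIDIA's published transistor counts rather than anything from this thread:

```python
# Rough transistor-count comparison. Figures are NVIDIA's public
# numbers (approximate), not taken from this thread:
# GV100 (Volta) ~21.1B transistors, GP102 (1080 Ti / Titan Xp) ~12B.
gv100 = 21.1e9
gp102 = 12.0e9

print(f"GV100 vs GP102: +{gv100 / gp102 - 1:.0%} transistors")  # ~ +76%
```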

The 1080 Ti outperforms the 980 Ti by a lot more than 60%. In TPU's November 24, 2017 EVGA GTX 1070 Ti FTW2 iCX 8GB review, the 1080 Ti reference card was 75% faster at 1440p (135%/77%) and 87% faster at 4K (140%/75%). You can try making an argument that the reference 980 Ti hasn't aged well (which can be seen somewhat, since it now consistently loses to the Fury X on average), or argue in favor of its great overclocking headroom (the 980 Ti gains more from overclocking than the 1080 Ti does). Speaking strictly of generational performance leaps, it seems unlikely that a reference $700+ GTX 2080 Ti projected for a 2019 release will be 75-87% faster at 1440p-4K than the $700 1080 Ti.
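
For anyone wondering where those percentages come from: TPU's summary charts normalize every card against the review card (here the 1070 Ti at 100%), so the speedup is just the ratio of two chart values. A quick check of the figures quoted above:

```python
# Converting TPU relative-performance chart values (1070 Ti = 100%)
# into "X% faster" figures, using the numbers quoted above.
pairs = {"1440p": (135, 77), "4K": (140, 75)}  # (1080 Ti, 980 Ti)

for res, (v1080ti, v980ti) in pairs.items():
    print(f"{res}: 1080 Ti is {v1080ti / v980ti - 1:.0%} faster")
# 1440p: 1080 Ti is 75% faster
# 4K: 1080 Ti is 87% faster
```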

More likely than not, the performance increase will be far lower, and prices could be even higher due to the lack of competition from AMD. Apple has shown that its loyal customers are willing to pay $1,000-1,200 for the iPhone X. There is little reason why NV cannot price the GTX 2080 at $699-749 and move the GTX 2080 Ti to $899-999, given that Vega guzzles power and barely performs as fast as the 1.5-year-old GTX 1080. I expect the 2018-2019 generation to be the most expensive one, as AMD's deficit on the high end is too great to overcome, leaving NV free to raise prices again. We should probably expect the GTX 2070 to roughly match the GTX 1080 Ti, the GTX 2080 to outperform the 1080 Ti by 15-25%, and the 2080 Ti to launch in 2019. This has pretty much been the way NV has operated since Kepler in 2012.
 
IMPRESSIVE PERFORMANCE? Someone should slap the people who write such things.
A 45% improvement while going from $1,200 to $3,000. Impressive.
Calling that impressive means you have donkey ears.
Are you blind? Are you normal at all? A 45%, 50%, or 55% improvement after 18 months over a $1,200 graphics card, and they charge $3,000 for it?
I see nothing impressive there, only a diagnosis.
For this price I could buy a new 2018 Kawasaki dual-purpose or off-road bike.
And nobody will launch a 50% better model next year, or in the next five years; are you aware of that?
One word: F A I L!
I expected NVIDIA to show up with some 16GB HBM model at least 60-70% stronger than the Titan Xp, to match Titan Xp SLI and replace it at that price. As it is, buyers should be jealous of the people who resist this, and should ask a doctor for help.
This is a textbook example of daylight robbery, as if people earned $1,000 every 5 days.
Even with a good job you would have to work like a horse for 15 days to pay for it, leaving 10 days to earn money for food and everything else.

An impressive win over a two-year-old model with a card three times more expensive. Success, pure success.
 
Meh, not a gaming card anyway, but seeing what it can do with its specs is kind of disappointing. I'm not commenting on the price, since this one is clearly not gaming-oriented, unlike their previous Titans. It's a good taste of what's to come, but it's not what was hoped for given the specs. Then again, we're talking gaming performance and not what it's actually designed for.
 
Well, people wanted Titan cards to be compute cards again; now you have to pay for it... especially with what the GV100 offers. I agree the Maxwell and Pascal based Titan cards were merely overpriced consumer cards, but sadly people lapped them up... nom nom nom. Same with the original Kepler Titan card, which was soon dethroned.

Soon we will be back to moaning about why the GTX 1180 has poor FP64 performance.
 
IMPRESSIVE PERFORMANCE? Someone should slap the people who write such things.
A 45% improvement while going from $1,200 to $3,000. Impressive.
Calling that impressive means you have donkey ears.
Are you blind? Are you normal at all? A 45%, 50%, or 55% improvement after 18 months over a $1,200 graphics card, and they charge $3,000 for it?
I see nothing impressive there, only a diagnosis.
For this price I could buy a new 2018 Kawasaki dual-purpose or off-road bike.
And nobody will launch a 50% better model next year, or in the next five years; are you aware of that?
One word: F A I L!
I expected NVIDIA to show up with some 16GB HBM model at least 60-70% stronger than the Titan Xp, to match Titan Xp SLI and replace it at that price. As it is, buyers should be jealous of the people who resist this, and should ask a doctor for help.
This is a textbook example of daylight robbery, as if people earned $1,000 every 5 days.
Even with a good job you would have to work like a horse for 15 days to pay for it, leaving 10 days to earn money for food and everything else.

An impressive win over a two-year-old model with a card three times more expensive. Success, pure success.


Titan V does not have hampered FP64 performance like previous Titans did, and it also includes the Tensor cores for deep learning.

This card is not expected to compete with any other gaming card out there, and it destroys previous Titans once you include apps that take advantage of FP64 and deep-learning AI.
 
The Titan Volta is not a gaming card.

It might not be a gaming card, but it sure is named as if it's one. If they release a gaming card with the Titan name in the future, they officially have no marketing department anymore.
 
I'm reading these comments and in my head I'm replaying all those comments about how Polaris was a great move, because who cares about the high-end? :wtf:
 
Well, people wanted Titan cards to be compute cards again; now you have to pay for it... especially with what the GV100 offers. I agree the Maxwell and Pascal based Titan cards were merely overpriced consumer cards, but sadly people lapped them up... nom nom nom. Same with the original Kepler Titan card, which was soon dethroned.

Let's not fool ourselves. Titan cards were just a vanity item for gamers with deep pockets from the beginning. The first Titan wasn't even that astounding in terms of compute; Kepler was ironically the worst architecture Nvidia ever made for that purpose. It made headlines more for its price than for its usefulness.

This is the first card they have released since Fermi that is actually meant for compute. So who exactly asked for this? Because the average customer sure didn't; they never gave a shit about any of this.

Soon we will be back to moaning about why the GTX 1180 has poor FP64 performance.

Literally no one will ever say that. FP64 isn't even in vogue for machine learning.

FP16 is where it's at. As you can see, GV100 gets you more than 25 TFLOPS of that, and those Tensor Cores run on mixed FP16 and FP32 as well.
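
If anyone is curious what "mixed FP16 and FP32" actually buys you: the inputs stay in FP16, but the sums are accumulated in FP32, which is what keeps long dot products usable. A minimal NumPy sketch of the rounding problem (an illustration of the concept only, not Tensor Core code):

```python
import numpy as np

# 100,000 random FP16 values; the true sum is around 50,000.
x = np.random.rand(100_000).astype(np.float16)

# Accumulating in FP32 stays accurate. Accumulating in FP16 drifts
# badly, since FP16 spacing near 50,000 is a whole 32 units.
print("FP32 accumulator:", x.sum(dtype=np.float32))
print("FP16 accumulator:", x.sum(dtype=np.float16))
```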
 
I'm going to mine with the 4 I'm buying. :D:D :peace:
 
The performance difference actually is not that impressive if we go back and look at the 980 Ti vs the 1080 Ti. I mean, here we have the full power of Volta, not some cut-down chip. When the **80/**80 Ti comes out, the gap will be even smaller.

It's still an impressive chip for scientific calculations, though.
This is a cut-down chip. (The full GV100 has 5376 CUDA cores and a 4096-bit HBM2 bus; Titan V ships with 5120 cores and a 3072-bit bus.)
 
Let's not fool ourselves. Titan cards were just a vanity item for gamers with deep pockets from the beginning. The first Titan wasn't even that astounding in terms of compute; Kepler was ironically the worst architecture Nvidia ever made for that purpose. It made headlines more for its price than for its usefulness.

This is the first card they have released since Fermi that is actually meant for compute. So who exactly asked for this? Because the average customer sure didn't; they never gave a shit about any of this.

I don't disagree, but it doesn't change the fact it sold just fine, much to the despair of many and to the surprise of mean old Nvidia.

Luckily the average consumer isn't forced to buy anything; much to the despair of many again, I guess... but hey, there is a pro market for a reason too. And some people just want GTX 1080 Ti performance 6 MONTHS earlier, and that's what NV gave them with the first Pascal Titan, for example... blah blah blah.

Literally no one will ever say that. FP64 isn't even in vogue for machine learning.

FP16 is where it's at. As you can see, GV100 gets you more than 25 TFLOPS of that, and those Tensor Cores run on mixed FP16 and FP32 as well.

You'd be surprised, but I just hate it when people say shit is "gimped" at compute and the like; it tickles my funny bone.
 
The 1080 Ti outperforms the 980 Ti by a lot more than 60%. In TPU's November 24, 2017 EVGA GTX 1070 Ti FTW2 iCX 8GB review, the 1080 Ti reference card was 75% faster at 1440p (135%/77%) and 87% faster at 4K (140%/75%). You can try making an argument that the reference 980 Ti hasn't aged well (which can be seen somewhat, since it now consistently loses to the Fury X on average), or argue in favor of its great overclocking headroom (the 980 Ti gains more from overclocking than the 1080 Ti does). Speaking strictly of generational performance leaps, it seems unlikely that a reference $700+ GTX 2080 Ti projected for a 2019 release will be 75-87% faster at 1440p-4K than the $700 1080 Ti.

Heh, better go watch this

or any of the many other videos that show how an overclocked 980 Ti compares to an overclocked 1070.
A new 1080 Ti review probably didn't retest the 980 Ti and just reused old values that don't include driver optimizations anyway. And the 980 Ti losing to the Fury X is the furthest thing from the truth (it even beats the Fury X in Wolfenstein 2).

Edit: a new video shows the Titan V mining Ethereum at 77 MH/s @ 220 W, so this is like a do-everything, best-at-everything card at the moment, but only affordable to Tony Stark.
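
If that figure holds, the perf-per-watt is the real story. A rough comparison, assuming the commonly cited ~32 MH/s at a similar ~220 W for a stock 1080 Ti (the 1080 Ti numbers are my assumption, not from the video):

```python
# Ethereum mining efficiency in MH/s per watt.
cards = {
    "Titan V": (77, 220),      # figures from the video above
    "GTX 1080 Ti": (32, 220),  # assumption: commonly cited stock numbers
}
for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.2f} MH/s per watt")
# Titan V ~0.35 vs. 1080 Ti ~0.15 (more than double the efficiency)
```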
 
This is just the usual NVIDIA cash grab. This card is for those few who have too much money and a desire to have the absolute fastest hardware, for a month or two. We working folks need not be concerned, or even covet it. The same performance at a quarter of the price will be along shortly...
 
Not that I'm in the market for a $3,000 card (my 1080 Ti was expensive enough), but I wonder what the OC headroom on it is like. If it's anything like the 980 Ti, it'd be quite good.
 
Not that I'm in the market for a $3,000 card (my 1080 Ti was expensive enough), but I wonder what the OC headroom on it is like. If it's anything like the 980 Ti, it'd be quite good.
An interesting point. Given these don't sell in large numbers, there probably aren't that many aftermarket cooling solutions for them. Then again, these are almost exclusively reference designs (and high-priced), so maybe there are.
 
Not that I'm in the market for a $3,000 card (my 1080 Ti was expensive enough), but I wonder what the OC headroom on it is like. If it's anything like the 980 Ti, it'd be quite good.

Boost 3.0 largely negates the OC ability of NVIDIA cards, as it already tries to run at the highest clocks it can while staying within the power and temperature targets.
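
To make that concrete, GPU Boost is essentially a feedback loop: step the clock up while under the power and temperature targets, and back off when either is exceeded. A toy model of the behavior (my simplification, not NVIDIA's actual algorithm):

```python
# Toy model of GPU Boost 3.0-style clock management. Purely
# illustrative: real boost also weighs voltage, per-bin offsets, etc.
POWER_TARGET_W = 250
TEMP_TARGET_C = 84
STEP_MHZ = 13  # Pascal/Volta adjust clocks in roughly 13 MHz bins

def next_clock(clock_mhz, power_w, temp_c):
    """Step up while under both targets, otherwise back off."""
    if power_w < POWER_TARGET_W and temp_c < TEMP_TARGET_C:
        return clock_mhz + STEP_MHZ
    return clock_mhz - STEP_MHZ
```

Which is why manual overclocking on these cards mostly amounts to raising those targets rather than forcing a fixed clock.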
 
The same was said of earlier versions too; those also had locked-down ranges so that products don't encroach on others in the lineup.
 
Boost 3.0 largely negates the OC ability of NVIDIA cards, as it already tries to run at the highest clocks it can while staying within the power and temperature targets.
I don't think it negates it. "Renders it redundant" seems like a more appropriate description.
 
Interesting conclusions from GN:

Purely observationally, based on the data we have presently collected, it would appear that the Titan V has two primary behaviors: (1) Applications which are built atop low-level APIs and asynchronous computational pipelines appear to process more efficiently on the Titan V; (2) the Titan V appears to host more cores than some of these applications (namely D3D11 titles) can meaningfully use, and that is demonstrated fully upon overclocking.

Given that overclocks in D3D11 applications produce performance uplift of ~20% (in some instances), it would appear that the high core count becomes more of a burden than a benefit. The GPU needs the faster clocks, and can’t access or leverage its high core count in a meaningful way. The result is that the Titan V begins to tie with the Titan Xp, and that the 1080 Ti closes-in on the Titan V. In lower-level API games, however, the Titan V pulls away by large margins – 27% to 40%, in some cases. The gains are big enough that we retested numerous times on numerous cards, but they remained. Our present analysis is that these applications are better able to spin-off multiple, simultaneous, in-flight render jobs across the high core count, whereas the tested Dx11 titles may function more synchronously.

As for the Titan V specifically, it can certainly be used for games -- but only in the context of, "I bought this thing for work, and sometimes I play games." If you're just gaming, clearly, this isn't the right purchase.

Some nice follow-ups over at Beyond3D too, including some crypto testing:

https://forum.beyond3d.com/threads/nvidia-volta-speculation-thread.53930/page-46
 
Well, I don't think this benchmark shows an impressive result. The culprit must be the unoptimized drivers for the Titan V. Ah, I know, they must be saving the 2080 Ti for milking the cows. They just need to whip up uproarious news and reviews of the Ti beating the Titan with an 'impressive result', then profit. Double kill. ;-D
 
Naturally, only AMD gets a free pass on day-0 drivers.
 