Wednesday, May 10th 2017

NVIDIA Announces Its Volta-based Tesla V100

Today at its GTC keynote, NVIDIA CEO Jensen Huang took the wraps off some of the features of the company's upcoming V100 accelerator, the Volta-based accelerator for the professional market that will likely pave the way for the company's next-generation 2000-series GeForce graphics cards. If NVIDIA continues its current product segmentation and naming scheme with the next-generation Volta architecture, we can expect to see this processor in the company's next-generation GTX 2080 Ti. Running through all the nitty-gritty details (like the new Tensor processing approach) in this piece would be impossible, but there are some things we already know from the presentation.

This chip is a beast of a processor: it packs 21 billion transistors (up from the 15.3 billion found on the P100); it's built on TSMC's 12 nm FF process (evolving from Pascal's 16 nm FF); and it measures a staggering 815 mm² (up from the P100's 610 mm²). This is such a considerable leap in die area that we can only speculate on how yields will turn out for this monstrous chip, especially considering the novelty of the 12 nm process it's going to leverage. The most interesting detail from a gaming perspective, though, is the 5,120 CUDA cores powering the V100, out of a possible 5,376 in the full chip design, which NVIDIA will likely reserve for a Titan Xv. These are divided into 84 Volta Streaming Multiprocessors, each carrying 64 CUDA cores (84 × 64 = 5,376; NVIDIA is disabling 4 of those SMs, most likely for yields, which accounts for the announced 5,120). Even in this cut-down configuration, we're looking at a staggering 42% higher CUDA core count than the P100's. The new V100 will offer up to 15 TFLOPS of FP32 compute, and will still leverage a 16 GB HBM2 implementation delivering up to 900 GB/s of bandwidth (up from the P100's 721 GB/s). No details on clock speeds or TDP as of yet, but we already have enough to fuel a lengthy discussion... Wouldn't you agree?
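As a quick sanity check on that 15 TFLOPS figure, here's a back-of-the-envelope sketch in Python. The ~1.45 GHz boost clock is purely our assumption, since NVIDIA hasn't confirmed clocks; FP32 throughput is conventionally counted as two operations (one fused multiply-add) per CUDA core per clock.

# Theoretical peak FP32 throughput: cores x 2 FLOPs per clock x clock speed.
# The boost clock below is an assumption -- NVIDIA has not announced clocks.
cuda_cores = 5120
flops_per_core_per_clock = 2       # one fused multiply-add counts as 2 FLOPs
boost_clock_hz = 1.455e9           # assumed ~1.45 GHz, not an official figure

tflops = cuda_cores * flops_per_core_per_clock * boost_clock_hz / 1e12
print(f"Theoretical FP32 peak: {tflops:.1f} TFLOPS")  # ~14.9, matching "up to 15"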

103 Comments on NVIDIA Announces Its Volta-based Tesla V100

#1
v12dock
Block Caption of Rainey Street
Correct me if I am wrong but ~6% CUDA core performance increase...? :wtf:
#2
xkm1948
Looks very promising for scientific research. From a gaming perspective it should be pretty amazing as well.

Somehow I feel like NVIDIA has already maxed out the possible efficiency optimizations of the Maxwell/Pascal CUDA design. They are also back to the MOAR CORES and higher MHz direction. With so many CUDA units available, I am pretty sure async compute will be Volta's advantage. It should perform pretty well in Vulkan and DX12.

Poor VEGA.
#3
TheoneandonlyMrK
Doesn't look to be that great a graphics card considering its specs at 21 billion transistors, but Google might like it.
Is it me, or have they done little but rearrange the graphics components, add a shit ton of AI processors, and call it done?
I am now more interested in lower-tier Volta, because this one's for eggheads :) and not so much gamers. What the heck is small Volta going to look like, given big Volta's trying to go all Cyberdyne on us?
I don't believe this is going to be easy to market to Joe Public though, because any record for dearest consumer card is going to get obliterated when this hits the shops Christmas 2018 ;).
#6
chaosmassive
uh oh, this should be a BIG warning sign for AMD.
Nvidia has given us a glimpse of Volta, meanwhile AMD is still teasing us with Vega :shadedshu:
#7
Nokiron
theoneandonlymrk: Doesn't look to be that great a graphics card considering its specs at 21 billion transistors, but Google might like it.
Is it me, or have they done little but rearrange the graphics components, add a shit ton of AI processors, and call it done?
I am now more interested in lower-tier Volta, because this one's for eggheads :) and not so much gamers. What the heck is small Volta going to look like, given big Volta's trying to go all Cyberdyne on us?
I don't believe this is going to be easy to market to Joe Public though, because any record for dearest consumer card is going to get obliterated when this hits the shops Christmas 2018 ;).
What? The specs look fantastic!

You never saw the GP100 in consumer cards, and you will never see this one there either. Expect a completely different card without the FP16, FP64, and Tensor capabilities.
#8
FordGT90Concept
"I go fast!1!11!1!"
Only 15 TFLOP? That's only 3 TFLOP more than Vega should have. That lower clockspeed hurts.
#9
Nokiron
FordGT90Concept: Only 15 TFLOP? That's only 3 TFLOP more than Vega should have. That lower clockspeed hurts.
Well, that's not how you measure performance when selecting processing power for datacenters. Unless Vega has something like the Tensor Cores it will not even be competition.

It also has the capability of executing INT32 and FP32 simultaneously. I know devs at my work are already frothing at the mouth just reading about it.
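For readers wondering what a Tensor Core actually computes, here is a minimal numpy emulation of the 4×4 mixed-precision multiply-accumulate (D = A·B + C) NVIDIA described: FP16 inputs, FP32 accumulation. It models the arithmetic only, not the hardware or its throughput.

import numpy as np

# One Tensor Core operation, emulated: D = A*B + C on 4x4 matrices,
# with FP16 inputs multiplied and accumulated into an FP32 result.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.random.rand(4, 4).astype(np.float32)

# Upcast the FP16 inputs so products and sums accumulate in FP32.
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.dtype, D.shape)  # float32 (4, 4)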
#10
bug
v12dock: Correct me if I am wrong but ~6% CUDA core performance increase...? :wtf:
Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing :wtf:

And I'm not sure where you got the 6% from.
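For what it's worth, dividing the announced 15 TFLOPS across the 5,120 cores, against the P100's published 10.6 TFLOPS over 3,584 cores, puts per-core throughput essentially flat; a quick sketch:

# Per-core FP32 throughput from announced/published peak figures.
v100_per_core = 15.0e12 / 5120    # V100: announced 15 TFLOPS, 5,120 cores
p100_per_core = 10.6e12 / 3584    # P100: published 10.6 TFLOPS, 3,584 cores

print(f"V100: {v100_per_core / 1e9:.2f} GFLOPS per core")  # ~2.93
print(f"P100: {p100_per_core / 1e9:.2f} GFLOPS per core")  # ~2.96
# Essentially flat per core -- the headline gain comes from the core count.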
#11
Fluffmeister
And such a massive HPC chip already pushing 1.4+ GHz.

Strip away the tech not relevant to gaming, and the GV104 and GV102 chips are gonna absolutely fly.
#12
eidairaman1
The Exiled Airman
bug: Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing :wtf:

And I'm not sure where you got the 6% from.
Let's not forget price gouging/fixing...
#13
dwade
nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.
#14
Fluffmeister
dwade: nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.
To be fair, they did announce recently that Vega was going to be called RX.... Vega.
#15
eidairaman1
The Exiled Airman
dwade: nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.
Baseless comment.
#16
TheinsanegamerN
eidairaman1: Let's not forget price gouging/fixing...
Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame nvidia for making some additional cash and capitalizing on AMD's inability to perform.
#17
TheGuruStud
Lol at yields of a die that size. The consumer card will be missing at least 1,300 cores.
#18
RejZoR
TheinsanegamerN: Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame nvidia for making some additional cash and capitalizing on AMD's inability to perform.
AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact, they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯
#19
Fluffmeister
TheGuruStud: Lol at yields of a die that size. The consumer card will be missing at least 1,300 cores.
The consumer isn't the issue right now, it's big buck contracts like Summit: www.olcf.ornl.gov/summit/

It's not like Pascal has any competition yet anyway. Can I say Happy Birthday GTX 1080 yet?
#20
oxidized
RejZoR: AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact, they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯
Grammar aside, when exactly did that happen? All I remember are slower and far less efficient cards; in what way exactly were those technologically superior? Maybe I'm not counting another aspect.
#21
v12dock
Block Caption of Rainey Street
bug: Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing :wtf:

And I'm not sure where you got the 6% from.
The chip is 33% bigger and carries 33% more cores. Blow Pascal up by 33% and you will have a very, very similar chip that uses more power.
#22
bug
eidairaman1: Let's not forget price gouging/fixing...
Because pricing is totally what we were talking about.
RejZoR: AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact, they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯
Users can't be expected to sift through a lineup to identify which cards are current generation and which are actually worth buying. Ever thought about it that way?
But the simple truth is users are users and when competing for them, it's the companies that are expected to bend over backwards. Crying like a baby that your good product doesn't sell is not a market strategy.
#23
medi01
Palladium"The new Volta SM is 50% more energy efficient than the previous generation Pascal design"

Ouch.
Remind me, when were nvidia's "%" claims ever to be trusted?
oxidized: All I remember are slower and far less efficient cards
Oh dear...
#24
Dethroy
What a freakin' massive chip! 815 mm² :kookoo:
Would be nice if the 2070 came close to 1080 Ti levels...

Edit: Cannot wait for Vega, though. Really wanna see what AMD has to offer as well.
#25
TheGuruStud
Fluffmeister: The consumer isn't the issue right now, it's big buck contracts like Summit: www.olcf.ornl.gov/summit/

It's not like Pascal has any competition yet anyway. Can I say Happy Birthday GTX 1080 yet?
I bet they can buy four MI25s for the price of this. I hope they all jump ship when it's released.