
NVIDIA Announces Its Volta-based Tesla V100

measures a staggering 815 mm²

I didn't even know this was possible.
 
What? The specs look fantastic!

You never saw the GP100 for consumers and you will never see this. Expect a completely different card with no FP16, FP64, or Tensor capabilities.
Maybe some could be used via PhysX, but if they sell you this chip you're chipping in for the rest no doubt, even if they cut it out.
It is not that great a graphics card for 21 billion transistors, a lot of which wouldn't run Crysis.
It's a good piece of tech, no doubt, but with that many different types of compute unit in it, it will struggle to use them all in a lot of use cases, and most definitely in games. It does have enough raw graphics power for maybe 20-30% on big Pascal, though.
 
I bet they can buy four MI25s for the price of this. I hope they all jump ship when it's released.

Well like I said, Nv already have contracts to fulfil. But equally there is more to this than just delivering a product and saying "here you go".

Hope doesn't come into it.
 
Let's not forget price gouging/fixing...

Unless AMD is working with Nvidia on price controls, it's not price fixing. The price is what happens to goods when there are no competing products on the market.
 
Maybe some could be used via PhysX, but if they sell you this chip you're chipping in for the rest no doubt, even if they cut it out.
It is not that great a graphics card for 21 billion transistors, a lot of which wouldn't run Crysis.
It's a good piece of tech, no doubt, but with that many different types of compute unit in it, it will struggle to use them all in a lot of use cases, and most definitely in games. It does have enough raw graphics power for maybe 20-30% on big Pascal, though.
They will probably end up doing the same as with Pascal, where the GP102 chip is a single-precision card. Don't forget that the GP100 appears to have the same CUDA core count as the GP102. Hopefully they will start implementing HBM2 on the consumer-tier Volta cards; otherwise I will probably just get a second-hand Pascal card around the end of the year. The performance of desktop GPUs has already scaled well past what games (and what 4-core, dual-channel computers) can realistically handle. The only way to take advantage of the enormous power of modern cards is by adding unnecessary levels of AA or running at 1440p or higher, but the prices of 4K monitors are (as of now) prohibitive (I'd expect that to change within a year or so).

Basically there is not much need for such a powerful chip in the desktop market, remember that we are looking at a chip that could be as much as twice as powerful as a GP102...
 
Because pricing is totally what we were talking about.



Users can't be expected to sift through a lineup to identify which cards are current generation and which are actually worth buying. Ever thought about it that way?
But the simple truth is users are users and when competing for them, it's the companies that are expected to bend over backwards. Crying like a baby that your good product doesn't sell is not a market strategy.

Now, all of a sudden, users are incapable of reading graphic card reviews even though they somehow managed to find out there are brand new graphic cards selling. C'mon dude, be serious.
 
Now, all of a sudden, users are incapable of reading graphic card reviews even though they somehow managed to find out there are brand new graphic cards selling. C'mon dude, be serious.
Reviews don't sell products if they show that there are better options :)

And also, from what I can tell, people outside of the enthusiast circle are wary of AMD and give them a bad rep over stuff like the 290(X)'s heat, driver problems, support, etc.

And to give a personal example: I use X-Plane, and I basically can't use an AMD card because it runs like garbage on their drivers...
 
You gave an excellent example. The R9 290X was a single-chip card that beat NVIDIA's $1000 Titan. In fact it was so fast AMD could afford to just rebrand it as the R9 390X and place it against the brand-new NVIDIA GTX 980. Talk about inferiority. And yeah, I bought a GTX 980 because on paper it had better DX12 support. In the end, it turned out AMD had better support despite being just feature level 12_0 where the GTX 980 is 12_1. That's technological inferiority. Async compute? Oops, AMD again superior. Besides, even though I have a GTX 980 now, I don't feel bad, because I have a long history of owning both brands. And despite RX Vega being late to the game, it's quite likely I'll grab one.
 
What a freakin' massive chip! 815 mm² :kookoo:
Would be nice if the 2070 would come close to GTX 1080 Ti levels...

Based on past performance results it most likely will

GTX 1070 very close to a GTX 980 Ti and GTX 1080 beat it (Pascal vs Maxwell)

GTX 970 very close to a GTX 780 Ti and GTX 980 beat it (Maxwell vs Kepler)

GTX 670 very close to a GTX 580 and GTX 680 beat it (Kepler vs Fermi)

But I'm expecting even more from Volta. Things are going to get exciting for gamers in Q1 2018 imo
 
I bet they can buy four MI25s for the price of this. I hope they all jump ship when it's released.

Well, that is not enough; the V100 has ten times more FP64 FLOPS than the MI25, which matters the most for HPC. The MI25 is not actually even an HPC product, so they will not even consider it. For the AI and machine-learning mumbo jumbo, sure, it should be competitive... if they manage to push reliable and working drivers/software for them.
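For the curious, the "ten times" figure roughly checks out from the public spec sheets. A back-of-the-envelope sketch (the unit counts and clocks below are the commonly quoted figures for the V100 SXM2 and MI25, so treat them as assumptions, not gospel):

```python
# Theoretical peak: 2 ops per unit per clock (one fused multiply-add).
def peak_tflops(units, clock_ghz):
    return 2 * units * clock_ghz / 1000.0  # units * GHz -> TFLOPS

# V100 SXM2: 2560 dedicated FP64 units at ~1530 MHz boost.
v100_fp64 = peak_tflops(2560, 1.53)
# MI25: 4096 stream processors at ~1500 MHz, FP64 at 1/16 the FP32 rate.
mi25_fp64 = peak_tflops(4096, 1.5) / 16

print(f"V100 FP64: {v100_fp64:.2f} TFLOPS")    # ~7.83
print(f"MI25 FP64: {mi25_fp64:.2f} TFLOPS")    # ~0.77
print(f"ratio: {v100_fp64 / mi25_fp64:.1f}x")  # ~10.2x
```

So for double-precision HPC workloads the gap really is an order of magnitude; it's only in FP16/INT workloads that the MI25 is in the same ballpark.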
 
Now, all of a sudden, users are incapable of reading graphic card reviews even though they somehow managed to find out there are brand new graphic cards selling. C'mon dude, be serious.
I'm really not sure what you're trying to argue here. That users do read reviews, see that AMD has the better card and then they go buy Nvidia?
 
AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯
AMD is far from perfect though. They had all the hardware but none of the software.

Why bother using a similar product when the other gives you better performing APIs/programming models, tons of dedicated support-staff and developers for a little more money?

It's no surprise that they were not selling, which is what they are trying to change now.

Well, that is not enough; the V100 has ten times more FP64 FLOPS than the MI25, which matters the most for HPC. The MI25 is not actually even an HPC product, so they will not even consider it. For the AI and machine-learning mumbo jumbo, sure, it should be competitive... if they manage to push reliable and working drivers/software for them.
Machine learning and AI will be a no-go if they don't have something similar to the Tensor Cores though.
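For anyone wondering what a Tensor Core actually does: each one performs a fused 4×4 matrix multiply-accumulate, D = A×B + C, taking FP16 inputs and accumulating in FP32. A toy NumPy model of that mixed-precision datapath (the function name is mine, purely for illustration):

```python
import numpy as np

def tensor_core_mma(a_fp16, b_fp16, c_fp32):
    # Model of one Tensor Core op on 4x4 tiles: the FP16 operands are
    # promoted and the multiply-accumulate happens at FP32 precision.
    return a_fp16.astype(np.float32) @ b_fp16.astype(np.float32) + c_fp32

a = np.ones((4, 4), dtype=np.float16)
b = np.ones((4, 4), dtype=np.float16)
c = np.zeros((4, 4), dtype=np.float32)
d = tensor_core_mma(a, b, c)
print(d[0, 0], d.dtype)  # 4.0 float32 (each entry is a 4-term dot product)
```

That's 64 multiply-adds per core per clock, which is where the headline 120 "Tensor TFLOPS" on the V100 comes from; a chip without an equivalent fused unit has to do the same math on its regular ALUs.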
 
Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame nvidia for making some additional cash and capitalizing on AMD's inability to perform.

Last time nv had anything special was the 8800 Series in my books.
 
AMD is far from perfect though. They had all the hardware but none of the software.

Why bother using a similar product when the other gives you better performing APIs/programming models, tons of dedicated support-staff and developers for a little more money?

It's no surprise that they were not selling, which is what they are trying to change now.

Machine learning and AI will be a no-go if they don't have something similar to the Tensor Cores though.

People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.
 
People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.


Em, nope. Long-time ATi user here. As a matter of fact I have ONLY bought ATi/AMD GPUs so far (at least for all my OWN GPUs): Fury X, 7970, 6870, 5870, and all the way back to the Radeon 9700.

ATi/AMD's drivers did improve quite a lot. They also incorporated some pretty cool features (Wattman, ReLive, etc.).

At the same time, their software support for developers is seriously lacking. AMD likes to showcase a lot of cool features during a GPU launch, most of which will never see the light of day as actual implementations in the driver.
 
Apart from the grammar error, when exactly did that happen? All I remember are slower and far less efficient cards; in what way exactly were those technologically superior? Maybe I'm not counting some other aspect.

4870 vs GTX 280: the 4870 was $200 cheaper and, with a much smaller die, there was under a 15% performance difference. The 4870 was also the first card to use GDDR5.
5870 (as fast as dual-GPU cards like the 4870 X2 and GTX 295) vs GTX 285: it was more power efficient. Fermi arrived late and was hot and power-hungry, although faster. The 5800 series was the first to support Eyefinity, i.e. multi-monitor gaming.
R9 290/x vs 780/Titan

Pretty much the same through the midrange as well. The masses will buy nVidia regardless of what AMD makes. Nvidia knows this, which is why they are selling midrange GPUs at higher prices. Like the 1080 launching at around $700, was it?

People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.

To be fair, there was poor OpenGL performance way back in the early 2000s. It was one of the reasons the FX series did better in Doom 3 vs the Radeon 9700/9800 series. This hasn't been the case for a while, except maybe in Linux, where the open-source driver is really turning things around and is now much faster in most OpenGL scenarios than the proprietary driver.
 
Volta sounds like a decent upgrade over Pascal, one that should do 4K without batting an eyelid, and it's coming out relatively soon. So I don't think it's worth investing £800 in a GTX 1080 Ti now, and hence I'm happy with my GTX 1080, which "only" cost me £550 and races through everything at 1080p.
 
4870 vs GTX 280: the 4870 was $200 cheaper and, with a much smaller die, there was under a 15% performance difference. The 4870 was also the first card to use GDDR5.
5870 (as fast as dual-GPU cards like the 4870 X2 and GTX 295) vs GTX 285: it was more power efficient. Fermi arrived late and was hot and power-hungry, although faster. The 5800 series was the first to support Eyefinity, i.e. multi-monitor gaming.
R9 290/x vs 780/Titan

Pretty much the same through the midrange as well. The masses will buy nVidia regardless of what AMD makes. Nvidia knows this, which is why they are selling midrange GPUs at higher prices. Like the 1080 launching at around $700, was it?

Back then, I'm pretty sure, was the time ATi had shitty drivers and temperatures, and for some reason I remember the GTX 200 series being a bit better. I also remember the GTX 400 series being the worst shit created by nvidia. Also, introducing GDDR5 with no real benefits sounds like HBM part 1; I have to admit I don't remember about power efficiency.
Anyway, nvidia has been making the better overall product for the last ~10 years, hands down, and it's not because people are stupid that they're buying nvidia, or at least not all of them. As I said elsewhere, they both have more or less the reputation they deserve; it's not like ATi or AMD deserve more than what nvidia has now. They deserve less, because they did worse. Stop thinking that some kind of conspiracy, or anything of that sort, is the only reason AMD is in this position; it's actually hardly a reason to begin with.
 
People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.
It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets. I guess you never got the memo... oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point. Please don't stop, you are providing comic relief with your delusions. Continue obsessing over trivial BS, so we can keep laughing...
 
It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets. I guess you never got the memo... oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point. Please don't stop, you are providing comic relief with your delusions. Continue obsessing over trivial BS, so we can keep laughing...

Actually, nowadays AMD's driver/software support seems to be better than nvidia's, but this can only be said for the last 3 gens, maybe?
 
(attached image: IMG_1626.JPG)


Death of the RX Vega-based GPU.
 
It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets. I guess you never got the memo... oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point. Please don't stop, you are providing comic relief with your delusions. Continue obsessing over trivial BS, so we can keep laughing...

"It's common knowledge"? No, it's just BS. And "I didn't get the memo"? Dude, this GTX 980 is my first NVIDIA card after like 10 years of ONLY AMD/ATi cards, and I've not experienced any problems. But do go on talking out of your rear about how I have no clue and how I'm an AMD fanboy, just because I believe only 10% of the bullshit people write about AMD (which I know first-hand is a load of garbage for the most part). Looks like I'll be buying AMD out of spite, just to annoy NVIDIA fanboy idiots with their "perfect everything" garbage attitude.
 
Back then, I'm pretty sure, was the time ATi had shitty drivers and temperatures, and for some reason I remember the GTX 200 series being a bit better. I also remember the GTX 400 series being the worst shit created by nvidia. Also, introducing GDDR5 with no real benefits sounds like HBM part 1; I have to admit I don't remember about power efficiency.
Anyway, nvidia has been making the better overall product for the last ~10 years, hands down, and it's not because people are stupid that they're buying nvidia, or at least not all of them. As I said elsewhere, they both have more or less the reputation they deserve; it's not like ATi or AMD deserve more than what nvidia has now. They deserve less, because they did worse. Stop thinking that some kind of conspiracy, or anything of that sort, is the only reason AMD is in this position; it's actually hardly a reason to begin with.
The GTX 280 was a bit better than the 4870 in performance, but not $200 better, which is the point; nvidia still outsold AMD, even with the GTX 400 series. All through the GTX 200-500 series AMD was more power efficient, a reflection of their small-die strategy, yet they still maintained performance very close to nvidia's. Most temp problems were with reference coolers; as an owner of a 4870, I don't recall going much past 80°C even in CrossFire during BFBC2.

nvidia having the better product over the last 10 years is just not true, as I have been trying to demonstrate. In 2007 with the 8800 series, yes, hands down, no denying it; the 2900 series was not great. But from the GTX 200-500 series, as I stated above, it was just not so. Although the 6900 series was a bit sub-par in my opinion, as it certainly lacked in AA performance compared to nvidia, as well as tessellation. That all changed with GCN. The 7000 series was faster than anything at the time, until Kepler came out (which was the beginning of nvidia selling midrange GPUs as higher-end GPUs). After some driver updates the 7000 series was back on par with Kepler. IMO the R9 290/290X was a big upset for nvidia, besting their $1000 card in several scenarios; unfortunately it was plagued by the Bitcoin-mining fad. In AMD's case, a lot of market share was already lost before this point. They have had a bad stigma that they have not been able to overcome. Maxwell was a home run for nvidia, but was behind AMD in one aspect: asynchronous compute, which has really helped the AMD cards shine with DX12/Vulkan, has been in place since the first gen of GCN, and still isn't in Pascal, which is nothing more than die-shrunk Maxwell plus more CUDA cores and higher clocks. So nvidia hands down for the past 10 years? No way.

AMD has been very close, and the "technologically superior" comments were made because they were delivering performance very close to nvidia's, with a smaller die and lower power, for most of those ten years. I am not making the case that AMD has lost drastic market share because of some consumer conspiracy; I am simply pointing out that it is NOT because they were technologically inferior. However, I believe that at this point Vega could be 15% faster than the Titan Xp and AMD's market share still would not grow drastically.

So please share: how did AMD do so badly? Where they failed was marketing and getting game devs on board, which they are working on correcting now. nvidia won early on, IMO, because of their "The Way It's Meant to be Played" marketing strategy and, later on, GimpWorks. You could see this loading at the start of many games back in the day; I remember seeing it at the beginning of UT2003/4 as the Skaarj busted through it. There was a mod to change it to an ATi logo.
 