# NVIDIA Announces Its Volta-based Tesla V100



## Raevenlord (May 10, 2017)

Today at its GTC keynote, NVIDIA CEO Jensen Huang took the wraps off some of the features of the upcoming V100, the Volta-based accelerator for the professional market that will likely pave the way to the company's next-generation 2000 series GeForce graphics cards. If NVIDIA continues its product segmentation and naming scheme for the next-generation Volta architecture, we can expect to see this processor in the company's next-generation GTX 2080 Ti. Running through all the nitty-gritty details (like the new Tensor processing approach) in this piece would be impossible, but there are some things we already know from this presentation.

This chip is a beast of a processor: it packs 21 billion transistors (up from the 15.3 billion found on the P100); it's built on TSMC's 12 nm FF process (evolving from Pascal's 16 nm FF); and it measures a staggering 815 mm² (from the P100's 610 mm²). This is such a considerable leap in die area that we can only speculate on what yields will be like for this monstrous chip, especially considering the novelty of the 12 nm process it's going to leverage. But for now, the most interesting detail from a gaming perspective is the 5,120 CUDA cores powering the V100, out of a possible 5,376 in the whole chip design, which NVIDIA will likely reserve for a Titan Xv. These are divided into 84 Volta Streaming Multiprocessor units, each carrying 64 CUDA cores (84 x 64 = 5,376, from which NVIDIA is cutting 4 Streaming Multiprocessor units, most likely for yields, which accounts for the announced 5,120). Even in this cut-down configuration, we're looking at a staggering 42% higher pure CUDA core count than the P100's. The new V100 will offer up to 15 FP32 TFLOPS, and will still leverage a 16 GB HBM2 implementation delivering up to 900 GB/s of bandwidth (up from the P100's 721 GB/s). No details on clock speed or TDP as of yet, but we already have enough details to enable a lengthy discussion... Wouldn't you agree?
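The core-count arithmetic is easy to sanity-check. A quick sketch (SM figures from the announcement; the P100 core count of 3,584 is the publicly quoted GP100 figure):

```python
# Volta GV100 core-count arithmetic, per the figures quoted above.
sms_full = 84              # SMs in the full GV100 die
cores_per_sm = 64          # FP32 CUDA cores per Volta SM
sms_disabled = 4           # SMs fused off on the shipping V100 (yields, most likely)

cores_full = sms_full * cores_per_sm                       # full-die core count
cores_enabled = (sms_full - sms_disabled) * cores_per_sm   # announced V100 count

p100_cores = 3584          # GP100 (P100) FP32 cores, for the uplift comparison
uplift_pct = (cores_enabled / p100_cores - 1) * 100        # ~42.9%

print(cores_full, cores_enabled, round(uplift_pct, 1))     # 5376 5120 42.9
```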







----------



## v12dock (May 10, 2017)

Correct me if I am wrong but ~6% CUDA core performance increase...?


----------



## xkm1948 (May 10, 2017)

Looks very promising for scientific research. From a gaming perspective it should be pretty amazing as well. 

Somehow I feel like Nvidia has already maxed out the possible efficiency optimizations of the Maxwell/Pascal CUDA design. They are also back in the MOAR CORES and higher MHz direction. With so many CUDA units available, I am pretty sure async compute will be Volta's advantage. It should perform pretty well in Vulkan and DX12.

Poor VEGA.


----------



## TheoneandonlyMrK (May 10, 2017)

Doesn't look to be that great a graphics card considering its specs at 21 billion transistors, but Google might like it.
Is it me, or have they done little but rearrange the graphics components, add a shit ton of AI processors, and call it done?
I am now more interested in lower-tier Volta, because this one's for eggheads and not so much gamers. What the heck is small Volta going to look like, given big Volta's trying to go all Cyberdyne on us?
I don't believe this is going to be easy to market to Joe Public though, because any record for dearest consumer card is going to get obliterated when this hits the shops Christmas 2018.


----------



## jabbadap (May 10, 2017)

TDP is 300W and boost clock is 1455MHz. More in nvidia devblog:
https://devblogs.nvidia.com/parallelforall/inside-volta/
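Those two numbers let you sanity-check the article's ~15 TFLOPS figure: peak FP32 throughput is just cores × clock × 2, since each core can retire one fused multiply-add (two FLOPs) per clock. A quick check:

```python
# Peak FP32 throughput = CUDA cores x boost clock x 2 FLOPs (one FMA) per clock.
cuda_cores = 5120
boost_clock_ghz = 1.455   # boost clock from NVIDIA's devblog
flops_per_clock = 2       # a fused multiply-add counts as two FLOPs

fp32_tflops = cuda_cores * boost_clock_ghz * flops_per_clock / 1000
print(round(fp32_tflops, 1))   # 14.9 -- matching the "up to 15 TFLOPS" claim
```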


----------



## Palladium (May 10, 2017)

jabbadap said:


> TDP is 300W and boost clock is 1455MHz. More in nvidia devblog:
> https://devblogs.nvidia.com/parallelforall/inside-volta/



"The new Volta SM is 50% more energy efficient than the previous generation Pascal design" 

Ouch.


----------



## chaosmassive (May 10, 2017)

Uh oh, this should be a BIG warning sign for AMD.
Nvidia has given a glimpse of Volta, meanwhile AMD is still teasing us with Vega.


----------



## Nokiron (May 10, 2017)

theoneandonlymrk said:


> Doesn't look to be that great a graphics card considering its specs at 21 billion transistors, but Google might like it.
> Is it me, or have they done little but rearrange the graphics components, add a shit ton of AI processors, and call it done?
> I am now more interested in lower-tier Volta, because this one's for eggheads and not so much gamers. What the heck is small Volta going to look like, given big Volta's trying to go all Cyberdyne on us?
> I don't believe this is going to be easy to market to Joe Public though, because any record for dearest consumer card is going to get obliterated when this hits the shops Christmas 2018.


What? The specs look fantastic! 

You never saw the GP100 for consumers and you will never see this. Expect a completely different card with no FP16, FP64, or Tensor capabilities.


----------



## FordGT90Concept (May 10, 2017)

Only 15 TFLOPS?  That's only 3 TFLOPS more than Vega should have.  That lower clock speed hurts.


----------



## Nokiron (May 10, 2017)

FordGT90Concept said:


> Only 15 TFLOPS?  That's only 3 TFLOPS more than Vega should have.  That lower clock speed hurts.


Well, that's not how you measure performance when selecting processing power for datacenters. Unless Vega has something like the Tensor Cores, it won't even be competition.

It also has the capability of executing INT32 and FP32 simultaneously. I know devs at my work are already frothing at the mouth just reading about it.
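For the curious: NVIDIA's devblog describes each Tensor Core as performing a 4x4 mixed-precision matrix multiply-accumulate, D = A×B + C, per clock (FP16 inputs, FP32 accumulation). A minimal, purely illustrative pure-Python sketch of that operation:

```python
# Illustrative sketch of the per-clock Tensor Core op: D = A x B + C on 4x4
# matrices (FP16 inputs with FP32 accumulate on real hardware; plain floats here).
def tensor_core_mma(A, B, C, n=4):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) + C[i][j]
             for j in range(n)] for i in range(n)]

# Each such op is 4*4*4 = 64 multiply-adds = 128 FLOPs per Tensor Core per clock,
# which is how the headline "deep learning TFLOPS" figure adds up.
identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
zeros = [[0.0] * 4 for _ in range(4)]
B = [[float(i * 4 + j) for j in range(4)] for i in range(4)]
print(tensor_core_mma(identity, B, zeros) == B)   # True: I x B + 0 == B
```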


----------



## bug (May 10, 2017)

v12dock said:


> Correct me if I am wrong but ~6% CUDA core performance increase...?


Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing 

And I'm not sure where you got the 6% from.


----------



## Fluffmeister (May 10, 2017)

And such a massive HPC chip is already pushing 1.4+ GHz.

Strip away the tech not relevant to gaming and GV104 and GV102 chips are gonna absolutely fly.


----------



## eidairaman1 (May 10, 2017)

bug said:


> Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing
> 
> And I'm not sure where you got the 6% from.




Let's not forget price gouging/fixing...


----------



## dwade (May 10, 2017)

nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.


----------



## Fluffmeister (May 10, 2017)

dwade said:


> nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.



To be fair, they did announce recently that Vega was going to be called RX.... Vega.


----------



## eidairaman1 (May 10, 2017)

dwade said:


> nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.



Baseless comment.


----------



## TheinsanegamerN (May 10, 2017)

eidairaman1 said:


> Let's not forget price gouging/fixing...


Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame nvidia for making some additional cash and capitalizing on AMD's inability to perform.


----------



## TheGuruStud (May 10, 2017)

Lol at the yields of a die that size. The consumer card will be missing at least 1,300 cores.
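For intuition, the classic first-order Poisson yield model, yield ≈ exp(−A·D₀), shows how hard an 815 mm² die punishes you versus a 610 mm² one. The defect density below is a made-up illustrative number, not TSMC data:

```python
import math

# First-order Poisson yield model: yield = exp(-area * defect_density).
# D0 here is purely hypothetical -- chosen only to illustrate the scaling.
D0 = 0.1   # defects per cm^2 (made up)

def poisson_yield(area_mm2, d0_per_cm2=D0):
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

p100_yield = poisson_yield(610)   # ~54% of dice defect-free
v100_yield = poisson_yield(815)   # ~44%; disabling 4 SMs also lets NVIDIA
print(round(p100_yield, 2), round(v100_yield, 2))  # salvage partly defective dice
```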


----------



## RejZoR (May 10, 2017)

TheinsanegamerN said:


> Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame nvidia for making some additional cash and capitalizing on AMD's inability to perform.



AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job".  ¯\_(ツ)_/¯


----------



## Fluffmeister (May 10, 2017)

TheGuruStud said:


> Lol at the yields of a die that size. The consumer card will be missing at least 1,300 cores.



The consumer isn't the issue right now, it's big-buck contracts like Summit: https://www.olcf.ornl.gov/summit/

It's not like Pascal has any competition yet anyway. Can I say Happy Birthday GTX 1080 yet?


----------



## oxidized (May 10, 2017)

RejZoR said:


> _*AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole*_. And their market share didn't change at all. In fact they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job".  ¯\_(ツ)_/¯



Apart from the grammar error: when exactly did that happen? All I remember are slower and far less efficient cards. In what way exactly were those technologically superior? Maybe I'm not counting another aspect.


----------



## v12dock (May 10, 2017)

bug said:


> Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing
> 
> And I'm not sure where you got the 6% from.



The chip is 33% bigger and carries 33% more cores. Blow Pascal up by 33% and you will have a very, very similar chip that uses more power.


----------



## bug (May 10, 2017)

eidairaman1 said:


> Let's not forget price gouging/fixing...



Because pricing is totally what we were talking about.



RejZoR said:


> AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job".  ¯\_(ツ)_/¯



Users can't be expected to sift through a lineup to identify which cards are current generation and which are actually worth buying. Ever thought about it that way?
But the simple truth is users are users and when competing for them, it's the companies that are expected to bend over backwards. Crying like a baby that your good product doesn't sell is not a market strategy.


----------



## medi01 (May 10, 2017)

Palladium said:


> "The new Volta SM is 50% more energy efficient than the previous generation Pascal design"
> 
> Ouch.


Remind me, when were Nvidia's "%" claims ever to be trusted?




oxidized said:


> All i remember are slower and far less efficient cards


Oh dear...


----------



## Dethroy (May 10, 2017)

What a freakin' massive chip! 815 mm² 
Would be nice if the 2070 would come close to 1080 Ti levels ...

Edit: Cannot wait for Vega, though. Really wanna see what AMD has to offer as well.


----------



## TheGuruStud (May 10, 2017)

Fluffmeister said:


> The consumer isn't the issue right now, it's big buck contracts like Summit: https://www.olcf.ornl.gov/summit/
> 
> It's not like Pascal has any competition yet anyway. Can i say Happy Birthday GTX 1080 yet?



I bet they can buy four MI25s for the price of this. I hope they all jump ship when it's released.


----------



## diatribe (May 10, 2017)

_measures a staggering 815 mm²_

I didn't even know this was possible.


----------



## TheoneandonlyMrK (May 10, 2017)

Nokiron said:


> What? The specs look fantastic!
> 
> You never saw the GP100 for consumers and you will never see this. Expect a completely different card with no FP16, FP64, or Tensor capabilities.


Maybe, some could be used via PhysX, but if they sell you this chip you're chipping in for the rest no doubt, even if they cut it out.
It is not that great a graphics card for 21 billion transistors, a lot of which wouldn't run Crysis.
It's a good piece of tech, no doubt, but it's got so many different types of compute unit in it that a lot of use cases will struggle to use them all, most definitely games. But it does have enough raw graphics power for maybe 20-30% over big Pascal.


----------



## Fluffmeister (May 10, 2017)

TheGuruStud said:


> I bet they can buy four MI25s for the price of this. I hope they all jump ship when it's released.



Well like I said, Nv already have contracts to fulfil. But equally there is more to this than just delivering a product and saying "here you go".

Hope doesn't come into it.


----------



## diatribe (May 10, 2017)

eidairaman1 said:


> Let's not forget price gouging/fixing...



Unless AMD is working with Nvidia on price controls, it's not price fixing. The price is what happens to goods when there are no competing products on the market.


----------



## GorbazTheDragon (May 10, 2017)

theoneandonlymrk said:


> Maybe, some could be used via PhysX, but if they sell you this chip you're chipping in for the rest no doubt, even if they cut it out.
> It is not that great a graphics card for 21 billion transistors, a lot of which wouldn't run Crysis.
> It's a good piece of tech, no doubt, but it's got so many different types of compute unit in it that a lot of use cases will struggle to use them all, most definitely games. But it does have enough raw graphics power for maybe 20-30% over big Pascal.


They will probably end up doing the same as with Pascal, where the GP102 chip is a single-precision card. Don't forget that the GP100 appears to have the same CUDA core count as the GP102. Hopefully they will start implementing HBM2 on the consumer-tier Volta cards; otherwise I will probably just get a second-hand Pascal card around the end of the year. The performance of desktop GPUs has already scaled well past what games (and what 4-core, dual-channel computers) can realistically handle. The only way to take advantage of the enormous power of modern cards is by adding unnecessary levels of AA or running at 1440p or higher, but the prices of 4K monitors are (as of now) prohibitive (I'd expect that to change within a year or so).

Basically there is not much need for such a powerful chip in the desktop market; remember that we are looking at a chip that could be as much as twice as powerful as a GP102...


----------



## RejZoR (May 10, 2017)

bug said:


> Because pricing is totally what we were talking about.
> 
> 
> 
> ...



Now, all of a sudden, users are incapable of reading graphics card reviews, even though they somehow managed to find out there are brand new graphics cards selling. C'mon dude, be serious.


----------



## GorbazTheDragon (May 10, 2017)

RejZoR said:


> Now, all of a sudden, users are incapable of reading graphics card reviews, even though they somehow managed to find out there are brand new graphics cards selling. C'mon dude, be serious.


Reviews don't sell products, even if they show that there are better options. 

Also, from what I can tell, people outside the enthusiast circle are wary of AMD and give them a bad rep over stuff like the 290(X) heat, driver problems, support, etc. 

And to give a personal example: I use X-Plane, and I basically can't use an AMD card because it runs like garbage on their drivers...


----------



## RejZoR (May 10, 2017)

You gave an excellent example. The R9 290X was a single-chip card that beat NVIDIA's *DUAL GPU* *Titan*. In fact, it was so fast AMD could afford to just rebrand it as the R9 390X and place it against the *BRAND NEW* NVIDIA GTX 980. Talk about inferiority. And yeah, I bought a GTX 980 because on paper it had better DX12 support. In the end, it turned out AMD had better support despite being just DX12 feature level 12_0, where the GTX 980 is 12_1. So much for technological inferiority. Async compute? Oops, AMD again superior. Besides, even if I have a GTX 980 now, I don't feel bad, because I have a long history of owning both brands. And despite RX Vega being late to the game, it's quite likely I'll grab one.


----------



## 64K (May 10, 2017)

Dethroy said:


> What a freakin' massive chip! 815 mm²
> Would be nice if the 2070 would come close to 1080 Ti levels ...



Based on past performance results, it most likely will.

The GTX 1070 came very close to a GTX 980 Ti, and the GTX 1080 beat it (Pascal vs Maxwell).

The GTX 970 came very close to a GTX 780 Ti, and the GTX 980 beat it (Maxwell vs Kepler).

The GTX 670 came very close to a GTX 580, and the GTX 680 beat it (Kepler vs Fermi).

But I'm expecting even more from Volta. Things are going to get exciting for gamers in Q1 2018, imo.


----------



## jabbadap (May 10, 2017)

TheGuruStud said:


> I bet they can buy four MI25s for the price of this. I hope they all jump ship when it's released.



Well, that is not enough: the V100 has ten times more FP64 FLOPS than the MI25, which matters the most for HPC. The MI25 is not actually even an HPC product, so they will not even consider it. For AI and machine learning mumbo jumbo, sure: it should be competitive... if they manage to push reliable, working drivers/software for them.
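The "ten times" figure roughly checks out against publicly quoted peak numbers (V100 at ~7.5 FP64 TFLOPS; MI25 running FP64 at 1/16 of its ~12.3 FP32 TFLOPS). Treat these as approximate launch-day specs:

```python
# Rough check of the FP64 gap using publicly quoted peak figures (approximate).
v100_fp64_tflops = 7.5                     # V100 FP64 peak (half its FP32 rate)
mi25_fp32_tflops = 12.3                    # MI25 FP32 peak
mi25_fp64_tflops = mi25_fp32_tflops / 16   # MI25 runs FP64 at 1/16 rate

ratio = v100_fp64_tflops / mi25_fp64_tflops
print(round(ratio, 1))   # ~9.8 -- about the "ten times" claimed above
```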


----------



## bug (May 10, 2017)

RejZoR said:


> Now, all of a sudden, users are incapable of reading graphics card reviews, even though they somehow managed to find out there are brand new graphics cards selling. C'mon dude, be serious.


I'm really not sure what you're trying to argue here. That users _do_ read reviews, see that AMD has the better card, and then go buy Nvidia?


----------



## Nokiron (May 10, 2017)

RejZoR said:


> AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all. In fact they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job".  ¯\_(ツ)_/¯


AMD is far from perfect though. They had all the hardware but none of the software.

Why bother using a similar product when the other gives you better performing APIs/programming models, tons of dedicated support-staff and developers for a little more money?

It's no surprise that they were not selling, which is what they are trying to change now.



jabbadap said:


> Well, that is not enough: the V100 has ten times more FP64 FLOPS than the MI25, which matters the most for HPC. The MI25 is not actually even an HPC product, so they will not even consider it. For AI and machine learning mumbo jumbo, sure: it should be competitive... if they manage to push reliable, working drivers/software for them.


Machine learning and AI will be a no-go if they don't have something similar to the Tensor Cores though.


----------



## eidairaman1 (May 10, 2017)

TheinsanegamerN said:


> Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame nvidia for making some additional cash and capitalizing on AMD's inability to perform.



The last time NV had anything special was the 8800 series, in my book.


----------



## RejZoR (May 10, 2017)

Nokiron said:


> AMD is far from perfect though. They had all the hardware but none of the software.
> 
> Why bother using a similar product when the other gives you better performing APIs/programming models, tons of dedicated support-staff and developers for a little more money?
> 
> ...



People keep yodeling about "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards, I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.


----------



## xkm1948 (May 10, 2017)

RejZoR said:


> People keep yodeling about "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards, I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.




Um, nope. Long-time ATi user here. As a matter of fact, I have ONLY bought ATi/AMD GPUs so far (at least for all my OWN GPUs): Fury X, 7970, 6870, 5870, and all the way back to the Radeon 9700.

ATi/AMD's drivers did improve quite a lot. They also incorporated some pretty cool features (Wattman, ReLive, etc.).

At the same time, their software support for developers is seriously lacking. AMD likes to showcase a lot of cool features during a GPU launch, most of which will never see the light of day actually implemented in the driver.


----------



## m0nt3 (May 10, 2017)

oxidized said:


> Apart from the grammar error: when exactly did that happen? All I remember are slower and far less efficient cards. In what way exactly were those technologically superior? Maybe I'm not counting another aspect.



The 4870 vs the GTX 280: the 4870 was $200 cheaper on a much smaller die, with under a 15% performance difference, and it was the first card to use GDDR5.
The 5870 (as fast as dual-GPU cards like the 4870 X2 and GTX 295) vs the GTX 285: it was more power efficient. Fermi arrived late, and was hot and power hungry, although faster. The 5800 series was the first to support Eyefinity, or multi-monitor gaming.
The R9 290/X vs the 780/Titan.

Pretty much the same through the midrange as well. The masses will buy nVidia regardless of what AMD makes. Nvidia knows this, which is why they are selling midrange GPUs at higher prices, like the 1080 launching at around $700.



RejZoR said:


> People keep yodeling about "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards, I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.



To be fair, there was poor OpenGL performance way back in the early 2000s. It's one of the reasons the FX series did better in Doom 3 vs the Radeon 9700/9800 series. This hasn't been the case for a while, except maybe in Linux, where the open-source driver is really turning things around and is now much faster in most OpenGL scenarios than the proprietary driver.


----------



## qubit (May 10, 2017)

Volta sounds like a decent upgrade over Pascal, which can already do 4K without batting an eyelid, and it's coming out relatively soon. So I don't think it's worth investing £800 in a GTX 1080 Ti now, and hence I'm happy with my GTX 1080, which "only" cost me £550 and races through everything at 1080p.


----------



## oxidized (May 11, 2017)

m0nt3 said:


> The 4870 vs the GTX 280: the 4870 was $200 cheaper on a much smaller die, with under a 15% performance difference, and it was the first card to use GDDR5.
> The 5870 (as fast as dual-GPU cards like the 4870 X2 and GTX 295) vs the GTX 285: it was more power efficient. Fermi arrived late, and was hot and power hungry, although faster. The 5800 series was the first to support Eyefinity, or multi-monitor gaming.
> The R9 290/X vs the 780/Titan.
> 
> Pretty much the same through the midrange as well. The masses will buy nVidia regardless of what AMD makes. Nvidia knows this, which is why they are selling midrange GPUs at higher prices, like the 1080 launching at around $700.



Back then, I'm pretty sure, was the time ATi had shitty drivers and temperatures, and for some reason I remember the GTX 2xx being a bit better. I also remember the GTX 4xx being the worst shit created by Nvidia. Also, introducing GDDR5 with no real benefits sounds like an HBM part 1; I have to admit I don't remember about power efficiency.
Anyway, Nvidia has been making the better overall product for the last ~10 years, hands down, and it's not because people are stupid that they're buying Nvidia, or at least not all of them. As I said elsewhere, they both have more or less the reputation they deserve; it's not like ATi or AMD deserve more than what Nvidia has now. They deserve less because they did worse. Stop thinking that some kind of conspiracy, or any sort of stuff like it, is the only reason AMD is in this position; it's actually hardly a reason to begin with.


----------



## Hood (May 11, 2017)

RejZoR said:


> People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance" and in decade of owning ATi/AMD cards, I hardly ever experienced any issues, stability or performance. This too is one of the reasons why AMD just can't get any traction. People spreading misinformation and just plain lies. For reasons unknown to any logic.


It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets.  I guess you never got the memo... oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point.  Please don't stop, you are providing comic relief with your delusions.  Continue obsessing over trivial BS, so we can keep laughing...


----------



## oxidized (May 11, 2017)

Hood said:


> It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets.  I guess you never got the memo...oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point.  Please don't stop, you are providing comic relief with your delusions.  Continue obsessing over trivial BS,
> so we can keep laughing...



Actually, nowadays AMD's driver/software support seems to be better than Nvidia's, but that can only be said for the last 3 gens, maybe?


----------



## R4E3960FURYX (May 11, 2017)

Death of RX Vega based GPU.


----------



## oxidized (May 11, 2017)

R4E3960FURYX said:


> View attachment 87786
> 
> Death of RX Vega based GPU.



Vega still has to come out, chill your beans. And even if they meant to take on Volta, we all know they just misspelled Pascal. Now let's be quiet and just wait for them.


----------



## RejZoR (May 11, 2017)

Hood said:


> It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets.  I guess you never got the memo...oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point.  Please don't stop, you are providing comic relief with your delusions.  Continue obsessing over trivial BS,
> so we can keep laughing...



"It's common knowledge." No, it's just BS. And "I didn't get the memo"? Dude, this GTX 980 is my first NVIDIA card after like 10 years of ONLY AMD/ATi cards. And I've not experienced any problems. But do go on talking out of your rear about how I have no clue and how I'm an AMD fanboy, just because I believe only 10% of the bullshit people write about AMD (which I know first hand is a load of garbage for the most part). Looks like I'll be buying AMD out of spite, just to annoy NVIDIA fanboy idiots with their "perfect everything" garbage attitude.


----------



## m0nt3 (May 11, 2017)

oxidized said:


> Back then, I'm pretty sure, was the time ATi had shitty drivers and temperatures, and for some reason I remember the GTX 2xx being a bit better. I also remember the GTX 4xx being the worst shit created by Nvidia. Also, introducing GDDR5 with no real benefits sounds like an HBM part 1; I have to admit I don't remember about power efficiency.
> Anyway, Nvidia has been making the better overall product for the last ~10 years, hands down, and it's not because people are stupid that they're buying Nvidia, or at least not all of them. As I said elsewhere, they both have more or less the reputation they deserve; it's not like ATi or AMD deserve more than what Nvidia has now. They deserve less because they did worse. Stop thinking that some kind of conspiracy, or any sort of stuff like it, is the only reason AMD is in this position; it's actually hardly a reason to begin with.


The GTX 280 was a bit better than the 4870 in performance, but not $200 better, which is the point; Nvidia still outsold AMD, even with the GTX 400 series. All through the GTX 200-500 series AMD was more power efficient, which was a reflection of their small-die strategy, yet they still maintained performance very close to Nvidia's. Most temp problems are with reference coolers; as an owner of a 4870, I don't recall going much past 80C even in Crossfire during BFBC2.

Nvidia having the better product over the last 10 years is just not true, as I have been trying to demonstrate. In 2007 with the 8800 series, yes, hands down, no denying it; the 2900 series was not great. But from the GTX 200-500 series, as I stated above, it was just not so. Although the 6900 series was a bit sub-par in my opinion, as it certainly lacked in AA performance compared to Nvidia, as well as tessellation. That all changed with GCN. The 7000 series was faster than anything at the time, until Kepler came out (which was the beginning of Nvidia selling midrange GPUs as higher-end GPUs). After some driver updates, the 7000 series was back on par with Kepler. IMO the R9 290/290X was a big upset for Nvidia, besting their $1000 card in several scenarios; unfortunately it was plagued by the bitcoin mining fad. A lot of market share was already lost before this point in AMD's case. They have had a bad stigma that they have not been able to overcome. Maxwell was a home run for Nvidia, but was behind AMD in one aspect: asynchronous compute, which has really helped AMD cards shine with DX12/Vulkan, has been in place since the first gen of GCN, and still isn't in Pascal, which is nothing more than a die-shrunk Maxwell with more CUDA cores and higher clocks. So Nvidia hands down for the past 10 years? No way.

AMD has been very close, and the "technologically superior" comments were made because they were achieving very close to Nvidia's performance with a smaller die and lower power for most of those ten years. I am not making the case that AMD has lost drastic market share because of some consumer conspiracy; I am simply pointing out that it is NOT because they were technologically inferior. However, I believe that at this point Vega could be 15% faster than a Titan Xp and AMD's market share would still not grow drastically.

So please share: how did AMD do so badly? Where they failed was marketing and getting game devs on board, which they are working on correcting now. Nvidia won early on, IMO, because of their "The Way It's Meant to be Played" marketing strategy and, later on, GimpWorks. You could see this loading at the start of many games back in the day. I remember seeing it at the beginning of UT2003/4 as the Skaarj busted through it; there was a mod to change it to an ATi logo.


----------



## Kanan (May 11, 2017)

It's ~95% likely a GTX 1280 and not a 2080; you're skipping 10 generations, buddy. 700 series -> new architecture -> 900 -> new wannabe architecture, rather a node shrink to 16 nm -> GTX 1000 series -> new architecture -> GTX 1200 series. Makes sense, huh?

And speaking about the GPU: it has nothing to do with a "Titan Xv". It's a Tesla, comparable to the P100, which will not be used for the Titan series or GTX series of gaming/"normal" workstation cards. A smaller version of it could possibly be used for gaming, leaving the unnecessary bits aside, maybe with 4,500 shaders then for the full GV102 (successor to the GP102). The first Titan "X", which we will again name "Titan XV" ourselves, will be the cut-down version with 4,200 shaders, and then one year later they will bring the full version with 4,500 shaders (named Titan Xv of course, resembling "Titan Xp", Nvidia playing it "smart" again) and a GTX 1280 Ti with 4,200 shaders like the "Titan XV" one year before it.

/speculation mode. But it makes a lot of sense, at least if everything goes according to Nvidia's plans.


----------



## RejZoR (May 11, 2017)

m0nt3 said:


> The 4870 vs the GTX 280: the 4870 was $200 cheaper on a much smaller die, with under a 15% performance difference, and it was the first card to use GDDR5.
> The 5870 (as fast as dual-GPU cards like the 4870 X2 and GTX 295) vs the GTX 285: it was more power efficient. Fermi arrived late, and was hot and power hungry, although faster. The 5800 series was the first to support Eyefinity, or multi-monitor gaming.
> The R9 290/X vs the 780/Titan.
> 
> ...



I've played basically all OpenGL games on Radeons and never even cared that it was OpenGL and that I was supposed to get inferior performance. And you probably know by now that I love playing games at max quality...


----------



## m0nt3 (May 11, 2017)

RejZoR said:


> I've played basically all OpenGL games on Radeons and never even cared that it was OpenGL and that I was supposed to get inferior performance. And you probably know by now that I love playing games at max quality...


I didn't mean to give the impression that OpenGL games were unplayable. I beat Doom 3 on a 9800 non-Pro at max settings, minus AA; its OpenGL performance just wasn't as strong relative to the FX 5900 as its DX9 performance was.


----------



## Rockarola (May 11, 2017)

I realize that computers work in binary, but I didn't know it works the same way 40 cm from the monitor!
Any post mentioning AMD/nVidia looks the same... fanboys screaming at each other, trolls crap-posting just to get a rise, and the few intelligent posts drowned out by the noise.
WHAT THE FUCK HAPPENED TO OBJECTIVITY??? 
/rant over


----------



## ppn (May 11, 2017)

The next Titan X is probably going to be:
GV102, 5120 CUDA out of 5376
GDDR6, 384-bit, 12 GB Hynix
610 sq. mm.


----------



## GorbazTheDragon (May 11, 2017)

Just some theory on why AMD doesn't sell too well.

AMD basically dropped all their market share in the mobile market, both on the CPU and GPU side. For many people I think the logical choice after buying an Intel-NV laptop is an Intel-NV desktop (not to mention that no one in their right mind would have bought FX after Haswell if it was for gaming).

AMD CPUs haven't had the best rep in the last 5 years or so. Probably a lot of people (again, those who are not familiar with computers but want to game on PC) look at that and think the same applies to Radeon.


----------



## Caring1 (May 11, 2017)

GorbazTheDragon said:


> Just some theory on why AMD doesn't sell too well.
> 
> AMD basically dropped all their market share in the mobile market, both on the CPU and GPU side. For many people I think the logical choice after buying an Intel-NV laptop is an Intel-NV desktop (not to mention that no one in their right mind would have bought FX after Haswell if it was for gaming).
> 
> AMD CPUs haven't had the best rep in the last 5 years or so. Probably a lot of people (again, *those who are not familiar with computers* but want to game on PC) look at that and think the same applies to Radeon.


Hate to burst your bubble, but "those who are not familiar with computers" look at price and appearance when buying; they generally get sold the cheaper systems salespeople want to clear from stock, and usually AMD products fill that niche.


----------



## Fluffmeister (May 11, 2017)

So Vega isn't even the fastest unreleased GPU anymore? Fuck sake, life is cruel.


----------



## Caring1 (May 11, 2017)

Fluffmeister said:


> So Vega isn't even the fastest unreleased GPU anymore? Fuck sake, life is cruel.


Theoretically it still is, as this Volta-based Tesla V100 isn't a GPU as such.


----------



## GorbazTheDragon (May 11, 2017)

Caring1 said:


> Hate to burst your bubble, but "those who are not familiar with computers" look at price and appearance when buying; they generally get sold the cheaper systems salespeople want to clear from stock, and usually AMD products fill that niche.


I'd say it varies a lot; you have to remember that there is a large influx of PC gamers who want to build their own rig (the PCMR subreddit, for example, has a lot of traction). And I think a lot of people looking for real gaming rigs are buying prebuilts from sites where you can configure the system. I think if you ask those sites, they will tell you they sell more Intel than AMD systems; not entirely sure on the GPU side though...


----------



## Fluffmeister (May 11, 2017)

Caring1 said:


> Theoretically it is as this Volta based Tesla V100 isn't a GPU as such.



This place has gone mad.


----------



## TheoneandonlyMrK (May 11, 2017)

Hood said:


> It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets.  I guess you never got the memo...oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point.  Please don't stop, you are providing comic relief with your delusions.  Continue obsessing over trivial BS,
> so we can keep laughing...


You're confused. AMD recently updated drivers and OC software for FX and Radeon, and FX needs no CPU update via AGESA because it hasn't got built-in killer faults; but as we all know, they do update it when needed. Again, BS.

Crossfire 480s with waterblocks do 4K @ 60 Hz fine on all the old stuff, and I have been using them for a year, for 565.


----------



## Caring1 (May 11, 2017)

Fluffmeister said:


> This place has gone mad.


If you are referring to me, it's never been proven.


----------



## R4E3960FURYX (May 11, 2017)

oxidized said:


> Vega still has to come out, chill your beans. And also, even if they meant to take it on Volta, we all know they just misspelled Pascal. Now let's be quiet and just wait for them.



I think RX Vega may no longer compete with the GTX 1080 Ti, because the SK Hynix HBM2 supply does not meet the target speed of 409.6 GB/s, only offering 1.6 Gbps modules.
RX Vega's core config is not much different from my R9 Fury X's. I estimate a 15% clock-for-clock gain over the R9 Fury X GPU at most.

Compare that to Volta GV100 with its 5,000+ CUDA cores, 32 MB of SM memory (the high-bandwidth cache of the Volta CUDA cores) and 16 GB of HBM2 @ 900 MHz.

RX Vega with 8 GB of HBM2 is dead, overkilled by Volta GV100.





336 texture units, OMG! 128 ROPs, a 1455 MHz clock, 5,120 CUDA cores.





Maybe it's an advanced T-800 chip from The Terminator.
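For what it's worth, those headline numbers hang together: a quick sanity check of the quoted specs against the article's "up to 15 FP32 TFLOPS" claim, assuming the usual 2 FLOPs per CUDA core per cycle (one fused multiply-add):

```python
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    # Peak FP32 throughput: 2 FLOPs (one FMA) per core per cycle.
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

# V100 figures quoted above: 5,120 CUDA cores at 1455 MHz.
print(round(fp32_tflops(5120, 1455), 1))  # ~14.9, matching the "up to 15 TFLOPS" claim
```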


----------



## GorbazTheDragon (May 11, 2017)

The other thing you need to remember with this chip is that it has many non-FP32 cores. Expect a 5K-CUDA-core Volta for gaming to be significantly smaller.


----------



## alucasa (May 11, 2017)

Fluffmeister said:


> This place has gone mad.



Looks the same to me. It hasn't changed since I joined. The fanboyism has gotten worse, but not by much.


----------



## RejZoR (May 11, 2017)

Caring1 said:


> Hate to burst your bubble, but "those who are not familiar with computers" look at price and appearance when buying; they generally get sold the cheaper systems salespeople want to clear from stock, and usually AMD products fill that niche.



I've seen how that worked for normies. They were getting "recommendations" from experts. And experts said "buy Pentium 4" even though Athlon XP was the shit at the time. So, I've kinda seen how that went. Intel was shoehorned everywhere "just because Intel". Literally.

EDIT:
Also idiots comparing an out of this world expensive Volta compute unit with a consumer gaming GPU. I don't think I'm allowed to express the words I'd like to express right now, right here...


----------



## Kanan (May 11, 2017)

RejZoR said:


> I've seen how that worked for normies. They were getting "recommendations" from experts. And experts said "buy Pentium 4" even though Athlon XP was the shit at the time. So, I've kinda seen how that went. Intel was shoehorned everywhere "just because Intel". Literally.


Indeed, but those times are over. Ryzen has a lot of momentum, and Intel's good reputation is long gone after heavily milking its users and doing chaotic things like using toothpaste on CPU dies, potentially to decrease overclocking and the longevity of its own CPUs and force users to buy new stuff earlier (after seeing Sandy Bridge was "too good"). I think AMD, at least the CPU part of the company, is way more mature than in Athlon XP times; AMD isn't so small anymore, and the internet and the changed behaviour of people (almost anyone is an internet person nowadays, not just nerds and typical PC users like before) helps them big time. They are doing a way better job at PR today than 5, 10 or more years ago. I'm quite satisfied with it, and I'm a rather critical person who always criticised them for poor PR.


> EDIT:
> Also idiots comparing an out of this world expensive Volta compute unit with a consumer gaming GPU. I don't think I'm allowed to express the words I'd like to express right now, right here...


Right, it's annoying as fuck. Who's to blame? The news editor, and I think it was intentional. I have already made multiple posts to explain that it's not a GeForce GTX GPU for consumers and never will be, and that the actual Titan "Xv" will be much smaller, probably under 500 mm² again.


----------



## alucasa (May 11, 2017)

Let's vote to ban the news editors.


----------



## oxidized (May 11, 2017)

m0nt3 said:


> The GTX 280 was a bit better than the 4870 in performance, but not $200 better, which is the point; nvidia still outsold AMD, even with the GTX 400 series. All through the GTX 200-500 series AMD was more power efficient, which was a reflection of their small-die strategy, yet they still maintained performance very close to nvidia's. Most temp problems are with reference coolers. As an owner of a 4870, I don't recall going much past 80°C even in Crossfire during BFBC2. nvidia having the better product over the last 10 years is just not true, as I have been trying to demonstrate. In 2007 with the 8800 series, yes, hands down, no denying it; the 2900 series was not great. But from the GTX 200-500 series, as I stated above, it was just not so, although the 6900 series was a bit sub-par in my opinion, as it certainly lacked in AA performance compared to nvidia, as well as tessellation. That all changed with GCN. The 7000 series was faster than anything at the time, until Kepler came out (which was the beginning of nvidia selling midrange GPUs as higher-end GPUs). After some driver updates the 7000 series was back on par with Kepler. IMO the R9 290/290X was a big upset for nvidia, besting their $1000 card in several scenarios; unfortunately it was plagued by the bitcoin mining fad. A lot of market share was already lost before this point, in AMD's case. They have had a bad stigma that they have not been able to overcome. Maxwell was a home run for nvidia, but was behind AMD in one aspect, asynchronous compute, which has really helped the AMD cards shine with DX12/Vulkan and has been in place since the first gen of GCN and still isn't in Pascal, which is nothing more than a die-shrunk Maxwell plus more CUDA cores and higher clocks. So nvidia hands down for the past 10 years? No way.
> AMD has been very close, and the "technologically superior" comments were made because they were getting very close to nvidia's performance with a smaller die and lower power for most of those ten years. I am not making the case that AMD lost drastic market share because of some consumer conspiracy; I am simply pointing out that it is NOT because they were technologically inferior. However, I believe at this point Vega could be 15% faster than the Titan Xp and AMD's market share would not grow drastically.
> 
> So please share, how did AMD do so badly? Where they failed was marketing and getting game devs on board, which they are working on correcting now. nvidia won early on, IMO, because of their The Way It's Meant to Be Played marketing strategy and, later on, GimpWorks. You could see this loading at the start of many games back in the day. I remember seeing it at the beginning of UT2003/4 as the Skaarj busted through it; there was a mod to change it to an ATi logo.



Let alone the costs, that's the only big issue with nvidia. The Fermi refresh, aka the GTX 5xx, was better than its counterpart; you yourself said even the GTX 4xx was faster, just with all those big issues, while the 5xx was even faster and with none of those problems. Actually, I think the 5xx was one of the most successful recent "architectures". Again, about power efficiency, I don't remember the details from back then, but I'm pretty sure I remember "faster" and "cooler". My old GTX 580 with reference PCB/cooler reaches 85°C now; I'm pretty sure it was under 80°C back then, even at full load. So let's recap: the 8800 series was great; the 9800 was just a rebrand, so just faster, with all the 8800 series' pros; the GTX 2xx was a better product than its counterpart, except maybe in efficiency (?); the GTX 4xx was terrible; the GTX 5xx was pretty good, as it came with GTX 4xx performance or better but ran cooler and less power hungry. After that, with the GTX 6xx, 7xx and 9xx, nvidia always had the upper hand, at least for the first years after each product came out. Later, the Rx 2xx/3xx "fine wine" kicked in, and those cards managed to age better than their nvidia counterparts and eventually matched or slightly topped them, but at least a year later in some cases, two or more in others, not to mention the poor efficiency and temperatures on those cards, which were pretty much always worse than the nvidia counterparts'. Also, now that I remember, my brother had an XFX 5850 in a PC I built back then, and it was a pretty decent card, but it was very power hungry and its temps were just OK.
ATi/AMD has been behind for the last 10 years. I agree that they were always pretty close, incredibly so actually, having far less funding than nvidia; that's admirable, and they surely worked harder than nvidia, no doubt. But they were behind: except for the GTX 4xx era, nvidia just managed to collect more points in terms of pros vs. cons. If a certain ATi/AMD card was more power efficient than its nvidia counterpart, it was at the same time slower, ran hotter and had worse driver support. That's why I used the term "overall": we cannot take one pro and make it matter more than the others. If card X was faster than card Y, but card Y was more efficient, cheaper, cooler and had better drivers, card Y would be the better card overall; as long as one card has more pros than the other, that's the better product.
You want to talk about dies? OK, let's talk Polaris vs. Pascal GP106. The 480 is slower, hotter, FAR less power efficient; maybe it has better drivers; it's cheaper (maybe, because initially, here in Europe I can assure you it was far easier to find a 1060 at a decent price, while the 480 had insane prices and far less availability at launch and for 2-3 months after). So which one is the better product? You can't make only price matter. If the 1060 costs more (how much more remains to be seen, but let's say it does, because we're not only talking about Europe here), what's the problem? It's the better product in almost everything. Oh, I forgot to mention that Polaris has slightly better performance in DX12 and Vulkan (maybe, because DOOM's shift in framerates isn't only related to Vulkan); the GTX 1060 is still the winner overall. (I ordered an XFX RX 480 GTR Black Edition a week ago, to be shipped today or tomorrow, so no, I'm not the fanboy you think I am.)
OK, so I shared how AMD did worse over the last 10 years (no blame, given the funds). They also failed miserably in marketing, yeah. Which they are correcting now? Really? "Poor Volta-ge", anyone? "It has a brain", "it has a soul"? Really, correcting? No. And gimping cards is still another one of those legends I'd really like to see with my own eyes, because it sounds so much like BS; but no, hey, BS only comes out of nvidia's mouth.
"_You could see this loading at the start of many games back in the day. I remember seeing it at the beginning of UT2003/4 as the Skaarj busted through it; there was a mod to change it to an ATi logo._"
Basically ATi/AMD in a nutshell.


----------



## Nokiron (May 11, 2017)

RejZoR said:


> People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.


What? I'm talking business and enterprise (which is the market that matters). AMD did *NOT* have good software support, whether you like it or not.


----------



## R4E3960FURYX (May 11, 2017)

ppn said:


> the next titan X is probably going to be
> GV102 5120 Cuda out of 5376
> GDDR6 384 bit 12 GB HYNIX
> 610 sq.mm.



The king of 2018 GPUs.


----------



## medi01 (May 11, 2017)

12nm is a marketing term Intel would laugh at.

Clock speed is lower (expected).
815 mm² is monstrous, and so will be the price of that thing.

Some say it means consumer products are around the corner. If so, that's bad for AMD, although it won't be as big a jump as some expect.




R4E3960FURYX said:


> Estimate 15% gain clock for clock from R9 Fury X GPU max.



And they use nVidia marketing slides to somehow get to a 12.5 TFlops figure (up from 8.6) with a mere 15% clock bump and the same number of shaders.
Genius.

You should apply for a job at wccftech dude, you got talent.
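The arithmetic behind that objection is easy to check with the standard peak-FLOPS formula (2 FLOPs per shader per cycle via FMA); the Fury X's 4096 shaders at 1050 MHz are public specs, and the 15% bump is the premise being disputed:

```python
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    # Peak FP32 = 2 FLOPs (one fused multiply-add) per shader per cycle.
    return 2 * shaders * clock_mhz * 1e6 / 1e12

fury_x = fp32_tflops(4096, 1050)           # ~8.6 TFLOPS, the figure cited above
bumped = fury_x * 1.15                     # a mere 15% clock bump...
print(round(fury_x, 1), round(bumped, 1))  # ...lands near 9.9, nowhere near 12.5
# Hitting 12.5 TFLOPS on 4096 shaders needs ~1526 MHz, a ~45% clock increase.
```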




Nokiron said:


> Im talking business and enterprise (which is the market that matters)


Market that matters to... whom?
All but gaming market share is laughable.

So are 20-80k cars Tesla makes annually, compared to it.


----------



## GorbazTheDragon (May 11, 2017)

medi01 said:


> And they use nVidia marketing slides to somehow get to 12.5 TFlops figure (up from 8.6) with mere 15% clock bump and the same number of shaders.
> Genious.
> 
> You should apply for a job at wccftech dude, you got talent.
> ...


The strength of Volta will probably show in areas outside FP32.

And I'd encourage you to check your facts when it comes to which markets nvidia's chips are sold into... Last I heard, they were producing huge volumes of chips for supercomputers and processing clusters for commercial/government-funded operations. From what I've heard from friends in astronomy, they are also interested in moving to GPGPU, which will be done with NV chips. The market is most definitely there, and it is also set to grow.


----------



## Nokiron (May 11, 2017)

medi01 said:


> Market that matters to... whom?
> All but gaming market share is laughable.
> 
> So are 20-80k cars Tesla makes annually, compared to it.


For AMD. That's where the money is.


----------



## medi01 (May 11, 2017)

GorbazTheDragon said:


> I'd encourage you to check your facts





GorbazTheDragon said:


> I've heard from friends


Ironic.



Nokiron said:


> For AMD. That's where the money is.


It's a quickly growing market, but one still dwarfed by the gaming GPU market.


----------



## bug (May 11, 2017)

RejZoR said:


> People keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance", and in a decade of owning ATi/AMD cards I hardly ever experienced any issues with stability or performance. This too is one of the reasons why AMD just can't get any traction: people spreading misinformation and just plain lies, for reasons unknown to any logic.


I guess you're not a Linux user then. Only last year did AMD come out with a more decent driver, and a year later even that one doesn't have all the features of the older driver. Minor stuff like HDMI audio, Vulkan or OpenCL. And need I remind you of the trainwreck CCC was at launch? How people begged ATI/AMD not to release that monstrosity, and how they went ahead and did it anyway? Today the driver seems to be in a much better place, but it still has higher CPU overhead.
If you weren't bitten by any of those problems, then congrats.


----------



## GorbazTheDragon (May 11, 2017)

medi01 said:


> Ironic.
> 
> 
> It's a quickly growing market, but one still dwarfed by the gaming GPU market.


Friends of mine who happen to work in astronomy?


----------



## RejZoR (May 11, 2017)

bug said:


> I guess you're not a Linux user then. Only last year did AMD come out with a more decent driver, and a year later even that one doesn't have all the features of the older driver. Minor stuff like HDMI audio, Vulkan or OpenCL. And need I remind you of the trainwreck CCC was at launch? How people begged ATI/AMD not to release that monstrosity, and how they went ahead and did it anyway? Today the driver seems to be in a much better place, but it still has higher CPU overhead.
> If you weren't bitten by any of those problems, then congrats.



And here we come to the Linux argument again. For the 50 people who game on Linux, is it really a problem? C'mon?

Trainwreck of CCC? Why do you people keep bashing AMD for PAST things but conveniently ignore the great CURRENT ones? If you look at the Crimson Control Panel, it's a trillion light years ahead of the archaic NV Control Panel, which is the same as it was 10 years ago. And equally broken: it pisses me off every time I have to change anything in it, because it keeps resetting the damn settings list to the top whenever you select ANYTHING. It's infuriating, and yet it has been this way for ages. Go figure...

And there is no "higher CPU overhead". NVIDIA just invested tons of time and resources into multithreading with DX11, which makes it look like AMD cards are more CPU intensive. But realistically, it's all drama bullshit. I've been gaming in DX11 with Radeons for years and performance was always excellent. But people see a 3 fps difference in benchmarks and instantly lose their s**t.


----------



## GorbazTheDragon (May 11, 2017)

Linux drivers are shit on both sides, so what's the argument?


----------



## bug (May 11, 2017)

RejZoR said:


> And here we come to the Linux argument again. For the 50 people who game on Linux, is it really a problem? C'mon?
> 
> Trainwreck of CCC? Why do you people keep bashing AMD for PAST things but conveniently ignore the great CURRENT ones? If you look at the Crimson Control Panel, it's a trillion light years ahead of the archaic NV Control Panel, which is the same as it was 10 years ago. And equally broken: it pisses me off every time I have to change anything in it, because it keeps resetting the damn settings list to the top whenever you select ANYTHING. It's infuriating, and yet it has been this way for ages. Go figure...
> 
> And there is no "higher CPU overhead". NVIDIA just invested tons of time and resources into multithreading with DX11, which makes it look like AMD cards are more CPU intensive. But realistically, it's all drama bullshit. I've been gaming in DX11 with Radeons for years and performance was always excellent. But people see a 3 fps difference in benchmarks and instantly lose their s**t.


The world is not built around your personal usage pattern. For that, we apologize.


----------



## xenocide (May 11, 2017)

I don't post here for a few months and you all turn into god damn savages...

I kid, but seriously.


----------



## jabbadap (May 11, 2017)

GorbazTheDragon said:


> Linux drivers are shit on both sides, so what's the argument?



In the HPC space linux has over 90% market share, and these cards are well suited to it; and no, nvidia has a great linux driver with full support for the needed features.

Xorg is an old dinosaur which handicaps the whole desktop side of linux. No driver can fix that.


----------



## GorbazTheDragon (May 11, 2017)

Which products are we talking about, GeForce or Tesla?

Last time I tried, it was almost impossible to get my 670 to work properly on Ubuntu and Mint. It was even more hopeless when I tried my old laptop (yay, Optimus), but from the people I know who use the cards for actual compute stuff, it's a different story.

But someone in this thread told me HPC was an irrelevant market.


----------



## bug (May 11, 2017)

GorbazTheDragon said:


> Which products are we talking about, GeForce or Tesla?
> 
> Last time I tried, it was almost impossible to get my 670 to work properly on Ubuntu and Mint. It was even more hopeless when I tried my old laptop (yay, Optimus), but from the people I know who use the cards for actual compute stuff, it's a different story.
> 
> But someone in this thread told me HPC was an irrelevant market.


The article has "Tesla" in its name, so...
Also, my trusty old 660 Ti (not a 670, but pretty damn close) worked flawlessly on Ubuntu for years. My work 610M continues to do so, alongside the IGP. Prime gave me a bit of a headache till I set it up right, but it's been smooth ever since. I can't imagine how you managed to screw it up; there's literally no distro that makes installing proprietary drivers easier than Ubuntu.


----------



## alucasa (May 11, 2017)

bug said:


> there's literally no distro that makes installing proprietary drivers easier than Ubuntu.



True dat.


----------



## GorbazTheDragon (May 11, 2017)

Ehh, probably me being unfamiliar with the platform, but neither my mother nor my father, who both used unix systems for their whole careers, got it to work at the time.

My 8500GT never gave me trouble though...

Regardless, NV is obviously pushing heavily on the HPC/compute market like they have done since Tesla, and I think the results show. From Kepler to Volta they have opened up many new markets for GPGPU, among them what I mentioned earlier in (radio) astronomy...


----------



## RejZoR (May 11, 2017)

Nokiron said:


> What? I'm talking business and enterprise (which is the market that matters). AMD did *NOT* have good software support, whether you like it or not.



Of course you were


bug said:


> The world is not built around your personal usage pattern. For that, we apologize.



Now, all of a sudden my usage (as a gamer) doesn't matter because it's not a negative one towards AMD. Okay...


----------



## bug (May 11, 2017)

RejZoR said:


> Now, all of a sudden my usage (as a gamer) doesn't matter because it's not a negative one towards AMD. Okay...



Way to spin it. I gave you several areas where AMD fell short over the years; you dismissed them all because you didn't have a problem with them, and now I'm cherry-picking? You're good.

Edit: And if you're talking strictly Windows, yes, I've had no problem recommending AMD to friends over the years. But for me, it never made the cut, mostly because of abysmal Linux support.


----------



## efikkan (May 11, 2017)

Raevenlord said:


> This chip is a beast of a processor: it packs 21 billion transistors (up from 15,3 billion found on the P100); it's built on TSMC's 12 nm FF process (evolving from Pascal's 16 nm FF); and measures a staggering 815 mm² (from the P100's 610 mm².) This is such a considerable leap in die-area that we can only speculate on how yields will be for this monstrous chip, especially considering the novelty of the 12 nm process that it's going to leverage.


That's less than a 3% increase in transistor density. The reason is that TSMC's "12 nm FinFET" is actually the same node as TSMC's "16 nm FinFET", which itself derives from TSMC's "20 nm" node; "third-generation 20 nm" would be a fairer description if we follow Intel's standard. If this were a real node shrink, we'd see close to a doubling in density. So remember: TSMC's "12 nm" is not a node shrink.
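The density claim follows directly from the die figures quoted in the article (21 billion transistors on 815 mm² vs. 15.3 billion on 610 mm²):

```python
# Transistor density (transistors per mm²) from the article's own numbers.
p100 = 15.3e9 / 610   # Pascal GP100
v100 = 21.0e9 / 815   # Volta GV100
print(round(v100 / p100 - 1, 3))  # ~0.027, i.e. under 3% denser despite the "12 nm" label
```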



R4E3960FURYX said:


> RX Vega 8GB HBM2 is death and overkill by Volta GV 100.


GV100 will not arrive in any consumer product anytime soon, perhaps never.



RejZoR said:


> And there is no "higher CPU overhead". NVIDIA just invested tons of time and resources into multithreading with DX11 which makes it look like AMD cards are more CPU intensive. But realistically, it's all drama bullshit. Been gaming in DX11 with Radeons for years and performance was always excellent. But when people see 3fps difference in benchmarks and they instantly lose their s**t entirely.


One customer being "satisfied" doesn't prove anything.
Not to spawn another discussion, but stutter is one aspect where AMD still has a lot to improve in its driver.



GorbazTheDragon said:


> Linux drivers are shit on both sides, so what's the argument?


Linux might not have the same game selection as Windows, but as anyone into professional graphics knows, Nvidia is the only vendor offering enterprise-quality drivers for Linux, and those drivers are even more stable than their Windows counterparts.


----------



## Fx (May 11, 2017)

Hood said:


> It's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets.  I guess you never got the memo...oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point.  Please don't stop, you are providing comic relief with your delusions.  Continue obsessing over trivial BS,
> so we can keep laughing...



No sir, it is not common knowledge. That would be called a stigma, one which originated from a time when ATI was lacking in support. That was a long time ago.

bug, FYI, I am referring to Windows support, that being the prevalent gaming platform by vast margins.


----------



## RejZoR (May 11, 2017)

efikkan said:


> That's less than a 3% increase in transistor density. The reason is that TSMC's "12 nm FinFET" is actually the same node as TSMC's "16 nm FinFET", which itself derives from TSMC's "20 nm" node; "third-generation 20 nm" would be a fairer description if we follow Intel's standard. If this were a real node shrink, we'd see close to a doubling in density. So remember: TSMC's "12 nm" is not a node shrink.
> 
> 
> GV100 will not arrive in any consumer product anytime soon, perhaps never.
> ...



I can tell you a lot about unexplained stutter with my GTX 980 as well. It's fine for a while, and then mouse motion just feels like absolute garbage for no logical reason. But hey, what does one user matter, right?


----------



## Nokiron (May 11, 2017)

RejZoR said:


> Of course you were


We are in a thread about a datacenter GPU... Why else would I talk about programming models, APIs and developers?

Which then _you_ quoted.


----------



## RejZoR (May 11, 2017)

Nokiron said:


> We are in a thread about a datacenter GPU... Why else would I talk about programming models, APIs and developers?
> 
> Which then _you_ quoted.



Then you may want to ask the same of everyone dragging Vega into this discussion. Vega is a consumer card designed for gaming.


----------



## jabbadap (May 11, 2017)

RejZoR said:


> Then you may want to ask the same of everyone dragging Vega into this discussion. Vega is a consumer card designed for gaming.



RX Vega is consumer card, Vega 10 on Mi25 is datacenter Vega.


----------



## RejZoR (May 11, 2017)

C'mon, who calls the compute one "Vega 10"? No one. We all call it Mi25.


----------



## GorbazTheDragon (May 11, 2017)

And you are expecting an AMD comeback purely off their Radeon gaming card sales?

I still haven't heard back about this idea that all markets outside gaming are irrelevant. AMD is simply uncompetitive in this regard because of how fast NV is moving.


----------



## jabbadap (May 11, 2017)

RejZoR said:


> C'mon, who calls the compute one "Vega 10"? No one. We all call it Mi25.



And yet it has the same Vega 10 GPU inside as the consumer card RX Vega. But granted, you are not the one who dragged _RX Vega_ into this discussion in the first place.

It's painful when a company calls its underlying architecture and a consumer product by the same name. It makes the conversation quite hard to follow.


----------



## Nokiron (May 11, 2017)

RejZoR said:


> Then you may want to ask the same of everyone dragging Vega into this discussion. Vega is a consumer card designed for gaming.


Vega is relevant since AMD calls MI25 "MI25 Vega with NCU"

http://instinct.radeon.com/en-us/about/


----------



## RejZoR (May 11, 2017)

Oh dear. That's like calling GeForce Titan Xp "a graphic card, but not the ordinary one, it's the one for developers, but you know, it's not the cut down Pascal, it's the full fat one, but it's not only for developers, gamers can also use it for you know, gaming, but we don't call it Pascal because that's it's core codename."

That's the level of stupidity here about how a product is being named. The gaming one is RX Vega, Vega, Vega 10, the big Vega, you name it. The "professional" ones were always called either RadeonPro/FirePro or now, Instinct (Mi25 in Vega's SKU case). No one cares what you might want to call it or how the long version of the first paragraph applies to it. It's just simple, basic communications common sense that prevents any kind of confusion. Any Vega is a gaming card; any Mi or Fire is workstation stuff. It's not rocket science to use it this way, you know. It's not like there's gonna be any other Mi25 with, I don't know, a Navi core. It'd be called something different. So, why the need to overcomplicate simple things?


----------



## Nokiron (May 11, 2017)

RejZoR said:


> Oh dear. That's like calling the GeForce Titan Xp "a graphics card, but not the ordinary one, it's the one for developers, but you know, it's not the cut-down Pascal, it's the full-fat one, but it's not only for developers, gamers can also use it for, you know, gaming, but we don't call it Pascal because that's its core codename."
> 
> That's the level of stupidity here about how a product is being named. The gaming one is RX Vega, Vega, Vega 10, the big Vega, you name it. The "professional" ones were always called either RadeonPro/FirePro or now, Instinct Mi25. No one cares what you might want to call it or how the long version of the first paragraph applies to it. It's just simple, basic communications common sense that prevents any kind of confusion. Any Vega is a gaming card; any Mi or Fire is workstation stuff. It's not rocket science to use it this way, you know. It's not like there's gonna be any other Mi25 with, I don't know, a Navi core. It'd be called something different. So, why the need to overcomplicate simple things?


No, it is not. Titan Xp is a specific product with a very specific target audience.

That does not matter. Again, we are talking about datacenters, which is why Vega is relevant (since the only existing Vega-based product is in that market). It is the only product we actually have data on. If you assume we are talking about the RX version, you have not read the article.
If someone does talk about it, it is not relevant to the news, the product, or the competition.

And Instinct cards are most definitely not workstation cards.


----------



## RejZoR (May 11, 2017)

Then how hard is it to address it simply as "Mi25", eh? Very, apparently... All you need is four characters, and you lot prefer to go completely overboard about it. Jesus...


----------



## Fluffmeister (May 11, 2017)

NV's stock price is pushing up toward $130; the markets like what they are seeing.

http://www.nasdaq.com/article/nvidia-nvda-q1-earnings-revenues-top-outlook-strong-cm787164


----------



## GorbazTheDragon (May 11, 2017)

Of course they like it, this is a card that looks to deliver on the order of a 200-300% improvement over the previous gen on many relevant (realistic) compute workloads!


----------

