# Possible NVIDIA GM200 Specs Surface



## btarunr (Jan 2, 2015)

Somebody sent our GPU-Z validation database a curious-looking entry. Labeled "NVIDIA Quadro M6000" (not to be confused with the AMD FirePro M6000), with a device ID of 10DE - 17F0, this card is running on existing Forceware 347.09 drivers, and features a BIOS string that's unlike anything we've seen. Could this be the fabled GM200/GM210 silicon? 

The specs certainly look plausible: 3,072 CUDA cores, 50 percent more than on the GM204; a staggering 96 ROPs; and a 384-bit wide GDDR5 memory interface holding 12 GB of memory. The memory is clocked at 6.60 GHz (GDDR5-effective), belting out 317 GB/s of bandwidth. The usable bandwidth is higher than that, thanks to NVIDIA's new lossless texture compression algorithms. The core runs at a gigahertz-scraping 988 MHz. The process node and die size are values we manually program GPU-Z to show, since they're not things the drivers report to GPU-Z. NVIDIA is planning to hold a presser on the 8th of January, on the sidelines of the 2015 International CES. We're expecting a big announcement (pun intended).
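For those wanting to check the math, the quoted bandwidth follows directly from the bus width and effective data rate. A quick sketch of the arithmetic - plain calculation from the leaked figures, nothing vendor-confirmed:

```python
# Bandwidth implied by the leaked GM200 memory specs
bus_width_bits = 384      # memory interface width
data_rate_gtps = 6.6      # GDDR5 effective data rate, GT/s per pin

# bytes per second = (bus width / 8 bits per byte) * transfers per second
bandwidth_gbs = bus_width_bits / 8 * data_rate_gtps
print(f"{bandwidth_gbs:.1f} GB/s")  # → 316.8 GB/s, reported rounded to 317
```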





*View at TechPowerUp Main Site*


----------



## ZoneDymo (Jan 2, 2015)

Sooo can we now finally truly move on to 4k?


----------



## HumanSmoke (Jan 2, 2015)

Assuming it's correct and indicative of a shipping product, the clocks seem fairly high given that this appears to be a workstation card - especially the memory clock.
If so, clocks-wise, it augurs well for GeForce-branded cards... now, I guess, we wait around until Nvidia is pressured into releasing it as such.

If I'm reading this right, the ROP and core counts are both 50% greater than GM 204's, but the 252.9 GTex/s fillrate implies that the texture address units have increased by 100%, from 128 to 256 (988 MHz * 256 = 252.928 GTexels/s) - a similar step to that seen in the Kepler arch. Although, assuming 128 cores per module (as per GM 107 and GM 204), the number should be 192 (3,072 cores / 128 per module = 24 SMM * 8 TMU per SMM = 192).
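That arithmetic can be laid out step by step. The TMU counts below are inferences from the quoted fillrate and the known GM107/GM204 layout, not confirmed GM200 specs:

```python
# TMU count implied by the leaked fillrate: fillrate = clock * TMUs
core_clock_mhz = 988
fillrate_gtex_s = 252.9                 # GTexels/s from the GPU-Z entry
implied_tmus = fillrate_gtex_s * 1000 / core_clock_mhz
print(round(implied_tmus))              # → 256

# TMU count expected from the SMM layout (8 TMUs per 128-core SMM,
# as on GM107 and GM204) - it doesn't match the implied figure
cuda_cores = 3072
smm_count = cuda_cores // 128           # 24 SMMs
expected_tmus = smm_count * 8
print(expected_tmus)                    # → 192
```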

@btarunr
Should run a poll on the thread: "What percentage of posts will howl about pricing?"
A. 90-95%
B. 96-97%
C. 98%+


----------



## Prima.Vera (Jan 2, 2015)

ZoneDymo said:


> Sooo can we now finally truly move on to 4k?


No. You can barely run at 1440p, never mind 4K...


----------



## the54thvoid (Jan 2, 2015)

If that's the quadro part we could expect 6gb memory on the desktop part. Assuming info is correct.

And yes, I think the flame trolls will be inbound with haste on the cost front.


----------



## HumanSmoke (Jan 2, 2015)

the54thvoid said:


> If that's the quadro part we could expect 6gb memory on the desktop part. Assuming info is correct.


Aye, and it will be interesting to see what the clock envelope is for the gaming/prosumer parts. 12 GB running at 6600 MHz effective must chew through at least a third of the power budget for a workstation card (assuming ~250W board power)
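A rough sanity check on that third-of-the-budget claim - the per-device draw below is an assumed illustrative figure (GDDR5 chips are commonly cited in the 2-3 W range under load), not a measured value:

```python
# 12 GB built from 4 Gb (0.5 GB) devices -> 24 chips on a 384-bit bus
total_mem_gb = 12
chip_density_gb = 0.5                    # 4 Gb devices, typical for 2015 GDDR5
chips = int(total_mem_gb / chip_density_gb)

watts_per_chip = 3.0                     # assumed upper-range draw per device
board_power_w = 250                      # assumed workstation board power

mem_power_w = chips * watts_per_chip
print(f"{mem_power_w:.0f} W, {mem_power_w / board_power_w:.0%} of board power")
# → 72 W, 29% of board power - in the ballpark of "at least a third"
```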


----------



## Chitz (Jan 2, 2015)

Holy Cupcakes just comparing my 680's specs to this puts my entire rig to shame


----------



## GhostRyder (Jan 2, 2015)

Interesting spec sheet. If this really is the GM 200 we have been anticipating, then it's got a lot of power hidden inside - especially if you factor in how the 980 performs with significantly fewer CUDA cores than Kepler's big chips carried. I will be more interested in the core and memory clocks on the desktop counterpart than anything else, along with the final memory configuration - though it's probably going to be 6 GB, based on the preliminary guesses.



ZoneDymo said:


> Sooo can we now finally truly move on to 4k?



Well, one 290X or 980 can do an OK job right now at high settings, so if this has about a ~30% performance advantage (depending on clocks), I would say yes - two of these would be able to drive almost all games at ultra settings at 60 FPS. But that is just a guess...



the54thvoid said:


> If that's the quadro part we could expect 6gb memory on the desktop part. Assuming info is correct.



That is what I think as well, but I think that will be the GTX 1080 (yet to be named, but I would love it to be that name!!!) amount, as I just do not see the Titan II having 6 GB again, based on where they (sort of) aim that product.


----------



## alwayssts (Jan 2, 2015)

HumanSmoke said:


> Aye, and it will be interesting to see what the clock envelope is for the gaming/prosumer parts. 12gb running at 6600M effective must chew through at least a third of the power budget for a workstation card (assuming ~250W board power)



According to my math, that sounds about right.  

So (if this is true) figure a core clock roughly 20-25% higher or so (real/'boost'/load), which more-or-less meshes with the earlier rumors.

I have to imagine the default clock (for consumer parts) is fluid, dependent upon wherever Fiji (or whatever) lands comparably, but I don't doubt that the earlier-reported 1100 base / 1390 boost (overclocked?) is within reason for 300W.

I've always figured this arch was engineered with 20nm in mind (which Nvidia obviously backed out of at some point, probably not long after they threw that hissy-fit presentation about TSMC's 20nm cost), and the clocks vs. older archs reflect that. IOW, 1100 is probably the new ~900 MHz and 1390 the new 11xx (think 1.163-1.175ish volts for both companies' earlier products). It meshes with one of Nvidia's chief scientists saying 20nm gave a 20-25% boost.

Extrapolate as we may, we still prolly have a ways to go before any hard numbers that could truly give an indication of where it (or the competition) will end up.

The only thing one can safely say is 980 is meant to preempt a faster iteration of a 290x-like product, and a higher segment is typically around 15-20% faster.

From there, each company *should* have two more products on new chips that climb two more approx similar steps, while obviously each lower chip generally overclocks to the level of the stock next level.

If you take that to its logical conclusion, assuming 21 and 24 SMM parts that (over)clock slightly lower (~90%?) than their GM204 counterparts (due to extra RAM etc.), I think it gels... but that's pure speculation.

(And I didn't even mention cost... even though I think the 290/290X clearly show a trend for the way forward in AMD's future price structure - which will lead to inevitable 980 price drops and higher-end chips taking its place.)


----------



## Steevo (Jan 2, 2015)

Dear god, 50% more than current production, and assuming a slight process gain in efficiency? Its like AMD is getting tentacle raped in the bad way, all holes all the time. 

Even if this is the pro-grade part, the slightly cut-down version will still be a monster - but assuming it's on the same process node, how are they cooling this beast?


----------



## ZhuI (Jan 2, 2015)

I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.

Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.

Nvidia needs to destroy the Titan line. But they won't, because lots of NV fanboys will do almost anything for Nvidia even as the company pisses in their mouths, the fanboys only beg for more. 

And btw, if AMD goes down in the GPU space - which is absolutely a possibility - consider the X80 flagships gone and replaced with the 1000 dollar GPU cards instead. But I'm sure you people will defend that, too


----------



## CrAsHnBuRnXp (Jan 2, 2015)

ZhuI said:


> I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.
> 
> Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.
> 
> ...


Because this post doesn't scream AMD zealot at all.


----------



## Hilux SSRG (Jan 2, 2015)

If this is the full chip and not a cutdown then it's going to cost $1000-$1500.


----------



## the54thvoid (Jan 2, 2015)

CrAsHnBuRnXp said:


> Because this post doesnt scream AMD zealot at all.



So true. 

I'm one of those evil NVidiots. I drive my Nissan Skyline with decals of a winged JSH on the hood/bonnet. I buy NV stock and smoke rolled up AMD shares. 

Or I buy what suits my needs, as long as my budget meets it.


----------



## ZeDestructor (Jan 2, 2015)

ZhuI said:


> I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.
> 
> Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.
> 
> ...



Please. Titan 1 is the poor man's Quadro K6000, not your rich man's GTX 785. Its sole purpose is to provide FP64 processing and ECC at $1000 instead of $6000, and for that, it works just fine.

Titan Z is just a showoff piece, not a card that sells. Mind you, it sold even more poorly than nvidia expected, so there is a sentiment of overpricedness there as well.


----------



## MxPhenom 216 (Jan 2, 2015)

ZoneDymo said:


> Sooo can we now finally truly move on to 4k?



More like 5-8k.


----------



## Xzibit (Jan 2, 2015)

ZhuI said:


> I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.
> 
> Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.
> 
> ...



If this turns out to be true.

*NVIDIA Planning To Ditch Maxwell GPUs For HPC Purposes Due To Lack of DP Hardware – Will Update Tesla Line With Pascal in 2016, Volta Arriving in 2017*

There might not be Titans this time around, but high prices are another thing.


----------



## Jurassic1024 (Jan 2, 2015)

96 ROPS! Holy sh*t.


----------



## Fluffmeister (Jan 2, 2015)

Xzibit said:


> If this turns out to be true.
> 
> *NVIDIA Planning To Ditch Maxwell GPUs For HPC Purposes Due To Lack of DP Hardware – Will Update Tesla Line With Pascal in 2016, Volta Arriving in 2017*
> 
> There might not be Titans this time around but high prices is another thing.



DP is irrelevant for the average Joe consumer anyway, so no great loss there.

And it presumably makes sense with 28nm sticking around longer than expected; the recent launch of the K80 no doubt helps to fill the gap.

Speaking of Titan, that big fat government contract for two new supercomputers, Summit and Sierra, should keep them busy.

*NVIDIA Volta, IBM POWER9 Land Contracts For New US Government Supercomputers*


----------



## Xzibit (Jan 2, 2015)

It also gives a little more credibility to the leaks from Sisoft *remember*.  Might also mean the ChipHell leaks were not far off either.

Nvidia GM200

AMD Fiji
----------



## Assimilator (Jan 2, 2015)

Blimey. 50% more CUDA cores, 50% more ROPs, and a staggering 100% more TMUs. If these specs are correct, this card will be a beast of note. And if nVidia's already got GM200 ready to go, only slightly after AMD releases Fiji... it's not gonna be good for AMD. Even if Fiji outperforms GM204 significantly, nVidia can just drop GM200 into the retail channel and erode that advantage.


----------



## techy1 (Jan 2, 2015)

If AMD's Fiji is weak, this will cost $2,000 at least... if AMD's Fiji is a beast at a low price, then this will be priced $700 max... so let us all hope and cheer for AMD - so we can all get this Nvidia card cheap and really handle that 4K


----------



## Xzibit (Jan 2, 2015)

techy1 said:


> if AMDs Fiji will be weak - this will cost 2000$ at least... if AMDs Fiji will be beast+ low price , then this will be priced 700$ max.... so lets us all hope and cheer for AMD - so we can all get this Nvidia cheap and really handle hat 4K



If one puts any real value into the Nvidia/AMD leaks, this one is interesting, because we could very well be headed for a nice performance/price war if these products come out close to one another.


----------



## HumanSmoke (Jan 2, 2015)

Xzibit said:


> If one puts any real value into the Nvidia/AMD leaks. This one is interesting because we could very well be headed for a nice performance/price war if these products come out close to one another.


Those "results" were debunked as bogus almost as soon as they emerged.
Ask yourself: what source would be in possession of BOTH Nvidia's and AMD's next top cards, as well as AMD's second-tier offering and Nvidia's GM 200 salvage part? One source having access to *four unreleased top tier parts* across both vendors.

Not only would they have access to *both* vendors' next offerings - not a single other source has benchmarked even one of those four.


----------



## Xzibit (Jan 2, 2015)

HumanSmoke said:


> Those "results" were debunked as bogus almost as soon as they emerged.
> Ask yourself, what source would be in possession of BOTH Nvidia's and AMD's next top cards as well as AMD's second tier offering and Nvidia's GM 200 salvage part. One source having access to *four unreleased top tier parts* across both vendors



A similar source to the one that put up the SiSoft scores? Which, months later, now match the TPU validation...

Can you source the debunking? How were they debunked, by the way? Was it a consensus from people who didn't like the outcome?


----------



## the54thvoid (Jan 2, 2015)

Xzibit said:


> A similar source that put up the Sisoft scores? Which months later now match the TPU validation..
> 
> Can you source the debunking? How were they debunked by the way?  Was it a consensus from people who didn't like the outcome?



It IS hugely unlikely to get both vendors' top cards. Besides, the scores are all over the shop.

On the other hand, a 290X runs on par with a GTX 980 at 4K, so it's not out of the question to expect AMD and NV to be close this time around. The loser will be whoever releases first (IMO). I think the vendor that releases second will tinker with their product to pull out a performance edge, or use aggressive pricing.

Good for both camps.


----------



## HumanSmoke (Jan 2, 2015)

Xzibit said:


> Can you source the debunking? How were they debunked by the way?


Maybe because basically every outlet from the rumour mills to noted sites evinced total scepticism over a single source having access to both Nvidia and AMD unreleased products, or maybe because there are two AMD cards months away from release- one at least with a markedly different design if the HBM rumours are true - being benchmarked with obviously functioning drivers? No mean feat considering AMD (and Nvidia for that matter) have had obvious driver issues on/before launch day.


Xzibit said:


> Was it a consensus from people who didn't like the outcome?


More of a consensus from people with common sense and an understanding of the evolution of a product.

EDIT: I forgot the obvious red flag. The guy leaking this info claimed that the AMD GPUs are manufactured on GloFo's 20nm process....which means that AMD is *presently using* *20nm* and *will transition to 28nm* (?!) in the future. BTW the ONLY 20nm process GloFo has is a low power process - which doesn't gel with a large monolithic GPU of 200-300W


----------



## AsRock (Jan 2, 2015)

ZhuI said:


> I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.
> 
> Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.
> 
> ...



Shhhh, NV fans will cry if you say shit like that. We're not allowed to point out facts like that, even if this thread is based on a rumor in the first place.


----------



## Fluffmeister (Jan 2, 2015)

AsRock said:


> Shhhh, NV fans will cry if you say shit like that.  Were not allowed to point out facts like that even if this thread is based on a rumor in the first place.



Titan existed and it sold just fine, even nVidia were surprised how well it sold:

http://www.pcgamer.com/nvidias-surp...e-year-old-gtx-690-in-just-3-months-were-not/

The fake graphs and the clutching at straws in this thread, though, are funny - now that is a fact.


----------



## Xzibit (Jan 2, 2015)

the54thvoid said:


> It IS hugely unlikely to get both vendors top cards.  Besides, the scores are all over the shop.



The GM200 scores are the ones all over the shop; the AMD ones seem stable, with one at 800 and the other at 1000.

People forget how early we got Maxwell 750 benchmarks - five months out, with ChipHell coming in three months early with overclocked 750 Ti sample results. Since those numbers looked favorable, people gravitated towards them, and they turned out to be overestimated. If these cards are being released in a similar time-frame to one another, it's plausible for them and the SiSoft source to have both.



HumanSmoke said:


> Maybe because basically every outlet from the rumour mills to noted sites evinced total scepticism over a single source having access to both Nvidia and AMD unreleased products, or maybe because there are two AMD cards months away from release- one at least with a markedly different design if the HBM rumours are true - being benchmarked with obviously functioning drivers? No mean feat considering AMD (and *Nvidia for that matter) have had obvious driver issues on/before launch day*.



Blasphemy

Time will tell if they are true & accurate or not but it seams to be more of a hate for Chiphell having access to the cards.


----------



## Champ (Jan 2, 2015)

Steevo said:


> Its like AMD is getting tentacle raped in the bad way, all holes all the time.



That's funny


----------



## Fluffmeister (Jan 2, 2015)

Xzibit said:


> Time will tell if they are true & accurate or not but it seams to be more of a



Well you would think that, I'd say send your CV off but I don't think they are hiring at the moment.


----------



## MxPhenom 216 (Jan 3, 2015)

LOL at those who say Titan 1 was a failure. That card sold like hot cakes before the 780 came about, and even after that people who had use for the compute performance bought them up.


----------



## ZoneDymo (Jan 3, 2015)

MxPhenom 216 said:


> LOL at those who say Titan 1 was a failure. That card sold like hot cakes before the 780 came about, and even after that people who had use for the compute performance bought them up.



indeed, says quite a bit about the NV crowd, does it not?


----------



## Fluffmeister (Jan 3, 2015)

ZoneDymo said:


> indeed, says quite a bit about the NV crowd does not not?



They are successful well paid individuals with great taste?


----------



## MxPhenom 216 (Jan 3, 2015)

ZoneDymo said:


> indeed, says quite a bit about the NV crowd does not not?



Are you trying to say they are stupid for buying such a card? I think that's more opinion than fact. @Fluffmeister hit it on the head, I think.


----------



## HumanSmoke (Jan 3, 2015)

Fluffmeister said:


> Well you would think that, I'd say send your CV off but I don't think they are hiring at the moment.


Even if they were, I'd suggest investing the cash rather than buying the postage. In a few years it should appreciate enough to buy a majority shareholding.


----------



## 15th Warlock (Jan 3, 2015)

This will be an awesome year for PC gamers. I can't wait to see what comes next from the red team; either way, I'm excited at the sheer amount of power these new cards will bring to us gamers - I was tired of 10-15% performance increments at full price.

We are in for a doozy, guys. We might be looking at the business/research version of big Maxwell here, but I'm sure a more customer-friendly card à la the 780/780 Ti is just around the corner.

As someone who's fully embraced 4K gaming, all I can say is: bring it on, AMD and Nvidia!


----------



## vega22 (Jan 3, 2015)

CrAsHnBuRnXp said:


> Because this post doesnt scream AMD zealot at all.




only to the green fanbois.

some of us see it as fact.


----------



## MxPhenom 216 (Jan 3, 2015)

marsey99 said:


> only to the green fanbois.
> 
> some of us see it as fact.



It's the furthest thing from fact.


----------



## entropy13 (Jan 3, 2015)

btarunr said:


> The process node and die-size are values we manually program GPU-Z to show, since they're not things the drivers report (to GPU-Z).



So Nvidia didn't break science/physics? 


I find ZhuI's *only* post so far quite weird. By his own logic, Intel's i7 for LGA 1150 should not exist, because it has been replaced by LGA 2011's i7s - after all, AMD is absent at the high end in processors, just like his assumption that "if AMD goes under... the X80 flagships will be all $1000 parts".


----------



## AsRock (Jan 3, 2015)

Fluffmeister said:


> They are successful well paid individuals with great taste?



Who have no value or sense, but shit  cannot spend ya money when ya dead so.

As they say more money than sense.


----------



## HumanSmoke (Jan 3, 2015)

Xzibit said:


> Time will tell if they are true & accurate or not but it seams seems to be more of a



Good catch, and quite possibly true...and the hate comes with a side order of bodily excretion fixation 


ZhuI said:


> lots of NV fanboys will do almost anything for Nvidia even as the company pisses in their mouths, the fanboys only beg for more.





AsRock said:


> Who have no value or sense, but shit  cannot spend ya money when ya dead so.As they say more money than sense.


----------



## Xzibit (Jan 3, 2015)

HumanSmoke said:


> Good catch, and quite possibly true...and the hate comes with a side order of bodily excretion fixation



I meant the hate for ChipHell when it's convenient. You pointed to a quote from VideoCardz as your source of proof of the debunking. There was no debunking of anything, just a sentence dismissing it as fake but *NO PROOF*. Given that VideoCardz will use ChipHell as a source whenever possible and not give them or anyone else credit unless they watermark their screens, it reeks of hilarity.

As for the Nvidia vs AMD hate, you guys can carry on. I enjoy the childishness, hypocrisy & self-righteousness.


----------



## HumanSmoke (Jan 3, 2015)

Xzibit said:


> I meant the hate for Chiphell when its convenient.


Then you don't know what you're talking about.
I don't tar Chiphell with disdain for a single poster any more than I would show disdain for TPU because a single forum member chooses to pick an argument using nonsensical data - namely, that *the first line in the "leaker's" post categorically said that the AMD GPUs were manufactured on GloFo's 20nm Low Power Mobility process* - a process they would be totally unsuited for (GlobalFoundries has stated as much in both interview and PPS presentation, if the naming wasn't enough of a hint) - and the totally farcical situation where AMD is supposedly building GPUs on 20nm but moving to 28nm in 2015. When was the last time a vendor moved production to a larger process than the one they were already supposedly using?


Xzibit said:


> You pointed to a quote from VideoCardz for your source of proof to debunking it.


No, what I said was that a range of sites - from rumour mills to respected mainstream sites (along with process-tech based sites) - dismissed the benchmarks as bogus, and the easiest evidence came from the "leaker" himself stating that the large-die GPUs were built on GloFo's 20LPM process.
VideoCardz was by no means the only site that was sceptical. Hilbert at G3D called out the charts as fake on the day they showed up:


> The tests originate from ChipHell,* this site is well known for both legit and fake leaks*. Let me reiterate that it is EXTREMELY unlikely for these guys to have all these unreleased engineering sample boards let alone drivers. *I for one do not believe these results are for real*.



Feel free to believe (and I see two others upvoted your post, so they obviously believe as well) that AMD is building GPUs on a 20nm process only to toss it aside for 28nm, and that the only person on the planet leaking gaming benchmarks for unreleased models conveniently just so happens to have *all four* of the top cards - along with working drivers... just don't try to convince me they are legit when common sense says otherwise - common sense will trump you every time.


----------



## the54thvoid (Jan 3, 2015)

In all fairness, the bickering is inevitable. At least Xzibit keeps it civil. Some brand enthusiasts just can't help being infantile. We have to accept the fact that NV can charge more because the product is viewed by almost all reviewers as superior. 
AMD inevitably compete on lower pricing. This creates two troll states of consciousness: NV trolls can see AMD as bargain-bin trash (and they are not), while NV are seen by AMD trolls as exploiting bastards - which, as an NV buyer, I'd say they are.
The point is, though, I buy NV because I like buying the best solution for me. At least my 2nd Classified card was a bargain.
Maybe Fiji will be my next choice. People need open, logical minds to make informed tech decisions.

But for those wishing to be brave, why not state here and now who will have the best card next: AMD's Fiji or Big Maxwell?
I'd suggest Maxwell, but at least if I'm wrong, I'll buy the better option for me. Unless AMD's chip is ruined by a poorly made card.


----------



## HumanSmoke (Jan 3, 2015)

the54thvoid said:


> But for those wishing to be brave, why not state here and now who will have the best card next, AMD's Fiji or Big Maxwell?


Without some concrete info on AMD's chip, it's a bit of a crapshoot. If the Fiji XT's core count turns out to be the rumoured 4,096, then I'd think it would be the chip to beat. As with the 295X2, it looks like AMD is going for broke on the power budget, if the Asetek contract for AIOs on the reference cards is any indication. Xzibit's claimed-legit "leaker" has AMD's second-tier GPU using around the same power as GM 200 (~215-220W), but curiously, the power consumption figures for the top part (which the leaker - also curiously - says is Bermuda, contrary to virtually every other source) aren't included. If the second-tier GPU is a 200+W part, then 300W or more isn't out of the question for the 390/390X, especially if AMD has a closed-loop watercooler lined up for reference cooling duties. 4,096 cores and a 300W power budget (most of which will be leveraged for the GPU if the rumours of HBM are correct) is going to be a hard act to top in outright performance.
So I'd pick Fiji XT as the top GPU, at the expense of power consumption and framebuffer if HBM is involved, since 4 GB is the maximum permissible with first-gen HBM.
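For reference, that 4 GB ceiling comes straight out of first-gen HBM's geometry - one gigabyte per stack (four 2 Gb dies) and four stacks on the interposer, per the commonly reported SK Hynix figures. A sketch of both the capacity cap and the bandwidth upside:

```python
# First-gen HBM: capacity ceiling and bandwidth, per commonly reported specs
stacks = 4                      # typical flagship configuration
dies_per_stack = 4
die_density_gb = 0.25           # 2 Gb per die

capacity_gb = stacks * dies_per_stack * die_density_gb
print(f"{capacity_gb:.0f} GB")  # → 4 GB, the framebuffer cap in question

# Each HBM1 stack: 1024-bit interface at 1 Gbps per pin = 128 GB/s
per_stack_gbs = 1024 / 8 * 1.0
print(f"{stacks * per_stack_gbs:.0f} GB/s")  # → 512 GB/s vs GM200's ~317 GB/s
```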


----------



## Fluffmeister (Jan 3, 2015)

AsRock said:


> Who have no value or sense, but shit  cannot spend ya money when ya dead so.
> 
> As they say more money than sense.



Cool story bro.


----------



## Rowsol (Jan 3, 2015)

The need for people to segregate themselves to one side or the other is hilarious.  For me, it's always been price/performance.  Here's hoping this is another windmill slam like the 970.  Doubt it though.


----------



## vega22 (Jan 3, 2015)

MxPhenom 216 said:


> Its furthest from the fact.




so you are going to even try to argue that the Titan was worth every penny of the grand they were asking?

i bet you have an iPhone too, as Apple only make quality products as well :rofl:


----------



## HumanSmoke (Jan 3, 2015)

marsey99 said:


> so you are going to even try to argue that the titan was worth every penny of the grand they was asking?
> i bet you have an iphone too as apple only make quality products as well :rofl:


Great! Now you're setting yourself up as the arbiter of what constitutes value for all of us. Please don't stop at graphics cards and phones, tell us what cars, houses, toothpaste, and cereal we should be buying - I've been yearning for some random to decide what I should buy based on absolutely no knowledge of my situation.

:SMH:


----------



## Xzibit (Jan 3, 2015)

the54thvoid said:


> In all fairness, the bickering is inevitable. At least Xzibit keeps it civil.* Some brand enthusiasts just can't help being infantile*. We have to accept the fact that NV can charge more because the product is viewed by almost all reviewers as superior.
> AMD inevitably compete on lower pricing. This creates two troll states of consciousness, NV trolls can see AMD as bargain bin trash (and they are not). NV are seen by AMD trolls as exploiting bastards. Which as an NV buyer, I'd say they are.
> Point is though,* I buy NV because I like buying the best solution for me*. At least my 2nd Classified card was a bargain.
> Maybe Fiji will be my next choice. *People need open, logical minds to make informed tech decisions*.



That's what it should be.



HumanSmoke said:


> Xzibit's claimed legit "leaker" has AMD's second tier GPU using around the same power as GM 200 ( ~ 215-220W).



I don't think you've even read my post. Maybe you should try the UFC if confrontation is what you're after.

So much for things being different around here. Not even two days into the New Year, and we have the same character acting the same way as last year. People are supposed to get more mature with age, not less. I'd hate to be in my 50s and be acting like that.


----------



## HumanSmoke (Jan 3, 2015)

Xzibit said:


> I don't think you've even read my post.


What I read in your series of posts was an overwhelming reluctance to acknowledge that the info you posted was faked (and no, retroactive post editing doesn't count) - even after I pointed out the patent falsity of the "leaker's" first "fact".
Rather than accept that the information the "leaker" posted is fake (at least as far as the unreleased parts are concerned), you then move on to some senseless and unfounded assertion that I have a bias against the site that *hosted* the "leaked" charts. If you claim that you're the one being wronged, why keep the accusations flying and invite reply?


Xzibit said:


> Maybe you should try the UFC if confrontation is what your after.


"*The lady doth protest too much, methinks*" - Hamlet,  Act III, Scene II.

I guess if confrontation isn't your thing this will be the last word on the matter. Anyhow, I assume we can now get back to the subject at hand - GM 200 specifications and Quadro.


----------



## Xzibit (Jan 3, 2015)

HumanSmoke said:


> What I read in your series of posts was an overwhelming reluctance to acknowledge that the info you posted was faked (and no, retroactive post editing doesn't count) - even after pointing out the patent falsity of the "leakers" first "fact".
> Rather than accept that the information the "leaker" posted is fake (at least as far as the unreleased parts are concerned), you then move on to some senseless and unfounded assertion that I have bias against the site that* hosted* the "leaked" charts. If you claim that you're the one being wronged, why keep the accusations flying and invite reply?



You haven't changed one bit.

I'm not sure where you get the idea that I'm asking you to believe them or take them as fact. I'm just asking you to provide proof of what you said. What did you do? You clung to the 20nm statement. How is that debunking the results? The sites you linked just think it's highly unlikely for one source to have them all.

Just in case you forget, and before you accuse me of or imagine I said something totally different, here is a recap of the conversation, since you seem to have trouble with it.



Xzibit said:


> If one puts any real value into the Nvidia/AMD leaks. This one is interesting because we could very well be headed for a nice performance/price war if these products come out close to one another.





HumanSmoke said:


> Those "results" were debunked as bogus almost as soon as they emerged.





Xzibit said:


> Can you source the debunking? How were they debunked by the way?  Was it a consensus from people who didn't like the outcome?



I guess asking you to prove what you said is too much.


----------



## Sony Xperia S (Jan 3, 2015)

Xzibit said:


> If one puts any real value into the Nvidia/AMD leaks. This one is interesting because we could very well be headed for a nice performance/price war if these products come out close to one another.



Looks plausible, Bermuda XT faster than GM200.

The question now is... When?


----------



## Lionheart (Jan 3, 2015)

This thread.....






I miss the old TPU   Had fewer trolls & fanboys


----------



## MxPhenom 216 (Jan 3, 2015)

marsey99 said:


> so you are going to even try to argue that the titan was worth every penny of the grand they was asking?
> 
> i bet you have an iphone too as apple only make quality products as well :rofl:



Well first, I don't have an iPhone, and I have never owned an Apple product. I have a Windows Phone. And second, whether it's worth every penny they asked is beside the point; that card still sold well - better than Nvidia and the man up top (the CEO) thought it would. I don't think any computer part is worth every penny you pay for it. It's just going to get replaced by something better six months later, but that's what we pay for as enthusiasts. Now go on, continue telling me what is and isn't worth it, and how much you don't know about me..........


----------



## the54thvoid (Jan 3, 2015)

Lionheart said:


> This thread.....
> 
> 
> 
> ...



It _is_ a thread about a top tier new gfx product. Every forum has them and their associated trolls. At least Xzibit and Human are using high brow combatant discourse with only a few 'low brow' posters making it trollish.
Writing styles always get 'frosty' but there's not that many 'fuck you' posts. Yet!


----------



## Sony Xperia S (Jan 3, 2015)

the54thvoid said:


> It _is_ a thread about a top tier new gfx product. Every forum has them and their associated trolls. At least Xzibit and Human are using high brow combatant discourse with only a few 'low brow' posters making it trollish.
> Writing styles always get 'frosty' but there's not that many 'fuck you' posts. Yet!



The guy probably left, anyways.

I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.

Don't underestimate and ignore all possibilities.

Actually, I would even put my money on a bet that those scores from CH are plausible.


----------



## ensabrenoir (Jan 3, 2015)

......I started to say something rational and intelligent........but it's just so much more fun to razz the AMD guys   One day......... all will realize that there is something for everyone .........





_*TILL ALL ARE ONE!!!!!!!!!!!!!!!!!!!*!_


----------



## Fluffmeister (Jan 3, 2015)

Sony Xperia S said:


> The guy probably left, anyways.
> 
> I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.
> 
> ...



Brave man, but let's hope so for AMD's sake; they haven't had a good time of late. Tonga was no wonder chip despite being their latest and greatest, and people were predicting it would give Nv a bloody nose then too.

But hey, maybe they have achieved wonders. Now all they need is to get the bloody things on the shelves - paper tigers don't pay the bills, after all.


----------



## the54thvoid (Jan 3, 2015)

Sony Xperia S said:


> I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.



See, this is uninformed. You need to rationalise arguments. You're merely resorting to using the term 'fanboy' and saying NV will fail with their next card, on no evidence whatsoever. 
The specs could (on hardware config and architecture maturation) lead to a plausible 40-50% increase on GTX 980. 
We require R9 290X perf to be bested by 50-60% to compete on that basis. I can't call that, I don't have the chips in my hand.
But your statement is pure troll buddy. You may not mean it but without some form of tech in there to back up your assertion, it is pure flame...


----------



## HumanSmoke (Jan 3, 2015)

Sony Xperia S said:


> Actually, I would even put my money on a bet that those scores from CH are plausible.


They are plausible, because anyone with basic arithmetic skills, a baseline to work with, and some estimated specifications can scale possible increases in performance. Somehow I don't think it's a coincidence that the chart shows the GM200 to be ~50% stronger in performance than the 780 Ti when GM200 has 50% more cores, 50% more ROPs, and 50% more shader modules. It's also no surprise that the poster has the Bermuda XT (shouldn't that be Fiji XT?) showing around the same increases over the 290X, given that the rumoured core count is 45% higher (4096 vs 2816), the compute units are 45% higher (64 vs 44), and the TMUs 45% higher (256 vs 176).

What isn't very plausible is that these new parts are _supposedly_ scaling perfectly in relation to mature products months out from release using what are undoubtedly very immature drivers. Unless you believe that both Nvidia and AMD have already perfected the drivers for these parts well ahead of launch. How likely does that sound?
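The extrapolation argument is easy to reproduce. A minimal sketch, using only the rumoured figures quoted in this thread (not confirmed specifications):

```python
# Ratios implied by the rumoured Bermuda/Fiji XT specs vs. Hawaii (290X).
# All numbers are the thread's rumoured figures, not official specs.
specs = {
    "cores": (4096, 2816),
    "compute units": (64, 44),
    "TMUs": (256, 176),
}

for unit, (new, old) in specs.items():
    print(f"{unit}: {new}/{old} = {(new / old - 1) * 100:.0f}% increase")
# Every ratio works out to the same ~45%, which is why a chart showing the
# new card ~45-50% faster reads like simple extrapolation rather than a leak.
```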


Sony Xperia S said:


> I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.


Like the fortunes of a company are predicated on a halo part sold in limited numbers (it didn't seem to do much for AMD when Hawaii ruled the roost for both single and dual GPU cards)? Who are they supposed to be lagging behind, and why? Last time I checked, the company held ~80% of the workstation market and 85% of the HPC GPGPU market (+ a few high-profile additions to come), is gobbling up mobile discrete graphics market share as fast as AMD is losing it, and is carving out a growing market for automotive SoCs. How is this all supposed to come tumbling down, and what kind of timeframe are you expecting? You made the prediction, so you must have some supporting theory and evidence, right?


----------



## AsRock (Jan 3, 2015)

Lionheart said:


> This thread.....
> 
> 
> 
> ...



It used to have more facts and fewer rumors too. This one was probably started by someone who loves Nvidia; the same goes for this crap when AMD stuff is posted without some kind of proof.


----------



## Sony Xperia S (Jan 4, 2015)

HumanSmoke said:


> ( shouldn't that be Fiji XT ???)



On your German link Fiji is a 28 nm TSMC part, while we are speaking here about 20 nm process.
So, no, it shouldn't be.



HumanSmoke said:


> What isn't very plausible is that these new parts are _supposedly_ scaling perfectly in relation to mature products months out from release using what are undoubtedly very immature drivers. Unless you believe that both Nvidia and AMD have already perfected the drivers for these parts well ahead of launch. How likely does that sound?



Likely enough.



HumanSmoke said:


> Who are they supposed to be lagging behind and why?



Nvidia behind AMD because of lower gaming performance from top-tier new cards.



HumanSmoke said:


> You made the prediction, so you must have some supporting theory and evidence, right?



Yes.


Oh, and I didn't say GM200 would be a fail, just that it would be inferior.


----------



## ensabrenoir (Jan 4, 2015)

........wow..... the stupidity level is now over 9000.... (unsubs this thread)........


----------



## HumanSmoke (Jan 4, 2015)

Sony Xperia S said:


> On your German link Fiji is a 28 nm TSMC part, while we are speaking here about 20 nm process.


You missed the point entirely. 3DC are talking about the 4096 core part being named Fiji, not Bermuda.


Sony Xperia S said:


> Nvidia behind AMD because of lower gaming performance from top-tier new cards.


Really? Even when that has been demonstrably true ( R300, Evergreen series launch, Hawaii launch), ATI/AMD have still failed to outsell Nvidia. It also doesn't explain how performance of gaming top tier cards should affect Quadro, Tesla, or SoC sales.
You can live the dream(world) for AMD all you like, but the facts are pretty clear. Nvidia has outsold ATI/AMD in discrete graphics every quarter for more than a dozen years and is presently outselling AMD two-to-one - at higher prices, I might add - and that ratio has historically been increasing... on that note, Q4 2014's figures should make some interesting reading in a couple of weeks' time.


marsey99 said:


> but fun and games aside (being hitler, fun and games xD) please feel free to explain how the titan offered such great value when it didn't?


I'd suggest you direct that question to people who bought the GTX Titan. Even if you discounted the benchmarking/gaming fraternity, the card sold well amongst the CG rendering crowd.


marsey99 said:


> as for it being a sales success, well yes many men do feel the need to use money to make them feel better about their tiny, tiny penis.


While undoubtedly true in some instances, there are also many instances where it boils down to buying the best tool for the job. Where CUDA outstrips OpenCL in rendering applications and time to completion is a priority, people choose the system best tailored to their needs. As for how many buy because of penis issues, I'll leave you to initiate a straw poll.

@ensabrenoir
I think I'll join you in un-subbing. When a graphics card thread devolves into Hitler, penises, and full-on trolling (Hi Sony), it's time to pull the pin
/ SMH and exits stage left


----------



## vega22 (Jan 4, 2015)

HumanSmoke said:


> I'd suggest you direct that question to people who bought the GTX Titan. Even if you discounted the benchmarking/gaming fraternity, the card sold well amongst the CG rendering crowd.
> 
> While undoubtedly true in some instances, there are also many instances where it boils down to buying the best tool for the job. Where CUDA outstrips OpenCL in rendering applications and time to completion is a priority, people choose the system best tailored to their needs. As for how many buy because of penis issues, I'll leave you to initiate a straw pole poll.



but that is kinda the point dude. you are defending the (then) ludicrous cost of a "gaming" card because it sold well with professional users...

also, "When a graphics card thread devolves into Hitler, penises, and full-on trolling..." my job here is done.

just fyi for anyone that cares: i hope this 980ti super kraken-eating titan 2 and the 3>9000xtxsxrisriinxs+ are both monsters of gpus which muller 4k and are ready for 8k. i don't like multi-gpu setups myself and can't wait till 1 gpu can do 4k, as i will be upgrading to it then  and if they have a price war, even better!


----------



## the54thvoid (Jan 4, 2015)

Can a mod close this thread please?  It's a fucking catastrophe and pandering to arseholes.


----------



## Sony Xperia S (Jan 4, 2015)

HumanSmoke said:


> You missed the point entirely. 3DC are talking about the 4096 core part being named Fiji, not Bermuda.



Would you be so kind as to explain how TSMC would be capable of manufacturing a 4096-shader Fiji on its existing 28nm processes?

Are you claiming that everything they stated back in November last year is correct?

So, you think 3dcenter's info is plausible, while Chiphell's is not?


----------



## Tatty_One (Jan 4, 2015)

Thread cleaned up, it's a news thread....... stop the crap now please, debate by all means but some of the crap in here is not even worthy of a toddler, any more from now and holidays will ensue..... thank you!


----------



## HumanSmoke (Jan 4, 2015)

Ah, moderation makes an appearance



marsey99 said:


> but that is kinda the point dude. you are defending the (then) ludicrous cost of a "gaming" card because it sold well with professional users...


Bit of a reading fail on your part then. What I said was that people buy the tool for the job, and sales are sales regardless of the end user's intent - it is actually no different to the sales (and inflated pricing) attached to Radeon cards (also marketed as gaming) due to sales to miners, many of whom did nothing gaming-related with the cards at all.


Sony Xperia S said:


> Would you be so kind to explain how TSMC would be capable of manufacturing a 4096-shader Fiji on its existing 28nm processes?


Assuming this is a genuine question then...
In theory, it would be fairly easy. Most people *should* realize that a large-die performance/enthusiast GPU devotes ~50% of its die area to cores and TMUs. The remaining 50% comprises the uncore (memory controllers, memory interfaces, command processor, transcode engine, raster ops, etc.).
The green areas are the cores; everything else is the uncore:





The uncore is relatively fixed in size if the memory interfaces (bus width) remain static. Hawaii at 2816 cores is 438mm^2, half of which is cores and texture address units (~220mm^2). If the core count is increased by 45% (to 4096), the area devoted to it increases to ~319mm^2. Add the ~220mm^2 for the uncore and the resultant die area becomes ~539mm^2 - just slightly smaller than GK 110.
That is how TSMC is capable of manufacturing a 4096-shader Fiji. Whether they are the foundry involved depends on when AMD decided to use GloFo's 28nm SHP process for GPUs in addition to Kaveri APUs. One of these two processes will almost certainly be the manufacturing node involved.
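That estimate is just arithmetic, so here it is as a runnable sketch. The 50/50 core-vs-uncore split and the 438mm^2 Hawaii die are the post's assumptions, not official figures:

```python
# Back-of-envelope die-size estimate for a 4096-shader Fiji on 28nm.
# Assumptions (from the post, not official data): Hawaii is 438 mm^2,
# ~half of which is cores/TMUs, and the uncore stays fixed (512-bit bus).
hawaii_die = 438.0                      # mm^2, 2816 shaders
core_area = hawaii_die / 2              # ~219 mm^2 of cores + TMUs
uncore_area = hawaii_die - core_area    # ~219 mm^2, roughly constant

scale = 4096 / 2816                     # ~1.45x the shader count
fiji_die = core_area * scale + uncore_area
print(f"Estimated Fiji die: {fiji_die:.0f} mm^2")  # ~538 mm^2, close to GK110
```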


Sony Xperia S said:


> Are you claiming that everything they stated back in November last year is correct?


If you'd bothered to read what I wrote, it would be obvious that I was pointing out that 3DC attributed the name Fiji to the 4096-shader part. I might also point out that many other sources do the same, including a well-known AMD brown-noser who claims intimate knowledge of AMD's business (although you'll have to stump up a fee to breach the paywall). Have AMD swapped the names around? Were they in the right order to begin with? Who knows, although I'd note that the other parts in the hierarchy don't seem affected.


Sony Xperia S said:


> So, you think 3dcenter's info is plausible, while Chiphell's is not?


3DC don't release leaks; they gather information and extrapolate from it. Their membership includes a number of industry insiders, coders, and architects. Chiphell, on the other hand, is a conglomeration like any forum-based site. The validity of its information depends upon the individuals posting there. Some is legitimate, some is quasi-legitimate (access to samples but results/info massaged for PR spin ***cough**Coolaler**cough***), some is estimation/guesstimation, and some is outright bullshit. Chiphell posts should be taken on a case-by-case basis - especially from posters with little or no previous track record of providing reliable information.

In this particular instance, we have a poster with no previous record of releasing reliable leaks, quoting a manufacturing process wholly unsuited to large GPUs, using a naming convention at odds with the rest of the tech world, and showing results that would indicate perfect scaling for both vendors - which supposes mature drivers for both AMD and Nvidia months out from launch... all this, plus a single source having access to not just one unreleased top-tier card, nor two, nor three, but *four* - access that spans both AMD and Nvidia.
I also find it difficult to accept that this guy benchmarked four unreleased cards (along with comparisons against many released cards) across 20 games, yet can't provide any shred of photographic evidence, no standard benchmark validations (Heaven, 3DMark), no power figures for AMD's top part, nor any single-game numbers. All a bit convenient.


----------



## Xzibit (Jan 5, 2015)

If you guys are interested in a bit of clarity, rather than trolling each other over what might or might not be...

*CES 2015 Nvidia Press Conference is at 8:00pm PST (Nvidia Live Stream)*

_*Don't think they are offering children discounts, even if you act like one._


----------



## Fluffmeister (Jan 5, 2015)

It's certainly nice to get back on topic. 

Good effort though.


----------



## xenocide (Jan 5, 2015)

Lionheart said:


> I miss the old TPU   Had less trolls & fanboys


 
You miss the days of Seronxadamus predicting AMD's new CPUs performing 60% better than Intel's, with perfectly linear scaling and everyone in the software design industry magically coding for 32 threads?  The news posts have always been hit or miss; this one is just especially silly.



Sony Xperia S said:


> I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.
> 
> Don't underestimate and ignore all possibilities.


 
This is all sorts of goofy.  There is something extremely biased about saying a part with a 45% increase across the board is clearly going to be better than one with 50% in two categories and 100% in the other.  There's nothing wrong with theorizing, but that just flat out doesn't make sense.  I don't think "ignoring" possibilities is the issue, and it's a half-assed rebuttal to any statement, since you could just respond with "Oh sorry, I hadn't considered this card might be the one where AMD breaks the laws of physics, how silly of me!"



marsey99 said:


> but that is kinda the point dude. you are defending the (then) ludacris cost of a "gaming" card because it sold well with professional users...


 
The card was priced as a budget workstation card, but it also happened to have the highest gaming performance on the market, so Nvidia attacked two markets at once - with pretty good success.  They advertised it to the gaming audience, because anyone with half a brain would look at a cut-down K6000 for $1,000 and know it was a steal.  It was hardly a gaming card though - unless bragging rights are considered a performance metric.



Sony Xperia S said:


> Would you be so kind to explain how TSMC would be capable of manufacturing a 4096-shader Fiji on its existing 28nm processes?


 
Arithmetic?


----------



## Sony Xperia S (Jan 5, 2015)

HumanSmoke said:


> In theory, it would be fairly easy.
> 
> If you'd bothered to read what I wrote it would be obvious that what I was pointing out was that 3DC attributed the name Fiji to the 4096 shader part. I might also point out that many other sources do the same including a well known AMD brown-noser who claims intimate knowledge of AMD's business (although you'll have to stump up a fee to breach the paywall ). Have AMD swapped the names around? were they in the right order to begin with? Who knows, although I'd note that the other parts in the hierarchy don't seem affected.
> 
> 3DC don't release leaks, they gather information and extrapolate from that. Their membership includes a number of industry insiders, coders, architects. Chiphell on the other hand are a conglomeration like any forum based site.



3DC's info is old enough, and we have a new piece of data which changes the initial plans.
Maybe initially Fiji had indeed been scheduled for production on 28nm with 4096 shaders, but afterwards it could have been forward-ported to a more advanced manufacturing process, 20nm at GloFo.

In theory it would be OK, but in practice, to me, releasing anything on 28nm (even GM200) is purely a short-sighted decision.

The more you delay in time,


			
TheGoodBadWeird said:



> The 390X is rumored not to come till summer 2015. It would be a long wait till then with only a low/mid-tier card.



the more likely it is those will use either 20nm or 16nm.

http://wccftech.com/nvidia-gm200-titan-2-amd-fiji-380x-bermuda-390x-benchmarked/


----------



## HumanSmoke (Jan 5, 2015)

Sony Xperia S said:


> 3DC info is old enough and we have new piece of data which changes initial plans.
> Maybe initially Fiji had indeed been scheduled for production on 28nm with 4096 shaders, but afterwards it could be forward-ported to a more advanced manufacturing process, 20nm at GloFo.


No doubt the design can be ported to a 20nm-based process in future (although that seems unlikely with 16nmFF/14nm-XM not too far off), but I still haven't heard a single logical explanation of why AMD would go from building large GPUs on 20nm (and if they are being benchmarked now, production started at least a quarter ago) to 28nm SHP this year. Seems ass-backwards.

What makes it even more unlikely is that AMD will be unveiling the Carrizo APU with the same GPU architecture using 28nm (very likely at this weeks CES). Wouldn't it make more sense to consolidate APUs and GPUs on the same process when the APUs use the same graphics cores as the discrete GPUs?



Sony Xperia S said:


> In theory it would be ok but in practice, to me, releasing anything 28nm (even GM200) is purely a short-vision decision.


Graphics cards have a short shelf life. You build on the processes suited for the task and readily available. 


Sony Xperia S said:


> the more likely those will use either 20nm or 16nm.


Unless AMD are really late to the party, I think you're setting yourself up for disappointment. The time frame for Fiji (its test and validation boards being sighted on Zauba some months back) almost certainly seems to point to 28nm IMO. Bermuda hasn't been sighted, so if it's a further development for a later launch then a smaller node is a definite possibility.


----------



## Lionheart (Jan 6, 2015)

xenocide said:


> You miss the days of Seronxadamus predicting AMD's new CPU's performing 60% better than Intels with perfectly linear scaling and everyone in the software design industry magically coding for 32-threads?  The news posts have always been hit or miss, this one is just especially silly.



Uum no!  It's got nothing to do with just AMD & Intel. I'm referring to the way ppl had discussions on this site; ppl actually helped each other without resorting to a fanboyish mindset, and there were rarely any trolls either. But I'm going back to around 2008 - 2011....


----------



## Sony Xperia S (Jan 6, 2015)

HumanSmoke said:


> why AMD would go from building large GPUs on 20nm



No, because Bermuda won't be so large in the first place.



HumanSmoke said:


> (and if they are being benchmarked now production started at least a quarter ago)



Not necessarily. You had a benchmarked engineering sample.



HumanSmoke said:


> then move to 28nm SHP this year.



I am still not quite sure about this.



HumanSmoke said:


> Graphics cards have a short shelf life.
> 
> Unless AMD are really late to the party, I think you're setting yourself up for disappointment.



No.
No.

My finances would actually be quite relaxed about waiting till real 20nm or 16nm products. A short shelf life doesn't equate to short service in the consumer's case.


----------



## HumanSmoke (Jan 6, 2015)

Lionheart said:


> Uum no!  Got nothing to do with just AMD & Intel, I'm referring to the way ppl had discussions on this site, ppl actually helped each other without resorting to a fanboyish mindset, and there were rarely any trolls as well. But I'm going back to around 2008 - 2011....


Just for comparison, here's the same situation (an Nvidia top-tier GPU release rumour) from 2009 - the fanboy accusations started flying inside the first two pages.

EDIT:


Sony Xperia S said:


> Not necessarily. You had a benchmarked engineering sample.
> 
> 
> HumanSmoke said:
> ...


Fiji XT boards began shipping at least two months ago




From tape-out to finished silicon takes 8-12 weeks (tape-out > fabrication > die cutting > chip runtime testing > die packaging > board assembly). Eight weeks prior to the 7 November sighting puts tape-out at least four months back - more than a quarter of a year.
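As a quick sanity check on that timeline (the 7 November Zauba sighting is from the post above; the post date and the 8-week lower bound of the cycle are assumptions for illustration):

```python
from datetime import date, timedelta

# Work back from the shipping-manifest sighting to the latest plausible tape-out.
sighting = date(2014, 11, 7)                # Fiji XT boards sighted on Zauba
tapeout = sighting - timedelta(weeks=8)     # lower bound of the 8-12 week cycle
post_date = date(2015, 1, 6)                # approximate date of this post

elapsed = (post_date - tapeout).days
print(f"Tape-out no later than {tapeout}; {elapsed} days before this post")
# -> Tape-out no later than 2014-09-12; 116 days before this post
```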


----------



## Sony Xperia S (Jan 8, 2015)

HumanSmoke said:


> ...



The problem with your theory is that if Fiji has been shipping for such a long time, there is still no physical evidence or proof, or even a slight hint, of a coming launch.

So, when will it be released?


----------

