Friday, January 2nd 2015

Possible NVIDIA GM200 Specs Surface

Somebody sent our GPU-Z validation database a curious-looking entry. Labeled "NVIDIA Quadro M6000" (not to be confused with AMD's FirePro M6000), with a device ID of 10DE-17F0, this card is running on existing ForceWare 347.09 drivers, and features a BIOS string unlike anything we've seen. Could this be the fabled GM200/GM210 silicon?

The specs certainly look plausible - 3,072 CUDA cores, 50 percent more than the GM204's; a staggering 96 ROPs; and a 384-bit wide GDDR5 memory interface holding 12 GB of memory. The memory is clocked at 6.60 GHz (GDDR5-effective), belting out 317 GB/s of bandwidth. The effective bandwidth is higher than that, thanks to NVIDIA's new lossless texture compression algorithms. The core is running at a gigahertz-scraping 988 MHz. The process node and die size are values we manually program GPU-Z to show, since they're not things the drivers report (to GPU-Z). NVIDIA is planning to hold a presser on the 8th of January, on the sidelines of the 2015 International CES. We're expecting a big announcement (pun intended).
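The quoted bandwidth figure checks out with simple arithmetic; a minimal sketch, using only the numbers from the GPU-Z entry above:

```python
# Sanity-check of the rumored GM200 memory bandwidth figure.
# Bus width and effective data rate are taken from the GPU-Z entry above.
bus_width_bits = 384
effective_rate_gtps = 6.6  # GDDR5-effective transfer rate, GT/s

bytes_per_transfer = bus_width_bits / 8  # 48 bytes moved per transfer
bandwidth_gbps = bytes_per_transfer * effective_rate_gtps

print(f"{bandwidth_gbps:.1f} GB/s")  # 316.8 GB/s, which rounds to the quoted 317 GB/s
```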

80 Comments on Possible NVIDIA GM200 Specs Surface

#51
Xzibit
the54thvoidIn all fairness, the bickering is inevitable. At least Xzibit keeps it civil. Some brand enthusiasts just can't help being infantile. We have to accept the fact that NV can charge more because the product is viewed by almost all reviewers as superior.
AMD inevitably compete on lower pricing. This creates two troll states of consciousness, NV trolls can see AMD as bargain bin trash (and they are not). NV are seen by AMD trolls as exploiting bastards. Which as an NV buyer, I'd say they are.
Point is though, I buy NV because I like buying the best solution for me. At least my 2nd Classified card was a bargain.
Maybe Fiji will be my next choice. People need open, logical minds to make informed tech decisions.
That's what it should be.
HumanSmokeXzibit's claimed legit "leaker" has AMD's second tier GPU using around the same power as GM 200 ( ~ 215-220W).
I don't think you've even read my post. Maybe you should try the UFC if confrontation is what you're after.

So much for things being different around here. Not even two days into the New Year and we have the same character acting the same way as last year. People are supposed to get more mature with age, not immature. I'd hate to be in my 50s and be acting like that.
Posted on Reply
#52
HumanSmoke
XzibitI don't think you've even read my post.
What I read in your series of posts was an overwhelming reluctance to acknowledge that the info you posted was faked (and no, retroactive post editing doesn't count) - even after I pointed out the patent falsity of the "leaker's" first "fact".
Rather than accept that the information the "leaker" posted is fake (at least as far as the unreleased parts are concerned), you then move on to some senseless and unfounded assertion that I have bias against the site that hosted the "leaked" charts. If you claim that you're the one being wronged, why keep the accusations flying and invite reply?
XzibitMaybe you should try the UFC if confrontation is what you're after.
"The lady doth protest too much, methinks" - Hamlet, Act III, Scene II.

I guess if confrontation isn't your thing this will be the last word on the matter. Anyhow, I assume we can now get back to the subject at hand - GM 200 specifications and Quadro.
#53
Xzibit
HumanSmokeWhat I read in your series of posts was an overwhelming reluctance to acknowledge that the info you posted was faked (and no, retroactive post editing doesn't count) - even after I pointed out the patent falsity of the "leaker's" first "fact".
Rather than accept that the information the "leaker" posted is fake (at least as far as the unreleased parts are concerned), you then move on to some senseless and unfounded assertion that I have bias against the site that hosted the "leaked" charts. If you claim that you're the one being wronged, why keep the accusations flying and invite reply?
You haven't changed one bit.

I'm not sure where you get the idea I'm asking you to believe them or take them as fact. I'm just asking you to provide proof of what you said. What did you do? You clung to the 20nm statement. How is that debunking the results? The sites you linked just think it's highly unlikely for one source to have them all.

Just in case you forget, and before you accuse me of or imagine I said something totally different, here is a recap of the conversation, since you seem to have trouble with it.
XzibitIf one puts any real value into the Nvidia/AMD leaks. This one is interesting because we could very well be headed for a nice performance/price war if these products come out close to one another.
HumanSmokeThose "results" were debunked as bogus almost as soon as they emerged.
XzibitCan you source the debunking? How were they debunked by the way? Was it a consensus from people who didn't like the outcome?
I guess asking you to prove what you said is too much.
#54
Sony Xperia S
XzibitIf one puts any real value into the Nvidia/AMD leaks. This one is interesting because we could very well be headed for a nice performance/price war if these products come out close to one another.

Looks plausible, Bermuda XT faster than GM200.

The question now is... When?
#55
Lionheart
This thread.....



I miss the old TPU :( Had less trolls & fanboys :banghead:
#56
MxPhenom 216
ASIC Engineer
marsey99so you are going to even try to argue that the titan was worth every penny of the grand they was asking?

i bet you have an iphone too as apple only make quality products as well :rofl:
Well, first, I don't have an iPhone, and have never owned an Apple product. I have a Windows Phone. And second, whether it's worth every penny they asked or not is beside the point; that card still sold well, better than Nvidia and the man up top (CEO) thought it would. I don't think any computer part is worth every penny you pay for it. It's just going to get replaced by something better six months later, but that's what we pay for as enthusiasts. Now go on, continue telling me what isn't and is worth it, and how much you don't know about me..........
#57
the54thvoid
Super Intoxicated Moderator
LionheartThis thread.....



I miss the old TPU :( Had less trolls & fanboys :banghead:
It is a thread about a top-tier new gfx product. Every forum has them and their associated trolls. At least Xzibit and Human are using highbrow combative discourse, with only a few 'lowbrow' posters making it trollish.
Writing styles always get 'frosty' but there's not that many 'fuck you' posts. Yet!
#58
Sony Xperia S
the54thvoidIt is a thread about a top-tier new gfx product. Every forum has them and their associated trolls. At least Xzibit and Human are using highbrow combative discourse, with only a few 'lowbrow' posters making it trollish.
Writing styles always get 'frosty' but there's not that many 'fuck you' posts. Yet!
The guy probably left, anyways.

I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.

Don't underestimate and ignore all possibilities.

Actually, I would even put my money on a bet that those scores from CH are plausible.
#59
ensabrenoir
......I started to say something rational and intelligent........but it's just so much more fun to razz the AMD guys :roll: One day......... all will realize that there is something for everyone .........


TILL ALL ARE ONE!!!!!!!!!!!!!!!!!!!!
#60
Fluffmeister
Sony Xperia SThe guy probably left, anyways.

I understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.

Don't underestimate and ignore all possibilities.

Actually, I would even put my money on a bet that those scores from CH are plausible.
Brave man, but let's hope so for AMD's sake; they haven't had a good time of late. Tonga was no wonder chip despite being their latest and greatest, and people were predicting it would give Nv a bloody nose then too.

But hey maybe they have achieved wonders, now all they need is to get the bloody things on the shelves, paper tigers don't pay the bills after all.
#61
the54thvoid
Super Intoxicated Moderator
Sony Xperia SI understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.
See, this is uninformed. You need to rationalise arguments. You're merely resorting to using the term 'fanboy' and saying NV will fail with their next card, on no evidence whatsoever.
The specs could (on hardware config and architecture maturation) lead to a plausible 40-50% increase on GTX 980.
We require R9 290X perf to be bested by 50-60% to compete on that basis. I can't call that, I don't have the chips in my hand.
But your statement is pure troll buddy. You may not mean it but without some form of tech in there to back up your assertion, it is pure flame...
#62
HumanSmoke
Sony Xperia SActually, I would even put my money on a bet that those scores from CH are plausible.
They are plausible, because anyone with basic arithmetic skills, a baseline to work with, and some estimated specifications can scale possible increases in performance. Somehow I don't think it's a coincidence that the chart shows the GM200 to be ~50% stronger in performance than the 780 Ti when GM200 has 50% more cores, 50% more ROPs, and 50% more shader modules. It's also no surprise that the poster has the Bermuda XT (shouldn't that be Fiji XT???) showing around the same increases over the 290X, given that the rumoured core count is 45% higher (4096 vs 2816), the compute units are 45% higher (64 vs 44), and the TMUs 45% higher (256 vs 176).

What isn't very plausible is that these new parts are supposedly scaling perfectly in relation to mature products months out from release using what are undoubtedly very immature drivers. Unless you believe that both Nvidia and AMD have already perfected the drivers for these parts well ahead of launch. How likely does that sound?
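The ratios cited in the post above are easy to verify; a minimal sketch, using the rumoured counts quoted in the post (leaked figures, not confirmed specs):

```python
# Rumoured Fiji XT specification counts quoted in the post above,
# compared against the shipping R9 290X. Leaked figures, not confirmed by AMD.
fiji_xt = {"cores": 4096, "compute_units": 64, "tmus": 256}
r9_290x = {"cores": 2816, "compute_units": 44, "tmus": 176}

# Percentage increase of each rumoured figure over the 290X.
increases = {k: fiji_xt[k] / r9_290x[k] - 1 for k in fiji_xt}

for name, inc in increases.items():
    print(f"{name}: +{inc:.1%}")  # each line prints +45.5%
```

All three ratios come out identical (+45.5%), which is what makes the "leaked" benchmark deltas look like straight arithmetic scaling rather than measurement.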
Sony Xperia SI understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.
Like the fortunes of a company are predicted upon a halo part sold in limited numbers (didn't seem to do much for AMD when Hawaii ruled the roost for both single and dual GPU cards)? Who are they supposed to be lagging behind and why? Last time I checked, the company held ~80% of the workstation market, 85% of the HPC GPGPU market (+ a few high profile additions to come), is gobbling up mobile discrete graphics market share as fast as AMD is losing it, and is carving out a growing market for auto based SoC's. How is this all supposed to come tumbling down and what kind of timeframe are you expecting? You made the prediction, so you must have some supporting theory and evidence, right?
#63
AsRock
TPU addict
LionheartThis thread.....



I miss the old TPU :( Had less trolls & fanboys :banghead:
Used to have more facts and fewer rumors too, which were probably started by someone who loves Nvidia; same goes for this crap when AMD stuff is posted without some kind of proof.
#64
Sony Xperia S
HumanSmoke( shouldn't that be Fiji XT ???)
On your German link, Fiji is a 28 nm TSMC part, while we are speaking here about a 20 nm process.
So, no, it shouldn't be.
HumanSmokeWhat isn't very plausible is that these new parts are supposedly scaling perfectly in relation to mature products months out from release using what are undoubtedly very immature drivers. Unless you believe that both Nvidia and AMD have already perfected the drivers for these parts well ahead of launch. How likely does that sound?
Likely enough.
HumanSmokeWho are they supposed to be lagging behind and why?
Nvidia behind AMD because of lower gaming performance from top-tier new cards.
HumanSmokeYou made the prediction, so you must have some supporting theory and evidence, right?
Yes.


Oh, and I didn't say GM200 would be a fail, just that it would be inferior.
#65
ensabrenoir
........wow..... the stupidity level is now over 9000....:shadedshu: (un subs this thread)........
#66
HumanSmoke
Sony Xperia SOn your German link Fiji is a 28 nm TSMC part, while we are speaking here about 20 nm process.
You missed the point entirely. 3DC are talking about the 4096 core part being named Fiji, not Bermuda.
Sony Xperia SNvidia behind AMD because of lower gaming performance from top-tier new cards.
Really? Even when that has been demonstrably true (R300, Evergreen series launch, Hawaii launch), ATI/AMD have still failed to outsell Nvidia. It also doesn't explain how the performance of top-tier gaming cards should affect Quadro, Tesla, or SoC sales.
You can live the dream(world) for AMD all you like, but the facts are pretty clear. Nvidia has outsold ATI/AMD in discrete graphics every quarter for more than a dozen years and is presently outselling AMD two-to-one - at higher prices, I might add - and that ratio is historically increasing. On that note, Q4 2014's figures should make for some interesting reading in a couple of weeks' time.
marsey99but fun and games aside (being hitler, fun and games xD) please feel free to explain how the titan offered such great value when it didn't?
I'd suggest you direct that question to people who bought the GTX Titan. Even if you discounted the benchmarking/gaming fraternity, the card sold well amongst the CG rendering crowd.
marsey99as for it being a sales success, well yes many men do feel the need to use money to make them feel better about their tiny, tiny penis.
While undoubtedly true in some instances, there are also many instances where it boils down to buying the best tool for the job. Where CUDA outstrips OpenCL in rendering applications and time to completion is a priority, people choose the system best tailored to their needs. As for how many buy because of penis issues, I'll leave you to initiate a straw poll.

@ensabrenoir
I think I'll join you in un-subbing. When a graphics card thread devolves into Hitler, penises, and full-on trolling (Hi Sony), it's time to pull the pin
/ SMH and exits stage left
#67
vega22
HumanSmokeI'd suggest you direct that question to people who bought the GTX Titan. Even if you discounted the benchmarking/gaming fraternity, the card sold well amongst the CG rendering crowd.

While undoubtedly true in some instances, there are also many instances where it boils down to buying the best tool for the job. Where CUDA outstrips OpenCL in rendering applications and time to completion is a priority,people choose the system best tailored to their needs. As for how many buy because of penis issues, I'll leave you to initiate a straw pole poll.
but that is kinda the point dude. you are defending the (then) ludicrous cost of a "gaming" card because it sold well with professional users...

also, "When a graphics card thread devolves into Hitler, penises, and full-on trolling..." my job here is done.

just fyi for anyone that cares. i hope this, the 980ti super kraken-eating titan 2, and the 3>9000xtxsxrisriinxs+ are both monsters of gpu which muller 4k and are ready for 8k. i don't like multi gpu setups myself and can't wait till 1 gpu can do 4k, as i will be upgrading to it then :) and they will have a price war, even better!
#68
the54thvoid
Super Intoxicated Moderator
Can a mod close this thread please? It's a fucking catastrophe and pandering to arseholes.
#69
Sony Xperia S
HumanSmokeYou missed the point entirely. 3DC are talking about the 4096 core part being named Fiji, not Bermuda.
Would you be so kind as to explain how TSMC would be capable of manufacturing a 4096-shader Fiji on its existing 28nm processes?

Are you claiming that everything they stated back in November last year is correct?

So, you think 3dcenter's info is plausible, while Chiphell's is not? :eek:
#70
Tatty_Two
Gone Fishing
Thread cleaned up, it's a news thread....... stop the crap now please. Debate by all means, but some of the crap in here is not even worthy of a toddler; any more from now on and holidays will ensue..... thank you!
#71
HumanSmoke
Ah, moderation makes an appearance
marsey99but that is kinda the point dude. you are defending the (then) ludicrous cost of a "gaming" card because it sold well with professional users...
Bit of a reading fail on your part then. What I said was that people buy the tool for the job, and sales are sales regardless of the end user's intent - it is actually no different to the sales (and inflated pricing) attached to Radeon cards (also marketed as gaming) due to sales to miners, many of whom did nothing gaming-related with the cards at all.
Sony Xperia SWould you be so kind as to explain how TSMC would be capable of manufacturing a 4096-shader Fiji on its existing 28nm processes?
Assuming this is a genuine question then...
In theory, it would be fairly easy. Most people should realize that a large-die performance/enthusiast GPU devotes ~50% of its die area to cores and TMUs. The remaining 50% comprises the uncore (memory controllers, memory interfaces, command processor, transcode engine, raster ops etc.). In the die shot, the green areas are the core; everything else is the uncore.
The uncore is relatively fixed in size if the memory interfaces (bus width) remain static. Hawaii at 2816 cores is 438mm^2, half of which is cores and texture address units (~220mm^2). If the core count is increased by 45% (to 4096), then the area devoted to it increases to ~319mm^2. Add the 220mm^2 for the uncore and the resultant die area becomes ~539mm^2 - or just slightly smaller than GK110.
That is how TSMC is capable of manufacturing a 4096-shader Fiji. Whether they are the foundry involved depends on whether AMD decided to use GloFo's 28nm SHP process for GPUs in addition to Kaveri APUs. One of these two processes will almost certainly be the manufacturing node involved.
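The back-of-the-envelope estimate above can be written out as a few lines; a minimal sketch, using the post's own assumptions (Hawaii at 438 mm^2, a 50/50 core/uncore split, and a fixed uncore):

```python
# Back-of-the-envelope die-size estimate from the post above.
# Assumptions (from the post): Hawaii is 438 mm^2, roughly half core and half
# uncore, and the uncore stays fixed when only the core count grows.
hawaii_die_mm2 = 438
hawaii_cores = 2816
fiji_cores = 4096

uncore_mm2 = hawaii_die_mm2 / 2                       # fixed at ~219 mm^2
core_mm2 = (hawaii_die_mm2 / 2) * fiji_cores / hawaii_cores  # scaled with cores

estimate = uncore_mm2 + core_mm2
print(f"~{estimate:.0f} mm^2")  # ~538 mm^2
```

The exact core ratio (4096/2816 = +45.5%) gives ~538 mm^2; the post's ~539 mm^2 comes from rounding the increase to 45%, so the two figures agree to within a square millimetre.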
Sony Xperia SAre you claiming that everything they stated back in November last year is correct?
If you'd bothered to read what I wrote, it would be obvious that I was pointing out that 3DC attributed the name Fiji to the 4096-shader part. I might also point out that many other sources do the same, including a well-known AMD brown-noser who claims intimate knowledge of AMD's business (although you'll have to stump up a fee to breach the paywall). Have AMD swapped the names around? Were they in the right order to begin with? Who knows, although I'd note that the other parts in the hierarchy don't seem affected.
Sony Xperia SSo, you think 3dcenter's info is plausible, while Chiphell's is not? :eek:
3DC don't release leaks; they gather information and extrapolate from it. Their membership includes a number of industry insiders, coders, and architects. Chiphell, on the other hand, is a conglomeration like any forum-based site. The validity of its information depends upon the individuals posting there. Some is legitimate, some is quasi-legitimate (access to samples but results/info massaged for PR spin ***cough**Coolaler**cough***), some is estimation/guesstimation, and some is outright bullshit. Chiphell posts should be taken on a case-by-case basis - especially from posters with little or no previous track record of providing reliable information.

In this particular instance, we have a poster with no previous record for releasing reliable leaks, quoting a manufacturing process wholly unsuited for large GPUs, using a naming convention at odds with the rest of the tech world, and showing results that would indicate perfect scaling for both vendors which supposes mature drivers for both AMD and Nvidia months out from launch....all this plus a single source having access to not just one unreleased top-tier card, nor two, nor three, but four - access that includes both AMD and Nvidia.
I also find it difficult to accept that this guy benchmarked four unreleased cards (along with comparisons with many released cards) across 20 games, yet can't provide any shred of photographic evidence, no standard benchmark validations (Heaven, 3DMark), nor power figures for AMD's top part, nor any single game numbers. All a bit convenient.
#73
Fluffmeister
It's certainly nice to get back on topic. :P

Good effort though.
#74
xenocide
LionheartI miss the old TPU :( Had less trolls & fanboys :banghead:
You miss the days of Seronxadamus predicting AMD's new CPUs performing 60% better than Intel's, with perfectly linear scaling and everyone in the software design industry magically coding for 32 threads? The news posts have always been hit or miss; this one is just especially silly.
Sony Xperia SI understand why some nvidia fanboys could be unhappy about introducing discussions in the direction that despite those "so great" specs of GM200, they won't be enough and the company will soon be lagging behind.

Don't underestimate and ignore all possibilities.
This is all sorts of goofy. There is something extremely biased about saying a part that's a 45% increase across the board is clearly going to be better than one that's 50% in two categories and 100% in the other. There's nothing wrong with theorizing, but that just flat out doesn't make sense. I don't think "ignoring" possibilities is the issue, and it's a half-assed rebuttal to any statement, since you could just respond with "Oh sorry, I hadn't considered this card might be the one where AMD breaks the laws of physics, how silly of me!"
marsey99but that is kinda the point dude. you are defending the (then) ludicrous cost of a "gaming" card because it sold well with professional users...
The card was priced as a budget workstation card, but it also happened to have the highest gaming performance on the market, so Nvidia attacked two markets at once--with pretty good success. They advertised it to the gaming audience because anyone with half a brain would look at a cut-down K6000 for $1,000 and know it was a steal. It was hardly a gaming card though--unless bragging rights are considered a performance metric.
Sony Xperia SWould you be so kind as to explain how TSMC would be capable of manufacturing a 4096-shader Fiji on its existing 28nm processes?
Arithmetic?
#75
Sony Xperia S
HumanSmokeIn theory, it would be fairly easy.

If you'd bothered to read what I wrote it would be obvious that what I was pointing out was that 3DC attributed the name Fiji to the 4096 shader part. I might also point out that many other sources do the same including a well known AMD brown-noser who claims intimate knowledge of AMD's business (although you'll have to stump up a fee to breach the paywall). Have AMD swapped the names around? were they in the right order to begin with? Who knows, although I'd note that the other parts in the hierarchy don't seem affected.

3DC don't release leaks, they gather information and extrapolate from that. Their membership includes a number of industry insiders, coders, architects. Chiphell on the other hand are a conglomeration like any forum based site.
3DC's info is old enough, and we have a new piece of data which changes the initial plans.
Maybe Fiji had indeed initially been scheduled for production on 28nm with 4096 shaders, but it could afterwards have been ported forward to a more advanced manufacturing process, 20nm at GloFo.

In theory it would be OK, but in practice, to me, releasing anything on 28nm (even GM200) is a purely short-sighted decision.

The longer the release is delayed
TheGoodBadWeirdThe 390x is rumored not to come till summer 2015. Would be a long wait till then with only a low-midtier class card.
the more likely it is that those parts will use either 20nm or 16nm. :rolleyes:

wccftech.com/nvidia-gm200-titan-2-amd-fiji-380x-bermuda-390x-benchmarked/
