
NVIDIA Kepler Refresh GPU Family Detailed

Joined
Dec 17, 2011
Messages
359 (0.08/day)
Hmmm. Nvidia. One request. Try to release these GPUs without GPU Boost. It really hampers overclocking. If the GTX 680 didn't have GPU Boost, it would have easily reached 1.4 GHz with good binning.
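For context on why vendors clamp clocks and voltage: CMOS dynamic power scales roughly as P ∝ C·V²·f, so the last few hundred MHz get expensive fast. A back-of-envelope sketch below uses the GTX 680's public 195W TDP and 1006 MHz base clock, but the voltages are illustrative assumptions, not NVIDIA specs:

```python
# Rough CMOS dynamic-power scaling: P ~ C * V^2 * f.
# 195 W / 1006 MHz are the GTX 680's public TDP and base clock;
# the voltages below are illustrative guesses, not NVIDIA specs.

def scaled_power(p_base_w, f_base_mhz, f_new_mhz, v_base, v_new):
    """Estimate dynamic power after a frequency/voltage change."""
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

print(scaled_power(195.0, 1006.0, 1400.0, 1.175, 1.175))  # ~271 W, same voltage
print(scaled_power(195.0, 1006.0, 1400.0, 1.175, 1.300))  # ~332 W, with a voltage bump
```

Even before binning, that kind of draw is one plausible reason a locked boost table is the conservative choice.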
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
Processor Intel i9-9900KS @ 5.2 GHz
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i Elite
Memory 32GB Viper Steel Series DDR4-4000
Video Card(s) RTX 3090 Founders Edition
Storage 2TB Sabrent Rocket NVMe + 2TB Intel 960p NVMe + 512GB Samsung 970 Evo NVMe + 4TB WD Black HDD
Display(s) 65" LG C9 OLED
Case Lian Li O11D-XL
Audio Device(s) Audeze Mobius headset, Logitech Z906 speakers
Power Supply Corsair AX1000 Titanium
Hmmm. Nvidia. One request. Try to release these GPUs without GPU Boost. It really hampers overclocking. If the GTX 680 didn't have GPU Boost, it would have easily reached 1.4 GHz with good binning.

+1 I would LOVE an option to disable GPU-Boost. Maybe put a dual BIOS or throw an option into the control panel or something.

If they do that and start allowing voltage control again, they'll have a far superior product for people wanting to overclock. GPU Boost nonsense + no voltage control kept me away from the GTX 680.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,115 (6.63/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
+1 I would LOVE an option to disable GPU-Boost. Maybe put a dual BIOS or throw an option into the control panel or something.

If they do that and start allowing voltage control again, they'll have a far superior product for people wanting to overclock. GPU Boost nonsense + no voltage control kept me away from the GTX 680.

With the trend NV is following, especially after forcing EVGA to disable voltage tuning, I honestly don't think they will listen to customer feedback in this sense.
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
I know, and I'm very sad about that. I often prefer Nvidia's GPUs, but if AMD offers me something that I can turn into a superior product via overclocking while Nvidia cripples that capability, as happened with the 7970 and 680, I'll take AMD's offering every time.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
What I'm saying is that your status as a reviewer gives no inherent credibility to your dismissal of tech rumors/stories (sorry to break it to you...). That might be true if the stories were from people clueless about tech, or if everyone who is well informed about GPUs agreed with you, but that's not the case. When you get corroborating evidence from many reliable and semi-reliable tech sources, there's something to it.

http://en.wikipedia.org/wiki/Argument_from_authority#Disagreement
http://en.wikipedia.org/wiki/Appeal_to_accomplishment

I never said it did. I said that you must assume that what I post IS speculation only, since I do what I do, and I cannot post any real info about unreleased products.


And likewise, the same applies to any tech site.


That is all. GK110 is unreleased; nobody except nVidia employees and those who work at nVidia board partners knows anything about it, and none of them can comment due to NDA.


So anything, anything at all about it...is questionable.


Heck, it might not even actually exist; it might only be an idea.


Post a pic of a GK100 chip, full specs and everything else official from nVidia, and I'll stop my speculation.

Otherwise, if you don't like my post...that's just too bad. The report button is to the left, if you like.

You can say all you like that it was planned; you have no proof, and neither do I. And neither of us, if we did, could post it. So I can think and post what I like, and so can you. It's no big deal...only you are making it a big deal that I do not agree with this news.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
Post a pic of a GK100 chip, full specs and everything else official from nVidia, and I'll stop my speculation.
You can say all you like that it was planned; you have no proof, and neither do I. And neither of us, if we did, could post it. So I can think and post what I like, and so can you. It's no big deal...only you are making it a big deal that I do not agree with this news.

Argument from ignorance. You ARE claiming both that GK100 never existed and that it didn't exist because it cannot be made, based on the fact that we cannot provide proof to disprove your theory. You are the only one claiming anything, using this argument-from-ignorance fallacy to back it up.

The rest of us are just saying that it is entirely possible and probable that GK100 existed and was simply delayed or slightly redesigned into GK110, in a move similar to GF100 -> GF110. The proof, although rumors, is out there and has been there for a long time. Rumors about chips don't always end up being entirely true, but there's always some truth to them. GK100 was mentioned many times. GK110 DOES exist. 2+2=4

All in all, Nvidia has already shipped cards based on the 7.1-billion-transistor GK110 chip, so the notion that such a chip cannot be made is obviously false.
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
I've just never seen someone so ready to cavalierly dismiss a multitude of tech rumors based on their own idea of what is or is not possible from a manufacturing perspective...

To each his own I guess.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
Argument from ignorance. You ARE claiming both that GK100 never existed and that it didn't exist because it cannot be made, based on the fact that we cannot provide proof to disprove your theory. You are the only one claiming anything, using this argument-from-ignorance fallacy to back it up.

The rest of us are just saying that it is entirely possible and probable that GK100 existed and was simply delayed or slightly redesigned into GK110, in a move similar to GF100 -> GF110. The proof, although rumors, is out there and has been there for a long time. Rumors about chips don't always end up being entirely true, but there's always some truth to them. GK100 was mentioned many times. GK110 DOES exist. 2+2=4

All in all, Nvidia has already shipped cards based on the 7.1-billion-transistor GK110 chip, so the notion that such a chip cannot be made is obviously false.

I've just never seen someone so ready to cavalierly dismiss a multitude of tech rumors based on their own idea of what is or is not possible from a manufacturing perspective...

To each his own I guess.

Nah, actually, I'm claiming this since I know all the specs of GK110 already. I even have a die shot. And yeah, like you said, it is now for sale.

You can find info just as easily, too.

And because of this, I do think nVidia knew long before AMD's 7970 release that GK110 was not possible (which is when that news of the GTX 680 being a mid-range chip appeared), and as such it wasn't meant to be the GTX 680, ever. Is GK110 the ultimate Kepler design...sure. But it was NEVER intended to be released as the GTX 680. It was always meant as a Tesla GPGPU card.

Likewise, AMD knew that Steamroller...and Excavator were coming...and that they are the "big daddy" of the Bulldozer design...but that doesn't mean that Bulldozer or Piledriver are mid-range chips.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
Nah, actually, I'm claiming this since I know all the specs of GK110 already. I even have a die shot.

You can find it just as easily, too.

And because of this, I do think nVidia knew long before AMD's release that GK110 was not possible, and as such it wasn't meant to be.

Likewise, AMD knew that Steamroller...and Excavator were coming...and that they are the "big daddy" of the Bulldozer design...but that doesn't mean that Bulldozer or Piledriver are mid-range chips.

Everybody knows the specs and has seen die shots, and has for a long time already. That means nothing to the discussion at hand. Specs and die shots say nothing about whether it is feasible to build or not (it IS; it's already been created AND shipped to customers), and they certainly say nothing regarding the intentions of Nvidia.

If GK100/110 was so unfeasible as a gaming card that it was never meant to be one, they would have designed a new chip to fill in that massive ~250mm^2 difference between GK104 and GK110, instead of using GK110 as the refreshed high-end card. GK110 being an HPC chip, it wouldn't have so many gaming features wasting space either.

EDIT: Steamroller, etc. Worst analogy ever.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
If GK100/110 was so unfeasible as a gaming card that it was never meant to be one, they would have designed a new chip to fill in that massive ~250mm^2 difference between GK104 and GK110, instead of using GK110 as the refreshed high-end card. GK110 being an HPC chip, it wouldn't have so many gaming features wasting space either.

I dunno. You know, the one thing that nVidia is really good at is getting the most dollar out of R&D, and designing another chip kinda goes against that mantra.

I mean, it's like dropping the hot clock. They knew they had to.

Keeping within the 300W power envelope with the full-blown Kepler design was obviously not possible, as Fermi proved, IMHO.

Jen-Hsun said "The interconnecting mesh was the problem" for Fermi. That mesh...is cache.

And gaming doesn't need that cache. But... HPC does. Gaming needs power savings, and dropping the hotclock and lowering cache and DP cut power consumption enough that the GTX 680 is pretty damn good.

GK104...was that chip you just mentioned.


HPC is more money. WAY MORE MONEY. So for THAT market, yes, a customized chip makes sense.


See, Fermi was the original here. GF100 is the original, NOT GK100. Or GK110.


If nvidia started with Kepler as the new core design, then I would have sided with you guys, for sure, but really, to me, Kepler is a bunch of customized Fermi designs, customized in such a way to deliver the best product possible for the lowest cost, for each market.

You may think the Steamroller analogy is wrong here, but to me, that is EXACTLY what Kepler is. And you know what..nVidia says the same thing, too. :p


The hotclock to me, and the lack of DP functionality, says it all. The hotclock lets you use less die space, but requires more power. DP functionality also requires more power, because it requires more cache. Dropping 128 bits of memory control...again, to save on power...


If the current GTX 680 was meant to be a mid-range chip, after doing all that to save on power, damn, Nvidia really does suck hard. :p
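The hotclock arithmetic is easy to check against public specs: Fermi ran 512 shaders at a doubled 1544 MHz hotclock, while Kepler runs 1536 shaders at a 1006 MHz core clock. A quick peak-FLOPS comparison (FMA counted as 2 FLOPs per clock, the usual convention):

```python
# Peak single-precision throughput = shader count * 2 FLOPs (FMA) * clock.
# Public specs: GTX 580 = 512 SPs @ 1544 MHz hotclock, 244 W TDP;
#               GTX 680 = 1536 SPs @ 1006 MHz core clock, 195 W TDP.

def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

print(peak_gflops(512, 1544))   # GTX 580: ~1581 GFLOPS
print(peak_gflops(1536, 1006))  # GTX 680: ~3090 GFLOPS
```

Roughly double the peak throughput in a smaller power envelope is exactly the "wide and slow instead of narrow and hot" trade being described above.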
 
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
Try to release these GPUs without GPU Boost.
Sure, and if you click the checkbox to enable OC, or break the seal on the switch, and it then bricks? I would love to have heard how GK104 could do with the dynamic nanny turned off... While you may believe it might find 1.4GHz... would it live on for any duration?

I speculate it wouldn't, or Nvidia wouldn't have put such restrictions in place if there weren't good reasons. Will they still have it that way for the next generation? Yes, almost assuredly, but at that point better TDP and improved clock and thermal profiles will mean there is no gain over operating at an exaggerated fixed clock. I think for mainstream parts both sides will continue to refine boost-type control. It provides them the best of both worlds: lower claimed power usage with the highest FPS return.
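For anyone unclear on what the "dynamic nanny" actually does: GPU Boost is, in essence, a closed-loop controller that steps the clock up while power and thermal headroom remain, and backs off when either limit is hit. NVIDIA has never published the algorithm; the sketch below is purely illustrative, with invented step sizes and limits:

```python
# Illustrative boost controller, NOT NVIDIA's actual algorithm.
# All numbers are invented; only the feedback idea is the point.

BASE_MHZ, CEILING_MHZ, BIN_MHZ = 1006, 1150, 13

def boost_step(clock, power_w, temp_c, power_limit=195, temp_limit=80):
    """One control iteration: step up with headroom, back off over a limit."""
    if power_w < power_limit and temp_c < temp_limit and clock < CEILING_MHZ:
        return clock + BIN_MHZ
    if (power_w >= power_limit or temp_c >= temp_limit) and clock > BASE_MHZ:
        return clock - BIN_MHZ
    return clock

clock = BASE_MHZ
for power, temp in [(160, 65), (175, 70), (198, 81), (188, 78)]:
    clock = boost_step(clock, power, temp)
    print(clock)  # 1019, 1032, 1019, 1032
```

A fixed "exaggerated" clock has to be validated for the worst-case load; a boost clock only has to hold while there's headroom, which is the best-of-both-worlds point above.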
 
Joined
Nov 13, 2009
Messages
5,614 (1.02/day)
Location
San Diego, CA
System Name White Boy
Processor Core i7 3770k @4.6 Ghz
Motherboard ASUS P8Z77-I Deluxe
Cooling CORSAIR H100
Memory CORSAIR Vengeance 16GB @ 2177
Video Card(s) EVGA GTX 680 CLASSIFIED @ 1250 Core
Storage 2 Samsung 830 256 GB (Raid 0) 1 Hitachi 4 TB
Display(s) 1 Dell 30U11 30"
Case BIT FENIX Prodigy
Audio Device(s) none
Power Supply SeaSonic X750 Gold 750W Modular
Software Windows Pro 7 64 bit || Ubuntu 64 Bit
Benchmark Scores 2017 Unigine Heaven :: P37239 3D Mark Vantage
I love how everyone is saying AMD will have a hard time competing :roll: Did everyone forget that the yawn that is the 7970 GHz Edition still beat out the GTX 680, and that this gen, for the most part, each company is equal at the typical price points?

The 8970 is expected to be 40% faster than the 7970.

The GTX 780 is expected to be 40-55% faster than the 680.

Add in overclocking on both and we end up with the exact same situation as this generation. So in reality it just plain doesn't matter lol. Performance is all I care about, and who gets product onto store shelves and from there into my hands. Doesn't matter who's fastest if it takes 6 months for stock to catch up.

:roll::roll::roll::roll::roll::roll:

It's always entertaining when fanboys get butthurt over people's opinions. The fact is a stock reference 7970 is slower than a stock reference 680; I know it hurts you to accept this fact, but it is true. As for the GHz Edition, compare it to a 680 that is factory overclocked and the result is the same. My 680 Classified walks all over any 7970, so :cry::cry: less and check facts more.
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
Guys, we should only look to cadaveca now for tech rumors; this guy obviously knows what's up, and we can't trust dozens of other knowledgeable people/sites. They all just make stuff up and obviously only release info to get page views.

:rolleyes:
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
My 680 Classified walks all over any 7970, so :cry::cry: less and check facts more.

My 1200/1800 Lightnings very seriously doubt that... OC'd 680s and 7970s are about even overall, trading blows depending on the game/benchmark.

The 7970 gets beat by the 680, sure, but the pricing has updated itself to reflect that now - the 7970 is priced about the same as the 670, and the GHz Edition is priced around the 680.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
I dunno. You know, the one thing that nVidia is really good at is getting the most dollar out of R&D, and designing another chip kinda goes against that mantra.

I mean, it's like dropping the hot clock. They knew they had to.

Keeping within the 300W power envelope with the full-blown Kepler design was obviously not possible, as Fermi proved, IMHO.

Jen-Hsun said "The interconnecting mesh was the problem" for Fermi. That mesh...is cache.

And gaming doesn't need that cache. But... HPC does. Gaming needs power savings, and dropping the hotclock and lowering cache and DP cut power consumption enough that the GTX 680 is pretty damn good.

GK104...was that chip you just mentioned.


HPC is more money. WAY MORE MONEY. So for THAT market, yes, a customized chip makes sense.


See, Fermi was the original here. GF100 is the original, NOT GK100. Or GK110.


If nvidia started with Kepler as the new core design, then I would have sided with you guys, for sure, but really, to me, Kepler is a bunch of customized Fermi designs, customized in such a way to deliver the best product possible for the lowest cost, for each market.

You may think the Steamroller analogy is wrong here, but to me, that is EXACTLY what Kepler is. And you know what..nVidia says the same thing, too. :p


The hotclock to me, and the lack of DP functionality, says it all. The hotclock lets you use less die space, but requires more power. DP functionality also requires more power, because it requires more cache.

:laugh: at Kepler being Fermi. Sure, and Fermi is the Tesla arch (GT200). :laugh:

If we go by similarities, as in "they look the same to me with a few tweaks," we can go back to the G80 days. Same on AMD's side. But you know what? They have very little in common. Abandoning hot-clocks is not a trivial thing. Tripling the number of SPs on a similar transistor budget is not trivial either, and it denotes exactly the opposite of what you're saying. Fermi and Kepler schematics may look the same, but they aren't the same at all.

As to the rest: it makes little sense to think that GK104 is the only thing they had planned. In previous generations they created 500mm^2 chips that were 60%-80% faster than their previous gen, and AMD was close, 15%-20% behind. But on this gen they said: "You know what? What the heck. Let's create a 300mm^2 chip that is only 25% faster than our previous gen. Let's make the smallest (by far) jump in performance that we've ever had; let's just leave all that potential there. Later we'll make GK110 a 550mm^2 chip, so we know we can do it, and it's going to be a refresh part so it IS going to be a gaming card, but for now, let's not make a 450mm^2 chip, or a 350mm^2 one, no, no sir, a 294mm^2 one with a 256-bit interface that will clearly be the bottleneck even at 6000 MHz. Let's just let AMD rip us a new one..."

EDIT: If GK110 had not been fabbed and shipped to customers already, you'd have the start of a point. But since it's already been shipped, it means that it's physically possible to create a 7.1-billion-transistor chip and make it economically viable (the process hasn't changed much in 6 months). So like I said, something in the middle, like a 5-billion-transistor and/or 400mm^2 chip, would be entirely possible, and Nvidia would have gone with that, because AMD's trend has been upwards in regards to die size and there's no way in hell Nvidia would have tried to compete with a 294mm^2 chip when they knew 100% that AMD had a bigger chip AND they have been historically more competent at making more in less area. Nvidia can be a lot of things, but they are not stupid and would not commit suicide.
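Both halves of that argument are easy to sanity-check with public numbers: memory bandwidth scales linearly with bus width, and the shipped GK110's reported figures put its transistor density in the same ballpark as GK104's. A back-of-envelope check (the die sizes are the commonly reported approximations):

```python
# Bandwidth = (bus width in bits / 8) * effective per-pin data rate.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 6.0))  # GTX 680's 256-bit bus at 6 Gbps: 192 GB/s
print(bandwidth_gb_s(384, 6.0))  # a 384-bit bus at the same speed: 288 GB/s

# Transistor density from public (approximate) figures, in M/mm^2:
print(3540 / 294)  # GK104: ~12.0
print(7100 / 550)  # GK110: ~12.9 (die size is the commonly cited estimate)
```

Similar density at nearly double the area is consistent with the claim that the same process could have yielded an intermediate ~400mm^2 part.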
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
Guys, we should only look to cadaveca now for tech rumors; this guy obviously knows what's up, and we can't trust dozens of other knowledgeable people/sites. They all just make stuff up and obviously only release info to get page views.

:rolleyes:

Yep. :p


The fact you can't ignore that bit says something.


What, I cannot speculate myself?

And when you can't attack my points, you go after my character? lulz.

As if I want to be the source of rumours. :p Yes, I want to be a gossip queen.


Like, do you get that? I'm not the one that posted the news...BTA didn't either...he just brought it here for us to discuss...

These same sites you trust get it wrong just as often as right. Oh yeah, Bulldozer is awesome, smokes Intel outright...yeah...that worked...


The HD 7990 from AMD in August...but it was PowerColor...


Rumours are usually only part-truths, so to count them all as fact...is not my prerogative. :p

:laugh: at Kepler being Fermi. Sure, and Fermi is the Tesla arch (GT200). :laugh:

If we go by similarities, as in "they look the same to me with a few tweaks," we can go back to the G80 days. Same on AMD's side. But you know what? They have very little in common. Abandoning hot-clocks is not a trivial thing. Tripling the number of SPs on a similar transistor budget is not trivial either, and it denotes exactly the opposite of what you're saying. Fermi and Kepler schematics may look the same, but they aren't the same at all.

As to the rest: it makes little sense to think that GK104 is the only thing they had planned. In previous generations they created 500mm^2 chips that were 60%-80% faster than their previous gen, and AMD was close, 15%-20% behind. But on this gen they said: "You know what? What the heck. Let's create a 300mm^2 chip that is only 25% faster than our previous gen. Let's make the smallest (by far) jump in performance that we've ever had; let's just leave all that potential there. Later we'll make GK110 a 550mm^2 chip, so we know we can do it, and it's going to be a refresh part so it IS going to be a gaming card, but for now, let's not make a 450mm^2 chip, or a 350mm^2 one, no, no sir, a 294mm^2 one with a 256-bit interface that will clearly be the bottleneck even at 6000 MHz. Let's just let AMD rip us a new one..."

Well, that's just it. This is complicated stuff.

I am not saying at all that GK104 was the only thing...it isn't. But GK110 was never meant to be a GTX part. Kepler is where the GeForce and Tesla become truly separate products.


And yeah, it probably did work exactly like that...300mm^2...the best they could get IN THAT SPACE, since die size dictates how many chips they can get per wafer. You know, designs do work like that, so they can optimize wafer usage...right?
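That wafer-economics point can be made concrete with the standard gross-dies-per-wafer approximation, dies ≈ π(d/2)²/A − πd/√(2A), which subtracts the partial dice lost around the wafer edge. A sketch comparing candidate die areas on a 300mm wafer (gross dies only; yield losses punish big dice even harder):

```python
import math

# Standard gross dies-per-wafer estimate (ignores yield and scribe lines).
def gross_dies(wafer_mm, die_area_mm2):
    r = wafer_mm / 2.0
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_mm / math.sqrt(2 * die_area_mm2))

for area in (294, 400, 550):  # GK104, a hypothetical midpoint, ~GK110
    print(area, round(gross_dies(300, area)))  # ~202, ~143, ~100 dice
```

Roughly twice as many GK104-sized dice per wafer as GK110-sized ones, before yield; that's the cost pressure behind settling on a ~300mm^2 gaming chip.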
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
Didn't mean it as an attack on your character, I'm just saying that your last couple posts had an "I know what I'm talking about because I'm a reviewer and you peons don't" flavor to them, that's all.

Could just be reading them wrong, I suppose, but I think not.

Anyways, rumors are rumors, but they exist for a reason, and this particular family of rumors has been around for almost a year now... plenty long enough to indicate there's something to it.

Enough debating about rumors for me, though.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
Could just be reading them wrong, I suppose, but I think not.

Yeah, you're reading that wrong. I was saying explicitly that I don't know WTF I'm talking about here, since I'm a reviewer. If I did know what I was talking about, I'd not be able to discuss it.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
But GK110 was never meant to be a GTX part.

Oh gosh. Before you say that one more time, can you please explain, at least once, why it has so many units that are completely useless in a Tesla card?

Looking at the whitepaper, anyone who knows a damn about GPUs can see that GK110 has been designed to be a fast GPU as much as it's been designed to be a fast HPC chip. Even GF100/110 was castrated in that regard compared to GF104, and G80 and G9x had the same kind of castration, but in Kepler, the family where "GeForce and Tesla become truly separate products," they chose to maintain all those unnecessary TMUs, tessellators and geometry engines.

- If GK104 was at least close to 400mm^2, your argument would hold some water. At 294mm^2 it does not.
- If GK104 was 384 bits, your argument would hold water. At 256 bits, it does not.
- If GK110 didn't exist and had not been released 6 months after GK104 did...
- If GK110 had no gaming features and wasn't used as the high-end refresh card...
- If GK104 had been named GK100... you get it.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
Oh gosh. Before you say that one more time, can you please explain, at least once, why it has so many units that are completely useless in a Tesla card?

Because all those things are needed for medical imaging. HPC products still need 3D video capability too. Medical imaging is a vast market, worth billions. 3D is not gaming. That's where you miss some things. :p

And no, I do not agree with the summation that GK110 was intended to be a "fast GPU". The needed die size says that is not really possible.


But, since it's for HPC, where precision is prioritized over speed, that's OK, and lower clocks with greater functionality make sense.


However, for the desktop market, where speed wins overall, the functionality side isn't so much needed, so it was stripped out. This makes for two distinct product lines, with staggered releases, and hence not competing with each other.

I mean, likewise, what do all those HPC features have to do with a gaming product? :p
 
Joined
Nov 13, 2009
Messages
5,614 (1.02/day)
Location
San Diego, CA
My 1200/1800 Lightnings very seriously doubt that... OC'd 680s and 7970s are about even overall, trading blows depending on the game/benchmark.

The 7970 gets beat by the 680, sure, but the pricing has updated itself to reflect that now - the 7970 is priced about the same as the 670, and the GHz Edition is priced around the 680.

You're confusing value with performance in a debate about performance; not the same thing at all, nor valid in any way. :shadedshu
 
Joined
Sep 7, 2010
Messages
854 (0.16/day)
Location
Nairobi, Kenya
Processor Intel Core i7-14700K
Motherboard ASUS ROG STRIX Z790-H
Cooling DeepCool AK500 WH
Memory Crucial Pro 32GB Kit (16GB x 2) DDR5-5600 (CP2K16G56C46U5)
Video Card(s) Intel ARC A770 Limited Edition
Storage Solidigm P44 Pro (2TB x 2) / PNY CS3140 2TB
Display(s) Philips 32M1N5800A
Case Lian Li O11 Air Mini (White)
Power Supply Seasonic Prime Fanless Titanium 600W
Keyboard Dell KM714 Wireless
Software Windows 11 Pro x64
Anyways, rumors are rumors, but they exist for a reason, and this particular family of rumors has been around for almost a year now... plenty long enough to indicate there's something to it.

Yes, there is something, and what we know for sure is GK110 Tesla/Quadro... for now.

And as cadaveca said, the info we have right now is just rumors and speculation; let's just wait, and sooner or later we will all know for sure.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
Because all those things are needed for medical imaging. HPC products still need 3D video capability too. Medical imaging is a very vast market, worth billions. 3D is not gaming. That's where you miss some things. :p

Medical imaging is not HPC. Maybe you should have been more clear. That being said, Nvidia has announced a GK110-based Tesla, but no Quadro:

http://www.techpowerup.com/170096/N...tion-Revolution-With-Kepler-Architecture.html

Their Maximus platform is composed of GK104-based Quadro and GK110-based Tesla cards. So I think that you're missing much more than I am.

And oh, I don't doubt there will be a GK110-based Quadro, but it's not been announced yet AFAIK. I've only heard about them in the same rumors as the GeForce part, so... ;)

And no, I do not agree with the summation that GK110 was intended to be a "fast GPU". The needed die size says that is not really possible.

And yet it all points to Nvidia using it. And in the past they have used chips of the same size, and quite successfully.

And an HPC chip has never been profitable on its own, and I don't think it is right now either.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
And an HPC chip has never been profitable on its own, and I don't think it is right now either.

I bet nVidia would disagree.


For me, medical imaging is part of the HPC market. Precise imaging isn't needed just for medical uses either; anything that needs an accurate picture, from oil and gas exploration to military uses, falls under the same usage. Both Tesla and Quadro cards are meant to be used together, building an infrastructure that can scale to consumer demands, called Maximus. If you need more rendering power, say for movie production, you've got it, or if you need more compute, for stock market simulation, that's there too, so I fail to see that you've posted much that supports your stance there. Nvidia doesn't build single GPUs...they build compute infrastructure.


Welcome to 2012.


With this second generation of Maximus, compute work is assigned to run on the new NVIDIA Tesla K20 GPU computing accelerator, freeing up the new NVIDIA Quadro K5000 GPU to handle graphics functions. Maximus unified technology transparently and automatically assigns visualization and simulation or rendering work to the right processor.

Did you read that press release?


:p

I mean, that whole press release is nVidia claiming it IS profitable, or they wouldn't be marketing towards it. :p



In fact, that press release kinda proves my whole original point, now doesn't it? ;) GK104 for imaging (3D, Quadro and GeForce), GK110 for compute (Tesla).


Like, maybe I'm crazy...but...well...whatever. I'm gonna play some BF3. :p
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,816 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
I could care less if you want to call me a fanboy, [H]@RD5TUFF, but honestly it just makes you look like a child.

I could care less about your Classified 680s, blah blah. I still had my card months before the 680 was available, and I enjoyed roughly the same performance.

Simple fact is, if I want to be a dick and pull useless numbers, the 7970 holds the world record for 3DMark 11 Extreme, Heaven Extreme, among others.

When both cards are clocked they perform the same; they excel in certain games over their rival, and vice versa.

AvP favors AMD
BF3 favors NVIDIA
etc
 