
Nvidia's future endeavors

Meh, idle rumour for now. Saw it on WCCF or whatever it's called, a site that's very bad at good info.
 
Like they said at the end of the article, they really don't NEED to do anything like this for a while. AMD have nothing to counter it with right now, and they really can't even afford to try yet. The only thing you see dropping is Nvidia's own 780, which isn't matching the value of the 970. I saw a sale on the ASUS 780 today for $293 at Newegg, and even at that price, a lot of people are going to prefer the 970.

Honestly, I'm tempted after reading this to hold off on my platform/GPU upgrade. I've been leaning toward the end of next year lately anyway, since Broadwell and the new Skylake socket (1151?) have yet to release, and I'm sure DDR4 will come down in price and latency, as well as appear on the next budget platform. I've also had my 7970 for just under two years, and it still handles all games pretty well.

By the end of next year I'll also have a better idea of what my next display might be (1440p, 4K, etc.). Even the 970, especially one of the better ones like the Giga WF, can handle 1440p (although a 970 Ti would certainly do so better). Hard to say how broad a lineup and how soon we'll see 20nm GPUs, but I worry the price of admission will be quite high.
 
I'm honestly also tempted to hold off and plunge that money into a 980 Ti or new Titan.
 

Yeah, it would be cool to have a 4K-capable single GPU.
 
Just a rumour and nothing more. I see Nvidia doing exactly what they did with GTX 600/700 series; release GK104 (mid-tier GPU) as high-end SKU (GTX 670/680), then rehash the mid-tier GPU as mid-range card (GTX 770) and release the top-tier GPU GK110 as high-end cards (780/Ti/Titan). So we will have GM204 as GTX 1070? and GM200 as GTX 1080/Ti/Titan? This'll be quite a few months off yet...
 
Things are too easy for Nvidia right now. Have been since Kepler. They can drop a bigger chip if AMD gets anywhere close to them, or push in the die shrink and start their whole midrange --> big chip cycle again. I guess it's a plus if you want to feel like your card is top notch for a longer period of time, but it sucks if you're on 4k and need that power now.
 
If you are a serious gamer, there's no choice other than Nvidia: low latency, high quality, lower TDP (quieter), better stability, plus the best optimization for my favorite esports games like StarCraft 2. Any CS game is preferable on Nvidia because of lower frame delay. Beautiful driver customization features for any software on PC. G-SYNC.
 

What a pile of crap.

I know I'll wait until things mature on the TV/monitor side, and as for video cards, if you have one of the latest you don't need one, just like you don't need an Nvidia card to be a serious gamer. LMFAO.
 
+1

If anything, the better price/performance of AMD cards should be what decides it.
 

Yet at the end of the day, the 970 still kicks the 290X's ass, and for a LOT less money. Not to mention that low TDP producing hefty OCs.

Time to give credit where it's due. I'm ready for a change already, and I'm less than two years into my 7970. Then again, I knew it was a get-me-by until the high-end Maxwells released, and I got it for roughly the equivalent of only $250 with the game bundle.

Truth be told though, had the MSI 660 Ti PE OC not had compatibility issues with my TV, I'd still be on it, and the 7970 was a far better deal. That said, I can see now why the customer reviews are more spotty on the AMD cards. Mine doesn't OC worth a damn. It artifacts badly if I take it a breath over the small factory OC.

Most quality-brand Nvidia factory-OCed cards will yield at least SOME amount of OC beyond the factory one, especially Giga with their GPU Gauntlet Sorting. It's Maxwell all the way for me on the next GPU. My 7970 will serve me adequately until I decide which one, but it's definitely looking green for me.
 
Whether it kicks its ass or not doesn't mean you or I aren't serious gamers. I've been gaming for over 30 years, and to make such a claim is total bullshit.

However, if the 970 had been out when I was getting my 290X at the start of the year, I would have opted for it for sure.

I do know that the games on the market today that require such cards are few, and I could have been a happy gamer using my older 6970, playing all but one at really reasonable detail.

I also know that I should have waited for a node shrink, as let's face it, it's not as if I was using a card that couldn't handle good, serious gaming.
 
Man, I love you, but let's be honest... all you do is play Diablo. lol

I agree it was a crap statement, but Diablo can run on a cell phone.
 

Well, I had a long-ass break from that to play other games until last week, as I am waiting on a Divinity price drop lol.

And Arma 3 is at a stalemate at this time.

As for other games, been there, done that; I either got bored or completed them.

I just feel sorry for those you play Gauntlet with, as I know you only too well.
 
Whether it kicks its ass or not doesn't mean you or I aren't serious gamers. I've been gaming for over 30 years, and to make such a claim is total bullshit.

However, if the 970 had been out when I was getting my 290X at the start of the year, I would have opted for it for sure.

I don't think anyone foresaw the high end Maxwells being so affordable, but I'm not surprised by the performance. We saw Nvidia's expected performance on it well before they came out with it, before they even came out with the 750 Ti, and usually they calculate that stuff fairly accurately. I knew then I wanted a Maxwell.

Don't take the serious gamer comment so personally, man, you're better than that. I took it as meaning "those who are avid gamers are gonna want this card", and that's no lie, we do. I don't think he was bashing anyone on AMD at all. In fact, if anything, it relates to AMD customers even more now that Nvidia is offering such bang for buck, which has usually been AMD's forte.

So even though I know the 970 is the most ass kicking card right now on bang for,...scratch that, I'm officially calling it Big Bang for Buck, something also tells me AMD had a lot to do with Nvidia's pricing choice. It's Nvidia saying, "See we can offer great value too".

It's all good man, this competition is healthy for the industry, and even our heated discussions about it, but no need for anonymity here.
 
I'm actually concerned that AMD is again falling behind... They were the ones responding to the GK110; now they have to respond to the GM204...
 

They have been since the GTX 600 series. Their top-tier Tahiti GPUs only just managed to match/beat the mid-tier (but high-end SKU) GK104 GTX 670/680. When Nvidia refreshed/rehashed the Keplers for the GTX 700 series and finally dropped the GK110 in the GTX 780 (to compete with the GHz Ed?), AMD responded with their Hawaii GPUs. By this time the GK110 had already been available (albeit as a Tesla card) for almost 12 months and had been speculated about many months earlier. This leads to a theory that Nvidia purposely withheld the GK110 after seeing the original Tahiti performance and raked in the cash on a marked-up mid-tier GPU (GK104). It seems history is now repeating itself with the new Maxwell GPUs found within the GTX 970/980 (GM204).

Putting on a tinfoil hat for a second, one could almost think there is some sort of gentlemen's agreement between NVidia and AMD that has resulted in price collusion; for the last few generations, their GPUs have never really competed directly, just conveniently slot between each other in price, performance and features. :ohwell:
 
I hate spreading rumors since this is a factual place, but my TweakTown email said Nvidia is working on an AIO version of the 980. I can only see this being useful for a possible Ti or Titan.
 

The only possible reason I can see for Nvidia doing such a thing is that the GTX 980s are hitting the 80 °C temperature trigger a little too often, affecting the boost clocks. I doubt this is the case, however. One other possibility is that they're releasing some sort of SKU that comes with a heavy factory overclock?
 
Nvidia originally planned to make a GK100 but it was an epic failure in yield rates so it got scrapped and replaced by the GK110. When the GTX 680 launched there were supply issues for the same reason.

AMD announced that the 3xx series will be 20nm in 2015, and the only way they will stop chasing Nvidia is if they change tactics and design a GPU the size of the GK110 from day one, because so far the pattern has been: AMD switches nodes first, but Nvidia eventually manages to produce enough of their huge GPUs, and AMD ends up behind again.
 
Nvidia originally planned to make a GK100 but it was an epic failure in yield rates so it got scrapped and replaced by the GK110.

The GK100 was only ever speculation at best, so I doubt any yield information was ever released to the public (if the chip even existed?).

Here's my two predictions for the next 15 months or so:

  • GM200 will be released by the end of this year on the matured 28nm node. In the first half of next year, Nvidia will refresh Maxwell on 20nm for mid- to high-range SKUs. This will bring higher clocks, etc., possibly followed by a Titan Z successor.
  • Or, the more likely scenario is that we won't see the GM200 this year and it will be released sometime during H1 2015, alongside a 20nm Maxwell refresh. The GM200 itself will be on the new 20nm node. As with the first scenario, existing chips like the GM204 will receive a refresh on the 20nm node; if not, they'll just receive a better PCB, etc., like the GTX 770 vs the GTX 680.
This is similar to one of my other predictions:

My prediction is a Maxwell refresh on 20nm early to mid next year, then a Pascal release on 20nm, then a Pascal 16nm refresh? That should carry us for the next three years. It seems to me that architectural generations last at least 18 months, spanning two SKU series (i.e. Fermi (400 and 500 series), Kepler (600 and 700 series), Maxwell (primarily 800 and 900? series)).

Edit: Obviously my last prediction was before Nvidia decided to skip the 800 series in the desktop space.
 
OCing anything is never a guarantee (at least it never used to be). Promising something will jump to 4.5 GHz or whatever is just sandbagging.
 
The GK100 was only ever speculation at best, so I doubt any yield information was ever released to the public (if the chip even existed?).
Why would they call the GK110 the GK110 unless there was a GK100 that failed? Fermi is the same: the GF100 was never fully operational, and then you got the GF110, which was a revised GF100.
 
There is no 980 Ti.
This rumour is baseless, and there is literally nothing backing it up. The only thing going forward from here is the Maxwell 960, and the inevitable DPP card that's either a Quadro or a Titan (given the Titan's awful market acceptance these days, Nvidia will know going down that road is a bad idea).

I'm not buying this AIO GPU stuff either. It's speculation, designed to generate clicks.

For everything else, wait on the 210 chip next year.
 
Why would they call the GK110 the GK110 unless there was a GK100 which failed. Fermi is the same the GF100 was never fully operational and then you got the GF110 which was a revised GF100.

The difference is that the GF100 actually existed in a SKU and the GF110 was the revision, as you mentioned. Speculation is the formation of a theory without firm evidence, which is what is happening here. The GK100 may have existed as a prototype chip that never even reached engineering samples. It could also just be simple marketing: GK100 may have been seen by the public as too close to GF100 in naming scheme and could draw upon negative memories of a relatively hot and power-hungry card. Stretching the theory more towards my original speculation regarding the original GTX 600 series launch, it could be possible that when Nvidia saw the original Tahiti performance, they made a quick revision to the GK100 and renamed it GK110 to differentiate it internally. This could still be related to yields, but since such a chip was never even leaked with credible evidence, we can only speculate here.
 