# NVIDIA GeForce GTX 880 Detailed



## btarunr (Apr 10, 2014)

NVIDIA's next-generation GeForce GTX 880 graphics card is shaping up to be a true successor to the GTX 680. According to a Tyden.cz report, the GTX 880 will be based on NVIDIA's GM204 silicon, which ranks within its product stack the same way GK104 ranks within the GeForce "Kepler" family. It won't be the biggest chip based on the "Maxwell" architecture, but it will have what it takes to outperform even the GK110, in the same way GK104 outperforms GF110. The DirectX 12-ready chip will feature an SMM (streaming multiprocessor, Maxwell) design identical to that of the GeForce GTX 750 Ti, only with more SMMs, spread across multiple graphics processing clusters (GPCs) and probably cushioned by a large slab of cache.



 

This is what the GTX 880 is shaping up to be:



- 20 nm GM204 silicon
- 7.9 billion transistors
- 3,200 CUDA cores
- 200 TMUs
- 32 ROPs
- 5.7 TFLOP/s single-precision floating-point throughput
- 256-bit wide GDDR5 memory interface
- 4 GB standard memory amount
- 238 GB/s memory bandwidth
- Clock speeds of 900 MHz core, 950 MHz GPU Boost, 7.40 GHz memory
- 230W board power
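For what it's worth, the rumored throughput and bandwidth numbers are at least internally consistent. A quick sanity check (assuming 2 FLOPs per CUDA core per clock, which is how NVIDIA quotes peak single-precision):

```python
# Sanity-check the rumored GTX 880 figures (FMA = 2 FLOPs per core per clock).
cores = 3200
base_clock_ghz = 0.9                         # 900 MHz core clock
sp_tflops = cores * 2 * base_clock_ghz / 1000
print(f"Single-precision: {sp_tflops:.2f} TFLOP/s")   # ~5.76, matching the quoted 5.7

bus_bits = 256
mem_ghz = 7.4                                # effective GDDR5 data rate
bandwidth_gbs = bus_bits / 8 * mem_ghz
print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")  # ~236.8, close to the quoted 238
```

So the 5.7 TFLOP/s figure lines up with the base clock, and the 238 GB/s figure lines up with a data rate just above 7.4 GHz.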

*View at TechPowerUp Main Site*


----------



## OC-Rage (Apr 10, 2014)

Wow, GODDAMN, what is this monster?

Ready for 4K games, ready for OC. I think 256-bit isn't enough for high resolutions,

despite the 7.40 GHz memory speed.


----------



## john_ (Apr 10, 2014)

I like coincidences. For example, at wccftech they have possible specs for the R300 series.
AMD Pirate Islands: R9 300 Series Alleged Specifications Detailed, Flagship Cores Bermuda XTX, Treasure Island XTX and Fiji XTX

Anyway, I think 32 ROPs is pretty low for a high-end card today with 4K in mind, or am I wrong? 256-bit was already mentioned.
Maybe what I'm afraid of is happening: high-end names and prices on mid-range products.
That's why I was shouting that Titan was, and is, bad news for us gamers/desktop users, no matter which company you're a fan of, and the 295X2 is in fact a ridiculously overpriced card justified partly thanks to Titan Z's price. And it doesn't matter if you rush to defend the semi-custom nature of Titan. As gamers, WE DO NOT CARE ABOUT THE PROFESSIONAL FEATURES OF THE CARD.

Whoever was defending Titan, like a good Nvidia fanboy, gets as a present a card with 32 ROPs instead of 48 and a 256-bit data bus instead of at least 384-bit. Congrats.


----------



## Tsukiyomi91 (Apr 10, 2014)

If it's true, then AMD has to come up with a "contingency" plan, since Maxwell's power-efficient chips (and their potential) have proved to be one of Nvidia's main strongholds here. But what really surprises me is that a high-end chip, the GM204 inside the GTX 880, has a sweet 230W power consumption! That's 20W lower than the GTX 780 Ti's full-blown GK110 chip! So kudos to Nvidia once again for proving to be the best VGA vendor money can buy.


----------



## Tsukiyomi91 (Apr 10, 2014)

The Maxwell (GM2xx) chips have a bigger memory cache than Kepler chips, so blazing-fast VRAM speeds and a huge video memory compensate for the bus width. 256-bit is quite standard, so it's not a big deal anyway. Remember, Maxwell is focused on energy efficiency, not raw power like AMD's super-hot Tahiti chips.


----------



## Katanai (Apr 10, 2014)

3200?!? This will be a beast. If it won't cost a fortune, I will have to get this one.


----------



## NC37 (Apr 10, 2014)

And all the people who bought a $3000 GPU go... 

NV stated they'd be doing a tick-tock setup like Intel. Kepler was a nice advancement, but it was still the tock to Fermi. It isn't super different. Now we're going on the next tick, which is Maxwell. 

Interested to see if NV gets it right again like they did with the 400s. The 750 is really mediocre, and it's a GM chip. I'll be keeping a close eye out, and hopefully find new employment to pay for it. Unless, of course, AMD finally gets everything right.


----------



## Tsukiyomi91 (Apr 10, 2014)

What AMD needs to do now is make their chips faster, lower the power consumption on new or refreshed chips, and improve the software (drivers too) in order to push the cards to their maximum potential. Heat issues are inevitable, but if they manage to yield a lower TDP, then they're on the right track for the first time in years.


----------



## HumanSmoke (Apr 10, 2014)

john_ said:


> I like coincidences. For example, at wccftech they have possible specs for the R300 series.
> AMD Pirate Islands: R9 300 Series Alleged Specifications Detailed, Flagship Cores Bermuda XTX, Treasure Island XTX and Fiji XTX.


Ah, WCCF...


> I have just recently gotten my hands on a somewhat outdated and alleged AMD GPU specification details.....all Pirate Island GPUs will have the DirectX 12 Hardware Feature Set.


That's some mighty fast work. According to AMD, DX12 only became a reality after Mantle showed up, so AMD would be going from architecture tweak to design layout to tape-out to risk silicon to production silicon and die packaging in 9-10 months...on a new process which doesn't seem conducive to high-power ICs (TSMC's CLN20SOC)?...and there's no way that TSMC's 20nm BEOL/16nm FEOL will be ready for primetime in that timeframe.

Should be interesting to see how big the die gets. 4,224 cores would be a 50% increase over Hawaii, on a node transition that shrinks area by less than 30%.

Both of these estimations (Pirate Islands and GM204) look more like wish lists, unless the timeframe is further out to allow for 20nm/16nmFF node usage.


----------



## Xzibit (Apr 10, 2014)

Anyone else getting a *VIRUS WARNING* when clicking the ExPreview link?

Here's the comparison chart, and it looks like it's going to cost just as much as a 780 Ti. Have to wonder how much the big chip will cost, then.


----------



## bim27142 (Apr 10, 2014)

Hoping for a 860 or 860 Ti with no external power connector...


----------



## Tsukiyomi91 (Apr 10, 2014)

AMD now wants to use the DX12 Hardware Feature Set?? That's something...


----------



## HumanSmoke (Apr 10, 2014)

Xzibit said:


> Here's the comparison chart, and it looks like it's going to cost just as much as a 780 Ti. Have to wonder how much the big chip will cost, then.


An arm and a leg...and maybe a kidney for the fully enabled die.
The cost per wafer for 20nm BEOL+ 16nm FEOL (FinFET) is rumoured to be around $6000-6500 per- a sizeable cost increase over the existing 28nm process (< $4K) + costs for masks. If the Pirate Islands info from WCCF   is ballpark then there probably wont be much difference in die size between AMD and Nvidia at the high end. A quick and dirty calculation says that AMD's chip (for a 50% core increase) would be ~460-470mm², while Nvidia's would probably decrease from its current flagship (551mm²) (GM204 *not* GM200 if it eventuates) since the core count is only increasing by 11% and uncore would decrease in relation to GK110 (less die space needed for memory controllers, I/O) unless Nvidia decide to beef up the SFU's to increase FP64 or dramatically increase cache.


----------



## Rahmat Sofyan (Apr 10, 2014)

The Titan Z hasn't even come in for review yet, and the GTX 790 never came at all.

Now the GTX 880 is already being rumoured... Jeez.


----------



## sweet (Apr 10, 2014)

Tyden.cz? Never heard of it. An unpopular source from the Czech Republic leaks info about nVidia? So reliable 
And the "leaked" specs are not that impressive compared to the "leak" from AMD:
http://wccftech.com/amd-pirate-islands-r9-300-series-bermuda-fiji-treasure-islands-xtx/


----------



## buildzoid (Apr 10, 2014)

Maxwell is better at mining than current GCN so if scrypt ASICs don't show up soon these will cost 2x MSRP


----------



## HumanSmoke (Apr 10, 2014)

sweet said:


> Tyden.cz? Never heard of it. An unpopular source from the Czech Republic leaks info about nVidia? So reliable
> And the "leaked" specs are not that impressive compared to the "leak" from AMD:
> http://wccftech.com/amd-pirate-islands-r9-300-series-bermuda-fiji-treasure-islands-xtx/


WCCF are only any good when they're cutting and pasting from known leakers (Chinese forums) or another mainstream site...otherwise it's more like this (and I don't think there is a correct specification in the whole article)


----------



## sweet (Apr 10, 2014)

HumanSmoke said:


> WCCF are only any good when they're cutting and pasting from known leakers (Chinese forums) or another mainstream site...otherwise it's more like this (and I don't think there is a correct specification in the whole article)


Yeah, I never said WCCF is a reliable source. But is Tyden.cz THAT popular, to be cited by TPU???

I expected some leaked nVidia slides when I read the title "NVIDIA GeForce GTX 880 Detailed" (which has no question mark). What a misleading article.


----------



## arbiter (Apr 10, 2014)

sweet said:


> Tyden.cz? Never heard of it. An unpopular source from the Czech Republic leaks info about nVidia? So reliable
> And the "leaked" specs are not that impressive compared to the "leak" from AMD:
> http://wccftech.com/amd-pirate-islands-r9-300-series-bermuda-fiji-treasure-islands-xtx/



Yeah, that link shows the 390X having 4,224 stream processors, which seems a bit far-fetched for one GPU generation. It seems more likely to be about the same as Nvidia's, or in the 3,500-ish range, for AMD's 390X. Otherwise it would be a pretty massive GPU, and 300 watts or more wouldn't shock me.


----------



## rokazs1 (Apr 10, 2014)

bim27142 said:


> Hoping for a 860 or 860 Ti with no external power connector...



That's a bit unrealistic; even some 750 Tis have 1 x 6-pin. 
An 860 Ti with 1 x 6-pin and I'm sold.


----------



## HumanSmoke (Apr 10, 2014)

sweet said:


> Yeah, I never said WCCF is a reliable source. But is Tyden.cz THAT popular to be cited by TPU???


Probably not. As I said in an earlier post, both of these estimations (Pirate Islands and GM204) look more like wish lists.
The silly season is upon us. People are itching to hear about something new after a relative eternity (two-and-a-bit years) on 28nm, so the semi-pro guessers lay down the estimates and the forums go wild. That Czech site will probably get more hits in the next 24 hours than they normally get in six months. Ka-ching!


----------



## RejZoR (Apr 10, 2014)

Interesting, they'll be using fewer ROPs and a narrower memory bus than in older models.


----------



## TheDeeGee (Apr 10, 2014)

Katanai said:


> 3200?!? This will be a beast. If it won't cost a fortune, I will have to get this one.



According to a comparison table on Guru3D, it's aimed at the same price as a 780 Ti.

GTX880 - 16000 CZK (583 Euros / 807 Dollars)
GTX780 Ti - 16100 CZK (586 Euros/ 881 Dollars)


----------



## buggalugs (Apr 10, 2014)

Tsukiyomi91 said:


> AMD now wanted to use DX12 Hardware Feature Set?? that's something...




What are you going on about? This is Nvidia's next-gen hardware; it won't be out for a while, and we're still talking rumours. AMD will have something to compare soon enough. AMD will be built on 20nm too, so performance will be +/- 10%...just like every generation. Both are built at the same factory.

The plan is the same as always: milk the consumers. Nvidia will sell you a cut-down GTX 880 first, with disabled ROPs, TMUs, CUDA cores, etc. Then AMD will release something as good or better, then Nvidia will unlock some features of the same silicon and call it the 880 Ti, so consumers need to spend hundreds on a new card.

There is no "better" graphics card company; they are both in on it. Everything is pre-planned: Nvidia knows exactly what AMD is doing, and AMD knows exactly what Nvidia is doing.


----------



## robert3892 (Apr 10, 2014)

Looking forward to this, but since TSMC is having difficulties with 20nm dies, I expect to see 8xx series cards towards the end of this year or the beginning of next year.


----------



## LeonVolcove (Apr 10, 2014)

lets say $1000?


----------



## THU31 (Apr 10, 2014)

Absolutely ridiculous specs.

Compared to 780 Ti this has:
15% more Gflops, 1/3 less memory bandwidth, fewer TMUs and ROPs, and an enormous TDP. Bear in mind, Maxwell has 50% better performance/watt, and 20 nm adds another 30%.

Whoever came up with those specs is a complete idiot.


----------



## SimplexPL (Apr 10, 2014)

Harry Lloyd said:


> Absolutely ridiculous specs.
> 
> Compared to 780 Ti this has:
> 15% more Gflops, 1/3 less memory bandwidth, fewer TMUs and ROPs, and an enormous TDP. Bear in mind, Maxwell has 50% better performance/watt, and 20 nm adds another 30%.
> ...



Yeah, I was just about to write the same thing. All those people who got excited about those specs - read them again and compare to specs of current GPUs.

Luckily this source has zero credibility, so let's hope these specs turn out to be completely false, or that they turn out to be the specs of a GTX 860.


----------



## 20mmrain (Apr 10, 2014)

I got some solid info that Nvidia will be releasing this card with a black cooler and under the Titan Designation.
It will be called the "GTX 880 Titan-ZZ Top Edition" and cost $10,000 US (but it will not be the fully unlocked version). But don't worry... those waiting for the fully unlocked version (after spending their money on the GTX 880) won't have to wait long. Three months later, Nvidia will release the more powerful GTX 880 Ti Super Duper Titan ZZ Top Edition for $20,000 US.





On a serious note, yes, I know I was trolling above... and I am glad to see Nvidia finally make improvements. However, I just hope that Nvidia doesn't try to charge an arm and a leg for this card when it is unnecessary to do so.


----------



## THU31 (Apr 10, 2014)

SimplexPL said:


> ...or they turn out to be specs of GTX 860



Very realistic, indeed (except for the TDP).

The 660 had 20% more Gflops than the 580, and cost $230 (the 580 cost $500).

The 680 had 100% more Gflops than the 580, and cost exactly the same, $500.

580 TDP - 244 W
680 TDP - 195 W
660 TDP - 140 W


So how could a card with just 15% more Gflops and 80% better performance/watt have a TDP of 230 W? Complete bullshit.
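Running this thread's own perf/watt figures through the math shows why the 230 W number looks off. A rough sketch (the ~5 TFLOPS / 250 W Kepler baseline and the 50% and 30% multipliers are the figures claimed in this thread, not confirmed data):

```python
# If Maxwell + 20nm really deliver ~twice the perf/watt of Kepler,
# a 5.7 TFLOPS part should need far less than 230 W.
kepler_tflops, kepler_tdp = 5.0, 250            # GTX 780 Ti ballpark (claimed)
kepler_eff = kepler_tflops * 1000 / kepler_tdp  # ~20 GFLOPS/W

maxwell_eff = kepler_eff * 1.5 * 1.3            # +50% architecture, +30% node (claimed)
implied_tdp = 5.7 * 1000 / maxwell_eff
print(f"Implied TDP for 5.7 TFLOPS: ~{implied_tdp:.0f} W")  # ~146 W, nowhere near 230 W
```

Under those assumptions, the rumored spec sheet pairs a mid-range throughput figure with a flagship power budget, which is the inconsistency being called out.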


----------



## the54thvoid (Apr 10, 2014)

Well, I just bought a new card so it follows all the next gen stuff will appear in about a month...

But seriously, yawn.  This is getting old.  New gfx card releases are becoming as dull as Intel's new CPUs.  How these things are used and what they're designed for, i.e. games, need to move forward.  

I'd much rather see innovation in gaming and software dictating what the gfx vendors need to do.  It's almost like designing a race car that goes 8000 mph in a straight line only to find out we're racing round a donut.   Energy efficiencies are pushing NV forward in the mobile space but as far as desktop, something 'more' is required to make it all good.

People are applauding the 295x2 but that's also a boring turd.  There's nothing inventive about it - it's a dual chip card with no refinement and a cooling solution slapped on by pragmatism, not innovation.  Does it power 4K?  Of course it does.  But so do 2 x 290x.

TitanZ? who knows.  If it's 300watt and 90% the perf of 295x2, it's an achievement.  But still meh.

Next gen needs a big rabbit out of a small hat.


----------



## CounterZeus (Apr 10, 2014)

Finally, my new graphics card detailed! That, or an 870 if my PSU doesn't cut it.


----------



## KainXS (Apr 10, 2014)

So are we going to have the Titan mashup again?

GTX 880 -> GTX 880 Titan (48 ROPs) -> GTX 880 Ti (48 ROPs) -> GTX 880 Titan Z/Black (all SPs/ROPs unlocked, for $$$$)

I don't think it's true myself.


----------



## ISI300 (Apr 10, 2014)

This is why no one should buy Titan-Zs or 295X2s. Sooner or later this thing will release, and AMD has had plenty of time to get their architecture right. Let's hope they can compete.


----------



## 64K (Apr 10, 2014)

Harry Lloyd said:


> Very realistic, indeed (except for the TDP).
> 
> 660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).
> 
> ...



Agreed. The TDP doesn't make sense for Maxwell GM104 with those specs. If the TDP is wrong, then what other specs are also wrong, and why did whoever leaked this info know enough to make the other specs somewhat believable but screw up the TDP? I have to throw the whole leak out the window for now. Maybe some of it will prove to be true at the end of this year. We'll see.


----------



## xorbe (Apr 10, 2014)

I am going to camp until ALL of the 800 series cards are on the table (probably when they start talking about the 900 series).  They burned their goodwill by going 780 -> 780 Ti -> 6GB 780 Ti last round.  Luckily we were able to flash a lot of original Titans towards 1100 MHz and side-step that whole disaster; still down one SMX, but not much difference from the 6GB 780 Ti that was finally sold months and months later.


----------



## TheHunter (Apr 10, 2014)

So another "mid-range" dressed up as high-end aka GK104 chip.. Meh talk about milking again.


----------



## TheGuruStud (Apr 10, 2014)

This won't ship for another 7-8 months, but we have specs?

Bwahahhahahahahhahahahahahahhahahahahhhhhahahahahahahhaha

Clickbait crap.


----------



## TheDeeGee (Apr 10, 2014)

OC-Rage said:


> Wow, GODDAMN, what is this monster?
> 
> Ready for 4K games, ready for OC. I think 256-bit isn't enough for high resolutions,
> 
> despite the 7.40 GHz memory speed.



Maxwell has an L2 cache of 2MB, so 256-bit is far from a bottleneck for high resolutions.

Apart from that, it runs at 7.4 GHz, which might even be OCed to 8 GHz.


----------



## MxPhenom 216 (Apr 10, 2014)

buggalugs said:


> What are you going on about? This is Nvidia's next-gen hardware; it won't be out for a while, and we're still talking rumours. AMD will have something to compare soon enough. AMD will be built on 20nm too, so performance will be +/- 10%...just like every generation. Both are built at the same factory.
> 
> The plan is the same as always: milk the consumers. Nvidia will sell you a cut-down GTX 880 first, with disabled ROPs, TMUs, CUDA cores, etc. Then AMD will release something as good or better, then Nvidia will unlock some features of the same silicon and call it the 880 Ti, so consumers need to spend hundreds on a new card.
> 
> There is no "better" graphics card company; they are both in on it. Everything is pre-planned: Nvidia knows exactly what AMD is doing, and AMD knows exactly what Nvidia is doing.



No. GM210, or whatever they call big-die Maxwell, will likely arrive as the GTX 980, which is what I'd expect Nvidia to use to counter AMD's release. Just like what the 780 is to the 680.


----------



## Slizzo (Apr 10, 2014)

Yeah, if the 880 is going to be GM204, then I'm waiting for whatever GM200 or GM210 will be.


----------



## H2323 (Apr 10, 2014)

buggalugs said:


> What are you going on about? This is Nvidia's next-gen hardware; it won't be out for a while, and we're still talking rumours. AMD will have something to compare soon enough. AMD will be built on 20nm too, so performance will be +/- 10%...just like every generation. Both are built at the same factory.
> 
> The plan is the same as always: milk the consumers. Nvidia will sell you a cut-down GTX 880 first, with disabled ROPs, TMUs, CUDA cores, etc. Then AMD will release something as good or better, then Nvidia will unlock some features of the same silicon and call it the 880 Ti, so consumers need to spend hundreds on a new card.
> 
> There is no "better" graphics card company; they are both in on it. Everything is pre-planned: Nvidia knows exactly what AMD is doing, and AMD knows exactly what Nvidia is doing.



Nice job, first practical thing I have read in the comments for some time... That said, AMD might do some GloFo fab runs for GPUs. But yes, it's a game.


----------



## crazyeyesreaper (Apr 10, 2014)

yup fake hardcore super fake fake fake fake fake.


----------



## Hilux SSRG (Apr 10, 2014)

7.40 GHz memory at stock?


----------



## BiggieShady (Apr 10, 2014)

crazyeyesreaper said:


> yup fake hardcore super fake fake fake fake fake.



When was the last time a Chinese tech website covered a story from a Czech tech website, it all ended up in the TPU news section... and it turned out to be fake?


----------



## thebluebumblebee (Apr 10, 2014)

Harry Lloyd said:


> Very realistic, indeed (except for the TDP).
> 
> 660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).
> 
> ...


Unfortunately, you are comparing GPUs from totally different "families".  This is the trick that Nvidia has played on us.  Remember that what we're paying for is the silicon.
580 GF110
680 GK104 (GTX 460/560 family) - remember the "shorty" 670s?
660 GK106 (GTS 450/550 family)
and I've got to throw this in:
750 Ti GM107 (GT 640/GTS 650)

Let's do some real power comparisons, using W1zzard's PEAK numbers:
580 = 229 watts, 780 Ti = 269 watts
560 Ti = 159 watts, 770 = 180 watts
550 Ti = 112 watts, 660 = 124 watts
650 = 54 watts, 750 Ti = 57 watts
Has Nvidia reduced power usage within a family?


TheHunter said:


> So another "mid-range" dressed up as high-end aka GK104 chip.. Meh talk about milking again.


Bingo!


----------



## mroofie (Apr 10, 2014)

Harry Lloyd said:


> Very realistic, indeed (except for the TDP).
> 
> 660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).
> 
> ...



I agree the TDP should be below 200w
This info is lacking credibility !!
TPU shame on you !!!!!!!!!!!!!!!!!!


----------



## 64K (Apr 10, 2014)

thebluebumblebee said:


> Unfortunately, you are comparing GPUs from totally different "families".  This is the trick that Nvidia has played on us.  Remember that what we're paying for is the silicon.
> 580 GF110
> 680 GK104 (GTX 460/560 family) - remember the "shorty" 670s?
> 660 GK106 (GTS 450/550 family)
> ...



However.....

GF104   160 watts
GK104   190 watts
GM204  230 watts - doesn't make sense.

If we're looking at GM210 then yes, but not GM204. I fully expect a 250 watt Big Maxwell but that won't come from GM204. If Nvidia follows suit with the Kepler releases then look for that about a year after GM204. So I'm thinking about 1.5 years at earliest.


----------



## rooivalk (Apr 10, 2014)

looking forward to 860  hopefully this year.


----------



## mroofie (Apr 10, 2014)

rooivalk said:


> looking forward to 860  hopefully this year.


The 60 range only comes a few months after the 80 and 70 range :/


----------



## bogami (Apr 10, 2014)

Again, mostly outdated leaks. Short and thick, as if it has something to hide under a thick, ineffective cooler. Not to mention the price: since it is the mid-range GPU that should replace the GTX 770, €300 would be a realistic price! Of course NVIDIA will not be so generous, and we will see abnormal prices imposed for an outdated GPU while the next generation is already developed. Horror. It gets to me when I see what they show us as already developed, and then they sell a generation of obsolete parts just to draw profit from the patents. The future Tegra is based on the second generation of Maxwell, and so on (the car support demo)! Processors lag behind the known capabilities, and long-known solutions take a few years to finally be put into practice. We are really lacking strong competition that could ......
As far as the progress of this processor goes, the initially predicted ratio of 1 (FG) to 16 (MG) was later revised to 1 to 8. DirectX 12 is coming, and we'll see what it brings. I hope that AMD will not be left behind with their new generation.


----------



## THU31 (Apr 10, 2014)

thebluebumblebee said:


> Unfortunately, you are comparing GPU's from totally different "families".  This is the trick that Nvidia has played on us.  Remember that what we're paying for is the silicone.
> 580 GF110
> 680 GK104 (GTX 460/560 family) remember the "shorty" 670's?
> 660 GK106 (GTS 450/550 family)
> ...



That is the whole point, of course I am comparing different families.

580 was 40 nm Fermi, 660 and 680 were 28 nm Kepler, as was the 780 (Ti). 860 and 880 will be 20 nm Maxwell.

The 660 was slightly more powerful than the 580, and the 680 was twice as powerful as the 580. That is why it makes sense to assume that an 860 will offer 780/Titan-level performance at just over half the power consumption, and the 880 should be at least 1.5 times more powerful. I really do not see the 880 offering less than 8 Tflops; that would be stupid.


580 - 1.5 Tflops
680 - 3 Tflops
780 Ti - 5 Tflops

How could the 880 offer just 5.7 Tflops? Ridiculous.


----------



## Nordic (Apr 10, 2014)

buildzoid said:


> Maxwell is better at mining than current GCN so if scrypt ASICs don't show up soon these will cost 2x MSRP


They are good at mining, but I doubt they will sell like AMD's were just a short time ago. The reason GPU prices went up is that coin prices were so high. Now so many have joined the mining force that the difficulty rose, making it harder to mine, and everyone sells, so what they do mine is worth less.


----------



## Hilux SSRG (Apr 10, 2014)

mroofie said:


> lol wat milking ?/
> this info is not even real !
> FAAAAKKKEEEEE


 


mroofie said:


> Don't listen to him he a amd fanboy


 


mroofie said:


> hmm ?? so what are you saying lol ?


 


mroofie said:


> 60 range only comes after a few months after 80 and 70 range :/


 
FYI, there's a Multi-Quote button next to reply.


I still think 20nm Maxwell isn't arriving until early 2015.  I know AMD, going by past releases, follows NVidia after a die shrink, which seems still on target.


----------



## dj-electric (Apr 10, 2014)

*I feel the need to clear some things up, so here we go.*

The GTX 750 Ti is a Maxwell-core card based on 28nm and consumes about 55W while gaming.
If it were a 20nm card, it would probably consume 35W, considering power savings and penalty as well.

If you take the GTX 750 Ti's power and double it, you get performance around the GTX 770 (according to TPU). So 35W x 2 + penalty = an 80W GTX 770+ card.

I don't see how a card with about twice the GTX 770's power (so 640 GTX 750 Ti shaders times four) at about 180W would be impossible. Add another block of GTX 750 Ti's worth of shaders, and what you get is about a 210W card with 3,200 shaders. You could limit it with a 256-bit bus and pair it with 4GB of memory.

This possibility is far from being a unicorn. It is most likely that at 180W power consumption we will get something that beats the GTX 780 Ti without much effort. It is also probable that at 230W we will get something that goes even further, a lot further.

I don't get people here.
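The scaling argument above can be sketched as multiples of a GTX 750 Ti "unit" (the 35 W 20nm figure and the overhead penalty are the post's assumptions, not measured numbers):

```python
# Treat the rumored card as N copies of a GTX 750 Ti's shader block.
unit_shaders = 640        # GTX 750 Ti shader count
unit_watts_20nm = 35      # assumed 20nm power for one 750 Ti's worth of shaders
penalty = 1.2             # assumed overhead for uncore, wider memory bus, etc.

rumored_shaders = 3200
units = rumored_shaders / unit_shaders              # 5 units
estimated_watts = units * unit_watts_20nm * penalty
print(f"{units:.0f}x 750 Ti -> ~{estimated_watts:.0f} W")  # ~210 W, under the rumored 230 W
```

Under those assumptions, the rumored 230 W budget leaves room to spare, which is the point being made.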


----------



## Nordic (Apr 10, 2014)

Dj-ElectriC said:


> *I feel the need to clear some things up, so here we go.*
> 
> The GTX 750 Ti is a Maxwell-core card based on 28nm and consumes about 55W while gaming.
> If it were a 20nm card, it would probably consume 35W, considering power savings and penalty as well.
> ...


I am no engineer, but is there not some aspect of scaling involved here, where it is easier to make a power-efficient small chip like the 750 Ti? Nvidia does design from mobile on up.


----------



## dj-electric (Apr 10, 2014)

That's why I mentioned the penalty: the same penalty level that exists on modern cards and previous ones.


----------



## TheHunter (Apr 10, 2014)

Harry Lloyd said:


> That is the whole point, of course I am comparing different families.
> 
> 580 was 40 nm Fermi, 660 and 680 were 28 nm Kepler, as was the 780 (Ti). 860 and 880 will be 20 nm Maxwell.
> 
> ...



Was it? The GTX 780 is 2x more powerful than the GTX 580; the GTX 680 was ~30-50% faster at most. IMO you can't really look at those Tflops and determine true power. For example, in GTX 680 vs. GTX 580 compute performance, the 580 wins almost all the time.
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17


Anyway, I think that 5.7 Tflops figure is just a threshold to say "look, it's faster than the 780 Ti", so it will be "high-end" for now, even though the rest screams "mid-range": 256-bit bus, 32 ROPs... Also they will make a full line, just like with the 600 series and GK104, and then slowly move to full Maxwell. 
The full GM110 (?) will come later and probably turn into the 900 series. 


IMO it all depends on how much money they want for this GTX 880. If it's €500+, nah, not really worth it, unless it's at least 40-50% faster than the 780 Ti.


----------



## arbiter (Apr 10, 2014)

Harry Lloyd said:


> Very realistic, indeed (except for the TDP).
> 
> 660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).
> 
> ...



Maxwell is 3x more efficient per watt, so it is possible.



64K said:


> However.....
> 
> GF104   160 watts
> GK104   190 watts
> ...



Keep in mind, the GK104 that was 190 watts had only 1,536 CUDA cores, and if the chart is right, the 880 will have about 2x that. Since it's replacing GK110, which is a 250-watt, 2,304 CUDA core part, it seems about right.


----------



## 64K (Apr 10, 2014)

My issue with the TDP isn't about what wattage is required for the specs. It's about business. Look at Nvidia's track record. They release a new architecture/die shrink a step at a time to maximize profits and keep the buzz news going. I'm using initial releases where possible because GTX 880 is the initial release and not the refresh. Consider this.......

GTX 280       236 watt TDP
GTX 480       250 watt TDP
GTX 780 Ti    250 watt TDP (note there is no real progression here, because they pulled some shit and called what should have been a GTX 660 a $500 GTX 680)

~250 watts is their single-GPU flagship target. They will never release a GM104 with a 230 watt TDP. If they did, what would be the incentive to buy a GM110?
Consider the difference in power between a GTX 680 and a GTX 780 Ti. That's what sells, and that's what keeps the buzz going.


----------



## THU31 (Apr 10, 2014)

TheHunter said:


> Was it? The GTX 780 is 2x more powerful than the GTX 580; the GTX 680 was ~30-50% faster at most. IMO you can't really look at those Tflops and determine true power. For example, in GTX 680 vs. GTX 580 compute performance, the 580 wins almost all the time.
> http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17
> 
> 
> ...



So if a card with 100% more Gflops was 30-50% more powerful in games, how much more powerful will a card with 15% more Gflops be? 5%?
Those benchmarks are irrelevant here, because the 600/700 series has a completely limited FP64 performance. My numbers were for single precision.

I know framerate is a completely different issue, but I am not talking about that. I am just talking about pure computational single precision power, which is why it is absolutely impossible for a top-end Maxwell to have just 5.7 Tflops. Coupled with fewer TMUs and ROPs it would barely be faster than the 780 Ti, if at all. No way.

And Radeon rumors suggest as many as 96 ROPs. ROPs are what help in 4K; they are just destroying GeForces at that resolution because of it. An 880 with 32, or even 48, ROPs would look just ridiculous. They will not want to lose that market completely.


----------



## SKL_H (Apr 10, 2014)

Wow, GTX 880 already, and I am still on GK106 :-(

Still waiting for the Titan Z review....


----------



## sweet (Apr 10, 2014)

At first I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.

The whole point of this article is "we need to offer something about nVidia, or all the folks will only talk about the AMD 295X2".


----------



## crazyeyesreaper (Apr 11, 2014)

People be trollin', trollin', trollin', trollin'
keep trollin', trollin', trollin', trollin'
keep trollin', trollin', trollin', trollin'
keep trollin', trollin', trollin', trollin'


----------



## xorbe (Apr 11, 2014)

SKL_H said:


> Still waiting for the Titan Z review....



The best you'll get right now: http://forums.guru3d.com/showthread.php?t=388143


----------



## HammerON (Apr 11, 2014)

sweet said:


> The whole point of this article is "we need to offer something about nVidia, or all the folks will only talk about the AMD 295X2".



Wow - really?


----------



## pr0n Inspector (Apr 11, 2014)

sweet said:


> At first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.
> 
> The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".



The butthurt is palpable.


----------



## Serpent of Darkness (Apr 11, 2014)

Tsukiyomi91 said:


> AMD now wanted to use DX12 Hardware Feature Set?? that's something...



1.  Anybody running the upcoming Windows 9 OS will be able to use DX12.
2.  Since NVidia and AMD pay royalties to use any and all D3D versions, it's not uncommon for AMD to have DX12.0 as long as they slip the funds into M$'s pocket.  AMD has been paying royalties for full DX11.1 and DX11.2 for the past year.  A lot of ignorant NVidia fanboys have painted this unrealistic image that AMD is against the epic tag-team duo of NVidia and Microsoft, all thanks to Mantle.  It's nothing more than a fantasy in every zombie-state NVidia fanboy's mind.




buggalugs said:


> There is no "better" graphics card company, they are both in on it, everything is pre-planned, Nvidia knows exactly what AMD is doing and AMD knows exactly what Nvidia is doing.



+1.

I'm theorizing a few things.

1.  The R9-390x spec might actually be true.  When Titan was first released, AMD was going to produce a graphics card with twice the number of SPs of the 7000 series cards.  This could never be done on 28 nm, but it is possible on 20 nm.  I believe the card was called Tenerife II.  It was supposed to have twice the SPs of a 7970, plus 16 additional SPs.  Now that 20 nm graphics cards are starting to make their appearance, it's possible that Bermuda XTX is actually Tenerife II v1.5.

Look at it from this point of view:
AMD 7970: 2048 SP at 975 MHz core clock
Tenerife II: 2 x 2048 SP = 4096 SP + 16 SP = 4112 SP @ 975 MHz core clock.
R9-390x: 0.5 x (4224 SP) = 2112 SP; a factor of roughly 2.06 over the 7970's 2048.


2. Another thing to consider is the R9-380x.  The R9-380x has 3072 SP.  The R9-290x has 2816 SP.  In addition, the R9-290x has roughly 10% of its total SPs locked to control TDP.

Difference between R9-290x and R9-380x: 9.09%.
R9-290x's 2816 SP + 10% = 3097 SP.

So in essence, the R9-380x is a rebranded R9-290x with 99% of its cores unlocked.
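That rebrand arithmetic can be checked directly; a quick sketch (the SP counts are the post's own rumored figures, not confirmed specs):

```python
r9_290x_sp = 2816   # shipping R9-290x shader count
r9_380x_sp = 3072   # rumored R9-380x shader count

gain = r9_380x_sp / r9_290x_sp - 1      # relative SP increase
full_die = int(r9_290x_sp * 1.10)       # 290x with its ~10% locked SPs enabled

print(round(gain * 100, 2))  # 9.09 (%)
print(full_die)              # 3097, within 1% of the rumored 380x count
```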


Other things to consider:
Performance gain from GTX 680 to GTX 780 Ti: 65.4%
Performance gains from GTX 680 to GTX 780:  27.6%
Performance gain from GTX 780 to GTX 780 Ti: 29.9%
Performance gain from GTX 780 Ti to GTX 880:  13.1%

Performance gain from AMD 7970 to R9-290x:  48.7%
Performance gain from AMD 7970Ghz to R9-290x: 37.5% 
Performance gain from AMD 7970 to R9-390x:  122%
Performance gain from AMD 7970 to R9-280x: 8.11%
Performance gain from AMD 7970Ghz to R9-280x: 0.00% to 5.00% (-1)
Performance gain from AMD R9-290x to R9-380x: 9.09% 
Performance gain from AMD R9-290x to R9-390x: 50.0%

W9100 Double to Single PP Ratio: 0.474877
K40 Tesla Double to Single PP Ratio: 0.333333

Single precision:
GTX 780 Ti = 5.37 TFLOPs.
GTX 880 = 6.08 TFLOPs.
R9-290x = 5.63 TFLOPs.
R9-390x = 8.45 TFLOPs.

I suspect that the AMD side is more than 50.0% true.   There's a noticeable trend between generations.   As for the NVidia side, I believe the GTX 880 will be 2015's GTX 680.  Following that, we'll see a GTX 880 Ti, a GTX Titan-Black-M and a GTX Titan-Z-M with improved or tweaked versions of "rough-draft" Maxwell until the GTX 980 is released.

M = Maxwell.

2560 x 1440
Theoretical Output:  BF4 Dx11
GTX 780 Ti = 64 FPS.
R9-290x = 68 FPS.

GTX 880 roughly = 72 FPS.
R9-390x roughly = 102 FPS.

2560 x 1440
Theoretical Output: BioShock Infinite Dx11
GTX 780 Ti = 78 FPS.
R9-290x = 62 FPS.

GTX 880 roughly = 88.3 FPS.
R9-390x roughly = 93.0 FPS.
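The "theoretical output" numbers above appear to be simple linear scaling of a measured frame rate by the TFLOP/s ratio; a sketch of that (admittedly naive) method, which reproduces the post's figures:

```python
def scaled_fps(base_fps, base_tflops, new_tflops):
    """Naive estimate: assume frame rate scales linearly with
    single-precision throughput (ignores ROPs, bandwidth, drivers)."""
    return base_fps * new_tflops / base_tflops

# BF4: GTX 780 Ti (5.37 TFLOPs, 64 FPS) -> rumored GTX 880 (6.08 TFLOPs)
print(round(scaled_fps(64, 5.37, 6.08), 1))  # ~72.5, the post's "roughly 72 FPS"
# BioShock: R9-290x (5.63 TFLOPs, 62 FPS) -> rumored R9-390x (8.45 TFLOPs)
print(round(scaled_fps(62, 5.63, 8.45), 1))  # ~93.1, the post's "roughly 93.0 FPS"
```

Real benchmark numbers diverge badly from this kind of linear scaling, as later replies in the thread point out.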


----------



## john_ (Apr 11, 2014)

sweet said:


> At first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.
> 
> The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".



There isn't competition only between GPU makers but also between hardware sites, so rumors about products that interest the majority of readers can easily make the front page. Of course, rumors about the next AMD series never got the front page, but who cares, right?


----------



## the54thvoid (Apr 11, 2014)

Serpent of Darkness said:


> ....blah......



Stop speculating. It's just pointless.  Besides, you say:

2560 x 1440
Theoretical Output:  BF4 Dx11
GTX 780 Ti = 64 FPS.
R9-290x = 68 FPS.

when we have 2560x1600:

gtx780ti - 46.3
290x - 44.9

Holes can be picked in any 'speculated' performance.  Architecture and software efficiencies make mincemeat out of bare-metal numbers.

Let's wait for real products....

Also, as to this idiotic comment:



sweet said:


> At first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.
> 
> The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".



TPU is not a biased site.  No matter what you believe, the owner, writers and mods are pretty neutral.  What they do is repeat relevant rumour and gossip no matter the source.  If one website latches on to something, others follow.  It's the nature of having news when there is none.
The whole point of this article is the same as any other tech gossip.  And it feeds the trolls.


----------



## refillable (Apr 11, 2014)

Taking this with a grain of salt. 2014 is the year the shift from Full HD to 4K begins. If a flagship card from Nvidia doesn't handle 4K well, Nvidia is going to be in trouble and will lose hardcore gamers' attention.


----------



## pr0n Inspector (Apr 12, 2014)

john_ said:


> There isn't competition only between gpu makers but also between hardware sites. So some rumors about products that could interest the majority of the readers could see a first page easily. Of course rumors about the next AMD series never got the first page but who cares right?




9/11 was an inside job. The Jews control all the money. Elvis is eating cheeseburgers out there somewhere.


----------



## john_ (Apr 12, 2014)

pr0n Inspector said:


> 9/11 was an inside job. The Jews control all the money. Elvis is eating cheeseburgers out there somewhere.


If someone says something that you don't like, and maybe you see a little truth in it, just attack the guy who said it. Try to humiliate him, laugh at him. It is the easiest thing to do. It is so easy that even a person with a low two-digit IQ can do it.


----------



## pr0n Inspector (Apr 12, 2014)

Conspiracy nut jobs thinking others are dumb for not seeing the "truth".  How stereotypical. Next thing you know they'll claim that TPU staff and members are conspiring to take away their GPUs!


----------



## john_ (Apr 12, 2014)

pr0n Inspector said:


> Conspiracy nut jobs thinking others are dumb for not seeing the "truth".  How stereotypical. Next thing you know they'll claim that TPU staff and members are conspiring to take away their GPUs!



Oh THE "TRUTH". WOW!!! I feel a chill down my spine!
And of course more blah blah blah trying to discredit not just the other opinion, but also the person expressing it. Boring. Be more original. 


The funny part, of course, is that in my post I defended the choice of putting the rumor about the 880 on the front page, but it seems you have a problem understanding that. You only care about the part of the post where I implied there could also be a place for rumors about the next Radeons.
So while I am not the one who has a problem with the 880 getting the front page, you seem to have a huge problem accepting even the idea that news about future Radeons should be posted.
Or you are just trolling, which I don't really mind right now.


PS "nut jobs" LOL!!!! (I am crying now)


----------



## pr0n Inspector (Apr 12, 2014)

No, you clearly stated that news about your preferred brand's next-gen products somehow doesn't get to the "first page" (which is ridiculous, because everything on the TPU front page is listed in chronological order, and the forums are by default ordered by last post time, which you can easily bump). You are essentially saying that news about your preferred brand is being actively suppressed, i.e. a conspiracy against your preferred brand. 


Although I will grant that you sounded less crazy than the other guy, who outright said TPU (and perhaps the Illuminati) is manipulating us into talking about nVidia instead of AMD.


----------



## john_ (Apr 12, 2014)

pr0n Inspector said:


> No you clearly stated that somehow news about your preferred brand's next-gen products somehow doesn't get to the "first page"(which is ridiculous because everything on TPU frontpage is listed in chronological order and the forums are by default ordered by last post time which you can easily bump). You are essentially saying that news about your preferred brand are being actively suppressed, i.e. a conspiracy against your preferred brand.
> 
> 
> Although I will give you that you sounded less crazy than the other guy who outright said TPU(and perhaps the Illuminati) is manipulating us into talking about nVidia instead of AMD.



You look at my post as a half-empty glass and you expect everybody else to look at it like that. Then from a "nut job" you... upgrade me? downgrade me? I don't know, to just "less crazy".
What do you want me to say, that you are right, or even thank you for your kind words????

Let's stop it here, because every post you make just makes it worse.


----------



## pr0n Inspector (Apr 12, 2014)

There was no half-empty glass. You made it abundantly clear that you think one brand's news will magically never make it to the "first page" while the other brand's does.

Also, there are nut jobs, and then there are nut jobs.


----------



## john_ (Apr 12, 2014)

And you continue.......
Whatever you like to believe.

PS And no. When commenting on someone you don't know, there is only one kind of "nut job": the kind that insults.


----------



## MxPhenom 216 (Apr 12, 2014)

These specs are about as fake as they can get. All those numbers should add up to a lot more than 5.7 TFLOPs.


----------



## TheoneandonlyMrK (Apr 12, 2014)

MxPhenom 216 said:


> These specs are about as fake as they can get. All those numbers should add up to a lot more than 5.7 TFLOPs.


Are some of you new to tech news? PR spin begets PR spin, and one bit of news that might mold what a consumer buys is naturally countered by the other team's PR.
Look at what's been said of next-generation GPUs over the years; both this and the Pirate Islands info are realistic possibilities, that's all. Final steppings and binning might yet destroy all hope of a good 2015 (as I said ages ago), since 20 nm is not looking stable.


----------



## 64K (Apr 12, 2014)

theoneandonlymrk said:


> Are some of you new to tech news? PR spin begets PR spin, and one bit of news that might mold what a consumer buys is naturally countered by the other team's PR.
> Look at what's been said of next-generation GPUs over the years; both this and the Pirate Islands info are realistic possibilities, that's all. Final steppings and binning might yet destroy all hope of a good 2015 (as I said ages ago), since 20 nm is not looking stable.



mrk, there are people on these forums who now think the GTX 880 has been detailed. If no one opposes it, it becomes the truth by default. It's not just here; there is so much disinformation floating around the net about video cards. I have seen people on Tom's Hardware with badges for graphics prowess talking nonsense. One of these experts was defending the GTX 680 (my card) as the flagship Kepler about a year ago. He missed the GK104 designation, I guess.


----------



## rtwjunkie (Apr 12, 2014)

I'm just confused why an obviously speculative story, with specs that aren't even real, has fostered such back-and-forth hostility.  Are there really people who think the GTX 880 has been announced and we now have the specs?  Since when have we EVER gotten anything reliable from Nvidia's closely guarded development pipeline more than two months out?  It's obviously fake, and no 880 (whatever its specs) will be forthcoming until sometime in 2015.


----------



## john_ (Apr 12, 2014)

The rumors are just a base to talk about a possible card (and pass the time in front of the monitor). What will come out is perhaps unknown even to Nvidia; they can't know for sure how 20 nm will be doing at TSMC six months from now.
With Maxwell we didn't know for sure what the cards would look like one week before the official presentation, never mind months. Do they have a six-pin connector? Do they need it? Don't they? Just as an example.
AMD's cards are easier to predict.


----------



## 64K (Apr 12, 2014)

rtwjunkie said:


> I'm just confused why an obviously speculative story, with specs that aren't even real, has fostered such back-and-forth hostility.  Are there really people who think the GTX 880 has been released and we now have the specs?  Since when have we EVER gotten anything reliable from Nvidia's closely guarded development pipeline more than two months out?  It's obviously fake, and no 880 (whatever its specs) will be forthcoming until sometime in 2015.



This is an unprecedented time in PC gaming video cards. Moves are being made from both camps to cash in on the confusion and ignorance.
Nvidia farts out an imaginary Titan Z for $3,000 and AMD follows suit with an actual R9 295X2 for the low low price of only $1,500. At least we get a suitcase with this one. 

Welcome to the Circus of Values!


----------



## xorbe (Apr 13, 2014)

64K said:


> This is an unprecedented time in PC gaming video cards. Moves are being made from both camps to cash in on the confusion and ignorance.
> Nvidia farts out an imaginary Titan Z for $3,000 and AMD follows suit with an actual R9 295X2 for the low low price of only $1,500. At least we get a suitcase with this one.
> 
> Welcome to the Circus of Values!



For the common gamer, 1920x1080 is covered extremely well by a $249 GTX 760 (which has been as low as USD $199).  Reality check: consoles get by with beefy integrated graphics.  It's not a crime to select a graphics setting less than ultra-extreme 16xAA.  The halo products are just that: halo products that demonstrate leading-edge technology, and rich people buy them even if they don't really need them.


----------



## OneCool (Apr 13, 2014)

So nVidia is going to charge a premium price for a mid level chip again?  whatever.... :\


----------



## MxPhenom 216 (Apr 13, 2014)

OneCool said:


> So nVidia is going to charge a premium price for a mid level chip again?  whatever.... :\



If it performs like a high-end card (like the 680 beating AMD's flagship), why wouldn't they?


----------



## rtwjunkie (Apr 13, 2014)

OneCool said:


> So nVidia is going to charge a premium price for a mid level chip again?  whatever.... :\


 
They're gonna charge whatever the market will bear. As long as they sell all they want at the price they ask, there is no reason to go lower. When people stop buying an expensive current product, incentives are given and/or the price is lowered until the item is a good seller again.


----------



## OneCool (Apr 14, 2014)

And yet AGAIN people will defend a multi billion dollar company for doing bullshit practices.


----------



## xenocide (Apr 14, 2014)

OneCool said:


> And yet AGAIN people will defend a multi billion dollar company for doing bullshit practices.


 
You mean a multibillion-dollar company doing *CAPITALISM*.  Yes, because that's kind of their thing.  If they are selling cards for $150 and make a new version that performs a bit better and uses less power, how is it unreasonable to charge $175 for it?  As long as people buy them, they will keep pricing them up.  There's also the fact that semiconductor manufacturing is getting expensive these days; 20 nm is not a cheap node to manufacture on.


----------



## LAN_deRf_HA (Apr 14, 2014)

OneCool said:


> So nVidia is going to charge a premium price for a mid level chip again? whatever.... :\



If you want Nvidia to start leading with their "110" chips again and to knock off the Titan crap, complain to AMD. Until they get their shit together, we're stuck with Nvidia doing the bare minimum for a premium price. Unfortunately, because the 7000 series sucked so badly, Nvidia has been on cruise control with Kepler, which means Maxwell will probably be the most polished chip of all time, putting AMD further behind. Once you screw up like that, it's damn near impossible to ever catch back up. Case in point: Intel vs AMD.


----------



## john_ (Apr 14, 2014)

The problem isn't with the chip. If Nvidia comes out with a chip the size of the Maxwell in the 750 card but with 50% more performance than a GTX 780, they have every right to sell it as high-end. Not just that, I am also buying it.

Am I buying it? Wait. The problem isn't with the chip; the problem is with the price. Nvidia is intentionally increasing prices in the high-end sector. What you could buy for $500-600 in the past costs $700-$1000 today, or that's how much it will cost you, me, everybody tomorrow. So if that small chip with 50% more performance than the GTX 780 comes as a GTX 880 at $900, well, they know what they can do with it, and with the whole card, for better satisfaction.


----------



## john_ (Apr 14, 2014)

LAN_deRf_HA said:


> If you want Nvidia to start leading with their "110" chips again and to knock off the Titan crap, complain to AMD. Until they get their shit together, we're stuck with Nvidia doing the bare minimum for a premium price. Unfortunately, because the 7000 series sucked so badly, Nvidia has been on cruise control with Kepler, which means Maxwell will probably be the most polished chip of all time, putting AMD further behind. Once you screw up like that, it's damn near impossible to ever catch back up. Case in point: Intel vs AMD.



Forget AMD. AMD tried a price war with the 290s. That awful cooler was part of that strategy (keep the costs down), and the hardware sites didn't miss the chance to fire at will at AMD. You were getting $600-$700 performance (based on Nvidia's pricing) for $400-$550, and the hardware sites suddenly decided noise was more important than performance per dollar. Tom's review in particular read like it was written by Nvidia's marketing department.
Then the mining madness happened, and the extra dollars from the price hikes went not into AMD's pockets but into the retailers'. So I am guessing that at AMD they were hitting their heads against the wall for losing the chance to sell the cards at much better margins.
The result is AMD following the leader, Nvidia. So here we are with a metal case, a hydro cooler and a $1500 price that makes even Nvidia's marketing department at Tom's happy with the card (the price was in line with Nvidia's plans).


----------



## SeanJ76 (Jun 14, 2014)

I really don't see the need for 8 GB of VRAM; nothing will ever use that much. I use 2 x 670 GTX FTWs (4 GB VRAM) and games very rarely ever use 4 GB..... even on 4K screens you may only see 4.5 GB used.....So I see a 4 GB version releasing, with an 8 GB version released later on. As we saw with the 4 GB versions of the 670/680, they showed literally no performance increase vs. the 2 GB versions. So I expect the same result with the 880s.


----------



## SeanJ76 (Jun 14, 2014)

LAN_deRf_HA said:


> If you want Nvidia to start leading with their "110" chips again and to knock off the Titan crap, complain to AMD. Until they get their shit together, we're stuck with Nvidia doing the bare minimum for a premium price. Unfortunately, because the 7000 series sucked so badly, Nvidia has been on cruise control with Kepler, which means Maxwell will probably be the most polished chip of all time, putting AMD further behind. Once you screw up like that, it's damn near impossible to ever catch back up. Case in point: Intel vs AMD.


Amen!


----------



## TheoneandonlyMrK (Jun 14, 2014)

john_ said:


> The problem isn't with the chip. If Nvidia comes out with a chip in the size of Maxwell in the 750 card, but with a 50% extra performance compared to a GTX780, they have every right to sell it as a hi end. Not just that, I am also buying it.
> 
> I am buying it? Wait. The problem isn't with the chip. The problem is with the price. Nvidia is intentionally increasing the prices in the hi end sector. What you could buy with $500-600 in the past, costs $700-$1000 today or that's how much it will cost you, me, everybody tomorrow. So if that small chip with a 50% extra performance compared to GTX780 comes as a 880GTX at $900, well, they know what to do with it, better what to do with the whole card for better satisfaction.


The problem is that people don't realise they vote with their wallets. If you want reasonable prices out of Nvidia, let them know.


----------



## MxPhenom 216 (Jun 14, 2014)

theoneandonlymrk said:


> Are some of you new to tech news? PR spin begets PR spin, and one bit of news that might mold what a consumer buys is naturally countered by the other team's PR.
> Look at what's been said of next-generation GPUs over the years; both this and the Pirate Islands info are realistic possibilities, that's all. Final steppings and binning might yet destroy all hope of a good 2015 (as I said ages ago), since 20 nm is not looking stable.


 
Get off your high horse.


----------



## Steevo (Jun 15, 2014)

Dj-ElectriC said:


> *I feel the need to clear some things up, so here we go.
> *
> GTX 750 Ti is a Maxwell-core card based on 28 nm and consumes about 55W in gaming.
> If it were a 20 nm card, it would probably consume 35W, considering power saving and penalty as well.
> ...


Where do you get any of that information?

http://www.techpowerup.com/reviews/ASUS/GTX_750_Ti_OC/23.html

There is a huge increase in core voltage needed to attain the last bit of clock speed, and it becomes exponential as the core gets hotter and leakage increases; look at any GPU review and you will notice this. The only exception to this rule is the minimal switching power required, and the 750 Ti does a good job at that: at lower core speeds the leakage is so minimal that the 0.95 volts it runs at doesn't mean anything in real power consumption, and that is only possible on this core because it is so small, with so few shaders. If we attempted this with a larger core, the power drop for the first increase in clock speeds would cause the hardware to fail during the clock-state transition.
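The voltage/leakage behavior described here is the standard CMOS power model: dynamic power scales roughly with C·V²·f, plus a leakage term that grows with temperature. A toy sketch with made-up constants (c_eff and leak_w are illustrative values loosely tuned to the 750 Ti's ~55 W gaming figure, not measurements):

```python
def board_power(volts, freq_mhz, c_eff=0.05, leak_w=10.0):
    """Toy CMOS model: dynamic C*V^2*f switching power plus a fixed
    leakage term. All constants are illustrative, not measured."""
    return c_eff * volts**2 * freq_mhz + leak_w

# Raising clocks needs extra voltage, so power rises superlinearly:
print(round(board_power(0.95, 1020)))  # ~56 W at a stock-ish operating point
print(round(board_power(1.20, 1300)))  # ~104 W: ~27% more clock, ~85% more power
```

The V² term is why the last few MHz cost disproportionate power, as the post argues; real silicon is worse still, because leakage itself rises with voltage and temperature rather than staying fixed.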


Now we have two options here. We can say: hey, look at the 750 Ti, it costs $150, and the R7 265 costs $150.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_750_Ti/28.html

Even overclocked it doesn't reach the R7, whose silicon is two years old. So bad on them. And if you are concerned about the few watts of difference at idle and full power, why are you even talking about a high-power GPU? 

The other option is to say the 750 Ti is a great first foray for Nvidia to test something new, and despite its lack of GPU power for actual use since 2009, I hope they learn from it. But power consumption and raw GPU power have been centered around the 300W ceiling for both companies for the last few generations, and I don't see any reprieve there. All I can see is that the shrink should bring power consumption down per transistor and allow for more transistors and more raw compute power; core improvements and features will no doubt improve their IPC. Anything more than this is buffalo chips.


----------



## TheoneandonlyMrK (Jun 15, 2014)

MxPhenom 216 said:


> Get off your high horse.


What are you on about, and how long did it take you to think that up?


----------



## AsRock (Jun 15, 2014)

NC37 said:


> And all the people who bought a $3000 GPU go...
> 
> NV stated they'd be doing a tick-tock setup like Intel. Kepler was a nice advancement but it was still the tock to Fermi. It isn't super different. Now we're going on the next tick, which is Maxwell.
> 
> Interested to see if NV gets it right again like they did with the 400s. The 750 is really mediocre and it's a GM chip. Will be keeping a close eye and hopefully new employment to pay for it. Unless, of course, AMD finally gets everything right.




AMD is getting it right, even more so in the price department... The over-inflated prices cannot be blamed on them, well, except maybe the 295X2.


----------



## Slizzo (Jun 16, 2014)

SeanJ76 said:


> I really don't see the need for 8gb of Vram, nothing will ever use so much Vram, I use 2 x 670GTX FTW's(4gb Vram) games very rarely ever uses 4gb..... even on 4k screens you may only see 4.5gb used.....So I see a 4gb version releasing with some 8gb version released later on. As we saw with the 4gb version 670/680's they literally halted no performance increase vs. the 2gb versions. So I expect the same result with the 880's.



Because nothing ever used more than 8 MB? Then 16 MB? Then 32 MB, then 64 MB, then 128 MB, and on and on.

It can't hurt to have 8 GB on board, really. As resolutions increase and textures get bigger to support those resolutions, video memory will be paramount.


----------



## Prima.Vera (Jun 17, 2014)

Slizzo said:


> Because nothing ever used more than 8 MB? Then 16 MB? Then 32 MB, then 64 MB, then 128 MB, and on and on.
> 
> It can't hurt to have 8 GB on board, really. As resolutions increase and textures get bigger to support those resolutions, video memory will be paramount.



Yeah, but remember, back then you used to game at 640x480. Then the Voodoo2 came with SVGA support (800x600) and 12 MB of VRAM, and it was a blast. Quake 2 and Unreal, anyone?


----------

