Thursday, April 10th 2014

NVIDIA GeForce GTX 880 Detailed

NVIDIA's next-generation GeForce GTX 880 graphics card is shaping up to be a true successor to the GTX 680. According to a Tyden.cz report, the GTX 880 will be based on NVIDIA's GM204 silicon, which sits in the "Maxwell" product stack the same way GK104 sits in the GeForce "Kepler" family. It won't be the biggest chip based on the "Maxwell" architecture, but it will have what it takes to outperform even the GK110, much as GK104 outperforms GF110. The DirectX 12-ready chip will feature an SMM (streaming multiprocessor, Maxwell) SIMD design identical to that of the GeForce GTX 750 Ti, only with more SMMs, spread across multiple graphics processing clusters (GPCs) and probably cushioned by a large slab of cache.
Here is what the GTX 880 is expected to offer (a quick sanity check of these figures follows the list):

  • 20 nm GM204 silicon
  • 7.9 billion transistors
  • 3,200 CUDA cores
  • 200 TMUs
  • 32 ROPs
  • 5.7 TFLOP/s single-precision floating-point throughput
  • 256-bit wide GDDR5 memory interface
  • 4 GB standard memory amount
  • 238 GB/s memory bandwidth
  • Clock speeds of 900 MHz core, 950 MHz GPU Boost, 7.40 GHz memory
  • 230W board power
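The headline numbers hang together: NVIDIA's single-precision rate is CUDA cores x 2 FLOPs per clock x clock speed, and GDDR5 bandwidth is bus width in bytes x effective data rate. A minimal sanity-check sketch, assuming the 900 MHz core clock is what the TFLOP/s figure is based on:

```python
# Sanity-check the rumored GTX 880 figures from the table above.
cuda_cores = 3200
base_clock_ghz = 0.900          # 900 MHz core clock
mem_clock_ghz = 7.40            # effective GDDR5 data rate
bus_width_bits = 256

# Single precision: 2 FLOPs (one FMA) per core per clock.
tflops = cuda_cores * 2 * base_clock_ghz / 1000
# Bandwidth: bus width in bytes times effective memory clock.
bandwidth_gbs = (bus_width_bits / 8) * mem_clock_ghz

print(f"{tflops:.2f} TFLOP/s")       # ~5.76, matching the quoted 5.7 TFLOP/s
print(f"{bandwidth_gbs:.1f} GB/s")   # ~236.8, close to the quoted 238 GB/s
```

The small bandwidth gap suggests the quoted 238 GB/s assumes a memory clock slightly above 7.40 GHz.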
Sources: PCTuning Tyden.cz, Expreview

102 Comments on NVIDIA GeForce GTX 880 Detailed

#51
THU31
thebluebumblebeeUnfortunately, you are comparing GPUs from totally different "families". This is the trick that NVIDIA has played on us. Remember that what we're paying for is the silicon.
580 GF110
680 GK104 (GTX 460/560 family) - remember the "shorty" 670s?
660 GK106 (GTS 450/550 family)
and I've got to throw this in
750 Ti GM107 (GT 640/GTS 650)

Let's do some real power comparisons, using W1zzard's PEAK numbers:
580 = 229 W, 780 Ti = 269 W
560 Ti = 159 W, 770 = 180 W
550 Ti = 112 W, 660 = 124 W
650 = 54 W, 750 Ti = 57 W
Has NVIDIA reduced power usage within a family?
That is the whole point; of course I am comparing different families.

580 was 40 nm Fermi, 660 and 680 were 28 nm Kepler, as was the 780 (Ti). 860 and 880 will be 20 nm Maxwell.

The 660 was slightly more powerful than the 580, and the 680 was two times more powerful than the 580. That is why it makes sense to assume that an 860 will offer 780/Titan-level performance at just over half the power consumption, and the 880 should be at least 1.5 times more powerful. I really do not see the 880 offering less than 8 TFLOPS; that would be stupid.


580 - 1.5 TFLOPS
680 - 3 TFLOPS
780 Ti - 5 TFLOPS

How could the 880 offer just 5.7 TFLOPS? Ridiculous.
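For context, the round numbers in this progression follow straight from shader count x 2 FLOPs x shader clock. A rough sketch using commonly cited reference clocks (the clocks are approximations, and the GTX 580 figure uses its doubled "hot" shader clock):

```python
# Rough single-precision throughput = shaders * 2 FLOPs * clock (GHz) / 1000 -> TFLOPS.
# Reference clocks are approximate; the GTX 580's shaders run at the doubled hot clock.
cards = {
    "GTX 580":         (512,  1.544),   # Fermi hot clock
    "GTX 680":         (1536, 1.006),
    "GTX 780 Ti":      (2880, 0.875),
    "GTX 880 (rumor)": (3200, 0.900),
}
for name, (shaders, clock_ghz) in cards.items():
    print(f"{name}: {shaders * 2 * clock_ghz / 1000:.2f} TFLOPS")
# -> roughly 1.6, 3.1, 5.0, and 5.8 TFLOPS, matching the progression in the post
```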
#52
Nordic
buildzoidMaxwell is better at mining than current GCN, so if scrypt ASICs don't show up soon, these will cost 2x MSRP.
They are good at mining, but I doubt they will sell like AMD's cards did just a short time ago. The reason GPU prices went up is that coin prices were so high. Now so many have joined the mining force that the difficulty rose, so it is harder to mine, and everyone sells, so what they do mine is worth less.
#53
Hilux SSRG
mroofielol wat milking ?/
this info is not even real !
FAAAAKKKEEEEE :)
mroofieDon't listen to him he a amd fanboy :p
mroofiehmm ?? so what are you saying lol ?
mroofie60 range only comes after a few months after 80 and 70 range :/
FYI, there's a Multi-Quote button next to reply.


I still think 20 nm Maxwell isn't arriving until early 2015. Going by past releases, AMD follows NVIDIA after a die shrink, which still seems on target.
#54
dj-electric
I feel the need to clear some things up, so here we go.

The GTX 750 Ti is a Maxwell-core card built on 28 nm and consumes about 55 W while gaming.
If it were a 20 nm card, it would probably consume around 35 W, considering power savings and the scaling penalty as well.

If you take the GTX 750 Ti's performance and double it, you get performance around the GTX 770 (according to TPU). So 35 W x 2 + penalty = an 80 W GTX 770+ card.

I don't see how a card with about twice the GTX 770's performance (so 640 GTX 750 Ti shaders times 4) at about 180 W would be impossible. Add another chunk of GTX 750 Ti's worth of performance and what you get is about a 210 W card with 3,200 shaders. You could limit it using a 256-bit bus and pair it with 4 GB of memory.

This possibility is far from being a unicorn. It is most likely that at 180 W power consumption we will get something that beats the GTX 780 Ti without much effort. It is also probable that for 230 W we will get something that goes even further, a lot further.

I don't get people here.
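A minimal sketch of that scaling argument; every input here is an assumption taken from the post (the 35 W figure for a 20 nm shrink and the penalty factor are guesses, not measurements):

```python
# Back-of-the-envelope version of the scaling argument above.
# All inputs are assumptions from the post, not measured numbers.
power_750ti_28nm_w = 55     # approximate gaming power for the 28 nm GTX 750 Ti (640 shaders)
power_750ti_20nm_w = 35     # assumed power after a 28 nm -> 20 nm shrink
scaling_penalty = 1.15      # assumed overhead for a wider chip (uncore, memory, binning)

shaders_750ti = 640
shaders_880_rumor = 3200    # 5x the 750 Ti, per the rumored spec

estimate_w = power_750ti_20nm_w * (shaders_880_rumor / shaders_750ti) * scaling_penalty
print(f"~{estimate_w:.0f} W for a 3200-shader 20 nm part")   # ~201 W
```

That lands in the same 180-210 W ballpark the post arrives at.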
#55
Nordic
Dj-ElectriCI feel the need to clear some things up, so here we go.

The GTX 750 Ti is a Maxwell-core card built on 28 nm and consumes about 55 W while gaming.
If it were a 20 nm card, it would probably consume around 35 W, considering power savings and the scaling penalty as well.

If you take the GTX 750 Ti's performance and double it, you get performance around the GTX 770 (according to TPU). So 35 W x 2 + penalty = an 80 W GTX 770+ card.

I don't see how a card with about twice the GTX 770's performance (so 640 GTX 750 Ti shaders times 4) at about 180 W would be impossible. Add another chunk of GTX 750 Ti's worth of performance and what you get is about a 210 W card with 3,200 shaders. You could limit it using a 256-bit bus and pair it with 4 GB of memory.

This possibility is far from being a unicorn. It is most likely that at 180 W power consumption we will get something that beats the GTX 780 Ti without much effort. It is also probable that for 230 W we will get something that goes even further, a lot further.

I don't get people here.
I am no engineer, but isn't there some aspect of scaling involved here that makes it easier to build a power-efficient small chip like the 750 Ti? NVIDIA does design from mobile on up.
#56
dj-electric
That's why I mentioned the penalty: the same penalty level that exists on modern cards and previous ones.
#57
TheHunter
Harry LloydThat is the whole point; of course I am comparing different families.

580 was 40 nm Fermi, 660 and 680 were 28 nm Kepler, as was the 780 (Ti). 860 and 880 will be 20 nm Maxwell.

The 660 was slightly more powerful than the 580, and the 680 was two times more powerful than the 580. That is why it makes sense to assume that an 860 will offer 780/Titan-level performance at just over half the power consumption, and the 880 should be at least 1.5 times more powerful. I really do not see the 880 offering less than 8 TFLOPS; that would be stupid.


580 - 1.5 TFLOPS
680 - 3 TFLOPS
780 Ti - 5 TFLOPS

How could the 880 offer just 5.7 TFLOPS? Ridiculous.
Was it? The 780 GTX is 2x more powerful than the 580 GTX; the 680 GTX was ~30-50% faster at most. IMO you can't really look at those TFLOPS and determine a card's true power. For example, in 680 GTX vs. 580 GTX compute performance, the 580 GTX wins almost all the time.
www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17


Anyway, I think that 5.7 TFLOPS figure is just enough of a margin to say "look, it's faster than the 780 Ti", so it will be "high-end" for now, even though the rest screams "mid-range": 256-bit bus, 40 ROPs... Also, they will make a full line, just like with the 600 series aka GK104, and then slowly move to full Maxwell. :D
The full GM110(?) will come later and probably turn into the 900 series.


IMO it all depends on how much money they want for this 880 GTX. If it's 500€+, nah, not really worth it, unless it's at least 40-50% faster than the 780 Ti.
#58
arbiter
Harry LloydVery realistic, indeed (except for the TDP).

660 had 20% more GFLOPS than 580, and cost $230 (580 cost $500).

680 had 100% more GFLOPS than 580, and cost exactly the same, $500.

580 TDP - 244 W
680 TDP - 195 W
660 TDP - 140 W


So how could a card with just 15% more GFLOPS and 80% better performance/watt have a TDP of 230 W? Complete bullshit.
Maxwell is 3x more efficient per watt, so it is possible.
64KHowever.....

GF104 160 watts
GK104 190 watts
GM204 230 watts - doesn't make sense.

If we're looking at GM210 then yes, but not GM204. I fully expect a 250 watt Big Maxwell but that won't come from GM204. If Nvidia follows suit with the Kepler releases then look for that about a year after GM204. So I'm thinking about 1.5 years at earliest.
Keep in mind the GK104 that was 190 W had only 1,536 CUDA cores, and if the chart is right, the 880 will have about 2x that. Since it's replacing GK110, which is a 250 W, 2,304 CUDA core part, it seems about right.
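For what it's worth, here is the arithmetic behind the quoted objection, sketched out. The 15% and 80% figures come from the quoted post, 250 W is the 780 Ti's official TDP, and none of this reflects a confirmed GM204 spec:

```python
# Implied board power if GM204 is ~15% faster than the GTX 780 Ti while Maxwell
# delivers ~1.8x Kepler's performance per watt (figures from the quoted post).
tdp_780ti_w = 250
perf_gain = 1.15
perf_per_watt_gain = 1.80

implied_tdp_w = tdp_780ti_w * perf_gain / perf_per_watt_gain
print(f"Implied TDP: ~{implied_tdp_w:.0f} W")   # ~160 W, well under the rumored 230 W
```

With arbiter's 3x efficiency figure instead, the implied number drops to roughly 96 W, which is why a 230 W rating reads either as wrong or as enormous headroom.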
#59
64K
My issue with the TDP isn't about what wattage is required for the specs. It's about business. Look at NVIDIA's track record. They release a new architecture/die shrink a step at a time to maximize profits and keep the buzz going. I'm using initial releases where possible because the GTX 880 is the initial release and not the refresh. Consider this.......

GTX 280 - 236 W TDP
GTX 480 - 250 W TDP
GTX 780 Ti - 250 W TDP (Note there is no real progression here because they pulled some shit and called the GTX 660 a GTX 680 $500 GPU)

~250 watts is their single-GPU flagship target. They will never release a GM204 with a 230 watt TDP. If they did, then what would be the incentive to buy a GM210?
Consider the difference in power between a GTX 680 and a GTX 780 Ti. That's what sells and that's what keeps the buzz going.
#60
THU31
TheHunterWas it? The 780 GTX is 2x more powerful than the 580 GTX; the 680 GTX was ~30-50% faster at most. IMO you can't really look at those TFLOPS and determine a card's true power. For example, in 680 GTX vs. 580 GTX compute performance, the 580 GTX wins almost all the time.
www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17


Anyway, I think that 5.7 TFLOPS figure is just enough of a margin to say "look, it's faster than the 780 Ti", so it will be "high-end" for now, even though the rest screams "mid-range": 256-bit bus, 40 ROPs... Also, they will make a full line, just like with the 600 series aka GK104, and then slowly move to full Maxwell. :D
The full GM110(?) will come later and probably turn into the 900 series.


IMO it all depends on how much money they want for this 880 GTX. If it's 500€+, nah, not really worth it, unless it's at least 40-50% faster than the 780 Ti.
So if a card with 100% more GFLOPS was 30-50% more powerful in games, how much more powerful will a card with 15% more GFLOPS be? 5%?
Those benchmarks are irrelevant here, because the 600/700 series has severely limited FP64 performance. My numbers were for single precision.

I know framerate is a completely different issue, but I am not talking about that. I am just talking about pure single-precision compute power, which is why it is absolutely impossible for a top-end Maxwell to have just 5.7 TFLOPS. Coupled with fewer TMUs and ROPs, it would barely be faster than the 780 Ti, if at all. No way.

And Radeon rumors suggest as many as 96 ROPs. ROPs are what help them at 4K; they are destroying GeForces at that resolution because of it. An 880 with 32 or even 48 ROPs would look just ridiculous. They will not want to lose that market completely.
#61
SKL_H
Wow, GTX 880 already, and I am still on GK106 :-(

Still waiting for the Titan Z review....
#62
sweet
At first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.

The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".
#63
crazyeyesreaper
Not a Moderator
People be trollin', trollin', trollin', trollin'
keep trollin', trollin', trollin', trollin'
keep trollin', trollin', trollin', trollin'
keep trollin', trollin', trollin', trollin'
#65
HammerON
The Watchful Moderator
sweetThe whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".
Wow - really?
#66
pr0n Inspector
sweetAt first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.

The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".
The butthurt is palpable.
#67
Serpent of Darkness
Tsukiyomi91AMD now wanted to use DX12 Hardware Feature Set?? that's something...
1. Anybody with the Win9.0 OS can use DX12.
2. Since NVIDIA and AMD pay royalties to use any and all D3D versions, it's not uncommon for AMD to have DX12.0 as long as they slip the funds into M$'s pocket. AMD has been paying royalties for full DX11.1 and DX11.2 for the past year. A lot of ignorant NVIDIA fanboys have painted this unrealistic image that AMD is against the epic tag-team duo of NVIDIA and Microsoft. This is all thanks to Mantle. It's nothing more than a fantasy in every zombie-state NVIDIA fanboy's mind.
buggalugsThere is no "better" graphics card company, they are both in on it, everything is pre-planned, Nvidia knows exactly what AMD is doing and AMD knows exactly what Nvidia is doing.
+1.

I'm theorizing a few things.

1. The R9-390x spec might actually be true. When the Titan was first released, AMD was going to produce a graphics card with twice the SP count of the 7000 series card. This could never be done on 28 nm, but it was possible on 20 nm. I believe the card was called Tenerife II. It was supposed to have twice the SPs of a 7980 plus 16 additional SPs. Now that 20 nm graphics cards are starting to make their appearance, it's possible that Bermuda XTX is actually Tenerife II v1.5.

Look at it from this point of view:
AMD 7980: 2048 SP at 975 MHz core clock
Tenerife II: 2x 2048 SP = 4096 SP + 16 SP = 4112 SP @ 975 MHz core clock
R9-390x: 0.5 x 4224 SP = 2112 SP; 4224 / 2048 ≈ 2.06x the 7980.


2. Another thing to consider is the R9-380x. The R9-380x has 3072 SP. The R9-290x has 2816 SP. In addition to this, the R9-290x has roughly 10% of its total SP locked to control TDP.

Difference between R9-290x and R9-380x: 9.09%.
R9-290x's 2816 + 10% = 3097 SP.

So in essence, the R9-380x is a rebranded R9-290x. It has 99% of its cores unlocked.


Other things to consider:
Performance gain from GTX 680 to GTX 780 Ti: 65.4%
Performance gains from GTX 680 to GTX 780: 27.6%
Performance gain from GTX 780 to GTX 780 Ti: 29.9%
Performance gain from GTX 780 Ti and GTX 880: 13.1%

Performance gain from AMD 7970 to R9-290x: 48.7%
Performance gain from AMD 7970 GHz to R9-290x: 37.5%
Performance gain from AMD 7970 to R9-390x: 122%
Performance gain from AMD 7970 to R9-280x: 8.11%
Performance gain from AMD 7970 GHz to R9-280x: 0.00% to 5.00% (-1)
Performance gain from AMD R9-290x to R9-380x: 9.09%
Performance gain from AMD R9-290x to R9-390x: 50.0%

W9100 Double to Single PP Ratio: 0.474877
K40 Tesla Double to Single PP Ratio: 0.333333

Single Precision:
GTX 780 Ti = 5.37 TFLOPS.
GTX 880 = 6.08 TFLOPS.
R9-290x = 5.63 TFLOPS.
R9-390x = 8.45 TFLOPS.
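(For anyone checking the list: these single-precision numbers are just shaders x 2 FLOPs x clock. A small sketch that reproduces them; the clocks below are my assumptions, not the poster's.)

```python
# Single precision = shaders * 2 FLOPs * clock (GHz) / 1000 -> TFLOPS.
# Assumed clocks: 780 Ti near its ~932 MHz boost, Hawaii and the rumored AMD part
# at 1000 MHz, GTX 880 at the rumored 950 MHz boost clock.
cards = {
    "GTX 780 Ti":      (2880, 0.932),
    "GTX 880 (rumor)": (3200, 0.950),
    "R9-290X":         (2816, 1.000),
    "R9-390X (rumor)": (4224, 1.000),
}
for name, (shaders, clock_ghz) in cards.items():
    print(f"{name}: {shaders * 2 * clock_ghz / 1000:.2f} TFLOPS")
# -> 5.37, 6.08, 5.63, 8.45 -- matching the list above
```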

I suspect that the AMD side is more than 50.0% true. There's a noticeable trend between each generation. As for the NVIDIA side, I believe the GTX 880 will be 2015's GTX 680. After that, we'll see a GTX 880 Ti, a GTX Titan-Black-M and a GTX Titan-Z-M with improved or tweaked versions of "rough-draft" Maxwell until the GTX 980 is released.

M = Maxwell.

2560 x 1440
Theoretical Output: BF4 Dx11
GTX 780 Ti = 64 FPS.
R9-290x = 68 FPS.

GTX 880 roughly = 72 FPS.
R9-390x roughly = 102 FPS.

2560 x 1440
Theoretical Output: BioShock Infinite Dx11
GTX 780 Ti = 78 FPS.
R9-290x = 62 FPS.

GTX 880 roughly = 88.3 FPS.
R9-390x roughly = 93.0 FPS.
#68
john_
sweetAt first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.

The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".
There isn't competition only between GPU makers but also between hardware sites, so rumors about products that interest the majority of readers can make the front page easily. Of course, rumors about the next AMD series never got the front page, but who cares, right?
#69
the54thvoid
Super Intoxicated Moderator
Serpent of Darkness....blah......
Stop speculating. It's just pointless. Besides, you say:

2560 x 1440
Theoretical Output: BF4 Dx11
GTX 780 Ti = 64 FPS.
R9-290x = 68 FPS.

when we have 2560x1600:

GTX 780 Ti - 46.3 FPS
290X - 44.9 FPS



Holes can be picked in any 'speculated' performance. Architecture/software efficiencies make mincemeat out of bare metal.

Let's wait for real products....

Also, as to this idiotic comment:
sweetAt first, I found it strange that TPU cited an unknown media source about a 2015 product. But now I understand.

The whole point of this article is "We need to offer something about nVidia or all folks will only talk about AMD 295x2".
TPU is not a biased site. No matter what you believe, the owner, writers and mods are pretty neutral. What they do is repeat relevant rumour and gossip no matter what the source. If one website latches on to something, others follow. It's the nature of having news when there is none.
The whole point of this article is the same as any other tech gossip. And it feeds the trolls.
#70
refillable
Taking this with a grain of salt. 2014 is the year the shift from Full HD to 4K begins. If a flagship card from NVIDIA doesn't handle 4K well, NVIDIA is going to be in trouble and will lose hardcore gamers' attention.
#71
pr0n Inspector
john_There isn't competition only between GPU makers but also between hardware sites, so rumors about products that interest the majority of readers can make the front page easily. Of course, rumors about the next AMD series never got the front page, but who cares, right?
9/11 was an inside job. The Jews control all the money. Elvis is eating cheeseburgers out there somewhere.
#72
john_
pr0n Inspector9/11 was an inside job. The Jews control all the money. Elvis is eating cheeseburgers out there somewhere.
If someone says something that you don't like and maybe you see a little truth in it, just attack the guy who said it. Try to humiliate him, laugh at him. It is the easiest thing to do. It is so easy that even a person with a low two-digit IQ can do it.
#73
pr0n Inspector
Conspiracy nut jobs thinking others are dumb for not seeing the "truth". How stereotypical. Next thing you know they'll claim that TPU staff and members are conspiring to take away their GPUs!
#74
john_
pr0n InspectorConspiracy nut jobs thinking others are dumb for not seeing the "truth". How stereotypical. Next thing you know they'll claim that TPU staff and members are conspiring to take away their GPUs!
Oh, THE "TRUTH". WOW!!! I feel a chill down my spine!
And of course more blah blah blah trying to discredit not just the other opinion, but also the person expressing it. Boring. Be more original. :p


The funny part, of course, is that in my post I defended the choice of putting the rumor about the 880 on the front page, but it seems you have a problem understanding that. You only care about the part of the post where I was implying that maybe there could also be a place for rumors about the next Radeon.
So it seems that while I am not the one who has a problem with the 880 getting on the front page, you are really having a huge problem accepting even the idea that news about future Radeons should be posted.
Or you are just trolling, which I don't really mind right now.


PS "nut jobs" LOL!!!! (I am crying now)
#75
pr0n Inspector
No, you clearly stated that news about your preferred brand's next-gen products somehow doesn't get to the "first page" (which is ridiculous, because everything on the TPU front page is listed in chronological order, and the forums are by default ordered by last post time, which you can easily bump). You are essentially saying that news about your preferred brand is being actively suppressed, i.e. a conspiracy against your preferred brand.


Although I will give you that you sounded less crazy than the other guy, who outright said TPU (and perhaps the Illuminati) is manipulating us into talking about NVIDIA instead of AMD.