Friday, August 1st 2014

NVIDIA to Launch GeForce GTX 880 in September

NVIDIA is expected to unveil its next-generation high-end graphics card, the GeForce GTX 880, in September 2014, and could tease its upcoming products at Gamescom. The company is reportedly holding a large media event in California this September, where it's widely expected to discuss high-end graphics cards based on the "Maxwell" architecture. Much like AMD's "Hawaii" press event, which preceded the actual launch of its R9 290 series by several weeks, NVIDIA's event is expected to be a paper launch of one or more graphics cards based on its GM204 silicon, with market availability expected in time for Holiday 2014 sales.

The GM204 is expected to be NVIDIA's next workhorse chip, marketed as high-end in the GeForce GTX 800 series and performance-segment in the following GTX 900 series, much like how the company milked its "Kepler"-based GK104 across two series. It's expected to be built on the existing 28 nm process, although one cannot rule out an optical shrink to 20 nm later (much as NVIDIA shrunk the G92 from 65 nm to 55 nm). The GTX 880 reportedly features around 3,200 CUDA cores and 4 GB of GDDR5 memory.
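For perspective, those rumored numbers can be turned into a rough theoretical compute figure using the standard peak-FLOPS formula (two operations per CUDA core per clock, counting fused multiply-add as two ops). This is only a back-of-the-envelope sketch: the ~1 GHz clock below is purely an assumption for illustration, since no clock speeds have been reported.

[code]
# Back-of-the-envelope peak FP32 throughput: 2 ops per CUDA core per clock (FMA).
def peak_fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return 2 * cuda_cores * clock_ghz / 1000.0

# Rumored GTX 880 (GM204): ~3,200 cores; the 1.0 GHz clock is an assumption.
print(peak_fp32_tflops(3200, 1.0))    # ~6.4 TFLOPS
# GTX 780 (GK110) reference specs, for comparison: 2,304 cores at ~0.86 GHz.
print(peak_fp32_tflops(2304, 0.863))  # ~4.0 TFLOPS
[/code]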
Source: VideoCardz

96 Comments on NVIDIA to Launch GeForce GTX 880 in September

#51
Steevo
We are on the verge of what silicon can do for us. Until we make a breakthrough with graphene or another substance to replace it, we are reaching the limit of how much performance we can get without going bigger on power use and heat dissipation. Essentially, we are at the point of choosing the Prius, the Evo, or the Ferrari of performance.
Posted on Reply
#52
GhostRyder
I think many of y'all are giving too little credit here... We have already seen what the current process is capable of with the GTX 750ti. While that card is nothing amazing performance-wise, what matters is the power consumption difference and the performance with fewer cores. Comparing the 640 cores in the 750ti against the 768 in the 650ti, while the 750ti pretty much blows the 650ti out of the water, shows that you can still improve on the current process. Why should we be so upset that we're not dropping down to a smaller process now, and so focused on disappointment before anything is even shown to the public?

Even using basic logic, given that the 750ti was able to outperform the older-generation 650ti while using fewer cores and less power, we can assume that if the GTX 880 has the same number of cores as its predecessor, or even ~15% fewer, its performance would still end up above the predecessor's. Of course, that assumes core clocks remain roughly the same; depending on the chip, we could end up seeing higher core and memory clocks (again, just an assumption).
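To make that concrete, here's a toy scaling sketch (relative performance ~ cores x clock x per-core efficiency). The ~1.25x Maxwell per-core efficiency multiplier is an assumption backed out of the 750ti/650ti results, not an official figure, and the -15% core scenario is purely hypothetical:

[code]
# Toy scaling model: relative performance ~ cores * clock * per-core efficiency.
def rel_perf(cores: int, clock_ghz: float, per_core_eff: float = 1.0) -> float:
    return cores * clock_ghz * per_core_eff

# First-gen Maxwell sanity check (reference specs):
kepler_650ti = rel_perf(768, 0.925)           # GTX 650 Ti
maxwell_750ti = rel_perf(640, 1.020, 1.25)    # GTX 750 Ti, assumed efficiency gain
print(maxwell_750ti / kepler_650ti)           # ~1.15x despite ~17% fewer cores

# The same assumption applied to the rumor: ~15% fewer cores than the
# predecessor at the same clock still comes out ahead: 0.85 * 1.25 ~ 1.06x.
print(0.85 * 1.25)
[/code]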

We are also basing so much of this on rumors, speculation, and possible release dates.
Posted on Reply
#53
mcraygsx
rtwjunkieI'm definitely not excited. It's the same crap they did with the 680, selling a mid-line chip (GK104) as their top end. Not until the 780 did Nvidia actually release the top-end chip (GK110) as the top-end GPU. I'm not gonna buy an 880 with a GM204 marketed as top of the line when we all know the GM210 will actually be the top-of-the-line 980. I'll keep my 780 and wait till the 980. I'm at 1080p, so no loss waiting an extra year.
+1 OP

Exactly what this guy said. They could have easily released GK110, but instead they were selling the 680 at a high-end price. We should know by now what NVIDIA does from its past.
Posted on Reply
#54
arbiter
jagdNvidia will use what's available to them at TSMC (assuming they don't change foundries). There aren't many foundry fabs, and switching to a new process (28nm to 20nm to 16nm, etc.) costs more with every step and brings added difficulty and problems.
If you look at TSMC's 20nm struggles you'll see what I mean. Nvidia can't simply decide to skip to 16nm: it would take more time, 16nm installation at the fabs looks a long way off, and Nvidia would be stuck at 28nm with a power and price disadvantage.
One thing you forget: AMD will be stuck on the same process if they release a new GPU soon as well. They'd be stuck with the same issue, so it's not just a problem for Nvidia; it could be a bigger problem for AMD.
Posted on Reply
#55
64K
mcraygsx+1 OP

Exactly what this guy said. They could have easily released GK110, but instead they were selling the 680 at a high-end price. We should know by now what NVIDIA does from its past.
I disagree. TSMC struggled with the 28nm process early on, and Nvidia had to pay them per wafer. This resulted in the gimped GTX 780. The 28nm process is much more refined now, so we should see plenty of un-gimped GM210 chips a few months after the GTX 880 actually becomes available for purchase.

Edit: I'm not defending their price structure though. It's gone balls up.
Posted on Reply
#56
mcraygsx
64KI disagree. TSMC struggled with the 28nm process early on, and Nvidia had to pay them per wafer. This resulted in the gimped GTX 780. The 28nm process is much more refined now, so we should see plenty of un-gimped GM210 chips a few months after the GTX 880 actually becomes available for purchase.

Edit: I'm not defending their price structure though. It's gone balls up.
I hope you are right, but consider what we have all seen over the past several years and how NVIDIA has been treating us. It seems too good to be true that NVIDIA won't sell the 880 labeled as a premium product. Of course it will have some advantages over the GeForce 780, but time will tell.
Posted on Reply
#57
64K
mcraygsxI hope you are right, but consider what we have all seen over the past several years and how NVIDIA has been treating us. It seems too good to be true that NVIDIA won't sell the 880 labeled as a premium product. Of course it will have some advantages over the GeForce 780, but time will tell.
Yes, and that's the downside.

The GTX 880 should hammer the GTX 780, and I think it will. It may roll right over the GTX 780 Ti in performance. Time will tell.

If anyone needs to upgrade their GPU in the next couple of months and wants to go Nvidia, then the GTX 880 priced at around $425 will probably be a good deal. Otherwise, wait for the 20nm Maxwells.
Posted on Reply
#58
arbiter
64KYes, and that's the downside.

The GTX 880 should hammer the GTX 780, and I think it will. It may roll right over the GTX 780 Ti in performance. Time will tell.

If anyone needs to upgrade their GPU in the next couple of months and wants to go Nvidia, then the GTX 880 priced at around $425 will probably be a good deal. Otherwise, wait for the 20nm Maxwells.
Likely $500 at the lowest, probably $600, being a new GPU. If it's 30% faster like rumored, it would be within that price range.
Posted on Reply
#59
Fluffmeister
mcraygsx+1 OP

Exactly what this guy said. They could have easily released GK110, but instead they were selling the 680 at a high-end price. We should know by now what NVIDIA does from its past.
This whole "Nvidia are the bad guys" thing is just nonsense. Again, the GK104-based 680 was more than a match for the 7970; they had literally zero reason to release a consumer-oriented card based on GK110 at that time, regardless of whether it was ready or not.

I guess it would have been funny to see a GK110-powered 680 vs the underclocked Tahiti 7970. Embarrassing... but funny.
Posted on Reply
#60
GhostRyder
FluffmeisterThis whole "Nvidia are the bad guys" thing is just nonsense. Again, the GK104-based 680 was more than a match for the 7970; they had literally zero reason to release a consumer-oriented card based on GK110 at that time, regardless of whether it was ready or not.

I guess it would have been funny to see a GK110-powered 680 vs the underclocked Tahiti 7970. Embarrassing... but funny.
Wow, dude, I cannot believe you really believe that. Do you really think Nvidia's strategy was to release a GPU that merely matched the competition instead of releasing something way more powerful? If Nvidia had had GK110 ready, they would have easily released it, and at a fitting price point (probably close to the $1k mark).

This is the same strategy that has been used before, so I don't get why people are so shocked. Compare Fermi to Kepler in terms of release cadence and the chips used and you'll see the strategy remains the same. Each cycle follows a similar pattern with these companies; you could call it a tick-tock cycle: release the introduction to a new architecture, show off how well it performs, gather data, and then release the full-powered version of that architecture the next cycle.

VLIW (the TeraScale series) from ATI also followed a similar pattern. This is no different from the strategies we're all used to (well, not much different at least), and we can view the GCN architecture the same way.

Also, anyone assuming the GTX 880 is going to be weaker than the 780 Ti is going to end up either disappointed or impressed, depending on their outlook. It would not make much sense to release a less powerful GPU as your next-gen GPU...
Posted on Reply
#61
64K
GhostRyderWow, dude, I cannot believe you really believe that. Do you really think Nvidia's strategy was to release a GPU that merely matched the competition instead of releasing something way more powerful? If Nvidia had had GK110 ready, they would have easily released it, and at a fitting price point (probably close to the $1k mark).

This is the same strategy that has been used before, so I don't get why people are so shocked. Compare Fermi to Kepler in terms of release cadence and the chips used and you'll see the strategy remains the same. Each cycle follows a similar pattern with these companies; you could call it a tick-tock cycle: release the introduction to a new architecture, show off how well it performs, gather data, and then release the full-powered version of that architecture the next cycle.

VLIW (the TeraScale series) from ATI also followed a similar pattern. This is no different from the strategies we're all used to (well, not much different at least), and we can view the GCN architecture the same way.

Also, anyone assuming the GTX 880 is going to be weaker than the 780 Ti is going to end up either disappointed or impressed, depending on their outlook. It would not make much sense to release a less powerful GPU as your next-gen GPU...
Well said GhostRyder.
Posted on Reply
#62
Fluffmeister
GhostRyderWow, dude, I cannot believe you really believe that. Do you really think Nvidia's strategy was to release a GPU that merely matched the competition instead of releasing something way more powerful? If Nvidia had had GK110 ready, they would have easily released it, and at a fitting price point (probably close to the $1k mark).
Why would they need to put all their cards on the table right away? That makes absolutely zero sense.

Fact is, when GK110 was ready it was in the form of the K20X for Oak Ridge: low yields, high returns. Much more sense than appeasing forum warriors at TPU.
Posted on Reply
#63
HumanSmoke
GhostRyderWow, dude, I cannot believe you really believe that. Do you really think Nvidia's strategy was to release a GPU that merely matched the competition instead of releasing something way more powerful? If Nvidia had had GK110 ready, they would have easily released it, and at a fitting price point (probably close to the $1k mark).
Nvidia did have GK110 up and running in the same time frame. For some reason I have yet to fathom, people seem to think that releasing the GPU as a $1000 card makes more sense than selling it in a $4500-5000 package to a client who was the damned launch customer (contract signed October 2011) and needed nineteen thousand of the things under a contract that would severely penalize Nvidia if the boards weren't supplied on schedule.
If Nvidia had intended GK110 for the desktop from the outset - which they could have managed as a paper/soft launch with basically no availability but plenty of PR (i.e., a green-team scenario mirroring the HD 7970's 22nd December 2011 "launch") - they in all likelihood could have had parts out in time. GK110 taped out in early January 2012 (even noted Nvidia-haters tend to agree on this point). Fabrication, testing/debug, die packaging, board assembly, and shipping to distributors takes 8-12 weeks for a consumer GPU - production GK110s are A1 silicon, so no revision was required - and that means early-to-mid-March 2012 as a possible launch date IF the GTX 680 hadn't proved sufficient... and the launch date of the GTX 680? March 22nd, 2012.
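If you want to check the date arithmetic, here's a minimal sketch (the exact tape-out day is a placeholder; only "early January" is documented):

[code]
from datetime import date, timedelta

tape_out = date(2012, 1, 9)  # placeholder day for "early January 2012"
for weeks in (8, 12):
    print(tape_out + timedelta(weeks=weeks))
# ~2012-03-05 through ~2012-04-02, bracketing the GTX 680's March 22nd launch
[/code]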
Oak Ridge National Labs started receiving their first Tesla K20s in September 2012 (1,000 or so in the first tranche), which tallies with the more stringent runtime validation process required for professional boards in general and mission-critical HPC in particular.

Unbelievable that so much FUD exists about this considering most of the facts are actually well documented by third parties.
FluffmeisterWhy would they need to put all their cards on the table right away? That makes absolutely zero sense.
History tells us that the GTX 680 was sufficient. The competition (the 7970) was a known factor, so there was actually zero need to hastily put together a GK110 card. I doubt that a GK110 GTX card would have been any more than a PR stunt in any case, since Oak Ridge's contract superseded any consumer pissing contest.
FluffmeisterFact is when the GK110 was ready, it was in the form of the K20X for Oak Ridge, low yield, high returns, much more sense than appeasing forum warriors at TPU.
True enough. ORNL's Titan was the high-profile large-order customer, but more than a few people forget that Nvidia was also contracted to supply the Swiss supercomputing institute's Todi system, and the Blue Waters system for the National Center for Supercomputing Applications, so around 22,000 boards were required without taking replacements into consideration.
Posted on Reply
#64
Fluffmeister
^ Exactly, and once those contracts were fulfilled and yields gradually improved, what did we see some 5-6 months later... *drum roll*... the $1000 GTX Titan, still without any fear of direct competition, and a price as much about protecting Nvidia's professional product stack as... why the fuck not?

But no, it should have been 400 bucks and called the 680. Wonders never cease. :P
Posted on Reply
#65
rtwjunkie
PC Gaming Enthusiast
Well explained by @Fluffmeister and @HumanSmoke why the mid-level chip ended up as the premier Kepler card (and remained there so long)!!

Still, since I bought the 780 before the price drop, I prefer to keep the top of the chip line in my main rig. For me it just makes sense to wait till GM210, whenever that is (GTX 980?). Gotta get my money's worth!!

So anyway, I take back some of my false-advertising statements about the 680 and the corollary to the 880, with neither top-of-the-line card having the top-of-the-line chip in the lineup. It all comes down to readiness as well as business commitments by Nvidia.
Posted on Reply
#66
TheoneandonlyMrK
You two sound like Nvidia board members some days. It would be nice to get back on topic at some point.
No new news on the hybrid board, the GTX 880, or anything else going on, then, I guess.
Posted on Reply
#67
Fluffmeister
theoneandonlymrkYou two sound like Nvidia board members some days. It would be nice to get back on topic at some point.
No new news on the hybrid board, the GTX 880, or anything else going on, then, I guess.
Really? Talking sense means being on Nvidia's board?

I guess you're right.
Posted on Reply
#68
SIGSEGV
arbiterOne thing you forget: AMD will be stuck on the same process if they release a new GPU soon as well. They'd be stuck with the same issue, so it's not just a problem for Nvidia; it could be a bigger problem for AMD.
According to various sources, AMD has already stated that they will introduce products on TSMC's 20nm process, including GPUs, by next year (2015).
Posted on Reply
#69
64K
If you need a GPU upgrade and you want 4 GB of VRAM, then go with the GTX 880. At this point I am 100% convinced that it will smoke the GTX 780, but know what you're buying: it's not the Maxwell flagship. It's a mid-range GPU still on the 28nm process. So consider the price and don't be scammed.
Posted on Reply
#70
GhostRyder
FluffmeisterWhy would they need to put all their cards on the table right away? That makes absolutely zero sense.

Fact is, when GK110 was ready it was in the form of the K20X for Oak Ridge: low yields, high returns. Much more sense than appeasing forum warriors at TPU.
Makes more sense than releasing an equal product with less VRAM that performs almost exactly the same on average (except when you take higher resolutions into account)...
Fluffmeister^ Exactly, and once those contracts were fulfilled and yields gradually improved, what did we see some 5-6 months later... *drum roll*... the $1000 GTX Titan, still without any fear of direct competition, and a price as much about protecting Nvidia's professional product stack as... why the fuck not?

But no, it should have been 400 bucks and called the 680. Wonders never cease. :p
5-6 months... Try almost a year later, dude...

GTX 680 Released: March 22, 2012
GTX Titan Released: February 19, 2013

Yeah, they released Titan as a $1k card almost 11 months later; obviously they had no problem releasing a $1k desktop-grade video card. If they had wanted to get that card out sooner, they would have been happy to, and would have charged accordingly, but they had enough trouble even getting the GTX 680 out, which was chronically out of stock and basically required camping at your computer night and day to get one.
theoneandonlymrkYou two sound like Nvidia board members some days. It would be nice to get back on topic at some point.
No new news on the hybrid board, the GTX 880, or anything else going on, then, I guess.
I am getting just as tired as you are of people dragging these threads into off-topic fanboy arguments.

But then, what is going to be the excuse this time with the 880? Since everyone is convinced that an unreleased card, with very little known about it, is going to be inferior to the current lineup...
64KIf you need a GPU upgrade and you want 4 GB of VRAM, then go with the GTX 880. At this point I am 100% convinced that it will smoke the GTX 780, but know what you're buying: it's not the Maxwell flagship. It's a mid-range GPU still on the 28nm process. So consider the price and don't be scammed.
Exactly. I'm at a loss how certain people keep claiming this chip sucks before we have even seen anything...
Posted on Reply
#71
Xzibit
HumanSmokeHistory tells us that the GTX 680 was sufficient. The competition (the 7970) was a known factor, so there was actually zero need to hastily put together a GK110 card. I doubt that a GK110 GTX card would have been any more than a PR stunt in any case, since Oak Ridge's contract superseded any consumer pissing contest.
They pulled the PR stunt anyway with the intro of the TITAN brand.

The 680 was good enough for them and they saw a $ benefit. The 580 was FP64 = 1/8, and since then all GeForce cards have gone to FP64 = 1/24. Meanwhile, AMD stuck to FP64 = 1/4 on Tahiti until Hawaii, where they lowered it to FP64 = 1/8.

Tahiti had FP64 = 1/4, so it was AMD's "Titan", the successor to the 580 if you don't take sides, released a year after the 580. Not to mention the prices:
11/2010 - GTX 580 = $500
1/2012 - HD 7970 = $550
2/2013 - GTX Titan = $1000
The whole "TITAN" argument applies to Tahiti with in that same time frame with the notable exception of CUDA of course.

Now both companies are further cutting FP64 in their gaming lines. If Nvidia had stuck to its old ways, TITAN would have been the 580's successor, not the 680 or the 780.

I hope Maxwell goes back to the old ways but I highly doubt it.
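For anyone keeping score, here's roughly what those ratios mean in raw throughput, using the usual peak-rate formula (2 ops per core per clock for FMA) and commonly cited reference clocks - treat the exact numbers as approximate:

[code]
# FP64 throughput from the FP32 peak rate and each card's FP64 ratio.
def gflops(cores: int, clock_ghz: float) -> float:
    return 2 * cores * clock_ghz  # FMA counts as 2 ops per clock

cards = {
    # name: (shader cores, clock in GHz, FP64 ratio)
    "GTX 580 (GF110)":   (512,  1.544, 1 / 8),   # hot-clocked shaders
    "GTX 680 (GK104)":   (1536, 1.006, 1 / 24),
    "HD 7970 (Tahiti)":  (2048, 0.925, 1 / 4),
    "GTX Titan (GK110)": (2688, 0.837, 1 / 3),   # boost disabled for full FP64
}

for name, (cores, clock, ratio) in cards.items():
    fp32 = gflops(cores, clock)
    print(f"{name}: FP32 ~{fp32:.0f} GFLOPS, FP64 ~{fp32 * ratio:.0f} GFLOPS")
[/code]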
Posted on Reply
#72
HumanSmoke
XzibitThe whole "TITAN" argument applies to
...nothing being talked about here....but since you're hanging out the bait...
XzibitThey pulled the PR stunt anyways with the intro of TITAN brand
Sure did. Seems like a marketing winner.
XzibitThe 680 was good enough for them and they saw a $ benefit. The 580 was FP64 = 1/8, and since then all GeForce cards have gone to FP64 = 1/24.
Spends a whole tortured introduction trying to get Titan into the topic....then screws it up.
GeForce GTX Titan: FP64 1:3 rate (w/boost disabled, which stands to reason, since overclocking and double precision aren't mutually beneficial from either an error or a power standpoint)
GeForce GTX Titan Black: FP64 1:3 rate w/boost disabled
GeForce GTX Titan Z: FP64 1:3 rate w/boost disabled
XzibitMeanwhile, AMD stuck to FP64 = 1/4 on Tahiti until Hawaii, where they lowered it to FP64 = 1/8.
Thanks for reminding me that AMD halved the double-precision ratio for the desktop high end in the current series - though I was already aware of the fact. How about not offering double precision at all on GPUs other than the top one for the Evergreen and Northern Islands series, after offering FP64 on the HD 4000 series' RV770? Crazy shit, huh? Or limiting Pitcairn and Curacao to 1:16 FP64 to save die space and keep power demand in check? It's called tailoring the feature set to the segment.

Horses for courses. FP64 is a die-space luxury largely unneeded in gaming GPUs.
Nvidia figured out a while ago that the monolithic big die really isn't that economical when sold at consumer prices, which is why the line was bifurcated after the Fermi architecture - who would have thought that selling a 520 mm² GPU for $290 (GTX 560 Ti 448) and $350 (GTX 570) wouldn't result in a financial windfall! AMD will likely do the same, since they will need a big die for pro/HSA apps (and Fiji sounds like 500 mm²+ by all accounts), and keep the second tier and lower die areas ruled by gaming considerations (just as Barts, Pitcairn, and Curacao are now).
XzibitI hope Maxwell goes back to the old ways but I highly doubt it.
The old ways of reverting to the 1:8 FP64 rate of Fermi, or the 1:3 rate of the current GTX Titan range? :confused:
Posted on Reply
#73
Xzibit
HumanSmoke...nothing being talked about here....but since you're hanging out the bait...

Sure did. Seems like a marketing winner.

Spends a whole tortured introduction trying to get Titan into the topic....then screws it up.
GeForce GTX Titan: FP64 1:3 rate (w/boost disabled, which stands to reason, since overclocking and double precision aren't mutually beneficial from either an error or a power standpoint)
GeForce GTX Titan Black: FP64 1:3 rate w/boost disabled
GeForce GTX Titan Z: FP64 1:3 rate w/boost disabled

Thanks for reminding me that AMD halved the double-precision ratio for the desktop high end in the current series - though I was already aware of the fact. How about not offering double precision at all on GPUs other than the top one for the Evergreen and Northern Islands series, after offering FP64 on the HD 4000 series' RV770? Crazy shit, huh? Or limiting Pitcairn and Curacao to 1:16 FP64 to save die space and keep power demand in check? It's called tailoring the feature set to the segment.

Horses for courses. FP64 is a die-space luxury largely unneeded in gaming GPUs.
Nvidia figured out a while ago that the monolithic big die really isn't that economical when sold at consumer prices, which is why the line was bifurcated after the Fermi architecture - who would have thought that selling a 520 mm² GPU for $290 (GTX 560 Ti 448) and $350 (GTX 570) wouldn't result in a financial windfall! AMD will likely do the same, since they will need a big die for pro/HSA apps (and Fiji sounds like 500 mm²+ by all accounts), and keep the second tier and lower die areas ruled by gaming considerations (just as Barts, Pitcairn, and Curacao are now).

The old ways of reverting to the 1:8 FP64 rate of Fermi, or the 1:3 rate of the current GTX Titan range? :confused:
WOW.

Even when I'm not arguing with you, you still come off as a jerk.

I didn't include TITAN because that was the exception on their top-series card, even though it has different "branding". I figured you would know the difference. Sheesh. Didn't think crossing the T's and dotting the I's was needed for you to understand.

By "the old ways" I mean not changing the FP64 rate within a chip in the gaming series. GK110 was their first to do that; TITAN and the 780 differ. They saw an opportunity to make $ off the many dies that didn't meet standards. It was a smart business move, but not so good for the consumer.



P.S.
I need to stay away from culinary school. Apparently it turns you into an even greater ass.
Posted on Reply
#74
HumanSmoke
XzibitI didn't include TITAN because that was the exception on their top-series card, even though it has different "branding".
Ah, I see.
So when you said...
XzibitThe 580 was FP64 = 1/8, and since then all GeForce cards have gone to FP64 = 1/24.
...what you actually meant was "all GeForces have gone to 1:24 except the ones that are 1:3"
Makes sense. Might have been apropos to include that....but then it would make the rest of your post redundant.
Still not sure why you actually brought up double precision in any case, since GM204 likely won't be any more compute/pro focused than any other sub-300 mm² GPU, and it isn't actually apropos to anything anyone, including myself, was talking about - so why bother quoting my post, which wasn't in any way related to what you are talking about?
Xzibitjerk...ass
Still can't hold a discussion without resorting to name calling? Some things never change.
Posted on Reply
#75
Xzibit
HumanSmokeAh, I see.
So when you said...

...what you actually meant was "all GeForces have gone to 1:24 except the ones that are 1:3"
Makes sense. Might have been apropos to include that....but then it would make the rest of your post redundant.
Still not sure why you actually brought up double precision in any case, since GM204 likely won't be any more compute/pro focused than any other sub-300 mm² GPU, and it isn't actually apropos to anything anyone, including myself, was talking about - so why bother quoting my post, which wasn't in any way related to what you are talking about?

Still can't hold a discussion without resorting to name calling? Some things never change.
Should have pointed you here, but I doubt that would stop your usual grandiose reply.

GEEKS3D - AMD Radeon and NVIDIA GeForce FP32/FP64 GFLOPS Table

Really? I thought most of that post I quoted you on was referring to GK110. Silly me. :rolleyes:

Name calling? More like an observation. It's not like I'm the only one in this thread with such an observation.

I'll leave you to your HPD
Posted on Reply