Friday, August 1st 2014
NVIDIA to Launch GeForce GTX 880 in September
NVIDIA is expected to unveil its next-generation high-end graphics card, the GeForce GTX 880, in September 2014. The company could tease its upcoming products at Gamescom. The company is reportedly holding a huge media event in California this September, where it's widely expected to discuss high-end graphics cards based on the "Maxwell" architecture. Much like AMD's Hawaii press event, which predated the actual launch of its R9 290 series by several weeks, NVIDIA's event is expected to be a paper launch of one or more graphics cards based on its GM204 silicon, with market availability expected in time for Holiday 2014 sales.
The GM204 is expected to be NVIDIA's next workhorse chip, which will be marketed as high-end in the GeForce GTX 800 series and performance-segment in the following GTX 900 series, much like how the company milked its "Kepler" based GK104 across two series. It's expected to be built on the existing 28 nm process, although one cannot rule out an optical shrink to 20 nm later (much as NVIDIA shrunk the G92 from 65 nm to 55 nm). The GTX 880 reportedly features around 3,200 CUDA cores and 4 GB of GDDR5 memory.
Source:
VideoCardz
96 Comments on NVIDIA to Launch GeForce GTX 880 in September
Two guys with good knowledge of GPU history, taking a few liberties and being slightly obtuse. But kudos to both for knowing their stuff.
As for the 880, it has to be more powerful than the 780, and if it's not faster than the 780 Ti, it will need to come in at an appropriate price point. What will its FP performance be? It doesn't matter; gamers don't need it. What matters in the market now is the power draw/performance ratio. For 4K we need better power efficiency for the dual-GPU setups it looks like we'll need to run until maybe a couple of generations from now.
I know folks say power consumption is irrelevant to the gamer, but it isn't to the manufacturer. Mobile markets are dictating the trend, and whoever gets the most power-efficient architecture into their graphics will win. It's the only reason AMD isn't buried on the CPU front, with their APUs beating Intel's on-board graphics solutions.
If there were no value in FP64, the TITAN brand wouldn't exist. I was simply implying that during Tahiti the value was always there, from the launch of the 7970 through the 7990 until the Titan, at a more value-oriented price, granted you weren't tied down to CUDA. That's a benefit for Nvidia, which can lure you into spending more money if you're locked into its CUDA ecosystem. I know some of the green faithful don't even want to look at the other side, but the value was there if you weren't biased.
The comparisons people are making are just wacky, especially when they include the GK110 Titan, because then you're being a hypocrite in upholding its value while not seeing GK104's similar downside when comparing it to Tahiti. That's my gripe.
By all means compare the GK110-based 780/Ti all day long in a gaming context, since it's better, but by the looks of it those people are more interested in swinging their favorite-color purse at someone.
I will be looking at Maxwell, but more cautiously, given the talk of V1 on 28 nm and V2 on 20 nm.
Both have come a long way regardless. You're right, I got my dates wrong, but once again they were perfectly happy selling the K20/K20X for thousands a pop. Once your high-end pro market is happy, you can start to trickle that technology down to the consumer; even coming almost a year after the 680, it still had the market to itself for another, what, 8 months?
You were correct as per usual in your argument.
From a Google translation of the original Expreview article:
Worst case scenario = a refined Kepler with a 800 series designation.
Best case scenario = Full Maxwell on 28nm (no gimps)
At least we've got two months of speculation. What if Maxwell comes with a built-in alien receiver? My bad, that was Tesla. Maxwell should come with a camera and a picture of little green men.
Don't worry guys, at least you have Tonga to look forward to.
Can't believe the GK110 is like 2 years old already, what a monster.
The primary focus of these cards is as follows:
Large-scale calculations (floating point)
CUDA/OpenCL
Large-scale image generation
With those series of cards, they are essentially just saying, "Here is a powerful GPU, have fun, we will see you later." You're not getting the same type of package as with a Quadro or desktop card...
The Teslas are a special breed of cards designed with supercomputers in mind, so they don't need certain attributes aimed at other markets. They are meant for people to program things that utilize their GPU cores for calculations, with professionals spending time working on them. Nvidia can release GK110 chips on these even if they are not ready for the mainstream, because even with a poor early binning process it does not expect many sales of the cards. It is a very limited market (even the Oak Ridge supercomputer's 18,688 K20X GPUs are still an insignificant number of GPUs overall), and Nvidia knows that, which is why putting out a GK110 chip early to a very niche market still meant it could work on and improve the chip for the main market.

I still stand by my earlier quote: "they had enough trouble even getting the GTX 680 out, which was out of stock and basically required camping your computer night and day to get one." They were not ready, and even the K20X still did not have the fully powered core (the K40 does), because the process of creating those chips was still difficult, just as it was for GK104. If they had been fully ready to release the chip, they would have done so, even releasing cut-down GK110 chips (like the 780), but they were not ready to push it onto the market (just as AMD was not ready with Hawaii, or else it would have done the same). They had not perfected the binning process yet, and a company likes to be prepared with its best product. They do not like having to release products that do not meet standards; they want to avoid wasting money and to maximize profits, and getting a bunch of poor-quality chips that cannot run at full power is a surefire way to waste money.

Why does this surprise you? Most GPUs, CPUs, and other chips have existed in development for quite some time (a year or so), but that does not mean they are ready to be used.
There are exceptions, of course, but most GPUs are made well in advance and go through rigorous testing, including work on mundane things like making the GPU in a cost-effective way with the fewest failures (or poor-performing chips). Neither Nvidia nor AMD just plops out a chip the month it's announced. It's not as if they experiment, the chips appear in a chemical reaction, and they go "BY GOD WE HAVE DONE IT!!! Quick, make the announcement." It's a long and tiring process that includes much testing and refining.
GM204 is the same; it takes a long time, and they have been working on it for quite a while. At some point they realized they could not drop down a node, and they began building GM204 on the old process. It has existed for quite some time, and we will see all the work they have put into it very soon.
Again, hating on GM204 and calling it a poor GPU is going to prove foolish. GM204 is going to beat GK110 by a decent margin, probably at least similar to how the GTX 680 beat the GTX 580. Even though it's on the "tick" cycle, where they introduce the new architecture and save the final chip until they have improved/refined the whole process, you're going to get a better-performing chip. Until we have more information, however, most of this is still speculation. The only thing I am sure of (unless something really weird happens) is that it will be Nvidia's top-performing single-GPU chip at the time it's released!
You implied that if the GK110 had been ready they would have released it. HumanSmoke and I said it was ready, but they were fulfilling big-buck contracts first, which made much more sense than rushing it to the consumer market.
www.techpowerup.com/forums/threads/nvidia-to-launch-geforce-gtx-880-in-september.203661/page-3#post-3144739
www.techpowerup.com/forums/threads/nvidia-to-launch-geforce-gtx-880-in-september.203661/page-3#post-3144743
I then said: you corrected me on the release date, but I was referring to the thousands of GK110s installed and up and running at Oak Ridge, so no, I wasn't referring to the 680 release at all, and I was right in the first place, so yeah, shrug.
The comment about the age of GK110 is merely to emphasize how competitive the chip is, even against new AMD silicon, so again my point is simply that they didn't need to rush it to the consumer market, that's all... yes, I really believe that.
Just because a chip exists does not make it ready for the average market, or even the average professional market (hence no Quadros, or else they would have catered to all the professional markets and not just a very strict niche market). If they had released a Quadro card, this would be a different discussion, but as with the desktop cards, the GK110 chip came out much later for reasons of more testing, improved binning, and improved software, among other things. Hawaii was the same way, which is why we saw Tahiti released before the Hawaii chips, just as we saw GK104 before GK110. Those are the chips that can be made ready first on the new architecture, and they are used to test the field while work goes into the better chips and the processes improve so they can be ready for the next release.

OK, well, either way, GK110 being in a supercomputer working among professionals does not make it a consumer release or prove that it was ready for the big time... Again, you are talking as if this chip is two years old and the AMD silicon is a few months old, which is not the case. These chips may have been made at different dates, and the exact dates are hard to pinpoint except by the executives at both companies. But the fact remains that both are probably a lot closer in age than you would expect (or seem to think)...
Just like GM204, it has probably existed for much longer than we give it credit for...
I am also done arguing this at this point...
Besides I never said it was ready for the consumer market, I said they didn't need to rush it to the consumer market, different things.
I'm glad you're done too, for my own sake.
PS. Hi Xzibit!
The Nvidia (and AMD, though AMD is the lesser evil) guys are indeed arses and money-grabbing jerks. And some people make it sound as if everything they do should be justified.
After all, HumanSmoke's argument (about die space and power consumption) would have made some sense if they used different dies instead of crippling the card in the driver, but they use the same die for professional and consumer cards!
As a gamer, I didn't say that I need fully enabled double precision for games. There are multiple other applications I would be glad to use the same card for.
AnandTech has a pretty nice showcase page called "Compute: What You Leave Behind?" in its GTX 680 review.
www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17
You will see PrimeGrid Genefer 1.06, as well as AESEncryptDecrypt, SmallLuxGPU, and Civilization V.
I will leave you to draw your own conclusions.