Thursday, July 3rd 2014

Is This the First Picture of GeForce GTX 880?

Chinese tech publication MyDrivers posted what it claims to be a graphics board running NVIDIA's next-generation GM204 graphics chip, which is designed to succeed the GK104 as the company's next workhorse GPU, covering a wide range of price points. A pre-production graphics board usually has all of its components placed (some redundant), to test out the best combination of them for production boards. Right away you see the purported GM204 chip, which looks bigger than the GK104, flanked by eight memory chips on three sides (reinforcing the 256-bit wide memory interface theory). The GM204 silicon is based on NVIDIA's "Maxwell" architecture, and is rumored to feature 3,200 CUDA cores and about 4 GB of memory across a 256-bit wide memory interface. It is widely rumored to be built on the current 28 nm silicon fab process. NVIDIA could launch the first products running this chip before Christmas.
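For context, the arithmetic behind that theory, assuming standard 32-bit GDDR5 chips at 4 Gb (512 MB) each:

8 chips × 32 bits = 256-bit memory bus
8 chips × 512 MB = 4 GB of memory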

80 Comments on Is This the First Picture of GeForce GTX 880?

#51
RCoon
arbiter: Lastly, Nvidia is NOT the only GPU maker, so if you don't like their prices, then DON'T BUY THEM and spare us your AMD fanboy crap.
Dude, you are aware he owned a Titan, and is currently running a 780ti? You just made yourself look like a complete tool, and you and people like you are the reason this forum is getting worse.
#52
the54thvoid
Super Intoxicated Moderator
arbiter: Lastly, Nvidia is NOT the only GPU maker, so if you don't like their prices, then DON'T BUY THEM and spare us your AMD fanboy crap.
I read this last night and :roll:



My Intel mobo, 3930k and hefty 780ti Classified suggest otherwise.

And it is pretty universally accepted that when you call someone a F***** without good reason you are one yourself. :slap:
RCoon: Dude, you are aware he owned a Titan, and is currently running a 780ti? You just made yourself look like a complete tool, and you and people like you are the reason this forum is getting worse.
We need to team up like Double Dragon, and bring Freedom too! But seriously, yeah, so many people are incapable of reading a post in a neutral manner. It's becoming a PITA, with a lot of people too ignorant or with their bias guns set all the way to 10.
#53
dom99
TheMailMan78: I love new tech, but until games catch up OR you are gaming at 4K, it's not like the old days. You used to have to upgrade every year just to stay on medium settings! I think tech will be in a slump for a long time until the market demands more. Seriously, an AMD 4200 X2 is more than enough for the average computer user, and a 570 is overkill for the casual player. It's really hard for me to get excited over this stuff anymore.

Maybe I'm just getting older and jaded. For those of you who still get excited......I'm jelly.
I agree, there is no reason to upgrade to powerhouse GPUs until 4K becomes more affordable.
#54
Sony Xperia S
dom99: I agree, there is no reason to upgrade to powerhouse GPUs until 4K becomes more affordable.
I think the time when "4K becomes more affordable" will be close to the end of next year; that is, 2016 will be the year of 4K.

Windows 9 (because Windows 7 and Windows 8 are relatively poor with high-DPI software) + new GPUs (if we are lucky, on new manufacturing processes - that's what I am more interested in, rather than Maxwell on 28 nm, MEH) + some new CPUs. ;)

:)
#55
Raúl García
Hey, you've got a really nice piece of engineering in that cooling device... guess the water is oversalted (not sure if that's how you say it... too salty!)

Hey... I hate to do this, but I need help on a topic and I'm new here... any idea how to filter members so I can search specifically for people who are interested in programming with VB.NET?

thanks anyway...
#56
RCoon
Raúl García: Hey, you've got a really nice piece of engineering in that cooling device... guess the water is oversalted (not sure if that's how you say it... too salty!)

Hey... I hate to do this, but I need help on a topic and I'm new here... any idea how to filter members so I can search specifically for people who are interested in programming with VB.NET?

thanks anyway...
Head on over to that forum, and make a new thread.
www.techpowerup.com/forums/forums/programming-webmastering.52/
#58
Tatty_Two
Gone Fishing
GAR: OK, let's see:

R9 280X = slower than the GTX 770 in most cases, costs $300-$350 on average depending on the model

GTX 770 = overclocks like a champ, costs $300-$400 on average depending on the model and RAM

Both are very close; I don't see where this "huge" price difference is..... That argument is pointless, we can go down, all the way down to the 750 Ti, same story..... Not saying one is better than the other, just saying they are close in price/performance.
Little point in using overclocking as a pro or con when 90%+ of graphics card users don't overclock, to be honest. If everyone did overclock, reference designs would cost us even more, as there would be little market for overclocked or special edition models. Just a personal opinion though.
#59
Naito
Sony Xperia S: Silly, Nvidia is not alone, but their pricing has a very pronounced negative effect on the whole market, since they are part of it, thus emphasizing and enforcing even further stagnation and crisis in that market rather than having a positive influence on growth or whatever else you want to achieve. ;)
I believe all the current prices have stemmed from AMD not being able to compete mano a mano with Nvidia since possibly around the time of the X1950 series of cards. Sure, current AMD cards offer great performance for the money, but they tend to draw much more power, overclock less, produce more heat, etc., compared to their Nvidia counterparts (debatable on a generational basis). There is also the perceived (real or otherwise) greater reliability and stability of Nvidia drivers.

I'll leave this here.
...go back to the release of the previous generation of SKUs. AMD had a 4-month head start on Nvidia by releasing the HD 7000 series. There is speculation that, in this time, Nvidia moved their mid-tier GK104 to a high-end SKU, removing the GK110 from the lineup. This is entirely possible, because a product based on the GK110 was announced as far back as May 2012, one month after the release of the GK104 SKUs. Fast-forward to February this year: Nvidia was starting to lose some competitiveness against the HD 7970 GHz Ed. (and possibly facing other pressures) and decided to release the GTX TITAN (and eventually the GTX 780) to cash in on the enthusiast market.

So to sum up, AMD are only just beginning to be competitive with a 17-month-old GPU. Maybe AMD should be blamed for Nvidia's crazy prices? But having said that, some say AMD play a different game: price/performance.
There is also the fact that each company's SKU tiers seem to conveniently slot in between each other's price ranges - again, none go head to head. Sure, they try to undercut each other, but it almost seems there is some sort of gentleman's agreement in place. Maybe I'm just paranoid....
#60
HumanSmoke
Naito: There is also the fact that each company's SKU tiers seem to conveniently slot in between each other's price ranges - again, none go head to head. Sure, they try to undercut each other, but it almost seems there is some sort of gentleman's agreement in place. Maybe I'm just paranoid....
That has been the case for a few years - at least since Nvidia and AMD got smacked for price collusion. There's the illusion of a price war from a few special-case SKUs (HD 5830 vs GTX 460, for instance), but with very little actual impact on the overall product stack (and ASPs) of either company.
#61
Recus
theoneandonlymrk: Are you on something?
I pointed out the obvious and I welcome all new tech, even in 2015 :p I'm no OT fan of any of them.
Sorry if I was not excited enough for your liking, but I'm not impressed by random-ass silicon in that raw or mysterious a form.
Why haven't you pointed this out?

AMD’s Full Blown Hawaii XTX Core Confirmed – Is this the R9 295X with 3072 SPs and 48 CUs?
I got a message yesterday hinting at AMD lifting the embargo date to start "leaking info" for the upcoming GPUs, due to worrisome GM200 leaks spreading.
Next day...

[Updated] AMD’s Hawaii XTX 48CU GPU Doesn’t Exist After All – Reliable Source Denies Existence Completely
#64
TheoneandonlyMrK
I have mentioned similar tactics in AMD threads, i.e. look at this, not them; they all do it.
Many a puerile jab at me here, but the facts still are:
Here is a chip.
It's got plenty of memory.
It's at ALPHA stage.
You bet it needs power.
And it still could be anything.
Especially given Nvidia's recent roadmaps - that's not me being biased, that's me telling it the way it is.
#65
Nabarun
In India (not everywhere in India, just a few places I checked) the 280X is around INR 3k cheaper than the 770. Cheaper, but not enough IMHO, given the higher power consumption and the fact that it largely comes without the sexy backplate found on the Asus 770 DCU II.
#66
64K
I have been seeing the talk about the GTX 880 (GM104/204) having an 8-pin and two 6-pin power connectors, and the talk of it being up to a possible 375-watt TDP when you add it all up. Take a look at the TDP of the GTX 260, GTX 460, and GTX 680 (which really should have been called a GTX 670). Nvidia had to invest some time and money into redesigning Maxwell for the 28 nm process due to the situation at TSMC, so you can be sure that they intend to release an entire series of 28 nm Maxwells over a period of several months in 2015 to recoup their investment, and then we will have the 20 nm Maxwells, and those will be the performance powerhouses.
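For what it's worth, that 375 W ceiling is just the sum of the standard PCI-SIG limits (assuming a 75 W slot, 150 W per 8-pin, and 75 W per 6-pin connector):

75 W (PCIe slot) + 150 W (8-pin) + 2 × 75 W (6-pin) = 375 W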
#67
HumanSmoke
theoneandonlymrk: I have mentioned similar tactics in AMD threads, i.e. look at this, not them; they all do it.
I think the principal difference people are referring to is that when it's Nvidia speculation, there's a howling and gnashing of teeth from your direction (also not sure what the power input has to do with it, and why it's such an issue considering it certainly isn't with the same people WRT the 290X, 290X Crossfire, and 295X2)....but when someone posts an obvious BS wishlist (Hawaii, 4096 cores, 20 nm, etc., etc.) and an obvious silicon floorplan of an APU, you give it the full flag-waving treatment:
theoneandonlymrk: Wow, compute-cored gfx is around the corner, nice. Whilst the pic may be a hoax, it does all make perfect sense. IMHO those serial cores will either be a new special DP shader sub-unit aimed at serial compute or something much more interesting, like Jaguar cores perhaps ;) but they are not at all like an SMX unit btw (can't remember who said that), they will be separate and special. I think the bus/fabric that binds it all will be very, very interesting too.
....even after the source has been debunked
theoneandonlymrk: AMD are beating many to the ball with this tech, and due to their APU, GPU, and SoC achievements they are certainly one to bet on, IMHO.
Maybe some middle ground between the unbridled optimism (for AMD) and the most pessimistic outcome possible (for Nvidia) might be closer to reality?......although much less entertaining in retrospect.
#68
TheoneandonlyMrK
I disagree.
Some people have too much free time on a Friday night. My comments in this thread were only slightly negative, but I suppose you read into them what you will.
I'd hope you are all right, because IMHO we need 4-10x the GPU grunt we have, and progress is progress, but this pic is intriguing, not informative.
I obviously don't approve of the misquoted stuff, Smokey.
#69
arbiter
64K: I have been seeing the talk about the GTX 880 (GM104/204) having an 8-pin and two 6-pin power connectors, and the talk of it being up to a possible 375-watt TDP when you add it all up. Take a look at the TDP of the GTX 260, GTX 460, and GTX 680 (which really should have been called a GTX 670). Nvidia had to invest some time and money into redesigning Maxwell for the 28 nm process due to the situation at TSMC, so you can be sure that they intend to release an entire series of 28 nm Maxwells over a period of several months in 2015 to recoup their investment, and then we will have the 20 nm Maxwells, and those will be the performance powerhouses.
It's Nvidia; they wouldn't put out a card that uses 375 watts for a single GPU. That would just be outright stupid for them. AMD, I would believe it more from them, as they have been and are willing to use whatever amount of power to push their chips to be competitive with Nvidia (on the GPU side) and Intel (on the CPU side).
#70
Sony Xperia S
Naito: There is also the perceived (real or otherwise) greater reliability and stability of Nvidia drivers.
That's because of the stupidity of end users who otherwise pretend to be able to do many things. :D :eek:

I have been using ATi Catalyst for 8 years now and never ever had any problems with it.

Everything seems to depend on the quality of the device behind the keyboard, and since I think we can assume AMD drivers are more sensitive to your PC environment (like the installed MS Visual C++, .NET Framework, etc.), it is indeed some kind of proof of this claim. :)
#72
xenocide
Sony Xperia S: I have been using ATi Catalyst for 8 years now and never ever had any problems with it.
I refuse to believe you've used any kind of software for 8 years and never encountered a single problem.
#74
Nabarun
TheHunter: I remember how he made a scene about Z87 not being compatible with Devil's Canyon. Then when it went official, he goes "did Intel break under pressure?" Really? IMO this guy can be a joke sometimes.
How the hell do you know he is one of the commentators??? I don't see the damn username. :confused:
#75
Sony Xperia S
This 28 nm GTX 880 will be worthless if:

NVIDIA to skip 20nm fabrication process, third generation Maxwell to use 16nm?

Now, here's the biggest shocker coming from the SemiAccurate article. According to their sources, NVIDIA will skip the 20 nm node and move straight to 16 nm. Not only that, the GM204 will be the first GPU remanufactured and relaunched using this process.

NVIDIA's GeForce GTX 880 launch is set for Q3. If everything goes according to plan, the new flagship should appear in October.

The second wave would launch somewhere in mid-Q1/2015, which gives us 4 to 6 months between the second and third generation of Maxwell.

Long story short, the GM204 at A stepping is expected to be the last 28 nm GPU NVIDIA is going to make. The GM204 B stepping and all future GPUs will use the 16 nm node.

videocardz.com/51009/nvidia-preparing-four-maxwell-gm204-skus