# Is This the First Picture of GeForce GTX 880?



## btarunr (Jul 3, 2014)

Chinese tech publication MyDrivers posted what it claims is a graphics board running NVIDIA's next-generation GM204 graphics chip, designed to succeed the GK104 as the company's next workhorse GPU covering a wide range of price points. Pre-production boards like this usually have all their components placed (some redundant), to test out the best combination for production boards. Right away you can see the purported GM204 chip, which looks bigger than the GK104, flanked by eight memory chips on three sides (reinforcing the 256-bit wide memory interface theory). The GM204 silicon is based on NVIDIA's "Maxwell" architecture, and is rumored to feature 3,200 CUDA cores and about 4 GB of memory across a 256-bit wide memory interface. It is widely rumored to be built on the current 28 nm silicon fab process. NVIDIA could launch the first products based on this chip before Christmas.
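The eight visible memory chips line up with the 256-bit theory: GDDR5 parts each expose a 32-bit interface, so bus width and total capacity follow directly from the chip count and per-chip density (both taken from the rumor, not confirmed). A quick back-of-envelope check:

```python
# Sanity-check the rumored GM204 memory configuration against the
# eight memory chips visible on the board.
GDDR5_IO_WIDTH_BITS = 32   # each GDDR5 chip exposes a 32-bit interface
chips = 8

bus_width = chips * GDDR5_IO_WIDTH_BITS
print(f"bus width: {bus_width}-bit")          # 256-bit, matching the rumor

# Total capacity depends on per-chip density (in gigabits):
for density_gbit in (2, 4):
    total_gbyte = chips * density_gbit // 8   # gigabits -> gigabytes
    print(f"{density_gbit} Gb chips -> {total_gbyte} GB total")
```

With 4 Gb (512 MB) chips, eight of them give the rumored 4 GB; with 2 Gb chips it would only be 2 GB.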






*View at TechPowerUp Main Site*


----------



## TheMailMan78 (Jul 3, 2014)

I hope it's worth upgrading from what I have now.


----------



## RCoon (Jul 3, 2014)

That chip is crazy large...
Also, why is everything except the chip blurred out?
Correct me if I'm wrong, but I see 3 x 6 pins, 2 stacked on top of each other, one by the side.


----------



## Sony Xperia S (Jul 3, 2014)

Wow, GK104 was launched more than two years ago, and they still need almost half a year more to release this GM204.


----------



## BorisDG (Jul 3, 2014)

Weird prototype: the SLI/power placements, the extremely long PCB. I'm not sure, but GM204 should be around the size of GK104; this looks like GK110. Interesting that the VRM section is censored. Also, eight chips: if they're 512 MB each, we have 4 GB confirmed; if not, 2 GB... But to me this looks fake.


----------



## the54thvoid (Jul 3, 2014)

RCoon said:


> That chip is crazy large...
> Also, why is everything except the chip blurred out?
> Correct me if I'm wrong, but I see 3 x 6 pins, 2 stacked on top of each other, one by the side.



Yeah, the pins are to figure out different power configs apparently, don't read anything into them.  Probably the same reason there are blurred out areas, it's an engineering board with engineering gadgets.  Hexus has pics too and you can also see a red button near the power inputs which you wouldn't see on a normal PCB (Galaxy HOF excepted).

This needs to shit all over GK104, otherwise it's going to be a disappointment. The 780 Ti (full GK110) is 50% faster than GK104 (in BF4 at 1600p), so you'd hope this would beat its Kepler equivalent by at least 87.5% to be 25% better than GK110. Problem is, GF104 to GK104 wasn't that big a jump....
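Working through those chained percentages (using the 50% BF4 figure quoted above; the 25% target over GK110 is just the hoped-for number, not a spec):

```python
# Relative performance, normalizing GK104 (GTX 680) to 1.0.
gk104 = 1.0
gk110 = 1.50                 # 780 Ti quoted as 50% faster than GK104

target_over_gk110 = 1.25     # hoped-for 25% lead over GK110
required = gk110 * target_over_gk110

print(f"GM204 would need {required:.3f}x GK104, "
      f"i.e. {(required - 1) * 100:.1f}% faster")   # 1.875x -> 87.5%
```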

Hexus already put paid to the idea this might be GM110 by linking it to a dispatch itinerary that stated GM104's were on board.

Ooooh, Q4 with Haswell-E and Maxwell.... Come on AMD, join the fight!  I want to see reasonable prices.  (This isn't a dig at AMD, I WANT to see lots of competition and AMD can certainly put a fly in NV's ointment if it wants to).


----------



## RCoon (Jul 3, 2014)

the54thvoid said:


> Come on AMD, join the fight! I want to see reasonable prices.



AMD need to, I'm not paying for another horrendously overpriced GPU stunt from NVidia again. Below £550 this time please. I'm riding my upgrade hopes on getting two of these things with their fabled lower power usage.


----------



## Sony Xperia S (Jul 3, 2014)

RCoon said:


> That chip is crazy large.





BorisDG said:


> GM204 should be the size of GK104.



The most reasonable update on 28 nm is to have this GM204 sit right in between GK104 and GK110.
So you get more performance just from having more shaders (everything else being equal), and then the new microarchitecture will have its say.

Perhaps with a die size of around 350-400 mm².


----------



## RCoon (Jul 3, 2014)

I'd also like to request inductors that don't squeal on the reference models. Looks like the engineering sample is using the same cheap R22s as the 780 reference.


----------



## TheMailMan78 (Jul 3, 2014)

I love new tech, but until games catch up OR you are gaming at 4K, it's not like the old days. You used to have to upgrade every year just to stay on medium settings! I think tech will be in a slump for a long time until the market demands more. Seriously, an AMD 4200 X2 is more than enough for the average computer user, and a 570 is overkill for the casual player. It's really hard for me to get excited over this stuff anymore.

Maybe I'm just getting older and jaded. For those of you who still get excited......I'm jelly.


----------



## the54thvoid (Jul 3, 2014)

RCoon said:


> AMD need to, I'm not paying for another horrendously overpriced GPU stunt from NVidia again. Below £550 this time please. I'm riding my upgrade hopes on getting two of these things with their fabled lower power usage.



I'm with you there.  NV need a stick up their ass to dislodge that pricing problem they have.


----------



## Sony Xperia S (Jul 3, 2014)

the54thvoid said:


> I'm with you there.  NV need a stick up their ass to dislodge that pricing problem they have.



I am also with you, guys! Fuck that damn Nvidia! :lol:


----------



## Casecutter (Jul 3, 2014)

It feels to me that this is nothing more than "don't forget about us, we're... coming." They just can't have "Tonga" as the only thing talked up on the forums.

Any 880 must be at minimum 20% above GTX 780 performance, at about 200 W TDP, with a die size well under 400 mm², and priced at $450-470, or there's not much reason to bring it out, as I see it.


----------



## Disparia (Jul 3, 2014)

Hey, where did you get pics of my wife!? Luckily some of her naughty bits are blurred out...


----------



## DarkOCean (Jul 3, 2014)

RCoon said:


> That chip is crazy large...
> Also, why is everything except the chip blurred out?
> Correct me if I'm wrong, but I see 3 x 6 pins, 2 stacked on top of each other, one by the side.



The chip is not that large; it's smaller than GK110 (I expect something around 450 mm²), considering it's still 28 nm.
Why the green board with so many power connectors? It's probably a workstation card.


----------



## Hockz (Jul 3, 2014)

RCoon said:


> That chip is crazy large...
> Also, why is everything except the chip blurred out?
> Correct me if I'm wrong, but I see 3 x 6 pins, 2 stacked on top of each other, one by the side.


It's actually 1 x 8 pin + 2 x 6 pin


----------



## WithoutWeakness (Jul 3, 2014)

The weird PCB and port layout is probably due to this being a prototype or engineering sample board. There's no way it's a final reference layout with the power connectors like that and the SLI bridges where they are.


----------



## erocker (Jul 3, 2014)

It's a piece of silicon on what looks to be a prototype PCB.


----------



## GhostRyder (Jul 3, 2014)

It's just an engineering sample, guys. I would take the look of the card very lightly, because you can bet they are going to do a very different design than this.

Though it is funny to see three power connectors on it at the moment; it's just a prototype, probably for putting the GPU through its paces.


----------



## TheoneandonlyMrK (Jul 3, 2014)

Casecutter said:


> It feels to me that this is nothing more than "don't forget about us we're... coming".  They just can't have "Tonga" only talking up the forums.
> 
> Any 880 must be at minimum 20% above the GTX780 performance, be about 200W TDP, while on a Dia size well under 400mm to be priced at $450-470 or there's not much of reason to bring it, I see.


Exactly what I think regarding "remember us."
This has to rate as one of the tattiest PR outbursts I've seen Nvidia allow out. Or perhaps you think Nvidia passes engineering samples a few days old to just anyone? I don't.
No info at all here that can be carried into the future, bar: LOOK, A CHIP (it could even be Tegra, who knows, T6?? And "1421" noted on the chip). Voilà.


----------



## MxPhenom 216 (Jul 3, 2014)

TheMailMan78 said:


> I hope it's worth upgrading from what I have now.



Probably not, if it's really GM204 on 28 nm, like the GTX 680 was GK104. I'm waiting for big-die GM210 at 20 nm before even considering Maxwell.


----------



## Hilux SSRG (Jul 3, 2014)

MxPhenom 216 said:


> Probably not, if it's really GM204 on 28 nm, like the GTX 680 was GK104. I'm waiting for big-die GM210 at 20 nm before even considering Maxwell.



I agree, if you have a 600/700 series, wait on 20 nm Maxwell.  If you need the performance, just add another for SLI.


----------



## MxPhenom 216 (Jul 3, 2014)

Hilux SSRG said:


> I agree, if you have a 600/700 series, wait on 20 nm Maxwell.  If you need the performance, just add another for SLI.



Yeah, that was my original plan, adding another 780 with a 1440p monitor, but now I'm not working, so that's a problem.


----------



## the54thvoid (Jul 3, 2014)

theoneandonlymrk said:


> This has to rate as one of the tattiest PR outbursts I've seen Nvidia allow out. Or perhaps you think Nvidia passes engineering samples a few days old to just anyone? I don't.



Your hatred of Nvidia is strong.  You must learn to master your emotions, only then will you become impartial.

j/k

But seriously, I doubt it's PR at all. A GTX 880? To take the shine off Tonga? The articles state it's to combat the GTX 760. If you're in the market for a 760, you're not buying a GTX 880. This is just a leak, maybe even a fake one, to stir things up. There are lots of Chinese sites out there with all sorts of pseudo-rumours and tech.

All we know is .... nada.


----------



## GAR (Jul 3, 2014)

BorisDG said:


> Weird prototype: the SLI/power placements, the extremely long PCB. I'm not sure, but GM204 should be around the size of GK104; this looks like GK110. Interesting that the VRM section is censored. Also, eight chips: if they're 512 MB each, we have 4 GB confirmed; if not, 2 GB... But to me this looks fake.



LOL, you must be new to the PC world..... This is normal for a "test" sample card; it's kind of like the beta or alpha testing stages.


----------



## Selene (Jul 3, 2014)

This could also be an ES card with the on-board ARM chip on the back we have been hearing about: 8-pin + 6-pin for the GPU and 1x 6-pin for the ARM chip.


----------



## MxPhenom 216 (Jul 3, 2014)

theoneandonlymrk said:


> Exactly what I think regarding "remember us."
> This has to rate as one of the tattiest PR outbursts I've seen Nvidia allow out. Or perhaps you think Nvidia passes engineering samples a few days old to just anyone? I don't.
> No info at all here that can be carried into the future, bar: LOOK, A CHIP (it could even be Tegra, who knows, T6?? And "1421" noted on the chip). Voilà.



Oh the AMD fanboy rage!


----------



## john_ (Jul 3, 2014)

the54thvoid said:


> Come on AMD, join the fight!  I want to see reasonable prices.  (This isn't a dig at AMD, I WANT to see lots of competition and AMD can certainly put a fly in NV's ointment if it wants to).





RCoon said:


> AMD need to, I'm not paying for another horrendously overpriced GPU stunt from NVidia again. Below £550 this time please. I'm riding my upgrade hopes on getting two of these things with their fabled lower power usage.



As I said in the past, many wait for AMD, ask AMD, demand that AMD join the fight, just so they can buy cheaper Intel and Nvidia stuff. How nice...



As for the card, it is an ES with 8 GB of 7 GHz RAM. I think the only interesting thing about this board is that *8* GB, because it smells like a 256-bit data bus. Also, the SLI connectors just say "NO" to the XDMA approach from AMD. Other than that, I don't think there is anything else of interest here. Maybe those pixelated black circles could be something more than just fans for the power circuit. ARM cores in that area, maybe?


----------



## Selene (Jul 3, 2014)

AMD was part of the reason for the 680 being $499.99 instead of a 660 at half the price, but that's in the past. Maybe this time around they won't sandbag, and we can have full flagships from the start.


----------



## BorisDG (Jul 3, 2014)

GAR said:


> LOL, you must be new to the PC world..... This is normal for a "test" sample card; it's kind of like the beta or alpha testing stages.


No, I'm not new. I just mentioned that, because I saw a lot of "ES" in the past and they are not that different.


----------



## MxPhenom 216 (Jul 3, 2014)

BorisDG said:


> No, I'm not new. I just mentioned that, because I saw a lot of "ES" in the past and they are not that different.



I think it depends on the stage of the ES. If its very early sample, it likely will not look anything close to retail PCB reference designs. Typically the ES samples that are sent out close to the release date to reviewers (unless they send retail cards) are much closer to final designs.


----------



## HM_Actua1 (Jul 3, 2014)

I'll patiently wait for Pascal.


----------



## Sony Xperia S (Jul 3, 2014)

Hitman_Actual said:


> I'll patiently wait for Pascal.



That's at least 2, maybe even a 3-year wait.


----------



## Nabarun (Jul 3, 2014)

Sony Xperia S said:


> That's at least 2, maybe even a 3-year wait.


That sucks, man!


----------



## Hilux SSRG (Jul 3, 2014)

Sony Xperia S said:


> That's at least 2, maybe even a 3-year wait.



I thought it was Volta that's years away?  Or was that done away with?


----------



## HumanSmoke (Jul 3, 2014)

RCoon said:


> That chip is crazy large...
> Also, why is everything except the chip blurred out?
> Correct me if I'm wrong, but I see 3 x 6 pins, 2 stacked on top of each other, one by the side.


1 x 8 pin, 2 x 6-pin.

Actually fairly common for prototypes. Nvidia likely wouldn't know exactly how the chip performs or its headroom for both stock clocks and AIB margins. The prototype allows for 2 x 6 pin and alternatively, 1 x 8pin + 1 x 6pin operation. Testing at a range of input power (GPU voltage) would offer a fine tuning capability - the old sliding scale of performance vs power consumption/heat output.


----------



## arbiter (Jul 3, 2014)

the54thvoid said:


> I'm with you there.  NV need a stick up their ass to dislodge that pricing problem they have.



Well, what do you expect when R&D is actually done on the card, instead of AMD's model of throwing some chips together, putting the cheapest cooler they've got on it, and shipping it out the door? <-- what happened with AMD and the 290X. They were so dead set on beating the 780. Nvidia tends to put out a card that will do at least xxx, and whatever you get with boost is an OC on top. AMD's card will do "up to xxxx", but we all know how "up to" claims end up: ISPs, MPG figures on cars.

Lastly, Nvidia is NOT the only GPU maker, so if you don't like their prices, then DON'T BUY THEM and spare us your AMD fanboy crap.


----------



## HumanSmoke (Jul 3, 2014)

arbiter said:


> Well, what do you expect when R&D is actually done on the card, instead of AMD's model of throwing some chips together, putting the cheapest cooler they've got on it, and shipping it out the door?


It isn't really that simple, I don't think. Nvidia tends to look for higher ASPs on consumer cards to offset the sweetheart deals they offer OEMs (the Amazon Tesla K10 deal would be a good case in point). Nvidia also puts a far (far, far) higher proportion of its expenses into software development, which also needs to be factored in. The company has never been satisfied to merely exist as a counterpoint to another vendor, so the pricing tends to reflect that too.

BTW: For those dismayed about the size of the GM 204 die, it seems to scale out at ~430 mm² deducting the die package. Assuming it sits around Hawaii XT performance then it really isn't that bad considering Hawaii itself is 438 mm².


----------



## Sony Xperia S (Jul 3, 2014)

arbiter said:


> Lastly, Nvidia is NOT the only GPU maker, so if you don't like their prices, then DON'T BUY THEM and spare us your AMD fanboy crap.



Silly. Nvidia is not alone, but their pricing has a very pronounced negative effect on the whole market, since they are part of it, reinforcing further stagnation and crisis rather than positively influencing growth or whatever else you want to achieve.

About Volta I don't know, but I guess it is not possible to have anything else between Maxwell this year and Pascal on the projected roadmap in 2016.


----------



## TheoneandonlyMrK (Jul 3, 2014)

MxPhenom 216 said:


> Oh the AMD fanboy rage!


Are you on something?
I pointed out the obvious, and I welcome all new tech, even in 2015; I'm no OT fan of any of them.
Sorry if I wasn't excited enough for your liking, but I'm not impressed by random silicon in that raw and mysterious a form.


----------



## MxPhenom 216 (Jul 3, 2014)

theoneandonlymrk said:


> Are you on something?
> I pointed out the obvious, and I welcome all new tech, even in 2015; I'm no OT fan of any of them.
> Sorry if I wasn't excited enough for your liking, but I'm not impressed by random silicon in that raw and mysterious a form.



Seems like you are on probably the same thing.


----------



## GAR (Jul 3, 2014)

The pricing comments don't make any sense.....

Let's see: R9 290X = $550? Can't be overclocked much because AMD pushed it to the limit already, and it uses more power.
GTX 780 = $450-550? Overclocked, it beats the R9 290X, and it matches the 780 Ti and in some cases beats it...


----------



## john_ (Jul 4, 2014)

GAR said:


> The pricing comments don't make any sense.....
> 
> Let's see: R9 290X = $550? Can't be overclocked much because AMD pushed it to the limit already, and it uses more power.
> GTX 780 = $450-550? Overclocked, it beats the R9 290X, and it matches the 780 Ti and in some cases beats it...



Nice maths. Now try this comparison again with EVERY GRAPHICS CARD UNDER $400.


----------



## GAR (Jul 4, 2014)

john_ said:


> Nice maths. Now try this comparison again with EVERY GRAPHICS CARD UNDER $400.



OK, let's see:

R9 280X = slower than the GTX 770 in most cases, costs $300-$350 on average depending on model

GTX 770 = overclocks like a champ, costs $300-400 on average depending on model and RAM

Both are very close; I don't see where this "huge" price difference is..... That argument is pointless; we can go down, all the way down to the 750 Ti, same story..... Not saying one is better than the other, just saying they are close in price/performance.


----------



## Jetster (Jul 4, 2014)

They blurred out the sexy parts  lol


----------



## LeonVolcove (Jul 4, 2014)

GAR said:


> OK, let's see:
> 
> R9 280X = slower than the GTX 770 in most cases, costs $300-$350 on average depending on model
> 
> ...




Slower than the GTX 770? Then why was the R9 280X named "best graphics card for the money" in Tom's June graphics card roundup?
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-4.html


----------



## eidairaman1 (Jul 4, 2014)

Looks fake to me


----------



## The Von Matrices (Jul 4, 2014)

HumanSmoke said:


> BTW: For those dismayed about the size of the GM 204 die, it seems to scale out at ~430 mm² deducting the die package. Assuming it sits around Hawaii XT performance then it really isn't that bad considering Hawaii itself is 438 mm².



That doesn't seem to make much sense to me that it would perform the same as a Hawaii XT.  Why would Nvidia go through the effort of designing a new GPU if it was only equal to its competition from a performance/die area perspective?  I find it unlikely that they are selling so many GTX 780s that the lower production cost of a smaller, fully enabled GPU (compared to an 80% enabled GK110) would pay back the capital investment in a new die.

Much more likely is that it is an  improvement over Hawaii XT from a performance/die area perspective.  However, considering the usual price differential between AMD and NVidia, they probably will both have the same performance/price with the NVidia card having higher performance and a higher price.



LeonVolcove said:


> Slower than the GTX 770? Then why was the R9 280X named "best graphics card for the money" in Tom's June graphics card roundup?
> http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-4.html



There's a difference between having the best performance (what the original post you're referring to said) and having the best performance/price, which is what the Tom's hardware article is stating.


----------



## arbiter (Jul 4, 2014)

LeonVolcove said:


> Slower than the GTX 770? Then why was the R9 280X named "best graphics card for the money" in Tom's June graphics card roundup?
> http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-4.html



I would guess they went with the 280X because it's the cheaper card, not the faster one. Considering there's only one Nvidia card on that list, I question who the writer personally favors.


----------



## HumanSmoke (Jul 4, 2014)

The Von Matrices said:


> That doesn't seem to make much sense to me that it would perform the same as a Hawaii XT.  Why would Nvidia go through the effort of designing a new GPU if it was only equal to its competition from a performance/die area perspective?  I find it unlikely that they are selling so many GTX 780s that the lower production cost of a smaller, fully enabled GPU (compared to an 80% enabled GK110) would pay back the capital investment in a new die.
> Much more likely is that it is an  improvement over Hawaii XT from a performance/die area perspective.  However, considering the usual price differential between AMD and NVidia, they probably will both have the same performance/price with the NVidia card having higher performance and a higher price.


Did you note my use of the word "assuming"? I suppose I could have said that the GM 204 could have X% improvement over Hawaii... and what kind of response do you think that would elicit from some of our more rabid posters?

Truth is, I think GM 204 is a successor to the GK 104 and GF 104/114 lineage, so ~780/780 Ti/290/290X performance would be respectable. I'm not certain that comparing the card to a heavily castrated die from the previous generation is overly helpful; the HD 7870 XT (Tahiti LE), also a heavily cut part, basically sits at the same level of performance as the incoming (Pitcairn-based) R9 270X. Personally it wouldn't surprise me to see the 780/780 Ti 3GB phased out and the 6GB cards using B1 silicon (assuming it isn't being further revised) become the norm. If the 880 is 256-bit, then it's possible to market that as mainstream, with the 6GB/384-bit cards for the higher-resolution crowd. Will it happen? Who knows? But if both vendors are using the same process and the same die space, and Maxwell isn't that great an improvement over Kepler so far, how much better than Hawaii do you expect it can be?


LeonVolcove said:


> Slower than the GTX 770? Then why was the R9 280X named "best graphics card for the money" in Tom's June graphics card roundup?
> http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-4.html



Tom's?.....Tom's Hardware? Awesome.

BTW:  The pricing that Tom's used looks a little like bait advertising. The card now retails at the same store for $300, which is more in line with other outlets.


----------



## john_ (Jul 4, 2014)

GAR said:


> OK, let's see:
> 
> R9 280X = slower than the GTX 770 in most cases, costs $300-$350 on average depending on model
> 
> ...



The cheapest 770 costs 290 euros here; the cheapest 280X costs 250. And of course the 770 is not faster "in most cases".

The 280 here costs on average 20 euros less than the 760.
There is nothing new from Nvidia to put next to the 270X.
The 270 costs a little more than the 750 Ti and is way faster.
Even the 265, which costs the same as the 750 Ti, is faster.
The 260X is faster than the 750.
The 260 is cheaper than the 750.
The 250X is faster than Nvidia cards in the same price range.
The same is true for the 250
and the 240.


----------



## RCoon (Jul 4, 2014)

arbiter said:


> Lastly, Nvidia is NOT the only GPU maker, so if you don't like their prices, then DON'T BUY THEM and spare us your AMD fanboy crap.



Dude, you are aware he owned a Titan, and is currently running a 780ti? You just made yourself look like a complete tool, and you and people like you are the reason this forum is getting worse.


----------



## the54thvoid (Jul 4, 2014)

arbiter said:


> Lastly, Nvidia is NOT the only GPU maker, so if you don't like their prices, then DON'T BUY THEM and spare us your AMD fanboy crap.



I read this last night and...

My Intel mobo, 3930K and hefty 780 Ti Classified suggest otherwise.

And it is pretty universally accepted that when you call someone a F***** without good reason you are one yourself. 



RCoon said:


> Dude, you are aware he owned a Titan, and is currently running a 780ti? You just made yourself look like a complete tool, and you and people like you are the reason this forum is getting worse.



We need to team up like double dragon, bring Freedom too! But seriously, yeah, so many people are incapable of reading a post in a neutral manner.  It's becoming a PITA with a lot of people too ignorant or with their bias guns set all the way to 10.


----------



## dom99 (Jul 4, 2014)

TheMailMan78 said:


> I love new tech, but until games catch up OR you are gaming at 4K, it's not like the old days. You used to have to upgrade every year just to stay on medium settings! I think tech will be in a slump for a long time until the market demands more. Seriously, an AMD 4200 X2 is more than enough for the average computer user, and a 570 is overkill for the casual player. It's really hard for me to get excited over this stuff anymore.
> 
> Maybe I'm just getting older and jaded. For those of you who still get excited......I'm jelly.


 
I agree, there is no reason to upgrade to powerhouse GPUs until 4K becomes more affordable.


----------



## Sony Xperia S (Jul 4, 2014)

dom99 said:


> I agree, there is no reason to upgrade to powerhouse GPUs until 4K becomes more affordable.



I think the time when "4K becomes more affordable" will be close to the end of next year; that is, 2016 will be the year of 4K.

Windows 9 (because Windows 7 and Windows 8 are relatively poor with high-DPI software) + new GPUs (if we're lucky, on new manufacturing processes; that's what I'm more interested in, rather than Maxwell on 28 nm, meh) + some new CPUs.


----------



## Raúl García (Jul 4, 2014)

Hey, you've got a really nice piece of engineering over that cooling device... I guess the water is oversalted (don't know if that's how you say it... too salty!).

Hey, I hate to do this, but I need help on a topic and I'm new here... any idea how to filter members so I can reach out specifically to people who are interested in programming with VB.NET?

Thanks anyway...


----------



## RCoon (Jul 4, 2014)

Raúl García said:


> Hey, you've got a really nice piece of engineering over that cooling device... I guess the water is oversalted (don't know if that's how you say it... too salty!).
> 
> Hey, I hate to do this, but I need help on a topic and I'm new here... any idea how to filter members so I can reach out specifically to people who are interested in programming with VB.NET?
> 
> Thanks anyway...



Head on over to that forum, and make a new thread.
http://www.techpowerup.com/forums/forums/programming-webmastering.52/


----------



## Raúl García (Jul 4, 2014)

NICE!! thanks a lot...


----------



## Tatty_One (Jul 4, 2014)

GAR said:


> Ok, lets see
> 
> R9 280X = slower than the GTX 770 in most cases costs $300-$350 on average depending on model
> 
> ...


Little point in using overclocking as a pro or con when 90%+ of graphics card users don't overclock, to be honest. If everyone did overclock, reference designs would cost us even more, as there would be little market for overclocked or special edition models. Just a personal opinion, though.


----------



## Naito (Jul 4, 2014)

Sony Xperia S said:


> Silly, nvidia is not alone but their pricing has a very pronounced negative effect on the whole market, since they are part of it, thus emphasizing and enforcing even further stagnation and crisis in the same market rather than positive influence on growth or whatever else you want to achieve. .



I believe all the current prices have stemmed from AMD not being able to compete mano a mano with Nvidia since possibly around the time of the X1950 series of cards. Sure, current AMD cards offer great performance for the money, but they tend to draw much more power, overclock less, and produce more heat compared to their Nvidia counterparts (debatable on a generational basis). There is also the perceived (real or otherwise) greater reliability and stability of Nvidia drivers.

I'll leave this here.



> ...go back to the release of the previous generation of SKUs. AMD has a 4 month head start on Nvidia by releasing the HD 7000 series. There is speculation, that in this time, Nvidia moved their mid-tier GK104 to a high-end SKU, removing the GK110 from the lineup. This is entirely possible, because a product based on the GK110 was announced as far back as May 2012, one month after the release of the GK104 SKUs. Fast-forward to February this year. Nvidia is starting to lose some competitiveness against the HD7970 GHz Ed (and possibly due to other pressures) and decide to release the GTX TITAN (and eventually the GTX 780) and cash in on the enthusiast market.
> 
> So to sum up, AMD are only just beginning to be competitive with a 17 month old GPU. Maybe AMD should be blamed for Nvidias crazy prices? But having said that, some say AMD play a different game; price/performance.



There is also the fact that each company's SKU tiers seem to conveniently slot in between each other's price ranges; again, none go head to head. Sure, they try to undercut each other, but it almost seems there is some sort of gentleman's agreement in place. Maybe I'm just paranoid....


----------



## HumanSmoke (Jul 4, 2014)

Naito said:


> There is also the fact that each company's SKU tiers seem to conveniently slot in between each other's price ranges; again, none go head to head. Sure, they try to undercut each other, but it almost seems there is some sort of gentleman's agreement in place. Maybe I'm just paranoid....


That has been the case for a few years, at least since Nvidia and AMD got smacked for price collusion: the illusion of a price war from a few special-case SKUs (HD 5830 vs GTX 460, for instance), but with very little actual impact on the overall product stack (and ASPs) of either company.


----------



## Recus (Jul 4, 2014)

theoneandonlymrk said:


> Are you on something?
> I pointed out the obvious, and I welcome all new tech, even in 2015; I'm no OT fan of any of them.
> Sorry if I wasn't excited enough for your liking, but I'm not impressed by random silicon in that raw and mysterious a form.



Why haven't you pointed this out?

AMD’s Full Blown Hawaii XTX Core Confirmed – Is this the R9 295X with 3072 SPs and 48 CUs?



> I got a message yesterday, hinting at AMD lifting the embargo date to start "leaking info" for the upcoming GPUs, due to worry some GM 200 leaks spread.



Next day...

[Updated] AMD’s Hawaii XTX 48CU GPU Doesn’t Exist After All – Reliable Source Denies Existence Completely


----------



## RCoon (Jul 4, 2014)

Recus said:


> Why you haven't pointed this?
> 
> AMD’s Full Blown Hawaii XTX Core Confirmed – Is this the R9 295X with 3072 SPs and 48 CUs?
> 
> ...








Lol. I'm remembered by WCCF!


----------



## Champ (Jul 4, 2014)

This card sounds like it's gonna be ungodly. Got this in my email: http://www.tweaktown.com/news/38849...wsletter&utm_medium=ttemail&utm_campaign=ttcs  2 of these should be 4K city


----------



## TheoneandonlyMrK (Jul 4, 2014)

I have mentioned similar tactics in AMD threads, i.e. "look at this, not them"; they all do it.
Many a puerile jab at me here, but the facts still are:
Here is a chip.
It's got plenty of memory.
It's at ALPHA stage.
You bet it needs power.
And it still could be anything,
especially given Nvidia's recent roadmaps. That's not me being biased, that's me saying the way it is.


----------



## Nabarun (Jul 4, 2014)

In India (not everywhere in India, just a few places I checked) the 280X is around INR 3k cheaper than the 770. Cheaper, but not by enough imho, given the higher power consumption, and it largely comes without the sexy backplate found on the Asus 770 DCU II.


----------



## 64K (Jul 4, 2014)

I have been seeing talk about the GTX 880 (GM104/204) having an 8-pin and two 6-pin power connectors, and talk of a possible 375 watt TDP when you add it all up. Take a look at the TDP of the GTX 260, GTX 460, and GTX 680 (which really should have been called a GTX 670). Nvidia had to invest some time and money into redesigning Maxwell for the 28 nm process due to the situation at TSMC, so you can be sure they intend to release an entire series of 28 nm Maxwells over several months in 2015 to recoup their investment, and then we will have the 20 nm Maxwells, and those will be the performance powerhouses.
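For reference, that 375 W figure is just the PCI Express power limits added up: 75 W from the slot, 75 W per 6-pin connector, and 150 W for the 8-pin. A quick sketch of the sum:

```python
# Maximum board power for the rumored 8-pin + 2x 6-pin layout,
# per the PCI Express spec limits for each power source.
PCIE_SLOT_W = 75   # power delivered through the PCIe slot itself
SIX_PIN_W = 75     # per 6-pin auxiliary connector
EIGHT_PIN_W = 150  # per 8-pin auxiliary connector

max_board_power = PCIE_SLOT_W + EIGHT_PIN_W + 2 * SIX_PIN_W
print(max_board_power)  # 375
```

Of course, connectors are routinely over-provisioned on prototype boards, so this is a ceiling, not a TDP claim.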


----------



## HumanSmoke (Jul 4, 2014)

theoneandonlymrk said:


> I have mentioned similar tactics in amd threads ie look at this not them, they all do it.


I think the principal difference people are referring to is that when it's Nvidia speculation, there's howling and gnashing of teeth from your direction (also not sure what the power input has to do with it, or why it's such an issue, considering it certainly isn't for the same people WRT the 290X, 290X Crossfire, and 295X2)....but someone posts an obvious BS wishlist (Hawaii, 4096 cores, 20 nm, etc., etc.) and an obvious silicon floorplan of an APU, and you give it the full flag-waving treatment.


theoneandonlymrk said:


> Wow, compute-cored gfx is around the corner - nice. Whilst the pic may be a hoax, it does all make perfect sense. Imho those serial cores will either be a new special DP shader sub-unit aimed at serial compute, or something much more interesting like Jaguar cores perhaps, but they are not at all like an SMX unit btw (can't remember who said that); they will be separate and special. I think the bus/fabric that binds it all together will be very, very interesting too.


....even after the source has been debunked


theoneandonlymrk said:


> AMD are beating many to the ball with this tech, and due to their APU, GPU and SoC achievements they are certainly one to bet on imho.



Maybe some middle ground between the unbridled optimism (for AMD) and the most pessimistic outcome possible (for Nvidia) might be closer to reality?......although much less entertaining in retrospect.


----------



## TheoneandonlyMrK (Jul 4, 2014)

I disagree.
Some people have too much free time on a Friday night; my comments in this thread were only slightly negative, but I suppose you read into them what you will.
I'd hope you are all right, because imho we need 4-10x the GPU grunt we have now, and progress is progress, but this pic is intriguing, not informative.
I obviously don't approve of the misquoted stuff, Smokey.


----------



## arbiter (Jul 4, 2014)

64K said:


> I have been seeing talk about the GTX 880 (GM104/204) having an 8-pin and two 6-pin power connectors, and talk of a possible 375 watt TDP when you add it all up. Take a look at the TDP of the GTX 260, GTX 460, and GTX 680 (which really should have been called a GTX 670). Nvidia had to invest some time and money into redesigning Maxwell for the 28 nm process due to the situation at TSMC, so you can be sure they intend to release an entire series of 28 nm Maxwells over several months in 2015 to recoup their investment, and then we will have the 20 nm Maxwells, and those will be the performance powerhouses.



It's Nvidia; they wouldn't put out a card that uses 375 watts for a single GPU. That would just be outright stupid for them. I would believe it more from AMD, as they have been, and are, willing to use whatever amount of power it takes to push their chips to be competitive with Nvidia (on the GPU side) and Intel (on the CPU side).


----------



## Sony Xperia S (Jul 5, 2014)

Naito said:


> There is also the perceived (real or otherwise) greater reliability and stability of Nvidia drivers.



That's because of the stupidity of end-users who otherwise pretend to be able to do many things.  

I have been using ATi Catalyst for 8 years now and never ever had any problems with it.

Everything seems to depend on the quality of the device-behind-the-keyboard, and since I think we can assume AMD drivers are more sensitive to your PC environment (installed MS Visual C++ runtimes, .NET Framework, etc.), that is indeed some kind of proof of this claim.


----------



## Ikaruga (Jul 6, 2014)

I want one


----------



## xenocide (Jul 7, 2014)

Sony Xperia S said:


> I have been using ATi Catalyst for 8 years now and never ever had any problems with it.


 
I refuse to believe you've used *any* kind of software for 8 years and never encountered a single problem.


----------



## TheHunter (Jul 7, 2014)

RCoon said:


> Lol. I'm remembered by WCCF!






I remember how he made a scene about Z87 not being compatible with Devil's Canyon. Then, when it went official, he goes, "Did Intel break under pressure?" Really? Imo this guy can be a joke sometimes.


----------



## Nabarun (Jul 7, 2014)

TheHunter said:


> I remember how he made a scene about Z87 not being compatible with Devil's Canyon. Then, when it went official, he goes, "Did Intel break under pressure?" Really? Imo this guy can be a joke sometimes.


How the hell do you know he is one of the commentators??? I don't see the damn username.


----------



## Sony Xperia S (Jul 7, 2014)

This 28 nm GTX 880 will be worthless if this is true:

NVIDIA to skip 20nm fabrication process, third generation Maxwell to use 16nm?

Now, here’s the biggest shocker coming from SemiAccurate article. According to their sources, NVIDIA will skip 20nm node and move straight to 16nm. Not only that, GM204 will be the first GPU remanufactured and relaunched using this process.

NVIDIA’s GeForce GTX 880 launch is set for Q3. If everything goes according to plan, the new flagship should appear in October.

*The second wave would launch somewhere in mid-Q1 2015, which gives us 4 to 6 months between the second and third generations of Maxwell.*

Long story short, the A-stepping GM204 is expected to be the last 28 nm GPU NVIDIA is going to make. The GM204 B stepping and all future GPUs will use the 16 nm node.

http://videocardz.com/51009/nvidia-preparing-four-maxwell-gm204-skus


----------



## TheHunter (Jul 7, 2014)

^
Yeah, I saw that. I think I might wait for GM210; not interested in another mid-range GM204 chip.



Nabarun said:


> How the hell do you know he is one of the commentators??? I don't see the damn username.


And?  My point is he/that site can be trolling sometimes, mostly spreading false rumors. I stopped taking them so seriously a long time ago.

Imo no wonder it's blocked @ guru3d.


----------



## Nabarun (Jul 7, 2014)

TheHunter said:


> And?  My point is he/that site can be trolling sometimes, mostly spreading false rumors. I stopped taking them so seriously a long time ago.
> 
> Imo no wonder it's blocked @ guru3d.


Ahhh.... gotcha


----------



## HumanSmoke (Jul 7, 2014)

Sony Xperia S said:


> Now, here’s the biggest shocker coming from SemiAccurate article. According to their sources, NVIDIA will skip 20nm node and move straight to 16nm.


What precisely is the shocker here? TSMC's CLN20SOC isn't suitable for high-power GPUs. I, as well as others, have attempted to debunk that fairy tale for some time.
TSMC's own roadmap stated the same thing before they softened the PR when they cancelled the CLN20G (general purpose) process.






If Charlie D is following his usual modus operandi, he'll claim that Nvidia will launch four cards, then, when they launch two or three, claim that they couldn't launch the rest because of bad yields. I'm actually quite surprised that he hasn't announced a fictitious GM 2xx part as clickbait, then later delivered a six-part exposé on why Nvidia failed to produce it.


----------



## xenocide (Jul 8, 2014)

TheHunter said:


> ^
> Yeah I saw that, I think I might wait for GM210, not interested in another mid-range GM204 chip.


 
So you're telling me that if Nvidia launched a GM204 part that outperformed the 780 Ti for half the price, you would have no interest in it?  Who *cares*, as long as the performance is there?



HumanSmoke said:


> If Charlie D is following his usual modus operandi, he'll claim that Nvidia will launch four cards, then, when they launch two or three, claim that they couldn't launch the rest because of bad yields. I'm actually quite surprised that he hasn't announced a fictitious GM 2xx part as clickbait, then later delivered a six-part exposé on why Nvidia failed to produce it.


 
Dude swings and misses so often I'm surprised the Red Sox haven't offered him a contract.


----------



## TheHunter (Jul 8, 2014)

No, I wouldn't; I already own a GK110.


Only a future GM210 would be worth it. Hopefully ~10 TFLOPS.
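For a sense of where that ~10 TFLOPS hope sits: single-precision throughput is commonly estimated as 2 FLOPs (one fused multiply-add) per CUDA core per clock. A quick sketch plugging in the rumored 3,200-core GM204 with a purely hypothetical ~1.1 GHz clock:

```python
def tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Rough peak FP32 throughput: 2 FLOPs (one FMA) per core per clock."""
    return 2 * cuda_cores * clock_ghz / 1000.0

# Rumored 3,200 cores at an assumed ~1.1 GHz:
print(round(tflops(3200, 1.1), 1))  # 7.0
```

So even the rumored GM204 would land well short of 10 TFLOPS at plausible clocks; a bigger GM210 (or much higher clocks) would be needed to get there.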


----------

