# Galaxy GeForce GTX 970 Pictured, Specs Confirmed, Early Benchmarks Surface



## btarunr (Sep 10, 2014)

Here are some of the first pictures of an AIC partner-branded NVIDIA GeForce GTX 970 graphics card, the Galaxy GTX 970 GC. Spotted across Chinese PC enthusiast forums and social networks, the latest set of leaks covers not just pictures of what the GTX 970 looks like, but also what's under its hood. To begin with, Galaxy's card appears to be built for the high-end market segment. A meaty twin-fan aluminium fin-stack heatsink, coupled with a roomy backplate, covers a signature Galaxy blue PCB holding NVIDIA's new GTX 970 GPU and 4 GB of GDDR5 memory. The card appears to feature a high-grade VRM that draws power from a combination of 8-pin and 6-pin PCIe power connectors.

There's also a selection of pictures of a purported reference-design GeForce GTX 970 graphics card. It may look drab, but that's because NVIDIA will not ship reference-design cards. The GTX 970 will likely be an AIC-exclusive, meaning you'll only find custom-design cards based on the chip. We wonder if the same holds for the GTX 980. Straightaway, you'll notice that the GTX 970 reference PCB bears an uncanny resemblance to the one NVIDIA used for the GTX 670, GTX 660 Ti, and GTX 760. That's probably because the GK104 and the new GM204 are pin-identical. Such a thing isn't new: the "Pitcairn" silicon (Radeon HD 7870, HD 7850) and its predecessor "Barts" (HD 6870, HD 6850) are similarly pin-identical, differing only in the die itself. The similarity in PCB design, if nothing else, suggests that the GTX 970 will be about as energy-efficient as the GTX 670.

Moving on to the actual specs: some users with access to a GeForce GTX 970 managed to pull these off a TechPowerUp GPU-Z screenshot. Some parts of the screenshot look blurry, probably due to a failed attempt at blurring out the BIOS string. GPU-Z has had preliminary support for GM204 since version 0.7.9. This is what it could make out:

- GPU identified as "1C32"
- 1,664 CUDA cores
- 138 TMUs
- 32 ROPs
- 256-bit wide GDDR5 memory interface
- 4 GB standard memory amount
- 1051 MHz core, 1178 MHz GPU Boost, and 7012 MHz (GDDR5-effective) memory clocks
- 224 GB/s memory bandwidth

The sample was put through a quick run of the 3DMark 11 Extreme preset. It scored X3963 points, which, if you factor in the dual-core Core i3-4130 used in the bench, puts the GTX 970 somewhere between the GTX 780 and GTX 780 Ti in terms of performance.
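The quoted bandwidth follows directly from the listed clock and bus width, so it's easy to sanity-check; a quick sketch in plain Python:

```python
# Memory bandwidth = effective data rate (transfers/s) x bus width (bytes/transfer)
effective_clock_mhz = 7012   # GDDR5-effective clock from the GPU-Z shot
bus_width_bits = 256

bytes_per_transfer = bus_width_bits / 8                          # 32 bytes
bandwidth_gbs = effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

print(f"{bandwidth_gbs:.1f} GB/s")  # prints 224.4 GB/s
```

That works out to 224.4 GB/s, matching the GPU-Z readout (and the same figure as a GTX 770, which runs the same memory speed on the same bus width).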

*View at TechPowerUp Main Site*


----------



## ZoneDymo (Sep 10, 2014)

"puts the GTX 970 somewhere between the GTX 780 and GTX 780 Ti, in terms of performance." 

Uuurggggg lame!
Come on now, the naming makes it sound like it's two jumps up from the 700 series, and THAT is the performance?
I hope some magic drivers can give it a good kick in the ass.


----------



## thebluebumblebee (Sep 10, 2014)

> draws power from a combination of 8-pin and 6-pin PCIe power connectors


Bummer.  There goes the hope for a lower power middle weight.  That's good for up to 300 watts.  I was looking for something around the GTX 780's performance while using 125-150 watts.  I do see that the other board shots show 2x6 pin.


----------



## LAN_deRf_HA (Sep 10, 2014)

Without a die shrink, never expect much. I always hate it when this happens, because you don't get as much of a performance jump, and then when the die-shrunk cards do come, it's another small jump.



thebluebumblebee said:


> Bummer. There goes the hope for a lower power middle weight. That's good for up to 300 watts. I was looking for something around the GTX 780's performance while using 125-150 watts. I do see that the other board shots show 2x6 pin.



Galaxy likes to do that on lower power cards. They had my 660 Ti's on a 680 PCB and I was never able to come close to hitting the power limit on that thing even volt modded.


----------



## erixx (Sep 10, 2014)

1) LOVELY to see the TechPowerUp utility being used!!!!! Feeling proud of this community!
2) The 970 is the lower-spec twin brother, so if it is in between the 780 and 780 Ti, I am in!


----------



## the54thvoid (Sep 10, 2014)

Reviews will be here soon enough.  No point even guessing now.

Let's all be patient, after all, we know we're getting the repeat scenario from a few years back.

Next year we'll have the 980 Ti.  But for now, a 980 beating a 780 Ti will be nice, if for nothing else then better perf/watt and more VRAM.


----------



## GhostRyder (Sep 10, 2014)

That's more like what I expected: it's between the 780 and 780 Ti, and it's an x70 counterpart, which is very good and could confirm that the 980 is going to be better than the 780 Ti, even if not by an extreme amount.

Going to be a beast at that, with 4 GB of GDDR5 at that 7012 MHz memory clock!  It will be a good high-resolution gaming card from NVIDIA, especially if the price falls nicely in line (though I am expecting ~$600).

Glad to see this card turn out well; now it just comes down to how good the power consumption is.


----------



## Sleepless (Sep 10, 2014)

btarunr said:


> 1051 MHz core, 1178 MHz GPU Boost, and 7012 MHz (GDDR5-effective) memory clocks
> 232 GB/s memory bandwidth


Bandwidth should be the same 224.4 GB/s as a GTX 770, since it's the same speed and bus size.


----------



## Roel (Sep 10, 2014)

thebluebumblebee said:


> Bummer.  There goes the hope for a lower power middle weight.  That's good for up to 300 watts.  I was looking for something around the GTX 780's performance while using 125-150 watts.  I do see that the other board shots show 2x6 pin.



The number of power connectors says nothing about the actual consumption of the card.


----------



## 64K (Sep 10, 2014)

Maxwell is more efficient so we know that the GTX 970/980 will outperform Keplers but being tied down to the 28nm process is what will limit it. My guess is the GTX 980 will perform somewhere around the GTX 780Ti maybe beating it.


----------



## btarunr (Sep 10, 2014)

HazMatt said:


> Bandwidth should be the same 224.4GB/s as a GTX 770 since its the same speed and bus size.



Fixed. Mixed up memory clocks between reference and that Galaxy GC (factory OC card).


----------



## HumanSmoke (Sep 10, 2014)

thebluebumblebee said:


> Bummer.  There goes the hope for a lower power middle weight.  That's good for up to 300 watts.  I was looking for something around the GTX 780's performance while using 125-150 watts.  I do see that the other board shots show 2x6 pin.


The original source does mention that the reference design is 150W TDP ( 2 x 6pin), so I guess it comes down to how aggressive the vendor boards push the OC's, and the marketing factor. A lot of people do associate a performance board with 1*8pin + 1*6pin (and 2*8 for uber SKUs) whether it warrants it or not.


----------



## thebluebumblebee (Sep 10, 2014)

Roel said:


> The amount of power connections say nothing about the actual consumption of the card.


But it's obvious that Galaxy thinks that it can use more than the 225 watts that 2x6 pin (75 watts from the PCI-e bus) can provide.
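The connector arithmetic in this thread follows from the nominal PCIe power budgets (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin); a minimal sketch:

```python
# Nominal PCIe power budgets in watts (per the PCI-SIG CEM spec)
SLOT = 75    # x16 slot through the edge connector
PIN6 = 75    # each 6-pin auxiliary connector
PIN8 = 150   # each 8-pin auxiliary connector

def board_power_limit(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Nominal maximum board power for a given connector layout."""
    return SLOT + six_pin * PIN6 + eight_pin * PIN8

print(board_power_limit(six_pin=2))               # 2x6-pin:  225 W
print(board_power_limit(six_pin=1, eight_pin=1))  # 8+6-pin:  300 W
```

Which is where the 225 W (reference 2x6-pin board) and "good for up to 300 watts" (Galaxy's 8+6-pin) figures come from; actual draw can of course sit well below these ceilings.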


----------



## Selene (Sep 10, 2014)

Sigh, another mid-range GPU with a high-end part number, just like the 670/680 and 770. Will sit on my 780 Classifieds for a while longer.


----------



## Animalpak (Sep 10, 2014)

I will skip this generation because of the waterblock that I bought for my 780 Ti.


----------



## thebluebumblebee (Sep 10, 2014)

Can I swap out the GPU on my 660 Ti?  I'm okay with 2 GB.


----------



## RejZoR (Sep 10, 2014)

Not sure if i'll even bother replacing my HD7950...


----------



## Hilux SSRG (Sep 10, 2014)

Selene said:


> Sigh another mid rage GPU with a high end part number just like the 670/680 and 770. Will sit on my 780 Classifieds for a while longer.



Exactly.  The power efficiency will be better, but it still won't beat a 770 SLI setup. 

High end will be GM210.


----------



## HumanSmoke (Sep 10, 2014)

thebluebumblebee said:


> But it's obvious that Galaxy thinks that it can use more than the 225 watts that 2x6 pin (75 watts from the PCI-e bus) can provide.


Well, hopefully the decision reflects the overclocking headroom. If it mirrors GM107, then the DIY performance upgrade should make the card all the more interesting. If the reference card debuts at $399, then I doubt the vendor specials will be more than $20-30 over that for the most part. Not bad for the performance, but I doubt it would cause AMD's execs to get heart palpitations either. Seemingly the status quo remains... how surprising.
Just for the record, Galaxy used a 6-pin on their GTX 750 (non-Ti), and it did need it (barely): 80-85W with a max 1402 MHz core boost frequency.


----------



## thebluebumblebee (Sep 10, 2014)

The sad truth is that there are no games pushing the hardware anymore.  Do you remember how long it took for hardware to catch up to Crysis?


Hilux SSRG said:


> Exactly.  *The power efficiency will be better but still won't beat a 770sli setup*.
> 
> High end will be GM210.


Remember the movie Trading Places?  There's the scene where they're buying up all of the stock at rock bottom prices.  This might turn out to be a good time to pick up some 670/680/770's as people blindly dump them for the new P/N.


----------



## dj-electric (Sep 10, 2014)

I am not pleased that NVIDIA decided to use its micro-architecture superiority over AMD to deliver a slower-than-expected product. I really hoped for a 1920-core product that would match or even surpass the 780 Ti.

Oh well... time to wait for the bigger dogs.


----------



## Casecutter (Sep 10, 2014)

Is there ample confirmation of the $400 MSRP for this GTX 970? I'd say the small PCB of the reference BOM might bear that out.


----------



## HumanSmoke (Sep 10, 2014)

Dj-ElectriC said:


> I am not pleased by the fact that NVIDIA decided to use its micro-architecture supiriority over AMD and deliver us slower-than-expected product. Really hoped for a 1920 core product that will match or even surpass the 780 ti.


You hoped a 256-bit, 1920 core part would outperform a 384-bit, 2880 core part using the same basic architecture on the same process node? I think you expect a little too much from a tweaking of shader module/core ratios and increased cache. As it is, I'm guessing that GM204 makes further sacrifices in compute to reduce power demand and save die space (esp. if the 370-390mm² die-size estimate is correct).


Dj-ElectriC said:


> Oh well... time to wait for the bigger dogs.


Well, this is the GK 104 replacement after all, so I think there's only so much you can reasonably expect with any given transistor density.


----------



## dj-electric (Sep 10, 2014)

HumanSmoke said:


> You hoped a 256-bit, 1920 core part would outperform a 384-bit 2880 core using the same basic architecture on the same process node ? I think you expect a little too much from a tweaking of shader module/core ratio's and increased cache. As it is, I'm guessing that GM 204 makes further sacrifices in compute to reduce power demand, and save die space (esp. if the 370-390mm² die size estimate is correct)
> 
> Well, this is the GK 104 replacement after all, so I think there's only so much you can reasonably expect with any given transistor density.



Kepler and Maxwell are not the same, not nearly the same in performance-per-watt.
If a 1664-core part can beat a GTX 780, then yes, a 1920-core part could match something that is 22% faster than a GTX 780.
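For what it's worth, the raw core-count arithmetic can be made explicit. This is a naive sketch that assumes performance scales linearly with shader count at identical clocks, which real GPUs never quite achieve:

```python
# Rumoured GTX 970 core count vs. the hoped-for fuller part
cores_rumoured = 1664
cores_hoped = 1920

scaling = cores_hoped / cores_rumoured
print(f"{(scaling - 1) * 100:.0f}% more cores")  # prints 15% more cores
```

Under that (generous) linear assumption, 1920 cores buys roughly 15%, a bit short of the 22% gap cited above, so clocks or per-core efficiency would have to make up the difference.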


----------



## JDG1980 (Sep 11, 2014)

thebluebumblebee said:


> But it's obvious that Galaxy thinks that it can use more than the 225 watts that 2x6 pin (75 watts from the PCI-e bus) can provide.



Or, alternatively, they just re-used the GTX 770 board design as-is. It is a mistake to estimate TDP from the power connectors on a non-reference card; even the power-sipping GTX 750 Ti has some aftermarket boards with PCI-E plugs.


----------



## Nordic (Sep 11, 2014)

HumanSmoke said:


> As it is, I'm guessing that GM 204 makes further sacrifices in compute to reduce power demand, and save die space (esp. if the 370-390mm² die size estimate is correct)


I was hoping for more compute performance. The 750 Ti has incredible compute performance per watt. In F@H my overclocked 750 Ti gets about 70k points per day, similar to a 7870 or 660 Ti. Gaming-wise, it is nowhere near them, but in compute it is really close.

Are you sure maxwell won't have good compute performance?


----------



## bubbleawsome (Sep 11, 2014)

I'm happy because 4GB vRAM


----------



## HumanSmoke (Sep 11, 2014)

Dj-ElectriC said:


> Kepler and Maxwell are not the same, not nearly the same in performance\watt ratio.
> If a 1664 part can beat a GTX 780, than yes, a 1920 part could match something that is 22% faster than a GTX 780.


Well, I wouldn't get your expectations too high. The 3963 3DMark 11 Extreme score *supposedly* puts the card (which carries a 5% overclock) between the 780 and 780 Ti when using a Core i3-4130? Yet the same Core i3-4130 and a bog-standard 780 at reference clocks scores 4228. So it seems the only "proof" (the score) doesn't actually back up WCCF's assertion of the card's ability. Am I surprised? No. Am I surprised that people just accept a claim without delving past the hyperbole? Somewhat.
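Making that objection's arithmetic explicit (a naive sketch: it assumes the 3DMark score scales linearly with the 5% factory overclock, which it won't exactly):

```python
leaked_score = 3963      # GTX 970 sample, carrying a ~5% factory overclock
stock_780_score = 4228   # GTX 780 at reference clocks, same Core i3-4130

# Normalise the leaked score back to reference clocks
normalised = leaked_score / 1.05
print(round(normalised))              # ~3774
print(normalised < stock_780_score)   # True: below even a stock GTX 780
```

On those numbers the overclocked sample trails a stock 780 outright, and a clock-normalised one trails it further, which is the point being made here.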







james888 said:


> I was hoping for more compute performance. The 750ti has incredible compute performance per watt. In F@H my overclocked 750ti gets about 70k points per day, similar to that of a 7870 or 660ti. Gaming wise, it is no where near them, but in compute it is really close.
> Are you sure maxwell won't have good compute performance?


Ah, I was referring to double precision rather than general compute performance - my bad - I should have been more specific. GM107's FP64 rate is 1:32, a decrease from low-end Kepler's 1:24. I pretty much expect GM204 to follow that trend relative to GK104.
Double precision is an overrated metric in general (although of seemingly variable importance to some people), but if GM204 is also intended for Quadro cards - as seems likely - it may be a bullet point for future SKUs.


----------



## Diverge (Sep 11, 2014)

When is NVIDIA going to get rid of the DVI ports and make the vent go from end to end? 

There's not much reason for bulky DVI ports anymore, when we have HDMI that easily converts to DVI with a passive adapter.


----------



## Nordic (Sep 11, 2014)

Diverge said:


> When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?
> 
> There's not much reason for bulky DVI ports anymore, when we have hdmi that easily covert to DVI with a passive adapter.


But I don't want to purchase a new cable to go with a new card, hypothetically that is. Really though, is the vent going end to end going to help cooling that much compared to the practicality of a DVI port? Just get a card where the card maker doesn't include a DVI port. Remember, there is no reference design.


----------



## Delta6326 (Sep 11, 2014)

I've been away for some time now, but did I miss GTX 8** or something?


----------



## HumanSmoke (Sep 11, 2014)

Diverge said:


> There's not much reason for bulky DVI ports anymore, when we have hdmi that easily covert to DVI with a passive adapter.


Except that HDMI 1.x doesn't natively support 21:9 aspect ratios - so all those people who buy 21:9 monitors are basically screwed for HDMI anyway - it's DVI or DP, and DP isn't found on all monitors (although most if not all ultrawides should have it).


Delta6326 said:


> I've been away for some time now, but did I miss GTX 8** or something?


Mobile.


----------



## silapakorn (Sep 11, 2014)

Delta6326 said:


> I've been away for some time now, but did I miss GTX 8** or something?



It's just a naming scam. Kinda like when some buildings skip the 13th floor entirely or name it "12a".


----------



## awesomesauce (Sep 11, 2014)

lovely tiny pcb...


----------



## 15th Warlock (Sep 11, 2014)

I hope I don't get too much flak for this, as it's my own personal opinion of the tech landscape so far the last 12 months:

First the whole Mantle delay fiasco, then the Devil's Canyon disappointment, Haswell-E's sort of meh release afterwards including some ridiculous markup on DDR4 modules, and now this...

I guess the days of 50~75% performance gains between enthusiast CPU and GPU generation jumps are far behind us, and now we have to just have to accept a paltry 10~15% jump or the same performance than last generation but at a discounted price...

Mobile and notebooks is where the current technology revolution is taking place, Core-M, Tegra K1, AMD's APUs...

I mean, you can still invest thousands of dollars on a monster build with a watercooled i7 5960X, 32GBs of DDR4, an ROG RVE X99 board and SLI GTX980s, but chances are, gaming performance wise, you'll probably just be 15~20% ahead of a 2~3 yr old rig...

Oh well, i look forward to the new breed of x86 tablets with enough performance to run most current games and battery life to last you a whole day, but I sure miss those quantum leaps in performance of days long gone by...


----------



## GhostRyder (Sep 11, 2014)

Delta6326 said:


> I've been away for some time now, but did I miss GTX 8** or something?


They named the mobile GPUs of the last Kepler series (and a few Maxwell variants) as the 8XX series.  On the desktop they decided to skip ahead for a little boost, in anticipation that these cards go way beyond the previous generation.



Diverge said:


> When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?
> 
> There's not much reason for bulky DVI ports anymore, when we have hdmi that easily covert to DVI with a passive adapter.


DVI is still a basic standard when it comes to gaming monitors, hence why they keep them.  Most people I run into at LAN parties still actually use them, while the rest of course now use HDMI, but it's still something in use.  If I were choosing, I would love to have just DisplayPorts and use adapters for everything, but that's a personal opinion.


----------



## wolf (Sep 11, 2014)

Looks quite promising from where I stand, with only preliminary specs and one benchmark.

Interested in power consumption figures. Keep in mind that driver optimization will always boost performance beyond what we are seeing now, and ~1600 shaders performing between 2300 and 2900 of the older type bodes very well for the Maxwell architecture.

Question is, is the fully unlocked GM204 1792 or 2048? And what about GM100?

Oh, and being pin-compatible with GK104 should do well for pricing, after the initial few months of sales when prices settle, naturally.

Interesting times ahead.


----------



## The Von Matrices (Sep 11, 2014)

At the moment is there any faster GDDR5 than 7GHz or will the GTX 980 have the same memory bandwidth as the GTX 970?  I remember a few scenarios where the GTX 680 performed nearly the same as the GTX 670 due to their identical memory bandwidths, and this new generation may have the same issue.


----------



## GhostRyder (Sep 11, 2014)

The Von Matrices said:


> At the moment is there any faster GDDR5 than 7GHz or will the GTX 980 have the same memory bandwidth as the GTX 970?  I remember a few scenarios where the GTX 670 performed no worse than the GTX 680 due to their identical memory bandwidths, and this new generation may have the same issue.


Most likely the memory will be the same, based on the early "leaks".  That's a good thing, because both the lower and higher cards will be ready for high-resolution needs even with the 256-bit bus.

I mean, if we go off that leak of course... plus those were overclocks, so it may end up being exactly the same, or we could find out things were hidden from us.


----------



## v12dock (Sep 11, 2014)

Anyone else notice ATItool in the background?


----------



## HumanSmoke (Sep 11, 2014)

The Von Matrices said:


> At the moment is there any faster GDDR5 than 7GHz or will the GTX 980 have the same memory bandwidth as the GTX 970?


Likely the same. 7 Gbps is the fastest GDDR5 at the moment from all three manufacturers:
- Elpida (Micron)
- SK Hynix
- Samsung


v12dock said:


> Anyone else notice ATItool in the background?


Software photobomb!


----------



## Lionheart (Sep 11, 2014)

Seems decent... I would love to pick up a GTX 980 if it surpasses the GTX 780 Ti, which it should!!! And also if the price is right.


----------



## Axaion (Sep 11, 2014)

Diverge said:


> When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?
> 
> There's not much reason for bulky DVI ports anymore, when we have hdmi that easily covert to DVI with a passive adapter.



>wanting HDMI 
>any year
>not displayport

Jesus christ just get a mac


----------



## johnspack (Sep 11, 2014)

Still waiting for my 770......


----------



## BiggieShady (Sep 11, 2014)

Well, the GTX 770 was less of an improvement over the GTX 670 than the GTX 970 is over the GTX 770. Skipping the 800 series made it even more obvious.


----------



## buildzoid (Sep 11, 2014)

HumanSmoke said:


> Except that HDMI 1.x doesn't natively support 21:9 aspect ratios - so all those people who buy 21:9 monitors are basically screwed for HDMI anyway - it's DVI or DP, and DP isn't found on all monitors (although most if not all ultrawides should have it).
> 
> Mobile.


I use HDMI to drive my 21:9 ASUS monitor at 83 Hz.


----------



## HumanSmoke (Sep 11, 2014)

buildzoid said:


> I use HDMI for to drive my 21:9 ASUS monitor at 83hz


Well, that's interesting - so I guess the issues people have with 21:9 aspect ratio monitors and TVs are firmware-related - scaler issues?
Seems a little odd that the HDMI 2.0 spec highlights native 21:9 aspect ratio ("Support for the wide angle theatrical 21:9 video aspect ratio") while the HDMI 1.4 spec doesn't mention it.
And maybe the Wiki page needs an update.
Nice to know that the issue isn't within the specification, although HDMI.org really needs to make the specification and compatibility clearer.


----------



## z1tu (Sep 11, 2014)

HumanSmoke said:


> Well I wouldn't get your expectations too high. The 3963 3DMark 11 Extreme score *supposedly* put the card (which carries a 5% overclock) between the 780 and 780Ti when using a Core i3-4130 ? Yet the same Core i3-4130 and a bog-standard 780 at reference clocks scores 4228. So it seems like the only "proof" (the score) doesn't actually back up WCCF's assertion of the cards ability. Am I surprised? No. Am I surprised that people just accept a claim without delving past the hyperbole? Somewhat.
> 
> 
> 
> ...



This is what I was thinking. Maybe I'm not very good at interpreting video card specs, but core and memory clocks aside, the 780 has more memory bandwidth, more CUDA cores, TMUs, and ROPs. I was under the impression that at least memory bandwidth and CUDA cores/shading units should count towards more performance, not just raw MHz.


----------



## hardcore_gamer (Sep 11, 2014)

It all depends on the price. If I can get a couple of 980Ti for <$900, and these perform well at 4K, I'm in. It's either 4K or GTFO.


----------



## Roel (Sep 11, 2014)

thebluebumblebee said:


> The sad truth is that there are no games pushing the hardware anymore.  Do remember how long it took for hardware to catch up to Crysis?



Games are actually pushing hardware pretty hard and GPUs can't catch up, that is, if you're using a newer monitor. 1080p 60 Hz is really old by now; it's like saying that Crysis didn't push hardware when you were gaming at 720p or even 540p. 1440p 60 Hz and 1080p 120+ Hz monitors have been around for quite a while already, and we're actually making the transition to 4K 60 Hz and 1440p 144 Hz. Try to use the full potential of such monitors in games like Battlefield 4 and you will see how hard they push your hardware. And then we're not even talking about surround gaming, where even the fastest 4-way SLI builds can't keep up.


----------



## CoolZone (Sep 11, 2014)

Will this card feature HDMI 2.0?


----------



## HumanSmoke (Sep 11, 2014)

CoolZone said:


> Will this card feature HDMI 2.0?


Apparently yes.


----------



## ensabrenoir (Sep 11, 2014)

.... The new #2 card (970) always matches or slightly surpasses last gen's #1 (780).  Most times the new #2 is last gen's #1 tweaked and rebadged.  So this is all good.  The real story will be the new #1 (980).


----------



## Diverge (Sep 11, 2014)

Axaion said:


> >wanting HDMI
> >any year
> >not displayport
> 
> Jesus christ just get a mac



I'm not sure what about my post makes you suggest I buy an Apple product... I don't own anything from Apple, and never will. Although the trashbin shaped mac pro is very interesting from a design and engineering point of view.


----------



## z1tu (Sep 11, 2014)

ensabrenoir said:


> .... new #2 (970)  card allways matches or slightly surpasses last gen #1(780 ti).  Most times new #2 is last gen's #1  tweeked and rebadged.  So this is all good.  The real story will be the new number 1(980).


Doesn't seem like the 970 will match or surpass the 780ti


----------



## ensabrenoir (Sep 11, 2014)

z1tu said:


> Doesn't seem like the 970 will match or surpass the 780ti


Oops should have said 780.....


----------



## Yellow&Nerdy? (Sep 11, 2014)

Considering that 16nm FINFET products might possibly be out as early as Q1 2015, these cards seem a little pointless. Waiting for the "true" next-gen cards will most likely net you a healthy boost in both performance and efficiency. 

If these leaks are true, the GTX 970 seems like a decent card though. I'm just hoping the pricing is going to be on point as well.


----------



## Casecutter (Sep 11, 2014)

Huum, a Core i3-4130 on 32-bit... Notice what Fudzilla found: "One odd detail is the date displayed in 3DMark. It is January 17th 2014, which suggests one of two things. While this could be an elaborate hoax, that would not explain the photos of the actual card. It could just mean someone was lazy and did not set the right time on the test rig."

I'd take this with... salt.


----------



## v12dock (Sep 11, 2014)

x3828 on my r9 280x


----------



## rtwjunkie (Sep 11, 2014)

Yellow&Nerdy? said:


> Considering that 16nm FINFET products might possibly be out as early as Q1 2015, these cards seem a little pointless. Waiting for the "true" next-gen cards will most likely net you a healthy boost in both performance and efficiency.
> 
> If these leaks are true, the GTX 970 seems like a decent card though. I'm just hoping the pricing is going to be on point as well.


 
Yep, I'm awaiting GM210, which is likely the true next generation on the smaller die.  But the 970 on GM204 should be a decent enough card anyway for those who need the upgrade, which is most people on 6-series and below.


----------



## Sony Xperia S (Sep 11, 2014)

Yellow&Nerdy? said:


> Considering that 16nm FINFET products might possibly be out as early as Q1 2015, these cards seem a little pointless.



These cards are good only because they bring somewhat lower power consumption. Of course, they don't bring anything new to the table regarding performance improvements, but I hope they will set good pricing, because that is what really counts and is vital.


About 16 nm, I am afraid you are overly optimistic. I guess that even if technically there are no obstacles, from a marketing perspective they will do their "best" to postpone the launch until at least Q2 2015. 

*Or even into 2016. *


----------



## newtekie1 (Sep 11, 2014)

thebluebumblebee said:


> Bummer.  There goes the hope for a lower power middle weight.  That's good for up to 300 watts.  I was looking for something around the GTX 780's performance while using 125-150 watts.  I do see that the other board shots show 2x6 pin.





btarunr said:


> The similarity in PCB design, if nothing, shows that the GTX 970 will be as energy-efficient as the GTX 670.






thebluebumblebee said:


> But it's obvious that Galaxy thinks that it can use more than the 225 watts that 2x6 pin (75 watts from the PCI-e bus) can provide.



Nope, they just like to stick an 8-pin on there to make consumers think the card is more powerful than it really is.  If the chip is pin-compatible with GK104, and the 970 is using the same amount of power as a 670, then I'm willing to bet the 980 will use roughly the same amount of power as a 680.


----------



## thebluebumblebee (Sep 11, 2014)

It's one big conspiracy to get us to buy bigger PSUs than we _need_.  A person would not _need_ a PSU larger than 500 watts to run one of these cards in a _normal_ desktop, but the 8+6-pin requirement will push the PSU to the 650-watt range.


----------



## btarunr (Sep 12, 2014)

CoolZone said:


> Will this card feature HDMI 2.0?



Yes. Ultra HD @ 60 Hz over HDMI. You will need an HDMI 2.0 compatible monitor, though.


----------



## newconroer (Sep 13, 2014)

LAN_deRf_HA said:


> Without a die shrink never expect much. Always hate it when this happens because you don't get as much performance jump and then when the die shrunk cards do comes its another small jump.



What's worse is that naive and uninformed consumers will buy these unnecessary cards, and then sell their old ones, which aren't that much different to begin with. NVIDIA could release nothing until Pascal and the market would be better off.


----------



## RealNeil (Sep 13, 2014)

GhostRyder said:


> its between the 780 and 780ti and its a X70 counterpart which is very good and could confirm that the 980 is going to be better than the 780ti even if it is not by an extreme amount.



A 980Ti with 6GB of RAM would be nice to see. (in my PC)


----------



## bubbleawsome (Sep 14, 2014)

thebluebumblebee said:


> It's one big conspiracy to get us to buy bigger PSU's than we _need_.  A person would not _need_ a PSU larger than 500 watts to run one of these cards in a _normal_ desktop, but the 8+6 pin requirement will push the PSU to the 650 watt range.


Agreed. I'm lucky my 550w has 2x8pin. It powers the fastest 7970 I can find perfectly, but the requirement page says 650w. (The entire thing pulls maybe 300w.)


----------



## RealNeil (Sep 14, 2014)

They always ask for more power than they really need (to account for low-quality PSUs that do not deliver what they claim).


----------



## natr0n (Sep 15, 2014)

Dude holding the card needs a nail clipper asap.


----------



## Prima.Vera (Sep 15, 2014)

bubbleawsome said:


> I'm happy because 4GB vRAM


Indeed. 
That will give you a 200% increase in performance on your Samsung SyncMaster 940BF 1280x1024 monitor.


----------

