# GIGABYTE GeForce GTX 680 Graphics Card Pictured



## btarunr (Mar 19, 2012)

Christmas came early for Overclock.net forum member "ironman86", who showed off his swanky new GeForce GTX 680 graphics card, branded by GIGABYTE (model: GV-N680D5-2GD-B). The card sticks to the NVIDIA reference design to the letter, except of course for a sci-fi sticker on the cooler shroud and fan: a futuristic art piece with the GIGABYTE and GTX 680 logos (the "8" obscured by the protective plastic film). The card indeed draws power from two 6-pin power connectors, settling the power-connector debate once and for all, since this is the first picture of a retail-channel GTX 680.



 

 



*View at TechPowerUp Main Site*


----------



## btarunr (Mar 19, 2012)

Many Thanks to NHKS for the tip.


----------



## arnoo1 (Mar 19, 2012)

Lucky bastard!
I want it too.
I hope he leaks benchmarks and stuff


----------



## [H]@RD5TUFF (Mar 19, 2012)

DO WANT! Well, perhaps a card from a brand that doesn't suck, but still sorta jelly!


----------



## djxinator (Mar 19, 2012)

He intends to leak the benchmarks on Wednesday. I stumbled upon this at about 8am GMT and posted it everywhere, but it seemed to fall on deaf ears. I'm glad it's properly out in the open though xD


----------



## RejZoR (Mar 19, 2012)

I like the way NVIDIA cards always come with exhaust coolers. With AMD, it's nearly impossible to get any exhaust-cooled cards a few months after release. I remember how hard it was to find an HD 5850 with the reference cooler. And even after I managed to find one, the fan started rattling horribly after a few months. Very annoying. Luckily I don't need exhaust cards anymore with my current case, but I still wouldn't mind them, since they dump all the heat outside, keeping the case internal temperature much lower.


----------



## buggalugs (Mar 19, 2012)

RejZoR said:


> I like the way NVIDIA cards always come with exhaust coolers.



 You obviously don't care about a card that sounds like a vacuum cleaner and runs 20 degrees hotter.

 Ridiculous point anyway; AMD have released reference exhaust coolers for the last five generations. With the 7 series, AMD chose to release reference and non-reference at the same time to give everyone a choice.

 There are plenty of people who refuse to buy noisy reference blower coolers. Once you've had a high-end GPU with a silent cooler, you never go back to vacuum-cleaner coolers.


----------



## xenocide (Mar 19, 2012)

I've had both, minimal difference unless you go very high end.


----------



## NHKS (Mar 19, 2012)

I do like EVGA's decal (and color) on the reference GTX 580 (and 570) more than other brands' reference cards.. waiting to see what EVGA has got for the 680.. 
since I saw the first pics of the 680, I have felt that the 'GEFORCE GTX' logo could have been a tad smaller..


----------



## Crap Daddy (Mar 19, 2012)

There is no DX11.1 support written on the box. What could this be?


----------



## amdftw (Mar 19, 2012)

Confirmed! The stock clock is 706/1502 MHz.
That way it's easy to see how the GTX 680 stays within its 195 W TDP.


----------



## Protagonist (Mar 19, 2012)

Does it support 4K? Does it support DX 11.1?


----------



## NHKS (Mar 19, 2012)

Crap Daddy said:


> There is no DX11.1 support written on the box. What could this be?



good spotting!.. I have been having my doubts about the DX 11.1 support of the GTX 680.. some specs say 11.1 while some say only 11.. so maybe DX11 it is.. 
 will it make much difference? 
or more bragging rights for AMD?


----------



## lZKoce (Mar 19, 2012)

That is SOOO COOL... one fire-breathing beast right there  . You have to be geek all the way baby  

To the nVidia marketing team: you did it. Now your cause is my cause, and I am ready to work tirelessly to spread the glory of the green team and the use of CUDA and PhysX in the professional world of computing. Naahh, I am just joking. It's just that green is my favourite colour anyway


----------



## legends84 (Mar 19, 2012)

just saw one in a store today..  and expensive...


----------



## NHKS (Mar 19, 2012)

probably u r one of the first TPU members to touch the retail box! .. thanks for the pic

u said u saw the price, but is there any difference in prices between 7970 & 680 currently?


----------



## Crap Daddy (Mar 19, 2012)

amdftw said:


> Confirmed! The stock clock is 706/1502 MHz.
> That way it's easy to see how the GTX 680 stays within its 195 W TDP.



And how do you know? Link to official specs please.


----------



## Rahmat Sofyan (Mar 19, 2012)

*???*



legends84 said:


> just saw one in store today..  and expensive...



already on sale in Malaysia dude ??? how much in ringgit ???


----------



## legends84 (Mar 19, 2012)

NHKS said:


> probably u r one of the first TPU members to touch the retail box! .. thanks for the pic
> 
> u said u saw the price, but is there any difference in prices between 7970 & 680 currently?



for gigabyte 7970 is around MYR1999
gigabyte  GTX680 maybe around MYR2100


----------



## Rahmat Sofyan (Mar 19, 2012)

*...*



Crap Daddy said:


> And how do you know? Link to official specs please.



I guess from this link overclock.net

this is the pic :


----------



## djxinator (Mar 19, 2012)

Everyone pile into the OCN thread, he's got it loaded up (WITH SCREENIES) on his Windows 7 32-bit Core i3 system xD

Still, Heaven shouldn't be bottlenecked by it - I think (THINK) he's benching it as we speak.

I'M SCARED GUISE I DON'T WANT THE UNIVERSE TO IMPLODE


----------



## Crap Daddy (Mar 19, 2012)

Rahmat Sofyan said:


> I guess from this link overclock.net
> 
> this is the pic :
> 
> http://cdn.overclock.net/7/72/725a8b9d_gpuz.png



There is no public version of GPU-z that supports Kepler.


----------



## hhumas (Mar 19, 2012)

I want evga one


----------



## Rahmat Sofyan (Mar 19, 2012)

*..*



Crap Daddy said:


> There is no public version of GPU-z that supports Kepler.



yeah, I'm just guessing... we'll have to wait for GPU-Z 0.6.0 to support it.


----------



## Protagonist (Mar 19, 2012)

*DOES IT SUPPORT 4K* and *DirectX 11.1*?


----------



## dj-electric (Mar 19, 2012)

Does it matter that much to you?


----------



## djxinator (Mar 19, 2012)

Its been benched


----------



## NHKS (Mar 19, 2012)

st.bone said:


> *DOES IT SUPPORT 4K* and *DirectX 11.1*?



i guess it will support 4K res like the 7970.. but DX11.1 I'm not sure.. the Gigabyte box only says DX11.. still no official info on either of these specs..

that 'Heaven benchmark' is @ 1600x900.. ironman86 @ OCN probably doesn't have a full-HD monitor
 he says - "sorry for not in 1080p resolution,i only got 20" inch monitor to test "

earlier when someone asked him to confirm the clocks in the NVIDIA control panel.. he did and said - "i see is same clock as GPU-Z,so im not posted it"


----------



## Crap Daddy (Mar 19, 2012)

NHKS said:


> i guess it will support 4K res like 7970.. but DX11.1 not sure.. the Gigabyte box says only DX11..  still no official info on either of these specs..
> 
> that 'Heaven benchmark' is @ 1600x900.. ironman86 @ OCN probably doesn't have a full HD monitor
> he says - "sorry for not in 1080p resolution,i only got 20" inch monitor to test "
> ...



Hmm. Is he running the card at lowish clocks because the results are pretty bad?


----------



## NHKS (Mar 19, 2012)

Crap Daddy said:


> Hmm. Is he running the card at lowish clocks because the results are pretty bad?



i guess not.. his comment on the clocks (shown by the NV control panel) came before his Heaven benchmark..

also his CPU is an i3-2100 and a 32-bit OS.. a bottleneck could be one of the reasons.. but the Heaven scores cannot be compared directly...

UPDATE: just to compare with ironman86's GTX 680 Heaven score of 1362, one of the members at OCN did a Heaven run using a 7970 @ 1600x900 (he has an i7) and here is his score


----------



## Rahmat Sofyan (Mar 19, 2012)

He uses an i3-2100, maybe a bottleneck ???


----------



## Crap Daddy (Mar 19, 2012)

NHKS said:


> i guess not.. his comment on the clocks(shown by nV control panel) was before his heaven benchmark..
> 
> also his CPU is i3-2100 & 32-bit OS.. bottleneck could be one of the reasons.. but the heaven scores cannot be compared directly...



Something is wrong there. He also said that the clocks (700 MHz) are shown in the NV CP, but I still believe that this is not the base clock for the GTX 680, hence the poor results. 

I also don't think we are talking about any bottleneck, as Crysis 1 doesn't need more than 2 fast cores, which the i3 provides in plenty, and he says something around 50 FPS with only 4xAA at that resolution. If I remember correctly I had above 50 at 1680x1050 (a higher res) with my GTX 570.

So it's either the BIOS, the driver or settings in the NV CP.


----------



## Shurakai (Mar 19, 2012)

Seems pretty on par. You can't take the minimum fps into account, because I bet he ran that test ASAP without letting it loop a bit, hence the very low 8 fps for a split second, but it's still an acceptable average.


----------



## [H]@RD5TUFF (Mar 19, 2012)

I'm thinking that i3 gimped the results.


----------



## dj-electric (Mar 19, 2012)

I don't; the Unigine Heaven test gives the same results with either an i3-2100 or an i5-2500


----------



## Protagonist (Mar 19, 2012)

Dj-ElectriC said:


> Does it matter that much to you?


DOES IT SUPPORT 4K and DirectX 11.1?

Yes it does matter, coz if it doesn't support 4K I won't buy it, and if it doesn't do DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which supports all of this. So basically, to me, if it doesn't support these features it's a worthless card and, generally speaking, not fast; without those features it would be a cranked-up last-gen GPU by the name of Kepler.

Kind of like how the Nvidia 8, 9, 100 and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX 10.1.


----------



## dj-electric (Mar 19, 2012)

CAPS. I DON'T KNOW, IT ISN'T OUT YET.

And the lack of DX10.1 support didn't stop the GTX 260 216SP from whooping the HD 4870's behind...


----------



## beck24 (Mar 19, 2012)

Official benchies soon. These are rubbish.


----------



## dj-electric (Mar 19, 2012)

Alright, I had this aluminum biscuit tested again


----------



## Protagonist (Mar 19, 2012)

Dj-ElectriC said:


> CAPS. I DON'T KNOW, IT ISN'T OUT YET.
> 
> And the lack of DX10.1 support didn't stop the GTX 260 216SP from whooping the HD 4870's behind...



At least someone who had an HD 4870 could see the details in DX10.1, unlike those who had the Nvidia 8-through-200 series, who could not.

I had a 9800 GTX then, and I won't repeat the same mistake; I will go for features this time round for my next GPU purchase.

And for the record, I have used both camps. I'm currently on a GeForce GTX 460 1GB (336 CUDA cores, 256-bit), which I bought in July 2010, and it replaced my AMD Radeon HD 5770. Otherwise it's a long list of GeForce, Radeon & Intel iGPs. 
Currently on 1920x1080 and not switching higher soon; maybe in 2013 I'll switch to the 3K or 4K displays that will be available


----------



## lZKoce (Mar 19, 2012)

st.bone said:


> DOES IT SUPPORT 4K and DirectX 11.1?
> 
> Yes it does matter, coz if it doesn't support 4K I won't buy it, and if it doesn't do DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which supports all of this. So basically, to me, if it doesn't support these features it's a worthless card and, generally speaking, not fast; without those features it would be a cranked-up last-gen GPU by the name of Kepler.
> 
> Kind of like how the Nvidia 8, 9, 100 and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX 10.1.



(facepalm). Does the lack of 11.*1* have a HUGE impact on your *productivity level*, such as wind-tunnel testing, programming, rendering, and actual income from using it? And the card is not out yet, so on what basis did you deduce that it is "not fast and worthless"? I just don't know


----------



## Protagonist (Mar 19, 2012)

lZKoce said:


> (facepalm). Does the lack of 11.*1* have a HUGE impact on your *productivity level*, such as wind-tunnel testing, programming, rendering, and actual income from using it? And the card is not out yet, so on what basis did you deduce that it is "not fast and worthless"? I just don't know



It's my money I am planning to spend on a new GPU to replace my GTX 460, which has served me very well, no complaints. But if it turns out that the GTX 680 has no 4K and DX 11.1 support, then it should cost less (hence "worthless"), and I still would not buy it even if it were less; my current card does a great job without supporting those features.
So I want to spend my money on features... and a faster card than what I have currently, which does a great job anyway.


----------



## Crap Daddy (Mar 19, 2012)

st.bone said:


> It's my money I am planning to spend on a new GPU to replace my GTX 460, which has served me very well, no complaints. But if it turns out that the GTX 680 has no 4K and DX 11.1 support, then it should cost less (hence "worthless"), and I still would not buy it even if it were less; my current card does a great job without supporting those features.
> So I want to spend my money on features... and a faster card than what I have currently, which does a great job anyway.



You know that the GTX680 has only 2GB of memory? That's not a concern for you?


----------



## lZKoce (Mar 19, 2012)

st.bone said:


> It's my money I am planning to spend on a new GPU to replace my GTX 460, which has served me very well, no complaints. But if it turns out that the GTX 680 has no 4K and DX 11.1 support, then it should cost less (hence "worthless"), and I still would not buy it even if it were less; my current card does a great job without supporting those features.
> So I want to spend my money on features... and a faster card than what I have currently, which does a great job anyway.



Of course it's your money. I don't want to be intrusive or rude. I was just pointing out that these arguments don't make sense to me personally. But I am biased anyway. So I am sorry if I caused any disturbance.


----------



## Protagonist (Mar 19, 2012)

Crap Daddy said:


> You know that the GTX680 has only 2GB of memory? That's not a concern for you?



No, it's not.

At least Intel Ivy Bridge processors will support 4K; I plan on getting a 4K or 3K display next year, in 2013.


----------



## dj-electric (Mar 19, 2012)

Common sense just left the building...


----------



## Count Shagula (Mar 19, 2012)

st.bone said:


> No, it's not.
> 
> At least Intel Ivy Bridge processors will support 4K; I plan on getting a 4K or 3K display next year, in 2013.



4096×3072.

I cannot even begin to imagine the cost of such a screen and what power would be needed to power games on it.
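The worry has simple arithmetic behind it. A quick sketch of the pixel counts involved, using the 4096x3072 figure quoted above against a common 1920x1080 panel (the "x the work" ratio is a rough proxy for shading load, nothing more):

```python
# Rough pixel-count arithmetic for the resolutions discussed above.
# 4K here means the quoted 4096x3072 (4:3) figure, vs. a 1920x1080 panel.

def pixels(width, height):
    """Total pixels per rendered frame."""
    return width * height

res_1080p = pixels(1920, 1080)   # 2,073,600 px
res_4k    = pixels(4096, 3072)   # 12,582,912 px

# The GPU must shade roughly this many times more pixels per frame:
scale = res_4k / res_1080p       # ~6.07x

print(f"1080p: {res_1080p:,} px")
print(f"4K   : {res_4k:,} px ({scale:.1f}x the work)")
```

At roughly six times the pixels of 1080p, a single card that manages 60 fps at 1080p would land nearer 10 fps at that resolution, all else equal, which is why multi-card setups keep coming up in this thread.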


----------



## Protagonist (Mar 19, 2012)

Count Shagula said:


> 4096×3072.
> 
> I cannot even begin to imagine the cost of such a screen and what power would be needed to power games on it.



I mean 4K (4096x2xxx) 

or 3K (3860x2xxx)


money is no problem


----------



## brandonwh64 (Mar 19, 2012)

st.bone said:


> I mean 4K (4096x2xxx)
> 
> or 3K (3860x2xxx)
> 
> ...



Even so, I doubt this card alone would handle that res in games such as BF3. you would need at minimum, 2 or more.


----------



## Protagonist (Mar 19, 2012)

brandonwh64 said:


> Even so, I doubt this card alone would handle that res in games such as BF3. you would need at minimum, 2 or more.



4K or 3K I do not intend for gaming, mostly for photo editing


----------



## brandonwh64 (Mar 19, 2012)

st.bone said:


> 4K or 3K I do not intend for gaming, mostly for photo editing



fair enough


----------



## Bjorn_Of_Iceland (Mar 19, 2012)

That i3 bench has the i3 bottlenecking it, I guess.. I can get 53-something fps with this 775 quad and a GTX 580


----------



## Rahmat Sofyan (Mar 19, 2012)

more clear image


----------



## okidna (Mar 19, 2012)

In case someone is wondering:






http://www.overclock.net/t/1231113/gigabyte-gtx-680-2gb-already-arrive-at-my-shop/350#post_16750252


----------



## Delta6326 (Mar 19, 2012)

I think the 7970 and 680 are going to be neck and neck.


----------



## MxPhenom 216 (Mar 19, 2012)

amdftw said:


> Confirmed! The stock clock is 706/1502 MHz.
> That way it's easy to see how the GTX 680 stays within its 195 W TDP.



CONFIRMED!

......what?

Where in the article does it say the stock clock is 706 MHz on the core, lol? 


EDIT: I will definitely be getting one of these cards for sure around summer time.


----------



## 15th Warlock (Mar 19, 2012)

st.bone said:


> DOES IT SUPPORT 4K and *DirectX 11.1*?
> 
> Yes it does matter, coz if it doesn't support 4K I won't buy it, *and if it doesn't do DX11.1 I still won't buy it*; I'll get an AMD Radeon HD 7xxx, which supports all of this. *So basically, to me, if it doesn't support these features it's a worthless card and, generally speaking, not fast*; without those features it would be a cranked-up last-gen GPU by the name of Kepler.
> 
> Kind of like how the Nvidia 8, 9, 100 and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX 10.1.





st.bone said:


> *4K or 3K I do not intend for gaming, mostly for photo editing*



I don't know if you're trolling... :shadedshu


----------



## amdftw (Mar 19, 2012)

nvidiaintelftw said:


> CONFIRMED!
> 
> ......what?
> 
> ...



Can you read this?
The GPU-Z capture.


----------



## MxPhenom 216 (Mar 19, 2012)

15th Warlock said:


> I don't know if you're trolling... :shadedshu



god yeah wtf.

DX10.1 didn't do shit for ATI cards. The GTX260 Core 216 was still raping them.


----------



## MxPhenom 216 (Mar 19, 2012)

amdftw said:


> Can you read this?



and so that means, according to some other stuff that has been released, they are capable of a 63% overclock.

That guy's i3 is holding the 680 back a bit, but it still beat a guy's 7970 at 706 MHz, unless the dynamic clock stuff took it to 1006. Either way, once drivers are better (since the minimum FPS seems a bit low), I think this card will take off.


----------



## okidna (Mar 19, 2012)

I notice that the driver version is different. 
300.99 in the earlier SLI leak and 300.65 in this Gigabyte retail.


----------



## semantics (Mar 19, 2012)

okidna said:


> I notice that the driver version is different.
> 300.99 in the earlier SLI leak and 300.65 in this Gigabyte retail.


Not that surprising that a retail box would have old drivers for a new card, which will likely go through several driver updates in the months following release. Gotta pre-package that stuff weeks in advance.


----------



## Crap Daddy (Mar 19, 2012)

amdftw said:


> Can you read this?
> The Gpu-z capture.



That version of GPU-z DOES NOT SUPPORT KEPLER. So we don't really know.


----------



## amdftw (Mar 19, 2012)

Crap Daddy said:


> That version of GPU-z DOES NOT SUPPORT KEPLER. So we don't really know.



So you cannot read...
He said the NV control panel showed the same clocks!


----------



## Crap Daddy (Mar 19, 2012)

amdftw said:


> So you cannot read...
> He said the NV control panel showed the same clocks!



OK wise guy, look here: same forum, same thread, another apparent owner of a GTX 680 a little later:






The right GPU-z version.


----------



## Protagonist (Mar 19, 2012)

Crap Daddy said:


> OK wise guy, look here: same forum, same thread, another apparent owner of a GTX 680 a little later:
> 
> http://i43.tinypic.com/2am93r.jpg
> 
> ...



Thanks for this. At least my mind is at peace knowing it supports DX11.1, according to GPU-Z 0.6.0. Now just to confirm whether it supports 4K; if so, I might still go Nvidia for my next GPU upgrade, though the AMD Radeon HD 7970 seems appealing.


----------



## Crap Daddy (Mar 19, 2012)

I really don't think that's important right now. Things are really heating up over at Overclockers.
Look at the unigine result and compare.


----------



## ChristTheGreat (Mar 19, 2012)

There is one guy showing the card doing about 2000 in the Heaven benchmark.. and another showed a GTX 580 SLI score of 2200...


----------



## MxPhenom 216 (Mar 19, 2012)

Crap Daddy said:


> I really don't think that's important right now. Things are really heating up over at Overclockers.
> Look at the unigine result and compare.



linky. I want to watch/read it.


----------



## Count Shagula (Mar 19, 2012)

http://www.overclock.net/t/1231113/gigabyte-gtx-680-2gb-already-arrive-at-my-shop/510


----------



## Crap Daddy (Mar 19, 2012)

So GPU-Z reads: 294 mm² and release date: Mar 22, 2012 

Soon...


----------



## Casecutter (Mar 19, 2012)

nvidiaintelftw said:


> and so then that means according to some other stuff that has been released they are capable of a 63% overclock.


Yep! That's what the whole Adaptive V-Sync (also what was termed "dynamic profiles") does... it manipulates the GPU/memory clocks, and enables or restrains sections of CUDA cores as needed, to keep frame-rate transitions "fluid and/or organic". Basically, when a 3D scene starts dropping frame rates below the monitor refresh rate, or rising above it, the card augments or limits those resources, trying to stay close to 60 FPS. As the scene loads the GPU, the chip and card resources provide exactly as much energy as each scene requires for smooth frame rates. So within milliseconds the card juggles various resources dynamically against preset profiles.

What's nice is that Nvidia no longer needs to supply a cooling system built to handle constant 40-50% top-OC spikes; its thermal rating can be scaled back, because the profiles will chiefly hold the lowest clock the 3D load and required FPS allow. A 40-50% boost might last a few seconds and then drop again. That permits a theoretically higher max TDP, because the card turns up the heat that high very infrequently. The only downside is that the conventional overclocking enthusiasts are used to may no longer be there; if you disable the dynamic behaviour, the chip has a consistent TDP of 160 W, OC'd at say 775 MHz.
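The behaviour described above boils down to a feedback loop: boost the clock while the frame rate is short of the target and there is power headroom, back off otherwise. A toy sketch of that idea follows; every number (base clock, boost ceiling, step size, power figures) is invented for illustration, since NVIDIA's actual boost algorithm and tables are not public:

```python
# Toy feedback loop in the spirit of the dynamic-clock scheme described
# above: chase a 60 fps target within a power budget. All values are
# illustrative, not real GTX 680 firmware numbers.

BASE_CLOCK = 1006      # MHz, assumed base clock
MAX_BOOST  = 1110      # MHz, assumed boost ceiling
STEP       = 13        # MHz per adjustment step
TARGET_FPS = 60
TDP_LIMIT  = 195       # watts, assumed board power budget

def next_clock(clock, fps, power_draw):
    """One control step: boost when slow and under budget, back off otherwise."""
    if fps < TARGET_FPS and power_draw < TDP_LIMIT:
        return min(clock + STEP, MAX_BOOST)      # headroom: ramp up
    if fps > TARGET_FPS or power_draw >= TDP_LIMIT:
        return max(clock - STEP, BASE_CLOCK)     # over target or budget: back off
    return clock                                 # on target: hold

# Simulate a slow scene with power headroom: clocks ramp up step by step.
clock = BASE_CLOCK
for _ in range(5):
    clock = next_clock(clock, fps=48, power_draw=170)
print(clock)  # 1071
```

The point of the sketch is the asymmetry Casecutter describes: the card only sits at the ceiling while a heavy scene demands it, so the cooler is sized for the typical clock, not the peak.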


----------



## rvalencia (Mar 20, 2012)

nvidiaintelftw said:


> god yeah wtf.
> 
> DX10.1 didn't do shit for ATI cards. The GTX260 Core 216 was still raping them.


NVIDIA implemented a kit-bashed subset of DX10.1 in their "DX10.0" CUDA cards, i.e. NVIDIA actively undermined MS's DX10.1 standard.

http://www.bit-tech.net/news/hardware/2008/10/22/nvidia-gpus-support-dx10-1-features-in-far-cry-2/1


----------



## alexsubri (Mar 20, 2012)

I'm still worried about this... it's 256-bit and not 384-bit... it's 2GB and not 3GB... so if you were to play on NVIDIA Surround (NVIDIA's Eyefinity) you won't get the best bang for your buck. For the price to be 15% higher than a 7970, it had better damn well do better than a 7970! I just don't see the comparison that much here... BTW, what's with the 6-pin connectors?


----------



## [H]@RD5TUFF (Mar 20, 2012)

alexsubri said:


> I'm still worried about this... it's 256-bit and not 384-bit... it's 2GB and not 3GB... so if you were to play on NVIDIA Surround (NVIDIA's Eyefinity) you won't get the best bang for your buck. For the price to be 15% higher than a 7970, it had better damn well do better than a 7970! I just don't see the comparison that much here... BTW, what's with the 6-pin connectors? http://www.techpowerup.com/forums/attachment.php?attachmentid=46289&stc=1&d=1332212566



It's not fugly at all; also, why would you care how the PCI-E power ports look, as you will never see them when the card is plugged in? That's a pretty lame gripe... but to each their own, I suppose.


----------



## jaredpace (Mar 20, 2012)

7970 OC at 1005/1500 (default CCC)






GTX680 stock at 1006/1500 with boost





7970 OC at 1005/1500 (with AMD-optimized tessellation selected in CCC)





http://www.xtremesystems.org/forums...orce-GTX-780&p=5071228&viewfull=1#post5071228


----------



## MxPhenom 216 (Mar 20, 2012)

jaredpace said:


> 7970 OC at 1005/1500 (default CCC)
> http://img20.imageshack.us/img20/2950/heaven2012032008491694.jpg
> 
> GTX680 stock at 1006/1500 with boost
> ...



AMD-optimized tessellation?


----------



## TRWOV (Mar 20, 2012)

Rahmat Sofyan said:


> more clear image
> 
> http://cdn.overclock.net/d/d7/d7af8...1358_168500823188738_917052_1372456462_n.jpeg



Imagine if that was an actual CCFL


----------



## alexsubri (Mar 20, 2012)

[H]@RD5TUFF said:


> It's not fugly at all, also why would you care how the PCI-E power ports look as you will never see them when the card is plugged in, that's a pretty lame gripe . ..  but to each their own I suppose.



I'm more into Graphical Design


----------



## radrok (Mar 20, 2012)

nvidiaintelftw said:


> AMD-optimized tessellation?



http://forums.guru3d.com/showthread.php?t=337224

And since Unigine is a tessellation-heavy benchmark, optimizing this value makes a huge difference.


----------



## semantics (Mar 20, 2012)

Benchmarks put unrealistic and nonsensical amounts of load on GPU tasks; who would have known.


----------



## Rahmat Sofyan (Mar 20, 2012)

*...*



TRWOV said:


> Imagine if that was an actual CCFL



hahaha...LOL, it'll increase the heat I guess


----------



## CrAsHnBuRnXp (Mar 20, 2012)

st.bone said:


> *DOES IT SUPPORT 4K* and *DirectX 11.1*?



Was the bold necessary?



> The new piece of information is that Kepler will support PCIE Gen 3 and DirectX 11.1, so it’s all set for Windows 8.



Source

Simple Google search; the first result yielded that. Not sure what you mean by 4K, though.



st.bone said:


> DOES IT SUPPORT 4K and DirectX 11.1?
> 
> Yes it does matter, coz if it doesn't support 4K I won't buy it, and if it doesn't do DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which supports all of this. So basically, to me, if it doesn't support these features it's a worthless card and, generally speaking, not fast; without those features it would be a cranked-up last-gen GPU by the name of Kepler.
> 
> Kind of like how the Nvidia 8, 9, 100 and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX 10.1.



Ok, now this just sounds to me like someone who is brain-dead and just wants the latest and greatest, and thinks a .1 difference must mean it's super better than the regular non-.1 version, because they really have no tech knowledge to begin with.



> Kind of like how the Nvidia 8, 9, 100 and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX 10.1.



And even though those cards didn't have that .1, didn't Nvidia still whoop their ass?


----------



## rvalencia (Mar 20, 2012)

CrAsHnBuRnXp said:


> Was the bold necessary?
> 
> 
> 
> ...


NV's DX10.x kitbash had some DX10.1-like features.


----------



## micropage7 (Mar 20, 2012)

I dunno, what's the point of placing the connector like that?


----------



## [H]@RD5TUFF (Mar 20, 2012)

micropage7 said:


> http://www.techpowerup.com/img/12-03-19/192c.jpg
> I dunno, what's the point of placing the connector like that?



Cable management is my only guess, at least they are trying something new, instead of cranking out the same old stuff.


----------



## HuLkY (Mar 20, 2012)

The design SUCKS! What the hell is this "GeForce GTX" thing! At least bring back the old GTX 280 glowing LED, and no backplate?! 

In EVGA We TRUST


----------



## dj-electric (Mar 20, 2012)

People must understand that although the GTX 680 is a high-end card, its replacement will come relatively fast. It's just NVIDIA's quick answer to the HD 7900.
Don't get bummed about not getting high-end looks on this card.
"In EVGA We TRUST"? Makes no sense.
I will trust whoever gives me a good product.


----------



## Casecutter (Mar 20, 2012)

micropage7 said:


> i dunno whats the point they place the connector like that?


Probably has to do with freeing up room on the PCB for traces. The card is built from $250-300 worth of parts and components, but with that you get firmware/software optimizations that steer gameplay toward 60 FPS... right? 

Can we keep using the normal maximum/average/minimum way of measuring performance when 60 FPS is always the target? From what I see, this thing promotes the "average" and nothing higher. It's a way of unearthing efficiency and lowering thermals, while giving it a shot of nitrous when it isn't keeping up with the mean! So will all the old data and graphs necessarily translate apples-to-apples as in the past? Will it look and play great... in theory yes, though will Adaptive V-Sync sire new glitchiness in place of the traditional lag and spikes as we've known them? Averaging 60 FPS will not look all that different from, say, a 7970 that hits 80 FPS measured the old way (that's what Nvidia is betting on). In my mind this kills the whole "I'm fastest" race and changes it to "I'm the same at providing the average", because no one can tell the difference. Kind of the old "more than a mouthful is a waste"!

I don't know what this means now, but traditional testing like W1zzard has been doing may have very little merit any longer. We might need more graphs like [H]ard|OCP's "spiky" plots of the game as played; although now there will just be a slightly wavy green line hugging 60 FPS. Well, that's boring, and it might really minimize graphics card reviews as we know them: sure, it plays BF3... 60 FPS.

It feels like Nvidia took the ball, ran into left field and is saying "the game as you knew it has changed". Which isn't wrong or bad, but it really feels like the old way of picking the best card has changed... while the price remains the same! Now, here's a question: why even have enthusiast cards? In theory, any GPU strong enough to run the newest or most demanding titles, as long as it can clock fast enough for the few milliseconds needed to keep the game from dipping below 60 FPS, is all the best new offerings need to be. The new mantra will be "we can render that @ XXXX resolution with this level/type of AA" (as in Nvidia's TXAA anti-aliasing algorithm). It's no longer about being fastest or faster. So if AMD takes Cape Verde and does this, are we all okay with it?
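The reviewing problem raised above is easy to see in numbers: once cards steer toward a 60 FPS target, a single "average fps" figure flattens out, and only per-frame data (the "spiky" plots) separates a card with headroom from one barely keeping up. A minimal sketch with invented frame times, purely to illustrate the statistics:

```python
# Why capped averages hide the difference between cards: compare the
# usual avg/min fps summary against raw per-frame render times (ms).
# All frame times below are invented for illustration.

def fps_stats(frame_times_ms):
    """Average and worst-case fps from a list of per-frame render times (ms)."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Card A: lots of headroom; boost logic parks it at ~16.7 ms per frame.
card_a = [16.7] * 8
# Card B: barely keeps up, with one 40 ms spike (a visible stutter).
card_b = [16.7] * 7 + [40.0]

avg_a, min_a = fps_stats(card_a)
avg_b, min_b = fps_stats(card_b)
print(f"A: avg {avg_a:.1f} fps, min {min_a:.1f} fps")
print(f"B: avg {avg_b:.1f} fps, min {min_b:.1f} fps")
```

Both cards look like "about 60 fps average" on a bar chart, but only the frame-time trace shows card B's 40 ms stutter (a 25 fps moment), which is exactly the information the spiky-graph style of review preserves.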


----------

