# GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems



## btarunr (Jun 27, 2016)

The second design flaw to hit the GeForce GTX 1080 and GTX 1070 after the fan-revving bug isn't confined to the reference "Founders Edition" cards, but affects all GTX 1080 and GTX 1070 cards. Users of monitors with dual-link DVI connectors are noticing problems booting into Windows with pixel clocks set higher than 330 MHz. You can boot into Windows at the default pixel clock and, once booted, set the refresh rate (and consequently the pixel clock) higher than 330 MHz, and the display works fine; you just can't boot with those settings, and have to revert to the default settings each time you shut down or restart your machine. 

A user of a custom-design GTX 1070 notes that if the refresh rate of their 1440p monitor is set higher than 81 Hz (the highest refresh rate achievable with the pixel clock staying under 330 MHz) at 2560 x 1440, the machine doesn't boot into Windows correctly. The splash screen is replaced with flashing color screens, and nothing beyond. The system BIOS screen appears correctly (because it runs at low resolutions). The problem is also reported on a custom-design GTX 1080, and has been replicated by other users on the GeForce Forums. 
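The 81 Hz ceiling follows directly from pixel-clock arithmetic: the pixel clock is the total pixels per frame (active plus blanking) times the refresh rate. A minimal sketch, assuming CVT reduced-blanking timing totals for 2560 x 1440; real monitor EDIDs vary a little, so the exact crossover point can differ per panel:

```python
# Rough arithmetic behind the 330 MHz / 81 Hz figures above.
# Timing totals assume CVT reduced blanking for 2560 x 1440
# (2720 x 1481 including blanking); actual monitor EDIDs differ slightly.
H_TOTAL = 2720   # 2560 active pixels + horizontal blanking
V_TOTAL = 1481   # 1440 active lines + vertical blanking

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock in MHz for the assumed 2560x1440 timings."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 81, 82, 96, 120):
    print(f"{hz:>3} Hz -> {pixel_clock_mhz(hz):6.1f} MHz")

# Highest whole refresh rate that keeps the pixel clock under 330 MHz:
print("ceiling:", int(330e6 // (H_TOTAL * V_TOTAL)), "Hz")
```

At 82 Hz the clock crosses 330 MHz, the dual-link DVI limit the cards appear to trip over at boot; at 81 Hz it stays just under.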







----------



## avatar_raq (Jun 27, 2016)

It will affect a small number of users (those with high-refresh-rate monitors that lack DP ports), but it is an issue nevertheless. I wonder if a VGA BIOS update will solve this.


----------



## TheGuruStud (Jun 27, 2016)

How was this not found by nvidia in super early testing, let alone reviewers? Rush, rush, rush, who cares, launch them all no matter the incompetence. I imagine the delay for users to find this was b/c almost no one can get the cards anyway XD


----------



## RejZoR (Jun 27, 2016)

My system often hangs at boot (just a black screen with a white underscore in the top left of the screen) even with a GTX 980. Something that never ever happened with the HD7950 I had before it. If I unplug the DisplayPort cable during boot and try again, it'll boot no problem. Huh? Related in any way?


----------



## the54thvoid (Jun 27, 2016)

Monitors using dual link dvi with refresh rates above 81hz.  I'm sure this will have some slobbering over Nvidia failing again but really...

Is Display Port or HDMI not better? And if your monitor doesn't have those, why buy an expensive gfx card. My 6 year old Dell has Display Port.


----------



## TheGuruStud (Jun 27, 2016)

RejZoR said:


> My system often hangs at boot (just black screen with white underscore in the top left middle of the screen) even with GTX 980. Something it never ever happened with HD7950 that I had before it. If I unplug the DisplayPort cable during boot and try again, it'll boot no problem. Huh? Related in any way?



Hell, monitor won't even show bios screen most of the time on a 660ti and it's not b/c of uefi (turned that stupid boot shit off, plus takes a while to detect devices).


----------



## TheGuruStud (Jun 27, 2016)

the54thvoid said:


> Monitors using dual link dvi with refresh rates above 81hz.  I'm sure this will have some slobbering over Nvidia failing again but really...
> 
> Is Display Port or HDMI not better? And if your monitor doesn't have those, why buy an expensive gfx card. My 6 year old Dell has Display Port.



All of the low latency korean monitors came with only dvi (multiple inputs increase latency). Tons of them were sold for refresh OCing.

I have no monitors with DP. DP is a new addition to most monitors.


----------



## Eroticus (Jun 27, 2016)

TheGuruStud said:


> All of the low latency korean monitors came with only dvi (multiple inputs increase latency). Tons of them were sold refresh OCing.
> 
> I have no monitors with DP. DP is a new addition to most monitors.



DVI is available only on low-mid end monitors.

Most high-end monitors have DP/HDMI and maybe Thunderbolt.


----------



## P4-630 (Jun 27, 2016)

I'm using DVI-D and HDMI, both monitor and tv are running at 60Hz, so no problems for me.
If I was running a higher refresh-rate monitor, I would just lower the Hz before shutting down, because I would be uncomfortable flashing my expensive brand new card _if_ there would be a BIOS update soon.


----------



## Vayra86 (Jun 27, 2016)

RejZoR said:


> My system often hangs at boot (just black screen with white underscore in the top left middle of the screen) even with GTX 980. Something it never ever happened with HD7950 that I had before it. If I unplug the DisplayPort cable during boot and try again, it'll boot no problem. Huh? Related in any way?



I am running my 120hz panel over DisplayPort but I am also noticing that my boot up time is MUCH longer than it ever was before. The past weeks I've attributed this just to Windows 10 updates (which it may very well still be, shady as that OS does its business in the background), but with the 'flicker bug' being fixed in the past Hotfix 386.51 driver I am getting really, really suspicious.

Fuck I'm rolling back to a solid Kepler-finished driver again and see what happens. Will post back results. Anyone got a solid version they run their Kepler cards with? Considering branch 350-something.

*note: Nvidia isn't scoring points over here the past year. Fuckup after fuckup, as little as they may be, but it's becoming a real pattern now.


----------



## Recon-UK (Jun 27, 2016)

I run overclocked 75hz through HDMI, no matter the GPU i can't OC any higher than 65hz on DVI-D


----------



## ZoneDymo (Jun 27, 2016)

Eroticus said:


> DVI is available only on low-mid end monitors.
> 
> Most high-end monitor has DP/HDMI and maybe Thunder Bolt.



Bit simple to state it like that...
It's a choice of connection, and HDMI is exactly the same except it carries audio as well, which nobody cares about in a monitor.
To call something low-mid end based on the connections it offers rather than the actual performance... well, that's just weird, man.

On topic, get your shit together Nvidia!
I'm sure another driver will fix it, but man....


----------



## Legacy-ZA (Jun 27, 2016)

RejZoR said:


> My system often hangs at boot (just black screen with white underscore in the top left middle of the screen) even with GTX 980. Something it never ever happened with HD7950 that I had before it. If I unplug the DisplayPort cable during boot and try again, it'll boot no problem. Huh? Related in any way?




I don't turn off my machine often, but I had this issue with my GTX 760 before it died. I now use my old GTX 580 and had/have this issue while using the DisplayPort on both cards. I am now on a DVI cable, ironically... but I still get the issue from time to time. Coincidence? I don't believe in them. The system, in fact, doesn't hang; I can log in and everything, I just can't see anything. I don't really know what to think at this stage.


----------



## P4-630 (Jun 27, 2016)

Anyway, good to know for when I upgrade to a 1440p monitor, preferably DP then.


----------



## Vayra86 (Jun 27, 2016)

P4-630 said:


> Anyway, good to know for when I upgrade to a 1440p monitor, preferably DP then.



Yeah or HDMI 2.0

DP is not without its issues. Bit slow to connect.


----------



## Eroticus (Jun 27, 2016)

ZoneDymo said:


> Bit simple to state it like that...
> Its a choice in connection and HDMI is exactly the same except it carry's audio as well which nobody cares about in a monitor.
> To call something low-mid end based on the connections it offers rather then the actual performance....well is just weird man.
> 
> ...



As far as I know, DVI doesn't support 2560x1440@120/144 or anything over 2560x1600@60; by this year's standards, that isn't high-end any more...

The last 2 monitors I owned didn't have a DVI port. The last 4 I owned didn't have VGA.


----------



## Vayra86 (Jun 27, 2016)

Eroticus said:


> as I know DVI doesn't support 2440x1440@120/144  and over 2,560 × 1,600@60, based on this year, this isn't even close to high end.



Seeing as the 1070 cannot even max out 120 fps on a 1080p monitor in all games, I don't see why DVI-D can't be a high end connection. There is no single-GPU in the world that can saturate that connection with all the content you can play on it.

So what the fuck are you on about? 4K or 1440p says nothing at all about what 'end' you're on it just means you have a crapload of pixels to push.


----------



## P4-630 (Jun 27, 2016)

Vayra86 said:


> Yeah or HDMI 2.0
> 
> DP is not without its issues. Bit slow to connect.



I'm already using the HDMI port for my tv.
So I have to go with DP if I upgrade to 1440p if it would have a higher refresh rate than 60Hz.


----------



## Ubersonic (Jun 27, 2016)

the54thvoid said:


> Is Display Port or HDMI not better?



DisplayPort is; it's the successor to DVI and HDMI. HDMI, however, is no better than DVI: they are sister technologies. DVI was designed in 1999 for monitors as a digital replacement for VGA, and HDMI followed in the early 2000s as a stripped-down version designed for televisions, a digital replacement for SCART and component RGB. You see a lot of overlap, however, as their digital part is essentially the same, hence the existence of passive adapters.


----------



## Eroticus (Jun 27, 2016)

Vayra86 said:


> Seeing as the 1070 cannot even max out 120 fps on a 1080p monitor in all games, I don't see why DVI-D can't be a high end connection. There is no single-GPU in the world that can saturate that connection with all the content you can play on it.
> 
> So what the fuck are you on about? 4K or 1440p says nothing at all about what 'end' you're on it just means you have a crapload of pixels to push.



Most shooters work very well, even with a 290X. Not everyone plays with MSAA x8 like you.


----------



## Valdas (Jun 27, 2016)

I wonder if bios fix will help. I own two monitors, one running at 144Hz and an older one running at 120Hz which requires DVI, so this is something that can potentially affect me when I finally get my hands on either 1080 or 1070.


----------



## Recon-UK (Jun 27, 2016)

Eroticus said:


> Most shooters work very well, even with 290x. not every one like you play with MSAAx8


Agreed


----------



## Ubersonic (Jun 27, 2016)

Recon-UK said:


> I run overclocked 75hz through HDMI, no matter the GPU i can't OC any higher than 65hz on DVI-D



That's because you are using an old "single link" DVI cable; they are limited to 1920x1200@60Hz. You're getting the extra 5 Hz due to running at 1080p.
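The single-link ceiling is easy to check: one TMDS link is specced at 165 MHz, half of dual-link's 330 MHz. A quick sketch; the timing totals here (CEA-861 and CVT reduced blanking for 1920 x 1080) are assumptions, and the monitor's own EDID is what really sets the limit:

```python
# Single-link DVI carries one 165 MHz TMDS link (dual-link carries two).
# The refresh ceiling depends on blanking: CEA-861 1080p timings total
# 2200 x 1125 per frame, CVT reduced blanking roughly 2080 x 1111.
SINGLE_LINK_HZ = 165e6

for name, (h_total, v_total) in {
    "CEA-861 blanking": (2200, 1125),
    "CVT reduced blanking": (2080, 1111),
}.items():
    max_hz = SINGLE_LINK_HZ / (h_total * v_total)
    print(f"{name}: ~{max_hz:.0f} Hz max at 1080p")
```

With standard CEA blanking the ceiling lands around 67 Hz, which lines up with the roughly 65 Hz ceiling reported above; reduced blanking buys a few more Hz.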


----------



## P4-630 (Jun 27, 2016)

Valdas said:


> I wonder if bios fix will help. I own two monitors, one running at 144Hz and an older one running at 120Hz which requires DVI, so this is something that can potentially affect me when I finally get my hands on either 1080 or 1070.



Or just lower the Hz every time before shutting down, which is what I would do; I'm uncomfortable flashing my expensive card.


----------



## ZoneDymo (Jun 27, 2016)

P4-630 said:


> Or just lower the Hz everytime before shutting down, which what I would do, I'm uncomfortable flashing my expensive card.



That would be sooooo tedious though.
And for such an expensive card....I mean its your choice but I would keep the receipt.


----------



## Vayra86 (Jun 27, 2016)

Eroticus said:


> Most shooters work very well, even with 290x. not every one like you play with MSAAx8



Of course it can work, but you got the point I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car, if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck to the same engine under the hood. You only get a faster car with a better engine.


----------



## Eroticus (Jun 27, 2016)

Vayra86 said:


> Of course it can work, but you got the point I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car, if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck to the same engine under the hood. You only get a faster car with a better engine.


I didn't try to hurt you or anything ... =P

But the technology is old; there's no point using it any more when DP works well and supports more resolutions and frequencies.

You wouldn't buy a used 1993 GT-R if you could get a 2016 GT-R for the same price.


----------



## Vayra86 (Jun 27, 2016)

Eroticus said:


> I didn't try to hurt you or anything ... =P
> 
> But the technology is old, no point to use it any more, when DP works well support more resolution and frequencies.
> 
> You won't buy used GT-R 1993, if you could get GT-R 2016 for same price.



The point is, you are making a silly argument when Nvidia puts a DVI connector on its _newest GPUs_. "Bwah, not important, old stuff, it can be shit"

Like... huh?


----------



## Eroticus (Jun 27, 2016)

Vayra86 said:


> The point is, you are making a silly argument when Nvidia puts a DVI connector on its newest GPU's. "Bwah, not important, old stuff, it can be shit"



Sooner or later some year will take your DVI port too, just like your optical drive went. 

Anyway, enjoy your DVI port, have a nice day....


----------



## Caring1 (Jun 27, 2016)

Vayra86 said:


> Of course it can work, but you got the point I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car, if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck to the same engine under the hood. You only get a faster car with a better engine.


That analogy is incorrect, the vehicle can be driven faster due to better tires alone.


----------



## Frick (Jun 27, 2016)

Vayra86 said:


> *note: Nvidia isn't scoring points over here the past year. Fuckup after fuckup, as little as they may be, but it's becoming a real pattern now.



Sadly it doesn't affect them in any way.

Anyway DVI for the win. I wish they kept the connectors anyway.


----------



## Vayra86 (Jun 27, 2016)

Caring1 said:


> That analogy is incorrect, the vehicle can be driven faster due to better tires alone.



Yeah... car analogies never work. Damn it!


----------



## goodeedidid (Jun 27, 2016)

I don't care about that at all, I haven't seen a DVI connection in years. What sane person would use this old shit anyway with the likes of the 1080? This is the best video card ever.


----------



## Vayra86 (Jun 27, 2016)

goodeedidid said:


> I don't care about that at all, I haven't seen DVI connection in years. What sane person will use this old shit anyways with the likes of 1080. This is the best video-card ever.



You made a new account just to say that you don't care?


----------



## goodeedidid (Jun 27, 2016)

Vayra86 said:


> You made a new account just to say that you don't care?


What do you mean?


----------



## qubit (Jun 27, 2016)

This is such an obvious problem how could it not have been found by NVIDIA? Also by reviewers and users alike who have high refresh rate monitors.

For example, my setup will default to 144Hz when I install a new graphics card and then the drivers as that's what my monitor supports.


----------



## goodeedidid (Jun 27, 2016)

qubit said:


> This is such an obvious problem how could it not have been found by NVIDIA? Also by reviewers and users alike who have high refresh rate monitors.
> 
> For example, my setup will default to 144Hz when I install a new graphics card and then the drivers as that's what my monitor supports.


I suppose reviewers didn't do much reviewing over DVI connections.

I mean, who does gaming on such expensive and new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.


----------



## okidna (Jun 27, 2016)

P4-630 said:


> Or just lower the Hz everytime before shutting down, which what I would do, I'm uncomfortable flashing my expensive card.



You can do it with just one click: a batch file that runs a simple app like Display Changer : http://12noon.com/?page_id=80

I used it on an image processing project a couple of months ago, to test how the application reacts at different refresh rates, resolutions, and color depths without touching Windows display settings or graphics card driver settings at all.
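For anyone wanting to script the workaround described above, the idea is just one command run before shutdown. A sketch in Python; the executable path and switch names for Display Changer are assumptions here, so check them against the tool's own documentation:

```python
# Build the command line for 12noon's Display Changer to drop the display
# to a boot-safe mode before shutting down. The install path and switch
# names are assumed; verify against http://12noon.com/?page_id=80
DC_EXE = r"C:\Tools\dc64.exe"  # assumed install location

def dc_args(width: int, height: int, refresh: int) -> list:
    """Command line that sets the given display mode."""
    return [DC_EXE, f"-width={width}", f"-height={height}",
            f"-refresh={refresh}"]

# Before shutdown, a script (or .bat equivalent) would run e.g.:
#   subprocess.run(dc_args(2560, 1440, 60), check=True)
# ...to get back under the 330 MHz pixel clock before the next boot.
print(dc_args(2560, 1440, 60))
```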


----------



## qubit (Jun 27, 2016)

goodeedidid said:


> I suppose reviewers didn't do reviewing much with DVI connections.
> 
> I mean who does gaming with such expensive and new tech with old monitors? Or maybe there are modern DVI connections, I haven't seen anything like this myself.


DVI is pretty standard and hardly obsolete, so this is an obvious fault that should have been picked up straight away. DisplayPort is only required for 4K and above.


----------



## jabbadap (Jun 27, 2016)

Correct me if I'm wrong, but does this only affect people who OC monitors connected by dual-link DVI?


----------



## puma99dk| (Jun 27, 2016)

This sounds like a BIOS update is needed; dunno if it can be fixed with a new driver, which would be awesome if it were possible.


----------



## the54thvoid (Jun 27, 2016)

jabbadap said:


> Correct me if I'm wrong, but is this only affect people who OC their monitors to connected by dual link DVI?



If so, it's laughable. Truly. For all the fairy protests from the usual sources, for a problem they probably don't have. For an issue that affects so few who overclock monitors not really designed for it in most cases.
More tea cups and more storms.


----------



## Valdas (Jun 27, 2016)

goodeedidid said:


> I suppose reviewers didn't do reviewing much with DVI connections.
> 
> I mean who does gaming with such expensive and new tech with old monitors? Or maybe there are modern DVI connections, I haven't seen anything like this myself.


Imho people don't change their monitors nearly as often as they change their graphics cards or other components, so the assumption that if you have a 1080 in your system you won't have an older monitor is incorrect.
I've got two monitors, one of them old; should I just throw it away? Even when it works just fine?


----------



## bug (Jun 27, 2016)

TheGuruStud said:


> All of the low latency korean monitors came with only dvi (multiple inputs increase latency). Tons of them were sold for refresh OCing.
> 
> I have no monitors with DP. DP is a new addition to most monitors.



I have a DP capable monitor I bought like 5 years ago (HP z24something). And it didn't cost an arm and a leg.

Maybe this will convince Nvidia it's time to put DVI to greener pastures. It's limited to 2560x1600@60Hz anyway (see https://en.wikipedia.org/wiki/Digital_Visual_Interface#Technical_overview)


----------



## RCoon (Jun 27, 2016)

the54thvoid said:


> If so, it's laughable. Truly. For all the fairy protests from the usual sources, for a problem they probably don't have. For an issue that affects so few who overclock monitors not really designed for it in most cases.
> More tea cups and more storms.



For every mini scandal covered by the media, 99 more slip through the cracks.


----------



## P4-630 (Jun 27, 2016)

bug said:


> Maybe this will convince Nvidia it's time to put DVI to greener pastures.



The market is still flooded with monitors that offer only HDMI, VGA and DVI-D.
I have bought a nice one not long ago with IPS display.
I have it connected through DVI-D.
A tv takes the HDMI port.


----------



## rtwjunkie (Jun 27, 2016)

goodeedidid said:


> Or maybe there are modern DVI connections, I haven't seen anything like this myself.



Yes, exactly, there are.  My monitor is new as well as a recent model.  It has DVI and DP.  I play at 60Hz and am very comfortable and pleased with that.  DVI is just fine.  It is merely the victim of elitist snobbery.


----------



## john_ (Jun 27, 2016)

Funny.

Nvidia cards are having problems with DVI.
Solution: Throw away your monitors that still use DVI. They are old.
Really? Maybe before telling someone to change his monitor you should send him a few hundred dollars.

Anyway, one more problem where there is an easy workaround. The question is how many .bat workaround files can someone have on his desktop, especially after paying hundreds of dollars for a Pascal card and a high refresh rate GSync(or not) monitor. One workaround for high power consumption when sitting on the desktop at 144Hz, one workaround for booting/shutting down the system, more workarounds in the future?

Probably Nvidia will implement something more automatic in their next driver, like auto-changing refresh rates when shutting down and booting up. But they should have seen this in time and fixed it. A few extra frames in the latest title do sell cards, but their reputation for driver quality took years to establish, and people will stop forgiving them for little things like this.


----------



## rtwjunkie (Jun 27, 2016)

john_ said:


> but their reputation about drivers quality took years to establish and people will stop forgiving them for little things like this.



As far as I am concerned Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until Autumn.

It's why I always wait at least 2 releases now before upgrading drivers.  They can't be trusted anymore not to release shit.


----------



## TheinsanegamerN (Jun 27, 2016)

TheGuruStud said:


> How was this not found by nvidia in super early testing, let alone reviewers? Rush, rush, rush, who cares, launch them all no matter the incompetence. I imagine the delay for users to find this was b/c almost no one can get the cards anyway XD


No kidding. Nvidia has really been dropping the ball in the last year or so, between rushed pascal and driver problems.

There is some term to describe this, IDK what it is called, when a company is in the lead they stop caring about their products as much as when they had competition. nvidia is in that stage. And much like how the same stage let IE get overtaken by chrome and firefox, nvidia may be in trouble here soon if they dont shape up.

I will say I regret not getting the M295x in my alienware 15 when I had the chance. Nvidia's optimus has been disappointing lately.


rtwjunkie said:


> Yes, exactly, there are.  My monitor is new as well as a recent model.  It has DVI and DP.  I play at 60Hz and am very comfortable and pleased with that.  DVI is just fine.  It is merely the victim of elitist snobbery.


Or technological obsolescence. There is nothing wrong with DVI, it's just old. Technology has moved forward since DVI came onto the market in 1999. Heck, dual-link DVI doesn't officially support over 60 Hz at 1600p. Modern monitors are pushing 144 Hz at 1080p, and 144 Hz at 1440p is coming.

Of course it works fine for 60 Hz, but more people are beginning to get on the high refresh rate train. In that world, DVI no longer matters.


----------



## qubit (Jun 27, 2016)

RCoon said:


> For every mini scandal covered by the media, 99 more slip through the cracks.


True. 



rtwjunkie said:


> *As far as I am concerned Nvidia already dashed their driver reputation last summer* with a string of bad drivers, leaving many to hang onto older drivers until Autumn.
> 
> It's why I always wait at least 2 releases now before upgrading drivers.  They can't be trusted anymore not to release shit.


Yes, they've been a bit flaky, unfortunately. However, I still install straight away with wild abandon  and so far have somehow managed to avoid any major issues.


----------



## P4-630 (Jun 27, 2016)

rtwjunkie said:


> As far as I am concerned Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until Autumn.
> 
> It's why I always wait at least 2 releases now before upgrading drivers.  They can't be trusted anymore not to release shit.



Some people used to say that AMD had crappy drivers, but now nvidia has shown driver problems as well.
I'll also wait before updating my display driver when a new one is released.
We will read about it fast enough on the internet if there are any problems with them.


----------



## ensabrenoir (Jun 27, 2016)

......who buys a $400+ card and doesn't use a monitor to fully experience what they paid for.......step your game up, people: DisplayPort!   This is like saying... yeah, I bought this Porsche/Ferrari/Lamborghini and it just doesn't perform well on 87 octane gas (as opposed to 93 and above).  Since some hate car analogies, it's also like when flat screen 1080p TVs first came out and people were still using the red, white and yellow RCA connectors.  You still got a picture, but not the quality that you paid for.  I guess they have to put warning stickers on graphics cards suggesting the proper monitors.  Didn't they eliminate DVI on high end cards and make it so adapters didn't work either......You gotta do your research......


----------



## rtwjunkie (Jun 27, 2016)

ensabrenoir said:


> ......who buys a $400 + card and uses dvi or an  hdmi even......step your game up people display port!   This is like saying...Yeah i bought this Porsche/Ferrari/Lamborghini  and it just doesn't perform well on 87 octane gas(as oppose to 93 and above). Do they have to put warning stickers on graphic cards suggesting a high end monitor? Didn't they eliminate dvi on some cards and make it so adapters didn't work either......



People have different standards of what makes a good monitor.  For me, IQ is important, followed by screen size for what works on my desk, and then screen material.  I prefer glossy for its rich colors.  I don't need 144Hz 1440p to have a great gaming experience.  

Why? Because that experience is subjective, and no one can say that something is inadequate.  I'm also very cost-conscious, and I am not going to replace my near $300 monitor for considerably more just to make elitists happy.


----------



## bug (Jun 27, 2016)

P4-630 said:


> The market is still flooded with monitors that offer only HDMI, VGA and DVI-D.
> I have bought a nice one not long ago with IPS display.
> I have it connected through DVI-D.
> A tv takes the HDMI port.



Not a problem. Leave only HDMI and DP connections on the video card (possibly with some mini variants in there) and let those who still cling on to DVI use a HDMI-to-DVI cable (it's cheap, I know, I have one). Or include a HDMI-toDVI adapter with the card if you feel generous.


----------



## Zubasa (Jun 27, 2016)

ensabrenoir said:


> ......who buys a $400 + card and uses dvi or an  hdmi even......step your game up people display port!   This is like saying...Yeah i bought this Porsche/Ferrari/Lamborghini  and it just doesn't perform well on 87 octane gas(as oppose to 93 and above). Do they have to put warning stickers on graphic cards suggesting a newer monitor? Didn't they eliminate dvi on high end cards and make it so adapters didn't work either......


Thing is the people made a big deal about the Fury X not having HDMI 2.0, so I guess plenty of people use HDMI.


----------



## robert3892 (Jun 27, 2016)

I've been reading this thread, and it mostly affects users of 2K monitors imported from Korea that have only a DVI port (no DP port). Monitors which have DP ports are not affected, provided the user is NOT using the DVI port.

I used to have a QNIX myself, and these monitors can easily be overclocked by users up to 120 Hz. So until NVIDIA comes out with a fix, don't overclock QNIX monitors beyond 80 Hz... maybe make it 75 Hz just to be sure.


----------



## jaggerwild (Jun 27, 2016)

I find it funny that it's been 2 weeks since the story of hand-picked overclocked samples, and now this. Add to it low stock, and I think we're gonna see a new card sooner than Christmas from the big N. 

http://www.newegg.com/Product/Produ...d=1&N=100007709 601201888 601203818 601204369


----------



## rtwjunkie (Jun 27, 2016)

jaggerwild said:


> I find it funny its been 2 weeks sense the story of hand picked over clocked samples now this. Add to it low stock, I think were gonna see a new card sooner then Christmas from the big N.
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709 601201888 601203818 601204369



I wouldn't be surprised to see a 1060 from them, since they literally are poised to lose the whole mid-tier to AMD.  Of course, 1060's would also need to be in stock, LOL!


----------



## ensabrenoir (Jun 27, 2016)

Zubasa said:


> Thing is the people made a big deal about the Fury X not having HDMI 2.0, so I guess plenty of people use HDMI.



true.....i edited my rant as soon as higher logic and coffee kicked in.


----------



## rtwjunkie (Jun 27, 2016)

TheinsanegamerN said:


> more people are beginning to get on the high refresh rate train. In that world, DVI no longer matters.



Yes, they are.  From my eyes, I see no reason to do it.  And I have a modern, recent monitor.  Earlier I stated what qualities in a monitor are important to me.

I've never been a victim of peer pressure.  I don't follow the herd.  I've always done what works for me in life.


----------



## Ubersonic (Jun 27, 2016)

bug said:


> Maybe this will convince Nvidia it's time to put DVI to greener pastures. It's limited to 2560x1600@60Hz anyway (see https://en.wikipedia.org/wiki/Digital_Visual_Interface#Technical_overview)



It's not limited to it, that's the "spec", but it's common to see systems running out of spec.  You can do 1080p/144Hz and 1440p/120Hz over dual-link DVI no problem.

I do agree that it's time to drop the DVI port on high end cards though. Now that they are using DVI-D instead of DVI-I, the DVI-VGA adapters no longer work, so they may as well just drop the DVI port or replace it with an extra HDMI port and include an HDMI-DVI adapter; that will save space.


----------



## ZoneDymo (Jun 27, 2016)

Ubersonic said:


> It's not limited to it, that's the "spec", but it's common to see systems running out of spec.  You can do 1080p/144Hz and 1440p/120Hz over dual link DVI no problem.
> 
> I do agree that it's time to drop the DVI port on high end cards though, now that they are using DVI-D instead of DVI-I the DVI-VGA adapters no longer work so they may as well just drop the DVI port or replace it with an extra HDMI port and include a HDMI-DVI adapter, that will save space.



should most def drop it if you cannot seem to do it correctly....yeah...


----------



## Camm (Jun 27, 2016)

There is also a problem with DisplayPort sync above 120hz, which causes white line artifacting.


----------



## bug (Jun 27, 2016)

Ubersonic said:


> It's not limited to it, that's the "spec", but it's common to see systems running out of spec.  You can do 1080p/144Hz and 1440p/120Hz over dual link DVI no problem.
> 
> I do agree that it's time to drop the DVI port on high end cards though, now that they are using DVI-D instead of DVI-I the DVI-VGA adapters no longer work so they may as well just drop the DVI port or replace it with an extra HDMI port and include a HDMI-DVI adapter, that will save space.



Well, you can run anything out of spec, but then you no longer expect any predictable result, do you?


----------



## Zubasa (Jun 27, 2016)

ZoneDymo said:


> should most def drop it if you cannot seem to do it correctly....yeah...


Dropping that bulky DVI connector would also mean more space for the exhaust on reference cards.


----------



## horik (Jun 27, 2016)

Was looking for a GTX 1070 to replace my GTX 970, but with the Korean monitor I have, OC'ed to 96 Hz, I might have problems.


----------



## Steevo (Jun 27, 2016)

Sounds like they have a display controller issue, and it's in almost exactly the market they are laying claim to with their cards: high resolution, high FPS displays. 
Perhaps they are lucky they're not able to produce more actual cards yet; a BIOS update for thousands of cards will undoubtedly cause some to be bricked, and no replacements available would be a worse black eye than the already absurd price gouging and limited availability.


----------



## Slizzo (Jun 27, 2016)

Just going to throw this out here, but how many monitors that support greater than 60 Hz refresh rates (natively, no overclocking) and only use DVI are out there? I'm going to venture none. Therefore Nvidia can't have known about this, because they likely don't test out-of-spec configurations.

This issue isn't present for the majority of GTX 1080 owners, myself included; though I stopped overclocking my gen 1 QNIX QX2710 when I got my XB270HU.


----------



## jabbadap (Jun 27, 2016)

Steevo said:


> Sounds like they have a display controller issue, and its almost for the market they are laying claim to with their cards, high resolution high FPS displays.
> Perhaps they are lucky their not able to produce more actual cards yet, a BIOS update for thousands of cards will undoubtedly cause some to be bricked, and no replacements available would be a worse black eye than the already absurd price gouging and limited availability.



Well, if this happened with DisplayPort monitors, like 1440p@144Hz or 2160p@60Hz, then I would agree. But the "bug" is with running DL-DVI beyond its Mpix/s spec, thus not very critical. This is a more troublesome bug:
https://forums.geforce.com/default/...es-not-work-with-the-htc-vive-on-gtx-1080-/1/


----------



## truth teller (Jun 27, 2016)

Slizzo said:


> Just going to throw this out here, but how many monitors that support greater than 60Hz refresh rate (natively, no overclocking) that only use DVI are out there? I'm going to venture none. Therefore nVidia can't have known this, because they likely do not test out of spec configurations.


Not true: "old" native 75/85/120 Hz monitors use DVI and they seem to work just fine. Heck, even some (if not most) 144 Hz monitors have DVI; it's not their only interface, but it's probably the primary one.


----------



## Swampdonkey (Jun 27, 2016)

I think the takeaway from the problems with the 1070/1080s is to not buy the first batch. I've been impatient in the past, and burned several times with glitchy cards requiring an RMA. No doubt Nvidia will sort out these bugs. Until then, my wallet's staying closed.


----------



## TheinsanegamerN (Jun 27, 2016)

truth teller said:


> not true, "old" native 75/85/120hz monitors use dvi and they seem to work just fine. heck, even some (if not most)144hz monitors have dvi, not the only interface but its probably the primary


"Old" native 75/85/120 Hz monitors also weren't running at 2560x1600. DVI has no trouble pushing those refresh rates at lower resolutions, and you don't need to push the pixel clock higher to achieve them. As such, those monitors would have no issue with a 1080.

If any of them WERE pushing above 2560x1600@60Hz, then those monitors were being sold outside of spec, and whoever bought them took the risk that something wouldn't work. Any decent monitor pushing those specs really should have used DisplayPort.


----------



## Valdas (Jun 27, 2016)

Slizzo said:


> Just going to throw this out here, but how many monitors that support greater than 60Hz refresh rate (natively, no overclocking) that only use DVI are out there? I'm going to venture none. Therefore nVidia can't have known this, because they likely do not test out of spec configurations.
> 
> This issue isn't present for a majority of GTX1080 owners. Including myself; however I stopped overclocking my gen 1 Qnix Qx2710 when I got my XB270HU.


Take a look at my ViewSonic V3D245. It has D-Sub, DVI and HDMI.


----------



## MxPhenom 216 (Jun 27, 2016)

the54thvoid said:


> Monitors using dual link dvi with refresh rates above 81hz.  I'm sure this will have some slobbering over Nvidia failing again but really...
> 
> Is Display Port or HDMI not better? And if your monitor doesn't have those, why buy an expensive gfx card. My 6 year old Dell has Display Port.



Korean 1440p PLS and IPS monitors, the good ones, only have dual-link DVI, and these are the only ones that allow a decent refresh rate overclock, plus very low input lag.

Looks like I'll be waiting for this to hopefully be fixed in drivers or something before I buy a 1070, since this will be an issue for me.


----------



## TheinsanegamerN (Jun 27, 2016)

Valdas said:


> Take a look at mine Viewsonic V3D245. It has d-sub, dvi and hdmi.


And it's only 1080p, so dual-link DVI pushing that resolution faster than 60 Hz would be in spec. Hence no problems there.

The real question should be how many monitors are out there that use DVI for input, run above 60 Hz, AND are 1440p/1600p. The answer is very few, and those were sold out of spec, so Nvidia couldn't be expected to test for that.


----------



## jabbadap (Jun 27, 2016)

truth teller said:


> Spoiler
> 
> 
> 
> ...



Well, he obviously means 1440p monitors, not every resolution with a 75/85/120/144 Hz refresh rate; e.g. 1080p@144Hz is within the DL-DVI 330 Mpix/s spec (298.60 Mpix/s)...

And then there are monitors with multiple inputs (e.g. the BenQ XL2730Z 144 Hz) which, while they have a DL-DVI input, do not support 1440p@144Hz over DVI (DP only).
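The arithmetic behind those Mpix/s figures is easy to check. A minimal sketch (plain Python, numbers from this thread; the 5% blanking allowance is an assumption, since the active pixel rate understates the real pixel clock, which CVT reduced-blanking timings push roughly 5-10% higher):

```python
# Rough check of whether a display mode fits under the 330 MHz
# dual-link DVI pixel-clock cap. The active pixel rate ignores
# blanking intervals, so it is only a lower bound on the real
# pixel clock; the 5% margin below is an assumed allowance.

DVI_DL_LIMIT_HZ = 330e6  # dual-link DVI: 2 x 165 MHz TMDS links

def active_pixel_rate(width, height, refresh_hz):
    """Pixels per second with blanking ignored."""
    return width * height * refresh_hz

for w, h, hz in [(1920, 1080, 144), (2560, 1440, 60), (2560, 1440, 96)]:
    rate = active_pixel_rate(w, h, hz)
    verdict = "fits" if rate * 1.05 <= DVI_DL_LIMIT_HZ else "out of spec"
    print(f"{w}x{h}@{hz}Hz: {rate / 1e6:.1f} Mpix/s ({verdict})")
```

1920x1080@144Hz comes out at the 298.6 Mpix/s quoted above and fits; 2560x1440@96Hz lands around 354 Mpix/s before blanking is even counted, which is why it needs an out-of-spec pixel clock.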


----------



## dr emulator (madmax) (Jun 27, 2016)

I was going to get an EVGA GeForce GTX 1080 "Classified" but had this news from my supplier.

Here's what I wrote:

"Hi, will this card connect to an analogue monitor or do I have to purchase an adaptor?"

They replied: "The outputs are Digital only and it states on EVGA spec sheet for the card " Please do not connect to "DVI to VGA" adapter." Hope this helps."

I have a 21-inch Sun Microsystems CRT monitor (ye ye, it's old) but it does for me. The card I asked the question about was the GeForce GTX 1080 FTW ACX 3.0, so I gather the outputs will be the same on the Classified.

Oh well, I didn't want to spend £700 on a graphics card anyway.


----------



## jabbadap (Jun 27, 2016)

dr emulator (madmax) said:


> i was going to get a EVGA GeForce GTX 1080 "Classified" but had this news from my supplier
> 
> here's what i wrote
> 
> ...



I'm afraid you're out of luck. I doubt there will be any new graphics cards with analog outputs anymore; AMD stopped with Hawaii (R9 290X) and Nvidia has now dropped it with Pascal. Your only chance is to try to find an active DP->VGA converter and use that.


----------



## dr emulator (madmax) (Jun 27, 2016)

jabbadap said:


> I'm afraid you are out of luck. I doubt there will be new graphics card with analog signals anymore. AMD stopped them with hawaii(R9-290X) and nvidia dropped it now with pascals. Your only change is to try to find a active dp->vga converter and use that.




TBH I didn't know that.

As said though, I still have this 21-inch CRT which I love to bits, so I might get a GTX 980 (I have an HD 5870 at the mo), hopefully in the sales. Saying that, I haven't seen much movement in their prices yet.

I can wait though, and might invest in a new monitor sometime in the near future; it just means more of my savings gone.


----------



## Valdas (Jun 27, 2016)

TheinsanegamerN said:


> and its only 1080p. So dual link DVI pushing that rez faster then 60hz would be in spec. Hence no problems here.
> 
> The real question should be how many monitors are out there, that use DVI for input, that are above 60hz, AND are 1440/1600P? The answer is very few, and those were sold out of spec, so nvidia couldnt be expected to test for that.


I stand corrected.


----------



## jabbadap (Jun 27, 2016)

dr emulator (madmax) said:


> tbh i didn't know that
> 
> as said though i still have this 21inch crt which i love to bits, so i might get a gtx 980 (i have a HD5870 at the mo) hopefully i can get one in the sales, saying that i haven't seen much movement in their prices yet
> 
> i can wait though, and might invest in a new monitor sometime in the near future, just means more of my savings gone



Well, depending on the resolution of your monitor, you might find a DP->D-Sub adapter for $10-20. But of course their max resolution is 1200p@60Hz and they add a bit of latency. And I can't guarantee that they will work fine with high-refresh CRTs.


----------



## m1dg3t (Jun 27, 2016)

I find it laughable that any time nVidia is caught/called out on something the majority of people downplay or outright dismiss it, meanwhile anything AMD does they get crucified.


----------



## _Flare (Jun 27, 2016)

It's generally impossible to go above a 165 MHz pixel clock per DVI link, so 330 MHz is the dual-link DVI maximum per the specification, and it will be so forever.
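That cap is also where the article's 81 Hz figure comes from. A back-of-the-envelope sketch (the CVT reduced-blanking totals of 2720x1481 for a 2560x1440 mode are assumed typical values; a given monitor's timings may differ slightly):

```python
# Highest refresh rate that keeps a 2560x1440 mode's pixel clock
# under the dual-link DVI maximum, using assumed CVT reduced-blanking
# totals (2560 + 160 = 2720 pixels per line, 1440 + 41 = 1481 lines).

H_TOTAL, V_TOTAL = 2720, 1481   # total pixels/line, total lines/frame
PIXEL_CLOCK_CAP = 330e6         # Hz, dual-link DVI maximum

max_refresh = PIXEL_CLOCK_CAP / (H_TOTAL * V_TOTAL)
print(f"max refresh under 330 MHz: {max_refresh:.1f} Hz")  # ~81.9 Hz
```

Which matches the report in the article: anything above 81 Hz at 1440p needs a pixel clock beyond 330 MHz.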


----------



## Totally (Jun 27, 2016)

m1dg3t said:


> I find it laughable that any time nVidia is caught/called out on something the majority of people downplay or outright dismiss it, meanwhile anything AMD does they get crucified.



:shrug: You get used to it. Occasionally you come across some real gems who are foaming blood out the mouth as if AMD killed their beloved or a family member. "DAMMIT, this is completely unforgivable you are now and forever enemy to my family!!!" then they go on to join the horde afterwards. When they never owned said afflicted card(s) in the first place.


----------



## centaurius (Jun 27, 2016)

At least someone from NVIDIA replied on the GeForce forums and they are aware of the issue; let's see how long until they give any official input about this.

This only occurs at 1080p@144Hz-and-above resolution/refresh rate combinations, yes? If you use 1080p@100Hz or 1080p@120Hz, will the problem also occur?


----------



## TheinsanegamerN (Jun 27, 2016)

centaurius said:


> At least someone from NVIDIA replied on the Geforce forums and they are aware of the issue, let's see how long until they give any official input about this.
> 
> This only occurs from 1080p@144 Hz and above resolutions/refresh rate yes? If you use 1080p@100 Hz or 1080p@120 Hz will the problem also occur?


No. 1080p144 is fine. It's people running 1440p/1600p at above 60hz that are having this problem.


----------



## m1dg3t (Jun 27, 2016)

Totally said:


> :shrug: You get used to it. Occasionally you come across some real gems who are foaming blood out the mouth as if AMD killed their beloved or a family member. "DAMMIT, this is completely unforgivable you are now and forever enemy to my family!!!" then they go on to join the horde afterwards. When they never owned said afflicted card(s) in the first place.



Get used to it? Really? That right there, apathy, is what has allowed things to progress to this point.

<rant>Any gamer worth his kb/mouse knows that AMD has been the one consistently trying to push things forward, with all their innovation on both the hardware and software aspects of gaming, all the while doing it with the smallest budget and the largest opposition, and not raping their customers in the process. I'm not saying AMD are/have been without any fault, but that is true of all things.</rant>


----------



## 64K (Jun 27, 2016)

Totally said:


> :shrug: You get used to it. Occasionally you come across some real gems who are foaming blood out the mouth as if AMD killed their beloved or a family member. "DAMMIT, this is completely unforgivable you are now and forever enemy to my family!!!" then they go on to join the horde afterwards. When they never owned said afflicted card(s) in the first place.



I think sometimes people talk smack about AMD because they think it's funny to see how worked up emotionally AMD fans can get when they do that.


----------



## the54thvoid (Jun 27, 2016)

m1dg3t said:


> I find it laughable that any time nVidia is caught/called out on something the majority of people downplay or outright dismiss it, meanwhile anything AMD does they get crucified.



88 posts, many against Nvidia, say otherwise. You tend to find, at least on TPU, that if Nvidia drops the ball, there are plenty of people ready to kick it away or burst it.
Just like AMD. It's always a matter of perspective. Does pixel overclocking affect me? Not at all, so I'm not bothered by an otherwise niche element. Is it only people who overclock specific monitors on a specific connector? Yup.
So is it really a huge issue? I don't think so. Like all things, overclocking, even when software-supported, isn't a guarantee, especially on a monitor.

I do think we require more teacups though, because this is a great overreaction party. Especially from those it doesn't affect.


----------



## Dippyskoodlez (Jun 27, 2016)

m1dg3t said:


> Get used to it? Really? That right there; Apathy, is what has allowed things to progress to this point.
> 
> <rant>Any gamer worth his kb/mouse knows that AMD has been the one consistently trying to push things forward with all their innovation on both the hardware & software aspects of gaming. All the while doing it with the smallest budget and the largest opposition, and not raping their customers in the process. I'm not saying AMD are/have been without any fault, but that is true of all things.</rant>



I don't see how i3-equivalent FX CPUs at $200+ in late 2016 are not 'raping consumers'.


----------



## erocker (Jun 27, 2016)

Everyone keep on topic, no insults and post in a civil manner. 

Thanks much!


----------



## GC_PaNzerFIN (Jun 27, 2016)

_Flare said:


> It´s generally impossible to go above 165MHz Pixelclock per DVI-Link so 330 MHz is the DVI-Dual-Link-Maximum per Specification, and it will be so forever.



Well, the "generally impossible" has been working with pretty much all previous-gen NVIDIA and AMD desktop graphics cards. In fact, I am now running 384 MHz on a GTX 780 Ti just fine. I am very sad to hear I can no longer do 96 Hz 1440p on this really amazing QNIX 27" panel. Might have to resort to spending 800 euros on a new screen, whose only advantages would be G-SYNC and a few Hz more.


----------



## semantics (Jun 27, 2016)

64K said:


> I think sometimes people talk smack about AMD because they think it's funny to see how worked up emotionally AMD fans can get when they do that.


Yeah it's called trolling...


----------



## jabbadap (Jun 27, 2016)

GC_PaNzerFIN said:


> Well, the "generally impossible" has been working with pretty much all previous gen NVIDIA and AMD desktop graphics cards. In fact, I am now running 384 MHz on GTX 780 Ti just fine. I am very sad to hear I can no longer do 96Hz 1440p on this really amazing QNIX 27" panel. Might have to resort in spending 800 euros on a new screen, which only advantage would be G-SYNC and few Hz more.



Yes, you still can. You just have to boot at 81 Hz or under, and once the system is up and running you can OC it back to 1440p@96Hz (or use a lower refresh rate on the desktop and your OC'd rate for games)...


----------



## Jstn7477 (Jun 27, 2016)

Sadly, my GTX 980 has DisplayPort issues where my Acer G257HU gets frozen at the login screen with a blank screen and I have to power cycle the monitor on nearly every boot, wait for the stupid 12 seconds of Acer/ENERGY STAR crap and then resume logging in. It also does this on resolution change sometimes.

Also, apparently my monitor can be overclocked to 85Hz, but I get frame skipping at 61Hz on any interface while AMD users say the monitor is supposedly stutter-free when overclocked on their cards.


----------



## xorbe (Jun 27, 2016)

Jstn7477 said:


> wait for the stupid 12 seconds of Acer/ENERGY STAR crap



The Acer XB270HU also has an excessively slow logo screen; it's rather annoying. I turn the screen on before I need to use the computer, so when I come back it's on. Yeah, that's saving power...


----------



## AsRock (Jun 27, 2016)

ZoneDymo said:


> That would be sooooo tedious though.
> And for such an expensive card....I mean its your choice but I would keep the receipt.



Yeah, and if he flashes it, well, that'd be his fault.




ensabrenoir said:


> ......who buys a $400 + card and don't us a monitor to full experience what they paid for.......step your game up people display port!   This is like saying...Yeah i bought this Porsche/Ferrari/Lamborghini  and it just doesn't perform well on 87 octane gas(as oppose to 93 and above).  Since some hate car analogies, its also like when flat screen 1080p TVs first came out and people were still using the red white and yellow rca connectors.  You still got a picture but not the quality that you paid for.  I guess they have to put warning stickers on graphic cards suggesting the proper monitors.  Didn't they eliminate dvi on high end cards and make it so adapters didn't work either......You gotta do your research......



What a load of 



Spoiler



http://img.techpowerup.org/160627/untitled603.jpg


----------



## R-T-B (Jun 27, 2016)

This is likely driver correctable IMO.  Or at least bios correctable.

Still, disappointing to see it having issues.  If the cable can carry it, the card should transmit it without having issues booting windows, period.

It's not a big issue, but it IS an issue, period.


----------



## natr0n (Jun 27, 2016)

AMD has all the momentum now, with news like this and a new card in a few days.


----------



## Solaris17 (Jun 28, 2016)

Vayra86 said:


> Seeing as the 1070 cannot even max out 120 fps on a 1080p monitor in all games, I don't see why DVI-D can't be a high end connection. There is no single-GPU in the world that can saturate that connection with all the content you can play on it.
> 
> So what the fuck are you on about? 4K or 1440p says nothing at all about what 'end' you're on it just means you have a crapload of pixels to push.



My AOC IPS displays were driven by DVI-D and had no HDMI hookup. I'd argue that they looked better than some people's 4K displays, but that's none of my business.


----------



## ZoneDymo (Jun 28, 2016)

64K said:


> I think sometimes people talk smack about AMD because they think it's funny to see how worked up emotionally AMD fans can get when they do that.



From everything I read on forums, it's more the case that AMD has fans... and Nvidia has fanboys...
How often do you see someone on a random forum telling you Nvidia is the only way to go and AMD is crap, versus the other way around? 100 to 1?
AMD users far more often tell you to get whatever is best at that moment and works for you personally, with your personal needs in mind... or, you know, sane consumers.


----------



## TheoneandonlyMrK (Jun 28, 2016)

GC_PaNzerFIN said:


> Well, the "generally impossible" has been working with pretty much all previous gen NVIDIA and AMD desktop graphics cards. In fact, I am now running 384 MHz on GTX 780 Ti just fine. I am very sad to hear I can no longer do 96Hz 1440p on this really amazing QNIX 27" panel. Might have to resort in spending 800 euros on a new screen, which only advantage would be G-SYNC and few Hz more.


This guy's facing the actual issue, and despite many of your opinions on it, a quick Google round says 1440p 144 Hz monitors are here big time this year. I suppose they will all be DisplayPort or something, eh? Even the cheap VA ones.


----------



## P4-630 (Jun 28, 2016)

At least there is no surprise of crippled VRAM.
I'm running my monitor at 60 Hz, so no problem for me.
Once I upgrade to a 1440p monitor, I will keep in mind to buy one with DP if it has a higher refresh rate.


----------



## the54thvoid (Jun 28, 2016)

theoneandonlymrk said:


> This guys facing the actual issue , and despite many of you're opinion on it a quick Google round says 1440p 144Hz are here big time this year , I suppose they will all be displayport or something eh ,even the cheap VA ones.



I may be wrong but this isn't about 144hz displays. It's about pixel over clocking manually?


----------



## P4-630 (Jun 28, 2016)

the54thvoid said:


> I may be wrong but this isn't about 144hz displays. It's about pixel over clocking manually?



In the Dutch tech news they said:
"Nvidia GTX 1000-gebruikers die een 1440p-monitor met een refreshrate hoger dan 90Hz hebben "

Which is translated:
"Nvidia GTX 1000-users who own a 1440p-monitor with a refresh-rate higher than 90Hz"


----------



## jabbadap (Jun 28, 2016)

theoneandonlymrk said:


> This guys facing the actual issue , and despite many of you're opinion on it a quick Google round says 1440p 144Hz are here big time this year , I suppose they will all be displayport or something eh ,even the cheap VA ones.



Well, I doubt that any monitor manufacturer will sell a monitor out of spec (and if a monitor has multiple inputs, most likely only the DP connector does 1440p@144Hz). Overlord Computer (now defunct) was one of the first selling those OC'able Korean dual-link DVI 1440p@120Hz IPS monitors to western customers. But those too were 1440p@60Hz at stock (like the Korean QNIX, Catleap, etc.); the OC was left to the customer. Lovely monitors from the era before 1440p@144Hz IPS/VA DP monitors.


----------



## redeye (Jun 28, 2016)

And this is why all of the GTX 1080/1070 Pascals are sold out!... Seriously, it makes sense.
Nvidia is correcting the problem in secret, and will have a new version out "real soon now".


----------



## TheinsanegamerN (Jun 28, 2016)

the54thvoid said:


> I may be wrong but this isn't about 144hz displays. It's about pixel over clocking manually?


You're technically both right and wrong. The issue is people using pixel overclocking to push 1440p at 75+ Hz over dual-link DVI, which does not officially support anything over 60 Hz at 1440p.

The problem is that people bought these 1440p Korean monitors that only have DVI, OC'd them past 60 Hz, and now the new Pascal cards don't like that. IMO, the consumer took the risk by OCing a monitor, so they have to deal with the resulting issues. It would be nice if Nvidia fixed the issue, if they can, but they have no responsibility to do so.


----------



## TheoneandonlyMrK (Jun 28, 2016)

jabbadap said:


> Well, I doubt that any monitor manufacturer will sell monitor out of spec(and if monitor have multiple inputs, most likely only dp connector is 1440p@144Hz). Overlord computer(no defunct) was one of the first that was selling those OC ips 1440p@120Hz korean dual link dvi monitors for western customers. But those too were 1440p@60Hz(like korean qnix, catleap, etc.) at stock, OC were left for the customer. Lovely monitors from the era before 1440p@144Hz ips/VA dp monitors.


Asus and Samsung have 1440p 144 Hz monitors out soon; out of spec??? That niche you were on about caught on last year; this year it's going enthusiast mainstream.


----------



## TheinsanegamerN (Jun 28, 2016)

theoneandonlymrk said:


> Asus and Samsung have 1440p 144Hz monitors out soon, out of spec ??? , that niche you were on about caught on last year ,this year its going enthusiast mainstream.


No, they are in spec for _DisplayPort_. DisplayPort 1.2, which came out in 2009, can run 1440p144 just fine, or 4K60; 1.3 can do 1440p165, or 4K120.

The problem is these monitors out of Korea that only have DVI, which users clock above 60 Hz. DVI can only do 1440p60; anybody running higher is running out of spec.
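Those DisplayPort figures check out against link bandwidth. A rough sketch (effective rates after 8b/10b coding, 17.28 Gbit/s for DP 1.2/HBR2 and 25.92 Gbit/s for DP 1.3/HBR3; the CVT reduced-blanking timing totals and 24-bit colour are assumptions, so treat the margins as approximate):

```python
# Compare uncompressed video bandwidth needs against DisplayPort
# effective link rates (after 8b/10b coding). Timing totals assume
# CVT reduced blanking; real monitors may differ slightly.

LINKS = {"DP 1.2 (HBR2)": 17.28e9, "DP 1.3 (HBR3)": 25.92e9}
MODES = {  # name: (total pixels/line, total lines/frame, refresh Hz)
    "1440p@144": (2720, 1481, 144),
    "2160p@60":  (4000, 2222, 60),
    "2160p@120": (4000, 2222, 120),
}

for link, capacity in LINKS.items():
    for mode, (ht, vt, hz) in MODES.items():
        need = ht * vt * hz * 24  # bits per second at 24 bpp
        ok = "OK" if need <= capacity else "too much"
        print(f"{link} {mode}: needs {need / 1e9:.2f} Gbit/s ({ok})")
```

DP 1.2 clears 1440p@144 and 4K60 but not 4K120; DP 1.3's extra headroom is what makes 4K120 possible.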


----------



## Slizzo (Jun 28, 2016)

TheinsanegamerN is correct. I will add to his comment that the issue is only present when BOOTING WINDOWS with your refresh rate over 80 Hz on a 1440p panel that only has a dual-link DVI connection. All other resolutions and refresh rates within the DVI spec, or on other types of connection, are OK.


----------



## Watashi_Omda (Jun 28, 2016)

The post implies that it only affects DVI users, so I assume DisplayPort and HDMI users are not suffering from this issue, right?


----------



## truth teller (Jun 29, 2016)

truth teller said:


> not true, "old" native 75/85/120hz monitors use dvi and they seem to work just fine. heck, even some (if not most)144hz monitors have dvi, not the only interface but its probably the primary


sweet move on the late edit, mod. apparently only some can voice their sarcasm around here, what was the problem with


> yet another new account made to praise something that has some minor issues
> sure, the problem is not in the card, not at all, its in the people that try to use one of standard interfaces in it that have problems, not to mention a sturdier physical connector. those poor people that just dont get it...


that required it to be removed? why not remove the whole post then? dont get it...


----------



## Slizzo (Jun 29, 2016)

Watashi_Omda said:


> According to the post, it implies that it only affects DVI users. So I assume display port and HDMI users are not suffering from this issue right?



No issue there.


----------



## Captain_Tom (Jun 29, 2016)

P4-630 said:


> I'm using DVI-D and HDMI, both monitor and tv are running at 60Hz, so no problems for me.
> If I was running a higher refresh-rate monitor, I would just lower the Hz before shutting down, because I would be uncomfortable flashing my expensive brand new card _if_ there would be a BIOS update soon.



Great. So you paid $700 to have a graphics card without working fans, and it needs you to adjust monitor settings every time you turn it off. Look at the premium GeForce Experience, everybody!!!


----------



## P4-630 (Jun 29, 2016)

Captain_Tom said:


> Great.  So you paid $700 to have a graphics card without working fans, and it needs you to adjust monitor settings every time you turn it off.  Look at the premium Geoforce Experience everybody!!!



The fans of my card are working fine, no problems here, no problems with monitor and tv either.


----------



## Slizzo (Jun 29, 2016)

Captain_Tom said:


> Great.  So you paid $700 to have a graphics card without working fans, and it needs you to adjust monitor settings every time you turn it off.  Look at the premium Geoforce Experience everybody!!!



Wow, people are crazy angry that a NEW VIDEO CARD that no longer supports analog (a legacy, sunsetted feature) will not boot correctly when the DVI connection is forced to work outside its own spec.

This is just nuts.



Look here, this is not an issue, nor a design flaw. nVidia can hardly be expected to test or design a function that is outside of specification to work on their GPU.


----------



## xorbe (Jun 29, 2016)

Slizzo said:


> Look here, this is not an issue, nor a design flaw. nVidia can hardly be expected to test or design a function that is outside of specification to work on their GPU.



Well, you can say that it's a regression with respect to previous and competing products. I would say dot-clock 'overclocking' is far more universal than GPU overclocking. Love me some XFree86 modelines!


----------



## Makaveli (Jul 1, 2016)

TheGuruStud said:


> All of the low latency korean monitors came with only dvi (multiple inputs increase latency). Tons of them were sold for refresh OCing.
> 
> I have no monitors with DP. DP is a new addition to most monitors.



My *HP ZR24W* from 2010 has DisplayPort!


----------



## D007 (Jul 1, 2016)

ZoneDymo said:


> from everything I read on forums its more the case that AMD has fans..and Nvidia has fanboys...
> How often you see someone on a random forum telling you Nvidia is the only way to go and AMD is crap vs the other way around? 100 vs 1?
> AMD users far more often tell you about getting what is the best at that moment and what works for you personally with your personal needs in mind....or ya know...sane consumers.



Lol people who call others "fan boys" are in fact themselves, fan boys...
I've tried both.. Nvidia has always won.. Performance is what I pay for..
No amount of fan boys, hoping AMD will one day be better, will change anything..
They have their niche. Beneath Nvidia..


----------



## Camm (Jul 4, 2016)

Slizzo said:


> No issue there.



Currently issues with DP at higher refresh causing artifacting.


----------



## Slizzo (Jul 4, 2016)

Camm said:


> Currently issues with DP at higher refresh causing artifacting.



Some have it, some don't. I do not have this issue.


----------



## Camm (Jul 4, 2016)

Slizzo said:


> Some have it, some don't. I do not have this issue.



I certainly do. It's somewhat annoying that my old Titan X ran better than my new 1080.


----------



## newconroer (Jul 4, 2016)

avatar_raq said:


> It will affect a small number of users (those with high refresh rate no DP ports monitors) but it is an issue nevertheless. I wonder if a VGA bios update will solve this.



Small number? ...There are a lot of 1440p 120/144 Hz users who are still using monitors with dual-link DVI...

Man, every time I invest in an Nvidia product, right afterwards... something like this comes out.. wtf


I wonder what happens when you use a DVI-D to DisplayPort adapter.


----------



## Slizzo (Jul 5, 2016)

newconroer said:


> Small number ...there's a lot of 1440p 120/144hz users who are still using monitors with Dual DVI...
> 
> Man everytime I invest in an Nvidia product, right after wards..something like this comes out.. wtf
> 
> ...



I don't think there are any people at 1440P with 120Hz-144Hz going over a dual link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see display port on monitors that support those specs.


----------



## newconroer (Jul 5, 2016)

Slizzo said:


> I don't think there are any people at 1440P with 120Hz-144Hz going over a dual link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see display port on monitors that support those specs.



There's a whole sub-community of users who have had QNIX and Catleap monitors for years, all of whom are using dual-link DVI. The limitation is not the cable (provided it's of decent quality/gauge); the issue is the pixel clock. Pixel clock patchers exist to exceed the limitations in the graphics drivers.

In fact, I am confident that there are more 120 Hz 1440p monitor owners running DVI than there are running DisplayPort. Mass-marketed 1440p 120Hz+ isn't even a thing yet, and it's only in the past several years that consumers have taken to 1440p (at 60 Hz, let alone 120/144 Hz) over 1080p, which is still the industry standard.


----------



## Slizzo (Jul 5, 2016)

newconroer said:


> There's a whole sub community of users who have QNIX and Catleap monitors from as far back as last decade. All of which are using Dual link DVI. The limitation is not the cable (provided it's a decent quality/gauge).
> The issue is the pixel clock. Pixel clock patchers exist, to exceed the limitations in the graphics drivers.
> 
> In fact I am confident that there's more 120hz 1440p monitor owners running DVI then there are running Display Port. Mass marketed 1440p 120hz+ isn't even a thing and it's only the past several years. that consumers have taken to 1440p (at 60hz let alone 120/144hz) over 1080p - of which is still the industry standard.



You realize you're talking about people running their monitors and connection over spec right?  I own one of those QNIX monitors. I ran it at 100Hz all the time on my 780. I then bought a TRUE 144Hz 1440P screen, and the QNIX is now running as it should, at 60Hz.

Also, the issue is not present on any other connection other than DVI.


----------



## newconroer (Jul 5, 2016)

Slizzo said:


> You realize you're talking about people running their monitors and connection over spec right?


Like everyone's CPUs and GPUs? Not sure what the point is there.



Slizzo said:


> I own one of those QNIX monitors. I ran it at 100Hz all the time on my 780. I then bought a TRUE 144Hz 1440P screen, and the QNIX is now running as it should, at 60Hz.


So there's no possibility that the manufacturers intentionally designed them that way, and/or later realized that people were using them in an enthusiast way, and from that point on marketed them to feed that enthusiast crowd's interest?
That's like saying a car motor was designed and 'oops, look, I guess you can tune it for more horsepower by turning up the turbo boost pressure... oh dear, look what we've done'.




Slizzo said:


> Also, the issue is not present on any other connection other than DVI.



Maybe, but that doesn't change the fact that a lot of people have DVI-D monitors running high frequencies.


----------



## Slizzo (Jul 5, 2016)

newconroer said:


> Like everyone's CPUs and GPUs? Not sure what the point is there.
> 
> 
> So there's no possibility that the manufacturers intentionally designed them that way, or later realized people were using them in an enthusiast way and from that point on marketed them to feed that enthusiast crowd's interest?
> ...



My thing is this: nVidia should in no way be expected to "answer" for this, as cases such as this are outside of the design spec.  The fact that some are lucky and able to get their monitors to run at a refresh rate that they are not designed for is just that: luck.

Do you complain to Intel or AMD/nVidia that your overclock on your CPU or GPU is making your computer fail to boot?


----------



## newconroer (Jul 5, 2016)

Slizzo said:


> My thing is this: nVidia should in no way be expected to "answer" for this, as cases such as this are outside of the design spec.  The fact that some are lucky and able to get their monitors to run at a refresh rate that they are not designed for is just that: luck.
> 
> Do you complain to Intel or AMD/nVidia that your overclock on your CPU or GPU is making your computer fail to boot?



What could they have possibly overlooked, or dare I say done on purpose, that would prevent the GPU from functioning properly at a high pixel clock that worked on previous generations?
It does not make sense.

So yes, it looks to be a mistake - and based on what I've informed you of, one that affects a lot of users.


----------



## Slizzo (Jul 5, 2016)

newconroer said:


> What could they have possibly overlooked, or dare I say done on purpose, that would prevent the GPU from functioning properly at a high pixel clock that worked on previous generations?
> It does not make sense.
> 
> So yes, it looks to be a mistake - and based on what I've informed you of, one that affects a lot of users.



They removed the analog signal from Pascal altogether. It's completely different from what Maxwell supported.


----------



## newconroer (Jul 5, 2016)

Slizzo said:


> They removed the analog signal from Pascal altogether. It's completely different than what Maxwell supported.


You'll have to educate me on where this factors in. DVI-D is not DVI-I. It ignores the analog signal.


----------



## xorbe (Jul 6, 2016)

Camm said:


> I certainly do. Somewhat annoying my old Titan X ran better than my new 1080.



Wait, what?  What's going on with the 1080 compared to Titan X?  Should I hang onto my Titan X?  What's the deal?


----------



## RichF (Jul 6, 2016)

Dippyskoodlez said:


> I don't see how i3-equivalent FX CPUs at $200+ are not 'raping consumers' in late 2016.


1) No one is being forced to buy those.

2) The 8320E, which overclocks to 5 GHz with the right board, is $90 at Microcenter most of the time, with $40 off a board, too. People who buy the more expensive 8 core FX processors are paying a premium for not doing their homework — something extremely common in the world of consumerism and hardly enough to make AMD somehow a bad company. Corporations are out there to sell you less than what you're paying for. That's how profit-seeking works. You don't even need a Microcenter to get a low price on an FX, although it is true that the extra cost of cooling and a robust power supply can make it less of a value option.

If I were in the market for a budget gaming system I'd get a 5675C Broadwell and overclock it some. The CPU is more expensive than an FX but its gaming efficiency is quite a bit higher. Skylake doesn't really offer anything since Intel refused to put EDRAM on even one SKU.

3) In some workloads, i3 CPUs neither equal nor outperform an overclocked 8-core FX. Even though the FX design is old and still on 32nm, it has 8 integer cores. Many highly threaded, integer-heavy workloads will be good enough on an FX even today. If your workload is heavily floating-point, then look elsewhere.


----------



## Slizzo (Jul 6, 2016)

newconroer said:


> You'll have to educate me on where this factors in. DVI-D is not DVI-I. It ignores the analog signal.



I'm saying that the port is not unchanged; they made changes to their GPU, and now this "issue" comes up. An issue that no parts manufacturer should be expected to support, since it is out of standard. The fact that it worked before and doesn't now is just luck.

Intel does not support overclocking. If your CPU does not run outside of its specified voltage and clock speed, you can't really be angry at them about it, can you?


----------



## newconroer (Jul 8, 2016)

Slizzo said:


> I'm saying that the port is not unchanged; they made changes to their GPU, and now this "issue" comes up. An issue that no parts manufacturer should be expected to support, since it is out of standard. The fact that it worked before and doesn't now is just luck.
> 
> Intel does not support overclocking. If your CPU does not run outside of its specified voltage and clock speed, you can't really be angry at them about it, can you?



How is it out of standard?


----------



## Dippyskoodlez (Jul 11, 2016)

RichF said:


> 1) No one is being forced to buy those.




So it's okay for AMD to make a poor product and sell it to consumers because no one is forced to buy it, but nvidia is forcing GPUs down your throat now?

It's clear your statement was bullshit, and now you're playing mental gymnastics to avoid admitting it was stupid.


----------



## Circa_Survivor (Jul 11, 2016)

Holy crap... I have been dealing with this boot problem for 3 days after I installed my 1080. I tried everything under the sun and had a feeling it could be my monitor, which is a QNIX I overclocked to 96Hz. Luckily, I had bought a new monitor because I was sick of using DVI. I'm so damn relieved.


----------



## RichF (Jul 11, 2016)

Dippyskoodlez said:


> So it's okay for AMD to make a poor product and sell it to consumers because no one is forced to buy it, but nvidia is forcing GPUs down your throat now?


Let me know where I said anything of the sort.



Dippyskoodlez said:


> It's clear your statement was bullshit, and now you're playing mental gymnastics to avoid admitting it was stupid.


What's clear is that you're using arguments ostensibly from me, but not actually from me, to try to discredit things I actually did say.

It's bad enough to post with an abusive tone without resorting to fabrication.


----------



## newconroer (Jul 14, 2016)

Is there a comment missing from this?


----------



## Covert_Death (Jul 15, 2016)

newconroer said:


> Is there a comment missing from this?



Yes, that was weird. Chrome wouldn't let me type anything; the text box was missing from the HTML...

Anywho, what I was trying to say was:



Slizzo said:


> I don't think there are any people at 1440P with 120Hz-144Hz going over a dual link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see display port on monitors that support those specs.



While the number may be small, the whole Korean monitor shebang was a thing because those monitors could reach over 120Hz on dual-link DVI. In fact, I have a Crossover 27" 1440p monitor that I run daily at 120Hz for gaming over a single dual-link DVI cable.
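Worth noting that 1440p at 120Hz sits far past what dual-link DVI is actually specified for, which is exactly why these panels need a pixel-clock patcher. A quick back-of-the-envelope check (the blanking values here are assumptions for illustration, not measured timings):

```python
# Dual-link DVI per the DVI 1.0 spec: two TMDS links, each rated
# for a 165 MHz pixel clock, i.e. 330 MHz combined.
DUAL_LINK_DVI_SPEC_MHZ = 2 * 165

def required_pixel_clock_mhz(hactive, vactive, refresh_hz,
                             hblank=160, vblank=60):
    """Pixel clock a mode needs, given assumed blanking intervals."""
    return (hactive + hblank) * (vactive + vblank) * refresh_hz / 1e6

needed = required_pixel_clock_mhz(2560, 1440, 120)
# Roughly 490 MHz needed vs the 330 MHz spec limit - well over.
print(f"{needed:.0f} MHz needed vs {DUAL_LINK_DVI_SPEC_MHZ} MHz spec")
```

That the Korean panels accept such modes at all is out-of-spec behavior, which is the crux of the over-spec argument earlier in the thread.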

Currently this issue sucks: every time I boot, my screen shows 4 mirrored screens, one in each corner, instead of one display. I can post a screenshot later if anyone is interested, but imagine the GPU treating the 4 quadrants as separate displays in mirror mode.

It's quite strange and I've never had this issue before. It happens on every reboot now; I have to launch the NVIDIA control panel and refresh my screen settings to get back to a single display so I can use it.

I really hope this is fixed soon.


----------



## Slizzo (Jul 15, 2016)

I believe 368.69 resolved the issue with pixel clocks. That, or the new 368.81.


----------



## Covert_Death (Jul 16, 2016)

For me, .69 made the issue appear on every single boot-up (before, it was intermittent), but .81 resolved it, as you suggested!

All is well and back to 120hz


----------



## newconroer (Jul 16, 2016)

Yes, it's fixed now in .81... guess it was just a mistake on their part.


----------



## Camm (Jul 20, 2016)

xorbe said:


> Wait, what?  What's going on with the 1080 compared to Titan X?  Should I hang onto my Titan X?  What's the deal?



There's a power management bug with high bandwidth displays, which causes artifacting. Think 144hz+ panels, OC 4k screens, etc.


----------



## cheddle (Aug 23, 2016)

Came here just to say I don't get this issue. EVGA 1080 FE - boots at 2560x1440 @ 100Hz over DVI on a Korean OC monitor.


----------



## puma99dk| (Aug 23, 2016)

cheddle said:


> Came here just to say I don't get this issue. EVGA 1080 FE - boots at 2560x1440 @ 100Hz over DVI on a Korean OC monitor.



Nvidia already fixed this with a driver update some time ago.


----------

