Monday, June 27th 2016

GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

The second design flaw to hit the GeForce GTX 1080 and GTX 1070 after the fan revving bug isn't confined to the reference "Founders Edition" cards, but affects all GTX 1080 and GTX 1070 cards. Users of monitors with dual-link DVI connectors are noticing problems booting to Windows with pixel clocks set higher than 330 MHz. You can boot to Windows at default pixel clocks and, once booted, set the refresh rates (and consequently pixel clocks) higher than 330 MHz, and the display works fine; you just can't boot with those settings, and will have to revert to defaults each time you shut down or restart your machine.

A user of a custom-design GTX 1070 notes that if the refresh rate of their 1440p monitor is set higher than 81 Hz (the highest refresh rate achievable with the pixel clock staying under 330 MHz) at a resolution of 2560 x 1440, the machine doesn't correctly boot into Windows. The splash screen is replaced with flashing color screens, and nothing beyond. The system BIOS screen appears correctly (because it runs at low resolutions). The problem is also reported on a custom-design GTX 1080, and has been replicated by other users on the GeForce Forums.
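For a rough sense of where that 81 Hz ceiling comes from: the pixel clock is the total number of pixels scanned per frame (active plus blanking) multiplied by the refresh rate. The Python sketch below assumes the blanking intervals of the standard CVT-RB 2560x1440@60 mode (2720 x 1481 total pixels), held constant across refresh rates; real monitor timings vary, so treat the figures as estimates rather than the exact values any given panel reports:

# Estimate the DVI pixel clock needed for 2560x1440 at various refresh rates.
# Assumption: blanking borrowed from the CVT-RB 2560x1440@60 mode and held
# fixed; actual EDID timings differ from monitor to monitor.

H_TOTAL = 2720  # 2560 active pixels + 160 horizontal blanking (CVT-RB)
V_TOTAL = 1481  # 1440 active lines + 41 vertical blanking (CVT-RB)
DVI_DUAL_LINK_MHZ = 330.0  # dual-link DVI ceiling: two 165 MHz TMDS links

def pixel_clock_mhz(refresh_hz):
    """Pixel clock (MHz) required to scan the full frame refresh_hz times a second."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 75, 81, 82, 96, 120):
    clock = pixel_clock_mhz(hz)
    verdict = "boots" if clock <= DVI_DUAL_LINK_MHZ else "over 330 MHz -> fails to boot"
    print(f"2560x1440 @ {hz:3d} Hz -> {clock:6.1f} MHz ({verdict})")

With these assumed timings, 81 Hz works out to roughly 326 MHz while 82 Hz just crosses 330 MHz, which lines up with the cutoff users are reporting.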
Source: Reddit

147 Comments on GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

#51
P4-630
rtwjunkie: As far as I am concerned, Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until autumn.

It's why I always wait at least 2 releases now before upgrading drivers. They can't be trusted anymore not to release shit.
Some people used to say that AMD had crappy drivers, but now Nvidia has shown driver problems as well.
I'll also wait to update my display driver after a new one has been released.
We'll read about it soon enough on the internet if there are any problems with them :D
#52
ensabrenoir
......who buys a $400+ card and doesn't use a monitor to fully experience what they paid for.......step your game up, people: DisplayPort! This is like saying... yeah, I bought this Porsche/Ferrari/Lamborghini and it just doesn't perform well on 87 octane gas (as opposed to 93 and above). Since some hate car analogies, it's also like when flat-screen 1080p TVs first came out and people were still using the red, white and yellow RCA connectors. You still got a picture, but not the quality that you paid for. I guess they have to put warning stickers on graphics cards suggesting the proper monitors. Didn't they eliminate DVI on high-end cards and make it so adapters didn't work either......You gotta do your research......
#53
rtwjunkie
PC Gaming Enthusiast
ensabrenoir: ......who buys a $400+ card and uses DVI or an HDMI even......step your game up, people: DisplayPort! This is like saying... yeah, I bought this Porsche/Ferrari/Lamborghini and it just doesn't perform well on 87 octane gas (as opposed to 93 and above). Do they have to put warning stickers on graphics cards suggesting a high-end monitor? Didn't they eliminate DVI on some cards and make it so adapters didn't work either......
People have different standards of what makes a good monitor. For me, image quality is important, followed by screen size for what works on my desk, and then screen material. I prefer glossy for its rich colors. I don't need 144 Hz 1440p to have a great gaming experience.

Why? Because that experience is subjective, and no one can say that something is inadequate. I'm also very cost-conscious, and I am not going to replace my near-$300 monitor with something considerably more expensive just to make elitists happy.
#54
bug
P4-630: The market is still flooded with monitors that offer only HDMI, VGA and DVI-D.
I bought a nice one with an IPS display not long ago.
I have it connected through DVI-D.
A TV takes the HDMI port.
Not a problem. Leave only HDMI and DP connections on the video card (possibly with some mini variants in there) and let those who still cling to DVI use an HDMI-to-DVI cable (it's cheap; I know, I have one). Or include an HDMI-to-DVI adapter with the card if you feel generous.
#55
Zubasa
ensabrenoir: ......who buys a $400+ card and uses DVI or an HDMI even......step your game up, people: DisplayPort! This is like saying... yeah, I bought this Porsche/Ferrari/Lamborghini and it just doesn't perform well on 87 octane gas (as opposed to 93 and above). Do they have to put warning stickers on graphics cards suggesting a newer monitor? Didn't they eliminate DVI on high-end cards and make it so adapters didn't work either......
Thing is, people made a big deal about the Fury X not having HDMI 2.0, so I guess plenty of people use HDMI.
#56
robert3892
I've been reading this thread, and it usually affects users of 2K monitors imported from Korea that have only a DVI port (no DP port). Monitors which have DP ports are not affected, provided the user is NOT using the DVI port.

I used to have a QNIX myself, and these monitors can easily be overclocked by users up to 120 Hz. So until NVIDIA comes out with a fix, don't overclock QNIX monitors beyond 80 Hz... maybe make it 75 Hz just to be sure.
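To put numbers on that advice, here is a small sketch that inverts the pixel clock formula to find the highest refresh rate staying under the 330 MHz threshold. Both timing totals are illustrative assumptions, not measured QNIX EDID values:

# Hypothetical helper: highest refresh rate that keeps the pixel clock under
# the 330 MHz dual-link DVI threshold for a given total (active + blanking)
# frame size. Both timing examples are assumptions, not measured QNIX EDIDs.

PIXEL_CLOCK_LIMIT_HZ = 330e6  # the boot-time ceiling reported in this thread

def max_refresh_hz(h_total, v_total):
    """Largest refresh rate such that h_total * v_total * rate stays at or under the limit."""
    return PIXEL_CLOCK_LIMIT_HZ / (h_total * v_total)

print(f"CVT-RB-like blanking (2720 x 1481): {max_refresh_hz(2720, 1481):.1f} Hz max")
print(f"Tightened blanking   (2592 x 1444): {max_refresh_hz(2592, 1444):.1f} Hz max")

Even with aggressively tightened blanking, the ceiling sits in the 80s, well short of the 96-120 Hz these panels can reach, so backing off to 75-80 Hz until NVIDIA ships a fix is the conservative play.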
#58
rtwjunkie
PC Gaming Enthusiast
jaggerwild: I find it funny it's been 2 weeks since the story of hand-picked overclocked samples, and now this. Add to it low stock, and I think we're gonna see a new card sooner than Christmas from the big N.

www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709 601201888 601203818 601204369
I wouldn't be surprised to see a 1060 from them, since they are literally poised to lose the whole mid-tier to AMD. Of course, 1060s would also need to be in stock, LOL!

:roll:
#59
ensabrenoir
Zubasa: Thing is, people made a big deal about the Fury X not having HDMI 2.0, so I guess plenty of people use HDMI.
True..... I edited my rant as soon as higher logic and coffee kicked in.
#60
rtwjunkie
PC Gaming Enthusiast
TheinsanegamerN: More people are beginning to get on the high refresh rate train. In that world, DVI no longer matters.
Yes, they are. To my eyes, though, I see no reason to do it. And I have a modern, recent monitor. Earlier I stated which qualities in a monitor are important to me.

I've never been a victim of peer pressure. I don't follow the herd. I've always done what works for me in life.
#61
Ubersonic
bug: Maybe this will convince Nvidia it's time to put DVI to greener pastures. It's limited to 2560x1600@60Hz anyway (see en.wikipedia.org/wiki/Digital_Visual_Interface#Technical_overview)
It's not limited to that; that's just the "spec", but it's common to see systems running out of spec. You can do 1080p/144Hz and 1440p/120Hz over dual-link DVI no problem.

I do agree that it's time to drop the DVI port on high-end cards, though. Now that they are using DVI-D instead of DVI-I, DVI-VGA adapters no longer work, so they may as well just drop the DVI port, or replace it with an extra HDMI port and include an HDMI-DVI adapter; that will save space.
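For anyone curious what those two modes actually demand of the link, here is a quick sketch. The blanking figures are assumptions (tight timings typical of 144 Hz panels for the first mode, CVT-RB-style blanking for the second), so the clocks are ballpark only:

# Ballpark pixel clocks for the modes mentioned above, versus the dual-link
# DVI spec ceiling of 330 MHz (two 165 MHz TMDS links). Blanking values are
# assumptions; real EDIDs differ per monitor.

DVI_DL_SPEC_MHZ = 330.0

MODES = {
    "1920x1080 @ 144 Hz": (2080, 1085, 144),  # tight blanking, common on 144 Hz panels
    "2560x1440 @ 120 Hz": (2720, 1481, 120),  # CVT-RB-style blanking
}

for name, (h_total, v_total, hz) in MODES.items():
    clock_mhz = h_total * v_total * hz / 1e6
    if clock_mhz <= DVI_DL_SPEC_MHZ:
        verdict = "within the 330 MHz ceiling"
    else:
        verdict = "beyond the ceiling -> the link itself is overclocked"
    print(f"{name}: ~{clock_mhz:.0f} MHz ({verdict})")

By these rough numbers, 1080p/144 squeaks in at about 325 MHz, while 1440p/120 needs roughly 480 MHz, far past what the TMDS links are rated for; that it works at all comes down to tolerant panels and short, good cables.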
#62
ZoneDymo
Ubersonic: It's not limited to that; that's just the "spec", but it's common to see systems running out of spec. You can do 1080p/144Hz and 1440p/120Hz over dual-link DVI no problem.

I do agree that it's time to drop the DVI port on high-end cards, though. Now that they are using DVI-D instead of DVI-I, DVI-VGA adapters no longer work, so they may as well just drop the DVI port, or replace it with an extra HDMI port and include an HDMI-DVI adapter; that will save space.
should most def drop it if you cannot seem to do it correctly....yeah...
#63
Camm
There is also a problem with DisplayPort sync above 120 Hz, which causes white-line artifacting.
#64
bug
Ubersonic: It's not limited to that; that's just the "spec", but it's common to see systems running out of spec. You can do 1080p/144Hz and 1440p/120Hz over dual-link DVI no problem.

I do agree that it's time to drop the DVI port on high-end cards, though. Now that they are using DVI-D instead of DVI-I, DVI-VGA adapters no longer work, so they may as well just drop the DVI port, or replace it with an extra HDMI port and include an HDMI-DVI adapter; that will save space.
Well, you can run anything out of spec, but then you can no longer expect predictable results, can you?
#65
Zubasa
ZoneDymo: should most def drop it if you cannot seem to do it correctly....yeah...
Dropping that bulky DVI connector can also mean more space for the exhaust on reference cards.
#66
horik
I was looking for a GTX 1070 to replace my GTX 970, but with the Korean monitor I have, OC'ed to 96 Hz, I might have problems.
#67
Steevo
Sounds like they have a display controller issue, and it's almost exactly in the market they are laying claim to with these cards: high-resolution, high-FPS displays.
Perhaps they are lucky they're not able to produce more actual cards yet; a BIOS update for thousands of cards will undoubtedly cause some to be bricked, and no replacements available would be a worse black eye than the already absurd price gouging and limited availability.
#68
Slizzo
Just going to throw this out here, but how many monitors are out there that support greater than a 60 Hz refresh rate (natively, no overclocking) and only use DVI? I'm going to venture none. Therefore NVIDIA can't have known about this, because they likely do not test out-of-spec configurations.

This issue isn't present for the majority of GTX 1080 owners, myself included; however, I stopped overclocking my gen 1 QNIX QX2710 when I got my XB270HU.
#69
jabbadap
Steevo: Sounds like they have a display controller issue, and it's almost exactly in the market they are laying claim to with these cards: high-resolution, high-FPS displays.
Perhaps they are lucky they're not able to produce more actual cards yet; a BIOS update for thousands of cards will undoubtedly cause some to be bricked, and no replacements available would be a worse black eye than the already absurd price gouging and limited availability.
Well, if this happened with DisplayPort monitors, like 1440p@144Hz or 2160p@60Hz, then I would agree. But the "bug" is with running DL-DVI out of spec in Mpix/s, and is thus not very critical. This is a more troublesome bug:
forums.geforce.com/default/topic/939426/geforce-1000-series/displayport-does-not-work-with-the-htc-vive-on-gtx-1080-/1/
#70
truth teller
Slizzo: Just going to throw this out here, but how many monitors are out there that support greater than a 60 Hz refresh rate (natively, no overclocking) and only use DVI? I'm going to venture none. Therefore NVIDIA can't have known about this, because they likely do not test out-of-spec configurations.
Not true; "old" native 75/85/120 Hz monitors use DVI, and they seem to work just fine. Heck, even some (if not most) 144 Hz monitors have DVI; it's not the only interface, but it's probably the primary one.
#71
Swampdonkey
I think the takeaway from the problems with the 1070/1080s is to not buy the first batch. I've been impatient in the past, and been burned several times with glitchy cards requiring an RMA. No doubt Nvidia will sort out these bugs. Until then, my wallet's staying closed.
#72
TheinsanegamerN
truth teller: Not true; "old" native 75/85/120 Hz monitors use DVI, and they seem to work just fine. Heck, even some (if not most) 144 Hz monitors have DVI; it's not the only interface, but it's probably the primary one.
"old" native 75/85/120hz were also not running 2560x1600 resolution. DVI has no trouble pushing that HZ at newer rez, and you do not need to push the pixel clock higher to achieve said refresh rates. As such, said monitors would have no issue with a 1080.

If any of them WERE pushing above 2560x1600 at 60 Hz, then those monitors were being sold outside of spec, and whoever bought them took the risk that something wouldn't work. Any decent monitor pushing such specs really should have used DisplayPort.
#73
Valdas
Slizzo: Just going to throw this out here, but how many monitors are out there that support greater than a 60 Hz refresh rate (natively, no overclocking) and only use DVI? I'm going to venture none. Therefore NVIDIA can't have known about this, because they likely do not test out-of-spec configurations.

This issue isn't present for the majority of GTX 1080 owners, myself included; however, I stopped overclocking my gen 1 QNIX QX2710 when I got my XB270HU.
Take a look at my ViewSonic V3D245. It has D-Sub, DVI and HDMI.
#74
MxPhenom 216
ASIC Engineer
the54thvoid: Monitors using dual-link DVI with refresh rates above 81 Hz. I'm sure this will have some slobbering over Nvidia failing again, but really...

Is DisplayPort or HDMI not better? And if your monitor doesn't have those, why buy an expensive gfx card? My 6-year-old Dell has DisplayPort.
Korean 1440p PLS and IPS monitors, the good ones, only have dual-link DVI, and these are the only ones that allow a decent refresh rate overclock, plus very low input lag.

Looks like I'll be waiting for this to hopefully be fixed in drivers or something before I buy a 1070, since this will be an issue for me.
#75
TheinsanegamerN
Valdas: Take a look at my ViewSonic V3D245. It has D-Sub, DVI and HDMI.
And it's only 1080p, so dual-link DVI pushing that rez faster than 60 Hz would be in spec. Hence no problems here.

The real question should be: how many monitors are out there that use DVI for input, run above 60 Hz, AND are 1440p/1600p? The answer is very few, and those were sold out of spec, so NVIDIA couldn't be expected to test for that.