Monday, June 27th 2016
GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems
The second design flaw to hit the GeForce GTX 1080 and GTX 1070 after the fan-revving bug isn't confined to the reference "Founders Edition" cards, but affects all GTX 1080 and GTX 1070 cards. Users of monitors with dual-link DVI connectors are reporting problems booting into Windows with pixel clocks set higher than 330 MHz. You can boot into Windows at default pixel clocks and, once booted, raise the refresh rate (and consequently the pixel clock) above 330 MHz, and the display works fine; you just can't boot with those settings, and have to revert to defaults each time you shut down or restart your machine.
A user of a custom-design GTX 1070 notes that if their 1440p monitor's refresh rate is set higher than 81 Hz (the highest refresh rate achievable with the pixel clock staying under 330 MHz) at a resolution of 2560 x 1440, the machine doesn't boot into Windows correctly. The splash screen is replaced with flashing color screens, and nothing beyond. The system BIOS screen displays correctly (because it runs at low resolutions). The problem is also reported on a custom-design GTX 1080, and has been replicated by other users on the GeForce Forums.
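The 81 Hz figure follows from simple arithmetic: pixel clock is roughly the horizontal total times the vertical total times the refresh rate. Here is a minimal sketch of that calculation in Python, assuming CVT reduced-blanking timings for a 2560 x 1440 mode (htotal = 2720, vtotal = 1481); actual monitor EDIDs may use slightly different blanking, so the exact crossover point can vary.

```python
# Sketch: why ~81 Hz is the ceiling at 2560 x 1440 over dual-link DVI
# before hitting the 330 MHz pixel-clock threshold described above.
# Blanking figures are a CVT reduced-blanking assumption, not EDID-exact.

H_TOTAL = 2720    # 2560 active + 160 blanking pixels (CVT-RB assumption)
V_TOTAL = 1481    # 1440 active + 41 blanking lines (CVT-RB assumption)
LIMIT_HZ = 330e6  # pixel-clock threshold reported to break booting

def pixel_clock(refresh_hz: float) -> float:
    """Approximate pixel clock in Hz for the assumed 2560 x 1440 timings."""
    return H_TOTAL * V_TOTAL * refresh_hz

for rate in (60, 75, 81, 82, 96, 120):
    clock = pixel_clock(rate)
    status = "OK" if clock <= LIMIT_HZ else "exceeds 330 MHz"
    print(f"{rate:3d} Hz -> {clock / 1e6:6.1f} MHz ({status})")
```

With these assumed timings, 81 Hz lands at about 326 MHz while 82 Hz tips just over 330 MHz, which matches the behavior users are reporting.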
Source:
Reddit
147 Comments on GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems
I'll also hold off on updating my display driver when a new one is released.
We'll read about it soon enough on the internet if there are any problems with it :D
Why? Because that experience is subjective, and no one can tell me that something is inadequate. I'm also very cost-conscious, and I'm not going to replace my nearly $300 monitor with a considerably more expensive one just to make elitists happy.
I used to have a QNIX myself, and these monitors can easily be overclocked by users up to 120 Hz. So until NVIDIA comes out with a fix, don't overclock QNIX monitors beyond 80 Hz... maybe make it 75 Hz just to be sure.
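For what it's worth, a quick check backs up that 75-80 Hz guidance for a 2560 x 1440 QNIX panel. This sketch reuses the CVT reduced-blanking assumption from earlier (h_total = 2720, v_total = 1481); overclockers often tighten blanking further, which would lower the clock at a given refresh rate.

```python
# Sanity-check the "stay under ~80 Hz" advice for a QNIX QX2710
# (2560 x 1440, dual-link DVI only). Timings are assumed CVT-RB values.

def safe_to_boot(refresh_hz: float,
                 h_total: int = 2720,
                 v_total: int = 1481,
                 limit_hz: float = 330e6) -> bool:
    """True if the mode's pixel clock stays at or under the 330 MHz threshold."""
    return h_total * v_total * refresh_hz <= limit_hz

print(safe_to_boot(75))   # True  -> ~302 MHz, boots fine
print(safe_to_boot(80))   # True  -> ~322 MHz, still under the limit
print(safe_to_boot(96))   # False -> ~387 MHz, revert before rebooting
```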
www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888%20601203818%20601204369
:roll:
I've never been a victim of peer pressure. I don't follow the herd. I've always done what works for me in life.
I do agree that it's time to drop the DVI port on high-end cards, though. Now that they use DVI-D instead of DVI-I, DVI-to-VGA adapters no longer work, so they may as well drop the DVI port or replace it with an extra HDMI port and include an HDMI-to-DVI adapter; that would save space.
Perhaps they're lucky they're not able to produce more actual cards yet; a BIOS update for thousands of cards will undoubtedly brick some of them, and having no replacements available would be a worse black eye than the already absurd price gouging and limited availability.
This issue isn't present for the majority of GTX 1080 owners, myself included; however, I stopped overclocking my gen 1 QNIX QX2710 when I got my XB270HU.
forums.geforce.com/default/topic/939426/geforce-1000-series/displayport-does-not-work-with-the-htc-vive-on-gtx-1080-/1/
If any of them WERE pushing above 2560 x 1600 at 60 Hz, then those monitors were being sold outside of spec, and whoever bought them took the risk that something wouldn't work. Any decent monitor pushing those specs really should have used DisplayPort.
Looks like I'll be waiting for this to hopefully be fixed in drivers or something before I buy a 1070, since this will be an issue for me.
The real question should be: how many monitors are out there that use DVI for input, run above 60 Hz, AND are 1440p/1600p? The answer is very few, and those were sold out of spec, so NVIDIA couldn't be expected to test for that.