Friday, August 24th 2018
NVIDIA's BFGD Solutions Delayed to Q1 2019, Will Cost a Pretty Penny
NVIDIA's BFGD solutions (Big Format Gaming Displays) are meant to become the ultimate display solution for gamers. Their 4K resolution and 120 Hz refresh rate with G-Sync support are meant to set the baseline for smoothness in gaming scenarios, and the 1,000-nit peak brightness is meant to deliver HDR images that actually matter, setting them apart from other, less "refined", shall we say, implementations. However, the hardware specifications for these displays are high, parts are expensive and difficult to procure, and the process of integrating so much technology (including Quantum Dot tech and a built-in NVIDIA SHIELD) seems to be giving integrators a hard time.

As such, and as part of Gamescom coverage, the press was made aware by NVIDIA partners of a recent decision to delay these BFGD panels' market introduction to Q1 2019. And as the launch timeframe has slipped, so have cost estimates for the end user: these now sit in the €4,000 to €5,000 ballpark, making these displays, for all the tech they pack, a difficult buy to stomach. The fact that OLED display solutions can be had in the same diagonals for much, much less should give anyone pause before committing to a BFGD. Even if the value one places on G-Sync does tip the purchase decision, remember that integration of the HDMI 2.1 standard brings VRR (Variable Refresh Rate) support with it, and that Xbox consoles already support the open, free-to-implement FreeSync standard.
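For context on why driving a panel like this is non-trivial, here is a rough back-of-the-envelope bandwidth estimate, a minimal sketch only: the ~10% blanking overhead is an assumption, and any chroma subsampling or DSC compression would change the final figures.

```python
# Rough uncompressed bandwidth estimate for a 4K, 120 Hz, 10-bit (HDR-class) signal.
# Blanking overhead is an assumed approximation; real video timings differ.

WIDTH, HEIGHT = 3840, 2160
REFRESH_HZ = 120
BITS_PER_PIXEL = 3 * 10          # RGB, 10 bits per channel
BLANKING_OVERHEAD = 1.10         # assumed ~10% extra for blanking intervals

raw_gbps = WIDTH * HEIGHT * REFRESH_HZ * BITS_PER_PIXEL / 1e9
with_blanking_gbps = raw_gbps * BLANKING_OVERHEAD

# Approximate usable video data rates (after line-encoding overhead) of the relevant links:
LINKS = [
    ("HDMI 2.0", 14.4),          # 18 Gbps signal rate, 8b/10b encoding
    ("DisplayPort 1.4", 25.9),   # HBR3 x4 lanes, 8b/10b encoding
    ("HDMI 2.1", 42.6),          # 48 Gbps signal rate, 16b/18b encoding
]

print(f"Active pixel data:      {raw_gbps:.1f} Gbps")
print(f"With assumed blanking:  {with_blanking_gbps:.1f} Gbps")
for name, capacity_gbps in LINKS:
    verdict = "fits" if with_blanking_gbps <= capacity_gbps else "does NOT fit uncompressed"
    print(f"{name}: ~{capacity_gbps} Gbps usable -> {verdict}")
```

Under these assumptions, only HDMI 2.1 carries the full 10-bit 4K 120 Hz signal uncompressed, which is part of why that standard keeps coming up in this context; older links need chroma subsampling or compression to get there.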
Sources:
Hardware.Info, via Videocardz
53 Comments on NVIDIA's BFGD Solutions Delayed to Q1 2019, Will Cost a Pretty Penny
Note: they've been chatting about BFGD for so long I'm bored before even seeing one, tut. PR vapour nonsense thread, IMHO.
"Look at us, look at us" <- NVIDIA.
Imho.
Oh damn, I had to re-read that. "NVIDIA SHIELD built-in." So now your monitor has a Tegra chip with its own ARM CPU and Maxwell GPU. Gee, I wonder why. Oh, right, the G-Sync module has historically been basically a mini computer built to handle NVIDIA's bullshit. They just went all the way now. I wonder how long this monitor takes to boot up. :roll: And how are they going to manage "ultra-low latency" when everything has to be handled by two GPUs? [facepalm.jpg] Give. Up. NVIDIA. Implement the Adaptive-Sync standard.
If they didn't think this was worth the effort they wouldn't bother.
Short version: my current monitor is proof that you can have more for less :laugh: (well... personal opinion, right?)
I suspect the incoming "ultimate 4K" GPU, namely the RTX 2080 Ti (an arm, a kidney and maybe part of your liver, taxes not included of course), will push obscenely priced 4K monitors with more gimmicks than ever...
rog.asus.com/articles/gaming-graphics-cards/introducing-geforce-rtx-2080-ti-and-rtx-2080-graphics-cards-from-rog-and-asus
The RTX 2000 series fights AMD Vega 20.
Next year's 7 nm NVIDIA Ampere (RTX 3000) series will have HDMI 2.1 and PCIe 4.0... second-generation ray tracing!
But that's Q4 2019 or Q1 2020.
In 2020, NVIDIA Ampere is fighting 10 nm Intel Arctic Sound and 7 nm AMD Navi.
On a more serious note, I've given up on trying to find a decent 32", 4K, HDR-capable monitor. The hype is there, but the technology isn't. It will take a few more years. Which is fine, because that's about how long it will take for the video cards I buy (in the $200-300 range) to start handling 4K ok-ish.
Edit: It's also potentially a great way to turn over OLED sets more quickly, since those pixels will wear out faster, especially as the ridiculous 8K craze becomes the standard. People will "discover" problems like gamut shrinkage (especially in the blue shades) and contrast reduction, and manufacturers will offer upgrades to fix the problem. "Old set looking washed out? The new-and-improved sets not only have 10K resolution, they have a wider color gamut than sRGB!"
What gamers and video watchers need more than 1000 nits is better static contrast (except for OLED) and a vastly wider color gamut than the ancient sRGB. The new HDR standard is going in that direction but too much emphasis is being placed where it shouldn't be (pixel shrinkage and, especially, excessive eye-searing brightness). I have no doubt that the primary factor behind the brightness marketing is advertising. Ad companies have already discovered the trick of turning the screen black periodically during commercials to make people think the ad is over.
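To put the gamut point in rough numbers, here is a small sketch comparing the triangle areas spanned by the standard sRGB, DCI-P3, and Rec.2020 primaries. The primary coordinates are the published CIE 1931 xy values; note that areas in this chromaticity plane are not perceptually uniform, so the ratios are illustrative only.

```python
# Compare color-gamut triangle areas in the CIE 1931 xy chromaticity plane.
# The area ratio is only a rough proxy for "how much wider" a gamut is.

GAMUTS = {
    "sRGB / Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":         [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(primaries):
    """Shoelace formula for the triangle spanned by the R, G, B primaries."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb_area = triangle_area(GAMUTS["sRGB / Rec.709"])
for name, primaries in GAMUTS.items():
    area = triangle_area(primaries)
    print(f"{name}: area {area:.4f} ({area / srgb_area:.2f}x sRGB)")
```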
OLED doesn't need this peak brightness to achieve HDR ;)
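On that note, a quick sketch of the HDR10 PQ transfer function (SMPTE ST 2084) shows why absolute peak brightness isn't the whole story: the encoding covers luminances up to 10,000 nits, and the ~700-nit figure used below as an OLED-class example is an assumption, not a measured spec.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: maps absolute luminance (nits) to the
# normalized 0..1 signal value used by HDR10 content.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32
PQ_PEAK_NITS = 10_000  # the PQ curve is defined up to 10,000 nits

def pq_signal(nits: float) -> float:
    """Fraction of the PQ code range needed to represent a given luminance."""
    y = nits / PQ_PEAK_NITS
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

# Assumed example peaks: ~700 nits (OLED-class) vs the BFGD's quoted 1,000 nits.
for peak_nits in (100, 700, 1000, 10_000):
    print(f"{peak_nits:>6} nits -> {pq_signal(peak_nits) * 100:.1f}% of the PQ signal range")
```

Under these assumptions, the jump from roughly 700 to 1,000 nits only buys a few extra percentage points of the PQ code range, with the rest handled by tone mapping either way.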