Friday, August 24th 2018

NVIDIA's BFGD Solutions Delayed to Q1 2019, Will Cost a Pretty Penny

NVIDIA's BFGD (Big Format Gaming Display) solutions are meant to become the ultimate display solution for gamers. Their 4K resolution and 120 Hz refresh rate with G-Sync support are meant to set the baseline for smoothness in gaming scenarios, and the 1,000-nit peak brightness is meant to deliver HDR that actually matters - unlike other, less "refined", shall we say, implementations. However, the hardware specs for these systems are high, parts are expensive and difficult to procure, and the process of integrating so much technology (including Quantum Dot tech and NVIDIA Shield) seems to be giving integrators a hard time.

As such, and as part of Gamescom coverage, NVIDIA partners made the press aware of a recent decision to delay these BFGD panels' market introduction - they have been moved to Q1 2019. And as the launch timeframe has slipped, cost estimates for the end user have jumped: these now sit in the €4,000 to €5,000 ballpark, making these displays, with as much tech as they pack, a difficult purchase to stomach. The fact that OLED display solutions can be had, in the same diagonals, for much, much less should give anyone pause before committing to one of these BFGD displays. And even if the value one places on G-Sync does tip the scales, remember that the HDMI 2.1 standard brings VRR (Variable Refresh Rate) support with it, and that Xbox consoles already support the open, free-to-implement FreeSync standard.
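For a rough sense of why the 4K / 120 Hz / HDR combination is so demanding on current display connectivity, a back-of-the-envelope calculation helps. The sketch below is illustrative only: the ~5% blanking overhead is an assumption, and the effective link rates are the commonly quoted payload figures for each interface.

```python
# Rough link-bandwidth estimate for a 4K / 120 Hz / 10-bit RGB signal.
# Assumption: ~5% blanking overhead (reduced-blanking timings); effective
# link rates are the usual payload figures after channel coding.

def required_gbps(h, v, hz, bits_per_pixel, blanking_overhead=1.05):
    """Approximate data rate in Gbit/s for an uncompressed video signal."""
    return h * v * hz * bits_per_pixel * blanking_overhead / 1e9

need = required_gbps(3840, 2160, 120, 30)  # 10 bits per channel, RGB

links_gbps = {
    "HDMI 2.0 (effective)": 14.4,               # 18 Gbps raw, 8b/10b coding
    "DisplayPort 1.4 HBR3 (effective)": 25.92,  # 32.4 Gbps raw, 8b/10b coding
    "HDMI 2.1 FRL 48G (effective)": 42.67,      # 48 Gbps raw, 16b/18b coding
}

print(f"Required: ~{need:.1f} Gbps")
for name, rate in links_gbps.items():
    verdict = "fits uncompressed" if rate >= need else "needs DSC or chroma subsampling"
    print(f"{name}: {rate} Gbps -> {verdict}")
```

The roughly 31 Gbps requirement is also why existing G-Sync HDR monitors fall back to chroma subsampling at their highest refresh rates over DisplayPort 1.4.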
Sources: Hardware.Info, via Videocardz

53 Comments on NVIDIA's BFGD Solutions Delayed to Q1 2019, Will Cost a Pretty Penny

#26
TheoneandonlyMrK
Too expensive, too late in the day, and G-Sync is now an unnecessary price burden IMHO with HDMI 2.1 out.

Note they've been chatting about BFGD for so long I'm bored before even seeing one, tut. PR vapour nonsense thread, IMHO.

"Look at us, loook at us" - Nvidia.
#27
Vayra86
the54thvoid: It's peak brightness, not sustained.
It's still ridiculously high. And I strongly doubt it's not detrimental to your eyesight, and it is 100% more tiresome to look at. You have to adjust way more often and more dramatically.
Vya Domus: Is there anything that Nvidia does which doesn't cost a pretty penny?
Sure, for GFE they want your soul :p (logins)
#28
cucker tarlson
They better provide a lifetime supply of eyedrops with that.
#29
Totally
cucker tarlson: I was really talking about a longer period of time AND NOT ABOUT FRIGGIN GPUS, another one who doesn't understand that. How much did the first 1440p 144 Hz monitors cost, and how much are they now?
Two years ago they were going for about $400; a quick visit to Newegg shows the same exact monitor going for $360, and its replacement for $449.
#30
R-T-B
Totally: Things get cheaper as people buy more, really? You still buy into that fairy tale?
It's not a "fairy tale." It just requires a competitive marketplace, which the gpu market presently lacks.
#31
TheoneandonlyMrK
R-T-B: It's not a "fairy tale." It just requires a competitive marketplace, which the gpu market presently lacks.
It's only not competitive above a certain price point - the price point Nvidia left behind with the 680, when they started out on this price-hike innovation - and I am presently residing in the realistic-expectations camp as far as RTX goes; I await reviews.
IMHO.
#32
hat
Enthusiast
That price is about as high as the dude who set it... Nvidia execs are smoking the GOOD shit.
#33
Fluffmeister
I didn't realise Nvidia made monitors? But I appreciate the name gets people mad instantly.
#34
FordGT90Concept
"I go fast!1!11!1!"
the54thvoid: It's peak brightness, not sustained.
Even worse. Rapid changes in brightness cause eye strain.
Fluffmeister: I didn't realise Nvidia made monitors? But I appreciate the name gets people mad instantly.
They don't. They make the G-Sync module they're selling for probably $1000-2000, and they're putting requirements on the panel itself that significantly add to that cost before it reaches the market. NVIDIA just priced itself out of the market.


Oh damn, I had to re-read that. "NVIDIA SHIELD built-in." So now your monitor has a Tegra chip with its own ARM CPU and Kepler GPU. Gee, I wonder why. Oh, right, the G-Sync module historically was basically a mini computer to handle NVIDIA's bullshit. They just went all the way now. I wonder how long this monitor takes to boot up. :roll: And how are they going to manage "ultra-low latency" when everything has to be handled by two GPUs? [facepalm.jpg] Give. Up. NVIDIA. Implement the Adaptive-Sync standard.
#35
Fluffmeister
FordGT90Concept: They don't. They make the G-Sync module they're selling for probably $1000-2000, and they're putting requirements on the panel itself that significantly add to that cost before it reaches the market. NVIDIA just priced itself out of the market.
I thought it was $200? People need to make their minds up.
#36
FordGT90Concept
"I go fast!1!11!1!"
That's for non-HDR G-Sync modules. HDR G-Sync modules are closer to $500. There's no way the panel tech is costing $3500+, so it's likely these new HDR modules are even more expensive than the previous ones. Call it the "SHIELD tax." NVIDIA is all "we're adding features to your monitors so we deserve more money." Monitor manufacturers need to flip the finger at NVIDIA and wash their hands of it. Only the huge monitor manufacturers can even afford to consider selling a G-Sync monitor.
#37
Fluffmeister
Last time I checked, monitor manufacturers are in the business of making money; it's their name on it, after all, not Nvidia's.

If they didn't think this was worth the effort they wouldn't bother.
#38
FordGT90Concept
"I go fast!1!11!1!"
Since none are out yet, I'd say they aren't bothering. The numbers we're seeing probably come from preliminary estimates, which are being leaked to the press so that NVIDIA reconsiders the pricing of the module and/or its requirements (1,000 nits is stupid).
#39
GreiverBlade
Ed_1: I wish they would focus more on 1440p with all the features in, like 27-32" sizes.
They would sell way more than some niche size/res monitor.
I have a 1440p 32" (well ... more like 1620p), but what do you mean by all the features? G-Sync? Erk ... my screen would cost 500+ and not 299 if it carried that :laugh: 144 Hz? Well ... if my card would push more than 60ish average at 1440p (a little lower, but not much, at 1620p), and granted that even a 1080 Ti can't do 144 Hz at 1440p (close, but not equal or above), maybe it would be nice ... (tho ... a 20XX would probably do that ...) 60 Hz/75 Hz OC is quite fine in my setup...

Short version: my current monitor is proof that you can have more for less :laugh: (well ... personal opinion, right?)

Suspecting the incoming ultimate 4K GPU, namely the RTX 2080 Ti (an arm, a kidney and maybe a part of your liver, taxes not included of course), will push obscenely priced 4K monitors with more gimmicks than ever ...
#40
Fluffmeister
Who needs body parts? I thought Switzerland benefited greatly from stolen gold! Cash in and treat yourself.
#41
RH92
zelnep: 120 fps at 4K?... Like, what GPU is Nvidia planning to recommend for this? Does Nvidia know something about AMD Navi? Or Intel's 2020 project? Because one thing Nvidia can know for sure - no green team cards can push past 80 fps at 4K anytime soon.
You live under a rock or something? The 2080 is already averaging 71 fps in that 10-game sample Nvidia provided, so the 2080 Ti should be able to break the 100 fps 4K barrier, and that's without taking DLSS etc. into account.
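As a rough sanity check on that (back-of-the-envelope only, using the announced core counts and reference boost clocks; real-game scaling usually lands between these bounds or a bit below):

```python
# Naive scaling estimate from the RTX 2080's reported ~71 fps 4K average.
# Announced specs: RTX 2080 = 2944 cores @ 1710 MHz boost (reference),
#                  RTX 2080 Ti = 4352 cores @ 1545 MHz boost (reference).

fps_2080 = 71

cores_2080, boost_2080 = 2944, 1710  # MHz
cores_ti, boost_ti = 4352, 1545      # MHz

upper = fps_2080 * cores_ti / cores_2080                              # cores only
lower = fps_2080 * (cores_ti * boost_ti) / (cores_2080 * boost_2080)  # cores x clock

print(f"Estimated RTX 2080 Ti 4K average: ~{lower:.0f}-{upper:.0f} fps")
# -> Estimated RTX 2080 Ti 4K average: ~95-105 fps
```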
#42
ToxicTaZ
jabbadap: Speaking of which, what are the specs of DisplayPort 1.4a, which Turing is ready for? I can't find them anywhere; the only thing that came up was that it was released in April 2018.
Turing is HDMI 2.0b

rog.asus.com/articles/gaming-graphics-cards/introducing-geforce-rtx-2080-ti-and-rtx-2080-graphics-cards-from-rog-and-asus

RTX2000 series fights AMD Vega 20

Next year 7nm Nvidia Ampere (RTX3000) series will have HDMI 2.1 and PCIe 4.0..... Second generation Ray Tracing!

But that's Q4 2019 or Q1 2020

2020 Nvidia Ampere is fighting 10nm Intel Arctic Sound and 7nm AMD Navi
#43
Caring1
Renald: Playing with HDR 1000 is like that:


Prepare to have your eyes on fire after 1 hour.

My 400 cd/m² screen is at half or a third (depending) of full luminosity, so I can use it without losing an eye at the end of the day.
Next, Nvidia will be selling gaming glasses to ease eye strain during long gaming sessions.
#44
Totally
Caring1: Next, Nvidia will be selling gaming glasses to ease eye strain during long gaming sessions.
#45
bug
These are expensive, but if we get the same monitors with FreeSync, they should be like $500 cheaper! Oh, wait...

On a more serious note, I've given up on trying to find a decent 32", 4K, HDR-capable monitor. The hype is there, but the technology isn't. It will take a few more years. Which is fine, because that's about how long it will take for the video cards I buy (in the $200-300 range) to start handling 4K OK-ish.
#46
jabbadap
ToxicTaZ: Turing is HDMI 2.0b

rog.asus.com/articles/gaming-graphics-cards/introducing-geforce-rtx-2080-ti-and-rtx-2080-graphics-cards-from-rog-and-asus

RTX2000 series fights AMD Vega 20

Next year 7nm Nvidia Ampere (RTX3000) series will have HDMI 2.1 and PCIe 4.0..... Second generation Ray Tracing!

But that's Q4 2019 or Q1 2020

2020 Nvidia Ampere is fighting 10nm Intel Arctic Sound and 7nm AMD Navi
Yes, of course it has HDMI 2.0b. But I was talking about DisplayPort 1.4a. I suppose it has the same bandwidth and other specs as DP 1.4; the only difference might be what it says in the FAQ:
What is the current version of the DisplayPort Standard?
DisplayPort 1.4a was published in April, 2018 and defines the new normative requirement and informative guideline for component and system design.

For more information on DisplayPort 1.4a, see DisplayPort 1.4a Standard FAQs
#47
Frick
Fishfaced Nincompoop
I don't object to this; in fact, I find it quite cool. Halo products are supposed to be crazy. But I honestly can't figure out a use case for it. Yes, if it's good enough it can be used as more than a "mere" gaming monitor, but who is the target audience? Companies needing large displays with this tech, for some reason? Gamers with truly high-end computers who play from the couch with controllers? Or are there people who actually use monitors of this size as desktop monitors?
#48
RichF
Vayra86: It's still ridiculously high. And I strongly doubt it's not detrimental to your eyesight
The point of this 1000 nits target is for advertising. They want their flash flash flash ads to really sear into you.

edit: It's also potentially a great way to turn over OLED sets more quickly, since those pixels will wear out faster, especially as the ridiculous 8K craze becomes the standard. People will "discover" problems like gamut shrinkage (especially in the blue shades) and contrast reduction and offer upgrades to fix the problem. "Old set looking washed-out, the new-and-improved sets not only have 10K resolution, they have a wider color gamut than sRGB!"

What gamers and video watchers need more than 1000 nits is better static contrast (except for OLED) and a vastly wider color gamut than the ancient sRGB. The new HDR standard is going in that direction but too much emphasis is being placed where it shouldn't be (pixel shrinkage and, especially, excessive eye-searing brightness). I have no doubt that the primary factor behind the brightness marketing is advertising. Ad companies have already discovered the trick of turning the screen black periodically during commercials to make people think the ad is over.
#49
Mistral
At this rate these will be DOA by the time they come out...
#50
Vayra86
RichF: The point of this 1000 nits target is for advertising. They want their flash flash flash ads to really sear into you.

edit: It's also potentially a great way to turn over OLED sets more quickly, since those pixels will wear out faster, especially as the ridiculous 8K craze becomes the standard. People will "discover" problems like gamut shrinkage (especially in the blue shades) and contrast reduction and offer upgrades to fix the problem. "Old set looking washed-out, the new-and-improved sets not only have 10K resolution, they have a wider color gamut than sRGB!"

What gamers and video watchers need more than 1000 nits is better static contrast (except for OLED) and a vastly wider color gamut than the ancient sRGB. The new HDR standard is going in that direction but too much emphasis is being placed where it shouldn't be (pixel shrinkage and, especially, excessive eye-searing brightness). I have no doubt that the primary factor behind the brightness marketing is advertising. Ad companies have already discovered the trick of turning the screen black periodically during commercials to make people think the ad is over.
The 1000 nit HDR spec and the VESA specs are not for OLED but for LCD.

OLED doesn't need this peak brightness to achieve HDR ;)