Thursday, January 10th 2019

NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

PC World managed to get a hold of NVIDIA CEO Jensen Huang, picking his brain on AMD's recently announced Radeon VII. Skipping the usual amicable, politically correct answers, Jensen made his thoughts clear on what the competition is offering to counter NVIDIA's RTX 2000 series. The answer? Radeon VII is an "underwhelming product", because "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." Not content with dissing the competition's product, Jensen Huang also quipped about AMD's presentation and product strategy, saying that "It's a weird launch, maybe they thought of it this morning."
Of course, real market penetration of the technologies Jensen Huang mentions is currently extremely low - only a handful of games support NVIDIA's forward-looking ray tracing features. That AMD chose not to invest significant resources and die space in what is essentially a stop-gap high-performance card to go against NVIDIA's RTX 2080 means its 7 nm, 331 mm² GPU will compete against NVIDIA's 12 nm, 545 mm² die - if performance estimates are correct, of course.
The next remarks concerned AMD's FreeSync (essentially AMD's branding of VESA's Adaptive-Sync), which NVIDIA finally decided to support on its GeForce graphics cards - something the company could have done from the outset, instead of going the proprietary, module-based, cost-adding route of G-Sync. While most see this as a sign that the market for NVIDIA's price-premium G-Sync monitors has slowed and that the company is simply ceding to market demand, Huang sees it another way, saying that "We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards." In the wake of these words from Jensen, it's hard to explain the overall silence from users whose FreeSync monitors supposedly aren't working.

Reportedly, NVIDIA found only 12 out of 400 FreeSync-supporting monitors fit to have its G-Sync technology enabled automatically in the initial battery of tests, with all other panels requiring a manual override to enable the technology. Huang promised that "We will test every single card against every single monitor against every single game and if it doesn't work, we will say it doesn't work. And if it does, we will let it work," adding a snarky punchline to the matter with a "We believe that you have to test it to promise that it works, and unsurprisingly most of them don't work." Fun times.
Source: PC World

270 Comments on NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

#226
xtreemchaos
gamerman
You can get tablets for NVIDIA addiction :) Sounds like you have the same as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?
Posted on Reply
#227
Valantar
xtreemchaos: gamerman
You can get tablets for NVIDIA addiction :) Sounds like you have the same as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?
I believe the tablet for Nvidia addiction is called Nintendo Switch.

... I'll see myself out.
Posted on Reply
#229
Slizzo
Nxodus: AMD fanbois will be so angry when in a couple of years RTX becomes standard, and AMD will be forced to adopt it, effectively increasing the prices of their hot plastic cards even more.
Sasqui: You know, that is a good point, they invested in RTX and maybe or maybe not it'll play out well for them. "Standard" is also a good question; since DX12 introduced the RTX API in Win 10, it's still very young. It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful of cards (or maybe one right now... 2080 Ti) that can really handle it.

It's not a very compelling story at the moment.
Guys, you realize that currently, the only implementation of RTRT on desktop is Windows 10, with the 1809 update, using DXR as the library to display said technology? You know, the library that anyone can use to accelerate ray tracing? It's not "RTX API". It's Microsoft's DXR API, which is a part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.
scevism: I've just got a 55'' Samsung TV with FreeSync built in, 120 fps with my R9 290X, only at 1080p, but works for me. WITH MY LEATHER JACKET ON LOL.....
Loving my Q6 so far. Been a great TV.
Anymal: No halving thanks to DLSS
Man, you guys' reliance on DLSS being an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming. I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60 fps in the only RTX-supported title so far, with cards that are lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.

Beyond all that, the Radeon 7 does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of its predecessor, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, and twice as much of it as last time.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
Posted on Reply
#230
Valantar
Slizzo: Guys, you realize that currently, the only implementation of RTRT on desktop is Windows 10, with the 1809 update, using DXR as the library to display said technology? You know, the library that anyone can use to accelerate ray tracing? It's not "RTX API". It's Microsoft's DXR API, which is a part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.

Loving my Q6 so far. Been a great TV.

Man, you guys' reliance on DLSS being an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming. I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60 fps in the only RTX-supported title so far, with cards that are lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.

Beyond all that, the Radeon 7 does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of its predecessor, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, and twice as much of it as last time.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree (edit: to clarify, not the post quoted above, but the one you've all noticed if you've read the last page of posts). That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. Just shows how easily accustomed one gets to insanity.
Posted on Reply
#231
kapone32
I bought a water block for my Sapphire Vega 64 card. I seriously hope that the layout is the same as when I ran Fire Strike this morning.
Valantar: The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree. That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. Just shows how easily accustomed one gets to insanity.
Exactly. And people are talking about DLSS and ray tracing while forgetting PhysX, HairWorks and SLI, which Nvidia took over and hogged for themselves, only to see the technology go unused because of the way they do things - versus AMD with Vulkan, which led to DX12, or FreeSync, which Nvidia is suddenly supporting. Of course, with a comment from Jensen making it seem like FreeSync is only good for NVidia cards...
Posted on Reply
#232
Casecutter
Like I said, "no bad cards, just bad pricing". Let's hope this spurs more competitiveness!
Posted on Reply
#233
Sasqui
Slizzo: It's Microsoft's DXR API, which is a part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.
Yes, yes and yes. It's a component of DX in the latest Windows 10 version that everyone has access to.

The catch is that game developers have to write code to implement it, and the hardware has to be able to process it as well (a minimal capability check is sketched after this list). Currently, these games are in the works to support it, to one degree or another:
  • Assetto Corsa Competizione from Kunos Simulazioni/505 Games.
  • Atomic Heart from Mundfish.
  • Battlefield V from EA/DICE.
  • Control from Remedy Entertainment/505 Games.
  • Enlisted from Gaijin Entertainment/Darkflow Software.
  • Justice from NetEase.
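To make the "everyone has access to it" point concrete: from an application's side, DXR capability is exposed through a standard Direct3D 12 feature query rather than any vendor-specific API. A minimal sketch, assuming a valid ID3D12Device pointer and the Windows 10 1809 SDK (error handling trimmed):

    #include <d3d12.h>

    // Ask the D3D12 runtime whether this device supports DXR at all.
    bool SupportsDxr(ID3D12Device *device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false; // older runtime: the feature struct is unknown
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }

Any vendor that reports D3D12_RAYTRACING_TIER_1_0 or higher here, AMD included, runs the same DXR code path.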
Posted on Reply
#234
mindbomb
theoneandonlymrk: What, so screen tearing, which FreeSync was made to fix, wouldn't give the game away if it's not working?
People would notice.
The screen tearing is fixed with FreeSync, people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have deal-breaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.
Vayra86: Actually no, there are in-depth Youtube vids about just about everything, and when there are doubts, some reddit or YT or Twitch channel will explode because there's a new daily shitstorm to click on.
There have been problems noted. This PC Perspective article notes excessive ghosting on early FreeSync monitors.
www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung C24FG70 had overshoot artifacts with FreeSync at 100 Hz, and flickering on the desktop with FreeSync on.
pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as GPU or CPU reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something Nvidia requires for all G-Sync monitors.
Posted on Reply
#235
TheoneandonlyMrK
mindbomb: The screen tearing is fixed with FreeSync, people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have deal-breaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.

There have been problems noted. This PC Perspective article notes excessive ghosting on early FreeSync monitors.
www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-

They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung C24FG70 had overshoot artifacts with FreeSync at 100 Hz, and flickering on the desktop with FreeSync on.

pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as GPU or CPU reviews, and these things can easily go unnoticed by the community.
So you have 450-ish more examples (of different monitors) to find before Huang is right. Crack on.
He did say they're all broken.

And you are here trying to back his words. Have you seen this issue in the flesh?

Anyone with some sense would expect the odd monitor, or even a line of monitors, to possibly have issues, but to say they're all broken... well.

My Samsung works without artefacts etc., but tbf it is a 45/75 Hz one, so I can see it's not that model.
Posted on Reply
#236
Unregistered
"Underwhelming. The performance is lousy."

Kinda funny since that's exactly what I thought when reading the Turing reviews.
Posted on Reply
#237
mtcn77
mindbomb: The screen tearing is fixed with FreeSync, people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have deal-breaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.

There have been problems noted. This PC Perspective article notes excessive ghosting on early FreeSync monitors.
www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung C24FG70 had overshoot artifacts with FreeSync at 100 Hz, and flickering on the desktop with FreeSync on.
pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as GPU or CPU reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something Nvidia requires for all G-Sync monitors.
There is no need to rationalize all that much after the debate; here you go:
Posted on Reply
#238
Unregistered
Raevenlord: "underwhelming product"
Ironically, the 2080 Ti is exactly this.
Posted on Reply
#239
Manoa
Smartcom5: Yup, like RTX and DLSS …

Smartcom
loool
Anymal: You weren't there when the first T&L GPU came out, eh!
Maybe he wasn't, but it looks like you weren't there when the first shading GPU (R300) came out.
Markosz: - and this was 3 years ago... they only got worse.

AMD doesn't lock features behind hardware, they make open-source platforms.
They don't do it for you, or should I say us; they do it because it helps them save money on software development expenses. It is also the reason they have shit GPU drivers.
Posted on Reply
#240
Vayra86
mindbomb: The screen tearing is fixed with FreeSync, people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have deal-breaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.

There have been problems noted. This PC Perspective article notes excessive ghosting on early FreeSync monitors.
www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung C24FG70 had overshoot artifacts with FreeSync at 100 Hz, and flickering on the desktop with FreeSync on.
pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as GPU or CPU reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something Nvidia requires for all G-Sync monitors.
Sorry but no. There are terrible monitors with G-Sync too. It has nothing to do with VRR and everything to do with the panel and the board behind it.

TFT Central has it all, in case you are curious. Not once has FreeSync been considered related to ghosting or artifacting. The only gripe is limited ranges.
Posted on Reply
#241
Manoa
Valantar: While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements from doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.
Vega IS far better suited for (professional) compute applications.
This is not about what AMD feels.
Memory bandwidth gives nothing to gaming performance on Vega.
It is not double the ROPs, they are still 64.
But guess what? I also suggest we wait for reviews.
Fluffmeister: Ultimately this is good news, finally Vega can compete with the 1080 Ti. Let's wait for the review and see if TSMC's 7nm is the silver bullet many had hoped for.
It's not really 7 nm; actual feature size is closer to Intel's 10 nm.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he means a monitor not running the same hertz range in FreeSync as it does without it.
That makes him right about that, since it is a lie sold by specifications that makes people believe FreeSync is full range. This says something to the many people here who say AMD is not as dirty as NVIDIA...

Sasqui: Assetto CC has shit graphics; ray tracing is not going to do it any good.
Posted on Reply
#242
Valantar
Manoa: Vega IS far better suited for (professional) compute applications.
Yes, that's what I said. Kind of weird to try to make a point against me by repeating what I said like that. Is a snarky tone supposed to make it mean something different? This is a fact. It doesn't mean that it can't work for gaming. I was very surprised to see Vega 20 arrive as a gaming card, but apparently it performs well for this also. Nothing weird about that.
Manoa: This is not about what AMD feels.
Considering they're the only ones with any knowledge of how this performs in various workloads, it very much is. Launching it is also 100% AMD's decision, so, again, this comes down to how AMD feels about it.
Manoa: Memory bandwidth gives nothing to gaming performance on Vega.
Source?
Manoa: It is not double the ROPs, they are still 64.
That seems to be true, given the coverage of how this was an error in initial reports. My post was made before this was published, though.
Manoa: It's not really 7 nm; actual feature size is closer to Intel's 10 nm.
Node names are largely unrelated to feature size in recent years - there's nothing 10 nm in Intel's 10 nm either - but this is still a full node shrink from 14/12 nm. Intel being slightly more conservative with node naming doesn't change anything about this.
Manoa: I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he means a monitor not running the same hertz range in FreeSync as it does without it.
If that's the case, he should look up the definition of "work". At best, that's stretching the truth significantly, but I'd say it's an outright lie. What you're describing there is not "not working".
Manoa: That makes him right about that, since it is a lie sold by specifications that makes people believe FreeSync is full range. This says something to the many people here who say AMD is not as dirty as NVIDIA...
Why is that? AMD helped create an open standard. How this is implemented is not up to them, as long as the rules of the standard are adhered to (and even then, it's VESA that owns the standard, not AMD). AMD has no power to police how monitor manufacturers implement FreeSync - which is how it should be, rather than Nvidia's cost-adding gatekeeping. AMD has never promised more than tear-free gaming (unless you're talking FS2, which is another thing entirely), which they've delivered. AMD even has a very helpful page dedicated to giving accurate information on the VRR range of every single FreeSync monitor.

Also, your standard for "as dirty as" seems... uneven, to say the least. On the one hand, we have "created an open standard and didn't include strict quality requirements or enforce this as a condition for use of the brand name", while on the other we have (for example) "attempted to use their dominant market share to strong-arm production partners out of using their most famous/liked/respected/recognized brand names for products from their main competitor". Do those look equal to you?
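On those VRR ranges, one detail worth spelling out: whether a FreeSync monitor stays smooth below its minimum refresh depends on Low Framerate Compensation, and the commonly cited rule of thumb is that LFC needs the panel's maximum refresh to be at least roughly twice its minimum. A sketch of that heuristic - illustrative only, with the VrrRange type and the exact threshold being assumptions, and actual driver behavior varying per monitor:

    // Rule-of-thumb check for Low Framerate Compensation support.
    struct VrrRange { float minHz, maxHz; };

    bool SupportsLfc(VrrRange r)
    {
        return r.maxHz >= 2.0f * r.minHz; // e.g. 48-144 Hz qualifies, 48-75 Hz doesn't
    }

    // Below the range, the driver can repeat each frame n times so the
    // effective refresh lands back at or above minHz.
    int FrameMultiplier(VrrRange r, float fps)
    {
        if (fps <= 0.0f) return 1; // guard for the sketch
        int n = 1;
        while (fps * n < r.minHz) ++n;
        return n;
    }

This is why a narrow-range panel simply drops out of variable refresh below its floor, while a wide-range panel can keep going by doubling frames.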
Posted on Reply
#243
Manoa
You mean feels as in "it's good enough to go out to the market"?
Yeah, sure it can work in gaming; the first Vegas proved that :). I'm saying it like that because it's the wrong chip for the wrong purpose; maybe better to augment Polaris?
You're right, AMD is better because they use the open-standards approach; that's not the problem. For me the problem is that they are not doing it for good reasons, for us; they are doing it for themselves...
Thanks man, that's a very elaborate response; you cleared up a few things :)
I didn't know the FreeSync problems are because of the monitors, not AMD or the FreeSync system itself.
Everyone is lying about the nm today, eh?
Posted on Reply
#244
Aquinus
Resident Wat-man
Valantar: Source?
I have a Vega 64 and I assure you that memory bandwidth isn't helping gaming. 20% overclock on HBM yields no tangible performance benefit with mine. These cards are constrained by power consumption, not memory bandwidth.
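For scale, the back-of-envelope bandwidth math behind that observation - assuming the commonly quoted Vega 64 figures of a 2048-bit HBM2 bus at a 945 MHz memory clock, double data rate; this is plain arithmetic, not a benchmark:

    #include <cstdio>

    int main()
    {
        const double busBits = 2048.0; // HBM2 bus width (assumed Vega 64 spec)
        const double clockHz = 945e6;  // memory clock (assumed Vega 64 spec)
        const double ddr     = 2.0;    // two transfers per clock

        double stock = clockHz * ddr * busBits / 8.0 / 1e9; // bytes/s -> GB/s
        double oc    = stock * 1.2;                         // +20% HBM overclock
        std::printf("stock: %.0f GB/s, +20%% OC: %.0f GB/s\n", stock, oc);
        return 0;
    }

That overclock adds roughly 100 GB/s on paper (about 484 to about 581 GB/s); if games don't respond to it, the bottleneck is elsewhere, which supports the power-constraint reading.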
Posted on Reply
#245
HTC
Camm: This strikes me more as Huang being somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative G-Sync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia's again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which will continue to see Nvidia technology only bolted on, rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, and it will invariably benefit AMD for optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with a lack of x86 ability, and only so much of that can be countered with its investment in RISC.

Full disclosure - I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful.
Have not heard of this: source link, please?
Posted on Reply
#246
Valantar
HTC: Have not heard of this: source link, please?
The HDMI 2.1 spec includes VRR support as a requirement, which would prevent Nvidia from certifying its cards against that spec if it still limited VRR support to G-Sync. I guess a "middle of the road" option would be possible by supporting VRR on non-G-Sync displays only if they are HDMI 2.1 displays, but that would just lead to (yet another) consumer uproar against Nvidia. Let alone the fact that (at least according to all reports I've seen) the HDMI 2.1 implementation of VRR is technically very, very close to AMD's existing extension of VESA Adaptive-Sync into FreeSync over HDMI. Nvidia would have no technical excuse whatsoever not to support FreeSync.
Posted on Reply
#247
champsilva
GinoLatino: I don't think so! The difference is that nVidia was stable and just dropped, while AMD had a "bubble" and then came back to usual values, and is even higher than "before the bubble".
nVidia: red
AMD: green

According to this, 2018 was the boom for the RTG group.

Posted on Reply
#248
FordGT90Concept
"I go fast!1!11!1!"
Discrete Vega did really well by that chart.
Posted on Reply
#249
mindbomb
Vayra86: Sorry but no. There are terrible monitors with G-Sync too. It has nothing to do with VRR and everything to do with the panel and the board behind it.

TFT Central has it all, in case you are curious. Not once has FreeSync been considered related to ghosting or artifacting. The only gripe is limited ranges.
No; since the G-Sync boards have very fine control of overdrive as the refresh rate changes, they will likely have better response times than FreeSync or G-Sync Compatible monitors, since those tend to have coarse control of overdrive. The amount of overdrive needed changes depending on the refresh rate, which is why this technology is important.

There is nothing stopping monitor companies from giving their FreeSync monitors finer control, but only one FreeSync monitor has implemented this, the Nixeus EDG27 (they call it adaptive anti-ghosting).

So you could do the experiment: find a G-Sync and a FreeSync IPS monitor that use the same panel, and you'll see that the G-Sync monitor ends up with better response times at variable refresh rates, due to the better overdrive system mandated by Nvidia.
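To illustrate what "finer control of overdrive" means here - purely a toy model, not any vendor's actual firmware logic, with every name and constant made up: the drive strength that is correct at the panel's maximum refresh overshoots at lower rates, because pixels get more time to settle between refreshes, so a dynamic scheme rescales the gain per frame interval, for instance by interpolating between two tuned points:

    #include <algorithm>

    // Toy model: linearly interpolate overdrive gain across the VRR window.
    float OverdriveGain(float refreshHz,
                        float minHz = 48.0f,    float maxHz = 144.0f,
                        float gainAtMin = 0.4f, float gainAtMax = 1.0f)
    {
        float clamped = std::clamp(refreshHz, minHz, maxHz);
        float t = (clamped - minHz) / (maxHz - minHz);
        return gainAtMin + t * (gainAtMax - gainAtMin);
    }

A fixed-overdrive monitor effectively evaluates this at a single refresh rate and applies the result everywhere, which is why a panel tuned for 144 Hz can overshoot badly at 60 Hz under VRR.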
Posted on Reply
#250
TheoneandonlyMrK
Manoa: You mean feels as in "it's good enough to go out to the market"?
Yeah, sure it can work in gaming; the first Vegas proved that :). I'm saying it like that because it's the wrong chip for the wrong purpose; maybe better to augment Polaris?
You're right, AMD is better because they use the open-standards approach; that's not the problem. For me the problem is that they are not doing it for good reasons, for us; they are doing it for themselves...
Thanks man, that's a very elaborate response; you cleared up a few things :)
I didn't know the FreeSync problems are because of the monitors, not AMD or the FreeSync system itself.
Everyone is lying about the nm today, eh?
YES THEY ARE. Laptops had a version of FreeSync for years (it's what FreeSync was based on) before Huang's crew decided to reinvent the wheel and charge for it; at least AMD don't charge. And Vega is fine for gaming, in first or second edition; most games can manage to run at 4K with ultra or very high settings, with a few (not a lot of) exceptions, but those exceptions do matter to some, me included.
Posted on Reply