Thursday, January 10th 2019

NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

PC World managed to get hold of NVIDIA CEO Jensen Huang and picked his brain on AMD's recently announced Radeon VII. Skipping the usual amicable, politically correct answers, Jensen made his thoughts clear on what the competition is offering against NVIDIA's RTX 2000 series. The answer? Radeon VII is an "underwhelming product", because "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." Not content with dissing the competition's product, Jensen Huang also quipped about AMD's presentation and product strategy, saying that "It's a weird launch, maybe they thought of it this morning."
Of course, real market penetration of the technologies Jensen Huang mentions is currently extremely low - only a handful of games support NVIDIA's forward-looking ray tracing technology. AMD chose not to invest significant resources and die space in what is essentially a stop-gap high-performance card pitted against NVIDIA's RTX 2080, which means its 7 nm, 331 mm² GPU will compete against NVIDIA's 12 nm, 545 mm² die - if performance estimates are correct, of course.
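As a rough illustration of why die size matters for cost, the classic dies-per-wafer approximation can be applied to the two dies mentioned above. This is a sketch only: it assumes standard 300 mm wafers and ignores yield, scribe lines, and the very different per-wafer prices of TSMC's 7 nm and 12 nm processes.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: usable dies = wafer area / die area,
    minus an edge-loss term for partial dies along the circumference."""
    radius = wafer_diameter_mm / 2.0
    full_dies = math.pi * radius * radius / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return math.floor(full_dies - edge_loss)

print(dies_per_wafer(331))  # Radeon VII (Vega 20), ~331 mm2 on 7 nm -> 176
print(dies_per_wafer(545))  # RTX 2080 (TU104), ~545 mm2 on 12 nm   -> 101
```

Roughly 176 versus 101 candidate dies per wafer - the smaller die wins on count, but 7 nm wafers cost considerably more per unit, which is why the cost comparison isn't clear-cut.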
The next remarks concerned AMD's FreeSync (essentially a brand name for VESA's Adaptive-Sync), which NVIDIA finally decided to support on its GeForce graphics cards - something the company could have done from the outset, instead of going the proprietary, module-based, cost-increasing route of G-Sync. While most see this as a sign that NVIDIA has noticed a market slowdown for its price-premium G-Sync monitors and is simply ceding to market demand, Huang sees it another way, saying that "We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards." In the wake of these words from Jensen, it's hard to explain the overall silence from users whose FreeSync monitors supposedly don't work.

Reportedly, NVIDIA found only 12 out of 400 FreeSync-capable monitors that met its requirements for enabling G-Sync automatically in its initial battery of tests, with the remaining panels requiring a manual override to enable the technology. Huang promised that "We will test every single card against every single monitor against every single game and if it doesn't work, we will say it doesn't work. And if it does, we will let it work," adding a snarky punchline to the matter: "We believe that you have to test it to promise that it works, and unsurprisingly most of them don't work." Fun times.
Source: PC World

270 Comments on NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

#176
mtcn77
AnymalIt looks like: lol, 7nm, 300w tdp only 2080 or 1080ti performance from 2 years ago, no new features, but 16gb hbm2, for games...
Wrong, 180 W and ~92% yield per max OC. People need to remember GPU Boost 3.0 is vendor-locked while the AMD driver is accessible. On Nvidia, changing voltages and undervolting is done by default, not manually like on AMD. Also, the memory model is tiled, meaning it is not immediate mode and always consuming power like AMD's. If both cards were at 100% load, the comparison would most likely be in AMD's favour because of the stated inference.
Posted on Reply
#177
Valantar
nickbaldwin86love how everyone is beating on the stock market being down for NV

@ time of posting it is at 143.09 and UP
While AMD is DOWN @ 19.33 ... I would rather own stock in a company that is worth $143 than a company with $19. just food for thought people.

NVIDIA Corporation
NASDAQ: NVDA
143.09 USD +0.51 (0.36%)
Jan 10, 1:56 PM EST ·

Advanced Micro Devices, Inc.
NASDAQ: AMD
19.33 USD −0.86 (4.26%)
Jan 10, 1:55 PM EST
Doesn't that depend on the price when you actually bought the stock? Your post sure makes it sound like you don't quite understand how this works. Owning $150 stock if you bought in at $170 is certainly a lot worse than owning $15 stock having bought in at $10 - and as you're likely to have invested the same amount, you'd also have 10-20x more shares of the cheaper one, making the difference very noticeable.
SasquiGone from a majority to a healthy minority.

If it weren't for crypto, NVDA and AMD wouldn't be crapping their trousers at this point in time.
What? You get that the total addressable market has expanded dramatically over that period of time, right? The PC gaming hardware market is not smaller than before, and the software market certainly isn't. Other devices growing into their own and establishing themselves as parallel alternatives does not at all spell impending doom for the incumbent segments. That PC gaming is now "a minority" is meaningless when the total gaming market is 10x what it was when that was the case.
efikkanTo buyers, the only thing that actually matters is how well it performs, along with thermals, noise etc. What process they use, how many cores, what clocks, etc. shouldn't matter to buyers, the benchmarks should show the reality. But if AMD needs ~300W to match the performance level of RTX 2080 / GTX 1080 Ti, they need to offer up some other advantage to justify going with a TDP that much higher, e.g. lower price.

Strategically, I can't see any real good reasons why AMD releases this card. My best guess is they felt the need to get some attention in the market since they don't have anything else for a good while. This is similar to the "pointless" RX 590 released last year, released just to get another review cycle.

For the future, this doesn't look good. AMD just spent what might be the last "good" node shrink, and repurposed a GPU for the professional market, a GPU which is very expensive to make and taking up precious capacity on the 7nm node. If this was planned a long time ago, they would have made a GPU without full fp64 support and with GDDR6, then at least they could have sold it much cheaper. But with Vega 20 they are at a disadvantage, not only is the HBM2 more expensive, the die is too. And they even put 16GB on it, which is completely wasted for gaming. If they really wanted more than 8GB, they could have gone with 12GB, and at least made it a little cheaper.
You've got this all turned around.
-Very expensive chip? Not likely. It's <400mm2. Even on 7nm, that's likely to be cheaper than Nvidia's monster 12nm chips. It does need an interposer and HBM, but given that GDDR6 is reportedly 60-70% more expensive than GDDR5(X), that leaves us at roughly price parity between GDDR6 and HBM2. AMD uses twice as much, which means lower margins there, but the difference is not night and day. The interposer is likely not that expensive either, given that it's far smaller than the Fiji lineup's interposer.
-"That much higher TDP"? While there's no denying that Nvidia's cards are more efficient, by a significant margin, the difference between a 300W VII and a 275W 1080Ti or 225W 2080 is ... not that much, really. 9 and 33% respectively. The difference was far bigger the last go around, with the 275W V64 roughly matching the 180W 1080 (52%).
-Strategically, this is AMD saying "Hey, we're here, and we can compete this go around too" rather than leaving the old and 1080-ish V64 as their highest performer. At the same price or cheaper than the 2080 for matching performance, this is a good alternative, particularly with a good tri-fan cooler for the reference card. That makes a lot of sense, as the other option would be leaving the RTX series unopposed until Navi arrives. If that's within a few months, this would be of no use, but if it's towards fall or Q4, this is a very good strategic move. Nvidia can't launch anything new in response; all they can do is lower prices on their existing cards, which we know are ridiculously expensive to produce, even if they are still overpriced. If Nvidia doesn't, AMD will sell more. Whatever happens, this is a good move by AMD.
-"Spending a node shrink" on a refresh of a known architecture is a well established mode of operations in the tech world. It's when you don't have a node shrink that you need a new arch - a position that AMD has now put itself in. This primes the market for Navi. Of course, it lessens the "wow" effect if Navi delivers on improving perf/w (combining that with a node shrink would make it really stand out), but on the other hand it makes for continuity in AMD's offerings and the performance of their product stack.
-While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements when doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.
-You're right that 12GB of HBM2 might have made this cheaper, but getting 6hi vs. 8hi KGSDs of HBM2 is likely a challenge, and given that 8hi is in volume production for enterprise/datacenter/etc., it might be the same price or even cheaper than special-ordered 6hi stacks.
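The TDP overhead figures in the post above are simple ratios, and can be checked directly. The wattages are the board-power numbers as quoted in this thread; the small discrepancy on the last figure comes down to rounding.

```python
def overhead_pct(tdp_a: float, tdp_b: float) -> int:
    """Percentage of extra power card A draws relative to card B, rounded."""
    return round((tdp_a / tdp_b - 1.0) * 100.0)

print(overhead_pct(300, 275))  # Radeon VII vs GTX 1080 Ti -> 9
print(overhead_pct(300, 225))  # Radeon VII vs RTX 2080    -> 33
print(overhead_pct(275, 180))  # Vega 64 vs GTX 1080       -> 53 (the post says 52)
```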
Posted on Reply
#178
Anymal
Vayra86You really have reading issues, I already told you, skim back a few pages. Think of it as a treasure hunt!
2 posts before you say there is no new content, no visual gains. Treasure hunt is over.
Posted on Reply
#179
HammerON
The Watchful Moderator
Alright @Vayra86 and @Anymal - enough already. Take it to PM if you want to continue bickering.
Posted on Reply
#180
VulkanBros
This seems like a very odd statement from Mr. nVidia...
He must know that such a statement will be discussed in various tech forums and will likely backfire, so what prompted such an odd remark? It makes no sense at all to me :confused:
Posted on Reply
#181
HisDivineOrder
This whole affair really highlights how bad the GPU market is right now.

Nvidia is owning AMD by providing a smidge more performance for decently more money than last generation. Meanwhile, AMD can't even beat THAT with a new card launch and a die shrink. So:

1) Nvidia took a technological advantage and wasted most of their extra die space on fad features that won't even get play until a few generations from now. They should have waited to implement RTX when they had a die shrink and space to kill. Here, they basically made a slightly enhanced version of their last generation and added features that won't matter for a few gens. Whoops. Meaning performance didn't go up to any great degree, compelling them to swap the model numbers around to hide that behind marketing spin of "AND COMPARING THE 2080 to the 1080, WE SEE A HUGE UPLIFT IN FRAMERATE!"

2) AMD took a die-shrink advantage and managed to barely keep up with Nvidia's last-generation high end. So for the same money as Nvidia, they'll give you a card that mostly keeps up at a higher power draw and lacks even the fad features that Nvidia has.

So if Nvidia was "greedy" to charge basically the same for a tiny performance uplift plus fad features, then what is AMD to charge the same price for virtually the same (or less?) performance WITHOUT the fad features?

This is Aliens vs Predator all over again. No matter who wins this fight, we all just lost.
Posted on Reply
#182
efikkan
Valantar-Very expensive chip? Not likely. It's <400mm2. Even on 7nm, that's likely to be cheaper than Nvidia's monster 12nm chips.
Estimates are about twice the cost per mm². Combined with an interposer and HBM2, this will be more expensive than "12nm" and GDDR6.
Valantar-"That much higher TDP"? While there's no denying that Nvidia's cards are more efficient, by a significant margin, the difference between a 300W VII and a 275W 1080Ti or 225W 2080 is ... not that much, really. 9 and 33% respectively. The difference was far bigger the last go around, with the 275W V64 roughly matching the 180W 1080 (52%).
When comparing RTX 2080 vs "Radeon VII", presumably with similar performance, ~300W vs. 215W and the same price, it becomes a hard sell. On the other hand if it was $50 cheaper or had some other significant advantage, it could have some potential, assuming AMD could make enough of them.
Valantar-"Spending a node shrink" on a refresh of a known architecture is a well established mode of operations in the tech world.
<snip>
-While I agree that Vega as an architecture is far better suited for professional applications…
The refresh itself is not the problem, but doing it to make a very expensive GPU with irrelevant features that have no real chance in the market, due to the great inefficiency of the architecture.

The Vega 20 was intended for professional markets, which means it includes more fp64 cores and, I believe, some other new features, which are wasted die space on the very expensive and currently low-volume node. As I mentioned, if they had a Vega 20 without this and with GDDR, it would at least make some sense.
Valantar-You're right that 12GB of HBM2 might have made this cheaper, but getting 6hi vs. 8hi KGSDs of HBM2 is likely a challenge, and given that 8hi is in volume production for enterprise/datacenter/etc., it might be the same price or even cheaper than special-ordered 6hi stacks.
No, I'm talking about using 3 stacks instead of 4.
Posted on Reply
#183
Mistral
Wow, he sounds super salty...
It "barely keeps up with" our more expensive card "and if we turn on ray tracing we'll crush it" by halving our framerates...
Posted on Reply
#184
GreiverBlade
heck ... that speech is making me want a Vega VII ... rather than a 20XX card ... even if the price point is around a 2080 (where i live ...)
eagerly waiting on AIB and etailer to list them to see ...

but for now i am more inclined to get a deal on a Vega 64 rather than getting a 2070 from the green goblin ---


red dawn indeed ...
Posted on Reply
#185
Anymal
No halving, thanks to DLSS
Posted on Reply
#186
VulkanBros
Exactly my thought - it will backfire on nVidia...
GreiverBladeheck ... that speech is making me want a Vega VII ... rather than a 20XX card ... even if the price point is around a 2080 (where i live ...)
eagerly waiting on AIB and etailer to list them to see ...

but for now i am more inclined to get a deal on a Vega 64 rather than getting a 2070 from the green goblin ---


red dawn indeed ...
Posted on Reply
#187
neatfeatguy
Mr.Mopar392Not a hater trust me, a fanboy yes. I Really don't care what anyone wants to buy its their money, i just don't care for Nvidia cards or their tech thats it. last nvidia cards i own was a pair of gtx 570 in sli they ran like garbage and color look wash out. Never own since then, not for the foreseeable future.
I ran 570s in SLI for 4+ years. I thought they handled things rather well. I used them across several different monitors over that time frame and things never looked washed out. I maxed out games on 1080p for a long time. I probably would have kept the cards until Pascal hit the shelves, but having jumped into the 5760x1080 resolution that I really like playing games on (that support it), the 570s struggled. I ran FarCry 3 at 5760x1080, but all settings were turned down to the medium range and FPS usually hovered around 40-50. I could max out Borderlands 2 (except for PhysX, I kept that on low) and hold around 60fps at 5760x1080.....anyway. I decided to jump to something and after some deliberating I went with my 980Ti AMP Omega. Hell of an upgrade.

I can't say I've ever had issues with Nvidia cards before. I've had issues with SLI over the years, but not big glaring issues that people seem to claim. My next card will most likely be from Nvidia because I've been using them for so long....but if AMD can throw something out that works as well or better and cost less, I wouldn't be against picking up one of their GPUs for my next upgrade. My next GPU is still probably 2-3 years out (if my 980Ti lasts that long), so who knows what we'll see then.
Posted on Reply
#188
Vya Domus
Sticks and stones will break my bones, but words will never bring back the market cap.
Posted on Reply
#189
mindbomb
I have an rx 480 in my main rig, and jensen might be right about freesync. It was so poorly regulated that it was hard figuring out which monitors actually performed well in their variable refresh range. Most reviewers only check image quality with fixed refresh rates.
Posted on Reply
#190
kapone32
mindbombI have an rx 480 in my main rig, and jensen might be right about freesync. It was so poorly regulated that it was hard figuring out which monitors actually performed well in their variable refresh range. Most reviewers only check image quality with fixed refresh rates.
If you really want to get FreeSync working as intended, use a program called CRU (Custom Resolution Utility). There you can adjust the sync range of your monitor; I have mine set to 25-80 Hz.
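For context on why widening the range matters: AMD's Low Framerate Compensation (LFC) only engages when the monitor's maximum refresh rate is sufficiently far above its minimum, so a widened 25-80 Hz range qualifies where a narrow factory range might not. A minimal sketch of that rule of thumb follows - the 2.5x threshold is the commonly cited figure from AMD's original LFC announcement, not a guaranteed constant for every driver version.

```python
def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    """Rule of thumb: Low Framerate Compensation needs the max refresh
    rate to be at least `ratio` times the min refresh rate."""
    return max_hz >= ratio * min_hz

print(supports_lfc(25, 80))  # widened CRU range from the post -> True
print(supports_lfc(48, 75))  # typical narrow factory range    -> False
```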
Posted on Reply
#191
Sasqui
ValantarWhat? You get that the total addressable market has expanded dramatically over that period of time, right? The PC gaming hardware market is not smaller than before, and the software market certainly isn't. Other devices growing into their own and establishing themselves as parallel alternatives does not at all spell impending doom for the incumbent segments. That PC gaming is now "a minority" is meaningless when the total gaming market is 10x what it was when that was the case.
Hey, use acronyms like "TAM", ok? :p But yes, to your point, the total market has increased as the share percentages shift. The interesting thing from that graph is that the PC market almost overtook the console market from 2016-2019. Who would have thought?
Posted on Reply
#192
ironwolf
27MaDFacebook comment.:D
Yep, that would be me over on FB who made the comment. :p
Posted on Reply
#193
GorbazTheDragon
How to read this:

Vega VII is a "noteworthy product"

The performance is competitive but there are no frills.

[There's] no ray tracing, no AI (underdeveloped gimmicks which you as a consumer don't really care about yet).

Its 7nm is a turning point, because Vega went from competing with the 1080 to competing with the 1080 Ti.

We can only rely on gimmicks that have zero market penetration and are so far only empty promises to feign a better offer to potential customers.
Posted on Reply
#194
EntropyZ
He is feeling the pressure now and lost his cool. What is it JHH? Is that leather jacket getting a little tight or something? :laugh::laugh:

The marketing and lies are like 2 ends of a pole, nVidia better end up hitting themselves back into the hole they dug.
Posted on Reply
#195
Fluffmeister
Ultimately this is good news, finally Vega can compete with the 1080 Ti. Let's wait for the review and see if TSMC's 7nm is the silver bullet many had hoped for.
Posted on Reply
#196
Minus Infinity
HellfireWow, I'd expect a CEO of a multi billion dollar company not to whine like a little bitch.

I mean I've seen Emo kids who are less whiney than he is.
This. A highly undignified response that clearly means he's worried.
Posted on Reply
#197
Basard
Give that guy a piss test lol.
Posted on Reply
#198
Fluffmeister
Minus InfinityThis, highly undignified response and clearly means he's worried.
No doubt; with worse performance per watt than Pascal despite being built on 7nm tech, he is wondering when the competition will turn up.
Posted on Reply
#200
B-Real
The most important thing is that your RTX series is lousy in terms of performance jump and especially in price. After these comments, some further stock decrease is inevitable. :D
FluffmeisterNo doubt, with worse performance per watt than Pascal despite being built on 7nm tech he is wondering when the competiton will turn up.
I didn't know the most important thing is performance per Watt. LOL. :D Shame on you.
AnymalA lot of people here who do not understand rt and dlss. Yeah, rtx line is all fail, VII is the best ever, AMD power FTW
HAHA. Whine, little one, whine. What you can do is use RT in ONE game, 4 months in. And the effect of RT was officially confirmed to have been lowered for somewhat better performance, which meant ~70 fps in FHD and ~30 fps average in 4K (with one of the best optimized engines) with a $1000 (1200) card. After the patch, the effects were further lowered in order to get better results. Still, it's 50-55 average in 4K with a $1000 (1200) card. :D Tomb Raider will get it with 30ish fps in FHD on the 2080 Ti. FFXV's RT was cancelled. Now these are a shame.
Posted on Reply