# NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"



## Raevenlord (Jan 10, 2019)

PC World managed to get hold of NVIDIA CEO Jensen Huang to pick his brain on AMD's recently announced Radeon VII. Skipping the usual amicable, politically correct answers, Jensen made his thoughts clear on what the competition is offering against NVIDIA's RTX 2000 series. The answer? The Radeon VII is an "underwhelming product", because "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." Not content with dissing the competition's product, Jensen Huang also quipped about AMD's presentation and product strategy, saying that "It's a weird launch, maybe they thought of it this morning."



 

 



Of course, the real market penetration of the technologies Jensen Huang mentions is currently extremely low - only a handful of games support NVIDIA's forward-looking ray tracing technologies. That AMD chose not to invest significant resources and die space in what is essentially a stop-gap high-performance card to go against NVIDIA's RTX 2080 means its 7 nm, 331 mm² GPU will compete against NVIDIA's 12 nm, 545 mm² die - if performance estimates are correct, of course.



 

The next remarks came regarding AMD's FreeSync (essentially a name for VESA's Adaptive-Sync), which NVIDIA finally decided to support on its GeForce graphics cards - something the company could have done from the outset, instead of going the proprietary, module-added, cost-increased route of G-Sync. While most see this as a sign that NVIDIA has seen a market slowdown for its price-premium G-Sync monitors and is simply ceding to market demands, Huang sees it another way, saying that "We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards." In the wake of these words from Jensen, it's hard to explain the overall silence from users whose FreeSync monitors supposedly aren't working.

Reportedly, NVIDIA found only 12 out of 400 FreeSync-supporting monitors that support its G-Sync technology automatically in the initial battery of tests, with most panels requiring a manual override to enable the technology. Huang promised that "We will test every single card against every single monitor against every single game and if it doesn't work, we will say it doesn't work. And if it does, we will let it work," adding a snarky punchline to the matter with a "We believe that you have to test it to promise that it works, and unsurprisingly most of them don't work." Fun times.

*View at TechPowerUp Main Site*


----------



## ZoneDymo (Jan 10, 2019)

Someone seems a little bitter.... and afraid or something.
Like seriously, I dont say this as some jab, if you are comfortable with your own products and choices, you would not try to put the competition down like this.


----------



## phanbuey (Jan 10, 2019)

He's probably angry he has to pay someone to take all of those 1180 stickers off of the emergency batch of 1080ti's that they prepped.


----------



## Recus (Jan 10, 2019)

Huang probably saw this too and decided to kick fanboys back into reality.


		https://www.reddit.com/r/pcmasterrace/comments/acwgik


----------



## Dinnercore (Jan 10, 2019)

He sounds like a salty and edgy teenager, wow. I guess he always tried to sell this image and targets that audience, but this is low.
Can he legally do that? Accuse products of not working out of thin air, without proof? If I did that I would get sued for damages in my country. I have video proof that my FreeSync monitor works well; I'd like to invite him over and show him this magic that his company seems unable to achieve without additional hardware.


----------



## Kohl Baas (Jan 10, 2019)

It Just Works


----------



## JalleR (Jan 10, 2019)

Hmmm, I think he looks at that small die of the VII "rocket" and sees that if he sells one 2080 and AMD sells one VII (Radeon 7), they earn more than him, and he is ANGRY about that. "I want the money," he says….


----------



## ShurikN (Jan 10, 2019)

ZoneDymo said:


> Someone seems a little bitter....


You'd be too if you lost this much of your market value in 3 months.


----------



## Hellfire (Jan 10, 2019)

Wow, I'd expect a CEO of a multi billion dollar company not to whine like a little bitch.

I mean I've seen Emo kids who are less whiney than he is.


----------



## Bones (Jan 10, 2019)

He also says "The more you spend, the more you save". 
It's all marketing BS guys, trying to make a sale or two.


----------



## megamanxtreme (Jan 10, 2019)

Are those responses typical from this guy?


----------



## human_error (Jan 10, 2019)

So does he think that the 2080 has underwhelming performance? How about the 2070 or 2060 which are even slower?

The Vega VII cards aren't setting a new benchmark in performance but if they perform at the same level as the 2080 I wouldn't say that they are underwhelming at all. 

Of course, what do people expect the CEO of AMD's competitor to say - "oh yes, they're decent cards"?


----------



## Rahmat Sofyan (Jan 10, 2019)

Hmmm, weird... "They do not even work with AMD's graphics cards"? Any proof for that statement?


----------



## therealmeep (Jan 10, 2019)

"stop-gap card" with more RAM and actual performance improvement against what is a rehashed 1080ti with fancy RAM and more fancy features. RTX is gonna kill Nvidia if they keep trying to push their flop of a series. As for freesync, it's probably a lie that we'll see proved wrong by AMD or some third party in a week or 2.


----------



## tvamos (Jan 10, 2019)

I've used 3 cards that support freesync with my monitor, zero issues. He knows g-sync is gonna sell less, one more reason for bitterness.


----------



## Hellfire (Jan 10, 2019)

I thought my freesync worked great.... on all of my cards.... and monitors... IlluminatiConfirmed


----------



## INSTG8R (Jan 10, 2019)

More salt please Jensen, I’ll make you some fries....My Freesync works just fine thanks...


----------



## Vayra86 (Jan 10, 2019)

Let's twist Huang's little nerdrage back to reality 

NVIDIA only found *12 out of 400 FreeSync-supporting monitors to support their G-Sync technology automatically* in the initial battery of tests, *with most panels requiring a manual override to enable the technology*. 
(Hey Huang, isn't that your fault for locking down Gsync and creating artificial barriers? You're saying your tech doesn't work well with an industry wide standard while 12 monitors prove that it can and should   - so, Gsync, it 'doesn't just work')

Huang promised that "We will test every single card against every single monitor against every single game 
(blatant lie - you can never do this and you didn't do it for Gsync with its multitude of configuration-related bugs. In a somewhat related matter, I'm still missing a couple thousand SLI profiles as well)

and if it doesn't work, we will say it doesn't work. And if it does, we will *let it work*," adding a snarky punchline to this matter with an "We believe that *you have to test it to promise that it works*, and unsurprisingly most of them don't work." 
(Right, so you're surprised you can't plug and play your proprietary tech maze into an open standard? Mkay)

It seems we need to rewrite Huang's punchline:

Nvidia, _get to work, so that it "just works". Thanks!_

Meanwhile...


----------



## Xaled (Jan 10, 2019)

Yeah, sure! That's why he started adopting FreeSync. Otherwise it would've been a no-brainer to select the VII over the 2080..


----------



## eidairaman1 (Jan 10, 2019)

So if FreeSync doesn't work, why did NVIDIA adopt it???


----------



## megamanxtreme (Jan 10, 2019)

human_error said:


> So does he think that the 2080 has underwhelming performance? How about the 2070 or 2060 which are even slower?


You are making me wonder what the 2080 would have been if competition was strong from both teams. A 2050?


----------



## ShurikN (Jan 10, 2019)

tvamos said:


> I've used 3 cards that support freesync with my monitor, zero issues. He knows g-sync is gonna sell less, one more reason for bitterness.


As far as I understood the whole FreeSync launch, most issues were on the very earliest of panels (and probably drivers). After a couple of months the tech worked as intended.


----------



## NdMk2o1o (Jan 10, 2019)

Smells like sour grapes, and immature from a CEO of a multi-billion-dollar corporation. I don't see what the issue is if Vega VII is at 2080 level and priced accordingly.


----------



## INSTG8R (Jan 10, 2019)

ShurikN said:


> As far as I understood the whole FreeSync launch, most issues were on the very earliest of panels (and probably drivers). After a couple of months the tech worked as intended.


I had one of the first FreeSync monitors (ASUS MG279Q); my only issue was the odd range (35-90 Hz), but it was a great monitor. Now I'm on a FreeSync 2 monitor, and again a great monitor with full range.


----------



## unikin (Jan 10, 2019)

I'm staying in the $250 buying ballpark no matter what. If NVIDIA charges $250 for an RTX 2050 with GTX 1060 performance, I'll just skip it and stay with my OC'd RX 570 until something comparable to a GTX 1070/Vega 56 comes along with a $250 price tag or less. That's how you say no to the ridiculous pricing of low/mid-end products like the x060 or 590. I don't care about red or green; I'll buy the best price/performance ratio GPU inside my budget. If no progress has been made on the price/performance front, I'll skip the generation.


----------



## Vayra86 (Jan 10, 2019)

NdMk2o1o said:


> Smells like sour grapes, and immature from a CEO of a multi-billion-dollar corporation. I don't see what the issue is if Vega VII is at 2080 level and priced accordingly.



The issue is NVIDIA just lost their domination of the high end, when they needed it the most to push their Turing affair, which is the worst deal in GPU history. They now have only one top-end card that is completely priced out of the market - out of _pure necessity, because it's too flippin' large_ - to eclipse AMD's ancient GCN with some lazy tweaks and a shrink. Add to that having to give up the G-Sync cash cow because it had nowhere else to go, RTX not really gaining traction in hearts & minds or dev products, mining being over, and their stock price stabilizing at half what it was a few months back.

Yeah, I'd be grumpy too


----------



## tvamos (Jan 10, 2019)

ShurikN said:


> As far as I understood the whole FreeSync launch, most issues were on the very earliest of panels (and probably drivers). After a couple of months the tech worked as intended.


Maybe Jensen only tested those early ones. All 400 of them.


----------



## ShurikN (Jan 10, 2019)

tvamos said:


> Maybe Jensen only tested those early ones. All 400 of them.


Made me laugh not gonna lie


----------



## Xaled (Jan 10, 2019)

megamanxtreme said:


> Are those responses typical from this guy?


He is the richest hater, troll on earth
The richer he gets the meaner he becomes..



ShurikN said:


> Made me laugh not gonna lie


He hates on something that he bought and tested 400 of ..


----------



## Sasqui (Jan 10, 2019)

This speaks to lack of integrity and self serving interests among other things.  Just wait for the next NVDA earnings release in early Feb... though the share price already has most of the downed estimates baked in.

NVDA makes great products but the leadership and culture of the company has been and still is, one big money grab by way of coercion.  From SLI to G-Sync, can you say "proprietary"?


----------



## Octopuss (Jan 10, 2019)

This almost looks like a new low. Just shut up maybe? :-O


----------



## kastriot (Jan 10, 2019)

Inferiority complex can be devastating and more money doesn't help


----------



## XiGMAKiD (Jan 10, 2019)

The hyped ray tracing capability is not that good considering they need to cut so many ray-traced elements to get acceptable framerates on all cards. DLSS is a better feature to brag about, as it can increase performance and make the game a bit prettier. And I love how he twists the wording about the conditions of Adaptive Sync support to make them look good. 11/10, a legit CEO


----------



## Vayra86 (Jan 10, 2019)

tvamos said:


> Maybe Jensen only tested those early ones. All 400 of them.



Against every game, no less.

https://www.polygon.com/2014/4/20/5633602/list-of-every-video-game-all-time

That makes for a little over 160000 FreeSync tests - and if you consider the rate at which Steam spits out indies, they better have a really efficient test sequence 

Yeah, it's a good day. We needed some spice to forget all our GPU woes.


----------



## Xuper (Jan 10, 2019)

Hrmm, what happened? Why so angry?


----------



## megamanxtreme (Jan 10, 2019)

XiGMAKiD said:


> DLSS is a better feature to brag about


Reminded me how the Xbox 360 had this feature; not sure why AMD didn't invest in it, it would really help free up resources (or that was just speculation about the console).
https://www.anandtech.com/show/1864/inside-microsoft-s-xbox-360/8


----------



## Vayra86 (Jan 10, 2019)

XiGMAKiD said:


> The hyped ray tracing capability is not that good considering they need to cut so many ray-traced elements to get acceptable framerates on all cards. DLSS is a better feature to brag about, as it can increase performance and make the game a bit prettier. And I love how he twists the wording about the conditions of Adaptive Sync support to make them look good. 11/10, a legit CEO



Not sure about DLSS, did you see it? With old AA everyone complained about every odd jaggy or blur that resulted from using it, and now we're fine with all sorts of visual artifacting? For 'free performance'? What you're seeing is a downsampled render that can be _very_ inaccurate and needs to be implemented on a per-game basis. I'll take the usual AA, thanks. What's even better is that the ONE game in which it actually works, all development was discontinued recently


----------



## XiGMAKiD (Jan 10, 2019)

megamanxtreme said:


> Reminded me how the Xbox 360 had this feature; not sure why AMD didn't invest in it, it would really help free up resources (or that was just speculation about the console).
> https://www.anandtech.com/show/1864/inside-microsoft-s-xbox-360/8


My guess is, at least on the console, it was to give room for the upscaling feature; as for the PC, maybe the available AA techniques are enough for them right now, although implementing it could give them a healthy performance boost.


----------



## 27MaD (Jan 10, 2019)

"Underwhelming, performance is lousy, FreeSync doesn't work"? What? Did you expect him to say the new Radeon 7 kicks A$$?
Someone is shaking.


----------



## megamanxtreme (Jan 10, 2019)

Vayra86 said:


> What you're seeing is a downsampled render that can be _very_ inaccurate and needs to be implemented on a per-game basis. I'll take the usual AA, thanks.


Checkerboarding on consoles: some games can make it look okay-ish, but other times it's completely noticeable, like the FF part with the fence flickering in the distance. If smoothing makes the textures look better, I'm all in. If not, I'll stick to 4K textures and native 1080p, 1440p, and 4K.


----------



## ShurikN (Jan 10, 2019)

XiGMAKiD said:


> DLSS is a better feature to brag about...


I thought the same thing until I saw how much blur that tech actually creates.


----------



## phanbuey (Jan 10, 2019)

Yeah, the new NVIDIA 'features' are experimental at best. Great potential, though, but I never was a fan of upsampling/downsampling. I do like TXAA though. Ray tracing will be a thing in like 3 years.


----------



## Vayra86 (Jan 10, 2019)

phanbuey said:


> Yeah, the new NVIDIA 'features' are experimental at best. Great potential, though, but I never was a fan of upsampling/downsampling. I do like TXAA though. Ray tracing will be a thing in like 3 years.



That's just it, TSSAA is miles better and it can scale along with an internal render resolution slider. It's a one-size-fits-all approach for every level of performance, and it is already implemented in almost every recent game.

As for RT... until I see AMD announce a next console APU with hardware RT capability, I don't see any movement in that department, and deep down we all know that is what the tech needs for mass adoption and, thus, support in content. NVIDIA can't even take that cake, because they don't make x86 CPUs and don't do custom silicon.


----------



## TheoneandonlyMrK (Jan 10, 2019)

ZoneDymo said:


> Someone seems a little bitter.... and afraid or something.
> Like seriously, I dont say this as some jab, if you are comfortable with your own products and choices, you would not try to put the competition down like this.


Seconded, bang on; nothing short of catty.
FreeSync has worked fine for me this last year, go figure. I'm that lucky, I guess.
The guy should get his focus back. His company probably lost AMD's entire worth in value, and he's being investigated for ineptitude or deception (TBD), so he should probably be working to increase RTX adoption by devs, IMHO, not on this shit banter.


----------



## XiGMAKiD (Jan 10, 2019)

Vayra86 said:


> Not sure about DLSS, did you see it? With old AA everyone complained about every odd jaggy or blur that resulted from using it, and now we're fine with all sorts of visual artifacting? For 'free performance'? What you're seeing is a downsampled render that can be _very_ inaccurate and needs to be implemented on a per-game basis. I'll take the usual AA, thanks. What's even better is that the ONE game in which it actually works, all development was discontinued recently


Well, if NVIDIA is fine with ray-traced games that only ray-trace the bare minimum, then visual weirdness here and there must be fine for them too


----------



## Imsochobo (Jan 10, 2019)

Hellfire said:


> Wow, I'd expect a CEO of a multi billion dollar company not to whine like a little bitch.
> 
> I mean I've seen Emo kids who are less whiney than he is.




What I hate most about nvidia:
Jensen.
Linux driver support.

With FreeSync now supported on GeForce: one BIG contributing factor for me to purchase a Vega 64 at GTX 1070 price was exactly that. The alternative felt like shit tons of Linux issues, no adaptive sync, and buying a 1080 at $100 more, so I closed my eyes and purchased the Vega.
Still not happy about the Vega hardware; that's the only thing I'm not happy about with the AMD GPU.

If only AMD knew how to make good gaming GPU chips again.......


----------



## Anymal (Jan 10, 2019)

unikin said:


> I'm staying in the $250 buying ballpark no matter what. If NVIDIA charges $250 for an RTX 2050 with GTX 1060 performance, I'll just skip it and stay with my OC'd RX 570 until something comparable to a GTX 1070/Vega 56 comes along with a $250 price tag or less. That's how you say no to the ridiculous pricing of low/mid-end products like the x060 or 590. I don't care about red or green; I'll buy the best price/performance ratio GPU inside my budget. If no progress has been made on the price/performance front, I'll skip the generation.


If you don't buy a new card you have infinite price/performance: no money spent and the RX 570 stays. The best deal ever.


----------



## Dimi (Jan 10, 2019)

Well, it is pretty lousy if you see the dozens of different FreeSync ranges on all these monitors. Most of them only have a FreeSync range of 48-75 Hz. What is the point of that? There is no standard like G-Sync has. With G-Sync, adaptive sync starts from 30 Hz up to the maximum refresh rate; almost every G-Sync monitor has an adaptive sync range of 30-144 Hz.

Mind you, my 1440p 165 Hz G-Sync monitor only cost me $350. Granted, it's a TN panel, but I use a ColorMunki display calibrator for color accuracy.
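The range complaint above has a concrete technical consequence: AMD's Low Framerate Compensation (LFC) only engages when a panel's maximum refresh is a large enough multiple of its minimum, which is why a 48-75 Hz range is so much less useful than 30-144 Hz. A minimal sketch of that check (the 2.5x factor and the example monitor list are illustrative assumptions, not official spec values):

```python
# Rough sketch of the VRR-range math being argued about here.
# LFC can double/multiply frames only when max refresh is a big enough
# multiple of min refresh; ~2.5x is the commonly cited threshold.
LFC_FACTOR = 2.5  # assumed threshold for illustration

def supports_lfc(min_hz: int, max_hz: int, factor: float = LFC_FACTOR) -> bool:
    """True if the VRR range is wide enough for frame-multiplication tricks."""
    return max_hz >= factor * min_hz

# Hypothetical example panels matching the ranges mentioned in the thread.
monitors = {
    "cheap FreeSync (48-75 Hz)": (48, 75),
    "typical G-Sync (30-144 Hz)": (30, 144),
    "MG279Q-style (35-90 Hz)": (35, 90),
}

for name, (lo, hi) in monitors.items():
    print(f"{name}: LFC-capable = {supports_lfc(lo, hi)}")
```

Notably, under this assumed 2.5x rule the 48-75 Hz panel fails the check, while both the 30-144 Hz G-Sync range and even the odd 35-90 Hz MG279Q range pass it.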


----------



## Anymal (Jan 10, 2019)

A lot of people here do not understand RT and DLSS. Yeah, the RTX line is all fail, the VII is the best ever, AMD power FTW


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> A lot of people here do not understand RT and DLSS. Yeah, the RTX line is all fail, the VII is the best ever, AMD power FTW



Feel free to add some content and enlighten us. What do you think 'people' don't understand about it? The tech has been covered extensively, and I think the majority understands it very well: it's vaporware at this point and for the foreseeable future.


----------



## ShurikN (Jan 10, 2019)

Anymal said:


> A lot of people here do not understand RT and DLSS.


What do you mean, "do not understand"? Did you see DLSS in FF15? It's a mess. What's there not to understand? The evidence is obvious. Not to mention the horrible shimmering.


----------



## champsilva (Jan 10, 2019)

ShurikN said:


> You'd be too if you lost this much of your market value in 3 months.
> 
> View attachment 114308



This is because of mining; the same happened to AMD.


----------



## ShurikN (Jan 10, 2019)

champsilva said:


> This is because of mining; the same happened to AMD.


So? Money lost is money lost, no matter the reason.

Also AMD has much less value, therefore they lost less in pure numbers, not percentages.


----------



## Sandbo (Jan 10, 2019)

The poll is one of the best follow-ups I have seen.


----------



## megamanxtreme (Jan 10, 2019)

I'll switch Ray Tracing on when a GTX X50 or RX X60 can play at 60fps and at least 1080p with it on.


----------



## Anymal (Jan 10, 2019)

It's new and innovative tech; you can turn it off if you want and play on a 2080 as on the new best AMD has to offer, the Radeon VII. Jensen made his judgment after seeing a brand-new 7nm GPU with no new features and performance similar to a 2080/1080 Ti - which was available almost 2 years ago, and as GP102 in the Titan Xp even longer, 2.5 years.


----------



## TheDeeGee (Jan 10, 2019)

Good to see competition, isn't it?


----------



## INSTG8R (Jan 10, 2019)

ShurikN said:


> What do you mean, "do not understand"? Did you see DLSS in FF15? It's a mess. What's there not to understand? The evidence is obvious. Not to mention the horrible shimmering.


Yeah, it looks like AMD's MLAA, which also just made everything blurry - but hey, it doesn't need a $1200 card to do it either...


----------



## ShurikN (Jan 10, 2019)

Anymal said:


> It's new and innovative tech; you can turn it off if you want


And now you're stuck with a feature you will not use and one you paid extra for.


----------



## Smartcom5 (Jan 10, 2019)

Kohl Baas said:


> It Just Works


Yup, like RTX and DLSS …





Smartcom


----------



## Anymal (Jan 10, 2019)

ShurikN said:


> And now you're stuck with a feature you will not use and one you paid extra for.


You can buy the VII and you are safe then. Probably for the same money.


----------



## eidairaman1 (Jan 10, 2019)

Im tired of NVs crap anyway.


----------



## Anymal (Jan 10, 2019)

eidairaman1 said:


> Im tired of NVs crap anyway.


Someone please explain how Vega and the VII are not crap.


----------



## Nxodus (Jan 10, 2019)

AMD fanbois will be so angry when in a couple of years RTX becomes standard, and AMD will be forced to adopt it effectively increasing the prices of their hot plastic cards even more.


----------



## Casecutter (Jan 10, 2019)

_"It's a weird launch, maybe they thought of it this morning"_ - he's pissed no one in his company saw this coming and he found out when it was announced? Funny if he took CEO Lisa Su at her word about "no consumer 7nm in 2019"... who lies to family! It was served cold and hard, and he's under HUGE pressure already!

Wow, just wow - did no one stop this guy? His stock has been tanking, and this kind of tantrum from a CEO just drove it off a cliff!

Calling the Radeon VII "underwhelming" when it matches the 2080 at the same price - the meaning of the word escapes him. While I'll wait to see reviews, I might say his company delivered "underwhelming" at such prices first. There are never truly bad cards, just bad price points. 

As to FreeSync: when your company has just come out saying it will support VESA Adaptive-Sync, but then says it "does not work", I'm not interested in considering your hardware - I'll go with a proven and working solution. So get your crack driver team to make it work on your cards; don't blame an open VESA specification you never wanted to support or give input to, now that you're behind the 8-ball and can't seem to get it working with your hardware. The chickens came home to roost... 

It seems like he's seeing AMD doing a "Ryzen", and instead of keeping a "stiff upper lip" like Intel did, he freaked out - and this isn't nearly as bad as what Intel faced. Navi will be coming soon, and I'd like to believe AMD will have more mainstream pricing for parts that do strong 1440p. Then he realizes he's up against a Navi/HBM2 part on 7nm, has no idea how that might turn out, and it's said to be more gaming-oriented than Vega's compute architecture. AMD has already proven they can do 7nm, so if Navi gains on a big 7nm chip, who knows. The next thing might be that he found out AMD has product lining up from the "Instinct geldings" and strong HBM2 supplies, and can fill the channel way better than with the first Vega. Jensen Huang knows AMD is in a better position to work down price if they can maintain strong product levels, while his investors would freak out on him.

Find me popcorn...


----------



## Valantar (Jan 10, 2019)

Dimi said:


> Well, it is pretty lousy if you see the dozens of different FreeSync ranges on all these monitors. Most of them only have a FreeSync range of 48-75 Hz. What is the point of that? There is no standard like G-Sync has. With G-Sync, adaptive sync starts from 30 Hz up to the maximum refresh rate; almost every G-Sync monitor has an adaptive sync range of 30-144 Hz.
> 
> Mind you, my 1440p 165 Hz G-Sync monitor only cost me $350. Granted, it's a TN panel, but I use a ColorMunki display calibrator for color accuracy.


Calibration can't make up for a low-gamut panel, but of course there are decent gamut TN panels out there. Just don't look at it from any kind of angle.

As for 48-75Hz, it sure beats locked 60Hz, which is what you'd otherwise get for that price. Added flexibility and higher refresh rates at the same price is a win for me. Would you rather have locked 60p? You're right that the lower bound requirement is an advantage for GSync, but other than that, it's a question of the panel quality. Also, what do you think of all the 60Hz 4k GSync panels? Are they equally useless?


----------



## Athlonite (Jan 10, 2019)

Jensen to all nVidiots everywhere buy our new shit because it's new shit and it's not AMD


----------



## xkm1948 (Jan 10, 2019)

FuryX plus samsung 32” 1440p Freesync got the occasional pulsing problem. Can’t seem to fix it no matter what i do


----------



## Deleted member 158293 (Jan 10, 2019)

Anybody speaking and acting like this - but especially a veteran CEO - speaks volumes.

Wow...


----------



## Anymal (Jan 10, 2019)

Casecutter, VII is as fast as 2080 but on 7nm, 300w tdp. No worries then.


----------



## Nxodus (Jan 10, 2019)

yakk said:


> Anybody, but especially a veteran CEO speaking and acting this speaks volumes.
> 
> Wow...


He knows NVIDIA is superior; if AMD finally became proper competition, maybe he would take it back a notch.

Besides, who cares about CEOs nowadays? They are replaceable like batteries. He's just a 3D-printed bureaucrat


----------



## Sasqui (Jan 10, 2019)

Nxodus said:


> AMD fanbois will be so angry when in a couple of years RTX becomes standard



You know, that is a good point; they invested in RTX and maybe it'll play out well for them, maybe not. "Standard" is also a good question: DirectX 12 only recently introduced the DXR raytracing API in Windows 10, so it's still very young. It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful of cards (or maybe one right now... the 2080 Ti) that can really handle it.

It's not a very compelling story at the moment.


----------



## Anymal (Jan 10, 2019)

You have missed the performance bump: 60 fps even on a 2060 at 1080p with RTX on.


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> It's new and innovative tech; you can turn it off if you want and play on a 2080 as on the new best AMD has to offer, the Radeon VII. Jensen made his judgment after seeing a brand-new 7nm GPU with no new features and performance similar to a 2080/1080 Ti - which was available almost 2 years ago, and as GP102 in the Titan Xp even longer, 2.5 years.



In the end, 99.99% of all games ever made and yet to be made do _not support this new innovative tech_, and your performance plummets when you do use it - or you get a hideous-looking AA mode in return for some 'free' performance. So what, realistically, is the added value of RTX? The promise of something that might someday be nice (wasn't that an AMD punchline since forever)? You're still getting handicapped performance if you do use it - keep that in mind. How many BFV players have RTX ON? I'm very curious, especially in the MP that the game focuses on.

So the bottom line is that AMD has a competitive product if they can price it just under a 2080, which they probably will, in due time at the very least. That means the days of the 2080 are numbered, because it is a very large die with a big R&D budget behind it, while AMD is surfing the Vega wave they already paid for - dearly, I'd say - and they have a smaller die with much better yields per wafer. That alone can offset their HBM2 expenses and create a margin that lets them move along with any NVIDIA price cuts. The only caveat is power consumption, but I'm getting the impression more and more people are willing to overlook that just to give NVIDIA a kick in the arrogant nuts. I would - if I were in the market for such a card.

In the end, the Radeon VII is essentially the long-awaited 1080 Ti performance everyone has wanted since that card launched. The 1080 Ti did it within a 250 W power budget (more if you OC)... so how bad is it, really? 7nm is just enough to keep AMD in the game, in quite a convincing way. It's not ideal, but it's a counterweight for certain.
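The "smaller die, much better yields per wafer" point above can be put in back-of-the-envelope numbers: a 331 mm² Radeon VII die vs a 545 mm² TU104-class die on a 300 mm wafer. A toy model using the classic gross-die-per-wafer approximation plus a simple Poisson yield model (the defect density is an assumed illustrative figure, not foundry data, and 7nm wafers cost more than 12nm ones, so this shows die counts only, not economics):

```python
import math

WAFER_DIAMETER_MM = 300.0
DEFECT_DENSITY_PER_MM2 = 0.001  # assumed ~0.1 defects/cm^2, for illustration only

def gross_dies(die_area_mm2: float) -> float:
    """Classic gross-die-per-wafer approximation: wafer area over die area,
    minus an edge-loss term for partial dies around the rim."""
    r = WAFER_DIAMETER_MM / 2
    return (math.pi * r * r) / die_area_mm2 \
           - (math.pi * WAFER_DIAMETER_MM) / math.sqrt(2 * die_area_mm2)

def good_dies(die_area_mm2: float) -> float:
    """Gross dies scaled by a simple Poisson yield model exp(-D * A)."""
    yield_frac = math.exp(-DEFECT_DENSITY_PER_MM2 * die_area_mm2)
    return gross_dies(die_area_mm2) * yield_frac

for area in (331.0, 545.0):
    print(f"{area:.0f} mm^2: ~{gross_dies(area):.0f} gross, "
          f"~{good_dies(area):.0f} good dies per wafer")
```

With these assumed numbers the smaller die comes out at roughly twice as many good candidates per wafer as the larger one, which is the margin argument being made; real foundry yields and per-node wafer prices would shift the exact ratio.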


----------



## Casecutter (Jan 10, 2019)

Anymal said:


> No worries then


I'm not saying it's a great part, or that the TDP is, but dropping a part that's a cut-down version of Instinct production is what they have... And I say we wait to see the reviews; February 7th is coming up fast.

You've got lemons; a little work and you make good money selling lemonade. Easy peasy lemon squeezy!


----------



## Nxodus (Jan 10, 2019)

Sasqui said:


> You know, that is a good point, they invested in RTX and maybe or maybe not it'll play out well for them.  "Standard" is also a good question, since DX12 introduced the RTX API in Win 10, it's still very young.  It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful (or maybe one right now... 2080 ti) that can really handle it.
> 
> It's not a very compelling story at the moment.



You gotta start somewhere, right? That's the principle of innovation. Innovation is the heart of PC nerding, isn't it? It's been so stale in the last decade that my heart starts racing upon seeing even the slightest innovation nowadays. Adaptive sync, SSDs and RGB - that's about all the innovation we saw in the last decade. 

NVIDIA had a fantastic idea with RTX: it not only looks better, but it effectively helps developers finish games faster. It's sensational. I wish them all the best of luck with that. 

AMD, meanwhile, with its unlimited army of trolls, does nothing - no innovation - and I'd be the first to shake AMD's hand if they finally did something to advance the industry. I want more games, I want hardware that helps devs finish games faster, I want to finally see some innovation


----------



## eidairaman1 (Jan 10, 2019)

Nxodus said:


> AMD fanbois will be so angry when in a couple of years RTX becomes standard, and AMD will be forced to adopt it effectively increasing the prices of their hot plastic cards even more.



Wrong here, AMD has been working on Radeon Rays for some time.


Vayra86 said:


> In the end, 99.99% of all games ever made and to be made do _not support this new innovative tech_, and your performance plummets when you dó use it, or you get a hideous-looking AA mode in return for some 'free' performance.
> 
> So the bottom line is that AMD has a competitive product if they can price just under a 2080 which they probably will, in due time at the very least. That means the days of 2080 are numbered, because it is a very large die with some R&D budget behind it, while AMD is surfing the Vega wave they already paid for - dearly I'd say - and they have a smaller die with much better yields per wafer.



And don't forget Vega can be launched in different markets with dies that don't meet full Vega 2 spec.


----------



## ironwolf (Jan 10, 2019)

I think he is mad because Lisa Su wore a leather jacket better than him.


----------



## Markosz (Jan 10, 2019)

"if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it."

Wow, nice, so he acknowledges that the Radeon VII will beat the RTX 2080 for less money in standard cases, and so he resorts to this childish flaming.

NVIDIA is toxic as fuck, they are greedy, anti-consumer, they try every dirty trick like GPP, and now trash talk like this.


----------



## Anymal (Jan 10, 2019)

If you can choose between the VII and the 2080, both priced at $750, and I believe we all have the power to make our own decision, why the hell would you choose the VII?



Markosz said:


> "if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it."
> 
> Wow, nice, so he acknowledges that the Radeon VII will beat the RTX 2080 for less money in standard cases, and so he resorts to this childish flaming.
> 
> NVIDIA is toxic as fuck, they are greedy, anti-consumer, they try every dirty trick like GPP, and now trash talk like this.


You weren't there when the first T&L GPU came out, eh!


----------



## Nxodus (Jan 10, 2019)

eidairaman1 said:


> Wrong here, AMD has been working on Radeon Rays for some time.



Good for them. Unfortunately, that's not enough.


----------



## Anymal (Jan 10, 2019)

Vayra86 said:


> In the end, 99.99% of all games ever made and to be made do _not support this new innovative tech_, and your performance plummets when you dó use it, or you get a hideous-looking AA mode in return for some 'free' performance. So what, realistically, is the added value here in RTX? The promise of something that might someday be nice (wasn't that an AMD punchline since forever)? You're still getting handicapped performance if you do use it - keep that in mind. How many BFV players have RTX ON? I'm very curious. Especially in the MP that the game focuses on.
> 
> So the bottom line is that AMD has a competitive product if they can price just under a 2080 which they probably will, in due time at the very least. That means the days of 2080 are numbered, because it is a very large die with some R&D budget behind it, while AMD is surfing the Vega wave they already paid for - dearly I'd say - and they have a smaller die with much better yields per wafer. That alone can offset their HBM2 expenses and create a margin that enables them to move along with any Nvidia price cuts. The only caveat is power consumption, but I'm getting the impression more and more people are willing to overlook that just for giving Nvidia a kick in the arrogant nuts. I would - if I was in the market for such a card.
> 
> In the end, Radeon VII is essentially the long-awaited 1080 Ti performance everyone has wanted since that card launched. The 1080 Ti did that within a 255W power budget (more if you OC)... so how bad is it, really? 7nm is just enough to keep AMD in the game, in quite a convincing way. It's not ideal, but it's a counterweight for certain.


For the same money as a 1080 Ti, two years later. Yes, everyone waited for that. 7nm vs 16nm, congrats to AMD.


----------



## RH92 (Jan 10, 2019)

eidairaman1 said:


> Wrong here, AMD has been working on Radeon Rays for some time.



Sure, sure, but those Radeon Rays are nowhere to be seen in actual products, so the least we can say is that AMD is a full generation behind on this topic, plus a full generation behind in pure rasterization performance, since in 2019 they are catching up to 2017's 1080 Ti performance (at a similar MSRP on top of that, so they don't even have the price advantage anymore). And don't even get me started on energy efficiency! I mean, things are far from looking great in their GPU department...


----------



## Pruny (Jan 10, 2019)

Boycott ngreedia; who cares about TDP, just buy Radeon.


----------



## DarkMantle (Jan 10, 2019)

Nxodus said:


> He knows Nvidia is superior; if AMD would finally become proper competition, maybe he would take it back a notch.
> 
> Besides, who cares about CEOs nowadays? They are replaceable like batteries. He's just a 3D-printed bureaucrat.




I think you need to read a bit more about hardware history. He is not only the CEO, he is the guy who founded the company, you know.


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> If you can choose between the VII and the 2080, both priced at $750, and I believe we all have the power to make our own decision, why the hell would you choose the VII?
> 
> 
> You weren't there when the first T&L GPU came out, eh!



Because you get double the VRAM.
Because you got pretty annoyed with Nvidia's tactics of late.
Because it's a similar GPU in every way except for vaporware.
Because this card was announced not at $750 but at $700, and if Vega 56/64 is an indicator, it will drop much lower.

See, it's not that difficult to think of reasons. They may not be reasons for you, but I can certainly see how they are good ones for others. The bottom line: Nvidia doesn't have free rein at the 2080 price point.

Comparing T&L with RTRT is a complete and utter fallacy. T&L and other pioneering tech were moves to make rendering _more efficient while increasing quality_. RTX only offers _brute-forcing an extremely costly render technique for minor visual gains_. There is a reason we do rasterization and have constantly improved upon it; its current state is in many cases so close to RTRT that it is considerably better to use the 'old way'.



Anymal said:


> For the same money as a 1080 Ti, two years later. Yes, everyone waited for that. 7nm vs 16nm, congrats to AMD.



Ehm... what exactly is the competitor doing, then? The stagnant perf/dollar started and ended with Nvidia, not AMD. Get your facts straight. I agree it's still stagnant, but now at least there is competition.


----------



## IceShroom (Jan 10, 2019)

So Nvidia's most advanced architecture doesn't have proper DP 1.2a support!!! 
And we need DLSS (checkerboarding, as on consoles) to get more performance!!! How low do we need to go to get more performance? At this point it's better to buy a console than a $1200 GPU.


----------



## Sasqui (Jan 10, 2019)

Nxodus said:


> Nvidia had a fantastic idea with RTX



I do believe ray tracing was thought of long before NVidia's RTX, and it was implemented in DX12 by MS. If it weren't for that, then it likely wouldn't have happened.

They do get kudos for pushing the envelope and making it part of their marketing strategy.  Will it pan out?  Maybe someday.


----------



## HM_Actua1 (Jan 10, 2019)

SLAAAAAAAAAYY!!!!!!
Love it!
Savage! Hope that sparks a fire under AMD's butt. We need better competitors to push technology further and faster.


----------



## Amite (Jan 10, 2019)

I have a BenQ 144 Hz without FreeSync and an MSI 144 Hz with FreeSync.
As far as Battlefield V multiplayer goes, the MSI sure as heck seems a lot smoother when you're in a room with 5 guys trying to kill each other.

I have a Vega 56.


----------



## Nxodus (Jan 10, 2019)

Markosz said:


> "if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it."
> 
> Wow, nice, so he acknowledges that the Radeon VII will beat the RTX 2080 for less money in standard cases, and so he resorts to this childish flaming.
> 
> NVIDIA is toxic as fuck, they are greedy, anti-consumer, they try every dirty trick like GPP, and now trash talk like this.



but AMD on the other hand is so nice, secretly they are a non-profit NGO spending all the profits to help the elderly and clean the oceans.


----------



## Markosz (Jan 10, 2019)

Anymal said:


> If you can choose between the VII and the 2080, both priced at $750, and I believe we all have the power to make our own decision, why the hell would you choose the VII?



If I had to choose between AMD and NVIDIA at exactly the same performance for the same money, I'd choose AMD, since NVIDIA as a company is disgusting.
And that's without being an AMD fanboy; I'm angry at them for releasing the RX 590 the way they did, because that card for that money is such bullshit.


----------



## RH92 (Jan 10, 2019)

Markosz said:


> Wow, nice, so he acknowledges that the Radeon 7 will beat the RTX 2080



Better to wait for 3rd-party reviews before making such claims!



Markosz said:


> for less money


https://uploads.disquscdn.com/image...9048dae55bf1fff355975ffcb93349be66c8eb511.jpg

Nah, I don't think so!



Markosz said:


> NVIDIA is toxic as fuck, they are greedy, anti-consumer, they try every dirty trick like GPP, and now trash talk like this.



Not even going to bother with this .....


----------



## Nxodus (Jan 10, 2019)

All you people whining about Huang's tone are probably a little too sensitive; you can't view the world through reddit-esque safe-space filters. In the real corporate world there's a blood-and-bone war going on, especially in an industry where there are only two competitors (where are you, Intel?). The shareholders expect you to have balls. Huang had balls.


----------



## Sasqui (Jan 10, 2019)

Nxodus said:


> Huang had balls.



Past tense.  He just spit them out of his mouth.


----------



## Mr.Mopar392 (Jan 10, 2019)

Anymal said:


> Casecutter, the VII is as fast as a 2080 but on 7nm, with a 300W TDP. No worries then.



Even if Vega 7 was faster you still wouldn't buy it; your disdain for AMD is showing!


----------



## Nxodus (Jan 10, 2019)

DarkMantle said:


> I think you need to read a bit more about hardware history. He is not only the CEO, he is the guy who founded the company, you know.



Oh shit, really? Haha, I probably do have to read up on it. I judge a company by its products, not its history. I never really cared about prominent people in hardware or anywhere else.


----------



## illli (Jan 10, 2019)

He is just envious of her better looking jacket


----------



## oxidized (Jan 10, 2019)

Everyone is focusing on the WHY and not the WHAT.
We know Huang's personality; I thought everyone had just gotten used to it. But no, everyone is complaining about that, and not about what he said, which is pretty much right, except for the FreeSync thing.


----------



## 27MaD (Jan 10, 2019)

ironwolf said:


> I think he is mad because Lisa Su wore a leather jacket better than him.


Facebook comment.


----------



## Markosz (Jan 10, 2019)

Nxodus said:


> but AMD on the other hand is so nice, secretly they are a non-profit NGO spending all the profits to help the elderly and clean the oceans.



 - and this was 3 years ago... they only got worse.

AMD doesn't lock features behind hardware, they make open-source platforms.


----------



## Deleted member 158293 (Jan 10, 2019)

Nxodus said:


> He knows Nvidia is superior; if AMD would finally become proper competition, maybe he would take it back a notch.
> 
> Besides, who cares about CEOs nowadays? They are replaceable like batteries. He's just a 3D-printed bureaucrat.



Everyone is replaceable, but the public face of a company is integral to the position of CEO.

This was more akin to scripted pre-event smack talk than what you'd expect from a multi-national corporation.


----------



## Anymal (Jan 10, 2019)

Mr.Mopar392 said:


> Even if Vega 7 was faster you still wouldn't buy it; your disdain for AMD is showing!


Well, it's not faster, that's why. You can also imagine how lame it is to release a top 7nm GPU with the performance of the top 16nm GPU from 2 years ago, at a higher TDP.


----------



## Vayra86 (Jan 10, 2019)

Sasqui said:


> I do believe ray tracing was thought of long before NVidia's RTX, and it was implemented in DX12 by MS. If it weren't for that, then it likely wouldn't have happened.
> 
> They do get kudos for pushing the envelope and making it part of their marketing strategy.  Will it pan out?  Maybe someday.



I think RTX was in the pipeline since Volta and since Pascal pushed the performance level to a comfy Ultra 1080p/60 for midrange, which means there is little ground left to cover except going along with the resolution increases of a niche of gamers. Nvidia knows just like everyone else that we're entering the realm of diminishing returns, they simply can't keep scaling things up the old way. That on its own was never going to be enough to keep the Nvidia profit train going, so they needed 'added value', and hey look, we already have Tensor cores, let's let 'em do RT. And that resulted in Quadro... and a new reason to push it on content developers. Boom new desire, new market created and datacenter tech repurposed.

RTX is an afterthought. This was _never_ about realtime RT; it was about producing content more efficiently, and that is proven by the complete lack of RTRT content. RTX is a complete gamble, and it was cheap for Nvidia to do; they had to offload those faulty Quadros somewhere anyway. In that sense the risk they are taking is low, and you can rest assured they have a non-RT escape route. We've already heard rumors about a GTX 11xx...


----------



## oxidized (Jan 10, 2019)

Anymal said:


> Well, it's not faster, that's why. You can also imagine how lame it is to release a top 7nm GPU with the performance of the top 16nm GPU from 2 years ago, at a higher TDP.



You call that "lame"; I call it ridiculous, lol.


----------



## Anymal (Jan 10, 2019)

Markosz said:


> If I had to choose between AMD and NVIDIA at exactly the same performance for the same money, I'd choose AMD, since NVIDIA as a company is disgusting.
> And that's without being an AMD fanboy; I'm angry at them for releasing the RX 590 the way they did, because that card for that money is such bullshit.


Thats a lot of bullshit.


----------



## Casecutter (Jan 10, 2019)

Nxodus said:


> Besides, who cares about CEOs nowadays? They are replaceable like batteries. He's just a 3D-printed bureaucrat.


You don't know who Jensen Huang is; he's not just a "figure-head"!  It'd be like saying China could just dump Xi Jinping.  But it's nice to hear an "up-swell" from people feeling he might find himself too close to the door.



Anymal said:


> If you can choose between the VII and the 2080, both priced at $750, and I believe we all have the power to make our own decision, why the hell would you choose the VII?



First, in America the more realistic scenario in a couple of weeks is that Vega 7 could be a $660 USD card after a rebate.   For perspective, today the RTX 2080 has 2-3 SKUs starting at $695, then ramps up to $730-750 (3-5 SKUs), while the bulk (14) run $780 on up to $900.

Let's revisit this around mid-March or April and see what plays out.



Hitman_Actual said:


> sparks a fire under AMD's butt


AND let's not forget Nvidia; they've been complacent too... the RTX 2060 is a dud at $350 in 2019.


----------



## Mr.Mopar392 (Jan 10, 2019)

Pruny said:


> Boycot ngreedia, who cares about tdp, just buy Radeon



I'll be getting a Vega 7. I will never own another Nvidia card as long as AMD is around, and seeing some of these nvidiot comments makes my choice of AMD much easier.


----------



## Anymal (Jan 10, 2019)

Vayra86 said:


> I think RTX was in the pipeline since Volta and since Pascal pushed the performance level to a comfy Ultra 1080p/60 for midrange, which means there is little ground left to cover except going along with the resolution increases of a niche of gamers. Nvidia knows just like everyone else that we're entering the realm of diminishing returns, they simply can't keep scaling things up the old way. That on its own was never going to be enough to keep the Nvidia profit train going, so they needed 'added value', and hey look, we already have Tensor cores, let's let 'em do RT. And that resulted in Quadro... and a new reason to push it on content developers. Boom new desire, new market created and datacenter tech repurposed.
> 
> RTX is an afterthought. This was _never_ about realtime RT; it was about producing content more efficiently, and that is proven by the complete lack of RTRT content. RTX is a complete gamble, and it was cheap for Nvidia to do; they had to offload those faulty Quadros somewhere anyway. In that sense the risk they are taking is low, and you can rest assured they have a non-RT escape route. We've already heard rumors about a GTX 11xx...


You really need to read more about RTX. Compare it to screen-space reflections; what's the difference?



Mr.Mopar392 said:


> I'll be getting a Vega 7. I will never own another Nvidia card as long as AMD is around, and seeing some of these nvidiot comments makes my choice of AMD much easier.


Well, now, who is the fanboy and hater?


----------



## eidairaman1 (Jan 10, 2019)

RH92 said:


> Sure, sure, but those Radeon Rays are nowhere to be seen in actual products, so the least we can say is that AMD is a full generation behind on this topic, plus a full generation behind in pure rasterization performance, since in 2019 they are catching up to 2017's 1080 Ti performance (at a similar MSRP on top of that, so they don't even have the price advantage anymore). And don't even get me started on energy efficiency! I mean, things are far from looking great in their GPU department...



They came to realize the perf hit for the detail level provided wasn't worth launching fully or advertising, so it's still under development. NV is getting skewered for it.


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> You really need to read more about RTX. Compare it to screen-space reflections; what's the difference?
> 
> 
> Well, now, who is the fanboy and hater?



You're new here / not a frequent poster, and that's okay, but you can search my post history and see how well-versed I am. I know precisely what it is compared to everything else.


----------



## PanicLake (Jan 10, 2019)

champsilva said:


> This is mining reason, same happened to amd.


I don't think so! The difference is that nVidia was stable and just dropped, while AMD had a "bubble" and then came back to its usual values, and is even higher than before the bubble.
(chart: nVidia in red, AMD in green)


----------



## eidairaman1 (Jan 10, 2019)

yakk said:


> Everyone is replaceable, but the public face of a company is integral to the position CEO.
> 
> This was more akin to a scripted pre-event smack talk, not a multi-national business corporation.



He is just another muppet, trying to minimize the truth.


----------



## Valantar (Jan 10, 2019)

Nxodus said:


> but AMD on the other hand is so nice, secretly they are a non-profit NGO spending all the profits to help the elderly and clean the oceans.


Well, no, they've got less power to abuse, so it's quite logical that there are fewer abuses of power from that side. That doesn't take away Nvidia's responsibility for their actions though. Far from it. They're still entirely to blame for being the anti-consumer, proprietary tech-promoting, wannabe monopolists that they are. AMD on the other hand supports open standards across the board, and while this is (obviously) done for both marketing reasons and out of necessity (can't establish a proprietary standard if you're 1/3rd of your competitor's size), it's still a better and more consumer-friendly stance.


As for the VII, it looks decent. Interesting to see where final (in-game) clocks end up.
Last gen:
Vega 64: mid-to-large die, 8GB expensive HBM2, pushed to the limit, 275W TDP
trailed or matched
GTX 1080: mid-to-small die, 8GB relatively cheap GDDR5X, OC potential, 180W TDP.
A clear win for Nvidia in both cost and performance.

This gen:
Radeon VII: mid-to-small die, 16GB (slightly less) expensive HBM2, unknown OC potential (likely not much), 300W TDP
reportedly roughly matches the
RTX 2080: a _huge_ die, 8GB (roughly equally) expensive GDDR6, less OC potential than last gen, 225W TDP.

Even counting the process jump, AMD is gaining on Nvidia. Nvidia does have RT and Tensor cores, but they're not really useful, and might never really be (for this level of hardware). Traditional rendering is still king, and AMD is hopefully progressing with a new arch, but as a stop-gap, this looks surprisingly good. Didn't think we'd see 7nm Vega for consumers, but given that the die likely costs half of an RTX 2080 die and HBM2 and GDDR6 cost roughly the same per GB, this sure is a role reversal from last year when Nvidia made more money selling faster parts for similar prices.
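The die-economics argument above (and the 331 mm² vs 545 mm² comparison from the article) can be sketched with a standard dies-per-wafer estimate plus a Poisson defect-limited yield model. This is a rough illustration only: the 300 mm wafer, the edge-loss approximation, and the 0.2 defects/cm² density are assumed values, and it deliberately ignores that a 7 nm wafer costs more than a 12 nm one.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Approximate gross dies per wafer, with a correction for edge loss."""
    radius = wafer_diameter_mm / 2
    gross = (math.pi * radius ** 2 / die_area_mm2
             - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
    return int(gross)

def poisson_yield(die_area_mm2: float, defects_per_cm2: float = 0.2) -> float:
    """Poisson model: probability a die has zero fatal defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)  # area in cm^2

# Die areas from the article; process nodes differ, so this compares
# dies per wafer only, not wafer cost.
for name, area in [("Radeon VII (331 mm^2)", 331), ("RTX 2080 (545 mm^2)", 545)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: ~{gross} gross dies/wafer, ~{good:.0f} good dies/wafer")
```

Under these assumptions the smaller die gets both more candidates per wafer and a higher per-die yield, which is the "smaller die with much better yields" point; the real cost picture still depends on per-wafer pricing and HBM2 vs GDDR6.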

Have to wonder what Navi will bring to the table. My Fury X lives on, but who knows what next year will bring?


----------



## Nxodus (Jan 10, 2019)

Vayra86 said:


> We already heard rumors about GTX 11xx...


Those rumors are 100% false. Seriously, who would believe that any sane company would cannibalize its own products?


----------



## Tsukiyomi91 (Jan 10, 2019)

the amount of salt is just... wow.

@Nxodus let alone "nerfing" the already expensive and hard-to-produce Turing core that has Tensor and RT cores baked into the silicon. No sane company would spend another billion (or trillion) dollars just to make another piece of silicon with the same codename but without the 2 key features they have already been putting into motion for nearly a decade.


----------



## iO (Jan 10, 2019)

He's pissed that AMD only had to port their ancient arch to 7nm to become somewhat competitive again.


And the only thing RTX crushes in its current form is the framerate...


----------



## Sasqui (Jan 10, 2019)

Vayra86 said:


> RTX is an afterthought. This was never about realtime RT, it was about producing content more efficiently. And that is proven by the complete lack of RTRT content.



I'm just going to have to disagree with you there, at least in part. The dream of real-time RT has been around for a while; it's been just that, a dream, thanks to the fact that no consumer hardware was capable of handling it. That said, it'll be interesting to see who is willing to put money into developing content for a very limited audience.


----------



## Nxodus (Jan 10, 2019)

Valantar said:


> AMD on the other hand supports open standards across the board



No one said AMD was dumb. Very smart move. They are forced to do open standards, and those bring you the most virtue-signaling points.


----------



## mouacyk (Jan 10, 2019)

GTX 11XX series will artifact with the wallpaper patterns behind him.


----------



## Sasqui (Jan 10, 2019)

Nxodus said:


> They are forced to do open standards, those bring you the best virtue-signaling points



Even better, they foster new markets.  Everyone wins, including the consumer.


----------



## RH92 (Jan 10, 2019)

eidairaman1 said:


> They came to realize the perf hit for the detail level provided wasn't worth launching fully or advertising, so it's still under development.



Truth is, little do you (or anyone else outside of AMD) know about that, so unless you work there and have secrets to share... At least pretend to be objective; you're not even trying...


----------



## Valantar (Jan 10, 2019)

Nxodus said:


> No one said AMD was dumb. Very smart move. They are forced to do open standards, and those bring you the most virtue-signaling points.


Not among insecure gamer bros; what Huang is doing here is nothing more than virtue-signaling how "tough" he is. The funny thing is, only fundamentally insecure people act like this. Oh well.

Also, the entire term "virtue signaling" is BS: doing good is doing good, regardless of whether your intent is to be seen doing good. That might be seen as a character flaw, but your actions are still beneficial in the end. Acting like an ass just to look tough, on the other hand, just makes you look insecure while benefiting nobody at all. Which of these is better, would you say?


----------



## eidairaman1 (Jan 10, 2019)

Sasqui said:


> Even better, they foster new markets.  Everyone wins, including the consumer.



Yep because NV was forced to by sales slumps, they didn't want to commit corporate suicide.


----------



## M2B (Jan 10, 2019)

Valantar said:


> Well, no, they've got less power to abuse, so it's quite logical that there are fewer abuses of power from that side. That doesn't take away Nvidia's responsibility for their actions though. Far from it. They're still entirely to blame for being the anti-consumer, proprietary tech-promoting, wannabe monopolists that they are. AMD on the other hand supports open standards across the board, and while this is (obviously) done for both marketing reasons and out of necessity (can't establish a proprietary standard if you're 1/3rd of your competitor's size), it's still a better and more consumer-friendly stance.
> 
> 
> As for the VII, it looks decent. Interesting to see where final (in-game) clocks end up.
> ...



You're forgetting something very important: Vega 64 vs. GTX 1080 was 14nm vs. 16nm. Radeon VII vs. RTX 2080 is a completely different story; they're not gaining on Nvidia, they're falling further behind.
"WhyCry" from VideoCardz says he has many sources stating that Navi was not good enough to compete and has been delayed.
If the Radeon VII is the most AMD can offer within 2 years, their flagship will get beaten by Nvidia's next-gen XX70-class card, which is just sad.


----------



## Mr.Mopar392 (Jan 10, 2019)

Anymal said:


> You really need to read more about RTX. Compare it to screen-space reflections; what's the difference?
> 
> 
> Well, now, who is the fanboy and hater?



Not a hater, trust me; a fanboy, yes. I really don't care what anyone wants to buy, it's their money; I just don't care for Nvidia cards or their tech, that's it. The last Nvidia cards I owned were a pair of GTX 570s in SLI; they ran like garbage and colors looked washed out. Haven't owned one since, and won't for the foreseeable future.


----------



## Valantar (Jan 10, 2019)

eidairaman1 said:


> Yep because NV was forced to by sales slumps, they didn't want to commit corporate suicide.


You still have to love (grudgingly) adopting a standard one day, and pissing on it the next as "not working" when talking about a competitor. Either you adopted a non-working tech, or you're lying to make yourself look good. Neither is particularly smart. 

Nvidia would require proprietary PSUs if they could pull it off. Oh, damn, I probably just gave them an idea. Sorry!


----------



## juiseman (Jan 10, 2019)

Sooo... did Nvidia announce any new product at CES, or just dis AMD?
This is just bad form.

I find it funny (maybe it's wrong) that 3 big companies who have been overcharging for their products (Apple, Intel & Nvidia)
are finally getting a taste of competition. They did this to themselves, and their share prices are reflecting it too.
I think they don't understand that people simply can't afford $1000+ iPhones, GPUs and CPUs anymore.


----------



## Sasqui (Jan 10, 2019)

eidairaman1 said:


> Yep because NV was forced to by sales slumps, they didn't want to commit corporate suicide.



Are you referring to them "allowing" Free-sync to be enabled on G-sync equipped monitors?


----------



## Mad_foxx1983 (Jan 10, 2019)

tvamos said:


> I've used 3 cards that support freesync with my monitor, zero issues. He knows g-sync is gonna sell less, one more reason for bitterness.



Same here. From an RX 580 to an RX Vega 56. Zero issues. Always smooth gameplay.


----------



## eidairaman1 (Jan 10, 2019)

Sasqui said:


> Are you referring to them "allowing" Free-sync to be enabled on G-sync equipped monitors?



Yup, and they supported Vulkan after catching flak, IIRC...


----------



## Valantar (Jan 10, 2019)

M2B said:


> You're forgetting something very important: Vega 64 vs. GTX 1080 was 14nm vs. 16nm. Radeon VII vs. RTX 2080 is a completely different story; they're not gaining on Nvidia, they're falling further behind.
> "WhyCry" from VideoCardz says he has many sources stating that Navi was not good enough to compete and has been delayed.
> If the Radeon VII is the most AMD can offer within 2 years, their flagship will get beaten by Nvidia's next-gen XX70-class card, which is just sad.


Sure, there is a process advantage, and there is no doubt Nvidia's base arch is better in both perf/W and absolute perf (if you think I said otherwise, you misunderstood what "gaining" means), but Nvidia could have chosen to go wide and slow with the 20XX series on 12nm, adding SMs rather than RT and Tensor cores. Instead, they chose stagnation. Nvidia had zero perf/W gain (and no perf/SM gain) moving from 16 to 12nm, just slightly smaller dies, so they're definitely not pulling away. They'll obviously gain perf when they move to 7nm, but that's likely still a year out (though I wouldn't put it past Nvidia to make their own $1200 GPUs obsolete within a year, frankly). Still, this buys AMD valuable time to improve their arch before that happens.

Strategically, this is an _excellent_ move by AMD. Will be interesting to see how it plays out when the next generations arrive.


----------



## dogsbody (Jan 10, 2019)

Dimi said:


> Mind you, my 1440p 165Hz G-Sync monitor only cost me $350. Granted, it's a TN panel, but I use a ColorMunki display calibrator for color accuracy.


What is the point of color calibration on a TN display, which only displays colors correctly within a circle centered where the screen normal intersects your iris? Colors are badly distorted towards the edges of that circle and everywhere else (way beyond what lack of calibration would cause). TN is crap due to lousy viewing angles.


----------



## Anymal (Jan 10, 2019)

Desperate moves, 2 years later.


----------



## Sasqui (Jan 10, 2019)

dogsbody said:


> What is the point of color calibration on a TN display, which only displays colors correctly within a circle centered where the screen normal intersects your iris? Colors are badly distorted towards the edges of that circle and everywhere else (way beyond what lack of calibration would cause). TN is crap due to lousy viewing angles.



Not to be flip or anything, but don't you turn your eyes or head when in front of your monitor?  You certainly are correct about human peripheral color vision.


----------



## Anymal (Jan 10, 2019)

Vayra86 said:


> You're new here / not a frequent poster, and that's okay, but you can search my post history and see how well-versed I am. I know precisely what it is compared to everything else.


You sure are. Just misled right now.


----------



## Valantar (Jan 10, 2019)

dogsbody said:


> What is the point of color calibration on a TN display, which only displays colors correctly within a circle centered where the screen normal intersects your iris? Colors are badly distorted towards the edges of that circle and everywhere else (way beyond what lack of calibration would cause). TN is crap due to lousy viewing angles.


I still remember the first time I connected my Dell U2711, which replaced a (not cheap) 1200p Samsung TN monitor back in 2010 or 2011 (damn, my monitor is eight years old!). Mind? Blown. There is no way on earth I'll ever go back to TN, and I'd much rather take 75Hz FS IPS than 144Hz FS TN. Heck, I'm waiting for money (and motivation) to replace my current 60Hz non-FS IPS with something actually worth the upgrade, when I could afford to replace it with a 90-120Hz FS TN tomorrow if I wanted to.


----------



## kapone32 (Jan 10, 2019)

Anyone who knows anything about the history of Nvidia would not be in any way surprised by that comment. Jensen is one of the most evil executives in any industry. All I heard from him is ray tracing this year, with no mention of HairWorks... I guess that is going the way of PhysX, hehe.


----------



## R0H1T (Jan 10, 2019)

Valantar said:


> Strategically, this is an _excellent_ move by AMD. Will be interesting to see how it plays out when the next generations arrive.


Well, if (some of) the rumor mill is to be believed, Navi should come out later this year - which, frankly, IMO would be just about perfect timing, considering it's been in development for such a long time.


----------



## Dr_b_ (Jan 10, 2019)

Ray tracing RTX cards are a botched launch.  The cost is astronomical for the only 2 parts that are truly faster than the Pascal cards, and the performance and availability in titles are lacking.  This feature also doesn't improve anything for existing games, which are 99.99% of all games that you would need a dGPU to play at decent framerates.   We must admit that it's great to see new technology, but it can't be so expensive that no one can afford it or would want to pay that much for what it delivers.


----------



## Nxodus (Jan 10, 2019)

Valantar said:


> Not among insecure gamer bros; what Huang is doing here is nothing more than virtue-signaling how "tough" he is. The funny thing is, only fundamentally insecure people act like this. Oh well.
> 
> Also, the entire term "virtue signaling" is BS - doing good is doing good, regardless of your intent is to be seen doing good. That might be seen as a character flaw, but your actions are still beneficial in the end. Acting like an ass just to look tough, on the other hand, just makes you look insecure, all the while benefiting nobody at all. Which of these is better, would you say?



Huang insecure? He has absolutely no reason to be insecure. He might be fed up with AMD armies sh*tting on the 20 series all over the internet, and it might have gotten to him that AMD armies expected a cheaper and better 2080 killer, and look what they got: an overpriced product with zero innovation and similar or worse performance despite the die shrink. He won again. Don't tell me he's insecure because of the VII.


----------



## PanicLake (Jan 10, 2019)

Nxodus said:


> Huang insecure? He has absolutely no reason to be insecure.


Falling stock begs to differ!


----------



## Valantar (Jan 10, 2019)

Nxodus said:


> Huang insecure? He has absolutely no reason to be insecure. He might be fed up with AMD armies sh*tting on the 20 series all over the internet, and it might have gotten to him that AMD armies expected a cheaper and better 2080 killer, and look what they got: an overpriced product with zero innovation and similar or worse performance despite the die shrink. He won again. Don't tell me he's insecure because of the VII.


Insecurity isn't rational. Doesn't matter what the reasoning is, this is not the behavior of a confident person. Period. This is the behavior of someone desperately trying to come off as confident (and blowing it).


----------



## GoldenX (Jan 10, 2019)

Are all CEOs this bitchy? You already have a technological and performance advantage over your competition, plus better branding in the eyes of non-tech people; there is no need to act like a stereotypical fanboy.


----------



## Nxodus (Jan 10, 2019)

juiseman said:


> Sooo... did Nvidia announce any new product at CES? Or just dis AMD?
> This is just bad form.
> 
> I find it funny (maybe it's wrong) that 3 big companies who have been overcharging for their products (Apple, Intel & Nvidia)
> ...



Because the Radeon VII is such a cheap deal, right? You people must accept a few facts: "normies" still prefer to play on consoles or mobile. PC has become niche. Many times there has been news about PC and PC component sales tanking. You can't stay afloat in a shrinking market by playing nice and cheap. I for one am very grateful there are still companies left who work in a niche market. I'm grateful anyone is even developing games for PC. Just look at the RGB craze. That's a desperate effort to get normies back into the PC market: "Just look at our flashy stuff, your PS4 can't do that." I'm pretty sure the times of cheap PC components are over, especially gaming ones. PCs are too complicated for the average Joe.



GinoLatino said:


> Falling stock begs to differ!



The stock market is volatile; even the Dow Jones has had some bad days. So what, should all Americans commit sudoku when it falls 5%?


----------



## R0H1T (Jan 10, 2019)

*Sudoku*, are you confusing it with something else?


----------



## juiseman (Jan 10, 2019)

Remember, this is the launch price; I bet it will drop $150 or more before too long.
Then the price seems more attractive.
Will someone remind me of the RTX launch price?
If I remember, the 2080 Ti was around $1000 or more?


----------



## Nxodus (Jan 10, 2019)

R0H1T said:


> *Sudoku*, are you confusing it with something else



seppuku


----------



## medi01 (Jan 10, 2019)

Nxodus said:


> Huang insecure? He has absolutely no reason to be insecure.


Great to see analysts with insights posting on this forum.
Would you mind elaborating on the NV stock turbulence?


----------



## TheLaughingMan (Jan 10, 2019)

So his argument is that AMD can't compete with them on their proprietary hardware and software? That makes no sense whatsoever.


----------



## Nxodus (Jan 10, 2019)

medi01 said:


> Great to see analysts with insights posting on this forum.
> Would you mind to elaborate on Nv stock turbulence?



You don't have to be an expert to see certain trends. Stuff like Facebook stock dropping after the data leak scandals. Facebook is still here, unfortunately.
What about gold and silver? A few years ago prices literally collapsed, and behold, people are still buying precious metals. Markets adjust, and life goes on. As long as Nvidia is still turning a profit there's no bankruptcy in sight, regardless of share values.


----------



## HTC (Jan 10, 2019)

In case nobody mentioned it yet:



> The performance is lousy and there’s nothing new.



By Jensen Huang



> What I would say is that we’re very excited about Radeon VII, and I would probably suggest that he hasn’t seen it yet.



By Lisa Su

Seems a ... diplomatic ... reply.

Found that here: can't find its source


----------



## kapone32 (Jan 10, 2019)

Nxodus said:


> Because the Radeon VII is such a cheap deal, right? You people must accept a few facts: "Normies" still prefer to play on consoles or mobile. PC has become niche. Many times there were news about PC and PC component sales tanking. You can't stay afloat in a shrinking market by playing nice and cheap. I for one am very grateful theres still companies left who work in a niche market. I'm grateful anyone is even developing games for PC. Just look at the RGB craze. That's a deperate effort to get normies back into the PC market. "Just look at out flashy stuff, your PS4 can't do that". I'm pretty sure the times of cheap PC components are over, especially gaming ones. PC's are too complicated to average Joe.
> ...


That is an interesting statement. In my opinion it only applies to the highest end of the market. If you didn't know, AMD has released products like $99 APUs that can play Fortnite all day. And while the AAA companies' stocks are floundering, indie developers and ones not tied to corporate greed (where shareholders are more important to satisfy than customers) are excelling. Doubt me? Ask how many people are interested in Cyberpunk 2077 or Total War: Warhammer 3, which comes from Sega, a company that is doing quite well in today's volatile market. Why did I mention Fortnite? I have received 3 requests in the last month to build Fortnite-capable PCs for friends and colleagues.


----------



## Casecutter (Jan 10, 2019)

mouacyk said:


> GTX 11XX series will artifact with the wallpaper patterns behind him.



No, behind him is the mat he's playing Twister on.


----------



## Nxodus (Jan 10, 2019)

While silly games like Fortnite can bring a few people over, I don't see a trend in the near future where PC gaming goes mainstream. AAA games produce meager sales of a few dozen million, in a world of 7.7 billion people.


----------



## XXL_AI (Jan 10, 2019)

Dude, get your act together. We know AMD sucks, but bashing them in public as the CEO of the most innovative GPU brand in history? Relax a bit.


----------



## scevism (Jan 10, 2019)

I've just got a 55" Samsung TV with FreeSync built in, 120 fps, with my R9 290X, only at 1080p, but it works for me. WITH MY LEATHER JACKET ON LOL.....


----------



## kapone32 (Jan 10, 2019)

Nxodus said:


> While silly games like Fortnite can bring a few people over, I don't see a trend in the near future where PC gaming goes mainstream. AAA games produce meager sales of a few dozen million, in a world of 7.7 billion people.



A few? There are over 14 million players of Fortnite and it is growing. A lot of those users are adults, but the majority are kids who will get exposed to PC gaming. I have a friend who had been on consoles for the longest time. I helped him build a PC using an R7 2700X, a 1070 Ti, NVMe and SSD storage, and a 1440p screen. All I can tell you is he has made statements like "now I see how gaming magazines' pictures looked so crisp".


----------



## Sasqui (Jan 10, 2019)

Take this with a grain of salt






source: https://www.wepc.com/news/video-game-statistics/


----------



## nickbaldwin86 (Jan 10, 2019)

Love how everyone is beating on the stock market being down for NV.

At the time of posting it is at 143.09 and UP.
While AMD is DOWN at 19.33... I would rather own stock in a company that is worth $143 than a company at $19. Just food for thought, people.

NVIDIA Corporation
NASDAQ: NVDA
143.09 USD +0.51 (0.36%)
Jan 10, 1:56 PM EST ·

Advanced Micro Devices, Inc.
NASDAQ: AMD
19.33 USD −0.86 (4.26%)
Jan 10, 1:55 PM EST


----------



## Nxodus (Jan 10, 2019)

kapone32 said:


> A few? There are over 14 million players of Fortnite and it is growing. A lot of those users are adults, but the majority are kids who will get exposed to PC gaming. I have a friend who had been on consoles for the longest time. I helped him build a PC using an R7 2700X, a 1070 Ti, NVMe and SSD storage, and a 1440p screen. All I can tell you is he has made statements like "now I see how gaming magazines' pictures looked so crisp".



Nice of you to help him ascend. We now need a few million more to help keep hardware prices down.



Sasqui said:


> Take this with a grain of salt
> 
> 
> 
> ...



grain of salt taken, still extremely worrisome


----------



## XiGMAKiD (Jan 10, 2019)

HTC said:


> In case nobody mentioned it yet:
> 
> 
> 
> ...


It's from PC World; the link is in this thread's news source.


----------



## geon2k2 (Jan 10, 2019)

This looks like:
"... and the grapes are sour," said the fox when he saw that he could not reach them ...
or maybe:
"... all this technology, and these guys at AMD crush us at compute, crush us at bandwidth, with such old technology and such a small die size, a bit more than half of ours."

Anyway, a pretty lame speech for a CEO, no diplomacy at all.


----------



## Sasqui (Jan 10, 2019)

Nxodus said:


> grain of salt taken, still extremely worrisome



Gone from a majority to a healthy minority.

If it weren't for crypto, NVDA and AMD wouldn't be crapping their trousers at this point in time.


----------



## Anymal (Jan 10, 2019)

geon2k2 said:


> this looks like:
> ... and the grapes are sour, said the fox when he saw that he could not reach them ...
> or maybe:
> ... all this technology and these guys at amd crush us at compute, crush us at bandwidth with such an old technology and such small die size, a bit more than half of ours.
> ...


It looks like: lol, 7nm, 300W TDP, only 2080 or 1080 Ti performance from 2 years ago, no new features, but 16GB of HBM2, for games...


----------



## Vayra86 (Jan 10, 2019)

Nxodus said:


> Those rumors are 100% false. Seriously, who would believe that any sane company would cannibalize its own products?



It's not a stretch for Nvidia to rebrand Pascal one more time and rearrange the product stack with it. They could even pull a Kepler-style refresh, because there's GDDR6 now. Even up until very recently they've released 1060 versions we never saw before. A Pascal "2" could become a value-oriented, non-RT GPU stack to secure those upgrades people have been sitting on/waiting for. It could also be Nvidia's answer to the Radeon VII, to compete with it on price.

Imagine you had a 1080 Ti right now. Your only option is moving to a 2080 Ti at nearly double the MSRP for a pretty marginal performance win. Not really the best route. And if I look at myself, there is absolutely NO reason for me to ditch my 1080. Only a value-oriented performance upgrade would put me on that path, and right now all I can move to is an $800 2080 to even make a dent.

Regardless, they're rumors. Just saying, don't be surprised if it happens.


----------



## eidairaman1 (Jan 10, 2019)

kapone32 said:


> Anyone who knows anything about the history of Nvidia would not be any way surprised by that comment. Jensen is one of the most evil executives in any industry. All i heard from him is Ray tracing this year with no mention of hair works......I guess that is going to way of Physx hehe



If that's true- good riddance to rubbish



XXL_AI said:


> Dude, get your act together. We know AMD sucks, but bashing them in public as the CEO of the most innovative GPU brand in history? Relax a bit.



Say that on the guy's Twitter directly; he can't see your comment here.


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> You sure are. Just misleaded right now.



I am, as always, ready to read your sound argumentation on that. So far you're not being very good at producing a post with substance. All I'm seeing is repetitive oneliners lacking data to back them up.


----------



## Anymal (Jan 10, 2019)

Vayra86 said:


> I am, as always, ready to read your sound argumentation on that. So far you're not being very good at producing a post with substance. All I'm seeing is repetitive oneliners lacking data to back them up.


Still waiting for your argument as to why 60 fps RTX on a 1080p monitor with a $350 2060 is a fail for you. It will get better and better, and it's new technology, not just being better at rasterization. You should watch Jensen's keynote at CES 2019, explained for the plain-minded like you.


----------



## neatfeatguy (Jan 10, 2019)

I think he went the very wrong way with his comments. They should have been more along the lines of how his top-end cards are still a great value (some kind of PR spin to make them sound good for what they are), then commend AMD for bringing some friendly competition and how he hopes for more from them, and yadda yadda yadda.


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> Still waiting for your argument as to why 60 fps RTX on a 1080p monitor with a $350 2060 is a fail for you. It will get better and better, and it's new technology, not just being better at rasterization. You should watch Jensen's keynote at CES 2019, explained for the plain-minded like you.



Troll confirmed. I already pointed that out extensively; start reading - no content, no worthwhile visual gain, etc., etc.

Besides, this isn't about 2060, but the high end.


----------



## efikkan (Jan 10, 2019)

Anymal said:


> It looks like: lol, 7nm, 300w tdp only 2080 or 1080ti performance from 2 years ago.


To buyers, the only thing that actually matters is how well it performs, along with thermals, noise, etc. What process they use, how many cores, what clocks, etc. shouldn't matter to buyers; the benchmarks should show the reality. But if AMD needs ~300W to match the performance level of an RTX 2080 / GTX 1080 Ti, they need to offer some other advantage to justify a TDP that much higher, e.g. a lower price.

Strategically, I can't see any really good reason for AMD to release this card. My best guess is they felt the need to get some attention in the market, since they don't have anything else for a good while. This is similar to the "pointless" RX 590 of last year, released just to get another review cycle.

For the future, this doesn't look good. AMD just spent what might be the last "good" node shrink, and repurposed a GPU for the professional market, a GPU which is very expensive to make and takes up precious capacity on the 7nm node. If this had been planned a long time ago, they would have made a GPU without full fp64 support and with GDDR6; then at least they could have sold it much cheaper. But with Vega 20 they are at a disadvantage: not only is the HBM2 more expensive, the die is too. And they even put 16GB on it, which is completely wasted for gaming. If they really wanted more than 8GB, they could have gone with 12GB and at least made it a little cheaper.


----------



## Anymal (Jan 10, 2019)

He was so surprised at how lame the VII is for a 7nm GPU that he got carried away. Maybe a few drinks beforehand; they are all rich and cocky.



Vayra86 said:


> Troll confirmed. I already pointed that out extensively, start reading - no content, no worthwhile visual gain, etc etc


What do you call the VII then? Color me a troll, but your statements just make no sense. RT is the only new content, the best visual gain in years.



efikkan said:


> To buyers, the only thing that actually matters is how well it performs, along with thermals, noise etc. What process they use, how many cores, what clocks, etc. shouldn't matter to buyers, the benchmarks should show the reality. But if AMD needs ~300W to match the performance level of RTX 2080 / GTX 1080 Ti, they need to offer up some other advantage to justify going with a TDP that much higher, e.g. lower price.
> 
> Strategically, I can't see any real good reasons why AMD releases this card. My best guess is they felt the need to get some attention in the market since they don't have anything else for a good while. This is similar to the "pointless" RX 590 released last year, released just to get another review cycle.
> 
> For the future, this doesn't look good. AMD just spent what might be the last "good" node shrink, and repurposed a GPU for the professional market, a GPU which is very expensive to make and taking up precious capacity on the 7nm node. If this was planned a long time ago, they would have made a GPU without full fp64 support and with GDDR6, then at least they could have sold it much cheaper. But with Vega 20 they are at a disadvantage, not only is the HBM2 more expensive, the die is too. And they even put 16GB on it, which is completely wasted for gaming. If they really wanted more than 8GB, they could have gone with 12GB, and at least made it a little cheaper.


Along with thermals, noise... well, a 300W TDP works against both of those. Price? Same as the 2080, at least at the beginning. Customers are not stupid; market share is the evidence.


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> What do you call the VII then? Color me a troll, but your statements just make no sense. RT is the only new content, the best visual gain in years.



Ah I see, so your logic is that 2080 performance is now midrange? Maybe you got carried away, had a few drinks before and got all cocky?


----------



## Anymal (Jan 10, 2019)

Vayra86 said:


> Ah I see, so your logic is that 2080 performance is now midrange? Maybe you got carried away, had a few drinks before and got all cocky?


No argument, then?
The VII is good, but the 2080 is "no new content, no visual gains"... seriously?


----------



## Vayra86 (Jan 10, 2019)

Anymal said:


> No argument then?
> VII is good, 2080 is no new content, no visual gains,... seriously?



You really have reading issues, I already told you, skim back a few pages. Think of it as a treasure hunt!


----------



## JaymondoGB (Jan 10, 2019)

Such a nasty bloke who is always so negative, which is a bit like their stock.


----------



## mtcn77 (Jan 10, 2019)

Anymal said:


> It looks like: lol, 7nm, 300w tdp only 2080 or 1080ti performance from 2 years ago, no new features, but 16gb hbm2, for games...


Wrong, 180W and ~92% yield at max OC. People need to remember GPU Boost 3.0 is vendor-locked while the AMD driver is accessible. Changing voltages and undervolting is done by default on Nvidia, not manually like on AMD. Also, the memory model is tiled, meaning it is not immediate-mode and always consuming power like AMD's. If both cards were at 100% load, the comparison would much more likely be in AMD's favour because of the stated inference.


----------



## Valantar (Jan 10, 2019)

nickbaldwin86 said:


> love how everyone is beating on the stock market being down for NV
> 
> @ time of posting it is at 143.09 and UP
> While AMD is DOWN @ 19.33 ... I would rather own stock in a company that is worth $143 than a company with $19. just food for thought people.
> ...


Doesn't that depend on the price when you actually bought the stock? Your post sure makes it sound like you don't quite understand how this works. Owning $150 stock if you bought in at $170 is certainly quite a lot worse than owning $15 stock having bought in at $10 - and as you're likely to have invested the same amount, you'd also have 10-20x more shares of the cheaper one, making the difference very noticeable.
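The arithmetic behind that is easy to check. A quick sketch, using the hypothetical buy-in prices from the example above (not real quotes; the helper name is just for illustration):

```python
def pct_return(buy: float, now: float) -> float:
    """Percentage gain/loss relative to the purchase price."""
    return (now - buy) / buy * 100.0

# Hypothetical entry/current prices from the example above
expensive = pct_return(buy=170.0, now=150.0)  # bought high, now lower
cheap = pct_return(buy=10.0, now=15.0)        # bought low, now higher

print(f"$170 -> $150: {expensive:+.1f}%")  # -11.8%
print(f"$10  -> $15:  {cheap:+.1f}%")      # +50.0%
```

A high share price says nothing about returns on its own; only the change relative to your entry price does.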


Sasqui said:


> Gone from a majority to a healthy minority.
> 
> If it weren't for crypto, NVDA and AMD wouldn't be crapping their trousers at this point in time.


What? You get that the total addressable market has expanded dramatically over that period of time, right? The PC gaming hardware market is not smaller than before, and the software market certainly isn't. Other devices growing into their own and establishing themselves as parallel alternatives does not at all spell impending doom for the incumbent segments. That PC gaming is now "a minority" is meaningless when the total gaming market is 10x what it was when that was the case.


efikkan said:


> To buyers, the only thing that actually matters is how well it performs, along with thermals, noise etc. What process they use, how many cores, what clocks, etc. shouldn't matter to buyers, the benchmarks should show the reality. But if AMD needs ~300W to match the performance level of RTX 2080 / GTX 1080 Ti, they need to offer up some other advantage to justify going with a TDP that much higher, e.g. lower price.
> 
> Strategically, I can't see any real good reasons why AMD releases this card. My best guess is they felt the need to get some attention in the market since they don't have anything else for a good while. This is similar to the "pointless" RX 590 released last year, released just to get another review cycle.
> 
> For the future, this doesn't look good. AMD just spent what might be the last "good" node shrink, and repurposed a GPU for the professional market, a GPU which is very expensive to make and taking up precious capacity on the 7nm node. If this was planned a long time ago, they would have made a GPU without full fp64 support and with GDDR6, then at least they could have sold it much cheaper. But with Vega 20 they are at a disadvantage, not only is the HBM2 more expensive, the die is too. And they even put 16GB on it, which is completely wasted for gaming. If they really wanted more than 8GB, they could have gone with 12GB, and at least made it a little cheaper.


You've got this all turned around.
-Very expensive chip? Not likely. It's <400mm². Even on 7nm, that's likely to be cheaper than Nvidia's monster 12nm chips. It does need an interposer and HBM, but given that GDDR6 is reportedly 60-70% more expensive than GDDR5(X), that leaves us at roughly price parity between GDDR6 and HBM2. AMD uses twice as much, which means lower margins there, but it's not night and day. The interposer is likely not that expensive, given that it's far smaller than the Fiji lineup's interposer.
-"That much higher TDP"? While there's no denying that Nvidia's cards are more efficient, by a significant margin, the difference between a 300W VII and a 275W 1080Ti or 225W 2080 is ... not that much, really. 9 and 33% respectively. The difference was _far_ bigger the last go around, with the 275W V64 roughly matching the 180W 1080 (52%).
-Strategically, this is AMD saying "Hey, we're here, and we can compete this go around too" rather than leaving the old, 1080-ish V64 as their highest performer. At the same price or cheaper than the 2080 for matching performance, this is a good alternative, particularly with a good tri-fan cooler on the reference card. That makes a lot of sense, as the other option would be leaving the RTX series unopposed until Navi arrives. If that's within a few months, this would be of no use, but if it's toward fall or Q4, this is a very good strategic move. Nvidia can't launch anything new in response; all they can do is lower prices on their existing cards, which we know are ridiculously expensive to produce, even if they are still overpriced. If Nvidia doesn't, AMD will sell more. Whatever happens, this is a good move by AMD.
-"Spending a node shrink" on a refresh of a known architecture is a well-established mode of operation in the tech world. It's when you _don't_ have a node shrink that you need a new arch - a position that AMD has now put itself in. This primes the market for Navi. Of course, it lessens the "wow" effect if Navi delivers on improving perf/W (combining that with a node shrink would make it _really_ stand out), but on the other hand it makes for continuity in AMD's offerings and the performance of their product stack.
-While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements when doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.
-You're right that 12GB of HBM2 might have made this cheaper, but getting 6hi vs. 8hi KGSDs of HBM2 is likely a challenge, and given that 8hi is in volume production for enterprise/datacenter/etc., it might be the same price or even cheaper than special-ordered 6hi stacks.
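The percentage gaps quoted above check out; a quick sketch using the TDP figures from the post (the helper name is just for illustration):

```python
def tdp_gap(amd_w: float, nvidia_w: float) -> float:
    """How much higher the first card's TDP is, as a percentage of the second's."""
    return (amd_w - nvidia_w) / nvidia_w * 100.0

# TDP figures as quoted in the post above
print(f"VII 300W vs 1080 Ti 275W: +{tdp_gap(300, 275):.1f}%")  # +9.1%
print(f"VII 300W vs 2080 225W:    +{tdp_gap(300, 225):.1f}%")  # +33.3%
print(f"V64 275W vs 1080 180W:    +{tdp_gap(275, 180):.1f}%")  # +52.8%
```

So the relative efficiency gap this generation really is much narrower than the V64-vs-1080 round was.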


----------



## Anymal (Jan 10, 2019)

Vayra86 said:


> You really have reading issues, I already told you, skim back a few pages. Think of it as a treasure hunt!


Two posts ago you said there is no new content, no visual gains. The treasure hunt is over.


----------



## HammerON (Jan 10, 2019)

Alright @Vayra86 and @Anymal - enough already.  Take it to PM if you want to continue bickering.


----------



## VulkanBros (Jan 10, 2019)

This seems like a very odd statement from Mr. nVidia...
He must know that such a statement will be discussed in various tech forums and will likely backfire, so what prompted such an odd remark? It makes no sense at all to me.


----------



## HisDivineOrder (Jan 10, 2019)

This whole affair really highlights how bad the GPU market is right now.

Nvidia is owning AMD by providing a smidge more performance than last generation for decently more money at the same point in the product stack.  Meanwhile, AMD can't even beat THAT with a new card launch and a die shrink.  So:

1) Nvidia took a technological advantage and wasted most of their extra die space on fad features that won't even get play until a few generations from now.  They should have waited to implement RTX until they had a die shrink and space to kill.  Here, they basically made a slightly enhanced version of their last generation and added features that won't matter for a few gens.  Whoops.  Meaning performance didn't go up by much, compelling them to swap the model numbers around and hide that behind marketing spin of "AND COMPARING THE 2080 TO THE 1080, WE SEE A HUGE UPLIFT IN FRAMERATE!"

2) AMD took a die-shrink advantage and managed to barely keep up with Nvidia's last-generation high end.  So for the same money as Nvidia, they'll give you a card that mostly keeps up, at a higher power draw, and lacks even the fad features that Nvidia has.

So if Nvidia was "greedy" to charge basically the same for a tiny performance uplift plus fad features, then what is AMD to charge the same price for virtually the same (or less?) performance WITHOUT the fad features?

This is Aliens vs Predator all over again.  No matter who wins this fight, we all just lost.


----------



## efikkan (Jan 10, 2019)

Valantar said:


> -Very expensive chip? Not likely. It's <400mm2. Even on 7nm, that's likely to be cheaper than Nvidia's monster 12nm chips.


Estimates are about twice the cost per mm². Combined with an interposer and HBM2, this will be more expensive than "12nm" and GDDR6.



Valantar said:


> -"That much higher TDP"? While there's no denying that Nvidia's cards are more efficient, by a significant margin, the difference between a 300W VII and a 275W 1080Ti or 225W 2080 is ... not that much, really. 9 and 33% respectively. The difference was far bigger the last go around, with the 275W V64 roughly matching the 180W 1080 (52%).


When comparing RTX 2080 vs "Radeon VII", presumably with similar performance, ~300W vs. 215W and the same price, it becomes a hard sell. On the other hand if it was $50 cheaper or had some other significant advantage, it could have some potential, assuming AMD could make enough of them.



Valantar said:


> -"Spending a node shrink" on a refresh of a known architecture is a well established mode of operations in the tech world.
> <snip>
> -While I agree that Vega as an architecture is far better suited for professional applications…


The refresh itself is not the problem, but doing it to make a very expensive GPU with irrelevant features that has no real chance in the market, due to the great inefficiency of the architecture.

Vega 20 was intended for professional markets, which means it includes more fp64 cores and, I believe, some other new features, which is wasted die space on a very expensive and currently low-volume node. As I mentioned, if they had a Vega 20 without this and with GDDR, it would at least make some sense.



Valantar said:


> -You're right that 12GB of HBM2 might have made this cheaper, but getting 6hi vs. 8hi KGSDs of HBM2 is likely a challenge, and given that 8hi is in volume production for enterprise/datacenter/etc., it might be the same price or even cheaper than special-ordered 6hi stacks.


No, I'm talking about using 3 stacks instead of 4.


----------



## Mistral (Jan 10, 2019)

Wow, he sounds super salty... 
It "barely keeps up with" our more expensive card "and if we turn on ray tracing we'll crush it" by halving our framerates...


----------



## GreiverBlade (Jan 10, 2019)

Heck... that speech is making me want a Vega VII... rather than a 20XX card... even if the price point is around a 2080 (where I live...).
Eagerly waiting on AIBs and e-tailers to list them to see...

But for now I am more inclined to get a deal on a Vega 64 rather than getting a 2070 from the green goblin.


Red dawn indeed...


----------



## Anymal (Jan 10, 2019)

No halving, thanks to DLSS.


----------



## VulkanBros (Jan 10, 2019)

Exactly my thought - it will backfire on nVidia...



GreiverBlade said:


> heck ... that speech is making me want a Vega VII ... rather than a 20XX card ... even if the price point is around a 2080 (where i live ...)
> eagerly waiting on AIB and etailer to list them to see ...
> 
> but for now i am more inclined to get a deal on a Vega 64 rather than getting a 2070 from the green goblin ---
> ...


----------



## Anymal (Jan 10, 2019)

GreiverBlade said:


> heck ... that speech is making me want a Vega VII ... rather than a 20XX card ... even if the price point is around a 2080 (where i live ...)
> eagerly waiting on AIB and etailer to list them to see ...
> 
> but for now i am more inclined to get a deal on a Vega 64 rather than getting a 2070 from the green goblin ---
> ...


You are getting all emotional.


----------



## neatfeatguy (Jan 10, 2019)

Mr.Mopar392 said:


> Not a hater trust me, a fanboy yes. I Really don't care what anyone wants to buy its their money, i just don't care for Nvidia cards or their tech thats it. last nvidia cards i own was a pair of gtx 570 in sli they ran like garbage and color look wash out. Never own since then, not for the foreseeable future.



I ran 570s in SLI for 4+ years. I thought they handled things rather well. I used them across several different monitors over that time frame and things never looked washed out. I maxed out games at 1080p for a long time. I probably would have kept the cards until Pascal hit the shelves, but having jumped to the 5760x1080 resolution that I really like playing games at (when they support it), the 570s struggled. I ran Far Cry 3 at 5760x1080, but with all settings turned down to the medium range, and FPS were usually around 40-50. I could max out Borderlands 2 (except for PhysX; I kept that on low) and hold around 60fps at 5760x1080... anyway, I decided to jump to something, and after some deliberating I went with my 980Ti AMP Omega. Hell of an upgrade.

I can't say I've ever had issues with Nvidia cards before. I've had issues with SLI over the years, but not big glaring issues that people seem to claim. My next card will most likely be from Nvidia because I've been using them for so long....but if AMD can throw something out that works as well or better and cost less, I wouldn't be against picking up one of their GPUs for my next upgrade. My next GPU is still probably 2-3 years out (if my 980Ti lasts that long), so who knows what we'll see then.


----------



## Vya Domus (Jan 10, 2019)

Sticks and stones will break my bones, but words will never bring back the market cap.


----------



## mindbomb (Jan 10, 2019)

I have an RX 480 in my main rig, and Jensen might be right about FreeSync. It was so poorly regulated that it was hard to figure out which monitors actually performed well over their variable refresh range. Most reviewers only check image quality at fixed refresh rates.


----------



## kapone32 (Jan 10, 2019)

mindbomb said:


> I have an rx 480 in my main rig, and jensen might be right about freesync. It was so poorly regulated that it was hard figuring out which monitors actually performed well in their variable refresh range. Most reviewers only check image quality with fixed refresh rates.



If you really want to get FreeSync working as intended, use a program called CRU. There you can adjust the sync range of your monitor; I have mine set to 25-80.
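For those curious how tools like CRU do this: the refresh range lives in the monitor's EDID, in the Display Range Limits descriptor (type tag 0xFD), and CRU installs an edited override of that data. As a simplified sketch (it ignores the EDID 1.4 rate-offset flags in byte 4, and real FreeSync monitors may carry range data in extension blocks too), the range can be read out of an EDID base block like this:

```python
def parse_vrr_range(edid: bytes):
    """Return (v_min_hz, v_max_hz) from an EDID base block's Display Range
    Limits descriptor (tag 0xFD), or None if the descriptor is absent.
    Simplified: ignores the EDID 1.4 rate-offset flags in byte 4."""
    for off in (54, 72, 90, 108):        # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        # Display descriptors (unlike detailed timings) start 00 00 00,
        # with the descriptor type tag in byte 3.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]            # min / max vertical rate in Hz
    return None
```

Editing those two bytes (within what the panel can actually sustain) is essentially what widening the FreeSync range in CRU amounts to.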


----------



## Sasqui (Jan 10, 2019)

Valantar said:


> What? You get that the total addressable market has expanded dramatically over that period of time, right? The PC gaming hardware market is not smaller than before, and the software market certainly isn't. Other devices growing into their own and establishing themselves as parallel alternatives does not at all spell impending doom for the incumbent segments. That PC gaming is now "a minority" is meaningless when the total gaming market is 10x what it was when that was the case.



Hey, use acronyms like "TAM", ok?   But yes, to your point, the total market has increased as the share percentages shift.  The interesting thing from that graph is that the PC market almost overtook the console market from 2016-2019.  Who would have thought?


----------



## ironwolf (Jan 10, 2019)

27MaD said:


> Facebook comment.


Yep, that would be me over on FB who made the comment.


----------



## GorbazTheDragon (Jan 10, 2019)

How to read this:

Vega VII is a "noteworthy product"

The performance is competitive, but there are no frills.

[There's] no ray tracing, no AI (underdeveloped gimmicks which you as a consumer don't really care about yet).

Its 7nm is a turning point, because Vega went from competing with the 1080 to competing with the 1080 Ti.

We can only rely on gimmicks that have zero market penetration and are so far only empty promises to feign a better offer to potential customers.


----------



## EntropyZ (Jan 10, 2019)

He is feeling the pressure now and lost his cool. What is it JHH? Is that leather jacket getting a little tight or something? 

The marketing and the lies are like two ends of a pole; NVIDIA had better end up knocking themselves back into the hole they dug.


----------



## Fluffmeister (Jan 10, 2019)

Ultimately this is good news, finally Vega can compete with the 1080 Ti. Let's wait for the review and see if TSMC's 7nm is the silver bullet many had hoped for.


----------



## Minus Infinity (Jan 10, 2019)

Hellfire said:


> Wow, I'd expect a CEO of a multi billion dollar company not to whine like a little bitch.
> 
> I mean I've seen Emo kids who are less whiney than he is.



This. A highly undignified response, and it clearly means he's worried.


----------



## Basard (Jan 10, 2019)

Give that guy a piss test lol.


----------



## Fluffmeister (Jan 10, 2019)

Minus Infinity said:


> This, highly undignified response and clearly means he's worried.



No doubt, with worse performance per watt than Pascal despite being built on 7nm tech, he is wondering when the competition will turn up.


----------



## m4dn355 (Jan 10, 2019)

Shame on you Leather


----------



## B-Real (Jan 10, 2019)

The most important thing is that your RTX series is lousy in terms of performance jump and especially in price. After these comments, some further stock decline is inevitable. 



Fluffmeister said:


> No doubt, with worse performance per watt than Pascal despite being built on 7nm tech, he is wondering when the competition will turn up.



I didn't know the most important thing was performance per watt. LOL.  Shame on you.



Anymal said:


> A lot of people here who do not understand rt and dlss. Yeah, rtx line is all fail, VII is the best ever, AMD power FTW



HAHA. Whine, little one, whine. What you can do is use RT in ONE game, four months on. And the effect of RT was officially confirmed to have been toned down for somewhat better performance, which meant ~70 fps in FHD and ~30 fps average in 4K (in one of the best optimized engines) with a $1,000 ($1,200) card. After the patch, the effects were lowered further to get better results. Still, it's 50-55 fps average in 4K with a $1,000 ($1,200) card.  Tomb Raider will get it at 30-ish fps in FHD on a 2080 Ti. FFXV's RT was cancelled. Now that is a shame.


----------



## Fluffmeister (Jan 11, 2019)

I'd dredge up some of your previous hype posts, but I can't be bothered anymore.

Looks like you're about to enjoy 1080 Ti performance two years later for... *drum roll* $699. If that makes you happy, I guess the quote of the day is: "shame on you".


----------



## qubit (Jan 11, 2019)

It was obvious from the moment that AMD made an open standard version of G-SYNC that it would eventually beat G-SYNC into submission. AMD invited NVIDIA to support it from day one too, but NVIDIA arrogantly didn't want to, but now they're forced to. No wonder NVIDIA is pissed.

AFAIK, G-SYNC is the technically superior solution, but don't quote me.

He's made some pretty scandalous claims there about FreeSync on most of those monitors not working even with AMD cards, that sounds pretty suspicious. If it was true, enthusiasts would have been shouting about it on forums long ago and reviewers might have noticed a little bit, too. As long as he doesn't name names though they probably can't touch him.

It sounds like those non-functioning monitors simply don't work with NVIDIA cards, but he doesn't wanna admit that because it would look really bad for NVIDIA, so he includes AMD as well. So underhand, but NVIDIA's got form for this so I'm not surprised. And I'm saying this as a hardcore longtime NVIDIA user, so I'm not just being biased.


----------



## mindbomb (Jan 11, 2019)

qubit said:


> He's made some pretty scandalous claims there about FreeSync on most of those monitors not working even with AMD cards, that sounds pretty suspicious. If it was true, enthusiasts would have been shouting about it on forums long ago and reviewers might have noticed a little bit, too. As long as he doesn't name names though they probably can't touch him.



The scary thing is that I don't think anyone would notice if many FreeSync monitors were performing poorly. Enthusiasts rely heavily on anecdotal information, reviewers only sporadically cover monitor releases, and they often check performance only at a fixed refresh rate.


----------



## Super XP (Jan 11, 2019)

A mindless comment by the Nvidia boss. Do I hear arrogance in his voice, or is he threatened by AMD's upcoming arsenal?


----------



## PooPipeBoy (Jan 11, 2019)

Oh great, now Jensen himself has jumped into the Nvidia Vs AMD poo-throwing arguments.
He could've made a constructive (or even neutral) comment, but no...


----------



## sergionography (Jan 11, 2019)

I mean, to be fair, RTX 2080 performance at a 300 W TDP on 7nm is indeed very underwhelming. The RTX 2080 is on 12nm with a 225 W TDP, so that part of his statement I actually agree with. The FreeSync statement, however, is just petty.


----------



## Midland Dog (Jan 11, 2019)

human_error said:


> So does he think that the 2080 has underwhelming performance? How about the 2070 or 2060 which are even slower?
> 
> The Vega VII cards aren't setting a new benchmark in performance but if they perform at the same level as the 2080 I wouldn't say that they are underwhelming at all.
> 
> Of course what do people expect the CEO of AMD's competitor to say "oh yes they're decent cards"?


The 2070 and 2060 ain't $799.


----------



## xtreemchaos (Jan 11, 2019)

He's spat the dummy out, poor guy. AMD talk of love and oneness for the gamer, and NVIDIA just bitch. I am glad to say I've never bought anything from NVIDIA, and that guy is why.  Charl.


----------



## steen (Jan 11, 2019)

Anymal said:


> No halving, thanks to DLSS.


Except for the resolution... 

Had to say it. Thanks for the entertainment, guys. It's just PR anyway; Jen-Hsun has always been hands-on, and good on him. No question AMD dropped the ball with RTG, and we all know the reasons (mostly). It just shows how forward-thinking a concurrent graphics/compute uarch it was in 2012. I don't know what people expect from Navi, but it too is GCN-based. Vega 20 at last gives us 128 ROPs, but doesn't break the 4 tris/clock front end, given the primitive discard engine bypassing GS->HS->DS->VS is a no-show. Perhaps fixed in Navi? Nvidia has turned Maxwell->Pascal->Turing into a better-balanced gaming uarch, especially given its substantial devrel programs.


----------



## B-Real (Jan 11, 2019)

Vayra86 said:


> The issue is Nvidia just lost their domination on the high end, when they needed it the most to push their Turing affair, which is the worst deal in GPU history. They now only have one top-end card that is completely priced out of the market - out of _pure necessity because its too flippin' large_ - to eclipse AMD's ancient GCN with some lazy tweaks and a shrink. So that, alongside them having to give up the Gsync cash cow because it had nowhere else to go, RTX not really gaining traction in hearts & minds or dev products, mining being over, and their stock price stabilizing at half what it was a few months back.
> 
> Yeah, I'd be grumpy too


Actually, with the 10 series and Vega it was the same, as the Vega 64 brought 1080 performance. However, it was more expensive for a long time (except for the very first days and the period after the mining craze), and it was released a year and three months later than the 1080. Now that gap will be cut to less than five months, and it's "only" a new fab, not a whole new series of cards like Navi will be.



Fluffmeister said:


> I'd dredge up some of your previous hype posts, but I can't be bothered anymore.
> 
> Looks like your about to enjoy 1080 Ti performance two years later for... *drum roll* $699. If that makes you happy i guess the quote of the day is: "shame on you".


Can you understand that this is the same Vega on a smaller fab? I hope you hate the 2080 just as much, as it only brings 1080 Ti performance. Hopeless tries from you.



sergionography said:


> I mean to be fair, rtx2080 performance but with 300w tdp on 7nm is indeed very underwhelming. Rtx2080 is on 12nm and has a 225TDP, so that part of his statement I actually agree with. The freesync statement however is just petty


You can save 70-80 W by switching BIOSes on a Vega 64 while losing only 1-2% of performance. If we assume the same will be available for the Radeon VII, the performance/watt ratio Nvidia owners care so much about (as they cannot say anything else) will be close to the 2080's. And remember, this is still Vega, not Navi. If you know what I mean. Say what you will, but fielding products that reach even the second-fastest cards of the other company with only a fraction of the cash available is quite an achievement. 



Anymal said:


> No halving, thanks to DLSS.


And losing detail. Thank you.



Midland Dog said:


> 2070 and 2060 aint 799


He was speaking of performance, not prices. Anyway, all three had their prices increased by $100-130 while not even having more VRAM.


----------



## TheoneandonlyMrK (Jan 11, 2019)

mindbomb said:


> The scary thing is that I don't think anyone would notice if many freesync monitors were performing poorly. Enthusiasts are relying heavily on anecdotal information, and reviewers only sporadically cover monitor releases, and often only check performance with a fixed refresh rate only.


What, so screen tearing, which FreeSync was made to fix, wouldn't give the game away if it weren't working?
People would notice.


----------



## Vayra86 (Jan 11, 2019)

mindbomb said:


> The scary thing is that I don't think anyone would notice if many freesync monitors were performing poorly. Enthusiasts are relying heavily on anecdotal information, and reviewers only sporadically cover monitor releases, and often only check performance with a fixed refresh rate only.



Actually no: there are in-depth YouTube videos about just about everything, and when there are doubts, some Reddit, YT or Twitch channel will explode, because there's a new daily shitstorm to click on.


----------



## Camm (Jan 11, 2019)

This strikes me more as Huang being somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative G-Sync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia is again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which means Nvidia technology will continue to be bolted on rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as console GPUs have traditionally lagged behind desktop tech, and it will invariably benefit AMD in optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with no x86 capability, and only so much of that can be countered with its investment in RISC.

Full disclosure: I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful.


----------



## Aaron_Henderson (Jan 11, 2019)

Meh... I was ready for another green card (was thinking of going GTX 1070), but I just can't support this type of marketing bull... I'll just buy another used card; they've already made their money on it, and none of mine goes towards the bickering back and forth.  Anyone got a used upgrade for my 290X or my gf's GTX 770? Lol. Seriously, only buying used cards from here on out.


----------



## Valantar (Jan 11, 2019)

B-Real said:


> Can you understand that this is the same Vega on smaller fab? Hope you hate the 2080 that much as it only brings 1080Ti performance. Hopeless tries from you.


While I largely agree with what you're saying in this post, it's worth pointing out that Vega 20 has some architectural changes from Vega 10 - they're just focused on compute and AI, not gaming. As such, it's not "the same Vega", but rather a tweaked and optimized part that they've now decided to launch for a different market - likely due to there being more stock than expected, production being cheaper than expected, wanting a higher-end card to compete until Navi arrives, or all of the above. If the performance holds up, I don't mind them selling repurposed compute architecture based cards for gaming, but I'm very much looking forward to the gaming optimized Navi. Die sizes would at least be smaller due to stripping out FP64 cores and similar non-gaming features. Should leave room for improvements and still make it relatively cheap to produce.


----------



## juiseman (Jan 11, 2019)

If I had money to burn, I would buy AMD stock right now.
I think it will more than triple in price at the rate they are going.
Just my opinion, but also based on the innovation of their recent CPU products.

Haven't bought an AMD CPU since 2009 (I'm all Intel Xeons for my 7 rigs right now)...
So not an AMD fanboy, but clearly they have something good going in their CPU department...

The evidence is Intel magically boosting their core count and
frequency on their CPUs to the max, just to stay a touch ahead of AMD's.
To me that says it all...


----------



## Vayra86 (Jan 11, 2019)

Nxodus said:


> Stock market is a volatile market, even the Dow Jones has had some bad days, so what, should all americans commit sudoku when it falls 5%?



Sorry, but that one is going in my sig. That was no autocarrot!


----------



## xtreemchaos (Jan 11, 2019)

*juiseman*
what is a fanboy ? and will it cool my threadripper ?


----------



## Valantar (Jan 11, 2019)

xtreemchaos said:


> *juiseman*
> what is a fanboy ? and will it cool my threadripper ?


I believe you'll need a waterboy for that.


----------



## FordGT90Concept (Jan 11, 2019)

What he said: "Freesync Doesn't Work"
What he meant: "FreeSync doesn't work on NVIDIA cards, because we don't want it to."


----------



## juiseman (Jan 11, 2019)

My spelling is atrocious..my thoughts are faster than typing.
I blame google for making me extra lazy. 
ha...


----------



## Durvelle27 (Jan 11, 2019)

Someone obviously knows that AMD is about to kick them really hard with Navi.


----------



## tvamos (Jan 11, 2019)

qubit said:


> AFAIK, G-SYNC is the technically superior solution...


Well, for such a premium price difference it should be. But it ain't worth as much.


----------



## FordGT90Concept (Jan 11, 2019)

Durvelle27 said:


> Someone’s oblivious knows that AMD is about to kick them really hard with navi


Mum's the word on Navi.  I don't think it's imminent.  PS5 releases anywhere between Christmas 2019 and 2021.  I think Christmas 2019 is probably the soonest we'll see a consumer Navi graphics card.  Makes one wonder what AMD is going to release between now and then.  Polaris 40 on 7nm?

I don't think Radeon VII really concerns Huang so much as the road ahead for NVIDIA is rough. AI isn't taking off like they banked on it doing. RTX reception was poor.  7nm transition for NVIDIA is going to be very costly. Cryptocurrency mining on GPUs is mostly dead. FreeSync integration is rougher than he expected. The sky looks cloudy to Huang where the sky over Su looks sunny.


----------



## gamerman (Jan 11, 2019)

Well, I don't think Huang is scared. No way.

The truth is that AMD's brand-new flagship GPU can't challenge even Nvidia's third-fastest GPU; it's not even close. AMD themselves compare it to the 2080, so AMD is comparing their flagship not to Nvidia's flagship but to its third-fastest GPU, and hyping THAT! Come on, THINK!
I remember the same thing when the Vega 64 released: AMD compared it to the GTX 1080, NOT the GTX 1080 Ti, while hyping the Vega 64 to the skies.

The truth is that without the 7nm update the Radeon VII would be melting; that's a fact. It needs three fans on a REFERENCE card, which has never happened before in GPU history, just as the Vega 64 had to use watercooling!


And even though it's made on 7nm, the Radeon VII STILL draws more than 300 W, and I'm sure the real figure is somewhere around 350 W with 400 W peaks. A terrible score!


I don't think people understand what a big advantage the Radeon VII gets from being built on 7nm. THINK! It's almost half of the RTX 2000 series' 12nm, so it should draw much less power than it does. Something is very wrong, and it's that AMD can't build an efficient GPU.

For example, the RTX 2080 draws about 220 W in average gaming; the Radeon VII is over 300 W, EVEN on 7nm!

It's a lousy GPU like its little brother the Vega 64; I'd call it the Vega 64 II.

The RTX 2060 is ten times the GPU. If we're talking about the fastest GPUs, those are the RTX 2080, RTX 2080 Ti and of course the RTX Titan.

It looks like the Radeon VII only just avoided watercooling; let's see how it overclocks. If it does, power draw will go absolutely sky high, even on a 7nm GPU.

A lousy GPU again, and Lisa knows it and Huang knows it.


We'll see soon enough, or should I say in 12 months, when Nvidia releases its 7nm GPUs. I promise: under 250 W, and performance at least 40% above the 2080 Ti.

Don't lie to yourself; look the truth in the eye. AMD wants everyone to have a 1 kW PSU, or AMD doesn't care. AMD should thank TSMC on bended knee for bringing up 7nm; without that help I don't think AMD would even have released the Radeon VII.

A lousy GPU. Shame, AMD. People, wake up: AMD doesn't deserve any sympathy. It's a power-hungry GPU, slow and pricey for its performance.


----------



## xtreemchaos (Jan 11, 2019)

*gamerman*
You can get tablets for Nvidia addiction; sounds like you have the same as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?


----------



## Valantar (Jan 11, 2019)

xtreemchaos said:


> *gamerman*
> you can get tablets for nvidia addiction  sounds like you have the same as poor mr Huang matey, times are a changing... you know im joking right.


I believe the tablet for Nvidia addiction is called Nintendo Switch.



... I'll see myself out.


----------



## xtreemchaos (Jan 11, 2019)

*Valantar*
lol, how did I miss that? Come back in, I like your humor.


----------



## Slizzo (Jan 11, 2019)

Nxodus said:


> AMD fanbois will be so angry when in a couple of years RTX becomes standard, and AMD will be forced to adopt it effectively increasing the prices of their hot plastic cards even more.





Sasqui said:


> You know, that is a good point, they invested in RTX and maybe or maybe not it'll play out well for them.  "Standard" is also a good question, since DX12 introduced the RTX API in Win 10, it's still very young.  It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful (or maybe one right now... 2080 ti) that can really handle it.
> 
> It's not a very compelling story at the moment.



Guys, you realize that currently the only implementation of RTRT on desktop is Windows 10, with the 1809 update, using DXR as the library that exposes said technology? You know, the library that anyone can use to accelerate ray tracing?  It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX.  Once AMD implements Radeon Rays, they'll be using the same library.



scevism said:


> I've just got a 55'' samsung tv with freesync built in 120fps with my r9 290x only at 1080p but works for me. WITH MY LEATHER JACKET ON LOL.....



Loving my Q6 so far. Been a great TV



Anymal said:


> No halving, thanks to DLSS.



Man, you guys' reliance on DLSS as an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60fps in the only RTX-supported title so far, with cards lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon VII does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of before, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, and twice as much of it as last time.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.


----------



## Valantar (Jan 11, 2019)

Slizzo said:


> Guys, you realize that currently, the only implementation of RTRT on desktop is Windows 10, with 1809 update, using DXR as the library to display said technology? You know, the library that anyone can use to accelerate ray tracing?  It's not "RTX API". It's Microsofts' DXR API, that is a part of DirectX.  Once AMD implements Radeon Rays, they'll be using the same library.
> 
> 
> 
> ...


The main difference here: AMD's range _tops out_ at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending if you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree (edit: to clarify, not the post _quoted _above, but the one you've all noticed if you've read the last page of posts). That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 ti at the same price. Just shows how easily accustomed one gets to insanity.


----------



## kapone32 (Jan 11, 2019)

I bought a Water block for my Sapphire Vega 64 card. I do seriously hope that the layout is the same as when I ran Fire Strike this morning


Valantar said:


> The main difference here: AMD's range _tops out_ at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending if you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree. That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 ti at the same price. Just shows how easily accustomed one gets to insanity.




Exactly. And people are talking about DLSS and ray tracing after PhysX, Hairworks and SLI, which Nvidia took over and hogged for themselves, only to see the technology go unused because of the way they do things; versus AMD with Vulkan, which led to DX12, or FreeSync, which Nvidia is suddenly supporting. Of course, with a comment from Jensen making it seem like FreeSync is only good on Nvidia cards...


----------



## Casecutter (Jan 11, 2019)

Like I said, "no bad cards, just bad pricing". Let's hope this spurs more competitiveness!


----------



## Sasqui (Jan 11, 2019)

Slizzo said:


> It's Microsofts' DXR API, that is a part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.



Yes, yes and yes.  It's a component of DX in the latest Windows 10 version that everyone has access to.  

The catch is that game developers have to write code to implement it, and the hardware has to be able to process it as well. Currently, these games are in the works to support it, at least to some degree:

Assetto Corsa Competizione from Kunos Simulazioni/505 Games.
Atomic Heart from Mundfish.
Battlefield V from EA/DICE.
Control from Remedy Entertainment/505 Games.
Enlisted from Gaijin Entertainment/Darkflow Software.
Justice from NetEase.


----------



## mindbomb (Jan 11, 2019)

theoneandonlymrk said:


> What so screen tearing, which freesync was made for wouldn't give the game away if it's not working??.
> People would notice.



The screen tearing is fixed with FreeSync; people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have deal-breaking image artifacts over their variable refresh range. Obviously they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic and often focus on fixed-refresh performance. It is completely plausible to me that problems have gone unnoticed.



Vayra86 said:


> Actually no, there are in depth Youtube vids about just about everything and when there are doubts, some reddit or YT or Twitch channel will explode because there's a new daily shitstorm to click on.



There have been problems noted. This PC Perspective article notes excessive ghosting on early FreeSync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow-up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCMonitors.info noted that the Samsung C24FG70 had overshoot artifacts with FreeSync at 100 Hz, and flickering on the desktop with FreeSync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as GPU or CPU reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something Nvidia requires for all G-Sync monitors.
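The variable refresh range matters beyond artifacts, too: when the frame rate drops below the panel's minimum refresh, the driver has to multiply frames to stay in range (AMD's low framerate compensation, which only engages when the range is wide enough). A rough sketch of that frame-multiplication idea; the exact policy outside the range is my simplified assumption, not AMD's actual algorithm:

```python
def effective_refresh(fps: float, v_min: float, v_max: float):
    """Return the refresh rate a VRR panel with range [v_min, v_max] could
    run at for a given game frame rate, or None if the frame rate can't be
    mapped into the range (real drivers would fall back to v-sync/tearing)."""
    if v_min <= fps <= v_max:
        return fps                      # native VRR: refresh tracks frame rate
    if fps > v_max:
        return v_max                    # capped at the panel's maximum
    # fps < v_min: present each frame k times so the panel stays in range
    k = 2
    while fps * k < v_min:
        k += 1
    return fps * k if fps * k <= v_max else None
```

For example, 30 fps on a 40-144 Hz panel can be shown at 60 Hz (each frame twice), while a narrow 48-75 Hz range has no valid multiple for 40 fps, which is exactly why narrow-range FreeSync monitors got a bad reputation.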


----------



## TheoneandonlyMrK (Jan 11, 2019)

mindbomb said:


> The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have went unnoticed.
> 
> 
> 
> ...



So you have 450-ish more examples (of different monitors) to find, and then Huang is right. Crack on.
He did say they're ALL broken.

And you are here trying to back his words. Have you seen this issue in the flesh? 

Anyone with some sense would expect the odd monitor, or even the odd line of monitors, to have issues, but to say they're all broken... well.

My Samsung works without artefacts etc., though to be fair it is a 45-75 Hz one, so I can see it's not that model.


----------



## Razrback16 (Jan 11, 2019)

"Underwhelming. The performance is lousy."

Kinda funny since that's exactly what I thought when reading the Turing reviews.


----------



## mtcn77 (Jan 11, 2019)

mindbomb said:


> The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have went unnoticed.
> 
> 
> 
> ...


There is no need to rationalize all that much after the debate; here you go:


----------



## Deleted member 178884 (Jan 11, 2019)

Raevenlord said:


> "underwhelming product"


Ironically, the 2080 ti is exactly this.


----------



## Manoa (Jan 13, 2019)

Smartcom5 said:


> Yup, like RTX and DLSS …
> 
> View attachment 114314
> 
> Smartcom



loool



Anymal said:


> You werent there when the first T&L gpu came out, a!



Maybe he wasn't, but it looks like you weren't there when the first shading GPU (R300) came out.



Markosz said:


> - and this was 3 years ago... they only got worse.
> 
> AMD doesn't lock features behind hardware, they make open-source platforms.



They don't do it for you, or should I say us; they do it because it helps them save money on software development expenses. It's also the reason they have shit GPU drivers.


----------



## Vayra86 (Jan 13, 2019)

mindbomb said:


> The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have went unnoticed.
> 
> 
> 
> ...



Sorry, but no. There are terrible monitors with G-Sync too. It has nothing to do with VRR and everything to do with the panel and the board behind it.

TFT Central has it all in case you are curious. Not once has FreeSync been considered the cause of ghosting or artifacting. The only gripe is limited ranges.


----------



## Manoa (Jan 13, 2019)

Valantar said:


> -While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements when doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.



Vega IS far better suited for (professional) compute applications.
This is not about what AMD feels.
Memory bandwidth gives nothing to gaming performance on Vega.
It is not double the ROPs; they are still 64.
But guess what? I also suggest we wait for reviews.



Fluffmeister said:


> Ultimately this is good news, finally Vega can compete with the 1080 Ti. Let's wait for the review and see if TSMC's 7nm is the silver bullet many had hoped for.



It's not really 7 nm; the actual feature size is closer to Intel's 10 nm.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he's pointing at monitors not running the same hertz range in FreeSync as they do without it.
Which makes him right about that, since it's a lie sold by spec sheets that makes people believe FreeSync is full range. This says something to the many people here who say AMD is not as dirty as Nvidia...
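To make that range complaint concrete, here is a minimal sketch of the two checks involved: whether a frame rate falls inside a monitor's VRR window at all, and whether the window is wide enough for AMD's Low Framerate Compensation (which requires the maximum to be at least twice the minimum). The 48-144 and 45-75 numbers are illustrative, not from any specific monitor's spec sheet.

```python
def in_vrr_window(fps: float, vrr_min: float, vrr_max: float) -> bool:
    """True if the display can sync this frame rate directly."""
    return vrr_min <= fps <= vrr_max

def supports_lfc(vrr_min: float, vrr_max: float) -> bool:
    """AMD's Low Framerate Compensation needs max >= 2x min, so frames
    below the window can be repeated back into the synced range."""
    return vrr_max >= 2 * vrr_min

# A 48-144 Hz window qualifies for LFC; a narrow 45-75 Hz window does not,
# so below 45 fps that monitor falls back to tearing or V-sync stutter.
print(in_vrr_window(60, 48, 144))   # True
print(supports_lfc(48, 144))        # True
print(supports_lfc(45, 75))         # False
```

This is exactly why the narrow-range budget monitors are the ones people complain about: the spec sheet says "FreeSync" either way, but the useful window differs enormously.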

Sasqui: Assetto CC has shit graphics; ray tracing is not going to do it any good.


----------



## Valantar (Jan 13, 2019)

Manoa said:


> vega IS far better suited for (professional) compute applications


Yes, that's what I said. Kind of weird to try to make a point against me by repeating what I said like that. Is a snarky tone supposed to make it mean something different? This is a fact. Doesn't mean that it can't work for gaming. I was very surprised to see Vega 20 arrive as a gaming card, but apparently it performs well for this also. Nothing weird about that. 


Manoa said:


> this is not about what AMD feels


Considering they're the only ones with any knowledge of how this performs in various workloads, it very much is. Launching it is also 100% AMD's decision, so, again this comes down to how AMD feels about it. 


Manoa said:


> memory bandwidth gives nothing to gaming performance on vega


Source?


Manoa said:


> it is not double the ROPs, they are still 64


That seems to be true, given the coverage of how this was an error in initial reports. My post was made before this was published, though. 



Manoa said:


> it's not really 7 nm, actual size is closer to intel's 10 nm.


Node names are largely unrelated to feature size in recent years - there's nothing 10nm in Intel's 10nm either - but this is still a full node shrink from 14/12nm. Intel being slightly more conservative with node naming doesn't change anything about this. 



Manoa said:


> I think I know whay he says freesync doesn't work "at all" on "all" monitors, I suspect he considers the concept of a monitor not running the same hertz range in freesync as it does without it.


If that's the case, he should look up the definition of "work". At best, that's stretching the truth significantly, but I'd say it's an outright lie. What you're describing there is not "not working". 


Manoa said:


> which makes him right about that since it is a lie sold by specifications and makes people believe freesync is full range, this says something to many people here that say AMD is not as dirty as nvidea...


Why is that? AMD helped create an open standard. How this is implemented is not up to them, as long as the rules of the standard are adhered to (and even then, it's VESA that owns the standard, not AMD). AMD has no power to police how monitor manufacturers implement Freesync - which is how it should be, and not Nvidia's cost-adding gatekeeping. AMD has never promised more than tear-free gaming (unless you're talking FS2, which is another thing entirely), which they've delivered. AMD even has a very helpful page dedicated to giving accurate information on the VRR range of every single FreeSync monitor.

Also, your standard for "as dirty as" seems... uneven, to say the least. On the one hand, we have "created an open standard and didn't include strict quality requirements or enforce this as a condition for use of the brand name", while on the other we have (for example) "attempted to use their dominant market share to strong-arm production partners out of using their most famous/liked/respected/recognized brand names for products from their main competitor". Do those look equal to you?


----------



## Manoa (Jan 13, 2019)

You mean feels as in "it's good enough to go out to the market"?
Yeah, sure it can work in gaming; the first Vegas proved that. I'm saying it like that because it's the wrong chip for the wrong purpose; maybe better to augment Polaris?
You're right, AMD is better because they use the open-standards approach; that's not the problem. For me the problem is that they're not doing it for good reasons, for us; they're doing it for themselves...
Thanks man, that's a very elaborate response; you cleared up a few things.
I didn't know the FreeSync problems are because of the monitors, not AMD or the FreeSync system itself.
Everyone's a liar about the nm these days, eh?


----------



## Aquinus (Jan 13, 2019)

Valantar said:


> Source?


I have a Vega 64 and I assure you that memory bandwidth isn't helping gaming. 20% overclock on HBM yields no tangible performance benefit with mine. These cards are constrained by power consumption, not memory bandwidth.


----------



## HTC (Jan 13, 2019)

Camm said:


> This strikes me more that Huang is somewhat scared of what's coming down the pipeline.
> 
> Nvidia got *forced out* of its lucrative Gsync business because otherwise* it wouldn't be able to certify against HDMI 2.1*.
> 
> ...



Have not heard of this: source link, please?


----------



## Valantar (Jan 13, 2019)

HTC said:


> Have not heard of this: source link, please?


The HDMI 2.1 spec includes VRR support as a requirement, which would prevent Nvidia from certifying its cards against that spec if they still limited VRR support to G-Sync. I guess a "middle of the road" option would be possible by supporting VRR on non-G-Sync displays _only_ if they are HDMI 2.1 displays, but that would just lead to (yet another) consumer uproar against Nvidia. Let alone the fact that (at least according to all reports I've seen) the HDMI 2.1 implementation of VRR is technically very, very close to AMD's existing extension of VESA Adaptive Sync to FreeSync over HDMI. Nvidia would have no technical excuse whatsoever to not support FreeSync.


----------



## champsilva (Jan 14, 2019)

GinoLatino said:


> I don't think so! The difference is that nVidia was stable and just dropped, while AMD had a "bubble" and then came back to usual values, is even higher than "before the bubble".
> nVidia : red
> AMD : green
> 
> View attachment 114316



According to this, 2018 was the boom year for the RTG group.


----------



## FordGT90Concept (Jan 14, 2019)

Discrete Vega did really well by that chart.


----------



## mindbomb (Jan 14, 2019)

Vayra86 said:


> Sorry but no. There are terrible monitors with Gsync too. Has nothing to do with VRR and everything with the panel and board behind that.
> 
> Tftcentral has it all in case you are curious. Not once has Freesync been considered related to ghosting or artifacting. The only gripe is limited ranges.



No. Since the G-Sync boards have very fine control of overdrive as the refresh rate changes, they will likely have better response times than FreeSync or G-Sync Compatible monitors, which tend to have coarse control of overdrive. The amount of overdrive needed changes depending on the refresh rate, which is why this technology is important.

There is nothing stopping monitor companies from giving their FreeSync monitors finer control, but only one FreeSync monitor has implemented this, the Nixeus EDG27 (they call it adaptive anti-ghosting).

So you could do the experiment: find a G-Sync and a FreeSync IPS monitor that use the same panel, and you'll see that the G-Sync monitor ends up with better response times under variable refresh, due to the better overdrive system mandated by Nvidia.
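The "fine control" point can be pictured as interpolation: instead of one overdrive level tuned for the maximum refresh rate, the scaler picks a level per refresh rate from a calibration curve. A rough sketch of the idea, with a made-up calibration table (the Hz/level pairs are purely illustrative, not from any real monitor firmware):

```python
# Hypothetical calibration table: (refresh rate in Hz, overdrive level).
# A "coarse" monitor would use one fixed level; this interpolates instead.
CALIBRATION = [(48, 1.0), (90, 2.0), (144, 3.0)]

def overdrive_level(hz: float) -> float:
    """Linearly interpolate an overdrive level for the current refresh rate."""
    if hz <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    if hz >= CALIBRATION[-1][0]:
        return CALIBRATION[-1][1]
    for (h0, v0), (h1, v1) in zip(CALIBRATION, CALIBRATION[1:]):
        if h0 <= hz <= h1:
            return v0 + (v1 - v0) * (hz - h0) / (h1 - h0)

print(overdrive_level(144))  # 3.0 -- full overdrive at max refresh
print(overdrive_level(69))   # 1.5 -- reduced mid-range, avoiding overshoot
```

A fixed-level panel tuned for 144 Hz would effectively always return 3.0, which is what produces the overshoot/inverse-ghosting artifacts people report on some FreeSync monitors at low refresh rates.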


----------



## TheoneandonlyMrK (Jan 14, 2019)

Manoa said:


> You mean feels as in "it's good enough to go out to the market"?
> Yeah, sure it can work in gaming; the first Vegas proved that. I'm saying it like that because it's the wrong chip for the wrong purpose; maybe better to augment Polaris?
> You're right, AMD is better because they use the open-standards approach; that's not the problem. For me the problem is that they're not doing it for good reasons, for us; they're doing it for themselves...
> Thanks man, that's a very elaborate response; you cleared up a few things
> ...


YES THEY ARE. Laptops had a version of FreeSync for years (it's what FreeSync was based on) before Huang's crew decided to reinvent the wheel and charge for it. At least AMD don't charge. And Vega's fine for gaming in first or second edition; most games can manage to run at 4K with ultra or very high settings, with a few (not a lot of) exceptions, but those exceptions do matter to some, me included.


----------



## mtcn77 (Jan 14, 2019)

HTC said:


> Have not heard of this: source link, please?


Hehe, I bet you never heard about this one:[You can guess what the topic is, imho.]


----------



## Valantar (Jan 14, 2019)

champsilva said:


> According to this, 2018 was the boom year for the RTG group.


Well, considering Vega launched in mobile and desktop APUs in 2018, it's not really any wonder that there was a significant boost to Vega revenue - even if adoption of said APUs by OEMs was slow. Laptops move a lot of volume, after all, and both the 2200G and 2400G seem to have been very successful.

Other than that, I see no trace of a "boom" - all other market segments are more or less static, after all. The growth in Vega sales is understandable, and Polaris sales are only slightly up. Still, quite impressive that there's no drop in semi-custom revenue considering how far into the current console cycle we are.


----------



## FordGT90Concept (Jan 14, 2019)

Pretty sure embedded GPUs (dark gray) include APUs.


----------



## hat (Jan 14, 2019)

As I said in the other thread, wouldn't it be natural for an opposing CEO to say such things? What CEO in their right mind would ever openly admit the competition has the better product? Especially when you've been charging through the nose for yours...

Supporting freesync though is a good move, finally. A lot of people are already salty because nVidia has stupid high prices, but now you gotta pay another stupid high price for the gsync screen? Jensen isn't THAT stupid... he must have realized that supporting freesync would be a good move and potentially get him more sales from the crowd that might be interested in buying nVidia right now, but not paying the price for gsync. They can keep gsync for the extreme high end crowd with the money and the will to spend it on that, and allow freesync for everyone else... rather than handing sales to AMD because they don't support freesync.


----------



## mtcn77 (Jan 14, 2019)

hat said:


> As I said in the other thread, wouldn't it be natural for an opposing CEO to say such things? What CEO in their right mind would ever openly admit the competition has the better product? Especially when you've been charging through the nose for yours...
> 
> Supporting freesync though is a good move, finally. A lot of people are already salty because nVidia has stupid high prices, but now you gotta pay another stupid high price for the gsync screen? Jensen isn't THAT stupid... he must have realized that supporting freesync would be a good move and potentially get him more sales from the crowd that might be interested in buying nVidia right now, but not paying the price for gsync. They can keep gsync for the extreme high end crowd with the money and the will to spend it on that, and allow freesync for everyone else... rather than handing sales to AMD because they don't support freesync.


Hey, he got coaxed into signing it.


> In fact the lack of DisplayPort availability in displays overall is a big part of why RTG has pursued this. According to numbers from RTG, *only about 30% of all monitors sold include a DisplayPort*, while the other 70% are only implementing HDMI or HDMI + DVI. Consequently FS-DP is an inherently limited market and the majority of monitor buyers will never be able to use FS-DP. Meanwhile from what I hear the actual cost of implementing variable refresh rate support on a TCON is very low, which means that RTG could get far greater penetration for FreeSync by extending it to support HDMI, not to mention bringing down the overall cost of entry-level FreeSync monitors. We’re still talking about a highly price sensitive commodity market – after all, there’s a reason that most monitors don’t ship with a DisplayPort – but if the costs of adding FreeSync are as low as RTG hints, then there is a market for consumers who would spend a bit more on a variable refresh rate monitor but don’t know anything about display I/O standards beyond HDMI.


----------



## Valantar (Jan 14, 2019)

FordGT90Concept said:


> Pretty sure embedded GPUs (dark gray) include APUs.


From what I've read, Embedded and semi-custom is an entirely separate business unit that deals solely with customers outside of AMD. Embedded APUs (such as embedded Epycs and the like) are AFAIK handled by this unit (though neither designed by nor produced specifically for them; they're just re-using existing silicon in those cases), but consumer-oriented APUs are not. If anything, consumer APUs don't fall within RTG at all, but I'm willing to bet the iGPUs in those count towards some of the growth for Vega in this graph (likely through some sort of "sale"/licensing of IP to the Ryzen group).



hat said:


> As I said in the other thread, wouldn't it be natural for an opposing CEO to say such things? What CEO in their right mind would ever openly admit the competition has the better product? Especially when you've been charging through the nose for yours...


You seem to not quite understand the possible nuances of language. The distance between what Huang said and "openly admit[ting] the competition has the better product" is _vast_. How about a respectful acknowledgement of competition while underscoring belief in the superiority of one's own products? There are countless ways of doing this without acting like a condescending ass, while still doing his job of promoting his company, yet he chose the "be an asshole" tactic. Not that Intel is much of an example of anything much positive, but how many "LOL, Ryzen SUX!!!11!1!!!" statements have you heard or read from C-level executives in that company? Yeah, exactly. Treating people with respect in public is quite fundamental, and not doing so certainly doesn't reflect well on the person acting like an ass. It just makes you look insecure, lashing out, not trusting in your own (or your company's) ability to stand on your own merits.


----------



## FordGT90Concept (Jan 15, 2019)

Valantar said:


> From what I've read, Embedded and semi-custom is an entirely separate business unit that deals solely with customers outside of AMD. Embedded APUs (such as embedded Epycs and the like) are AFAIK handled by this unit (though neither designed by nor produced specifically for them; they're just re-using existing silicon in those cases), but consumer-oriented APUs are not. If anything, consumer APUs don't fall within RTG at all, but I'm willing to bet the iGPUs in those count towards some of the growth for Vega in this graph (likely through some sort of "sale"/licensing of IP to the Ryzen group).


RTG is about as independent from AMD as Xbox is from Microsoft (that is, not).  AMD doesn't license anything from itself.

The Vega GPUs integrated into Ryzen APUs are embedded GPUs.  An example of "semi-custom" is consoles, which are also embedded GPUs.  AMD's graph is quite clear in this distinction: integrated (many architectures), mainstream discrete (Polaris), and high-performance discrete (Vega).


----------



## Valantar (Jan 15, 2019)

FordGT90Concept said:


> RTG is about as independent from AMD as Xbox is from Microsoft (that is, not).  AMD doesn't license anything from itself.
> 
> The Vega GPUs integrated into Ryzen APUs are embedded GPUs.  An example of "semi-custom" is consoles, which are also embedded GPUs.  AMD's graph is quite clear in this distinction: integrated (many architectures), mainstream discrete (Polaris), and high-performance discrete (Vega).


While what you're saying is technically true, that still doesn't mean that the embedded and semi-custom business unit provides designs for internal partners. Embedded means for embedded applications, not embedded designs, i.e. industrial PCs, digital signage, and so on. Semi-custom is (mainly) consoles. It's about who they sell to (and thus what SKUs, firmware, drivers and support they provide). 

Also, if AMD didn't do internal licensing, Vega APUs wouldn't affect this graph at all, as they would then fall entirely under the purview of the CPU division.


----------



## FordGT90Concept (Jan 15, 2019)

What does that have to do with anything?  The chart shows what AMD shipped sans CPU-only products.  It's basic accounting and inventory.

Hades Canyon is a semi-custom embedded GPU too.

AMD owns all of the Radeon and x86 IPs/licenses.  As long as it has an AMD sticker on it, AMD's legal team takes care of it.  There's no "internal licensing" because RTG exists in name only.


----------



## Valantar (Jan 15, 2019)

FordGT90Concept said:


> What does that have to do with anything?  The chart shows what AMD shipped sans CPU-only products.  It's basic accounting and inventory.
> 
> Hades Canyon is a semi-custom embedded GPU too.
> 
> AMD owns all of the Radeon and x86 IPs/licenses.  As long as it has an AMD sticker on it, AMD's legal team takes care of it.  There's no "internal licensing" because RTG exists in name only.


While in theory I agree with you that "discrete" divisions inside the same company are still the same company (and that this is one of the most BS tactics of the corporate world in terms of shuffling money around, dodging taxes, laundering money, and generally being asses), that's not how these things are handled in terms of economics or management. 

I have misread the charts a bit, though. After checking how AMD organizes this, Enterprise, Embedded, and Semi-custom is one division, while Compute and Graphics is another, where RTG is roughly half. It seems they've stopped reporting RTG earnings separately, at least from the latest earnings announcements. What that _does_ tell us, without a doubt, is that "embedded" does not include APUs, at least not outside of embedded-only APUs - even if these are identical silicon to consumer ones, just with a different name and slightly different microcode. They still get silicon from the same wafers, through the same binning process, with the same designs, but they're separate business units, and thus have separate money. You can't have one pay the costs and the other earn the revenue without a matching payment in-between. In other words, the only way consumer Vega APUs fit into this is in the red block at the top - which is also marked with the same logo as the retail boxes for those chips. Crypto likely represents the majority of that growth, but it'd be surprising if APUs didn't also add to that.


----------



## INSTG8R (Jan 15, 2019)

mtcn77 said:


> Hey, he got coaxed into signing it.


Except that FreeSync over HDMI has been possible for a while now. Granted, it's not on all monitors, but it's been available for at least a generation of monitors. I suspect a lot of the low-range (45-75 Hz) monitors are mostly HDMI-equipped because of the lack of any extreme bandwidth requirements. My monitor can do 100 Hz FreeSync over HDMI or 144 Hz on DP.


----------



## mtcn77 (Jan 15, 2019)

INSTG8R said:


> Except that FreeSync over HDMI has been possible for a while now. Granted, it's not on all monitors, but it's been available for at least a generation of monitors. I suspect a lot of the low-range (45-75 Hz) monitors are mostly HDMI-equipped because of the lack of any extreme bandwidth requirements. My monitor can do 100 Hz FreeSync over HDMI or 144 Hz on DP.


Granted, I have researched into this subject, that I have subjective bias against disbelievers and green goblins, it still strikes me as much as it does that I have skipped over Anandtech on 2 occasions and a forumer on the next just because some other publisher expresses more enthusiasm.
Thanks Camm & Valantar, this would make for a big reception in 2015!


> The use of vendor specific extensions is perfectly legal within the HDMI standard, but it does mean that FS-HDMI is proprietary - a.k.a proprietary FreeSync


----------



## FordGT90Concept (Jan 15, 2019)

Yeah, until HDMI 2.1, FreeSync over HDMI is proprietary; however, it's something most display manufacturers got behind because very little is different compared to DisplayPort as far as FreeSync is concerned. As far as I know, there's nothing legally stopping NVIDIA from mimicking the signals AMD sends to tell the monitor an adaptive sync signal is coming.  There may be technical limitations though.

Blame the HDMI Forum for being really slow on the uptake, not AMD for making adaptive sync support available in the Xbox One.


----------



## Valantar (Jan 15, 2019)

mtcn77 said:


> Granted, I have researched into this subject, that I have subjective bias against disbelievers and green goblins, it still strikes me as much as it does that I have skipped over Anandtech on 2 occations and a forumer on the next just because some other publisher expresses more enthusiasm.
> Thanks Camm & Valantar, this would make for a big reception in 2015!


Can't say that I understand why you're mentioning me here - but then again I really can't make sense of your post whatsoever. Hm. This brings back some memories.



FordGT90Concept said:


> Yeah, until HDMI 2.1, FreeSync over HDMI is proprietary; however, it's something most display manufacturers got behind because very little is different compared to DisplayPort as far as FreeSync is concerned. As far as I know, there's nothing legally stopping NVIDIA from mimicking the signals AMD sends to tell the monitor an adaptive sync signal is coming.  There may be technical limitations though.
> 
> Blame the HDMI Forum for being really slow on the uptake, not AMD for making adaptive sync support available in the Xbox One.


Yep, the HDMI Forum is really quite slow, but they've been lagging DP pretty much forever. Have to wonder why - is DP pushing the technological envelope more; is HDMI engineered for cost savings; or is there some other reason?

Also, while you're right that Freesync over HDMI is a proprietary extension to the HDMI standard, AFAIK it's still part of the open (as in free-to-implement) FreeSync standard, and should as such not be any problem for Nvidia to adopt even before HDMI 2.1 comes knocking. If HDMI 2.0 is anything to go by, it'll still be a few years before we see displays adopting it, after all.


----------



## mtcn77 (Jan 15, 2019)

Valantar said:


> Can't say that I understand why you're mentioning me here - but then again I really can't make sense of your post whatsoever. Hm. This brings back some memories.
> 
> 
> Yep, the HDMI Forum is really quite slow, but they've been lagging DP pretty much forever. Have to wonder why - is DP pushing the technological envelope more; is HDMI engineered for cost savings; or is there some other reason?


Well, there happens to be one other person who doesn't read their own links; maybe I'm not too alone in oversight:


> In fact the lack of DisplayPort availability in displays overall is a big part of why RTG has pursued this. According to numbers from RTG, only about 30% of all monitors sold include a DisplayPort, while the other 70% are only implementing HDMI or HDMI + DVI. Consequently FS-DP is an inherently limited market and the majority of monitor buyers will never be able to use FS-DP. Meanwhile from what I hear the actual cost of implementing variable refresh rate support on a TCON is very low, which means that RTG could get far greater penetration for FreeSync by extending it to support HDMI, not to mention bringing down the overall cost of entry-level FreeSync monitors. We’re still talking about a highly price sensitive commodity market – after all, there’s a reason that most monitors don’t ship with a DisplayPort – but if the costs of adding FreeSync are as low as RTG hints, then there is a market for consumers who would spend a bit more on a variable refresh rate monitor but don’t know anything about display I/O standards beyond HDMI.


----------



## Valantar (Jan 15, 2019)

mtcn77 said:


> Well, there happens to be one other person that doesn't read its own links, maybe I'm not too alone in oversight:


Does that quote apply to anything I've discussed here? I think you might be confusing me with someone else.

Edit: damn autocorrect.


----------



## FordGT90Concept (Jan 15, 2019)

Valantar said:


> Yep, the HDMI Forum is really quite slow, but they've been lagging DP pretty much forever. Have to wonder why - is DP pushing the technological envelope more; is HDMI engineered for cost savings; or is there some other reason?


https://hdmiforum.org/about/hdmi-forum-board-directors/

AMD had a lot of people there to convince.  I'm sure NVIDIA was resistant in the name of defending G-Sync.


----------



## mtcn77 (Jan 15, 2019)

Valantar said:


> Does that quote apply to anything I've discussed here? I think you might be confusing me with someone else.
> 
> Edit: damn autocorrect.


Well, it was your quote that shed light on mine. And besides, market penetration is a stupendous force, one Nvidia is well known for.


----------

