Look, Valantar, I am talking about a simple thing, competitiveness; you are talking about utopia and how the CTO is always right.
The same people who introduced R600, Bulldozer, Jaguar, and Vega, and who now have two competing chips, Polaris 30 and Navi 14, covering exactly the same market segment.
Please, let's just agree to disagree with each other and stop the argument here and now.
Thanks.
Sorry, but no, I won't agree to disagree when you aren't actually managing to formulate a coherent argument, or even to read what I'm writing correctly. Let's see. Did I say "the CTO is always right"? No; among other things, I said:
Of course it's possible for these choices to turn out to be completely wrong (Hello, Bulldozer architecture!)
Which is a rather explicit acknowledgement that mistakes can be made, have been made, and will be made, no? You, on the other hand, are saying "Mark Papermaster said they made 'the right' improvements, therefore this must be subjective and wrong!" with zero basis for saying so (at least none that you are able to present here). Having made bad calls previously does not mean that all future calls will be poor. Besides, Papermaster wasn't the person responsible for a lot of what you're pointing out, so I don't quite understand why you're singling that specific executive out as fundamentally incapable of making sound technical decisions. Not to mention that no executive makes any sort of decision except based on the work of their team. If you want your opinion to be respected, at least show the rest of us the respect of presenting it in a coherent and rational manner instead of just throwing out accusations and wild claims with no basis.
(And again, please don't read this as me somehow saying that "Mark Papermaster is a genius that can only make brilliant decisions" - I am not arguing for something, I am arguing against your brash and unfounded assertions that these decisions are necessarily wrong. They might be wrong, but given AMD's recent history they might also be right. And unless you can present some actual basis for your claims, this is just wild speculation and entirely useless anyhow.)
Polaris production is winding down; current "production" is likely just existing chip inventories being sold off (including that new China-only downclocked "RX 590" whatsitsname). They are only competing directly insofar as previous-gen products are still in the channel, which is a situation that takes a while to resolve itself every generation. Remember, the RX 5500 launched less than three months ago. A couple more months and supply of new Polaris cards will be all but gone.
But beyond that, you aren't talking about competitiveness; in fact, I would say you aren't presenting a coherent argument for anything specific at all. What does an imagined delay from an imagined earlier (2019?) launch date of Navi 2X have to do with competitiveness, as long as it launches reasonably close to Nvidia's next generation and performs competitively? What does the lack of RTRT in Navi 1X have to do with competitiveness when there are currently just a handful of RTRT titles? If you want to make an overarching point about something, please make sure what you're talking about actually relates to that point.
Also, I forgot this one:
How do consoles, with hardware that is poor compared to top PC hardware, run 4K then, and why?
Why are 4K TVs mainstream now?
4K TVs are mainstream because TV manufacturers need to sell new products and have spent a fortune on marketing a barely perceptible (at TV sizes and viewing distances) increase in resolution as a revolutionary upgrade. TVs are also not even close to being mainly used or sold for gaming - they are TVs. 4K TVs being mainstream has nothing to do with gaming whatsoever.
Consoles can run 4K games because they turn down image quality settings dramatically and (especially in the case of the PS4 Pro) use rendering tricks like checkerboard rendering. They also generally target 30 fps, at least at 4K. Console games typically run quality settings comparable to medium-low settings in their own PC ports. Digital Foundry (part of Eurogamer) has done a lot of great analyses on this, comparing various aspects of image quality across platforms for a bunch of games. Worth the read/watch! But the point is, if you set your games to equivalent quality settings and lower your FPS expectations, you can match any console with a similarly specced PC GPU. Again, DF has tested this too, with comparison images and frame time plots to document everything.
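To put rough numbers on why those tricks matter, here's a back-of-the-envelope sketch in Python (purely illustrative - the resolutions are standard, and "half the pixels shaded per frame" is the usual simplification of how checkerboarding works):

    # Rough pixel-shading budget: native 4K vs. checkerboard 4K.
    # Checkerboard rendering shades roughly half the screen's pixels each
    # frame and reconstructs the rest from the previous frame's data.

    NATIVE_4K = 3840 * 2160           # ~8.29 million pixels per frame
    CHECKERBOARD_4K = NATIVE_4K // 2  # ~4.15 million shaded pixels per frame
    NATIVE_1440P = 2560 * 1440        # ~3.69 million, for comparison

    print(f"Native 4K:       {NATIVE_4K / 1e6:.2f} Mpix/frame")
    print(f"Checkerboard 4K: {CHECKERBOARD_4K / 1e6:.2f} Mpix/frame")
    print(f"Native 1440p:    {NATIVE_1440P / 1e6:.2f} Mpix/frame")

In shading work, a checkerboarded "4K" frame costs closer to native 1440p than to native 4K - that, plus lowered settings and a 30 fps target, is how console GPUs manage it.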
Looking at TSMC's process chart, I simply do not see where the perf/watt jump should come from.
N7 => N7P/N7+ could give 10%/15% power savings, but the rest...
So, a 35-40% improvement would come from arch updates alone?
And that following the major perf/watt jump from Vega to Navi?
That was what they said in the financial analyst day presentation, yeah, including that specific point. This does make it seem like RDNA (1) was a bit of a "we need to get this new arch off the ground" effort with lots of low-hanging fruit left in terms of IPC improvements. I'm mildly skeptical - it seems too good to be true - but saying stuff you aren't sure of at a presentation targeting the financial sector is generally not what risk-averse corporations tend to do. PR is BS, but what you say to your (future) shareholders you might actually be held accountable for.
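For what it's worth, the numbers roughly hang together if you treat the contributions as multiplicative rather than additive. A quick sketch in Python - the ~50% overall perf/watt target is my reading of that presentation (an assumption on my part), and the 10%/15% process power savings are your own figures:

    # Decomposing an assumed ~50% generational perf/watt gain into process
    # and architecture contributions. Gains compound multiplicatively.

    total_gain = 1.50  # assumed overall perf/watt target (+50%)

    for power_saving in (0.10, 0.15):  # quoted N7 => N7P/N7+ power savings
        # Saving 10% power at equal performance is a 1 / (1 - 0.10) ~= 1.11x
        # perf/watt gain from the process step alone.
        process_gain = 1 / (1 - power_saving)
        arch_gain = total_gain / process_gain
        print(f"process saves {power_saving:.0%} power -> "
              f"architecture must supply ~{arch_gain - 1:.0%} perf/watt")

That works out to roughly 27-35% from architecture alone - a bit below your 35-40%, since the gains compound rather than add, but still a big claim to hang on IPC work.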
Welp, what about Vega vs Navi? Same process, a 330 mm² chip with faster memory barely beating a 250 mm² chip from the next generation.
Not to mention at ~70 W more power draw.
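For reference, the cards that fit that description are presumably the Radeon VII (Vega 20) and the RX 5700 XT (Navi 10), so here is that comparison with rough numbers attached (die sizes and board power are the commonly cited figures; "roughly equal performance" is the premise of the quote, not something I've measured):

    # Quick sanity check on the Vega 20 vs. Navi 10 comparison, assuming
    # roughly equal performance between the two cards.

    vega_20 = {"die_mm2": 331, "board_power_w": 300}  # Radeon VII
    navi_10 = {"die_mm2": 251, "board_power_w": 225}  # RX 5700 XT

    # At ~equal performance, perf/area and perf/watt reduce to inverse
    # area and power ratios.
    area_ratio = vega_20["die_mm2"] / navi_10["die_mm2"]               # ~1.32x
    power_ratio = vega_20["board_power_w"] / navi_10["board_power_w"]  # ~1.33x

    print(f"Navi 10 matches it with ~{1 / area_ratio:.0%} of the die area")
    print(f"...and ~{1 / power_ratio:.0%} of the board power")

Which rather underlines the point: same node, yet the newer architecture does the same work in about three quarters of the area and power.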