Nvidia doesn't dictate anything. The company will have problems once TSMC stops releasing new processes, which will inevitably happen because Moore's law has been dead for a while already.
Nvidia relies on the older 6nm and 4nm nodes, and this is a disaster for them.
They do today. They have architectural advantages, and many PC game ports optimize specifically for Nvidia. The public buys the sticker blindly; it's not a matter of TSMC.
No, in fact it didn't help when RDNA 3 came out with driver issues caused by a new, more complicated architecture - dual-issue shaders being the main culprit, but other things too. As I mentioned elsewhere, if AMD streamlines its approach to architecture like Nvidia did, driver issues will become fewer and fewer, and they will have very good drivers even at launch. Go back to RDNA 2: it had proper drivers at launch because it was just a bigger RDNA 1 with RT cores added, not a heavily changed architecture like RDNA 3 was. If RDNA 4 has very good drivers from second one, it will earn nice mindshare, just like RDNA 2 did.
They tried to streamline it with RDNA for gaming and CDNA for compute, and then compute and AI and Stable Diffusion and all that happened, and now they are moving to combining those again - one architecture for everything, back to the GCN era.

AMD failed with RDNA 3 not because of drivers, but because they failed to predict that a feature like RT - a gimmick that probably doesn't do much, or maybe it does - would become a very important parameter when buying a GPU much faster than they were expecting. If RDNA 3 had brought twice the RT performance from the same number of CUs compared to RDNA 2, AMD wouldn't have lost so much market share. Then they failed to take advantage of the better compute capabilities of RDNA 3. Getting Amuse and promoting it with their GPUs shows that at least they understand that buyers of graphics cards want to (feel they can) do more with their hardware than simply play games.

When the equivalent of a $200 GPU from 10 years ago today costs $500, consumers want at least the impression that they get more for paying more. And I don't just mean raster performance. While I keep saying a number of things about Nvidia, the buyer of a $500 Nvidia card only gets gimped on VRAM; they still get raster, RT, AI upscaling, FG, CUDA, AI, OptiX. With AMD, until recently, the buyer was getting only raster performance and FreeSync. AMD had to go to FSR 3.1 to offer something good, pick up Amuse and offer it free to those wanting to play with AI image creation, and implement AFMF 2 in the driver to reach a level where the buyer felt they were getting more than simply raster performance for gaming.
Illusion or not, frame gen works - not in competitive games, where I would never use it since it doesn't improve your relative latency, but in other games it works well and it really feels like you have more FPS. The buyer cares about how well the game runs, not about intrinsic technicalities like whether it is "generated" or "real" performance. "Real" is also relative and freely arguable: "real" for me is what works, not "traditional" performance. So frame gen is very much real, as long as it performs as advertised (and it usually does - I used it extensively in CP2077, for example, and in D4 because of CPU bottlenecks in the cities; it worked in both cases). Frame gen has three downsides:
1) It's not really usable for competitive games, as it doesn't make you see enemies sooner - the relative latency doesn't improve.
2) It has rare image glitches; the quality is mostly very good, but not always.
3) It's not (really) usable if your fps is under 50-60 before turning it on. So you can't use it if your frames with settings X aren't high enough to begin with.
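To make the latency point concrete, here's a toy back-of-the-envelope model (my own simplification, not Nvidia's or AMD's actual pipeline; the "one extra base frame of delay" figure is an assumption for interpolation-style frame gen, where the next real frame must exist before the in-between frame can be shown):

```python
# Toy model of 2x frame interpolation:
# - displayed fps doubles (smoothness improves),
# - input latency stays tied to the base frame rate, and interpolation
#   adds roughly one base frame of delay on top (assumption).

def frame_gen_model(base_fps):
    base_frame_ms = 1000 / base_fps
    displayed_fps = base_fps * 2        # interpolated frames double smoothness
    latency_ms = base_frame_ms * 2      # rough: held back ~1 extra base frame
    return displayed_fps, latency_ms

for fps in (30, 50, 60):
    shown, lat = frame_gen_model(fps)
    print(f"base {fps} fps -> shown {shown} fps, ~{lat:.0f} ms latency")
```

So at a 30 fps base you'd see 60 fps on screen but feel worse-than-30-fps input lag, which is exactly why frame gen only makes sense once the base frame rate is already around 50-60.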
What I posted above! I guess we agree here, but you didn't realise what I meant. It is an illusion with the RTX 4060 vs the RTX 3060, and that's why Nvidia hasn't supported Frame Generation on RTX 3000 cards: the performance stagnation in many cases would have been obvious. So when I talk about an illusion in this situation, it is not about FG working or not, but about the artificial limitation of not offering FG to the 3060 or any 3000-series model.
Are you saying that it is OK to sell a new-generation card with almost equal raster performance at the same price, only because of an RT improvement? I actually don't care that much about RT.
The GRE is selling for much higher than $500, I think - at least in most countries. Considering that AMD almost always starts with a relatively high MSRP and drops the price slowly even weeks after release, a $500 MSRP will translate into $450 six months later. Raster performance of a 7900 GRE and RT performance higher than a 7900 XTX's for $500 MSRP, $450 six months later, is not bad at all. Intel's top models and Nvidia's 5070 and 5060 models will dictate the final price. But for the last few years AMD has always started with a higher MSRP than we wish to see. Maybe because at AMD they are super annoyed at reading the same phrase over and over again for the last two decades:
"Please AMD, build something good and cheap, so Intel and Nvidia drop their prices, so I can go and buy cheaper Intel and Nvidia hardware."