Yes, since in the end performance comes from power.
Ampere had to throw out the 250 W power target for the top cards because RDNA 2 was too strong for them; that's why the 80-class card got the top-dog chip too, and the Samsung node wasn't that good either. That's also the reason the 4090 is so fast: Nvidia pretty much always does the same thing, and if AMD was close in one generation, they go all out in the next. That's why Blackwell will be an extremely boring Turing 2.0 generation. The 5090 will be a 2080 Ti 2.0: barely faster, a bit more efficient, maybe RT is better by a good margin, but who cares about that for the two games a year that really need it lol. (imo)
The latest gen is the most efficient.
Power draw ≠ efficiency.
That's just logical; the next gen is always more efficient, that's the nature of the beast. Otherwise it would mean the cards get slower and thirstier, which would be a death sentence for any GPU manufacturer.
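A minimal sketch of what "power draw ≠ efficiency" means in numbers. All the performance scores and wattages here are made-up placeholders, not real benchmark results:

```python
# Efficiency is performance per watt, not raw power draw.

def efficiency(perf_score: float, power_watts: float) -> float:
    """Performance per watt."""
    return perf_score / power_watts

# Hypothetical cards: the newer one draws MORE power, yet it is still
# more efficient because performance grew faster than power draw did.
old_card = efficiency(perf_score=100, power_watts=250)  # 0.40 perf/W
new_card = efficiency(perf_score=200, power_watts=400)  # 0.50 perf/W

print(f"old: {old_card:.2f} perf/W, new: {new_card:.2f} perf/W")
```

The newer card pulls 150 W more in this toy example and still does more work per watt, which is the whole point of the distinction.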
Still, we had 250-260 W at the top from Nvidia for a long time, but now we're hitting the limits of monolithic chip design. Just look at what the 4090 actually delivers compared to the 4080: going by the specs it should be much faster than it is, but it isn't, because of physical limits.
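To put rough numbers on that 4090-vs-4080 point: the CUDA core counts below are the published specs, but the observed performance gap is an assumed ballpark figure, since the real number varies a lot by game and resolution.

```python
# Back-of-the-envelope scaling check for the 4090-vs-4080 comparison.
cores_4090 = 16384   # published RTX 4090 CUDA core count
cores_4080 = 9728    # published RTX 4080 CUDA core count

spec_ratio = cores_4090 / cores_4080   # ~1.68x on paper
observed_ratio = 1.30                  # assumed ~30% real-world gap (ballpark)

scaling = observed_ratio / spec_ratio
print(f"on paper: {spec_ratio:.2f}x, in practice: {observed_ratio:.2f}x")
print(f"scaling efficiency: {scaling:.0%}")  # ~77% of the spec gain shows up
```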
MCM will fix that to a degree, but performance still comes from power. They will need a completely new way to make GPUs if we want to keep the performance gains coming. And ray tracing hasn't even really started yet; real RT will need 100 times more GPU power. Rasterization will be here for a very, very long time, decades still.
I could see 4080/4090-class GPUs staying high in power draw, but I would expect the 4070 to be powerful while keeping a reasonable draw.
The 70-class cards now pull 200-285 W; Pascal was at 150 W and Turing at 175 W. It's ridiculous, and the regular 4070 is a 4060 Ti at best, yet it still draws 200 W.
Sure, they're efficient, but they also need a lot more power than before. Pascal was insane and we'll never see anything like it again: efficient, fast, plenty of VRAM, and very low power in absolute terms.