Why 8GB when you can have 11-12GB? Why buy a ridiculous 16GB card when you can buy a non-ridiculous 16GB card? I can't see this outperforming a 7800 XT, even realistically in RT.
Come on.
I still use the same screen: a WQHD + FreeSync ASUS PA278QV.
I bought a Radeon 6600 XT in its first days; the GPU mining era made it expensive. That was paired with a 5800X + 2x32 GiB DDR4 RAM.
I switched the processor to a Ryzen 7600X, which is basically a sidegrade. (I mention that because the AM4 platform I sold and my current AM5 platform are very close, so it is not really an apples-to-oranges comparison. Of course the AM5 platform ran Windows 11 23H2, now 24H2, while the AM4 platform ran Windows 10, so the operating system changed.)
I could see the benefit of more VRAM in various free games with the Radeon 6800 non-XT. That card was one of the last of its product cycle. VRAM matters, even for the free Epic Games giveaway games that I mostly play and enjoy.
I'm kinda happy with the PowerColor 7800 XT Hellhound. (That is the point of this whole text wall: the 7800 XT.)
Raytracing is a nonsense feature to me. Hardly any of my games support it, except those free "trash" games AMD gave away: Avatar: Frontiers of Pandora, The Last of Us, Star Wars Jedi: Survivor. (I never bought a Steam, Ubisoft, or EA game; my last game purchase was back in the old days on WORM media.)
Why do you want raytracing on an entry-level graphics card? The Nvidia 3070 was too weak, and the Radeon 6800 non-XT and 7800 XT are too weak as well.
If you want raytracing, you should buy at least a 7900 XT or go higher.
RT (raytracing) is used as the killer argument, but I do not see a difference or any improvement with raytracing. I always see the same technical demonstration software (tech demos, in short) showing a work in progress. For entry-level gaming a 7800 XT is a fairly decent card. It's an expensive card, but mine is at least quiet.
The cards above it are for those who are willing to pay twice as much for a graphics card.
It was already hard to justify buying a graphics card that cost as much as my mainboard + CPU bundle plus the expensive 2x32 GiB DDR5 RAM.
I also believe these higher-end graphics cards draw more than the 220 watts I see on mine. Of course you get "up to" double the frames with "up to" 450 or 670 watts for a graphics card.
The idle consumption of ~47 watts for the Nvidia 5090, according to the Gamers Nexus video (please check the video yourself; I'm not sure if it was 40 or 47, but it's much too high compared to my ~7-16 watts at idle), is another flaw. I trust them more because they measure all power going into a graphics card with verified equipment.
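To make the efficiency point concrete, here is a minimal back-of-the-envelope sketch in Python. It uses the 220 W, 450 W and 670 W figures from above; the baseline frame rate is a hypothetical placeholder, and "double the frames" is the marketing claim, not a measurement.

```python
# Rough frames-per-watt comparison under assumed numbers.
baseline_fps = 60                 # hypothetical 7800 XT frame rate, NOT a benchmark
baseline_watts = 220              # board power reported above for the 7800 XT

highend_fps = 2 * baseline_fps    # "up to" double the frames (marketing claim)
for highend_watts in (450, 670):  # power figures mentioned above
    base_eff = baseline_fps / baseline_watts
    high_eff = highend_fps / highend_watts
    print(f"{highend_watts} W card: {high_eff:.3f} fps/W "
          f"vs 7800 XT: {base_eff:.3f} fps/W")
```

Even taking the "up to double" claim at face value, the frames per watt come out roughly equal at 450 W (~0.27 fps/W on both sides) and clearly worse at 670 W (~0.18 fps/W).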
-- The VRAM topic is also settled for me. Keep staying in your 8 GiB or 12 GiB VRAM bubble if you want to. I saw a clear difference in certain games going from the 6600 XT 8GB to the 6800 non-XT 16GB card. I play older games from Epic Games and such; with recent games I most likely would not use a 7800 XT, but something much better. I want to see how your 8GB card does in WQHD in Avatar: Frontiers of Pandora. That game is demanding and was a free "trash" AMD giveaway game, with buggy quests and such. My statement is only valid for AMD graphics cards. I tested a 4GB Nvidia card for driver quality in 2023. Intel is not really worth considering, in my point of view.
The reason I bought the 7800 XT Hellhound was its low noise in various tests. The MSI Radeon 6800 Z Trio and ASRock Challenger 6600 XT 8GB D were far too loud, in my point of view; that is one of many reasons why those cards had to go to the second-hand market.
edit: the buyer should be aware that they limit themselves to certain games, certain frame rates, certain display resolutions, and certain game settings when buying an 8GB VRAM card. I would not call a 16GB card future-proof, but it is acceptable for games made up to roughly 2021 (feel free to determine the year limit yourself).