Guy, you don't buy a
The AMD paranoia. The xtx can't even keep up with the 4080 in new games anymore, but now it's suddenly faster than the 5080?
Believe what I want? I'm using the data YOU provided. You linked that graph that clearly shows the cheaper nvidia card matching the much more expensive amd card with twice the vram. Clearly amd fleeced us last gen. According to your graph the 5080 is 38% faster than the XTX, but I'm the one who believes what he wants, lol. Delusional
I give up. You just don't get it. Hopefully it makes sense to other people. Have fun however you enjoy your games, guy.
7900xtx for 4k non-RT w/o a doubt. 5080 for 4k non-RT maybe? Would have to see the dips/vram usage on an OC. The point is that vram is the limitation there. The future holds up better for the 7900xtx in raster at 4k.
Both will run native 1440p RT. Neither higher.
4070ti: can't do RT upscaled from 1080p to 1440p; 7900xtx: can. 7800xt is 1080p native, same as the 4070ti.
My point is: if it runs something better but still can't run it well, is that a win? I say no. If it can run raster better at a higher playable resolution (and often much cheaper), is that a win? I say yes.
Again, I use 60fps as a baseline. You may not. That's fine if you don't. The industry does. You clearly either don't understand or are being defensive. I'm making a point about target resolutions/playability.
They are different aims, no doubt. That will be somewhat similar with the 5070 (which will be clocked through the damn ceiling to compete against the 9070 in 1080p RT, some games upscaled higher).
That does not change the fact that 12GB just isn't enough vram for 1440p RT (and soon probably many cases of 1080p upscaled to other resolutions), and it doesn't excuse its inability to run 1440p/4k on low vram either!
Like I said, it's a conversation that will make more sense as things evolve. This is just the beginning.
You'll understand better when AMD targets RT resolutions and not just raster. It's not so much that the arch changed, but people's perception, and hence the target performance aim.
9070 onwards will target RT (1080p native or upscaled from 1080p), whereas former products targeted pure raster resolution (7600/7800 at 1080p/1440p), like the 7900xt is 1440p->4k and the 7900xtx is 4k in raster.
Because of this the 7900xtx does not target 1440p RT (or RT upscaled to 1440p) at stock, nor does the 7800xt target 1080p RT or RT upscaled from 1080p.
The 9070 series will target RT at 1080p, native or upscaled from 1080p (to 1440p/4k).
Make sense? The absolute performance of a lot of the 7000 series is fine, but they are certainly shifting the stock perf aim to make people like you happy. Like I said, it won't completely shake out until the 3nm cards.
And then 12GB will DEFINITELY suck. Enjoy your 1080p without the ability to upscale to ANYTHING with decent performance (certainly not in RT).
I know. I know...You can turn down settings. Kill the shadows. I know. But that's not the point.
People want to play with maximum settings (I would argue not RT until next gen, at 1080p or upscaled from 1080p to 1440p/4k), and that is how cards are aimed.
9070 will start that reality; 4070ti/5070 will not continue it. The next gen will be squarely aimed at that across the board, and at scaling from a higher rez (1440p->4k) outside of something like a 4090.
The 192-bit cards will replace the 9070/5080 and do it better... and those will be the aim bc of similar perf to a PS6.
Some games will be limited by 16GB, some not. That's my point. The 9070 is fine, but 192-bit 3nm is better for longevity.
45-90TF is a nice scale. Under that, 12GB is fine but soon to be outdated (as it'll be below console capability). This includes ALL the "RT" cards you've seen up until this standard.
45TF starts needing 16GB, which is good up to around 60TF. The 7800xt is pretty much limited to 45TF for this reason (to make the 9070 look good for up to 60TF), and so are the 4070ti 12GB and the 5070 (<45TF).
90TF is around the absolute performance of 24GB. This is a 4090. It is a good design bc I don't think most people are going to play above 1440p upscaled and/or use >24GB (needing 32GB) most of the time.
256-bit 3nm cards will be targeted at this spec using 24GB GDDR7.
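If you want to sanity-check the scale yourself, the napkin math is simple. Here's a quick sketch (the tier cutoffs are mine from above, and the 2-ops-per-clock FP32 rule of thumb ignores stuff like RDNA3 dual-issue):

```python
# Napkin FP32 throughput: 2 ops/clock (FMA) per shader.
def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000.0

# The vram tiers I'm describing (my cutoffs, not vendor guidance):
def vram_tier(tf: float) -> str:
    if tf < 45:
        return "12GB ok, soon outdated"
    if tf < 60:
        return "16GB"
    return "24GB"

# Example: a hypothetical 8192-shader card at 3.0GHz (assumed numbers).
tf = tflops(8192, 3.0)
print(f"{tf:.1f} TF -> {vram_tier(tf)}")  # 49.2 TF -> 16GB
```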
PS6 is probably ~60TF. So, IOW, PS6 will use a minimum of 16GB and conceivably more (the absolute perf of a 7900xt is 60TF, and it has 20GB), while likely targeting ~1080p->4k RT worst-case.
Conceivably higher resolution, with or without RT.
192-bit cards will target this using 18GB GDDR7.
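And the 18GB/24GB numbers aren't arbitrary; they fall straight out of the bus width, since each GDDR module hangs off a 32-bit channel. A sketch (assuming one module per channel, no clamshell, and 3GB/24Gbit GDDR7 modules):

```python
# vram capacity from bus width: one module per 32-bit channel.
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    return (bus_width_bits // 32) * module_gb

print(vram_gb(192, 3))  # 192-bit + 3GB GDDR7 modules -> 18GB
print(vram_gb(256, 3))  # 256-bit + 3GB GDDR7 modules -> 24GB
print(vram_gb(256, 2))  # 256-bit + 2GB GDDR6 modules -> 16GB (the 9070-style config)
```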
I don't think it's that difficult to understand? You can literally look at every card and see what I'm talking about.
Also, again, this is why the 5080's 16GB is a travesty: they clearly don't want it to be a 4k native card. It's clocked at 2640MHz at stock so it doesn't exceed 60TF and have its vram limitations be clearly seen.
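Run the napkin math yourself (10752 shaders is the published 5080 spec; 2640MHz is the stock clock I'm quoting above):

```python
# 5080 at stock: 2 ops x 10752 shaders x 2.64GHz
print(2 * 10752 * 2.64 / 1000)  # ~56.8 TF, conveniently just under 60
```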
This is literally clear as day. Maybe it's not clear to some. I don't know. I'm a nerd. It's the truth though (that it's clear...also that I'm a nerd).
This is why a 9070xtx, which would likely reach CLOSE to 60TF absolute performance (meaning 8192sp @ a high-as-hell clockspeed), is a smart/cheap design bc high-clocked 256-bit 16GB GDDR6.
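For that hypothetical 9070xtx to actually touch 60TF with 8192sp, solve the same formula for the clock (pure speculation on my part, the card doesn't exist):

```python
# clock (GHz) needed to hit a target TF at a given shader count
def clock_for(tf_target: float, shaders: int) -> float:
    return tf_target * 1000.0 / (2 * shaders)

print(clock_for(60, 8192))  # ~3.66GHz, high-as-hell indeed
```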
The 5080 is so badly designed as a stop-gap to the card you want (a 4090) that it's ridiculous how blatant it is.
The fact they don't clock it over 60TF with 16GB, or clock it higher with 24GB, is telling (that's a Super, which will also be trash bc it still won't match 4090 raster while the 3nm cards will).
I need to teach courses on this shit or something. I'd give you extra homework.