OTOH, +75% speed in ray-traced (not even path-traced) CP2077.
+ DLSS support. Matters and by a lot. FSR is awful at 1440p, only looks good at 4K.
+ Better XeSS speed. Doesn't really matter but a bonus is a bonus.
+ Lower power consumption. Matters most in Western Europe and other regions where electricity is overly expensive; to a lesser extent, everywhere else.
+ Better support of professional workloads. Not everyone is a gamer.
+ Overall better RT support, which makes this GPU more future-proof.
+ Longer driver support.
+ "Oi m8, what the heck is AMD GPUs?" from average Joes. Yes, not everyone is so smart like us.
The 7900 XT needs a further discount to be attractive. Only being faster at pure raster was enough five years ago (though AMD hadn't achieved even that back then); now it's just a one-advantage-billion-disadvantage line-up.
Ok, so:
Minimums, approximate, out of this test suite (which we'll get into a little further down). Many games not included, for whatever reason, would perform like Cyberpunk or worse.
Plague Tale 4k60:
Overclocked 7900xt: Ya, prolly.
Overclocked long-ass name nvidia card: Naw
AC mirage 1440p120:
Overclocked 7900xt: Ya, prolly.
Overclocked Box of Scraps AD103: Naw.
Baldur's Gate 3 1440p120:
Overclocked 7900xt: Ya, prolly.
Overclocked 4070 Ti Bigger Geebees-I-IT-UM: Naw
Battlefield 4k120:
Stock 7900xt: Ya.
Can't call it 4080 16GB because that already exists: Naw
Dead Space 4k60:
Overclocked 7900xt: Pretty much
Overclocked 4080-10+Titanium+Super: Naw
Sorry, I couldn't be bothered to go past the D's. The post will be long enough as-is. Point taken?
'Shoulda been what the 4070 Ti was from the beginning' is not a great example of The Way It's Meant To Be Played.
Unless you mean nVIDIA's attitude toward consumers.
I struggle to find pretty much anywhere this will have any kind of TANGIBLE lead in absolute performance, but in many cases the opposite is true. I think people expect 4K (even if via DLSS/FSR Quality) for >$600, and at that resolution FSR often gains more relative performance (allowing higher settings) and genuinely doesn't perceptibly hurt IQ, while this product is not going to guarantee that even using nVIDIA's 'feature advantage' of DLSS/RT...as I attempted to imply in my earlier post...and that's generously, and importantly, using averages. The minimum in that game is 38.7, which you can extrapolate from (as even less desirable), and it's important to understand most people will be running a less robust system than what W1zard uses for testing. If you're aiming for less than 4k60 with Quality upscaling, or 1440p at under ~120 fps average, you're better off buying something cheaper than a 7900xt IMO and upgrading as necessary (which will be a much better value).

We can argue 960p->1440p upscaling all day (I don't think even DLSS is good enough there, let alone for the price you pay when other, cheaper options allow native 1440p, though some disagree; see the quick render-resolution math after this paragraph), but IMO taking nVIDIA's bait on that, in exchange for an ever-increasing RT/raster ratio per generation, is falling into a trap of more frequent and more expensive upgrades regardless. Your RT performance today is no guarantee it will hold up at raster-capable settings when the next gen launches. Ask anyone with a 3000 or 2000 series; also ask them how frame gen is working out for them. Oh, that's right, totally not possible without proprietary 40-series tech, kinda like how a G-Sync module is absolutely required for VRR. Oh wait, that's a complete load of shit, as AMD did both just fine through software. Who knows what the next performance scapegoat will be: something unsupported on past generations, or 'improvements' that relegate fairly recent parts to non-RT or lower resolutions. Odds are, though, it will come to pass. Maybe AMD (or even Intel) will save our asses again, maybe they won't. We all know how nVIDIA feels about open source, regardless, and even about their own earlier generations. I honestly don't know how anyone can support that practice when there are other options, and those options are often actually better/cheaper.
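For the upscaling point above, here's the quick render-resolution math (just a sketch; the 2/3-per-axis scale is the standard Quality preset for both DLSS and FSR 2, and the output resolutions are the ones discussed above):

```python
# Minimal sketch of internal (pre-upscale) render resolution for Quality-mode
# upscaling, which uses a 2/3 per-axis scale for both DLSS and FSR 2.
QUALITY_SCALE = 2 / 3

def internal_res(out_w: int, out_h: int, scale: float = QUALITY_SCALE) -> tuple[int, int]:
    """Return the internal render resolution before upscaling."""
    return int(out_w * scale), int(out_h * scale)

print("1440p Quality:", internal_res(2560, 1440))  # (1706, 960)  -> the "960p -> 1440p" case
print("4K Quality:   ", internal_res(3840, 2160))  # (2560, 1440) -> a native-1440p-sized input
# The 4K-output upscale starts with ~2.25x more input pixels, which is why
# Quality mode holds up at 4K output but looks rough at 1440p output.
```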
On average, this card performs similar to, or worse than, I thought it might. The $800 nV card now has adequate RAM, but lacks the raster the 4080 had to completely take advantage of it (especially wrt nVIDIA's RT feature-set).
They simply refuse to give both for a decent price. I have said some version of this many times, and it has been shown to be true more times than I can count over the years.
I find it fascinating that some people, including W1zard, choose to die on the hill of not needing more RAM. They've been proven wrong continuously over the years (as AMD cards with more RAM age more gracefully, to some people's evergreen bewilderment), but simply complain about the game ('optimization') when the threshold is reached, rather than, you know, the planned obsolescence the competition doesn't share, and then PROCEED TO REMOVE THOSE GAMES FROM THE TEST SUITE IN GPU REVIEWS. Immortals of Aveum, one of the first UE5 games (and indeed a harbinger of how others in the future may also perform if not perfectly polished)? Nope. Done been vanished.
Weird.
Or is it? At least we still have Hogwarts Legacy, which may be a hint of things to come in the PC space. Seeing a trend? Yes, many people should in fact consider overclocking a 7900xt and get good-enough performance for less money, without having to settle for upscaling (rn).
RT on that card isn't HORRIBLE either, in many use-cases, especially if you figure a ~15%+ overclock, which isn't crazy or abnormal. That's the performance of an overclocked 4070 Super, which isn't much cheaper, or the stock performance of the old 4070 Ti 12GB, which was MORE EXPENSIVE (and again arguably the most ridiculous fucking thing nVIDIA has ever sold to enthusiasts). +40% RT performance only matters when your compute/raster/buffer doesn't suck 40% more (rough back-of-envelope below). I wonder what's more versatile? Probably the compute shaders AMD has apparently used to replicate nVIDIA's fixed-function hardware, including FSR and FG, to good effect.
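To make that last point concrete, a back-of-envelope sketch (all numbers are hypothetical, just to show the shape of the trade-off, not measurements from any benchmark):

```python
# Back-of-envelope: why "+40% RT performance" only matters if the rest of the
# frame doesn't get 40% slower. All numbers below are made-up illustrations.

def fps(raster_ms: float, rt_ms: float) -> float:
    """Frame rate for a frame split into raster/compute work and RT work."""
    return 1000.0 / (raster_ms + rt_ms)

# Hypothetical frame: 12 ms of raster/compute work + 6 ms of RT work.
base_raster_ms, base_rt_ms = 12.0, 6.0

card_a = fps(base_raster_ms, base_rt_ms)               # baseline card
card_b = fps(base_raster_ms * 1.4, base_rt_ms / 1.4)   # raster takes 40% longer, RT runs 40% faster

print(f"Card A: {card_a:.1f} fps, Card B: {card_b:.1f} fps")  # ~55.6 vs ~47.4
# The RT advantage only wins out once RT work dominates the frame time.
```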
It's interesting we still have the absolute nVIDIA-advantage outlier of CS2 tested, though, which will play fine on most anything (you can get 1440p120 minimums on a 7800xt; if you need more, it's a skill issue).
(I still think he should switch that out and bring back The Callisto Protocol, which is perhaps the most 1:1 console->PC performance port. You know: an actually important thing to understand...but whatever.)
It happened at 3GB; the 970, because of its lawsuit-inducing bus (a card I owned, which stuttered like hell when using the last GB and later literally erupted into flames in my PC due to a widely-discussed common problem, WHICH WAS EFFING SCARY [never change, nVIDIA]). It happened at 4GB, which I was right here complaining about regarding Fiji long before it was even released. It happened at 6GB, and I don't recall AMD making any 6GB parts past the era before it became a limitation. It happened at 8GB, which one company tried to sell as a 1440p card at $400 within the last year. It is happening at 10-12GB, and many will then forget the 3080 10GB is/was a (fairly prominent) thing; expect it to be removed from the comparisons when the next gen launches, even though its raster performance is still adequate. It doesn't matter AS MUCH for this card, as again it lacks the grunt in other areas to fully take advantage of it...but many neglect that aspect completely, as it's not currently as readily apparent as an 8GB card hitting a wall at 1080p/1440p, sometimes only apparent in his suite at 4K. It truly is history repeating itself: outliers occur (as they already have for 10-12GB), and then become the norm (as they will). There will also always be SOME games that need to be brute-forced, regardless of reason.

Talk to me again after the PS5 Pro and Blackwell release, when games are normalized toward the new console (4GB for the OS + 16GB for games, unlike the regular PS5, which is 512MB for the OS + 16GB shared [~13.5-14GB for games?]) and/or nVIDIA's new arch (especially wrt RT); when the AD104 replacement(s) is/are perhaps 18GB and people who own an AD104 (in terms of RAM) or this card (in terms of raster) STILL OWN THEM, because at $800 they're likely considered a long-term (2+ year) investment. Did I mention we might get both before the end of this year, and almost certainly for less than this card (likely to fight Navi 4)?
The PS5 Pro will be a whole damn computer for less money, but not tangibly less performance. That's what makes me the most sad about this, as I, like many here, am a PC gaming enthusiast at heart and want PC gaming to thrive and push the medium forward. It's a bummer when it's extremely difficult to justify it to people because of products like this, which do absolutely nothing to move the needle back in that direction.
I'm not trying to coax a narrative; more often I'm trying to save people from one. Buy what you want for reasons that are important to you. If you feel you're getting the perf/$ for the time/way you'll use it, that's what matters.
The problem is, some don't understand how they'll be able to comfortably use it over the course of its time in their PC, and what practical advantages/disadvantages each option has over time.
Think of the many who proclaim 60fps is essential for frame gen and desire 90fps+ for an FPS. Can this card guarantee that with 4K/DLSS Q/RT? No, it can't, even now, as demonstrated by my 2077 post.
That's my point, which is to say its advantages are, for many people, largely moot. This will only become more apparent over time, as nVIDIA has demonstrated over and over again.