Sunday, December 11th 2022

First Alleged AMD Radeon RX 7900-series Benchmarks Leaked

With only a couple of days to go until the AMD RX 7900-series review embargo lifts, some alleged benchmarks of both the RX 7900 XTX and RX 7900 XT have leaked on Twitter. The two cards are compared to an NVIDIA RTX 4080 in no fewer than seven game titles, all running at 4K resolution. The games are God of War, Cyberpunk 2077, Assassin's Creed Valhalla, Watch Dogs: Legion, Red Dead Redemption 2, Doom Eternal and Horizon Zero Dawn. The cards were tested on a system with a Core i9-12900K CPU paired with 32 GB of RAM of unknown type.

It's too early to draw any real conclusions from this test, but in general, the RX 7900 XTX comes out on top, ahead of the RTX 4080, so no surprises here. The RX 7900 XT is either tied with the RTX 4080 or a fair bit slower. The exception is Red Dead Redemption 2, where the RTX 4080 is the slowest card, although it also appears to have some issues, since its one percent lows are hitting 2 FPS. Soon the reviews will be out and everything will become clearer, but if these benchmarks are anything to go by, AMD's RX 7900 XTX will give NVIDIA's RTX 4080 a run for its money.

Update Dec 11th: The original tweet has been removed for unknown reasons. It could be because the numbers were fake, or because they were in breach of AMD's NDA.

Source: @Vitamin4Dz

146 Comments on First Alleged AMD Radeon RX 7900-series Benchmarks Leaked

#76
vMax65
To be honest, I have seen this AMD vs Nvidia vs Intel from the earliest days and I still cannot understand how so many seem to get wrapped up in brands. For me it is simply budget and use case, and maximising what I get for the budget. Both Nvidia and AMD are the same sadly... Nvidia more so, as they are the market leader, and if the tables were turned it would be the same. The 4080 was clearly priced absurdly, and there probably are reasons to do with overproduction of the 30 series, as they were trying to cream profits during the mining craze, which also showed them the absurd prices people were willing to pay. When the bottom fell out with the end of GPU mining, they were caught between a rock and a hard place, as they did not want to lose margins on the 30 series for both themselves and the AIBs... still, it was their own fault, as they sadly jumped on the scalper bandwagon.

AMD, on the other hand, are also not doing us any favours; the 7900XTX and XT are also overpriced, and let's not forget that as soon as they reached parity with Intel and finally overtook Intel on performance, they upped the prices of their CPUs and cut out the lower end models. Only with Alder Lake and Raptor Lake did AMD finally have to drop prices of the 7000 series CPUs. If Raptor Lake did not compete, we would be paying the higher prices all the way through. Competition is working on CPUs...

I think the average gamer is getting priced out, as the low end is reaching mid range prices, mid range is reaching high end pricing, and high end is reaching halo pricing. Sad to see on all sides, especially with the current inflation crisis. Hopefully the 7900XTX does compete well with the 4080, as that could see the 4080 drop in price, and if Intel can get in on the act in a few years, we might get lucky with some good competition driving down prices.

Just buy the best GPU for your budget, be it AMD or Nvidia... I went and bought a 40 series GPU as it fit my needs. The 4080 and 4090 are good GPUs, efficient and powerful, and no doubt the 7900XTX and XT will be the same.
Posted on Reply
#77
InVasMani
Until something is confirmed, the bickering serves no purpose. It's also worth waiting to see if it behaves any differently on AMD hardware, because there is certainly a chance that Smart Access Memory works slightly differently on a system built around AMD rather than Intel.
Posted on Reply
#78
Gica
Lots of salt. The 4080 can't reach those temperatures in games.
Posted on Reply
#79
InVasMani
The CPU temps look strange to me; you'd expect fewer average frames to make the CPU run cooler, not hotter, but the opposite seems to be happening in general. It's not like CPU utilization drops when rendering more frames; it's the opposite: a stronger GPU just allows you to render more of them, at least until the CPU itself reaches a choking point.
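
To illustrate that reasoning with a toy bottleneck model (a sketch only; the caps and numbers below are hypothetical, not taken from the leak):

# Delivered framerate is capped by whichever side is slower, so a stronger
# GPU pushes more frames through the CPU, and CPU load (and heat) rises
# until the CPU itself becomes the limit.
def delivered_fps(gpu_fps_cap: float, cpu_fps_cap: float) -> float:
    return min(gpu_fps_cap, cpu_fps_cap)

cpu_cap = 160.0  # hypothetical: frames per second the CPU can prepare at these settings
for gpu_cap in (90.0, 120.0, 200.0):
    fps = delivered_fps(gpu_cap, cpu_cap)
    busy = fps / cpu_cap  # rough fraction of the CPU's frame-prep capacity in use
    print(f"GPU cap {gpu_cap:5.0f} -> {fps:5.0f} FPS, CPU ~{busy:.0%} busy")

In this toy model the faster GPU raises delivered FPS, and with it the CPU's workload, until the CPU cap is hit.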
Posted on Reply
#80
Hxx
DenverI think what matters here is the truth, nothing more. In this case, anyone who says that the Nvidia driver is 100% bug-free and perfect is clearly lying.

If you want to avoid having to deal with bugs as much as possible, the best solution is to buy a console. PC is an amazing and comprehensive platform, but dealing with bugs is part of the experience.
Tbh I don't think any drivers are bug-free, but in a decade of owning Nvidia hardware (3080, 2070, 1070, 980 Ti, etc.) I don't think I've ever seen a consistent driver crash. Now this gen I really want to switch it up to AMD and grab a 7900XTX from my local MC. The last time I owned an AMD card was an ATI 4080 I think, lmao, or whatever the dual-GPU card was named. I know next to nothing about their GPU platform except what I read in reviews, so I hope whatever bugs there are, if any… are nothing more than one-offs and not some game-breaking, consistent type of crash.
Posted on Reply
#81
mahoney
InVasManiThe CPU temps look strange to me; you'd expect fewer average frames to make the CPU run cooler, not hotter, but the opposite seems to be happening in general. It's not like CPU utilization drops when rendering more frames; it's the opposite: a stronger GPU just allows you to render more of them, at least until the CPU itself reaches a choking point.
Because these benchmarks are fake. You have literally thousands of these YouTube benchmark charlatans posting all sorts of GPU/CPU numbers for products that haven't even come out yet, without ever showing proof that they actually have them.
Posted on Reply
#82
Psychoholic
GicaLots of salt. The 4080 can't reach those temperatures in games.
True, I think the most I have ever seen on mine is 62 °C.
Posted on Reply
#83
mama
If these figures are accurate, which I doubt, there may be an issue with 1% lows. Hopefully driver optimisations will fix the issue.
Posted on Reply
#84
Zach_01
RedBearAs far as I know, Nvidia never said that RTX 4090/Ada Lovelace was aiming for 4 GHz, but AMD did say that RDNA3 was "architected to achieve 3Ghz":

(Edited picture from this reddit post)
Being unable to hit that target, if confirmed, would imply that there was some major hiccup in the development of the architecture.
I guess you or that guy didn't watch AMD's RDNA3 presentation, where they declared that the reference 7900XTX/XT clock speed is up to 2.5 GHz.
Learn first… comment later. The only bugs here are those coming out of some mouths/keyboards.

The up to 3 GHz could be for some AIB GPUs that also exceed the 355 W power limit of the reference 7900XTX.
Posted on Reply
#85
Jism
KrazyTHow can the CPU temp be around 10° lower with the 4080?
Should it not be the same?
That's what's called driver overhead.
Posted on Reply
#86
1d10t
Why does it look like a made-up 720p screen grab upscaled to a "4K" video with inserted text? Seriously, I've seen some side-by-side video benchmarks, and this leak doesn't even come close to 1080p resolution, even with Premiere scaling.
Posted on Reply
#87
swirl09
MeanhxA 4080 running at those temps? I call BS on these benchmarks.
First thing I noticed, honestly. Don't most 4080s use the same cooler as their 4090 models?

I can tell you I haven't seen my card do more than 67 °C, and that was pulling 480 W; neither of those numbers is what you typically see, though.
Posted on Reply
#88
iGigaFlop2
Something seems off; none of the 4090s will hit 80 °C, and the 4080s pretty much all have the 4090 coolers and run extremely cool. I think the XTX should be a little faster than the 4080 in pure rasterization but lose badly in ray tracing. I do think most reviewers should already have their cards and their reviews ready, so we might see some legit benchmarks soon.

I picked up a 4090; it's a beast, as it should be. Had to sell my first born to get it. I'm always excited for CPU and especially GPU launches. I want AMD to compete; it helps with innovation and pricing. Ray tracing was pretty much a gimmick in the Turing days, but it is a legit reason to pick a certain card now with all the games getting it. I think the XTX will have decent ray tracing, on par with a 3090 Ti or maybe a little better. I also think the XT should have been $200 cheaper; who's gonna pick up an XT when the XTX is only $100 more? When you're paying $900-1,200 for a GPU, what's $100? I'm hoping that's as high as we see aftermarket cards go, like the Nitro ($1,200).
Posted on Reply
#89
Razzic
The first concerning thing I noticed is the 1% lows. Nvidia's are far superior in most titles, and we know that equals a smoother gameplay experience.

Secondly, the framerate doesn't say average frame rate, so I'm gathering that is just the current framerate?

I hope the 1% lows will just be rectified by a driver fix and that it is just an unoptimised driver issue, as we know this can happen with Nvidia as well.
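
For context, a minimal sketch of how 1% lows are typically derived from a frame-time capture (the helper and sample data below are hypothetical; note that some tools instead average the slowest 1% of frames rather than taking the 99th-percentile frame time):

def one_percent_low_fps(frame_times_ms):
    # Sort ascending so the slowest frames sit at the tail, then take the
    # frame time at the 99th percentile and convert ms/frame to FPS.
    ordered = sorted(frame_times_ms)
    idx = min(int(len(ordered) * 0.99), len(ordered) - 1)
    return 1000.0 / ordered[idx]

# Hypothetical capture: mostly 60 FPS frames with a handful of 30 FPS stutters.
frame_times_ms = [16.7] * 990 + [33.3] * 10
print(round(1000.0 * len(frame_times_ms) / sum(frame_times_ms), 1))  # average: ~59.3 FPS
print(round(one_percent_low_fps(frame_times_ms), 1))                 # 1% low: ~30.0 FPS

This is why a card can post a healthy average while stutter drags its 1% lows down, which is exactly the pattern being questioned in these leaked numbers.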
Posted on Reply
#90
Luminescent
While the system is stable, AMD DRIVERS ARE BAD!
Proof: a recent AMD-sponsored game that W1zzard tested, The Callisto Protocol, runs very badly on AMD cards. So I investigated which graphics setting causes such poor performance on AMD; it was not ray tracing, it was volumetric quality, which causes a 20-25% performance hit in DX12 with the latest driver, AMD Radeon 22.11.2.
So I said fu..ck it, I put my Pro driver back (I do photo/video work), and if I want to play Callisto I will do it with the Pro driver, Pro Edition 22.Q4, a driver from 14 November 2022. The framerate went from 40 to 50 FPS in a scene I was testing before.
I can now play at 1440p with FSR 2 Quality on Ultra in DX12, no problems.
So it's not the people who made this game at fault; it's actually AMD who messed up with the latest driver.
Posted on Reply
#91
aciDev
MxPhenom 216If the 7900XT trades blows with the 4080 that'll be a huge win for AMD @ $899.
**80 tier performance at $900 + VAT is still a loss for us.
Posted on Reply
#92
Lovec1990
Personally, I will wait for tests and then decide.
Posted on Reply
#93
TheLostSwede
News Editor
mechtechdegrees C that is ;)
Sorry, it's below 273.15 degrees K outside.
Although I missed the fact that people were complaining it was running too hot. :oops:
Posted on Reply
#94
Sithaer
vMax65To be honest, I have seen this AMD vs Nvidia vs Intel from the earliest days and I still cannot understand how so many seem to get wrapped up in brands. For me it is simply budget and use case, and maximising what I get for the budget. Both Nvidia and AMD are the same sadly... Nvidia more so, as they are the market leader, and if the tables were turned it would be the same. The 4080 was clearly priced absurdly, and there probably are reasons to do with overproduction of the 30 series, as they were trying to cream profits during the mining craze, which also showed them the absurd prices people were willing to pay. When the bottom fell out with the end of GPU mining, they were caught between a rock and a hard place, as they did not want to lose margins on the 30 series for both themselves and the AIBs... still, it was their own fault, as they sadly jumped on the scalper bandwagon.

AMD, on the other hand, are also not doing us any favours; the 7900XTX and XT are also overpriced, and let's not forget that as soon as they reached parity with Intel and finally overtook Intel on performance, they upped the prices of their CPUs and cut out the lower end models. Only with Alder Lake and Raptor Lake did AMD finally have to drop prices of the 7000 series CPUs. If Raptor Lake did not compete, we would be paying the higher prices all the way through. Competition is working on CPUs...

I think the average gamer is getting priced out, as the low end is reaching mid range prices, mid range is reaching high end pricing, and high end is reaching halo pricing. Sad to see on all sides, especially with the current inflation crisis. Hopefully the 7900XTX does compete well with the 4080, as that could see the 4080 drop in price, and if Intel can get in on the act in a few years, we might get lucky with some good competition driving down prices.

Just buy the best GPU for your budget, be it AMD or Nvidia... I went and bought a 40 series GPU as it fit my needs. The 4080 and 4090 are good GPUs, efficient and powerful, and no doubt the 7900XTX and XT will be the same.
I don't get it either. I always buy what's best for my budget/use case, and for that both AMD and Nvidia got the job done just fine; same deal on the CPU side for me with AMD/Intel over the years.
Can't recall any serious driver issues either on my end. They both had their share of smaller problems, but nothing bad enough to make me so pissed that I wouldn't buy from that brand anymore. (I had my RX 570 undervolted/tweaked in the AMD driver for ~3 years, and it worked fine until I sold it/upgraded.)

I was always more on the budget to mid-range level of hardware, and yeah, I can for sure notice/feel that I'm being priced out if I want a reasonable upgrade on the GPU front.
My usual price/budget range simply does not exist anymore. I already switched to the second-hand market with my previous cards (RX 570/GTX 1070/3060 Ti), since I both cannot and refuse to spend that much on brand-new cards nowadays.
Posted on Reply
#95
InVasMani
LuminescentWhile the system is stable, AMD DRIVERS ARE BAD!
Proof: a recent AMD-sponsored game that W1zzard tested, The Callisto Protocol, runs very badly on AMD cards. So I investigated which graphics setting causes such poor performance on AMD; it was not ray tracing, it was volumetric quality, which causes a 20-25% performance hit in DX12 with the latest driver, AMD Radeon 22.11.2.
So I said fu..ck it, I put my Pro driver back (I do photo/video work), and if I want to play Callisto I will do it with the Pro driver, Pro Edition 22.Q4, a driver from 14 November 2022. The framerate went from 40 to 50 FPS in a scene I was testing before.
I can now play at 1440p with FSR 2 Quality on Ultra in DX12, no problems.
So it's not the people who made this game at fault; it's actually AMD who messed up with the latest driver.
That's a pretty common issue regardless of brand; different drivers, and different new and old game titles, can routinely lead to differing frame rates. That's not a new issue; hell, there was a website, Tweakforce, that used to be dedicated to modded display drivers and existed for a long time. I think it eventually stopped after Nvidia did something to prevent driver modding, if memory serves me right. That was after probably a decade of the site being devoted to modding drivers for the betterment of users. It was a whole community site and forum dedicated to modded drivers and driver performance, and it would often share information on which driver versions worked best with which titles. There were dozens of occasions over the years when it was better not to use the latest Nvidia drivers, or even to use a more recent or slightly older Quadro driver.
Posted on Reply
#96
the54thvoid
Super Intoxicated Moderator
aciDev**80 tier performance at $900 + VAT is still a loss for us.
While true, AMD's problem from a shareholder perspective would be that the XTX is the highest tier product; they need to price it higher. By all accounts, though, the XT should not be just $100 cheaper, it should be way down in price.
Posted on Reply
#97
Luminescent
There is a difference: Nvidia does it on purpose, while AMD is just cheap and lazy. That last driver said it added support for Callisto and Witcher 3, but for Callisto it did worse and degraded performance.
Posted on Reply
#98
TheoneandonlyMrK
LuminescentWhile the system is stable, AMD DRIVERS ARE BAD!
Proof: a recent AMD-sponsored game that W1zzard tested, The Callisto Protocol, runs very badly on AMD cards. So I investigated which graphics setting causes such poor performance on AMD; it was not ray tracing, it was volumetric quality, which causes a 20-25% performance hit in DX12 with the latest driver, AMD Radeon 22.11.2.
So I said fu..ck it, I put my Pro driver back (I do photo/video work), and if I want to play Callisto I will do it with the Pro driver, Pro Edition 22.Q4, a driver from 14 November 2022. The framerate went from 40 to 50 FPS in a scene I was testing before.
I can now play at 1440p with FSR 2 Quality on Ultra in DX12, no problems.
So it's not the people who made this game at fault; it's actually AMD who messed up with the latest driver.
On a game that just came out? Why so hyperbolic over a week-one driver issue, on a different card than this thread is about? Chill the F out.

The proof I see says more about you.
Posted on Reply
#99
Dirt Chip
The hype train has left the station and everyone's going bananas.
Christ, wait for proper benchmarks...
:kookoo:
Posted on Reply
#100
Beermotor
TheinsanegamerNThat assumes it's a hardware problem at all, instead of it being AMD's drivers being buggy, like with RDNA 1's downclocking issue or the 200/300 series' black screen issues, etc.
This whole chain of posts is an exercise in "tell me I don't know anything about integrated circuit engineering without saying I don't know anything about integrated circuit engineering."

All hardware has bugs, or "errata" as they're called in the business. Intel, AMD, Samsung, Qualcomm, and even Nvidia have had very obvious and/or concerning issues in their hardware that somehow made it through to final silicon.
Posted on Reply