Tuesday, December 24th 2024

AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
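Taking the leak at face value, the quoted percentage deltas imply rough scores for the two Radeon comparison points. A back-of-the-envelope sketch (the RX 7900 GRE and RX 7900 XT figures below are derived from the article's percentages, not independently measured):

```python
# Back-of-the-envelope check of the quoted deltas (illustrative only: the
# RX 7900 GRE and RX 7900 XT scores are back-computed from the percentages
# in the article, not measured results).
rx_9070_xt = 22_894                      # leaked Time Spy graphics score

# "~2% faster than the RX 7900 GRE" implies a GRE score of roughly:
rx_7900_gre = round(rx_9070_xt / 1.02)   # ~22,445

# "~17% behind the RX 7900 XT" implies an XT score of roughly:
rx_7900_xt = round(rx_9070_xt * 1.17)    # ~26,786

print(f"RX 9070 XT (leak):     {rx_9070_xt}")
print(f"RX 7900 GRE (implied): {rx_7900_gre}")
print(f"RX 7900 XT (implied):  {rx_7900_xt}")
```

Both implied scores are in line with typical Time Spy results for those cards, which is why the leak is being treated as plausible but underwhelming.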

However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 series and the upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. These scores are unlikely to be the final word in the AMD RDNA 4 story; we will have to wait and see what AMD delivers at CES, and third-party reviews and benchmarks will give the final verdict once RDNA 4 reaches the market.
Sources: @All_The_Watts, @GawroskiT

204 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

#151
Melvis
It's all about price/performance, and for this Aussie, if it's priced around $600 and performs between a GRE and an XT then it's a winner, and I do not give a flying F about RT......
#152
jh_berg
I could be wrong, but the RX 9070 XT is the successor to the RX 7700 XT. Or not? Comparing its performance with the RX 7900 XT is not fair. IMHO.
#153
kapone32
watzupkenAMD is starting to fall behind the GPU side of things. The new RDNA 4 doesn't seem like a big upgrade from a rasterization perspective. It may make up for it, or maybe catch up, in terms of other features like RT, but unfortunately, only catching up with Intel. The question now is whether AMD will catch up with Intel Arc Alchemist or Battlemage when it comes to GPU features. In fact, I feel Intel made a big stride with Battlemage which makes Intel GPUs more attractive than AMD now.
That is only because the Tech media says so.
#154
john_
jh_bergI could be wrong, but the RX 9070 XT is the successor to the RX 7700 XT. Or not? Comparing its performance with the RX 7900 XT is not fair. IMHO.
By name probably. But what about pricing?
The RX 7600, based on its name, was the successor of the RX 6600. BUT it had the specs of the RX 6650 XT and was selling at the market price of the RX 6650 XT.
So, a successor by name, but not based on specs and price.

If RX 9070XT comes at the current retail price of 7700XT, then yes.
#155
dismuter
john_If RX 9070XT comes at the current retail price of 7700XT, then yes.
I would not expect it to start at the 7700 XT's current $400 price. But it may follow its pricing lifecycle, starting at $450 and gradually coming down to $400.
#156
Visible Noise
watzupkenAMD is starting to fall behind the GPU side of things.
Starting???
#157
mkppo
ymdhisHD5870 was faster per watt. And ran much, much cooler. GTX 480 became a meme for being able to fry an egg...

The GTX 480 was better at tessellating, true, in benchmarks, where you put hundreds of polygons into every pixel. This is completely useless in real-world usage because polygons that small will not get rendered on screen, so it's just a waste of computing power. It's why AMD added tessellation settings into their drivers.

Fermi may have had official support for longer, but how much of that was just the same driver, recompressed with the latest runtime components? I didn't follow the changelogs, did any of them mention specific Fermi-related optimizations for modern games? Genuine question here, I don't know this.
I don't want to open a can of worms, but that's the time I started experiencing the absolutely dirty, dirty tactics that nvidia were pulling and somewhat continue to pull to this day.

Raster was something AMD were, and still are, good at. Fermi was released 6 months after Evergreen and was still significantly inferior to it in raster perf/watt. So what did nVidia do? Push game companies to use unnecessary amounts of tessellation (read: absolutely fucking unnecessary amounts) by paying them directly to do so, because Fermi was good at it. It showed their garbage mentality; they did the same, just in a less obvious way, with RT.

Here's the thing though - in both instances they pushed graphics with unnecessary gimmicks that didn't look better, but performance absolutely tanked on their GPUs and even worse on their competitors'. The consumers were the ones to lose out in the end, while they gained relative performance compared to their competitors.

For its time, Fermi was garbage. Maybe less so than the GeForce FX, which got its ass handed to it by the 9700 Pro, but it definitely lost the battle to Evergreen, a much better balanced architecture released half a year before it. Also remember that the 5870 overclocked significantly better. But hey, you can pay reviewers and game companies, throw around quotations like "it's 40% faster in extreme tessellation", change the consumers' mentality and still sell a ton of half-baked GPUs.

AMD pulled a few of their own shenanigans too, but it's peanuts compared to the awful, slimy crock of BS nvidia pulled for decades.
#158
AcE
john_They tried to streamline it with RDNA for gaming and CDNA
That's not streamlining, that's two different "clean" ways; they tried the two-way approach after having a one-way approach, and now with UDNA (Unified DNA) they are going back to it. The two-way approach was meant to deliver better performance, and it worked, but it made everything very complicated for them software-wise; that's why they're going back again.
john_and that's why Nvidia haven't supported Frame Generation on RTX 3000 cards.
No, FG isn't supported on RTX 30 because it doesn't have the optical flow analyser (a hardware part), which is also stronger on e.g. a 4090 than on a 4060, meaning FG works better on a 4090 than on a 4060. I scrutinised this heavily at the beginning of the RTX 40 series release, btw, until a friend linked me a video of "hacked" FG on an RTX 3090, which did not work well, since it doesn't have that hardware part. It worked "somewhat", which isn't good enough. An nvidia architect also explained this once on Twitter, and he seemed believable.
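As a toy illustration of why frame interpolation without motion data struggles (a deliberately simplified sketch, not how DLSS FG or NVIDIA's optical flow hardware actually works): blending two frames of a moving object produces ghosting, while motion-compensated interpolation, which is what optical flow estimation enables, places the object where it should be in the intermediate frame.

```python
# Toy 1-D "frames": a single bright pixel moving 2 positions to the right.
WIDTH = 8
frame_a = [0.0] * WIDTH; frame_a[2] = 1.0
frame_b = [0.0] * WIDTH; frame_b[4] = 1.0

# Naive interpolation (no motion estimation): blend the two frames.
# Result: two half-bright "ghost" pixels instead of one moving object.
naive_mid = [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

# Motion-compensated interpolation: shift by half the motion vector.
# Estimating that vector per pixel is what optical flow hardware is for;
# here we simply assume it is known.
motion = 2                      # pixels per frame (assumed known)
shift = motion // 2
mc_mid = [0.0] * WIDTH
for i, v in enumerate(frame_a):
    if 0 <= i + shift < WIDTH:
        mc_mid[i + shift] = v   # full-bright pixel lands at index 3

print(naive_mid)   # ghosting: 0.5 at indices 2 and 4
print(mc_mid)      # clean: 1.0 at index 3
```

The software fallbacks mentioned in this thread (FSR 3 frame generation, Lossless Scaling) estimate motion without dedicated hardware, which is why the quality/overhead trade-off is the crux of the argument.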
john_AMD failed with RDNA 3 not because of drivers, but because they failed to predict that a feature like RT, a gimmick that probably doesn't do much or maybe it does, would become a very important parameter
AMD failed with RDNA 3 because of a mix of things. Driver issues, enough to get people talking again about it, which isn't good, not huge ones albeit, RT performance mind share, being inferior there, not enough performance at the high end. Mediocre pricing (7600, 7900 XT until it was reduced). Then even things like FSR having a mediocre reputation, and too many people saying "DLSS is way better" (which isn't necessarily wrong, but also not the whole truth). All in all, a mindshare thing, at the end.
john_While I keep saying a number of things about Nvidia, the buyer of a $500 Nvidia card will only get gimped in VRAM
A $500 Nvidia card gives you a 4060 Ti with 16 GB or a 4070 with 12; I wouldn't call that "gimped". You're gimped with the ~$400 4060 Ti and that's it. The 4060 being a low-end card, you will get what you pay for. And all other cards have normal amounts of vram, including the 4080. If someone was vram-hungry he would get the 4090 anyway; guess why it was outselling so much. $2000 isn't much money if you work with it, btw. The workstation GPUs are much more expensive.
Tech NinjalMAO. The 7900xtx is a midrange card compared to Nvidia. This will be lucky to compete with 5060
It has more shaders than the 7800 XT, and better ones; the frequency is also higher, so I expect it will have at least 7900 GRE performance, which should be enough to compete in the mid-range. And with RT on it should easily have better performance than both, so it could be a good card, but we will see. Again, pricing is probably more important than just performance. Price to performance > *
#159
kapone32
AcEAMD failed with RDNA 3 because of a mix of things. Driver issues, enough to get people talking again about it, which isn't good, not huge ones albeit, RT performance mind share, being inferior there, not enough performance at the high end. Mediocre pricing (7600, 7900 XT until it was reduced). Then even things like FSR having a mediocre reputation, and too many people saying "DLSS is way better" (which isn't necessarily wrong, but also not the whole truth). All in all, a mindshare thing, at the end.
Can you please explain in detail the driver issues with RDNA3? I also like how FSR is "mediocre" when it is the only upscaling tech that supports every single GPU. Meanwhile DLSS is proprietary, gated by the GPU generation. The Mindshare has been created by the Tech media. HUB got threatened by Nvidia, and RT and DLSS became necessities. TPU got 4000 GPUs from PNY and were so happy they posted about it. W1zzard used to use AMD software as a reason to buy AMD, but in the latest GPU review blatantly made no DLSS a negative. What is the issue with that? It creates the narrative that Nvidia are the only choice, as they are the only GPU vendor that supports DLSS.

Meanwhile 7000 users have Rebar for another 10-17% increase in raster and SAM for another 10-17% increase in raster. Do you know what that means? The 7000 users club always has new posts. Not one post about 7000 GPUs has complained about performance. AMD also confirmed that to us. What you don't see is that AMD is in both consoles, and we are getting out of the exclusive era (somewhat) to bring all games to PC. When Hogwarts launched and there were so many complaints about performance, 7000 users wondered what all the noise was about. You see, with technology, the fact that Ryzen and Radeon are in the consoles guarantees that you will have better performance if you have parts that are in the same family but faster. What would I use as evidence? CP2077: in that game I can get up to 180 FPS running raw raster at 4K. RT is for those that want it but is anethema to the PC narrative. You see we used to be all about resolution and raster. 1440P looks better than 1080P and 4K looks better than 1440P. The narrative now says that 1440P is the best option, but that is because only 6 or 7 GPUs can give you high frame rates at 4K. Guess what? That includes the 7900XTX and 7900XT.
#160
john_
AcEThat's not streamlining, that's two different "clean" ways; they tried the two-way approach after having a one-way approach, and now with UDNA (Unified DNA) they are going back to it. The two-way approach was meant to deliver better performance, and it worked, but it made everything very complicated for them software-wise; that's why they're going back again.
I was thinking that splitting the architectures into gaming and compute could help them simplify the chips for each task, and by extension the driver support for each architecture. I think the timing was bad, and it also made RDNA GPUs look extremely simplistic next to Nvidia's offerings.
AcENo, FG isn't supported on RTX 30 because it doesn't have the optical flow analyser (a hardware part), which is also stronger on e.g. a 4090 than on a 4060, meaning FG works better on a 4090 than on a 4060. I scrutinised this heavily at the beginning of the RTX 40 series release, btw, until a friend linked me a video of "hacked" FG on an RTX 3090, which did not work well, since it doesn't have that hardware part. It worked "somewhat", which isn't good enough. An nvidia architect also explained this once on Twitter, and he seemed believable.
Well, AMD's FSR 3.1 and probably even that little application, Lossless Scaling, are enough to make it look like something simple that can be emulated, at least in software. Nvidia could offer a software version to 3000 and 2000 owners and just call the hardware version in 4000 "Premium" or something. The optical flow analyzer sounds interesting, and an architect will know how to convince the audience that a flow analyzer is the only way to make FG happen. I mean, an expert can throw inaccuracies around in such a way that they would be believable excuses. You should ask your friend to try something that isn't hacked and could be performing badly because it is hacked. Ask him to try FSR 3.1 in a game where he can combine it with DLSS and see then if a flow analyser is needed, or if that flow analyser is in fact a BS flow distributor.
AcEAMD failed with RDNA 3 because of a mix of things. Driver issues, enough to get people talking again about it, which isn't good, not huge ones albeit, RT performance mind share, being inferior there, not enough performance at the high end. Mediocre pricing (7600, 7900 XT until it was reduced). Then even things like FSR having a mediocre reputation, and too many people saying "DLSS is way better" (which isn't necessarily wrong, but also not the whole truth). All in all, a mindshare thing, at the end.
One major problem was the high power consumption in video playback. I still can't understand why AMD (and Nvidia?) can't offer a video playback engine that uses as little power as a SOC on a smartphone. I doubt a smartphone SOC needs more than 5-6 watts to play back a high-resolution video. And then you see the 7000 series asking for 40-60W, if I remember correctly, before somewhat fixing it to less ridiculous levels. As for pricing, I keep insisting on a theory that a company without enough wafers can't sell at good prices. If their competitor, with better access and better deals at the same factory, decides to drop prices, the whole Radeon line will go DOA. So you don't provoke Nvidia. But I could be wrong here. Even with mediocre pricing, though, AMD could sell more cards if those cards had at least twice the performance of RX 6000 in RT. That bad performance in RT was a huge disadvantage that was used against them. As for FSR, I think that while it is not at the level of DLSS, it is at an OK level for the average gamer. But I also think that Nvidia feared a second FreeSync, so they orchestrated an online campaign to make FSR look worse than it is, to limit its success. It's interesting that FSR 1.0 had a less negative response than FSR 2.x and even FSR 3.x from reviewers. I believe this happened because the first reviews of FSR were unbiased, while by the time FSR 2.x was out, Nvidia had had enough time to convince reviewers to focus on any disadvantages of FSR compared to DLSS, enlarge them and convince their audience that FSR was completely useless.
AcEA $500 Nvidia card gives you a 4060 Ti with 16 GB or a 4070 with 12; I wouldn't call that "gimped". You're gimped with the ~$400 4060 Ti and that's it. The 4060 being a low-end card, you will get what you pay for. And all other cards have normal amounts of vram, including the 4080. If someone was vram-hungry he would get the 4090 anyway; guess why it was outselling so much. $2000 isn't much money if you work with it, btw. The workstation GPUs are much more expensive.
Forget the RTX 4060 Ti 16GB and RX 7600 XT. They are useless cards where you search to find a game at specific settings that make these cards have a reason for existence. These cards were created to convince gamers that 16GBs are overrated (RTX 4060 Ti 16G) or as a response to the competition (RX 7600 XT).
The 4070 is probably strong enough to take advantage of 4 extra GBs, and Nvidia probably knows it. Nvidia is using VRAM as a way to limit the life of a good product for many many years. If Nvidia could offer 16GBs at $400, it could definitely offer those 16GBs for $600, the 4070's MSRP. But they didn't. They put 16GBs on a card that can't use them, and they limit the VRAM capacity on a card that can use it. That way both products will have limited life before they start losing in benchmarks.
kapone32anethema
anathema
#161
igormp
kapone32Meanwhile 7000 users have Rebar for another 10-17% increase in raster and SAM for another 10-17% increase in raster.
It doesn't work like that, SAM is just AMD's ReBar implementation with some sprinkles on top.
#162
AcE
kapone32The Mindshare has been created by the Tech media.
No, mindshare is mostly with users; of course reviewers are victims of it too, being biased as well sometimes. Do you know what's better than every ad and review on the planet? People telling others what to buy. This is what Nvidia profited from the most, not reviews and not ads. And mindshare mostly comes from merit, so any mindshare Nvidia had was mostly deserved.
john_It's interesting that FSR 1.0 had a less negative response than FSR 2.x and even FSR 3.x from reviewers. I believe this happened because the first reviews of FSR were unbiased
FSR 1 was so welcomed because it ran on everything and we got an alternative, while reviewers still acknowledged it had inferior IQ; with FSR 2 and later they always compared it more with DLSS and scrutinised it, because the newcomer factor had vanished. As a lot of people are doing that, you cannot go around and say they were all paid off by Nvidia to do so. I also had issues accepting the fact that DLSS is (way) better, but it is. Often, though, it's because the implementation of FSR isn't good, which is a dev issue, maybe because they simply care less about optimising for GPUs that only 10-15% of the user base uses. Maybe because Nvidia helps devs more with their work. These are all factors in it. The Radeon GM, not long ago, acknowledged the fact that devs have a hard time caring about Radeon if the market share is too low; that's why AMD wants to focus with RDNA 4 on getting more market share first, before putting a lot of money into high-end cards again.
john_Forget the RTX 4060 Ti 16GB and RX 7600 XT. They are useless cards where you search to find a game at specific settings that make these cards have a reason for existence.
Yet I had a very long discussion not long ago here, where people could not accept the fact that 8 GB is enough for a low-end card; it's too funny.
john_Nvidia is using VRAM as a way to limit the life of a good product for many many years.
No, they are using the lowest amounts possible so they can maximise their bottom line; GPUs are historically rarely impacted by vram, they are usually usable for years without problems until the GPU becomes too slow anyway. AMD used to give a lot of vram and it rarely helped much; only down the line, several years later, did it, but how many people use their GPU like that? Not many. The R9 390X had double the vram with the same chip as the 290X; it never really helped, it only helped after the GPU was already old. The 780 Ti had only 3 GB of vram, but it was never an issue in the relevant lifetime of the card; especially keep in mind that it's a high-end GPU, and users like those often replace their GPUs sooner. We can also give another example the other way around: the Fury X "only" had 4 GB of vram (despite the R9 390X of the same gen having 8 GB) and the 980 Ti had 6 GB - for years it was never an issue; only when the card was already old and outdated did it slowly become one. The vram argument is in general overstretched and overdramatised, especially with reviewers who only use Ultra settings and can't for the life of them consider different, more sensible settings. It's like they're trying so hard to make their point, grasping at straws with nonsensical settings and then pretending it's all normal. Ultra settings are a luxury and shouldn't be treated as the norm.
#163
john_
@AcE We are seeing things from different angles.
#164
Visible Noise
kapone32You see we used to be all about resolution and raster.
I 100% agree with this statement. That was the past.

The rest of your rambling is just your daily ranting hogwash. Seriously, every post you make is the same tired anti-Nvidia falsehoods.
#165
kapone32
Visible NoiseI 100% agree with this statement. That was the past.

The rest of your rambling is just your daily ranting hogwash. Seriously, every post you make is the same tired anti-Nvidia falsehoods.
Hogwash? From someone that does not even list their Specs? Next time you see me go on an Nvidia focused thread and talk about the benefits of AMD cards you can attack me.
igormpIt doesn't work like that, SAM is just AMD's ReBar implementation with some sprinkles on top.
SAM and Rebar are the same thing?
#166
Dawora
DahitaThey would bother with it to gain market shares and make more money. That's the concept of running a business.

Now, how on earth would a 7900GRE with 5% more perf at $500 be "an amazing deal" when the 4070 SUPER has been available at this price and in this area of performance for 6 months now? What's next, a 9080XTX at $1100 with the perfs of a 4080 SUPER?!
Because $500 is cheap and a great deal if you're buying an AMD GPU, but if you're buying Nvidia it's expensive..

That's how the brain works if you own AMD hardware.
#167
Hecate91
Visible NoiseI’m confused. How does Nvidia’s so-called greed damage the gaming market (whatever that is) such that people think it’s the only brand to buy?

Rational people would see this so called greed and avoid it. I mean it couldn’t be performance and features that make Nvidia popular, right?

BTW, I don’t play brand wars. I’ll bet I have more AMD equipment in my office than you have ever bought in your entire life.
You do realize Nvidia has 90% market share, right? A company having a near-monopoly over the dedicated GPU market isn't healthy, and neither are people and the tech press pushing the bias of recommending Nvidia no matter what. Most people aren't rational when it comes to buying a GPU; they just see a green box and buy it. It isn't because of the performance or features because if it was people would be buying AMD for the better rasterized performance and no feature lock in.
The company you work for buying AMD hardware doesn't count, CPU's don't count either, and I know the people that are always crapping on AMD never even consider an AMD graphics card because they're part of the mindshare.
Visible NoiseI 100% agree with this statement. That was the past.

The rest of your rambling is just your daily ranting hogwash. Seriously, every post you make is the same tired anti-Nvidia falsehoods.
It's not the past; there was a poll here and most people voted for rasterization over ray tracing, and when most GPUs sold are in the xx60 class, which isn't capable of RT without fake frames, most people don't care about RT either. Although Nvidia and the tech media have pushed RT as being the best thing ever, to the point that game devs are forcing RT on by default.
This is a AMD thread though, you're the one always bashing on AMD in these threads.
#168
Visible Noise
kapone32Hogwash? From someone that does not even list their Specs? Next time you see me go on an Nvidia focused thread and talk about the benefits of AMD cards you can attack me.
What does my computer specs have to do with anything? Which specs should I list? My gaming machines, my workstation, my laptops, my servers?

I don’t play childish epeen games, that’s why my specs aren’t listed. I mean seriously, what brand of chip is in a device that isn’t yours matters that much to you? Trying to put me on a team isn’t going to work, because being a fanboy of a consumer brand is freaking stupid.
#169
kapone32
Visible NoiseWhat does my computer specs have to do with anything? Which specs should I list? My gaming machines, my workstation, my laptops, my servers?

I don’t play childish epeen games, that’s why my specs aren’t listed. I mean seriously, what brand of chip is in a device that isn’t yours matters that much to you? Trying to put me on a team isn’t going to work, because being a fanboy of a consumer brand is freaking stupid.
I don't know, common courtesy? If you think that I buy AMD because I am a fan boy you don't understand what the 7900 cards represent for AMD users.
#170
Visible Noise
Hecate91It isn't because of the performance or features because if it was people would be buying AMD for the better rasterized performance and no feature lock in.
Citation needed
Hecate91The company you work for buying AMD hardware doesn't count,
I wasn’t talking about my company, I was talking about my home office.
Hecate91CPU's don't count either
More silliness.
Hecate91there was a poll here and most people voted for rasterization over ray tracing
Let me guess, you’ve never taken a stats class, and have no idea why a poll taken here isn’t representative of any population outside of the people that answered the poll. Look up selection bias.
Hecate91This is a AMD thread though
You wouldn’t know it by how often you bring up Nvidia.
Hecate91you're the one always bashing on AMD in these threads.
Not always, only when they deserve it. If that’s too frequent for you, that’s your problem. The adults will continue the discussion even if the facts offend your feels.
kapone32I don't know, common courtesy? If you think that I buy AMD because I am a fan boy you don't understand what the 7900 cards represent for AMD users.
You just put yourself on a team and I don’t think you are even aware of it. Why do you self identify as an AMD user and not computer user? Isn’t that the definition of a fanboy?

Please explain to me what the 7900 means to “AMD users”. I’m fascinated by people that have an emotional attachment to a manufactured commodity. Especially people that think AMD is different from Nvidia, Intel, or Volkswagen.
#171
kapone32
Visible NoiseYou just put yourself on a team and I don’t think you are even aware of it. Why do you self identify as an AMD user and not computer user? Isn’t that the definition of a fanboy?
Indeed, listing your specs puts you in a Camp. I can't believe we are living the pages of Mad Magazine Spy vs Spy. I use AMD but I don't tell people not to use anything else. You see the first word is missing from your tangent. Personal. That means you.

If I am a fan boy of anything it is Thermalright CPU Coolers that I am indeed guilty of.
#172
Hecate91
Visible NoiseCitation needed
The fact that cards like the 3060 and 4060 sell the most is evidence enough; the average gamer blindly buys Nvidia because of the brand status. They think a 4060 is better because they see the halo-tier card and assume a xx60 card will be something special.
Visible NoiseI wasn’t talking about my company, I was talking about my home office.
Sure dude, I know people who constantly bash on AMD are allergic to buying anything but the latest from the leather jacket man.
Visible NoiseMore silliness.
The silliness is you getting upset by people pointing out the obvious regarding Nvidia in an AMD thread.
Visible NoiseLet me guess, you’ve never taken a stats class, and have no idea why a poll taken here isn’t representative of any population outside of the people that answered the poll. Look up selection bias.
The poll represents the community here; it's a decent enough representation of what enthusiasts care about.
Visible NoiseYou wouldn’t know it by how often you bring up Nvidia.
And you're defending Nvidia in an AMD thread.
Visible NoiseNot always, only when they deserve it. If that’s too frequent for you, that’s your problem. The adults will continue the discussion even if the facts offend your feels.
Why do you even care if you hate AMD so much? The adults here are having a civil discussion, you seem to be taking it too personally.
Visible NoiseYou just put yourself on a team and I don’t think you are even aware of it. Why do you self identify as an AMD user and not computer user? Isn’t that the definition of a fanboy?

Please explain to me what the 7900 means to “AMD users”. I’m fascinated by people that have an emotional attachment to a manufactured commodity. Especially people that think AMD is different from Nvidia, Intel, or Volkswagen.
If you think listing specs puts you on a team, then you're clearly on a team yourself. Being a hardware enthusiast doesn't mean you have to act like it's a brand war; it's silly that people get so defensive when someone points out the facts about their favorite brand. I don't see AMD users here constantly ridiculing others and telling people what brand to buy, but I see it all the time with Nvidia; even some of the staff take threads off topic to obsessively push Nvidia.
I avoid posting my specs here because I know the mindshare will only go after me even more for it, and as for 7900 users, they aren't listening to the biased nonsense and don't care about gimmicky features; some just want to play games without having to replace their GPU in a year because of insufficient VRAM.
#173
Visible Noise
Hecate91The fact that cards like the 3060 and 4060 sell the most is evidence enough; the average gamer blindly buys Nvidia because of the brand status. They think a 4060 is better because they see the halo-tier card and assume a xx60 card will be something special.
GREAT citation.

As for the rest, thanks for proving my point AGAIN. You just can’t stop talking about Nvidia in an AMD thread. Do you have all your points saved in notepad so you can just copy-paste them multiple times a day? Or do you use Onenote?
#174
DaemonForce
Nah. Some of us chose the 7900 cards because of complete packaging and a particular design. My favorite target (7900XT) is currently the only in stock model breaking rank at sub-600 Open Box. It has more than enough vram, raster, cooling and encode that there's no point in looking at anything other than googling "7900XT problems" and either saying "doesn't matter to me" or "yeah I'll deal" before pulling the trigger on that purchase. Notice how I haven't mentioned RT/PT in that equation. We are clearly interested in these cards for completely separate reasons.

That precious GRE lineup isn't coming back (for obvious reasons), and the 7800XT is more than enough for the mainstream desktop, so the rest literally doesn't matter. I need the additional resolution for Desktop+VR+Compute+Rec+Stream, which is a sliver of an imaginary percentage of users, let alone gamers who double as developers. At this point we're just waiting for whatever groundbreaking features hit the spotlight. The 9070 is going to have a very warm welcome if it hits the shelves at 7900 GRE pricing; otherwise the adjacent SKUs get a price drop while the rest rot. Either way I win.
#175
igormp
kapone32SAM and Rebar are the same thing?
Yes, but not quite; there are some minor specifics on SAM (SAM is basically rebar with some extra optimizations on top).
The way you said it sounded like you could enable rebar, get 10~17% perf, and then add SAM to get another 10~17% on top of the previous one, which is not the case.
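The non-stacking point can be put in plain numbers (illustrative figures only; 15% is a hypothetical mid-point of the 10-17% range quoted above):

```python
# SAM is AMD's branding of Resizable BAR, so the quoted 10-17% is ONE uplift.
# Illustrative numbers only; 15% is a hypothetical mid-point of that range.
base_fps = 100.0
uplift = 0.15

with_rebar_or_sam = base_fps * (1 + uplift)          # ~115 FPS: the real case

# If SAM were a separate feature on top of ReBAR, gains would compound
# multiplicatively - but since it's the same feature, this never happens:
hypothetical_stacked = with_rebar_or_sam * (1 + uplift)   # ~132 FPS, not real

print(with_rebar_or_sam, hypothetical_stacked)
```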