Tuesday, February 6th 2024

AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER

Prices of the AMD Radeon RX 7900 XT graphics card have hit new lows, with a Sapphire custom-design card selling for $699 with a coupon discount on Newegg. This puts it a full $100 (12.5%) below the recently announced NVIDIA GeForce RTX 4070 Ti SUPER. The most interesting part of the story is that the RX 7900 XT is technically from a segment above. Originally launched at $900, the RX 7900 XT is recommended by AMD for 4K Ultra HD gaming with ray tracing, while the RTX 4070 Ti SUPER is officially recommended by NVIDIA for maxed-out gaming with ray tracing at 1440p, although throughout our testing, we found the card to be capable of 4K Ultra HD gaming.

The Radeon RX 7900 XT offers about the same performance as the RTX 4070 Ti SUPER, averaging 1% higher than it in our testing at the 4K Ultra HD resolution. At 1440p, the official stomping ground of the RTX 4070 Ti SUPER, the RX 7900 XT comes out 2% faster. These are, of course, pure raster 3D workloads. In our testing with ray tracing enabled, the RTX 4070 Ti SUPER storms past the RX 7900 XT, posting 23% higher performance at 4K Ultra HD, and 21% higher performance at 1440p.
Source: VideoCardz

132 Comments on AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER

#76
RGAFL
theouto: You don't get it, AusWolf, we MUST discuss and fight about products we will realistically never even breathe on, it is a NECESSITY.
This is what I don't get, to be honest. I honestly can't get my head around why people defend any company. I don't and I can't. To them, I'm a 0.0000001% (if that) blip on their bottom line. What I can't abide is people posting a benchmark as gospel when a 5-minute Google search dispels it, then saying "well, look at the sales"; sorry, but that loses the argument right there. One is better at some games, the other is better at others. What is there to defend? Isn't that what should be happening? I don't look at the features, I look at the games. Maybe that's just me.
Posted on Reply
#77
Random_User
AusWolf: Guys, come back to the real world, please! This card is 10% faster in this game, that card is 5% faster in that game... Who gives a F, seriously? :D Whether you have a 7900 XT or a 4080, I bet you're equally happy. :)
Couldn't be more true. Words of wisdom. That couple-percent difference is not crucial; it's nothing like the gap between an iGPU and a dGPU, and even that is getting watered down these days.

You made your choice; you had some reasoning behind the purchase of the particular card you have. It doesn't matter anymore if someone else's card is faster, sometimes within the margin of error.
Some people don't even use RTRT, even while having a high-end RTX card. That's their choice. If you don't like the card, or the company behind it, that someone else uses... well, it's not your business. Go enjoy your <whatever GPU brand and color TM>.
Posted on Reply
#78
Vayra86
RGAFL: This is what I don't get, to be honest. I honestly can't get my head around why people defend any company. I don't and I can't. To them, I'm a 0.0000001% (if that) blip on their bottom line. What I can't abide is people posting a benchmark as gospel when a 5-minute Google search dispels it, then saying "well, look at the sales"; sorry, but that loses the argument right there. One is better at some games, the other is better at others. What is there to defend? Isn't that what should be happening? I don't look at the features, I look at the games. Maybe that's just me.
Nah, it's not, don't worry. Some people are just people. I'll leave it at that, and we all should.
Posted on Reply
#79
evernessince
theouto: Maybe I am remembering wrong, but I recall HUB saying that something wrong was going on with their AMD tests. And even if they didn't say so, it's clear something is wrong when the card performs the same at both 1080p and 1440p, and nigh identical at 4K, but the competition doesn't.
I definitely do not remember if something like that happened, but I don't see how a bug / idiosyncrasy could give AMD cards more FPS. Less I could understand, but not more. Looking at other benchmarks around the web, the 4090 and 7900 XTX appear to be neck and neck at different settings (ultra). At the end of the day, a change in settings / FPS doesn't alter any conclusions I've made. I do know there were versions of CoD Warzone where certain game version and driver combinations would yield very poor performance. CoD Warzone is a great example of a game where you could cherry-pick results with any outcome you want based on the driver, game version, and settings used, because performance has been all over the place.
Posted on Reply
#80
GhostRyder
The problem with the constant argument about ray tracing is that it's the same argument used with the likes of PhysX, GameWorks, or any other system they have developed (AMD is included in this statement, as people have made similar arguments) on the card. In the end, it is irrelevant to the primary point, which is the overall performance of the card; the power consumption/temps (IF they differ at significant enough levels to affect users) and even the size are much more important to the overwhelming majority of users. Ray tracing is on both, but it's a gimmick at the end of the day that kills performance for slightly better clarity in certain situations. There is no rule saying you can't love a certain feature, but people constantly railing that it's a game changer and trying to make it the only thing people should care about when buying a card are just singing the same tune every time one of these new gimmicks comes out.

If people want change, they have to make the first step instead of making every excuse in the book to purchase from the brand they are supporting. If you want the best video card on the market, it's the 4090, there is no doubt about it. The 7900 XTX is slightly better than the 4080 overall, but both are very close (the 7900 XTX can be had cheaper, however). The 7900 XT is better than the 4070 Ti SUPER, but by a closer margin than against the 4070 Ti (and now it's quite a bit cheaper). The fact that we have so much brand loyalty out there is what is killing the GPU market.
Posted on Reply
#81
evernessince
nguyen: Weird when Warzone is the cherry-picked best case scenario for the 7900 XTX to begin with LMAO, all other esport games play better on Nvidia anyways. Funny how the latest AMD drivers broke the performance in Warzone, the only esport game that AMD is good at :roll:
The game's performance has always been all over the place. It is notoriously inconsistent regardless of which vendor's card you have, and performance has shifted with driver and game updates, both up and down.
nguyen: BTW I'm a very good esport player myself and I play with low settings all the time, always having the highest end gaming GPU and monitor certainly give me the competitive advantage :rolleyes:.
If you want a competitive edge in CoD, just use any of the numerous controller remap programs to trick the game into thinking you are using a controller, which gives you free auto-aim (or any of the programs designed to do that without even the slight hassle). It's not even considered cheating. "Competitive" and "CoD" together are an oxymoron. The game has aimbot built in, no CoD has ever been balanced, and it has never been considered an eSports title.
nguyen: There is something wrong in your thinking that people with high-end GPUs must play only single player games :kookoo:
Don't put words in my mouth. Never said that.
nguyen: I guess you have never played esport, capping FPS only works half the time.
First, CoD is not an eSport. There's a stronger argument that you have never played eSports if all you've played is CoD. Second, I reached Masters in OW when that game was actually good (it's trash now).
nguyen: If you cap the FPS too high and get GPU-bound mid gunfight you will get very noticeable, very bad input latency when you need it the most; cap FPS too low and you are losing the latency reduction of higher FPS (e.g. capping at 120 FPS gives worse input latency vs 200 FPS with Reflex). Uncapped FPS + Reflex is simply the best solution for esports.
Sounds like a skill issue: you can't properly cap FPS. In a hypothetical scenario where you cap your FPS, say, 10 below the ideal to ensure consistent input latency (which is really what's penalized when your GPU reaches 100% utilization), the difference in latency in a game that runs at 280 FPS is 0.13 ms. You're no Faker, bro; there is a 0% chance you are seeing that, let alone being harmed by a 0.13 ms difference. Especially in a game like CoD with crap netcode that makes getting killed by superbullets (multiple hits registering at once) and hits behind walls common. It's not like you can't pull up the settings menu and adjust the cap at any time if you do experience a latency increase during a fight; I'm not really sure what you are arguing here other than being either lazy or incompetent.

You are trying to make an argument that would typically apply to eSports titles, but in the case of CoD there are far, far greater issues that plague the game.
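
For what it's worth, that 0.13 ms figure is just frame-time arithmetic. A minimal sketch, assuming a hypothetical 290 FPS uncapped maximum and a cap 10 FPS below it:

```python
# Frame-time cost of capping ~10 FPS below the maximum. Pure arithmetic;
# the 290 FPS ceiling is an assumption for illustration.

def frame_time_ms(fps: float) -> float:
    """Duration of one frame, in milliseconds, at the given frame rate."""
    return 1000.0 / fps

uncapped, capped = 290.0, 280.0
delta = frame_time_ms(capped) - frame_time_ms(uncapped)
print(f"{frame_time_ms(capped):.2f} ms vs {frame_time_ms(uncapped):.2f} ms "
      f"-> only {delta:.2f} ms added per frame")  # ~0.12-0.13 ms
```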
Vayra86: It's complete nonsense. The overwhelming majority of the supposed advantages in latency in CS is placebo. Just because you can measure it doesn't mean it factually improves your performance.

The human factor determines your performance. Irrespective of equipment. Equipment can only nudge that performance slightly higher. The amount of gamers that are pro enough to even get that nudge and prove it helps them is certainly not equal to the amount of supposed pro's that think they get it.

Rigorous training and building muscle memory is where its at. This means a lot more than playing a lot of CS. It means actually doing IRL sports to improve your gaming, short sessions of gaming, and full control.
I'd have to agree; there likely isn't much, if any, benefit to 700 FPS over 500, for example, in regard to latency. I could maybe understand a very slight benefit to the ability to predict movement thanks to the additional frames, but even still, you are talking 500 FPS vs 700, both of which are very high. Heck, going from a 144 Hz to a 240 Hz screen was a small upgrade to my eyes, and most eSports players that have been asked about the difference seemed to agree; most said that the benefits really cap out at 360 Hz. That said, there is a separate factor of image sharpness. A higher refresh rate monitor can improve the sharpness of motion, but there are other technologies that address that as well (ULMB2, for example). I think these additional factors make the conversation more complicated, but at the very least we can conclude that the benefits from higher FPS / refresh rates have reached extremely diminishing returns.
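
To put numbers on those diminishing returns: each step up in FPS or refresh rate shaves an ever-smaller slice off the frame interval. A quick sketch using the rates mentioned above (purely illustrative):

```python
# Milliseconds saved per frame by each jump in FPS / refresh rate.
# The rate pairs are just the examples mentioned in this thread.

for lo, hi in [(144, 240), (240, 360), (500, 700)]:
    saved = 1000 / lo - 1000 / hi  # ms shaved off the frame interval
    print(f"{lo:>3} -> {hi:>3}: {1000/lo:.2f} ms -> {1000/hi:.2f} ms "
          f"(saves {saved:.2f} ms)")
# 144 -> 240 saves ~2.78 ms, 240 -> 360 ~1.39 ms, 500 -> 700 only ~0.57 ms.
```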
Posted on Reply
#82
Vya Domus
GhostRyder: The problem with the constant argument about ray tracing is that it's the same argument used with the likes of PhysX, GameWorks, or any other system they have developed (AMD is included in this statement, as people have made similar arguments) on the card. In the end, it is irrelevant to the primary point, which is the overall performance
I would have agreed, but this is different. PhysX, GameWorks and whatnot were nothing more than add-ons, arguably irrelevant additions. RT has become an integral selling point, and every new feature/metric revolves around it; you won't see a single piece of marketing from Nvidia that doesn't feature RT or something tangentially related to it, i.e. upscaling/frame generation. It's something that has completely warped people's perception of performance and value.
Posted on Reply
#83
3valatzy
AusWolf: For a start, AMD, Intel and Nvidia are for-profit companies. Their main goal is to keep the investors happy. Talking about non-profit companies is irrelevant here.

Secondly, society functions without high-end graphics cards just fine, it has for millennia. They certainly have an effect on society, but they're not necessary by any means. Therefore, AMD, Intel and Nvidia are not obliged to lower their profit margins as long as customers are happy to pay up. Why would they? If you could sell a loaf of bread for 100 bucks, tell me you wouldn't. ;)
A loaf of bread is worth pennies, not hundreds. Of course I would not. But I would gladly sell an idea for 100 bucks.
Also, these high-end graphics cards, or the chips inside them, are also used in supercomputers, which definitely have a more important function: to compute all kinds of problems. You mentioned global warming, medicine solutions, and even, if you wish, the capitalist system, which can't last forever because it has its own disadvantages; someone mentioned bursting bubbles, etc.
Vayra86: Chips have followed a similar trajectory. For 30 years or more, there was low hanging fruit. That's gone now. Graphics, similarly, have advanced at a similar pace. Low hanging fruit is gone. A lot of the recent developments are really not developments, but disruptions to set a new status quo. RT is an example of that in gaming. 'Let's make a new bar to place/compare cards on, so it looks like we're still giving more'. Meanwhile, raster perf, the basis under all performance, is no longer advancing as fast as it used to and the price per FPS is getting stagnant gen to gen.
CPUs are cheap, while graphics cards are expensive. I don't see anything here but random price-setting by someone who dictates it behind the scenes.
Vya Domus: I would have agreed, but this is different. PhysX, GameWorks and whatnot were nothing more than add-ons, arguably irrelevant additions. RT has become an integral selling point, and every new feature/metric revolves around it; you won't see a single piece of marketing from Nvidia that doesn't feature RT or something tangentially related to it, i.e. upscaling/frame generation. It's something that has completely warped people's perception of performance and value.
PhysX, like DLSS, is a proprietary feature, while ray tracing is part of the Microsoft DXR specification, which everyone agreed on, including AMD, which seems not interested at all.
Posted on Reply
#84
GhostRyder
Vya Domus: I would have agreed, but this is different. PhysX, GameWorks and whatnot were nothing more than add-ons, arguably irrelevant additions. RT has become an integral selling point, and every new feature/metric revolves around it; you won't see a single piece of marketing from Nvidia that doesn't feature RT or something tangentially related to it, i.e. upscaling/frame generation. It's something that has completely warped people's perception of performance and value.
I feel I have to disagree; I mean, there were tons of marketing pushes for PhysX and GameWorks. Many games were trotted out with those labels all over them, and Nvidia had conferences/announcements around them (Arkham Asylum, as an example). I don't really see this as much different (sure, it's not locked down nearly as much as the previous examples, but it's still niche) in the gaming world, except maybe the effect is more noticeable than in the past. Just because they named their cards with an R at the beginning and market them as having improvements around it doesn't mean it's going to be the defining factor in the gaming sphere.

I am still fine with someone saying that's more important specifically to them because they want to run Cyberpunk at 4K Ultra with RT maxed, but even with that argument, the only card that runs ray tracing decently in the titles where it is noticeable is the 4090, which is very expensive. Even then, it is still a huge performance hit on that card.
Posted on Reply
#85
Vayra86
evernessince: I'd have to agree; there likely isn't much, if any, benefit to 700 FPS over 500, for example, in regard to latency. I could maybe understand a very slight benefit to the ability to predict movement thanks to the additional frames, but even still, you are talking 500 FPS vs 700, both of which are very high. Heck, going from a 144 Hz to a 240 Hz screen was a small upgrade to my eyes, and most eSports players that have been asked about the difference seemed to agree; most said that the benefits really cap out at 360 Hz. That said, there is a separate factor of image sharpness. A higher refresh rate monitor can improve the sharpness of motion, but there are other technologies that address that as well (ULMB2, for example). I think these additional factors make the conversation more complicated, but at the very least we can conclude that the benefits from higher FPS / refresh rates have reached extremely diminishing returns.
Everything I state with conviction (unless I specifically say otherwise; then I'm guessing and asking) is based on personal experience as well. I've travelled past quite a few shooters, monitors and online environments between the super casual and the pro scene... I've dived deep into monitor refresh latencies and lowering them, and always sought the most 1:1 experience between input and response. Simply because it feels good. And when it feels good, it's good. Moving the bar further just won't help much, if at all. If you remove all your barriers and have the sense that it's just you, the screen and your actions in the game, that's where you should be. If that is 120 Hz, it's 120. If it's 240, fine too. With the caveat that the higher you go, the more prone you are to instability in framerate/frametime, which is FAR worse than any arbitrary FPS limit you set.

Muscle memory is trained on stable latencies. I played pretty competitive Guild Wars stuff and raided on shitty laptops, taking down raid bosses in WoW at 10 FPS, even leading raids with another CPU hog in the background (Ventrilo or TS). Throttling? I didn't have a clue what that was, but yeah, either the server ponied up high latencies/low FPS, or the hardware wouldn't handle all the assets properly. Whatever. If you do it 10 times, you know when to push that button. Overall, if you've played online a good ten thousand hours, you know what latency is and how to adapt your input to still land everything at the right time.

Variable latency? You can safely forget about your game performance increasing. But on any stable latency, you can train.
Posted on Reply
#86
Super XP
Dr. Dro: News tends to be very focused on the pampered American market, which responds almost instantly to pricing changes enacted by companies. Out here we don't get FE cards, we barely get any AIB-distributed first parties, and prices take months to change, if they ever do...

Even if it was $200 cheaper, I would have a hard time justifying a 7900 XT over a 4070 Ti SUPER, though.
Going with either GPU is subjective. The 7900 XT is the faster card excluding ray tracing, where it runs approx. 6% worse. That said, RT is overrated IMO. I personally don't like how it looks. Maybe sometime in the future it will get better and look better. But right now, no thanks.

Also, buying a GPU nowadays solely based on RT is a pretty bad idea, because no GPU, not even the $2000+ overpriced Nvidia cards, can successfully utilize RT without tanking performance and implementing more gimmicks to gain some of that performance back. Lol
Posted on Reply
#87
AusWolf
evernessince: I'd have to agree; there likely isn't much, if any, benefit to 700 FPS over 500, for example, in regard to latency. [...] at the very least we can conclude that the benefits from higher FPS / refresh rates have reached extremely diminishing returns.
If I ever decided to do eSports, I would most definitely be hindered by my 35 Mbps internet connection that gets unstable as heck once the missus starts watching anything better than 720p on YouTube. Whether my PC can render 7 million FPS, and whether I have Jedi reflexes or not is totally irrelevant here in the Great British Midlands. :laugh:

TLDR: Doesn't the internet have the highest latency of all "components"?
Posted on Reply
#88
Super XP
Vayra86: Even the planet follows a trajectory like that: humanity has clearly peaked, and the jury is still convening whether we are in decline or just slowly keep climbing as a species. But realistically all signs are red: climate, resources, population... we've exploded and we're fat, bloated, and fill societies with frivolities and wasteful practices. Are we happier though? I don't think so. More people than ever in therapy of some kind or another. Healthcare expenses explode like our bellies do. Wealth makes lazy. Lazy makes unhealthy. We're getting stagnant gen to gen, but fatter. Top seller and recent hype of 2024? A miracle medicine that promises weight loss. Go figure.

So do you really keep getting more for less, or is this just a matter of perception? What IS more?
Bloated humanity? Humanity peaked? Climate, Resources, Population?

This is all nonsense, globalists have been spreading this lie for ages now. Don't buy into those lies.
Posted on Reply
#89
AusWolf
3valatzy: A loaf of bread is worth pennies, not hundreds. Of course I would not. But I would gladly sell an idea for 100 bucks.
Also, these high-end graphics cards, or the chips inside them, are also used in supercomputers, which definitely have a more important function: to compute all kinds of problems. You mentioned global warming, medicine solutions, and even, if you wish, the capitalist system, which can't last forever because it has its own disadvantages; someone mentioned bursting bubbles, etc.
Oh, I bet you'd pay 100 bucks if you were extremely hungry, and the only loaf of bread accessible to you was in my possession. ;) The fact that making it costs pennies is irrelevant to the final retail price. If you're hungry, you'll buy it.

This is the kind of situation most gamers wrongly assume themselves to be in when they complain about GPU prices, but then go ahead and buy one from the top shelf anyway. The only difference is, we don't need that GPU. We just want it when we could make do with any other model. We are at fault for prices. If we all refused to pay thousands for a mere toy, then it wouldn't cost thousands; it's that simple. But we don't refuse, because we're sheep and we believe in stupid, nonsensical slogans like "it's just the way it is" or "these are harsh times". No, it's not the way it is. We make it this way ourselves.

And I'm talking about gaming GPU prices. Supercomputer and datacentre GPUs are totally different, and so are their prices. If I'm wrong, then I'd gladly hear about your experiences with your 750 W Radeon Instinct super GPU in games. :)
Posted on Reply
#90
Vayra86
AusWolf: If I ever decided to do eSports, I would most definitely be hindered by my 35 Mbps internet connection that gets unstable as heck once the missus starts watching anything better than 720p on YouTube. Whether my PC can render 7 million FPS, and whether I have Jedi reflexes or not is totally irrelevant here in the Great British Midlands. :laugh:

TLDR: Doesn't the internet have the highest latency of all "components"?
Sure, but latency is of course a stack of things, the internet being one of them. It is true you can get substantially lower latencies with the right setup. A solid net connection offers, on average, a stable ping of around 16-17 ms if the server isn't on another continent. There's at least as much latency on the local side that you can work with too, or at least on some parts of the pipeline. But really, above 60 FPS, anything you tweak that lowers latency by getting higher FPS... maybe you'll win 10 ms if you're really lucky. That's still less than 30% of your total latency. Input (click) latency is easy to get around; 2-3 ms is possible in-game. But even there, you've won what, 3-5 ms over any other mouse. Okay :D

If we put this perspective on the % difference between GPUs... even a GPU that can output twice the FPS might net you what, a 10% advantage in total latency over the other. If the supposed 7900 XTX scores 200 FPS vs another card that has 400, you've already closed the gap for the most part. Diminishing returns.
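
To make that concrete, here's a rough latency-stack sketch; every component value is an illustrative assumption, not a measurement:

```python
# Toy latency stack: doubling GPU output from 200 to 400 FPS only shrinks
# one slice of the total chain. All component values are assumptions.

stack_ms = {
    "network ping": 16.0,  # the stable ping mentioned above
    "input/click": 3.0,    # a fast mouse
    "display": 4.0,        # assumed ~240 Hz panel scanout
}

for fps in (200, 400):
    render = 1000.0 / fps  # GPU frame time in ms
    total = render + sum(stack_ms.values())
    print(f"{fps} FPS: render {render:.1f} ms, total ~{total:.1f} ms")
# 200 FPS -> ~28 ms total; 400 FPS -> ~25.5 ms: doubling the FPS buys
# roughly the ~10% total-latency advantage described above.
```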
Posted on Reply
#91
Dr. Dro
Super XP: Going with either GPU is subjective. The 7900 XT is the faster card excluding ray tracing, where it runs approx. 6% worse. That said, RT is overrated IMO. I personally don't like how it looks. Maybe sometime in the future it will get better and look better. But right now, no thanks.

Also, buying a GPU nowadays solely based on RT is a pretty bad idea, because no GPU, not even the $2000+ overpriced Nvidia cards, can successfully utilize RT without tanking performance and implementing more gimmicks to gain some of that performance back. Lol
I disagree and everyone on this forum has heard the story a hundred if not a thousand times. I'm good.
Posted on Reply
#92
Super XP
GhostRyder: Ray tracing is on both, but it's a gimmick at the end of the day that kills performance for slightly better clarity in certain situations.
Couldn't have said it any better. And to add, for a couple of other posters on here: people have brand loyalty, and that's OK.
Dr. Dro: I disagree and everyone on this forum has heard the story a hundred if not a thousand times. I'm good.
To each their own, I suppose. But I stand by my comment.
Posted on Reply
#93
AusWolf
Vayra86: If we put this perspective on the % difference between GPUs... even a GPU that can output twice the FPS might net you what, a 10% advantage in total latency over the other. If the supposed 7900 XTX scores 200 FPS vs another card that has 400, you've already closed the gap for the most part. Diminishing returns.
That's what I mean. Some of us overestimate the importance of our rigs compared to everything else. Like the 4090 suddenly became necessary to play CS:GO at 1080p low, like it was absolutely impossible before, or as if playing at 4 thousand FPS (that your monitor won't even display and your internet connection won't even push through) magically made you a much better player. :kookoo:
Posted on Reply
#94
kapone32
Dr. Dro: I disagree and everyone on this forum has heard the story a hundred if not a thousand times. I'm good.
Everyone knows you love Nvidia
Posted on Reply
#95
evernessince
AusWolf: If I ever decided to do eSports, I would most definitely be hindered by my 35 Mbps internet connection that gets unstable as heck once the missus starts watching anything better than 720p on YouTube. Whether my PC can render 7 million FPS, and whether I have Jedi reflexes or not is totally irrelevant here in the Great British Midlands. :laugh:

TLDR: Doesn't the internet have the highest latency of all "components"?
Indeed, but internet latency is additive in the sense that it only increases the total latency chain on top of whatever latency your system itself produces. I'd have to agree that in your situation you have bigger fish to fry than chasing frames. In your case, you may be able to improve latency somewhat by getting a router that supports QoS. What you are likely experiencing is called bufferbloat, wherein the packets sent from your game client wait in a queue before actually being sent to the game server. With QoS enabled and properly configured, though, the router will always ensure that some bandwidth is available for high-priority data. This is important for games, as the data is very time-sensitive. Packets from a video streaming service can afford to wait a few additional ms in the queue, as every modern video client supports buffering, providing more than enough time to make up for such delays (typically, most buffer for 5000 ms, or 5 seconds, at a minimum).
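
To illustrate the scale of the problem, here's a toy model assuming a 35 Mbps uplink and an arbitrary 256 KB of bulk traffic already sitting in the router's buffer:

```python
# Toy bufferbloat model: how long a game packet waits behind a bulk backlog
# in a FIFO queue vs. being prioritized by QoS. All numbers are assumptions.

UPLINK_MBPS = 35
BYTES_PER_MS = UPLINK_MBPS * 1_000_000 / 8 / 1000  # link drain rate

def queue_delay_ms(queued_bytes: float) -> float:
    """Wait time for a packet arriving behind `queued_bytes` of backlog."""
    return queued_bytes / BYTES_PER_MS

bulk_backlog = 256 * 1024  # assumed bulk/video bytes already queued
print(f"FIFO: game packet waits ~{queue_delay_ms(bulk_backlog):.0f} ms")
print(f"QoS:  game packet waits ~{queue_delay_ms(1500):.2f} ms "
      f"(just the packet currently on the wire)")
# ~60 ms vs ~0.34 ms on this toy setup: that's the bufferbloat penalty.
```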
Posted on Reply
#96
Vya Domus
kapone32: Everyone knows you love Nvidia
I don't think people really love Nvidia; they just love products with a premium allure, and that product happens to be Nvidia. If AMD made a GPU that was 5% better at RT but cost like $3K, the same people would eventually switch sides.

And they will argue endlessly about why their overpriced product is so much better, because what else are they going to do? Something that is X% more expensive is never X% better; you're always getting ripped off. The more you pay, the worse the value, and the more you'll feel the need to justify your choice to others.
Posted on Reply
#97
kapone32
Vya Domus: I don't think people really love Nvidia; they just love products with a premium allure, and that product happens to be Nvidia. If AMD made a GPU that was 5% better at RT but cost like $3K, the same people would eventually switch sides.

And they will argue endlessly about why their overpriced product is so much better, because what else are they going to do? Something that is X% more expensive is never X% better; you're always getting ripped off. The more you pay, the worse the value, and the more you'll feel the need to justify your choice to others.
The only issue I have is when people tell me how weak my PC is based on the narrative and refuse to believe the truth based on my experience.
Posted on Reply
#98
GhostRyder
Vya Domus: I don't think people really love Nvidia; they just love products with a premium allure, and that product happens to be Nvidia. If AMD made a GPU that was 5% better at RT but cost like $3K, the same people would eventually switch sides.

And they will argue endlessly about why their overpriced product is so much better, because what else are they going to do? Something that is X% more expensive is never X% better; you're always getting ripped off. The more you pay, the worse the value, and the more you'll feel the need to justify your choice to others.
I agree; I think people like the brand recognition more, because it's viewed as a premium product, similar to how Apple is viewed. Though I think AMD could make a card 25% faster in all aspects and many people would still say it needed to be significantly cheaper (while still not buying it).
Super XP: Couldn't have said it any better. And to add, for a couple of other posters on here: people have brand loyalty, and that's OK.

To each their own, I suppose. But I stand by my comment.
I personally am not a fan of brand loyalty, because the market can get complacent, like it has in the last decade. Everyone is free to purchase whatever they want at the end of the day, but there are consequences for blind loyalty to a specific brand. I purchased my GTX Titan X (Pascal) back in the day for $1,000 (after spending $1,650 previously on three R9 290Xs), and many thought I was crazy spending that much on one GPU (even I did). Now $1K is not even top tier anymore; want to guess the price of top tier in the next 5 years?
Posted on Reply
#99
Vya Domus
GhostRyder: I agree; I think people like the brand recognition more, because it's viewed as a premium product, similar to how Apple is viewed. Though I think AMD could make a card 25% faster in all aspects and many people would still say it needed to be significantly cheaper (while still not buying it).
Nah, I can assure you it's not the brand; people will eventually switch to praising whatever they think is more premium (not even necessarily buying it). Look at how Intel has slowly lost its mindshare; as soon as the 5800X3D made its appearance, a lot of Intel loyalists vanished.

If the next X3D CPU is like $1,000, you'd absolutely have tons of people arguing it's actually OK because it is the fastest gaming CPU after all, and if you think it's horrid value and a stupid choice, you're just a poor pleb or something.
Posted on Reply
#100
The Norwegian Drone Pilot
As much as I hate NVIDIA as a company, it's still a fact that NVIDIA here is overall better in many things, even if the price of the AMD GPUs is a little lower. It's not all about the price all the time.

There are several features that NVIDIA GPUs have and AMD's don't, which favour NVIDIA for me. So, an RTX 4070 Ti SUPER is the GPU I'm going to buy if I buy a new GPU before the next generation of GPUs comes out from NVIDIA and AMD.
Posted on Reply