
GPU upgrade - buy a 7900 XTX or 4080 Super now, or wait for the next-gen release in 2025?

People should just stop buying those overpriced 2000+ EUR enthusiast-level cards. When manufacturers see that people are buying them, it's easy for them to bump up the price every generation.

Remember when practically no gamer bought those 1000 EUR Titans, but instead the slightly cut-down x80/xx80 Ti cards? Now it's totally fine to pay 2000+ EUR for a bare graphics card, and I don't take inflation as an excuse. Back then you could build an enthusiast-level setup, including all peripherals, with that amount of cash.
 

It's not going to happen, because people believe they're worth that much, and with AI essentially making the people who buy these things a lot of money, they've long since stopped being a gaming-only product. At the end of the day, if you disagree, you have two options: you either stop buying flagships (not that you need them), or you get a console. I don't see what the huge deal is. If a more down-to-earth, affordable yet still perfectly acceptable product is available, then we have no issue; the world's big enough for both target markets.

"AMD open source support is not worth mentioning. There is no gui for creating fan curves, undervolting, underclocking. High clock memory bug - and loud fans with the 6800 card - both in windows 11 pro and gnu gentoo linux. This topic is endless why amd has bad software in my current windows 11 pro or my gnu gentoo linux."

I will reiterate: please read the posts you are replying to, and re-read them if you have not fully understood them before replying. If even after that you still don't understand, please ask for clarification before making a rebuttal. He's obviously talking about fan control on the Linux side - and then that he ran into the high memory clock bug on both OSes, not that Windows doesn't have fan control. Like I said, knightly defense. Just like you couldn't figure out that I wasn't replying about the 580 vs 1060 video, but to the allegation that AI upscalers require an internet connection and consume bandwidth (which is not true; the tech does not work that way).
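For what it's worth, fan control does exist on the Linux side at the driver level: the amdgpu kernel driver exposes it through the standard hwmon sysfs files, which is exactly what the third-party GUIs wrap. A minimal sketch of that interface (assuming a single AMD GPU enumerated as card0 and root privileges; the hwmon index varies per system):

```python
import glob

# The amdgpu driver registers a hwmon device per GPU; the numeric index
# changes between boots, hence the glob. Assumption: the GPU is card0.
hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

def set_fan_duty(duty: int) -> None:
    """Switch the fan to manual control and write a raw PWM duty (0-255)."""
    with open(f"{hwmon}/pwm1_enable", "w") as f:
        f.write("1")  # 1 = manual, 2 = automatic (driver-managed curve)
    with open(f"{hwmon}/pwm1", "w") as f:
        f.write(str(max(0, min(255, duty))))

set_fan_duty(96)  # roughly 38% duty; requires root
```

The complaint in the quote is really about the lack of an official GUI on top of this, not about the capability itself.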
 
But even the mid-range market is a total joke these days, with overpriced and slow cards like the RTX 4060 (Ti) and RX 7600 (XT).
 

I mean, we sure expected better, but the 7600 XT makes a good case study. If you compare it to previous-generation products, you've never gotten that much performance and that much memory out of a card slotted into a similar segment and price bracket before. It's considerably better than the 6600 XT and light years ahead of the 5600 XT. To that extent, while it may pale in comparison to its better-endowed siblings, it's still a successful product at what it sets out to do - even if, yes, you can buy a 7800 XT for just a little more, and that is probably a very good idea.
 
The main problem is that devs are hella lazy when it comes to optimizing games these days. UE5, for example, is horrible when it comes to performance. Why is it so hard to optimize games like Doom 2016, which runs on a potato yet still looks good?

edit: I may have underestimated the RX 7600 XT, but it's a stupid card going by its model name - just a factory-overclocked RX 7600 with double the VRAM.
 
Do you know what AMD Software is?

What does the Steam Deck run on? Is it not a Linux-based distro? Is that OS not designed to run on AMD hardware?

I think reading is not your strength - or asking the right question when the point is not understood.

I myself posted the Windows 11 Pro AMD GPU driver screenshot several times. Do not annoy me or others with fake screenshots when the topic was GNU Linux. Linux is only the kernel, not the operating system; the other bits come from different "projects", sometimes from the GNU project. Nearly every book I've had in my hands so far explained the difference in great detail. Small hint: use a web browser and check out this page: https://kernel.org

edit: to make it very, very clear: kapone32 posted a Windows 11 Pro / AMD Software / AMD GPU driver screenshot.
I was talking about bad software support in GNU Linux with regard to currently owned or previously sold AMD-based hardware, which included mainboards, graphics cards and processors.

edit: just correcting facts. If this goes on in a post below, I'll use the ignore-list function, so the conversation then ends on my side. Facts vs. not bothering to read introductory books on GNU Linux; real-life experience as a long-term user vs. fanboy.

edit: hard facts about bad AMD software for their graphics cards

I don't bother writing guides for free on any website.
It's a topic about the high idle power consumption caused by high memory clocks on an MSI Radeon 6800 Z Trio, in Windows and in GNU Gentoo Linux. Both operating systems had the same issue - with a fix. There is also third-party software available that takes the same route, which I found myself.

Basically, you don't need to use Google Translate - the config-file code and pictures should explain it very, very clearly.
Feel free to use Google Translate anyway: https://www.computerbase.de/forum/t...00-z-trio-idle-verbrauch-reduzierung.2145581/
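For anyone who doesn't want to dig through the linked thread: on the Linux side, the usual shape of this kind of fix is to pin the memory clock to its lowest DPM state through amdgpu's sysfs interface (the Windows tools achieve the equivalent through the driver). A minimal sketch, assuming the card is card0 and you have root; the exact fix in the linked thread may differ in detail:

```python
DEV = "/sys/class/drm/card0/device"  # assumption: the Radeon 6800 is card0

# Take manual control of the DPM profile, then pin the memory clock to
# state 0 (the lowest). On cards that otherwise idle at full memory
# clock, this drops idle power considerably. Revert by writing "auto".
with open(f"{DEV}/power_dpm_force_performance_level", "w") as f:
    f.write("manual")
with open(f"{DEV}/pp_dpm_mclk", "w") as f:
    f.write("0")  # read this file first to see the available states
```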
 
People should just stop buying those overpriced 2000+ EUR enthusiast-level cards. When manufacturers see that people are buying them, it's easy for them to bump up the price every generation.
You can run a long way with that. Where do you cut it off? $500? $200? If you think about it, gaming is a total waste of both time and energy, so maybe we should all just use iGPUs?

Nobody is forcing you to buy them either; the mid range and low end still exist.
But even the mid-range market is a total joke these days, with overpriced and slow cards like the RTX 4060 (Ti) and RX 7600 (XT).
Oh, so you don't want those either. So I'm guessing you're in the "bring back high-end $500 cards" club?
Needless to say, that club ignores history and inflation combined. I wouldn't recommend maintaining your membership.
Remember when practically no gamer bought those 1000 EUR Titans, but instead the slightly cut-down x80/xx80 Ti cards? Now it's totally fine to pay 2000+ EUR for a bare graphics card, and I don't take inflation as an excuse. Back then you could build an enthusiast-level setup, including all peripherals, with that amount of cash.
If you wish to ignore reality you'll find yourself forever screaming at clouds. Inflation IS real. Costs are significantly higher. One can take a look at Nvidia's margins, factor in the boatloads of $50k+ AI chips they are selling, and see that the $500 flagship is a total pipe dream today.

The 8800 Ultra was a halo card that ran $850, or $1150 today. It contained 681 million transistors. The RTX 4080 contains 45.9 billion transistors, on a node that is over 10x the price per wafer, and retails for $1000.
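Run the numbers on those two cards (using only the figures above, with the post's own $850-to-$1150 inflation conversion) and the gap is stark:

```python
# Figures from the post: 8800 Ultra ($850 in 2007, ~$1150 today,
# 681 million transistors) vs RTX 4080 ($1000, 45.9 billion transistors).
ultra_usd_today, ultra_btr = 1150, 0.681   # price, billions of transistors
ada_usd, ada_btr = 1000, 45.9

print(f"8800 Ultra: ${ultra_usd_today / ultra_btr:,.0f} per billion transistors")
print(f"RTX 4080:   ${ada_usd / ada_btr:,.0f} per billion transistors")
# -> roughly $1,689 vs $22: about 77x more transistors per
#    inflation-adjusted dollar, despite the far pricier wafers.
```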
 
Stuff like market share is irrelevant. Just get the best GPU for the job at the right price.
It's really not.

Software is optimized for CUDA because it's so utterly dominant that the market may as well be a monopoly.

For gaming, you pick AMD if you:

1. Don't care about RT
2. Don't care about having a competent upscaler or DLAA
3. Don't mind worse energy efficiency
4. Don't mind iffy driver cadence - will you get a game-ready driver on launch day? Maybe, maybe not.
5. Like big VRAM numbers that make no difference in the real world; they could be useful in professional applications, if you didn't need to be insane to buy AMD over NV for most of those applications.
6. Don't mind your GPU being dropped from the current driver branch a year or two earlier than the equivalent NV GPU.
7. Are happy with the "slightly worse but slightly cheaper" attitude, and with being 1-2 years behind the curve in innovative features.

Really hoping Battlemage shows up or we're going to have a monopoly next gen.

Also, it would be good to mention that UE5 is becoming very dominant, and most games now use some form of RT. Whether or not you care about it, it's relevant to performance regardless.
 
Battlemage is gonna go one of two ways:

1) it's going to trade blows with upcoming AMD and Nvidia hardware
2) it's gonna choke on its own fumes.

The A770 showed Alchemist's problem: 400mm2, larger than the RTX 4080 at 379mm2, yet it performed on par with a 159mm2 4060 or a 203mm2 RX 7600. Something is either seriously wrong with the architecture or the drivers, or both, even today. For Intel not to fix this with its second-generation chips would be sheer incompetence.
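Put differently, normalizing by die area (taking "performed on par" at face value and using only the die sizes above):

```python
# Die sizes (mm²) from the post; the three cards are treated as equal
# performers, per the "performed on par" observation.
dies = {"Arc A770": 400, "RTX 4060": 159, "RX 7600": 203}

for name, area in dies.items():
    # Relative performance per mm², with the A770 as the 1.00x baseline.
    print(f"{name}: {dies['Arc A770'] / area:.2f}x the A770's perf per mm²")
# -> Arc A770 1.00x, RTX 4060 ~2.52x, RX 7600 ~1.97x
```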
Really hoping Battlemage shows up or we're going to have a monopoly next gen.
Agreed
Also, it would be good to mention that UE5 is becoming very dominant, and most games now use some form of RT. Whether or not you care about it, it's relevant to performance regardless.
This is where AMD is truly screwed. The PS5 Pro has RDNA4 RT hardware and, if its performance is anything to go by, there's been little if any actual improvement over RDNA2 core for core.
 
Battlemage and RDNA 4 are competing for the mid range.

AMD and Intel own the low-end iGPU space; AMD probably wins there because of the Steam Deck. But there are rumours of an RTX APU in the Steam Deck 2, plus the Switch (obviously), so who knows.
 

Let's hope they actually compete, because we don't need things any worse than they already are, with both of those companies just getting Nvidia's leftovers at this point. Nvidia doesn't even try under 500 USD anymore and still wrecks the competition...

Talking about actual sales, not price/performance, etc.
 
Battlemage and RDNA 4 are competing for the mid range.

AMD and Intel own the low-end iGPU space; AMD probably wins there because of the Steam Deck. But there are rumours of an RTX APU in the Steam Deck 2, plus the Switch (obviously), so who knows.
For the price they are asking (leaked B580 at $259) and the die size of their chips, their B770 had better not be mid-range. If it is, there will be no real competition: AMD at low-to-mid, Nvidia at mid, high and halo, and Intel as the kid in the corner eating glue.
 
edit: hard facts about bad AMD software for their graphics cards

It's a topic about the high idle power consumption caused by high memory clocks on an MSI Radeon 6800 Z Trio, in Windows and in GNU Gentoo Linux. Both operating systems had the same issue - with a fix.

Feel free to use Google Translate anyway: https://www.computerbase.de/forum/t...00-z-trio-idle-verbrauch-reduzierung.2145581/
The first comment I found on what you posted as facts - it is not 2006.

"I cannot believe that. A Gentoo installation from 2006?"

Battlemage and RDNA 4 are competing for the mid range.

AMD and Intel own the low-end iGPU space; AMD probably wins there because of the Steam Deck. But there are rumours of an RTX APU in the Steam Deck 2, plus the Switch (obviously), so who knows.
Really? So Intel GPUs sell as well as AMD's?
 
Where did he claim that? He specified iGPUs, and yeah, Intel dominates the mobile space despite AMD's efforts.
 
Yep, the MSI Claw outsells the Ally, much less the Steam Deck.
 
But even the mid-range market is a total joke these days, with overpriced and slow cards like the RTX 4060 (Ti) and RX 7600 (XT).
I wouldn't count them as mid, more like low.
 
Yeah, if we're going by die size, the 7600 is smaller than the GeForce 550 Ti. Nobody called that a mid-range card back in the day.
 

N33 on TSMC's 40 nm process - if it were feasible to produce at all - would likely run to thousands of mm², though...
 
It's really not.

Software is optimized for CUDA because it's so utterly dominant that the market may as well be a monopoly.

For gaming, you pick AMD if you:

1. Don't care about RT
2. Don't care about having a competent upscaler or DLAA
3. Don't mind worse energy efficiency
4. Don't mind iffy driver cadence - will you get a game-ready driver on launch day? Maybe, maybe not.
5. Like big VRAM numbers that make no difference in the real world; they could be useful in professional applications, if you didn't need to be insane to buy AMD over NV for most of those applications.
6. Don't mind your GPU being dropped from the current driver branch a year or two earlier than the equivalent NV GPU.
7. Are happy with the "slightly worse but slightly cheaper" attitude, and with being 1-2 years behind the curve in innovative features.

Really hoping Battlemage shows up or we're going to have a monopoly next gen.

Also, it would be good to mention that UE5 is becoming very dominant, and most games now use some form of RT. Whether or not you care about it, it's relevant to performance regardless.
I'm not sure why you're quoting my comment. Your view is that Nvidia GPUs are superior. You're an Nvidia warrior. Fine, run with the generic, lazy criticisms of other GPUs (even the unreleased ones). I stand by my comment that market share is an irrelevant consideration when assessing the suitability of a product.
 
This is why I buy what I buy and that's it. I've seen the huge bias on here and on YT, and I don't buy into it.
 
This is directly from the KitGuru YouTube channel. The question was: why do you not use AMD cards when doing AMD builds?


KitGuruTech replied: "We get a lot of Nvidia partner support just. We have reviewed all the amd cards and even did articles on their features in the past. Thanks for the feedback though."
 
Personally, I'm very interested to see what RDNA4 brings to the table. Give me 30% more rasterised performance and 50% more RT performance than an XTX and I'm sold! Yes, I am dreaming, lmao.
 
That was what happened when I went from the 6800XT to the 7900XT.
 
I'm looking for those gains going from a GRE to whatever they release next. If it's more, I won't complain... :D
 
Personally, I'm very interested to see what RDNA4 brings to the table. Give me 30% more rasterised performance and 50% more RT performance than an XTX and I'm sold! Yes, I am dreaming, lmao.
+30% raster performance would be past 4090 performance, so that's highly unlikely.

For the top RDNA4 card I would expect 7900 XT raster performance and 30-50% more RT performance than the XTX, at $600.
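As a rough sanity check on the "past 4090 perf" point, assuming the 4090 leads the 7900 XTX by roughly 20-25% in 4K raster (an assumption based on typical review indices, not a figure from this thread):

```python
# Assumption (not from this thread): RTX 4090 ≈ 20-25% faster than the
# 7900 XTX in 4K raster; 1.22 is the midpoint of that range.
xtx = 1.00
rtx_4090 = 1.22
rdna4_wish = xtx * 1.30   # the "+30% over an XTX" wish quoted above

print(f"wished-for RDNA4: {rdna4_wish:.2f} vs RTX 4090: {rtx_4090:.2f}")
# -> 1.30 vs ~1.22: the wish lands past the 4090, hence "highly unlikely".
```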
 