
When will GPU prices return to normal?

Status
Not open for further replies.
AMD's/ATI's buggy GPU drivers go back more than twenty years... if AMD wants to beat Nvidia in sales, then they have to fix this.
You seem not to have kept track of AMD drivers in the past ... oh, seven years or so. They have fixed this. Are they entirely bug-free? Of course not. Neither are Nvidia's. The 5700 XT was a major exception (though there are indications of that not being a pure driver thing - after all, other SKUs in the same series, even using the same GPU, didn't have the same issues), but beyond that, AMD drivers generally work very well these days. They're not quite up to Nvidia's speed for bug fixes or troubleshooting, but considering the size (and thus budget) disparity they're pretty much as close as one could expect.
 
This rebound could be a good sign: suppliers are running out of inventory, and Newegg has very few of them left. The 6800 XT is a better deal, but none of the deals in the window just before the next gen are any good overall. It's all happiness until the 7800 turns out to be 25-50% faster for the same price at less power, for example. I wouldn't even consider the 7000 or 40 series before the 6000 and 16/20/30 series are gone for good. The only good time to buy is the first possible day; any day later and it's already obsolete.
The 6800 XT may be a better deal, but I do not have the funds for that better deal. Unless you're paying, that is?
 
You seem not to have kept track of AMD drivers in the past ... oh, seven years or so. They have fixed this. Are they entirely bug-free? Of course not. Neither are Nvidia's. The 5700 XT was a major exception (though there are indications of that not being a pure driver thing - after all, other SKUs in the same series, even using the same GPU, didn't have the same issues), but beyond that, AMD drivers generally work very well these days. They're not quite up to Nvidia's speed for bug fixes or troubleshooting, but considering the size (and thus budget) disparity they're pretty much as close as one could expect.
I haven't had a non-Nvidia GPU since my HD 4890.
The drivers for the 5700 XT were plagued with bugs, so I'm glad I dodged that one by purchasing a used GTX 1080 Ti instead.
 
You seem not to have kept track of AMD drivers in the past ... oh, seven years or so. They have fixed this. Are they entirely bug-free? Of course not. Neither are Nvidia's. The 5700 XT was a major exception (though there are indications of that not being a pure driver thing - after all, other SKUs in the same series, even using the same GPU, didn't have the same issues), but beyond that, AMD drivers generally work very well these days. They're not quite up to Nvidia's speed for bug fixes or troubleshooting, but considering the size (and thus budget) disparity they're pretty much as close as one could expect.

You can't really say "they fixed it" and then present that insane disaster that was RDNA1. Saying RDNA1 was the exception is as valid as saying RDNA2 was the exception. Anyway, I don't care about teams, but AMD drivers are shit compared to Nvidia's. I just update and forget about it on Nvidia; on AMD we have to downgrade all the time for whatever issue. And Adrenalin is a bug fest.
 
I haven't had a non-Nvidia GPU since my HD 4890.
The drivers for the 5700 XT were plagued with bugs, so I'm glad I dodged that one by purchasing a used GTX 1080 Ti instead.
But that's exactly the thing: those weren't necessarily driver bugs - it was clearly a more complex issue than that, seeing how dependent issues were on system configurations, specific hardware components, etc. - and how the RX 5700, 5600 XT and 5500 XT lacked the same issues.
You can't really say "they fixed it" and then present that insane disaster that was RDNA1. Saying RDNA1 was the exception is as valid as saying RDNA2 was the exception. Anyway, I don't care about teams, but AMD drivers are shit compared to Nvidia's. I just update and forget about it on Nvidia; on AMD we have to downgrade all the time for whatever issue. And Adrenalin is a bug fest.
So, first off, you know that AMD has released quite a few generations of GPUs in the past decade, right? You're making it sound like there have been two, RDNA and RDNA2, and one of those had issues. This is what we adults call a false equivalency.

Second, RDNA did not have driver issues. The RX 5700 XT had a complex set of issues that no doubt involved drivers, but was far more complex than a simple driver bug - if it was a driver bug and nothing else, it would have been fixed long ago. The best answer to what that issue was seems to be some weird interactions between specific models of hardware, drivers, firmware, and more. Remember, there are tons of reports of the same GPU being moved to a new PC, running the exact same OS, drivers and software, and having zero issues. Also, other RDNA1 GPUs than the RX 5700 XT had far fewer issues. That also doesn't mean that the same issues weren't seen at all outside of the RX 5700 XT, but they were far more rare, indicating that there was something about either the hardware, firmware, or both of the RX 5700 XT that caused these issues. And, crucially, that's not a driver bug. It's still not acceptable - IMO everyone experiencing this issue should have been offered a refund - but it's not a driver issue, nor is the failure to entirely remedy the issue the fault of AMD's driver team.

Also, where are you having to downgrade? I haven't downgraded a GPU driver since my Fury X. And that's across that Fury X, an RX 570, an RX 6900 XT and just recently an RX 6600. Saying AMD drivers are shit is just FUD. It's nonsense. They're absolutely not perfect, but they are perfectly fine. And if by 'Adrenalin' you mean what used to be called Radeon Software (now AMD Software, IIRC), once again: what you're saying doesn't line up with my experiences whatsoever. You might have had issues - but then there are people who have had persistent black screen bug issues with Nvidia drivers - for years! - too. Bugs happen. Bad luck happens. Having a bad experience doesn't mean your experience is universal - just like having a good experience doesn't mean that either. The truth is, as always, somewhere in between. But what you're saying here is just nonsense.
 
But that's exactly the thing: those weren't necessarily driver bugs - it was clearly a more complex issue than that, seeing how dependent issues were on system configurations, specific hardware components, etc. - and how the RX 5700, 5600 XT and 5500 XT lacked the same issues.

So, first off, you know that AMD has released quite a few generations of GPUs in the past decade, right? You're making it sound like there have been two, RDNA and RDNA2, and one of those had issues. This is what we adults call a false equivalency.

Second, RDNA did not have driver issues. The RX 5700 XT had a complex set of issues that no doubt involved drivers, but was far more complex than a simple driver bug - if it was a driver bug and nothing else, it would have been fixed long ago. The best answer to what that issue was seems to be some weird interactions between specific models of hardware, drivers, firmware, and more. Remember, there are tons of reports of the same GPU being moved to a new PC, running the exact same OS, drivers and software, and having zero issues. Also, other RDNA1 GPUs than the RX 5700 XT had far fewer issues. That also doesn't mean that the same issues weren't seen at all outside of the RX 5700 XT, but they were far more rare, indicating that there was something about either the hardware, firmware, or both of the RX 5700 XT that caused these issues. And, crucially, that's not a driver bug. It's still not acceptable - IMO everyone experiencing this issue should have been offered a refund - but it's not a driver issue, nor is the failure to entirely remedy the issue the fault of AMD's driver team.

Also, where are you having to downgrade? I haven't downgraded a GPU driver since my Fury X. And that's across that Fury X, an RX 570, an RX 6900 XT and just recently an RX 6600. Saying AMD drivers are shit is just FUD. It's nonsense. They're absolutely not perfect, but they are perfectly fine. And if by 'Adrenalin' you mean what used to be called Radeon Software (now AMD Software, IIRC), once again: what you're saying doesn't line up with my experiences whatsoever. You might have had issues - but then there are people who have had persistent black screen bug issues with Nvidia drivers - for years! - too. Bugs happen. Bad luck happens. Having a bad experience doesn't mean your experience is universal - just like having a good experience doesn't mean that either. The truth is, as always, somewhere in between. But what you're saying here is just nonsense.


I also owned 2 Polaris cards, which worked flawlessly, but that's old tech. We should be talking about RDNA, the same tech, or next we'll be discussing the Riva TNT or Fermi. I only owned the 5700, and I'm not exactly sure if the other 5000-series cards have issues, so I can't argue there. I sold it and never looked back; don't know, don't care. I could look it up but I don't want to. Still, it was the flagship.
 
Now, the customer has no choice - so claiming they don't care is a bit comical.
No, that was my exact point. Harping about features that nobody in that price bracket will ever get, and has never gotten, is you just being a drama queen. You can't claim Nvidia cares about those things and the buyers agree when literally nobody in that Nvidia price bracket is getting the things you're complaining about.
 
No, that was my exact point. Harping about features that nobody in that price bracket will ever get, and has never gotten, is you just being a drama queen. You can't claim Nvidia cares about those things and the buyers agree when literally nobody in that Nvidia price bracket is getting the things you're complaining about.

Never say never. You will be disappointed about your bad prediction.
4K will be the new mainstream soon enough.

Already 8K TVs are being shipped and prepared for sales worldwide.

The market will not suffer a vacuum in its offerings.
 
I disagree. I was excited as hell that the 6800 Fighter had dropped to £499.99. Now it's jumped back up to £558!!! It may be isolated, but it's pissed me off massively.
Well, that may be local dynamics, but overall things are going down. I'm still rooting for "down", don't get me wrong; I was just tired of waiting. Best of luck.
 
Never say never. You will be disappointed about your bad prediction.
4K will be the new mainstream soon enough.

Already 8K TVs are being shipped and prepared for sales worldwide.

The market will not suffer a vacuum in its offerings.

4K is mainstream in TVs because the tech got cheap enough that sets are now available for less than $500. FHD only exists anymore at the very bottom end of the market, so if one wants something that doesn't suck, one is forced to buy 4K.

Two things hold up the 4K transition in PC land: monitors and graphics. FHD and QHD monitors are still common and plentiful, and will remain so until 4K models crowd them out in the under-$300 space. 4K gaming will follow when capable cards are common at a similar price point. That's probably a ways out yet. What's the threshold for solid 4K60, 6700 XT / 3060 Ti? Those are still over $400. Maybe the next generation of cards will manage that kind of performance around $300, but my guess is the gen after that. We'll see if monitors reach that point at the same time; I think they'll lag a year or two.

Since widespread adoption doesn't happen overnight, I'll predict, based on nothing more than the preceding and ye olde gut, that 4K will overtake FHD in PC gaming sometime in 2027 or 2028.
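To put rough numbers on that GPU-side gap, here's a quick pixel-count comparison (a back-of-envelope sketch; real performance doesn't scale perfectly linearly with pixel count):

```python
# Rough pixel-count comparison between common gaming resolutions.
# GPU load scales roughly (not exactly) with pixels rendered per frame.
resolutions = {
    "FHD (1920x1080)": 1920 * 1080,
    "QHD (2560x1440)": 2560 * 1440,
    "4K UHD (3840x2160)": 3840 * 2160,
}
base = resolutions["FHD (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x FHD)")
```

4K pushes exactly 4x the pixels of 1080p, which is why a card that's comfortable at FHD falls over at 2160p, all else equal (which it never quite is).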
 
I do what I want, thanks. Prices are falling as of now, not rising.

Yeah, still waiting on the 6800 XT to be at a reasonable price haha. Still here with the money for it 2 years later lol.
 
Never say never. You will be disappointed about your bad prediction.
4K will be the new mainstream soon enough.

Already 8K TVs are being shipped and prepared for sales worldwide.

The market will not suffer a vacuum in its offerings.

For TVs, 4K has broad adoption and continues to grow.

For PC gaming... no. Not happening. Estimates are that only around 2% of PC gamers are using 4K; it's been below 2% for years and will stay extremely low for the foreseeable future. The reason is that it requires very expensive GPUs to run properly. The average PC gamer is using entry-level through midrange GPUs, because that's what they can afford. Those GPUs are inadequate for 4K, which is why the number one resolution is 1080p and has been for many years.

You can make the case that the next-gen midrange could somewhat handle 4K in present games, but during that GPU's lifetime (maybe 4 years) it will become inadequate, because games continue to require more and more from hardware. They always have and they always will.

1440p is another story. It's growing in adoption and most likely will continue to do so.
 
Never say never. You will be disappointed about your bad prediction.
To be honest, by the time those two things even happen (the lower midrange doing ray tracing and 4K with zero issues), that $250-300 USD price bracket in question probably won't even exist anymore...
 
Chiphell leak:

[attached screenshots: alleged 3DMark Time Spy results]

Alleged NVIDIA GeForce RTX 4090 Graphics Card 3DMark Time Spy Benchmark Leaks Out, Up To 3 GHz Clock Speed, 2x Faster Than RTX 3090 (wccftech.com)

For TVs, 4K has broad adoption and continues to grow.

For PC gaming... no. Not happening. Estimates are that only around 2% of PC gamers are using 4K; it's been below 2% for years and will stay extremely low for the foreseeable future. The reason is that it requires very expensive GPUs to run properly. The average PC gamer is using entry-level through midrange GPUs, because that's what they can afford. Those GPUs are inadequate for 4K, which is why the number one resolution is 1080p and has been for many years.

You can make the case that the next-gen midrange could somewhat handle 4K in present games, but during that GPU's lifetime (maybe 4 years) it will become inadequate, because games continue to require more and more from hardware. They always have and they always will.

1440p is another story. It's growing in adoption and most likely will continue to do so.

I am using a 4K monitor, and I'm never going back to a lower resolution.
 
Do you use it for gaming?

For sure. Games like the F1 series, Counter-Strike: Source, Counter-Strike: GO, etc., run at 4K with almost ANY graphics card: not maxed out, but with some settings turned down a bit.
3840x2160 is the only right way these days...
 
For sure. Games like the F1 series, Counter-Strike: Source, Counter-Strike: GO, etc., run at 4K with almost ANY graphics card: not maxed out, but with some settings turned down a bit.
3840x2160 is the only right way these days...

There is a right way to play games now?
 
I find myself struggling greatly with the 2K-or-4K question. I bought a Samsung G7 a couple of weeks ago, which is 1440p. Beautiful monitor. Since I am going to be using whatever monitor I get for many years, I couldn't get the thought of 4K out of my head. Ultimately, I put in for a return and have a Gigabyte M32UC on the way for the same price I paid for the G7. Man, I hope I don't regret it.
 
I find myself struggling greatly with the 2K-or-4K question. I bought a Samsung G7 a couple of weeks ago, which is 1440p. Beautiful monitor. Since I am going to be using whatever monitor I get for many years, I couldn't get the thought of 4K out of my head. Ultimately, I put in for a return and have a Gigabyte M32UC on the way for the same price I paid for the G7. Man, I hope I don't regret it.
The good thing about 2160p is that its higher pixel density allows for running non-native resolutions with less visible blurriness and scaling artefacts - at least on ordinary monitor sizes (27" and the like). This effect diminishes with increased monitor size, but should be there still at 32". Plus, of course, with the advent of FSR, DLSS and the like, there are plenty of options for running non-native resolutions and having it look good both today and going forward. You just can't buy a 2160p monitor and be gung-ho about only running native res and ultra settings unless you either love low framerates or have an unlimited GPU budget.
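To illustrate the density and scaling point with rough numbers (the `ppi` helper below is just an ad-hoc sketch, not any standard API):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Density at the common sizes mentioned above
print(f"27-inch 1440p: {ppi(2560, 1440, 27):.0f} PPI")
print(f"32-inch 2160p: {ppi(3840, 2160, 32):.0f} PPI")

# Integer-scaling check: 2160p can show 1080p at an exact 2x,
# while showing 1440p needs a non-integer 1.5x (hence interpolation blur).
print(2160 / 1080)  # 2.0
print(2160 / 1440)  # 1.5
```

The clean 2x mapping from 2160p down to 1080p is part of why non-native resolutions look less bad on a 4K panel than people expect.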
 
The good thing about 2160p is that its higher pixel density allows for running non-native resolutions with less visible blurriness and scaling artefacts - at least on ordinary monitor sizes (27" and the like). This effect diminishes with increased monitor size, but should be there still at 32". Plus, of course, with the advent of FSR, DLSS and the like, there are plenty of options for running non-native resolutions and having it look good both today and going forward. You just can't buy a 2160p monitor and be gung-ho about only running native res and ultra settings unless you either love low framerates or have an unlimited GPU budget.
Thanks for the info. I had been concerned with running stuff at 1440p on a 4K monitor. I don't have the GPU to run 4K yet, but I have settled on waiting and getting either a 4070 or 4080 if the prices aren't outrageous. My GPU budget is around $700 US, so hopefully that will net me at least a 4070. Even if I have to switch back and forth from 2K to 4K for a few years, eventually I'll have enough power for 4K reliably. Hopefully this particular monitor can look good running at 2K. I should have waited a few months before buying one, though. I was building my daughter a new computer and caught the "I need new parts!" bug, even though I had no intention of upgrading when I started, lol.
 
You can't really say "they fixed it" and then present that insane disaster that was RDNA1. Saying RDNA1 was the exception is as valid as saying RDNA2 was the exception. Anyway, I don't care about teams, but AMD drivers are shit compared to Nvidia's. I just update and forget about it on Nvidia; on AMD we have to downgrade all the time for whatever issue. And Adrenalin is a bug fest.

This is exactly why, barring a > $50 price difference in AMD vs Nvidia, I'd personally still go with Nvidia.

And nothing in the midrange is priced reasonably, at all.

Given this is a 2-year product cycle, this represents the last 4 years of the midrange, and it is not impressive at all. With a 2-year product cycle, we should see 40% uplifts the way the high-end GPUs are doing.
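The arithmetic behind that expectation, as a sanity check (assuming, purely for illustration, a +40% uplift every 2-year generation):

```python
# Compounded generational uplift: two 2-year generations at +40% each
# should roughly double performance over 4 years.
uplift_per_gen = 1.40
generations = 2
total = uplift_per_gen ** generations
print(f"{total:.2f}x over {generations * 2} years")  # 1.96x
```

So a midrange card 4 years on "should" be close to 2x as fast at the same price; the table below shows how far short of that we actually are.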

But for Halo products, they just keep making higher and higher end models, while the mid range kind of languishes. What do we get next, the 4950XT Ti Super Titan?

And yes I mixed up AMD/Nvidia naming convention intentionally.


 
Thanks for the info. I had been concerned with running stuff at 1440p on a 4K monitor. I don't have the GPU to run 4K yet, but I have settled on waiting and getting either a 4070 or 4080 if the prices aren't outrageous. My GPU budget is around $700 US, so hopefully that will net me at least a 4070. Even if I have to switch back and forth from 2K to 4K for a few years, eventually I'll have enough power for 4K reliably. Hopefully this particular monitor can look good running at 2K. I should have waited a few months before buying one, though. I was building my daughter a new computer and caught the "I need new parts!" bug, even though I had no intention of upgrading when I started, lol.
$700 for a GPU should get you into 2160p territory just fine. Just don't set your settings to Ultra - that's just stupid anyhow. The next setting down is typically visually indistinguishable yet performs far, far better. Reviews tend to test at Ultra because they want to test worst-case scenarios for proper stress testing, but gaming at Ultra makes no sense.

Also, this is a bit pedantic, but please stop calling 1440p "2K". A bit of a pet peeve, but this really drives me up the wall. The only resolution called 2K is DCI 2K, a cinema resolution that's 2048x1080 pixels (essentially a slightly wider 1080p). 1440p is 1440p, and there is no "XK" numbering for it, seeing how that numbering comes from the realm of TV sales and marketing, where 1440p has never existed.
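For reference, that naming mix-up laid out as data (an informal sketch; names per common DCI and consumer-marketing usage):

```python
# Resolution names vs actual pixel dimensions. Note DCI 2K is a
# cinema format close to 1080p; 1440p has no "K" marketing name.
named = {
    "DCI 2K (cinema)": (2048, 1080),
    "FHD / 1080p": (1920, 1080),
    "QHD / 1440p": (2560, 1440),
    "4K UHD / 2160p": (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}
for name, (w, h) in named.items():
    print(f"{name:18s} {w}x{h} = {w * h:,} px")
```

Calling 1440p "2K" actually undersells it: it has roughly 78% more pixels than 1080p, which is in the same ballpark as DCI 2K.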
 