
What is the Radeon RX 7000 series for you? Please elaborate

What is the Radeon RX 7000 series for you?

  • Success: 49 votes (30.8%)
  • Disappointment: 42 votes (26.4%)
  • So-so: 68 votes (42.8%)

  Total voters: 159
  Poll closed.
"Your game doesn't run properly? Oh, just apply this image-blurring, resolution-decreasing magical slider, and now you've got double the FPS! Next time you won't be paying for performance, you'll be paying for the next version of the image-blurring, resolution-decreasing magical slider." Disgusting! Purely disgusting.

This is exactly what I hate about the GPU market these days.
 
Nobody forces you to buy it.

This is not how the sales department at AMD works or thinks about the situation. They want more sales, but because no card of the current generation provides even remotely good value, especially when customers are poorer than they were in previous years, it's really tough for them to sell any cards.

And yes, they do force you to buy. It's about your resistance to falling into the trap. I vote with my wallet and say no. If AMD wants a deal, it must decrease its prices.
If you think logically, AMD is a very rich company, they have billions and billions, and they can afford to lower prices so that more sales eventually follow.
 
Good on you, and you're exactly right about this. I don't get the AMD strategy either.

At the same time though I do get how minds work quite well :) Eventually you too will cave... But that's the beauty of PC gaming IMHO. The longer you wait, the easier it gets to justify an upgrade simply because the jump in performance becomes too big to ignore. So at its core I think that's the market working as it should, however unhappy we are with it.
 
Even after all this time and a deep think, it's better than the opposition.

Simple as.
Don't like the price? Pay more for less with Nvidia, or less for less with Intel, but YMMV, no doubt.
 
I prefer to see reviewers test these products with settings turned up and no software aids to help them. It really shows the state of the market in terms of hardware, and how the game engines take advantage of them (or not).

I don't want these software aids to become an excuse for these companies to skimp on improvements to the hardware (i.e., give you fewer rasterization frames, generation over generation). That seems to be what's happening.

To be fair, I also prefer reviewers that benchmark games that people actually play, and not the same test suite over and over of games with built-in benchmarks that no one plays.

No one benchmarks GTA V anymore, but it's consistently in the top 10 most played games on Steam, and that doesn't include people with physical copies and digital keys from Rockstar directly.

Even after all this time and a deep think, it's better than the opposition.

Simple as.
Don't like the price? Pay more for less with Nvidia, or less for less with Intel, but YMMV, no doubt.

Or... Pay even less for more via the second hand market.
 
This is not how the sales department at AMD works or thinks about the situation. They want more sales, but because no card of the current generation provides even remotely good value, especially when customers are poorer than they were in previous years, it's really tough for them to sell any cards.
Then why has basically every single 7800 XT listing in the UK sold out on, or soon after, launch day? And why hasn't any other GPU done that since the 4090?

And yes, they do force you to buy.
As far as I checked, nobody was pointing a gun at my head as I clicked "buy".
 
Not really accurate. DLSS gives superior quality (especially at 1080p) and is more stable overall. Frame gen is not to be underestimated. These are QoL features, like air conditioning or MP3 players.
You forgot to add "in my opinion". How is Frame Gen different from Fujitsu's solution for high-res screens? It was crazy watching a Computer Chronicles episode that dealt with ray tracing... in 1987.

DLSS will be in fewer games than FSR for one reason: both the Xbox and PS5 are Ryzen-based, and let's remember that it was with the help of Sony and Microsoft that RDNA became a reality. People like to blame AMD for FSR being the upscaler of choice in modern games, but Xbox and PS5 don't support DLSS, so what do you think THEY told Bethesda? As a result of this, Nvidia released the binaries for DLSS as open source. They have basically asked the modder community to carry the torch for DLSS, but Frame Gen is all theirs. I look at Frame Gen like E-cores: certainly not a true representation of a performance improvement, but something to pad the numbers.

Unfortunately the propaganda that IBM applied still exists, so even though the Commodore was a better (and cheaper) machine for most users, they never sold in America like they did in Canada. Still, the 64 was the best gaming platform of the early-to-mid 80s, with plenty of games. Today we have expansion cards with massive VRAM buffers and stupid clock speeds. I'm sorry, I have been absorbing Computer Chronicles episodes on YouTube lately, and it has me appreciating 20 MHz processors being the cutting edge. The modern GPU is a marvel of parallel processing, and they all have a place.

I am a gamer, I have been a gamer since the beginning of this experience, and I am at times overwhelmed at what gaming is today. I will give you 2 examples that have nothing to do with upscaling. Baldur's Gate 3 is one of the best games ever made, period. Everspace 2 is a really sweet game too, but so are games like Arkham, Section 8 and Sleeping Dogs, which are still an excellent experience. I have been drinking, so I will go on. I saw the preview for Dragon's Dogma 2 and know that as much as I want to play the s$#% out of that game, if it doesn't support high refresh rates I won't even buy it.

That leads me to the next great thing about today's gaming: we are getting more and more Japanese studios making games for PC. Do I mind getting my fix from Konami, Capcom, Atlus, Square Enix, Namco, Tecmo and SNK on PC? No, not at all. If you played Horizon Zero Dawn and enjoyed it for the story and butter-smooth controls, you get what I am saying.

To tie it back to the thread: these cards will all be great at the resolution they are aimed at. The fact that a 4070 Ti is about $350 more new vs a 7800 XT (where I live) proves exactly that. The best, though, is that the 4060 Ti 8GB is $150 less than the 16GB, but there is arguably no difference in performance. That is about the same price gap as the 7800 XT vs 7700 XT, but at least you get more for your money with AMD.
 
That comparison isn't too relevant, is it? Like comparing a GT 710 to a 730.
The GT 730 is massively faster, to the extent that some games are utterly unplayable on the former (below 30 FPS) whilst being comfy on the latter (above 60 FPS). You should've gone with a GTX 760 to GTX 770 comparison.
We'll see when FSR 3
The key word is "when." FSR3 cannot be considered before release. Why do we compare real things to something which doesn't exist?
Nah, they're just gimmicks. Running your games on high detail at native resolution with no DLSS/FSR/XeSS or frame generation is QoL.
Try playing with FSR/DLSS enabled at 100% scale. Not 66%, which is "Quality", but 100%. This "gimmick" will suddenly become a massive improvement over your built-in TAA. Yes, you will lose FPS, but you will also improve your image quality a lot. These are legit tools, and they are only gimmicks if used improperly.
"Your game doesn't run properly? Oh, just apply this image-blurring, resolution-decreasing magical slider"
Considering we have never been able to actually run AAA at 1440p Ultra with current-gen GPUs cheaper than the x700/x70 series at 60 FPS, be it before DLSS's invention or after... We can now blame some gamedevs for "reinventing Crysis," but by no means the upscalers per se.

1440p with the Quality preset already looks better than native 1080p (considering we compare equally sized 1440p and 1080p displays of equal panel quality at an equal distance from viewer to display) in most games. And the performance penalty is fairly low.

At 4K, it's already very hard to tell if an upscaler is enabled or not. I know this will most probably lead to gamedevs only caring how their games work with DLSS/FSR enabled, making their games unplayable trash without these tools. But we'll see.
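
If you want to sanity-check those numbers, here's a quick Python sketch. The per-axis scale factors are the commonly cited approximations for the DLSS/FSR 2 presets, not official vendor constants:

```python
# Rough per-axis render scales commonly cited for DLSS/FSR 2 presets
# (approximations, not official vendor constants).
PRESETS = {
    "Native (100%)": 1.0,
    "Quality": 0.667,          # the "66%" mentioned above
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling to output size."""
    return round(out_w * scale), round(out_h * scale)

for name, s in PRESETS.items():
    w, h = internal_resolution(2560, 1440, s)
    print(f"1440p {name}: renders {w}x{h} ({w * h / 1e6:.2f} MP)")

# Native 1080p for comparison: it renders MORE pixels than 1440p Quality.
print(f"Native 1080p: {1920 * 1080 / 1e6:.2f} MP")
```

The takeaway: 1440p Quality renders roughly 1707x960, about 1.64 MP, while native 1080p pushes 2.07 MP, about 26% more. That's why the performance penalty stays modest even though the upscaled output targets the sharper 1440p panel.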
 
The key word is "when." FSR3 cannot be considered before release. Why do we compare real things to something which doesn't exist?
Only because I have nothing to use DLSS 3 FG on, which means I won't have any first-hand experience before FSR 3 comes out. Otherwise, I don't care about the technology.

Try playing with FSR/DLSS enabled at 100% scale. Not 66%, which is "Quality", but 100%. This "gimmick" will suddenly become a massive improvement over your built-in TAA. Yes, you will lose FPS, but you will also improve your image quality a lot. These are legit tools, and they are only gimmicks if used improperly.
Not every game lets you adjust the scaling. Most come with options for "quality", "balanced" and "performance", which are all trash at 1080p. If I see one with such scaling options, I'll give it a go.

Considering we have never been able to actually run AAA at 1440p Ultra with current-gen GPUs cheaper than the x700/x70 series at 60 FPS, be it before DLSS's invention or after... We can now blame some gamedevs for "reinventing Crysis," but by no means the upscalers per se.

1440p with the Quality preset already looks better than native 1080p (considering we compare equally sized 1440p and 1080p displays of equal panel quality at an equal distance from viewer to display) in most games. And the performance penalty is fairly low.

At 4K, it's already very hard to tell if an upscaler is enabled or not. I know this will most probably lead to gamedevs only caring how their games work with DLSS/FSR enabled, making their games unplayable trash without these tools. But we'll see.
No wonder I'm still happy with a 24" 1080p screen. Upscaling is a technology used to push gamers into upgrading their monitors to resolutions that their graphics cards can't handle. If you game on a 4K TV, and can't see any difference (I can't when I'm watching films on mine), fair play to you. But like you said, I'd rather not have game devs rely on the technology for lower resolutions as well as an excuse for no optimisation.
 
This thread reads like a vehicle to shit on these gpus.
 
Only because I have nothing to use DLSS 3 FG on. Otherwise, I don't care about the technology.
This is too subjective and thus irrelevant.
Not every game lets you adjust the scaling.
True that, and it's a shame.
Upscaling is a technology used to push gamers into upgrading their monitors to resolutions that their graphics cards can't handle.
No one pushes anyone. If you see a mid-range 3-gen-old GPU under your bonnet, it's a clear indicator you won't be able to play something spicy at resolutions higher than 1080p or, likely, even 900p. Now that GPU prices have come down and the used GPU market gives us a lot of tasty options, gaming at something above 1080p is an issue only if you are broke. Insanely broke. The RX 7800 XT is five hundred bucks. And we've yet to see a game which is unplayable at native 1440p with high or even maxed-out settings on this GPU. Used RX 6800/RTX 2080 Ti/RTX 3070 Ti cards go for $300 in some regions and... like, playing at med-high (but mostly ultra) at native 1440p ain't bad, y'know, 'pecially knowing you got your GPU for what's supposed to be a 1080p product price.
If you game on a 4K TV, and can't see any difference
I see some image quality worsening if I go FSR: Balanced on my 4K TV (43" display, 50 to 55", or 127 to 140 cm eye-to-display distance), yet the framerate exceeds this TV's refresh rate, which I'm greatly fine with. For 4K + FSR: Quality, I'd need something better than an RX 6700 XT, or something lighter than AAA games. Baby-sized 4K displays (27") are clearly made for FSR, because your vision is definitely above 100% if you can tell native and upscaled apart, unless you're <70 cm from your monitor, which is pure insanity.

I'm currently on 1080p because I prefer a million Hz to a billion pixels. Unfortunately, my display can only handle 83 Hz. Wish I had a 165 Hz one. And yeah, since nobody makes 34" UW 1080p high-refresh-rate displays and I don't wanna have a thirteenth 16:9 display, I will buy a 144+ Hz 3440x1440 one. And with a little bit of FSR, my 6700 XT will handle it just fine. My eyes, too, since I'm 50" away from the display. Can't complain with my 70 PPI, and definitely won't complain when it comes to 110 PPI from the same distance. Will prolly be unable to tell native and FSR: Balanced apart lol.
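
Since I'm throwing PPI and viewing distances around, here's the back-of-the-envelope maths in Python. The 1-arcminute threshold is the usual 20/20 rule of thumb, not gospel, and the 31.5" panel size for the ~70 PPI 1080p display is an assumption on my part:

```python
import math

def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    """Pixels per inch from pixel dimensions and diagonal size in inches."""
    return math.hypot(w_px, h_px) / diag_in

def blend_distance_cm(pixels_per_inch: float) -> float:
    """Distance beyond which a 20/20 eye (~1 arcminute of resolving power)
    can no longer pick out individual pixels. A rule of thumb, not gospel."""
    one_arcmin = math.radians(1 / 60)
    return (1 / pixels_per_inch) / math.tan(one_arcmin) * 2.54

displays = [
    ('27" 4K', 3840, 2160, 27.0),
    ('34" UW 1440p', 3440, 1440, 34.0),
    ('31.5" 1080p', 1920, 1080, 31.5),  # assumed size for the ~70 PPI figure
]
for name, w, h, d in displays:
    p = ppi(w, h, d)
    print(f"{name}: {p:.0f} PPI, pixels blend beyond ~{blend_distance_cm(p):.0f} cm")
```

This prints roughly 163 PPI/54 cm for the 27" 4K, 110 PPI/80 cm for the ultrawide, and 70 PPI/125 cm for the 1080p panel. From my ~127 cm away, even 70 PPI sits right at the threshold, and 110 PPI has headroom to spare.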
when I'm watching films on mine
Movies are not games. They don't need AA, they don't need image quality boosters of other kinds. A 1080p movie looks way clearer than a 2160p maxed-out best-looking game on the same display. Professional video cameras, massively experienced operators, all that stuff. Games can't beat it. And won't be able to. Especially if the most popular GPU segment, namely 60/70 or 600/700/800, keeps getting smaller and smaller gen-to-gen improvements.

I massively like FSR, DLSS, XeSS, ray tracing and all that stuff but these upscalers are kinda too prone to end up being gamedevs' excuses when it comes to image quality and overall performance. Ray tracing is really cool and it's the feature I'm most interested in, but I clearly understand it's not yet the age of real RT. That will come, optimistically, a dozen years from now, when even a mediocre GPU (like what would've been an RTX 4050) will get 60 FPS with ultra hyper mega ray tracing effects (compared to what we have now, namely a couple of reflections, a couple of mirrors, a couple of realistic shadows, nothing to write home about).
 
This is too subjective and thus irrelevant.
The whole topic is subjective, isn't it? ;)

No one pushes anyone. If you see a mid-range 3-gen-old GPU under your bonnet, it's a clear indicator you won't be able to play something spicy at resolutions higher than 1080p or, likely, even 900p. Now that GPU prices have come down and the used GPU market gives us a lot of tasty options, gaming at something above 1080p is an issue only if you are broke. Insanely broke. The RX 7800 XT is five hundred bucks. And we've yet to see a game which is unplayable at native 1440p with high or even maxed-out settings on this GPU. Used RX 6800/RTX 2080 Ti/RTX 3070 Ti cards go for $300 in some regions and... like, playing at med-high (but mostly ultra) at native 1440p ain't bad, y'know, 'pecially knowing you got your GPU for what's supposed to be a 1080p product price.
I guess I worded it wrong. What I meant is, while without upscaling, you'd be OK with your 1080p monitor, and you'd look at anything higher as niche products, now you're being told that upgrading to 1440p or even 4K is doable because you've got these great tools to help you run a good frame rate. "Just buy that shiny new monitor, lower your image quality with this awesome new tech and be happy." But my question is, why should I spend money on a new monitor if I end up having to lower the image quality? I'm not saying I'm not curious about trying some games on my 4K TV for shits and giggles, but my 1080p main screen with no upscaling is just fine.

I'm currently on 1080p because I prefer a million Hz to a billion pixels. Unfortunately, my display can only handle 83 Hz. Wish I had a 165 Hz one. And yeah, since nobody makes 34" UW 1080p high-refresh-rate displays and I don't wanna have a thirteenth 16:9 display, I will buy a 144+ Hz 3440x1440 one. And with a little bit of FSR, my 6700 XT will handle it just fine. My eyes, too, since I'm 50" away from the display. Can't complain with my 70 PPI, and definitely won't complain when it comes to 110 PPI from the same distance. Will prolly be unable to tell native and FSR: Balanced apart lol.
Now, ultrawide is something I see the point of, as opposed to increasing resolutions on a "baby sized" screen. I might get a 3440x1440 monitor myself someday, when you don't need a 4090 anymore to feed one natively with no upscaling. :)

these upscalers are kinda too prone to end up being gamedevs' excuses when it comes to image quality and overall performance.
And that's what I'm afraid of.
 
Seconded.

Bar was set in the dirt by Op.

I just tripped on it, it's so low


Sad times.
What struck me is that despite the negativity the % calling 7000 a success has risen over time.

It seems the 7000 series is mighty competitive right now too: it runs Starfield relatively well, there's no recent content where it falls apart, and we have the 7800/7700 XT releases now. Pretty interesting times, because the 7000 series is actually positioned a LOT better at this point than it was at the start of this topic.

Hey and another thing, perhaps we need a similarly worded topic on the Nvidia side. How does Ada 'feel' to TPU? We might end up getting similar results... I haven't seen anyone 100% positive about it.
 
What struck me is that despite the negativity the % calling 7000 a success has risen over time.

It seems the 7000 series is mighty competitive right now too: it runs Starfield relatively well, there's no recent content where it falls apart, and we have the 7800/7700 XT releases now. Pretty interesting times, because the 7000 series is actually positioned a LOT better at this point than it was at the start of this topic.
The 7700 XT is a mixed bag, but the 7800 XT sold out basically everywhere in the UK within a day or two of launch. The only other current-gen card that had similar success was the 4090, as far as I'm aware. Nothing else. This isn't even an opinion - it's fact.

Hey and another thing, perhaps we need a similarly worded topic on the Nvidia side. How does Ada 'feel' to TPU? We might end up getting similar results... I haven't seen anyone 100% positive about it.
Good idea, I'll start. Ada feels to me like Ampere on a node shrink with fake frames enabled and artificially inflated prices. Nothing new, nothing innovative. The epitome of gimmicky features over function. Though I admit, its efficiency is pretty impressive; it's just that the chips and VRAM capacities are too small for what Nvidia is asking for them. Next! :D

Seconded.

Bar was set in the dirt by Op.

I just tripped on it, it's so low


Sad times.
They're letting it go on because the topic is relevant, and hasn't resulted in name calling, or any other against-the-rules activity, I suppose.

But the bar is pretty low, I agree. When you ask the general public a question like this, trying to manipulate the results by stating your opinion together with the question in the opening post is never a good sign.
 
Then constantly piping in to remind everyone, and argue with everyone, that your opening statement is right, no less.

I'll show how this normally goes by demonstrating how it should have been done.
 
Because there were only two cards in stock :D That's why. :kookoo:
Two cards across 5 retailers (Scan, Box, Ebuyer, CCL Computers and Overclockers), 5 brands (Sapphire, XFX, PowerColor, Asus and Gigabyte), and 2-3 models per brand. Haha, nice joke. :roll:

Seriously though, there was plenty of stock on Sep 6, but Scan was all out by the morning of Sep 7 when I read the review. Ebuyer still had 16 of the Sapphire Pulse; that's where I ordered mine from. A day later, they were out too, and so were Overclockers and all the rest. At Scan, new orders won't be fulfilled until Sep 25 (Sep 30 on some cards), and Sep 25-28 at Ebuyer - other retailers don't show an ETA.
 

Newegg US had over 10 models sell out, and only 2 have been restocked, both ASRock cards, which don't seem as desirable as other brands.

Definitely the best launch since the 4090/7900 XTX, for what it's worth.
 
This means that there is no supply.
Do you think retailers get new shipments every day/week? :kookoo:

Can somebody change the title of this thread to "Let's talk s*#$ about the RX 7000 series to fit ARF's opinion" please?
 
Considering we have never been able to actually run AAA at 1440p Ultra with current-gen GPUs cheaper than the x700/x70 series at 60 FPS [...]

Until the clusterf**k that is Starfield came along. There is no justification whatsoever for a $600 CPU + $1,600 GPU to struggle with 1080p/120 Hz.

This thread reads like a vehicle to shit on these gpus.

I think I have been pretty fair.
 
Do you have one?
I don't have either of the latest gen; sorry, but I am still going to review the specs and judge.

That being said, I am more satisfied with the lineup structure of the 7000 series than Ada's. Not that that's saying much, though.
 
Until the clusterf**k that is Starfield came along. There is no justification whatsoever for a $600 CPU + $1,600 GPU to struggle with 1080p/120 Hz.
This has nothing to do with GPUs, though. Starfield is unreasonably heavy on CPUs, not on GPUs. A $500 GPU is more than enough for 1440p60, which is a rough equivalent of 90 FPS at 1080p. A $900 GPU, namely the 7900 XTX, is theoretically more than enough for 1080p120. It's the CPUs that by no means are.
Bethesda has ALWAYS failed to deliver games with reasonable system resource utilisation, so I don't know why you act like it's a new thing or AMD/Nvidia's fault. It's Bethesda's fault.
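
For the curious, that "1440p60 is a rough equivalent of 90 FPS at 1080p" is just naive pixel-rate maths, sketched here under the assumption of a purely GPU-bound load (which real games never quite are):

```python
# Naive pixel-throughput comparison between resolutions.
px_1440p = 2560 * 1440       # ~3.69 MP per frame
px_1080p = 1920 * 1080       # ~2.07 MP per frame
ratio = px_1440p / px_1080p  # ~1.78x

# Under perfect GPU-bound scaling, 1440p60 would equal ~107 FPS at 1080p.
# Real games scale sublinearly (CPU work, fixed per-frame costs), which is
# why a figure like ~90 FPS is the more realistic equivalent.
print(f"Pixel ratio: {ratio:.2f}x")
print(f"Ideal-scaling 1080p equivalent of 1440p60: {60 * ratio:.0f} FPS")
```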
 
If we compare Starfield with The Last of Us, which is more CPU-bound? I feel that even though Starfield is CPU-demanding too, it's not as demanding as The Last of Us. The Last of Us is still the most CPU-bound game I've ever played...
 