
What is the Radeon RX 7000 series for you? Please elaborate

What is the Radeon RX 7000 series for you?

  • Success: 49 votes (30.8%)
  • Disappointment: 42 votes (26.4%)
  • So-so: 68 votes (42.8%)
  • Total voters: 159
  • Poll closed.
1080p + DLSS would probably look better than 720p or something
Pure 720p is disgusting on a 1080p display.
Going DLSS/FSR (Quality) makes it render at 720p but it still looks pretty much fine considering the same display, game, and other settings.

All these upscalers are still far from ideal at 1080p and lower resolutions, though; there isn't enough data to work with. OTOH, at 1440p and especially 4K, going native makes less sense than going DLSS/FSR.
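Just to put numbers on what "Quality" mode actually renders, here's a minimal sketch of the preset arithmetic. The per-axis scale factors (0.667 / 0.58 / 0.5 for Quality / Balanced / Performance) are the commonly cited DLSS/FSR defaults and are an assumption here; exact factors vary by implementation and version.

Code:
# Approximate internal render resolutions for common upscaler presets.
# Scale factors are assumed per-axis values, not taken from any one SDK.
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_resolution(out_w, out_h, scale):
    """Return the approximate internal render resolution for a given output."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    for out_w, out_h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
        w, h = internal_resolution(out_w, out_h, scale)
        print(f"{out_w}x{out_h} {name}: ~{w}x{h} internal ({w * h / 1e6:.1f} MP)")

At 1080p Quality that works out to roughly 1280x720 internal (hence the "renders at 720p" point above), while 4K Quality still renders roughly 2560x1440, which is why the upscalers have far more data to work with at higher output resolutions.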
 
Things are probably different in other markets from what I read, but here in Brazil 7900XT/XTX are overpriced as hell. The 7900XT is ~1000BRL (~200 USD) higher than 4070Ti on average, and with shorter warranties (1 year vs 3 from the nVidia lineup) most of the time. The 7900XTX is more or less equal to 4080 on pricing, but the warranty problem still stands, with only MSI offering 3 years for RDNA3.

The only product that makes sense from a bang for the buck perspective is the RX 7600, and that is if you're coming from Polaris or Vega at most, and if you only play 1080p (not a big deal, since 1440p and 4k gaming is still overwhelmingly the minority here).

All in all, I don't think AMD cares about this market. The price cuts that happen elsewhere usually never make it here, and combined with the warranty stuff, that makes it harder to recommend these products against the Ada stack.
Here, most partners offer 2-year warranties, with some, such as ASRock and Asus, offering 3-year warranties.
 
As for using transistor budget, that's also utter nonsense you've made up with zero factual basis. Transistors spent on cache or chiplet interconnect logic don't add to performance the same way that transistors spent on compute cores or ROPs do. You're completely clueless about how a GPU works if you believe otherwise.

Maybe you are clueless. The transistor count is a part of the bill of materials and DOES participate in the expenses for the making of the product.
Your reality shifting won't change the facts :D
 
Maybe you are clueless. The transistor count is part of the bill of materials and DOES participate in the expenses for the making of the product.
Your reality shifting won't change the facts :D
Die area is the only thing that matters when it comes to a chip's cost.
 
Die area is the only thing that matters when it comes to a chip's cost.

And die area is about how many transistors the design has. Fewer transistors means less die area; less die area means fewer transistors.
 
Die area is the only thing that matters when it comes to a chip's cost.
Die area and manufacturing process. For instance, what Nvidia is using this generation is most certainly more expensive per wafer than what AMD is using, so even if you compared chips with the same area, the cost wouldn't be the same.
 
And die area is about how many transistors the design has. Fewer transistors means less die area; less die area means fewer transistors.
You can't derive die area from transistors and vice versa. When talking about a chip's cost, you should only mention die area. No foundry counts the transistors and bills you for them.

Die area and manufacturing process. For instance, what Nvidia is using this generation is most certainly more expensive per mm^2 than what AMD is using, so even if you compared chips with the same area, the cost wouldn't be the same.
Of course; that's a given. The GTX 1060 and GTX 980 have comparable performance, but they were made on different processes so a cost comparison would depend upon the costs of the various processes.
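Since the thread keeps circling cost, here's a rough back-of-the-envelope sketch of how die area and wafer price (i.e. the process) drive per-die cost. The wafer prices and defect density below are illustrative assumptions, not real foundry quotes, and the yield model is the simplest Poisson one. Note that transistor count never appears directly; it only matters insofar as it drives area on a given process.

Code:
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross-die estimate; ignores scribe lines and partial edge dies."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_cm2=0.1):
    good = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2, defects_per_cm2)
    return wafer_cost_usd / good

# Purely illustrative wafer prices:
print(f"300 mm^2 die, $12k wafer: ~${cost_per_good_die(300, 12000):.0f} per good die")
print(f"300 mm^2 die, $17k wafer: ~${cost_per_good_die(300, 17000):.0f} per good die")
print(f"600 mm^2 die, $17k wafer: ~${cost_per_good_die(600, 17000):.0f} per good die")

Same area on a pricier process costs more, and doubling the area more than doubles the cost because yield drops too, which is the point both sides here are making from different ends.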
 
[attached graph]

 
It might be that the 4090 has the same problem that AMD encountered for larger GPUs: the front-end is not able to keep up with the shaders, and they don't get enough work in many workloads.

Or it has transistors which don't add to the overall performance - transistors for compute workloads (not gaming, or at least not games that have already been released), and for ray tracing, ML, DL, etc.
 
Let me politely disagree. Upscaling uses different algorithms to upscale (as the name suggests) a lower resolution image to your chosen resolution. That, by all means, is lower quality than native.
That's not what I said at all though. You're not supposed to use upscaling instead of native (although in lots and lots of games it does look better because of TAA). You're supposed to use it when your target is, let's say, 120 fps and you can only get 80 without it. So instead of, e.g., dropping to 1440p on a 4K monitor, you use DLSS/FSR instead.
Sure, 4K + DLSS looks better than 1440p native. In my case, 1080p + DLSS would probably look better than 720p or something. But that's not the point. 1080p native will always be better than 1080p + DLSS.
That part is definitely, definitely not true. Lots of games look like ass at native due to TAA. Hardware Unboxed tested this, and in half the games DLSS just straight up looks better than native.

Because the graph is obviously flawed. It counts disabled transistors as well. 3080 = 3090? Nope.

So play it at the preset just below the maxed-out one; the perceptible difference is generally nonexistent and the game runs much better.
Well, then it depends on what is less noticeable to you, activating upscaling or dropping quality settings. To me, upscaling shows less difference, or sometimes even an improvement, compared to dropping the quality preset, so I use it.
 
Or it has transistors which don't add to the overall performance - transistors for compute workloads (not gaming, or at least not games that have already been released), and for ray tracing, ML, DL, etc.
Nope, that isn't the case. The 4090 has 68% more ray tracing and shading capability than the 4080, but there's no game where it shows anywhere close to that scaling.

@fevgatos also raises a good point. You can only use transistor count or die area to compare performance if you're comparing fully functional products, e.g. a 3090 Ti vs a 3070 Ti or a 6900 XT vs a 6700 XT.
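As a quick sanity check on that scaling point, here's a small sketch using the published SM counts for the two cards; the frame rates in it are made-up numbers purely to show how to read the gap, not measurements.

Code:
# Published SM counts: RTX 4090 (AD102) vs RTX 4080 (AD103).
sm_4090, sm_4080 = 128, 76
theoretical = sm_4090 / sm_4080          # ~1.68x more shader/RT hardware

# Hypothetical 4K frame rates, illustrative only.
fps_4090, fps_4080 = 120.0, 90.0
observed = fps_4090 / fps_4080           # ~1.33x in this made-up case

# Fraction of the extra hardware that actually shows up as extra frames.
efficiency = (observed - 1) / (theoretical - 1)
print(f"Theoretical {theoretical:.2f}x vs observed {observed:.2f}x "
      f"-> ~{efficiency:.0%} of the extra units turned into FPS")

If the observed ratio sits well below the theoretical one across many games, that's consistent with the front-end/feeding problem mentioned above rather than with the extra transistors simply not being there.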
 
Let me politely disagree. Upscaling uses different algorithms to upscale (as the name suggests) a lower resolution image to your chosen resolution. That, by all means, is lower quality than native.
Guess what, when you're viewing those lower-quality images at over 60 frames per second, it doesn't matter, because your eyes literally cannot pick out the bits that are lower quality. This is exactly why GPU designers are investing in upscaling algorithms, because they understand that frame rates are more important than quality.
 
Don't count transistors as a way to measure performance, just like you don't count shaders to measure performance.
Ada has tons of transistors that aren't getting used in games, often at all:
- Two giant media engines for transcoding
- Tensor cores for deep learning applications
- RT core transistors, which still only see moderate use in games and surely don't count towards performance

GPU cores are a complex thing. This isn't a giant chunk of "frames per second" infrastructure built purely out of parts that contribute to your gaming performance.
 
Define fine? Hardware Unboxed did a video a couple of days ago; the 7900 XT is getting nowhere near 144 Hz at 4K. Heck, not even at 1440p. It drops below 60 fps in some games at 4K, and the average across the games they used was 82 fps.
Let me think of the Games I have played recently. With my 7900X3D

Greedfall 120+ FPS at all times
Remnant 131 FPS
Baldurs Gate 3 133.5 FPS
TWWH3 120+ FPS in battles
Forspoken 130+ FPS
HZD 140+ FPS
Just Cause 4 150+ FPS
Raji 160+ FPS
Sleeping Dogs 200+ FPS
Dragons Dogma 60 FPS with hot spot temp of 50 C
Grid Legends 300+ FPS
Redout 2 200+ FPS



You do not seem to appreciate that I actually own a 7900 XT, and you know what, I could have gotten an XTX, but I saved $400 getting the XT, and to be honest, the only reason I respond is that I am so impressed. I will say it again: a 20 GB GPU that has up to 2898 transistor gates opening at the same time is f'ing epic when you pair it with a CPU that has V-Cache for games that support it and 5.65 GHz on the other CCD when it is not.

Since I am having such a good time, I had no reservations ordering Armored Core 6. Baldur's Gate 3, as you can see, is also fine, but COH3, Watch Dogs and Deus Ex are also spectacular in performance. I have been drinking, so you will get a more passionate response, in that I am absolutely over the moon with my PC.

When people like you try to opine on products, do you understand how foolish it is arguing with someone who is an actual owner? I am not some snot-nosed fanboy but someone who is passionate about PCs and, since the Atari days, gaming. All of the culture war issues that come into the PC space are so ridiculous, with people championing things that the average user could not feel. As an example, the Kingston NV2 will feel as fast as a Seagate 530 but show its weakness in everyday use after about 2 weeks, and on a 120 Hz monitor you would be hard pressed to tell the difference, but there is a difference between the 6800 XT and the 7900 XT in the form of 300+ MHz higher boost clocks, more transistors and 4 more GB of VRAM.

You can go on though, as your arguments make for good reading in the responses. Unfortunately you are unwavering in support for Nvidia and would hence hijack an AMD thread and verbalize like it would be incredulous to accuse you of such.
 
That's not what I said at all though. You're not supposed to use upscaling instead of native (although in lots and lots of games it does look better because of TAA). You're supposed to use it when your target is, let's say, 120 fps and you can only get 80 without it. So instead of, e.g., dropping to 1440p on a 4K monitor, you use DLSS/FSR instead.
My target is whatever is smooth. In some games, it's 30 FPS, sometimes it's 40, in some other games, it's 60. What I definitely don't want is to rely on upscaling to get it, though.

That part is definitely, definitely not true. Lots of games look like ass at native due to TAA. Hardware Unboxed tested this, and in half the games DLSS just straight up looks better than native.
That's not my experience. I prefer to trust my own eyes over believing reviews unquestioningly. You might have a different experience, and that's fine, but at 1080p, upscaling always makes the image quality worse. Sometimes by a little, sometimes by a lot. This topic has been over-discussed, though, so I'd suggest we leave it at that.

Guess what, when you're viewing those lower-quality images at over 60 frames per second, it doesn't matter, because your eyes literally cannot pick out the bits that are lower quality. This is exactly why GPU designers are investing in upscaling algorithms, because they understand that frame rates are more important than quality.
I'm a slow gamer (I only play offline, for the story and atmosphere), so I can pick these differences most of the time. But if I can have good enough performance without any upscaling, then why shouldn't I?
 
You do not seem to appreciate that I actually own a 7900XT and you know what
I don't really care about what you have and what you've played, I'm just saying that claiming the XT is fine for 4K 144 Hz is just not true, more like misinformation. That's like someone saying he plays at 1440p 60 fps on a 1060. Sure, it might be true in his case because he's playing CS:GO and Dota, but it gives the wrong impression.
 
I don't really care about what you have and what you've played, I'm just saying that claiming the XT is fine for 4K 144 Hz is just not true, more like misinformation. That's like someone saying he plays at 1440p 60 fps on a 1060. Sure, it might be true in his case because he's playing CS:GO and Dota, but it gives the wrong impression.
Telling a happy owner that the thing he owns isn't good and he shouldn't be happy is a bit strange.

Also, everything depends on your use case. If you're happy with a 1060, then you are, there's no "wrong impression" about that.
 
Telling a happy owner that the thing he owns isn't good and he shouldn't be happy is a bit strange.

Also, everything depends on your use case. If you're happy with a 1060, then you are, there's no "wrong impression" about that.
Never said it's not good or anything like that. I said it's not getting the fps he is claiming to be getting.
 
Things are probably different in other markets from what I read, but here in Brazil 7900XT/XTX are overpriced as hell. The 7900XT is ~1000BRL (~200 USD) higher than 4070Ti on average, and with shorter warranties (1 year vs 3 from the nVidia lineup) most of the time. The 7900XTX is more or less equal to 4080 on pricing, but the warranty problem still stands, with only MSI offering 3 years for RDNA3.

The only product that makes sense from a bang for the buck perspective is the RX 7600, and that is if you're coming from Polaris or Vega at most, and if you only play 1080p (not a big deal, since 1440p and 4k gaming is still overwhelmingly the minority here).

All in all, I don't think AMD cares about this market. The price cuts that happen elsewhere usually never make it here, and combined with the warranty stuff, that makes it harder to recommend these products against the Ada stack.
My discussions with @Dr. Dro lead me to believe that Brazilians get screwed over.
You just have to pick what you can, based on your own research of what's available at whatever scalper price you can get it for.
Performance/$ charts seem to be meaningless in Brazil :\
 
My discussions with @Dr. Dro lead me to believe that Brazilians get screwed over.
You just have to pick what you can, based on your own research of what's available at whatever scalper price you can get it for.
Performance/$ charts seem to be meaningless in Brazil :\
It's pretty silly that they're ignoring a country with a higher GDP than Spain's and one comparable to Australia's.
 
Never said it's not good or anything like that. I said it's not getting the fps he is claiming to be getting.
You can check my posts; there are some screenshots of gaming in there, so if you refuse to believe them, that is your problem, not mine.
 
It's pretty silly that they're ignoring a country with a higher GDP than Spain's and one comparable to Australia's.
They're not ignoring it. Brazil has laws on imported electronics which make it a less than desirable market. Why this impacts AMD more than Nvidia, I'm not sure, but it's likely a factor of scale.
 
You can check my posts; there are some screenshots of gaming in there, so if you refuse to believe them, that is your problem, not mine.
It's not that I refuse to believe it, it's that in heavy games it can't get 144 fps at 4K. Not even a 4090 can.
 
It's not that I refuse to believe it, it's that in heavy games it can't get 144 fps at 4K. Not even a 4090 can.
Like I said check my posts that have screenshots showing performance.
 