
AMD Radeon RX 7600

You probably won't be able to buy RDNA2 in a few months once they have the entire lineup out. That's one of the only reasons it looks bad. I think the 6600 would probably have been discontinued already if it weren't for the bad economy and low demand. That's likely why they pushed the 7600 out later, but still first, with no 7700 or 7800 series yet.

We don't know if Radeon RX 7700 and 7800 will ever get launched. There are serious discussions that what you got today is the last one.

Next is the next series - the "Radeon 8000 series" - if it ever exists.
 
I'm not losing my shit, but literally the only thing the 7600 could bring to the table after failing to deliver a significant performance uplift was performance/Watt. It's a tiny bit better than the 6600XT but it's not exactly the generational leap we were promised. It just feels like a mid-cycle refresh with an RDNA3 label slapped on it without any tangible benefits.

Everything about this generation is just a case of "it's not what was advertised". We were promised new and better but what we're getting is "slightly more expensive, not really adding anything new, and it's not actually measurably better".

This is the wet fart generation of GPUs. Even if you extend the discussion to include Nvidia, outside of the 4090, every new GPU in the last year has been a disappointment.

I won't argue against the low-end parts being disappointing. But given this is the 7600, not a 7600 XT, AMD might be saving some juice for that. While it still won't be great, I would compare it to the 6600 series. I'm not going to argue - this generation's mid-range is meh so far - but Nvidia opened that box, and I think AMD is following in the same footsteps, just a little better on price.
 
It is in fact useless because it doesn't offer better performance numbers.
Radeon RX 7600 could have been called Radeon RX 6655 XT and nobody would have ever noticed.



Only AV1 and neither of the others.

He specifically stated RDNA3. Read the post and maybe then reply.

By your logic the 4060 Ti could be the 3060 Ti Super and no one would notice.

Again there’s no need to defend Nvidia or AMD. To say RDNA3 or ADA are useless is absolutely wrong.
 
I'm with Dr. Dro on this; you are indeed quoting "marketing talk" improvements that don't appear to exist in the real world.

That 17% IPC improvement is 2% in practice.
That 50% increase in ray intersection performance achieves nothing.
No games give a toss about the AI accelerators.
AV1 is useful to a few people, but also not relevant to the overwhelming majority of gamers.

Perhaps RDNA3 will age like fine wine, but right now it's a turd that has achieved less than any architecture before it.
It's definitely better in ray tracing. IDK what you're talking about with RDNA 3 achieving zero of the 50% improvement lmao.
Calling it a turd is simply ridiculous. Everything is worth its price; if they were charging you $1,200-1,300 for the RDNA3 top end, yeah, you could call it a turd.
AV1 is bad now? LMAO.
It's a good architecture that doesn't beat Nvidia's almighty efficiency, but calling it a turd is nonsense.

You seem to be just complaining for the sake of complaining lmao.
 
Just a 6600XT replacement. AMD removed the XT from the final card name to have the right to compare it with the RX 6600 instead of the RX 6600XT. But based on market pricing and specs (number of cores), this is just a 6600XT replacement. So what we get is a refresh, not a new architecture. Granted, it is RDNA3 not 2, but in performance do we really see any difference? RDNA3 is a lost generation for AMD.

At least they try to be realistic and price the card based on current market pricing. They could have priced the 7600 at $300+ and, while we here would be laughing and throwing DOA all over the place, many people out there would be paying those extra dollars over a 66x0/XT because some seller explained to them that this is a newer and better card compared to the RX 66x0/XT.

Anyway, this generation in tl;dr:
Nvidia: Better performance at higher prices, result: stagnation.
AMD: The same performance at current prices, result: stagnation.

How nice. Hope we get out of this coma soon.

----------

@W1zzard I will object even here about those 8GB conclusions. While you are the expert and I'm the noob who just "read/saw it on the internet", an article where you prove that 8GB is enough would be something worth having. Because while the frame rate in some games may be smooth and paint a certain picture, videos have shown that the time it takes to keep loading textures on a VRAM-limited card has a big impact on visual quality. So maybe FPS numbers don't really tell the whole truth. Think of it this way: if programmers think they've found a way to offer smooth gaming without VRAM optimizations, by just loading/unloading textures and betting on user ignorance about what to expect from a game visually, you will keep saying that "8GB is enough" even after a dozen or more examples of games offering inferior visual quality in a VRAM-limited scenario. Run some tests now, see for yourself, if you haven't done it already, because in a year or two things could be bad and people will start questioning your review conclusions.
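If anyone on Linux wants to sanity-check this themselves, here's a minimal sketch (assuming the amdgpu driver and that the card shows up as card0 - adjust the path if yours differs) that just polls the driver's VRAM counters once a second while a game runs, so you can see how close an 8GB card actually gets to its budget:

```python
#!/usr/bin/env python3
# Minimal VRAM-usage logger for amdgpu on Linux (illustrative sketch only).
# Assumes the GPU is exposed as card0; check /sys/class/drm/ if yours differs.
import time
from pathlib import Path

DEV = Path("/sys/class/drm/card0/device")

def read_bytes(name: str) -> int:
    # amdgpu exposes these counters as plain byte counts in sysfs
    return int((DEV / name).read_text().strip())

total = read_bytes("mem_info_vram_total")
print(f"VRAM total: {total / 2**30:.1f} GiB")

try:
    while True:
        used = read_bytes("mem_info_vram_used")
        print(f"VRAM used: {used / 2**30:.2f} GiB ({100 * used / total:.0f}%)")
        time.sleep(1)  # sample once a second while the game runs
except KeyboardInterrupt:
    pass
```

Of course, hitting the limit doesn't by itself prove the textures got downgraded - that's exactly the visual-quality part that neither FPS numbers nor VRAM counters can show on their own.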

Lastly, I got a little triggered about that backplate and "AMD's problem". I understand your points there - AMD should have tested more cables - but I thought that no matter the conditions, not inserting a cable ALL the way in is a USER ERROR, not Nvidia's or AMD's mistake. That's what Steve from Gamers Nexus tells us every time he has to do some damage control for Nvidia on this matter. (Obviously I don't have a problem with you here.)
And more seriously, is there really a fire hazard with the 8-pin cable? 12-pin cables catching fire, and excuses about "not all the way in", might not apply to the old, tested, proven 8-pin cable. Any tests/videos/articles that say otherwise?
 
It's definitely better in ray tracing. IDK what you're talking about with RDNA 3 achieving zero of the 50% improvement lmao.
Calling it a turd is simply ridiculous. Everything is worth its price; if they were charging you $1,200-1,300 for the RDNA3 top end, yeah, you could call it a turd.
AV1 is bad now? LMAO.
It's a good architecture that doesn't beat Nvidia's almighty efficiency, but calling it a turd is nonsense.

You seem to be just complaining for the sake of complaining lmao.

This card is a turd. The 7900 XT was a turd at $900 but is now decent at $780, and the 7900 XTX has always been fine.
 
Yes, that's what I said... but even with all of the improvements from Source 2 and the likely impeccable optimization since it's the biggest e-sport around (plus Valve spending an inordinate amount of resources on perfecting it), it's still a more sophisticated engine and programmers aren't magicians. I fully expect the system requirements to rise and have exactly this segment feel it - I don't think anyone with a 3090 or 7900 XT is going to feel it regardless.



The fine wine talk has always struck me as more of a fan pitch than an actual reality. It's true that AMD's early GCN cards aged significantly better to the point that they went a full generation/tier up over time, but AMD stuck to the same foundation from late 2011 (HD 7970 launch) to early 2019 (Radeon VII launch) and kept building there. Indeed, each and every one of these improvements carried over to some extent to all previous generation GCN hardware because they were highly iterative instead of complete replacements or redesigns. But it's very easy to look at this iterative work after the many years it's been in the making and call that fine wine, ignoring that many times you would find yourself with a suboptimal configuration because of driver bugs or inefficiencies that would only be worked out X weeks/months/years into the future.

RDNA doesn't carry that trait, each generation having significant differences between each other - the first generation being significantly downlevel hardware incompatible with DirectX 12 Ultimate (and without supporting the optional software fallback as Nvidia offers with Pascal), or the third generation's chiplet architecture, dual-issue workgroups, overhauled instruction set, etc.

You see what bothers me, I genuinely think that RDNA 3 is exceptionally well architected. I just have no idea why it doesn't deliver. I had a faint hope that this would deliver an extra 20% over the 6650 XT, alleviating my concerns that the 7900 series' relatively lukewarm performance was down to the first-generation chiplet architecture and perhaps some internal bottlenecks through its multiple interconnects and buses... but that didn't occur.

I'll need some time to process all of this, read more, understand it better. I'm really confused right now.
I think the issues present in the 7900XTX are also present in this 7600. It's not new silicon; it arrived in laptops at about the same time as the 7900XTX. The fact that we're only getting 6-month-old silicon on desktop now is another matter.

I struggle to find where I've read and watched AMD discuss Navi 3x, but I've definitely seen admissions from AMD's own engineers that there are errors in the Navi31 and Navi33 silicon that require a hardware rework, and that this was the reason the 7900XTX underperformed, why the whole driver team dropped what they were doing for four months solid and tried to work around the faults, and why Navi32 is being postponed.

RDNA3 may yet be great, but right now it's broken. At least they've accepted it's unfixable and gone back to updating drivers for RDNA2 cards again....
 
Just a 6600XT replacement. AMD removed the XT from the final card name to have the right to compare it with the RX 6600 instead of the RX 6600XT. But based on market pricing and specs (number of cores), this is just a 6600XT replacement. So what we get is a refresh, not a new architecture. Granted, it is RDNA3 not 2, but in performance do we really see any difference? RDNA3 is a lost generation for AMD.

The problem is that RDNA 3 is here to stay for years, since they have nothing new after it.
Even the fact that they so aggressively stick to the 2019 6 nm process, when historically they were the pioneers on state-of-the-art process nodes, is quite telling and disturbing.

If you are in the market to buy today, the only option is the Radeon RX 7900 XT 20GB, no matter how prohibitively expensive it is. Buy it today and forget about a new card for the next 10 years.
There won't be new cards after it anyway.
 
The problem is that RDNA 3 is here to stay for years, since they have nothing new after it.
Even the fact that they so aggressively stick to the 2019 6 nm process, when historically they were the pioneers on state-of-the-art process nodes, is quite telling and disturbing.

If you are in the market to buy today, the only option is the Radeon RX 7900 XT 20GB, no matter how prohibitively expensive it is. Buy it today and forget about a new card for the next 10 years.
There won't be new cards after it anyway.

WTF, have you been under a rock? They announced RDNA 4 a while back - almost a year ago.

[Attached screenshot: 2022-06-10_2-31-13.png]
 
You mean the RDNA 4 replacement for the RX 7600 that is up to 5% faster, and Navi 41 that is up to 20% faster than the RX 7900 XTX?
No, thank you :D

That could be true. I wouldn't hold my breath on it being a generational improvement, but to say they have nothing after RDNA3 is still false.
 
Anyway, this generation in tl;dr:
Nvidia: Last-gen performance at the same or higher prices, result: stagnation - with the exception of the 4090 providing a nice performance leap for more $$$.
AMD: The same performance at current prices, result: stagnation - except the 7900 XTX provides performance gains over the previous gen for more $$$ while remaining in line on price-to-performance.

How nice. Hope we get out of this coma soon.

FTFY

Anything that isn't a 4090 or 7900 XTX is an absolute wash unless your GPU died or you are building from scratch at lower price points.
 
Thorough review of a 1080p card, at least they (AMD) called it as it is; RTX 2080 performance without, RTX 2070 performance with ray-chasing...
@W1zzard on page 3, under "8-pin won't connect", the second paragraph it should say "This is clearly an AMD issue - I've never encountered it on any other card before."
 
If you are in the market to buy today, the only option is the Radeon RX 7900 XT 20GB, no matter how prohibitively expensive it is. Buy it today and forget about a new card for the next 10 years.
There won't be new cards after it anyway.
Unfortunately the only option today is the RTX 4090. Nothing else if we are talking about keeping something for 10 years. AMD could have had a real killer with this series if they had doubled RT performance over the 6000 series. If they had done that, if this architecture could achieve that, the 7000 series would have been a future-proof option. In its current form it is not.

[Stupid mode on] If they use shaders for RT calculations, maybe they should start considering a return of (hybrid) CrossFire, where a second card or even the RDNAx cores in Ryzen CPUs/APUs could be used to assist the card in FSR/RT/whatever calculations. It would probably be a return of nightmares for the driver development team, but could it offer a solution? [/Stupid mode off]
 
At this point I almost prefer the Intel CPU monopoly we had over the Nvidia / AMD duopoly currently controlling the GPU market. FFS these new cards are abysmal.
 
At this point I almost prefer the Intel CPU monopoly we had over the Nvidia / AMD duopoly currently controlling the GPU market. FFS these new cards are abysmal.
By what logic were 8-cores for $1000 better than a new 1080P GPU that plays everything for $270?
 
By what logic are 6-cores for $1000 better than a new 1080P GPU that plays everything for $270?

Nonsense, I purchased my 5820K back when I had it for $320, far before AMD was competitive.
 
I gotta be honest with you, chief: for all the (rightful) spanking Nvidia got over the 4060 Ti, this card manages to be even worse. Steve spent half an hour mauling Nvidia for the 5 to 7.5% gains over the 3060 Ti, but according to W1zz's review, this card is a paltry 4% over the 6650 XT, and at 1440p that shrinks to 2% and stays there at 4K. And AMD doesn't even have the feature upsell of DLSS 3, RTX Remix features, etc.

The sole redeeming thing about it is the price. Bloody hell, AMD blew it even after scurrying to lower the price.
The 4060 Ti was objectively worse.
You CAN'T compare the 6650 XT to the 7600; you compare the 6600 to the 7600, and in that comparison, the 7600 is 24% faster while being $270 instead of the $300 that the 6600 launched at.
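Just as a back-of-envelope check on that (using only the 24% figure and the two launch MSRPs quoted above, nothing more scientific):

```python
# Rough perf-per-dollar comparison at launch MSRPs (illustrative only).
speedup = 1.24          # RX 7600 vs RX 6600, per the figure quoted above
rx6600_msrp = 300       # USD at launch
rx7600_msrp = 270       # USD at launch

perf_per_dollar_gain = speedup * (rx6600_msrp / rx7600_msrp)
print(f"perf/$ vs RX 6600 at MSRP: {perf_per_dollar_gain:.2f}x")  # ~1.38x
```

Against the 6650 XT at current street prices the same math is obviously far less flattering, which is the whole argument above.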
 
The 4060 Ti was objectively worse.
You CAN'T compare the 6650 XT to the 7600; you compare the 6600 to the 7600, and in that comparison, the 7600 is 24% faster while being $270 instead of the $300 that the 6600 launched at.

You can compare anything in its general price range, so the 6600-6700 XT - not sure why that isn't fair. A review is to inform a person on what the value in the current market is for a given product; naming is irrelevant. These are both equally bad products - I would probably say this one slightly less so, just because 8GB at $400 is stupid - but both are pretty bad in the current market.
 
We have a near perfect comparison with the 6650XT - the only significant difference being the architecture - and it achieves precisely nothing.
Nor was it technically supposed to. It's using a smaller chip on a cheaper process.
It's probably cheaper to produce than the 6650XT while consuming a bit less power on gaming workloads.


The price is still terrible, though.
AMD would have had a winner with this card at $230, but they'd rather get destroyed by reviewers to get a higher margin that they'll never actually get because they're not selling these GPUs to anyone.
 
not inserting a cable ALL the way in is a USER ERROR
Wait what? It's my fault that the cable cannot go all the way in, no matter how much force I apply or how much I'm trying, since I'm actually aware of the problem?

@W1zzard on page 3, under "8-pin won't connect", the second paragraph it should say "This is clearly an AMD issue - I've never encountered it on any other card before."
Fixed :)
 
Wait what? It's my fault that the cable cannot go all the way in, no matter how much force I apply?

Get to the gym bruh need to get dem gains to plug in dem cables.
 
Wait what? It's my fault that the cable cannot go all the way in, no matter how much force I apply?
Maybe my English is bad here. I'm not blaming you. I did say I understand your points and that AMD should have tested more cables, did I not?

It's just that GN especially, and many others online, have been insisting all day long and for many months now that for every RTX card out there that burned, it was user error. So I'm making fun here, saying that maybe even when it is IMPOSSIBLE to push the cable all the way in, maybe even in this case it's a user error......? :p

Anyway, have you seen anything about the 8-pin cables? Do they catch fire if they are not 100% pushed into the socket? That whole "not all the way in" could be just an excuse to cover for Nvidia and PCI-SIG.
 
RTX 4090. Nothing else if we are talking about keeping something for 10 years.
A GPU from 10 years ago can't even do 1080p properly today....
 