
AMD Radeon RX 7600

if AMD don't sell a lot of 7600
This is only possible if it's mispriced, which is not yet the case. Its current price makes older GPUs sell better. Once those go out of stock, AMD will sell 7600s like it's the mining boom again (either for $250 if NVIDIA's 4060 fails, or for below $200 otherwise) and stay profitable. Considering their super mega dooper profits from the 2020–2022 crypto fever era, they won't really mind losing a little revenue on this exact GPU.
 
It would be if it was something "new" to the market. Let me introduce you to the RX 6700 (non-XT):
($280 at the time of writing)
Umm, I specifically said MSRP. Market prices vary; at some point you will be able to pick up any of these cards for less than half that. A two-year-old mid-range card for less than $300 isn't news. It launched at, what, a $429 MSRP? I know it was over $400. If the market continues like this, you'll be able to snag one of these for $200 in a few months.
 
Umm, I specifically said MSRP. Market prices vary; at some point you will be able to pick up any of these cards for less than half that. A two-year-old mid-range card for less than $300 isn't news. It launched at, what, a $429 MSRP? I know it was over $400. If the market continues like this, you'll be able to snag one of these for $200 in a few months.
AFAIK the desktop 6700 non-XT was never "officially" announced by AMD, the cards just started showing up. Where did you find the MSRP?
Once a card releases it needs to be compared with others in the same current price bracket and/or similar (raster) performance. If tied on both metrics, then also look at (exclusive) features and power consumption (perf/w).
The 7600's only advantages are HDMI 2.1 and AV1, if I understand correctly.
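The comparison rule above (same price bracket, similar raster performance, then perf/$ and perf/W as tie-breakers) boils down to simple arithmetic. A minimal sketch, where every price, fps, and wattage figure is an invented placeholder rather than data from the review:

```python
# Hypothetical value comparison: fps per dollar and fps per watt.
# All numbers below are illustrative placeholders, not review data.
cards = {
    "RX 7600":    {"fps": 100, "price": 269, "watts": 165},
    "RX 6650 XT": {"fps": 98,  "price": 249, "watts": 180},
    "RTX 4060":   {"fps": 105, "price": 299, "watts": 115},
}

def value_metrics(card):
    """Return (fps per dollar, fps per watt) for one card."""
    return card["fps"] / card["price"], card["fps"] / card["watts"]

for name, c in cards.items():
    perf_per_dollar, perf_per_watt = value_metrics(c)
    print(f"{name}: {perf_per_dollar:.3f} fps/$, {perf_per_watt:.2f} fps/W")
```

Once two cards tie on both ratios, exclusive features (AV1 encode, DLSS/FSR support, HDMI 2.1) become the deciding factor, which is exactly the ordering the post describes.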
 
Why aren't the RX 6650 XT (and RX 6600) included in the power consumption charts, and the Intel Arc A750 in the performance charts? @W1zzard
 
Hello, why doesn't the relative performance of the RX 7600 in this review match its relative performance in the TPU database? In this review it is 11% behind the 3060 Ti at 1080p, and even more at higher resolutions...
In the database the RX 7600 is just 3% shy of the 3060 Ti, matches the performance of the 2080 Super, and is 4% slower than the 6700 XT... In this review the RX 7600 is 14% behind the 6700 XT at 1080p, and even more at higher resolutions.
 
I don't know why AMD consistently shoots itself in the foot. If this was priced at $250, it would have been a much better product and would have created a lot more buzz. It still wouldn't have been great, but it would have been pretty good and certainly offered a lot more value! As it is, it got mediocre reviews and a mediocre reception, which will kill any momentum for this card, so even when they ultimately lower the price to $250 in a month or so, it will fly under people's radar.

Realistically this should cost $200, not a penny more, but the RTX 4060 Ti should not be more than $250 either, or $280–300 for the 16 GB version.
 
I don't know why AMD consistently shoots itself in the foot. If this was priced at $250, it would have been a much better product and would have created a lot more buzz. It still wouldn't have been great, but it would have been pretty good and certainly offered a lot more value! As it is, it got mediocre reviews and a mediocre reception, which will kill any momentum for this card, so even when they ultimately lower the price to $250 in a month or so, it will fly under people's radar.

Realistically this should cost $200, not a penny more, but the RTX 4060 Ti should not be more than $250 either, or $280–300 for the 16 GB version.

According to PC Gamer, the RTX 30 series enjoys a 30% market share versus AMD's measly 3% for the RX 6000 series (PC Gamer citing data from the Steam Hardware & Software Survey, April 2023).

You would think AMD would push to compete more aggressively with more attractive price points, but nah, they're not interested. Increasing market share at the gaming end is no longer a priority... the focus seems to be 98% AI and other, more profitable (commercial) revenue streams. Consumers just need to wise up and stop buying into all the BS, or at least not at the given price. Otherwise, the price is justified.
 
Hello, why doesn't the relative performance of the RX 7600 in this review match its relative performance in the TPU database? In this review it is 11% behind the 3060 Ti at 1080p, and even more at higher resolutions...
In the database the RX 7600 is just 3% shy of the 3060 Ti, matches the performance of the 2080 Super, and is 4% slower than the 6700 XT... In this review the RX 7600 is 14% behind the 6700 XT at 1080p, and even more at higher resolutions.

Yep, there is no way in hell the 3060 Ti is only 3% faster than the 7600, lol

[attached screenshot: TPU database relative performance comparison]



From @W1zzard's own review: Relative Performance at 1080p (which is the best-case scenario for the 7600; the gap widens further at 1440p and 4K)

[attached screenshot: review Relative Performance chart at 1080p]
 
To anyone calling the 4060 ti a 50-series card in disguise: Is your memory short, or did Nvidia Stockholm Syndrome you so bad that you don't remember historical performance scaling? Ampere blew the lid off the expected performance range in the consumer market, and managed to normalize it and the prices it commands. 50-series performance in 2015 (for NV; AMD was playing name games at the time, as usual) meant 45fps in Metro:LL, 43 in Battlefield 4 on a GTX 950 at 1080p. Cut to today, and the 4060 ti pulls 93fps in CP77 and 127 in FarCry 6 (not direct comparisons, I know, but Metro and Battlefield don't have anything resembling recent releases). It manages a 60+ average in the entire current test suite at 1440p (excepting the CP twins), at Ultra no less. Blame COVID and crypto all you want, but PC gamers need to take some responsibility for allowing expectations to be set too high, and being willing to spend too much money to meet them. The 4060 ti is definitely priced too high, and may have been better slotted as a 4060, but a 4050 it is not.

On the topic of the 7600, though... Lop off thirty bucks and you start to be in the realm of having something compelling on your hands. Preferably fifty.
 
I'm with Dr. Dro on this; you are indeed quoting "marketing talk" improvements that don't appear to exist in the real world.

That 17% IPC gain is 2% in practice.
That 50% increase in ray intersection performance achieves zero.
No games give a toss about the AI accelerators.
AV1 is useful to a few people, but not relevant to the overwhelming majority of gamers.

Perhaps RDNA3 will age like fine wine, but right now it's a turd that has achieved less than any architecture before it.
If you compare it to the RX 6650XT, then yes it only offers 2% faster FPS and 2% better ray tracing, which is rather disappointing.
But if you compare it to the RX 6600, then the difference is close to 25% faster FPS and ray tracing.

In an apples-to-apples comparison, with similar specs, it looks like a slightly overclocked 6650XT with AV1 and better AI rendering.

Recipe. How to make an RX 7600:
1) Take an RX 6650XT.
2) Add AV1 encoding support.
3) Add improved AI rendering (for programs such as Stable Diffusion).
4) Add 6nm process (for improved performance/watt).

It's quite likely the core is substantially the same as the 6650XT, just with AV1 and improved AI. It's still an improvement though, and prices will drop.
 
Recipe. How to make an RX 7600:
1) Take an RX 6650XT.
2) Add AV1 encoding support.
3) Add improved AI rendering (for programs such as Stable Diffusion).
4) Add 6nm process (for improved performance/watt).

It's quite likely the core is substantially the same as the 6650XT, just with AV1 and improved AI. It's still an improvement though, and prices will drop.
Chips and Cheese just micro-benchmarked the 7600 and it differs from regular RDNA3 in some ways. The biggest difference, aside from the lithography node, is the smaller vector register file. Each RDNA3 CU in Navi 33 has 256 KiB of vector registers while its bigger siblings get 384 KiB. This means that ray tracing workloads will see a smaller improvement compared to what was seen with the 7900 XT/XTX. It's also interesting to see that the off-chip L3 cache in Navi 31 is significantly higher latency than the on-chip L3 in Navi 33.

[attached chart: Chips and Cheese memory latency comparison]
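The register-file sizes quoted above translate directly into how many wavefronts a shader can keep in flight, which is why ray tracing (typically register-hungry) gains less on Navi 33. A back-of-the-envelope sketch, assuming wave32 execution (32 lanes, 4 bytes per 32-bit VGPR per lane) and two SIMDs per CU, and ignoring RDNA3's real allocation granularity and dual-issue details:

```python
KIB = 1024

def max_waves_per_simd(vgpr_file_per_cu_kib, vgprs_per_wave, simds_per_cu=2, lanes=32):
    """How many wave32s fit in one SIMD's share of the vector register file."""
    file_bytes = vgpr_file_per_cu_kib * KIB // simds_per_cu
    bytes_per_wave = vgprs_per_wave * lanes * 4   # 4 bytes per 32-bit VGPR per lane
    return file_bytes // bytes_per_wave

# A register-hungry shader (say, 128 VGPRs per wave, plausible for an RT kernel):
print("Navi 33 (256 KiB/CU):", max_waves_per_simd(256, 128), "waves in flight")
print("Navi 31 (384 KiB/CU):", max_waves_per_simd(384, 128), "waves in flight")
```

Under these simplified assumptions the bigger chips can keep half again as many waves resident for the same shader, giving the scheduler more latency-hiding headroom on exactly the workloads where RDNA3's RT improvements show up.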
 
Chips and Cheese just micro-benchmarked the 7600 and it differs from regular RDNA3 in some ways. The biggest difference, aside from the lithography node, is the smaller vector register file. Each RDNA3 CU in Navi 33 has 256 KiB of vector registers while its bigger siblings get 384 KiB. This means that ray tracing workloads will see a smaller improvement compared to what was seen with the 7900 XT/XTX. It's also interesting to see that the off-chip L3 cache in Navi 31 is significantly higher latency than the on-chip L3 in Navi 33.

If I understand right, the decoupled front-end and shader clocks are gone, too?
 
If I understand right, the decoupled front-end and shader clocks are gone, too?
Even if the decoupling is there, this implementation of RDNA3 seems to prioritize the shader clocks, which makes sense: the front end only has to feed a GPU a third the size of the 7900 XTX. Navi 33 would be limited by its shader array rather than the front end.
 
Even if the decoupling is there, this implementation of RDNA3 seems to prioritize the shader clocks, which makes sense: the front end only has to feed a GPU a third the size of the 7900 XTX. Navi 33 would be limited by its shader array rather than the front end.
It makes sense - as does the fact that the decoupled clocks were a big part of the 7900 series marketing, but aren't mentioned now.

Does GPU-Z or HWinfo report the front-end clock?

Now I'm even more tempted to buy a 7600 just to play around. :D
 
This is probably the worst review ever. "A 1080p card"? Ultra vs. High settings typically wastes performance for almost nothing, and at 1440p, before FSR, this card is well above 60 fps. Hundreds of millions of gamers are on the PS5, the Xbox Series S, and the Switch (which struggles to hit 720p at 30 fps) and those are insanely popular, yet this guy thinks 70+ fps at 1440p on badly chosen Ultra settings, before FSR, makes it a 1080p card? Keep in mind that no broke twitch-shooter or esports gamer plays at 1440p Ultra; even people on $2,500 computers don't.

People on budget cards probably shouldn't use Ultra, or skip FSR, yet your data before any of that is already positive for 1440p, before even using the card right. Hardware Unboxed, a multi-million-subscriber YouTube channel, did a good deep dive showing that Ultra vs. High is dumb... like this review.
 
The guys at AMD must be taking their sweet time redesigning the oh-so-complicated shroud around the power connector, because it's still in "coming soon" status on the AMD shop website.

C'mon, fellas, hurry up! I want my spare AMD card to fool around with! :D
 