
AMD Radeon RX 7600

He specifically stated RDNA3. Read the post and maybe then reply.

By your logic the 4060ti could be the 3060ti super and no one would notice.

Again there’s no need to defend Nvidia or AMD. To say RDNA3 or ADA are useless is absolutely wrong.
Talk about missing the point.

The 4060ti is basically a slightly faster 3060ti; the only point in its favor is notably better energy efficiency. That is the effect of ADA.

This card, OTOH, is slightly faster than a 6650xt and pulls power like a 6650xt. Oh wait...

Assuming 6nm is to be credited for the mild efficiency gain, what exactly has RDNA3 done? Because it sure looks like complete stagnation outside of mild RT improvements (and notably higher RT power usage).
By what logic are 6-cores for $1000 better than a new 1080P GPU that plays everything for $270?
The logic of "I can buy a $200 CPU and it can play everything, without issue, for 10 years".

Not only do GPUs not last that long, but these $270 GPUs cannot play everything fine at 1080p. Steve from HUB has already gone over this: 8GB is insufficient for 1080p in many modern games; you have to turn settings down to a mix of medium/high to avoid stuttering, and in some games it flat out doesn't work.

There is a world of difference there.
 
Talk about missing the point.

The 4060ti is basically a slightly faster 3060ti; the only point in its favor is notably better energy efficiency. That is the effect of ADA.

This card, OTOH, is slightly faster than a 6650xt and pulls power like a 6650xt. Oh wait...

Assuming 6nm is to be credited for the mild efficiency gain, what exactly has RDNA3 done? Because it sure looks like complete stagnation outside of mild RT improvements (and notably higher RT power usage).

Bottom line: in the current market, the 4060ti is 6% more expensive than its predecessor for 12.5% more performance, and the 7600 is 7.5% faster than the 6600XT for 8% more money. I would have used the 6600, but W1z decided not to use it in the average fps chart for some reason. Either way, major stagnation with both cards.

Going by TechSpot's numbers, though, the 7600 is 24% faster than the 6600 for 35% more money in the current market, so that comparison is even worse.
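
To put those percentages in perf-per-dollar terms, here's a rough sketch (Python, using only the figures quoted above at current-market pricing, so treat it as an approximation rather than review data):

Code:
# Rough perf-per-dollar math from the percentages quoted above.
def perf_per_dollar_change(perf_gain, price_change):
    """Relative change in performance per dollar vs. the older card."""
    return (1 + perf_gain) / (1 + price_change) - 1

cases = {
    "4060ti vs 3060ti (current market)": (0.125, 0.06),
    "7600 vs 6600XT (current market)":   (0.075, 0.08),
    "7600 vs 6600 (TechSpot, current)":  (0.24,  0.35),
}

for name, (perf, price) in cases.items():
    print(f"{name}: {perf_per_dollar_change(perf, price):+.1%} perf per dollar")

# Prints roughly +6%, -0.5% and -8%: stagnation at best, and an outright
# regression against today's 6600 street price.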


There really is no defending either of these products; someone would need to be really high on red or green Kool-Aid to even try to make a case that they are not terrible.
 
Maybe my English is bad here. I'm not blaming you. I did say I understand your points and that AMD should have tested more cables, did I not?

It's just that GN especially, and many others online, have been claiming for many months now that every RTX card out there that burned was user error. So I'm poking fun here by asking: maybe even when it is IMPOSSIBLE to push the cable all the way in, it's still user error......? :p

Anyway, have you seen anything about the 8-pin cables? Do they catch fire if they are not pushed 100% into the socket? That whole "not all the way in" could just be an excuse to cover for Nvidia and PCI-SIG.
With the new connector on RTX 40 series cards, people didn't fully connect it; that was pure user error, as it was completely physically possible to push the connector all the way in.

With this 7600, the backplate physically prevents you from fully inserting some 8-pin cables. It depends on what cables your PSU has, but it isn't user error; it's a design flaw.
 
Hopefully people have some common sense and don't encourage this behavior from both AMD and NVIDIA.
I would rather buy cheaper and faster used older-gen hardware than support this BS.
 
A GPU from 10 years ago can't even do 1080p properly today....
Performance improvements are slowing down from generation to generation, especially in the mid range, while new generations take longer to come to market. Many years ago we were getting a new gen every year with real performance improvements over the previous one. Today we have to wait 2+ years to get about the same performance in the mid range. Yes, the high end keeps getting serious performance improvements at extremely high prices, but games are built around consoles and cards that cost roughly under $700.

For example
8800 GT, 512MB, 2007, $349 - 100% performance
GTX 970, 4GB, 2014, $329 - 691% performance
So, in a 7-year period: 7 times the performance, 8 times the VRAM capacity.

GTX 970, 4GB, 2014, $329 - 100% performance
RTX 4060 Ti, 8GB, 2023, $400 - 300% performance
So, in a 9-year period: 3 times the performance, 2 times the VRAM capacity, and a higher price.

So, if nothing changes, in 10 years you can expect 2 or 3 times the performance, 2 times the VRAM capacity (if not just 50% more) and an even higher price in the $400-$600 market.
An RTX 4090 will still be at least as fast as a 2033 $500 GPU if things don't improve drastically. It will only be behind in features, efficiency and technology.
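
For what it's worth, here is the same argument as a quick annualized-growth sketch (Python; the multipliers are just the ones quoted above, so this is purely back-of-the-envelope):

Code:
# Average yearly growth implied by the multi-generation jumps quoted above.
def annual_growth(multiplier, years):
    """Average yearly performance growth implied by an N-year multiplier."""
    return multiplier ** (1 / years) - 1

eras = {
    "8800 GT (2007) -> GTX 970 (2014)":     (6.91, 7),
    "GTX 970 (2014) -> RTX 4060 Ti (2023)": (3.00, 9),
}

for name, (mult, years) in eras.items():
    print(f"{name}: {mult:.1f}x over {years} years = {annual_growth(mult, years):.0%}/year")

# ~32%/year then vs ~13%/year now. Even holding 13%/year for another decade
# only gives about 1.13**10 = 3.4x, and less if the slowdown continues, which
# is where the "2 or 3 times the performance by 2033" estimate comes from.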
 
Performance improvements are slowing down from generation to generation, especially in the mid range, while new generations take longer to come to market. Many years ago we were getting a new gen every year with real performance improvements over the previous one. Today we have to wait 2+ years to get about the same performance in the mid range. Yes, the high end keeps getting serious performance improvements at extremely high prices, but games are built around consoles and cards that cost roughly under $700.

For example
8800 GT, 512MB, 2007, $349 - 100% performance
GTX 970, 4GB, 2014, $329 - 691% performance
So, in a 7-year period: 7 times the performance, 8 times the VRAM capacity.

GTX 970, 4GB, 2014, $329 - 100% performance
RTX 4060 Ti, 8GB, 2023, $400 - 300% performance
So, in a 9-year period: 3 times the performance, 2 times the VRAM capacity, and a higher price.

So, if nothing changes, in 10 years you can expect 2 times the performance, 2 times the VRAM capacity (if not just 50% more) and an even higher price.
An RTX 4090 will still be twice as fast as a 2033 $500 GPU if things don't improve drastically. It will only be behind in features, efficiency and technology.
Those improvements have slowed, but it's not because of the node. You already highlighted that the 4090 is a genuine improvement. If we call the 4060ti what it is, a 4050, then it actually looks pretty good for $250.

Reminder: the 4060ti die is 190 mm²; the 1650 was 200 mm². This thing is not a 4060ti, it is a 4050 with the wrong name, much like how the 4080 is really a 4070, the 4070 a 4060, and so on. When put into the appropriate price range, the improvements are great.
 
Ouch. The review is enough. Avoid both companies. Sigh. Time to sit on my hands. I want an RTX 4080 for $700 USD, so I'll wait for that.
 
We haven't had competition in the GPU market for quite a while, and in that time AMD has become an Nvidia subsidiary. Pitiful. The only hope is for Intel or a new GPU manufacturer to step in. The RX 7600 belongs in the $100 segment; $270 is too much.
 
With the new connector on RTX 40 series cards, people didn't fully connect it; that was pure user error, as it was completely physically possible to push the connector all the way in.

With this 7600, the backplate physically prevents you from fully inserting some 8-pin cables. It depends on what cables your PSU has, but it isn't user error; it's a design flaw.
If a POWER connector is not made in a way that prevents the user from connecting it incorrectly, or is made in a way that it doesn't completely lock into its place, then it is a design error.

PS: So in both cases this is a design error. The question is how much of a fire hazard there is in the case of the 8-pin cable.
 
Going by TechSpot's numbers, though, the 7600 is 24% faster than the 6600 for 35% more money in the current market, so that comparison is even worse.
You can't compare a GPU that just released with one that has been on the market for over 2 years. I would say it's $270 vs $330 and 24% better; that's comparing apples to apples (MSRP vs MSRP). You can come back in a year and easily see the 7600 at $220.

The case for Nvidia would be $400 vs $400, and you only get 10% better.
 
You can't compare a GPU that just released with one that has been on the market for over 2 years.
Yes, you can. You literally can, since that 2-year-old GPU is the card the new one is replacing. This should not need explaining.
I would say it's $270 vs $330 and 24% better; that's comparing apples to apples (MSRP vs MSRP). You can come back in a year and easily see the 7600 at $220.
So you can't compare a new GPU to a 2-year-old GPU, but you CAN compare a 3-year-old GPU to a 1-year-old GPU?
 
You can't compare a GPU that just released with one that has been on the market for over 2 years. I would say it's $270 vs $330 and 24% better; that's comparing apples to apples (MSRP vs MSRP). You can come back in a year and easily see the 7600 at $220.

The case for Nvidia would be $400 vs $400, and you only get 10% better.

So we can only compare it in a way that makes your chosen GPU maker look better, gotcha......
 
Talk about missing the point.

The 4060ti is basically a slightly faster 3060ti; the only point in its favor is notably better energy efficiency. That is the effect of ADA.

This card, OTOH, is slightly faster than a 6650xt and pulls power like a 6650xt. Oh wait...

Assuming 6nm is to be credited for the mild efficiency gain, what exactly has RDNA3 done? Because it sure looks like complete stagnation outside of mild RT improvements (and notably higher RT power usage).

The logic of "I can buy a $200 CPU and it can play everything, without issue, for 10 years".

Not only do GPUs not last that long, but these $270 GPUs cannot play everything fine at 1080p. Steve from HUB has already gone over this: 8GB is insufficient for 1080p in many modern games; you have to turn settings down to a mix of medium/high to avoid stuttering, and in some games it flat out doesn't work.

There is a world of difference there.

Missing what point?

The 4060ti offers on average ~8% more performance at 1080p, and a performance regression at other resolutions, over the 3060ti at the same price point. The advantages are better efficiency (more than RDNA3), AV1, and DLSS3. From a performance-per-dollar (value) standpoint it offers worse value than the 3060ti does; you're now paying more for almost no generational improvement, with the exception of efficiency.

The 7600 offers small gains over the card it's set to replace, with a slight efficiency bump, plus the additional benefit of AV1 and other architectural improvements over the previous gen, similar from a hardware standpoint to ADA.

They're both horrible products when comparing gen to gen, which should be the comparison here. It's just that one is a SIGNIFICANTLY worse value/buy than the other *cough 4060ti cough*.

So many people look at these low/mid-range products with a warped viewpoint from looking at each product stack top down. The only thing that should matter to the end user is how much you're getting for the amount you paid, and as previously mentioned, the only true options for substantial generational improvements are the 4090 and 7900 XTX, depending on RT vs. no-RT use cases.

There’s no reason to defend either product when both companies are giving consumers a massive shafting.
 
I'm with Dr. Dro on this; you are indeed quoting "marketing talk" improvements that don't appear to exist in the real world.

That 17% IPC uplift turns out to be 2%.
That 50% increase in ray intersection performance achieves nothing.
No games give a toss about the AI accelerators.
AV1 is useful to a few people, but also not relevant to the overwhelming majority of gamers.

Perhaps RDNA3 will age like fine wine, but right now it's a turd that has achieved less than any architecture before it.
No. Quoting wiz. :)

Agree 100%. Was just pointing out there were some differences vs the 6600. Nothing more, nothing less.
 
Those improvements have slowed, but it's not because of the node. You already highlighted that the 4090 is a genuine improvement. If we call the 4060ti what it is, a 4050, then it actually looks pretty good for $250.

Reminder: the 4060ti die is 190 mm²; the 1650 was 200 mm². This thing is not a 4060ti, it is a 4050 with the wrong name, much like how the 4080 is really a 4070, the 4070 a 4060, and so on. When put into the appropriate price range, the improvements are great.
We can't ignore pricing. If in 10 years the whatever xxx60 card costs $2000, we can't say
"Yeah, but this is in reality a $300 card, so we have to approach it as a $300 and not as a $2000 card."
 
So you can't compare a new GPU to a 2-year-old GPU, but you CAN compare a 3-year-old GPU to a 1-year-old GPU?
Prices will always fall; be patient. That's the smart thing to do. Although I agree this should have a max MSRP of $250.

your chosen GPU maker look better, gotcha
I have both and don't care about brand, only performance per dollar.
 
Prices will always fall; be patient. That's the smart thing to do. Although I agree this should have a max MSRP of $250.

The only problem is that even at $250 it would only offer an 8% price-to-performance jump over the 6600XT.

This really needs to be $225 max.
 
All those imaginary improvements, and it's still almost zero gains over the 6650XT with the same shader count. Good thing it can do 300 fps on a 4K monitor though, it really needed that improvement.
Ya, I don't disagree. This is a 6600 replacement though, not a 6600xt or 6650xt replacement. Well, at least that's what AMD is saying.
 
Ya, I don't disagree. This is a 6600 replacement though, not a 6600xt or 6650xt replacement. Well, at least that's what AMD is saying.

My issue with comparing it to the 6600, at least where price/performance in the current market is concerned, is that it actually looks way worse..... it's around a net -10% currently.
 
You probably won't be able to buy RDNA2 in a few months, once they have the entire lineup out. That is one of the only reasons it looks bad. I think the 6600 probably would have been discontinued if it weren't for the bad economy and low demand. That's likely why they pushed the 7600 later, but still first, and have no 7700 or 7800 series yet.
I’m hoping the 6600 goes down more once this is available. I could use two more inexpensive cards.
 
I’m hoping the 6600 goes down more once this is available. I could use two more inexpensive cards.

From what I'm seeing, at least comparing a couple of days ago vs today, prices actually went up slightly lol.
 
You've got to be joking? This is a joke of a product, just like the 4060ti.
I was a little high after the whirlwind of the 4060ti reviews. I expected AMD to launch at $400 or $380 or something dumb as well; $270 almost seems decent on the surface until you factor in the well-below-MSRP pricing of 6650xts today.
Missing what point?

The 4060ti offers on average ~8% more performance at 1080p, and a performance regression at other resolutions, over the 3060ti at the same price point. The advantages are better efficiency (more than RDNA3), AV1, and DLSS3. From a performance-per-dollar (value) standpoint it offers worse value than the 3060ti does; you're now paying more for almost no generational improvement, with the exception of efficiency.
You literally just spelled it out. The 4060ti offers small improvements and a notable increase in efficiency. RDNA3 doesn't offer anywhere near as much, and that's not a high bar to clear.
The 7600 offers small gains over the card it's set to replace, with a slight efficiency bump, plus the additional benefit of AV1 and other architectural improvements over the previous gen, similar from a hardware standpoint to ADA.
Keyword being "slight". Compared to ADA, RDNA3's arch improvements may as well be non-existent.
They're both horrible products when comparing gen to gen, which should be the comparison here.

There’s no reason to defend either product when both companies are giving consumers a massive shafting.
I don't disagree. That doesn't mean we can't point out when one is offering almost nothing over the last card *cough cough* 7600 *cough cough*.
We can't ignore pricing. If in 10 years the whatever xxx60 card costs $2000, we can't say
"Yeah, but this is in reality a $300 card, so we have to approach it as a $300 and not as a $2000 card."
Nobody is ignoring price. The price is the SINGLE LARGEST COMPLAINT. As in, these cards are priced an ENTIRE TIER too high.

At $700 the 4080 is a great card. At $250 so is the 4060ti. But not at $1200 and $400.
 
@W1zzard kudos on your callback to the non-XT 6700, although I would have liked to see actual data for it on the charts. IIRC the non-XT 6700 was meant as a mobile part and had better perf/W than other (desktop) RDNA2 parts.

Good ideas there, I think they need to be split by segment though

I agree, and there are lots of cool ways to segment: an esports/high-refresh-rate games segment, an RT/SS segment, a pure raster segment, etc.

Btw, something like my idea would have meant the 4060TI received the 'Good for the Earth' award, since it was in the top 5 on the efficiency chart. Very few would argue over such an award, and it would let readers know that they are getting good frames per watt.

I would separate the awards according to performance segments, like the ones seen in some games' system requirements:

1080p@30fps, 1080p@60fps, 1080p@144fps, ...., 1440p@60fps, ..., 2160p@144fps, ... (maybe not that many, but just to give an idea)
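
Something like this is what I have in mind, just to make the buckets concrete (the segment names and thresholds here are invented for illustration, not anything from the review):

Code:
# Bucket each card by the highest segment it clears in the average-fps charts.
SEGMENTS = [  # (label, resolution, minimum average fps) -- illustrative only
    ("2160p@144fps", "2160p", 144),
    ("1440p@144fps", "1440p", 144),
    ("1440p@60fps",  "1440p", 60),
    ("1080p@144fps", "1080p", 144),
    ("1080p@60fps",  "1080p", 60),
    ("1080p@30fps",  "1080p", 30),
]

def segment_for(avg_fps_by_res):
    """Return the highest segment a card qualifies for, or None."""
    for label, res, min_fps in SEGMENTS:
        if avg_fps_by_res.get(res, 0) >= min_fps:
            return label
    return None

# Hypothetical numbers, only to show the shape of the idea:
print(segment_for({"1080p": 95, "1440p": 68, "2160p": 36}))  # -> 1440p@60fps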
 