
AMD Radeon RX 7600

If a power connector is not designed in a way that prevents the user from seating it incorrectly, or it doesn't lock completely into its place, then it is a design error.

PS: So in both cases it's a design error. The question is how much of a fire hazard the 8-pin cable poses.
Nothing can prevent user error. There was no obstruction on the 12VHPWR connection, people just didn't plug it all the way in. "It's a competition between engineers to make things more idiot proof, and the universe to make bigger idiots."

This card has a literal obstruction.
 
I'd say even $900 and $330 would be worlds better.
True, the 4080 would still be good at $900, though that's still a bump from the $700 3080. On the 4050/4060 Ti, however, I strongly disagree: anywhere near $300, 12GB needs to be the minimum. No 8GB card is worth $300 today.
 
True, the 4080 would still be good at $900, though that's still a bump from the $700 3080. On the 4050/4060 Ti, however, I strongly disagree: anywhere near $300, 12GB needs to be the minimum. No 8GB card is worth $300 today.

If they priced it at $250 USD, they'd probably kill AMD's sales completely, but we all know Nvidia isn't going to do that, lol. I forgot to add: when I said $330 I meant the 16GB version; the 8GB version should just die, really.
 
I was aiming a little high after the whirlwind of the 4060 Ti reviews. I expected AMD to launch at $400 or $380 or something dumb as well; $270 almost seems decent on the surface until you factor in the well-below-MSRP pricing of 6650 XTs today.

You literally just spelled it out. The 4060 Ti offers small improvements and a notable increase in efficiency. RDNA3 doesn't offer anywhere near as much, and that's not a high bar to clear.

Keyword being "slight". Compared to Ada, RDNA3's architectural improvements may as well be non-existent.

I don't disagree. That doesn't mean we can't point out when one is offering almost nothing over the last card *cough cough* 7600 *cough cough*.

Nobody is ignoring price. The price is the SINGLE LARGEST COMPLAINT. As in, these cards are priced an ENTIRE TIER too high.

At $700 the 4080 is a great card. At $250 so is the 4060ti. But not at $1200 and $400.

The point being that both product lines are equally pointless based on price. Which is an entirely different statement from, and the opposite of, the fact that both Ada and RDNA3 provide performance, hardware, and efficiency improvements.
 
If they priced it at $250 USD, they'd probably kill AMD's sales completely, but we all know Nvidia isn't going to do that, lol. I forgot to add: when I said $330 I meant the 16GB version; the 8GB version should just die, really.
They don't need to, though. Their cards sell. NVIDIA has what, 90% market share? It's AMD that needs to be price competitive, and they were handed the opportunity to gain significant market share on a silver platter, with Ada priced the way it is, but here we are.
 
The point being that both product lines are equally pointless based on price. Which is an entirely different statement from, and the opposite of, the fact that both Ada and RDNA3 provide performance, hardware, and efficiency improvements.
The point is that Ada's generational improvements are a HELL of a lot more impressive than RDNA3's. What part of that isn't getting through your skull?
 
They don't need to, though. Their cards sell. NVIDIA has what, 90% market share? It's AMD that needs to be price competitive, and they were handed the opportunity to gain significant market share on a silver platter, with Ada priced the way it is, but here we are.

Although apparently the 4070-4080 aren't selling very well. I guess we'll get to see how well Ada is doing in the next earnings call, though I'm sure Nvidia will try to spin it to look better than it really is.
 
They don't need to, though. Their cards sell. NVIDIA has what, 90% market share? It's AMD that needs to be price competitive, and they were handed the opportunity to gain significant market share on a silver platter, with Ada priced the way it is, but here we are.
AMD is probably tired of being seen as the "discount" company. They made a last-minute price change when they realised that $300 wasn't going to cut it... and that's with the Navi 33 chip already saving them cost by being on 6nm.
It also seems that they have too much RX 6000 stock and don't want to slash the price too much. (Probably also why the upper mid-range is nowhere to be seen.)
 
So, I am making fun here saying that, maybe even when it is IMPOSSIBLE to push the cable all the way
ah sorry, I misinterpreted what you wrote then

The 16-pin is definitely user error IMO, as they go in just fine as long as you know how much to push and what signs to look for when connected. Having done it a few hundred times now, I actually like the 16-pin because it's so quick and easy to use.

Anyway, have you seen anything about the 8 pin cables? Do they catch fire if they are not 100% pushed in the socket?
It's possible, but very unlikely IMO, to actually get flames, just like on the 16-pin. Still, I wouldn't want to run like that for years without knowing what's going on.
 
AMD is probably tired of being seen as the "discount" company. They made a last-minute price change when they realised that $300 wasn't going to cut it... and that's with the Navi 33 chip already saving them cost by being on 6nm.
It also seems that they have too much RX 6000 stock and don't want to slash the price too much. (Probably also why the upper mid-range is nowhere to be seen.)
Well, the alternative is to come to market first with superior performance/architecture/feature set, or make something that has parity but has USPs or an efficiency advantage; something that pulls people away from what they know: NVIDIA. If the company can't/won't do that, then it needs to be cheaper, or nothing will ever change market-share wise. Intel Arc is currently cheap, has several USPs (e.g. encoding or large VRAM buffers), and has shown up with drivers and ray tracing, so it's gaining market share (slowly).

The point I'm making is that the value has to be better for people to jump ship, why would you leave the CUDA/feature playground of NVIDIA, for something like this? The status quo not changing is beneficial to NVIDIA, so they aren't the ones who need to make pricing or product changes.
 
Why is everyone comparing this to either a card that is a third more expensive for the performance (4060 Ti) or outgoing stock that is heavily discounted (RX 66xx)?
 
Why is everyone comparing this to either a card that is a third more expensive for the performance (4060 Ti)
Because that is the same class by name, the xx6x class. It is also an 8GB GPU, and given that 8GB is the limiting factor, it makes sense they would get compared. Nvidia charging ludicrous pricing does not mean it is free of criticism.
or outgoing stock that is heavily discounted (RX 66xx)?
Because that is the GPU this one is replacing. Do people just expect all GPUs to be compared in a vacuum?
 
Talk about missing the point.

The 4060 Ti is basically a slightly faster 3060 Ti; the only point in its favor is notably better energy efficiency. That is the effect of Ada.

This card, OTOH, is slightly faster than a 6650 XT and pulls power like a 6650 XT. Oh wait........

Assuming 6nm is to be credited for the mild efficiency gain, what exactly has RDNA3 done? Because it sure looks like complete stagnation outside of mild RT improvements (and notably higher RT power usage).

The logic of "I can buy a $200 CPU and it can play everything, without issue, for 10 years".

Not only do GPUs not last that long, but these $270 GPUs cannot play everything fine at 1080p. Steve from HUB has already gone over that: 8GB is insufficient for 1080p in many modern games; you have to turn settings down to a mix of medium/high to avoid stuttering, and in some games it flat out doesn't work.

There is a world of difference there.
1) $1,000 for any PC part is more than expensive no matter what, and it was even more expensive 7 years ago if you account for the inflation of the years in between, plus the Ryzen CPUs made that 6900K obsolete much sooner than expected.
2) Any cost-efficient 1080p GPU with 8GB can play any game smoothly, since in that market segment you don't care about ultra settings but about the best value for money.
 
1) $1,000 for any PC part is more than expensive no matter what, and it was even more expensive 7 years ago if you account for the inflation of the years in between, plus the Ryzen CPUs made that 6900K obsolete much sooner than expected.
2) Any cost-efficient 1080p GPU with 8GB can play any game smoothly, since in that market segment you don't care about ultra settings but about the best value for money.
1) I don't know why you are hung up on $1,000 6-core CPUs. What even is the point you are trying to make with that word salad?
2) No. As Hardware Unboxed showed, even high settings are out of reach for 8GB GPUs now, bringing with them major stuttering issues. Even if the games run, the visual quality is severely compromised and problems arise from texture swapping. It's only a matter of time until even medium settings will be too high going forward. A $300 8GB GPU is not "cost efficient" in 2023.
 
Wow, Computex has not even started yet and people are saying that there are no more SKUs coming from AMD. What you don't see is that this is the response to the 4060 Ti. Nvidia has given us the 16GB 4060 Ti and the 8GB 4060 in July. I am willing to bet that by the end of next week we will have more information on the rest of AMD's stack. The naming is important: this is not an XT card and should not be compared to the 6600 XT, much less the 6650 XT, when the 6600 exists as a card.

For $270 US you are getting a card that is perfect for 1080p ultra (PC World). However, if you like value, the best card for the money in Canada right now is an MSI 6800 Trio OC for $609 CAD (yes, ~$450 US), and that is a killer deal. I will be even more glad when they release the 6700 XT replacement, as right now the 6700 XT is not cheap enough for regular use. The best one in terms of price is the ASRock 6700 XT for $489.

As it stands, 6600s are now in the $220-290 range and in some cases cheaper than some 6500 XTs and even 6400s on Newegg.
 
What makes you think it's fixed? 2 W -> 18 W... a 9x (800%) jump for connecting a 2nd monitor? 2 W -> 27 W for video?
I guess I don't really consider the numbers to be high (unlike the 7900 XT/XTX), but I take your point - that as a percent of idle it seems like poor/buggy behavior not exhibited by other cards. I agree that's a con. Thanks for clarifying.
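For reference, the relative framing in this exchange is just (new − old) / old; a minimal sketch using the wattages quoted above:

```python
def pct_increase(old_w, new_w):
    """Percent increase of a power draw relative to the old (idle) figure."""
    return (new_w - old_w) / old_w * 100

# Idle figures quoted in the thread: 2 W single-monitor baseline
print(pct_increase(2, 18))  # 800.0 -> a 9x jump with a 2nd monitor attached
print(pct_increase(2, 27))  # 1250.0 for video playback
```

The absolute numbers are small, but the relative jump is large, which is the complaint being made here.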
 
1) I don't know why you are hung up on $1,000 6-core CPUs. What even is the point you are trying to make with that word salad?
2) No. As Hardware Unboxed showed, even high settings are out of reach for 8GB GPUs now, bringing with them major stuttering issues. Even if the games run, the visual quality is severely compromised and problems arise from texture swapping. It's only a matter of time until even medium settings will be too high going forward. A $300 8GB GPU is not "cost efficient" in 2023.
Nice moving of the goalposts there! Firstly you ignore the price of the 6900K (my mistake, it's an 8-core, but it was demolished by the much cheaper 1700X), and then the price of the just-released 7600. Kudos! As for the games being playable or not, I haven't seen any proof of what you wrote about the 7600. Please provide some if you like.
 
I guess I don't really consider the numbers to be high (unlike the 7900 XT/XTX), but I take your point - that as a percent of idle it seems like poor/buggy behavior not exhibited by other cards. I agree that's a con. Thanks for clarifying.
You are right, the 7900 series was next-level bad, because it has twice the amount of memory, clocked much higher, possibly at a higher voltage.

The underlying issue is that in those two states the GPU runs memory at full rated speed, not at some lower level.


Actually, it looks like at least some improvement; multi-monitor now runs at a slightly lower frequency.
 
Of course not. It’s getting close to 2 years old. It’s rotted away. But it is a benchmark in time which is still useful for tracking and comparison purposes for release vs release. Obviously time changes everything. I mean you can get a new GT1030 for $100 if you wanted to. ;)
The MSRP at launch (6 years ago) of the 1030 (Pascal architecture, 7 years ago) was $70, and it wasn't a good price even then. This is a museum card; its price today should be $5.
Forget this trash; no one is going to buy obsolete garbage today, much less at a higher price than at launch. The existence of people who still remember it, and the higher prices, indicates that in some minds the pandemic-era miner Stockholm syndrome still endures. The world goes on.
 
AMD: 8GB is a joke in 2023

Also...

AMD: Here is a RX 6650 XT 8GB replacement a year later... enjoy!

What a time to be alive.
 
Because that is the GPU this one is replacing. Do people just expect all GPUs to be compared in a vacuum?

I don't even understand the logic of not comparing it to other products you can buy new in the same general price range; it makes no sense. If the reviewer were comparing it to used products only, I'd fully understand.

Honestly, I can't even remember a generation where they had this much previous-generation stock left (mostly AMD), literally 6 months into this generation.
 
I'd buy it at $225; however, the generational gains are hilariously small, ~5%. The efficiency is as bad as the 6650 XT's. What an awful time.
 
You can actually see how that rumour started though.

So someone gets some info that N33 is 20+ TFLOPS, which is about what the 6900 XT is, leaks the info, and someone assumes that means N33 must have double the shaders of N23.

On top of that you get the info about RDNA3 being architected for 3 GHz+, so you think: 4k shaders plus 3 GHz or higher clock speeds, and suddenly the idea that it could match a 6900 XT at low resolutions, where the 32MB cache and 128-bit bus are not such an issue, begins to form.

The reality, though, is that it does not have double the shaders; they are just dual-issue, and it seems they barely work in games, but give it a compute workload and you can get 21 TFLOPS out of it.
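A back-of-the-envelope sketch of where that 21 TFLOPS figure comes from (the 2048-shader count and ~2.6 GHz clock are the commonly cited Navi 23/Navi 33 specs; treat them as assumptions):

```python
def peak_fp32_tflops(shaders, clock_ghz, flops_per_clock=2):
    """Peak FP32 throughput: shaders x FLOPs per clock (FMA = 2) x clock."""
    return shaders * flops_per_clock * clock_ghz * 1e9 / 1e12

# Navi 23 (6650 XT): 2048 shaders, single-issue FP32
print(peak_fp32_tflops(2048, 2.6))                     # ~10.6 TFLOPS
# Navi 33 (RX 7600): same 2048 shaders, but dual-issue FP32 doubles the paper figure
print(peak_fp32_tflops(2048, 2.6, flops_per_clock=4))  # ~21.3 TFLOPS
```

The doubled figure only materialises when the compiler can actually pair instructions, which is why games rarely see it.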

Also, AMD have form for this kind of thing. The 4870 had 2.5x the shaders of the 3870. RDNA2 was a huge leap over RDNA1. Zen 3 was a big uplift over Zen 2 on the same node. So, given their recent track record of exceeding expectations, people thought RDNA3 would be more of the same. Obviously it didn't turn out that way, but it was a fun ride while it lasted.

This shows how people lack critical thinking. We have known Navi 33's die size and manufacturing node for a while now.

How would AMD raise or maintain IPC while shrinking the die on what is largely the same manufacturing process, while doubling the shader count (and TFLOP count)? Particularly without the X-factor chiplets present (at the time), and particularly with barriers in place that should limit performance. People somehow expected the impossible to be possible.

Some of these impossibilities included violating the laws of physics (doubling transistor density while not running into heat issues on the same node), fixing some glaring flaw that made RDNA2 inefficient in terms of performance per die area (sarcasm), and magically getting over bandwidth barriers with an L3 cache only a quarter of the size.
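The bandwidth gap is easy to put numbers on. A minimal sketch, assuming the commonly cited memory configs (128-bit/18 Gbps GDDR6 for Navi 33 versus 256-bit/16 Gbps for Navi 21):

```python
def raw_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Raw memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Navi 33 (RX 7600): 128-bit bus, 32 MB Infinity Cache
print(raw_bandwidth_gb_s(128, 18))  # 288.0 GB/s
# Navi 21 (6900 XT): 256-bit bus, 128 MB Infinity Cache
print(raw_bandwidth_gb_s(256, 16))  # 512.0 GB/s
```

Even before the 4x smaller cache is considered, Navi 33 has barely half the raw bandwidth needed to feed 6900 XT-class throughput, which is exactly the roadblock being described.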

No one thought of these potential and obvious roadblocks, which should have tempered expectations (at least in the AMD fanbase). It should have been obvious that AMD was doing something very cheap in terms of silicon, which meant IPC was going way down (gaming performance per teraflop). Being able to more than double performance, while not increasing die area, on the same node, with some severe bottlenecks in place, is impossible, particularly given how well RDNA2 was designed.

But people ignored this, and all I can blame is the tribalism effect: people just following each other, believing a YouTuber without evaluating whether it's true or not, and simultaneously ignoring valid reasoning because they would rather feel good.
 
What’s useless about RDNA3? Better raster? Better RT vs last gen? AV1? Small but present efficiency gains?

Nothing. Navi 33 is a minuscule chip; it has 33% of the shaders and bandwidth of a fully enabled Navi 31. I don't know what people expected.

There is nothing wrong with the 7600 other than the obvious fact that it's too expensive.
 