Wednesday, December 27th 2023

AMD Readies Radeon RX 7600 XT, RX 7700, and RX 7800

Even as NVIDIA inches close to the launch of its RTX 40-series SUPER graphics cards in January, AMD could be preparing a product stack update of its own. While NVIDIA's refresh focuses on the higher end of its lineup, AMD looks to spread out more into the mainstream-performance segments. A regulatory filing with the Eurasian Economic Commission mentions the terms "RX 7600 XT," "RX 7700," and "RX 7800," which fill gaps between the RX 7600, RX 7700 XT, and RX 7800 XT.

There is a rather big gap between the $230 Radeon RX 7600 and the $450 RX 7700 XT, which AMD is looking to fill with the RX 7600 XT and RX 7700 (non-XT). How AMD carves out these two will be interesting to see. The RX 7600 already maxes out the 6 nm "Navi 33" silicon it is based on, which means that to create the RX 7600 XT, AMD might have to tap into the larger (and much more expensive) "Navi 32" MCM. There is a vast gap between the 32 CU (compute units) available to the RX 7600 and the 54 CU of the RX 7700 XT (the silicon itself has 60). Besides CU count, AMD has other levers, such as the MCD (memory cache die) count, which could be down to just 2 for the RX 7600 XT, or 3 for the RX 7700. The Radeon RX 7800 is a different beast. AMD drew quite some flak for positioning the RX 7700 XT within $50 of the RX 7800 XT, and the former can now be had for a street price of roughly $430. To squeeze the RX 7800 between the two, AMD might need to widen the gap by pushing the RX 7700 XT down.
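For context on the MCD lever: each RDNA 3 MCD pairs a 64-bit GDDR6 interface with 16 MB of Infinity Cache, so the MCD count directly sets bus width and, with the usual 16 Gbit chips, memory capacity. A rough sketch of the possible configurations (the SKU-to-MCD mapping for the unannounced cards is speculative):

```python
# Rough sketch: how MCD count maps to bus width and VRAM on Navi 32 boards.
# Each RDNA 3 MCD carries a 64-bit GDDR6 interface and 16 MB of Infinity
# Cache; VRAM assumes the usual 16 Gbit (2 GB) package per 32-bit channel.

def mcd_config(mcd_count: int) -> str:
    bus_bits = mcd_count * 64
    vram_gb = (bus_bits // 32) * 2
    cache_mb = mcd_count * 16
    return f"{bus_bits}-bit bus, {vram_gb} GB GDDR6, {cache_mb} MB cache"

# Shipping SKUs plus the speculative mappings discussed above:
for name, mcds in [("RX 7800 XT", 4), ("RX 7700 XT", 3),
                   ("RX 7700 (speculative)", 3), ("RX 7600 XT (if Navi 32)", 2)]:
    print(f"{name}: {mcd_config(mcds)}")
```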
Source: VideoCardz

47 Comments on AMD Readies Radeon RX 7600 XT, RX 7700, and RX 7800

#26
Vayra86
80-watt HamsterBoth are still go-to recommendations at their price point, and that doesn't look to be changing any time soon.
Ah, you mean it like that; then yeah, sure. But perhaps it says more about the dreadful state of the newest gen :D
Posted on Reply
#27
Beginner Macro Device
LionheartI disagree, 6800 was the go to enthusiast/high end card for AMD, especially with that frame buffer.
I think you're confusing it with the 6800 XT. That was worth every penny, unlike its handicapped sibling.
Posted on Reply
#28
Minus Infinity
Vayra86Same, and honestly, they do, except they have one different number.

He isn't wrong. RDNA2 became interesting when AMD started slashing prices, especially at the 6700-6800XT level.
Yeah, I recall the 6800XT hitting close to $2K in Australia at one stage, but 3080s were over $2K, so AMD were always cheaper during the crypto boom.
Posted on Reply
#29
Chrispy_
HyderzThe 7800 non xt will be sandwiched between 7700xt and 7800xt interesting….. the msrp is 50 dollars difference
MSRPs will likely be adjusted to better reflect market value and whatever competitive shift is needed when Nvidia launches the Supers, with the three "new" SKUs spaced more evenly across the stack.

The 7800XT is selling consistently above its $500 MSRP, typically $520+, and the 7700XT is consistently below its MSRP, often down at $430-440. I'm guessing, but I'd expect AMD to shift the official MSRP of the 7700XT to $420 and introduce a 16GB 7800 at $450-475.
80-watt HamsterThe longevity of those two is impressive IMO, and speaks to both how balanced the designs are and how poorly the current generation has served their segments of the market.

I'm disappointed there's nothing on deck for sub-$200 buyers. Are N33 yields too high, or margins too slim to allow for such a thing? I get that USD100 is realistically too low to make any money today, but it feels like there should be room for a viable $150 part.
$99 GPUs have been mediocre or bad for a decade or more, simply because inflation moved the equivalent price point higher. Fixed costs like component sourcing, logistics and management, assembly, machine time, physical packaging, transport, inventory management, customer support, retail logistics, and retail storage and shelf space - none of them scale anywhere near as well as the price/performance curve of the GPU silicon. You have to assume that the fixed overheads of a $99 graphics card are probably close to half those of a $999 graphics card, yet the $999 graphics card can afford to eat $120 in fixed overheads and still be profitable. The $99 graphics card cannot absorb $60 in fixed overheads and have any budget left for the actual product!
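A minimal sketch of that overhead arithmetic (the $120/$60 overheads are the illustrative figures above, and the 10% margin is an assumption, not real BOM data):

```python
# Illustrative only: the assumed fixed overheads above ($120 on a $999 card,
# $60 on a $99 card) leave very different budgets for the actual hardware.

def hardware_budget(msrp: float, fixed_overhead: float,
                    margin_pct: float = 10.0) -> float:
    """Budget left for silicon/VRAM/board after overheads and a target margin."""
    margin = msrp * margin_pct / 100
    return msrp - fixed_overhead - margin

for msrp, overhead in [(999, 120), (99, 60)]:
    budget = hardware_budget(msrp, overhead)
    print(f"${msrp} card: ${budget:.0f} left for hardware "
          f"({budget / msrp:.0%} of retail price)")
# ~$779 (78%) on the $999 card vs ~$29 (29%) on the $99 card.
```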
Posted on Reply
#30
80-watt Hamster
Chrispy_$99 GPUs have been mediocre or bad for a decade or more, simply because inflation moved the equivalent price point higher. Fixed costs like component sourcing, logistics and management, assembly, machine time, physical packaging, transport, inventory management, customer support, retail logistics, and retail storage and shelf space - none of them scale anywhere near as well as the price/performance curve of the GPU silicon. You have to assume that the fixed overheads of a $99 graphics card are probably close to half those of a $999 graphics card, yet the $999 graphics card can afford to eat $120 in fixed overheads and still be profitable. The $99 graphics card cannot absorb $60 in fixed overheads and have any budget left for the actual product!
Right, $100 is out; pretty sure we all recognize that at this point. I still think $150 should work. My company sources a low-volume, domestically produced PCBA with AC-DC conversion, SOC, Wi-Fi module and several socket connectors for an $80 purchased cost (IIRC). Granted, we're dealing with 7W rather than ~75, but that still suggests to me that the economics of volume production in Asia should work for a $150 GPU. Maybe I'm wrong. Wouldn't be the first time.
Posted on Reply
#31
Chrispy_
80-watt HamsterRight, $100 is out; pretty sure we all recognize that at this point. I still think $150 should work. My company sources a low-volume, domestically produced PCBA with AC-DC conversion, SOC, Wi-Fi module and several socket connectors for an $80 purchased cost (IIRC). Granted, we're dealing with 7W rather than ~75, but that still suggests to me that the economics of volume production in Asia should work for a $150 GPU. Maybe I'm wrong. Wouldn't be the first time.
I guess the products you're talking about are the RX 6400 and GTX 1630, both of which are atrocious value for money because of those overheads eating up a disproportionate amount of the hardware budget.

They exist, but they've been sales disasters, and just about every review, guide, forum, and vlogger has told anyone listening to give them a wide berth and either spend the extra 20% on an RX6600 or save money and buy something on the used market. A $139 RX 6400 is on par with a GTX 970 from 8 years ago, and you can snag those (working, not faulty) on eBay for as little as $25 if you're lucky and patient; otherwise the median price is about $55. Why would you pay $139 for something you can get for $55, which also works fine without needing ReBAR or a PCIe 4.0 slot because it actually has all 16 lanes wired up...
Posted on Reply
#32
Vayra86
80-watt HamsterRight, $100 is out; pretty sure we all recognize that at this point. I still think $150 should work. My company sources a low-volume, domestically produced PCBA with AC-DC conversion, SOC, Wi-Fi module and several socket connectors for an $80 purchased cost (IIRC). Granted, we're dealing with 7W rather than ~75, but that still suggests to me that the economics of volume production in Asia should work for a $150 GPU. Maybe I'm wrong. Wouldn't be the first time.
Frankly, $150 for a GPU is ultra cheap too. Look at what's needed. It's not like these bottom-tier GPUs can be a totally failed mid- or high-end chip; they need a small die for this segment.

It's about the chip. And then there are also the VRAM chips. Those just cannot be any off-the-shelf OEM part.

I would personally be fine with a $200-225 entry-level GPU that isn't crippled beyond playability. This is where I believe x50 should sit. x60 is perfect at 300. x70 can populate 400-450. x80 shouldn't exceed 700. Top end... sure, 1K.

We can dream :)
Posted on Reply
#33
80-watt Hamster
Chrispy_I guess the products you're talking about are the RX 6400 and GTX 1630, both of which are atrocious value for money because of those overheads eating up a disproportionate amount of the hardware budget.

They exist, but they've been sales disasters, and just about every review, guide, forum, and vlogger has told anyone listening to give them a wide berth and either spend the extra 20% on an RX6600 or save money and buy something on the used market. A $139 RX 6400 is on par with a GTX 970 from 8 years ago, and you can snag those (working, not faulty) on eBay for as little as $25 if you're lucky and patient; otherwise the median price is about $55. Why would you pay $139 for something you can get for $55, which also works fine without needing ReBAR or a PCIe 4.0 slot because it actually has all 16 lanes wired up...
I'm not talking about anything that exists currently, but what could potentially exist. I also don't like paying too much attention to the used market when talking about retail lineups; used-market buyers are a relatively small proportion of customers AFAIK.

The $300 4060 manages nearly double the performance of the $220 (launch) 1660 while pulling the same power. I find it hard to believe that an Ada-based, 1660-equivalent card couldn't have been designed to viably hit a $150-180 ASP. However:
Vayra86It's about the chip. And then there are also the VRAM chips. Those just cannot be any off-the-shelf OEM part.
This is true: they would probably have needed an AD108 with the proper number of shaders and a bus configuration that could support 6G/128. A crippled AD107 with 6G/96 would get crucified in the press/community, even if it performed reasonably well.

Edit: bus width
Posted on Reply
#34
systemBuilder
Beginner Micro DeviceWhy? It was more expensive and not substantially faster than 3070, also lacking DLSS/CUDA support and lacking RT performance. FSR was not a thing either. The whole RDNA2 line-up was launched more overpriced than Ampere was. It's only this year it's the other way around.
You have completely botched your research! The 6800 is within 5 FPS of the 3080 in rasterization on Tom's Hardware's GPU hierarchy from 1080p to 4K - the 3070 is about 15% behind ...

The 6800 @MSRP had the best price/performance of the AMD 6000 series and the highest FPS/watt of any AMD card - the efficiency champion! People seem to be ignorant of what a great performer the 6800 is! It was sold out for almost the entire pandemic as a result - I know, I was looking for one!
DenverI bet the 7600XT is just a version with more memory and maybe higher clocks. I don't see any reason to use a larger chip that could be sold in a product with better margins.
More cache and/or higher memory clocks are possible - a less-than-10% speedup, like the 3070 Ti ...
Posted on Reply
#35
Beginner Macro Device
systemBuilderYou have completely botched your research! The 6800 is within 5 FPS of the 3080 in rasterization on Tom's Hardware's GPU hierarchy from 1080p to 4K - the 3070 is about 15% behind ...
How old is this testing? If it's less than a couple of years old, then it simply can't refute my statement.

At launch (late 2020), and not a single quarter later (= no 2021+), the 6800 carried a $580 MSRP. The then-non-existent 3070 Ti is outta the equation, and the 3070 was $500 (about 5 to 10 percent behind in raster, about 20 to 40 percent ahead in RT, plus it had DLSS, whereas FSR had not been invented yet). The 3080 was ahead in every game, mostly by 10 to 30 percent in pure raster. In RT, that could easily spike to more than twice as fast thanks to DLSS.

So no, three years ago, that was horrible value at best. And being sold out during a pandemic is not an achievement; everything was sold out back then.
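Putting those launch numbers into a quick value calculation (a rough sketch using the estimated percentages above, not benchmark data):

```python
# Perf-per-dollar at launch MSRPs using the rough deltas quoted above
# (illustrative estimates, not measured benchmark data).

cards = {
    # name: (launch MSRP in USD, raster perf relative to RX 6800 = 1.00)
    "RX 6800":  (580, 1.00),
    "RTX 3070": (500, 0.925),  # "5 to 10 percent behind" -> midpoint
    "RTX 3080": (700, 1.20),   # "ahead mostly by 10 to 30 percent" -> midpoint
}

for name, (msrp, perf) in cards.items():
    print(f"{name}: {perf / msrp * 1000:.2f} raster perf per $1000 of MSRP")
# Even in pure raster value the 3070 edges out the 6800, before counting
# DLSS and the RT lead - which is the core of the argument above.
```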
Posted on Reply
#36
Chrispy_
Beginner Micro DeviceHow old is this testing? If it's less than a couple of years old, then it simply can't refute my statement.

At launch (late 2020), and not a single quarter later (= no 2021+), the 6800 carried a $580 MSRP. The then-non-existent 3070 Ti is outta the equation, and the 3070 was $500 (about 5 to 10 percent behind in raster, about 20 to 40 percent ahead in RT, plus it had DLSS, whereas FSR had not been invented yet). The 3080 was ahead in every game, mostly by 10 to 30 percent in pure raster. In RT, that could easily spike to more than twice as fast thanks to DLSS.

So no, three years ago, that was horrible value at best. And being sold out during a pandemic is not an achievement; everything was sold out back then.
You can't compare launch prices for Ampere or RDNA2. MSRP or street price didn't matter because there's no accurate data on either. People were paying 250% of list price for Nvidia cards, which were absolute vapourware at those prices unless you were lucky enough to get one of the minuscule number sold direct from the Nvidia website.

RDNA2's list prices were the opposite: rather than being unrealistic, they were disappointingly realistic. If the 3080 was selling for 1500 $€£ in your region, then the 6800XT was selling close to that even though it didn't have the ETH-mining throughput - there were enough gamers trying to get hold of a gaming GPU that demand driven by its gaming performance, relative to the market price of the 3080, was enough to exceed the supply, even among gamers with no interest in ETH mining.

Q2 2020 onwards was a global dumpster fire of GPU under-supply and over-demand, in very stark contrast to the record-low prices and the massive surplus of GPUs made to meet the 2017-2018 mining boom, which crashed just as they all finished production. As consumers, we swung from 2019 being the best year for GPU pricing ever recorded to the worst.
Posted on Reply
#37
Denver
systemBuilderYou have completely botched your research! The 6800 is within 5 FPS of the 3080 in rasterization on Tom's Hardware's GPU hierarchy from 1080p to 4K - the 3070 is about 15% behind ...

The 6800 @MSRP had the best price/performance of the AMD 6000 series and the highest FPS/watt of any AMD card - the efficiency champion! People seem to be ignorant of what a great performer the 6800 is! It was sold out for almost the entire pandemic as a result - I know, I was looking for one!


More cache and/or higher memory clocks are possible - a less-than-10% speedup, like the 3070 Ti ...
Yeah, with the 6700XT/6750XT still on the market, a $299 MSRP for the 7600XT seems the natural fit.
Posted on Reply
#38
Animekenji
I hope that there is going to be a card comparable to the 4070, with similar power consumption and a single 8-pin power connector. AMD has a poor track record when it comes to matching Nvidia's power efficiency and putting an appropriate power connector on its cards. A 200W card should require only a single 8-pin connector, not two, which AMD is infamous for.
Posted on Reply
#39
Beginner Macro Device
AnimekenjiI hope that there is going to be a card comparable to the 4070, with similar power consumption and a single 8-pin power connector.
RDNA4? Possibly.
RDNA3? Not gonna happen.
Posted on Reply
#40
Denver
AnimekenjiI hope that there is going to be a card comparable to the 4070, with similar power consumption and a single 8-pin power connector. AMD has a poor track record when it comes to matching Nvidia's power efficiency and putting an appropriate power connector on its cards. A 200W card should require only a single 8-pin connector, not two, which AMD is infamous for.
Isn't the 7800XT already that? 50W more or less doesn't change anyone's life. .-.
Posted on Reply
#41
Chrispy_
DenverIsn't the 7800XT already that? 50W more or less doesn't change anyone's life. .-.
And AMD cannot match the power efficiency of a 4nm monolithic die using multiple 5nm and 6nm dies that carry additional chip-interconnect power draw.

It's kind of incredible that the 7800XT gets as close to the 4070 as it does, given that it's also powering a wider bus and more GDDR6 packages on top of the MCD design.
Posted on Reply
#42
khohada
HD64GApart from 7600XT, those GPUs might be just OEM models with lower power draw? And 7600XT might be a full-die and factory oced N33. Just saying.
The previous RX7600 was a full-die Navi 33, so the RX7600XT will probably be a higher-clocked version of the RX7600.
Posted on Reply
#43
HD64G
khohadaThe previous RX7600 was a full-die Navi 33, so the RX7600XT will probably be a higher-clocked version of the RX7600.
With more info about it surfacing in the last few days, it seems that the 7600XT is a further cut-down version of N32. If so, sensible pricing might make it THE bargain GPU.
Posted on Reply
#44
RJARRRPCGP
Chrispy_A $139 RX 6400 is on par with a GTX 970 from 8 years ago
IIRC, with less functionality than a GTX 970! (no video encoding) For a fair amount of gaming, especially GTA V and other pre-2020 titles, we would have been better off with a GTX 970, LOL.
Posted on Reply
#45
Beginner Macro Device
RJARRRPCGPIIRC, with less functionality than a GTX 970! (no video encoding) For a fair amount of gaming, especially GTA V and other pre-2020 titles, we would have been better off with a GTX 970, LOL.
Yeah, and for $140 you can also grab a second-hand 1080 (or even a 1080 Ti if you're lucky enough) and get much more FPS as well as a proper 8 GB of VRAM (for $140, it's proper).
Posted on Reply
#46
Chrispy_
HD64GWith more info about it surfacing in the last few days, it seems that the 7600XT is a further cut-down version of N32. If so, sensible pricing might make it THE bargain GPU.
Do you have source(s)? I'm not finding much info about it, and 16GB VRAM (confirmed by the official EEC registration) is the only actual fact we have so far; the rest is older, debunked info that conflicts with the newer hard evidence of the EEC filing.

A cut-down Navi32 card would have 12GB (or 10GB) VRAM, not 16GB, and if it was cut in half to make a 16GB card using just two MCDs, it would be much more expensive than Navi33 to make and also slower, since Navi33 has the latency advantage of being monolithic AND it's more than half of Navi32 to begin with. IMO, that's not economically viable, and it also relies on a large enough quantity of extremely defective Navi32 dies, which contradicts both AMD and TSMC boasting about absolutely fantastic yields this generation...

I can certainly see another triple-MCD card in the lineup - that's undoubtedly going to be the RX 7700, but it would be a 12GB variant, or possibly a 10GB variant if AMD wants to use up some harvested MCDs for a 2.5-MCD, 160-bit bus. My original speculation of the 7700 being a 48 CU (3072 shaders) Navi32 and the 7600XT being a 40 CU (2560 shaders) Navi32 has been proven incorrect - at least for the 7600XT - by the 16GB VRAM size.
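A quick sanity check of the bus-width arithmetic (assuming the standard single 16 Gbit GDDR6 package per 32-bit channel; clamshell mounting is the hypothetical that doubles it):

```python
# Sanity check: VRAM implied by RDNA 3 bus widths, assuming one 16 Gbit
# (2 GB) GDDR6 package per 32-bit channel; clamshell doubles the capacity.

def vram_gb(bus_bits: int, clamshell: bool = False) -> int:
    chips = bus_bits // 32                  # one package per 32-bit channel
    return chips * 2 * (2 if clamshell else 1)

for mcds in (2, 2.5, 3, 4):                 # each MCD supplies 64 bits of bus
    bus = int(mcds * 64)
    print(f"{mcds} MCDs -> {bus}-bit -> {vram_gb(bus)} GB "
          f"(clamshell: {vram_gb(bus, clamshell=True)} GB)")
# 16 GB needs either four MCDs (256-bit) or a clamshell 128-bit board, which
# is why the EEC-confirmed 16 GB sits oddly with a two-MCD Navi 32 cut.
```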
Posted on Reply
#47
HD64G
Chrispy_Do you have source(s)? I'm not finding much info about it, and 16GB VRAM (confirmed by the official EEC registration) is the only actual fact we have so far; the rest is older, debunked info that conflicts with the newer hard evidence of the EEC filing.

A cut-down Navi32 card would have 12GB (or 10GB) VRAM, not 16GB, and if it was cut in half to make a 16GB card using just two MCDs, it would be much more expensive than Navi33 to make and also slower, since Navi33 has the latency advantage of being monolithic AND it's more than half of Navi32 to begin with. IMO, that's not economically viable, and it also relies on a large enough quantity of extremely defective Navi32 dies, which contradicts both AMD and TSMC boasting about absolutely fantastic yields this generation...

I can certainly see another triple-MCD card in the lineup - that's undoubtedly going to be the RX 7700, but it would be a 12GB variant, or possibly a 10GB variant if AMD wants to use up some harvested MCDs for a 2.5-MCD, 160-bit bus. My original speculation of the 7700 being a 48 CU (3072 shaders) Navi32 and the 7600XT being a 40 CU (2560 shaders) Navi32 has been proven incorrect - at least for the 7600XT - by the 16GB VRAM size.
Firstly, the cut-down N10 (5700) had the same memory config as the full-die one (5700XT), so that scenario is valid and easy to pull off. Moreover, N33 is maxed out in the 7600, so there's very little chance it's just an OC'd model. In addition, most rumour sites have suggested for two weeks now that the 7600XT gets a core die one step more cut down than the 7700XT's. We're not far from launch now, so we'll know for sure soon, eh?
Posted on Reply