Wednesday, May 12th 2021

AMD Radeon RX 6600 XT, 6600 to Feature Navi 23 Chip With up to 2048 Stream Processors

AMD is preparing to round out its RX 6000 series lineup with the upcoming RX 6600 XT and RX 6600, bringing true midrange GPUs to its latest-generation RDNA2 architecture. According to recent leaks, both graphics cards should feature AMD's Navi 23 chip - another full chip design - rather than making do with a cut-down Navi 22 (even though that chip still only powers a single graphics card in the AMD lineup, the RX 6700 XT).

According to the leak, the RX 6600 XT should feature 2048 stream processors and 8 GB of GDDR6 memory over a 128-bit memory bus. The RX 6600, on the other hand, is said to feature a cut-down Navi 23, with only 1792 of the chip's stream processors enabled, while offering the same 8 GB of GDDR6 over a 128-bit memory bus. There are even some benchmark scores to go with these leaks: supposedly, the RX 6600 XT scores 9,439 points in 3DMark Time Spy (Graphics), while the RX 6600 scores 7,805 points. Those scores place these cards in the same ballpark as the RDNA-based RX 5700 XT and RX 5700.

Both cards are expected to feature a further cut-down 32 MB of Infinity Cache - a third of the RX 6700 XT's 96 MB. With the die size estimated at 236 mm², AMD is essentially delivering the same performance as Navi 10 in roughly 15 mm² less area, while shaving some 45 W off that card's TDP (225 W for the RX 5700 XT versus an estimated 180 W for the RX 6600 XT).
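For reference, a quick check of the deltas quoted above. The 251 mm² die area for Navi 10 (the RX 5700 XT) is a commonly cited figure rather than something stated in the leak, so treat it as an assumption:

\[
251\,\mathrm{mm^2} - 236\,\mathrm{mm^2} = 15\,\mathrm{mm^2},
\qquad
225\,\mathrm{W} - 180\,\mathrm{W} = 45\,\mathrm{W}
\]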
Source: Videocardz

45 Comments on AMD Radeon RX 6600 XT, 6600 to Feature Navi 23 Chip With up to 2048 Stream Processors

#1
london
L O L give me a 120 W card AMD, NOT a 180 W card with a 128-bit bus.
#2
Turmania
I'm a bit of a 75 W GPU fanatic - no power cables. Would love to see AMD implement one from their camp. But with the latest trend of increasing power consumption on cards, I'm resigned to it probably not happening.
#3
Punkenjoy
I think there is a mistake in the news. The 6700 XT has 96 MB of cache, and these are expected to receive 64 MB of Infinity Cache.
#4
ZoneDymo
But will it actually be value for money? Orrr not really, like with the 6700 XT?
#5
InVasMani
So in line with the performance of a GTX 1080 and its TDP - not bad. If the price is reasonable, a lot of people will get these. Makes a lot more sense than getting a used GTX 1080, assuming pricing is tolerable. The fact that it's a newer architecture with added modern features is a perk too. It might overclock reasonably well too, one can certainly hope.
#6
MKRonin
Let's hope they don't botch the launch pricing like the 6700XT.
#7
medi01
ZoneDymo: Orrr not really, like with the 6700 XT?
Yeah, 2080 Ti/3070-level perf, but 12 GB and an MSRP of $479 - such a "terrible value". Almost like 3070s. :D
#8
Vader
InVasMani: It might overclock reasonably well too, one can certainly hope.
Sadly, I don't think they will. It has a smaller die and 4 GB less memory than the already-pushed 6700 XT, and that only shaves 50 watts. Doesn't feel like there will be much headroom to OC.
#9
Chrispy_
london: L O L give me a 120 W card AMD, NOT a 180 W card with a 128-bit bus.
Not going to happen until we move off TSMC 7nm.
It's largely the manufacturing process that determines power efficiency at any given GPU die size, so although AMD have trimmed power use slightly since 1st-gen Navi, it's not going to change significantly until TSMC 5 nm starts shipping.
#10
TheinsanegamerN
Turmania: I'm a bit of a 75 W GPU fanatic - no power cables. Would love to see AMD implement one from their camp. But with the latest trend of increasing power consumption on cards, I'm resigned to it probably not happening.
We'll likely see one sooner or later. It's likely we'll get one with 1280 cores or so; the RX 560 was an 896/1024-core card.

I need a 75-watt low-profile GPU to replace my RX 560 in my low-profile media PC. The 560 is long in the tooth, and Nvidia's Linux drivers leave a lot to be desired. With the RTX 3050 being a 90 W card, all hopes are on AMD making a 75 W card - likely an RX 6500 or RX 6400.
#11
sutyi
london: L O L give me a 120 W card AMD, NOT a 180 W card with a 128-bit bus.
The 6600 XT is probably cranked right up to the edge; it should land in the neighborhood of a 2.8-2.9 GHz boost clock on the better AIB partner cards.
#12
TheinsanegamerN
Chrispy_: Not going to happen until we move off TSMC 7nm.
It's largely the manufacturing process that determines power efficiency at any given GPU die size, so although AMD have trimmed power use slightly since 1st-gen Navi, it's not going to change significantly until TSMC 5 nm starts shipping.
Huh? AMD made a 75-watt 128-bit card on 14 nm. 7 nm isn't the restriction here.
#13
ZoneDymo
medi01: Yeah, 2080 Ti/3070-level perf, but 12 GB and an MSRP of $479 - such a "terrible value". Almost like 3070s. :D

That was more the point.
#14
Chrispy_
TheinsanegamerN: Huh? AMD made a 75-watt 128-bit card on 14 nm. 7 nm isn't the restriction here.
Made?
They're still making the RX 560.

Maybe I misunderstood what you're asking - you want a budget, raytracing, RDNA2 graphics card on the most expensive and highly constrained silicon process currently in existence, to perform at 5700 XT levels within a 75 W power envelope?

I don't think it'll happen. RDNA2's raytracing architecture doesn't scale down that low, for the same reason that the RTX 2060 is the slowest RTX card and everything under it lacked raytracing altogether: below a certain level of overall performance, the raytracing is just wasted silicon, because even if you had enough VRAM on a low-budget, small-die card to enable those features, you'd be getting slideshow-level, unplayable performance.
#15
IceShroom
TheinsanegamerN: Huh? AMD made a 75-watt 128-bit card on 14 nm. 7 nm isn't the restriction here.
AMD even made a 75 W card on 28 nm - the HD 7750 was a 55 W card. On 7 nm, AMD even made the 50 W Navi 12-based Pro 5600M for Apple. But the problem is: can AMD sell a 75 W card now to non-Apple users? Most OEMs will use a DDR4-based GT 1030 rather than a low-power AMD card.
#16
TheinsanegamerN
Chrispy_: Made?
They're still making the RX 560.

Maybe I misunderstood what you're asking - you want a budget, raytracing, RDNA2 graphics card on the most expensive and highly constrained silicon process currently in existence, to perform at 5700 XT levels within a 75 W power envelope?
I don't expect 5700 XT performance at 75 W, no. That was more london's point, although he seems more perturbed at a 180-watt 128-bit card, which doesn't make a whole lotta sense, since bus width != performance.

Me personally, I want to see a 75-watt card. The 560X was really pushing the 75 W envelope, and it only clocks to 1.3 GHz without an external connector. Low-profile builders like me have been stuck there ever since; AMD didn't bother to release the RX 5300 as a DIY card. An RDNA2 1280-core card would likely fit in the 75 W envelope and provide nothing short of a massive performance boost over the 560.
IceShroom: AMD even made a 75 W card on 28 nm - the HD 7750 was a 55 W card. On 7 nm, AMD even made the 50 W Navi 12-based Pro 5600M for Apple. But the problem is: can AMD sell a 75 W card now to non-Apple users? Most OEMs will use a DDR4-based GT 1030 rather than a low-power AMD card.
OEMs didn't purchase the 7750 either. They purchased the 7500 and 7600 series for OEM builds. The 7750 was primarily purchased by end users who either wanted better performance in pre-built OEM systems like the Dell OptiPlex or were looking for a low-power build.

The Pro 5600M was a mobile GPU, built on the RX 5300/5500 desktop chip, which never saw a DIY release.
#17
Turmania
Chrispy_: Made?
They're still making the RX 560.

Maybe I misunderstood what you're asking - you want a budget, raytracing, RDNA2 graphics card on the most expensive and highly constrained silicon process currently in existence, to perform at 5700 XT levels within a 75 W power envelope?

I don't think it'll happen. RDNA2's raytracing architecture doesn't scale down that low, for the same reason that the RTX 2060 is the slowest RTX card and everything under it lacked raytracing altogether: below a certain level of overall performance, the raytracing is just wasted silicon, because even if you had enough VRAM on a low-budget, small-die card to enable those features, you'd be getting slideshow-level, unplayable performance.
A low-end 75 W GPU does not need raytracing capabilities; it just needs better performance than the one it's replacing.
#18
Chrispy_
TheinsanegamerN: Me personally, I want to see a 75-watt card. The 560X was really pushing the 75 W envelope, and it only clocks to 1.3 GHz without an external connector. Low-profile builders like me have been stuck there ever since; AMD didn't bother to release the RX 5300 as a DIY card. An RDNA2 1280-core card would likely fit in the 75 W envelope and provide nothing short of a massive performance boost over the 560.
I agree, but I still don't think it will happen for a while.

1024-1280 shaders under 2 GHz should be possible at about 75 W with the current architecture and current TSMC 7 nm process, but that process is so constrained and expensive that it's highly unlikely non-premium parts will ever be given allocation when AMD could instead be printing money with Zen 3 chiplets for EPYC and Ryzen 5000-series CPUs, or with "$999" graphics cards.

If they did make a 75 W part, it would probably have to be RDNA1 without raytracing, for obvious performance and die-area reasons. AMD does not currently have any RDNA2 products decoupled from raytracing - that's likely not going to exist until Rembrandt APUs come along - so the design for such a 75 W RDNA card is likely still just an idea rather than something that could start a production run right now.

More importantly, even if AMD did make a 75 W RDNA card, it would be expensive, with terrible performance/$, because the process it's manufactured on comes at such a premium. The niche audience who wants a slot-powered yet state-of-the-art card is small, because those two terms are almost an oxymoron. It's not that AMD couldn't do it; it's just not convenient or profitable for them to do it at the moment, which means the board will for sure dismiss the idea for 2021, and maybe 2022 if the silicon shortage continues into next year. The small-die cards (RX 460, RX 550, RX 560) were always the last cards of any generation to hit the market, and they only made sense to manufacture once all the other market segments had been sated and excess stock/inventory was piling up for those products. We are a long way away from having excess stock of anything right now :)
Turmania: A low-end 75 W GPU does not need raytracing capabilities; it just needs better performance than the one it's replacing.
Yeah, see above - that's exactly what I've just said.
#19
IceShroom
TheinsanegamerN: The Pro 5600M was a mobile GPU, built on the RX 5300/5500 desktop chip, which never saw a DIY release.
The Pro 5600M is Navi 12-based and was made for Apple to use in their MacBooks. Navi 12 has the same shader and CU count as Navi 10, plus some new instructions, and uses HBM instead of GDDR. The Pro 5600M and RX 5600M use different GPUs.
What I am trying to say is that AMD is capable of making a low-power GPU, but there needs to be a market for it. If OEMs and people only push Nvidia and ignore AMD, AMD will also ignore that market segment. There is the 50 W RX 550, which is faster than the GT 1030, but it always gets ignored.
#20
RedelZaVedno
Those scores place these cards in the same ballpark as the RDNA-based RX 5700 XT and RX 5700.

Decent upgrade, IF the 6600 were priced at 249 and the 6600 XT at 299 bucks (or lower). Anything above that is a hard pass.
#21
ShurikN
RedelZaVedno: Those scores place these cards in the same ballpark as the RDNA-based RX 5700 XT and RX 5700.

Decent upgrade, IF the 6600 were priced at 249 and the 6600 XT at 299 bucks (or lower). Anything above that is a hard pass.
Add $100 to those prices. Unfortunately.
#22
Luminescent
So, we get a $1,000 card that performs at the level of the 5700 XT from 2019, which was $400. Great.
$1,000 is optimistic; on launch day they might go well over $1,000.
#23
TumbleGeorge
Chrispy_: and current TSMC 7 nm process
Which of the, hmm, 3 (if I'm not wrong) TSMC 7 nm variants is expensive?
#24
Chrispy_
TumbleGeorge: Which of the, hmm, 3 (if I'm not wrong) TSMC 7 nm variants is expensive?
All of them?

It's supply and demand that sets the pricing, and TSMC is fully booked well into 2022 with the highest-performing, most desirable processes on the market right now. They have a surplus of customers and limited capacity, so there's no reason for them to give anything away at a discount!

For what it's worth, almost everything AMD makes is on N7P now, which is the successor to the original N7 (FF) node. I could be wrong, but it looks like the N7+ node (EUV lithography) is being sold only to mobile chipmakers like Qualcomm and HiSilicon (Huawei), and almost all customers who originally used N7 have shifted to N7P - possibly because TSMC is converting N7 capacity to N7P as it phases out N7.
#25
Sithaer
Luminescent: So, we get a $1,000 card that performs at the level of the 5700 XT from 2019, which was $400. Great.
$1,000 is optimistic; on launch day they might go well over $1,000.
Yep, 5700 XTs go for ~$1,100 on the second-hand market in my country atm, and I don't really expect this to be any cheaper.

Tbh I'm not even sure what the point of releasing new cards is in the current situation. :rolleyes: I mean, people like me who are waiting for normally priced mid-range cards can't afford these anyway.