Monday, October 14th 2024

AMD Readies Radeon RX 7650 GRE Based on "Navi 33"

AMD is readying a new mainstream graphics card based on its current RDNA 3 graphics architecture, the Radeon RX 7650 GRE (Golden Rabbit Edition). The company has had great success selling graphics card SKUs with the "GRE" brand extension in China, with the RX 6750 GRE being a popular SKU there; it even has an enthusiast-class SKU, the RX 7900 GRE, which saw a global launch earlier this year. The company is hoping the "GRE" moniker will compete better against the GeForce RTX 4060, at least in specific markets. A new Benchlife.info report says that the RX 7650 GRE will be based on the 6 nm "Navi 33" monolithic silicon, and not the 5 nm "Navi 32" chiplet-based GPU previously reported.

AMD has already maxed out "Navi 33" for both the RX 7600 and RX 7600 XT, with the latter only seeing its memory size doubled over the former, so it remains to be seen where AMD goes with the RX 7650 GRE. The "7650" numbering suggests a faster SKU, so it's possible that AMD raises the engine clock speeds of "Navi 33" as far as it can. The RX 7600 comes with a 2.25 GHz Game clock, which the RX 7600 XT bumps up slightly to 2.47 GHz. If we were to guess, the RX 7650 GRE could focus on increasing the Game clock rather than the memory size; it could then use the power configuration of the RX 7600 XT, leaving room for Game clocks on par with or higher than the RX 7600 XT's, while retaining the 8 GB memory size of the RX 7600.
Sources: Benchlife.info, VideoCardz

38 Comments on AMD Readies Radeon RX 7650 GRE Based on "Navi 33"

#26
The Shield
...good Lord...and why not resurrect Vega?
#27
RaceT3ch
The Shield: ...good Lord...and why not resurrect Vega?
going back to GCN???
why
#28
The Shield
RaceT3ch: going back to GCN???
why
I was only joking. We are at the end of 2024; we need new GPUs at competitive prices (...I know, I know...), not another weird refresh no one cares about.
#29
Chrispy_
Navi 33 is already maxed out in the 7600 XT, and that card's clocks push it into disappointing performance/Watt territory.

What AMD needs is something to bridge the huge gulf between the 54 compute units of the 7700XT and the 32 compute units of the 7600XT. The massive gap in spec between these two dies implies that they were expecting to chop down Navi 32 a lot to fill in the gaps. Currently we have 60CU and 54CU variants of Navi 32, and what's desperately needed is something in the 40-48CU range.
jak_2456: I'm interested to see how this one performs. The 7600 is mid, the XT is not great, the 4060 is also mid, and Battlemage is not looking too likely.
Personally I feel like the cheap 7600 8GB is as much as you'd want to spend on Navi33, and Navi33 is also only a good deal if it's on sale for cheaper than the older 6600XT/6650XT which can still be found for $210/$225 respectively.

The 7600XT 16GB is pointless because Navi33 is too weak to actually run at the resolutions that would require >8GB.
This 7650XT (presumably 16GB) is going to be the 4060 Ti 16GB of the AMD world. Way too much VRAM for its pathetic 128-bit bus and underwhelming ROP count.
#30
Vader
Chrispy_: The 7600XT 16GB is pointless because Navi33 is too weak to actually run at the resolutions that would require >8GB.
This 7650XT (presumably 16GB) is going to be the 4060 Ti 16GB of the AMD world. Way too much VRAM for its pathetic 128-bit bus and underwhelming ROP count.
I would get a 7600 XT rather than a 7600, just for handling higher texture levels. Textures make up a great portion of the presentation and cost hardly anything to run; they only need VRAM. There are many examples of games exceeding 8 GB even at 1080p.
Also remember it's not about using the full 16 GB; that's unlikely in the life of the card. Instead, picture the scenarios where using more than 8 GB makes a difference; those are much more common.
#31
DaemonForce
Vader: Also remember it's not about using the full 16 GB; that's unlikely in the life of the card.
I've seen something like this before, and it's more or less towards EOL when it matters.
~20 years ago, when I was still on a Rage XL 8MB and needed the extra resolution of a 16MB Magnum, the Rage Magnum pushed those pixels.
Did I use the full 16MB? Yes. Did anyone else try pushing 2048x1536 with such a card? Probably not.
The stutters weren't noticeable outside of N64 emulation, where this card shined brightest.

There's a big difference in bandwidth, between 800MB/s and 2.08GB/s.
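Those two figures fall out of the standard peak-bandwidth formula: bus width in bytes multiplied by memory clock and transfers per clock. A quick sketch in Python; the 64-bit/100 MHz and 128-bit/130 MHz SDR configurations are assumptions chosen because they reproduce the two numbers quoted, not verified Rage specs:

```python
def theoretical_bandwidth_mb_s(bus_width_bits, mem_clock_mhz, transfers_per_clock=1):
    """Peak memory bandwidth in MB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * mem_clock_mhz * transfers_per_clock

# Assumed SDR configurations (one transfer per clock):
print(theoretical_bandwidth_mb_s(64, 100))   # 800.0 MB/s
print(theoretical_bandwidth_mb_s(128, 130))  # 2080.0 MB/s, i.e. ~2.08 GB/s
```

Doubling both the bus width and the clock is what multiplies the bandwidth by ~2.6x here.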
Same goes for the jump from DX5 to DX6 and suddenly having GL support.
Now I'm trying to move off the 8GB card and onto a 16/20GB model. We have come full circle.
While there are no functional differences between the 7600 and 7600XT, there are probably environment differences where the 16GB card shines.
I wouldn't be surprised if someone out there is trying to push multiple 4K/5K displays on the 7600 when they really need the 7900XT to compete.
Some issues never really change.
#32
Chrispy_
Vader: I would get a 7600 XT rather than a 7600. Just for handling higher texture levels. They make up a great portion of the presentation and cost hardly anything to run, only needing VRAM. There are many examples of games exceeding 8 GB even at 1080p.
Also remember it's not about using the full 16 GB, that's unlikely in the life of the card. Instead picture those scenarios when using more than 8 GB makes a difference; that's much more common.
Yeah, 16GB is preferable, but the problem with the 7600 XT is the price: it costs as much as a 6750 XT, which is a much faster card that also has more than 8GB of VRAM, letting you max out textures at the resolutions the 6750 XT is capable of running.
#33
Vader
Agreed, the 6700/6750 XT feels more balanced for 1080p, plus it's more efficient. I would get one over the 7600XT for the same price.
#34
mechtech
"The company is hoping for the "GRE" moniker to compete better against the GeForce RTX 4060,"

Give it a 256-bit bus and the fastest GDDR6 you can get. ;)

May as well throw in 16 PCIe lanes for the heck of it.

And price it at $325 CAD.
#35
RaceT3ch
mechtech: "The company is hoping for the "GRE" moniker to compete better against the GeForce RTX 4060"

Give it a 256-bit bus and the fastest GDDR6 you can get. ;)

May as well throw in 16 PCIe lanes for the heck of it.

And price it at $325 CAD.
sorry bud, not how n33 works. only 128-bit bus >:C
#36
Caring1
RaceT3ch: sorry bud, not how n33 works. only 128-bit bus >:C
196-bit bus, I believe. Still not wide enough.
#37
Chrispy_
Caring1: 196-bit bus I believe. Still not wide enough.
Navi32 is 256-bit, Navi 22 (last-gen) was 192-bit.

Bus widths are always multiples of 32 bits, since that is the interface width of a single GDDR6 package. The weird 6700 non-XT, for example, had 10GB with one of its six 32-bit-wide memory controllers disabled, for a total of 5*32 = 160-bit.

You can disable memory controllers, as in that example, but you cannot enable memory controllers that aren't there. Sadly, Navi 33 has four 32-bit-wide memory PHYs, so the maximum bus width that silicon can ever manage is 4*32 = 128-bit.
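That controller arithmetic is simple enough to sketch in a couple of lines; the function below is purely illustrative:

```python
# Each GDDR6 package presents a 32-bit interface, so total bus width is
# 32 bits times the number of enabled memory controllers.
def bus_width_bits(enabled_controllers, bits_per_controller=32):
    return enabled_controllers * bits_per_controller

print(bus_width_bits(6))  # Navi 22 fully enabled: 192
print(bus_width_bits(5))  # RX 6700 10GB, one controller fused off: 160
print(bus_width_bits(4))  # Navi 33's physical maximum: 128
```

Disabling controllers only ever moves the result down from the physical maximum, which is why no Navi 33 SKU can exceed 128-bit.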
#38
rattlehead99
Minus Infinity: These are nothing more than 7500 vs 4050. Both shameful rebrands of trash tier products.
The 4080 (Super) is a 70-class GPU: 379 mm^2 die size vs the RTX 3070's 392 mm^2.
The 4070 Ti is a 60-class GPU: 294 mm^2.
The 4070 (Super) is a 50 Ti-class GPU.
The 4060 Ti is a 50-class GPU: 188 mm^2.
The 4060 is a 30/40-class GPU: 147 mm^2.