Thursday, April 17th 2025

AMD Radeon RX 9070 GRE Spec Sheet Leaked; Report Suggests 3072 Stream Processor Count
The unannounced Radeon RX 9070 GRE 12 GB graphics card model seems to be next in line within AMD's RDNA 4 range. Despite official presentation material teasing a Q2'25 launch of Radeon RX 9060 Series cards, insiders believe that Team Red will debut an in-between option, possibly before the arrival of Radeon RX 9060 XT 16 GB and 8 GB SKUs. Rumored new-generation "Great Radeon Edition" (GRE) cards are expected to launch as Chinese market exclusives, potentially as a "strategic" gap fill. Local board partner moles have whispered about almost zero replenishments of Radeon RX 9070 (non-XT) 16 GB stock in the region. As reported earlier today, some of VideoCardz's inside sources insist that Radeon RX 9070 GRE cards will eventually replace Radeon RX 9070 options. Initial leaks suggested fundamental "step-down" specification pillars: 12 GB of VRAM, a 192-bit memory interface, and a "reduced" "Navi 48" GPU die.
According to fresh claims, the Radeon RX 9070 GRE model could utilize a "Navi 48 XL" GPU variant. VideoCardz reckons that a quarter of the original GPU core count has been shut off, resulting in a total of 3072 stream processors. Comparatively, the Radeon RX 9070 XT arrived with 4096 SPs, while the Radeon RX 9070 launched with 3584 units. The report delved into alleged core frequency details: "the RX 9070 GRE's clocks will be higher than the RX 9070, reportedly at 2.79 GHz boost, resulting in around 17.1 TFLOPS compared to 18 TFLOPS on the RX 9070... Some custom variants we know of will approach a 3.0 GHz boost clock, so there is definitely room for overclocking." The latest spec sheet leak confirms a 12 GB pool of VRAM, in GDDR6 form. VideoCardz weighed in with some embellishments: "the memory will not be clocked at 20 Gbps, as on (already launched) RX 9070 (XT) and (incoming) RX 9060 XT, but at 18 Gbps. This means that the memory bandwidth will be about 1/3 lower than the RX 9070 (XT) at 432 GB/s." Certain industry observers reckon that AMD will continue to rely on AIBs to produce an all-custom lineup of forthcoming RDNA 4 products. So-called "reference designs" (MBA) have turned up in China, but only in very limited numbers, sold via hazy avenues.
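For readers who want to sanity-check the quoted figures, the arithmetic is simple to reproduce. The short Python sketch below is purely illustrative and assumes the rumored GRE specifications (3072 stream processors, 2.79 GHz boost, 192-bit bus, 18 Gbps GDDR6), none of which AMD has confirmed:

```python
# Rough sanity check of the quoted numbers; GRE figures are rumored, not confirmed.

def fp32_tflops(stream_processors: int, boost_ghz: float) -> float:
    # Peak FP32 = SPs x 2 ops per clock (FMA) x boost clock, the same
    # simple formula the quoted TFLOPS figures appear to use.
    return stream_processors * 2 * boost_ghz / 1000

def bandwidth_gb_s(bus_width_bits: int, memory_gbps: float) -> float:
    # Bandwidth in GB/s = (bus width in bytes) x effective data rate per pin.
    return bus_width_bits / 8 * memory_gbps

print(fp32_tflops(3072, 2.79))   # ~17.1 TFLOPS (rumored RX 9070 GRE)
print(fp32_tflops(3584, 2.52))   # ~18.1 TFLOPS (RX 9070 at its official boost clock)
print(bandwidth_gb_s(192, 18))   # 432 GB/s (rumored GRE: 192-bit @ 18 Gbps)
print(bandwidth_gb_s(256, 20))   # 640 GB/s (RX 9070 / 9070 XT: 256-bit @ 20 Gbps)
print(1 - 432 / 640)             # ~0.325, i.e. "about 1/3 lower" bandwidth
```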
Source:
VideoCardz
20 Comments on AMD Radeon RX 9070 GRE Spec Sheet Leaked; Report Suggests 3072 Stream Processor Count
I guess when the 5070 is as bad as it is, they could afford to do that (and can sell it cheap if need be). The question still becomes whether it can keep 1440p at 48 FPS (overclocked) in 'those games' where the 9070 is at ~50 and the 9070 XT at 60.
If it can, that's cool; make it as cheap as they want. If it can't, it's a waste of materials... just like the 5070. So I do hope it retains an actually worthwhile purpose for hitting settings, not just being 'less bad'.
Although I would argue 12 GB kinda already makes it a tough rec from somebody like me, regardless.
Important to note that '18 Gbps-clocked' RAM doesn't necessarily overclock much worse than 20 Gbps (and can still go past it, as seen on the 7900 GRE), likely because they're the same chips.
It's amazing to watch these companies see which one can nerf their products the most and get away with slotting them in, all to keep prices higher on things people might want to buy and/or actually keep for a long time.
That part is pretty pathetic, regardless of which company, imho. This, for instance, is clearly aimed at keeping the price of the 9070 above $400 long-term (it would appear), and hence the 9070 XT at or above 7800 XT prices.
Most NVIDIA customers will buy NVIDIA graphics cards, so I would not compare it to NVIDIA. NVIDIA has "the better software stack and drivers", "AMD has bad drivers"; I read that too often.
edit: There are plenty of AMD graphics cards for sale. They are barely bought, regardless of performance and price.
I don't know if you've noticed, but both companies truly are helping each other keep prices inflated. Again, I don't take sides. I call out the bullshit and unfortunate realities, and this may be one of them.
Some see this as a 'much-needed product', which is what they want you to believe. I see it as a way to keep prices on the better chips higher, so that an inevitable die shrink at $400 still appears appealing.
7800 XT
7900 GRE
7900 XT
Just for example, we saw something similar with the 6750 GRE 12 GB, which was JUST above the 6700 XT but below the 6750 XT. There are certainly outliers though, such as the 10 GB version of the 6750 GRE, or the 7650 GRE.
It would be nice if the 9070 GRE makes it to the rest of the world, but I'm not holding my breath on it. If it did, it might unironically be the card I buy this gen, if the price is right.
Getting back to Radeon recently, and I love their panel. It's loaded with features like overclocking, and I love AFMF and Radeon Chill. Especially the latter is such an underused feature. Like, I get around 180 W average power consumption on the RX 9070 XT with it, and it doesn't perceivably impact performance. This beats all of NVIDIA's efficiency through smart software design. It's really an insane feature people don't talk about enough, and AMD often does things in a clever way rather than brute force. Infinity Cache in GPUs and CPUs from AMD is another smart thing over brute force.
And I can't really complain about drivers either. The only issue I've had so far was in the Vivaldi browser, where the mouse was laggy, and it's a blame game between Vivaldi, the Google Chrome code, and AMD. Firefox and Opera are just fine. And that's really the only issue I've had. Games all work fine without any issues.
I think this whole "AMD drivers are bad" thing is such a stupid myth that just won't die. I've had a lot of Radeons in my life, and I've never had any issues worth mentioning.
Previous 6600 XT and 6800 non-XT, current Radeon 7800 XT: every game needs its own game profile to reduce the wattage shown in the on-screen display of the Windows 11 Pro AMD GPU driver. I'm lucky to have a calibrated screen with only 75 Hz, so I can use FreeSync with a target of 45 FPS. The Windows AMD GPU drivers are buggy; I've already had to revert the driver several times. The driver has lost all my custom game profiles several times, and a mass import or export is not possible. Bugs are not fixed. The drivers work, but not to my standards. 6000 series cards need custom config files, both for GNU userspace / the Linux kernel and for Windows 10 Pro and Windows 11 Pro, to reduce the idle wattage. AMD does not really work on the GPU drivers or their quality; some bugs have existed since I bought the 6600 XT new. (I've written full pages of bugs several times on different forums; the list of AMD driver bugs I found myself is long.) I was forced to revert the AMD GPU drivers several times on my desktop box with the 7800 XT and 7600X. It seems this has been the driver quality for the past 9 months.
For the usual gamer, the AMD driver will most likely do the job. Many features are there which I do not want or need; I've been using something else for many years. That kind of bad software quality is not something I'm used to, except from the Intel graphics card drivers and the Intel WLAN driver over the years.
Anyway. As of now, all three companies have software issues. So it does not really matter.
FACTS are: the 9070/XT has been consistently sold out, while the green cards have always been available; it's just that not many want to pay the price.
www.newegg.com/asus-tuf-gaming-tuf-rtx5080-o16g-gaming-nvidia-geforce-rtx-5080-16gb-gddr7/p/N82E16814126743?Item=N82E16814126743&cm_sp=Homepage_SS-_-P1_14-126-743-_-04172025
www.newegg.com/gigabyte-gv-r9070xtgaming-oc-16gd-amd-radeon-rx-9070-xt-16gb-gddr6/p/N82E16814932751
'Hoping Navi 48 yields are great, and this becomes the 1440P/1080P+RT card, worldwide.
Semi-Tangential: What's the word on Intel Celestial? The B580's starting to look a little long in the tooth with the RX 9060s and 9070 GRE on the horizon.
I don't think Intel is gonna stick around, IMO. :( It screams of them trying not to, anyway.
Shader Cores matter.
I enjoyed 3072 Shader Cores for 5 years.
Cheers
It will probably be the same price as the RTX 5060 Ti but with more VRAM and better performance.
One thing I would really want from AMD is their own version of RTX HDR (NN-optimized), which is just good, noticeable, and more usable than ray tracing. And I'm too lazy to set up the more cumbersome alternatives: AutoHDR, ReShade, klevel or something.
A simple, one-click solution like RTX HDR would be awesome. And it seems to me that it's within their capabilities to release something like that within a year or so.
Anyway, news like this is like a carrot dangling in front of someone who has lost all hope in PC gaming. I don't have a PC.
And I probably won't buy one, because everything is from US-owned companies and I try not to buy anything American. And I doubt that China will release any good GPU any time soon.
I'll wait for the good guys to catch up with the bloodthirsty empire in every domain.
But a 1440p mini-LED IPS with that GPU could be a sweet deal for HDR gaming.
A 300 monitor plus a 400 GPU would be better than any 700 GPU alone, but it's still a bit too expensive. Maybe in 3 years I could afford it.
videocardz.com/newz/acer-mistakenly-lists-radeon-rx-9070-gre-xt-nitro-graphics-card
Apparently, Acer accidentally made and released promo material for an RX 9070 GRE XT.
My guess/hope:
Navi 44 (9060) is too cut-down to fill the gap behind the 9070 non-XT.
Navi 48 will be further binned-out into at least 4 desktop SKUs.
RX 9070 GRE | RX 9070 GRE XT | RX 9070 | RX 9070 XT | RX 9070 XTX
It all depends on whether they bother to physically cut down the memory bus.
About the 9060 XT, I'm pretty sure it's going to be very close to the 5060 Ti (if its rumored specs turn out to be true).
Therefore the 9070 GRE will land somewhere between the 9060 XT and the 9070, most likely closer to the 9070 than to the 9060 XT.
If the 9060 XT matches the 7700 XT performance-wise, then the GRE could land somewhere around the 7800 XT to 6900 XT level of performance. That is the less optimistic prediction; more optimistic would be 4070 Super level, and then there are three cards that are roughly equal: the 7900 GRE, 3080 Ti, and 6950 XT (realistically the maximum ceiling the 9070 GRE could reach). After that is the 5070, which I'm guessing could end up marginally stronger than the 9070 GRE (I say this because the 9070 needs more processing units and higher bandwidth than the GRE to beat the 5070).
So in conclusion, I would say the interval between the 7800 XT and the 4070 Super is the most probable level where the 9070 GRE will land. And regarding price, $450 sounds about right: if, performance-wise, the difference between the 9070 GRE and the 5070 is similar to that between the 9070 XT and the 5070 Ti, then $450 is the correct answer (see the quick sketch after the price comparison below).
5070 Ti - $750 | 9070 XT - $600
5070 12 GB - $550 | 9070 GRE 12 GB - X
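Here's roughly how that $450 guess follows from the prices above, under the commenter's assumption that the GRE/5070 gap mirrors the 9070 XT / 5070 Ti gap (a back-of-the-envelope sketch, not a leaked price):

```python
# Scale the 5070's price by the same ratio the 9070 XT holds against the 5070 Ti.
ratio = 600 / 750          # 9070 XT vs. 5070 Ti -> 0.80
gre_price = ratio * 550    # apply that ratio to the 5070's $550
print(round(gre_price))    # 440, i.e. roughly the suggested $450
```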
One thing I find weird is that AMD will use 18 Gbps memory chips, so the bandwidth will be 432 GB/s, which is 10% lower than the 480 GB/s that would have been expected from the same 20 Gbps memory chips. Coupled with the fact that the advertised boost clock will be about halfway between the 9070 and the 9070 XT, that could mean the 9070 GRE will be more bandwidth-limited relative to its shader count than the other RDNA 4 cards (by this metric the 9070 looks the best).
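To put a number on that "bandwidth relative to shader count" point, here is a rough comparison of GB/s per quoted FP32 TFLOP, assuming the rumored GRE specs from the article and the official 9070 / 9070 XT boost clocks:

```python
# Memory bandwidth (GB/s) per quoted FP32 TFLOP (SPs x 2 x boost clock).
# The GRE row uses the rumored 3072 SPs / 2.79 GHz / 192-bit / 18 Gbps figures.
cards = {
    "RX 9070 XT":          (4096, 2.97, 256, 20),
    "RX 9070":             (3584, 2.52, 256, 20),
    "RX 9070 GRE (rumor)": (3072, 2.79, 192, 18),
}
for name, (sps, boost_ghz, bus_bits, mem_gbps) in cards.items():
    tflops = sps * 2 * boost_ghz / 1000
    bandwidth = bus_bits / 8 * mem_gbps
    print(f"{name}: {bandwidth / tflops:.1f} GB/s per TFLOP")
# -> ~26.3 (XT), ~35.4 (9070), ~25.2 (GRE): the 9070 has the most bandwidth
#    per TFLOP and the rumored GRE the least, which is the point above.
```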
@LabRat 891
Regarding more segmentation: if the 9060 XT ends up close to the 5060 Ti (and priced at $350 for 16 GB) and the 9070 GRE ends up close to the 4070 Super (and priced at $450 for 12 GB), there is no other possible option between them, or between the 9070 GRE and the 9070. No option that would make sense as a market product, that is; obviously you can chop away a CU/SM at a time and make GPUs in $25 increments.