Thursday, February 1st 2024
PowerColor Hellhound Radeon RX 7900 GRE OC Lined up for Possible EU Wide Release
It seems that AMD and its board partners are continuing to roll out new custom graphics cards based on the formerly China-exclusive Radeon RX 7900 GRE 16 GB GPU—PowerColor unleashed its fiendish flagship Red Devil model as one of last September's launch options. The company's Chinese website has been updated with another Navi 31 XL entry—the Hellhound Radeon RX 7900 GRE OC. This design sits below the Red Devil in PowerColor's product and pricing hierarchy, promising excellent cooling performance with fewer frills. The latest custom RX 7900 GRE card borrows PowerColor's existing demonic dog design from the mid-tier Hellhound RX 7800 XT and RX 7700 XT models; the Hellhound enclosure deployed on Radeon RX 7900 XTX and RX 7900 XT GPUs is a much chunkier affair.
The PowerColor Hellhound Radeon RX 7900 GRE OC has also popped up on a couple of UK and mainland Europe price comparison engines (listings published 2024-01-30), so it is possible that a very limited release could occur across a small smattering of countries and retail channels—Proshop Denmark seems to be the first place with cards in stock, priced at €629.90 (~$682) at the time of writing. The Radeon RX 7900 GRE (Golden Rabbit Edition) GPU sits in an awkward spot between the fancier Navi 31 options and its Navi 32 siblings—AMD and its AIB partners have reduced MSRPs in Europe, possibly in reaction to the recent launch of NVIDIA's GeForce RTX 40 SUPER series. We are not sure whether this initiative has boosted the RX 7900 GRE's popularity in this region, since very few outlets actually offer the (XFX-produced) reference model or Sapphire's Pulse custom design.

Proshop.de details: AMD Radeon RX 7900 GRE Overclocked (core clock 1500 MHz / boost clock 2355 MHz), 5120 stream processors, 16 GB GDDR6 (18 Gbps effective) on a 256-bit bus, PCI-Express 4.0 x16, 3x DisplayPort 2.1 / 1x HDMI 2.1 outputs, supports AMD FreeSync and Adaptive Sync, 2x 8-pin power connectors, recommended power supply: 750 W, length: 322 mm, slot width: 2.5 slots, PowerColor triple-fan low-noise cooler with zero-RPM fan mode (at low temperature), with amethyst LED. Model number: RX7900GRE 16G-L/OC.
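As a quick, informal sanity check of the listed memory specification and the price conversion above (a minimal sketch; the exchange rate is simply back-derived from the article's own EUR and USD figures, not an assumed market rate):

```python
# Sanity-check the Proshop listing figures quoted above.

def gddr6_bandwidth_gbs(effective_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: data rate per pin (Gbps) * bus width / 8 bits per byte."""
    return effective_rate_gbps * bus_width_bits / 8

bandwidth = gddr6_bandwidth_gbs(18, 256)      # 18 Gbps GDDR6 on a 256-bit bus
print(f"Peak memory bandwidth: {bandwidth:.0f} GB/s")   # -> 576 GB/s

eur_price = 629.90
usd_estimate = 682                            # the article's rough conversion
print(f"Implied EUR->USD rate: {usd_estimate / eur_price:.3f}")  # -> ~1.083
```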
Sources:
PowerColor China, Skinflint UK, Geizhals DE
27 Comments on PowerColor Hellhound Radeon RX 7900 GRE OC Lined up for Possible EU Wide Release
2. The Sapphire Nitro and PowerColor Red Devil are comparable. Both have, for example, the same boost frequency on the same GPU. Neither is an upsell of the other.
3. Unfortunately for me, I don't think there will be a water block available, so my choice will remain between a 7900 XT and a 7800 XT, especially as my new GPU will have to drive two UWQHD monitors at 155 Hz.
I've only travelled to Denmark once, specifically Copenhagen, so I'm not really aware of anything outside of the city... I vaguely remember listening to a band from Aarhus.
I guess it just exists to fish an extra €100 from non-tech-savvy folks who just see the "9" and think it has to be faster. :p
Being Navi 31, even cut down, it'll never be as cheap as Navi 32 because it should mostly retain the bill of materials from the other 7900 models, even though it has fewer components (such as memory) overall. It loses to the 6800 XT and often the 7800 XT in benchmarks because the RDNA 3 architecture is very inefficient at the high end and scales really poorly. This isn't a problem specific to the GRE; it affects the other two cards directly as well.
Although comparing between architectures is apples to oranges, the general concept still applies: the RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, a 50% larger last-level cache and a 10% higher TGP allowance, but it is only 1% faster than the RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and upon launch AMD positioned it against the RTX 4080, the same deal we've always known: it's 4% faster in raster and 16% slower in RT.
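For anyone who wants to reproduce these "X% more" figures, here is a minimal sketch of the arithmetic. The spec values below are assumptions taken from public spec listings (such as the TPU GPU database mentioned later in the thread); only a few of the less contentious ones are included, and the other deltas follow the same formula.

```python
# Minimal sketch of how "X% more" spec comparisons are derived.
# Spec values are assumed from public listings, not taken from the article itself.

def percent_more(a: float, b: float) -> float:
    """How much larger a is than b, expressed as a percentage."""
    return (a / b - 1) * 100

rx_7900_xtx    = {"memory_gb": 24, "last_level_cache_mb": 96, "board_power_w": 355}
rtx_4080_super = {"memory_gb": 16, "last_level_cache_mb": 64, "board_power_w": 320}

for key in rx_7900_xtx:
    print(f"{key}: +{percent_more(rx_7900_xtx[key], rtx_4080_super[key]):.1f}%")
# memory_gb: +50.0%, last_level_cache_mb: +50.0%, board_power_w: +10.9%
```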
The 7900 GRE was never meant for retail, but there is obviously enough interest to bring it there anyway.
My "opinions" come from looking at reviews, spec sheets (for example, the transistor density figures are available on the TPU GPU database, where I got the data from), and a bit of hands-on experience. I stand by everything I say, and admit when I'm wrong. Forums are an exchange of knowledge, and it really seems to bother you that I don't have a favorable opinion of AMD. That's true, I don't, but I dislike their fanbase and their warped view of reality far more than I dislike the company itself. There's no need to act like "Whatever" or come up with actual nonsense like "you update AGESA by updating the chipset drivers" like on the other thread. Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them.
I do not hate Nvidia; I just do not agree with their hubris. You made it seem like China is inert. They have built three aircraft carriers, that we know of.
Now, what nanometer is Nvidia at? What nanometer is AMD at? What DDR does Nvidia use? What DDR does AMD use? I know they both use TSMC, but like I said already, they are on different nodes. You are even championing that ridiculous connector that has had one of the fastest revisions in the history of the PC. Now PSU cables are bulky, wow. I am not even going to touch on why Nvidia cards use less power, but we can go on.
Have you heard me say anything negative about the performance of the cards that Nvidia makes? It has nothing to do with you disliking AMD; it is your comments about how bad AMD is. You could not even help yourself with "Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them." Yes, I am over the moon with my 7900X3D/7900XT combo, and yes, I challenge someone with a "bit of experience" when they make ridiculous comments about the 7900 series like "the RDNA 3 architecture is very inefficient at the high end and scales really poorly". I almost fell out of my chair when I read that.
As an example: if I set Vsync, my frame rate is 144, and in most games the GPU sits at 1800 MHz at 4K. I did not buy a 7900XT for that, though, so if you think that a 7800XT is the best price/performance card, that is just not true. Maybe in the States, where those cards are under $500, but where I live the 7900XT is $1000 and the 7800XT is $800. I would pay that premium for two more chiplets and 4 GB more RAM, as I play my games at 4K.
Knowledge is listening, not arguing with someone about how bad their product is because you read it somewhere. Why don't you browse one of the AMD threads and see us talking about things like how easy it is to get to 3 GHz with one-click OCs, or a Ryzen thread and see how much we talk about PBO and other things like X3D not using as much power as regular chips. I digress, though, as you are in an AMD thread bashing AMD.
The Hellhound has LED lighting with a couple of options, and a third option is the off mode, which blacks out the card's LEDs if you don't want any.
Regarding the GRE, I feel like it's not just a 7800XT: ComputerBase tested the card and I think it does 7-8% better. If this card comes in at around 599€, then it's a great buy.
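As a rough, hypothetical value check using only the numbers in the comment above (the 7-8% uplift is the commenter's claim, not verified here): at €599, the GRE would match the performance per euro of a 7800 XT priced at roughly €555-560.

```python
# Break-even 7800 XT price for equal performance per euro, given the claimed uplift.

gre_price_eur = 599
for uplift in (0.07, 0.08):                   # 7% and 8% claimed performance advantage
    breakeven_7800xt = gre_price_eur / (1 + uplift)
    print(f"+{uplift:.0%} uplift -> 7800 XT break-even price: {breakeven_7800xt:.0f} EUR")
# +7% -> ~560 EUR, +8% -> ~555 EUR
```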
As for why Navi 31 can't beat AD103, that's more likely down to a combination of disadvantages: Navi's shaders, ROPs, TMUs, etc. are not utilised to their fullest potential (there must be some analysis showing that it can't reach its theoretical performance levels), plus some optimisation by Nvidia in their drivers that increases performance while losing something else, for example texture resolution.
It is true that the N31 resources cannot be fully exploited; that is the reason why this phenomenon occurs to begin with. However, past AD103, scaling begins to become problematic for Nvidia as well; it's only fair to mention that.
As for the "Nvidia cheats on image quality" argument, that was debunked so long ago that I'd rather not comment. Textures do not look worse on a GeForce than they do on a Radeon; both quality and bit depth are identical.
Watch the proof from 8:02 to 12:49 in this video:
I hope you make a distinction between low VRAM where the framerate stays sky high but the textures are missing, and low VRAM where the textures remain on screen but the framerate plummets?
The situation is such that some people even claim they don't see a difference between 1K and 4K. How can we discuss anything further? Nothing has been "debunked". It has been proven for decades that Nvidia in general offers lower quality, and there are many internet threads about it.
I have also been able to compare them myself, and I can tell you that I do not feel comfortable when a GeForce card is connected to my screen.
Actually, it was proven multiple times: www.toolify.ai/ai-news/amd-vs-nvidia-who-delivers-better-image-quality-78004