
PowerColor Hellhound Radeon RX 7900 GRE OC Lined up for Possible EU Wide Release

T0@st

News Editor
Joined
Mar 7, 2023
Messages
2,077 (3.17/day)
Location
South East, UK
It seems that AMD and its board partners are continuing to roll out new custom graphics cards based on the formerly China-exclusive Radeon RX 7900 GRE 16 GB GPU; PowerColor unleashed its fiendish flagship Red Devil model as one of last September's launch options. The company's Chinese website has been updated with another Navi 31 XL entry: the Hellhound Radeon RX 7900 GRE OC. This design sits below the Red Devil in PowerColor's graphics card product and pricing hierarchy, providing excellent cooling performance with fewer frills. The latest custom RX 7900 GRE card borrows PowerColor's existing demonic dog design from the mid-tier Hellhound RX 7800 XT and RX 7700 XT models; the Hellhound enclosure deployed on Radeon RX 7900 XTX and RX 7900 XT GPUs is a much chunkier affair.

The PowerColor Hellhound Radeon RX 7900 GRE OC has also popped up on a couple of UK and mainland Europe price comparison engines (published 2024-01-30), so it is possible that a very limited release could occur across a smattering of countries and retail channels. Proshop Denmark seems to be the first place with cards in stock; pricing is €629.90 (~$682) at the time of writing. The Radeon RX 7900 GRE (Golden Rabbit Edition) GPU sits in an awkward spot between the fancier Navi 31 options and its Navi 32 siblings. AMD and its AIB partners have reduced MSRPs in Europe, possibly in reaction to the recent launch of NVIDIA's GeForce RTX 40 SUPER series. We are not sure whether this initiative has boosted the RX 7900 GRE's popularity in the region, since very few outlets actually offer the (XFX-produced) reference model or Sapphire's Pulse custom design.




Proshop.de details: AMD Radeon RX 7900 GRE Overclocked (Core clock 1500 MHz / Boost clock 2355 MHz), 5120 stream processors, 16 GB GDDR6 (Memory clock 18 GHz) - 256-bit, PCI-Express 4.0 x16, 3x DisplayPort 2.1 / 1x HDMI 2.1 connections, supports AMD FreeSync, supports Adaptive Sync, 2x 8-pin power connectors, recommended power supply: 750 watts, length: 322 mm, slot width: 2.5 slots, PowerColor triple-fan low-noise cooler with zero-RPM fan mode (at low temperatures), with Amethyst LED.



Model number: RX7900GRE 16G-L/OC.
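
For anyone wanting to sanity check the listed memory spec, the quoted 18 Gbps effective data rate across a 256-bit bus works out to 576 GB/s of peak bandwidth. A rough back-of-the-envelope sketch (plain Python, values copied straight from the listing above):

Code:
# Back-of-the-envelope check of the listed memory spec.
# The "18 GHz memory clock" in the listing means 18 Gbps effective per pin (GDDR6).
effective_rate_gbps = 18   # Gbps per pin, from the listing above
bus_width_bits = 256       # bits, from the listing above

bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # prints 576 GB/s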

View at TechPowerUp Main Site | Source
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Smart move from PowerColor; the EU market is very receptive to Radeon and extremely value-conscious, so this could work well for them.
 
Joined
Dec 26, 2020
Messages
382 (0.26/day)
System Name Incomplete thing 1.0
Processor Ryzen 2600
Motherboard B450 Aorus Elite
Cooling Gelid Phantom Black
Memory HyperX Fury RGB 3200 CL16 16GB
Video Card(s) Gigabyte 2060 Gaming OC PRO
Storage Dual 1TB 970evo
Display(s) AOC G2U 1440p 144hz, HP e232
Case CM mb511 RGB
Audio Device(s) Reloop ADM-4
Power Supply Sharkoon WPM-600
Mouse G502 Hero
Keyboard Sharkoon SGK3 Blue
Software W10 Pro
Benchmark Scores 2-5% over stock scores
Would be great, a nice in-between card. Would be even nicer if the Hellhound had proper RGB lights instead of this blue and white nonsense. Obviously to upsell you to a Nitro+.
 
Joined
Aug 4, 2021
Messages
59 (0.05/day)
Location
Belgium
Processor AMD Ryzen 7 7800X3D
Motherboard Asus ROG Crosshair X670E Hero
Cooling EKWB loop
Memory Corsair Dominator Titanium DDR5-8000C38
Video Card(s) nvidia RTX 4090 FE
Storage WD Black SN850X 1TB // WD Black SN850X 2TB // WD Black SN770 2TB
Display(s) Samsung Odyssey G8 OLED
Case Corsair 5000D Airflow
Power Supply Corsair HX1500i
Mouse Glorious Model D 2 Pro
Keyboard Lemokey P1 Pro
Would be great, a nice in-between card. Would be even nicer if the Hellhound had proper RGB lights instead of this blue and white nonsense. Obviously to upsell you to a Nitro+.
Nitro+? You mean the Red Devil? I highly doubt PowerColor is trying to upsell you a Sapphire card :)
 
Joined
May 24, 2023
Messages
801 (1.38/day)
Location
127.0.0.1, ::1
System Name Naboo (2019)
Processor AMD 3800x
Motherboard Gigabyte Aorus Master V1 (X470)
Cooling individual EKWB/Heatkiller loop
Memory 4*8 GB 3600 Corsair Vengeance
Video Card(s) Sapphire Pulse 5700XT
Storage SSD 1TB PCIe 4.0x4, 2 TB PCIe 3.0
Display(s) 2*WQHD
Case Lian Li O11 Rog
Audio Device(s) Hifiman, Topping DAC/KHV
Power Supply Seasonic 850W Gold
Mouse Logitech MX2, Logitech MX Ergo Trackball
Keyboard Cherry Stream Wireless, Logitech MX Keys
Software Linux Mint "Vera" Cinnamon
1. Proshop.de is, despite its TLD, a Danish shop based near Aarhus, not a German one, by the way.

2. Sapphire Nitro and PowerColor Red Devil are comparable. Both have, for example, the same boost frequency on the same GPU. Neither is an upsell of the other.

3. Unfortunately for me, I don't think there will be a water block available, so my choice remains between a 7900 XT and a 7800 XT, especially as my new GPU will have to drive two UWQHD monitors at 155 Hz.
 

T0@st

News Editor
Joined
Mar 7, 2023
Messages
2,077 (3.17/day)
Location
South East, UK
1. Proshop.de is, despite its TLD, a Danish shop based near Aarhus, not a German one, by the way.
Thanks for the knowledge; I've updated the article with the correct geography. I was confused by the .de domain and the site's language being entirely German.

I've only travelled to Denmark once in the past, specifically to Copenhagen, so I'm not really aware of anything outside of the city... I vaguely remember listening to a band from Aarhus.
 
Last edited:
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Copenhagen
But that's Denmark, not Germany...
this could work well for them...
...if they priced this GPU below the level of the 4070 non-SUPER. The 7900 GRE is only margin-of-error faster than the 7800 XT, and the latter doesn't run circles around anything as fast as a 4070. At 630 Euros, sales will be mediocre at best because the 7800 XT is a hundred dollars cheaper despite being almost identical in performance ("thanks" to the cripplingly low power limit and relatively slow VRAM on the GRE).
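
To put that value argument into rough numbers, here is a tiny cost-per-performance sketch; the ~€530 7800 XT price and the 3% performance gap are illustrative assumptions, not measured figures:

Code:
# Illustrative value comparison; prices and the relative-performance figure are
# assumptions for the sake of the arithmetic, not measured data.
cards = {
    "RX 7800 XT":  {"price_eur": 530, "rel_perf": 1.00},  # assumed street price
    "RX 7900 GRE": {"price_eur": 630, "rel_perf": 1.03},  # Proshop listing; ~3% faster assumed
}

for name, c in cards.items():
    print(f"{name}: {c['price_eur'] / c['rel_perf']:.0f} EUR per unit of relative performance")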
 

T0@st

News Editor
Joined
Mar 7, 2023
Messages
2,077 (3.17/day)
Location
South East, UK
But that's Denmark, not Germany...
Yeah, I was aware that I was taking a flight to Denmark, back in 2007. I would've been highly confused had I landed in a place full of people speaking German...not Dansk.
 
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Joined
May 24, 2023
Messages
801 (1.38/day)
Location
127.0.0.1, ::1
System Name Naboo (2019)
Processor AMD 3800x
Motherboard Gigabyte Aorus Master V1 (X470)
Cooling individual EKWB/Heatkiller loop
Memory 4*8 GB 3600 Corsair Vengeance
Video Card(s) Sapphire Pulse 5700XT
Storage SSD 1TB PCIe 4.0x4, 2 TB PCIe 3.0
Display(s) 2*WQHD
Case Lian Li O11 Rog
Audio Device(s) Hifiman, Topping DAC/KHV
Power Supply Seasonic 850W Gold
Mouse Logitech MX2, Logitech MX Ergo Trackball
Keyboard Cherry Stream Wireless, Logitech MX Keys
Software Linux Mint "Vera" Cinnamon
Thanks for the knowledge; I've updated the article with the correct geography. I was confused by the .de domain and the site's language being entirely German.

I've only travelled to Denmark once in the past, specifically to Copenhagen, so I'm not really aware of anything outside of the city... I vaguely remember listening to a band from Aarhus.
You're welcome.

That feels like forever ago. I was a 12 y.o. dude trying to survive at school...
12 years ago I lived in Switzerland and visited Denmark with my Ukrainian lady to marry her. Around the time you were born, I finished my second university degree and wrote my diploma thesis. I got my first computer in 1974, when I was six; you had to solder and assemble it yourself. I've been developing software ever since. That feels like forever.
 
Joined
Apr 6, 2021
Messages
1,131 (0.83/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
...if they priced this GPU below the level of the 4070 non-SUPER. The 7900 GRE is only margin-of-error faster than the 7800 XT, and the latter doesn't run circles around anything as fast as a 4070. At 630 Euros, sales will be mediocre at best because the 7800 XT is a hundred dollars cheaper despite being almost identical in performance ("thanks" to the cripplingly low power limit and relatively slow VRAM on the GRE).
Right? I don't get the purpose of the 7900 GRE. It's currently ~100€ more expensive than the 7800 XT (which has the same performance) and has no extra features that would give it an edge.

I guess it just exists to fish for an extra 100€ from non-tech-savvy folks who just see the "9" and think it has to be faster. :p

 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
Right? I don't get the purpose of the 7900 GRE. It's currently ~100€ more expensive than the 7800 XT (which has the same performance) and has no extra features that would give it an edge.

I guess it just exists to fish for an extra 100€ from non-tech-savvy folks who just see the "9" and think it has to be faster. :p

Indeed, the performance of the 7900 GRE makes no sense. It has way more CUs than the 7800 XT, yet, since its GPU clocks are much lower, it doesn't outperform it by more than a few fps, so why not just sell the Chinese market the 7800 XT?
 
Joined
Apr 6, 2021
Messages
1,131 (0.83/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
Indeed, the performance of the 7900 GRE makes no sense. It has way more CUs than the 7800 XT, yet, since its GPU clocks are much lower, it doesn't outperform it by more than a few fps, so why not just sell the Chinese market the 7800 XT?
Wait, they aren't selling the 7800XT in China? :wtf:
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Right? I don't get the purpose of the 7900 GRE. It's currently ~100€ more expensive than the 7800 XT (which has the same performance) and has no extra features that would give it an edge.

I guess it just exists to fish for an extra 100€ from non-tech-savvy folks who just see the "9" and think it has to be faster. :p


Far from it, the purpose of the 7900 GRE is to make a product out of a low-quality Navi 31 die that technically works but doesn't make the cut to be a 7900 XT. AMD then disables the bits that don't work too well and sells it as a cheaper product. The RTX 4090 is pretty much the same thing with the AD102; both the 7900 GRE and the 4090 are cut down by a similarly significant part of their full capability. This makes the 7800 XT the best RDNA 3 card if you want something that is balanced and performs to the hardware's fullest extent.

Being Navi 31, even cut down, it'll never be as cheap as Navi 32, because it should mostly retain the bill of materials of the other 7900 models, even though it has fewer components, such as memory, overall. It loses to the 6800 XT and often the 7800 XT in benchmarks because the RDNA 3 architecture is very inefficient at the high end and scales really poorly. This isn't a problem specific to the GRE, and it affects the other two cards directly.

Although comparing between architectures is apples to oranges, the general concept still applies: the RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, a 50% larger last-level cache and a 10% higher TGP allowance, yet it is only 1% faster than the RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and at launch AMD positioned it against the RTX 4080; the same deal we've always known: it's 4% faster in raster and 16% slower in RT.
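
For anyone who wants to reproduce that kind of spec-delta arithmetic, here is a minimal sketch. The three sample pairs (memory capacity, last-level cache, board power) are pulled from the public spec sheets rather than from the paragraph above, so treat the exact numbers as reference values, not as part of the original argument:

Code:
# Percentage deltas between two spec-sheet values, in the style of the comparison above.
def pct_more(a: float, b: float) -> float:
    """How much larger a is than b, in percent."""
    return (a / b - 1) * 100

pairs = {
    "memory capacity (GB)":  (24, 16),    # RX 7900 XTX vs RTX 4080 SUPER
    "last-level cache (MB)": (96, 64),
    "board power (W)":       (355, 320),
}

for label, (radeon, geforce) in pairs.items():
    print(f"{label}: +{pct_more(radeon, geforce):.0f}%")
# -> +50%, +50%, +11% (rounded to 10% in the post above)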
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Far from it, the purpose of the 7900 GRE is to make a product out of a low-quality Navi 31 die that technically works but doesn't make the cut to be a 7900 XT. AMD then disables the bits that don't work too well and sells it as a cheaper product. The RTX 4090 is pretty much the same thing with the AD102; both the 7900 GRE and the 4090 are cut down by a similarly significant part of their full capability. This makes the 7800 XT the best RDNA 3 card if you want something that is balanced and performs to the hardware's fullest extent.

Being Navi 31, even cut down, it'll never be as cheap as Navi 32, because it should mostly retain the bill of materials of the other 7900 models, even though it has fewer components, such as memory, overall. It loses to the 6800 XT and often the 7800 XT in benchmarks because the RDNA 3 architecture is very inefficient at the high end and scales really poorly. This isn't a problem specific to the GRE, and it affects the other two cards directly.

Although comparing between architectures is apples to oranges, the general concept still applies: the RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, a 50% larger last-level cache and a 10% higher TGP allowance, yet it is only 1% faster than the RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and at launch AMD positioned it against the RTX 4080; the same deal we've always known: it's 4% faster in raster and 16% slower in RT.
Whatever, I have no idea where you get your opinions from. Do you realize that the 4090 uses a smaller node than the 7900 series and uses more power? But of course, I must be talking out of my ass, as if the 7900 series required a 600-watt connector. That is the reason, and there are plenty of games in which it is faster than the 4090. Ray tracing is in the mouths of every reviewer, but certainly not in every game that has been released. As for you being an AMD employee and knowing the shortcomings of Navi 31, I would rather use my year-long experience with the 7900 XT than your baseless argument.

The 7900 GRE was never meant for retail, but there is obviously enough interest to bring it to retail.
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Whatever, I have no idea where you get your opinions from. Do you realize that the 4090 uses a smaller node than the 7900 series and uses more power? But of course, I must be talking out of my ass, as if the 7900 series required a 600-watt connector. That is the reason, and there are plenty of games in which it is faster than the 4090. Ray tracing is in the mouths of every reviewer, but certainly not in every game that has been released. As for you being an AMD employee and knowing the shortcomings of Navi 31, I would rather use my year-long experience with the 7900 XT than your baseless argument.

The 7900 GRE was never meant for retail, but there is obviously enough interest to bring it to retail.

AMD's and NVIDIA's nodes are roughly equivalent in this generation; in fact, the transistor density of the Navi 31 GCD (150.2 million transistors per mm²) exceeds that of the process used in AD102 (125.3 million per mm²), making it every bit as advanced as the NVIDIA card even after factoring in the MCDs on earlier-generation nodes, and neither of them requires 600 watts. The new 16-pin connector is a design choice so you run only one cable to your card instead of three bulky ones. That's the reason it was developed, not because you need absurd power limits for either card. Edge cases are not relevant (and even Starfield, arguably the ultimate edge case, doesn't result in a win for AMD).

My "opinions" come from looking at reviews, spec sheets (for example, the transistor density figures are available in the TPU GPU database, which is where I got the data), and a bit of hands-on experience. I stand by everything I say, and I admit when I'm wrong. Forums are an exchange of knowledge, and it really seems to bother you that I don't have a favorable opinion of AMD. That's true, I don't, but I dislike their fanbase and its warped view of reality far more than I dislike the company itself. There's no need to respond with "Whatever" or come up with actual nonsense like "you update AGESA by updating the chipset drivers", as in the other thread. Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them.
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
AMD's and NVIDIA's nodes are roughly equivalent in this generation; in fact, the transistor density of the Navi 31 GCD (150.2 million transistors per mm²) exceeds that of the process used in AD102 (125.3 million per mm²), making it every bit as advanced as the NVIDIA card even after factoring in the MCDs on earlier-generation nodes, and neither of them requires 600 watts. The new 16-pin connector is a design choice so you run only one cable to your card instead of three bulky ones. That's the reason it was developed, not because you need absurd power limits for either card. Edge cases are not relevant (and even Starfield, arguably the ultimate edge case, doesn't result in a win for AMD).

My "opinions" come from looking at reviews, spec sheets (for example, the transistor density figures are available in the TPU GPU database, which is where I got the data), and a bit of hands-on experience. I stand by everything I say, and I admit when I'm wrong. Forums are an exchange of knowledge, and it really seems to bother you that I don't have a favorable opinion of AMD. That's true, I don't, but I dislike their fanbase and its warped view of reality far more than I dislike the company itself. There's no need to respond with "Whatever" or come up with actual nonsense like "you update AGESA by updating the chipset drivers", as in the other thread. Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them.
Indeed. Reality. Russia has invaded Ukraine. After that, Putin met with Xi and made a "whatever it takes" treaty. China sends BMs towards Taiwan every week. China is openly buying oil from Iran. Iran is supplying Russia. China warns Taiwan that if the Democratic party is elected, they will essentially invade. Russia was losing the war. Russia influenced Hamas through Iran to trigger Netanyahu. The World Court finds Israel guilty of genocide. The West responds by cutting off aid to the Palestinians. Now the Middle East is a powder keg. Getting back to Ukraine: one of the truths of the Ukraine conflict is how much technology has changed the battlefield. Access to the Warsaw Pact database has allowed their drones and drone operators to destroy about 12 billion in military equipment, like tanks and troop carriers, since the war started. Some of the best drones are the ones that Russia gets from Iran, which have machine learning. Anecdotally, we are getting 4090 laptop chips in GPU shrouds. Why did they remove them in the first place?
I do not hate Nvidia. I just do not agree with their hubris. You made it seem like China is inert. They have built three aircraft carriers that we know about.

Now, what node is Nvidia on? What node is AMD on? What memory does Nvidia use? What memory does AMD use? I know they both use TSMC, but like I said already, they are on different nodes. You are even championing that ridiculous connector, which has had one of the fastest revisions in the history of the PC. Now PSU cables are bulky, wow. I am not even going to touch why Nvidia cards use less power, but we can go on.

Have you heard me say anything negative about the performance of the cards that Nvidia makes? It has nothing to do with you disliking AMD; it is just your comments about how bad AMD is. You could not even help yourself with "Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them." Yes, I am over the moon with my 7900X3D/7900XT combo, and yes, I will challenge someone with a "bit of experience" to show me proof when they make ridiculous comments about the 7900 series like "the RDNA 3 architecture is very inefficient at the high end and scales really poorly". I almost fell out of my chair when I read that.

As an example: if I set VSync, my frame rate is 144, and in most games the GPU sits at 1800 MHz at 4K. I did not buy a 7900 XT for that, though, so if you think that a 7800 XT is the best price/performance card, that is just not true. Maybe in the States, where those cards are under $500, but where I live the 7900 XT is $1000 and the 7800 XT is $800. I would pay that premium for two more chiplets and 4 GB more RAM, as I play my games at 4K.

Knowledge is listening, not arguing with someone about how bad their product is because you read it somewhere. Why don't you browse one of the AMD threads and see us talking about things like how easy it is to get to 3 GHz with one-click OCs, or a Ryzen thread and see how much we talk about PBO and other things, like X3D not using as much power as regular chips. I digress, though, as you are in an AMD thread bashing AMD.
 
Joined
Oct 27, 2014
Messages
185 (0.05/day)
Would be great, a nice in-between card. Would be even nicer if the Hellhound had proper RGB lights instead of this blue and white nonsense. Obviously to upsell you to a Nitro+.

I actually love the fact they don't. :)
The Hellhound has LED lighting, and actually a couple of options; a third option is the off mode, which blacks out the card's LEDs if you don't want any.

Regarding the GRE, I feel like it's not just a 7800 XT; ComputerBase tested the card and I think it does 7-8% better. If this card comes in at around 599€, then it's a great buy.
 
Joined
Jan 27, 2024
Messages
291 (0.88/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
7900 GRE is to make a product out of a low-quality Navi 31 die that technically works but doesn't make the cut to be a 7900 XT. AMD then disables the bits that don't work too well and sells it as a cheaper product.

Yeah, ironically, I'd call it "Garbage Radeon Edition". That stands well for what a potential buyer would get from it. :mad:

Although comparing between architectures is apples to oranges, the general concept still applies: the RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, a 50% larger last-level cache and a 10% higher TGP allowance, yet it is only 1% faster than the RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and at launch AMD positioned it against the RTX 4080; the same deal we've always known: it's 4% faster in raster and 16% slower in RT.

Well, the combined die size of Navi 31 is only 529 mm² with only 57.7 billion transistors, while the AD102 has as many as 76.3 billion transistors (32% more) and a die size of 609 mm² (15% larger). Clearly, Navi 31 cannot compete simply because it has fewer resources.

As for why Navi 31 can't beat AD103, that is more likely down to combined disadvantages: Navi's shaders, ROPs, TMUs, etc. are not utilised to their highest potential (there must be some analysis showing that it can't reach its theoretical performance levels), plus some optimisation by Nvidia in its drivers that increases performance while losing something else, for example texture resolution.
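
Those figures are straightforward to cross-check; a quick sketch using only the numbers already quoted in this thread (combined Navi 31 area and transistor count, AD102 area and transistor count):

Code:
# Cross-checking the die-size/transistor figures quoted in this thread.
chips = {
    "Navi 31 (GCD + MCDs)": {"transistors_bn": 57.7, "area_mm2": 529},
    "AD102":                {"transistors_bn": 76.3, "area_mm2": 609},
}

for name, c in chips.items():
    density = c["transistors_bn"] * 1000 / c["area_mm2"]   # million transistors per mm^2
    print(f"{name}: {density:.1f} MTr/mm^2")
# Navi 31 combined: ~109.1 MTr/mm^2 (lower than the GCD-only 150.2 quoted earlier,
# since the MCDs sit on an older, less dense node); AD102: ~125.3 MTr/mm^2.

print(f"AD102 transistor count: +{(76.3 / 57.7 - 1):.0%}")  # ~+32%, as stated above
print(f"AD102 die area:         +{(609 / 529 - 1):.0%}")    # ~+15%, as stated above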
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Yeah, ironically, I'd call it "Garbage Radeon Edition". That stands well for what a potential buyer would get from it. :mad:



Well, the combined die size of Navi 31 is only 529 mm² with only 57.7 billion transistors, while the AD102 has as many as 76.3 billion transistors (32% more) and a die size of 609 mm² (15% larger). Clearly, Navi 31 cannot compete simply because it has fewer resources.

As for why Navi 31 can't beat AD103, that is more likely down to combined disadvantages: Navi's shaders, ROPs, TMUs, etc. are not utilised to their highest potential (there must be some analysis showing that it can't reach its theoretical performance levels), plus some optimisation by Nvidia in its drivers that increases performance while losing something else, for example texture resolution.

I'm not sure that absolute combined die area or total transistor count can be used to support the argument of N31 being "smaller", because, like I said, the architectures differ and the GCD is much more densely packed, and there's a quirk of Ada in that it contains several instances of NVDEC, of which only one is actually enabled on all GeForce models. Regardless, since the prices are much lower, it's quite forgivable; it costs AMD the absolute crown, but it doesn't detract from the GPU as a product.

It is true that N31's resources cannot be fully exploited; that is why this phenomenon occurs to begin with. However, past AD103, scaling begins to become problematic for Nvidia as well; it's only fair to mention that.

As for the "Nvidia cheats on image quality" argument, that was debunked so long ago that I'd rather not comment. Textures do not look worse on a GeForce than they do on a Radeon; both quality and bit depth are identical.
 
Joined
Jan 27, 2024
Messages
291 (0.88/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
They indeed look worse.
Watch the proof from 8:02 to 12:49 in this video:


This is not caused by "AMD" or "NVIDIA", but rather because the RTX 3070 is VRAM-starved and games have begun to push 8 GB cards beyond their limitations. HUB themselves made a follow-up video testing the RTX A4000 (which they happened to own), and it does not suffer from that problem. The same issue will affect 8 GB AMD GPUs, and it's why people want 12 GB as an absolute minimum going forward.

 
Joined
Jan 27, 2024
Messages
291 (0.88/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
This is not caused by "AMD" or "NVIDIA", but rather because the RTX 3070 is VRAM-starved and games have begun to push 8 GB cards beyond their limitations. HUB themselves made a follow-up video testing the RTX A4000 (which they happened to own), and it does not suffer from that problem. The same issue will affect 8 GB AMD GPUs, and it's why people want 12 GB as an absolute minimum going forward.

That's not true. The behaviour when VRAM runs low is either a message that the game refuses to run at all, or a very low framerate with the same texture resolution.
I hope you distinguish between running low on VRAM with the framerate staying sky-high but textures missing, and running low on VRAM with the textures remaining on screen but the framerate plummeting?
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That's not true. The behaviour when VRAM runs low is either a message that the game refuses to run at all, or a very low framerate with the same texture resolution.
I hope you distinguish between running low on VRAM with the framerate staying sky-high but textures missing, and running low on VRAM with the textures remaining on screen but the framerate plummeting?

Not only is it true, but don't you think that if Nvidia were really cheating on something like texturing, everyone would know and immediately notice? It was debunked ages upon ages ago. HUB's video literally shows that the exact same GA104 with 8 GB and 16 GB behaves completely differently, and all of the newer RTX cards with 12+ GB have been immune to the problems shown in that video. And again, VRAM-starved AMD cards display the exact same symptoms; the difference is that AMD's 8 GB cards tend to be much lower end, because AMD isn't as stingy with the amount of memory installed. It's perfectly OK for an RX 6600 to have 8 GB, but while workable, it's clearly not that OK for a 3070 or 3070 Ti-level card to have just 8 GB, and that's why the 6700 XT comes with 12 GB.
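
For anyone who would rather verify the VRAM-starvation explanation on their own system than argue about it, here is a minimal monitoring sketch. It assumes an NVIDIA card with nvidia-smi available on the PATH and uses only standard nvidia-smi query options:

Code:
# Minimal VRAM monitor via nvidia-smi (NVIDIA GPUs only, nvidia-smi must be on PATH).
# If "used" sits pinned at the card's capacity while a game runs, missing textures
# and stutter are almost certainly VRAM starvation rather than driver trickery.
import subprocess
import time

def vram_usage_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First GPU only; each output line is "used, total" in MiB.
    used, total = (int(v) for v in out.strip().splitlines()[0].split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_usage_mib()
        print(f"VRAM: {used} / {total} MiB ({used / total:.0%})")
        time.sleep(2)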
 
Joined
Jan 27, 2024
Messages
291 (0.88/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
if Nvidia were really cheating on something like texturing, everyone would know and immediately notice?

No. Many people claim that they don't see the difference, either because they support Nvidia so much that they don't want to, or simply because they use old 1K screens, which are horrendous for image quality anyway.
The situation is such that some people even claim that they don't see a difference between 1K and 4K. How can we discuss anything further?

It was debunked ages upon ages ago.

Nothing is "debunked". It has been proven since decades that Nvidia in general offers lower quality and there are many internet threads about it.
I have also been able to compare and tell you that I do not feel comfortable when a GeForce card is connected to my screen.

Actually, it was proven multiple times.

Dota 2 showed a slight difference in image detail between the AMD and Nvidia graphics cards. When examining the grass and walls, there was a noticeable increase in detail on the AMD RX590. Additionally, the flag appeared more defined on the AMD card. Despite the difference in detail, the AMD graphics card performed lower than the Nvidia card in terms of frame rates.




VRAM-starved AMD cards display the exact same symptoms

Here, I'd like to correct myself: yes, in some new games the Radeons do behave strangely and begin not loading textures.

The 7600 takes the lead over the 6650 XT at 1440p, but these results are somewhat skewed as the 7600 consistently had missing textures in this test. This is a common issue for all 8GB models, leading to inconsistent memory management and unreliable results.

The Last of Us poses problems for VRAM and 8GB graphics cards. At 1080p with ultra quality settings, the game appeared to render correctly, but frame time performance was noticeably worse for the 8GB cards. The RTX 3060, which has 12GB of VRAM, saw 1% lows nearly 40% greater than those of the 7600.

 