Thursday, February 1st 2024

PowerColor Hellhound Radeon RX 7900 GRE OC Lined up for Possible EU Wide Release

It seems that AMD and its board partners are continuing to roll out new custom graphics cards based on the formerly China-exclusive Radeon RX 7900 GRE 16 GB GPU—PowerColor unleashed its fiendish flagship Red Devil model as one of last September's launch options. Their Chinese website has been updated with another Navi 31 XL entry—the Hellhound Radeon RX 7900 GRE OC. This design sits below the Red Devil in the company's graphics card product and pricing hierarchy, providing excellent cooling performance with fewer frills. The latest custom RX 7900 GRE card borrows PowerColor's existing demonic dog design from the mid-tier Hellhound RX 7800 XT and RX 7700 XT models; the Hellhound enclosure deployed on Radeon RX 7900 XTX and RX 7900 XT GPUs is a much chunkier affair.

The PowerColor Hellhound Radeon RX 7900 GRE OC has also popped up on a couple of UK and mainland Europe price comparison engines (published 2024-01-30), so it is possible that a very limited release could occur across a smattering of countries and retail channels—Proshop Denmark seems to be the first place with cards in stock; pricing is €629.90 (~$682) at the time of writing. The Radeon RX 7900 GRE (Golden Rabbit Edition) GPU sits in an awkward spot between the fancier Navi 31 options and its Navi 32 siblings—AMD and its AIB partners have reduced MSRPs in Europe, possibly in reaction to the recent launch of NVIDIA's GeForce RTX 40 SUPER series. We are not sure whether this initiative has boosted the RX 7900 GRE's popularity in this region, since very few outlets actually offer the (XFX-produced) reference model or Sapphire's Pulse custom design.
Proshop.de details: AMD Radeon RX 7900 GRE Overclocked
- Core clock: 1500 MHz / boost clock: 2355 MHz
- 5120 stream processors
- 16 GB GDDR6 (18 Gbps effective) on a 256-bit bus
- PCI-Express 4.0 x16
- 3x DisplayPort 2.1 / 1x HDMI 2.1 outputs
- Supports AMD FreeSync and Adaptive Sync
- 2x 8-pin power connectors; recommended power supply: 750 W
- Length: 322 mm; slot width: 2.5 slots
- PowerColor low-noise triple-fan cooler with 0 RPM zero-fan mode (at low temperatures)
- Amethyst LED lighting
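For reference, the listed 18 Gbps effective memory speed on a 256-bit bus works out to 576 GB/s of bandwidth; a minimal sketch of that arithmetic:

```python
# Minimal sketch: memory bandwidth implied by the listed specs.
# Assumes the listing's "18 GHz" memory clock means an 18 Gbps
# effective data rate per pin, as is conventional for GDDR6.
data_rate_gbps = 18     # effective data rate per pin (Gbps)
bus_width_bits = 256    # memory bus width
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 576 GB/s
```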

Model number: RX7900GRE 16G-L/OC.
Sources: PowerColor China, Skinflint UK, Geizhals DE

27 Comments on PowerColor Hellhound Radeon RX 7900 GRE OC Lined up for Possible EU Wide Release

#1
Dr. Dro
Smart move from PowerColor; the EU market is very receptive to Radeon and extremely value-conscious, so this could work well for them.
#2
thegnome
Would be great, a nice in-between card. Would be even nicer if the Hellhound had proper RGB lights instead of this blue and white nonsense. Obviously to upsell to a Nitro+.
#3
Kirederf
thegnome: Would be great, a nice in-between card. Would be even nicer if the Hellhound had proper RGB lights instead of this blue and white nonsense. Obviously to upsell to a Nitro+.
Nitro+? You mean the Red Devil? I highly doubt PowerColor is trying to upsell you a Sapphire card :)
#4
gurusmi
1. Proshop.de is, by the way, despite its TLD, a Danish shop near Aarhus, not a German one.

2. The Sapphire Nitro+ and PowerColor Red Devil are comparable. Both have, for example, the same boost frequency on the same GPU. Neither is an upsell of the other.

3. Unfortunately for me, I don't think there will be a water block available, so my choice remains between a 7900 XT and a 7800 XT. Especially as my new GPU will have to drive two UWQHD monitors at 155 Hz.
#5
T0@st
News Editor
gurusmi: 1. Proshop.de is, by the way, despite its TLD, a Danish shop near Aarhus, not a German one.
Thanks for the knowledge; updated the article with the correct geography. Confused by the .de thing, and the site's language being entirely German.

I've only travelled to Denmark once in the past. Specifically Copenhagen, so I'm not really aware of anything outside of the city...vaguely remember listening to a band from Aarhus.
#6
Beginner Micro Device
T0@st: Copenhagen
But that's Denmark, not Germany...
Dr. Dro: this could work well for them...
...if they priced this GPU below the level of 4070 non-Super. 7900 GRE is margin of error faster than 7800 XT and the latter doesn't run circles around anything at least as fast as 4070. For 630 Euros, sales will be mediocre at best because 7800 XT is a hundred dollars cheaper despite being almost identical in performance ("thanks" to cripplingly low power limit and relatively slow VRAM on the GRE).
#7
T0@st
News Editor
Beginner Micro Device: But that's Denmark, not Germany...
Yeah, I was aware that I was taking a flight to Denmark, back in 2007. I would've been highly confused had I landed in a place full of people speaking German...not Danish.
#8
Beginner Micro Device
T0@st: back in 2007.
That feels like forever ago. I was a 12 y.o. dude trying to survive at school...
#9
gurusmi
T0@st: Thanks for the knowledge; updated the article with the correct geography. Confused by the .de thing, and the site's language being entirely German.

I've only travelled to Denmark once in the past. Specifically Copenhagen, so I'm not really aware of anything outside of the city...vaguely remember listening to a band from Aarhus.
You're welcome.
Beginner Micro Device: That feels like forever ago. I was a 12 y.o. dude trying to survive at school...
12 years ago I lived in Switzerland and visited Denmark with my Ukrainian lady to marry her. Around the time you were born, I completed my second course of university studies and wrote my diploma thesis. I got my first computer in 1974, when I was six. One had to solder and assemble it oneself. I've been developing software ever since. That feels like forever.
#10
MarsM4N
Beginner Micro Device: ...if they priced this GPU below the level of 4070 non-Super. 7900 GRE is margin of error faster than 7800 XT and the latter doesn't run circles around anything at least as fast as 4070. For 630 Euros, sales will be mediocre at best because 7800 XT is a hundred dollars cheaper despite being almost identical in performance ("thanks" to cripplingly low power limit and relatively slow VRAM on the GRE).
Right? I don't get the purpose of the 7900 GRE. It's currently ~100€ more expensive than the 7800 XT (which has the same performance) and has no extra features that would give it an edge.

I guess it just exists to fish an extra 100€ from non-tech-savvy folks who just see the "9" and think it has to be faster. :p

#11
Minus Infinity
MarsM4N: Right? I don't get the purpose of the 7900 GRE. It's currently ~100€ more expensive than the 7800 XT (which has the same performance) and has no extra features that would give it an edge.

I guess it just exists to fish an extra 100€ from non-tech-savvy folks who just see the "9" and think it has to be faster. :p

Indeed, the performance of the 7900 GRE makes no sense. It has way more CUs than the 7800 XT yet, I guess since GPU clocks are much lower, doesn't outperform it by more than a few FPS. So why not just sell the Chinese market the 7800 XT?
#12
MarsM4N
Minus Infinity: Indeed, the performance of the 7900 GRE makes no sense. It has way more CUs than the 7800 XT yet, I guess since GPU clocks are much lower, doesn't outperform it by more than a few FPS. So why not just sell the Chinese market the 7800 XT?
Wait, they aren't selling the 7800XT in China? :wtf:
#13
Dr. Dro
MarsM4N: Right? I don't get the purpose of the 7900 GRE. It's currently ~100€ more expensive than the 7800 XT (which has the same performance) and has no extra features that would give it an edge.

I guess it just exists to fish an extra 100€ from non-tech-savvy folks who just see the "9" and think it has to be faster. :p

Far from it. The purpose of the 7900 GRE is to make a product out of a low-quality Navi 31 die that technically works but doesn't make the cut to be a 7900 XT. AMD then disables the bits that don't work too well and sells it as a cheaper product. The RTX 4090 is pretty much the same thing with the AD102; both the 7900 GRE and 4090 are similarly cut down to a significant part of their capability. This makes the 7800 XT the best RDNA 3 card if you want something that is balanced and performs to the hardware's fullest extent.

Being Navi 31, even cut down, it'll never be as cheap as Navi 32 because it should mostly retain the bill of materials from the other 7900 models, even though it has fewer components, such as memory, overall. It loses to the 6800 XT and often the 7800 XT in benchmarks because the RDNA 3 architecture is very inefficient at the high end and scales really poorly. This isn't a problem specific to the GRE; it affects the other two cards directly.

Although comparing between architectures is apples to oranges, the general concept still applies: RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, 50% larger last level cache and 10% higher TGP allowance, but it is only 1% faster than RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and upon launching AMD positioned it against the RTX 4080, same deal we've always known: it's 4% faster in raster and 16% slower in RT.
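If you want to recompute deltas like these yourself, here is a minimal Python sketch; only the memory capacity, cache, and TGP figures are filled in (assumed from the TPU GPU database), but the other specs would slot in the same way.

```python
# Minimal sketch: relative spec deltas between two cards.
# Figures below are assumed from the TPU GPU database; treat any
# other entries (ROPs, TMUs, bandwidth) the same way.
def pct_more(a: float, b: float) -> float:
    """How much larger a is than b, in percent."""
    return (a / b - 1.0) * 100.0

rx7900xtx = {"memory_gb": 24, "cache_mb": 96, "tgp_w": 355}
rtx4080s  = {"memory_gb": 16, "cache_mb": 64, "tgp_w": 320}

for key in rx7900xtx:
    print(f"{key}: +{pct_more(rx7900xtx[key], rtx4080s[key]):.1f}%")
# memory_gb: +50.0%  cache_mb: +50.0%  tgp_w: +10.9%
```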
#14
kapone32
Dr. Dro: Far from it. The purpose of the 7900 GRE is to make a product out of a low-quality Navi 31 die that technically works but doesn't make the cut to be a 7900 XT. AMD then disables the bits that don't work too well and sells it as a cheaper product. The RTX 4090 is pretty much the same thing with the AD102; both the 7900 GRE and 4090 are similarly cut down to a significant part of their capability. This makes the 7800 XT the best RDNA 3 card if you want something that is balanced and performs to the hardware's fullest extent.

Being Navi 31, even cut down, it'll never be as cheap as Navi 32 because it should mostly retain the bill of materials from the other 7900 models, even though it has fewer components, such as memory, overall. It loses to the 6800 XT and often the 7800 XT in benchmarks because the RDNA 3 architecture is very inefficient at the high end and scales really poorly. This isn't a problem specific to the GRE; it affects the other two cards directly.

Although comparing between architectures is apples to oranges, the general concept still applies: RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, 50% larger last level cache and 10% higher TGP allowance, but it is only 1% faster than RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and upon launching AMD positioned it against the RTX 4080, same deal we've always known: it's 4% faster in raster and 16% slower in RT.
Whatever, I have no idea where you get your opinions from. Do you realize that the 4090 uses a smaller node than the 7900 series and uses more power? But of course, I must be talking out my ass, as the 7900 series requires a 600-watt connector. That is the reason, and there are plenty of games where it's faster than the 4090. Ray tracing is in the mouths of every reviewer, but certainly not in every game that has been released. As for you being an AMD employee and knowing the shortcomings of Navi 31: I would rather rely on my year-long experience with the 7900 XT than on your baseless argument.

The 7900 GRE was never meant for retail, but there is obviously enough interest to bring it to retail.
#15
Dr. Dro
kapone32: Whatever, I have no idea where you get your opinions from. Do you realize that the 4090 uses a smaller node than the 7900 series and uses more power? But of course, I must be talking out my ass, as the 7900 series requires a 600-watt connector. That is the reason, and there are plenty of games where it's faster than the 4090. Ray tracing is in the mouths of every reviewer, but certainly not in every game that has been released. As for you being an AMD employee and knowing the shortcomings of Navi 31: I would rather rely on my year-long experience with the 7900 XT than on your baseless argument.

The 7900 GRE was never meant for retail, but there is obviously enough interest to bring it to retail.
AMD's and NVIDIA's nodes are roughly equivalent in this generation; in fact, the transistor density of the Navi 31 GCD (150.2 million per mm²) exceeds that of the process used in AD102 (125.3 million per mm²), making it every bit as advanced as the NVIDIA card even after factoring in the MCDs built on earlier-generation nodes, and neither of them requires 600 watts. The new 16-pin connector is a design choice so you run only one cable to your card instead of three bulky ones. That's the reason it was developed, not because you need absurd power limits for either card. Edge cases are not relevant (and not even Starfield, arguably the ultimate edge case, results in a win for AMD).
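As a quick sanity check on those density figures, a minimal sketch (transistor counts and die areas assumed from the TPU GPU database, so treat them as approximate):

```python
# Minimal sketch: transistor density from transistor count and die area.
# Counts and areas are assumed from the TPU GPU database entries.
dies = {
    "Navi 31 GCD": (45.7e9, 304.35),  # (transistors, die area in mm^2)
    "AD102":       (76.3e9, 609.0),
}

for name, (transistors, area_mm2) in dies.items():
    density = transistors / area_mm2 / 1e6  # million transistors per mm^2
    print(f"{name}: {density:.1f} M/mm^2")
# Navi 31 GCD: 150.2 M/mm^2
# AD102: 125.3 M/mm^2
```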

My "opinions" come from looking at reviews, spec sheets (for example, the transistor density figures are available on the TPU GPU database, where I got the data from), and a bit of hands-on experience. I stand by everything I say, and admit when I'm wrong. Forums are an exchange of knowledge, and it really seems to bother you that I don't have a favorable opinion of AMD. That's true, I don't, but I dislike their fanbase and their warped view of reality far more than I dislike the company itself. There's no need to act like "Whatever" or come up with actual nonsense like "you update AGESA by updating the chipset drivers" like on the other thread. Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them.
#16
kapone32
Dr. Dro: AMD's and NVIDIA's nodes are roughly equivalent in this generation; in fact, the transistor density of the Navi 31 GCD (150.2 million per mm²) exceeds that of the process used in AD102 (125.3 million per mm²), making it every bit as advanced as the NVIDIA card even after factoring in the MCDs built on earlier-generation nodes, and neither of them requires 600 watts. The new 16-pin connector is a design choice so you run only one cable to your card instead of three bulky ones. That's the reason it was developed, not because you need absurd power limits for either card. Edge cases are not relevant (and not even Starfield, arguably the ultimate edge case, results in a win for AMD).

My "opinions" come from looking at reviews, spec sheets (for example, the transistor density figures are available on the TPU GPU database, where I got the data from), and a bit of hands-on experience. I stand by everything I say, and admit when I'm wrong. Forums are an exchange of knowledge, and it really seems to bother you that I don't have a favorable opinion of AMD. That's true, I don't, but I dislike their fanbase and their warped view of reality far more than I dislike the company itself. There's no need to act like "Whatever" or come up with actual nonsense like "you update AGESA by updating the chipset drivers" like on the other thread. Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them.
Indeed. Reality. Russia has invaded Ukraine. After that, Putin met with Xi and made a whatever-it-takes treaty. China sends BMs towards Taiwan every week. China is openly buying oil from Iran. Iran is supplying Russia. China warns Taiwan that if the Democratic party is elected, they will essentially invade. Russia was losing the war. Russia influenced Hamas through Iran to trigger Netanyahu. The World Court finds Israel guilty of genocide. The West responds by cutting off aid to the Palestinians. Now the Middle East is a powder keg. Getting back to Ukraine: one of the truths of the Ukraine conflict is how much technology has changed the battlefield. Access to the Warsaw Pact database allows their drones and drone operators to destroy about 12 billion in military equipment like tanks and troop carriers since the war started. Some of the best drones are the ones that Russia gets from Iran, which have machine learning. Anecdotally, we are getting 4090 laptop chips in GPU shrouds. Why did they remove them in the first place?
I do not hate Nvidia. I just do not agree with their hubris. You made it seem like China is inert. They have built three aircraft carriers that we know about.

Now what nanometer is Nvidia at? What nanometer is AMD at? What DDR does Nvidia use? What DDR does AMD use? I know they both use TSMC but, like I said already, they are on different nodes. You are even championing that ridiculous connector that has had one of the fastest revisions in the history of PC. Now PSU cables are bulky, wow. I am not even going to touch why Nvidia cards use less power, but we can go on.

Have you heard me say anything negative about the performance of the cards that Nvidia makes? It has nothing to do with you disliking AMD; it is your comments about how bad AMD is. You could not even help yourself with "Perhaps the reality is that AMD just isn't as nice a company as you think they are, and their products do have some glaring shortcomings that clearly do not match the irrationally high regard that so many have for them." Yes, I am over the moon with my 7900X3D/7900XT combo, and yes, I challenge someone with a "bit of experience" when they make ridiculous comments about the 7900 series like "the RDNA 3 architecture is very inefficient at the high end and scales really poorly". I almost fell out of my chair when I read that.

As an example for you: if I set Vsync, my frame rate is 144 and in most games the GPU is at 1800 MHz at 4K. I did not buy a 7900 XT for that, though, so if you think that a 7800 XT is the best price/performance card, that is just not true. Maybe in the States, where those cards are under $500, but where I live the 7900 XT is $1000 and the 7800 XT is $800. I would pay that premium for 2 more chiplets and 4 GB more RAM, as I play my games at 4K.

Knowledge is listening, not arguing with someone about how bad their product is because you read it somewhere. Why don't you browse one of the AMD threads and see us talking about things like how easy it is to get to 3 GHz with one-click OCs? Or a Ryzen thread, and see how much we talk about PBO and other things, like X3D not using as much power as regular chips. I digress though, as you are in an AMD thread bashing AMD.
#17
lukart
thegnome: Would be great, a nice in-between card. Would be even nicer if the Hellhound had proper RGB lights instead of this blue and white nonsense. Obviously to upsell to a Nitro+.
I actually love the fact they don't. :)
The Hellhound has LED lights and actually a couple of options, but the third option is the off mode, which blacks out the card's LEDs if you don't want any.

Regarding the GRE, I feel like it's not just a 7800 XT; ComputerBase tested the card and I think it does 7-8% better. If this card comes in at around 599€, then it's a great buy.
#18
3valatzy
Dr. Dro: the purpose of the 7900 GRE is to make a product out of a low-quality Navi 31 die that technically works but doesn't make the cut to be a 7900 XT. AMD then disables the bits that don't work too well and sells it as a cheaper product.
Yeah, ironically, I'd call it "Garbage Radeon Edition". Stands well for what one potential buyer would get from it. :mad:
Dr. Dro: Although comparing between architectures is apples to oranges, the general concept still applies: RX 7900 XTX (full Navi 31) has 42.5% more ROPs, 16.5% more TMUs, 23.3% more memory bandwidth, 50% higher memory capacity, 50% larger last level cache and 10% higher TGP allowance, but it is only 1% faster than RTX 4080 SUPER (full AD103) in raster and 20% slower in RT. A processor of its size and complexity clearly targeted the RTX 4090, but it failed to measure up, and upon launching AMD positioned it against the RTX 4080, same deal we've always known: it's 4% faster in raster and 16% slower in RT.
Well, the combined die size of Navi 31 is only 529 sq. mm with only 57.7 billion transistors, while the AD102 has as many as 76.3 billion transistors (32% more) and a die size of 609 sq. mm (15% larger). Clearly, Navi 31 cannot compete simply because it has fewer resources.
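Both percentages follow directly from the stated figures; a minimal sketch:

```python
# Minimal sketch: the quoted ratios, computed from the die figures above.
navi31_area_mm2, navi31_transistors = 529, 57.7e9
ad102_area_mm2,  ad102_transistors  = 609, 76.3e9

more_transistors = (ad102_transistors / navi31_transistors - 1) * 100
larger_die       = (ad102_area_mm2 / navi31_area_mm2 - 1) * 100
print(f"AD102 has {more_transistors:.0f}% more transistors")  # 32%
print(f"AD102 is {larger_die:.0f}% larger")                   # 15%
```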

As for why Navi 31 can't beat AD103, that's more likely because of combined disadvantages: Navi's shaders, ROPs, TMUs, etc. are not utilised to their highest potential (there must be some analyses which show that it can't reach its theoretical performance levels), and some optimisation by Nvidia in their drivers which increases performance while losing something else, for example texture resolution.
#19
Dr. Dro
3valatzy: Yeah, ironically, I'd call it "Garbage Radeon Edition". Stands well for what one potential buyer would get from it. :mad:

Well, the combined die size of Navi 31 is only 529 sq. mm with only 57.7 billion transistors, while the AD102 has as many as 76.3 billion transistors (32% more) and a die size of 609 sq. mm (15% larger). Clearly, Navi 31 cannot compete simply because it has fewer resources.

As for why Navi 31 can't beat AD103, that's more likely because of combined disadvantages: Navi's shaders, ROPs, TMUs, etc. are not utilised to their highest potential (there must be some analyses which show that it can't reach its theoretical performance levels), and some optimisation by Nvidia in their drivers which increases performance while losing something else, for example texture resolution.
I'm not sure that absolute combined die area or total transistor count can be used to support the argument of N31 being "smaller" because, like I said, architectures differ and the GCD is much more densely packed; there's also a characteristic of Ada that it contains several instances of NVDEC, of which only one is actually enabled on all GeForce models. Regardless, since the prices are much lower, it's quite forgivable; it costs AMD the absolute crown, but it doesn't detract from the GPU as a product.

It is true that the N31 resources cannot be fully exploited; that is the reason why this phenomenon occurs to begin with. However, past AD103, scaling begins to become problematic for Nvidia as well; it's only fair to mention that.

As for the "Nvidia cheats on image quality" argument, that was debunked so long ago that I'd rather not comment. Textures do not look worse on a GeForce than they do on a Radeon; both quality and bit depth are identical.
#20
3valatzy
Dr. Dro: Textures do not look worse on a GeForce
They indeed look worse.
Watch the proof from 8:02 to 12:49 in this video:

#21
Dr. Dro
3valatzy: They indeed look worse.
Watch the proof from 8:02 to 12:49 in this video:

This is not caused by "AMD" or "NVIDIA", but rather by the RTX 3070 being VRAM-starved as games have begun to push 8 GB cards beyond their limitations. HUB themselves made a follow-up video testing the RTX A4000 (which they happened to own) and it does not suffer from that problem. The same issue will affect 8 GB AMD GPUs, and it's why people want 12 GB as an absolute minimum going forward.

#22
3valatzy
Dr. Dro: This is not caused by "AMD" or "NVIDIA", but rather by the RTX 3070 being VRAM-starved as games have begun to push 8 GB cards beyond their limitations. HUB themselves made a follow-up video testing the RTX A4000 (which they happened to own) and it does not suffer from that problem. The same issue will affect 8 GB AMD GPUs, and it's why people want 12 GB as an absolute minimum going forward.
That's not true. The behaviour when VRAM runs out is either a message and the game refusing to run at all, or a very low framerate with the same texture resolution.
I hope you can tell the difference between low VRAM where the framerate stays sky-high but the textures are missing, and low VRAM where the textures remain on screen but the framerate plummets?
#23
Dr. Dro
3valatzy: That's not true. The behaviour when VRAM runs out is either a message and the game refusing to run at all, or a very low framerate with the same texture resolution.
I hope you can tell the difference between low VRAM where the framerate stays sky-high but the textures are missing, and low VRAM where the textures remain on screen but the framerate plummets?
Not only is it true, but don't you think that if Nvidia was really cheating on something like texturing, everyone would know and immediately notice? It's been debunked ages upon ages ago. HUB's video literally shows that the exact same GA104 with 8 and 16 GB behaves completely differently, and all of the newer RTX cards with 12+ GB have always been immune to the problems shown in that video. And again, VRAM-starved AMD cards display the exact same symptoms, the difference being that AMD 8 GB cards tend to be much lower-end because AMD isn't as stingy with the amount of memory installed. It's perfectly OK for an RX 6600 to have 8 GB, but while workable, it's clearly not that OK for a 3070 or 3070 Ti-level card to have just 8 GB, and that's why the 6700 XT comes with 12 GB.
#24
3valatzy
Dr. Dro: if Nvidia was really cheating on something like texturing, everyone would know and immediately notice?
No. Many people claim that they don't see the difference, either because they support Nvidia so much that they don't want to, or simply because they use old 1K screens, which are horrendous for image quality anyway.
The situation is such that some people even claim that they don't see a difference between 1K and 4K. How can we discuss anything more?
Dr. Dro: It's been debunked ages upon ages ago.
Nothing is "debunked". It has been proven for decades that Nvidia in general offers lower quality, and there are many internet threads about it.
I have also been able to compare, and I can tell you that I do not feel comfortable when a GeForce card is connected to my screen.

Actually, it was proven multiple times.
Dota 2 showed a slight difference in image detail between the AMD and Nvidia graphics cards. When examining the grass and walls, there was a noticeable increase in detail on the AMD RX590. Additionally, the flag appeared more defined on the AMD card. Despite the difference in detail, the AMD graphics card performed lower than the Nvidia card in terms of frame rates.
www.toolify.ai/ai-news/amd-vs-nvidia-who-delivers-better-image-quality-78004


Dr. Dro: VRAM-starved AMD cards display the exact same symptoms
Here, I'd like to correct myself: yes, in some new games the Radeons do behave strangely and begin to not load textures.
The 7600 takes the lead over the 6650 XT at 1440p, but these results are somewhat skewed as the 7600 consistently had missing textures in this test. This is a common issue for all 8GB models, leading to inconsistent memory management and unreliable results.
The Last of Us poses problems for VRAM and 8GB graphics cards. At 1080p with ultra quality settings, the game appeared to render correctly, but frame time performance was noticeably worse for the 8GB cards. The RTX 3060, which has 12GB of VRAM, saw 1% lows nearly 40% greater than those of the 7600.
www.techspot.com/review/2686-amd-radeon-7600/
#25
Dr. Dro
3valatzy: No. Many people claim that they don't see the difference, either because they support Nvidia so much that they don't want to, or simply because they use old 1K screens, which are horrendous for image quality anyway.
The situation is such that some people even claim that they don't see a difference between 1K and 4K. How can we discuss anything more?
That's because there's no difference, not because "they can't tell". A decade-plus ago, Nvidia defaulted HDMI to a limited color range for compatibility reasons (deep color isn't supported on earlier HDMI TVs; even the PS3 has an option to toggle this) and this caused the image to look compressed, but this hasn't been the case for years upon years now. Nothing was "proven"; you don't feel comfortable when a GeForce is plugged into your screen due to personal bias, not because "the image looks worse".