
When will GPU prices return to normal?

Status
Not open for further replies.
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I am not looking at them - you brought them into the discussion.

The 6800 XT is double the performance of the 6600.
Uhm... Your memory seems to be failing. Someone compared its value to the RTX 3050, after which you jumped in with "it's still terrible value", which you "argued for" by quoting some silly 2160p benchmarks.

Also, the 6800 XT is more than double the price of the 6600. So, if it's more than double the price, for double the performance, that's actually worse value, no?
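
To put the arithmetic behind that in one place, here is a minimal sketch of the fps-per-dollar comparison being made; the FPS and price figures are placeholders assumed for illustration, not numbers from any review:

```python
# Minimal sketch of the "value = performance per dollar" argument above.
# The FPS and price figures are assumed placeholders, not review data.

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average frames per second delivered per dollar spent."""
    return avg_fps / price_usd

# Assumption: the 6800 XT at roughly double the 6600's performance,
# but at more than double its price.
rx6600 = fps_per_dollar(avg_fps=60.0, price_usd=300.0)      # 0.200 fps/$
rx6800xt = fps_per_dollar(avg_fps=120.0, price_usd=650.0)   # ~0.185 fps/$

print(f"RX 6600:    {rx6600:.3f} fps per dollar")
print(f"RX 6800 XT: {rx6800xt:.3f} fps per dollar")
# Double the performance at more than double the price comes out to
# fewer fps per dollar, i.e. worse value by this metric.
```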
 
Joined
Oct 29, 2019
Messages
466 (0.25/day)
While I mostly agree with you overall, Turing was a generation dominated by a stretching and shifting of product segments - with the "60 tier" stretching into not one, not two, but ... four? five? GPUs? The RTX-GTX split ensured that, with the sudden appearance of the 1660, 1660 Ti, 1660 Super, 2060, and 2060 Super. And arguably a sixth with the 2060 12GB, though that was much later. The 1660 slotted in at the traditional "60-tier" price level, while the 2060 was the one everyone actually noticed, as it was presented as far more attractive (and performed much better despite nominally sharing a tier). So things got confusing real fast there. One would think Ampere would smooth that out with GTX cards disappearing, but it sure doesn't seem that way.
Yeah, to be 100% honest, I completely forgot those cards even existed. Those cards, though, were still in the 300-350 euro range at launch, so that's way off from the "179-229$ price range" and "should cost no more than 200" nonsense he's spouting.
 
Joined
Feb 8, 2020
Messages
73 (0.04/day)
System Name Ryzen3950X
Processor AMD Ryzen 3950X
Motherboard AsRock X470 Taichi
Cooling Corsair H100x
Memory Team Group Dark Pro 4x8Gb 3200Mhz 14-14-14-31 :: testing Team 4133Mhz 18-18-18-38
Video Card(s) NVIDIA RTX-2070
Storage Samsung 960 250Gb M.2 plus a buttload of spinners
Display(s) Dell U2718Q 4K HDR
Case Fractal Design R6
Audio Device(s) Couple of speakers
Power Supply Seasonic 650W Gold
Mouse Logitech MX Ergo trackball
Keyboard Velocifier wireless mechanical
Software Arch kernel 5.4.17-1-MANJARO
Benchmark Scores Cinebench 20: 9316 / 511
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Uhm... Your memory seems to be failing. Someone compared its value to the RTX 3050, after which you jumped in with "it's still terrible value", which you "argued for" by quoting some silly 2160p benchmarks.

Also, the 6800 XT is more than double the price of the 6600. So, if it's more than double the price, for double the performance, that's actually worse value, no?

The 6800 XT is a better value. The 6600 is a piece of junk because it's also a badly designed product, and its price makes things much worse.

[attached: Control average-FPS charts from the two reviews linked below]

AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble - Control | TechPowerUp
MSI Radeon RX 6600 XT Gaming X Review - Control | TechPowerUp
 
Joined
Aug 14, 2013
Messages
2,373 (0.58/day)
System Name boomer--->zoomer not your typical millenial build
Processor i5-760 @ 3.8ghz + turbo ~goes wayyyyyyyyy fast cuz turboooooz~
Motherboard P55-GD80 ~best motherboard ever designed~
Cooling NH-D15 ~double stack thot twerk all day~
Memory 16GB Crucial Ballistix LP ~memory gone AWOL~
Video Card(s) MSI GTX 970 ~*~GOLDEN EDITION~*~ RAWRRRRRR
Storage 500GB Samsung 850 Evo (OS X, *nix), 128GB Samsung 840 Pro (W10 Pro), 1TB SpinPoint F3 ~best in class
Display(s) ASUS VW246H ~best 24" you've seen *FULL HD* *1O80PP* *SLAPS*~
Case FT02-W ~the W stands for white but it's brushed aluminum except for the disgusting ODD bays; *cries*
Audio Device(s) A LOT
Power Supply 850W EVGA SuperNova G2 ~hot fire like champagne~
Mouse CM Spawn ~cmcz R c00l seth mcfarlane darawss~
Keyboard CM QF Rapid - Browns ~fastrrr kees for fstr teens~
Software integrated into the chassis
Benchmark Scores 9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
You’re a halo user. The market isn’t there yet. The vast (vaaaaaast) majority of users don’t play at 4K and, even if they did, all GPUs are garbage (and that’s not entirely true, as you get much better performance margins, according to reviews, with their caveats, with more expensive products). All of your claims about the economics underlying the actual limits of technology for your xxxtreme requirements don’t have any actual foundations in how the economy is organized. You are designed to be fucked by the market. That’s consumer choice.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The 6800 XT is a better value. The 6600 is a piece of junk because it's also a badly designed product, and its price makes things much worse.

AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble - Control | TechPowerUp
MSI Radeon RX 6600 XT Gaming X Review - Control | TechPowerUp
You seem awfully fond of using Control as an illustration, why might that be? Oh, right, it's a major outlier that greatly exaggerates the performance difference between these two GPUs. Go figure. What you're saying here is pure nonsense. The 6600 is much better value than the 6800 XT.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
You’re a halo user. The market isn’t there yet. The vast (vaaaaaast) majority of users don’t play at 4K and, even if they did, all GPUs are garbage (and that’s not entirely true, as you get much better performance margins, according to reviews, with their caveats, with more expensive products). All of your claims about the economics underlying the actual limits of technology for your xxxtreme requirements don’t have any actual foundations in how the economy is organized. You are designed to be fucked by the market. That’s consumer choice.

That's not true. Every card above the Radeon RX 6800 (the 6800 XT, 6900 XT and 6950 XT, not to mention the new generation due in several weeks) supports 4K gaming perfectly fine.
It's just that the 6600/XT is heavily limited, be it by low VRAM, low shader performance, low PCIe bandwidth, etc.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
That's not true. Every card above the Radeon RX 6800 (the 6800 XT, 6900 XT and 6950 XT, not to mention the new generation due in several weeks) supports 4K gaming perfectly fine.
It's just that the 6600/XT is heavily limited, be it by low VRAM, low shader performance, low PCIe bandwidth, etc.
... or, maybe, it's the entirely common and normal fact that as you step down in price, you also step down in performance, and that 2160p is still unequivocally a high-end resolution? You keep going "waa, waa, the 6600 is terrible value because it doesn't game well at 2160p", yet ... value compared to what? There is nothing that provides a notably better value at that resolution, only things that deliver better absolute performance. Value is a function of price and performance. You're using the word 'value' in a way that just doesn't make sense. You want more than GPUs of that class can deliver. That's fine, but it doesn't make them poor value, it just makes them unsuited to your use. And, again, your examples are ludicrously obvious cherry-picking. The base RX 6600 does decently at 2160p for its class, even at stupid Ultra settings:

The 3060 is marginally faster, with the 3060 Ti pulling clearly ahead, just barely cracking 60fps average.

What does this tell us?
- That the 6600 can play 2160p perfectly fine as long as it's not forced to run at Ultra. If it does 40fps average at Ultra, it'll do >60 at medium-high easily. That doesn't mean it'll hit 60 in the outlier titles, but then, seeing how the 3070 didn't crack 30 in CP2077, that's hardly surprising.
- For its price, it's a decently competitive card even at 2160p despite RDNA2 not scaling that well to higher resolutions. You need to step up to the much more expensive 3060 Ti for a meaningful increase.
- The class of performance that you seem to be asking for is simply not where this card sits in the product stack, and it is priced accordingly. This makes your claims about it being poor value fall apart entirely.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
... or, maybe, it's the entirely common and normal fact that as you step down in price, you also step down in performance, and that 2160p is still unequivocally a high-end resolution? You keep going "waa, waa, the 6600 is terrible value because it doesn't game well at 2160p", yet ... value compared to what? There is nothing that provides a notably better value at that resolution, only things that deliver better absolute performance. Value is a function of price and performance. You're using the word 'value' in a way that just doesn't make sense. You want more than GPUs of that class can deliver. That's fine, but it doesn't make them poor value, it just makes them unsuited to your use. And, again, your examples are ludicrously obvious cherry-picking. The base RX 6600 does decently at 2160p for its class, even at stupid Ultra settings:

The 3060 is marginally faster, with the 3060 Ti pulling clearly ahead, just barely cracking 60fps average.

What does this tell us?
- That the 6600 can play 2160p perfectly fine as long as it's not forced to run at Ultra. If it does 40fps average at Ultra, it'll do >60 at medium-high easily. That doesn't mean it'll hit 60 in the outlier titles, but then, seeing how the 3070 didn't crack 30 in CP2077, that's hardly surprising.
- For its price, it's a decently competitive card even at 2160p despite RDNA2 not scaling that well to higher resolutions. You need to step up to the much more expensive 3060 Ti for a meaningful increase.
- The class of performance that you seem to be asking for is simply not where this card sits in the product stack, and it is priced accordingly. This makes your claims about it being poor value fall apart entirely.

Yeah, let's agree that the 6600/XT is a card for 1080p medium.
Which is pathetic for its current price range.
 
Joined
Dec 25, 2020
Messages
6,753 (4.72/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I'm too much of a GPU elitist to really chime in on this one, but Navi 23, while not really a 4K card, shouldn't put on too ugly a show at 1440p targeting 60 fps, IMO. The 6650 XT (fully enabled and enhanced core) might even handle some of the lighter games at 4K 60 just fine, I guess. The mobile RTX 3050 (so GA107 with 16 SMs out of 20 enabled), at 80W, generally does 1080p60 perfectly well in most games, the 4 GB VRAM being its real issue IMO. The RX 6600 can't really be worse than that.
 
Joined
Jul 15, 2020
Messages
1,021 (0.64/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
I'm still waiting for a GTX 970 level of perf/$$$
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Yeah, let's agree that the 6600/XT is a card for 1080p medium.
Which is pathetic for its current price range.
Like ... what? Man, your reality distortion field is strong, clearly.

First off: The 6600 and 6600 XT are not "a card", they are two distinct SKUs with quite distinct performance levels.

Second:

That's the 6600, not the XT. 1080p medium, you say? At what refresh rate, 240Hz? 'Cause that review suite quite clearly shows it delivering an average 114fps at Ultra. It's still well above 60fps average at 1440p.

You're talking out of your rear end here, and your claims about "value" are just plain ridiculous. Please just stop making a fool out of yourself.
 
Joined
Aug 21, 2015
Messages
1,723 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
I'm still waiting for a GTX 970 level of perf/$$$

That's a difficult metric to compare to at this point, since none of the benchmarks from the 970 launch are in use anymore. But here I go anyway. If we look at the 970's $330 launch price, the only card that matches right now (in the US) is the 6600 XT, which averages over 100fps (and at least 60) in TPU's test battery at 1080p, and nearly 100 at 1440 (most games above 60). It falls on its face a bit at 4K, but so did the 970. Power envelope is even similar, at 160W for the Radeon vs. the GeForce's 150W.

Granted, the 6600 XT isn't as close to the high-end cards as the 970 was in its day. But the ceiling has been raised by a TON over the last couple of generations. Top single-GPU dog that gen was the 980 ti, a 250W design. The 3080 ti by contrast pulls 350W, while a 3090 ti will suck down around 450W. If we limit ourselves to the TDP of the 980 ti, the highest current-gen performer is the 6800 on the AMD side and 3070 for Nvidia (yes, it's actually 220W, but the next-step-up 3070 Ti is 290W). By the TPU 980 ti FE review, the 970 was 25-30% behind overall. The 6600 XT is 25-50% down at 1080/1440 depending on whether you're comparing the 3070 or 6800; the 6800 pulls further ahead as resolution scales, and is actually double at 4K.

TPU wasn't doing their AVGfps chart back when Maxwell launched, so I quickly made my own, choosing 1440p because someone inevitably scoffs at 1080p and 4K isn't a reasonable target at this price point. The dashed line represents the ~99fps average of the 6600 XT @ 1440p. Looking at the two side-by-side now, I'd conclude that the 6600 XT is actually a better P/P card than the GTX 970.
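
For anyone who wants to throw together a similar chart themselves, here's a rough matplotlib sketch; the card list and FPS values are placeholders to be swapped for whatever review data you pull, with only the ~99 fps 6600 XT reference line taken from the description above:

```python
# Rough sketch of a 1440p average-FPS bar chart with a dashed reference
# line for the RX 6600 XT, similar to the one described above.
# The cards and FPS values here are placeholders, not review data.
import matplotlib.pyplot as plt

cards = ["GTX 960", "GTX 970", "GTX 980", "GTX 980 Ti"]
avg_fps_1440p = [30.0, 45.0, 52.0, 65.0]  # illustrative numbers only

fig, ax = plt.subplots(figsize=(8, 4))
ax.bar(cards, avg_fps_1440p, color="steelblue")
ax.axhline(99.0, linestyle="--", color="red",
           label="RX 6600 XT @ 1440p (~99 fps avg)")
ax.set_ylabel("Average FPS (1440p)")
ax.set_title("Maxwell-era cards vs. RX 6600 XT reference line")
ax.legend()
plt.tight_layout()
plt.show()
```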

[attached: 1440p average-FPS charts, with a dashed line marking the 6600 XT's ~99 fps average]


TL;DR: If you're looking for high-end-adjacent performance for $350 or less, that's probably never happening. If what you want is equivalent or better performance in current titles for that same money, AMD has a 6600 XT they'd love to sell you.

What about ray tracing? Can't it enable it?

If one searches for a reason to dislike something, one will always find it. The 6600 and XT suck by the metrics that matter to you. We get it. They're excellent cards for users with other priorities. Personally, I'd love to have either one.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
If one searches for a reason to dislike something, one will always find it. The 6600 and XT suck by the metrics that matter to you. We get it. They're excellent cards for users with other priorities. Personally, I'd love to have either one.

Actually, the problem that I see is the one that you mentioned - the gigantic performance difference between it and the higher end cards.

It is a very risky purchase without future proofing.
Once new games launch, you will observe under 50 FPS everywhere, regardless of the settings.

Granted, the 6600 XT isn't as close to the high-end cards as the 970 was in its day. But the ceiling has been raised by a TON over the last couple of generations. Top single-GPU dog that gen was the 980 ti, a 250W design. The 3080 ti by contrast pulls 350W, while a 3090 ti will suck down around 450W. If we limit ourselves to the TDP of the 980 ti, the highest current-gen performer is the 6800 on the AMD side and 3070 for Nvidia (yes, it's actually 220W, but the next-step-up 3070 Ti is 290W). By the TPU 980 ti FE review, the 970 was 25-30% behind overall. The 6600 XT is 25-50% down at 1080/1440 depending on whether you're comparing the 3070 or 6800; the 6800 pulls further ahead as resolution scales, and is actually double at 4K.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Actually, the problem that I see is the one that you mentioned - the gigantic performance difference between it and the higher end cards.
As the post you quoted said, you're looking - and very, very hard - for a reason to dislike this card. Is it further behind the top than the 970 was? Yes, because the 970 was a third-tier card, while the 6600 is 6th (if you count only the first round of RDNA2 SKUs) on AMD's tier list. This has been talked about for years: that as more resolutions become useable, the range of what is "usable" performance widens considerably, and necessitates an increasing number of SKUs - especially as production costs also rise as we come ever closer to running into various production/engineering/lithography walls that have yet to be worked around. Does the existence of many more SKUs make the 6600 a worse deal? Not at all, as those SKUs are much more expensive, and much worse value, even if they're also faster. The 6600, let alone the XT, is a great deal, delivering excellent value for money, great performance at 1080p and 1440p, and nothing of what you're saying comes even close to being an argument against that, let alone being a convincing one.
It is a very risky purchase without future proofing.
No GPU is future proof in any way, shape or form. This is nonsense.
Once new games launch, you will observe under 50 FPS everywhere, regardless of the settings.
As has been the case with every GPU ever made. As time passes, its ability to keep up with new launches will diminish. There is no reason to expect the 6600 or 6600 XT to be outliers in this regard.
You are trying really, really, really hard here, so I guess kudos for the effort if nothing else? Sadly it isn't working though. You can't expect passable RT performance at this price level. I mean, you could buy a 3050 - which costs more in many places - and get ... let's see:

Oh, right, it's worse than both the 6600 and 6600 XT. So, to get passable 1080p RT performance in that title, you need a 6700 XT or 3060 Ti - both of which are also much more expensive. So ... the value proposition is still there if you have to pay more to get more, no?

Are you perhaps seeing a pattern here? Something like "if you want more, you pay more"? 'Cause that's what your examples are illustrating - no matter how much work you put into picking as selectively as possible or framing them in extremely biased ways. Also, I thought you didn't care about anything but 2160p? So why are you looking at 1080p testing, all of a sudden? Oh, right, you're trying desperately to cherry-pick a defense of your ludicrous "this is a 1080p medium card" stance, right. 'Cause when you said that, what you meant was "this card can't handle RT". Makes perfect sense.
 
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
The recent back-and-forth conversation in this thread between two TPU forum participants is a good illustration of several points.

You can typically make a stance (about graphics cards) by cherry picking through various benchmark results. That's one reason why I like 10-game, 15-game, 20-game average scores. While I never play all of the games, the averages do even out any particular game's preferred architecture or idiosyncrasies.
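
To make the averaging point concrete, here's a tiny sketch with invented per-game numbers showing how a single favourable title can exaggerate a gap that a wide average flattens out:

```python
# Sketch: one outlier title can make card A look twice as fast, while an
# average over more games shrinks the gap. All FPS numbers are invented.
from statistics import geometric_mean

card_a = {"Game 1": 70, "Game 2": 65, "Game 3": 72, "Outlier title": 120}
card_b = {"Game 1": 75, "Game 2": 70, "Game 3": 74, "Outlier title": 60}

print("Outlier title only:", card_a["Outlier title"], "vs", card_b["Outlier title"])  # 2x gap
# A geometric mean keeps any single title from dominating the comparison.
print("Geo mean, card A:", round(geometric_mean(card_a.values()), 1))  # ~79
print("Geo mean, card B:", round(geometric_mean(card_b.values()), 1))  # ~70
# The 2x single-title lead shrinks to roughly 14% once more games are averaged.
```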

This leads us to another point: most people play more than one game over the ownership of a given gaming device. It's possible to build a pure play PC optimized for a single game (like Hollow Knight) but that's not a real world usage case for Joe Consumer or even the typical DIY PC Builder.

There are details that these game average FPS scores don't always reveal in an obvious way.

One example is display resolution. One manufacturer's cards don't do so well at higher resolutions compared to the equivalent competition.

Ray tracing performance is superior in one. DLSS is available in one. It's important to note that graphics card reviews generally run separate tests for these two features.

From a pure rasterization standpoint, AMD cards offer a better value (performance per dollar), particularly at lower gameplay resolutions. However, if you play games that use ray tracing (more are added as time goes on) and take advantage of DLSS (same), there are benefits to Nvidia cards that AMD's current lineup doesn't offer.

There are other weird little features that might favor an Ampere card. I happen to use Nvidia Broadcast for cleaning up live audio and video. The Tensor cores apparently do most of the heavy lifting here. While this undoubtedly is not in everyone's usage case, it is a real world task that my recent GeForce cards can offer.

In the end, the best strategy is to buy from a reputable merchant with a reasonable return policy and to use the card heavily in its expected usage situations during that return window to determine whether or not the product works for your specific needs.

I don't play Cyberpunk 2077 so I really don't care about cards that perform exceptionally well with that title. That's why the aggregate average game scores are more useful than a single game comparison. I do play Control but I'm certainly not going to base a graphics card purchase on that one game benchmark.
 
Joined
May 2, 2022
Messages
1,624 (1.73/day)
Location
G-City, UK
System Name AMDWeapon
Processor Ryzen 7 7800X3D
Motherboard X670E MSI Tomahawk WiFi
Cooling Thermalright Peerless Assassin 120 ARGB with Silverstone Air Blazer 2200rpm fans
Memory G-Skill Trident Z Neo RGB 6000 CL30 32GB@EXPO
Video Card(s) Powercolor 7900 GRE Red Devil
Storage Samsung 870 QVO 1TB x 2, Lexar 256 GB, TeamGroup MP44L 2TB, Crucial T700 1TB, Seagate Firecuda 2TB
Display(s) 32" LG UltraGear GN600-B
Case Montech 903 MAX AIR
Audio Device(s) Corsair void wireless/Sennheiser EPOS 670
Power Supply MSI MPG AGF 850 watt gold
Mouse Glorious Model D l Pad GameSir G7 SE
Keyboard Redragon Vara K551P
Software Windows 11 Pro 24H2
Benchmark Scores Fast Enough.
When GPU prices return to normal I shall be 84ish.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
A report from graphics card channel dealers published by MyDrivers states that ASUS, Gigabyte, and MSI are having a difficult time trying to convince retailers and distributors to buy Radeon RX 6000 series cards for sale in the consumer segment. The reason: just like NVIDIA's GPUs, which saw a huge price jump during the mining boom, these cards' pricing has now plummeted heavily. AMD's Radeon RX 6000 series graphics card prices have fallen even harder than NVIDIA's, and there is little to no demand for gaming cards right now.

GPU Price Crash Is Making It Hard For AIBs To Offload AMD Radeon Graphics Cards Too, RX 6700 XT Drops Below $400 US, RX 6600 Below $260 US (wccftech.com)
 
Joined
Aug 21, 2015
Messages
1,723 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Actually, the problem that I see is the one that you mentioned - the gigantic performance difference between it and the higher end cards.

It is a very risky purchase without future proofing.
Once new games launch, you will observe under 50 FPS everywhere, regardless of the settings.

I agree that there's a problem, but disagree on what that problem is. Both manufacturers have been raising the bar on what's considered high-end by pushing power consumption. I'm personally of a mind that 300W+ for graphics is absurd. This particular arms race has gotten way out of hand. The 980 ti (since I was already talking about it above), the biggest, baddest card of its day that wasn't two cards bolted to one board, launched at $650. Its successor stuck with the same power envelope, wiped the floor with it performance-wise, and asked $700. Today, that power envelope will get you another 50% performance on top of that in a 6800 or 3070, and cost you $600-700. Price/performance isn't the issue, IMO. It's expectations.
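
To put rough numbers on the "same envelope, much more performance" point, here's a small sketch; the performance indices, wattages, and prices are loose approximations of the figures discussed above, not measured results:

```python
# Sketch of the "same ~250 W envelope, far more performance" comparison.
# Performance is indexed to GTX 980 Ti = 100; all figures are rough
# assumptions for illustration, not benchmark results.

cards = {
    # name: (relative_performance, tdp_watts, launch_price_usd)
    "GTX 980 Ti (2015)": (100, 250, 650),
    "GTX 1080 Ti (2017)": (165, 250, 700),
    "RX 6800 / RTX 3070 class (2020)": (250, 250, 650),
}

for name, (perf, watts, price) in cards.items():
    print(f"{name:32s} perf/W: {perf / watts:.2f}   perf/$: {perf / price:.2f}")

# Each step keeps roughly the same power envelope while delivering markedly
# more performance per watt and per dollar, which is the expectations point above.
```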
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
...and? Is this supposed to support your claim that the 6600 (and XT) is poor value? 'Cause "no demand" is not the same as "everyone thinks things are too expensive". After all, a huge portion of the glut of products now is due to overproduction after a period of unprecedented demand, which is now being followed by (increased) recession and economic anxiety across much of the wealthier parts of the world.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
...and? Is this supposed to support your claim that the 6600 (and XT) is poor value? 'Cause "no demand" is not the same as "everyone thinks things are too expensive".

There is no bad product, there is a bad price.

Give me the Radeon RX 6800 XT for 300 euro, I will create that demand right now and buy the card.
 
Joined
Dec 25, 2020
Messages
6,753 (4.72/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
There is no bad product, there is a bad price.

Give me the Radeon RX 6800 XT for 300 euro, I will create that demand right now and buy the card.

I want one at these charity prices too, big guy. Get in line, I'll flash my cred as a Vanguard beta tester group member and say I need it more than you :eek:

The recent back-and-forth conversation in this thread between two TPU forum participants is a good illustration of several points.

You can typically make a stance (about graphics cards) by cherry picking through various benchmark results. That's one reason why I like 10-game, 15-game, 20-game average scores. While I never play all of the games, the averages do even out any particular game's preferred architecture or idiosyncrasies.

This leads us to another point: most people play more than one game over the ownership of a given gaming device. It's possible to build a pure play PC optimized for a single game (like Hollow Knight) but that's not a real world usage case for Joe Consumer or even the typical DIY PC Builder.

There are details that these game average FPS scores don't always reveal in an obvious way.

One example is display resolution. One manufacturer's cards don't do so well at higher resolutions compared to the equivalent competition.

Ray tracing performance is superior in one. DLSS is available in one. It's important to note that graphics card reviews generally run separate tests for these two features.

From a pure rasterization standpoint, AMD cards offer a better value (performance per dollar), particularly at lower gameplay resolutions. However, if you play games that use ray tracing (more are added as time goes on) and take advantage of DLSS (same), there are benefits to Nvidia cards that AMD's current lineup doesn't offer.

There are other weird little features that might favor an Ampere card. I happen to use Nvidia Broadcast for cleaning up live audio and video. The Tensor cores apparently do most of the heavy lifting here. While this undoubtedly is not in everyone's usage case, it is a real world task that my recent GeForce cards can offer.

In the end, the best strategy is to buy from a reputable merchant with a reasonable return policy and to use the card heavily in its expected usage situations during that return window to determine whether or not the product works for your specific needs.

I don't play Cyberpunk 2077 so I really don't care about cards that perform exceptionally well with that title. That's why the aggregate average game scores are more useful than a single game comparison. I do play Control but I'm certainly not going to base a graphics card purchase on that one game benchmark.

I agree. At the end of the day, having an abundance of options to pick from is an excellent thing. There is always a product that will fit your personal needs best.
 
Joined
Oct 21, 2005
Messages
7,061 (1.01/day)
Location
USA
System Name Computer of Theseus
Processor Intel i9-12900KS: 50x Pcore multi @ 1.18Vcore (target 1.275V -100mv offset)
Motherboard EVGA Z690 Classified
Cooling Noctua NH-D15S, 2xThermalRight TY-143, 4xNoctua NF-A12x25,3xNF-A12x15, 2xAquacomputer Splitty9Active
Memory G-Skill Trident Z5 (32GB) DDR5-6000 C36 F5-6000J3636F16GX2-TZ5RK
Video Card(s) ASUS PROART RTX 4070 Ti-Super OC 16GB, 2670MHz, 0.93V
Storage 1x Samsung 970 Pro 512GB NVMe (OS), 2x Samsung 970 Evo Plus 2TB (data), ASUS BW-16D1HT (BluRay)
Display(s) Dell S3220DGF 32" 2560x1440 165Hz Primary, Dell P2017H 19.5" 1600x900 Secondary, Ergotron LX arms.
Case Lian Li O11 Air Mini
Audio Device(s) Audiotechnica ATR2100X-USB, El Gato Wave XLR Mic Preamp, ATH M50X Headphones, Behringer 302USB Mixer
Power Supply Super Flower Leadex Platinum SE 1000W 80+ Platinum White, MODDIY 12VHPWR Cable
Mouse Zowie EC3-C
Keyboard Vortex Multix 87 Winter TKL (Gateron G Pro Yellow)
Software Win 10 LTSC 21H2
With the decrease in crypto, a lot of garbage ex-miner cards are being sold on eBay right now; most are labeled honestly as "nonfunctional", "parts", or "repair". With that said, a good portion of the "used" and "refurbished" cards are probably nonfunctional as well.
 