
When will GPU prices return to normal?

Status: Not open for further replies.

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
This is exactly why, barring a > $50 price difference in AMD vs Nvidia, I'd personally still go with Nvidia.

And nothing in the midrange is priced reasonably, at all.

Given that this is a 2-year product cycle, these generations represent the last 4 years of the midrange, and it is not impressive at all. With a 2-year product cycle, we should be seeing 40% uplifts, the way the high-end GPUs are.

But for halo products, they just keep making higher and higher-end models, while the midrange kind of languishes. What do we get next, the 4950XT Ti Super Titan?

And yes I mixed up AMD/Nvidia naming convention intentionally.



The reason for this difference between Turing and Ampere is that all Turing dies were very large:

RTX 2080 Ti - TU102 - 754 sq. mm.
RTX 2080 - TU104 - 545 sq. mm.
RTX 2060 - TU106 - 445 sq. mm.

While with the RTX 3060, Nvidia tried to return to more normal die sizes:

RTX 3060 - GA106 - 276 sq. mm.

GA106 is essentially a die shrink of TU106.
When there is only a direct shrink, there is little performance improvement to be had.
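As a quick sanity check on the die sizes listed above, the area ratio can be computed directly (a back-of-the-envelope sketch; the sizes are the ones quoted in this post, not independently verified):

```python
# Die areas in sq. mm, as listed above.
dies = {
    "TU102 (RTX 2080 Ti)": 754,
    "TU104 (RTX 2080)": 545,
    "TU106 (RTX 2060)": 445,
    "GA106 (RTX 3060)": 276,
}

# GA106's area relative to the TU106 it effectively shrinks.
ratio = dies["GA106 (RTX 3060)"] / dies["TU106 (RTX 2060)"]
print(f"GA106 is {ratio:.0%} of TU106's area "
      f"(a {1 - ratio:.0%} reduction from the process move)")
```

That is roughly a 38% area reduction for a similar shader layout, which is the sense in which GA106 is "a die shrink" of TU106.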
 

Count von Schwalbe

Moderator
Staff member
Joined
Nov 15, 2021
Messages
3,087 (2.78/day)
Location
Knoxville, TN, USA
System Name Work Computer | Unfinished Computer
Processor Core i7-6700 | Ryzen 5 5600X
Motherboard Dell Q170 | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | RTX 2080 Ti FE
Storage Crucial BX500 2TB | TBD
Display(s) 3x LG QHD 32" GSM5B96 | TBD
Case Dell | Heavily Modified Phanteks P400
Power Supply Dell TFX Non-standard | EVGA BQ 650W
Mouse Monster No-Name $7 Gaming Mouse| TBD
FHD vs QHD vs 4K (or UHD) makes the most sense.
$700 for a GPU should get you into 2160p territory just fine. Just don't set your settings to Ultra - that's just stupid anyhow. The next setting down is typically visually indistinguishable yet performs far, far better. Reviews tend to test at Ultra because they want to test worst-case scenarios for proper stress testing, but gaming at Ultra makes no sense.
Yes!

Also, you would be better off getting a better monitor at a lower resolution if buying a midrange GPU.


GA106 is essentially a die shrink of TU106.
When there is only a direct shrink, there is little performance improvement to be had.
Small performance bump from clock speeds, usually. However, GA106 has 3840 shading units vs. the 2304 of TU106. The 3060 has around 50% more shading units than the 2060.

What I can't understand is why the GPU manufacturers are not more focused on improvements to the highest volume markets - the 60-70 class units. Mindshare/press coverage maybe?
 

ARF

Small performance bump from clock speeds, usually. However, GA106 has 3840 shading units vs. the 2304 of TU106. The 3060 has around 50% more shading units than the 2060.

What I can't understand is why the GPU manufacturers are not more focused on improvements to the highest volume markets - the 60-70 class units. Mindshare/press coverage maybe?

If I'm not mistaken, two Ampere shaders are equal to one Turing shader. They were made to be smaller and lower-performing.
 
Joined
Aug 21, 2015
Messages
1,725 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
FHD vs QHD vs 4K (or UHD) makes the most sense.

Yes!

Also, you would be better off getting a better monitor at a lower resolution if buying a midrange GPU.



Small performance bump from clock speeds, usually. However, GA106 has 3840 shading units vs. the 2304 of TU106. The 3060 has around 50% more shading units than the 2060.

What I can't understand is why the GPU manufacturers are not more focused on improvements to the highest volume markets - the 60-70 class units. Mindshare/press coverage maybe?

My hypothesis, generated on the fly during a conversation with a friend yesterday, is the confluence of high die yields with strong demand for high performance cards. Back in The Day(tm), the number of customers willing to pay top dollar for the top cards was limited. Chipmakers would resort (allegedly?) to fusing off fully-functioning dice to have chips for cards lower in the stack. Then the PC gaming resurgence came around, coupled with the crypto boom. All of a sudden there was functionally unlimited demand for the highest performing chip either manufacturer could muster, even beyond the traditional one-upsmanship. From AMD/Nvidia's perspective, that doesn't leave much incentive to drive value in the midrange. AMD was the champion there in recent years because they didn't have anything that could compete with Nvidia in the high end, so needed to lean on the value play. Then the above happened, coupled with legit advancements of their own, and that wasn't necessary anymore.
 

ARF

My hypothesis, generated on the fly during a conversation with a friend yesterday, is the confluence of high die yields with strong demand for high performance cards. Back in The Day(tm), the number of customers willing to pay top dollar for the top cards was limited. Chipmakers would resort (allegedly?) to fusing off fully-functioning dice to have chips for cards lower in the stack. Then the PC gaming resurgence came around, coupled with the crypto boom. All of a sudden there was functionally unlimited demand for the highest performing chip either manufacturer could muster, even beyond the traditional one-upsmanship. From AMD/Nvidia's perspective, that doesn't leave much incentive to drive value in the midrange. AMD was the champion there in recent years because they didn't have anything that could compete with Nvidia in the high end, so needed to lean on the value play. Then the above happened, coupled with legit advancements of their own, and that wasn't necessary anymore.

I disagree. We have a duopoly and AMD didn't master anything at all. We see the same old performance from AMD and no improvement at all.
I mean, look at the 5500 XT (2019) -> 6500 XT (2022) transition. 0% progress. None!
It even looks like AMD intentionally stagnates the market, so that no "average Joe" can ever upgrade from the ancient 1080p monitor.
AMD's market share is still a low 20%, and there are absolutely no signs of it improving.

Please show the card sales - how many low-end cards were sold, how many mid-range, and how many high-end?
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
If I'm not mistaken, two Ampere shaders are equal to one Turing shader. They were made to be smaller and lower-performing.
That isn't the case. Turing introduced separate int32 execution units. Pascal executed int32 operations on the fp32 units. Ampere added 1 fp32 unit for each int32 unit. So the resources per SMX look like this:

                  | Pascal              | Turing                         | Ampere
Units             | 64 fp32             | 64 fp32 + 64 int32             | 128 fp32 + 64 int32
Max ops per clock | 64 fp32 or 64 int32 | 64 fp32, or 32 fp32 + 32 int32 | 128 fp32, or 64 fp32 + 64 int32

Keep in mind that each SMX in Pascal and Turing is limited to 64 operations per clock. However, each SMX in Ampere can execute 128 operations per clock. In practice, the difference isn't that great. Let's use the example of a game that has a mix of 2 fp32 ops to 1 int32 op. I'm using this example, because this is what Nvidia used to illustrate Turing's improvements over Pascal.

Over the course of 6 instructions, the 3 architectures will execute them like this:

Cycle | Pascal  | Turing           | Ampere
1     | 1 int32 | 1 fp32 + 1 int32 | 1 fp32 + 1 int32
2     | 1 fp32  | 1 fp32 + 1 int32 | 1 fp32 + 1 int32
3     | 1 fp32  | 1 fp32           | 2 fp32
4     | 1 int32 | 1 fp32           | -
5     | 1 fp32  | -                | -
6     | 1 fp32  | -                | -

Thus Ampere, despite having twice the fp32 throughput of Turing, is limited to a 4/3, or 33%, increase in performance per SMX. On top of that, shader throughput isn't the only factor affecting frame times. Still, since Nvidia increased fixed-function resources as well (96 ROPs vs. 64, for instance), this corresponds pretty closely to the relative performance of the RTX 2080 and 3070, which have the same SMX count.
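The cycle-by-cycle comparison above can be sketched in a few lines (this is my toy reading of the tables in this post, not an official NVIDIA scheduling model; the per-cycle limits are the 1-2 op "toy units" used in the cycle table):

```python
# Toy issue model: each cycle, an SMX issues ops up to its per-type and
# total limits; count the cycles needed for a given fp32:int32 workload.
def cycles_needed(fp32_ops, int32_ops, arch):
    # (max fp32/cycle, max int32/cycle, max total ops/cycle), in the
    # toy units of the cycle table above.
    limits = {
        "pascal": (1, 1, 1),  # shared units: one op total per cycle
        "turing": (1, 1, 2),  # 1 fp32 and 1 int32 concurrently
        "ampere": (2, 1, 2),  # 2 fp32, or 1 fp32 + 1 int32
    }
    max_fp, max_int, max_total = limits[arch]
    cycles = 0
    while fp32_ops > 0 or int32_ops > 0:
        cycles += 1
        i = min(int32_ops, max_int)
        f = min(fp32_ops, max_fp, max_total - i)
        fp32_ops -= f
        int32_ops -= i
    return cycles

# The 2:1 fp32:int32 example from the post: 4 fp32 + 2 int32 ops.
for arch in ("pascal", "turing", "ampere"):
    print(arch, cycles_needed(4, 2, arch))  # 6, 4, and 3 cycles
```

This reproduces the 6/4/3-cycle result of the table, and hence the 4/3 (33%) Turing-to-Ampere speedup for this instruction mix.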

 
Joined
Aug 21, 2015
I disagree. We have a duopoly and AMD didn't master anything at all. We see the same old performance from AMD and no improvement at all.
I mean, look at the 5500 XT (2019) -> 6500 XT (2022) transition. 0% progress. None!

The 6500 XT is a joke for sure. That's well-established. By contrast, the 6600 is faster than the 5700 (at ~50W less power) and only a bit behind the 5700 XT, while the 6600 XT and 6650 XT are faster. The 5700 XT was the top AMD card of its generation, and they've got 5 more cards above it now.

It even looks like AMD intentionally stagnates the market, so that no "average Joe" can ever upgrade from the ancient 1080p monitor.

That's an... interesting claim. What reason would they have to do this? If anything, I'd think it'd be the opposite; driving higher resolutions so consumers feel the need to buy the more capable, more profitable cards. After all, they have products in that segment now.

AMD's market share is still a low 20%, and there are absolutely no signs of it improving.

Yeah, they've struggled with this for over a decade. Nvidia's in a very entrenched position of strength.

Please show the card sales - how many low-end cards were sold, how many mid-range, and how many high-end?

That is data I don't have, but is it relevant to this discussion?

Rewinding a bit, I didn't claim that AMD mastered anything, but that they were "champions" of value-for-money vs. Nvidia's midrange in recent history. R9 380 vs. GTX 960, RX 480/580 against GTX 1060. In both those cases, one generally got more perf for less $ (but more watts) with the AMD solution. I wasn't paying attention to the 5000 series, but right now we have the RX 6600 (yes, I'm bringing it up yet again) conclusively beating the RTX 3050 for significantly less money. The caveat there is that those aren't supposed to be competing against each other, but they are because the 3050 is way, way overpriced.

So, I understand your frustration with the lack of progress in the low-to-midrange. I'm frustrated too (more with pricing than anything), but I also don't think it's as bad (6400/6500 XT excepted) as you're making it out to be.
 
Joined
Sep 6, 2022
Messages
49 (0.06/day)
Location
Sweden
System Name Daedalus VI
Processor Intel Core i7 3770K @ 4.5GHz (1.368 V)
Motherboard MSI Z77A-G43
Cooling Arctic Alpine 11 Pro
Memory 32GB DDR3 1600MHz CL9
Video Card(s) nVidia Zotac GTX 1080 Ti Mini
Storage Western Digital Caviar Blue 1TB
Display(s) Samsung 27'' QLED 1440p @ 144Hz
Case Cooler Master Elite 333
Audio Device(s) Realtek ALC892 / Logitech 4W speakers + 12W subwoofer
Power Supply Corsair 750W 80+ Gold
Mouse Logitech G203 Lightsync
Keyboard Qpad MK-95
Software Microsoft Windows 7 Professional 64-bit with Service Pack 1
$700 for a GPU should get you into 2160p territory just fine. Just don't set your settings to Ultra - that's just stupid anyhow. The next setting down is typically visually indistinguishable yet performs far, far better. Reviews tend to test at Ultra because they want to test worst-case scenarios for proper stress testing, but gaming at Ultra makes no sense.

Also, this is a bit pedantic, but please stop calling 1440p "2K". A bit of a pet peeve, but this really drives me up the wall. The only resolution called 2K is DCI 2K, a cinema resolution that's 2048x1080 pixels (essentially a slightly wider 1080p). 1440p is 1440p, and there is no "XK" numbering for it, seeing how that numbering comes from the realm of TV sales and marketing, where 1440p has never existed.
1440p is 2.5K. :p
 
Joined
Oct 15, 2011
Messages
2,418 (0.50/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2

Count von Schwalbe

My hypothesis, generated on the fly during a conversation with a friend yesterday, is the confluence of high die yields with strong demand for high performance cards. Back in The Day(tm), the number of customers willing to pay top dollar for the top cards was limited. Chipmakers would resort (allegedly?) to fusing off fully-functioning dice to have chips for cards lower in the stack. Then the PC gaming resurgence came around, coupled with the crypto boom. All of a sudden there was functionally unlimited demand for the highest performing chip either manufacturer could muster, even beyond the traditional one-upsmanship. From AMD/Nvidia's perspective, that doesn't leave much incentive to drive value in the midrange. AMD was the champion there in recent years because they didn't have anything that could compete with Nvidia in the high end, so needed to lean on the value play. Then the above happened, coupled with legit advancements of their own, and that wasn't necessary anymore.
Yes, but remember, not all cards use the same die. So, if both cost and profit are based on die size, at least one mid-range card could be focused on (full Navi 23 or GA106, or even smaller).
 
Joined
Aug 21, 2015
Yes, but remember, not all cards use the same die. So, if both cost and profit are based on die size, at least one mid-range card could be focused on (full Navi 23 or GA106, or even smaller).

This is true. I was remembering cases like the GTX 970 and 980. Both were GM204, but the 980 cost over two hundred more bones. Did Nvidia actually gimp fully-functional GM204 dice that could have been used for 980s because there wasn't enough demand for those, but plenty for the 970? I don't know. It wouldn't surprise me.
 

Count von Schwalbe

This is true. I was remembering cases like the GTX 970 and 980. Both were GM204, but the 980 cost over two hundred more bones. Did Nvidia actually gimp fully-functional GM204 dice that could have been used for 980s because there wasn't enough demand for those, but plenty for the 970? I don't know. It wouldn't surprise me.
These thoughts make me want to make tables and charts.
 
Joined
Dec 1, 2019
Messages
103 (0.06/day)
System Name even lower budget then an old Optiplex :)
Processor Intel Xeon E5-2643 v3 - 3.4GHz
Motherboard Lenovo Orion - X99 chipset
Cooling ID Cooling SE-914-XT
Memory 16GB Micron DDR4 2400MHz
Video Card(s) Nvidia Quadro K2200 4GB GDDR5
Storage 250GB/512GB SSDs - 3TB HDD
Display(s) Samsung UN32EH4003 32" TV/Monitor 1366x768
Case Fractal Design Pop Mini Air
Audio Device(s) Onboard Realtek ALC662
Power Supply EVGA SuperNOVA 550 GA
Mouse MS Wheel Mouse Optical 1.1A USB
Keyboard Lemokey X1 red switches
Software Windows 10 22H2
As a note of interest, I've been watching this Sapphire Pulse RX 6600's price, as a friend needs an upgrade from an RX 470 that isn't quite cutting it in newer games on his 5760x1080 Eyefinity setup. The card went to $259.99 with free shipping in early August, was raised to an insane $279.99 + $9.99 shipping in early September, then dropped back to $259.99 with free shipping a few days ago. Looking at the sold history, these are barely moving even at $259.99: the last one sold on August 24th, and finally another on September 9th. This is from Newegg in the USA.


If watching this particular card is any indication: 1. The midrange cards are still too expensive, especially in these deteriorating economic conditions. 2. People really are voting with their wallets and not buying at these prices.

I keep reading articles around the internet about GPU prices being lowered because AMD and Nvidia want to move product, but I sure can't say I'm seeing any of it watching prices on midrange cards like the RX 6600, and even worse the RTX 3050, which is still selling for $300+ for a card that's slower than the RX 6600. I wonder when these midrange cards will get real discounts that will move them off the shelves?
 

Appalachian

New Member
Joined
Sep 3, 2022
Messages
24 (0.03/day)
Location
East Tennessee
As a note of interest, I've been watching this Sapphire Pulse RX 6600's price, as a friend needs an upgrade from an RX 470 that isn't quite cutting it in newer games on his 5760x1080 Eyefinity setup. The card went to $259.99 with free shipping in early August, was raised to an insane $279.99 + $9.99 shipping in early September, then dropped back to $259.99 with free shipping a few days ago. Looking at the sold history, these are barely moving even at $259.99: the last one sold on August 24th, and finally another on September 9th. This is from Newegg in the USA.


If watching this particular card is any indication: 1. The midrange cards are still too expensive, especially in these deteriorating economic conditions. 2. People really are voting with their wallets and not buying at these prices.

I keep reading articles around the internet about GPU prices being lowered because AMD and Nvidia want to move product, but I sure can't say I'm seeing any of it watching prices on midrange cards like the RX 6600, and even worse the RTX 3050, which is still selling for $300+ for a card that's slower than the RX 6600. I wonder when these midrange cards will get real discounts that will move them off the shelves?
I had read speculation that the higher-tier cards are the ones getting the better discounts because those are the ones coming out first. I've been watching 3070s, 80s, 90s, and the 6900 XT. There have been some pretty modest deals the last few weeks on all of them. I've also been watching the 3060 Ti and it hasn't budged, so it seems this may be the case. Of course, the higher-tier cards' prices were already so bloated that they have a lot more room to give in pricing.
 
Joined
Oct 15, 2011
Joined
Sep 6, 2022
This is true. I was remembering cases like the GTX 970 and 980. Both were GM204, but the 980 cost over two hundred more bones. Did Nvidia actually gimp fully-functional GM204 dice that could have been used for 980s because there wasn't enough demand for those, but plenty for the 970? I don't know. It wouldn't surprise me.
The 980 Ti sold better than the 980, i.e. the consumer isn't fooled that easily, and the 5700 XT 50th Anniversary Edition sold miserably as well, with a 0.04% market share at its peak.
 

ARF

RX 5700 XT was the worst. It was designed during a difficult period for AMD, in which the company faced a real threat of going bankrupt - its stock price was around $1 - and AMD had no money to design a big Navi 1 chip to fill the empty high-end niche.
 
Joined
Sep 6, 2022
RX 5700 XT was the worst. It was designed during a difficult period for AMD, in which the company faced a real threat of going bankrupt - its stock price was around $1 - and AMD had no money to design a big Navi 1 chip to fill the empty high-end niche.
Weren't the Zen 1 CPUs making them money?
 

ARF

Weren't the Zen 1 CPUs making them money?

Yes, but later. The RX 5700 XT launched in mid-2019, while the design stages take at least 4 years, so its design began around 2015, two years before Zen.
 
Joined
Sep 6, 2022
Yes, but later. The RX 5700 XT launched in mid-2019, while the design stages take at least 4 years, so its design began around 2015, two years before Zen.
Ah, I see.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
People really are voting with their wallets and not buying at these prices.
That is a wildly myopic analysis. People aren't "voting with their wallets" - six months ago, hordes of people were throwing whatever money they had at any available GPU. Calling the current reversal "voting with their wallets" is in such blatant conflict with recent history that it makes no sense. People are either already using a relatively new GPU, or they have been objecting to GPU pricing for years and abstaining from upgrading. (Or, like the vast majority most likely, they just don't care that much about computers and just want to play games.)

But that just goes to show how utterly useless the concept of "voting with your wallet" is - because if that's the case, then this same group has likely been doing the same for the entire GPU shortage, and not a single person noticed or cared, because every single GPU made, no matter the price, would sell. The only exception is people who previously had the means to buy a new GPU, couldn't due to the shortage, and have in the meantime lost the economic security enabling them to do so. Even with the current economic insecurity, that is not a large group.

What we are seeing is a combination of oversupply, a saturated market, a well stocked used market, and economic insecurity. Reducing that to "people are voting with their wallets" is incredibly reductive.
I disagree. We have a duopoly and AMD didn't master anything at all. We see the same old performance from AMD and no improvement at all.
I mean, look at the 5500 XT (2019) -> 6500 XT (2022) transition. 0% progress. None!
It even looks like AMD intentionally stagnates the market, so that no "average Joe" can ever upgrade from the ancient 1080p monitor.
AMD's market share is still a low 20%, and there are absolutely no signs of it improving.
You always have some rather interesting responses and analyses, and this is definitely one of them. First off, did anyone say AMD had "mastered" anything? Second, the RX 6500 XT is well established to be a very weirdly specced and wholly underwhelming GPU that makes zero sense. It definitely shows that someone at AMD has some rather bad ideas - but crucially, it is not representative of the RDNA2 generation whatsoever. It is a clear and remarkable outlier. It does indeed show no scaling from the RX 5500 XT - but on the other hand, the RX 6600 (non-XT) outperforms the RX 5700 at 1080p; the 6600 XT outperforms the 5700 XT at 1080p and the 5700 at 1440p; and the RX 6700 XT delivers a very significant 25-32% (depending on resolution) jump over the 5700 XT.

Also, you're selectively focusing only on naming tiers rather than pricing - which is, after all, what matters more to most buyers. It doesn't matter if the GPU you can afford is in a 4, 5, 6, 7, or 8 tier - if it's what you can afford, that's the fastest GPU you're getting. Of course, in a way, you're right - there has been massive stagnation this generation. However, singling out AMD for this makes no sense. Nvidia has been identical to AMD in this regard, delivering like-for-like price and performance increases over previous generations. Blaming the current value stagnation in GPUs on AMD in that context is just nonsensical - both AMD and Nvidia are perfectly equal in this regard for this generation. And, IMO, Nvidia as the market leader has a lot more clout and a lot more of an impact with its decisions, and thus has had far more of a say in where price/performance has ended up. Yes, AMD could absolutely have undercut them as it has previously, and one could argue that AMD should have done so. However, that doesn't absolve Nvidia of its responsibility in consistently driving GPU prices upwards and relative value (perf/$ gen-over-gen) downwards each generation.

AMD's market share and its failure to grow is an extremely complex topic, and one that is strongly affected by Nvidia's massive mindshare advantage and their status as a decades-long incumbent market leader. Trying to gain market share against that kind of opposition is incredibly difficult, no matter your product quality, marketing, etc. It took AMD more than three years of consistently delivering better-value and in the end fundamentally superior products to Intel to erase Intel's mindshare advantage in CPUs, and that was far less entrenched than Nvidia's mindshare in GPUs.

The GPU market is a duopoly in that there are only two significant actors, but that is also a gross misrepresentation of reality, as the term implies that they are comparable in market strength. In reality, the GPU market is a quasi-monopoly with a moderately strong runner-up that somewhat threatens the status of the incumbent, but is by no means in a position to beat them - not in finances, not in multi-year R&D funding, not in mindshare.
 

ARF

I think there are two problems:

1. Inflation, which in the graphics card segment is anywhere between 50% and infinity.
2. Macro-economic (political?) pressures that make the lives of the rich easier and the lives of the poor much more difficult.

Why I am saying this:

Because the RX 6500 XT is probably not a replacement for the old Radeon RX 5500 XT, but rather a tier below it.
The true RX 5500 XT replacement is the cut-down Radeon RX 6600.
Launch MSRPs: $169 in the first case vs. $329 in the second - 95% inflation.

Also, let's look at idealo.de and how the prices have changed in the last 10-day period between 30 August and 9 September:

Radeon RX 6400 - 171.69 euro - 172.00 euro +0.1%
Radeon RX 6500 XT - 187.90 euro - 187.90 euro +0
Radeon RX 6600 - 278.00 euro - 278.00 euro +0
Radeon RX 6600 XT - 388.96 euro - 399.00 euro +2.6%
Radeon RX 6650 XT - 403.00 euro - 401.27 euro -0.4%
Radeon RX 6700 XT - 431.10 euro - 479.90 euro +11.3%
Radeon RX 6750 XT - 540.48 euro - 547.90 euro +1.4%
Radeon RX 6800 - 629.00 euro - 629.00 euro +0%
Radeon RX 6800 XT - 769.00 euro - 789.90 euro +2.7%
Radeon RX 6900 XT - 924.92 euro - 914.92 euro -1%
Radeon RX 6950 XT - 1173.98 euro - 999.00 euro -14.9%

For a comparison:

Prices of the low-end cards as of 16 August 2022:

Radeon RX 6400 - 168.82 euro -2%
Radeon RX 6500 XT - 169.00 euro -11%
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I think there are two problems:

1. Inflation, which in the graphics-card segment is anywhere between 50% and infinity.
2. Macro-economic (political?) pressures that make life easier for the rich and much harder for the poor.

Why am I saying this?

Because the RX 6500 XT is probably not a replacement for the old Radeon RX 5500 XT, but rather a tier below it.
The true RX 5500 XT replacement is the cut-down Radeon RX 6600.
Launch MSRPs: $169 in the first case, $329 in the second - roughly 95% inflation.

Also, let's look at idealo.de and how the prices have changed in the last 10-day period between 30 August and 9 September:

Radeon RX 6400 - 171.69 euro - 172.00 euro +0.1%
Radeon RX 6500 XT - 187.90 euro - 187.90 euro +0%
Radeon RX 6600 - 278.00 euro - 278.00 euro +0%
Radeon RX 6600 XT - 388.96 euro - 399.00 euro +2.6%
Radeon RX 6650 XT - 403.00 euro - 401.27 euro -0.4%
Radeon RX 6700 XT - 431.10 euro - 479.90 euro +11.3%
Radeon RX 6750 XT - 540.48 euro - 547.90 euro +1.4%
Radeon RX 6800 - 629.00 euro - 629.00 euro +0%
Radeon RX 6800 XT - 769.00 euro - 789.90 euro +2.7%
Radeon RX 6900 XT - 924.92 euro - 914.92 euro -1%
Radeon RX 6950 XT - 1173.98 euro - 999.00 euro -17.5%

For a comparison:

Prices of the low-end cards as of 16 August 2022:

Radeon RX 6400 - 168.82 euro -2%
Radeon RX 6500 XT - 169.00 euro -11%
I think your second point has some merit, though calling the first point "inflation" adopts a very problematic term (one that mainly serves to hand-wave away the willful politics behind the economic processes causing the price hikes and value drops) and stretches it to mean something it doesn't.

Also, your generational succession story doesn't add up for me. Given that the RX 6600 is significantly more powerful than the RX 5600 XT, it makes no sense for it to be a successor to the 5500 XT. The RX 6500 XT is undoubtedly the successor to the 5500 XT - it's just clear that whoever designed that card at AMD didn't care about dGPUs. It is very obviously an "oh, crap, I guess we have to make a desktop dGPU as well" version of a strictly designed-for-mobile, low-power, low-cost GPU. I do agree that the RX 6500 XT should have been called the 6400 - at which point it would have been decent for its tier! - and there should have been a ~20-24 CU RX 6500 XT, not the 16-CU one we got. The 6600 isn't the GPU you're missing - the one you're missing doesn't exist, as it was never made. This is a major flaw in AMD's RDNA2 die designs, as it leaves a massive performance gap between cut-down Navi 23 and full-sized Navi 24 - the 6600 is essentially twice as fast as a 6500 XT, after all.

What AMD did was clearly bet on Navi 24 being a volume seller for low-end gaming notebooks, and they optimized the design to the extreme for that use case alone. This doesn't seem to have actually won it many mobile design slots, so it looks like a major strategic blunder on AMD's part, as it also made the dGPU versions of the die - clearly an afterthought - look even sillier.

What they should have done, IMO, is make Navi 24 either 20 or 24 CUs with a 96-bit VRAM bus (and possibly a slightly larger Infinity Cache). This would have made the die slightly more expensive, and thus a harder sell for those $600-700 entry gaming laptops, but also far more flexible and attractive as an entry gaming GPU in its own right, regardless of the target market. And, of course, it would have led to a far less stupid-looking RX 6500 XT: it wouldn't have been the least efficient GPU of its whole generation, because they wouldn't have had to OC the snot out of it just to keep it from looking terrible. I just hope AMD treats the low end and lower mid-range with more respect in the 7000 series. Hopefully the current demand slump is enough to remind them of just how many units the RX 570/580 and GTX 1060 sold, and how lucrative the ~$150-250 market is as long as you go for volume rather than massive margins.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.73/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
What we are seeing is a combination of oversupply, a saturated market, a well-stocked used market, and economic insecurity.

That sums it up very nicely.
 