
AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080

Joined
May 31, 2016
Messages
4,437 (1.44/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Couldn't care less if the 4090 is a crippled chip or a 4090 Ti comes out in a year; I care that my 4090 is not artificially handicapped just so that Nvidia can sell a 4090 OC edition.

Affordability is kinda relative; $1,600 is pocket change for some people :)
You're missing the point here, as usual. You also need to look for different arguments. By your logic, you could argue that a not-fully-enabled chip is a handicap. Insufficient cooling on a chip is a handicap as well. Insufficient power delivery can be considered a handicap.
All of the above can be considered handicaps, which in my book are silly to even talk about. Your argument belongs in the same category.
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Couldn't care less if the 4090 is a crippled chip or a 4090 Ti comes out in a year; I care that my 4090 is not artificially handicapped just so that Nvidia can sell a 4090 OC edition.

Affordability is kinda relative; $1,600 is pocket change for some people :)
But as has been discussed above, it is handicapped so that Nvidia can come back in half a year and sell a 4090 Ti with 2048 more shaders. Whether the limitation is disabled shaders or power/clock limits is immaterial - it's all product segmentation, just by marginally different means.
 

ixi

Joined
Aug 19, 2014
Messages
1,451 (0.39/day)
Looks like I know my next GPU :}. XTX, I'm waiting for you.
 
Joined
Nov 11, 2016
Messages
3,395 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
You're missing the point here, as usual. You also need to look for different arguments. By your logic, you could argue that a not-fully-enabled chip is a handicap. Insufficient cooling on a chip is a handicap as well. Insufficient power delivery can be considered a handicap.
All of the above can be considered handicaps, which in my book are silly to even talk about. Your argument belongs in the same category.

A handicap vs. being artificially handicapped are two separate things; it seems like you can't tell the difference.

But as has been discussed above, it is handicapped so that Nvidia can come back in half a year and sell a 4090 Ti with 2048 more shaders. Whether the limitation is disabled shaders or power/clock limits is immaterial - it's all product segmentation, just by marginally different means.

At this point in time there is no certainty that Nvidia will ever release a 4090 Ti. If the 7900 XTX and its higher-binned variants cannot compete with the 4090, Nvidia might just let the 4090 be the best for two years (like the 2080 Ti Super rumor - Nvidia just likes to keep an ace up its sleeve).
 
Joined
Jul 15, 2020
Messages
1,020 (0.64/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
Yes, it does disappoint, and here is why.
If you look closer at reviews, the 4090 is basically double the performance of a 3080 10GB. The 3080's MSRP was set at $700 at launch, which we all know was damn high at the time, not to mention the enormous street prices. You may argue the 4090 is faster than the 3090 and 3090 Ti and is cheaper, or has a better performance-per-dollar ratio; the problem is that the 3090 and 3090 Ti had crazy pricing anyway. Also, the 4090 is the fastest, hence the stupid pricing, and obviously you have to pay a premium for it. There is one problem, though: it is not the fastest it could be, because it is not fully unlocked. You buy a crippled GPU for an astronomical price, and it will be replaced by a fully unlocked GPU at an even more ridiculous price.
So far, despite its performance, the 4090 has disappointed on all other fronts. The 4080 16GB? Same thing, considering its price. That goes for basically every GPU NV is planning to release, based on the current information we have. Hopefully something will change, but I doubt it.
The 4090 is not the very top GPU you speak of. It is merely a mid-top-tier GPU.
The 4090 Ti is the top-top tier (unless a 4090 Super Ti is coming), with the full-fat chip, and its cost, when released, will reflect that - be sure about that :)
NV is just using that "psychological human error" (read: being an average human) that makes you think that if the GPU isn't whole then "it is crippled", to make even more profit.

Another emotional aspect: "I'm paying this much, it had better be the whole thing."
The emotional, psychological aspect plays a major role, and we can all see it very clearly on the forum.
It makes you change your choice, sometimes completely, against solid data and proven facts.
Every company that respects itself will exploit this 'merit' to the max. NV and Intel especially excel at that exploitation; AMD still has some miles to cover, but it is doing very well to catch up.
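To put rough numbers on the value point in the quoted post, here is a minimal sketch using only the figures mentioned in the thread (roughly 2x the 3080 10GB's performance, and $1,600 vs. the 3080's $700 launch MSRP); the 2x factor is the poster's round approximation, not measured data:

```python
# Rough perf-per-dollar comparison using the numbers from the thread:
# 4090 ~= 2x the performance of a 3080 10GB, at $1600 vs the 3080's $700 launch MSRP.
# The 2.0x factor is the poster's round approximation, not a measured benchmark result.
cards = {
    "RTX 3080 10GB": {"relative_perf": 1.0, "msrp_usd": 700},
    "RTX 4090":      {"relative_perf": 2.0, "msrp_usd": 1600},
}

baseline = cards["RTX 3080 10GB"]["relative_perf"] / cards["RTX 3080 10GB"]["msrp_usd"]
for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["msrp_usd"]
    print(f"{name}: {perf_per_dollar / baseline:.2f}x the 3080's perf per dollar at MSRP")
# Under these assumptions the 4090 comes out at ~0.88x, i.e. roughly 12% worse
# perf per dollar than the 3080 at launch MSRP, despite doubling raw performance.
```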
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
At this point in time there is no certainty that Nvidia will ever release a 4090 Ti. If the 7900 XTX and its higher-binned variants cannot compete with the 4090, Nvidia might just let the 4090 be the best for two years (like the 2080 Ti Super rumor - Nvidia just likes to keep an ace up its sleeve).
... that's literally the point, and the only reason why fully enabled dice matter at all: that way you know that there isn't something faster coming. Without that, you can never know. But all precedent shows that Nvidia will launch a faster SKU mid-generation. They did so for Kepler, for Maxwell, for Pascal, for Ampere. Turing is the only exception to this over the past decade, and that had mid-gen refreshes for literally everything but the top SKU.
The 4090 is not the very top GPU you speak of. It is merely a mid-top-tier GPU.
The 4090 Ti is the top-top tier, with the full-fat chip, and its cost, when released, will reflect that - be sure about that :)
NV is just using that "psychological human error" (read: being an average human) that makes you think that if the GPU isn't whole then "it is crippled", to make even more profit.
No, it is a top-tier GPU. It's just not guaranteed to stay as the top-tier GPU. It'll still be top-tier if/when they launch a 4090 Ti, simply because the Ti will only be marginally faster due to the relatively minor hardware differences. The issue isn't the veracity of whether or not it "actually" is a top-tier GPU, but the inherently shitty move of selling something as "the best of the best" only to undermine this shortly afterwards with a "well, actually..." launch. If someone promises you through marketing that the product you're buying is the best, then it's reasonable to expect it to stay as the best for a while, and to expect that the maker of said product isn't planning to supersede it in a few months' time.
 
Joined
Jul 15, 2020
Messages
1,020 (0.64/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
No, it is a top-tier GPU. It's just not guaranteed to stay as the top-tier GPU. It'll still be top-tier if/when they launch a 4090 Ti, simply because the Ti will only be marginally faster due to the relatively minor hardware differences. The issue isn't the veracity of whether or not it "actually" is a top-tier GPU, but the inherently shitty move of selling something as "the best of the best" only to undermine this shortly afterwards with a "well, actually..." launch. If someone promises you through marketing that the product you're buying is the best, then it's reasonable to expect it to stay as the best for a while, and to expect that the maker of said product isn't planning to supersede it in a few months' time.
It is guaranteed NOT to stay the top tier. Also, as of the xx9x\x9xx family we have sub-tiers within the top tier. The 4090 is the low/mid top tier because a 4090 Ti almost certainly exists.
If you decide between product A and product B according to "what is the best", and you're willing to pay more only to be entitled to "the best" as a trait in itself, then, well, you condemn yourself to a limbo of disappointment.

"...then it's reasonable to expect it to stay as the best for a while"
This is a very naive, childish approach, imo.
No one has guaranteed or will guarantee you a time frame for "the best". Expecting such a thing is way outside the scope of the product spec, hence the gap leading to disappointment.

But to each their own, I guess; just please don't use that disappointment to bash any company. That's being biased.
 
Joined
Sep 17, 2014
Messages
22,385 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
A handicap vs. being artificially handicapped are two separate things; it seems like you can't tell the difference.
It is the exact same thing. The result is that there is room in a stack for a less handicapped product.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,154 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
The issue isn't the veracity of whether or not it "actually" is a top-tier GPU, but the inherently shitty move of selling something as "the best of the best" only to undermine this shortly afterwards with a "well, actually..." launch. If someone promises you through marketing that the product you're buying is the best, then it's reasonable to expect it to stay as the best for a while, and to expect that the maker of said product isn't planning to supersede it in a few months' time.
Not everything is some shitty move; the reasoning is pretty clear. The flagship GPU is so large that yields play into this: to make the volume they want to sell given their market share, they can move vastly more chips with relatively minor defects, and stockpile the ones with no faults/good binning etc.; then, when enough exist to make a product viable, they'll sell it. How shitty of them.
 
Joined
May 31, 2016
Messages
4,437 (1.44/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
The 4090 is not the very top GPU you speak of. It is merely a mid-top-tier GPU.
The 4090 Ti is the top-top tier (unless a 4090 Super Ti is coming), with the full-fat chip, and its cost, when released, will reflect that - be sure about that :)
NV is just using that "psychological human error" (read: being an average human) that makes you think that if the GPU isn't whole then "it is crippled", to make even more profit.
We know that, and that is not the point. The point is you can literally call anything handicapped, and that is why I mentioned the 4090 not being a fully enabled chip. And no, as of today it is the top-tier GPU.

A handicap vs. being artificially handicapped are two separate things; it seems like you can't tell the difference.
Whether it is a frequency cap, a power delivery cap, or the number of CUs enabled in a chip, it all comes down to an artificial handicap.
 
Joined
Jan 14, 2019
Messages
12,337 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It's quite some difference, I would say.
Different how? It's worse value than all but 4 available GPUs instead of 2?

Not everything is some shitty move; the reasoning is pretty clear. The flagship GPU is so large that yields play into this: to make the volume they want to sell given their market share, they can move vastly more chips with relatively minor defects, and stockpile the ones with no faults/good binning etc.; then, when enough exist to make a product viable, they'll sell it. How shitty of them.
But it's not just flagship GPUs they do this with, the GA104 being a good example.
 
Joined
Dec 26, 2012
Messages
1,134 (0.26/day)
Location
Babylon 5
System Name DaBeast!!! DaBeast2!!!
Processor AMD AM4 Ryzen 7 5700X3D 8C/16T/AMD AM4 RYZEN 9 5900X 12C/24T
Motherboard Gigabyte X570 Aorus Xtreme/Gigabyte X570S Aorus Elite AX
Cooling Thermaltake Water 3.0 360 AIO/Thermalright PA 120 SE
Memory 2x 16GB Corsair Vengeance RGB RT DDR4 3600C16/2x 16GB Patriot Elite II DDR4 4000MHz
Video Card(s) XFX MERC 310 RX 7900 XTX 24GB/Sapphire Nitro+ RX 6900 XT 16GB
Storage 500GB Crucial P3 Plus NVMe PCIe 4x4 + 4TB Lexar NM790 NVMe PCIe 4x4 + TG Cardea Zero Z NVMe PCIe 4x4
Display(s) Samsung LC49HG90DMEX 32:9 144Hz Freesync 2/Acer XR341CK 75Hz 21:9 Freesync
Case CoolerMaster H500M/SOLDAM XR-1
Audio Device(s) iFi Micro iDSD BL + Philips Fidelio B97/FostexHP-A4 + LG SP8YA
Power Supply Corsair HX1000 Platinum/Enermax MAXREVO 1500
Mouse Logitech G303 Shroud Ed/Logitech G603 WL
Keyboard Logitech G915/Keychron K2
Software Win11 Pro/Win11 Pro
I will be stuck here in Canada till February, so I'd have no choice but to wait. That would gimme the chance to go through reviews and benchmarks by then, so I guess I'd be better placed to make an informed decision. I'm not too concerned about RT performance as long as it has improved a fair bit over my RX 6900 XT; the only game I wanna play with a good framerate + RT is Metro Exodus PC Enhanced. I play it on my main rig at 3840x1080, and my RX 6900 XT does struggle with RT maxed out, so IF the RX 7900 XTX really does improve RT performance by a fair margin, I hope to play ME PC Enhanced again at fully maxed out settings... and net more than playable framerates.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,154 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
But it's not just flagship GPUs they do this with, the GA104 being a good example.
Still, given the volume they need/want to ship of these products, they have to accept a small cut-down to produce enough acceptable chips in the given timeframe, and save the golden chips for the professional RTX products with more VRAM, like the Ampere A-series products.
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
It is guaranteed NOT to stay the top tier. Also, as of the xx9x\x9xx family we have sub-tiers within the top tier. The 4090 is the low/mid top tier because a 4090 Ti almost certainly exists.
That is literally exactly what I said. That's the distinction between being the top-tier SKU and a top-tier SKU.
If you decide between product A and product B according to "what is the best", and you're willing to pay more only to be entitled to "the best" as a trait in itself, then, well, you condemn yourself to a limbo of disappointment.
No. All purchases are a judgement of value for money, and when buying a product you cannot escape such judgements even if you explicitly don't care about value - it's there in literally every aspect of the thing. If you're paying extra for a top-tier product - which you inherently are by buying a flagship GPU - then you're putting your money and trust into a relationship with another party based on their messaging, i.e. marketing. If that company then abuses that trust by subsequently changing the basis for the agreement, then that's on them, not on the buyer.
"...then it's reasonable to expect it to stay as the best for a while"
This is a very naive, childish approach, imo.
What? No. A while is not a fixed amount of time. It is an inherently variable and flexible amount of time. That's the entire point. There's nothing naive about this, it is explicitly not naive, but has the expectation of its end built into it. The question is how and why that end comes about - whether it's reasonable, i.e. by a new generation arriving or significant refresh occurring, or whether it's unreasonable, i.e. through the manufacturer creating minuscule, arbitrary binnings in order to launch ever-higher priced premium SKUs.
No one has guaranteed or will guarantee you a time frame for "the best". Expecting such a thing is way outside the scope of the product spec, hence the gap leading to disappointment.
No it isn't. The explicit promise of a flagship GPU is that it's the best GPU, either from that chipmaker or outright. Not that it will stay that way forever, not that it will stay that way for any given, fixed amount of time, but that it is so at the time and will stay that way until either the next generation or a significant mid-gen refresh.
But to each their own, I guess; just please don't use that disappointment to bash any company. That's being biased.
Yes, of course, all criticism of shady marketing practices is bias, of course. Unbiased criticism is impossible! Such logic, much wow! It's not as if these companies have massive marketing budgets and spend hundreds of millions of dollars yearly to convince people to give them their hard-earned money, no, of course not :rolleyes: Seriously, if there's anyone arguing a woefully naïve stance here it's you with how you're implicitly pretending that humans are entirely rational actors and that how we are affected by our surroundings is optional. 'Cause if you're not arguing that, then the core of your argument here falls apart.


Not everything is some shitty move; the reasoning is pretty clear. The flagship GPU is so large that yields play into this: to make the volume they want to sell given their market share, they can move vastly more chips with relatively minor defects, and stockpile the ones with no faults/good binning etc.; then, when enough exist to make a product viable, they'll sell it. How shitty of them.
No, not everything is a shitty move, that's true. But you're giving a lot of benefit of the doubt here. An unreasonable amount, IMO. Even if initial yields of fully enabled dice are bad - say 70%, which is borderline unacceptable for the chip industry - and a further 50% of undamaged dice don't meet the top-spec bin, what is stopping them from still making a retail SKU from the remaining 35% of chips?

The problem is, you're drawing up an unreasonable scenario. Nobody is saying Nvidia has to choose between either launching a fully enabled chip or a cut-down one. They could easily do both - supplies of either would just be slightly more limited. Instead they're choosing to only sell the cut-down part - which initially must include a lot of chips that could have been the top-end SKU, unless their yields are absolute garbage. Look at reports of Intel's fab woes: what yield rates are considered not economically viable? Even 70% is presented as bad. And a 70% yield doesn't mean 70% usable chips, it means 70% fault-free chips.

A napkin math example: AD102 is a 608mm² almost-square die. As I couldn't find the specifics online, let's say it's 23x26.4mm (that's 607.2mm², close enough, but obviously not accurate). Let's plug that into Caly Technologies' die-per-wafer calculator (sadly only on the Wayback machine these days). On a 300mm wafer, assuming TSMC's long-reported 0.09 defect rate (which should be roughly applicable for N4, as N4 is a variant of N5, and N5 is said to match N7 defect rates, which were 0.09 several years ago), that results in 87 total dice per wafer, of which ~35 would have defects, and 52 would be defect-free. Given how GPUs are massive arrays of identical hardware, it's likely that all dice with defects are usable in a cut-down form. Let's then assume that half of defect-free dice meet the binning requirements for a fully enabled SKU. That would leave Nvidia with three choices:

- Launch a cut-down flagship consumer SKU at a binning and active block level that lets them use all chips that don't meet binning criteria for a fully enabled chip, and sell all fully enabled chips in higher margin markets (enterprise/workstation etc.) - but also launch a fully enabled consumer SKU later
- Launch a fully enabled consumer SKU and a cut-down SKU at the same time, with the fully enabled SKU being somewhat limited in quantity and taking some supply away from the aforementioned higher margin markets
- Only ever launch a cut-down consumer SKU, leaving fully enabled chips only to other markets

Nvidia consistently picks the first option among these - the option that goes hard for maximizing profits above all else, while also necessarily including the iffy move of promising "this is the flagship" just to supersede it 6-12 months later. And that? That's a shitty move, IMO. Is it horrible? Of course not. But it's explicitly exploitative and cash-grabby at the expense of customers, which makes it shitty.
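For anyone who wants to reproduce that napkin math, here is a minimal sketch using the textbook dies-per-wafer approximation and a Poisson yield model rather than the Caly Technologies calculator, so the counts differ by a couple of dice; the 23 x 26.4 mm dimensions, the 0.09 defect density, and the 50% top-bin split are the assumptions taken from the post above:

```python
# Cross-check of the napkin math above: simple dies-per-wafer approximation plus a
# Poisson yield model (not the Caly Technologies calculator, so counts differ slightly).
import math

WAFER_DIAMETER_MM = 300.0
DIE_W_MM, DIE_H_MM = 23.0, 26.4       # guessed AD102 dimensions from the post (~607 mm^2)
DEFECT_DENSITY_PER_CM2 = 0.09         # long-reported TSMC N7/N5-class defect density
TOP_BIN_FRACTION = 0.5                # assumed share of defect-free dice that bin fully

die_area_mm2 = DIE_W_MM * DIE_H_MM

# Textbook approximation: wafer area over die area, minus an edge-loss term.
gross_dies = (math.pi * (WAFER_DIAMETER_MM / 2) ** 2 / die_area_mm2
              - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

# Poisson model: probability that a die of this size has zero defects.
defect_free_fraction = math.exp(-(die_area_mm2 / 100.0) * DEFECT_DENSITY_PER_CM2)

defect_free = gross_dies * defect_free_fraction
defective = gross_dies - defect_free          # salvageable as cut-down SKUs
full_bin = defect_free * TOP_BIN_FRACTION     # candidates for a fully enabled SKU

print(f"gross dies per wafer: {gross_dies:.0f}")   # ~89 here; the post's calculator says 87
print(f"defect-free dies:     {defect_free:.0f}")  # ~52, matching the post
print(f"dice with defects:    {defective:.0f}")    # ~38 here vs ~35 in the post
print(f"full-bin candidates:  {full_bin:.0f}")     # ~26 per wafer under the 50% binning guess
```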

Still, given the volume they need/want to ship of these products, they have to accept a small cut-down to produce enough acceptable chips in the given timeframe, and save the golden chips for the professional RTX products with more VRAM, like the Ampere A-series products.
That depends on die size and actual yields. As I showed above, with published yields for the process nodes used here, there are still lots of chips that would meet the criteria for fully enabled SKUs.

Different how? It's worse value than all but 4 available GPUs instead of 2?
Also remember that that chart for some reason only assumes MSRP or lower rather than the expected and actual reality of prices being MSRP or higher.
 
Joined
Jan 14, 2019
Messages
12,337 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Also remember that that chart for some reason only assumes MSRP or lower rather than the expected and actual reality of prices being MSRP or higher.
Yep. That only makes the picture worse.

Soon, buying Nvidia instead of AMD will be like buying the 500 HP Ferrari for $100k instead of the 500 HP Mustang for $35k. Or is it like that already?
 

bug

Joined
May 22, 2015
Messages
13,735 (3.97/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
But it's not just flagship GPUs they do this with, the GA104 being a good example.
Yields aren't dictated solely by the complexity of a chip. They're also (and even more so) dictated by the maturity of the fabrication process.
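A small illustration of that point, assuming the same simple Poisson yield model and die sizes in the ballpark of those discussed in this thread; the defect densities below are illustrative only, since real foundry numbers aren't public:

```python
# How much process maturity (defect density) matters for a fixed die size.
# Poisson yield model; the defect densities below are illustrative, not published figures.
import math

def poisson_yield(area_mm2: float, d0_per_cm2: float) -> float:
    """Expected fraction of defect-free dice for a given die area and defect density."""
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

DIE_AREAS = {"~400 mm^2 die": 400.0, "~600 mm^2 die": 600.0}
DEFECT_DENSITIES = [0.20, 0.09, 0.05]  # immature -> mature-ish -> very mature process

for label, area in DIE_AREAS.items():
    row = ", ".join(f"D0={d0:.2f}: {poisson_yield(area, d0):.0%}" for d0 in DEFECT_DENSITIES)
    print(f"{label}: {row}")
# A ~600 mm^2 die goes from ~30% to ~58% to ~74% defect-free as the process matures,
# so maturity moves the needle at least as much as die size does.
```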
 
Joined
Dec 28, 2012
Messages
3,853 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Yep. That only makes the picture worse.

Soon, buying Nvidia instead of AMD will be like buying the 500 HP Ferrari for $100k instead of the 500 HP Mustang for $35k. Or is it like that already?
That's a bit like comparing a NASCAR stock car to an F1 car.
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
That's a bit like comparing a NASCAR stock car to an F1 car.
Isn't the classic distinction that a Mustang lets you go fast, but a Ferrari lets you go fast and go around corners without crashing?
 
Joined
Dec 28, 2012
Messages
3,853 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Isn't the classic distinction that a Mustang lets you go fast, but a Ferrari lets you go fast and go around corners without crashing?
Yeah, the Mustangs of old were rocket ships, classic American muscle that went real fast - just don't try to turn. Ferraris were much more track-focused, like most Euro designs. They also had torque, but just enough to spin the tires, not enough to shred them for track drifting. The Mustang followed more the Mercedes SLS Black school of "POWAAAH".

The modern Mustang is closer to the Euro principle of fast and agile, but it still doesn't hold a candle to a Ferrari.
 
Joined
Jan 14, 2019
Messages
12,337 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Isn't the classic distinction that a Mustang lets you go fast, but a Ferrari lets you go fast and go around corners without crashing?
It used to be, but nowadays, I think it's more like the distinction that a Mustang lets you go fast, but a Ferrari lets you go fast and look like a smug d***. :p

Sorry for going off-topic.
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
It used to be, but nowadays, I think it's more like the distinction that a Mustang lets you go fast, but a Ferrari lets you go fast and look like a smug d***. :p

Sorry for going off-topic.
I would say that a Mustang manages to do that exact thing just fine, but then that might just be my personal preferences :laugh:
 
Joined
Nov 26, 2021
Messages
1,633 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Not everything is some shitty move; the reasoning is pretty clear. The flagship GPU is so large that yields play into this: to make the volume they want to sell given their market share, they can move vastly more chips with relatively minor defects, and stockpile the ones with no faults/good binning etc.; then, when enough exist to make a product viable, they'll sell it. How shitty of them.
Even with a die as large as the 4090, the yields would be close to 60% if N5 has the same defect rate as N7. We know that N5 is actually outperforming N7 for defect rate. For the 4080, the yields would be 70%. Put another way, even the 4090 will have more working dies than defective ones. This is a product segmentation move.
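A quick check of those figures with the same simple Poisson model: the 0.09/cm² defect density and AD102's ~608 mm² are taken from earlier in the thread, while AD103's ~379 mm² is my assumption, not something stated here.

```python
# Sanity check of the ~60% / ~70% estimates above with a Poisson yield model:
# defect-free fraction = exp(-die_area_cm2 * defect_density).
import math

DEFECT_DENSITY_PER_CM2 = 0.09  # figure cited earlier in the thread for N7/N5-class nodes
DIES = {
    "AD102 (RTX 4090)": 608.0,       # mm^2, from the thread
    "AD103 (RTX 4080 16GB)": 379.0,  # mm^2, assumed here, not stated in the thread
}

for name, area_mm2 in DIES.items():
    y = math.exp(-(area_mm2 / 100.0) * DEFECT_DENSITY_PER_CM2)
    print(f"{name}: ~{y:.0%} of dice defect-free")
# Prints roughly 58% for AD102 and 71% for AD103 -- in line with the "close to 60%"
# and "70%" estimates, before any additional binning on clocks or power.
```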
 
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W
We get it, lots of you don't care about RT; you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.

Would y'all please be mindful that there is a subset of enthusiasts who do want good RT performance in future purchases, for it not to be an afterthought, a checkbox to tick.

If it's not for you, awesome, but I tire of hearing nObOdY cArEs ABouT rAy TrACinG when clearly people do, so please, stop (silly to even ask, I know; this will probably make the vocal among you double down and write me an essay on why RT is a gimmick or that 'most' people don't care - good on you!).

Personally I'd love to be able to very strongly consider AMD GPUs, but a prerequisite of that is for them to take RT more seriously and lessen the hit, and they can certainly swing even more buyers their way if they deliver on that, so I eagerly wait to see if the top product has made significant strides.
We get it, lots of you care about RT; you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.

Would y'all please be mindful that there is a subset of enthusiasts who don't want RT in future purchases.

If it's for you, awesome, but I tire of hearing eVeRyBoDy cArEs ABouT rAy TrACinG when clearly people don't, so please, stop (silly to even ask, I know; this will probably make the vocal among you double down and write me an essay on why RT is not a gimmick or that 'most' people do care - good on you!).

Personally I'd love to be able to very strongly consider NVIDIA GPUs, but a prerequisite of that is for them to take RT less seriously and lessen the power draw, and they can certainly swing even more buyers their way if they deliver on that, so I eagerly wait to see if the top product has made significant strides.
 