
AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080

bug

Joined
May 22, 2015
Messages
13,735 (3.97/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
That alone makes AMD's attitude to sales a lot more sympathetic than Nvidia's in my opinion.
  • AMD tries to sell fully functional dies as much as possible, while also having cheaper versions at the same time - they play with open hands right from launch.
  • Nvidia sells partially disabled dies as their high-end, and reserves fully functional ones to carve out an even higher-end segment of the market later - they artificially boost the hype train to maintain sales.
That is actually a cost-saving measure. Nvidia traditionally engineers more complex chips, and yields for those are not that good at first, so you get the "not fully enabled" dies. Once production ramps up, yields improve and fully unlocked chips become viable. If they pushed for fully enabled dies from the start, you'd end up either with more expensive dies, or with cut-down ones (at the level that can be produced initially) with nowhere to go once yields improve.
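To put some illustrative numbers on that (a minimal sketch using the classic Poisson yield model, with made-up defect densities and die size - not actual TSMC or Nvidia figures):

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson defect model."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

# Hypothetical defect densities for a new process vs. a mature one,
# applied to a ~600 mm^2 flagship-class die (illustrative values only).
for d0 in (0.15, 0.07):  # defects per cm^2
    print(f"D0 = {d0}/cm^2: {poisson_yield(d0, 600):.0%} of dies fully functional")

# Early in a node's life most large dies have at least one defect, so
# disabling the faulty block and selling the die as a cut-down SKU salvages
# them; as D0 drops, fully enabled dies pile up and can anchor a new top SKU.
```

On those made-up numbers, the share of fully working dies goes from roughly 40% to 66% as the process matures - exactly the gap a fully-enabled-only launch SKU would leave with nowhere to go.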

I also don't get why people get hung up on dies being fully enabled or not. You get the product benched as-is and you know very well what it is capable of.
 
Joined
Sep 18, 2020
Messages
117 (0.08/day)
System Name Vedica
Processor Intel Core i7-9700K
Motherboard Gigabyte AORUS Ultra Z390
Cooling Alphacool Eisblock XPX
Memory 32GB DDR4, 4x Crucial Ballistix Sport LT BLS8G4D30AESBK
Video Card(s) Nvidia RTX 3080
Storage 2x Sabrent 1 TB Rocket - 1x Seagate Barracuda ST4000DM004
Display(s) Dell AW3423DWF
Case Fractal Design Define R6
Audio Device(s) Motu M2
Power Supply Corsair RM1000x
Mouse Cooler Master MM720
Keyboard Wooting One
I think most people should skip the 40 series since the 30 series is still pretty decent, and I have a feeling Nvidia will bring out the 50 series early rather than waiting two years, since the 40 series was delayed after the 30 series launch for a number of reasons I won't go through here.

The ray tracing performance looks horrid .. the 4080 will land the knockout punch

I don't think anyone expects better RT out of AMD compared to Nvidia. However, in terms of raster performance, that's where AMD will likely win, so it will come down to what matters to the buyer.

WHO CARES ABOUT RAY TRACING. ARE YOU OUT OF YOUR MIND?

Personally quite excited to see these cards. The lower RT performance isn't an issue to me if it's Ampere level. The price is important.

Reality is, who cares about RT below 60 fps at 4K, like on the 4090?

if only ray tracing made games good

Couldn't care less about RT
If barely decent video content (which is ray-traced by nature) is produced, is RT really expected to make games better today or in the future?

I have been waiting for realistic audio in games for many years. I'm probably the only one, as industry interest is next to zero.
 
Joined
Nov 11, 2016
Messages
3,394 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
Ah, so your 4090 isn't artificially limited by having 2048 of its AD102's shaders disabled? ;)

I don't care about having the best, but I do care about having the best in the price and performance range I consider sensible for my needs.


Then why do you care about a locked voltage/frequency curve?

Do you even read reviews? Does 4090 performance disappoint anyone, with its 2048 shaders disabled?

Meanwhile, locking voltage/freq is a dick move, especially on a top-of-the-line GPU where enthusiasts (who are more likely to buy these products) like to overclock.
 
Joined
Jan 14, 2019
Messages
12,337 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I also don't get why people get hung up on dies being fully enabled or not. You get the product benched as-is and you know very well what it is capable of.
Because it's a sign that Nvidia leaves performance on the table just to sell it for more money later, even if it's really due to yield issues (which I highly doubt).

Do you even read reviews? Does 4090 performance disappoint anyone, with its 2048 shaders disabled?
I do read reviews, and the 4090 indeed does disappoint:
[attached chart: relative performance per dollar]


Meanwhile, locking voltage/freq is a dick move, especially on a top-of-the-line GPU where enthusiasts (who are more likely to buy these products) like to overclock.
Enthusiasts should realise that overclocking is a thing of the past - we live in an age when GPUs and CPUs come delivering their fullest right out of the box. If you want some personalisation, you should underclock/undervolt more than anything, imo.
 

bug

Joined
May 22, 2015
Messages
13,735 (3.97/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Because it's a sign that Nvidia leaves performance on the table just to sell it for more money later, even if it's really due to yield issues (which I highly doubt).
Well, improved yields are untapped potential. What would you want Nvidia to do with that?
 
Joined
Jan 14, 2019
Well, improved yields are untapped potential. What would you want Nvidia to do with that?
If AMD can sell fully enabled chips even in their highest-end products, then so can Nvidia.
 
Joined
Feb 15, 2020
Messages
38 (0.02/day)
Location
Slovakia
Processor Intel Core i9 14900K
Motherboard Gigabyte Z790 Aorus Elite X W7
Cooling Direct-die, custom loop
Memory 2x24GiB G.Skill Trident Z5 6400 CL32
Video Card(s) Gigabyte RTX 4090 WF3
Storage Sabrent Rocket 4.0 1TB, 4x4TB Samsung 860 EVO
Display(s) Acer XV273K
Case none
Audio Device(s) Creative SoundBlasterX G5
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Microsoft Pro IntelliMouse
Keyboard AJAZZ AKP846 RWB
Because it's a sign that Nvidia leaves performance on the table just to sell it for more money later, even if it's really due to yield issues (which I highly doubt).


I do read reviews, and the 4090 indeed does disappoint:
[attached chart: relative performance per dollar]


Enthusiasts should realise that overclocking is a thing of the past - we live in an age when GPUs and CPUs come delivering their fullest right out of the box. If you want some personalisation, you should underclock/undervolt more than anything, imo.

If you decide to compare the "value" of top-end SKUs, why not do so in 4K?
Also, some entries on this chart are pure comedy - the Intel Arc cards especially.
 
Joined
Sep 17, 2014
Messages
22,385 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm a bit surprised at the drop in clocks and CUs for the XT vs. the XTX, to be honest - the drop in power seems a bit small compared to that difference, with 10% fewer CUs and 10% lower clocks for just ~15% less power. Makes me wonder whether the XT will often boost higher than spec, or just have a ton of OC headroom - or whether it's explicitly configured to allow essentially any silicon to pass binning for that SKU.

Either way, very much looking forward to benchmarks of these!



None, since it doesn't launch till December 13th?
Clearly binning imho.

XTX is just a better chip.
Maybe they use a pretty low target for the bin on XT so that they can keep the price relatively low for both.

It's also a new type of product with regard to the chips they use. I think they're conservative to keep yields up, so again, binning.
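As a back-of-the-envelope check on the quoted figures: at a fixed voltage, power scales roughly in proportion to active CUs times clock, so 10% fewer CUs at 10% lower clocks should give about 0.9 × 0.9 ≈ 0.81, i.e. ~19% less power. Landing at only ~15% less is consistent with the XT bin running lower-quality silicon at slightly higher voltage - which fits the binning explanation.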
 

bug

If AMD can sell fully enabled chips even in their highest-end products, then so can Nvidia.
They could, but that would mean designing a smaller chip, and when yields improve, they'd have nothing better to sell. How would that help you?
 
Joined
Jan 14, 2019
If you decide to compare the "value" of top-end SKUs, why not do so in 4K?
Also, some entries on this chart are pure comedy - the Intel Arc cards especially.
The 4K chart doesn't look much better, either:
[attached chart: relative performance per dollar, 4K]


They could, but that would mean designing a smaller chip, and when yields improve, they'd have nothing better to sell. How would that help you?
If it's all about size, then why do they do the same with their lower-end chips, like the GA104?
 
Joined
Jun 5, 2018
Messages
237 (0.10/day)
I'm waiting for RT performance reviews before making a judgement on this card - but not RT plus frame generation / upscaling, etc. Just pure RT vs. RT, 7900 vs. 4080.
If the RT performance gap is no wider than last gen's, I think this card will be a good choice vs. the 4080.

I'm also curious to see what FSR 3.0 will bring, and I'm thankful for the power requirements of this 7900 series. Having said that, it's generally very upsetting to see both vendors normalizing $1k-$2k for high-end gaming GPUs. That used to be a whole-system budget not too long ago.
 
Joined
Sep 17, 2014
I also don't get why people get hung up on dies being fully enabled or not. You get the product benched as-is and you know very well what it is capable of.
That is probably related to a guesstimate about binning and that golden-lottery feeling.

The reasoning: if the chip is already cut down, it's already not 'perfect', so it makes sense that the rest of that chip is also less likely to be optimal. This doesn't translate to practice for most, but the emotion is what it is.

Another emotional aspect: 'I'm paying this much, it had better be the whole thing.'

If it's all about size, then why do they do the same with their lower-end chips, like the GA104?
They have volume on each SKU; they move enough units to run the process of gradual improvement on each one. And it moves both ways - remember the re-purposed 104s on lower-end products.
 
Joined
Jan 18, 2020
Messages
813 (0.46/day)
Both the 4080 and 7900 series are way too expensive - only worth bothering with for those who need 100+ fps at 4K. Current gen will do 4K60 just fine in pretty much everything. Even the 4090 won't run Cyberpunk with RT without serious image quality damage.
 
Joined
Feb 11, 2009
Messages
5,541 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
If barely decent video content (which is ray-traced by nature) is produced, is RT really expected to make games better today or in the future?

I have been waiting for realistic audio in games for many years. I'm probably the only one, as industry interest is next to zero.

No no, I'm fully with you, and I LOVE and encourage reviewers talking about sound quality.

I remember when BF: Bad Company 2 came out and how there was a focus on the audio, and everyone talked about it.
I thought it was a turning point, but... alas...

And we all know how important audio is to the experience, yet the budget and attention it gets is zero.

Hell, I remember AMD back in the day also had something that was meant to focus on and increase the quality of audio in games - I think it was related to what eventually became Vulkan.


That said, being a fan of Digital Foundry's work, I appreciate RT and what it can do/does
 

bug

That is probably related to a guesstimate about binning and that golden-lottery feeling.

The reasoning: if the chip is already cut down, it's already not 'perfect', so it makes sense that the rest of that chip is also less likely to be optimal. This doesn't translate to practice for most, but the emotion is what it is.

Another emotional aspect: 'I'm paying this much, it had better be the whole thing.'


They have volume on each SKU; they move enough units to run the process of gradual improvement on each one. And it moves both ways - remember the re-purposed 104s on lower-end products.
My point exactly: this is all psychological; it has no consequences IRL, other than allowing faster future products without having to redesign the silicon (read: cheaper).
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,153 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
WHO CARES ABOUT RAY TRACING. ARE YOU OUT OF YOUR MIND?
ME. NO.
if only ray tracing made games good
In the same way it cannot make games good, good games can also be enhanced by it.

We get it, lots of you don't care about RT - you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.

Would y'all please be mindful that there is a subset of enthusiasts who do want good RT performance in future purchases, for it not to be an afterthought or a checkbox to tick.

If it's not for you, awesome, but I tire of hearing nObOdY cArEs ABouT rAy TrACinG when clearly people do, so please, stop (silly to even ask, I know; this will probably make the vocal among you double down and write me an essay on why RT is a gimmick or why 'most' people don't care - good on you!).

Personally I'd love to be able to very strongly consider AMD GPUs, but a prerequisite of that is for them to take RT more seriously and lessen the hit; they can certainly swing even more buyers their way if they deliver on that, so I eagerly wait to see whether the top product has made significant strides.
 
Joined
Jul 15, 2020
Messages
1,020 (0.64/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans + 2*12 front & bottom + out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
Whoever can pay $1000+ for a GPU can also pay $1700+ for a GPU.
In that price category, absolute performance is the bar - not price/performance or even power.
There might be a small group of users who will stretch to the $1000 range but no further, but most can just drop an extra $700-1000 without real problem - gaming is their hobby, and thus it is a legitimate cost.
To be clear, I'm not one of those, quite the opposite (see my current GPU), but it's the reality today. Many are willing to pay whatever extra for the ultimate quality/performance.
 

bug

ME. NO.

In the same way it cannot make games good, good games can also be enhanced by it.

We get it, lots of you don't care about RT - you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.

Would y'all please be mindful that there is a subset of enthusiasts who do want good RT performance in future purchases, for it not to be an afterthought or a checkbox to tick.

If it's not for you, awesome, but I tire of hearing nObOdY cArEs ABouT rAy TrACinG when clearly people do, so please, stop (silly to even ask, I know; this will probably make the vocal among you double down and write me an essay on why RT is a gimmick or why 'most' people don't care - good on you!).

Personally I'd love to be able to very strongly consider AMD GPUs, but a prerequisite of that is for them to take RT more seriously and lessen the hit; they can certainly swing even more buyers their way if they deliver on that, so I eagerly wait to see whether the top product has made significant strides.
Goes right up there with "who cares about a second mouse button"...
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Where are all the people who cried at the launch event that the 7900 XTX doesn't match up against the 1.6x more expensive 4090?
I doubt the $1600 MSRP of the 4090. It is likely a fake figure, as with the 2080 Ti, which was claimed to have an MSRP of $999 but sold for 20% more.

The cheapest 4090 in Germany is €2300+. That is about €1930+ if you strip the 19% VAT.
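(Worked out, assuming Germany's standard 19% VAT: net = gross / 1.19, so €2300 / 1.19 ≈ €1933 - still well above the $1600 MSRP even at the near-parity EUR/USD rate of late 2022.)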

Also, cough:

[attached screenshot: current 4090 retail listings]


 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
Yeah, that's pretty much my thinking exactly. While yields/defect rates for TSMC N5 aren't public in the same way they were for N7, we still know that they are good - good on a level where they're churning out massive amounts of large-ish dice with very high yields. Combine that with even AMD's biggest GPU die now being <400mm² and, like you say, most likely the vast majority of dice will qualify for the XTX SKU. We saw the exact same thing with Navi 21 on N7, where the 6900 XT was always (relatively) plentiful, while 6800 XT and 6800 supplies were nonexistent at times, and even at the best of times very limited, simply because AMD would rather sell a fully functioning die as a 6900 XT than cut it down to sell as a cheaper SKU.

Of course they're not in the same supply-constrained market today, so they're going to need to be a bit more flexible and more willing to "unnecessarily" sell parts as a lower bin than they actually qualify for - this has been the norm in chipmaking forever, after all. But I still expect their first push to be plenty of XTXes, and notably fewer XTs. This also (assuming the XT PCB is also used for the 7800 XT) makes the XTX having its own PCB more understandable - it's likely supposed to have enough volume to recoup its own costs, while the XT is designed to be an in-between SKU with more cost optimization. Which is kind of crazy for a $900 GPU, but it's not like those cost optimizations are bad, it just looks slightly less over-the-top.

It will definitely be interesting to see what the power limits for the XT will be like - AMD has had some pretty strict power limits for lower end SKUs lately, like the RX 6600, but nothing like that for a high end, high power SKU. It also raises the question of what premium AIB models of the XT will be like, as AMD is making it pretty clear that there'll be heavily OC'd partner XTXes. That might also be part of the pricing strategy here - with a mere 10% difference, ultra-premium 3GHz XTs don't make as much sense, as they'd cost more than a base XTX - so the upsell to an equally ultra-premium XTX would be relatively easy. And AMD makes money on selling chips after all, not whole GPUs, so they'd always want to sell the more premium SKU.

Also definitely looking forward to seeing what the 7800 XT will be - a further cut down Navi 31? If Navi 32 has the rumored CU count, they'd need another in-between SKU (the poor competitive performance of the 6700 XT demonstrated how they can't leave gaps that big in their lineup), but with die sizes being this moderate and GPUs being relatively affordable and easy to tape out compared to most chips (being massive arrays of identical hardware helps!) I could see AMD launching more Navi 3X dice than 2X from that fact alone.

An N32-based 7800 XT with 2.8 GHz clocks would match the 7900 XT in pure shader performance, with a loss of cache, bandwidth and ROP performance. I see that as more likely than further cuts to N31, since it will have more supply than a cut N31, and if pricing is around $650-700 it would be a pretty popular choice with a very healthy margin for AMD.

This would mean that in stock configurations the 7800 XT would be better value than the 7900 XT, and the 7900 XTX would also be better value than the 7900 XT. However, depending on what is failing to put parts in the 7900 XT bin, it is possible that overclocking is rather strong on the card, so even though stock performance is not great from a value perspective, the overclocked performance could be enough that some enthusiast buyers who like to tinker see value in the 7900 XT even at $900, ensuring there is a market for it - albeit a small, low-volume market that allows AMD to focus more on the XTX SKU.

Then there is the cut N32 die. Will AMD bother with a vanilla 7800, or would they just cut N32 to about 5k shaders, pair it with 3 MCDs and call it a 7700 XT? I personally think the latter, and with a ~200 mm² die you are looking at stupid numbers per wafer, so supply of the 7700 XT and 7800 XT should be far, far better than supply of the 6800 XT and 6700 XT was. Even if the 7700 XT has to use perfectly good dies, AMD will have calculated for that.
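To put "stupid numbers per wafer" in rough figures, here's a minimal sketch using the standard first-order dies-per-wafer approximation (die sizes are illustrative assumptions, not confirmed AMD specs):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross die candidates per wafer: usable area minus an edge-loss term."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# Hypothetical sizes: a ~200 mm^2 N32-class die vs. a ~600 mm^2 monolith.
for area_mm2 in (200, 600):
    print(f"{area_mm2} mm^2 die: ~{dies_per_wafer(area_mm2)} candidates per wafer")
```

On those assumed sizes, a 300 mm wafer holds roughly 306 candidates at 200 mm² versus about 90 at 600 mm² - over 3x the gross count, before even accounting for the smaller die's better yield.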
 
Joined
May 31, 2016
Messages
4,437 (1.44/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Do you even read reviews? Does 4090 performance disappoint anyone, with its 2048 shaders disabled?
Yes, it does disappoint, and here is why.
If you look closely at reviews, the 4090 is basically double the performance of a 3080 10GB. The 3080's MSRP was set to $700 at launch, which we all know was so damn high at the time - not to mention the enormous street prices. You may argue the 4090 is faster than the 3090 and 3090 Ti, and that it is cheaper or has a better performance-per-dollar ratio; the problem is that the 3090 and 3090 Ti had crazy pricing anyway. Also, the 4090 is the fastest, thus the stupid pricing, and obviously you have to pay a premium for it. There is one problem though: it is not as fast as it could be, because it is not fully unlocked. You buy a crippled GPU for an astronomical price, and it will be replaced by a fully unlocked GPU at an even more ridiculous price.
The 4090 so far, despite its performance, has disappointed on all other fronts. The 4080 16GB? Same thing, considering its price. And that goes for basically every GPU NV is planning to release, going by the information we have so far. Hopefully something will change, but I doubt it.
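Taking those figures at face value (roughly 2x the performance of a $700 3080 for $1600): 2.0 / $1600 ≈ 0.00125 performance per dollar versus 1.0 / $700 ≈ 0.00143, so the 4090 actually delivers about 12% less performance per dollar than the 3080 did at launch MSRP.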
 
Joined
Nov 11, 2016
Yes, it does disappoint, and here is why.
If you look closely at reviews, the 4090 is basically double the performance of a 3080 10GB. The 3080's MSRP was set to $700 at launch, which we all know was so damn high at the time - not to mention the enormous street prices. You may argue the 4090 is faster than the 3090 and 3090 Ti, and that it is cheaper or has a better performance-per-dollar ratio; the problem is that the 3090 and 3090 Ti had crazy pricing anyway. Also, the 4090 is the fastest, thus the stupid pricing, and obviously you have to pay a premium for it. There is one problem though: it is not as fast as it could be, because it is not fully unlocked. You buy a crippled GPU for an astronomical price, and it will be replaced by a fully unlocked GPU at an even more ridiculous price.
The 4090 so far, despite its performance, has disappointed on all other fronts. The 4080 16GB? Same thing, considering its price. And that goes for basically every GPU NV is planning to release, going by the information we have so far. Hopefully something will change, but I doubt it.

Couldn't care less if the 4090 is a crippled chip or a 4090 Ti comes out in a year; I care that my 4090 is not artificially handicapped just so that Nvidia can sell a 4090 OC edition.

Affordability is kinda relative - $1600 is kinda pocket change for some people :)
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Yes yes, very sympathetic when AMD locks 6900 XT voltage/freq in order to sell higher SKUs: 6900 XT LC, 6900 XTXH, 6950 XT
Does the 6900 XT have locked voltage/frequency? I know mine is a Navi 21 XTX die ("Ultimate"), but that's just a special bin picked for scaling better to higher clock speeds at higher voltages. Is the stock Navi 21 XT 6900 XT locked down in terms of adjusting clock speeds or voltages? Yes, I know there's a frequency ceiling for what can be set in software, but in my experience that tends to be higher than what can be done stably without exotic cooling anyhow, so I don't see the issue. All I've noticed about mine is that it doesn't really undervolt at all, but that's just a characteristic of that bin of the silicon - it still gets stupidly efficient with a moderate underclock.
That is actually a cost-saving measure. Nvidia traditionally engineers more complex chips. Yields for those are not that good at first. So you get the "not fully enabled" dies. Once production steps up, yields improve and fully unlocked chips become more viable. If they pushed for fully enabled dies you end up either with more expensive dies or with cut-down ones (to the level that can be produced initially) with nowhere to go once yields improve.
This is IMO a pretty reasonable approach - but in the market, it has the effect of saying "hey, this is the new cool flagship, the best of the best" only for 6 months to pass and them to say "hey, forget that old crap, this is the best of the best!" Which, regardless of the realities of production, is a pretty shitty move when the most explicit selling point of the previous product was precisely how it was the best. There's obviously a sliding scale of how shitty this is simply due to something faster always being on the way, but IMO Nvidia tends to skew towards pissing on their fans more than anything in this regard.
I also don't get why people get hung up on dies being fully enabled or not. You get the product benched as-is and you know very well what it is capable of.
This I wholeheartedly agree with. Whether a die is fully enabled or not is entirely irrelevant - what matters is getting what you're paying for, as well as having some base level honesty in marketing.
ME. NO.

In the same way it cannot make games good, good games can also be enhanced by it.

We get it, lots of you don't care about RT - you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.

Would y'all please be mindful that there is a subset of enthusiasts who do want good RT performance in future purchases, for it not to be an afterthought or a checkbox to tick.

If it's not for you, awesome, but I tire of hearing nObOdY cArEs ABouT rAy TrACinG when clearly people do, so please, stop (silly to even ask, I know; this will probably make the vocal among you double down and write me an essay on why RT is a gimmick or why 'most' people don't care - good on you!).

Personally I'd love to be able to very strongly consider AMD GPUs, but a prerequisite of that is for them to take RT more seriously and lessen the hit; they can certainly swing even more buyers their way if they deliver on that, so I eagerly wait to see whether the top product has made significant strides.
I mostly agree with this, in fact I waited to buy a new GPU in order to get RT support - but I'm also perfectly fine with RT performance on my 6900 XT. I loved Metro Exodus with RT enabled at 1440p, and while I've only barely tried Control, that too seemed to work fine. Is the 6900 XT perfect? Obviously not. Are Nvidia's contemporary offerings faster? Yes - but not that much faster, not enough that it'll matter in 2-3 years as RT performance becomes more important. And either way, my GPU beats both current gen consoles in RT, so I'll be set for base level RT performance for the foreseeable future.

RT performance is absolutely an important aspect of the value of Nvidia's GPUs - the question is how important. For me, it's ... idk, maybe 5:1 raster-v-RT? Rasterization is a lot more important overall, and for the foreseeable lifetime of this product and its contemporaries, I don't see the delta between them as that meaningful long term. When my 6900 XT performs between a 3070 and 3080 in RT depending on the title, and the 3090 Ti is maybe 20% faster than those on average, that means they'll all go obsolete for RT at roughly the same time. There are absolutely differences, but I don't see them as big enough to dismiss AMD outright.
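For what it's worth, that 5:1 weighting is easy to make concrete. A toy scoring sketch (all relative-performance numbers below are hypothetical placeholders, purely to illustrate the weighting):

```python
# Weighted value score: raster counts 5x as much as RT, per the 5:1 split above.
def weighted_score(raster: float, rt: float,
                   w_raster: float = 5.0, w_rt: float = 1.0) -> float:
    return (raster * w_raster + rt * w_rt) / (w_raster + w_rt)

# Hypothetical cards: one leads raster by 10%, the other leads RT by ~43%.
cards = {"hypothetical_amd": (1.10, 0.70), "hypothetical_nvidia": (1.00, 1.00)}
for name, (raster, rt) in cards.items():
    print(f"{name}: {weighted_score(raster, rt):.3f}")

# With a 5:1 weighting, the 10% raster lead outweighs the 30% RT deficit
# (1.033 vs. 1.000); at equal weights the RT-strong card would win instead.
```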
 