
AMD RDNA 4 GPU Memory and Infinity Cache Configurations Surface

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
There is a pretty good RDNA2 > RDNA3 comparison out there - 6800 vs 7800XT, both are 60CU, 256-bit.

The best comparison is RX 6650 XT -> RX 7600

[attached: GPU comparison charts]

Performance is equal.

[attached: relative performance chart]


 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
There is a pretty good RDNA2 > RDNA3 comparison out there - 6800 vs 7800XT, both are 60CU, 256-bit.
What complicates things is the dual-issue ALU, although it seems to have helped even less than the equivalent change did in Nvidia's case.
There's less Infinity Cache, but given how both AMD and Nvidia have tuned the size of it in their newer generations, this has negligible impact.
Other than that, clocks are up 6-15% and VRAM bandwidth is up 22%.

Based on the last TPU GPU review, the 7800 XT is overall 22-23% faster than the 6800, which basically matches expectations based on specs.
In RT, it's 27-34%, with the gap increasing with resolution. There's a nice little jump there - AMD clearly did improve RT performance.
The usefulness of the dual-issue ALU depends on the game you play. It shows a nice uplift in some of them, with the 7800 XT matching or exceeding the 6900 XT, while it does absolutely nothing in others.
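To put rough numbers on the dual-issue point, here's a quick back-of-envelope sketch in Python (shader counts and boost clocks taken from the public spec sheets; paper FLOPS only, since real games rarely feed the second issue slot):

```python
def fp32_tflops(shaders: int, boost_ghz: float, dual_issue: bool = False) -> float:
    # 2 ops per FMA; RDNA 3 can dual-issue a second FP32 op per lane (VOPD)
    ops_per_clock = 2 * (2 if dual_issue else 1)
    return shaders * boost_ghz * ops_per_clock / 1000

rx_6800    = fp32_tflops(3840, 2.105)                   # RDNA 2
rx_7800_xt = fp32_tflops(3840, 2.430, dual_issue=True)  # RDNA 3, on paper

print(f"RX 6800    : {rx_6800:.2f} TFLOPS")
print(f"RX 7800 XT : {rx_7800_xt:.2f} TFLOPS on paper")
print(f"paper ratio: {rx_7800_xt / rx_6800:.2f}x vs ~1.22x measured in games")
```

The paper ratio comes out around 2.3x, yet the measured uplift is ~22%, which is exactly the "helped even less than in Nvidia's case" problem.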

RT in RDNA 3 is still quite meh in my opinion. I hope rumours about doubling the RT cores per shader in RDNA 4 are true, and we'll see some actual uplift that's worth talking about.

The best comparison is RX 6650 XT -> RX 7600

View attachment 361134
View attachment 361135

Performance is equal.

View attachment 361136

Yes, that's a pretty poor upgrade from AMD. With that mindset, though, you can pick good and bad examples from practically everywhere.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
The usefulness of the dual-issue ALU depends on the game you play. It shows a nice uplift in some of them, with the 7800 XT matching or exceeding the 6900 XT, while it does absolutely nothing in others.

This is quite a poor development path for AMD to take. The same applies to the Zen 5 Ryzens.
They need to develop in such a way that the improvements show up everywhere, not only in specific loads or apps, or only with specific software support.
 
Joined
Feb 3, 2017
Messages
3,753 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
The best comparison is RX 6650 XT -> RX 7600

View attachment 361134
View attachment 361135

Performance is equal.

View attachment 361136
You are right, that is an even better comparison. But I really do not understand what AMD did there with these two cards. It really is like nothing changed: the same spec lines down to the anemic PCIe link, even the same frequencies in practice, the same performance, the same RT performance. The RX 7600 is supposedly RDNA 3, with at least the dual-issue ALUs and the improved RT bits, but it simply does not show anywhere. Wow.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,745 (2.92/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-A
Cooling Arctic Freezer 50 / Thermaltake Contac 21
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) RTX 3080 10GB / RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
What was AMD's outlook last time, when they tried this with the RX 580 and RX 5700 XT?

Maybe there is a major problem in AMD's management, and the board of directors needs to sit down and decide what actions are necessary to fix the abnormally poor execution of the graphics division.

The 6800 XT, which is faster in some games, has those 128 MB.
At least the Polaris cards were hella popular, even though they were mid-range cards.

Though it's still weird if they're not gonna compete in the high end; the 6800/6900 and 7900 series have been great.
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
You are right, that is an even better comparison. But I really do not understand what AMD did there with these two cards. It really is like nothing changed: the same spec lines down to the anemic PCIe link, even the same frequencies in practice, the same performance, the same RT performance. The RX 7600 is supposedly RDNA 3, with at least the dual-issue ALUs and the improved RT bits, but it simply does not show anywhere. Wow.
It shows in price at least... Oh wait... :wtf:

[attached: price charts]


Personally, I just call RDNA 3 an RDNA 2 refresh (just like Ada is an Ampere refresh).
 
Joined
Jul 31, 2024
Messages
333 (2.85/day)
Mindfactory is not a "single store". It is the largest retailer in Europe, and sells tons and tons of inventory. Be respectful.

They sell only to Germany. There are tricks to circumvent this, but still. You have been warned, do not read the strikethrough: bad - bad - bad word -> Mindfactory = racists.

Alternate pulls the same joke with alternate.at, which has different prices from alternate.de. Austria and Germany share a common border in Central Europe, yet the inventory differs and the .de store refuses to sell to .at.
Proshop also distinguishes between countries.
nbb.com

I would prefer if you had written that Mindfactory is a retailer in GERMANY, not Europe!
There are more countries in Europe than just Germany.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
I feel like RDNA4 is pointless. AMD called it a bugfix after all, and it will target the mid-range only. Let's hope the price is very aggressive, but I guess not, considering AMD wants to chase AI and enterprise market share instead (as they should).

RDNA5 is going to be the next somewhat exciting release, but it is not even close - late 2025 or even 2026.

AMD probably spends 2% of its R&D funds on consumer GPUs, tops; they are getting worse and worse, sadly. MCM pretty much failed here. RDNA4 looks to be monolithic.

Ray tracing should not be a focus for AMD. Improving FSR and Frame Gen should be the prime focus; it's the reason Nvidia cards are selling like hotcakes. DLDSR, DLSS, DLAA, and Frame Gen are the true magic of RTX, not ray tracing or path tracing, and no, AMD is not even close. I've tried tons of AMD cards, including several SKUs in both the 6000 and 7000 series, and had a 6800 XT as my primary card before I got a 4090.

AMD needs to release some market-share-grabbing cards, meaning top performance per dollar with FSR and Frame Gen close to or matching Nvidia's solutions. AMD lags behind on all features today. Anyone who denies this has not tried both recently.

Remove iGPUs and AMD is already below 10% dGPU market share; Nvidia dominates without even trying (AI focus).

I guess AMD and Intel can fight for the low-end dGPU market, maybe the mid-range, but the high end was lost long ago.
 
Last edited:
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
I feel like RDNA4 is pointless
I would not call a card that brings 7900 XT+ performance down from 700+ (900 initially) to <500 pointless. That's exactly what most people have been asking for. Only enthusiasts want a "4090 killer" at any price or power consumption, but very few people buy such expensive cards - even among Nvidia's users.
Having a halo card certainly has its benefits in terms of marketing and the overall image of a series, but financially it's pretty expensive, and it gets more expensive every year.
let's hope the price is very aggressive
I hope so too, but looking at AMD's recent pricing, they always seem to shoot themselves in the foot with bad pricing and thus negative reviews, only for the price to fall immediately after launch.
AMD probably spends 2% of its R&D funds on consumer GPUs, tops; they are getting worse and worse, sadly.
How are they getting worse and worse?
MCM pretty much failed here.
At least they tried. I would not say they failed. They fell short of their own expectations and of those who thought this was going to be a "4090 killer", but they're still competitive in most aspects from the 4080S down.
Ray tracing should not be a focus for AMD. Improving FSR and Frame Gen should be the prime focus; it's the reason Nvidia cards are selling like hotcakes. DLDSR, DLSS, DLAA, and Frame Gen are the true magic of RTX, not ray tracing or path tracing,
You say that as if the two are mutually exclusive. They are not. Why can't they improve both RT in hardware and FSR/FG in software at the same time?
It's not like one takes resources away from the other. Engineers who design RT units in hardware are not coding FSR/FG the next day, and vice versa.

I would also say FSR FG was pretty good right out of the gate (despite being late), with wider compatibility - even with Nvidia's own 20 and 30 series cards, which were deprived of a feature that clearly could have worked on them (despite what Nvidia said). Even reviewers critical of the FSR upscaling portion praised FSR FG as nearly indistinguishable from DLSS FG.
AMD needs to release some market-share-grabbing cards, meaning top performance per dollar with FSR and Frame Gen close to or matching Nvidia's solutions. AMD lags behind on all features today. Anyone who denies this has not tried both recently.
Nvidia lagged behind in terms of the driver control panel for a long time. AMD's was unified and modern, whereas Nvidia's was fragmented and disjointed.
Only last year did they start developing the Nvidia App, which, while still in beta, has shown great progress towards unification.
You also cite performance per dollar but then lambast AMD for not releasing a halo card - yet a halo card is almost never top performance per dollar.
Remove iGPUs and AMD is already below 10% dGPU market share; Nvidia dominates without even trying (AI focus).
Nvidia also dominates largely because of "old fat", i.e. older cards like the 30 series, not their latest and greatest.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
I would not call a card that brings 7900 XT+ performance down from 700+ (900 initially) to <500 pointless. That's exactly what most people have been asking for. Only enthusiasts want a "4090 killer" at any price or power consumption, but very few people buy such expensive cards - even among Nvidia's users.
Having a halo card certainly has its benefits in terms of marketing and the overall image of a series, but financially it's pretty expensive, and it gets more expensive every year.

I hope so too, but looking at AMD's recent pricing, they always seem to shoot themselves in the foot with bad pricing and thus negative reviews, only for the price to fall immediately after launch.

How are they getting worse and worse?

At least they tried. I would not say they failed. They fell short of their own expectations and of those who thought this was going to be a "4090 killer", but they're still competitive in most aspects from the 4080S down.

You say that as if the two are mutually exclusive. They are not. Why can't they improve both RT in hardware and FSR/FG in software at the same time?
It's not like one takes resources away from the other. Engineers who design RT units in hardware are not coding FSR/FG the next day, and vice versa.

I would also say FSR FG was pretty good right out of the gate (despite being late), with wider compatibility - even with Nvidia's own 20 and 30 series cards, which were deprived of a feature that clearly could have worked on them (despite what Nvidia said). Even reviewers critical of the FSR upscaling portion praised FSR FG as nearly indistinguishable from DLSS FG.

Nvidia lagged behind in terms of the driver control panel for a long time. AMD's was unified and modern, whereas Nvidia's was fragmented and disjointed.
Only last year did they start developing the Nvidia App, which, while still in beta, has shown great progress towards unification.
You also cite performance per dollar but then lambast AMD for not releasing a halo card - yet a halo card is almost never top performance per dollar.

Nvidia also dominates largely because of "old fat", i.e. older cards like the 30 series, not their latest and greatest.
You won't get 7900XT performance for $499 LMAO, just wait and see.

Clearly you are out of touch with the actual market. I do B2B sales for a living; Nvidia completely crushes AMD in terms of GPU sales - gaming, AI, enterprise, it doesn't matter, Nvidia is the king.

TechPowerUp has like 50+ FSR vs DLSS/DLAA tests and Nvidia wins every time. They also have superior Frame Gen, without artifacts and ghosting.

Nvidia has superior drivers, by far. Nvidia runs flawlessly no matter which game you open - early access, betas, emulation, Nvidia does it all without a problem. AMD has wonky drivers, and I know this for sure since I am coming from a 6800 XT and have built like 100+ mid- to high-end rigs in the last 5 years, minimum. 9 out of 10 people want Nvidia; that's the hard reality for you.

AMD GPUs have gotten worse and worse over the last few generations; their focus has shifted away from dGPUs, which shows.

Nvidia dominates because 9 out of 10 want Nvidia; it's as simple as that. Many tried AMD at some point but came rushing back to Nvidia.

AMD is cheaper for a reason. If they were actually good, they would gain market share, not lose it year after year. They have improved nothing over the last many generations. Rushed features that are cheap knockoffs of Nvidia's tech is what they do.

DLDSR beats VSR
DLSS/DLAA beats FSR
Nvidia Frame Gen beats AMD Frame Gen.
Reflex beats Anti-Lag+ (and AL+ got people Steam banned haha)
Nvidia has longer support - even the GTX 600 series from 15 years ago still gets drivers, meanwhile AMD pulled Polaris and Vega support
Nvidia cards can use RT and even path tracing
ShadowPlay beats ReLive

Nvidia invented every single feature, and AMD tried to copy each one, but failed.

Also, AMD uses more power and has lower resale value; you save nothing by going with an AMD GPU in the end.

That's why AMD GPUs are cheaper, and they still don't sell.
 
Last edited:
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
You won't get 7900XT performance for $499 LMAO, just wait and see.
Exhibit A: Fury to RX 480. From $549 to $229 at 92% of the performance. Plus double the VRAM, despite most other specs being downgraded.
Exhibit B: Vega 56 to 5700 XT. From $399 to $399 at 121% of the performance. Most specs were downgrades, but performance actually increased.

Both are cases where AMD released mid-range cards after its high-end cards failed. Fury failed against the Maxwell-based 900 series, and Vega failed against the Pascal-based 10 series.

And history is about to repeat for the third time. But sure, believe what you want to believe, hoping that there's no way history would repeat itself so soon, or ever.
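The arithmetic behind those two exhibits, as a quick sketch (launch prices and relative performance as cited above):

```python
# Back-of-envelope performance-per-dollar gain of the mid-range successor
# over the outgoing high-end card, using the figures cited above.
cases = {
    "Fury ($549) -> RX 480 ($229)":     (549, 229, 0.92),
    "Vega 56 ($399) -> 5700 XT ($399)": (399, 399, 1.21),
}

for name, (old_price, new_price, rel_perf) in cases.items():
    # relative performance divided by relative price
    gain = rel_perf / (new_price / old_price)
    print(f"{name}: {gain:.2f}x performance per dollar")
```

By that yardstick (2.21x and 1.21x respectively), a sub-$500 card at roughly 7900 XT performance would fit AMD's historical pattern rather than break it.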
Clearly you are out of touch with the actual market. I do B2B sales for a living; Nvidia completely crushes AMD in terms of GPU sales - gaming, AI, enterprise, it doesn't matter, Nvidia is the king.
And is this crushing based exclusively on 4090 sales?
I was not arguing that people don't want, or don't buy, Nvidia.
I was arguing that most people want cheaper cards with better performance, not faster cards at even higher prices.
TechPowerUp has like 50+ FSR vs DLSS/DLAA tests and Nvidia wins every time. They also have superior Frame Gen, without artifacts and ghosting.
I was not talking about upscaling. I was talking about frame generation.
From TPU's own conclusion of FSR FG vs DLSS FG:
the image quality of FSR 3.1 Frame Generation is excellent. In Horizon Forbidden West, when using DLSS as the base image for both Frame Generation solutions, we didn't see any major differences in image quality between AMD's and NVIDIA's Frame Generation solutions, which is a very good thing. The only exception is a slightly softer overall image in motion with FSR 3.1 Frame Generation, specifically at 1080p resolution.
Nvidia has superior drivers, by far. Nvidia runs flawlessly no matter which game you open - early access, betas, emulation, Nvidia does it all without a problem.
Spoken like a fanboy. No card, no matter how "superior" its drivers are, runs "flawlessly" in every game.
Just open the Nvidia forums and you'll see plenty of people with driver problems. It's true that Nvidia has fewer issues than AMD, or especially Intel, but I never claimed otherwise.

Nvidia lists known issues in their driver releases every time, and often they stay there for months on end before finally (I presume) getting fixed.
Nvidia has historically also had worse drivers on Linux - you know, the OS most of the world uses (in enterprise, embedded, smartphones, etc.).
Only recently have they started to improve their Linux drivers by opening up more previously closed source code.
AMD has wonky drivers, and I know this for sure since I am coming from a 6800 XT and have built like 100+ mid- to high-end rigs in the last 5 years, minimum. 9 out of 10 people want Nvidia; that's the hard reality for you.
I too have AMD boxes in addition to Nvidia, and I've yet to see these "wonky drivers" you speak of. Granted, I only use WHQL versions.
I have friends who have AMD cards, and they don't complain to me about "wonky drivers".
If you search the internet, there are plenty of driver problems with every product, no matter the manufacturer.
AMD GPUs have gotten worse and worse over the last few generations; their focus has shifted away from dGPUs, which shows.
Again I ask: how? You speak about drivers - I assume you mean that? Or is it features?
AMD is cheaper for a reason. If they were actually good, they would gain market share, not lose it year after year. They have improved nothing over the last many generations. Rushed features that are cheap knockoffs of Nvidia's tech is what they do.
Again, spoken like a fanboy, failing to see any progress from "cheaper" competitors who are no doubt worse and keep getting worse every year. Keep that positive outlook going, buddy...
DLDSR beats VSR
DLSS/DLAA beats FSR
Nvidia Frame Gen beats AMD Frame Gen.
Reflex beats Anti-Lag+ (and AL+ got people Steam banned haha)
Nvidia has longer support - even the GTX 600 series from 15 years ago still gets drivers, meanwhile AMD pulled Polaris and Vega support
Nvidia cards can use RT and even path tracing
ShadowPlay beats ReLive
Have I said they don't?

Most of those features are also exclusive to Nvidia's own cards, or even to their latest series, screwing over customers of their previous series.

Longer support? When we look at the latest drivers, quarterly driver releases for Vega are not "pulling support". This is a myth that started spreading and keeps spreading. People who repeat this lie never actually bother to visit AMD's site and check for themselves, because that would be too hard and would disrupt their narrative.

With their current drivers, Nvidia actively supports the 900 series and newer, released in 2014.
AMD supports the 400 series and newer, released in 2016.
The difference is 10 vs. 8 years.
So Nvidia has active support for 2 more years, not 4 years like you claim.

The Vega series has very recent drivers from March of this year, as does the 400/500 series. Only the very old R9 200/300/Fury series are on legacy drivers from a couple of years back.

R9 200/300/Fury:
Adrenalin 22.6.1 WHQL
Release Date: 2022-06-23

Radeon VII/Vega 56 & 64 + RX 400/500:
Adrenalin 24.3.1 WHQL
Release Date: 2024-03-20

So it seems Nvidia supports its oldest series for up to ten years, meaning 900 series support will likely be dropped next year. AMD seems to support its older series for 7-8 years.
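Same dates, done as a quick sanity check (driver dates as listed above; the Fury launch year and the "today" date are my assumptions):

```python
from datetime import date

# (launch year, date of most recent driver) per the figures above
series = {
    "NVIDIA 900 series (current branch)": (2014, date(2024, 8, 29)),  # thread date, assumed
    "AMD RX 400/500 & Vega (24.3.1)":     (2016, date(2024, 3, 20)),
    "AMD R9 Fury (22.6.1, legacy)":       (2015, date(2022, 6, 23)),  # launch year assumed
}

for name, (launch, last_driver) in series.items():
    print(f"{name}: ~{last_driver.year - launch} years of driver support")
```

That prints roughly 10, 8, and 7 years, which is where the "2 more years, not 4" figure comes from.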

Next time, educate yourself instead of spouting random nonsense you might have read or heard on the internet.

Hilarious that you say Nvidia can use RT, and AMD can't?
PT is a total non-issue (how many games actually use it?), as even the 4090 struggles with it and needs every performance-enhancing toggle enabled to get playable framerates. People who buy a $1700+ card to play at 60 fps with upscaling and FG enabled in a handful of games are idiots.
PT is essentially a preview of what will one day be possible. Today, it's a tech demo.
Nvidia invented every single feature, and AMD tried to copy each one, but failed.
AMD seems to be focusing more on hardware, not software features.

Who came up with MCM GPUs first? Nvidia has not even tried to copy it yet. Arguably they don't need to, but one day they will have to out of necessity, as a huge monolithic chip on ever more expensive wafers is a big loss if it has any defects. They already do it to some degree with Blackwell, where two big dies are joined together by a high-speed interconnect, not too dissimilar to AMD's Infinity Fabric. It's only a matter of time before all three manufacturers move to MCM GPUs, at least for high-end cards.

Who introduced ReBAR first, and who copied it?
Historically, AMD has also been the first to use a new generation of VRAM. They did it with GDDR4, and they did it with HBM and HBM2, etc.
AMD cards are also more forward-looking (in terms of hardware), with more VRAM out of the box, newer display outputs, hardware scheduling, async compute, etc.
Also, AMD uses more power and has lower resale value; you save nothing by going with an AMD GPU in the end.
Is this the old "AMD is hot and loud" argument again? I thought that died in the R9 300 era, but apparently not.
I see AMD cards reselling for quite some money. If you were right, I should be able to pick up high-end cards for pennies.
 
Last edited:
Joined
Dec 25, 2020
Messages
6,744 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Does this mean they aren't coming out with an 8th-gen counterpart to the 7900 XTX? Or simply that they don't plan on focusing on increased performance as much?

The former - they are targeting RTX 4080 performance at the power footprint of a 7800 XT or 4070 Ti.

I think they'll be great cards if they can pull it off and if the price is right, but the high end will go uncontested.
 
Joined
Sep 28, 2005
Messages
3,326 (0.48/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
I want a card I can replace my RTX 3080 with - less heat, less power draw, and better RT - while being great to run on Linux.

So get cracking at it AMD
 
Joined
Oct 2, 2015
Messages
3,134 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
RDNA3 is no jump over RDNA2? A quick look at the 7900 GRE review shows the 7900 XTX being 43% faster than the 6900 XT at 2560x1440. Hell, I was super tempted to switch over from a 3090 because it's just that much faster, but the lack of side-ported waterblocks was the only deterrent.

I also tend to play some Warzone nowadays, and for whatever reason RDNA3 is stupid fast in that game - faster even than the 4090 at most resolutions.

I would say being 43% faster is a plus.



I think they will at least match the 7900 XTX, but most seem to think otherwise. I guess we'll see soon enough.
Check the lower-end product numbers. The 7600 manages to be slower than the 6600 XT.

But hey, go and buy a 4070, which would have been a **60-class GPU, for $600 instead of the $300 it would have cost before :kookoo: Or, if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, then get bummed for another $2K when the 5090 comes out? This is not the majority of buyers, and Nvidia is taking the piss. Now more people are waiting multiple gens to upgrade, because $5-600 for shit mid-gen GPUs is not realistic in the real world. I have an RX 6800 I bought for £350, and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPUs. I am going to hodl this until at least the Radeon 10 series or Nvidia 70**. They can charge their overpriced BS money for the same performance class all they want, both of them; I won't be spending a dime on either.
Yet you didn't upgrade to RDNA3.

NVIDIA is expensive, everyone knows that; the point is that AMD copying those prices without offering anything in return made RDNA3 a terrible release.
 
Joined
May 7, 2023
Messages
649 (1.14/day)
Processor Ryzen 5700x
Motherboard Gigabyte Auros Elite AX V2
Cooling Thermalright Peerless Assassin SE White
Memory TeamGroup T-Force Delta RGB 32GB 3600Mhz
Video Card(s) PowerColor Red Dragon Rx 6800
Storage Fanxiang S660 1TB, Fanxiang S500 Pro 1TB, BraveEagle 240GB SSD, 2TB Seagate HDD
Case Corsair 4000D White
Power Supply Corsair RM750x SHIFT
Check the lower-end product numbers. The 7600 manages to be slower than the 6600 XT.


Yet you didn't upgrade to RDNA3.

NVIDIA is expensive, everyone knows that; the point is that AMD copying those prices without offering anything in return made RDNA3 a terrible release.
If it's good enough for the goose, it's good enough for the gander. Why the fuck should AMD be expected to cut their prices because Nvidia is "better"? They will be priced accordingly - so if a 7900 XT beats a 4080 in raster, then it will be priced accordingly. Where has the $200-$300 market gone, the one that was the go-to for gamers for years? Now all of a sudden we're expected to pay $600 for that class of card? It's ridiculous. I have never paid more than $400 for a GPU and never will, so it looks like I'm relegated to waiting 3-4 gens for a mid-class GPU that was supposed to be $300 but was inflated to $600, and to buying used last-gen parts. The bubble's gonna burst, and I will be waiting with bated breath when it does. Fuck their AI and enterprise.
 
Joined
Oct 2, 2015
Messages
3,134 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
If it's good enough for the goose, it's good enough for the gander. Why the fuck should AMD be expected to cut their prices because Nvidia is "better"? They will be priced accordingly - so if a 7900 XT beats a 4080 in raster, then it will be priced accordingly. Where has the $200-$300 market gone, the one that was the go-to for gamers for years? Now all of a sudden we're expected to pay $600 for that class of card? It's ridiculous. I have never paid more than $400 for a GPU and never will, so it looks like I'm relegated to waiting 3-4 gens for a mid-class GPU that was supposed to be $300 but was inflated to $600, and to buying used last-gen parts. The bubble's gonna burst, and I will be waiting with bated breath when it does. Fuck their AI and enterprise.
Because it's 2024, not 2008; raster is no longer the sole metric by which to measure a GPU.

Leaving the RT discussion aside, AMD is at a disadvantage in encoding and decoding, in compute software quality, stability, and hardware support, in having a tensor-core equivalent and the software that takes advantage of it, and in system stability, particularly on Linux given the lack of hardware support for GPU resets, etc.

It's a product that can only game well; it's lower quality at everything else, and that merits a lower price. AMD themselves know this, so this is how they intend to tackle the problem.
 
Joined
May 3, 2018
Messages
2,881 (1.20/day)
It sure sounds to me like AMD is gonna stick to 7800 XT performance and leave high-end buyers out to dry.

Unfortunate, but perhaps expected by this point.
Huh? All leaks indicate the 8800 XT will offer 7900 XT/XTX levels of raster and be much faster in RT. It will be much stronger than the rubbish 7700 XT, erm, 7800 XT, at lower power.

The best comparison is RX 6650 XT -> RX 7600

View attachment 361134
View attachment 361135

Performance is equal.

View attachment 361136

Good to see AMD really adding new meaning to the word "progress". If they had called this the 7500 XT and lowered the price by $30, it would have been much better received. The 7600 XT is even more disappointing.
 
Joined
Dec 25, 2020
Messages
6,744 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Huh? All leaks indicate the 8800 XT will offer 7900 XT/XTX levels of raster and be much faster in RT. It will be much stronger than the rubbish 7700 XT, erm, 7800 XT, at lower power.

Good to see AMD really adding new meaning to the word "progress". If they had called this the 7500 XT and lowered the price by $30, it would have been much better received. The 7600 XT is even more disappointing.

It's worth noting that the RX 7600/7600 XT (Navi 33) doesn't actually support some of the backend features, nor does it contain some of the chip-level architectural improvements of the Navi 31 and 32 silicon, which makes these cards utterly redundant in the face of their RDNA 2-based predecessors. Progress indeed.

For all of Nvidia's faults... it could be a lot worse. Their competition is completely misguided as usual, and Arc isn't quite there yet. Things will heat up once BMG arrives, but Alchemist is a done deal at this point.
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
if it's good enough for the goose it's good enough for the gander, why the fuck should AMD be expected to cut their prices because Nvidia is "better" they will be priced acordingly, so if a 7900 XT beats a 4080 in raster then it will be priced accordingly, where has the $200-$300 market gone? the one that has been the go-to for gamers for years, now all of a sudden we're expected to pay $600 for these class of cards? it's ridiculous, I have never paid more than $400 for a GPU and never will, looks like I'm relegated to waiting 3-4 gens for a mid class GPU that was supposed to be $300 but was inflated to $600 and buying used last gen parts, bubble's gonna burst, I will be waiting with baited breath when it does, fuck their AI and enterprise
Same here. The 7800 XT was the most expensive GPU I've ever bought (although I sold it quickly to a friend). Everybody keeps bringing up "the market" and "current trends" as reasons why I should spend more, but honestly, I don't care. My salary doesn't reflect "the market" in any way, so if £300 was good enough for a GPU ten years ago, then £4-500 should be more than enough for one now. AMD not targeting higher segments with RDNA 4 doesn't affect me in the slightest. If they can improve RT and efficiency, and bring video playback power consumption down to acceptable levels, they'll have a buyer.
 
Joined
Oct 30, 2020
Messages
250 (0.17/day)
Check the lower-end product numbers. The 7600 manages to be slower than the 6600 XT.

That's one model, but you can't take that and say RDNA3 offers nothing over RDNA2. The performance numbers say otherwise.

Boring release, yada yada, that's fine. But it offers a substantial uplift at the top end.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Exhibit A: Fury to RX 480. From $549 to $229 at 92% of the performance. Plus double the VRAM, despite most other specs being downgraded.
Exhibit B: Vega 56 to 5700 XT. From $399 to $399 at 121% of the performance. Most specs were downgrades, but performance actually increased.

Both are cases where AMD released mid-range cards after its high-end cards failed. Fury failed against the Maxwell-based 900 series, and Vega failed against the Pascal-based 10 series.

And history is about to repeat for the third time. But sure, believe what you want to believe, hoping that there's no way history would repeat itself so soon, or ever.

And is this crushing based exclusively on 4090 sales?
I was not arguing that people don't want, or don't buy, Nvidia.
I was arguing that most people want cheaper cards with better performance, not faster cards at even higher prices.

I was not talking about upscaling. I was talking about frame generation.
From TPU's own conclusion of FSR FG vs DLSS FG:


Spoken like a fanboy. No card, no matter how "superior" its drivers are, runs "flawlessly" in every game.
Just open the Nvidia forums and you'll see plenty of people with driver problems. It's true that Nvidia has fewer issues than AMD, or especially Intel, but I never claimed otherwise.

Nvidia lists known issues in their driver releases every time, and often they stay there for months on end before finally (I presume) getting fixed.
Nvidia has historically also had worse drivers on Linux - you know, the OS most of the world uses (in enterprise, embedded, smartphones, etc.).
Only recently have they started to improve their Linux drivers by opening up more previously closed source code.

I too have AMD boxes in addition to Nvidia, and I've yet to see these "wonky drivers" you speak of. Granted, I only use WHQL versions.
I have friends who have AMD cards, and they don't complain to me about "wonky drivers".
If you search the internet, there are plenty of driver problems with every product, no matter the manufacturer.

Again I ask: how? You speak about drivers - I assume you mean that? Or is it features?

Again, spoken like a fanboy, failing to see any progress from "cheaper" competitors who are no doubt worse and keep getting worse every year. Keep that positive outlook going, buddy...

Have I said they don't?

Most of those features are also exclusive to Nvidia's own cards, or even to their latest series, screwing over customers of their previous series.

Longer support? When we look at the latest drivers, quarterly driver releases for Vega are not "pulling support". This is a myth that started spreading and keeps spreading. People who repeat this lie never actually bother to visit AMD's site and check for themselves, because that would be too hard and would disrupt their narrative.

With their current drivers, Nvidia actively supports the 900 series and newer, released in 2014.
AMD supports the 400 series and newer, released in 2016.
The difference is 10 vs. 8 years.
So Nvidia has active support for 2 more years, not 4 years like you claim.

The Vega series has very recent drivers from March of this year, as does the 400/500 series. Only the very old R9 200/300/Fury series are on legacy drivers from a couple of years back.

R9 200/300/Fury:
Adrenalin 22.6.1 WHQL
Release Date: 2022-06-23

Radeon VII/Vega 56 & 64 + RX 400/500:
Adrenalin 24.3.1 WHQL
Release Date: 2024-03-20

So it seems Nvidia supports its oldest series for up to ten years, meaning 900 series support will likely be dropped next year. AMD seems to support its older series for 7-8 years.

Next time, educate yourself instead of spouting random nonsense you might have read or heard on the internet.

Hilarious that you say Nvidia can use RT, and AMD can't?
PT is a total non-issue (how many games actually use it?), as even the 4090 struggles with it and needs every performance-enhancing toggle enabled to get playable framerates. People who buy a $1700+ card to play at 60 fps with upscaling and FG enabled in a handful of games are idiots.
PT is essentially a preview of what will one day be possible. Today, it's a tech demo.

AMD seems to be focusing more on hardware, not software features.

Who came up with MCM GPUs first? Nvidia has not even tried to copy it yet. Arguably they don't need to, but one day they will have to out of necessity, as a huge monolithic chip on ever more expensive wafers is a big loss if it has any defects. They already do it to some degree with Blackwell, where two big dies are joined together by a high-speed interconnect, not too dissimilar to AMD's Infinity Fabric. It's only a matter of time before all three manufacturers move to MCM GPUs, at least for high-end cards.

Who introduced ReBAR first, and who copied it?
Historically, AMD has also been the first to use a new generation of VRAM. They did it with GDDR4, and they did it with HBM and HBM2, etc.
AMD cards are also more forward-looking (in terms of hardware), with more VRAM out of the box, newer display outputs, hardware scheduling, async compute, etc.

Is this the old "AMD is hot and loud" argument again? I thought that died in the R9 300 era, but apparently not.
I see AMD cards reselling for quite some money. If you were right, I should be able to pick up high-end cards for pennies.
Who came up with MCM GPUs, and failed miserably? Yeah, AMD. Going MCM and STILL losing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic dies; no need to go MCM.

Yeah, AMD used HBM first and failed big time as well. 4GB on the Fury series, DOA before they even launched, and the 980 Ti absolutely wrecked the Fury X. Especially with OC - the 980 Ti gained massive performance there while the Fury X barely gained 1% as its power draw exploded. The worst GPU release ever. Lisa Su even called the Fury X an overclocker's dream, which has to be the biggest joke ever. I still laugh hard when I watch the video.

AMD seems to be focusing on CPUs, as they should. They are a CPU company first. They barely make a dime on consumer GPUs and target AI and enterprise now, yet Nvidia is the king of AI. AMD wants a piece of the pie here; they don't care about gaming GPUs, which shows. They're already below 10% dGPU market share and their offerings are meh.

RDNA4 will be a joke, just wait and see. AMD spent no money developing it; it's merely an RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing, and FSR/Frame Gen won't help them here either, because they're mediocre as well.

AMD thinks a 110°C hotspot temperature is acceptable, so yeah, AMD runs hotter and also uses more power. Low demand means low resale value. You save nothing buying an AMD GPU in the end.

You are the fanboy here, obviously. Everything I state is fact. AMD's features are mediocre, AMD's drivers are wonky, game support is meh. AMD spends most of its time improving performance in games that get benchmarked so they look decent in reviews; that's why most early access games, betas, and less popular games in general tend to run like crap on AMD GPUs. Zero focus from AMD, zero focus from devs, because 9 out of 10 use Nvidia.

I use an AMD CPU. Why? Because they make good CPUs. I don't use an AMD GPU. Why? Because their GPUs are crap - worse than ever, pretty much. I miss ATi.
 
Last edited:
Joined
Oct 2, 2015
Messages
3,134 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
That's one model, but you can't take that and say RDNA3 offers nothing over RDNA2. The performance numbers say otherwise.

Boring release, yada yada, that's fine. But it offers a substantial uplift at the top end.
That's the minimum it has to do, and if it fails to do that across the whole stack, it's a ridiculous release. The top end is dominated by better products from the competition whichever way you look at it, and the low end is a sidegrade or an outright downgrade. Great job! You just failed the top-end users who aren't married to the brand, and the value and low-budget chasers who are the bulk of your sales.

Gotta love forgetting about the 7600 XT and 7700 XT while at it.
 
Joined
Jan 14, 2019
Messages
12,340 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Who came up with MCM GPUs, and failed miserably? Yeah, AMD. Going MCM and STILL losing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic dies; no need to go MCM.
If you think the MCM design is about performance, or saving power, then you're hugely mistaken. It's all about controlling the impact of defects (smaller chips have better yields) and thus manufacturing costs. Check your CPU's idle power consumption if you don't believe me.
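The yield point is easy to illustrate with the textbook Poisson defect model - a minimal sketch with illustrative numbers (a defect density of 0.1 defects/cm² and round die areas, neither of which is an official figure):

```python
import math

def die_yield(area_mm2: float, defects_per_cm2: float = 0.1) -> float:
    """Fraction of defect-free dice under a simple Poisson defect model."""
    return math.exp(-(area_mm2 / 100) * defects_per_cm2)

print(f"600 mm^2 monolithic die : {die_yield(600):.0%} good dice")
print(f"300 mm^2 compute chiplet: {die_yield(300):.0%} good dice")
print(f" 37 mm^2 cache chiplet  : {die_yield(37):.0%} good dice")
```

That works out to roughly 55%, 74%, and 96% good dice: smaller dice throw away far less silicon per defect, which is the cost argument for chiplets even when performance per watt doesn't improve.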

RDNA4 will be a joke, just wait and see. AMD spent no money developing it; it's merely an RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing, and FSR/Frame Gen won't help them here either, because they're mediocre as well.
If AMD is bad in RT, then should they spend nothing on improving it, and just give up altogether? I don't see your logic here.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Who came up with MCM GPUs, and failed miserably? Yeah, AMD. Going MCM and STILL losing in performance per watt and scalability was an utter fail.
Nvidia beats AMD with ease using monolithic dies; no need to go MCM.

Yeah, AMD used HBM first and failed big time as well. 4GB on the Fury series, DOA before they even launched, and the 980 Ti absolutely wrecked the Fury X. Especially with OC - the 980 Ti gained massive performance there while the Fury X barely gained 1% as its power draw exploded. The worst GPU release ever. Lisa Su even called the Fury X an overclocker's dream, which has to be the biggest joke ever. I still laugh hard when I watch the video.

AMD seems to be focusing on CPUs, as they should. They are a CPU company first. They barely make a dime on consumer GPUs and target AI and enterprise now, yet Nvidia is the king of AI. AMD wants a piece of the pie here; they don't care about gaming GPUs, which shows. They're already below 10% dGPU market share and their offerings are meh.

RDNA4 will be a joke, just wait and see. AMD spent no money developing it; it's merely an RDNA3 bugfix with improved ray tracing, which is pointless since AMD can't do ray tracing, and FSR/Frame Gen won't help them here either, because they're mediocre as well.

AMD thinks a 110°C hotspot temperature is acceptable, so yeah, AMD runs hotter and also uses more power. Low demand means low resale value. You save nothing buying an AMD GPU in the end.

You are the fanboy here, obviously. Everything I state is fact. AMD's features are mediocre, AMD's drivers are wonky, game support is meh. AMD spends most of its time improving performance in games that get benchmarked so they look decent in reviews; that's why most early access games, betas, and less popular games in general tend to run like crap on AMD GPUs. Zero focus from AMD, zero focus from devs, because 9 out of 10 use Nvidia.

I use an AMD CPU. Why? Because they make good CPUs. I don't use an AMD GPU. Why? Because their GPUs are crap - worse than ever, pretty much. I miss ATi.

If AMD were a normal company, Lisa Su's "head" would have rolled a long, long time ago, precisely because the GPU department is not delivering.
AMD must be a GPU-centric, GPU-first company in order to generate money as it should.
Stupid, stupid...
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
If you think the MCM design is about performance, or saving power, then you're hugely mistaken. It's all about controlling the impact of defects (smaller chips have better yields) and thus manufacturing costs. Check your CPU's idle power consumption if you don't believe me.


If AMD is bad in RT, then should they spend nothing on improving it, and just give up altogether? I don't see your logic here.
MCM is about scalability; it always has been. AMD said this officially. Usually MCM has better performance per watt; AMD's GPUs don't, but their CPUs do.

My CPU's low power consumption is mainly due to low clock speeds; the 3D cache is fragile. It has nothing to do with MCM, since it's a single CCD. I wanted the best gaming chip, and sadly for AMD, the 7800X3D beats both the 7900X3D and the 7950X3D here. Dual CCD is just not very good for gaming due to latency issues, and it doesn't help that only one CCD has the 3D cache either. The 7900X3D in particular is bad, since it's only 6 cores with 3D cache.
 