Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

You seem to overestimate how much 30 W is. It's not a great deal of power.
It's the whole TDP of a GT 1030. And yes, a GT 1030 can play back any movie for you at any reasonable resolution (up to 8K, I'd guess) without any issue.

I shouldn't need to remind anyone that this is a card from three generations ago. 30 W is at least three times what it should be. Opinions like yours are fuelling this clown fiesta of 500 W CPUs and 500 W GPUs.

10 percent is the magic number here, by the way. By cutting the clocks by 10 percent and undervolting appropriately, you lower the TDP of most GPUs by a solid third, sometimes more. If the megacorps took this approach, the 350 W RTX 3090 could've been a 220 W card that's still capable of 4K gaming, just at slightly lower performance. It would also make the card much cheaper, since it wouldn't need as many VRM components or as complex a cooling solution. Alas, we are all eyewitnesses to absurd clocks and wattages being pushed out of the box. At this rate we will have no compact GPUs in any price segment by 2030. To say I hate this is to say nothing.
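As a rough back-of-the-envelope check of that claim (a sketch only, assuming the usual first-order model where dynamic power scales with clock × voltage² and ignoring static leakage and board overhead; the 350 W baseline and the undervolt depths are illustrative numbers, not measurements):

```python
# Hypothetical sketch: how a 10% clock cut plus an undervolt shrinks power,
# assuming dynamic power ~ f * V^2 (leakage and board overhead ignored).

def relative_power(clock_scale: float, voltage_scale: float) -> float:
    """New power as a fraction of stock power under P ~ f * V^2."""
    return clock_scale * voltage_scale ** 2

stock_watts = 350  # illustrative stock board power (RTX 3090-class)
for undervolt in (0.10, 0.15):  # illustrative undervolt depths
    frac = relative_power(0.90, 1.0 - undervolt)  # 10% lower clocks
    print(f"-10% clock, -{undervolt:.0%} voltage -> "
          f"{frac:.0%} of stock ≈ {stock_watts * frac:.0f} W")

# -10% clock, -10% voltage -> 73% of stock ≈ 255 W
# -10% clock, -15% voltage -> 65% of stock ≈ 228 W
```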
 
It's the whole TDP of a GT 1030. And yes, a GT 1030 can play back any movie for you at any reasonable resolution (up to 8K, I'd guess) without any issue.

I shouldn't need to remind anyone that this is a card from three generations ago. 30 W is at least three times what it should be. Opinions like yours are fuelling this clown fiesta of 500 W CPUs and 500 W GPUs.

10 percent is the magic number here, by the way. By cutting the clocks by 10 percent and undervolting appropriately, you lower the TDP of most GPUs by a solid third, sometimes more. If the megacorps took this approach, the 350 W RTX 3090 could've been a 220 W card that's still capable of 4K gaming, just at slightly lower performance. It would also make the card much cheaper, since it wouldn't need as many VRM components or as complex a cooling solution. Alas, we are all eyewitnesses to absurd clocks and wattages being pushed out of the box. At this rate we will have no compact GPUs in any price segment by 2030. To say I hate this is to say nothing.
Quit with the nit-picking and name-calling.
 
@Vayra86 Looking at the 60 Hz V-Sync and gaming power consumption numbers from the review page for all three RDNA3 cards, and comparing them with the RDNA2 cards, it seems that RDNA3 is just heavily optimized for low-to-medium-load situations. When pushed too hard, it loses that efficiency. And the 7600 seems to be set outside its efficiency curve.

[Attached chart: 60 Hz V-Sync and gaming power consumption, RDNA2 vs. RDNA3 cards]


You seem to overestimate how much 30 W is.
You seem to keep missing the point. I am not looking at my power bill when talking about power consumption during video playback. You keep insisting on that.
 
You seem to keep missing the point. I am not looking at my power bill when talking about power consumption during video playback. You keep insisting on that.
Then quit pointing it out. I'm not the one insisting that a few extra watts being used here and there is a big deal.
 
Yes, because whining about a few watts of power is going to make a difference...
I don't know what universe you are living in, but 30 W is by no means a few. It's humongous considering the task at hand. Just full-on insanity with no efficiency whatsoever.

And I don't know what I'm supposed to do aside from "whining" about the real problem, given that I have no private army of a gajillion assassins to make nGreedia and AMD stop making faulty products and charging millions for them.

I'm doing my best. I'm advising people to buy nothing from them (unless they desperately need their wares, of course) and pointing out how bad they've become. The more people who realize they're bad, the merrier. Words have power, too.
 
I don't know what universe you are living in, but 30 W is by no means a few.
Perspective? I live in a world where room lighting, even LED models, uses between 65 W and 275 W on a constant basis. And that's just lighting. Anyone who is going to quibble over a few extra watts needs a fair dose of reality upside the face.

I'm doing my best. I'm advising people to buy nothing from them (unless they desperately need their wares, of course) and pointing out how bad they've become. The more people who realize they're bad, the merrier. Words have power, too.
And that seems like a fair amount of special-snowflaking to me... But what do I know...
 
Perspective? I live in a world where room lighting, even LED models, uses between 65 W and 275 W on a constant basis. And that's just lighting. Anyone who is going to quibble over a few extra watts needs a fair dose of reality upside the face.


And that seems like a fair amount of special-snowflaking to me... But what do I know...
My room lighting only consumes 20 W for 150 sq ft, which is more than enough. The fact that something is going a certain way doesn't mean it's good. High wattages are not just like a cancer; they are a cancer, plain and simple.

Playing back movies at a 30 W TDP is no different from shooting pigeons with a bazooka. Same level of overkill.
 
Then quit pointing it out. I'm not the one insisting that a few extra watts being used here and there is a big deal.
No, you're just someone who tries to downplay my point by distorting it. And you keep doing it, even with this post of yours.

And YES, it does matter to me, and probably to others. It doesn't matter much on the power bill, and that's not the point, it was never the point, but it does matter when looking at the card and its architecture. Graphics cards today are not like the old 3dfx cards whose sole purpose was 3D graphics. They are used to display the desktop, play back video (and not just that), run compute tasks, and drive one or more monitors. So a modern card that is marketed with all the bells and whistles about efficiency needs to be efficient at everything, or at least as efficient as the model it replaces. You disagree? OK. Then quit making such a big deal of distorting my point.
 
My room lighting only consumes 20 W for 150 sq ft, which is more than enough.
That's your room. That's not everyone, everywhere.
High wattages are not just like a cancer; they are a cancer, plain and simple.
Really? :rolleyes:
Playing back movies at a 30 W TDP is no different from shooting pigeons with a bazooka. Same level of overkill.
That seems like more special-snowflaking.

News flash for you: most DVD/Blu-ray players use between 45 W and 55 W for movie playback... Just throwing it out there...

So a GPU like a Radeon using a few extra watts for video playback is NOT the end of the world, nor even something worrisome.

No, you're just someone who tries to downplay my point by distorting it. And you keep doing it, even with this post of yours.
No, it's called reality. Staying factual is the goal of any scientist.

However, we've strayed from the topic. Let's rope ourselves back in...
 
That's your room. That's not everyone, everywhere.
This means my lighting is more efficient than average. Why don't people make their living spaces more efficient?
[sarcasm] No.
more special-snowflaking
name-calling.
Heh.
DVD/Blu-ray players
Ancient tech is inefficient? OK, that happens. It still doesn't affect my point about modern GPUs going the way of killing ants with nukes. They are getting larger and hotter, and losing a lot in terms of efficiency. Just explain to me why I should be okay with the looming prospect of needing a hangar just for an average video card.
 
Perspective? I live in a world where room lighting, even LED models, uses between 65 W and 275 W on a constant basis. And that's just lighting. Anyone who is going to quibble over a few extra watts needs a fair dose of reality upside the face.
Well, we've gone from 100 W incandescent lamps, to 20 W energy-saving lamps, to LED lamps at 10 W or less.

News flash for you: most DVD/Blu-ray players use between 45 W and 55 W for movie playback... Just throwing it out there...
That's the whole device, not the decoding chip.

Amazon.com: Sony UBP-X700 4K Ultra HD Home Theater Streaming Blu-Ray Player : Electronics
Stunning picture with 4K upscaling up to 60p. Power consumption: 15 W in operation, 0.35 W in standby.


No, it's called reality.
Your reality seems to be different from reality.
 
I'm talking about the aforementioned absurdly high wattage during video playback and multi-monitor usage. It doesn't matter if it gets fixed; the only thing that matters is that they launched a product unable to perform efficiently. You can afford to charge customers for the privilege of pre-alpha testing when you're miles ahead of your competition, not when you're stone-age behind and not really competition at all.

The 7900 XTX was launched marginally cheaper than the 4080 and has nothing to brag about. +8 GB of VRAM does nothing, since the 4080 can turn DLSS 3 on and yecgaa away from the 7900 XTX. The "sooper dooper mega chiplet arch" also does nothing when the 4080 can hold 60-ish framerates with RT on whilst the 7900 XTX manages a shambly 30-ish FPS with constant stuttering. More raw power per buck? WHO CARES?
Cope award for June 2023 goes to
 
@Vayra86 Looking at the 60 Hz V-Sync and gaming power consumption numbers from the review page for all three RDNA3 cards, and comparing them with the RDNA2 cards, it seems that RDNA3 is just heavily optimized for low-to-medium-load situations. When pushed too hard, it loses that efficiency. And the 7600 seems to be set outside its efficiency curve.

[Attached chart: 60 Hz V-Sync and gaming power consumption, RDNA2 vs. RDNA3 cards]


You seem to keep missing the point. I am not looking at my power bill when talking about power consumption during video playback. You keep insisting on that.
What? No. I don't know if we need to ask W1zz himself, but AFAIK the energy-per-frame chart is made up of the standard ultra-settings suite. And given that all these cards have run the same stuff, well...

60 Hz V-Sync says nothing about efficiency; it says something about overall card performance relative to whatever is required to run the game at 60 Hz. Note that bigger cards need more watts to run the same stuff at 60 Hz. Suddenly the energy-per-frame chart is turned upside down with respect to where the 7600 sits relative to a 7900 XT(X). Obviously this has something to do with what we see elsewhere in lower-load situations for these cards: they actually do NOT handle low-load situations well, even if they've dropped to a third of their maximum board power doing 60 Hz V-Sync. Having a 7900 XT, I know what clocks you're seeing then: somewhere along the lines of 1400 MHz, perhaps even down to 1100 MHz. With a stock peak of 2750 MHz, that's a pretty damn dynamic range. Another aspect is that higher-tier cards carry a lot more VRAM and a much bigger die, so they play with voltages on a much bigger field; 0.1 V is not as fine-grained on a big GPU as it is on an entry-level one.

And that of course echoes what we see in video playback and idle on RDNA3. They could simply do with a few more intermediate low-load power states, is my guess, but that's probably harder to do on chiplets. The world isn't ending here. We're seeing the teething problems of a new technology on GPUs. And honestly, with the recent fixes, it's a mild case.

It's totally imaginable that we'll see more dynamic power states in the future; perhaps they'll even manage to turn off specific chiplets completely to get an efficiency bump at low loads.

Efficiency where it matters, to me, is gaming at >60% load. That's where you'll be most of the time, or you'll be idling. Again... reality checks are in order: that low-load gap of 30 W (we see it here again, in 60 Hz V-Sync, between the 7600 and the 7900 XT(X)!) is the equivalent of having one extra old-school light bulb on in the house. Those started at about 25 W. Honestly, who cares.
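For the sake of numbers, a quick sketch of what that gap amounts to on the meter (the hours per day and the electricity price below are assumptions, not measurements; plug in your own usage and local tariff):

```python
# What a constant 30 W low-load gap costs over a year.
# hours_per_day and price_per_kwh are assumed values, not measurements.

extra_watts = 30        # the low-load power gap under discussion
hours_per_day = 4       # assumed daily low-load use (video, desktop, browsing)
price_per_kwh = 0.30    # assumed electricity price in $/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.1f} kWh/year ≈ ${kwh_per_year * price_per_kwh:.2f}/year")
# -> 43.8 kWh/year ≈ $13.14/year
```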
 
So many emotions. The issue that some people are not seeing is that AMD users only look at past AMD cards for reference. As someone else has pointed out, the jump from the 6000 to the 7000 series is felt most at 4K. I posted a video on what AMD needs to do when they did a side-by-side test at PAX East. The media and the internet blasted AMD for being cocky about the launch of the 7000 series, but it was proven to me: I bought my 4K monitor (FV43U) the previous year and had to run many games at 1440p to get high frame rates, whereas now the only thing I change is V-Sync. I fully expect the 7800 XT to blow people away with its performance, but it will be a 1440p card, which is a much more common resolution for gamers.

Using other products to try to get an idea of performance is, to me, like people thinking that a 7950X3D with one CCD turned off is the same as a 7800X3D, and then saying that the 7900X3D is a slower CPU than the 7800X3D because it has fewer cores.
 
What? No. I don't know if we need to ask W1zz himself, but AFAIK the energy-per-frame chart is made up of the standard ultra-settings suite. And given that all these cards have run the same stuff, well...

60 Hz V-Sync says nothing about efficiency; it says something about overall card performance relative to whatever is required to run the game at 60 Hz. Note that bigger cards need more watts to run the same stuff at 60 Hz. Suddenly the energy-per-frame chart is turned upside down with respect to where the 7600 sits relative to a 7900 XT(X). Obviously this has something to do with what we see elsewhere in lower-load situations for these cards: they actually do NOT handle low-load situations well, even if they've dropped to a third of their maximum board power doing 60 Hz V-Sync. Having a 7900 XT, I know what clocks you're seeing then: somewhere along the lines of 1400 MHz, perhaps even down to 1100 MHz. With a stock peak of 2750 MHz, that's a pretty damn dynamic range. Another aspect is that higher-tier cards carry a lot more VRAM and a much bigger die, so they play with voltages on a much bigger field; 0.1 V is not as fine-grained on a big GPU as it is on an entry-level one.
Look at the lines.
The RDNA1 representative, the RX 5700 XT (green line), is above the middle in the gaming chart and the worst performer when set to draw just 60 fps.
RDNA2 cards (blue lines) do better when limited to 60 fps.
RDNA3 cards (red lines) not only keep their positions in the charts, but even improve them.

I think AMD is targeting mobile, not desktop, with its designs. Designs that offer excellent performance at much lower frequencies can be used in laptops, consoles, and tablets. That being said, it's strange that the 7000 cards don't excel in video playback.

Bigger cards have more CUs to hit those 60 fps, so they don't need to push their GPUs as hard. The 7900 XT/XTX cards show better behavior limited to 60 fps compared to the 7600. Still too high, but better behavior than the little model. Looking at Nvidia's 4000 series, the top GPU, the 4090, is not far from the lowest available model, the 4060 Ti: just 20 W apart. AMD needs to find a way to limit the power consumption of its GPUs no matter how many CUs they include. That will also help in other areas, like video playback and multi-monitor use.
 

The problem isn't RDNA3, it's the competitive edge. The whole RDNA3 'stack' (three SKUs, lol) just doesn't lead, and also fails to cover a relevant stack position in price/perf. Ada realistically has it beat: it's ever so slightly more power efficient, has a bigger feature set, and the raster perf/$ it gives up compared to RDNA3 is compensated for by DLSS 3, if that's what you're willing to wait for. All aspects that command a premium. It should have competed on price. The only advantage it has is VRAM, but RDNA2 has that too.

That's where people imagine that 7800 XT. But I think we already have the 7900 XT doing this. It's just too expensive. The price drops we saw in the news yesterday are finally nudging it into the right place, but now there is no wow effect, and local pricing will adjust too slowly for it to really make a dent.

AMD did a penny-wise, pound-foolish launch imho.


I'm sure 3 GHz was a stretch goal for them or something, because I have actually seen my 7900 XT go over 2900 MHz on rare occasions. All it took was nudging the max frequency slider to the right. You won't get it reliably, but it will boost to it situationally.

The 7900 XT is literally cheaper than the 4070 Ti's retail price now. So $779 is not a bad price for it; sure, it can always be better. Shit, Amazon has had coupons recently putting the 7900 XTX below $900, including the Merc 310. So they are getting there.
 
A smaller chip with lots more CUs than the big one, but cut in half to place it into its desired product segment... even if I let my imagination run wild, I find it hard to believe all this. There's literally zero financial benefit in designing a new GPU and shaving half of it off to sell it cheaper.
 
Look at the lines.
The RDNA1 representative, the RX 5700 XT (green line), is above the middle in the gaming chart and the worst performer when set to draw just 60 fps.
RDNA2 cards (blue lines) do better when limited to 60 fps.
RDNA3 cards (red lines) not only keep their positions in the charts, but even improve them.

I think AMD is targeting mobile, not desktop, with its designs. Designs that offer excellent performance at much lower frequencies can be used in laptops, consoles, and tablets. That being said, it's strange that the 7000 cards don't excel in video playback.

Bigger cards have more CUs to hit those 60 fps, so they don't need to push their GPUs as hard. The 7900 XT/XTX cards show better behavior limited to 60 fps compared to the 7600. Still too high, but better behavior than the little model. Looking at Nvidia's 4000 series, the top GPU, the 4090, is not far from the lowest available model, the 4060 Ti: just 20 W apart. AMD needs to find a way to limit the power consumption of its GPUs no matter how many CUs they include. That will also help in other areas, like video playback and multi-monitor use.
I think you are still reading way too much into it.
 
I think you are still reading way too much into it.
That's the fun part of forums: throw out a number of theories/thoughts/facts, get some replies about them. Otherwise we could just read news articles and be done with it, and go spend our time on something more productive instead.
 
Interesting. Are you of the same opinion with CPUs? Because AMD CPUs consume 30-40 W sitting there playing videos while Intel drops to 5 W.
What Intel CPU does that exactly? Unless you are comparing a laptop i3 to a 16-core desktop AMD chip, this is pure nonsense, as there are exactly zero desktop CPUs that draw only 5 W at idle, let alone during video playback.
 
Okay, plenty of games (Dota 2, for example) crash the drivers when run on DX11. It was a very common issue with RDNA2, but I'm hearing it also happens with RDNA3.

Diablo 4 on RDNA2 makes your desktop flicker when you alt-tab into it; it happens on my G14 laptop.

Driver installation requires you to disconnect from the internet. Which I guess is okay if you already know about it; I didn't, and it took me a couple of hours to figure out why my laptop wasn't working.


Interesting. Are you of the same opinion with CPUs? Because AMD CPUs consume 30-40 W sitting there playing videos while Intel drops to 5 W.
OH MY GOD!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! :roll::roll::roll::roll::roll::roll::roll::roll::roll::roll::roll:

Your posts are pure gold. You throw out nonsense and then expect the other person to debunk that nonsense:
"There is a pink elephant in the center of the Great Red Spot of Jupiter selling hot dogs. Just ordered a dozen. Prove me wrong."

To be honest, I don't know about Dota 2. I did a quick Google search, and people say everything is fine. I also don't see why you'd run it in DX11 mode on DX12 cards when it runs in DX12 as well.
Some games might have peculiar bugs here and there with Nvidia cards too. There are always open issues with any driver release from any company.

About Diablo 4: the RTX 4070 is an Nvidia card, right? All cards mentioned in the link below are Nvidia. The thread is almost 20 days old. No fix from Nvidia or whoever needs to fix something.
Alt+Tabbing causes massive pc slowdown - PC Bug Report - Diablo IV Forums
Of course you can blame the CPU here. It's AMD in all cases.

The driver installation issue was a very specific Windows issue, where Windows was trying to update the graphics drivers in parallel with AMD's driver installer when the optional "Factory Reset" option was chosen in AMD's installer. That issue was solved less than a week after it was first reported back in March. You come here months later throwing nonsense and trying to convince us that you still have it. I doubt you even have an AMD laptop anyway, because you would know by now that this issue was fixed long ago. If you do have an AMD laptop and don't know that the problem was identified, that says something about your knowledge of hardware. If you do have an AMD laptop and know that the issue was identified and fixed, the question is why you come here months later and try to convince us that it is an ongoing issue.

As for video playback at 30-40 W: you do realize there are AMD CPUs and APUs with a 15 W TDP, right? I mean, if you are going to lie, and in my opinion your whole post is just lies, at least make those lies look plausible.
 
To be honest, I don't know about Dota 2. I did a quick Google search, and people say everything is fine. I also don't see why you'd run it in DX11 mode on DX12 cards when it runs in DX12 as well.
Some games might have peculiar bugs here and there with Nvidia cards too. There are always open issues with any driver release from any company.
So because you can run DX12, it's not an issue that it doesn't work with DX11?

Of course you can blame the CPU here. It's AMD in all cases.
Exactly. I'm talking about specific cases where it's the AMD GPU at fault; I know because I've tested with the iGPU instead and it works fine.

The driver installation issue was a very specific Windows issue, where Windows was trying to update the graphics drivers in parallel with AMD's driver installer when the optional "Factory Reset" option was chosen in AMD's installer. That issue was solved less than a week after it was first reported back in March. You come here months later throwing nonsense and trying to convince us that you still have it. I doubt you even have an AMD laptop anyway, because you would know by now that this issue was fixed long ago. If you do have an AMD laptop and don't know that the problem was identified, that says something about your knowledge of hardware. If you do have an AMD laptop and know that the issue was identified and fixed, the question is why you come here months later and try to convince us that it is an ongoing issue.
It IS an ongoing issue. It happened to me literally two weeks ago when I got my laptop. How is it fixed? You can doubt all you want; I don't have one AMD laptop, I have three (6900HS, 6800U, 5600H).
As for video playback at 30-40 W: you do realize there are AMD CPUs and APUs with a 15 W TDP, right? I mean, if you are going to lie, and in my opinion your whole post is just lies, at least make those lies look plausible.
Those AMD CPUs and APUs are the mobile parts that don't have the IO die. Those are great; that's why I buy AMD laptops. But the desktop parts with the IO die draw 30-40 W just sitting there idle, doing nothing.

What Intel CPU does that exactly? Unless you are comparing a laptop i3 to a 16-core desktop AMD chip, this is pure nonsense, as there are exactly zero desktop CPUs that draw only 5 W at idle, let alone during video playback.
Every single Alder Lake and Raptor Lake CPU idles at 2 W and browses the web at 5-10 W on the balanced Windows power plan.
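If anyone wants to check idle or video-playback package power for themselves rather than trade numbers, here is a minimal sketch using the Linux RAPL powercap interface (the exact sysfs path varies by platform, reading energy_uj may require root on recent kernels, and on Windows, tools like HWiNFO report the same package-power sensor):

```python
# Minimal package-power sampler via Linux RAPL powercap.
# Assumes the package domain is exposed at intel-rapl:0 (true on most
# Intel systems and many recent AMD ones); adjust the path if needed.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"
WINDOW_S = 5  # sampling window in seconds

def read_energy_uj() -> int:
    with open(RAPL_ENERGY) as f:
        return int(f.read().strip())  # cumulative energy in microjoules

e0 = read_energy_uj()
time.sleep(WINDOW_S)
e1 = read_energy_uj()

watts = (e1 - e0) / 1e6 / WINDOW_S  # microjoules -> joules, then per second
print(f"Average package power over {WINDOW_S} s: {watts:.1f} W")
```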
 