
AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

Joined
Jan 14, 2019
Messages
12,585 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Upscaling is good for 4K, or for a low-end card that can barely play anything. 1080p plus an upscaler is a misunderstanding at best.


I think you can sharpen the image. The problem is that FSR and DLSS scale differently even when set to the same preset, so you have to find a sweet spot for each one separately. DLSS doesn't have to be as blurry.
My sweet spot is DLSS/FSR turned off. Like I said, mileage may vary at higher resolutions, but for me at 1080p with no plans to upgrade, it's not gonna be a thing. Spending hundreds on a new monitor only to be forced to spend hundreds more on a graphics card and/or use some upscaling trickery for playable framerates sounds counter-intuitive to me. That's why I think it's a gimmick.
 
Joined
Dec 25, 2020
Messages
7,021 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (Spanish layout)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
It was not a comment on the game (although I didn't find it as bad as people commonly do). It was a reference to the image being blurry as heck with basically any DLSS setting (except for off). I didn't think I had to explain, but here we go.

Oh, I know. Really needed to vent though, sorry ;):laugh:
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
There are plenty of showcases of that, so there's no need to do it yourself. If you limit it, you will lose performance. The 4090 is a very powerful card, and I'm sure chopping off some performance for less power draw is not a disaster, but you do still lower performance for some power saving. It is a good tradeoff, though.
Yeah, I lost around 1.3% :roll:

I've tried both DLSS and FSR, and concluded that they look like crap on my 1080p monitor. Where's the agenda? :roll:

If forming an opinion based on first-hand experience means ignorance to you, then maybe you're the one with the agenda and there's nothing left to talk about.
At 1080p, I don't know, it might be terrible, but it's not needed at 1080p anyway. At 4K you really can't tell the difference. Actually, the magical part is that in lots of games it actually increases image quality: plenty of games with poor TAA implementations look like crap at native. So not only do you get a noticeable performance increase, and not only does your GPU draw much less power with it active, but in some games you also get better graphics.
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Yeah, I lost around 1.3% :roll:
Someone else will lose more. As you can see, I can't rely on your data and your experience alone, but I can rely on NV's advertised figures. On the other hand, it is very hard to believe that NV would go to so much trouble to advertise a card with such high power consumption only to realize afterwards that they had made a mistake, and that giving up 1.3% of performance would have cut the power consumption by a third.

My sweet spot is DLSS/FSR turned off. Like I said, mileage may vary at higher resolutions, but for me at 1080p with no plans to upgrade, it's not gonna be a thing. Spending hundreds on a new monitor only to be forced to spend hundreds more on a graphics card and/or use some upscaling trickery for playable framerates sounds counter-intuitive to me. That's why I think it's a gimmick.
Your card can pull 1440p no problem. Maybe some games would be hard, but that can be addressed with slightly lower settings. I would advise you to consider 1440p. For me, playing at 4K, it would be hard to go back to 1080p since it's a blurry mess for me at this point, just like DLSS is for you.
 
Joined
Jan 14, 2019
Messages
12,585 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
At 1080p, I don't know, it might be terrible, but it's not needed at 1080p anyway. At 4K you really can't tell the difference. Actually, the magical part is that in lots of games it actually increases image quality: plenty of games with poor TAA implementations look like crap at native. So not only do you get a noticeable performance increase, and not only does your GPU draw much less power with it active, but in some games you also get better graphics.
I believe you that it's probably less bad at higher resolutions as it has a better quality sample to work with.

I'm not sure about improving quality, though. Upscaling is a technology designed to slightly worsen your image quality for more performance, so improving is a contradiction. I'll believe it when I see it, I guess. :)
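For a rough sense of why the input sample matters, here's a minimal sketch that computes the internal render resolution at 1080p versus 4K, assuming the commonly cited per-axis scale factors for the upscaler presets (roughly 67% for Quality, 58% for Balanced, 50% for Performance; actual values can vary by game and upscaler version):

```python
# Rough sketch: internal render resolution for common upscaler presets.
# The scale factors below are assumptions based on commonly cited values.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
}

def internal_resolution(width, height, scale):
    """Return the (width, height) the GPU actually renders before upscaling."""
    return round(width * scale), round(height * scale)

for name, scale in PRESETS.items():
    for out_w, out_h in [(1920, 1080), (3840, 2160)]:
        in_w, in_h = internal_resolution(out_w, out_h, scale)
        print(f"{name:<11} {out_w}x{out_h} output -> renders at about {in_w}x{in_h}")
```

At a 1080p output, Quality mode works from roughly 720p of real pixel data, while at 4K it works from roughly 1440p, which lines up with upscaling holding up much better at higher output resolutions.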

Your card can pull 1440p no problem.
It sure can now, but what about 2-4 years later? Besides, I'm happy with 1080p, I don't feel like I'm missing out on anything at all. :)

For me, playing at 4K, it would be hard to go back to 1080p since it's a blurry mess for me at this point, just like DLSS is for you.
That's another reason why I don't want to upgrade. Higher res may sound like a nice thing to have until you look at games like Hogwarts Legacy that run like crap with everything turned on at 4k even on a 4090. I don't want to fall into the trap of getting used to it, and then not being able to go back, or having to spend more on a GPU upgrade when something new that I want to play doesn't run well.
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
It sure can now, but what about 2-4 years later? Besides, I'm happy with 1080p, I don't feel like I'm missing out on anything at all. :)
1440p screens support 1080p; I don't see a problem here. The same thing happened to me when I bought a 4K 60 Hz screen. When it came and I set it up, I asked myself why I did it, since 1080p is more than enough. Then I started playing and, well, I later bought a 4K 144 Hz screen along with the 6900 XT, so go figure.
That's another reason why I don't want to upgrade. Higher res may sound like a nice thing to have until you look at games like Hogwarts Legacy that run like crap with everything turned on at 4k even on a 4090. I don't want to fall into the trap of getting used to it, and then not being able to go back, or having to spend more on a GPU upgrade when something new that I want to play doesn't run well.
4K was my choice and I don't want to go back, not to 1080p for sure. If anything, I dial graphics settings down if FPS is lacking; moving to a lower resolution is a last resort that I always try to avoid.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I believe you that it's probably less bad at higher resolutions as it has a better quality sample to work with.

I'm not sure about improving quality, though. Upscaling is a technology designed to slightly worsen your image quality for more performance, so improving is a contradiction. I'll believe it when I see it, I guess. :)
I can post some pictures later, for example from CoD Black Ops; the native TAA looks like a huge downgrade compared to DLSS Quality. Lots of reviewers have actually reported the same thing, even Steve from Hardware Unboxed: DLSS improves image quality in lots of situations. It's a complete game changer.

Frame generation, on the other hand, is way more situational; it only works decently when your framerate is already at least 45-50 FPS without it. I still wouldn't call it a gimmick, though.

Someone else will lose more. As you can see, I can't rely on your data and your experience alone, but I can rely on NV's advertised figures. On the other hand, it is very hard to believe that NV would go to so much trouble to advertise a card with such high power consumption only to realize afterwards that they had made a mistake, and that giving up 1.3% of performance would have cut the power consumption by a third.
How much power do you think the 4090 consumes? You realise that in the vast majority of games it hovers around 350 watts, right?

Doesn't the same thing apply to Zen 4 CPUs? Don't they cut power in half while losing a tiny amount of performance? You seem more eager to believe it when AMD is involved, don't you?
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
How much power do you think the 4090 consumes? You realise that in the vast majority of games it hovers around 350 watts, right?

Doesn't the same thing apply to Zen 4 CPUs? Don't they cut power in half while losing a tiny amount of performance? You seem more eager to believe it when AMD is involved, don't you?
Why are you constantly eager to compare GPUs to CPUs? It's like comparing a grain picker to a tractor and asking which one runs faster. Can you use a GPU-to-GPU comparison if you want to prove a point? I really don't understand your obsession with AMD's CPUs. Every device loses performance when power-constrained; I'm not sure what you're trying to prove here. I have dropped voltage and clocks a bit on my 6900 XT to lower power consumption and shave off a few degrees while I'm at it, to a level I'm comfortable with. What's your point, that NV GPUs can do the same? I didn't lose a lot, but I undoubtedly did lose performance, and I'm sure others would lose some as well, or the settings I've applied to my card might not even work on theirs. For me, I don't even see the difference in games to begin with. I can't claim every 6900 XT consumes 270 W while gaming at 100% utilization just because mine shows 270 W. So does the 6900 XT use 270 W? Mine does, but that doesn't mean every single Red Devil 6900 XT will be the same, and some will hit the advertised power consumption no matter what you do. Judging a card by one person's experience alone is misleading.
What do I think the 4090 consumes? According to TPU, the MSI RTX 4090 Gaming Trio consumes 429 W during gaming, and I'm not even talking about maximum power here. Saying that you have dropped 100 W for a 1.3% performance loss is rather spectacular, and it's weird that NV didn't advertise it themselves. Or maybe it's just your card, which, again, means I can't rely on your experience alone: maybe only your card is that spectacular and the majority of the same cards are not. That's why NVIDIA advertises the power consumption it does; otherwise a lot of cards wouldn't pass qualification, meaning fewer cards on the market and even higher prices for a product that is ridiculously highly priced anyway.
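Just to put numbers on the tradeoff being argued about, here is a quick back-of-the-envelope sketch using the figures quoted in this thread (roughly 429 W stock gaming draw, a claimed ~100 W reduction, and a claimed 1.3% performance loss; these are the thread's own claims, not measurements):

```python
# Back-of-the-envelope perf-per-watt comparison using the numbers quoted
# in this thread (assumed figures, not measurements of any specific card).
stock_power_w = 429           # TPU gaming power figure cited above
limited_power_w = 329         # after the claimed ~100 W reduction
stock_perf = 1.000            # normalized performance
limited_perf = 1.000 - 0.013  # the claimed 1.3% performance loss

stock_eff = stock_perf / stock_power_w
limited_eff = limited_perf / limited_power_w

print(f"Performance kept: {limited_perf / stock_perf:.1%}")
print(f"Power kept:       {limited_power_w / stock_power_w:.1%}")
print(f"Perf/W change:    {limited_eff / stock_eff - 1:+.1%}")
```

If both numbers held, that would be roughly a 29% perf-per-watt improvement for a 1.3% performance loss; whether the ~100 W saving generalizes beyond one sample is exactly what's being disputed here.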
 
Joined
Jan 14, 2019
Messages
12,585 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
1440p screens support 1080p.
Sure, but lower resolution images on higher resolution screens always look worse than they would on a lower native resolution screen. Doing stuff on my parents' old 900p monitor feels normal, but if I set the desktop resolution to 900p on my 1080p monitor, it'll look horrible.

The same thing happened to me when I bought a 4K 60 Hz screen. When it came and I set it up, I asked myself why I did it, since 1080p is more than enough. Then I started playing and, well, I later bought a 4K 144 Hz screen along with the 6900 XT, so go figure.
I'd probably be the same, and that's what I want to avoid. ;)

I can post some pictures later, for example from CoD Black Ops; the native TAA looks like a huge downgrade compared to DLSS Quality. Lots of reviewers have actually reported the same thing, even Steve from Hardware Unboxed: DLSS improves image quality in lots of situations. It's a complete game changer.
Sure, why not. I have my doubts about it helping in all cases, but I won't say no to taking a look.

There are quite a few bad AA implementations, though. The one in the Nvidia driver (I'm not sure what it's called) gave me some terrible shimmering around UI elements in Skyrim back in the day. I remember because it took me a while to figure out what the problem was.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Of course maximum power draw is absolutely useless. Card A has a 400 W max power draw and 200 W average, card B has a 280 W max and 250 W average. Card A is clearly absolutely unarguably better at power draw. You can't even argue that.

So you proved me wrong by agreeing with me that the XT draws a lot more power. Great. And yes, that's usually the case: you prove me wrong every single time by admitting that everything I said is absolutely the case. Good job, keep it up.


Yeah, that 6c/12t chip AMD launched in 2023 for 350 gives you great longevity over the 14 cores Intel offers. LOL

Efficiency cores are useless for gaming. Intel has 8 performance cores tops. If you believe their marketing, then you need to educate yourself.

The i9-13900K has 8 performance cores and 16 useless E-cores, yet it's marketed as a 24-core chip, LMAO.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Efficiency cores are useless for gaming. Intel has 8 performance cores tops. If you believe their marketing, then you need to educate yourself.

The i9-13900K has 8 performance cores and 16 useless E-cores, yet it's marketed as a 24-core chip, LMAO.
First of all, not only is that irrelevant, it's also wrong. The 16 E-cores are exactly as useless as the second CCD the 7950X has; no more, no less. Both are mainly useful in heavily threaded workloads.

Now, with that said, with E-cores off in Cyberpunk, for example, performance drops, dramatically in fact: the Tom's Diner area goes from around 100-120 FPS to 80-95. So yeah...
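One way to sanity-check this sort of claim yourself is to run the same in-game scene twice: once with the game process restricted to the P-cores only, once with all cores available, and compare the framerates. Below is a minimal sketch of that, assuming psutil is installed; the process name and core indices are assumptions (on a 13900K the hyperthreaded P-cores are usually enumerated as logical CPUs 0-15, with the E-cores after them, but verify your own topology first).

```python
# Sketch: restrict a running game to the P-cores, then restore all cores,
# so the same scene can be benchmarked with and without E-cores available.
# Assumptions: the process name is illustrative, and logical CPUs 0-15 are
# the P-cores (enumeration can differ, so check your own system first).
import psutil

GAME_EXE = "Cyberpunk2077.exe"     # hypothetical process name
P_CORE_CPUS = list(range(16))      # assumed P-core logical CPUs on a 13900K
ALL_CPUS = list(range(psutil.cpu_count(logical=True)))

def set_game_affinity(cpus):
    """Pin every process matching GAME_EXE to the given logical CPUs."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == GAME_EXE:
            proc.cpu_affinity(cpus)   # restrict scheduling to these CPUs
            print(f"{GAME_EXE} (pid {proc.pid}) -> CPUs {cpus}")

set_game_affinity(P_CORE_CPUS)   # run the benchmark scene: "E-cores off"
# ... note the FPS, then ...
set_game_affinity(ALL_CPUS)      # run it again with everything available
```

Pinning affinity isn't identical to disabling the E-cores in the BIOS (cache and ring behaviour differ), but it's a quick way to see whether a given game scales past the P-cores at all.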
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Sure, but lower resolution images on higher resolution screens always look worse than they would on a lower native resolution screen. Doing stuff on my parents' old 900p monitor feels normal, but if I set the desktop resolution to 900p on my 1080p monitor, it'll look horrible.
I think that depends on the quality and size of the screen. I would not suggest getting a 32-inch if you're planning to play at 1080p at some point in the future and aren't going to upgrade your GPU.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Sure, but lower resolution images on higher resolution screens always look worse than they would on a lower native resolution screen. Doing stuff on my parents' old 900p monitor feels normal, but if I set the desktop resolution to 900p on my 1080p monitor, it'll look horrible.


I'd probably be the same, and that's what I want to avoid. ;)


Sure, why not. I have my doubts about it helping in all cases, but I won't say no to taking a look.

There are quite a few bad AA implementations, though. The one in the Nvidia driver (I'm not sure what it's called) gave me some terrible shimmering around UI elements in Skyrim back in the day. I remember because it took me a while to figure out what the problem was.
Of course it doesn't increase image quality in all games, but it doesn't really decrease it either. I cannot for the life of me tell the difference, not even with screenshots next to each other. Even Balanced looks great in static shots, but it loses out in motion, where you can see some minor artifacts; DLSS Quality is amazing, though.
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
First of all, not only is that irrelevant, it's also wrong. The 16 E-cores are exactly as useless as the second CCD the 7950X has; no more, no less. Both are mainly useful in heavily threaded workloads.

Now, with that said, with E-cores off in Cyberpunk, for example, performance drops, dramatically in fact: the Tom's Diner area goes from around 100-120 FPS to 80-95. So yeah...
Not true.
TPU test.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Not true.
TPU test.
You are telling me what is and isn't true about a CPU I own, in a game I play. Seems legit. Did he test Tom's Diner or some other heavy area of the game? Because obviously, in a non-heavy area you don't need a gazillion cores; that's true for every game.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
First of all, not only is that irrelevant, it's also wrong. The 16 E-cores are exactly as useless as the second CCD the 7950X has; no more, no less. Both are mainly useful in heavily threaded workloads.

Now, with that said, with E-cores off in Cyberpunk, for example, performance drops, dramatically in fact: the Tom's Diner area goes from around 100-120 FPS to 80-95. So yeah...

And in other games, like Far Cry 5, Metro Exodus, and GreedFall, performance is 10% lower with the crappy E-cores enabled. So yeah... It probably has better performance in some games because the 8 performance cores are maxed out; however, AMD has 12-16 cores on its high-end chips, as in true performance cores.

7950X 2nd CCD has performance cores + SMT only. Zero garbage cores.

Efficiency cores make pretty much zero sense for desktop usage. And you are stuck with Windows 11 only, because without Thread Director support you will get wonky performance (software uses the wrong cores = crap performance).

The only reason Intel does it is to boost multithreaded performance, especially in synthetic tests like Cinebench, and to market the chips as higher-core-count parts, but it's mostly a marketing gimmick, because Intel has struggled with core count for years. They COULD have put 12 performance cores on the 13900K, but watt usage would explode; performance, however, would have been much better than it is. Sadly, Intel needs 5.5-6 GHz clock speeds to match AMD, and the upcoming Ryzen 7000 3D chips will beat Intel in gaming again. The 7800X3D at 399 dollars will probably smack even the i9-13900KS, which will be a 799-dollar chip. Sad but true.

And the 7900 XTX gets closer and closer to the 4090 while beating the 4080 by more and more: https://www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/31.html

Nvidia's answer will be the 4080 Ti and 4090 Ti. Gimpy gimpy time. The leather jacket will soon pull them out of the oven.
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
You are telling me what is and isn't true about a CPU I own, in a game I play. Seems legit. Did he test Tom's Diner or some other heavy area of the game? Because obviously, in a non-heavy area you don't need a gazillion cores; that's true for every game.
Are you telling everyone that @W1zzard, with his whole assortment of CPUs and GPUs, running benchmarks and testing things for us here, and literally showing the E-core dependency in games, is incorrect? Or what are you saying?
I don't care what you own, to be fair. Or is it like your 4090 claim? Considering how you showcase your 4090 as a legitimately low-power-consumption card and proclaim that this applies to all 4090s, I see a flaw in your conclusions, which makes you untrustworthy.
Another miracle uncovered by you: some games or applications use more cores, some don't, so you focus on whatever suits you. What is it you're trying to prove here again? That the CPU you have is exceptional? Great, lucky you.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Are you telling everyone that @W1zzard, with his whole assortment of CPUs and GPUs, running benchmarks and testing things for us here, and literally showing the E-core dependency in games, is incorrect? Or what are you saying?
I don't care what you own, to be fair. Or is it like your 4090 claim? Considering how you showcase your 4090 as a legitimately low-power-consumption card and proclaim that this applies to all 4090s, I see a flaw in your conclusions, which makes you untrustworthy.
Another miracle uncovered by you: some games or applications use more cores, some don't, so you focus on whatever suits you. What is it you're trying to prove here again? That the CPU you have is exceptional? Great, lucky you.
I never said the 4090 is a low-power card. I said it's incredibly efficient, which it is.

I never said the 13900K is exceptional. Actually, I swapped back to my 12900K because I prefer it. What I did say is that E-cores are not useless in gaming, since there are games that benefit a lot from them. Try to actually argue with what people are saying instead of constantly strawmanning.
 
Joined
Jan 8, 2017
Messages
9,506 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
The E-cores on Intel's CPUs are most certainly not useless; sometimes you can lose almost 10% performance with them enabled, so they definitely do something, that's for sure. :roll:

(attached benchmark chart)
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I never said the 4090 is a low-power card. I said it's incredibly efficient, which it is.

I never said the 13900K is exceptional. Actually, I swapped back to my 12900K because I prefer it. What I did say is that E-cores are not useless in gaming, since there are games that benefit a lot from them. Try to actually argue with what people are saying instead of constantly strawmanning.
I said your CPU is exceptional, not the 13900K. The 13900K is there to showcase the E-cores in gaming.
Not a low-power card, OK. Efficient? If you talk about efficiency, you need to say compared to what. AMD CPUs, since you brought those up alongside your 4090? You need a metric: this is how much power this device uses and this is how much power my device uses for the same performance, for instance, or whatever other metric you have in mind when you talk about efficiency.
Yes, you did say the E-cores are not useless for games, and I told you they are, or if there is a difference, it is so mediocre it's pointless to mention. Arguing that you have a CPU with E-cores and I don't, as if that somehow gives you the right to make false claims, is not OK. If you want, please refer to @W1zzard's test of the E-cores and explain why it is wrong, since your "exceptional CPU" performs differently and W1zzard's findings are false.
 

OmniaMorsAequat

New Member
Joined
Jul 6, 2021
Messages
2 (0.00/day)
In short: "we lack the talent to compete, so we aren't even going to try."
Why does this sound so familiar?

Oh wait, this is exactly the same bullshit they said with Bulldozer.

I swear idiots run this company.

Furthermore, AI is becoming the hottest thing since the sun, and AMD is like "mmm, no thanks, we are going to keep doing what we are doing,"
which is being a mediocre second fiddle to everybody else,
because that has worked so well before.
Arrrrrrrrrrg, this level of stupid, short-sighted quitter talk drives me batty.
What's wrong with what they reported? I don't see it as a bad idea to apply artificial intelligence in games (NPCs etc.) instead of dedicating it to image processing like Nvidia does (Rockstar Games is doing something similar to AMD with their new game).
The 3D rendering process should be handled fully by the video card, not propped up artificially with extra frames gained by sacrificing native resolution. I can understand the use of DLSS and Frame Generation in games whose ray tracing is complete and heavy enough that a video card struggles to handle it natively, and in any case you get good results even with traditional rendering methods.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
And in other games, like Far Cry 5, Metro Exodus, and GreedFall, performance is 10% lower with the crappy E-cores enabled. So yeah... It probably has better performance in some games because the 8 performance cores are maxed out; however, AMD has 12-16 cores on its high-end chips, as in true performance cores.

7950X 2nd CCD has performance cores + SMT only. Zero garbage cores.

Efficiency cores make pretty much zero sense for desktop usage. And you are stuck with Windows 11 only, because without Thread Director support you will get wonky performance (software uses the wrong cores = crap performance).

The only reason Intel does it is to boost multithreaded performance, especially in synthetic tests like Cinebench, and to market the chips as higher-core-count parts, but it's mostly a marketing gimmick, because Intel has struggled with core count for years. They COULD have put 12 performance cores on the 13900K, but watt usage would explode; performance, however, would have been much better than it is. Sadly, Intel needs 5.5-6 GHz clock speeds to match AMD, and the upcoming Ryzen 7000 3D chips will beat Intel in gaming again. The 7800X3D at 399 dollars will probably smack even the i9-13900KS, which will be a 799-dollar chip. Sad but true.

And the 7900 XTX gets closer and closer to the 4090 while beating the 4080 by more and more: https://www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/31.html

Nvidia's answer will be the 4080 Ti and 4090 Ti. Gimpy gimpy time. The leather jacket will soon pull them out of the oven.
What the heck are you talking about? E-cores are EXACTLY as useful as the second CCD. The second CCD is also useful for Cinebench and synthetics. There is no application where the second CCD boosts performance but the E-cores don't. So your opinion has to be based on pure bias and nothing more.

I said your CPU is exceptional, not the 13900K. The 13900K is there to showcase the E-cores in gaming.
Not a low-power card, OK. Efficient? If you talk about efficiency, you need to say compared to what. AMD CPUs, since you brought those up alongside your 4090? You need a metric: this is how much power this device uses and this is how much power my device uses for the same performance, for instance, or whatever other metric you have in mind when you talk about efficiency.
Yes, you did say the E-cores are not useless for games, and I told you they are, or if there is a difference, it is so mediocre it's pointless to mention. Arguing that you have a CPU with E-cores and I don't, as if that somehow gives you the right to make false claims, is not OK. If you want, please refer to @W1zzard's test of the E-cores and explain why it is wrong, since your "exceptional CPU" performs differently and W1zzard's findings are false.
Well, in most games more than 8 cores are useless, so of course the E-cores are as well. That has nothing to do with the E-cores themselves; it has to do with the games not requiring that many cores. The same thing applies to the 7950X, with its second CCD being useless in most games except the minority that scale past 8 cores.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
What the heck are you talking about? E-cores are EXACTLY as useful as the second CCD. The second CCD is also useful for Cinebench and synthetics. There is no application where the second CCD boosts performance but the E-cores don't. So your opinion has to be based on pure bias and nothing more.


Well, in most games more than 8 cores are useless, so of course the E-cores are as well. That has nothing to do with the E-cores themselves; it has to do with the games not requiring that many cores. The same thing applies to the 7950X, with its second CCD being useless in most games except the minority that scale past 8 cores.

What are YOU talking about? Both CCDs have a CCX with 8C/16T of performance cores only. The 2nd CCD runs slightly lower clock speeds BUT USES THE SAME MICROARCHITECTURE, and the cores are still FAST, with no risk of E-cores complicating software compatibility. 7950X = 16 performance cores, 13900K = 8 performance cores.

Intel's E-cores use a dated and MUCH LOWER CLOCKED microarchitecture. The cores are NOT FAST at all. Their primary goal is to fool consumers into thinking the chip has more cores than it does.
The i5-13600K is a "14-core chip" but only has 6 performance cores :roll: Intel STILL only offers 6-8 performance cores across the board on upper-segment mainstream chips; the rest is useless E-cores.

Ryzen 7000 chips with 3D cache will beat Intel in gaming anyway. Hell, even the 7800X3D, a 400-dollar chip, will beat the 13900KS with its 6 GHz boost, twice if not triple the peak watt usage, and 800-dollar price tag, and Intel will abandon the platform after 2 years as usual, meaning 14th gen will require a new socket and a new board. Milky milky time. Intel's architecture is inferior, which is why they need to run high clock speeds to be able to compete; SADLY for Intel, this means high watt usage.

However, the i9-12900K/KS and i9-13900K/KS are pointless chips for gamers, since the i7 delivers the same gaming performance anyway without the HUGE watt usage. Hell, even the i5s are within a few percent.
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
The E-cores on Intel's CPUs are most certainly not useless; sometimes you can lose almost 10% performance with them enabled, so they definitely do something, that's for sure. :roll:

View attachment 284825

I've seen this one before, but I don't recall if there have been similar tests for the 7900X and 7950X CPUs with CCD2 disabled vs. both CCDs enabled. It would be interesting to see, I think, whether it's Intel's approach that is slower with the E-cores disabled or AMD's approach that is slower with the second CCD disabled.
 
Joined
May 31, 2016
Messages
4,441 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Well, in most games more than 8 cores are useless, so of course the E-cores are as well. That has nothing to do with the E-cores themselves; it has to do with the games not requiring that many cores. The same thing applies to the 7950X, with its second CCD being useless in most games except the minority that scale past 8 cores.
Can you point to a game that uses E-cores together with P-cores?
 