Monday, February 20th 2023

AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

AMD's next-generation RDNA4 graphics architecture will retain a design focus on gaming performance, without being drawn into an AI feature-set competition with rival NVIDIA. David Wang, SVP of the Radeon Technologies Group, and Rick Bergman, EVP of the Computing and Graphics Business at AMD, gave an interview to Japanese tech publication 4Gamers, in which they dropped the first hints about the direction the company's next-generation graphics architecture will take.

While acknowledging NVIDIA's momentum in the GPU-accelerated AI space, AMD said that it doesn't believe image processing and performance upscaling are the best use of the GPU's AI-compute resources, and that the client segment still hasn't found extensive use for GPU-accelerated AI (or, for that matter, even CPU-based AI acceleration). AMD's own upscaling tech, FSR, doesn't leverage AI acceleration. Wang said that, with the company having introduced AI acceleration hardware in its RDNA3 architecture, he hopes AI will be leveraged to improve gameplay, such as procedural world generation, NPCs, and bot AI, to add the next level of complexity, rather than spending the hardware resources on image processing.
AMD also stressed the need to make the GPU more independent of the CPU in graphics rendering. The company has taken several steps in this direction over the past few generations, the most recent being the multi-draw indirect accelerator (MDIA) introduced with RDNA3. Using it, software can batch multiple instanced draw commands into a single submission that the GPU processes on its own, greatly reducing CPU-level overhead; RDNA3 is up to 2.3x more efficient at this than RDNA2. Expect more innovations along these lines with RDNA4.
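To illustrate what this submission path looks like on the software side, here is a minimal sketch using the standard OpenGL 4.3 multi-draw indirect call (glMultiDrawElementsIndirect). This is the generic mechanism that MDIA-style hardware accelerates, not an AMD-specific API; the buffer, VAO, and shader setup are assumed to exist elsewhere, and a loader such as glad is assumed for the GL headers.

```c
/* Sketch only: standard OpenGL 4.3 multi-draw indirect submission.
 * This is the generic mechanism that MDIA-style hardware accelerates,
 * not an AMD-specific API. VAO, index buffer, and shader setup are
 * assumed to be done elsewhere; glad is assumed as the GL loader. */
#include <glad/glad.h>

typedef struct {
    GLuint count;          /* indices per draw                   */
    GLuint instanceCount;  /* instances per draw                 */
    GLuint firstIndex;     /* offset into the bound index buffer */
    GLint  baseVertex;     /* value added to each index          */
    GLuint baseInstance;   /* first instance ID                  */
} DrawElementsIndirectCommand;

/* Submit 'drawCount' instanced draws with ONE API call: the GPU reads
 * the per-draw parameters from the indirect buffer itself, instead of
 * the CPU issuing drawCount separate glDrawElementsInstanced calls. */
static void submit_batches(GLuint indirectBuffer,
                           const DrawElementsIndirectCommand *cmds,
                           GLsizei drawCount)
{
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuffer);
    glBufferData(GL_DRAW_INDIRECT_BUFFER,
                 drawCount * (GLsizeiptr)sizeof(DrawElementsIndirectCommand),
                 cmds, GL_DYNAMIC_DRAW);

    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                (const void *)0,  /* offset into buffer */
                                drawCount,
                                0 /* tightly packed commands */);
}
```

The same pattern exists in Vulkan (vkCmdDrawIndexedIndirect) and Direct3D 12 (ExecuteIndirect); the point is that the per-draw parameters live in GPU memory and never round-trip through the CPU.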

AMD understandably didn't say anything about the "when," "what," and "how" of RDNA4, as its latest RDNA3 architecture has only just gotten off the ground and is awaiting a product ramp through 2023 into the various market segments, spanning iGPUs, mobile GPUs, and mainstream desktop GPUs. RDNA3 currently powers the Radeon RX 7900 series high-end graphics cards and the iGPUs of the company's latest 5 nm "Phoenix Point" Ryzen 7000-series mobile processors. You can catch the 4Gamer interview at the source link below.
Sources: 4Gamers.net, HotHardware

221 Comments on AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

#126
ratirt
fevgatos: Yeah, I lost around 1.3% :roll:
Someone else will lose more. As you can see, I can't rely on your data and your experience alone, but I can rely on NV's advertising. On the other hand, it is very hard to believe that NV would go to so much trouble to advertise a card with so much power consumption, only to realize afterwards that they had made a mistake and that giving up 1.3% of performance would have cut the power consumption by a third.
AusWolf: My sweet spot is DLSS/FSR turned off. Like I said, mileage may vary at higher resolutions, but for me at 1080p with no plans to upgrade, it's not gonna be a thing. Spending hundreds on a new monitor only to be forced to spend hundreds more on a graphics card and/or use some upscaling trickery for playable framerates sounds counter-intuitive to me. That's why I think it's a gimmick.
Your card can pull 1440p no problem. Maybe in some games it would be hard, but that can be addressed with slightly lower settings. I would advise you to consider 1440p. For me, playing at 4K, it would be hard to go back to 1080p since it is a blurry mess to me at this point, just like DLSS is for you.
#127
AusWolf
fevgatos: 1080p, dunno, might be terrible, but it's not needed for 1080p anyway. At 4K you really can't tell the difference. Actually, the magical part is, in lots of games it actually increases image quality. Plenty of games with poor implementations of TAA look like crap at native. So not only do you get a noticeable performance increase, not only is your GPU drawing much less power with it active, in some games you also get better graphics.
I believe you that it's probably less bad at higher resolutions as it has a better quality sample to work with.

I'm not sure about improving quality, though. Upscaling is a technology designed to slightly worsen your image quality for more performance, so improving is a contradiction. I'll believe it when I see it, I guess. :)
ratirt: Your card can pull 1440p no problem.
It sure can now, but what about 2-4 years later? Besides, I'm happy with 1080p, I don't feel like I'm missing out on anything at all. :)
ratirt: For me, playing at 4K, it would be hard to go back to 1080p since it is a blurry mess to me at this point, just like DLSS is for you.
That's another reason why I don't want to upgrade. Higher res may sound like a nice thing to have until you look at games like Hogwarts Legacy that run like crap with everything turned on at 4k even on a 4090. I don't want to fall into the trap of getting used to it, and then not being able to go back, or having to spend more on a GPU upgrade when something new that I want to play doesn't run well.
#128
ratirt
AusWolf: It sure can now, but what about 2-4 years later? Besides, I'm happy with 1080p, I don't feel like I'm missing out on anything at all. :)
1440p screens support 1080p; I don't see a problem here. The same thing happened to me when I bought a 4K 60 Hz screen. When it came and I set it up, I asked myself why I did it, since 1080p is more than enough. Then I started playing and, well, I ended up buying a 4K 144 Hz screen along with a 6900 XT, so go figure.
AusWolf: That's another reason why I don't want to upgrade. Higher res may sound like a nice thing to have until you look at games like Hogwarts Legacy that run like crap with everything turned on at 4K even on a 4090. I don't want to fall into the trap of getting used to it, and then not being able to go back, or having to spend more on a GPU upgrade when something new that I want to play doesn't run well.
4K was my choice and I don't want to go back, not to 1080p for sure. If anything, I dial graphics settings down if FPS is lacking; moving to a lower resolution is a last resort for me which I always try to avoid.
#129
fevgatos
AusWolf: I believe you that it's probably less bad at higher resolutions as it has a better quality sample to work with.

I'm not sure about improving quality, though. Upscaling is a technology designed to slightly worsen your image quality for more performance, so improving is a contradiction. I'll believe it when I see it, I guess. :)
I can post you some pictures later, for example from codbo; the native TAA looks like a huge downgrade compared to DLSS Quality. Lots of reviewers have actually reported the same thing, even Steve from HW Unboxed: that DLSS improves image quality in lots of situations. It's a complete game changer.

FG, on the other hand, is way more situational; it only works decently or great when your framerate is already at least 45-50 FPS without it. Still, I wouldn't call it a gimmick.
ratirt: Someone else will lose more. As you can see, I can't rely on your data and your experience alone, but I can rely on NV's advertising. On the other hand, it is very hard to believe that NV would go to so much trouble to advertise a card with so much power consumption, only to realize afterwards that they had made a mistake and that giving up 1.3% of performance would have cut the power consumption by a third.
How much power do you think the 4090 consumes? You realise in the vast majority of games it hovers around 350 watts, right?

Doesn't the same thing apply to Zen 4 CPUs? Don't they cut the power in half while losing a tiny amount of performance? You seem more eager to believe it when AMD is involved, don't you?
#130
ratirt
fevgatos: How much power do you think the 4090 consumes? You realise in the vast majority of games it hovers around 350 watts, right?

Doesn't the same thing apply to Zen 4 CPUs? Don't they cut the power in half while losing a tiny amount of performance? You seem more eager to believe it when AMD is involved, don't you?
Why are you constantly eager to compare GPUs to CPUs? It is like comparing a grain picker to a tractor and asking which one runs faster. Can you use a GPU-to-GPU comparison if you want to prove a point? I really don't understand your obsession with AMD's CPUs. Every device will lose performance when power-constrained, so I'm not sure what you are trying to prove here. I have dropped voltage and clocks a bit on my 6900 XT to get lower power consumption, and to shed a few degrees while at it, down to a level I'm comfortable with. What's your point, that NV GPUs can do the same? I didn't lose a lot, but I undoubtedly did lose performance, and I'm sure others would lose some as well, or the settings I have applied to my card might not even work with theirs. For me, I don't even see the difference in games to begin with. I can't claim the 6900 XT consumes 270 W while gaming at 100% utilization just because mine shows 270 W. So does the 6900 XT use 270 W? Mine does, but that does not mean every single Red Devil 6900 XT will be the same, and some will run at the advertised power consumption no matter what you do. Judging a card by one person's experience alone is wrong.
What do I think the 4090 consumes? According to TPU, the MSI RTX 4090 Gaming Trio consumes 429 W during gaming, and I'm not even talking about max power here. Saying that you have dropped 100 W for a 1.3% performance loss is rather spectacular, and it's weird that NV did not advertise it themselves. Or maybe it is just your card, which again means I can't rely on your experience alone, since maybe only your card is that spectacular and the majority of the same cards are not. That is why NVIDIA advertises the power consumption it does; otherwise a lot of cards would not pass qualification, meaning fewer on the market, which means even higher prices for a product that is ridiculously highly priced anyway.
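Just to put rough numbers on that claim (the 100 FPS baseline below is made up purely for illustration; 429 W is the TPU gaming figure, and the -1.3% / -100 W point is the claim under discussion), a quick perf-per-watt check would look like this:

```c
/* Back-of-the-envelope perf-per-watt comparison. The 100 fps baseline is
 * an assumed placeholder; 429 W is the TPU gaming figure cited above and
 * the -1.3% / -100 W numbers are the claim being discussed. */
#include <stdio.h>

int main(void)
{
    const double stock_fps = 100.0;                      /* assumed baseline  */
    const double stock_w   = 429.0;                      /* TPU gaming power  */
    const double tuned_fps = stock_fps * (1.0 - 0.013);  /* -1.3% performance */
    const double tuned_w   = stock_w - 100.0;            /* -100 W claim      */

    printf("stock: %.3f fps/W\n", stock_fps / stock_w);  /* ~0.233 fps/W */
    printf("tuned: %.3f fps/W\n", tuned_fps / tuned_w);  /* ~0.300 fps/W */
    printf("gain : %.0f %%\n",
           100.0 * (tuned_fps / tuned_w) / (stock_fps / stock_w) - 100.0);
    return 0;
}
```

If the claim holds, that is roughly a 29% perf-per-watt improvement, which is exactly the kind of metric this argument needs instead of "low power" vs. "efficient".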
#131
AusWolf
ratirt: 1440p screens support 1080p.
Sure, but lower resolution images on higher resolution screens always look worse than they would on a lower native resolution screen. Doing stuff on my parents' old 900p monitor feels normal, but if I set the desktop resolution to 900p on my 1080p monitor, it'll look horrible.
ratirt: The same thing happened to me when I bought a 4K 60 Hz screen. When it came and I set it up, I asked myself why I did it, since 1080p is more than enough. Then I started playing and, well, I ended up buying a 4K 144 Hz screen along with a 6900 XT, so go figure.
I'd probably be the same, and that's what I want to avoid. ;)
fevgatos: I can post you some pictures later, for example from codbo; the native TAA looks like a huge downgrade compared to DLSS Quality. Lots of reviewers have actually reported the same thing, even Steve from HW Unboxed: that DLSS improves image quality in lots of situations. It's a complete game changer.
Sure, why not. I have my doubts about it helping in all cases, but I won't say no to taking a look.

There are quite a few bad AA implementations, though. The one in the Nvidia driver (I'm not sure what it's called) gave me some terrible shimmering around UI elements in Skyrim back in the day. I remember because it took me a while to figure out what the problem was.
#132
las
fevgatos: Of course maximum power draw is absolutely useless. Card A has 400w max power draw and 200w average, card B has 280w max and 250w average. Card A is clearly absolutely unarguably better at power draw. You can't even argue that.

So you proved me wrong by agreeing with me that the XT draws a lot more power. Great, and yes that's usually the case, you prove me wrong every single time by admitting that everything I said is absolutely the case. Gj, keep it up


Yeah, that 6c12t that amd launched in 2023 for 350 gives you great longevity over the 14c Intel offers. LOL
Efficiency cores are useless for gaming. Intel has 8 performance cores tops. If you believe their marketing, then you need to educate yourself.

The i9-13900K has 8 performance cores and 16 useless E-cores, yet it's marketed as a 24-core chip, LMAO.
#133
fevgatos
las: Efficiency cores are useless for gaming. Intel has 8 performance cores tops. If you believe their marketing, then you need to educate yourself.

The i9-13900K has 8 performance cores and 16 useless E-cores, yet it's marketed as a 24-core chip, LMAO.
First of all, not only is that irrelevant, it's also wrong. The 16 E-cores are exactly as useless as the 2nd CCD the 7950X has; no more, no less. Both are mainly useful only in heavily threaded workloads.

Now, with that said, with E-cores off in Cyberpunk, for example, performance drops. Dramatically, in fact: the Tom's Diner area goes from around 100-120 FPS to 80-95. So yeah...
#134
ratirt
AusWolf: Sure, but lower resolution images on higher resolution screens always look worse than they would on a lower native resolution screen. Doing stuff on my parents' old 900p monitor feels normal, but if I set the desktop resolution to 900p on my 1080p monitor, it'll look horrible.
I think that depends on the quality and size of the screen. I would not suggest getting a 32-inch screen if you are planning to play at 1080p at some point in the future and are not going to upgrade your GPU.
#135
fevgatos
AusWolf: Sure, but lower resolution images on higher resolution screens always look worse than they would on a lower native resolution screen. Doing stuff on my parents' old 900p monitor feels normal, but if I set the desktop resolution to 900p on my 1080p monitor, it'll look horrible.


I'd probably be the same, and that's what I want to avoid. ;)


Sure, why not. I have my doubts of it helping in all cases, but I won't say no to taking a look.

There are quite a few bad AA implementations, though. The one that's in the Nvidia driver (I'm not sure what it's called) gave me some terrible shimmering around UI elements in Skyrim back in the days. I remember because it took me a while to figure out what the problem was.
Of course it doesn't increase IQ in all games, but it doesn't really decrease it either. I cannot for the life of me tell the difference, not even with screenshots next to each other. Even Balanced looks great in static shots, but it loses out in motion, where you can see some minor artifacts; DLSS Quality, though, is amazing.
#136
ratirt
fevgatos: First of all, not only is that irrelevant, it's also wrong. The 16 E-cores are exactly as useless as the 2nd CCD the 7950X has; no more, no less. Both are mainly useful only in heavily threaded workloads.

Now, with that said, with E-cores off in Cyberpunk, for example, performance drops. Dramatically, in fact: the Tom's Diner area goes from around 100-120 FPS to 80-95. So yeah...
Not true.
TPU test.
www.techpowerup.com/review/rtx-4090-53-games-core-i9-13900k-e-cores-enabled-vs-disabled/2.html
#138
las
fevgatos: First of all, not only is that irrelevant, it's also wrong. The 16 E-cores are exactly as useless as the 2nd CCD the 7950X has; no more, no less. Both are mainly useful only in heavily threaded workloads.

Now, with that said, with E-cores off in Cyberpunk, for example, performance drops. Dramatically, in fact: the Tom's Diner area goes from around 100-120 FPS to 80-95. So yeah...
And in other games, like Far Cry 5, Metro Exodus, and GreedFall, performance is 10% lower with the crappy E-cores enabled. So yeah... It probably has better performance in some games because the 8 performance cores are maxed out; however, AMD has 12-16 cores on their high-end chips. As in true performance cores.

The 7950X's 2nd CCD has performance cores + SMT only. Zero garbage cores.

Efficiency cores make pretty much zero sense for desktop usage. And you are stuck with Windows 11 only, because without Thread Director support you will get wonky performance (software uses the wrong cores = crap performance).

The only reason Intel does it is to boost multithreaded performance, especially in synthetic tests like Cinebench, and to market the chips as higher core-count parts, but it's mostly just a marketing gimmick, because Intel has struggled for years with core count. They COULD have put 12 performance cores on the 13900K, but watt usage would explode; performance, however, would have been much better than it is. Sadly, Intel needs 5.5-6 GHz clock speeds to match AMD, and the upcoming Ryzen 7000 3D will beat Intel in gaming, again. The 7800X3D at 399 dollars will probably smack even the i9-13900KS, which will be a 799-dollar chip. Sad but true.

And the 7900 XTX gets closer and closer to the 4090, while beating the 4080 more and more: www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/31.html

Nvidia's answer will be the 4080 Ti and 4090 Ti. Gimpy gimpy time. The leather jacket will soon pull them out of the oven.
#139
ratirt
fevgatos: You are telling me what is and isn't true about a CPU I have, in a game I play. Seems legit. Did he test Tom's Diner or some other heavy area of the game? Because obviously, in a non-heavy area you don't need a gazillion cores; that's true for every game.
Are you telling everyone that @W1zzard, with all his assortment of CPUs and GPUs, performing benchmarks and testing stuff for us here, and literally showing the E-core dependency in games, is incorrect? Or what are you saying?
To be fair, I don't care what you have. Or is it like your 4090 claim? Considering how you showcase your 4090 as a legitimately low-power-consumption card and proclaim that it applies to all 4090s, I see a flaw in your conclusions, which makes you untrustworthy.
Another miracle uncovered by you: some games or applications use more cores, some don't. So you focus on whatever suits you. What is it you are trying to prove here again? That the CPU you have is exceptional? Great, lucky you.
#140
fevgatos
ratirt: Are you telling everyone that @W1zzard, with all his assortment of CPUs and GPUs, performing benchmarks and testing stuff for us here, and literally showing the E-core dependency in games, is incorrect? Or what are you saying?
To be fair, I don't care what you have. Or is it like your 4090 claim? Considering how you showcase your 4090 as a legitimately low-power-consumption card and proclaim that it applies to all 4090s, I see a flaw in your conclusions, which makes you untrustworthy.
Another miracle uncovered by you: some games or applications use more cores, some don't. So you focus on whatever suits you. What is it you are trying to prove here again? That the CPU you have is exceptional? Great, lucky you.
Never said the 4090 is a low-power card. I said it's incredibly efficient. Which it is.

Never said the 13900K is exceptional. Actually, I swapped back to my 12900K because I prefer it. What I did say is that E-cores are not useless in gaming, since there are games that benefit a lot from them. Try to actually argue with what people are saying instead of constantly strawmanning.
#141
Vya Domus
The E-cores on Intel's CPUs are most certainly not useless; sometimes you can lose almost 10% performance with them enabled, so they definitely do something, that's for sure. :roll:

#142
ratirt
fevgatos: Never said the 4090 is a low-power card. I said it's incredibly efficient. Which it is.

Never said the 13900K is exceptional. Actually, I swapped back to my 12900K because I prefer it. What I did say is that E-cores are not useless in gaming, since there are games that benefit a lot from them. Try to actually argue with what people are saying instead of constantly strawmanning.
I said your CPU is exceptional, not the 13900K. The 13900K was to showcase the E-cores in gaming.
Not a low-power card, OK. Efficient? If you talk about efficiency, you need to say: against what? AMD CPUs, since you have brought those up alongside your 4090? You need a metric, for instance how much power this device uses versus how much power my device uses for the same performance, or whatever other metric you have in mind when you talk about efficiency.
Yes, you did say the E-cores are not useless for games, and I told you they are, or that if there is a difference, it is so small it's pointless to mention it. Arguing that you have a CPU with E-cores and I don't, which somehow gives you the right to make false claims, is not OK. If you want, please refer to @W1zzard's test of E-cores and explain why it is wrong, since your "exceptional CPU" performs differently and W1zzard's findings are false.
#143
OmniaMorsAequat
OneMoar: In short, we lack the talent to compete, so we aren't even going to try.
Why does this sound so familiar?

Oh wait, this is exactly the same bullshit they said with Bulldozer.

I swear idiots run this company.

Furthermore, AI is becoming the hottest thing since the sun, and AMD is like "mmmm, no thanks, we are going to keep doing what we are doing,"
which is being a mediocre second fiddle to everybody else,
because that has worked so well before.
Arrrrrrrrrrg, this level of stupid, short-sighted quitter talk drives me batty.
What's wrong with what they reported? I don't see it as a bad idea to implement artificial intelligence in games (NPCs, etc.) instead of dedicating it to image processing like Nvidia does (Rockstar Games is doing something similar to AMD with their new game).
The 3D rendering process should be fully handled by the video card, not propped up artificially with extra frames at the cost of native resolution. I can understand the use of DLSS and Frame Generation in games whose ray tracing is complete and heavy enough to be hard for a video card to handle natively, and in any case you get good results even with traditional rendering methods.
#144
fevgatos
las: And in other games, like Far Cry 5, Metro Exodus, and GreedFall, performance is 10% lower with the crappy E-cores enabled. So yeah... It probably has better performance in some games because the 8 performance cores are maxed out; however, AMD has 12-16 cores on their high-end chips. As in true performance cores.

The 7950X's 2nd CCD has performance cores + SMT only. Zero garbage cores.

Efficiency cores make pretty much zero sense for desktop usage. And you are stuck with Windows 11 only, because without Thread Director support you will get wonky performance (software uses the wrong cores = crap performance).

The only reason Intel does it is to boost multithreaded performance, especially in synthetic tests like Cinebench, and to market the chips as higher core-count parts, but it's mostly just a marketing gimmick, because Intel has struggled for years with core count. They COULD have put 12 performance cores on the 13900K, but watt usage would explode; performance, however, would have been much better than it is. Sadly, Intel needs 5.5-6 GHz clock speeds to match AMD, and the upcoming Ryzen 7000 3D will beat Intel in gaming, again. The 7800X3D at 399 dollars will probably smack even the i9-13900KS, which will be a 799-dollar chip. Sad but true.

And the 7900 XTX gets closer and closer to the 4090, while beating the 4080 more and more: www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/31.html

Nvidia's answer will be the 4080 Ti and 4090 Ti. Gimpy gimpy time. The leather jacket will soon pull them out of the oven.
What the heck are you talking about? E-cores are EXACTLY as useful as the 2nd CCD. The 2nd CCD is also useful for Cinebench and synthetics. There is no application where the 2nd CCD boosts performance but the E-cores don't. So your opinion has to be based on pure bias and nothing more.
ratirt: I said your CPU is exceptional, not the 13900K. The 13900K was to showcase the E-cores in gaming.
Not a low-power card, OK. Efficient? If you talk about efficiency, you need to say: against what? AMD CPUs, since you have brought those up alongside your 4090? You need a metric, for instance how much power this device uses versus how much power my device uses for the same performance, or whatever other metric you have in mind when you talk about efficiency.
Yes, you did say the E-cores are not useless for games, and I told you they are, or that if there is a difference, it is so small it's pointless to mention it. Arguing that you have a CPU with E-cores and I don't, which somehow gives you the right to make false claims, is not OK. If you want, please refer to @W1zzard's test of E-cores and explain why it is wrong, since your "exceptional CPU" performs differently and W1zzard's findings are false.
Well, in most games more than 8 cores are useless, so of course the E-cores are as well. That has nothing to do with the E-cores themselves; it has to do with the games not requiring that many cores. The same thing applies to the 7950X, with its 2nd CCD being useless in most games except for the minority that scale beyond 8 cores.
#145
las
fevgatos: What the heck are you talking about? E-cores are EXACTLY as useful as the 2nd CCD. The 2nd CCD is also useful for Cinebench and synthetics. There is no application where the 2nd CCD boosts performance but the E-cores don't. So your opinion has to be based on pure bias and nothing more.

Well, in most games more than 8 cores are useless, so of course the E-cores are as well. That has nothing to do with the E-cores themselves; it has to do with the games not requiring that many cores. The same thing applies to the 7950X, with its 2nd CCD being useless in most games except for the minority that scale beyond 8 cores.
What are YOU talking about? Both CCDs have a CCX with 8C/16T of performance cores only. The 2nd CCD runs at slightly lower clock speeds BUT USES THE SAME MICROARCHITECTURE, and the cores are still FAST, with no risk of E-cores complicating software compatibility. 7950X = 16 performance cores, 13900K = 8 performance cores.

Intel's E-cores use a dated and MUCH LOWER-CLOCKED microarchitecture. The cores are NOT FAST at all. Their primary goal is to fool consumers into thinking the chip has more cores than it does.
The i5-13600K is a "14-core chip" but only has 6 performance cores :roll: Intel STILL only has 6-8 performance cores across the board on upper-segment mainstream chips. The rest is useless E-cores.

Ryzen 7000 chips with 3D cache will beat Intel in gaming anyway. Hell, even the 400-dollar 7800X3D will beat the 13900KS with its 6 GHz boost, twice if not triple the peak watt usage, and 800-dollar price tag, and Intel will abandon the platform after 2 years as usual, meaning 14th gen will require a new socket and a new board. Milky milky time. Intel's architecture is inferior, which is why they need to run high clock speeds to be able to compete; SADLY for Intel, this means high watt usage.

However, the i9-12900K/KS and i9-13900K/KS are pointless chips for gamers, since the i7 delivers the same gaming performance anyway, without the HUGE watt usage. Hell, even i5s are within a few percent.
#146
HTC
Vya Domus: The E-cores on Intel's CPUs are most certainly not useless; sometimes you can lose almost 10% performance with them enabled, so they definitely do something, that's for sure. :roll:

I've seen this one before, but I don't recall if there have been similar tests for the 7900X and 7950X CPUs with CCD2 disabled vs. both CCDs enabled. It would be interesting to see, I think, whether it's Intel's approach that is slower with E-cores disabled, or AMD's approach that is slower with the 2nd CCD disabled.
#147
ratirt
fevgatos: Well, in most games more than 8 cores are useless, so of course the E-cores are as well. That has nothing to do with the E-cores themselves; it has to do with the games not requiring that many cores. The same thing applies to the 7950X, with its 2nd CCD being useless in most games except for the minority that scale beyond 8 cores.
Can you point to a game that uses E-cores along with P-cores?
#148
fevgatos
ratirt: Can you point to a game that uses E-cores along with P-cores?
Yeah, plenty: Cyberpunk, Spider-Man Remastered, Spider-Man: Miles Morales.
las: What are YOU talking about? Both CCDs have a CCX with 8C/16T of performance cores only. The 2nd CCD runs at slightly lower clock speeds BUT USES THE SAME MICROARCHITECTURE, and the cores are still FAST, with no risk of E-cores complicating software compatibility. 7950X = 16 performance cores, 13900K = 8 performance cores.

Intel's E-cores use a dated and MUCH LOWER-CLOCKED microarchitecture. The cores are NOT FAST at all. Their primary goal is to fool consumers into thinking the chip has more cores than it does.
The i5-13600K is a "14-core chip" but only has 6 performance cores :roll: Intel STILL only has 6-8 performance cores across the board on upper-segment mainstream chips. The rest is useless E-cores.

Ryzen 7000 chips with 3D cache will beat Intel in gaming anyway. Hell, even the 400-dollar 7800X3D will beat the 13900KS with its 6 GHz boost, twice if not triple the peak watt usage, and 800-dollar price tag, and Intel will abandon the platform after 2 years as usual, meaning 14th gen will require a new socket and a new board. Milky milky time. Intel's architecture is inferior, which is why they need to run high clock speeds to be able to compete; SADLY for Intel, this means high watt usage.

However, the i9-12900K/KS and i9-13900K/KS are pointless chips for gamers, since the i7 delivers the same gaming performance anyway, without the HUGE watt usage. Hell, even i5s are within a few percent.
You said E-cores are useless. I said they are as useless as the 2nd CCD. Can you actually give me a couple of applications where the 2nd CCD boosts performance but the E-cores don't? If not, then you HAVE to admit that E-cores are as useless as the 2nd CCD.
#149
ratirt
fevgatos: Yeah, plenty: Cyberpunk, Spider-Man Remastered, Spider-Man: Miles Morales.
Can you show me any proof that E-cores are being used simultaneously with P-cores by the games you have pointed out?
#150
fevgatos
ratirt: Can you show me any proof that E-cores are being used simultaneously by the games you have pointed out?
Sure, is a video showing each individual core's usage enough as proof?
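For anyone who wants to check this themselves rather than trust a video, here is a rough sketch of logging per-core usage while a game runs. It reads Linux's /proc/stat; on Windows the equivalent information comes from Task Manager's per-logical-processor view or a perfmon counter per core. The one-second sample window and the 128-core cap are arbitrary choices.

```c
/* Rough sketch: sample /proc/stat twice and print per-core busy %.
 * Linux-only; on Windows use Task Manager's per-logical-processor view
 * or a perfmon "% Processor Time" counter per core instead. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define MAX_CPUS 128

typedef struct { unsigned long long busy, total; } CpuSample;

static int read_cpus(CpuSample *s)
{
    FILE *f = fopen("/proc/stat", "r");
    char line[512];
    int n = 0;

    if (!f) return 0;
    while (n < MAX_CPUS && fgets(line, sizeof line, f)) {
        unsigned long long user, nice, sys, idle, iowait, irq, softirq;

        /* Only the per-core lines ("cpu0 ...", "cpu1 ..."), not the
         * aggregate "cpu ..." line or the other /proc/stat entries. */
        if (strncmp(line, "cpu", 3) != 0 || line[3] < '0' || line[3] > '9')
            continue;
        if (sscanf(line, "cpu%*d %llu %llu %llu %llu %llu %llu %llu",
                   &user, &nice, &sys, &idle, &iowait, &irq, &softirq) != 7)
            continue;
        s[n].busy  = user + nice + sys + irq + softirq;
        s[n].total = s[n].busy + idle + iowait;
        n++;
    }
    fclose(f);
    return n;
}

int main(void)
{
    CpuSample before[MAX_CPUS], after[MAX_CPUS];
    int n;

    read_cpus(before);
    sleep(1);                          /* 1-second sample window */
    n = read_cpus(after);

    for (int i = 0; i < n; i++) {
        double busy  = (double)(after[i].busy  - before[i].busy);
        double total = (double)(after[i].total - before[i].total);
        printf("cpu%-3d %5.1f %% busy\n", i,
               total > 0.0 ? 100.0 * busy / total : 0.0);
    }
    return 0;
}
```

Run it while the game is in a heavy scene: if the logical processors backed by E-cores sit near idle while the P-cores are pegged, that answers the question one way; if they show meaningful load, it answers it the other way.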