Friday, August 25th 2023

AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

In addition to the Radeon RX 7800 XT and RX 7700 XT graphics cards, AMD announced FidelityFX Super Resolution 3 Fluid Motion (FSR 3 Fluid Motion), the company's performance-enhancement technology designed to rival NVIDIA DLSS 3 Frame Generation. The biggest piece of news here is that, unlike DLSS 3, which is restricted to GeForce RTX 40-series "Ada" GPUs, FSR 3 enjoys the same kind of cross-brand hardware support as FSR 2. It works on the latest Radeon RX 7000 series and the previous-generation RX 6000 series RDNA 2 graphics cards, as well as NVIDIA GeForce RTX 40-series, RTX 30-series, and RTX 20-series. It might even be possible to use FSR 3 with Intel Arc A-series, although AMD wouldn't confirm it.

FSR 3 Fluid Motion is a frame-rate doubling technology that generates alternate frames by estimating an intermediate frame between two frames rendered by the GPU (which is essentially what DLSS 3 does). The company did not detail the underlying technology behind FSR 3 in its pre-briefing, but showed an example of FSR 3 implemented in "Forspoken": where the game puts out 36 FPS at 4K native resolution, it runs at 122 FPS with the FSR 3 "Performance" preset (upscaling + Fluid Motion + Anti-Lag). At 1440p native with ultra-high ray tracing, "Forspoken" puts out 64 FPS, which rises to 106 FPS with Fluid Motion frames and Anti-Lag while staying at native resolution (no upscaling). The Maximum Fidelity preset of FSR 3 is essentially AMD's version of DLAA, using the detail-reconstruction and anti-aliasing features of FSR without dropping the render resolution.
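To make the frame-interpolation idea concrete, below is a minimal, purely illustrative Python sketch. AMD did not disclose FSR 3's underlying algorithm; real frame generation (FSR 3 Fluid Motion, DLSS 3 Frame Generation) reprojects pixels using motion vectors and optical flow rather than the naive blend shown here, so treat this strictly as a conceptual stand-in.

import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5) -> np.ndarray:
    # Crude stand-in for frame generation: linearly blend two GPU-rendered frames.
    # Real implementations reproject pixels along motion vectors instead of blending.
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two hypothetical 1080p RGB frames rendered by the GPU
frame_a = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_b = np.full((1080, 1920, 3), 255, dtype=np.uint8)

generated = interpolate_frame(frame_a, frame_b)   # presented between frame_a and frame_b
print(generated.shape, generated[0, 0])           # (1080, 1920, 3) [127 127 127]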
AMD announced just two debut titles for FSR 3 Fluid Motion: the already-released "Forspoken," and "Immortals of Aveum," which released earlier this week. The company announced that it is working with game developers to bring FSR 3 support to "Avatar: Frontiers of Pandora," "Cyberpunk 2077," "Warhammer 40,000: Space Marine 2," "Frostpunk 2," "The Alters," "Squad," "Starship Troopers: Extermination," "Black Myth: Wukong," "Crimson Desert," and "Like a Dragon: Infinite Wealth." The company is working with nearly all leading game publishers and game-engine developers to add FSR 3 support, including Ascendant Studios, Square Enix, Ubisoft, CD Projekt Red, Saber Interactive, Focus Entertainment, 11 bit studios, Unreal Engine, Sega, and Bandai Namco Reflector.
AMD is also working to make FSR 3 Fluid Motion frames part of the AMD HYPR-RX feature that the company is launching soon. This is big, as it means pretty much any DirectX 11 or DirectX 12 game will get Fluid Motion frames; that capability launches in Q1 2024.

Both "Forspoken" and "Immortals of Aveum" will get FSR 3 patches this Fall.

362 Comments on AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

#326
kapone32
fevgatosAnd are you suggesting I have issues running games on high fps on my rig? I don't understand what point you are trying to make.

My gpu is power limited to 70%, so basically 320w max. Highest peak power draw was on tlou at 550w, average is around 440-480 measured from the wall, that's on 4k. Cpu usually draws 50 to 70w depending on the game unless I go 1080p dlss performance in which case it peaks at 110w in tlou. Other games it stays below 100.

Btw your cpu power draw is disgusting. 50w on almost idle, yikes. Mine drops down to less than 3 watts. I'm browsing the web watching 2 videos streaming at 5-8 watts.
I am not suggesting anything, just responding to the statement that your system "Absolutely demolishes" mine. You might as well have bought a 4080 if your focus is 1080P. Now I only play at 4K and you saw my GPU power limit at 318 W. If you actually took the time to look at the Games you would see high refresh rates. So where are your numbers? I will never understand why people don't read from right to left to understand that what you saw was the CPU power limit. What? Yes you can turn your 7900X3D into a 65 Watt CPU and still get over 5 GHz on all cores. I don't know what I am talking about though. The fact that you need 100 Watts to feed a GPU that is cut off to 70% should be worrying. I guess you should use SAM or Rebar
Posted on Reply
#327
JustBenching
AusWolfI'll never buy a new monitor not being sure if my PC will play games fluidly on it without DLSS/FSR/FG. A couple weeks ago, I was briefly considering opting in for a 1440p ultrawide (3440x1440), but considering that it's 1.34x the pixels as normal 1440p, or 2.38x the pixels as 1080p, I'm not so sure. I can't justify replacing a monitor that I'm perfectly happy with for one that may or may not play my games fluidly at native res.
But each to their own.


Are you saying that FG is mainly meant for spectators? Nobody watches me play my games (thank God), so what's the point, then? 70 or 120 FPS doesn't make any difference to me whatsoever.
I didn't say you should buy a new monitor. I'm saying when you are deciding what monitor to buy, if your card can only handle 1080p you should get a 1440p monitor and use dlss / fsr. It's kind of a no brainer honestly.

No, I'm not saying fg is for spectators. I'm saying it increases visual smoothness a truckload
kapone32I am not suggesting anything, just responding to the statement that your system "Absolutely demolishes" mine. You might as well have bought a 4080 if your focus is 1080P. Now I only play at 4K and you saw my GPU power limit at 318 W. If you actually took the time to look at the Games you would see high refresh rates. So where are your numbers? I will never understand why people don't read from right to left to understand that what you saw was the CPU power limit. What? Yes you can turn your 7900X3D into a 65 Watt CPU and still get over 5 GHz on all cores. I don't know what I am talking about though. The fact that you need 100 Watts to feed a GPU that is cut off to 70% should be worrying. I guess you should use SAM or Rebar
My focus is 4k. Cpu is at 50 to 70w while gaming and browses the web at less than 5. I bet a paycheck the whole system consumes less power while driving more fps than yours.

Post your benchmark at a game of your choice, I'll post mine. Until then just stahp
Posted on Reply
#328
TheoneandonlyMrK
fevgatosI didn't say you should buy a new monitor. I'm saying when you are deciding what monitor to buy, if your card can only handle 1080p you should get a 1440p monitor and use dlss / fsr. It's kind of a no brainer honestly.

No, I'm not saying fg is for spectators. I'm saying it increases visual smoothness a truckload


My focus is 4k. Cpu is at 50 to 70w while gaming and browses the web at less than 5. I bet a paycheck the whole system consumes less power while driving more fps than yours.

Post your benchmark at a game of your choice, I'll post mine. Until then just stahp
So does v-sync or triple buffer v-sync, but it's not popular to use

Latency is key, and Reflex does not completely remove the latency cost of FG. No demo, bench, or review I have seen suggests this, just you.

Nonetheless, FSR 3 might make all other types moot with good enough IQ and superior support.

Or not.

Oh, and you paid 1.5-2X or got lucky, but measuring your e-peen is a fitting look on you.
Posted on Reply
#329
JustBenching
TheoneandonlyMrKSo does v-sync or triple buffer v-sync, but it's not popular to use

Latency is key, and Reflex does not completely remove the latency cost of FG. No demo, bench, or review I have seen suggests this, just you.

Nonetheless, FSR 3 might make all other types moot with good enough IQ and superior support.

Or not.

Oh, and you paid 1.5-2X or got lucky, but measuring your e-peen is a fitting look on you.
Well, he started measuring it without even having one. I mean, I know you know who started it but of course you have to quote me cause he is on your team. Boring.

Have you seen a review between native reflex off and fg + reflex on? Care to share it with me please?
Posted on Reply
#330
AusWolf
fevgatosI didn't say you should buy a new monitor. I'm saying when you are deciding what monitor to buy, if your card can only handle 1080p you should get a 1440p monitor and use dlss / fsr. It's kind of a no brainer honestly.
That's what I'll never do. If my card can only handle 1080p, I'll leave it at 1080p. I see no point in spending money on a monitor only to have to spend money on a faster graphics card as well. I'm not denying that DLSS/FSR can work well at 1440p (nor do I support the claim), as I've never seen it, but my 1080p experiences have relegated the technology into a "last resort before a GPU upgrade" status.

DLSS/FSR support will never be a factor when I'm choosing a new GPU, that's for sure.
fevgatosNo, I'm not saying fg is for spectators. I'm saying it increases visual smoothness a truckload
But you also acknowledged that it increases input latency, which isn't ideal when you're playing a game.
fevgatosWell, he started measuring it without even having one. I mean, I know you know who started it but of course you have to quote me cause he is on your team. Boring.
In case you're referring to me, I'm not on anyone's team. I buy the GPU that fits my purpose and budget, and I don't care about FSR, just as much as I don't care about DLSS, either.

I currently have 7 graphics cards (8 if you count the fake 550 Ti I got from a friend for free), and only 2 of them are AMD, the rest are Nvidia, so tell me who's side I'm on. ;)
Posted on Reply
#331
JustBenching
AusWolfThat's what I'll never do. If my card can only handle 1080p, I'll leave it at 1080p. I see no point in spending money on a monitor only to have to spend money on a faster graphics card as well. I'm not denying that DLSS/FSR can work well at 1440p (nor do I support the claim), as I've never seen it, but my 1080p experiences have relegated the technology into a "last resort before a GPU upgrade" status.

DLSS/FSR support will never be a factor when I'm choosing a new GPU, that's for sure.


But you also acknowledged that it increases input latency, which isn't ideal when you're playing a game.
And all I'm saying is that you should reconsider your stance.

I mean, are we in agreement that 1440p + DLSS Q / FSR Q looks better than native 1080p? To me that's obvious, and if you agree with that, then getting the 1440p monitor and using upscaling is just a no-brainer, since you'll get better image quality for a similar framerate.
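For reference, here is the rough pixel math behind that argument, assuming the usual ~67% per-axis internal render scale of the DLSS/FSR Quality presets:

# Quality mode in DLSS 2/3 and FSR 2/3 renders at ~66.7% of the output resolution
# per axis (a 1.5x upscale factor), then reconstructs to the target resolution.

def internal_pixels(width: int, height: int, scale: float = 1 / 1.5) -> int:
    return int(width * scale) * int(height * scale)

native_1080p = 1920 * 1080                     # 2,073,600 pixels rendered per frame
quality_1440p = internal_pixels(2560, 1440)    # 1706 x 960 = 1,637,760 pixels rendered

print(f"native 1080p internal:  {native_1080p:,}")
print(f"1440p Quality internal: {quality_1440p:,}")
# The upscaled path actually renders ~21% fewer pixels per frame than native 1080p,
# which is why it can deliver a similar framerate at the higher output resolution.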
Posted on Reply
#332
AusWolf
fevgatosAnd all I'm saying is that you should reconsider your stance.
I might later... after another GPU upgrade perhaps, to make sure I won't have to resort to using DLSS/FSR.
fevgatosI mean, are we in agreement that 1440p + DLSS Q / FSR Q looks better than native 1080p? To me that's obvious, and if you agree with that, then getting the 1440p monitor and using upscaling is just a no-brainer, since you'll get better image quality for a similar framerate.
I can't agree with, or deny something that I've never seen.
Posted on Reply
#333
tfdsaf
fevgatosAt the 4070ti launch the 7900xt was between 120 and 140 euros more expensive. I'm not talking about msrp, I'm talking about actual pricing in Europe. So yes, the 4070ti was even better at raster per dollar, on top of better rt performance, much lower power draw etc.

Nvidia does have a ton more features. It supports both dlss and fsr, both fg and now fsr 3, Cuda, reflex. Abd the list goes on


Why would I be getting shutdowns? I have no idea what you are talking about. My system draws 550w max from the wall while playing tlou, and of course it absolutely demolishes yours in performance. We can test it if you want to
What are those features? AMD has radeon Chill, radeon Boost, Radeon antilag, RSR, FSR, oneclick automatic overclocking, one button record, Freesync, Radeon image sharpening, AMD integer scaling, noise suppression, AMD link, AMD privacy view, smart access memory, smartaccess graphics, Fidelity FX, image generation, etc...

Which features does Nvidia have? As far as I can tell AMD has a ton more features and all of those are free and open source, unlike Ngreedia which are all closed off, all cost a premium and don't have any wide support.
Posted on Reply
#334
phanbuey
AusWolfI don't know. Cyberpunk at DLSS Quality seemed simply just a tad blurry to me.
Any upscaling at 1080P is terrible - it's really a tech to push higher resolutions. 1440P is a bit of a toss up - really depends on the game. 4K is where it really makes a difference.
Posted on Reply
#335
gffermari
@AusWolf
your gpu is begging to play at 1440p with FSR on. :p

My previous AMD gpu was a 5700XT, so I haven't played any games with FSR on.
But I've seen a friend's 6800XT and I don't really understand why someone wouldn't turn it on 100 out of 100 times that it is available.

On topic regarding FSR 3 (and DLSS 3 FG), if my card is not capable of running a game at 60fps, FG won't save me.
If my gpu can run it at 60+, then I don't need it.
My 4080 can run Hogwarts Legacy at more than 60 but with frame drops at 50-60s. I think I still prefer the frame drops instead of DLSS 3 FG.
((the only case where it's practically needed is when you want to run path-traced games. I don't see any other use case))

Similarly, FSR 3 won't save the lower-end cards. It can just offer a smoother experience on already capable cards like the 6700/6800 and better.
Posted on Reply
#336
AusWolf
gffermari@AusWolf
your gpu is begging to play at 1440p with FSR on. :p

My previous AMD gpu was a 5700XT, so I haven't played any games with FSR on.
But I've seen a friend's 6800XT and I don't really understand why someone wouldn't turn it on 100 out of 100 times that it is available.
Let me think about it... (5 seconds later) Nah, it'll be fine. :p Seriously, though, I can't imagine how buying a 1440p or 4K monitor only to resort to upscaling is not a waste of money. Maybe I could imagine it if I had one, but I'm not gonna spend hundreds just to try and risk being disappointed.
gffermariOn topic regarding FSR 3 (and DLSS 3 FG), if my card is not capable of running a game at 60fps, FG won't save me.
If my gpu can run it at 60+, then I don't need it.
Exactly! Why this feature *khm... gimmick, is a selling point of any GPU is beyond me.
Posted on Reply
#337
TheoneandonlyMrK
fevgatosWell, he started measuring it without even having one. I mean, I know you know who started it but of course you have to quote me cause he is on your team. Boring.

Have you seen a review between native reflex off and fg + reflex on? Care to share it with me please?
The laws of physics dictate that two frames take longer to make than one frame (no matter what GPU)

The law of Fg is that two frames are made and one inserted in between.

I've seen reviewers comment on the latency even with Reflex enabled. In fact, just today UFD Tech reported just that, and he's used it, so I tend to believe physical science and reviewers over forum chatter.

Because you know Physics
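To put rough numbers on that point, here's a back-of-the-envelope sketch. The figures are assumed for illustration, not measurements of FSR 3 or DLSS 3:

# The in-between frame can only be built once the *next* real frame has been rendered,
# so the newest real frame is held back while the generated one is displayed first.
real_fps = 60.0
frame_time_ms = 1000.0 / real_fps    # ~16.7 ms per real frame
interp_cost_ms = 3.0                 # assumed cost of generating the in-between frame

latency_without_fg = frame_time_ms                                    # present the finished frame immediately
latency_with_fg = frame_time_ms + frame_time_ms / 2 + interp_cost_ms  # hold frame N while the generated one is shown

print(f"approx. latency without FG: {latency_without_fg:.1f} ms")
print(f"approx. latency with FG:    {latency_with_fg:.1f} ms")
# Reflex / Anti-Lag can claw some of this back by shortening the render queue,
# but the extra buffering inherent to interpolation does not go away.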
Posted on Reply
#338
JustBenching
TheoneandonlyMrKThe laws of physics dictate that two frames take longer to make than one frame (no matter what GPU)
But without Reflex the frame that is getting rendered isn't the latest frame your CPU prepared. That's the whole point of reflex.
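A simplified way to picture what a render-queue limiter like Reflex changes (illustrative, assumed numbers, not NVIDIA's actual pipeline):

frame_time_ms = 16.7
for queued_frames in (2, 0):   # ~2 frames queued without a limiter, ~0 with one (assumed)
    staleness_ms = queued_frames * frame_time_ms
    print(f"queue depth {queued_frames}: input is ~{staleness_ms:.1f} ms stale before the GPU even starts the frame")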
tfdsafWhat are those features? AMD has radeon Chill, radeon Boost, Radeon antilag, RSR, FSR, oneclick automatic overclocking, one button record, Freesync, Radeon image sharpening, AMD integer scaling, noise suppression, AMD link, AMD privacy view, smart access memory, smartaccess graphics, Fidelity FX, image generation, etc...

Which features does Nvidia have? As far as I can tell AMD has a ton more features and all of those are free and open source, unlike Ngreedia which are all closed off, all cost a premium and don't have any wide support.
Everything you listed, nvidia has as well. Half of them also work much better on nvidia (noise suppression for example). Amd half bakes half their stuff.
AusWolfLet me think about it... (5 seconds later) Nah, it'll be fine. :p Seriously, though, I can't imagine how buying a 1440p or 4K monitor only to resort to upscaling is not a waste of money. Maybe I could imagine it if I had one, but I'm not gonna spend hundreds just to try and risk being disappointed.
First of all, it can't be a waste of money cause it's not like they cost extra. A 27" 1440p costs roughly the same money as a 27" 4k. Second of all, how is getting better image quality a waste of money? Why did you even buy a new GPU then, if you don't care about image quality you should have kept your old GPU and just drop settings to low, lol :D
TheoneandonlyMrKBecause you know Physics
Yeah, because I knew - as usual - you are full of it, I just started looking around for reviews. And lo and behold, as I've been saying, FG + Reflex = lower latency than straight up native without Reflex. Because you know...physics right? 101.5ms of latency is what every AMD card would get in this scenario since they don't have Reflex. So FG is smoother than straight up native on an amd card, rofl.

Posted on Reply
#339
Steevo
fevgatosBut without Reflex the frame that is getting rendered isn't the latest frame your CPU prepared. That's the whole point of reflex.


Everything you listed, nvidia has as well. Half of them also work much better on nvidia (noise suppression for example). Amd half bakes half their stuff.


First of all, it can't be a waste of money cause it's not like they cost extra. A 27" 1440p costs roughly the same money as a 27" 4k. Second of all, how is getting better image quality a waste of money? Why did you even buy a new GPU then, if you don't care about image quality you should have kept your old GPU and just drop settings to low, lol :D


Yeah, because I knew - as usual - you are full of it, I just started looking around for reviews. And lo and behold, as I've been saying, FG + Reflex = lower latency than straight up native without Reflex. Because you know...physics right? 101.5ms of latency is what every AMD card would get in this scenario since they don't have Reflex. So FG is smoother than straight up native on an amd card, rofl.

Nvidia PhysX amirite? Hairworkx too!!! It's almost like they make up gimmicks to sell hardware.... like every other for-profit company. Brand loyalty only shows how much of a tool anyone is; we all buy products that we like. Maybe go talk about your glorious master in another forum instead of thread crapping?
Posted on Reply
#340
AusWolf
fevgatosAmd half bakes half their stuff.
Huh?

Edit: Sure, Dr Su doesn't pull GPUs out of her oven while wearing a leather jacket, but no AMD setting/option feels half-baked to me.
fevgatosFirst of all, it can't be a waste of money cause it's not like they cost extra. A 27" 1440p costs roughly the same money as a 27" 4k.
...which is way more money than not buying anything.
fevgatosSecond of all, how is getting better image quality a waste of money? Why did you even buy a new GPU then, if you don't care about image quality you should have kept your old GPU and just drop settings to low, lol :D
That's not a valid comparison. You roughly know what you get when you buy a GPU, because you've read the reviews - the performance numbers are there, the power consumption numbers are there, etc. Upping your resolution with DLSS/FSR on the other hand, is a totally unknown territory that you can't get a glimpse of by looking at static images and pre-recorded gameplay videos on a 1080p screen. The only way to see if you like it is to try it first-hand. I can only base anything I say about DLSS/FSR on my 1080p experiences, which I've been told are not valid for higher resolutions, which I can't prove or disprove without seeing it.

Seriously, does anyone have a spare 3440x1440 monitor that I could borrow for a week or so? :D
fevgatosYeah, because I knew - as usual - you are full of it, I just started looking around for reviews. And lo and behold, as I've been saying, FG + Reflex = lower latency than straight up native without Reflex. Because you know...physics right? 101.5ms of latency is what every AMD card would get in this scenario since they don't have Reflex. So FG is smoother than straight up native on an amd card, rofl.

Edited: Just as they said in the UFD Tech video today: frame gen gives you worse latency than native, even with Reflex enabled. So basically, you have to enable DLSS in order to have the same latency with frame gen that you would have at native with no frame gen. I'm curious how that affects one's gameplay experience.

What you're saying about AMD cards is only wrong because Anti-Lag exists and works very well, in my opinion. So in your narrative, an equivalent AMD card would be around the 63 ms range with Anti-Lag on.
Posted on Reply
#341
TheoneandonlyMrK
fevgatosBut without Reflex the frame that is getting rendered isn't the latest frame your CPU prepared. That's the whole point of reflex.


Everything you listed, nvidia has as well. Half of them also work much better on nvidia (noise suppression for example). Amd half bakes half their stuff.


First of all, it can't be a waste of money cause it's not like they cost extra. A 27" 1440p costs roughly the same money as a 27" 4k. Second of all, how is getting better image quality a waste of money? Why did you even buy a new GPU then, if you don't care about image quality you should have kept your old GPU and just drop settings to low, lol :D


Yeah, because I knew - as usual - you are full of it, I just started looking around for reviews. And lo and behold, as I've been saying, FG + Reflex = lower latency than straight up native without Reflex. Because you know...physics right? 101.5ms of latency is what every AMD card would get in this scenario since they don't have Reflex. So FG is smoother than straight up native on an amd card, rofl.

You had to find Psycho RT on Nvidia's sponsored EXAMPLE TITLE to fit your narrative, go you.
Posted on Reply
#342
R0H1T
fevgatosCare to share it with me please?
Not to go way off topic but anyways you were about to show the peak power consumption with AVX512 on 12th gen? Did you post that somewhere or did I miss it o_O

Since you love to quote "numbers" I'd also love to see these "real" tests!
TheoneandonlyMrKPsycho RT
For a second I thought you were talking about JHH :laugh:
Posted on Reply
#343
JustBenching
TheoneandonlyMrKYou had to find Psycho RT on Nvidia's sponsored EXAMPLE TITLE to fit your narrative, go you.
How is RT or the sponsor relevant here? Why can't people just flat out admit they were wrong...I don't get it.

Here you go, this is AMD sponsored. I'm waiting for another excuse just so you don't have to admit what's in front of you

R0H1TNot to go way off topic but anyways you were about to show the peak power consumption with AVX512 on 12th gen? Did you post that somewhere or did I miss it o_O

Since you love to quote "numbers" I'd also love to see these "real" tests!
I've no idea what you are talking about. Care to remind me - send a link or something?

Peak power draw is irrelevant anyways. If you remove the power limits then a 12900k might hit something like 270 watts at ycruncher. If you power limit it then it will draw as much as your limits dictate
Posted on Reply
#344
R0H1T
The point was, since you're hell bent on numbers, that Intel's hard TDP limits can be exceeded on motherboards given the right task (or stress test) ~ we saw that under some reviews but as usual you tend to have "memento" memory when it comes to stuff like this! And of course your point was AMD's TDP numbers are fake, so are Intel's btw :rolleyes:
Posted on Reply
#345
JustBenching
AusWolf...which is way more money than not buying anything.
If you don't have a monitor, and you don't buy anything, you won't have a monitor. So what the heck are you talking about :D
AusWolfEdited: Just as they said in the UFD Tech video today: frame gen gives you worse latency than native, even with Reflex enabled. So basically, you have to enable DLSS in order to have the same latency with frame gen that you would have at native with no frame gen. I'm curious how that affects one's gameplay experience.
I haven't seen the video, but is he testing native with Reflex on or off? That's the important part
AusWolfWhat you're saying about AMD cards is is only wrong because Anti-Lag exists and works very well, in my opinion. So in your narrative, an equivalent AMD card would be around the 63 ms range with Anti-Lag on.
Antilag has nothing to do with reflex so no, it wouldn't be at 63ms. There are tests out there you know, why do we have to make stuff up?


Reflex makes a significant difference in latency, Anti-Lag doesn't. Here you go. As I've been saying, Reflex + FG = same latency as AMD at native. So if you have an AMD card, you just cannot talk about latency. It's a joke, you should stop.

R0H1Tthat Intel's hard TDP limits can be exceeded on motherboards given the right task (or stress test) ~
Nope, that can't happen. If you've seen it in a review please link the review. What you are saying is literally impossible. It just can't happen. I can power limit my CPU to 35w and run ycruncher, it will not draw a single watt over 35. Ever. Under no circumstances. You are just wrong.
Posted on Reply
#346
Steevo
fevgatosNope, that can't happen. If you've seen it in a review please link the review. What you are saying is literally impossible. It just can't happen. I can power limit my CPU to 35w and run ycruncher, it will not draw a single watt over 35. Ever. Under no circumstances. You are just wrong.
www.techpowerup.com/review/intel-core-i9-13900ks/21.html

"
Core i9-13900KS[RIGHT]$800
MSRP: $700[/RIGHT]
[RIGHT]8+16 / 32[/RIGHT][RIGHT]3.2 / 2.4 GHz[/RIGHT][RIGHT]6.0 / 4.3 GHz [/RIGHT][RIGHT]36 MB[/RIGHT][RIGHT]150 W[/RIGHT][RIGHT]Raptor Lake[/RIGHT][RIGHT]10 nm[/RIGHT][RIGHT]LGA 1700"[/RIGHT]


tpucdn.com/review/intel-core-i9-13900ks/images/power-per-application.png
Posted on Reply
#347
JustBenching
Steevowww.techpowerup.com/review/intel-core-i9-13900ks/21.html

"
Core i9-13900KS[RIGHT]$800
MSRP: $700[/RIGHT]
[RIGHT]8+16 / 32[/RIGHT][RIGHT]3.2 / 2.4 GHz[/RIGHT][RIGHT]6.0 / 4.3 GHz[/RIGHT][RIGHT]36 MB[/RIGHT][RIGHT]150 W[/RIGHT][RIGHT]Raptor Lake[/RIGHT][RIGHT]10 nm[/RIGHT][RIGHT]LGA 1700"[/RIGHT]


tpucdn.com/review/intel-core-i9-13900ks/images/power-per-application.png
And that somehow disproves what I said, how? lol

From the link you just posted " the i9-13900KS has a 320 W maximum turbo power mode "Extreme Power Delivery Profile" that is enabled by default in the motherboard BIOS"

Yeah, thanks for proving me right I guess :roll:
Posted on Reply
#348
Steevo
fevgatosAnd that somehow disproves what I said, how? lol

From the link you just posted " the i9-13900KS has a 320 W maximum turbo power mode "Extreme Power Delivery Profile" that is enabled by default in the motherboard BIOS"

Yeah, thanks for proving me right I guess :roll:
So Intel's out-of-the-box default is to use up to 320W? No, their official power spec is 253W: www.intel.com/content/www/us/en/products/sku/232167/intel-core-i913900ks-processor-36m-cache-up-to-6-00-ghz/specifications.html

"The processor base power is now stepped up to 150 W, compared to 125 W for the i9-13900K, while interestingly, the maximum turbo power stays at 253 W. Intel may not list it in the specs, particularly its ARK product information page, but the i9-13900KS has a 320 W maximum turbo power mode "Extreme Power Delivery Profile" that is enabled by default in the motherboard BIOS, making it the processor's unofficial maximum turbo power value."

We should ask W1zz to review CPUs in their stock and official state instead of the way they come out of the box; it just makes AMD look better.
Posted on Reply
#349
JustBenching
SteevoSo Intels out of the box default is to use up to 320W? No, their official power spec is 253W
Yes, the 13900KS default is 320 W. Actually, most mobo manufacturers run every Intel CPU without any power limits.
Posted on Reply
#350
Assimilator
SteevoWe should ask W1zz to review CPU's in their stock and official state instead of the way they come out of the box, it just makes AMD look better.
Reviews don't exist to stroke your tiny AMD-loving ego.
Posted on Reply