Thursday, March 28th 2024

Developers of Outpost Infinity Siege Recommend Underclocking i9-13900K and i9-14900K for Stability on Machines with RTX 4090

Outpost: Infinity Siege developers recommend underclocking Intel's current and previous flagship desktop processors, the Core i9-14900K and i9-13900K, to prevent the game from crashing. The recommendation targets users pairing a GeForce RTX 4090 with either chip, and we're fairly sure it extends to the i9-14900KS and i9-13900KS as well. Team Ranger, the game's developer, has released its second patch within a week of the game's launch. In the patch notes, they ask users to use Intel Extreme Tuning Utility (XTU) to lower the P-core maximum boost clock to 5.00 GHz. This development closely follows a February 2024 report linking stability issues on high-end "Raptor Lake" processors to unlocked power limits.
Source: Tom's Hardware

85 Comments on Developers of Outpost Infinity Siege Recommend Underclocking i9-13900K and i9-14900K for Stability on Machines with RTX 4090

#51
chrcoluk
Tek-CheckDoubtful, as the vast majority of CPUs simply work with the game, including i9s. They clearly say it's a small number of i9s, so it could be a batch or specific motherboards that overprovision those CPUs.
You're trying to make it sound like it's some kind of inherent fault due to a bad manufacturing batch, or perhaps poor silicon quality that isn't stable at spec settings. That's where it's misleading, and the rumour mill goes into conspiracy-theory territory.

We already know UE is a crappy engine, so it's not impossible that it has inherent problems in its code. But as for Intel advising people to make changes, it's almost certainly telling them to put things back to spec, so if it is hardware instability, it will be down to motherboards running out of spec.

For reference, it's entirely possible to have buggy code that only gets exposed on fast enough hardware (timing issues). That's one reason console manufacturers make such an effort to gimp newer hardware refreshes to match timings and such.
Posted on Reply
#52
matar
lol this is why i am still using my i9-10900KF and RTX 3080 Ti
#53
Tek-Check
chrcolukYou're trying to make it sound like it's some kind of inherent fault due to a bad manufacturing batch, or perhaps poor silicon quality that isn't stable at spec settings. That's where it's misleading, and the rumour mill goes into conspiracy-theory territory.

We already know UE is a crappy engine, so it's not impossible that it has inherent problems in its code. But as for Intel advising people to make changes, it's almost certainly telling them to put things back to spec, so if it is hardware instability, it will be down to motherboards running out of spec.

For reference, it's entirely possible to have buggy code that only gets exposed on fast enough hardware (timing issues). That's one reason console manufacturers make such an effort to gimp newer hardware refreshes to match timings and such.
Had you read the linked article (more carefully?), you would have found that they explicitly observed the same behaviour, i.e. crashes, replicated in other applications OUTSIDE of the game.
ToothlessNot really, no. Three models causing issues is not a "poor batch." It's simply game devs being lazy.
Did you actually read the linked article?
#54
Toothless
Tech, Games, and TPU!
Tek-CheckDid you actually read the linked article?
Yes, and even running a stock 14900k on another machine had zero issues. Still points to a dev issue.
#55
chrcoluk
Tek-CheckHad you read the linked article (more carefully?), you would have found that they explicitly observed the same behaviour, i.e. crashes, replicated in other applications OUTSIDE of the game.


Did you actually read the linked article?
How does that back up the point you're trying to make?

There is nothing I can find on the net stating that this hardware is inherently unstable when run at spec.
#56
Dr. Dro
kapone32Your claim is that no CPU could be bad or leaking voltage. There is also the fact that some board vendors like to turn the voltage up regardless of the processor.
Bad CPUs are the exception, not the norm. If a CPU doesn't operate at stock voltages in an otherwise well-equipped system that has adequate cooling, power supply and compatible motherboard at stock settings, then you need to request an RMA.
#57
Tek-Check
chrcolukThere is nothing I can find on the net stating that this hardware is inherently unstable when run at spec.
Did you read in the article that the issue concerns a small number of CPUs that also crashed in other workloads? It's not a pandemic. Some owners returned those and got replacements.
ToothlessYes, and even running a stock 14900k on another machine had zero issues. Still points to a dev issue.
No, it does not.
#58
kapone32
Dr. DroBad CPUs are the exception, not the norm. If a CPU doesn't operate at stock voltages in an otherwise well-equipped system that has adequate cooling, power supply and compatible motherboard at stock settings, then you need to request an RMA.
I am not saying they are the norm. I can't find it right now but I know MSI Insider did a live stream showing that a 14900K and 4090 will draw up to 1100 Watts in some scenarios. We don't know what wattage some of these failed tests could be running at. It could be as simple as PSU spikes. Would anyone use a PSU less than 1000W for that combo? I am sure there are those that have.
#59
Dr. Dro
kapone32I am not saying they are the norm. I can't find it right now but I know MSI Insider did a live stream showing that a 14900K and 4090 will draw up to 1100 Watts in some scenarios. We don't know what wattage some of these failed tests could be running at. It could be as simple as PSU spikes. Would anyone use a PSU less than 1000W for that combo? I am sure there are those that have.
If you let both of them run wild and overclocked to the max, yes, that seems plausible. The 4090 can pull up to 600 W and the i9 another 400. Add 100 for the rest of the components and fans, and even 1200 W+ seems rather plausible. Therein lies the problem: people running the effectively unlimited 4096 W PL1 setting on conventional cooling, with a 750-850 W PSU, and expecting it to be stable. It's user error. Where is all that heat going? How is the system being fed?
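The back-of-the-envelope budget above can be written out; the 600/400/100 W figures are the worst-case numbers quoted in the post, not measurements, and the function name is purely illustrative:

```python
# Worst-case transient power budget from the figures discussed above
# (unlimited power limits on both CPU and GPU); illustrative numbers only.
def psu_headroom(psu_watts, gpu=600, cpu=400, rest=100):
    """Remaining PSU capacity after worst-case draw; negative means overcommitted."""
    return psu_watts - (gpu + cpu + rest)

print(psu_headroom(850))    # -250: an 850 W unit is overcommitted in the worst case
print(psu_headroom(1200))   # 100: roughly enough headroom for the worst case
```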
#60
Toothless
Tech, Games, and TPU!
Tek-CheckDid you read in the article that the issue concerns a small number of CPUs that also crashed in other workloads? It's not a pandemic. Some owners returned those and got replacements.


No, it does not.
Okay, so show me other games that suffer from a processor running too fast. Show me other games with the same issues as this one, where the devs tell people to slow their hardware down.
#61
Tek-Check
ToothlessOkay, so show me other games that suffer from a processor running too fast.
I can only tell you to read the linked article again, as you do not seem willing to take the information in.
- it's not about other games, but other workloads with similar behaviour observed
- it's not about any i9, but a small number of select CPUs
- and it's not about CPUs running "too fast", but getting into instability territory in several workloads outside of this game too
The article is pretty simple. To isolate the issue, a more specific investigation would be needed.
#62
Dr. Dro
Games that malfunction on fast hardware exist and are nothing new, but I don't think it applies to this game or any modern engine, unless physics, scripting and/or animation are tied to the frame rate.

The original 2011 Skyrim would break hard above 60 FPS; Fallout 4 showed similar issues to a lesser extent.

Even then, limiting the frame rate should fix it.
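For context, the frame-rate coupling described above typically comes from advancing the simulation a fixed step per rendered frame instead of scaling by elapsed time. A minimal illustrative sketch, not any engine's actual code:

```python
# Illustrative sketch of frame-rate-dependent vs. frame-rate-independent
# movement; hypothetical functions, not taken from any real game engine.

def update_coupled(position, frames):
    """Advance a fixed step per frame: effective speed scales with FPS (the bug)."""
    for _ in range(frames):
        position += 1.0          # 1 unit per frame, regardless of frame time
    return position

def update_decoupled(position, frames, dt):
    """Advance by velocity * dt: the same result at any frame rate."""
    for _ in range(frames):
        position += 60.0 * dt    # 60 units per second, scaled by elapsed frame time
    return position

# One second of simulation at 60 fps vs. 120 fps:
print(update_coupled(0.0, 60), update_coupled(0.0, 120))    # diverges: 60.0 vs 120.0
print(update_decoupled(0.0, 60, 1 / 60),
      update_decoupled(0.0, 120, 1 / 120))                  # both approximately 60.0
```

This is also why a frame-rate cap works as a mitigation: it pins the per-frame step back into the range the coupled logic was tuned for.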
#63
BoggledBeagle
Dr. DroBad CPUs are the exception, not the norm.
Speaking about what is normal: have you heard Mr. Gelsinger talk about frequencies? I think it was when Intel CPUs first reached 6 GHz. He commented that engineers were convinced such frequencies were impossible to reach.

Remember what the 12900K runs at? The best silicon they could make could only do 5.2 GHz (absolute maximum on selected cores).

How do you think such a big improvement on the same process, in such a short period of time, could have happened?

I can tell you that:
  • Some improvements in the process might have helped, but with the same process and the same underlying technology there is only a very limited scope for improvement.
  • They pushed frequencies hard. And then they pushed hard AGAIN. They are on the absolute edge and breaking point of what the chips can handle.
  • They must have abandoned reliability safety margins for these chips to make this happen.
So in this situation, can anybody really be surprised that some chips are unstable at these breakneck frequencies?

BTW, the 12900 was 8+8; the 14700 is now 8+12 and the 14900 8+16 cores. There is more heat and more power drawn in these new chips, and that does not help either.

And one more thing: these chips can do AVX-512. It is disabled because it stressed the chips too much and was breaking them. I guess if we knew the frequencies the chips would need to run at to handle AVX-512, we would learn the true stable frequency for Alder Lake and Raptor Lake.
#64
watzupken
BoggledBeagleRecommendations of running Intel CPUs at 5 GHz are no surprise to me; I have stated multiple times that these CPUs are not up to the insane speeds Intel pushes them to. I am running my 14900K at 5.2 GHz, and I always felt adventurous for doing so.

I was just a little surprised that they recommend a power limit of just 125 W in the Oodle document. I had a feeling that 160 W is a fine and comfortable power draw for these CPUs, but apparently feelings may sometimes not be a reliable source of information.
My guess is that not all cores are active in games, and that they therefore don't really need that much power to begin with. My 12700K was typically pulling about 80 to 90 W in the games I play.
BoggledBeagleSpeaking about what is normal: have you heard Mr. Gelsinger talk about frequencies? I think it was when Intel CPUs first reached 6 GHz. He commented that engineers were convinced such frequencies were impossible to reach.

Remember what the 12900K runs at? The best silicon they could make could only do 5.2 GHz (absolute maximum on selected cores).

How do you think such a big improvement on the same process, in such a short period of time, could have happened?

I can tell you that:
  • Some improvements in the process might have helped, but with the same process and the same underlying technology there is only a very limited scope for improvement.
  • They pushed frequencies hard. And then they pushed hard AGAIN. They are on the absolute edge and breaking point of what the chips can handle.
  • They must have abandoned reliability safety margins for these chips to make this happen.
So in this situation, can anybody really be surprised that some chips are unstable at these breakneck frequencies?

BTW, the 12900 was 8+8; the 14700 is now 8+12 and the 14900 8+16 cores. There is more heat and more power drawn in these new chips, and that does not help either.

And one more thing: these chips can do AVX-512. It is disabled because it stressed the chips too much and was breaking them. I guess if we knew the frequencies the chips would need to run at to handle AVX-512, we would learn the true stable frequency for Alder Lake and Raptor Lake.
Intel's 10 nm was pushed very hard, just like its 14 nm before it. The main culprit for the insane power draw is the P-cores, which are pushed very hard to reach the prized 6+ GHz, since power requirements grow exponentially once a chip is pushed past its "comfort zone." The increase in E-cores and cache contributed to the increased power draw as well, but less so, since the 13900K and 14900K are essentially the same chip.

As you mentioned, I do worry about the longevity of these i9 processors, because they are clearly running at frequencies, power and heat that may cause them to degrade or fail prematurely. The number may sound low only because the number of i9s sold should be very low compared to more popular models like the i5/i7. I am happy with the i7-12700K because it's fast enough and I don't need that many E-cores. 16 E-cores is just ridiculous and clearly not aimed at being efficient.
#65
Dr. Dro
BoggledBeagleSpeaking about what is normal: have you heard Mr. Gelsinger talk about frequencies? I think it was when Intel CPUs first reached 6 GHz. He commented that engineers were convinced such frequencies were impossible to reach.

Remember what the 12900K runs at? The best silicon they could make could only do 5.2 GHz (absolute maximum on selected cores).

How do you think such a big improvement on the same process, in such a short period of time, could have happened?

I can tell you that:
  • Some improvements in the process might have helped, but with the same process and the same underlying technology there is only a very limited scope for improvement.
  • They pushed frequencies hard. And then they pushed hard AGAIN. They are on the absolute edge and breaking point of what the chips can handle.
  • They must have abandoned reliability safety margins for these chips to make this happen.
So in this situation, can anybody really be surprised that some chips are unstable at these breakneck frequencies?

BTW, the 12900 was 8+8; the 14700 is now 8+12 and the 14900 8+16 cores. There is more heat and more power drawn in these new chips, and that does not help either.

And one more thing: these chips can do AVX-512. It is disabled because it stressed the chips too much and was breaking them. I guess if we knew the frequencies the chips would need to run at to handle AVX-512, we would learn the true stable frequency for Alder Lake and Raptor Lake.
Yes, they can be surprised; you're just extremely skeptical of it. Intel has released a series of CPUs with increasingly higher clocks since the 12900K. Even Alder Lake topped out at 5.5 GHz with its KS, and now they have released a validated 6.2 GHz CPU. I never believed they'd do a 14900KS myself.

There is no problem with Raptor Lake's boost frequencies. The processors are fully stable and capable of handling them. They will not malfunction as long as their cooling and power requirements are met.

No reliability margins were abandoned. Since Intel doesn't rely on TSMC and its production is fully vertically integrated from sand to packaged silicon, they have simply binned each processor stringently for its quality grade. Remember, every CPU since the 13900K is exactly the same; they just differ in clocks, with the 13900KS and now the 14900KS being the highest-quality chips they offer.

AVX-512 isn't a factor, and even if it had been enabled, it wouldn't clock that high while executing that kind of vectorized code. The current requirements would be insane.
#66
kapone32
Dr. DroYes, they can be surprised; you're just extremely skeptical of it. Intel has released a series of CPUs with increasingly higher clocks since the 12900K. Even Alder Lake topped out at 5.5 GHz with its KS, and now they have released a validated 6.2 GHz CPU. I never believed they'd do a 14900KS myself.

There is no problem with Raptor Lake's boost frequencies. The processors are fully stable and capable of handling them. They will not malfunction as long as their cooling and power requirements are met.

No reliability margins were abandoned. Since Intel doesn't rely on TSMC and its production is fully vertically integrated from sand to packaged silicon, they have simply binned each processor stringently for its quality grade. Remember, every CPU since the 13900K is exactly the same; they just differ in clocks, with the 13900KS and now the 14900KS being the highest-quality chips they offer.

AVX-512 isn't a factor, and even if it had been enabled, it wouldn't clock that high while executing that kind of vectorized code. The current requirements would be insane.
I agree with you, but specific to this thread, you can give Intel some blame for releasing a chip that is so power hungry. The 4090 is also a no-holds-barred GPU. Putting those together needs serious thought about the PSU, as we have already agreed.

You don't know whether all Intel fabs produce the same quality of chips, so there is that. Having said that, just like with AMD, the best chips become i9s and the worst chips become i3s. They are not bulletproof.

If you enabled AVX-512 on these chips, the PSU would definitely trip. You'd probably need a 1600 W behemoth with a 4090.
#67
chrcoluk
Tek-CheckDid you read in the article that the issue concerns a small number of CPUs that also crashed in other workloads? It's not a pandemic. Some owners returned those and got replacements.


No, it does not.
You asked me the same question twice; you're not going to get a different answer. I read it, and I checked the rest of the net for anything suggesting Intel 14th-gen CPUs are unstable when running at spec. I found nothing.

Here are some examples for reference.

I am part of the FF7 modding community, and we have had timing issues when loading different modules into the game, as well as with hext code (both when the system is too slow and when it's too fast). These were difficult to fix with changes to the hooking mechanism.

I currently use the Flagrum mod manager with FF15, and in my game the autobuild.earc file doesn't get patched on launch. It's almost certainly a timing issue, as, like the FF7 mod manager, patching is done in memory on the fly. I even see occasional issues with Workshop mods, an official patching mechanism supported by Square Enix.
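A minimal sketch of the kind of "too fast" timing bug described here, where a loader races an on-the-fly patcher. This is hypothetical Python, not Flagrum's or the FF7 tooling's actual mechanism:

```python
# Sketch of a load-time race: a patcher thread rewrites data "on the fly"
# while the loader reads it. Hypothetical names and structure only.
import threading
import time

data = {"patched": False}
ready = threading.Event()

def patcher(delay):
    time.sleep(delay)            # patching takes some wall-clock time
    data["patched"] = True
    ready.set()                  # signal that patching is complete

def load_unsynchronized():
    # Reads immediately: on a fast enough machine the load wins the
    # race and sees unpatched data (the "too fast" failure mode).
    return data["patched"]

def load_synchronized():
    # Waits for the patcher's completion signal: correct at any speed.
    ready.wait()
    return data["patched"]

t = threading.Thread(target=patcher, args=(0.2,))
t.start()
racy = load_unsynchronized()     # reads before patching finishes
safe = load_synchronized()       # always sees patched data
t.join()
print(racy, safe)
```

Explicit synchronization (the `Event` here) is what a hooking mechanism has to get right; relying on "the patcher will finish first" works on slow hardware and silently breaks on fast hardware.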
#68
BoggledBeagle
BTW, a motherboard knows a lot less about the chip than Intel does (I believe it reads just the voltage table). And even Intel has limited time to test a chip while binning it. Even they do not know everything about how the chip behaves at the moment, or how it will develop in the future.

If somebody pushes chips so hard that there are (almost) no safety margins for reliability left, bad things happen.

In the old days, the 14900K would have been released with 4800/4000 MHz stock frequencies: a very nice, efficient and very easy-to-cool product. I am going to test how it runs set this way now.
#69
chrcoluk
BoggledBeagleBTW, motherboard manufacturers know a lot less about the chip than Intel. And even Intel has limited time to test a chip while binning it. Even they do not know everything about how the chip behaves at the moment, or how it will develop in time.

If somebody pushes chips so hard that there are (almost) no safety margins for reliability left, bad things happen.

In the old days, the 14900K would have been released with 4800/4000 MHz stock frequencies: a very nice, efficient and very easy-to-cool product. I am going to test how it runs set this way now.
But that isn't the issue, though.

I mean, look at this document, which some are claiming proves faulty CPUs are the cause.

www.radgametools.com/oodleintel.htm

We can see in the article that errors are being mislabelled; e.g., the devs acknowledge that a bad hash of data can cause their engine to report "out of video memory", and that's a code problem. They also report that they are mislabelling verification errors as a generic "unable to load shader".

The article starts off by stating hardware problems, but then goes on to confirm it's BIOS-related, and a BIOS is software, not hardware.

It also confirms they are not sure what the problem is, other than that it's generic system instability.

Some solutions reported by their customers include disabling XMP, reducing SVID back to spec, reducing power limits back to spec, and downclocking CPUs. The latter could work because the lower clock speeds can in effect force the CPU back into its normal operating range; it's masking a bad BIOS configuration.

So yes, there is nothing even in this UE document that confirms any kind of faulty CPU.

This story should be putting pressure on board vendors to stop what they're doing, and on UE developers to improve their code, but it's a let-off for them if people start blaming the chips instead.
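The mislabelling pattern described here, a data-integrity failure surfacing as an unrelated resource error, can be sketched as follows. This is hypothetical code, not RAD's or Unreal's actual error path:

```python
# Sketch of the error-mislabelling pattern: a checksum mismatch (possibly
# caused by unstable hardware corrupting data in flight) is reported to
# the caller as a completely unrelated generic error.
import hashlib
import zlib

def load_asset(payload: bytes, expected_sha256: str) -> bytes:
    """Decompress an asset and verify its hash; raises a *misleading* error."""
    data = zlib.decompress(payload)
    if hashlib.sha256(data).hexdigest() != expected_sha256:
        # The real failure is a checksum mismatch, but callers only ever
        # see this generic message, hiding the actual root cause.
        # A precise message ("asset checksum mismatch") would make
        # hardware-induced corruption much easier to diagnose.
        raise RuntimeError("out of video memory")
    return data

asset = b"shader bytecode"
payload = zlib.compress(asset)
digest = hashlib.sha256(asset).hexdigest()
loaded = load_asset(payload, digest)   # valid data round-trips fine
```

With an error message like that, a user (or a game dev) naturally goes looking at the GPU and VRAM rather than at CPU stability, which is exactly the confusion being argued about in this thread.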
#70
kapone32
chrcolukBut that isn't the issue, though.

I mean, look at this document, which some are claiming proves faulty CPUs are the cause.

www.radgametools.com/oodleintel.htm

We can see in the article that errors are being mislabelled; e.g., the devs acknowledge that a bad hash of data can cause their engine to report "out of video memory", and that's a code problem. They also report that they are mislabelling verification errors as a generic "unable to load shader".

The article starts off by stating hardware problems, but then goes on to confirm it's BIOS-related, and a BIOS is software, not hardware.

It also confirms they are not sure what the problem is, other than that it's generic system instability.

Some solutions reported by their customers include disabling XMP, reducing SVID back to spec, reducing power limits back to spec, and downclocking CPUs. The latter could work because the lower clock speeds can in effect force the CPU back into its normal operating range; it's masking a bad BIOS configuration.

So yes, there is nothing even in this UE document that confirms any kind of faulty CPU.

This story should be putting pressure on board vendors to stop what they're doing, and on UE developers to improve their code, but it's a let-off for them if people start blaming the chips instead.
Your argument could have some merit. Think back to the days of burning DVDs (modern CPUs would be sweet for that): the burn speed would be slower than what was possible, to maintain the picture and audio. Even though benchmarks will show an SSD running at 10 GB/s, the most I have seen Windows do is 2.9 GB/s. This could hold true for processors as well. It is what makes the PC so unique as a product; there could be a myriad of reasons why using the most power-hungry parts does this. Regardless, all it does is highlight how much more efficient AMD is, especially when they use the chip everyone loves in the 7800X3D. It could even be something as stupid as the CPU robbing the GPU of power, or vice versa, with that single 12VHPWR connection on the 4090 and the way some PSUs may be wired.
#71
chrcoluk
kapone32Your argument could have some merit. Think back to the days of burning DVDs (modern CPUs would be sweet for that): the burn speed would be slower than what was possible, to maintain the picture and audio. Even though benchmarks will show an SSD running at 10 GB/s, the most I have seen Windows do is 2.9 GB/s. This could hold true for processors as well. It is what makes the PC so unique as a product; there could be a myriad of reasons why using the most power-hungry parts does this. Regardless, all it does is highlight how much more efficient AMD is, especially when they use the chip everyone loves in the 7800X3D. It could even be something as stupid as the CPU robbing the GPU of power, or vice versa, with that single 12VHPWR connection on the 4090 and the way some PSUs may be wired.
Well yeah, but this is a separate discussion. We know these chips, when they are allowed to, can use ridiculous amounts of power, and the same goes for Nvidia GPUs. But again, this is out-of-spec behaviour.

The AMD platform hasn't been plain sailing either: burnt-out chips due to (guess who) motherboard BIOSes being misconfigured (out-of-spec behaviour). Both vendors push to the limits, but in different ways; AMD just does it differently to Intel. This is a reason board vendors have started to get caught out: they took liberties for a long time with their baseline overclocks, overvolting, etc., and largely got away with it because the chip vendors left more tolerance in their products. Those days are gone for the foreseeable future.
#72
Dr. Dro
kapone32I agree with you, but specific to this thread, you can give Intel some blame for releasing a chip that is so power hungry. The 4090 is also a no-holds-barred GPU. Putting those together needs serious thought about the PSU, as we have already agreed.

You don't know whether all Intel fabs produce the same quality of chips, so there is that. Having said that, just like with AMD, the best chips become i9s and the worst chips become i3s. They are not bulletproof.

If you enabled AVX-512 on these chips, the PSU would definitely trip. You'd probably need a 1600 W behemoth with a 4090.
Then again, an i9 KS is also a no-holds-barred CPU. It's the spiritual successor to the Core 2 Extreme processors Intel offered before the lines were split from a unified LGA 775 into LGA 1366 (HEDT) and LGA 1156 (mainstream), after all. Yes, yes, I know about Skulltrail and all that... but the QX9775 was a one-off Xeon rebrand available for a single motherboard anyway.

The 13th and "14th generation" Core i9 processors are identical, with no silicon-level changes between them. After a year of manufacturing the same chips on an already mature node, Intel managed to mass-produce chips that reach the i9-13900KS's clock targets, thanks to manufacturing-level improvements, in the form of the i9-14900K, and honestly this is remarkable. Even more remarkable is that they managed to stretch this to make a batch or two of i9-14900KS chips that somehow go above and beyond, even if you trade an extreme increase in power consumption, even compared to the already hungry 13900KS, for the last 200 MHz.

It doesn't warrant a generational leap (which is why "14th Gen is a scam" is a thing), but it's a welcome improvement in chip quality regardless.
#73
BoggledBeagle
Dr. DroAfter a year of manufacturing the same chips on an already mature node, Intel managed to mass-produce chips that reach the i9-13900KS's clock targets, thanks to manufacturing-level improvements, in the form of the i9-14900K, and honestly this is remarkable. Even more remarkable is that they managed to stretch this to make a batch or two of i9-14900KS chips that somehow go above and beyond, even if you trade an extreme increase in power consumption, even compared to the already hungry 13900KS, for the last 200 MHz.
Sorry, but abandoning the industry's good practice of providing safety margins for product reliability, and selling products stretched to breaking point, is not remarkable. That is just SAD. Or even TRAGIC.

Even sadder is that these are actually very good products, being LITERALLY DESTROYED by their manufacturer with insane out-of-the-box settings and a lack of control over what motherboard manufacturers do with these chips, all to improve how Intel looks, at the expense of the end customers. Because those customers will have problems dealing with all those baked, failing and unstable chips.

Tragedy.
#74
kapone32
Dr. DroThen again, an i9 KS is also a no-holds-barred CPU. It's the spiritual successor to the Core 2 Extreme processors Intel offered before the lines were split from a unified LGA 775 into LGA 1366 (HEDT) and LGA 1156 (mainstream), after all. Yes, yes, I know about Skulltrail and all that... but the QX9775 was a one-off Xeon rebrand available for a single motherboard anyway.

The 13th and "14th generation" Core i9 processors are identical, with no silicon-level changes between them. After a year of manufacturing the same chips on an already mature node, Intel managed to mass-produce chips that reach the i9-13900KS's clock targets, thanks to manufacturing-level improvements, in the form of the i9-14900K, and honestly this is remarkable. Even more remarkable is that they managed to stretch this to make a batch or two of i9-14900KS chips that somehow go above and beyond, even if you trade an extreme increase in power consumption, even compared to the already hungry 13900KS, for the last 200 MHz.

It doesn't warrant a generational leap (which is why "14th Gen is a scam" is a thing), but it's a welcome improvement in chip quality regardless.
There is nothing new in that. Even before the KS, the 14900K is already turned up; all CPUs today are. We are in the middle of a CPU war. Intel has indeed refined the node, but that is what happens with every CPU made on the same process. Look at the fact that we are getting GT processors on AM4. Unfortunately for them, the other side has been doing exactly that in other sectors, so they can only respond with a refresh at the moment. They will have to change to TSMC's process, or a variant of it, to keep up. I cannot see the community being keen on a 500 W 15900K that can do 6.2 GHz, as an example.
#75
Dr. Dro
kapone32There is nothing new in that. Even before the KS, the 14900K is already turned up; all CPUs today are. We are in the middle of a CPU war. Intel has indeed refined the node, but that is what happens with every CPU made on the same process. Look at the fact that we are getting GT processors on AM4. Unfortunately for them, the other side has been doing exactly that in other sectors, so they can only respond with a refresh at the moment. They will have to change to TSMC's process, or a variant of it, to keep up. I cannot see the community being keen on a 500 W 15900K that can do 6.2 GHz, as an example.
It'll be Arrow Lake next. It will take a completely different approach, and we might just have a 5775C situation on our hands. I don't expect the first ARL chips to outperform the 13900KS/14900K in gaming.