
Is Intel going to deliver a processor/chipset worth waiting for?

Ideally a brand-new chip that hasn't been fiddled with previously, i.e. no running at 6.2 GHz all-core for a year or something.

Out of curiosity, what are the stock time limits for PL1/PL2, etc.?
Brand new chip.

I think there is no time limit for 253 W.
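For reference, the Tau window behaves roughly like an exponentially weighted moving average of package power: the chip may burst to PL2 as long as that average stays under PL1, and once PL1 = PL2 = 253 W (Intel's performance spec for the 14900K) the window becomes irrelevant. A minimal sketch of the idea, assuming a 56 s Tau (the actual constant is SKU- and firmware-dependent):

```python
import math

def ewma_power(samples_w, tau_s=56.0, dt_s=1.0):
    """Exponentially weighted moving average of package power in watts."""
    alpha = 1.0 - math.exp(-dt_s / tau_s)
    avg = samples_w[0]
    for p in samples_w[1:]:
        avg += alpha * (p - avg)  # decay toward the newest sample
    return avg

# A constant 253 W load never pushes the average above 253 W, so with
# PL1 = PL2 = 253 W there is nothing to throttle back to:
print(ewma_power([253.0] * 600))  # -> 253.0
```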

This could also be streamed live on YouTube.
 
Running strictly within spec (so in the incredibly unlikely event of it failing somehow, you would be within your rights to return the chip to retail; Amazon returns FTW), any volunteers?

I'd probably do it if I had a spare system and an LGA 1700 mobo.

But my spare is an AM4.
 
I think fair test conditions for a 14900K would be:

Running it at stock frequencies with the 253 W power limit, under a Cinebench R23 load, with the cooler fans set so that the CPU runs at or just below the temperature limit.

I would expect the chip to last just a few days. Not 20 years.

((( I secretly think that it is hours, not days )))
With a 253 W limit it will last for 20 years, man. I thought you were talking about Prime95 with no limits or something. Cinebench R23 at 250 W is trivial.

Running strictly within spec (so in the incredibly unlikely event of it failing somehow, you would be within your rights to return the chip to retail; Amazon returns FTW), any volunteers?

I'd probably do it if I had a spare system and an LGA 1700 mobo.

But my spare is an AM4.
I'm willing to try with the KS, but it's pointless. At 250 W nothing is going to happen, lol, especially within 24 hours. I thought the goal was Prime95 with no power limits, but even then I doubt it would degrade within a day or two.
 
Intel itself states that ADL and RPL are the same microarchitecture. Just messing around with the cache and interconnects doesn't constitute a new uArch, in the same way that changing from a carb to fuel injection on a 350 V8 doesn't make it a new engine.


Yes, you linked that article, and it's wrong. The only portion that Alder Lake and Raptor Lake share unmodified is the Gracemont E-cores.
 
If you are willing to pay money, I'll up the stakes: I'll do it at 110 °C. For 8 hours? Phew, nothing is going to happen.
I have no money to burn at the moment, sorry. :(

It would be a fun project for the weekend (if I had the money), but unfortunately I have no experience with streaming on YouTube. I have a Nikon Z6 camera, but I have no idea if I can get live video from it, or how to get it into the computer and onto YouTube.

So yeah, I cannot help this time; the only thing I can contribute now is measuring radiator performance, and I'm not even sure anybody is really interested in that.
 
I have no money to burn at the moment, sorry. :(

It would be a fun project for the weekend (if I had the money), but unfortunately I have no experience with streaming on YouTube. I have a Nikon Z6 camera, but I have no idea if I can get live video from it, or how to get it into the computer and onto YouTube.
You don't need to stream it; you can do your thing and at the end show HWiNFO with the runtime of 8 hours and the average temperature and power draw.
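If it helps, the averages can be pulled straight out of an HWiNFO CSV log with a throwaway script. A rough sketch, assuming default-ish sensor column names (check what your log actually calls them) and 1 s polling:

```python
import csv

def summarize(path, temp_col="CPU Package [°C]", power_col="CPU Package Power [W]"):
    """Average temperature and power from an HWiNFO sensor log (CSV export)."""
    temps, powers = [], []
    with open(path, newline="", encoding="latin-1") as f:  # HWiNFO logs are often not UTF-8
        for row in csv.DictReader(f):
            try:
                temps.append(float(row[temp_col]))
                powers.append(float(row[power_col]))
            except (KeyError, ValueError):
                continue  # skip repeated headers / footer rows
    n = len(temps)
    print(f"samples: {n} (~{n / 3600:.1f} h at 1 s polling)")
    print(f"avg temp: {sum(temps) / n:.1f} °C, avg power: {sum(powers) / n:.1f} W")

summarize("hwinfo_log.csv")  # hypothetical filename - point it at your own export
```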
 
You had a different word there in place of "intelligent people". You also insulted most of the cores in my home PC's CPU.

Why don't you go for a walk for a while to calm down?

You should complain to Intel about the majority of your CPU's cores being e-waste cores, rather than to me for pointing it out.
 
At least we have two good options; that's not a given, historically.
This exactly, I don't get the fuss. Never did, tbh. Buy what's best for your use case.
 
Waiting for what? Just use that damn PC for what it's made for and don't look back... Nothing will ever be good for everyone in the world; what one person likes, another hates. Same thing in daily life, just live with it. :D
 
Obviously, if the test is done in a very light area, then extra cores won't do anything; do I really have to explain that? I can find areas in TLOU where the E-cores don't do anything either, because they are very light.

It's common knowledge for Intel users that for maximum gaming performance you turn off HT and leave the E-cores on. Anything else is just suboptimal. But if you want to keep arguing, whatever; how can I convince you?

There is no way you can "convince me", as all the evidence says otherwise - even Digital Foundry posted videos on it.

But you wanted to see what my CPU can do at Tom's Diner with path tracing enabled, so here you go:

[screenshots]

And no, this is not using FG.

This one however is, and the last one is FG with no path tracing:

[screenshots]
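Side note on the HT-off / E-cores-on claim above: it is testable without a BIOS trip by pinning the game to P-cores only and comparing. A quick sketch with psutil, assuming the first 16 logical CPUs are the P-core threads (true for an 8P part with HT on, but verify your own topology first):

```python
import psutil

def pin_to_pcores(pid, p_core_logicals=tuple(range(16))):
    """Restrict a process to the given logical CPUs (the P-core threads here)."""
    proc = psutil.Process(pid)
    proc.cpu_affinity(list(p_core_logicals))
    return proc.cpu_affinity()

# game_pid is hypothetical - grab the real PID from Task Manager:
# print(pin_to_pcores(game_pid))
```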
 
Yeah, when you want to make a point, compare against the worst-value product from the competition. /facepalm

That said, you claim that the 13500 is at the same performance level as the 7700X, which simply isn't true - the 7700X is a fair bit faster in games.

But the real comparison would have been the 7600X - it's just as fast as the 7700X in games and costs the same as your 13500...

Edit: correction - the 7600X is 1,800 DKK and the 13500 is 2,000 DKK, so the 13500 is actually more expensive...
I'm talking about single-/multi-core scores in applications.
For gaming... does everyone have an RTX 4090? Anyway, if you have an RTX 4090, you play at 1440p/4K and don't choose a Ryzen 5 or an i5.

The 7600X is cheaper because it is very weak in multi-core compared to the 13500.


For the others:
Idle: 21 W
YouTube 1080p: 35 W
YouTube 4K @ 60 fps: 45 W
That's for the whole system, without the monitor. Of course, there are short spikes (as it loads the buffer, for example), but it stabilizes at these values.

I reinstalled Windows on January 1st. That's when I also reset the wattmeter. At an average of 5 hours/day, it consumes about 6 kWh per month, and I don't only use it for office and web. I also use it for games, re-encoding (three or four TV series this year), and photo editing. Used only for office and news/movies, I don't see this system exceeding 3 kWh/month.
Stop with the nonsense that an Intel destroys a nuclear power plant.

PS: 13.19 kWh
That's what the wattmeter indicates: the total consumption from January 1 until now. Does that seem like a lot? In terms of multi-core results, the 13500 outperforms the 7700X (a much more expensive processor) in CPU-Z and Cinebench R23, and according to TPU's review of the 7700X, it does so at lower power consumption.

[screenshot: 13500 Cinebench R23 / CPU-Z results]
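For what it's worth, the monthly figure checks out against the wattage numbers above. A back-of-the-envelope check, assuming ~40 W average draw (between 21 W idle and 45 W for YouTube 4K):

```python
hours_per_day = 5
avg_watts = 40  # rough mean of 21 W idle and 45 W YouTube 4K
kwh_per_month = hours_per_day * avg_watts * 30 / 1000
print(kwh_per_month)  # 6.0 kWh/month -> ~13 kWh from January 1 to early March
```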
 
There is no way you can "convince me", as all the evidence says otherwise - even Digital Foundry posted videos on it.

But you wanted to see what my CPU can do at Tom's Diner with path tracing enabled, so here you go:

[screenshots]

And no, this is not using FG.

This one however is, and the last one is FG with no path tracing:

[screenshots]
You are a lost cause, man. Sure, w/e, E-cores hurt gaming performance. I have the damn CPU, but you know better, kk.

[screenshots]
Replicated your SS, but something is wrong with your game. Population density should be a LOT higher than what you are showing. In the first screenshot I get 130 on a stock 12900K, but with a much higher pop than your SS.
 
You are a lost cause, man. Sure, w/e, E-cores hurt gaming performance. I have the damn CPU, but you know better, kk.


Replicated your SS, but something is wrong with your game. Population density should be a LOT higher than what you are showing. In the first screenshot I get 130 on a stock 12900K, but with a much higher pop than your SS.

I do indeed know better.

Yeah yeah, talk is cheap, dude - where is the proof? Can't wait to see your 12900K being way slower while using 3 times as much power...
 
I do indeed know better.

Yeah yeah, talk is cheap, dude - where is the proof? Can't wait to see your 12900K being way slower while using 3 times as much power...
Wanna bet that it won't be way slower and won't consume 3 times as much? It's obvious that you are running population density too low. Just grabbed this; look at your NPCs and look at mine, LOL.

[screenshot]
 
Wanna bet that it won't be way slower and won't consume 3 times as much? It's obvious that you are running population density too low. Just grabbed this; look at your NPCs and look at mine, LOL.


Lol, and you post a screenshot without any stats, 'cause that will sure prove your point...
 
And this is with low population: 1080p DLSS UP + RT Ultra
[screenshot]
Lol, and you post a screenshot without any stats, 'cause that will sure prove your point...
Yeah, I noticed after the SS was posted; for some reason it didn't capture the CapFrameX stats. See above with stats. Point still stands: where the heck are your NPCs?
 
And this is with low population: 1080p DLSS UP + RT Ultra
[screenshot]

Yeah, I noticed after the SS was posted; for some reason it didn't capture the CapFrameX stats. See above with stats. Point still stands: where the heck are your NPCs?

That clearly isn't using path tracing... nice try - not.
 
That clearly isn't using path tracing... nice try - not.
I'm writing it right there: it's with RT Ultra. You want PT? No problem, gimme a second.

Unlike you, I don't actually hide the pop settings :roll:

PT, oh my my my, much slower while consuming 3 times the power. I just hope you realize a 14900K would send both our rigs to the spare-parts bin...

[screenshot]
 
I'm writing it right there: it's with RT Ultra. You want PT? No problem, gimme a second.

Unlike you, I don't actually hide the pop settings :roll:

You are wasting your time arguing with him; accept it. Even if you showed him irrefutable proof, he still would not accept it. He might have used Intel in the past, but now he is a resolute AMD user. Until Intel pulls another C2D (and they will), he is lost.
 
I'm writing it right there: it's with RT Ultra. You want PT? No problem, gimme a second.

Unlike you, I don't actually hide the pop settings :roll:

PT, oh my my my, much slower while consuming 3 times the power. I just hope you realize a 14900K would send both our rigs to the spare-parts bin...

[screenshot]

10% slower while using 55% more power.

Could the 14900K be slightly faster? Sure, but at an absolutely absurd power draw. Not to mention that the 14900K is vastly more expensive than the 7800X3D: 4,800 DKK vs 2,900 DKK.
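In percentage terms, from those prices:

```python
print((4800 - 2900) / 2900 * 100)  # ~65.5% premium for the 14900K over the 7800X3D (DKK)
```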
 
10% slower while using 55% more power.

Could the 14900K be slightly faster? Sure, but at an absolutely absurd power draw. Not to mention that the 14900K is vastly more expensive than the 7800X3D: 4,800 DKK vs 2,900 DKK.
10% slower? You are getting 135 fps and I'm getting 132! Dude.....

I hope you realize that if I decide to OC, even my 3-gen-old 12900K will just fly past your X3D, right? Sure, power draw will hit 150 W, but it's a 3-gen-old CPU. And slightly faster? My 14900K was 15-20% faster than my 12900K, both running stock.
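For the record, the actual gap from the fps figures quoted in-thread:

```python
fps_x3d, fps_12900k = 135, 132
print(f"{(fps_x3d - fps_12900k) / fps_x3d * 100:.1f}% slower")  # ~2.2%, not 10%
```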
 
You are wasting your time arguing with him; accept it. Even if you showed him irrefutable proof, he still would not accept it. He might have used Intel in the past, but now he is a resolute AMD user. Until Intel pulls another C2D (and they will), he is lost.

Cute.

But yes, if Intel does come up with a new killer product (namely one that ditches the e-waste cores and focuses on gaming performance), then I'm all for going Intel again - it was always a good experience.
 