
Raptor Lake Refresh is coming!

Joined
Jan 14, 2019
Messages
13,237 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Intel's iGPU isn't part of the CPU cores. It's in the same package, but completely separate. You can't get anything extra from its absence, as evidenced by the "F" processors, which have the iGPU disabled because it's defective. It's completely inactive (laser cut), and yet these processors are no better at overclocking than their siblings with the iGPU.
Exactly this.

I...really hope that both of you are trolling... If not, it's pretty sad to see people reading the words written yet not comprehending them at face value, so they can assign their own meaning to them.

To clarify, I was commenting on the slight boost to clock speeds that could be garnered. Basically, old chips traded production area that could have been used for more transistors and instead used it for an iGPU core. Said core was not entirely dark even when disabled, but it did decrease overclocking headroom by leaving fewer transistors available for CPU cores, and it sat as a big old dead space for anyone using a dGPU.

I was also commenting that Netburst was...not a bad thing. Hear me out, because that's a lot to say. What I mean is that modern CPUs are built on the lessons of Netburst and Bulldozer. That means your modern 8-core, 16-thread consumer-grade CPUs exist because of lessons learned. I would be hard pressed to explain to someone rocking a modern 6+ core CPU why it just works without pointing out that what failed in the past is directly responsible for today's success. Just like today, AMD uses CCXs instead of a monolithic silicon chunk, the PCH is often a generation or two behind the lithographic tech of the main processor, and Windows even required an update to the scheduler to address how programs were assigned by both AMD and Intel in the last five years, because their innovations often lead to short-term issues.


I remember a wonderful time when I could get a CPU, then spend hours overclocking it. I now buy a CPU and it overclocks itself (well enough to be passable). I remember people claiming that you'd never need more than a few cores, but consumer hardware now comes with 6+ cores and some form of threading almost as standard. I remember asking for Intel to stop gimping my gaming CPU with an iGPU incapable of running 640x480 resolutions, so that I could eke out just a couple hundred more megahertz...which is the comment I actually was referring back to (and why I find it silly that you both really want me to be wrong without ever considering that I might be speaking to something other than what you're projecting onto my comments). Alas, apparently it's asking too much for people not to fight over something I didn't say, never meant, and even joked about as me being too old.
Whatever, complete the loop here. I...don't understand the reason for fighting about the value of an iGPU when I referred to its exclusion as a means of slightly increasing CPU frequency...but why does context matter?
Get any CPU with an iGPU, use it as a display adapter (meaning: have a dGPU in the system, but use the iGPU as a display adapter), and check iGPU power consumption in HWinfo. I'd be very surprised if it was higher than about 0.5 W, even on an old Intel chip.
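
If you're on Linux and want to cross-check the HWinfo number, here's a rough sketch using the intel-rapl powercap interface. The paths and domain names are assumptions (they vary between platforms), and the "uncore" domain is only a loose proxy for the iGPU, but it's enough to see whether the figure is in the tenths of a watt or in the tens:

```python
#!/usr/bin/env python3
"""Quick-and-dirty RAPL power sampler (Linux). A sketch only: the powercap
domain layout and names vary by CPU, and on client chips the iGPU usually
shows up under the "uncore" sub-domain. Reading energy_uj may require root."""
import time
from pathlib import Path

RAPL = Path("/sys/class/powercap")

def read_uj(domain: Path) -> int:
    # Cumulative energy counter for this domain, in microjoules.
    return int((domain / "energy_uj").read_text())

def watts(domain: Path, seconds: float = 2.0) -> float:
    # Average power over the sampling window (ignores counter wrap-around).
    e0 = read_uj(domain)
    time.sleep(seconds)
    e1 = read_uj(domain)
    return (e1 - e0) / 1e6 / seconds

if __name__ == "__main__":
    domains = sorted(RAPL.glob("intel-rapl:*")) + sorted(RAPL.glob("intel-rapl:*/intel-rapl:*"))
    for d in domains:
        name = (d / "name").read_text().strip()
        print(f"{name:10s} {watts(d):6.2f} W")
```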

I have an 11700 in my HTPC, and I use its iGPU as a display adapter because I need HDMI 2.0 that the 1050 Ti in the system doesn't have. Locked to 65 W, the CPU boosts up to 2.8 GHz in a Cinebench all-core test. If I disable the iGPU, and hook the TV up to the 1050 Ti, the CPU does... well, 2.8 GHz.

My point stands: the iGPU, when unused or when used as a display adapter, needs so little power that it doesn't eat into your CPU's power and thermal headroom. This is not trolling. This is fact.

If you don't believe me, send me an 11700F, and I'll test it for you. I'm sure it'll boost up to 2.8 GHz as well.

Edit: I agree with you about Netburst and Bulldozer. Failure is a necessary step on the road to success.
 
Joined
Feb 24, 2023
Messages
3,261 (4.76/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Your anecdotes
Added you to my "sees trolling in everything" list. Sorry man, but I've never attempted to make fun of you, or even suspected you of spreading bullcake. Your extreme overreaction makes no sense.

Like, what's so hard about understanding that I was just very happy to see Intel making something cool after they invented Netburst?

Speaking of iGPUs, your point was, is, and will be valid. Relax and stop looking for non-existent insults.
 
Joined
Apr 2, 2011
Messages
2,856 (0.57/day)
Exactly this.


Get any CPU with an iGPU, use it as a display adapter (meaning: have a dGPU in the system, but use the iGPU as a display adapter), and check iGPU power consumption in HWinfo. I'd be very surprised if it was higher than about 0.5 W, even on an old Intel chip.

I have an 11700 in my HTPC, and I use its iGPU as a display adapter because I need HDMI 2.0 that the 1050 Ti in the system doesn't have. Locked to 65 W, the CPU boosts up to 2.8 GHz in a Cinebench all-core test. If I disable the iGPU, and hook the TV up to the 1050 Ti, the CPU does... well, 2.8 GHz.

My point stands: the iGPU, when unused or when used as a display adapter, needs so little power that it doesn't eat into your CPU's power and thermal headroom. This is not trolling. This is fact.

If you don't believe me, send me an 11700F, and I'll test it for you. I'm sure it'll boost up to 2.8 GHz as well.

Edit: I agree with you about Netburst and Bulldozer. Failure is a necessary step on the road to success.

You still seem to be absolutely incapable of getting a basic fact.

This is why I think you're trolling, because when stated multiple times in bold and capitals you still seem to be incapable of reading.


Let me ask you some basic questions so you can maybe get this.
1) When was the 11700F released?
2) What was released A DECADE AGO?
3) When we are discussing literal hundreds of MHz being the difference between a line and its refresh do you think maybe removing that iGPU could make a difference?

Now, I think you still don't get it. Let me fact check this.
1) 3-16-21. March 16th of 2021.
2) Sandy Bridge and Ivy Bridge. That's the 2000 and 3000 series, whereas we are now at 13000.
3) 100 MHz is well within the silicon lottery, and, based upon how Intel bins their chips, is easily possible to aim for with increased power consumption and heat and minor process improvements. It's almost like my original point on this one was that, optimistically, A DECADE AGO people were asking Intel for the same removal of perceived value-negative items for their use case, so they could get just a bit more perceived value. Almost like the person I responded to, who was asking for shearing off E cores to increase the performance of P cores.

I'm glad that you proved the trolling correct, because Beginner Micro Devices above seems to want to pretend that everyone who is continuing this is rational. Meanwhile, when I cite Sandy Bridge and 2011-2013 as my targets for reference, you demand that I send you a device that I don't have, never claimed to have, and which was literally only introduced to the discussion because...you know, I have to go with trolling, because the alternative is that you cannot comprehend the idea that there's old hardware. All of this because of a warning that asking Intel to remove personally irrelevant hardware historically hasn't meant better performance, just literally disabling stuff or selling what otherwise would have been recycled as bad silicon. Because the original response was to someone asking for fewer E cores so they can boost their P cores...because apparently comprehending that has taken 7 pages of discussion, and I've attempted to say this about half a dozen times.



I also want to make something clear. How much energy is half a Watt? You seem to not get a lot of the basics, so let me explain this. Let's assume that the measurement you're pulling from thin air is correct. Let me then explain that your CPU pulls way more than 65 Watts...the rating for heat is TDP, or thermally dissipated power. This means that assuming Intel and AMD are measuring correctly...they take in 250 watts, output 65 Watts as heat, and the remaining 185 Watts performs work or is otherwise transferred out (generally as electromagnetic potential energy). Assuming roughly the same efficiency of heat transfer, or 74%, and 0.5 input Watts, the thermal output on your theoretical iGPU would be 0.13 Watts. Let me idiot check that number...so you can see why it's not rational. What is the output of heat from the human body? 80-1050 watts according to professor Google. Cool. A processor can raise the temperature of a metal chunk from ambient to 70+ degrees C...but the human body producing 160 times as much energy cannot raise the temperature of the same cooler above about 98.6 F...or 37 C. That...doesn't make a lick of sense, even once you factor in a huge increase in surface area, an enormous decrease in heat transfer, and the delta in temperature being about 10 C versus the 42 C for the computer.

Now, if Intel's current iGPU is based off of the same tech as their dGPU...and their dGPUs are using a stepped up version, then the 11700k using UHD 750 graphics running at "0.5 Watts" should produce a dGPU capable of the same performance at the same wattage...right? Checking on that iGPU we see: TPU UHD 750 graphics data This should mean that those Intel cards absolutely slay...because even if they scale at about 50% you have 200/0.5 = 400 times as much hardware for the same very low 200 Watt power draw...200 times as much with scaling...but it doesn't... If it did both Nvidia and AMD would literally be out of the market. Let me say that again, imagine 200 iGPU cores running at once on a dGPU, pulling less energy than your average mid-range card, and having literally thousands of execution units (32*200) and more than 51000 shaders (256*200). The 4080 has 304 TMUs (not a direct comparison, but close enough) and 9728 shaders...or about 1/5th the amount on this theoretical Intel dGPU based off of your quoted results.
Hmmm.....It's almost like the basic review of data from your statements is a lie. Almost like that same data sheet calls out 15 Watts of TDP...which would be 58 Watts of energy flowing through the thing (with the previous 26% efficiency)...which represents 58/250 = 23% of all power the CPU consumes.... It's almost like basic fact checking is proving your assertions really wrong...
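
To make those figures easy to reproduce, here's the same arithmetic spelled out. Note that it only replays the assumptions already stated above (roughly 250 W total draw, the 65 W TDP treated as heat output, the 15 W iGPU TDP from the TPU database); it doesn't prove that this is the right way to read TDP.

```python
# Replaying the post's arithmetic under its own assumptions -- not measurements.
wall_draw = 250.0   # W, assumed total package draw
cpu_tdp   = 65.0    # W, the TDP figure, treated here as heat output
igpu_tdp  = 15.0    # W, UHD 750 TDP per the TPU database

heat_fraction = cpu_tdp / wall_draw        # 0.26 -> the "26% efficiency" above
igpu_input    = igpu_tdp / heat_fraction   # ~58 W "flowing through" the iGPU
igpu_share    = igpu_input / wall_draw     # ~0.23 -> the 23% figure

print(f"heat fraction:       {heat_fraction:.0%}")
print(f"implied iGPU draw:   {igpu_input:.0f} W")
print(f"share of total draw: {igpu_share:.0%}")
```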

What do I know though, right? It's not like your numbers are moon logic wrong, your basic assumptions can be demonstrated as incorrect with a simple google search, or basic thought processes demonstrate that your assertion is that removing 23% of the power consumption from a processor should result in more overclocking headroom...if Intel didn't decide to throw thermal paste into their processors, have issues with contact area because of pressure application, or my favorite reason for all of this simply segment their offering to high heaven so they can sell basically all of their semi-defective silicon as something. Of course, I just don't get the value of an iGPU what would through its exclusion have decreased power consumption by 23% on a modern 11700k...let alone the absolute mess that Intel HD graphics was on Sandybridge. TPU database, Intel HD Graphics 2000 Yeah, it's irrational for me to call this trolling...because I'm supposed to assume that basic logical checks aren't something others are capable of doing, and fact checking is too much work, and best of all things like a collection of heat pipes between cold plates doesn't change a vapor chamber into fundamentally something else because people still call it a vapor chamber.

Tangent on that last one...but it's funny having the discussion about how nobody uses vapor chambers anymore. I was told I don't get it, and cited the new Radeon cards. They then define the issue as there was not enough fluid in the "vapor chamber" to properly phase change between the hot and cold plate (when these cards failed), and when they cracked the thing open it was basically a forest of heat pipes that transferred condensed liquid from the cold plate down to the hot plate instead of the definition of a vapor chamber...because the flaw of a vapor chamber is that you cannot define where liquid hits the hot plate, so they've changed the definition to basically be an integration of heat pipes...because that increases manufacturing complexity but can prove critical in spreading the phase change energy transfer evenly along a surface. All of this is to say that you should never trust somebody who doesn't understand the basic mechanics of things, or when basic fact checking about their numbers and assumptions is trivial allow them to go without checking. I want to live in your 0.5 Watt iGPU world...because it'd mean a couple hundred USD would offer current high end performance. Unfortunately we live in the real world...where a 4080 is 1/5th your theoretical iGPU to dGPU math and has a 320 Watt TDP instead of a 200 Watt one. Remember, that was assuming terrible scaling...off your numbers. Mine indicate that Intel has scaled well with ARC and their stated 15 Watt TDP of the iGPU.
 
Joined
Apr 2, 2011
Messages
2,856 (0.57/day)

GIGO
Garbage
in
Garbage
out

Wish I could live in the land of sunshine and magic where the average iGPU ran at 0.5 Watts, which meant less TDP. Unfortunately I live in the real world where a 15 Watt TDP means about 58 Watts of input energy...where we are dealing with CPUs that might pull 250 Watts of total energy.


Also want to live in a world where dropping 23% of the power draw for a chip wouldn't offer overclocking headroom...because that should mean magic overclocks and magic heat transfer. Unfortunately I live in a world where the laws of physics preclude magic. If I'm wrong, please point me in the direction of the iGPU that runs on 0.5 Watts (and can run like Intel Iris does today)...because all I want is to get somebody to link about 200 of them up and charge me about 200 USD for it. That'd be a fantastic GPU...as it'd outstrip the 1000+USD 4080 by about 4-6 generations based upon current trends...though that's subject to change.
If not, and Intel's 10-20 USD saving for no iGPU holds, I'll take enough for 40 iGPUs linked...because $400-$800 would still result in Intel beating a 4080 and a savings of at least 400 USD.
 
Joined
Jan 5, 2006
Messages
18,584 (2.68/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
The F24b BIOS ("Supports New Gen. Intel Core CPUs") has been removed from the Z690 Aorus Master download section.


As I mentioned before... Intel didn't confirm anything..
 
Joined
Feb 1, 2019
Messages
3,684 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
The CPUs are coming. They wouldn't go as far as putting it in a BIOS otherwise, I think. Intel is just unhappy with the leak.
 

ir_cow

Staff member
Joined
Sep 4, 2008
Messages
4,613 (0.77/day)
Location
USA
Also want to live in a world where dropping 23% of the power draw for a chip wouldn't offer overclocking headroom...because that should mean magic overclocks and magic heat transfer.
Where are you getting these numbers from? Also, the iGPU is only active if you A) use the video port on the motherboard, or B) install the drivers. The power stage responsible for the iGPU reads zero for me when I probe it. AKA not active, because I don't do either of the above.
 
Joined
Apr 2, 2011
Messages
2,856 (0.57/day)
Where are you getting these numbers from? Also, the iGPU is only active if you A) use the video port on the motherboard, or B) install the drivers. The power stage responsible for the iGPU reads zero for me when I probe it. AKA not active, because I don't do either of the above.

Sigh.

I get the iGPU running at "0.5 Watts" from the above people who tell me it's not a thermal drain. I don't know where that came from...
I get the 15 Watts TDP from the TPU database for what the iGPU is supposed to run at.
I get the assertion that an iGPU today can functionally be disabled...mostly...but I was referring back to 2011-2013, when people were asking for the iGPU to be removed and the extra space used for something valuable...like my point was something separate from "iGPU is bad", despite having to spend all of this time saying that.

I then pulled 250 Watts directly from my backside. Or not. The wall draw for PL2 is rated to 251 Watts with a TDP of 125 Watts, or about 50% efficiency on the conversion. This would mean that the 15 Watt TDP iGPU would draw 30 Watts at the same efficiency...or 30/251 = 12% of the total power demand for the chip. I'll admit to cheating a bit, because at the end of the day 65 Watts TDP was a huge underestimation of the full draw...because gaming is what I was referring to and generally speaking gaming doesn't pull the same workloads as encoding...you should review the same data I did here: Power for 11700k
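
Spelled out with the PL2 figures instead of the 65 W guess (again, this just replays the stated assumptions; none of it is measured):

```python
# Same calculation as before, but with the 11700K PL2 numbers from the linked data.
pl2_wall = 251.0   # W, rated draw at PL2
cpu_tdp  = 125.0   # W, 11700K TDP
igpu_tdp = 15.0    # W, UHD 750 TDP

efficiency = cpu_tdp / pl2_wall        # ~0.50 -> the "50% conversion" above
igpu_draw  = igpu_tdp / efficiency     # ~30 W implied iGPU draw
igpu_share = igpu_draw / pl2_wall      # ~0.12 -> the 12% figure

print(f"implied iGPU draw: {igpu_draw:.0f} W ({igpu_share:.0%} of total)")
```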

OK, cool. I'm assuming the 11700k is more energy efficient than it is, meaning I assumed more power was pulled by the iGPU...and estimated the iGPU power draw about 10% higher (23% versus 12%). I'm going to balance that out with a few things.
One, iGPUs were less efficient in the Ivy Bridge and Sandy Bridge components I referred to...because everything was less efficient. They also had a lower TDP.
Two, you aren't encoding while playing games...and you definitely aren't running the CPU into the ground before the iGPU becomes your bottleneck.
Three, there most definitely is a point of diminishing returns...but my point was removing the iGPU would have (in 2011-2013) given better overclocking headroom or more cores. Period.
Finally, let's assume even worse. Instead of 23%, or 12%, let's assume 5% of the power in a processor goes to the iGPU. I've seen 0.05 volts be the difference between a stable PC and 200 MHz lower clocks...which was way less than 5% overall increase in power draw. I cannot provide you hard numbers, because by nature the silicon lottery is anecdotal only. That said I can demonstrate that the silicon lottery is a real thing simply by highlighting the overclocking world records: CPU overclocking records

So my point stands, and you have a point that I was...optimistic in assuming some of my efficiencies. I'll gladly admit this could be a problem if any of these numbers were hard rather than theoretical...but I stand with the assertion that an iGPU running at 0.5 Watts is moon logic...unless it's disabled. If it's disabled, then you're proving my point that a gaming focused CPU has limited usage of an iGPU, and if you're doing it to prove a point then you missed the context where all of this is about how removing an iGPU a decade ago was proposed as a way to get better overclocking... Cool. Can we be done now?

Can we also admit that in nongaming usages iGPUs are a godsend? Others highlighted servers. I'll highlight transcoding boxes, media servers, semi-headless units, automation projects, and even SBCs. Yeah, there's nothing quite like retroarch and a hundred bucks to build a nice gift for someone you love, and getting half a dozen other people asking you to build them one. Charging 200 USD for time and labor, then seeing something at Bestbuy for 500 USD...and knowing that what made all of this possible was an iGPU...despite still hating their waste in gaming CPUs a decade ago (that didn't have modern features like selecting the GPU and high enough C states to functionally make the silicon dark).
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
The F24b BIOS ("Supports New Gen. Intel Core CPUs") has been removed from the Z690 Aorus Master download section.


As I mentioned before... Intel didn't confirm anything..
Nothing abnormal. Alpha/beta versions do not have a long life at Gigabyte. Even stable versions appear and disappear.
 

ir_cow

Staff member
Joined
Sep 4, 2008
Messages
4,613 (0.77/day)
Location
USA
Can we also admit that in nongaming usages iGPUs are a godsend? Others highlighted servers. I'll highlight transcoding boxes, media servers, semi-headless units, automation projects, and even SBCs. Yeah, there's nothing quite like retroarch and a hundred bucks to build a nice gift for someone you love, and getting half a dozen other people asking you to build them one. Charging 200 USD for time and labor, then seeing something at Bestbuy for 500 USD...and knowing that what made all of this possible was an iGPU...despite still hating their waste in gaming CPUs a decade ago (that didn't have modern features like selecting the GPU and high enough C states to functionally make the silicon dark).
You can Sigh all you want and ramble away too.

Anyway, iGPUs are good for stuff that isn't gaming related. Up until recently, Intel had a much better time with laptops compared to AMD in that department. If you want a media box, an iGPU is great for cost savings. Just make sure the codec is supported, or the CPU will be doing all the work instead.
 
Joined
Jan 14, 2019
Messages
13,237 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
You still seem to be absolutely incapable of getting a basic fact.

This is why I think you're trolling, because when stated multiple times in bold and capitals you still seem to be incapable of reading.


Let me ask you some basic questions so you can maybe get this.
1) When was the 11700F released?
2) What was released A DECADE AGO?
3) When we are discussing literal hundreds of MHz being the difference between a line and its refresh do you think maybe removing that iGPU could make a difference?

Now, I think you still don't get it. Let me fact check this.
1) 3-16-21. March 16th of 2021.
2) Sandy Bridge and Ivy Bridge. That's the 2000 and 3000 series, whereas we are now at 13000.
3) 100 MHz is well within the silicon lottery, and, based upon how Intel bins their chips, is easily possible to aim for with increased power consumption and heat and minor process improvements. It's almost like my original point on this one was that, optimistically, A DECADE AGO people were asking Intel for the same removal of perceived value-negative items for their use case, so they could get just a bit more perceived value. Almost like the person I responded to, who was asking for shearing off E cores to increase the performance of P cores.

I'm glad that you proved the trolling correct, because Beginner Micro Devices above seems to want to pretend that everyone who is continuing this is rational. Meanwhile, when I cite Sandy Bridge and 2011-2013 as my targets for reference, you demand that I send you a device that I don't have, never claimed to have, and which was literally only introduced to the discussion because...you know, I have to go with trolling, because the alternative is that you cannot comprehend the idea that there's old hardware. All of this because of a warning that asking Intel to remove personally irrelevant hardware historically hasn't meant better performance, just literally disabling stuff or selling what otherwise would have been recycled as bad silicon. Because the original response was to someone asking for fewer E cores so they can boost their P cores...because apparently comprehending that has taken 7 pages of discussion, and I've attempted to say this about half a dozen times.



I also want to make something clear. How much energy is half a Watt? You seem to not get a lot of the basics, so let me explain this. Let's assume that the measurement you're pulling from thin air is correct. Let me then explain that your CPU pulls way more than 65 Watts...the rating for heat is TDP, or thermally dissipated power. This means that assuming Intel and AMD are measuring correctly...they take in 250 watts, output 65 Watts as heat, and the remaining 185 Watts performs work or is otherwise transferred out (generally as electromagnetic potential energy). Assuming roughly the same efficiency of heat transfer, or 74%, and 0.5 input Watts, the thermal output on your theoretical iGPU would be 0.13 Watts. Let me idiot check that number...so you can see why it's not rational. What is the output of heat from the human body? 80-1050 watts according to professor Google. Cool. A processor can raise the temperature of a metal chunk from ambient to 70+ degrees C...but the human body producing 160 times as much energy cannot raise the temperature of the same cooler above about 98.6 F...or 37 C. That...doesn't make a lick of sense, even once you factor in a huge increase in surface area, an enormous decrease in heat transfer, and the delta in temperature being about 10 C versus the 42 C for the computer.

Now, if Intel's current iGPU is based off of the same tech as their dGPU...and their dGPUs are using a stepped up version, then the 11700k using UHD 750 graphics running at "0.5 Watts" should produce a dGPU capable of the same performance at the same wattage...right? Checking on that iGPU we see: TPU UHD 750 graphics data This should mean that those Intel cards absolutely slay...because even if they scale at about 50% you have 200/0.5 = 400 times as much hardware for the same very low 200 Watt power draw...200 times as much with scaling...but it doesn't... If it did both Nvidia and AMD would literally be out of the market. Let me say that again, imagine 200 iGPU cores running at once on a dGPU, pulling less energy than your average mid-range card, and having literally thousands of execution units (32*200) and more than 51000 shaders (256*200). The 4080 has 304 TMUs (not a direct comparison, but close enough) and 9728 shaders...or about 1/5th the amount on this theoretical Intel dGPU based off of your quoted results.
Hmmm.....It's almost like the basic review of data from your statements is a lie. Almost like that same data sheet calls out 15 Watts of TDP...which would be 58 Watts of energy flowing through the thing (with the previous 26% efficiency)...which represents 58/250 = 23% of all power the CPU consumes.... It's almost like basic fact checking is proving your assertions really wrong...

What do I know though, right? It's not like your numbers are moon logic wrong, your basic assumptions can be demonstrated as incorrect with a simple google search, or basic thought processes demonstrate that your assertion is that removing 23% of the power consumption from a processor should result in more overclocking headroom...if Intel didn't decide to throw thermal paste into their processors, have issues with contact area because of pressure application, or my favorite reason for all of this simply segment their offering to high heaven so they can sell basically all of their semi-defective silicon as something. Of course, I just don't get the value of an iGPU what would through its exclusion have decreased power consumption by 23% on a modern 11700k...let alone the absolute mess that Intel HD graphics was on Sandybridge. TPU database, Intel HD Graphics 2000 Yeah, it's irrational for me to call this trolling...because I'm supposed to assume that basic logical checks aren't something others are capable of doing, and fact checking is too much work, and best of all things like a collection of heat pipes between cold plates doesn't change a vapor chamber into fundamentally something else because people still call it a vapor chamber.

Tangent on that last one...but it's funny having the discussion about how nobody uses vapor chambers anymore. I was told I don't get it, and cited the new Radeon cards. They then define the issue as there was not enough fluid in the "vapor chamber" to properly phase change between the hot and cold plate (when these cards failed), and when they cracked the thing open it was basically a forest of heat pipes that transferred condensed liquid from the cold plate down to the hot plate instead of the definition of a vapor chamber...because the flaw of a vapor chamber is that you cannot define where liquid hits the hot plate, so they've changed the definition to basically be an integration of heat pipes...because that increases manufacturing complexity but can prove critical in spreading the phase change energy transfer evenly along a surface. All of this is to say that you should never trust somebody who doesn't understand the basic mechanics of things, or when basic fact checking about their numbers and assumptions is trivial allow them to go without checking. I want to live in your 0.5 Watt iGPU world...because it'd mean a couple hundred USD would offer current high end performance. Unfortunately we live in the real world...where a 4080 is 1/5th your theoretical iGPU to dGPU math and has a 320 Watt TDP instead of a 200 Watt one. Remember, that was assuming terrible scaling...off your numbers. Mine indicate that Intel has scaled well with ARC and their stated 15 Watt TDP of the iGPU.
Please forgive me for not reading every single word of your wall of text.

I can assure you that I've had Sandy and Ivy Bridge CPUs, and none of them showed any difference in achievable CPU speed with their iGPU enabled or disabled. The oldest CPU I currently have is a Haswell T chip. Do you want me to check its iGPU power consumption?

And please, let's not confuse an iGPU's power consumption with that of a dGPU. A dGPU has VRAM, a PCI-express interface, a memory controller, and other components on it that have to be powered on for the card to be functional. An iGPU does not.

Sigh.

I get the iGPU running at "0.5 Watts" from the above people who tell me it's not a thermal drain. I don't know where that came from...
I get the 15 Watts TDP from the TPU database for what the iGPU is supposed to run at.
I get the assertion that an iGPU today can functionally be disabled...mostly...but I was referring back to 2011-2013, when people were asking for the iGPU to be removed and the extra space used for something valuable...like my point was something separate from "iGPU is bad", despite having to spend all of this time saying that.

I then pulled 250 Watts directly from my backside. Or not. The wall draw for PL2 is rated to 251 Watts with a TDP of 125 Watts, or about 50% efficiency on the conversion. This would mean that the 15 Watt TDP iGPU would draw 30 Watts at the same efficiency...or 30/251 = 12% of the total power demand for the chip. I'll admit to cheating a bit, because at the end of the day 65 Watts TDP was a huge underestimation of the full draw...because gaming is what I was referring to and generally speaking gaming doesn't pull the same workloads as encoding...you should review the same data I did here: Power for 11700k

OK, cool. I'm assuming the 11700k is more energy efficient than it is, meaning I assumed more power was pulled by the iGPU...and estimated the iGPU power draw about 10% higher (23% versus 12%). I'm going to balance that out with a few things.
One, iGPUs were less efficient in the Ivy Bridge and Sandy Bridge components I referred to...because everything was less efficient. They also had a lower TDP.
Two, you aren't encoding while playing games...and you definitely aren't running the CPU into the ground before the iGPU becomes your bottleneck.
Three, there most definitely is a point of diminishing returns...but my point was removing the iGPU would have (in 2011-2013) given better overclocking headroom or more cores. Period.
Finally, let's assume even worse. Instead of 23%, or 12%, let's assume 5% of the power in a processor goes to the iGPU. I've seen 0.05 volts be the difference between a stable PC and 200 MHz lower clocks...which was way less than 5% overall increase in power draw. I cannot provide you hard numbers, because by nature the silicon lottery is anecdotal only. That said I can demonstrate that the silicon lottery is a real thing simply by highlighting the overclocking world records: CPU overclocking records

So my point stands, and you have a point that I was...optimistic in assuming some of my efficiencies. I'll gladly admit this could be a problem if any of these numbers were hard rather than theoretical...but I stand with the assertion that an iGPU running at 0.5 Watts is moon logic...unless it's disabled. If it's disabled, then you're proving my point that a gaming focused CPU has limited usage of an iGPU, and if you're doing it to prove a point then you missed the context where all of this is about how removing an iGPU a decade ago was proposed as a way to get better overclocking... Cool. Can we be done now?

Can we also admit that in nongaming usages iGPUs are a godsend? Others highlighted servers. I'll highlight transcoding boxes, media servers, semi-headless units, automation projects, and even SBCs. Yeah, there's nothing quite like retroarch and a hundred bucks to build a nice gift for someone you love, and getting half a dozen other people asking you to build them one. Charging 200 USD for time and labor, then seeing something at Bestbuy for 500 USD...and knowing that what made all of this possible was an iGPU...despite still hating their waste in gaming CPUs a decade ago (that didn't have modern features like selecting the GPU and high enough C states to functionally make the silicon dark).
In your example, you seem to be assuming that the iGPU runs at 100% load all the time. A 65 W limit does not mean that the iGPU has 15 W and the CPU has 50. It means that the 65 W is shared between both. Sure, if you run a 100% load on your iGPU, it will eat into the CPU's headroom. But if you don't, then the CPU can have (nearly) all the 65 W.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,213 (3.99/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Hoping to be able to slap one of these on my little B660 board :D

If not.. well then I don't know.
 
Joined
Apr 2, 2011
Messages
2,856 (0.57/day)
You can Sigh all you want and ramble away too.

Anyway, iGPUs are good for stuff that isn't gaming related. Up until recently, Intel had a much better time with laptops compared to AMD in that department. If you want a media box, an iGPU is great for cost savings. Just make sure the codec is supported, or the CPU will be doing all the work instead.

I'm sighing because you are all acting idiotic, intentionally obtuse, or demonstrating that you cannot understand an idea more complex than A->B...because mine is A->B, based on this we can extrapolate that C->D will end badly based upon the relationship of A and B.

Please, get this through your heads collectively. You seem to not get it...so let me explain one more time for the slow crowd, or those who really want to not read.

The comment replied to was that we should remove E cores to boost the performance of P cores. If you can understand that, then we're good with step one.
The reply was careful what you wish for. A DECADE AGO the same ask was made of Intel when they first started integrating iGPUs into CPUs that were specifically designed for gaming...and thus would basically be guaranteed to have a dGPU.
If you had a dedicated GPU, and without the modern features that allow usage of an iGPU or selection based upon load, then the iGPU was functionally a drain on potential overclocking...and not relevant for those that were running a gaming setup (which was the target market for people running a consumer grade 2600k instead of either a workstation CPU or a lower end "white box" office PC).

I said this because a decade ago this was quoted as losing silicon on the monolithic die to an iGPU...and that the ask of removing it from the monolithic silicon chunk was to either support better overclocking or lower temperatures. It's functionally the same ask as someone today asking for disabling E cores for more or better performing P cores.



My sigh is because you are being completely idiotic in asking for numbers...and then you ignore the numbers. You wanted me to honestly tell you they were artificial, so you could dismiss them without actually doing anything....but I had the data so you have to go the apathy route. You then make the same idiotic statement that "my iGPU is useful" as though I said they had no use...because you're either a troll or incapable of reading. I start with a sigh because I'm not sure which you are...but at this point it's assured you're one of the two. I gave you the benefit of the doubt by providing data...and it's a waste of my time to give you that. I sigh, because you aren't worth talking to because you want to talk at a point I never made...because.

Please forgive me for not reading every single word of your wall of text.

I can assure you that I've had Sandy and Ivy Bridge CPUs, and none of them showed any difference in achievable CPU speed with their iGPU enabled or disabled. The oldest CPU I currently have is a Haswell T chip. Do you want me to check its iGPU power consumption?

And please, let's not confuse an iGPU's power consumption with that of a dGPU. A dGPU has VRAM, a PCI-express interface, a memory controller, and other components on it that have to be powered on for the card to be functional. An iGPU does not.


In your example, you seem to be assuming that the iGPU runs at 100% load all the time. A 65 W limit does not mean that the iGPU has 15 W and the CPU has 50. It means that the 65 W is shared between both. Sure, if you run a 100% load on your iGPU, it will eat into the CPU's headroom. But if you don't, then the CPU can have (nearly) all the 65 W.

You...are absolutely selective in quotation and missing the point of things...

When Intel quotes a CPU at 65 Watts TDP, and they quote an iGPU at 15 Watts of TDP, you think they share? Fine, no arguments (that's how I calculated it). 65-15 = 50 Watts from the CPU and 15 Watts from the iGPU. If we assume roughly the same efficiency, then:
(15/65)*250 = iGPU power draw
(50/65)*250 = CPU power draw
For the record, this is how I get the power draw for the iGPU. That's, for the 11700k, 125-15 = 110 Watts TDP going to the CPU and 15 Watts TDP going to the iGPU. Full draw is therefore easy to calculate on the iGPU because assuming roughly the same efficiency you have (15/125)*250 watts of electricity going to the iGPU and it's about 50% efficient with regards to the conversion of electricity to heat for the entire package (125/250). The 125 Watt TDP though seems to only be reached with specific instruction sets...so I pulled 65 Watts of TDP from my backside as a regular load where the iGPU is the bottleneck... because it seems about reasonable. That is arguable though...as any gaming experience is...because there are CPU bound games like Total War that might ping your CPU to the moon and on ultra performance setting might somehow not make the iGPU be the bottleneck...whereas most games seem to be the other way around.


Now about that dGPU...you understand some of the basics. Let me explain why I first calculated 100% scaling, then gave you 50%. You seem to have missed that bit. I was told that the iGPU somehow only used 0.5 Watts...and I'll take that at face value. 100% scaling for a 200 Watt card (because they said power draw, not TDP) would yield 200/0.5 = 400 iGPUs worth of hardware running on the same 200 Watt power draw as a middle of the road card. Assuming that scaling isn't 100%, and is in fact half, that's still 200 iGPUs worth of components that could be present with the same power draw.
What escapes you about having a 50% power efficiency still providing 200 times the hardware...because you seem to argue that it's not linear but even offering you 50% efficiency you can still have 200x the iGPU hardware on a dedicated video card for the same power draw...

Now, I described this as moon logic. If you use the quoted TDP of the hardware inside the 11700k's iGPU, and that's 15 Watts TDP, then we're looking at roughly what the ARC dGPUs offer. Let me quantify. 225 Watts TDP for the A750: TPU database, A750
If you then compare the UHD 750 at 15 Watts: TPU Database, UHD 750
225/15 = 15
3584 shaders / 256 shaders = 14x as many shaders on the A750
Why, by Jove, when they scale from iGPU to dGPU the scaling is 14:15, or 93%...and my example uses a 50% efficiency. Holy crap...93% of 400 = 372...so that iGPU to dGPU scaling should be 372 instead of 200. It's also so close that your argument evaporates when the statement was that the iGPU only uses 0.5 watts...and my point was that the 0.5 Watts was moon logic because the real math (14:15) basically agrees with the 15 Watts TDP quoted.
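
Putting that comparison in plain numbers, with the TDP and shader counts taken straight from the TPU database entries linked above:

```python
# Arc A750 vs. UHD 750 (11700K iGPU), using the TPU database figures cited above.
a750_tdp,   a750_shaders   = 225.0, 3584
uhd750_tdp, uhd750_shaders = 15.0,  256

tdp_ratio    = a750_tdp / uhd750_tdp             # 15x the power budget
shader_ratio = a750_shaders / uhd750_shaders     # 14x the shaders

print(f"TDP ratio:    {tdp_ratio:.0f}x")
print(f"shader ratio: {shader_ratio:.0f}x")
print(f"scaling:      {shader_ratio / tdp_ratio:.0%}")   # ~93%
```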


You're welcome to quibble over stupid things. In my book that's idiotic, but at a 50% efficiency the dGPU scale would have literally beat the pants off of a 4080...let alone 93%. When I use the right numbers the iGPU scales well with the dGPU...and unsurprisingly it's worth considering. So...is the 0.5 Watts usage an error...because Intel calls the TDP at 15 Watts...and 0.5 Watts total draw seems to indicate a deactivated core. Thing is, you're hell bent on saying the iGPU is valuable...so disabling it would mean that you are agreeing that removing it would have been better.



Now, your anecdotal overclocking. You either don't understand words, or don't understand that this wasn't about numbers. Not sure which. My point was that removing 15 Watts of thermally dissipated power from a package would most definitely allow you to clock higher on cores...if you don't believe me, explain why modern Intel CPUs do this by having a higher boost maximum on a single core...and a lower all-core overclock. Barring that, explain why it makes sense to claim that including an iGPU but disabling it makes sense...when the point was that removing an iGPU would let you clock higher.

Barring any of that, please point me to your supplier. I feel as though there's some alteration to my mind that is required to understand how you can misconstrue this so hard...and I feel like whatever is altering your perception would be a heck of an experience. Hopefully it's something like crystals or the magic healing fields of plants...because I don't need to spend time avoiding officers. I hope this final moment of levity is useful for you, cause I needed the joke to not just be disgusted with this.
 
Joined
Feb 1, 2019
Messages
3,684 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Since I've been using XTU a lot today: it was struggling to register even 1 watt of iGPU usage.
 

ir_cow

Staff member
Joined
Sep 4, 2008
Messages
4,613 (0.77/day)
Location
USA
My sigh is because you are being completely idiotic in asking for numbers...and then you ignore the numbers.
Well, you have a lot of text there, and honestly I'm not going to read all of it. Mainly because your numbers don't really make sense, and that's why I was asking where you got them from originally.

It's like a mixture of you citing official sources and just pulling numbers that you got from a wall meter / made up. Those values you got from your personal investigation may be correct to you, but they seem... odd, to put it nicely. Thus your entire argument that the iGPU is a real source of the CPU's power draw is completely invalid. As I said before, the iGPU is not even enabled unless you're using it. Having an iGPU on the silicon or not isn't a significant factor in your power draw.
 
Joined
Apr 2, 2011
Messages
2,856 (0.57/day)
Well, you have a lot of text there, and honestly I'm not going to read all of it. Mainly because your numbers don't really make sense, and that's why I was asking where you got them from originally.

It's like a mixture of you citing official sources and just pulling numbers that you got from a wall meter / made up. Those values you got from your personal investigation may be correct to you, but they seem... odd, to put it nicely. Thus your entire argument that the iGPU is a real source of the CPU's power draw is completely invalid. As I said before, the iGPU is not even enabled unless you're using it. Having an iGPU on the silicon or not isn't a significant factor in your power draw.

I...am baffled. Each number, barring the 65 Watts I've cited, is sourced. If you spent half a second, you'd see the sources are all linked...but that's obviously too much.

Each extrapolation is cited.

Now, instead of addressing the various sources (including the cite for the GPU TDPs), you want to just not read after asking for the numbers. Cool...just don't assume that your disrespect for the time I spent giving you everything is ever going to be repaid.
Alternatively, let me ask you exactly why you distrust the TDPs and data from your own website, as most of it links back to here. You'll note that's a statement rather than a question. It's because I am showing you the same level of respect you are showing me. Conversation at you. It's kind of fun that all of this is because people can't read and comprehend the source, read their own interpretations into what I said, and believe something as stupid as that I said iGPUs are useless...after I've stated flatly, multiple times, that they are not useless in all cases. That said, if you want to replace your gaming dGPU with an iGPU, I implore you to explain the experience.

Also, so we are clear, I don't own a 11700k. I can't use my data...and it's only because of someone else that this chip is being cited specifically...


But, where's the fun in context and reading on a forum?



F*** it. After 12 years this place is finally Tom's Hardware then. Red team, green team, blue team, and the inability to check data but the desire to ask for it. Peace.
 

ir_cow

Staff member
Joined
Sep 4, 2008
Messages
4,613 (0.77/day)
Location
USA
@lilhasselhoffer first off, I jumped in halfway through your argument with AusWolf and really wasn't following whatever was being argued about in the first place, so don't bundle me into the "hate train". I was just replying to the stuff I quoted and not everything else you were talking about. I skimmed a bit and found oddities, which made me question your entire argument about everything.

I then pulled 250 Watts directly from my backside. Or not. The wall draw for PL2 is rated to 251 Watts with a TDP of 125 Watts, or about 50% efficiency on the conversion.
Here is an example of some of the confusing numbers. Like what?? Your wall meter is reading 250 watts and your PL2 rating is 251, but your TDP is 125? Somehow that translates to an iGPU drawing 30 watts? Complete nonsense to me. Maybe you explained this in the next post, but I am not part of that conversation. I was just pointing out A) the iGPU is part of the silicon, and B) it doesn't draw power when not enabled, which also means it has no impact on P/E core clocks (at least when it isn't enabled - some real investigation maybe needs to happen?).
 
Joined
Jan 14, 2019
Messages
13,237 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
You...are absolutely selective in quotation and missing the point of things...

When Intel quotes a CPU at 65 Watts TDP, and they quote an iGPU at 15 Watts of TDP, you think they share? Fine, no arguments (that's how I calculated it). 65-15 = 50 Watts from the CPU and 15 Watts from the iGPU. If we assume roughly the same efficiency, then:
(15/65)*250 = iGPU power draw
(50/65)*250 = CPU power draw
For the record, this is how I get the power draw for the iGPU. That's, for the 11700k, 125-15 = 110 Watts TDP going to the CPU and 15 Watts TDP going to the iGPU. Full draw is therefore easy to calculate on the iGPU because assuming roughly the same efficiency you have (15/125)*250 watts of electricity going to the iGPU and it's about 50% efficient with regards to the conversion of electricity to heat for the entire package (125/250). The 125 Watt TDP though seems to only be reached with specific instruction sets...so I pulled 65 Watts of TDP from my backside as a regular load where the iGPU is the bottleneck... because it seems about reasonable. That is arguable though...as any gaming experience is...because there are CPU bound games like Total War that might ping your CPU to the moon and on ultra performance setting might somehow not make the iGPU be the bottleneck...whereas most games seem to be the other way around.


Now about that dGPU...you understand some of the basics. Let me explain why I first calculated 100% scaling, then gave you 50%. You seem to have missed that bit. I was told that the iGPU somehow only used 0.5 Watts...and I'll take that at face value. 100% scaling for a 200 Watt card (because they said power draw, not TDP) would yield 200/0.5 = 400 iGPUs worth of hardware running on the same 200 Watt power draw as a middle of the road card. Assuming that scaling isn't 100%, and is in fact half, that's still 200 iGPUs worth of components that could be present with the same power draw.
What escapes you about having a 50% power efficiency still providing 200 times the hardware...because you seem to argue that it's not linear but even offering you 50% efficiency you can still have 200x the iGPU hardware on a dedicated video card for the same power draw...

Now, I described this as moon logic. If you use the quoted TDP of the hardware inside the 11700k's iGPU, and that's 15 Watts TDP, then we're looking at roughly what the ARC dGPUs offer. Let me quantify. 225 Watts TDP for the A750: TPU database, A750
If you then compare the UHD 750 at 15 Watts: TPU Database, UHD 750
225/15 = 15
3584 shaders / 256 shaders = 14x as many shaders on the A750
Why, by Jove, when they scale from iGPU to dGPU the scaling is 14:15, or 93%...and my example uses a 50% efficiency. Holy crap...93% of 400 = 372...so that iGPU to dGPU scaling should be 372 instead of 200. It's also so close that your argument evaporates when the statement was that the iGPU only uses 0.5 watts...and my point was that the 0.5 Watts was moon logic because the real math (14:15) basically agrees with the 15 Watts TDP quoted.


You're welcome to quibble over stupid things. In my book that's idiotic, but at a 50% efficiency the dGPU scale would have literally beat the pants off of a 4080...let alone 93%. When I use the right numbers the iGPU scales well with the dGPU...and unsurprisingly it's worth considering. So...is the 0.5 Watts usage an error...because Intel calls the TDP at 15 Watts...and 0.5 Watts total draw seems to indicate a deactivated core. Thing is, you're hell bent on saying the iGPU is valuable...so disabling it would mean that you are agreeing that removing it would have been better.



Now, your anecdotal overclocking. You either don't understand words, or don't understand that this wasn't about numbers. Not sure which. My point was that removing 15 Watts of thermally dissipated power from a package would most definitely allow you to clock higher on cores...if you don't believe me, explain why modern Intel CPUs do this by having a higher boost maximum on a single core...and a lower all-core overclock. Barring that, explain why it makes sense to claim that including an iGPU but disabling it makes sense...when the point was that removing an iGPU would let you clock higher.

Barring any of that, please point me to your supplier. I feel as though there's some alteration to my mind that is required to understand how you can misconstrue this so hard...and I feel like whatever is altering your perception would be a heck of an experience. Hopefully it's something like crystals or the magic healing fields of plants...because I don't need to spend time avoiding officers. I hope this final moment of levity is useful for you, cause I needed the joke to not just be disgusted with this.
It's fascinating that you're still under the assumption that the iGPU is under 100% load all the time.

Let me simplify:
When you use your iGPU for 3D applications in a full-load situation, it'll eat 15 W, so your 65 W CPU will have 65 - 15 = 50 Watts left to work with.
When your iGPU is idle, it'll need maybe 0.1 W on a new CPU, so the rest of your CPU will have 65 - 0.1 = 64.9 W.

The CPU eats into the iGPU's power headroom as well, not just the other way around. The iGPU needs that 15 W only when it's under 100% load, which it will never be when you have a dGPU in the system.
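
As a toy model of that sharing (the 65 W limit, the ~0.1 W idle figure and the 15 W full-load figure are the ones from this post; the linear ramp in between is just an assumption for illustration):

```python
# Toy model: one shared package power limit, with the iGPU's slice depending on load.
PACKAGE_LIMIT = 65.0   # W
IGPU_IDLE     = 0.1    # W, roughly what a modern iGPU needs when doing nothing
IGPU_FULL     = 15.0   # W, the iGPU's full-load budget

def cpu_headroom(igpu_load: float) -> float:
    """Watts left for the CPU cores at a given iGPU load (0.0 to 1.0)."""
    igpu_draw = IGPU_IDLE + igpu_load * (IGPU_FULL - IGPU_IDLE)
    return PACKAGE_LIMIT - igpu_draw

for load in (0.0, 0.5, 1.0):
    print(f"iGPU at {load:4.0%} load -> {cpu_headroom(load):5.1f} W left for the cores")
```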

Are we clear now? Or are you so frugal on your overclocking that even that 0.1 W of heat counts?
 
Joined
Dec 25, 2020
Messages
7,215 (4.88/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
I'm sighing because you are all acting idiotically, intentionally obtuse, or demonstrating that you cannot understand an idea more complex than A->B...because mine is: A->B happened, and based on that we can extrapolate that C->D will end badly, given the relationship of A and B.

Please, get this through your heads collectively. You seem to not get it...so let me explain one more time for the slow crowd, or those who really want to not read.

The comment replied to was that we should remove E cores to boost the performance of P cores. If you can understand that, then we're good with step one.
The reply was: careful what you wish for. A DECADE AGO, the same ask was made of Intel when they first started integrating iGPUs into CPUs that were specifically designed for gaming...and thus would basically be guaranteed to be paired with a dGPU.
If you had a dedicated GPU, and before the modern features that allow the iGPU to be used or selected based upon load, the iGPU was functionally a drain on potential overclocking...and not relevant for anyone running a gaming setup (which was the target market for a consumer-grade 2600K, as opposed to either a workstation CPU or a lower-end "white box" office PC).

I said this because, a decade ago, the complaint was that silicon on the monolithic die was being lost to an iGPU...and the ask to remove it from the monolithic silicon chunk was meant to either support better overclocking or lower temperatures. It's functionally the same ask as someone today asking to disable E cores for more, or better-performing, P cores.



My sigh is because you are being completely idiotic in asking for numbers...and then ignoring the numbers. You wanted me to admit they were artificial so you could dismiss them without actually doing anything...but I had the data, so you had to go the apathy route instead. You then make the same idiotic statement that "my iGPU is useful," as though I said it had no use...because you're either a troll or incapable of reading. I start with a sigh because I'm not sure which you are...but at this point it's assured you're one of the two. I gave you the benefit of the doubt by providing data...and it was a waste of my time. I sigh because you aren't worth talking to when you insist on arguing against a point I never made.



You...are absolutely selective in quotation and missing the point of things...

When Intel quotes a CPU at 65 Watts TDP, and they quote an iGPU at 15 Watts of TDP, you think they share? Fine, no arguments (that's how I calculated it). 65 - 15 = 50 Watts for the CPU cores and 15 Watts for the iGPU. If we assume roughly the same efficiency, then:
(15/65)*250 = iGPU power draw
(50/65)*250 = CPU power draw
For the record, this is how I get the power draw for the iGPU. For the 11700K, that's 125 - 15 = 110 Watts of TDP going to the CPU cores and 15 Watts of TDP going to the iGPU. Full draw on the iGPU is then easy to estimate: assuming roughly the same efficiency, (15/125)*250 Watts of electricity go to the iGPU, with the whole package converting electricity to heat at about 50% (125/250). The 125 Watt TDP only seems to be reached with specific instruction sets, though...so I pulled 65 Watts of TDP out of my backside as a regular load where the iGPU is the bottleneck, because it seems about reasonable. That is arguable, as any gaming scenario is...there are CPU-bound games like Total War that might ping your CPU to the moon and, on ultra-performance settings, might somehow not make the iGPU the bottleneck...whereas most games seem to be the other way around.
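As a minimal sketch of that split in Python (the ~250 W wall figure and the equal-efficiency assumption are the post's own, not measured data):

Code:
# The proportional split described above: assume each block's share of wall
# power matches its share of the package TDP. These are the post's assumptions.
def split_wall_power(wall_w, package_tdp_w, igpu_tdp_w):
    """Rough wall-power attribution for the iGPU and the CPU cores."""
    igpu_w = (igpu_tdp_w / package_tdp_w) * wall_w
    cpu_w = ((package_tdp_w - igpu_tdp_w) / package_tdp_w) * wall_w
    return igpu_w, cpu_w

print(split_wall_power(250.0, 125.0, 15.0))  # 11700K at its 125 W TDP -> (30.0, 220.0)
print(split_wall_power(250.0, 65.0, 15.0))   # the assumed 65 W "gaming" TDP -> (~57.7, ~192.3)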


Now about that dGPU...you understand some of the basics. Let me explain why I first calculated 100% scaling, then gave you 50%. You seem to have missed that bit. I was told that the iGPU somehow only used 0.5 Watts...and I'll take that at face value. 100% scaling for a 200 Watt card (because they said power draw, not TDP) would yield 200/0.5 = 400 iGPUs' worth of hardware running on the same 200 Watt power draw as a middle-of-the-road card. Assuming that scaling isn't 100%, and is in fact half, that's still 200 iGPUs' worth of components that could be present with the same power draw.
What escapes you about 50% scaling efficiency still providing 200 times the hardware? You argue that scaling isn't linear, but even granting you 50% efficiency, you could still have 200x the iGPU hardware on a dedicated video card for the same power draw...
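A sketch of that counting argument in Python, taking the quoted 0.5 W figure and a 200 W card at face value, exactly as the post does:

Code:
# How many "iGPUs worth of hardware" fit in a 200 W card if one iGPU really
# only drew 0.5 W, at a given scaling efficiency. Inputs are the post's claims.
def igpus_in_budget(card_power_w, igpu_power_w, scaling_efficiency):
    """Number of iGPU-sized blocks that fit in the card's power budget."""
    return (card_power_w / igpu_power_w) * scaling_efficiency

print(igpus_in_budget(200.0, 0.5, 1.0))  # perfect scaling -> 400 blocks
print(igpus_in_budget(200.0, 0.5, 0.5))  # 50% scaling     -> 200 blocks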

Now, I described this as moon logic. If you use the quoted TDP of the hardware inside the 11700K's iGPU, and that's 15 Watts TDP, then we're looking at roughly what the Arc dGPUs offer. Let me quantify. 225 Watts TDP for the A750: TPU database, A750
If you then compare the UHD 750 at 15 Watts: TPU Database, UHD 750
225/15 = 15
3584 shaders / 256 shaders = 14x as many shaders on the A750
Why, by Jove, when you scale from iGPU to dGPU, the ratio is 15:14, or about 93%...and my example used a 50% efficiency. Holy crap...93% of 400 is roughly 373...so that iGPU-to-dGPU scaling should give 373 instead of 200. It's also so close that your argument evaporates: the statement was that the iGPU only uses 0.5 Watts, and my point was that the 0.5 Watts was moon logic, because the real math (15:14) basically agrees with the 15 Watts of TDP quoted.
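The same sanity check as code (Python), using only the TDP and shader counts quoted above from the TPU database entries:

Code:
# Ratio check between the UHD 750 iGPU and the Arc A750, using the figures
# quoted above (15 W vs 225 W TDP, 256 vs 3584 shaders).
uhd750_tdp_w, a750_tdp_w = 15.0, 225.0
uhd750_shaders, a750_shaders = 256, 3584

power_ratio = a750_tdp_w / uhd750_tdp_w        # 15x the power budget
shader_ratio = a750_shaders / uhd750_shaders   # 14x the shaders

print(power_ratio, shader_ratio, shader_ratio / power_ratio)  # 15.0, 14.0, ~0.93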


You're welcome to quibble over stupid things. In my book that's idiotic, but at 50% efficiency a dGPU scaled that way would have literally beaten the pants off a 4080...let alone at 93%. When I use the right numbers, the iGPU scales well to the dGPU...and unsurprisingly, it's worth considering. So...is the 0.5 Watts figure an error? Intel rates the TDP at 15 Watts, and 0.5 Watts of total draw seems to indicate a deactivated core. Thing is, you're hell-bent on saying the iGPU is valuable...so disabling it would mean that you are agreeing that removing it would have been better.



Now, your anecdotal overclocking. You either don't understand the words, or you don't understand that this wasn't about numbers. Not sure which. My point was that removing 15 Watts of thermally dissipated power from a package would most definitely let you clock the cores higher...if you don't believe me, explain why modern Intel CPUs do exactly this by allowing a higher boost maximum on a single core and a lower all-core clock. Barring that, explain why including an iGPU but disabling it makes sense...when the point was that removing the iGPU would let you clock higher.

Barring any of that, please point me to your supplier. I feel as though some alteration of my mind is required to understand how you can misconstrue this so hard...and I feel like whatever is altering your perception would be a heck of an experience. Hopefully it's something like crystals or the magic healing fields of plants...because I don't need to spend time avoiding officers. I hope this final moment of levity is useful for you, because I needed the joke to keep from just being disgusted with this.

It seems to me that you do not understand how power gating logic or hardware scaling across die sizes and process nodes works. The power used by the integrated graphics is effectively meaningless until the graphics engine is under heavy load, thanks to the extremely efficient power gating in modern semiconductor designs. The TDP rating quoted by Intel is an abstract number intended to estimate how much power this piece of silicon should use under a typical workload; it is not a nominal, permanent penalty imposed on the processor - i.e. it's not always using 15 W, because the power management logic can switch the corresponding transistors on and off in real time. This gating is so efficient that the power drawn by these transistors in a "sleep" state is effectively zero.
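As a rough picture of what that gating means for the numbers (Python); the duty-cycle fractions below are illustrative assumptions, not measurements:

Code:
# Rough model of why a power-gated iGPU barely registers in package power:
# average power is its full-load power times the fraction of time the gating
# logic actually has it switched on. The fractions below are made up.
def average_igpu_power(active_fraction, active_power_w, gated_power_w=0.0):
    """Time-averaged iGPU power with power gating between bursts of activity."""
    return active_fraction * active_power_w + (1.0 - active_fraction) * gated_power_w

print(average_igpu_power(0.00, 15.0))  # fully gated behind a dGPU -> 0.0 W
print(average_igpu_power(0.02, 15.0))  # brief bursts (e.g. video decode) -> 0.3 W
print(average_igpu_power(1.00, 15.0))  # sustained 3D load -> 15.0 W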

On desktop processors without a power limit imposed, the integrated GPU can consume significantly more than its rated wattage; I've had it go as high as 80 W on an i7-4770K. It is only then that the heat generated by the iGPU becomes a concern, and in my experience, it did not affect the CPU portion's general clockability, at least back then. My new i9-13900KS has UHD 770 graphics, but I can't say I've even bothered using it yet. There's little point in doing so. I only really tested QSV on it; it works fine.

Arc does not use the Gen 12.2 Xe cores; the Alchemist processor is built on a different node with a newer IP generation than the UHD Graphics 770 (and its 750 predecessor), which means their capabilities and thermal properties are completely different.
 
Joined
Jul 20, 2020
Messages
1,166 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
On desktop processors without a power limit imposed, the integrated GPU can consume significantly more than its rated wattage; I've had it go as high as 80 W on an i7-4770K. It is only then that the heat generated by the iGPU becomes a concern, and in my experience, it did not affect the CPU portion's general clockability, at least back then.

!!!

OK, how did you get an Intel iGPU to use that much power? I've been using and monitoring power levels in Intel iGPUs for about 8 years now and have never seen one consume more than 20 W. But then, I've never had a -K SKU, so max listed clocks are the highest I ever see. Can you overclock Intel iGPUs?
 
Joined
Dec 25, 2020
Messages
7,215 (4.88/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
OK, how did you get an Intel iGPU to use that much power? I've been using and monitoring power levels in Intel iGPUs for about 8 years now and have never seen one consume more than 20 W. But then, I've never had a -K SKU, so max listed clocks are the highest I ever see. Can you overclock Intel iGPUs?

Yes, you can/could overclock and overvolt them too, at least on Haswell K SKUs with a Z87 chipset motherboard. The i7-4770K's flavor of HD 4600 runs at 1.25 GHz stock, but if I recall correctly, I was running it at 1850 MHz. I paired it with DDR3-2400, then tried to run Fallout 4 on it. Power consumption shoots up like a rocket.
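For a rough sense of why the draw climbs so fast: dynamic power scales roughly with frequency times voltage squared. The baseline wattage and both voltages in this sketch are made-up placeholders, not the actual Haswell settings:

Code:
# Rule-of-thumb estimate of how an iGPU overclock plus overvolt inflates power.
# Dynamic power ~ f * V^2. The 20 W baseline and both voltages are placeholders.
def scaled_power(base_power_w, base_mhz, base_v, oc_mhz, oc_v):
    """Scale a baseline power figure by the f * V^2 rule of thumb."""
    return base_power_w * (oc_mhz / base_mhz) * (oc_v / base_v) ** 2

# 1250 MHz stock pushed to 1850 MHz with a hypothetical 1.00 V -> 1.25 V bump
print(scaled_power(20.0, 1250.0, 1.00, 1850.0, 1.25))  # ~46 W, before leakage/thermal effects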
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
AW + WoT (games) = 7 W maximum
web news = 0.3 W maximum
YouTube = ~1 W maximum if you watch 4K@60

In addition, the presence of the iGPU increases the contact surface between the die and the IHS, which helps overclocking. In my opinion.

On this system, the iGPU is all I need. Without it, that's extra money spent on a dedicated video card instead of on other fun.

 
Joined
Apr 2, 2011
Messages
2,856 (0.57/day)
@lilhasselhoffer first off, I jumped in halfway through your argument with AusWolf and really wasn't following whatever was being argued about in the first place, so don't bundle me into the "hate train". I was just replying to the stuff I quoted and not everything else you were talking about. I skimmed a bit and found oddities, which made me question your entire argument.


Here is an example of some of the confusing numbers. Like what?? Your wall meter is reading 250 watts and your PL2 rating is 251, but your TDP is 125? Somehow that translates to an iGPU drawing 30 watts? Complete nonsense to me. Maybe you explained this in the next post, but I am not part of that conversation. I was just pointing out that A) the iGPU is part of the silicon, and B) it doesn't draw power when not enabled, which also means it has no impact on P/E core clocks (at least when it isn't enabled - some real investigation maybe needs to happen?).

You are asking for a discussion midway through to restart. Cool.

Let me explain physics 101. A processor takes in electrical energy. It outputs some work, and heat. Good with the 101 now?
TDP is thermally dissipated power.
The electrical energy pulled from the wall is not TDP. It's electrical Watts. Because your processor doesn't just output heat, TDP < electrical draw.
This is how your CPU has an electrical draw, an output of thermal energy in Watts, and an efficiency of conversion from Watts (electrical) to Watts (thermal), where the input energy in electrical Watts is immensely dependent upon package temperature.


So, you pull 251 Watts at the wall (more than that, but 251 Watts for the processor package). That electrical energy is converted into heat, or thermally dissipated power, or TDP. This is how you get a package TDP of 125 Watts.
Now...that 125 is split across everything on the package: the CPU cores, the iGPU, and anything else. Intel provides data...that I linked to...which says the iGPU alone has a thermal output of 15 Watts. The electrical input is easy to estimate from that, because 125/251 is about 50%. This means that if the iGPU is as efficient as the entire package, 15 Watts of TDP = roughly 30 Watts of input electrical energy.
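In code form, the same conversion the post describes (Python), restating its assumption that the iGPU shares the package's wall-power-to-TDP ratio:

Code:
# The post's own conversion: use the package-level ratio of wall draw to TDP
# (251 W / 125 W, roughly 2:1) to turn the iGPU's 15 W TDP into an estimated
# electrical draw. This restates the post's assumption, nothing more.
package_tdp_w, package_wall_w = 125.0, 251.0
igpu_tdp_w = 15.0

wall_per_tdp = package_wall_w / package_tdp_w   # ~2.0
igpu_electrical_w = igpu_tdp_w * wall_per_tdp   # ~30 W
print(round(wall_per_tdp, 2), round(igpu_electrical_w, 1))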


Now, let me explain the rest of this. I stated that the 0.5 Watts of electrical input quoted to me for the iGPU was moon logic...and I explained that if said moon logic were true, Intel simply building a dGPU out of blocks that each drew 0.5 Watts would produce a card better than the 4080 at only 200 Watts of input power... Which is moon logic, because the A750 I quoted shows that if you scale their iGPU up to the same number of parts as their dGPU, it's 14x the components for 15x the input power - essentially 1:1 scaling. Cool? It's almost like my point was, has been, and will be that an iGPU is not useful in a situation where you will always assume the presence of a dGPU.

Now my final bit. You ask for numbers. Cool. "You're wrong and I won't read" is the response after this... My retort is that you haven't done any of the work. Not a conspiracy; this is you. I'm OK with you, personally, demonstrating a lack of understanding. I'm not OK with "too long, didn't read, and I don't care." It tells me that you have no respect and don't want to put forward the effort to understand, but believe that asking for sources and then disregarding them is...an acceptable way to treat your fellow posters. That's Tom's Hardware level of screaming discourse, where numbers don't matter. I...respectfully suggest that I'm done with you. It's because you've shown disrespect instead of asking...because if at any point you had simply asked why the wattages differ on the same chip package, I'd have cared enough to explain. Instead, it's fine to ask for sources, disregard them, and claim that I'm paranoid and should have endlessly restarted and written another million miles of text to explain...while writing too much would prevent you from reading.


I sigh because I'm tired. I sigh because at the end of the day you expect respect, but show none. That's you...and I'm done explaining the concept of iGPU = lower overclock or fewer cores, as it is pretty much an idiotically simple thought. I...said all of this because I literally responded to someone stating that wishing for fewer E cores to boost P core overclocks was not going to end well, based upon the track record Intel has with iGPUs (which started a decade ago).
All of this...is just tiring. You could have argued that assuming 65 Watts of actual gaming load against the 125 Watt TDP was irrational... You could have argued that the iGPU is more or less efficient than the CPU cores...and maybe cited something I missed. You could have suggested anything but "I refuse to read," and it would have been a challenge to respond to rather than spit in my face. But no. As such, I sigh once more. Tom's Hardware levels of discourse are...for those young and stupid enough to expend energy on fighting for nothing. I prefer the touch of grass on my feet, and the imagination that someday I'll find a discussion half as rewarding as I used to find here. That said, I hate all of the teams. I buy what makes sense, I want monopolies to die by fair competition, and I believe that competition through capitalism is what benefits the consumer...I also believe that Intel believes they know the consumer better than we know ourselves, which is why my last purchases were AMD...and I'm not looking forward to anything new, because stuff like Raptor Lake is just...a nothing burger.
 