| Component | Spec |
|---|---|
| Processor | Various Intel and AMD CPUs |
| Motherboard | Micro-ATX and mini-ITX |
| Cooling | Yes |
| Memory | Anything from 4 to 48 GB |
| Video Card(s) | Various Nvidia and AMD GPUs |
| Storage | A lot |
| Display(s) | Monitors and TVs |
| Case | The smaller the better |
| Audio Device(s) | Speakers and headphones |
| Power Supply | 300 to 750 W, bronze to gold |
| Mouse | Wireless |
| Keyboard | Wired |
| VR HMD | Not yet |
| Software | Linux gaming master race |
Exactly this. Intel's iGPU is not part of the CPU cores; it sits in the same package, but it is a completely separate block. You can't get anything extra out of it, as evidenced by the "F" processors, where the iGPU is disabled because it is defective. It is completely inactive (laser-cut), and yet these chips overclock no better than their siblings with a working iGPU.
Get any CPU with an iGPU, use it as a display adapter (meaning: have a dGPU in the system, but use the iGPU as the display adapter), and check the iGPU's power consumption in HWiNFO. I'd be very surprised if it was higher than about 0.5 W, even on an old Intel chip.

I... really hope that both of you are trolling. If not, it's pretty sad to see people reading the words written but not comprehending them at face value, so that you can assign your own meaning to them.
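As a rough cross-check of that HWiNFO figure on Linux (where HWiNFO isn't available), the iGPU's draw can be estimated from Intel's RAPL energy counters in sysfs. A minimal sketch, with the caveat that the `intel-rapl:0:1` path is an assumption: the `uncore` domain covers the graphics block on many consumer chips, but domain layout varies by generation, so check `/sys/class/powercap/intel-rapl*/name` on your own system first:

```python
import time
from pathlib import Path

# RAPL "uncore" domain on client Intel chips typically covers the iGPU.
# Hypothetical path; verify against /sys/class/powercap/intel-rapl*/name.
UNCORE_ENERGY = Path("/sys/class/powercap/intel-rapl:0:1/energy_uj")

def power_watts(e0_uj: int, e1_uj: int, dt_s: float) -> float:
    """Average power from two cumulative microjoule readings over dt_s seconds
    (ignores counter wraparound for brevity)."""
    return (e1_uj - e0_uj) / 1_000_000 / dt_s

def igpu_power(interval_s: float = 1.0) -> float:
    """Sample the uncore energy counter twice and return average watts."""
    e0 = int(UNCORE_ENERGY.read_text())
    time.sleep(interval_s)
    e1 = int(UNCORE_ENERGY.read_text())
    return power_watts(e0, e1, interval_s)

if __name__ == "__main__":
    if UNCORE_ENERGY.exists():
        print(f"iGPU (uncore) draw: {igpu_power():.2f} W")
    else:
        print("No uncore RAPL domain found; paths vary by CPU generation.")
```

On a system where the iGPU is idle or only scanning out a desktop, readings in the tenths-of-a-watt range are typical, which lines up with the ~0.5 W claim above.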
To clarify, I was commenting on the slight boost to clock speeds that could be gained. Old chips spent die area on an iGPU that could otherwise have gone to more transistors. That block was not entirely dark even when disabled, and it cut into overclocking headroom: fewer transistors available for the CPU cores, plus a big dead space on the die for anyone running a dGPU.
I was also commenting that Netburst was... not a bad thing. Hear me out, because that's a lot to say. What I mean is that modern CPUs are built on the lessons of Netburst and Bulldozer: your modern 8-core, 16-thread consumer CPUs exist because of lessons learned. I would be hard pressed to explain to someone rocking a modern 6+ core CPU why it "just works" without pointing out that what failed in the past is directly responsible for today's success. Likewise, AMD today uses CCXs instead of a monolithic chunk of silicon, the PCH is often a generation or two behind the main processor's lithography, and Windows even needed a scheduler update in the last five years to handle how both AMD and Intel assign programs to cores, because their innovations often lead to short-term issues.
I remember a wonderful time when I could buy a CPU and then spend hours overclocking it. Now I buy a CPU and it overclocks itself (well enough to be passable). I remember people claiming you'd never need more than a few cores, yet consumer hardware now ships with 6+ cores and some form of threading almost as standard. I remember asking Intel to stop gimping my gaming CPU with an iGPU incapable of running 640x480, so that I could eke out just a couple hundred more megahertz... which is the comment I was actually referring back to (and why I find it silly that you both want me to be wrong without ever considering that I might be speaking to something other than what you're projecting onto my comments). Alas, apparently it's asking too much for people not to fight over something I didn't say, never meant, and even joked about as me being too old.
Whatever, complete the loop here. I... don't understand the point of fighting about the value of an iGPU when I referred to it only as a means of slightly increasing CPU frequency by its exclusion... but why would context matter?
I have an 11700 in my HTPC, and I use its iGPU as a display adapter because I need HDMI 2.0 that the 1050 Ti in the system doesn't have. Locked to 65 W, the CPU boosts up to 2.8 GHz in a Cinebench all-core test. If I disable the iGPU, and hook the TV up to the 1050 Ti, the CPU does... well, 2.8 GHz.
My point stands: the iGPU, when unused or used only as a display adapter, draws so little power that it doesn't eat into your CPU's power and thermal headroom. This is not trolling; this is fact.
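A quick back-of-the-envelope calculation makes the same point, using the two figures already quoted in this thread (the ~0.5 W iGPU draw and the 65 W package limit on the 11700):

```python
# Share of a power-limited CPU's budget taken by a display-only iGPU,
# using the ~0.5 W and 65 W figures quoted in this thread.
package_limit_w = 65.0
igpu_draw_w = 0.5

share = igpu_draw_w / package_limit_w
print(f"iGPU share of power budget: {share:.1%}")  # well under 1%
```

At under one percent of the package budget, the difference is lost in run-to-run noise, which is consistent with the 2.8 GHz all-core result being identical either way.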
If you don't believe me, send me an 11700F, and I'll test it for you. I'm sure it'll boost up to 2.8 GHz as well.
Edit: I agree with you about Netburst and Bulldozer. Failure is a necessary step on the road to success.