
Intel Core i5 661 3.3 GHz GPU Performance Analyzed Review


Introduction

First of all, I would like to thank our friends at ASUS, who provided the CPU and motherboards used. Without their help, it wouldn't have been possible to create this article.

Intel has just announced their latest lineup of Nehalem-based processors. Clarkdale, as the new processor is known to engineers, is the first commercially available 32 nm processor. It is also the first processor to feature a graphics processing core inside the processor's package - something first heard about when AMD talked about their Fusion project. It should be noted, however, that Intel did not put both CPU and GPU onto the same die of silicon.



Instead, they took the 32 nm processor core and the 45 nm GPU core and crammed them into a single processor package. This approach is called a multi-chip module (MCM).

Let's quickly talk about the processor core. It is a dual-core design that features Hyper-Threading technology, effectively giving you support to execute four threads at the same time. On the cheapest processor model, the Pentium G6950, the HT feature has been disabled to differentiate it further from the more expensive offerings. Unlike other Nehalem-based CPUs such as Lynnfield, the memory controller is not integrated into the CPU die but sits inside the GPU silicon, which results in increased memory latencies. You can find much more information on the processor in our colleagues' reviews.
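As a quick illustration, here is a minimal Python sketch (assuming the third-party psutil package is available) that shows how Hyper-Threading exposes two logical CPUs per physical core:

import psutil  # third-party package, assumed installed

physical = psutil.cpu_count(logical=False)  # 2 physical cores on the i5 661
logical = psutil.cpu_count(logical=True)    # 4 hardware threads with HT enabled
print(f"{physical} cores, {logical} hardware threads")
# On the Pentium G6950, where HT is disabled, both values would read 2.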



Intel's graphics core is based on 45 nm technology and features 177 million transistors on a die size of 114 mm². You could imagine it as an improved G45 chipset (including the memory controller) with some magic sauce to make everything work in the CPU package. The GPU is clocked at 533, 733 or 900 MHz depending on the processor model; our tested i5 661 features the highest GPU clock speed available without overclocking, 900 MHz.

Intel also increased the number of execution units (shaders) from 10 to 12, and the HD video decoder is now able to decode two streams at the same time for picture-in-picture, as used on some Blu-rays to show the director's commentary. HD audio via HDMI is supported as well, which makes setting up a media PC easier and puts this solution on the same level as the latest offerings from AMD and NVIDIA. Whereas the mobile GPU version features advanced power-saving techniques like dynamic clock scaling (think: EIST for GPUs) and overheat downclocking, these features are not available on the desktop part.

Intel's GPU clock is linked to the processor's BCLK. So if you use the standard CPU overclocking method of raising the BCLK, you should be aware that your graphics core clock will go up too. To get the maximum out of overclocking, you need a motherboard BIOS that lets you select a lower multiplier for the VGA clock.
The actual clock is calculated as follows: BCLK / 4 * GPU_PLL_Multiplier. For example, at default clocks: 133 / 4 * 27 = 900 MHz.
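As a quick sanity check, here is a tiny Python sketch of that formula (the helper function is purely illustrative):

def gpu_clock_mhz(bclk_mhz, gpu_pll_multiplier):
    # Clarkdale GPU clock = BCLK / 4 * PLL multiplier
    return bclk_mhz / 4 * gpu_pll_multiplier

print(gpu_clock_mhz(133.33, 27))  # ~900 MHz at default clocks
print(gpu_clock_mhz(160, 27))     # 1080 MHz - raising BCLK raises the GPU clock too
print(gpu_clock_mhz(160, 22))     # 880 MHz - a lower multiplier brings it back down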
ASUS offers a nice GPU overclocking suite with their motherboards: you may select the VGA multiplier in the BIOS, and their TurboV EVO overclocking software lets you easily change VGA clocks from within Windows without a reboot.

The graphics die and the processor die are connected using Intel's QPI interconnect, which offers plenty of bandwidth and reasonable latencies for this marriage to turn out great. To transmit the display output signal to your monitor, Intel introduced a new bus called FDI, which runs from the GPU in the CPU package to the motherboard's H55/H57/Q57 chipset, which then handles all the HDMI/DVI/VGA/DP output signal generation. If you don't have a chipset with FDI, you can't use the GPU part of the processor - but the CPU itself will work. Earlier P55-based motherboards require a BIOS update to properly detect the CPU.

It should be noted that Intel only supports dual link on the DisplayPort interface, which means that a 2560x1600 30" DVI monitor will not be able to run at its native resolution (happened to me, sucks). HDMI is limited to 1920x1200, which should be fine for most usage scenarios.
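To summarize those output limits, here is a small Python sketch; the resolution caps come from this page, while the helper itself is hypothetical:

# Maximum resolutions per output, as described above
MAX_RESOLUTION = {
    "DVI": (1920, 1200),          # single link only - no dual-link DVI
    "HDMI": (1920, 1200),
    "DisplayPort": (2560, 1600),  # the only output with dual-link support
}

def can_drive(output, width, height):
    max_w, max_h = MAX_RESOLUTION[output]
    return width <= max_w and height <= max_h

print(can_drive("DVI", 2560, 1600))          # False - the 30" monitor problem
print(can_drive("DisplayPort", 2560, 1600))  # True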

Next Page » Test Setup