NVIDIA clearly has a winner on their hands with the GTX 680. The new card is based on NVIDIA's GK104 graphics processor, which introduces the Kepler architecture, and represents a significant leap forward in both performance and GPU technology. As its name reveals, GK104 is technically an upper mid-range GPU, not a pure high-end part; following NVIDIA's naming convention, such a chip would be called GK100. This subtle difference makes the GeForce GTX 680 even more impressive, since strictly speaking we would have to compare it to the GTX 560 Ti, not the GTX 580. Even against the GeForce GTX 580, the performance improvement of the GTX 680 is excellent: averaged over our testing, it delivers 16% higher performance (+37% vs. the GTX 560 Ti!) and easily beats AMD's HD 7970. Achieving such performance levels today requires improved performance per Watt, as modern high-end graphics cards are limited by power consumption and heat output.
NVIDIA claims massively improved power efficiency with their latest architecture, and our testing confirms it. Compared to previous-generation cards, we see a performance-per-Watt improvement of over 30%. Every power measurement has gone down, yet performance has gone up. Very nicely done.
Reduced power consumption is the foundation that NVIDIA's new dynamic overclocking algorithm is built upon. Instead of a single fixed GPU clock, NVIDIA now markets two values: the base clock (1006 MHz), which is the guaranteed minimum clock speed during normal gaming, and the boost clock (1058 MHz), which is the average clock speed reached in those scenarios. Based on real-time monitoring of board power, temperature, and GPU load, the driver picks a clock target beyond the base clock, up to 1110 MHz. This approach enables a performance boost in most games, yet ensures that the card does not overheat or draw too much power in more demanding scenes or titles. In this review we did extensive testing of NVIDIA's boost clock algorithm and can say that it works exceedingly well for consumers who just want to install their new card and start playing games. Things get more complicated for enthusiasts looking to squeeze the maximum out of their card. We confirmed that dynamic overclocking does not eat up all the overclocking headroom on the GeForce GTX 680: with manual overclocking we were able to increase performance by another 12% over the out-of-the-box experience, which already uses dynamic overclocking. A few gotchas remain with dynamic OC, for example that it cannot be turned off. Boost levels also depend on temperature, which means there may be a small performance drop when the card is used in a badly ventilated case. Last but not least, because dynamic OC is based on physical measurements, it is dependent on component tolerances: two cards you buy will perform differently out of the box. I am worried that we might see people buying ten cards, picking the fastest one, and returning the rest, which would turn this into a nightmare for retailers. Remember, this applies even to cards on which no manual overclocking is done.
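To make the monitoring loop more concrete, here is a minimal sketch of a boost-clock controller in the spirit of the mechanism described above. The clock values are the ones NVIDIA publishes for the GTX 680, but the power and temperature limits, the margin windows, and the 13 MHz bin size are illustrative assumptions of ours, not NVIDIA's actual algorithm or specifications.

```python
# Hypothetical boost-clock controller sketch. Only BASE_CLOCK, the boost
# ceiling, and the general idea (power/temp margins decide the clock)
# come from the review; all limits and windows below are assumptions.

BASE_CLOCK = 1006   # MHz, guaranteed minimum under normal gaming load
MAX_BOOST = 1110    # MHz, highest boost clock observed in testing
BIN_SIZE = 13       # MHz per boost bin (assumed)
POWER_LIMIT = 195   # W, assumed board power limit
POWER_WINDOW = 30   # W below the limit over which boost is scaled back
TEMP_LIMIT = 98     # degrees C, assumed thermal limit
TEMP_WINDOW = 15    # degrees C below the limit over which boost scales

def _clamp01(x):
    return min(1.0, max(0.0, x))

def boost_target(board_power_w, gpu_temp_c):
    """Pick a clock target between base clock and maximum boost.

    The remaining power and thermal margins are each normalized to a
    window below their limit; the most constrained one decides how many
    boost bins above base clock the card may run.
    """
    power_margin = _clamp01((POWER_LIMIT - board_power_w) / POWER_WINDOW)
    temp_margin = _clamp01((TEMP_LIMIT - gpu_temp_c) / TEMP_WINDOW)
    margin = min(power_margin, temp_margin)   # tightest limit wins
    headroom = MAX_BOOST - BASE_CLOCK
    bins = int(margin * headroom / BIN_SIZE)  # whole bins only
    return BASE_CLOCK + bins * BIN_SIZE

print(boost_target(150, 70))   # plenty of margin -> 1110 (max boost)
print(boost_target(190, 95))   # near both limits -> 1019
print(boost_target(200, 99))   # limit exceeded  -> 1006 (base clock)
```

Note how this toy model also reproduces the two gotchas mentioned above: the output depends on measured temperature (a hot case means fewer boost bins), and on the accuracy of the power measurement itself, which varies with component tolerances from card to card.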
With lower power consumption usually comes reduced fan noise, too. NVIDIA did reduce the noise levels of their cooler a bit, but it is still far from what I would consider a quiet experience, especially in 3D. Custom-design cards will hopefully address that. Another area that has seen improvement is NVIDIA's display output logic, which now supports up to four active displays. This makes it easier than ever to set up a multi-monitor gaming system with just one NVIDIA card. NVIDIA has also introduced new software features like adaptive V-Sync, which will help reduce stutter and tearing in games, and new anti-aliasing modes that will make them look even prettier.
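The adaptive V-Sync idea can be sketched in a few lines. This is our own illustration of the concept, with hypothetical names rather than NVIDIA's driver API: V-Sync stays enabled while the game sustains the display's refresh rate, and syncing is switched off once the frame rate falls below it, avoiding the sudden drop to half the refresh rate that fixed V-Sync causes.

```python
# Illustrative sketch of adaptive V-Sync; names are our own, not
# NVIDIA's driver API.

REFRESH_RATE = 60  # Hz, assumed display refresh rate

def vsync_enabled(current_fps):
    """With traditional V-Sync forced on, output quantizes to 60/30/20
    FPS when the GPU falls behind; adaptive V-Sync instead disables
    syncing below the refresh rate, trading a little tearing for
    smoother frame delivery."""
    return current_fps >= REFRESH_RATE

for fps in (75, 60, 45):
    mode = "on" if vsync_enabled(fps) else "off"
    print(f"{fps} FPS -> V-Sync {mode}")
```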
At $499, pricing of the GeForce GTX 680 is reasonable. Compared to AMD's $549 HD 7970, the GTX 680 is the clear winner: it offers better performance at similar power consumption and comes with more features. Looking at the board design, I am sure there is still plenty of headroom for price reductions. A price war with AMD would be great for customers, resulting in better pricing in these tough economic times. Let's hope it happens.