| System Name | Z77 Rev. 1 |
| --- | --- |
| Processor | Intel Core i7 3770K |
| Motherboard | ASRock Z77 Extreme4 |
| Cooling | Water Cooling |
| Memory | 2x G.Skill F3-2400C10D-16GTX |
| Video Card(s) | EVGA GTX 1080 |
| Storage | Samsung 850 Pro |
| Display(s) | Samsung 28" UE590 UHD |
| Case | Silverstone TJ07 |
| Audio Device(s) | Onboard |
| Power Supply | Seasonic PRIME 600W Titanium |
| Mouse | EVGA TORQ X10 |
| Keyboard | Leopold Tenkeyless |
| Software | Windows 10 Pro 64-bit |
| Benchmark Scores | 3DMark Time Spy: 7695 |
Nice looking card! I really do like the over-engineered power delivery system.
On another note, your editor needs to step up a bit. I am by no means a professional in that field.
The "Lightning" series has been MSI's flagship series of high-end graphics cards geared toward overclockers and enthusiasts who are not afraid of using more exotic cooling forms like liquid nitrogen or dry ice.
Unlike the smaller version of the cooler, this one does not extend far beyond the back edge of the card, which allows it to fit in smaller cases.
On AMD cards, vendors are free to combine six TMDS links into any output configuration they want (dual-link DVI consuming two links); on NVIDIA cards, you are limited to two DVI outputs plus one additional HDMI/DP output.
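If it helps to picture that flexibility, here is a toy enumeration of output mixes that fit inside a six-link budget. The six-link figure and the two-links-per-dual-link-DVI cost come from the quoted sentence; the specific output names and the cap of three connectors per type are my own assumptions for illustration.

```python
from itertools import product

# Toy sketch of the AMD-side flexibility described above: a fixed budget of
# six TMDS links, with dual-link DVI costing two links and single-link
# outputs costing one. Output names and the per-type cap are assumptions.
LINK_BUDGET = 6
LINK_COST = {"dual-link DVI": 2, "single-link DVI": 1, "HDMI": 1}

def valid_configs(budget=LINK_BUDGET, max_per_type=3):
    """Yield output mixes whose total link cost fits within the budget."""
    names = list(LINK_COST)
    for counts in product(range(max_per_type + 1), repeat=len(names)):
        cost = sum(c * LINK_COST[n] for c, n in zip(counts, names))
        if 0 < cost <= budget:
            yield dict(zip(names, counts)), cost

for config, cost in valid_configs():
    print(config, "->", cost, "links used")
```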
The second BIOS also serves as a backup in case something goes wrong during a BIOS flash.
A silicon chip consumes less power for the same amount of work when operated at cooler temperatures.
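The usual explanation is leakage: dynamic power for a fixed workload is roughly temperature-independent, while leakage (static) power climbs quickly with die temperature. A minimal back-of-the-envelope sketch follows; the exponential model and every number in it are assumptions of mine, not figures from the review.

```python
# Toy model: total power = dynamic power (workload-dependent, roughly
# temperature-independent) + leakage power (grows roughly exponentially
# with die temperature). All coefficients are made-up illustrative values,
# not measurements of any particular GPU.
P_DYNAMIC_W = 180.0      # assumed dynamic power for a fixed workload
P_LEAK_AT_40C_W = 25.0   # assumed leakage at a 40 °C die temperature
LEAK_DOUBLING_C = 25.0   # assume leakage roughly doubles every 25 °C

def total_power(temp_c: float) -> float:
    leak = P_LEAK_AT_40C_W * 2 ** ((temp_c - 40.0) / LEAK_DOUBLING_C)
    return P_DYNAMIC_W + leak

for t in (40, 60, 80, 95):
    print(f"{t:>3} °C -> {total_power(t):6.1f} W")
```

Under this kind of model, the same workload simply costs fewer watts on a cooler die, which is why aggressive coolers also help power consumption.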
At idle, MSI's GTX 580 Lightning is a good deal quieter than NVIDIA's reference board.
Unfortunately, 3D noise has not received the same treatment; it seems MSI focused more on temperatures than on fan noise.
Given the low temperatures, which are in the 70 °C range, I would rather accept a core temperature a couple of degrees higher in exchange for similar or lower noise levels in 3D.
Temperatures are low in all tests. Given the increased fan noise under load, I think a better design choice would have been to accept a few °C more under load in exchange for less fan noise.
It has long been known that overclocking headroom increases with operating voltage.
Until recently, software voltage control on VGA cards was the exception, and most users were not willing to risk their warranty by performing a soldered volt-mod.
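For a rough feel of the voltage/headroom relationship, here is a toy model that assumes a linear maximum-frequency-versus-voltage curve. The slope, the minimum voltage, and the voltage points are invented for illustration (only the 772 MHz reference clock is an actual GTX 580 figure), and real chips hit thermal and reliability limits well before such a straight line would predict.

```python
# Crude illustration of "more voltage -> more overclocking headroom".
# The linear f_max ~ k * (Vcore - V_MIN) relationship and its constants
# are assumptions for the sake of the example, not measured data.
STOCK_CLOCK_MHZ = 772      # GTX 580 reference core clock
V_MIN = 0.60               # assumed voltage below which the chip won't run
K_MHZ_PER_VOLT = 2000.0    # assumed slope of the frequency/voltage line

def estimated_fmax(vcore: float) -> float:
    return K_MHZ_PER_VOLT * (vcore - V_MIN)

for vcore in (1.000, 1.088, 1.150, 1.213):
    fmax = estimated_fmax(vcore)
    print(f"{vcore:.3f} V -> ~{fmax:4.0f} MHz "
          f"({fmax - STOCK_CLOCK_MHZ:+.0f} MHz vs. stock)")
```

Software voltage control makes exploring that curve a slider instead of a soldering iron, which is exactly why cards like this one appeal to overclockers.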