System Name | 1.21 gigawatts! |
---|---|
Processor | Intel Core i7 6700K |
Motherboard | MSI Z170A Krait Gaming 3X |
Cooling | Be Quiet! Shadow Rock Slim with Arctic MX-4 |
Memory | 16GB G.Skill Ripjaws V DDR4 3000 MHz |
Video Card(s) | Palit GTX 1080 Game Rock |
Storage | Mushkin Triactor 240GB + Toshiba X300 4TB + Team L3 EVO 480GB |
Display(s) | Philips 237E7QDSB/00 23" FHD AH-IPS |
Case | Aerocool Aero-1000 white + 4 Arctic F12 PWM Rev.2 fans |
Audio Device(s) | Onboard Audio Boost 3 with Nahimic Audio Enhancer |
Power Supply | FSP Hydro G 650W |
Mouse | Cougar 700M eSports white |
Keyboard | E-Blue Cobra II |
Software | Windows 8.1 Pro x64 |
Benchmark Scores | Cinebench R15: 948 (stock) / 1044 (4.7 GHz); Far Cry 5 1080p Ultra: min 100, avg 116, max 133 FPS |
That's only partially true. Some NVIDIA cards run terribly hot and are very loud; noise depends far more on the fan and radiator design than on the graphics chip alone. NVIDIA cards cost more because they perform better (power, heat and noise, not just framerate), so that's not an argument. Also, both sides have sponsorship deals to make their cards work better on certain games. It's a dirty political trick to boost sales that I don't like, but unfortunately it's a fact of life.
The point is that NVIDIA knows it can sell a GPU 10-15% more expensive than AMD's version with equal performance. Most buyers will pick NVIDIA over a cheaper, sometimes faster AMD card, because NVIDIA's marketing department and some popular gamers who have never even built a PC have brainwashed them.