newtekie1
Because NVIDIA cards scale so well in SLI, this would only need to be a dual GTX465 to compete well against the 5970.
As long as "GF100-030-A3" is written on the GPUs, this will only be a dual GTX465 card. Even the memory quantity points to this conclusion.
I was going to come in and say exactly these two points.
These, based on the memory configuration, are definitely GTX465 cores, unless Galaxy hid two memory chips on the back of the card.
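For reference, the memory-count logic works because each GDDR5 chip on GF100 sits on a 32-bit channel, so the chip count per GPU maps straight to bus width and model. A minimal sketch (the chip counts are the standard Fermi retail configurations, not anything measured from this Galaxy board):

```python
# Standard GF100 retail memory configurations. Each GDDR5 chip sits on
# a 32-bit channel, so the chip count per GPU maps directly to bus
# width, memory size, and therefore the model.
FERMI_MEMORY_CONFIGS = {
    8:  ("GTX 465", "256-bit", "1024 MB"),
    10: ("GTX 470", "320-bit", "1280 MB"),
    12: ("GTX 480", "384-bit", "1536 MB"),
}

def identify_core(chips_per_gpu: int) -> str:
    """Guess the Fermi variant from the memory chips visible per GPU."""
    model, bus, size = FERMI_MEMORY_CONFIGS[chips_per_gpu]
    return f"{model} ({bus}, {size})"

# Eight chips visible per GPU points to GTX 465, unless two more chips
# are hidden on the back of the card (which would mean GTX 470).
print(identify_core(8))   # GTX 465 (256-bit, 1024 MB)
```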
And yeah, two GTX480s perform the same as two HD5970s, so really two GTX465s would probably scale well enough to match a single HD5970.
And if these are GTX465s, then we are only looking at power consumption in the 300 W range at peak. That shouldn't be too big of an issue, considering it is only about 20 W beyond what the HD4870X2 ran at, and those were fine.
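As a rough back-of-the-envelope check, here is that arithmetic, assuming each downclocked GPU on the dual card draws about 150 W at peak (my assumption, for illustration only) and taking the HD4870X2 at roughly 280 W as cited above:

```python
# Back-of-the-envelope peak power for a dual-GTX465 card.
PER_GPU_PEAK_W = 150                     # assumption: ~150 W per downclocked GPU
DUAL_CARD_PEAK_W = 2 * PER_GPU_PEAK_W    # ~300 W, the range cited above

HD4870X2_PEAK_W = 280                    # roughly what the HD4870X2 ran at

print(f"Dual GTX465 estimate: {DUAL_CARD_PEAK_W} W")
print(f"Beyond the HD4870X2:  {DUAL_CARD_PEAK_W - HD4870X2_PEAK_W} W")
```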
Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. But if you have the card on water at 100% GPU usage, the GPU will use the same power as it does on air, minus the fan (which uses a LOT of power for a fan, around 20-ish watts or even more; I can't remember from the water block reviews, but it is a 1.1+ amp fan). Yes, you may get a slight reduction in power consumption due to better thermal dissipation and less of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)
It might not make any sense to you, but W1z's latest review of the GTX480 proves it: Fermi uses less power when it is cooler. The fan speed didn't affect the card's power consumption; in fact, in his tests, when he lowered the fan speed via software, power consumption went up because the card was running hotter.
Now, we aren't sure if that is because the VRM area is getting hotter and less efficient, or the GPU itself is less efficient due to voltage leakage, or a combination of both. However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power. In fact, it is almost an exact linear progression: for every 1°C hotter the card runs, it needs 1.2 W more power.
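To put that slope in concrete numbers (the 1.2 W/°C figure is from W1z's GTX480 data above; the temperatures in the example are just illustrative), a minimal sketch:

```python
# W1z's GTX 480 data: power draw rises almost exactly linearly with
# GPU temperature, at roughly 1.2 W per additional degree C.
WATTS_PER_DEGREE_C = 1.2

def extra_power_w(temp_c: float, ref_temp_c: float) -> float:
    """Extra power drawn at temp_c compared to running at ref_temp_c."""
    return WATTS_PER_DEGREE_C * (temp_c - ref_temp_c)

# Illustrative temperatures only: a card at 90 degC vs the same card
# held at 60 degC on water would draw about 1.2 * 30 = 36 W more.
print(f"{extra_power_w(90, 60):.0f} W extra at 90 degC vs 60 degC")
```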