So I have a question about changing the GPU voltage in RBE. I'm a little confused, so maybe someone can help clarify. When I open my BIOS with RBE (either my stock BIOS or my stock BIOS modded with the shader mod), here is what I currently get for the VID settings on the card.
http://img291.imageshack.us/img291/959/voltagesu.jpg
Some people say to change the VID that has 1.1 V in it, because that is supposed to be the 6950's stock 3D voltage, while others say to change VID4, because that is always the 3D voltage on this card.
I have come to believe that VID4 is the correct one to change, because raising it increased my 3D temps, while raising VID3 did nothing.
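For anyone who wants to run the same check a bit more rigorously, here's a rough sketch of how I'd compare monitoring logs from before and after bumping a single VID. The file names and column headers below are just assumptions (adjust them to whatever your logging tool actually writes out, e.g. a GPU-Z or Afterburner CSV export), and you'd want to log only while the 3D load is running so idle samples don't skew the averages:

```python
# Sketch: compare two hardware-monitoring CSV logs taken before and after
# changing one VID in RBE, to see whether the edit shows up under 3D load.
# Column names and file names are hypothetical -- match them to your tool.

import csv


def load_column(path, column):
    """Read one numeric column from a CSV monitoring log."""
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                values.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip malformed rows or a missing column
    return values


def average(values):
    return sum(values) / len(values) if values else float("nan")


if __name__ == "__main__":
    before_temps = load_column("before_vid_change.csv", "GPU Temperature [C]")
    after_temps = load_column("after_vid_change.csv", "GPU Temperature [C]")
    before_volts = load_column("before_vid_change.csv", "VDDC [V]")
    after_volts = load_column("after_vid_change.csv", "VDDC [V]")

    print(f"Avg load temp:  {average(before_temps):.1f} C -> {average(after_temps):.1f} C")
    print(f"Avg core volts: {average(before_volts):.3f} V -> {average(after_volts):.3f} V")
    # If only the VID you edited changed and both averages move up, that entry
    # is almost certainly the active 3D profile; if nothing moves, it isn't.
```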
I thought the 6950s were supposed to have a stock 3D core voltage of 1.100 V (as in VID3), but as you can see, my BIOS suggests 1.065 V is our stock voltage if we go by VID4. Trixx reports the stock voltage as 1.060 V, and Afterburner reports it as 1.100 V.
Why do you think there is such a discrepancy in what our stock voltage is?
Do you think maybe it just depends on what each card's core takes to run at stock speeds, and that's what the manufacturers set it as?
Or maybe different manufacturers just use different voltages?
If VID4 is the one that makes my 3D temps go up, then obviously it's the one I need to adjust for my GPU core when overclocking, correct?
Does anyone have a clue as to what VID3 is then, and why it's set at 1.100 V, which is what most people say our stock voltage is supposed to be? Also, why would there be an extra voltage (VID3) that is higher than my 3D voltage (supposedly VID4)? What would a higher voltage be used for? Could this be a problem when overclocking, since once VID4 is raised for the overclock it will end up higher than VID3, even though at stock VID3 was higher than VID4?