# GPU-Z v0.1.6 only - Bugs Only



## deathvirus_me (Feb 5, 2008)

Well, finally this detects every bit of info about the GPU correctly.

But there is a slight confusion in the temperature sensor part: GPU-Z reports the GPU temperature roughly 7 °C lower than what RivaTuner / SpeedFan / the NVIDIA Control Panel report.

Also, as a side note, the temperatures reported by GPU-Z are the same as Everest's, which also reports an additional "GPU Memory" temperature.

Otherwise everything else is fine in this release.


----------



## AddSub (Feb 5, 2008)

Sensor information for one of my cards is mixed up.

GPU temp shows my actual PCB temp, and PCB temp shows GPU temp.

Card: Sapphire Radeon X850 XT PCIe 256 MB (the one in my specs on the left)


----------



## largon (Feb 5, 2008)

8800-series clock frequencies are still read as the configured setting (BIOS-set or overclocked), not as the real, applied frequency.


----------



## Richteralan (Feb 5, 2008)

Failed to launch on my laptop:

```
Problem signature:
  Problem Event Name:	APPCRASH
  Application Name:	GPU-Z.0.1.6.exe
  Application Version:	0.1.6.0
  Application Timestamp:	47a7bd52
  Fault Module Name:	GPU-Z.0.1.6.exe
  Fault Module Version:	0.1.6.0
  Fault Module Timestamp:	47a7bd52
  Exception Code:	c0000005
  Exception Offset:	0002192c
  OS Version:	6.0.6000.2.0.0.256.6
  Locale ID:	1033
  Additional Information 1:	38e5
  Additional Information 2:	7c3f3ba166b01645ce29fd5583b2bf40
  Additional Information 3:	22d3
  Additional Information 4:	a5eba8c9dbc852831b0c5f754274a0b7
```

Laptop config:

Gateway P-6831FX
Intel Core 2 Duo T5450
Intel PM965 Express Chipset
3GB DDR2-667 
NVIDIA GeForce 8800M GTS 512 (Driver: 167.59)
Windows Vista Business 32-bit


----------



## newtekie1 (Feb 6, 2008)

I think the temperature sensors are mixed up in GPU-Z.

GPU-Z shows the GPU temperature as 40°C and the PCB temperature as 35°C, but those numbers are reversed in ATITool.


----------



## molnart (Feb 6, 2008)

Saving the BIOS on my X1950 Pro still does not work.


----------



## deathvirus_me (Feb 7, 2008)

UPDATE: _The temperatures reported by GPU-Z are actually correct for the G92. By default, RivaTuner and SpeedFan read temperatures from the ForceWare driver, while GPU-Z and Everest read them from the ADT7473 sensor chip._ In that case, in RivaTuner you can switch the core temperature graph to the ADT7473 data source, and in SpeedFan you can simply set an offset value.
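For anyone who wants to cross-check the on-board sensor themselves: on Linux, the kernel's hwmon interface (via the lm-sensors `adt7473`/`adt7475` driver) exposes such sensors as files containing millidegrees Celsius. A minimal sketch of converting that reading and applying a fixed correction, along the lines of the SpeedFan offset workaround above — note the 7 °C offset and the hwmon path are illustrative assumptions, and both vary per system:

```python
def millidegrees_to_celsius(raw: str) -> float:
    """Convert a raw hwmon reading (millidegrees Celsius, as text) to degrees C."""
    return int(raw.strip()) / 1000.0

def apply_offset(temp_c: float, offset_c: float = 7.0) -> float:
    """Apply a fixed correction offset, e.g. to align a driver-reported
    temperature with the on-board sensor reading (7.0 is illustrative)."""
    return temp_c + offset_c

def read_sensor_temp(path: str = "/sys/class/hwmon/hwmon0/temp1_input") -> float:
    """Read one temperature channel from a hwmon sensor file.

    The hwmon index and channel here are hypothetical; on a real system
    you would locate the right device by checking the adjacent 'name' file.
    """
    with open(path) as f:
        return millidegrees_to_celsius(f.read())
```

On Windows this path obviously does not exist — there the practical route remains what the post describes: pick the ADT7473 data source in RivaTuner or enter the offset in SpeedFan.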


----------



## spajdr (Feb 7, 2008)

I use two VF900 coolers on a Sapphire Radeon 3870 X2, and when I run ATITool's 3D view, the temperature on the chips inside the blue box goes over 100°C within a few seconds. (That is for another user who tested it for me with an X2 as well; for me, only one chip gets that hot, while the second only rises by roughly 4-5°C, to 52°C max.) So, is there a detection problem that doubles the temperature through some bug? I don't want to blow up the card.

In GPU-Z the lines are called GPU1 and GPU2 VDDC Slave #2.

EDIT: It reports the same temperature under Everest Ultimate Edition too, so I'm not sure whether that is a normal working temperature? 100°C seems kind of high to me.


----------



## Wolf91 (Feb 8, 2008)

Sorry for my bad English. I still can't see the real status of my CrossFire setup. I have two HD 2900 XTs. With GPU-Z 0.1.5, I saw "Enabled (1 GPUs)"; now with GPU-Z 0.1.6, it says "Disabled (Crossfire Available)".

However, CrossFire really does work, because running 3DMark06 there is a big difference: from 14000 to 15500.

Please, help me!


----------



## W1zzard (Feb 8, 2008)

Which OS and drivers are you using, Wolf91?


----------



## Wolf91 (Feb 8, 2008)

Oh, sorry! 

I'm using Windows XP Professional 32-bit with Catalyst 8.1.

I can confirm that GPU-Z is able to see CrossFire on Windows Vista with the same drivers, but not on XP.

I also tried in the other topic, the one reserved for CrossFire systems, but it still doesn't work.


----------



## bumbar (Feb 8, 2008)

GPU and PCB temperatures are mixed up on my X1950 GT, and BIOS saving is not working.


----------



## Wolf91 (Feb 8, 2008)

Poor Wizard! 

You are great


----------



## GoatX12 (Feb 14, 2008)

I know this bug was fixed in 0.1.6 for ATI and CrossFire, but how about a fix for NVIDIA SLI under Vista 64? It is still not detecting SLI correctly.

My current setup is dual 8800 GTS 512 (G92) in SLI on an XFX 680i LT motherboard (an NVIDIA reference board) running Vista Ultimate 64, as stated above. Everything else is detected correctly as far as I can tell (or at least it matches the Everest info). Anyway, thanks for the wonderful program, and I hope to see this addressed in the next build.


----------



## W1zzard (Feb 18, 2008)

The problem with SLI under Vista is that I haven't found a method to ask the driver for its SLI status under Vista; what worked under XP no longer works under Vista.


----------



## GoatX12 (Feb 18, 2008)

Thanks for the heads-up. Is this problem just under Vista 64?


----------

