
Catalyst 8.9 Incorrect GPU speed reading.

I thought it was to do with the ATI v8.9 drivers as well, because the 8.8s I uninstalled just yesterday showed the overclocked bandwidth at 140 GB/s on my 4870s in CrossFire. I appreciate the people who make GPU-Z freely available; we're all just having a bit of a dig at ATI.

I thought it was funny when I looked at the figures and saw a GPU clock of 819,246 MHz. Suddenly ATI have a GPU that can play Crysis on high ;)

Here are the proper screenshots:
With 8.6 ATI drivers (no overclocking)...



And here's the actual screenshot with GPU-Z v0.2.7 under the ATI CCC v8.9 drivers (CCC shows the right speeds).
Also notice the bandwidth is wrong: 64 GB/s instead of about 140 GB/s...

http://g.imageshack.us/img261/temp1vs4.png/1/
http://g.imageshack.us/img228/temp2bb5.png/1/

So does anyone know exactly how this affects GPU-Z's ability to read the speeds, and was this change worthwhile for better speeds or improved technology support? Also, does ATI not give you guys access to their technical support to make you aware of any unusual code changes, so you don't have to play "find the needle in the haystack"?
 
Wow, nice to know I'm not the only one. I was jacking with my card earlier, then opened up GPU-Z and thought I'd done something to it. At least I can brag to my friends about my amazing overclock. :roll:
 
smithy, have you tried GPU-Z 2.6 with the 8.9 drivers to see if it does the same thing?
 
I tried 2.7, 2.6, 2.5 and 2.4, and none of them worked. 2.5 showed 0 for the core and memory clocks, while the others showed the 800,000 reading. But when I tried 2.3 it worked, and 2.0 worked as well.
 
I have a better idea: remove his thread, because he sounds too much like a fanboy.

Yes, the post was off topic, but it was a light-hearted joke about Crysis being unoptimized. Watch who you call a fanboy, kid.

NOW, back ON topic: Why would 2.3 and 2.0 work and not 2.7?
 
smithy, have you tried GPU-Z 2.6 with the 8.9 drivers to see if it does the same thing?

I've just tried TechPowerUp GPU-Z v0.2.6 and the result is the same as before. Looks like we can point the finger squarely at ATI.


http://g.imageshack.us/img381/thesameresulttp7.png/1/

chron, that is an interesting question about 2.3 and 2.0 working but not the others. So I ran TechPowerUp GPU-Z v0.2.3 and it does have issues: the GPU clock flips between 500 MHz and 125,624 MHz, and the other specs aren't right either, as you can see here...


http://g.imageshack.us/img362/wrongspec2fk0.png/1/

TechPowerUp GPU-Z v0.2.0 is even worse, as it keeps flicking between three different values and reports the wrong shader count. So we can rule out using any GPU-Z version with the ATI 8.9 drivers, as none of them read correctly. Maybe the 8.10 drivers will work instead, but don't hold your breath.
 
Well, it's ATI's hardware, so they can do whatever they like; it just takes dedication for GPU-Z to decipher their routing of the numbers, just like CPU-Z. I'm wondering whether GPU-Z takes its info from the drivers themselves or reads it at a low level from the BIOS coding. I would assume the drivers, since it would take a BIOS flash of the card to scramble those numbers so badly.
 
Well, it's ATI's hardware, so they can do whatever they like; it just takes dedication for GPU-Z to decipher their routing of the numbers, just like CPU-Z. I'm wondering whether GPU-Z takes its info from the drivers themselves or reads it at a low level from the BIOS coding. I would assume the drivers, since it would take a BIOS flash of the card to scramble those numbers so badly.

I say it's the drivers, since this issue isn't the same in 8.8. Reading the low-level BIOS coding is probably the difference between 2.7 and the earlier versions that read the clocks correctly.
 
I owned a GeForce 8800 GTS until recently, before my new 4870s, and I can't remember NVIDIA changing their code as much as ATI just have. Is there another way of finding out my GPU bandwidth with different overclocks until this is fixed?

Well if the makers of GPU-Z hit a brick wall getting their excellent app to work with any ATI drivers later than the troubled 8.9, then I'd be glad to lend a hand in testing a new beta version. Failing that, we should start petitioning ATI ;)
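As a rough stopgap until the tools are fixed, peak memory bandwidth can be estimated by hand from the memory clock and bus width. This is a minimal sketch, assuming a 256-bit bus and quad-pumped GDDR5 as on the HD 4870; the clock figures are illustrative, not measured:

```python
def bandwidth_gbs(mem_clock_mhz, bus_width_bits, pumps=4):
    """Peak bandwidth in GB/s: effective transfer rate times bus width in bytes.

    pumps=4 models GDDR5, which transfers four bits per clock per pin.
    """
    return mem_clock_mhz * 1e6 * pumps * (bus_width_bits / 8) / 1e9

# An overclocked 1100 MHz memory clock on a 256-bit bus lands near the
# ~140 GB/s figure mentioned above; 900 MHz is the stock 4870 clock.
print(bandwidth_gbs(1100, 256))  # 140.8
print(bandwidth_gbs(900, 256))   # 115.2
```

This only gives the theoretical peak, of course, but it is enough to sanity-check whatever a monitoring tool reports.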
 
If people had read my post, and W1zzard's reply which I quoted in it, the answer would be clear: GPU-Z appears to use the drivers to access the information, and ATI is known to switch methods between driver versions, so GPU-Z doesn't get the information correctly. W1zzard went on to say that NVIDIA can go five years without changing their method, while ATI seems to change things extremely frequently.
 
I'm glad this problem isn't due to my card. I was worried that my card was broken O_o;
 
The speeds are correctly read with the new Catalyst 8.10 drivers, as well. :)
 
The speeds are correctly read with the new Catalyst 8.10 drivers, as well. :)

The 8.10 driver was released yesterday: apart from being able to change the fan speed, I didn't see many other big changes in the notes.

Good to see the speeds are OK, Kincaid, but it's amazing how frequently ATI release new drivers. It's hard to keep up when you've got a backlog of games to play and work to do, maybe in the other order :P
 
GPU-Z does not access the driver to read the clocks. The clocks are read from a little microcontroller inside the GPU which handles power management, but almost every time ATI updates the code for this MCU, the data layout in it changes, which requires adjustments on GPU-Z's side.

ATI's own driver "knows" about the change because it is also what makes those changes in the microcontroller. I am seriously considering using the driver to read clocks in the future, or maybe using it as a fallback.
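The fallback idea can be sketched as follows. Everything here is a hypothetical stand-in, not actual GPU-Z internals: the function names, the table of known MCU layouts, and the simulated clock values are all invented for illustration.

```python
# Driver versions whose MCU data layout this (pretend) tool understands.
KNOWN_MCU_LAYOUTS = {"8.8", "8.10"}

def read_clock_from_mcu(driver_version):
    """Simulated direct read of the power-management MCU.

    Fails when the driver has rewritten the MCU with an unknown data layout.
    """
    if driver_version not in KNOWN_MCU_LAYOUTS:
        raise ValueError("unknown MCU data layout for driver " + driver_version)
    return 750  # MHz, simulated value

def read_clock_from_driver(driver_version):
    """Simulated query through the driver, which always knows its own layout."""
    return 750  # MHz, simulated value

def read_core_clock(driver_version):
    """Prefer the direct MCU read; fall back to the driver on unknown layouts."""
    try:
        return read_clock_from_mcu(driver_version)
    except ValueError:
        return read_clock_from_driver(driver_version)

print(read_core_clock("8.9"))   # unknown layout, so the driver path answers: 750
print(read_core_clock("8.8"))   # known layout, read directly from the MCU: 750
```

The trade-off the thread is circling around: the direct read works even when the driver misreports, but breaks whenever the layout changes; the driver path is stable but only as trustworthy as the driver itself.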
 
GPU-Z does not access the driver to read the clocks. The clocks are read from a little microcontroller inside the GPU which handles power management, but almost every time ATI updates the code for this MCU, the data layout in it changes, which requires adjustments on GPU-Z's side.

ATI's own driver "knows" about the change because it is also what makes those changes in the microcontroller. I am seriously considering using the driver to read clocks in the future, or maybe using it as a fallback.

Considering how annoying the repeated clock-reading issues have been, I think the fallback idea would be a good one. The question is how much more work that would be. Then again, it might save work by not having to continually update the software because ATI has changed things yet again.

The 8.10 driver was released yesterday: apart from being able to change the fan speed, I didn't see many other big changes in the notes.

Good to see the speeds are OK, Kincaid, but it's amazing how frequently ATI release new drivers. It's hard to keep up when you've got a backlog of games to play and work to do, maybe in the other order :P

A friend who had some graphics issues in BioShock with his ATI 4870 said the new drivers cleared up the problem he was having. The release notes were a bit on the short side, though.

Edit: I should clarify my statement. My friend is using Vista x64. The release notes specifically mention City of Villains but no other games. The list of fixes is pretty short: Blu-ray playback, HDMI, etc. I suggest checking out the release notes.
 
The question is how much more work would that be?

Had I asked myself that question every time, I don't think I would have released any enthusiast software at all. The question for me is what the best solution is. Relying on the driver has several pros and cons.
 
Had I asked myself that question every time, I don't think I would have released any enthusiast software at all. The question for me is what the best solution is. Relying on the driver has several pros and cons.

True enough. If you can get the information from the drivers rather than playing games with ATI changing things, I'd say go the driver route. I won't claim to know the programming involved, but if it means more coding now and less down the road fixing the changes ATI keeps making, then I'd say go for it. Of course, that's just my opinion. :)
 