# 6770M - Problem or not?!



## nism (Sep 7, 2011)

Hello mates  First, I want to apologize for my bad English  . The problem is that when I start GPU-Z, no Memory Type is shown at all. Secondly, the memory clock values are NOT the same as the ones listed at http://www.notebookcheck.net/AMD-Radeon-HD-6770M.43955.0.html . The values (see http://gpuz.techpowerup.com/11/09/07/cqs.png ) are: GPU Clock Memory 800 MHz and Default Clock 216 MHz. When I'm playing games (like Sacred 2 or AVP) the memory clock value is 216. Aren't they supposed to be "Memory Speed: 1600 MHz"? Please answer and help a noobie understand what's wrong and what to do.


----------



## Derek12 (Sep 7, 2011)

Could be two things:

- Your (laptop?) has an energy saving feature and lowers the clock of the memory or GPU when you're on battery, even if the GPU is under load.

- Most likely: GPU-Z is misreading because the (laptop?) is too new (especially given the unknown memory type). Try AIDA64, HWiNFO, or Catalyst under Information -> Hardware just in case, and see if the readings are the same.


----------



## nism (Sep 7, 2011)

Thanks for the answer. The laptop is an HP dv6-6007tx. I've seen other pictures on the net and they don't show the same memory speed. Also, the power plan is set to High performance. Is it that the 216 MHz must be multiplied by 5 (because of the GDDR5)?

Edit: Even HWiNFO shows the memory at 216.


----------



## Derek12 (Sep 7, 2011)

nism said:


> Thanks for the answer. The laptop is an HP dv6-6007tx. I've seen other pictures on the net and they don't show the same memory speed. Also, the power plan is set to High performance. Is it that the 216 MHz must be multiplied by 5 (because of the GDDR5)?



Maybe you have a newer revision with different memory modules that GPU-Z couldn't recognize (if I'm wrong, someone please correct me  )


The specs say 1600 MHz, but that is the *effective data clock* because of the *Double Data Rate*; the real *memory clock* of your VRAM would be 800 MHz.
No, with any DDRx you would multiply by 2 to get the effective data clock (in your case that would be 432, going by that reading).

Try Catalyst, Information -> Hardware, and see if it reports the correct clock.


----------



## nism (Sep 7, 2011)

Derek12 said:


> .........
> 
> Try with Catalyst, Information -> Hardware and see if it reports the correct clock



It says "Memory Clock in MHz: 800 MHz"?

Edit: I really am trying to understand... but it is hard :/ . I've read a lot on Google but I can't figure it out...


----------



## W1zzard (Sep 7, 2011)

the problem with memory type is because gpu-z can't read the vga bios. i think this also affects the default clock reading, as the default codepath looks in the bios to get that info.

the 1600 mhz vs. 800 mhz is caused by the double data rate of ddr, as derek12 mentioned in the post above


----------



## nism (Sep 7, 2011)

W1zzard said:


> the problem with memory type is because gpu-z can't read the vga bios. i think this also affects the default clock reading, as the default codepath looks in the bios to get that info.
> 
> the 1600 mhz vs. 800 mhz is caused by the double data rate of ddr, as derek12 mentioned in the post above



So you're telling me that everything is all right? I multiply 800 (because Catalyst says "Memory Clock in MHz 800 MHz") by 2 and get the real 1600 MHz?


----------



## Derek12 (Sep 7, 2011)

nism said:


> It says "Memory Clock in MHz: 800 MHz"?



Then it is right; W1zzard has clarified it further.


----------



## nism (Sep 7, 2011)

Okay, let me tell you what I think; I hope it's right and that settles everything ;] . This site ( http://www.geeks3d.com/20100613/tut...-clock-real-and-effective-speeds-demystified/ ) says:

 " Roughly said, there are 2 kinds of graphics memory currently in use: DDR (Double Data Rate) and QDR (Quad Data Rate or Quad-pumped). For example, the GDDR3 memory (on GTX 200 for example) is a DDR memory while the GDDR5 (on HD 5000 or GTX 400) is a QDR memory.

The effective speed of a DDR memory is:

DDR_effective_speed = real_speed x 2

And the effective speed of a QDR memory is:

QDR_effective_speed = real_speed x 4 ".

So... the default clock is 216 MHz (shown here: http://gpuz.techpowerup.com/11/09/07/fqq.png ). I multiply it by 4 and get around 800 MHz. After that I multiply it by 2 and get 1600 MHz because of the "double data rate of ddr as derek12 mentioned in the post above". Am I right? Really sorry for all this, but I'm really worried about whether the GPU is working properly. Also, why are there other pictures on the net of a 6770M where the clock information is different (like http://gpuz.techpowerup.com/11/05/06/494.png )?
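For reference, here is how the two quoted formulas work out as a tiny sketch (the function names are mine, purely for illustration):

```python
# The two rules quoted from the geeks3d article, as code.
# Function names are made up here, just to illustrate the arithmetic.

def ddr_effective_speed(real_mhz):
    """DDR memory transfers data twice per clock cycle."""
    return real_mhz * 2

def qdr_effective_speed(real_mhz):
    """QDR (quad-pumped) memory, e.g. GDDR5, transfers four times per clock."""
    return real_mhz * 4

print(ddr_effective_speed(800))  # 1600 -- the "Memory Speed 1600 MHz" figure
print(qdr_effective_speed(216))  # 864  -- what GPU-Z's 216 MHz reading would imply
```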


----------



## Derek12 (Sep 7, 2011)

That confuses me  , because GDDR5 stands for *Graphics Double Data Rate 5*; why isn't it called *GQDR5*?


----------



## nism (Sep 7, 2011)

I have no idea, man  I just want to know if my video card is running properly or not ;/


----------



## Derek12 (Sep 7, 2011)

nism said:


> I have no idea, man  I just want to know if my video card is running properly or not ;/ http://gpuz.techpowerup.com/11/09/07/s9.png



Yes, if Catalyst reports 800 MHz *it's fine*, because that is the real clock, as W1zzard and I said.


----------



## nism (Sep 7, 2011)

I hope you are right


----------



## Jstn7477 (Sep 7, 2011)

The base frequency of the GDDR5 in that computer is 800MHz, so the effective speed would be 3200MHz, because GDDR5 is QDR RAM. The DDR3 equivalent (if HP used DDR3 instead of GDDR5) would be 1600MHz effective for that 800MHz clock.

My HD 5770 desktop card has 1250MHz 128bit GDDR5, which results in 5000MHz effective. My HD 6670 desktop has 1000MHz 128bit GDDR5, which results in a 4000MHz effective clock. A 128bit configuration of GDDR5 at 1000MHz (4000 effective) has a throughput of 64.0GB/s. Therefore, if GPU-Z read your memory correctly, the throughput of your 128bit GDDR5 @ 800MHz (3200 effective) would result in 51.2GB/s memory bandwidth. My laptop with 128bit 800MHz DDR3 (1600MHz effective) has 25.6GB/s of memory bandwidth.
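All of those bandwidth figures come from one formula: real clock × transfers per clock × bus width in bytes. A minimal sketch (the helper name is mine, just for illustration):

```python
# Peak memory bandwidth = real clock (MHz) * transfers per clock * bus bytes.
# "pump" is 4 for QDR memory like GDDR5, 2 for DDR-type memory like DDR3.

def bandwidth_gbps(real_mhz, pump, bus_bits):
    """Peak memory bandwidth in GB/s for a given real clock and bus width."""
    return real_mhz * pump * (bus_bits / 8) / 1000

print(bandwidth_gbps(1000, 4, 128))  # 64.0 -- desktop HD 6670 GDDR5 @ 1000MHz
print(bandwidth_gbps(800, 4, 128))   # 51.2 -- 128bit GDDR5 @ 800MHz (3200 eff)
print(bandwidth_gbps(800, 2, 128))   # 25.6 -- 128bit DDR3 @ 800MHz (1600 eff)
```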

Hope this doesn't confuse you.


----------



## Derek12 (Sep 7, 2011)

Many thanks I understood now


----------



## nism (Sep 7, 2011)

Jstn7477 said:


> The base frequency of the GDDR5 in that computer is 800MHz, so the effective speed would be 3200MHz, because GDDR5 is QDR RAM.
> 
> Hope this doesn't confuse you.



I understand you perfectly, but my anxiety is why GPU-Z shows http://gpuz.techpowerup.com/11/09/07/gpq.png 216 MHz. Why doesn't it show 800 MHz?  HWiNFO64 also shows 216 MHz. I wonder if there is a problem or ... My video card is a 6770M. If I believe this 216 MHz, my effective speed would be 864 MHz???


----------



## Derek12 (Sep 7, 2011)

nism said:


> I understand you perfectly, but my anxiety is why GPU-Z shows http://gpuz.techpowerup.com/11/09/07/gpq.png 216 MHz. Why doesn't it show 800 MHz?  HWiNFO64 also shows 216 MHz. I wonder if there is a problem or ...





W1zzard said:


> the problem with memory type is because gpuz can't read the vga bios. *i think this also affects the default clock reading as the default codepath looks in the bios to get that info.*
> 
> the 1600 mhz vs. 800 mhz is causes by the double date rate of ddr as derek12 mentioned in the post above



-------------------------
About HWiNFO: maybe it has the same issue as GPU-Z and cannot read the clocks correctly, but I would trust Catalyst if it reports 800. If you want to be more sure, try the latest version of AIDA64.
Maybe it has to do with it being QDR and the calculation you did before. I wouldn't worry anymore.


----------



## Jstn7477 (Sep 7, 2011)

It's likely a monitoring flaw. My HD 4250 IGP in my laptop (which works in tandem with my HD 5650) reads:

GPU Clock 500MHz core 398MHz mem
Default Clock 498MHz core 533MHz mem

and the monitoring page reads 497.6MHz core / 397.4MHz memory, even though it has no sideport memory and the main system RAM is DDR3-1066 (533*2).

Both VGA BIOSes are integrated in the main system ROM.


----------



## Derek12 (Sep 7, 2011)

Jstn7477 said:


> It's likely a monitoring flaw. My HD 4250 IGP in my laptop (which works in tandem with my HD 5650) reads:
> 
> GPU Clock 500MHz core 398MHz mem
> Default Clock 498MHz core 533MHz mem
> ...



Yeah, and here is my GPU-Z reading of my IGP.

Not exactly the same, because it doesn't affect the default GPU clock, but the core clock is really 400 MHz. Oh, and the memory reading is also wrong.


----------



## W1zzard (Sep 7, 2011)

misreadings happen sometimes and they are very hard to fix if i don't have physical access to the platform for testing


----------



## Derek12 (Sep 7, 2011)

W1zzard said:


> misreadings happen sometimes and they are very hard to fix if i don't have physical access to the platform for testing



Yeah, there are so many hardware types, vendors, models, and builds. I understand you.


----------



## nism (Sep 7, 2011)

This is what AIDA shows while playing AVP: http://prikachi.com/images.php?images/154/3792154l.jpg . Now it is 400 MHz?! How can I tell whether my video card works properly or not  ?!


----------



## Derek12 (Sep 7, 2011)

nism said:


> This is what AIDA shows while playing AVP: http://prikachi.com/images.php?images/154/3792154l.jpg . Now it is 400 MHz?! How can I tell whether my video card works properly or not  ?!



Reporting the real clock and the effective clock as the same value is clearly a bad reading  . It also can't read the memory type, BIOS date, or part number. I wouldn't worry.

A PROPER reading would look more or less like this (except for the original 900 MHz thing).


----------



## Jstn7477 (Sep 7, 2011)

nism said:


> This is what AIDA shows while playing AVP: http://prikachi.com/images.php?images/154/3792154l.jpg . Now it is 400 MHz?! How can I tell whether my video card works properly or not  ?!



Get ahold of a benchmark that other HD 6770M users have run, and compare your results to theirs. If your results are significantly lower, then there may be a problem with your card.

Your card is basically a desktop HD 6670 GDDR5 with lower clocks, so that's another baseline to consider. My HD 6670 GDDR5 has an 800MHz core and 1000MHz memory, so if your results are more than, say, 20% below W1zzard's 6670 review (without factoring in the CPU difference), there may be a problem. Considering a desktop HD 5770/6770 has basically 200% of a desktop 6670's core with ~150% of its memory bandwidth, the difference between 1000MHz GDDR5 on a desktop 6670 and properly working 800MHz GDDR5 in your laptop should be negligible, so the only differing factors should be the 10% core clock difference and the CPU.


----------



## nism (Sep 7, 2011)

OK, but as I am a newbie, can you tell me a program I can use to benchmark?


----------



## Jstn7477 (Sep 7, 2011)

nism said:


> OK, but as I am a newbie, can you tell me a program I can use to benchmark?



http://www.techpowerup.com/reviews/AMD/HD_6670/

Do you have any of the 3DMark benchmarks, or any of the games tested in the review above? Otherwise, I would suggest downloading FurMark 1.9.1 or something, but I don't want you to potentially burn up your new laptop (although it is unlikely).

I ran FurMark 1.9.1 with the 720p benchmark preset and got 528 points (8 FPS) on a Mobility Radeon HD 5650 with a 450MHz core (400 stream processors / 8 ROPs) and 800MHz (1600 effective) 128bit DDR3. If you can roughly double my score, your card should be fine.


----------



## nism (Sep 8, 2011)

Here is what I've got: http://imageshack.us/photo/my-images/684/dddjlc.jpg/ . Later I managed to reach 21 FPS max :} . I'm just curious why every program shows different data... some show 216 MHz, others 400, and a third 800...


----------



## Jstn7477 (Sep 8, 2011)

nism said:


> Here is what I've got: http://imageshack.us/photo/my-images/684/dddjlc.jpg/ . Later I managed to reach 21 FPS max :} . I'm just curious why every program shows different data... some show 216 MHz, others 400, and a third 800...



I'm jealous of your laptop. 

Anyway, the monitoring programs likely all use custom methods of retrieving the clock information, so some programs may have better or worse methods for different cards.


----------



## nism (Sep 9, 2011)

Hah, thanks mate  I have a question. I want to update my CCC. When I download it from the ATI site, should I uninstall the old CCC and ATI drivers first (and how do I uninstall drivers  )? Also, should I uninstall the integrated video card's driver or not? I'd be happy if you could give me some tips.


----------

