
TechPowerUp GPU-Z 2.14.0 Released

btarunr

Editor & Senior Moderator
Staff member
TechPowerUp today released the latest version of GPU-Z, the popular graphics subsystem information and diagnostic utility. Version 2.14.0 adds support for the Intel UHD Graphics iGPUs embedded in 9th-generation Core "Coffee Lake Refresh" processors. GPU-Z now calculates pixel and texture fill-rates more accurately, using the boost clock instead of the base clock. This is particularly useful for iGPUs, which have a vast difference between their base and boost clocks, and is also relevant to some newer GPU generations, such as the NVIDIA RTX 20-series.
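For reference, the math behind this change is simple: pixel fillrate is the ROP count multiplied by the clock, and texture fillrate is the TMU count multiplied by the clock. Here is a minimal sketch in Python, using the RTX 2080 SUPER's public specs (64 ROPs, 192 TMUs, 1650 MHz base, 1815 MHz boost) as illustrative inputs; this is not GPU-Z's actual code:

Code:
# Fillrate as described above: ROPs/TMUs multiplied by the clock.
# The example figures are the RTX 2080 SUPER's public specs; they are
# illustrative inputs, not GPU-Z internals.

def fillrates(rops: int, tmus: int, clock_mhz: float):
    """Return (pixel fillrate in GPixel/s, texture fillrate in GTexel/s)."""
    clock_ghz = clock_mhz / 1000.0
    return rops * clock_ghz, tmus * clock_ghz

base_pix, base_tex = fillrates(64, 192, 1650)    # base clock (pre-2.14.0)
boost_pix, boost_tex = fillrates(64, 192, 1815)  # boost clock (2.14.0)

print(f"Base:  {base_pix:.1f} GPixel/s, {base_tex:.1f} GTexel/s")
print(f"Boost: {boost_pix:.1f} GPixel/s, {boost_tex:.1f} GTexel/s")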

A number of minor bugs were also fixed with GPU-Z 2.14.0, including a missing Intel iGPU temperature sensor and malfunctioning clock-speed measurement on Intel iGPUs. For NVIDIA GPUs, power sensors show power draw both as an absolute value and as a percentage of the GPU's rated TDP, in separate read-outs. This feature was introduced in the previous version; this version clarifies the labels by including "W" and "%" in the sensor names. Grab GPU-Z from the link below.

DOWNLOAD: TechPowerUp GPU-Z 2.14.0



The change-log follows.
  • When available, boost clock is used to calculate fillrate and texture rate
  • Fixed missing Intel GPU temperature sensor
  • Fixed wrong clocks on some Intel IGP systems ("12750 MHz")
  • NVIDIA power sensors now labeled with "W" and "%"
  • Added support for Intel Coffee Lake Refresh

View at TechPowerUp Main Site
 
About the boost clocks shown: how about a dynamic value? I mean, instead of showing some base boost clock, why won't it show the maximum?
My Strix 1080 Ti, even with factory cooling, always runs at 1949-1974 MHz instead of the 1704 MHz shown in GPU-Z.
Besides, it would ease comparison between various brands.
 
About the boost clocks shown: how about a dynamic value? I mean, instead of showing some base boost clock, why won't it show the maximum?
My Strix 1080 Ti, even with factory cooling, always runs at 1949-1974 MHz instead of the 1704 MHz shown in GPU-Z.
Besides, it would ease comparison between various brands.
Because 1704 MHz is the Boost 1.0 clock; anything above that is Boost 3.0, which changes dynamically depending on load, temperature, power limit/consumption, etc., so GPU-Z can only read those values under load. At least I believe so.
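For the curious, the dynamic Boost 3.0 clock can be watched under load with NVIDIA's NVML library; here is a minimal sketch using the pynvml bindings (my tooling choice for illustration; GPU-Z reads clocks through its own driver interfaces, not this code):

Code:
# Poll the current graphics clock and temperature once per second via NVML
# (pip install pynvml). Purely illustrative; GPU-Z does not work this way.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(10):
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"core: {clock} MHz  temp: {temp} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

Run it while a game or benchmark is loading the GPU and you will see the Boost 3.0 clock move around.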
 
At last I can see some numbers on how my Intel iGPU is behaving xD
 
this one matches what my palit software shows me.. the one i was using showed me 1724 boost instead of 1800.. it also explains why i could not clock as high as i thought i should be able to..

the actual boost with the valley benchmark running varies between 1900 and 2100.. furmark has it down to 1500..

the main control (governor) seems to be power usage.. assuming the temps are okay as they should be..

trog

ps.. the memory reading is wrong though.. on my card it should be 7747.. the default is 7000.. i ain't sure where the 1937 comes from
 

newtekie1

Semi-Retired Folder
ps.. the memory reading is wrong though.. on my card it should be 7747.. the default is 7000.. i ain't sure where the 1937 comes from

1937 MHz is the actual memory frequency. GDDR6 (and GDDR5X) is quad data rate, so you multiply the actual memory frequency by 4 to get your effective frequency of 7747. But the memory is actually running at 1937 MHz, so that is what GPU-Z shows.
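In other words, the number on the box is just the actual clock times a fixed per-memory-type multiplier. A quick sketch (the multipliers are the standard ones; the helper itself is just for illustration):

Code:
# Actual memory clock vs. "effective" data rate, per memory type.
# GDDR5 is double data rate (x2); GDDR5X and GDDR6 transfer four times
# per clock (x4), hence the quad-data-rate multiplier.
DATA_RATE_MULTIPLIER = {"GDDR5": 2, "GDDR5X": 4, "GDDR6": 4}

def effective_mhz(actual_mhz: float, mem_type: str) -> float:
    return actual_mhz * DATA_RATE_MULTIPLIER[mem_type]

# The 1937 MHz that GPU-Z reports works out to the ~7748 "effective" figure:
print(effective_mhz(1937, "GDDR6"))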
 
1937 MHz is the actual memory frequency. GDDR6 (and GDDR5X) is quad data rate, so you multiply the actual memory frequency by 4 to get your effective frequency of 7747. But the memory is actually running at 1937 MHz, so that is what GPU-Z shows.

yes i did think it might be that.. the earlier version showed it differently which is what confused me.. i prefer the earlier way i think.. he he

trog
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
yes i did think it might be that.. the earlier version showed it differently which is what confused me.. i prefer the earlier way i think.. he he

trog
We display the correct clocks, not MT/s.
 
because it's not possible to read the maximum, as far as I know

It could take the max boost clocks after using the PCI-E render test, which should be more accurate than just reading the specified boost clock, so GPU-Z could first show the fillrates using the factory clocks and update them after running the render test.

That is because NVIDIA cards boost way higher than their rated boost clocks, but AMD cards can't even reach their boost clocks without increasing TDP or undervolting (no Vega card can reach its boost clock out of the box).

Btw, the next version could also add FP32 performance in GFLOPS, using both the factory clocks and the clocks measured after a PCI-E render test.
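For reference, that FP32 figure is easy to derive: shader count × 2 operations per cycle (one fused multiply-add) × clock. A rough sketch, using the reference RTX 2060's public specs (1920 CUDA cores, 1680 MHz boost) as example numbers:

Code:
# Peak FP32 throughput = shaders x 2 ops/cycle (FMA) x clock.
# The example numbers are the reference RTX 2060's public specs;
# they are illustrative only.

def fp32_gflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1000.0

print(f"{fp32_gflops(1920, 1680):.0f} GFLOPS at the rated boost clock")
# Feeding in the clock measured after a render test would give the
# "real" peak this suggestion is asking for.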
 
It could take the max boost clocks after using the PCI-E render test, which should be more accurate than just reading the specified boost clock, so GPU-Z could first show the fillrates using the factory clocks and update them after running the render test.

That is because NVIDIA cards boost way higher than their rated boost clocks, but AMD cards can't even reach their boost clocks without increasing TDP or undervolting (no Vega card can reach its boost clock out of the box).

Btw, the next version could also add FP32 performance in GFLOPS, using both the factory clocks and the clocks measured after a PCI-E render test.
And what if I don't want to run the render test? :eek:
 
And what if I don't want to run the render test? :eek:
Then you don't see the updated fillrates, that's all. The user should be the one to click the render test button, or if it's automatic, it should ask the user whether they want to run it.
 
Then you don't see the updated fillrates, that's all. The user should be the one to click the render test button, or if it's automatic, it should ask the user whether they want to run it.
My only problem with this "run the render test" method is that, sure, GPU-Z would be able to read the maximum boost clocks, but during gaming the GPU is 99% guaranteed not to run at those clocks, because of the temperatures and the aggressive clock decreasing of Boost 3.0.
For example, my GTX 1070 boosts to 2100 MHz, but only below 55 or 60 °C. After reaching this temperature, the GPU starts decreasing its clocks by 12-13 MHz every 1-2 °C, which means that I play at 2050-2062 MHz 99% of the time. So yes, I agree that your method would allow GPU-Z to read the maximum boost clocks, but it would also make little sense, as I'm sure at least 80% of Pascal GPUs run above 60 °C, or even 70 °C, while gaming.
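As a rough model of the behavior I'm describing (the threshold and the 12-13 MHz steps come from watching my own card, not from any NVIDIA specification):

Code:
# Toy model of Boost 3.0 temperature-based downclocking as described above:
# flat maximum boost below the threshold, then ~12.5 MHz shaved off every
# ~1.5 C. All numbers are one user's observations, not documented behavior.

def boost_clock_estimate(temp_c: float, max_clock: float = 2100.0,
                         threshold_c: float = 58.0,
                         step_mhz: float = 12.5, step_c: float = 1.5) -> float:
    if temp_c <= threshold_c:
        return max_clock
    steps = (temp_c - threshold_c) / step_c
    return max_clock - steps * step_mhz

for t in (50, 60, 65, 70, 75):
    print(f"{t} C -> ~{boost_clock_estimate(t):.0f} MHz")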

Also, I'm sorry if I sounded cocky in my previous comment, I didn't mean to. :)
 
My only problem with this "run the render test" method is that, sure, GPU-Z would be able to read the maximum boost clocks, but during gaming the GPU is 99% guaranteed not to run at those clocks, because of the temperatures and the aggressive clock decreasing of Boost 3.0.
For example, my GTX 1070 boosts to 2100 MHz, but only below 55 or 60 °C. After reaching this temperature, the GPU starts decreasing its clocks by 12-13 MHz every 1-2 °C, which means that I play at 2050-2062 MHz 99% of the time. So yes, I agree that your method would allow GPU-Z to read the maximum boost clocks, but it would also make little sense, as I'm sure at least 80% of Pascal GPUs run above 60 °C, or even 70 °C, while gaming.

Also, I'm sorry if I sounded cocky in my previous comment, I didn't mean to. :)

2100 MHz is closer to 2062 MHz than to maybe 1900 MHz (which I estimate is your current boost clock according to GPU-Z), so it is still more accurate (and looks better) hehehe
 