Friday, July 3rd 2009

FurMark Returns with Version 1.7.0

Nearly four months after its previous version, the chaps at oZone3D have released FurMark 1.7.0. This release packs a host of nifty new features and a number of bug fixes. For starters, FurMark can now work together with GPU-Z to provide real-time readings of the graphics card's temperatures, voltages, and VDDC current (on cards that support it). An experimental feature lets you tweet your score to your Twitter account. While the stability test or benchmark is running, the main GUI stays minimized, so you don't have to start another instance to run several tests.
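
For the curious, GPU-Z publishes its sensor readings through a named shared-memory block that other tools can map read-only, which is the sort of mechanism an integration like this can use. A minimal sketch of a reader, assuming the widely circulated (unofficial) layout and the "GPUZShMem" mapping name:

    // Minimal sketch (assumptions: the widely circulated, unofficial GPU-Z
    // shared-memory layout and the "GPUZShMem" mapping name) of reading
    // GPU-Z's live sensor data on Windows.
    #include <windows.h>
    #include <cstdio>

    #pragma pack(push, 1)
    struct GpuzRecord       { WCHAR key[256]; WCHAR value[256]; };
    struct GpuzSensorRecord { WCHAR name[256]; WCHAR unit[8]; UINT32 digits; double value; };
    struct GpuzShMem {
        UINT32 version;                 // layout version
        volatile LONG busy;             // non-zero while GPU-Z is writing
        UINT32 lastUpdate;              // tick count of the last refresh
        GpuzRecord data[128];           // static info: card name, BIOS, ...
        GpuzSensorRecord sensors[128];  // live readings: temps, voltages, VDDC current
    };
    #pragma pack(pop)

    int main() {
        HANDLE h = OpenFileMappingW(FILE_MAP_READ, FALSE, L"GPUZShMem");
        if (!h) { puts("GPU-Z doesn't seem to be running"); return 1; }
        auto* mem = static_cast<const GpuzShMem*>(
            MapViewOfFile(h, FILE_MAP_READ, 0, 0, sizeof(GpuzShMem)));
        if (mem) {
            for (const auto& s : mem->sensors)
                if (s.name[0])
                    wprintf(L"%s: %.*f %s\n", s.name, (int)s.digits, s.value, s.unit);
            UnmapViewOfFile(mem);
        }
        CloseHandle(h);
        return 0;
    }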

With multiple GPUs doing the rendering, each GPU gets its own temperature graph. You can start or stop the rendering by hitting the space key, without having to close the window. A number of new resolutions have been added, and thanks to contributed translations the application is now also available in Bulgarian, Polish, Slovak, and Spanish (Castilian). Issues with temperature updates in the graph and with the application's multithreading management have been resolved. Give your graphics cards a sunbath.

DOWNLOAD: FurMark 1.7.0
Source: Geeks3D

33 Comments on FurMark Returns with Version 1.7.0

#1
MRCL
I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.
Posted on Reply
#2
z1tu
MRCL: I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.
Yes, my card is having trouble even without grilling it :roll:
Posted on Reply
#3
entropy13
MRCL: I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.
There are vampires in Switzerland? :eek:
Posted on Reply
#4
boogerlad
How much CPU does FurMark use?
Posted on Reply
#5
h3llb3nd4
boogerlad: How much CPU does FurMark use?
It's mostly the GPU that it's utilising, so I don't think the CPU is being stressed much...
Posted on Reply
#6
btarunr
Editor & Senior Moderator
boogerlad: How much CPU does FurMark use?
It's mostly single-threaded even today.

Posted on Reply
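
One quick way to sanity-check that single-threaded claim yourself is to count the threads of the running process with a Win32 toolhelp snapshot; keep in mind the driver and GL runtime usually add a few helper threads of their own, so the point is that the render loop itself is one thread. A rough sketch:

    // Count the threads of a process by PID. A "single-threaded" renderer
    // will still show a few driver/runtime helper threads.
    #include <windows.h>
    #include <tlhelp32.h>

    int count_threads(DWORD pid) {
        HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
        if (snap == INVALID_HANDLE_VALUE) return -1;
        THREADENTRY32 te = { sizeof(te) };
        int n = 0;
        for (BOOL ok = Thread32First(snap, &te); ok; ok = Thread32Next(snap, &te))
            if (te.th32OwnerProcessID == pid) ++n;
        CloseHandle(snap);
        return n;
    }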
#7
boogerlad
Thanks, that's good. Then I could stress-test my CPU, GPU, and RAM all at the same time!
Posted on Reply
#8
erocker
*
Sweet, now I can destroy more than one card at once! :nutkick: I don't like programs that overstress hardware. Too bad they can't tone it down a bit.
Posted on Reply
#9
boogerlad
The ultimate torture test: FurMark running at max settings, LinX 20 passes, and memtest at the same time!
Posted on Reply
#10
boogerlad
I don't think the power draw is right. 66 watts at full load for a GTX 260?
Posted on Reply
#11
dcf-joe
Is this any good for a single 4870 X2?
Posted on Reply
#13
W1zzard
erocker: Sweet, now I can destroy more than one card at once! :nutkick: I don't like programs that overstress hardware. Too bad they can't tone it down a bit.
Of course you can tone it down; work with the different settings available.
Posted on Reply
#15
Steevo
Sweet.

3 °C over idle at full load. 41 °C, but the house is hot :(
Posted on Reply
#16
r1rhyder
I could grill a steak on my cards.
Posted on Reply
#17
largon
Curiously, btarunr's shot displays:
Renderer: GeForce GTX 260/PCI/SSE2 - GL=3.0
While for my HD4890 it says:
Renderer: ATI Radeon HD 4800 Series - GL=2.1
Why are the Radeons running the app with OpenGL 2.1 and not 3.0? These cards are supposed to be OpenGL 3.1 compliant. My shot was taken on the official Cat 9.6s.
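
(For reference, both strings are simply whatever the installed driver reports for the context the app created, so two cards or two driver versions can legitimately differ. A minimal sketch of the query behind such a line:)

    // Where a "Renderer: ... - GL=x.y" line comes from: both strings are
    // reported by the driver for the current OpenGL context.
    #include <GL/gl.h>
    #include <cstdio>

    void print_gl_info() {
        // Assumes an OpenGL context is already current on this thread.
        printf("Renderer: %s - GL=%s\n",
               reinterpret_cast<const char*>(glGetString(GL_RENDERER)),
               reinterpret_cast<const char*>(glGetString(GL_VERSION)));
    }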
boogerlad: I don't think the power draw is right. 66 watts at full load for a GTX 260?
For a 55 nm card that wouldn't be a problem. Remember that figure accounts for nothing but the GPU. There are also 14 memory chips onboard, each munching away something like 2 W.

Here's a shot of an HD4890 getting busy:

The reported wattage figure for this card is even less relevant, as these cards have secondary core power circuitry whose output is not included in the figure. And of course the memory comes on top of that.
Posted on Reply
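
A back-of-the-envelope sketch of largon's accounting, with the assumed bits marked (the ~2 W per memory chip and the VRM efficiency are assumptions, not measured figures):

    #include <cstdio>

    int main() {
        double gpu_core = 66.0;      // GPU-only reading quoted above, W
        double memory   = 14 * 2.0;  // assumption: ~2 W per memory chip
        double vrm_eff  = 0.90;      // assumption: conversion efficiency
        double board    = (gpu_core + memory) / vrm_eff;  // fan, I/O etc. excluded
        printf("estimated board draw: ~%.0f W\n", board);  // ~104 W
        return 0;
    }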
#18
boogerlad
Then why do some graphics cards need one 6-pin and one 8-pin connector? PCI-E slot = 75 W, 6-pin = 75 W, and 8-pin = 150 W. In total, the max is 300 W for a graphics card. But none of these cards actually reaches that high.
Posted on Reply
#19
Arrakis9
Also keep in mind that quite a bit of the wattage converted by the VRMs is wasted as heat.
Posted on Reply
#20
denice25
thanks for the share....
Posted on Reply
#21
largon
boogerlad: Then why do some graphics cards need one 6-pin and one 8-pin connector? PCI-E slot = 75 W, 6-pin = 75 W, and 8-pin = 150 W. In total, the max is 300 W for a graphics card. But none of these cards actually reaches that high.
Because if the card has an onboard PCIe power plug, slot power cannot be used to feed the same load the 6-pin plug feeds. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something one doesn't want to happen, for a number of reasons.
Arrakis+9: Also keep in mind that quite a bit of the wattage converted by the VRMs is wasted as heat.
Volterra chips are around 90-95% efficient. They seem to be more efficient than other, more conventional VRMs, which is evident from the increased power consumption of the GTX 295 when it went from two PCBs to a single PCB that no longer uses Volterra VRMs.
Posted on Reply
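
To put largon's 90-95% in concrete terms, here is the conversion loss for an assumed 100 W GPU-side load:

    #include <cstdio>

    int main() {
        double load   = 100.0;        // assumption: power delivered to the GPU side, W
        double effs[] = {0.90, 0.95}; // largon's efficiency range
        for (double eff : effs) {
            double input = load / eff;  // power drawn from slot + PCIe plugs
            printf("eff %.0f%%: input %.1f W, VRM heat %.1f W\n",
                   eff * 100.0, input, input - load);
        }
        return 0;
    }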
#22
hat
Enthusiast
largon: Because if the card has an onboard PCIe power plug, slot power cannot be used to feed the same load the 6-pin plug feeds. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something one doesn't want to happen, for a number of reasons.
That doesn't make sense. Why would one use a 75 W power connector when one could simply use the 75 W from the slot, seeing as the power from the slot becomes unavailable when an external power connector is present? And PCI-E 2.0 is 150 W... why would anyone put a single 75 W external power connector (à la 8800 GTS G92) on a card that already gets 150 W from the slot, when using an external power source makes the slot power unavailable? I must have misunderstood somehow...
Posted on Reply
#23
btarunr
Editor & Senior Moderator
hat: And PCI-E 2.0 is 150 W...
Uhm, no. The PCI-E 2.0 x16 slot provides 75 W. Not a watt more.
Posted on Reply
#24
hat
Enthusiast
btarunr: Uhm, no. The PCI-E 2.0 x16 slot provides 75 W. Not a watt more.
You know... you're the first person I've ever heard say that... can you back that statement up?
Posted on Reply
#25
btarunr
Editor & Senior Moderator
hat: You know... you're the first person I've ever heard say that... can you back that statement up?
I don't need to. Read up.
Posted on Reply