Tuesday, October 13th 2009

Intel IGPs Use Murky Optimisations for 3DMark Vantage

Apart from being the industry's leading 3D graphics benchmark application, 3DMark has a long history of 3D graphics hardware manufacturers cheating with application-specific optimisations, against Futuremark's guidelines, to boost 3DMark scores. Often, this is done by drivers detecting the 3DMark executable and downgrading image quality, so the graphics processor handles a lighter processing load from the application and ends up with a higher performance score. Time and again, such application-specific optimisations have tarnished 3DMark's credibility as an industry-wide benchmark.

This time around, it's neither of the two graphics giants in the news for the wrong reasons; it's Intel. Although the company has a wide consumer base for its integrated graphics, the discerning media user or very casual gamer often finds it best to opt for integrated graphics (IGP) solutions from NVIDIA or AMD instead. Such choices rely upon reviews evaluating an IGP's performance at accelerating video (where it's common knowledge that Intel's IGPs rely heavily on the CPU for smooth video playback, while competing IGPs fare better at hardware acceleration), and at synthetic and real-world 3D benchmarks, among other application-specific tests.

Here's a shady trick Intel is using to up its 3DMark Vantage score: according to an investigation by The Tech Report, the drivers, upon seeing the 3DMark Vantage executable, change the way they normally function, asking the CPU to pitch in with its processing power and gaining significant performance. While the application's image quality isn't affected, the load on the IGP is effectively reduced, deviating from the driver's usual working model. This violates Futuremark's 3DMark Vantage Driver Approval Policy, which says:
With the exception of configuring the correct rendering mode on multi-GPU systems, it is prohibited for the driver to detect the launch of 3DMark Vantage executable and to alter, replace or override any quality parameters or parts of the benchmark workload based on the detection. Optimizations in the driver that utilize empirical data of 3DMark Vantage workloads are prohibited.
There's scope for ambiguity there. To prove that Intel's drivers indeed don't play fair at 3DMark Vantage, Tech Report put an Intel G41 Express chipset-based motherboard, running Intel's latest 15.15.4.1872 Graphics Media Accelerator drivers, through 3DMark Vantage 1.0.1. The reviewer simply renamed the 3DMark executable, in this case from "3DMarkVantage.exe" to "3DMarkVintage.exe", and there you have it: a substantial performance difference.
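For the curious, the rename test is trivial to script. Below is a minimal sketch in Python of how such a run-and-compare could be automated on Windows; the install path is hypothetical, and this is not Tech Report's actual test harness:

```python
import shutil
import subprocess

# Hypothetical paths -- adjust for the actual 3DMark Vantage install.
ORIGINAL = r"C:\Program Files\Futuremark\3DMark Vantage\3DMarkVantage.exe"
RENAMED = r"C:\Program Files\Futuremark\3DMark Vantage\3DMarkVintage.exe"

def run_benchmark(exe_path: str) -> None:
    """Launch the benchmark and wait for it to finish."""
    subprocess.run([exe_path], check=True)

# Run once under the real name (which drivers can detect)...
run_benchmark(ORIGINAL)

# ...then copy the binary to an innocuous name and run again. Any score
# gap between the two runs points at name-based driver detection.
shutil.copy2(ORIGINAL, RENAMED)
run_benchmark(RENAMED)
```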

A perfmon (Performance Monitor) log of the benchmark as it progressed shows stark irregularities between the two runs' CPU load graphs during the GPU tests, although the two remained largely the same during the CPU tests (see the graphs in the source article).
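In the same spirit as those perfmon traces, CPU utilisation during a benchmark run can be logged with a few lines of Python. This sketch assumes the third-party psutil package and an arbitrary ten-minute sampling window:

```python
import csv
import time

import psutil  # third-party: pip install psutil

# Sample total CPU utilisation once a second while the benchmark runs,
# writing a timestamped trace that can be compared between the
# "Vantage" and "Vintage" runs.
with open("cpu_trace.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "cpu_percent"])
    start = time.time()
    for _ in range(600):  # roughly ten minutes of samples
        writer.writerow([round(time.time() - start, 1),
                         psutil.cpu_percent(interval=1.0)])
```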

When asked for comment on these findings, Intel replied that its drivers are designed to utilize the CPU for some parts of the 3D rendering, such as geometry rendering, when pixel and vertex processing saturates the IGP. Call of Juarez, Crysis, Lost Planet: Extreme Condition, and Company of Heroes are among the other applications the driver detects, quickly morphing the way the entire graphics subsystem works. A similar test run on Crysis Warhead yields a similar result.

Currently, Intel's 15.15.4.1872 drivers for Windows 7 aren't on Futuremark's list of approved drivers; in fact, none of Intel's Windows 7 drivers are. For a complete set of graphs, refer to the source article.
Source: The Tech Report

45 Comments on Intel IGPs Use Murky Optimisations for 3DMark Vantage

#1
lemonadesoda
Thanks for the news. That's real dirty. Class action, anyone?
#2
Bull Dog
Bla, bla, bla. But do they change image quality?
#3
btarunr
Editor & Senior Moderator
Bull Dog: Bla, bla, bla. But do they change image quality?
It detects the application and changes vertex processing settings. Such a thing is prohibited under Futuremark's policy too. The GPU (IGP) itself isn't getting the graphics processing load it should be getting for the test to be fair and valid.
#4
KainXS
Bull Dog: Bla, bla, bla. But do they change image quality?
it says it right there
this is done by drivers detecting the 3DMark executable and downgrading image quality, so the graphics processor handles a lighter processing load from the application and ends up with a higher performance score.
:laugh:
#5
HalfAHertz
Why? Isn't the driver's job to provide the best possible experience while utilizing all possible resources? If so, then this is exactly what Intel has done.

Now, on the other hand, if it does indeed lower the quality of the final product, and this wasn't stated anywhere, then by all means they should get sued.

The thing that really bugs me here is how depressingly and catastrophically bad Intel's IGPs are in reality.
#6
btarunr
Editor & Senior Moderator
No, that's not the way it should work. NVIDIA's and AMD's drivers don't leave the CPU to do the parts of the graphics processing that Intel's drivers are making the CPU do. The IGP itself is weaker than it appears to be. Offloading work to the CPU isn't even a standard model for Intel's drivers, as proven by running the renamed applications.
#7
1Kurgan1
The Knife in your Back
HalfAHertz: Why? Isn't the driver's job to provide the best possible experience while utilizing all possible resources? If so, then this is exactly what Intel has done.

Now, on the other hand, if it does indeed lower the quality of the final product, and this wasn't stated anywhere, then by all means they should get sued.

The thing that really bugs me here is how depressingly and catastrophically bad Intel's IGPs are in reality.
It's not providing the best experience. It's downgrading the image to get a better score, and that's only in Futuremark apps. So you think, "wow, this IGP is amazing," then you hit some games and get slapped in the face, as the Futuremark score led you to believe it was far more powerful.
#9
mdm-adph
So... Intel's IGPs are so astoundingly shitty that their drivers actually offload processing to the CPU to help out?

I'm... almost willing to let them have that, solely out of pity.
#10
Jstn7477
I will always hate Intel IGPs. Drivers suck, and they are outdated by the time they are released.
#11
[I.R.A]_FBi
Intel IGPs aren't even good 'nuff for nettops
#12
WarEagleAU
Bird of Prey
Yet when you see computers or notebooks on those home shopping channels, you've got some really retarded nerd touting the 128 MB of shared RAM being dedicated to the Intel G(I suck gP) graphics. Really retarded.
#13
FordGT90Concept
"I go fast!1!11!1!"
FutureMark just needs to have their installer randomize the executable's name.
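For illustration, a toy sketch of that suggestion: an installer copying the benchmark binary to a randomised name so that name-based driver detection misses it. The names and paths here are hypothetical:

```python
import os
import secrets
import shutil

def install_with_random_name(src_exe: str, install_dir: str) -> str:
    """Copy the benchmark executable under a random name, so drivers
    can't key optimisations off a well-known filename."""
    random_name = f"bench_{secrets.token_hex(8)}.exe"
    dest = os.path.join(install_dir, random_name)
    shutil.copy2(src_exe, dest)
    return dest

# e.g. install_with_random_name("3DMarkVantage.exe", r"C:\Benchmarks")
```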
#14
Regeneration
NGOHQ.COM
NVIDIA and AMD have been doing the same thing for years now, just with better techniques. If they are allowed, I don't see any reason why Intel shouldn't be allowed too. It's either everyone or nobody.
#15
wiak
Intel IGPs have always been called integrated crapstics; even two-generation-old AMD IGPs outclass Intel's finest :nutkick:
#16
newtekie1
Semi-Retired Folder
btarunr: Yes, it detects the application and changes vertex processing settings. Such a thing is prohibited under Futuremark's policy too. The GPU (IGP) itself isn't getting the graphics processing load it should be getting for the test to be fair and valid.
Yes, it is against Futuremark's rules, but is it morally wrong to do it this way? I've always said, fuck benchmarks, all I care about is game performance, and it seems Intel was more worried about improving performance in games, and simply applied the same optimizations to Futuremark's tests.
KainXS: it says it right there :laugh:
Funny how, when you leave off that one word "often", it changes the meaning of the whole sentence. If only the original sentence were the one you edited it to...
HalfAHertz: Why? Isn't the driver's job to provide the best possible experience while utilizing all possible resources? If so, then this is exactly what Intel has done.

Now, on the other hand, if it does indeed lower the quality of the final product, and this wasn't stated anywhere, then by all means they should get sued.

The thing that really bugs me here is how depressingly and catastrophically bad Intel's IGPs are in reality.
Agreed. As far as I've read, the article mentions nothing about Intel actually lowering image quality. It seems their mistake was offloading the work to the CPU, which is also against the rules. This has nothing to do with lowering the quality of the final product.

If it helps the shitty performance of Intel's IGPs, I say let them do it, but they should remove the optimization from the Futuremark exes, just to adhere to the rules.
1Kurgan1: It's not providing the best experience. It's downgrading the image to get a better score, and that's only in Futuremark apps. So you think, "wow, this IGP is amazing," then you hit some games and get slapped in the face, as the Futuremark score led you to believe it was far more powerful.
Maybe I missed something in the article; where does it say that it is downgrading the image to get a better score?
#17
mastrdrver
wiak: Intel IGPs have always been called integrated crapstics; even two-generation-old AMD IGPs outclass Intel's finest :nutkick:
I always get a laugh when Intel GPUs are talked about in reference to gaming.
#18
h3llb3nd4
doesn't really matter guys, it's not like you're gonna play COD with an IGP
#19
Assimilator
This would be understandable if Intel IGPs were any good, but they aren't, so all this does is make Intel look stupid. It would, however, go a long way to explaining why Intel's IGP drivers are consistently a pile of suck.

And for heaven's sake, they use the EXE filename to detect when to make their IGP look better... a half-decent first-year university student wouldn't use such a simplistic approach...
#20
newtekie1
Semi-Retired Folder
Assimilator: And for heaven's sake, they use the EXE filename to detect when to make their IGP look better... a half-decent first-year university student wouldn't use such a simplistic approach...
EXE-name-based optimization is a pretty common practice for ATi and nVidia... How many times do we see suggestions to rename the EXE to get better performance or Crossfire/SLi support when a new game comes out? You want to know why that works? Because the drivers detect the EXE name and apply optimizations.

It just happens to be against Futuremark's rules. Though logically, I have to wonder how much sense that rule makes. I mean, they are allowed to do it in real games, and 3DMark is supposed to be a benchmark that measures game performance... so why can't they apply the same optimizations to 3DMark as they do to real games?
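Conceptually, those per-application driver profiles amount to a lookup keyed on the process's executable name, which is exactly why renaming the EXE changes behaviour: the lookup simply misses. The sketch below is illustrative only; the entries and flags are made up, not actual ATI/NVIDIA driver code:

```python
import os
import sys

# Illustrative per-application profile table, keyed on executable name.
# Real profiles live inside the driver; these entries are invented.
APP_PROFILES = {
    "crysis.exe":        {"sli_mode": "AFR",  "shader_replacements": True},
    "3dmarkvantage.exe": {"sli_mode": "AFR2", "shader_replacements": True},
}

DEFAULT_PROFILE = {"sli_mode": "single", "shader_replacements": False}

def profile_for(process_path: str) -> dict:
    """Pick an optimisation profile by exe name; a renamed exe
    falls through to the generic default profile."""
    exe = os.path.basename(process_path).lower()
    return APP_PROFILES.get(exe, DEFAULT_PROFILE)

print(profile_for(sys.executable))  # unrecognised name -> default
```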
#21
aj28
newtekie1: so why can't they apply the same optimizations to 3DMark as they do to real games?
Because the idea is to benchmark the GPU only. If you're benchmarking the GPU + CPU, how are users supposed to know which is doing more of the work? The idea of the synthetic benchmark is to take all other elements out of the equation and analyze the GPU's raw power, which is why we still run them instead of only game benchmarks. What if Intel doesn't optimize for a game you play? Well, you're out in the cold, because you bought an IGP which you thought performed 25% better than it really does.
#22
Jakl
Intel... How am I not surprised... Trying to outdo every other company with these little tweaks to make themselves look better...
#23
newtekie1
Semi-Retired Folder
aj28: Because the idea is to benchmark the GPU only. If you're benchmarking the GPU + CPU, how are users supposed to know which is doing more of the work? The idea of the synthetic benchmark is to take all other elements out of the equation and analyze the GPU's raw power, which is why we still run them instead of only game benchmarks. What if Intel doesn't optimize for a game you play? Well, you're out in the cold, because you bought an IGP which you thought performed 25% better than it really does.
Well... that isn't really the idea behind benchmarking. Yes, that is what they have become, thanks to Futuremark turning it into more of a competition than a true benchmark. However, benchmarking is supposed to give you an idea of game performance; that is what 3DMark started out as. If the benchmark were truly all about raw GPU power, then there wouldn't be CPU tests included in it.

And we all know the various cards perform differently in various games. So the argument that you bought something because you thought it performed better based on one benchmark doesn't work. Just an example of the flaw in your logic: what if I went and bought an HD 4890 because it outscores a GTX 260 216 in 3DMark06... but when I fired up Far Cry, the GTX 260 performed better? It is all about optimizations these days, and no one should be buying a card based on 3DMark scores... to do so is silly.

Now, if they applied this optimization to just 3DMark, I would say it is wrong; however, it has been applied to most games as well. So, IMO, it isn't really that wrong.
#24
W1zzard
you would be surprised what else, other than the exe name, you could use for app detection - things that pretty much nobody on the planet is ever gonna figure out
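One example of what W1zzard is alluding to: fingerprinting the binary's contents instead of its name, so a rename changes nothing. A minimal sketch, with a placeholder hash rather than any real 3DMark digest:

```python
import hashlib

# Hypothetical table mapping content hashes to applications. The hash
# below is a placeholder, not a real 3DMark Vantage digest.
KNOWN_APPS = {
    "0f343b0931126a20f133d67c2b018a3b": "3DMark Vantage",
}

def identify(exe_path: str):
    """Fingerprint a binary by its content rather than its filename,
    so renaming the exe doesn't dodge detection."""
    with open(exe_path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    return KNOWN_APPS.get(digest)  # None if unrecognised
```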
#25
Tartaros
Well, I don't think blaming Intel IGPs for being so bad at gaming is right; they're not for gaming and never pretended to be. They are very cheap IGPs with very low power consumption and low heat output, so someone who doesn't play games doesn't need the power of an ATI or NVIDIA part: less money, more battery. We usually forget that; they are fine for what they were created for.