Wednesday, February 10th 2010

NVIDIA Optimus Technology Delivers Perfect Balance Of Performance And Battery Life

NVIDIA Corp. announces NVIDIA Optimus technology, a breakthrough for notebook PCs that chooses the best graphics processor for running a given application and automatically routes the workload to either an NVIDIA discrete GPU or Intel integrated graphics - delivering great performance while also providing great battery life.

"Consumers no longer have to choose whether they want great graphics performance or sustained battery life," said Rene Haas, general manager of notebook products at NVIDIA. "NVIDIA Optimus gives them both - great performance, great battery life and it simply works." Just as a Hybrid car chooses between the gas-powered and electric car engine on-the-fly and uses the most appropriate engine, NVIDIA Optimus technology does the same thing for graphics processors. NVIDIA Optimus Technology instantly directs the workload through the most efficient processor for the job, extending battery life by up to 2 times compared to similarly configured systems equipped with discrete graphics processors (GPUs). When playing 3D games, running videos, or using GPU compute applications the high-performance NVIDIA discrete GPU is used. When using basic applications, like web surfing or email, the integrated graphics processor is used. The result is long lasting battery life without sacrificing great graphics performance.
"The genius of NVIDIA Optimus is in its simplicity," said Dr. Jon Peddie, President of Jon Peddie Research, a pioneer of the graphics industry and a leading analyst. "One can surf the web and get great battery life and when one needs the extra horsepower for applications like Adobe Flash 10.1, Optimus automatically switches to the more powerful NVIDIA GPU."
Notebooks with NVIDIA Optimus technology will be available shortly, starting with the Asus UL50Vf, N61Jv, N71Jv, N82Jv, and U30Jc notebooks. For more information on NVIDIA Optimus technology, visit the NVIDIA website.

17 Comments on NVIDIA Optimus Technology Delivers Perfect Balance Of Performance And Battery Life

#1
mcloughj
I wonder if Hasbro will sue!?
Posted on Reply
#2
buggalugs
I don't think having 2 GPUs is very efficient. Why not just make a decent GPU that downclocks to IGP level when it's not needed? That would be something to get excited about.
Posted on Reply
#3
[I.R.A]_FBi
Sorry the name optimus is taken and where's teh fermi?
Posted on Reply
#4
Mussels
Freshwater Moderator
[I.R.A]_FBiSorry the name optimus is taken and where's teh fermi?
its fermly on schedule for a release before 2012
Posted on Reply
#5
btarunr
Editor & Senior Moderator
mcloughjI wonder if Hasbro will sue!?
If they had named it "Optimus Prime", then yes.
Posted on Reply
#6
inferKNOX
An attempt at countering PowerPlay I take it...:ohwell:
Posted on Reply
#7
Mussels
Freshwater Moderator
inferKNOXAn attempt at countering PowerPlay I take it...:ohwell:
"we cant make power efficient cards for shit, so we're inventing this instead"
Posted on Reply
#8
[I.R.A]_FBi
Musselsits fermly on schedule for a release before 2012
it better
Mussels"we cant make power efficient cards for shit, so we're inventing this instead"
kludge ftw?
Posted on Reply
#9
Cheeseball
Not a Potato
This is HybridPower version 2, but it works with Intel GPUs.
Posted on Reply
#10
Fourstaff
I have PowerXpress, switching between 4200 and 4570 whenever I like. Took Nvidia long enough to notice that.
Posted on Reply
#11
xrealm20
Interesting, I have an old Sony Z series laptop that does something similar to this. There's a switch just above the keyboard that either disables the nVidia graphics and enables the onboard Intel, or vice versa.

It's their "stamina/performance" switch.
Posted on Reply
#12
TheLostSwede
News Editor
No need for a switch, it's all done on the fly as you "need" the Nvidia GPU, if you believe the PR.
Posted on Reply
#13
ktr
Yea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).
Posted on Reply
#14
newtekie1
Semi-Retired Folder
ktrYea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).
But even when underclocked and undervolted, they still use a huge amount of power compared to a weaker card, and that will continue to be the case until they figure out how to completely turn off parts of the silicon, which I don't think will be much longer.
Posted on Reply
#15
Mussels
Freshwater Moderator
ktrYea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).
if the intel uses 10W at load (desktop) and a G92 card can use up to 50W at idle... you get the idea for battery life. the bigger the GPU, the more this will help.


shit, i want this on desktops - i get ~250W of power draw from my two cards idling.
Posted on Reply
#16
inferKNOX
ktrYea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).
PowerPlay is an ATi tech, so nVidia can't use it and are trying to make a competitive technology with this.
I can tell you, for one, that the PowerPlay on my 5850 drops its GPU freq from my overclocked 775MHz down to 157MHz, and likewise the OC'd GDDR5 drops from 1125MHz to 300MHz, within about 1-2 secs of becoming idle, dropping the power draw by a magnitude of almost 10!
nVidia realise, I think, that in the face of this (and most potential consumers' concern for the environment), and considering that they're making another monolithic GPU with the GF100 which will no doubt draw some crazy power, they need to reassure everyone that once 10+ people switch on their GF100-rigged PCs, it won't cause a global blackout.:laugh:
Posted on Reply