Tuesday, June 26th 2007

NVIDIA Hybrid SLI Technology

NVIDIA is planning a new technology called Hybrid SLI, which can boost graphics performance on systems that pair NVIDIA's IGP (integrated graphics processor) with a discrete GPU, and can also increase the battery life of notebook PCs. Hybrid SLI turns off the discrete GPU and runs only the IGP when the user is working in ordinary 2D applications. When the user runs a 3D application, the technology automatically turns the discrete GPU back on to boost performance, with no need to reboot the system or manually switch between the two graphics processors. NVIDIA expects to use the new technology in both desktop and notebook PCs. It is expected to appear by the end of the year, according to Jen-Hsun Huang, chief executive of NVIDIA Corp.
Source: DigiTimes
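
As an illustration of the idea described above, here is a minimal, hypothetical sketch (in Python) of the kind of switching decision a Hybrid SLI-style driver would have to make: render on the IGP for ordinary desktop work and power the discrete GPU up only while a 3D application is running. All class and method names are invented for the example; NVIDIA has not published how its implementation actually works.

# Hypothetical sketch of Hybrid SLI-style switching logic - for illustration only;
# the real driver behaviour is not public and these names are invented.

class StubGPU:
    def __init__(self, name):
        self.name = name

    def power_on(self):
        print(self.name + ": powered on")

    def power_off(self):
        print(self.name + ": powered off")

    def take_over_rendering_from(self, other):
        print(self.name + ": now rendering (handed off from " + other.name + ")")


class HybridGraphicsManager:
    """Keeps the IGP active for 2D work; wakes the discrete GPU only for 3D."""

    def __init__(self, igp, discrete_gpu):
        self.igp = igp
        self.discrete_gpu = discrete_gpu
        self.discrete_active = False

    def on_workload_change(self, active_3d_contexts):
        # Called whenever an application creates or destroys a 3D rendering context.
        if active_3d_contexts > 0 and not self.discrete_active:
            # A 3D application started: spin up the discrete GPU, no reboot needed.
            self.discrete_gpu.power_on()
            self.discrete_gpu.take_over_rendering_from(self.igp)
            self.discrete_active = True
        elif active_3d_contexts == 0 and self.discrete_active:
            # Back to the 2D desktop: hand rendering back to the IGP and power down.
            self.igp.take_over_rendering_from(self.discrete_gpu)
            self.discrete_gpu.power_off()
            self.discrete_active = False


# Example: desktop work stays on the IGP; launching a game wakes the discrete GPU.
manager = HybridGraphicsManager(StubGPU("IGP"), StubGPU("discrete GPU"))
manager.on_workload_change(active_3d_contexts=1)  # game launched
manager.on_workload_change(active_3d_contexts=0)  # game closed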

37 Comments on NVIDIA Hybrid SLI Technology

#1
Wile E
Power User
Nifty. I like the idea.
#2
xylomn
Should help boost the battery life of high-performance notebooks, which is good...
#4
mdm-adph
Hmm.... a neat idea, but if you're going to only use the IGP during 2D animation, why not get rid of it entirely and use the CPU to process what's on the screen? With today's incredibly fast processors, it shouldn't be that much of a hassle, should it?
#5
Casheti
Giovanni: Si, è una tecnologia molto utile. Nvidia è una compagnia forte. :rockout: [Translation: Yes, it's a very useful technology. Nvidia is a strong company.]
Agreed.
#6
WarEagleAU
Bird of Prey
Definitely sounds interesting. IGP and discrete card. Sounds pretty sweet. Just wonder how game performance is when the IGP borrows from system memory.
#7
Conti027
I think it would be a little more of a problem than it would help. Don't ask why.
#8
mdm-adph
WarEagleAU: Definitely sounds interesting. IGP and discrete card. Sounds pretty sweet. Just wonder how game performance is when the IGP borrows from system memory.
I'd assume it'd release it when you load a 3D game.
#9
hat
Enthusiast
Giovanni: Si, è una tecnologia molto utile. Nvidia è una compagnia forte. :rockout:

Uhh... what?
#10
lemonadesoda
CRAP IDEA

Basically it's a kludge because NVidia can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVidia should pull their finger out and do the same.
#11
KennyT772
Why is it a crap idea to shut down a 100W draw when it's twiddling its thumbs? Integrated is a lot better than it used to be and the demand has shown it. I would buy into this on the ATI side.
#12
Grings
Hybrid Graphic System
Sony's revolutionary Hybrid Graphic System lets you set your graphics performance. A simple hardware switch enables you to toggle between an internal graphics chip for optimal power consumption with excellent performance and an external graphics chip for even more robust performance, for unmatched control of your time and output
This has been around a while; I'm sure I saw another laptop (not the Sony referred to in the quote) that had a 7900 Go but fell back on the Intel onboard in 2D. I think it's just a case of NVIDIA giving the technology a name (and probably a patent too).
#14
Solaris17
Super Dainty Moderator
Wile E: Nifty. I like the idea.
I'd have to agree.
#15
Mussels
Freshwater Moderator
Grings: This has been around a while; I'm sure I saw another laptop (not the Sony referred to in the quote) that had a 7900 Go but fell back on the Intel onboard in 2D. I think it's just a case of NVIDIA giving the technology a name (and probably a patent too).
That laptop had a physical switch to do it, but the laptop had to be powered off to change it.

To those who don't get this, a few facts.

8800GTX uses ~180W of power at most.
X2900XT uses 250W of power at most.

Onboard video - 30W would be a good figure.

Swapping to the onboard saves power and means less heat in the PC (therefore less cooling). The main 3D card will live a longer life, and the greenies won't hunt you down and murder you in your sleep for contributing to global warming.

I'd be glad for a way to reduce the 200W power draw of my PC at idle (it's 298W under load), as it makes it too expensive to leave on downloading overnight.
#16
hat
Enthusiast
Undervolt the video card.
#17
Mussels
Freshwater Moderator
hat: Undervolt the video card.
Hat... we can't. NVIDIA/ATI might be able to, but they don't seem to want to. This tech is mainly for laptops, where battery life is king, but it helps desktops too.
#18
tkpenalty
lemonadesoda: CRAP IDEA

Basically it's a kludge because NVidia can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVidia should pull their finger out and do the same.
Sometimes fanboys can't resist taking down the enemy's good ideas... c'mon man, give NVIDIA a break. :laugh:

Now I wish I had an onboard IGP..
#19
nora.e
how much "hard core" gaming actualy gets done on a notebook anyway??????
(I actualy don't know, not a trick question)
#20
Mussels
Freshwater Moderator
lemonadesoda: CRAP IDEA

Basically it's a kludge because NVidia can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVidia should pull their finger out and do the same.
I want you to show me a single AGP 8x/PCI-E 16x 3D video card from Intel. Intel have clock and voltage throttling down to a dream? Since when? AMD have Cool'n'Quiet on their CPUs; Intel's method barely saves any power at all (no voltage control, only multiplier adjustments). Intel don't even MAKE video cards, unless you count onboard.

It seems you already have this technology implanted in your head, as your brain switched off the minute you hit the reply button.

Edit:
Giovanni: Si, è una tecnologia molto utile. Nvidia è una compagnia forte. :rockout:
"Yes, this technology is very useful. Nvidia is a strong company" - is that translation right? I don't speak... Spanish?
Anyway, please speak English on the forums :) Thank you.
#21
hat
Enthusiast
lemonadesoda: CRAP IDEA

Basically it's a kludge because NVidia can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVidia should pull their finger out and do the same.
Uhh, dude...
Video cards don't have FSBs or multipliers. The most they can do is lower the voltages and clocks of the video card when not under strain, but this technology NVIDIA has is still better than that, because try as you will, you won't get a discrete video card running at 30~50 watts like an integrated graphics chip.

fanboy.
#22
macci
Onboard video - 30W would be a good figure.
less than 10W TDP would be a realistic figure... (and idle is obviously a lot less)
#23
kwchang007
Mussels: (no voltage control, only multiplier adjustments)
They have voltage control. When it steps down the multiplier it will also step down voltages.
Mussels: Onboard video - 30W would be a good figure.
I hope to God not. My laptop has a C2D (mobile) with a TDP of 35 watts, and my power adapter is 65 watts. So if the CPU drew its full 35 watts, that would leave only 30 watts for the rest of the system. Keep in mind I have low-end discrete graphics. That's 30 watts for graphics, monitor, WiFi, HDD, optical drive, north bridge, south bridge, USB power, NIC, and maybe a few other things.
#24
Mussels
Freshwater Moderator
kwchang007: They have voltage control. When it steps down the multiplier it will also step down voltages.


I hope to God not. My laptop has a C2D (mobile) with a TDP of 35 watts, and my power adapter is 65 watts. So if the CPU drew its full 35 watts, that would leave only 30 watts for the rest of the system. Keep in mind I have low-end discrete graphics. That's 30 watts for graphics, monitor, WiFi, HDD, optical drive, north bridge, south bridge, USB power, NIC, and maybe a few other things.
The voltage control doesn't work on any of the C2D systems I've used, and with my power meter I've seen barely a 5-10W drop in power use (stock and OC'd tests; these were all E6600s on Asus boards).

AMD at least can drop to 1GHz (not just two multiplier notches down) and the voltage drops significantly - I've seen a 20-30W reduction in power use with AMD's CnQ.

As for my 30W comment - I meant desktop onboard, and should have said <30W. Also, I meant peak, not TDP or average. I've got an NVIDIA 6150 LE onboard in a PC here and it uses 15W or so (compared to a PCI card with 512K RAM ;) ), but yeah, I guess <15W is better with the trend towards passive heatsinks on mobos these days. (For laptops, of course, lower is better.)
#25
newtekie1
Semi-Retired Folder
mdm-adph: Hmm.... a neat idea, but if you're going to only use the IGP during 2D animation, why not get rid of it entirely and use the CPU to process what's on the screen? With today's incredibly fast processors, it shouldn't be that much of a hassle, should it?
The graphics load is still high enough to warrant a dedicated GPU. Even simple tasks like scrolling in a window and moving windows around bog down without some kind of graphics acceleration.