Tuesday, June 26th 2007
NVIDIA Hybrid SLI Technology
NVIDIA is planning a new technology called Hybrid SLI that can boost graphics performance in systems that pair NVIDIA's IGP (integrated graphics processor) with a discrete GPU, and can also extend the battery life of notebook PCs. Hybrid SLI turns off the discrete GPU and runs only the IGP when the user is working in normal 2D applications. When the user runs a 3D application, the technology automatically switches the discrete GPU back on and boosts performance, without any need to reboot the system or manually switch between the two graphics processors. NVIDIA expects to use the new technology in both desktop and notebook PCs. The technology is predicted to appear by the end of the year, according to Jen-Hsun Huang, chief executive of NVIDIA Corp.
Source:
DigiTimes
37 Comments on NVIDIA Hybrid SLI Technology
Uhh... what?
Basically it's a kludge because NVIDIA can't choke the power consumption of their discrete GPUs.
Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop work, only to kick in a Core 2 Duo for hardcore app work? Completely stupid idea. Intel got clock and voltage throttling down to a dream. NVIDIA should pull their finger out and do the same.
To those who don't get this, a few facts.
8800GTX uses ~180W of power at most.
HD 2900 XT uses ~250W of power at most.
Onboard video - 30W would be a good figure.
Swapping to the onboard saves power and means less heat in the PC (therefore less cooling). The main 3D card will live a longer life, and the greenies won't hunt you down and murder you in your sleep for contributing to global warming.
I'd be glad for a way to reduce the 200W power draw of my PC at idle (it's 298W under load), as it makes it too expensive to leave on downloading overnight.
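As a rough worked example of that overnight cost (the electricity rate and the 8-hour window here are my assumptions for illustration, not figures from the post):

```python
# Rough overnight-cost estimate for the ~200W idle draw mentioned above.
IDLE_WATTS = 200          # idle power draw from the post
HOURS_OVERNIGHT = 8       # assumed length of an overnight download
RATE_PER_KWH = 0.20       # assumed electricity price in $/kWh

energy_kwh = IDLE_WATTS / 1000 * HOURS_OVERNIGHT   # energy used overnight
cost_per_night = energy_kwh * RATE_PER_KWH         # cost at the assumed rate
print(f"{energy_kwh:.1f} kWh -> ${cost_per_night:.2f} per night")
```

Dropping from 200W to a ~30W IGP at idle would cut that figure by roughly 85% under the same assumptions.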
Now I wish I had an onboard IGP..
(I actually don't know, not a trick question)
It seems you already have this technology implanted in your head, as your brain switched off the minute you hit the reply button.
Edit: "Yes, this technology is very useful. NVIDIA is a strong company" - is that translation right? I don't speak... Spanish?
Anyway, please speak English on the forums :) thank you
Video cards don't have FSBs or multipliers. The most they can do is lower the voltages and clocks of the video card when it's not under strain, but this technology NVIDIA has is still better than that, because try as you might, you won't get a video card running at 30~50 watts like an integrated graphics chip.
fanboy.
AMD at least can drop to 1GHz (not just two multiplier notches down), and the voltage drops significantly; I've seen a 20-30W reduction in power use with AMD's CnQ.
As for my 30W comment: I meant desktop onboard, and should have said <30W. Also, I meant peak, not TDP or average. I've got an NVIDIA 6150LE onboard in a PC here and it uses 15W or so (compared to a PCI card with 512K RAM ;) ), but yeah, I guess <15W is better with the trend towards passive heatsinks on mobos these days. (For laptops, of course, lower is better.)