
MSI "Big Bang" P55 Motherboard Implements Lucid Hydra

If the chip's so shit hot, then why not have it on an X58 platform instead of this midrange garbage?

I'm gonna get sooo flamed for sayin' that :D :P

There will be a 2nd gen of X58 boards, you know ;)

But this is shit hot if it works and scales, man :rockout:
 
"
The Lucid Hydra chip, as mentioned before should enable the use of 2 different video cards together on the same motherboard.
We have been told in the past that there will be different versions of the chip - one will enable the usage of video cards like
Radeon HD 4870 along with a HD 4650 for example , another version will enable the usage of NVIDIA based
video cards together - like a GTX260 together with 9600GT for example, and of course one version will enable a mix of video cards -ATI video card
together with a Nvidia video card on the same motherboard.
The Hydra chip is responsible for the Balance and link between the different video cards."


So if it's the version that mixes, that will suck ass... NVIDIA just killed that -doh- unless Lucid is going to make some drivers...


But the MAIN thing about this is not mixing cards or anything along those lines; it's the new algorithm they use to run SLI/CrossFire, which should be much improved (if implemented like they said), and we should see some good 85-90%+ scaling in dual-GPU configurations.
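
To put that scaling claim in numbers (my own back-of-the-envelope math, not Lucid's figures), here's what it would mean for frame rates:

Code:
# Back-of-the-envelope math for dual-GPU scaling. The 60 fps baseline
# and the efficiency values are made-up numbers for illustration only.
def dual_gpu_fps(single_gpu_fps, scaling_efficiency):
    # The second GPU contributes only a fraction (scaling_efficiency)
    # of a full GPU's worth of extra performance.
    return single_gpu_fps * (1.0 + scaling_efficiency)

for eff in (0.60, 0.85, 0.90):  # typical CF/SLI vs. the claimed Hydra scaling
    print(f"{eff:.0%} scaling: 60 fps -> {dual_gpu_fps(60.0, eff):.0f} fps")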
 
I have a driver combination at work that will run NVIDIA and ATI cards with both drivers on the one PC without trouble (not working together to scale performance, but running at the same time and stable). They're fairly old drivers now and it's XP Pro 32-bit; I'll find the version numbers and let you guys know :D
Installed on an old premium Socket 939 Gigabyte mobo.
 

Yes, but can you guarantee that *ANY* driver combination will work?

Anyway, I don't see why we need to try so hard to mix video cards from ATI and NVIDIA... it's not as if we've fixed every other technical problem in the world and this is the one thing keeping us from achieving ascension.

I don't think you guys have seen the actual demo... There is no balancing like what SLI and CF do (splitting a frame into regions, or alternate-frame rendering). The chip actually splits the scene into its objects: in the demo (it was some kind of FPS game), the walls and the gun were rendered by one of the cards, while the ceiling, floor and other objects were rendered by the other card. The Lucid chip then reassembled the scene from the rendered objects.
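
Roughly, the idea would look something like this (a sketch of my understanding of the demo, with a made-up cost model and greedy assignment; nothing here is Lucid's actual code):

Code:
# Sketch of object-level load balancing as described above. The render
# cost numbers and the greedy heuristic are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    cost: float  # estimated cost to render this object

def split_by_object(scene, n_gpus=2):
    # Greedy balancing: hand each object to the least-loaded GPU.
    buckets = [[] for _ in range(n_gpus)]
    loads = [0.0] * n_gpus
    for obj in sorted(scene, key=lambda o: o.cost, reverse=True):
        i = loads.index(min(loads))
        buckets[i].append(obj)
        loads[i] += obj.cost
    return buckets

scene = [SceneObject("walls", 4.0), SceneObject("gun", 1.5),
         SceneObject("ceiling", 2.0), SceneObject("floor", 2.0)]
for gpu, objs in enumerate(split_by_object(scene)):
    print(f"GPU {gpu}: {[o.name for o in objs]}")
# A compositor would then merge the per-GPU images (using depth) into one frame.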

I find this method insanely complicated, and even if it works I think there will be complications until the end of time. For example, it was not explained whether AA modes actually work. Can you think of a good way of mixing ATI's AA modes with NVIDIA's AA modes? Can you think of a good way of mixing models rendered by an ATI card with models rendered by an NVIDIA card? I mean, they use very different ways of optimizing a scene. I know ATI cards can actually skip rendering the parts of objects that would not be visible in a scene, overriding what the engine tells them to render (the reason why so many new games had rendering errors with older drivers). Can you think how you'd ensure the quality level set up in the ATI driver matches the one in the NVIDIA driver?

You can't mix two drivers with a Lucid chip. This is the first time I've heard about a miracle Lucid chip version that would do this, and I've been listening very closely and following this from day one. You can mix them up now, on non-Lucid boards, but I don't think you get any actual benefit while gaming.
 
Well, yes, you can't mix two drivers with Lucid's chip, but if we had newer and... better, more unified drivers, it might work, even though that would lead to a lawsuit.
 
I'd really like to see that board reviewed here on TechPowerUp, when it's out of course... then we'll all be sure what it does and doesn't do.
 
Check my sig. You can use them together.

But I think this is vaporware.

Looks like there's already a decent prototype there -- can it still be vaporware?
 
I'd really like to see that board reviewed here on TechPowerUp, when it's out of course... then we'll all be sure what it does and doesn't do.

Second that, let's start a petition :D :respect:
 
This is the best news I've heard recently! :) This board looks fantastic!! I want to see benchies from you guys :), so please get hold of one, plus an LGA 1156 CPU :)
 
M~ Awesome name...
 
Second that, let's start a petition :D :respect:

Awesome! Be sure to donate some money so a complete i5 rig can be purchased, and I will be happy to review it. :toast:
 
Sexy, sexy, and sexy. Now, if only everyone would follow ASUS's route with the Q-Shield, which is more comfortable and makes installing the backplate a hell of a lot easier.
 
Looks like there's already a decent prototype there -- can it still be vaporware?

Well, I didn't read much other than the summary. What I mean is that using ATI and NVIDIA together is vaporware.
 
I'd prefer Intel or AMD to buy it and put it in their chipsets :D
 
This Lucid chip is a GPU gateway. ATI must buy this company, or NVIDIA will, and integrate it inside the GPU itself!

Oh noes... Lucid is the next Ageia :eek:

I bet you're right... NV will buy these guys out; it makes a hell of a lot more sense than PhysX, which is a floppity flop.
 
I don't think PhysX is that much of a flop, it just needs to be adopted more... but I hope Lucid won't be bought by another big chip company, since then only one camp would get to "enjoy" all the multi-GPU magic, if it works.
 
I don't think PhysX is that much of a flop, it just needs to be adopted more... but I hope Lucid won't be bought by another big chip company, since then only one camp would get to "enjoy" all the multi-GPU magic, if it works.

Well, it's becoming a really big thing. Most of the new titles are promising PhysX (or some form of physics acceleration, and there are no Havok titles announced, so it must be PhysX); they will work without it, but it's not the exact same experience.

All the games that are ported from the PS3 will use PhysX, and some titles ported from the PS3 can already be seen working flawlessly with NVIDIA cards and working very badly on ATI cards.

I am hoping for OpenCL to take off and become the standard physics acceleration technique, and for everyone to forget about PhysX and Havok. But for now, NVIDIA has the upper hand, which is why I use NVIDIA cards in my work and gaming systems.

Anyway, I don't see Lucid being bought out by ATI or NVIDIA, but if the stuff works, Intel already has its claws into it, since they are an important investor, as I stated a few posts back.
 
3 PCI-E slots! Don't expect a bottleneck with Core i5 CPUs.
 
Windows 7 supports loading separate display drivers using WDDM 1.1.

Windows 7 supports heterogeneous graphics adapters using WDDM 1.1 model drivers, but Vista does NOT. Windows 7 can run, for example, ATI + NVIDIA WDDM 1.1 drivers simultaneously (just search the web and you will find such setups running).
So in Windows 7, Lucid's Hydra 200 can run ATI + NVIDIA cards.

Check this link: http://www.anandtech.com/video/showdoc.aspx?i=3646&p=1
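
If you want to check what's loaded on your own box, you can enumerate the display adapters straight from Windows. A minimal sketch using the Win32 EnumDisplayDevices API via Python's ctypes (Windows-only, and just a quick diagnostic, nothing Hydra-specific):

Code:
# Lists every display adapter Windows knows about, so you can see
# whether drivers from two different vendors are active at once.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128)]

dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
i = 0
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    print(i, dev.DeviceString)  # the adapter's marketing name, one per adapter
    i += 1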
 
This is the news I've been waiting for. It's excellent to see that the Hydra chips are making it onto (almost) current motherboards, especially a mainstream/gamer/enthusiast board! I cannot wait to pick one up. :toast:

Now I want to know how things will work on the driver side, or whether drivers will be needed at all.

That's the interesting thing. While it doesn't use SLI/CrossFire drivers, I can't see how it would not use standard drivers. It's a puzzle I'm not willing to drop a couple bucks on to test.
 
Douuuuble posting away. Alright, after some reading I am convinced there will be no performance hit while using the Hydra. The reason is that unlike a northbridge or other I/O gateway, the Hydra is actually an SoC (system on a chip), meaning it has a complete CPU to handle all the tasks required of it. I'm not sure what this means for heat, but instructions sent to the GPU are intercepted and rewritten, then sent out to the GPUs, and the GPUs return the data to the Hydra chip to be recomposed and output to the display.

It's specced to deal with more than four GPUs' worth of work, and the interesting part is that unlike traditional CrossFire or SLI scaling, this can actually produce greater-than-100% performance gains. Hard to believe? Well, instead of split-screen rendering, the GPUs are given tasks that would normally be hindered by having to render other parts or effects of the scene. Imagine having a single GPU dedicated to particles and the impact that would have in your favorite FPS.
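
In rough Python pseudocode, my own illustration of that intercept -> rewrite -> dispatch -> recomposite flow (the task split here is a made-up heuristic, not Lucid's actual logic):

Code:
# Illustrative only: model the Hydra flow of intercepting draw commands,
# assigning them to GPUs by task type, and recompositing the results.
def intercept(commands):
    # Rewrite/tag each command with the GPU best suited to run it:
    # geometry-heavy work to GPU 0, effects like particles to GPU 1.
    return [(cmd, 0 if cmd["kind"] == "geometry" else 1) for cmd in commands]

def dispatch(tagged):
    results = {0: [], 1: []}
    for cmd, gpu in tagged:
        results[gpu].append(f"rendered:{cmd['name']}")  # stand-in for real GPU work
    return results

def recomposite(results):
    # The chip merges the per-GPU outputs back into a single frame.
    return [img for gpu in sorted(results) for img in results[gpu]]

frame = [{"kind": "geometry", "name": "level"},
         {"kind": "particles", "name": "explosion"}]
print(recomposite(dispatch(intercept(frame))))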

Article 1 (Hydra Explained): http://techgage.com/article/lucid_hydra_engine_multi-gpu_technology/1
Article 2 (Hydra Explained with demo): http://www.pcper.com/article.php?aid=607
Article 3 (Absolutely sick): http://www.legitreviews.com/article/1093/1/

::EDIT:: Seems the Hydra chip only draws 7 W, load or idle, which is interesting to know for heat.
 
This is really cool. I can't wait for it; I may have to sell my i7 setup to try this out :p
(that is, if it makes it to retail)

blasphemy Jesse, blasphemy. :roll:

But on a serious note, Hydra seems to be damn awesome. My interest is piqued.
 