# MSI "Big Bang" P55 Motherboard Implements Lucid Hydra



## btarunr (Aug 11, 2009)

MSI already has its work cut out for when Intel's first socket LGA-1156 processors hit stores. With the entry-level P55-CD53, mid-range P55-GD65, enthusiast-range P55-GD80, and the micro-ATX P55M-GD45 on offer, the lineup seems just about complete, except for two mysterious motherboards that aren't part of the list. The first is the G9P55-DC, which packs an NVIDIA BR-03 bridge chip to enable 3-way SLI with better interface bandwidth to the three graphics cards; the second is under the looking-glass today. Codenamed "Big Bang", this prototype motherboard by MSI packs LucidLogix's Hydra technology, which, on paper at least, is the next big thing as far as multi-GPU systems go. 

MSI's P55 "Big Bang" looks similar to the P55-GD80, except that under the top chipset heatsink (which, by the way, is purely cosmetic on the GD80) sits a Lucid Hydra chip. The chip connects to all three (or four) PCI-Express x16 slots (lane configuration not known) and enables Lucid's multi-GPU technology, which lets you build practically any combination of graphics cards for performance scaling. The member cards needn't be matched in performance, as the Hydra chip handles all the load-balancing by itself. Products based on Hydra are slowly but surely showing up in small numbers, including enterprise-grade rack-mount graphics rendering boxes like this one, conceived a long time ago. A lot of details are yet to emerge, especially whether more motherboard manufacturers are eyeing Hydra, when a Hydra-based product will actually make it to shelves, and, more importantly, when MSI plans to sell this board and the G9P55-DC.
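As a rough way to picture the load-balancing claim (purely illustrative; Lucid has not disclosed how Hydra actually schedules work, and all names below are hypothetical), think of render tasks being handed out in proportion to each card's measured throughput, so mismatched cards still pull their weight:

```python
# Toy sketch of speed-weighted load balancing across mismatched GPUs.
# This is NOT Lucid's algorithm, just an illustration of the idea.

def balance(tasks, gpus):
    """Split a list of tasks across GPUs weighted by relative speed."""
    total_speed = sum(speed for _, speed in gpus)
    shares = {name: [] for name, _ in gpus}
    # Walk the task list, always topping up the GPU that is furthest
    # behind its fair (speed-weighted) share of the work.
    for task in tasks:
        name, _ = min(
            gpus,
            key=lambda g: len(shares[g[0]]) / (g[1] / total_speed),
        )
        shares[name].append(task)
    return shares

work = [f"draw_call_{i}" for i in range(12)]
cards = [("fast_card", 3.0), ("slow_card", 1.0)]  # hypothetical 3:1 speed ratio
plan = balance(work, cards)
print({k: len(v) for k, v in plan.items()})  # fast card ends up with 3x the tasks
```

The point of the sketch is only that the scheduler, not the cards, decides the split, which is why performance parity between the member cards wouldn't matter.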



 

 



*View at TechPowerUp Main Site*


----------



## mlee49 (Aug 11, 2009)

Well, well, well... Pairing an ATI and an Nvidia card for performance scaling.  Sounds suspicious to me; weird Nvidia hasn't filed a lawsuit over this yet.

:skeptical:


----------



## erocker (Aug 11, 2009)

This is the news I've been waiting for. It's excellent to see that the hydra chips are making it onto current (almost) motherboards, especially a mainstream/gamer/enthusiast board! I cannot wait to pick one up. 

* Now I want to know how things will work on the driver side, or if drivers will be needed at all?


----------



## mtosev (Aug 11, 2009)

does it blow up to have the name > Big bang?


----------



## DrPepper (Aug 11, 2009)

Totally getting this for testing purposes


----------



## btarunr (Aug 11, 2009)

erocker said:


> This is the news I've been waiting for. It's excellent to see that the hydra chips are making it onto current (almost) motherboards, especially a mainstream/gamer/enthusiast board! I cannot wait to pick one up.
> 
> * Now I want to know how things will work on the driver side, or if drivers will be needed at all?



From what little is known, the individual cards remain abstracted, and their drivers act as subsets of Lucid's driver.


----------



## DrPepper (Aug 11, 2009)

Do you know how it works at all ? I mean seems pretty crazy to get two different architectures to work together.


----------



## btarunr (Aug 11, 2009)

DrPepper said:


> Do you know how it works at all ?



Do you know any better? If so, enlighten.


----------



## DrPepper (Aug 11, 2009)

btarunr said:


> Do you know better? If so, enlighten.



You're the expert, I was asking if you knew how it works


----------



## btarunr (Aug 11, 2009)

DrPepper said:


> You're the expert, I was asking if you knew how it works





> The HYDRA Engine is the first solution that "plays well with others." Unlike other technologies, it is completely compatible with all gaming applications, chipsets and GPUs from any vendor, so you can develop a totally customizable PC solution. Mix and match elements into your gaming system to achieve the price and performance level that's just perfect for you. And developers no longer have to write games and applications specific to a chip. Whether the API is OpenGL or Direct3D, the HYDRA Engine can tackle both.



So far, this is their investor-bait: http://www.lucidlogix.com/technology/technologies.html


----------



## DrPepper (Aug 11, 2009)

I just find it mind boggling. It just seems too good to be true.


----------



## 1Kurgan1 (Aug 11, 2009)

Crazy, but when will we see it on an AMD board? I would like flexibility like Intel chipsets have atm (granted I know this is better).


----------



## Easo (Aug 11, 2009)

This is so strange... but i definitely want this to live!


----------



## KainXS (Aug 12, 2009)

I really, really hope this chip succeeds; I get tired of motherboard manufacturers prioritizing crossfire and sli on certain boards

and it came just in time for direct x 11 too XD

I remember they said they would make a pci/pci-e add-in card version of this; I would buy it in a minute if this is good.


----------



## hat (Aug 12, 2009)

Big Bang because it will make a Big Bang in the legal world when it blows up into a lawsuit?

Either way, if this actually does work, BIG props to the engineers behind this, but I just can't see it working... I suspect lots of crashes, blue screens, performance issues, etc. However, they would have to make their own drivers for running Nvidia and ATi cards together, since you can't have Nvidia and ATi drivers running together on Vista or 7.

Hm... blurry picture is blurry. Could this be a fake?


----------



## Polarman (Aug 12, 2009)

Why would anyone put two different video drivers on the same pc? 

Big Bang BSOD!


----------



## KainXS (Aug 12, 2009)

I was thinking that too, but when looking at the chip all I could think is: man, that's pretty flat. it looks a bit too flat and dull, like paper or something,

maybe it's just my eyes lol


----------



## newtekie1 (Aug 12, 2009)

This sounds amazing, I definitely will be looking more into this.  Does it apply to games, or just other tasks like 3D Modeling?



mlee49 said:


> Well, well, well... Pairing a Ati and Nvidia card for performance scaling.  Sounds suspicious to me, weird Nvidia hasn't filed a lawsuit on this yet.
> 
> :skeptical:



ATi seems to be the one that likes to throw around lawsuits recently, or at least the threats of lawsuits...so why did you pick nVidia as the one to sue over this?


----------



## mlee49 (Aug 12, 2009)

I'm merely playing it up, but someone is gonna get pissy about misuse of their product.  Surely the scaling will be marginal at first until drivers are fully developed, hacked, and rereleased.


----------



## LittleLizard (Aug 12, 2009)

i agree with all that's been said about the drivers, but it should work kinda like when u have an ati and a geforce for physx


----------



## boomstik360 (Aug 12, 2009)

This is really cool. I can't wait for it, I may have to sell my i7 setup to try this out 
(that is if it makes it to retail)


----------



## phanbuey (Aug 12, 2009)

If the Lucid driver sits on top of the gfx drivers, then neither Windows nor the other driver sets will know about their co-existence...

Seems like a better implementation of SLI/CFX - something that won't depend on games being optimized or not.  Amen to that.


----------



## TheGuruStud (Aug 12, 2009)

hat said:


> Big Bang because it will make a Big Bang in the legal world when it blows up into a lawsuit?
> 
> Either way, if this actually does work, BIG props to the engineers behind this, but I just can't see it working... I suspect lots of crashes, blue screens, performance issues, etc. However, they would have to make their own drivers for running Nvidia and ATi cards together, since you can't have Nvidia and ATi drivers running together on Vista or 7.
> 
> Hm... blurry picture is blurry. Could this be a fake?



Check my sig. You can use them together.

But I think this is vaporware.


----------



## jagl4d (Aug 12, 2009)

> as the Hydra chip does all the load-balancing by itself



An added stage means another source of performance degradation. 
A penny-pinching enthusiast's mobo: parting with moolah$$ so as to use low-end gpus.


----------



## livehard (Aug 12, 2009)

mlee49 said:


> Well, well, well... Pairing a Ati and Nvidia card for performance scaling.  Sounds suspicious to me, weird Nvidia hasn't filed a lawsuit on this yet.
> 
> :skeptical:



mhm


----------



## enzolt (Aug 12, 2009)

about time we heard about the elusive lucidlogix hydra again. it's been a while since the last update. definitely looking forward to this as well as the pci/pci-ex implementation.


----------



## Mussels (Aug 12, 2009)

never heard of this before, but i liiiiike it


----------



## AsRock (Aug 12, 2009)

mtosev said:


> does it blow up to have the name > Big bang?



Think they mean BIG Bang as a new beginning.


Sounds cool hopefully it will work out for the better.  Shame there is no numbers.


----------



## qwerty_lesh (Aug 12, 2009)

if the chips soo sh(crash)it hot then why not have it on an X58 platform instead of this midrange garbage?

im gonna get sooo flamed for sayin that


----------



## Mussels (Aug 12, 2009)

qwerty_lesh said:


> if the chips soo sh(crash)it hot then why not have it on an X58 platform instead of this midrange garbage?
> 
> im gonna get sooo flamed for sayin that



1366 = old and busted
1156 = new hotness



You don't get all the cool accessories on the old 'n busted.

That's what you get for being an early adopter.


----------



## qwerty_lesh (Aug 12, 2009)

Mussels said:


> 1366 = old and busted
> 1156 = new hotness
> 
> 
> ...



rofl, mate.. its still a better platform


----------



## buggalugs (Aug 12, 2009)

OH wow. Anandtech had a story about this a while ago. If this thing actually works it's gonna be kickass and will eliminate all the usual multi-GPU problems and increase performance over standard SLI or Crossfire.

 It's strange though, because Nvidia were talking about their "big bang" coming soon, but maybe it's different from this "Big Bang".

 I hope it's as good as they say.


----------



## Mussels (Aug 12, 2009)

better or not, it's the older one. old 'n busted.

1156 is will smith, and therefore gets the new stuff 



nvidias big bang and big bang II were different things


----------



## Error 404 (Aug 12, 2009)

Not sure if anyone has asked previously, but where is the NB? The Lucid chip is right where the NB is, so unless they're integrated together then this mobo is missing a NB... :\


----------



## Kantastic (Aug 12, 2009)

I really wonder how this is going to work... it seems possible since they have their own exclusive drivers. 
Anybody want to guess how Nvidia/ATi are taking it?


----------



## phanbuey (Aug 12, 2009)

Error 404 said:


> Not sure if anyone has asked previously, but where is the NB? The Lucid chip is right where the NB is, so unless they're integrated together then this mobo is missing a NB... :\



NB is on the chip... 1156 has the NB on-die.  Also, the Lucid chip sits behind the first PCI-e slot... the "NB" is like a glorified SB, in the same place the NB has always been.


----------



## buggalugs (Aug 12, 2009)

I'm sticking with 1366; you would have to be nuts to downgrade to 1156 (unless this Lucid thing works out). I usually upgrade every time something new comes out, but in this case it would be a slight downgrade, so this 1366 system has held its value and been worth every cent.

 Plus I have already had a year's worth of kickass performance while peasants have still got socket 775 boards. (lol im joking, im a peasant too, but i spend all my money on computers)


----------



## Mussels (Aug 12, 2009)

Error 404 said:


> Not sure if anyone has asked previously, but where is the NB? The Lucid chip is right where the NB is, so unless they're integrated together then this mobo is missing a NB... :\



1156 has no NB 
its part of the 'new hotness'


----------



## phanbuey (Aug 12, 2009)

Mussels said:


> 1156 has no NB
> its part of the 'new hotness'


----------



## Frick (Aug 12, 2009)

Ah, so the chip is completed then? I remember reading about this hydra thing some time ago here on TPU.. Should be interesting to see how well it does.


----------



## Bjorn_Of_Iceland (Aug 12, 2009)

Sounds more like "Bug Bang"


----------



## tastegw (Aug 12, 2009)

i really don't know what to think of this news... so many unanswered questions.

but i would *LOVE* to see this work in real life.

285+4770+voodoo3?  lol 

no bridges? the drivers seem like a lot of work, and overclocking them? lots of questions here.

but if they get this to plug and play with only one driver and no bridge(s)... amazing.


----------



## parelem (Aug 12, 2009)

the gpu industry will really be interesting to watch if hydra works as they claim, eliminates quite a bit of competition.


----------



## Sihastru (Aug 12, 2009)

During the Lucid Demo presentation it was stated that ATI + nVidia configurations will *NOT* work. What Lucid promised is *close to 100% scaling* for SLI or CF. They specifically mentioned that you won't be able to run different drivers at the same time. All they said was that you can use *ANY* two or more nVidia *OR ANY* two or more ATI cards, and the end result will be that the system's performance numbers will be the SUM of the performance numbers of the individual cards.

Don't go dreaming about ATI+nVidia LOVE, it won't happen.

Also, there is no point in a lawsuit from ATI or nVidia, since both CF and SLI are already licensed for all LGA1366 and LGA1156 platforms. I believe it was announced yesterday. It is also the reason why you won't see Lucid chips on AMD motherboards any time soon. Another reason would be that Intel Capital is one of the key investors in Lucid alongside Giza Venture Capital and Genesis Partners. Until you see AMD/ATI on that list, you're out of luck.


----------



## Mussels (Aug 12, 2009)

thanks, good info.


----------



## AltecV1 (Aug 12, 2009)

Sihastru said:


> During the Lucid Demo presentation it was stated that ATI + nVidia configurations will *NOT* work. What Lucid promised is *close to 100% scaling* for SLI or CF. They specifically mentioned that you won't be able to run different drivers at the same time. All they said was that you can use *ANY* two or more nVidia *OR ANY* two or more ATI cards and the end result will be that the system performance numbers will be the SUMS of the performance numbers of the individual cards.
> 
> Don't go dreaming about ATI+nVidia LOVE, it won't happen.



i'd just like to add that it is not their fault, it is windows' fault (can't run different video drivers at the same time)


----------



## Mussels (Aug 12, 2009)

AltecV1 said:


> i'd just like to add that it is not their fault, it is windows' fault (can't run different video drivers at the same time)



it's not windows' 'fault' - it's microsoft's GREAT CHOICE.


Half the reason for windows XP installs corrupting was two video drivers loading at once - and i don't mean ATI + nvidia, i mean half of one nvidia driver and half of another.

why do you think so many driver-cleanup tools were created? to wipe them all out and solve the problem. MS *fixed* this problem by only allowing one to load at a time.


----------



## KH0UJ (Aug 12, 2009)

I just hope this will not create a bridge for both camps to unite (Nvidia+ATI); if that happens, I think there will be no more price competition, just a price monopoly :shadedshu


----------



## AltecV1 (Aug 12, 2009)

Mussels said:


> its not windows 'fault' - its microsofts GREAT CHOICE.
> 
> 
> Half the reason for windows XP installs corrupting, was two video drivers loading at once - and i dont mean ATI + nvidia, i mean half of one nvidia driver and half of another.
> ...



you say great choice, i say fault anyway lets stick with the thread


----------



## Mussels (Aug 12, 2009)

AltecV1 said:


> you say great choice, i say fault



i'd say enjoy your BSOD's, but as you use vista you're safe 
anything that makes my OS more stable, is a win IMO.


agreed: anyone else got info on this? Sihastru has some interesting info a few posts up


----------



## InTeL-iNsIdE (Aug 12, 2009)

qwerty_lesh said:


> if the chips soo sh(crash)it hot then why not have it on an X58 platform instead of this midrange garbage?
> 
> im gonna get sooo flamed for sayin that



There will be a 2nd gen of X58 boards, you know 

But this is shit hot if it works and scales, man


----------



## jamesrt2004 (Aug 12, 2009)

"The Lucid Hydra chip, as mentioned before, should enable the use of 2 different video cards together on the same motherboard. We have been told in the past that there will be different versions of the chip - one will enable the usage of video cards like a Radeon HD 4870 along with an HD 4650, for example; another version will enable the usage of NVIDIA-based video cards together - like a GTX260 together with a 9600GT, for example; and of course one version will enable a mix of video cards - an ATI video card together with an Nvidia video card on the same motherboard. The Hydra chip is responsible for the balance and link between the different video cards."


so if it's the version that mixes, that will suck ass... nvidia just killed that -doh- unless lucid is going to make some drivers....


but the MAIN thing about this is not mixing cards or anything along those lines, it's the new algorithm they use to calculate running sli/x-fire, which should be much improved (if implemented like they said), and we should see some good 85/90%+ scaling in dual gpu configurations


----------



## qwerty_lesh (Aug 12, 2009)

I have a driver combination at work which will run Nvidia and ATI cards with both drivers on the one PC without trouble - not working together to scale performance, but running at the same time and stable. They're fairly old drivers now and it's XP Pro 32-bit; I'll find the version numbers and let you guys know 
installed on an old Socket 939 premium Gigabyte mobo.


----------



## Sihastru (Aug 12, 2009)

qwerty_lesh said:


> I have a driver combination at work which will run Nvidia and Ati cards with both drivers on the one PC without trouble, not working together to scale performance but running at the same time and stable, they're fairly old drivers now and its XP Pro 32bit, I'll find the version numbers and let you guys know
> installed on an old Socket 939 premium gigabyte mobo.



Yes, but can you guarantee that *ANY* driver combination will work?

Anyway, I don't see why we need to try so hard to mix video cards from ATI and nVidia... it's not like we've fixed all the technical problems in the world and this is the one thing that keeps us from achieving ascension.

I don't think you guys have seen the actual demo... There is no balancing like SLI and CF do it (splitting up a frame into regions or rendering alternate frames). The chip is actually splitting the scene into its objects; in the demo, the walls and the gun (it was some kind of FPS game) were rendered by one of the cards, while the ceiling, floor and other objects were rendered by the other card. The Lucid chip then reassembled the scene from the rendered objects.

I find this method insanely complicated, and even if it works I think there will be complications until the end of time. For example, it was not explained whether AA modes actually work. Can you think of a good way of mixing AA modes from ATI with AA modes from nVidia? Can you think of a good way of mixing models rendered by an ATI card with models rendered by an nVidia card? I mean, they use very different ways of optimizing a scene. I know ATI cards can actually skip rendering the parts of objects that would not be visible in a scene, overriding what the engine tells them to render (the reason why so many new games had rendering errors with older drivers). Can you think how you'll ensure the quality level set up in the ATI driver will match the one in the nVidia driver?

*You can't mix two drivers with a Lucid chip.* This is the first time I've heard about a miracle Lucid chip version that would do this, and I've been listening very closely and following this from day one. You can mix them up now, on non-Lucid boards, but I don't think you can actually get any benefit while gaming.
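For what it's worth, the object-level split from the demo can be pictured with a toy sketch (all names here are hypothetical; this is a guess at the idea, not Lucid's implementation): each card renders its share of the objects with depth values, and a final pass merges the outputs the way a Z-buffer would on a single card:

```python
# Toy illustration of object-level scene splitting and recombination.
# Hypothetical names throughout; not Lucid's actual pipeline.

def render(objects):
    """Pretend-render: each object yields a (pixel, depth, color) sample."""
    return [(obj["pixel"], obj["depth"], obj["color"]) for obj in objects]

def composite(*layers):
    """Merge per-GPU outputs by keeping the nearest sample per pixel,
    the same occlusion test a Z-buffer performs on one card."""
    frame = {}
    for layer in layers:
        for pixel, depth, color in layer:
            if pixel not in frame or depth < frame[pixel][0]:
                frame[pixel] = (depth, color)
    return {p: c for p, (d, c) in frame.items()}

scene = [
    {"pixel": (10, 10), "depth": 5.0, "color": "wall"},
    {"pixel": (10, 10), "depth": 2.0, "color": "gun"},
    {"pixel": (20, 20), "depth": 9.0, "color": "floor"},
]
# Split the scene by object across two cards, render separately, recombine.
gpu_a = render([o for o in scene if o["color"] in ("wall", "floor")])
gpu_b = render([o for o in scene if o["color"] == "gun"])
final = composite(gpu_a, gpu_b)
print(final)  # the gun occludes the wall at (10, 10)
```

Even in this toy form you can see the open questions: the merge only works if both cards agree on depth, filtering and AA behavior for every sample.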


----------



## KainXS (Aug 12, 2009)

well, yes, you can't mix 2 drivers with lucid's chip, but if we had newer and better, more unified drivers it might work, even though that would lead to a lawsuit.


----------



## buggalugs (Aug 12, 2009)

Here's some good info on what it's about; it's about a year old though.

http://www.anandtech.com/showdoc.aspx?i=3385


----------



## soldier242 (Aug 12, 2009)

i'd really like to see that board reviewed here on techpowerup, when it's out of course ... then we will all be sure what it does and doesn't do


----------



## mdm-adph (Aug 12, 2009)

TheGuruStud said:


> Check my sig. You can use them together.
> 
> But I think this is vaporware.



Looks like there's already a decent prototype there -- can it still be vaporware?


----------



## Sihastru (Aug 12, 2009)

soldier242 said:


> i'd really like to see that board reviewed here on techpowerup, when it's out of course ... then we will all be sure what it does and doesn't



Second that, let's start a petition


----------



## $ReaPeR$ (Aug 12, 2009)

this is the best news i've heard recently!  this board looks fantastic!! i want to see benchies from you guys, so please find one + a 1156 cpu


----------



## h3llb3nd4 (Aug 12, 2009)

M~ Awesome name...


----------



## erocker (Aug 12, 2009)

Sihastru said:


> Second that, let's start a petition



Awesome! Be sure to donate some money so a complete i5 rig can be purchased and I will be happy to review it.


----------



## WarEagleAU (Aug 12, 2009)

Sexy, sexy, and sexy. Now, if only everyone would follow Asus' route with the Q-Shield, which is more comfortable and a hell of a lot easier when installing the backplate.


----------



## TheGuruStud (Aug 12, 2009)

mdm-adph said:


> Looks like there's already a decent prototype there -- can it still be vaporware?



Well, I didn't read much other than the summary. What I mean is that using ATI and nvidia together is vaporware.


----------



## filip007 (Aug 14, 2009)

This LUCID chip is a GPU gateway; ATI must buy this company, or nVidia will, and integrate it inside the GPU itself!


----------



## Mussels (Aug 14, 2009)

i'd prefer intel or AMD to buy it, and put it in their chipsets


----------



## phanbuey (Aug 14, 2009)

filip007 said:


> This LUCID chip is a GPU gateway; ATI must buy this company, or nVidia will, and integrate it inside the GPU itself!



Oh noes... Lucid is the next Ageia 

I bet you're right... nv will buy these guys out; makes a hell of a lot more sense than physX, which is a floppity flop.


----------



## soldier242 (Aug 14, 2009)

I don't think PhysX is that much of a flop, it just needs to be adopted more .... but i hope Lucid won't be bought by another big chip company, since then only one camp can "enjoy", if it works, all the multi-GPU magic


----------



## Sihastru (Aug 14, 2009)

soldier242 said:


> I don't think PhysX is that much of a flop, it just needs to be adopted more .... but i hope Lucid won't be bought by another big chip company, since then only one camp can "enjoy", if it works, all the multi-GPU magic



Well, it's becoming a really big thing; most of the new titles are promising PhysX (or some form of physics acceleration, and there are no Havok titles announced, so it must be PhysX). They will work without it, but it's not the exact same experience.

All the games that are ported from the PS3 will use PhysX, and some titles ported from the PS3 can already be seen working flawlessly with nVidia cards and working very badly on ATI cards.

I am hoping for OpenCL to take off and become the standard physics acceleration technique, and for everyone to forget about PhysX and Havok. But for now, nVidia has the upper hand, which is why I use nVidia cards in my work and gaming systems.

Anyway, I don't see Lucid bought out by ATI or nVidia, but, if the stuff works, Intel already has its claws into it since they are an important investor, as I stated a few posts back.


----------



## Hayder_Master (Aug 17, 2009)

3 or 4 PCI-e slots! Don't expect a bottleneck with Core i5 CPUs.


----------



## harold2009 (Oct 27, 2009)

*Windows 7 supports loading separate display drivers using WDDM 1.1*

Windows 7 supports heterogeneous graphics adapters using WDDM 1.1 model drivers. 
Vista does NOT support heterogeneous graphics adapters. But Windows 7 can run, for example, ATI + NVIDIA WDDM 1.1 drivers simultaneously [just search the web, you will find such setups running].
So in Windows 7, Lucid's Hydra 200 can run ATI + NVIDIA cards.

Check this link: http://www.anandtech.com/video/showdoc.aspx?i=3646&p=1


----------



## Binge (Oct 27, 2009)

erocker said:


> This is the news I've been waiting for. It's excellent to see that the hydra chips are making it onto current (almost) motherboards, especially a mainstream/gamer/enthusiast board! I cannot wait to pick one up.
> 
> * Now I want to know how things will work on the driver side, or if drivers will be needed at all?



That's the interesting thing.  While it doesn't use sli/crossfire drivers I couldn't see how it would not use standard drivers.  It's a puzzle I'm not willing to drop a couple bucks on to test.


----------



## Binge (Oct 27, 2009)

Douuuuble posting away.  Alright, after some reading I am convinced there will be no perf hit while using the Hydra.  The reason is that unlike a NB or other I/O gateway, the Hydra is actually a SoC (system on a chip), meaning it has a complete CPU to handle all of the tasks it's required to do.  I'm not sure what this means for heat, but instructions sent to the GPU are intercepted and rewritten, then sent out to the GPUs, and the GPUs return the data to the Hydra chip to be recombined and output through the display.

It's specced to deal with more than 4 GPUs' worth of work, and the interesting part is that unlike traditional X-Fire or SLI scaling, this 100% scaling can actually produce greater than 100% perf gains.  Hard to believe?  Well, instead of split-screen rendering, the GPUs are given tasks that would normally be hindered by having to render other parts or effects of the scene.  Imagine having a single gpu dedicated to particles and the impact that would have in your favorite FPS.
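That intercept-rewrite-dispatch-recombine flow could be sketched like this (purely illustrative; the real chip does all of this in hardware on the PCIe command stream, and none of these names come from Lucid or the articles):

```python
# Toy pipeline for the intercept / rewrite / recombine flow described above.
# Hypothetical throughout; the dispatch policy here is a dumb round-robin,
# whereas the real logic would have to respect rendering dependencies.

def intercept(api_calls):
    """Stage 1: capture the Direct3D/OpenGL call stream before any GPU sees it."""
    return list(api_calls)

def rewrite_and_dispatch(calls, num_gpus):
    """Stage 2: rewrite the stream into per-GPU workloads."""
    workloads = [[] for _ in range(num_gpus)]
    for i, call in enumerate(calls):
        workloads[i % num_gpus].append(call)
    return workloads

def recombine(partial_results):
    """Stage 3: gather each GPU's partial output and rebuild the final frame."""
    frame = []
    for result in partial_results:
        frame.extend(result)
    return sorted(frame)

calls = ["draw_terrain", "draw_player", "draw_particles", "draw_hud"]
queues = rewrite_and_dispatch(intercept(calls), num_gpus=2)
print(queues)             # two interleaved workloads
print(recombine(queues))  # all four calls accounted for in the final frame
```

The "dedicated particle GPU" idea is just a dispatch policy in stage 2: route every particle call to one queue instead of interleaving.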

Article 1 (Hydra Explained): http://techgage.com/article/lucid_hydra_engine_multi-gpu_technology/1
Article 2 (Hydra Explained with demo): http://www.pcper.com/article.php?aid=607
Article 3 (Absolutely sick): http://www.legitreviews.com/article/1093/1/

::EDIT:: Seems the Hydra chip only draws 7W, load or idle - interesting to know for heat.


----------



## johnnyfiive (Oct 27, 2009)

boomstik360 said:


> This is really cool. I can't wait for it, I may have to sell my i7 setup to try this out
> (that is if it makes it to retail)



blasphemy Jesse, blasphemy. 

But on a serious note, Hydra seems to be damn awesome. My interest is peaked.


----------



## Binge (Oct 27, 2009)

piqued*


----------



## johnnyfiive (Oct 27, 2009)

Hrmm... what's funny, now thinking about it, is the location of the Hydra chip on the P55 boards; it's where the traditional northbridge would have been. Almost like LGA 1156 was made with the intention of adopting Hydra. I want Hydra on an X58 board.

*EDIT:*
http://www.anandtech.com/video/showdoc.aspx?i=3646


----------



## xVeinx (Oct 28, 2009)

How much latency does this introduce into the mix? Even as a SoC, you'll have latency introduced with missed cycles and such (assuming a more traditional CPU style architecture, but I could be wrong of course!). So, even if you get better load balancing, could this affect minimum framerates while boosting the average, etc.? Just a thought...


----------



## Sihastru (Oct 28, 2009)

Considering nVidia makes liberal use of its NF200 PCIe bridge chips with a lot of success (most older FSB-tech boards, some X58 boards, there's even one on board the GTX295), PCIe seems to not be much affected by latency issues. At least not with current-generation video cards.


----------



## wolf (Oct 28, 2009)

I was really keen on the idea of the original Lucid Demo, the one where the hydra chip was alone on a board with only pci-express slots, and a fat cable running back to a host card that sat in your pc's pci-express slot.

Of course I think they will sell more on a motherboard, but I really like the idea of a separate graphics subsystem altogether, with its own PSU and cooling etc. - not to mention that when you upgrade your pc, you don't need to buy another board with a Lucid chip; you can keep the extra box.

I want it so bad!


----------



## johnnyfiive (Oct 29, 2009)

That reminds me of the Sega CD or Sega 32x. It's a neat idea, but it won't go far. Having it on the motherboard is essential in production costs. Having a separate box with a PCB housing the Hydra chip, PSU, etc., would get expensive. But yes, that would be awesome having a separate box just for the video cards.


----------



## LAN_deRf_HA (Oct 29, 2009)

Mussels said:


> better or not, its the older one. old 'n busted.
> 
> 1156 is will smith, and therefore gets the new stuff
> 
> ...



Wah? 1156 is 1366's retard sibling. Not getting any cpus with more than 4 cores and it doesn't even overclock as far. Either save money and buy a 775 or go all out on a 1366. 1156 is a dead end.


----------



## Mussels (Oct 30, 2009)

it was a men in black joke.


----------



## erocker (Oct 30, 2009)

LAN_deRf_HA said:


> Wah? 1156 is 1366's retard sibling. Not getting any cpus with more than 4 cores and it doesn't even overclock as far. Either save money and buy a 775 or go all out on a 1366. 1156 is a dead end.



Well, 1156 is replacing s775 and is cheaper than 1366 so why not? Anyways, if the Hydra chip actually works well, I'm positive we'll see it on the 1366 boards.


----------



## MrHydes (Dec 18, 2009)

mlee49 said:


> Well, well, well... Pairing a Ati and Nvidia card for performance scaling.  Sounds suspicious to me, weird Nvidia hasn't filed a lawsuit on this yet.
> 
> :skeptical:



i hope Lucid can work it out


----------



## Mussels (Dec 18, 2009)

MrHydes said:


> i hope Lucid can work it out



to me, it sounds like it's dividing parts of the screen up - say if it was 2 cards, they'd each think they were driving half that resolution.

1600x1200

each card would be doing 1600x600 on its own monitor, as far as it was concerned


----------



## PaulieG (Dec 18, 2009)

LAN_deRf_HA said:


> Wah? 1156 is 1366's retard sibling. Not getting any cpus with more than 4 cores and it doesn't even overclock as far. Either save money and buy a 775 or go all out on a 1366. 1156 is a dead end.



Ugh. This is just not true. Dead end because of one very expensive cpu? Honestly, there is little difference b/t 1156 and 1366. Really, the only significant thing that 1366 gives you is a 6-core cpu that will cost you $1000. Other than that, you could argue that 1156 gives you more options to choose from, i5 or i7 chips.


----------



## ToTTenTranz (Dec 18, 2009)

Mussels said:


> to me, it sounds like its dividing parts of the screen up - say if it was 2 cards, they'd each think they were driving half that resolution.
> 
> 1600x1200
> 
> each card would be doing 1600x600 on its own monitor, as far as it was concerned



Nope, that's what SLI and Crossfire are already capable of doing, in some cases.

Lucid distributes the work by objects. One GPU handles the characters whereas the other handles the environment. This way they don't have to load the same texture data, so the system effectively doubles the available video memory with two cards.


That's why you can couple different cards. They're doing different things.


----------



## Laurijan (Dec 18, 2009)

I have no time to read the whole thread now, but I want to say that this seems really too good to be true... I will get a mobo that has this new feature before they stop being produced, because there will maybe be a lawsuit...


----------

