Wednesday, September 19th 2007

Intel X38 Supports SLI?

During his keynote today at IDF, Intel's Pat Gelsinger showed off a machine based on the company's Skulltrail enthusiast gaming platform. Skulltrail is a dual-socket platform based on the X38 chipset that supports Intel's upcoming 45 nm quad-core processors and FB-DIMM memory, and offers PCI Express 2.0 support with true dual-x16 PEG slots (or four physical x16 slots running at x8). What was interesting about the machine on display was that it used a pair of GeForce graphics cards running in SLI mode, but on an Intel chipset. According to Intel, this is a special version of Intel X38 paired with NVIDIA's nForce MCP to enable SLI support. There's no word on official SLI support with future Intel chipsets, though.
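
For the lane math behind that spec: the X38 northbridge provides 32 PCI Express lanes for graphics, which a board can wire as two electrical x16 slots or split across four slots running at x8. A minimal sketch of that budget check in Python, with the slot layouts given purely as illustration:

# Illustrative PCI Express lane budget for an X38-class northbridge.
PEG_LANES = 32  # X38 exposes 32 lanes for its graphics (PEG) slots

layouts = {
    "dual x16": [16, 16],
    "quad x8": [8, 8, 8, 8],
}

for name, slots in layouts.items():
    used = sum(slots)
    assert used <= PEG_LANES, f"{name} exceeds the {PEG_LANES}-lane budget"
    print(f"{name}: {used}/{PEG_LANES} lanes, wired as {slots}")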

Ed. by W1zzard: You can't pair X38 with any NVIDIA MCP because both are the same thing: a northbridge with a memory controller. It would be possible to put an NVIDIA southbridge on the board, but this wouldn't enable SLI, because the SB doesn't carry enough PCI-E lanes for the graphics slots. Since technically SLI works on ANY chipset, NVIDIA could enable SLI on X38, but only if an NVIDIA SB is present.
Source: HotHardware
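
To illustrate W1zzard's point that SLI is a driver-side restriction rather than a hardware one: conceptually, the driver only needs to check which bridge chips are present and refuse multi-GPU rendering unless an approved NVIDIA part is found. Below is a minimal, hypothetical sketch in Python; the whitelist contents and helper function are assumptions for illustration, not NVIDIA's actual code, though the vendor IDs are the real PCI codes for NVIDIA and Intel.

# Hypothetical sketch of a driver-side SLI gate. The whitelist entries and
# helper below are illustrative only, NOT NVIDIA's actual implementation.
NVIDIA_VENDOR = 0x10DE  # real PCI vendor ID for NVIDIA
INTEL_VENDOR = 0x8086   # real PCI vendor ID for Intel

APPROVED_BRIDGES = {
    (NVIDIA_VENDOR, "nForce 680i SLI MCP"),  # example whitelist entry
}

def sli_allowed(bridges):
    """Allow SLI only if an approved NVIDIA bridge chip is present."""
    return any(bridge in APPROVED_BRIDGES for bridge in bridges)

# A plain X38 board (Intel northbridge + Intel southbridge) is rejected:
print(sli_allowed([(INTEL_VENDOR, "X38 MCH"), (INTEL_VENDOR, "ICH9R")]))  # False
# The same board with a whitelisted NVIDIA southbridge would pass:
print(sli_allowed([(INTEL_VENDOR, "X38 MCH"),
                   (NVIDIA_VENDOR, "nForce 680i SLI MCP")]))  # True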

43 Comments on Intel X38 Supports SLI?

#1
Weer
Oh my god.
Are we looking at an X38 + 680i hybrid?
Posted on Reply
#2
TheLostSwede
News Editor
So, tell me one thing, why in god's name would Intel do this?
They'd be insane to take anything from Nvidia and put it with their chipsets.
First of all, Nvidia is using HyperTransport, Intel is not, so their chipsets aren't compatible.
Secondly, what would be the point for Intel to add what would in their eyes be a lesser chipset to one of their motherboards?
And since we know that SLI and CrossFire are done via drivers, why would Intel have to add hardware?
There have been several software hacks for SLI to work, although badly, on the 915 chipset.
The first SLI platform was actually an Intel dual CPU server board, so if it could be done then, why can't it be done now?
This sounds like a load of crap to me, and there's absolutely no reason for it to be this way. Whoever wrote the initial story doesn't know shit about chipsets or how they work.
Posted on Reply
#3
W1zzard
also imagine the amount of money nvidia lost if they enabled sli on non-nv boards .. who would buy an nvidia board if they could get one with intel chipset?
Posted on Reply
#4
Weer
W1zzard: also imagine the amount of money nvidia lost if they enabled sli on non-nv boards .. who would buy an nvidia board if they could get one with intel chipset?
How do you figure?
There is no difference between nVidia giving their 680i chipsets to ASUS to create the P5N32-E SLI and Striker Extreme, and nVidia giving it to Intel to use WITH its own chipsets.
Posted on Reply
#5
newtekie1
Semi-Retired Folder
W1zzard: also imagine the amount of money nvidia lost if they enabled sli on non-nv boards .. who would buy an nvidia board if they could get one with intel chipset?
I would. I personally don't really like Intel's chipsets all that much. My 650i and 680i boards are both far superior to my P965 board, which is actually very crappy.

Though I know there are people that prefer Intel chipsets, so I see your point.

However, I would imagine the profit loss would be made up by the number of extra graphics cards nVidia sells. Imagine all the people with Intel chipsets that would have two nVidia cards if they could.
Posted on Reply
#6
Weer
malware: Ed. by W1zzard: You can't pair X38 with any NVIDIA MCP because both are the same thing: a northbridge with a memory controller. It would be possible to put an NVIDIA southbridge on the board, but this wouldn't enable SLI, because the SB doesn't carry enough PCI-E lanes for the graphics slots. Since technically SLI works on ANY chipset, NVIDIA could enable SLI on X38, but only if an NVIDIA SB is present.
Ok, hold on just a second here.
It is my understanding that it is in fact the SouthBridge (also known as the MCP, or Media and Communications Processor) that handles the remainder of the PCIe lanes. Without it, there would be no SLI. Aren't most of the PCIe lanes routed through the SouthBridge?
Posted on Reply
#7
[I.R.A]_FBi
newtekie1: I would. I personally don't really like Intel's chipsets all that much. My 650i and 680i boards are both far superior to my P965 board, which is actually very crappy.

Though I know there are people that prefer Intel chipsets, so I see your point.

However, I would imagine the profit loss would be made up by the number of extra graphics cards nVidia sells. Imagine all the people with Intel chipsets that would have two nVidia cards if they could.
well you got a crappy board with a good chipset ... it's like good cologne on your skin after 2 weeks without bathing.
Posted on Reply
#8
Flint
[I.R.A]_FBi: well you got a crappy board with a good chipset ... it's like good cologne on your skin after 2 weeks without bathing.
:roll: Lipstick on a pig
Posted on Reply
#9
jocksteeluk
With Intel's high-end 3D cards to be released within a few years, it wouldn't surprise me to see Nvidia allowing full SLI capability on Intel. Intel could license SLI for its own high-end graphics cards in the future, or further marginalise Nvidia's SLI technology by creating its own, which could be detrimental to Nvidia's business.
Posted on Reply
#10
newtekie1
Semi-Retired Folder
[I.R.A]_FBi: well you got a crappy board with a good chipset ... it's like good cologne on your skin after 2 weeks without bathing.
The P5B is regarded by many as a very good, but low-end, P965 board. My P5B-Deluxe wasn't much better; it could only get my E4300 up to 3GHz regardless of settings, as the board just wouldn't go past 333 FSB. I sold the P5B-Deluxe, put the E4300 in a P5N-E, and got it to 3.2GHz.:nutkick:
Posted on Reply
#11
KennyT772
Weer: Ok, hold on just a second here.
It is my understanding that it is in fact the SouthBridge (also known as the MCP, or Media and Communications Processor) that handles the remainder of the PCIe lanes. Without it, there would be no SLI. Aren't most of the PCIe lanes routed through the SouthBridge?
Most PCI-E lanes are routed through the northbridge, not the southbridge. For the longest time northbridge chips only had a certain number of lanes available, so to add x16 SLI you needed the southbridge to supply the extra lanes (see the rough figures after this post).
Posted on Reply
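
To put rough numbers on the lane split described in the post above: the figures below are typical for this chipset generation and are given purely as illustration, assuming an X38 MCH (32 graphics lanes) paired with an ICH9-class southbridge (a handful of x1 lanes).

# Illustrative lane split for an X38-generation board: the northbridge (MCH)
# carries the wide graphics links, while the southbridge contributes only a
# few x1 lanes for peripherals.
northbridge_lanes = {"PEG slot 1": 16, "PEG slot 2": 16}
southbridge_lanes = {"x1 slots / onboard controllers": 6}

print("Northbridge PCIe lanes:", sum(northbridge_lanes.values()))  # 32
print("Southbridge PCIe lanes:", sum(southbridge_lanes.values()))  # 6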
#12
Solaris17
Super Dainty Moderator
Dual socket quad core running SLI? ...wouldn't mind taking that rig for a test drive.
Posted on Reply
#13
Unregistered
Strange, my P5B-Deluxe easily does 560 FSB. My 6750 is at 450 FSB now; I've never known a board with such a high FSB capability.
#14
newtekie1
Semi-Retired Folder
tigger69: Strange, my P5B-Deluxe easily does 560 FSB. My 6750 is at 450 FSB now; I've never known a board with such a high FSB capability.
Not mine, which is why I got rid of it. Oddly, my P5B does 370 just fine.
Posted on Reply
#15
Flint
Overclocking is never a sure bet. Your P5B, his P5B, and my P5B should in theory perform the same, but in reality they perform very differently... in overclocking. That is also assuming everything else but the board was identical (CPU, memory, cooling).
Posted on Reply
#16
Unregistered
I guess I was just lucky with mine. Funny, my old 6300 does 560 FSB too :)
Posted on Edit | Reply
#17
WarEagleAU
Bird of Prey
Maybe with all the hoopla surrounding ATI and their World Records on Intel chipsets, Nvidia thought it might do well to lease some of its tech to Intel and make even more money?
Posted on Reply
#18
W1zzard
Weer: How do you figure?
There is no difference between nVidia giving their 680i chipsets to ASUS to create the P5N32-E SLI and Striker Extreme, and nVidia giving it to Intel to use WITH its own chipsets.
nvidia chipset = northbridge (expensive) + southbridge (cheap)

if they sell just the southbridge nvidia makes little money and intel makes the rest
Posted on Reply
#19
Deleted member 3
Solaris17: Dual socket quad core running SLI? ...wouldn't mind taking that rig for a test drive.
Dell has i5000 workstations with Quadros in SLI.
Posted on Reply
#20
Darkrealms

I'm hearing Xfire a lot more than SLI lately. Maybe Nvidia is too. If they make SLI that much more common, it'll be just like Intel dropping prices and kicking AMD while they're down. ATI has really made no advancements that have "WOW'd" the end users. I think it's Nvidia's way of kicking ATI while they're down in this whole thing. Besides, a lot of people still think Intel chipset = work and Nvidia = gaming performance. Both will still sell fine.

{edit}W1zzard thanked for clarifying original posting{/edit}
Posted on Reply
#21
kwchang007
Darkrealms: I'm hearing Xfire a lot more than SLI lately. Maybe Nvidia is too. If they make SLI that much more common, it'll be just like Intel dropping prices and kicking AMD while they're down. ATI has really made no advancements that have "WOW'd" the end users. I think it's Nvidia's way of kicking ATI while they're down in this whole thing. Besides, a lot of people still think Intel chipset = work and Nvidia = gaming performance. Both will still sell fine.

{edit}W1zzard thanked for clarifying original posting{/edit}
That's an interesting spin on things. People are saying NV would lose money, but perhaps they would gain money because of increased revenue from gfx cards, since Intel chipsets would support SLI.
Posted on Reply
#22
SK-1
kwchang007: That's an interesting spin on things. People are saying NV would lose money, but perhaps they would gain money because of increased revenue from gfx cards, since Intel chipsets would support SLI.
That is what I am thinking.
Posted on Reply
#23
newtekie1
Semi-Retired Folder
kwchang007: That's an interesting spin on things. People are saying NV would lose money, but perhaps they would gain money because of increased revenue from gfx cards, since Intel chipsets would support SLI.
Finally, someone else that sees it my way.
Posted on Reply
#24
panchoman
Sold my stars!
Intel did recently acquire rights to use the SLI technology, so we could be looking at an Xfire & SLI chipset, or if it's SLI only, we can just use modded drivers to run Xfire, just like how we used modded drivers to use SLI on the P35 :roll:
Posted on Reply
#25
Chewy
Does SLI work well with the modded drivers? I think I may be getting a new mobo soon.
Posted on Reply