Thursday, December 21st 2006

Sapphire producing dual processor X1950pro


Sapphire Technology, a leading manufacturer of ATI graphics cards, has a new card nicknamed The Godfather. The card features two X1950pro graphics processing units and has been built to the 9" riser board standard, utilising a single x16 PCI-Express slot. According to Sapphire, this card performs similarly to two X1950pros in CrossFire. If the card uses the same speeds as standard X1950pros, it can be expected to have a core speed of 575MHz and memory at 700MHz, although these might be reduced if there are heat issues. The card appears to be CrossFire capable, so it is likely that two of them could be used together for "Quad-CrossFire".
Source: DailyTech

60 Comments on Sapphire producing dual processor X1950pro

#26
newtekie1
Semi-Retired Folder
Ketxxx: nice card, but it makes no sense to me to use two X1950Pro cores. Why not do something meaningful and use two X1950XT cores? >.<
Simple answer: Heat

Long answer: The X1950XT core, which is really just the R580 (90nm) used in the X1900XT/X and X1950XTX, puts out a lot of heat compared to the RV570 (80nm) used in the X1950Pro. The X1950XT core requires a two-slot cooler just to cool a single GPU, while the X1950Pro gets away with a single-slot cooler. So if they were to put two X1950XT cores on one card, even a double-slot cooler might not be enough to keep the card cool.
Posted on Reply
#27
InfDamarvel
Random Murderer: right, but quad cf or quad sli still won't match 2 8800's...
True, but AMD/ATI's game plan isn't always to put out the best and fastest product, but to make a product that consumers will actually buy. So if it was cheaper than two 8800GTS or GTXs then they might make a profit.... Might
Posted on Reply
#28
tkpenalty
AMD/ATI is going down the best path.... they are going to make heaps out of it...

FINALLY SOMEONE DECIDES TO MAKE A DUAL CORE GRAPHICS CARD!!! :D (imagine... Octa Crossfire, not quad)
Posted on Reply
#29
Pinchy
meh, im happy with my single X1950 PRO :p
Posted on Reply
#30
Zubasa
Pinchy: meh, im happy with my single X1950 PRO :p
Same here dude!!:rockout:
Octa-Crossfire.... I wonder when PCs will consume more power than your air con....
Posted on Reply
#31
Random Murderer
The Anti-Midas
Zubasa: Same here dude!! :rockout:
Octa-Crossfire.... I wonder when PCs will consume more power than your air con....
i give it a year and a half...
Posted on Reply
#32
Zubasa
That will be a yummy electricity bill :respect:
Posted on Reply
#33
Random Murderer
The Anti-Midas
Zubasa: That will be a yummy electricity bill :respect:
just think, there's already PSUs that pump out 1100-1200 watts, quad core procs (intel plans to release octo-core this time next year), quad SLI and now quad CF, RAID arrays, external liquid cooling systems....
Posted on Reply
#34
Pinchy
but you gotta think, as time goes on, they will make cards consume less power :p
Posted on Reply
#35
Random Murderer
The Anti-Midas
Pinchy: but you gotta think, as time goes on, they will make cards consume less power :p
in the direction they're heading, i don't think it's possible to lower the consumption... maybe maintain where it's at though...
Posted on Reply
#36
newtekie1
Semi-Retired Folder
Random Murderer: in the direction they're heading, i don't think it's possible to lower the consumption... maybe maintain where it's at though...
Sure it is. Going to 65nm should help a lot in terms of power consumption, and using more power-efficient parts should also help.
Posted on Reply
#37
Random Murderer
The Anti-Midas
newtekie1: Sure it is. Going to 65nm should help a lot in terms of power consumption, and using more power-efficient parts should also help.
it's one thing to shrink the die and keep the same clock, but if you increase the clock while shrinking the die, the decrease in power consumption will be minimal, if any.
Posted on Reply
#38
newtekie1
Semi-Retired Folder
Random Murderer: it's one thing to shrink the die and keep the same clock, but if you increase the clock while shrinking the die, the decrease in power consumption will be minimal, if any.
I guess you missed the entire switch from 110nm to 90nm? You know, how the 7900GT was just a die-shrunk 7800GTX with higher clock speeds, and yet it used about 40W less under load.

Now of course, if you really jack up the clock speed (like the 7900GTX vs. 7800GTX) the power consumption will be a lot closer, but then again, performance will be a lot better too. Just like the other areas of the computer (CPU/RAM).
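The die-shrink argument above follows the usual first-order CMOS dynamic-power relation, P ≈ C·V²·f: a smaller process lowers switching capacitance and core voltage, which can outweigh a modest clock bump. A minimal sketch with purely illustrative numbers (the capacitance scale, voltages, and clocks below are assumptions for illustration, not the real 7800GTX/7900GT specs):

```python
# Rough first-order CMOS dynamic-power model: P ~ C * V^2 * f.
# All numbers below are illustrative assumptions, not measured GPU specs.

def dynamic_power(cap_rel, voltage, freq_mhz):
    """Relative dynamic power for a given capacitance scale, core voltage, and clock."""
    return cap_rel * voltage**2 * freq_mhz

# hypothetical 110nm part: baseline capacitance, 1.40 V core, 430 MHz
p_110 = dynamic_power(1.00, 1.40, 430)

# hypothetical 90nm shrink: ~0.67x capacitance, 1.20 V core, 450 MHz (higher clock)
p_90 = dynamic_power(0.67, 1.20, 450)

print(f"90nm part draws {p_90 / p_110:.0%} of the 110nm part's dynamic power")
# → 90nm part draws 52% of the 110nm part's dynamic power
```

With these made-up figures the shrunk part lands around half the baseline's dynamic power even at a slightly higher clock, which is the shape of the 7800GTX → 7900GT result described above.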
Posted on Reply
#39
overcast
Random Murderer: in the direction they're heading, i don't think it's possible to lower the consumption... maybe maintain where it's at though...
Have you been living under a rock for the past few years? On what basis did you conclude that it's not possible for these engineers to produce efficient designs? Especially with all the advancements in CPU architectures lately.
Posted on Reply
#40
Pinchy
lawl...as time goes on, things *should*

a) get smaller
b) get cooler and
c) use less power :p
Posted on Reply
#41
Random Murderer
The Anti-Midas
Pinchy: lawl...as time goes on, things *should*

a) get smaller
b) get cooler and
c) use less power :p
that's the theory anyway...
and yet in a year a kW PSU will be the standard for us overclockers, gamers, and enthusiasts....
Posted on Reply
#42
Pinchy
lol but just cus they are making such powerful PSUs, doesn't mean we will need them. Look at my Rig#2 in my signature...no-one thought i could run such a comp on a 200W with 12A, AND overclock, yet i tried it, and it's been working fine for a couple of months now, gaming and benchmarking...
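For context on how tight that budget is, here is a quick sketch (assuming, as was typical for budget ~200W units of the era, that the quoted "12A" figure is the +12V rail rating):

```python
# Sketch: estimate the power available from a PSU rail rating.
# Assumption: the quoted "12A" is the +12V rail of a ~200W unit,
# and the CPU and GPU draw mostly from that rail.

def rail_watts(volts, amps):
    """Maximum continuous power a single rail can deliver."""
    return volts * amps

budget = rail_watts(12.0, 12.0)
print(f"+12V rail budget: {budget:.0f} W")  # → +12V rail budget: 144 W
```

144 W on the +12V rail is a tight ceiling for an overclocked gaming box, which is what makes the rig described above surprising.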
Posted on Reply
#43
Athlon2K15
HyperVtX™
i think they should wait on this and maybe make a dual-core R600? :rockout: :rockout: :rockout:
Posted on Reply
#44
stevorob
You got that right.

Should have invested the R&D in doubling up on the new chips, instead of doubling up the old chips that will be obsolete shortly.
Posted on Reply
#45
Casheti
I really want to switch to DX10 to be honest but if that means buying Vista...well, then that's a big no. F*ck Vista, and f*ck Micro$hit. It looks like if I want to get DX10 it's Vista, and since that's not going to happen I'm gonna go high end DX9. Looks like I might want one of these bad boys. Or wait for R600 and run it in DX9 mode....
Posted on Reply
#46
Easy Rhino
Linux Advocate
TXcharger: then why are they making mobos with 3 PCI-E slots? lol
cause that is where we are at right now. some people think spending $1000 on a crossfire/sli setup is a good investment for the long haul. those consumers are wrong. think about every single bit of technology. 40 years ago computers took up whole rooms. cars had massive engines to put out 350 HP. today we've got cars that put out as much power and use much less fuel. computers are hundreds of times more powerful and take up about a foot of space. technology improves efficiency. while history shows that new technology tends to be inefficient at first, it eventually improves. so i don't give crossfire/sli very long. expect more cores on your gpu in a couple of years once they figure out how to eliminate all that heat.
Posted on Reply
#47
evil bill
I don't disagree but if you adopt the "waiting for the next big thing" attitude you would never buy a new PC - there is always something better just around the corner.

There is an exception to this right now - I'd bet that if people weren't getting a free upgrade to Vista with every copy of XP purchased, nobody would be buying an OS or new PC at the moment.
Posted on Reply
#48
Easy Rhino
Linux Advocate
evil bill: I don't disagree, but if you adopt the "waiting for the next big thing" attitude you would never buy a new PC - there is always something better just around the corner.

There is an exception to this right now - I'd bet that if people weren't getting a free upgrade to Vista with every copy of XP purchased, nobody would be buying an OS or new PC at the moment.
yea, i'm definitely not advocating the "wait for the next big thing" mentality. what i'm saying is that investing in the 1000-watt PSU, the new case to make it fit and keep it cool, and the added watercooling is all a big waste. IMO buying SLI or crossfire is fine if you didn't have to spend lots of money on all those other things. but there is this new mentality accepted by consumers that this trend is going to continue, and you've got the manufacturers laughing all the way to the bank. unless you have deep pockets, one high-end graphics card really is enough on a modest power supply for at least 2 years. especially if you are an enthusiast and like to make upgrades twice a year like most of us, you're going to be buying new stuff anyway, right? am i crazy?
Posted on Reply
#49
Pinchy
i agree with rhino......i can pick up an X850 for what...$130 now?...and that plays a lot of the latest games at almost highest settings...sure it's only SM2.0...but it's only $130 :p!
Posted on Reply