Saturday, February 9th 2008

ATI RV770 'On Par' With Expectations

With the launch of the GeForce 9 series getting closer and closer, AMD is hard-pressed to find something to keep itself competitive. While the RV670 and R680 are regaining some much-needed market share, both will pale once the GeForce 9 series is released to the public. Thankfully, AMD is not going down without a fight: around the same time the GeForce 9 series arrives, AMD is releasing a little something called the RV770. At this point, it appears that the RV770 is about 50% faster than the current HD3870, which is certainly respectable. How it compares to the GeForce 9 series is still a mystery. The release of CrossFire X technology ought to really help benchmark numbers, assuming AMD can make buying four AMD GPUs cost about as much as two from NVIDIA.
Source: Nordic Hardware

57 Comments on ATI RV770 'On Par' With Expectations

#26
eidairaman1
The Exiled Airman
Well, notice those charts: the majority of players won't see much difference, because the game moves at a crawl if it dips below 30 FPS.
btarunr: 2x 0.50 > 4x 0.25.
While the baseball example did sound good, let me give a more natural one. Say a game that's unoptimised for a multi-GPU setup comes out tomorrow; what happens? The game would only exploit one of the four cards, so you get 0.25. But had the unit been 0.50, at least an unoptimised game would run better. So it's important that ATI works on a strong single GPU rather than making powerful solutions out of multiple weak single units. We've pretty much seen that happen with the Crysis benches for the HD 3870 X2:


^Look at what an unoptimised game can do. So it's important to have stronger units. Let's face it, the G92 is a superior core to the RV670; we've seen enough people attest to that. It's just that in an SLI array it slightly falters, but that's the platform to blame, not the cards. In a 32-lane SLI chipset like the nForce 590 SLI or the 680i SLI, the northbridge and southbridge give out 16 lanes each to a card, while in a 32-lane CrossFire-compliant chipset like the AMD 790FX or the Intel X38, the northbridge supplies all 32 lanes. So an SLI of two G92-based cards falters slightly against a CrossFire of two RV670s, but if NVidia does its internal SLI on the 9800 GX2 well, we'll have a card way more powerful than the HD 3870 X2. Reason: the use of a powerful unit core. And supposing the 9800 GX2 does end up facing an unoptimised game, as the HD3870 X2 did, it will perform better due to its stronger unit core.
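The 2x 0.50 vs 4x 0.25 argument above can be sketched as a toy model (the numbers are the post's illustrative units, not real benchmark figures):

```python
# Toy model of the point above: per-GPU strength matters more than GPU count
# when a game is unoptimised for multi-GPU. Unit values are illustrative only.

def effective_throughput(unit_strength: float, num_gpus: int, gpus_used: int) -> float:
    """Throughput delivered when a game can only drive `gpus_used` of the GPUs."""
    return unit_strength * min(num_gpus, gpus_used)

# Two strong units vs four weak units, same total "paper" throughput of 1.0.
# Optimised game (all GPUs used): both setups deliver 1.0.
assert effective_throughput(0.50, 2, 2) == effective_throughput(0.25, 4, 4) == 1.0

# Unoptimised game (one GPU used): the stronger unit wins.
print(effective_throughput(0.50, 2, 1))  # 0.5
print(effective_throughput(0.25, 4, 1))  # 0.25
```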
Posted on Reply
#27
xfire
Take the concept of SLI/CrossFire:
two 8800 Ultras are better than two 8800 GTXs.
Also, isn't the GX2 their flagship?
Posted on Reply
#28
ChillyMyst
Supreme Commander is old and was never optimized for SLI or CF; game devs today are fully aware they need to start supporting multi-GPU solutions.

Also note that Supreme Commander STILL has bugs, despite all the patches it's already received... (I know, it's one of my favorite games.)

Now take a look at the resolutions you're talking about: nobody I know games at 2048x1536. Add to that the fact that you're slapping 4x AA onto cards KNOWN to take a notable hit from AA, and your example isn't very valid.

Ultra-high-res gameplay is still not mainstream and won't be for a while to come. 1920x1080 (1080p) still isn't even that common; hell, the PS3 and Xbox 360 stutter in many games if you try to run them higher than 720p, and those are dedicated gaming machines.

As to your assertion that it's the platform to blame, and that CF needs 32 lanes from the video card ports to perform better than SLI, I've got to call BULLSHIT on that. I'm sure somebody will eventually back me up on this, but CF DOESN'T need 32x (16x per card) to perform better than SLI; in fact 8x per card is PLENTY, due to how ATI/AMD use the INTERNAL BRIDGE. CF is just the superior dual/multi-card design, probably because ATI started from scratch while NVIDIA just tried to copy how two Voodoo2 cards linked up. (Same reason the FX line sucked: they tried to mix 3dfx designs/tech/ideas with NVIDIA's own designs.)

Sure, with AA cranked the 8800GT/GTS are better, at least until you hit CF/SLI; then things change, because in this case CF is better by design.

And I must say again that the use of Supreme Commander is a bad choice. The game needs a lot more tweaking and also needs driver optimizations done for it, but there's really no point, because no normal person is going to have a monitor that does 2048x1536. A 1080p monitor/HDTV is already quite pricey; any higher and they are just downright unaffordable for 99% of the population.

I just love how people like you bring out insanely-high-res benchmarks that NOBODY REALLY GAMES AT to discredit companies. I say the same thing when I see ATI fans do it to NVIDIA products, and it's happened a lot: X1900/1950 vs 7800/7900/7950 cards, for example, or the constant CF vs SLI wars, where NVIDIA only wins if it's a far more expensive set of cards on a far more expensive board.

Want a good example? RD580 board vs 590 SLI 32x: I can get a good 580 CF board for 80 bucks or less; I can't get a 590 x32 board for less than 130. That's a bit of a price difference, and you NEED the 32x for SLI to show proper benefit. With CF, on the other hand, you don't; you could use ANY CF-compatible motherboard and it will give you damn near the same performance boost (given the BIOS is good and the system parts are on par).

You seem to think I hate NVIDIA. I don't; I just hate how nvidiots always badmouth ATI/CrossFire, or how they drag out benches done at INSANE settings, at insane resolutions, on games that are known to have scaling issues, and use that as a way to discredit the company they don't like.

I'm using an 8800GT as I type this, and it's a fast card, but it also has its flaws (omg, now I'll have more nvidiots say I'm an NVIDIA hater...) just as the 38*0 cards have their flaws. The only thing I can say with absolutely no qualms: NVIDIA's 8800GT cooler is a PIECE OF SHIT AND THEY SHOULD BURN IN HELL FOR PUTTING IT ON THESE CARDS!!! At least the 38*0 cards' stock coolers don't suck that bad; in fact all the ones I have seen are pretty decent. The 3rd-party solutions that cost in the $40 range are better, but that's normally true.

The way the world's going, both companies will end up going multi-core instead of just continuing the older brute-force methods of more pipes and more shaders. Really, it gets silly, the way NVIDIA gets to their next gen: like the 6 to 7 series, just make it beefier and clock it higher, or the GF3, where they just did a die shrink and then upped the clocks!!!

Not that ATI hasn't done the same thing in the past; the X800 was very close to just being 2x 9800XTs in one chip.

Blah, all this arguing over something that doesn't matter that much. I see the writing on the wall here, though: we will see more cores per card and more chips per card from now on, and most likely we'll end up seeing quad-core GPUs eventually from both companies.

All game devs are having to learn to multi-thread their apps for dual core (the new normal chip in systems) and even quad core (the next step, which will become normal in a few years).

The same will happen with GPUs: they will have to make games that take advantage of the changing processing environment of multi-core CPU and GPU tech.
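The dual-core point above can be made concrete with Amdahl's law: the speedup from extra cores is capped by the fraction of each frame's work that stays single-threaded (the parallel fractions below are made up for illustration):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the work and n is the number of cores. Fractions are made up.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# An engine that only parallelises half its frame work barely gains from a
# quad core; one that parallelises 90% gains much more.
for p in (0.5, 0.9):
    print(f"p={p}: dual={amdahl_speedup(p, 2):.2f}x quad={amdahl_speedup(p, 4):.2f}x")
```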
Posted on Reply
#29
btarunr
Editor & Senior Moderator
ChillyMyst: As to your assertion that it's the platform to blame, and that CF needs 32x (16x per card) to perform better than SLI, I've got to call BULLSHIT on that... CF is just the superior dual/multi-card design.
I'm comparing the platforms there, not saying the cards need all 32 lanes. CF is better than SLI because in 32-lane SLI setups the HyperTransport chipset bus gets congested.
ChillyMyst: Sure, with AA cranked the 8800GT/GTS are better, at least until you hit CF/SLI; then things change, because in this case CF is better by design.
What was I talking about?
ChillyMyst: And I must say again that the use of Supreme Commander is a bad choice... no normal person is going to have a monitor that does 2048x1536.
Crysis?
ChillyMyst: I just love how people like you bring out insanely-high-res benchmarks that NOBODY REALLY GAMES AT to discredit companies...
Fine, let's put up not-so-insane resolutions of an un-optimised game:
ChillyMyst: Want a good example? RD580 board vs 590 SLI 32x: I can get a good 580 CF board for 80 bucks or less; I can't get a 590 x32 board for less than 130...
Oh, the AMD 580X based boards were expensive when they came out; the one ASUS released had a launch price of $160, which came down to $130, and it's lower now. There's just one board, made by ECS, that sells for peanuts, and that's justified because it's a crappy board in terms of build quality. What triggered the price fall in 580X based boards is that not many were opting for CrossFire solutions on AM2; it's pretty obvious why.
ChillyMyst: You seem to think I hate NVIDIA. I don't; I just hate how nvidiots always badmouth ATI/CrossFire...
Okay, I'm an NVidiot, and it's justified, because NVidia NEVER disappoints the people who admire it. A high-end GPU is expected to do it all; I want to test a GPU at the highest resolution, with the highest texture-filter/AA settings, else how would I deem it the best? The Radeon X1900 and X1950 passed all tests with flying colours, and that's why even 'NVidiots' show respect to them.
ChillyMyst: I'm using an 8800GT as I type this, and it's a fast card, but it also has its flaws... the same will happen with GPUs: they will have to make games that take advantage of the changing processing environment of multi-core CPU and GPU tech.
You're playing a hate-game more than putting your point across. Does using an NVidia card make a person an NVidiot? For $240, doesn't it make sense to choose the 8800GT over an HD3870? Again, on topic: ATI has to concentrate on a single powerful GPU. The low-res charts tell a story too.
Posted on Reply
#30
xfire
Didn't nvidia disappoint with their previous dual GPU?
Also, with newer driver releases the games will start to get optimised, just like the 2900.
Even the 3x series is new, so they too aren't totally optimised.
I don't think you're an NVidiot, since you too are debating getting a 3870.
So actually ATI is on par with Nvidia.
Posted on Reply
#31
btarunr
Editor & Senior Moderator
xfire: Didn't nvidia disappoint with their previous dual GPU? Also, with newer driver releases the games will start to get optimised, just like the 2900... So actually ATI is on par with Nvidia.
The 7950 GX2 was released in a Windows XP environment, and situationally it was a superior card to any of ATI's offerings at the time. So it wasn't a disappointment; it was just a way for NVidia to hold the performance leadership while it worked on the 8 series. Seriously, how much of an increment has the HD2900 gained from driver releases so far? No, ATI becomes on par with NVidia when:

1. It steps up its market share significantly.
2. It comes up with a high-end GPU that holds its crown for over four months (which the 8800 GTX/Ultra succeeded in doing).
3. It breaks the shackles of games being roped into the TWIMTBP programme and ensures there's a neutral environment for game developers (or at least a level playing field with its own developer-relations programme; whatever happened to "Get into the game"?).
Posted on Reply
#32
ChillyMyst
btarunr: Crysis?

btarunr: Fine, let's put up not-so-insane resolutions of an un-optimised game:
Yeah, and Crysis is HORRIBLY UNOPTIMIZED FOR ALL SYSTEMS; it SUCKS currently. It's actually a bit worse than Far Cry was when it came out, and that's after the first Crysis patch that was supposed to give benefits across the board... Was that bench you listed done with patched or unpatched Crysis? (I really want to know; I think one of the update's main points was to make dual GPUs work better.)

Given time, ATI's and NVIDIA's drivers will mature, and the game will mature into something far more playable (as Far Cry did; I love its SP game!!!)
btarunr: Okay, I'm an NVidiot, and it's justified because NVidia NEVER disappoints people who admire it... The Radeon X1900 and X1950 passed all tests with flying colours and that's why even 'NVidiots' show respect to them.
Yes, you are. Remember the FX line? Guess you think they didn't disappoint people...

OK, here's how I tell if a card is best: I take the max res people actually play at, which would be 1080p (1920x1080), but then I remember most people are closer to the 1600x1200 or even 1280x720 (720p) range, so I compare at those resolutions, with games that have actually matured to the point of being valid bench tools. Crysis is still very, very young and needs a lot more updates before it will really be up to par for perf.

And this is BS (bullsh!t): go read some of the comments people on here and other forums have made about the 1900/1950 cards. I'm talking about nvidia users/nvidiots: they badmouthed the drivers (and still do, despite not owning a card that uses them), they badmouth the cooling, and they even try to say the 8600 is as fast as the 1900 (ROFL). Yeah, I've read it all. Next you're gonna tell me an 8600 is a better choice than a 1900-range card...
btarunr: You're playing a hate-game more than putting your point across... Again on the topic, ATI has to concentrate on a single powerful GPU. The low-res charts too tell a story.
No, I'm not hating on NVIDIA, just pointing out that your "we don't need multi-GPU solutions, we just need massive single-GPU solutions" is a common nvidiot response. Nvidiots tend to want to do what their company is famous for: brute-forcing their way through to the next gen. If you can't design a better chip, just shrink it, add more cooling, and clock it higher; maybe add more pipes as well...

As to the 8800GT or 3870, it all depends on your use. If you're gonna crank the AA, 8800GT; if you're gonna BIOS-mod/overclock it and replace the stock cooler, 8800GT. If you just want a card that's as little hassle as possible and comes with cooling that doesn't suck, 3870; if you don't care about cranking the AA up, 3870; if you watch a lot of encoded movies/videos, 3870 (the YV12 bug is annoying as hell); if you want to save power when not gaming, 3870.

I could keep going. Each card has its pluses, each has its negatives. If you haven't owned/played with a 3800 card yourself, then you really shouldn't badmouth them. I have set up both 3850 and 3870 cards (PowerColor units with ZeroTherm coolers and lifetime warranty) and compared them to my 8800GT, and to tell the truth, they each had their pluses and minuses.

First, the 8800GT was faster in benches and with AA cranked, BUT at 1600x1200 I had to crank the AA to 2x that of the ATI card to get the same quality of AA. That's annoying!!

Second, the 8800GT drivers (and all NVIDIA drivers through last year) have a YV12 color-spacing bug; they don't render it properly. This is documented; check the ffdshow wiki:
mewiki.project357.com/wiki/Ffdshow_reference
Look at the 8800 section; it explains what the problem is.

Third, the 8800GT has a crappy stock cooler unless you buy Palit or Gigabyte (or one based off their designs). The stock fan never spins up, and the card gets HOT AS HELL: 90C or higher under gaming in a COLD environment!!!

The 3870 has some drawbacks as well.

1st: CCC. I hate Catalyst Control Center!!!

2nd: a notable performance hit using AA, far higher than the equivalent NVIDIA cards.

3rd: the drivers aren't yet mature, so you run into bugs and some quirks with select games.

Both cards:

Crysis runs like ass. This is because Crysis is still in need of optimization AND it's also really a game made for the next gen of hardware, if not the gen after that; it was intentionally made with graphics, physics, and AI as crazy-high quality as possible.

Both will soon be replaced by next-gen cards.

Again, if I weren't so tired I could go on and on about both companies' cards, but there's no need; most people here know what the pluses and negatives of each side are, and if they don't, Google can cure that :P

BTW, thanks for the quote, man. I love it!!!!
Posted on Reply
#33
[I.R.A]_FBi
just wow... let's try not to act like ravenous beasts for a minute here
Posted on Reply
#34
ChillyMyst
btarunr: The 7950 GX2 was released in an environment of Windows XP... Seriously, how much of an increment has the HD2900 got with driver releases so far?
The 2900 and the whole HD2K line got nice boosts from driver updates!!! Ask their users, who tested between drivers.

The 7950 was HORRIBLE: it didn't work in MANY boards, had SHITTY driver support, NEVER got decent quad-SLI drivers, and then was dumped like a bastard stepchild the moment they got something new out.

Also, the card's design was BAD BAD BAD: two PCBs sandwiched together. Horrible!!!

I have 3 friends IRL who bought those cards; 2 of them did SLI with them. They ALL sold those cards off as soon as they realised that support would never come to fix the cards' flaws or to give quad SLI decent performance benefits.

See my quote to explain how you could think the FX line and the 7950GX2 were not disappointments...
Posted on Reply
#35
asb2106
btarunr
Yeah, those numbers are nowhere near correct; with my 3870 I got way more frames than that. I don't know where you dig these numbers up from, but those are not true.
Posted on Reply
#36
xfire
chillymist, the green makes it hard to read; use darker colours.
Also, nvidia users complain about the lack of driver updates from nvidia,
and the HD 2900 gained a lot of performance improvements after driver updates.
Posted on Reply
#37
btarunr
Editor & Senior Moderator
asb2106: Yah those numbers are nowhere near correct, with my 3870 I got way more frames than that. I dont know where you dig these numbers up from but those are not true.
I dig them up from a website called TechPowerup.

www.techpowerup.com/reviews/HIS/HD_3870_X2/7.html
Posted on Reply
#38
asb2106
btarunr: I dig them up from a website called TechPowerup.
Sorry, man, but results like that are not true; I have seen so many people post results that vary all across the board.

With my 3870 I know that I got higher results than that. Guaranteed.

I really think it's funny how a forum thread about ATI cards turns into a battle about how Nvidia is better. Who cares? It's not about that. This sounds like a battle between Mac and PC, and nobody really cares to hear it. At least not me. I'm out. PEACE.
Posted on Reply
#39
candle_86
Actually this looks similar to something I've seen: multiple GPUs to catch up, for multiple generations.

Anyone remember the Voodoo 5/4? Or the Voodoo2? 3dfx failed big time; the costs of dual GPU are higher than single because of PCB and memory complexity.
Posted on Reply
#40
btarunr
Editor & Senior Moderator
asb2106: I really think its funny how a forum about ATI cards turns into a battle about how Nvidia is better... This sounds like a battle between mac and PC and nobody really cares to hear it.
A. I wasn't saying NV > ATI, just that ATI has to focus on making a powerful single GPU on its path to regaining performance leadership, aspirational value, and fostering engineering potential.

B. I wasn't doing NV > ATI until ChillyMyst came up with provocative words such as "NVidiots", etc.

C. I wasn't factually wrong, except for the GeForce FX part.
Posted on Reply
#41
brian.ca
It probably is a good idea to question the graphs above... I remember seeing similar results prior to the official release, then a note a few days later about an updated driver jacking up performance in a number of games.

The review over at Anandtech also supports this notion (www.anandtech.com/video/showdoc.aspx?i=3209&p=5). Where the graphs above show a mediocre (low-to-mid-teens) improvement, or actual downgrades, going from a single 3870 to an X2, Anandtech's benches show 45% and 47% performance increases from the X2.

Also, another point to consider: you said yourself that the 7950 GX2 worked to give Nvidia a temporary boost while it worked on its bigger guns. When you consider the position ATI/AMD was in 3 to 6 months ago, doesn't it make sense for them to take a route like this, which seemingly lets them produce upgrades at a decent pace and also presumably saves them money in development and production costs?

If nothing else, it's hard to deny the fact that in a very short time they were able to almost catch up and then jump ahead for the time being, while still keeping their next move in the pipeline. No doubt it will jump back and forth with Nvidia's upcoming releases and the new ATI chip, but it's hard to argue against how much better a position ATI is currently in, in part due to going this route.

The main thing to look at, I think, will be how Nvidia's sandwich design compares to ATI's 2-on-1 card. ATI is probably still "not there", but if Nvidia falters with their sandwich card (be it that the design just doesn't scale as well as ATI's, or some practicality issue like heat/noise, etc.), I think ATI will be in a very favorable position.
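The X2 numbers mentioned above can be reduced to a simple scaling-efficiency check. (The FPS values below are hypothetical; only the 45-47% uplift figure comes from the discussion.)

```python
# Scaling efficiency of a dual-GPU card: how much of the theoretical 2x
# does the X2 actually deliver over a single card? FPS numbers here are
# hypothetical; the ~45-47% uplift figure comes from the discussion above.

def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    """Fraction of the ideal 2x speedup actually realised (1.0 = perfect)."""
    return (dual_fps / single_fps) / 2.0

single, x2 = 40.0, 58.0  # hypothetical pair giving a 45% uplift
uplift = x2 / single - 1.0
print(f"uplift: {uplift:.0%}, efficiency: {scaling_efficiency(single, x2):.0%}")
# A low-teens uplift (driver-limited) versus a ~45% uplift is roughly the
# difference between ~56% and ~72% of ideal dual-GPU scaling.
```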
Posted on Reply
#42
imperialreign
Guys, look: the R700 GPU is, as we know it, a rumor.

ATI has not yet released an official statement as to its development; we've only been seeing rumors.

Rumor has it, also, that it will be a dual-core GPU.

Rumor has it that the core bridge will utilize HyperTransport.

Rumor has it that the core architecture will be more AMD-based than ATI-based.

We don't know for sure yet, any way you look at it.

Currently, ATI's GPUs have become better suited for arithmetic operations than graphics processing. Their GPUs are powerful when it comes to calculations and whatnot, but their graphical weakness has been becoming more and more apparent since the introduction of the HD2K series. This is a big reason why nVidia has gained such a massive lead over ATI: their processors are better suited for graphical work. I think ATI has realised this too, and has been spending a lot of time at the drawing board. We haven't seen anything really "innovative" from ATI within the last 1.5 years (aside from the 3870X2, but that's nothing they haven't done before), and considering how tight-lipped they've been over the R7xx processors, it wouldn't surprise me if they've spent a lot of time building something from the ground up. Only time will tell.

And until we see some actual proof, speculation will just be that. No need to get your thongs in a bunch over it.
Posted on Reply
#44
kapeeteeleest_peeg
nVidia vs ATi

I really don't know why you guys argue about who is best all the time. I owned the original Riva TNT when nVidia broke through over ten years ago; I owned the 9700 Pro when ATi decided to make a decent GPU almost 5 years ago. Recently, I've owned both the 2900XT and the 8800GT.

This generation, nVidia has 'shaded' it in terms of performance. I will say, though, that ATi is far superior in terms of image quality.

As for what's next?
Well, I wouldn't underestimate AMD/ATi.
Besides, they make CPUs, chipsets, and GPUs, as does Intel.

I can see nVidia being squeezed out by these two, eventually.
I haven't bought an nVidia-based motherboard for over a year and a half, and really, why would I, when Intel makes the best chipsets for its CPUs...
Posted on Reply
#45
asb2106
imperialreign: guys, look - the R700 GPU is, as we know it, a rumor... We don't know for sure yet, any way you look at it.
The fact of the R700 is not a rumor; it is coming out. What it is and what it can do is still a rumor.
Posted on Reply
#46
WarEagleAU
Bird of Prey
Sounds interesting. If it's a value core (which is usually what the V equates to) with 50% gains over the HD3870 and HD3870X2, we are looking at some truly remarkable architecture here.
Posted on Reply
#47
AddSub
Great news, if true. Maybe ATI will finally release a card that will be worth my time again. They really need to boost their ROP counts, which is something that hasn't happened in 4-5 years, because the whole unified shader architecture will take a GPU only so far.

Oh, btarunr, if the 9800GX2 is the best nVidia can come up with, and it seems like that IS the case, then chances are their stock is going to take a dive in the coming months. One big-ass dive.
Posted on Reply
#48
OrbitzXT
Are there benchmarks or numbers for the 9800 GX2 or any of the new cards coming out? I keep hearing news about how nVidia is going to blow AMD and the 3870X2 away. I speculate they will, but I have nothing to base that on; I'm still confused by all this talk that seems more or less certain the 3870X2 will be beaten.
Posted on Reply
#49
magibeg
I don't know how this topic turned into a massive flame war, but if ATI can roll out a high-end card 50% faster than the 3870, it sounds extremely tempting to sell what I have now and get one of those :P. Also, could we stop flaming about 2 cores vs 1 core vs octo-cores or whatever? The fact of the matter is that, in terms of efficiency, we will have to move to multi-core CPUs and GPUs one day. Yes, it's true that currently there aren't a lot of software/games that properly support it, but it's a work in progress. Just please stop fighting about something no one can win.
Posted on Reply
#50
ChillyMyst
candle_86: actully this looks similar to something ive seen, multiple GPU's to catch up... anyone rember Voodoo 5/4? Or voodoo2? 3dfx failed big time, the costs of dual GPU are more expensive than single because of PCB and memory compelxity
Actually, the dual Voodoo2 combo was VERY popular in my experience. I knew a lot of people who got one card and later got another. The problem was that 3dfx sucked at driver support; they never made a full OpenGL or D3D driver for those cards, and the only thing they truly rocked at was Glide...

And the Voodoo 3/4/5 all sucked: 16-bit dithered instead of true 32-bit colour... Oh, and don't get me started on the Banshee cards...
Posted on Reply