# ATI RV770 'On Par' With Expectations



## zekrahminator (Feb 9, 2008)

With the launch of the GeForce 9 series getting closer and closer, AMD is hard-pressed to find something to keep itself competitive. While the RV670 and R680 are regaining some much-needed market share, they will both pale once the GeForce 9 series is released to the public. Thankfully, AMD is not going down without a fight: at about the same time as the GeForce 9 series arrives, AMD is releasing a little something called the RV770. At this point, it appears that the RV770 is about 50% faster than the current HD 3870, which is certainly respectable. How it compares to the GeForce 9 series is still a mystery. The release of CrossFire X technology ought to really help benchmark numbers, assuming AMD can make buying four AMD GPUs cost about as much as two from NVIDIA.

*View at TechPowerUp Main Site*


----------



## btarunr (Feb 9, 2008)

50% faster than the R670? That might not match the performance gains people expect the GeForce 9 to have over the current line-up. If four ATI cards cost as much as two NVIDIA cards, then assuming the two setups perform equally, I'd choose the NVIDIA setup over ATI for obvious heat/power/space/motherboard-price/game-optimisation reasons.


----------



## Ketxxx (Feb 9, 2008)

On a card-by-card basis, ATI offers a far better price/performance ratio, and better drivers. Power-consumption issues can be curbed by reducing GPU voltage and clocks in 2D mode.


----------



## springs113 (Feb 9, 2008)

btarunr said:


> 50% faster than the R670? That might not match the performance gains people expect the GeForce 9 to have over the current line-up. If four ATI cards cost as much as two NVIDIA cards, then assuming the two setups perform equally, I'd choose the NVIDIA setup over ATI for obvious heat/power/space/motherboard-price/game-optimisation reasons.



I believe this article isn't stating everything. Like the R670 core, the R770 is basically the lower end of the spectrum; the R670-based 3870 X2 is the high end.
The R700 is the high-end chip, eventually made up of a couple of R770s. I've heard multiple, with a possibility of up to four. So in actuality the R700 will be the high end while the 770 is the low end, or core component, of the 700.
ATI sees multiple GPUs as the future, so as long as the 3870 X2 does well the R700 will continue to grow... and we're all waiting on drivers.


----------



## [I.R.A]_FBi (Feb 9, 2008)

btarunr said:


> 50% faster than the R670? That might not match the performance gains people expect the GeForce 9 to have over the current line-up. If four ATI cards cost as much as two NVIDIA cards, then assuming the two setups perform equally, I'd choose the NVIDIA setup over ATI for obvious heat/power/space/motherboard-price/game-optimisation reasons.


What process will either manufacturer's GPUs be made on?


----------



## btarunr (Feb 9, 2008)

R770 = 45nm ; G100 = 55nm (on the 9800 GTX), G94 = 65nm (on the 9600 GT)


----------



## imperialreign (Feb 9, 2008)

R700 is supposed to be 45nm as well, and IIRC they're supposed to be fairly energy efficient, too.

Still, though, a dual-core GPU... and if ATI decides to make a dual R770/R700 GPU PCB, that's four cores on one card.

Rumor has it the R770 and R700 will also support GDDR5, and we all know how ATI loves to jump on new tech like that.

Brace yourselves... the GPU wars are about to become extremely interesting again.


----------



## phanbuey (Feb 9, 2008)

Ketxxx said:


> Card by card basis ATi offers far better price-performance ratio, and better drivers. Power consumption issues can be curbed by reducing GPU voltage and clocks in 2D mode.



Actually... any review site will show that from any price/performance standpoint the 8800 GT leads, even ahead of the 2600 XT, and even ahead of the HD 3870 and 3850.

Source:
http://www.techpowerup.com/reviews/HIS/HD_3870_X2/23.html << TechPowerUp has the best reviews!!


----------



## [I.R.A]_FBi (Feb 9, 2008)

btarunr said:


> R770 = 45nm ; G100 = 55nm (on the 9800 GTX), G94 = 65nm (on the 9600 GT)


45 nm + powerplay = much cooler card


----------



## Scrizz (Feb 9, 2008)

meh, can't w8


----------



## mandelore (Feb 9, 2008)

sweet, maybe just what is needed to replace my 2900xt..


----------



## moto666 (Feb 9, 2008)

Sounds interesting!
I think ATI can make it big again (rise from the ashes)!
Sorry, not ATI: AMD!


----------



## HaZe303 (Feb 9, 2008)

I hope ATI/AMD can come up with some really good and interesting stuff; we know how much they need the comeback. And maybe this would get NVIDIA off their lazy butts!? We need something to bring competition back to the graphics market, and ATI taking the performance crown would be that something. Hope, hope, hope... I just hope they don't do another "2900 XT" on us and fail miserably.


----------



## imperialreign (Feb 9, 2008)

Well, if the rumors that have surfaced about ATI's upcoming R7xx series are true, they've definitely got what it takes to go toe-to-toe with NVIDIA in the next generation of hardware. The R7xx series looks like it'll be a strong contender against NVIDIA's G92 series.


----------



## zOaib (Feb 9, 2008)

imperialreign said:


> Well, if the rumors that have surfaced about ATI's upcoming R7xx series are true, they've definitely got what it takes to go toe-to-toe with NVIDIA in the next generation of hardware. The R7xx series looks like it'll be a strong contender against NVIDIA's G92 series.



tumhaaraay moun main ghee shakkar <-------- a saying in India

*translation:*

may your mouth be filled with sweets and sugar..... lol

Would be nice to see the red team right there on the horizon =P


----------



## Kreij (Feb 9, 2008)

I'm waiting for AMD (ATI) to come out with a dual-core GPU with built-in CrossFire and a PPU. Single chip, 45nm. Use their new fabs for GPUs as well as CPUs.

Put two on a card and you have single-slot quad CrossFire and a pair of PPUs.

That would seriously rock.


----------



## btarunr (Feb 9, 2008)

Guys, forget dual-core, many GPUs on one PCB, X2/GX2, etc. The architectural finesse of a company lies in its ability to make leap improvements, not notched ones. With a weaker X1, the X2 is bound to be weak. If ATI concentrates on a strong single-GPU architecture, it can truly call itself a leading company, which it is not right now.

It's like reaching 1.00: you can use two 0.50s or four 0.25s, and a 0.25 is weaker than a 0.50. The idea of combining many weaker units to make a strong unit is worse than using fewer, stronger units to reach the same total. So to attain a certain level of performance, would you take four ATI cards or two NVIDIA cards, when so many other things go against the ATI setup, like platform price, power, and cooling?

_Jab do balwan toh chaar ka kya kaam?_ - zOaib can translate that.


----------



## eidairaman1 (Feb 9, 2008)

I dunno; the 3870 kept them from dying, so to speak, since the 2900 XT couldn't stay in spec due to heat. The 3870 X2 draws about as much as the 2900 XT. I think the 50% is a modest performance gain, but you've got to expect that; usually RV is the mid-range, sort of like RV670. Who knows how the R780/90 will turn out.


----------



## Kreij (Feb 9, 2008)

btarunr said:


> Guys, forget dual-core, many GPUs on one PCB, X2/GX2, etc. The architectural finesse of a company lies in its ability to make leap improvements, not notched ones. With a weaker X1, the X2 is bound to be weak. If ATI concentrates on a strong single-GPU architecture, it can truly call itself a leading company, which it is not right now.



True, but if ATI could put a strong pair of GPUs on a single die with the CrossFire logic built in, it would perform much better than having discrete chips on a board. Add a PPU to the mix and you'd be looking at an excellent-performing chip.

If they could do a quad on a single chip, even better.


----------



## btarunr (Feb 9, 2008)

But at the end of it, look at the design philosophy: you're clubbing many weaker pieces together to make something big. There's a difference between a quad-core CPU and a quad-core GPU. The difference is that parallelism has already been achieved within a single GPU core/die/PCB, with multiple pixel pipelines, multiple shaders, etc. You can't boost parallelism by clubbing together many tiny pieces and expect something big. A quad-core CPU, on the other hand, adds to its capabilities by stepping up parallelism as the thread/core count increases. So clubbing four GPU cores with, say, 0.25x performance each would still be no improvement over a single GPU with 1.0x performance; in fact it becomes worse in terms of thermal envelope and operating costs, and the price would be about the same as buying a powerful NVIDIA card anyway.


----------



## sneekypeet (Feb 9, 2008)

springs113 said:


> I believe this article isn't stating everything. Like the R670 core, the R770 is basically the lower end of the spectrum; the R670-based 3870 X2 is the high end.
> The R700 is the high-end chip, eventually made up of a couple of R770s. I've heard multiple, with a possibility of up to four. So in actuality the R700 will be the high end while the 770 is the low end, or core component, of the 700.
> ATI sees multiple GPUs as the future, so as long as the 3870 X2 does well the R700 will continue to grow... and we're all waiting on drivers.



Just to address this: you're all assuming R770 and the like, when it is the RVVVVVVVVV770 we are speaking about. The V, IIRC, stands for Value. It is not made to be the top of the market.

So don't everyone get worried; I'm sure there will be an R770 released as well.


----------



## happita (Feb 9, 2008)

So much interesting news today. I hope the R700 is all it can be!!!!


----------



## ChillyMyst (Feb 9, 2008)

btarunr said:


> Guys, forget dual-core, many GPUs on one PCB, X2/GX2, etc. The architectural finesse of a company lies in its ability to make leap improvements, not notched ones. With a weaker X1, the X2 is bound to be weak. If ATI concentrates on a strong single-GPU architecture, it can truly call itself a leading company, which it is not right now.
> 
> It's like reaching 1.00: you can use two 0.50s or four 0.25s, and a 0.25 is weaker than a 0.50. The idea of combining many weaker units to make a strong unit is worse than using fewer, stronger units to reach the same total. So to attain a certain level of performance, would you take four ATI cards or two NVIDIA cards, when so many other things go against the ATI setup, like platform price, power, and cooling?
> 
> _Jab do balwan toh chaar ka kya kaam?_ - zOaib can translate that.



Dunno, the Japanese have pretty much stomped the rest of the world's electronics markets into only buying stuff they designed, by doing notched updates.

Their way of looking at tech advancement is why they can consistently bring out stuff that drives other countries' products off the market.

Unlike Americans and many Western cultures, the Japanese don't go for the big score/touchdown/home run every time; they're happy hitting singles all day and just burying you with runs (baseball analogy). American companies always go for the big advancement, whereas the Japanese and some other cultures will just make small improvements and bring them out more regularly. Eventually they BURY you in small improvements, and by using this method they've managed to keep production costs VERY LOW. So don't go by the "home runs are always better" assumption.

Your analogy would lead one to believe it would be better if AMD and Intel had stuck with making single-core chips and just made them drastically more powerful and higher clocked, when that's just not true. For SMP/threaded apps it's better to have more CPUs/cores to spread the load over, and slowly but surely the world is headed threaded!!

And it really doesn't matter how you get there if the results back it up. If AMD uses 45nm vs NVIDIA's 55nm and 65nm chips, then AMD will almost certainly have the more energy-efficient chip, and if you stick two dual-core versions of that on one PCB you have quad-fire in one card!!

As to the guy who said he wanted to see a PPU onboard as well: if game developers got off their asses they could do PPU work on the X1k cards, but they haven't bothered, mostly because Intel bought Havok, and Havok was the main company working on GPU-as-PPU implementations.

A "true" PPU is just a very powerful math unit. Guess what, that's also what a modern shader-based GPU is. An X1300 Pro is roughly equivalent to a first-gen Ageia PPU card, and an X1300 XT kills it. If you look at the HD series, the 2400 kills Ageia's current products for raw processing power. It's just that nobody's bothered implementing it YET.

We need somebody like MS to step in and say "here's a new standard for physics through a common API; make your drivers fit the API calls and you can support physics."
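That common-API wish is essentially an interface boundary. Here is a minimal Python sketch with entirely hypothetical names (no real vendor API is being described); the "backend" stands in for whatever a driver would actually run on its shader units:

```python
from abc import ABC, abstractmethod

class PhysicsAPI(ABC):
    """Hypothetical common physics interface a standards body could define."""

    @abstractmethod
    def step(self, bodies, dt):
        """Advance a list of (position, velocity) pairs by one timestep."""

class ShaderPhysicsBackend(PhysicsAPI):
    """Stand-in for a vendor driver that maps the API calls onto shader units."""

    def step(self, bodies, dt):
        # Toy Euler integration in place of real GPU kernels.
        return [(x + v * dt, v) for x, v in bodies]

def run_physics(api: PhysicsAPI, bodies, dt, steps):
    # The game talks only to the API, never to vendor specifics.
    for _ in range(steps):
        bodies = api.step(bodies, dt)
    return bodies

print(run_physics(ShaderPhysicsBackend(), [(0.0, 1.0)], 0.5, 2))  # [(1.0, 1.0)]
```

The point of the design is the last function: the game codes only against `PhysicsAPI`, so any vendor shipping a conforming backend gets physics support for free.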


----------



## btarunr (Feb 10, 2008)

2x 0.50 > 4x 0.25. While the baseball example did sound good, let me give a more natural one. Say a game that's unoptimised for multi-GPU setups comes out tomorrow; what happens? The game would exploit only one of the four cards, and that's a 0.25. But had the unit been a 0.50, at least an unoptimised game would run better. So it's important that ATI works on a strong single GPU rather than on powerful solutions built from multiple weak units. We've pretty much seen that happen with the Crysis benches for the HD 3870 X2:

*(Crysis benchmark charts for the HD 3870 X2)*

^Look at what an unoptimised game can do. So it's important to have stronger units. Let's face it, the G92 is a superior core to the R670; we've seen enough people attesting to that. It's just that in an SLI array it slightly falters, but that's the platform to blame, not the cards. In a 32-lane SLI chipset like the nForce 590 SLI or the 680i SLI, the northbridge and southbridge give out 16 lanes each to a card, while in a 32-lane CrossFire-compliant chipset like the AMD 790FX or the Intel X38, the northbridge supplies all 32 lanes. So an SLI of two G92-based cards falters slightly against a CrossFire of two R670s, but if NVIDIA works its internal SLI well on the 9800 GX2, we'll have a card way more powerful than an HD 3870 X2, the reason being the use of a powerful unit core. And supposing the 9800 GX2 does end up facing an unoptimised game, like the HD 3870 X2 did, it will perform better due to its stronger unit core.
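The 2x 0.50 vs 4x 0.25 argument reduces to one line of arithmetic. A toy Python model, with a single hypothetical `scaling` knob for how well a game spreads across extra GPUs (not a claim about any real card):

```python
def effective_perf(unit_perf, num_units, scaling=1.0):
    """Toy throughput model for a multi-GPU setup.

    scaling = 1.0 means perfect scaling across units;
    scaling = 0.0 means an unoptimised game that only
    ever exploits a single unit.
    """
    return unit_perf * (1 + scaling * (num_units - 1))

# Both setups have the same 1.00 peak under perfect scaling...
assert effective_perf(0.50, 2) == effective_perf(0.25, 4) == 1.0

# ...but an unoptimised game falls back to one unit:
print(effective_perf(0.50, 2, scaling=0.0))  # 0.5
print(effective_perf(0.25, 4, scaling=0.0))  # 0.25
```

With perfect scaling both setups reach the same peak; the difference only appears when `scaling` drops, which is exactly the unoptimised-game case being argued here.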


----------



## xfire (Feb 10, 2008)

btarunr said:


> 2x 0.50 > 4x 0.25


Agreed, but
2x1.0>2x5.0
Which means not only should each company concentrate on producing a faster single GPU but also on dual.
...


----------



## btarunr (Feb 10, 2008)

xfire said:


> Agreed. but
> 2x1.0>2x5.0



utterly confused 



xfire said:


> Which means not only should each company concentrate on producing a faster single GPU but also on dual.
> Earlier you said that the increments are not enough considering the GeForce 9 series; correct me if I'm wrong, but isn't the fastest in the 9x series a dual GPU based on the 8800 GTX GPU?
> Also see the heading: it's on par with expectations. Didn't a thread posted a few days back say it was as good as NVIDIA's next gen?
> Also, as ChillyMyst said, what about dual-core/quad-core CPUs? Shouldn't they have made a faster single-core CPU?



The 9800 GTX isn't?


----------



## eidairaman1 (Feb 10, 2008)

Well, notice those charts; the majority of players won't see much difference, because the game moves at a crawl once it dips below 30 FPS.


btarunr said:


> 2x 0.50 > 4x 0.25. While the baseball example did sound good, let me give a more natural one. Say a game that's unoptimised for multi-GPU setups comes out tomorrow; what happens? The game would exploit only one of the four cards, and that's a 0.25. But had the unit been a 0.50, at least an unoptimised game would run better. So it's important that ATI works on a strong single GPU rather than on powerful solutions built from multiple weak units. We've pretty much seen that happen with the Crysis benches for the HD 3870 X2:
> 
> ...


----------



## xfire (Feb 10, 2008)

Take the concept of SLI/CrossFire:
two 8800 Ultras are better than two 8800 GTXs.
Also, isn't the GX2 their flagship?


----------



## ChillyMyst (Feb 10, 2008)

Supreme Commander is old and was never optimised for SLI or CF; game devs today are fully aware they need to start supporting multi-GPU solutions.

Also note that Supreme Commander STILL has bugs despite all the patches it's already received (I know, it's one of my favourite games).

Now take a look at the resolutions you're talking about. Nobody I know games at 2048x1536, and on top of that you're slapping 4x AA onto cards KNOWN to take a notable hit from AA, so your example isn't very valid.

Ultra-high-res gameplay is still not mainstream and won't be for a while to come; 1920x1080 (1080p) still isn't even that common. Hell, the PS3 and Xbox 360 stutter in many games if you try to run them above 720p, and those are dedicated gaming machines.

As to your assertion that it's the platform to blame, and that CF needs 32 lanes from the video card ports to perform better than SLI, I've got to call BULLSHIT on that. I'm sure somebody will eventually show up to back me on this, but CF DOESN'T NEED 32X (16x per card) TO PERFORM BETTER THAN SLI; in fact 8x per card is PLENTY because of how ATI/AMD use the internal bridge. CF is just the superior dual/multi-card design, probably because ATI started from scratch while NVIDIA just tried to copy how two Voodoo2 cards linked up (the same reason the FX line sucked: they tried to mix 3dfx designs/tech/ideas with NVIDIA's own).

Sure, with AA cranked the 8800 GT/GTS are better, at least until you hit CF/SLI; then things change, because in this case CF is better by design.

And I must say again that Supreme Commander is a bad choice; the game needs a lot more tweaking and driver optimisation, but there's really no point, because no normal person is going to have a monitor that does 2048x1536. A 1080p monitor/HDTV is already quite pricey; anything higher is just downright unaffordable for 99% of the population.

I just love how people like you bring out insanely high-res benchmarks that NOBODY REALLY GAMES AT to discredit companies. I say the same thing when I see ATI fans do it to NVIDIA products, and it's happened a lot: X1900/1950 vs 7800/7900/7950 cards, for example, or the constant CF vs SLI wars, where NVIDIA only wins with a far more expensive set of cards on a far more expensive board.

Want a good example? RD580 board vs 590 SLI 32x: I can get a good 580 CF board for 80 bucks or less; I can't get a 590 x32 board for less than 130. That's a bit of a price difference, and you NEED the 32x for SLI to show proper benefit. With CF, on the other hand, you don't; you could use ANY CF-compatible motherboard and it will give you damn near the same performance boost (given the BIOS is good and the system parts are on par).

You seem to think I hate NVIDIA. I don't; I just hate how nvidiots always badmouth ATI/CrossFire, or how they drag out benches done at INSANE settings, on games known to have scaling issues, at insane resolutions, and use that to discredit the company they don't like.

I'm using an 8800 GT as I type this, and it's a fast card, but it also has its flaws (OMG, now more nvidiots will say I'm an NVIDIA hater...), just as the 38x0 cards have their flaws. The only thing I can say with absolutely no qualms: NVIDIA's 8800 GT cooler is a PIECE OF SHIT AND THEY SHOULD BURN IN HELL FOR PUTTING IT ON THESE CARDS!!!! At least the 38x0 cards' stock coolers don't suck that bad; in fact all the ones I've seen are pretty decent. The third-party solutions that cost in the $40 range are better, but that's normally true.

The way the world's going, both companies will end up going multi-core instead of just continuing to use the older brute-force methods of more pipes and more shaders. Really, it gets silly, the way NVIDIA gets to its next gen, like the 6 to 7 series: just make it beefier and clock it higher. Or the GF3, where they just did a die shrink and then upped the clocks!

Not that ATI hasn't done the same thing in the past; the X800 was very close to just being two 9800 XTs in one chip.

Blah, all this arguing over something that doesn't matter that much. I see the writing on the wall here, though: we will see more cores per card and more chips per card from now on, and most likely we'll eventually see quad-core GPUs from both companies as well.

All game devs are having to learn to multi-thread their apps for dual core (the new normal in systems) and even quad core (the next step, which will become normal in a few years).

The same will happen with GPUs: they will have to make games that take advantage of the changing processing environment of multi-core CPU and GPU tech.
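The multi-threading point in those last two paragraphs can be sketched in a few lines of Python; `update_entity` is a hypothetical stand-in for per-frame game work (AI, animation, and so on):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity):
    # Hypothetical per-frame work for one game object;
    # here just a toy transform.
    return entity * 2

def update_world(entities, workers=None):
    """Spread per-entity updates across a pool sized to the core count."""
    workers = workers or os.cpu_count() or 2
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves input order, so results line up with entities.
        return list(pool.map(update_entity, entities))

print(update_world([1, 2, 3, 4]))  # [2, 4, 6, 8]
```

Note that in CPython, threads only speed this up if the per-entity work releases the GIL (native code, I/O); CPU-bound pure-Python work would need processes instead. The shape of the split is the same either way.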


----------



## btarunr (Feb 10, 2008)

ChillyMyst said:


> As to your assertion that it's the platform to blame, and that CF needs 32 lanes from the video card ports to perform better than SLI, I've got to call BULLSHIT on that. I'm sure somebody will eventually show up to back me on this, but CF DOESN'T NEED 32X (16x per card) TO PERFORM BETTER THAN SLI; in fact 8x per card is PLENTY because of how ATI/AMD use the internal bridge. CF is just the superior dual/multi-card design, probably because ATI started from scratch while NVIDIA just tried to copy how two Voodoo2 cards linked up (the same reason the FX line sucked: they tried to mix 3dfx designs/tech/ideas with NVIDIA's own).



I'm comparing the platforms there, not saying the cards need all 32 lanes. CrossFire is better than SLI because in 32-lane SLI setups the HyperTransport chipset bus gets congested.



ChillyMyst said:


> Sure, with AA cranked the 8800 GT/GTS are better, at least until you hit CF/SLI; then things change, because in this case CF is better by design.



What was I talking about?




ChillyMyst said:


> And I must say again that Supreme Commander is a bad choice; the game needs a lot more tweaking and driver optimisation, but there's really no point, because no normal person is going to have a monitor that does 2048x1536. A 1080p monitor/HDTV is already quite pricey; anything higher is just downright unaffordable for 99% of the population.



Crysis?



ChillyMyst said:


> I just love how people like you bring out insanely high-res benchmarks that NOBODY REALLY GAMES AT to discredit companies. I say the same thing when I see ATI fans do it to NVIDIA products, and it's happened a lot: X1900/1950 vs 7800/7900/7950 cards, for example, or the constant CF vs SLI wars, where NVIDIA only wins with a far more expensive set of cards on a far more expensive board.



Fine, let's put up not-so-insane resolutions of an unoptimised game:

*(Crysis benchmark charts at lower resolutions)*

ChillyMyst said:


> Want a good example? RD580 board vs 590 SLI 32x: I can get a good 580 CF board for 80 bucks or less; I can't get a 590 x32 board for less than 130. That's a bit of a price difference, and you NEED the 32x for SLI to show proper benefit. With CF, on the other hand, you don't; you could use ANY CF-compatible motherboard and it will give you damn near the same performance boost (given the BIOS is good and the system parts are on par).



Oh, the AMD 580X-based boards were expensive when they came out; the one ASUS released had a launch price of $160, which came down to $130 and is now lower. There's just one board, made by ECS, that sells for peanuts, and that's justified because it's a crappy board in terms of build quality. But what triggered the price fall in 580X-based boards is that not many people were opting for CrossFire solutions on AM2; it's pretty obvious why.



ChillyMyst said:


> You seem to think I hate NVIDIA. I don't; I just hate how nvidiots always badmouth ATI/CrossFire, or how they drag out benches done at INSANE settings, on games known to have scaling issues, at insane resolutions, and use that to discredit the company they don't like.



Okay, I'm an nvidiot, and it's justified because NVIDIA NEVER disappoints people who admire it. A high-end GPU is expected to do it all; I want to test a GPU at the highest resolution, with the highest texture-filter/AA settings, else how would I deem it the best? The Radeon X1900 and X1950 passed all tests with flying colours, and that's why even 'nvidiots' show respect to them.



ChillyMyst said:


> I'm using an 8800 GT as I type this, and it's a fast card, but it also has its flaws (OMG, now more nvidiots will say I'm an NVIDIA hater...), just as the 38x0 cards have their flaws. The only thing I can say with absolutely no qualms: NVIDIA's 8800 GT cooler is a PIECE OF SHIT AND THEY SHOULD BURN IN HELL FOR PUTTING IT ON THESE CARDS!!!! At least the 38x0 cards' stock coolers don't suck that bad; in fact all the ones I've seen are pretty decent. The third-party solutions that cost in the $40 range are better, but that's normally true.
> 
> The way the world's going, both companies will end up going multi-core instead of just continuing to use the older brute-force methods of more pipes and more shaders. Really, it gets silly, the way NVIDIA gets to its next gen, like the 6 to 7 series: just make it beefier and clock it higher. Or the GF3, where they just did a die shrink and then upped the clocks!
> 
> ...




You're playing a hate game more than putting your point across. Does using an NVIDIA card make a person an nvidiot? For $240, does it make sense to choose the 8800 GT over an HD 3870? Again, on topic: ATI has to concentrate on a single powerful GPU. The low-res charts tell a story too.


----------



## xfire (Feb 10, 2008)

Didn't NVIDIA disappoint with their previous dual GPU?
Also, with newer driver releases the games will start to get optimised, just like the 2900.
Even the 3800 series is new, so it isn't totally optimised either.
I don't think you're an nvidiot, because you too are debating getting a 3870.
So actually ATI is on par with NVIDIA.


----------



## btarunr (Feb 10, 2008)

xfire said:


> Didn't NVIDIA disappoint with their previous dual GPU?
> Also, with newer driver releases the games will start to get optimised, just like the 2900.
> Even the 3800 series is new, so it isn't totally optimised either.
> I don't think you're an nvidiot, because you too are debating getting a 3870.
> So actually ATI is on par with NVIDIA.



The 7950 GX2 was released in a Windows XP environment, and situationally it was superior to any of ATI's offerings at the time. So it wasn't a disappointment; it was just NVIDIA holding the performance leadership while it worked on the 8 series. Seriously, how much of an increment has the HD 2900 gotten from driver releases so far? No, ATI becomes on par with NVIDIA when:

1. It steps up its market share significantly.
2. It comes up with a high-end GPU that holds the crown for over four months (which the 8800 GTX/Ultra succeeded in doing).
3. It breaks the shackles of games being roped into the TWIMTBP programme and ensures a neutral environment for game developers (or at least a level playing field with its own developer-relations programme; whatever happened to "Get in the Game"?).


----------



## ChillyMyst (Feb 10, 2008)

btarunr said:


> Crysis?
> 
> 
> 
> Fine, let's put up not-so-insane resolutions of an un-optimised game:



Yeah, and Crysis is HORRIBLY UNOPTIMISED FOR ALL SYSTEMS; it SUCKS currently. It's actually a bit worse than Far Cry was when it came out, and that's after the first Crysis patch that was supposed to give benefits across the board... Was the bench you list done on patched or unpatched Crysis? (I really want to know; I think one of the update's main points was to make dual GPUs work better.)

Given time, ATI's and NVIDIA's drivers will mature, and the game will mature into something far more playable (as Far Cry did; I love its SP game!!)



> Okay, I'm an nvidiot, and it's justified because NVIDIA NEVER disappoints people who admire it. A high-end GPU is expected to do it all; I want to test a GPU at the highest resolution, with the highest texture-filter/AA settings, else how would I deem it the best? The Radeon X1900 and X1950 passed all tests with flying colours, and that's why even 'nvidiots' show respect to them.



Yes, you are. Remember the FX line? I guess you think they didn't disappoint people.........

OK, here's how I tell if a card's best: I take the max res people actually play at, which would be 1080p (1920x1080), but then I remember most people are closer to the 1600x1200 or even 1280x720 (720p) range, so I compare at those resolutions, with games that have actually matured to the point of being valid bench tools. Crysis is still very, very young and needs a lot more updates before it will really be up to par for performance.

This is BS (bullshit). Go read some of the comments people on here and other forums have made about the 1900/1950 cards; I'm talking about NVIDIA users/nvidiots. They badmouthed the drivers (and still do, despite not owning a card that uses them), they badmouth the cooling, and they even try to say the 8600 is as fast as the 1900 (ROFL). Yeah, I've read it all. Next you're going to tell me an 8600 is a better choice than a 1900-range card....



> You're playing a hate game more than putting your point across. Does using an NVIDIA card make a person an nvidiot? For $240, does it make sense to choose the 8800 GT over an HD 3870? Again, on topic: ATI has to concentrate on a single powerful GPU. The low-res charts tell a story too.



No, I'm not hating on NVIDIA, just pointing out that your "we don't need multi-GPU solutions, we just need massive single-GPU solutions" is a common nvidiot response. Nvidiots tend to want what their company is famous for: brute-forcing their way through to the next gen. If you can't design a better chip, just shrink it, add more cooling and clock it higher; maybe add more pipes as well.......

As to the 8800 GT or 3870, it all depends on your use. If you're going to crank the AA: 8800 GT. If you're going to BIOS-mod/overclock it and replace the stock cooler: 8800 GT. If you just want a card that's as little hassle as possible and comes with cooling that doesn't suck: 3870. If you don't care about cranking the AA up: 3870. If you watch a lot of encoded movies/videos: 3870 (the YV12 bug is annoying as hell). If you want to save power when not gaming: 3870.

I could keep going; each card has its pluses, each has its negatives. If you haven't owned/played with a 3800 card yourself then you really shouldn't badmouth them. I have set up both 3850 and 3870 cards (PowerColor units with ZeroTherm coolers and lifetime warranty) and compared them to my 8800 GT, and to tell the truth, they each had their pluses and minuses.

First, the 8800 GT was faster in benches and with AA cranked, BUT at 1600x1200 I had to crank the AA to 2x that of the ATI card to get the same quality of AA. That's annoying!!

Second, the 8800 GT drivers (and all NVIDIA drivers through last year) have a YV12 colour-spacing bug; they don't render it properly. This is documented; check the ffdshow wiki:
http://mewiki.project357.com/wiki/Ffdshow_reference
Look at the 8800 section; it explains what the problem is.

Third, the 8800 GT has a crappy stock cooler unless you buy Palit or Gigabyte (or one based on their designs); the stock fan never spins up, and the card gets HOT AS HELL, 90C or higher under gaming in a COLD ENVIRONMENT!!!!

The 3870 has some drawbacks as well:

1st, CCC; I hate Catalyst Control Center!!!!

2nd, a notable performance hit using AA, far higher than on the equivalent NVIDIA cards.

3rd, the drivers aren't mature yet, so you run into bugs and some quirks with select games.

Both cards:

Crysis runs like ass. This is because Crysis is still in need of optimisation, AND it's really a game made for the next gen of hardware, if not the gen after that; it was intentionally made to be as crazy high-quality as possible graphics-, physics- and AI-wise.

Both are soon to be replaced by next-gen cards.

Again, if I wasn't so tired I could go on and on about both companies' cards, but there's no need; most people here know what the pluses and negatives of each side are, and if they don't, Google can cure that.

BTW, thanks for the quote, man; I love it!!!!


----------



## [I.R.A]_FBi (Feb 10, 2008)

Just wow... let's try not to act like ravenous beasts for a minute here.


----------



## ChillyMyst (Feb 10, 2008)

btarunr said:


> The 7950 GX2 was released in an environment of Windows XP, Well situationally it was a superior card to any of ATI's offerings at its time. So it wasn't a disappointment, it was just for NVidia to hold the performance leadership while it worked on the 8 series. Seriously, how much of an increment has the HD2900 got with driver releases so far? No, ATI becomes on par to NVidia when



The 2900 and the whole HD2K line got nice boosts from driver updates!!! Ask their users who tested between drivers.

The 7950 was HORRIBLE: it didn't work in MANY boards, had SHITTY driver support, NEVER got decent quad-SLI drivers, and then was dumped like a bastard stepchild the moment they got something new out.

Also, the card's design was BAD BAD BAD, two PCBs sandwiched together, horrible!!!!

I have 3 friends IRL that bought those cards, and 2 of them did SLI with them. They ALL sold those cards off as soon as they realised that support would never come to fix the flaws of the cards, or to give quad SLI decent performance benefits.

See my quote to explain how you could think the FX line and 7950GX2 were not disappointments.....


----------



## asb2106 (Feb 10, 2008)

btarunr said:


>



Yeah, those numbers are nowhere near correct; with my 3870 I got way more frames than that. I don't know where you dig these numbers up from, but those are not true.


----------



## xfire (Feb 10, 2008)

ChillyMyst, green makes it hard to read. Use darker colours.
Also, NVIDIA users complain about the lack of driver updates from NVIDIA,
and the HD 2900 gained a lot of performance improvements after driver updates.


----------



## btarunr (Feb 10, 2008)

asb2106 said:


> Yeah, those numbers are nowhere near correct; with my 3870 I got way more frames than that. I don't know where you dig these numbers up from, but those are not true.



I dig them up from a website called TechPowerup. 

http://www.techpowerup.com/reviews/HIS/HD_3870_X2/7.html


----------



## asb2106 (Feb 10, 2008)

btarunr said:


> I dig them up from a website called TechPowerup.



Sorry man, but results like that are not true. I have seen so many people post results that vary all across the board.

With my 3870 I know that I got higher results than that. Guaranteed.

I really think it's funny how a forum thread about ATI cards turns into a battle about how NVIDIA is better. Who cares? It's not about that. This sounds like a battle between Mac and PC, and nobody really cares to hear it. At least not me. I'm out, PEACE


----------



## candle_86 (Feb 10, 2008)

Actually, this looks similar to something I've seen before: using multiple GPUs to catch up, for multiple generations.

Anyone remember the Voodoo 5/4? Or the Voodoo2? 3dfx failed big time; the costs of dual-GPU cards are higher than single because of PCB and memory complexity.


----------



## btarunr (Feb 10, 2008)

asb2106 said:


> I really think it's funny how a forum thread about ATI cards turns into a battle about how NVIDIA is better. Who cares? It's not about that. This sounds like a battle between Mac and PC, and nobody really cares to hear it. At least not me. I'm out, PEACE




A. I wasn't arguing NV > ATI, just saying ATI has to focus on making a powerful single GPU on its path to regaining performance leadership, aspirational value, and engineering potential. 

B. I wasn't doing NV > ATI until ChillyMyst came up with provocative words such as "NVidiots", etc. 

C. I wasn't factually wrong, except for the GeForce FX part.


----------



## brian.ca (Feb 10, 2008)

It probably is a good idea to question the graphs above... I remember seeing similar results prior to the official release, then a note a few days later about an updated driver jacking up performance in a number of games.

The review over at Anandtech also supports this notion (http://www.anandtech.com/video/showdoc.aspx?i=3209&p=5). Where the graphs above show a mediocre (low-to-mid-teens) improvement, or actual downgrades, going from a single 3870 to an X2, Anandtech's benchmarks show 45% and 47% performance increases from the X2.
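Scaling claims like these are easy to sanity-check with quick arithmetic. A minimal sketch (the FPS numbers below are hypothetical placeholders for illustration, not figures from either review):

```python
def scaling_pct(single_fps: float, dual_fps: float) -> float:
    """Percent improvement of a dual-GPU result over a single-GPU baseline."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical example: 58 FPS on the X2 vs 40 FPS on a single card
print(round(scaling_pct(40.0, 58.0), 1))  # +45.0%, i.e. Anandtech-style scaling
# A low-teens result would look more like the graphs above:
print(round(scaling_pct(40.0, 45.0), 1))  # +12.5%
```

Whether a given review lands in the teens or the forties comes down almost entirely to the driver revision used, which is exactly the point being made here.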








Also, another point to consider: you said yourself that the 7950 GX2 worked to give NVIDIA a temporary boost while it worked on its bigger guns. When you consider the position ATI/AMD was in 3 to 6 months ago, does it not make perfect sense for them to take a route like this, which seemingly lets them produce upgrades at a decent pace and also presumably saves them money in development and production costs?

If nothing else, it's hard to deny the fact that in a very short time they were able to *almost* catch up and then jump ahead for the time being, while still keeping their next move in the pipeline. No doubt it will jump back and forth with NVIDIA's upcoming releases and the new ATI chip, but it's hard to argue against how much better a position ATI is currently in, in part from going this route.

The main thing to look at, I think, will be how NVIDIA's sandwich design compares to ATI's two-on-one card. ATI is probably currently still "not there", but if NVIDIA falters with their sandwich card (be it that the design just doesn't scale as well as ATI's, or some practicality issue like heat/noise, etc.), I think ATI will be in a very favorable position.


----------



## imperialreign (Feb 10, 2008)

Guys, look: the R700 GPU is, as we know it, a rumor.

ATI has not yet released an official statement as to its development; we've only been seeing rumors.

Rumor has it, also, that it will be a dual-core GPU.

Rumor has it that the core bridge will utilize HyperTransport.

Rumor has it that the core architecture will be more AMD-based than ATI-based.

We don't know for sure yet, any way you look at it.

Currently, ATI's GPUs have become better suited for arithmetic operations than graphics processing. Their GPUs are powerful when it comes to calculations and whatnot, but their graphical weakness has been becoming more and more apparent since the introduction of the HD2K series. This is a big reason why nVidia has gained such a massive lead over ATI: their processors are better suited for graphical work. I think ATI has realised this too, and has been spending a lot of time at the drawing board. We haven't seen anything really "innovative" from ATI within the last 1.5 years (aside from the 3870X2, but that's nothing they haven't done before), and considering how tight-lipped they've been over the R7xx processors, it wouldn't surprise me if they've spent a lot of time building something from the ground up. Only time will tell.

And until we see some actual proof, speculation will just be that. No need to get your thongs in a bunch over it.


----------



## moto666 (Feb 10, 2008)

Many, many rumors!
I hope half of them come true!


----------



## kapeeteeleest_peeg (Feb 10, 2008)

*nVidia vs ATi*

I really don't know why you guys argue about who is best all the time. I owned the original Riva TNT when nVidia broke through over ten years ago, and I owned the 9700 Pro when ATI decided to make a decent GPU almost 5 years ago. Recently, I've owned both the 2900XT and the 8800GT.

This generation, nVidia has 'shaded' it in terms of performance. I will say, though, that ATI is far superior in terms of image quality.

As for what's next?
Well, I wouldn't underestimate AMD/ATI.
Besides, they make CPUs, chipsets, and GPUs, as does Intel.

I can see nVidia being squeezed out by these two, eventually.
I haven't bought an nVidia-based motherboard in over a year and a half, and really, why would I when Intel makes the best chipsets for its CPUs...


----------



## asb2106 (Feb 10, 2008)

imperialreign said:


> Guys, look: the R700 GPU is, as we know it, a rumor.
> 
> ATI has not yet released an official statement as to its development; we've only been seeing rumors.
> 
> ...



The fact of the R700 is not a rumor; it is coming out. What it is and what it can do is still a rumor.


----------



## WarEagleAU (Feb 10, 2008)

Sounds interesting. If it's a value core (which is usually what the V equates to) with 50% gains over an HD3870 and HD3870X2, we are looking at some truly remarkable architecture here.


----------



## AddSub (Feb 10, 2008)

Great news, if true. Maybe ATI will finally release a card that will be worth my time again. They really need to boost their ROP counts, which is something that hasn't happened in 4-5 years, because the whole unified-shader architecture will take a GPU only so far. 

Oh, btarunr, if the 9800GX2 is the best nVidia can come up with, and it seems like that IS the case, then chances are their stock is going to take a dive in the coming months, one big-ass dive.


----------



## OrbitzXT (Feb 10, 2008)

Are there benchmarks or numbers for the 9800 GX2 or any of the new cards coming out? I keep hearing news about how nVidia is going to blow AMD and the 3870X2 away. I suspect they will, but I have nothing to base that on, and I'm still confused by all this talk that seems more or less certain the 3870X2 will be beaten.


----------



## magibeg (Feb 10, 2008)

I don't know how this topic turned into a massive flame war, but if ATI can roll out a high-end card 50% faster than the 3870, that does sound extremely tempting; I might sell what I have now and get one of those . Also, could we stop flaming about 2 cores vs. 1 core vs. octo-cores or whatever? Fact of the matter is that in terms of efficiency we will have to move to multi-core CPUs and GPUs one day. Yes, it's true that there currently isn't a lot of software/games that properly support it, but it's a work in progress. Just please stop fighting about something no one can win.


----------



## ChillyMyst (Feb 10, 2008)

candle_86 said:


> Actually, this looks similar to something I've seen before: using multiple GPUs to catch up, for multiple generations.
> 
> Anyone remember the Voodoo 5/4? Or the Voodoo2? 3dfx failed big time; the costs of dual-GPU cards are higher than single because of PCB and memory complexity.



Actually, the dual Voodoo2 combo was VERY popular in my experience; I knew a lot of people that got one card and then later got another. The problem was that 3dfx sucked for driver support: they never made a full OpenGL or D3D driver for those cards, and the only thing they truly rocked at was Glide.

And the Voodoo 3/4/5 all sucked, 16-bit dithered instead of true 32-bit colour... oh, and don't get me started on the Banshee cards...


----------



## micon02 (Feb 19, 2008)

Sorry, it's my first post in this forum.
No intention to bash anyone or the maker. 

Rumors:

The RV770 has been R&D tested and beats the 9800GTX.
How reliable the source is, I wouldn't know.

http://translate.google.com/transla...&extra=page%3D1&langpair=zh|en&hl=en&ie=UTF-8


----------



## phanbuey (Feb 19, 2008)

Yeah, they said that about the R600 too... and Phenom... all that stuff is speculation, since the RV770 is at most in prototype stages and the 9800GTX may not even be in existence. I hope so, though... I bought some AMD stock long.


----------



## eidairaman1 (Feb 20, 2008)

Funny Picture man


ChillyMyst said:


> Actually, the dual Voodoo2 combo was VERY popular in my experience; I knew a lot of people that got one card and then later got another. The problem was that 3dfx sucked for driver support: they never made a full OpenGL or D3D driver for those cards, and the only thing they truly rocked at was Glide.
> 
> And the Voodoo 3/4/5 all sucked, 16-bit dithered instead of true 32-bit colour... oh, and don't get me started on the Banshee cards...


----------



## imperialreign (Feb 20, 2008)

ChillyMyst said:


> Actually, the dual Voodoo2 combo was VERY popular in my experience; I knew a lot of people that got one card and then later got another. The problem was that 3dfx sucked for driver support: they never made a full OpenGL or D3D driver for those cards, and the only thing they truly rocked at was Glide.
> 
> And the Voodoo 3/4/5 all sucked, 16-bit dithered instead of true 32-bit colour... oh, and don't get me started on the Banshee cards...



I'll say that 3DFX was a bit odd with their driver support, being an old-school and proud 3DFX owner here. Damn, their drivers were buggy, but when they worked, they worked like a charm.

3DFX never needed to implement an OpenGL driver for their cards, as their Glide API could process all the commands used in OGL. The difference, though, was that the Glide API only made use of the commands useful for gaming, so it wasn't nearly as bloated, and at the time it was extensively faster than any competitor when rendering OGL-based applications. Taking into account that this was back when DirectX and OpenGL were first gaining momentum, it helped push the VooDoo cards to the performance throne.  

True, the VooDoo cards were 16-bit dithered, but they were still capable of near-24-bit output, and still _technically_ rendered a 32-bit image. On the actual display output it was hard to distinguish any real color differences between a VooDoo and a competitor's card; if you happened to own any of the VooDoo lineup, you should be able to remember that. The big difference that everyone pointed out, though, was in their screenshots.
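The 16-bit-versus-32-bit point can be sketched numerically: an RGB565 framebuffer keeps only 5 or 6 bits per channel, so neighbouring 24-bit colours collapse to the same value unless the hardware dithers. A rough illustration of the quantization (this is a generic sketch, not 3dfx's actual pipeline, and the function name is mine):

```python
def to_rgb565_and_back(r: int, g: int, b: int) -> tuple:
    """Quantize an 8-bit-per-channel colour to RGB565 and expand it back,
    showing the precision lost in a 16-bit framebuffer."""
    r5, g6, b5 = r >> 3, g >> 2, b >> 3  # keep only 5/6/5 bits per channel
    # Expand back to 8 bits by replicating the top bits into the low bits
    return (r5 << 3 | r5 >> 2, g6 << 2 | g6 >> 4, b5 << 3 | b5 >> 2)

# Neighbouring 8-bit reds collapse to the same 16-bit value:
print(to_rgb565_and_back(200, 100, 50))  # (206, 101, 49)
print(to_rgb565_and_back(203, 100, 50))  # (206, 101, 49) — same colour
```

That collapse is what shows up as visible banding in smooth gradients; dithering trades it for the speckle pattern people noticed in screenshots.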

3DFX was a hardware company that carved out their niche in performance graphics for gaming, and that led to their downfall (among other things). They became complacent after the release of the VooDoo3 (which, IMO, was the best card they ever offered), and towards the end were releasing a bunch of insane products to try and recapture market share (one of my favorite 3DFX marketing slogans was "So powerful, it's kind of ridiculous"): multi-GPU cards (there was a card planned with 4 GPUs that required its own power adapter, but it was never released) and multi-card SLI setups (4-card SLI was possible; I think they might have even offered a 5-card setup, too). Still, they were a pivotal company through the late 90s, and I really think they paved the way for strong 3D graphics support and performance. Love 'em or hate 'em, they had a lasting impact on the PC market, and will always be revered and remembered by die-hard fans and tech junkies alike.

You might find this an interesting read: http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-1.htm


----------



## ChillyMyst (Feb 20, 2008)

imperialreign said:


> I'll say that 3DFX was a bit odd with their driver support, being an old-school and proud 3DFX owner here. Damn, their drivers were buggy, but when they worked, they worked like a charm.
> 
> 3DFX never needed to implement an OpenGL driver for their cards, as their Glide API could process all the commands used in OGL. The difference, though, was that the Glide API only made use of the commands useful for gaming, so it wasn't nearly as bloated, and at the time it was extensively faster than any competitor when rendering OGL-based applications. Taking into account that this was back when DirectX and OpenGL were first gaining momentum, it helped push the VooDoo cards to the performance throne.
> 
> ...



Glide was a modified form of OGL, very hardware-specific. The problem was that 3dfx never bothered to make a full OGL ICD like most other companies, and in my experience they never made a 100% functional version of D3D either; some games forced you to revert to Glide because, to be honest, any other mode looked bad or didn't run properly.

And their dithered output was NOT as good. I had Banshee, Voodoo3, and nVidia cards of the day, and the 16-bit dithering in games was noticeable, especially in fiery explosions/flames and the like; you could see the dithering easily, whereas my nVidia and even ATI Rage 128 cards didn't have that issue in 32-bit mode. It looked flawless.

3dfx effectively committed suicide. They cut themselves off from their main market (sales of chips to other card makers), then refused to make cards that were up to the standards of the day until it was too late. Their last futile act was the Voodoo5, basically an SLI Voodoo4 on one card: it was fast, but it also was expensive as hell and HOT. The Voodoo 5 6000 never went into full production because 3dfx had lost so much of its market by then that they had to close down.

They are a prime example of what NOT to do as a card or chip maker.


----------



## imperialreign (Feb 20, 2008)

ChillyMyst said:


> they are a prime example of what NOT to do as a card or chip maker.



Completely agree with that. Although at their beginning they had it good; they covered an area of the market that was pretty much ignored. But they should've gone more mainstream-friendly after the VD3, instead of 1337-only.



			
ChillyMyst said:

> And their dithered output was NOT as good. I had Banshee, Voodoo3, and nVidia cards of the day, and the 16-bit dithering in games was noticeable, especially in fiery explosions/flames and the like; you could see the dithering easily, whereas my nVidia and even ATI Rage 128 cards didn't have that issue in 32-bit mode. It looked flawless.



Yeah, I do remember those semi-transparent sprites like that. I have to agree that did look like crap comparatively.

Sorry, I had meant more along the lines of solid texture images and sprites. I forgot how badly the VooDoos rendered transparent and translucent objects and colors (if at all; sometimes they'd revert to a solid color instead).


----------



## ChillyMyst (Feb 20, 2008)

imperialreign said:


> Completely agree with that. Although at their beginning they had it good; they covered an area of the market that was pretty much ignored. But they should've gone more mainstream-friendly after the VD3, instead of 1337-only.
> 
> 
> 
> ...



Yeah, I stopped buying myself Voodoo cards after my first TNT1. I got them as gifts a few times (Banshee and V3), but the TNT1 and 2 were just the better cards: image quality was better, drivers were better, features were better. The only thing the doodoos had going for them was Glide, and that was being replaced with D3D and OGL fast, because game makers saw that Glide was dying thanks to 3dfx's policy of not letting anybody else support it, even with emulation.

In the beginning, 3dfx sold their chips to other companies and made a lot of money doing it. Then they stopped that and went to only making and selling their own cards. The Banshee was worse than one Voodoo2 card, and two killed it; it was a stripped-down Voodoo2 core, missing features, 16-bit only in games, just... crap...

Honestly, that's why they aren't around anymore. If they had kept making chips and selling them to other companies to build cards on, they would be around alongside ATI and nVidia today, even if they didn't get a good 32-bit card out till the 6k or 7k series. Hell, look at Creative: they are still around, and their driver support has sucked since the SB Live days!!!!

But they put out cards in all price ranges, not just leet gear, and now they even let other companies build cards with the X-Fi chip (still wouldn't buy it, though; Creative drivers SUCK!!!). It's just a matter of knowing your market and your place in the market.

3dfx was getting royally pwned in the market by companies like ATI and nVidia, but instead of adapting, carving out a better niche for themselves, and keeping their old partnerships going strong, they dumped their old partners and kept making cards that, to be honest, didn't compete well with the competition on a price-to-performance/feature ratio.

Meh, they are dead. nVidia bought their tech, then made the FX line with some of it *shudders*... those horrible, horrible FX cards...


----------

