# The Ati vs. Nvidia Image Quality Thread



## DarkMatter (Mar 14, 2008)

Which one do you think has the better image quality in this generation? 

Give your opinion, but please state clearly why, and post images where the reason is evident if possible. Links to previously done reviews and comparisons are welcome.

Try to avoid flame wars, please, I want this one to be serious.

Some reviews concerning this.

http://www.maximumpc.com/article/videocard_image_quality_shootout?page=0,0 - A very interesting read. Read the entire article or you will miss the point.

http://techreport.com/articles.x/12458/5 - Pages 5, 6, 7 and 8.

http://sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-15.html - pages 15, 16, 17.

http://www.hardocp.com/article.html?art=MTM0MSw1LCxoZW50aHVzaWFzdA== - pages 5 and 6.

Courtesy of calvary1980:

http://enthusiast.hardocp.com/article.html?art=MTQwNCw3LCxoZW50aHVzaWFzdA== - 2900XT vs 8800 Ultra on HL2:EP2

http://www.elitebastards.com/cms/in...sk=view&id=559&Itemid=29&limit=1&limitstart=4 - Video playback comparison.

Courtesy of KainX:

http://www.driverheaven.net/reviews/3870-XXX/IQ.php

Courtesy of cooler:

http://www.beyond3d.com/content/reviews/47/1 - AMD R6xx: Image Quality Analysis

http://www.beyond3d.com/content/reviews/3/1 - NVIDIA G80: Image Quality Analysis

http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD's_salvation?/5392-14.html

http://www.pcgameshardware.de/?article_id=621293&page=11 - In German(?), with animated GIFs.

http://www.dailytech.com/AMD+Alleges...rticle8608.htm - Allegations of Nvidia Cheating on HD playback.

http://www.theinquirer.net/en/inquir...deo-benchmarks - Same allegations; explains that this happens because of an optimization. You can turn it off, but it's enabled by default. Draw your own conclusions.

Posted by EastCoasthandle:

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=18 - Comparison between Purevideo HD and Avivo HD.


----------



## mandelore (Mar 14, 2008)

dude... why are you bringing the argument to a whole new thread??
 it's obviously gonna get heated with all the flame-tastic comments about to arrive....


----------



## Morgoth (Mar 14, 2008)

ATI
this topic will turn into a fanboy flamewar topic...


----------



## xfire (Mar 14, 2008)

So how many users do you think have really owned both cards and decided to do a head-to-head comparison? This is just going to turn into flame wars. If you really want to find out, conduct a test among your friends using similar setups.


----------



## DarkMatter (Mar 14, 2008)

mandelore said:


> dude... why are you bringing the argument to a whole new thread??
> it's obviously gonna get heated with all the flame-tastic comments about to arrive....



Because if we don't ruin it, it could become a sticky and we won't have to talk about this in other threads.



xfire said:


> So how many users do you think have really owned both cards and decided to do a head-to-head comparison? This is just going to turn into flame wars. If you really want to find out, conduct a test among your friends using similar setups.



I did, and I don't need to find out anything anymore. For me they are both the same.


----------



## mandelore (Mar 14, 2008)

DarkMatter said:


> Because if we don't ruin it, it could become a sticky and we won't have to talk about this in other threads.



seriously, that ain't gonna happen. all it takes is one person...

look at the Intel vs AMD thread: for the first few pages it was civil and adult, then one by one the immature types launched into action and the thread spiralled downwards...


----------



## DarkMatter (Mar 14, 2008)

mandelore said:


> seriously, that ain't gonna happen. all it takes is one person...
> 
> look at the Intel vs AMD thread: for the first few pages it was civil and adult, then one by one the immature types launched into action and the thread spiralled downwards...



I really hope it goes well. And I strongly believe it's better than constantly debating this in other threads.


----------



## xfire (Mar 14, 2008)

DarkMatter said:


> I did, and I don't need to find out anything anymore. For me they are both the same.


Then you don't have any need for this thread. Let people say what they want. Just because someone calls you a dog doesn't make you a dog.


----------



## DarkMatter (Mar 14, 2008)

xfire said:


> Then you don't have any need for this thread. Let people say what they want. Just because someone calls you a dog doesn't make you a dog.



Indeed, this thread is for them to say what they want, and to back up their opinions. And ultimately it's for the people who come here seeking info.

Vote if you want to, or don't vote, but please don't go off-topic before this has even started.


----------



## erocker (Mar 14, 2008)

I'm glad this thread was started!  As a matter of fact, I'm trying to test the differences in IQ between the two right now.  By the way, I started an "I need help with my 3850" thread, so stop on over if you can give me some insights.  Thanks.  I'll throw up some screens this weekend.


----------



## mandelore (Mar 14, 2008)

but.. at the end of the day it won't stop the bitching in other threads, as the arguments are within a specific, often thread-specific context.

But I wish this thread well. I will not partake in it (unless people actually behave responsibly and use this thread as a fact vault, with tested proof, not fanboyish arguments  ), if only not to further fuel any inevitable arson being committed


----------



## Hawk1 (Mar 14, 2008)

I'll have to abstain since I don't own either of the new generation cards. Should have a 4th option for "I'm not sure." You will (probably) still get the fanboys choosing their brand based on loyalty rather than first-hand experience.


----------



## DarkMatter (Mar 14, 2008)

Hawk1 said:


> I'll have to abstain since I don't own either of the new generation cards. Should have a 4th option for "I'm not sure." You will (probably) still get the fanboys choosing their brand based on loyalty rather than first-hand experience.



As long as they give some proof... or whatever they post, anything would be OK. I mean, they could post an image where they see a huge difference, but chances are many people wouldn't see any, and the image would still be useful. The image is there, the claim of superior IQ is there, and then it's up to the reader to decide whether there's any difference; the job is done.

About your 4th option suggestion: I've thought about it, and I think "not sure" and "no difference" amount to the same thing if you have experience with the cards.


----------



## alexp999 (Mar 14, 2008)

Don't have any screenies (yet), but I have noticed between my system (ATI, see specs) and my dad's (Nvidia, see sig) that ATI looks a lot crisper and more vibrant (even though he is on a higher-resolution screen which cost nearly twice as much, lol!). Nvidia tends to look quite grainy and "flat", at least from what I have seen anyway.


----------



## DarkMatter (Mar 14, 2008)

alexp999 said:


> Don't have any screenies (yet), but I have noticed between my system (ATI, see specs) and my dad's (Nvidia, see sig) that ATI looks a lot crisper and more vibrant (even though he is on a higher-resolution screen which cost nearly twice as much, lol!). Nvidia tends to look quite grainy and "flat", at least from what I have seen anyway.



Ok, thanks for posting, and congrats, since you are the first one to give a reason for your vote. 

Looking forward to those screenies.


----------



## panchoman (Mar 14, 2008)

it'd have to be ATI, because they have the only DX10.1 cards out as of now, and of course DX10.1 would have better quality than, say, DX10 or DX9.


----------



## calvary1980 (Mar 14, 2008)

are we talking about an image or rendering a game, cause this is interesting.

http://enthusiast.hardocp.com/article.html?art=MTQwNCw3LCxoZW50aHVzaWFzdA==

keep in mind this is the 2900XT and is the only recent dirt I could find on either.

- Christine


----------



## Black Panther (Mar 14, 2008)

I'm glad you made this thread. I've only had Nvidia cards so far, starting from the MX400 and ending up at the 8800GT, but the earlier ones weren't my choices. I hear all the time that ATI gives better image quality and is cheaper, but until now I haven't had the opportunity to test ATI vs Nvidia myself. Hence I'm very interested in the outcome of this thread.


----------



## MilkyWay (Mar 14, 2008)

obv it's the one with the more powerful GPU, and currently that's Nvidia

ATI provides nice image quality, but it's equal in most games; at the moment ATI will just run slower


----------



## Skyguy (Mar 14, 2008)

I've owned both, but a comparison is really only valid between similar-level cards of the same generation.  Also toss in driver tweaks and you've got a real challenge.  Look at the new CrossFire X supersampling at 16x and above........ATI's tends to make everything look too soft and fuzzy. BUT..........who really runs 32x?  Seriously. The performance hit is exponential and foolish......it's of no practical value whatsoever.

So then compare apples to apples:  4x AA, HDR, other regular stuff.  It generally looks the same between the cards.  There may be some subtle differences, sure, but overall I've not noticed much difference.

Also:  keep in mind these are side-by-side comparisons, with a single screenshot, that are often isolated and enlarged.  Does anyone here really think such nuances will be noticed when you're running around in a game at 60 fps, busy actually playing?  Those minor differences (if any) won't even be noticed since you're too busy playing, not to mention you won't have another card/system running simultaneously to notice a difference.  99.999999% of the people here don't run games on ATI and Nvidia simultaneously LOL.......so, without a basis for comparison, nobody will actually tell the difference in a real-world gaming application.

All things considered, image quality is virtually a wash nowadays.  It would be wiser to pick a card based on price-to-performance, overall performance (if you're a nut for horsepower), power draw, noise, and heat.  That's where you'll see BIG differences that set one card apart from another.......not staring at some single-frame enlargement, trying to grasp subtle minutiae and nuanced differences in pixelated images LOL.

ummmmm....Yeah.


----------



## Solaris17 (Mar 14, 2008)

*READ IMPORTANT  thnx*

I'm glad this thread is out, and I don't think it will become a flame war. Would you like to know why? Because we should all be pretty mature about it. All this talk about it only taking one person is true to a point; some 13-year-old will probably show up going "hey hey look at me, I'm a fanboi!!!!" and there's no stopping that. But for those of us who are older and want to understand the technical aspects rather than just backing a favorite company, it will be a good thread, because all we have to do to make this a success is ignore those people.

As for me, I think ATI has better image quality. In screenshots and comparisons some specifics are different, and ATI renders a little better, but only in certain things I've noticed: rock textures in some games, for example, ATI does better, while water and similar things look exactly the same. Me? I use Nvidia because they tend to play games slightly faster; I am TOLD Nvidia doesn't render things as sharply, but in my personal experience, across my video card timeline (Radeon 7500, 9250, 9800, X1600XT; Nvidia 8600GT, 9600GT), there is no difference IMO.

I suppose there might be, if I had an ATI card and an Nvidia card from the same generation, like an 8600 compared to a 2900, and could boot two computers and compare at the same time. But even if I saw a difference then, it would probably only be because I was looking for something specific. Out of all the cards I've owned, I haven't seen any difference in image quality in my day-to-day gaming. And though Nvidia is my personal preference atm, as you can see I've owned more ATI, and if you want a recommendation, give me a budget and what you want to do and I'll find you the best choice, totally unbiased.


----------



## DarkMatter (Mar 14, 2008)

calvary1980 said:


> are we talking about an image or rendering a game, cause this is interesting.
> 
> http://enthusiast.hardocp.com/article.html?art=MTQwNCw3LCxoZW50aHVzaWFzdA==
> 
> ...



It's about everything you think should be mentioned.

And thank you for the link, it's really interesting. I liked the transparency AA bit: close-up vegetation will look better on ATI, but distant vegetation better on Nvidia. The difference is negligible anyway, and it just shows how complicated it is to say one is better than the other.

I don't think the flashlight issue is common nowadays. Can any ATI owner confirm, please? If it persists, we can conclude that in this one specific feature of this single game Nvidia is the clear winner, but ATI looks OK too, it just doesn't get the same effect. Not that you would pass on an ATI card because of this...


----------



## pagalms (Mar 14, 2008)

Had both of them and I didn't see any difference in image quality.


----------



## Lillebror (Mar 14, 2008)

calvary1980 said:


> are we talking about an image or rendering a game, cause this is interesting.
> 
> http://enthusiast.hardocp.com/article.html?art=MTQwNCw3LCxoZW50aHVzaWFzdA==
> 
> ...



Maybe it's just me... but those flashlight shots - doesn't the shadow on the walls look a little weird on the 8800?


----------



## cdawall (Mar 14, 2008)

I have to go with Nvidia. I have used both, and TBH I have an FX5700 that looks better than a 3850 (same monitor, same res), and the WS cards, from what I could see, looked better from NV than from ATI


----------



## DarkMatter (Mar 14, 2008)

Lillebror said:


> Maybe it's just me... but those flashlight shots - doesn't the shadow on the walls look a little weird on the 8800?



I think I know what you're talking about: the border looks pixelated, is that what you mean?

It's just that Valve tried to simulate umbra/penumbra, but since they use a low-detail shadow (compared to modern games) it looks weird in a screenshot. You won't notice it at 30fps, because the lit/unlit pixels are different in each frame.
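
That averaging effect is easy to demonstrate. Here's a minimal sketch (my own toy model, not Valve's actual code; the edge position and penumbra width are made-up numbers) of how a hard 0/1 shadow test with a randomly jittered threshold converges, over many frames, to a smooth penumbra gradient:

```python
import random

def dithered_shadow(x, edge=0.5, penumbra=0.2):
    """Hard shadow test with a jittered threshold: inside the penumbra
    band a pixel comes out randomly lit or unlit on any single frame."""
    jitter = (random.random() - 0.5) * penumbra
    return 1.0 if x > edge + jitter else 0.0

def averaged(x, frames=1000):
    """Average the noisy per-frame result, roughly as your eye does at 30fps."""
    random.seed(0)  # deterministic for the example
    return sum(dithered_shadow(x) for _ in range(frames)) / frames

print(averaged(0.30))  # deep in shadow: stays 0.0
print(averaged(0.70))  # fully lit: stays 1.0
print(averaged(0.50))  # penumbra band: converges near 0.5
```

A single frame of `dithered_shadow` is all hard 0s and 1s, which is why one screenshot looks noisy at the edge, but the temporal average is a smooth grey ramp.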


----------



## niko084 (Mar 14, 2008)

I have seen it go back and forth a bit... Lately I think ATI cards look a bit better (not a lot, but a tiny bit if you're really picky).

But remember, quality beyond what's technically more correct comes down to what you personally like. I don't like Sony-branded TVs; they punch reds and oranges a little too much for my taste, and I prefer a naturally cooler color (not totally fixable in the setup, either).


----------



## DarkMatter (Mar 14, 2008)

niko084 said:


> I have seen it go back and forth a bit... Lately I think ATI cards look a bit better (not a lot, but a tiny bit if you're really picky).
> 
> But remember, quality beyond what's technically more correct comes down to what you personally like. I don't like Sony-branded TVs; they punch reds and oranges a little too much for my taste, and I prefer a naturally cooler color (not totally fixable in the setup, either).



That's a good point. One of my goals with this thread is indeed to discover exactly that. When people say "I think XXXXX IQ is better", are they talking technically or about their personal feelings? On the Crysis beta forums there were many people who liked ATI because it was more colorful. That's not technically better IQ, but a color preset difference, and it shouldn't justify an "ATI has better IQ" claim in a tech forum, for example.

Now there are other things, like AA, where there are some technical differences, but one isn't always better than the other; they are just different, and once again it depends on what you like more. calvary1980's link is a good example of this.

So in the end, since it all depends on personal preference, the best way to try to settle this is to compile image comparisons and people's impressions of those images. Then if someone new comes and reads "xxxx has better IQ", he has a place to see what that IQ difference is, if there is one at all according to what he sees in the pictures. One specific example is the color scheme: if you read the MaximumPC link in the first post, most of the specialists who chose ATI did so because of the color, a subtle personal preference. If you just look at the charts you can draw a misleading conclusion that was never in the specialists' minds.


----------



## Grings (Mar 14, 2008)

I think there's really nothing in it anymore; the only thing I've found is that my Nvidia card (G80 640MB GTS) takes less of a performance hit than the ATI (HD3870 GDDR4), though I think 2900s take far less of a hit.
Both have had the odd issue here and there in various games, anti-aliasing not working in certain games for example, but both cards have had this happen in equal measure, and it usually gets fixed in either a patch or a driver.


----------



## hat (Mar 14, 2008)

Did they pick random people off the street and ask them "hey, which one looks better to you?" or did they take computer-knowledgeable people like us?

We all know that during the GeForce 5, 6, and 7 series ATI supposedly had better image quality. So people carry that thought in the back of their heads, and when they look at 2 computers, one with ATI and one with Nvidia, they perceive in their own minds that ATI is better because of that preconception.

I generally choose Nvidia because they have better performance, and I can hardly tell the difference between ATI and Nvidia in still images of the same thing, so how the hell am I gonna tell when I'm running around trying to kill shit that's trying to kill me? Am I gonna stand there gawking at how pretty the enemy's muzzle flash looks? No, I'm showing him mine!

Bottom line: the game and what settings you have set in your drivers (AA/AF) determine image quality.


----------



## ShadowFold (Mar 14, 2008)

ATI is definitely better at making cards and has better tech; they just don't have the horsepower Nvidia does.


----------



## das müffin mann (Mar 14, 2008)

with the newer ATI and Nvidia cards I can't notice a diff in IQ at all


----------



## niko084 (Mar 15, 2008)

DarkMatter said:


> That's a good point. One of my goals with this thread is indeed to discover exactly that. When people say "I think XXXXX IQ is better", are they talking technically or about their personal feelings? On the Crysis beta forums there were many people who liked ATI because it was more colorful. That's not technically better IQ, but a color preset difference, and it shouldn't justify an "ATI has better IQ" claim in a tech forum, for example.



Well, on the subjective side, I do believe people have said before that ATI's cards are more "accurate" color/shading/drawing-wise. That doesn't mean you will necessarily like it more, but I have heard many say "more accurate".

As far as IQ from processing is concerned, well, that's all down to who has more power and can process more.


----------



## strick94u (Mar 15, 2008)

WHY IS THERE NO MATROX !!!!!
Matrox is never even considered:shadedshu


----------



## cdawall (Mar 15, 2008)

strick94u said:


> WHY IS THERE NO MATROX !!!!!
> Matrox is never even considered:shadedshu



TBH I like Via's onboard IQ, it's nice


----------



## yogurt_21 (Mar 15, 2008)

Nvidia, and here's why:
while in the VR-Zone review they mention that ATI with 16x AA is a little better than the 8800 with 16x AA, they fail to mention the framerates of both cards at those settings.

While ATI can do a higher 24x edge-detect AA, it can't run it in almost any game since 2005 (at a decent resolution). 

I've compared my OCed 2900XT vs a friend's OCed 8800GT, and IQ-wise he was able to run better AA and AF settings than I could.

This in my mind gives Nvidia better IQ in game, as theoretical quality is nice, but playability is really what I'm after. 
So why am I with ATI? Cause I'm a fanboy, plain and simple.


----------



## das müffin mann (Mar 15, 2008)

it would be interesting if ATI and Nvidia got together to make a card; granted, I'm not looking at the competition or money aspect of this, I just think together they could produce one hell of a card. Shitty drivers at first...lol. But TBH, back in the day I did think ATI had a little better IQ; these days I can't tell the difference, and both camps make cards that are cheap and get the job done


----------



## Widjaja (Mar 15, 2008)

I've just switched between an X1950 Pro and an 8800GT.

While I had the 8800GT (now RMA'd after overheating in 20 minutes) I played two games, DiRT and S.T.A.L.K.E.R., both at the same res and same settings.
I am disregarding framerates, as the 8800GT obviously beats the X1950 Pro quite easily.

In DiRT, the X1950 Pro came out a bit better, as at times I saw slight white lines on the road with the 8800GT where the 'faces' join together (if anyone knows about 3D modelling).

In STALKER, the 8800GT blew away the X1950 Pro with straight performance.
The game ran so smooth, but the image quality stayed the same.

In this situation the X1950 Pro comes out on top in image quality, by a little bit but nothing dramatic.
But still, fps and smoothness were not included.


----------



## DarkMatter (Mar 15, 2008)

hat said:


> Did they pick random people off the street and ask them "hey, which one looks better to you?" or did they take computer-knowledgeable people like us?
> 
> We all know that during the GeForce 5, 6, and 7 series ATI supposedly had better image quality. So people carry that thought in the back of their heads, and when they look at 2 computers, one with ATI and one with Nvidia, they perceive in their own minds that ATI is better because of that preconception.
> 
> ...



If you are talking about the MaximumPC article, they are supposed to be experts on the subject. From the article:



> THE TEST SUBJECTS
> 
> We recruited our 21 evaluators from the ranks of the Future US staff, including editors and art directors from other print and online publications. We chose these individuals because of their in-depth expertise at evaluating image quality in all three of our test criteria.



http://www.futureus-inc.com



ShadowFold said:


> ATI is definitely better at making cards and has better tech; they just don't have the horsepower Nvidia does.



Apart from that being debatable, what does that have to do with the thread?



niko084 said:


> Well, on the subjective side, I do believe people have said before that ATI's cards are more "accurate" color/shading/drawing-wise. That doesn't mean you will necessarily like it more, but I have heard many say "more accurate".
> 
> As far as IQ from processing is concerned, well, that's all down to who has more power and can process more.



Well, I'm an image designer, and I think Nvidia's colors are more accurate when it comes to representing real colors, the ones that will later come out of a printer. But that doesn't matter for gaming, and I use a CRT monitor.

Indeed, that's a question I've been wondering about: could it be different for LCDs and CRTs? It's a stupid question, I know. LCDs and CRTs look different; what I mean is, if A is better than B on CRTs, is the same true on LCDs?



yogurt_21 said:


> Nvidia, and here's why:
> while in the VR-Zone review they mention that ATI with 16x AA is a little better than the 8800 with 16x AA, they fail to mention the framerates of both cards at those settings.
> 
> While ATI can do a higher 24x edge-detect AA, it can't run it in almost any game since 2005 (at a decent resolution).
> ...



A very good point too, though I don't know if it's a little bit outside the scope of this thread. But I do see your point, and clearly a higher level of AA or AF makes for a bigger improvement in IQ for sure.



das müffin mann said:


> it would be interesting if ATI and Nvidia got together to make a card; granted, I'm not looking at the competition or money aspect of this, I just think together they could produce one hell of a card



Haha! But it could happen, seriously! Chances are low, but with the open fight established right now between Nvidia (AMD seems to agree) and Intel about whether the GPU or the CPU is more important nowadays, and looking at how AMD can't fight Intel in high-end CPUs... You know Intel is trying hard to move graphics to CPU-like environments; AMD, on the other hand, is making Fusion, so they are staying with GPU-like graphics. Intel has a lot of resources, and it could happen that the graphics companies have to collaborate to fight the giant, or otherwise capitulate to the giant's propositions.


----------



## Deusxmachina (Mar 15, 2008)

I've recently had or still have a 2400, a 2900, a 7600, and an 8500.  I don't play enough games to have noticed a difference that way, but I have watched HDTV stuff on all of them for hours.  ATI seems to have "better" colors.  Like they have more punch without looking fake or bleeding (red, for instance).  I can turn up the colors on the Nvidia so they're not as flat, but around the same time I really start enjoying them they start bleeding or look fake.  I can turn up the colors on the ATIs, and if I do they have a higher "fake" threshold.

Of course, it's not like any of them look "bad," but without really messing with the settings, I'd have to give the nod to ATI in the color department.


----------



## panchoman (Mar 15, 2008)

the poll results are quite interesting.. wasn't expecting those results at all!


----------



## Nitro-Max (Mar 15, 2008)

I've owned a mix of both cards and ATI is by far better in image quality. I don't trust Nvidia, not after the driver-fixing scam around the time the FX cards were out.
It was a disappointing sell-out to many people; Nvidia couldn't take the competition and sales were down, so they cheated.

I choose ATI every time now, regardless of fps results; I simply lost trust in Nvidia.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> I've owned a mix of both cards and ATI is by far better in image quality. I don't trust Nvidia, not after the driver-fixing scam around the time the FX cards were out.
> It was a disappointing sell-out to many people; Nvidia couldn't take the competition and sales were down, so they cheated.
> 
> I choose ATI every time now, regardless of fps; I simply lost trust.



So have you owned a GeForce 8 series card or not? You don't state it clearly, and anyone would assume not, given what you say about the FX thing and trusting Nvidia. That was long ago; we are discussing the present. And remember that, even if not as exaggerated, ATI did the same. None of those are arguments for this thread.


----------



## Nitro-Max (Mar 15, 2008)

This isn't fiction. They messed with some low-level details, but it did in fact lower image quality to boost fps results and fake benchmarks against ATI cards. 

Read this OR google "nvidia driver cheating": 

http://www.hardwareanalysis.com/content/article/1709/

They've even been accused of it now with Crysis. It doesn't matter to me when it happened; the fact is it HAPPENED, and to me they cannot be trusted. This is my view and my right, and no one and nothing will sway me from it.
Nvidia do make some nice cards, granted, but I'd never trust them again.


----------



## Widjaja (Mar 15, 2008)

Nitro-Max said:


> This isn't fiction. They messed with some low-level details, but it did in fact lower image quality compared to ATI.
> 
> Read this OR google "nvidia driver cheating"
> 
> ...



That's what happens if you pay the game devs.
Some way to make the card look like it's performing better.

Although this didn't happen with Bioshock, where a lot of people with 7900GSs were getting serious issues and the ATI cards were doing their job fine.


----------



## hat (Mar 15, 2008)

Why are people concerned with running 16x AA? Really, I find 2x AA to be "enough" because it hardly hits performance at all and it "softens" things up so they don't look like absolute crap. 4x AA is great; things look great at this setting and performance is, again, hardly hit at all.
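
For anyone curious why 4x already looks "great", here's a toy sketch (my own illustration, not any driver's actual code; the edge position 0.6 is an arbitrary made-up value) of how MSAA-style coverage sampling works and why extra samples give diminishing returns: a pixel cut by an edge should be shaded with its true covered fraction, and even a small sample grid gets reasonably close.

```python
def aa_coverage(n, edge_pos=0.6):
    """Estimate the covered fraction of a pixel cut by a vertical edge
    at edge_pos, using an n x n grid of sample points (n*n-sample AA)."""
    hits = sum(1 for i in range(n) for j in range(n)
               if (i + 0.5) / n < edge_pos)
    return hits / (n * n)

# True coverage is 0.6. The single centred sample calls the pixel fully
# covered (1.0, a hard jagged edge); 4 and 16 samples both land on 0.5;
# 256 samples reach 0.625.
for n in (1, 2, 4, 16):
    print(f"{n * n}x AA -> {aa_coverage(n):.3f}")
```

The big jump in accuracy comes from the first few samples; beyond that, each doubling refines the shaded value less and less, which lines up with 2x/4x capturing most of the visible benefit.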


----------



## Nitro-Max (Mar 15, 2008)

I think it's down to us, the users, whether we want to sacrifice image quality for performance. But Nvidia had to do it to beat the competition. I actually parted with £160 for an FX5600 Ultra going on those fake driver benchmarks, just to find out my old card was faster!! What a waste.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> This isn't fiction. They messed with some low-level details, but it did in fact lower image quality compared to ATI.
> 
> Read this OR google "nvidia driver cheating"
> 
> ...



The Crysis and 169.04 drivers issue was a bug, not a cheat. If it were a cheat, they would have made those reflections look at least half decent. That, and 169.09 was faster than 169.04 and looked fine.

And for this thread I asked for some kind of justification, beyond the "Nvidia cheated in 2004 and I haven't bought a single Nvidia card since, thus ATI has better IQ" argument. That's the kind of argument a fanboi makes. You need some kind of experience with the IQ of both cards, be it from personal experience or from looking at reviews, and then give an opinion based on that, as objective as possible. And when I say objective, I only ask that you don't blindly dismiss Nvidia's pics. 
If you voted, then until you do this, you are cheating in my thread.


----------



## Nitro-Max (Mar 15, 2008)

Like I said, it doesn't matter when it happened; the trust is gone, and I won't process or swallow anything from Nvidia.
Call it a one-track mind, but it seems you are the same when it comes to Nvidia.

I voted and gave a reason just as was asked, and yes, I have had a GTS and a GTX since.


----------



## calvary1980 (Mar 15, 2008)

lol I don't like where this is going; you guys are voting and arguing based on age-old incidents. 

- Christine


----------



## Nitro-Max (Mar 15, 2008)

calvary1980 said:


> lol I don't like where this is going; you guys are voting and arguing based on age-old incidents.
> 
> - Christine



Who's arguing?  I gave my vote and the reason for my vote, just as it says to do.

Not my problem if he doesn't like it.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> Like I said, it doesn't matter when it happened; the trust is gone, and I won't process or swallow anything from Nvidia.
> Call it a one-track mind, but it seems you are the same when it comes to Nvidia.



I have owned both ATI and Nvidia. In my main rig I've used more Nvidia in the past (6800GT, 7900GTX) because of Nvidia's Stereo 3D drivers, that's all. In that department Nvidia was infinitely better than ATI, since ATI didn't have any. Now I have an 8800GT because here, I don't know why, the 8800GT is cheaper than the HD3870 and the 9600GT cheaper than the HD3850... For instance, I bought my card for 203€; the cheapest HD3870 I could find then was 210€. It was a no-brainer.

Anyway, my point is that you shouldn't have voted here, because you don't know (and don't want to know) how Nvidia's latest generation looks.


----------



## Nitro-Max (Mar 15, 2008)

Dude, I already said I've owned a GTS and a GTX, and it still doesn't change my vote lol. Nvidia looked blurry like Xbox games do lol, and made the writing on my LCD blurry too; ATI is a lot sharper, crisper and more vibrant.


----------



## KainXS (Mar 15, 2008)

I went from an FX5700 to an X1650 PRO to a 7950GT and finally to an HD3850, and I can say that ATI definitely has better image quality than Nvidia.

There's actually a review that compares Crysis image quality on the HD3870 and the 8800GT, and the HD3870 definitely looks better.

There are tons of image quality comparisons out there, and they almost always end up with ATI having better quality.


----------



## Nitro-Max (Mar 15, 2008)

KainXS said:


> I went from a 5700FX to an X1650PRO to a 7950GT and finally to an HD3850, and I can say that Ati definitely has better image quality than Nvidia.
> 
> There's actually a review comparing Crysis image quality on the HD3870 and the 8800GT, and the HD3870 definitely looks better than the 8800GT.
> 
> There are tons of image quality comparisons out there, and they almost always end up with ATI having better quality.



Amen!!!


----------



## calvary1980 (Mar 15, 2008)

I want to see this IQ Review.

- Christine


----------



## DarkMatter (Mar 15, 2008)

KainXS said:


> I went from a 5700FX to an X1650PRO to a 7950GT and finally to an HD3850, and I can say that Ati definitely has better image quality than Nvidia.
> 
> There's actually a review comparing Crysis image quality on the HD3870 and the 8800GT, and the HD3870 definitely looks better than the 8800GT.
> 
> There are tons of image quality comparisons out there, and they almost always end up with ATI having better quality.



I have 5 reviews in the first post as proof that that's not true, and I haven't seen any other one saying it. Any comparison involving ForceWare 169.04 (all of those that compare Crysis) doesn't count. Please, the point of this thread is that you provide proof, links. Otherwise, and since the only 5 posted links say both have the same IQ, your point is easily rebuttable and fanboi-ish. Post actual proof; then you can say the HD3870 looks definitely better, but until then your point has no weight.



Nitro-Max said:


> Dude, I already said I've owned a GTS and a GTX, and it still doesn't change my vote lol. Nvidia looked blurry like Xbox games do, and made the writing on my LCD blurry too; Ati is a lot sharper, crisper and more vibrant.



Where did you say that? I don't trust you anyway. You can't trust anything from Nvidia, yet you bought 2 cards with such close performance? I just can't trust you.


----------



## Nitro-Max (Mar 15, 2008)

You like to say IQ and fanboi a lot, don't you?

Why start a thread and ask for votes and reasons for the votes when you're just gonna argue with everyone's reason? Sounds like you're the fanboi to me.
Just let people have their say; if it's what they feel, who are you to change that?!

"Oh a Nvidia owner"

Someone close this stupid flame thread.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> You like to say IQ and fanboi a lot, don't you?
> 
> Why start a thread and ask for votes and reasons for the votes when you're just gonna argue with everyone's reason? Sounds like you're the fanboi to me.
> Just let people have their say; if it's what they feel, who are you to change that?!



You didn't give a reason; that's why.
Many people have expressed their preference for Ati, but I didn't say anything to them. Try guessing why...


----------



## KainXS (Mar 15, 2008)

DarkMatter said:


> I have 5 reviews in the first post as proof that that's not true, and I haven't seen any other one saying it. Any comparison involving ForceWare 169.04 (all of those that compare Crysis) doesn't count. Please, the point of this thread is that you provide proof, links. Otherwise, and since the only 5 posted links say both have the same IQ, your point is easily rebuttable and fanboi-ish. Post actual proof; then you can say the HD3870 looks definitely better, but until then your point has no weight.
> 
> 
> 
> Where did you say that? I don't trust you anyway. You can't trust anything from Nvidia, yet you bought 2 cards with such close performance? I just can't trust you.



I agree with Nitro; the only fanboy I hear is you. What cards have you had in the past, all Nvidia, right?

If I could get my hands on an 8800 series card then I would really be able to make a comparison, but since I can't, I will compare the 7950GT to the HD3850, as those are the 2 latest cards I have owned, and the ATI had better quality.

The guy who made this thread just wants to get an argument started.


----------



## Widjaja (Mar 15, 2008)

Pssst, everybody's watching, ROFL!!


----------



## Nitro-Max (Mar 15, 2008)

Conversation over. I've been on this forum a long time now, and I ain't gonna get into an Nvidia vs Ati owner battle; I would be out of character.

I gave my vote, with my reason, having experience with both card makers; if you don't like my reason, then that's not my problem.

Sorry people, I do apologise.


----------



## KainXS (Mar 15, 2008)

Nitro-Max said:


> Conversation over. I've been on this forum a long time now, and I ain't gonna get into an Nvidia vs Ati owner battle; I would be out of character.
> 
> I gave my vote, with my reason, having experience with both card makers; if you don't like my reason, then that's not my problem.
> 
> Sorry people



I agree with you there, Nitro.


----------



## calvary1980 (Mar 15, 2008)

I want some IQ comparison tests for Crysis, or at the very least some shake 'n' bake chicken to silence me!



> Fanboi
> 2 up, 1 down
> 
> 
> ...



- Christine


----------



## KainXS (Mar 15, 2008)

calvary1980 said:


> I want some IQ comparison tests for Crysis, or at the very least some shake 'n' bake chicken to silence me!
> 
> - Christine



http://iax-tech.com/video/3870/38704.htm

Here's one, Christine.

It's short, so just read the bottom.


----------



## Nitro-Max (Mar 15, 2008)

There's nothing wrong with your IQ, Christine.


----------



## Widjaja (Mar 15, 2008)

Nitro-Max said:


> There's nothing wrong with your IQ, Christine.




Not at all, it's just as low as the rest of ours.


----------



## cdawall (Mar 15, 2008)

Old but useful:

http://www.extremetech.com/slideshow/0,2394,s=1017&a=128800,00.asp


----------



## calvary1980 (Mar 15, 2008)

I know, because I was smart enough to vote 'both'. I think Crysis is stupid, I think people who play Crysis are stupid, and I think people who build a system around Crysis, not realizing the programmers are at fault, are stupid!

I also think the iax-tech article isn't much to lean on to swing a vote. 

HardOCP is one of the few places that do IQ comparisons and you have to go back to the XT and 7 series to find any dirt.

- Christine


----------



## cdawall (Mar 15, 2008)

calvary1980 said:


> I know, because I was smart enough to vote 'both'. I think Crysis is stupid, I think people who play Crysis are stupid, and I think people who build a system around Crysis, not realizing the programmers are at fault, are stupid!
> 
> I also think the iax-tech article isn't much to lean on to swing a vote.
> 
> - Christine



I think Crysis is just fine; I played it on an old AGP rig at 1024x768 with medium/high settings.


----------



## Nitro-Max (Mar 15, 2008)

calvary1980 said:


> I know, because I was smart enough to vote 'both'. I think Crysis is stupid, I think people who play Crysis are stupid, and I think people who build a system around Crysis, not realizing the programmers are at fault, are stupid!
> 
> I also think the iax-tech article isn't much to lean on to swing a vote.
> 
> ...



Lmao, can't argue with that lol. I own Crysis, but I spend most of my time sorting out its bugs lol.
ATM it's saying emulation software is stopping it from loading?? I don't have any emulation software.


----------



## KainXS (Mar 15, 2008)

http://www.driverheaven.net/reviews/3870-XXX/IQ.php

Here's another one for ya, Cindy.


----------



## DarkMatter (Mar 15, 2008)

KainXS said:


> I agree with nitro, the only fanboy I hear is you, what cards have you had in the past, all nvida right,



In what way am I a fanboi? That's the problem with all fanbois: they think that whoever disagrees with them is a fanboi of the competition. Where did I favor Nvidia, when I am saying both have the same IQ? Where am I downplaying Ati? Nah, I know I'm arguing with a pair of blind fanbois who can't follow this simple thinking...

And for the record:

A Matrox, don't know which one.
3DFx Voodoo 3000

Nvidia cards: 
TNT2
GF Ti4800 (I wanted an Ati 9700, but its price was double that of the GF4 back then)
GF 6800 GT
GF 7900GTX
GF 8800GT

Ati cards:
7500
8500 pro
9250
x550
x800xt (swapped with the 6800GT for some months with a friend so he could play stereo 3D)
X1900 XTX (same as above but with 7900GTX)
X1950 XTX (bought one on ebay.us with the idea of selling it here for profit; had it at home for 2 months since I couldn't sell it)

Apart from that, my friend bought those cards because I told him to. And my uncles have had an HD2400XT, X850XT, 9250, X550 pro, 8400GS, 6200LE and 9600 pro because I told them to buy them. But yeah, I must admit I'm an incorrigible Nvidia fanboi... 

EDIT: I had a hard time completing the list; I had to call one of my uncles to ask him. lol


----------



## Bundy (Mar 15, 2008)

I've got one of each, but they are a mile apart in generations (8800 Ultra vs Radeon 9200!). LOL, the ATI does look better if I view pictures etc., but it's old and very slow.
My vote in the end was for no difference, because that's what I thought I saw when I was shopping for a card for my new system. I looked at both brands, and my personal vote for what card to buy therefore came down to fps, and I switched from ATI to Nvidia.


----------



## KainXS (Mar 15, 2008)

DarkMatter said:


> In what way am I a fanboi? That's the problem with all fanbois: they think that whoever disagrees with them is a fanboi of the competition. Where did I favor Nvidia, when I am saying both have the same IQ? Where am I downplaying Ati? Nah, I know I'm arguing with a pair of blind fanbois who can't follow this simple thinking...
> 
> And for the record:
> 
> ...



I call you a fanboy because every time I see one of your posts you're either downing Ati cards, praising Nvidia like they are gods, or calling someone else a fanboy. You and I both know this is not the first time someone has called you a fanboy in the last month, or even week.


----------



## Nitro-Max (Mar 15, 2008)

I respect your Nvidia-loving ways; you have your reasons for it, the same as I have mine, so please respect mine too, and my vote. That's all there is to it. Whatever the reason, people like what they like; it's down to them, and I'm sure their vote will reflect this.
Let's just keep it civilized.


----------



## Mussels (Mar 15, 2008)

I have only read part of this, and I wish to add my own comments on other people's opinions.

ATI had better quality because for ages Nvidia's anisotropic filtering was horrible; especially in the Unreal Tournament games you could see the effects VERY clearly, even in screenshots. This no longer occurs; it's VERY close between the brands now.

The point is this:
If ATI looks better than Nvidia with 4x AA and 16x AF, ATI is the quality winner. However, if Nvidia gets 60 FPS and ATI gets 30, I'd choose the Nvidia: in GAMEPLAY I'd be running no AA and 4x AF on the ATI, so the NV would look better in ACTUAL USE.


----------



## KainXS (Mar 15, 2008)

Mussels said:


> I have only read part of this, and I wish to add my own comments on other people's opinions.
> 
> ATI had better quality because for ages Nvidia's anisotropic filtering was horrible; especially in the Unreal Tournament games you could see the effects VERY clearly, even in screenshots. This no longer occurs; it's VERY close between the brands now.
> 
> ...



That's actually very true, Mussels.


----------



## Nitro-Max (Mar 15, 2008)

It's definitely necessary to sacrifice quality sometimes to get better fps, but I also think it's down to us, the users, to make that decision.

Like I said in my previous posts about Nvidia cheating by messing around with drivers and lowering details to look better than the competition on fps: that was wrong of them, and it gave out false benchmark scores.

I'm not saying it happens now, this was in the past, but it did indeed happen, and I lost trust in Nvidia because of it.


----------



## calvary1980 (Mar 15, 2008)

KainXS said:


> http://www.driverheaven.net/reviews/3870-XXX/IQ.php
> 
> Here's another one for ya, Cindy.



*scratches her chin in suspicion* OK, my name is Christine, or Chris, not Cindy. Are you going senile? :>

- Christine


----------



## Mussels (Mar 15, 2008)

Nitro-Max said:


> It's definitely necessary to sacrifice quality sometimes to get better fps, but I also think it's down to us, the users, to make that decision.
> 
> Like I said in my previous posts about Nvidia cheating by messing around with drivers and lowering details to look better than the competition on fps: that was wrong of them, and it gave out false benchmark scores.
> 
> I'm not saying it happens now, this was in the past, but it did indeed happen, and I lost trust in Nvidia because of it.



That did happen; that's why I stayed away from Nvidia until the 8800GTX came out, when everyone drooled over the speed and the fact that its quality was back up to scratch.



calvary1980 said:


> *scratches her chin in suspicion* OK, my name is Christine, or Chris, not Cindy. Are you going senile? :>
> 
> - Christine



I'll call you anything you want, if he's not good enough  (lol)


----------



## DarkMatter (Mar 15, 2008)

KainXS said:


> I call you a fanboy because every time I see one of your posts you're either downing Ati cards, praising Nvidia like they are gods, or calling someone else a fanboy. You and I both know this is not the first time someone has called you a fanboy in the last month, or even week.



Show me proof of that. Whether I have been called a fanboi by someone even more of a fanboi than you is outside this debate. I am not a fanboi. I never downplayed Ati, nor did I praise Nvidia like gods. I don't have anything against Ati; I do have many reasons not to like R6xx, well-reasoned arguments. But that's only seen as Nvidia fanboyism by a big Ati fanboi. Christine's post #64 says it all. Don't continue with this. There is plenty of proof that G92 is better (not in image quality, but in almost everything else); I am not a fanboi just because I agree with it.


----------



## erocker (Mar 15, 2008)

Next post arguing about "fanboys"... you probably know what will happen. :shadedshu
No one wants to hear it.


----------



## Nitro-Max (Mar 15, 2008)

"I'll call you anything you want, if he's not good enough  (lol)"

LMAO!! Just don't call her a fanboi.


----------



## calvary1980 (Mar 15, 2008)

Can't you rat bastards spell my name right? lol

I thought about going Spider with 4x HD 3850 for kicks, but then I thought about the 8800GTS with a vMod. ATI can never really give me a reason to choose them over nVidia.

- Christine


----------



## DarkMatter (Mar 15, 2008)

erocker said:


> Next post arguing about "fanboys"... you probably know what will happen. :shadedshu



Yeah, sorry, but I just can't bear it when they insult me. Yes, I consider fanboi an insult when there are no reasons behind it. But we're done with it now.


----------



## DarkMatter (Mar 15, 2008)

The iax-tech and Extremetech links are not valid.

Iax-tech is using the 169.04 drivers, and I already said those were faulty.

Extremetech is about the 6800 and X800. Come on, we are talking current gen...

Driverheaven link added.


----------



## Monkeywoman (Mar 15, 2008)

ATI has better image quality because they can optimize their large number of shaders for AA and AF. Only one problem: the R600 screwed everything up by processing AA in the driver, not in hardware like the X1xxx cards did. This will be fixed, though, with the R700.


----------



## jbunch07 (Mar 15, 2008)

Lillebror said:


> Maybe it's just me... but those flashlight shots: doesn't the shadow on the walls look a little weird on the 8800?



I agree, the shadows did look a little strange.


----------



## Mussels (Mar 15, 2008)

Monkeywoman said:


> ATI has better image quality because they can optimize their large number of shaders for AA and AF. Only one problem: the R600 screwed everything up by processing AA in the driver, not in hardware like the X1xxx cards did. This will be fixed, though, with the R700.



Monkeywoman? is right. This is the point I was making: quality is pointless if you can't use it. No one wants 10 or 20 FPS with AA; they want 30+ or 60+. ATI screwed up AA on their CURRENT models just like Nvidia screwed up AF in the past.


----------



## calvary1980 (Mar 15, 2008)

nVidia is like O'Doyle: they rule! lol 

http://www.youtube.com/watch?v=CsNgRmsx-14

- Christine


----------



## jbunch07 (Mar 15, 2008)

Why is it that some games look better on NV cards and some look better on ATI cards? I've always wanted to know this.
Do game devs make games that work better with one or the other, or not?


----------



## Nitro-Max (Mar 15, 2008)

Well, with my X1900XT I used low to medium quality to play Counter-Strike: Source to get max fps, especially when playing on servers that were full.

I must say the 3870X2 handles this no problem with everything turned up full. Quality and performance without sacrifice.


----------



## Monkeywoman (Mar 15, 2008)

jbunch07 said:


> Why is it that some games look better on NV cards and some look better on ATI cards? I've always wanted to know this.
> Do game devs make games that work better with one or the other, or not?



When you buy a game, look at the box. If it says Nvidia "The Way It's Meant To Be Played", then it will naturally run better on a GeForce than on a Radeon, because the game was developed on GeForce technology. This is why ATI cards are slow to gain performance: they rely so much on driver support to optimize them for each game. Radeon cards are like wine; they get better with age.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> Well, with my X1900XT I used low to medium quality to play Counter-Strike: Source to get max fps, especially when playing on servers that were full.
> 
> I must say the 3870X2 handles this no problem with everything turned up full. Quality and performance without sacrifice.



That's impossible!! 

CSS runs at 60+ fps at whatever settings you throw at it on an X1900XT. What do you call max fps??


----------



## erocker (Mar 15, 2008)

This link here is posted in the OP. http://www.driverheaven.net/reviews/3870-XXX/IQ.php

This is between a 3870 and a 8800GT.  Look closely at the pictures, especially things like the cracks in the ground as they get further away.


----------



## Nitro-Max (Mar 15, 2008)

DarkMatter said:


> That's impossible!!
> 
> CSS runs at 60+ fps at whatever settings you throw at it on an X1900XT. What do you call max fps??



Actually, the console limits your fps; you have to change it.

CSS runs 60+ fps at whatever settings you throw at it (try that on a full 64-man server).

CPU speed can also decide what fps you get, not just the card; an X1900XT can be bottlenecked, don't forget.
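For anyone wondering how to change that cap: on the Source engine of that era the limit came from console variables, set in the developer console or an autoexec.cfg. The exact defaults varied by engine version, so treat the numbers below as illustrative rather than authoritative:

```
// Source engine console / autoexec.cfg (defaults varied by version)
fps_max 0        // 0 removes the engine frame cap (default was around 300)
mat_vsync 0      // make sure vsync isn't capping fps at the refresh rate
cl_showfps 1     // overlay the current framerate to verify
```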


----------



## KainXS (Mar 15, 2008)

Monkey, the only reason the R600 has (slightly) better quality is because it has 64 superscalar shaders, while Nvidia's 8800s have about 96-128, I think. Those 64 shaders have 5 stream units each, and because of this they have a little better quality. It's complicated, and some people think "hey, my card's faster because it has 320 shaders"; it doesn't have 320 shaders.

It's just a fact that superscalar architectures usually mean better image quality but slower processing.

Monkey, the cards do not have a large number of shaders (320); it's actually 64 complex shaders and 256 simple shaders, kind of like image filters and whatnot.



DarkMatter said:


> Show me proof of that. Whether I have been called a fanboi by someone even more of a fanboi than you is outside this debate. I am not a fanboi. I never downplayed Ati, nor did I praise Nvidia like gods. I don't have anything against Ati; I do have many reasons not to like R6xx, well-reasoned arguments. But that's only seen as Nvidia fanboyism by a big Ati fanboi. Christine's post #64 says it all. Don't continue with this. There is plenty of proof that G92 is better (not in image quality, but in almost everything else); I am not a fanboi just because I agree with it.



Oh, believe me, there's tons of proof in other threads you posted in, but erocker told me to cut it, so I will.


----------



## DarkMatter (Mar 15, 2008)

KainXS said:


> Monkey, the only reason the R600 has (slightly) better quality is because it has 64 superscalar shaders, while Nvidia's 8800s have about 96-128, I think. Those 64 shaders have 5 stream units each, and because of this they have a little better quality. It's complicated, and some people think "hey, my card's faster because it has 320 shaders"; it doesn't have 320 shaders.
> 
> It's just a fact that superscalar architectures usually mean better image quality but slower processing.
> 
> The cards do not have a large number of shaders (320).



They do have 320 shader processors; it just happens that they are arranged in 64 groups of 1 scalar unit + 1 4D vector unit. They are not superscalar; they wish they were, and I wish they were. Now that would be a fast card!!

The last sentences are just false, but I don't want to argue anymore...


----------



## KainXS (Mar 15, 2008)

DarkMatter said:


> They do have 320 shader processors; it just happens that they are arranged in 64 groups of 1 scalar unit + 1 4D vector unit. They are not superscalar; they wish they were, and I wish they were. Now that would be a fast card!!
> 
> The last sentence is just false, but I don't want to argue anymore...



Isn't that what I just said?


----------



## Monkeywoman (Mar 15, 2008)

KainXS said:


> Monkey, the only reason the R600 has (slightly) better quality is because it has 64 superscalar shaders, while Nvidia's 8800s have about 96-128, I think. Those 64 shaders have 5 stream units each, and because of this they have a little better quality. It's complicated, and some people think "hey, my card's faster because it has 320 shaders"; it doesn't have 320 shaders.
> 
> It's just a fact that superscalar architectures usually mean better image quality but slower processing.
> 
> The cards do not have a large number of shaders (320).




Yes, but this depends on CPU speed much more than the GeForce cards do, because of the way the AA is handled. To get "performance" out of the R6xx ATI cards, you have to have a fast CPU to process the driver work. Take a look at the Futuremark ORB: the faster the CPU, the higher the scores, because the bottleneck at the driver level is eliminated. 

The R7xx will not rely on the driver, because the driver relies on the CPU, which leads to a drop in quality and performance.


----------



## DarkMatter (Mar 15, 2008)

KainXS said:


> Isn't that what I just said? I mean, are you just trying to piss me off or what?
> 
> You just rephrased it.



Vector is not in any way the same as superscalar. In a superscalar design you issue tasks just as in a purely scalar design, but the processor can start the next instruction before the last one has finished.
In a vector design you have to issue a task for the whole vector; if you only use one of the SPs, the other 3 are wasted for that clock cycle.

Also, scalar, vector or superscalar has nothing to do with image quality, but that was a lovely invention. LOL
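DarkMatter's lane-waste point can be sketched with a toy utilization model (the numbers and function names are purely illustrative, not how R600 or G80 actually schedule work): a 5-wide vector group issues one bundle per clock from a single thread, so any lanes beyond the instruction-level parallelism the compiler can pack together sit idle, while independent scalar processors can each pull work from a different thread.

```python
# Toy model: ALU lane utilization for a 5-wide vector group (R600-style)
# versus independent scalar SPs (G80-style). Illustrative only.

def vector_utilization(packable_ops: int, width: int = 5) -> float:
    """A vector/VLIW group issues one bundle per clock from one thread;
    lanes beyond the ops the compiler could pack together stay idle."""
    return min(packable_ops, width) / width

def scalar_utilization(ready_threads: int, num_sps: int = 5) -> float:
    """Scalar SPs each take an op from a different thread, so with enough
    threads in flight every SP stays busy regardless of per-thread ILP."""
    return min(ready_threads, num_sps) / num_sps

# Purely scalar shader code (only 1 packable op per clock) on the 5-wide group:
print(vector_utilization(1))        # 0.2 -> 4 of the 5 lanes wasted each clock
# Fully packed vec4 + scalar work fills the whole group:
print(vector_utilization(5))        # 1.0
# Scalar SPs with plenty of threads in flight stay fully busy:
print(scalar_utilization(1000))     # 1.0
```

In this toy model the "320 shaders" only deliver their full rate when 5 independent operations can be packed per clock, which is why the raw unit count can overstate effective throughput on scalar-heavy code.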


----------



## jbunch07 (Mar 15, 2008)

Monkeywoman said:


> When you buy a game, look at the box. If it says Nvidia "The Way It's Meant To Be Played", then it will naturally run better on a GeForce than on a Radeon, because the game was developed on GeForce technology. This is why ATI cards are slow to gain performance: they rely so much on driver support to optimize them for each game. Radeon cards are like wine; they get better with age.



Thanks, Monkey! 
I like that phrase about Radeons; nicely said, and so true!


----------



## KainXS (Mar 15, 2008)

DarkMatter said:


> Vector is not in any way the same as superscalar. In a superscalar design you issue tasks just as in a purely scalar design, but the processor can start the next instruction before the last one has finished.
> In a vector design you have to issue a task for the whole vector; if you only use one of the SPs, the other 3 are wasted for that clock cycle.
> 
> Also, scalar, vector or superscalar has nothing to do with image quality, but that was a lovely invention. LOL



You're right, idk anymore, you're right man.

Still, I'm gonna have to say that comparing the HD38XX to the 79XX series, ATI has better quality; compared to the 88XX series I don't know, because I don't have an 88XX card.

Personally, I'm thinking it might be similar in most cases, after seeing the DriverHeaven review.


----------



## Widjaja (Mar 15, 2008)

bundyrum&coke said:


> I've got one of each, but they are a mile apart in generations (8800 Ultra vs Radeon 9200!). LOL, the ATI does look better if I view pictures etc., but it's old and very slow.
> My vote in the end was for no difference, because that's what I thought I saw when I was shopping for a card for my new system. I looked at both brands, and my personal vote for what card to buy therefore came down to fps, and I switched from ATI to Nvidia.



I fully agree with you, as smoother fps was the reason why I went for nVidia this time around.
I knew there was a chance the graphics might not be as good in some games.

A good example was when I played a game called The Movies.
At the time I had an ATi 9550, then changed to a 7600GS, and noticed the movie replays were a lot more pixelated and just plain ugly; but in saying that, the game played a lot smoother and wasn't so stuttery.


----------



## DarkMatter (Mar 15, 2008)

erocker said:


> This link here is posted in the OP. http://www.driverheaven.net/reviews/3870-XXX/IQ.php
> 
> This is between a 3870 and a 8800GT.  Look closely at the pictures, especially things like the cracks in the ground as they get further away.



Hmm, I've just noticed in the 2nd Oblivion image, the one on the bridge, that the Nvidia card is doing a better job with anisotropic filtering on the sidewalks. It's evident in the other image too. Clearly the 8800 GT is doing better anisotropic filtering in Oblivion.

In the first Crysis screenshot Ati is doing a better job on the distant mountain, but I like the vegetation a bit more on the Nvidia one. Apart from that, I don't see any differences.

I would call it a tie, with a little advantage for Nvidia for the good AF job in Oblivion. <--- According to what I see in the review; my conclusion is about the review, not the cards themselves. I know I must say this before someone crucifies me!!


----------



## erocker (Mar 15, 2008)

Just remember not to let color get in the way of your judgment.  I just wouldn't trust it in images.


----------



## DarkMatter (Mar 15, 2008)

erocker said:


> Just remember not to let color get in the way of your judgment.  I just wouldn't trust it in images.



I don't really understand that statement.
As for the pictures: I have seen and made many still-image comparisons between different AF settings and qualities. The Ati ones look blurred; I think my judgement is right, but if you have a reason to disagree, please explain it better, as I couldn't understand that last one. 

Anyway, after looking at them for a while, it's as if Ati were doing only 4x aniso, because the blurry part (even if subtle) starts very close to the player. I don't know...


----------



## KainXS (Mar 15, 2008)

Why don't we make a comparison database? The net really needs one: run the games all at the same settings and see which one looks better.

Of course, I'm talking about the TPU mods doing it.

I would say let users do it, but somebody's gonna cheat, and it won't be accurate at all.

I would really love to see, in the future, 9800GTX image quality vs 4870 image quality, because those are the 2 cards I'm looking forward to right now.


----------



## DarkMatter (Mar 15, 2008)

KainXS said:


> I would really love to see, in the future, 9800GTX image quality vs 4870 image quality, because those are the 2 cards I'm looking forward to right now.



Well, the 9800GTX's image quality will be the same as now, since it's G92 too. I don't know about the 4870, because some say RV770 has the same underlying architecture as RV670, and others say it's a completely new one...


----------



## Mussels (Mar 15, 2008)

KainXS said:


> You're right, idk anymore, you're right man.
> 
> Still, I'm gonna have to say that comparing the HD38XX to the 79XX series, ATI has better quality; compared to the 88XX series I don't know, because I don't have an 88XX card.
> 
> Personally, I'm thinking it might be similar in most cases, after seeing the DriverHeaven review.



IMO, the 8800 series matches anything ATI has. I have NOT used an ATI HD3x00 series card, but since they have the software AA limitation, I don't know anyone willing to buy one in order to let me try it.


----------



## xfire (Mar 15, 2008)

The comparison also has to be made over different display connections (HDMI etc.).


----------



## Mussels (Mar 15, 2008)

DarkMatter said:


> I don't really understand that statement.
> As for the pictures: I have seen and made many still-image comparisons between different AF settings and qualities. The Ati ones look blurred; I think my judgement is right, but if you have a reason to disagree, please explain it better, as I couldn't understand that last one.
> 
> Anyway, after looking at them for a while, it's as if Ati were doing only 4x aniso, because the blurry part (even if subtle) starts very close to the player. I don't know...



The problem also is that MANY effects (such as anti-aliasing) don't always show up in screenshots! ATI had that temporal anti-aliasing, which did NOT show in screenshots, so everyone bagged it out. Also, screenshots are compressed, making colors appear duller; you can save the same file twice and end up with slightly different colors.


----------



## KainXS (Mar 15, 2008)

DarkMatter said:


> Well, the 9800GTX's image quality will be the same as now, since it's G92 too. I don't know about the 4870, because some say RV770 has the same underlying architecture as RV670, and others say it's a completely new one...



It's gotta be different; if it's not, then ATI will be screwed. But then again, if it is, that will make Nvidia get off their arses and make another amazing card like the 8800 Ultra, and we all know how much arse that card kicked. I think the future might be bright; competition is always good.

You know what, after thinking about it that way, I think I'll skip the 48XX series, keep what I have, and then get a GeForce 10. I mean, can you imagine a new, faster core, probably GDDR5, about 100 texture units, a 512-bit bus, and 256 shaders (I can dream, right?) on a single, non-dual-GPU card?

Unless the 4870s are blazingly fast, ATI has just lost a customer, lol. I don't have enough dough to keep buying cards every 4 months.


----------



## Mussels (Mar 15, 2008)

xfire said:


> The comparison also has to be made over different display connections (HDMI etc.).



I don't think so... the display connection has NO effect on screenshots; the image is captured INTERNALLY, long before it reaches the monitor.

If you are comparing in person, then yes, it can matter; but hey, I'm running HDMI over a VGA cable because I'm a bastard:

TV -> HDMI-to-DVI adaptor -> DVI-to-VGA adaptor -> VGA cable -> VGA-to-DVI adaptor -> DVI port on the 8800GTX


----------



## xfire (Mar 15, 2008)

Not screenshots posted by some random site, because we can never tell when a website is biased. Also, aren't the images rendered by the graphics card?
We need a mod, or someone who can be trusted, to do the comparison.


----------



## imperialreign (Mar 15, 2008)

I'm going to cast a "post-vote" - meaning, I don't intend to actually vote in the poll, but will vote my opinion here in this post.


Brand loyalty on my part aside;

I feel that ATI has a slight edge in IQ, mostly in certain areas. Based on my experience with both ATI and nVidia cards on my own, my father's and my friends' rigs, I find ATI's output to be a bit more color balanced, with better brightness/contrast balance. HDR graphics, IMO, also tend to look a lot better on ATI... they come across to me as more toned and balanced than nVidia's output.

But, that being said, on current hardware the difference is negligible in the majority of applications... at this point, there are only a few apps that look night-and-day better on ATI over nVidia, and they typically tend to be ones where ATI has spent some time with the developers.

And so, although differences are moot in the grand scheme of things, I still notice a very slight difference - so I can't in good faith give a vote either way.






Now . . . if we were talking a couple of card gens ago (1800/1900 vs 7800/7900), I'd hands down give the vote to ATI.


----------



## Mussels (Mar 15, 2008)

imperialreign said:


> I'm going to cast a "post-vote" - meaning, I don't intend to actually vote in the poll, but will vote my opinion here in this post.
> 
> 
> Brand loyalty on my part aside;
> ...



I agree with what you are saying. To me, however, the fact that I can run 2xAA with 4xAF and get the same speed as an ATI card makes the quality slide over this way. That is how I decide what to buy.


----------



## Widjaja (Mar 15, 2008)

Yeah, come to think of it, I did notice a flatness in the colours in Colin McRae: DiRT with the 8800GT compared to the X1950pro.

I found the flatness easier on the eyes though.


----------



## imperialreign (Mar 15, 2008)

Mussels said:


> I agree with what you are saying. To me, however, the fact that I can run 2xAA with 4xAF and get the same speed as an ATI card makes the quality slide over this way. That is how I decide what to buy.



And I can completely reason with that, too.  TBH, up through the 1900 series, ATI cards didn't take anywhere near the performance hit that nVidia cards did when you turned the eye candy on . . . but ATI's last two generations have been a rough phase for them, considering the merger and all (glad to see ATI getting back on their feet and catching up now, though).

What I do find a bit curious, though - the majority of CGI film companies that have been pumping out one CGI hit after another since Toy Story have reportedly used ATI hardware (probably FireGL hardware, but still).  Honestly, if this is indeed true, it would make for a great marketing campaign on ATI's part (in contrast to nVidia's TWIMTBP campaign): ATI could ask these film studios to include a brief 10-15 second clip with ATI's logo at the beginning of the film . . . if people saw ATI's logo more often, especially on HD content, they'd be more apt to purchase ATI's hardware - especially with the move towards HD TV, film, etc.  People want the best IQ, and if they believe ATI delivers the best IQ, they'll spend their money accordingly - just like how the majority of users want the best in-game performance and purchase nVidia hardware.


----------



## Deusxmachina (Mar 15, 2008)

DarkMatter said:


> Otherwise, since the only 5 posted links say both have the same IQ, your point is easily refutable and fanboi-ish. Post actual proof; then you can say the HD3870 looks definitely better. Until then your point has no weight.



You already posted actual proof for him.  I don't know where "both are of the same IQ" comes from when the very first link (MaximumPC) has ATI winning by 40% (21 votes to 15).  

A bit of bias from the article's summary.  They just couldn't seem to find much good to say about ATI:

"That gives AMD a slight edge at the $250 price point, but it leaves Nvidia unchallenged at every higher segment."

"would you be willing to take a major performance hit in order to render your game experience just a wee bit more shiny and colorful? We didn’t think so."

"...now the company need only worry about catching up on one performance metric: frame rate.  ...That leaves Nvidia in the catbird seat—again." 

From your original post:

"Read the entire article or you will miss the point."  

ATI won the tests 21 to 15.  I thought that was the point.


----------



## AsphyxiA (Mar 15, 2008)

It's always been my opinion that ATI cards just render a better image than nVidia cards do.  On the other hand, nVidia cards seem to produce better fps.  It all really depends on what you like better.  Personally, once the fps reaches a solid 70, I can't tell the difference one way or another.  If I can get more depth, better AA, etc. while sacrificing a few frames, that's the card I'll choose.  Either way it's personal preference; plus, the image quality isn't really that far apart these days.


----------



## Nitro-Max (Mar 15, 2008)

OK, just to show I'm not totally anti-Nvidia {still don't trust them}, here's one in Nvidia's favour: http://www.firingsquad.com/hardware/ati_nvidia_image_quality_showdown_august06/page4.asp
But bear in mind that this is just one game; I've also seen ATI do better in other games. Plus, the tests don't always come down to hardware only. Driver improvements can improve image quality on both sides too, but how can we compare drivers? Not that it matters to most people. Most people just want more fps, and to get that, image quality is normally sacrificed anyway.
Are we comparing DX9 to DX10 also?


----------



## Mussels (Mar 15, 2008)

Nitro-Max said:


> OK, just to show I'm not totally anti-Nvidia {still don't trust them}, here's one in Nvidia's favour: http://www.firingsquad.com/hardware/ati_nvidia_image_quality_showdown_august06/page4.asp
> But bear in mind that this is just one game; I've also seen ATI do better in other games. Plus, the tests don't always come down to hardware only. Driver improvements can improve image quality on both sides too, but how can we compare drivers? Not that it matters to most people. Most people just want more fps, and to get that, image quality is normally sacrificed anyway.
> Are we comparing DX9 to DX10 also?



DX10 is what started this quality thing anew - is DX9 maxed out with 4xAA 'better' than DX10 on medium settings with no AA, at the same FPS? Some people max it out and don't care about the framerate, while others care considerably and opt for DX9.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> OK, just to show I'm not totally anti-Nvidia {still don't trust them}, here's one in Nvidia's favour: http://www.firingsquad.com/hardware/ati_nvidia_image_quality_showdown_august06/page4.asp
> But bear in mind that this is just one game; I've also seen ATI do better in other games. Plus, the tests don't always come down to hardware only. Driver improvements can improve image quality on both sides too, but how can we compare drivers? Not that it matters to most people. Most people just want more fps, and to get that, image quality is normally sacrificed anyway.
> Are we comparing DX9 to DX10 also?



I didn't bother to read the review since it's from August '06.
Come on guys, we are talking about the current gen; the last generation is of no concern.


----------



## Nitro-Max (Mar 15, 2008)

DarkMatter said:


> I didn't bother to read the review since it's from August '06.
> Come on guys, we are talking about current gen, last generaion's whereabouts are of no concern.



Well, in that case I maintain that the drivers are too immature to come to any conclusion, as both cards show improvements in a mix of games.


----------



## DarkMatter (Mar 15, 2008)

Deusxmachina said:


> You already posted actual proofs for him.  I don't know where "both are of the same IQ" comes from when the very first link (MaximumPC) has ATI winning by 40% (21 to 15).
> 
> A bit of bias from the article's summary.  They just couldn't seem to find much good to say about ATI:
> 
> ...



Obviously you didn't read the entire article, and you missed the point. 
There was a control group. They showed them the same image - that is, Ati vs. Ati and Nvidia vs. Nvidia. The results were 9 for monitor A, 6 for monitor B, and 3 saw no difference. If IQ differences were really visible, all of them would have said "no difference" - or declared both shots Ati because the IQ was better, or both Nvidia because it was worse. That is the point, and they explain it well in the article. If Ati were clearly better, since those people are experts, all of them would have picked Ati, not only some of them. They didn't, so there isn't enough proof to declare a winner. Or we can take all the numbers as proof and conclude that monitor A is a lot better than monitor B, even though they are the same model and were calibrated with a professional tool to look exactly the same. READ the entire article...
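A quick back-of-envelope check (my own, not from the article): if you treat the 15 control-group viewers who expressed a preference between two identical images as coin flips, a split at least as lopsided as 9 vs 6 is quite likely by pure chance - which is exactly why that result reads as noise rather than a detected difference.

```python
# Back-of-envelope check (not from the article): 15 viewers saw a pair
# of identical images and still voted 9 vs 6. If each simply guessed,
# how likely is a split at least this lopsided? Quite likely - so the
# control group's result looks like noise, not a real IQ difference.
from math import comb

def p_at_least(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(round(p_at_least(9, 15), 3))  # ~0.30 - roughly a 1-in-3 chance from guessing
```

With odds like that, the 9/6/3 control split carries no evidence either way, which is the article's point about the A-vs-B votes too.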


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> Well in that case i stick to the fact that drivers are too premature to come to any conclusion as both cards show improvements in a mix of games.



Drivers are immature? Nvidia's 8 series launched in November 2006; Ati's HD2900 XT launched in May 2007...


----------



## Nitro-Max (Mar 15, 2008)

But they aren't current gen either. My X2 is, and so is the 9xxx series - those are the newest gen of GPUs.


----------



## zOaib (Mar 15, 2008)

*ATi *hands down 

have owned 

Geforce 8800 GTX , 8800 GTS 640MB , 8800 GTS (G92) , 8800 GT
also borrowed an 8800 Ultra for benching purposes .......

Owned
x800xt pe , x850 xt pe , x1800 xt , x1900 xtx , 2900 xt 1gb , hd 3870 CF 

current hd 3870 x2 ......


----------



## Nitro-Max (Mar 15, 2008)

Nitro-Max said:


> Well in that case i stick to the fact that drivers are too premature to come to any conclusion as both cards show improvements in a mix of games.






DarkMatter said:


> Drivers are premature? Nvidia series 8 launched November 2006, Ati HD2900 XT launched May 2007...





Nitro-Max said:


> But they aren't current gen either. My X2 is, and so is the 9xxx series - those are the newest gen of GPUs.



Do you care to comment on that, DarkMatter?

Because either you are confused or I am.

Maybe you should have made the thread clearer when making comparisons.


----------



## KainXS (Mar 15, 2008)

Nitro-Max said:


> But they aren't current gen either. My X2 is, and so is the 9xxx series - those are the newest gen of GPUs.



Nvidia's 8800 (G92) series is still current gen though - the same GPU as the 9xxx series so far. Even the 8800GS is faster than the 9600GT in most cases, with the same image quality.


----------



## Nitro-Max (Mar 15, 2008)

So we are comparing new to old?


----------



## rampage (Mar 15, 2008)

1) I have only skimmed over the thread.
2) Have I missed something here, or wouldn't the monitor play a larger part in all of this - CRT, LCD, VGA/DVI etc.? (Only thinking of things like "well, I look at my mate's screen and my gfx card looks way better.")


----------



## DarkMatter (Mar 15, 2008)

rampage said:


> 1) i have only skimmed oveter the thread
> 2) have i missed somthing here but wouldnt the monitor play a larger part in all of this crt,lcd,vga/dvi ect??? (only thinking of things like, well i look at my mates screen and my gfx card looks way better) ???



All of the reviews posted are supposed to be run on the same monitor, cable, etc.

My personal experience is with both cards plugged into my monitor. It has two inputs and a switch to select between them at will, so we plugged in two computers at the same time. I assume that anybody posting here has done something similar; we don't have any reason to suspect otherwise.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> So we are comparing new to old?



You know very well that G80 and G92 are the same generation, and R600 and RV670 are the same generation. We are talking about Ati's HD2000 and HD3000 series and Nvidia's 8 and 9 series.


----------



## Nitro-Max (Mar 15, 2008)

DarkMatter said:


> You know very well that G80 and G92 is the same generation and R600 and RV670 is the same generation. We are talking about Ati's HD2000 and HD3000 series and Nvidia's 8 and 9 series.



I've actually been away for the last year and a half and have only been back a couple of weeks, so yeah, I'm still catching up.

G80 and G92 being the same generation is a bit crap then - it's not really new tech! That's how AMD lost their CPU crown.


----------



## DarkMatter (Mar 15, 2008)

Nitro-Max said:


> Ive actually been away for the last year and half been back a couple of weeks so ye im still catching up
> 
> G80 and G92 is the same generation well thats a bit crap then so its not really new tech! Thats how AMD lost there cpu crown.



Well, Nvidia's 6 and 7 series were the same architecture; Ati's 9xxx, X800 and X1800 series were the same architecture (not completely sure about the X1800 - the X1900 definitely wasn't, but then again the difference between the X1900 and X1800 was bigger than between the previously mentioned ones); and the HD2000 and HD3000 are the same architecture. And usually the image quality is common to the whole architecture.


----------



## anticlutch (Mar 15, 2008)

Personally, I think ATi's image quality is better. Unless I'm doing something wrong (which is entirely possible), I'm getting a crapload of jaggies with max AA (6x MSAA, I think) in HL2 with my 8800GT 512MB. My friend's 3870 runs HL2 flawlessly... I would go as far as to say that the image produced by my old X1950pro would put my 8800GT to shame.


----------



## Morgoth (Mar 16, 2008)

Can't wait for the Intel GPU - it will beat both ATI and Nvidia.


----------



## niko084 (Mar 16, 2008)

Well from the looks of this, ATI pretty much has the vote on better IQ...


----------



## yogurt_21 (Mar 16, 2008)

Mussels said:


> TV->HDMI-DVI adaptor-> DVI->VGA adaptor -> VGA cable -> VGA to DVI adaptor -> DVI port on 8800GTX


okay a little off topic, but wtf? why did you do that?


----------



## yogurt_21 (Mar 16, 2008)

DarkMatter said:


> Well Nvidia 6 and 7 series was the same architecture, Ati 9, 10 and X1800 were the same architecture (not completely sure about x1800, X1900 definately wasn't, but then again the difference between x1900 and x1800 was bigger than from the previous mentioned)  and HD2000 and HD3000 are the same architecture. And usually the image quality is common to the whole architecture



Actually, the X1800 and X1900 were more similar than the X800 and X1800. The X1800 was ATI's first generation of what they dubbed their shader processor; on the X800 they still called it a pixel pipeline.


----------



## Mussels (Mar 16, 2008)

yogurt_21 said:


> okay a little off topic, but wtf? why did you do that?



To see if cables made any quality difference (they didn't).

Also, my HDMI cable is 30cm while my VGA is 1.8 meters.


----------



## calvary1980 (Mar 16, 2008)

Screw ATI! <insert canned audience> Just kidding. 

This poll is so stacked; there's no way all these people have owned both brands (at least not recent generations).

PS, Dark, not only did you spell my name wrong, but you spelt my nickname wrong: it's calvary as in Golgotha, not cavalry as in the army.

- Christine


----------



## warhammer (Mar 16, 2008)

IQ is in the eye of the beholder - or, one man's goddess is another man's bush pig.


----------



## imperialreign (Mar 16, 2008)

calvary1980 said:


> screw ATI ! <insert canned audience> just kidding
> 
> this poll is so stacked no way all these people have owned both brands (atleast not recent generations)
> 
> ...



I kinda agree, especially with newer hardware. I've been lucky in that between me and my father there are now a total of 4 rigs running in this house - where I'm an Intel/ATI loyalist, he prefers AMD/nVidia (odd-ass combination, ain't it), so I've seen both.

I think the worst nVidia card I've seen was an FX 5500 that he owned - aside from how quirky the card was, I found the IQ to be horrible; it took me forever to talk him into buying a newer card.


----------



## KainXS (Mar 16, 2008)

calvary1980 said:


> screw ATI ! <insert canned audience> just kidding
> 
> this poll is so stacked no way all these people have owned both brands (atleast not recent generations)
> 
> ...



I'm gonna call you Cindy from now on, Christine.


----------



## cooler (Mar 16, 2008)

AMD R6xx: Image Quality Analysis
http://www.beyond3d.com/content/reviews/47/1

NVIDIA G80: Image Quality Analysis
http://www.beyond3d.com/content/reviews/3/1

Quality on game
http://www.vr-zone.com/articles/AMD_Radeon_HD3850_and_HD3870_-_AMD's_salvation?/5392-14.html
http://www.pcgameshardware.de/?article_id=621293&page=11
http://www.hardocp.com/article.html?art=MTQxMCwzLCxoZW50aHVzaWFzdA==


Quality on HD

http://www.dailytech.com/AMD+Alleges+NVIDIA+Cheats+in+HD+HQV/article8608.htm
http://www.theinquirer.net/en/inquirer/news/2007/09/18/nvidia-cynically-fiddled-video-benchmarks
http://www.tomshardware.com/2004/11/29/the_tft_connection/index.html

nvidia record 
http://www.geek.com/is-nvidia-cheating-on-benchmarks/
http://www.extremetech.com/article2/0,3973,1086025,00.asp
http://www.extremetech.com/article2/0,3973,1201076,00.asp


----------



## DarkMatter (Mar 16, 2008)

calvary1980 said:


> screw ATI ! <insert canned audience> just kidding
> 
> this poll is so stacked no way all these people have owned both brands (atleast not recent generations)
> 
> ...



Sorry.  
It's corrected in the OP now. And did I spell Christine wrong too? 

And I agree with you. There are 87 votes right now; 57 voted for one of the brands having better IQ, but with almost no justification. 

If you think one brand has better IQ, you could at least give a justification using some screenies from the provided links or something... Until people provide proof and justification, the poll is pointless. I shouldn't have attached a poll, but I knew it would encourage more people to enter the thread. I just thought people would be more serious.


----------



## DarkMatter (Mar 16, 2008)

cooler said:


> AMD R6xx: Image Quality Analysis
> http://www.beyond3d.com/content/reviews/47/1
> 
> NVIDIA G80: Image Quality Analysis
> ...



I have added your links to the OP. I have excluded those that refer to older generations.


----------



## Saakki (Mar 16, 2008)

ATI pwns when it comes to watching movies and stuff =)


----------



## Nitro-Max (Mar 17, 2008)

DarkMatter said:


> Sorry.
> It's corrected in the OP now. And did I spell Christine wrong too?
> 
> And I agree with you. There are 87 votes right now. 57 voted for one of the brands having better IQ, but almost no justifications for that.
> ...



Maybe there's no better proof than people's own eyes, m8. I just get the feeling that if the votes were in Nvidia's favour you wouldn't be so concerned. This isn't a dig or anything, but why should anyone trust reviews? People have had experiences with their own hardware that totally discredit what reviews say - I know I have.


----------



## Mussels (Mar 17, 2008)

Nitro-Max said:


> Maybe there's no better proof than people's own eyes, m8. I just get the feeling that if the votes were in Nvidia's favour you wouldn't be so concerned. This isn't a dig or anything, but why should anyone trust reviews? People have had experiences with their own hardware that totally discredit what reviews say - I know I have.



The other alternative is that people are still judging based on older hardware - people with an X800 voting because 'it's better than Nvidia', when this thread is about current-gen hardware... which 95% of people have NOT tried from both ATI and Nvidia.


----------



## jbunch07 (Mar 17, 2008)

erocker said:


> If you need specific information, ask what it is, otherwise with the ads in your sig I'm going to have to hit the spam button.



i smell spam in the air....


----------



## das müffin mann (Mar 17, 2008)

ugh i hate spam, even on toast, but hey we may want some batteries


----------



## jbunch07 (Mar 17, 2008)

Looks like it's too late, Muffin Mann... but oh well, back on topic:

I wonder how the X2's IQ will stack up against the GX2's?


----------



## DarkMatter (Mar 17, 2008)

Nitro-Max said:


> Maybe there's no better proof than people's own eyes, m8. I just get the feeling that if the votes were in Nvidia's favour you wouldn't be so concerned. This isn't a dig or anything, but why should anyone trust reviews? People have had experiences with their own hardware that totally discredit what reviews say - I know I have.



Nitro you are just ridiculous, stop trolling.


----------



## Mussels (Mar 28, 2008)

jbunch07 said:


> looks like its too late Muffin Mann ... but oh well back on topic....
> 
> i wonder how the X2's IQ will stack up against the GX2's?



From what I've seen:

The X2s kick ass in benchmarks, but CrossFire isn't scaling so well in games.

GX2: scaling is working REALLY WELL for once. Even in quad SLI the scaling was almost perfect, except in 3DMark and Crysis.

I think the GX2 is the best buy for performance, but the X2 IS cheaper.


----------



## erocker (Mar 28, 2008)

From my experience, which is going from G80 to RV670, I have to hand IQ to Ati.  I have yet to see G92 in person, so we'll see.


----------



## largon (Mar 30, 2008)

*erocker*,
Would you like to elaborate how RV670 looks better than G80? 

Any RV670 users that want to do a Crysis IQ comparison with my 8800GTS 512MB and 9600GT by exchanging saved games so we get the exact same screen from both cards?


----------



## Widjaja (Mar 30, 2008)

Just some more IQ comparison.

I have found my 8800GT is a lot darker in the shadowing of Need for Speed: Most Wanted compared to the X1950pro.
It gets to the point where I can't see where the car is under bridges.
TBH, NFSMW looks the s**t with ATi cards.

But then again, the game is optimized for ATi, being an Xbox 360 launch title.
Not only that, I also get an R6025 "pure virtual function call" error now and then with this game on nVidia cards.


----------



## Mussels (Mar 30, 2008)

Widjaja said:


> Just somemore IQ comparison.
> 
> I have found my 8800GT is alot darker in the shadowing of Need for Speed Most Wanted compared to the X1950pro.
> Gets to the pooint where I can't see where the car is under bridges.
> ...



Good info, but there's no way to know if the problem lies with the game or the drivers.


----------



## Wile E (Mar 30, 2008)

I own both the 2900XT and 8800GT; neither really has better IQ than the other. Slightly different, yes, but neither is superior.


----------



## Widjaja (Mar 30, 2008)

*Real World Performance Comparisons X1950pro vs 8800GT so far*



Mussels said:


> good info, but theres no way to know if the problem lies with the game or the drivers.



TBH, from what I have been reading it's to do with the game's coding, since it's a Visual C++ 2005 issue.

It's just that I have only experienced this issue with nVidia cards.
Another thing I have found that nVidia has over ATi in this game is vsync.
This game sucks hard with vsync on when using the X1950pro - it's chug city - but there was near zero tearing with vsync on.

I personally think I can say nVidia has more tearing with vsync off, from my experience so far.
I still have to play more GPU-intensive games like BioShock, which I will when I finish S.T.A.L.K.E.R.

*V8 Supercars 3 Australia*
I know nVidia pwns ATi in V8 Supercars 3 in a BIG way, as old as it is.
It stutters on the X1950pro when loading the detailed textures of the cars, which it doesn't on nVidia, and I am 100% certain it is driver related.
Codemasters backing up nVidia?

*S.T.A.L.K.E.R.*
With S.T.A.L.K.E.R. I have found the 8800GT performs a little bit better so far.
It pauses at the same places, but the pauses are shorter and less frequent than on the X1950pro.

*Sims 2*
My GF plays it, and I know you closet Sim fanatics are out there.
nVidia blows away ATi in this game, based solely on drivers.
My GF has a laptop with a 7600 Go and it runs very well.
It puts my X1950pro to shame.
The 8800GT of course doesn't even break a sweat on this game and also plays it well.


----------



## Mussels (Mar 30, 2008)

It's off topic, but I could never believe how much power the Sims games needed to run smoothly...


----------



## Widjaja (Mar 30, 2008)

Mussels said:


> its offtopic, but i could never beleive how much power the sims games needed to run smooth...



Yeah, pretty much pure CPU power.
You understand once you get into 3D editing.
There's a massive number of polys on the screen at once in the game.
But then again, the game runs badly on consoles too.
The only reason it's so popular is that women get absolutely addicted to it.
And women do not pirate.


----------



## wolf (Mar 31, 2008)

Even if there are tiny differences, I personally cannot distinguish between most comparison screenshots, and I have used cards from both brands and not noticed any change whatsoever.

So basically, I feel the IQ argument is irrelevant 'cos it's too close to call; FPS is what matters, and the numbers don't lie.


----------



## Wile E (Mar 31, 2008)

wolf said:


> so basically i feel the IQ argument is irrelevant cos its too close to call, FPS is what matters and the numbers don't lie.


I sorta disagree with this statement. FPS isn't the only thing that matters; IQ most certainly does too. In any generation of nVidia cards prior to the 8800, the IQ difference was clear, but seeing as NV got their act together, that point is moot - I'll give you that. What isn't moot is what feature set each card has, their price/performance ratio, power usage, etc. So the only part I disagree with is that FPS is most important.


----------



## wolf (Mar 31, 2008)

I guess what I'm saying is that FPS is what matters the most to me. The features are nice, but I rarely use them. There are a lot of cards from both companies that offer great price/performance ratios and features, so I wouldn't really base any of my decision on IQ, as like I said, I think it's too close to call, at least in this generation.

So basically my choice in gfx cards comes down to how much FPS I can get for my $$$, usually going with what's a popular card at the time - for example, the X1950Pro and 8800GT.


----------



## Wile E (Mar 31, 2008)

That I can't argue with.


----------



## Mussels (Mar 31, 2008)

Wile E said:


> That I can't argue with.



Sure you could. You could tell him that popular cards make him a sheep, so he should use S3 and Matrox and be a real man!

Actually, I can't argue either. Those two cards you mentioned really have no flaws.


----------



## wolf (Mar 31, 2008)

Thanks guys  You know, I really liked how with the X1950pro they remade the whole core on a smaller process, too, instead of lazily cutting the unused shaders. It's also cool factors like that which may sway my decision...


----------



## DarkMatter (Mar 31, 2008)

I thought this thread was dead. It had to be at least on the 3rd page when it was revived, whoever revived it. Some people really like to dig in the forums. 

I'm glad you are reviving this and keeping it civil, guys - great stuff. Thank you.

BTW, the X1950pro was a kick-ass GPU. And as you say, wolf, those are cool factors. Anyway, I lean more towards price/performance/power effectiveness, regardless of the chip being new or a cut-down version of another card. For example, the X1900GT was a kick-ass card too, almost as fast as the X1950pro - I'm talking about the first one, launched in May '06. The second revision was slower, around X1950GT performance. Indeed, we could say the X1900GT rev. 1 was in fact an X1900pro. God! Those were good cards.


----------



## newconroer (Mar 31, 2008)

I tell ya what, if someone can show me that ATI cards handle the ridiculously botched coding of games like EQ2 and Oblivion well, I'll buy one just for that.


----------



## das müffin mann (Mar 31, 2008)

ok how about my 2900?


----------



## Widjaja (Mar 31, 2008)

newconroer said:


> I tell ya what, if someone can show me that ATI cards handle the ridiculously botched coding of games like EQ2 and Oblivion, I'll buy one just for that



+1 Oblivion.

The game runs with stutters on my X1950pro.
I haven't tried it with my 8800GT.

For me, FPS is not the big deal.
Stutter-free games are.
Unfortunately, that usually comes down to the coding.

Oblivion stutters on the PS3 too.


----------



## Wile E (Apr 1, 2008)

My 2900XT runs it flawlessly.


----------



## Kursah (Apr 1, 2008)

My 1950pro 256MB did a pretty damn good job at 1280x1024 / 1440x900 - sure, not everything maxed, but a lot on High. And my X1950XTX was able to do a little better with higher settings... there may have been the rare hiccup here and there, but last I remember, the gameplay and smoothness were quite a bit better than when I tried it on my 9600Pro or X850XT PE!


----------



## ShogoXT (Apr 3, 2008)

Wile E said:


> My 2900XT runs it flawlessly.



Yeah, mine runs Oblivion great also, at full 1920x1200.

I will admit, though, the performance hit from turning AA on can be annoying at times.

I haven't had an 8800, so I can't comment on image quality.


----------



## Tatty_One (Apr 3, 2008)

I think the image quality thing is very subjective, and quite often it is game dependent: the game's code/architecture can directly benefit one manufacturer or the other depending on the card's architecture. So you look at Oblivion and ATi may have slightly better IQ, you look at BioShock and nVidia might have the better IQ, etc.


----------



## Mussels (Apr 3, 2008)

tatty makes a good point.

I agree with him, to be honest. The older (broader) problems have us asking 'which looks best', when these days it really does come down to the title - some games are made hand in hand with Nvidia or with ATI, so those games look better on one over the other.


----------



## Thinker_145 (Apr 3, 2008)

The real image quality difference is going to be made by your monitor anyways.
Ironic that people using cheap TN panel LCD's will talk about image quality.


----------



## Mussels (Apr 3, 2008)

Thinker_145 said:


> The real image quality difference is going to be made by your monitor anyways.
> Ironic that people using cheap TN panel LCD's will talk about image quality.



It's because they struggle so damned hard to get the quality, with such a handicap 

My TV at 1366x768, with its ginormous pixels, actually looks better than my high-res 22" screen, due to the fact it's a full 8-bit panel. People really are missing out buying the cheap LCDs on the market these days.


----------



## cdawall (Apr 4, 2008)

Is this one an 8-bit panel?

http://www.hyundaiq.com/pro_l70s.asp#


----------



## erocker (Apr 4, 2008)

largon said:


> *erocker*,
> Would you like to elaborate how RV670 looks better than G80?
> 
> Any RV670 users that want to do a Crysis IQ comparison with my 8800GTS 512MB and 9600GT by exchanging saved games so we get the exact same screen from both cards?



Mainly the colors.  With ATi, colors are way more vibrant, which in turn makes the textures look better as well.  It's very noticeable with games on Steam for some reason, but noticeable in everything I've played nonetheless.  I should have an 8800GT coming my way soon to compare further.  I do prefer Nvidia for anything 2D-related, though, as their monitor "calibration" (or whatever you want to call it) worked really well with my substandard 22" widescreen.


----------



## wolf (Apr 4, 2008)

I would never go as far as to say "way more vibrant". 

In comparison screenies I have noticed slight color differences, and ATi's sometimes appear to have more depth, but IMO it's never enough to notice during gameplay.


----------



## largon (Apr 4, 2008)

I'm not sure how comparatively more vibrant colors in 3D are a sign of superior image quality - games aim to mimic reality, and as we all know, nature rarely has bright, vibrant colors. And for the record, how would we know that the colors produced by Radeons are closer to those intended by the developer? 

Image quality is _not_ a subjective matter.


----------



## Wile E (Apr 4, 2008)

largon said:


> Image quality is _not_ a subjective matter.


Image quality is completely subjective. Image *accuracy* is not.



Thinker_145 said:


> The real image quality difference is going to be made by your monitor anyways.
> Ironic that people using cheap TN panel LCD's will talk about image quality.



Good to see there's somebody else that doesn't fall into the response time trap. I just can't wait to get my Westinghouse monitor replaced. I hate this 19" Samsung.


----------



## Thinker_145 (Apr 4, 2008)

Mussels said:


> its because they struggle so damned hard to get the quality, with such a handicap
> 
> My TV at 1366x768 with its ginourmous pixels actually looks better than my high res 22" screen, due to the fact its a full 8 bit panel. People really are missing out buying the cheap LCDs on the market these days.


Totally agreed.

Higher resolutions are overrated anyways IMHO.


----------



## DarkMatter (Apr 4, 2008)

largon said:


> I'm not sure how comparably more vibrant colors in 3D are a sign of superior image quality - games aim to mimic reality and as we all know, in reality nature has no bright and vibrant colors. And for the record, how would we know that the colors produced by Radeons are closer to those intended by developer?
> 
> Image quality is _not_ a subjective matter.



Agreed. That's one of the arguments I've been debating for ages in IQ comparisons between different brands, also including Intel IGPs, Matrox, S3 and others. 

The same happens with LCD vs. CRT. Some people say LCD looks way better than CRT for the same reason.

It has nothing to do with quality, anyway. There was an experiment that tried to test this. It wasn't about any brand, just an experiment into how people perceive reality and how colors can affect them:

They showed two screens to many people with some image comparisons and asked which one looked better. It was the same picture on both screens all the time, but one of the pictures was altered with more vibrant, though false, colors. They swapped the false picture from one screen to the other between subjects to eliminate the screen as a factor. They showed many different pictures from different parts of the world: tropics, deserts, forests, cities... The great majority of the people chose the false picture. When asked why, they had a hard time explaining: "it just looked better" to them. Only people who worked in image-related jobs (photographers, graphic designers, painters) were able to state the reason, and most of them chose the image that was not altered. Another interesting thing is that on pictures of cities, more than half the people chose the right picture, claiming that the other one had false coloration, "cities don't look like that". This is really interesting because the article itself* showed a comparison where most people chose the altered picture: it was a tundra-like forest that, in the altered picture, looked almost like a tropical jungle to me!!

The conclusion of the experiment was that most people can be "fooled" by colors, and that the closer the subject is to their own experience, the easier it is for them to recognize the true colors. For example, since most people lived in cities they chose the right city pictures, nature photographers chose the right pictures when the theme was nature, etc.

*I read an article about the test and its conclusions; I haven't seen the test files myself.


----------



## largon (Apr 4, 2008)

Wile E said:


> Image quality is completely subjective. Image *accuracy* is not.


Well now, that's an absurd idea. 

If we were to capture a screenshot rendered in software (that would be the perfect image) and compare it to one rendered by "GPU A" and another by "GPU B", there would likely be differences. Which one has better IQ? The only answer that makes any _sense_ would be: _"the one that is closer to the image rendered in software"_, NOT the one that a given viewer thinks is more pleasant. If IQ were somehow a subjective matter, then the very name of the term "image quality" would not be valid.
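That software-reference test can actually be automated. Below is a minimal sketch (not from the thread; the toy pixel data and function names are made up for illustration) that scores each card's frame by PSNR against a software-rendered reference, where the higher score means the more accurate image:

```python
# Score two GPU frames against a software-rendered reference image.
# Images are modeled here as flat lists of 0-255 channel values.
import math

def mse(reference, candidate):
    """Mean squared error between two equal-length pixel buffers."""
    assert len(reference) == len(candidate)
    return sum((r - c) ** 2 for r, c in zip(reference, candidate)) / len(reference)

def psnr(reference, candidate, max_value=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    error = mse(reference, candidate)
    if error == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_value ** 2 / error)

# Toy 4-pixel grayscale "frames": GPU A drifts slightly, GPU B drifts more.
reference = [100, 150, 200, 250]
gpu_a = [101, 149, 202, 248]   # small deviations from the reference
gpu_b = [110, 140, 215, 235]   # larger deviations

print(psnr(reference, gpu_a) > psnr(reference, gpu_b))  # True: GPU A is more accurate
```

On real screenshots you would load the three captures and flatten them to channel buffers first, and you would need to control for tone-mapping and gamma differences before the comparison means anything.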


----------



## DarkMatter (Apr 4, 2008)

Wile E said:


> Image quality is completely subjective. Image *accuracy* is not.



Image quality is not subjective. Image *preference* is. Image accuracy is part of image quality, a big part of it. Anisotropic filtering is all about image accuracy, as is anti-aliasing. Color accuracy is also important to IQ, though I'll admit that in games overall it may not matter much to some people. In cartoonish games like TF2, Prey, Bioshock, etc. its importance is generally low, though the closer to what the developers intended, the better. And we should take that into account, because they spent many hours trying to find the best color balance and, consequently, the best "feeling". We image designers DO spend many hours on that; I think game developers do too.

In games like Crysis that is out of the question. Image accuracy is all that matters...


----------



## Mussels (Apr 4, 2008)

Hehe, the English language is failing us all here 

More or less: those who aren't trained tend to choose the brightest, shiniest colors.

When I was younger I know I looked at things in games and thought "that looks crap, stupid games", and then later on saw the same effects in real life (for example, chain-link fences from a distance have some sort of shimmer effect, and there's a 'sparkling' effect on the tips of small waves on a river - I lived near the ocean, so I was basing it on how the ocean worked).

Oh, and as for color vibrancy and those who love it - Nvidia actually has a driver option for that. You can go quite a bit further than ATI's method - the way I see it, people with poor screens can get lost colors back (I used it on my old, worn-out 19" CRT).


----------



## Wile E (Apr 5, 2008)

largon said:


> Well now, that's an absurd idea.
> 
> If we were to capture a screenshot rendered in software (that would be the perfect image) and compare it to one rendered by "GPU A" and another by "GPU B", there would likely be differences. Which one has better IQ? The only answer that makes any _sense_ would be: _"the one that is closer to the image rendered in software"_, NOT the one that a given viewer thinks is more pleasant. If IQ were somehow a subjective matter, then the very name of the term "image quality" would not be valid.


To most people, the one with better IQ has nothing to do with what the developer wanted. It has everything to do with their personal preferences. One card could have more vibrant colors than the developer intended, but that may look better to the person in front of the screen, so to them, it has better IQ.

Accuracy, however, cannot be called into question. It can be measured. Image quality cannot; it's purely a matter of opinion on the user's part. If image quality were measured as a function of accuracy, half the LCD makers out there would be out of business.


----------



## erocker (Apr 5, 2008)

Wile E said:


> To most people, the one with better IQ has nothing to do with what the developer wanted. It has everything to do with their personal preferences. One card could have more vibrant colors than the developer intended, but that may look better to the person in front of the screen, so to them, it has better IQ.
> 
> Accuracy, however, cannot be called into question. It can be measured. Image quality cannot; it's purely a matter of opinion on the user's part. If image quality were measured as a function of accuracy, half the LCD makers out there would be out of business.



I completely agree, as I'm comparing with the hardware I use and the image displayed in front of me. As an artist with a background mostly in painting, my eyes are trained to decipher and process color, which has a big impact on what I consider "image quality". Color doesn't make texture, but it helps define it.


----------



## CrackerJack (Apr 5, 2008)

To me it just matters what you're playing it back on - unless you have one ATI and one Nvidia card on identical monitors at the same resolution. But even that would only show which has the better picture. Most of us go for performance and the details within a game or benchmark anyway. That's my two cents!


----------



## gerrynicol (Apr 5, 2008)

My last card was an eVGA 8800GTS 320; my current card is a 3870. I can honestly say I can't see the difference between them. I have only really played Stalker with both cards, and to my eyes there is no difference in how the game looks from one card to the other. Just my opinion though.


----------



## largon (Apr 5, 2008)

*Wile E*, *erocker*,
So it just boils down to semantics: 
What I consider "image quality" you think of as "image accuracy". Maybe that's because my definition is based on the term as used in photography, where, just like DarkMatter said, image accuracy is a part of image quality. IQ simply means the extent to which the device is able to replicate the scene as the human eye perceives it. If a camera distorts color hue in any way - no matter whether the result is more pleasant for the viewer - it has worse image quality than one that does not.


----------



## Wile E (Apr 5, 2008)

largon said:


> *Wile E*, *erocker*,
> So it just boils down to semantics:
> What I consider "image quality" you think of as "image accuracy". Maybe that's because my definition is based on the term as used in photography, where, just like DarkMatter said, image accuracy is a part of image quality. IQ simply means the extent to which the device is able to replicate the scene as the human eye perceives it. If a camera distorts color hue in any way - no matter whether the result is more pleasant for the viewer - it has worse image quality than one that does not.


Yeah, I think you are correct. It does seem to be nothing more than semantics at play here.


----------



## Mussels (Apr 5, 2008)

One thing that hasn't been mentioned: eyesight. Everyone is different. Some people have blurrier vision than others, some have greater peripheral vision, and some see certain colors brighter or clearer than others.

So while those of us with great eyesight (like myself on a shiny HDTV) see the colors as really gaudy and nasty when they're cranked up via digital vibrance or something, others with poor color perception see it as suddenly looking awesome.

It's not an argument that can ever be won; we just get to keep choosing what's best for ourselves.


----------



## Lillebror (Apr 5, 2008)

I have a friend who has an 8800GTS and an X800 card. When he plays Call of Duty 2, he really loves the X800, because the colour looks a lot better and it just feels a lot nicer to look at than on the 8800GTS. It's just weird, and it only looks better in that single game.


----------



## CrackerJack (Apr 5, 2008)

Lillebror said:


> I have a friend who has an 8800GTS and an X800 card. When he plays Call of Duty 2, he really loves the X800, because the colour looks a lot better and it just feels a lot nicer to look at than on the 8800GTS. It's just weird, and it only looks better in that single game.



That's the way it is here; I believe a lot of the older cards were like that. I play CoD2 with a 9200, 9250, X1300, X1950, 2900GT, 8600GTS, and 6200. It was more colorful with the 9250, just not as realistic. With the X1300 I could get snow, rain and shadow effects, but then the color became darker.


----------



## wolf (Apr 6, 2008)

Wile E said:


> Yeah, I think you are correct. It does seem to be nothing more than semantics at play here.



Agreed, so can we lock down this thread already? It's really just giving fanboys a place to lord something that can't be proven.


----------



## Mussels (Apr 6, 2008)

I don't think this thread was too bad. We actually got a fair few honest opinions with valid points - people stated their views, and we've more or less concluded there is no final judgement.

Older cards look better in older games in some cases, and games designed for certain series of cards tend to look and run better on those cards. Some of it is kinda obvious, but it has been educational.


----------



## wolf (Apr 6, 2008)

I agree to an extent, but to an extent I also think it's VERY inconclusive. The ATi boys have been lording IQ over Nvidians for a while, and there's just so little proof, and it's all so subjective (monitor, eyes, etc.).


----------



## desertjedi (Apr 11, 2008)

> To most people, the one with better IQ has nothing to do with what the developer wanted. It has everything to do with their personal preferences. One card could have more vibrant colors than the developer intended, but that may look better to the person in front of the screen, so to them, it has better IQ.
> 
> Accuracy, however, cannot be called into question. It can be measured. Image quality cannot; it's purely a matter of opinion on the user's part. If image quality were measured as a function of accuracy, half the LCD makers out there would be out of business.


Well put - I agree completely. Maybe there are metrics and tests for image accuracy but the prime test for image quality is..."hey dude, what do ya think?".

Maximum PC did a "blind" image quality test with only "graphics pros" on the panel. ATI did eke out a slight victory... for what it's worth. The test included various media types, I guess you'd say. 

TBH, at this point in time, IQ between the two GPU vendors does not affect my buying decision. At heart, I'm an ATI guy...who's currently using an 8800GTS 512 simply because it's a friggin awesome card. 

The only downer I have with my Nvidia card is that the font I end up seeing in several places is not as readable as the font I see with my ATI card. I have a business rig (ATI card) and a gaming rig (Nvidia card) KVM'ed to a single 1600x1200 LCD. The text that appears in the browser address bar is far more readable with the "el cheapo" ATI card. The Nvidia "font" is either too dark (pure black text on white) or too thick, and ends up looking less clean and less readable. I've tried ClearType but it doesn't help. I'm thinking that ATI simply uses a lighter shade of gray for text, and somehow its text has less aliasing, but I'm not sure. I suppose I could take some screenies and analyze them to see if I can figure out what the deal is.
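That screenshot analysis could be done with a few lines of code. A minimal sketch, using made-up grayscale rows rather than real captures: classify pixels darker than the background as text, then compare the average ink darkness between the two cards' renderings. The function name, threshold, and sample data are all hypothetical.

```python
# Compare how dark the "text" pixels are in two renderings of the same glyph.
# Rows are grayscale values (0 = black ink, 255 = white background).

def text_pixel_stats(row, background=255, threshold=200):
    """Mean darkness of the pixels classified as text (values below the threshold)."""
    text = [v for v in row if v < threshold]
    if not text:
        return 0.0  # no text pixels found
    return sum(background - v for v in text) / len(text)

# Same glyph, two drivers: one renders near-black strokes, the other a lighter gray.
card_a_row = [255, 40, 35, 45, 255]    # heavy, near-black antialiasing
card_b_row = [255, 90, 80, 95, 255]    # lighter gray strokes

print(text_pixel_stats(card_a_row) > text_pixel_stats(card_b_row))  # True: card A renders darker text
```

On real screenies you would crop the address-bar region from both captures, convert to grayscale, and run every row through the same check; ClearType's colored sub-pixel fringes would need to be averaged out first.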

I guess if I'm going to post here occasionally, I need to figure out what this "thanking" thing is. It says I've been thanked twice...? I guess that's better than being spanked twice...I guess. </OT>


----------



## Nitro-Max (Apr 11, 2008)

I did kinda state the "own eyes" thing on page 7, but I got flamed for it by the thread poster. And yes, I've owned both brands, new and old: an 8800GTX was my last Nvidia card, then I went to 2x 2900XTs and now a 3870X2, and to me ATI wins on IQ. It just suits me more; it's visually pleasing.


----------



## Deusxmachina (Apr 11, 2008)

wolf said:


> Agreed, so can we lock down this thread already? It's really just giving fanboys a place to lord something that can't be proven.



The person with an Nvidia logo for his avatar says this thread is full of fanboys.  It's hard not to wonder if there would still be a call to close this thread if ATI didn't have nearly five times the votes Nvidia does.

There's no reason to close this thread.  The subjective part of image quality aside, other interesting things can come out such as the discussion of an image-quality discussion itself when done by people who use cheap-panel LCD monitors.


----------



## cdawall (Apr 11, 2008)

I will do another comparison myself. I have a pair of 3850s now, and they will be using the same monitor as my 7800GS, which IMO looks better than some of the newer NV cards...


----------



## erocker (Apr 11, 2008)

I'm not closing this thread!  If people can't behave, you can bet they will be dealt with though.  Cdawall is going in the right direction, test it out for yourself!


----------



## cdawall (Apr 11, 2008)

woot i did something right


----------



## EastCoasthandle (Apr 18, 2008)

Source

This is proof that clearly shows ATI offers better IQ than Nvidia when playing movies. It's something I said and will continue to say, having had the opportunity to view both cards in action, as I do like to watch movies on my monitor. When you click the link you will clearly see one example where Nvidia uses too much red and the hair on the woman is too dark. I've also seen the very opposite, where the image looked washed out, but that depends on which card is used and which codec is being played. This, along with all those who voted, should settle this once and for all as far as movie playback is concerned.  

Make sure you read the next few pages; again, it will clearly show that ATI offers a more natural look. Even when you attempt to tweak Nvidia's offerings it can have a negative impact on IQ, but read it for yourself.


----------



## calvary1980 (Apr 18, 2008)

If you read the rest of the review, it says Nvidia is the clear winner.



> It would be fair to say that the image quality we obtained from ATI and Nvidia hardware has given us a lot to think about. On one hand we have ATI with a superior level of detail; on the other we have Nvidia with much more intense and realistic colour, not only on skin but many areas of scenery too.





> So this leaves us with Nvidia default quality against ATI default quality, and the performance or features that both manufacturers offer. In terms of performance and features we have to say that Nvidia is the clear winner as they currently allow playback with Aero enabled and good CPU usage statistics regardless of whether the image includes one stream or two. With ATI, losing Aero with a high definition disc running is a real disappointment, especially considering the length of time they have had to rectify this. Additionally the inability to decode both video streams can impact the desktop experience in a negative manner, especially on systems with lower specification processors.



I can also provide another article on the same subject with nVidia on top http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=559&Itemid=29



> What is perhaps surprising from our results, given the fact that for many years ATI were considered the king of video quality, is that NVIDIA appears to have finally surpassed them in this regard.



- Christine


----------



## EastCoasthandle (Apr 18, 2008)

Actually, based on the results it clearly lost. It won based on Aero, not IQ (which you did quote, thanks). But for those who don't know a thing about tweaking their monitor or making properly calibrated adjustments, ATI is the clear winner right out of the box 

Image 1: Casino Royale 1hr 15m 09s = ATI

Image 2: Casino Royale 1hr 58m 12s = ATI

Image 3: Casino Royale 19m 27s = Nvidia

Image 4: Planet Earth, Fresh Water 4m 40s = Nvidia

Image 5: Planet Earth, Fresh Water 14m 20s = ATI

Image 6: Planet Earth, Fresh Water 25m 16s = ATI


After the IQ review, the author then goes and tweaks Nvidia's standard settings in an attempt to make it look as good as ATI's, which wasn't touched. This alone clearly tells you that ATI is the winner here. When they did attempt to tweak PureVideo, there were side effects that impacted IQ negatively (read the article). That is why I disagree with his conclusion: he failed to properly weigh that when he reached it.


----------



## wolf (Apr 18, 2008)

Deusxmachina said:


> The person with an Nvidia logo for his avatar says this thread is full of fanboys.  It's hard not to wonder if there would still be a call to close this thread if ATI didn't have nearly five times the votes Nvidia does.
> 
> There's no reason to close this thread.  The subjective part of image quality aside, other interesting things can come out such as the discussion of an image-quality discussion itself when done by people who use cheap-panel LCD monitors.



Each to their own. You might say ATi is better, I may say Nvidia, but there's no hard and fast way to say who is clearly better _overall_ (games, movies, desktop, etc.).

And I may be an Nvidia FAN, but I would not say fanboy. I have owned many ATi cards and I ALWAYS give them a fair go - no senseless flaming from me. Also, if I WERE a fanboy, don't you think I would have voted for Nvidia instead of calling them the same?

Heck, the better the competition, the more it drives the industry, so there's no point in anyone being a fanboy. Just buy whatever brand/model you prefer, and let other people choose whatever they want. I'm not going to flame anyone for buying ATi, but if they ask for my opinion before a purchase, at present I will definitely recommend Nvidia, and none of that recommendation will be based on IQ - they're far too similar.

And like I've said before, having many times used cards from both companies in the same generation, there was no discernible IQ difference whatsoever.


----------



## Wile E (Apr 18, 2008)

Oh god. So what? That review doesn't prove anything except that nVidia felt that going with a warmer image looked better. It was a choice. Some people like that better; hence the warm and cool color presets you see on countless televisions.


Again, nothing has been proven here, and it is still 100% personal preference.


----------



## erocker (Apr 18, 2008)

I say I have the eyes of an eagle and it's too close to call... yet.


----------



## Mussels (Apr 18, 2008)

To be honest, the differences are so minor it doesn't matter. Choose the one that's best overall for you at the time - I sure wouldn't pay another $100 for the same speed and tiny movie-quality improvements.

You just can't really make a blanket statement, as this kind of thing changes between hardware generations AND software/driver updates.


----------



## Edito (Apr 18, 2008)

I think the ATI cards have a solid color and Nvidia's is a little bit pallid, but this is nothing, because we can adjust the Nvidia settings to get a solid color like ATI's. I have an Nvidia card and I configured it very well to get solid colors, which is why I think we just don't need to discuss this anymore.

Don't get me wrong, please...


----------



## DarkMatter (Apr 18, 2008)

EastCoasthandle said:


> Actually, based on the results it clearly lost. It won based on Aero, not IQ (which you did quote, thanks). But for those who don't know a thing about tweaking their monitor or making properly calibrated adjustments, ATI is the clear winner right out of the box
> 
> Image 1: Casino Royale 1hr 15m 09s = ATI
> 
> ...



Before I started to read the article I compared all the images first, so I wasn't influenced by the author's opinions. Well, these were my findings:

- Neither of the two was the best, and the best IQ would be somewhere in between the two cards. Thus both needed a lot of tweaking. This is common; IMO you can't even start talking about IQ if you don't calibrate each card, monitor, etc. 

- This review showed almost the opposite situation from what I saw when I did my own comparison some months ago. Back then ATI's colors were warmer and Nvidia's were duller, but maybe a little bit more natural. Neither was the best; again, a mix of the two was probably the best solution. Interestingly, after installing the latest drivers this month, I noticed a much warmer and more saturated image in some Galactica episodes, and I had to recalibrate the color settings even though they were exactly where I had left them. I just thought the color in Galactica was a lot warmer than in the other videos I had watched before. After testing other videos today, it seems that Nvidia has changed the default color settings.

After reading what they say, one thing is clear: neither of the two has the best IQ. Nvidia is enhancing blacks and ATI enhanced cyan in an attempt to create a more natural feeling. Both have their benefits, but both fail in other areas.

If there is one certainty in that article, it is that, though different, neither of the two is better than the other. They show different pictures, but neither shows the best picture, which would be middle ground between them.

Another interesting thing I noticed has nothing to do with the article itself, but with the way people use articles to try to make their point, and just how biased they are. Yeah, I'm pointing at you this time, EastCoasthandle! In the conclusion of the article we can read:



> It would be fair to say that the image quality we obtained from ATI and Nvidia hardware has given us a lot to think about. On one hand we have ATI with a superior level of detail; on the other we have Nvidia with much more intense and realistic colour, not only on skin but many areas of scenery too.





> Image quality is much closer between the two due to both having good and bad points. Clearly with some time and patience it would be possible to work out the best settings for each card, but we should be much closer to perfection at default settings than we are. In reality, the best image is probably somewhere in between the two produced. If we had to choose one as our preference, overall it would probably be Nvidia as much of the detail issues could be fixed by an increase in brightness rather than significant work to get skin tone correct on ATI.



The only thing that is clear is that THEY are giving the win to Nvidia, if they had to give it to anyone, because you only need minor tweaks to reach "perfection" on Nvidia, while you need harder tweaking on the ATI card.

But we can easily conclude from the article that both have the same IQ. Neither is perfect, but both differ from the perfect image to more or less the same extent. Of course those discrepancies can affect people's personal preferences, but they leave no room for claims such as "ATI clearly has the best IQ" or "I recommend you buy ATI because it has better IQ". Image quality is not an objective purchase-decision argument anymore; every extensive review demonstrates that, period.

*I will add the link to the OP.


----------



## Mussels (Apr 18, 2008)

summary:

both have ups and downs, 'perfect' is somewhere between.

As far as i care that is where we are at - but this is ongoing. new drivers and new hardware may call us back here.


----------



## DarkMatter (Apr 18, 2008)

calvary1980 said:


> if you read the rest of the review it says nVidia is the clear winner.
> 
> 
> 
> ...



Hmm! No, they don't give Nvidia a clear win anywhere, but it's interesting that they do give them the edge. The win is small though, just as older reviews gave the win to ATI because of slight differences.

Your link is added too.


----------



## DarkMatter (Apr 18, 2008)

It's interesting that these newer reviews are giving the edge to Nvidia now that they seem to have the warmer image. If you have read post #191, you all know what I am talking about.


----------



## calvary1980 (Apr 18, 2008)

nVidia won fair and square in both articles.

- Christine


----------



## Wile E (Apr 18, 2008)

calvary1980 said:


> nVidia won fair and square in both articles.
> 
> - Christine


No they didn't. They got the win only because the reviewers preferred doing the tweaks Nvidia needed. Re-read the article; it's clear that both needed to be tweaked to show the true images.


----------



## DarkMatter (Apr 18, 2008)

Wile E said:


> No they didn't. They got the win only because the reviewers preferred doing the tweaks Nvidia needed. Re-read the article; it's clear that both needed to be tweaked to show the true images.





No! There was no tweaking on Nvidia at all! They just enabled Dynamic Contrast and Colour Enhancement, which is simply a feature that can be On or Off - a feature that, based on what they say in the article, ATI may use by default.


----------



## calvary1980 (Apr 18, 2008)

In the Elitebastards article, Nvidia won Blu-ray Playback Performance, Blu-ray Picture-in-Picture Performance, and HD Noise Reduction Quality.

The only problem they had was that Noise Reduction was disabled by default. You're making IFs and BUTs over some checkboxes.

- Christine


----------



## Wile E (Apr 18, 2008)

calvary1980 said:


> In the Elitebastards article, Nvidia won Blu-ray Playback Performance, Blu-ray Picture-in-Picture Performance, and HD Noise Reduction Quality.
> 
> The only problem they had was that Noise Reduction was disabled by default. You're making IFs and BUTs over some checkboxes.
> 
> - Christine


Winning in decoding performance (which is how I read it) has nothing to do with IQ. While you may be correct about noise reduction, they didn't even bother to tweak the ATI settings. They even go on to say that the differences weren't noticeable with real sources, and that both are a good buy for video playback. That doesn't sound very decisive to me. Which brings us back to square one: neither maker has superior IQ.


----------



## cdawall (Apr 18, 2008)

You know, to be honest, I have looked over a lot of cards and I'm starting to like the onboard VIA S3, haha.


----------



## EastCoasthandle (Apr 18, 2008)

> ...Another interesting thing I noticed has nothing to do with the article itself, but with the way people use articles to try to make their point, and just how biased they are. Yeah, I'm pointing at you this time, EastCoasthandle! In the conclusion of the article we can read:...



I have provided the necessary information that didn't fit their conclusion: that ATI won on IQ. Sure, the author gave Nvidia the win because they offered Aero playback. What I pointed out was the fact that the IQ on ATI appeared more accurate than with Nvidia's offering. If you consider both to be accurate, you either lack the knowledge to discern how a movie should look, or you are just compromising to keep Nvidia's IQ even. In either case this clearly shows how biased you are.


----------



## DarkMatter (Apr 18, 2008)

EastCoasthandle said:


> I have provided the necessary information that didn't fit their conclusion: that ATI won on IQ. Sure, the author gave Nvidia the win because they offered Aero playback. What I pointed out was the fact that the IQ on ATI appeared more accurate than with Nvidia's offering. If you consider both to be accurate, you either lack the knowledge to discern how a movie should look, or you are just compromising to keep Nvidia's IQ even. In either case this clearly shows how biased you are.



It's really amusing how everything you say to me about biased behavior is always more applicable to you than to me. Always. I'm not going to argue with you anymore. 
Anyone with a brain is capable of reading the reviews and forming their own opinion; we don't need you trying to distort reality in your favor. Thank you very much.

And the author gave the win based on IQ, not any Aero thing, as you can see in the quotes I posted above. No wait, maybe I'm asking too much of you. Here they are:



> It would be fair to say that the image quality we obtained from ATI and Nvidia hardware has given us a lot to think about. On one hand we have ATI with a superior level of detail; on the other we have Nvidia with much more intense and realistic colour, not only on skin but many areas of scenery too.





> Image quality is much closer between the two due to both having good and bad points. Clearly with some time and patience it would be possible to work out the best settings for each card, but we should be much closer to perfection at default settings than we are. In reality, the best image is probably somewhere in between the two produced. If we had to choose one as our preference, overall it would probably be Nvidia as much of the detail issues could be fixed by an increase in brightness rather than significant work to get skin tone correct on ATI.



And the direct link to the conclusion page:

http://www.driverheaven.net/reviews.php?reviewid=552&pageid=18

That is the conclusion of the experts. Anything else is your opinion. What you think when you look at the screenies is your opinion. And when you try to present your own opinion as fact, even when the review and the author say the contrary, that is bias, my friend.

Sorry to everybody for this.


----------



## EastCoasthandle (Apr 18, 2008)

DarkMatter said:


> It's really amusing how everything you say to me about biased behavior is always more applicable to you than to me. Always. I'm not going to argue with you anymore.
> Anyone with a brain is capable of reading the reviews and forming their own opinion; we don't need you trying to distort reality in your favor. Thank you very much.
> 
> And the author gave the win based on IQ, not any Aero thing, as you can see in the quotes I posted above. No wait, maybe I'm asking too much of you. Here they are:
> ...


What's amusing is that you need to type out your own advice instead of following it.  What you posted is neither informative nor offers any tangible benefit to your own thread.  As I said before, I clearly showed you and this thread that ATI did win IQ based on the review.  Hiding behind a conclusion that didn't fit the actual tests doesn't work here. Also, the fact that many people have had the opportunity to use both cards and believe that ATI is better (based on your own poll) speaks volumes, IMO


----------



## erocker (Apr 18, 2008)

I want to point out that just because someone posts an article on the subject (be they a "professional" or otherwise), that makes NOTHING fact.  This is all subjective anyway; you can argue back and forth about this until you die, which is fine.  It keeps me entertained.  Thread still open for business!!


----------



## DarkMatter (Apr 18, 2008)

erocker said:


> I want to point out that just because someone posts an article on the subject (be they a "professional" or otherwise), that makes NOTHING fact.  This is all subjective anyway; you can argue back and forth about this until you die, which is fine.  It keeps me entertained.  Thread still open for business!!



But are we getting to the point where the opinion of many professionals (ok, "professionals" who work for publications, who are exposed to the public and to lawsuits from the companies, and who have to act accordingly) carries the same weight as the opinion of anyone who wants to post in a forum? Does the opinion of a forumer who always picks the two lines that best fit his goals weigh the same as the coinciding views of more than five publications with so-called professionals in the matter, who at least took the time to do the work and post proof of their findings? So far EastCoast has only taken reviews made by others and claimed quite the opposite of what the authors said, and not only that, he tries to make it look as if the authors said the opposite in the review body than in the conclusion!! From my point of view there's a big difference between those "professionals" and EastCoasthandle: *they always say "in my opinion" before saying one looks better than the other to them.*

I just can't understand... :shadedshu

Proof of (I'm going to be kind and say, unintentional) bias:



EastCoasthandle said:


> Actually, based on the results it clearly lost.  It won based on Aero, not IQ (which you did quote, thanks).  But for those that don't know a thing about tweaking their monitor or making properly calibrated adjustments, ATI is the clear winner right out of the box
> 
> Image 1: Casino Royale 1hr 15m 09s = ATI
> 
> ...



1 - Image 6: *TIE*



> Media Enthusiast Observations:
> Our final image looks at an underwater scene and there are two areas we feel are worthy of singling out. The first is the darker fish, towards the front and left of the image. On ATI its scales/markings are much clearer where as on Nvidia we again see detail suffer. Looking at the bright yellow fish near the centre of the image we find that the Nvidia image actually shows it to be richer and the stripes better defined. The ATI image of the fish is a little too washed out for our liking.
> 
> 
> ...



2- "After the IQ review the author then goes and tweak Nvidia standard settings..."

They activate a feature that ATI has enabled by default, as they say throughout the article...

3- "... in an attempt to make it look as good as Ati which wasn't touched"

This is EastCoasthandle's own personal opinion, which he tries to pass off as the reason they tried out the Dynamic Enhancement. False, since their conclusion is that Nvidia has a slightly better picture, and the reason they give for trying it is:



> We decided to test these options and the results follow, it should be noted that we also enabled the noise reduction feature in the Nvidia drivers and set it to a level similar to ATI. (ATI enable noise reduction by default). In theory, this should represent the best image quality Nvidia has to offer, without changing edge enhancement values. When we asked ATI what settings we should use for testing, 'default' was the response, so we can assume that the screenshots produced so far represent their best image quality.


----------



## erocker (Apr 18, 2008)

Correct, it's a "professional" opinion, so take it as you will.  I was just stating that no "professional's" opinion should be taken as fact.  A professional has his own eyes and brain, like the rest of us.  I've said it before: if you want to find out the facts for yourself, based upon your own perceptions, buy a modern card from either side and do the tests yourself.  Use what you believe to be the best according to your eyes and your brain's perception of what IQ should be.  It's all subjective, to the end-user, reviewer, or otherwise.


----------



## Wile E (Apr 19, 2008)

erocker said:


> Correct, it's a "professional" opinion, so take it as you will.  I was just stating that no "professional's" opinion should be taken as fact.  A professional has his own eyes and brain, like the rest of us.  I've said it before: if you want to find out the facts for yourself, based upon your own perceptions, buy a modern card from either side and do the tests yourself.  Use what you believe to be the best according to your eyes and your brain's perception of what IQ should be.  It's all subjective, to the end-user, reviewer, or otherwise.


I've been trying to say that from the beginning. But there are people who are convinced that one offers better quality than the other. I gotta say, I've always been an ATI man. The latest ATI card I have is a Powercolor 2900XT. I now have a Palit 8800GT. Neither one's IQ is superior to the other's. I've had to tweak both an equal amount to get things looking correct.


----------



## Mussels (Apr 19, 2008)

Wile E said:


> I've been trying to say that from the beginning. But there are people who are convinced that one offers better quality than the other. I gotta say, I've always been an ATI man. The latest ATI card I have is a Powercolor 2900XT. I now have a Palit 8800GT. Neither one's IQ is superior to the other's. I've had to tweak both an equal amount to get things looking correct.



I'm an AMD/ATI man. I think ATI offers slightly better quality in what I do (HD codecs, 'low' res gaming), but at the same time I knew that Intel and NV offered better performance/value for money when I upgraded... so I stayed with them.


----------



## Wile E (Apr 19, 2008)

Mussels said:


> I'm an AMD/ATI man. I think ATI offers slightly better quality in what I do (HD codecs, 'low' res gaming), but at the same time I knew that Intel and NV offered better performance/value for money when I upgraded... so I stayed with them.



That's all I do: low res gaming (1440x900), HD watching, and anime. There is no discernible difference between the cards in use. For video watching, HQV actually gives the edge to NV, but I sure as hell can't see the difference, nor can anyone I know. In games there are differences, but not in a way that makes one better than the other, only different.


----------



## Mussels (Apr 19, 2008)

Wile E said:


> That's all I do: low res gaming (1440x900), HD watching, and anime. There is no discernible difference between the cards in use. For video watching, HQV actually gives the edge to NV, but I sure as hell can't see the difference, nor can anyone I know. In games there are differences, but not in a way that makes one better than the other, only different.



The way I see it, if ATI has a slight lead (which is barely noticeable) and two cards had the same performance/price ratio otherwise, I'd choose the one that looked a tiny bit better.

Atm that isn't happening. ATI cards have made a comeback (good prices, and there are a LOT of ATI users in the 3DMark06 thread here on TPU) but AMD CPUs have not. I am still considering ATI, and would already have gotten an X2 if not for the broken widescreen scaling bug (you can't disable scaling to add black bars; let's not reiterate that problem in this thread).


----------



## saadzaman126 (Apr 19, 2008)

On the poll atm ATI is killing Nvidia, so I think there are more ATI fans here. Personally, neither matters; the difference in quality is pretty much unnoticeable...


----------



## erocker (Apr 19, 2008)

Well, I finally saw a new 9800GTX in action on a monitor similar to mine.  There is no difference.  The fuzziness that I noticed once in a while with the G80s is gone.  Subjective.


----------



## cooler (Apr 20, 2008)

pic 9800gx2 vs 3870x2


----------



## wolf (Apr 21, 2008)

still no noticeable difference...


----------



## RoachHotel (Apr 27, 2008)

Well, an equal comparison. Here ya go. This thread is about image quality between the two cards, not anything else. 
My brother has an 8800GTS 512 with a ViewSonic VX22wm. 
I have two ATI 3870s CrossFired with a ViewSonic VX22wm... same monitor. 

It's simple really. Using the exact same monitor, sitting next to each other a few weeks ago at Stompfest, it was clear that my ATI cards had a much more vibrant image than the Nvidia. Our settings in COD4 are set exactly the same. We also get almost exactly the same frame rate, about 170 FPS solid. He even mentioned that my game looked a lot better. 
In my opinion, the Nvidia cards kick ass. Nothing wrong with the video quality at all. They just don't quite have the color of a similar ATI card. They are faster than the ATI cards, but I will take slightly lower FPS to make my games look that much better. 

Hope this helped someone.


----------



## Wile E (Apr 27, 2008)

Even if you have the same model monitor, they are not equal. Monitors are just like any other component, each individual one responds differently. Calibrate both monitors with a Spyder2, and I bet the differences aren't there anymore.


----------



## Nitro-Max (Apr 27, 2008)

There will be no end to this thread, lol. It's full of fans from both sides, it's just gonna go on and on. Close it already.


----------



## D4S4 (Apr 27, 2008)

The only thing I like more about Nvidia is the Digital Vibrance Control in the driver. It adds more life to colors.
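For anyone curious what that slider actually does: Digital Vibrance is essentially a driver-side saturation boost. A rough Python sketch of the general idea (an illustration only, not Nvidia's actual algorithm; the function name is made up):

```python
import colorsys

def vibrance_boost(rgb, factor=1.3):
    """Scale the saturation of one (r, g, b) pixel, channels in 0.0-1.0.
    A crude stand-in for what a driver-level 'vibrance' control does."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    s = min(1.0, s * factor)  # boost saturation, clamp at fully saturated
    return colorsys.hsv_to_rgb(h, s, v)

# A dull reddish pixel: the red channel stays put while green/blue drop,
# so the colour reads as "richer" without getting brighter.
print(vibrance_boost((0.6, 0.4, 0.4)))
```

A real implementation would run per pixel on the GPU and usually works in a YUV-like space rather than HSV, but the effect is the same: greys stay grey, everything else gets pushed away from grey.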


----------



## Nitro-Max (Apr 27, 2008)

D4S4 said:


> The only thing I like more about Nvidia is the Digital Vibrance Control in the driver. It adds more life to colors.



That's because it needs it


----------



## D4S4 (Apr 27, 2008)

Nitro-Max said:


> That's because it needs it


----------



## Kovoet (Apr 27, 2008)

I've had both cards, and this will be ongoing for a long time. I remember being such an ATI fan from the day the 9800XT came out, all the way to the HIS X1950 Pro Turbo, which I loved, but since going over to the Nvidia 8800GTS 512MB I can't decide anymore. Both cards have their advantages.

The only thing that gets me about Nvidia is that it's going to make me bankrupt one day, because the cards change every 5 minutes.


----------



## saadzaman126 (Apr 27, 2008)

We'll see who has the better card when both new series come out. I hear the 4870X2 is insanity.


----------



## Rebo&Zooty (May 23, 2008)

I own an 8800GT and have installed/set up 5-6 3800-series cards, as well as getting to play with them.

Out of the box, without spending any time tweaking, the ATI cards look BETTER in movies and most games. For video playback they are quite a lot better than the 2900; the 2900 lacked something or other that helps with video playback acceleration, too lazy to look it up at the moment.

Now, with tweaking you can make both look better, but OUT OF THE BOX, without any tweaking, ATI/AMD clearly looks better, and this has been verified by more than a few people, including the votes in this thread. Funny, since Nvidia's cards are so much faster and more popular in games, yet ATI is winning the vote here.

My X1900XTX looks better for VIDEO PLAYBACK; images are more crisp and clear. I don't know about CPU use, since it doesn't really matter to me; I can watch 720p and 1080p movies and do other things at the same time without lag, so I'm happy.

Now for gaming: with no AA the cards are close to even, but PER AA SETTING, ATI's is better for QUALITY, while Nvidia cards can crank the AA up drastically without a major performance hit.

Here's my personal experience using a Gateway 2001fp gaming monitor (BenQ built):

In games, 2x ATI AA looks the same to me as 4x or 8xQ AA depending on the game; some games respond differently to AA settings than others.

WoW, for example, at 1600x1200 looks as good with 2x ATI AA as it does at 8xQ AA on my 8800GT, and 4x Nvidia AA looks about the same as 2x Nvidia. I don't know if it's a driver or game bug, but that's my experience. Nvidia's supersampled transparency AA looks better than adaptive AA, but doesn't look better than the advanced AA modes you can enable with ATITool's advanced tweaks. Also, supersampled mode really hammers performance in WoW and other games that use a lot of 3D textures for stuff like ground clutter and trees; it can cut the fps in half or even to a quarter, whereas I can enable adaptive AA + EATM and AlphaSharpen mode and take far less of a hit with the same or better quality.

Mind you, the new enhanced modes don't work with all games, but those they work well with show a nice quality boost, and you can even disable AAA and get the same quality boost as you had by enabling it, with effectively no perf hit (same perf as not having AAA enabled, but the same quality as with it enabled).

So yeah, Nvidia cards are faster, and with higher settings you can get the same quality, but if you are just talking PURE QUALITY, stock for stock, the way most people use their cards (most people do not tweak their drivers beyond maybe setting AA/AF levels), ATI wins.

When you bring performance in DX9 and DX9+DX10 shaders into account with AA, Nvidia tops ATI, no doubt. In 10.1, ATI pulls ahead by current reports/reviews, because true DX10 (10.1 is what DX10 was originally meant to be) uses shader-based AA, so it doesn't have the huge impact of trying to do DX9 AA with shaders/software.

You are talking about two totally different designs when you compare the G80/G92 with the R600/RV670.
G80/92 are DX9 parts with DX10 shader support tacked on (SM4); they were designed to work best in the games that were out when the cards came out, and that have mostly come out since. This was a good move on Nvidia's part, because it allowed them to pull ahead nicely when the 2900 came out.

The R600/670 chips are a NATIVE DX10 design with DX9 supported via software; the problem is that they didn't include a hardware AA unit for use when playing DX9-based games, like virtually all games until very recently have been.

Crysis and others are not true native DX10 games; they are DX9 games with DX10 (shader model 4) shader effects added for the DX10 version, and as such do not support true DX10 shader-based AA, relying on DX9 hardware AA instead.

When/if Nvidia's next card comes out with NATIVE DX10.x support, we will see how it does with older games. My guess is that they will keep the hardware unit there for games that work best with it, and support shader-based AA for those games that require it or that will run/look best with it.

I get the impression that the R700-based cards will have a hardware AA unit onboard to improve performance with older games, as well as fully supporting shader-based AA as required by the true DX10 spec (MS backed off on some specs for Nvidia; originally DX10 was meant to be what DX10.1 is).

So yeah, nobody's really better IF you tweak things, but out of the box ATI wins in my book, and yet I have an 8800GT (well, it's in the shop, I'll have it back soon I hope).


----------



## Mussels (May 23, 2008)

Rebo&Zooty:
One important thing here is that you are using a CRT screen. That means the RAMDACs on the cards are in use, whereas over DVI they would not be. If possible, compare over DVI as well; it wouldn't surprise me if, in this LCD era, they've been using cheaper RAMDACs lately.


----------



## ShadowFold (May 23, 2008)

Seeing as I just went from a 3850 to an 8800GT, I did not notice anything image-quality-wise, just performance increases


----------



## Rebo&Zooty (May 23, 2008)

Mussels said:


> Rebo&Zooty:
> One important thing here is that you are using a CRT screen. That means the RAMDACs on the cards are in use, whereas over DVI they would not be. If possible, compare over DVI as well; it wouldn't surprise me if, in this LCD era, they've been using cheaper RAMDACs lately.



Um, wrong, the Dell 2001FP is a 20.1in LCD, and BenQ doesn't make CRTs afaik.

And I also have a 19in KDS monitor I hook up to watch movies on while I game; it's a CRT.

http://support.dell.com/support/edocs/monitors/2001fp/EN/specs.htm

Results are the same, and the KDS I have is a very high quality CRT; the LCD is a high quality gaming LCD (despite the specs looking very unimpressive today, it doesn't ghost at all, and is VERY crisp).

And Nvidia sometimes uses crappy filtering on the analog portion of lower-end cards, the same stuff they pulled back in the GeForce 1/2 days, but back then you could remove the filters with a few snips of pliers and get a 100% quality boost from it.

I haven't seen the high-end Nvidia cards use bad filters in years, but the quality of the AA/AF, and hell, the general IQ on the pre-8-series cards, was very questionable, to be kind.

And yes, this monitor is using NATIVE DVI-D; it has VGA, S-Video and component plugs as well (I may hook a DVD player up to it for the hell of it).

One thing I can say about this monitor: I have yet to find a stand that's better or as good. This one's telescopic, and lets me raise the monitor to eye level from it sitting directly on the desk... my father's newer Dell 20.1in widescreen (Samsung) has a CHEEZY little stand; I had to build him a wooden pillar to set his monitor on to get it to eye level (how lame is that?).

I didn't pay for this LCD, it was a gift from a good friend, and despite the Dell logo on it, I like it  (but I hate Dell  )


----------



## Mussels (May 23, 2008)

Oh, my bad, I thought it was a CRT model (I used to have a 19" CRT with a very similar name)


----------



## Rebo&Zooty (May 23, 2008)

FP stands for flat panel.

Oh, and a little FYI for you all: even today's best LCDs can't reproduce a TRUE RED. A good/great CRT is better than a kickass LCD for colour reproduction and quality.

There's a reason most professional graphics studios still use CRTs on their high-end work machines.

The only problem is the room they take up and how much they cost to ship.

Place I used to get monitors: http://www.merkortech.com/home.asp

Good prices, especially if you stick with the free shipping models. My buddy got a 36in HDTV/CRT for 400 bucks around 1.5 years ago from them, and that's SHIPPED. Granted, it was shipped ground freight so it took 2 weeks to get here, but still, that was a killer price for that unit. Its native res is 1080p, and it has both true digital DVI and VGA. It's got an HDMI port, but the HDMI port doesn't do the encryption (HDCP) it needs to properly support HD content in Vista.


----------



## Wile E (May 23, 2008)

Rebo&Zooty said:


> I own an 8800GT and have installed/set up 5-6 3800-series cards, as well as getting to play with them.
> 
> Out of the box, without spending any time tweaking, the ATI cards look BETTER in movies and most games. For video playback they are quite a lot better than the 2900; the 2900 lacked something or other that helps with video playback acceleration, too lazy to look it up at the moment.
> 
> ...



The G8x and G9x cards are true DX10 cards as well. Their shaders are totally unified and fully programmable. They aren't locked into specific tasks at all.

And the DX version has nothing to do with the way AA is processed. It doesn't require any specific type of AA processing at all. That's 100% up to the game developers, and always has been.

And 2xAA on ATI is in no way equal to 4x (or whatever) on Nvidia.


----------

