# What is the difference in Nvidia and ATI?



## Franklinwallbrown (Feb 2, 2008)

Start off slow and then get more technical so I can learn please.


----------



## Random Murderer (Feb 2, 2008)

well, nvidia has more letters than ati.


----------



## sneekypeet (Feb 2, 2008)

Nvidia starts with a N and ATI starts with an A.....lol j/k


----------



## Franklinwallbrown (Feb 2, 2008)




----------



## ktr (Feb 2, 2008)

I have to say this thread is gonna turn into a fanboy war. 

Anyhow, I will give the rough difference between the two. 

As of now...

ATI > NVIDIA when it comes to picture quality
NVIDIA > ATI when it comes to performance


----------



## PrudentPrincess (Feb 2, 2008)

In the long span of things (as in from the beginning of the two companies until now) there is no real difference. afaik.


----------



## btarunr (Feb 2, 2008)

Read as many video-card reviews as you can and then start a discussion like this. Our own TPU has more than enough threads/reviews to whet your appetite.

To contribute to this discussion: The current generation shader architectures are different for NV and ATi


----------



## Random Murderer (Feb 2, 2008)

Now for the more technical:
Memory interfaces:
ATi uses a ring bus on their newest cards (HD2K series and on). This means that there's a dedicated IN pipeline and a dedicated OUT pipeline to each memory chip. For example, the 2900XT uses a 512bit ring bus, which means each chip has 256bit in and 256bit out.
NVidia has a really weird interface on the G80-based 8800s. The "384bit interface" on the GTX and Ultra is actually a 256 and 128bit bus "stuck together", with the 256bit bus addressing 512MB of memory and the 128bit bus addressing 256MB of memory, for a total of 768MB of memory on a 384bit bus. The original, G80-based GTS uses a 320bit bus, which is a 256bit bus "stuck together" with a 64bit bus, with the 256bit bus addressing 512MB of memory on the 640MB version and 256MB on the 320MB version, and the 64bit bus addressing 128MB of memory on the 640MB version and 64MB on the 320MB version.
The newer G9x cards use a more traditional single-address bus. They come in different "bit widths," but unlike ATi, they go both ways on a single bus instead of having dedicated pipes.

Shaders and Processors:
ATi uses "unified shaders," meaning the stream processors within the die handle everything short of allocating data. ATi has opted to use many unified shaders rather than use NVidia's method.
NVidia uses separate types of shaders and processors for each task, which equates to having many different pieces specialized in one thing rather than many more powerful shaders that do everything.

More to come later. I'm bored and want to go play some CS:S.


----------



## candle_86 (Feb 2, 2008)

the ring bus is on the x1800XT

Second of all, Nvidia has better IQ now, and better performance. ATI is dead and it's about time the heathens fell.


----------



## acperience7 (Feb 2, 2008)

candle_86 said:


> the ring bus is on the x1800XT
> 
> Second of all Nvidia has better IQ now also, and better preformance, ATI is dead and its about time the heathens fell.


My X1650 Pro and X1950 Pro both have a ring bus. I thought that all X1K and HD2xxx cards had a ring bus. Also, why do people seem to hate ATI? I like Nvidia too, but both companies make great-performing cards.


----------



## Wile E (Feb 2, 2008)

candle_86 said:


> the ring bus is on the x1800XT
> 
> Second of all Nvidia has better IQ now also, and better preformance, ATI is dead and its about time the heathens fell.



No, nvidia doesn't have better IQ. I have both a 2900XT and this 8800GT, and the 2900 looks ever so slightly better, but not enough to make any claims of "superior IQ". It's barely noticeable to all but those with the most trained eyes.


----------



## Random Murderer (Feb 2, 2008)

candle_86 said:


> the ring bus is on the x1800XT
> 
> Second of all Nvidia has better IQ now also, and better preformance, ATI is dead and its about time the heathens fell.



First off, please only reply if you know what the hell you're talking about. This is not the first idiotic, uninformed post I have seen from you. It's uninformed posts like yours that cause people to ignorantly buy worse products than what they could have gotten, and then get upset with TPU.
Second off, fanboyism is not only frowned upon by this forum, it's childish and only leads to problems.
Please, for the love of (insert deity here), learn what the hell you're talking about and try to back up your posts with actual information rather than opinion.


----------



## JrRacinFan (Feb 2, 2008)

@RM

Thank you for the best post I have seen in days!

@Franklin

To me there is really no difference, but I said it before and I will say it again, I am a budget buyer and can only give you personal experience with each brand. 

As I am stating that, my previous card was an ATi 9500 softmodded to pro. But before that was a GF4 ti4200. 

My GF4 definitely lasted longer, but my ATi performed better.

Sure, this may not be an informative post and is strictly personal opinion, but IMHO if you need hardware quality, go with ATi. Need a tiny bit more performance (and can stand switching drivers every 3 weeks)? Go with nVidia.


----------



## hat (Feb 2, 2008)

There's little difference in image quality between Nvidia and ATi these days. Go with whatever card is the best choice for you bang/buck wise. There's an HD2900PRO on newegg for $150, if you're building a new rig, grab it.


----------



## EastCoasthandle (Feb 2, 2008)

Wile E said:


> No, nvidia doesn't have better IQ. I have both a 2900XT and this 8800GT, and the 2900 looks ever so slightly better. Not enough to make any claims of "superior IQ" It's barely noticable to all but those with the most trained eyes.



Can you elaborate further on this?  Examples?


----------



## tkpenalty (Feb 2, 2008)

hat said:


> There's little difference in image quality between Nvidia and ATi these days. Go with whatever card is the best choice for you bang/buck wise. There's an HD2900PRO on newegg for $150, if you're building a new rig, grab it.



Little? The HD3870 and 8800GT have MASSIVE image quality differences in NFS: PS.... Moire patterns are noticeable on a lot of the textures with the 8800GT.


----------



## Wile E (Feb 2, 2008)

EastCoasthandle said:


> Can you elaborate further on this?  Examples?



With the screenshots going around on Crysis, it looked obvious, but to actually spot the differences while playing is a bit tougher. ATI still has the edge, but it's nowhere near night and day.


----------



## candle_86 (Feb 2, 2008)

Unless you pause a game to look, you haven't noticed it since the GeForce 6 days, to be honest.


Also, go learn something, RM: the ring bus was introduced on all R520-based GPUs, not the damned R600 line of chips. Also, I am not a fanboi; I back the company that has always shown innovation and drive. Heck, the GeForce FX is more advanced than the R300 in all truth, but its shaders were badly coded, leading to its problems. The NV40 brought us SM3 while ATI rehashed the R300. Nvidia was busy working on the G80, so they did a simple die shrink of the G70 (NV47) and called it G71 so that we could get the G80 on time. Yes, on time, something to this day ATI hasn't fixed: limited supplies of cards at launch. Sure it's hard, but there are few at launch. This isn't fanboyism, these are hard facts, and the truth is that for the past 4 years now NVidia has offered better bang for the buck: SM3 and higher overclocks. I also told people to get a 7900GS and OC it over the X1900GT, or get a 7950GT over the X1950 Pro and OC it, something ATI's cards couldn't do well until just now. And let's not talk about their midrange products.

The X600XT had to be replaced by the X700 Pro in less than 6 months to combat the 6600GT, which it was unable to do. 

The 7600GT walked all over the X1600XT, and it wasn't until the X1650XT that they caught up, but it was also around 50 bucks more for the XT over the GT. Then the HD2600XT struggles to keep up with the 8600GT, let alone the 8600GTS, and the HD3650 finally tied the 8600GT, just in time for the 9600GT to arrive and cripple them. Quite frankly, I'm tired of ATI's halfhearted attempts, and unless they can make a product range that is competitive in all segments they will never interest me.


The only thing I see is a bunch of ATI fanboys getting pissed because I dare challenge their remarks; maybe it is you that should pull your head out of that red-colored ass you've had it stuck in since 2003 and look around right now.


----------



## Random Murderer (Feb 2, 2008)

candle_86 said:


> unless you pause a game to look you havnt noticed it since the Geforce6 days to be honest.
> 
> 
> Also go learn something RM, the ringbus was introduced on all R520 based GPU's not the damned R600 line of chips.


No, wrong. The R500-series chips used a scalar MADD configuration.





candle_86 said:


> Also I am not a fanboi, I back the company that has always shown innovation and drive.


That's why you said:





candle_86 said:


> ATI is dead and its about time the heathens fell.


right?





candle_86 said:


> Heck the GeforceFX is more advanced than the R300 in all truth but its shaders where badly coded leading to its problems. The NV40 brought us SM3 while ATI rehashed the R300. Nvidia was busy working on G80 so did a simple die shrink of the G70(NV47) and called it G71 so that we could get the G80 on time. Yes on time something to this day ATI hasnt fixed, limited supply's of cards at launch sure its hard but there are few at launch. This isnt fanboism this is hard facts, and the truth is for the past 4 years now NVidia offered better bang for the buck. SM3, and higher overclocks. I also told people get 7900GS and OC it, over the x1900GT, or get a 7950GT over the x1950pro, and OC it. Something ATI's couldnt do well untill just now, and lets not talk about there midrange products.
> 
> x600XT had to be replaced by x700pro in less than 6 months to combat 6600GT which it was unable to do.
> 
> The 7600GT walked all over the 1600xt, and it wasnt untill x1650XT that they caught up, but it was also around 50 bucks more for the XT over the GT. Then the HD2600XT struggles to keep up with the 8600GT let alone the 8600GTS, and the HD3650 finally tied 8600GT, just in time for the 9600GT to arrive and cripple them. Quite frankly im tried of ATI's halfhearted attempts, and unless they can make a product range that is competive in all segments they will not interset me ever.


Sure, ATi may be a few months behind in development, but ATi has always been about cutting-edge technology they design themselves from the ground up, and they are currently the best bang for the buck.



candle_86 said:


> The only thing i see is a bunch of ATI Fanboies getting pissed because I dare challange there remarks, maybe it is you that should pull your head out of that red colored ass you have had it stuck in since 2003 and look around right now.


Reported.

Overall, I would say you need a spelling and grammar lesson, a change in attitude, and a vacation, if not from TPU then from whatever dead-end job you work. Go to Australia, I hear it's beautiful this time of year.


----------



## Dia01 (Feb 2, 2008)

I suppose the real value of having both companies comes down to one thing: if we didn't have both, regular technology advancements would not happen and pricing would not be kept in check. I'll go with whatever is best, either price-wise or performance-wise.


----------



## btarunr (Feb 2, 2008)

Holy shit, look at this freshly baked workstation card from AMD: the ATi FireGL V8650 2048M

http://ati.amd.com/products/fireglv8650/index.html
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195055

Vaguely judging by its PCB, the cooler, and the 512-bit memory bus (internal 1024-bit ring bus), it looks to be based on the R600.

Its competitor from NVidia is the QuadroFX 5600.
http://www.nvidia.com/object/quadro_fx_5600_4600.html


----------



## Deleted member 3 (Feb 2, 2008)

Random Murderer said:


> Sure, ATi may be a few months behind in development, but ATi has always been about cutting edge technology they design themselves from the ground up, and are currently the best bang for the buck.



And NV, Intel, XGI, S3 copy their designs? They build from the ground up as well.

Best bang for the buck is the 8800GT at the moment. Read the reviews on this very site.



@franklinwallbrown,
Could you please bother to do some of your own research before you start these threads? TPU is not Google you know.


----------



## Darren (Feb 2, 2008)

Franklinwallbrown had a genuine question, but because of fanboyism and ego (I'm not naming names) we've scared him off. These non-constructive fanboy claims without backup or proof have to stop, and fast.


----------



## largon (Feb 2, 2008)

Random Murderer said:


> now for the more technical:
> Memory interfaces:
> ATi uses a ring bus on their newest cards(HD2k series and on).


Ringbus *was* first used on R520. 





> This means that there's a dedicated IN pipeline and a dedicated OUT pipeline to each memory chip.


Uh, yes? 
*All SDRAM*, including GDDR(3) have dedicated in/out signal paths. 





> the 2900xt uses a 512bit ring-bus


HD2900 (R600) has a 1024bit ringbus divided in two unidirectional 512bit links. Ringbus is completely internal to the GPU, it has nothing to do with anything outside the GPU die. 





> which means each chip has 256bit in and 256bit out.


Oh dear. I can tell you don't understand how memory buses and GDDR(3) chips work and are built... 
First off, a single GDDR3 chip has an I/O width of just 32+32bits (32bit read and 32bit write), not 2x 256bit. Installing multiples of such 32bit "wide" chips on a graphics board allows wider bus widths, for example, 4 such chips on a HD2600XT 256MB of GDDR3 yields a 128bit bus (4×32bit=128bit), 8 of the exact same chips on a HD3870 512MB yields 256bit (8×32bit=256bit) and finally, 16 chips on HD2900XT 1GB totals to 512bits (16×32bit). 





> NVidia has a really weird interface on the g80 based 8800's. the "384bit interface" on the GTX an ULTRA is actually a 256 and 128bit bus "stuck together" with the 256bit bus addressing 512MB of memory and the 128bit bus addressing 256MB of memory for a total of 768MB memory on a 384bit bus. The original, g80-based, GTS uses a 320bit bus, which is a 256bit bus "stuck together" with a 64bit bus, with the 256bit bus addressing 512MB of memory on the 640MB version and 256MB on the 320MB version, and the 64bit bus addressing 128MB of memory on the 640MB version and 64MB on the 320MB version.


What you call nVIDIA's "really weird interface" is no weirder than the "traditional" 128bit or 256bit buses. It may seem "weird" to someone who (again) doesn't understand how memory buses are built; 384bits on a G80GTX is really just twelve 32bit wide GDDR3 chips working in parallel, yielding 12×32bit=384bit bus width. 

Saying something like this GPU has "256bit+128bit" memory or that GPU has 256bit+64bit *makes no sense*. It's 12×32bit for G80GTX, 10×32bit for G80GTS, 8×32bit for G92GT/S, 6×32bit for G92GS. 
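The practical consequence of bus width is peak memory bandwidth, which scales linearly with it. A rough sketch, with the caveat that the memory clocks below (900 MHz for the G80 GTX, 800 MHz for the G80 GTS) are commonly cited figures I'm adding, not numbers from this thread:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s; GDDR3 is double data rate,
    so it transfers twice per memory clock."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = mem_clock_mhz * 1e6 * 2
    return bytes_per_transfer * transfers_per_second / 1e9

print(peak_bandwidth_gb_s(384, 900))   # G80 GTX, 12 chips x 32bit -> 86.4
print(peak_bandwidth_gb_s(320, 800))   # G80 GTS, 10 chips x 32bit -> 64.0
```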





> The newer, g9x cards use a more traditional single-address bus. they come in different "bit widths," but unlike ATi, they go both ways on a single bus instead of having dedicated pipes.


^BS. 
There's no such thing as a "single-address bus". _You_ made that up. 
And for the record, a bus with a given bit width and a given frequency on an nVIDIA card is just as fast as one on an ATi card. Both vendors use the same chips that all have dedicated in/out signal paths.

64bit (nV) = 64bit (ATi, with a ringbus)
128bit (nV) = 128bit (ATi, with a ringbus)
256bit (nV) = 256bit (ATi, with a ringbus)
320bit (nV) = 1.25 x 256bit (ATi, with a ringbus)
384bit (nV) = ¾ of 512bit (ATi, with a ringbus)
512bit (nV) = 512bit (ATi, with a ringbus)
And so on...



> Shaders and Processors:
> ATi uses "unified shaders," meaning the stream processors within the die handle everything short of allocating data. ATi has opted to use many unified shaders rather than use NVidia's method.
> NVidia uses seperate types of shaders and processors for each task, which equates to having many different pieces being specialized in one thing rather than having many more powerful shaders that do everything.


It's the other way around. 
ATi's R6-generation shader processors are more specialized than the stream processors on nVIDIA's G80 and G92. Granted, R600 does some things with shaders that G80 does with ROPs, but for shader operations nVIDIA's units are more flexible: every unit is identical to its neighboring unit, so each unit can calculate even the most complex operations independently. Btw, ATi likes to claim R600/RV670 have 320 unified shaders - that's marketing talk. If nVIDIA used the same counting method they could in fact claim G80/G92 has 256 stream processors, as each of those 128 units on both is capable of dual-issue (scalar MAD + scalar MUL) per clock. So, 320 is marketing BS; in reality the R600/RV670 chips both have "only" 64 shader processors. Each of those 64 units has 5 different independent ALUs ("shader operators"), of which 1 is "complex" and is roughly comparable to each of those 128 units on G80/G92. The remaining 4 are "simpler", thus they can't execute complex shading operations. The highly specialized functionality of the shader units on R6-gen GPUs, combined with the complex and as-yet poorly supported VLIW coding these GPUs work on, is the main cause of the R600/RV670's apparent inefficiency in 3D games/applications.
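The two counting methods above can be made concrete. A sketch in Python; the unit counts come from the post, while the shader clocks (742 MHz for R600, 1350 MHz for G80's shader domain) are commonly cited figures I'm adding:

```python
# Counting methods from the post: R600/RV670 = 64 VLIW units x 5 ALUs each;
# G80/G92 = 128 scalar units, each dual-issue (MAD + MUL).
R600_UNITS, R600_ALUS_PER_UNIT = 64, 5
G80_UNITS = 128

# "320 shaders" is really 64 units x 5 ALUs:
assert R600_UNITS * R600_ALUS_PER_UNIT == 320
# By the same counting, G80's dual-issue units would be "256":
assert G80_UNITS * 2 == 256

def theoretical_gflops(alus: int, flops_per_alu_per_clock: int,
                       clock_mhz: float) -> float:
    """Peak shader throughput: ALUs x flops per ALU per clock x clock."""
    return alus * flops_per_alu_per_clock * clock_mhz / 1e3

print(theoretical_gflops(320, 2, 742))    # R600: MADD = 2 ops -> 474.88
print(theoretical_gflops(128, 3, 1350))   # G80: MAD + MUL = 3 ops -> 518.4
```

Peak numbers like these are exactly why "more shaders" on paper did not translate into more game performance: the R600 only reaches its peak when the compiler can fill all 5 ALU slots per unit.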



Random Murderer said:


> first off, please only reply if you know what the hell you're talking about.


Oh, the irony...


----------



## PuMA (Feb 2, 2008)

I would get nvidia over ATI these days. I learned something with the X1950 Pro: although it had nice graphics, games seemed heavier and laggier compared to even the 7600GT which I owned before the Pro. That's why I like nvidia: good performance from the card, and the graphics are not that bad either.


----------



## Darren (Feb 2, 2008)

PuMA, strange, as the x1950 Pro is supposed to be the superior card. 

largon, can we let this obsession with facts go? It's making people lose focus on Franklinwallbrown's original question.


----------



## largon (Feb 2, 2008)

Wile E said:


> With the screenshots going around on Crysis, it looked obvious, but to actually spot the differences while playing is a bit tougher. ATI still has the edge, but it's nowhere near night and day.


Sigh. 
The proverbial "Crysis screenshot" issues (like these) only exist due to flaws in *169.09 beta drivers* which caused a texture LOD distortion in distant textures.


----------



## ex_reven (Feb 2, 2008)

ktr said:


> As of now...
> 
> ATI > NVIDIA when it comes to picture quality
> NVIDIA > ATI when it comes to performance



Somebody please explain why the hell picture quality varies...
What specifically do you mean by "picture quality"?


----------



## Mattgal (Feb 2, 2008)

nvidia  focus on picture quality
ATi     focus on speed


----------



## ex_reven (Feb 2, 2008)

Mattgal said:


> nvidia  focus on picture quality
> ATi     focus on speed



That doesn't tell me anything.

From what I know, the rendered image is what the game engine tells the graphics card to render/display. I wasn't aware at any point that different brands render textures differently... I mean, how can the card do anything other than what a game engine tells it to do?

To me it's just another CPU. It processes data... It doesn't change the data and what it means, but it transforms it into something that can be DISPLAYED. Which brings me back to my question: why is a GPU any different? Why should a GPU change output when it's only really a glorified CPU?


----------



## Pyeti (Feb 2, 2008)

Mattgal said:


> nvidia  focus on picture quality
> ATi     focus on speed



other way round


----------



## NONYA (Feb 2, 2008)

My x1950 Pro kicks ass for the $ I spent on it. The 8600 may be the best bang for the buck, but that assumes everyone has that much money to spend; I don't. The best bang for my buck was, and is, the ATi Sapphire x1950 Pro. Some day I may upgrade to the 8600, but I have a ton of more important things to buy first. I haven't had a single problem running my games on high with my ATi.


----------



## Mattgal (Feb 2, 2008)

@Pyeti: that's what I've always been told.
@ex_reven: it's like one GPU makes the image look like a JPG and another brand's GPU makes it look like a BMP, which is much clearer.


----------



## btarunr (Feb 2, 2008)

lol, do you even know how a properly compressed JPG compares to a bitmap? No, the image quality differences aren't that big. Trust me, I use NVidia and things are quite clear and legible, nothing that looks like a JPG. 

The only image quality issues for me are random moire patterns when rendering a desert scene in Counter-Strike: Source, when there's a vast landscape of sand with HDR turned on (also seen by me on several racing simulators with tracks that have a lot of texture filtering enabled). And nothing else. The 2D view of using the OS, watching high-res videos and even compressed images gives me very good quality, and I never observed any difference between my 6800 GT, 8800 GTS and a friend's X800 XT and X1900 XTX.
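For what it's worth, moire on things like distant sand textures is a classic aliasing artifact: a fine repeating pattern sampled below its Nyquist rate folds down to a spurious low-frequency band. A 1-D toy illustration, nothing here is specific to either vendor:

```python
def apparent_frequency(pattern_freq: float, sample_rate: float) -> float:
    """Frequency an undersampled repeating pattern appears to have.

    Sampling folds the true frequency back toward the nearest multiple
    of the sample rate - the aliasing that shows up on screen as moire.
    """
    return abs(pattern_freq - round(pattern_freq / sample_rate) * sample_rate)

# A texture with 9 stripes per unit, sampled only 10 times per unit
# (below the Nyquist rate of 18), shows up as one slow moire band:
print(apparent_frequency(9, 10))   # -> 1
# Sampled densely enough, the pattern keeps its true frequency:
print(apparent_frequency(9, 24))   # -> 9
```

This is also why texture filtering quality (mipmapping, AF) differs between driver settings: better filtering pre-averages the pattern before it can alias.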


----------



## Mattgal (Feb 2, 2008)

It's the other way round.



> trust me I use NVidia and things are quite clear and legible, nothing that looks like a jpg


of course


> *nvidia focus on picture quality*
> ATi focus on speed


It's not that big a difference, but it is there. Btw, me too, I use an 8600GTS.


----------



## btarunr (Feb 2, 2008)

Mattgal said:


> its not that big as a difference but it is there. bdw me2, i use 8600GTS



Wrong. From whatever I've read in tech journals and forums so far, the image-quality issues that people claim to have are only with NVidia. They say ATI > NVidia in terms of image quality, which I agree with, looking at the moire patterns.


----------



## Mattgal (Feb 2, 2008)

Don't know then. I've always had nvidia, but I have many friends that have ATI.


----------



## PuMA (Feb 2, 2008)

Darren said:


> PuMA, strange as the x1950 Pro is suppose to be the superior card.
> 
> largon, can we let this obsession with facts go, it's making people loose focus from Franklinwallbrown's original question.




Yeah, the 1950 Pro had better FPS in-game, but games just seemed heavy and laggy. That's why I got nvidia again.

Games with the 8800GTS look pretty to my eyes, and the extra performance keeps me with nvidia, until ATI can show a great-performing card with the same image quality they have now.


----------



## candle_86 (Feb 2, 2008)

In case y'all missed reading every review of the GeForce 8: the IQ is equal to ATI's; every review site stated that fact. What you need to do is go into the drivers and enable IQ settings, such as Gamma AA and Transparency AA.


----------



## Morgoth (Feb 2, 2008)

What I know is that the latest ATI cards, talking about the latest cards (x1900, HD2xxx, HD3xxx), are way better in AA and AF mode than the 8800 is; especially the HD3870X2 is a king in AA and AF.
I really don't care if I lose 30fps on AA and AF as long as my fps stays around 30-40fps,
so I am for quality.


----------



## PrudentPrincess (Feb 2, 2008)

btarunr said:


> Wrong, from whatever I read on tech-journals and forums so far, the image-quality issues that people claim to have are only with NVidia. They say ATI > NVidia in terms of image quality which I agree looking at the moire patterns.



If you read the article in the newest Maximum PC mag, they do a picture quality review with different movies, DX10 games, and images. ATi won in almost all of them (the test subjects didn't know which screen was ATi/Nvidia).


----------



## Athlon2K15 (Feb 2, 2008)

largon said:


> Ringbus *was* first used on R520. Uh, yes?
> *All SDRAM*, including GDDR(3) have dedicated in/out signal paths. HD2900 (R600) has a 1024bit ringbus divided in two unidirectional 512bit links. Ringbus is completely internal to the GPU, it has nothing to do with anything outside the GPU die.
> Oh dear. I can tell you don't understand how memory buses and GDDR(3) chips work and are built...
> First off, a single GDDR3 chip has an I/O width of just 32+32bits (32bit read and 32bit write), not 2x 256bit. Installing multiples of such 32bit "wide" chips on a graphics board allows wider bus widths, for example, 4 such chips on a HD2600XT 256MB of GDDR3 yields a 128bit bus (4×32bit=128bit), 8 of the exact same chips on a HD3870 512MB yields 256bit (8×32bit=256bit) and finally, 16 chips on HD2900XT 1GB totals to 512bits (16×32bit). nVIDIA's what you call "a really weird interface" is not any more weirder than the "traditional" 128bit or 256bit buses. It may seem "weird" to someone who (again) doesn't understand how memory buses are built; 384bits on a G80GTX is really just twelve 32bit wide GDDR3 chips working in parallel yielding 12×32bit=384bit bus width.
> ...



I think someone just got pwned so bad he won't show up on TPU for a few days...


----------



## candle_86 (Feb 2, 2008)

PrudentPrincess said:


> If you read an article in the newest Maximum PC mag they do a picture quality review with different movies, dx10 games, and images. Ati won in almost all of them. (the test subjects didn't know which screen was Ati/Nvidia.)



The only way to see these problems is literally to pause the game, and quite frankly, unless you play @ 6FPS you will never notice it. As for ATI AA/AF versus Nvidia AA/AF, I'd rather have Nvidia so I don't get a 20% penalty; heck, 2xAA is free.

The IQ is something ATI fanboys tell themselves at night so they feel better about their purchase, knowing right now Nvidia is walking all over AMD/ATI.


----------



## PrudentPrincess (Feb 2, 2008)

candle_86 said:


> the only way to see these problems is literly to pause the game, and quite frankly unless you play @ 6FPS you will never notice it. As for the ATI AA/AF verses the Nvidia AA/AF, id rather have Nvidia so i dont get a 20% penenlty, heck 2xAA is free.
> 
> The IQ is something ATI fanboies tell themselves at night so they feel better about there purchase, knowing right now Nvidia is walking all over AMD/ATI



Yeah but ATI pretty much dominated the HD Movie and digital picture section. (not that I have a personal preference, I'm just posting what they found)


----------



## ex_reven (Feb 2, 2008)

I ran an x1950Pro 512MB and an 8800GT OC, and I didn't experience any difference apart from frames when gaming at the same settings.

(of course that was only for comparison, otherwise I run Crysis on high and max out every other game)


----------



## ktr (Feb 3, 2008)

ex_reven said:


> Somebody please explain why the hell picture quality varies...
> What specifically do you mean by "picture quality"



It has to do with the technology that ATI uses... such as higher levels of AA and other smoothing & blending filters. 

The most recent blind test (one happens every couple of months or so) that I have read about was one month ago, when Maximum PC did a blind quality test between ATI and nvidia. They took the HP Blackbird (an ideal machine: same exact hardware, just different graphics cards) and showed it to random people. Almost everyone said that ATI had better quality. 

Better quality means that the lines are smoother, the effects look sharper, the colors are more vibrant... etc.

Same shit happened when I compared my x850pro to my 7600gt. Both cards are about the same (the 7600gt is slightly faster), but with both in the same video config, I see better quality from the ATI but better performance (FPS) from the nvidia.
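The "smoothing" being described is what anti-aliasing settings actually do: take several sub-pixel samples and average them, so edge pixels get intermediate shades instead of a hard stair-step. A toy 1-D sketch of 4x supersampling (illustrative only; real MSAA/SSAA implementations differ in detail):

```python
def coverage(x: float) -> float:
    """Toy scene: a hard black/white edge at x = 2.5."""
    return 1.0 if x >= 2.5 else 0.0

def render(width: int, samples_per_pixel: int) -> list:
    """Average sub-pixel samples per pixel; more samples -> smoother edge."""
    row = []
    for px in range(width):
        subs = [coverage(px + (i + 0.5) / samples_per_pixel)
                for i in range(samples_per_pixel)]
        row.append(sum(subs) / samples_per_pixel)
    return row

print(render(5, 1))  # [0.0, 0.0, 1.0, 1.0, 1.0]  no AA: hard, jaggy edge
print(render(5, 4))  # [0.0, 0.0, 0.5, 1.0, 1.0]  4x: edge pixel blended
```

The performance penalties argued about in this thread come straight from this: every extra sample per pixel is extra shading and bandwidth work.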


----------



## btarunr (Feb 3, 2008)

But doesn't the final image quality of, say, viewing a JPEG file in Windows Preview depend on the RAMDAC more than on texture filters?


----------



## hat (Feb 3, 2008)

largon said:


> Sigh.
> The proverbial "Crysis screenshot" issues (like these) only exist due to flaws in *169.09 beta drivers* which caused a texture LOD distortion in distant textures.



I can't tell the difference...


----------



## Corrosion (Feb 3, 2008)

tkpenalty said:


> Little? HD3870 and 8800GT has MASSIVE image quality differences in NFS: PS.... Moire patterns are noticeable on a lot of the textures with the 8800GT.



How does ProStreet run for you? I get a LOT of lag in my mouse and also the keyboard. It's so weird; it's the only game that does it.


----------



## anticlutch (Feb 3, 2008)

Idk if this has been mentioned but here goes anyway:

nVidia is HORRIBLE with drivers. It takes forever for them to acknowledge an issue, and even longer to fix it (if they fix it at all). Case in point: the FPS-drop issue in the 8800GTS 320MB models due to incorrect flushing of the VRAM. It's been 6+ months since it popped up, but it still hasn't been fixed. nVidia, however, has a redeeming point: its hardware is generally more powerful than ATi's (or at least gives you a higher FPS).

ATi, on the other hand, is pretty decent (if not awesome) with their drivers compared to nVidia. They update drivers on a monthly basis, and actually fix problems instead of pointing the finger at software devs *cough*nVidia*cough*. Performance is usually below par compared to nVidia, but I'd rather have lower FPS and consistent driver support along with better image quality than raw hardware power with nVidia.

I started out with ATi, but jumped ship when the 8800s came out (many people did that), and now I'm regretting it. I'll probably stick with ATi for my next purchase, and every purchase after that, as long as nVidia continues to ignore its existing customers and only focuses on improving drivers for their new products... even then I doubt I'll ever buy nVidia again, because red is just so damn cool.


----------



## Firedomain (Feb 3, 2008)

Wow... incredible thread! I stopped reading about midway through when I got sick and tired of everyone bitching at everyone else.

Possibly one of the worst threads I have read (or started reading) on TPU. Even when people try to help and state what they think are facts, they get snapped at and it starts another bitch fight! Pretty pointless bothering to read it, I think.

As for the topic: I will happily go with whoever has the best performance/price ratio when I go to upgrade (unless there's a new card entering the stage anytime soon), after I have read some reviews of the ones I was looking at; then I'll finally decide.
At the moment I am using an NV 8800GTS; my old card (2006) was an ATI X850 Pro OC'd to XT PE.

Feel free to bitch about what I said, seeing as it'll most likely happen anyway because I probably hurt someone's feelings... very sad.


----------



## OrbitzXT (Feb 3, 2008)

ktr said:


> I have to say this thread is gonna turn into a fanboy war.
> 
> Any who, i will give the rough difference between the two.
> 
> ...



That statement was true a long time ago; once the 8800 came out, nVidia was on top for performance and image quality.


----------



## ktr (Feb 3, 2008)

OrbitzXT said:


> That statement was true a long time ago, once the 8800 came out nVidia was on top for performance and Image Quality.



Still not. 

Maximum PC reviewed the 3870 against its equal rival, the 8800GT. Blind tests say the 3870 was better. 

Having the faster card does not always mean better quality; the technology behind it is the major factor.


----------



## Wile E (Feb 3, 2008)

largon said:


> Sigh.
> The proverbial "Crysis screenshot" issues (like these) only exist due to flaws in *169.09 beta drivers* which caused a texture LOD distortion in distant textures.


Reread what I said, and what I said it in reference to. I was saying that the IQ is more or less the same. There are minor differences, but not enough to sway it definitively in either direction. This is coming from someone that owns both a 2900XT and an 8800GT, so it's first-hand experience.


----------



## BullGod (Feb 3, 2008)

This Franklinsheep guy is such an attention whore, with his endless posts. He's been on TPU for over a month now and acts like he doesn't know the difference between his ass and his mouth. And you people jump into the flame thread: yeah, my Matchbox car is faster than yours... :shadedshu


----------



## btarunr (Feb 3, 2008)

We've been trying to tell him Google exists.


----------



## TechnicalFreak (Feb 3, 2008)

Often when I buy a gfx card, I'm more concerned about the price tag.
I never look for "oh, that almost looks real; good video card!" I'm more like that guy who would buy a kick-ass card (be it nVidia or ATi) and then turn down all the graphics in a game just for performance.


----------



## JrRacinFan (Feb 3, 2008)

@FireDomain

Where in the world did you get the idea that my post was flame-ish?

I am with you on that, Tech. But I am partially the opposite: I am the one who would buy last gen and tweak the shit out of it to make it run like current gen.


----------



## Darren (Feb 3, 2008)

BullGod said:


> attention whore



Someone's been visiting seduction communities. lol

But yeah, Franklinsheep's question is a bit of a silly one; Google should aid his questions next time. 

I find it strange that he has an overclocked Sempron and an overclocked GeForce 6600 but doesn't know about the manufacturer differences.


----------



## btarunr (Feb 3, 2008)

Darren said:


> I find it strange that he has an overclocked Sempron and an overclocked GeForce 6600 but doesn't know about the manufacturer differences.



lol. 

And then he goes on to say: "start with basics and go on technical so I could learn"


----------



## Darren (Feb 3, 2008)

I think from now on, we should avoid threads that are blatantly begging for flame wars, and simply hand them an article to read. Although I love helping people out, we can't spoon feed, especially if they are not asking genuine questions.


----------



## Firedomain (Feb 3, 2008)

@JrRacinFan

No problem at all with yours... I just have a problem with people bitching about other people!
Not very productive for the TPU forums... 
Not mentioning any names...

I have no problem with a little humor though!


----------



## candle_86 (Feb 3, 2008)

hmm well if this was all a joke, at least we had fun.


----------



## hat (Feb 3, 2008)

He's just a noob who doesn't know much... first he learned how to overclock; now he's trying to learn this.

To put it simply, ATi is supposed to have better image quality and Nvidia is supposed to be better at performance, but Nvidia is starting to win in both areas...

I used to have an X1800XL (which malfunctioned), and when I got the 8500, I didn't notice any image-quality difference.


----------



## Conti027 (Feb 3, 2008)

Franklinwallbrown said:


> Start off slow and then get more technical so I can learn please.



One's red and the other's green; other than that, nothing that matters.  lol
I mean, they both do the same thing.


----------



## keakar (Feb 3, 2008)

Franklinwallbrown said:


> Start off slow and then get more technical so I can learn please.



You want simple? Here it is:

ATI vs NVIDIA is the same as FORD vs CHEVY.


Both are good, and each has better models than the other at times, but that's about it.


Lately the difference-maker has been ATI not matching NVIDIA as new cards come out, so they fell behind a little in the latest-and-greatest department, but the cards they have still do the job, and now with the 3870 they have something comparable to NVIDIA's 8800GT. 

It all comes down to how much performance you get for the money you are willing to spend.


----------



## yogurt_21 (Feb 3, 2008)

candle_86 said:


> In case y'all missed reading every review of the GeForce 8: the IQ is equal to ATI's; every review site stated that fact. Now what you need to do is go into the drivers and enable IQ settings such as Gamma AA and Transparency AA.



Actually, from the reviews of the 2900XT vs the 8800GTS (640MB/320MB), most said the two looked about the same without AA enabled; with it enabled, they seemed to point to Nvidia's 16xAA method as being better. 

http://vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-15.html
Granted, it's a rather old review, but most of the driver changes since didn't affect IQ; they mostly helped performance.

Dunno about the 3870s, but it would seem that ATI is behind in both performance and IQ, and that's coming from an ATI fanboy.


----------



## Darren (Feb 3, 2008)

yogurt_21 said:


> seem that ati is behind both in performance and in IQ. and thats coming from an ati fanboy.



You haven't seen the new 3870 X2 reviews; it seems to be on top as far as performance goes... for now.


----------



## Monkeywoman (Feb 3, 2008)

ATI's shader clock is locked to the core clock, i.e. the 3870 is 775 core and 775 shader. Nvidia's shader clock is separate from the core, i.e. the 8800GT is 650 core and 1600 shader. But when you look at how the shaders work, the Nvidia card works quicker because of its sheer clock speed; for ATI's to work properly, a game has to be developed to use the ATI architecture properly, or it takes LOTS of driver tweaks.
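To put rough numbers on that clock split, here's a back-of-the-envelope Python sketch. It treats every shader unit as one MAD (2 FLOPs) per clock, which ignores how well real game shaders actually fill ATI's units, and uses reference-ish clocks (the 8800GT shader domain is roughly 1.5 GHz stock), so treat the results as illustrative only:

```python
# Back-of-the-envelope peak shader throughput (MAD = 2 FLOPs/clock).
# Ignores how well real game shaders keep the units busy.

def peak_gflops(shader_units, shader_clock_mhz, flops_per_clock=2):
    """Peak programmable-shader throughput in GFLOPS."""
    return shader_units * shader_clock_mhz * flops_per_clock / 1000.0

# HD 3870: 320 stream processors locked to the 775 MHz core clock
ati = peak_gflops(320, 775)
# 8800 GT: 112 scalar shaders on a separate ~1500 MHz shader clock
nv = peak_gflops(112, 1500)

print(f"HD 3870 peak: {ati:.0f} GFLOPS")  # ~496
print(f"8800 GT peak: {nv:.0f} GFLOPS")   # ~336
```

ATI's paper number wins, but only if the compiler can keep every slot of every unit busy each clock, which is exactly why it leans so hard on per-game driver tweaks.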


----------



## thegave (Feb 3, 2008)

BullGod said:


> And you people jump in the flame thread, yeah my Matchbox car is faster than yours... :shadedshu




Permission to sig?


----------



## CDdude55 (Feb 3, 2008)

Well, there is not much difference; some people just prefer one or the other for different reasons. When it comes to ATI and Nvidia, I go both ways, but since I have a 680i chipset (and SLI only works on Nvidia chipsets), I go with Nvidia.


----------



## Franklinwallbrown (Feb 4, 2008)

Okay, I am sorry if you guys are mad at me for not doing research, but lately I have been very busy and not at my computer. I figured that I could ask you guys some questions, get on here every once in a while, and just start reading, but if you don't want me to, that is fine. Just tell me to go away & I will.

As for me & the reason I want to know. I am going to be building a new computer (date unknown) & I wanted to know as much as I could. I have already bought the headset and surround sound & they have both blown me out of the water, and I have TPU to thank for that. Now I'm not a professional & all I want to do is learn. I don't want people to fight I just want to know what video card would be good for HD movies and gaming. So, thank you.


----------



## erocker (Feb 4, 2008)

Franklinwallbrown said:


> Okay, I am sorry if you guys are mad at me for not doing research, but lately I have been very busy and not at my computer. I figured that I could ask you guys some questions and get on here every once and a while and just start reading, but if you don't want me to that is fine. Just tell me to go away & I will.
> 
> As for me & the reason I want to know. I am going to be building a new computer (date unknown) & I wanted to know as much as I could. I have already bought the headset and surround sound & they have both blown me out of the water, and I have TPU to thank for that. Now I'm not a professional & all I want to do is learn. I don't want people to fight I just want to know what video card would be good for HD movies and gaming. So, thank you.



Nobody wants you to go away!  I guess one thing you learn is to never start an ATi vs. Nvidia thread!


----------



## sneekypeet (Feb 4, 2008)

Franklinwallbrown said:


> Okay, I am sorry if you guys are mad at me for not doing research, but lately I have been very busy and not at my computer. I figured that I could ask you guys some questions and get on here every once and a while and just start reading, but if you don't want me to that is fine. Just tell me to go away & I will.
> 
> As for me & the reason I want to know. I am going to be building a new computer (date unknown) & I wanted to know as much as I could. I have already bought the headset and surround sound & they have both blown me out of the water, and I have TPU to thank for that. Now I'm not a professional & all I want to do is learn. I don't want people to fight I just want to know what video card would be good for HD movies and gaming. So, thank you.



It isn't that we won't help if you ask... but you have to realize that asking what the difference is between ATI and Nvidia was bound to start a war.

That is why Random and I gave you a ribbing in the beginning... we knew what was soon to come.

You may have a good question, but it needs to be narrowed down some. There is a huge span of time and cards to consider with a question that general. Also, there are a lot of threads covering the newer cards already. Take a peek and see what others are doing with what you want; read the reviews and see if your wallet covers it.

If at that point in your research you still have questions, we will be more than happy to give you any information we can!


----------



## Franklinwallbrown (Feb 4, 2008)

I just want to learn, really. I wish I could buy all the cards and look first-hand, but unfortunately I don't have that much money. Hopefully when I start working I can, and you guys can help me run tests. Wouldn't that be great? And if it comes down to both being about the same, then I would go with the less expensive one. Also, every time I go to Google to research something, I get thrown around a million different websites, and none of them answer my question, or they confuse me. So... what is a guy to do?


----------



## candle_86 (Feb 4, 2008)

Darren said:


> you haven't seen the new 3870 X2 reviews, they seem to be ontop as far as performance....for now



Not by a lot, though; they overtook the Ultra by less than 20% in most tests, and in a few they even lost. Wait till the 14th and watch the X2 cry.

As for best bang for the buck, that's still the 8800GT: GTX power for $200 less. As for good days and bad days, I don't really see a recent good day for ATI. Being honest: 2005 was too little, too late; in 2006 Nvidia trumped them with the 7900GTX and 7900GT, which gave the G71 the lead until Oblivion and the like came out. The X1950XTX came out two months before G80, and the 2900XT was nowhere to be seen; by the time it showed up, most who wanted to upgrade already had. As for midrange, the X700, X1600, and 2600 all suck, there is no doubt. The X1650XT was fast but cost $70 more than the 7600GT for very little gain. And sure, the 8600 sucks, but at least it can outperform last year's midrange.


----------



## Franklinwallbrown (Feb 4, 2008)

From the context of all of your posts you SOUND like an Nvidia fanboi, but that is just what you sound like. I don't really know you.


----------



## imperialreign (Feb 4, 2008)

Lord . . . I avoided this thread for the last day or so because I knew where it would end up, too.  I was right!

For future reference: asking what the differences are between ATi and nVidia will lead to a lot of truth-stretching, name-calling, nose-picking, review-posting, nipple-twisting, ranting, raving, foaming at the mouth, bullshit-crapping, references to specs that don't exist, alluding to screenshots that did exist, and general divine-omnipotent, inter-galactic-planetary, smoke-on-the-water, inna-gadda-da-vida claims of knowledge that can't be backed up, like pics of a UFO chillin' in your neighborhood park or toothpaste spackled inside a computer case.

Same goes for asking the differences between AMD and Intel.

S'all good, though; we enjoy the foray


----------



## imperialreign (Feb 4, 2008)

candle_86 said:


> not by alot though, they overtook the Ultra by less than 20% in most tests and in a few even lost. Wait till the 14th and watch the X2 cry.



Sorry for the double post . . .


. . . but the HD 3870 X2 was never intended to compete with the 8800 Ultra, nor the GTX.  If ATI managed to run with those dogs, then they've more than done their duty with the X2.


----------



## Franklinwallbrown (Feb 4, 2008)

That's good, because I'm book-smart, but I don't have any common sense. At least that is what I hear all the time.

So the GTX is the best right now + you can OC it?


----------



## candle_86 (Feb 4, 2008)

imperialreign said:


> sorry for the double post . . .
> 
> 
> . . . but, the HD 3870x2 was never intended to compete with the 8800 Ultra, nor the GTX.  If ATI managed to accomplish running with those dogs, then they've more than done their duty with the x2.



But what about when the 9800GX2 arrives @ $500? Most who would buy are waiting for it to see how it does, and if the 7950GX2 is any indication of performance, we will see an ass-whooping.


----------



## candle_86 (Feb 4, 2008)

Franklinwallbrown said:


> From the context of all of your posts you SOUND like an Nvidia fanboi, but that is just what you sound like. I don't really know you.



You can call me what you like, but I go for FPS any day, and when Nvidia can deliver on time and outperform, it means something; something ATI has yet to grasp.


----------



## Franklinwallbrown (Feb 4, 2008)

I see. I just want smooth gameplay + awesome HD movie watchage.


----------



## candle_86 (Feb 4, 2008)

Then get an Nvidia 8800GT: great FPS, with PureVideo.


----------



## imperialreign (Feb 4, 2008)

candle_86 said:


> but what about when 9800GX2 arrives @ 500, most that would buy are waiting for this to see how it does, and if the 7950GX2 is an indictation for preformance we will see an ass whooping.



The 3870 X2 was not designed to compete with nVidia's next generation of cards.  You'd be comparing new tech to old hat again.

We'll wait and see how ATI's rumored R700 fares against the GeForce 9 lineup - whenever ATI gets around to releasing it, if it even exists.



			
candle_86 said:


> then get an nvidia 8800GT, great FPS with purevideo



Or a HD 3870 - same league, cheaper price


----------



## BullGod (Feb 4, 2008)

thegave said:


> Permission to sig?



Yeah, whateva...


----------



## candle_86 (Feb 4, 2008)

imperialreign said:


> the 3870x2 was not designed to compete with nVidia's next generation of cards.  You'd be comparing new tech to old hat again.
> 
> We'll wait and see how ATI's rumored R700 fares against the GeForce 9 lineup - whenever ATI get's around to releasing it, if it even exists.
> 
> ...



Really? How so? The 8800GT outperforms it every step of the way, and with AA and AF the 3870 is embarrassingly slower than the G92.

Also, the 9800GX2, for those just tuning in, is two 8800GTS 512s on a dual card; that's GeForce 8 tech, not 9. You really should read more.


----------



## Wile E (Feb 4, 2008)

candle_86 said:


> not by alot though, they overtook the Ultra by less than 20% in most tests and in a few even lost.


Yeah, on release drivers, on a card that costs $200-250 less.

And ATI's Avivo blows Pure Video out of the water.

And I'm not sold on the 9800GX2 being a good purchase. Support for their last GX2 is terrible. It really wasn't a good purchase at all. Hopefully they'll do things differently this time, but I somehow doubt it.


----------



## imperialreign (Feb 4, 2008)

candle_86 said:


> really, how so, the 8800GT outpreforms it at every step of the way, and with AA and AF the 3870 is embrassingly slower than the G92.



The 3870 is not really all that much slower in the grand scheme of things, and considering its somewhat lower price, it's a better overall value.  Sure, the 8800GT is faster in _some_ applications, but those apps are notorious for being hard on ATI's offerings (i.e. Crysis)



candle_86 said:


> Also 9800GX2 for those just tuning in, is 2x8800GTS 512's in dual card, thats geforce 8 tech, not 9. You really should read more.



Heh - following nVidia's naming scheme, even though the 9800 GX2 is brandishing two 8800 GTs, it'll still be named into the GeForce 9 series (hence 9xxx GX2); it doesn't matter to nVidia whether the GPU tech is GF 8 or not - a new series of cards ushers in a new GeForce lineup.  Perhaps you're the one in need of reading?


----------



## largon (Feb 4, 2008)

imperialreign said:


> Sure the 8800GT is faster in some applications, but those apps are notorious for being hard on ATI's offerings


I think "some applications" is a bit of an understatement, as the 8800GT _is faster_ in almost 80% of recent games compared to the HD3870. 
Poor ATi is being bullied by most game developers...


----------



## JrRacinFan (Feb 4, 2008)

candle_86 said:


> you can call me what you like, but i go for FPS anyday and when Nvidia can deliver ontime and outpreform it means something, something ATI has yet to grasp.



You say you go for FPS? 100 fps versus 120 fps? Sorry to say, but if it's a difference of, let's say, $150 for those extra 20 fps, I for one will save my money and look elsewhere.

@largon 
Yes, but compared on price, the HD3870 may win that "war". All depending on release price.


----------



## largon (Feb 4, 2008)

What recent games run at 100fps? 
Why buy a new video card for old games that already run at 100fps on current hardware? Surely earlier-gen or even current-gen mainstream cards handle those titles at playable rates?


----------



## candle_86 (Feb 5, 2008)

imperialreign said:


> the 3870 is not really all that slower by much in the grand scheme of things, and considering the somewhat lower price of the 3870, it's a better overall value.  Sure the 8800GT is faster in _some_ applications, but those apps are notorious for being hard on ATI's offerings (i.e. Crysis)
> 
> 
> 
> HET - following nVidia's naming scheme, even though the 9800 GX2 is brandishing 2 8800 GTs, it'll still be named into the GeForce 9 series (hence 9xxx GX2); doesn't matter to nVidia whether the GPU tech is GF 8 or not - new series of cards ushers in a new GeForce lineup.  Perhaps you're the one in need of reading?




Hard on ATI? Haha. It has to do with their design, which isn't well supported: they have 64 groups of 5 shaders, but only one shader per group can handle complex (transcendental) instructions, while the other four handle simple ones. Seeing as many shader instructions are complex, you effectively have a 64-vs-128 fight, not to mention the engine has to be programmed with ATI in mind, while the Nvidia solution is universal. ATI tried something similar in 2000 with a 2x3 core config, but only one game was made that supported it. Same mistake, just seven years later.
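To see why that packing business matters, here's a toy Python sketch. The numbers are made up for illustration; it just counts issue slots per cycle and ignores clock speeds entirely (the scalar units actually run at a much higher clock, as noted earlier in the thread), so it's a sketch of the idea, not a benchmark:

```python
import math

# Toy model: cycles needed to issue n_ops independent shader ops on
# (a) 112 scalar units that each take one op per cycle, versus
# (b) 64 five-wide VLIW groups where the compiler only manages to
#     pack `packed` ops per group per cycle on average.

def cycles_scalar(n_ops, units=112):
    return math.ceil(n_ops / units)

def cycles_vliw(n_ops, groups=64, packed=5):
    # packed=5 means perfect packing; real shaders rarely achieve it
    return math.ceil(n_ops / (groups * packed))

ops = 100_000
print(cycles_scalar(ops))           # 893 cycles on the scalar design
print(cycles_vliw(ops, packed=5))   # 313 cycles, perfect packing
print(cycles_vliw(ops, packed=2))   # 782 cycles, poor packing
```

With perfect packing the wide design wins easily; with serial, dependent instruction streams it falls back toward 64 effective units, which is the gap being described above.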


----------



## Graogrim (Feb 5, 2008)

DanTheBanjoman said:


> Best bang for the buck is the 8800GT at the moment. Read the reviews on this very site.


Actually, based purely on benchmarking per dollar with emotion and any artificial restrictions taken out of the equation, it's the *GeForce 8600 GT* that's the best bang for the buck.

Of course, few here would appreciate the performance they'd get with the 8600 GT, so if we fudge a bit and say "the best bang for the buck at competent performance levels in modern games" the result is the Radeon HD3850.

The 8800GT comes in second after that.

Now if you want to say "the best bang for the buck over $200," *that's* the 8800 GT.


----------



## Darren (Feb 5, 2008)

In the UK the best bang for your buck is:


For midrange cards: 3850

3850, because it outperforms the 8600GTS easily. Both cards are priced between £95-120 depending on the manufacturer. The 8800GS is a worthy alternative to the 3850.

For high end: 3870

3870, because it's between £125-140, whilst the 8800GT is priced between £145-200.


Ultra high-end cards: 3870 X2

The 3870 X2 is cheaper and faster than Nvidia's 8800 Ultra: the 3870 X2 is between £260-290, while the 8800 Ultra is between £350-460.



Here in the UK, ATI appears to be cheaper and "just" as fast as Nvidia's offerings, if not faster. But make up your own mind, as pricing differs from country to country.
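If you want to literally make up your own mind with numbers, a quick £-per-performance script does the trick. The prices below are midpoints of the ranges above, but the "perf" scores are placeholders, not benchmark results; plug in real figures from the reviews:

```python
# Rough £-per-performance comparison. Prices are midpoints of the
# quoted UK ranges; the perf scores are hypothetical placeholders,
# NOT benchmark results -- substitute numbers from actual reviews.

cards = {
    # name: (price_gbp, relative_perf_score)
    "HD 3850": (107, 100),
    "HD 3870": (132, 120),
    "8800 GT": (172, 135),
}

def pounds_per_perf(price, perf):
    """Lower is better: pounds paid per unit of performance."""
    return price / perf

for name, (price, perf) in sorted(
        cards.items(), key=lambda kv: pounds_per_perf(*kv[1])):
    print(f"{name}: £{pounds_per_perf(price, perf):.2f} per perf point")
```

On these placeholder numbers the 3850 comes out on top, but the ordering flips around as street prices move, so rerun it with whatever your shops are charging.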


----------

