# EVGA Launches e-GeForce 8800GT AKIMBO with 1024MB DDR3 Memory



## malware (Mar 26, 2008)

Official NVIDIA partner EVGA has released a new e-GeForce 8800 GT 1024MB G92-powered card as part of its AKIMBO line-up. The AKIMBO cards feature a dual-slot cooling system that resembles the stock 8800 GTS 512MB cooler, and strangely enough, all AKIMBO cards run at standard clocks. The new 1GB GeForce 8800 GT model has 112 stream processors set to 1500MHz, a GPU clocked at 600MHz, and GDDR3 memory at 1800MHz. The card is in stock now for $299.99.
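For context, those memory figures imply a straightforward bandwidth number. A quick sketch, assuming the 8800 GT's standard 256-bit memory bus (the announcement doesn't state it) and treating the quoted 1800MHz as the effective GDDR3 data rate:

```python
# Theoretical memory bandwidth of the 1GB 8800 GT.
# Assumption: standard 8800 GT 256-bit bus (not stated in the news post);
# 1800MHz is taken as the effective GDDR3 data rate.
effective_data_rate_hz = 1800e6
bus_width_bits = 256

bandwidth_gbps = effective_data_rate_hz * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gbps:.1f} GB/s")  # 57.6 GB/s
```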





*View at TechPowerUp Main Site*


----------



## Wile E (Mar 26, 2008)

No 3 phase power? I'd still rather have my Palit 1GB over this.


----------



## PVTCaboose1337 (Mar 26, 2008)

Meh, I would still buy the normal one...  an extra $80 does not merit 1GB of RAM.


----------



## hat (Mar 26, 2008)

Isn't it a little late to be screwing around with 8 series cards?


----------



## Swansen (Mar 26, 2008)

hat said:


> Isn't it a little late to be screwing around with 8 series cards?



haha, maayybee, considering the 9600's aren't even as good as the 8800 GTX's.... i don't get what Nvidia is doing lately.


----------



## btarunr (Mar 26, 2008)

Swansen said:


> haha, maayybee, considering the 9600's aren't even as good as the 8800 GTX's.... i don't get what Nvidia is doing lately.



The 9600 GT isn't supposed to be as good as the 8800 GTX either; it replaces its older-generation equivalent, the 8600 GTS, and it did exactly that. It's twice as fast as the card from the older generation.


----------



## springs113 (Mar 26, 2008)

btarunr said:


> The 9600 GT isn't supposed to be as good as the 8800 GTX either; it replaces its older-generation equivalent, the 8600 GTS, and it did exactly that. It's twice as fast as the card from the older generation.



the 9600gt is only twice as fast as the 8600gts because the 8600gts was the sh*ttiest excuse for a midrange bang-for-the-buck card...


----------



## btarunr (Mar 26, 2008)

springs113 said:


> the 9600gt is only twice as fast as the 8600gts because the 8600gts was the sh*ttiest excuse for a midrange bang-for-the-buck card...



Technically, 'mid-range' by classical definition should be Radeon HD2600 XT, HD3650, GeForce 8600 GTS and 9600GT.  How 'shitty' an excuse was the HD2600 XT? 


The new definition of 'Mid-Range' would be that which allows you to play any current API standard game (currently, any DX 10 game) at medium settings at acceptable frame rates. The 8600 GTS and HD2600 XT couldn't do that....but a 9600 GT can....and so can its price-to-price equivalent, the Radeon HD3850. So, the real mid-range champions would be GeForce 8800 GT and the Radeon HD3870, looking into the price-stratum they fall into.


----------



## springs113 (Mar 26, 2008)

btarunr said:


> Technically, 'mid-range' by classical definition should be Radeon HD2600 XT, HD3650, GeForce 8600 GTS and 9600GT.  How 'shitty' an excuse was the HD2600 XT?
> 
> 
> The new definition of 'Mid-Range' would be that which allows you to play any current API standard game (currently, any DX 10 game) at medium settings at acceptable frame rates. The 8600 GTS and HD2600 XT couldn't do that....but a 9600 GT can....and so can its price-to-price equivalent, the Radeon HD3850. So, the real mid-range champions would be GeForce 8800 GT and the Radeon HD3870, looking into the price-stratum they fall into.



if you want to emphasize something...emphasize the point that nvidia's g92 9-series stinks when compared to the g92 8-series. and like i have said to numerous people, i have both an 8800gt and a 3870, so no fanboy here. it definitely looks like you are one of the pawns in this gpu battle...the 2600xt was still better than the 8600gt...and from the jump these cards were not meant to play crysis and other directx 10 games like these manufacturers wanted you to think. there has not been such an increase in performance like the jump from the 8600 to the 9600...that just flat out tells ya that the 8600 stinks...and to back my claim up, the gts 512 and the gt perform very similarly, and the impending gtx, i know, will not seem so superior when compared to the 8800s...its all a wash...

nvidia is playing a dirty game and it shows in how they destroy each one of their product line-ups...and to push the envelope even further...the 9600 is a watered-down gt...8800gt...and the two are so close in price that they have each other in chokeholds.


----------



## [I.R.A]_FBi (Mar 26, 2008)

springs113 said:


> the 9600gt is only twice as fast as the 8600gts because the 8600gts was the sh*ttiest excuse for a midrange bang-for-the-buck card...



i agree ...


----------



## newtekie1 (Mar 26, 2008)

springs113 said:


> the 2600xt was still better than the 8600gt



You don't know what you are talking about.  Anyone who thinks the HD2600XT was better than the 8600GT is either completely ignorant to the facts or a fanboy.

http://i1.techpowerup.com/reviews/Powercolor/HD_3650/images/perfrel.gif

The 8600GT was cheaper than the HD2600XT too, and it is still cheaper than the HD3650, which is the exact same card as the HD2600XT.

The 9600GT is faster because it is a beast of a card.  It is faster than ATi's high end offering!  I mean WTF?!?!?!  The 8600GT wasn't a super fast card, but it was still better than its competition, as is the 9600GT.

The current gen mid-range card is not supposed to be faster than the last gen's high end cards.  Especially when the generations are more like refreshes.


----------



## springs113 (Mar 26, 2008)

newtekie1 said:


> You don't know what you are talking about.  Anyone who thinks the HD2600XT was better than the 8600GT is either completely ignorant to the facts or a fanboy.
> 
> http://i1.techpowerup.com/reviews/Powercolor/HD_3650/images/perfrel.gif
> 
> The 8600GT was cheaper than the HD2600XT too, and it is still cheaper than the HD3650, which is the exact same card as the HD2600XT.



well i should have clarified a little more...but i guess i like the debate thing...the 2600s/2400s weren't really gamers...ati made those cards (the 2600s) specifically for those that wanted a good htpc remedy but at the same time some gaming...this is why i stated that it's better...and come to think of it, when it comes to all-around performance the ati cards are better...let's not even get into nvidia's shady image practices to increase frames...lol


----------



## springs113 (Mar 26, 2008)

springs113 said:


> well i should have clarified a little more...but i guess i like the debate thing...the 2600s/2400s weren't really gamers...ati made those cards (the 2600s) specifically for those that wanted a good htpc remedy but at the same time some gaming...this is why i stated that it's better...and come to think of it, when it comes to all-around performance the ati cards are better...let's not even get into nvidia's shady image practices to increase frames...lol



also, one review does not make or break a card...and w1zzard's method is crazy at times...i asked him to stop doing the aa and af things at higher resolutions, or at least run tests with and without aa, because aa and af really were meant to improve the picture at low resolutions...
but anyways, one review does not make or break a card.


----------



## newtekie1 (Mar 26, 2008)

springs113 said:


> well i should have clarified a little more...but i guess i like the debate thing...the 2600s/2400s weren't really gamers...ati made those cards (the 2600s) specifically for those that wanted a good htpc remedy but at the same time some gaming...this is why i stated that it's better...and come to think of it, when it comes to all-around performance the ati cards are better...let's not even get into nvidia's shady image practices to increase frames...lol



Again, they were not better.  They were no better in an HTPC than an 8600GT.  And no, ATi's cards do not perform better all around.  Show me proof of this.  You can't, because it isn't true.  Every single ATi card currently available has an nVidia counterpart that performs the same for less money, or better for the same amount of money.  For cards not meant to be gaming cards, they sure were priced like gaming cards.

We can go into nVidia's shady image practices, but then again, those are rare and don't really apply here.  Nvidia's image quality is currently equal to ATi's.

Anyone who says ATi's all around performance is better is, again, either completely ignorant or a fanboy.  Which are you?



springs113 said:


> also, one review does not make or break a card...and w1zzard's method is crazy at times...i asked him to stop doing the aa and af things at higher resolutions, or at least run tests with and without aa, because aa and af really were meant to improve the picture at low resolutions...
> but anyways, one review does not make or break a card.



You're right, one review does not make or break a card; you are welcome to post some other reviews that show differently.  I'm not going to just take your word for it.

AA is used at all resolutions, I use AA and AF no matter what resolution I am gaming at.  And resolution has no effect on the need for AF; it is needed at all resolutions.


----------



## BumbRush (Mar 26, 2008)

aa/af are meant to improve picture quality PERIOD, not just at low res. at 1600x1200 i use 8xQ aa on my 8800gt, whereas my 1900xtx only needed 2x-4x for the same quality.....nvidia drivers suck...:/

and btarunr=nvidiot, saw a quote a while back in somebody's sig where he admitted it........


----------



## BumbRush (Mar 26, 2008)

@newtekie1: check some forums that are dedicated to video playback, you will find that in most reviews ati's cards are better for htpc (video decoding) compared to nvidia's 8 series. the built-in HD decoder and mpeg2 decoder are better and offload more of the work from the cpu; this, along with slightly better image quality in said decoding, leads to them being a better choice for an HTPC that's not primarily used for gaming (who would build an HTPC to game on? )

it's not just about game perf, and you seem to fail to understand that. also the 8600s are not necessarily cheaper than the 2600/3650 cards, and as for the 2400/3450 vs the 8400, the 8400 loses in every way for htpc/mpc use as we all know. any card in that price range isn't for gaming at all, so it's the video decoding abilities that are important.


----------



## driver66 (Mar 26, 2008)

BumbRush said:


> and btarunr=nvidiot, saw a quote a while back in somebodys sig where he admited it........



Your point would be? Where has he ever posted anything near as trollish as this post? Very rude of you; nice start to the forums. Good day, newbie :shadedshu BTW Welcome to TPU


----------



## Black Panther (Mar 26, 2008)

I think this speaks for itself:


----------



## Fhgwghads (Mar 26, 2008)

Eh, just save up and buy a 9800GX2.


----------



## springs113 (Mar 26, 2008)

BumbRush said:


> aa/af are meant to improve picture quality PERIOD, not just at low res. at 1600x1200 i use 8xQ aa on my 8800gt, whereas my 1900xtx only needed 2x-4x for the same quality.....nvidia drivers suck...:/
> 
> and btarunr=nvidiot, saw a quote a while back in somebody's sig where he admitted it........



well, as far as turning them on...if you compare turning on aa/af at lower resolutions to the same at higher ones...you will notice that the difference as you increase resolution is less visible...


----------



## v7100 (Mar 26, 2008)

Is that a dual-slot cooler? Quite good for SFF. Is the EVGA cheaper than the 8800GTS 512MB?


----------



## springs113 (Mar 26, 2008)

newtekie1 said:


> Again, they were not better.  They were no better in an HTPC than an 8600GT.  And no, ATi's cards do not perform better all around.  Show me proof of this.  You can't, because it isn't true.  Every single ATi card currently available has an nVidia counterpart that performs the same for less money, or better for the same amount of money.  For cards not meant to be gaming cards, they sure were priced like gaming cards.
> 
> We can go into nVidia's shady image practices, but then again, those are rare and don't really apply here.  Nvidia's image quality is currently equal to ATi's.
> 
> ...



there ya go again...aa/af... i said there really is no need for them at higher resolutions....
I AM NOT A FANBOY, I AM A *FAN*...a lil tip...i own both amd and intel systems with both nvidia and ati gpus...and i will continue to pursue it this way because i like to have my cake and eat it too...

to the guy that said buy a gx2...typical fanboy...i could afford to buy that crap but it is not worth it...

as for you again...i really don't think that i need to be digging through my old pc favorites folder just to prove one person wrong...every reviewer/techie knows that no two systems are alike even if you have the same components...


----------



## JC316 (Mar 26, 2008)

8600GT was one of the best mid range cards ever made. I had the GTS and at the time I was killing the much more expensive X1950Pro and on par with the X1900XT.

Not a fan boy either, I have had 5 Ati cards and 5 Nvidia cards. On top of that, I did the tests and had the proof to back it up.


----------



## springs113 (Mar 26, 2008)

JC316 said:


> 8600GT was one of the best mid range cards ever made. I had the GTS and at the time I was killing the much more expensive X1950Pro and on par with the X1900XT.
> 
> Not a fan boy either, I have had 5 Ati cards and 5 Nvidia cards. On top of that, I did the tests and had the proof to back it up.



the 8600 on par with atis 1900xt
am i seeing things


----------



## JC316 (Mar 26, 2008)

springs113 said:


> the 8600 on par with atis 1900xt
> am i seeing things



Nope, I was right behind them. I couldn't kill them, but I could keep up. Lost planet, F.E.A.R, and Oblivion to name a few that were only a few FPS behind the XT.


----------



## springs113 (Mar 26, 2008)

JC316 said:


> Nope, I was right behind them. I couldn't kill them, but I could keep up. Lost planet, F.E.A.R, and Oblivion to name a few that were only a few FPS behind the XT.



anyways, once again nvidia is crazy...even though it is one review, the review of the 9800gtx does show that the gx2 becomes even more unnecessary than before...it is on par with the 3870x2 and the 9800gx2...while being around the same price as the 3870x2


----------



## newtekie1 (Mar 27, 2008)

BumbRush said:


> @newtekie1: check some forums that are dedicated to video playback, you will find that in most reviews ati's cards are better for htpc (video decoding) compared to nvidia's 8 series. the built-in HD decoder and mpeg2 decoder are better and offload more of the work from the cpu; this, along with slightly better image quality in said decoding, leads to them being a better choice for an HTPC that's not primarily used for gaming (who would build an HTPC to game on? )
> 
> it's not just about game perf, and you seem to fail to understand that. also the 8600s are not necessarily cheaper than the 2600/3650 cards, and as for the 2400/3450 vs the 8400, the 8400 loses in every way for htpc/mpc use as we all know. any card in that price range isn't for gaming at all, so it's the video decoding abilities that are important.



And what you fail to realize is that the MAJORITY of people buying these cards are buying them not for HTPCs, but for gaming.  There are more people buying dedicated graphics cards for game playing than for video decoding.  You don't even need a graphics card to decode videos; onboard is good enough for EVERYTHING.  So yes, it is all about game performance, because if you are just going for an HTPC application with no gaming, there is no reason to spend any money on a discrete graphics card.


----------



## BumbRush (Mar 27, 2008)

no, i realise quite well that some morons buy them for gaming, but you're wrong about how many ppl are buying them for htpc or better decoding.

onboard 6100 (very common on OEM systems even now) or intel onboard SUCK ASS for video decoding; they use the cpu 100% of the time and quality suffers from it.

on the other hand, if u use an ati HD series you get better quality, and even on a low-end htpc you can watch 720p or 1080p movies perfectly fine. 1080 is a bitch with most onboard video, i know, i have tried it. intel's is THE WORST!!!!!!  nvidia's 6100 and older are better but still suck for video playback.....blah,  and please tone down the rude condescending attitude you are using on me and others, i don't like it.


----------



## btarunr (Mar 27, 2008)

springs113 said:


> if you want to emphasize something...emphasize the point that nvidia's g92 9-series stinks when compared to the g92 8-series. and like i have said to numerous people, i have both an 8800gt and a 3870, so no fanboy here. it definitely looks like you are one of the pawns in this gpu battle...the 2600xt was still better than the 8600gt...and from the jump these cards were not meant to play crysis and other directx 10 games like these manufacturers wanted you to think. there has not been such an increase in performance like the jump from the 8600 to the 9600...that just flat out tells ya that the 8600 stinks...and to back my claim up, the gts 512 and the gt perform very similarly, and the impending gtx, i know, will not seem so superior when compared to the 8800s...its all a wash...
> 
> nvidia is playing a dirty game and it shows in how they destroy each one of their product line-ups...and to push the envelope even further...the 9600 is a watered-down gt...8800gt...and the two are so close in price that they have each other in chokeholds.



You have a low understanding, don't you?   What exactly are the G92 '9' and '8'? They're the same GPU. Exactly which of your G92 "9" cards stink? The 9600 GT is based on the G94, not the G92....the 9800 GTX outperforms the 8800 GTX (and is supposed to be cheaper)....the 9800 GT is just an 8800 GT with higher clocks and two gold fingers for 3-way SLI. There's no difference in the GPUs; it's just that NVidia altered their road-maps and rushed in the G92 to combat what it saw as a threat looming in the form of the RV670. It couldn't rush in the 9 series, so it just rushed in the 8800 GT (which outperforms the HD2900 XT) and the 8800 GTS 512M (which was left peerless in the price category it fell into, ~$350).

No, the HD2600 wasn't better than the 8600 GT; it took high-clocked 512 MB of GDDR4 memory for it to outperform the 8600 GT, and even then it couldn't get close to the 8600 GTS. Well, the jump between the 8600 GT and 9600 GT isn't a 'stinky' one as you make it out to be. It's just 32 SPs to 64 SPs, a 128-bit memory bus to 256-bit....it was a simple, honest one which worked. The 9600 GT was supposed to be 2x faster than the 8600 GT, and it turned out to be so.
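As a rough sanity check on that 2x claim, doubling the SP count and bus width alone doubles the paper specs; factoring in clocks puts the theoretical gap even higher. The shader and memory clock figures below are reference-card values quoted from memory, so treat them as assumptions:

```python
# Back-of-envelope scaling from the 8600 GT to the 9600 GT.
# Assumed reference clocks: shader 1190MHz -> 1625MHz,
# effective memory 1400MHz -> 1800MHz (quoted from memory).
def shader_rate(sps, shader_mhz):
    # relative shader throughput: units x clock
    return sps * shader_mhz

def mem_bandwidth_gbps(bus_bits, effective_mhz):
    # theoretical memory bandwidth in GB/s
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

old = (shader_rate(32, 1190), mem_bandwidth_gbps(128, 1400))  # 8600 GT
new = (shader_rate(64, 1625), mem_bandwidth_gbps(256, 1800))  # 9600 GT

print(f"shader throughput: {new[0] / old[0]:.2f}x")  # ~2.73x
print(f"memory bandwidth:  {new[1] / old[1]:.2f}x")  # ~2.57x
```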

You're not substantiating your bold statements with any good logic/argument. E.g., calling a GPU stinky..."it's all a wash"....etc.

The 8800 GTS 512M was a rushed-in product aimed more at performance superiority than anything else. No, it didn't outperform the 8800 GTX. 


No, NVidia isn't playing a 'game'. They're not like AMD/ATi, who seem to be experts at talking big about things they are about to release, playing the numbers game on the spec sheets, and then churning out bad performers. The 9600 GT is a ~$169 card...irrespective of how it relates to the 8800 GT, reviews already show it to perform better than the competitor's offering, the HD3850.  Quit the fanboyism. I'm not an NV fanboy, though I regard it as a company that 'walks the talk'.


----------



## Swansen (Mar 27, 2008)

all this aside, it's messed up. i'd expect any new generation of cards to be better than any previous-generation card, but since the 8 series, that's not the case.  Which is all really dumb and confusing. my point: if you're going to release a new product, it should be better than what's already available.  I pretty much hate what's going on in the video card department right now; it's all messed up.


----------



## BumbRush (Mar 27, 2008)

btarunr said:


> No, NVidia isn't playing a 'game'. They're not like AMD/ATi, who seem to be experts at talking big about things they are about to release, playing the numbers game on the spec sheets, and then churning out bad performers. The 9600 GT is a ~$169 card...irrespective of how it relates to the 8800 GT, reviews already show it to perform better than the competitor's offering, the HD3850.  Quit the fanboyism. I'm not an NV fanboy, though I regard it as a company that 'walks the talk'.




yes, they are playing the same game they have played for years: the marketing game.

nvidia since the FX line has had blah driver support. they put out drivers fast as betas just so they can keep on top of the benches that websites use for their cards. to nvidia all that matters is the benches; they don't fix long-term known issues, they don't give a shit about anything but looking good in the benches.

u know, if amd/ati were as bad as you imply, they would just make drivers that only support the top games and give the best perf in them for benching, like the crysis beta drivers from nvidia that used cheats to get a 1-2 fps boost in crysis.....what a joke, remove windows and mess up water reflections to get a small boost, and then they still LOST TO AMD!!!!

this from a guy running an 8800gt.


oh, also, is anybody else thinking it's utter bullshit that nvidia has only been updating drivers for the 9600/9800 cards and not the 8800gt/gts cards that use the same core?

i have had to hack the inf files to add support for installing on my card. i shouldn't have to do that, what a bunch of horseshit.

sure, ati/amd's current offerings aren't killing/faster than intel's or nvidia's, but so what; at least for the price they are competitive. most people don't overclock, so b4 anybody says anything about bang for the buck with overclocking, that doesn't count; we may be clockers but 99.9% of pc users aren't.


----------



## btarunr (Mar 27, 2008)

No, NVidia didn't lose to AMD. Benchmarks apart, the fact is that at stock speeds, NVidia's offerings outperform ATI's. So that argument about '99.9% of users aren't overclockers' is null and void. So what if NV doesn't update drivers? At the end of the day NVidia's products play games faster and better than ATI's.


----------



## BumbRush (Mar 27, 2008)

better is relative to perspective tho. you're an nvidiot (as you have admitted); you only care about max fps. and crysis when it came out ran faster on ati cards than it did on nvidia cards, so at the time they did lose, and they tried to fix that with a cheat driver, which didn't really help much.

again, only an nvidiot wouldn't see that. even people i know who are more fond of nvidia than ati (non-nvidiots who prefer nvidia) admit nvidia has driver issues that they never bother to fix because they are too busy trying to get the highest benchmark scores on whatever the day's top epeen games are.

and it matters that they don't update the drivers, because the only reason they don't update them is to try and keep their newer cards looking better than the older ones using the same chips.......what a crock.


i have owned both brands for years and years and years. i was as big an nvidiot as you are, man, but i learned from the fx line, and wish I had waited and gotten a 3870. sure, it would have been slower with aa cranked, but at least the IQ is better per setting, and ati updates the drivers each month WHQL instead of a slew of betas that don't support any but the newest cards........


the fact is that better and worse are relative to what you're using the card for. if you're using it for videos/movies, then ati; if you're just gonna game, then nvidia (lower-end cards i mean) 

nvidia's g92 improved on the hd support, but still, from my exp the 3800 cards' quality is a bit higher. it's why i wish i had waited and gone that route. either way, i paid too much 

meh, whatever man, i'm not gonna argue about this anymore. no point trying to get an nvidiot to see that nvidia isn't always right and the best, same as mac freaks or intel fanbois who stuck with intel despite the p4 sucking arse.

oh, almost forgot, was gonna say this earlier: to me min framerate is more important than max. i would rather have a card whose max is 65fps that never drops below 30 in a game than a card whose max is 160 but drops into the 20s in the same games. smooth, consistent gameplay is more important to me than seeing 160fps or the like when things are going well, and in most benches i have seen, the min fps on the 3800s was higher than the min on the 8800s (g92s) in the same price brackets.
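That minimum-versus-maximum point is easy to make concrete: two frame-time traces can share the same average fps while one of them stutters badly. A toy sketch with made-up numbers:

```python
# Two made-up frame-time traces (ms per frame) with the same average fps
# but very different worst frames.
def fps_stats(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000 / max(frame_times_ms)  # fps during the slowest frame
    return avg_fps, min_fps

steady = [16, 17, 16, 18, 17, 16]  # consistent ~60 fps
spiky = [8, 8, 8, 8, 8, 60]        # fast bursts, one bad stall

for name, trace in (("steady", steady), ("spiky", spiky)):
    avg, low = fps_stats(trace)
    print(f"{name}: avg {avg:.0f} fps, min {low:.0f} fps")
# steady: avg 60 fps, min 56 fps
# spiky: avg 60 fps, min 17 fps
```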

Oh yeah, and 3dmark/epeenmark means jack shit. i can trick 3dmark into giving far higher scores with some simple little tweaks here and there to the system, but then game perf/quality suffers, so it's really pointless to do those tweaks.....3dmark = pretty little tech demo that makes a good stability test for ur system after some overclocking


----------



## blkhogan (Mar 27, 2008)

I really do enjoy these threads. I learn something new every time. I'm like a sponge....please continue


----------



## BumbRush (Mar 27, 2008)

sponge.....


----------



## trt740 (Mar 27, 2008)

Wile E said:


> No 3 phase power? I'd still rather have my Palit 1GB over this.



not to be stupid, but how can you tell if it is three-phase?


----------



## Wile E (Mar 27, 2008)

trt740 said:


> not to be stupid, but how can you tell if it is three-phase?



I'm not 100% certain, but I would imagine they would advertise a feature like that. Both Palit and GB do.


----------



## btarunr (Mar 27, 2008)

BumbRush said:


> better is relative to perspective tho. you're an nvidiot (as you have admitted); you only care about max fps. and crysis when it came out ran faster on ati cards than it did on nvidia cards, so at the time they did lose, and they tried to fix that with a cheat driver, which didn't really help much............



What do you need apart from fps? The game to crawl? And no, please, no 'ATI haz better image quality', I won't buy that crap. So what if they don't regularly roll out drivers? Sure, NVidia's drivers DO have bugs, but they don't cause the game to stutter, lag, or drop fps, etc. They still perform better than the competition, so why all this "NVidia is bad" drama? They don't make things that blow up. It's just that they make real sure that for the $220 of hard-earned money you pay, you get an 8800 GT that lets you play all of today's games with decent LOD. 

If you think you can intimidate me by sig-quoting me, bad news: I'm not one bit intimidated. At least I'm not ATIncompetent.


----------



## candle_86 (Mar 27, 2008)

First off, i'm an 8600 user; i've had both actually. 

First off, the 2600XT ties the 8600GT in games and benchmarks, but the 2600XT can never hope to clock as high. period, the end, so drop it.

As for HTPC use, no one buys the XT for that purpose. face it, the thing is noisy, and HTPC users look for silent solutions, aka passive cards; this would be the 8400, 8500, 8600GT, and 2400, 2600pro cards. These can be done passively. as for quality, the differences everyone likes to quote, when actually looked at, you will never notice unless you are looking very closely.

As for Nvidia drivers, i have no idea where you get your flawed ideas, but let's go back to the GeforceFX, shall we, because you're now in my territory and it's time you got a history lesson.

First off, the FX wasn't the best card, granted, and the drivers were poor at launch, but over the FX's lifetime performance improved by nearly 70% without any IQ loss at all. Everyone is fully aware Nvidia sets their drivers to quality and not max quality, but they have done that since the Riva 128, so i can promise everyone knows that. While the R300 was faster, it too did not have the best drivers in the world at launch, and it also got canned for 3dmark03 cheating in 2003; both Nvidia and ATI were at fault for that. 

Fast forward if you will now. ATI has a 320 stream processor card, yet it performs like a 64 stream card. care to explain that to me?

I can explain that to you very easily, and it will also explain why the rest of ATI's cards perform so badly despite having these killer specs. Each processor is divided into groups of 5, but only 1 of these can do complex work; the others are for basic shader work and math. but that's not the problem, as most shaders can be handled by the other simple units and the math unit. Yet ATI has failed to release a driver that can properly use its thread dispatch processor, which is why in reality most of those shaders are dormant and will never be used. You can criticize Nvidia all you want, but at least they can use all their processors, and their dispatch unit works right.
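The utilization argument above can be sketched numerically: if the 320 processors really sit in 5-wide groups and the scheduler only fills one slot per group, the card behaves like a 64-SP part (illustrative model only, not measured data):

```python
# Toy model of the VLIW-style claim above: 320 stream processors
# arranged as 64 groups of 5. Effective throughput scales with how many
# of the 5 slots the driver/compiler fills per group each cycle.
TOTAL_SPS = 320
GROUP_WIDTH = 5
groups = TOTAL_SPS // GROUP_WIDTH  # 64 independent groups

for slots_filled in (1, 3, 5):
    print(f"{slots_filled}/5 slots filled -> acts like {groups * slots_filled} SPs")
# 1/5 slots filled -> acts like 64 SPs
# 3/5 slots filled -> acts like 192 SPs
# 5/5 slots filled -> acts like 320 SPs
```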

As for naming, were you aware the 3870 ties the 2900XT, and the only change is the DirectX support of the card? it's an R600 with DirectX 10.1, yet it warrants a new generational name. Yet here NVidia is, and their 9800GTX, from early benchmarks, shows it can outperform an 8800 Ultra, so there is a performance increase there, unlike ATI. 

Before you go around spouting nonsense, i suggest you read up a little and become familiar with video cards, because this isn't 1999 and they have gotten more complex than 2 pipelines with 2 rops. so do us all a favor and learn before you open your mouth; we really don't like it when new members come around and act like foolish children. We are all very knowledgeable about graphics and computers in general, and i can speak for all of us, i think, and politely tell you: do not speak if you have no idea what you are talking about.

And one other thing: do not insult the members, please. it only makes the mods mad, and you will get banned and will never learn anything.

this is not directed towards btarunr in any way, but to the two ATI nutjobs running around


----------



## Wile E (Mar 27, 2008)

I haven't seen it mentioned yet, but one of the biggest reasons the 2600XT makes a great HTPC card, besides video decoding, is the built-in audio.

And there was more than one passively cooled 2600XT made.


----------



## newtekie1 (Mar 27, 2008)

BumbRush said:


> no, i realise quite well that some morons buy them for gaming, but you're wrong about how many ppl are buying them for htpc or better decoding.
> 
> onboard 6100 (very common on OEM systems even now) or intel onboard SUCK ASS for video decoding; they use the cpu 100% of the time and quality suffers from it.
> 
> on the other hand, if u use an ati HD series you get better quality, and even on a low-end htpc you can watch 720p or 1080p movies perfectly fine. 1080 is a bitch with most onboard video, i know, i have tried it. intel's is THE WORST!!!!!!  nvidia's 6100 and older are better but still suck for video playback.....blah,  and please tone down the rude condescending attitude you are using on me and others, i don't like it.



No, people are not buying these for HTPCs.  ATi's cards are loud on top of everything, and people don't want loud cards in their HTPCs.  And no, current onboard is not as bad as you make it out to be.  Yes, with onboard the CPU has to work harder to do things, but even weak current-gen CPUs are capable of outputting 1080p and decoding even high-end HD video.  Hell, my old 7300LE that is in my HTPC connected to my 1080p TV outputs HD video just fine paired with a shitty Celeron D @ 4GHz.  There is no quality loss from using the CPU only; the only "quality" lost is the BS after-effects that ATi and nVidia apply to the video to make it "brighter" and "shinier", which is again complete bullshit.  If the movie company wanted to make the movie shiny, they would have.  ATi just applies a filter to make the movie look shiny, and the natural human response is that shiny means better.  It is less natural looking, and I don't like it.

The fact is you don't know what you are talking about.  Current-generation onboard solutions offer full HDCP support via HDMI, and even the Intel solution is more than powerful enough to run full HD video off of.  Yes, it uses the CPU more; however, it doesn't affect quality, and if it is a true HTPC you don't need that CPU power for anything else, so it doesn't matter.

And I really don't care how you like it.  Don't come here and spread BS, and you won't get attacked.



BumbRush said:


> yes, they are playing the same game they have played for years: the marketing game.
> 
> nvidia since the FX line has had blah driver support. they put out drivers fast as betas just so they can keep on top of the benches that websites use for their cards. to nvidia all that matters is the benches; they don't fix long-term known issues, they don't give a shit about anything but looking good in the benches.
> 
> ...



Nvidia doesn't need to play the benchmark game when they are destroying the competition.

They release beta drivers for testing; they use the beta system the way it is supposed to be used. WHQL certification is complete BS and useless anyway.

Nvidia's driver support has been rather poor in the past, but they have fixed that now, and the present is all that matters. Using a hacked INF file is only needed when you are installing a driver that nVidia hasn't tested with your card. They don't do it to screw people over like the ATi fanboys say; they do it for protection. If they haven't tested a driver with every card, they don't want inexperienced people installing it and then not knowing how to fix the problems it might cause. It is really a simple concept to understand, and using a hacked INF is not a big deal.

I would still like you to prove your claims. Show me how ATi's current offerings are competitive for the price, when nVidia has virtually every card ATi has on the market beat in both price and performance.


----------



## BumbRush (Mar 27, 2008)

newtekie1 said:


> No, people are not buying these for HTPCs, and ATI's cards are loud on top of everything; people don't want loud cards in their HTPCs. And no, current onboard video is not as bad as you make it out to be. Yes, with onboard the CPU has to work harder, but even weak current-gen CPUs are capable of outputting 1080p and decoding high-bitrate HD video. Hell, the old 7300LE in my HTPC, connected to my 1080p TV and paired with a lowly Celeron D @ 4GHz, outputs HD video just fine. There is no quality loss from using the CPU alone; the only "quality" lost is the BS after-effects that ATI and NVIDIA apply to make the video "brighter" and "shinier", which is again complete BS. If the movie studio wanted the movie to look shiny, they would have made it that way. ATI just applies a filter to make the movie look shiny, and the natural human response is that shiny means better. It is less natural looking, and I don't like it.
> 
> The fact is you don't know what you are talking about. Current-generation onboard solutions offer full HDCP support via HDMI, and even the Intel solution is more than powerful enough for full HD video. Yes, it uses the CPU more, but that doesn't affect quality, and if it is a true HTPC you don't need that CPU power for anything else, so it doesn't matter.
> 
> ...



First off, yes, a lot of people do get the XT for HTPCs. I have sold more than my fair share of them for that purpose, and not all of the XTs are loud; in fact many come with nearly silent coolers that you can't hear once the case is on, even if you try.

Second, not all HTPCs are "silent". Most make a little noise if you put your head directly against them, but once they are installed in an entertainment center you can't hear it, just as you can't hear the actively cooled EQs and amps many people's systems have.

Third, most onboard video sold today in small PCs is not the top-end onboard you can get now. The 690/740/780 or 7050 are good, but most of it is older Intel, NVIDIA, or ATI silicon based on the X300, 6100, or Intel GMA 950. Those solutions, though fine for web surfing and business apps, are not meant for video playback, and definitely not for gaming, so performance and quality suffer. Many of the lower-end systems people still buy for use as an "HTPC" or media PC use low-end Sempron, Celeron, or Pentium chips. Those chips can decode 1080p video, but it is hard on them with those older onboard chips, because the CPU has to decode the video fully, with no GPU assistance, while also decoding the audio. I have seen systems with the 6100 and older Intel GMA chipsets lose sync between video and audio because CPU usage was too high; slap in a cheap 2400 or 2600 card and the same system becomes 100% smooth.

Now, about HDCP and HDMI: until this last generation of onboard graphics, nobody fully supported HDMI/HDCP, and NVIDIA still uses an external wire for audio even on their top cards. ATI/AMD have the audio on-chip, no extra wires, and the system load is nil because the GPU processes both the audio and the video.

As to NVIDIA's driver quality, it has gotten better since the FX and early 6-series days, but it is still nonsense that they don't fix long-term known bugs because they are too busy getting three more fps in Crysis.

Now, you talk about drivers not being tested with a given card. The GPU is the same silicon across the newer 8800 GS/GT/GTS, so there should be no real testing issue, and as I have found on a lot of other forums, even G80 users have no problems installing those "untested" drivers with modded INF files; I saw a nice performance boost myself. That is why I think part of the reason you aren't seeing them for the 8800 GT/GTS is that NVIDIA doesn't want their new top-end card to look worse than it does now; they want to make sure it looks as fast as possible in all the top games compared to their older G92-based cards.

NVIDIA has done dirty tricks like this before, but never quite to this degree. It reminds me more and more of ATI back in the Rage 128 days.


----------



## newtekie1 (Mar 27, 2008)

1st, I'm sure people do buy them for HTPCs. But that isn't the intended market, and that isn't what the majority of people are buying them for. I know not all of them are loud, but most are.

2nd, I know most HTPCs aren't totally silent; nothing with a fan will ever be totally silent. However, people don't want leaf blowers in their HTPCs, and that is what the HD3650/2600XT was. There are some exceptions, but generally speaking they were too loud to put in an HTPC without replacing the cooler with aftermarket parts.

3rd, most onboard video sold today handles HTPC applications just fine. You can argue this point until you are blue in the face, and it won't change the fact that you are wrong. You don't need a discrete graphics card to play HD content. And if you want one, there is no reason to go with the HD2600XT/3650 when a cheaper card will do just as well; the only reason to go with either of those cards is the added gaming performance. Anything above an HD2400 Pro or 8400GS is overkill, and the only reason to go higher is gaming performance. The higher-end cards offer no other gains, which is why your point is wrong.

The driver issue is another weak fanboy argument. NVidia gets beta drivers out more often than ATi gets their drivers out. Personally, I would rather have beta drivers released every week like nVidia's than have to wait a whole month for fixes for the latest games.

As for major issues not being fixed, I would like to know exactly what issues you are talking about.

Yes, nVidia doesn't include support for every single card in every single driver release. Using a hacked INF is not that big of a deal; a pain, but not a big deal. Again, the reasoning behind this is sound. They don't want to test every single beta driver with every single card, and they don't want to deal with the slew of support issues that would arise if every non-technical person installed every beta driver released. If they haven't tested a driver with a card, they won't support that card. It really is that simple; no conspiracy here.


----------



## BumbRush (Mar 27, 2008)

candle_86 said:


> First off, I'm an 8600 user; I've actually had both.
> 
> The 2600XT ties the 8600GT in games and benchmarks, but the 2600XT can never hope to clock as high. Period, the end, so drop it.
> 
> ...


False. I have personally sold and installed many 2600XTs in HTPCs at the customer's direct request, and no, I didn't "sell" them on the 2600; they came in saying "I want a 2600 in the system because it's better for videos". Some also got the 2600XT for the ViVo features that NVIDIA cards do not offer.

I get my "flawed ideas" from the fact that I am an 8800GT owner and have owned pretty much every series of card NVIDIA has ever made, even their old NV1 (an utter POS), and from the fact that since the FX line they haven't fixed long-term known issues because of their quest to be king of the hill in whatever the day's top benchmark game is.

This I can personally call BS on: the 70% boost was not due to drivers at all. They made a totally redesigned GPU that was used in the 5700; it performed better, while their older cards and chips still failed utterly at anything above DX8 (OK, they were good in older OpenGL games too).

As for "no quality loss", just do some googling. The FX cards were crippled with driver updates that ruined quality. I can personally testify to this because I had a 5800 Ultra as well as a 5900XT. They were bad, and you couldn't even hack the drivers to restore full image quality, because NVIDIA hard-coded a lot of the "tweaks" into the FX driver files to get extra performance in benchmarks. In all honesty it is what drove me away from NVIDIA. I was as big an nvidiot as btarunr, newtekie1, and you, until a buddy who had just gotten hold of an unlockable 9500 (a 9700 after unlocking) gave me a low-end 9600. I didn't want to use it because I remembered how badly ATI's drivers sucked for the Rage 128 under any NT-based OS, but I tried it because I was so frustrated with the DX9 performance and image quality of my overpriced NVIDIA cards, and I was shocked: in all newer games, a 9600 that cost a quarter of the price was faster. That alone wasn't enough to move me fully to ATI, at least at first, but after some gaming I compared screenshots from the 9600 against the 5800 and 5900, and the IQ difference was clear. Explosions and particle effects looked totally different: no dithering on ATI, no blurry textures. It just looked right, the way my GeForce4 Ti 4400 had looked.

Yes, I know it ties in gaming performance, but it has better video decoding and HDMI support. Check around: playback performance went way up, just as decoding on the G92 improved versus the G80 line. The 38x0 cards also have other small tweaks beyond the die shrink. And my point, unlike some other people's, was this; let me split it up for easier understanding.

ATI never sold the new core as a 2900-series card. It was renamed to distinguish it from the 2900 and its bad reputation, and that also makes it easier for buyers to be sure they are getting a new core, not an old one. That makes sense.

NVIDIA put the G92 into 8800 GT/GS/GTS cards, then put the same chip into the 9600 and 9800, and this is where I call foul. If they wanted to call them the 9 series, they should have called all of them the 9 series, not use the same chip with a different number of disabled shaders/pipes/whatever and call one an 8800 and another a 9800. Honestly it is nonsense, and most people can see what I mean.

As to "nvidiot": btarunr admitted he's one; check the link in my sig (I found it in somebody else's sig and copied it).

I'm not a fanboy, though you seem to think I am; I just have a different perspective from you. I see that gaming performance isn't the only reason to buy a card, and that sometimes other things matter more to the buyer or client. Until you have worked in computers as long as I have (13 years doing this for money), you won't have the proper perspective on what matters in different markets. It's not all about benchmark or Crysis scores; to be honest, at times I wish it were, since that would at least make it easy to say "this is the best card for X dollars". As things are, you have to know how the cards actually work in different situations and uses: video playback, HDMI/HDCP, video decoding, video in/out, the list goes on. Some features ATI cards offer, NVIDIA just doesn't, like ViVo (video in/video out). If somebody wants those features, they have to go ATI or buy another card, and in many mATX systems adding another card isn't a viable option because the system has only one or two usable expansion slots.



btarunr said:


> What do you need apart from fps? The game to crawl? And no, please, no "ATI has better image quality"; I won't buy that crap. So what if they don't regularly roll out drivers? Sure, *NVidia's drivers DO have bugs, but they don't cause games to stutter, lag, drop fps, etc.* They still perform better than the competition, so why all this "NVidia is bad" drama? They don't make things that blow up? Just that they make real sure that for the $220 of hard-earned money you pay, you get an 8800 GT that lets you play all of today's games with decent LOD.
> 
> If you think you can intimidate me by sig-quoting me, bad news: I'm not one bit intimidated. At least I'm not ATIncompetent.


See above. I will say it again: fps doesn't matter if your primary use isn't gaming. If you buy the card for a media PC/HTPC where all that matters is video quality and playback, game fps doesn't come into the picture. If you want ViVo features, NVIDIA is not an option because they don't offer them; you would need to buy another card, and as I said above, many of the small PCs people choose for media systems don't have enough expansion slots. Take a look at those Dell mini desktops: stick a video card in and you have one or two usable slots at best, and many people using one as a pure media playback device would add a third-party sound card, because those Dell boxes' onboard sound is still utter junk (worse than any home-built's onboard sound I have seen in the last four or five years).

*But what if you can't game because you can't even get the system stable due to a driver bug NVIDIA has known about since 2003? Namely the bug that causes x64 Windows and Server 2003 (what x64 Pro is based on) to crash as soon as you try to open the NVIDIA control panel. This is a known issue in NVIDIA's bug database; they just don't bother to fix it. Their advice: "reinstall till it works" and "try slipstreaming a driver into your Windows disc". Nobody should have to do that; most people wouldn't even know what slipstreaming is, let alone how to do it.*

Both companies have their faults. NVIDIA's fault, and its nvidiots', is that they care only about game benchmark performance and nothing else: not stability on certain Windows versions, not image quality, not video playback performance, quality, or bugs, just benchmark and Crysis scores. ATI's is that they took a bad path with the 2900/3800 cores, which either don't have optimal driver and game support or simply need a redesign. Oh, and they need to fix scaling of old games on widescreen monitors.

I fully know that the current top two each have their share of problems, but for my money, as a tech I would rather deal with ATI's cards and drivers than NVIDIA's in most situations. Sure, they don't get the same uber-high max fps, but at least I don't have to deal with crashes on x64 Pro or the video-rendering bugs that plague the 8800 drivers (it actually affects many cards using the same driver revisions as the 8800 as well).

I'm on an 8800GT. It's nice, but it has its bugs, and it can frustrate the hell out of me.

Hey, let's all just agree on this: at least we aren't stuck with VIA/S3 and the like.


----------



## btarunr (Mar 27, 2008)

BumbRush said:


> I'm on an 8800GT. It's nice, but it has its bugs, and it can frustrate the hell out of me.



Oh the irony.


----------



## BumbRush (Mar 27, 2008)

newtekie1 said:


> 1st, I'm sure people do buy them for HTPCs. But that isn't the intended market, and that isn't what the majority of people are buying them for. I know not all of them are loud, but most are.
> 
> 2nd, I know most HTPCs aren't totally silent; nothing with a fan will ever be totally silent. However, people don't want leaf blowers in their HTPCs, and that is what the HD3650/2600XT was. There are some exceptions, but generally speaking they were too loud to put in an HTPC without replacing the cooler with aftermarket parts.
> 
> ...



I need to direct you no further than Newegg's computer section: check out the prebuilts and see the onboard video most of them use. It is not HDCP/HDMI compliant at all; it's mostly older NVIDIA/ATI chipsets on AMD systems and Intel's ancient GMA on Intel systems. Hence you do need a third-party card if you want proper HDMI support.

http://www.newegg.com/Product/Produ...Deals&SubCategory=10&StoreType=2&N=2032280010



> lenovo 3000 J200(969084U) Celeron M 420(1.60GHz) 512MB DDR2 80GB Intel GMA 950 Windows Vista Home Basic - Retail
> Audio: Sound card - Integrated
> Cache Per Processor: 1MB L2 Cache
> Ethernet: Realtek 10/100 Ethernet
> ...



Just some examples; most of those do not have HDMI support. Possibly some of the 1200/1250 units do, but that's very iffy.

I could have copied all 100 listings from the first page, but that would have been excessive. So yeah, you're full of it when you claim all computers sold today have onboard video that fully supports HDMI/HDCP.

Only the ones with 1250 graphics can support HDMI, and only if the vendor actually put that feature on the board; otherwise you're out of luck.


----------



## BumbRush (Mar 27, 2008)

btarunr said:


> Oh the irony.



It's ironic that I got this card because I had another one die, and then a few weeks later out came the 3870, cheaper (though I paid MSRP for this, not the $320+ people were paying online).

Really, I have personally set up and compared the 3870 and 8800GT side by side in the same games at 1600x1200; unless you crank the AA, you can't tell the difference.

And ATI/AMD need half the AA setting to get the same quality, which drives me crazy: at 16x AA it's the same as my ATI card was at 8x, at 8xQ AA it's the same as my ATI card was at 4x, and so on. Something I still haven't seen benchmarked, though, is ATI's temporal AA modes used instead of old-school AA.

Maybe somebody could get wiz or even Wile E to bench his cards side by side using temporal AA instead of normal AA mode.

Use the T2 and T3 modes at 2x and 4x. T2 multiplies the effective AA setting by 2 and T3 by 3, so a 2x setting becomes an effective 4x with T2 and 6x with T3, and a 4x setting becomes 8x with T2 and 12x with T3. You do need to keep a steady fps above a threshold (you can set the number manually) for temporal AA to work, but in my experience at 1600x1200 with a 3850/3870 you never need more than 2x T2 AA in any game you run across, and they are all playable. The one exception may be Crysis, that heavily unoptimized game!
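The multiplier arithmetic described in the post above can be sketched in a few lines of Python. This is just an illustration of the effective-sample math as the poster describes it; the mode names and multipliers come from the post, not from any ATI driver API:

```python
# Approximate effective AA level under ATI's temporal AA, per the post above:
# the driver alternates sample patterns across frames, so T2 at a 2x base
# setting approximates 4x AA, and T3 approximates 6x.
TEMPORAL_MULTIPLIER = {"off": 1, "T2": 2, "T3": 3}

def effective_aa(base_samples: int, mode: str = "off") -> int:
    """Return the approximate effective AA level for a base setting and mode."""
    return base_samples * TEMPORAL_MULTIPLIER[mode]

for base in (2, 4):
    for mode in ("T2", "T3"):
        print(f"{base}x + {mode} -> ~{effective_aa(base, mode)}x")
```

Running it prints the same pairings the post lists: 2x becomes ~4x with T2 and ~6x with T3, and 4x becomes ~8x and ~12x.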


----------



## newtekie1 (Mar 27, 2008)

BumbRush said:


> I need to direct you no further than Newegg's computer section: check out the prebuilts and see the onboard video most of them use. It is not HDCP/HDMI compliant at all; it's mostly older NVIDIA/ATI chipsets on AMD systems and Intel's ancient GMA on Intel systems. Hence you do need a third-party card if you want proper HDMI support.
> 
> http://www.newegg.com/Product/Produ...Deals&SubCategory=10&StoreType=2&N=2032280010
> 
> ...



Yes, and you still failed completely to address my points.

1.) Integrated is enough. Paired with even a low-end processor, it can output HD content just fine.

2.) If you want HDCP, have no interest in gaming, and only want to take some load off the CPU, then either the HD2400 Pro or the 8400GS will do the job, the 8400GS being cheaper ($35) than the HD2400 Pro ($37). The built-in audio on the HD2400 Pro is no better than onboard sound either, so don't even try to say it is worth anything.

3.) The only reason to go higher than these two cards is gaming performance; any higher card will do the same tasks as the lower cards, just with better gaming performance, and nVidia offers better gaming performance. Why would anyone upgrade to these cards if they were not looking for gaming performance?

And just because I don't want to quote your whole previous post: no, ATi does not have better IQ, and no, it does not take half the AA on an ATi card to get the same quality as on an nVidia card.

http://sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-15.html

Read that for a nice image comparison between ATi's HD series and nVidia's 8800 series. The conclusion: ATi is slightly better than nVidia at the highest possible settings, but only when you look really, really close.

I'm done with this argument. You have failed to back up your points in any meaningful way, and you just keep talking in circles. You make long posts to try to make it seem like you are saying a lot, when you are really saying nothing.


----------



## BumbRush (Mar 27, 2008)

That's an ancient article, man, and it's one source. Ask around among people who have both NVIDIA and ATI cards to compare, as I do. Hate to tell you this, but ATI's AA is better per setting than NVIDIA's, period; only an nvidiot would be unwilling to admit that.

And yes, the ATI onboard audio is worth noting, because it works over HDMI without any external wires or use of the computer's onboard sound, hence it's better than the 8400 (which doesn't support HDMI audio at all). So your "point" is just more NVIDIA fanboyism.

Why do you insult and call anybody who says anything bad about NVIDIA a fanboy, yet insist you're not a fanboy/nvidiot for running around badmouthing ATI every chance you get?


----------



## Wile E (Mar 28, 2008)

newtekie1 said:


> The built in audio on the HD2400Pro is no better than onboard sound either, so don't even try to say it is worth anything.


I gotta disagree with you here. Ease of use/simplicity of hookup. It's certainly no worse quality-wise than using the onboard, but it is a hell of a lot easier to deal with.


And I hate to sit on the fence on the quality issue, but neither has better quality. Both just render things subtly differently. Which looks better is purely a matter of preference.


----------



## newtekie1 (Mar 28, 2008)

Wile E said:


> I gotta disagree with you here. Ease of use/simplicity of hookup. It's certainly no worse quality-wise than using the onboard, but it is a hell of a lot easier to deal with.



A valid point indeed.


----------



## candle_86 (Mar 28, 2008)

BumbRush said:


> False. I have personally sold and installed many 2600XTs in HTPCs at the customer's direct request, and no, I didn't "sell" them on the 2600; they came in saying "I want a 2600 in the system because it's better for videos". Some also got the 2600XT for the ViVo features that NVIDIA cards do not offer.
> 
> I get my "flawed ideas" from the fact that I am an 8800GT owner and have owned pretty much every series of card NVIDIA has ever made, even their old NV1 (an utter POS), and from the fact that since the FX line they haven't fixed long-term known issues because of their quest to be king of the hill in whatever the day's top benchmark game is.
> 
> ...




Quite frankly, if I want ViVo I'll use my Hauppauge HD TV tuner card; it does the trick a hell of a lot better. My PC doubles for media playback, but guess what: I use an HD TV decoder card designed for MPEG-4 and MPEG-2 decoding. That is how I watch media on my computer. As for DVD playback, I also do not use my DVD drive; I use a DVD player, a 10-disc DVD changer to be exact. My computer lets me watch cable and play my movies; heck, I even have a VCR hooked up to it, because VHS looks better.


----------



## newtekie1 (Mar 28, 2008)

On the VIVO issue, nVidia did offer it, up until the 8000 series actually; the 7 series had VIVO. Now nVidia doesn't, and the reason is simple: it is much better to buy a cheap TV tuner card and get better results than to buy a video card with VIVO. VIVO on video cards is dead; ATi just missed the memo. It never worked acceptably, it was a pain in the ass to use, and there were far better solutions out there.

You can literally get a TV tuner for under $20, and it will work better than any VIVO video card. Even ATi has realized this and stopped including VIVO; the only two cards that still actually have it are the 2900XT and 3870, and only very specific models made by Diamond.

I've also asked him to tell me what these "long term bugs" were, and he failed to do so, so one can only assume he is full of it when he talks about them. Just add the fanboy to your ignore list and move on. Stating something as a point in an argument is one thing, but failing to back it up even after being asked usually means your point is BS.


----------



## Valorumguygee (Mar 30, 2008)

*So the question I have is..*

Should I buy the 8800GT 1GB?

From the list on the first page, it seems that at the highest settings in the biggest games it makes a significant difference in some places, but overall it seems to be behind the 512MB by a couple of fps. Should I take this as a sign that the 1GB could have a much longer life than the 512MB, because it can handle the much beefier games coming over the next few years? Or is it smarter to buy one of the GTX or GTS cards?

Gotta buy something today 

-Chris


----------



## [I.R.A]_FBi (Mar 30, 2008)

You get the benefit of 1GB when you overclock the card.


----------



## BumbRush (Mar 30, 2008)

I assume newtekie is still ranting about me; too bad for him I have had him on my ignore list since he's such an nvidiot.

As to ViVo, I have had a lot of clients who need ViVo on their video card, because an HTPC/media PC with only one or two usable expansion slots means they can have either a NIC (wireless, most of the time) or a video card (needed for most media PCs, whether some people believe it or not). The best solution, of course, would be a way to avoid add-in cards altogether, but that day hasn't quite come yet; newer onboard video is just about there. The problem is that most people use old work boxes (recycled computers) to build their media PCs, and these units are very limited in what you can add: you can either add a video card and wireless, or a video card and a tuner card. Yes, you can use a USB tuner, but those don't have all the inputs some people need.

My last client who needed a ViVo video card really wanted an All-In-Wonder, because he needs S-Video and RCA (composite) inputs for some of the devices he uses. We couldn't find an AIW card, so I managed to find him a 2600 card with ViVo that had the two main inputs he needed, then got a tuner card and a very pricey coax input adapter to go with a tuner card he had already bought. It works, but an AIW would be better. He couldn't bring his old AIW into the new system because the system had zero PCI slots (it was a PCI Rage-based AIW card). The Dell box we set up as his media editing/capture machine couldn't take most cards, due to the lack of slots and the difficulty of finding half-height cards that would work (I hate those Dells... HATE). He got the system from his office, stuck a Pentium D in it, and it works fine for what he wants to do, but it has only two expansion slots, one PCI-E x16 and one x1; a horrible design, but that's what a lot of people end up with when they recycle computers. (The system is about nine months old; his company replaced all its Dells with another company's products due to a dispute over on-site support.)


----------

