Wednesday, March 26th 2008
EVGA Launches e-GeForce 8800GT AKIMBO with 1024MB DDR3 Memory
Official NVIDIA partner EVGA has released a new e-GeForce 8800 GT 1024MB, a G92-powered card that is part of the AKIMBO line-up. The AKIMBO cards feature a dual-slot cooling system that resembles the stock 8800 GTS 512MB cooler, and strangely enough all AKIMBO cards run at standard clocks. The new 1GB GeForce 8800 GT model has 112 stream processors set to 1500MHz, a GPU clocked at 600MHz and GDDR3 memory at 1800MHz. The card is in stock now for $299.99.
Sources:
EVGA, TechConnect Magazine
55 Comments on EVGA Launches e-GeForce 8800GT AKIMBO with 1024MB DDR3 Memory
Onboard 6100 (very common on OEM systems even now) or Intel onboard graphics suck for video decoding; they use the CPU 100% of the time and quality suffers for it.
On the other hand, if you use an ATI HD series card you get better quality, and even on a low-end HTPC you can watch 720p or 1080p movies perfectly fine. 1080p is a pain with most onboard video, I know because I have tried it, and Intel's is the worst. NVIDIA's 6100 and older are better, but still poor for video playback. And please tone down the rude, condescending attitude you are using with me and others; I don't like it.
No, the HD2600 wasn't better than the 8600 GT; it took a high-clocked 512 MB of GDDR4 memory for it to outperform the 8600 GT, and even then it couldn't get close to the 8600 GTS. And the jump between the 8600 GT and 9600 GT isn't a 'stinky' one however you look at it: it's 32 SPs to 64 SPs, a 128-bit memory bus to 256-bit. It was a simple, honest step that worked. The 9600 GT was supposed to be twice as fast as the 8600 GT, and it turned out to be so.
You're not substantiating your bold statements with any good logic or argument, e.g. calling a GPU 'stinky', "it's all a wash", etc.
The 8800 GTS 512MB was a rushed-in product aimed more at performance superiority than anything else. No, it didn't outperform the 8800 GTX.
No, NVIDIA isn't playing a 'game'. They're not like AMD/ATI, who seem to be experts at talking big about things they are about to release, playing the numbers game on the spec sheets, and then churning out bad performers. The 9600 GT is a ~$169 card; irrespective of how it relates to the 8800 GT, reviews already show it performing better than the competitor's offering, the HD3850. Quit the fanboyism. I'm not an NV fanboy, but I regard it as a company that walks the talk.
NVIDIA has had mediocre driver support since the FX line. They push drivers out fast as betas just so they can stay on top of the benchmarks that websites use for their cards; to NVIDIA all that matters is the benches. They don't fix long-standing known issues; they don't give a damn about anything but looking good in the benchmarks.
You know, if AMD/ATI were as bad as you imply, they would just make drivers that only support the top games and give the best performance in them for benching, like the Crysis beta drivers from NVIDIA that used cheats to get a 1-2 fps boost in Crysis. What a joke: remove windows and mess up the water reflections for a small boost, and then they still lost to AMD!
This from a guy running an 8800 GT.
Also, is anybody else thinking it's utter nonsense that NVIDIA has only been updating drivers for the 9600/9800 cards and not the 8800 GT/GTS cards that use the same core?
I have had to hack the INF files to add support for installing them on my card. I shouldn't have to do that; what a bunch of horseshit.
Sure, ATI/AMD's current offerings aren't killing or faster than Intel's or NVIDIA's, but so what; at least they are competitive for the price. Most people don't overclock, so before anybody says anything about bang-for-the-buck with overclocking: that doesn't count. We may be overclockers, but 99.9% of PC users aren't.
Again, only an nvidiot wouldn't see that. Even people I know who are more fond of NVIDIA than ATI (non-nvidiots who prefer NVIDIA) admit NVIDIA has driver issues that never get fixed, because they are too busy chasing the highest benchmark scores in whatever the day's top e-peen games are.
And it matters that they don't update the drivers, because the only reason they don't is to keep their newer cards looking better than the older ones using the same chips. What a crock.
I have owned both brands for years and years. I was as big an nvidiot as you are, man, but I learned from the FX line, and I wish I had waited and gotten a 3870. Sure, it would have been slower with AA cranked up, but at least the IQ is better per setting, and ATI updates its drivers with a WHQL release each month instead of a slew of betas that don't support anything but the newest cards.
The fact is that better and worse are relative to what you're using the card for: if you're using it for videos/movies, then ATI; if you're just going to game, then NVIDIA (lower-end cards, I mean).
NVIDIA's G92 improved on the HD support, but from my experience the 3800 cards' quality is still a bit higher, which is why I wish I had waited and gone that route. Either way, I paid too much. :P
Meh, whatever, man, I'm not going to argue about this anymore. There's no point trying to get an nvidiot to see that NVIDIA isn't always right and the best, same as Mac freaks or Intel fanboys who stuck with Intel despite the P4 sucking.
Oh, I almost forgot, I was going to say this earlier: to me, minimum framerate is more important than maximum. I would rather have a card whose max is 65 fps but that never drops below 30 in a game than a card whose max is 160 but that drops into the 20s in the same games. Smooth, consistent gameplay is more important to me than seeing 160 fps when things are going well, and in most benches I have seen, the minimum fps on the 3800s was higher than the minimum on the 8800s (G92s) in the same price brackets.
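A quick way to see the point about minimum framerate is to convert per-frame render times to fps and compare the worst frame rather than the peak. The frame-time logs below are made-up numbers, purely to illustrate the argument:

```python
# Hypothetical frame-time logs in milliseconds per frame (illustrative
# numbers, not real benchmark data). Card A is capped but steady;
# card B peaks high but stutters.
card_a = [16, 16, 17, 16, 18, 16, 17]   # hovers around 60 fps
card_b = [7, 6, 45, 7, 6, 50, 7]        # peaks near 160 fps, dips into the 20s

def fps_stats(frame_times_ms):
    """Return (min, avg, max) fps for a list of frame times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps), max(fps)

for name, log in (("A", card_a), ("B", card_b)):
    lo, avg, hi = fps_stats(log)
    print(f"card {name}: min {lo:.0f} / avg {avg:.0f} / max {hi:.0f} fps")
```

Card B "wins" on max fps, but its worst frame is a 50 ms stutter (20 fps), while card A never renders a frame slower than 18 ms, which is exactly the smoothness argument above.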
Oh yeah, and 3DMark/e-peen-mark means nothing. I can trick 3DMark into giving far higher scores with some simple little tweaks here and there to the system, but then game performance/quality suffers, so it's really pointless to do those tweaks. 3DMark is a pretty little tech demo that makes a good stability test for your system after some overclocking. :)
If you think you can intimidate me by sig-quoting me, bad news: I'm not one bit intimidated. At least I'm not ATIncompetent.
First off, the 2600XT ties the 8600GT in games and benchmarks, but the 2600XT can never hope to clock as high, period, the end, so drop it.
As for HTPC use, no one buys the XT for that purpose. Face it, the thing is noisy, and HTPC users look for silent solutions, aka passive cards; that would be the 8400, 8500, 8600GT, and the 2400 and 2600 Pro. Those can be done passively. As for quality, the differences everyone likes to quote are ones you will never notice unless you are looking very closely.
As for NVIDIA drivers, I have no idea where you get your flawed ideas, but let's go back to the GeForce FX, shall we, because you're now in my territory and it's time you got a history lesson.
First off, the FX wasn't the best card, granted, and the drivers were poor at launch, but over the FX's lifetime performance improved by nearly 70% without any IQ loss at all. Everyone is fully aware NVIDIA sets its drivers to Quality and not High Quality, but they have done that since the Riva 128, so I can promise everyone knows that. While the R300 was faster, it too did not have the best drivers in the world at launch, and it also got caught for 3DMark03 cheating in 2003; both NVIDIA and ATI were at fault for that.
Fast forward, if you will. ATI has a 320 stream processor card, yet it performs like a 64 stream processor card. Care to explain that to me?
I can explain it to you very easily, and it will also explain why the rest of ATI's cards perform so badly despite having these killer specs. Each processor is divided into groups of five, but only one of the five can do complex work; the others are for basic shader work and math. That isn't the problem in itself, as most shaders can be handled by the simple units and the math unit. But ATI has failed to release a driver that can properly use its thread dispatch processor, which is why in reality most of those shaders sit dormant and will never be used. You can criticize NVIDIA all you want, but at least they can use all their processors, and their dispatch unit works right.
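The arithmetic behind that claim can be sketched as a toy model (an illustration of the grouping argument above, not how the real hardware scheduler works): 320 stream processors arranged five to a group gives 64 groups, so if the dispatcher can only keep one slot per group busy each cycle, the card behaves like a 64-processor part.

```python
# Toy model of the VLIW-style grouping described above (illustrative only,
# not the actual scheduler).
TOTAL_SPS = 320
GROUP_WIDTH = 5                       # one "complex" unit plus simpler units
GROUPS = TOTAL_SPS // GROUP_WIDTH     # 64 independent groups

def effective_units(slots_filled_per_group):
    """ALUs doing useful work if the dispatcher keeps only
    `slots_filled_per_group` of the 5 slots in each group busy."""
    return GROUPS * min(slots_filled_per_group, GROUP_WIDTH)

print(effective_units(1))   # worst case: acts like a 64 stream processor card
print(effective_units(5))   # ideal case: all 320 units busy
```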
As for naming, were you aware the 3870 ties the 2900XT, and the only change is the DirectX support of the card? It's an R600 with DirectX 10.1, yet it warrants a new generational name. Meanwhile NVIDIA's 9800GTX, going by early benchmarks, can outperform an 8800 Ultra, so there is a performance increase there, unlike ATI.
Before you go around spouting nonsense, I suggest you read up a little and become familiar with video cards, because this isn't 1999 and they have gotten more complex than two pipelines with two ROPs. So do us all a favor and learn before you open your mouth; we really don't like it when new members come around and act like foolish children. We are all very knowledgeable about graphics and computers in general, and I think I can speak for all of us and politely tell you not to speak if you have no idea what you are talking about.
And one other thing: do not insult the members, please. It only makes the mods mad, and you will get banned and never learn anything.
This is not directed at btarunr in any way, but at the two ATI nutjobs running around.
And there was more than one passively cooled 2600XT made.
The fact is you don't know what you are talking about. Current-generation onboard solutions offer full HDCP support via HDMI, and even the Intel solution is more than powerful enough to play back full HD video. Yes, it uses the CPU more, but that doesn't affect quality, and if it is a true HTPC you don't need that CPU power for anything else, so it doesn't matter.
And I really don't care how you like it. Don't come here and spread BS, and you won't get attacked. Nvidia doesn't need to play the benchmark game when they are destroying the competition.
They release beta drivers for testing, they use the beta system like it is supposed to be used. WHQL is complete BS and useless.
Nvidia's driver support has been rather poor in the past, but they have fixed that issue now, and the present is all that matters. Using hacked INF files is only needed when you are installing a driver that nVidia hasn't tested for your card. They don't do it to screw people over like the ATi fanboys say; they do it for protection. If they haven't tested a driver with all the cards, they don't want inexperienced people installing it and then not knowing how to fix any problems it might have. It is really a simple concept to understand, and using a hacked INF is not a big deal.
I would still like you to prove your claims. Show me how ATi's current offerings are competitive for the price, when nVidia has virtually every card ATi has on the market beat in both price and performance.
2nd, not all HTPCs are "silent"; most make a little noise if you put your head directly against them, but once they are installed in an entertainment center you can't hear it, just as you can't hear the actively cooled EQs and amps many people's systems have.
3rd, most onboard video sold today in small PCs is not the top-end onboard you can get now. The 690/740/780 or 7050 are good, but most of it is older Intel, NVIDIA or ATI stuff based on the X300/6100/Intel GMA 950. Those solutions, though fine for internet surfing and business apps, are not meant for video playback, and definitely not for gaming; performance and quality suffer badly. And many lower-end systems people still buy for use as an "HTPC" or "MPC" use low-end Sempron or Celeron/Pentium chips. Those chips can decode 1080p video, but it's hard on them with those older onboard video chips, because the CPU has to decode the video fully with no GPU support, as well as decode the audio. I have seen systems with the 6100 and low-end/older Intel GMA chipsets that would lose sync between video and audio because CPU use was too high; on the other hand, slap in a cheap 2400-2600 card and the system becomes 100% smooth.
Now, you talk about HDCP and HDMI. Until this last generation of onboard graphics, nobody fully supported HDMI/HDCP, and NVIDIA still uses an external wire for the audio even on their top cards. ATI/AMD have their audio on-chip, no extra wires, and the system load is nil because the GPU processes the audio and video.
As for NVIDIA's driver quality, it's gotten better than in the FX and early 6-series days, but it's still ridiculous that they don't fix long-standing known bugs because they are too busy getting 3 more fps in Crysis.
Now, you talk about not testing the drivers with a given card. Um, the GPU is the same series on the newer 8800GS/GT/GTS, so there should be no real testing issue, and as I have found on a lot of other forums, even G80 users don't have problems installing those "untested" drivers with modded INF files; I saw a nice performance boost. That's why I feel part of the reason you aren't seeing them for the 8800GT/GTS is that NVIDIA doesn't want their new top-end card to look worse than it does now; they want to ensure it looks as fast as possible in all the top games compared to their older G92-based cards.
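For reference, the INF hack being described usually amounts to copying an existing device line in the driver's nv_disp.inf and pointing it at your own card's PCI hardware ID. The section name and device ID below are illustrative examples only; they vary between driver releases, so match them against the INF shipped with the driver and the hardware ID shown in Device Manager:

```ini
; Illustrative excerpt of a modded nv_disp.inf (the section name and
; device ID are examples only -- match them to your driver and card).
[NVIDIA_SetA_Devices.NTamd64.6.0]
%NVIDIA_DEV.0611.01% = Section001, PCI\VEN_10DE&DEV_0611

[Strings]
NVIDIA_DEV.0611.01 = "NVIDIA GeForce 8800 GT"
```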
NVIDIA has done dirty tricks like this before, but never quite to this degree. It's reminding me of ATI back in the Rage 128 days more and more.
2nd, I know most HTPCs aren't totally silent; nothing with a fan will ever be totally silent. However, people don't want leaf blowers in their HTPCs, and that is what the HD3650/2600XT was. There are some exceptions, but generally speaking they were too loud to put in an HTPC without replacing the cooling solution with aftermarket parts.
3rd, most onboard video sold today handles HTPC applications just fine. You can argue this point until you are blue in the face, and it won't change the fact that you are wrong. You don't need a discrete graphics solution to play HD content. And if you want one, there is no reason to go with the HD2600XT/3650 when a cheaper card will do just as well; the only reason to go with either of those cards is the added gaming performance. Anything over an HD2400 Pro or 8400GS is overkill, and the only reason to go higher is gaming performance. The higher-end cards offer no other gains, which is why your point is wrong.
The driver issue is another weak fanboy argument. NVidia gets beta drivers out more often than ATi gets its drivers out. Personally, I would rather have beta drivers released every week like nVidia's than have to wait a whole month to get fixes for the latest games.
As for major issues not being fixed, I would like to know exactly what issues you are talking about.
Yes, nVidia doesn't include support for every single card in every single driver release. Using a hacked INF is not that big of a deal; a pain, but not a big deal. Again, the reasoning behind this is sound. They don't want to test every single beta driver with every single card, and they don't want to deal with the slew of support issues that would arise if every non-computer person installed every beta driver released. If they haven't tested the driver with the cards, they won't support the cards. It really is that simple; no conspiracy here.
I get my "flawed ideas" from the fact that I'm an 8800GT owner and have owned pretty much every series of cards NVIDIA has ever made, even their old NV1 (utter POS), and from the fact that since the FX line they haven't fixed long-standing known issues, due to their quest for king of the hill in whatever the day's top e-peen game or benchmark is.
This I can personally call nonsense on: the 70% boost was not due to drivers at all. They made a totally redesigned GPU that was used in the 5700, and it performed better, while their older cards/chips still sucked utterly for anything above DX8 (OK, they were good in older OpenGL games too).
As for no quality loss, just do some googling: the FX cards were crippled with driver updates that ruined quality. I can personally testify to this because I had a 5800 Ultra as well as a 5900XT; they sucked. You couldn't even hack the drivers to give back full IQ, because NVIDIA hard-coded a lot of the "tweaks" into the FX line's core files in order to get extra performance in benchmarks. In all honesty, it's what drove me away from NVIDIA. I was as big an nvidiot as btarunr, newtekie1 and you until those days, when I was given a low-end 9600 by a buddy who had just gotten hold of an unlockable 9500 (a 9700 after unlocking). I didn't want to use it, because I remembered how badly ATI's drivers sucked for the Rage 128 under any NT-based OS, but I tried it because I was so frustrated with the DX9 performance and image quality of my overpriced NVIDIA cards, and I was shocked: in all newer games, a 9600 at a quarter of the price was faster. That wasn't enough reason for me to move fully to ATI, at least at first, but after some gaming I compared screenshots from the 9600 versus the 5800 and 5900, and the IQ difference was clear. Explosions and particle effects looked totally different: no dithering under ATI, no blurry textures. It just looked nice, as my GeForce4 Ti 4400 had looked.
Yes, I know it ties in gaming performance, but it has better video decoding and HDMI support. Check around; performance went way up in video playback, just as decoding on the G92 went up versus the G80 line. Also, the 38x0 cards do have other small tweaks beyond just the die shrink. And my point, unlike some other people's points, was:
Let me split this up a bit for easier understanding.
ATI never released the new core as a 2900-series card; it was renamed to differentiate it from the 2900 and its bad rep/press, which also makes it easier for buyers to be sure they are getting a new core, not an old core. This makes sense.
NVIDIA put the G92 into 8800GT/GS/GTS cards, then put the same chip into the 9600 and 9800. This is where I call foul: if they wanted to call them the 9 series, then they should have called them the 9 series from the start, not used the same chip with fewer disabled shaders/pipes/whatever and called one 8800 and the other 9800. Honestly, it's nonsense, and most people can see what I mean.
As for 'nvidiot', btarunr admitted he's one; check the link in my sig (found it in somebody else's sig and copied it).
I'm not a fanboy, though you seem to think I am; I just have a different perspective from you. I see that game performance isn't the only reason to buy a card, and that sometimes other things matter more to the buyer or client. Until you have worked in computers as long as I have (13 years doing this for money), you won't have the proper perspective on what matters to different markets. It's not all about e-peen-mark scores or Crysis bench scores; to be honest, at times I wish it was, at least it would make it easier to say "this is the best card for X price". As things are, you have to know more about how the cards actually work in different situations and uses: video playback, HDMI/HDCP, video decoding, video in/out, the list goes on and on. There are features ATI cards offer that NVIDIA's just don't, like ViVo (video in/video out); if somebody wants those features they have to go ATI or buy another card, and in the case of many mATX systems adding another card isn't a viable option because the system has only 1-2 usable expansion slots.

See above; I will say it again: fps doesn't matter if your primary use isn't gaming. If you buy the card for a media PC/HTPC where all that matters is video quality and playback, game fps doesn't come into the picture. If you want ViVo features, NVIDIA is not an option because they don't offer them; you would need to buy another card, and as I said above, many of the small PCs people choose for media systems (be it media PC or HTPC) don't have enough expansion slots. Take a look at those Dell mini desktops: if you stick a video card in, you have 1-2 slots usable at best, and many people using one as a pure media playback device would want a third-party sound card, because those Dell boxes' onboard sound is still utter crap (worse than any home-built's onboard sound I have seen in the last 4-5 years).
No, but what if you can't game because you can't even get the system stable due to a driver bug NVIDIA has known about since 2003? Namely, the bug that causes x64 Windows and Server 2003 (which x64 Pro is based on) to crash as soon as you try to access even the NVIDIA control panel. This is a known issue in NVIDIA's bug database; they just don't bother to fix it. Their advice: "reinstall till it works" and "try slipstreaming a driver into your Windows disc". Nobody should have to do that; most people wouldn't even know what slipstreaming is, let alone how to do it.
Both companies have their faults. NVIDIA's, and its nvidiots', fault is that they only care about game bench performance and nothing else: not stability on certain Windows versions, not image quality, not video playback performance/quality/bugs, just e-peen-mark and Crysis scores. ATI's is that they took a bad path with the 2900/3800 cores, which either don't have optimum driver or game support, or just need a redesign. Oh, and they need to fix scaling in old games on widescreen monitors.
I fully know that the top two at the moment each have their share of problems, but for my money, as a tech I would rather deal with ATI cards/drivers than NVIDIA's in most situations. Sure, they don't get the same uber-high max fps, but at least I don't have to deal with them crashing on x64 Pro, or with the video rendering bugs that plague the 8800 drivers (well, it actually affects many cards using the same driver revisions as the 8800 as well).
I'm on an 8800GT. It's nice, but it's got its bugs, and it can frustrate the hell out of me.
Hey, let's all just agree on this: at least we aren't stuck with VIA/S3 and the like. :)
www.newegg.com/Product/ProductList.aspx?Submit=SubCatDeals&SubCategory=10&StoreType=2&N=2032280010 just some examples; most of those do not have HDMI support. Possibly some of the 1200/1250 units do, but that's very iffy.
I could have copied all 100 listings from the first page, but that would have been excessive. So yeah, you're full of crap about all computers sold today having onboard video that fully supports HDMI/HDCP.
Only the ones with 1250 graphics can support HDMI, and that's only if the company has put that feature on the board; otherwise you're out of luck.
Really. I have personally set up and compared the 3870 and 8800GT side by side in the same games at 1600x1200; unless you crank the AA, you can't tell the difference.
And ATI/AMD needs half the AA setting to get the same quality, which drives me crazy: at 16x AA it's the same as my ATI card was at 8x, at 8xQ AA it's the same quality as my ATI card was at 4x, and it goes on and on. Something I still haven't seen benched, though, is ATI's temporal AA modes used instead of old-school AA.
Maybe somebody could get Wiz or even WileE to bench his cards side by side using TAA instead of normal AA mode.
Use the T2 and T3 modes at 2x and 4x. T2 is 2x the AA setting and T3 is 3x, so at 2x it would be 4x with T2 and 6x with T3; at 4x it would be 8x with T2 and 12x with T3. You do need to keep a steady fps above a certain number (you can set this number manually as well) for TAA to work, but still, in my experience at 1600x1200 with a 3850/3870 you never need more than 2x T2 AA in any game you run across, and they are all playable. The one exception may be Crysis, that heavily unoptimized game!
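The multiplier arithmetic in that last paragraph, written out (assuming, as the post says, that T2 doubles and T3 triples the base sample count):

```python
def effective_aa(base_samples, temporal_mode):
    """Perceived AA level under ATI's temporal AA, per the rule of thumb
    above: T2 (mode 2) doubles the base setting, T3 (mode 3) triples it."""
    assert temporal_mode in (2, 3), "temporal AA modes are T2 and T3"
    return base_samples * temporal_mode

# The four combinations mentioned above:
print(effective_aa(2, 2))   # 2x with T2 -> looks like 4x
print(effective_aa(2, 3))   # 2x with T3 -> 6x
print(effective_aa(4, 2))   # 4x with T2 -> 8x
print(effective_aa(4, 3))   # 4x with T3 -> 12x
```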
1.) Integrated is enough. Paired with even a low-end processor, it can output HD content just fine.
2.) If you want HDCP and have no interest in gaming and only want to take some load off the CPU, then either the HD2400 Pro or 8400GS will do the task, the 8400GS being cheaper ($35) than the HD2400 Pro ($37). The built-in audio on the HD2400 Pro is no better than onboard sound either, so don't even try to say it is worth anything.
3.) The only reason to go higher than these two cards is gaming performance; any higher card is going to do the same tasks as the lower cards, just with better gaming performance, and nVidia offers better gaming performance. Why would anyone upgrade to these cards if they were not looking for gaming performance?
And just because I don't want to quote your whole previous post: no, ATi does not have better IQ, and no, it does not take half the AA on an ATi card to get the same quality as an nVidia card.
sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch%3A_X2900XT_Review/4946-15.html
Read there for a nice image comparison between the ATi HD series and nVidia's 8800 series. The conclusion, ATi is slightly better than nVidia at the highest possible settings, but only when you look really really really close.
I'm done with this argument, you have failed to back up your points in any meaningful way, and you just keep talking in circles. You make long posts to try and make it seem like you are saying a lot, when you are really saying nothing.
And yes, the ATI onboard sound is worth noting, because it works over HDMI without the need for any external wires or for using the computer's onboard sound; hence it's better than the 8400 (which doesn't support HDMI audio at all), so your "point" is just more NVIDIA fanboyism.
Why do you insult and call anybody who says anything bad about NVIDIA a fanboy, but insist you're not a fanboy/nvidiot for running around badmouthing ATI every chance you get?
And I hate to sit the fence on the quality issue, but neither have better quality. Both just render things subtly different. Which looks better is purely a matter of preference.