# Nvidia is scared of ATI’s success



## AphexDreamer (Feb 6, 2008)

After almost two years, Nvidia has launched a presentation meant to educate analysts that R680, the Radeon HD 3870 X2, is a bad, bad thing.

Our colleague Charlie from The Inquirer has posted an interesting email that Nvidia sent to financial analysts, and you can read the fun part here: http://www.theinquirer.net/gb/inquirer/news/2008/01/30/nvidia-tries-snow-financial.

However, the reality is somewhat different. This is the first time in two years that Nvidia has done such a thing, as the Radeon 3870 X2 is better and faster than anything Nvidia has on the market. Nvidia will catch up and beat ATI with the Geforce 9800 GX2, but it doesn't look like Nvidia will ship this part before early March. Even if it executes, Nvidia will ship its card a month after ATI, and even then we were warned that the card will be much hotter than ATI's and that it won't be widely available.

This doesn't mean that ATI has won the war, but it has gotten closer to Nvidia than at any point in the last two years, and this has Nvidians scared.

The real battle is the next generation, codenamed GT200 and R700, and it is not clear who wins that round. The second unknown is who will be first to market, as this certainly counts. Both Nvidia's and ATI's chips are scheduled for the early second half of 2008, but we've seen delays before.

ATI is currently the only profitable and healthy part of AMD, which is bleeding cash on the CPU side, and the graphics group is being treated quite well, as these guys are doing better than expected. Nvidia and Intel, on the other hand, are AMD's technology partners, at least on paper, but in reality they are fearsome competition.

Shape up, Nvidia; the last thing you want is to lose to ATI now, when it is critically important to win in this hostile market.

http://www.fudzilla.com/index.php?option=com_content&task=view&id=5576&Itemid=1


----------



## choppy (Feb 6, 2008)

that inquirer article is spot on! tbh i dont want nvidia to die neither do i want ati to die..if theres one thing i hate its fanboys.

we need both of these companies to form a competitive market where we (the consumer) actually get bang for buck and we see a market thats always moving forwards


----------



## wolf (Feb 6, 2008)

i highly doubt theyre shytting their pants or anything, 8800GT sli is good, the 9800GX2 should be about the same, as the X2 was to the 3870.

then theres always this 55nm monster ive been hearing rumors about, with 384sp's and 1024 megs of 512bit memory. and even if ATi gets the upper hand, who cares? apart from fanboys, people will always buy what they want to buy, and given the market over the past few years, nvidia will always be able to sell their current lineup, as will ATi.

these companies understand that no matter how you compare, its all about the price : performance ratio. price em right, and they will sell.
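To put rough numbers on that price : performance point, here is a quick sketch. The prices and scores below are hypothetical placeholders in the ballpark the thread throws around ($4xx for the X2 vs $6xx for the Ultra, with a ~5% performance gap); they are not real benchmark results.

```python
# Toy price : performance comparison.
# Prices and benchmark scores are made-up placeholders, not real data.
cards = {
    "HD 3870 X2": {"price": 449.0, "score": 105.0},
    "8800 Ultra": {"price": 619.0, "score": 100.0},
}

for name, c in cards.items():
    points_per_dollar = c["score"] / c["price"]
    print(f"{name}: {points_per_dollar:.3f} points per dollar")
```

Even if the two cards trade blows on raw frames, the performance-per-dollar gap is what decides which one sells.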

fair enough, theyve been ahead and gotten lazy. now all they have to do is pull the finger out and get these products theyre talking about out to the consumer.

i dont think the word scared is right, maybe worried about losing profits, maybe anxious about the situation, but nobody, and i mean nobody should ever be scared of a video card.


----------



## ShadowFold (Feb 6, 2008)

"Poor scaling and compatibility on games that are not top benchmarks" Yea cause their cards scale sooooo well


----------



## wolf (Feb 6, 2008)

driver optimization will always be an issue with any dual card setup, theyre trading blows at the moment.


----------



## DarkMatter (Feb 6, 2008)

This is the only sentence in that inquirer article that I agree with.



> Mr Hara is an idiot for posting his mailing list publicly.



I'm sure that this kind of message is common at any company: Intel, AMD, IBM, whatever...
But they don't make the mistake of making them public.

It doesn't say anything that isn't true either, though he interprets the data to their benefit. What can I say - common business. Move along.


----------



## pt (Feb 6, 2008)

ATI RULES!!! 
IT'S GONNA TRASH WHATEVER NVIDIA THROWS AT THEM!
and will become the best gfx card maker again


----------



## CH@NO (Feb 6, 2008)

pt said:


> ATI RULES!!!
> IT'S GONNA TRASH WHATEVER NVIDIA THROWS AT THEM!
> and will become the best gfx card maker again



sarcasm right???


----------



## Exceededgoku (Feb 6, 2008)

Hmmm I'm an avid AMD/ATI supporter and I definitely wouldn't say that ATI is in a position to scare Nvidia. The architecture is damn good though and hopefully through the next generation we will start to see the effect of their forward thinking design!


----------



## cdawall (Feb 6, 2008)

hahaha look at the charts on hwbot the 8800ultra is killed by the 3870 X2

http://hwbot.org/hardware.compare.do?type=gpu&id=1043_1&id=1236_1&id=1255_1&id=1183_1&id=1279_1

haha i love how it costs so much less, $4xx vs $6xx

http://fxvideocards.com/Sapphire-Ra...ETAIL-MFG-Sapphire-Part-100221SR-p-16219.html

http://mirinix.com/1173487-xfx_pvt8...ideo_card__dual_dvi__s_video_-pvt80ushf9.html


----------



## CH@NO (Feb 6, 2008)

I'm with Wolf, both companies are great. Almost all my past VGA cards were from ATi and I was very happy with the performance; later I tried Nvidia and guess what????, THE SAME!!, Nvidia cards perform very well too.


----------



## CH@NO (Feb 6, 2008)

cdawall said:


> the 8800ultra is killed by the 3870 X2
> 
> http://hwbot.org/hardware.compare.do?type=gpu&id=1043_1&id=1236_1&id=1255_1&id=1183_1&id=1279_1



mmmmm, that's really nice, but the 8800 ULTRA isn't supposed to fight an X2. It's single-cored and from the "past generation". When Nvidia launches the 9800GX2, put them to trade blows and then post a real comparison.


----------



## das müffin mann (Feb 6, 2008)

i am happy for ati, they worked their asses off to produce some great gpu's over the past 2 yrs and its paying off. they have built something that is scaring nvidia, and for that i commend them. both are great companies and i use cards from both camps; both make quality products. maybe now nvidia will lower their prices a bit to be more competitive. now if only AMD could get something going with their processors


----------



## BullGod (Feb 6, 2008)

Aphex, this has got to be the most bullshit-filled article I've ever read. How can a journalist be so one-sided, narrow-minded and plain stupid? I mean, what has Nvidia done to him? What does he have against that company? Oh, and as far as I know the best-selling graphics cards are from Nvidia, and they just made a 150mil acquisition. What has ATI done recently, besides being bought out? Sometimes this fanboyism just makes me sick. :shadedshu


----------



## largon (Feb 6, 2008)

cdawall said:


> hahaha look at the charts on hwbot the 8800ultra is killed by the 3870 X2
> 
> http://hwbot.org/hardware.compare.do?type=gpu&id=1043_1&id=1236_1&id=1255_1&id=1183_1&id=1279_1
> 
> haha i love how it cost so much less $4xx vs $6xx


Oh my dear _god_!

An almost 1½ year old design loses to a two-generations-newer, two-fab-nodes-smaller chip - sorry - _dual chip solution_ in a synthetic benchmark! What's even more impressive is that in games the HD3870X2 manages to beat the 8800Ultra (again, a 1½ year old design) by a whopping 5% while consuming 50W more power!
 


...






Disclaimer: 50% of all video cards owned by me were made by ATi; 75% of those that were not were bought after nVIDIA's 8800 series came out.


----------



## das müffin mann (Feb 6, 2008)

the 2900, 3850/70, 3870x2 - need i say more (bullgod)


----------



## pt (Feb 6, 2008)

CH@NO said:


> sarcasm right???



no, fanboism


----------



## [I.R.A]_FBi (Feb 6, 2008)

nvidiots cant do much better right now... theyre running scared, esp how they got lazy and decided to do a dual card like ati while ati slaved over the r700 (which is 6 mths ahead of schedule)

"heights of great men reached and kept were not attained by sudden flight, but they while their companions slept were upward toiling through the night"

NV had no reason to go dual other than the fact that ATI was doing it as well ...


----------



## BullGod (Feb 6, 2008)

das müffin mann said:


> the 2900, 3850/70- 3870x2 need i say more (bullgod)



And which one of those cards performs better than their counterparts? 8800GT/GS ???


----------



## BullGod (Feb 6, 2008)

Oh, and why were even people on this site at one point selling their brand new 2900's and buying the 8800GT?


----------



## [I.R.A]_FBi (Feb 6, 2008)

BullGod said:


> And which one of those cards performs better than their counterparts? 8800GT/GS ???


8800GS a card?


----------



## BullGod (Feb 6, 2008)

[I.R.A]_FBi said:


> 8800GS a card?



well it was supposed to come out. Maybe it won't. I don't really know.


----------



## Conti027 (Feb 6, 2008)

ATI and Nvidia really bring out the worst in people on this site. i mean they both do the same thing, and sometimes one's better than the other.


----------



## das müffin mann (Feb 6, 2008)

BullGod said:


> And which one of those cards performs better than their counterparts? 8800GT/GS ???



it just kinda depends on where you get your benchmarks from. i can show you where ati is better and where nvidia is better - do you really want me to start posting sites? im not sure how many other threads this specific topic has popped up in, and people have shown benches where one card performs better than the other. i personally cant tell the difference in game between an 8800 and a 2900; there are games where one performs better than the other, and vice versa


----------



## BullGod (Feb 6, 2008)

Anyways I don't know how people can read or trust such a shitty site like that one. Here is an excerpt from one of their articles: "THE LATEST DRIVER out from the guys in Satan Clara, dubbed ForceWare" 

I mean come on, how fucking lame is that? Nvidia is not the devil. WTF??? How much more biased can you get? And I'm supposed to believe anything those guys say about Nvidia? As far as I know they made that email up.


----------



## das müffin mann (Feb 6, 2008)

BullGod said:


> I mean come on, how fucking lame is that? Nvidia is not the devil. WTF??? How more biased can you get? And I am supposed to believe anything those guys are saying about Nvidia. As far as I know they made that email up.



dude, nvidia brought it on themselves, he was simply disproving some of what he said. and do you really think they made the email up? i mean come on dude, i smell an nvidia fanboy


----------



## Hawk1 (Feb 6, 2008)

Conti027 said:


> ATI and Nvidia really bring out the worst in people on this site. i mean they both do the same thing and sometime ones better then the other.



It brings out the worst on any site. Fanboys are fanboys (and I'll admit I'm one also). As such, not much is going to change people's minds on who they are going for. You either have to keep your mouth shut when the other team is in the lead, or slag their cards for whatever faults they have (heat/driver issues etc). You like whatever team you like for whatever reason, regardless of who's in the performance lead.


----------



## BullGod (Feb 6, 2008)

das müffin mann said:


> dude, nvidia brought it on themselves, he was simply disproving some of what he said, and do you really think they made the email up, i mean come on dude, i smell a nvidia fanboy



Well, if they made that name up to denigrate them, why couldn't it be possible that they invented other things too? Show me another source than those ATI fanatics and I will believe it. You smell wrong; atm I'm running all Intel. And I'm not a fan of brands either way - I'm a fan of good products. I don't care if they are made by Sony, Samsung, AMD or whoever. I've studied journalism at university, though, and I just can't stand something like this. I've heard bad things about the Inquirer; well, I guess they were true.


----------



## snuif09 (Feb 6, 2008)

do you know what i want to happen....
...
when ati and nvidia are bitching, 3dfx suddenly comes back into the race with an amazing card


----------



## phanbuey (Feb 6, 2008)

largon said:


> Oh my dear _god_!
> 
> An almost 1½ year old design loses to a two-generations-newer, two-fab-nodes-smaller chip - sorry - _dual chip solution_ in a synthetic benchmark! What's even more impressive is that in games the HD3870X2 manages to beat the 8800Ultra (again, a 1½ year old design) by a whopping 5% while consuming 50W more power!
> 
> ...



AAAHAHAHA... finally someone with a grip on history... "scared" of a company that's bleeding money like there is no tomorrow? ATI $7 a share? VLIW... People said the same thing about 3dfx @ $7 a share - and the 4 chip voodoo 5000 board - about how glide was such a better API - OMFG SO MUCH POWER SO MUCH FASTER THE RIVA TNT 2 WILL BE A PAPERWEIGHT... blah blah... nvidia has a faster production cycle than ATI, and the only reason they haven't released new tech is to recoup the costs of the 8800 series.

Think of it like a track - ATi is "ahead" because theyre about to be lapped; to someone who has no previous knowledge of the race it may seem as if they are ahead... but just sit and watch - you'll see how much theyre about to struggle.


----------



## CH@NO (Feb 6, 2008)

BullGod said:


> Anyways I don't know how people can read or trust such a shitty site like that one. Here is an excerpt from one of their articles: "THE LATEST DRIVER out from the guys in Satan Clara, dubbed ForceWare"
> 
> I mean come on, how fucking lame is that? Nvidia is not the devil. WTF??? How more biased can you get? And I am supposed to believe anything those guys are saying about Nvidia. As far as I know they made that email up.



Yep.


----------



## CH@NO (Feb 6, 2008)

das müffin mann said:


> dude, nvidia brought it on themselves, he was simply disproving some of what he said, and do you really think they made the email up, i mean come on dude, i smell a nvidia fanboy



I think it's not fanboyism; I agree with him and I'm not a fanboy... I hate that. It's just the way the article says things. I like ATi and Nvidia, and both have had better cards than the other... and both have experienced hard times... the thread wouldn't have turned into an Ati vs Nvidia thread if the article had been objective.


----------



## strick94u (Feb 6, 2008)

Matrox Millennium G550 PCIe 
rules and will beat them both, Ati and Nvidia do not have a card to compete with it.


----------



## trog100 (Feb 6, 2008)

ati are the innovators.. they are the ones that come up with new ideas..  one day a new idea might actually win the race.. for the last two or three years it hasnt.. but who knows.. ???

basically green simply watches what red are up to, then using more brute force grunt but no new ideas beats it.. he he he

praps they are "scared" that one day their rather simplistic approach wont work..

for example.. take reds new x2.. it works in a reasonable fashion.. its possible greens x2 wont.. heat being the possible and likely problem..

reds innovation.. a new chip that doesnt get too hot or draw too much current.. which is why reds x2 works

just imagine a red x2 using two 2900 type chips.. he he he he

trog


----------



## erocker (Feb 6, 2008)

My 2 cents: Nvidia has never made a good dual-GPU card, NEVER!  The GX2 may barely beat the X2, though it should crush it; unfortunately SLI is inferior to Crossfire.

GT200 = 65nm
R700 = 45nm

ATi should win the next round.  I don't know about you, but I don't want another space heater in my system a la GT200.  This is going to be ATi's year.  Products like the 8800 don't come around too often.


----------



## largon (Feb 6, 2008)

erocker said:
			
		

> GT200 = 65nm


Source?


----------



## [I.R.A]_FBi (Feb 6, 2008)

phanbuey said:


> AAAHAHAHA... finally someone with a grip on history... *"scared" of a company that's bleeding money like there is no tomorrow?* ATI $7 a share? VLIW... People said the same thing about 3dfx @ $7 a share - and the 4 chip voodoo 5000 board - about how glide was such a better API - OMFG SO MUCH POWER SO MUCH FASTER THE RIVA TNT 2 WILL BE A PAPERWEIGHT... blah blah... nvidia has a faster production cycle than ATI, and the only reason they haven't released new tech is to recoup the costs of the 8800 series.
> 
> Think of it like a track - ATi is ahead because theyre about to be lapped, to someone who has no previous knowledge of the race, it may seem as if they are ahead... but just sit and watch - you'll see how much theyre about to struggle.



so if they arent, explain the damn lying email


----------



## erocker (Feb 6, 2008)

largon said:


> Source?



Nothing final, but everything I've read says it's true.  Google 9800gtx, or g90, or gt200


----------



## Black Panther (Feb 6, 2008)

DarkMatter said:


> This is the only sentence in that inquirer article that I agree with.
> 
> _Mr *Hara* is an idiot for posting his mailing list publicly. _
> 
> ...



Lol that's funny. If you knew what Hara translates to in Maltese...

It means this.


----------



## [I.R.A]_FBi (Feb 6, 2008)

erocker said:


> My 2 cents, Nvidia has never made a good dual gpu card, NEVER!  The GX2 may barely beat the X2, though it should crush it, unfortunately Sli is inferior to Crossfire.
> 
> GT200 = 65nm
> R700 = 45nm
> ...



do u think nvidia took a $$$ hit with the 8800 series?


----------



## Scrizz (Feb 6, 2008)

snuif09 said:


> do you know what i want to happen....
> 
> when ati and nvidia are bitching, 3dfx suddenly comes back into the race with an amazing card



3dfx was the shizzz 
that's why I don't like NV

Matrox Triple head ftw


----------



## Black Panther (Feb 6, 2008)

[I.R.A]_FBi said:


> do u think nvidia took a $$$ hit with the 8800 series?



I definitely don't.  I'm getting off topic here, but I've never heard of someone called Mr $$$ hit before!


----------



## PaulieG (Feb 6, 2008)

Conti027 said:


> ATI and Nvidia really bring out the worst in people on this site. i mean they both do the same thing and sometime ones better then the other.



This really says it all. Can we please stop these ridiculous and petty fanboy wars? It's so damn tiring and juvenile. It's a competitive market: ATI and Nvidia take turns making the better product and getting a bigger market share. Both are trying to outperform each other simply to make more money. That's all it is. Can we please, please just stop?


----------



## [I.R.A]_FBi (Feb 6, 2008)

Paulieg said:


> This really says it all. Can we please stop these ridiculous and petty fan boy wars. It's so damn tiring and juvenile. It's a competitive market. Ati and Nvidia take turns making the better product and getting a bigger market share. Both are trying to out perform simply to make more money. That is all it is. Can we please, please just stop.



the point of the thread is that nvidia's marketing is being juvenile ..


----------



## Crazyhorse (Feb 6, 2008)

[I.R.A]_FBi said:


> the point of the thread is that nvidia's marketing is being juvenile ..



True, but those marketing strategies are the same everywhere. Intel does it, AMD does it, ATi did it and NVidia does it. I'm no friend of NV cards, even though they do make good cards, no doubt. I might be a little bit of an ATi fanboy, but I can admit that the other company makes good cards as well, and for the past 2 years NVidia owned AMD/ATi. But there is no telling what AMD/ATi has been working on all along; i'm sure they are more for a long-shot solution, and it's only a matter of time before we see VPUs integrated in either CPUs or the board itself - not today's onboard video though.

But yeah, that is clearly trashtalk from NVidia's side, but then everybody else does or did it too, so that's nothing new in the world of technology. My product owns yours....

I'd be happy to see good products from all companies; that means competition, and that means better pricing for us.


----------



## jammy86 (Feb 6, 2008)

I honestly think that ATI may have the upper hand here. Not right now - the benchmarks show that - but in releasing the x2 card.

Right now what is happening is that a small percentage of the market are buying the x2. The people buying it aren't worried that the quad-fire thing doesn't work yet, and they are also happy to put up with the niggles and quirks of a card that isn't quite perfect. Now I'm no electronic engineer, but I am a mechanical one, and it seems to me that the issue with dual GPUs on a single board will lie in how the GPUs talk to each other - the bit I'm talking about is that chip in the middle of the x2. By simply getting the x2 technology into the public market they are getting it tested, with all sorts of errors thrown up and dealt with. Then along comes a nice new dual-core processor designed to work with the current system, with a bridge that works perfectly, and soon you've got a card which has had most of its refinement done long before the card existed....

just a theory mind...

JAmes.


----------



## snuif09 (Feb 6, 2008)

trog100 said:


> ati are the innovators.. they are the ones that come up with new ideas..  one day a new idea might actually win the race.. for the last two or three years it hasnt.. but who knows.. ???
> 
> basically green simply watches what red are up to then simply using more brute force grunt but no new ideas beats it.. he he he
> 
> ...



+6749689386489439

thats exactly what i couldnt come up with.
ty mate, cause thats just how it goes in the vga world


----------



## gOJDO (Feb 6, 2008)

This thread is funny. 
The_INQ and FUDzilla are spreading only rumors, most of them proven to be total BS. For example, the Reverse Hyper Threading, the outstanding K10 scores in synthetic benchmarks, the 3GHz K10, etc. etc. These people are noobz about hardware. They are just writing attractive stories for the masses, thus making traffic (read: money) for their web sites.
ATi have nothing scary for nVidia and their 3870X2 is not an answer to nVidia. Some people here were just amazed by the 3DMark scores. 3DMark is completely useless for representing real-life performance. Somebody running high-end graphics card(s) would like to have high-end graphics. Just check DX10 gaming performance with AA and AF turned on: not only the 8800Ultra but also the 8800GTX beats the 3870X2. Just look at the min. FPS of the 3870X2 - it's unacceptable for a high-end graphics card.

About the price: the 3870X2 is cheaper than the 8800Ultra, but requires a much more expensive PSU and a large case to fit in. 
Bottom line, the 3870X2 is a good competitor to nVidia's high-end offerings. Competition brings faster and cheaper products.


----------



## Darknova (Feb 6, 2008)

Who had the better product last gen (DX9)? - ATi with the X1950 series.

Now nvidia has the 8800 which is a better product.

AMD had the lead in the CPU wars with the Athlon 64 until Intel brought out the Core 2 Duo.

Same thing will always happen.


----------



## btarunr (Feb 6, 2008)

One thing that both NVidia and ATI certainly succeed at in equal proportions is spreading hatred among tech-forum members: people buy their stuff, take sides, and use their cards to pull down each other's underwear. :shadedshu

Make love (to your partners) not war (to fellow-forum-members).


----------



## bud951 (Feb 6, 2008)

I am a fan of both companies. I currently own 2 x 8800GT's in SLI and it smokes! Except for Crysis, of course. I read an article linked from this site about a week ago and the X2 did not really impress me in their benchmarks - the 8800GT/GTS in SLI killed a single X2. I will go with whoever builds the better card for real-world gaming. Right now I think it's the 8800 in SLI, but then it will be the X2 in CF, and then the GX2 in quad SLI. I am praying for the day when a single card can play ANY game at high res and not actually raise the temp of the room by 5+ degrees. Yeah, my 8800GT SLI combo does that!


----------



## cdawall (Feb 6, 2008)

CH@NO said:


> mmmmm, that's really nice, but the 8800 ULTRA isn't supposed to fight an X2. It's single-cored and from the "past generation". When Nvidia launches the 9800GX2, put them to trade blows and then post a real comparison.







largon said:


> Oh my dear _god_!
> 
> An almost 1½ year old design loses to a two-generations-newer, two-fab-nodes-smaller chip - sorry - _dual chip solution_ in a synthetic benchmark! What's even more impressive is that in games the HD3870X2 manages to beat the 8800Ultra (again, a 1½ year old design) by a whopping 5% while consuming 50W more power!
> 
> ...



answer to both of you

and yet the 8800 ultra costs 33% more and performs 5% worse. way to be a fanboy


----------



## candle_86 (Feb 6, 2008)

im not worried a bit, Nvidia likes to play headgames with everyone, it's their thing


----------



## Tatty_One (Feb 6, 2008)

cdawall said:


> hahaha look at the charts on hwbot the 8800ultra is killed by the 3870 X2
> 
> http://hwbot.org/hardware.compare.do?type=gpu&id=1043_1&id=1236_1&id=1255_1&id=1183_1&id=1279_1
> 
> ...



Lol, and the 3870x2 gets beaten in a couple of benches by the lowly single 8800GTS 512Mb


----------



## candle_86 (Feb 6, 2008)

the fact is, anyone with an 8800GTS512, GTX or Ultra will not upgrade to this as 10% at best isnt worth the money

and to quote the ATI fanboi's from yesteryear the only way ATI managed to catch Nvidia is to put two GPU's on one card, its not a true single card solution.


----------



## Ravenas (Feb 6, 2008)

candle_86 said:


> the fact is, anyone with an 8800GTS512, GTX or Ultra will not upgrade to this as 10% at best isnt worth the money
> 
> and to quote the ATI fanboi's from yesteryear the only way ATI managed to catch Nvidia is to put two GPU's on one card, its not a true single card solution.



I disagree, I think the future of video cards will be multiple GPUs or multi-threaded gpus.


----------



## candle_86 (Feb 6, 2008)

i agree but i couldn't help myself, thats all the ATI camp screamed at the 7950GX2 back in 06 if you remember.


----------



## imperialreign (Feb 6, 2008)

It's this kind of response from nVidia whenever they start to feel threatened (whether they really need be or not) that leaves little to be admired.

So, we all see how ATI can couple two R680 GPUs on one card, and throw it in their face . . . nVidia need not worry, what with their dual-PCB card scheduled for release, and the G92 scheduled for this year . . .

unless they're stopping to look at the fact that ATI's R700 GPU is scheduled for release late this year, and it's rumored to be a dual-core GPU - keep in mind, though, we've seen very few "info leaks" on this new proc; ATI seems to be keeping very, very tight-lipped - and if ATI gets their quad setup perfectly polished, really . . . how hard would it be for them to slap two R700's on one PCB at that point?  They've proven to the world that two R680's work great together, so what's to stop them from a quad-GPU board?

That, I think, is what _really_ is itching nVidia's crotch right now: the 3870x2 is doing a lot better than they expected, and although they know their G92 should put that card in its place, I don't think they're prepared to tackle ATI in a "more cores than you really need" approach.

Besides, that letter was way below the belt on nVidia's part.  If they keep up with actions like that, they could very well find themselves under the microscope by some 3rd party, like Intel is right now . . .



			
candle_86 said:

> the fact is, anyone with an 8800GTS512, GTX or Ultra will not upgrade to this as 10% at best isnt worth the money
> 
> and to quote the ATI fanboi's from yesteryear the only way ATI managed to catch Nvidia is to put two GPU's on one card, its not a true single card solution.



I disagree - ATI could have 8 cores on one PCB and it's still a true _single_ card solution.  Now, if there were more than one PCB in the setup (like what nVidia has developed), then I'd say no, it's not a true _single_ card solution.


----------



## das müffin mann (Feb 6, 2008)

i have to go with imperial on this one


----------



## jamupnorth (Feb 6, 2008)

Competition is Good.

We need competing companies or we will get a crap deal!

They are both good.


----------



## [I.R.A]_FBi (Feb 6, 2008)

candle_86 said:


> the fact is, anyone with an 8800GTS512, GTX or Ultra will not upgrade to this as 10% at best isnt worth the money
> 
> and to quote the ATI fanboi's from yesteryear the only way ATI managed to catch Nvidia is to put two GPU's on one card, *its not a true single card solution.*




so when u install an x2 .. how many cards do u install?


----------



## das müffin mann (Feb 6, 2008)

they both make killer cards, and while they are at each other's throats / scared of each other, it will cause them to be more innovative and drive prices down, which is good


----------



## [I.R.A]_FBi (Feb 6, 2008)

and on top of all this, R700 is 6 months ahead ...


----------



## AphexDreamer (Feb 6, 2008)

das müffin mann said:


> they both make killer cards, and while they are at each others throats/scared of each other it will cause them to be more innovative and drive prices down which is good



....for us. lol indeed.


----------



## das müffin mann (Feb 6, 2008)

AphexDreamer said:


> ....for us. lol indeed.



ill drink to that


----------



## jamupnorth (Feb 6, 2008)

AphexDreamer said:


> ....for us. lol indeed.



Or is it just a big cartel ? lol..


----------



## Bytor (Feb 6, 2008)

As Choppy stated.. we need both these companies fighting for our business with better and better products. It's WE the end users that win in the end......


----------



## WarEagleAU (Feb 6, 2008)

Nvidia has the high-end crown, but they don't have it totally bug-free, and it definitely wasn't easy going with their first cards. ATI will make a high-end card, but in the meantime I see no need for it with their admirably performing parts.


----------



## erocker (Feb 6, 2008)

I would love a 3rd or 4th party to join the mix.  For both GPU's and CPU's.  Then we would see some real competition.  And some blazing fast stuff!


----------



## Bytor (Feb 6, 2008)

3dfx Ftw


----------



## Hawk1 (Feb 6, 2008)

erocker said:


> I would love a 3rd or 4th party to join the mix.  For both GPU's and CPU's.  Then we would see some real competition.  And some blazing fast stuff!



Larrabee will grant you your wish (for GPU's anyway)


----------



## flashstar (Feb 6, 2008)

The key to the next year is for ATI to get the R700 out as quickly as possible. If they can cut the 7-month wait down to 4-5 months, Nvidia will have absolutely nothing to respond with immediately, at which point Nvidia would have to lower prices or get their own product out faster.


----------



## das müffin mann (Feb 6, 2008)

the triumphant return of 3dfx


----------



## gOJDO (Feb 6, 2008)

Bytor said:


> 3dfx Ftw


$10, I wonder how much it cost when it was released.


----------



## AphexDreamer (Feb 6, 2008)

This kind of makes me think... 3DFX had to resort to two cores on one PCB to catch up with its competition, and it later just disappeared. Kind of makes me wonder whether AMD might end up with the same fate as 3DFX.


----------



## panchoman (Feb 6, 2008)

Nvidia definitely seems to be shitting their pants and bending truths so investors will keep investing in them. truth is, amd is gonna come out on top for a while. the gtx/ultra might beat the x2 by 5%, but the x2 costs like 30 percent less.. which one do you pick?


----------



## DarkMatter (Feb 6, 2008)

Hawk1 said:


> Larrabee will grant you your wish (for GPU's anyway)



In order to succeed with Larrabee, Intel needs to change the current gaming graphics model (rasterization) to one based on raytracing. Otherwise they won't be able to compete with Ati or Nvidia. My 2 cents.


----------



## warhammer (Feb 7, 2008)

The only thing ATI and NVIDIA care about is the bottom line for their shareholders.
Everything else takes second place. It's a marketing game they play: if they can convince the consumer that their product is the best that money can buy, it makes their shareholders happy.
And let's hope that ATI's R700 delivers this time around, not like the R600 debacle earlier this year.

We really need a third player.


----------



## candle_86 (Feb 7, 2008)

Well, S3 and XGI left the market again, so no 3rd party.


As for best IQ, it belongs to Matrox.


----------



## gOJDO (Feb 7, 2008)

panchoman said:


> which one do you pick?


None. An 8800GT 512MB, an 8800GTS G92 512MB, a pair of 3850's, or a pair of 8800GT's.


----------



## EastCoasthandle (Feb 7, 2008)

Let's face it people, Nvidia hasn't been challenged in the high-end market since the GTX's release.  Now that the GTX and Ultra are being challenged, we as consumers are benefiting from the competition with lower prices (at least on ATI's side of things).  The real high-end competition starts with the R700 vs GT200.  In all honesty, the first one to market may in fact win favor with consumers regardless of what the other is capable of.  Only the dedicated fans will wait and see what the other is capable of.  You can see this in those buying up the X2's right now.


----------



## imperialreign (Feb 7, 2008)

warhammer said:


> The only thing ATI and NVIDIA care about is the bottom line to their share holders.
> Everything else takes second place, its a marketing game they play if they can convince the consumer that there product is the best that money can buy, well then it makes their share holders happy.
> And lets hope that ATI R700 delivers this time around. Not like the R600 debacle early this year.
> 
> we realy need a third player



not sure about all that - ATI has always been more open to what the consumer wants and cares about than nVidia.  ATI has almost always been second best to nVidia, and they know it - there have only been a couple of times they've released a product that has slapped nVidia back into gear.  ATI just never had the kind of resources nVidia has, and now that AMD is looking over their shoulder, they're being even more careful how they spend their money.  They can't afford to just ignore their customers in search of the bottom line - they need to build up trust and confidence and make an impression on new customers, meanwhile maintaining their strong customer support base . . . it's a balancing act, and they can't afford any major slip-ups.




			
AphexDreamer said:

> This kind of makes me think... If 3DFX had to resort to two core on one PCB to catch up with its competition and it later just disappeared. Kind of makes me assume that AMD might end up with the same fate as 3DFX.



I don't really think it's going that way yet - don't forget, even ATI has done two procs on one board before . . . but it was during the whirlwind 3D accelerator firestorm, and it was quickly dismissed amongst 3DFX's and nVidia's lineups at the time.  The difference here, though, is that 3DFX made some really bull-headed business moves that dug their hole and buried them.  A struggling company can do alright in a niche market, as long as you have other business there to invest in and support you - once you burn your bridges, though, and your niche is dying, you're done.  ATI hasn't severed any ties yet, quite the contrary if you consider all that AMD has been partnering with recently . . . and although ATI's IQ superiority has been matched by nVidia within the last couple of years, ATI still has a slowly growing loyal base that has been very happy with their impressive support, stable hardware, stable drivers, and performance that is still quite relevant and impressive when you look at it from a "sit down and play it" viewpoint, instead of the FRAPS and 3DMark06 benchmarks.


----------



## EastCoasthandle (Feb 7, 2008)

This will create lower prices for high end video cards.  I honestly don't see a problem here. Can we see the day when a high end video card doesn't cost $600+?


----------



## Hawk1 (Feb 7, 2008)

EastCoasthandle said:


> Can we see the day when a high end video card doesn't cost $600+?



No. It's like asking will they make a cheap Ferrari or Rolex watch. It's the class system. People who have the money will pay just 'cause they can, and those who don't and want it bad enough, will hock/go into debt to get one so they can keep up with the Jones' (in the geek world that is). It's the way it is, and the companies know it.


----------



## EastCoasthandle (Feb 7, 2008)

Hawk1 said:


> No. It's like asking will they make a cheap Ferrari or Rolex watch. It's the class system. People who have the money will pay just 'cause they can, and those who don't and want it bad enough, will hock/go into debt to get one so they can keep up with the Jones' (in the geek world that is). It's the way it is, and the companies know it.



Wrong. Once we see the return of competition in the high-end segment, those prices will be a thing of the past.  It was seen prior to the G80 and it will be seen again.
The "class" system is nonsense coming from you.  If it takes owning a particular product to justify your PC's worth, then we are no longer discussing video cards, so let's stay on topic.


----------



## Hawk1 (Feb 7, 2008)

EastCoasthandle said:


> Wrong, once we see the return of competition in the high end segment those prices will be the thing of the past.  It was seen prior to the G80 and it will be seen again.



Well, I'd love you to be right, but I don't see it happening anytime soon. If ATI gets their R700 out as expected, and it lives up to the hype (fingers crossed), it will be well above $600 (and probably above $700). Same goes for the G100/GT200/whatever they're calling it for Nvidia. But, again, I would LOVE for you to be right.


----------



## ChillyMyst (Feb 7, 2008)

BullGod said:


> Anyways I don't know how people can read or trust such a shitty site like that one. Here is an excerpt from one of their articles: "THE LATEST DRIVER out from the guys in Satan Clara, dubbed ForceWare"
> 
> I mean come on, how fucking lame is that? Nvidia is not the devil. WTF??? How more biased can you get? And I am supposed to believe anything those guys are saying about Nvidia. As far as I know they made that email up.



Nvidia does this stuff when they get worried about competition. Check back when the Kyro II was out: it was faster in a decent system than the GF2 GTS, a card that was 2x the price or more. I had both; in fact I still have the GF2 GTS (got it back in a trade from the buddy who bought it off me years back). The Kyro II I had got killed by a mobo that over-volted it.

Both cards were good, but with a decent chip behind it the Kyro II was faster despite lacking a T&L engine.

Well, after the Kyro II and TBR got good press, Nvidia sent out a letter to a bunch of people showing why TBR was/is bad, despite it being a better design than what Nvidia or ATI were/are using.



Tatty_One said:


> Lol, and the 3870x2 gets beaten in a couple of benches by the lowly single 8800GTS 512Mb


Early drivers; wait and see once they update the drivers and add new profiles.




AphexDreamer said:


> This kind of makes me think... If 3DFX had to resort to two core on one PCB to catch up with its competition and it later just disappeared. Kind of makes me assume that AMD might end up with the same fate as 3DFX.




3dfx died not because of the 2-chip solution but because they tried to cut out all their old partners by only making their own cards. 3dfx would likely still be around had they kept using their original method of selling partners chips instead of making their own cards.

Also, 3dfx was arrogant: they insisted for the longest time that there was no need for 32-bit colour in gfx cards, that their dithered 16-bit was just as good (it wasn't!!!). Everybody else was putting out cards that could do 32-bit: the TNT could do 32-bit in games, the Rage 128 was NATIVE 32-bit colour, yes NATIVE. The 2k/NT drivers sucked for the Rage 128 cards, though.....



imperialreign said:


> I don't really think it's going that way yet - don't forget, even ATI has done two procs on one board before . . . but it was during the whirlwind 3D Accelerator firestorm, and it was quickly dismissed amoungst 3DFX's and nVidia's lineup at the time.  Difference here, though, is that 3DFX made some really bull-headed business moves that dug their hole and buried them.  A struggling company can do alright in a niche market, as long as you have other business their to invest and support you - once you burn your bridges, though, and your niche is dying, you're done.  ATI hasn't severed any ties yet, quite the contrary if you consider what all AMD has been partnering with recently . . . and although ATI's IQ superiority has been matched by nVidia within the last couple of years, ATI still has a slowly growing loyal base that have been very happy with their impressive support, stable hardware, stable drivers, and performance that is still quite relevant and impressive when you look at it from a "sit down and play it" viewpoint, instead of the FRAPS and 3m06 benchmarks.



Actually, it's been a few times that ATI has done dual chips on a card.

The Rage 128 MAXX edition was 2 Rage 128 chips on 1 card. It never got proper driver support, though, hence it died out pretty fast.

The X1950 Pro X2 by Sapphire (Sapphire=ATI, ATI=Sapphire).

The 38*0 X2 cards.

The X1950 Pro X2 was/is still a nice card. My buddy has one; it's a great performer, and it has kept him from bothering to get an 8800GT or 2900/38*0 card because the perf boost would be so very small compared to what he already has.....


Again, this is just one more time Nvidia is bad-mouthing the competition out of fear/worry......


----------



## ShadowFold (Feb 7, 2008)

Yeah, I saw the X1950 Pro X2 for a while, but I guess it didn't sell well.


----------



## pt (Feb 7, 2008)

AphexDreamer said:


> This kind of makes me think... If 3DFX had to resort to two core on one PCB to catch up with its competition and it later just disappeared. Kind of makes me assume that AMD might end up with the same fate as 3DFX.



Nvidia did it first with the 7900 series,
and nowadays 2 cores on a PCB is a good sign, not a bad one.


----------



## asb2106 (Feb 7, 2008)

I have heard rumors that ATI's next-gen card, the R700, is gonna be a dual-core GPU.  That I could see scaring Nvidia.


----------



## ChillyMyst (Feb 7, 2008)

ShadowFold said:


> Yea I saw the X1950PRO X2 for awile but I guess it didnt sell well.



It wasn't a well-marketed/publicized card; it didn't get a lot of press at the time :/


----------



## Scrizz (Feb 7, 2008)

the 1950prox2 came out too late


----------



## imperialreign (Feb 7, 2008)

Scrizz said:


> the 1950prox2 came out too late



and coupled with poor marketing . . .

Awesome cards, though - and they're friggin' extremely hard to find nowadays.


----------



## Monkeywoman (Feb 7, 2008)

asb2106 said:


> i have heard rumors that ATI's next gen cards, the r700, are gonna be a duel core GPU.  That I could see scaring Nvidia



You mean dual GPU on one die. GPUs are already multi-core; in ATI's case they are 64 cores with 5-wide issue each, for a total of 320 processing units.
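As a back-of-the-envelope sketch of that 64 × 5 arithmetic (the 742 MHz figure is the HD 2900 XT's reference core clock, used here as an assumption; other SKUs differ):

```python
# Hypothetical sketch: peak shader throughput implied by the 64 x 5 layout.
vliw_units = 64           # shader cores, each a 5-wide VLIW unit
alus_per_unit = 5         # 5 ALUs issue together per unit
stream_processors = vliw_units * alus_per_unit
print(stream_processors)  # 320 processing units

clock_hz = 742e6          # HD 2900 XT reference clock (assumption)
flops_per_alu = 2         # one multiply-add counts as 2 FLOPs
peak_gflops = stream_processors * flops_per_alu * clock_hz / 1e9
print(round(peak_gflops, 1))  # ~475 GFLOPS peak MADD throughput
```

That ~475 GFLOPS peak is the "on paper" number; as discussed later in the thread, how much of it real code reaches depends on how well those 5-wide units can be filled.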


----------



## driver66 (Feb 7, 2008)

Monkeywoman said:


> u mean dual gpu on one die. gpus are dual core, in ati's case they are 64 core with 5 hyper-threading to total 320 processing units



?   :shadedshu


----------



## btarunr (Feb 7, 2008)

Scrizz said:


> the 1950prox2 came out too late



Have you read TPU's very own review on this card?

http://www.techpowerup.com/reviews/Sapphire/X1950_Pro_Dual


----------



## ChillyMyst (Feb 7, 2008)

I still find it funny that it keeps up with the GTX pretty damn well, and that it would still play today's games GREAT.

And I also find it funny people like newtekie saying people should just get a cheap card and wait for DX10 because DX10 cards will be so much better.

Honestly, the 1900/1950 cards were truly beastly in their day. Even now I would be using my X1900XTX if it hadn't died and forced me to RMA it and buy a replacement :/


----------



## niko084 (Feb 7, 2008)

ATI and Nvidia go back and forth just like AMD and Intel; it will flip back and forth for years to come on who is on top... That's business. It's not like either company sucks at what they do.


----------



## strick94u (Feb 7, 2008)

Still, none of these cards have a chance against what Matrox will be releasing soon.


----------



## imperialreign (Feb 7, 2008)

ChillyMyst said:


> honesly the 1900/1950 cards where truely beastly in their day even now i would be using my x1900xtx if it hadnt died and forced me to rma it and buy a replacement :/



ATI's 1900 series was the biggest bitch slap anyone has ever delivered to nVidia, and that had a lot to do with how ATI pulled it off.  The 1800 series was released and went over ho-hum: dodgy drivers, flaky performance . . . nVidia complacently kicked back grinning while still retaining their crown.  ATI was still in the midst of fine-tuning the cards, put them through another revision, slapped on some more bells and whistles behind closed doors, and 2.5 months after the 1800 release the 1900s rolled out - and what a hell of a lineup, too; IIRC, it was the 1900 CrossFire Edition, 1900 XT, 1900 AIW, 1900 XTX.  They peed all over nVidia's supremacy parade.

No one saw the 1900 series coming; ATI kept it under tight wraps at the time, as everyone's focus was on the 1800 series.


The 1950s are still quite capable of handling new titles without any problems (unless DX10 is your thing).  I don't plan on giving my two up until I have to.


----------



## ChillyMyst (Feb 7, 2008)

ROFL, Matrox is good only for a VERY VERY small niche market. They never were good at 3D; their name recognition by people of my era is from the Matrox Millennium and Mystique that worked so well paired with a Voodoo card.

For gaming I don't ever see Matrox becoming a player. They abandoned that market long, long ago (though they shouldn't have; they should have just hired better driver programmers and pushed out newer cards faster....)

The companies I miss are companies like:

Oak Technology: they had a killer run over the years of ISA/VESA/PCI. They even had a first-gen single-chip 3D accelerator, but it never officially went on the market (but I know 2 ppl that had them; they were great for their day, a true OGL ICD, not a mini ICD!!!)

Rendition: made the Verite 2100/2200 chips that kicked all kinds of ass, and had stunning D3D and OGL drivers for their day, as well as a good "native" API similar to Glide.

Those 2 companies were da bomb back in the day to geeky techs like me. Trident was good stuff too, but not for gaming; they never got out a truly good gaming chip, though they were working hard on one from all reports.

My top company I wish were still in the GPU/GFX market is PowerVR. The PowerVR PVR2 and Kyro II cards ROCKED. Too bad they lost their chip maker (ST sold their gfx chip division); that sucked. I would love to see a DX10.1 tiler on the market; it could probably stick with DDR3 for a couple of years to come on high-end cards due to the bandwidth savings tilers exhibit.


S3/XGI made decent hardware, but the software side sucked horribly.

I need sleep, very tired.......very very tired in fact....


----------



## ChillyMyst (Feb 7, 2008)

imperialreign said:


> ATI's 1900 series were the biggest bitch slap anyone has ever delivered to nVidia; and that had a lot to do with how ATI pulled it off.  The 1800 series was released, and went over ho-hum, dodgy drivers, flaky performance . . . nVidia complacently kicked back grinning while still retaining their crown.  ATI was still in the midst of fine tuning the cards, and put them through another revision and slapped on some more bells and whisles behind closed doors and 2.5 months after the 1800 release, the 1900s rolled out - and what a hell of a lineup, too; IIRC, it was the 1900 Crossfire Editions, 1900 XT, 1900 AIW, 1900 XTX.  They peed all over nVidia's supremacy parade.
> 
> No one saw the 1900 series coming, ATI kept it under tight wraps at the time, as everyone's focus was on the 1800 series.
> 
> ...



Exactly. ATI's designs are modular; they effectively took the 1800 and tacked on more shader units. Very effective, and also a lot easier than redesigning the core from scratch.

Nvidia's 6 and 7 cards are effectively just variations on a theme: they tweaked them a bit, clocked them higher, changed the numbers of pipes and such, but nothing really was redesigned. Hence the whole 6 and 7 line has serious limitations, like FP16 HDR + AA being IMPOSSIBLE because the same part of the chip is used to do both jobs..... whereas the 9500 and up could do it (maybe not at decent perf, but they could do it).

Well, I gotta get some sleep, I'm nodding off as I type this. Nvidia needs to stop just optimizing for top games and fix the little side bugs in their drivers. My 8800GT is nice, but some little things annoy the hell out of me :/


----------



## imperialreign (Feb 7, 2008)

I remember some of the other brands back during the 3D accelerator wars were good - that was during a time when Matrox was on par with ATI, nVidia and 3DFX.  Their Mystique card really took everyone by surprise.

Oak Tech was good also, but hard to come by around here.

Trident was decent, but they were more of a mid-range "you get what you pay for" kind of company.  Although, they were much better in the early 90s.

We all know what 3DFX did for everyone's nuts back then - and some of their brand licensing was really . . . odd - anyone remember Creative Labs' toss into the 3D accelerator wars with a 3DFX-based PCB?  The Banshee, IIRC.

Anyone remember the Hercules 3D Prophet cards, back when Guillemot/Hercules actually designed their own offerings?  Towards their end, though, they were putting out boards based on both ATI and nVidia chipsets.


----------



## Darknova (Feb 7, 2008)

imperialreign said:


> Anyone remember the Hercules 3D Prophet cards, back when Guillemot/Hercules actual designed their own offerings?  Towards their end, though, they were putting out boards based on both ATI and nVidia chipsets.



HAHAH, bloody hell...I forgot I had one of those...that was years ago jeez....might still have it somewhere in my junk lol.


----------



## bud951 (Feb 7, 2008)

I wish S3 would get back in the high-end market. The old Viper II (Savage 2000 chipset) was awesome compared to the 2nd-gen GeForce and Radeon cards of the day. The software was abysmal though. A few 3rd-party geeks did come out with some improvements that helped me keep the card well past its lifespan. Ahh.. the days of UT running in S3 MeTaL.


----------



## candle_86 (Feb 7, 2008)

ChillyMyst said:


> exectly, ati's designs are modular, they effectivly took the 1800 and tacked on more shader units, very effective and also alot easyer then redesigning the core from scratch.
> 
> nvidia's 6 and 7 cards are effectivly just designs on a theme, they tweaked them a bit, clocked them higher, changed numbers of pipes and such, but nothing really was redesigned, hence the whole 6 and 7 line has seirous limmitations, like fp16 hdr+AA being IMPOSSABLE because the same part of the chip is used to do both jobs.....where on the other hand the 9500 and up could do it (maby not at decent perf but they could do it)
> 
> well i gotta get some sleep, im nodding off as  i type this, nvidia needs to stop just optimizing for top games and fix the little side buggsin their drivers, my 8800gt is nice but some little things annoy the hell out of me :/



Only in Half-Life 2 can R300 and R400 cards do HDR+AA; heck, only in HL2 can they do HDR. What's even funnier is that so can Nvidia cards, if you force them to use the SM2.0 instead of the SM3.0 render path. It's because the HL2 engine doesn't use FP16 HDR; it uses integer-based HDR, which is a lot lower quality.


----------



## largon (Feb 7, 2008)

imperialreign said:


> The 1800 series was released, (...)  ATI was still in the midst of fine tuning the cards, and put them through another revision and slapped on some more bells and whisles behind closed doors and 2.5 months after the 1800 release, the 1900s rolled out (...)


You didn't mention that R520 was delayed 6 months from the originally planned release date (Q2 '05) due to a design flaw and severe scaling problems, and that R580 actually was released as planned from the beginning. And R580 isn't just "another revision" of R520 which ATi conjured on a whim to pull off some kind of stunt, but an evolutionary derivative planned from the conception of the R500 architecture. 
In my opinion the X1800 failed miserably, but the X1900 was great. But certainly, ATi could've scrapped R520 and hurried R580 when it became obvious R520 didn't deliver; instead, they "double-charged" the customers.


----------



## Tatty_One (Feb 7, 2008)

[I.R.A]_FBi said:


> do u think nvidia tooka $$$ hit with the 8800 series?



Hell no. I will have a dig around to find a magazine piece I read just last month, the 8 series versus the 2 series in 2007; I think Nvidia had 68% of the market share.


----------



## Tatty_One (Feb 7, 2008)

trog100 said:


> ati are the innovators.. they are the ones that come up with new ideas..  one day a new idea might actually win the race.. for the last two or three years it hasnt.. but who knows.. ???
> 
> basically green simply watches what red are up to then simply using more brute force grunt but no new ideas beats it.. he he he
> 
> ...




So the fact that NVidia brought out the 8 series a full 6 months before ATi released the 2900 series suggests to you that NVidia are using ATi's innovative development & technology? And as they had the first DX10 card, they are not the innovators, despite the fact that the 2 architectures are about as far apart as my toilet and Pluto?  On top of that, would it be too obvious to think that one of the reasons ATi delayed the release of the 2900 is that they studied the 8 series, realised what they had to offer might struggle, and needed to add some horsepower to their product?  Just a little speculation there.


----------



## Random Murderer (Feb 7, 2008)

For once, I actually agree with The Inquirer. I'm not trying to sound like a fanboy here, but that email was so riddled with blatant disinformation and stupidity that it makes me sick.


----------



## gOJDO (Feb 7, 2008)

Tatty_One said:


> So the fact that NVidia brought out the 8 series a full 6 months before ATi released the 2900 series suggests to you that NVidia are using ATi's innovative development & technology and as they had the first DX10 card they are not the innovators? despite the fact that the 2 architectures are about as far apart as my toilet and Pluto?  On top of that, would it be too obvious to think that one of the reasons ATi delayed the release of the 2900 is because they studied the 8 series and realised what they had to offer might struggle and they need to add some horsepower to their product?  Just a little speculation there.


And what did ATi bring 6 months after the 8800? A card that wastes more energy, produces more heat and sux at DX10, AA and AF. It not only got beaten by the cheaper 8800GTS (let alone the 8800GTX and 8800Ultra), but in many cases it got beaten by their own X1950XTX. WOW, what an innovation.


----------



## DarkMatter (Feb 7, 2008)

Tatty_One said:


> So the fact that NVidia brought out the 8 series a full 6 months before ATi released the 2900 series suggests to you that NVidia are using ATi's innovative development & technology and as they had the first DX10 card they are not the innovators? despite the fact that the 2 architectures are about as far apart as my toilet and Pluto?  On top of that, would it be too obvious to think that one of the reasons ATi delayed the release of the 2900 is because they studied the 8 series and realised what they had to offer might struggle and they need to add some horsepower to their product?  Just a little speculation there.



Haha! True. I'm tired of the whole "ATI innovates and Nvidia copies them" claims (where's the proof?). For once I would like to see any evidence of this, in the shape of a link where they explain how this feature or that architecture has been copied. Seriously, I don't rule out my own incompetence at searching for info about this, but I never found anything like that. I would like to know what these ATI guys are always talking about, since I have had 3 ATI cards and never found anything different.

And *trog*, ATI brought out the HD2000 series, not the HD3000; it was AMD who made the HD3000 series. I mean, the only difference between those two is that the HD3000 series got a die shrink and features the power-saving techniques developed by Transmeta and used by AMD. Where do you see the innovation in the hands of ATI there? Nowhere to be seen in that example you give. Now, if you call using smaller processes and Transmeta patents innovation, I guess the most innovative company in the world is Intel.

No, this is not greens against reds, it's greens (Nvidia) against...
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
...greens? (Amd)


----------



## Tatty_One (Feb 7, 2008)

gOJDO said:


> And what ATi brought 6 months after 8800? A card that wastes more energy, produces more heat and sux at DX10, AA and AF. It not only got beaten by the cheaper 8800GTS(let alone 8800GTX and 8800Ultra), but in many cases it got beaten by their 1950XTX. WOW, what a innovation.



Yes, but to be fair, the 2900XT was a damn good card for the price once the drivers underwent some development. And again, to be fair in this discussion and not fanboyistic, neither camp has always got it right straight from the off. Even though I quoted the G80 release etc., just ask Vista owners how frustrated they were with the early driver support for the 8800 series.


----------



## trog100 (Feb 7, 2008)

It was said the 2900 series failed because it was too innovative: games writers didn't bother to use the card's abilities, and this suited the more traditional Nvidia approach better..

As for innovation, it's just a word.. but the future does seem to be about more energy-efficient chips as opposed to brute force.. I think ATI at present have the edge here..

I try not to be a fanboy.. but I admit to a certain bias.. he he

I reckon it goes back to my dustbuster vs 9700 days..

trog


----------



## candle_86 (Feb 7, 2008)

I honestly think AMD hasn't got a prayer in the world.


----------



## das müffin mann (Feb 7, 2008)

candle_86 said:


> i honestly think AMD hasnt got a prayer in the world.



After some of the stuff they have been releasing over the last few years, I have to greatly disagree.


----------



## pt (Feb 7, 2008)

candle_86 said:


> i honestly think AMD hasnt got a prayer in the world.



i'm praying for them


----------



## warhammer (Feb 7, 2008)

trog100 said:


> it was said the 2900 series failed because it was too innovative.. games writers didnt bother to use the cards abilities and this suited the more traditional nvidia approach better..
> 
> as for innovation.. its just a word.. but the future does seem to be about more energy efficiency chips as opposed to brute force.. i think ati at present have the edge here..
> 
> ...




The games were out before the 2900, like Far Cry: oops, our card doesn't work on that game, it failed to run. Let's see, Half-Life 2: Valve codes their games for ATI, and the 2900 still didn't perform well.

And rumour is NVIDIA is having talks and negotiating with them, so HL3 will be powered by NVIDIA.


----------



## das müffin mann (Feb 7, 2008)

All that "it runs better on [insert name of product here]" stuff is BS anyway.


----------



## vexen (Feb 7, 2008)

AMD is not in that much trouble.

They power the Wii with the ATI Hollywood.
They power the Xbox 360 with the ATI Xenos.
Nvidia powers the PS3 with the RSX.

They beat Nvidia on price/performance. They were having trouble at the higher end, but are now back up with the 3870 X2; does everyone have a top-of-the-line GPU?

What more can you ask?


----------



## largon (Feb 7, 2008)

vexen said:

> What more can you ask?


It would be nice if ATi broke even, let alone made a profit, for the first time since AMD bought it. But of course, it doesn't really matter much, as the total net loss of AMD's graphics division (ATi) was ~$120 million while the whole company lost ~$3,400 million during 2007.


----------



## asb2106 (Feb 7, 2008)

Monkeywoman said:


> u mean dual gpu on one die. gpus are dual core, in ati's case they are 64 core with 5 hyper-threading to total 320 processing units



Wow, I didn't know that. I was completely unaware of the build of a GPU; anyone know of any literature I can read to brush up on this?


----------



## snuif09 (Feb 7, 2008)

I'm going to become a 3D level modeller, and I swear that I will use ATI only.
(I have an Nvidia card now, but I like ATI more than Nvidia.)


----------



## JrRacinFan (Feb 7, 2008)

cdawall said:


> hahaha look at the charts on hwbot the 8800ultra is killed by the 3870 X2
> 
> http://hwbot.org/hardware.compare.do?type=gpu&id=1043_1&id=1236_1&id=1255_1&id=1183_1&id=1279_1
> 
> ...



Yes but why buy an Ultra when you can get a GTX and overclock it to Ultra speeds? 

Still either way, the X2 is one heck of a card!


----------



## asb2106 (Feb 7, 2008)

JrRacinFan said:


> Yes but why buy an Ultra when you can get a GTX and overclock it to Ultra speeds?
> 
> Still either way, the X2 is one heck of a card!



I do agree, and a thing you have to look at is price; nobody talks about that. An Ultra doesn't "kill" an X2; it beats it in a few areas, sure, but maybe drivers can fix that. 

But when you can get an X2 for 450 and an Ultra for 600, hmm, not a tough choice to me...

And on top of that, you can overclock a GTS G92 to outperform an Ultra, easily, so a G92 is in that mix too.


----------



## largon (Feb 7, 2008)

Indeed, and 8800GTXs go for silly pennies these days; a GTX is actually ~10-15% cheaper than a HD3870X2 at the moment.


----------



## Scrizz (Feb 7, 2008)

btarunr said:


> Have you read TPU's very own review on this card?
> 
> http://www.techpowerup.com/reviews/Sapphire/X1950_Pro_Dual



Yes, I already had. Did you see the date on it?
The 8800 series was already out....


----------



## DarkMatter (Feb 7, 2008)

trog100 said:


> it was said the 2900 series failed because it was too innovative.. games writers didnt bother to use the cards abilities and this suited the more traditional nvidia approach better..
> 
> as for innovation.. its just a word.. but the future does seem to be about more energy efficiency chips as opposed to brute force.. i think ati at present have the edge here..
> 
> ...



Not exactly true. It failed to compete for the performance crown because it relies on VLIW instructions to achieve its high peak computational power. Unlike scalar or superscalar designs, where the hardware issues instructions to available processing units on a first-come, first-served basis, a VLIW design must bundle an instruction for every unit, and it does this at the software level. Example:
2 chips, one scalar and one VLIW. Each chip can calculate 5 operations in a given time, let's call it T. We have to process the following program:

1. a += 3
2. b -= 2
3. c = a + b
4. d = a - b
5. a = c + d

Fully SCALAR approach: easy, the chip reads instructions as they come and calculates them in order, so the steps the chip follows are the same as in the program. Step 2 can't be calculated until step 1 has finished, and so on. Each step lasts T/5, for a total of T for the 5 steps.

VLIW: The code that reaches the chip must be "tied together" to form a long instruction. The ideal, using the total power of the chip, would be something like this:

step 1. [ALU1]-> a += 3  [ALU2]-> b -= 2  [ALU3]-> c = a + b  [ALU4]-> d = a - b  [ALU5]-> a = c + d

In this ideal case, step 1 lasts T and constitutes the whole program. But as you can see, steps 3 and 4 depend on 1 and 2, and 5 depends on 3 and 4, meaning that said instruction is impossible. The only thing you can do is:

step 1. [ALU1]-> a=a+3 [ALU2]-> b=b-2
step 2. [ALU3]-> c=a+b [ALU4]-> d=a-b
step 3. [ALU5]-> a=c+d

The program finishes in 3T, three times the scalar chip's time, so the VLIW chip would need three times the theoretical power to run this program in the same time. This particular example is not the best-case scenario, but neither is it the worst. Taking the program and converting it into these steps (VLIW instructions) is the job of the drivers, but as we can see from the example, even with that job well done, the chip needs 3T to complete the task. This means that even with perfect drivers the architecture has this flaw.
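The bundling constraint above can be sketched with a tiny greedy list scheduler. This is only an illustration of the idea, not how Ati's actual compiler works; the instruction names and dependency sets simply mirror the 5-line program above.

```python
# Greedy list scheduler: pack instructions into VLIW bundles of a given width.
# An instruction may only enter a bundle once all of its dependencies have
# been issued in EARLIER bundles, which is exactly why steps 3-5 above
# cannot share a bundle with steps 1-2.

def schedule_vliw(instructions, width=5):
    """instructions: list of (name, set_of_dependency_names) in program order."""
    done = set()                      # names issued in earlier bundles
    remaining = list(instructions)
    bundles = []
    while remaining:
        # candidates whose dependencies are all already done, up to the width
        bundle = [inst for inst in remaining if inst[1] <= done][:width]
        if not bundle:
            raise ValueError("cyclic dependency in program")
        bundles.append([name for name, _ in bundle])
        done |= {name for name, _ in bundle}
        remaining = [inst for inst in remaining if inst[0] not in done]
    return bundles

# The 5-instruction program from the example above.
program = [
    ("a=a+3", set()),
    ("b=b-2", set()),
    ("c=a+b", {"a=a+3", "b=b-2"}),
    ("d=a-b", {"a=a+3", "b=b-2"}),
    ("a=c+d", {"c=a+b", "d=a-b"}),
]

print(schedule_vliw(program))
```

Even with a width of 5, the scheduler needs 3 bundles for this program, matching the 3T figure above, while a scalar pipeline simply executes the 5 instructions in order.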

The only way to fix this is for software developers to specifically target the 5-unit-wide VLIW architecture.
Is this enough to say that the problem is that developers didn't bother to use all of the potential of the HD series (i.e. program specifically for that architecture)?
Let me say A BIG NO. They have been programming the same way for years, and a hardware company expecting them to change this from one day to the next is unrealistic, not to say irresponsible. (EDIT: don't know why Sony comes to my mind )
In a way it's true that the HD series doesn't run at full potential, but the only one responsible for this is Ati. I have heard a lot of people (many times, and honestly I'm getting really tired of it) blaming developers and TWIMTBP for the fact that developers don't program specifically in a way that would let the HD series see its benefits.

And to finish: not only does VLIW not seem to be the best way to go right now (or ever), it's not even new or innovative. The GeForce FX series followed this approach, and I think 3dfx used it in their last designs too, but I'm not sure about that last one.
See the trend?

And as for power usage, I agree. Nvidia should implement something to fix that, especially since it seems the GT200 will use 250W!!!! (Erm, the HD 2900 XT uses ~235W anyway, but still.)
If Hybrid SLI works well and as expected, that could be the answer, and a lot better than AMD's current solution. I say current because they are working on the same feature, right?


----------



## Tatty_One (Feb 7, 2008)

pt said:


> i'm praying for them



Let's all pray for them!  As many of us keep saying... we want BOTH camps to be competitive; without that, it's just gonna hurt our wallets that much more!


----------



## Tatty_One (Feb 7, 2008)

asb2106 said:


> wow, I didnt know that, I was completely unaware of the build of a GPU, anyone know of any literature I can read to brush up on this?



Have a read of this first; it explains the historic approach:

http://www.cdrinfo.com/Sections/News/Details.aspx?NewsId=14458

Then have a read here.....it gets heavier!

http://insidehpc.com/2006/11/14/wha...or-is-crays-strategy-the-right-one-after-all/

And then, to sum it all up in more "layman's" terms:

http://www.legionhardware.com/document.php?id=703

Now you're an expert!


----------



## DarkMatter (Feb 7, 2008)

Tatty_One said:


> Lets all pray for them!  as many of us keep saying.....we want BOTH camps to be competative, without that it's just gonna hurt our wallets that much more!



In the end that's what we all want. That is why I kind of "attack" the losing one, and especially its supporters (fanboys) who try to disguise the fact that their side is losing. I know it's somewhat radical, but I always support the winner, which is not the same as downplaying the other. But those who choose a side will scream aloud "mine is better!" when it's better, and "it's almost the same, and cheaper" when they lose. Not that they aren't right, but it's very difficult for them to admit theirs is slower, as if by admitting it a big part of their heart would be torn apart. And I have a problem with that: what's the point of competition if we have to support both the winner and the loser of one specific battle? The war isn't over; there will be other battles. Let's call a winner a winner, and hope the loser wakes up!
I've felt the same with the GeForce FX, the Athlon XP for a short period, the Pentium 4, the Athlon 64, and look at the responses the respective companies gave us: GeForce 6, Athlon 64, Core 2... Yeah, AMD has failed to give a response as of now, but let them try hard...


----------



## DarkMatter (Feb 7, 2008)

asb2106 said:


> wow, I didnt know that, I was completely unaware of the build of a GPU, anyone know of any literature I can read to brush up on this?



I like The Tech Report for this. They include a lot of info about how GPUs work in their reviews' intro and close-up sections. They show easy block diagrams too, which helps. Here are some:

G80: 

http://techreport.com/articles.x/11211

R600:

http://techreport.com/articles.x/12458

G92:

http://techreport.com/articles.x/13479

RV670:

http://techreport.com/articles.x/13603

Enjoy!


----------



## Laurijan (Feb 7, 2008)

"Our colleague Charlie from The Inquirer" should have posted the 3DMark06 scores the 3870 X2 got in the review W1Z conducted on a HIS 3870 X2, which btw were not that sunny...

http://www.techpowerup.com/reviews/HIS/HD_3870_X2/20.html


----------



## jpierce55 (Feb 7, 2008)

I think it would be great if the companies' shares swung back and forth between 40-60% all the time; that would keep the market healthy. Or a third company. I don't want either to die, and the return of ATI is great!


----------



## asb2106 (Feb 7, 2008)

thanks all!!  Looking forward to some good readin


----------



## ChillyMyst (Feb 7, 2008)

candle_86 said:


> only in Half Life two can R300 and R400 cards do HDR+AA, heck only in HL2 can they do HDR, whats even funnier is so can Nvidia cards if you force them to use SM2.0 instead of SM3.0 render path, its because the HL2 engine doesnt use FP64 HDR, its uses FP16HDR and is alot lower quality



Partially correct, but FP16 is where Nvidia fails, not FP64.

And a few other games support SM2 HDR. It's not a matter of the hardware not supporting it, or of SM2 HDR being lower quality; it's a matter of lazy programmers just doing "bloom" in SM2 because it was easier, and then spending their time on what they got paid for: optimizing for Nvidia's SM3 cards (the way it's meant to be played).

I have seen demos of EXTREMELY high quality in SM2 and SM3 modes that looked IDENTICAL; I will have to try and find the site that hosted them.

The reason you didn't see more SM2 HDR was that Nvidia did a smart thing: they paid A LOT of companies to only support SM3 HDR. Crytek, for example, was working on HDR for SM2 cards, but they dropped it after Nvidia paid them to.

Where this turned into a fail on Nvidia's part is that the X1K line of cards took almost no performance hit from adding AA to HDR in SM3 games like Oblivion, and NO Nvidia card could do HDR+AA unless it was SM2 HDR.



> HDR and Anti-aliasing
> 
> Question
> 
> ...


Nvidia FAQ

That pretty much proves what I was saying, and also proves you're talking out of your ass about FP64 (I don't even think there is such a thing yet!!!).



Laurijan said:


> "Our colleague Charlie from The Inquirer" should have posted the 3DMark06 scores the 3780X2 had in the review which W1Z conducted on an His 3780X2 which btw where not that sunny..
> 
> http://www.techpowerup.com/reviews/HIS/HD_3870_X2/20.html



3DMark doesn't mean shit. Why do people like you always run to 3DMark?

Honestly, I have been a gamer since the ViRGE 3D days, and I have known that 3DMark was a bad way to judge the TRUE performance of a gaming system since 3DMark99 Max was out. It's SYNTHETIC!!!!!!!!!

Also, you're seeing EARLY drivers for a dual-GPU card; I'm sure that given a few revisions the drivers will improve, just like they SLOWLY did for the 8800s.


As to innovation, well, Nvidia was once VERY innovative; the Riva 128, TNT, and GeForce cards were innovations in their day, but in recent years their innovations have been... lacking.

Need I mention the GF5/FX line? It was very innovative, so innovative that it sucked total ass at its one selling point, DX9. Nvidia gambled on getting game devs to program specifically for their SHITTY design, and it failed.

The 6 and 7 cards were fixed but weren't really innovative. Sure, they had SM3, but it was useless for the most part because the SM2 performance was HORRIBLE, and then the X800 walked all over their cards in the games of the day.
When you enabled SM3 mode for things like HDR on the 6 cards, games became unplayable (Far Cry, for example) unless you planned to drop 1-2 resolution settings, if not more.

I had a 6800 GT and an X800; the X800 KILLED the 6800 GT @ Ultra in every test/bench other than the SM3 version of 3DMark that came out. In real games, Nvidia failed to impress: cost more, ran worse, had more driver bugs, and had FAR slower driver updates (ATI had moved to one driver a month; Nvidia was every few months... unless you wanted to risk leaked betas...).

I have owned both brands for years, and until the FX line Nvidia was top dog, but ATI learned. They adapted to what customers wanted, and they have been doing well ever since. Sure, they had a few blah products: the X1800 was blah, the XT was OK, but it was soundly stomped by the 1900/1950 cards that came out as an evolution of their design.
The 2900 was... well, a hot, power-hungry beast, BUT after the drivers got updated it turned out to be a decent product for the price. The 38x0 cards have turned out to be GOLD: low power use, very good performance, low heat, quite an upgrade over the older design.
Yes, they are just a small evolution of the 2900 design-wise, but they made a nice difference and allowed for dual-GPU cards!!!!

I own an X1900 XTX and an 8800 GT, and as I have said before, the 8800 GT is a nice card, but its stock fan is a loud, noisy bitch if you turn it up to a level that will keep the card from burning itself out. I would still be using my X1900 XTX if it hadn't needed to be RMA'd, and I would still be content. Sure, it wouldn't win any bench awards, but then again neither will my overclocked GT, to be honest. And I know the 3870 and 3870 X2 cards aren't gonna be best in all benches, but I still feel they are a WIN for AMD/ATI because they are selling like butter-soaked hotcakes at a fat camp!!!!!

If I could trade the 8800 GT for a good 3870 I would. You know why? Lower temps, better stock coolers, and lower power use. Oh yeah, and the drivers are better IMHO. I really dislike some of the little bugs I have found with Nvidia's drivers; some I can't talk about because I was asked to wait and see what Nvidia says about fixing them, but one I can is the YV12 color-space bug with video playback... a basic feature that's been broken since the start of last year, and Nvidia still doesn't have it properly fixed!!!


----------



## cdawall (Feb 7, 2008)

asb2106 said:


> I do agree, and a thing you have to look at is price, nobody talks about that, a Ultra doesnt "kill" a x2, it beats it in a few areas sure, but maybe drivers can fix that.
> 
> But when you can get a x2 for 450 and a ultra for 600, hmm not a tough choice to me...
> 
> And on top of that, you can overclock a GTS G92 to outperform a Ultra, easily, so a g92 is in that mix too.



There was a G92 8800 GTS 512 MB listed in my source.


----------



## asb2106 (Feb 7, 2008)

cdawall said:


> there was a g92 8800GTS 512mb listed in my source



my bad


----------



## DarkMatter (Feb 7, 2008)

ChillyMyst:
Many discrepancies there, but I won't respond to them. Only this: I own a 6800 GT and an X800 XT, and by no means does the Radeon kill the GeForce. What a lie, man; they trade blows depending on the game. Also, not a single X800 was faster than the 6800 Ultra; the X850 PE was, and the 850 XT trades blows, but the X800? Nah!!

Also, the 6800 GT ran Far Cry like a champ at any setting.

And that thing about Nvidia (or anyone else) paying developers not to implement certain features? Don't make me laugh. They don't have enough money to pay publishers to do that, because what Nvidia has available for such a thing is a lot less than what publishers would lose from the Ati users who wouldn't buy the game, not to mention the reputation they would lose. And for the same reason, if distributors knew their developers were doing that, they would put them six feet under. Hearing that anyone can think most developers are so cheap as to accept money for such a thing makes me sick.
I will add that to my "stupid things people say on forums" list.


----------



## MadCow (Feb 7, 2008)

DarkMatter said:


> ChillyMyst
> Many discrepancies there. But I won't respond to them. Only that I own a 6800 gt and a x800 xt and by no means the Radeon kills the GeForce. What a lie man, they trade blows depending on the game. Also not a single x800 was faster than 6800 ultra, x850 PE did and 850XT trades blows, but x800? Nah!!
> 
> Also the 6800 GT ran Farcry like a champ at any setting.
> ...



You think that's funny? Have you ever wondered why so many games have the nvidia logo at startup? They don't pay them to cripple ATI cards, they pay them to "optimize" their games for nvidia hardware. It's been that way for a while now. Nvidia's the bigger company, they have more money, they can afford to do that.


----------



## Tatty_One (Feb 7, 2008)

MadCow said:


> You think that's funny? Have you ever wondered why so many games have the nvidia logo at startup? They don't pay them to cripple ATI cards, they pay them to "optimize" their games for nvidia hardware. It's been that way for a while now. Nvidia's the bigger company, they have more money, they can afford to do that.



I am not disagreeing with your principle there, but is NVidia really bigger than AMD?


----------



## imperialreign (Feb 7, 2008)

Tatty_One said:


> I am not disagreeing with your principle there, however NVidia bigger than AMD?



Well... as of the last 6 months to a year (whenever the ATI/AMD merger was completed), nVidia is only larger based upon revenue.

When ATI was separate is, I think, where they were going with that argument.


----------



## warhammer (Feb 7, 2008)

I bet NVIDIA shareholders are very happy.


----------



## MadCow (Feb 7, 2008)

imperialreign said:


> well . . . as of the last 6 months - a year (ahenver the ATI/AMD merger was complete); nVidia is only large based upon revenue.
> 
> when ATI was seperate is I think where they were going with that argument



Well, you still proved my point: nvidia has a lot more money to spare, so they can bribe game companies. ATI/AMD can't do that right now.


----------



## zOaib (Feb 7, 2008)

CH@NO said:


> mmmmm, that's really nice but the 8800 ULTRA isn't supposed to fight with an X2. It's single cored and is from the "past generation", when Nvidia launches the 9800GX2 model, put them to trade blows and then post a real comparsion.



Well, you miss the point. It's cheaper, about 200 dollars less, dual-GPU, and it performs extremely well. A person spending that much money on this level of card will definitely get the Radeon instead of the Ultra. It's economics; people don't care if it uses 2 GPUs against one. PRICE is the deciding factor here. What will you choose? Or will you be like "hey, they are cheating with 2 GPUs on 1 PCB and are also charging way less... I don't like them, they are cheaters"? LOL



Don't worry, this is good for us consumers; when companies do this, we WIN, always.


----------



## wolf (Feb 7, 2008)

Whether the 8800U costs $6xx and the X2 $4xx or not, the fact remains that the G80 has been out almost a year and a half, which means nobody will buy one now over a G92 or X2. But if they've already got one, then what's the point in upgrading at all?

Not to mention I think the 384-bit bus and 768 MB of RAM will keep the G80 chopping along for a while, whereas even with dual GPUs the X2 only has 2x512 MB and 2x256-bit, so each GPU can still only address that much at once.

I believe benching an 8800U against an X2 in 1-2 years will show which is meatier, although on that note the X2 might come out ahead with its 640 combined shaders... only time will tell.

In any case, people, like I've said before, both companies rock hardcore, and even if you're a fanboy either way, you couldn't be if it weren't for the other company existing. So take the good with the bad, and don't just bag the company you don't like just because you don't like them; have a reason!


----------



## asb2106 (Feb 7, 2008)

wolf said:


> wether or not the 8800U costs $6xx and the X2 @ $4xx, the fact remains that the G80 has been out almost a year and a half, which means, that nobody will buy one now over a G92 or X2, but if theyve already got one, then whats the point in upgrading at all?
> 
> not to mention i think the 384 bit 768 megs of ram will keep the G80 chopping for a while. whereas even with dual gpu the X2 only has 2x512 and 2x256 bit, so each gpu can still only adress that much at once.
> 
> ...



Way to copy and paste, that's original. :shadedshu


----------



## wolf (Feb 7, 2008)

Totally, but seriously, I think what I wrote applies to the discussion in either thread, so those who aren't into both can still read my opinion. Yay!


----------



## ChillyMyst (Feb 7, 2008)

DarkMatter said:


> ChillyMyst
> Many discrepancies there. But I won't respond to them. Only that I own a 6800 gt and a x800 xt and by no means the Radeon kills the GeForce. What a lie man, they trade blows depending on the game. Also not a single x800 was faster than 6800 ultra, x850 PE did and 850XT trades blows, but x800? Nah!!
> 
> Also the 6800 GT ran Farcry like a champ at any setting.
> ...



X800 was referring to the whole X800 XT / XT PE line; the X850 was just a SMALL refresh of the X800 core with VERY slightly higher clocks (lower than my overclock on my X800 XT PE!!!).

And yes, publishers will disable features on, or not optimize for, other cards, because they get PAID decently to optimize for Nvidia cards; it's been happening for years.

Look up Tiger Woods golf: one of the older versions of that game only worked properly on Nvidia cards. The game limited features and even resolutions on ALL other brands, but you could trick it with a program that let you change the ID of your card under Windows; if you set it to say you had ANY Nvidia card, even a Riva 128, the game allowed full features and all available resolutions.



> The Way It's Meant To Be Played (TWIMTBP) is a program that helps game developers to optimize and incorporate exclusive features in their games and applications exclusively for NVIDIA's graphics cards. The deal also adds a splash screen to "the way it's meant to be played" games as well as branding within the game; this is widely considered as a promotion campaign for NVIDIA. This program was launched 2003 by NVIDIA, a graphics card producer. The program aims at providing the best experience possible for users of NVIDIA GeForce graphics cards, and more particularly provides extensive guidelines on game performance optimizations for the GeForce graphics cards.


http://en.wikipedia.org/wiki/The_Way_It's_Meant_to_be_Played

And that program does cover PAYING developers and publishers to make games run better on Nvidia cards, or in some cases even causing artificial performance and feature loss if your video card is not Nvidia-based.


http://en.wikipedia.org/wiki/NVIDIA


> Shortcomings of the FX series
> 
> At this point Nvidia’s market position looked unassailable, and industry observers began to refer to Nvidia as the Intel of the graphics-industry. However, their major remaining rival ATI Technologies did stay competitive due to their Radeon which was mostly on par with the GeForce 2 GTS. Though their answer to the GeForce 3, the Radeon 8500, was later and initially plagued by driver issues, the 8500 proved a superior competitor due to its lower price and greater potential. Nvidia countered ATI's offering with the GeForce 4 Ti line, though the Ti 4200's delayed rollout enabled the 8500 to carve out a niche. ATI opted to work on their next generation Radeon 9700 rather than a direct competitor to the GeForce 4 Ti.
> 
> ...



Remember, this has been edited by Nvidia (just as most large companies edit entries about themselves...).

But that shows the kind of innovation Nvidia pulls off at times. People (nvidiots) say the 1800 and 2900 were disasters; the FX line was one giant trainwreck in the middle of a typhoon with the wreck of the Exxon Valdez floating around.


Oh, and I love this section:



> GeForce 6 series and later
> 
> ...



FUD: CrossFire was available with the X800/X850 cards first, not the X1000 cards.

Part of the performance problem with the 9500/9600/9700/9800 cards was that Nvidia PAID id to use shader code that would run poorly on ATI's then-current drivers but ran better on Nvidia's. This was discovered to be true by some people who changed the shader code from one mode to another (to decimal or something).

Links to more info for the nvidiots who will deny this is true, and for those who just didn't know it in the first place: ATI fixed this with a driver update, but still, it shows how far Nvidia will go to enhance their benchmark scores in specific hot titles.

(Removed the links and replaced them with one MegaGames link and quote; this shows how Nvidia pays companies to "optimize" for their hardware.)

Also take note of the MegaGames quote on how to do this yourself:
http://www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml


> Enhance the ATI Experience
> 
> 
> It is, of course, a well known fact that Doom 3 is a game which performs best when using boards by nVidia. This has left ATI fans frustrated and eager for a driver update or some other fix. Since ATI has not yet responded, a way of improving the way Doom 3 handles on ATI cards has been posted on the Beyond3D forums. According to the author, the performance increase can increase frame rate from 34fps in 1280x1024 to 48fps. Changes would, of course, depend on each individual set-up. A further suggestion from the forum is that the fix really kicks-in if vsync is enabled. Please feel free to post your experience with the fix on the MegaGames Forums.
> ...



I did this fix, though I just downloaded the pre-fixed file because I'm lazy, lol.

Till the driver fix came out, this was the only way to get Doom 3 to run really well on ATI hardware. id would never admit it, but they know they intentionally used a bad code path for ATI hardware to give their long-time buddies at Nvidia an advantage in benchmarks, at least till ATI made a driver to address the game-specific problems in Doom 3.

And from what I remember there are different code paths for ARB1, ARB2, NV10, NV20, and other GPUs; again, Nvidia needed to use specific coding for cards like the FX line to keep them from sucking too hard.

http://en.wikipedia.org/wiki/Get_in_the_game
ATI's version of The Way It's Meant To Be Played; the difference is that ATI doesn't pay them to ruin performance on other cards, just to make it better on ATI's.


----------



## asb2106 (Feb 7, 2008)

wolf said:


> totally , but seriously i think what i wrote applies to the discussion in either thread. so for those who arent into both, they can still read my opinion, yay!



Haha, it's cool, it did apply. I just thought it was funny!


----------



## wolf (Feb 7, 2008)

I remember first copying and pasting in primary school typing class. Ctrl+C and Ctrl+V FTW; I was top of the class, and I still type with my two index fingers.


----------



## DarkMatter (Feb 8, 2008)

MadCow said:


> You think that's funny? Have you ever wondered why so many games have the nvidia logo at startup? They don't pay them to cripple ATI cards, they pay them to "optimize" their games for nvidia hardware. It's been that way for a while now. Nvidia's the bigger company, they have more money, they can afford to do that.





Sorry, but this one is aimed at you. Are you talking to me as if I knew nothing about TWIMTBP? As if I were ignorant? Sorry for how I answered, but right now I feel you deserve it.
There's a difference between optimizing for one platform and crippling the other. It makes me really angry, not because you (and many, many others) are saying that the Nvidia guys pay to cripple their competition's performance; I don't care about Nvidia in that respect. It's because at the same time you are saying that 80% of game developers are cheap whores.
Do you know anything about how much it costs to make a game? Once they have finished the game, they spend 6-9 months optimizing it. Under TWIMTBP, Nvidia sends a team that helps them with that job, and they put at the developers' disposal a lab with 500 different PC configs that Nvidia owns and created specifically for that purpose. They don't pay a single dollar! They just do a job that would otherwise cost developers a lot of man-hours, which translates into money. Just look at those games you mention and look at the thanks in the credits; there are always like 10 people from Nvidia and 1 or 2 from Ati. But of course they pay for that too... I forgot! They pay and that's all!! They pay!!! They pay!! Again: they don't pay a single dollar. It's that way; otherwise, prove it.

Nvidia's total net income for 2007 was $448 million, and the gaming industry generated almost $50 billion during the same period, 17% of it on PC: $8.5 billion. Around half of that comes from Ati/AMD owners. Do you really believe Nvidia has the money to pay (and convince) them to risk $4 billion? You're a fool if you do.
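The arithmetic in the figures above can be checked directly. The revenue numbers themselves are the poster's own claims taken at face value; this sketch just verifies the percentages.

```python
# Back-of-the-envelope check of the figures quoted above (poster's claims).
gaming_total = 50e9          # claimed total gaming industry revenue, USD
pc_share = 0.17              # claimed PC slice of that revenue

pc_revenue = gaming_total * pc_share   # ~8.5 billion USD on PC
ati_side = pc_revenue / 2              # assuming ~half of PC gamers own Ati cards

print(f"PC gaming revenue: ${pc_revenue:,.0f}")   # ~8.5 billion
print(f"Ati-owner share:   ${ati_side:,.0f}")     # the "~$4 billion" at risk
```

So the "$4 billion at risk" figure follows from the stated assumptions, against a claimed $448 million in net income.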


----------



## wolf (Feb 8, 2008)

prawned.


----------



## Hawk1 (Feb 8, 2008)

DarkMatter said:


> Sorry, but this is at you. Are you talking to me as if I knew nothing about TWIMTBP? As if I was an ignorant? Sorry for how I answered, but right now I feel you deserve it.
> There's a difference between optimize for one platform and cripple the other. It makes me really angry, not because of you (both and are many many others) are saying that Nvidia guys pay to cripple their competitions performance, I don't care about Nvidia in that respect. It's because at the same time you say that 80% of game developers are cheap whores.
> Do you know anything about how much costs making a game? Once that they have done finished the game, they spend 6-9 months optimizing the game. Under TWIMTBP Nvidia sends a team that helps them in that job and they put at their disposal the lab with 500 different PC configs they own and specificaly created for that purpose. They don't pay a single dollar! They just do the job that otherwise developers would have to spend a lot of man/time, which translates into money. Just look at those games you mention and look at the thanks in the credits, there are always like 10 people from Nvidia and 1 or 2 from Ati. But of course they pay for that too... I forgot it! They pay and that's all!! They pay!!! they pay!! they pay!!! Again, they don't pay a single dollar. It's that way, otherwise prove it.
> 
> Nvidia's total net income for 2007 was $448 million and gaming industry has generated almost $50 billion during the same time, 17% of it being in PC, $8.5 billion. Around half of that comes from Ati/AMD owners Do you really believe that Nvidia has the money to pay (and convince) them to risk $4 billions. You're a fool if you do.




And you know this how? I don't think you factually know that this happens, any more than the people stating that they are paying developers off do. So unless you can tell me you work, or have worked, for Nvidia/AMD/a game developer and know first-hand, you don't really know, and you can now get off your high horse.


----------



## DarkMatter (Feb 8, 2008)

Oh, ChillyMyst, don't bother with your lies. Wikipedia is for morons; even a monkey wouldn't believe half of what is said there. I want proof, or your mouth shut.

It's clear you don't know anything about optimization. The fact is, Doom 3 was fully optimized for Nvidia cards thanks to TWIMTBP, but it wasn't on Ati's, because Ati never bothered to create a program like that. Ati cards performed worse because of a lack of optimization, because Ati didn't help optimize as Nvidia did. On the other hand, all of Valve's games run better on Ati cards, and Oblivion, Call of Juarez... but this is because, let me guess... Ati paid Valve, Techland and Bethesda to make GeForces underperform in their games! Of course!! How could I have been so blind until now?

No, the truth is, and I know this well because I have actually programmed, that optimization can make a program run almost 200% better or worse than average, depending on whether it's done well or badly. The lack of part of that optimization accounts for those differences in actual games. Many times they fix them in patches, if they can afford to lose the time, or sometimes they are fixed in drivers. Crytek, as part of TWIMTBP, helped Nvidia with their drivers AFTER they launched the game. Why would they do that, and not further cripple Ati's performance before launch?

You see a conspiracy because you want to see one. But there isn't such a thing. I suggest you start believing in aliens instead; that would be better.
They don't pay anything, it's that simple; they work for free, which is different. Let that enter your mind, man, since it's the truth.


----------



## DarkMatter (Feb 8, 2008)

Hawk1 said:


> And you know this how? I don't think you know factually that this happens, the same way as the people stating they are paying the developers off. So unless you can tell me you work/have worked for Nvidia/AMD/a game developer and know first hand, you dont really know, and can now get off your high horse.



"Innocent until proven guilty." I think this means something, but I can't remember quite what right now... Let's see... Ermmm, what does that mean...

I'm being sarcastic. It's not me, or Nvidia in this case, who needs to prove innocence; it's the naysayers who have to demonstrate their BS. I know all this because I have read it from many respectable developers and journalists. Can you say the same about your BS? Tell me.


----------



## Hawk1 (Feb 8, 2008)

DarkMatter said:


> "Innocent until proven otherwise." I think this means something, but I can't remember it quite well now... Let's see... Ermmm what does that mean...
> 
> I'm being sarcastic. It's not me or Nvidia in this case who need to find a proof of innocence. It's those naysayers who have to demostrate their BS. I know all that because I have read it from many respectable developers and journalists. Can you say the same about your BS? Tell me.



What BS of mine? I'm simply stating that unless you or someone else knows from the horse's mouth, we don't know either way. Just like websites taking payola for good reviews of products: lots of people suspect it, and lots say it would kill a website if it did such a thing, so why would they risk it?

We don't know (at least I don't) either way. You can suspect and make claims one way or the other (yes, I've been guilty of that, in a joking sort of way), but the truth is, unless there is someone on the inside posting here, we will probably never know for sure.


my .02


----------



## DarkMatter (Feb 8, 2008)

Sorry if I thought you were supporting their BS, but you have to understand how I feel as a gamer.
Stating that 80% (if not more) of game developers are cheap, money-thirsty whores, including Epic, Crytek, id, Infinity Ward, Ubisoft, and now Valve too, without a single proof, is in my book the biggest piece of BS I have ever had the grace to see with my own eyes.
Stating that a $448-million-a-year company (2006 was $301 million and 2005 $88 million; link below) can buy a $50-billion industry with obscure intentions, without anyone noticing through official channels, is just plain silly too. It's laughable. Don't you think Ati or AMD would have filed a lawsuit already? If they are so good at those things, why are they at Nvidia and not in the CIA? Wait!  (insert X-Files theme here)

FFS  just a bit of common sense!

http://www.marketwatch.com/tools/quotes/financials.asp?symb=NVDA


----------



## Hawk1 (Feb 8, 2008)

DarkMatter said:


> Sorry if I thought you were supporting their BS, but you have to understand how I feel as a gamer.
> Stating that 80% (if not more) of game developers are cheap, money-thirsty whores, including Epic, Crytek, id, Infinity Ward, Ubisoft and now Valve too, without a single piece of proof, is in my book the biggest piece of BS I have ever had the grace to see with my own eyes.
> Stating that a company with $448 million a year in profit (2006 was $301 million and 2005 $88 million, link below) can buy off a $50 billion industry with obscure intentions, without anyone official noticing, is just plain silly too. It's laughable. Don't you think ATI or AMD would have filed a lawsuit already? If they are so good at these things, why are they at Nvidia and not in the CIA? Wait!  (insert X-Files theme here)



LOL Agreed. 

I jokingly say things like "Nvidia is buying off the developers" (just to make myself feel better - I am an ATI fanboy), but the truth is I think it's the different architecture ATI uses that makes it weaker in certain games/situations (look at the AA performance hit in some games - even on the new X2 cards).

Anyway, I hope both companies bring out great high-end cards to compete with each other. Otherwise we'll have a stagnant sector with the same thing for a (relatively) long time - the G80 and Intel's P4 CPUs being cases in point.


----------



## ChillyMyst (Feb 8, 2008)

DarkMatter said:


> Oh ChillyMyst, don't bother with your lies. Wikipedia is for morons. Even a monkey wouldn't believe half of what is said there. I want proof, or keep your mouth shut.
> 
> It's clear you don't know anything about optimization. The fact is Doom 3 was fully optimized for Nvidia cards, thanks to TWIMTBP, but it wasn't on ATI's, because ATI never bothered to create a program like that. ATI cards performed worse because of a lack of optimization, because ATI didn't help optimize the way Nvidia did. On the other hand, all of Valve's games run better on ATI cards, and Oblivion, Call of Juarez... but this is because, let me guess... ATI paid Valve, Techland and Bethesda to make GeForces underperform in their games! Of course!! How could I have been so blind until now?
> 
> ...



Got a better one for you, "Mr. Know-It-All": prove me wrong with FACTS, since Wikipedia is for morons and the other links I posted don't fit your high standards.

Actually, Nvidia has been working with Crytek for some time; they just weren't keeping up with changes, from what I was reading. That's not uncommon for Nvidia - they are slow to update drivers in my experience, and it takes them far too long to fix small issues that should be easy to remedy.

By the way, a few studies have shown Wikipedia to be as good a source of information as Encarta or Britannica, two of the top names in the field.

http://www.news.com/2100-1038_3-5997332.html



> Over the last couple of weeks, Wikipedia, the free, open-access encyclopedia, has taken a great deal of flak in the press for problems related to the credibility of its authors and its general accountability.
> 
> In particular, Wikipedia has taken hits for its inclusion, for four months, of an anonymously written article linking former journalist John Seigenthaler to the assassinations of Robert Kennedy and John F. Kennedy. At the same time, the blogosphere was buzzing for several days about podcasting pioneer Adam Curry's being accused of anonymously deleting references to others' seminal work on the technology.
> 
> ...



I always love people who dismiss any source of information that says their stance is wrong. It's so nice to see that there really are people who know everything there is to know about everything and can prove it... oh wait... they don't prove it, they just spout off and then tell somebody else to prove them wrong.

In this case I'm calling you out, buddy (DarkMatter): prove the articles I posted wrong!!!


----------



## trog100 (Feb 8, 2008)

whether a failure to properly "optimize" is much different in its end results from a deliberate "cripple" i dont know..

i dont quite see the world in black and white.. to me its mostly shades of grey..

its not about payola.. more like dont bite the hand that feeds you.. which is why things like unbiased reviews are so hard to find on the internet..

money talks.. free help is the same as money.. reward is reward and nothing comes free..

trog


----------



## ChillyMyst (Feb 8, 2008)

trog100 said:


> whether failure to properly "optimize" is much different in its end results than a deliberate "cripple" i dont know..
> 
> i dont quite see the world in black and white.. to me its mostly shades of grey..
> 
> ...



im just waiting for DM to post saying [H] is unbiased........rofl.......


----------



## NastyHabits (Feb 8, 2008)

snuif09 said:


> when ati and nvidia are bitching that 3dfx comes suddenly back in the race with an amazing card



Ah yes.  And I remember when Matrox ruled.


----------



## btarunr (Feb 8, 2008)

Wikipedia articles can be written/edited by a lot of people, fanboys included.


----------



## ChillyMyst (Feb 8, 2008)

btarunr said:


> Wikipedia articles can be written/edited by a lot of people, fanboys included.



Yeah, but it's harder to get away with now. Also, if you read that article, it's very praising of Nvidia, and it misrepresents the CrossFire situation by stating that CrossFire didn't come along until the X1000 line of cards, when it actually came out with the X800 line, quite some time before the X1Ks were out.

Oh, and a "real" encyclopedia can be wrong as often as the wiki. In high school I got extra credit for finding 5 false encyclopedia articles that were outright BS or were VERY poorly written - so much so that if you went by their information, you would think the internet and networking were powered by some kind of number-obsessed pixies (I did my senior project on computer networking and its future).

What makes me laugh is that Harvard and other large, highly respected colleges have some instructors who actually support the use of Wikipedia as A source in research. They require more than just Wikipedia to be cited, BUT they do the same thing with hard-copy encyclopedias.

I had an instructor in college who actually disallowed us the use of Britannica and Encarta because she had found errors in them herself, so we were FORCED to find other sources. God, that can be time consuming. Thankfully the wiki saved a lot of us - it pointed us in the right direction to find books/articles to source for our papers (god that class sucked!!!!)

I never trust just 1 source of info, EVER, because any site/source can have FUD in it. It's like trusting [H] to be objective about an Nvidia product when the staff are all being given products to test by Nvidia...

I cited Wikipedia because the info I posted is all EASY to verify: the FX line's flaws, the Doom 3 "optimization" that crippled ATI cards (and which, if fixed, crippled Nvidia cards!!!), the X1K not being the first CrossFire cards.

For the HDR+AA issue I linked NVIDIA THEMSELVES!!! They admit their cards can only do it if it's the same kind of HDR that the R300/400 range of cards can do.

I have years of experience with all this crap. I owned these cards in their prime - I got a 5800 Ultra the week it became available locally (god damn, that sucker was LOUD... guess that's not true either, since it's in Wikipedia and DM says everything in the wiki is lies...)

A little background on me when it comes to video cards:

oak tech 256K video card (ISA)
oak tech 512K video card (ISA)
Cirrus Logic 1MB video card (PCI)
S3 ViRGE 1MB video card (PCI)
S3 Trio64 2MB video card (PCI)
S3 ViRGE 4MB video card (PCI)
ATI Mach32 (VESA local bus)
ATI Mach64 (PCI)
ATI Rage 3D/Pro (PCI)
3dfx Voodoo1 3D card (PCI)
Trident 4MB video card (PCI)
Voodoo Rush video card (PCI)
Vérité V2100/V2200 cards, 4 and 8MB (PCI)
Trident 8MB PCI card + Voodoo1
Riva 128
TNT
Rage 128 Pro
TNT2 Ultra
GF1 SDR
GF2 GTS
Kyro II
GF3 Ti
GF4 Ti (same chip as above really, just higher clocks)
FX 5800 Ultra
Radeon 9600 256MB
Radeon 9800 SE 256-bit (hard-modded to Pro at higher-than-XT clocks)
GeForce 6800 GT
Radeon X800 XT PE
X1900 XTX
8800 GT

I've also had 7-series cards and a few cards I didn't list - too many of those old PCI/VESA local bus cards to remember all the names. Oh, I also had the Savage 2000 card; its drivers sucked, so I sold it FAST and got the GeForce card instead, lol.
Oh, and the only Radeons I didn't own were the Radeon classic and the 7000 line (till I was GIVEN a system with a 7000 card in it, years after they were WAY out of date).

I have also owned a couple of XGI and SiS cards; the model numbers escape me at the moment - a side effect of owning too much damn hardware.

I have had a lot of personal experience with the flaws of both companies, and with how they use dirty tricks to get ahead. At one point ATI did driver optimizations for 3DMark (the same thing Nvidia had been doing for a long time); people called them on it, and they stopped.

ATI also used to be HORRIBLE about slow-as-snot driver updates, but they fixed that with the monthly Catalyst driver updates.

ATI moved to using mixed-mode filtering optimizations instead of full trilinear AF. This wasn't really a cheat, but in some cases it did affect quality back then. Nowadays it doesn't seem to have any negative effect in reviews, it just boosts performance, and they still let you force full trilinear filtering if you want it.

My 5800 Ultra had so many driver "optimizations" that it wasn't even funny. That thing was HORRIBLE, and it was LOUD AS HELL (no card today is as loud - even the X800/1900/2900 or the 8800 GT at 100% fan aren't anywhere close!!!)
Every time they updated the drivers, image quality got worse to gain a few FPS in a few games, but they never got DX9 speeds playable, so what was the damn point?

I never had that experience with ATI. Sure, I had a bad experience with their Win2K drivers for the Rage 128, and that drove me to HATE THEM WITH A PASSION, but then again my 6800 GT had driver problems when I had it as well - problems Nvidia verified were known issues and said they would fix. They fixed 2 of the 3 problems; as for the 3rd, well, if you've got it, you end up getting a different card from a different chip maker...

I don't hate Nvidia, but I don't think they're the shit either, and I don't think ATI is the be-all-end-all of graphics chip makers either. The 2900 was too hot and too power hungry for me, BUT it didn't perform that badly for its price.

On the other hand, the 38x0 cards made a real step in the right direction: low power use, lower temps, same performance...

The 8800 GT would have been a kickass move if not for 2 things:

1. a shitty stock cooler that lets the GPU hit 90C or higher under heavy gaming;

2. known driver bug(s) with video playback.

I just wish Nvidia would work WITH AMD/ATI to make a GPU-based PPU standard that they could both support in their own way. That way we wouldn't be stuck with no game devs wanting to bother supporting GPU-based physics because the market is so split between Nvidia and ATI...

This is one time when MS stepping in and forcing a GPU-as-PPU standard on the market would actually be a good thing in my eyes. At least that way we would be able to get games coming out that could use extra GPU power to improve the graphics and gameplay even more!!!!

Although I would REALLY like to see OpenGL step up and get version 3.x out, get some game companies to support it, and have a shootout with DX10.x. I'm sick of MS trying to force me to buy/use their latest "greatest" OS. Hell, MS GAVE me Vista... I removed it after 2 weeks of use - got sick of the bugs and of the apps I use a lot either not working or not working fully... XP x64 and Server 2003 work FLAWLESSLY with them all...


----------



## btarunr (Feb 8, 2008)

That mini-Gospel still doesn't explain why I should trust Wikipedia as a valid source when a lot of its articles outside video cards are genuine BS. (What it has about Indian history, for example, is pure sterilized BS.) That kind of thing makes me not want to trust any one encyclopaedia, and to rely more on looking up information myself from various sources and drawing a consensus - and I'm talking in general, not just video cards.


----------



## ChillyMyst (Feb 8, 2008)

btarunr said:


> That mini-Gospel still doesn't explain why I should trust Wikipedia as a valid source when a lot of its articles outside video cards are genuine BS. (What it has about Indian history, for example, is pure sterilized BS.) That kind of thing makes me not want to trust any one encyclopaedia, and to rely more on looking up information myself from various sources and drawing a consensus - and I'm talking in general, not just video cards.



As I said, that's true of any book. You can't trust books in general, because they are written by men, and men put their own perspective into everything they write.

It's like reading about American history and what was done to the Native Americans - it's very... different from what actually happened, in most cases. Or reading about Mormon history from an outsider's point of view and then reading the Mormon version of things (VERY VERY DIFFERENT!!!!)

The point I was making is that Wikipedia is as reliable as any book you choose to grab, because just like Wikipedia, books are written by people, and people are flawed; they see things in their own way.

Have you ever heard 5 people who saw the same thing describe it differently? I have. It's weird how people's perspectives differ even when they are all together and see the exact same events.

And I still want somebody to disprove, with a credible source, all that I posted/quoted from the wiki... It's going to be hard, since other than what I marked as FUD it's pretty accurate (well, other than blaming the Xbox and MS for Nvidia making a crappy chip with the FX line - that was due to them trying to mix 3dfx tech with what they already had and then cheaping out with partial-precision shader units...).
But that's beside the point!!!

Blah, I'm tired. I've been up since 5am. I'm gonna watch a movie and get some sleep!!!


----------



## btarunr (Feb 8, 2008)

....which is why I said:



btarunr said:


> That mini-Gospel still doesn't explain why I should trust Wikipedia as a valid source when a lot of its articles outside video cards are genuine BS. (What it has about Indian history, for example, is pure sterilized BS.) That kind of thing makes me not want to trust any one encyclopaedia, and to rely more on looking up information myself from *various sources and drawing a consensus* - and I'm talking in general, not just video cards.


----------



## ex_reven (Feb 8, 2008)

This thread is stupid.
So are the people in it, with their endless use of sarcasm to try to prove a point about the opposing video card manufacturer.

I also noted lots of libel and crappy spelling.
On a more amusing note, you should all be shot.

Good Day.


----------



## ChillyMyst (Feb 8, 2008)

/me farts on poster above


----------



## ex_reven (Feb 8, 2008)

ChillyMyst said:


> /me farts on poster above



You've already been farting through the pages of this thread!?
I see no difference.

Oh wait. That was your argument.


----------



## ChillyMyst (Feb 8, 2008)

go eat some vegemite and leave us to our fun


----------



## ex_reven (Feb 8, 2008)

ChillyMyst said:


> go eat some vegemite and leave us to our fun



Vegemite is disgusting.
How about a beer?


----------



## ChillyMyst (Feb 8, 2008)

Thought all you Aussies liked that nasty brown smelly crap (looks like baby poop  )

I don't drink or do drugs - maybe a Mt. Dew for me


----------



## warhammer (Feb 8, 2008)

Feel the LUV

So do you own a KIMBER?


----------



## ex_reven (Feb 8, 2008)

ChillyMyst said:


> I don't drink or do drugs - maybe a Mt. Dew for me



Me neither, but I'd still take beer over Vegemite or that Mt. Dew crap.

Mmm coke.


----------



## candle_86 (Feb 8, 2008)

Dublin Dr Pepper or bust


----------



## Tatty_One (Feb 8, 2008)

Hawk1 said:


> And you know this how? I don't think you know factually that this happens, any more than the people stating they are paying the developers off do. So unless you can tell me you work/have worked for Nvidia/AMD/a game developer and know first hand, you don't really know, and can now get off your high horse.



His facts aside, when you look at the financial aspects of his argument, they do seem plausible. The Nvidia net profit figures seem correct - I have just done some googling - and taking that into account, it seems unlikely they could have that great an influence over a gaming industry that makes Nvidia look like fairly small fry, TBH. Although, as you have pointed out, on my part at least that is speculation.


----------



## candle_86 (Feb 8, 2008)

So what if Nvidia is paying them off? It's business.


----------



## DarkMatter (Feb 8, 2008)

ChillyMyst



LOL I'm not denying your sources, I'm denying the conclusions you draw and the analysis you make from them. Examples:




> Quote:
> 
> 
> > HDR and Anti-aliasing
> ...



Haha, this is funny. Indeed it's FP16, so you are right there, but it is also (mistakenly) known as FP64, because, as you very well know (), the color scheme used by graphics cards is RGBA (Red, Green, Blue, Alpha, just to be sure...) and HDR uses 16 floating-point bits for each channel, thus FP16 x 4 = 64 floating-point bits. What's funny about this is how you go and say he's inventing it, without even thinking about it for a bit, when you came to the same conclusion I did. 
Anyway, I think Half-Life 2 used a custom-made HDR, with different bit depths for color and alpha and a total of 40 or 48 bits used, but don't quote me on this, since I may be wrong (more likely than not); it's just something I remember not very clearly, TBH, from when the game was released long ago. Newer games from Valve use the standard HDR so far.
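The bit-depth arithmetic behind the FP16/"FP64" naming mix-up is trivial to check; a minimal sketch (the helper name is made up for illustration):

```python
def bits_per_pixel(bits_per_channel, channels=4):
    """Total storage per pixel for a render target with the given per-channel
    depth; the default of 4 channels models an RGBA target."""
    return bits_per_channel * channels

print(bits_per_pixel(16))  # FP16 x RGBA -> 64 bits, hence the "FP64" nickname
print(bits_per_pixel(8))   # standard 8-bit-per-channel LDR -> 32 bits per pixel
```

The same arithmetic gives the 40- or 48-bit totals mentioned above if color and alpha channels are assigned different depths (e.g. 3 x 10 + 1 x 10, or 3 x 16 + 1 x 0 variants).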



> Quote:
> 
> 
> > The Way It's Meant To Be Played (TWIMTBP) is a program that helps game developers to optimize and incorporate exclusive features in their games and applications exclusively for NVIDIA's graphics cards. The deal also adds a splash screen to "the way it's meant to be played" games as well as branding within the game; this is widely considered as a promotion campaign for NVIDIA. This program was launched 2003 by NVIDIA, a graphics card producer. The program aims at providing the best experience possible for users of NVIDIA GeForce graphics cards, and more particularly provides extensive guidelines on game performance optimizations for the GeForce graphics cards.
> ...



The link says nothing, but you are quick to post your BS paragraph.




> Quote:
> 
> 
> > GeForce 6 series and later
> ...




Wrong. They say ATI responded with both the X1000 series and CrossFire, not that one came before the other. Pff, ffs.

That, my friend, is what optimization means: changing some specific code to make the game run better. But saying Nvidia paid them to do it that way is BS. Where was ATI then? Where is your proof?
The truth is they did it that way because they thought it was the best way to achieve the effect they wanted, not because they thought it would hurt ATI's performance. Once they knew about the problem, they fixed it in the patch, didn't they? Anyway, that alternative method you mention apparently replaces a texture lookup with a constant variable. They replace a texture with a constant color, ffs!! The end result may be similar, but it's certainly not doing the same thing.

A counter-example to yours: Call of Juarez. AFAIK this game is/was under TWIMTBP, yet it uses an antialiasing method that severely hurts Nvidia's performance. They got paid by ATI, of course. Or doesn't ATI/AMD do such things? Share your thoughts about this; I'm willing to read your response.

Another example, STALKER: this game ran poorly on every card, be it ATI or Nvidia. Some community-made mods to the shaders (including adding Phong shading instead of Blinn) made the game run almost 100% better depending on circumstances, while looking better. There were many of those mods and hacks. Oblivion is another example of the same nature. 
Now, since both ATI and Nvidia ran like crap in the first place, and those mods were proof that the game could run better, there's nothing left to think but that Sony paid them, because they wanted to destroy PC gaming. Indeed, why not go and say that Sony has been doing this for years, and blame Sony for the decline in optimization that the PC has been suffering as a whole in recent years? Oooh, thank you Chilly, now I see the light!!


About [H], don't worry, I never considered that a reliable source. Nor do I Wikipedia, since I have seen many edits that smear both companies and people and that lasted weeks before they were corrected. I have seen, so many times, something like: "And according to Mr. XXXX, a reliable scientist in the matter... blah blah... *Mr. XXXXX is a moron*", followed by racist abuse. That lasting for weeks is unacceptable to me, so I can't treat it as a reliable source, sorry. 

And sorry for the language, but how can I convey the thing if I don't use the language they used?


----------



## DarkMatter (Feb 8, 2008)

Tatty_One said:


> His facts aside, when you look at the financial aspects of his argument they do seem plausible, the NVidia net profit figures seem correct as I have just done some googling, just taking that into account, it would seem unlikely that they would be able to have that great an influence in a gaming industry that makes NVidia look like fairly small fry TBH, although, as you have pointed out, on my part at least that is speculation.



Oh no! You too? 
I can't believe how this conversation is going.  
Yeah, technically what I'm doing is speculation too, since we don't have proof that demonstrates what I say is true, but we don't have any that proves otherwise either. But reality is different. We don't have proof that aliens exist and are here right now, nor do we have proof that says otherwise.
We don't have proof that leprechauns don't exist, either.
No, the world doesn't work like that (please tell me it doesn't, ffs). A lack of proof that aliens are not among us doesn't make their existence among us plausible. Same with leprechauns.

Having reached this point, I'm going to risk a thread close, and even my own ban, for one last example: 
My claim is the following.

ChillyMyst is mentally deficient, because when he was 5 his mother raped him.

Come on, give me the proof that demonstrates that's false. 
ChillyMyst, any proof you give us is not valid, as you could have falsified it; nor can you call on your friends, since you could have paid them to say what you want.



ChillyMyst, I don't mean this as a personal offense; I just want to demonstrate how stupid it is to accuse without proof. I deliberately added your mom to try to reflect how, by accusing Nvidia, you are accusing game developers too - and how, since the claims are false, they can hurt many people who love them.

Sincerely waiting for those proofs, 

DarkMatter


----------



## btarunr (Feb 8, 2008)

That's it. 

DarkMatter, your flaming/insulting/trolling is unacceptable. Sure, Chillymyst can't provide proof for everything, just as there's no proof (required) for certain things, but that doesn't give you the freedom to shoot your mouth off at people as you like. You had something to prove, you made your point with a link or two and a quote or two - now seal your mouth and take it to the next thread. Neither Chillymyst nor Tatty has to put up with someone calling them "mentally deficient".


----------



## DarkMatter (Feb 8, 2008)

btarunr said:


> That's it.
> 
> DarkMatter, your flaming/insulting/trolling is unacceptable. Sure, Chillymyst can't provide proof for everything, just as there's no proof (required) for certain things, but that doesn't give you the freedom to shoot your mouth off at people as you like. You had something to prove, you made your point with a link or two and a quote or two - now seal your mouth and take it to the next thread. Neither Chillymyst nor Tatty has to put up with someone calling them "mentally deficient".



Agreed. And sorry. But I think it's clear that I wasn't trying to flame him, and that I didn't mean what I said. It was only an example - an out-of-line and offensive one. But as I said in white letters (did you read them?), I'm not trying to cause offense. I thought stating that in the message would make it clear. And yeah, putting it in white doesn't help, but it was the only way I could think of to make him take momentary offense, so he'd feel the same thing he is inflicting.

On the other hand, I never flamed Tatty. I expressed my sadness when I saw him saying I was speculating, and later I tried to explain why, even though I was speculating, my speculation was rather better grounded than the kind ChillyMyst is doing. And I think I used kind language there.

Anyway, at this point I don't care if I get banned. I've been tired of this forum since this thread, because I'm sick (yeah, whatever) of watching how a person without a single piece of proof can accuse many respectable developers and companies, and still get more support than me, when I am only defending what is, until proven otherwise, innocent.


----------



## ex_reven (Feb 8, 2008)

Heres an idea.
Everyone Shutup, and I will ask God (or your god) to forgive you.

People who believe in Satan are screwed tho


----------



## Deleted member 3 (Feb 8, 2008)

Great, now that everything is cleared up we can magically turn this thread into a for-grown-ups-only place.


----------



## ex_reven (Feb 8, 2008)

DanTheBanjoman said:


> Great, now that everything is cleared up we can magically turn this thread into a for-grown-ups-only place.



Is that magical happy land a sponsor of the new banned logo being advertised under DarkMatter's screen name? Oh wait...


----------



## largon (Feb 8, 2008)

Geez!

Nobody bothered to read *DarkMatter*'s post... 


			
				DarkMatter said:
			
		

> ChillyMyst, I don't want to make a personal offense with this, I just want to try to demostrate how stupid is to accuse without proofs.


His example was indeed quite crude, but for pete's sake, *it was just an example*. Some people are *so* pathetically sensitive.


----------



## Leprechauns (Feb 8, 2008)

DanTheBanjoman said:


> Great, now that everything is cleared up we can magically turn this thread into a for-grown-ups-only place.





> Forum Rules
> 
> Registration to this forum is free! We do insist that you abide by the rules and policies detailed below. If you agree to the terms, please check the 'I agree' checkbox and press the 'Register' button below. If you would like to cancel the registration, click here to return to the forums index.
> 
> ...



Before I get banned again - whether for going off topic, or because of this post's nature, or because of who I am (I know that if I don't insult anyone I won't be, but I don't hold this forum in a very high place right now) - I want to give some advice regarding the Forum Rules and forum administration. Not that I feel I'm in a place to do it, but I sincerely and humbly think they could be better:

1. Make it clear that any violation of the rules means an instant ban, without warnings and/or explanations from the administrators, or the chance to correct oneself. I say so because, as a forum surfer, I have seen many admins at work, and a warning or a post-ban message is the norm, even when the reason for the ban is clear. It makes the forum feel more democratic.

2. An email to the banned person communicating the ban would be appreciated, again regardless of the violation. Maybe the banned person doesn't deserve to be treated as a person, but by doing so you demonstrate your own humanity.

3. Try not to insult the banned person after banning him for insulting. Yes, I find your sentence as offensive as mine when taken in context, and I would like you to say sorry for that. Thank you.

With that said, goodbye to everybody. I will continue visiting this site, though I won't post anymore (and who cares? I know that's what you are thinking).
It's a shame that circumstances took me to the point where I lost my head enough to insult ChillyMyst (one of them being the lack of support, and his flaming of the whole gaming industry), but I still think a community like this should prohibit, or fight, the progressive discreditation campaign ChillyMyst had engaged in. 
I also strongly believe that admins should give people the right to correct themselves, at least once.

Sincerely, 

DarkMatter


----------



## btarunr (Feb 8, 2008)

largon said:


> Geez!
> 
> Nobody bothered to read *DarkMatter*'s post...
> His example was indeed quite crude, but for pete's sake, *it was just an example*. Some people are *so* pathetically sensitive.



Unfortunately a moderator doesn't have the time/patience to figure out invisible ink in forum posts. I read it, but it still violates the forum guidelines on bad language.


----------



## Leprechauns (Feb 8, 2008)

largon said:


> Geez!
> 
> Nobody bothered to read *DarkMatter*'s post...
> His example was indeed quite crude, but for pete's sake, *it was just an example*. Some people are *so* pathetically sensitive.



Thanks Largon. I had the feeling they didn't actually read it, but life is crude...
Goodbye bro.


----------



## largon (Feb 8, 2008)

*btarunr*,
No time? 
Then they MUST make time to correct the mistakes they make while in such a hurry. Not trying to tell you how to moderate, but IMO a warning + an edit should have been enough here. 

PS.
The quoted text is just written in white, not "invisible" (#f5f5ff).


----------



## Deleted member 3 (Feb 8, 2008)

Darkmatter got banned because he was fully aware that insulting people isn't acceptable. He even said he expected a ban, so he got a permanent one for behaving in a way he knew was unacceptable. I did him the favor. 

I indeed do not have the time or patience to select text and read things that basically state "just kidding".

Additionally, I believe that when a mod steps into drama, it means the drama should stop. It did not > thread closed.


----------

