# First Radeon HD 5870 Performance Figures Surface



## btarunr (Sep 14, 2009)

Here are some of the first performance figures for AMD's upcoming Radeon HD 5870 published by a media source. Czech Gamer posted performance numbers for the card compared to current heavyweights, including the Radeon HD 4870 X2, Radeon HD 4890, and GeForce GTX 285. Not having signed an NDA with AMD, the source was liberal with its performance projections, citing AMD internal testing that includes the following figures, apart from the two graphs below:

The Radeon HD 5870 is anywhere between 5 and 155 percent faster than the GeForce GTX 285. That's a huge range, and it leaves a lot of room for uncertainty.
Compared to the GeForce GTX 295, its performance ranges from -25 percent (25% slower) to +95 percent (almost 2x faster), another broad range.
When two HD 5870 cards are set up in CrossFire, the resulting setup is -5 percent (5% slower) to 90 percent faster than the GeForce GTX 295. Strangely, the range maximum is lower than that of the single card.
When three of these cards are set up in 3-way CrossFireX, the resulting setup is 10 to 160 percent faster than a GeForce GTX 295.
The Radeon HD 5850, on the other hand, can be -25 percent (25% slower) to 120 percent faster than the GeForce GTX 285.

AMD reportedly used a set of 15 games to run its tests. Vague as they seem, the above numbers raise more questions than they answer. The graphs below are clear, for a change.
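
To put the quoted ranges in perspective, each figure is just a multiplier on the baseline card's frame rate: -25 percent means 0.75x, and +95 percent means 1.95x. A quick sketch of the conversion (the claim values are taken from the list above; none of this is measured data):

```python
# Convert "X percent faster" claims into performance multipliers.
# A figure of -25 means 25% slower (0.75x); +95 means almost 2x (1.95x).
def pct_to_multiplier(pct: float) -> float:
    return 1.0 + pct / 100.0

# Claimed ranges vs. the GTX 295, per the article (illustrative only)
claims = {
    "HD 5870 (single)": (-25, 95),
    "HD 5870 CrossFire": (-5, 90),
    "HD 5870 3-way CrossFireX": (10, 160),
}

for card, (lo, hi) in claims.items():
    print(f"{card}: {pct_to_multiplier(lo):.2f}x to {pct_to_multiplier(hi):.2f}x")
```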



 




*Update:* Here are, allegedly, AMD's own performance figures, sourced from the Chinese website ChipHell.com.



 



*View at TechPowerUp Main Site*


----------



## Initialised (Sep 14, 2009)

Presumably the GTX 295 compares well against the 58x0 when it has good SLI scaling and an nVidia logo on the box, but has its ass served to it when the game runs better on ATi hardware and doesn't scale in SLI.

BTW I posted these here first.


----------



## tkpenalty (Sep 14, 2009)

wow. That is FAST.


----------



## Initialised (Sep 14, 2009)

tkpenalty said:


> wow. That is FAST.


My 4870x2 is starting to feel inadequate already!


----------



## HellasVagabond (Sep 14, 2009)

Why am I not surprised that both benchmarks were done with AMD-friendly games/applications?
Had 3DMark not disabled PhysX for NVIDIA cards, this would be funny....

In any case, wait for real reviews before judging any product, by NVIDIA or AMD.


----------



## The Witcher (Sep 14, 2009)

Are those benchmarks really accurate ???

I only trust techpowerup and guru3d reviews...


----------



## tkpenalty (Sep 14, 2009)

HellasVagabond said:


> Why am I not surprised that both benchmarks were done with AMD-friendly games/applications?
> Had 3DMark not disabled PhysX for NVIDIA cards, this would be funny....
> 
> In any case, wait for real reviews before judging any product, by NVIDIA or AMD.



Yeah, and does PhysX = real gaming performance? No, it equals an extra score added onto the overall score because NVIDIA's GPUs get to run again.

Synthetic benchmarks mean squat anyway, and the card isn't even out yet, so from that range of improvement we're likely seeing premature drivers.


----------



## btarunr (Sep 14, 2009)

The Witcher said:


> Are those benchmarks really accurate ???
> 
> I only trust techpowerup and guru3d reviews...



You can definitely trust us on the 23rd. Till then every media source is going to be the "drunk opponent at the bar" playing darts with you.


----------



## sapetto (Sep 14, 2009)

Official ATI Benches
http://forum.beyond3d.com/showpost.php?p=1334677&postcount=2969


----------



## KainXS (Sep 14, 2009)

That's insane.

Almost 100 percent faster than the GTX 295 in Wolfenstein; I can't even believe that.


----------



## TheMailMan78 (Sep 14, 2009)

btarunr said:


> You can definitely trust us on the 23rd. Till then every media source is going to be the "drunk opponent at the bar" playing darts with you.



Hands down, that is the most intelligent thing you have ever said. You need not a grain of salt with these results but a salt mine.


----------



## csendesmark (Sep 14, 2009)

ATI ROXX ^^

I hope I can grab my 5870 this month.


----------



## HellasVagabond (Sep 14, 2009)

KainXS said:


> That's insane.
> 
> Almost 100 percent faster than the GTX 295 in Wolfenstein; I can't even believe that.



Some engines are especially nice to ATI cards and some to NVIDIA cards.
Do not expect ATI to post game benches that favor NVIDIA cards, and vice versa.


----------



## pantherx12 (Sep 14, 2009)

If those official ATI benches are real, holy shitting cock nipples, that's a powerful GPU.

It outdoes a card with two GPUs in quite a lot of games!

And it's not too far behind in the ones it doesn't, which is amazing for a single GPU.


----------



## The Witcher (Sep 14, 2009)

sapetto said:


> Official ATI Benches
> http://forum.beyond3d.com/showpost.php?p=1334677&postcount=2969



This is too good to be true....way too good


----------



## ..'Ant'.. (Sep 14, 2009)

Nvidia will come back to hunt them down eventually, just like they did when the 48xx series came out. But hopefully the price range for the 58xx series will be good; who knows, I might get one.


----------



## The Witcher (Sep 14, 2009)

I won't buy any new hardware until they release the i9 (6 cores) : )

Just to make sure nothing will bottleneck my next dream computer.


----------



## KainXS (Sep 14, 2009)

..'Ant'.. said:


> Nvidia will come back to hunt them down eventually, just like they did when the 48xx series came out. But hopefully the price range for the 58xx series will be good; who knows, I might get one.



They will, but the situation is different now. When the 48XX series came out, the G92s were still competing with them, but once these cards come out, from the looks of it NVIDIA isn't going to have anything to compete with for a while.


----------



## aCid888* (Sep 14, 2009)

These results are to be taken with a salt mine (thanks, MailMan, LOL)... the drivers are fresh, so I can only imagine how far off these benches are.


That being said... it does show its potential, and I fully expect it to be a contender for the fastest card out there, and without a doubt the quickest single-GPU card.


----------



## gumpty (Sep 14, 2009)

Those figures make sense (in the fact that they compete head on with the 295).

The GTX295 was only about 15-30% faster than a 4870X2, and given that the X2 had a massive price-drop (down to £240 in places) and then went out of stock about a month ago, it made sense that that price/performance zone was where ATI were going to position their new hardware.

But yeah, all that is very very vague. It doesn't even say whether it is a 1GB or 2GB model.

The 23rd is booked in as a review-reading day.


----------



## MopeyMartian (Sep 14, 2009)

a) figures don't lie, but lies can figure

b) yes, it may be "twice" as powerful but if it uses twice the wattage you can forget it.


----------



## [I.R.A]_FBi (Sep 14, 2009)

Imagine the gains after driver maturity kicks in like vtec


----------



## Sihastru (Sep 14, 2009)

We do need to see real numbers (FPS) and not just percentages, because a percentage is relative to something. For example, 95% over 10 FPS is 19.5 FPS and 95% over 100 FPS is 195 FPS, and in both of these cases it's the absolute frame rate that matters when you play the actual game...

We need new, graphically intensive games, since they tell the story of future games, not games based on old, dusty engines.

PhysX does matter. 3D Vision can be fun.

It looks a lot better than I thought it would.
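
The point is easy to check: a relative delta only turns into a meaningful number once you fix the baseline. A two-line sketch using the post's own hypothetical 10 FPS and 100 FPS baselines:

```python
def apply_delta(base_fps: float, pct: float) -> float:
    """Frame rate after applying a relative delta of pct percent."""
    return base_fps * (1.0 + pct / 100.0)

# "95% faster" means very different things at different baselines:
print(apply_delta(10, 95))   # about 19.5 FPS - still barely playable
print(apply_delta(100, 95))  # about 195 FPS - far past what most monitors show
```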


----------



## Polarman (Sep 14, 2009)

It may be possible, since the card has a lot more transistors, shaders, texture units, ROPs, and more than a 4890.


----------



## krisna159 (Sep 14, 2009)

just wait and see..


----------



## csendesmark (Sep 14, 2009)

New HQ images 1
New HQ images 2


----------



## mdm-adph (Sep 14, 2009)

That huge range in possible scores (-5 to 90% faster?) just shows you how horribly optimized some games are. :shadedshu


----------



## mdm-adph (Sep 14, 2009)

MopeyMartian said:


> a) figures don't lie, but lies can figure
> 
> b) yes, it may be "twice" as powerful but if it uses twice the wattage you can forget it.



Considering that it's 40nm and that they've apparently fixed the GDDR5 flicker bug, I really don't think that's the case.  

Who knows how fast it is, but it's sure not to use a lot of power.


----------



## HellasVagabond (Sep 14, 2009)

KainXS said:


> They will, but the situation is different now. When the 48XX series came out, the G92s were still competing with them, but once these cards come out, from the looks of it NVIDIA isn't going to have anything to compete with for a while.



NVIDIA will be rolling out their first samples within September, so how do you figure the "for a while"?


----------



## btarunr (Sep 14, 2009)

Sihastru said:


> We do need to see real numbers (FPS) and not just percentages, because a percentage is relative to something. For example, 95% over 10 FPS is 19.5 FPS and 95% over 100 FPS is 195 FPS, and in both of these cases it's the absolute frame rate that matters when you play the actual game...



Regardless, it's "5% less FPS" or "95% more FPS" compared to the NVIDIA card, so the reference point is the GTX 285/GTX 295 used for the comparison.



HellasVagabond said:


> NVIDIA will be rolling out their first samples within September, so how do you figure the "for a while"?



Till NVIDIA's competing products have healthy global inventories.


----------



## Urbklr (Sep 14, 2009)

Wow, if these are close to real, sign me up for one! Can't wait for the 23rd!

36 people are viewing this thread right now, wow!


----------



## pantherx12 (Sep 14, 2009)

Sihastru said:


> We do need to see real numbers (FPS) and not just percentages, because a percentage is relative to something. For example, 95% over 10 FPS is 19.5 FPS and 95% over 100 FPS is 195 FPS, and in both of these cases it's the absolute frame rate that matters when you play the actual game...



Well, it's in comparison to the 295, right?

Just need to find reviews of the 295 and see what FPS it gets in each game.


----------



## W1zzard (Sep 14, 2009)

those chiphell performance figures are against the gtx 285, at least that's what it says there


----------



## pantherx12 (Sep 14, 2009)

My mistake; still, the same applies, just the 285 instead of the 295. Thanks, W1zzard.


----------



## A Cheese Danish (Sep 14, 2009)

WOW!  That is some very nice performance!
I can almost guarantee that I will get one of these if and only if the price is reasonable.
That is just...ahh  Breathtaking


----------



## kid41212003 (Sep 14, 2009)

Even the HD4890 is faster than the GTX285? That's pretty BS. :shadedshu


----------



## pantherx12 (Sep 14, 2009)

Emails about this thread are messing up for me.

@kid41212003: in that game, yes.

Just like how lower-spec NVIDIA cards can beat the pants off some higher-spec ATI cards in certain games.

It's optimisation. Check out the 3DMark score; the 285 beats the 4890 by quite a lot.


----------



## wolf (Sep 14, 2009)

kid41212003 said:


> Even the HD4890 is faster than the GTX285? That's pretty BS. :shadedshu



uuhhhhh.... no, no.

Perhaps in HAWX with DX10.1... but in general, hardly.


----------



## newtekie1 (Sep 14, 2009)

I'm glad to see that ATi's next generation can beat the current generation cards, but I'm more concerned about next gen vs. next gen at this point...



btarunr said:


> Till NVIDIA's competing products have healthy global inventories.



I'm going to guess that nVidia is only a few weeks behind, maybe a month.  I doubt it is going to be "a while" before nVidia has competing products out on the market.


----------



## InTeL-iNsIdE (Sep 14, 2009)

HellasVagabond said:


> Why am i not surprised that both benchmarks where done with AMD-Friendly games/applications ?
> Had 3DMark Not disabled PhysX for NVIDIA cards this would be funny....
> 
> In any case wait for real reviews before judging any product, by NVIDIA or AMD.



How is Vantage ATI-friendly? They disabled PhysX because the scores from NVIDIA cards were not representative of real life; that was Futuremark's slip-up, and they fixed it. PhysX has a hit on game performance, not the other way around as Vantage was reporting when it was first released. Now it's an even playing field.

And the same can be said for NVIDIA having "The Way It's Meant To Be Played" plastered all over the games they use in their benchmarks. That kind of shit shouldn't be allowed; everyone knows the game devs are paid to make them play better on NVIDIA. It's an outrage if you ask me, but meh, whatever.

The fact that you keep arguing the toss while all 3 of the rigs in your specs are NVIDIA actually says a lot.

I am neither for nor against, and I do agree about waiting for real reviews and benchmarks, because heaven forbid an ATI card knock NVIDIA off the performance top spot for a little while.


----------



## btarunr (Sep 14, 2009)

newtekie1 said:


> I'm going to guess that nVidia is only a few months behind, maybe a month.  I doubt it is going to be "a while" before nVidia has competing products out on the market.



Touché, we'll see.


----------



## newtekie1 (Sep 14, 2009)

In case it wasn't clear, I mean a few weeks behind, maybe a month.  Not a few months behind...that would be "a while".


----------



## KainXS (Sep 14, 2009)




----------



## newtekie1 (Sep 14, 2009)

InTeL-iNsIdE said:


> How is Vantage ATI-friendly? They disabled PhysX because the scores from NVIDIA cards were not representative of real life; that was Futuremark's slip-up, and they fixed it. PhysX has a hit on game performance, not the other way around as Vantage was reporting when it was first released. Now it's an even playing field.



PhysX does not have a hit on game performance. Yes, the FPS goes down because there are more objects to render. However, the big performance improvement comes with the game physics. If the same level of physics were done without hardware acceleration, the game would slow to a crawl.

So a benchmark that shows this is entirely representative of real life.



InTeL-iNsIdE said:


> And the same can be said for Nvidia having "The Way Its Meant To Be Played" plastered all over games they use in their benchmarks  that kind of shit shouldnt be allowed, everyone knows the game devs are paid to make them play better on Nvidia, its an outrage if you ask me but meh whatever.



Yes, it is soooooo outrageous that nVidia pays for development time to make the game optimized better. Development time costs money, and nVidia paying game devs to spend extra time optimizing the game to run better for the customer is a good thing. I wish ATi cared as much about their customers and making them happy. They had a similar program, and all but dropped it...


----------



## ArmoredCavalry (Sep 14, 2009)

If this is the true performance, I really can't see it being priced at $300 (at least not for long). :\

Also, about PhysX: I couldn't care less about it. It was a joke when Ageia owned it, and it's a joke now that Nvidia owns it. I stood next to some flags with PhysX on in Mirror's Edge. Hey look, waving flags. Then I turned PhysX off; hey look, waving flags, and now I get 20 more FPS....


----------



## Suijin (Sep 14, 2009)

newtekie1 said:


> In case it wasn't clear, I mean a few weeks behind, maybe a month.  Not a few months behind...that would be "a while".



I have seen some posts indicating NVIDIA won't have the new cards out until 2010. That's 2.5 months from now, but then again, ATI doesn't have theirs on the streets yet either.


----------



## pantherx12 (Sep 14, 2009)

I wonder if nvidia will use newer fab with their next cards.


----------



## Marineborn (Sep 14, 2009)

SPANKED!!!! that is all, i shall be getting it...or maybe 3...*giggles and runz off*


----------



## W1zzard (Sep 14, 2009)

pantherx12 said:


> I wonder if nvidia will use newer fab with their next cards.



nvidia is producing at tsmc for the foreseeable future


----------



## mdm-adph (Sep 14, 2009)

newtekie1 said:


> In case it wasn't clear, I mean a few weeks behind, maybe a month.  Not a few months behind...that would be "a while".



You think Nvidia's going to have their answer to the 5000 series out before December?  Everything I've ever heard around here says no.



newtekie1 said:


> Yes, it is soooooo outrageous that nVidia pays for development time to make the game optimized better *[explicitly on their hardware, while potentially making the game run slower on others' hardware]*. Development time costs money, and nVidia paying game devs to spend extra time optimizing the game to run better for the customer is a good thing. I wish ATi cared as much about their customers and making them happy. They had a similar program, and all but dropped it...



...fixed that for you there.

If you think Nvidia isn't "encouraging" developers to design in ways that only make it faster on _their_ hardware while simultaneously making it slower on other brands, you're fooling yourself. It's business, and if they can make money off of it, you can bet they're doing it.

There is no such thing as a conspiracy in the business world if there's money to be made.

I'd rather the game run the same speed on ALL hardware, and not have to buy a certain company's products to get full speed.


----------



## pantherx12 (Sep 14, 2009)

Taiwan Semiconductor Manufacturing Company?

So I'm guessing that's a yes then, since they can produce things at a 45 nm scale.


----------



## W1zzard (Sep 14, 2009)

pantherx12 said:


> Taiwan Semiconductor Manufacturing Company?
> 
> So I'm guessing that's a yes then, since they can produce things at a 45 nm scale.



yes that's tsmc. yes tsmc has a 40 nm node. nvidia has been with tsmc for ages


----------



## Nemo~ (Sep 14, 2009)

this is just wicked sick :O


----------



## Valdez (Sep 14, 2009)

W1zzard said:


> those chiphell performance figures are against the gtx 285, at least thats what it says there



One of those shows 5870 vs gtx2*9*5.


----------



## InTeL-iNsIdE (Sep 14, 2009)

newtekie1 said:


> PhysX does not have a hit on game performance. Yes, the FPS goes down because there are more objects to render. However, the big performance improvement comes with the game physics. If the same level of physics were done without hardware acceleration, the game would slow to a crawl.
> 
> So a benchmark that shows this is entirely representative of real life.



It's simply because PhysX is an Nvidia trademark, and in case you haven't noticed, it is dying and will be dead and gone when DX11 games start to come out. There were very few titles that used PhysX, and it was a non-starter from the get-go. The score from an NVIDIA card running PhysX is not comparable to that of an ATI card of roughly the same performance, because the physics tests in Vantage were supposed to be done by the CPU and were not specific to the PhysX API. So an NVIDIA card that got 40% more than a comparable ATI card in Vantage is not representative of real-life benchmarks and games. It was a bug that got fixed, end of.





newtekie1 said:


> Yes, it is soooooo outrageous that nVidia pays for development time to make the game optimized better. Development time costs money, and nVidia paying game devs to spend extra time optimizing the game to run better for the customer is a good thing. I wish ATi cared as much about their customers and making them happy. They had a similar program, and all but dropped it...



Wrong again. Development my ass; what about the .exe names that could be changed to give ATI users the same performance that NVIDIA users were getting in these so-called "optimised for Nvidia" games? It's marketing BS, and it is wrong for game devs to bend over for a quick buck from NVIDIA when the games shouldn't run any worse on ATI hardware.


----------



## pantherx12 (Sep 14, 2009)

Don't Nvidia have the greater market share?

Devs will make games that work with the majority every time.

If ATI takes the lead, then devs will make things work better with ATI.


----------



## FreedomEclipse (Sep 14, 2009)

time to trade in my 2 4870's.....


----------



## pantherx12 (Sep 14, 2009)

mastrdrver posted these in the HD5 series discussion thread.

Edit: see my below post.


----------



## mastrdrver (Sep 14, 2009)

Thanks panther but your links don't work. 

Just watch XS and B3D. Stuff is just popping up left and right on those two forums. There will be nothing left to reveal by the time the NDA is up.


----------



## newtekie1 (Sep 14, 2009)

mdm-adph said:


> You think Nvidia's going to have their answer to the 5000 series out before December?  Everything I've ever heard around here says no.



I think there is a possibility.



mdm-adph said:


> ...fixed that for you there.
> 
> If you think Nvidia isn't "encouraging" developers to design in ways that only make it faster on _their_ hardware while simultaneously making it slower on other brands, you're fooling yourself. It's business, and if they can make money off of it, you can bet they're doing it.
> 
> ...



That isn't true, and there isn't a single thing even indicating it.  When you show me a sliver of proof, I'll talk to you about this, until then, don't spread lies.



InTeL-iNsIdE said:


> It's simply because PhysX is an Nvidia trademark, and in case you haven't noticed, it is dying and will be dead and gone when DX11 games start to come out. There were very few titles that used PhysX, and it was a non-starter from the get-go. The score from an NVIDIA card running PhysX is not comparable to that of an ATI card of roughly the same performance, because the physics tests in Vantage were supposed to be done by the CPU and were not specific to the PhysX API. So an NVIDIA card that got 40% more than a comparable ATI card in Vantage is not representative of real-life benchmarks and games. It was a bug that got fixed, end of.



When DX11 comes out, I'll be glad, as a unified physics API is what the industry needed.

However, DX11 and PhysX dying have nothing to do with Vantage, a DX10 benchmark. Vantage was not just a benchmark of raw graphical performance; it was a benchmark of overall computer performance. The PhysX API was included, and meant to be used. Futuremark even allowed it to run entirely on dedicated PhysX cards; all the world records were set with Ageia cards in the machines, and it was perfectly acceptable.

The PhysX tests in Vantage were meant to test PhysX performance. It didn't matter where it was calculated, until nVidia started doing it on the GPUs and the ATi fans started freaking out.



InTeL-iNsIdE said:


> Wrong again. Development my ass; what about the .exe names that could be changed to give ATI users the same performance that NVIDIA users were getting in these so-called "optimised for Nvidia" games? It's marketing BS, and it is wrong for game devs to bend over for a quick buck from NVIDIA when the games shouldn't run any worse on ATI hardware.



Show me this, because I doubt it ever happened. What you are saying isn't possible. Changing the EXE name improves performance for ATi for several reasons, but not the reason you are stating. Usually it is done to invoke driver optimizations that ATi applied to an older game for a newer one; it has nothing to do with the game devs at all. Renaming the EXE fools outside programs (drivers), but it doesn't change what the program itself runs: the in-game optimizations won't change if you rename the EXE.

But I fear we are going way off topic, so I'll end it here. If you want to discuss it further, a dedicated topic would be best.


----------



## gumpty (Sep 14, 2009)

pantherx12 said:


> mastrdrver posted these in the HD5 series discussion thread.
> 
> http://tweakimg.net/g/forum/template...com/oho6qa.png
> http://tweakimg.net/g/forum/template...om/2gt16v5.png
> ...



Your links appear to be broken, sir.

EDIT: I am so slow.


----------



## pantherx12 (Sep 14, 2009)

Bollocks!

Sorry guys

Fixed in this post

http://tweakimg.net/g/forum/templates/tweakers/html/showimage.html?http://i31.tinypic.com/oho6qa.png

http://tweakimg.net/g/forum/templat...image.html?http://i28.tinypic.com/2gt16v5.png

http://tweakimg.net/g/forum/templates/tweakers/html/showimage.html?http://i32.tinypic.com/wj9ru9.png


----------



## mdm-adph (Sep 14, 2009)

newtekie1 said:


> That isn't true, and there isn't a single thing even indicating it.  When you show me a sliver of proof, I'll talk to you about this, until then, don't spread lies.



Prove that it's not.  

It's a mechanism by which Nvidia could reap higher profits at absolutely zero risk, even if they're found out.  Therefore, in the business world, trust me -- it's being done.


----------



## newtekie1 (Sep 14, 2009)

mdm-adph said:


> Prove that it's not.
> 
> It's a mechanism by which Nvidia could reap higher profits at absolutely zero risk, even if they're found out.  Therefore, in the business world, trust me -- it's being done.



You made the negative claims; it is your responsibility to prove them. It isn't a person's responsibility to prove they didn't commit murder just because someone else says they did. The person making the accusations bears the burden of proof.


----------



## laszlo (Sep 14, 2009)

According to this, it's not a big "leap"; older cards are still good enough.


----------



## pantherx12 (Sep 14, 2009)

What are you talking about? That's a huge leap!

The 295 has two GPUs; the 5870 has one.


----------



## mdm-adph (Sep 14, 2009)

newtekie1 said:


> You made the negative claims, it is your responsibility to prove them.  It isn't a persons responsibility to prove they didn't commit murder, just because someone else says they did.  The person making the accusations bears the burden of proof.



Yeah, well this isn't a criminal claim I'm making, either.    It's civil -- and in that case, as long as there is a "preponderance of the evidence," the burden of proof is on Nvidia to prove that they're not guilty.

And I think the fact that they could make millions is all the "preponderance" I (and anyone else) need.


----------



## TheMailMan78 (Sep 14, 2009)

But will the 5870 fix the G-D DAMN SHADOWS IN BF2??!


----------



## KainXS (Sep 14, 2009)

laszlo said:


> According to this, it's not a big "leap"; older cards are still good enough.



It's big enough for me; it outperforms the highest-end consumer card available (not including the MARS) at high res with AA, based on what I've seen.


----------



## newtekie1 (Sep 14, 2009)

mdm-adph said:


> Yeah, well this isn't a criminal claim I'm making, either.    It's civil -- and in that case, as long as there is a "preponderance of the evidence," the burden of proof is on Nvidia to prove that they're not guilty.
> 
> And I think the fact that they could make millions is all the "preponderance" I (and anyone else) need.



Hardly a preponderance of evidence.  "They could make money if they did" isn't evidence of any kind.  It is still up to the accuser to provide evidence, and motive is not evidence and certainly not enough to be considered a preponderance.



laszlo said:


> According to this, it's not a big "leap"; older cards are still good enough.



It is a big leap for a single GPU, but more importantly a very good sign. It is the first time in a long while that ATi has managed to put out a next-generation single-GPU card that can top all the cards from the previous generation. This is what the industry and consumers need.


----------



## araditus (Sep 14, 2009)

Newtekie, you troll hardcore, bro, gotta always get the last word. This is a news thread; input your news or specific facts and chill, toast a bagel or go for a run to get the serotonin levels up.


----------



## KainXS (Sep 14, 2009)

Newtekie is right; I don't think he's trolling.

Some games aren't optimized for ATI cards because game developers in general aren't going to optimize games unless they are paid to do so... because they can make money. It's all about the money; it's the money game as usual.

And certain games are faster after being renamed because ATI does optimizations every now and then by having the drivers hook onto the game's filename. If the game has no fixes, it will run without them and might run like crap, so you rename the exe to a filename the drivers can recognize and hook onto, and then the game loads with those optimizations. Nvidia tries to avoid this by saying, here, take this money and optimize this game for our cards... done... and ATI does the same thing sometimes.

There's nothing evil about it; that's ATI's fault for not playing the game.

I can say this, and I will still be one of the first in line to get a HD 5870 in two weeks, no matter what.


----------



## mdm-adph (Sep 14, 2009)

newtekie1 said:


> Hardly a preponderance of evidence.  "They could make money if they did" isn't evidence of any kind.  It is still up to the accuser to provide evidence, and motive is not evidence and certainly not enough to be considered a preponderance.



All a business does is try to make more money -- that's the motive.  The stupid "Way It's Meant to Be Played" is the evidence.  End of line.

But that's all I'm going to bother saying -- it's not worth it.  

Another reason why they've got nothing to lose is that they've got people like you, ready to defend anything they do.


----------



## newtekie1 (Sep 14, 2009)

araditus said:


> Newtekie, you troll hardcore, bro, gotta always get the last word. This is a news thread; input your news or specific facts and chill, toast a bagel or go for a run to get the serotonin levels up.



Because you obviously don't know the definition of trolling. /\/\/\//\/\That/\/\/\/\/\ is trolling... how ironic.

Being in a discussion doesn't make me a troll, no matter how much you disagree with my opinions.



mdm-adph said:


> All a business does is try to make more money -- that's the motive.  The stupid "Way It's Meant to Be Played" is the evidence.  End of line.
> 
> But that's all I'm going to bother saying -- it's not worth it.
> 
> Another reason why they've got nothing to lose is that they've got people like you, ready to defend anything they do.



Hardly evidence to support your claims. Come on, at least try. Again, when you can back up your claims, we'll talk. Show me at least a little hard evidence, not just conspiracy BS.



KainXS said:


> Newtekie is right; I don't think he's trolling.
> 
> Some games aren't optimized for ATI cards because game developers in general aren't going to optimize games unless they are paid to do so... because they can make money. It's all about the money; it's the money game as usual.
> 
> ...



Thank you. There is really no use stating logic, though; mdm-adph and others like him don't care. They believe nVidia is evil, with no evidence of any wrongdoing, and nothing you can say will change that. mdm-adph is the first person to bash nVidia as much as possible, and then call everyone who disagrees with him a fanboy because they don't agree that ATi is god's gift to video cards.

You say negative things about ATi and you're an instant nVidia fanboy to mdm-adph. It doesn't matter that you say something positive in the same post, or even that you say negative things about nVidia too...


----------



## erocker (Sep 14, 2009)

araditus said:


> Newtekie, you troll hardcore, bro, gotta always get the last word. This is a news thread; input your news or specific facts and chill, toast a bagel or go for a run to get the serotonin levels up.





newtekie1 said:


> Because you obviously don't know the definition of trolling. /\/\/\//\/\That/\/\/\/\/\ is trolling... how ironic.
> 
> Being in a discussion doesn't make me a troll, no matter how much you disagree with my opinions.



Stop... now. Use the report button. Infractions are given to trolls and to those who troll trolls. Keep on topic. Don't feel the need to explain yourself. Don't derail the subject at hand. State your case and move along.

Thank you.


----------



## niko084 (Sep 14, 2009)

I'll wait for real reviews and real tests, and see what the power consumption is like.

Been holding off buying a 4870 1GB or 4890 for a bit here, glad time is closing in.


----------



## Valdez (Sep 14, 2009)

http://forum.hwsw.hu/index.php?showtopic=147595&view=findpost&p=6033805


----------



## TheMailMan78 (Sep 14, 2009)

newtekie1. You have a knack for pissing people off. More than I do.


----------



## overclocking101 (Sep 14, 2009)

So basically it's just like when the HD 4xxx series came out: the 4870 was a small margin faster than the 3870 X2. I'm not really surprised, tbh.


----------



## FreedomEclipse (Sep 14, 2009)

Valdez said:


> http://forum.hwsw.hu/index.php?showtopic=147595&view=findpost&p=6033805



That's cool, thanks for that. Just reading some of the results of the benches there - the GTX 295 does an amazing job, and in most of the benches where it beats the 5870 it beats it senseless, like a moose clubbing a yeti over the head with a turkey drumstick. This is awesome!


----------



## HellasVagabond (Sep 14, 2009)

ArmoredCavalry said:


> If this is the true performance, I really can't see it being priced at $300 (at least not for long). :\
> 
> Also, about physX, I could care less about physX. It was a joke when Ageia owned it, and it is a joke now that Nvidia owns it. I stood next to some flags with phsyX on in Mirror's Edge. Hey look waving flags. Then I turned off phsyX, hey look waving flags, and now I get 20 more fps....



In the past couple of months I have seen 5 games get released that support Ageia (including Batman: Arkham Asylum), so on what do you base your assumption? Ageia does NOT improve graphics, it makes them more realistic, so a flag is not exactly what I would call special in terms of rendering.


----------



## Benetanegia (Sep 14, 2009)

mdm-adph said:


> All a business does is try to make more money -- that's the motive.  The stupid "Way It's Meant to Be Played" is the evidence.  End of line.
> 
> But that's all I'm going to bother saying -- it's not worth it.
> 
> Another reason why they've got nothing to lose is that they've got people like you, ready to defend anything they do.



Broken logic. If anything like that was happening, AMD (or Ati before it) would have said something. You claim that a company spends millions hurting the competition (because that's business, right?) and the competition does nothing in the meantime?

Against a company that spends millions doing something like that, the answer couldn't be simpler: a press conference stating what's happening. But in 5+ years, nothing, nada. In the meantime AMD has been engaged in 3 antitrust cases against Intel, spending millions on each of them.

And don't pretend that in 5 years Ati couldn't find a single piece of proof of Nvidia's evil conspiracy, if it was happening; both Ati and Nvidia know a lot about each other's specs, deadlines, manufacturing success, etc., because they have spies. Now that IS business.

Stop with the TWIMTBP BS for once...

Wow! I forgot to write the important part...

Those are nice FPS numbers we are seeing here! I'll wait until Wizzard does a good review though. I never believe slides like those, where not only do they pick the games that best suit them but also the best settings, going down to 8x AF from 16x or upping the AA to 8x as they see fit. I hate that. Let's see some real numbers on the 23rd.


----------



## Wile E (Sep 14, 2009)

How did we get into a TWIMTBP discussion again? Look, nVidia pays developers to optimize, not to make ATI perform worse. If ATI wants devs to optimize for their hardware, they need to pay up as well. Until then, the responsibility falls on ATI to optimize in drivers. That's the path they choose, for whatever reasons they may have. It's not a damn conspiracy.

As far as these bench results, I'll take them with a grain of salt. I'll wait for the NDA to be lifted and check out official reviews.


----------



## mtosev (Sep 14, 2009)

When will the 5870 X2 be available?


----------



## Easy Rhino (Sep 14, 2009)

Was the guy drunk when he did the benchmark? I never put much stock in these initial benchmarks anyway. I'll go through 3 of them from different sources (TPU being 1 of them) and then make my buying decision. Except I will be buying nVidia DX11 cards.


----------



## largon (Sep 14, 2009)

nV doesn't even pay anything; they loan engineers to developers participating in their TWIMTBP program so that they can optimise the game engine to run optimally on nV's own graphics architecture. End of story.


----------



## Easy Rhino (Sep 14, 2009)

Wile E said:


> How did we get into a TWIMTBP discussion again? Look, nVidia pays developers to optimize, not to make ATI perform worse. If ATI wants devs to optimize for their hardware, they need to pay up as well. Until then, the responsibility falls on ATI to optimize in drivers. That's the path they choose, for whatever reasons they may have. It's not a damn conspiracy.
> 
> As far as these bench results, I'll take them with a grain of salt. I'll wait for the NDA to be lifted and check out official reviews.



I'm sure AMD is trying to build some anti-competition lawsuit against nVidia. They did it with Intel...


----------



## mdm-adph (Sep 14, 2009)

Benetanegia said:


> Broken logic. If anything like that was happening, AMD or Ati before, would have said something. You pretend that a company spends millions hurting the competition (because that's bussiness right?) and the competition does nothing in the meantime?
> 
> Against a company that spends millions doing somehing like that, the answer couldn't be simpler: a press conference stating what's happening. But in 5+ years nothing, nada. In the meantime AMD has been engaged in 3 antitrust cases against Intel, spending millions in each of them.
> 
> And don't pretend that in 5 years Ati couldn't find a single proof against Nvidia's evil conspiration, if that was happening, both Ati and Nvidia know a lot of each others specs, deadlines, mannufacturing success, etc, because they have spies. Now that IS bussiness.



Never said it was evil -- it's just business.  And who knows?  Now that the Intel case is mostly over, maybe they'll do something on the nVidia front.


----------



## TheMailMan78 (Sep 14, 2009)

Wile E said:


> How did we get into a TWIMTBP discussion again? Look, nVidia pays developers to optimize, not to make ATI perform worse. If ATI wants devs to optimize for their hardware, they need to pay up as well. Until then, the responsibility falls on ATI to optimize in drivers. That's the path they choose, for whatever reasons they may have. It's not a damn conspiracy.
> 
> As far as these bench results, I'll take them with a grain of salt. I'll wait for the NDA to be lifted and check out official reviews.



Nvidia hired Oswald because JFK wasn't playing the way it was meant to be played.


----------



## LaidLawJones (Sep 14, 2009)

Just kind of wondering. Isn't the 5870 DX11? It's nice that it is doing well in DX10/10.1, but we won't really know how it performs until some DX11 games are out.

Otherwise we are into the old battle of: you ran the card with/without PhysX, you ran with 10 not 10.1, etc.

We need to compare apples to apples.


----------



## newtekie1 (Sep 14, 2009)

At this point, people are going to be buying these for the same reason they bought 8800GTX's.  Because they dominate the last generation of DX, not because they can run the latest which has next to no games.


----------



## Benetanegia (Sep 14, 2009)

Easy Rhino said:


> im sure amd is trying to create some anti-competition lawsuit against nvidia. they did it with intel...



They did it with Intel because Intel was cheating and they had proof. Also, the lawsuit was filed in 2005.



> Intel also faces a U.S. lawsuit filed in federal court by rival chip maker Advanced Micro Devices  (nyse: AMD -  news  -  people ) in 2005, claiming that Intel had forced PC makers to boycott AMD, "threatened retaliation" against customers using or selling AMD processors and offered rebates to customers designed to block the purchase of AMD's products.



Source: http://www.forbes.com/2008/06/06/intel-antitrust-ftc-tech-enter-cx_ag_0606intel.html



mdm-adph said:


> Never said it was evil -- it's just business.  And who knows?  Now that the Intel case is mostly over, maybe they'll do something on the nVidia front.



If Ati or AMD had one against Nvidia we would know already. TWIMTBP has been in place since 2002(?), I think. In that long, Ati/AMD had time to create a fake game developer themselves, enter the program, discover exactly what happens under TWIMTBP, and gather some proof and all that. They had the time to do that 4 times.

Sorry to insist, but a competitive company wouldn't let another one outsell them on a 2:1 basis for so long without saying *something* if they had the slightest idea that something shady like that was happening. AMD did exactly that legally (filing a lawsuit) in 2005 regarding Intel's behavior, but had been saying the same "unofficially" since much earlier. Show me an official statement regarding TWIMTBP, or even an AMD/Ati representative saying something that looks even slightly suspicious. Maybe then this conversation could make some sense. But it won't. It doesn't matter how hard you try to convince an alien conspiracist that they are not here; they will always find something to make them think there's a possibility. And if it's possible, it's happening; that's the logic they follow. And sorry mate, but the same is happening to you here. Look some posts above if you want to read yourself saying that...


----------



## aCid888* (Sep 14, 2009)

newtekie1 said:


> At this point, people are going to be buying these for the same reason they bought 8800GTX's.  Because they dominate the last generation of DX, not because they can run the latest which has next to no games.



If I was one of the ones who got a GTX when it came out... I'd be extremely happy, as it still puts up a good fight even two years later!

The 5870 should dominate all the current cards. Look at the 1GB 4870 vs the 3870 X2 - it slaps it silly... but you'd expect it to, being a whole gen newer.

I am interested in these new ATI cards, as the idea of having one card drawing less power and being just as powerful as my current Crossfire setup is almost orgasmic.


----------



## erocker (Sep 14, 2009)

newtekie1 said:


> At this point, people are going to be buying these for the same reason they bought 8800GTX's.  Because they dominate the last generation of DX, not because they can run the latest which has next to no games.



DX11 may make a little impact on a buying decision, but it's all about performance. If I were to buy a card for DX11, I would wait a while. For a single GPU solution that can compete with 4870x2s and GTX295s there is definitely a temptation to get one.


----------



## 3870x2 (Sep 14, 2009)

LaidLawJones said:


> Just kind of wondering. Isn't the 5870 DX11. It's nice that it is doing well with the DX10./10.1 but we won't really know how it performs until some DX11 games are out.
> 
> Other wise we are into the old battle of , you ran the card with/without physix/ you ran with 10 not 10.1 etc.
> 
> We need to compare apples to apples.



Apples to apples? Performance is relative: if it performs 15% better in DX9/10/10.1, it is likely to do the same in DX11. This has been proven time and time again.


----------



## gumpty (Sep 14, 2009)

Meh, all this 'nVidia are evil, no, ATI are dumb' arguing about TWIMTBP is pointless.

The reality is they're both as evil as each other and have both worked together to stiff *us* out of our hard-earned dosh.

Price Fixing

They're both soulless corporations that exist solely to squeeze as much money out of us as possible - absolutely crazy to back one rather than the other.

So anyway. I'm pretty excited to see what these new GPUs can do. Will be waiting to see what Nvidia's offerings can do before I buy one, though.


----------



## Benetanegia (Sep 14, 2009)

erocker said:


> DX11 may make a little impact on a buying decision, but it's all about performance. If I were to buy a card for DX11, I would wait a while. For a single GPU solution that can compete with 4870x2s and GTX295s there is definitely a temptation to get one.



I don't think it will have any this time around. After the DX10 fiasco most people won't jump into the same trap again. I expect exactly the opposite, in fact: most people will be more skeptical about DX11 than they should be. But that's just my opinion.


----------



## erocker (Sep 14, 2009)

Benetanegia said:


> I don't think it will have any this time around. After the DX10 fiasco most people won't jump into the same trap again. I expect exactly the oposite in fact, that most people will be more skeptical about DX11 than they should. But that's just my opinion.



You are quite right. Many people feel burned/scorned by DX10. Here's to hoping DX11 doesn't turn into DX10.


----------



## laszlo (Sep 14, 2009)

newtekie1 said:


> It is a big leap for the single GPU, but more importantly a very good sign.  It is the first time in a long while that ATi has managed to put out a next generation single GPU card that can top all the cards from a previous generation.  This is what the industry/consumer needs.




My observation is based on the fact that the greens will also launch a new GPU (I also remember the news about it being 500 times more powerful...), so I expected a little bit more.


----------



## mdm-adph (Sep 14, 2009)

erocker said:


> You are quite right. Many people feel burned/scorned by DX10. Here's to hoping DX11 doesn't turn into DX10.



I couldn't care less about DX10 or DX11, as long as the DX9 performance is much faster!


----------



## Benetanegia (Sep 14, 2009)

Don't get me wrong though. I think sales will be comparatively better (after accounting for the impact of the recession), just based on performance.



mdm-adph said:


> I couldn't care less about DX10 or DX11, as long as the DX9 performance is much faster!



That is true. Until the console-porting crap ends, we are somewhat bound to DX9 performance.


----------



## AsRock (Sep 14, 2009)

newtekie1 said:


> You made the negative claims, it is your responsibility to prove them.  It isn't a persons responsibility to prove they didn't commit murder, just because someone else says they did.  The person making the accusations bears the burden of proof.






mdm-adph said:


> Prove that it's not.
> 
> It's a mechanism by which Nvidia could reap higher profits at absolutely zero risk, even if they're found out.  Therefore, in the business world, trust me -- it's being done.




If it is happening, ATI have surely gone above and beyond. If not, they still made the 58xx range really interesting.

I'm not sure, but it would not surprise me one bit if it was happening.

What counts at this time is that AMD/ATI seem to have just made another great step in the right direction, whether or not games are optimized for their cards.


----------



## Scrizz (Sep 14, 2009)

we already know of games that are going to come out with DX11
so...


----------



## grunt_408 (Sep 14, 2009)

This is good! If the 5870 really does perform that well then I will be getting rid of my 4890s for one. I am not surprised at those bench results given the sheer size of the PCB and chip. Crikey, something that big has to mean business.


----------



## Imsochobo (Sep 14, 2009)

HellasVagabond said:


> In the past couple of months i have seen 5 games getting released that support Ageia ( Including Batman Arkham Asylum ) so on what do you base your assumption ? Ageia does NOT improve graphics, it makes them more Realistic so a flag is not exactly what i would call special in terms of rendering.




Hi.

About this Ageia tech: perform a blind test and see if users can tell whether their gameplay experience was better on an ATI system or an nVidia system (equal fps numbers are required for this test). Unless you tell them they should notice it, they rarely WILL. You may see a difference in some games, but in most cases you won't.

Eyefinity might be just as strong an argument as Ageia. Nevertheless, there is a replacement for Ageia but not for Eyefinity (Matrox, which means extra cost!); PhysX is dead soon.

CUDA is not. Yet. The same applies to Stream: nothing is going to be mainstream before BOTH have it, and I don't see any reason for anyone to really consider buying an nVidia card for something that is nVidia-only and has to be specifically supported by game developers.

Who would buy a damn car if only 10 roads were supported by the damn car.


----------



## chron (Sep 14, 2009)

LOL! Such a lively discussion on the matter.  I read as much as I can but the discussion seems to go in a loop.


----------



## newtekie1 (Sep 14, 2009)

Scrizz said:


> we already know of games that are going to come out with DX11
> so...



We already knew of games that were coming out with DX10 when the 8800GTX was released.  And just like then, these games will be DX9 games with DX10 and now DX11 features added on, and the DX10/11 features will add next to nothing to the gameplay.


----------



## FreedomEclipse (Sep 14, 2009)

Imsochobo said:


> Who would buy a damn car if only 10 roads were supported by the damn car.



that would depend on:

#1. what the car is
#2. how fast the car goes
#3. if its a kit car (or not)
#4. is fuel also infinite?
#5. if there are any speed limits on these 10 roads you speak of


----------



## Imsochobo (Sep 14, 2009)

FreedomEclipse said:


> that would depend on:
> 
> #1. what the car is
> #2. how fast the car is
> ...



If it performed like any other car.

Or a different scenario: a car that could drive on those 10 roads and on the other roads, but with like a 5 mph/10 km/h higher speed limit - just incremental improvements.


----------



## grunt_408 (Sep 14, 2009)

FreedomEclipse said:


> that would depend on:
> 
> #1. what the car is
> #2. how fast the car goes
> ...



lol I still think it looks like a wicked card and its BIG


----------



## FreedomEclipse (Sep 14, 2009)

Imsochobo said:


> If it performed like any other car.
> 
> Or a diffrent scenario.
> a car that could drive on those 10 roads and drive on the other roads, but had like 5 mph/10km/h faster speed limit, just incremental improvements.



if its a car that only goes 5mph, forget it. I can sprint for 30mins & get where i need to go a lot quicker


----------



## extrasalty (Sep 14, 2009)

ATI seems to finally have pulled it off, but unless prices come down I don't plan upgrades for the next 12 months- a pair of GTX260 cost me $300 after rebates and I sincerely doubt any ATI 58xx will bring over 18000 PPD in Folding@Home.


----------



## tastegw (Sep 14, 2009)

FreedomEclipse said:


> if its a car that only goes 5mph, forget it. I can sprint for 30mins & get where i need to go a lot quicker



Sprint for 30 minutes! Wow, you should try out for the Olympics 
That's about 5-7 miles in that 30 minutes.

But back on topic: I'm very excited about the next-gen cards from both ATI and nVidia.
A single-GPU card that competes against the 295 sounds smexy! And I know (it's my opinion) nVidia's flagship single-GPU card will be either as good as or even better than the 5870.


----------



## troyrae360 (Sep 14, 2009)

Wow, that's awesome, and I'm sure ATI will have a little something up their sleeve (e.g. a 5870 X2 or 5890) for when Nvidia releases their next GPU lol


----------



## springs113 (Sep 14, 2009)

tastegw said:


> sprint for 30 minutes! wow, you should try out for the olympics
> thats about 5-7 miles in that 30 minutes.
> 
> but back on topic,  im very excited about the nest gen cards from both ATI and nVidia.
> a single gpu card that competes against the 295 sounds smexy!, and i know (its my opinion) nvidia's flagship single gpu card will be either as good or even better than the 5870.




Has it come to anyone's mind that maybe ATI already has another 4890-style card... (5890) in mind, just in case Nvidia comes out with something decent?

On a side note, I know every company is here to make money... but I would like to think that ATI might be charging the prices speculated because the product is worth just that... given that the prices will drop when Nvidia comes out with GT300.


----------



## pantherx12 (Sep 14, 2009)

And don't forget they have HDx900 cards they could introduce ( obviously they've not done that before but just throwing some shapes  )


----------



## Scrizz (Sep 14, 2009)

newtekie1 said:


> We already knew of games that were coming out with DX10 when the 8800GTX was released.  And just like then, these games will be DX9 games with DX10 and now DX11 features added on, and the DX10/11 features will add next to nothing to the gameplay.



if DX11 gives more fps I'm all for it


----------



## btarunr (Sep 14, 2009)

Hemlock will take care of GT300. Assuming Cypress is 1.6x as fast as GT200, GT300 will still have to be 3x as fast as its predecessor to stand a chance against Hemlock. A 3x performance jump over its previous generation sounds unreal to me. It would be awesome if they pull it off.
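The arithmetic behind that estimate can be sketched quickly. All the numbers below are the rumored relative-performance figures from this thread, not measured data, and Hemlock is modeled with perfect dual-GPU scaling, which real CrossFire setups rarely achieve:

```python
# Back-of-the-envelope check: relative performance vs. a GT200 baseline.
gt200 = 1.0
cypress = 1.6 * gt200          # rumored single-GPU Cypress speedup
hemlock = 2 * cypress          # dual-Cypress board, ideal scaling assumed
required_gt300 = hemlock / gt200
print(required_gt300)          # 3.2 -> GT300 would need ~3x GT200
```

Under those assumptions GT300 needs roughly a 3x generational jump just to match Hemlock, which is the point made above.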


----------



## LaidLawJones (Sep 14, 2009)

> apples to apples? Performance is relative, if it performs 15% better on 9/10.1/11, it is likely to do the same on 11. This has been proven time and time again.



Yes, apples to apples. The last round of reviews was a pain in the ass. Between green and red throwing accusations about this feature being on or off, this or that version, and then throwing in "you used this game, not that one", it was a nightmare.

When Green releases their card, at least we can compare on the same DX version. Sites can then balance out the games that favor each card. Drivers are the problem of the companies that make the cards.

The question of which systems are used to test the cards will still be open to debate, but at least we won't be using 3 versions of DX.


----------



## OneCool (Sep 14, 2009)

I don't believe those OpenGL numbers AT ALL!!


----------



## pantherx12 (Sep 14, 2009)

btarunr said:


> Hemlock will take care of GT300. Assuming Cypress is 1.6x as fast as GT200, GT300 will still have to be 3x as fast as its predecessor to stand a chance against Hemlock. A 3x performance jump from its previous generation to me sounds unreal. Would be awesome if they pull it off.



They'd be breaking Moore's law D:

It'd be super impressive.


----------



## EastCoasthandle (Sep 14, 2009)

Remember folks, there is a diminishing return on the additional performance with current-gen games.  We are beyond the era where we actually needed better cards because the current generation didn't offer enough performance for the games of the day.  Like I said before, the max fps is a no-brainer; it's the minimums that are going to tell the real story here.  With what looks like a good start for DX11 games, it's apparent that we should see them become more prevalent than DX10 games IMO.  That's where the true benchmark results will tell us what these cards are worth.


----------



## Benetanegia (Sep 15, 2009)

btarunr said:


> Hemlock will take care of GT300. Assuming Cypress is 1.6x as fast as GT200, GT300 will still have to be 3x as fast as its predecessor to stand a chance against Hemlock. *A 3x performance jump from its previous generation to me sounds unreal*. Would be awesome if they pull it off.



I don't know what to think about that. Rumors say GT300 will be 500 mm² or bigger, so that's far more than double what a GT200 would be at 40 nm. Also, RV870 is 330 mm² and has 2.1 billion transistors. By simple math GT300 would have around 3.2 billion transistors, more than double that of GT200. Also, Ati doubled up everything; Nvidia doesn't need to do that, in theory. 32 ROPs is more than enough, and Nvidia already had them. They already had a 512-bit memory bus too, and if you look at the die shots of GT200, half the chip was dedicated to ROPs/memory, so they *could* fit 3x the shader/texturing power into a chip that is twice the size or even more. What they do in the end, that's another story.
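That "simple math" amounts to assuming GT300 keeps roughly RV870's transistor density at 40 nm. A quick sketch, using only the rumored figures quoted in this post (none of these are confirmed specs):

```python
# Scale RV870's transistor density up to the rumored GT300 die size.
rv870_area_mm2 = 330.0        # RV870 die area quoted above
rv870_transistors = 2.1e9     # RV870 transistor count quoted above
gt300_area_mm2 = 500.0        # rumored GT300 die area

density = rv870_transistors / rv870_area_mm2   # transistors per mm^2
gt300_estimate = density * gt300_area_mm2
print(round(gt300_estimate / 1e9, 1))          # ~3.2 (billion)
```

Same-density scaling is a rough approximation at best, since the two chips come from different design teams, but it is where the ~3.2 billion figure comes from.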

IMO Nvidia has every ability to win this round hands down, even with RV870 being so impressive. Ati had all the advantages with RV770 (55 nm, GDDR5) and Nvidia all the disadvantages (like trying to push GPGPU by using 10% of the die area exclusively for that purpose). Even then Nvidia managed the situation very well. This time around both have the same weapons, and Nvidia still has MIMD, which I don't know if it's going to be a blessing or a burden (for gaming; for GPGPU it will undoubtedly own). But the thing is that Nvidia has a lot of possibilities this time around.

Also, I think that a dual Nvidia card was already planned from the beginning. If a GTX 295 could be done, they will be able to make a GTX 395. It's more a matter of "do we need it?".


----------



## happita (Sep 15, 2009)

Everything said about Cypress at this point is 99.9% exaggerated to me. The day real performance numbers come out on the 23rd will be the day I open my eyes and believe what I actually see. Until then, I guess these figures give us a slight clue as to how it will compete with the current cards on the market.


----------



## troyrae360 (Sep 15, 2009)

I would say these benchmarks are real, and I'm surprised that benchmarks weren't leaked a couple of days ago.

But then again, I was surprised that CAT 9.9 wasn't leaked this month either.


----------



## DaC (Sep 15, 2009)

Well, it seems from all the leaked reviews (and all of them are pretty close) that the HD 5xxx will be an outstanding card.... I really don't care whether Nvidia or ATI has the fastest card; all I care about is that these cards will make prices drop a lot, and even more when GT300 comes....
By then everybody will be happy with new-gen cards that consume low power and can handle even 2560x1600 8xAA 16xAF very well, for around $200-$250 (just like when the 3870 and 4870 had their first price drops).
In the end... Nvidia or ATI, it doesn't matter, because we are the true winners....


----------



## jaydeejohn (Sep 15, 2009)

Thing is, nVidia doesn't have a tessellator in their shrink, nor can anyone at this point estimate what other die-size costs DX11 will impose, besides the tessellator.


----------



## Benetanegia (Sep 15, 2009)

jaydeejohn said:


> Thing is, nVidia doesnt have a tesselator in their shrink, nor can anyone at this point consider what other costs die size wise DX11 will need, besides the tessellator



AFAIK in DX11 tessellation is part of the Shader Model, so both brands should have tessellation inside the shaders, with no need for a separate tessellator. I mean, I suppose that tessellation on the shaders is a requirement.

In fact DX11 tessellation is supposedly very different from the tessellation that Ati was doing on their own, outside of the DX API, so there's no advantage there, except maybe some more experience with it. And tessellation is not even remotely new; anyone with a degree in something related to graphics knows the how-to very well, so I don't think there's going to be much of an edge there. In the end it's all maths, a lot like calculating the average of the positions of two adjacent pixels.
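As a toy illustration of that "averaging" idea (a sketch only; real DX11 tessellation runs through hull/domain shader stages and tessellation factors, none of which appear here): the core step in naive subdivision is inserting a vertex at the midpoint of an edge's two endpoints.

```python
# Naive edge subdivision: the midpoint is just the component-wise
# average of the two endpoint positions, the operation described above.
def midpoint(v0, v1):
    """Average two 3D vertex positions component-wise."""
    return tuple((a + b) / 2.0 for a, b in zip(v0, v1))

def subdivide_edge(v0, v1):
    """Split one edge into two by inserting the midpoint vertex."""
    m = midpoint(v0, v1)
    return [(v0, m), (m, v1)]

print(midpoint((0.0, 0.0, 0.0), (2.0, 4.0, 6.0)))  # (1.0, 2.0, 3.0)
```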


----------



## HellasVagabond (Sep 15, 2009)

Imsochobo said:


> Hi.
> 
> About this ageia tech.
> Perform a blind test and see if users can tell their gameplay experience was better on an ati system or nvidia system(same fps numbers is required for this test)
> ...




1) There are around 20 titles that I am aware of that have PhysX support, and I am sure there are more that I don't know of.
2) In 2010 we will have many, many titles supporting PhysX.
3) You don't pay extra, so PhysX is good.
4) In some games the difference is extreme.


----------



## PCpraiser100 (Sep 15, 2009)

Incredible. To the X2 owners, you just got pwned.


----------



## wiak (Sep 15, 2009)

HellasVagabond said:


> Why am i not surprised that both benchmarks where done with AMD-Friendly games/applications ?
> Had 3DMark Not disabled PhysX for NVIDIA cards this would be funny....
> 
> In any case wait for real reviews before judging any product, by NVIDIA or AMD.


lol, a PhysX-enabled 3DMark Vantage run isn't even a valid score according to Futuremark.
And did you know that most of nVidia's current lineup is just old renames? hehe


----------



## newtekie1 (Sep 15, 2009)

wiak said:


> lol PhysX enabled 3dmark vantage isnt even a valid score according to futuremark
> and did you know, that most of nvidia current lineup is just old renames? hehe



Yeah, pretty amazing that cards that old can still compete performance-wise with all but ATi's top 3 cards...

And they aren't exactly renames; for the most part they have been refined versions of previous cards, not identical ones.  The only true renames have been the 8800GT -> 9800GT (and even that was supposed to be a different card at first, but got reworked into just a rename due to the costs of retooling), and the 8800GS -> 9600GSO.


----------



## Wile E (Sep 15, 2009)

Yeah, I hate the renames, but it doesn't mean the cards aren't capable.

And why do people bash Physx? You get it free if you have a nVidia card. What's to lose? I actually miss it. Loved the explosions in Graw2 with my 8800GT. They're just not as nice with this setup, but this setup does tear thru absolutely everything without Physx. lol.


----------



## jaydeejohn (Sep 15, 2009)

Benetanegia said:


> AFAIK in DX11 tesselation is part of the Shader Model, so both brands should have tesselation inside the shaders, no need for a separate tesselator. I mean, I suppose that tesselation on the shaders is a requirement.
> 
> In fact DX11 tesselation is suposedly very different from the tesselation that Ati was doing on their own, outside of the DX API, so there's no advantage there, except maybe some more experience with it. And even then tesselation is not even remotely new, anyone with a grade in something related to graphics knows all the how-to very well, so I don't think there's going to be too much there. In the end it's all maths, a lot like calculating the averages of the position of two adjacent pixels.



So, you're saying nVidia is going to have no fixed-function tessellator at all, and it'll all be done through shaders?
I know the interpolation is being done inside the ATI shader cores, but there is still a fixed-function unit AFAIK, so nVidia may just forego all fixed function?
Like using the fixed-function unit for a particular tessellation kernel, even though interpolation is still being done inside the shader cores.


----------



## bangmal (Sep 15, 2009)

newtekie1 said:


> Yeah, pretty amazing that cards that old can still compete perofrmance wise with all but ATi's top 3 cards...
> 
> And they aren't exactly renames, they are for the most part have been refined versions of previous cards, not identical.  The only true renames have been the 8800GT -> 9800GT(and even that was suposed to be a different card at first, but got reworked to just be a rename due to costs of retooling), and the 8800GS to 9600GSO.





Hey boy, nVidia needed a 9800GTX+ and a $100 price to compete against a 4850.

I am sure they could still compete if they were reduced to $19.99.


----------



## Scrizz (Sep 15, 2009)

Why do ppl keep saying you get PhysX free?
First of all you have to *BUY* an NVidia card :shadedshu


----------



## Wile E (Sep 15, 2009)

Scrizz said:


> why do ppl keep saying you get PhysX free?
> First of all you have to *BUY* a NVdia card :shadedshu



Because if you already have an nVidia card, which most of the discrete gfx card owners of the world do, it is free. If you get an ATI card, you still had to buy it, but you don't get physx with it.


----------



## Mussels (Sep 15, 2009)

InTeL-iNsIdE said:


> its marketing BS and it is wrong for game devs to bend over for a quick buck from Nvidia when the games shouldnt run any worse on ATI hardware



Take Batman: Arkham Asylum: "oh, we made that engine so AA only works on Nvidia."
Rename it to UE3.exe and turn AA on in the CCC - whaddya know, it works.
Suddenly a backflip, it's all good! ATI will fix it soon!




HellasVagabond said:


> In the past couple of months I have seen 5 games get released that support Ageia (including Batman: Arkham Asylum), so on what do you base your assumption? Ageia does NOT improve graphics, it makes them more realistic, so a flag is not exactly what I would call special in terms of rendering.



And in all of those games, it doesn't affect gameplay one little bit. Run in hardware or software mode and it has zero impact on the game - it's no different to having a 'debris', 'breakable glass' or 'realistic cloth' tickbox - *if those features didn't use PhysX, no one would care one bit about them!*



erocker said:


> You are quite right. Many people feel burned/scorned by DX10. Here's to hoping DX11 doesn't turn into DX10.



D3D11 works on DX10 cards, just like how Nvidia cards can run HAWX but with the 10.1 features greyed out. If your card supports Stream/CUDA, then you'll end up getting the other features of DX11 working (compute shaders), letting you use 'DX11 physics' and such - and since it's coming to Vista as well, they're getting a large base of users with DX11 compatibility compared to when DX10 launched.



Scrizz said:


> if DX11 gives more fps I'm all for it


Unlikely. DX11's main improvements lie elsewhere than raw speed (and no DX upgrade has ever given more FPS on its own).



jaydeejohn said:


> Thing is, nVidia doesn't have a tessellator in their shrink, nor can anyone at this point know what other die-size costs DX11 will bring, besides the tessellator


see below



Benetanegia said:


> AFAIK in DX11 tessellation is part of the Shader Model, so both brands should have tessellation inside the shaders, no need for a separate tessellator. I mean, I suppose that tessellation on the shaders is a requirement.
> 
> In fact, DX11 tessellation is supposedly very different from the tessellation that ATI was doing on their own, outside of the DX API, so there's no advantage there, except maybe some more experience with it. And even then, tessellation is not even remotely new; anyone with a degree in something graphics-related knows the how-to very well, so I don't think there's going to be too much there. In the end it's all maths, a lot like calculating the average of the positions of two adjacent vertices.



I heard the same things. ATI's tessellation isn't what's been used for DX11, so it's an even slate there. The 5870 should have one that meets the standards.



HellasVagabond said:


> 1) There are around 20 titles that I am aware of that have PhysX support, and I am sure there are more which I don't know of.
> 2) In 2010 we will have many, many titles supporting PhysX.
> 3) You don't pay extra, so PhysX is good.
> 4) In some games the difference is extreme.



Name one. As I said a few posts above, I've tested them - my housemates have Nvidia cards, and I've done side-by-side comparisons. I'm yet to see ANYTHING even remotely approaching gameplay-affecting since the map pack for Unreal Tournament 3 - which was a showcase when it first came out.

Game devs ARE NOT making it a useful feature, so that they don't alienate users of laptops, old Nvidia cards, or ATI users.



Wile E said:


> Because if you already have an nVidia card, which most of the discrete gfx card owners of the world do, it is free. If you get an ATI card, you still had to buy it, but you don't get physx with it.



Even if Nvidia has the largest market share, there is a minimum requirement for PhysX that most Nvidia cards don't meet. Many entry-level, onboard, and laptop GPUs don't have the power for it - and that's where most of Nvidia's cards are.


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> I don't know what to think about that. Rumors say GT300 will be 500 mm2 or bigger so that's far more than double compared to what a GT200 would be at 40nm. Also RV870 is 330 mm2 and has 2.1 billion transistors. By simple math GT300 would have around 3.2 billion transistors, more than double that of GT200. Also Ati doubled up everything, Nvidia doesn't need to do that, in theory. 32 ROPs is more than enough, Nvidia already had them. Already had 512 bit memory bus too, and if you look at the die shots of GT200, half the chip was dedicated to ROP/memory so, they *could* have 3x the shader/texturing power into a chip that is twice the size and even more. What they do in the end, that's another story.



I still don't think that will translate into "3x the shader/texturing power" compared to GT200, although I don't write it off completely. For MIMD to prove effective, you'll need app-specific optimizations that will barely work with any other GPU. With AMD's rising market share in the GPU industry, I don't think developers will venture into that.

And oh, PhysX is a dead technology. With DirectCompute-driven physics acceleration available industry-wide, it would be foolhardy for developers of the DirectX 11 generation of games to opt for PhysX. Have fun playing those 20-odd present-gen games.


----------



## ste2425 (Sep 15, 2009)

Just looking at the first picture from that Chinese site, I saw Prey, a game I currently have that my rig can max out. I also noticed that from 8x AF to 16x AF there's almost a doubling in performance?


----------



## Valdez (Sep 15, 2009)

Some hi-res pictures about the card:
http://hardwarebg.com/forum/showpost.php?p=2069589&postcount=1220


----------



## Hayder_Master (Sep 15, 2009)

This one beats the 4870 X2 in some tests - they've killed the old beast.


----------



## Meizuman (Sep 15, 2009)




----------



## pantherx12 (Sep 15, 2009)

Just adding to the PhysX discussion: I've always found the CPU processes physics fine.

I've seen very realistic physics in games that don't need Nvidia cards.

And with processors now being multi-core, it taking up processing power is irrelevant, unless you're decoding video whilst you play games.


----------



## HTC (Sep 15, 2009)

ste2425 said:


> Just looking at the first picture from that Chinese site, I saw Prey, a game I currently have that my rig can max out. *I also noticed that from 8x AF to 16x AF there's almost a doubling in performance?*



You seem to be misreading the chart, dude: those percentages are *relative to the GTX285's performance* (GTX285 = 100%).

- *Suppose the GTX285 gives 60 FPS @ 8x AF*: then, with the 5870 at around 150%, the 5870 would give around 90 FPS @ 8x AF.

- Now, *suppose the GTX285 gives 40 FPS @ 16x AF*: then, with the 5870 at almost 220%, the 5870 would give around 88 FPS @ 16x AF.
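The arithmetic above is just "baseline FPS times the chart percentage". A few lines of Python make that concrete; the baseline FPS figures are hypothetical examples, exactly as in the post, not measured numbers.

```python
# Convert chart percentages (GTX285 = 100%) into projected FPS.
# The baseline FPS values below are hypothetical, as in the post.

def projected_fps(baseline_fps, relative_percent):
    """FPS of a card shown at `relative_percent` of the GTX285 baseline."""
    return baseline_fps * relative_percent / 100.0

print(projected_fps(60, 150))  # 8x AF:  60 FPS baseline -> 90.0
print(projected_fps(40, 220))  # 16x AF: 40 FPS baseline -> 88.0
```

So a bigger percentage gain at 16x AF can still mean fewer absolute FPS than at 8x AF, because the baseline dropped.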


----------



## newconroer (Sep 15, 2009)

HellasVagabond said:


> Why am I not surprised that both benchmarks were done with AMD-friendly games/applications?
> Had 3DMark not disabled PhysX for NVIDIA cards this would be funny....
> 
> In any case, wait for real reviews before judging any product, by NVIDIA or AMD.



Funny you should mention that, 'cause if you notice, none of the graphs show games like Age of Conan, which was a major surprise for probably even AMD, especially given its Nvidia support.

But none of these graphs tell us anything.
I laughed when I saw the HAWX one - like, who cares? The game runs at 60 FPS anyway; what is the point of telling us that the 5870 runs it at twenty more FPS than the X2 when they're both near or over 100?


It's all just a waste of time.

Neither these cards nor Nvidia's offerings are going to spank anything. They'll fall down in the exact same places that current cards do. And they'll continue that way for quite some time.


----------



## Mussels (Sep 15, 2009)

newconroer said:


> Funny you should mention that, 'cause if you notice, none of the graphs show games like Age of Conan, which was a major surprise for probably even AMD, especially given its Nvidia support.
> 
> But none of these graphs tell us anything.
> I laughed when I saw the HAWX one - like, who cares? The game runs at 60 FPS anyway; what is the point of telling us that the 5870 runs it at twenty more FPS than the X2 when they're both near or over 100?
> ...



The reason for HAWX is that it's really the only DX10.1 game on the list.


----------



## porculete (Sep 15, 2009)

Did anyone know that the actual TDP of this card is supposedly 376 W?! That would mean you need a fridge inside your desk to keep it cool.


----------



## leonard_222003 (Sep 15, 2009)

Why is everyone hung up on what Nvidia will release? It seems Nvidia did brainwash some people around here and built a huge fanboy base for themselves.
First of all, the power these new GPUs have is worthless when we don't have games that can really use them. Yes, we can raise the resolution and probably use an absurd 8x AA, but that isn't great graphics, just more polish; some cool new effects and more detailed games would be great for all that power.
Second, even if you plan to buy a new graphics card, why wait for what Nvidia will bring? It seems they're at the stage of rumors and demos - no pictures of an actual card, no nothing.
Do you people think news like this is released by some idiot who stole info from them? Of course not; it's released by the company itself to build up hype. If Nvidia hasn't released something by now, it's clear they have NOTHING. It could be some months before we see something from Nvidia, and I bet you it will be expensive and not as good as everyone expected. As I read somewhere, they want to battle AMD/ATI on production costs and finally make some integrators happy (so companies like XFX won't start to make evil red cards).
So if you want a beast, go ahead and buy one of these 40nm babies, but my advice is to keep your money until some game needs a powerful GPU - and, for the Nvidia fans, to see what the green camp releases. But I don't expect a miracle; they're battling AMD now, not the puny ATI that was tiny compared to Nvidia.
Nvidia's reign is coming to an end faster than some expected. Look at the prices AMD has for the new generation; if these were from Nvidia they would have been untouchable for most people. Look how much faster they released the DX11 generation - you can imagine these guys want Nvidia dead fast.
Also, the 40nm process was largely AMD's merit, from the rumors around the web; Nvidia is kind of in "get in line and wait your turn" mode. You didn't contribute to making 40nm a reality as much as AMD, so don't expect to be treated as an equal. I really don't see Nvidia as innovative these days: renaming products over and over again, being late with DX11 parts, and acting like a child by planning press demos of the future generation on the same day AMD releases their line - and who knows what other crap they spread that I don't know about.
Conclusion: AMD should ask Nvidia, let's see what you have now, motherf..... Demos? Then get the fuc.. out until you have some working parts, losers.


----------



## InnocentCriminal (Sep 15, 2009)

^^

Nice punctuation Leonard.


----------



## Valdez (Sep 15, 2009)

porculete said:


> Did anyone know that the actual TDP of this card is supposedly 376 W?! That would mean you need a fridge inside your desk to keep it cool.



That's not true. The TDPs are 188 W for the 5870 and 170 W for the 5850.


----------



## Imsochobo (Sep 15, 2009)

leonard_222003 said:


> Why is everyone hung up on what Nvidia will release? It seems Nvidia did brainwash some people around here and built a huge fanboy base for themselves.
> First of all, the power these new GPUs have is worthless when we don't have games that can really use them. Yes, we can raise the resolution and probably use an absurd 8x AA, but that isn't great graphics, just more polish; some cool new effects and more detailed games would be great for all that power.
> Second, even if you plan to buy a new graphics card, why wait for what Nvidia will bring? It seems they're at the stage of rumors and demos - no pictures of an actual card, no nothing.
> Do you people think news like this is released by some idiot who stole info from them? Of course not; it's released by the company itself to build up hype. If Nvidia hasn't released something by now, it's clear they have NOTHING. It could be some months before we see something from Nvidia, and I bet you it will be expensive and not as good as everyone expected. As I read somewhere, they want to battle AMD/ATI on production costs and finally make some integrators happy (so companies like XFX won't start to make evil red cards).
> ...



No flaming please.

But you've got a point.
PhysX: not Nvidia's work.
SLI: not Nvidia's work.
So what have they managed to create?

CUDA.
That's the sole product from Nvidia since around 2001 - the pure hardware T&L (or something like it) that came while 3DMark 2001 was the thing!
SLI was bought from 3dfx and really hasn't been improved a lot.

Credit to them for making themselves a fanbase, though. I wonder how many people came to like ATI after having a laptop with an ATI GPU; that alone gave them the experience that led to buying an ATI graphics card. Before that we had Nvidia.


----------



## Easy Rhino (Sep 15, 2009)

leonard_222003 said:


> Why is everyone hung up on what Nvidia will release? It seems Nvidia did brainwash some people around here and built a huge fanboy base for themselves.
> First of all, the power these new GPUs have is worthless when we don't have games that can really use them. Yes, we can raise the resolution and probably use an absurd 8x AA, but that isn't great graphics, just more polish; some cool new effects and more detailed games would be great for all that power.
> Second, even if you plan to buy a new graphics card, why wait for what Nvidia will bring? It seems they're at the stage of rumors and demos - no pictures of an actual card, no nothing.
> Do you people think news like this is released by some idiot who stole info from them? Of course not; it's released by the company itself to build up hype. If Nvidia hasn't released something by now, it's clear they have NOTHING. It could be some months before we see something from Nvidia, and I bet you it will be expensive and not as good as everyone expected. As I read somewhere, they want to battle AMD/ATI on production costs and finally make some integrators happy (so companies like XFX won't start to make evil red cards).
> ...



I'm waiting for Nvidia because I have the 790i chipset.


----------



## ToTTenTranz (Sep 15, 2009)

leonard_222003 said:


> Why is everyone hung up on what Nvidia will release? It seems Nvidia did brainwash some people around here and built a huge fanboy base for themselves.
> First of all, the power these new GPUs have is worthless when we don't have games that can really use them. Yes, we can raise the resolution and probably use an absurd 8x AA, but that isn't great graphics, just more polish; some cool new effects and more detailed games would be great for all that power.
> Second, even if you plan to buy a new graphics card, why wait for what Nvidia will bring? It seems they're at the stage of rumors and demos - no pictures of an actual card, no nothing.
> Do you people think news like this is released by some idiot who stole info from them? Of course not; it's released by the company itself to build up hype. If Nvidia hasn't released something by now, it's clear they have NOTHING. It could be some months before we see something from Nvidia, and I bet you it will be expensive and not as good as everyone expected. As I read somewhere, they want to battle AMD/ATI on production costs and finally make some integrators happy (so companies like XFX won't start to make evil red cards).
> ...





This post made my brain hurt... but it's only logical to wait for the competition to make their move, just in case their offer is better.


The fact that we don't know anything about GT300 could be because it's awfully late, or because nVidia managed to keep a tight secret about it so far.
Let's not forget that nVidia already cancelled most of the GT210 family, probably due to production issues. This could mean that they scratched the GT210s in order to accelerate the GT300 release, allowing for a 2009 launch.


----------



## ste2425 (Sep 15, 2009)

HTC said:


> You seem to be miss-reading the chart, dude: those percentages are *relative to the GTX285 performance*.
> 
> - *Suppose the GTX285 gives 60 FPS @ 8AF*: then, with the 5870 being around 150% better, that would mean the 5870 gives around 90 FPS @ 8AF.
> 
> - Now, *suppose the GTX285 gives 40 FPS @ 16AF*: then, with the 5870 being almost 220% better, that would mean the 5870 gives around 87 FPS @ 16AF.



aa my bad lol cheers


----------



## Tatty_One (Sep 15, 2009)

ToTTenTranz said:


> This post made my brain hurt... but it's only logical to wait for the competition to make their move, just in case their offer is better.
> 
> 
> The fact that we don't know anything about GT300 could be because it's awfully late, or because nVidia managed to keep a tight secret about it so far.
> Let's not forget that nVidia already cancelled most of the GT210 family, probably due to production issues. This could mean that they scratched the GT210s in order to accelerate the GT300 release, allowing for a 2009 launch.



We do know some things: its indicated release date is Q1 2010, and it has been that for some time; speculated specs were also released months ago. I find it strange that some members (not you) believe they (NVidia) have nothing because we have not seen any pics or "leaked" info yet. The HD58XX series is due out imminently, but we have only fairly recently seen "real" pics of the cards, so why would NVidia want to commit themselves several months before their card is due? People are talking as though ATI leaked "real" material several months ago... they didn't; speculation and rumour, yes. If anyone wants to Google GT300 they will see 224,000 links; try RV870 and you get 430,000, for a card that is of course on the verge of being released. So, nothing going on about GT300?

By preference I prefer ATI cards, though I regularly own cards from both sides. I really hate short memories. Let's all shout hurray to ATI for bringing us the first DX11 card - they deserve much applause for that. Now, who brought us the first DX10 card?


----------



## leonard_222003 (Sep 15, 2009)

ToTTenTranz said:


> The fact that we don't know anything about GT300, could be because it's awfully late or because *nVidia managed to keep a tight secret* about it so far.



At this stage, when the AMD/ATI card is final, there is little reason to keep the supposed super-beast Nvidia should release a secret. On the contrary, it's a good strategy to release some numbers, some performance graphs, some pictures of the beast; who would buy the puny AMD/ATI card if they knew the Nvidia card would be 2 times faster and come with who knows what new features/effects?
Do you think AMD/ATI could do something now to counter a super GPU from Nvidia? Or a month ago? Not really; they would have to start from scratch building a better GPU, which would mean waiting more time before a launch.
That's why people are concerned about Nvidia: because they have nothing at the moment but some demos they want to show at some point. Really pathetic.
It's always best to buy the card that offers the best price/performance ratio, no matter if it's Nvidia or AMD/ATI. If you buy based on color, you are ...............


----------



## erocker (Sep 15, 2009)

When Nvidia decided to have a "press conference" on Sept. 10th alongside the 58xx unveiling, I was thinking they were going to show a glimpse of their upcoming DX11 cards. Didn't happen. The only thing about new cards that has come from Nvidia has been their renaming strategy, once again, and DX10.1 parts. If Nvidia had anything to show of their "super" DX11 chip, they would have shown it to take away some thunder from ATI. There's just not a lot to be optimistic about with Nvidia, with the exception of what has happened historically with G80, G92, GT200, etc. I hope Nvidia comes out with a great DX11 card; we need competition.


----------



## ToTTenTranz (Sep 15, 2009)

erocker said:


> When Nvidia decided to have a "press conference" on Sept. 10th alongside the 58xx unveiling, I was thinking they were going to show a glimpse of their upcoming DX11 cards. Didn't happen. The only thing about new cards that has come from Nvidia has been their renaming strategy, once again, and DX10.1 parts. If Nvidia had anything to show of their "super" DX11 chip, they would have shown it to take away some thunder from ATI. There's just not a lot to be optimistic about with Nvidia, with the exception of what has happened historically with G80, G92, GT200, etc. I hope Nvidia comes out with a great DX11 card; we need competition.



nVidia doesn't need to spoil ATI's paper launch (that's what the 14th of September was all about).
nVidia needs to spoil ATI's hard launch, and that's only one week from now.


----------



## erocker (Sep 15, 2009)

ToTTenTranz said:


> nVidia doesn't need to spoil ATI's paper launch (that's what the 14th of September was all about).
> nVidia needs to spoil ATI's hard launch, and that's only one week from now.



I still don't expect anything from Nvidia regarding their DX11 cards. We'll see. For Nvidia's sake I hope they have their DX11 cards ready for launch before Christmas.


----------



## Benetanegia (Sep 15, 2009)

Maybe my memory is short or defective, but there was not a lot of info about RV770 before it was launched. Not true info, at least. I remember some online stores posting the specs one week before launch saying 640 SPs instead of 800, and we knew very little about anything else. They wanted it to be a surprise, and it was.

So I don't know why people are so surprised about the absence of info about GT300 (not even its real name, BTW). You can say a lot about Nvidia, but you can't say they don't learn from their mistakes. With GT200 they made the mistake of releasing first without having a clue about the competition. IMO they are preparing a revenge using the same strategy. And the thing is, they don't need to spoil AMD's press conference; they need to spoil their launch, which is going to be on the 23rd but with availability probably in November. That's when they need to release some info, not yet. And the reason I have faith in GT300 is that after every flop, Nvidia has released a hell of a good card (Ti4200, 6800, 8800). GT200 has been considered a disaster, right? Just my opinion.



btarunr said:


> I still don't think that will translate into "3x the shader/texturing power" compared to GT200, although I don't write it off completely. *For MIMD to prove effective, you'll need app-specific optimizations that will barely work with any other GPU.* With AMD's rising market share in the GPU industry, I don't think developers will venture into that.
> 
> And oh, PhysX is a dead technology. With DirectCompute-driven physics acceleration available industry-wide, it would be foolhardy for developers of the DirectX 11 generation of games to opt for PhysX. Have fun playing those 20-odd present-gen games.



Not necessarily, IMHO. Availability. That's the key word - availability of the shaders, I mean. Yes, we are told that graphics rendering is a very parallel thing, but it's not as parallel as they make it sound. I'm not an expert, but by logic it can't be. Imagine this example:

A light saber in vertical position in the scene you have to render. It's 12 pixels wide*, translucent and shiny, so a lot of shaders are going to be applied to it - more than are applied to the scene surrounding it. Now take GT200 with its 24-SP-wide SIMD clusters. Only 12 would be used any time a shader must be calculated for the light saber. Even if tiled rendering were used (i.e. 16x16-pixel tiles) instead of scanline, you would never be able to use them all, because not all the pixels in the tile are going to have the same shader applied to them. With MIMD the drivers can take care of that and use them all. GT300 will certainly take advantage of this, but I can't say how much.

Also, if current rendering is so parallel and straightforward as to render MIMD useless, why is it that driver updates bring such big improvements (sometimes greater than 25%) and multi-GPU profiles are so much needed?

* It's an example; for my example it doesn't matter if it is 36 (24+12) pixels wide, 26 (24+2) or 56 (24+24+8), you get the idea.
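The utilization argument above can be put in numbers with a toy model (only an illustration of the post's own example - a 12-pixel saber on a 24-lane cluster - not real GPU scheduling):

```python
# Toy model of the SIMD-utilization argument: if a shader applies to only
# `pixels` pixels, a `width`-lane SIMD cluster still burns ceil(pixels/width)
# full passes, leaving the remaining lanes idle. Illustration only, not a
# model of real GPU scheduling.
import math

def simd_utilization(pixels, width):
    """Fraction of SIMD lanes doing useful work for this shader."""
    passes = math.ceil(pixels / width)
    return pixels / (passes * width)

# The 12-pixel-wide light saber on a 24-lane cluster: half the lanes idle.
print(simd_utilization(12, 24))  # 0.5
# A 24-pixel-wide object fills the cluster exactly.
print(simd_utilization(24, 24))  # 1.0
```

The claimed MIMD advantage amounts to packing work for different shaders into those idle lanes, pushing the effective utilization back toward 1.0.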



jaydeejohn said:


> So, you're saying nVidia is going to have no fixed-function tessellator at all, and it'll all be done through shaders?
> I know the interpolation is being done inside the ATI shader cores, but there's still a fixed-function unit AFAIK, so nVidia may just forego all fixed function?
> Like using the fixed-function unit for a particular tessellation kernel, even though interpolation is still done inside the shader cores.



I'm saying all the work must be done in the shaders in order to comply with DX11's Shader Model (SM 5.0?).

*@leonard_222003*

So, in the same post you say it's stupid to rush out and buy a card whose performance we don't need (debatable, although I could kind of agree), and then say it's stupid to wait for Nvidia's competing products, calling anyone who does that instead of buying an HD5xxx card a brainless fanboy. Who's showing his colors here?


----------



## ToTTenTranz (Sep 15, 2009)

erocker said:


> I still don't expect anything from Nvidia regarding their DX11 cards. (...)



To be honest, neither do I.

But I do know it would be logical for nVidia to spoil the HD5800's hard launch rather than the soft launch.

If they don't spoil either one with some kind of rumours or backed "unreleased" benchmarks, nVidia will be handing the most profitable quarter (Q4) to ATI on a plate.

And they need gaming performance numbers. PhysX and CUDA won't do the trick anymore (let's just hope they have this in mind).


----------



## Easy Rhino (Sep 15, 2009)

Nvidia won't need to put out DX11 cards until there is a reason to buy them. AMD will blow their load early like they always do and come out with the best cards that nobody can take advantage of, so nobody will buy. And then Nvidia will release their cards when there is an actual reason to own DX11 cards and will provide a greater variety to the consumer. It always happens this way. BTW, we enthusiasts make up a very small percentage of GPU consumers; our buying habits do not reflect those of the majority.


----------



## btarunr (Sep 15, 2009)

Performance leadership, more than DirectX 11 support, is what is going to sell these cards for AMD. So it's not that people won't buy these just because they don't need DX11 cards yet. By that logic nobody needed a GeForce 8800 GTX / 8800 GTS when it came out, but people bought it because it ran every current game / benchmark faster, and provided some future-proofing.


----------



## erocker (Sep 15, 2009)

Easy Rhino said:


> Nvidia won't need to put out DX11 cards until there is a reason to buy them. AMD will blow their load early like they always do and come out with the best cards that nobody can take advantage of, so nobody will buy. And then Nvidia will release their cards when there is an actual reason to own DX11 cards and will provide a greater variety to the consumer. It always happens this way. BTW, we enthusiasts make up a very small percentage of GPU consumers; our buying habits do not reflect those of the majority.



Thing is, this AMD release looks a lot to me like the G80 (8800 series) release. I don't care much about DX11; it's the single-GPU performance that gets me. G80 was a DX10 card that was released before there were DX10 games. It made one hell of a DX9 card.

*I pretty much just said what bta said...


----------



## ToTTenTranz (Sep 15, 2009)

Easy Rhino said:


> Nvidia won't need to put out DX11 cards until there is a reason to buy them. AMD will blow their load early like they always do and come out with the best cards that nobody can take advantage of, so nobody will buy. And then Nvidia will release their cards when there is an actual reason to own DX11 cards and will provide a greater variety to the consumer. It always happens this way. BTW, we enthusiasts make up a very small percentage of GPU consumers; our buying habits do not reflect those of the majority.



I disagree.

We've had 3 generations of DX10 cards. Many people are now holding on to their HD2900/3800s and GeForce 8800s until the first DX11 cards are here.

Furthermore, don't underestimate the importance of enthusiasts in this market. There are many millions of avid PC gamers around the world now, and that means many millions of potential high-end card buyers.


----------



## Benetanegia (Sep 15, 2009)

The 8800 didn't sell much at all in the first 3 months. And the situation was very different anyway. First of all, prices of released cards weren't as good as they are today: 79xx and X19xx cards were selling for $300+, and the 8800 GTS gave twice the performance for $400. Also, 1920x1200 LCD panels were starting to become "affordable" around that time, and 4x AA became mandatory at that point too. Then the 320 MB version came in, and that's when the thunder really began, at least around where I live.

Anyway, the situation is very different today: the new generation's top card comes at $400 and is twice as fast as a $150 card, there are no higher-resolution monitors around the corner, and 8x/16x AA isn't as sweet a jump as 4x was. Quite a different story.


----------



## Easy Rhino (Sep 15, 2009)

btarunr said:


> Performance leadership, more than DirectX 11 support, is what is going to sell these cards for AMD. So it's not that people won't buy these just because they don't need DX11 cards yet. By that logic nobody needed a GeForce 8800 GTX / 8800 GTS when it came out, but people bought it because it ran every current game / benchmark faster, and provided some future-proofing.



I know, but the vast majority of consumers will not spend more than $200 on a GPU. They are casual PC gamers. They were not buying the 8800GTX; they were holding onto their 7000-series cards until a mid-range 8000-series card came out.


----------



## Easy Rhino (Sep 15, 2009)

ToTTenTranz said:


> I disagree.
> 
> We've had 3 generations of DX10 cards. Many people are now holding to their HD2900/3800s and Geforce 8800s until the first DX11 cards are here.
> 
> Furthermore, don't underestimate the importance of the enthusiasts in this market. There are many millions of avid PC gamers around the world now, and that means many millions of potential high-end card buyers.



If there were millions of people buying high-end GPUs, both NVIDIA and AMD would be rolling in the money. Truth is that most people are buying mid-range cards, ~$200 or less. We enthusiasts are an exception!


----------



## btarunr (Sep 15, 2009)

Easy Rhino said:


> I know, but the vast majority of consumers will not spend more than $200 on a GPU. They are casual PC gamers. They were not buying the 8800GTX; they were holding onto their 7000-series cards until a mid-range 8000-series card came out.



The HD 5870 isn't for the "vast majority" anyway, just as the 8800 GTX wasn't, as with every >$200 card.


----------



## Easy Rhino (Sep 15, 2009)

btarunr said:


> HD 5870 isn't for the "vast majority" anyway, just as 8800 GTX wasn't, as with every >$200 card.



Exactly, which means the 5870 won't sell all that well. Nvidia knows this and will wait until people have a reason to buy DX11 cards. DX11 is a marketing tool to sell these new cards.


----------



## Benetanegia (Sep 15, 2009)

Easy Rhino said:


> If there were millions of people buying high-end GPUs, both NVIDIA and AMD would be rolling in the money. Truth is that most people are buying mid-range cards, ~$200 or less. We enthusiasts are an exception!



I don't know if we can say the HD4890/GTX275 is mid-range though...  (with rebates both are around $200).

Let's be honest, not even a $150 HD4870/GTX260 is midrange. The price doesn't tell you a card's segment; what matters is how it stacks up against the other cards in the lineup. The HD4890 is the second fastest Ati card!!


----------



## btarunr (Sep 15, 2009)

Easy Rhino said:


> exactly, which means the 5870 wont sell all that well. nvidia knows this and will wait until people have a reason to buy dx11 cards. dx11 is a marketing tool to sell these new cards.



It will sell, in the segments it is made for. "Selling well" is relative to the segment. That minority of people who spend $299, $399, or $449 on graphics cards will opt for AMD instead of NVIDIA at the same price points.


----------



## Easy Rhino (Sep 15, 2009)

btarunr said:


> It will sell, in the segments it is made for. "Selling well" is relative to the segment. That minority of people who spend $299, $399, or $449 on graphics cards will opt for AMD instead of NVIDIA at the same price points.



im talking overall strategy. you dont market your products to enthusiasts who are a small percentage of your consumer base. amd knows this. this doesnt mean that nvidia is behind in its pursuit of dx11, it just means nvidia is waiting to release when it knows its own cards will sell. nvidia is worth twice AMD and AMD makes CPUs as well. doesn't that tell you something about their volume of sales?


----------



## dir_d (Sep 15, 2009)

ToTTenTranz said:


> I disagree.
> 
> We've had 3 generations of DX10 cards. Many people are now holding to their HD2900/3800s and Geforce 8800s until the first DX11 cards are here.
> 
> Furthermore, don't underestimate the importance of the enthusiasts in this market. There are many millions of avid PC gamers around the world now, and that means many millions of potential high-end card buyers.



I am one of those people and will be buying a 5870 and have it next day air to my front door. Held on to my 8800 GTS for 3 years...Time to put it to rest. When the DX12 Cards come out ill probably buy one of those.


----------



## erocker (Sep 15, 2009)

Easy Rhino said:


> im talking overall strategy. you dont market your products to enthusiasts who are a small percentage of your consumer base. amd knows this. this doesnt mean that nvidia is behind in its pursuit of dx11, it just means nvidia is waiting to release when it knows its own cards will sell. nvidia is worth twice AMD and AMD makes CPUs as well. doesn't that tell you something about their volume of sales?



Nvidia has been doing a great job, especially since G80. Unfortunately they've been riding G80 this whole time. Hence, it's taking them longer to come up with a new product. There is no debate that Nvidia has been selling a lot of cards. Times seem to be changing though, time will tell.


----------



## btarunr (Sep 15, 2009)

Easy Rhino said:


> im talking overall strategy. you dont market your products to enthusiasts who are a small percentage of your consumer base. amd knows this. this doesnt mean that nvidia is behind in its pursuit of dx11, it just means nvidia is waiting to release when it knows its own cards will sell. nvidia is worth twice AMD and AMD makes CPUs as well. doesn't that tell you something about their volume of sales?



You said AMD (these cards) won't sell (because nobody needs DX11 now). I said AMD (these cards) will sell because of the performance incentive with current applications, and the technology incentive with future applications. You're trying to force in a "it won't cater to the vast-majority" side-argument, which I respond to by saying that the same "vast-majority" doesn't buy GeForce GTX 285 or GTX 295 either. So AMD is making a performance/enthusiast product line targeting the $200~$500 market; leadership over NVIDIA in those segments, which will translate into sales (in those segments again), is itself mission accomplished for AMD.

As for other DX11 cards such as Juniper, they'll fit inside the $200 market and will attempt to play the usual performance/$ leadership game.


----------



## Benetanegia (Sep 15, 2009)

You both (sides) are focusing on different aspects of what "selling well" means. 

- bta is talking about sales when the cards are launched. In comparison to previous launches HD5xxx will do well.

- easy rhino is talking about sales throughout the card's life. The amount of money that the card will bring at launch is small in comparison to what it will bring later. By the time Nvidia releases their card, people will be aware of the perf jump that the new gen brings, AND Nvidia cards will not be any worse or pricier if they want to compete. He is saying that as long as Nvidia cards are competitive at launch, they will sell well, because the market is now prepared to buy that kind of hardware. If the HD2900 had been more competitive it would have sold much better at launch than the 8800 did at its launch.


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> easy rhino is talking about sales throughout the card's life. The amount of money that the card will bring at launch is small in comparison to what it will bring later.



Uh no...



Easy Rhino said:


> nvidia wont need to put out dx11 cards until there is a reason to buy them. amd will blow their load early like they always do and come out with the best cards that nobody can take advantage of so nobody will buy.



Both of us (Rhino and me) know what the other is talking about.


----------



## Benetanegia (Sep 15, 2009)

btarunr said:


> Uh no...
> 
> 
> 
> Both of us know what the other is talking about.



No new generation of cards sells a lot in their first quarter, not nearly as much as they do in the next ones. That's history, no need to debate it. During the first quarter a lot of people will still be deciding whether that kind of performance is needed. It's in the second quarter and later that most people will start buying them. That happened with the 8800 and I don't see why this time is going to be different.

BTW, sorry for trying to explain what I thought each of you was saying. I didn't want to put words in your (both) mouths, if that is the case at all.


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> No new generation of cards sells a lot in their first quarter, not nearly as much as they do in the next ones.



Right, and the sales (in the first quarter) will be on the grounds that here are the fastest cards you can buy for $x, $y, and $z. Just as GeForce GTX 275, GTX 285, and GTX 295 are holding those spots now. Nobody will stay away from this new generation "because they don't need it", but probably because "it's too expensive" (which is what probably limited the sales of the 8800 GTX/GTS in their first quarter, and became the history you're now citing). As and when AMD sees it's not able to sell the HD 5870 for $449 (the price currently doing the rounds), it will simply make it more affordable.


----------



## Benetanegia (Sep 15, 2009)

btarunr said:


> Right, and the sales (in the first quarter) will be on the grounds that here are the fastest cards you can buy for $x, $y, and $z. Just as GeForce GTX 275, GTX 285, and GTX 295 are holding those spots now. Nobody will stay away from this new generation "because they don't need it", but probably because "it's too expensive" (which is what probably limited the sales of the 8800 GTX/GTS in their first quarter, and became the history you're now citing).



I don't think the price was the problem at all. The perf/price improvement they were offering was much bigger than now.

IMO you are seeing the HD5870 as a high-end card, and it's not. It's the high performance card, but it belongs to the performance segment. The high-end card will be the X2, and it will be released soon after; that is the card someone looking at a GTX295 will wait for. And will the HD5870 take up the GTX285's market share? Of course, until Nvidia lowers its price to the sweet spot. Then people will decide if the GTX at its new price is better for them or not, and it probably will be, just like the HD4870/90 and GTX260/75 have always been better than the GTX285 for most.


----------



## Easy Rhino (Sep 15, 2009)

btarunr said:


> You said AMD (these cards) won't sell (because nobody needs DX11 now). I said AMD (these cards) will sell because of the performance incentive with current applications, and the technology incentive with future applications. You're trying to force in a "it won't cater to the vast-majority" side-argument, which I respond to by saying that the same "vast-majority" doesn't buy GeForce GTX 285 or GTX 295 either. So AMD is making a performance/enthusiast product line targeting the $200~$500 market; leadership over NVIDIA in those segments, which will translate into sales (in those segments again), is itself mission accomplished for AMD.
> 
> As for other DX11 cards such as Juniper, they'll fit inside the $200 market and will attempt to play the usual performance/$ leadership game.



if AMD's mission is to sell a limited number of cards to a small segment of its consumer base then no wonder they are getting their asses handed to them in market share. i dont see how beating nvidia to the punch with these cards will translate into sales. as i mentioned in my first post they have been doing it since they bought out ATI and it hasnt worked yet!


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> I don't think the price was the problem at all. The perf/price improvement they were offering was much bigger than now.



At $650 (8800 GTX) and $400, at a time when the Radeon X1950 XTXs and the GeForce 7800 GTXs were flying at >$400 price points, yes. But taking into account today's market, where you'll easily find a GTX 285 or a HD 4870 X2 at incredibly low price points, no.


----------



## pantherx12 (Sep 15, 2009)

Actually ATI have done really well in the mobile GPU area. I imagine they'll start doing better in desktop graphics this year


----------



## btarunr (Sep 15, 2009)

Easy Rhino said:


> if AMD's mission is to sell a limited number of cards to a small segment of its consumer base then no wonder they are getting their asses handed to them in market share.



Lamborghini isn't trying to sell a Murcielago to everyone. It sells fewer cars than, say, Fiat. 

AMD here is selling both "Lamborghinis" and "Fiats". Radeon HD 5800s are the former, Radeon HD 5700 and below are the latter.

And oh, their market share grew at the expense of NVIDIA's even when NVIDIA had the faster cards in Q2.


----------



## Benetanegia (Sep 15, 2009)

I think we are mixing things up. No one is saying these cards won't sell. At least I'm not. I'm just saying that they won't sell any better than current cards are selling right now at their respective price points. In fact they will do worse, because people who were considering the GTX285 at $300 or the HD4890/GTX275 at $200 will probably just go for the GTX285 at $200 and not for anything more expensive.

The success (or lack thereof) of these cards will not influence the sales of the GT300 either, again, as long as it is competitive. GT300 will sell according to the volume of "next gen" cards being sold at that moment. In the first quarter after the release of DX11 cards that volume will be small, let's say 100,000 cards. In the second quarter the volume will be higher, imagine 500,000. If Nvidia manages a 50% market share of DX11 cards, GT300 will sell 250,000, far more than HD5xxx at launch. Again, if it is competitive.
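That share math is easy to sanity-check. A minimal sketch, where the 100,000 / 500,000 unit volumes and the 50% share are purely illustrative guesses from the post above, not real sales data:

```python
# Hypothetical DX11 market-volume math. The 100,000 / 500,000 unit
# figures and the 50% share are illustrative guesses, not real data.
def vendor_units(total_dx11_units, market_share):
    """Units one vendor sells, given total DX11 volume and its share."""
    return int(total_dx11_units * market_share)

q1_volume = 100_000  # assumed DX11 volume in the launch quarter
q2_volume = 500_000  # assumed volume one quarter later

gt300_q2 = vendor_units(q2_volume, 0.50)
print(gt300_q2)  # 250000 -- more than the entire assumed Q1 market
```

Under those assumptions a later entrant with half the share still outsells the whole launch-quarter market.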


----------



## Easy Rhino (Sep 15, 2009)

btarunr said:


> Lamborghini isn't trying to sell a Murcielago to everyone. It sells fewer cars than, say, Fiat.
> 
> AMD here is selling both "Lamborghinis" and "Fiats". Radeon HD 5800s are the former, Radeon HD 5700 and below are the latter.
> 
> And oh, their market share grew at the expense of NVIDIA's even when NVIDIA had the faster cards in Q2.



true, but lamborghini have built their entire operation around that idea. AMD have not! now if AMD wants to just make GPUs for enthusiasts that is fine by me. But they better adjust their business model if that is the route they want to take.


----------



## btarunr (Sep 15, 2009)

Easy Rhino said:


> true, but lamborghini have built their entire operation around that idea. AMD have not! now if AMD wants to just make GPUs for enthusiasts that is fine by me. But they better adjust their business model if that is the route they want to take.



There's no hard and fast rule that a company has to work with a particular segment in mind. And it's not that AMD "wants to just make GPUs for enthusiasts", it has a mainstream GPU codenamed Juniper to do the wetwork.

Radeon HD 5800s will fetch AMD money from the segments they are made for, while Radeon HD 5700s will do so with the VastMajority™


----------



## W1zzard (Sep 15, 2009)

i have never in my life seen any new card that offers best perf/$ at launch. there's always an existing card on the market that offers better price/performance. it makes sense to rake in some premium on a brand new product - and people are willing to pay it to have the latest and greatest when the hype is big.


----------



## Deleted member 24505 (Sep 15, 2009)

Remember people, there are a hell of a lot of "must have" enthusiasts out there who have the money to blow and will just buy every new piece of hardware that comes out. We all know we are gonna buy a 58xx, if only to see how good it is for ourselves.


----------



## Benetanegia (Sep 15, 2009)

W1zzard said:


> i have never in my life seen any new card that offers best perf/$ at launch. there's always an existing card on the market that offers better price/performance. it makes sense to reap in some premium on a brand new product - and people are willing to pay for it to have the latest and greatest when the hype is big.



True 99% of the time, but in honor of the truth: the 8800GT/HD3xxx cards did exactly that at launch. Just before their prices started to be inflated because of the success and shortage. And before existing cards got their price adjustment, of course...


----------



## W1zzard (Sep 15, 2009)

right .. 8800 gt might be an exception even though i'm not 100% sure .. my reviews back then didnt have all the good info


----------



## Benetanegia (Sep 15, 2009)

W1zzard said:


> right .. 8800 gt might be an exception even though i'm not 100% sure .. my reviews back then didnt have all the good info



Some lucky ones got the 8800GT at $250 on day one, which was its MSRP, if I'm not mistaken. It was in the weeks after that the 8800GT reached $300. 

The HD3870 started at $230 too and ended up at $260+ pretty soon. And then down from there, both.

Why did they reach such a high price? Who wouldn't pay $300 for the 8800GT when that card performed like the $400-500 cards that were out at that moment?? Nobody cared what the MSRP was, because at $300+ it still offered better perf/price.


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> Why did they reach such a high price?



Ho Ho Ho!

Because Santa was willing to pay back then.

X-Mas shopping season caused supply shortages. 8800 GT had a "jelly-launch" (a new term I coin to describe a hard launch that has such limited quantity, it's almost soft launch), and HD 3870 became an alternative buy.


----------



## Benetanegia (Sep 15, 2009)

And now that we mentioned those cards, I think it's important to remember them, because they changed everything. We just can't compare the DX11 launch with the DX10 launch, because we got that generation of cards in between both releases, and it changed everything, dramatically. Before that generation anyone expecting some decent playability at high settings was willing to pay $300. Not anymore. We are entering the territory of diminishing returns and that will affect DX11 card sales. Until the most enthusiastic buy them and start showing them to the public, the public will not say "me wants".



btarunr said:


> Ho Ho Ho!
> 
> Because Santa was willing to pay back then.
> 
> X-Mas shopping season caused supply shortages. 8800 GT had a "jelly-launch" (a new term I coin to describe a hard launch that has such limited quantity, it's almost soft launch), and HD 3870 became an alternative buy.



That's what I said.

Also, realistically speaking, there was no shortage of those cards; they simply sold much, much more than anyone could have thought. Their respective revenues went up dramatically selling cards at half the price, and that tells you something.


----------



## Kantastic (Sep 15, 2009)

btarunr, do you know if the TPU review of the 5870 will show whether or not PCI-E X8 will bottleneck the card?


----------



## btarunr (Sep 15, 2009)

Kantastic said:


> btarunr, do you know if the TPU review of the 5870 will show whether or not PCI-E X8 will bottleneck the card?



For sure it will. We have it covered. There are more such goodies in our Cypress coverage.
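For context on the x8-vs-x16 question, the raw link bandwidths can be estimated with a quick sketch. This assumes PCIe 2.0 signaling (5 GT/s per lane) and its 8b/10b line code; whether a given card actually saturates the link is exactly what a review has to measure.

```python
# Rough one-way PCIe bandwidth. The generations in use in 2009
# (1.x at 2.5 GT/s, 2.0 at 5 GT/s per lane) use 8b/10b encoding,
# so usable bytes/s = transfer rate * lanes * 8/10 / 8 bits-per-byte.
def pcie_gb_per_s(gt_per_s, lanes):
    """Approximate usable one-way bandwidth in GB/s."""
    return gt_per_s * lanes * (8 / 10) / 8

x8_gen2 = pcie_gb_per_s(5.0, 8)    # 4.0 GB/s
x16_gen2 = pcie_gb_per_s(5.0, 16)  # 8.0 GB/s
```

So an x8 slot halves the headroom on paper; the review question is whether real workloads ever need more than that.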


----------



## pantherx12 (Sep 15, 2009)

I imagine not in the actual review, but I'm sure he could quickly test that out for you if you ask him all nice like.


Edit : I stand corrected!


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> Also, realistically speaking, there was no shortage of those cards; they simply sold much, much more than anyone could have thought.



There was. At one point leading retailers simply ran out of them (all brands' 8800 GT), so there definitely was a supply shortage. I remember HD 3870 reviews saying "you can buy this, now that the 8800 GT is hard to find/not available."


----------



## Benetanegia (Sep 15, 2009)

btarunr said:


> There was. At one point leading retailers simply ran out of it (all brands' 8800 GT) so there definitely was a supply shortage. I remember HD 3870 reviews telling "you can buy this, now that 8800 GT is hard to find/not available."



Yeah, but because the demand was higher than they expected, not because the supply was one bit lower than with any previous release.

http://www.xbitlabs.com/news/video/...rds_Continue_to_Drop_Jon_Peddie_Research.html

Look at the charts and the graphs. In Q4 2007 there was a big jump in sales and a more than normal decline in price, but overall revenue went up by a lot. The problem (for them) is that the average selling price never went back to normal while sales did come down to normal levels. The conclusion is that right now the discrete graphics card market is just half of what it used to be. We are simply not going to see another 8800 launch.

In the by-segments graph you can also see how the performance segment is shrinking in favor of the mainstream one, while the high-end market stays stable. I suppose the difference between your thinking and mine is that I take the HD5850/70 as the performance segment and the HD5870 X2 as the high end, while you think all three are high-end.


----------



## btarunr (Sep 15, 2009)

Benetanegia said:


> Yeah but because the demand was higher than what they thought, not because the supply was one bit lower than with any previous release.



Demand didn't cause that supply shortage, but I may be wrong.

For example, the TPU review back then said 



> Unfortunately the Sapphire HD 3870 is not faster than the GeForce 8800 GT which is sold at about the same price point. I would expect that the HD 3870 price drops soon, also the HD 3870 is in stock now and will be. NVIDIA's 8800 GT is sold out because of limited quantity available. So if you need a Christmas present and you need it now, the HD 3870 is an excellent choice.



So it seems like the supply shortage wasn't courtesy of extreme demand, given the output NVIDIA is capable of.



Benetanegia said:


> I suppose the difference between your thinking and mine is that I take the HD5850/70 as the performance segment and the HD5870 X2 as the high end, while you think all three are high-end.



I was squaring them off as >$200 / high-end to support my older arguments, those which the "vast majority" don't buy. I acknowledge them to be performance/high-end products.

That big jump in sales cannot be attributed to one product. That is an industry-wide figure, and doesn't pertain to a company or its products. And we don't know which segment the 8800 GT was classified under back in 2007, because it was a high-performing product at a >$200 price point.


----------



## Benetanegia (Sep 15, 2009)

btarunr said:


> Demand didn't cause that supply shortage, but I may be wrong.
> 
> For example, the TPU review back then said
> 
> ...



IMO the numbers speak for themselves. Also, information now is much more complete than it was back when W1zzard did that review. I suppose he has his sources, but I don't think he has access to sales data before the quarterly numbers are posted; I might be wrong, so let's just let him correct me if so. Until then I can only assume he said what everyone in that situation would have said/thought. I would have said the same back then, but after the numbers were posted, I have no choice but to change my opinion...


----------



## RoutedScripter (Sep 16, 2009)

Interesting

HD5870 1GB version for $350


----------



## pantherx12 (Sep 16, 2009)

If that price is right for the 5870, I'm going to get the 2GB version, just for kicks!


----------



## ToTTenTranz (Sep 16, 2009)

Let's not forget that nVidia used to do the dirty little trick of selling at the same price as the ATI counterpart in the USA, and a lot more expensive in the rest of the world. This way they could get the halo-effect in most web reviews and at the same time get better profit margins.


Here in Europe, the 8800GT was always +40€ ($60) more expensive than the HD3870, the GTX260 Core 216 was always +40€ more expensive than the HD4870, and the GTX275 was always +40€ more expensive than the HD4890.


----------



## Benetanegia (Sep 16, 2009)

ToTTenTranz said:


> Let's not forget that nVidia used to do the dirty little trick of selling at the same price as the ATI counterpart in the USA, and a lot more expensive in the rest of the world. -> This way they could get the halo-effect in most web reviews and at the same time get better profit margins.
> 
> 
> Here in Europe, the 8800GT was always +40€ ($60) more expensive than the HD3870, the GTX260 Core 216 was always +40€ more expensive than the HD4870, and the GTX275 was always +40€ more expensive than the HD4890.



That's not true. That doesn't happen everywhere.


----------



## ToTTenTranz (Sep 16, 2009)

Benetanegia said:


> That's not true. That doesn't happen everywhere.



Yes, I just said that the USA is the exception.


----------



## Wile E (Sep 16, 2009)

I think these will fly off the shelves. I bet they go out of stock almost immediately.


----------



## Chad Boga (Sep 16, 2009)

Wile E said:


> I think these will fly off the shelves. *I bet they go out of stock almost immediately*.


That might depend on whether there is decent quantity to begin with.

I guess it will depend on the pricing that ATI sets and that retailers choose, but the 5870 could end up being the best value mid to high end gaming card ever on initial release.


----------



## Hayder_Master (Sep 16, 2009)

i'm still confused how this card has this performance with only 256-bit. i know it is great with 32 ROPs and more texture units, but why only 256-bit? price, maybe


----------



## Mussels (Sep 16, 2009)

hayder.master said:


> i'm still confused how this card has this performance with only 256-bit. i know it is great with 32 ROPs and more texture units, but why only 256-bit? price, maybe



DDR5 doubles it. that's equal to 512-bit on DDR3
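The arithmetic behind that claim can be sketched out. This assumes GDDR5's four data transfers per reference clock against GDDR3's two, and the HD 5870's rumored 1200 MHz memory clock (not confirmed specs at this point):

```python
# Peak memory bandwidth = bus width in bytes * per-pin transfer rate.
# Assumption: GDDR5 moves 4 bits per pin per reference clock where
# GDDR3 moves 2, and the HD 5870's memory clock is the rumored 1200 MHz.
def mem_bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock):
    return bus_bits / 8 * clock_mhz * transfers_per_clock / 1000

hd5870 = mem_bandwidth_gb_s(256, 1200, 4)        # 153.6 GB/s
gddr3_512bit = mem_bandwidth_gb_s(512, 1200, 2)  # also 153.6 GB/s
```

Same bandwidth, half the bus width, which is why the narrower (and cheaper to route) 256-bit PCB makes sense.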


----------



## pantherx12 (Sep 16, 2009)

GDDR5 I would imagine .


----------



## Mussels (Sep 16, 2009)

pantherx12 said:


> GDDR5 I would imagine .



the whole DDR vs GDDR thing got confusing. it used to be different, now it seems they only call it GDDR because it's on a video card.


----------



## pantherx12 (Sep 16, 2009)

Oh that wasn't in reply to you, was in reply to the other guy.

XD


Although having said that, this bit's a reply to your post: the picture from wikipedia further up the page has DDR and GDDR in the bus type column, to further add to our confusion


----------



## Fatal (Sep 16, 2009)

I am glad I held off. I was going to get a 4890 or two, not any more, mauhahaha!! Have something to look forward to now. Will wait for the TPU review, can't wait!


----------



## Benetanegia (Sep 16, 2009)

Mussels said:


> the whole DDR vs GDDR thing got confusing. it used to be different, now it seems they only call it GDDR because its on a video card.



I think it still is quite different. And as far as I know, quad data rate isn't suitable at all for main memory. I doubt we will see it on main memory anytime soon.



ToTTenTranz said:


> Yes, I just said that the USA is the exception.



I meant everywhere within the EU and in Europe. I've been on a road trip and checked prices in retail stores in various countries, because I wanted to find a better deal than at home; I didn't. Prices simply aren't as you mentioned, anywhere. Nvidia cards do tend to be more expensive (they are a bit faster too), but just along the lines they are at Newegg in the US, just 10 euros or so more than Ati. There's definitely no 40 euro difference. I mean the HD4870 is selling for 140 euros and the GTX260 is around 145 if you take the average. Maybe you can find some special offers or rebates for Ati that make them 40 cheaper somewhere, sometimes, but it's definitely not the norm as you were implying in your post.

It is said that in some countries in the EU Nvidia does sell much more than Ati (mainly in southern countries), I mean much more than the 2-to-1 (66%-33%) they do in the US, so it might be that Ati was forced to lower prices there, but it doesn't happen always everywhere.

There's also the possibility that you are comparing full retail prices on GTX cards against light retail prices on HD cards. I've seen many light retail HD cards and none in the other camp. Ok, you can find lower prices that way, but you are NOT getting the same thing. You don't get the goodies, and although enthusiasts like us might not want those goodies and just want the card, essentially it's not the same.

One friend got one of these light packages from Club3D (ok I know, I know) and it didn't carry any 4pin-to-6pin connector, nor 6-to-8 or 4-to-8, although the card required one 6 pin and one 8 pin. He had to buy the connectors because his PSU, although 650w and of good quality and brand, was old and didn't have them. He ended up paying much more. I don't know if that is the norm in light retail packages, but what you certainly don't get is the games or software, usually licensed for 1 year, that full retail packages tend to include. Again it is debatable whether you want that game and software, but it's not the same product you are getting.

Basically, as an enthusiast who buys a card every 6 months, you could blame Nvidia's partners for not offering these light packages as widely as Ati does, but when talking about prices it is essential to compare apples to apples, IMO.


----------



## mdm-adph (Sep 16, 2009)

Chad Boga said:


> That might depend on whether there is decent quantity to begin with.
> 
> I guess it will depend on the pricing that ATI sets and that retailers choose, but the 5870 could end up being the best value mid to high end gaming card ever on initial release.



Regardless of the size of the initial quantity, it's soon to be gone, whatever it is.


----------



## Easy Rhino (Sep 16, 2009)

btarunr said:


> There's no hard and fast rule that a company has to work with a particular segment in mind. And it's not that AMD "wants to just make GPUs for enthusiasts", it has a mainstream GPU codenamed Juniper to do the wetwork.



well you cant take a massive company like AMD designed to manufacture hundreds of thousands of chips across the globe and expect to make a profit by scaling back production. 



> Radeon HD 5800s will fetch AMD money from the segments they are made for, while Radeon HD 5700s will do so with the VastMajority™



yea but not THAT much money. AMD probably makes 90% of their revenue from mid-range cards.


----------



## btarunr (Sep 16, 2009)

Easy Rhino said:


> well you cant take a massive company like AMD designed to manufacture hundreds of thousands of chips across the globe and expect to make a profit by scaling back production.



Where did it scale back production? It only released a new high-performance series, backed by a mainstream series too.



Easy Rhino said:


> yea but not THAT much money. AMD probably makes 90% of their revenue from mid-range cards.



Right, and Juniper is built for that. I'm sure the same 90:10 ratio applies to NVIDIA as well.


----------



## ToTTenTranz (Sep 16, 2009)

Benetanegia said:


> I meant everywhere within the EU and in Europe. I've been on a road trip and checked prices in retail stores in various countries, because I wanted to find a better deal than at home; I didn't. Prices simply aren't as you mentioned, anywhere. Nvidia cards do tend to be more expensive (they are a bit faster too), but just along the lines they are at Newegg in the US, just 10 euros or so more than Ati. There's definitely no 40 euro difference. I mean the HD4870 is selling for 140 euros and the GTX260 is around 145 if you take the average. Maybe you can find some special offers or rebates for Ati that make them 40 cheaper somewhere, sometimes, but it's definitely not the norm as you were implying in your post.



It seems to me that you're only comparing the HD4870 to the GTX260, but those are not at the same performance level. The GTX260 Core 216 was the card launched by nVidia to counter the HD4870.

Furthermore, what counts is not the price right now. It's the price when most reviews are written. When most HD4870 vs. GTX260 c216 reviews were being written, the GTX260 was always more expensive than the HD4870.

Yeah, maybe the lowest priced ATI cards don't bundle unimportant stuff, but who cares? The reviewers also took that into account.

And there are nVidia cards with weak bundles. I once bought a 8800GT that came only with the PCI-E power cable and a DVI->D-Sub adapter, from Club3D.


----------



## Benetanegia (Sep 16, 2009)

ToTTenTranz said:


> It seems to me that you're only comparing the HD4870 to the GTX260, but those are not in the same performance level. The GTX260 core 216 was the card launched by nVidia to counter the HD4870.
> 
> Furthermore, what counts is not the price right now. It's the price when most reviews are written. When most HD4870 vs. GTX260 c216 reviews were being written, the GTX260 was always more expensive than the HD4870.
> 
> ...



Since when is the price important when reviews are written? No my friend, price is important at the time of buying. Most reviews are written when the cards are launched; who in hell cares about the price of a GTX260/HD4870 from 8 months ago???

Also, I suppose there are weak bundles for Nvidia cards; I just stated what I saw in 10+ stores across 5 countries. There are no such light packages where I live.

And as a side note: where it matters, the GTX260 216 destroys the 512 MB HD4870, which is the only one that is much cheaper. It takes the 1 GB version to average more or less the same as the GTX260 you can get today.


----------



## ToTTenTranz (Sep 16, 2009)

Benetanegia said:


> Since when is the price important when reviews are written? No my friend, price is important at the time of buying.



No, price difference is more important when the cards are compared, at the time of the review.
When you buy the card now, what you read is the reviews from 8 months ago, because no reviewer cares about those cards right now.


nVidia even increased the price of the 8800GT, after most of the reviews were done, saying it was better than the HD3870 for the same price.


----------



## Benetanegia (Sep 16, 2009)

ToTTenTranz said:


> No, price difference is more important when the cards are compared, at the time of the review.
> When you buy the card now, what you read are the reviews from 8 months ago, because no reviewer cares about those cards right now.
> 
> 
> nVidia even increased the price of the 8800GT, after most of the reviews were done, saying it was better than the HD3870 for the same price.



Don't you see you make no sense? You read reviews to learn about the performance of the cards. You don't read reviews to know the price of the cards, God save you otherwise. You go to the store or e-tailer to know the price. I couldn't care less that the GTX260 cost $450 when it launched; I can walk 100 m and get its 216SP (better) version for 140 euros, and that's all that I need to know. Around here the GTX260 and HD4870 are selling for almost the same price, and it's that price on which I would base my purchasing decision, not the $450 price found in reviews, which is not true anymore.

What's more important is that prices can change from day to day (in fact, they do), so the price that a reviewer posts means absolutely nothing if a change occurred the day after he posted his review.


----------



## RoutedScripter (Sep 16, 2009)

The frontpage news comments are cut off after post #161; nothing more is visible after that. Plus, the button is nowhere to be found, so I have to navigate the whole forum to get to the topic.


----------



## TheMailMan78 (Sep 16, 2009)

RuskiSnajper said:


> The frontpage news comments are cut off after post #161; nothing more is visible after that. Plus, the button is nowhere to be found, so I have to navigate the whole forum to get to the topic.



It's because you're not American.......or you need to reload the page.


----------



## RoutedScripter (Sep 16, 2009)

TheMailMan78 said:


> It's because you're not American.......or you need to reload the page.



What the..?


----------



## W1zzard (Sep 16, 2009)

RuskiSnajper said:


> The frontpage news comments are cut off after post #161; nothing more is visible after that. Plus, the button is nowhere to be found, so I have to navigate the whole forum to get to the topic.



i don't understand, can you elaborate? screenshots? does a reload fix it?


----------



## TheMailMan78 (Sep 16, 2009)

RuskiSnajper said:


> What the..?



Hit reload man.


----------



## RoutedScripter (Sep 16, 2009)

W1zzard said:


> i don't understand, can you elaborate? screenshots? does a reload fix it?



LoL, why does everybody think I don't know how to reload the page? I got to it 2 times separately today and it just cuts off after post #161; it's been that way for about 2 days. I will screenshot it in a moment. I am using Firefox, I think the most up-to-date version.

Second, why would I hit reload if I just connected to it? And I did hit reload just in case, and still nothing.


----------



## btarunr (Sep 16, 2009)

Yes, Firefox has problems displaying "very long" pages. Please browse through the thread in the forums.


----------



## TheMailMan78 (Sep 16, 2009)

RuskiSnajper said:


> LoL, why does everybody think I don't know how to reload the page?


Because even the best of us make mistakes my friend.


----------



## RoutedScripter (Sep 16, 2009)

TheMailMan78 said:


> Because even the best of us make mistakes my friend.



Indeed, it's okay; you couldn't know the circumstances.


----------



## W1zzard (Sep 16, 2009)

i'll look into that problem with the comments


----------



## AsRock (Sep 16, 2009)

RuskiSnajper said:


> The frontpage news comments are cut off after post #161; nothing more is visible after that. Plus, the button is nowhere to be found, so I have to navigate the whole forum to get to the topic.



You could try going into your UserCP, then Edit Options \ Thread Display Options, and lowering the number where it says Number of Posts to Show Per Page.


----------



## W1zzard (Sep 16, 2009)

no, this is an issue with the frontpage site engine, i'll get this fixed in the next few days


----------



## mdm-adph (Sep 16, 2009)

btarunr said:


> Yes, Firefox has problems displaying "very long" pages.



[citation needed]


----------



## Scrizz (Sep 16, 2009)

RuskiSnajper said:


> LoL, why does everybody think I don't know how to reload the page? I got to it 2 times separately today and it just cuts off after post #161; it's been that way for about 2 days. I will screenshot it in a moment. I am using Firefox, I think the most up-to-date version.
> 
> Second, why would I hit reload if I just connected to it? And I did hit reload just in case, and still nothing.



interesting, I got the same too
also using FF latest version


----------



## W1zzard (Sep 16, 2009)

guys, don't worry about that .. i'll add a limit of 40 posts per page and a page browser like on the forums


----------



## Easy Rhino (Sep 16, 2009)

btarunr said:


> Where did it scale back production? It only released a new high-performance series, backed by mainstream series too
> 
> 
> 
> Right, and Juniper is built for that. I'm sure the same 90:10 ratio applies to NVIDIA as well.



you said there were no hard and fast rules. im telling you that you cant be a multi billion dollar company and decide to make chips for just enthusiasts without reorganizing the business. i know amd isnt doing that but you suggested they just make enthusiast chips.


----------



## btarunr (Sep 16, 2009)

Easy Rhino said:


> you said there were no hard and fast rules. im telling you that you cant be a multi billion dollar company and decide to make chips for just enthusiasts without reorganizing the business. i know amd isnt doing that but you suggested they just make enthusiast chips.



No, I suggested that with Radeon HD 5800, they just targeted the performance/enthusiast market (>$200, >VastMajority).


----------



## Easy Rhino (Sep 16, 2009)

btarunr said:


> No, I suggested that with Radeon HD 5800, they just targeted the performance/enthusiast market (>$200, >VastMajority).



right, which i agree wont make them enough money to support their entire business, which is why they have a whole other chip for that. nvidia waits to release their chips so the consumer has more choice, and it has worked every time.


----------



## btarunr (Sep 16, 2009)

Easy Rhino said:


> right, which i agree wont make them enough money to support their entire business, which is why they have a whole other chip for that. nvidia waits to release their chips so the consumer has more choice, and it has worked every time.



No, NVIDIA's model isn't any different or any more creditworthy than AMD's. The only thing NVIDIA has that's bigger is sales volumes, largely because its market share is higher. Even that is coming down. Slowly.

Radeon HD 5800 caters to the performance+ >$200 segment, while Radeon HD 5700 caters to <$200. So there's nothing that won't make AMD enough money. The introduction of these two lines will also mean existing inventories of the Radeon HD 4000 series getting digested, maybe at even lower price points. So you see, nothing with regard to production is affecting AMD as such. Replenished competitiveness against NVIDIA will only help its cause, throughout its lineup.


----------



## TheMailMan78 (Sep 16, 2009)

btarunr said:


> No, I suggested that with Radeon HD 5800, they just targeted the performance/enthusiast market (>$200, >VastMajority).



Bta is right. When you get right down to brass tacks, the 4800 series is overkill for, I would say, 96% of all games currently on the market. The 5800 series is overkill for EVERYTHING on the market. Only guys like us are really going to push these things.


----------



## erocker (Sep 16, 2009)

TheMailMan78 said:


> Bta is right. When you get right down to brass tacks, the 4800 series is overkill for, I would say, 96% of all games currently on the market. The 5800 series is overkill for EVERYTHING on the market. Only guys like us are really going to push these things.



Not while forcing 24x AA.


----------



## Wile E (Sep 16, 2009)

erocker said:


> Not while forcing 24x AA.



Which is also reserved for nutballs like us, not normal people. lol.


----------



## Benetanegia (Sep 16, 2009)

btarunr said:


> No, NVIDIA's model isn't any different or any more creditworthy than AMD's. The only thing NVIDIA has that's bigger is sales volumes, largely because its market share is higher. Even that is coming down. Slowly.
> 
> Radeon HD 5800 caters to the performance+ >$200 segment, while Radeon HD 5700 caters to <$200. So there's nothing that won't make AMD enough money. The introduction of these two lines will also mean existing inventories of the Radeon HD 4000 series getting digested, maybe at even lower price points. So you see, nothing with regard to production is affecting AMD as such. Replenished competitiveness against NVIDIA will only help its cause, throughout its lineup.



Hmmm, I thought the lower cards were going to be released in November. My previous comments on the matter were largely affected by that misconception.


----------



## Easy Rhino (Sep 16, 2009)

btarunr said:


> No, NVIDIA's model isn't any different or any more creditworthy than AMD's. The only thing NVIDIA has that's bigger is sales volumes, largely because its market share is higher. Even that is coming down. Slowly.
> 
> Radeon HD 5800 caters to the performance+ >$200 segment, while Radeon HD 5700 caters to <$200. So there's nothing that won't make AMD enough money. The introduction of these two lines will also mean existing inventories of the Radeon HD 4000 series getting digested, maybe at even lower price points. So you see, nothing with regard to production is affecting AMD as such. Replenished competitiveness against NVIDIA will only help its cause, throughout its lineup.



nvidia's model is totally different. they approach their lineup in completely different ways. nvidia isnt rushing their new lineup of cards because there is no rush! amd is jumping the gun and selling their first cards to a small segment of people. when nvidia launches there will be more cards and more of a reason to buy them. that has always been their strategy and it has always worked. nvidia's market share is larger because they can efficiently manufacture chips and are a more reliable business partner than amd. nvidia markets themselves better and appeals to a wider consumer base.


----------



## Benetanegia (Sep 16, 2009)

Easy Rhino said:


> that has always been their strategy and it has always worked.



It hasn't been their strategy at all. Both Ati and Nvidia have always tried to release their cards first, which for the reasons we've been giving is a mistake IMHO, if and only if it means rushing the release, but that really isn't the case now.

Well, waiting is also an advantage if your design is flexible enough that you can change it to make it faster after you see the competitor's product. For example, if Nvidia had waited, they could have released GT2xx cards with higher clocks in order to be definitely faster; they had enough room for overclocking.


----------



## Wile E (Sep 16, 2009)

Easy Rhino said:


> nvidia's model is totally different. they approach their lineup in completely different ways. nvidia isnt rushing their new lineup of cards because there is no rush! amd is jumping the gun and selling their first cards to a small segment of people. when nvidia launches there will be more cards and more of a reason to buy them. that has always been their strategy and it has always worked. nvidia's market share is larger because they can efficiently manufacture chips and are a more reliable business partner than amd. nvidia markets themselves better and appeals to a wider consumer base.



Ummm, nVidia releases enthusiast cards ahead of their mainstream offerings as well. Both operate on that business model.

Not only that, but how is releasing like this in any way detrimental to ATI's sales? Whoever has the fastest card gets the most market exposure. So even if the fastest card doesn't sell a bunch, it sells a lot more of their mainstream cards. It's marketing, with the sale of a few high end cards in the meantime. No sense in holding the card back if you have it, especially if it trounces your market competitor's current offerings in performance.

This launch is in no way different than the G80 launch. The G80 sure as hell didn't hurt nVidia, but it did put a serious dent in ATI sales because NV's top end cards stole all of ATI's thunder.

I'm sorry, but I'm gonna have to side with bta on this. I feel you are just flat out mistaken.


----------



## Deleted member 24505 (Sep 16, 2009)

I think there are a lot of people who just love to have the newest hardware, especially if it's really good too. I think it will do ATI no harm releasing these 5800 cards at this time.


----------



## pantherx12 (Sep 16, 2009)

Nvidia came first; that's why they are successful, in my opinion. ATI's market share increases every year; if they keep bringing out high-performance cards at lower prices, then eventually the market share of both companies will level out.

Again just my opinion.


----------



## Easy Rhino (Sep 16, 2009)

Wile E said:


> Ummm, nVidia releases enthusiast cards ahead of their mainstream offerings as well. Both operate on that business model.
> 
> Not only that, but how is releasing like this in any way detrimental to ATI's sales? Whoever has the fastest card gets the most market exposure. So even if they don't sell a bunch, it sells a lot more of their mainstream cards. It's marketing, with the sale of a few high end cards in the meantime. No sense in holding the card back if you have it, especially if it trounces your market competitor's current offerings in performance.
> 
> ...



when was the last time nvidia hard launched one card in their upcoming lineup prior to ati doing so? to my knowledge nvidia has always launched several cards at a time when they hard launch.


----------



## Wile E (Sep 16, 2009)

Easy Rhino said:


> when was the last time nvidia hard launched one card in their upcoming lineup prior to ati doing so? to my knowledge nvidia has always launched several cards at a time when they hard launch.



Not really. The 8800GT and G80 are the most recent examples. Both times, they only released 1 or 2 models, just like ATI is releasing the 5850 and 5870. The lower lines came much later.


----------



## pantherx12 (Sep 16, 2009)

Ati are reportedly* coming out with a few entry-level cards at the same time as the 5870 and 5850.

550 etc., perfect for HTPCs and mainstream computer users.


* Not confirmed at the moment but judging from the 4 series cards I imagine they will.

*edit* my typos are getting worse and worse; I'm now writing completely different words from what I intend : /


----------



## Easy Rhino (Sep 16, 2009)

if amd releases a larger lineup of cards before nvidia and if the top cards are that significantly more powerful (we will wait to see w1zzards review) than the current nvidia top card then i will say i was wrong. but at this point nvidia is in no rush to launch their lineup. nvidia will not lose anything from amd launching first and amd wont gain much in launching first. as i said in my first post about the subject, we enthusiasts make up a terribly small portion of the consumer base. our buying habits do not mimic those of the usual consumer and hardly generate enough revenue to brag about being first.


----------



## Wile E (Sep 16, 2009)

Easy Rhino said:


> if amd releases a larger lineup of cards before nvidia and if the top cards are that significantly more powerful (we will wait to see w1zzards review) than the current nvidia top card then i will say i was wrong. but at this point nvidia is in no rush to launch their lineup. nvidia will not lose anything from amd launching first and amd wont gain much in launching first. as i said in my first post about the subject, we enthusiasts make up a terribly small portion of the consumer base. our buying habits do not mimic those of the usual consumer and hardly generate enough revenue to brag about being first.



But that's where you are wrong. Our buying habits strongly influence the market. That's why both generally release their top end first. If ATI trounces the current cards, it will make it look like NV is at a disadvantage (whether perceived or real), and it will negatively affect the sales of all their lines. The company with the fastest card always makes the most headlines, and therefore gets free market exposure.


----------



## pantherx12 (Sep 16, 2009)

Searching for ATI Radeon HD5870 pulls 3,210,000 results from Google; I'd say there is some impact from enthusiast cards.


----------



## Easy Rhino (Sep 16, 2009)

Wile E said:


> But that's where you are wrong. Our buying habits strongly influence the market. That's why both generally release their top end first. If ATI trounces the current cards, it will make it look like NV is at a disadvantage (whether perceived or real), and it will negatively affect the sales of all their lines. The company with the fastest card always makes the most headlines, and therefore gets free market exposure.



i have a hard time buying that considering both companies are worth billions of dollars. there is a lot more going on than just "sell to enthusiasts, they will buy anything." considering a top of the line card costs $500, you would need 1 million sales of that card to generate a half billion dollars. that is about half of amd's quarterly revenue, and they also sell CPUs and motherboard chipsets...
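The back-of-the-envelope figures in the post above can be checked with a quick sketch (the $500 price and the half-billion-dollar target are the post's own numbers; nothing else is assumed):

```python
# Rough check of the argument: how many $500 halo cards does it take
# to generate half a billion dollars in revenue?
card_price = 500                 # top-of-the-line card, per the post
target_revenue = 500_000_000     # "a half billion dollars"

units_needed = target_revenue // card_price
print(units_needed)  # 1000000 -- far beyond realistic halo-card volume
```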


----------



## Wile E (Sep 16, 2009)

Easy Rhino said:


> i have a hard time buying that considering both companies are worth billions of dollars. there is a lot more going on than just "sell to enthusiasts, they will buy anything." considering a top of the line card costs $500, you would need 1 million sales of that card to generate a half billion dollars. that is about half of amd's quarterly revenue, and they also sell CPUs and motherboard chipsets...



Again, you are missing the point. They don't need to sell a lot of the cards. It is advertisement for them, and against NV if the performance is better. People see the headlines of the 5870 being faster and having more features than NV's cards, then they check out the card. If they see the price and decide it's too high, they are likely to check out what else ATI has to offer. In the meantime, AMD will still sell a few high-end cards. So it's in no way any kind of loss for them.

Again, we aren't making this up. It's been proven. That's why both manufacturers spend so much time attacking the top end.


----------



## Tatty_One (Sep 17, 2009)

What's important here is momentum...... ATi have this momentum currently...... irrespective of whether you have a preference for card vendor/manufacturer...... card sales are normally measured over a 1-year life span these days, purely because within a year each side will bring out faster hardware even if it does not imply improved architecture. Most buyers concentrate on value for money and speed, not transistor count, fabrication process or even supported IQ...... generally, in the last 3 years or so, the company that gets first to a new market or performance point maintains improved market share ahead of its rival for the first or maybe the 2nd quarter, dependent on its competitor's release. But in the world we live in.... the majority of sales DON'T come in the first 3 months of a card's release; yes, the "enthusiasts" tend to buy the new superfast architecture then, but 80% of the sales accrue 6 - 12 months after the initial launch. So, as we are all painfully aware...... over the forthcoming year, who wins the battle will be decided by the performance and price of NVidia's offerings as opposed, to some degree, by ATi's earlier offerings, if that makes sense.


----------



## Benetanegia (Sep 17, 2009)

Tatty_One said:


> What's important here is momentum...... ATi have this momentum currently...... irrespective of whether you have a preference for card vendor/manufacturer...... card sales are normally measured over a 1-year life span these days, purely because within a year each side will bring out faster hardware even if it does not imply improved architecture. Most buyers concentrate on value for money and speed, not transistor count, fabrication process or even supported IQ...... generally, in the last 3 years or so, the company that gets first to a new market or performance point maintains improved market share ahead of its rival for the first or maybe the 2nd quarter, dependent on its competitor's release. But in the world we live in.... the majority of sales DON'T come in the first 3 months of a card's release; yes, the "enthusiasts" tend to buy the new superfast architecture then, but 80% of the sales accrue 6 - 12 months after the initial launch. So, as we are all painfully aware...... over the forthcoming year, who wins the battle will be decided by the performance and price of NVidia's offerings as opposed, to some degree, by ATi's earlier offerings, if that makes sense.



That's what I've been saying. It makes sense.


----------



## RoutedScripter (Sep 17, 2009)

Look at the funny side:

Now Nvidia thinks FPS and resolution are not everything. Indeed, I do agree with the FPS part, but it's funny how green products always had the best FPS while the other stuff was a bit weak.

About the resolution I do not agree, cause Eyefinity is what it is; it's something never done before and Nvidia is just jealous. Things like game development studios and workstations could also seriously benefit from such features.

I do think that Eyefinity is strictly for the enthusiast side when it comes to gaming, which as is known is not as large as mainstream, but innovation is good anyway. Plus there's Samsung preparing the super-thin-edge LCDs just for this feature, cause there's a new market that's surely going to rise.


----------



## Benetanegia (Sep 17, 2009)

RuskiSnajper said:


> Look at the funny side:
> 
> Now Nvidia thinks FPS and resolution are not everything. Indeed, I do agree with the FPS part, but it's funny how green products always had the best FPS while the other stuff was a bit weak.
> 
> ...



I don't know how Eyefinity actually works, but it doesn't natively run the games at those high resolutions. It probably creates a profile in which you specify at what resolution you want your game rendered, and it upconverts to your 6 monitors' total resolution. Then this profile appears in the game with the "fake" resolution as the name, just as I can create any resolutions that I want for my CRT and they will appear in the game too.

If the cards had the ability (aka shader power + texturing power + rasterization) to play any game at 5000x3000 or whatever, the performance figures shown in this thread would be absolutely fake; it would be running them at 1000 fps!!
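The fill-rate argument above can be made concrete with a rough pixel count (the 3x2 monitor wall and the 1920x1200 panel here are illustrative assumptions, not Eyefinity's actual configuration):

```python
# Pixel-count ratio between one monitor and a hypothetical 3x2 wall
# of the same panels -- the factor by which fill-rate demand grows.
single = 1920 * 1200                  # one 1920x1200 panel
wall = (3 * 1920) * (2 * 1200)        # six panels in a 3x2 grid

ratio = wall / single
print(ratio)  # 6.0 -- six times the pixels to fill
# If the GPU is fill-rate bound, a game running at ~60 fps on one
# panel would drop toward 60 / ratio = 10 fps rendered natively,
# which is why upscaling from a lower internal resolution makes sense.
```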


----------



## RoutedScripter (Sep 18, 2009)

That's true; it would seriously need to be a massive GPU to run at those resolutions.

I am just curious about one thing, and was almost right on: how do games not need to support a specific resolution?

Or are games like Crysis so advanced they do not need this game-side support, so whatever resolution your GPU can output will work?

Well, I hope so.


----------



## Mussels (Sep 18, 2009)

RuskiSnajper said:


> That's true; it would seriously need to be a massive GPU to run at those resolutions.
> 
> I am just curious about one thing, and was almost right on: how do games not need to support a specific resolution?
> 
> ...



games tend to render internally at one aspect ratio, and then crop to fit.
for example, BF2 was 4:3, and they cut the vertical view down to make it widescreen.

they don't have an "internal" resolution limit; most modern games simply allow any resolution that windows sees fit (the 3D aspect will scale almost perfectly, although UI/HUD elements may stretch slightly)

the ATI Xenos chip in the 360 has the ability to render at one resolution and upscale to another - say a game could be 720p, but the screen can be at its native resolution of 1440x900, for example - _who cares if it's not rendered at maximum resolution - it's a great alternative to lowering the resolution when your performance is lacking, as it's better than running an LCD at a non-native resolution_
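The BF2-style vertical crop can be sketched numerically. This is a hypothetical helper, not anything from the game itself; the 4:3 and 16:9 figures are just the aspect ratios from the example above:

```python
def vertical_fraction_kept(src_w: int, src_h: int, dst_w: int, dst_h: int) -> float:
    """Fraction of the source's vertical view that survives when a render
    at the src aspect ratio is cropped (width held constant) to the dst
    aspect ratio. A value below 1 means the top/bottom get trimmed."""
    return (src_w / src_h) / (dst_w / dst_h)

# Cropping a 4:3 internal render to 16:9, as described for BF2:
kept = vertical_fraction_kept(4, 3, 16, 9)
print(kept)  # 0.75 -- a quarter of the vertical view is cut away
```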


----------

