# HD 4870X2 Beats GeForce GTX 280



## wolf2009 (Jun 6, 2008)

The until-now "reliable" informer CJ has done it again.

Here's what he reported to VR-Zone.



> While we are away for Computex, CJ let us know that the Radeon HD 4870 X2 R700 prototype card is out and it beats the NVIDIA GeForce GTX 280 in 3DMark Vantage. The R700 card is basically made up of two RV770 GPUs with a 2x256-bit memory interface to either GDDR3 or GDDR5 memory. We asked our sources about R700 at Computex and apparently AMD is going to let AIB partners decide the specs themselves. Therefore, the partners will set their own clock speeds, PCB design, memory types, cooler solutions etc., and there will be Radeon HD 4850 X2 and 4870 X2 cards differentiated by memory type. The R700 card is apparently doing pretty well at this stage, scoring about X5500 in the 3DMark Vantage Extreme preset, while the GeForce GTX 280 card is scoring X4800. Both sides are still working hard on optimizing their drivers for the new architecture, so we will probably see performance improve over time.



http://www.vr-zone.com/articles/Radeon_HD_4870_X2_R700_Beats_GeForce_GTX_280/5851.html

The next few months look very promising for ATI :cough: :cough: sorry, AMD.


----------



## Duffman (Jun 6, 2008)

Hmmm, interesting. Looking good. Now that I have a bad 2900XT (the cooler fan won't operate), I will be looking strongly at a 4870 X2.


----------



## ChaoticBlankness (Jun 6, 2008)

As someone who still uses an ATi x1900 AIW, I'm totally ready for an upgrade.


----------



## wiak (Jun 6, 2008)

Duffman said:


> Hmmm, interesting. Looking good. Now that I have a bad 2900XT (the cooler fan won't operate), I will be looking strongly at a 4870 X2.


Get a proper cooler like the Zalman VF900-CU.


----------



## alexp999 (Jun 6, 2008)

Surely it's obvious a dual GPU is going to beat a single one. The only fair comparison is between an ATI X2 and an NVIDIA GX2.


----------



## Th3-R3as0n (Jun 6, 2008)

But NVIDIA apparently said that their single-card solution would be equivalent to or better than AMD's dual-GPU solution?

Am I right?

This is great news for ATI users. I'm still sitting on an X1950XT, so the upgrade should be well worth the wait.


----------



## Deleted member 3 (Jun 6, 2008)

alexp999 said:


> Surely it's obvious a dual GPU is going to beat a single one. The only fair comparison is between an ATI X2 and an NVIDIA GX2.



Of course not; why would you compare based on the number of chips used? I think you should compare cards in the same price range. Sure, there is always the matter of a performance crown, but in the end we all want the best price/performance ratio. For instance, it's nice if there is a GTX 280 GX2 version which would probably kick that 4870 X2 around, but if it's double the price that still doesn't mean a thing.

It's similar to that "but C2Q isn't a 'real' quad" discussion. It is not relevant how something is created or how it looks on the inside. If it does the job, it's fine. For instance, if the NV card has to run at a zillion GHz to be faster, so be it; as long as it does so at the same price and same power consumption (i.e. not offering any major downsides), I couldn't care less.


----------



## tkpenalty (Jun 6, 2008)

Nvidia cannot counter this as the GTX280's design makes it impossible to have two of the cores in close proximity without something like a water block in use...


----------



## alexp999 (Jun 6, 2008)

Okay, fair enough. I assumed that all the "next gen" graphics cards would be roughly on par. Same price, that's all. Don't really follow upcoming stuff too much, lol.
Doesn't this mean the ATi single GPUs will get walked over by NVIDIA single GPUs? But I suppose if ATi gets the price right...


----------



## DaedalusHelios (Jun 6, 2008)

alexp999 said:


> Okay, fair enough. I assumed that all the "next gen" graphics cards would be roughly on par. Same price, that's all. Don't really follow upcoming stuff too much, lol.
> Doesn't this mean the ATi single GPUs will get walked over by NVIDIA single GPUs? But I suppose if ATi gets the price right...



Yeah, ATi will have to make their money in the price/performance category, since they have less development income to work with.


----------



## largon (Jun 6, 2008)

> HD 4870X2 Beats GeForce GTX 280 in 3DMark Vantage


Fixed.


----------



## newconroer (Jun 6, 2008)

DanTheBanjoman said:


> Of course not, why would you compare based on amount of chips used? I think you should compare cards in the same price range. Sure there is always the matter of a performance crown, but in the end we all want the best price/performance ratio. For instance it's nice if there is a GX280 GX2 version which would probably kick that 4870 x2 around, but if it's double the price that still doesn't mean a thing.
> 
> It's similar to that "but C2Q isn't a 'real' quad" discussion. It is not relevant how something is created, how it looks on the inside. If it does the job it's fine. For instance if the NV card has to run at a zillion Ghz to be faster so be it, as long as it does so at the same price and same power consumption (ie not offer any major downsides) I couldn't care less.




Speaking of power consumption, any word on the X2? Oh wait, all they've done is Vantage scores... wow... I'm glad we're already jumping to conclusions based on that.


----------



## niko084 (Jun 6, 2008)

I never put much faith in pre-release VR-Zone reviews...


----------



## Megasty (Jun 6, 2008)

P/P is everything. If the 4870x2 merely matches the GTX 280, then nuff said. But we all know it won't; it'll beat it by a fair margin. The paper numbers even confirm that... unless the GTX 280 comes up with a terrible amount of magic. It might be the best single-GPU card, but at $650-700, bull. I don't even know why this is still an issue.


----------



## farlex85 (Jun 6, 2008)

Both companies know what they're doing. If the 4870x2 beats the 280 GTX, it's a pretty safe bet NVIDIA will try to price that card competitively, or create another one to compete with it. The 4870 is supposed to be about $300 at launch, right? So if the 4870x2 comes soon after, that'll probably be $550 or so at launch. By then the 280 GTX may be equally priced. We'll just have to see. Don't count either out.


----------



## flashstar (Jun 6, 2008)

According to AMD, they will never release a card that costs more than $500. The 4870x2 will most likely be around $499. That's still $100 cheaper than Nvidia's variant. Additionally, Nvidia won't be able to lower the price at all because the margins are already really small for the 280 and 260.


----------



## farlex85 (Jun 6, 2008)

flashstar said:


> According to AMD, they will never release a card that costs more than $500. The 4870x2 will most likely be around $499. That's still $100 cheaper than Nvidia's variant. Additionally, Nvidia won't be able to lower the price at all because the margins are already really small for the 280 and 260.



That's good to hear, I didn't know they said that. I personally don't ever want to pay over $400 for a card, considering I can get a console for the same. Maybe $300. I'm gonna wait either way though. We shall just have to see, I suppose.


----------



## EastCoasthandle (Jun 6, 2008)

Th3-R3as0n said:


> But Nvidia Aparantely said that their Single Card Solution would be the equivalent or better than AMD's Dual core solution?
> 
> Am i right?
> 
> This is great news for ATI users. Im still sitting on a X1950XT so the upgrade should be well worth the wait



Correct, many from that camp believe that a monolithic design is still "superior" to a dual-GPU solution. Coming from CJ it's hard not to believe its accuracy, but I still take it with a grain of salt for now. But if it's true, it's a good slap in the face to those who undermine the 4800 series. This is because I read a few posts claiming the X2 couldn't beat the GTX280 in Vantage, as it is more geared towards NV hardware (be that true or otherwise). You have some who believe that 06 is designed for the 2900/3800 and Vantage is for NV.


----------



## niko084 (Jun 6, 2008)

flashstar said:


> According to AMD, they will never release a card that costs more than $500. The 4870x2 will most likely be around $499. That's still $100 cheaper than Nvidia's variant. Additionally, Nvidia won't be able to lower the price at all because the margins are already really small for the 280 and 260.



Doesn't much matter; most people will pay more for NVIDIA.


----------



## ghost101 (Jun 6, 2008)

The other thing is, it doesn't "just" beat it; it's roughly 14.6% faster. Nice to see, although NVIDIA might counter with a 256-shader product, if previous rumours that NVIDIA is only using 240 shaders due to poor yields are to be believed.
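The quoted Vantage presets put the gap at about that figure (a quick check of the rumored scores, nothing more):

```python
# 3DMark Vantage Extreme preset scores from the VR-Zone report above.
r700_score = 5500    # HD 4870 X2 (R700 prototype), "X5500"
gtx280_score = 4800  # GeForce GTX 280, "X4800"

advantage_pct = (r700_score / gtx280_score - 1) * 100
print(round(advantage_pct, 1))  # 14.6
```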


----------



## imperialreign (Jun 6, 2008)

alexp999 said:


> Surely it's obvious a dual GPU is going to beat a single one. The only fair comparison is between an ATI X2 and an NVIDIA GX2.



Well, if R700 does turn out to be a dual-core GPU, then technically it still counts as only one GPU on the board...

No one is sure yet about the R700 specs, as we haven't heard anything 100% reliable or concrete.


----------



## Megasty (Jun 6, 2008)

niko084 said:


> Doesn't much matter most people will pay more for Nvidia.



True, but that doesn't make them smart...

I'll be buying the best product for my cash. If I had to pay $600-700, that's just crazy, & I don't care about the Ultra either :shadedshu


----------



## EastCoasthandle (Jun 6, 2008)

*Remember to take all this with a pinch of salt, folks...*



EastCoasthandle said:


> Correct, many from that camp believe that a monolithic design is still "superior" to a dual-GPU solution. Coming from CJ it's hard not to believe its accuracy, but I still take it with a grain of salt for now. But if it's true, it's a good slap in the face to those who undermine the 4800 series. This is because I read a few posts claiming the X2 couldn't beat the GTX280 in Vantage, as it is more geared towards NV hardware (be that true or otherwise). You have some who believe that 06 is designed for the 2900/3800 and Vantage is for NV.



Remember when I said take these CJ posts (and other relating posts) with a grain of salt?



> Well here is one good reason why
> I literally just got out of a meeting with ATI, and honestly, I'd just advise everyone to calm down and wait for direct info from ATI/AMD.
> 
> A lot of these rumors and speculation are just plain wrong, some of the numbers floating around out there are made up, and the ones that aren't are often based on *older drivers* that aren't performing as well as the newer stuff.
> ...


Source

Hard to say what this means for the 4800 series. Who is this guy? Well, check out ExtremeTech.


----------



## Haytch (Jun 6, 2008)

wolf2009 said:


> The "Reliable" till now informer CJ has done it again.
> 
> Here's what he reported to VR-Zone .
> 
> ...



It's not that I don't believe you or your source; it's more that prior to a card's release there's always heaps of shit talking.

Then there's the fact that once cards and other hardware are released, you get a whole bunch of users that don't know how to manage, overclock and bench their equipment, giving us all crappy and inaccurate estimates.

Lastly, let's try not to forget those that don't know what CUDA is yet.

I've got high expectations this time round from both Nvidia and AMD/ATi.


----------



## Squirrely (Jun 6, 2008)

flashstar said:


> According to AMD, they will never release a card that costs more than $500. The 4870x2 will most likely be around $499. That's still $100 cheaper than Nvidia's variant. Additionally, Nvidia won't be able to lower the price at all because the margins are already really small for the 280 and 260.



Taken from an article (this was posted on a thread by wolf2009 a few days ago) about GT200's yields from 65nm wafers (http://www.theinquirer.net/gb/inquirer/news/2008/05/29/nvidia-gt200-sucessor-tapes):


> Word has come out of Satan Clara that the yields are laughable. No, make that abysmal, they are 40 per cent. To add insult to injury, that 40 per cent includes both the 280 and the 260 yield salvage parts. With about 100 die candidates per wafer, that means 40 good dice per wafer. Doing the maths, a TSMC 300mm 65nm wafer runs about $5000, so that means each good die costs $125 before packaging, testing and the like. If they can get away with sub-$150 costs per GPU, we will be impressed.



I think the only way NVIDIA would be able to lower costs is when they introduce their GT200b core. They do have a little bit of playing room with the current cost per chip (~$150), but once they go 55nm, that will lower costs quite a bit. That might allow them to position the chip more competitively against ATI. But there's also the factor stated above: people will pay more for that little green logo, so they could mark it up a bit more.

I guess we just have to wait it out and see what the two do.
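The Inquirer's per-die arithmetic quoted above can be checked directly (all inputs are the article's estimates, not official figures):

```python
# Per-die cost estimate from the quoted Inquirer figures (estimates only).
wafer_cost = 5000.0   # USD per 300mm 65nm TSMC wafer, as quoted
die_candidates = 100  # die candidates per wafer, as quoted
yield_rate = 0.40     # combined GTX 280/260 yield, as quoted

good_dice = die_candidates * yield_rate   # 40 good dice per wafer
cost_per_die = wafer_cost / good_dice     # cost before packaging and testing
print(cost_per_die)  # 125.0
```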


----------



## Edito (Jun 6, 2008)

Nice for ATI, but I want to see real and official benchmarks, because NVIDIA always has something in their cards, some kind of magic that makes them a little better in real gaming. I just want to see a head-to-head. Still, a very good move from ATI, because in the very end all we need is price/performance... but I think the GTX280 will be better...


----------



## Duffman (Jun 7, 2008)

wiak said:


> get a proper cooler like zalman VF900-CU




Had a Zalman VF-1000 on it and it couldn't cut it to my satisfaction. Have an Arctic Cooling Accelero Extreme on it now, but the fans just won't operate off the card. It's a card issue.

I can't wait to see official numbers and reviews. More the reviews than anything.


----------



## DrPepper (Jun 7, 2008)

I take this with a grain of salt... and a salt mine just in case, but this always happens and no one really knows which is better.


----------



## trt740 (Jun 7, 2008)

wiak said:


> get a proper cooler like zalman VF900-CU



No, a Thermalright HR03.


----------



## TonyStark (Jun 7, 2008)

The GTX 280 will probably have higher minimum frame rates and won't suffer from micro-stutter like multi-GPU setups do.


----------



## Laurijan (Jun 7, 2008)

I look forward to ATI's surprises... the GFX war between ATI and NVIDIA never gets boring.


----------



## JRMBelgium (Jun 7, 2008)

This year could be the first year that I buy ATI hardware


----------



## J-Man (Jun 7, 2008)

I'm gonna buy another 3870 X2 in a few weeks and run those in xfire for a few months then I'll buy a 4870 X2


----------



## DaedalusHelios (Jun 7, 2008)

It's all speculation. I doubt half of this stuff is true about either company. Their websites just want more hits for the adverts.


----------



## warhammer (Jun 8, 2008)

Looks like ATI may have a little NVIDIA in it http://www.fudzilla.com/index.php?option=com_content&task=view&id=7771&Itemid=1


----------



## Megasty (Jun 8, 2008)

ATi is already playing hardball with the 4870x2 monster. They've been trying to get their hardware back up to the high bar set by the 88xx series. Havok and PhysX will only give ATi's fire an oxygen line.


----------



## panchoman (Jun 8, 2008)

I highly doubt AMD cards will feature CUDA, sadly; AMD will probably find a different way to make PhysX work on their GPUs.


----------



## OzzmanFloyd120 (Jun 8, 2008)

Looks like my next build might be a Spider.


----------



## WarEagleAU (Jun 8, 2008)

I think ATI's approach to physics is one card dedicated to it. I don't see them making a separate physics card.


----------



## imperialreign (Jun 8, 2008)

WarEagleAU said:


> I think ATI's approach to physics is one card dedicated to it. I don't see them making a separate physics card.



That's how they did it before, during the first physics war between the two, and ATI fared really well.

I can't see ATI buying into CUDA either, and knowing nVidia, I can't imagine CUDA running anywhere near as well on an ATI card as it would on a green card. ATI have proven in the past to be quite formidable at physics processing.

Y'know, with the way ATI have been going with their designs, it wouldn't surprise me if they decided at some point to include a smaller processor on their card for physics processing. One card could house a GPU and a PPU; they've already proven dual-GPU cards to be efficient and effective.


----------



## Megasty (Jun 8, 2008)

All this talk about NV & ATi fanboyism is really annoying.

Fanboyism is one thing, but I can't compare cards that aren't in the same price range. If you guys think the 280 will be worth $650, then by all means go buy it & watch NV drop y'all in the gutter 3 months later. Wow, NV will be faster because their cards cost more, big whoop. Even now you can get a 3870x2 for $320 while a 9800GX2 is $430. Is 15% extra performance worth $110? Hell no. Fanboyism & being fiscal are two different things. When NV or ATi releases buggy POSs, I'm going to have my say no matter who does it. And the same goes if either releases BS that costs too much, which NV seems to be doing quite often these days.

If the 4870x2 kills the GTX280 then the world will end, NV will eat humble pie, & they will be forced to stop sandbagging their crap. Hopefully not in that order.


----------



## Edito (Jun 8, 2008)

some GTX280 pics...

http://www.vr-zone.com/articles/Detailed_Geforce_GTX_280_Pictures/5826.html


----------



## erocker (Jun 9, 2008)

wolf said:


> lol, this is hilarious; nvidia will always have the crown. They've been humoring ATi for 2 years now; every time ATi makes something, they quickly and easily make something a little bit better.
> 
> live it up, ATi fanboys, really, live it up; I hope you get in all your shots now, cos it won't last long.
> 
> ...



This isn't the place to come and rant about some paradox that lives inside your head.  This is complete flame-bait.


----------



## JRMBelgium (Jun 9, 2008)

wolf said:


> lol, this is hilarious; nvidia will always have the crown. They've been humoring ATi for 2 years now; every time ATi makes something, they quickly and easily make something a little bit better.
> 
> live it up, ATi fanboys, really, live it up; I hope you get in all your shots now, cos it won't last long.
> 
> ...



Ehm, ATI was faster for a very long time with the Radeon 9 series; after that, the X1 series also owned the GeForce 7, and now the 3870X2 is still the best high-end product if you ask me. The 9800GX2 is faster, but $160 more expensive. There is a lot of stuff you can buy with $160: cooling for all your hardware, a new case, a better PSU, more memory; a lot of stuff that lasts way longer than a GPU.

The GeForce 8 period was actually the first LONG period where NVIDIA had the upper hand the entire time. So your fanboyism doesn't make any sense...

I've never bought ATI hardware in my life. Not because I am an NVIDIA fanboy, but because NVIDIA was always the best at the time I wanted to upgrade. My next GPU will be my last PC upgrade. I only play two games anymore, "America's Army" and "Trackmania Nations". Both run at 1600x1200 8xAA 8xAF at the moment, but I want to play them at 2048x1536 16xAA 16xAF and my 8800GT isn't capable of that. If the 4870 is as fast as the 3870X2, then I will buy it.


----------



## WarEagleAU (Jun 9, 2008)

Tastefully done, Jelle. On the topic of that pic of the GT200, I really like that full enclosure. Gives it a nice, sleek, sexy look. I like how the exhaust is done too. Honestly, I'm an ATI fan, but I like rooting for the underdog and I love how it's in my price range. This 3870 is the first brand-new GPU tech I've ever bought; I usually buy one or two generations behind. However, when I was trying to play Lost Planet, my modded X800GTO2 wouldn't play it because it didn't have SM 3.0. I can honestly say I'm very happy with my upgrade.


----------



## wolf (Jun 9, 2008)

Just an opinion, fellas, just as you are entitled to your own. Can't wait for those benchmarks.

I am willing to grant it was fanboyism, and to anyone that took offense, I'm sorry. ATi are a great company, and I have nothing against them; heck, I even buy their cards from time to time.

Behind all that fanboy rage, I was trying to point out how NVIDIA have started slow and come back quite hard in the past. Don't count them out yet.


----------



## wolf (Jun 9, 2008)

Yeah, I really did come across quite assholey, but basically what I was getting at is that a lot of people on these forums are counting NV out of this race already, and the products aren't even released.

Six months into this generation (i.e. Christmas) we will truly see how they compete, after driver optimizations and perhaps a newer revision, like how the 7800GTX 512 was to the 256; and ATi might even have a better revision too.

And we need two comparisons: GPU vs GPU (i.e. 1 vs 1 and 2 vs 2 etc.) and a price comparison. I feel the results will sway heavily depending on the test.


----------



## KainXS (Jun 9, 2008)

wolf said:


> Yeah, I really did come across quite assholey, but basically what I was getting at is that a lot of people on these forums are counting NV out of this race already, and the products aren't even released.
> 
> Six months into this generation (i.e. Christmas) we will truly see how they compete, after driver optimizations and perhaps a newer revision, like how the 7800GTX 512 was to the 256; and ATi might even have a better revision too.
> 
> And we need two comparisons: GPU vs GPU (i.e. 1 vs 1 and 2 vs 2 etc.) and a price comparison. I feel the results will sway heavily depending on the test.



But we should know by now that ATI's drivers, even 6 months after the respective cards are released, are usually not so good. They need better drivers, seriously.


----------



## panchoman (Jun 9, 2008)

Have you guys noticed that it's ATI's new GPU doubled up vs NVIDIA's new GPU? Now what happens to ATI when it's 4870 vs 280, or 4870 vs 280x2?


----------



## OzzmanFloyd120 (Jun 9, 2008)

panchoman said:


> Have you guys noticed that it's ATI's new GPU doubled up vs NVIDIA's new GPU? Now what happens to ATI when it's 4870 vs 280, or 4870 vs 280x2?



I think it was supposed to be a cost comparison, as in the cost of the 4870x2 will be equal to the new NV chips.


----------



## panchoman (Jun 9, 2008)

Ah okay... but to me this sort of implies that the 4870 won't be as powerful as the 280... which is probably bad for ATI...


----------



## erocker (Jun 9, 2008)

280 = $600 / 4870 = $300. No one loses. The 280's chip is about the size of two RV770s anyway. What NVIDIA needs is a card that will run head to head with the 4870 and be in its price range.


----------



## OzzmanFloyd120 (Jun 9, 2008)

I think that really depends on the cost. If you can get a 4870X2 for $300 and the NV 280 is $320 (I just pulled those prices out of my ass) and the 4870X2 ends up outperforming the 280, it's going to hurt NVIDIA more because of the price difference.


----------



## JRMBelgium (Jun 9, 2008)

panchoman said:


> Ah okay... but to me this sort of implies that the 4870 won't be as powerful as the 280... which is probably bad for ATI...



Not necessarily. If ATI succeeds in releasing a product that provides 3870X2 performance, consumes way less power than the 280, and is at least $100 cheaper than the 280, then it will sell, oh yes it will...

It's going to be 4870 vs 260 & 4870x2 vs 280, I think.


----------



## panchoman (Jun 9, 2008)

Well, let's hope ATI does just that. They can still pull it off, but it all depends on the prices and other factors.


----------



## Megasty (Jun 9, 2008)

panchoman said:


> Well, let's hope ATI does just that. They can still pull it off, but it all depends on the prices and other factors.



Exactly, because ATM things don't add up at all. Performance: 4870 < 260 < 280 < 4870x2. Price: 4870 < 260 < 4870x2 < 280. All we need is a steady fight, not an NWO.


----------



## wolf (Jun 9, 2008)

OzzmanFloyd120 said:


> I think it was supposed to be a cost comparrison, as in the cost of the 4870x2 will be equal to the new NV chips.



The cost will be irrelevant to the niche market who go for only the best of the best. That may well be 2x 4870X2, but it's a tricky subject....

It seems (so far) that on price, the 4870X2 "should" best a GTX 280. However, as I've read before from TechReport, ATi is building its multi-GPU setups with smaller building blocks, so trying to get 3 or 4 NVIDIA GPUs to scale well brings a lot more issues.

My guess at this stage would be that the two most popular enthusiast setups are:

4870X2 = great performance at a great price
GTX280 SLI = insane performance at an insane price

Still, so much of this is educated guessing, speculation and fanboyism (of which I myself am undoubtedly guilty).....

Can't wait for this headline...

"GTX2XX and 48XX go head to head"


----------



## erocker (Jun 9, 2008)

wolf said:


> 4870X2 = great performance at a great price
> GTX280 SLi = insane performance at an insane price



I agree, except there's also the insane power consumption and heat generation of the GTX280. Theoretically, if the GTX280 chip were a little bigger, it would pretty much be two RV770s put together. Both companies are taking different approaches with these new cards, and really that's a good thing; it gives a more distinct choice.


----------



## wolf (Jun 9, 2008)

+1. Diversity of choice in products will be awesome for the first time in a while. It's been six of one and half a dozen of the other for months and months now.


----------



## Hayder_Master (Jun 9, 2008)

Yeeeeeeeeeehaaaaaaaaaaaaaa, yeah baby, ATI wins!


----------



## Hayder_Master (Jun 9, 2008)

NVIDIA is trying to put in GDDR5; I just want to tell them time is up, ATI will score even if you make GDDR6.

ATI vs NVIDIA:

NVIDIA says: ambush, the ATI 4870x2 is coming!
ATI says: everybody get out, this is my land now.
NVIDIA says: please don't kill me.


----------



## DarkMatter (Jun 9, 2008)

erocker said:


> I agree, except there also there's the insane power consumption and heat generation for the GTX280.  Theoretically if the GTX280 chip was a little bigger it would pretty much be two RV770's put together.  Both companies are taking different approaches to these new cards, and really that's a good thing and gives a more distinction of choice.



But I think you guys are forgetting that the 4870 X2 is 2 GPUs slapped together, so its power consumption could (by common sense, will) be a lot higher than that of the 280, unless Ati can do some magic this time around, which I doubt. I do think it will be faster, but I don't think it will do a lot better than the 3870 X2 did in comparison. << Note that this implies I'm saying it WILL DO better (overall, in the market, etc.), but not A LOT BETTER in terms of performance-per-watt. The X2 consumes more than the 8800 Ultra and a lot more than the 8800 GTS 512 / 9800 GTX, while not being a lot faster. It consumes more than 2x HD3870 and I don't think they can improve this situation by much in the next generation. The 280 reportedly consumes exactly 50% more than the HD4870, so I don't think they can make the X2 consume less than 2x the power of a single card, and that would make the HD4870 X2 draw significantly more power than NV's cards.

Then there's also the price issue. I could be wrong, but I find it hard to believe that, with the HD4870 costing more than $300, they can make the X2 cost around $500. Maybe (I'm almost sure, really) they are counting on GDDR5 pricing going down by the time they launch the card, but by then all the cards are going to cost less. Especially if the reason for GT200 being so expensive is that it has extremely low yields, as that's something they can fix "easily". They usually do (speaking of chipmakers in general).

Also, later but soon, NVIDIA will launch its 55nm refresh, with the possibility of a GX2 regaining the crown. And sadly, if that happens it will be an all-around crown, because NVIDIA right now is a bit better in perf/watt and perf/price than Ati while using a bigger manufacturing process. 55nm will only make this more pronounced.

IMO Ati is going to be the better purchase option in the near future, as they're selling their cards in the sweet spot, but we clearly can't count NVIDIA out. According to leaked specs and benchmarks, the HD4850 won't be faster than the 9800 GTX, so NVIDIA has that card to fight against Ati. It will also have the GT selling for significantly less. If the HD4870 will be "better" than the GTX 200 cards because of its price, won't it be HD4870 < 9800GTX < HD4850 < 9800GT? IMO yes. Where's the low limit? It will depend on the performance of the cards, and as it stands right now BOTH companies will have their products in different segments. They won't compete on performance; they WILL on price/performance, and there, the lower you go the better bang for the buck you will find. And that will be the 9800 GT/GTX* IMO, but only time will tell.

* Because, contrary to the HD4000 series, they won't be high-end and prices will be easier to lower to the desired point. Similar situation as with the HD3870 vs. 9600GT and HD3850 vs. 9600GSO.
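DarkMatter's power argument above can be sketched numerically (the 1.5x ratio is the post's claim, not a measured figure):

```python
# Relative power draw, normalized to a single HD 4870 (claimed ratios only).
hd4870 = 1.0
gtx280 = 1.5 * hd4870     # post claims the 280 draws 50% more than an HD4870
hd4870_x2 = 2.0 * hd4870  # assumes the X2 can't get below 2x a single card

excess = hd4870_x2 / gtx280 - 1
print(round(excess, 2))  # 0.33 -> the X2 would draw ~33% more than a GTX 280
```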


----------



## trt740 (Jun 9, 2008)

DarkMatter said:


> But I think you guys are forgetting that 4870 X2 are 2 GPUs slapped together, so it's power consumption could (by common sense it will) be a lot higher than that of the 280, unless Ati can do some magic this time around, which I doubt. I do think it will be faster, but I don't think it will do a lot better than 3870 X2 in comparison. << Note that this implies I'm saying it WILL DO better (overal or in the market, etc.), but not A LOT BETTER in terms of performance-per-watt. The X2 consumes more than the 8800 Ultra and a lot more than the 8800 GTS 512/ 9800 GTX, while not being a lot faster. It consumes more than 2xHD3870 and I don't think they can improve this situation by much in the next generation. Nv 280 consumes exactly 50% more than HD4870, so I don't think they can make the X2 consume less than 2x the power of a single card and that would make the Hd4870 X2 draw significantly more power than Nv's cards.
> 
> Then there's also the price issue. I could be wrong, but I find it hard to believe that costing the HD4870 more than $300 they can make the X2 cost around $500. Maybe (I'm almost sure really) they are counting with GDDR5 pricing going down by the time they launch the card, but by then all the cards are going to cost less. Specially if the reason for GT200 to be so expensive is that it has extremely low yields, as that's something they can fix "easily". They usually do (speaking of chipmakers in general).
> 
> ...



Guys, from what I read, the 4850 is supposed to be on par with an 8800GTS/9800GTX, and the 4870 near the speed of a 3870X2. Is this correct?


----------



## HTC (Jun 9, 2008)

trt740 said:


> Guys, from what I read, the 4850 is supposed to be on par with an 8800GTS/9800GTX, and the 4870 near the speed of a 3870X2. Is this correct?



From what I've read, yes.

Please look in here: *apparently*, as resolutions and details go up, the performance gap widens.

EDIT

If the 4850 does that, I wonder how the 4870 and 4870x2 will do against their nVidia counterparts!


----------



## DaMulta (Jun 9, 2008)

I wonder if SLI 260 cards could take the 4870X2. We will have to wait and see in the coming weeks.


----------



## EastCoasthandle (Jun 9, 2008)

DaMulta said:


> I wonder if SLi 260 cards could take the 4870X2. Yes we will have to wait see if it does in the coming weeks.



I would think that solution would cost more than the X2. I'm not sure that's practical even if you are using a 780i/790i board.


----------



## DaMulta (Jun 9, 2008)

Cost isn't always an issue, but I'm thinking SLI 260 would cost about the same (it also depends on what you call close...). Plus you could have this on release day, and not have to wait an extra month for the 4870X2 to be released. I have also found that SLI works better than CF in most games. Now this could be totally different with the new HD4 cards.


----------



## EastCoasthandle (Jun 9, 2008)

DaMulta said:


> Cost isn't always an issue, but I'm thinking SLI'd 260s would cost about the same. Plus you could have this on release day, and not have to wait an extra month for the 4870X2 to be released. I have also found that SLI works better than CF in most games. Now this could be totally different with the new HD4 cards.


My thinking is that if two 260s cost as much as (or more than) a 280, then why not buy the 280? This is why I believe two 260s vs. the X2 may not be a practical solution.


----------



## DaMulta (Jun 9, 2008)

Because for one you could add a third card for less than you could with the 280. That's true of most SLI and CF setups: when you hit a price point you can just move up to the next higher card. The only thing that you DO NOT GET ON SINGLE CARDS is the extra AA settings that you get with an SLI or CF setup. I enjoy having those settings for my older games, so it's two or more cards for me, always, but that's just what I like having.


----------



## Jansku07 (Jun 9, 2008)

Here are my thoughts on this issue: according to AMD, the HD4870 will be 1.3x faster than the 4850. If the HD4850's performance is within ±5% of the 9800GTX, then the 4870 is ~30% faster than the 9800GTX, correct? The 8800GTS is what, 5-10% slower than the 9800GTX, and two of them slapped on one PCB make a 9800GX2.

Crossfire and SLI are fairly close to each other in performance, so the 4870X2 would be roughly 1.05 * 1.3 = 1.365x as fast as the 9800GX2. We don't know if there are any improvements in the 4870X2 compared to usual Crossfire, so we'll just have to wait and see.

There will not be a 55 nm 280X2, simply because one card can only draw 300W from a single PCI-E 16x slot plus power pins. A 55 nm 260X2 is possible.
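The chained estimate above can be sketched as a quick calculation (a rough sketch only; the 1.05 and 1.30 ratios are the rumoured figures from this thread, not measured results):

```python
# Rough chained estimate of HD4870 X2 vs. 9800 GX2 performance,
# following the reasoning above. All ratios are rumoured/assumed.

def chained_speedup(*ratios):
    """Multiply a chain of relative-performance ratios together."""
    result = 1.0
    for r in ratios:
        result *= r
    return result

hd4850_vs_gx2_half = 1.05  # assumed: HD4850 ~5% ahead of half a GX2
hd4870_vs_hd4850 = 1.30    # AMD's claimed HD4870-over-HD4850 scaling

# If CF and SLI scale about equally, the X2-vs-GX2 gap equals the
# single-card gap:
x2_vs_gx2 = chained_speedup(hd4850_vs_gx2_half, hd4870_vs_hd4850)
print(round(x2_vs_gx2, 3))  # -> 1.365
```

Swap in different ratios to see how sensitive the 1.365 conclusion is to the assumptions.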


----------



## EastCoasthandle (Jun 9, 2008)

DaMulta said:


> Because for one you could add a 3ed card for less cost than you could doing it with the 280. This is the truth with most SLi and CF setups. When you hit a price point you can just move up to the next higher card. The only thing that you DO NOT GET ON SINGLE CARDS is the extra AA settings that you get with a SLi or CF setup. I myself enjoy having those settings for my older games so 2 or more cards for me always, but that is what I like having.


I'm not following here. I never mentioned SLI'ing three 280s. The issue with the 260 in SLI is whether it will perform consistently with the 280, something we really don't know as of yet. Also, the cost factor: compared to an X2 it will probably cost more. The next question would be whether you are paying a premium at or above a 280's price. I understand this may be your preference; I just look at it from a different point of view.


----------



## EastCoasthandle (Jun 9, 2008)

HTC said:


> Source: PCzilla but originally found @ PCDig@ (Portuguese site)



Is this really good? 36 FPS in Crysis at 1920 is about the same as what some people experienced at 1280 (with all the quirks and slowdowns that came with it). Also, this is just the fly-by (unless another type of benchmark exists), so actual gameplay may be different.


----------



## DaMulta (Jun 9, 2008)

From what I have heard, all the new cards will be able to handle anything on the market with flying colors. I don't even know if you will truly need an X2 or anything more than a 260.

Now we (or I) really don't know this to be true until release day in the coming weeks. But the X2 / 260 SLI / 280 might be a waste in a lot of people's minds when they are released.


I see where you are coming from price wise, and AMD ALWAYS seems to win in that section.


----------



## Disruptor4 (Jun 9, 2008)

I thought this to be interesting...
http://www.pczilla.net/en/post/35.html
Now if the 4870 can beat that, I'll be extremely happy!


----------



## wolf (Jun 9, 2008)

DaMulta said:


> I see where you are coming from price wise, and AMD ALWAYS seems to win in that section.



iunno dude, in Australia 8800GT's can be had everywhere cheaper than a 3870


----------



## Ravenas (Jun 9, 2008)

AMD is on top now graphics-wise. Their 3870 X2 1GB cards are beating Nvidia's 9800 GX2 1GB cards in 3DMark06, according to the latest PC Gamer magazine.

I think AMD's tri cores are going to take off on the next stepping release. But who knows...


----------



## EastCoasthandle (Jun 9, 2008)

Disruptor4 said:


> I thought this to be interesting...
> http://www.pczilla.net/en/post/35.html
> Now if the 4870 can beat that, I'll be extremely happy!



This is from ORB



> This score in Crysis are authentical ... i got very similar FPS on my GTX 280 ...
> 
> GTX 280 is not brutal powerfull card! performance is good, but i expected many more ....
> 
> According testing of Radeon HD4850 in CF i can say - radeons are better then GTX 280!



Take a look at his sig. He prefers nv hardware. ORB is supposed to be under NDA until the 16th. Now, if he's saying that the 4850 in CF is better than a 280 GTX, then what will an X2 do? Time will tell...


----------



## Megasty (Jun 9, 2008)

ATi seems to be creating a quandary with their graphics lineup. The 4870 doesn't appear to be the best choice this time around. If we hold ATi to their word, the 4870X2 will be $499 or cheaper (yeah right), but 4850 CF will be about $400, a true price/performance winner. Although the ultimate performance winner is the 4870X2, both of those options are fiscally better buys than the 4870. That is, unless the 4870 can hold its own with all the power-hungry games out there, in which case both higher options will be overkill - but we all love overkill, don't we


----------



## EastCoasthandle (Jun 9, 2008)

I also have to wonder whether the 260 is even playable at 1920 @ Very High. If the 280 gives 36 FPS in the fly-by, then I have to assume the 260 would be around 30 or so.
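For what it's worth, the "around 30" guess can be sanity-checked with a crude scaling sketch: the GTX 260 has 192 stream processors against the GTX 280's 240, and scaling the leaked fly-by number by that ratio (deliberately ignoring clocks, memory bandwidth and CPU limits) lands in the same ballpark:

```python
# Crude sanity check of the "around 30 FPS" guess for the GTX 260:
# scale the leaked GTX 280 fly-by result by the shader-count ratio.
# This ignores clock speeds, memory bandwidth and CPU limits.

gtx280_fps = 36.0           # leaked Crysis fly-by, 1920, Very High
shader_ratio = 192 / 240    # GTX 260 vs. GTX 280 stream processors

gtx260_fps_estimate = gtx280_fps * shader_ratio
print(round(gtx260_fps_estimate, 1))  # -> 28.8
```

A real review would of course measure this rather than scale it; this is only a back-of-the-envelope plausibility check.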


----------



## JRMBelgium (Jun 9, 2008)

EastCoasthandle said:


> I also have to wonder if the 260 is even playable at 1920 @ Very High?  If the 280 gives 36 FPS using a fly by then I have to assume that the 260 would be around 30 or so.



Eeuhm, I play Gears of War, America's Army, Trackmania Nations, Assassin's Creed, etc. at 2048x1536 with my 8800GT. Without AA and AF of course, and I never use the highest shadow settings, but everything else is at highest.

And the 260 will be more powerful than the 8800GT, so yeah, 1920 should not be a problem at all...

PS: Keep in mind the CPU can make a big difference in framerates...


----------



## EastCoasthandle (Jun 9, 2008)

Jelle Mees said:


> Eeuhm, I play Gears of War, America's Army, Trackmania Nations, Assassin's Creed, etc. at 2048x1536 with my 8800GT. Without AA and AF of course, and I never use the highest shadow settings, but everything else is at highest.
> 
> And the 260 will be more powerful than the 8800GT, so yeah, 1920 should not be a problem at all...
> 
> PS: Keep in mind the CPU can make a big difference in framerates...


I am only comparing the results shown so far, which are for Crysis. What I am saying is that if the 280 can do 36 FPS at 1920, what will the 260 do? I guesstimate around 30 FPS, but we won't know until it's finally reviewed. And as far as your statement goes, I will assume the same applies to the 4870/X2. However, we don't have any leaked benchmark results for those at this time.


----------



## EastCoasthandle (Jun 9, 2008)

Anyone live near a TDshop

GTX280


----------



## Megasty (Jun 9, 2008)

EastCoasthandle said:


> Anyone live near a TDshop
> 
> GTX280



*That's $911.10 *


----------



## yogurt_21 (Jun 9, 2008)

I guess that's the premium to be the first to have it lol.


----------



## spearman914 (Jun 9, 2008)

I knew it! Nice find dude...


----------



## JRMBelgium (Jun 9, 2008)

EastCoasthandle said:


> Anyone live near a TDshop
> 
> GTX280



That must be fake.


----------



## DarkMatter (Jun 9, 2008)

Megasty said:


> *That's $911.10 *



No that's 582 euros. It's a lot less than what I was expecting.

EDIT:

HD3870 X2 - http://www.tdshop.fr/Negozio.asp?IdNegozio=1&Categoria=45&SottoCategoria=&CodProdotto=M310568

8800 GTS - http://www.tdshop.fr/Negozio.asp?IdNegozio=1&Categoria=45&SottoCategoria=&CodProdotto=N350346

Those are ~$550 OMG  <<<Irony


----------



## wolf (Jun 10, 2008)

Ravenas said:


> AMD is on top now graphics wise. Their 3870 X2 1GB chips are beating Nvidia's 9800 GX 2 1GB chips in
> 3dmark06 according to the latest PC Gamer magazine.
> 
> I think AMD's tri cores are going to take off on the next stepping release. But who knows...



3DMark06 isn't everything; the 9800GX2 comes out on top in like 90% of games and applications, dude.


----------



## Megasty (Jun 10, 2008)

DarkMatter said:


> No that's 582 euros. It's a lot less than what I was expecting.
> 
> EDIT:
> 
> ...



582 euros = 911 dollars

I feel bad for Euro gaming enthusiasts


----------



## KainXS (Jun 10, 2008)

wolf said:


> 3DMark06 isn't everything; the 9800GX2 comes out on top in like 90% of games and applications, dude.



but you have to look at it like this: this is the cheapest 9800GX2 on Newegg
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150284

and most of the 3870X2s are under 400 dollars, like this one
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102723

and that includes the new GDDR4 version
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131100

Both are good brands, but I won't pay an extra 100+ dollars for a card that's like 5 fps faster on average.


----------



## HAL7000 (Jun 10, 2008)

I will wait and see what they do in real-world use. All this guessing is like listening to my wife b*tch about how much I spend on upgrading. All I am hoping is that neither camp cries foul or says oops... maybe next time I'll get it right.....


----------



## KainXS (Jun 10, 2008)

I will wait also, The 4870 and 4850 will more than likely drop in price like the 38XX series did


----------



## wolf (Jun 10, 2008)

KainXS said:


> but you have to look at it like this, this is the cheapest 9800Gx2 on Newegg
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150284
> 
> and the most of the 3870X2's are under 400 dollars like this one
> ...



the fact is the 9800GX2 has the crown. If you're talking price, SLI'd 1024 MB 9600GTs will be cheaper and roughly on par. Heck, in AU the 8800GT is cheaper AND better.


----------



## DarkMatter (Jun 10, 2008)

Megasty said:


> 582 euros = 911 dollars
> 
> I feel bad for Euro gaming enthusiasts



That's only because of the weak dollar and the purchasing-power difference between the US and the EU, even if you can't generalize across the EU since each country has its own. Anyway, the average across the most advanced countries in the EU, namely the first 15, is higher than in the US. 600 euros are as easy to spend for most people in Europe as $600 is for most in the US. It's a bit harder for us here in Spain, but it's nothing like throwing out $900 is for US people. In Germany, Finland and many others, 600 euros are worth less relative to their purchasing power than $600 is in America.

Anyway, even if that weren't the case and it was indeed harder on the pocket for us, it's always been like that, meaning the GTX 280 won't be expensive compared to previous generations. According to an article on Tom's Hardware, France was the most expensive country (of those tested) in the world for hardware. In Spain (usually cheaper) the 8800 GTX debuted at 625 euros, the HD2900 at 500+ euros, the 8800 GT at almost 300 and the HD3870 at 250, to name a few.

EDIT: I must say I feel like kind of a traitor, as if I were telling a secret that everyone here is exploiting and that should not be told.


----------



## Haz197 (Jun 10, 2008)

well I got my £400 in the bank !!

Back to the red team this time round, yay !!!

So glad ATI have 'what looks like' a good contender, will only get the GTX 280 if 4870 sux ass big time, but I can't see it........

I remember taking back my TI4800 (TI4400 with 8x AGP) cause it wouldn't run Morrowind in 1600x1200 and being blown away by the 9700pro, that was it, a red to the bone !!

TI4800 > 9700pro > 5900XT > X800XTPE > X1950XT AGP > 8800GTS G92 > HD4870 ???

Roll on the 17th !!!!


----------



## Megasty (Jun 10, 2008)

DarkMatter said:


> That's only because of the weak dollar and the purchasing-power difference between the US and the EU, even if you can't generalize across the EU since each country has its own. Anyway, the average across the most advanced countries in the EU, namely the first 15, is higher than in the US. 600 euros are as easy to spend for most people in Europe as $600 is for most in the US. It's a bit harder for us here in Spain, but it's nothing like throwing out $900 is for US people. In Germany, Finland and many others, 600 euros are worth less relative to their purchasing power than $600 is in America.
> 
> Anyway, even if that weren't the case and it was indeed harder on the pocket for us, it's always been like that, meaning the GTX 280 won't be expensive compared to previous generations. According to an article on Tom's Hardware, France was the most expensive country (of those tested) in the world for hardware. In Spain (usually cheaper) the 8800 GTX debuted at 625 euros, the HD2900 at 500+ euros, the 8800 GT at almost 300 and the HD3870 at 250, to name a few.
> 
> EDIT: I must say I feel like kind of a traitor, as if I were telling a secret that everyone here is exploiting and that should not be told.



A weak dollar is one thing, but the cost of living in Europe as a whole is ridiculous compared to the US. Not too long ago I lived in the UK, in a nice 4-room apt. In the States, that translated to a 20-room house. The rent on the apt cost about the same as an Ultra. I'm in a much better financial position now, but I still can't imagine blowing that much cash on a graphics card. Even right now, that stuff costs twice as much in the UK as in the US, which is just plain wrong :shadedshu 

The card is relatively cheaper than its predecessors, but not by much, and that's only one online store. I know the extra cash you're paying goes to all the tariffs placed on Asian products, but at some point you have to say it's just excessive


----------



## HTC (Jun 10, 2008)

DarkMatter said:


> *That's only because of the weak dollar and the purchasing-power difference between the US and the EU, even if you can't generalize across the EU since each country has its own. Anyway, the average across the most advanced countries in the EU, namely the first 15, is higher than in the US. 600 euros are as easy to spend for most people in Europe as $600 is for most in the US. It's a bit harder for us here in Spain, but it's nothing like throwing out $900 is for US people. In Germany, Finland and many others, 600 euros are worth less relative to their purchasing power than $600 is in America.*
> 
> Anyway, even if that weren't the case and it was indeed harder on the pocket for us, it's always been like that, meaning the GTX 280 won't be expensive compared to previous generations. According to an article on Tom's Hardware, France was the most expensive country (of those tested) in the world for hardware. In Spain (usually cheaper) the 8800 GTX debuted at 625 euros, the HD2900 at 500+ euros, the 8800 GT at almost 300 and the HD3870 at 250, to name a few.
> 
> EDIT: I must say I feel like kind of a traitor, as if I were telling a secret that everyone here is exploiting and that should not be told.



Yesterday I was watching TV and they were talking about people earning as little as $38,000 a year and how the gas prices were hitting them hard. Last year I made a little over 17,000 euros (before taxes and social security), and that's well above average here in Portugal.

I make 757 (base) + 25% (shift subsidy for 3 rotating shifts, soon to become 30% for 4 rotating shifts) - IRS - social security, and that comes to around 750 euros per month, which is like $26,000 a year, isn't it?

A few days ago it was said on CNN that, in the US, 11% (or 13%: not sure) of the total price of gas is taxes. In England, 60% is taxes and, if I'm not mistaken, in Portugal it's even higher.

The Spanish truckers are complaining about their gas prices and yet they pay 28 cents per litre *LESS* than in Portugal, and they earn a LOT more than we do: just look @ minimum wages.

Compare the attachments and see the difference in prices: see how much more we have to pay for the same thing 


Back on topic: I really hope the performance ends up being similar so the war continues and the consumer ends up with cheaper cards. We don't need another Blu-ray scenario!!


----------



## DaedalusHelios (Jun 10, 2008)

I have had both at the same time and on par is a bit of a stretch.


----------



## Voltaj .45 ACP (Jun 10, 2008)

yep they're right.this place boring as dead, no hooligans and english.nice.. can i hang out a little in here?if you know a bit another language except eng. then go to that country's harware forums in these days for less stress.so.. 4870 nice 280gtx expensive 4870x2 better than 280gtx but price?nobody knows all details.that's short brief but ignore me keep talking about truckers, portugal, euro gaming enthusiasts..etc


----------



## DaedalusHelios (Jun 10, 2008)

Voltaj .45 ACP said:


> yep they're right.this place boring as dead, no hooligans and english.nice.. can i hang out a little in here?if you know a bit another language except eng. then go to that country's harware forums in these days for less stress.so.. 4870 nice 280gtx expensive 4870x2 better than 280gtx but price?nobody knows all details.that's short brief but ignore me keep talking about truckers, portugal, euro gaming enthusiasts..etc



That's the strangest insult I have ever heard.


----------



## Megasty (Jun 10, 2008)

DaedalusHelios said:


> Thats the strangest insult I have ever heard.



Wow, more like a rant than an insult. But he did wrap up this thread in one sentence


----------



## DarkMatter (Jun 10, 2008)

Megasty said:


> A weak dollar is one thing, but the cost of living in Europe as a whole is ridiculous compared to the US. Not too long ago I lived in the UK, in a nice 4-room apt. In the States, that translated to a 20-room house. The rent on the apt cost about the same as an Ultra. I'm in a much better financial position now, but I still can't imagine blowing that much cash on a graphics card. Even right now, that stuff costs twice as much in the UK as in the US, which is just plain wrong :shadedshu
> 
> The card is relatively cheaper than its predecessors, but not by much, and that's only one online store. I know the extra cash you're paying goes to all the tariffs placed on Asian products, but at some point you have to say it's just excessive



We are going totally off-topic. 

The cost of living in Europe is ridiculous indeed, but I have to say the UK AFAIK is well above the average, and we could say the UK is not the Euro Zone. BUT when it comes to hardware or electronics, prices are not excessive taking into account the average salaries. Yes, we have to exclude Portugal, Greece, the Eastern European countries and, yeah, Spain, as they are below the average while prices are the same (I am from the Basque Country, an "economically independent" region that is above the EU average, though). So it's difficult in those countries to pay that much money, but not in others, and not on average. Last time I checked, the average yearly salary for the EU was 34,000 euros, while at the same time in the US it was $32,000; in Finland it was 47,000 euros and in Germany 42,000, BTW. Those salaries are mitigated by the cost of living as a whole, but that means they can buy fewer houses, eat less, drink less and drive less, BUT they can buy hardware easily, which was my point. Hardware is "cheap", living is expensive. The price of an apartment in Spain is also ridiculous thanks to the housing bubble derived from the introduction of the euro and the rounding of prices. The same happened with pretty much everything, but things were getting better until the global crysis. Basically 1 euro = 166 pesetas (the Spanish currency before the euro), but most businesses did 1 euro = 100 pesetas, so you can see the difference. Important to note that the same didn't happen to salaries, of course.

In the end things are not a lot better in Spain than in Portugal compared to most countries in the EU, even though they are percentually (so to say) better. In my post I was talking about the EU in general, as prices are set for the entire EU with slight differences and are calculated with the average purchaser in mind. As I said, in France hardware is the most expensive in the world! And that shop is particularly expensive; look at its other prices. If previous generations of cards are any indicator, that is, if Nvidia didn't change their pricing policy this time around, 582 euros in that shop in France means the card would cost no more than $600 in the US and 550 euros in Spain, probably even less in Germany. With the possibility of one vendor selling the card well below that mark, as this happens a lot in Europe and is probably related to the fact that 582 euros = 900 dollars. i.e. when I bought my card for 200 euros, the average was above 250 and the next cheapest card I could find was 230...

EDIT:  Haha! I was watching TV and they started talking about the *crisis*, and the moment they mentioned it I realised I had written *crysis* in this post. Just another proof of the ubiquitous presence of Crysis.


----------



## Megasty (Jun 10, 2008)

Here's more NH garb...

http://www.nordichardware.com/news,7845.html


----------



## Laurijan (Jun 10, 2008)

Aren't the GTX 280 and the 9800 GTX single-core GFX cards? If yes, why compare them to a dual-core GFX?


----------



## tkpenalty (Jun 10, 2008)

Megasty said:


> Here's more NH garb...
> 
> http://www.nordichardware.com/news,7845.html



Old news. Moreover, the fact that the RV770 cores aren't running AA through shaders means that AA won't have such an impact on performance. That bit of info they have there *isn't* garbage.


----------



## Megasty (Jun 10, 2008)

tkpenalty said:


> Old news. Moreover, the fact that the RV770 cores aren't running AA through shaders means that AA won't have such an impact on performance. That bit of info they have there *isn't* garbage.



lol, when I said _garb_ I was referring to _info_. The link extends to a bit of info that the 4870X2 won't use the bandwidth-sucking PLX chip, which is possibly the best thing ATi could do for this thing. As for the AFR, I'll believe it when I *don't* see it


----------



## LiveOrDie (Jun 10, 2008)

Well, Nvidia will come up with a faster card a few months after the GTX280, just like the 8800GTX got smacked by the Ultra and then the XXX versions. Nvidia won't lose the fight.


----------



## wolf (Jun 10, 2008)

Laurijan said:


> Aren't the GTX 280 and the 9800 GTX single-core GFX cards? If yes, why compare them to a dual-core GFX?



Exactly.

When compared to Nvidia's dual core, they lose.

All they have is a price argument.


----------



## yogurt_21 (Jun 10, 2008)

Considering 99% of gamers purchase a card based on price/performance, it's actually the ONLY argument that makes sense. $500 for a 4870X2 and $600 for a GTX280 means that if the dual card outperforms it, the GTX280's price/performance ratio is in the negatives. 

So you and your 1% go ahead and buy it lol.
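That ratio argument can be made concrete with the rumoured numbers floating around this thread (the Vantage Extreme scores and launch prices below are speculation from earlier posts, not confirmed figures):

```python
# Price/performance sketch using the rumoured figures from this
# thread: Vantage Extreme X-scores and speculated launch prices.

def perf_per_dollar(score, price):
    """Benchmark points bought per dollar spent."""
    return score / price

hd4870x2 = perf_per_dollar(5500, 500)  # rumoured X5500 at ~$500
gtx280 = perf_per_dollar(4800, 600)    # rumoured X4800 at ~$600

print(hd4870x2 > gtx280)  # -> True: the X2 wins on this metric
```

If either the score gap or the price gap turns out smaller, the metric can flip, which is exactly the point of waiting for real reviews.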


----------



## Darkmag (Jun 10, 2008)

Laurijan said:


> Isnt the GTX 280 the 9800 GTX a single core GFX.. if yes *why compare it to a dual core GFX?*



<VENT>
CAUSE IT DOESN'T F***ING MATTER, GEEZUZ F***ING F**K. I'm seriously getting tired of this kind of mindless thinking. ATI/AMD is trying a *DIFFERENT* approach; stop throwing these sissy fits of ignorance. You'd be singing a different tune if it were nVidia doing it</VENT>

On a less serious note, none of us really knows which of the 2 cards is faster. Hoping for ATI/AMD, but until real benchmarks, hopefully gaming-related, are released, it's all speculation


----------



## JRMBelgium (Jun 10, 2008)

yogurt_21 said:


> considering 99% of gamers pruchase a card based on price/performance



Eeuhm, 99% of gamers don't know **** about GPUs. They think that a 9500GT > 8800GT. We look at price/performance, but for every cranky geek on this forum there are 99 "normal" gamers who really don't know anything about GPUs.

I think that, out of that 1%, 5% buy the fastest thing out there every generation no matter the price, 25% stick with a specific brand (Nvidia or ATI) and upgrade once every 1-2 years, and the rest look at the price/performance ratio.


----------



## Megasty (Jun 10, 2008)

Jelle Mees said:


> Eeuhm, 99% of gamers don't know **** about GPUs. They think that a 9500GT > 8800GT. We look at price/performance, but for every cranky geek on this forum there are 99 "normal" gamers who really don't know anything about GPUs.
> 
> I think that, out of that 1%, 5% buy the fastest thing out there every generation no matter the price, 25% stick with a specific brand (Nvidia or ATI) and upgrade once every 1-2 years, and the rest look at the price/performance ratio.



Just thank the heavens for sites like this one. I see ppl coming here all the time asking what card they should buy, and it's always the price/performance winner being suggested. As long as ppl are still asking, our say won't be in vain. But really, if the card didn't do it, then why are all these goofy sites saying it does? Would the same be said if NV _was_ beating ATi? These 2 companies have been fighting each other forever, and if one ever got on top it wouldn't be there for long. It was true for x1900 vs. 7900 and it's true for 8800 vs. 3800. History will repeat itself forever until a 3rd ringer comes in and breaks this BS up. But I'm not holding my breath


----------



## newconroer (Jun 10, 2008)

TonyStark said:


> GTX 280 will probably have higher minimum frame rates and won't suffer from the micro stutter like multi-GPU setups.




And that, to real gamers, undermines any potential top-end FPS. As always, 150 fps means squat when you enter a battle scene, hitch and stutter, and your card tanks, leaving you with 20 fps.

There's nothing on the market right now, and probably won't be for a while, that even needs the performance these cards are supposedly going to offer.


----------



## wolf (Jun 11, 2008)

As we have seen over the past year or so, Nvidia does not fight a losing battle.

They have stayed very competitive in almost every segment (price/performance). A lot of people here are very quick to assume that this time round they're just going to play it dumb... I don't think so, boys.

"They're starting with their top performer for the time being, knowing that ATI's 4870X2 is still a ways away.

Meaning if you want the best performing card on the market out of both camps' current next gens, you'll have to have a ton of disposable income, or be patient enough to wait until the 70X2 rolls out.

If the 4870X2 comes out and nVidia still tromps it, the price will stay up; if the 70X2 performs equal or better, nVidia will lower their prices to compete."

I think that's fairly accurate: they'll get it out first and make a heap of sales (price/performance aside, it will be king of the hill; I don't think anybody's saying a 4870 will be faster), then when the X2 comes along, they'll see how she fares against them and price her accordingly. All marketing decisions.


----------



## yogurt_21 (Jun 11, 2008)

See, therein lies the problem: the GTX280 is too expensive to produce, meaning that if Nvidia lowers its price, they lose most of their profit margin (which at the moment is really slim). ATI is in a much better position to lower the price of their chip, as it is cheaper to produce. So Nvidia can lower the price of the GTX280, but only by a little bit, while ATI can go even lower. It's not that we're counting Nvidia out, it's that we're being realistic based on the specs provided. The GTX280 is a monster chip, and because of that it will always sell at a monster-chip price (8800 Ultra, anyone?).
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130346

Over a year after its debut and still $500. The GTX280 will be the same; it's too expensive to produce. If anything, Nvidia can be competitive with a half-memory version of the GTX260, but the GTX280 will always be a monster card. It'll have the performance, sure, but enough to beat a dual card? Dunno; most of us seem to think not. But hey, it happens: the 7800 series didn't compete well with the X1800XT, so Nvidia launched a 512 MB monster in reply, in limited quantities. It turned out to be a pre-release version of the 7900GTX, and that's more likely to happen here. If the GTX280 is struggling, Nvidia will simply bump up the launch of its revision (which Fuad says is already taped out); the revision will have a smaller die and could easily be lowered in price while increasing performance over the GTX280.

Nvidia lately hasn't been known for lowering prices when theirs is too expensive per performance; they simply launch a revision that has a better price/performance ratio.


----------



## Jansku07 (Jun 11, 2008)

> HD4870 is 30-50% faster than the 9800GTX and the HD4850 25-50% faster than the 8800GT; thus (minus marketing) the HD4850 is ~10% faster than the 9800GTX and the HD4870 ~30% faster than the 9800GTX.



Source: w0mbat @ b3d


----------



## EastCoasthandle (Jun 11, 2008)

Source


> 3D 06 1280x1024
> 
> GTX 280 -* 14700*
> 
> ...



Taken with just a pinch of salt.


----------



## Exceededgoku (Jun 11, 2008)

He's a very reliable guy...


----------



## Megasty (Jun 11, 2008)

Jansku07 said:


> Source: w0mbat @ b3d



Oh god now ATi's doing the same thing. I already used up all the salt I had for these cards


----------



## TooFast (Jun 11, 2008)

Nvidia is in trouble this time. ATI is taking a much better approach towards gamers.


----------



## Joe Public (Jun 11, 2008)

I'll believe which card is faster on the official release day, preferably after drivers mature a bit. Speculation always makes my head spin.   And 3DMark scores alone never did and never will do it for me.   As for manufacturers' benchmarks, I've sometimes found them a bit optimistic.


----------



## Darren (Jun 12, 2008)

Jansku07,
LOL. The benchmark is obviously fake. I have evidence too.

*Look at the first picture: the 9800 GTX (green bar) performs the same as the 8800 GT (green bar) in the second picture.*



zanex said:


> Hope you're kidding with that one...



Scroll up to post #117 and look at the two charts yourself, and tell me if I am kidding.


----------



## zanex (Jun 12, 2008)

Darren said:


> Jansku07,
> LOL. The benchmark is obviously fake. I have evidence too.
> 
> *Look at the first picture the 9800 GTX (green bar) performs the same as the 8800 GT (green bar) in the second picture.*



Hope you're kidding with that one...


----------



## farlex85 (Jun 12, 2008)

Darren said:


> Jansku07,
> LOL. The benchmark is obviously fake. I have evidence too.
> 
> *Look at the first picture the 9800 GTX (green bar) performs the same as the 8800 GT (green bar) in the second picture.*



Those are actually comparison graphs: the Nvidia card is the baseline, set to 1 across the board in each bench, so the two Nvidia cards can't be compared to each other between the graphs. Those numbers seem reasonable enough, but we shall just have to wait and see.......


----------



## HTC (Jun 12, 2008)

Well, I found and posted this (with a few modifications) on another thread. Dunno if it fits in this one but, since we're talking about how good / bad a 4850 is, then ...







Source (translated with google): www.publish.it168.com but originally found @ PCDig@ (Portuguese site).

*Check the source (publish.it168): it's worth it!*

There's also VANTAGE results as well, in the other pages of the source!







Personally, I would like to see a comparison of both 3D06 and Vantage results @ higher settings against a 9800GX2, as I believe this card (4850) scores lower @ lower resolutions but higher @ higher resolutions.


----------



## broke (Jun 12, 2008)

uhm, they might be fake, but the fact that the green bars are always at 1 doesn't make them fake. It just means that the 4870 is being compared relative to the 9800gtx, and in the second graph to the 8800gt. At no point is the 8800gt being compared to the 9800gtx.

edit: lol farlex beat me to it


----------



## erocker (Jun 12, 2008)

For reference.  My score:
Q6600 @ 3.2ghz = 11789 CPU
3870 @ 850/1200 = 4242
Total score P5050

4850 is a heck of a step up!
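erocker's total lines up with a weighted harmonic mean of the two sub-scores. A minimal sketch of that combination; the 0.75/0.25 GPU/CPU weighting is inferred from the numbers above, not taken from Futuremark's documentation:

```python
def vantage_overall(gpu_score, cpu_score, w_gpu=0.75, w_cpu=0.25):
    """Weighted harmonic mean of the GPU and CPU sub-scores."""
    return 1.0 / (w_gpu / gpu_score + w_cpu / cpu_score)

# erocker's numbers from the post above:
print(round(vantage_overall(4242, 11789)))  # → 5050, matching the reported P5050
```

Note how heavily the GPU sub-score dominates: a harmonic mean punishes the lower of the two numbers, which is why a GPU upgrade moves the total so much more than a CPU overclock.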


----------



## HTC (Jun 12, 2008)

erocker said:


> For refference.  My score:
> Q6600 @ 3.2ghz = 11789 CPU
> 3870 @ 850/1200 = 4242
> Total score P5050
> ...



Mine was this (and it won't change, since I've removed my OC with the exception of the RAM OC):

http://img.techpowerup.org/080502/Vantage.png

About the 3D06 scores in the link (source) of my previous post: with my CPU OCed to 3536 but *not* my GPU, @ 1280x1024 I got slightly over 10K (don't remember the exact number) and, if you check that link, you'll see that the 4850 does nearly 10.5K @ 1920x1200: in my book, that's a *HUGE* step up!!!


----------



## HTC (Jun 12, 2008)

*Has this been posted?*

Translated with google: Tom's Hardware.CN


----------



## yogurt_21 (Jun 12, 2008)

Darren said:


> Jansku07,
> LOL. The benchmark is obviously fake. I have evidence too.
> 
> *Look at the first picture the 9800 GTX (green bar) performs the same as the 8800 GT (green bar) in the second picture.*
> ...



Are all the users of TPU unable to read graphs? The graphs are not that hard to read. In the first you'll see the 9800gtx used as a benchmark (seriously, people, look the word up: it DOES NOT mean 3DMarks, I mean wow), meaning its performance will always be 1. So if it got 40 frames in game 1 and 20 frames in game 2, its performance on the graph would still be 1!! Meaning that if the 9800gtx got 40 frames in Crysis, the 4870 would have 62 frames.

The second graph shows the same type of comparison between the 4850 and the 8800gt, meaning that if the 8800gt got 30 frames in game 1 and 20 frames in game 2, its performance would always be 1!!!

*IT DOES NOT MEAN THAT THE 8800GT IS PERFORMING THE SAME AS THE 9800GTX. THE GRAPHS ARE SEPARATE FOR A REASON.*

I seriously think we have a problem here. Maybe we should have a whole class on how to read graphs or something. Based on the responses in all these threads with graphs, we'll have to make it mandatory for forum membership. lol
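For anyone still tripping over it, the normalization yogurt_21 describes is just dividing every card's fps by the baseline card's fps, game by game. A quick sketch with made-up fps numbers (only the 40 → 62 Crysis pair comes from the post above):

```python
def normalize(results, baseline):
    """Rescale per-game fps so the baseline card reads 1.0 in every game."""
    base = results[baseline]
    return {card: {game: fps / base[game] for game, fps in games.items()}
            for card, games in results.items()}

# Hypothetical fps numbers, purely to illustrate the chart style:
fps = {
    "9800 GTX": {"Crysis": 40.0, "Game 2": 20.0},
    "HD 4870":  {"Crysis": 62.0, "Game 2": 25.0},
}

rel = normalize(fps, "9800 GTX")
print(rel["9800 GTX"])  # → {'Crysis': 1.0, 'Game 2': 1.0}, always 1 whatever the raw fps
print(rel["HD 4870"])   # → {'Crysis': 1.55, 'Game 2': 1.25}
```

The baseline flattens to 1.0 in every game, which is exactly why the 9800 GTX and 8800 GT bars look identical across the two separate charts.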


----------



## Megasty (Jun 12, 2008)

It's obvious why the graphs compare the RV770 Pro & XT to the 8800gt & 9800gtx respectively: those two cards are priced to be competitive with the NV counterparts. Not only that, but they completely destroy them at high res with AA; they're not even in the same league as the NV cards. If those charts are true the 4870 is slightly faster than the 9800gx2, which is nucking futs  The costs of these things are ridiculous compared to the performance


----------



## DarkMatter (Jun 12, 2008)

Megasty said:


> Its obvious why the graphs are comparing the RV770 pro & xt to the 8800gt & 9800gtx respectively. Those two cards are priced to be competitive with the NV counterparts. Not only that but they completely destroy them at high res with AA. Its even true that they're not in the same league with the NV cards. If those charts are true the 4870 is slightly faster than the 9800gx2 which is nucking futs  The costs of these things are ridiculous compared to the performance



The HD4850 is listed at 169 euros; that's the only price I have seen so far. You can get an 8800 GT NOW for as low as 110 euros and an 8800 GTS for 175. The 9600 GT (not too far off performance-wise) can be found well below 100 euros. Now the 9800 GTX is more expensive (220), but it's that high because it has no competition. Same happens with the 8800 GTS, because its only competition is Nvidia's own 9800 GTX. This way profits are higher, common business: Nvidia asks for a bit more money for the chip, partners up the price proportionally, intermediaries and retailers do the same. The 9800 GTX price will go down drastically because of this, and the 9800 GT WILL BE even cheaper than the 8800 GT, because it's a 55 nm chip and AFAIK will also use a simpler PCB. That would put the 9800 GT around 100 euros if not lower, and I don't see the HD4850 near those prices anytime soon.

Also if you want to check my opinion on the charts and want another place to flame me (not that I consider you have done it so far) check this thread :

http://forums.techpowerup.com/showthread.php?t=62698


----------



## Darren (Jun 12, 2008)

yogurt_21 said:


> Are all the users of TPU unable to read graphs? The graphs are not that hard to read. In the first you'll see the 9800gtx used as a benchmark (seriously, people, look the word up: it DOES NOT mean 3DMarks, I mean wow), meaning its performance will always be 1. So if it got 40 frames in game 1 and 20 frames in game 2, its performance on the graph would still be 1!! Meaning that if the 9800gtx got 40 frames in Crysis, the 4870 would have 62 frames.
> 
> The second graph shows the same type of comparison between the 4850 and the 8800gt, meaning that if the 8800gt got 30 frames in game 1 and 20 frames in game 2, its performance would always be 1!!!
> 
> ...



I'm fully aware of how to read the graph, thank you. My point was that any other type of graph or comparison could have been used, but they decided to *CHOOSE* a method of measurement which makes the 4000-series look like it's miles better, and hence why many of us are suspicious about the accuracy of the benchmark, given the extra effort to make the readings as ambiguously in favour of the 4000-series as possible.

They could have had a straightforward bar chart with the frames per second for each card listed on the Y axis, but they didn't, which has led me and many others to doubt the accuracy, legitimacy, and integrity of the benchmarks.


----------



## yogurt_21 (Jun 12, 2008)

Darren said:


> I'm fully aware of how to read the graph, thank you. My point was that any other type of graph or comparison could have been used, but they decided to *CHOOSE* a method of measurement which makes the 4000-series look like it's miles better, and hence why many of us are suspicious about the accuracy of the benchmark, given the extra effort to make the readings as ambiguously in favour of the 4000-series as possible.
> 
> They could have had a straightforward bar chart with the frames per second for each card listed on the Y axis, but they didn't, which has led me and many others to doubt the accuracy, legitimacy, and integrity of the benchmarks.



Uh huh, your post #124 begs to differ. I suggest you either edit that one, or edit the claim that you know how to read the graph out of this one. And FYI, nvidia did it first with the gtx260 and gtx280 vs the 3870x2.


----------



## Megasty (Jun 12, 2008)

DarkMatter said:


> The HD4850 is listed at 169 euros; that's the only price I have seen so far. You can get an 8800 GT NOW for as low as 110 euros and an 8800 GTS for 175. The 9600 GT (not too far off performance-wise) can be found well below 100 euros. Now the 9800 GTX is more expensive (220), but it's that high because it has no competition. Same happens with the 8800 GTS, because its only competition is Nvidia's own 9800 GTX. This way profits are higher, common business: Nvidia asks for a bit more money for the chip, partners up the price proportionally, intermediaries and retailers do the same. The 9800 GTX price will go down drastically because of this, and the 9800 GT WILL BE even cheaper than the 8800 GT, because it's a 55 nm chip and AFAIK will also use a simpler PCB. That would put the 9800 GT around 100 euros if not lower, and I don't see the HD4850 near those prices anytime soon.
> 
> Also if you want to check my opinion on the charts and want another place to flame me (not that I consider you have done it so far) check this thread :
> 
> http://forums.techpowerup.com/showthread.php?t=62698



Naw, it's all good 

I can get a GTS over here right now for about $200 & a GTX for $300. It would have been better for AMD to compare the 4850 with the GTS; it would have been just as impressive. I'm sure NV has something in the works at that price point to combat these cards, but right now they don't. Too bad I've spoiled myself with the highest performance I can get. The only game that needs all that power is Crysis  I'll play with whatever NV comes up with to nix these cards, but I don't think it'll be the $150 GT200 chip. They'll already be losing money on the thing if they sell it for $500  But NV & ATi make their money on low to mid-range cards anyway.


----------



## DarkMatter (Jun 12, 2008)

Megasty said:


> Naw, its all good
> 
> I can get a GTS over here right now for about $200 & a GTX for $300. It would have been better for AMD to compare the 4850 with the GTS. It would have been just as impressive. I'm sure NV has something in the works at that price point to combat these cards but right now they don't. Too bad I've spoiled myself with the highest performance I can get. The only game that needs all that power is Crysis  I'll play with whatever NV comes up with to nix these cards but I don't think I'll be the $150 GT200 chip. They'll already be losing money on the thing if they sell it for $500  But NV & ATi make their money on low to mid-range cards anyway.



They won't lose money, be sure of that. They just won't repeat the financial success of the last 2 years, and that's all. Anyway, who cares if they lose money on that chip? They can't keep growing forever, but they won't FAIL as many seem to think, nor have they screwed everything up. If the card is fast and is priced at a good point, better for us.


----------



## yogurt_21 (Jun 12, 2008)

The 9800gtx remains on the chart for the gt200 release, so I agree that it will likely take over the 8800gt price point. In fact, if the 9600gt stays around, this will be a great hardware year for everyone, as the 9600 will likely drop to 8600gt prices, granting a performance ratio that hasn't been seen since the GeForce4 Ti series (the GeForce4 Ti 4200 64MB was dirt cheap and played every game out at a decent detail setting).

All in all it seems like a win for the customers this launch. Now imagine what happens if ATI does happen to be competitive this time: prices get even better.


----------



## Darren (Jun 12, 2008)

yogurt_21 said:


> Uh huh, your post #124 begs to differ. I suggest you either edit that one, or edit the claim that you know how to read the graph out of this one. And FYI, nvidia did it first with the gtx260 and gtx280 vs the 3870x2.



I'm not editing a thing. Post #124 merely states the 8800 GT and 9800 GTX share the same results. You missed my sarcasm, which implied the results were deliberately displayed in an unconventional manner to show a greater margin between Nvidia's and ATI's cards. The conventional method of presentation would be FPS on the Y axis and a list of cards on the X axis; the graph ignores this convention.


----------



## yogurt_21 (Jun 12, 2008)

Darren said:


> I'm not editing a thing. Post #124 merely states the 8800 GT and 9800 GTX share the same results. You missed my sarcasm, which implied the results were deliberately displayed in an unconventional manner to show a greater margin between Nvidia's and ATI's cards. The conventional method of presentation would be FPS on the Y axis and a list of cards on the X axis; the graph ignores this convention.



Your sarcasm needs A LOT of work if it truly was sarcasm. And as I stated before, nvidia did it first, placing the 3870x2 at 1 with the gtx260 and gtx280's performance listed above. It's called marketing and it's nothing new; both companies do it.


----------



## Darren (Jun 12, 2008)

yogurt_21 said:


> I stated before nvidia did it first, placing the 3870x2 at 1 with the gtx260 and gtx280's performance listed above. It's called marketing and it's nothing new; both companies do it.



I think it's below-the-belt marketing, and if these benchmarks are from ATI then I've lost a lot of respect for ATI. At first glance fanboys are going to see the huge gap between the cards and assume the 4000 series is up to 25% faster, without rationalizing that it could translate into only a few frames' difference. E.g. Crysis is just under 25% faster on the 4870 than on the 9800 GTX, but fanboys forget that if the 4870 gets 40 FPS the 9800 GTX gets 30 FPS, which in reality isn't very much separation compared to what the graph conveys. This might entice fanboys to get the 4870 on product release expecting a huge performance increase, whereas they might not have made the purchase if they knew what the frame rate translation was.

Then again, if these statistics are genuine it means ATI are marketing geniuses, as they have people arguing over their product across many threads spanning many pages, as well as in many other forums, and changing their alliances.


----------



## JRMBelgium (Jun 13, 2008)

I think everyone agrees when I say:
"we all benefit if ATI kicks Nvidia's butt"

Nvidia has been ripping people off with the 9 series for too freakin' long. I really hope they get a good kick in the nuts so that they're forced to lower their prices.


----------



## Megasty (Jun 13, 2008)

Darren said:


> I think it's below-the-belt marketing, and if these benchmarks are from ATI then I've lost a lot of respect for ATI. At first glance fanboys are going to see the huge gap between the cards and assume the 4000 series is up to 25% faster, without rationalizing that it could translate into only a few frames' difference. E.g. Crysis is just under 25% faster on the 4870 than on the 9800 GTX, but fanboys forget that if the 4870 gets 40 FPS the 9800 GTX gets 30 FPS, which in reality isn't very much separation compared to what the graph conveys. This might entice fanboys to get the 4870 on product release expecting a huge performance increase, whereas they might not have made the purchase if they knew what the frame rate translation was.
> 
> Then again, if these statistics are genuine it means ATI are marketing geniuses, as they have people arguing over their product across many threads spanning many pages, as well as in many other forums, and changing their alliances.



I have to agree with you about the below-the-belt marketing. I nearly cracked my mouse by squeezing it too hard when I saw those charts. I was like _'first NV, now ATI pulls the same shit'_ :shadedshu They did include some settings instead of leaving us completely in the dark like NV. However, the NV charts made the 3870x2 look like a POS to the _untrained_ eye. Uninformed ppl are easily moved by this propaganda. We are not, at least most of us aren't, and that's all that matters.


----------



## Hayder_Master (Jun 13, 2008)

cute


----------



## btarunr (Jun 13, 2008)

> While we are away for Computex, CJ let us know that the Radeon HD 4870 X2 R700 prototype card is out and it beats the NVIDIA GeForce GTX 280 *in 3DMark Vantage*. The R700 card is basically made up of two RV770 GPUs with 2x256-bit memory interface to either GDDR3 or GDDR5 memories. We asked our sources about R700 in Computex and apparently AMD is going to let AIB partners to decide the specs themselves. Therefore, the partners will set their own clock speeds, PCB design, memory types, cooler solutions etc. and there will be Radeon HD 4850 X2 and 4870 X2 cards differentiate by the memory type. The R700 card apparently doing pretty well at this stage scoring about X5500 in 3DMark Vantage Extreme preset while the GeForce GTX 280 card is scoring X4800. Both sides are still working hard on optimizing their drivers for the new architecture so probably we will see the performance to improve over time.



What's new? ATI cards normally score high with synthetic tests.



hayder.master said:


> cute



What?.......rather, who?



tkpenalty said:


> Nvidia cannot counter this as the GTX280's design makes it impossible to have two of the cores in close proximity without something like a water block in use...



For now, maybe. But I'm sure there is a fat-free GPU on the drawing boards (just as what G92 was to the G80). There could be a GPU that's practical for dual-GPU cards.


----------



## Hayder_Master (Jun 13, 2008)

Anyone have details about the HD4850, 4870, 4870X2... everything?


----------



## btarunr (Jun 13, 2008)

hayder.master said:


> Anyone have details about the HD4850, 4870, 4870X2... everything?



Yes, Wikipedia says: http://en.wikipedia.org/wiki/Radeon_R700

Google Says: http://www.google.co.in/search?hl=en&q=ATI+R700&btnG=Google+Search&meta=


----------



## Nitro-Max (Jun 13, 2008)

lol, I've given up on these threads now. People always flame each other over their personal tastes; a few months from now you'll be doing the same again over the next-gen cards. It gets stupid and boring to view after a few years, and personal opinion starts to count more than other people's opinions. Buy what you're going to buy, that's what I say; if you have cash to throw away, then do it. Personally I prefer ATI's image quality and pricing vs performance, no matter what anyone says.

Currently playing Race Driver: GRID on ultra high settings on my 3870x2. Runs smooth as hell; that'll do me for now.


----------



## yogurt_21 (Jun 13, 2008)

Darren said:


> I think it's a below the belt method marketing and if these benchmarks are from ATI then I've lost a lot of respect for ATI. At first glance fanboys are going to see the huge gap between the cards and assume the 4000 series is upto 25% faster without rationalizing that it could only translate into only few frame rate difference. e.g Crysis is just under 25% faster on the 4870 than the 9800 GTX but fanboys forget that if the 4870 gets 40 FPS the 9800 GTX gets 30 FPS which in reality isn't very much separation in comparison to what the graph conveys. This might entice fanboys to get the 4870 on product release as they expected a huge performance increase whereas they might not of made the purchase if they knew what the frame rate translation was.
> 
> Then again if these statistics are genuine it means ATI are marketing geniuses as it has people arguing over their product over many threads spanning many pages as well as in many other forums and changing their alliances.



all this talk about fanboys and you still fail to recognize that *nvidia did it first*!

I mean seriously you're only mentioning how it's bad for ati to do it? only an extreme fanboy would say it's ok for nvidia to do it, but not for ati.


----------



## Megasty (Jun 13, 2008)

Nitro-Max said:


> lol, I've given up on these threads now. People always flame each other over their personal tastes; a few months from now you'll be doing the same again over the next-gen cards. It gets stupid and boring to view after a few years, and personal opinion starts to count more than other people's opinions. Buy what you're going to buy, that's what I say; if you have cash to throw away, then do it. Personally I prefer ATI's image quality and pricing vs performance, no matter what anyone says.
> 
> Currently playing Race Driver: GRID on ultra high settings on my 3870x2. Runs smooth as hell; that'll do me for now.





Lets toast to the never-ending cycle of.... 

BTW, I just got around to finishing the world races in GRID & all I can say is damn, those Ravenwest AIs can drive


----------



## imperialreign (Jun 13, 2008)

yogurt_21 said:


> all this talk about fanboys and you still fail to recognize that *nvidia did it first*!
> 
> I mean seriously you're only mentioning how it's bad for ati to do it? only an extreme fanboy would say it's ok for nvidia to do it, but not for ati.



agreed - although it's in slightly poor taste for ATI to pull the same stunt, and they don't typically resort to such measures, it's the only way they'd be able to keep the interest of the n00bz that fell for nVidia's propaganda.  Like was mentioned, at least ATI threw some system specs out as well, instead of just waving a majik wand over a spreadsheet and saying "wiki-wiki hoodoo-voodoo."


----------



## Darren (Jun 13, 2008)

yogurt_21 said:


> all this talk about fanboys and you still fail to recognize that *nvidia did it first*!
> 
> I mean seriously you're only mentioning how it's bad for ati to do it? only an extreme fanboy would say it's ok for nvidia to do it, but not for ati.



I'm far from a fanboy; I've used both ATI and Nvidia and have no preference except for my wallet. In the last few years I've used the ATI 9600 Pro, X1600 Pro and recently the Nvidia 9600 GT, all of which were the best bang for the buck upon purchase, and I have no regrets or alliances.

I don't care whether Nvidia started marketing with dodgy graphs first or not; it doesn't mean ATI has to jump on the bandwagon. Secondly, I lost some respect when Nvidia did it too.

- When graphics card companies have to mislead customers into purchases, the only person that loses is the customer. Two wrongs do not make a right.


----------



## farlex85 (Jun 13, 2008)

Who said they even did it? How do you guys infer the companies themselves are responsible? Someone completely independent of those companies (a fanboy perhaps, or an insider looking to stir up mayhem) could have fabricated them. It doesn't really make sense to call a graph fake and then say the companies intentionally made it.


----------



## Nitro-Max (Jun 14, 2008)

Darren said:


> I'm far from a fanboy; I've used both ATI and Nvidia and have no preference except for my wallet. In the last few years I've used the ATI 9600 Pro, X1600 Pro and recently the Nvidia 9600 GT, all of which were the best bang for the buck upon purchase, and I have no regrets or alliances.
> 
> I don't care whether Nvidia started marketing with dodgy graphs first or not; it doesn't mean ATI has to jump on the bandwagon. Secondly, I lost some respect when Nvidia did it too.
> 
> - When graphics card companies have to mislead customers into purchases, the only person that loses is the customer. Two wrongs do not make a right.



Very well said. I too lost some trust in Nvidia over their driver scam; we're going back to around 2002 now. I'm not knocking their cards, far from it, and I'm not a fanboi either. I too would rather get the best buy than spend huge amounts on a few extra fps, because sometimes that's all it is and it just ain't worth the extra money.

Just because Nvidia spends loads on advertising in games etc. doesn't mean to say they are and always will be top.

Norton antivirus did the same mass advertising and bundle deals with motherboard makers, businesses etc., and at the end of it all they still aren't the best antivirus software out there. I rate Panda, F-Secure and NOD32 over it any day, simply because Norton updated once and still failed to find 3 viruses on my system, whereas F-Secure found them right away and disabled them.


----------



## MrHydes (Jun 14, 2008)

I just don't see a single RV770 beating GT200...

even with this 800 stream processors story!


----------



## JRMBelgium (Jun 14, 2008)

My speculations:
4850 = 3870X2
4870 = Nvidia's 260
4870X2 = Nvidia's 280

But ATI WILL HAVE the BETTER PRICE/PERFORMANCE ratio; I am 99.99% sure of that. Nvidia knows they will automatically sell better: they already own a huge part of the market, and all the latest best games have Nvidia ads in them. Don't take "sell better" the wrong way; I'm pretty sure ATI will gain ground this year.


----------



## btarunr (Jun 14, 2008)

Jelle Mees said:


> My speculations:
> 4850 = 3870X2
> 4870 = Nvidia's 260
> 4870X2 = Nvidia's 280




So you speculate GTX 280 to be twice as fast as a GTX 260?

Here's the truth:

Pic removed.

Now, draw your conclusions, that chart is legit.

As for price, word is that NV is bringing it to $599 for GTX 280, $499 for 260.


----------



## Nitro-Max (Jun 14, 2008)

btarunr said:


> So you speculate GTX 280 to be twice as fast as a GTX 260?
> 
> Here's the truth:
> 
> ...




That's what I'm talking about: another $100 for just over 5 fps difference is stupid. And not everyone's guaranteed those results; it depends on other system specs.

But don't just use Crysis as a decision maker; there are other games that would benefit more from the $100 difference. It's just down to you to decide if it's all worth it.


----------



## Hayder_Master (Jun 15, 2008)

> What?.......rather, who?

Sure, the 'who' is in the title: the winner, ATI.


----------



## Hayder_Master (Jun 15, 2008)

btarunr said:


> Yes, Wikipedia says: http://en.wikipedia.org/wiki/Radeon_R700
> 
> Google Says: http://www.google.co.in/search?hl=en&q=ATI+R700&btnG=Google+Search&meta=





Really, I want in-depth details, just like this one:

http://www.vr-zone.com/articles/H1_...AH3870X2_and_XFX_GeForce_9800_GX2/5766-1.html


----------



## neo1231 (Jun 15, 2008)

When you think about it, you can only put two 4870X2s together, but you can put three GTX280s together. I'd like to see someone compare that!


----------



## DarkMatter (Jun 15, 2008)

hayder.master said:


> realy i want depth details , just like this one
> 
> http://www.vr-zone.com/articles/H1_...AH3870X2_and_XFX_GeForce_9800_GX2/5766-1.html



I wouldn't desire such "depth details" considering they are *wrong*!

9800GX2 has 16 ROPS and not 24.


----------



## MrHydes (Jun 15, 2008)

> GTX 280 benchmark
> 
> 
> 
> ...


----------



## EnergyFX (Jun 15, 2008)

I think the point here (and pretty much always has been) is comparing flagship to flagship.  The number of cores is irrelevant if a single-core card is the best thing nVidia has to offer in their line-up.

If ATI's dual-core flagship walks on nVidia's single-core flagship, then it's a fair call for ATI to get to carry the 'top performer' torch.  

If the simple question is 'who has the fastest card', then it's a simple question with a simple benchmarked answer.  Price only comes into play if the question is 'who has the best performance/value card'... which is not a simple question.


----------



## Hayder_Master (Jun 15, 2008)

DarkMatter said:


> I wouldn't desire such "depth details" considering they are *wrong*!
> 
> 9800GX2 has 16 ROPS and not 24.




That is just what I was talking about: how does the 9800GX2 have 24 ROPs? Because I think the ATI 3870X2 is too weak.
Thanks man, can you give me a trusted site that has this comparison?


----------



## Selene (Jun 15, 2008)

Sigh. The only way to compare: flagship is, and always has been, the best card each maker offers vs the other's.
Now we all know what is going to happen: ATI makes a good card at a good price, but NV makes a better card and charges way too much, and people buy them anyway.
280 GX2 > everything
GTX280 > 4870X2
GTX260 > 4870
9800GT > 4850
You can say what you want, but price means nothing in this war; it never has. People buy the best they can, when they can, and 5 fps makes all the difference in the world when it comes to bragging on the net.


----------



## raptori (Jun 15, 2008)

Well, I will wait for the 9800GT; I think there are always some surprises that come with the GT.


----------



## Wile E (Jun 15, 2008)

Meh. I'm not buying any of these benches until the cards are actually released. Until then, it's all FUD as far as I'm concerned.


----------



## btarunr (Jun 15, 2008)

Alright, let me join this speculators' orgy.
HD4870 > 9800GTX 
GTX 260 > HD4870, but it costs $499, so I'll throw it out of the window. Even if an HD4870 performs only 80% as well as a GTX 260, at a price of ~$399 the HD4870 wins.

HD3850 512M is known to be 75~80% as fast as HD3870. If it's the same case with HD4000, you can consider 9800 GT lost.
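The reasoning above is a straight performance-per-dollar comparison. A minimal sketch using the speculative figures from this thread (GTX 260 at $499 as the 1.0 reference, HD4870 at 80% of its performance for ~$399); none of these prices are confirmed:

```python
def perf_per_dollar(relative_perf, price):
    """Performance per dollar, with performance relative to a reference card."""
    return relative_perf / price

gtx260 = perf_per_dollar(1.00, 499)  # reference card at the rumoured $499
hd4870 = perf_per_dollar(0.80, 399)  # 80% of the performance at a rumoured ~$399

# 0.80 * 499 = 399.2 > 399, so the HD4870 still edges ahead on perf per dollar
print(hd4870 > gtx260)  # → True
```

At those exact numbers it's a near tie; every extra percent of performance or dollar of price cut swings it, which is why the launch prices matter so much here.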


----------



## raptori (Jun 15, 2008)

Consider that the GT series has always been powerful and cheap.


----------



## raptori (Jun 15, 2008)

55nm 'GeForce 9800 GT' launch set for July

http://techreport.com/discussions.x/14546


----------



## KainXS (Jun 15, 2008)

The 9800GT is gonna be a bust: a rebranded 8800GT with only slightly better performance and a die shrink. Considering that, and the fact that it will only have slightly higher clocks even if you were to overclock it, it would never outperform an HD48XX, seeing how they nearly outperform the GTX at stock.


----------



## kylew (Jun 15, 2008)

imperialreign said:


> well - if R700 does turn out to be a dual-core GPU, then technically, it still counts as only one GPU on the board . . .
> 
> no one is sure, yet, as to the R700 specs, as we haven't heard anything 100% reliable or concrete



We know that the R700 is dual GPUs, there were leaked pictures of its heatsink a while back now, I can't remember where they were from though.


----------



## MrHydes (Jun 15, 2008)

> *GeForce GTX 280: The official pictures leaked*
> 
> 
> 
> ...



I rather like this one...


----------



## MrHydes (Jun 15, 2008)

> review
> 
> 
> 
> ...



See: with AAx4 AFx1 at 1900x1200 VERY HIGH, *29.6*

and AAx4 AFx1 at 1600x1200 VERY HIGH, *23.9*

Not bad for green drivers with some filters on.


----------



## wolf (Jun 15, 2008)

Watch out ATi: it's looking like the 177.26 drivers are increasing performance by roughly 15% on any card with the unified architecture.

This will make the 9800GTX and 8800GT more competitive against the 48xx cards for sure! Coupled with a small price drop, awesome value! Might get another 9800GTX and go SLI....


----------



## AsRock (Jun 15, 2008)

I know it's from the INQ, lol. However, I thought it was interesting:
http://www.theinquirer.net/gb/inquirer/news/2008/06/14/complete-gtx280-scores-here


----------



## wolf (Jun 15, 2008)

Notice that in that test the 9800GTX uses the 175.16 drivers, while my 9800GTX has picked up approximately 15% across the board on the 177.26s.


----------



## Megasty (Jun 15, 2008)

wolf said:


> notice in that test the 9800GTX uses 175.16's, and my 9800GTX has picked up approximately 15% across the board on 177.26's



Damn, that looks like a pretty good pickup after that silly rebate. In a few weeks it could be that price w/o the rebate, making it a true mid-range card


----------



## HTC (Jun 15, 2008)

wolf said:


> *Watch out ATi: it's looking like the 177.26 drivers are increasing performance by roughly 15% on any card with the unified architecture.*
> 
> This will make the 9800GTX and 8800GT more competitive against the 48xx cards for sure! Coupled with a small price drop, awesome value! Might get another 9800GTX and go SLI....



Could this be why W1zzard pulled the results, saying they *were not final*?


----------



## EastCoasthandle (Jun 15, 2008)

Hellgate: Street
All figures in fps; the "DX9 vs DX10 drop" rows show how much each card loses going from DX9 to DX10.

| Setting | GTX 280 | 9800 GX2 | 280 vs GX2 |
|---|---|---|---|
| 1680x8 DX9 | 174.20 | 139.80 | 34.40 |
| 1680x8 DX10 | 76.00 | 45.80 | 30.20 |
| *DX9 vs DX10 drop* | *98.20* | *94.00* | *4.20* |
| 1600x4 DX9 | 172.70 | 123.90 | 48.80 |
| 1600x4 DX10 | 57.80 | 31.10 | 26.70 |
| *DX9 vs DX10 drop* | *114.90* | *92.80* | *22.10* |
| 1920 DX9 | 118.00 | 103.60 | 14.40 |
| 1920 DX10 | 50.00 | 26.50 | 23.50 |
| *DX9 vs DX10 drop* | *68.00* | *77.10* | *9.10* |
| 2560 DX9 | 79.10 | 47.10 | 32.00 |
| 2560 DX10 | 43.10 | ??.?? | ??.?? |
| *DX9 vs DX10 drop* | *36.00* | *??.??* | *?.??* |


World in Conflict

| Setting | GTX 280 | 9800 GX2 | 280 vs GX2 |
|---|---|---|---|
| 1680x8 DX9 | 64.00 | 52.00 | 12.00 |
| 1680x8 DX10 | 50.00 | 39.00 | 11.00 |
| *DX9 vs DX10 drop* | *14.00* | *13.00* | *1.00* |
| 1600x4 DX9 | 62.00 | 49.00 | 13.00 |
| 1600x4 DX10 | 47.00 | 33.00 | 14.00 |
| *DX9 vs DX10 drop* | *15.00* | *16.00* | *1.00* |
| 1920 DX9 | 54.00 | 36.00 | 18.00 |
| 1920 DX10 | 34.00 | 20.00 | 14.00 |
| *DX9 vs DX10 drop* | *20.00* | *16.00* | *4.00* |

Interesting...
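For anyone who wants to replay the arithmetic, here is a minimal Python sketch over the GTX 280 column of the Hellgate figures above. The setting labels and fps values come straight from the post; the percentage drop is derived here and was not part of the original numbers.

```python
# Hedged sketch: fps figures copied from the GTX 280 column of the post above.
# The percentage drop is computed here, not quoted from the post.
hellgate_gtx280 = {
    "1680x8": (174.20, 76.00),   # (DX9 fps, DX10 fps)
    "1600x4": (172.70, 57.80),
    "1920":   (118.00, 50.00),
    "2560":   (79.10, 43.10),
}

for setting, (dx9, dx10) in hellgate_gtx280.items():
    drop = dx9 - dx10
    pct = 100.0 * drop / dx9
    print(f"{setting}: DX9 {dx9:.2f} -> DX10 {dx10:.2f} "
          f"(drop {drop:.2f} fps, {pct:.0f}%)")
```

Run as-is, it reproduces the "DX9 vs DX10 drop" column (98.20, 114.90, 68.00, 36.00) and shows the card loses roughly 45-67% of its frame rate in DX10, depending on resolution.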


----------



## Hayder_Master (Jun 16, 2008)

btarunr said:


> Alright, let me join this speculators' orgy.
> HD4870 > 9800GTX
> GTX 260> HD4870 but costs $499 so I'll throw it out of the window. Even if a HD4870 performs 80% as good as a GTX260, even at a price of ~$399, HD4870 wins.
> 
> HD3850 512M is known to be 75~80% as fast as HD3870. If it's the same case with HD4000, you can consider 9800 GT lost.



and 4870x2>>gtx280


----------



## vega22 (Jun 16, 2008)

i predict the 280gx2 will pwn all, if the drivers work properly 

good to see that this could be the first card that can run games on that Dell 30" monster @ native res.


----------



## wolf (Jun 16, 2008)

i reckon wait for 3840x2400. 2560x1600 is nice, but i don't think it'll ever be a standard like 1280x1024 or 1920x1200/1080


----------



## farlex85 (Jun 16, 2008)

marsey99 said:


> i predict the 280gx2 will pwn all, if the drivers work propperly
> 
> good to see that this could be the first card that can run gamezs on that dell 30" monster @native res.



Probably won't be one. More likely a 380gx2 after core revisions.


----------



## platinumyahoo (Jun 16, 2008)

You can compare these because they're in the same price range. If a dual-GPU card gives me more performance for the same price, I don't care that it's dual GPU. Plus, I'm tired of nVidia being arrogant, so I'm ready for a change! Now I don't have to buy their sloppy chipset if I ever want to run dual graphics cards!


----------



## wolf (Jun 16, 2008)

GT200b around December/January ftw.


----------



## yogurt_21 (Jun 16, 2008)

gt200 is the same speed as, or sometimes slower than, the 9800gx2. So it seems, as most predicted, this is the most expensive card in a long time not to offer the best performance from nvidia. The gtx260 seems like a better deal, but it will probably also need to come down in price once the 4000 series launches.


----------



## KainXS (Jun 16, 2008)

marsey99 said:


> i predict the 280gx2 will pwn all, if the drivers work propperly
> 
> good to see that this could be the first card that can run gamezs on that dell 30" monster @native res.



there's not gonna be a G280GX2 for a good while, not until the die shrinks, and then there's the fact that ATI's R800 specifications are now finalized, with a set release date in early 2009

This is what I hoped for though, ATI managed to push Nvidia a little

it's good to see them make a "new" GPU, they have been using variations on the G92 for a while.

I wanna see TriFire 4870X2 vs Tri-SLI GTX280 now


----------



## PVTCaboose1337 (Jun 16, 2008)

Too soon to tell!  Also:

Who cares what beats what if they are priced so the ordinary consumer cannot afford them!


----------



## handydagger (Jun 17, 2008)

Did someone mention the power of CrossFire with two 4870 X2s? Let me guess, 4 TFLOPS,

or quad 4870s with GDDR5 may reach 4.5-4.8 TFLOPS of power; this for sure will leave SLI 280s in the dust.....

unless Nvidia moves to 55nm or 45nm, and I don't think that will happen in the short term.
In price over performance, ATI will own for at least the next 5-8 months.


----------



## johnnyfiive (Jun 17, 2008)

PVTCaboose1337 said:


> Too soon to tell!  Also:
> 
> Who cares what beats what if they are priced so the ordinary consumer cannot afford them!



Exactly.


----------



## Woody112 (Jun 19, 2008)

I'm at the point where I don't really care about benchmarks anymore, but rather the whole package of the card. For example, the new 48xx is supposed to have some stellar physics capabilities and better video quality. The only game out right now that can put one of these cards on its knees is Crysis, and that's on a 24" display or larger. I love Nvidia but am kind of ticked off that they have not been pushing technology like they could have. While ATI/AMD is developing their physics and video quality, Nvidia keeps pushing FPS. I was really expecting a lot more out of the 280s than what I'm seeing; I would like to get more out of a 600-dollar graphics card than just high fps. For now I'm still on the fence as to who has what, because Nvidia has always got something up their sleeve.


----------



## Wile E (Jun 19, 2008)

Woody112 said:


> I'm at the point where I don't really care about benchmarks anymore but rather the whole package of the card. For example the new 48xx is suppose to have some steller physix capabilities, better video quality. The only video game out as of now that can put one of these cards on its knees is crysis and thats on a 24" display or larger. I love nvidia but am kind of ticked off at the fact they have not been pushing technology like they could have. While ATI/AMD is developing there physix, and video quality. Nvidia keeps pushing FPS, I was really expecting alot more out of the 280's than what I'm seeing. I love Nvidia but I would like to have more out of a 600 dollar gpx card than just high fps. As for now, personally I'm still on the fence as to who has what because Nvida has always got something up their sleeve.



nVidia will be enabling Ageia PhysX on their cards through CUDA in a driver update.


----------



## DarkMatter (Jun 19, 2008)

Woody112 said:


> I'm at the point where I don't really care about benchmarks anymore but rather the whole package of the card. For example the new 48xx is suppose to have some steller physix capabilities, better video quality. The only video game out as of now that can put one of these cards on its knees is crysis and thats on a 24" display or larger. I love nvidia but am kind of ticked off at the fact they have not been pushing technology like they could have. While ATI/AMD is developing there physix, and video quality. Nvidia keeps pushing FPS, I was really expecting alot more out of the 280's than what I'm seeing. I love Nvidia but I would like to have more out of a 600 dollar gpx card than just high fps. As for now, personally I'm still on the fence as to who has what because Nvida has always got something up their sleeve.



As Wile E said, as it stands right now, Nvidia is the one with better physics support (a lot better, I have to say). The GTX 280/260 even have a sort of physics processor slapped into the core: an additional FP64 unit (30 ALUs) and dedicated cache that cannot be used for graphics and are only reachable through CUDA. They're for physics or whatever you want to use them for, but in games they certainly won't be used for anything else, IMO. GT200 is a lot more than a graphics card. The question remains whether we REALLY WANT something more (those extra features mean the card has 270 SPs but only 240 usable for graphics) and whether it's worth the BIG price premium. Not right now, that's for sure.


----------



## Woody112 (Jun 19, 2008)

Nice! Did not know that. And yes, I really want that kind of candy on a card, and more. 
I'll wait to see some side-by-side benchies of the two before I go to the egg. I screwed up last time when I bought two 3870s instead of two 8800GTXs just to save a few bucks; I've had nothing but driver issues since the day I installed them. I won't make the same mistake twice.


----------



## AsRock (Jun 19, 2008)

Personally I don't care about CUDA. And I had loads of fun watching Havok at work and want more... a great example: Company of Heroes.

My questions are: when is CUDA going to be available?
When are games going to support it?
Is this going to be another thing added that can go wrong?


----------



## DarkMatter (Jun 19, 2008)

AsRock said:


> Personally i don't care about CUDA.  And had loads of fun watching Havok at work and want more...   A great example Company Of Hero's.
> 
> My questions are, when is CUDA going be available ?.
> When are games going support it ?.
> Is this going be another problem added to go wrong ?.



First of all, one comment: CUDA and PhysX are not the same thing. CUDA is a C-based API that lets you run non-graphical applications on GPUs. PhysX will be enabled through CUDA, and its performance is tens to a hundred times that of CPU physics like current Havok: CPUs have ~10 GFlops per core, new GPUs have ~1000 GFlops. 

1 - CUDA is available now; it's been available for some time already. PhysX support has already launched with the 177.35 drivers and will supposedly come to games soon. 

2 - They are already developing with it, but it's impossible to know exactly when, or whether it's going to be successful.

3 - I don't understand what you really mean here, but in any case it's pretty early to say. I would even bet that neither developers nor Nvidia know how this is going to go in the future.


----------



## lemonadesoda (Jun 19, 2008)

hayder.master said:


> and 4870x2>>gtx280


Don't be so sure.

At 2560x1600 the GTX280 is 70% more powerful than 4850. So, my guess, about 50%-60% more powerful than a 4870.

4870x2 will probably be ">" or "=" to GTX280 at standard resolutions, but NOT ">>"

And at 2560x1600, my guess is they will be "="
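The guess above can be made explicit with back-of-envelope arithmetic. Only the 70% figure (GTX 280 vs 4850 at 2560x1600) comes from this post; the 4870-over-4850 gap and the CrossFire scaling factor below are illustrative assumptions, not measurements.

```python
# Back-of-envelope sketch. Only gtx280_vs_4850 is from the post above;
# the other two factors are assumptions chosen for illustration.
gtx280_vs_4850 = 1.70   # from the post: 70% more powerful at 2560x1600
hd4870_vs_4850 = 1.20   # assumed: 4870 ~20% ahead of a 4850
xfire_scaling  = 0.80   # assumed: the second GPU adds ~80% on top

gtx280_vs_4870 = gtx280_vs_4850 / hd4870_vs_4850              # ~1.42
x2_vs_gtx280 = hd4870_vs_4850 * (1 + xfire_scaling) / gtx280_vs_4850

print(f"GTX 280 vs HD 4870: {gtx280_vs_4870:.2f}x")
print(f"4870 X2 vs GTX 280: {x2_vs_gtx280:.2f}x")
```

Under these assumptions the single GTX 280 lands about 42% ahead of a 4870 (near the 50-60% guess), and the X2 comes out around 1.27x the GTX 280, i.e. ">" rather than ">>".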


----------



## KainXS (Jun 20, 2008)

If you guys and gals remember, a while back there was news that the extra 320 SPUs would be used for physics, and now we all know that the cards do have 800 SPUs

so could that be true?

could ATI have a ninja waiting in their cards, ready to strike back at CUDA?


----------



## Miyazaki270 (Jun 20, 2008)

*I don't think so...*

The ATi cards will be really good, but I don't think they will compete with the GTX 280. ATi builds cards for speed, which is why they perform better in 3DMark, but their shaders are grouped into much larger clusters, while Nvidia makes them smaller, which makes them run better. ATi will be good competition for the 9 series, but I don't think they'll compete in performance; in price they might. And let's remember the GTX 280 is a single GPU and the 4870 X2 is two. This is speculation, but the GTX 280 has just been released and the drivers are still new, so we'll see. The ATi cards should perform well in 3DMark, but when it comes to games I don't think they will compete, though they should show some great improvement over the 38xx series.
Nvidia knows what they're doing, as does ATi.
ATi = good price, good performance
Nvidia = not so great price, but bearable, with great performance


----------



## a_ump (Jul 24, 2008)

Miyazaki270 said:


> The ATi cards will be really good cards but I dont think they will compete with the gtx 280, because ATi makes cards for speeds thats why the perfom better on 3d mark, but there shaders are grouped much more largely, while Nvidia makes them smaller witch makes them run better, ATi will be good competition for the 9 series but i dont think they will compete, in performance but in price they might, and lets remeber the GTX 280 is a single gpu and the 4870xw is 2 and this is a speculation, but the GTX 280 has just  been released and the drivers are still new so will see. The ATi cards should perform well in 3d mark but when it comes to games i dont think they will compete though they should show some great improvement over the 38** series.
> Nvidia knows what there doing as do ATi
> ATi=good price good performance
> Nvidia=not so great price but bareble, but great performance



no offense, but please research before you post. The HD 4870 X2 is definitely a match for the GTX 280, and I would guess it will sell more since it offers better performance for just a little more. Even if the GTX 280 is on beta drivers, these benchmarks are of engineering samples of the HD 4870 X2 WITH beta drivers, so I would think that sets the HD 4870 X2 back more than the GTX 280. How could you logically think Nvidia makes its stream processors smaller than ATI's, when Nvidia has a little more than a quarter as many processors yet its GPU is about 2.2x larger? The HD 4870 has 800 processors. 
http://www.pcper.com/article.php?aid=581&type=expert
GTX 280 = 576 mm^2
HD 4870 = 260 mm^2

http://anandtech.com/video/showdoc.aspx?i=3354
http://www.extremetech.com/article2/0,2845,2325444,00.asp
http://www.guru3d.com/article/radeon-hd-4870-x2-preview/
http://www.legitreviews.com/article/745/1/
http://www.pcper.com/article.php?aid=590

please research instead of posting speculation
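The quoted die-size and shader-count figures can be sanity-checked with one-line arithmetic (bearing in mind that Nvidia and ATI count "stream processors" differently, so the raw counts aren't directly comparable units):

```python
# Sanity check of the figures quoted above (via the pcper.com link).
gtx280_die_mm2, gtx280_sps = 576, 240
hd4870_die_mm2, hd4870_sps = 260, 800

die_ratio = gtx280_die_mm2 / hd4870_die_mm2   # ~2.2x larger die
sp_ratio  = gtx280_sps / hd4870_sps           # 0.30, a bit over a quarter

print(f"GTX 280 die is {die_ratio:.2f}x the HD 4870 die")
print(f"GTX 280 has {sp_ratio:.0%} as many stream processors")
```

So the GTX 280 die is about 2.2x the HD 4870's while carrying 30% as many (differently architected) shader units, which is the density point the post is making.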


----------



## trt740 (Jul 24, 2008)

Miyazaki270 said:


> The ATi cards will be really good cards but I dont think they will compete with the gtx 280, because ATi makes cards for speeds thats why the perfom better on 3d mark, but there shaders are grouped much more largely, while Nvidia makes them smaller witch makes them run better, ATi will be good competition for the 9 series but i dont think they will compete, in performance but in price they might, and lets remeber the GTX 280 is a single gpu and the 4870xw is 2 and this is a speculation, but the GTX 280 has just  been released and the drivers are still new so will see. The ATi cards should perform well in 3d mark but when it comes to games i dont think they will compete though they should show some great improvement over the 38** series.
> Nvidia knows what there doing as do ATi
> ATi=good price good performance
> Nvidia=not so great price but bareble, but great performance



I own a GTX 280 and it is a monster video card, but a 4870 X2 will be about 25 percent faster or more in some things, even with my 280 OC'd to 717/1509/1280. You're talking about a dual-GPU card against a single-GPU card, and the 3870 cards were slower and not as well made as the current 4870s. Take it from me: the 4870 X2 is gonna be faster than the GTX 280 and about $100 more expensive. Against the 280b revision, the 4870 X2 may be only 5 to 10 percent faster. The 280 is the fastest single-GPU card, but not the fastest video card (at least it won't be in about two weeks)


----------



## Nick89 (Jul 24, 2008)

wiak said:


> get a proper cooler like zalman VF900-CU



VF900-CUs are crap; they aren't powerful enough to cool an X1950XT, let alone a 2900XT.:shadedshu


----------



## newconroer (Jul 26, 2008)

trt740 said:


> I own a 280GTX and it is a monster video card but a 4870x2 will be about 25 percent faster or more in somethings, even when my 280 is oced to 717/1509/1280. Your talking about a dual gpu card against a single gpu card. The 3870 cards were slower and not aswell made as the current 4870 cards. Take it from me the 4870x2 is gonna be faster than the 280 gtx and about 100.00 more expensive. Now the 280b revision, the 4870x2 maybe 5 to 10 percent faster. The 280 is the fastest single gpu card but not the fastest video card (atleast it won't be in about two weeks)



25% is the rough estimate of how much faster the X2 and the GT200b will be.

Yet 25% isn't telling us much. A 25% increase to clocks all around will do very little in the real world. 

If you're running a 3d application now, with a 280, and are still suffering low frames in some situations, the X2 and GT200b aren't going to noticeably fix that. 

Given the brute power and capabilities of all recent GPUs, both Nvidia and ATi, the problem lies more with the correlation between CPU > GPU > RAM and the coding of the application you are running.

If we look at a modern game such as AoC, and compare results under heavy taxing circumstances, a 640 GTS, GT, 4850, 9800 GTX, 260, 4870 and GTX280 will all provide nearly the same performance, with gaps being less than 10fps, and in some cases less than 5fps.

For most people, that's not worth an upgrade to any of the cards.

Which means the X2 and the GT200b aren't going to break this mold.

Now if you're looking for another 3fps, then by all means purchase one. Yet there are more important issues that later-generation cards address: stutter, micro-stutter, performance hits at high resolution when using MSAA, supersampling AA, HDR, minimum-fps stability, etc. Those should be the reasons why or why not to upgrade.

Having said that, the X2 and the GT200b are not likely to address those issues any better than a 4870 or 280 does.

The only real benefit would be lower power consumption and fewer heat issues.

Yet anyone who's serious about using high-end GPUs should be running aftermarket cooling, whether air or liquid, and in that case the die shrink will be more or less irrelevant.


Prices on both cards are down to a very reasonable level; I can't imagine the new ones will be much cheaper.

So the question becomes: do you want to wait a few more months just to save 10 degrees?


----------



## trt740 (Jul 26, 2008)

newconroer said:


> 25% is the rough estimate of how much faster the X2 and the GT200b will be.
> 
> Yet 25% isn't telling us much. A 25% increase to clocks all around will do very little in real world.
> 
> ...



You're kidding, right? The GTX 280 kills every prior card I've owned with all the settings turned up, even at lower resolutions, and the minimum frame rates are much higher. Also, since the GTX 280 is a single GPU, it doesn't suffer from micro-stuttering. As for suffering in 3D applications, that's not happening on my machine. As for this part of your statement: *If we look at a modern game such as AoC, and compare results under heavy taxing circumstances, a 640 GTS, GT, 4850, 9800 GTX, 260, 4870 and GTX280 will all provide nearly the same performance, with gaps being less than 10fps, and in some cases less than 5fps.* The 640 GTS struggles at max settings in new games, and if you're telling me the GTX 280 and 4870 only add 5 fps, you're kidding, right? The minimum frame rates of a GTX 280 are double, or nearly double, a 9800 GTX's most of the time when it's overclocked, and yes, 5 to 10 frames is a giant difference in some cases. Going from 18 to 28 frames per second is significant: 18 fps is unplayable, but at 28 fps the game becomes very playable. If you're saying that most of the cards you listed, excluding the 8800 GTS 640MB (and in some cases even it), can play most current games very well, I would agree; and if we were talking about the average user, I would agree. But this is not an average user's forum and we are not average users. If we were, we'd all have P35 motherboards with E7200 or Q6600 CPUs, DDR2-800 RAM and 9600GT video cards (very good systems for the money, I might add) at stock speeds with stock cooling.


----------



## StarYoshi (Jul 26, 2008)

I have yet to find a card that's a worthwhile upgrade from my 8800GTS 640MB at 1680x1050, a common medium-high-end resolution. For 1600x1200 and higher these new cards (HD4, GTX) are great, but I want a card that really scales with resolution and doesn't give me flat performance at lower resolutions. The HD4870 only improved my Crysis framerates; in all other games the gains were negligible. It's beginning to look like I'll have to settle for SLI 8800GTs or pick up another 8800GTS if I want more performance at this resolution. That said, this beast still maxes most games


----------



## EastCoasthandle (Jul 26, 2008)

StarYoshi said:


> *I have yet to find a card that's a worthwhile upgrade from my 8800GTS 640mb at 1680x1050*, a common medium-high end resolution. For 1600x1200 and higher resolutions these new cards (HD4, GTX) are great, but I'm wanting for a card that really scales with resolution and doesn't give a flat performance for me at lower resolutions. The Hd4870 only improved my Crysis framerates. All other games were negligible. *It's beginning to look like i'll have to settle for SLI 8800GTs or pick up another 8800GTS* if I want more performance at this resolution. That said, this beast still maxes most games




The HD4870 and the 640 are miles apart in performance at 1680x1050 with 4xAA/16xAF. Then you go on to talk about 8800GTs in SLI, which is no comparison to the single 640 you referenced earlier in your post. Then you talk about another 8800 GTS, which is no competition for an HD4870. Your entire post contradicts your claim that there are no cards worth upgrading to.

In light of your own post, there are plenty of video cards to upgrade to from a 640. The only thing preventing you from doing so is your own personal preference.


----------



## AsRock (Jul 26, 2008)

StarYoshi said:


> I have yet to find a card that's a worthwhile upgrade from my 8800GTS 640mb at 1680x1050, a common medium-high end resolution. For 1600x1200 and higher resolutions these new cards (HD4, GTX) are great, but I'm wanting for a card that really scales with resolution and doesn't give a flat performance for me at lower resolutions. The Hd4870 only improved my Crysis framerates. All other games were negligible. It's beginning to look like i'll have to settle for SLI 8800GTs or pick up another 8800GTS if I want more performance at this resolution. *That said, this beast still maxes most games *




Just wait till next year then... it's that simple. It's not as if you have to, or really need to, so why bother? I had the chance to get a 4870 last week but decided against it and am waiting for the R800.


----------



## Kursah (Jul 26, 2008)

That's another thing to choose... buy this "next best thing" or wait for the next "next best thing", lol! I decided to jump now; I may step up to the GTX 200b later on if it's out, and go from there.

I say if you're gaming without much issue in modern games (Crysis aside), don't worry too much... if you have the itch and the money, nothing will stop ya. But I'm sure ATI's and NV's next-next best thing will be pretty kickass too!


----------



## wolf (Feb 2, 2009)

well, there we have it: according to our very own in-house review, a slightly overclocked GTX 280, or a stock GTX 285, can match a 4870 X2, give or take ~1%

nvidia always has an answer to ATi, as ATi does for Nvidia, but the question is: which company answers with performance, and which with price?

naturally neither company always answers just one aspect.... but i think i can see a trend ....


----------

