# R700 in for Wet-Weather Launch, GTX 280 Price-Slash Coming Up



## btarunr (Jul 18, 2008)

In what could be a serious blow to AMD's R700 dreams as the company struggles to survive, NVIDIA could be working out another price-slash for the GeForce GTX 200 series, signs of which are already surfacing: you can now get a GeForce GTX 280 for US $440. American e-tailer Newegg has already put several brands' GeForce GTX 280 cards on its shelves for $440 (listed here), lower than the launch price of the GeForce GTX 260. 

Implications are:

- This paves the way for a 'GeForce GTX 280 Extreme', a new model which has been put up for pre-order on British retailer OCUK's website (covered here).
- While board partners seemed displeased with the current market trend for the GTX 200 series (covered here), all of a sudden they are all offering price-cuts. Could NVIDIA be subsidizing these products, given that some reports suggest a poor reception for them in North American markets?
- Assuming the R700 comes in at around $499 and $549, it loses badly to the GTX 280 at its newest price on both the price/performance and performance-per-watt fronts.
- No price-cuts for the GTX 260 have been noted, showing this cut is very much specific to the GTX 280 and serves two purposes: 1. phasing out the 65 nm parts; 2. striking the R700 where it hurts the most, at launch.
- While NVIDIA has the resources to maintain the leadership of the GeForce GTX 200 series (by affording to slash prices), AMD doesn't.

Although the new $440 price tag looks tempting, you could wait for the R700 to launch, after which you can make a more informed decision.

*View at TechPowerUp Main Site*


----------



## [I.R.A]_FBi (Jul 18, 2008)

linky?


----------



## btarunr (Jul 18, 2008)

I am the link.


----------



## Darkrealms (Jul 18, 2008)

btarunr said:


> I am the link.


@_0  local enquirer?!?  LoL

Interesting perspective.  I guess we'll have to see how ATI's new cards do.  Good for me though.  I'm all for cheap Nvidia cards : )


----------



## btarunr (Jul 18, 2008)

Darkrealms said:


> @_0  local enquirer?!?  LoL
> 
> Interesting perspective.  I guess we'll have to see how ATI's new cards do.  Good for me though.  I'm all for cheap Nvidia cards : )



Why, is it dogma that TPU's news should always be borrowed? Our news is based on facts, not sources. If facts come from a source, we acknowledge the source of the facts. Even if Fudzilla comes up with something and has pictures to back it, we cover it. Go to Newegg and see for yourself: if more than two brands are using the same low $440 price (with no POS rebate whatsoever), then it has to be a standard price. As for the GTX 280 Extreme, we already covered it days ago; dig through the news archive and see for yourself. Even that was based on facts, and other sites covered it too.

Edit: added links.


----------



## TooFast (Jul 18, 2008)

well from what I have seen the r700 is way faster.


----------



## alexp999 (Jul 18, 2008)

Okay, now I definitely think NVIDIA are selling these at a loss. Maybe it will cause AMD to drop their prices; based on a $499 launch price they probably have the headroom to drop and still make a profit.

If the price gap is too big though I will be going with the 280.


----------



## wolf2009 (Jul 18, 2008)

btarunr said:


> Why, is it dogma that TPU's news should always be *begged*/borrowed?



I am pleased that TPU came up with something original. Thanks!

I don't think begged is the right word, but borrow is.


----------



## chron (Jul 18, 2008)

If that whole thing is true with AMD and NVIDIA meeting together to come up with pricing for their products, then how do we know these prices are still any good?

I guess what I'm saying is, how do we know all this isn't just a load of bullshit, and the card's prices after being "slashed" are still higher than they should be, and AMD and NVIDIA have done this to get people thinking they are getting a good deal on the cards?


Also - I can't find any GTX 280s for $440. 450, but not 440. Typo?


----------



## phanbuey (Jul 18, 2008)

TooFast said:


> well from what I have seen the r700 is way faster.



yes but they're having problems with the engineering-sample R700s pulling too much wattage and de-stabilizing some PCIe 1.1 motherboards... 

The power connectors alone do not supply nearly enough juice for that monster - around 95 W has to be pulled from the PCIe slot, which has a specified maximum of 75 W... Hexus had to switch out their motherboard for the testing because the first one would crash.

"the manifestation of such power is a frankly ludicrous power-draw of around 320W when placed under sustained load. Some commentators may argue that's required for two HD 4870 boards, which is true, but the very fact that the X2 variant plugs into a single PCIe (2.0) x16 slot can become problematic. Why? Because the 8-pin and 6-pin PCIe power arrangement provides in the order of 225W to a card; the rest needs to be pulled from the PCIe slot."

- source http://www.hexus.net/content/item.php?item=14178&page=3
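Editor's note: the quoted figures can be sanity-checked with simple arithmetic. This sketch only uses numbers from the quote above (320 W sustained draw, ~225 W from the 8-pin + 6-pin connectors) plus the PCIe slot's specified 75 W limit:

```python
# Power-budget check using the figures quoted above (all in watts).
SLOT_LIMIT = 75    # PCIe x16 slot specification
CONNECTORS = 225   # quoted 8-pin + 6-pin delivery
BOARD_DRAW = 320   # quoted sustained load draw

slot_draw = BOARD_DRAW - CONNECTORS  # power that must come from the slot
overshoot = slot_draw - SLOT_LIMIT   # amount above the slot's spec
print(slot_draw, overshoot)          # prints: 95 20
```

Which matches the ~95 W slot draw mentioned in the post, about 20 W over spec.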


----------



## Megasty (Jul 18, 2008)

chron said:


> If that whole thing is true with AMD and NVIDIA meeting together to come up with pricing for their products, then how do we know these prices are still any good?
> 
> I guess what I'm saying is, how do we know all this isn't just a load of bullshit, and the card's prices after being "slashed" are still higher than they should be, and AMD and NVIDIA have done this to get people thinking they are getting a good deal on the cards?



All that BS happened before AMD took over. The mess that happened with the GTX 280/260 is a prime example of how we know it has ended. BTW, the GTX 280 is a Be@sT


----------



## btarunr (Jul 18, 2008)

wolf2009 said:


> I am pleased that TPU came with something original . THanks !
> 
> I dont think begged is the right word , while borrow is .



I was just pissed at someone equating us to the Inq. We're not the Inq. Every news publication includes analysis of details and facts; some of the analysis turns out true, some doesn't. It's the same with TPU as a tech publication, or even CNN as a news giant. That's how news works. Besides, I've based everything on facts alone.


----------



## Frederik S (Jul 18, 2008)

I have more faith in btarunrs conclusions than in the Fuds or The Inqs.
Nice observation, looking forward to being able to afford a GTX260.

Cheers,
Frederik


----------



## Tatty_One (Jul 18, 2008)

Hmmmmm..... GTX Extreme.......a 280 that clocks at 738 MHz with shaders running 25+% faster..........now that's a fast card. By my reckoning that would make it around 25% faster overall than the current vanilla GTX 280, which would mean it would be a match for the R700..........pure speculation as always, of course, before either has hit the shelves, but if you add to that NVIDIA's reputation for overclocking GPUs, it is quite possible that this beast could be faster than the R700.
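Editor's note: that reckoning can be roughly checked against the clocks quoted elsewhere in this thread - the stock GTX 280's 602 MHz core / 1296 MHz shader and the rumoured Extreme's 738 / 1666:

```python
# Back-of-the-envelope check of the "~25% faster" reckoning.
# All clocks (MHz) are figures quoted in this thread, not confirmed specs.
stock_core, stock_shader = 602, 1296
extreme_core, extreme_shader = 738, 1666

core_gain = (extreme_core - stock_core) / stock_core
shader_gain = (extreme_shader - stock_shader) / stock_shader
print(f"core: +{core_gain:.1%}, shader: +{shader_gain:.1%}")
# prints: core: +22.6%, shader: +28.5%
```

So the clock uplift is in the 22-29% range, consistent with the "25+%" claim, though real-game scaling would of course be lower.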


----------



## mdm-adph (Jul 18, 2008)

Desperately cutting prices and selling products at a loss just to impede on a competitor's progress?  Sounds like Intel (or Microsoft).


----------



## candle_86 (Jul 18, 2008)

kinda makes you wonder how expensive it really is for NVIDIA, doesn't it?

65 nm G200 cores must be getting a higher yield than originally thought; I'd say 90% of the usable dies are also fully functional, which would make sense as 65 nm is mature by this point


----------



## PCpraiser100 (Jul 18, 2008)

NVIDIA can't defeat ATI's 4870 X2; it's going for $500, and if the green team thinks they can pull it off, think again. I've been hearing rumors that Infinity Ward (the COD4 developer) is thinking about developing their next title with AMD's tools instead of NVIDIA's, especially now that AMD has succeeded in conquering performance records at a price NVIDIA bows down to. With this and the upcoming 40 nm 800-core parts coming, green games are getting burned by red flames.


----------



## trt740 (Jul 18, 2008)

Tatty_One said:


> Hmmmmm..... GTX Extreme.......a 280 that clocks at 738 Mhz with shaders running 25+% faster..........now thats a fast card, by my reckoning that would make it around 25% faster overall than the current 280GTX vanilla, that would mean it would be a match for R700..........pure speculation as always of course before either has hit the shelves, but if you add to that NVidia's reputation at least for overclocking GPU's it is quite possible that this beast could be faster than R700.



They will go a bunch higher than mine; my vanilla 280 does 717/1500/1280, up from 602/1296/1107. With a smaller die the new 280 should do 800 core. Even then I think the 4870 X2 will beat an overclocked 280b by about 10 percent when the 4870 X2 is overclocked, judging by my card. That's a total estimate on my part.


----------



## phanbuey (Jul 18, 2008)

trt740 said:


> they will go a bunch higher than the my vanilla 280 does 717/1500/1280 up from 602/1296/1107.With a smaller die the new 280 should do 800 core.



800 on a GTX 280??? Monstrous.


----------



## pentastar111 (Jul 18, 2008)

Meh, I'm still goin AMD next build...Doesn't matter what nVidia comes out with...I'm already  soured...They have way too many band-aids going on.


----------



## trt740 (Jul 18, 2008)

phanbuey said:


> 800 on a GTX 280??? monsterous.



Well, you figure the 280b will start out at stock faster than my max overclock on my 65 nm 280, and since the 280b is a 55 nm chip, that might be a bit conservative on my part. Let's compare: my max stable overclock on the vanilla 65 nm 280 is 717/1500/2560; the new 280b Extreme is 738/1666/2520 MHz.


----------



## PCpraiser100 (Jul 18, 2008)

For some reason, this problem is linked to what I said about DX11. If NVIDIA fails to excel in bang-for-buck, they will not have enough development money to get ready for DX11. AMD is basically taking away a supply line that was vital to NVIDIA's past success: its customers. And if they don't pull it off, their stock will lose value; then come CEO-voting madness, more gambling, and more partnerships with other companies to survive. ATI was in the same trouble and was sold to AMD as a result.


----------



## yogurt_21 (Jul 18, 2008)

lol, "vanilla" being applied to a GTX 280. 

This will definitely make things interesting, especially since, while NVIDIA is in a better position to sell the GTX 280 at a loss than AMD is, they're not exactly faring too well themselves. Both companies had better watch themselves in the price war, as Intel has been biding its time waiting to retake the GPU market.


----------



## phanbuey (Jul 18, 2008)

PCpraiser100 said:


> For some reason, this problem is linked to what I said about DX11. If NVIDIA fails to exceeded in bang-for-buck, they will not have enough development money to get ready for DX11. AMD is basically taking away a very important supply line to NVIDIA's success in the past, their customers. And if they don't pull it off, stocks will lower in value, CEO voting madness, more gambling, and more partnership with other companies to survive. ATI was in the same trouble and was sold to AMD as a result.



not to mention the monster that is Intel is about to pop into their little duopoly - and it's gonna have to take market share from one of them... If ATI is more competitive and has the better product, that means Intel is gonna eat NVIDIA's lunch.

EDIT: Yogurt beat me to it... it's gonna be a great time for gamers with those three at it. Ray tracing FTW!


----------



## GPUCafe (Jul 18, 2008)

btarunr said:


> Implications are:
> 
> This paves the way for a 'GeForce GTX 280 Extreme', a new model which has been put on pre-order on British website OCUK (covered here).
> While board partners seemed displeased at the current market trend with the GTX 200 series (covered here), all of a sudden, they are all offering price-cuts. *Could NVIDIA be subsidizing these products*, given that some reports suggest poor reception of these products in the North American markets?
> ...


They are, and by a good margin. We already covered that here: http://gpucafe.com/?p=18

I'm told that the drop from $499 to $449 is due to slow sales of the 280, not due to the R700. The 260 is selling "well" at $299-329.


----------



## Megasty (Jul 18, 2008)

Eh, life is great, isn't it. I remember when my monstrous 8800 GTX started to die in new games. The 4870 & GTX 280 are so damn fast compared to that thing, & I hope game developers don't get stupid & try to eat these cards soon too.

My 280 can do 725/1530/1300. It's completely game-stable but I think it can go a bit higher. Not that it matters though; it can already max out everything except everyone's favorite POS anyway. 

[sarcasm] I really don't care about the price war on the companies' side; they both deserve it. If the 280 drops even more, then that only means the 4870 X2's launch price will soon follow. They should just keep on dropping prices so these babies will be in everyone's range  [/sarcasm]


----------



## pentastar111 (Jul 18, 2008)

candle_86 said:


> kinda makes you wonder how expensive it really is for Nvidia doesnt it?
> 
> 65nm G200 cores must be having a higher yield than originally thought, id say 90% of the usable dies are also fully functional, which would make sense as 65nm is mature by this point


 EXACTLY!! It's appearing that they could've sold these for much cheaper right out of the gate.... This price-slashing proves it... nVidia is a big company, BUT it's not as big as Micro$oft, who can sell at a loss at first, then turn a profit by sheer volume of units sold (Wasn't it Stalin who said "Quantity has a quality all its own"?)... Xbox 360, anyone? Anyway, to say that nVidia's taking a loss on these cards is ridiculous... No person (or company) in their right mind can afford to take such a hit in the wallet unless their company is HUGE (see Micro$oft) or they've been OVERPRICING the products... nVidia could probably lower the price another $50 and STILL make a decent profit... We've been getting gouged for some time now.... Good to see it tapering down a bit..


----------



## Ravenas (Jul 18, 2008)

btarunr said:


> Why, is it dogma that TPU's news should always be borrowed? Our news is based on facts, not sources. If facts come from a source, we acknowledge them as the source of the fact. Even if Fudzilla comes up with something and has pictures to back it, we cover it. Go to Newegg and see for yourself. If more than two brands are using the same low $440 price (no POS rebate whatsoever), then it has to be a standard price. As for GTX 280 Extreme, we already covered it days ago, dig through the news archive, see for yourself. Even that was based on facts and even other sites covered that.
> 
> Edit: added links.



I just think this type of posting isn't reporting news; rather, it's your opinion on what could potentially occur in the very near future.

Shouldn't this be in an opinion forum, or a hardware forum for that matter? I just don't see why you didn't do that? Possibly because you wanted everyone to see your opinion 

The only thing that should have been put in this article, for me to actually consider it news, is the information on the price drops, and only that. What you really did was post some links to the price drops (not an official announcement link) and then throw in your opinion on what could happen to the market as a direct result of these price drops.

Great price drops though; maybe now people will feel more tempted to buy graphics cards at a time when people are reluctant to buy much of anything besides food and gas.

EDIT: Changed "is" to "isn't".


----------



## candle_86 (Jul 18, 2008)

Not really; most of NVIDIA's money comes from 

IGPs
and sub-$100 graphics

So right now NVIDIA is still doing fairly well


----------



## GPUCafe (Jul 18, 2008)

candle_86 said:


> not really most of Nvidia's money comes from
> 
> IGP
> and Sub 100 dollar graphics
> ...


If by money you mean revenue, then yes; but most of their fat (50%-margin) profits come from the high-end.


----------



## candle_86 (Jul 18, 2008)

Yes, revenue. Look at it this way: they sell 1 million 6150 chipsets @ $20 profit,

or 100,000 GTX 280s @ $100 profit.

That's $20mil vs $1mil profit, and the numbers for the low end are prolly bigger in both profit and quantity sold than the high end. As long as NVIDIA controls the low end in this manner, they win.


----------



## dragonavenger (Jul 18, 2008)

F.

U.

D.


----------



## GPUCafe (Jul 18, 2008)

candle_86 said:


> yes revenue, look at it this way, they sell 1 million 6150 chipsets @ 20 dollar profit
> 
> or 100,000 GTX 280's @ 100 profit
> 
> Thats 20mil vs 1mil profit, and the numbers for low end are prolly bigger in both profit and and quanity sold than the highend. As long as Nvidia controls the lowend in this manner they win.


I don't think that's correct. Here's why:
1. The 6150 chipset doesn't have a $20 profit; it's about $5-8.
2. 1,000,000 chipsets at that margin is $8 million profit on revenue of $40 million.
3. 100,000 GT200s at a $100 margin is $10 million (not $1mil) on revenue of $20 million.

So they make more money ($2 million more) on the GT200 but generate more revenue ($20 million more) on the 6150. Remember this: low volume, high profit; high volume, low profit. 

By slashing to these levels NVIDIA isn't making much. They are doing it to help their partners (moving inventory) and at the same time to stop bleeding market share to AMD.
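Editor's note: the arithmetic above can be laid out explicitly. The per-unit revenues ($40 and $200) are implied by the post's revenue totals; all of these are the thread's assumptions, not NVIDIA's actual financials:

```python
# Volume-vs-margin arithmetic from the post above.
# Per-unit figures are the thread's assumptions, not real NVIDIA numbers.
def totals(units: int, unit_revenue: int, unit_margin: int):
    """Return (total revenue, total profit) for a product line."""
    return units * unit_revenue, units * unit_margin

chipset_rev, chipset_profit = totals(1_000_000, 40, 8)  # 6150 chipset
gpu_rev, gpu_profit = totals(100_000, 200, 100)         # GT200

print(chipset_rev, chipset_profit)  # prints: 40000000 8000000
print(gpu_rev, gpu_profit)          # prints: 20000000 10000000
```

On these assumptions the GT200 line makes $2 million more profit while the chipset line generates $20 million more revenue, exactly as the post concludes.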


----------



## pentastar111 (Jul 18, 2008)

GPUCafe said:


> I dont think thats correct. Here's why:
> 1. 6150 Chipset doesnt have $20 profit, about $5-8.
> 2. 1,000,000 chipsets at that margin is $8million with a revenue of $40million.
> 3. 100,000 GT200 at $100 margin is $10million (not $1mil )with a revenue of $20million.
> ...


 They definitely won't go broke.


----------



## imperialreign (Jul 18, 2008)

pentastar111 said:


> They definitely won't go broke.



biggest reason why nVidia can afford the price slash - it takes AMD by the ballz


----------



## GPUCafe (Jul 18, 2008)

pentastar111 said:


> They definitely won't go broke.


Definitely not.



imperialreign said:


> biggest reason why nVidia can afford the price slash - it takes AMD by the ballz


The reason is that they've been in AMD's current position (holding the winning product) ever since the 7800 series.


----------



## Tatty_One (Jul 18, 2008)

trt740 said:


> they will go a bunch higher than the my vanilla 280 does 717/1500/1280 up from 602/1296/1107.With a smaller die the new 280 should do 800 core. Even then I think the 4870x2 will beat a overclocked 280b by about 10 percent when the 4870x2 is overclocked. Judging by my card. Thats a total estimate on my part.



Well, I just read a preview of tests between the GTX 280 and the 4870 X2 over at XtremeSystems. The R700 lost about a third of the benches (all were at 16xx resolutions and above) but won the other two thirds by a fair margin; across the board it was 20-25% faster, and in some benches 40% faster. Of course those were at stock speeds, so it does seem that at stock the R700 would even beat an Extreme. Previews are great but not always the complete answer; for the overclocker the decision may be more interesting, but I can't help thinking ATi will get the pricing more realistic.


----------



## Tatty_One (Jul 18, 2008)

yogurt_21 said:


> lol "vanilla" being applied to a gtx280.
> 
> this will defiently make things interesting. especially being that while nvidia is in a better position to sell the gtx280 for a loss than amd is. they're not exactly faring too well themselves. both companies better watch themselves in the pricewar as intel has been biding it's time waiting to retake the gpu market.



I kinda liked the term vanilla; it reminds me of my old 6800..... things are getting way too complicated, especially where the green team is concerned ATM.


----------



## Deleted member 24505 (Jul 18, 2008)

I had a vanilla 6800; cost me £230 

Lol at nvidia, it's a bit dirty what they are doing, but there's no love in business, I guess.


----------



## Tatty_One (Jul 18, 2008)

tigger69 said:


> I had a vanilla 6800,cost me £230
> 
> Lol at nvidia,its a bit dirty what they are doing,but theres no love in buisness i guess.



Mine was a Leadtek, at the time my most expensive gfx card at about £170, and I thought that was bad!


----------



## Rash-Un-Al (Jul 18, 2008)

Very interesting editorial…

I invite you to take the following into consideration:

GTX 280/260 cards have additional clocking and driver-optimization potential. However, 48xx cards do as well (especially as future "unlocked" 4870 variants will bring dramatic core and mammoth GDDR5 speed increases). So the card-revision argument, in both camps, is relatively moot... both possess relatively new architectures from which we can expect further performance increases.

When the 9800 GX2 arrived on the market, its price premium over the 3870 X2 didn't prevent GX2s from enjoying healthy sales (in spite of ATI lowering its prices on the 3870 X2). And, so far, the average performance advantage the 4870 X2 holds over the GTX 280 (with preliminary silicon and drivers) is greater than the performance lead the GX2 initially had over the 3870 X2. Additionally, the price premium of the R700 over the GTX 280 (even with slashed GTX 280 prices) will not amount to the price disparity that eventually existed between the GX2 and the 3870 X2.

Naturally, the R700 will cost more because it is (and will remain) the absolute and outright performance leader... and, at launch, it will hold that title at a lower price-point than the GX2 enjoyed for so long.


----------



## vsary6968 (Jul 18, 2008)

Tatty_One said:


> Hmmmmm..... GTX Extreme.......a 280 that clocks at 738 Mhz with shaders running 25+% faster..........now thats a fast card, by my reckoning that would make it around 25% faster overall than the current 280GTX vanilla, that would mean it would be a match for R700..........pure speculation as always of course before either has hit the shelves, but if you add to that NVidia's reputation at least for overclocking GPU's it is quite possible that this beast could be faster than R700.



What have you been smoking? Share some with me, too. The GTX 280 is not that much faster than the HD 4870 overall. Even if you overclock it to 738 MHz with the shaders running 25% faster, it could not match the R700. The only way they can beat the R700 is to make a dual GTX 280; call it the GTX 280 GX2. And that will not appear until 2H 2009. By that time ATI's next generation after the R700 will be on the market, and that one will be the killer machine: built from 2x R700-class GPUs on a single PCB, a new design with a single 512-bit bus.

The GTX 280 is a 1 GB single card. The RV770 4870 is 512 MB, and the R700 is 2x RV770 stuck together on a single PCB, so the R700 is 512 MB + 512 MB = 1 GB. Even though the R700 is a dual-chip card, it is only 512 MB x2; when you add it up it is the same 1 GB as the GTX 280. Only later will ATI come with a 2 GB version.


----------



## Megasty (Jul 18, 2008)

vsary6968 said:


> What have you been smoking? Share some with me, too. The GTX 280 is not that much faster than the HD 4870 overall. Even if you overclock it to 738 MHz with the shaders running 25% faster, it could not match the R700.
> 
> The GTX 280 is a 1 GB single card. The RV770 4870 is 512 MB, and the R700 is 2x RV770 stuck together on a single PCB, so the R700 is 512 MB + 512 MB = 1 GB. Even though the R700 is a dual-chip card, it is only 512 MB x2; when you add it up it is the same 1 GB as the GTX 280.
> ...



 

You know, there are reasons that the 280 beats the 4870 X2 in some benches. The main reason is that it's the drivers that allow the 4870 X2 to spread work across both GPUs. When a game doesn't support that, it behaves like a single 4870; in that case, the benches show the 4870 & 4870 X2 performing about the same. All the newer games that came out over the past 6 months or so support both SLI & CF, so you won't have that problem now unless the game designers start having massive hangovers on the job.

Also, the 4870 X2 is a 2 GB card (2x 1 GB). Some of the engineering samples were 1 GB (2x 512 MB), which were either a fluke or AMD testing that version of the card as well. But the launch card will have 2 GB of GDDR5. AMD is also going to make a 'super' RV770XT chip that can clock over 1 GHz with proper cooling; we might see a later version of the 4870 X2 that uses two of those chips.


----------



## imperialreign (Jul 19, 2008)

Megasty said:


> You know, there are reasons that the 280 beats the 4870x2 in some benches. The main reason is that the drivers are what allows the 4870x2 to copy data over both GPU sets.
> When the game doesn't support that then it will behave like one 4870. In that case, the benches show that the 4870 & 4870x2 perform relatively the same.
> All the newer game that came out over the past 6 months or so support both SLI & CF.
> You won't have that problem now unless the game designers start having massive hangovers on the job.
> ...





Correct me if I'm dead nutz wrong - but I thought that with any CrossFire setup (dual- or single-PCB), if there is no rendering profile for the game in the video drivers, the GPUs default to AFR. Meaning you'd still see a benefit over a single GPU in a game with no profile - just not anywhere near as much as with a game that has been profiled.

ATI's drivers don't allow hard-setting the multi-GPU rendering style like nVidia's do, but they will determine as best they can the best multi-GPU rendering method.


----------



## powerwolf (Jul 19, 2008)

My cousin bought a 2900 XT on launch day, then replaced it with a GTX 280 on its launch day. He is cursed, I tell you.

We point and laugh at him.


----------



## AsRock (Jul 19, 2008)

I PM'ed you about this price drop earlier this morning. Anyway, that's tightening the nuts on ATI for sure. BUT, as I have always said, I'd buy from ATI and not NV, as I don't support companies that rip you off until they have to drop their price.

Wonder how much they're making per card at those prices, if they are making anything at all. Maybe they're just getting rid of them ASAP until they can make a better GPU.


----------



## 1c3d0g (Jul 19, 2008)

Don't forget, there's the rumored 55 nm shrink of the G200 (G200b, IINM) coming in the September/October timeframe. Still, at only $440 the GTX 280 is VERY tempting to buy... can you imagine the amount of work you could do in Folding@Home (GPU client)? Gheeezzz...


----------



## Megasty (Jul 19, 2008)

imperialreign said:


> correct me if I'm dead nutz wrong - but I thought with any Crossfire setup (dual or single-PCB), if there is no rendering profile for the game within the video drivers, the GPUs default to AFR.  Meaning that you'd still see a benefit over single GPU with two GPUs in a game with no profile - just not anywhere near as much as you would with a game that has been profiled.
> 
> ATI drivers don't allow for hard setting MGPU rendering styles like nVidia does, but ATI's will determine as best they can the best style for of MGPU rendering method.



I was referring to the game profiles that are updated through driver releases. The default AFR on the 3870 X2 only gave about a 0-5 fps increase over the 3870  
That's why it's incredible that AMD can still release updated drivers each month for these dual cards even when there's an influx of new games that month. 

However, there are games that naturally support AFR in their code (Assassin's Creed). Those games see improvements that nearly match the game profiles themselves. 
I just wish more developers would do that, even though most of them are now doing it quietly anyway


----------



## tkpenalty (Jul 19, 2008)

AMD is probably yelling:

"no...NO... NO!!!! NOT AGAIN!!!!!"


NVIDIA is just... taking losses to kill AMD. Very dirty, and I think that's something that may land them with a lawsuit on their hands.


EDIT: You guys just have to remember that the 4850s are selling like hotcakes everywhere. The GTX 280/260s aren't really that popular anyway. In Australia none of the 9800 GTX price-cuts have come into effect, while the 4850 ones have.


----------



## hv43082 (Jul 19, 2008)

OK, does this mean I should return my HD 4850 and hang on to the 8800 GTX until the dust settles before upgrading?


----------



## tkpenalty (Jul 19, 2008)

hv43082 said:


> OK does this mean I should return my hd4850 and hang on to the 8800GTX until the dust settle before upgrading?



No. That's a stupid way to think. The 4850 > 8800 GTX anyway.


----------



## imperialreign (Jul 19, 2008)

Megasty said:


> I was referring to the game profiles that are updated through driver releases. The default AFR on the 3870x2 only gave about a 0-5 fps increase over the 3870
> That's why its an incredible thing that AMD can still release updated drivers each month for these dual cards even when there are an influx of new games for that month.
> 
> However, there are games that naturally support the AFR in the coding (Assassin's Creed). Those games will have improvements that nearly match the game profiles itself.
> I just wish more developers will do that, even though most of them are now doing it in secret anyway



Sorry, your earlier post was a bit misleading 

Still, nVidia is tightening the belt something fierce right now, and it doesn't really look like it's helping them much at all.


I must say, though, I think this is the first time we've ever seen nVidia constantly slashing prices so soon after a series was released. AMD has definitely lit a pyre under their ass


----------



## tkpenalty (Jul 19, 2008)

Actually guys, remember that NVIDIA has already sold a majority of the cards to the AIB partners. Since NVIDIA controls the pricing, they've dropped it. This doesn't mean that they take the losses; the AIB partners do. 

As a result NVIDIA ISN'T losing much, BUT they are using this to kill the competition. Anti-competitive, if you ask me. 

"Let's hope they make as little as possible on this so their next product won't be as effective, so we can take advantage and profit."


----------



## Megasty (Jul 19, 2008)

imperialreign said:


> sorry, you're earlier post was a bit misleading
> 
> still, nVidia are tightening the belt something fierce right now, and it doesn't really look like it's helping them much at all.
> 
> ...



Yeah, I didn't want to get too technical in that post, plus I lost my train of thought mid-way through 

The last few weeks certainly have been interesting. I guess NV got way too caught up in that _'we are the best so we charge the most'_ BS. 
Getting a once-$650 card for $420 has me reeling just as much as NV was when they finally remembered the details of the 4870 X2


----------



## PCpraiser100 (Jul 19, 2008)

I hate NVIDIA because they just keep getting assisted by Microsoft and other companies whenever something new like DX11 pops up. AMD (the good guys) are just doing what is best for us, while NVIDIA merely catches up, playing "innocent" while claiming to be the leaders in graphics solutions; they are, BUT NOT IN THEORY. They basically cheat their way in, with Microsoft kissing their ass whenever they get hurt, while bribing game developers to create engines that scale better on NVIDIA cards. AGEIA is a great example: they kissed NVIDIA's ass by giving them the right to put PhysX technology in the GTX 200 line, while AMD came too late to the table, so AGEIA will only accept them under CERTAIN CONDITIONS.

Samsung and other memory companies are best friends with AMD, which is why ATI cards are the only cards to have GDDR4 and GDDR5 memory. AMD gets its ass kissed as well, but only for the sake of customers, so it keeps creating 100% genuine solutions that truly kick ass when benched in games with mediocre engines, such as the Source-powered Half-Life 2: Episode Two and, in some cases, COD4. This is why NVIDIA is always trying to cut corners in the world of performance: AMD is their biggest threat, since it has cut off important limbs NVIDIA needed to survive, such as stock value, customers, and profit-by-price. Soon, because NVIDIA is focused on surviving rather than competing, this will spread to partner companies: BFG moving on to manufacturing ATI cards, or more custom-driver availability for those with SLI components. But I don't mean to be the ORACLE OF COMPUTER TECHNOLOGY; I have already told you enough, so get out lol.


----------



## TheGuruStud (Jul 19, 2008)

^^ Someone needs to calm down. I love AMD, but that's a lot of speculation and ATI does not equal AMD. I wouldn't speak of them as the same entity for quite some time.


----------



## H82LUZ73 (Jul 19, 2008)

I like seeing an $800 card go down $400 in price within 2 weeks lol


----------



## TheGuruStud (Jul 19, 2008)

H82LUZ73 said:


> I like seeing an $800 card go down $400 in price within 2 weeks lol



No kidding. 

I'm thinking about 9800 GTX+ in SLI (and overclocked) when the prices fall some more.

Would be pretty hard to beat for the money.


----------



## candle_86 (Jul 19, 2008)

PCpraiser100 said:


> I hate NVIDIA because they just keep getting assisted by Microsoft and other companies whenever new things pop up, such as DX11. AMD (the good guys) are just doing what is best for us, while NVIDIA just catches up by "innocently" claiming they're the leaders in graphics solutions; they are, BUT NOT IN THEORY.
> ...



Ageia didn't give NVIDIA PhysX rights; NVIDIA owns Ageia. Ageia was going belly-up and NVIDIA bought them out, it's that simple. Ageia didn't have much of a say if they wanted to leave the market with any money.


----------



## PCpraiser100 (Jul 19, 2008)

candle_86 said:


> Ageia didn't give NVIDIA PhysX rights; NVIDIA owns Ageia. Ageia was going belly-up and NVIDIA bought them out, it's that simple. Ageia didn't have much of a say if they wanted to leave the market with any money.



Thanks for the correction, but it is true that the two companies had a relationship, with PhysX being optimized for NVIDIA, before the buyout.


----------



## candle_86 (Jul 19, 2008)

PCpraiser100 said:


> Thanks for the correction, but it is true that the two companies had a relationship, with PhysX being optimized for NVIDIA, before the buyout.



Not really. Until the buyout, Ageia did its own thing, which is why it failed; it didn't get support from any of the big players, man. If it had partnered with NVIDIA in the beginning, we'd have a lot more PhysX games today. No, the sad fact is Ageia thought they could go into this market without any support from the other players, and we all know the new kid always gets picked on unless one of the popular kids steps in to stop the fight.


----------



## PCpraiser100 (Jul 19, 2008)

candle_86 said:


> Not really. Until the buyout, Ageia did its own thing, which is why it failed; it didn't get support from any of the big players, man. If it had partnered with NVIDIA in the beginning, we'd have a lot more PhysX games today. No, the sad fact is Ageia thought they could go into this market without any support from the other players, and we all know the new kid always gets picked on unless one of the popular kids steps in to stop the fight.



Hmmm... sounds reasonable to me. I guess I had a mental block.


----------



## NinkobEi (Jul 19, 2008)

Well, considering the 4870x2 previews show it up to 80% faster than the GTX 280, unless NV drops the 280's price down to $300 it won't be able to compete.


----------



## candle_86 (Jul 19, 2008)

Yeah, man, just cool it, OK? We are all annoyed at both companies around here, actually: ATI with their drivers and NVIDIA with their pricing. Both companies are being annoying. Just sit back, grab a beer and watch the show.


----------



## candle_86 (Jul 19, 2008)

NinkobEi said:


> Well, considering the 4870x2 previews show it up to 80% faster than the GTX 280, unless NV drops the 280's price down to $300 it won't be able to compete.



I beg to differ, actually; pre-release previews show a lot of things. I'll wait until we know cold hard facts myself.


----------



## NympH (Jul 19, 2008)

I don't care if they lower it to $200, I'm buying a 4870 X2 anyway...


----------



## candle_86 (Jul 19, 2008)

Go right ahead and do so; people bought the 2900 XT over the 8800 GTS 640, and that didn't make sense either.


----------



## PCpraiser100 (Jul 19, 2008)

I'm not fumed up or anything; I was just giving out details, trying to keep the community balanced with what I know about both companies. I've been following both companies, past and present, for nearly 3 years, and I was just recounting their history like an elder lecturing the kiddies lol.


----------



## PCpraiser100 (Jul 19, 2008)

candle_86 said:


> Go right ahead and do so; people bought the 2900 XT over the 8800 GTS 640, and that didn't make sense either.



The new R700 core and its teraflop engine are really starting to shine, since it has the potential to withstand so much of what graphics has to offer, which is why it's really starting to flex its muscle in Crysis. But don't get too comfortable: DX11 has been announced, and some NVIDIA-leaning games like GRID take a real hit on these cards when Crossfired. If that game's engine lives on, don't buy unless future drivers fix it. However, there is some light at the end of the tunnel, as the Catalyst 8.7 beta drivers are previewing positive gains in 3DMark06. Google through different forums and see for yourself!


----------



## Megasty (Jul 19, 2008)

The 8.7 beta actually did give the 4870x2 a great performance boost for the previewers who used it. Some of those goofballs even used 8.5  But most of them used the 8.6 hotfix. But that beta is terribly unstable with the 4870 & 4850, & it yacks up CFx to the point where it eats away at your device manager. I'm never using any beta drivers, EVER :shadedshu


----------



## PCpraiser100 (Jul 19, 2008)

Megasty said:


> But that beta is terribly unstable with the 4870 & 4850, & it yacks up CFx to the point where it eats away at your device manager. I'm never using any beta drivers, EVER :shadedshu



Well of course; all betas are unstable to some extent. If not, then what would be the point of betas?!


----------



## dragonavenger (Jul 19, 2008)

Rash-Un-Al said:


> Very interesting editorial…
> 
> I invite you to take the following into consideration:
> 
> ...





Great Post. The reality.


----------



## twicksisted (Jul 19, 2008)

PCpraiser100 said:


> The new R700 core and its teraflop engine are really starting to shine, since it has the potential to withstand so much of what graphics has to offer, which is why it's really starting to flex its muscle in Crysis. But don't get too comfortable: DX11 has been announced, and some NVIDIA-leaning games like GRID take a real hit on these cards when Crossfired. If that game's engine lives on, don't buy unless future drivers fix it. However, there is some light at the end of the tunnel, as the Catalyst 8.7 beta drivers are previewing positive gains in 3DMark06. Google through different forums and see for yourself!



I checked out the Catalyst 8.7 beta on Google, and it seems all it is is the hotfix driver for the 48XX series cards... I hope not, as I was expecting a performance boost from the next Catalyst for my 4870 card.


----------



## imperialreign (Jul 19, 2008)

twicksisted said:


> I checked out the Catalyst 8.7 beta on Google, and it seems all it is is the hotfix driver for the 48XX series cards... I hope not, as I was expecting a performance boost from the next Catalyst for my 4870 card.



IIRC - wasn't there a thread around here that debunked the 8.7 betas as being fake?


----------



## candle_86 (Jul 19, 2008)

Meh, I can't see an 80% lead myself. Seeing how a single card performs compared to a GTX 280, it's damn near impossible without 100% efficiency, which is IMO impossible.


----------



## farlex85 (Jul 19, 2008)

candle_86 said:


> Meh, I can't see an 80% lead myself. Seeing how a single card performs compared to a GTX 280, it's damn near impossible without 100% efficiency, which is IMO impossible.



Well, in some benchmarks they actually have achieved very near 100% efficiency on some of the latest cards from ATI and NVIDIA. ATI also says they have a new way of bridging the GPUs to avoid the problems that normally arise with CF. Still, I would think an 80% average is pushing it too; more like 50-60%, maybe. We shall see, though.
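The scaling being debated here boils down to simple arithmetic. A minimal sketch; the fps figures are made up purely for illustration:

```python
# Multi-GPU scaling efficiency: how much of the second GPU's theoretical
# doubling actually shows up in frame rates. All numbers are illustrative.

def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    """Fraction of the ideal 2x speedup actually delivered (1.0 = 100%)."""
    speedup = dual_fps / single_fps   # e.g. 1.6x
    return speedup - 1.0              # extra over one card; ideal extra is +1.0

# Hypothetical example: one card does 50 fps, the dual card does 80 fps.
eff = scaling_efficiency(50.0, 80.0)
print(f"speedup {80.0 / 50.0:.2f}x, scaling efficiency {eff:.0%}")
```

By this measure, a 50-60% scaling guess means the dual card lands at roughly 1.5-1.6x a single card's frame rate.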


----------



## candle_86 (Jul 19, 2008)

Yeah, honestly, 100% isn't possible; you lose data simply through transfer. No GPU is 100% efficient, nothing can ever be. That's a rule of physics.


----------



## Megasty (Jul 19, 2008)

candle_86 said:


> Meh, I can't see an 80% lead myself. Seeing how a single card performs compared to a GTX 280, it's damn near impossible without 100% efficiency, which is IMO impossible.



You would be surprised to see how close those 2 cards are if you had them firsthand. Benchmarks are garbage; they have their biases all across the board. 
Of course, sometimes the cards are duds, but when you get great versions of binned GPUs, you can tell right off. 
There are games where the GTX280 is 25% faster than the 4870, then there are games where that closes to 10%. 
But that still means the 4870x2 would have to have 100% eff. in order to hit that 80%.
Games that could give the 4870x2 such performance already exist, & it's clear which ones they are, because they can already do 200+ fps maxed out with CFx.


----------



## candle_86 (Jul 19, 2008)

Megasty said:


> You would be surprised to see how close those 2 cards are if you had them firsthand. Benchmarks are garbage; they have their biases all across the board.
> Of course, sometimes the cards are duds, but when you get great versions of binned GPUs, you can tell right off.
> There are games where the GTX280 is 25% faster than the 4870, then there are games where that closes to 10%.
> But that still means the 4870x2 would have to have 100% eff. in order to hit that 80%.
> Games that could give the 4870x2 such performance already exist, & it's clear which ones they are, because they can already do 200+ fps maxed out with CFx.



But nonetheless, 100% is impossible for any device; physics applies to electronics, and we lose efficiency to heat, travel time, etc. 90% I can believe is possible.


----------



## farlex85 (Jul 19, 2008)

Megasty said:


> You would be surprised to see how close those 2 cards are if you had them firsthand. Benchmarks are garbage; they have their biases all across the board.
> Of course, sometimes the cards are duds, but when you get great versions of binned GPUs, you can tell right off.
> There are games where the GTX280 is 25% faster than the 4870, then there are games where that closes to 10%.
> But that still means the 4870x2 would have to have 100% eff. in order to hit that 80%.
> Games that could give the 4870x2 such performance already exist, & it's clear which ones they are, because they can already do 200+ fps maxed out with CFx.



Game fps are just other benchmarks.  Real world performance is somewhat qualitative, thus the need for benchies to quantify it.



candle_86 said:


> But nonetheless, 100% is impossible for any device; physics applies to electronics, and we lose efficiency to heat, travel time, etc. 90% I can believe is possible.



But the physics are the same for 1 card as for 2 (essentially). The amount of energy lost to heat, friction, and the like does not factor in when talking about scaling across 2 cards, b/c it is already factored into the overall performance of a single card.


----------



## Megasty (Jul 19, 2008)

candle_86 said:


> But nonetheless, 100% is impossible for any device; physics applies to electronics, and we lose efficiency to heat, travel time, etc. 90% I can believe is possible.



Yeah, and that also means that none of these GPUs are 100% eff either. So if the 4870x2 has twice the performance of the 4870, it's still not 100% eff over the 2 cores. The 4870x2 doesn't have to be 100% eff to be 80% faster than the GTX280 in 2 or 3 games. OK, I need to stop splitting hairs 



			
farlex85 said:



> Game fps are just other benchmarks.  Real world performance is somewhat qualitative, thus the need for benchies to quantify it.



I know, just like how Fraps takes away from performance too  We don't know how much it's really taking away, but at least it's taking it away in all cases
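The efficiency hair-splitting above can be checked with quick arithmetic. A sketch using the single-card gaps quoted earlier in the thread (GTX 280 leading a 4870 by 25% or by 10%); the function name and figures are just for illustration:

```python
# How much CrossFire scaling a dual 4870 needs to lead a GTX 280 by 80%,
# given how far a single 4870 trails a single GTX 280.

def required_scaling(gtx_lead: float, target_lead: float = 0.80) -> float:
    """Scaling efficiency (1.0 = a perfect 2x) the 4870x2 would need."""
    hd4870 = 1.0 / (1.0 + gtx_lead)                # single 4870, with GTX 280 = 1.0
    speedup_needed = (1.0 + target_lead) / hd4870  # x2 speedup vs. one 4870
    return speedup_needed - 1.0                    # ideal extra is +1.0

for lead in (0.25, 0.10):
    print(f"GTX 280 ahead by {lead:.0%}: x2 needs {required_scaling(lead):.0%} scaling")
```

At a 25% single-card gap the x2 would need 125% scaling (impossible); at a 10% gap it needs 98%, which is why an across-the-board 80% lead looks implausible while a win in a couple of well-scaling games does not.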


----------



## candle_86 (Jul 19, 2008)

farlex85 said:


> Game fps are just other benchmarks.  Real world performance is somewhat qualitative, thus the need for benchies to quantify it.
> 
> 
> 
> But the physics are the same for 1 card as for 2 (essentially). The amount of energy lost to heat, friction, and the like does not factor in when talking about scaling across 2 cards, b/c it is already factored into the overall performance of a single card.



True, but a single GPU will always be the more efficient way to go until game designers make their graphics engines split work to take advantage of multiple GPUs.


----------



## PCpraiser100 (Jul 19, 2008)

candle_86 said:


> True, but a single GPU will always be the more efficient way to go until game designers make their graphics engines split work to take advantage of multiple GPUs.



Yes, which is why dual-core GPUs are out. They basically shave off about 50-75 watts versus a dual-video-card setup, but have very similar performance compared to CrossFire and in some cases SLI. SLI performed so well on the 8000 series that it out-valued the 8800 Ultra by nearly $500, which was one of the only times I started liking NVIDIA. But today's CrossFireX has much faster and more reliable bandwidth than SLI, so the HD 4870 X2 really pays for itself, as it is more affordable than two HD 4870s by about $100. Not much of a difference right now, but the gap will get bigger in the future.


----------



## vsary6968 (Jul 19, 2008)

Tatty_One said:


> Well, I just read a preview of tests between the GTX280 and the 4870x2 over at XtremeSystems. The R700 lost about a third of the benches (all were at 16xx and above) but won the other two thirds by a fair margin; across the board it was 20-25% faster, and in some benches 40% faster. Of course, those were at stock speeds, so it does seem that at stock the R700 would even beat an Extreme. Previews are great but not always the complete answer; for the overclocker, though, the decision may be more interesting. I can't help but think ATi will get the pricing more realistic.





You can see that even GTX 280 SLI is only competitive with the R700. Go to these sites and check out the benchmarks: http://www.techreport.com/articles.x/15105 and http://www.anandtech.com/video/showdoc.aspx?i=3354.


----------



## farlex85 (Jul 19, 2008)

vsary6968 said:


> You can see that even GTX 280 SLI is only competitive with the R700. Go to these sites and check out the benchmarks: http://www.techreport.com/articles.x/15105 and http://www.anandtech.com/video/showdoc.aspx?i=3354.



Yes, but even with these price drops, GTX 280 SLI costs $880. To have it merely compete with a $500 card is laughable at best.
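The price/performance point is easy to put in numbers. A sketch using the thread's prices and assuming, purely for illustration, that the two setups deliver roughly the same frame rate:

```python
# Fps per dollar for GTX 280 SLI vs. a $500 R700, at an assumed equal fps.
fps = 100.0                       # illustrative common frame rate
setups = {"GTX 280 SLI": 880.0, "R700 (4870x2)": 500.0}

for name, price in setups.items():
    print(f"{name}: {fps / price:.3f} fps per dollar")
```

At equal performance, the $500 card delivers 76% more frames per dollar (880/500 = 1.76), which is the "laughable" part.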


----------



## PCpraiser100 (Jul 19, 2008)

Yes, I know the $550 GTX 280 kicks ass in SLI, but what happens if the even more kick-ass $500 HD 4870 X2 takes Crossfire for a test drive? I know there are cheaper versions of the GTX 280, but I'm using the highest-clocked GTX 280, from ASUS, which is about $550. The approximate MSRP of the HD 4870 X2 is $500, so I added that in.


----------



## NinkobEi (Jul 19, 2008)

I'm curious what Crossfired R700s will do... true, that's $1,000 in video cards... but they've gotta have some serious oomph.


----------



## vsary6968 (Jul 19, 2008)

Megasty said:


> You know, there are reasons that the 280 beats the 4870x2 in some benches. The main reason is that the drivers are what allow the 4870x2 to copy data over both GPU sets.
> When a game doesn't support that, it will behave like one 4870. In that case, the benches show that the 4870 & 4870x2 perform relatively the same.
> All the newer games that came out over the past 6 months or so support both SLI & CF.
> You won't have that problem now unless the game designers start having massive hangovers on the job.
> ...



In all the reviews and benchmarks I went through on those sites, it is a 1GB card (512 x 2); the 2GB version hasn't been sampled yet. But this 1GB card is still a lot faster than the GTX 280 already. If you can find a 2GB sample, let me know. If you go to Hexus.net, they say it is 1GB.

We cannot emphasise enough that the AMD R700, which HEXUS has tested, is an early engineering sample.

AMD has not confirmed the final AMD R700 specification or disclosed final pricing to HEXUS.

The final specifications of AMD R700 may ultimately differ in detail from those we've reported.

Some of the AMD R700 specifications detailed below are a logical extrapolation of known data relating to ATI Radeon HD 4870 in CrossFire.

Furthermore, all measured performance is very likely to improve as the product, its BIOS and its device drivers are optimised to production status, the SKU launches and becomes available to purchase.


----------



## Megasty (Jul 19, 2008)

vsary6968 said:


> In all the reviews and benchmarks I went through on those sites, it is a 1GB card (512 x 2); the 2GB version hasn't been sampled yet. But this 1GB card is still a lot faster than the GTX 280 already. If you can find a 2GB sample, let me know. If you go to Hexus.net, they say it is 1GB.
> 
> We cannot emphasise enough that the AMD R700, which HEXUS has tested, is an early engineering sample.
> 
> ...



So DH is BS? I don't think so.

http://www.driverheaven.net/reviews.php?reviewid=588


----------



## vsary6968 (Jul 19, 2008)

If you don't think so, you'll have to wait and see; I don't want to argue about something that hasn't been released yet.


----------



## candle_86 (Jul 19, 2008)

The BS sites include, but are not limited to:

The Inquirer
Tom's Hardware
Rage3D
nV News
The Register
HardOCP

Other than that, most sites tend to agree on things.


----------



## btarunr (Jul 19, 2008)

Ravenas said:


> I just think this type of posting isn't reporting news, rather it's your opinion on what could potentially occur in the very near future.
> 
> Shouldn't this be in an opinion forum, or a hardware forum for that matter? I just don't see why you didn't do that? Possibly because you wanted everyone to see your opinion
> 
> ...



Flip through your newspaper and be amazed at the number of news items that become 'merely opinions' according to your description. After quoting facts and links to facts, it is news. Newegg sells a whole fleet of stock-clocked cards for $440; that is news. GPU Cafe 'opinionates' later in the thread about a further slash; that's something to wait for, but since the $440 GTX 280 is a reality, it's not merely opinion. Wake up.


----------



## Megasty (Jul 19, 2008)

candle_86 said:


> The BS sites include, but are not limited to:
> 
> The Inquirer
> Tom's Hardware
> ...



I really don't want to say it, but the last one is BS itself. It was fashioned, formed, & derived from BS. It is the original BS; all others merely follow suit.

Sites that don't try to find out anything about what they're testing should also be included.


----------



## From_Nowhere (Jul 19, 2008)

Well that's good, GTX 280 is within my reach now. Gotta love competition (even if it is staged? )  


Oh and for those looking at sub-par tech reviews and news websites look no further than these: 

Sub-par Tech Reviews:
Anandtech
Tom's Hardware
CNET Reviews

Sub-par Tech News:
N4G 
The Inquirer
CNET News


----------



## candle_86 (Jul 19, 2008)

Anyone remember when Tom's Hardware was actually a good place to get info? What happened? When did their stuff go to crap?


----------



## Bundy (Jul 19, 2008)

btarunr said:


> Flip thru your newspaper, get amazed at the number of news items which become 'merely opinions' according to your description. After quoting facts and links to facts, it is news.  Newegg sells a whole fleet of stock-clocked cards for $440, it is news. GPU Cafe 'opionates' later in the thread of a further slash, it's something to wait for, but since $440 GTX 280 is a reality, it's not merely opinion. Wake up.



Last time I flipped through a newspaper, the opinions were firmly anchored in the opinions section. Most quality news organisations and reporters strive to improve their reputation by being as impartial as possible. In the end, this determines the nature of the site/paper and its reputation, something I think you could consider a little more before posting. 

bta, it's great that you are posting up such a huge volume of up-to-date info, and maybe I'm just showing my true bias, but I'd just prefer to see less AMD/ATI fanboi flavour in your news. If techpowerup wishes to be known as the place where Nvidia gets trashed and ATI is god, you are welcome to ignore my comments as ignorant.


----------



## btarunr (Jul 19, 2008)

bundyrum&coke said:


> Last time I flipped through a newspaper, the opinions were firmly anchored in the opinions section. Most quality news organisations and reporters strive to improve their reputation by being as impartial as possible. In the end, this determines the nature of the site/paper and its reputation, something I think you could consider a little more before posting.
> 
> bta, it's great that you are posting up such a huge volume of up-to-date info, and maybe I'm just showing my true bias, but I'd just prefer to see less AMD/ATI fanboi flavour in your news. If techpowerup wishes to be known as the place where Nvidia gets trashed and ATI is god, you are welcome to ignore my comments as ignorant.



I'm not biased. I use NV hardware, and I'm happier than anyone that the GTX 280 sells for $440 (as low as $419 AR), so I could buy one soon. NV isn't trashed in the news; pray, where? ATI isn't looked up to like a god, at least not by me. If the same article had been made by another publication and not me, I bet many wouldn't have said "oh, it's FUD, it's opinion". After putting up facts I still get the cold shoulder, and it becomes a very bad day for me at the office. So much effort in vain.


----------



## candle_86 (Jul 19, 2008)

bundyrum&coke said:


> Last time I flipped through a newspaper, the opinions were firmly anchored in the opinions section. Most quality news organisations and reporters strive to improve their reputation by being as impartial as possible. In the end, this determines the nature of the site/paper and its reputation, something I think you could consider a little more before posting.
> 
> *bta, its great that you are posting up such a huge volume of up to date info *and maybe I'm just showing my true bias but I'd just prefer to see less AMD/ATI fanboi flavour in your news. If techpowerup wishes to be known as the place where Nvidia gets trashed and ATI is god, you are welcome to ignore my comments as being ignorant.



You can't be serious here, can you? BTA gets called the biggest Nvidiot around here by the newbs running around. When did he become a fanATIc?


----------



## btarunr (Jul 19, 2008)

That's the problem, Candle. Every time I say I like a card and it happens to be NVIDIA, people call me an NVIDIOT; otherwise, ATI-biased. Why don't you just call me a 'consumer', a person who looks for value for money?


----------



## candle_86 (Jul 19, 2008)

lol, bta, I never call you either; I'm just laughing at all of this.


----------



## eidairaman1 (Jul 19, 2008)

OK, all you fools trying to start a flame war need to get the piss out, because the mods will be here to remove those who start one.


----------



## candle_86 (Jul 19, 2008)

eidairaman1 said:


> OK, all you fools trying to start a flame war need to get the piss out, because the mods will be here to remove those who start one.



What? I'm innocent this time, I swear.


----------



## Bundy (Jul 19, 2008)

btarunr said:


> I'm not biased. I use NV hardware, and I'm happier than anyone that the GTX 280 sells for $440 (as low as $419 AR), so I could buy one soon. NV isn't trashed in the news; pray, where? ATI isn't looked up to like a god, at least not by me. If the same article had been made by another publication and not me, I bet many wouldn't have said "oh, it's FUD, it's opinion". After putting up facts I still get the cold shoulder, and it becomes a very bad day for me at the office. So much effort in vain.



sorry


----------



## Tatty_One (Jul 19, 2008)

candle_86 said:


> Meh, I can't see an 80% lead myself. Seeing how a single card performs compared to a GTX 280, it's damn near impossible without 100% efficiency, which is IMO impossible.



No; across the board, from what I have read, I see 20-25%.


----------



## v-zero (Jul 19, 2008)

This suggests that the GTX 280 will be EOL relatively soon. They simply cannot afford to take these losses selling the 280 like this for an extended period, and considering the R700 is already much faster and will only improve with drivers, I still don't find the 280 massively attractive.


----------



## Tatty_One (Jul 19, 2008)

v-zero said:


> This suggests that the GTX 280 will be EOL relatively soon. They simply cannot afford to take these losses selling the 280 like this for an extended period, and considering the R700 is already much faster and will only improve with drivers, I still don't find the 280 massively attractive.



Depends really on what NVidia's profit margins were in the first place. The 280 Extreme is on the verge of release; all it is is a highly revved-up version of the 280, with around a 25% performance hike (working on the fact that both core and shader clocks are around 25% higher). That will at least make it more competitive, and competition is what keeps prices down, hopefully!

You could say that, at the end of the day, NVidia is only having to do now what ATi has been doing for 2 years... making little profit on one (or a couple of) cards; the difference probably is that NVidia can afford it.


----------



## vojc (Jul 19, 2008)

Hope the 4850/70 get cheaper.  I like NVIDIA and ATI, and I like myself too.


----------



## candle_86 (Jul 19, 2008)

The Extreme looks like it can tie the 4870 X2, and being a single-card solution it will draw less power and not have the issues of Crossfire or SLI to go with it.


----------



## Megasty (Jul 19, 2008)

Looking at the averages for the benchmarks (grr), the 4870x2 is about 25% faster than the GTX280. 
This doesn't include the uber-high differences that even have GTX280 SLI losing to the 4870x2; 
even I have a hard time believing that. Generally speaking, the Extreme does have a chance to match the 4870x2 in most games. 

But even so, this makes the regular card all the more of a better buy. It can already handle everything (I'm anti-Crysis ) 
So if it drops to or below $400 before the Extreme comes out, that's a clear time to buy it. The GTX280 carries the load & then some. 

The only catch is that both the GTX260 & 4870 can eat all the games too (I'm still anti-Crysis ) 
It all depends on what level you want to play a game at: maxed out @ 40-60+ fps or maxed out @ 70-100+ fps.


----------



## vojc (Jul 19, 2008)

25+ fps is just fine for strategy 


----------



## Hayder_Master (Jul 20, 2008)

The R700 made NVIDIA waive its vanity and principles.


----------



## newconroer (Jul 20, 2008)

PCpraiser100 said:


> NVIDIA can't defeat ATI's 4870 X2; it's going for $500, and if the green team thinks they can pull it off, think again. I've been hearing rumors that Infinity Ward (COD4's developer) is thinking about developing their next title with AMD tools instead of NVIDIA's, especially now that AMD has succeeded in conquering performance records at a price NVIDIA bows down to. With this and the upcoming 40nm 800-core parts, green games are getting burned with red flames.




Meh, let them. I have no interest in Infinity Ward. Their fame will be short-lived.


----------



## Darkrealms (Jul 21, 2008)

btarunr said:


> Why, is it dogma that TPU's news should always be borrowed? Our news is based on facts, not sources. If facts come from a source, we acknowledge them as the source of the fact. Even if Fudzilla comes up with something and has pictures to back it, we cover it. Go to Newegg and see for yourself. If more than two brands are using the same low $440 price (no POS rebate whatsoever), then it has to be a standard price. As for GTX 280 Extreme, we already covered it days ago, dig through the news archive, see for yourself. Even that was based on facts and even other sites covered that.
> 
> Edit: added links.


My apologies, I wasn't insulting you.  I was making a joke, particularly at the two comments before mine. 

I said your perspective was interesting, which it was/is.  If it weren't interesting or unique, there would have been no point in posting it.  Obviously it has sparked interest and comments; at this time there are 111 of them.  I will tell you when I disagree with something (and I have), and I will thank you when I agree or think your post is worth it (which I have).


----------

