# Preliminary Tests on GeForce GTX 295 Run, Leads Radeon HD 4870 X2



## btarunr (Dec 16, 2008)

A Chinese technology portal, IT168, has conducted a preliminary performance evaluation of the upcoming GeForce GTX 295 graphics card. The card will be NVIDIA's flagship offering, featuring two G200b graphics processors. Also provided are the first pictures of the finished product, along with a burst-shot of the card and its cooling assembly. Across several game tests, the evaluation showed the GTX 295 outperforming the HD 4870 X2 by up to 80%, while offering superior power characteristics.





*View at TechPowerUp Main Site*


----------



## douglatins (Dec 16, 2008)

Preliminary tests are just that: preliminary; they don't represent the truth. There's no driver information, and they only ran games Nvidia prefers, Fallout being the exception, and it shows hardly any real gain. Then again, it should be better, because this is effectively two GTX 280s; otherwise it would be a major fail.

And it's great to see people aren't buying this BS.


----------



## leonard_222003 (Dec 16, 2008)

Really, IT168? Those graphs look like they come straight from Nvidia; the top 5 games for them are CoD: WaW, Fallout 3, Far Cry 2, Left 4 Dead, and Dead Space.
Looks like cherry-picked games; except Fallout and Far Cry 2, the rest are not top games.
They should've picked Crysis Warhead, Crysis, or the new GTA4, and I wonder why they never pick GRID or NFS Undercover.
Until it's tested by 10 reputable websites, we don't believe anything from the makers of CUDA.


----------



## Fitseries3 (Dec 16, 2008)

As long as it doesn't burn up I'm down to get one. Selling my 4870X2s right now. I've got a few other things in the works too.


----------



## kid41212003 (Dec 16, 2008)

I see only "pure" advertising here.

NVIDIA PhysX support -> 2nd benchmark (which is not needed, and doesn't really show any true performance advantage over ATI cards).

But I'm gonna buy one if it's under ~$350, lols. Maybe 4-5 months after it launches.


----------



## newtekie1 (Dec 16, 2008)

We need to wait until the real reviews are out, and pricing information is important also.


----------



## kylew (Dec 16, 2008)

I call BS on those benches. There's no chance that the GTX295 is pulling 150FPS+ at 2560X1600 with 4AA considering current benchmarks show GTX280 SLI struggling to get 120FPS at 1680x1050 with 4AA.

Those benchmarks are totally misleading, especially the PhysX one. Why would you compare a game run on two different computers, where one is doing physics on the CPU while the other does it on the GPU, when everyone knows the GPU is faster?

That, and the fact that ATi cards don't officially do PhysX on the GPU, there's no point in them benching it and using it as a comparison.


----------



## mdm-adph (Dec 16, 2008)

Did they even need to show the PhysX benchmark?   

The gaming benchmarks are higher, but not as high as I thought they'd be, especially since I'm sure Nvidia hand-picked these games.


----------



## [I.R.A]_FBi (Dec 16, 2008)

rumor garbage.


----------



## Fitseries3 (Dec 16, 2008)

Benchmarks aside....

3 GTX 260s are SICK, so just imagine what 4 of them could do. That's what I'm going off of... but I may just be dumb.


----------



## PCpraiser100 (Dec 16, 2008)

Most of the time, the performance between the two is too close to consider looking into, unless you really like turning up custom AA in Dead Space or you wanna SLI them together after getting a promotion at work. Either way, I don't think the card will be price-competitive AT ALL; there are two GTX 280s AND PCBs slabbed into one package, for crying out loud. PRICE COMPETITIVE FAIL! At least, thanks to the two PCBs, it should have a bit better OC capability...


----------



## btarunr (Dec 16, 2008)

douglatins said:


> Preliminary tests are just that: preliminary; they don't represent the truth. There's no driver information, and they only ran games Nvidia prefers, Fallout being the exception, and it shows hardly any real gain. Then again, it should be better, because this is effectively two GTX 280s; otherwise it would be a major fail.
> 
> And it's great to see people aren't buying this BS.



Capt. Obvious, hence it's called a prelimnary evaluation: a test whose details usually aren't disclosed, often influenced by the company. These charts are just as realistic as the ones we got before the HD 4870 X2 release, which showed it outperforming the GTX 280 by up to 60%, with high average gains throughout. The cores are part GTX 260, part GTX 280 (they have 240 SPs, but a 448-bit memory bus), so you can't really call it 2x GTX 280, also given that the system interface bandwidth is shared between the two GPUs.

In any case, the component GPUs on this card are stronger than the RV770s on an HD 4870 X2. This card won't have a problem outperforming that.
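A minimal sketch of the "part GTX 260, part GTX 280" point, lining up the commonly cited per-GPU configurations side by side; the exact figures weren't confirmed at the time, so treat them as illustrative rather than official specs:

```python
# Commonly cited per-GPU configurations (illustrative, not confirmed specs):
# shader processors (SPs), memory bus width, and frame buffer per GPU.
gtx260 = {"sps": 192, "bus_bits": 448, "mem_mb": 896}
gtx280 = {"sps": 240, "bus_bits": 512, "mem_mb": 1024}
gtx295_per_gpu = {"sps": 240, "bus_bits": 448, "mem_mb": 896}

# Each GTX 295 GPU pairs the GTX 280's shader count with the
# GTX 260's memory subsystem, hence "part GTX 260, part GTX 280".
assert gtx295_per_gpu["sps"] == gtx280["sps"]
assert gtx295_per_gpu["bus_bits"] == gtx260["bus_bits"]
assert gtx295_per_gpu["mem_mb"] == gtx260["mem_mb"]
```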



[I.R.A]_FBi said:


> rumor garbage.



funny how I never saw such posts when we covered prelim benches (the same as this story) for R700.


----------



## indybird (Dec 16, 2008)

I could believe these, but the presentation is missing a lot of info.

If these cards are more than $500 they won't sell.  They should probably be sold around $450.

-Indybird


----------



## mdm-adph (Dec 16, 2008)

btarunr said:


> funny how I never saw such posts when we covered prelim benches (the same as this story) for R700.



Strange, I remember tons.    Fanboyism has always been rampant on all sides.


----------



## TRIPTEX_CAN (Dec 16, 2008)

I really don't think they will be less than $600 USD but I could be wrong. 

It should be a good card though. I wonder what ATI will do to counter.


----------



## [I.R.A]_FBi (Dec 16, 2008)

btarunr said:


> funny how I never saw such posts when we covered prelim benches (the same as this story) for R700.




Can't I become more vocal?

But you do have a point.

But how is posting a benchmark in a field where the competitor has no competing hardware not garbage, though?


----------



## newtekie1 (Dec 16, 2008)

indybird said:


> I could believe these, but the presentation is missing a lot of info.
> 
> If these cards are more than $500 they won't sell.  They should probably be sold around $450.
> 
> -Indybird



So the card is going to outperform the HD 4870 X2 (and I don't think anyone here is doubting that; we just don't know by how much yet), but you say it won't sell unless they price it drastically below the HD 4870 X2's price? What are you smoking, and can I have some?



mdm-adph said:


> Strange, I remember tons.    Fanboyism has always been rampant on all sides.



http://forums.techpowerup.com/showthread.php?t=65382

That is the prelim thread for R700; not a whole lot of ATi bashing going on there for putting out the same marketing crap as nVidia is here. In fact, IRA is one of the first to make a post implying what a wonderful day it was for ATi.


----------



## [I.R.A]_FBi (Dec 16, 2008)

newtekie1 said:


> So the card is going to outperform the HD 4870 X2 (and I don't think anyone here is doubting that; we just don't know by how much yet), but you say it won't sell unless they price it drastically below the HD 4870 X2's price? What are you smoking, and can I have some?



At six bills they will sell.


----------



## mdm-adph (Dec 16, 2008)

newtekie1 said:


> http://forums.techpowerup.com/showthread.php?t=65382
> 
> That is the prelim thread for R700; not a whole lot of ATi bashing going on there for putting out the same marketing crap as nVidia is here. In fact, IRA is one of the first to make a post implying what a wonderful day it was for ATi.



One example does not a rule make.


----------



## [I.R.A]_FBi (Dec 16, 2008)

mdm-adph said:


> One example does not a rule make.



Still no one has addressed:

> But how is posting a benchmark in a field where the competitor has no competing hardware not garbage, though?


----------



## ShadowFold (Dec 16, 2008)

The reason more people are saying this looks fake is that Nvidia does this stuff all the time. AMD does too, don't get me wrong; they're just more likeable being the underdog, and I think a lot of people want(ed) to see them on top again. Nvidia was on top for the longest time..


----------



## CDdude55 (Dec 16, 2008)

If I get rich I would SLI two on my 680i.


----------



## ShinyG (Dec 16, 2008)

I hope this doesn't mean the start of a new "chart-war" started by obscure sites citing "official" sources!


----------



## InnocentCriminal (Dec 16, 2008)

As always, I'll await the real reviews. Should be interesting to see how well this card does with AA enabled. I don't think we need SLi or Crossfire, single cards FTW!


----------



## allen337 (Dec 16, 2008)

Just another way for Nvidia to sell a video card for $800; if the shoe was on the other foot, AMD would do the same. The good thing about this is AMD will be dropping prices again, and Nvidia will be on top. I like ATI cards better than Nvidia, so I hope it triples the 4870 X2's score. Give ya $100 for one of those old outdated 4870 X2s, FIT. ALLEN


----------



## buggalugs (Dec 16, 2008)

ShinyG said:


> I hope this doesn't mean the start of a new "chart-war" started by obscure sites citing "official" sources!



If the past is anything to go by, Nvidia will charge a king's ransom for these. Then in 2 months the new range will come out with the same performance for 1/3 of the price.

 I'll stick with my 4870.


----------



## InnocentCriminal (Dec 16, 2008)

buggalugs said:
			
		

> I'll stick with my 4870.



Smart move, I really don't see the need for this card unless you have over a 26" monitor and want to max out your games.


----------



## Lu(ky (Dec 16, 2008)

One thing I learned about Nvidia is that they like to "LIE THEIR ASSES OFF" when it comes to benchmarks. Their CEO should go to jail for all the B.S. he pulled months back. I am no fanboy at all; this 4870 X2 is the first ATI card I've ever owned. But they want us to jump back and forth and spend our $$$ on something with very little improvement.


----------



## [I.R.A]_FBi (Dec 16, 2008)

In between all of the stress... "preliminary" is spelt incorrectly.


----------



## EastCoasthandle (Dec 16, 2008)

*In my opinion*

It is hard to gauge the accuracy of this information, as it appears to me to be just marketing. However, what I would watch is how AMD will respond (based on how it performs), as rumors are surfacing that Lil Dragon is ready (by whatever definition that means).

Another point of interest will be how this product is priced. It will either be competitive or priced "as usual". That can branch off into meaning two things, but I'll hold off on that speculation for now.

Another point of interest is the availability of the 55nm part. I recall news that NV told investors (or whoever) that 55nm was in full production.

If someone is looking to upgrade, I would wait until mid-January 2009 to see how the dust settles between this gen and next gen GPUs. In particular:
- see actual performance numbers between cards
- see how AMD responds (if it's just a price drop, for example, I would wait)
to see which way one should go.


----------



## Castiel (Dec 16, 2008)

This seems pretty sweet. I wanna see what they get in Tri-SLI on Crysis.


----------



## EastCoasthandle (Dec 16, 2008)

Odd, when I click on the link I get a blank page.


----------



## kid41212003 (Dec 16, 2008)

The link is dead; it redirects you to the homepage, where the smiling face says "this link is no longer available".

J.k., I'm not sure what it actually said, but I think that's what it meant.


----------



## newtekie1 (Dec 16, 2008)

mdm-adph said:


> One example does not a rule make.



It does when it is the only example.


----------



## [I.R.A]_FBi (Dec 16, 2008)

newtekie1 said:


> It does when it is the only example.


 What about the other half of my post?


----------



## DarkMatter (Dec 16, 2008)

[I.R.A]_FBi said:


> What about the other half of my post?



It's not garbage. Nvidia cards have PhysX, Ati cards don't. It was Ati's call. That's what you will get in actual games with PhysX enabled. The chart states it clearly "PhysX Performance". It should not matter to anyone who doesn't care about PhysX.


----------



## newconroer (Dec 16, 2008)

indybird said:


> I could believe these, but the presentation is missing a lot of info.
> 
> If these cards are more than $500 they won't sell.  They should probably be sold around $450.
> 
> -Indybird



Nah, let them go out at $600. 770s were $550 for a good while, and if these outdo them, then no one can complain.

Unfortunately those tests are pointless; they don't mention anything about minimum framerates.




Lu(ky said:


> One thing I learned about Nvidia is that they like to "LIE THEIR ASSES OFF" when it comes to benchmarks. Their CEO should go to jail for all the B.S. he pulled months back. I am no fanboy at all; this 4870 X2 is the first ATI card I've ever owned. But they want us to jump back and forth and spend our $$$ on something with very little improvement.





Uh... that's why you wait until something is released and get the proper reviews, and I mean the PROPER ones, from actual users, not those half-brained 'independent' _reviewers_.

You purchased your X2 on good sources, right? Or did you...


----------



## btarunr (Dec 16, 2008)

EastCoasthandle said:


> Odd, when I click on the link I get a blank page.









What I get is:

> Sorry, the page cannot be opened right now; too many people may be visiting it at the same time. Please try refreshing the page, or return to the IT168 home page.



Good thing we got the pics before they became unavailable.


----------



## Apocolypse007 (Dec 16, 2008)

I wonder why they disappeared?


----------



## Selene (Dec 16, 2008)

I think they will hit at $550-$600, then drop down to whatever the 4870 X2 is at after about 30 days.
I also don't trust the benchmarks; I'm sure there's a lot of fudging there.
I do think this card will beat the 4870 X2 most of the time, but not all; this is based on 2x 55nm GTX 260s, not the 280 as one of the people above said, so it will be cheaper than 2x 280s but probably about the same as 2x 260s.


----------



## btarunr (Dec 16, 2008)

Apocolypse007 said:


> I wonder why they disappeared?



Unavailable due to excess local traffic. It shouldn't matter since it was in Chinese anyway.


----------



## mdm-adph (Dec 16, 2008)

newtekie1 said:


> It does when it is the only example.



...that anyone could bother to find.  

Look -- I'm not going to turn this into another useless pissing match about whose cards are better and whose fanboys aren't as virulent.  Both sides complain needlessly about the other -- both are guilty of the same things.  

However, if you want to continue researching the forums to do a fanboy-complaint-vs-compliment analysis, feel free.


----------



## Tatty_One (Dec 16, 2008)

leonard_222003 said:


> Really, IT168? Those graphs look like they come straight from Nvidia; the top 5 games for them are CoD: WaW, Fallout 3, Far Cry 2, Left 4 Dead, and Dead Space.
> Looks like cherry-picked games; except Fallout and Far Cry 2, the rest are not top games.
> They should've picked Crysis Warhead, Crysis, or the new GTA4, and I wonder why they never pick GRID or NFS Undercover.
> Until it's tested by 10 reputable websites, we don't believe anything from the makers of CUDA.



Do you call them NVidia games because the GTX 260 largely runs them faster? Whichever way you look at it, most of the newest and most popular games run better at most (not all) resolutions on a GTX 260 216 than on an HD 4870 512MB/1GB, and not all of those games have "The Way It's Meant to be Played" on the startup screen. What's a reviewer supposed to do? Not test the most popular games in case NVidia wins a few more than ATi, so they can't be accused of running tests only on games optimised for NVidia? C'mon, if the 5 most popular games at the moment are being tested then that is relevant, because don't we want to see how these cards perform on the games being played today, on today's hardware, by the majority of players?


----------



## mdm-adph (Dec 16, 2008)

Tatty_One said:


> Do you call them NVidia games because the GTX 260 largely runs them faster? Whichever way you look at it, most of the newest and most popular games run better at most (not all) resolutions on a GTX 260 216 than on an HD 4870 512MB/1GB, and not all of those games have "The Way It's Meant to be Played" on the startup screen. What's a reviewer supposed to do? Not test the most popular games in case NVidia wins a few more than ATi, so they can't be accused of running tests only on games optimised for NVidia? C'mon, if the 5 most popular games at the moment are being tested then that is relevant, because don't we want to see how these cards perform on the games being played today, on today's hardware, by the majority of players?



I'd like to think that any game with Nvidia's "meant to be played" badge should be excluded from reviews, *unless it's added with a caveat*, like "this game has been programmed with Nvidia's help to work better with Nvidia hardware."

The same goes for any ATI-badged game -- can't think of one right now, though.


----------



## Tatty_One (Dec 16, 2008)

ShadowFold said:


> The reason more people are saying this looks fake is that Nvidia does this stuff all the time. AMD does too, don't get me wrong; they're just more likeable being the underdog, and I think a lot of people want(ed) to see them on top again. Nvidia was on top for the longest time..



Was?... they still have much more market share than ATi, so keep cheering for the underdog!


----------



## Tatty_One (Dec 16, 2008)

mdm-adph said:


> I'd like to think that any game with Nvidia's "meant to be played" badge should be excluded from reviews, *unless it's added with a caveat*, like "this game has been programmed with Nvidia's help to work better with Nvidia hardware."
> 
> The same goes for any ATI-badged game -- can't think of one right now, though.



I see where you are coming from, but personally I can't agree with ya there. If the ten most played games are never tested, then how is the gamer to know how well a card performs with them? The average consumer is not in the slightest bit interested in whether NVidia pays "development costs" or supports the game with marketing; the average consumer wants to know which card performs on which game at the best bang for buck. Also, bear in mind that, as I said, "The Way It's Meant to be Played" does not show on all ten... also, ATi does win some benches on TWIMTBP games.


----------



## mdm-adph (Dec 16, 2008)

Tatty_One said:


> I see where you are coming from, but personally I can't agree with ya there. If the ten most played games are never tested, then how is the gamer to know how well a card performs with them? The average consumer is not in the slightest bit interested in whether NVidia pays "development costs" or supports the game with marketing; the average consumer wants to know which card performs on which game at the best bang for buck. Also, bear in mind that, as I said, "The Way It's Meant to be Played" does not show on all ten... also, ATi does win some benches on TWIMTBP games.



1)  Never said the games can't be tested and reviewed -- just that the ones with Nvidia's badge need an added caveat.  Perhaps these games could be tested on Nvidia's hardware, alone?

2)  The average consumer isn't the slightest bit interested in video card reviews, either.   The enthusiast is, however -- and if an enthusiast isn't concerned that some game benchmarks may be slanted towards a certain manufacturer, they should be.


----------



## J-Man (Dec 16, 2008)

Outperform my card by up to 80%? Haha, doubt it. My card is gonna be king for a long time.


----------



## pentastar111 (Dec 16, 2008)

All arguing aside... I think it will be great if it does indeed outperform the 4870 X2. The closer the competition between the two companies, the harder they try to best each other, which results in lower-priced, higher-performing tech for us...


----------



## spearman914 (Dec 16, 2008)

Fake!! The GTX 295 can't beat the 4870 X2 by that much.


----------



## Tatty_One (Dec 16, 2008)

mdm-adph said:


> 1)  Never said the games can't be tested and reviewed -- just that the ones with Nvidia's badge need an added caveat.  Perhaps these games could be tested on Nvidia's hardware, alone?
> 
> 2)  The average consumer isn't the slightest bit interested in video card reviews, either.   The enthusiast is, however -- and if an enthusiast isn't concerned that some game benchmarks may be slanted towards a certain manufacturer, they should be.



OK, using your argument, no ATi card owner would get a review showing how his card performs in the newest games if the game said "TWIMTBP"... I doubt that would work, and contrary to your belief, I think the "average consumer" has only a game or GFX card review to go by before they select their once-in-2-3-years card. The guys down my street that game, not particularly hardcore, have never been to a tech forum in their life; they either buy a PC mag or go to the mag's online reviews to make sure the 8400GS they are considering will run Crysis at 1920x1200.


----------



## erocker (Dec 16, 2008)

I'm interested in seeing some GTX285 results.  This 295 is going to be an excellent card for sure, though I want to stick with a single GPU.





That doesn't seem unbelievable to me.^^


----------



## Selene (Dec 16, 2008)

I'm not sure not reviewing games with either company's badge is a good thing. Most people are not like us; we read the reviews and benchmarks a lot, and that helps us decide on upgrades. So if I play game "A" and game "C", I find the info I need and then get the card that does the best.
If you only test games that don't have support from either one, it would be harder to find which card is best for that person.
Also, 90% of people don't even know about sites like this, so they are blind and only read what's on the box, and we all know NV and ATI are very misleading; the whole 2GB/1GB thing on the X2 cards is really, really iffy. Almost none of my gamer friends even knew that SLI and CF don't combine the memory that way, so some of them are ticked off at both sides.


----------



## wolf (Dec 16, 2008)

All in all those benchmarks look fairly realistic; can't wait to see this card overclocked.


----------



## Melvis (Dec 16, 2008)

I was hoping this card would be built the same as the 3870 X2 and 4870 X2, with two GPUs on one board, but no, it's the same old way with two cards stuck together =/ Don't they run like SLI when they're done like this, unlike the X2s? Or am I wrong? I've heard back when the 7950GX2 was out you had to enable SLI for both cards to run, so I thought this would be the same?
I might have bought this card instead of the 4870 X2, but IDK now.


----------



## wolf (Dec 16, 2008)

Melvis said:


> I was hoping this card would be built the same as the 3870 X2 and 4870 X2, with two GPUs on one board, but no, it's the same old way with two cards stuck together =/



Some people, like myself, think this design is actually better than the dual Radeons, believe it or not.

If for no other reason (though there are others), it's because both GPUs are cooled at the same time, i.e. at the same temp.


----------



## phanbuey (Dec 16, 2008)

Haha... just as expected. People with limited experience with SLI on current hardware knocked this card, but in reality these are the results, AT STOCK. This card will OC past 700; the Radeon 4870 X2 never had a chance. 448-bit bus for OCing heeeeaaaaveeennnn.


----------



## erocker (Dec 16, 2008)

Dual-GPU setups may be a thing of the past soon; there is speculation that the dual RV8xx series card is based on *this*. I like the Nvidia design for their dual cards, though it's obviously not as cost effective as ATi's. Plus, if Nvidia were to try putting two massive GTX chips on one PCB, the card would have to be huge.


----------



## phanbuey (Dec 16, 2008)

erocker said:


> Dual gpu setups may be a thing of the past soon. There is speculation that the dual RV8xx series card is based on *this*  I like the Nvidia design for thier dual cards, though it's obviously not as cost effective as ATi's design.  Plus, if Nvidia were to try putting two massive GTX chips on one pcb, the card would have to be huge.



i remember reading about this during IBM's intercooled chip debut... cool stuff, but i bet we're still 4 or 5 years away... and even then... ATI 9780X2 3D INTERCOOLED EDITION!!!!


----------



## WarEagleAU (Dec 16, 2008)

The GTX 280 is a monster of a card, but I don't see it winning in the power draw department (let's face it, the number of transistors makes it impossible), and I don't see the advantages as being that HUGE. I think it will do better, but not by the margin they are showing. Also, the 1GB 4870 makes up for the 512MB 4870 against the GTX 260 216.


----------



## WarEagleAU (Dec 16, 2008)

As an aside, it's interesting to note that since ATI places the 4870 in the realm of the 260 (and now the 260 216), Nvidia is taking its top GPU, or a blend of the two, and using that to trounce the 4870 X2.


----------



## insider (Dec 17, 2008)

Price/performance anyone?

Unless it sells for very close to the 4870 X2, not many will care; not many bothered with the 4870 X2 to start with!

I'd rather they release something close to/slightly slower than/exceeding the 4870 X2's performance but cheaper; that is something more people will buy.


----------



## a_ump (Dec 17, 2008)

Hmm, interesting. I'd say some of those marks would be possible, but given that SLI'd GTX 280s, which would technically be superior to the GTX 295, don't outscore an HD 4870 X2, I don't see how this card could, being two GPUs between 280 and 260 performance (GTX 270, I'd like to call them). There isn't a way these are accurate; Nvidia may have done some hardcore driver optimizing and whatnot, but I still say sketchy. IDC about this card, it's nothing new, just current parts put together, but I am looking forward to seeing the RV870 benchmarks.


----------



## Tatty_One (Dec 17, 2008)

insider said:


> Price/performance anyone?
> 
> Unless it sells for very close to the 4870 x2 not many will care, not many bothered with the 4870 x2 to start with!
> 
> I'd rather they release something close to/slightly slower than/exceeding the 4870 X2's performance but cheaper; that is something more people will buy.



Agreed, but these cards are only ever aimed at about 3-5% of all graphics card sales; they are a flag-waver for the rest of the (more affordable) cards in the range.

So am I right in saying the 295 is actually the equivalent of two souped-up GTX 280s ('cause I thought it was 2 x 240 shaders) with higher clock speeds and 55nm?


----------



## a_ump (Dec 17, 2008)

Tatty_One said:


> Agreed, but these cards are only ever aimed at about 3-5% of all graphics card sales; they are a flag-waver for the rest of the (more affordable) cards in the range.
> 
> So am I right in saying the 295 is actually the equivalent of two souped-up GTX 280s ('cause I thought it was 2 x 240 shaders) with higher clock speeds and 55nm?



Yes and no; it's more like a souped-up GTX 260, since it still has the 448-bit bus with 896MB for each chip, but it has 240 shaders per chip. GTX 270, IMO ^_^


----------



## Tatty_One (Dec 17, 2008)

a_ump said:


> Yes and no; it's more like a souped-up GTX 260, since it still has the 448-bit bus with 896MB for each chip, but it has 240 shaders per chip. GTX 270, IMO ^_^



Naaaa, the 448-bit bus and 896MB of memory are more than enough; that shouldn't hold them back at all... maybe slightly if you're gaming above 1920 resolutions on a 30+ inch... IDK.
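For rough numbers behind that claim: peak memory bandwidth is just bus width in bytes times the effective memory clock. The clocks below are approximate launch-era GDDR3 figures, so this is a sketch under assumed specs, not official data:

```python
def mem_bandwidth_gbps(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# Assumed effective GDDR3 clocks (approximate, not confirmed):
gtx280_bw = mem_bandwidth_gbps(512, 2214)          # ~141.7 GB/s
gtx295_per_gpu_bw = mem_bandwidth_gbps(448, 1998)  # ~111.9 GB/s

# Each GTX 295 GPU ends up with roughly 20% less peak bandwidth than a
# GTX 280, a gap that mostly shows at very high resolutions with AA.
```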


----------



## a_ump (Dec 17, 2008)

Tatty_One said:


> Naaaa, the 448-bit bus and 896MB of memory are more than enough; that shouldn't hold them back at all... maybe slightly if you're gaming above 1920 resolutions on a 30+ inch... IDK.



I agree; I think if the GTX 260 did have 240 SPs it would perform the same as the GTX 280, except at 2560x1600.


----------



## imperialreign (Dec 17, 2008)

btarunr said:


> funny how I never saw such posts when we covered prelim benches (the same as this story) for R700.



there were - but not as rampant - at least the ATI-provided slides for their preliminary benches are set up to appear more . . . possible. They try not to go overboard with what they're selling, typically are willing to say which CAT beta they're running . . . and they do have a track record for being *somewhat* more accurate than nVidia with preliminaries . . .

These slides are anything but . . .

Although, interesting to say the least - if this card does perform anywhere near that well (though I doubt it), it'll shift market focus back towards nVidia . . . I wonder what ATI has cooked up for their "trump" 2 weeks later


----------



## Melvis (Dec 17, 2008)

wolf said:


> Some people, like myself, think this design is actually better than the dual Radeons, believe it or not.
> 
> If for no other reason (though there are others), it's because both GPUs are cooled at the same time, i.e. at the same temp.



That's fair enough if someone likes the design of their card, but I personally don't like the way they're done. I can see these cards getting very hot, with not much room between the cards to improve on cooling; you couldn't put a bigger cooler on it, so you'd have to stick with what it comes with, or maybe later get water cooling. Whereas the ATi dual-GPU cards' coolers can be changed, and Gainward and Palit have done so, very nicely too.
Don't get me wrong, I like Nvidia over ATI, but with the ATi X2s I really like their design. Personal choice.

No one has said if these cards, or the older versions, have to be turned on like SLI does. A mate has an SLI rig and it's a total pain in the ass, nothing but trouble =/ and not a lot of games support SLI still, though soon they will.


----------



## Tatty_One (Dec 17, 2008)

Melvis said:


> That's fair enough if someone likes the design of their card, but I personally don't like the way they're done. I can see these cards getting very hot, with not much room between the cards to improve on cooling; you couldn't put a bigger cooler on it, so you'd have to stick with what it comes with, or maybe later get water cooling. Whereas the ATi dual-GPU cards' coolers can be changed, and Gainward and Palit have done so, very nicely too.
> Don't get me wrong, I like Nvidia over ATI, but with the ATi X2s I really like their design. Personal choice.
> 
> No one has said if these cards, or the older versions, have to be turned on like SLI does. A mate has an SLI rig and it's a total pain in the ass, nothing but trouble =/ and not a lot of games support SLI still, though soon they will.



Theoretically you are right... in practice, the 9800GX2 ran cooler than the 3870 X2, I believe.


----------



## Valdez (Dec 17, 2008)

I think it will be slower than gtx280sli, but faster than gtx260/216sli. It's enough to beat the 4870x2 in most cases.


----------



## Kursah (Dec 17, 2008)

All I can say is this harkens back to what ATI was doing with the 4870 X2 "preliminary" graphs such as this, also with higher performance in Far Cry 2 vs a GTX 260 by A LOT (4870 vs 260, that is)... lol, and unfortunately we all know how that turned out. I hate this prelim crap; both sides are guilty. IMO it's doing its job and stirring up the enthusiasts, getting some hyped up about it and some pissed off about it. No matter how you look at it, everyone who has posted here has been affected by the news in the OP or by someone else's post about those very results.

I don't really care much for these results... I don't think the performance increase will be THAT drastic... but I'm sure it will be one helluva card. The 9800GX2 is still a damn fine card for those that want a simpler SLI setup in a single card. I'm a single-card, single-GPU kind of guy; when my card isn't good enough I'll upgrade. I'm curious to see how hot these run, and what the actual facts and results are.


----------



## CDdude55 (Dec 17, 2008)

Wonder how much the GTX 285 will be; probably in the $450-$500 range.


----------



## a_ump (Dec 17, 2008)

Valdez said:


> I think it will be slower than gtx280sli, but faster than gtx260/216sli. It's enough to beat the 4870x2 in most cases.



wow that's all the GTX285 is? well at least they did decent with sticking to their naming scheme, so basically an oc'd GTX 280 will perform the same as GTX285, i wonder when they're going to come out with the GTX265.


----------



## DarkMatter (Dec 17, 2008)

a_ump said:


> wow that's all the GTX285 is? well at least they did decent with sticking to their naming scheme, so basically an oc'd GTX 280 will perform the same as GTX285, i wonder when they're going to come out with the GTX265.



That's what it was supposed to be from the start. But yeah, an overclocked GTX280 will perform the same, but at much higher power consumption and potentially higher temps. The GTX285 should overclock far better too.


----------



## Melvis (Dec 17, 2008)

Tatty_One said:


> Theoretically you are right......in practice, the 9800GX2 ran cooler than the 3870x2 I believe



Yea, the 3870x2 and 4870x2 do run very hot; even this 3850 im running now runs damn hot, at 80c under load:shadedshu but at least with all these cards you can update and put a better cooler on them to bring down their temps, and therefore overclock them higher to get better performance without the risk of overheating.

When i buy a GPU i always look at the cooling on the card, as i like a nice cool-running system, so the heat doesn't also heat up the inside of the case and make other components run hotter as well.


----------



## wolf (Dec 17, 2008)

a_ump said:


> wow that's all the GTX285 is? well at least they did decent with sticking to their naming scheme, so basically an oc'd GTX 280 will perform the same as GTX285, i wonder when they're going to come out with the GTX265.



yeah but look at the power consumption and die size.

this card should be easier to cool, with more oc headroom. yes it's not much better than the original 280, but just think of it as a newer core stepping.


----------



## CDdude55 (Dec 17, 2008)

My 8600 GTS gets hot as hell, sadly.:shadedshu


----------



## wolf (Dec 17, 2008)

well at GTX285 speeds a GTX280 would be consuming considerably more than 236W; i think it's a testament to the 55nm shrink, they've done well so far 

the power difference is 22.5% with both cards at stock; at the same clock speeds, i bet it's over 25%

not bad for a ~15% die shrink, right! they must have done some core steppings/revisions along the way to squeeze that sort of power saving out of it.

and +1, this card should overclock very nicely. anyone here want GTX280 specs running at 800 core?  i know i do
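For what it's worth, those percentages check out as back-of-the-envelope arithmetic. A minimal sketch, assuming the commonly quoted stock board power figures of 236 W for the GTX 280 and 183 W for the GTX 285 (the exact TDP values are the only assumption here):

```python
# Sanity-check the percentages quoted above.
# Assumed figures: GTX 280 stock TDP = 236 W, GTX 285 stock TDP = 183 W,
# process shrink from 65 nm to 55 nm (linear feature size).

gtx280_tdp = 236.0  # watts, stock
gtx285_tdp = 183.0  # watts, stock

power_saving = (gtx280_tdp - gtx285_tdp) / gtx280_tdp
shrink = 1.0 - 55.0 / 65.0

print(f"power saving at stock clocks: {power_saving:.1%}")  # ~22.5%
print(f"linear die shrink 65->55 nm:  {shrink:.1%}")        # ~15.4%
```

So the ~22.5% stock-power difference really does outpace the ~15.4% linear shrink, which supports the point that stepping/revision work went in beyond the optical shrink alone.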


----------



## btarunr (Dec 17, 2008)

erocker said:


> Dual gpu setups may be a thing of the past soon. There is speculation that the dual RV8xx series card is based on *this*  I like the Nvidia design for their dual cards, though it's obviously not as cost effective as ATi's design.  Plus, if Nvidia were to try putting two massive GTX chips on one pcb, the card would have to be huge.



Dual-core and Quad-core CPUs could never put multi-socket to rest, although that's what was talked about when the Pentium D was launched 

Today we have multi-socket setups comprising those dual/quad-core chips. The same could happen here. The GTX 280 wasn't quite a cost-effective design either (although credit goes to the great way in which NVIDIA dealt with its partners to bring the price down).


----------



## Binge (Dec 17, 2008)

I like what I'm seeing.  Let's hope Wiz gets a review sample as soon as they are available!


----------



## farlex85 (Dec 17, 2008)

Damn dual cards. *growl* Sure they put up ridiculous numbers, but you pay out the ass for them, and in some games they perform worse than a single card. Power, sure, but where's the refinement and efficiency? Anybody can slap a couple cards together and charge double the price (more, actually, sometimes). I wanna see some innovative engineering; this is becoming more mundane than tick-tock.......


----------



## a_ump (Dec 17, 2008)

+1, i agree, and that's why i want to see what ATI is doing with the RV870, or their gen after that. There were rumors that the RV870 would actually be dual-chip like a Pentium D, but i think that was shot down; hopefully next gen it will happen. i'd like to see the performance of 2 gpu's with a ring bus.


----------



## eidairaman1 (Dec 17, 2008)

a_ump said:


> +1, i agree, and that's why i want to see what ATI is doing with the RV870, or their gen after that. There were rumors that the RV870 would actually be dual-chip like a Pentium D, but i think that was shot down; hopefully next gen it will happen. i'd like to see the performance of 2 gpu's with a ring bus.



Ring Bus was the Flaw of the R670.


----------



## a_ump (Dec 17, 2008)

flaw yes, but flaws can be fixed, as can all problems, one way or another. i don't know much about this ring bus; i just know (or think) that they'd have to use one for 2 gpu's to share the same memory pool or w/e. there was also the idea that a 512-bit memory bus was fail, but nvidia is doing just fine with one. though there is that non-active sideport on the HD 4870x2 that would allow direct communication between the 2 chips, which supposedly would negate micro-stuttering or at least minimize it greatly. wonder if they ever got it working or if it'll make a difference, and if it does, whether they've been waiting to activate it in case nvidia did do a GX2 card.


----------



## eidairaman1 (Dec 17, 2008)

AMD has Architectural Charts of the R670 vs the R770.


----------



## Hayder_Master (Dec 17, 2008)

Looks good, but we can't depend on a first test; we know these first tests are often unclear and inaccurate. But if this card beats 2x GTX260, that means it will surely beat the 4870x2; if not, this result will be forgotten. And i'm sure ATI will respond somehow; i think they will release a driver that activates the bridge between the two GPUs in the 4870x2. Sure, that driver would only support the 4870x2.


----------



## Super XP (Dec 17, 2008)

AGREED!!!!!

Nobody is talking about the HD 4870x2's secret "Sideport" weapon which can only be enabled through a driver update. Right now the HD 4870x2 is king, and if this new Nvidia card is a little faster it will  only drive the HD 4870x2 down in price which is a good thing. 

Read the review on this site about the added "sideport" of an extra 5 GB/s both ways. 
http://www.techpowerup.com/reviews/Sapphire/HD_4870_X2/

Can this be a secret weapon to increase speed in games at high res?


----------



## DarkMatter (Dec 17, 2008)

Super XP said:


> AGREED!!!!!
> 
> Nobody is talking about the HD 4870x2's secret "Sideport" weapon which can only be enabled through a driver update. Right now the HD 4870x2 is king, and if this new Nvidia card is a little faster it will  only drive the HD 4870x2 down in price which is a good thing.
> 
> ...



Short answer. NO.


----------



## Pixelated (Dec 17, 2008)

kid41212003 said:


> I see only "pure" advertise here.
> 
> NVIDIA PhysX support -> 2nd benchmark (which is not needed, and doesn't really show any true performance over ATI cards).
> 
> But, I'm gonna buy one if it's under ~$350, lols. Maybe after 4-5 months when it launched.



Ouch! Yeah this thing will be EOL after 3 months. What a waste of resources that Nvidia could be using toward their next architecture. Also if this thing isn't engineered perfectly it could turn out to be a HUGE disaster that not even Nvidia's PR dept will be able to spin. Let's hope it doesn't idle at 80c.


----------



## a_ump (Dec 18, 2008)

Super XP said:


> AGREED!!!!!
> 
> Nobody is talking about the HD 4870x2's secret "Sideport" weapon which can only be enabled through a driver update. Right now the HD 4870x2 is king, and if this new Nvidia card is a little faster it will  only drive the HD 4870x2 down in price which is a good thing.
> 
> ...



i had already mentioned this on an earlier page, and i hope this would improve performance. i know if it were to work as it should, it would improve performance, but i don't think they got it working. Probably will on the HD 5870x2.



			
eidairaman1 said:

> AMD has Architectural Charts of the R670 vs the R770.



this means very little; it just means they changed the architecture of the gpu because they couldn't get it working, it doesn't mean it won't work. Did you know the Radeon X1000 series used a ring bus as well? For some reason the ring bus didn't work out with the 2900XT. Did you know Intel plans to use a ring bus on Larrabee? they're not a stupid company, so it must still have its uses


----------



## ThorAxe (Dec 18, 2008)

kylew said:


> I call BS on those benches. There's no chance that the GTX295 is pulling 150FPS+ at 2560X1600 with 4AA considering current benchmarks show GTX280 SLI struggling to get 120FPS at 1680x1050 with 4AA.
> 
> Those benchmarks are totally misleading, especially the PhysX one. Why would you compare a game ran on two different computers, where one is doing physics on the CPU, while the other is on the GPU, when everyone knows the GPU is faster?
> 
> That, and the fact that ATi cards don't officially do PhysX on the GPU, there's no point in them benching it and using it as a comparison.



Actually a single GTX 260+ gets 132FPS at 2560x1600 4xAA 16AF.

http://www.pcper.com/article.php?aid=645&type=expert&pid=3


----------



## a_ump (Dec 18, 2008)

yea but that's at dx9, big difference.


----------



## dmwxr9 (Dec 18, 2008)

douglatins said:


> Preliminary tests, are all but preliminary, they don't represent the truth. They don't have any driver information and run only Nvidia preferable games, Fallout being the exception, which shows hardly any real gain. But then again it should be better, because this is actually 2 GTX280, or it would be a major fail
> 
> And its great to see people are not taking this BS




What do you mean by this being 2 GTX280's? Are you saying that it has the performance of 2 280's, or that they used 2 280's for this benchmark?


----------



## a_ump (Dec 18, 2008)

technically the GTX 260 and GTX 280 are the same chip, remember. so these 2 chips on the GTX 295 are their own variant: GTX 280 shader specs (240 SPs) but the GTX 260 memory design (448-bit bus, 896MB of memory per chip). so it should perform below GTX 280 SLI but above GTX 260 SLI.


----------



## Super XP (Dec 18, 2008)

a_ump said:


> this means very little; it just means they changed the architecture of the gpu because they couldn't get it working, it doesn't mean it won't work. Did you know the Radeon X1000 series used a ring bus as well? For some reason the ring bus didn't work out with the 2900XT. Did you know Intel plans to use a ring bus on Larrabee? they're not a stupid company, so it must still have its uses


I believe ATI was going to go with the Ring Bus memory once again with the HD 4870's. Thanks to AMD and the good old "Crossbar Switch", they've taken the performance crown away from Nvidia.

Thanks to Microsoft heavily investing in ATI's R&D for the XBOX 360, ATI had the ability to architecturally design the "Ring Bus Memory Controller" for the XBOX 360. With little effort, they applied the same design to their graphics cards. 

In theory the Ring Bus should have performed much more efficiently than it currently does, but IMO they need more time to perfect the technology. 

This is why AMD went with the Crossbar Switch, which is quite similar to what is found in the Athlon 64, I believe.


----------



## DarkMatter (Dec 18, 2008)

The problem with a ring bus is that you need at least 2 similarly fast, bandwidth-hungry units attached to it to make sense, or failing that, more than one fast memory pool. In GPUs that's not the case: you have 1 GPU and one memory pool, so building a costly pathway between them doesn't make sense perf/price or perf/watt wise. The ring bus is similar to the PCI bus in that the different units attached to it have to be arbitrated and given some "time". Being so, the ring bus spent a lot of its time feeding non-performance-related units that didn't need all the bus power, cannibalizing power that could be used by the performance parts. It's like building a 12-lane highway just because sometimes 12 trucks are gonna go through it, and then allocating the use of that highway to a lot of lonely cars while the trucks have to wait.

I might not be correct on this one, but in the Xbox 360 the ring bus connected the GPU, the CPU, the main memory and the embedded memory, and of course many other "minor" things with DMA access. That's 4 performance units instead of two, and many more units which required access to memory. So there it does make more sense. It makes sense to build a shared big highway instead of many dedicated small pathways.
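The highway analogy can be made concrete with a toy time-sliced arbiter. This is a deliberately oversimplified round-robin model to illustrate the arbitration point, not a description of how ATI's ring bus actually worked:

```python
# Toy round-robin bus arbiter: every attached client gets a fixed time
# slice whether it needs the bandwidth or not, so one hungry client
# (the GPU core) shares cycles with many light DMA-style clients.
# Oversimplification for illustration only.

def round_robin_throughput(demands, cycles):
    """demands[i] = words/cycle client i wants; one client is served per cycle."""
    served = [0.0] * len(demands)
    for cycle in range(cycles):
        client = cycle % len(demands)      # fixed slice, no demand-awareness
        served[client] += min(demands[client], 1.0)
    return served

# One bandwidth-hungry GPU core plus three light clients.
demands = [1.0, 0.05, 0.05, 0.05]
served = round_robin_throughput(demands, 1000)
print(f"GPU share of bus cycles: {served[0] / 1000:.0%}")                 # 25%
print(f"GPU share of demanded traffic: {demands[0] / sum(demands):.0%}")  # 87%
```

The "trucks" end up with a quarter of the highway even though they generate almost ninety percent of the traffic, which is exactly the cannibalization described above.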


----------



## Tatty_One (Dec 18, 2008)

dmwxr9 said:


> What do you mean by this being 2 GTX280's? Are you saying that it has the performance of 2 280's, or that they used 2 280's for this benchmark?



The GTX295 will contain two 55nm 240sp GPUs, so the same shader count as two current GTX280's, but with the memory and bus of the 260.....although because of the 55nm process they should be clocked much higher......I think that's the plan.


----------



## drayzen.ocau (Dec 18, 2008)

kylew said:


> I call BS on those benches. There's no chance that the GTX295 is pulling 150FPS+ at 2560X1600 with 4AA considering current benchmarks show GTX280 SLI struggling to get 120FPS at 1680x1050 with 4AA.
> 
> Those benchmarks are totally misleading, especially the PhysX one. Why would you compare a game ran on two different computers, where one is doing physics on the CPU, while the other is on the GPU, when everyone knows the GPU is faster?
> 
> That, and the fact that ATi cards don't officially do PhysX on the GPU, there's no point in them benching it and using it as a comparison.



It might be possible to achieve those frame rates if they are using the Lucid Hydra, which has been rumored to achieve 100% scaling with multiple GPU's.
Considering Lucid have kept their lips sealed about who they're working with, it's quite possible.
Or perhaps those smartys at NVIDIA have come up with their own version...


----------



## a_ump (Dec 18, 2008)

Interesting i had not heard of this till you linked it, and no i don't think nvidia got or made a chip like that, it's just regular SLI working on the GTX 295. i see the performance but now i want to know the price.


----------



## DarkMatter (Dec 18, 2008)

a_ump said:


> Interesting i had not heard of this till you linked it, and no i don't think nvidia got or made a chip like that, it's just regular SLI working on the GTX 295. i see the performance but now i want to know the price.



Just look at the front page and prepare your wallet.


----------



## Super XP (Dec 19, 2008)

Great point. You make a lot of sense in the matter. It explains why the XBOX 360 does just fine in gaming. Who knows, maybe AMD/ATI will use an updated version of the Ring Bus for their GPU/CPU combo platform. The upcoming HD 5000 series from what I've read will not be multi-GPU, but should perform more than 150% faster than the current HD 4000 series. The HD 6000 which is already designed is a different story, it may be based on the long awaited Multi-GPU design which may come out with the Ring Bus...

NVIDIA's new card does look impressive, I mean it took them well over 6 months to come out with it. I think ATI should not counter this card, just let it be, they've already made a lot of money on the HD 4000 series. And I am happy to have XFX join ATI

What do you guys think? Is physics going to be mainstream? Intel and ATI do not want to support it.

*A quick note, and off topic, SORRY, but HOW the HELL do you open and close DOORS in Left 4 Dead PC????????? It's killing me, and the user manual does not say anything about it.* 


DarkMatter said:


> The problem with a ring bus is that you need at least 2 similarly fast, bandwidth-hungry units attached to it to make sense, or failing that, more than one fast memory pool. In GPUs that's not the case: you have 1 GPU and one memory pool, so building a costly pathway between them doesn't make sense perf/price or perf/watt wise. The ring bus is similar to the PCI bus in that the different units attached to it have to be arbitrated and given some "time". Being so, the ring bus spent a lot of its time feeding non-performance-related units that didn't need all the bus power, cannibalizing power that could be used by the performance parts. It's like building a 12-lane highway just because sometimes 12 trucks are gonna go through it, and then allocating the use of that highway to a lot of lonely cars while the trucks have to wait.
> 
> I might not be correct on this one, but in the Xbox 360 the ring bus connected the GPU, the CPU, the main memory and the embedded memory, and of course many other "minor" things with DMA access. That's 4 performance units instead of two, and many more units which required access to memory. So there it does make more sense. It makes sense to build a shared big highway instead of many dedicated small pathways.


----------



## InnocentCriminal (Dec 19, 2008)

Super XP said:

> A quick note, and off topic, SORRY, but HOW the HELL do you open and close DOORS in Left 4 Dead PC????????? It's killing me, and the user manual does not say anything about it.



I believe it's *E*.

On topic, Bit-Tech have a preview on the GTX 295.


----------



## a_ump (Dec 19, 2008)

well, what a shame. true, it's an engineering sample, but those numbers aren't near what projections were. the HD 4870x2 is better right now by the number of games it comes out on top in, but there's still a lot nvidia can do driver-wise to tweak the GTX 295 to its peak potential. they've already gotten 10-15% more performance out of the GTX260/280 since their release. it'll be close imo, win some and lose some, but i don't see it toppling the HD 4870x2 in performance or price.


----------



## [I.R.A]_FBi (Dec 19, 2008)

Super XP said:


> Great point. You make a lot of sense in the matter. It explains why the XBOX 360 does just fine in gaming. Who knows, maybe AMD/ATI will use an updated version of the Ring Bus for their GPU/CPU combo platform. The upcoming HD 5000 series from what I've read will not be multi-GPU, but should perform more than 150% faster than the current HD 4000 series. The HD 6000 which is already designed is a different story, it may be based on the long awaited Multi-GPU design which may come out with the Ring Bus...
> 
> NVIDIA's new card does look impressive, I mean it took them well over 6 months to come out with it. I think ATI should not counter this card, just let it be, they've already made a lot of money on the HD 4000 series. And I am happy to have XFX join ATI
> 
> ...



Re physics implementation: it shall be included in DX11.


----------



## TRIPTEX_CAN (Dec 19, 2008)

Super XP said:


> *A quick note, and off topic, SORRY, but HOW the HELL do you open and close DOORS in Left 4 Dead PC????????? It's killing me, and the user manual does not say anything about it.*



Are you joking?  

If not... the key to open doors is the "use" key. Same thing for picking up weapons and items.


----------



## CDdude55 (Dec 19, 2008)

lol


----------



## mdm-adph (Dec 19, 2008)

TRIPTEX_MTL said:


> Are you joking?
> 
> If not... the key to open doors is the "use" key. Same thing for picking up weapons and items.



Hey, let's be easy on him.    Maybe this is his first time playing a Source game.

Got to admit, the first time I played Quake Wars I couldn't figure out what keys did anything -- I'm used to "E" being used for pretty much everything, like in Source games


----------



## CDdude55 (Dec 19, 2008)

True, kind of like in F.E.A.R. where ''F'' (thank god it's right next to ''D'') is to open things. Every game just needs to use ''E'' as the standard (which most do).


----------



## Tatty_One (Dec 20, 2008)

CDdude55 said:


> True, kind of like in F.E.A.R. where ''F'' (thank god it's right next to ''D'') is to open things. Every game just needs to use ''E'' as the standard (which most do).



One day we will get interactive voice control, so you can just walk up to a door in the game, say "open" into a mic, and it does.


----------



## Super XP (Dec 21, 2008)

TRIPTEX_MTL said:


> Are you joking?
> 
> If not... the key to open doors is the "use" key. Same thing for picking up weapons and items.


I should give myself a kick in the arse 

I forgot to set up that key in the keyboard settings. But don't worry, I gave myself a 

I also use the mouse with my left hand and the keyboard with my right. So, this is why I have to change keyboard settings with every single game I install.


----------



## farlex85 (Dec 21, 2008)

Tatty_One said:


> One day we will get interactive voice control, so you can just walk up to a door in the game, say "open" into a mic, and it does.



Have you played Endwar?


----------



## Tatty_One (Dec 21, 2008)

farlex85 said:


> Have you played Endwar?



No, why? does it have voice control/recognition?


----------



## eidairaman1 (Dec 21, 2008)

Ya, that's what Ghost Recon Advanced Warfighter needed; that menu-based system leaves you dead quick.


----------



## farlex85 (Dec 21, 2008)

Tatty_One said:


> No, why? does it have voice control/recognition?



It does, the entire game is played through voice commands. As of yet a console-only title though.


----------



## SiliconSlick (Dec 28, 2008)

*That's strange*



mdm-adph said:


> Strange, I remember tons.    Fanboyism has always been rampant on all sides.



I'm not sure where you were reading, because it is more than clear to me that the reds have a burning desire in every thread to swear by their red card and put down NVidia - and it has also become apparent that the main reason is their empty wallet.


----------



## SiliconSlick (Dec 28, 2008)

*the GTX295 KICKS BUTT IN ALL GAMES*



douglatins said:


> Preliminary tests, are all but preliminary, they don't represent the truth. They don't have any driver information and run only Nvidia preferable games, Fallout being the exception, which shows hardly any real gain. But then again it should be better, because this is actually 2 GTX280, or it would be a major fail
> 
> And its great to see people are not taking this BS



Nice dream - too bad it's not going to pan out for you. WHY, you say ?

" The winner today of course is the GeForce GTX 295 and considering we were using an early engineering sample card with a very beta driver, things are looking real good for NVIDIA.

Also despite the fact we were limited to testing a handful of games,  we internally of course did run the majority of benchmarks with other games already. And the performance widespread is consistent  and the card worked with any game we threw at it. "

http://www.guru3d.com/article/geforce-gtx-295-preview/14

Yes, I know, it's very hard to take. Your world is crumbling.


----------



## ShadowFold (Dec 28, 2008)

So it's gonna be $500, right? I might grab one when they're like $200-$250, like the 9800GX2 is now.


----------

