# New 2900XT (new drivers) vs 8800 GTS Review



## d44ve (Jun 14, 2007)

OK, first off.... *PLEASE DO NOT SHIT IN THIS THREAD IF YOU DO NOT LIKE THE RESULTS*

I am posting this because I see they use new drivers and it seems to have helped the 2900XT a bit. 

I love my 8800, but I am always happy to see when other people I know with the 2900XT can get an extra boost for their card. I couldn't care less about which one is better. It wouldn't make me happy if a friend of mine bought a card and it sucked. So this isn't about that.

Once again, I thought it was interesting... Also, this was posted on the [H]; I hope there are no problems with me posting it.

thanks!

Here is the link... I will try to put the actual review in this thread.

*forgive me if this has been posted or old news.*

Thanks

2900XT w/ new drivers vs 8800GTS


Introduction 


We are going to make this evaluation short, sweet and to the point since we have already performed a major evaluation of the ATI Radeon HD 2900 XT. Please read that evaluation first to get the lowdown on the new ATI Radeon HD 2900 XT video card and all the specifications therein. 


In our initial evaluation we made comparisons with a 640 MB GeForce 8800 GTS as well as a GeForce 8800 GTX. We found that the 640 MB GeForce 8800 GTS bested the Radeon HD 2900 XT in everything we threw at it. Our conclusion was that the 640 MB 8800 GTS was the better value; you simply get more for your money. What if there is an even better value now?


The ATI Radeon HD 2900 XT is closer in price to the 640 MB GeForce 8800 GTS. At the time of launch we did not feel it was necessary to compare the Radeon HD 2900 XT to the 320 MB version of the 8800 GTS; after all, the only difference between the two 8800 GTS configurations is memory capacity. However, prices on the 320 MB 8800 GTS have fallen drastically since that launch. You can find video cards such as this factory-overclocked Leadtek for $259.99 with all the rebates, or this standard-clocked EVGA for $289.99 before rebate and $269.99 after rebate.


When you compare this to the lowest price we could find on Newegg for the ATI Radeon HD 2900 XT, $409.99, you will see a very large gap in price. What is surprising, though, is the large gap in performance as well, and it doesn’t swing the way you would typically think. We are taught that you get what you pay for, that the more expensive item will be the better item, but in this case we might be seeing a complete reversal of that!


We know from testing that there is little performance difference between the 320 MB and 640 MB GeForce 8800 GTS video cards. The GPU is exactly the same on both, with the same clock speeds; the only difference is the amount of RAM. With the performance of the 320 MB 8800 GTS being close to the 640 MB 8800 GTS, and prices being so low, we had to see how it compares to the more expensive ATI Radeon HD 2900 XT.


We are going to jump straight to gaming on the next page. For system setup specifications look here. We are using the latest drivers officially supplied by ATI, known as 8.37.4.2; these drivers include all the performance tweaks found in 8.38, which have been rolled up into Cat 7.5.

____________________________________________________________________________________________________

Page One





Elder Scrolls IV: Oblivion

(DirectX 9)


Oblivion uses the multi-platform Gamebryo game engine. Oblivion features DirectX 9 shaders and Havok physics. The engine supports lush vegetation, soft shadows, and high dynamic range lighting (HDR). Oblivion also features SpeedTree for rendering trees.


For testing we have chosen to do a manual run-through riding horseback from outside the Imperial City to Chorrol and on to Bruma. This run-through allows us to push the hardware as hard as the game can. While this is an outdoor run-through, we make sure to test indoor situations in our gameplay analysis as well. We have found that turning on the torch indoors with HDR lighting takes a big performance hit in some situations, so we make sure to test this scenario. You really have to look at the game as two different scenarios: outdoors and indoors.














In Oblivion we found the 320 MB GeForce 8800 GTS matched the 640 MB GeForce 8800 GTS’s gameplay experience. Framerates were slightly lower, but still within an acceptable range for smooth gameplay. The framerates are more on par with the ATI Radeon HD 2900 XT’s, but notice that the Radeon HD 2900 XT had to run with lower grass detail. The 320 MB and 640 MB GeForce 8800 GTS handled the grass faster than the Radeon HD 2900 XT in our experience. This is odd because the Radeon HD 2900 XT has more memory bandwidth, and the grass, which uses alpha textures, is a memory-bandwidth-limited function. Still, it is what it is: the 320 MB GeForce 8800 GTS is able to deliver a better gaming experience in Oblivion.




Image Quality












The difference between the lower 25% grass setting and the half grass setting can clearly be seen in the screenshots above. As you zoom out farther in third-person view, the grass disappears sooner at the lower grass-distance setting. At 50% grass, all of the grass remains visible at full zoom-out.

________________________________________________________________________________________________________

Page Two


Battlefield 2142

(DirectX 9)


Battlefield 2142, the fourth game in the Battlefield series, is a first-person shooter set in a future environment. For our gameplay evaluation we found settings that would be playable in single-player mode as well as in multiplayer games with very large player counts.


For AF, we force it from the control panel in order to reach higher levels such as 16X, for which there are no in-game options. We will also force AA from the control panel if the in-game slider does not give us enough options.


Our manual run-through for the graphed gameplay evaluation covers an entire map of the “Cerbere Landing” single-player mission.












There is no mistaking the advantage both GeForce 8800 GTS based video cards bring to Battlefield 2142 over the ATI Radeon HD 2900 XT. Even the lowly 320 MB GeForce 8800 GTS is able to smoothly run Battlefield 2142 at 1600x1200 with 16X Transparency Supersampling Antialiasing and full in-game settings. The ATI Radeon HD 2900 XT could manage 4X Performance Adaptive AA at 1600x1200. In Performance Adaptive AA mode the Radeon HD 2900 XT takes fewer samples on alpha textures than it does for multisampled edge antialiasing; in this case it takes the equivalent of 2X AA samples on alpha textures and 4X AA samples on edges. Compare this to 16X AA samples on both edges and alpha textures on both GeForce 8800 GTS based video cards. The performance simply is not there to go any higher on the ATI Radeon HD 2900 XT.
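To make the sample-count comparison concrete, here is a minimal sketch (purely illustrative Python; the per-mode numbers are the ones stated above, not figures from vendor documentation) of the effective AA samples each card applies to polygon edges versus alpha textures such as grass and fences:

```python
# Effective AA samples per pixel, as described in the text above.
# Transparency Supersampling applies the full sample count to alpha
# textures; "Performance" Adaptive AA takes roughly half as many
# samples on alpha textures as it does on edges.
aa_modes = {
    # mode: (edge samples, alpha-texture samples)
    "8800 GTS, 16X Transparency Supersampling AA": (16, 16),
    "HD 2900 XT, 4X Performance Adaptive AA": (4, 2),
}

for mode, (edge, alpha) in aa_modes.items():
    print(f"{mode}: {edge}x on edges, {alpha}x on alpha textures")
```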


Note that the short downspike to 23 FPS on the 320 MB GeForce 8800 GTS occurred while standing directly over an exploding grenade, a situation that results in the character dying, so the framerate drop costs nothing in terms of gameplay experience.




Image Quality


Screenshots below are in PNG format to eliminate compression artifacts and range in file size from 1 to 3 MB.











The above screenshots illustrate the superior image quality of 16X Transparency Supersampling in Battlefield 2142.



____________________________________________________________________________________________________


*Page Three*


S.T.A.L.K.E.R. Shadow of Chernobyl

(DirectX 9)


S.T.A.L.K.E.R. Shadow of Chernobyl is a highly anticipated game from GSC that was recently released in the USA. THQ published the game on Wednesday, March 21st, 2007, and by the end of the week it was the number one selling PC game and the eighth best-selling game on any platform.

S.T.A.L.K.E.R. takes place in the "zone", a depopulated area around the burned-out sarcophagus of the famous nuclear power reactor at Chernobyl, in Ukraine. It is an open-ended, non-linear style of game, viewed from the first person. It combines some of the role-playing elements of Oblivion with tactical combat strongly reminiscent of Counter-Strike. Players must eat and rest to maintain their energy level, use bandages to patch up lacerations, and take anti-radiation medication (or vodka!) to counteract the effects of the radiation contamination that is so rampant in the area. Mutants, military, and mercenaries roam the area, hunting for stalkers and artifacts.


The game effectively creates a "dirty" look and feel, and the graphics are quite demanding on hardware, with full DX9 dynamic lighting support. Our manual run-through consists of the first few missions the player undertakes and moves between a small village, a country road, a farm, and a tunnel.



Rendering Modes


S.T.A.L.K.E.R. is an interesting beast to test since it supports some rather large differences in rendering ability. There are three “Render” modes: “Static Lighting”, “Objects Dynamic Lighting” and “Full Dynamic Lighting”. The Static Lighting mode runs in DirectX 8 with light maps. The Objects Dynamic Lighting mode runs in DirectX 9 with light maps. Finally, the Full Dynamic Lighting mode runs in DirectX 9 with full dynamic lights.


There is a very large difference between Static Lighting and Objects Dynamic Lighting in this game; the former uses DX8 and the latter DX9. With Static Lighting, the “Lighting Distance” and “Shadow Quality” options are disabled. With Objects Dynamic Lighting these options are available, and they have a very large impact on the game's visual quality. Without them, running in DX8 mode under Static Lighting, the game has a uniform lighting scheme that does not change with the environment. It basically gives the game a ‘flat’ appearance; there is less bump mapping and fewer of the techniques that give a 3D feel to textures on objects. The game also looks very dated under Static Lighting. However, Static Lighting provides a tremendous performance boost, and bumping up to Objects Dynamic Lighting incurs a very large performance hit.
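For reference, the three render modes and what each one uses can be restated in a short sketch (hypothetical Python; the mapping is taken from the description above, not from the game's source):

```python
# S.T.A.L.K.E.R. render modes -> (API, lighting technique), per the text above.
render_modes = {
    "Static Lighting": ("DirectX 8", "light maps only"),
    "Objects Dynamic Lighting": ("DirectX 9", "light maps + dynamic object lights"),
    "Full Dynamic Lighting": ("DirectX 9", "fully dynamic lights and shadows"),
}

for mode, (api, lighting) in render_modes.items():
    print(f"{mode:26s} -> {api}: {lighting}")
```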












In our testing today this is the only game where we saw a significant performance difference between the 320 MB GeForce 8800 GTS and the 640 MB GeForce 8800 GTS. We had to lower some settings on the 320 MB GeForce 8800 GTS in order to get smooth gameplay. Notably, we had to lower the anisotropic filtering level to 4X instead of the 8X used on the 640 MB GeForce 8800 GTS. This is somewhat baffling, since AF is more processing-intensive than bandwidth-intensive like AA. However, it seems higher levels of AF hurt performance in S.T.A.L.K.E.R. more with less physical video card memory. We also had to lower grass to 50% density for performance reasons, which lessens the fullness of the grass in the outdoor world.


Though the 320 MB GeForce 8800 GTS was slower than the 640 MB GeForce 8800 GTS, it was still much faster than the ATI Radeon HD 2900 XT in S.T.A.L.K.E.R. With the Radeon HD 2900 XT we not only had to drop to 1280x1024, we had to lower the rendering mode to “Objects Dynamic Lighting” instead of the “Full Dynamic Lighting” that both GeForce 8800 GTS based video cards enjoyed. Dropping to this mode disables sun shadows and grass shadows and uses lightmaps instead of dynamic lights. We were, however, able to run at 8X AF thanks to the 512 MB framebuffer. The experience was simply not as good with this video card in S.T.A.L.K.E.R.



Image Quality


Screenshots below are in PNG format to eliminate compression artifacts and range in file size from 1 to 3 MB.














The above screenshots illustrate the difference between Full Dynamic Lighting and Objects Dynamic Lighting. Shadow detail is greatly reduced for all objects that cast shadows, including trees. In the second screenshot you can really see the difference between lightmaps and real-time dynamic lighting.


____________________________________________________________________________________________________


*Page Four*


continued...

Lost Planet Demo

(DirectX 9)


Lost Planet is a popular Xbox 360 game that is coming to the PC this summer. The demo was recently released in both a DirectX 9 version and a DirectX 10 version, making this one of the first new games to ship with DirectX 10 support in Vista. The game is wildly popular, and fun. The story goes like this:



Rescued from a veil of ice with only fragments of memory, Wayne Holden struggles to recover his past on a blizzard-ridden world swarming with deadly aliens. With only treacherous snow pirates and the mysterious NEVEC Corporation remaining, can anyone be trusted or is everything lost?



This is a third-person shooter that spans many different terrain types, indoors and out, with a lot of particles, effects, weather and detail. The game supports HDR, motion blur, intense soft shadows and some of the best smoke and explosions we’ve seen in a game.


Today we are going to test the DirectX 9 version in Windows XP. One neat capability of the demo is its built-in performance test. This performance test is unique, however; it does not work like the scripted “timedemo” playback loops you are used to. It is one of the first performance tests to run in real time using all the features and capabilities of the game. All of the game’s AI and physics run in real time just as they would while you are playing. That means each run of the test produces slightly different results, just like a FRAPS run-through, because everything is dynamic. We have tested this performance test against a manual run-through and found it to be in line with a manual FRAPS run-through; we get close to the same results.
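For readers curious how a FRAPS-style run-through turns into the numbers graphed below, here is a minimal sketch (assuming a hypothetical render_frame function standing in for one iteration of the game loop; this is not FRAPS or Capcom code) of logging per-frame times and reducing them to average and minimum framerates. Because the frame workload is dynamic, two runs will never produce identical numbers:

```python
import time

def benchmark(render_frame, duration_s=60.0):
    """Time every frame for duration_s seconds, then summarize.

    render_frame is a stand-in for one iteration of a game's loop
    (AI, physics, rendering), so its cost varies from run to run."""
    frame_times = []
    start = prev = time.perf_counter()
    while prev - start < duration_s:
        render_frame()                  # dynamic work -> varying cost
        now = time.perf_counter()
        frame_times.append(now - prev)  # seconds spent on this frame
        prev = now
    fps = [1.0 / t for t in frame_times if t > 0]
    return {
        "avg_fps": sum(fps) / len(fps),
        "min_fps": min(fps),            # the downspikes reviewers call out
        "max_fps": max(fps),
    }
```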


Though it acts like a FRAPS run-through, it still takes the human element out, which is part of the equation when actually playing the game. Framerates are one thing, but getting into the game and “feeling” it is also important to the gameplay experience, and is really the best way to find what settings are playable in a game and how it looks. Therefore, for this game we dive in and play it, as we do other games, to find the playable settings. For the graph below, however, we use the performance test to show the framerates.













There are many settings in Lost Planet and some of them drastically alter performance depending on the situation, indoors or outdoors. We found that performance in the snow-covered outdoor landscape is generally low in this game, but once you move indoors performance skyrockets. There is, however, one indoor section of the performance test where performance is very slow: the second, “Cave” test. We suspect the full version of the game may include many levels like this, so it is an important part of testing performance in this game. Because of the difference in performance between outdoor and indoor environments, keeping gameplay smooth throughout the game means really high indoor framerates and framerates in the low 30s outdoors.


In our testing we wanted to raise every in-game option possible for the highest gameplay experience, even if we had to sacrifice resolution. We found that we could get almost everything at the highest level if we dropped to 1280x960 on all three video cards; this was simply the best balance between resolution and in-game settings. Moving to 1600x1200 would have meant dropping many in-game settings, making the game look worse.


It seems that motion blur and HDR cause a very large performance hit in this game. We opted to keep the HDR level at High and move the motion blur level down to Low, which really helped the framerates. The motion blur, while a neat effect, can be distracting and annoying at times and is, in our opinion, a little overdone. Turning it to Low actually helps by increasing the framerates and, in our opinion, making the game look better. We also had to drop shadow resolution to Medium; we found this to affect performance as well and did not see a large difference in image quality from lowering it. Beyond that we kept AF at 8X and found the game to be playable on all three video cards at these same settings.
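Summarized as a config sketch (option names paraphrased from the demo's menu as described above; this is not Lost Planet's actual configuration format), the balance we settled on looks like this:

```python
# The settings balance described above, used on all three video cards.
lost_planet_settings = {
    "resolution": "1280x960",       # best balance of resolution vs. settings
    "hdr": "High",
    "motion_blur": "Low",           # large framerate win; also looks better to us
    "shadow_resolution": "Medium",  # little visible image-quality loss
    "anisotropic_filtering": "8X",
}

for option, value in lost_planet_settings.items():
    print(f"{option:22s} = {value}")
```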


Overall we found some interesting results between the ATI Radeon HD 2900 XT and the GeForce 8800 GTS in the Lost Planet demo. Enabling 2X AA on the ATI Radeon HD 2900 XT causes a very large performance hit, dropping framerates into the teens. Enabling 2X AA on the 320 MB GeForce 8800 GTS, by contrast, cost only about 5 FPS. That was still enough to make it unplayable, but it is interesting that the HD 2900 XT takes a much larger performance hit once AA is enabled, losing about 15 FPS in our testing.


Without AA enabled, though, we noticed something else interesting. The ATI Radeon HD 2900 XT is, for the most part, faster than both GeForce 8800 GTS based video cards in this demo in the outdoor environments. However, in the second performance test, “Cave”, the HD 2900 XT struggled and dropped below both GeForce 8800 GTS based video cards. You can see this on the graph from the 177-second mark onward.


I honestly don’t know what this will mean for the full version of the game. If I were to take a guess right now with the DirectX 9 version, it would seem that outdoors the HD 2900 XT may be a little faster, but indoors with a lot of shader and HDR usage the HD 2900 XT could struggle. I don’t know specifically what effects are used in the “Cave” test, but from its appearance it looks like a lot of occlusion mapping, HDR and high texture detail.



Image Quality


Screenshots below are in PNG format to eliminate compression artifacts and range in file size from 1 to 3 MB.






In the screenshots above we are comparing medium-resolution shadows to high-resolution shadows. The major difference we see is that with high-resolution shadows the edges of the shadow are better defined, which makes details like the gun turret clearer. This matches what we witnessed throughout the game: with high-resolution shadows, character shadows had better-defined edges. Since you spend the game running through and blasting everything in sight, however, running at medium shadow resolution really cost nothing in terms of gameplay experience in this demo.






Now we come to low motion blur versus high motion blur. Motion blur is something that is experienced in motion; it is hard to capture the difference in a screenshot, since the velocity of the object affects the blur in a static shot. Still, we managed to grab a representative example of the extreme amount of blurring that can happen with high motion blur versus low. There is simply a point at which the motion blur becomes too much; in our opinion low looks much better and is less annoying while playing the game.









Here is a comparison between the ATI Radeon HD 2900 XT and the GeForce 8800 GTS 320 MB. The only difference we see is that the Radeon HD 2900 XT produces a darker character shadow and a darker character texture tone. This could simply be a product of different gamma levels between the cards, though in the game we used the default brightness and contrast levels and made no changes to either. We will have to examine the full version of the game in more detail to see if this difference continues in other parts of the game.









(Radeon HD 2900 XT)









(GeForce 8800 GTS 320 MB)


Here are some full-size screenshots to look at, just for fun.



____________________________________________________________________________________________________


*Page Five*


Overall Performance Summary


In Oblivion we found the 320 MB and 640 MB GeForce 8800 GTS based video cards performed faster than the ATI Radeon HD 2900 XT. Oddly enough, they were able to handle higher grass-distance settings despite the Radeon HD 2900 XT having much higher memory bandwidth.


Battlefield 2142 showed a large difference in the gameplay experience between the ATI Radeon HD 2900 XT and both GeForce 8800 GTS video cards. Even with the much less expensive 320 MB GeForce 8800 GTS we were able to play the game smoothly at 1600x1200 with 16X Transparency Supersampling, with no problems at all in intense gun fights with massive explosions. The more expensive ATI Radeon HD 2900 XT could not handle anything higher than 4X Performance Adaptive AA at 1600x1200.


S.T.A.L.K.E.R. also separated these video cards on performance. The ATI Radeon HD 2900 XT was clearly the weaker performer. We had to lower the rendering quality to “Objects Dynamic Lighting” and run at 1280x1024 to get playable performance; unfortunately, this diminishes the gameplay experience compared to the GeForce 8800 GTS based video cards. With the NVIDIA-based cards we were able to take the game up to full rendering quality and play at 1600x1200, though with the 320 MB version we had to drop the AF level to 4X and grass density to 50%.


Lost Planet is a fun game, plain and simple; we had a blast playing through the demo. If this is the future of gaming then we are very happy. There is no question that next-generation titles will require fast hardware to keep up with the intense detail. This demo presented some interesting results for us. We found that the ATI Radeon HD 2900 XT really does take a large performance hit when enabling AA, to the point where it just isn’t a viable option right now. The GeForce 8800 GTS based video cards, on the other hand, don’t take as great a hit, and some gamers may find 2X AA or more playable depending on what framerates they are comfortable with.


In Lost Planet’s outdoor areas the ATI Radeon HD 2900 XT, without AA, performs slightly better than both GeForce 8800 GTS based video cards. However, in the one indoor area of the performance test, called “Cave”, we saw its framerates suffer and fall behind the GeForce 8800 GTS based video cards. We cannot wait until the full version is released so we can test all the levels and see how the video cards really compare throughout the entire game.



Display Resolution – 1920x1200


In our first evaluation we tested up to 1920x1200. That resolution is the sweet spot for video cards with 512 MB of RAM or more. The 320 MB GeForce 8800 GTS does become memory-capacity bottlenecked at this resolution and beyond, unlike the Radeon HD 2900 XT and 640 MB GeForce 8800 GTS. In our experience we do have to lower in-game quality settings moving from 1600x1200 to 1920x1200 on the 320 MB GeForce 8800 GTS. However, the impact is not that great in all cases.


For example, in Oblivion, instead of running at 2X AA as at 1600x1200 we have to disable AA at 1920x1200, but we can maintain all the same in-game options. In S.T.A.L.K.E.R. we see the largest performance hit at 1920x1200 and find we have to drop to Objects Dynamic Lighting. Performance at this resolution trades blows with the ATI Radeon HD 2900 XT: in Oblivion we can run at 2X AA at 1920x1200 on the ATI Radeon HD 2900 XT but have to disable grass completely, while with the 320 MB GeForce 8800 GTS we have to disable AA but can run at 50% grass.


S.T.A.L.K.E.R., on the other hand, shows a clear advantage for the 320 MB GeForce 8800 GTS at 1920x1200. We are able to run Objects Dynamic Lighting with maximum in-game settings except for 50% grass. On the ATI Radeon HD 2900 XT, however, we have to drop to Objects Dynamic Lighting and decrease view distance, object detail, grass density, lighting and shadow quality to their “Lowest” settings.


Our conclusion is that at 1920x1200 the 320 MB GeForce 8800 GTS is more strained by its memory capacity, but in some games it is still much faster than the ATI Radeon HD 2900 XT, while in others they trade blows.



Video Card Value 


This evaluation is the embodiment of what video card value means to the gamer. We have a video card, the 320 MB GeForce 8800 GTS at around $289, providing a noticeable gameplay-experience advantage over the ATI Radeon HD 2900 XT at $409. In some cases the performance gap is very wide (S.T.A.L.K.E.R. and BF 2142); in other games performance is closer (Oblivion, Lost Planet); but in most cases the 320 MB GeForce 8800 GTS provides higher framerates than the ATI Radeon HD 2900 XT, sometimes equating to a better gaming experience.


The 8800 GTS 320 MB is $120 less expensive and provides a gaming experience equal or superior to the HD 2900 XT's. Not only that, but as our original evaluation showed, the ATI Radeon HD 2900 XT uses much more power than a GeForce 8800 GTS! The 8800 GTS is simply more efficient in terms of both power and performance.



The Bottom Line


We hoped newer driver revisions would improve performance on the ATI Radeon HD 2900 XT. With the newer driver used for this evaluation we did not see any “magic” happen in real-world gaming at resolutions of 1600x1200 and above. The ATI Radeon HD 2900 XT is no match for even the much less expensive and much less power-hungry 320 MB GeForce 8800 GTS.



*WORKING ON THE REST NOW*


----------



## anticlutch (Jun 14, 2007)

Jeez... and to think that I was going to sell my 8800GTS for the HD 2900xt 
I'm an ATi lover at heart but they have some serious work to do... until then I guess I'm forced to say nVidia ftw!


----------



## wazzledoozle (Jun 14, 2007)

I really don't like how HardOCP does their reviews.


----------



## Corrosion (Jun 14, 2007)

That's a bull review, they don't even have the same settings for the screenshots. Totally an nVidia-sided review. I like nVidia also but that's just lame.


----------



## d44ve (Jun 14, 2007)

Christ that was a lot of work. I hope you appreciate it!


Otherwise, piss off!


----------



## Corrosion (Jun 14, 2007)

Who said they didn't appreciate it? I do. I'm just saying that it seems nVidia-sided.


----------



## DaMulta (Jun 14, 2007)

I run Elder Scrolls IV: Oblivion at 6AA/16AF (1280x768) with adaptive AA off and it runs smooth at all times.

I look at the average framerates, not the highs and lows. To me, going by their numbers, it seemed better than the GTS.

Why not just run them on the same settings for the review? Also, why use different resolutions for the image quality comparisons?


----------



## d44ve (Jun 14, 2007)

Corrosion said:


> Who said they didn't appreciate it? I do. I'm just saying that it seems nVidia-sided.




I didn't say anyone DIDN'T appreciate it. I said I hope you do... that's it, nothing about the comments or yours.


----------



## GLD (Jun 14, 2007)

I am stunned to see the lower performance of the 2900XT. I am one who had higher hopes for it. I am waiting a while to buy my DX10 card; I will buy when we actually have games available (I don't play CoH). I plan to spend around $250 on it. With the card choices of today, that money would put me in an 8600GTS, and I want more horsepower than an 8600 card. After reading this review, I can bet I will want more hp than the upcoming 2600 cards too.

If ATi would drop the price of the 2900XT to in between the 320 and 640 GTS I might even buy one. If not, I bet the 320 GTSs will be had for $250 or less before the DX10 titles hit the shelves. From the numbers, and the limited 1280x1024 resolution of my 17" LCD, I am sure a 320 MB GTS (for $250 or less) will make me happy.


----------



## TonyStark (Jun 14, 2007)

Corrosion said:


> That's a bull review, they don't even have the same settings for the screenshots. Totally an nVidia-sided review. I like nVidia also but that's just lame.



You're absolutely right, the nVidia cards were run with higher settings, and produce better performance. Totally one-sided review.


----------



## Corrosion (Jun 14, 2007)

d44ve said:


> I didn't say anyone DIDN'T appreciate it. I said I hope you do... that's it, nothing about the comments or yours.



ok ok, sry. just thought you were talking to me.


----------



## d44ve (Jun 14, 2007)

Honestly... that's how every review is going to be seen, as one-sided by one camp or the other...

It doesn't matter who wrote it. My grandmother could've written it, and if the review showed one card beating the other, people would say it was one-sided.

Now, I am not saying this one isn't one-sided. I just haven't seen *ONE REVIEW* where both sides agreed.


----------



## d44ve (Jun 14, 2007)

Corrosion said:


> ok ok, sry. just thought you were talking to me.




its all good


----------



## DaMulta (Jun 14, 2007)

Look at this: with DX10 it wins. It's a DX10 card, not a DX9 card. Sure, it's nice to still be able to play DX9 games, but that's not what this generation of cards is about. That goes for both sides, Nvidia and AMD.

There is still no DX10 3DMark, and very, very few games running in DX10.

Mike on TeamATi ran the Lost Planet DX9 test last night and got this score: average 72, snow 74, cave 42.


----------



## Corrosion (Jun 14, 2007)

I don't think there is ever gonna be a review where both sides agree, but oh well. I hope the GDDR4 version of the 2900 is a little better... or a lot.


----------



## DaMulta (Jun 14, 2007)

TonyStark said:


> You're absolutely right, the nVidia cards were run with higher settings, and produce better performance. Totally one-sided review.



Sometimes cards run better at high resolutions and higher settings. Which is odd but it's true.


----------



## d44ve (Jun 14, 2007)

Corrosion said:


> I don't think there is ever gonna be a review where both sides agree, but oh well. I hope the GDDR4 version of the 2900 is a little better... or a lot.




It will be better... I just hope it's CLOSE to people's expectations.


----------



## devguy (Jun 14, 2007)

DaMulta said:


> Sometimes cards run better at high resolutions and higher settings. Which is odd but it's true.



Yeah, in the original reviews of the HD 2900XT with the original drivers, games ran horribly at 1024x768 and 1280x1024, but they ran better at higher resolutions. We just need to give AMD more time to work on the drivers. I mean, look at how well they already perform on Vista! With the Nvidia 8800 series it took much longer to get to a playable state on Vista, with people even considering class-action lawsuits...


----------



## tkpenalty (Jun 14, 2007)

Hmm... the drivers really still bottleneck it, but it's all good.


----------



## EastCoasthandle (Jun 14, 2007)

I think I know how they got those grass results. Adaptive AA is a bit screwed up, so if you enable adaptive AA it will appear that grass is missing. In reality it's a driver problem, and the review is only taking advantage of a goofy set of drivers. Another issue is that they used the beta 8.37.4.2 drivers, not Cat 7.5; that should be noted in the review.


----------



## Kasparz (Jun 14, 2007)

tkpenalty said:


> Hmm... the drivers really still bottleneck it, but it's all good.


*THESE ARE NOT NEW DRIVERS!!!*


----------



## Wile E (Jun 14, 2007)

Hmmm, here we go again with differing results between reviews.

Here, check out this review, which uses Cat 7.4s

http://www.tweaktown.com/reviews/1115/msi_radeon_hd_2900_xt_512mb_graphics_card/index.html


----------



## Ketxxx (Jun 14, 2007)

Not to shit on the 2900, but I play with the settings used on the 8800 in that link @ 1280*1024 and it's smooth as silk.. :\

edit - the HOCP link, that is, not the TweakTown one. I've never liked HOCP's way of doing reviews; it's retarded.


----------



## bigboi86 (Jun 14, 2007)

Go read HardOCP's first article on this too. The motion blur on the ATI sucks, and it lags horribly with high grass settings (Oblivion).

I didn't like how H does their reviews either at first, but they look at things from the right perspective. That review is not one-sided. It is a review of how the card will affect real-world performance; it's not a let's-see-who-gets-higher-numbers type of thing.

I told you guys the new drivers weren't going to help.


----------



## DaMulta (Jun 14, 2007)

Then let the thing lag when you do the review.......


----------



## newtekie1 (Jun 14, 2007)

You know, I really don't like how they change the settings in each game for each card. Personally I don't think that is a good way to benchmark, and it is misleading.

Instead of keeping all the tests equal and simply reporting what each card does when presented with the same scenario, they change the tests to make the cards' performance seem equal.

I understand that they are trying to give a realistic idea of what the cards are capable of by adjusting the tests to playable real-world settings, but at the same time that makes the comparative performance results useless.


----------



## bigboi86 (Jun 14, 2007)

Go read the review itself, guys; they explain why they do things the way they do.

If you do a review at settings above a card's ability, how would that review be useful to people who want to see how the card performs in real life? Do you think someone is going to play Oblivion all laggy?

They put both cards at their maximum ability, then compared them. Sorry if the ATI had to be the one crippled to show decent gameplay; don't blame the [H], blame ATI.

EDIT: I see where you're coming from, newtekie, but that's just how [H] decided to do their reviews. I remember a while back how Kyle (H owner/editor) dissed synthetic benchmarks and a lot of review sites (including my main forum), saying their benchmarks are useless. You now see his new review methods. I kind of like it and dislike it at the same time, but I do see his logic.


----------



## DaMulta (Jun 14, 2007)

I did read it. I just don't understand why they don't leave them both on the same settings.

It's not like the thing black-screened when you tried to run it.


----------



## bigboi86 (Jun 14, 2007)

DaMulta said:


> I did read it. I just don't understand why they don't leave them both on the same settings.
> 
> It's not like the thing black-screened when you tried to run it.



First of all, ATI and Nvidia have different AA techniques (as well as other features). Kyle wanted to show both of them off and compare them, as any reviewer should, not just spit out framerates.

Secondly, Kyle doesn't like to publish a review that won't help the people reading it. If you read this review, you should know EXACTLY which card is better and why. The review does that, doesn't it?

Would you buy a card that couldn't play Oblivion at settings as high as the other offering's? That is the gist of his review.


----------



## newtekie1 (Jun 14, 2007)

Of course people aren't going to play Oblivion all laggy, but what is the point of doing a performance comparison if you are just going to adjust the settings so all the cards spit out the same performance numbers?


----------



## newtekie1 (Jun 14, 2007)

bigboi86 said:


> Would you buy a card that couldn't play Oblivion at settings as high as the other offering's? That is the gist of his review.



Yes, and that could just as easily have been accomplished by leaving the settings exactly the same and showing that the game is playable on one card and not on the other.

Now the reader actually has to read the review closely to determine which card performed worse, and pay attention to little details such as the grass levels and resolutions used for each card.

If the review were done properly, all the reader would have to do is glance at the performance charts, read the framerates, and know that the 2900XT tanked in the tests. Instead, at first glance [H] made it look like the cards performed equally, and let's face it, most readers don't go past first glances when reading performance comparisons.


----------



## HellasVagabond (Jun 14, 2007)

Unavoidable results, and quite true if you ask me (although I did expect the stock 8800GTS to be a bit slower than this).


----------



## DaMulta (Jun 14, 2007)

bigboi86 said:


> First of all, ATI and Nvidia have different AA techniques (as well as other features). Kyle wanted to show both of them off and compare them, as any reviewer should, not just spit out framerates.
> 
> Secondly, Kyle doesn't like to publish a review that won't help the people reading it. If you read this review, you should know EXACTLY which card is better and why. The review does that, doesn't it?
> 
> Would you buy a card that couldn't play Oblivion at settings as high as the other offering's? That is the gist of his review.



I can play that game (X1950XTX) at HDR/6AA/16AF with adaptive AA off and I get around 40-60 frames at all times. Full grass, everything at max, at 1280x768.

I bet the HD 2900XT could do the same.


----------



## rampage (Jun 14, 2007)

Not bashing the review, it was done well with a lot of effort involved, but I agree with a few people here in saying that the comparisons should have been done at the same resolution, or at multiple resolutions for each card. I'm only saying this because I know that with my 8800GTX I get a higher frame rate at 1680*1050 than I do at 1280*1024; for example, in Day of Defeat: Source I get 90-120 fps at 1280*1024 and 110-130 fps at 1680*1050.
So for a fair test, each card should be tested at multiple resolutions, so you can choose the card that best suits the resolution you run at...


----------



## Xaser04 (Jun 14, 2007)

The point of the reviews is not to show which card can get the highest numbers (which can often be more misleading) but to show the maximum playable settings each card offers in different games.

Normally the slowest card will be the benchmark, and the quicker ones will have 'gameplay advantages', which basically state what the faster card can offer in a given game over the one used as the benchmark (i.e. the 2900XT against an 8800GTS).

It makes a refreshing change to read a review like this, as it gives more of a real-world test.

They also make the point that in certain games a high average FPS can be quite misleading, as the game may actually feel like it's lagging despite holding a high FPS (something that would be missed with 'my line is bigger than your line' benchmarks).


----------



## WarEagleAU (Jun 14, 2007)

D44ve, while your way of presenting things is peculiar, I thank you for the link and the review.

Next time though, try not to take things so hard. j/k


----------



## d44ve (Jun 14, 2007)

WarEagleAU said:


> D44ve, while your way of presenting things is peculiar, I thank you for the link and the review.
> 
> Next time though, try not to take things so hard. j/k




Thanks, I will try harder next time


----------



## WarEagleAU (Jun 14, 2007)

Good deal there, buddy.


----------



## uber_cookie (Jun 14, 2007)

Only when full DX10 games come out will the full potential of the 2900 be seen. Till then, fingers crossed for ATI.
I love the big noisy red demon


----------



## bigboi86 (Jun 14, 2007)

Xaser04 said:


> *The point of the reviews is not to show which card can get the highest numbers (which can often be more misleading) but to show the maximum playable settings each card offers in different games.*
> 
> Normally the slowest card will be the benchmark, and the quicker ones will have 'gameplay advantages', which basically state what the faster card can offer in a given game over the one used as the benchmark (i.e. the 2900XT against an 8800GTS).
> 
> ...




My thoughts exactly. Thanks for putting it in better words.


----------



## EastCoasthandle (Jun 14, 2007)

bigboi86 said:


> My thoughts exactly. Thanks for putting it in better words.


That's not what I'm seeing.  What I am seeing is:
-a pre-retail HD 2900XT being used to show current retail card results
-pre-release beta drivers being used to show IQ and frame rates

These are but a few examples of why people have complained about this review. However, other reasons include (but are not limited to):
-differences between the pre-release beta drivers and the current Cat 7.5 drivers
-differences between resolutions
-differences between XP and Vista
-etc.  

That is only part of the reason why some view the review as bad IMO.


----------



## HellasVagabond (Jun 15, 2007)

:banghead:


----------



## Wile E (Jun 15, 2007)

Again, I point out that different reviewers obtain different results.

Look at this review: http://www.tweaktown.com/reviews/1115/msi_radeon_hd_2900_xt_512mb_graphics_card/index.html

It's done with Cat 7.4s


----------



## HellasVagabond (Jun 15, 2007)

Different settings, different reviewers, different rigs, different drivers... they all matter.
However, in the last 5 X2900XT reviews I've seen, it doesn't score 100% better than the 8800GTS in any of them. The results are mixed 50/50, and they shouldn't be, since the 8800GTS has what? Almost a year on its back, while the X2900XT has just a month or so...


----------



## Wile E (Jun 15, 2007)

HellasVagabond said:


> Different settings, different reviewers, different rigs, different drivers... they all matter.
> However, in the last 5 X2900XT reviews I've seen, it doesn't score 100% better than the 8800GTS in any of them. The results are mixed 50/50, and they shouldn't be, since the 8800GTS has what? Almost a year on its back, while the X2900XT has just a month or so...


But the 8800 has also had that long to improve their drivers, based on customer feedback. Look back to when it first released, compared to now. They've seen some pretty big improvements.

The 2900 has been in the wild for only a month.

And you can't say that ATI had all that time to perfect the drivers. Just like nVidia, they have to rely on customer input. There are just too many possible hardware/software combinations to do all the testing in-house.

Note, that I'm not saying the 2900 will win in the end. I'm just saying it's still too early to call a victor.


----------



## EastCoasthandle (Jun 15, 2007)

Wile E said:


> But the 8800 has also had that long to improve their drivers, based on customer feedback. Look back to when it first released, compared to now. They've seen some pretty big improvements.
> 
> The 2900 has been in the wild for only a month.
> 
> ...


Exactly. If the HD 2900XT had been out for 7-8 months with no performance increase, that would indicate something wrong. But as it stands it has only been one month and one driver release, making any argument about its performance moot.


----------



## Zeratul_uy (Jun 15, 2007)

DaMulta said:


> Then let the thing lag when you do the review.......



Agreed, that way you see whether the thing lags and whether the other one is better or not...


----------



## Zeratul_uy (Jun 15, 2007)

Wile E said:


> But the 8800 has also had that long to improve their drivers, based on customer feedback. Look back to when it first released, compared to now. They've seen some pretty big improvements.
> 
> The 2900 has been in the wild for only a month.
> 
> ...



Wile E, I totally agree with that too.


----------



## mandelore (Jun 15, 2007)

DaMulta said:


> I can play that game (X1950XTX) at HDR/6AA/16AF with adaptive AA off and I get around 40-60 frames at all times. Full grass, everything at max, at 1280x768.
> 
> I bet the HD 2900XT could do the same.



Yeah, I run Oblivion at 1920x1200 pretty much maxed out, with HDR/6AA, adaptive AA on, some tweaks, and 8xAF, and the game runs over 40fps outside. I even have the mod for extensive weather and foliage etc. on my current 1900XTX. The 2900XT is gonna be awesome, so no complaints here.


----------



## HellasVagabond (Jun 15, 2007)

Let's not forget that the X2900XT was to be released a few months sooner than it was in the end, due to driver improvements... I'm not saying that we won't see bugs fixed in several games, but I don't think we can expect a big performance boost.


----------



## bigboi86 (Jun 15, 2007)

Let's just say you have to be open minded to like Kyle's review methods. I can clearly see which card is the better card in his review though, that's all that matters.


----------



## newtekie1 (Jun 15, 2007)

It doesn't matter how long the card has been in the wild. If you are going to make consumers wait 6+ months for a product to compete, it had better damn well actually be able to compete when it is released.


----------



## Wile E (Jun 16, 2007)

newtekie1 said:


> It doesn't matter how long the card has been in the wild. If you are going to make consumers wait 6+ months for a product to compete, it had better damn well actually be able to compete when it is released.


It does compete. Look at other reviews. It sometimes even competes with a GTX.

And the amount of time a product is on the market has everything to do with driver quality.


----------



## HellasVagabond (Jun 16, 2007)

Competing with a GTX in games such as Half-Life 2, and in 3DMark, isn't exactly fair, since the first is ATI-optimised and the second isn't really a game.
And like I said before, the X2900XT was delayed 1-2 months (don't remember exactly) because ATI had driver issues with it, so their drivers should be quite complete.
Anyway, this round is won by the green guys, so let's wait and see what the red team brings into the ring next.


----------



## mandelore (Jun 16, 2007)

"driver issues".. dude, nvidia required months of user feedback to get their drivers working, allow that for ATI plz ^^


----------



## Wile E (Jun 16, 2007)

HellasVagabond said:


> Competing with a GTX in games such as Half-Life 2, and in 3DMark, isn't exactly fair, since the first is ATI-optimised and the second isn't really a game.
> And like I said before, the X2900XT was delayed 1-2 months (don't remember exactly) because ATI had driver issues with it, so their drivers should be quite complete.
> Anyway, this round is won by the green guys, so let's wait and see what the red team brings into the ring next.


No, this round hasn't been won by the green team. You definitely can't call that without fully fledged DX10 games to test on.

And for the last time, the driver cannot be completed without user feedback. It is impossible to release a fully optimized driver without feedback from customers. ATI would've had to test the driver on every possible combination of hardware and software in order for it to be optimized upon release. Just not possible. 

nVidia can't do it either. They also rely on customer feedback to improve their drivers. Why do you think the performance of the 8800 is so much better now, compared to when it released? They haven't changed the hardware, so the performance improvement can only be attributed to the drivers.


ATI is only on the first update, since release of the 2900. How many updates did nVidia take before the performance really started to improve on the 8800?

Give it a couple more driver revisions and some DX10 games before you try to call a winner.


----------



## HellasVagabond (Jun 16, 2007)

By the time DX10 games come in numbers, both companies will have new GPUs to show off; that's why I'm saying this round is won by the green team.
As for the drivers, I can't say I have ever seen a huge, or even a big, improvement in FPS between drivers. Perhaps only in certain games where some drivers don't work as planned. But in the benchmarks we see around the internet we don't see 1, 2, or 3 games but many times more, and in almost all of them we see the same results, so I can't really say that a driver can be that bad in every game.


----------



## Wile E (Jun 16, 2007)

HellasVagabond said:


> By the time DX10 games come in numbers, both companies will have new GPUs to show off; that's why I'm saying this round is won by the green team.
> As for the drivers, I can't say I have ever seen a huge, or even a big, improvement in FPS between drivers. Perhaps only in certain games where some drivers don't work as planned. But in the benchmarks we see around the internet we don't see 1, 2, or 3 games but many times more, and in almost all of them we see the same results, so I can't really say that a driver can be that bad in every game.


But it isn't that bad in every game. That's the point everyone is missing here: they trade blows, depending on the game.

As far as not seeing an improvement in your fps with new drivers, how long have you owned the card? Ask someone who bought an 8800 at its release how big the improvement was.


----------



## HellasVagabond (Jun 16, 2007)

Perhaps the 8800 had problems when it launched (I got the 640 MB version 3 months after), but I never heard of that card being delayed due to driver problems, so perhaps Nvidia preferred to let customer feedback help them fix the drivers rather than doing it on their own. However, I did hear that ATI delayed the X2900XT launch due to driver problems, so I think they did tune them at least a bit before releasing the card to the public.
Besides, even the Nvidia drivers can be tweaked further, and so can the ATI ones, but I doubt we will see any major differences from drivers alone, since both companies will keep tweaking them until they release new GPUs.


----------



## Wile E (Jun 16, 2007)

HellasVagabond said:


> Perhaps the 8800 had problems when it launched (I got the 640 MB version 3 months after), but I never heard of that card being delayed due to driver problems, so perhaps Nvidia preferred to let customer feedback help them fix the drivers rather than doing it on their own. However, I did hear that ATI delayed the X2900XT launch due to driver problems, so I think they did tune them at least a bit before releasing the card to the public.
> Besides, even the Nvidia drivers can be tweaked further, and so can the ATI ones, but I doubt we will see any major differences from drivers alone, since both companies will keep tweaking them until they release new GPUs.


We're obviously not going to change each other's minds. I say we just agree to disagree.


----------



## niko084 (Jul 16, 2007)

I don't personally like that review either... they didn't give any real information, and they didn't standardize anything...

On top of that, at least 2 of those games are written directly for NVIDIA...
Well, what good is that for a performance comparison?

If I made hardware that worked directly with my ATI card and didn't require a driver and then wrote software to work directly with the hardware... hmmm

Not going to hate on either card by any means... but it is a poorly controlled and poorly written review.


----------



## Tatty_One (Jul 16, 2007)

mandelore said:


> "driver issues".. dude, nvidia required months of user feedback to get their drivers working, allow that for ATI plz ^^



A slight correction, if I may... to get their Vista drivers working. Whilst that is unacceptable, Vista was pretty damn new when the 8800 series was released.


----------



## yogurt_21 (Jul 16, 2007)

lol ancient thread


----------



## Vincent11 (Dec 15, 2007)

ATI RULEZ, that's all I know, haha


----------



## warhammer (Jan 29, 2008)

No my wallet RULZs..


----------

