# NVIDIA Claims Upper Hand in Tessellation Performance



## btarunr (Mar 19, 2010)

A set of company slides leaked to the press reveals that NVIDIA is claiming the upper hand in tessellation performance. With this achievement, NVIDIA is looking to encourage leaps in geometric detail, probably in future games that make use of tessellation. NVIDIA's confidence comes from the way its GF100 GPU is designed (further explained here). Each GF100 GPU physically has 16 PolyMorph Engines, one per streaming multiprocessor (SM), which enables distributed, parallel geometry processing. Each PolyMorph Engine has its own tessellation unit. With 15 SMs enabled on the GeForce GTX 480 and 14 on the GeForce GTX 470, there are that many independent tessellation units.
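As a quick illustration (a hypothetical sketch, not from the slides; the counts come from the paragraph above), the number of independent tessellation units simply tracks the number of enabled SMs:

```python
# Illustrative sketch: on GF100, each enabled SM carries one PolyMorph
# Engine, and each PolyMorph Engine has one tessellation unit, so
# tessellation units scale 1:1 with enabled SMs.
GF100_TOTAL_SMS = 16  # full GF100 die

enabled_sms = {
    "GeForce GTX 480": 15,
    "GeForce GTX 470": 14,
}

for card, sms in enabled_sms.items():
    share = sms / GF100_TOTAL_SMS
    # one tessellation unit per enabled SM
    print(f"{card}: {sms} tessellation units ({share:.1%} of the full die)")
```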

NVIDIA demonstrated its claims in the presentation using the Unigine Heaven benchmark, where the GeForce GTX 480 was pitted against a Radeon HD 5870. In many scenes with lighter tessellation, the GPUs performed neck-and-neck, with the GTX 480 coming out ahead more often. But in scenes with heavy tessellation (particularly the "dragon" scene, where a highly detailed dragon model is rendered with densely tessellated meshes), the GTX 480 clocks nearly a 100% performance advantage over the HD 5870. NVIDIA has been confident about tessellation performance since January, when it detailed the GF100 architecture. The GeForce GTX 400 series graphics cards will be unveiled on the 26th of March.


Images Courtesy: Techno-Labs

*View at TechPowerUp Main Site*


----------



## KainXS (Mar 19, 2010)

so they did drop an SM in the GTX 480, meaning it only has 60 TMUs, not 64


----------



## btarunr (Mar 19, 2010)

KainXS said:


> so they did drop a sm in the GTX480, meaning it only has 60 tmu's not 64



Yup, 60 TMUs it is.


----------



## [I.R.A]_FBi (Mar 19, 2010)

i'll wait a week


----------



## human_error (Mar 19, 2010)

hmm yeah, i'll wait for some unbiased benchies running on the same (read: not NVIDIA-specific) version of the Unigine Heaven benchmark. Still, if this helps drive tessellation support up in games and gets both teams working on better tessellation then i'm all for it.


----------



## Steevo (Mar 19, 2010)

Even at stock I never went below 20 FPS during the dragon scene; at my current overclock I stay around 25 FPS there, and that is before the new drivers (10.3a). Plus, the benchmark NVIDIA was using was version 1.1 according to many, which included up to a 30% performance increase during heavy tessellation through aggressive culling.


So their effective 100% boost can be cut down by 30% with the newer Heaven benchmark, plus another 10-20% from the new drivers ATI is releasing, plus the ability to overclock most 5870s by at least 20%.
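Stacking those percentages works out roughly like this (a back-of-the-envelope sketch; every figure is an estimate from this post, not a measured result):

```python
# Back-of-the-envelope sketch; all percentages are rough estimates
# from the post above, not measurements.
nv_lead = 2.00   # NVIDIA's claimed ~100% advantage, i.e. 2x

hd5870 = 1.00
hd5870 *= 1.30   # newer Heaven build: up to ~30% from aggressive culling
hd5870 *= 1.15   # Catalyst 10.3 driver gains: ~10-20%, take the midpoint
hd5870 *= 1.20   # typical ~20% overclocking headroom on the 5870

remaining = nv_lead / hd5870
print(f"claimed 2.00x lead shrinks to about {remaining:.2f}x")  # ~1.11x
```

Whether each of those estimates holds up is another matter, but that is the shape of the argument.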


So this is NVIDIA's fancy way of saying that with selective math, they can bend numbers their way all day. Just like I could say:

Available DX11 Video card manufacturers.

ATI



So ATI has the fastest DX11 video card on the market in any and all configurations.


----------



## phanbuey (Mar 19, 2010)

Whoop de doo.


----------



## Kitkat (Mar 19, 2010)

This news is three weeks old; they've had this video on YouTube for the longest time.


----------



## erocker (Mar 19, 2010)

I'm curious as to how much of a boost the 10.3a drivers from ATi gave the 5870 in relation to that graph. Anyone with an i7 at 3.3 GHz and a 5870 know?


----------



## crow1001 (Mar 19, 2010)

Is it really necessary to grab all the unsubstantiated Fermi trash off the web? At least have a reliable source that can be backed up. The bloke next door could have compiled those graphs for all we know.



erocker said:


> I'm curious as to how much of a boost the 10.3a drivers from ATi gave the 5870 in relation to that graph. Anyone with a i7 at 3.3ghz and a 5870 know?



NV were using an unreleased version of Heaven in their tests, with up to 30% tessellation optimization, so comparisons are not valid.

Performance in Heaven with my 5870 and the 10.3s is now a lot smoother; it never dips below 30 FPS.


----------



## xaira (Mar 19, 2010)

Wow, a 300... uhm, sorry, 298-watt GPU barely getting the upper hand on a 188-watt GPU. That's really something to be proud of.


----------



## Sasqui (Mar 19, 2010)

Why compare one selective synthetic benchmark against a 5870, then turn around and provide a whole performance chart relative to a GTX 285?

Confused marketing.


----------



## MarcusTaz (Mar 19, 2010)

Didn't NVIDIA just release a driver that caused video cards to overheat and fry?

I will stick with ATI as far as the eye can see. NVIDIA will not get my business, not after their 7900 GTX Vista driver debacle and 2D memory sync issues, and never fessing up to it. They took my money then and will not take it again. Plus everyone knows they cheat on benchmarks.

I am very happy with my HD 5850, thank you; it is green and runs cool...


----------



## truehighroller1 (Mar 19, 2010)

I have the following setup and can run this if you guys like.

GA-EP45-UD3P Rev 1.6
Q9550 E0 @ ~4.2 GHz, 1.37 V (~1.34 V under load)
Swiftech water cooling
2x2 GB OCZ DDR2-1150 @ 1188 MHz, 5-5-5-18 @ 2.0 V
XTX 5870 @ 1 GHz / 1200
3x Western Digital RE3 320 GB 16 MB cache SATA II drives in RAID 0
Corsair HX 850 W power supply
Antec 1200 case
3DMark Vantage score = 19049
3DMark06 score = 23393


----------



## mechtech (Mar 19, 2010)

Meh, the only thing I have time for nowadays is an occasional game of CS: Source, and my 4850 pushes that beyond my monitor's refresh rate anyway.

Why don't video card reviews use the HL2 Source engine for a gaming benchmark, to give relevance for those of us running all the old classics??


----------



## evillman (Mar 19, 2010)

Steevo said:


> Even at stock I never went below 20FPS during the Dragon scene, At my current overclock i stay about 25FPS at the dragon scene, and that is before the new drivers (10.3a). Plus the benchmark Nvidia was using was version 1.1 according to many, that included up to a 30% performance increase during heavy tessellation by aggressive culling.
> 
> 
> So their effective 100% boost can be cut down 30% by the use of the new Heaven benchmark, plus the new drivers ATI is releasing 10-20%, and the ability to overclock most 5870 by at least 20%.
> ...



It's true, ATI has the most powerful card for now, but in a week the fastest graphics chip will be released.


----------



## mechtech (Mar 19, 2010)

evillman said:


> It's true, ATI have the most powerful card until now, but in a week the fastest graphic chip will be released.



Yep, that's the way it goes, back and forth. It's nice to have competition to stimulate technology and keep prices in check.

And in another 6-8 months or so it may be back to ATI again, if the 6000 series is out by then; time will tell.


----------



## crow1001 (Mar 19, 2010)

In a week the fastest GFX card on the planet will still be the 5970..


----------



## phanbuey (Mar 19, 2010)

truehighroller1 said:


> I have the following setup and can run this if you guys like.
> 
> GA-EP45-UD3P Rev 1.6
> Q9550 E0 @ 4.2~Ghz 1.37v~ 1.34v Under Load
> ...



Yes!!!  I really would like to see that... also what is your GPU score for High in Vantage?


----------



## DaedalusHelios (Mar 19, 2010)

Sasqui said:


> Why compare one selective synthetic benchmark against a 5870, then turn around and provide a whole performance chart relative to a GTX 285?
> 
> Confused marketing.



It was to show progression of the architecture rather than a comparison to the competition like the previous slides were comparing. It didn't confuse me.


----------



## truehighroller1 (Mar 19, 2010)

phanbuey said:


> Yes!!!  I really would like to see that... also what is your GPU score for High in Vantage?



Edit:  19704 Compare Link: http://service.futuremark.com/resultComparison.action?compareResultId=1836125&compareResultType=19 . I'll see what I can do right now with the test, I have it on here already so.. bb


Edit some more: What settings did they use? I'll run with default for now and check back here shortly.


----------



## Phxprovost (Mar 19, 2010)

I'm sorry, but doesn't your product have to be on the market in order to have an upper hand in anything?


----------



## DeathByTray (Mar 19, 2010)

Is that a new bench or just a rehash of yesteryear? ;p


----------



## v12dock (Mar 19, 2010)

Old


----------



## rizla1 (Mar 19, 2010)

Wait for the GTX 460. That will be a good card for the midrange.


----------



## nt300 (Mar 19, 2010)

Funny how those are NVIDIA slides showing benchmarks. No worries, ATI does the same.
We'll wait and see real-world benchmarks.


----------



## SetsunaFZero (Mar 19, 2010)

rizla1 said:


> wait for the gtx 460 . that will be a gud card. for middleclass



Yes, like the GTX 260 series, but only if the price is good.


----------



## Mussels (Mar 20, 2010)

Yay, NVIDIA is faster in a single feature no one cares about.

If they were faster in games, they'd be advertising that - and they aren't.

Fermi looks like it's still going to be a massive fail.


----------



## _33 (Mar 20, 2010)

Now if there could be other games than Dirt 2 that actually use Tesselation...


----------



## [I.R.A]_FBi (Mar 20, 2010)

fermi is 1.5 times better at making em sweat, it is a market leader!


----------



## simlariver (Mar 20, 2010)

Mussels said:


> yay, nvidia is faster in a single feature no one cares about.
> 
> If they were faster in games, they'd be advertising that - and they arent.
> 
> Fermi looks like its still going to be a massive fail.



Yup, not only is the performance behind, but the feature set as well. No multi-monitor gaming (you can game on at most 3 monitors with an SLI setup, fail), 3D gaming uses proprietary expensive glasses (not that anyone cares about 3D gaming), and still no word on 7.1 sound over the HDMI output. Probably non-existent.


----------



## CyberCT (Mar 20, 2010)

simlariver said:


> yup, not only the performance is behind, but the feature-sert as well. No multimonitor gaming (you can game on max 3 monitor with a SLI setup, fail) and 3D gaming uses proprietary expensive glasses (not that anyone cares about 3D gaming) and still no word for 7.1 sound on the hdmi output. probably non-existent.



Wrong. Multi-monitor gaming is stupid.  Who in their right mind is going to pay for 6 monitors to play a game, surrounded by monitor bezels interrupting the total screen?  That defeats the whole purpose.

3D gaming is awesome, and until you experience it firsthand, you have no idea what you're talking about or smoking.  The glasses are not that expensive, and I have yet to hear one of my friends (both male and female) say 3D gaming is not amazing on my 56" DLP.

I agree that NVIDIA dropped the ball this round.  But your other statements make absolutely no sense and you're a fanboy. Owned.


----------



## Mussels (Mar 20, 2010)

I've used 3D glasses for gaming recently, and back when they first came out (GeForce 3 era).

They sucked then, and they suck now.

Multi-monitor gaming is not as bad as you think it is. LCDs have fairly thin bezels, and that's the point of running 3 monitors - you don't have a bezel in front of you, you just have extra peripheral vision if you turn your head.

Am I interested in 3 monitors for FPS games? No, not at all. But at the same time I know just how flawed 3D gaming is and how problematic it is. It's doomed to failure all over again.

Here's a few facts: it requires 120 Hz screens, and your in-game FPS to be 120 (or in some cases, just 60 FPS doubled) - that's a CONSTANT 60 FPS, not a max, not an average. Drop your FPS to 30 and you're getting a nauseating slideshow.

It requires powered 3D glasses to use - that means batteries or a cord. They're expensive, and a pain to replace should they break.

DirectX 10 has no way to force refresh rates, and I assume DX11 is the same. That means in most games you play these days, you can't force 120 Hz - so 3D won't work (this of course may get fixed over time... but it's taken several years now, so I don't see it happening any time soon).


----------



## CyberCT (Mar 20, 2010)

Mussels said:


> i've used 3D glasses for gaming recently, and back when they first came out (Geforce 3 era)
> 
> They sucked then, and they suck now.
> 
> ...



I don't follow what you're saying.  I use the NVIDIA control panel to force vsync and it works.  I don't understand your problem with 3D gaming.  It works perfectly for me in all games except Crysis.  Well, actually, Crysis works fine in 3D if I set everything to low.  But it still looks incredible on low.  You could get a large DLP from Mitsubishi that supports 3D much cheaper than 6 LCD monitors.  Not to mention the power consumption benefit too.  The 3D glasses were $100 for two, BTW.  Not that expensive at all considering the final result.  I have the Samsung HL56A650.  Awesome TV for the price almost 2 years ago.

Again, perhaps you should try it now.  I have a GTX 285 FTW from EVGA and my E8500 OC'd to 3.8 GHz.  Works great with 3D gaming like I mentioned above.  Sadly, I think I'll sit out this round until the next gen of cards that are worth the price.  My GFX card is awesome and runs great with everything except Crysis.


----------



## newtekie1 (Mar 20, 2010)

Steevo said:


> Even at stock I never went below 20FPS during the Dragon scene, At my current overclock i stay about 25FPS at the dragon scene, and that is before the new drivers (10.3a). Plus the benchmark Nvidia was using was version 1.1 according to many, that included up to a 30% performance increase during heavy tessellation by aggressive culling.
> 
> 
> So their effective 100% boost can be cut down 30% by the use of the new Heaven benchmark, plus the new drivers ATI is releasing 10-20%, and the ability to overclock most 5870 by at least 20%.
> ...



I love it every time someone uses the "but the slower part can be overclocked to match the faster" argument - ignoring the fact that the faster part can be overclocked too, as if that isn't important...

I'm waiting for some real reviews before I make any judgement.  Personally, if they both perform equally in DX10 games, I don't care, as they will both destroy DX10 games.  I'm more interested in DX11 performance, mainly tessellation, so if the GTX 480/470 is better at that, then that IMO is the important thing.


----------



## Super XP (Mar 20, 2010)

[I.R.A]_FBi said:


> fermi is 1.5 times better at making em sweat, it is a market leader!


If NVIDIA's marketing team wanted, they could easily sell a bowl of soup, name it the GTX 450, put a nice sticker on it with a heatsink & fan, post some benchmarks, and most likely people would go nuts for the darn thing 


CyberCT said:


> Wrong. Multimonitor gaming is stupid.  Who in their right mind is going to pay for 6 monitors to play a game, surrounded by a monitor bezel between each monitor which interrupts the total screen?  That defeats the whole purpose.
> 
> 3D gaming is awesome and until you experience it first hand, you have no idea what your talking about or smoking.  The glasses are not that expensive and I have yet to hear one of my friends (both male and female) say 3D gaming is not amazing on my 56" DLP.
> 
> I agree that Nvidia dropped the ball this round.  But your other statements make absolutely no sense and you're a fanboy. Owned.


Multi-monitor gaming is awesome if you think about it. Though I don't think it's for everybody, it's an option that some will buy into, provided you've got the funds. You don't have to buy 3 monitors right away; you can buy one, then wait a bit, save up, and buy another. They also have some nice thin bezels out there.

I am not a fan of 3D glasses; I think it's stupid that you have to wear glasses to play games, IMO. Some may like it and others may not. What would be interesting is if they could do it without glasses.

Just like the 3D HDTVs companies are trying to sell us - it stinks, and I hope it fails unless they stop ripping us off by forcing us to pay $300+ for one pair of 3D glasses. Most if not all HDTVs out today can already do 3D "WITHOUT" 3D glasses. That is what they should be pushing for, not this nonsense of 3D glasses and so-called 3D HDTVs.


----------



## [I.R.A]_FBi (Mar 20, 2010)

newtekie1 said:


> I love it every time someone uses the "but the slower part can be overclocked to match the faster, ignore the fact that the faster part can be overclocked also, that isn't important..."



But if the aim is X amount of performance, and not a certain amount of money spent or a certain part, why should it matter?


----------



## afw (Mar 20, 2010)

This is taking forever... I want to see some reviews...


----------



## eidairaman1 (Mar 20, 2010)

If they were so confident in the performance, why wasn't the board released in late January or mid-February? TBH NVIDIA has made so many promises they can't keep; it's just like the current President of the United States (One Big Ass Mistake America)!!! And no, you can't blame the previous stuff, as this one has done nothing but lie to our faces and has tried saying the others are idiots when the others have way better plans and ideas than the current losers do.

3D glasses tech is such a gimmick; it's just a money-stealing technique when you can make your own 3D glasses for about 25 cents.



Mussels said:


> i've used 3D glasses for gaming recently, and back when they first came out (Geforce 3 era)
> 
> They sucked then, and they suck now.
> 
> ...



You can include no Vista and 7 support for nForce2 and nForce3 motherboards from them either (greedy bastards).



MarcusTaz said:


> Did not Nvidia just release a driver that caused video cards to overheat and fry?
> 
> I will stick with ATI as far as the eye can see. Nvidia will not get my business, not after their 7900GTX debacle vista driver and memory 2D sync issues and never fessing up to it. They took my money then and will not take it again. Plus everyone knows they cheat with benchmarks.
> 
> I am very happy with my HD5850 thank you that is green and runs cool...


----------



## Mussels (Mar 20, 2010)

CyberCT said:


> I don't follow what your're saying.  I use the NVIDIA control panel to force vsync and it works.  I don't understand your problem with 3D gaming.  It works perfectly for me in all games except Crysis.  Well, actually, Crysis works fine in 3D if I set everything to low.  But it still looks incredible on low.  You could get a large DLP from Mitsubishi that supports 3D much cheaper than you would get for 6 LCD monitors.  Not to mention the power consumption benefit too.  The 3D glasses were $100 for two BTW.  Not that expensive at all consideriing the final result.  I have the Samsung HL56A650.  Awesome TV for the price almost 2 years ago.
> 
> Again.  Perhaps you should try it now.  I have a GTX 285 FTW from EVGA and my E8500 OC'd to 3.8 GHz.  Works great with 3D gaming like I mentioned above.  Sadly, I think I'll sit out this round until the next gen of cards that are worth the price.  My GFX card is awesome and runs great with everything except Crysis.



Forcing vsync has nothing to do with games not supporting the higher refresh rate - when I first used it, if your game couldn't be forced to 120 Hz, 3D didn't work. Perhaps that's different now, and it only needs to run at 60 with the drivers doing the rest.


----------



## CyberCT (Mar 20, 2010)

Super XP said:


> Multi-Monitor for gaming is awesome if you think about it. Though I don't think its for everybody, its an option that some will buy into providing you got the funds. You don't have to buy 3 monitors right away, you can buy one then wait a bit save up and buy another. They also have some nice thin bezels out there.
> I am not a fan of 3D glasses, I think its stupid that you have to where glasses to play games IMO. Some may like it and others may not. What would be interesting is if they can do it without wearing glasses.
> 
> Just like 3D HDTV's which companies are trying to sell to us, it stinks and I hope it fails unless they stop ripping us off by forcing us to pay $300+ for one pair of 3D glasses. Most if not all HDTV's out today can already do 3D "WITHOUT" 3D glasses. That is what they should be pushing for, not this nonesense 3D glasses and so called 3D HDTV's.



I never understood what was so cool about multi-monitor gaming.  My monitor is 23" and I can't imagine putting more monitors on my desk (they won't fit anyway) for a "wow" factor.  I feel this is a countermeasure by ATI against NVIDIA's native support for 3D gaming. Heck, I can play nearly all my games at 1080p at 60 FPS on my 56" HDTV and they look stunning.  The TV isn't sharp enough to show jaggies beyond 2x FSAA, so the games look flawless.  And that's about the same if not more real estate than 6 monitors next to each other (plus you have to add the hardware to mount all of them).

If you ever tried 3D gaming you would be absolutely amazed.  Games like BioShock (wow), Left 4 Dead, Mirror's Edge, Crysis (on low), etc... the list goes on.  They all run at 120 FPS, so they run very smooth in 3D.  Like I said, every guy and girl I've had over to my place to witness it was absolutely stunned, like me.  You absolutely must try it.

I can't understand why you're saying you hope it fails.  I hope it catches on, if anything.

There is a possibility of having 3D without glasses.  I forget which company showcased it at some trade show (maybe Samsung), but it was preliminary and still a ways off from mainstream production.  Without glasses would be much better; I agree with you there.

I have no idea where you're getting $300 for glasses.  Mine cost me $50 for one pair for the TriDef setup.  Shop around and you'd be amazed what you can find.


----------



## Mussels (Mar 20, 2010)

Supreme Commander worked well for multi-monitor: one monitor was the game, the other a giant version of the minimap (with full zoom functionality - same as the first screen, but no HUD).

Multi-monitor never took off in games because until now very, very few video cards (really only Matrox) allowed THREE monitors at once. The HD 5000 series is the first gaming-grade hardware to offer that, so you don't have a gap right smack bang where your crosshair would be in an FPS game.

So you're saying you can run all those games at a constant 120 FPS - at what resolution and settings, on what hardware?


----------



## eidairaman1 (Mar 20, 2010)

Sorry, $50 for a gimmick that doesn't look good worn out in the world, and provides no functionality beyond the monitor, is lamer than buying the world's best pair of Oakleys or Ray-Bans.




CyberCT said:


> I never understood what was so cool about multimonitor gaming.  My monitor is 23" and I can't imagine putting more monitors on my desk (they won't fit anyway) to a "wow" factor.  I feel this is a countermeasure by ATI against Nvidia's native support for 3D gaming. Heck, I can play nearly all my games at 1080p at 60fps on my 56" HDTV and they look stunning.  The TV isn't sharp enough to show jaggies beyond 2x FSAA so the games look flawless.  And that's about the same if not more real estate than 6 monitors next to each other (also you have to add the hardware to mount all of them).
> 
> If you ever tried 3D gaming you would be absolutely amazed.  Games like Bioshock (wow), Left 4 Dead, Mirrors Edge, Crysis (low), etc ... the list goes on.  They all run at 120 fps so they run very smooth in 3D.  Like I said, eveyone guy and girl I've had over my place just to whitness it was absolutely stunned, like me.  You absolutely must try it.
> 
> I have no idea where you're getting $300 for glasses.  Mine cost me $50 for one pair for the tridef setup.  Shop around and you'd be amazed what you could find.


----------



## CyberCT (Mar 20, 2010)

Mussels said:


> supreme commander worked well for multi monitor, one monitor was the game, the other a giant version of the minimap (with full zoom functionality - same as first screen, but no HUD)
> 
> multimonitor never took off in games because until now, very very few (matrox only really) video cards allowed THREE monitor at once. HD5K is the first gaming grade cards to offer that, so you dont have a gap right smack bang where your crosshair would be in an FPS game.
> 
> ...



I have an E8500 OC'd to 3.8 GHz and an EVGA GTX285 FTW edition running on XP at 1080p on my HDTV for 3D gaming.  Like I said, I'll probably pass on this round of cards because I'm a bit disappointed with their performance (only taking NVIDIA's preliminary review numbers though).  My card runs everything fine.  I can OC my CPU and GFX card more too, just in case.  They work great!


----------



## eidairaman1 (Mar 20, 2010)

CyberCT said:


> I have an E8500 OC'd to 3.8 GHz and an EVGA GTX285 FTW edition running on XP at 1080p on my HDTV for 3D gaming.  Like I said, I'll probably pass on this round of cards because I'm a bit disappointed with their performance (only taking NVIDIA's preliminary review numbers though).  My card runs everything fine.  I can OC my CPU and GFX card more too, just in case.  They work great!



Here's the other problem: games are starting to transition to Vista and 7 only, basically meaning DX10/DX11 support only, no DX9 support any longer.


----------



## CyberCT (Mar 20, 2010)

eidairaman1 said:


> Sorry 50 USD for a Gimmick that doesn't look good wearing them out in the world or provide any functionality other than the Monitor is lamer than buying the Worlds Best Pair of Oakley's or Ray Ban.



Haha!!  Whatever.  Enjoy your 6-monitor setup.  That will look funnier on a desk than any type of Buddy Holly glasses on your face.  To each their own.  You don't know what you're missing out on.  I guess ignorance is bliss??


----------



## CyberCT (Mar 20, 2010)

eidairaman1 said:


> Heres the other problem, games are starting to transition to Vista and 7 only, basically Meaning DX 10- DX 11 support only, no DX 9 Support any longer.



Well, I have a HUGE library of games that support DX9 and run at 120 FPS to match the 120 Hz on my HDTV, so no worries here.


----------



## eidairaman1 (Mar 20, 2010)

CyberCT said:


> Well I have a HUGE library of games that supports DX9 and runs at 120 FPS to match the 120 Hz on my HDTV, so no worries here.



Ensure you are using the multi-quote button at the bottom of each post you want in a single post, rather than posting one right after the other. Vista and 7 do include DX9 support, so you can play your games from around 2002/2003 to 2009.


----------



## Mussels (Mar 20, 2010)

So in other words, CyberCT, you're talking out your arse.

I specifically mentioned DX10 and 11 for both performance and refresh-rate reasons, and you told me that yours works fine - but if you're on XP, you can't use DX10 or 11.

Please don't try and tell us everything is awesome and great when you aren't running at even medium graphics (DX10) in modern games.

(STALKER: CoP, Metro 2033 and Bad Company 2 all offer DX9, 10 and 11 modes - and DX11 is the best-looking and most demanding in all of them)


----------



## CyberCT (Mar 20, 2010)

Mussels said:


> so in other words cyberCT, you're talking out your arse.
> 
> I specifically mentioned DX10 and 11 for both performance and refresh rate reasons and you told me that yours works fine - but if you're in XP, you cant use DX10 or 11.
> 
> ...



LOL!  I'm not talking out of my arse.  I know I can't use DX10 or DX11 in XP.  Until the next gen of consoles, DX10 and DX11 will take off very slowly because, obviously, the X360 & PS3 run on DX9.

Tell me what I'm majorly missing in DX10 & DX11 over DX9.  My buddy has Windows 7 and an ATI 5850 card.  Far Cry 2 looks "barely" better in DX10.  Halo 2 looks like a high-res X360 version in DX10.  It's another game that could have been done in DX9 (and others have made it work on DX9 on the net), but it was a ploy by MS to push Vista, just like Crysis "Very High" mode, which is easily done in DX9 with a few changes to the files.  We both have BC2 also, and I do see a little bit of a difference in DX11.  But not enough to make me excited.  Actually, I'll bet DICE could have made it look just as good under DX9, but they didn't try, perhaps because of MS.  Heck, FSAA isn't possible in BC2 in DX9, and that's just BS in my book, because it runs better on my PC than my friend's without FSAA on either system.  FSAA in DX9 is possible in the rest of the games I own.

I have no interest in STALKER: CoP, Metro 2033, or the new AvP game.  Maybe DiRT 2, but I'll wait until later this year / next year when I upgrade, and I'm sure it will be packaged with the DX11 card I buy.  Look at people's opinions around the web on DX11 games thus far.  People aren't that enthused about how much "better" games look compared to the DX9 versions, nor how much slower they run.

All my DX9 games I run at max settings with at least 2x FSAA (except Crysis) and they run at a very fluid frame rate without slowdown.  I don't have any games that are DX8 or below installed on my system.  Only DX9.

Until we see the Xbox 720 & PS4, we are stuck at a stalemate for mainstream technology progression.


----------



## AsRock (Mar 20, 2010)

If true, they beat a 6-month-old card, lol.  I'm sure W1z will prove what the truth is and what is not.


----------



## Zubasa (Mar 20, 2010)

CyberCT said:


> LOL!  I'm not talking out of my arse.  I know I can't use DX10 or DX11 in XP.  Until the next gen of consoles, DX10 or DX11 will take off very slowly because obviously the X360 & PS3 run on DX9.
> 
> Tell me what I'm majorly missing from DX10 & DX11 from DX9.  My buddy has Windows 7 and an ATI 5850 card.  Far Cry 2 looks "barely" better in DX10.  Halo 2 looks like a high rez X360 version in DX10.  It's another game that could have been done in DX9 (and others have made it work on DX9 on the net) but it was a ploy by MS to push Vista, just like Crysis "Very High" mode which is easily done on DX9 with a few changes to the files.  We both have BC2 also, and I do see a little bit of a difference in DX11.  But not enough to make me excited.  Actually, I'll bet Dice could have made it look just as good under DX9 but they didn't try perhaps because of MS.  Heck, FSAA isn't possible in BC2 in DX9, and that's just BS in my book because it runs better on my PC than my friend's without FSAA on either system.  FSAA in DX9 is possible in the rest of the games I own.
> 
> ...


So you buy a DX10 graphics card to run games in DX9? You might as well get a console and be done with it 
You want to know why BF:BC2 does not support FSAA in DX9? Because they simply ported that from the consoles, which couldn't afford FSAA. 

DX11 is actually catching on much better than DX10; at the very least there are games that support DX11 within the first 6 months of the hardware being released.
DX10 fell flat on its face mainly because of how much people hated Windows Vista, and the fact that NVIDIA released wimpy mid-range 8600 GTs that couldn't even run DX9 games maxed out.

The point about Far Cry 2 not looking much better in DX10?
That game is based on the Dunia engine, a modified CryEngine, and it is a DX9 engine with added DX10 support.
The Halo 2 engine is mainly designed for the Xbox, which makes it an even worse example.
By the way, Halo 2 was released in 2004, way before Vista even existed. *These are DX9-native games.*

*Most important of all, this is a thread about the new DX11 GPU from NVIDIA, not some console-vs-PC thread.
Mussels' earlier post was to point out that NVIDIA's 3D Vision has too many limitations, and that includes FPS/refresh-rate issues in DX10/11 that have nothing to do with vsync.*


----------



## tkpenalty (Mar 20, 2010)

CyberCT said:


> Haha!!  Whatever.  Enjoy your 6 monitor setup.  That will look funnier on a desk than any type of buddy holly glasses on your face.  To each their own.  You don't know what you're missing out on.  I guess ignorance is bliss??



So what does 3D actually help with? Just eye candy, and nothing more; it's not like you need to judge distance (and you can't, really). For it to even work well you'd have to keep your head still. 

However, a 6-monitor setup (or 3) can be wrapped around your field of vision, and lets you see more of what's going on in games. It's especially useful in racing games: in a normal 16:9 ratio you'd rely on a button to check who's beside you, whereas with multiple monitors you could just glance to the side quickly without being distracted by the button. 

The difference is that multiple monitors actually facilitate gaming, whilst 3D doesn't do anything but create an illusion. 

You may find your 23 inch screen big enough, but others who want more than just eye candy would object.


----------



## CyberCT (Mar 20, 2010)

Zubasa said:


> So you buy a DX10 graphics card to run games in DX9, you might as well get a console and be done with it
> You want to know why BF:BC2 does not support FSAA in DX9? Because they simply port that from the console which couldn't afford FSAA.
> 
> DX11 actually is taking on much better than DX10, for the very lease there are games that support DX11 within the first 6 months the hardware is released.
> ...



Did you read that I have XP?  I only have DX9 on my PC, which is perfectly fine.  Even though I have an X360, I'd much rather have my games running at 60fps at 1080p than 30fps at 720p.  To each their own.



tkpenalty said:


> So what does 3D actually help with? Just eyecandy, and nothing more, its not like you need to judge distance (and you can't really). For it to even work well you'd have to keep your head still.
> 
> However a 6 monitor setup can be wrapped around your field of vision (or 3), and lets you see more of whats going on in games. Especially more useful in racing games, where in a normal 16:9 ratio, you'd rely on a button to check who's beside you, whereas in a multiple monitor setup you could just duck your eye to the side quickly without being distracted by the button.
> 
> ...



Well, like you also said, to each their own.  I see absolutely no reason for a 3 (let alone 6) monitor setup for gaming.  Maybe for day trading stocks, but nothing else.  I couldn't care less whether I turn my eyes left or right to see a car on my side, or quickly press a button to see the same thing on my single monitor.  And it would cost more than a grand to get a 6-monitor setup (worthless for anything other than PC eye candy) with a very thin bezel; how much power would they consume altogether, and would it run at 60fps on the same setup as a single monitor?  I find it strange you say you have a hard time doing 3D in DX10/11, because Nvidia has native support for stereo 3D in Vista or Windows 7 only (DX9/DX10/11, not XP), and the reviews say it works great.  I have to use D3D drivers or IZ3D drivers for stereoscopic 3D, both of which work fine for me.  You can call 3D a gimmick, but I call 6 monitors a gimmick, especially since there are thick bezels in between monitors.  If the bezels were .01" thick, then maybe I could agree with you.  I've seen youtube videos (and even a console setup at Best Buy) of a 6-monitor setup and it looks ridiculous.  Like looking through a paned window:

http://www.youtube.com/watch?v=X6jYycRmWz4

3D, however, looks amazing as everything jumps out at you.  I don't play like that daily; just once in a while as a treat, or to show it off.  An ultra high-res 23" monitor that supports 2560 x 1600 would be awesome because there would be no need for FSAA.  Until then, a 56" DLP would probably take up no more real estate than a 6-monitor setup, consume less power, and be cheaper.  It could also be used as a regular TV and monitor, with no thick bezels in between.  It could also do 3D if I please.

But whatever, to each their own.


----------



## tkpenalty (Mar 20, 2010)

CyberCT said:


> You can call 3D a gimmick, but I call 6 monitors a gimmick, especially since there are thick bezels in between monitors.  If the bezels were .01" thick, then maybe I could agree with you.  I've seen youtube videos (and even at Bestbuy as a console setup) of a 6 monitor setup and it looks retarded.  Like looking through a paned window:
> 
> http://www.youtube.com/watch?v=X6jYycRmWz4
> 
> ...



Bezels aren't really a problem because in the end you see literally 4000% more. It's like saying that the windows on the side of a car are useless because there are roof support structures on either side... 

imo it's useless in FPS games though. 

Anyway, back on topic.


----------



## Deleted member 67555 (Mar 20, 2010)

nice Title..
I hope it's proven correct when these cards are released and at a competitive price..

ok gonna go ride my unicorn across the river of chocolate now...


----------



## the_wolf88 (Mar 20, 2010)

I don't believe a benchmark from Nvidia!!

We need to see reviews from other sources, not from Nvidia!!

Of course Nvidia will show they're winning in their own benchmarks, even if they're losing to ATI..

Anyway, 6 days to go and we'll see who is the KING OF HELL!!


----------



## rizla1 (Mar 20, 2010)

simlariver said:


> yup, not only the performance is behind, but the feature-sert as well. No multimonitor gaming (you can game on max 3 monitor with a SLI setup, fail) and 3D gaming uses proprietary expensive glasses (not that anyone cares about 3D gaming) and still no word for 7.1 sound on the hdmi output. probably non-existent.



Who in the world uses 6 monitors for gaming other than the few super rich?  And for desktop use, 2 is enough, is it not?


----------



## Zubasa (Mar 20, 2010)

CyberCT said:


> Did you read that I have XP?  I only have DX9 on my PC, which is perfectly fine.  Even though I have an X360 Id much rather have my games running at 60fps at 1080p than 30fps at 720p.  To each their own.


Yes, I did read that you have XP; just because you are running XP does not mean the issue can be ignored.

Newer games are developed with DX10 and 11 in mind, and just because you don't care about quality doesn't mean quality does not matter.
There is an increase in quality from DX9 to DX10 and again to DX11 *(with tessellation, which is the topic), but whether it is significant or not is your opinion.*
The whole point of nVidia's "3D Vision" is also, in a way, to increase quality "to the eye"; if you don't care about that, why bother posting?

Eyefinity (the "stupid multi-monitor gaming" as you put it) on the other hand is a pretty much fail-safe feature.
It does not require a high refresh-rate monitor, ultra high-end hardware or expensive shutter glasses.

I think I have gone off-topic far enough, I will leave it here.


----------



## temp02 (Mar 20, 2010)

Just one question: did they use different versions of the software to compare different hardware, like the benchmark result "released" the other day (where nVidia used a custom, not-yet-public version of Heaven Benchmark, v1.1), or is this an "OK" result where they ran both benchmarks (on both cards, on the same day) using the same version of the software?


----------



## Zubasa (Mar 20, 2010)

temp02 said:


> Just one question, did they use different versions of a software to compare different hardware like the benchmark result "released" the other day (were nVidia used a custom yet to be released to public version of Heaven Benchmark, v1.1) or is this an "OK" result where they did both benchmarks (on both cards, on the same day) using the same version of the software?


From the way it looks, it is still the same graph as yesterday's.


----------



## Mussels (Mar 20, 2010)

honestly, i shouldn't have said the talking out your arse part. typing out your arse would have made more sense - apologies for being rude, regardless.


----------



## Black Panther (Mar 20, 2010)

If this card runs only slightly better than the 5870 but turns out to be much more expensive (as has always been the norm for Nvidia), then one would be better off buying a 5970, I guess.


----------



## simlariver (Mar 20, 2010)

rizla1 said:


> Who in the world uses 6 monitors for gaming other than the few super rich?   and for desktop 2 is enough is it not?



It's not only for gaming; having multiple monitors for visualization, remote management, CAD design, movie editing, security, etc. is definitely a plus.


----------



## Mussels (Mar 20, 2010)

simlariver said:


> it's not only for gaming, having multiple monitors for visualization, remote management, cad design, movie editing, security, etc... Is definitely a plus.



my brother works as a security guard where they use some ridiculously expensive hardware to get multi monitor for the banks of security cameras - a single eyefinity 5/6 card would solve all their issues.


----------



## locoty (Mar 20, 2010)

3D, I enjoy it only in movies, not games.

Multi-monitor? Eyefinity? If it's not good, why did nvidia copy it? Why did they bother to release Nvidia Surround?

Someday Samsung, LG, Dell or another LCD maker will release a monitor whose bezel can be taken off for the purpose of multi-monitor.

FERMI will be out next week, and I think ATI is preparing a 5890 to counter it; that's why ATI prohibited its partners from releasing high-clocking 5870s, because those clocks are reserved for the 5890.


----------



## Mussels (Mar 20, 2010)

locoty said:


> Someday, Samsung, LG, Dell or other LCD maker will release a monitor which its bezel can be taken off for the purpose of multi monitor



i'm expecting that within a year or two actually, or at the very least a very thin bezel on the sides. Until eyefinity, there was no real market to bother doing it for.


----------



## runnin17 (Mar 20, 2010)

Sasqui said:


> Why compare one selective synthetic benchmark against a 5870, then turn around and provide a whole performance chart relative to a GTX 285?
> 
> Confused marketing.



Welcome to the wonderful world of nVidia marketing. They know that their card won't offer any true performance gains for the price and power draw so they are relying on the nVidia fanboys to do all the buying and recommending of the card to others for them.

nVidia = fail.

I do hope the card is not a complete bust though, b/c I want some price wars.

PRICE WARS = Crossfire or even Tri-fire 5870's for me


----------



## Fourstaff (Mar 20, 2010)

the_wolf88 said:


> Anyway 6 days to go and we'll see who is the KING OF HELL !!



It's obviously Kratos.


----------



## newtekie1 (Mar 20, 2010)

[I.R.A]_FBi said:


> But if the aim is X amount of performance and not a certain amount of money spent or a certain part why should it matter?



I've never bought a part based on the aim of a certain amount of performance, and I don't think many others do either.  I always buy the best part the budget can afford.  

Comparing overclocked results to stock results in discussions like this always makes the person look silly.


----------



## Deleted member 67555 (Mar 20, 2010)

newtekie1 said:


> I've never bought a part based on the aim of a certainly amount of performance, and I don't think many others do either.  I always buy the best part the budget can afford.
> 
> Comparing overclocked results to stock results in discussions like this always makes the person look silly.



And on that note, every benchmark I see using ANY OCs gets instantly dismissed by me, which just so happens to be about 90% of benchmarks on the web. That is, unless it's showing the difference between stock and an OC.


----------



## Imsochobo (Mar 20, 2010)

CyberCT said:


> Did you read that I have XP?  I only have DX9 on my PC, which is perfectly fine.  Even though I have an X360 Id much rather have my games running at 60fps at 1080p than 30fps at 720p.  To each their own.
> 
> 
> 
> ...




Upgrade to Win7 and see that you get 75fps at 1080p in DX11 instead of 60fps at 1080p in DX9.

It's true


----------



## phanbuey (Mar 21, 2010)

Imsochobo said:


> Update to win7 and see that you get DX11 75fps 1080 P instead of DX9 60 fps 1080P
> 
> Its true



No it's not, because he gets 60fps whether the card renders 60fps or 150,000fps; he picked that number because he knows what he's talking about. A lot of consoles render at 30 because they can't render the whole 60.


----------



## Valdez (Mar 21, 2010)




----------



## Mussels (Mar 21, 2010)

far cry 2 shows up an awful lot there... it's not even DX11 (and it's an nvidia-sponsored game, regardless)

the performance gains just aren't very high, considering how much more power this card uses at load.


----------



## TAViX (Mar 21, 2010)

Valdez said:


> http://i39.tinypic.com/14xp3qh.png



Where did you find this garbage??? No cards released, yet you have the benchies???? WTF?! Another BS!


----------



## Mussels (Mar 21, 2010)

TAViX said:


> Where did you find this garbage??? No cards released, yet you have the benchies???? WTF?! Another BS!



it's quite possible for it to not be fake. We know for a fact reviewers have the cards under NDA - someone could have leaked the images.


----------



## Valdez (Mar 21, 2010)

TAViX said:


> Where did you find this garbage??? No cards released, yet you have the benchies???? WTF?! Another BS!



I don't know if it's fake or not; I just found it on another forum.


----------



## TheMailMan78 (Mar 21, 2010)

I love the title of this thread. "NVIDIA Claims Upper Hand in Tessellation Performance!" Like Nvidia would ever say "NVIDIA Claims *LOWER* Hand in Tessellation Performance!" 

I mean F#$K it! Here's the new title. 

TheMailMan78 Claims Upper Hand in Tessellation Performance! Here's charts to prove it!


----------



## Mussels (Mar 21, 2010)

TheMailMan78 said:


> I love the title of this thread. "NVIDIA Claims Upper Hand in Tessellation Performance!" Like Nvidia would ever say "NVIDIA Claims *LOWER* Hand in Tessellation Performance!"



"nvidia claims upper hand in something no one cares about, cause they suck in everything else"

is how i read it.

You know they're reaching when marketing cant even make it sound good.


----------



## TheMailMan78 (Mar 21, 2010)

Mussels said:


> "nvidia claims upper hand in something no one cares about, cause they suck in everything else"
> 
> is how i read it.
> 
> You know they're reaching when marketing cant even make it sound good.



Yeah, but they do not have a cool chart like me. TheMailMan78 tessellation support beats AMD and Nvidia.


----------



## DeathByTray (Mar 21, 2010)

Let's say cat is a synonym for real and dog is a synonym for fake.

So, W1zzard, is this a cat or a dog?
http://i39.tinypic.com/14xp3qh.png


----------



## TheMailMan78 (Mar 21, 2010)

Damn with a graph like mine I figured everyone would want a MailMan!



DeathByTray said:


> Let's say cat is a synonym for real and dog is a synonym for fake.
> 
> So, W1zzard, is this a cat or a dog?
> http://i39.tinypic.com/14xp3qh.png



It's a Hermaphrodite!


----------



## dir_d (Mar 21, 2010)

I wonder if they were using 10.3a in those Dirt 2 benches...I gain 15FPS over the 10.2 by switching to 10.3a. I was seriously amazed


----------



## the54thvoid (Mar 21, 2010)

http://www.techpowerup.com/reviews/MSI/R5870_HD_5870_Lightning/15.html

W1zz's own 5870 review puts the basic 5870 at 76 fps at that resolution in Far Cry 2, and that's with 9.12 drivers.  So it's fair to say there's a lot of 'uncertainty' about the validity of the setup, if not the authenticity of the graphs.

Also, the Dirt 2 graph states 4xAA, while the Far Cry 2 graph states nothing about settings.  Therefore, it's a crap post regardless.  In science we need to make comparisons based on facts and numbers, not unsourced, unspecified benchmarks.

And for the hell of it, if you added dir_d's extra 15fps from the 10.3a drivers, that makes 91 fps.  Yes, I know that's not a valid statement, but given how unspecific those graphs are, why not make it.  And that puts it on a par with the GTX 480.

Game, set and match to both my logic and my use of illogic.


----------



## erocker (Mar 21, 2010)

TAViX said:


> Where did you find this garbage??? No cards released, yet you have the benchies???? WTF?! Another BS!



I wouldn't say BS, but anyone can make a graph at any time using believable or unbelievable facts and figures. Four and a half more days and the truth will be revealed. I don't think anyone will be surprised when it comes to the results. Anything until the 26th is speculation and/or fake.


----------



## the54thvoid (Mar 21, 2010)

http://www.fudzilla.com/content/view/17754/40/

And this, with Catalyst 10.2, has Dirt 2 @ 1920x1200 with 4xMSAA at full settings in DX11 running at 59.3 fps, which is 13 fps HIGHER than the GTX 480 gets in the unspecified charts.  This is most clearly not a genuine graph, and if it is, it severely understates the performance of the 5870.
(It also gives 47 fps at 2560x1600 = still beats the GTX 480.)
Wins for me again.

(And I used Fuad's site - most def not an ATI site.)


----------



## Valdez (Mar 21, 2010)

erocker said:


> Anything until the 26th is speculation and or fake.



Exactly. 

Don't get angry at a graph which nobody claims as authentic.


----------



## Benetanegia (Mar 21, 2010)

the54thvoid said:


> http://www.fudzilla.com/content/view/17754/40/
> 
> And this, with catalyst 10.2 gives Dirt 2 @ 1920x1200 with 4xMSAA at full settings on DX11 gives 59.3 fps which is 13 fps HIGHER than the GTX 480 gets from the unspecified charts.  This is most clearly not a genuine graph and if it is, it severely compromises the performance of the 5870.
> (It also gives 47 fps at 2560x1600 = beats GTX 480 still.)
> ...



Apart from what erocker and Valdez said, you are aware that benching in a different spot of a map/level/track will always net *very different* results, right? You can never compare benchmarks from different sites.


----------



## the54thvoid (Mar 21, 2010)

Uh huh.  I know.

My point is that a snippet of a bench or unsupported graphs are meaningless.  

I quote myself:

"Game set and match to both my logic and my use of illogic."

_use of illogic_ - see, I know my comparisons aren't valid; I'm just saying look what unsupported info gives you.  And yes, I'll wait for W1zz's review.  Until then we only know it's late and hot.


----------



## Benetanegia (Mar 21, 2010)

I understand your point, but this is no science. This is a thread for especulation and everybody here understands that.

There's no reason to believe those graphs are legit, but there's no reason to believe they are fake either. Everybody here understands that too. There's no need to put all the effort you put into proving those graphs could be fake, because no one has claimed them to be true, as Valdez said.

I'm not trying to be critical or offend you, but IMO the only problem with those graphs is your inability to deal with especulation. I'm not trying to be harsh, I'm just saying there's no need to prove or disprove speculation.


----------



## DeathByTray (Mar 21, 2010)

TheMailMan78 said:


> Its a Hermaphrodite!


----------



## TheMailMan78 (Mar 21, 2010)

Benetanegia said:


> There's no reason to believe those graphs are legit, but there's no reason to believe those graphs are fake either.


 Of course there is reason to think those graphs are fake. They are not from a TPU review. Anything less would be uncivilized...


----------



## SetsunaFZero (Mar 21, 2010)

Valdez said:


> http://i39.tinypic.com/14xp3qh.png



Even if these benches are real, I would go ATI this time. Electricity isn't cheap in Austria.
nvidia fails this time in performance and power efficiency; ATI's HD5k is the win IMHO


----------



## the54thvoid (Mar 22, 2010)

Benetanegia said:


> I understand your point, but this is no science. This is a thread for especulation and everybody here undestands that.
> 
> There's no reason to believe those graphs are legit, but there's no reason to believe those graphs are fake either. Everybody here understands that too. There's no reason to put all the effort you put into proving those graphs could be fake, because no one has claimed them to be true, as Valdez said.
> 
> I'm not trying to be critical or offend you, but IMO the only problem with those graphs is your innability to deal with especulation. I'm not trying to be harsh, I'm just saying there's no need to prove or disprove speculation.



Dude, chill.

I am neither offended nor do I feel chastised.  All the GTX 4xx info thus far is necessarily speculation (I refuse to use the term 'e'speculation).  However, I thought unsourced info showing one side to be superior was akin to trolling.
I've seen folk do it here before.
And for the cheap seats - it took no effort to google two reviews.  If I wanted to put effort into something, it wouldn't be this.  I'd rather put more effort into deciding which album I prefer - Ride the Lightning or Master of Puppets.

So let's hug and wait for the 26th?


----------



## Deleted member 67555 (Mar 22, 2010)

I hope those charts are completely true...

THINK ABOUT PRICES PEOPLE..

Plus that just means quicker ATI 6000 series which means even LOWER PRICES

and when I say low I mean i want my 5830 for $159 or LESS where it should be!!


----------



## DeathByTray (Mar 22, 2010)

I've heard Nvidia's driver aren't even beta worthy, can you comment on that W1zzard? CAT OR DOG?


----------



## Benetanegia (Mar 22, 2010)

the54thvoid said:


> Dude, chill.
> 
> I am neither offended or feel chastised.  All the GTX 4xx info thus far is necessarily speculation ( i refuse to use the term 'e'speculation).  However, i thought unsourced info showing one side to be superior was akin to trolling.
> I've seen folk do it here before.
> ...



No offense, but you are again wrongly judging those charts (and the act of posting them) and, in the process, giving them the importance you were trying to avoid with your posts. Trolling? He just posted some graphs, which may or may not be legit, but don't kill the messenger. Besides, the charts can easily be legit, or at least the performance lead they are showing can be true. It's not far-fetched, not at all. First we have FarCry 2, a game that runs better on Nvidia hardware, and second we have Dirt2 in DX11, which means tessellation, where we know Nvidia is much, much faster. Dirt2 has little tessellation, maybe not even worth mentioning performance-wise, but everything adds up.

The only thing I agree with is the last sentence: let's wait for the 26th, because only then will we know if the rumors and speculation are true or not. We know nothing or very little about these cards, and *everything* we know falls into the same bag: speculation. Showing some charts where one side is faster is hardly trolling, especially when there's no additional comment and... it might even be true. Posts akin to yours, on the other hand, are far closer to trolling. They don't really help at all and the forums are full of them (which makes them redundant). Posts like the one Valdez made are the helpful ones; their content might be legit or not, but like I said, it's just naked info, and people in these forums are intelligent enough to distinguish between (probably/maybe) fake and (probably/maybe) legit information.

No one needs to be reminded at every turn that rumors/speculation and pre-release graphs can be fake. We give those charts no more importance than they deserve. Like I said, the only one who gave them real importance in this thread so far was you, and I only pointed out that you were using bad measures to treat/refute the info, because comparing benchmarks from different sources proves nothing.

Now it's clear you did get offended, so sorry for that; not my intention, and neither is it with this post. Let's wait a week and we will know which info was legit and which was not.

PS. I love Metallica; you'd be better off searching for anything about them than trying to figure out just another way to demonstrate how Nvidia has failed, while we don't even know if they truly failed yet.

PS2. Thanks very much for correcting my spelling. I really appreciate it when people correct my English, because let's be honest, what kind of person is unable to write English properly? It must be some kind of illiterate, or worse, some kind of heretic for whom English is just his 4th language in order of importance or something wild like that. Not mad at you BTW, only at people whose mother language is English, in general. I can speak more or less 7 languages, and only with English do you find people who correct you at every turn. Sadly you fell into that demographic with that comment, and we do feel offended by those kinds of comments.


----------



## the54thvoid (Mar 22, 2010)

> PS2. Thanks very much for correcting my spelling. I really appreciate when people correct my english, because let's be honest, what kind of person is unable to write english properly? It must be some kind of illiterate, or worst , some kind of heretic for whom english is just his 4th language in order of importance or something wild like that. Not mad at you BTW, only at people whom mother language is english, in general. I can patter more or less 7 languages and only with english you find people who correct you after every turn. Sadly you fell under that demographic with that comment and we do feel offended by that kind of comments.



Ben.  Ben.  Ben.  Let's stop this now.  I wasn't correcting your English! You used the term *especulation*. I thought the 'e' meant 'electronic', as in e-mail or e-tailer.  I wasn't correcting you, lol.  I thought it was one of those modern terms used for online stuff.

I'm not a spelling nazi - I only use one language (harsh language), so all credit to the multi-linguists.

I think somewhere we've got our wires crossed and perhaps the tone of my post was lost long ago.  I'm not going to bother explaining any more; I figure no matter what I say you'll have an equally long 'yes, but...' post. (When in fact we're probably singing the same words to a different tune.)

So, to summarise - no offence to you - I honestly was not correcting your spelling - I thought you were using a tech term by prefixing the letter 'e' to another word.  Like this e-discussion.

As for Metallica - I know it's Master of Puppets. 

And remember one thing - my use of English is minimal - I'm Scottish - Scandinavians sound better at English than us!


----------



## nt300 (Mar 22, 2010)

Can't wait for CCC 10.4; something new is going to be added that will make the HD 5000s special, but I can't tell right now.


----------



## the54thvoid (Mar 22, 2010)

Something that will speed up loading times on BC2?  DX11 and ATI+ BC2 = Slow load.  I always miss the vehicles at map starts...


----------



## SNiiPE_DoGG (Mar 22, 2010)

the54thvoid said:


> Something that will speed up loading times on BC2?  DX11 and ATI+ BC2 = Slow load.  I always miss the vehicles at map starts...



That's really an HDD/RAM issue, mate. DX11 has to load more data, of course, and it emphasizes any data-transfer speed bottlenecks you may have.


----------



## the54thvoid (Mar 22, 2010)

I've got 6GB of DDR3 triple channel on an i7 920, and a Samsung Spinpoint HDD.  My PC is actually quite fast.

This link covers the problem well.

http://forums.electronicarts.co.uk/...c2-pc-still-slow-loading-map-good-system.html

It's to do with the drivers having to reload the map shaders each time...  NV have it sussed.

I've since changed the settings.ini file to force DX9 and it loads far quicker now.  Bah, ATI drivers.
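For anyone wanting to try the same workaround, it's a one-line edit to settings.ini (under Documents\BFBC2 on my install); the key name below is from memory, so double-check it against your own file before changing anything:

```ini
; BFBC2 settings.ini - renderer selection (key name from memory, may vary by patch)
; Default is "auto", which picks DX10/11 on capable hardware.
DxVersion=9
```

Set it back to auto (or delete the line) to return to DX11 once the ATI shader-reload issue is fixed.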


----------



## the54thvoid (Mar 22, 2010)

Ben?

Found something over at Hexus.

http://www.hexus.net/content/item.php?item=23032

Those damn graphs we're talking about earlier are Nvidia's own in house benchies.  

Nuff said.


----------



## araditus (Mar 22, 2010)

"We can't comment on the authenticity of the results, but the figures reinforce the belief that NVIDIA's GeForce GTX 480 will become the fastest single-GPU graphics card on offer."

Scroll down on that Hexus chart. As most have said, nothing is real till the 26th, and it's only 1 game (not to mention one of the most GPU-biased games ever), with an unknown system, and I've heard of this cool program called Photoshop.


----------



## DeathByTray (Mar 22, 2010)

The slow loading times in BC2 DX11 is a known issue. ATI is working on it.


----------



## Marineborn (Mar 22, 2010)

Get a quad core or more; it'll help your load times, at least it does mine. I get into games in under 30 seconds, and I only have a 7200rpm hard drive.


----------



## CyberCT (Mar 22, 2010)

Well, I guess the only way to know if Nvidia is worth it or not this round is within a week.  I was thinking about upgrading to Windows 7 next month, but we will see.  Maybe I'll wait until the DX11 cards compete and drop in price a bit more.

I was actually spending a lot more time looking into the whole eyefinity thing.  The videos on youtube that show no bezels actually do look pretty cool.  I guess with their recording software or whatever they were able to zoom in a certain way to eliminate the bezels of their actual tri-monitor setup.  Over the last few years I remember news of some developers coming out with large widescreen monitors that don't have bezels in between what would be regular monitors' resolutions in the same setup.  Specifically the one from Alienware, and I forget the other one.  But after some research they both cost almost $8,000 ... ridiculous.  The bezels are still too thick to be seamless at a good price point with current monitors on the market.  Heck, even the Samsung 3- or 6-monitor display that's supposedly made with eyefinity in mind costs a heck of a lot of money. 

With much less than that money, I could buy a 1080p DLP projector and project a 130" screen, which would wipe the floor with eyefinity.  I'm not saying ATI's technology is bad, but in the future, when bezels either get ultra thin or merged monitors with no bezels enter the market, then maybe it would make more sense.  I have yet to find a 1080p 3D projector for a great price, but I'm sure one will come out within the next few years.  I still feel 3D is awesome for games and movies, and I have yet to hear the opposite from anyone who comes to my place to see it once in a while.  Maybe you naysayers don't have it set up correctly, or your display is too small?


----------



## TheMailMan78 (Mar 22, 2010)

Benetanegia said:


> Dirt2 on DX11, which means Tesselation, where we know Nvidia is much much faster. Dirt2 has little tesselation, maybe not even worth mentioning performance wise, but everything adds up.:


And you know Dirt 2 was developed on ATI hardware, much like Farcry 2 was developed on Nvidia hardware, so HOW do you figure this chart makes any sense? Especially since Nvidia doesn't even have a DX11 card out yet?!? That's a MAJOR flaw in your logic. Are these charts real? Could be. However, if they showed the new Nvidia cards in a negative light, I wonder if you would be so protective.


----------



## Benetanegia (Mar 22, 2010)

the54thvoid said:


> Ben?
> 
> Found something over at Hexus.
> 
> ...



First of all, sorry for the confusion with the language thing; you have to admit there are many language Nazis out there, but I can see that I overreacted. I was trying to make a satirical comment about something that happens a lot, but after reading it back I see I failed in my intention.

*Geforce GTX 480 TDP at 250W* - http://www.fudzilla.com/content/view/18170/1/

That's why you can't believe any rumors, much less make any assumptions based on them. Which one is true now? 300W? 250W? You see?



TheMailMan78 said:


> And you know Dirt 2 was developed on ATI hardware much like Farcry 2 was developed on Nvidia hardware so HOW do you figure this chart makes any sense? Especially since Nividia doesn't even have an DX11 card out yet?!? Thats a MAJOR flaw in your logic. Are these charts real? Could be. However if they showed the new Nvidia cards in a negative light I wonder if you would be so protective.



One word: TESSELLATION.

Regarding the issue with the language, I admit I overreacted, but it has nothing to do with arrogance, not mine at least. If anything, it's the (probably unconscious) arrogance of the average English-speaking people, who always find the time to correct other people's grammar, that made me write about that. That was not the case here (he was not correcting), and I apologise to the54thvoid, but it's absolutely NOT YOUR issue. I'm starting to think you are in love with me anyway, because you step into my discussions at every opportunity you find, and well, I'm not used to so much attention (it's usually me who has to go after women). Sorry, and I will use my arrogance this time: you are not my type.


----------



## PCpraiser100 (Mar 23, 2010)

Will the world of computing ever acknowledge that a) there are still driver issues with ATI, and b) Nvidia's upcoming series will kill your wallet?

I'm being patient with the upcoming Catalyst, are you?


----------



## EchoMan (Mar 23, 2010)

PCpraiser100 said:


> Will the world of computing ever know that a) there are still driver issues with ATI
> 
> I'm being patient with the upcoming Catalyst, are you?



Not to sound nasty but I've heard the same notion ever since ATI could be found on the map (9000 series)... Pretty sure it's known.


----------



## jellyrole (Mar 23, 2010)

Their drivers improve with every release. Give them props for having shitty drivers, which are improving, yet hardware so amazing that it makes up for the poorly performing drivers. It shows that they're doing something right and improving on something else.


----------



## Zubasa (Mar 23, 2010)

PCpraiser100 said:


> Will the world of computing ever know that a) there are still driver issues with ATI b) The fact that Nvidia's upcoming series will kill your wallet.
> 
> I'm being patient with the upcoming Catalyst, are you?


You forgot c) nVidia's drivers simply destroy your card


----------



## TheMailMan78 (Mar 23, 2010)

Benetanegia said:


> One word TESSELATION.



And you base this off of synthetic benches?

I ask because you know ATI and Nvidia handle tessellation differently, and therefore games that are written for their respective hardware will perform differently. To say Nvidia does tessellation "better" is ignorant of the process.

I found this post on another forum. It may explain things better for you.



> What people are calling the tessellator on Direct3D 11 compatible hardware is in fact 3 different things: 1. The Hull Shader, 2. The Tessellator, 3. The Domain Shader
> 
> To achieve the effect of tessellated geometry you have to use the 3 stages in the pipeline. The hardware tessellator in the ATI card is only the number 2 item, which fits between 2 new programmable software shader stages. There is not enough info on the Nvidia card to be sure if it really has a hardware tessellator or if that stage is also executed on the programmable cores using software, as was implied until recently by Charlie.
> 
> ...



http://www.semiaccurate.com/forums/showpost.php?p=22193&postcount=33
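The three-stage split described in the quote can be sketched with a toy model. This is purely illustrative: a real D3D11 tessellator subdivides triangle or quad patch domains in fixed-function hardware, and the function names, the tessellation-factor heuristic, and the 1D linear patch below are simplified stand-ins, not actual API calls.

```python
# Toy sketch of the three D3D11 tessellation stages the quote describes.
# Models only 1D edge subdivision; real hardware tessellates 2D patch
# domains. All names and heuristics here are illustrative assumptions.

def hull_shader(patch):
    """Stage 1 (programmable): decides how finely to subdivide.
    Assumed heuristic: more control points -> higher tess factor."""
    return {"patch": patch, "tess_factor": len(patch)}

def fixed_tessellator(tess_factor):
    """Stage 2 (fixed-function): emits parametric sample points (u values),
    but does no surface math itself."""
    return [i / tess_factor for i in range(tess_factor + 1)]

def domain_shader(patch, u):
    """Stage 3 (programmable): evaluates the surface at each sample point.
    Here: linear interpolation between the first two control points."""
    (x0, y0), (x1, y1) = patch[0], patch[1]
    return (x0 + (x1 - x0) * u, y0 + (y1 - y0) * u)

patch = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0)]
hs = hull_shader(patch)
points = [domain_shader(patch, u) for u in fixed_tessellator(hs["tess_factor"])]
print(points)  # 4 vertices along the subdivided edge
```

The point of the quote survives even in this toy: only stage 2 is a fixed "tessellator"; stages 1 and 3 are shader programs, so where each stage runs differs between architectures.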


----------



## HalfAHertz (Mar 23, 2010)

When I first heard 300 W, I remembered our first microwave was about 300 W in power. One thing led to another and I made the following with this morning's coffee...
Note: the following material is not to be used as flame bait. It's just a joke, peeps!

Not satisfied with your purchase? You expect more than just DX11 and PhysX?

Then call now and order your very own Easy-Bake solution at 1-800-Half-A-Hertz.

Our technology manages to harness all that unused extra power of your new GPU and focus it into two perfectly sized hot plates.

You can heat up your coffee, cook a balanced breakfast or just wrap up on those cold winter days.

Brought to you by your friendly Half-A-Hertz & Co. and powered by NVIDIA T.M.


----------



## afw (Mar 23, 2010)

... that's really ingenious ...


----------



## phanbuey (Mar 23, 2010)

TheMailMan78 said:


> And you base this off of synthetic benches?
> 
> I ask because you know ATI and Nvidia handle tessellation differently, and therefore games that are written for their respective hardware will perform differently. To say Nvidia does tessellation "better" is ignorant of the process.
> 
> ...



Keep in mind you're talking to someone who used to make elaborate (and quite good) graphs comparing old and new Nvidia shader designs and why the GTX 480 was going to beat the 5970...  The Green Force is strong with this one.

No offense B, you're a smart guy, but it seems ATi definitely pee'd in your coffee at some point in time.

Got 3 days left


----------



## Benetanegia (Mar 23, 2010)

TheMailMan78 said:


> And you base this off of synthetic benches?
> 
> I ask because you know ATI and Nvidia handle tessellation differently, and therefore games that are written for their respective hardware will perform differently. To say Nvidia does tessellation "better" is ignorant of the process.
> 
> ...



Everything based on the assumption that Fermi has no dedicated tessellator...  It has 15. Nuff said.



phanbuey said:


> Keep in mind you're talking to someone who used to make elaborate (and quite good) graphs comparing old and new Nvidia shader designs and why the GTX 480 was going to beat the 5970...  The Green Force is strong with this one.
> 
> No offense B, you're a smart guy, but it seems ATi definitely pee'd in your coffee at some point in time.
> 
> Got 3 days left



Yep, in three days we will find out, but I never said it would outright beat the 5970; I said that based on the performance scaling of past cards it *could* beat it. I specified a performance range that went from 90% of the HD 5970's performance to 150%, and I said I believed it to be closer to the lower end of that range (ROP bottleneck).

Furthermore, those claims were based on the specs known at the time, that is, 750 MHz core and 1700 MHz shaders. The memory has been severely crippled too, so although I didn't consider that a bottleneck in those old charts, now I do think it will affect performance, although just a bit. Without very heavy calculations (only 480 cores and core/shader clock adjustments), the new specs move that range to around 80-110% of the performance of the 5970.

My charts were based on now-old HD 5xxx performance, untouched by 6 months of driver improvement. Things have changed quite a bit, but that means that if Fermi ends up 15-30% faster, as some of those charts suggest, I was spot on. 3 days...
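The back-of-envelope scaling behind that revised range can be sketched as relative peak shader throughput, i.e. core count times shader clock. The spec numbers below are the rumored figures of the time (512 cores at 1700 MHz originally, ~480 cores at ~1400 MHz later), not confirmed values.

```python
# Rough shader-throughput scaling, illustrating the estimate in the post:
# relative peak rate ~ core count x shader clock. Both spec sets are the
# rumored figures of the era, used here only as an assumption.

def rel_flops(cores: int, shader_mhz: int) -> int:
    """Relative peak shader throughput (arbitrary units)."""
    return cores * shader_mhz

old_estimate = rel_flops(512, 1700)  # originally rumored Fermi specs
new_estimate = rel_flops(480, 1400)  # later-rumored GTX 480 specs

print(f"new/old = {new_estimate / old_estimate:.2f}")  # prints new/old = 0.77
```

A ~23% throughput cut is roughly what shifts the claimed 90-150% range down toward 80-110% of the HD 5970, which is the arithmetic the post is gesturing at.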


----------



## Wshlist (Mar 23, 2010)

*yay*

I love it when they fight, now let's see prices drop, and availability improve for cards like the HD5850.


----------



## Altered (Mar 23, 2010)

That's some funny shit, HalfAHertz. Made me laugh for the first time today.


----------



## TheMailMan78 (Mar 23, 2010)

Benetanegia said:


> Everything based on the assumption that Fermi has no dedicated tesselator...  It has 15. Nuff said.


Its not "Nuff said" in the least.


----------



## evillman (Mar 23, 2010)

Very funny discussion. Keep going. ROFL


----------



## nt300 (Mar 24, 2010)

ATI's tessellators are quad-pumped. How about Nvidia's? I don't think so


----------



## nt300 (Mar 24, 2010)

Here's a slap in the face for Nvidia 
So according to this article, Nvidia's Fermi fails to impress. So much for ATI dropping prices 


> *Nvidia bends the definition of honesty in GTX480 benches*
> Same old same old, but this time much hotter!
> http://www.semiaccurate.com/2010/03/23/nvidia-bends-definition-honesty-gtx480-benches/
> 
> ...


----------



## Benetanegia (Mar 24, 2010)

nt300 said:


> ATIs tesselators are quad pumped  how about Nvidia? I dont think so



Quad-pumped yes, but 1 op/cycle anyway, because it needs 4 clocks to finish the work. It's the same for Nvidia's (1 op/cycle) AFAIK, and even if it weren't, Fermi still has 16 of them. Even if they're not quad-pumped, it would still have 4 times the tessellation capability. 

Tessellators themselves are not the problem anyway. Triangle setup is far more important. You could have a tessellator hexa- or hecta-pumped (or 100 tessellators), I don't care, but it wouldn't make a difference if you can only do one tri/cycle, as is the case with Evergreen. Nvidia didn't put in 16 tessellators because that many are needed; they are there to increase availability and reduce latency by giving each SM one tessellator to operate with. Tessellation performance will still be limited primarily by the triangle setup and secondly by hull/domain shaders (both of which Fermi seems to deal with better, thanks to its L1/L2 caches), but the HD 5870/HD 5770 demonstrate that Evergreen is heavily limited by triangle setup/tessellator, even with the low tessellation levels found in Stalker, Dirt 2, etc., and not by shader performance: when tessellation is enabled, the HD 5770 loses much less than its bigger brethren.
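The triangle-setup argument can be put in rough numbers. The clock speeds and tris-per-clock figures below are assumptions drawn from the publicly discussed specs of the time (Evergreen setting up 1 triangle per clock; GF100 rumored to rasterize up to 4 in parallel), so treat the result as an order-of-magnitude sketch, not a benchmark.

```python
# Back-of-envelope peak triangle setup throughput, illustrating why
# setup rate (not tessellator count) is claimed to be the bottleneck.
# Clocks and tris/clock are assumptions based on rumored-era specs.

def tri_throughput(core_clock_mhz: float, tris_per_clock: float) -> float:
    """Peak setup rate in millions of triangles per second."""
    return core_clock_mhz * tris_per_clock

# HD 5870 (Evergreen): assumed 1 triangle per clock at 850 MHz core.
evergreen = tri_throughput(850, 1)

# GF100: assumed 4 parallel raster engines, ~4 tris/clock at ~700 MHz.
gf100 = tri_throughput(700, 4)

print(f"Evergreen: {evergreen:.0f} Mtris/s")
print(f"GF100:     {gf100:.0f} Mtris/s ({gf100 / evergreen:.1f}x)")
```

Under these assumptions the setup-rate gap, not raw tessellator count, is what would dominate heavily tessellated scenes; adding more tessellators past the setup limit buys nothing.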


----------



## Mussels (Mar 25, 2010)

ouch, so they cheat by running DX9? nasteh


----------



## phanbuey (Mar 25, 2010)

god will they just release this thing or what... i wanna see it beat a 285 by 35% 

I hate NDA's.


----------



## TheMailMan78 (Mar 25, 2010)

Benetanegia said:


> Quad-pumped yes, but 1 op/cycle anyway, because it needs 4 clocks to finish the work. It's the same for Nvidia's (1 op/cycle) AFAIK, and even if it weren't, Fermi still has 16 of them. Even if they're not quad-pumped, it would still have 4 times the tessellation capability.
> 
> Tessellators themselves are not the problem anyway. Triangle setup is far more important. You could have a tessellator hexa- or hecta-pumped (or 100 tessellators), I don't care, but it wouldn't make a difference if you can only do one tri/cycle, as is the case with Evergreen. Nvidia didn't put in 16 tessellators because that many are needed; they are there to increase availability and reduce latency by giving each SM one tessellator to operate with. Tessellation performance will still be limited primarily by the triangle setup and secondly by hull/domain shaders (both of which Fermi seems to deal with better, thanks to its L1/L2 caches), but the HD 5870/HD 5770 demonstrate that Evergreen is heavily limited by triangle setup/tessellator, even with the low tessellation levels found in Stalker, Dirt 2, etc., and not by shader performance: when tessellation is enabled, the HD 5770 loses much less than its bigger brethren.


DX9 FTW!


----------



## nt300 (Mar 25, 2010)

Charlie is angry, isn't he 
*Nvidia forces garbage on those wanting GTX480s*
You must buy 80 cards before you get GTX480/470s
http://www.semiaccurate.com/2010/03/23/nvidia-forces-garbage-those-wanting-gtx480s/

But I like his articles.


----------



## Benetanegia (Mar 25, 2010)

TheMailMan78 said:


> http://www.semiaccurate.com/static/uploads/2010/03_march/Dirt2_scores.jpg
> 
> DX9 FTW!



So you're seriously claiming the charts posted were run in DX9?
To claim that the card is slow is one thing; to claim that it performs somewhere in between a GTS 250 and a GTX 260... 

Anyway. Wow! We've really fallen far, haven't we, Nvidia bashers? Less than a quarter of a screenshot, with not a single proof of which card is being used, coming from SA and Charlie Demerjian, both of whom have no access to Nvidia presentations because they are never invited, but we are happy now with just that; we don't need any more proof. Semi-properly presented charts are not believable, but this? Oh sure. Come on guys... it's PATHETIC.

The only truth is that CD had 2 days (1 now) to make noise and get attention for his new home SA, because once Fermi is launched he will not have anything relevant to talk about and he is doomed to oblivion, and you all fell into the trap... Poor boys...


----------



## HalfAHertz (Mar 25, 2010)

Benetanegia said:


> So you're seriously claiming the charts posted were run in DX9?
> To claim that the card is slow is one thing; to claim that it performs somewhere in between a GTS 250 and a GTX 260...
> 
> Anyway. Wow! We've really fallen far, haven't we, Nvidia bashers? Less than a quarter of a screenshot, with not a single proof of which card is being used, coming from SA and Charlie Demerjian, both of whom have no access to Nvidia presentations because they are never invited, but we are happy now with just that; we don't need any more proof. Semi-properly presented charts are not believable, but this? Oh sure. Come on guys... it's PATHETIC.
> ...



God, you sure are prejudiced sometimes...


----------



## erocker (Mar 25, 2010)

I would love a rule stating no links to Fudzilla or SemiAccurate. Seriously, why do people post them here? We all have the internet and the capability to type their web addresses into our browsers and read their content on their site. It's so close to spam it feels like I'm eating a pork sandwich. Oh, I miss the times when smart conversation was based on fact and not rumor and hearsay... if that time ever existed; I'm probably just crazy or something. Release is tomorrow, er... well, late tonight. You know you're excited.


----------



## DaedalusHelios (Mar 25, 2010)

HalfAHertz said:


> God, you sure are prejudiced sometimes...



Says the guy with the signature built on propaganda. 

Just look in the mirror man. 

I buy high end regardless of brand. I don't trash either company. I just share experiences.


----------



## TheMailMan78 (Mar 25, 2010)

Benetanegia said:


> So you're seriously claiming the charts posted were run in DX9?
> To claim that the card is slow is one thing; to claim that it performs somewhere in between a GTS 250 and a GTX 260...
> 
> Anyway. Wow! We've really fallen far, haven't we, Nvidia bashers? Less than a quarter of a screenshot, with not a single proof of which card is being used, coming from SA and Charlie Demerjian, both of whom have no access to Nvidia presentations because they are never invited, but we are happy now with just that; we don't need any more proof. Semi-properly presented charts are not believable, but this? Oh sure. Come on guys... it's PATHETIC.
> ...



No, I'm not really claiming that. I just knew it would piss you off. 

Anyway, ATI not lowering its prices is all the proof I need that Fermi is a failure.


----------



## HalfAHertz (Mar 25, 2010)

DaedalusHelios said:


> Says the guy with the signature built on propaganda.
> 
> Just look in the mirror man.
> 
> I buy high end regardless of brand. I don't trash either company. I just share experiences.



Why you are right:

Yes I am, *buuuut* I have the complete right to be. I am simply tired of huge corporations (please note the plural form here) doing whatever they want, completely ignoring the interests of their customers in the process.

Please do ignore all the bullshit and rumors on sites like SemiAccurate and Fudzilla and just look at the bare facts. We're discussing a product that should have been here about six months ago. It was delayed again and again and again. 

Why you are wrong:

Why should you care, and why is this hurtful to you? Well, you were promised something and that promise was broken over and over again. The graphics card ecosystem has been broken; there is no healthy competition, and prices, instead of falling, have continued to rise. 

Everybody has the right to freedom of speech and to an opinion. You cannot simply label or insult someone just because his view on the matter differs from your own, unless of course you can deliver solid facts.

Do you have any proof right now that my belief is skewed? Can you honestly tell me that I am wrong, when all the facts are here, staring you in the face: there is no competition in the market, the prices of video cards have skyrocketed, *the consumer simply does not get the same performance/$ as he did at the same time last year.*

Could this have been avoided? Hell yes! In the face of obvious manufacturing difficulties, Nvidia, and to some extent ATI as well, failed to maintain sufficient stock of their older generation, and instead of lowering prices to at least try to create some competition in the mid-range, prices have continued to grow uncontrollably. If Nvidia had been more open and honest, they could have prepared us for the current situation. 

Is ATI any better? No, because they have failed to keep their products at the originally announced price.

I don't care anymore if they finally deliver on their promise tomorrow or two weeks after that. The damage to the market has already been done and they deserve everything that's coming to them.

Here's what I think will happen tomorrow: there won't be reviews from everyone, just from people who have been cherry-picked after signing some dubious agreement that they will show only controlled results verified by Nvidia. 

These are my own beliefs. You can continue to argue with me, prove me wrong *with solid facts*, or simply ignore me, but you should not label, demean or undermine me, or the aforementioned sites, simply because you disagree. 

/rage off

P.S. My signature was intended as a joke, if anyone asks me nicely, I'll gladly remove it.


----------



## Benetanegia (Mar 25, 2010)

TheMailMan78 said:


> No I don't pretend. I just knew that would piss you off.



So you are a troll. I've been telling you for a long time that you were becoming one.



> Anyway ATI not lowering its prices is all the proof I need that Fermi is a failure.



Considering your previous post, taking that as proof is definitely a step in the right direction, but it's still far from proof of failure. Nvidia said and did the same regarding the HD 5xxx cards (regarding price cuts).



HalfAHertz said:


> These are my own beliefs. You can continue to argue with me, prove me wrong *with solid facts*, or simply ignore me, but you should not label, demean or undermine me, or the aforementioned sites, simply because you disagree.



So let's see if I understand: you can say whatever you want without any proof, but if someone argues with you, he needs to provide solid facts.

IMO there are places for doing that, and they are not called forums; they're called blogs. You can create one for free and say whatever you want there. In a forum, you can't say whatever you want without proof, or someone will argue with you, either by presenting you with the facts or by exposing your lack of evidence. You are entitled to your beliefs, but others are too, even when those beliefs are about you.


----------



## HalfAHertz (Mar 26, 2010)

Benetanegia said:


> So let's see if I understand: you can say whatever you want without any proof, but if someone argues with you, he needs to provide solid facts.
> 
> IMO there are places for doing that, and they are not called forums; they're called blogs. You can create one for free and say whatever you want there. In a forum, you can't say whatever you want without proof, or someone will argue with you, either by presenting you with the facts or by exposing your lack of evidence. You are entitled to your beliefs, but others are too, even when those beliefs are about you.



First of all, the definition of forum: http://www.thefreedictionary.com/forum please note points 1b and 1c

Secondly, you completely misunderstood me. I stated my proof: the obvious fact that the card has been delayed beyond any even remotely acceptable time frame. The only logical reason for that is that there were some problems with it, be it software, hardware, manufacturing or all of the above; it doesn't really matter. All that matters is that it is still 6 months late. And that is the undeniable truth. Charlie may be right or wrong, and we may choose to believe him or not, but that doesn't make us "Nvidia bashers" or "PATHETIC". 

What really matters here is that the graphics card market has been messed with and the consumer has been hurt. If you ask me, we should all stop buying gfx cards until prices drop to pre-DX11 levels and we have the same performance/$ we had back then... 
Stick it to the Man, man


----------



## [I.R.A]_FBi (Mar 26, 2010)

why can't we live together and show some man love?


----------



## Benetanegia (Mar 26, 2010)

HalfAHertz said:


> First of all the definition of forum: http://www.thefreedictionary.com/forum please note points 1b and 1c



Yeah, discussion. So if you post your opinion and someone doesn't like it because it has no real proof behind it, he will discuss your opinion, which is mostly baseless. Nowhere were you talking about the delay.



> Secondly you completely misunderstood me. I stated my proof - the obvious fact that the card has been delayed beyond any even remotely acceptable time frame. The only logical reason for that is that there were some problems with it - be it software, hardware,manufacturing or all of the above, it doesn't really matter, all that matters is that it is still 6 months late. And that is the undeniable truth, Charlie may be right or wrong and we may choose to believe him or not but that doesn't make us "Nvidia bashers" or "PATHETIC".



First of all, you had not posted about that screenshot, so I was definitely not talking about you. Besides, I am not arguing against the criticism, but against the ""proofs"" that criticism is based on. Where's the GPU-Z shot to begin with? Heck, where's the complete screenshot? Where's the proof that it comes from Nvidia and not from anyone else (or CD himself)? Like I said, I know it didn't come from an Nvidia presentation, because CD has not been and will never be at an Nvidia presentation.

None of you have even considered those things, while those very things *have* been asked for in the case of the benchmarks. And it's not that the graphs could be (or could not be) false, which was *my* point; no, because of the lack of that info the graphs were directly labeled as false. That is what is pathetic, and yeah, it makes those who act that way bashers, because they are more than willing to bash without a single proof. The same people can't claim some graphs are false for x reasons and then claim Nvidia is cheating based on something that breaks those x reasons and then some.


----------



## HalfAHertz (Mar 26, 2010)

Ok then, those are valid points. The screenshots are most probably fake, but consider that we live in a digital age and the internet connects people... CD may not have been at any of Nvidia's presentations, but that does not mean someone else could not have sent him valid shots. Anyway, they are irrelevant now, because hopefully later today we'll see some real results.

Ignoring the screenshot, my other points still stand: Nvidia is late, that hurts the consumer, ergo Nvidia brought all that negativity on themselves because of the way they conduct their business...


----------



## TheMailMan78 (Mar 26, 2010)

Benetanegia said:


> Yeah, discussion. So if you post your opinion and someone doesn't like it because it has no real proof behind it, he will discuss your opinion, which is mostly baseless. Nowhere were you talking about the delay.
> 
> 
> 
> ...



The reason people do that is because you were basing your shallow theories on "positive" info about Fermi but couldn't take it when someone shook those theories to the ground with fact. The sad part is you're a fanboy in denial, which is why you get trolled so much. People play you like a puppet on this forum, but your ego is so large you don't even realize it.

I have no idea if Fermi will be good or not, but rest assured Nvidia has fallen from grace lately from a PR point of view. That alone is bad for everyone. I honestly believe they rode that last architecture for FAR too long. I guess time will tell.


----------



## Benetanegia (Mar 26, 2010)

TheMailMan78 said:


> The reason people do that is because you were basing your shallow theories on "positive" info about Fermi but couldn't take it when someone shook those theories to the ground with fact. The sad part is you're a fanboy in denial, which is why you get trolled so much. People play you like a puppet on this forum, but your ego is so large you don't even realize it.
> 
> I have no idea if Fermi will be good or not, but rest assured Nvidia has fallen from grace lately from a PR point of view. That alone is bad for everyone. I honestly believe they rode that last architecture for FAR too long. I guess time will tell.



 Nice theory. If only a single proof (fact) had been posted...


----------



## phanbuey (Mar 26, 2010)

this shall be settled in 8 hours and one shall emerge the winner 

im betting it will be shlow... and that they will argue immature drivers.  Lets see how it OC's...  Despite being a huge hog it may OC well - especially the 470.


----------



## afw (Mar 26, 2010)

phanbuey said:


> this shall be settled in 8 hours and one shall emerge the winner
> 
> im betting it will be shlow... and that they will argue immature drivers.  Lets see how it OC's...  Despite being a huge hog it may OC well - especially the 470.



And we'll also have to see what sort of drivers they (the reviewers) will be using on the ATI cards... the 10.3s have shown some good gains over the 10.2s, so we must watch out for that... some of the pro-nVidia sites might use old drivers to widen the margin ....  ...  

I'm eagerly waiting for W1zz's review ....  ... since it gives more accurate performance summary charts (due to testing on a wide array of games - 18 in fact)  ... I hope he uses the 10.3 driver set


----------



## Deleted member 67555 (Mar 26, 2010)

Oh, I can't wait for the actual fanboi reviews, followed by the real-world reviews, so I can finally see if this card is worth something


----------

