
NVIDIA GeForce 4XX Series Discussion

Status
Not open for further replies.

HammerON

The Watchful Moderator
Staff member
Joined
Mar 2, 2009
Messages
8,398 (1.49/day)
Location
Up North
System Name Threadripper
Processor 3960X
Motherboard ASUS ROG Strix TRX40-XE
Cooling XSPC Raystorm Neo (sTR4) Water Block
Memory G. Skill Trident Z Neo 64 GB 3600
Video Card(s) PNY RTX 4090
Storage Samsung 960 Pro 512 GB + WD Black SN850 1TB
Display(s) Dell 32" Curved Gaming Monitor (S3220DGF)
Case Corsair 5000D Airflow
Audio Device(s) On-board
Power Supply EVGA SuperNOVA 1000 G5
Mouse Roccat Kone Pure
Keyboard Corsair K70
Software Win 10 Pro
Benchmark Scores Always changing~
Sorry to be a skeptic, but I find this hard to believe. Do you have links to the "benchies" you mentioned?
 
Joined
Apr 30, 2008
Messages
4,879 (0.82/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2 GHz - 5 GHz CPU
Motherboard MSI B650I Edge WiFi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory Kingston Fury 32 GB 6000 MHz
Video Card(s) ASUS RTX 4070 Super 12GB OC
Storage Samsung 980 Pro 2TB + WD 2TB 2.5in HDD
Display(s) Sony 4K Bravia X85J 43-inch TV 120Hz
Case CM NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
But some benchmarks have shown that even an overclocked Core i7 bottlenecked the HD 5970; we need more demanding games (or a better CPU) to make use of this Fermi.

The HD 5970 was designed for high resolutions. Benchmark reviews are all over the place, which makes it hard to trust one over another, so I just watch video reviews. :toast:
 
Joined
Nov 13, 2007
Messages
10,326 (1.69/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6 / 5.5, 4.8 GHz Ring, 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHz CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4K 240Hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
The HD 5970 was designed for high resolutions. Benchmark reviews are all over the place, which makes it hard to trust one over another, so I just watch video reviews. :toast:

TRUTH!

But... W1zzard's reviews are top notch too...

I agree though, the 5970 is dominant at higher resolutions - the same should apply for the Fermi monster.

I am hoping for an improvement in image quality (IQ) from Fermi as well - but that might be hard to do, as GT200's IQ is already very advanced.
 

shevanel

New Member
Joined
Jul 27, 2009
Messages
3,464 (0.63/day)
Location
Leesburg, FL
lol Fermi, I just hope it comes packed with like 3 games.. and all blockbusters.

Do they integrate a second GPU for physics, or are they just going to expect people to buy GTX cards as dedicated PhysX cards?

Also, since there is the Eyefinity gimmick in the other camp to allow multi-monitor setups, do you expect NV to bundle 3D shutter glasses and use that as their gimmick?
 
Joined
Nov 13, 2007
Messages
10,326 (1.69/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6 / 5.5, 4.8 GHz Ring, 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHz CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4K 240Hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
lol Fermi, I just hope it comes packed with like 3 games.. and all blockbusters.

Do they integrate a second GPU for physics, or are they just going to expect people to buy GTX cards as dedicated PhysX cards?

Also, since there is the Eyefinity gimmick in the other camp to allow multi-monitor setups, do you expect NV to bundle 3D shutter glasses and use that as their gimmick?

I expect them to go a bit hyperactive on CUDA as an answer, i.e. a "dual core with a Fermi is faster than an i7" type of thing. Fermi is still primarily a GPGPU - the only one with error correction...

So even though ATI has a better single-precision TFLOPS rating, no ECC means they are at a disadvantage for HPC and GPGPU.
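(As a rough illustration only - my own sketch, not anything from this thread or from NVIDIA - here is the kind of simple general-purpose CUDA workload that talk refers to: a SAXPY kernel. All names, sizes and launch numbers are just example assumptions.)

Code:
// Minimal CUDA SAXPY sketch: y = a*x + y, computed on the GPU.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device buffers
    float *dx = nullptr, *dy = nullptr;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);  // expect 4.0 (2*1 + 2)

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}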
 
Joined
Jan 26, 2009
Messages
488 (0.09/day)
I think what he means is: Would his mainboard memory reduce the performance of the Fermi?

Nah, probably not, unless your CPU is clocked lower than, say, 3 GHz and is only a dual-core processor. Some games like Far Cry 2 would be bottlenecked by the CPU at 30 fps or less if you have an Athlon X2 5500 or a Celeron, for example, no matter how powerful your GPU is. Also, if your DDR2-800 is CAS 6 or higher, that could also hurt performance. But if you're using an overclocked 3.6 GHz Core 2 Quad with DDR2-800 at CAS 4 latency, then it would not reduce the performance of Fermi in most games.

I am planning to buy a Phenom II X4 955... no plans to overclock. Will use 4 GB of DDR2-800 mainboard memory... will it reduce the performance of a Fermi?
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.21/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Just another friendly reminder... please keep this thread on topic. It's about the GT3XX series and not about ATI, ATI's cards, or ATI's business tactics.

I am planning to buy a Phenom II X4 955... no plans to overclock. Will use 4 GB of DDR2-800 mainboard memory... will it reduce the performance of a Fermi?

It would reduce the performance of Fermi in games, but if you were using Fermi for F@H there would be no reduction.
 

Bo_Fox

New Member
Joined
May 29, 2009
Messages
480 (0.09/day)
Location
Barack Hussein Obama-Biden's Nation
System Name Flame Vortec Fatal1ty (rig1), UV Tourmaline Confexia (rig2)
Processor 2 x Core i7's 4+Gigahertzzies
Motherboard BL00DR4G3 and DFI UT-X58 T3eH8
Cooling Thermalright IFX-14 (better than TRUE) 2x push-push, Customized TT Big Typhoon
Memory 6GB OCZ DDR3-1600 CAS7-7-7-1T, 6GB for 2nd rig
Video Card(s) 8800GTX for "free" S3D (mtbs3d.com), 4870 1GB, HDTV Wonder (DRM-free)
Storage WD RE3 1TB, Caviar Black 1TB 7.2k, 500GB 7.2k, Raptor X 10k
Display(s) Sony GDM-FW900 24" CRT oc'ed to 2560x1600@68Hz, Dell 2405FPW 24" PVA (HDCP-free)
Case custom gutted-out painted black case, silver UV case, lots of aesthetics-souped stuff
Audio Device(s) Sonar X-Fi MB, Bernstein audio riser.. what??
Power Supply OCZ Fatal1ty 700W, Iceberg 680W, Fortron Booster X3 300W for GPU
Software 2 partitions WinXP-32 on 2 drives per rig, 2 of Vista64 on 2 drives per rig
Benchmark Scores 5.9 Vista Experience Index... yay!!! What??? :)
I am planning to buy a Phenom II X4 955... no plans to overclock. Will use 4 GB of DDR2-800 mainboard memory... will it reduce the performance of a Fermi?

That's a nice CPU, so it should not reduce the performance by much. However, some games might show noticeable improvements from overclocking the CPU along with memory (or at least faster CPU/memory).


But some benchmarks have shown that even an overclocked Core i7 bottlenecked the HD 5970; we need more demanding games (or a better CPU) to make use of this Fermi.

Are you talking about the new article recently done by Xbitlabs?

http://www.xbitlabs.com/articles/video/display/radeon-hd5870-cpu-scaling.html

Usually, Crossfire/SLI setups need much faster CPUs, given the greater overhead. A single 5870 usually did just fine with a non-overclocked i7 920 in most games, so I'd expect almost the same for a single Fermi (well, at least with a 3.2-3.4 GHz Core i7).
 
W

wahdangun

Guest
I expect them to go a bit hyperactive on CUDA as an answer, i.e. a "dual core with a Fermi is faster than an i7" type of thing. Fermi is still primarily a GPGPU - the only one with error correction...

So even though ATI has a better single-precision TFLOPS rating, no ECC means they are at a disadvantage for HPC and GPGPU.

But the GeForce version of Fermi WILL NOT use ECC, because it brings a performance hit.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.21/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
They've only promised ECC support for Fermi in their Tesla-model cards.
 
W

wahdangun

Guest
Hey, it doesn't matter at all; the most important thing is raw power, and I'm glad they don't use ECC if it just brings a performance hit.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.21/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Hey, it doesn't matter at all; the most important thing is raw power, and I'm glad they don't use ECC if it just brings a performance hit.

It hasn't been announced that they won't use ECC either, but I agree. ECC would slow down my gaming, and I need all my frames in Crysis.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.58/day)
and I need all my frames in Crysis.

:laugh:

:slap:


Damn Cevat. Why'd he do that... without that game, no one would need such power, really. Even GTA4 is just CPU-bound... :rolleyes:

Bring on the ECC. I know that current titles will play just fine, given what seems to be expected of Fermi's gaming card, so the ECC should have no effect given the extra power. It would be a crowning achievement, IMHO.

Gotta stop throwing GPU power at bad programming. It allows programmers to be too lazy...
 
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 GHz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water Cooling Loop
Memory G.Skill 1866 MHz Sniper 8 GB
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Premium
Just another friendly reminder... please keep this thread on topic. It's about the GT3XX series and not about ATI, ATI's cards, or ATI's business tactics.

I would agree with that, but sometimes it's hard to talk about one without the other..... because there are only two companies to compare.

The more I think about it from my discussion before..... I am starting to believe that Fermi will be the more powerful card and not a flop. While I still believe that the GTX 380 won't be able to beat a 5970, I do believe it will be more powerful than a 5870. I also believe that Fermi will be uber expensive and only cater to the select few who have to have the absolute best, which will probably be a problem for Nvidia....... That still remains to be seen though......

The reason I am now starting to believe this more...... is that if you look at all the graphs that were shown here...... Nvidia has been pretty good at competing with themselves for better performance.

I also believe that who holds the crown for better performance is irrelevant. If you look back at history (especially recent...) you would notice that ATI's 4800 series came out in the summer of 2008..... and Nvidia's GTX 200 series offering came in the winter later that year (maybe early 2009, I can't remember). It almost seems to me that they each take the crown away from the other for about 6 months at a time.

The only way to tell whether what Nvidia has invested in Fermi will hurt the company is to wait and see when the cards come out. Otherwise anything else is just guessing.

But with that said, my guess is that the cost of making and selling these cards will further hurt Nvidia. I also believe that while they might be more powerful in computation..... I think they will be about on par with what ATI is offering currently. So if you want to do Folding and the like, wait for Nvidia. Otherwise buy either when they come out.

I just think it is stupid for ATI owners and Nvidia owners to want their card manufacturer to win so badly. Because if that ever really did happen and one knocked the other off the map..... then the remaining company would be so overpriced and there would never be any performance gains. There would be no reason, because they wouldn't have anyone to compete against.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.21/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
:nutkick:

Cry more about Crytek. Whining about ECC to a bunch of people who buy cards for benchmarking/gaming isn't going to make it part of the consumer-grade card. Even if you whine to NV, I doubt they'd listen, because they don't need to make their consumer cards even more expensive. Everyone knows ATI's cards aren't ready to compete in GPGPU, and even the 5 series is just starting to match the G92s. Without the competition there's no need for them to seriously consider using ECC memory in their cards.
 
Joined
Jul 2, 2008
Messages
3,638 (0.62/day)
Location
California
If it's not as fast as the HD 5970 but faster than the HD 5870, then it's obvious the price will sit between them.

And I don't have any problem buying a card under the HD 5970's price.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.58/day)
:nutkick:

Cry more about Crytek. Whining about ECC to a bunch of people who buy cards for benchmarking/gaming isn't going to make it part of the consumer-grade card. Even if you whine to NV, I doubt they'd listen, because they don't need to make their consumer cards even more expensive. Everyone knows ATI's cards aren't ready to compete in GPGPU, and even the 5 series is just starting to match the G92s. Without the competition there's no need for them to seriously consider using ECC memory in their cards.
LoL, personally, we only have good F@H on nV because of nV supporting Stanford. There was a time when only ATi could fold. nV didn't invent GPGPU, although many would like to say that. Your comparison is not apt... the fault is programming, where AMD invests nothing but nV invests a lot. I don't need nV's help for that... nor will I be blinded by it.

Without the competition there's no need for them to seriously consider using ECC memory in their cards.

True enough. This is why I have a lot of respect for nV... they are actively helping with the programming. They invest money into their products... AMD just makes the hardware.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.21/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
I would agree with that, but sometimes it's hard to talk about one without the other..... because there are only two companies to compare.

The more I think about it from my discussion before..... I am starting to believe that Fermi will be the more powerful card and not a flop. While I still believe that the GTX 380 won't be able to beat a 5970, I do believe it will be more powerful than a 5870. I also believe that Fermi will be uber expensive and only cater to the select few who have to have the absolute best, which will probably be a problem for Nvidia....... That still remains to be seen though......

The reason I am now starting to believe this more...... is that if you look at all the graphs that were shown here...... Nvidia has been pretty good at competing with themselves for better performance.

I also believe that who holds the crown for better performance is irrelevant. If you look back at history (especially recent...) you would notice that ATI's 4800 series came out in the summer of 2008..... and Nvidia's GTX 200 series offering came in the winter later that year (maybe early 2009, I can't remember). It almost seems to me that they each take the crown away from the other for about 6 months at a time.

The only way to tell whether what Nvidia has invested in Fermi will hurt the company is to wait and see when the cards come out. Otherwise anything else is just guessing.

But with that said, my guess is that the cost of making and selling these cards will further hurt Nvidia. I also believe that while they might be more powerful in computation..... I think they will be about on par with what ATI is offering currently. So if you want to do Folding and the like, wait for Nvidia. Otherwise buy either when they come out.

I just think it is stupid for ATI owners and Nvidia owners to want their card manufacturer to win so badly. Because if that ever really did happen and one knocked the other off the map..... then the remaining company would be so overpriced and there would never be any performance gains. There would be no reason, because they wouldn't have anyone to compete against.

A comparison is perfectly relevant in some respects, but it has to be relevant to the GT3XX series, which has yet to be released. What needs to be considered is that unless mentioning the 5XXX series helps us understand NV's performance goal via competitive scaling, it's completely useless to mention them.

For the record, the last generation was released Q4 2008 by Nvidia and at the end of Q1 2009 by ATI.
 
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 GHz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water Cooling Loop
Memory G.Skill 1866 MHz Sniper 8 GB
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Premium
A comparison is perfectly relevant in some respects, but it has to be relevant to the GT3XX series, which has yet to be released. What needs to be considered is that unless mentioning the 5XXX series helps us understand NV's performance goal via competitive scaling, it's completely useless to mention them.

For the record, the last generation was released Q4 2008 by Nvidia and at the end of Q1 2009 by ATI.

I still believe what I believe on that, and like I say, the only way to know is when the cards come out.......

Now I am going to post a pic that doesn't help my point very much, but it helps yours.... (with the single-GPU vs. dual-GPU top cards) that you and I both agree will eventually happen. These benchmarks are supposedly leaked from someone who has a connection at Nvidia. They were posted on Guru3D's website....... So it will support your theory.... While I still find these hard to believe and I don't think they are true (because the situation you are talking about I believe won't happen till the next gen, 6800 and GT400), regardless, I will admit that I am wrong if these are true...... Here you go :) Link at the bottom







Link here!!!

http://www.guru3d.com/news/geforce-gtx-360-and-380-benchmarks/
 

Attachments: N 2.jpg · N 3.jpg · N4.jpg
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 GHz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water Cooling Loop
Memory G.Skill 1866 MHz Sniper 8 GB
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Premium
If what I posted above is correct and they can beat ATI without having to build a GX2 version of their cards.... I think Nvidia might be able to get back into the game. But this is only a rumor and my mind still hasn't changed, but I thought I would share anyway, considering that Guru3D is pretty trustworthy.

Nvidia could endure the cost, because then they would only have to put one die per card as opposed to ATI's dual. That is the only way I think Nvidia could stay in the game. If not, I believe they will be too expensive and there will be no other choice for folding...... But personally I don't think we will see this competition till next gen.

Have fun with this Binge <----- you notice how I wrote this in red..... It's just to show you how pretty a color it is compared to green...... :) Just yanking your chain bud!
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.49/day)
Location
Reaching your left retina.
I still believe what I believe on that, and like I say, the only way to know is when the cards come out.......

Now I am going to post a pic that doesn't help my point very much, but it helps yours.... (with the single-GPU vs. dual-GPU top cards) that you and I both agree will eventually happen. These benchmarks are supposedly leaked from someone who has a connection at Nvidia. They were posted on Guru3D's website....... So it will support your theory.... While I still find these hard to believe and I don't think they are true (because the situation you are talking about I believe won't happen till the next gen, 6800 and GT400), regardless, I will admit that I am wrong if these are true...... Here you go :)

http://forums.techpowerup.com/attachment.php?attachmentid=31393&stc=1&d=1260727780

http://www.guru3d.com/news/geforce-gtx-360-and-380-benchmarks/

LOL. Freaking similar to my charts. If I said that I'd made those slides, I'm sure more than one person would believe me. :laugh:
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.58/day)
I find it odd that they chose those apps. It would be more interesting to see nVidia beat AMD in AMD-supported apps, no? None of these engines does anything really interesting. Where's DX11? Who cares about DX10? Looks like a pretty even 80% over the GTX 285?

And really, R5xxx isn't relevant at all. For a DX11 chip, 120 FPS is a bare minimum, considering 3D. So what those graphs say is that for 3D gaming @ 1920x1200, I gotta buy "2x GTX 380"? To add in PhysX effects?

How about those apps? Can I run 3D @ 120 Hz with PhysX? I am hoping a single chip can do this, not two.

That's what Fermi needs to be, at least to me as a consumer. If I have to buy two $600 cards, and maybe a third, and then a 1200W PSU as well, so I can have the pleasure of buying another $600 kit with monitor and glasses... taxes, shipping... Fermi costs $2000 for the complete end-user package @ 1920x1080? Add in CPU, memory, motherboard, case, HDDs, ODDs.... I'm not a happy camper.

How many games will work perfectly? What about those DX11 titles? It seems they are just now ready for DX10, performance-wise, or do we need three GPUs still?
 
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 GHz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water Cooling Loop
Memory G.Skill 1866 MHz Sniper 8 GB
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Premium
I find it odd that they chose those apps. It would be more interesting to see nVidia beat AMD in AMD-supported apps, no? None of these engines does anything really interesting. Where's DX11? Who cares about DX10? Looks like a pretty even 80% over the GTX 285?

And really, R5xxx isn't relevant at all. For a DX11 chip, 120 FPS is a bare minimum, considering 3D. So what those graphs say is that for 3D gaming @ 1920x1200, I gotta buy "2x GTX 380"? To add in PhysX effects?

How about those apps? Can I run 3D @ 120 Hz with PhysX? I am hoping a single chip can do this, not two.

That's what Fermi needs to be, at least to me as a consumer. If I have to buy two $600 cards, and maybe a third, and then a 1200W PSU as well, so I can have the pleasure of buying another $600 kit with monitor and glasses... taxes, shipping... Fermi costs $2000 for the complete end-user package @ 1920x1080? Add in CPU, memory, motherboard, case, HDDs, ODDs.... I'm not a happy camper.

How many games will work perfectly? What about those DX11 titles? It seems they are just now ready for DX10, performance-wise, or do we need three GPUs still?

+1 on your statement on both companies' fronts!!!!

LOL. Freaking similar to my charts. If I said that I'd made those slides, I'm sure more than one person would believe me.

Yeah, actually I posted this with you in mind and put someone else's name down by accident. You know it probably didn't get stolen from you, but...... when you do searches on this topic, TechPowerUp comes up a lot.

So you know, I wouldn't be surprised if it was stolen from you!!!

I still just think, though, bud, that we are one gen away yet from your theory coming true. Unless when ATI stated "We have a surprise for Nvidia when Fermi releases" they were talking about their X3 or X4 cards then.

But man, if these specs are true...... this will make me either buy two more 5870's, or I might have no other choice but to go with the green team. As long as the price is right. It still makes sense to use one card as opposed to multiple: more power efficient and cooler. Crossfire and SLI are still not where they need to be for me to think otherwise.
 