
NVIDIA GeForce GTX 580 1536 MB

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,058 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
What happens once a game comes along that uses more than 300W? Does NVIDIA just expect users to live with underperformance, or do they expect you to upgrade cards?


Would you buy a car with a wood block under the throttle?

If 300W is the PCI-e limit, I don't see it being an issue. Under current protocols for meeting specs, I don't think any game code that did that would be 'valid'. The design spec is, after all, 300W. Why design games that require more power than a single card can deliver by specification? Given the console domination of gaming design, we're still not even getting DX 11 up to a good standard yet.

In the future I don't see it happening either, as manufacturing processes shrink.

As for the car analogy, most super sport production cars have speed limiters built in (155 mph for many BMWs/Mercedes etc.), so we do buy cars with metaphorical chokes built in.
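For context on where the 300W ceiling comes from: it's just the sum of the PCI-e spec limits for the slot and the power connectors. A quick back-of-the-envelope sketch in Python (the per-connector figures are the standard spec limits; the connector layout is the GTX 580 reference design):

Code:
# Rough PCI-e power budget for a GTX 580-style board.
# Spec limits: x16 slot = 75W, 6-pin connector = 75W, 8-pin connector = 150W.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

# The GTX 580 reference design feeds the GPU from one 6-pin plus one 8-pin.
budget_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"In-spec power budget: {budget_w}W")  # In-spec power budget: 300W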
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.77/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
Oh Jesus Christ!!! I quoted him saying exactly that! Do you have a hard time reading? I'm not quoting him again; you can read my post above to see where he said it, or even better, read the review! :shadedshu

Again, why are you taking a statement from the overclocking section about temperature and trying to say it has anything to do with the Over Current Protection? Did you not get that the temperature protection is a totally different thing from the current protection? I believe I already told you this.

You can read pretty much the same statement about GTX480s as well: http://www.techpowerup.com/reviews/MSI/N480GTX_GTX_480_Lightning/31.html

But I guess you will still want to go on about the temp limit like it is the same thing as the current limit, and now I'm sure you'll say the GTX480 must have had it too...right? Because this is like the 3rd time you've gone on about the temperature limit like it is related to the current limit. :shadedshu

Again, temperature is not the same as current, they are two different things and hence two different protection systems.

Correct, he did say that, and I bolded the important part. It has to be put in context with the fact that he said the OCP is only activated when it detects OCCT and Furmark.

Again, I don't know why you can't be bothered to read the review, as W1z already said this in it. The driver detects Furmark and OCCT and activates the limiter. It is limited in those 2 programs specifically because those are the only two programs the driver detects. In all other programs the driver doesn't monitor the over-current sensors.
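To make the mechanism concrete, here is a purely hypothetical sketch of detection-based over-current protection as described above. This is not NVIDIA's actual driver code; the names, hooks, and threshold are invented for illustration:

Code:
# Hypothetical sketch of app-detection-based OCP -- NOT NVIDIA's driver code.
MONITORED_APPS = {"furmark.exe", "occt.exe"}  # the only apps the driver detects
POWER_CAP_W = 300

def ocp_tick(process_name, read_board_power_w, reduce_clocks):
    # read_board_power_w and reduce_clocks are stand-ins for the driver's
    # sensor-read and throttle hooks.
    if process_name.lower() not in MONITORED_APPS:
        return  # for every other program the sensors aren't even consulted
    if read_board_power_w() > POWER_CAP_W:
        reduce_clocks()  # throttle until the draw falls back under the cap

The point of the sketch is the first check: the limiter's trigger is the application's identity, not the measured load itself.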

You were right. I'm man enough to admit when I'm wrong, and judging by W1zzard's quote below, I was indeed wrong.

What I was saying was that NVIDIA never stated that only OCCT and Furmark triggered the OCP cap.

that's exactly what NVIDIA told me

It's all about interpretation: until now, W1zzard hadn't stated what you have been claiming as fact (that the card really does react to Furmark and OCCT), and that's what I was clinging onto.

The thing is, when I'm convinced I'm right, I'll argue and argue, and then argue some more ... until someone proves me wrong, just like W1zzard did.
 

bear jesus

New Member
Joined
Aug 12, 2010
Messages
1,534 (0.29/day)
Location
Britland
System Name Gaming temp// HTPC
Processor AMD A6 5400k // A4 5300
Motherboard ASRock FM2A75 PRO4// ASRock FM2A55M-DGS
Cooling Xigmatek HDT-D1284 // stock phenom II HSF
Memory 4GB 1600MHz Corsair Vengeance // 4GB 1600MHz Corsair Vengeance low profile
Storage 64GB SanDisk Pulse SSD and 500GB HDD // 500GB HDD
Display(s) acer 22" 1680x1050
Power Supply Seasonic G-450 // Corsair CXM 430W
They did it either to mislead the public about power use, or to protect the card from being used to the full in an optimized way.

The one thing that makes me think you could be right about that is the fact that so many people quoted the 480's power usage as what it used in Furmark, not its real in-game power usage, when complaining about how much power the card used. I assume so many people did that because they were talking about the max power the card could possibly use, and this limit makes the card seem much better when quoting the absolute max power.
 
Joined
Apr 4, 2008
Messages
4,686 (0.77/day)
System Name Obelisc
Processor i7 3770k @ 4.8 GHz
Motherboard Asus P8Z77-V
Cooling H110
Memory 16GB(4x4) @ 2400 MHz 9-11-11-31
Video Card(s) GTX 780 Ti
Storage 850 EVO 1TB, 2x 5TB Toshiba
Case T81
Audio Device(s) X-Fi Titanium HD
Power Supply EVGA 850 T2 80+ TITANIUM
Software Win10 64bit
Well, only stress testing programs use it, so I don't see a need to disable it. I mean, if all you're using the card for is gaming, and you want to overclock, then why not stress test with a demanding game instead? Sometimes I've had OCs which would have corruption and artefacts in The Furry Donut™, but would work perfectly in games.

That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock; you'll get crashes in games and blame the games or the drivers when it's really user error.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
You were right. I'm man enough to admit when I'm wrong, and judging by W1zzard's quote below, I was indeed wrong.





It's all about interpretation: until now, W1zzard hadn't stated what you have been claiming as fact (that the card really does react to Furmark and OCCT), and that's what I was clinging onto.

The thing is, when I'm convinced I'm right, I'll argue and argue, and then argue some more ... until someone proves me wrong, just like W1zzard did.

It's cool, man, I don't hold a grudge or anything, and it wasn't like I was really angry. And I'm the same way when I'm convinced I'm right. :toast:

The one thing that makes me think you could be right about that is the fact that so many people quoted the 480's power usage as what it used in Furmark, not its real in-game power usage, when complaining about how much power the card used. I assume so many people did that because they were talking about the max power the card could possibly use, and this limit makes the card seem much better when quoting the absolute max power.


The problem I have with the whole idea that nVidia did it to give false power consumption readings is that if they wanted to do that, they would have done a better job of it. The power consumption with the limiter on under Furmark is like 150W; that is lower than in-game power consumption. That makes it pretty obvious what is going on, and anyone taking power consumption numbers would have instantly picked up on it. If they were really trying to provide false power consumption numbers, they would have tuned it so that consumption under Furmark was at least at a semi-realistic level.
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.77/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock; you'll get crashes in games and blame the games or the drivers when it's really user error.

Agreed. Furmark and other such programs "find" a bad OC quicker, but that doesn't mean it's foolproof.

Sometimes you run the stress progs for several hours on your OCs and it all checks out fine, and then, while playing some game, you get crashes. Who's to blame: the game? The VGA drivers? Most of the time it's the OCs, be they CPU or GPU related.
 
Joined
Nov 21, 2007
Messages
3,688 (0.59/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
Yeah, for me to feel my overclock is stable usually takes 1-3 days of messing around: stress tests, gaming, everything. It's not stable when you can game or when you can pass a stress test; it's stable when it can do everything :p. If anything starts being faulty after an OC, I always bounce back to square 1.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Agreed. Furmark and other such programs "find" a bad OC quicker, but that doesn't mean it's foolproof.

Sometimes you run the stress progs for several hours on your OCs and it all checks out fine, and then, while playing some game, you get crashes. Who's to blame: the game? The VGA drivers? Most of the time it's the OCs, be they CPU or GPU related.

That is one of the things about Furmark I've noticed: it doesn't use a whole lot of VRAM. So if your RAM overclock is slightly unstable, it will almost never find it. That is when I usually fire up Unigine at full tessellation settings to really fill that VRAM up. :toast:
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (5970) as per the following really cool link, and all without all the CrossFire scaling issues, since sadly (for CrossFire tech users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier, the way things are going), it's either an nVidia GPU or an nVidia GPU when it comes to your upgrading purposes.
 
Joined
Apr 4, 2008
Messages
4,686 (0.77/day)
System Name Obelisc
Processor i7 3770k @ 4.8 GHz
Motherboard Asus P8Z77-V
Cooling H110
Memory 16GB(4x4) @ 2400 MHz 9-11-11-31
Video Card(s) GTX 780 Ti
Storage 850 EVO 1TB, 2x 5TB Toshiba
Case T81
Audio Device(s) X-Fi Titanium HD
Power Supply EVGA 850 T2 80+ TITANIUM
Software Win10 64bit
That is one of the things about Furmark I've noticed: it doesn't use a whole lot of VRAM. So if your RAM overclock is slightly unstable, it will almost never find it. That is when I usually fire up Unigine at full tessellation settings to really fill that VRAM up. :toast:

It's interesting with OCCT: the VRAM testing part never found any errors at all. It was letting me crank it all the way up to 4000MHz effective. The OCCT GPU test, though, was able to find VRAM errors, probably because both clocks are really tied together in the 4xx series. The VRAM test must just be showing what the chips can do, not what the controller can handle.
 
Joined
Nov 4, 2005
Messages
11,986 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400MHz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (5970) as per the following really cool link, and all without all the CrossFire scaling issues, since sadly (for CrossFire tech users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier, the way things are going), it's either an nVidia GPU or an nVidia GPU when it comes to your upgrading purposes.

I don't hate Nvidia any more than I hate Ford cars and trucks. My company car is a Ford Explorer.


I use what works best for me, and right now it is ATI for the money.


Back to your comment about AMD: they have paid off millions of their debts; that is why they were not showing a profit. If you understand balance sheets and finance, you would understand this.

If 300W is the PCI-e limit, I don't see it being an issue. Under current protocols for meeting specs, I don't think any game code that did that would be 'valid'. The design spec is, after all, 300W. Why design games that require more power than a single card can deliver by specification? Given the console domination of gaming design, we're still not even getting DX 11 up to a good standard yet.

In the future I don't see it happening either, as manufacturing processes shrink.

As for the car analogy, most super sport production cars have speed limiters built in (155 mph for many BMWs/Mercedes etc.), so we do buy cars with metaphorical chokes built in.

Yep, and some don't have the limiters.

A game does not have anything to do with power consumption any more than a movie has to do with power use. The game's specs don't list how many watts you have to have to run it. Nvidia chooses the power consumption of a card based on the cooler's ability and other specs. They made a card that pulls 350+ watts in a real world performance test. Then they put a self-limiting throttle on it to keep it from pulling that amount. They claim they have the most powerful card, and in some games they do, but when pushed to the max by a program designed to do so, it has to self-limit to maintain standards. Like a dragster that has a self-deploying chute when you go full throttle. Or a block of wood under the pedal.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
They made a card that pulls 350+ watts in a real world performance test.

Furmark is hardly a real world performance test. It is a torture test more than a benchmark, though it does have a benchmark function built in. And even then it isn't a real world benchmark; it is a synthetic benchmark.

And according to W1z it doesn't pull 350+ watts; it pulls ever so slightly over 300W.
 

CDdude55

Crazy 4 TPU!!!
Joined
Jul 12, 2007
Messages
8,178 (1.29/day)
Location
Virginia
System Name CDdude's Rig!
Processor AMD Athlon II X4 620
Motherboard Gigabyte GA-990FXA-UD3
Cooling Corsair H70
Memory 8GB Corsair Vengeance @ 1600MHz
Video Card(s) XFX HD 6970 2GB
Storage OCZ Agility 3 60GB SSD/WD Velociraptor 300GB
Display(s) ASUS VH232H 23" 1920x1080
Case Cooler Master CM690 (w/ side window)
Audio Device(s) Onboard (It sounds fine)
Power Supply Corsair 850TX
Software Windows 7 Home Premium 64bit SP1
I don't hate Nvidia anymore than I hate Ford cars and trucks. My company car is a Ford Explorer.


I use what works best for me, and right now it is ATI for the money.

It's the way you come across; a lot of your posts read as the stereotypical rabid ignorant fanboy. You should really be aware of that, because if you really are unbiased, comments like ''Nfags can leave this thread, I hope Nvidia doesn't drop their prices and you continue to get assraped by them.'' don't actually support your claim of being unbiased... just saying.
 
Joined
Nov 9, 2010
Messages
5,689 (1.11/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
...this card is pretty much as fast as two 5870 GPUs (5970) as per the following...and all without all the CrossFire scaling issues...
Actually, at launch the 580 did have serious scaling issues. It was stomped by the 480 in dual-GPU SLI in almost every test. Here's hoping driver maturation will sort that out, because right now that's the only thing keeping me from buying one, other than maybe trying to hit a holiday sale.
 
Joined
Nov 4, 2005
Messages
11,986 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
It's the way you come across; a lot of your posts read as the stereotypical rabid ignorant fanboy. You should really be aware of that, because if you really are unbiased, comments like ''Nfags can leave this thread, I hope Nvidia doesn't drop their prices and you continue to get assraped by them.'' don't actually support your claim of being unbiased... just saying.

:toast: Yes, I do get a bit heated when some people piss me off.


Part of the reason why is that in the last few years this digital dream bullcrap all these companies have been promising hasn't come true. I get really pissed when any of them start spewing numbers, then hold users back from enjoying and using the hardware they purchase by limiting it, just to spin the wheel again at a later date with nothing more than a new paint job. ATI has failed me, Adobe has failed me, Canon has failed me, Intel is promising crap it can't deliver, Motorola has failed me, Nvidia has failed to deliver, and Microsoft is not pushing people toward standardized programming.


I bought a Canon high-def camcorder; it records M2TS, the same as Blu-ray. According to ATI we should be processing that on stream processors with this shiny new... but first buy this $600 software, then install this patch, then install this set of codecs, then you have to export it to this format, then burn it.

Intel is still fiddle-F'ing around with crap they are too large and clumsy to do right the first three times.

My phone still doesn't support Flash, and they promise "it's coming, just wait". Sounds like Atari, who still haven't fixed their TDU game, or many other issues with games that just get thrown to the side.

Nvidia pushes proprietary crap like PhysX, which works on all of 13 GPU-enabled titles, despite others showing it works just as fast on the CPU when moved up from antiquated code, besides it now being a part of DX11. Also, Nvidia and Adobe seem to be stuck in a 69 swap meet: they disable the hardware stream acceleration when an ATI card is present. Some forum members have learned how to bypass it, and wonder of wonders, it still works, using the ATI GPU to perform the calculations and not CUDA, according to them, as long as it doesn't get shut down.


So this shiny new future is bullshit. It is the same crap we have had from day one. I'm tired of spending thousands of dollars to be told I still have it wrong.
 
Joined
Aug 20, 2010
Messages
209 (0.04/day)
Location
Mostar, Bosnia & Herzegovina
System Name Micro Mule
Processor Intel i7 950 Stock + Noctua NH-C14
Motherboard Asus Rampage III Gene MicroATX
Cooling Noctua 120mm/80m Fans
Memory Crucial Ballistix 6GB DDR3 1600MHz
Video Card(s) Asus nVidia GTX 580
Storage Samsung 850 Pro SSD, WD Caviar Black 2TB HDD
Display(s) LG 42LD650 42" LCD HDTV
Case Silverstone Fortress FT03
Audio Device(s) Creative SB X-Fi Titanium HD + Sennheiser PC360 Headset
Power Supply Corsair AX850 - 850W Modular Gold
Software Windows 7 Ultimate 64 bit
:toast: Yes, I do get a bit heated when some people piss me off.


Part of the reason why is that in the last few years this digital dream bullcrap all these companies have been promising hasn't come true. I get really pissed when any of them start spewing numbers, then hold users back from enjoying and using the hardware they purchase by limiting it, just to spin the wheel again at a later date with nothing more than a new paint job. ATI has failed me, Adobe has failed me, Canon has failed me, Intel is promising crap it can't deliver, Motorola has failed me, Nvidia has failed to deliver, and Microsoft is not pushing people toward standardized programming.


I bought a Canon high-def camcorder; it records M2TS, the same as Blu-ray. According to ATI we should be processing that on stream processors with this shiny new... but first buy this $600 software, then install this patch, then install this set of codecs, then you have to export it to this format, then burn it.

Intel is still fiddle-F'ing around with crap they are too large and clumsy to do right the first three times.

My phone still doesn't support Flash, and they promise "it's coming, just wait". Sounds like Atari, who still haven't fixed their TDU game, or many other issues with games that just get thrown to the side.

Nvidia pushes proprietary crap like PhysX, which works on all of 13 GPU-enabled titles, despite others showing it works just as fast on the CPU when moved up from antiquated code, besides it now being a part of DX11. Also, Nvidia and Adobe seem to be stuck in a 69 swap meet: they disable the hardware stream acceleration when an ATI card is present. Some forum members have learned how to bypass it, and wonder of wonders, it still works, using the ATI GPU to perform the calculations and not CUDA, according to them, as long as it doesn't get shut down.


So this shiny new future is bullshit. It is the same crap we have had from day one. I'm tired of spending thousands of dollars to be told I still have it wrong.

... well put ... :rockout:


A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (5970) as per the following really cool link, and all without all the CrossFire scaling issues, since sadly (for CrossFire tech users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier, the way things are going), it's either an nVidia GPU or an nVidia GPU when it comes to your upgrading purposes.

... if nVidia becomes the only choice of discrete GPU (although I know that it's never going to happen), I think that'll be the day I switch to Intel integrated graphics, or better still AMD Fusion ... in fact, with its current management, I believe that nVidia will eventually be acquired by Intel ... again, I'm neither Red nor Green, but I hate it when idiot fanboys try to transform every single discussion on these forums into an nVidia/ATI trashing circus ...
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,058 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (5970) as per the following really cool link, and all without all the CrossFire scaling issues, since sadly (for CrossFire tech users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier, the way things are going), it's either an nVidia GPU or an nVidia GPU when it comes to your upgrading purposes.

Odd. I'm an ATI owner and I've been praising the GTX 580, trying not to buy one so I can gauge the competition when it comes out in December. Your post is ignorant with regard to scaling:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580_SLI/24.html GTX 580
One GTX 580 is 77% of GTX 580 SLI (all resolutions).
http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/23.html HD 6870
One HD 6870 is 73% of HD 6870 CrossFire (all resolutions).

So the 6 series scales better in a dual-GPU config. Granted, on the 5 series the SLI option is better, but the 6 series nailed it well.
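Re-deriving the scaling factors from those two percentages (a quick sketch, using only the numbers quoted above):

Code:
# Turn "one card is X% of the dual-card setup" into a dual-GPU scaling factor.
def dual_gpu_scaling(single_as_pct_of_dual):
    return 100.0 / single_as_pct_of_dual

print(f"GTX 580 SLI:       {dual_gpu_scaling(77):.2f}x one card")  # ~1.30x
print(f"HD 6870 CrossFire: {dual_gpu_scaling(73):.2f}x one card")  # ~1.37x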

As for hard-core NVidia haters (not a nice term to use - hate is such a strong word) - I think at Christmas we'll get a fair choice. My personal feeling is that the 6970 indeed isn't faster than a 580. I think if it was faster there would be some leaks from AMD PR saying: look, our card is better - hold off buying that 580. But if it doesn't perform as well, there's nothing to leak - safer to stay quiet.
Hope I'm wrong, because if I'm not, the 580s will go up in price.

I think, though, that you're way off base. Most people do tend to take sides, but 'hating' isn't part of it. It says more about your own predisposition against AMD. But at least you wear your colours on your sleeve. It makes you prone to making erroneous statements (à la the one above, re: scaling).
 
Joined
May 4, 2009
Messages
1,972 (0.35/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock; you'll get crashes in games and blame the games or the drivers when it's really user error.

It's always easier to have a stress testing program open, don't get me wrong. But what I was trying to say is that it's pointless and needless to run it for 5+ hours. I usually set a clock, test for a couple of mins, go higher, test for a couple of mins, go higher, test for a couple of mins. The moment I get artifacts, I go back 10 MHz and try again. Once I'm bored of that, I fire up a game, and if it crashes, I just go back 10-20 MHz on both RAM and core and try again...
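That procedure is essentially a step-up-and-back-off loop. A rough sketch (set_core_clock and run_stress_minutes are hypothetical stand-ins for whatever tuning and testing tools you actually use):

Code:
# Rough sketch of the stepping procedure described above; it loops until
# the first unstable step is found, then settles just below it.
# set_core_clock() and run_stress_minutes() are hypothetical placeholders.
def find_stable_clock(base_mhz, step_mhz, backoff_mhz,
                      set_core_clock, run_stress_minutes):
    clock = base_mhz
    seen_artifacts = False
    while True:
        set_core_clock(clock)
        if run_stress_minutes(2):     # a couple of minutes per step
            if seen_artifacts:
                return clock          # passed again after backing off
            clock += step_mhz         # clean run: push higher
        else:
            seen_artifacts = True
            clock -= backoff_mhz      # artifacts: back off and retest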

I agree that it's unrealistic to think that a game can go over the 300W limit, because of the way game code is written and because of the randomness that the human player creates.
Gameplay is always random, and that means the environment is always created in real time. Thus every scene has to go through the entire pipeline and spend a finite amount of time in each step of it.
To be fair, stress testing tools are more like advanced HPC calculations or even folding, where a specific part is stressed over and over for long periods of time.

Edit:
Edit:
And if we're talking about corporate takeovers, I think Nvidia will be snatched up first, not because they're in danger of going down or anything crazy like that, but because it would be a smart purchase. Their cards are doing great in the HPC space, and it would be a smart move for someone like IBM or Oracle (or even HP and Dell) to snatch them up before Nvidia gains too much momentum and while they're still cheap. That would allow them to add Nvidia to their server farm lineup and have an ace up their sleeve compared to the opposition.
 
Joined
Oct 9, 2009
Messages
716 (0.13/day)
Location
Finland
System Name RGB-PC v2.0
Processor AMD Ryzen 7950X
Motherboard Asus Crosshair X670E Extreme
Cooling Corsair iCUE H150i RGB PRO XT
Memory 4x16GB DDR5-5200 CL36 G.SKILL Trident Z5 NEO RGB
Video Card(s) Asus Strix RTX 2080 Ti
Storage 2x2TB Samsung 980 PRO
Display(s) Acer Nitro XV273K 27" 4K 120Hz (G-SYNC compatible)
Case Lian Li O11 Dynamic EVO
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Asus Thor II 1000W + Cablemod ModMesh Pro sleeved cables
Mouse Logitech G500s
Keyboard Corsair K70 RGB with low profile red cherrys
Software Windows 11 Pro 64-bit
Do I run Furmark 24/7? No.
Does it break if I do run Furmark without the limiter? No.
Does the limiter kick in during games, even with overvolting and overclocking? No.
Does it prevent someone from breaking the card if they don't know what they are doing with voltages? Quite possibly.
Card priced right compared to previous gens? Yes.
Fastest single GPU at least for the moment? Yes.
Does it run relatively quiet and at reasonable temps? Yes.
Do I need new card? Yes.

= Ordered GTX 580

Seriously, this bullshit whining about limiters in programs like Furmark is silly; it is not even a new thing, and even AMD has done driver-level limiters. There is a grand total of 0 people for whom it is a problem, except in their heads; it's yet another thing to bash NV about by people with no intention of ever even looking in the direction of their cards.

Oh, and just to be sure: I have had over 10 ATi and 10 NV cards in the past 2 years, go figure.

If the 580 isn't for you then please move along; I am sure the HD 69xx will come and deliver too. But stop this nonsense, please.

/End of rant
 
Joined
Nov 15, 2010
Messages
9 (0.00/day)
Location
Australia
Wtf at 5970 scores in WoW?

compare these two

http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/18.html

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/20.html

The 5970 is in totally different places in these 2 tests, while the other GPUs are at the exact same FPS.

Are we 100% sure that this site is trustworthy?

I looked into this regarding the 6870's CF performance in WoW; however, the score seems to be half that of just 1 card. I believe this is a mistake on your end, TechPowerUp, when you benchmarked the 6870 cards.

Please give a logical explanation for the two entirely different results for the same benchmark.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,850 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The 5970 is in totally different places in these 2 tests [reviews], while the other GPUs are at the exact same FPS.

Are we 100% sure that this site is trustworthy?

don't trust this site!! read the test setup page before making accusations
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,058 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
Can I swear?

BASTARDS!

Overcockers, sorry, OverclockersUK are price gouging for sure. They only have the ASUS board in stock, and it's £459.99. They'll do this until the HD 6970 comes out, the same way the 6850 and 6870 prices increased almost immediately.
 
Joined
Nov 15, 2010
Messages
9 (0.00/day)
Location
Australia
If so, can you rebenchmark WoW with the 10.10 drivers for 6870s in CF?

Edit: my initial post was copied from an overclockers site; maybe I should've removed the trust bit lol XD my bad
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,850 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
If so, can you rebenchmark WoW with the 10.10 drivers for 6870s in CF?

Edit: my initial post was copied from an overclockers site; maybe I should've removed the trust bit lol XD my bad

Just go by the 5970 numbers and the 5970 vs. 6870 relative performance in other games.

And please go to that forum and tell them what's going on with the numbers, so there's no need to cry conspiracy.
 