
Gainward Readies GeForce GTX 580 Phantom 3 GB Graphics Card

btarunr

Editor & Senior Moderator
Last month, Gainward introduced its GeForce GTX 570 Phantom, which won considerable appreciation among readers for its radical design. The company has decided to extend the design to the GeForce GTX 580, too, since it's based on the same GF110 silicon. The only major change is the use of 2 Gbit memory chips, yielding a total memory of 3072 MB (3 GB). The memory is spread across twelve 2 Gbit chips on a 384-bit wide GDDR5 memory interface. Clock speeds stick to NVIDIA reference: 783/1566/4020 MHz (core/CUDA cores/memory effective). The GTX 580 packs 512 CUDA cores.
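For anyone who wants to verify the memory figures, here's a quick back-of-envelope check. The chip count, chip capacity, bus width and effective clock are taken from the article; the Python is just doing the arithmetic.

```python
# Sanity-check the specs: twelve 2 Gbit GDDR5 chips on a 384-bit bus,
# 4020 MHz effective memory clock (all figures from the article above).

chips = 12
chip_capacity_gbit = 2          # 2 Gbit per chip
bus_width_bits = 384
effective_clock_mhz = 4020      # GDDR5 effective data rate

total_mb = chips * chip_capacity_gbit * 1024 // 8    # Gbit -> MB
bandwidth_gbs = effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(total_mb)                  # 3072 MB, i.e. 3 GB
print(round(bandwidth_gbs, 2))   # 192.96 GB/s memory bandwidth
```

So twelve 2 Gbit chips do come out to exactly 3072 MB, and the 4020 MHz effective clock on a 384-bit bus works out to roughly 193 GB/s of bandwidth.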

The cooler makes use of a black aluminum fin array, to which heat is fed by six heat pipes. Instead of fans on its obverse side blowing air onto the PCB, there are three fans on its reverse side that draw air through the aluminum fin array and onto the PCB. The result is a better-looking product. Display outputs include two DVI, and one each of HDMI and DisplayPort. Gainward will release its 3 GB GTX 580 Phantom graphics card soon, at a premium over the reference design.



 
Oooh, a 3GB version. I may upgrade in the future.
 
hmm, I might have to sell my 470.
 
So this is going to cost...say...650€~700€ ?
 
:twitch:^2

WeuW!

Been thinking of an on-topic response, but for the next few hours I don't think I'll come up with any. Too flabbergasted.
 
So does this mean we can finally play GTAIV the way it was meant to be played? ;) :lol:
 
Two of them in SLI = perfect setup.
 
I can imagine 3GB being a big advantage for the kind of environments that GTX580 SLI setups end up in - multiple monitors, 3d etc.
 
:respect: Oh God... Let me win the lottery... I promise to frag in your name... amen...
 
Actually, it comes with a small OC. The reference card runs at 772/1544/4008 MHz.

This card is similar in frequency to the company's GTX 580 GOOD Edition.
 
Is this a reference board? Meaning I could put a backplate on it.
 
Regardless of how much this thing will cost... WOW.
And it looks amazing, too.

EDIT: It would be nice if they could somehow develop a cooling design that blows the air out of the case.
 
I have to see benches, otherwise I'm not sold. I couldn't give two shits about a 3 GB 580 unless it can perform, especially in demanding DX11 games.
 
I would be willing to bet it would make a difference when gaming at resolutions like 7680x1600 with high levels of AA (with SLI, of course).
 
I would be willing to bet it would make a difference when gaming at resolutions like 7680x1600 with high levels of AA (with SLI, of course).

Yeah right - how many of us will actually run at these resolutions :roll: ..... how many did you say ..... well, I thought so ..... none ;)
 
:laugh: but the people who have the money for a pair of these probably have the money for three 2560x1600 monitors.

I do wonder what effect it would have when using three 2560x1600, 2560x1440, 1920x1200 or 1920x1080 monitors; I doubt there would be much difference with three 1680x1050 ones, though.
 
I guess these were made for the pissed-off Nvidia fans who found out that two 6970s on three monitors were as good or even better in games.
 
I guess these were made for the pissed-off Nvidia fans who found out that two 6970s on three monitors were as good or even better in games.

How many users actually use triple-screen gaming? And I doubt Nvidia made them because a tiny portion of their market was "pissed off" about AMD's cards. Nvidia holds its own very well with the GTX 580 already; I don't think the 6970 scares them, save for the fact that some people will look at a 1.5 GB card and a 2 GB card and automatically think the 2 GB one is faster.

Really, I feel like Nvidia is catering to people with more money than brains when it comes to a box with "3GB GDDR5!" written on it, as opposed to fans who are pissed off about 6970s in CFX at Eyefinity resolutions competing well against GTX 580s.
 
How many users actually use triple-screen gaming? And I doubt Nvidia made them because a tiny portion of their market was "pissed off" about AMD's cards. Nvidia holds its own very well with the GTX 580 already; I don't think the 6970 scares them, save for the fact that some people will look at a 1.5 GB card and a 2 GB card and automatically think the 2 GB one is faster.

Really, I feel like Nvidia is catering to people with more money than brains when it comes to a box with "3GB GDDR5!" written on it, as opposed to fans who are pissed off about 6970s in CFX at Eyefinity resolutions competing well against GTX 580s.

For all we know, this has nothing to do with Nvidia at all, but rather their partners taking the initiative.
 
For all we know, this has nothing to do with Nvidia at all, but rather their partners taking the initiative.

An excellent point, and in any case it's a move to boost sales, which is what any company is interested in.
 
:laugh: but the people who have the money for a pair of these probably have the money for three 2560x1600 monitors.

I do wonder what effect it would have when using three 2560x1600, 2560x1440, 1920x1200 or 1920x1080 monitors; I doubt there would be much difference with three 1680x1050 ones, though.

It depends on the settings of your games, too. I'm playing GRAW 2 with 8xMSAA + 8xTrSSAA (supersampling) and it uses 1500 MB of VRAM.
 
It depends on the settings of your games, too. I'm playing GRAW 2 with 8xMSAA + 8xTrSSAA (supersampling) and it uses 1500 MB of VRAM.

True, some games/game engines with the right levels of AA just eat up crazy amounts of memory, even at single-monitor resolutions.
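To put some rough numbers on the multi-monitor/AA discussion: below is a simplistic framebuffer estimate. The assumptions (RGBA8 color plus a 4-byte depth-stencil per MSAA sample) are mine, not from the thread; real drivers allocate extra back buffers, render targets and textures on top, so treat this as a loose lower bound on VRAM use.

```python
# Naive framebuffer-size estimate for a given resolution / monitor count /
# MSAA level. Assumes 4 bytes per pixel per sample for color (RGBA8) and
# 4 more for depth-stencil (D24S8); textures and extra render targets are
# NOT included, so real usage is considerably higher.

def framebuffer_mb(width, height, monitors=1, msaa=1):
    pixels = width * height * monitors
    color = pixels * 4 * msaa   # color buffer, one RGBA8 value per sample
    depth = pixels * 4 * msaa   # depth-stencil buffer, D24S8 per sample
    return (color + depth) / (1024 ** 2)

# Three 2560x1600 monitors with 8x MSAA, as discussed above:
print(round(framebuffer_mb(2560, 1600, monitors=3, msaa=8)))  # 750 MB
```

Even this bare-bones estimate lands at 750 MB for just the MSAA buffers at triple 2560x1600, before textures or any SLI overhead, which makes the jump from 1.5 GB to 3 GB look less gratuitous at those resolutions.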
 
Sexy, but late to the game IMO. I have my eyes set on the EVGA dual 570, perhaps even two of them, tax refund permitting! :roll:
 