Monday, March 12th 2012

GeForce GTX 580 to Get Price-Cuts

To pave the way for the GeForce GTX 680, which will arrive later this month in limited initial quantities, with wide availability in the months to come, NVIDIA is cutting the price of its GeForce GTX 580 graphics card. The GF110-based behemoth of 2011 will now start at 339.80 EUR, according to European price aggregators such as Geizhals.at. The new price makes the GeForce GTX 580 1.5 GB cheaper than the Radeon HD 7950, and gives it a slightly better price-performance ratio. The 3 GB variants of the GeForce GTX 580 are priced similarly to the HD 7950. The GTX 570 now starts at 249 EUR.
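As a rough illustration of the price-performance claim above, here is a minimal sketch in Python; the relative performance index and the HD 7950 street price used below are illustrative assumptions for the sake of the arithmetic, not measured results or quoted prices.

# Rough price-performance comparison at the new GTX 580 street price.
# NOTE: the relative_perf values and the HD 7950 price are illustrative
# assumptions, not benchmark results or quoted prices.
cards = {
    "GTX 580 1.5 GB": {"price_eur": 339.80, "relative_perf": 1.00},
    "HD 7950 3 GB":   {"price_eur": 410.00, "relative_perf": 1.10},  # assumed
}

for name, card in cards.items():
    perf_per_100_eur = 100 * card["relative_perf"] / card["price_eur"]
    print(f"{name}: {perf_per_100_eur:.3f} performance per 100 EUR")

With those placeholder numbers the GTX 580 comes out ahead per euro (about 0.294 vs 0.268), which is the sense in which its price-performance ratio improves at the new price.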

54 Comments on GeForce GTX 580 to Get Price-Cuts

#1
Enmity
edit: nevermind, I just can't read lol

Can't wait to see how this card performs and whether this dynamic OC feature works well or is a flop. I'm eagerly awaiting NV's offerings since I want to give them a crack, as I haven't used an NV card since the 8800 GTS 320 MB.
#3
Aceman.au
Lol and so the price dropping begins... As per usual
#4
Bjorn_Of_Iceland
Still expensive. I can see people getting 2nd hand cards instead.
#5
RejZoR
Well, for 340 EUR I see it as good competition even against the current HD 7000 lineup, despite the fact that it's the older series. With the absolutely nonsensical prices around the HD 7950 and HD 7970 (both way over 400 EUR), the GTX 580 is still a good option.
#6
wickerman
What Nvidia would be wise to do is release their high-end Kepler products and go for the crown, and just die-shrink the GTX 580 and 570 to go after the HD 7800 series.

The G80 to G92 transition proved you could teach an old dog new tricks: by shrinking an existing die, making minor tweaks, and upping the clock speeds, you could make a good architecture last in the market for a number of years while your competitor struggled to scale new architectures down from their high end and still keep them competitive.

A more efficient GTX 580 would be a perfect card: knock it down to 28 nm and rebalance the clock speeds accordingly, and even without DX11 I bet they'll sell very well against the 7800 series. And a 28 nm GTX 570/560 Ti would really make a lot of Nvidia's cluttered midrange obsolete.
#7
LAN_deRf_HA
And it only took what, like a year and a half?
#8
vega22
wickerman: What Nvidia would be wise to do is release their high-end Kepler products and go for the crown, and just die-shrink the GTX 580 and 570 to go after the HD 7800 series.

The G80 to G92 transition proved you could teach an old dog new tricks: by shrinking an existing die, making minor tweaks, and upping the clock speeds, you could make a good architecture last in the market for a number of years while your competitor struggled to scale new architectures down from their high end and still keep them competitive.

A more efficient GTX 580 would be a perfect card: knock it down to 28 nm and rebalance the clock speeds accordingly, and even without DX11 I bet they'll sell very well against the 7800 series. And a 28 nm GTX 570/560 Ti would really make a lot of Nvidia's cluttered midrange obsolete.
wow

And there I was thinking the 580 was a revision of the 480 and had DX11 :|

Flushing the channels before a new launch, what a shocker :lol:
#9
xenocide
LAN_deRf_HA: And it only took what, like a year and a half?
It was always priced competitively.

I also don't think it's as easy as just shrinking the pre-existing GTX 580 to 28 nm; it would take a hefty amount of reworking. Kepler is intended to replace Fermi as a similar but much more efficient (in terms of power and heat generation) iteration. I don't see them wasting time reworking that design when they already have the improved version in production.
#10
hhumas
I sold my GTX 580 Black Ops today @ $460.
#11
Jonap_1st
$4 cheaper, the 7950 is still faster in the most recent games (exc. Skyrim), and it consumes 80-150 W less than the 580 at peak & maximum. But if you love your card, then what can anyone say? :)
#12
dj-electric
With the HD 7950's super-massive overclocking abilities, even $399 would still not really be worth it.
#13
LAN_deRf_HA
xenocide: It was always priced competitively.

I also don't think it's as easy as just shrinking the pre-existing GTX 580 to 28 nm; it would take a hefty amount of reworking. Kepler is intended to replace Fermi as a similar but much more efficient (in terms of power and heat generation) iteration. I don't see them wasting time reworking that design when they already have the improved version in production.
It was never priced competitively. None of the high-end cards have been for years. One company, usually Nvidia, sets a price they want you to pay, and then the other company releases their cards in between those price points. There is no competition. It's price fixing without actual communication.
#14
xenocide
LAN_deRf_HA: It was never priced competitively. None of the high-end cards have been for years. One company, usually Nvidia, sets a price they want you to pay, and then the other company releases their cards in between those price points. There is no competition. It's price fixing without actual communication.
I'm not saying I agree with it, not in the slightest, and it is completely price fixing. At this rate Nvidia's cards will be priced to compete with AMD's, so if the GTX680 is faster than the HD7970, it will probably launch at nearly $600.
#15
wickerman
marsey99: wow

And there I was thinking the 580 was a revision of the 480 and had DX11 :|

Flushing the channels before a new launch, what a shocker :lol:
I just confused my terms, as I intended to write D3D11.1 but wrote DX11, since DirectX is what we are always used to talking about, not Direct3D. The HD 7900 series launched as the first with Direct3D 11.1 support, which is one of the major changes in Windows 8 and one of the larger selling points of this next generation of GPUs. Obviously Kepler will be Direct3D 11.1 as well, where Fermi was not; Fermi was Direct3D 11. But there you have it, a mistake; good thing I'm not a brain surgeon. :wtf:
#16
eddman
Jonap_1st: $4 cheaper, the 7950 is still faster in the most recent games (exc. Skyrim), and it consumes 80-150 W less than the 580 at peak & maximum. But if you love your card, then what can anyone say? :)
Maximum?! That can only be reached while stress testing with Furmark. Which gamer does that?!

The real figures are the average and peak values, at 88 W and 85 W respectively. Still a big difference, but not that bad considering we are comparing an old 40 nm GPU to a 28 nm one. All in all, the 7950 looks to be the better choice, unless someone wants things like CUDA and/or PhysX.
#17
m1dg3t
The 580 is still too expensive in my area; the cheapest I saw was $440.
#18
Selene
It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680), which was slated to be the GTX 660 Ti @ $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.
#19
Benetanegia
Selene: It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680), which was slated to be the GTX 660 Ti @ $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.
To me this price cut sounds like the GTX 680 is either going to be significantly cheaper than 500, say 350-400, or significantly faster than most of us expect now, because otherwise there would be no rush to lower the GTX 580's price to 330. If they were going to sell it for 500 and it is 25% faster than the GTX 580, there would still be a place for the GTX 580 at 400 or so; no need to go as low as 330, and 250 for the GTX 570. They would be making the new offering look very overpriced, and it would force AMD to lower prices too, BEFORE the GTX 680 launches, which is shooting themselves in the foot, because it's the GTX 680 that needs the fame, not the EOL'd card.
#20
erocker
Selene: It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680), which was slated to be the GTX 660 Ti @ $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.
$499 for the GTX680? Lol, that would be a bargain. I've heard prices are going to be a bit more than that.
#21
Selene
erocker: $499 for the GTX680? Lol, that would be a bargain. I've heard prices are going to be a bit more than that.
Normally I would agree, but this card was intended to be a mid-range $299.99 card and has now been renamed and price-jacked to a flagship because of the lack of competition from AMD's 7900s.
#22
erocker
It's going to be more expensive than the 7970. I could be wrong I suppose. Lack of competition from AMD? Nvidia doesn't even have a card out yet.
#23
Steevo
Selene: Normally I would agree, but this card was intended to be a mid-range $299.99 card and has now been renamed and price-jacked to a flagship because of the lack of competition from AMD's 7900s.
I have not seen anywhere that Nvidia was releasing this as anything other than a high-end card. Can someone provide a link, proof, or something other than hearsay from some other post on another site?
#24
Casecutter
Selene: It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680), which was slated to be the GTX 660 Ti @ $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.
It doesn't suck; no one is making you purchase it, and if you can't stomach the price there will be some who do. So Nvidia isn't pressured to release a high-end part, and instead tries to implement some new whiz-bang feature on lower chips to achieve competitiveness… and that's somehow AMD's fault. :laugh:

So in your mind, half a year ago or more, Nvidia knew "with certainty" that the reference 7970 would bring about a 12-18% improvement over the GTX 580, and with that they shelved the "top dog" and embarked on making the GK104 competitive, hinging on whether they could successfully implement some untried feature to boost performance as needed. :cool:

Nvidia was able to, even if it was already a "think tank" project, completely evaluate, implement, and test all the parameters to make the dynamic profile feature function flawlessly in those 6 months? A tall order, but plausible.

You deduce AMD placed the bar too low, so Nvidia runs up to it and figures, ah, what the heck, watch this… rolls its shoulders back and limbos under the bar, wowing the faithful! They throw their arms up in victory saying we got to the other side, though with new rules, OK? Wait for their next attempt.

So a GK100 is such a virtuoso in performance, power, and price that Nvidia couldn't bring themselves to market it? :wtf:

Don't get me wrong: if it works and they planned it this way (even as a contingency) when first setting out to develop Kepler… kudos to them, and a good game changer. :respect:

But if they grew apathetic 6 months back and changed their game plan based on speculation about performance, wow, they risked a lot on something that should've at least been shown previously in a trial product. :twitch:
#25
Benetanegia
Steevo: I have not seen anywhere that Nvidia was releasing this as anything other than a high-end card. Can someone provide a link, proof, or something other than hearsay from some other post on another site?
GK104: the 4 at the end has always been indicative of a midrange/performance part, just like 6 is lower mainstream and 8 is low end. And the existence of GK100 and GK110 as the high-end parts is very well known, though nothing else is really known except that at some point they were/are in the making.

@ Casecutter

They didn't know exactly what AMD would bring, but they had an idea a long way back. Specs were known, there were some leaked figures, etc. So they already knew it would be close. They also had the previous generation to compare with, where GF114 (midrange/performance) was faster than Cypress (high-end), even though it released later; it's what GF104 was supposed to be. So if Kepler is much better than Fermi, it was relatively safe to assume they could do better this generation. But they didn't just stop making the high-end chip, who said that? They only scrapped it (if they have cancelled it at all) once the HD 7970's performance was known. According to rumors, things were not going well with the chip, so instead of releasing another cut-down chip like the GTX 480, they cancelled/postponed it. And again, IF they have done so at all, because no one really knows anything about it. If GK104 couldn't compete with Tahiti, they would have forced out another cut-down card like the GTX 480.