Monday, March 12th 2012
GeForce GTX 580 to Get Price-Cuts
To pave the way for the GeForce GTX 680, which will arrive later this month in limited quantities, with wide availability in the months to come, NVIDIA is cutting the prices of its GeForce GTX 580 graphics card. The GF110-based behemoth of 2011 will now start at 339.8 EUR, according to European price aggregators such as Geizhals.at. The new price makes the GeForce GTX 580 1.5 GB cheaper than the Radeon HD 7950 and gives it a slightly better price-performance ratio. The 3 GB variants of the GeForce GTX 580 are priced similarly to the HD 7950. The GTX 570 starts at 249 EUR.
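For readers who want to sanity-check the price-performance claim themselves, here is a minimal sketch of the arithmetic. The GTX 580 price comes from the article; the HD 7950 price and both performance-index values are placeholders for illustration only, not benchmark results, so substitute figures from whichever review aggregate you trust.

```python
# Minimal sketch: comparing price-performance (performance per euro).
# The GTX 580 price is from the article; the HD 7950 price and the
# performance index values are hypothetical placeholders, not measured data.

cards = {
    "GTX 580 1.5 GB": {"price_eur": 339.8, "perf_index": 100.0},  # baseline index (placeholder)
    "HD 7950":        {"price_eur": 360.0, "perf_index": 105.0},  # hypothetical price and index
}

for name, card in cards.items():
    ratio = card["perf_index"] / card["price_eur"]
    print(f"{name}: {ratio:.3f} performance points per EUR")
```

With numbers like these, the cheaper GTX 580 comes out slightly ahead per euro even if the HD 7950 is a few percent faster, which is all the article's "slightly improved price-performance ratio" claim amounts to.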
54 Comments on GeForce GTX 580 to Get Price-Cuts
Can't wait to see how this card performs and whether this dynamic OC feature works well or is a flop. I'm eagerly awaiting NV's offerings because I want to give them a crack, since I haven't used an NV card since the 8800 GTS 320 MB.
The G80-to-G92 transition proved you could teach an old dog new tricks: by shrinking an existing die, making minor tweaks, and upping the clock speeds, you could make a good architecture last in the market for a number of years while your competitor struggled to scale new architectures down from their high end and still keep them competitive.
A more efficient GTX 580 would be a perfect card: knock it down to 28 nm, rebalance the clock speeds accordingly, and even without DX11 I bet they'd sell very well against the 7800 series. And a 28 nm GTX 570/560 Ti would make a lot of NVIDIA's cluttered midrange obsolete.
And there I was thinking the 580 was a revision of the 480 and had DX11 :|
Flushing the channels before a new launch, what a shocker :lol:
I also don't think it's as easy as just shrinking the pre-existing GTX 580 to 28 nm; it would take a hefty amount of reworking. Kepler is intended to replace Fermi as a similar but much more efficient (in terms of power and heat generation) iteration. I don't see them wasting time on reworking that design when they already have the improved version in production.
The real ones are the average and peak values, at 88 W and 85 W respectively. Still a big difference, but not that bad considering we are comparing an old 40 nm GPU to a 28 nm one. All in all, the 7950 looks to be the better choice, unless someone wants things like CUDA and/or PhysX.
So in your mind, half a year ago or more, NVIDIA knew "with certainty" that the reference 7970 would bring about a 12-18% improvement over the GTX 580, and on that basis they shelved the "top dog" and set out to make GK104 competitive, hinging on whether they could successfully implement some untried feature to boost performance as needed. :cool:
NVIDIA was able to, even if it was already a "think tank" project, completely evaluate, implement, and test all the parameters to make the dynamic clock profile feature function flawlessly in those six months? A tall order, but plausible.
You deduce AMD placed the bar too low, so NVIDIA runs up to it, figures "ah, what the heck, watch this…", rolls its shoulders back, and limbos under the bar, wowing the faithful! Then it throws its arms up in victory saying "we got to the other side"; though with new rules, OK? Wait for their next attempt.
So GK100 is such a virtuoso in performance, power, and price that NVIDIA couldn't bring themselves to market it? :wtf:
Don't get me wrong, if it works and they planned it this way (even as a contingency) when first setting out to develop Kepler… kudos to them, and a good game changer. :respect:
But if they got complacent six months back and changed their game plan based on speculation about performance, wow, they risked a lot on something that might, or should, have at least been shown previously in a trial product. :twitch:
@ Casecutter
They didn't know exactly what AMD would bring, but they had an idea a long way back. Specs were known, there were some leaked figures, etc., so they already knew it would be close. They also had the previous generation to compare with, where GF114 (midrange/performance) was faster than Cypress (high-end) even though it released later; it's what GF104 was supposed to be. So if Kepler is much better than Fermi, it was relatively safe to assume they could do better this generation. But they didn't just stop making the high-end chip; who said that? They only scrapped it (if they have cancelled it at all) once the HD 7970's performance was known. According to rumors, things were not going well with the chip, so instead of releasing another cut-down chip like the GTX 480, they cancelled/postponed it. And again, IF they have done so at all, because no one really knows fuck all about it. If GK104 couldn't compete with Tahiti, they would have been forced to do another cut-down card like the GTX 480.