
Second Wave of NVIDIA GeForce GTX 600 Products Due For May

I'm sure there will be an Asus DirectCU II 660 or 670 with a backplate, but you're just in la-la land with the '666' thing, I think.

Oh I am sure, but one can dream.
 
GK110 not going to happen then? I'd rather get the "full" Kepler card than the 680.
 
GK110 not going to happen then? I'd rather get the "full" Kepler card than the 680.

It'll happen, but I have a feeling GK110 is going to be the GTX 780.
 
8800 GTX vs 7900 GTX was around 35%
GTX 280 vs 8800 GTX was around 32%
GTX 480 vs GTX 280 was around 33%
GTX 680 vs GTX 480 was around 27%
I guess I looked up some bad reviews as Anandtech/TPU found:

8800 GTX vs 7900 GTX: 112% increase with new architecture and full-node process shrink. +146% die size change and 146% more transistors.
GTX 280 vs 8800 GTX: 69% increase with same architecture and full-node process shrink. +19% die size change and 106% more transistors.
GTX 480 vs GTX 285: 47% increase with new architecture and half-node process shrink. -8% (vs GTX 280) / +12% (vs GTX 285) die size change and 114% more transistors.
GTX 680 vs GTX 580: 29% increase (apples to oranges, performance vs high-end) with new architecture and full-node process shrink. -49% die size change and 19% more transistors.
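For reference, here's a quick sketch of where those die-size and transistor deltas come from, using commonly cited specs for each chip. Treat the figures as approximate (different sources round differently, which is why a couple of percentages land a point or two off the ones above):

```python
# Rough generational deltas from commonly cited GPU specs (approximate figures;
# GF100/GTX 480 stands in for the nearly identical GF110/GTX 580 die).
specs = {
    "G71 (7900 GTX)":  {"die_mm2": 196, "transistors_m": 278},
    "G80 (8800 GTX)":  {"die_mm2": 484, "transistors_m": 681},
    "GT200 (GTX 280)": {"die_mm2": 576, "transistors_m": 1400},
    "GF100 (GTX 480)": {"die_mm2": 529, "transistors_m": 3000},
    "GK104 (GTX 680)": {"die_mm2": 294, "transistors_m": 3540},
}

def pct_change(new, old):
    """Percentage change from old to new, e.g. +147 or -44."""
    return (new / old - 1) * 100

names = list(specs)
for prev, curr in zip(names, names[1:]):
    die = pct_change(specs[curr]["die_mm2"], specs[prev]["die_mm2"])
    trs = pct_change(specs[curr]["transistors_m"], specs[prev]["transistors_m"])
    print(f"{curr} vs {prev}: die {die:+.0f}%, transistors {trs:+.0f}%")
```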
 
I compared TPU reviews only: I looked up the same tests and games I could find, then did the numbers manually. Anandtech shows larger gains; TPU didn't. Most of that gain was due to the 512 MB frame buffer vs 768 MB when AA is turned on. Granted, it's still a valid comparison: the 7900 GTX lacked enough VRAM, which causes problems with AA on. Oblivion back in the day could approach 800-900 MB of VRAM usage.

So, looking at 1280x1024, which was the most popular resolution of the time (per Steam's hardware survey, around 36-40% of all Steam users):
http://web.archive.org/web/20060825052346/http://www.steampowered.com/status/survey.html

if we focus on the majority (today's majority is 1920x1080),

we can see the 7900 GTX to 8800 GTX difference is roughly 40%.
At 1280x1024:
Far Cry = 1%
Prey = 30%
Fear = 36%
Quake 4 = 23%
X3 = 1%

Avg 18%

At 1600x1200:
Far Cry = 27%
Prey = 41%
Fear = 43%
Quake 4 = 47%
X3 = 23%

Avg 36%

As resolution increases the performance gap widens, but as far as TPU reviews go, at the most commonly used resolutions of the time period it was a 35-40% difference, give or take.
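For anyone redoing these numbers: the averages above are plain arithmetic means of the per-game gains. A minimal sketch using the figures from this post; a geometric mean of the speedup ratios is included for comparison, since the two aggregation styles can differ, though here they land within a point of each other:

```python
from math import prod

# Per-game % gains of the 8800 GTX over the 7900 GTX (numbers from this post).
gains = {
    "1280x1024": [1, 30, 36, 23, 1],    # Far Cry, Prey, Fear, Quake 4, X3
    "1600x1200": [27, 41, 43, 47, 23],
}

for res, pcts in gains.items():
    arith = sum(pcts) / len(pcts)
    # Geometric mean of the speedup ratios, expressed back as a % gain.
    geo = (prod(1 + p / 100 for p in pcts) ** (1 / len(pcts)) - 1) * 100
    print(f"{res}: arithmetic avg +{arith:.0f}%, geometric avg +{geo:.0f}%")
```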

http://www.techpowerup.com/reviews/Zotac/GeForce_8800_GTX_OC/10.html

X3: 7900 GTX 51 FPS, 8800 GTX 66 FPS
51 / 66 = .77, aka 23%, although maybe I'm doing the math wrong, haven't slept in 48 hours
 
nvidia's discrete market share to 63% in Q4 2011.

That is for Q3 2011, not Q4.
The bulk of desktop sales come from performance and mainstream cards, not high-end GeForce or Tesla and Quadro cards.

Sorry, here's Q4:
"NVIDIA is exiting the integrated graphics segments and shifting focus to discrete GPUs. The company showed good desktop discrete market share gain (3.7% qtr-qtr)".

Discrete isn't broken out as gaming/professional, so seeing they were off -6.2% in Q3, they gained back some but were still negative... And yes, it probably had to do with finally getting aggressive on pricing for 500-series SKUs (GTX 580 rebates, some as high as $60), which they had to do given the product life cycle and shareholders starting a lynch mob. I mean, they lost market share while pulling in more unit sales; that's telling. Using these market numbers, it's not as rosy as your 63% improvement seen on Steam quarter over quarter.

Market share this Qtr / Market share last Qtr / Unit change Qtr-Qtr / Share change Qtr-Qtr / Market share last yr
Nvidia: 15.7% / 16.1% / -13.2% / -3.1% / 22.5%

http://www.techpowerup.com/161065/Jon-Peddie-Research-Reports-Q4-Graphics-Shipments.html

I don't take much stock in marketing information... from Steam users! :roll:
 
1280x1024, which was the most popular resolution of the time (per Steam's hardware survey...)
Wow, be still my gentle heart... 1280x! I realize the point being projected, although such a viewpoint is not productive.

I'm just going to interject that that isn't a defensible position today, given these cards shouldn't ever be a consideration for anyone still working at 1280x. Other than what might turn out to be the GTX 650 (hark back to when Nvidia called the GTS 450 'LAN Party Pwning' at 1680x), purchasing such cards, it must be said, makes no sense for 1280x. It's 2012; letting new graphics at this price even be discussed in the context of 1280x is not productive. There's a card for that: it's a 7750 for $100, and the youngster doesn't have to worry about the PSU in their OEM box.
 
Uh, did I say 680 at 1280? No, I said 7900 GTX.

A 7900 GTX doesn't even have enough memory for 1080p gameplay, period.

Key words: at that time.

We're talking 200-and-fucking-5; get a clue, that was 7 years ago. If 7 years ago the average performance difference at the most common resolutions was 30-40%, and that same difference is what we see today at the most common resolution, aka 1080p, it's a viable comparison. So yes, the viewpoint is productive.

Those performance-% differences were pretty much meant for the one person bitching about every gen putting up an 80% difference over the previous gen, when looking back at reviews shows the performance difference at proper resolutions is still 30-40%; therefore the mythical 80% never really was. Which is what my post was pointing out to one specific person.

So again: 2005 7900 GTX vs 2006 8800 GTX, the most common resolution for gaming was 1280x1024.
2006 to 2007, 8800 GTX vs GTX 280, the most common resolution began to increase; 1680x1050 became far more common.
2007-2009, resolutions remained roughly the same, with widescreen becoming most prevalent, yet 1680x1050 persevered.

2009 GTX 280 vs 2010 GTX 480: at 1080p the performance difference was 33%, according to TPU.
blah blah blah
 
that was 7 years ago
And today, think of the target age for those still on a 1280x...

Wow, don't take it so personally. I just see that the whole "80% increase over previous generation" improvement argument never had any truth to it, as you pointed out. If they don't like the "improvement" (which gets harder to achieve because gaming engines are more demanding), don't try convincing them. My beef wasn't with you, but with folk thinking there's some seamless correlation back to 2005/1280x, though you made a valiant stab...

For what this is about, where we are today in 2012: those in the "not enough improvement" camp either need to accept it or not spend the money. They can continue playing outdated games on their 17-inch.
 
I ignored CPU bottlenecked benches.
 
Never existed; go back and look at old reviews:

8800 GTX vs 7900 GTX was around 35%
GTX 280 vs 8800 GTX was around 32%
GTX 480 vs GTX 280 was around 33%
GTX 680 vs GTX 480 was around 27%

Give or take a few bad benchmarks, we're looking at roughly a 25-35% gain generation to generation. If we add in refresh series, aka the 9000 and 500 series, it looks worse, but in general each gen is a 30% bump. Where people get this ridiculous 80% increase is beyond me.

Forgot to add that those gains used to take no more than a year each; that 42% + 80% = 122%. Now a year gives you 27%.
3870 ($250) vs 4870 ($299) = 70% inc.? 1 year / 7800 GTX vs 8800 GTX: 80%
4870 ($299) vs 5870 ($379) = 65% inc.? 1 year / 8800 GTX vs GTX 280: 42%
+130% over two years? / +122% over 2 years

and, GTX 480 (2010) vs GTX 680 (2012) = +27%, 2 years...
HD 5870 ($379) vs HD 6970 ($369) = +17%, 1 year
HD 6970 ($360) vs HD 7970 ($550) = +24%, 1 year (perf./price? fuck you?)
40% over 2 years
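One nitpick on the arithmetic, offered as a sketch: back-to-back generational gains compound rather than add, so the two-year totals above are actually understated. Using the percentages quoted in this thread:

```python
def compound(*gains_pct):
    """Chain successive generational % gains multiplicatively."""
    total = 1.0
    for g in gains_pct:
        total *= 1 + g / 100
    return (total - 1) * 100

print(compound(80, 42))  # 7800 GTX -> 8800 GTX -> GTX 280: +155.6%, not +122%
print(compound(70, 65))  # 3870 -> 4870 -> 5870: +180.5%, not +130%
print(compound(30, 30))  # two typical "30% bumps" already compound to +69%
```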
 
Using these market numbers, it's not as rosy as your 63% improvement seen on Steam quarter over quarter.

I don't take much stock in marketing information... from Steam users! :roll:

You are not making any sense. That 63% I mentioned was from JPR, not Steam.

http://www.techpowerup.com/161316/J...cs-Add-in-Board-Shipments-Down-6.5-in-Q4.html

Is that proof enough? We are NOT talking about total market share, which includes IGPs, which Nvidia does not have.
We are talking about discrete market share. JPR says Nvidia went from 59.7% in Q3 to 63.4% in Q4. Steam also shows more 500-series cards than HD 6000s. All of that means the 500 series actually increased its discrete market share. Very simple to understand. End of story. Is it really that hard for you to connect a few dots?
 
^No doubt this is due to ATI/AMD driver problems, plus NV often having the higher-performing top-end cards.
I haven't had any driver issues with my 6850, and CCC works just fine for me, but from what I've seen, the NV driver console looks pretty slick and clean in comparison. That, along with a good-performing 660 (or 760), would make me switch, if the 660 (or 760) had the same interesting GPU architecture as the 680, in the $200-250 range.
I wait and watch.
 
X3: 7900 GTX 51 FPS, 8800 GTX 66 FPS
51 / 66 = .77, aka 23%, although maybe I'm doing the math wrong, haven't slept in 48 hours

Yeah, you are doing it wrong. You calculated how much slower the 7900 GTX is compared to the 8800 GTX, not how much faster the 8800 GTX is.

For example, 66/51 = 1.294 >> 29.4% faster

So overall for 1600x1200*, and assuming you did the math right, the 7900 GTX was 36% slower (the GTX 580 is 18% slower than the GTX 680), which is the same as saying the 8800 GTX was 56% faster (GTX 680: 25% faster). But on top of that, a little memory would go a long way, and the 8800 GTX truly needed the fastest CPU to shine, which back then was the Core 2 Duo. Here's a Tom's Hardware article on that:

GeForce 8800 needs the fastest CPU

http://img.tomshardware.com/us/2006/11/29/geforce_8800_needs_the_fastest_cpu/image7.gif


1600x1200 -> 83/48 == 73%
1280x1024 -> 114/69 == 65%
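To make the slower/faster distinction concrete, here's the ratio math as a small sketch (the FPS pair is the X3 result quoted above; the 36% figure is the 1600x1200 average):

```python
def pct_faster(new_fps, old_fps):
    """How much faster the new card is: 66 vs 51 FPS -> +29.4%."""
    return (new_fps / old_fps - 1) * 100

def pct_slower(old_fps, new_fps):
    """How much slower the old card is: 51 vs 66 FPS -> 22.7%."""
    return (1 - old_fps / new_fps) * 100

print(pct_faster(66, 51))   # 29.4  (8800 GTX over 7900 GTX in X3)
print(pct_slower(51, 66))   # 22.7  (same data, different question)

# The two views are linked: a card that is s% slower makes the other
# card s / (100 - s) * 100 percent faster, e.g. 36% slower == 56% faster.
s = 36
print(s / (100 - s) * 100)  # 56.25
```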

EDIT: Also, just to add a little more perspective. The 7900 and X1900 were impressive by themselves: they were refreshes that brought an impressive performance increase of between 33% and 50% a few months after the release of the 7800. 7800 GTX vs 8800 GTX was much more than double the performance (+100%) in less than 18 months. GTX 480 vs GTX 680 is a +40% increase in 2 years. Same with AMD; no, actually worse, since the HD 5870 was released 6 months earlier, 2.5 years ago.

*I don't agree with 1280x1024 being the only common resolution among enthusiasts back then. 2048x1536 is there, above 1600x1200, for a reason. And Tom's even lists 2560x1600, as does Anandtech. I used 1600x1200 back then when possible.
 

Couldn't have said it better.
You know what would be awesome, Ben? 3870 vs 4870 :D :roll:
 
Yeah, you are doing it wrong. You calculated how much slower the 7900 GTX is compared to the 8800 GTX, not how much faster the 8800 GTX is.

Well, there we go: 2 days no sleep = don't try doing math, lol.
 