Thursday, March 1st 2012

Kepler Unbeatable: NVIDIA

The tiresome wait for NVIDIA's next-generation GPU is drawing to a close. Or so suggests a Facebook wall post by NVIDIA Italy, which reads (in Italian, of course):
Aspettando Kepler... pazienza, pazienza, pazienza che il momento giusto arriverà, e allora... non ce ne sarà più per nessuno! :-)
That translates roughly as: "Waiting for Kepler... patience, patience, patience, the right time will come, and then... no one else will stand a chance! :-)" From various sources we're hearing that there will be hectic activity surrounding the launch of NVIDIA's next-gen GPU in the weeks to come.
Source: XtremeSystems Forums

165 Comments on Kepler Unbeatable: NVIDIA

#51
Animalpak
arnoo1: Who cares about power usage? People who buy high-end GPUs don't care about power usage; most hardware junkies get water blocks or aftermarket coolers.

I know I can't wait, and at the end of this year I will grab one. I don't care about price; my GTX 275 is getting old as shit.
Well, my Q9650 + X48 chipset are getting old as shit too, bottlenecking my GPU.
#53
ChristTheGreat
newtekie1: NVidia is far more competent at making powerful GPUs than AMD is at making powerful CPUs... :laugh:

It also helps if you have been in the lead the past two generations already, which AMD wasn't when Bulldozer came out, but nVidia is with Kepler.
A lead in maximum performance, yes, but not in sales. If you look at the market share, AMD is in front of nVidia.

And don't forget the announcement: the Xbox 720 and the new PlayStation will use AMD graphics, while the Wii 2 will also use AMD graphics. This is bad news for nVidia :(

Anyway, AMD was good at making CPUs; they just sat on their ass a few years ago, and now you see what that did.

But for now, the HD 7970 is here, way faster than anything nVidia has, and that's that. It has powerful overclocking capabilities, which leaves room for newer cards or super-overclocked editions. While nVidia isn't showing Kepler, AMD might already be preparing a new revision of the Tahiti GPU for the next generation. The HD 8000 will not be a new architecture, I'm sure. So it's just speculation about Kepler for now; they talk but show nothing.

I'm still waiting, as I want a cheaper card that will perform fine :) (Or I'll get a second HD 6950 at a low price.)


Edit: as for power consumption, AMD controls it way better than nVidia. If you run your rig 24/7, this could make a difference at the end of the year. Anyway, mine runs smooth at idle, but the CPU is at 100% for BOINC :)
#54
jpierce55
I think as PSU prices go up, people may start caring more about power consumption. Regardless, even if I am wrong: if you had two GPUs at the same performance and price, why buy the one that uses more power? I would take a slight hit on performance for a good power saving.
#55
NC37
Wow, AMD graphics fanboys out in force today.

Heck, I switch sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the mid-range and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.

NV makes a lot of bonehead moves, which is why they lost all the contracts for the next consoles. But I can't fault them on building good GPUs. With Kepler moving them away from the monolithic monster GPU design, I can't wait to see it.
#56
EarthDog
jpierce55: I would take a slight hit on performance for a good power saving.

I would also get one single more powerful card vs SLI/CrossFire any day with a single monitor.

Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225 W and one using 300 W over the course of a year? I would bet my paycheck it would barely take your family to McDonald's* with typical GPU usage.


*Unless you participate in a distributed platform using the GPU. ;)
#57
cowie
ChristTheGreat: A lead in maximum performance, yes, but not in sales. If you look at the market share, AMD is in front of nVidia.
That's not true for the discrete market: NV has a 10% lead over AMD.
nVidia is the #1 seller of discrete desktop cards in the world.

techreport.com/discussions.x/22543
It does not help one bit in market terms that AMD has the first top-dollar card out.
Now, if they had launched their 7870 priced at 250 USD and as fast as a 6970, they would have grabbed market share... not with 470+ USD cards they won't.
470 USD cards are less than 3% of total sales.
It's not the first blow but the last blow that decides the market winners.
If you look at the whole picture, IGPs and CPU/GPU combos included, then you would be right :)
#58
cadaveca
My name is Dave
EarthDog: Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225 W and one using 300 W over the course of a year? I would bet my paycheck it would barely take your family to McDonald's* with typical GPU usage.

*Unless you participate in a distributed platform using the GPU. ;)
I pay $0.15/kWh. That means it costs me near $36/month for a 300 W GPU if it runs 24/7. My power bill for January was $437.30. You bet performance per watt matters, because over 12 months that's $432 to buy McDonald's with. I'll take that cheque, please, as it's easy enough for me personally to make it worthwhile. For the average user, it still might buy that McDonald's. I can't be bothered to guess how long a GPU sits at full load on average... it depends on the app and such, but it would be interesting to get a real number.
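For anyone who wants to sanity-check those numbers, here's a minimal Python sketch of the kWh arithmetic (illustrative only; it assumes the flat $0.15/kWh rate quoted above, a 30-day month, and a card pinned at full load, and the variable names are my own):

```python
# Back-of-the-envelope electricity cost for a GPU at constant full load.
# Assumptions: flat $0.15/kWh tariff, 30-day month, card pinned at 300 W.
RATE_USD_PER_KWH = 0.15
GPU_WATTS = 300

hours_per_month = 24 * 30                              # 720 h
kwh_per_month = GPU_WATTS / 1000 * hours_per_month     # 216 kWh
cost_per_month = kwh_per_month * RATE_USD_PER_KWH      # $32.40

print(f"{GPU_WATTS} W, 24/7: ${cost_per_month:.2f}/month, "
      f"${cost_per_month * 12:.2f}/year")
# -> 300 W, 24/7: $32.40/month, $388.80/year
```

It comes out a few dollars under the ~$36/month quoted, so that figure looks rounded up, but the ballpark holds.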
#59
BlackOmega
1c3d0g: Don't compare that POS Faildozer to Kepler. If Kepler is even half of what it's claimed to be, it'll be a good improvement over Fermi. Most probably it'll slaughter AMD once again, but I guess that's just too sensitive for the red fanboys, so they won't accept it.
Why not? Remember when Fermi came out? All the nVidia fanboys were like "wait for Fermi, wait for Fermi," so a lot of people did. Then, when it tanked, arguably as hard as Bulldozer, ALL of the ATi video cards sold out overnight, literally. I watched the prices for the AMD cards rise $50+, also overnight.

So I'm willing to bet that nVidia saw how well the 79xx cards perform and realized they still have work to do.
NC37: Wow, AMD graphics fanboys out in force today.

Heck, I switch sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the mid-range and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.

NV makes a lot of bonehead moves, which is why they lost all the contracts for the next consoles. But I can't fault them on building good GPUs. With Kepler moving them away from the monolithic monster GPU design, I can't wait to see it.
More like hot air. Ever see the YouTube video where a guy cooked an egg on his 480?
cadaveca: I pay $0.15/kWh. That means it costs me near $36/month for a 300 W GPU if it runs 24/7. My power bill for January was $437.30. You bet performance per watt matters, because over 12 months that's $432 to buy McDonald's with. I'll take that cheque, please, as it's easy enough for me personally to make it worthwhile. For the average user, it still might buy that McDonald's. I can't be bothered to guess how long a GPU sits at full load on average... it depends on the app and such, but it would be interesting to get a real number.
Bah, you're better off spending that money on power than on McDonald's anyway. ;)
#60
HD64G
pioneer: Like the HD 6970? We saw how the HD 6900 series beat Fermi :lol:

This time Kepler has an extremely powerful SM architecture and many more CUDA cores... so we saw the GK107, at 75 W, beating the HD 7770 and being much more powerful than the HD 6850.

Kepler is the clear winner.
You simply forget that the 6990 did beat the 590. So Fermi was beaten in the end. Somehow, I suspect the same thing is going to happen again this time...
#61
EarthDog
cadaveca: I pay $0.15/kWh. That means it costs me near $36/month for a 300 W GPU if it runs 24/7. My power bill for January was $437.30. You bet performance per watt matters, because over 12 months that's $432 to buy McDonald's with. I'll take that cheque, please, as it's easy enough for me personally to make it worthwhile. For the average user, it still might buy that McDonald's. I can't be bothered to guess how long a GPU sits at full load on average... it depends on the app and such, but it would be interesting to get a real number.
You are absolutely right. However, most don't run distributed platforms or their GPU 24/7/365. You made an example out of the worst-case scenario, which I specifically mentioned was an exception. Good job! :p

So now, do the math and help this guy out... a 75 W difference (225 W vs 300 W); let's just call it 100 W to make it easy on me (college is over, and so is math). Take a third of your 300 W numbers: that's the difference between a 225 W card and a 300 W card running 24/7/365 (~$142/year, or ~$12/month, at your rate, assuming my math is correct).

Now, if someone plays games 2 hours/day for 30 days (so 60 hours versus 720 hours a month), you can see the McDonald's analogy coming CLEARLY into focus, I would imagine... which is why I put the "*" disclaimer there in the first place, to prevent replies like yours!
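To put that comparison in numbers, here's the same arithmetic on the actual 75 W gap rather than the rounded 100 W (a sketch under the same flat $0.15/kWh assumption as the earlier snippet, not anyone's real bill; the helper function is hypothetical):

```python
# Yearly cost of the *difference* between a 225 W and a 300 W card,
# under two usage patterns. Assumes a flat $0.15/kWh tariff.
RATE_USD_PER_KWH = 0.15
DELTA_WATTS = 300 - 225  # the 75 W gap between the two cards

def yearly_cost(watts: float, hours_per_day: float) -> float:
    """Cost per year of drawing `watts` for `hours_per_day`, every day."""
    return watts / 1000 * hours_per_day * 365 * RATE_USD_PER_KWH

print(f"24/7 (folding/BOINC): ${yearly_cost(DELTA_WATTS, 24):.2f}/year")
print(f"2 h/day of gaming:    ${yearly_cost(DELTA_WATTS, 2):.2f}/year")
# -> 24/7 (folding/BOINC): $98.55/year
# -> 2 h/day of gaming:    $8.21/year
```

So the 24/7 distributed-computing case really does add up to three figures a year, while the two-hours-a-day gamer is looking at well under a dollar a month, which is the whole McDonald's point.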
#62
NdMk2o1o
If some rumours and slides are to be believed, Kepler will have a 100% performance increase over the 5** series for the respective replacements. I seriously hope this is true, as I would buy double the performance of a 570 for the same price. Never mind TWIMTBP; TWTUTBM = the way they used to be made :P
#63
jpierce55
EarthDog: Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225 W and one using 300 W over the course of a year? I would bet my paycheck it would barely take your family to McDonald's* with typical GPU usage.

*Unless you participate in a distributed platform using the GPU. ;)
Why would you not??? Keep in mind I said slight hit for GOOD power savings.
#64
EarthDog
I'm not sure you understand how negligible the differences are for 'average' users. If one wants to save a few dollars a year on the electric bill, why the hell is one buying $500 GPUs in the first place?

Do the math to see what the differences actually are (it's about $1/month if used ~2 hours/day at $0.15/kWh... again, assuming my math from above is correct). :)

If you are trying to save $12 a year, I would say don't buy a $500 GPU in the first place. :p
#65
the54thvoid
Super Intoxicated Moderator
HD64G: You simply forget that the 6990 did beat the 590. So Fermi was beaten in the end. Somehow, I suspect the same thing is going to happen again this time...
Really?

In the summary page of the 7970 CrossFire review, the 590 beats the 6990 at every resolution. Here's the 2560 res summary.



I'm only putting this in to stop blatant mistruths. Lots of people give the 590 a hard time, but it runs cooler and quieter by most accounts, and TPU's own round-up linked above puts the 590 ahead at every resolution. But as always, it's really game-dependent.

I'm pissed NV is holding back info on Kepler, as I'm looking to upgrade, but it's so close that I need to wait and see how Kepler performs, since I'm keen to see a 7970 price drop. Unless Kepler is way better (I doubt it).

If Kepler bombs, I'm buying two 7970s just as a capitalist reaction!!
#66
TheoneandonlyMrK
NC37: Wow, AMD graphics fanboys out in force today.

Heck, I switch sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the mid-range and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.
:) Exactly what I've been thinking, and no need for bickering: AMD is winning now for a bit, then it's nVidia's turn for a bit, then lo and behold it's AMD's turn, ad nauseam :D

I just hope nVidia don't fully drop the ball, as AMD did with its hype management when releasing Bulldozer; the fallout of such a thing could prove expensive to us enthusiasts.

I.e., Kepler needs to be good and worthy of such hype. Either way, though, I will definitely be buying a low-to-mid Kepler card for some folding/hybrid PhysX action (the gits will obviously make this bit unnecessarily hard) :)
EarthDog: I'm not sure you understand how negligible the differences are for 'average' users.
To be fair, dude, you are on TPU, and this is a place not often visited by the average user. Personally I want it all: max performance and minimal power draw. Power draw can matter a lot to some, as it does to me; if you fold and can run two cards 24/7 with a smaller PSU and a cheaper case, doors can open (well, not doors, more folding opportunities :))
#67
Wrigleyvillain
PTFO or GTFO
NC37: The 6000 series was a bunch of rebadges in the mid-range and disappointments in other segments.
Not completely. The CrossFire scaling was much improved and overall great, which is what led me to go through the hassle and the financial hit of selling my 5850 to buy two 6850s instead of just getting another of the former. I was very happy with that $300 purchase for the power over the last year and would likely still be using them for a while, but I wanted to try the NV drivers again for a change and also want more VRAM than 1 GB. So I just found a cheap 480 for now (there are lots out there too, and surely even more to come in April...).
#68
magibeg
Reading the original post here, I can't help but feel like we gained no new information. This is just nVidia saying: "Hey guys! The stuff we make will be awesome!"
#69
TheoneandonlyMrK
magibeg: Reading the original post here, I can't help but feel like we gained no new information. This is just nVidia saying: "Hey guys! The stuff we make will be awesome!"
Bang on.

70 posts later, the debate rages on :D
#70
xenocide
I don't think anyone can really refute that Nvidia will offer more powerful cards; they pretty reliably have. But the real question is both cost and relative performance. If you had to pay $1000 for a 680 with 30-40% higher performance than an HD 7970, you wouldn't exactly be compelled to buy it. If Nvidia can match AMD at each price point and offer much higher performance, they will crush the competition.

I have high hopes for Kepler and plan on getting a 6xx-series GPU to give Nvidia a shot, since the last time I had an Nvidia card was a 7900 GS a few years back. Nvidia is just trying to keep people anticipating their new line, but given the problems with the 7xxx series and drivers I've seen around, I don't think they really have much to worry about.
#71
sclera
Reasons to care about Kepler:
  • AMD price drops
#72
Crap Daddy
What you will see in April, or maybe even sooner, won't destroy the 7970. Instead it will bring something NV was behind on: perf/watt, and maybe also better perf/dollar. Don't expect a "performance" chip to beat AMD's top dog; that will come later with the GK110. I think the GK104 will sit (at stock clocks) between the 7950 and the 7970.
#73
jpierce55
EarthDog: I'm not sure you understand how negligible the differences are for 'average' users. If one wants to save a few dollars a year on the electric bill, why the hell is one buying $500 GPUs in the first place?

Do the math to see what the differences actually are (it's about $1/month if used ~2 hours/day at $0.15/kWh... again, assuming my math from above is correct). :)

If you are trying to save $12 a year, I would say don't buy a $500 GPU in the first place. :p
Higher power consumption is about more than the energy cost per month; I don't even look at it that way. I think about a smaller power supply, a power supply lasting longer, and a quieter system due to less heat output.

This argument is pointless, though; for all we truly know, Kepler could use very little energy.
#74
Aquinus
Resident Wat-man
I like how everyone is ranting about a chip for which there is almost no factual information available yet. How many times do I have to say that we're still waiting on Kepler, and that it does no one any good until it is released? The 7970 is here and it is doing great; that is more than I can say for Kepler at the moment...
#75
EarthDog
theoneandonlymrk: To be fair, dude, you are on TPU, and this is a place not often visited by the average user. Personally I want it all: max performance and minimal power draw. Power draw can matter a lot to some, as it does to me; if you fold and can run two cards 24/7 with a smaller PSU and a cheaper case, doors can open (well, not doors, more folding opportunities :))
:roll: This place is littered with average users. But the point you may have missed is how I defined an average user and the exception(s). Take a minute and reread my posts.

As for the rest of your post, sorry, I'm tired as hell, but that makes no sense to me... :confused:
jpierce55: Higher power consumption is about more than the energy cost per month; I don't even look at it that way. I think about a smaller power supply, a power supply lasting longer, and a quieter system due to less heat output.

This argument is pointless, though; for all we truly know, Kepler could use very little energy.
You looked at it that way up top... now that the numbers are out, more reasons come out? Timely... :D

A PSU won't last longer under a slightly lighter load, my god, man... pass the dutchie this way.

So you are going to go out and buy a smaller PSU? Does that make sense...?

The discussion (not an argument) is relevant... or it was when you brought up those points... and now it's not? :wtf: