Thursday, March 1st 2012
Kepler Unbeatable: NVIDIA
The tiresome wait for NVIDIA's next-generation GPU is drawing to a close. Or so suggests a Facebook wall post by NVIDIA Italy, which reads (in Italian, of course):
Aspettando Kepler... pazienza, pazienza, pazienza che il momento giusto arriverà, e allora... non ce ne sarà più per nessuno! :-)

That can be translated as "Waiting for Kepler... patience, patience, patience, the right time will come, and then... it will be unbeatable." From various sources we're hearing that there will be hectic activity surrounding the launch of NVIDIA's next-gen GPU in the weeks to come.

Source: XtremeSystems Forums
165 Comments on Kepler Unbeatable: NVIDIA
And I don't know why there was an announcement; the Xbox 720 and the new PlayStation will use AMD graphics, while the Wii 2 will also use AMD graphics. This is bad news for nVidia :(
Anyway, AMD was good at making CPUs; they just sat on their ass a few years ago, and now you see what that does.
But for now, the HD7970 is here, way faster than anything nVidia has, and that's it. It has powerful overclocking capabilities, which leaves room for newer cards or super-overclocked editions. While nVidia doesn't show their Kepler, AMD might already be preparing a new revision of the Tahiti GPU for the next gen. The HD8000 will not be a new architecture, I'm sure. So it's just speculation about Kepler for now; they talk but show nothing.
I'm still waiting, as I want a cheaper card that will perform fine :) (Or getting a second HD6950 at a low price.)
edit: for power consumption, AMD controls their power consumption way better than nVidia. If you run your rig 24/7, this could make a difference at the end of the year. Anyway, mine runs smooth: idle, but with the CPU at 100% for BOINC :)
Heck, I switch between sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the midrange and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.
NV makes a lot of bonehead moves, which is why they lost all the contracts for the next consoles. But I can't fault them on building good GPUs. With Kepler moving them away from the monolithic monster GPU design, I can't wait to see it.
I would also take one single more powerful card over SLI/Crossfire any day with a single monitor.
Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225W and one using 300W over the course of a year? I would bet my paycheck it would barely take your family to McDonald's* with typical GPU usage.
*Unless you participate in a distributed platform using the GPU. ;)
nVidia is the #1 seller of discrete desktop cards in the world.
techreport.com/discussions.x/22543
It does not help one bit in market terms that AMD has the first top-dollar card out.
Now, if they had launched their 7870 priced at 250 USD and as fast as a 6970, they would have grabbed market share... not with 470+ USD cards they won't.
470 USD cards are less than 3% of total sales.
It's not the first blow but the last blow that will call the market winners.
If you look at the whole picture, IGPs and CPU/GPU combos, then you would be right :)
So I'm willing to bet that nVidia saw how well the 79xx cards perform and realized they have work to do yet. More like hot air. Ever see the YouTube video where a guy cooked an egg on his 480? Bah, you're better off spending that money on power than McDonald's anyway. ;)
So now, do the math and help this guy out... a 75W difference (225W vs 300W). Let's just say 100W to make it easy on me (college is over, so is math). So divide your numbers by 66%. That's the difference if you run 24/7/365 between a 225W card and a 300W card ($142.xx/year, or ~$12/month at your rate, assuming my math is correct).
Now, if someone plays games 2 hours a day for 30 days (so 60 hours vs. 720 hours per month), you can see the McDonald's analogy coming CLEARLY into focus, I would imagine... which is why I put the "*" disclaimer there in the first place, to prevent replies like yours!
Do the math to see what the differences actually are (it's about $1/month if used ~2 hours a day @ $0.15/kWh... again, assuming my math is correct from above). :)
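For anyone who wants to double-check it, here's a rough Python sketch of that arithmetic. The 100W delta and the $0.15/kWh rate are just the example numbers quoted in this thread, not measured figures:

# Extra electricity cost of a card drawing delta_watts more than another.
# Wattage delta and rate are the thread's example numbers, not measurements.
def monthly_cost_usd(delta_watts, hours_per_day, rate_per_kwh=0.15, days=30):
    kwh = delta_watts / 1000.0 * hours_per_day * days
    return kwh * rate_per_kwh

print(monthly_cost_usd(100, 24))  # 24/7 usage: ~$10.80/month, ~$130/year
print(monthly_cost_usd(100, 2))   # 2 hours/day: ~$0.90/month, i.e. about $1

So at 24/7 the gap is real money over a year, but at typical gaming hours it really is McDonald's territory.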
If you are trying to save $12 a year, I would say don't buy a $500 GPU in the first place. :p
In the summary page (7970 CrossFire review), the 590 beats the 6990 at every resolution. Here's the 2560 res summary.
I'm only putting this in to stop blatant mistruths. Lots of people give the 590 a hard time, but it runs cooler and quieter by most accounts, and TPU's very own roundup linked above puts the 590 ahead at every resolution. But as always, it's really game-dependent.
I'm pissed NV is holding back info on Kepler, as I'm looking to upgrade, but it's so close that I need to wait and see how Kepler performs, since I'm keen to see a 7970 price drop. Unless Kepler is way better (doubt it).
If Kepler bombs, I'm buying two 7970s, just as a capitalist reaction!!
I just hope nVidia doesn't fully drop the ball like AMD did with its hype management when releasing BD; the fallout of such a thing could prove expensive to us enthusiasts.
i.e., Kepler needs to be good and worthy of such hype. Either way, though, I will definitely be buying a low- to mid-range Kepler card for some folding/hybrid PhysX action (gits will obviously make this bit unnecessarily hard) :) To be fair, dude, you are on TPU, and this is a place not often visited by the average user. Personally, I want it all: max performance and minimal power draw. Power draw can be highly regarded by some, as it is by me; if you fold and can run two cards 24/7 with a smaller PSU and a cheaper case, doors can open (well, not doors, more folding opportunities :))
70 posts later, the debate rages on :D
I have high hopes for Kepler and plan on getting a 6xx series GPU to give Nvidia a shot since the last time I had an Nvidia card was a 7900GS a few years back. Nvidia is just trying to keep people in anticipation of their new line, but given the problems with the 7xxx series and drivers I've seen around, I don't think they really have much to worry about.
This argument is pointless though, for all we truly know Kepler could use very little energy.
And for the rest of your post, sorry, I'm tired as hell, but that makes no sense to me... :confused: You looked at it that way up top... now the numbers come out, so more reasons come out? Timely... :D
A PSU won't last longer with a slightly lighter load, my god, man... pass the dutchie this way.
So you are going to go out and buy a lesser PSU? Does that make sense...?
The discussion (not an argument) is relevant... or it was when you brought up those points... now... it's not? :wtf: