# No New GPUs from AMD for the Bulk of 2013



## btarunr (Feb 10, 2013)

AMD's product manager for desktop graphics, Devon Nekechuk, revealed in an interview with Japanese publication 4Gamer.net that his firm won't be launching any new Radeon GPUs in 2013, and that the company will instead play out the year on its current Radeon HD 7000 series, with price adjustments and possible performance gains through driver updates. In a slide released to 4Gamer.net, AMD pointed out that its Radeon HD 7900 series (high-end), HD 7800 series (performance), and HD 7700 series (mainstream) will carry the company's mantle "throughout 2013."

This announcement is an indication that GPU makers have decided to slow down from the streak of rapid new GPU launches that ran from roughly 2007 to 2012, a pace that can be heavily taxing in R&D costs for both companies. We know for sure that NVIDIA is clearing its backlog of consumer GPU development by releasing the GeForce GTX "Titan" graphics card in a couple of weeks' time, and we know from older reports that NVIDIA could launch a "refreshed" GeForce lineup that largely retains the current Kepler silicon while topping it up with subtle changes (clock speeds, software features that don't involve redesigning the silicon, etc.). AMD coming out in the open with this announcement, however, could change everything: NVIDIA now has the opportunity to save a few coins by sticking to its current lineup (plus the upcoming GTX "Titan") and responding to competition from AMD with price adjustments and timely driver optimizations of its own.

*View at TechPowerUp Main Site*


----------



## drdeathx (Feb 10, 2013)

Is AMD cash strapped? LOL


----------



## progste (Feb 10, 2013)

i'm okay with this if the performance leap between each series is higher


----------



## Nordic (Feb 10, 2013)

I too am OK with this, for the reasons people are stating above and below me. I might actually consider upgrading every generation if releases came every two years.


----------



## progste (Feb 10, 2013)

Phusius said:


> I am ok with this, GPU's were coming out way to fast.  Games can't even keep up.  My 7950 at 1200 core destroys everything at 1080p... so yeah, I could care less.
> 
> inb4that1response "omg u must not have tried witcher 2 with ubersampling turned on"
> 
> go shove a pole up your ass u retard. lulz.



also i did try the witcher 2 at ultra with ubersampling at 1080p on my 7970 stock clocks and it runs perfectly fine


----------



## Kaynar (Feb 10, 2013)

It's clear that AMD wants to focus on the launch of the new PlayStation and Xbox; they will need a lot of fab capacity for those...

Also, what's the point of releasing a new series where the top GPU does 20-30% better than the previous gen for double the price? I am glad that they realised their current cards have a lot of potential, and I hope we will see a lot of driver improvements.

EDIT: regarding what was said above about The Witcher 2, I played it with max graphics and ubersampling at 1900x1200 with a single overclocked 7970 and it was perfectly fine.


----------



## Phusius (Feb 10, 2013)

well you get my point, lol.


----------



## Kaynar (Feb 10, 2013)

Phusius said:


> well you get my point, lol.



Yeah, we do.

Actually, I wanted to add that it is lame that they advertise their cards' performance based on a single benchmark....


----------



## DarkOCean (Feb 10, 2013)

Phusius said:


> I am ok with this, GPU's were coming out way to fast.  Games can't even keep up.  My 7950 at 1200 core destroys everything at 1080p... so yeah, I could care less.
> 
> inb4that1response "omg u must not have tried witcher 2 with ubersampling turned on"
> 
> go shove a pole up your ass u retard. lulz.


It's not about that, it's about progress: faster and cheaper cards for us poor folks.



Kaynar said:


> Also whats the point of releasing new series where top GPU does 20-30% better than previous gen for double the price?


Double the price - where did you get that?
If newer cards come out, the current ones always decrease in price; at best the new ones will come in at the old ones' launch prices, so nowhere near double. Almost no one would buy them if they just doubled the $$$.



Kaynar said:


> EDIT: for what is said above for Witcher 2, I played it with max graphics and ubersampling on 1900x1200 with a single overclocked 7970 and is was perfectly fine.


It had better max out that DX9 game, for a $500 graphics card.


----------



## newtekie1 (Feb 10, 2013)

So AMD, the ones that started the accelerated generation jumps to make their weak products look better are now deciding not to do that?  Yeah, I'll believe it when I see it, but I think they'll release HD8000 sooner than we think.

And seriously they put the HD7970 GHz edition ahead of the GTX680, way to mislead by using just one benchmark...


----------



## mastrdrver (Feb 10, 2013)

Dave Baumann comment



> The slide doesn't go to less granularity than "product series"; that leaves a lot of wiggle room. Besides, new products have been launched recently and the slides indicate that there will be more in the near future.


----------



## Filiprino (Feb 10, 2013)

I prefer it this way. If this means that they'll launch a much bigger HD 8000 series, perfect. And if it translates to better driver support, including Linux, good.


----------



## leopr (Feb 10, 2013)

newtekie1 said:


> So AMD, the ones that started the accelerated generation jumps to make their weak products look better are now deciding not to do that?  Yeah, I'll believe it when I see it, but I think they'll release HD8000 sooner than we think.
> 
> And seriously they put the HD7970 GHz edition ahead of the GTX680, way to mislead by using just one benchmark...



The 7970 GHZ outperforms the 680 in most of the current games not only benchmarks.


----------



## Cortex (Feb 10, 2013)

AMD fails again. Could 2014 AMD Radeon "HD 8970" Fail Edition compete with nVidia's next high end chip (~30 percent faster than Titan). I think not.


----------



## newtekie1 (Feb 10, 2013)

leopr said:


> The 7970 GHZ outperforms the 680 in most of the current games not only benchmarks.



No it doesn't; overall they are equal:
http://tpucdn.com/reviews/ASUS/ARES_II/images/perfrel.gif

And if you prefer a few modern games to an overall picture of performance:

Assassin's Creed 3 GTX680 Outperforms HD7970-GHz:
http://tpucdn.com/reviews/ASUS/ARES_II/images/ac3_1920_1200.gif

Battlefield 3 HD7970-GHz outperforms GTX680:
http://tpucdn.com/reviews/ASUS/ARES_II/images/bf3_1920_1200.gif

Borderlands 2 GTX680 outperforms HD7970-GHz:
http://tpucdn.com/reviews/ASUS/ARES_II/images/borderlands2_1920_1200.gif

Far Cry 3 HD7970-GHz outperforms GTX680:
http://tpucdn.com/reviews/ASUS/ARES_II/images/farcry3_1920_1200.gif

They trade blows in modern games, that is why I said *OVERALL* they are equal.


----------



## Phusius (Feb 10, 2013)

DarkOCean said:


> Its not about that its about progress, faster and cheaper cards for us the poor folks.



You can't afford $269.99 with free shipping, a $20 rebate, and 3 free games, making the card effectively less than $200? That's how much I paid for my Sapphire 7950 in a Newegg sale...


----------



## leopr (Feb 10, 2013)

newtekie1 said:


> No it doesn't:
> http://tpucdn.com/reviews/ASUS/ARES_II/images/perfrel.gif



Relative performance table? Go pick up some game benchmarks and come back; you have plenty in that review.

Don't worry, I will help you:



Spoiler: *(game benchmark charts from the review)*
And I'm pretty sure the 7970 GHz results are based on the 12.11 BETA drivers, which are a couple of months old...


----------



## newtekie1 (Feb 10, 2013)

leopr said:


> Relative performance table ? Go pick up some game benchmarks and come back, you have plenty on that review.
> 
> Don't worry, i will help you:
> 
> ...



Yes, "relative" means overall performance; they are synonymous. Notice how I said overall in my original statement. Also notice how in the game benchmarks the cards trade blows; again, *OVERALL* performance is equal.

Also, no, it wasn't with the 12.11 BETA; the AMD cards used 13.1 and the nVidia cards used 310.70, which is actually a slight disadvantage to nVidia, since 313.96 brought some significant gains in some of the games used in the benchmark suite.


----------



## natr0n (Feb 10, 2013)

This is good news, as I just bought my card, so it won't be obsolete for a long while.

Also, with the driver rewrites coming things are looking up.


----------



## Nordic (Feb 10, 2013)

4% difference in the 7970's favor sounds pretty equal to me.


Spoiler: *(benchmark chart)*


----------



## Zubasa (Feb 10, 2013)

newtekie1 said:


> And seriously they put the HD7970 GHz edition ahead of the GTX680, way to mislead by using just one benchmark...


Well, this kind of marketing stuff is BS 99% of the time, as usual.
No surprise.


----------



## Hayder_Master (Feb 10, 2013)

Lies. They just want to see what NVIDIA is going to release. Every time, AMD (ATI) releases a series and NVIDIA responds with a bit more performance; now I think things are going to reverse.


----------



## newtekie1 (Feb 10, 2013)

james888 said:


> 4% difference in the 7970's favor sounds pretty equal to me.
> 
> 
> Spoiler
> ...



It is certainly a lot more equal than the 15% difference AMD is claiming (and basing their strategy on?).



Zubasa said:


> Well, these kind of marketing stuff are BS 99% of the time as usual.
> No surprise.



Oh of course, nVidia does the same thing with their graphs that start at 100% to make the 2-5% improvements seem like huge jumps.
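The truncated-axis trick is easy to quantify. A quick sketch (the 103-vs-100 numbers are hypothetical, purely for illustration):

```python
# How a truncated y-axis exaggerates a small lead:
# a 3% edge (103 vs 100) drawn on an axis starting at 100
# looks enormous compared to an axis starting at 0.

def bar_height_fraction(value: float, axis_min: float, axis_max: float) -> float:
    """Fraction of the plot height a bar occupies on a given axis."""
    return (value - axis_min) / (axis_max - axis_min)

# Honest axis (0 to 105): the two bars are nearly the same height.
honest_a = bar_height_fraction(103, 0, 105)    # ~0.98
honest_b = bar_height_fraction(100, 0, 105)    # ~0.95

# Truncated axis (100 to 105): the same 3% lead fills 60% vs 0%.
cut_a = bar_height_fraction(103, 100, 105)     # 0.60
cut_b = bar_height_fraction(100, 100, 105)     # 0.00

print(f"honest axis:    {honest_a:.0%} vs {honest_b:.0%}")
print(f"truncated axis: {cut_a:.0%} vs {cut_b:.0%}")
```

On the honest axis the bars differ by about three percentage points of plot height; on the truncated one, the same gap becomes the entire bar.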


----------



## DarkOCean (Feb 10, 2013)

Phusius said:


> You can't afford $269.99 with free shipping, a $20 rebate, and 3 free games, making the card effectively less than $200? That's how much I paid for my Sapphire 7950 in a Newegg sale...



Obviously I was not lucky enough to be born in a more civilized country like you (in the godforsaken place where I live, 7950s are at least $500-$600, and the ones that actually come with a game cost even more); otherwise I wouldn't be bitching about it.




newtekie1 said:


> It is certainly a lot more equal than the 15% difference AMD is claiming(and basing their strategy on?).
> Oh of course, nVidia does the same thing with their graphs that start at 100% to make the 2-5% improvements seem like huge jumps.


Even this shows them as equal.


----------



## BigMack70 (Feb 10, 2013)

Cortex said:


> AMD fails again. Could 2014 AMD Radeon "HD 8970" Fail Edition compete with nVidia's next high end chip (~30 percent faster than Titan). I think not.



I think it's a little ridiculous to pronounce the downfall of AMD's graphics cards over this.

For one thing, a lot of the signs had already been pointing to there being no new series of cards from either AMD or Nvidia this year, at least till the Christmas timeframe. It's likely that the GTX Titan will be the only new card before Christmas unless something unexpected happens.

For another, we don't know what Nvidia is going to do with this Titan card. We don't know if it's just a standalone release or if it will get the GTX 780 label, and we definitely don't know that their next chip will beat it or when it will be released.

Not only that, but a $900 single GPU card is more or less not going to change the current GPU world at all. It doesn't compete with any single GPU card because it's so expensive, and it doesn't look set to really do much to dethrone the current dual-GPU setups from AMD or Nvidia that are already here (690/7990 and equivalent dual-card configs).

AMD and Nvidia have been very even this round, which is a great thing for consumers. It looks like they're going to remain more or less even throughout the year (though AMD fans may get their feelings hurt a bit that Nvidia will have the fastest single GPU). Hopefully it will continue next year with new releases from both sides.


----------



## GSquadron (Feb 10, 2013)

This is just great!
Old days are coming back 
Games just cannot handle this much power, even AAA games.

As for the GeForce Titan, I think it is just Nvidia's counterpart to the 7970 GHz Edition.


----------



## xkm1948 (Feb 10, 2013)

After they lost so many engineers, I am not surprised.

The only people AMD needs to get rid of is its useless management team.


----------



## de.das.dude (Feb 10, 2013)

Smart move IMO. Instead of wasting money on developing newer tech to compete each year, they are spending it on a longer-term project. Hopefully this will pay off and yield good results.

They are also saving money by not moving to newer fabrication processes.

The only risk is: will their popularity be tarnished when Nvidia releases new GPUs this year?
I know there are a lot of people who will say "look at AMD, they are crappy, so they didn't release new GPUs this year, Nvidia FTW," bla bla.

I guess certain risks have to be taken.


----------



## cadaveca (Feb 10, 2013)

BigMack70 said:


> It's likely that the GTX Titan will be the only new card before Christmas unless something unexpected happens.



Since the Tesla cards have been out for some time, I wouldn't exactly call this Titan a new card. It's a new SKU, but the card already exists, the GPU already exists, and systems built with it are available right now.

Just not in Geforce clothing.



IF anyone is shocked that Titan is coming, they've not been paying much attention. The Tesla K20X has 2688 shaders and a 384-bit memory bus. The GTX680 has 1536 shaders and 256-bit; the GTX690 has 3072 shaders and 256-bit.

I find it hard to grasp how fewer shaders than the GTX690 is going to be so much faster than the GTX690, as rumoured. The $899 pricing, now, that makes a lot of sense based on shader count.


With that said, I see no reason for AMD to launch anything any time soon. I suspect they may ALSO have a larger GPU they can launch on 28nm if needed, but building cards for that price point is not where the real money is.

I hope to see the HD 7950 drop to $229, the 7970 to $279, and the 7970 GHz to $329. IF this happens, AMD will be quite well off, and it would match what they are doing with Piledriver right now.

I don't expect real new GPUs until just before, or just after new consoles launch.


----------



## KainXS (Feb 10, 2013)

No, AMD has no card to counter the Titan, but here's the thing: they might not need to counter it directly, power for power, like they did with the 7970. With games being optimized for the AMD-only consoles, a side effect is that games will simply be optimized more for AMD GPUs than NVIDIA's, I believe, and NVIDIA is preparing for this with larger GPUs.

If I were a developer having to develop for PS4, 720, Wii U and PC, and I could optimize all 4 platforms on AMD's GPU platform since they are similar, I'm going to optimize all 4 for AMD and not NVIDIA, and that's just how it goes.

I hope AMD can pull some resources away from their GPU development and put some towards their CPUs.


----------



## Zen_ (Feb 10, 2013)

This isn't really a surprise when the most popular PC games by far can now be run just fine on newer integrated graphics, or on discrete cards that are 4+ years old.


----------



## the54thvoid (Feb 10, 2013)

So many sad empty people fighting over which graphics card is best.

The 7970 basic version is slower than the GTX680, but the clocks are so differentiated it hurts. Nvidia (in the past few years) has always released gfx cards clocked slower than ATI's, yet due to their monolithic die sizes and transistor counts, they tended to perform quite a bit better.

This time AMD released the 7970 with clocks that were far too conservative. Nvidia (more than likely) made great use of their usual late-to-the-table position by tweaking the clocks skyward. Why else are the cards voltage locked? Adaptive V-sync and the massive clocks are more than likely reaching their boards' power limits.

AMD reacted with their GHz Edition cards (lame, quite frankly). Now the 7970, clocked at the boost speeds of the 680, does tend to win on most occasions. But on Nvidia's side is a 'generally' better driver team. If AMD piled their resources into driver development and created better game-development strategies (like Nvidia does), you'd probably see the 7970 cards humping all over the 680.
The 7970 is a much more powerful card than the 680, but it is far less refined. For a change, Nvidia have the sleek purring kitten and AMD has the brute growler. A card is only as good as its hardware AND software, and AMD have the hardware battle won for now, but not the software one.

However, if the GTX Titan is not a myth then it will more than likely piss over the 7970.  But, given the rumours it will only be for show.  It will be like the Ares 2 7970 card.  

Also, to add to what many sensible people have said, there is no need for new gfx cards this year.  The 680 and the 7970 are excellent cards.  Driver tweaking will keep them well up to speed with all the new games.

It is a sign of confidence to not rush out a new product - not weakness.  And the Titan is simply a technology argument from Nvidia (probably a very bloody good one too).

Both AMD and Nvidia have done enough with graphics recently.  They need to sit back and work on other areas now, software, mobile, coding assistance.  I'd much rather keep a card that can work on new games because they're coded better than have to buy new cards because of substandard coding.

I find that playing BF3 at 2560x1440 with ALL FX maxed out at 80-100+ fps is more than enough on two 7970s. Why the hell do I want more powerful cards?

Though I might _downgrade_ to Titan......


----------



## lemonadesoda (Feb 10, 2013)

I'd really like to see some new launches in 2013. If not faster, at least more power efficient. There is always room for improvement there. I'm on a previous generation GPU. I'm not too enthusiastic to upgrade for the sake of 10-30% GPU horsepower.  But if that was combined with an additional 10-30% improved power efficiency, then I would have reason to upgrade.


----------



## Recus (Feb 10, 2013)

Cortex said:


> AMD fails again. Could 2014 AMD Radeon "HD 8970" Fail Edition compete with nVidia's next high end chip (~30 percent faster than Titan). I think not.



Yeah, Project Win (Dead) is rolling. 



de.das.dude said:


> Smart move IMO. Instead of wasting money on developing newer tech to compete each year, they are spending it on a longer-term project. Hopefully this will pay off and yield good results.
> 
> They are also saving money by not moving to newer fabrication processes.



Meanwhile, in the real world: http://fudzilla.net/home/item/30412-amd-spent-up-to-$4-million-on-far-cry-3-bundle


----------



## BigMack70 (Feb 10, 2013)

cadaveca said:


> IF anyone is shocked that Titan is coming, they've not been paying much attention.



GK110 was the card I wanted early last year when it was supposed to be the GTX 680, but then Nvidia royally screwed the pooch on that one and I have no interest in a $900 version of that chip unless it manages to at least match my 7970 CF performance.


----------



## Crap Daddy (Feb 10, 2013)

AMD has to sell what they have now, and I guess they're having a pretty hard time, since they've cut prices on their cards countless times and are now on their second games-bundle offer. On the other hand, the market doesn't exactly need a flood of new cards. If people have a PC in the house, then it's good enough; they'll buy a mobile piece of hardware next. Even Microsoft is trying desperately to move away from the desktop, so what can we expect?


----------



## Phusius (Feb 10, 2013)

BigMack70 said:


> GK110 was the card I wanted early last year when it was supposed to be the GTX 680, but then Nvidia royally screwed the pooch on that one and I have no interest in a $900 version of that chip unless it manages to at least match my 7970 CF performance.



If it matches 7970 CF performance I will be impressed and may grab one, but I highly doubt it comes close to that, especially since 7970 drivers are really hitting their climax in the coming months.


----------



## BigMack70 (Feb 10, 2013)

Phusius said:


> If it matches 7970 CF performance I will be impressed and may grab one, but I highly doubt it comes close to that, especially since 7970 drivers are really hitting their climax in the coming months.



I agree, but if by chance it can give me the same performance I'm already getting, with a single GPU, I'll be selling my 7970s and getting one.


----------



## jihadjoe (Feb 10, 2013)

cadaveca said:


> I find it hard to grasp how less shaders than GTX690 is going to be so much faster than GTX690, as rumoured. The $899 pricing, now, that makes a lot of sense, based on shader count.



I'm betting on at least the same performance as the 690, if not slightly better.

If we multiply the 690's 3072 shaders by 80%, which is a generous figure for SLI scaling IMO, that's the equivalent of fewer than 2500 shaders. And that's without knowing whether GK110's shaders might have higher IPC than GK104's.
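The back-of-envelope math above can be written out explicitly. This is only a sketch of that estimate, under its stated assumptions (3072 total shaders on the dual-GPU GTX 690, ~80% SLI scaling, 2688 shaders on GK110 as shipped in the Tesla K20X); it ignores clock speeds and any IPC differences.

```python
# Effective shader count of a dual-GPU card under imperfect SLI scaling,
# compared with the rumoured single-GPU Titan (GK110).

def effective_shaders(total_shaders: int, scaling: float) -> float:
    """Shader count a multi-GPU card 'acts like' given its scaling factor."""
    return total_shaders * scaling

gtx690 = effective_shaders(3072, 0.80)  # dual GK104 at ~80% SLI scaling -> 2457.6
titan = 2688                            # GK110 shader count (Tesla K20X)

print(f"GTX 690 effective shaders: {gtx690:.0f}")   # 2458
print(f"Titan shaders:             {titan}")        # 2688, about 9% more
```

By this rough measure a fully enabled GK110 already edges out the 690's effective throughput, even before final clocks are known.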


----------



## cadaveca (Feb 10, 2013)

jihadjoe said:


> And that's without knowing if GK110's shaders might have higher IPC compared to GK104's.



That is not an unknown. The Tesla cards tell the full story about the Titan card. There's no mystery at all with this chip, and rumours and such are just traffic generators.

I hate to be a realist here, but really, there's no reason for any of the mystery. NONE. The only mystery is final clock speeds.


----------



## BigMack70 (Feb 10, 2013)

cadaveca said:


> That is not an unknown. The Tesla cards tell the full story about the Titan card. There's no mystery at all with this chip, and rumours and such are just traffic generators.
> 
> I hate to be a realist here, but really, there's no reason for any of the mystery. NONE. The only mystery is final clock speeds.



There is some mystery as to how exactly it's going to perform in games, but overall I agree with you. I do think there's good reason to believe the "85% of 690 performance" rumors.


----------



## Xzibit (Feb 10, 2013)

Crap Daddy said:


> AMD has to sell what they have now and I guess they have a pretty hard time since they cut prices on their cards countless times and are now at the second games bundle offer.



You could also say the GK110 didn't sell that well and Nvidia has to offload inventory, so selling them in GeForce variants is a way to recoup that R&D money.

They also have a quarterly report coming on Feb 13. Since CES 2013 didn't work out too well for them, it's time for the hype machine to drum up investor interest.
I suspect the HYPE will continue all the way to their GPU Tech Conf. on March 13-18.


----------



## Crap Daddy (Feb 10, 2013)

cadaveca said:


> That is not an unknown. The Tesla cards tell the full story about the Titan card. There's no mystery at all with this chip, and rumours and such are just traffic generators.
> 
> I hate to be a realist here, but really, there's no reason for any of the mystery. NONE. The only mystery is final clock speeds.



85% of the performance of a GTX690 is the most reasonable high expectation. Fantastic performance nonetheless. Price is a different discussion, but I guess they can ask whatever they want, since the only card to compare it to is also Nvidia's, and that's a $1000 card.


----------



## cadaveca (Feb 10, 2013)

BigMack70 said:


> There is some mystery as to how exactly it's going to perform in games, but overall I agree with you. I do think there's good reason to believe the "85% of 690 performance" rumors.



I agree, and the rumoured pricing reflects this as well. It's very simple: they have been binning chips for these cards for months now, and the rest probably went into the Tesla cards. I kind of expect a 900 or 950 MHz sort of clock, like the other GeForce cards, and then the shader count answers the rest.

The RAM bandwidth increase could bring decent gains as well, but since more shaders are added, that bandwidth may already be utilized by those shaders.



So, why would AMD release anything to counter that? They have a clear intent already of not taking performance crowns, so why would that change?


----------



## BigMack70 (Feb 10, 2013)

cadaveca said:


> So, why would AMD release anything to counter that? They have a clear intent already of not taking performance crowns, so why would that change?



I agree. The Titan isn't set to compete with anything or to put pressure on anything. It's just going to add one more niche option to the market and, assuming SLI works, will provide a KILLER setup for anyone looking to drop ~$2K on graphics cards.


----------



## LAN_deRf_HA (Feb 10, 2013)

If AMD isn't releasing a new card then neither will nvidia. They benefited so tremendously from being second to market that they'll want to do it again. I guess the only graphics event in 2013 will be the Titan replacing the 690. Nice not having to get new hardware every 6 months I suppose.


----------



## esrever (Feb 10, 2013)

why release new cards when nvidia isn't offering any competition. 

amirite?


----------



## Cortex (Feb 10, 2013)

Seems like AMD is trying to kill Moore's law. First they slowed down CPU development with Bulldozer, now they're stalling the graphics card market. Good job, AMD.


----------



## Protagonist (Feb 10, 2013)

Now Nvidia should follow suit and launch the 7xx series next year too. Overall I like the news of no new GPUs this year, except for the rumoured Titan.

This is a good chance for Intel to play catch-up on the GPU side...


----------



## HumanSmoke (Feb 10, 2013)

Given the recent article about AMD's possible implementation of dynamic boost, I'm going to crystal ball gaze and say that AMD will _re-release _(again) the HD 7000 series...

BEHOLD! The HD 7970 GRIMACE (*G*Hz *R*aised *I*ncrementally- a *M*Hz *A*dded *C*ollectors* E*dition)


----------



## de.das.dude (Feb 10, 2013)

Or maybe AMD just lied, so that Nvidia launches first and AMD can take advantage of that.


----------



## WarEagleAU (Feb 10, 2013)

LOL HumanSmoke that was funny.

Cortex - come on dude please. Really?

Don't argue with NewTekie1, he is always right about Nvidia and AMD. (Yes, I went there.) But honestly he isn't usually too far off base, so I have to give him that overall they are evenly matched.

Personally, I love this idea and it is something that should have been done long ago. It should mean that when I want to move from my 6870 to a 7870/7970, the price increase won't be such a shocker and will be more affordable. Honestly, my single 6870 (slightly OC'd) runs everything I want at 1920x1080 very nicely with no stuttering or other issues. I hope this leads to monthly driver releases, or at least better driver releases, from AMD. I miss my monthly driver upgrades, even if they are more for a product/game I don't have than one I do. I tend to notice newer drivers make everything better, especially games and power consumption. One thing I wish they would do is allow the GPU rendering (encoding/decoding) bit to handle a lot more file types and containers. I have yet to actually be able to use it, because even for files I believe it supports, it won't let me, and it pisses me off 

I thought quoting Fudzilla was, uhm, like, not good, as they aren't real news? LOL.

de.das.dude - that would be cool for once. I honestly wish AMD would try a shader clock or something different.


----------



## buildzoid (Feb 10, 2013)

I don't care for the Titan: if Nvidia doesn't lower clocks, it's gonna eat 300+ watts, and if they do lower the clocks, it won't beat the 690 or 7990, so that's that. And this news is great, because I was starting to get worried that my 7970 GHz would become a midrange card. I guess since the PS4/720 are only gonna be the equivalent of a mid-range gaming PC, there is no point in releasing even more overpowered GPUs that would have nothing to do, as one 7970 can run most games maxed out at 5760x1200 (and few people have that many screens) at 30fps.


----------



## W1zzard (Feb 10, 2013)

cadaveca said:


> I hate to be a realist here, but really, there's no reason for any of the mystery here. NONE. The only [geforce titan] mystery is final clock speeds.



nope.. just wait and see


----------



## erocker (Feb 10, 2013)

As long as I keep seeing these very good driver improvements with my current card, I'm a happy happy man. Resale value should stay higher as well... which kinda sucks if I want to pick up a used 2nd card for CrossFire. I just hope we'll see a couple neat "specialty" cards from AMD before the end of the year.


----------



## GLD (Feb 10, 2013)

My 7850 is rockin' on my 19" LCD at 1280x1024 resolution. The two free games I got with the card are icing on the cake. AMD makes good products. Taking time to fine-tune their drivers and let their GPUs mature is fine with me, especially with the sweet game bundles AMD is giving out: Far Cry 3 and now Tomb Raider.


----------



## Cold Storm (Feb 10, 2013)

erocker said:


> As long as I keep seeing these very good driver improvements with my current card, I'm a happy happy man. *Resale value should stay higher as well*... which kinda sucks if I want to pick up a used 2nd card for CrossFire. I just hope we'll see a couple neat "specialty" cards from AMD before the end of the year.





That's what I'm thinking this whole thing is... they know they're going to get a good return from the new consoles, and they've probably put all their R&D into the consoles alone. Nvidia showed off "Shield", so AMD knows they probably need to work on branching out as well. They even lost a good amount last year... They probably believe it's better to wait a year than to just throw something out and then slash prices left and right to try and "sell" the units they projected to sell.

[just my 2 cents]


----------



## cadaveca (Feb 10, 2013)

W1zzard said:


> nope.. just wait and see


----------



## Sasqui (Feb 10, 2013)

W1zzard said:


> nope.. just wait and see



While they improve the current architecture, who knows!  I bet we'll see a few more dual GPU cards, or something similar, or... perhaps multicore fabs?



erocker said:


> As long as I keep seeing these very good driver improvements with my current card, I'm a happy happy man. Resale value should stay higher as well... which kinda sucks if I want to pick up a used 2nd card for CrossFire. I just hope we'll see a couple neat "specialty" cards from AMD before the end of the year.



They either:

Landed on a leap-frog development that will really impress by late '13 or early '14, or... have no aces up their sleeve other than scaling the existing chips.

Either way the 7xxx series is rocking!


----------



## PopcornMachine (Feb 10, 2013)

Maybe the industry is just slowing things down, or they still have a lot of 7000-series stock.

My worst fear is that this is just another indication of AMD's bad financial state.



HumanSmoke said:


> Given the recent article about AMD's possible implementation of dynamic boost, I'm going to crystal ball gaze and say that AMD will _re-release _(again) the HD 7000 series...
> 
> BEHOLD! The HD 7970 GRIMACE (*G*Hz *R*aised *I*ncrementally- a *M*Hz *A*dded *C*ollectors* E*dition)



Yeah, I think that would make everyone grimace.


----------



## Xzibit (Feb 10, 2013)

Cold Storm said:


> Nvidia showed off "shield".. So, AMD knows that they probably need to work on the branching aspect as well.. They even lost a good amount last year... They probably believe it's better to wait a year than to just throw something out, then slash prices left and right to try and "sell" what units they've projected to sell..
> 
> [just my 2 cents]



When Nvidia revealed Project Shield it was a double-edged sword. They wanted to gain credibility with Project Shield, since they didn't secure any of the 3 console contracts, but they ended up hurting themselves.

The fact that Tegra 4 needed a passive heatsink turned off a few potential buyers in the mobile industry. They have since lost the Nexus and Windows Surface contracts.
It didn't help that Samsung's Octa and Qualcomm's new offering had a 40-60% performance increase and can do 4K. So who's going to take the risk of putting a Tegra 4 (quad A15) that might need a heatsink into their design, rather than a proven, or newer and more power-efficient, SoC that is much faster? Tegra 4 was already hurting going into CES, with rumours that it couldn't even compete with Apple's A6. So they didn't do themselves any favors when they introduced Project Shield with a heatsink on Tegra 4.


----------



## KashunatoR (Feb 10, 2013)

I don't know why everyone is so sure Titan is a single gpu card...


----------



## KainXS (Feb 10, 2013)

KashunatoR said:


> I don't know why everyone is so sure Titan is a single gpu card...



. . . . . . . . . .

Maybe because we already have a dual GK104, and a dual GK110 is not gonna happen when not a single GeForce GK110 card is out, plus restrictions.


----------



## Xzibit (Feb 10, 2013)

You know that as soon as Nvidia releases TITAN

AMD will respond like this


----------



## GC_PaNzerFIN (Feb 10, 2013)

Well, good for you AMD. But now I have only one choice: go buy the Titan. Fastest single-GPU card with a massive margin over the HD 7970? Yes, it's damn well worth the price and I'll gladly pay it. The 8800 Ultra once again.


----------



## newtekie1 (Feb 10, 2013)

DarkOCean said:


> Obviously i was not not lucky enough to be born in a more civilized country like you (in the god forsaken place where i live 7950's are at least $500-$600 and the ones that actually come with a game are even higher) otherwise i wouldnt bitching around about it.
> 
> 
> 
> ...



Yeah, that is the graph I posted originally, and he responded with "OMG!  Relative performance is stoopid! Look at these cherry picked game numbers, they proves HD7970 is better!"



the54thvoid said:


> The 7970 basic version is slower than the GTX680.  However the clocks are so differentiated it hurts.  Nvidia (in the past few years) has always released gfx cards clocked slower than ATI.  But due to their monolithic die size and transistor counts, tended to perform quite a bit better than ATI.



Don't forget that, in the past, nVidia used hot clocks while AMD stuck with linking the core and shader clock speeds 1:1.  So while nVidia's card's cores were clocked lower than AMD, often times their shaders, which do all the real grunt work, were clocked at least double the cores.  So with the GTX580 and HD6970, for example, the HD6970 was clocked at 880MHz while the GTX580 was _only_ clocked at 772MHz.  However, the GTX580's shaders ran at 1544MHz.  That was a big part of where nVidia got their performance from.
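To put the hot-clock point in rough numbers, here's a minimal sketch of peak single-precision throughput. The shader counts and clocks are the commonly cited specs for those cards, and the 2-FLOPs-per-clock figure is the usual peak assumption, so treat the results as theoretical ceilings rather than game performance.

```python
# Peak GFLOPS = shaders x shader clock (GHz) x FLOPs issued per clock.
# Assumes 2 FLOPs/clock (one fused multiply-add) per shader.

def gflops(shaders, shader_clock_mhz, flops_per_clock=2):
    """Theoretical peak single-precision throughput in GFLOPS."""
    return shaders * (shader_clock_mhz / 1000.0) * flops_per_clock

# GTX 580: 512 CUDA cores on the 1544 MHz hot clock (2x the 772 MHz core clock)
print(f"GTX 580: {gflops(512, 1544):.0f} GFLOPS")   # ~1581
# HD 6970: 1536 VLIW ALUs locked 1:1 to the 880 MHz core clock
print(f"HD 6970: {gflops(1536, 880):.0f} GFLOPS")   # ~2703
```

The hot clock is why the lower "core clock" on the nVidia card understates its shader throughput.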



the54thvoid said:


> This time AMD released the 7970 with clocks that were far too conservative.  Nvidia (more than likely) made great use of their usual, late to the table, card by tweaking the clocks skyward.  Why else are they voltage locked? Adaptive V-sync and the massive clocks are more than likely reaching their boards power limits.
> 
> AMD react with their GHz edition cards (lame quite frankly).  Now the 7970 clocked at the boost speeds of the 680 does tend to win on most occasions.  But on Nvidia's side is a 'generally' better driver team.  If AMD piled their resources into driver development and created better gaming development strategies (like Nvidia does) you;d probably see the 7970 cards humping all over the 680.



Actually, I think nVidia was very ready with the card, but waited for AMD to make the move so they would know what they had to compete against.  The weak showing by AMD this round is why nVidia ended up releasing what they planned to be a mid-range card as the actual high end card.  They didn't need to release GK110 so they held it back to retaliate against whatever AMD had planned next.  The lame step of simply boosting clock speeds meant nVidia didn't really need to retaliate.




the54thvoid said:


> The 7970 is a much more powerful card than the 680 but it is far less refined.  For a change Nvidia have the sleek purring kitten and AMD has the brute growler.  A card is only as good as it's hardware AND software and AMD have the hardware battle won for now but not the software one.
> 
> However, if the GTX Titan is not a myth then it will more than likely piss over the 7970.  But, given the rumours it will only be for show.  It will be like the Ares 2 7970 card.



I wouldn't even say AMD has the better hardware right now because GK110 is a much more powerful card hardware wise, but nVidia hasn't had a need to release it with the HD7970 and the GTX680 being so close in performance and price. 

I doubt Titan will just be for show.  Normally cards that are just for show are super expensive to produce.  However, given the specs of GK110, it won't be any more expensive to produce than an HD7970.



the54thvoid said:


> Also, to add to what many sensible people have said, there is no need for new gfx cards this year.  The 680 and the 7970 are excellent cards.  Driver tweaking will keep them well up to speed with all the new games.
> 
> It is a sign of confidence to not rush out a new product - not weakness.  And the Titan is simply a technology argument from Nvidia (probably a very bloody good one too).
> 
> ...



I have to agree, I think the market is getting tired of the accelerated product releases.


----------



## sergionography (Feb 10, 2013)

Sasqui said:


> While they improve the current architecture, who knows!  I bet we'll see a few more dual GPU cards, or something similar, or... perhaps multicore fabs?
> 
> 
> 
> ...



AMD is reportedly working on a dynamic overclocking method in their drivers; that alone might give them a bigger boost than we think, or allow lower power consumption figures, and what not.
As for the supposedly misleading 3DMark benchmark, I'd say it's not biased; it's very accurate, because it truly reflects the capability of GCN in a well-optimized environment. Even if that's not yet the case in games, it's better for AMD to get the most out of what they have before releasing a new generation, since first-gen GCN hasn't even stretched its legs yet. Closer developer relationships and more serious driver updates are what we're already seeing; AMD is totally on the right track here.


----------



## Xzibit (Feb 10, 2013)

newtekie1 said:


> I doubt Titan will just be for show.  Normally cards that are just for show are super expensive to produce.  However, given the specs of GK110, it won't be any more expensive to produce than an HD7970.



You forget Tahiti is 365 mm² and GK110 is 550 mm². Die area alone makes each GK110 roughly 50% more expensive.

If you're going by dates, Tahiti's earlier tape-out also gives it 6+ months of extra maturity on the fab process. That's another cost advantage in Tahiti's favor.

I don't see how that was ever a comparison.
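A back-of-the-envelope version of that die-cost argument, ignoring yield, edge losses, and scribe lines (all of which favor the smaller die even further). The wafer size is the standard 300 mm; the die areas are the figures quoted above.

```python
import math

WAFER_DIAMETER_MM = 300
WAFER_AREA_MM2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2

def dies_per_wafer(die_area_mm2):
    # Naive estimate: wafer area / die area. Real layouts lose more at the edge.
    return int(WAFER_AREA_MM2 // die_area_mm2)

tahiti = dies_per_wafer(365)  # Tahiti, ~365 mm^2
gk110 = dies_per_wafer(550)   # GK110, ~550 mm^2

# Cost per die scales inversely with candidate dies per wafer.
print(f"Tahiti: {tahiti} dies/wafer, GK110: {gk110} dies/wafer")
print(f"GK110 die cost premium: ~{(tahiti / gk110 - 1) * 100:.0f}%")  # ~51% from area alone
```

Defect density makes the real gap bigger: the probability of a die being defect-free drops with area, so the larger chip loses a greater share of its candidates.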


----------



## Melvis (Feb 11, 2013)

Meh, doesn't worry me at all. I only upgrade my cards every 4-5 years anyway, and with all these console ports I think I've got it covered for a long time yet.

As for this "Titan" card: it's Nvidia, they hate having the slower card, they do it all the time, and it's good marketing to claim they have the fastest single GPU, bla bla bla. What's new, honestly? And 99% of people won't buy it; it's too expensive. I just don't see what all the fuss is about.


----------



## BigMack70 (Feb 11, 2013)

A somewhat interesting update on Titan:
http://wccftech.com/nvidia-geforce-...it-memory-915-mhz-core-1019-mhz-boost-clocks/


----------



## the54thvoid (Feb 11, 2013)

BigMack70 said:


> A somewhat interesting update on Titan:
> http://wccftech.com/nvidia-geforce-...it-memory-915-mhz-core-1019-mhz-boost-clocks/



lol, $1600.


----------



## Xzibit (Feb 11, 2013)

the54thvoid said:


> lol, $1600.



For that price it better come bundled with Project Shield 

You'll notice he said a US-based retailer. Weird, because it's based in Australia. Since when did the US acquire Australia? Did I miss something?


----------



## erocker (Feb 11, 2013)

BigMack70 said:


> A somewhat interesting update on Titan:
> http://wccftech.com/nvidia-geforce-...it-memory-915-mhz-core-1019-mhz-boost-clocks/



Doesn't show anything that isn't already rumored. That website is wrong way too often to be considered relevant unfortunately.


----------



## KainXS (Feb 11, 2013)

Based on the previous Tesla releases I expected 900MHz, but not 1000+. I'll wait for a more trustworthy source.


----------



## Super XP (Feb 11, 2013)

progste said:


> i'm okay with this if the performance leap between each series is higher


I see one benefit to this: games will have a chance to be properly optimised for a familiar GPU architecture, the HD 7000 series.

I am still sticking to my guns and waiting for the HD 8900 series. I always skip a generation, and the HD 7900 was that skip. Unless, of course, the cards go down in price a lot.


----------



## HumanSmoke (Feb 11, 2013)

BigMack70 said:


> A somewhat interesting update on Titan:
> http://wccftech.com/nvidia-geforce-...it-memory-915-mhz-core-1019-mhz-boost-clocks/



Firstly, as has been mentioned...it's WCCF 
Secondly, seems strange that the Titan is being attributed the exact same clocks as the GTX 690
Thirdly, online (r)etailers have a history of gouging on price (€700 for an HD 7970, for example)
Fourthly, a 512-bit bus width? Really? So Nvidia disabled two 64-bit controllers for a Tesla card where bandwidth is paramount, yet will enable them for a consumer card that would likely benefit less from the increased bandwidth in gaming. OK.
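For context on why the bus-width rumour matters: peak memory bandwidth is just bus width times effective data rate. A quick sketch, assuming the ~6 GT/s effective GDDR5 rate typical of this era (the exact memory clock is an assumption):

```python
# Peak bandwidth (GB/s) = (bus width in bits / 8 bytes) x effective data rate (GT/s).

def bandwidth_gbs(bus_bits, data_rate_gtps):
    return (bus_bits / 8) * data_rate_gtps

print(bandwidth_gbs(256, 6.0))  # 192.0 -- four 64-bit controllers (GTX 680 class)
print(bandwidth_gbs(384, 6.0))  # 288.0 -- six 64-bit controllers
print(bandwidth_gbs(512, 6.0))  # 384.0 -- the rumoured eight-controller setup
```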


----------



## erocker (Feb 11, 2013)

I wonder if we'll see a beefed up 7970? 7990 could be a name that works since AMD never officially released one. They did this before with the 4870->4890.


----------



## HumanSmoke (Feb 11, 2013)

erocker said:


> I wonder if we'll see a beefed up 7970? 7990 could be a name that works since AMD never officially released one. They did this before with the 4870->4890.


Would seem likely. My earlier post was only partly facetious, and I'd assume the 28nm process will become more refined as it matures. It wouldn't surprise me if AMD gained some voltage headroom from a revision, which would translate to either a lower power envelope or, more likely, more headroom for clocking, especially if AMD implement a form of dynamic boost to take full advantage of the hardware (both low-power states and PowerTune +20, for example).

Not sure about the HD 7990 nomenclature though. If AMD are rebranding the HD 7970 to HD 8970 (OEM), then maybe HD 8990 might be an option, with HD 8980 for a similarly revised HD 7950.


----------



## atikkur (Feb 11, 2013)

I'm waiting for a Kepler refresh with a 384-bit memory bus and other hardware optimizations for the same or a lower price; as usual with nvidia refreshes, better performance at a lower price. I'm holding back this gen for a hardware refresh or a new architecture.


----------



## [H]@RD5TUFF (Feb 11, 2013)

Too bad, was hoping AMD would respond to titan, and cause a price war.


----------



## Nordic (Feb 11, 2013)

How much more could AMD buff a 7970 without being obscene? If they make a card with a 300W TDP compared with the 7970's 250W, it could not be that much of a buff even with refinements. Could it?


----------



## Rahmat Sofyan (Feb 11, 2013)

AMD doesn't care about the Titan, because they've won the PS4 and Xbox 720 projects.

For that reason nVidia is moaning, and the GeForce Titan is the result... haha. As a regular user I'm just happy: new products, price wars, whatever...

AMD may as well say: we are Spartans!!!


----------



## Mr. Fox (Feb 11, 2013)

james888 said:


> 4% difference in the 7970's favor sounds pretty equal to me.
> 
> 
> Spoiler
> ...


They're really close overall in terms of FPS. But you're comparing the GHz Edition, which packs a greater punch than the average 7970.


----------



## Nordic (Feb 11, 2013)

Mr. Fox said:


> They are real close overall in terms of FPS. But, you're comparing the GHz Edition, which packs a greater punch than the average 7970.



4% in favor of the 680 if using the non-GHz 7970... still a 7970. 4% is practically nothing.


----------



## qubit (Feb 11, 2013)

AMD must be mad not to compete with NVIDIA's Titan. :shadedshu Graphics cards are the one area where they remain competitive and profitable, so why are they throwing this advantage away?

At this rate I can see AMD disappearing within 5 years or so.


----------



## Deleted member 67555 (Feb 11, 2013)

I think this is good...Take the time to make a decent card that plays all these ports even better...LOL

I am actually happy with this decision by AMD... there's nothing like buying a $300 high-end video card and watching a mid-level card come out 6 months later that dominates it at a $200 price tag...

Personally, I'd rather see an 18-24 month cycle with better driver improvements than yearly product-line releases where the older lines are abandoned before their drivers are mature.


----------



## Mr. Fox (Feb 11, 2013)

james888 said:


> 4% in favor of the 680 if useing the non ghz 7970... still a 7970. 4% is practically nothing.


I agree, and NVIDIA's additional features, not to mention better driver support, more than make up for that. In the mobile segment, the 680M pretty much owns the 7970M in everything other than compute.


----------



## Nordic (Feb 11, 2013)

Mr. Fox said:


> compute function



Why I got a 7970


----------



## Mr. Fox (Feb 11, 2013)

james888 said:


> Why I got a 7970


I went through four 7970Ms and had too many functionality problems with them. The AMD mobile GPU is not nearly as good and stable a product as the desktop 7970. I finally got fed up and traded them for a pair of GTX 680Ms and couldn't be happier. Although I do wish they had the same compute performance Fermi did, I don't really need that for the way I use my system (exclusively gaming and overclocked benching).

I agree with some of the thoughts here about having the latest and greatest rendered obsolete and getting more mileage out of having the current best that red and green have to offer. But, I kind of hate to see either one of them slacking off the pursuit of more performance. Having the two companies constantly pushing and trying to outdo one another is really a great thing for consumers regardless of brand preference.


----------



## Widjaja (Feb 11, 2013)

These guys are so uncertain of what they are going to be doing.
They'll release the 8xxx series in Q2... then Q4... oh hold on, let's make it next year.

What I think they should be working on is their console side and the drivers for their PC side.


----------



## Mr. Fox (Feb 11, 2013)

Widjaja said:


> These guys are so uncertain of what they are going to be doing.
> Will be releasing 8xxx series Q2...then Q4....oh hold on lets make it next year.
> 
> What I think they should be working on is their console side and the drivers for their PC side.


AMD making consistently excellent drivers over the course of this year would be a better use of their time than making a new GPU. The 13.X Catalysts are a major improvement over the previous, but it took them way too long to release good quality drivers for the 7XXX series.


----------



## 15th Warlock (Feb 11, 2013)

As mentioned by others, I think AMD will finally release the official 7990, which in theory would still be fast enough to beat a single Titan card. They can price it close to what nVidia will charge for Titan and still proclaim to have the fastest card available.

Crossfire scaling, in games that are properly supported by the driver, is close to 90%, and seeing the prices of 7970 cards lately, AMD could afford to price this card very competitively and still make a good profit.

This card would carry AMD through the rest of 2013 and allow them enough R&D time to effectively counter whatever the green team releases after Kepler.


----------



## tastegw (Feb 11, 2013)

KashunatoR said:


> I don't know why everyone is so sure Titan is a single gpu card...



Prolly cause if the Titan were a dual-GPU card featuring two of the Tesla chips, 85-115% of the 690's performance would be a joke on the low side.

Why go through all that trouble building a dual-GPU card from the pro chips when you can just slap two of the 600 series together like the 690?

I'm sure it would be much easier and cheaper to slap two 660 Tis together than two K20 chips.


----------



## xenocide (Feb 11, 2013)

tastegw said:


> why go through all that trouble with a dual gpu card from the pro cards when you can just slap 2 of the the 600 series together like the 690.



The 690 was hardly slapped together.  Nvidia actually put quite a bit of work into engineering that thing to be a beautiful, well-crafted card.


----------



## Prima.Vera (Feb 11, 2013)

Consoles won. RIP PC...


----------



## BigMack70 (Feb 11, 2013)

HumanSmoke said:


> Firstly, as has been mentioned...it's WCCF
> Secondly, seems strange that the Titan is being attributed the exact same clocks as the GTX 690
> Thirdly, online (r)etailers have a history of gouging on price (€700 for an HD 7970, for example)
> Fourthly, a 512-bit bus width? Really? So Nvidia disabled two 64-bit controllers for a Tesla card where bandwidth is paramount, yet will enable them for a consumer card that would likely benefit less from the increased bandwidth in gaming. OK.



This is why I said it's a "somewhat interesting" update... it's worth a read but I never meant to suggest more than that.


----------



## erocker (Feb 11, 2013)

qubit said:


> AMD must be mad not to compete with NVIDIA's Titan. :shadedshu Graphics cards are the one area where they remain competitive and profitable, so why are they throwing this advantage away?
> 
> At this rate I can see AMD disappearing within 5 years or so.



Wait... Who said they weren't? Titan is not the 7XX series afaik.


----------



## GC_PaNzerFIN (Feb 11, 2013)

erocker said:


> Wait... Who said they weren't? Titan is not the 7XX series afaik.



Its codename marks it as a different generation than GK104. Unless you're claiming they'll never release anything but the top model with that chip and will throw away all the not-quite-working chips, I can't see how this would be any different. Sure, the price is higher this time, but the competition has moved on to play with consoles, so...


----------



## erocker (Feb 11, 2013)

It's one card.


----------



## GC_PaNzerFIN (Feb 11, 2013)

erocker said:


> It's one card.



I can guarantee you there will be a ton of not-good-enough-for-Titan GK110s. It's a massive chip with poor yields.


----------



## TRWOV (Feb 11, 2013)

It already comes with one SMX module disabled, doesn't it? The full chip is supposed to have 15 SMX AFAIK, but the core count suggests 14 active.


----------



## sergionography (Feb 11, 2013)

TRWOV said:


> It already comes with one SMX module disabled isn't it? The full chip is supposed to have 15 SMX AFAIK but the core count suggest 14 active.



The Tesla chip does; I doubt the same will be true for the consumer GK110.
I remember it was the same during the GTX 480 era, where the pro cards had disabled parts.
The reason is that pro cards have to be guaranteed for 24/7 operation.


----------



## Xzibit (Feb 11, 2013)

TRWOV said:


> It already comes with one SMX module disabled isn't it? The full chip is supposed to have 15 SMX AFAIK but the core count suggest 14 active.



I don't know why anyone thinks it's 14 or higher if it has 1 SMX disabled.

C'MON MAN 

K20X
14 SMX

K20
13 SMX

So if it has 1 disabled, it's a K20 that didn't sell (or was binned, but we need more detail on the chip and its differences).

Heck, it could have 1 disabled relative to a K20 and be a 12 SMX part. W1zzard did allude that clock speed was not the only surprise, so there's another one or more.

SURPRISE!!!
TITAN
12 SMX

It would follow the GK104 model of 680, 670, 660 Ti (GK110 model: K20X, K20, TITAN): disabling 1 SMX along with other components. TITAN could be a cut-down K20 GK110, like the GK104 in the 660 Ti is to the 670.

K20 GK110s could be salvaged by disabling 1 SMX or other components, and you'd get your limited TITAN product.


----------



## Protagonist (Feb 11, 2013)

Xzibit said:


> I don't know why anyone thinks its 14 or higher if it has 1 SMX disabled.
> 
> C'MON MAN
> 
> ...



And they call it GTX685


----------



## HumanSmoke (Feb 11, 2013)

Xzibit said:


> I don't know why anyone thinks its 14 or higher if it has 1 SMX disabled.


Probably because:
1. The full GK110 has 15 SMX @ 192 cores per, and
2. When was the last time a Tesla (or Quadro) part featured more shaders than the GeForce variant? (hint: *never*)
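For reference, the SMX counts being argued over map to CUDA-core totals like this (192 cores per SMX, per the GK110 whitepaper figure cited above):

```python
# GK110 CUDA-core totals for the SMX configurations discussed in this thread.
CORES_PER_SMX = 192

configs = {"full GK110": 15, "K20X": 14, "K20": 13}
for name, smx in configs.items():
    print(f"{name}: {smx} SMX -> {smx * CORES_PER_SMX} CUDA cores")
# full GK110: 2880, K20X: 2688, K20: 2496
```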


Xzibit said:


> So if it has 1 disabled its a K20 that didnt sell or (was binned but we need more detail on the chip and its differences)


...or more likely a higher leakage GPU


Xzibit said:


> SURPRISE!!!
> TITAN
> 12 SMX


Extremely unlikely. Tesla K20X/K20 are A1-revision GPUs, and GPUs invariably yield better as the process matures.
More likely 14 SMX (or possibly 15, if a fully enabled die previously fell outside Tesla's 225-235W power envelope / assuming the 15th SMX isn't built-in redundancy to improve yield).


GC_PaNzerFIN said:


> I can guarantee you there will be like a ton of not-good-enough-for-Titan GK110s. Its massive chip with poor yields


And you know this how? Do you work for Nvidia or TSMC? There have been precisely two SKUs based on GK110: one has 93% functionality, the other 86.7%. By your logic, Tahiti's yields must also be poor, since its split is 100%, 87.5%, and 75%.


----------



## qubit (Feb 11, 2013)

Xzibit said:


> I don't know why anyone thinks its 14 or higher if it has 1 SMX disabled.
> 
> C'MON MAN
> 
> ...



No, that's not right. The GK110 has 15 SMX units and is used in the K20 card.

The apparently cut down version used in the upcoming Titan has one disabled, making 14 SMX units.

NVIDIA's official whitepaper can be downloaded from here, which shows this.


----------



## Aceman.au (Feb 11, 2013)

Well guess Im buying a titan then. Thanks for making it an easy choice (of course I'll look @ numbers against my 7970s before buying though)


----------



## qubit (Feb 11, 2013)

Aceman.au said:


> Well guess Im buying a titan then. Thanks for making it an easy choice (of course I'll look @ numbers against my 7970s before buying though)



It's gonna cost like a GTX 690. Are you prepared to pay that kind of money?


----------



## Aceman.au (Feb 11, 2013)

qubit said:


> It's gonna cost like a GTX 690. Are you prepared to pay that kind of money?



Yeah. The performance jump would have to be relatively high for me to buy it though.


----------



## qubit (Feb 11, 2013)

Good for you. 

I wish I could buy it - I'd seriously geek out with a card like this.


----------



## leopr (Feb 11, 2013)

I sincerely hope the Titan outperforms my CrossFire of 7970s @ 1200/1700; if that's the case, now would be the moment to get rid of my CrossFire setup.


----------



## NeoXF (Feb 11, 2013)

^ OK people, enough with the GK110 talk; there are already three or more threads about it in the news section alone.

On to the subject at hand. I'm all for it... as long as it's not because AMD is in a bad rut. Otherwise I'm pretty sure nVidia will follow suit; do you guys really think they'd shoot themselves in the foot like that? Release the GF Titan for $900, then 1-2 quarters later release a GTX 780 for $500 that offers the same performance? In either case, both nVidia and AMD have to release something mighty amazing when the time is due (Q4 2013+).


----------



## SIGSEGV (Feb 11, 2013)

Well, some people have a lot of money to buy those Titans; that's great. But for me, I have no reason to spend a ton of money on a Titan (benchmarks? I don't care about benchmark scores, lol). Instead I'd buy a next-gen console, especially the PS4, replacing my current console (PS3), for around $350-$400.

I'd like to say thanks to AMD for no new GPUs in 2013.


----------



## buggalugs (Feb 11, 2013)

Oh well I guess I can save some money on GPUs this year.


----------



## Axaion (Feb 11, 2013)

One thing all the AMD guys keep forgetting when posting those higher-framerate benchmarks for AMD is

"FRAME LATENCY" 


Let the massive excrement war begin; may the dank side be victorious.


----------



## Eagleye (Feb 11, 2013)

Axaion said:


> One thing all the AMD guys keep forgetting when posting those higher framerates benchmarks for amd is
> 
> "FRAME LATENCY"
> 
> Let the massive excrement war begin, may the dank side be victorious.



Think you missed the memo where AMD fixed "FRAME LATENCY" 



GC_PaNzerFIN said:


> It is codenamed to be different gen than GK104. Unless you are claiming they will never release anything else but the top model with that chip and throw away all not too well working chips I can't see how this would be any different. Sure, price is this time higher but well competition has moved to play with consoles so...



From the info out there, the GK110 was made before the GK104.



HumanSmoke said:


> And you know this how? Do you work for Nvidia or TSMC ? There have been precisely two SKU's based on the GK 110...one has 93% functionality, and the other 86.7%. Using your logic, Tahiti's yields must also be poor, since the split is 100%, 87.5%, and 75%.



It took Nvidia roughly six months to supply Amazon with some 3000 cards.



newtekie1 said:


> Actually, I think nVidia was very ready with the card, but waited for AMD to make the move so they would know what they had to compete against.  The weak showing by AMD this round is why nVidia ended up releasing what they planned to be a mid-range card as the actual high end card.  They didn't need to release GK110 so they held it back to retaliate against whatever AMD had planned next.  The lame step of simply boosting clock speeds meant nVidia didn't really need to retaliate.



It took nvidia 6/7 years to design and make the Kepler chip, so how they quickly put one together in 3/4 months after seeing the AMD 7 series is beyond me.


----------



## EarlZ (Feb 11, 2013)

This is bad news... it may mean Nvidia will sell at higher prices, since there's no competition.


----------



## Axaion (Feb 11, 2013)

Eagleye said:


> Think you missed the memo where AMD fixed "FRAME LATENCY"
> 
> 
> 
> ...


Indeed I have; please provide a source, as I can't freaking find it.


----------



## EarlZ (Feb 11, 2013)

Eagleye said:


> Think you missed the memo where AMD fixed "FRAME LATENCY"



Link to a credible source please


----------



## okidna (Feb 11, 2013)

EarlZ said:


> Link to a credible source please



Since Catalyst 13.2 BETA 3 : http://support.amd.com/us/kbarticles/Pages/AMDCatalyst132BetaDriver.aspx



> AMD Catalyst 13.2 Beta 3 for Windows
> 
> 
> Improves performance up to 15% in high MSAA cases for the Crysis 3 beta release
> ...



TR did a review for this driver : http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times


----------



## SIGSEGV (Feb 11, 2013)

Axaion said:


> Indeed i have, please provide source, as i cant freaking find it.





EarlZ said:


> Link to a credible source please



I do see a mass fail here. 

http://support.amd.com/us/kbarticles/Pages/AMDCatalyst132BetaDriver.aspx


----------



## NeoXF (Feb 11, 2013)

I keep thinking... who honestly thinks nVidia would offer anything new other than this Titan thing this year? I mean, hello, even if it is $900, that'd be shooting themselves in the foot; the theoretical GTX 780 would have to end up faster than "Titan"... Maybe even be based on Maxwell?

Either way, there's a chance of refreshes (within this "gen") from both sides.


----------



## Axaion (Feb 11, 2013)

SIGSEGV said:


> i do see mass failed here.
> 
> http://support.amd.com/us/kbarticles/Pages/AMDCatalyst132BetaDriver.aspx



Eh, I don't run an AMD card, so I'm not on their site that often. Missed the TR article.

Surely you browse the nvidia beta driver section all the time, lol.


----------



## Axaion (Feb 11, 2013)

okidna said:


> Since Catalyst 13.2 BETA 3 : http://support.amd.com/us/kbarticles/Pages/AMDCatalyst132BetaDriver.aspx
> 
> 
> 
> TR did a review for this driver : http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times



Neat, I hope it's fixed for all games; at least I can switch over to AMD next time I upgrade then.
Edit: I see it's still somewhat behind, but it's a massive improvement, so there's always hope.


----------



## wowman (Feb 11, 2013)

Possibly AMD is getting cold feet, shaken by knowing Titan's performance numbers.


----------



## TheMailMan78 (Feb 11, 2013)

I think AMD just showed the hand of the industry as a whole. Mobile and console are the future, and aggressively pursuing a shrinking market (desktop PCs) isn't a smart move. NVIDIA has to, though; they hold no real advantage at this point except in the dedicated GPU market.


----------



## Crap Daddy (Feb 11, 2013)

TheMailMan78 said:


> I think AMD just showed the hand of the industry as a whole. Mobile and console are the future and pursuing a shrinking market (desktop PC's) aggressively isn't a smart move. NVIDIA has to anymore. They hold no real advantage at this point except in the dedicated GPU market.



I've been preaching this whenever I have the opportunity. NV is testing the wallet of the small niche of enthusiasts to see how many limited Lamborghini-type cards they can sell, while trying to get as much money as possible, for as long as possible, from their mid- and low-range cards. AMD has no Ferrari to compete, but they have good cards at a fair price, plus glorious bundles, to keep them in business until they figure out whether the desktop PC is going extinct.


----------



## cadaveca (Feb 11, 2013)

okidna said:


> Since Catalyst 13.2 BETA 3 : http://support.amd.com/us/kbarticles/Pages/AMDCatalyst132BetaDriver.aspx
> 
> 
> 
> TR did a review for this driver : http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times



that's just a fix for those three apps specifically.


----------



## Mindweaver (Feb 11, 2013)

I agree with AMD not making a new card until next year, but something's not adding up. Why are the high-end 7xxx cards drying up at Newegg? Has anyone seen their stock of any version of the 79xx? It's slim to none.

To me this looks like AMD is getting ready to release the 8xxx cards. It also looks like something to get everyone who's holding off for the next gen to go ahead and buy now. If you look at Newegg's stock of nvidia cards, they still have a ton. AMD may be pulling a fast one on Nvidia: if AMD releases the 8xxx cards and they're a lot faster than nvidia's 6xx cards at the same price, nvidia will be forced to drop prices on those cards and take a big loss. With AMD in all of the new consoles, they can afford to do this... It reminds me of the sleeping dragon quote. Only time will tell now.


----------



## TheMailMan78 (Feb 11, 2013)

Mindweaver said:


> I agree with AMD not making a new card until next year, but something's not adding up.. Why are the high end 7xxx cards drying up at Newegg? Has anyone see there stock on any version of 79xx's? They are slim to none..
> 
> To me this looks like AMD is getting ready to release the 8XXX cards. This also, looks like something to get everyone that's holding off for the next gen to go ahead and buy now. If you look at the stock of nvidia cards at newegg you can see they still have a ton of cards.. AMD maybe pulling a fast one on Nvidia.. If AMD releases the 8XXX cards and they are a lot faster than nvidia's 6xx cards for the same price, then nvidia will be forced to drop the price on those cards and take a big lose.. With AMD in all of the new consoles they can afford to do this... It reminds me of the sleeping dragon quote..  Only time will tell now.



They are drying up because they are rebranding them as the 8xxx cards.


----------



## Xzibit (Feb 11, 2013)

qubit said:


> No, that's not right. The GK110 has 15 SMX units and is used in the K20 card.
> 
> The apparently cut down version used in the upcoming Titan has one disabled, making 14 SMX units.
> 
> NVIDIA's official whitepaper can be downloaded from here, which shows this.



Might want to read more.

15 SMX was the design the original silicon had, the one Nvidia touted when it announced GK110. Nothing with 15 SMX has come out, not the K20X or the K20.

Even the white paper you link alludes to that. Look up the specs: the K20X has 14 SMX and the K20 has 13 SMX; neither uses 15 SMX. Don't know where you get that the K20 uses 15 SMX if even the K20X doesn't.



> A full Kepler GK110 implementation includes 15 SMX units and six 64‐bit memory controllers. Different
> products will use different configurations of GK110. For example, some products may deploy 13 or 14
> SMXs.



There are a few other things in the whitepaper, just like the 15 SMXs, that never made it to the K20X and K20.
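For anyone keeping score on the SMX counts being debated above, Kepler packs 192 single-precision CUDA cores per SMX (per NVIDIA's GK110 whitepaper), so the configurations map to core counts like this (a quick back-of-the-envelope sketch, not official product data):

```python
# Each Kepler SMX holds 192 single-precision CUDA cores.
CORES_PER_SMX = 192

configs = {
    "full GK110": 15,  # the design as announced; no shipping product used it at this point
    "Tesla K20X": 14,
    "Tesla K20": 13,
}

for name, smx in configs.items():
    # e.g. 13 SMX -> 2496 cores, 14 SMX -> 2688 cores, 15 SMX -> 2880 cores
    print(f"{name}: {smx} SMX -> {smx * CORES_PER_SMX} CUDA cores")
```

That makes the gap between a "full" chip and the shipping parts easy to see at a glance.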


----------



## yogurt_21 (Feb 11, 2013)

Simple reason: AMD has never had to deal with 3 console launches before, and they likely don't have the resources to work on those AND release a new gen. So they'll wait it out while working on the more profitable market.


----------



## happita (Feb 11, 2013)

Well, this sucks. I was planning on a whole new build this summer. But I guess since the new consoles are coming at the end of the year and AMD being in both of them, I guess they are focusing more on them versus the PC market for now. Guess I'll have to stick with my 5850 for now, unless I can score a great deal on something else.


----------



## TheMailMan78 (Feb 11, 2013)

happita said:


> Well, this sucks. I was planning on a whole new build this summer. But I guess since the new consoles are coming at the end of the year and AMD being in both of them, I guess they are focusing more on them versus the PC market for now. Guess I'll have to stick with my 5850 for now, unless I can score a great deal on something else.



670 sir. Don't look back.


----------



## okidna (Feb 11, 2013)

cadaveca said:


> that's just a fix for those three apps specifically.



Yup, sadly you're right


----------



## cadaveca (Feb 11, 2013)

okidna said:


> Yup, sadly you're right



And with that said, I'd much rather see more drivers, than more cards from AMD right now.


----------



## Ravenas (Feb 11, 2013)

Why should they release them before then? Console games are the driving force behind GPU power requirements. The consoles will probably be available by the end of this year with next-generation games. Those next-generation games require better GPUs, and developers will be building engines around that. So no need to rush these things... These comments about AMD being cash strapped etc.: of course they are, but they are still going to be putting high-quality, affordable GPUs on the market. There is no reason to rush a product out when the consumer doesn't have a high demand for it yet.


----------



## Frick (Feb 11, 2013)

qubit said:


> Good for you.
> 
> I wish I could buy it - I'd seriously geek out with a card like this.





I am indifferent.


----------



## GC_PaNzerFIN (Feb 11, 2013)

HumanSmoke said:


> And you know this how? Do you work for Nvidia or TSMC ? There have been precisely two SKU's based on the GK 110...one has 93% functionality, and the other 86.7%. Using your logic, Tahiti's yields must also be poor, since the split is 100%, 87.5%, and 75%.



Now now, don't start putting words in others' mouths. You got that logic part from your own head. Since this subject obviously needs clarifying...

I know it because I have been studying computer architecture and processor design in hardware for a very long time.

Do you have any idea how hard it is to manufacture a 7-billion-transistor GPU that is well over 500 mm² in die area? Especially how the defect rate/yield relates to die area? Yeah, I guess you don't. You don't have to work for NVIDIA for that, btw... 

There is no fully enabled GK110 out and there won't be for a long time. Why? Because the number of fully working chips from these pre-launch risk wafers is too low to launch a product with mass availability without an unacceptable trade-off between how many GPUs you can sell and the cost of manufacturing them. Even GK104 and Tahiti yield a lot of partially defective GPUs from the fabs; just look at how many of those are being sold. And those dies are more than a third smaller than the GK110 die. As die size increases, not only do you get fewer chips out of a wafer in the best case, but the chance that a single chip has multiple defects goes up too, and fast. 

After a long period of time, meaning many months, yields will improve and stabilize as the manufacturing process matures for that exact product. This is especially true for gigantic ICs. At that point you'd generally start selling the stockpile of partially working ICs too, because there is plenty in storage from manufacturing up to that point. 

What I am in fact saying is quite the opposite of what you took my logic to be: Tahiti, and GK104 for that matter, are less prone to multiple defects per chip, resulting in better yields.
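The die-size argument above can be sketched with the classic Poisson yield model, where the chance of a defect-free die falls off exponentially with area. The defect density `D0` below is a made-up illustrative number (real fab figures are proprietary), and the die areas are the commonly cited approximations, so treat the percentages as a shape, not a measurement:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_mm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson defect model."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

D0 = 0.002  # hypothetical defects per mm^2, chosen only for illustration

for name, area in [("GK104 (~294 mm^2)", 294.0),
                   ("Tahiti (~352 mm^2)", 352.0),
                   ("GK110 (~550 mm^2)", 550.0)]:
    print(f"{name}: {poisson_yield(area, D0):.1%} of dies defect-free")
```

Whatever `D0` actually is, the exponential means the biggest die always takes a disproportionately larger yield hit, which is the core of the argument, though real designs offset this with redundancy and salvage SKUs.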


----------



## Wrigleyvillain (Feb 11, 2013)

Just bought 7950 so all fine and dandy with me (and I agree with Dave re. driver work instead).


----------



## badtaylorx (Feb 11, 2013)

I'm OK with this so long as AMD puts out more cards along the lines of the Tahiti LE class cards. I was quite impressed with that lil' guy.  I'd really like to see an HD 7890 Tahiti LEx2 put out.

3072 Stream Processors at a price-point under $500... 

anyone else up for that???


----------



## Casecutter (Feb 11, 2013)

Sasqui said:


> They either landed on a leap-frog development that will really impress by late '13 or early '14 or…





TheMailMan78 said:


> I think AMD just showed the hand of the industry as a whole. Mobile and console are the future and pursuing a shrinking market (desktop PC's) aggressively isn't a smart move. NVIDIA has to anymore. They hold no real advantage at this point except in the dedicated GPU market.





Crap Daddy said:


> I've been preaching this whenever I have the opportunity. NV is testing the wallet of the small niche of enthusiasts to see how many limited Lamborghini type of cards can they sell while *trying to get as much money for as long it's possible from their mid and low range cards*. AMD has no Ferrari to compete but they have good cards for a fair price plus glorious bundles to keep them in the business until they figure out if the desktop PC is going extinct.



With this news of waiting it out, I kind of want to think AMD made GCN and Boost work really well together, something they may have partially implemented in the console parts and why they got the contracts.  While game developers release new titles for those consoles, AMD knows they'll have a leg up with GCN drivers, buying time for those releases to be ported to the PC platform.  Meanwhile, I think AMD realizes Kepler (GK104) is at this point fairly tapped out.  Nvidia can pick up modest gains with a re-spin, but without an overall revamp (a major investment), and perhaps even a die-size bump, they won't clear the next hurdle.  If AMD released in, say, 3 months, they'd take the price/performance lead, but why?  This all may come down to hoping to "dig up" some information about what Nvidia could do with Kepler; it's been fairly quiet on that front. 

Yes, it may come down to slowing development, resources, and manpower, tied to a shrinking market. IDK.  It could be that pulling back is related to both Volcanic Islands and Nvidia's Maxwell and 20nm manufacturing.  But overall AMD probably knows Nvidia's path is 8+ months out, so *why not* slow it down.  Last time they moved early and it brought more grief on them.

Lastly, this has almost nothing to do with Titan, except that Nvidia (and AMD) wants to see how many they can sell, and whether they can elevate the market in future go-arounds. Proving there are enough takers in the $600-750 range could bump the normal "enthusiast" price point for Maxwell in 2014-2015. This will also let AMD see whether Nvidia might bring out lower derivatives from further gelding GK110s, which could give AMD an idea of whether Nvidia could in fact bump the GK104 die with a slight revamp while hitting good gains.   I really believe AMD priced the 7970 the way it did because they thought Nvidia would be on board, but Nvidia pulled a fast one on AMD by leading with the GK104.  

This is AMD being smart... for all the right reasons.


----------



## HumanSmoke (Feb 11, 2013)

GC_PaNzerFIN said:


> There is no fully enabled GK110 out and there won't be for a long time. Why? Because the amount of fully working chips from these pre-launch risk wafers is too low to launch a product with mass availability without making unacceptable trade of amount of GPUs you can sell and the cost of manufacturing them.


Nvidia began supplying ORNL with GK110-based Teslas (93% functional) in the first week of September. Nine weeks later, ORNL completed a Linpack run using 18688 of these GPUs (2000+ per week). This was in an atmosphere where Nvidia's wafer starts were primarily aimed at fulfilling OEM mobile orders (the GT 625M, GT 645M, GTX 670MX, GTX 675MX and GTX 680MX were all released during the same timeframe), as well as consumer GK106 and GK104 variants.

While a larger die will invariably contain more defects than a smaller one, no one here - not even you - knows what the defect rate is, and more importantly, how much redundancy is built into the GPU die to allow for defects. So, while I would be in agreement with you over the likelihood of lower yields as die size increases, I am not in agreement with absolute statements that aren't supplied with corroborating proof....


GC_PaNzerFIN said:


> I can guarantee you there will be like a ton of not-good-enough-for-Titan GK110s. *Its massive chip with poor yields *



I'd also contend that "massive" is a relative term. I don't think the GK110 is appreciably larger than GF110 - and if anyone has experience in producing large, complex GPUs, it's probably Nvidia... the 20,000+ units shipped from the first A1 revision are probably testimony to that.
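The shipment-rate claim above is simple arithmetic and easy to check (the nine-week window is the approximate figure cited in the post, not an official number):

```python
# Rough rate check for the ORNL deployment figures quoted above.
gpus_delivered = 18688   # GK110-based Teslas used in ORNL's Linpack run
weeks = 9                # approximate window between first deliveries and the run

rate = gpus_delivered / weeks
print(f"about {rate:.0f} GPUs per week")  # consistent with the "2000+ per week" figure
```

Which is why the post treats sustained volume, rather than a guess at defect density, as the more telling evidence.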


----------



## GC_PaNzerFIN (Feb 11, 2013)

HumanSmoke said:


> Nvidia began supplying ORNL with GK 110 based Tesla's (93% functional)...



Again you prove my point. None of the GK110s sold was fully enabled.


----------



## erocker (Feb 11, 2013)

GC_PaNzerFIN said:


> Again you prove my point. None of the GK110s sold was fully enabled.



I think I lost what the point _was_.


----------



## GC_PaNzerFIN (Feb 11, 2013)

erocker said:


> I think I lost what the point _was_.



That the $1000 Titan is not the only GK110 that will reach the market this year. And that you only disable parts when you have yield problems.


----------



## erocker (Feb 11, 2013)

GC_PaNzerFIN said:


> That the $1000 Titan is not the only GK110 that will reach the market this year. And that you only disable parts when you have yield problems.



Ah, I see. Many of us believe that the current cards (600 series) are derivatives of GK110 already. But really this is for a different thread. Discuss here: http://www.techpowerup.com/forums/showthread.php?t=179919

This thread is about discussing what AMD is going to do this year (or not do, lol).

Carry on.


----------



## Goodman (Feb 11, 2013)

That doesn't surprise me, as AMD is pretty low on cash (for a company that size). The best thing for them to do is take what they already have, make it better, and still sell it at a better profit. 

The other thing is, I don't see the point of new graphics cards from either AMD or Nvidia until we see screens/monitors going mainstream at resolutions higher than 1080p. Really, haven't we had this resolution for long enough already? (for PC)


----------



## Lionheart (Feb 11, 2013)

Either AMD is too busy with the next-gen consoles or they're chucking a Sony. 

Sony CEO Kazuo Hirai: "Why go first, when your competitors can look at your specifications and come up with something better?" 

A week later, Sony shows off a teaser trailer for what is most likely the PS4 announcement on Feb 20th, trolling Microsoft and surprising everyone.


----------



## TheMailMan78 (Feb 11, 2013)

Lionheart said:


> Either AMD is too busy with the next gen consoles or they're chucking a Sony
> 
> Sony CEO Kazuo Hirai - "Why go first, when your competitors can look at your specifications and come up with something better?”



That didn't help them with the last generation at all.


----------



## Xzibit (Feb 11, 2013)

TheMailMan78 said:


> That didn't help them with the last generation at all.



That's because Sony went with Nvidia and Microsoft went with AMD.


----------



## OneCool (Feb 11, 2013)

TheMailMan78 said:


> I think AMD just showed the hand of the industry as a whole. Mobile and console are the future and pursuing a shrinking market (desktop PC's) aggressively isn't a smart move. NVIDIA has to anymore. They hold no real advantage at this point except in the dedicated GPU market.




^^^ This.


----------



## Super XP (Feb 11, 2013)

TheMailMan78 said:


> They are drying up because they are rebranding them as the 8xxx cards.



That should be illegal. I got sick of NV doing that, and ATI did it too back in the day.


cadaveca said:


> And with that said, I'd much rather see more drivers, than more cards from AMD right now.


How about better drivers with performance improvements for every single game? Keeping this GPU series around longer should help them familiarize themselves with the GPU.


----------



## Lionheart (Feb 11, 2013)

TheMailMan78 said:


> That didn't help them with the last generation at all.



I didn't quite finish my post accurately. 

I meant "chucking a Sony" as in they will wait for Microsoft to release their console first. Then, one week later, they surprise everyone with this Feb 20th announcement, which is most likely the PS4, catching Microsoft off guard. Just wondering if AMD might do the same thing to Nvidia, but I doubt it, as AMD has their hands tied with next-gen console hardware.


----------



## GSquadron (Feb 12, 2013)

I can feel the smell of a Geforce Titan review....


----------



## Phusius (Feb 12, 2013)

Aleksander Dishnica said:


> I can feel the smell of a Geforce Titan review....



Not worth the $900 unless it can do 7970 CF performance.


----------



## BigMack70 (Feb 12, 2013)

Phusius said:


> Not worth the $900 unless it can do 7970 CF performance.



But possibly worth the $1800 if you have some cash burning through your wallet and want to put two of them in SLI


----------



## eidairaman1 (Feb 12, 2013)

Goodman said:


> That doesn't surprise me as AMD are pretty low on cash (for a company that size) best thing for them to do is use what they already have make it better & still sale it at better profit
> 
> The other thing is i don't see the point of new GC's either from AMD or Nvidia not until we see screens/monitors getting mainstream with higher resolution then 1080 , no really don't we have this resolution for long enough already? (for PC)



Or most games actually taking advantage of the hardware (higher levels of detail compared to consoles, like back in the R9700 Pro days).


----------



## dude12564 (Feb 12, 2013)

BigMack70 said:


> But possibly worth the $1800 if you have some cash burning through your wallet and want to put two of them in SLI



What if quad-SLI...

That's $3600.


----------



## BigMack70 (Feb 12, 2013)

dude12564 said:


> What if quad-SLI...
> 
> That's $3600.



Yeah but you can actually play games on 2-way SLI.

Quad-SLI... not so much. Though the 3dmark scores would undoubtedly be impressive.

You could probably buy 4 of them, run a quad-SLI bench, and then sell the two you don't want at a crazy markup on eBay, making back most of what you paid for them.


----------



## PopcornMachine (Feb 12, 2013)

Story completely changing now.  

Got this link from OCC to article at PCWorld.

AMD promises to squash confusion and quickly clarify 2013 Radeon plans


----------



## Xzibit (Feb 12, 2013)

PopcornMachine said:


> Story completely changing now.
> 
> Got this link from OCC to article at PCWorld.
> 
> AMD promises to squash confusion and quickly clarify 2013 Radeon plans



Can't wait for the next "AMD may delay" thread to talk about the Nvidia TITAN for 75% of the posts. 

At least someone's going out to get an official word instead of just bulletin-board rumors.  I've always wondered that about review sites and hardware sites.  So quick to post rumors when they could just fire off an e-mail to said rep and get a clarification.



Nice catch.


----------



## Easy Rhino (Feb 12, 2013)

just more evidence that points to console domination and the end of pc gaming. the gpu is going the way of the nic card and the sound card.


----------



## HumanSmoke (Feb 12, 2013)

PopcornMachine said:


> Story completely changing now.
> Got this link from OCC to article at PCWorld.


Not sure about the "completely changing" aspect - it still seems as clear as mud. It was always a given that AMD would release at least some HD 8000 series parts - both OEM and iGPU (Richland) have already been announced.


> "We will certainly have new [GPU] products in 2013."


Check.


> Why did that cause such a hub-bub? The GPU road map AMD showed off at CES gave equal billing to both "Sea Islands," AMD's internal code name for the Radeon HD 7000 series, as well as "Solar System," the code name for the next-gen Radeon HD 8000 series


AFAIW, "Solar System" (Mars, Sun, Venus, Neptune) are the code name for _mobile_. "Sea Islands" (Oland, Bonaire, Hainan, Curacao, Aruba) are supposedly the new desktop cards, with only the first three having any commonality with the mobile parts.


> Robert Hallock said as much when Legit Reviews approached him about his @AMDRadeon tweet: "I should note that HD 8000 Series has never been so much as hinted at for a channel release. Anything to the contrary is an unsubstantiated rumor fabricated to drive traffic.


Except for those pesky mobile units that AMD has been publicizing, I presume:


----------



## KissSh0t (Feb 12, 2013)

Maybe AMD will have a chance to get some really good drivers out now for the current lineup of cards.


----------



## D007 (Feb 12, 2013)

Thank God.. This constant release of New GPU's has been really annoying..


----------



## xenocide (Feb 12, 2013)

D007 said:


> Thank God.. This constant release of New GPU's has been really annoying..



How has it been _constant_?  The first 7970 review was posted on TPU on 12/11/11, which at this point was over a year ago.  They didn't have a set launch date for the 8xxx series beyond Q2-Q4 2013.  For all we know, Q4 could mean 2 years from when the 7970 launched, which isn't exactly constant.  What they have done is start releasing a series and then taper it out, slowly releasing more models as time goes on, which is somewhat annoying.  I'd rather see a hard launch with the entire series...


----------



## Mr. Fox (Feb 12, 2013)

TheMailMan78 said:


> I think AMD just showed the hand of the industry as a whole. *Mobile and console are the future* and pursuing a shrinking market (desktop PC's) aggressively isn't a smart move. NVIDIA has to anymore. They hold no real advantage at this point except in the dedicated GPU market.


Can we expect anything less from a generation of people that graduate from high school and college but can barely read, write or spell? When the ignorant are allowed to breed and make big decisions that affect the intelligent, all kinds of bad things can happen, LOL.

Seriously though... the death of the PC has been predicted many times. If the enthusiasts buying killer products all die off, and the money dries up for the products they always purchased, it could really happen. Kids can only afford consoles and cheap, under-performing mobile devices. As disposable as those are, they end up in landfills quickly, and the concepts of modding, performance tuning and tweaking are like a foreign language to them. After all, if Sony, Microsoft or Nintendo offer it for sale, it must already be awesome, right? 

Kind of like the concept of everyone driving crappy electric cars. They call it "vision," "innovation" and "progress" when it's really quite the opposite if you're a performance enthusiast. It really sucks when some people get their way.


----------



## Bjorn_Of_Iceland (Feb 12, 2013)

Quite confused about the death-of-the-PC and rise-of-the-console posts.. in other news I've been reading, it's the consoles that are close to extinction XD.


----------



## TRWOV (Feb 12, 2013)

I could live with a 2 year cycle between gens. I usually skip a generation anyway.


----------



## anubis44 (Feb 12, 2013)

Looks like nVidia and AMD are finally calling a truce so they can both focus on the real enemy of both companies: Intel. I just keep hoping that nVidia and AMD will see the light and merge. The merged AMD/nVidia would instantly have a lock on the graphics market and already have competitive x86 and ARM designs, with the financial strength of nVidia and the deep intellectual property and engineering teams of AMD. Together, they would be a serious force to be reckoned with.


----------



## Nordic (Feb 12, 2013)

anubis44 said:


> Looks like nVidia and AMD are finally calling a truce so they can both focus on the real enemy of both companies: Intel. I just keep hoping that nVidia and AMD will see the light and merge. The merged AMD/nVidia would instantly have a lock on the graphics market and already have competitive x86 and ARM designs, with the financial strength of nVidia and the deep intellectual property and engineering teams of AMD. Together, they would be a serious force to be reckoned with.



And have a monopoly suit on their hands


----------



## Prima.Vera (Feb 12, 2013)

Easy Rhino said:


> just more evidence that points to console domination and the end of pc gaming. the gpu is going the way of the nic card and the sound card.



Mm...Don't think so.


----------



## KissSh0t (Feb 12, 2013)

Easy Rhino said:


> just more evidence that points to console domination and the end of pc gaming. the gpu is going the way of the nic card and the sound card.



Consoles have been dead to me since... 2001..

PC gaming is where it's at.


----------



## DannibusX (Feb 12, 2013)

My 5970 told me it didn't care a while back.

I still don't think it does, as there's really been nothing to push it, and I don't feel the need to dump large chunks of cash into a new GPU every year.

AMD will be releasing new SKUs this year; they've been rebranding OEM stuff.  I at least expect 7-series revisions and something to be released after nVidia puts out.


----------



## qubit (Feb 12, 2013)

Mr. Fox said:


> Kind of like the concept of everyone driving crappy electric cars. They call it "vision," "innovation" and "progress" when it's really quite the opposite if you're a performance enthusiast. *It really sucks when some people get their way.*



Too right. It always seems to be the vocal meddlers with their crappy ideas that get their way, and we all pay the price.

There's usually corruption and big money changing hands somewhere along the line. Think of those useless wind farms.


----------



## Prima.Vera (Feb 12, 2013)

qubit said:


> Too right. It always seems to be the vocal meddlers with their crappy ideas that get their way and we all pay the price.
> 
> There's usually corruption and big money changing hands somewhere along the line. Think those useless wind farms.



I agree except with the last sentence. Why do you think wind farms are useless? A single turbine can output 5 MW on average. You know how many houses can be powered with 5 MW??


----------



## qubit (Feb 12, 2013)

Prima.Vera said:


> I agree except with the last sentence. Why do you think wind farms are useless? A single turbine can output 5 MW on average. You know how many houses can be powered with 5 MW??



hmmm, we could go back and forth on that one, but it's unfortunately off topic for this thread. If you'd like to talk about it, then why not start a thread in the science section and put a link to it here?


----------



## tokyoduong (Feb 12, 2013)

You guys are so weird, lol. Do you realize that the only reason consoles and mobile are so popular now is that they became more and more like a PC? Except much cheaper, smaller and easier to use.

There is now a flood of new products in all sizes and prices. One day it will saturate, and people will want fewer devices and even more simplicity. I can't see the PC in its current form coming back to dominance, but something like it will. TBH, I'm getting sick of all these "mobile" devices already. With all the electronics and accessories a person carries around, I think it's starting to slow people down more than making them more mobile. Then you have all the associated problems with multiple OSes, devices, connectors, etc... It seems like people are lost and can't even get groceries by themselves if their smartphone doesn't tell them where to go.

A few years back, I read some stories about psychologists and rehab centers making millions working with patients with "texting disorder". I can't wait to see what new nonsense comes up next.


----------



## Mr. Fox (Feb 12, 2013)

anubis44 said:


> Looks like nVidia and AMD are finally calling a truce so they can both focus on the real enemy of both companies: Intel. *I just keep hoping that nVidia and AMD will see the light and merge.* The merged AMD/nVidia would instantly have a lock on the graphics market, and have competitive x86 and ARM designs already, with the financial strength of nVidia and the deep intellectual property and engineer teams of AMD. Together, they would be serious force to be reckoned with.


That would be a horrible thing for consumers. Having them as bitter enemies and tearing at one another's jugulars is exactly what we need. Competition is what makes both of their products get better and better. If they were to merge we would get whatever they want us to have... and nothing more. That would really blow chunks.


----------



## cadaveca (Feb 12, 2013)

tokyoduong said:


> consoles and mobile are so popular now is because it became more and more like a PC? except it's much cheaper, smaller and easier to use.



I do not agree. Normal users are into these sorts of things because of Facebook, and that is all. Bringing Facebook to smartphones may have saved the industry, but we are in a transition period right now, where people are having trouble finding a use for PCs when their smartphones cost less and do all the same basic things.

A game console...well, that's for games. Higher population globally means a larger market.

Perhaps AMD will release something... I kind of hope they do.


----------



## Crap Daddy (Feb 12, 2013)

cadaveca said:


> we are in a transition period right now, where people are having trouble finding a use for PCs when their smartphones cost less and do all the same basic things.



The problem is with desktop computers, the ones that can accommodate a discrete GPU. I think there is still good life ahead for notebooks and tablet/laptop hybrids like the Surface Pro with integrated graphics. I think what AMD is probably concentrating on now is rebranding the 7000-series mobile chips to the 8000 moniker to get more design wins, plus mobile APUs.


----------



## cadaveca (Feb 12, 2013)

Crap Daddy said:


> The problem is with desktop computers, the ones that can accommodate a discrete GPU. I think there is still good life ahead for notebooks and tablet/laptop hybrids like the Surface Pro with integrated graphics. I think what AMD is probably concentrating on now is rebranding the 7000-series mobile chips to the 8000 moniker to get more design wins, plus mobile APUs.



It seems to me that it isn't any wonder AMD kind of has driver issues, considering the many different core designs they have floating around, all using a universal driver: APUs, VLIW4/VLIW5, GCN, different memory designs on each, different chip latencies, etc...

They need to strengthen their position, for sure, and to me that can only be done by consolidating into a more homogenized architecture, or by investing more into programming for existing devices. With that in mind, low-power computing is far more forward-thinking than what the desktop space provides, especially when you consider the mindset of the general consumer, whose real introduction to personal computing is going to be a smartphone. It makes more sense to build UP from there, rather than DOWN from the high end in performance, simply because of user experience.


----------



## erocker (Feb 12, 2013)

AMD promises to squash confusion and quickly clarify 2013 Radeon plans


----------



## Prima.Vera (Feb 12, 2013)

erocker said:


> AMD promises to squash confusion and quickly clarify 2013 Radeon plans



Yeah, we are in for an 8000 series this year. But I really hope it's NOT just a rebrand series...


----------



## eidairaman1 (Feb 13, 2013)

cadaveca said:


> It seems to me that it isn't any wonder AMD kind of has driver issues, considering the many different core designs they have floating around, all using a universal driver: APUs, VLIW4/VLIW5, GCN, different memory designs on each, different chip latencies, etc...
> 
> They need to strengthen their position, for sure, and to me that can only be done by consolidating into a more homogenized architecture, or by investing more into programming for existing devices. With that in mind, low-power computing is far more forward-thinking than what the desktop space provides, especially when you consider the mindset of the general consumer, whose real introduction to personal computing is going to be a smartphone. It makes more sense to build UP from there, rather than DOWN from the high end in performance, simply because of user experience.



APUs use VLIW or GCN depending on series


----------



## Jurassic1024 (Feb 13, 2013)

This news is good for me, because now I can stop debating whether to wait for the 8000 series and can pick up a second Sapphire 7870 GHz Ed.

I think that dude up there is right about AMD focusing on the consoles for now. After that he took a left to Crazy Town, but that's another story. AMD does not have a lot of money on tap. R&D is expensive, and it takes time that AMD doesn't have while running its regular business, working with MS and Sony on their consoles' development, and customizing the hardware that will go inside them.


----------



## Jurassic1024 (Feb 13, 2013)

cadaveca said:


> A game console...well, that's for games.



Yeah, no one uses them for (HD) movies, or Twitter, or Netflix, or Hulu, or creating an avatar, or browsing the web, or sharing pictures, or chatting, or....

In fact, if it wasn't for the Blu-ray player in the PS3, HD DVD might have won. A stretch perhaps, but no one could deny the PS3 was, at the time, the cheapest Blu-ray player on the market. The PS3 accounted for a rather large number of sales just for the Blu-ray player it had. Just today a coworker said he ONLY uses his PS3 for Blu-ray movies. Another said he uses his for games, music and movies. 

So consoles are for games only?  You must have just come out from under your rock. Welcome back, btw.


----------



## cadaveca (Feb 13, 2013)

Jurassic1024 said:


> So consoles are for games only? You must have just come out from under your rock. Welcome back, btw.



Knowing many smartphone owners, they tend to use their phones, rather than consoles. Younger users, sure, consoles are a social gateway, and rightly so.


I can only convey what I see, and I live in a different country than you do, even. TO think that it's exactly the same here as there...well, is not well advised.

BTW, Blu-Ray was decided by the porn industry, just like VHS was. Sony had nothing to do with it.


BTW, I use my PS3 to watch Blu-rays, and the odd game, if I'm in bed. All other times, my PC does both as well, and with greater ease of use, too. You cannot forget that for the majority of the market, ease of use is priority number 1.


----------



## xenocide (Feb 13, 2013)

cadaveca said:


> I do not agree. Normal users are into these sorts of things because of Facebook, and that is all. Bringing Facebook to smartphones may have saved the industry, but we are in a transition period right now, where people are having issues finding a use for PCs when their smartphones cost less, and do all the same basic things.



The problem is every company in the Tech Industry is just trying to make the next big thing. Smartphones were around for years before Apple made one that appealed to most people and got the ball rolling--same thing with tablets.  Somebody makes a product, and companies like HP, Dell, ASUS, Samsung, and anyone else with the money for it start pumping out similar products, and it becomes all that is marketed. Remember when consoles were going to kill PCs?  Or laptops were going to kill desktops?  What about when netbooks were going to kill laptops? No, how about when smartphones were going to kill off all of the above? Okay, so now it's tablets that are going to kill everything off?

Give me a break.

You want to know why PC hardware is faltering?  For starters, why the fuck do we have lower-res, lower-quality displays on PCs than tablets?  How does that even make sense???  I went to look at laptops, and they are all 1366x768.  You can get entry-level phones with nearly the same resolution.  Why are laptop manufacturers still peddling crap at high prices?  If they can put a 2560x1600 display in an 11" tablet, why can't I get one as a 24-27" monitor for even the same price???

This is why companies are losing sales in various fields.  Because they don't know how to refine their products.  Apple is the pinnacle of refined products; even if something doesn't sell phenomenally, if they put it out they will improve on the next iteration in every way.  Sony is the only company with laptops on Newegg that has affordable models with 1920x1080 screens; I found a dual-core i5 with a dedicated HD 7730 or HD 7750 or something for like $650, which was a damn steal.  Why aren't laptops like that littering Best Buy and Walmart???



cadaveca said:


> A game console...well, that's for games. Higher population globally means a larger market.



Consoles are trying to build themselves as living room entertainment systems.  Hence why all the new ones should have Blu-ray, and why they all rely heavily on Netflix, Hulu, and similar services for support.  Microsoft and Sony want their devices to be your all-in-one entertainment console--think HTPC but more convenient.  I honestly only use my PS3 to stream stuff to my downstairs TV and play DVD/BR movies, and occasionally to have music on downstairs while I clean.  Haven't played a game on it in probably 8 months.



cadaveca said:


> BTW, Blu-Ray was decided by the porn industry, just like VHS was. Sony had nothing to do with it.



I'm sure it had nothing to do with the fact that Sony got more film studios to support it out of the gates, made it built into their console rather than requiring an expensive attachment, or the fact that it has better capacity and a longer lifespan than HD-DVD did.  Nope, had to be porn.


----------



## buildzoid (Feb 13, 2013)

Your monitor doesn't need that high a pixel density, because you're going to sit 1-2m away from it, so the pixels will appear smaller. Also, those epic displays on tablets make them cost so damn much. The base material for LCDs is indium tin oxide, and indium is really, really expensive. Making high-quality, high-pixel-density displays is already expensive at an 11" scale, so 22" with the same pixel density is going to cost 2x the price or more because of a high production failure rate. A 9.7" retina costs around $500, so a 27" retina would cost something like $1400--that's why we don't have 27" retina displays on PC, and the refresh rate on them would probably be pretty bad by gaming standards anyway. BTW, if you get a good 1080p monitor (like the iMac ones), it looks amazing even with low pixel density. I'm going to be getting 3 of the 23" Asus MX monitors for that reason: a good panel will make full HD look a lot better while not costing a fortune.
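The pixel-density comparison above can be sanity-checked with a quick calculation. This is a rough sketch only: the 2048x1536 spec is the commonly quoted 9.7" retina tablet panel, the 27" comparison point is a standard 2560x1440 desktop panel, and cost/yield aren't modeled at all.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 9.7" "retina" tablet panel vs. a common 27" desktop panel
print(round(ppi(2048, 1536, 9.7)))   # ~264 ppi
print(round(ppi(2560, 1440, 27.0)))  # ~109 ppi
```

Matching tablet density at 27" would take roughly a 6200x3500 panel, which gives a sense of the scale of the manufacturing problem being described.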


----------



## ChaoticAtmosphere (Feb 13, 2013)

Wow, this is great news! The industry is slowing down a bit. Now I won't feel so bad going crossfire this spring!


----------



## xenocide (Feb 13, 2013)

buildzoid said:


> Your monitor doesn't need that high pixel density because you gonna sit 1-2m away from it so the pixels will appear smaller also those epic displays on tables make them cost so damn much. Also the base material for LCDs is Indium oxide. Indium is really really expensive. making high quality and high pixel density displays is already expensive on a 11" scale so 22" with the same pixel density is gonna cost 2x the price or more because of a high production fail rate. A 9.7" retina cost around 500$ so a 27" retina will cost something like 1400$ that's why we don't have 27" retina displays on pc the refresh rate on them must be pretty bad by gaming standards. BTW if you get a good 1080p monitor (like the iMac ones) then it looks amazing even with low pixel density. I'm going to be getting 3 23" MX monitors from asus for that reason a good panel will make full HD look a lot better while not costing a fortune.



The pixel density doesn't have to be astronomically high--as you said, you are sitting a bit further away.  If the panel in an iPad alone cost $500, they would have to be selling them at a loss.  The fact is, laptops have been stuck at 1366x768 for years, and desktops were forced into the 1080p craze.  Hell, CRTs were capable of something like 2560x1400; how the fuck have we not gotten to at least that resolution on equivalent-size LCDs in a decade's time?


----------



## Prima.Vera (Feb 13, 2013)

xenocide said:


> The pixel density doesn't have to be astronomically high--as you said, you are sitting a bit further.  If that panel on an iPad alone cost $500, they would have to be selling them at a loss.  The fact is, laptops have been stuck on 1366x768 for years, and desktops were forced into the 1080p craze.  Hell, CRT's were capable of something like 2560x1400, how the fuck have we not gotten to at least that resolution on equivalent size LCD's in a decades time?



Plenty of 1440p monitors out there. Or do you want them cheap?? Just remember how much the first 1200p monitors cost when they launched.


----------



## ChaoticAtmosphere (Feb 13, 2013)

Jurassic1024 said:


> Yea no one uses them for (HD) movies, or twitter, or netflix, or hulu, or creating an avatar, or browsing the web, or sharing pictures, or chatting, or....
> 
> In fact, if it wasn't for the blu ray player in the PS3, HD DVD may have won. A stretch perhaps, but no one could deny the PS3 was, at the time, the cheapest blu ray player on the market. The PS3 accounted for a rather large amount of sales just for the blu ray player it had. Just today a coworker said he ONLY uses his PS3 for blu ray movies. Another said he uses his for games, music and movies.
> 
> So consoles for games only?  You must of just come out from under your rock. Welcome back, btw.



When most consumers consider buying a GAME console, usually their primary use for it is GAMING.


----------



## BigMack70 (Feb 13, 2013)

ChaoticAtmosphere said:


> When most consumers consider buying a GAME console, usually their primary use for it is GAMING.



I dunno... my wife and I use the xbox for netflix a lot more than for gaming. The only game I play anymore on console is Halo because some of my old friends and I do once a month Halo nights.


----------



## tokyoduong (Feb 13, 2013)

BigMack70 said:


> I dunno... my wife and I use the xbox for netflix a lot more than for gaming. The only game I play anymore on console is Halo because some of my old friends and I do once a month Halo nights.



he did say "most"
you are just a strange bird of the west


----------



## cadaveca (Feb 13, 2013)

tokyoduong said:


> he did say "most"
> you are just a strange bird of the west



Enthusiasts do NOT make up "most" of the market, though. What we do, really, has no relevance in the subject.


----------



## tokyoduong (Feb 13, 2013)

tokyoduong said:


> You guys are so weird lol. Do you guys realize that the only reasons consoles and mobile are so popular now is because it became more and more like a PC? except it's much cheaper, smaller and easier to use.
> 
> There is now a flood of new products in all various sizes and prices. One day, it will saturate and people will want less devices and even more simplicity. I can't see the PC in its current form come back to dominance but something like that will. TBH, I'm getting sick of all these "mobile" devices already. With the electronics and accessories a person carry around, I think it's starting to slow people down more than making them  more mobile. Then you have all the associated problems with multiple OS, devices, connectors, etc... It seems like people are lost and can't even get groceries by themselves if their smartphone didn't tell them where to go.
> 
> A few years back, I read some stories about Psychologists and rehab centers making millions working with patients with "texting disorder". I can't wait to see what new nonsense will come up now.





cadaveca said:


> I do not agree. Normal users are into these sorts of things because of Facebook, and that is all. Bringing Facebook to smartphones may have saved the industry, but we are in a transition period right now, where people are having issues finding a use for PCs when their smartphones cost less, and do all the same basic things.
> 
> A game console...well, that's for games. Higher population globally means a larger market.
> 
> Perhaps AMD will release something...I kinda of hope they do.



Did you just tell me I'm wrong and then repeat what I said in different words to make yourself right?

So basically what you're saying is that people's smartphones can browse the web, check Facebook/Twitter/email/etc., play games, store photos, and so on. I can go on and on, but these are all computer functions in a smaller, cheaper, more convenient, and easier-to-use form. Hmmm.... With consoles it's cheaper and more convenient, and works with just one button. You can also watch movies, listen to music, stream whatever you want with apps, browse the web, etc...
See where this is going? They are making PC replacements with these devices. It's more attractive when you don't have to have a tower + monitor + desk + accessories + a lot of extra cost. We have been moving toward newer mobile devices that can replace the traditional PC for a long time.
And "normal" users make up the vast majority of the consumer market. Why wouldn't manufacturers build for them?
I don't even know what the hell your point is, but thanks for confirming my statement.


----------



## tokyoduong (Feb 13, 2013)

cadaveca said:


> Enthusiasts do NOT make up "most" of the market, though. What we do, really, has no relevance in the subject.



What are you talking about? Most people buy a console to play games on it and do some other things like Netflix, but its primary focus is games.

There are some people who primarily do video streaming or web browsing, but that's a small %.

Did you even read what he responded to?


----------



## cadaveca (Feb 13, 2013)

tokyoduong said:


> I don't even know what the hell your point is but thanks for confirming my statement.



I'm an enthusiast. Of course I agree. However, like I said, what WE do, isn't what everyone does.

Talking to local people from 18-24, most use their smartphones. About 60% own a game console, about 70% have a PC, and 100% have a smartphone. Not ONE of them uses consoles for anything but gaming. Not even movie watching. Netflix up here sucks, and we all get better service via our cable providers; I get Netflix on my cable box.

Those that own a PC...still sit on the couch, phone in hand, usually texting, and then browsing and doing other stuff on the phone between texts.

That's just a small sampling of the market. It doesn't denote what everyone does with their devices. That's just who I talk to.

Talk to my wife and her friends, though: 15% have consoles, 90% have PCs, and 100% have a smartphone.



The market is a wide and varied place, where there truly is no "standard" by which anyone uses their devices. Period. How many billions of people are on the planet? And they all use their devices like you do? I think not...and I know better.


Looking at the big picture, there's much reason for AMD to only release new GPUs for the most popular markets, and those are smartphones and consoles. Consoles they have taken care of...but smartphones...that's NVIDIA's turf. I would like to see AMD there, too.


----------



## Xzibit (Feb 13, 2013)

cadaveca said:


> Looking at the big picture, there's much reason for AMD to only release new GPUs for that most popular markets, and those are smartphones, and consoles. Consoles they have taken care of...but smartphones...that's NVidia's turf. I would like to see AMD there, too.



Nvidia is a small player in smartphones and getting smaller in both tablets and phones. It was only able to get into 1 smartphone, the HTC One X, and then the updated HTC One X+. The HTC M7 will now carry a Qualcomm chip, not a Tegra.

So it's 0 smartphones for both AMD and Nvidia as of now, 2013.

Tablets:
Google Nexus
Microsoft Windows Surface RT
Sony Xperia

Those 3 decided to go with Qualcomm for their next release.


----------



## BigMack70 (Feb 13, 2013)

tokyoduong said:


> he did say "most"
> you are just a strange bird of the west



I don't think so. None of my over-30 friends (all of whom are married) use their Xbox primarily for gaming. Most use it primarily as a netflix/etc hub with some gaming on the side like I do.

I'd believe that most teenagers use their consoles for gaming, but from my experience (and from reports that more time is spent non-gaming than gaming on consoles), I don't think game consoles are used primarily for gaming by a good chunk of people anymore.


----------



## TheMailMan78 (Feb 13, 2013)

Xzibit said:


> Nvidia is a small player in Smartphones and getting smaller both in Tablets and Phones. It was only able to get 1 smartphone HTC One X and then the updated HTC One X+. The HTC M7 will now carry a Qualcomm chip not a Tegra.
> 
> So its 0 smartphones for both AMD and Nvidia as of now 2013.
> 
> ...



What about the Asus TF? They still use Tegras.


----------



## WhiteLotus (Feb 13, 2013)

TheMailMan78 said:


> What about the Asus TF? They still use Tegras.



And they are awesome.


----------



## erocker (Feb 13, 2013)

TheMailMan78 said:


> What about the Asus TF? They still use Tegras.



Pretty sure he's talking about new and upcoming products. Going by reviews, the last two revisions of Tegra are pretty fail compared to the competition.


----------



## TheMailMan78 (Feb 13, 2013)

erocker said:


> Pretty sure he's talking about new and upcoming products. Going by reviews, the last two revisions of Tegra are pretty fail compared to the competition.



Well yeah I figured as much but I was talking about the next gen of Asus TF. I was wondering if he had heard anything.


----------



## Xzibit (Feb 13, 2013)

TheMailMan78 said:


> What about the Asus TF? They still use Tegras.



They haven't announced refreshes to those products. The most recent refreshes are Lenovo and Toshiba, which still use Tegra 3.

The only tablet officially using Tegra 4 is a Vizio. Asus and Acer are rumoured to have refreshes on Tegra 4.  No matter how you look at it, their footprint in tablets is down more than 50%.  They had over 9 tablet products, lost 3 to Qualcomm, and only 1 has officially announced adoption of Tegra 4.

MWC will tell us more about whether they lose more to the competition or renew contracts.


----------



## TheMailMan78 (Feb 13, 2013)

Xzibit said:


> They havent announced refreshes to products. The most recent refreshes are Lenovo and Toshiba that still use Tegra 3.
> 
> Only tablet officially using Tegra 4 is a Vizio. Rumoured to have refreshes on Tegra 4 are Asus and Acer.  No matter how you look at it there footprint in tablets is down more then 50%.  They had over 9 products in tablets lost 3 to Qualcomm and only 1 has officially announced adaptation of Tegra 4.
> 
> MWC will tell us more if they loose more to competition or renew contracts.



That's not good news at all for NVIDIA. No consoles, losing mobile market. They better get it together.


----------



## HumanSmoke (Feb 13, 2013)

TheMailMan78 said:


> That's not good news at all for NVIDIA. No consoles, losing mobile market. They better get it together.


Consoles are great for revenue, but the margins are pretty thin. All good for AMD since they basically pull in money while the console makers, foundry and ODM sort out the manufacture and supply.
As for the mobile market and Nvidia, the Tegra 4 is a stop gap chip between Tegra 3 and the ARM Cortex A50 (ARMv8) based Tegra 5, solely to provide a continued presence in the marketplace. 

Note: Vizio won't be the only user of Tegra 4 (HP for example), even if the chip is a short-timer. A few large OEM wins usually trump a raft of smaller wins.


----------



## Xzibit (Feb 13, 2013)

@TheMailMan78

Jen-Hsun mentioned an Asus tablet that uses Tegra 4 in the earnings report conference call.  He only mentioned that one by name and alluded to more.
Tegra 4 ramps in Q2, so they should be coming to market in Q3 2013.


----------



## Fluffmeister (Feb 13, 2013)

I don't really get all the nVidia doom and gloom; they just posted more stellar results

http://techreport.com/news/24357/nvidia-posts-another-round-of-record-results


----------



## Xzibit (Feb 14, 2013)

Fluffmeister said:


> I don't really get all the nVidia doom and gloom, they just posted more stella results
> 
> http://techreport.com/news/24357/nvidia-posts-another-round-of-record-results



Year over year is nice.

Quarter to quarter it's bad.

Operating expenses continue to climb, and they are now consolidating financial reporting into 2 sections to minimize the bad outlook.  Basically, from this quarter on, if one division does badly, a good division within that sector can balance it out in the reports.  You just have to dig through the numbers to figure out which division is doing poorly and which one isn't.

Revenue = -8.1%
Gross Margin = Flat
Operating Expenses = +4.6%
Net Income = -16.8%
Earnings Per Share = -15.2%


----------



## HumanSmoke (Feb 14, 2013)

Xzibit said:


> Year over Year is nice
> 
> Quarter to Quarter its bad.



Not surprising considering the last quarter was a record for Nvidia.

Revenue for FY13 : $4.28 bn (up from $4 bn last year)
Profit for the quarter: $174 million (takes into account Nvidia buying back $100 million of its shares- basically Nvidia using the Intel settlement to buy back the company. Nice.)
Stock share dividend total paid out: $46.9 million



Xzibit said:


> Basickly if one division does bad a good division with in that sector can balance it out now while going forward in reports from this quater on out.  Just have to dig through the numbers to figure out which division is doing poorly and which one isnt.


Pity you don't spend the same amount of effort dissecting AMD's quarterly returns- in fact, the sum total of your AMD analysis amounts to "not doing as good as Nvidia" ...followed by, wait for it -more Nvidia numbers!
I'll add to the numbers you posted...
Q4 2011 - $116 million profit
Q1 2012 - $ 60 million profit
Q2 2012 - $119 million profit
Q3 2012-  $209 million profit
Q4 2012-  $174 million profit
All while buying back company shares, and having $3.1 billion in cash and short term investments on hand.
Yup, looks like a company circling the drain
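For what it's worth, the quarter-over-quarter profit drop implied by the rounded figures above can be computed directly, and it lands right next to the -16.8% net income change quoted earlier in the thread (the small gap comes from the profits being rounded to the nearest million):

```python
# Q/Q change computed from the rounded profit figures above:
# Q3 2012 profit $209M -> Q4 2012 profit $174M.
q3_profit_m = 209
q4_profit_m = 174
change_pct = (q4_profit_m - q3_profit_m) / q3_profit_m * 100
print(round(change_pct, 1))  # -16.7, in line with the quoted -16.8% net income drop
```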


----------



## Xzibit (Feb 14, 2013)

HumanSmoke said:


> All while buying back company shares, and having $3.1 billion in cash and short term investments on hand.
> Yup, looks like a company circling the drain



I never said it was circling the drain, but it's not immune to PC market trends.  They just happened to hit other companies sooner.
That same argument is beaten to death by that Ashraf dude over at Seeking Alpha--how the company should be valued more than it is.  It's actually very funny.

Nvidia's CFO gave, at best, a FLAT or NEGATIVE outlook for next Q.  So if you have a problem with the numbers, take it up with Nvidia.
Better yet, go buy some more GPUs.



So far the market didn't like the report.  12.66 an hour before closing & the report. Closed at 12.37. After hours it's 12.18.


----------



## Deleted member 67555 (Feb 14, 2013)

Ouya is Tegra 3....seems Nvidia is getting into the console market as well


----------



## HumanSmoke (Feb 14, 2013)

Xzibit said:


> That same argueement is beated to death by that Ashraf dude over at SeekAlpha. How the company should be valued more then it is.  Its actually very funny.


Yeah, what a hoot:


> As of late October, the company had $3.43 billion in cash and short-term investments on its balance sheet. Assuming no share repurchases, this amount has probably grown to around $3.8 billion by now, as NVIDIA received a $300 million royalty payment from Intel Corporation (NASDAQ:INTC) in January as part of a cross-licensing agreement between the two companies. Cash thus represents nearly half of NVIDIA's market cap of $7.9 billion.


Maybe the Ashraf dude and the rest of the world might have more knowledge than a random internet troll. Personally, I won't believe it until it features on Mythbusters...although I'd note that this is an AMD thread...and of the FOURTEEN posts you've made in it, EVERY SINGLE ONE is about Nvidia, and ONLY TWO even feature references to AMD.



Xzibit said:


> So far the market didnt like the report.  12.66 1hr before closing & the report. Closed at 12.37. After hours its 12.18


Pretty much SOP for tech companies. AMD beat the forecast last quarter, and look what happened on the day the financials came out:


----------



## Xzibit (Feb 14, 2013)

HumanSmoke said:


> Yeah, what a hoot:
> 
> Maybe the Ashraf dude and the rest of the world might have more knowledge than random internet troll. Personally I won't believe it until it features on Mythbusters...although I'd note that this is an AMD thread...and of the FOURTEEN posts you've made in it EVERY SINGLE ONE is about Nvidia, and ONLY TWO even feature references to AMD.



I believe I was replying to posts that had already been made.

I do find it humorous that pointing to Nvidia's quarterly numbers makes you this upset. 

If you want to follow Ashraf's advice, be my guest. Like this gem he made going into CES 2013.
What did the stock do during CES 2013? 
01/08/2013: 12.80 open
01/11/2013: 12.09 close
He's perfect for selling hype. Read the comments left by others. He bought an Nvidia GPU and is in love with that company.  See any similarities to yourself? 

!!!WARNING!!! More numbers you won't like
Nvidia Q4 FY2013 summary
Weird, since the buyback you reference is supposed to help EPS, yet it's down year over year.
_*Might want to read the fine print on that: over a 10yr period and subject to change at Nvidia's discretion_

If you look at Q/Q:
GPU = Down 6.9%
Tegra = Down 14.6%
Total = Down 8.1%
^ and that's with the new consolidation structure.

I'm not saying it's the end-all be-all, but you can't even put it into context because you're too busy raging over anything that doesn't praise Nvidia.  *This is coming off 2012 holiday sales for the 600 series*, while a year ago they were off-loading the 500 series getting ready for the GTX 680 release, with all the rumors of how they were going to counter the 7970 with the GK100 that still hasn't come.


----------



## ChaoticAtmosphere (Feb 14, 2013)

Gah!


----------



## DannibusX (Feb 14, 2013)

jmcslob said:


> Ouya is Tegra 3....seems Nvidia is getting into the console market as well



I nearly choked on my Coke.


----------



## Deleted member 67555 (Feb 14, 2013)

DannibusX said:


> I nearly choked on my Coke.



It's looking good too!



> OUYA and NVIDIA have a kind of love thing going on right now. The $99 Android-powered game console designed by Yves Béhar's fuseproject is powered by NVIDIA's Tegra 3 -- this much we already know. What we didn't know is that the folks at OUYA are working directly with a team of folks at NVIDIA on the project, and that NVIDIA is helping the company to max out its Tegra 3 processor for use on a console rather than a mobile (no battery dependency means the little chip can go much further than usual).



Games like Stargate unleashed are what I'm looking forward to.








As far as I'm concerned, I couldn't care less if AMD or Nvidia stopped making PC GFX cards tomorrow....I think little boxes like the Ouya are the future...but I still think the Ouya will flop..LOL

EDIT: Oh yeah, Stargate Unleashed will only be available via iOS and Android.


----------



## HumanSmoke (Feb 14, 2013)

Xzibit said:


> I do find it humorous that pointing to Nvidia Quarterly numbers makes you this upset


I presume you are unable to comprehend what is written- merely what you want to think is there, so that you can continue to troll.


Xzibit said:


> If you want to follow Ashraf advise be my guest.


You seem to have a love affair with the guy- that's twice you've mentioned him...while I haven't mentioned, or otherwise linked to the guy. Want to keep attributing falsehoods and innuendo? I'm happy to let the moderators handle it.

And before you start lecturing people on financials, maybe you should learn something about how to interpret them


----------



## HammerON (Feb 14, 2013)

Enough with the financials and everyone's interpretations of such. Only and last warning.


----------



## Ultraman1966 (Mar 9, 2013)

So, back on topic: AMD is wise not to bother releasing any new GPUs. As much as I love progression, there simply aren't going to be enough new games that take full advantage of better hardware until the next next gen of consoles arrives. It's sad, but after all these years I've come to accept those terms.


----------



## TRWOV (Mar 9, 2013)

I could get behind a two year GPU cycle. I upgrade GPUs every two years anyway.


----------



## TheoneandonlyMrK (Mar 9, 2013)

TRWOV said:


> I could get behind a two year GPU cycle. I upgrade GPUs every two years anyway.



Id rather get my moneys worth too +1


----------



## NeoXF (Mar 10, 2013)

The Radeon HD 7790, which will probably get rebranded as the HD 8770 or 8750 once the full-fledged HD 8000 series hits, is proof of what bullshit this rumor is...

Anyway, can't AMD do 192/320-bit buses? I know nVidia can...


But either way, all in all a 2-year schedule with a bigger performance boost from generation to generation seems way more rational/market-friendly and even more profitable (if done right).


----------



## BigMack70 (Mar 10, 2013)

I'd rather have a driver update that gets rid of crossfire stuttering than any new hardware... the 7970 is still the best hardware out there by far considering its price - it's only latency/CF issues that bring it down.

If AMD can ever fix the latency/stutter issues on 7970 CF, I won't be upgrading for a long time. If they can't do that, I'll be heading for GTX 780 SLI or whatever the next high end thing is from Nvidia.


----------



## qubit (Mar 10, 2013)

BigMack70 said:


> I'd rather have a driver update that gets rid of crossfire stuttering than any new hardware... the 7970 is still the best hardware out there by far considering its price - it's only latency/CF issues that bring it down.
> 
> If AMD can ever fix the latency/stutter issues on 7970 CF, I won't be upgrading for a long time. If they can't do that, I'll be heading for GTX 780 SLI or whatever the next high end thing is from Nvidia.



I dunno why driver issues are such a bugbear with AMD. You'd think they'd invest some money into getting this critical piece of software right, wouldn't you?

I haven't experienced any significant graphics problems since switching over to Nvidia in 2009, and I'm now on my third graphics card with them (GTX 285, GTX 580, GTX 590).

However, I also see many people with AMD cards which work just fine and in CF too, to be fair.


----------



## WhiteLotus (Mar 10, 2013)

I haven't had a single problem with any of my AMD cards ever. I often wonder if this driver issue is just made up to excuse frame rates lower than nVidia's.  Though quite frankly, anyone who thinks they "need" 60+FPS is, in my opinion, a fool with more money than sense.


----------



## qubit (Mar 10, 2013)

WhiteLotus said:


> Though quite frankly, anyone that thinks they "need" 60+FPS is, in my opinion, a fool with more money than sense.



Then according to your definition, I'm one of those fools, lol.

You're dead wrong there, however. I have a 120Hz monitor, and gaming at a solid 120fps makes for a massive improvement, easily visible without even trying to look for it. It's even noticeable just moving the mouse and windows around the desktop.

And finally, my monitor can handle 144Hz and yes, you can see the difference when compared to 120Hz, although it's more subtle.


----------



## BigMack70 (Mar 10, 2013)

WhiteLotus said:


> I haven't had a single problem with any of my AMD cards ever. I often wonder if this driver issue is just made up to excuse the lower frame rates than nVidia.  Though quite frankly, anyone that thinks they "need" 60+FPS is, in my opinion, a fool with more money than sense.



60fps constant is pretty much mandatory for a multiplayer FPS IMO. Other than that I can live with anything that has minimum framerate above 30.

AMD has stuttering issues with crossfire, and I have zero ideas why they can't improve it. When I (and many others) can vastly improve the experience by just using a framerate limiter, why the heck can't AMD figure out what the problem is and fix it? Makes no sense to me at all...

I've more or less alternated between AMD and Nvidia over my 15 years of building PCs but if Nvidia can release something that doesn't suck to me, my 7970 experiment may be a short lived one.


----------



## Aquinus (Mar 10, 2013)

BigMack70 said:


> AMD has stuttering issues with crossfire, and I have zero ideas why they can't improve it. When I (and many others) can vastly improve the experience by just using a framerate limiter, why the heck can't AMD figure out what the problem is and fix it? Makes no sense to me at all...



If it's so easy to fix, why don't you do it? It's not as easy a problem to solve as one might think. nVidia has the same problems in multi-GPU setups, and it all comes down to how long it takes the second card to render and send its frame buffer to the other card. The problem is one card always has a leg up, because one is the master and drives the display, and it takes time to send that data from one GPU to another. Now this goes very quickly--less than a millisecond, we're talking maybe a couple hundred nanoseconds--but that's added time that gets put on top of the rendering load and slows down that second frame. So what you have is alternation between one frame rendered a little faster and the next rendered a little slower, and this difference between frames is the stuttering you're seeing.

There was a review on another site that measured jitter and showed that running 3 cards in crossfire had less jitter than 2. I would hypothesize that happens because two of the three cards have the latency introduced by the crossfire bridge and one GPU being master. So two out of 3 frames render in about the same amount of time, with 1 frame getting rendered slightly faster, and the frame rate jitters less often than with two cards.

So you have a frame where you don't know how long it will take to render, and the next frame might render more quickly or slowly than the last depending on what has changed, on top of the added latency of the crossfire link and the master handling the frame buffer. So you tell me: how do you make that jitter less? I suspect both AMD and nVidia have programmers and engineers thinking about it, but it's a tough problem, so don't go blaming either company for their multi-GPU shortcomings, because it's not an easy problem to solve at all.
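The alternating fast/slow pattern described above, and why a simple frame limiter smooths it, can be shown with a toy model. To be clear, the millisecond figures here are invented for illustration, and real driver frame pacing is far more involved than this sketch:

```python
# Toy model of AFR micro-stutter: two GPUs alternate frames, but the
# second GPU pays a fixed transfer penalty to ship its buffer to the
# master over the bridge, so presented frame times alternate fast/slow.

def afr_frame_times(n_frames, render_ms=14.0, transfer_ms=6.0):
    """Even frames come from the master (render_ms); odd frames pay
    an extra transfer_ms for the inter-GPU copy."""
    return [render_ms + (transfer_ms if i % 2 else 0.0) for i in range(n_frames)]

def limited(times, cap_ms=20.0):
    """A naive limiter: never present a frame sooner than cap_ms after
    the previous one, trading peak FPS for even pacing."""
    return [max(t, cap_ms) for t in times]

def jitter(times):
    """Peak-to-peak variation in frame time: the 'stutter' metric."""
    return max(times) - min(times)

raw = afr_frame_times(8)      # [14.0, 20.0, 14.0, 20.0, ...]
print(jitter(raw))            # 6.0 ms of alternating jitter
print(jitter(limited(raw)))   # 0.0 ms: perfectly even, but capped at 50 fps
```

In this toy version the limiter removes the jitter entirely by holding back the fast frames, which matches the observation later in the thread that a framerate limiter vastly improves the crossfire experience at the cost of some average framerate.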


----------



## BigMack70 (Mar 10, 2013)

When all it takes to vastly reduce stutter is a framerate limiter (which, as I understand it, just holds back the frames that get displayed ultra-quickly and create the jitter), why can't AMD just build that (or something like it) into their software implementation of CF?

That's what makes no sense to me. I'm not claiming to be a tech guru about this, but it isn't obvious at all to me why that would be so difficult to do.


----------

