Tuesday, June 17th 2014

Graphics Card Shipments to Drop Drastically in Q2 2014

Graphics card vendors are bracing for a brutal Q2 2014, in which they expect sales to drop sequentially by 30 to 40 percent. They attribute this to swelling inventories of unsold graphics cards lower down the supply chain. For the first time, the drop in graphics card sales is being attributed to falling demand for GPUs among crypto-currency miners, who are either moving on to more energy-efficient mining technology, such as ASICs, or quitting the business altogether following the drop in value of major crypto-currencies such as Bitcoin. These miners end up selling high-end graphics cards at attractive prices on marketplace websites such as eBay, hurting sales of brand-new high-end cards. Graphics card vendors (AIB and AIC partners) have asked GPU manufacturers AMD and NVIDIA to help them cut prices to boost sales; both have instead cut down supply to deal with the situation. Swelling inventories often translate into price cuts for end users, so be on the lookout.
Source: DigiTimes

51 Comments on Graphics Card Shipments to Drop Drastically in Q2 2014

#26
john_
buildzoidThe development of the software is paid from GPU sales therefore regardless of whether you use the software or not the price gets added to the price of every GPU you buy.
Yes, if you buy the GPU. But if you see that someone is trying to force you to also pay for cookies when you just want milk, you don't buy it. Not at the normal price, anyway. If you see a 20% sale, that's another story, of course.
#27
Xzibit
buildzoidThe development of the software is paid from GPU sales therefore regardless of whether you use the software or not the price gets added to the price of every GPU you buy.
Companies always pass the cost on to the consumer. That doesn't mean it's justified; it just means it's an inherent cost you have to swallow for choosing that product.

Fermis were introduced at a much lower price.
600s were introduced at a higher price, and the software wasn't introduced until a few months later, well after people paid the inherent hidden cost of the software at the time of purchase. Mind you, not all the features work on all the cards, nor are they supported by all Keplers.

How many of those 600 series users or newer would have loved to have had a $50 saving rather than pay the hidden cost of software they might not use or don't even want?
#28
awesomesauce
Hardware has surpassed software.

Building a game that can utilise all the hardware (PC) is pretty difficult atm.

It needs energy, money and a lot of time...

It's pretty normal that hardware stops selling.
It's gonna be good for us, who will see prices drop some day :toast:

Pretty sure we're gonna see some delay for the next generation (800 series)...
#29
Yorgos
"...the sources said." is like "the analysts say that..."

but nobody has ever seen those sources or spoken to those analysts.

The analysts never learn their lesson when, every quarter, things are not as they predicted, and
"the sources" sound like someone let his kid write the article while he went to the local pub to watch the World Cup.
I remember that some sources have been claiming for the last 2 years that bitcoin is dying, that litecoin is never getting an HDL algorithm, and so forth.

In other words: let's go get an AMD 290X... (UT)DENIED, stock is low due to mining (insert facepalm here)
#30
dj-electric
Dunno about the people in the USA, but here R9 290s and 290Xs, alongside 280Xs, are being sold second-hand by miners for almost nothing. One can get a couple of R9 290s for about 400 euros.
#31
JTristam
RCoonI'd rather not pay an extra £100 for NVENC packaged inside of software(because I can already use it), and something that changes game settings for me.
I hate both GFE and CCC, I have neutral hatred for drivers from both. Plus AMD offer their own version of shadowplay anyway, not that I've tried it. I switched from AMD during their crossfire-sucks period to NVidia, and pretty much avoid software from both parties.
I understand you find their software useful (I used shadowplay a fair bit too, but it simply sucks compared to OBS or FRAPS), but the price simply does not justify it. I pay for a GPU, and a GPU only, not some software that implements things already available to me.
This. I've been an NVIDIA customer for, hell, I don't know how long. But one thing has always bothered me: the price. I don't know why JHH & Co. always insist on putting high prices on their cards. Sure, GeForce is good and they developed their own tech and all (though I can't say the same about GFE or ShadowPlay, because I never use them), but it'd be better if, say, they sold the 780 Ti for the price of a 290/290X. I doubt the production cost is really that high, so why the expensive price?
#32
Assimilator
Be very careful of buying 2nd-hand mining GPUs that have been run at max clocks 24/7 for months to mine cryptocurrencies.
#33
TRWOV
AssimilatorBe very careful of buying 2nd-hand mining GPUs that have been run at max clocks 24/7 for months to mine cryptocurrencies.
Why? I had my GPUs run WCG work units for half a year and they work fine to this day. I don't think running them for crypto would be any different.
#34
Casecutter
arbiterthat would hit AMD more then it would Nvidia.
Well, when someone can grab an eBay 280X for like $150, it also affects Nvidia, because that's the same buyer who looked at 760/770 prices and sees that bait as super enticing.
The folks looking for deals among the crypto-mining leftovers aren't necessarily those who exhibit any brand loyalty. Honestly, the crypto-mining leftovers might impact Nvidia just as much, given that their prices on GK104s are out of sync with price/performance now that crypto-mining is dead. It was okay for Nvidia to be judged a "deal" in reviews when AMD had stupid-crazy retail pricing; now that AMD cards are more often positioned below MSRP (or well below, here in the USA), it's apparently fine for Nvidia that price/performance is no longer part of the topic.
#35
Recus
Another interesting fact: scrubs keep saying AMD is better, AMD is cheaper... But sales still dropped 30-40%. Half of them are probably sitting on HD 6000s and trolling.
#36
btarunr
Editor & Senior Moderator
Serpent of DarknessBottom line, Btarunr's first sentence said it all: "Graphic card vendors are bracing for a brutal spell for Q2 2014, in which they expect sales to sequentially drop by 30 to 40 percent." The rest is all bias rhetoric.
Maybe if you click the source link, you'll find that the crypto-coin attribution for the impending GPU sales crash comes from the source, and is not my personal observation.
Serpent of Darkness@btarunr

nice R9-290 from MSI. I've only now noticed you've got one... :D
Thanks, I bought it last November, at its launch price of $399 (+taxes), and well before I wrote editorials warning people that crypto-coin miners that use GPUs have an appetite for newborns.
#37
tokyoduong
RecusAnother interesting fact: scrubs keep saying AMD is better, AMD is cheaper... But sales still dropped 30-40%. Half of them are probably sitting on HD 6000s and trolling.
In Q1, AMD had a ridiculous rise in GPU sales due to crypto mining. They're just losing that extra demand.

I think the only troll here is you.
#38
The Von Matrices
RCoonI'd rather not pay an extra £100 for NVENC packaged inside of software(because I can already use it), and something that changes game settings for me.
I hate both GFE and CCC, I have neutral hatred for drivers from both. Plus AMD offer their own version of shadowplay anyway, not that I've tried it. I switched from AMD during their crossfire-sucks period to NVidia, and pretty much avoid software from both parties.
I understand you find their software useful (I used shadowplay a fair bit too, but it simply sucks compared to OBS or FRAPS), but the price simply does not justify it. I pay for a GPU, and a GPU only, not some software that implements things already available to me.
XzibitHow many of those 600 series users or newer would have loved to had a $50 savings rather then pay the hidden cost of the software they might use or don't even want.
While we're on the point of unnecessary software, can we include bundled games in this category? I thought things were changing for the better when AMD introduced special Battlefield 4 editions of cards, meaning that if you didn't want the game, you could save money by getting the non-bundled version. But now the industry, both AMD and NVidia, is back to bundles again, forcing people like me to pay for games I do not want and then to resell the code, instead of receiving a lower price in the first place.
#39
AsRock
TPU addict
john_When the mining madness started and AMD's card prices skyrocketed (in the US), many hardware sites were saying that when people started selling the cards on eBay, that would be only AMD's problem. From the article it's obvious that both AMD and Nvidia are seeing a huge drop in card shipments. So all those who were saying that "AMD selling a truckload of cards was going to be a problem for... AMD", it seems they were wrong. Who could have imagined that, lol.
They were saying that because AMD is a much smaller company than nVidia, so it's more likely to go bust way before.
#40
Assimilator
TRWOVWhy? I had my GPUs run WCG work units for half a year and they work fine to this day. I don't think running them for crypto would be any different.
Cooling fans don't last if run at 100% speed 24/7; I predict there is going to be a glut of mined-out Radeons floating around in under 2 years with broken fans and no warranty left.
#41
HumanSmoke
RCoonWhen all you offer in terms of ACTUAL new GPU cores is something that runs on a nuclear reactor for power, or a GPU that costs in excess of $1000, no shit are your sales going to drop.
Because discrete sales are built on $1000 cards? You do realise that the consumer add-in board market is damn small to begin with compared to OEM contracts, and $1000 (or even $400, $500, $600+ for that matter) sales of consumer boards are minute in the greater scheme of things.
Here's a breakdown of Mercury Research's market segment analysis from three years ago. Note the sales of entry level boards
RCoonI was looking on ebay at the 290X and the 780ti's, and right now I see no discernible reason to buy NVidia. The 290X's are going cheap as chips, and are within percentage points of 780ti performance.
Mainly because there is a general suspicion of cards that have been run hard in mining. If you're happy that whatever guarantee is offered mitigates any downside, then go for it. Personally, I purchased a Gigabyte GTX 780 GHz Ed. on eBay (new, sealed) for US$440, and it is on par with a 290X for my usage scenario. I wouldn't consider "upgrading" to either the 290X or 780 Ti, much like 99.99999% of people shopping for a graphics upgrade.
RCoonNVidia need to swallow their elitist attitude and cut prices for once, and AMD just plain need to keep going clearing stock.
Are you sure you understand the concept of brand? If Nvidia made the bulk of its revenue from consumer GTX 780 Ti/Titan Black then you'd have a point. The fact that most sales are to OEM's on thin margins, and professional cards for the cream, tends to marginalize the narrow focus of your attention.
RejZoRRehashing old architectures with tiny performance bumps aren't helping sell more stuff....
Like the vast majority of buyers give a shit about architecture? Show me a single person who could differentiate the cache or scheduling structure in a GPU architecture, and I'll show you a thousand who buy based on marketing and "bigger number = better performance".
RejZoRAlso the insane inability to make common standards for gaming makes things even worse. It's preposterous that they failed to create a physics standard for hardware accelerated physics in all these years.
The combined might of Intel and ATI/AMD were supposed to revolutionize physics with HavokFX as I remember. Still waiting...
RejZoRPhysX is useless because it's NVIDIA only and especially since it's not widely used standard tech, they can only use it for gimmicky visual stuff, but they can't use it for core gameplay, because those games just wouldn't even work for half the users then.
So you're singling out PhysX because it's gimmicky, Nvidia, and is...actually used?
Why not rant about the development of Bullet ? or Tokomak ? or ODE ? Havok is used in a lot more games than PhysX and doesn't do appreciably more (less in a lot of simulations) than Nvidia's (also) proprietary engine.
JTristamThis. I've been an NVIDIA customer for, hell, I don't know how long. But one thing always bothered me: the price. I don't know why JHH & Co. always insisted to put high price on their cards.
Says the guy who's specs list SLI'ed GTX Titans. You don't think you maybe answered your own question?
#43
SaltyFish
RejZoRRehashing old architectures with tiny performance bumps aren't helping sell more stuff... Also the insane inability to make common standards for gaming makes things even worse. It's preposterous that they failed to create a physics standard for hardware accelerated physics in all these years. Games could make insane leaps in interactivity with HW physics being standard, surpassing consoles on that front entirely due to better precision with mice. Instead, it's year 2014 and we still run the same shitty physics we had back in 2001 on CPU. It's just idiotiq. PhysX is useless because it's NVIDIA only and especially since it's not widely used standard tech, they can only use it for gimmicky visual stuff, but they can't use it for core gameplay, because those games just wouldn't even work for half the users then.
AMD and Nvidia could also drive demand by simply invigorating the stagnant desktop monitor market. Get desktop monitor manufacturers to produce more monitors with higher refresh rates (120 Hz doubles the usual 60 Hz load), higher resolutions (why do laptops and tablets have such high resolutions available, when desktops are more likely to have the hardware to drive them?), and higher pixel density (not everyone has room for a 27" or 30" monitor; again, something laptops and tablets have but desktops don't). Make a 24" monitor with a 2560x1600 resolution and I'll throw money at it, along with a fancy new graphics card.
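The load scaling behind this argument is simple arithmetic. A quick sketch (illustrative only: it counts raw pixel throughput and ignores per-pixel shading cost, so real GPU load would differ) comparing the configurations mentioned above against a common 1920x1080 @ 60 Hz baseline:

```python
# Compare raw pixel throughput (pixels the GPU must deliver per second)
# for a few monitor configurations against a 1920x1080 @ 60 Hz baseline.
# Illustrative only: real GPU load also depends on per-pixel shading cost.

def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

baseline = pixels_per_second(1920, 1080, 60)

configs = [
    ("1920x1080 @ 120 Hz", 1920, 1080, 120),  # double the refresh rate
    ("2560x1600 @ 60 Hz",  2560, 1600, 60),   # the 24" panel wished for above
]

for name, w, h, hz in configs:
    ratio = pixels_per_second(w, h, hz) / baseline
    print(f"{name}: {ratio:.2f}x the baseline pixel throughput")
```

Either option lands at roughly double the baseline load, which is exactly the kind of demand a GPU vendor would want to create.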
#44
TRWOV
AssimilatorCooling fans don't last if run at 100% speed 24/7; I'm predicting that there is going to be a glut of mined-out Radeons floating around in < 2 years with broken fans and no more warranty.
There's always aftermarket heatsinks.
#45
MilkyWay
Exactly the same reason I'm still on my i5 2500K.
#46
RCoon
HumanSmokeBecause discrete sales are built on $1000 cards ? You do realise that the consumer add-in board market is damn small to begin with compared to OEM contracts, and $1000 (or even $400, $500, $600+ for that matter) sales of consumer boards is minute in the greater scheme of things.
Hey I recognise you, you're that guy that trollbaits in news posts.

Not gonna happen sonny :)
#48
Peter1986C
TRWOVWhy? I had my GPUs run WCG work units for half a year and they work fine to this day. I don't think running them for crypto would be any different.
Smart miners who know what they are doing lower the Scrypt intensity so much that the loads on the cards are roughly the same as those on Folding cards. The vast majority, however (I expect), use the max settings, thinking they'll earn more money that way (not knowing they're hurting the efficiency and the VRMs of their cards).
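For reference, this is the sort of knob being described. A hypothetical cgminer command line, turned down the way a careful miner might run it (the pool URL and worker credentials are placeholders, and exact flags vary by cgminer version):

```shell
# Hypothetical Scrypt mining invocation with the load dialed back.
# --intensity: lower values ease GPU/VRM load at the cost of hashrate.
# --temp-target: throttle toward 75C rather than the mid-90s many miners ran.
# Pool URL, worker name and password below are placeholders.
cgminer --scrypt --intensity 13 --temp-target 75 --auto-fan \
    -o stratum+tcp://pool.example.com:3333 -u worker -p x
```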
#49
xvi
As others have said, I suspect it's just the increased demand now leveling off. AMD would be hurting now if they had made the mistake of spending all that money from the cryptocurrency craze, leaving themselves with nothing to ride out the post-crash. With these high-end, modern, used GPUs going rather cheap, AMD just has to sit and smile while nVidia struggles to remain price-competitive. (I realize I'm comparing used AMD mining cards against new nVidia cards, but seeing as that's what's generally available on the market, I think it's a valid point.)

Yes, mining cards were probably run 24/7 with every last bit of performance tweaked out of them, but I suspect those in the market for a used 290X won't even realize that could be an issue.
#50
john_
All those mining cards are still under warranty, right? In Europe, all electronics have a minimum of two years. So the worst-case scenario, the only scenario where AMD and its partners are going to really, really regret the choice of letting those cards constantly run at 94°C, is if those cards start dying in a few months or a year from now, before the warranty period is over.