Monday, November 16th 2015

Q3-2015 VGA Market - AMD Recovers Market Share at NVIDIA's Expense: JPR

Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia, announced estimated PC graphics add-in-board (AIB) shipments and suppliers' market share for Q3'15. Two big introductions came in the third quarter. AMD released its Radeon Fury series of AIBs, powered by its new high-bandwidth memory (HBM) and the company's brand-new Fiji GPU.

At Nvidia's GPU Technology Conference in Japan, the company revealed its next-generation GPU code-named "Pascal." AIBs based on Pascal will have up to 16GB of HBM2 with up to 1 TB/s bandwidth. JPR's AIB Report tracks PC add-in graphics boards (AIBs), which carry discrete graphics chips. AIBs are used in desktop PCs, workstations, servers, and other devices such as scientific instruments. They are sold directly to customers as aftermarket products, or are factory installed. In all cases, AIBs represent the higher end of the graphics industry using discrete chips and private high-speed memory, compared to the integrated GPUs in CPUs that share slower system memory.
The news was encouraging and seasonally understandable: quarter-to-quarter, the AIB market increased 27.6% (compared to the desktop PC market, which increased 7.6%).

On a year-to-year basis, total AIB shipments for the quarter fell 3.9%, a smaller decline than desktop PCs, which fell 8.9%.

However, in spite of the overall decline, due somewhat to tablets and embedded graphics, the PC gaming momentum continues to build and is the bright spot in the AIB market.

The overall PC desktop market increased quarter-to-quarter, including double-attach (the adding of a second, or third, AIB to a system with integrated processor graphics). To a lesser extent, dual AIBs in performance desktop machines with high-end CPUs also added to the quarter-to-quarter increase. Typically, these machines use either AMD's CrossFire or Nvidia's SLI technology.

The attach rate of AIBs to desktop PCs has declined from a high of 63% in Q1 2008 to 39% this quarter.

The quarter in general
JPR found that AIB shipments during the quarter behaved according to past years with regard to seasonality. AIB shipments increased 27.59% from the last quarter (the 10-year average is 12.6%).
  • From last quarter, total AIB shipments increased to 12.0 million units this quarter.
  • AMD's quarter-to-quarter total desktop AIB unit shipments increased 33.3%.
  • Nvidia's quarter-to-quarter unit shipments increased 26.4%. Nvidia continues to hold a dominant market share position at 81.1%.
  • Figures for the other suppliers were flat to declining.
  • Year-to-year, total shipments decreased 3.8%.
  • Market shares this quarter, and the quarter-to-quarter percentage changes for the vendors, are shown in Table 1.
The AIB market now has just four chip (GPU) suppliers, which also build and sell AIBs. The primary GPU suppliers are AMD and Nvidia. There are 48 AIB suppliers, which are customers of the GPU suppliers.
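The reported unit and share figures hang together arithmetically; a quick sanity check (a sketch using only the rounded numbers quoted in the report) shows the implied Q2 total and Nvidia's unit count:

```python
# Sanity check of JPR's reported Q3'15 AIB figures (rounded values from the report).
total_q3 = 12.0      # million AIB units shipped in Q3'15
qoq_growth = 0.2759  # reported 27.59% quarter-to-quarter increase
nvidia_share = 0.811 # Nvidia's reported 81.1% unit share

# Implied Q2'15 total, backed out from the reported growth rate.
total_q2 = total_q3 / (1 + qoq_growth)
print(f"Implied Q2 total: {total_q2:.1f}M units")  # ~9.4M

# Nvidia's implied unit shipments from its share of the 12.0M total.
nvidia_units = total_q3 * nvidia_share
print(f"Nvidia Q3 units: {nvidia_units:.2f}M")     # ~9.73M
```

The ~9.73M Nvidia figure matches the unit counts discussed in the comments below the article.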

In addition to privately branded AIBs offered worldwide, about a dozen PC suppliers offer AIBs as part of a system and/or as an option, and some offer AIBs as separate aftermarket products. We have been tracking AIB shipments quarterly since 1987; volume peaked in 1999 at 114 million units, and 65 million shipped in 2013.

This detailed 69-page report will provide you with all the data, analysis and insight you need to clearly understand where this technology is today and where it is headed. The report can be purchased here.

20 Comments on Q3-2015 VGA Market - AMD Recovers Market Share at NVIDIA's Expense: JPR

#1
Frick
Fishfaced Nincompoop
Aww poor Matrox. :(
Posted on Reply
#2
RCoon
FrickAww poor Matrox. :(
Pretty sure Matrox use AMD chips anyway, they just do more customisation than a standard AIB partner.
Posted on Reply
#4
RejZoR
That's good to hear. Ideal would be 50:50 between NVIDIA and AMD.
Posted on Reply
#5
medi01
Only couple of percent, not that good news.
Posted on Reply
#6
HumanSmoke
medi01Only couple of percent, not that good news.
0.8% gain/loss with the only new SKUs launched in Q3 being Nvidia's GTX 950 and AMD's Fury and Fury Nano.
With all the DX12 and Fury hype I might have expected some more uplift in AMD's market share. At least the AIB market is up. Extrapolating the figures it seems good news all round. AMD's vendors shipped about two and a quarter million boards in the quarter, up just over half a million from Q2. Nvidia shipped 9.73 million (up 2 million on the last quarter). Looks like they'll break 50 million between them for the year.
Posted on Reply
#7
64K
A little good news is better than bad news. I'm pretty confident that Radeon Tech will be around for a while whether owned by AMD or possibly not. iirc Lisa Su said revenue is expected to be down 10% overall in Q4 but that could be mostly due to projected CPU/APU sales and not GPUs. Next year is going to be an exciting year for GPUs. We're finally going to be out of this 28nm rut.
Posted on Reply
#8
FreedomEclipse
~Technological Technocrat~
RCoonPretty sure Matrox us AMD chips anyway, they just do more customisation than a standard AIB partner.
Matrox only cater to the industrial market anyway. Their cards are available to buy at some commercial retailers, though with AMD and Nvidia having their own cards made for that part of the market, competition is stiffer than a dead cat with rigor mortis.
Posted on Reply
#9
Musaab
"Overall GPU shipments increased 9% from last quarter, Nvidia increased 21%, AMD increased 16%, and Intel increased 5%" jonpeddie.com/publications/market_watch/
According to the title of the report, nVidia's share should have increased. Something is not right; I think nVidia lost in the AIB market and AMD lost in the laptop market. That is the only way to make the title right.
Posted on Reply
#10
Uplink10
medi01Only couple of percent, not that good news.
It is not even 1%, it is only 0.8%.
Posted on Reply
#11
rtwjunkie
PC Gaming Enthusiast
ANY percentage increase is good for AMD. Frankly, the smart move would be to have their PR capitalize on this and spin it positively, which would generate even more sales.
Posted on Reply
#12
Musaab
rtwjunkieANY percentage increase is good for AMD. Frankly, the smart move would be to have their PR capitalize on this and spin it positively, which would generate even more sales.
I am not sure they gained anything, because they lost more in the non-AIB sector (all-in-ones, laptops, and any other device using a discrete GPU not in the normal PCIe form).
Posted on Reply
#13
AsRock
TPU addict
FrickAww poor Matrox. :(
Yeah, I remember my 1st one, which had a memory add-on module.
Posted on Reply
#14
suraswami
so next year on that spreadsheet, we can see only Nvidia?

hopefully AMD stays!!
Posted on Reply
#15
Prima.Vera
AsRockYeah, i remember my 1st one which had a memory add-on module.
Same with S3. Good ol' days of Trio and Virge cards :)
Posted on Reply
#16
medi01
RejZoRThat's good to hear. Ideal would be 50:50 between NVIDIA and AMD.
Ideal would be nVidia to turn into underdog for a while, until they give up pushing proprietary crap / bribing game devs.
Posted on Reply
#17
RejZoR
Prima.VeraSame with S3. Good ol' days of Trio and Virge cards :)
I have very good memories of my S3 Savage3D 8MB, especially thanks to the S3 Metal API (kind of S3's version of AMD's Mantle, if you like), which allowed me to run all Unreal-based games at a really high framerate at 1024x768, a pretty high resolution back then, despite the fact the Savage3D wasn't anything to write home about spec-wise compared to some Rage and TNT cards of that time. UT99, Rune, Deus Ex, Undying, The Wheel of Time, they all ran insanely fast at max details and even with exclusive S3TC textures. It was only Quake 3 later that caused some problems with OpenGL and lightmaps, but with vertex lighting, at the expense of visual quality, it was fast again with everything else still cranked up high.
medi01Ideal would be nVidia to turn into underdog for a while, until they give up pushing proprietary crap / bribing game devs.
Or this. Just for long enough to restore balance.
Posted on Reply
#18
Musaab
medi01Ideal would be nVidia to turn into underdog for a while, until they give up pushing proprietary crap / bribing game devs.
I know you don't like nVidia. But why does everyone who dislikes nVidia make it look like the devil incarnate?
nVidia won fair and square. They didn't do what Intel did to AMD's OEMs/ODMs in the K7/K8 days; they didn't even offer cheaper hardware. They won the hearts and minds of their customers and partners.
nVidia sells their hardware at a higher price than AMD does for the same theoretical performance, but people are still willing to pay the extra money. At the same time, nVidia spends some of that extra money to develop new features (GameWorks, PhysX) and pays game companies to embed these features in their games so that you can turn them on and off. So customers get extra features, nVidia gets extra money, and game developers get some extra easy money. No harm has been done. The only thing AMD offered was Mantle, which is a rip-off of DX12, and the idea of direct access to the GPU was already part of every game console since the first PlayStation/Xbox, and nVidia had something similar ten years ago, inherited from 3dfx: the Glide API. So AMD needs to do more if they want customers back. Thanks
Posted on Reply
#19
medi01
Musaab
People buy stuff without much of their heart being involved, why dive into pathetic? Jeez.

I don't have strong feelings for any corp out there, chill down.

You can compete fairly and you can use dirty tricks. Cheating harms not only your competitor, but also the market, hence the customers, hence it's bad.

One could argue that AMD simply could not do that kind of shit, since they never were in the dominant position, well, could be.

Nevertheless, the world would be a better place without nVidia bribing devs to cripple products for competitors, and also without nVidia's proprietary tech (and no, they didn't really 'develop' most of it; e.g. PhysX was bought, and much of their 'development' effort further on was spent on making things incompatible with competing products. In the case of G-Sync, that wasted evil effort is simply monumental: the alternative tech comes free of charge with scaler chips, lol).

I'll ignore nonsense about "Mantle being rip off", seriously, WTH...
Posted on Reply
#20
Musaab
medi01People buy stuff without much of their heart being involved, why dive into pathetic? Jeez.
First of all, your use of the well-known phrase (winning people's hearts and minds) means you are talking from your heart.
If you don't have any proof that nVidia cheated, you should at least admit that truth.
You are right that they bought Ageia and made PhysX available to all of nVidia's customers, so they don't need to buy a separate card, and that is a point for them, because they paid money to give their customers extra features. And by the way, what is left of the original PhysX is no more than 10%.
About G-Sync: you know well that when nVidia developed it there was no other sync tech on the market, and DP 1.2a Adaptive-Sync, AKA FreeSync, became available almost a year later. And still, every professional test of both will tell you that G-Sync is more advanced, despite the more expensive FPGA it uses; people still buy nVidia's cards with G-Sync monitors even though they can get AMD cards with FreeSync for much less.
Posted on Reply