Sunday, October 30th 2011
Are Improving Integrated Graphics Slowly Killing Off Discrete Graphics Cards?
Intel started the trend of improving integrated graphics with their second-generation LGA1155 Core i3, i5 & i7 line of processors. Depending on the model, these processors sport integrated HD 2000 or HD 3000 graphics right on the processor die, which nowadays give acceptable performance for low-end gaming and can play Full HD 1080p video perfectly. The trend continues with the upcoming Ivy Bridge processors, which will be able to support a massive 4096 x 4096 pixel display, as we reported here. AMD now also have equivalent products with their Llano-based A-series processors. So, where does this leave discrete graphics cards? Well, the low-end market is certainly seeing reduced sales, as there often isn't enough of a performance difference over an IGP nowadays to warrant an upgrade. As integrated graphics improve further, one can see how this will hurt sales of higher-end graphics cards too. The problem is that the bulk of the profit comes not from the top-end powerhouse graphics cards, but from the low- to mid-range cards which allow these companies to remain in business, so cannibalizing sales of these products to integrated graphics could make high-end graphics cards a much more niche product and, crucially, much more expensive to boot.
Hence, it's not surprising to see Digitimes reporting that while NVIDIA are about to produce their next-generation Kepler-based GPUs on TSMC's 28nm process and AMD have already started production of their Southern Islands-based GPUs, the graphics card manufacturers are cautious about jumping in head first with cards based on these new products. Taiwan-based card makers are watching the market before making decisions, according to Digitimes' industry sources:
Compared to the makers' eagerness for the previous-generation GPUs, graphics card makers are rather conservative about the upcoming 28nm chips due to concerns that TSMC's weak 40nm process yield rate issues may re-occur in its 28nm process, and due to weakening demand for graphics cards and lower-than-expected gross margins.

The poor 28nm yield rate isn't helping either:
Although previous rumors have indicated that TSMC's poor 28nm process yield rate could prevent Nvidia from launching its 28nm GPUs on schedule at the end of 2011, TSMC has already announced that its 28nm process has entered mass production, so Nvidia's new Kepler GPUs are expected to be announced in December.

All this, of course, is bad news for PC enthusiasts, who are always looking to upgrade their PCs with the latest technology so that they can run power-intensive tasks such as 3D gaming and distributed computing projects such as Folding@Home. On the plus side, a top-end card like a GTX 580 or HD 6970 will not be integrated into an IGP any time soon, because of the sheer power, heat and die size requirements, so there is still hope that affordable high-end cards will remain available.
What's interesting is that, as AMD are now a combined CPU & GPU company, they know full well that their IGP solutions eat into sales of their own discrete low- to mid-range graphics cards. It will be worth watching AMD's strategy for dealing with this problem closely.
79 Comments on Are Improving Integrated Graphics Slowly Killing Off Discrete Graphics Cards?
IGPs on computers serve 90% of computer users. If somebody wants an upgrade, it's because 1) 3D gamer enthusiast 2) professional graphics software 3) need more/different output options. But really most computer users only need their graphics processor to be able to run the OS GUI.
Basically, graphics built into something that it shares die space and main memory with.
The HD 3200, aka the 780G, was the first chipset to let you play Blu-ray on the PC in hardware, and it also had the best 3D performance at the time; Intel IGPs were horrible back then.
techreport.com/articles.x/14261/8
www.techspot.com/review/233-intel-core-i5-661/page13.html Intel graphics won't run it, and many, many other games too. And AMD do update their drivers on a monthly cycle, duh.
It all depends on the software (game engines in this case). And it has some logic to it as well:
The need for high-end graphics card(s) (more than one) lies in the need to render a certain videogame at its maximum visual potential, so basically its engine... this greed for power is determined by the architecture of the engine itself...
...the turning point is: do you continue making more power-hungry engines, or do you make more efficient ones? IMHO efficiency is the winning card... if you could make a game engine that matches the visual impact of the high-end ones but with less computational demand, you wouldn't really need high-end graphics, now would you?
Sounds almost too good to be true... and how could you achieve that?
I would like to know as well but...seems to me that some people out there are already way down that path:
www.youtube.com/watch?v=00gAbgBu8R4
www.youtube.com/watch?v=1sfWYUgxGBE
...it's just a matter of time ;) true... but isn't that the overall tendency of today's industries (not just videogame companies)? The quality of a product is just one of the major consequences of COMPETITION.
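The efficiency idea discussed above has a concrete, long-standing form in engines: distance-based level-of-detail (LOD), where far-away objects get cheaper meshes instead of full-detail ones. A toy sketch (the distance thresholds and level names are invented for illustration, not from any real engine):

```python
# Toy sketch of distance-based level-of-detail (LOD) selection.
# Thresholds and level names are made up for illustration only.
def pick_lod(distance):
    """Return which detail level to render for an object at `distance` units."""
    if distance <= 10:
        return "high"    # full-detail mesh, close to the camera
    elif distance <= 40:
        return "medium"  # reduced triangle count
    else:
        return "low"     # very coarse mesh or billboard

print(pick_lod(5), pick_lod(25), pick_lod(100))  # high medium low
```

The point of the sketch: most of a scene is usually far from the camera, so rendering cost drops sharply without a visible hit to image quality, which is exactly the "same visual impact, less computational demand" trade-off described above.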
IGPs really threaten the market for OEMs who want to sell their parts to companies like Dell and HP for their budget PCs. There's no point really, since most IGPs can now handle HD video flawlessly. (I remember buying an extra card just for video not even 5 years ago, because that's how bad integrated graphics were, but times have changed.)
Why?
Because any company dumping huge loads of money on discrete Graphics Cards is going to make sure they have a market and a product to fit that market reasonably.
Short example:
Any MP3 device: up to 120GB of storage, etc.
Hard drive: up to 2-3 terabytes of storage.
An MP3 player is used for multiple purposes daily.
A hard drive has one use: store data, and store some more data.
An on-board GPU shares space and is designed as part of a multi-purpose product for the user.
A discrete GPU is designed for ONE (1) use: to process any graphical output that needs to be displayed.
When someone designs either of those products, they will think about different usage scenarios, therefore resulting in an overall difference in performance in different scenarios, guaranteed, BASED ON PRIORITIES.
ATi or nVidia were the ones who started making integrated GPUs capable of playing low-end games for low-end to casual users. AMD's 780G was the first one able to play some of the higher-end games of the day, albeit at reduced resolution and detail settings, which casual gamers don't mind as much as us enthusiasts (casual gamers' preference, from personal experience with those people).
IIRC it was in AMD's roadmap to integrate graphics into the processor long before Intel announced any such thing. Intel just delivered forty dumptrucks full of gold bullion to their R&D department and beat AMD to the punch. Even if Intel was planning to do it before AMD released information about it, they never announced anything. So you could say that Intel delivered on AMD's roadmap before AMD could, with a weaker graphics core. (My google-fu is failing me on this, so I'm going by memory and hence willing to concede this point to anybody who can prove otherwise.)
Even ignoring motherboard graphics, it's only because Intel's integrated graphics started out so damn laughably weak (whether speaking of chipset or processor based) that the current upgrades look good, AMD's graphics have already been at the same level as Intel is "Upgrading to" since they came out. (adjusting for release date and technology improvements, of course. I'm not implying that 780G or AMD's low-end APUs are a match for current integrated, just it's in the same class)
Oh, BTW, quoting marketing fluff like 4Kx4K res support is meaningless. Even today's single-chip enthusiast-class graphics cards would be reduced to a slideshow at that res for anything beyond 2d productivity apps. Intel could claim res all they want, until I see a video recorded at that res playing back fluidly with at least (minimum fps, with no drops below) 24fps on a 4K screen without discrete graphics, I call bullshit. It's like claiming a chevy s-10 can reach 300mph*
*when dropped out of an airplane, with an aerodynamically designed outer shell surrounding it
(wow that whole thing came off a lot more harsh sounding than I meant it to... Oh well, this note's here to point out the fact it wasn't meant to)
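The scepticism about the 4Kx4K claim above can be put into rough numbers. A back-of-the-envelope sketch (the frame rates here are illustrative choices, not measurements): driving 4096 x 4096 even at film-rate 24fps pushes over three times the raw pixels per second of a common 1080p/60Hz desktop mode.

```python
# Back-of-the-envelope pixel-throughput comparison; frame rates are
# illustrative assumptions, not benchmarks.
def pixels_per_second(width, height, fps):
    """Raw pixels a display mode requires per second."""
    return width * height * fps

four_k  = pixels_per_second(4096, 4096, 24)  # claimed max res, at film rate
full_hd = pixels_per_second(1920, 1080, 60)  # typical 1080p desktop mode

print(four_k, full_hd, round(four_k / full_hd, 1))  # 402653184 124416000 3.2
```

And that is just scan-out: actually rendering a 3D scene at that resolution multiplies the shading and memory-bandwidth cost well beyond the raw pixel count, which is why "supports 4096 x 4096" says little about playable performance.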
nowadays, everything has onboard video - even mid and high range products.
the IGP isn't killing off discrete cards, it's killing off ENTRY-level discrete cards.
There is no problem with this: AMD eat into sales not only of their own cards but across the whole market, and I think low-end GPUs will be history in the near future.
As for high-end GPUs, there is no way that any GPU manufacturer in the world will make a high-end GPU as an embedded part.
I was skeptical until I tried it. My i3 does everything except DTS-HD passthrough. It will do 1080p and DTS, Dolby Digital without a hiccup. To beat it you have to go with a 5650 or 430 or above.
A dedicated VGA card is better if you want something better (like performance) over integrated graphics, which are made for all-around needs.
www.techpowerup.com/reviews/ATI/HD_2900_XT/
www.techpowerup.com/reviews/HIS/Radeon_HD_6970/
I think with improving IGPs and better mobile devices, the enthusiast market is diminishing.
I think AMD and Nvidia are both already heading away from the monster-card market, especially when most games are heading for cross-market sales.
I look forward to the return of a PC without the need for an overpriced, overpowered GFX solution... I think the problem is many of you weren't around before you "needed" a discrete GFX solution.
Yes, what you said matches the fact that nvidia will be releasing low end cards first for their next gen, which is opposite to what used to happen. Sad, but true. :(
Being an enthusiast forum, I'm sure we'll be (are already) one of the last holdouts for high-end GFX cards.