Tuesday, May 8th 2018

Intel Could Unveil its Graphics Card at 2019 CES

It looks like Intel is designing its discrete graphics processor at a breakneck pace, with a team put together by Raja Koduri. Development is moving so fast that the company could have a working product to show the world by the 2019 International CES, held in early January next year. Intel's push into discrete graphics is likely motivated by a survival instinct: not falling behind NVIDIA and AMD in super-scalar architectures that cash in on two simultaneous tech booms, AI and blockchain computing.

A blessing in disguise for gamers is the restoration of competition. NVIDIA has been ahead of AMD in PC graphics processor performance and efficiency since 2014, with the latter only playing catch-up in the PC gaming space. AMD's architectures have proven efficient in other areas, such as blockchain computing. NVIDIA, on the other hand, has invested heavily in AI, with specialized components on its chips called "tensor cores," which accelerate the building and training of neural nets.
Source: TweakTown

38 Comments on Intel Could Unveil its Graphics Card at 2019 CES

#26
cadaveca
My name is Dave
W1zzardJust wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
I'm hoping that with all the ATI staff they have hired, they have at least some people with a reasonable amount of experience in making a consumer-facing driver for 3D graphics that doesn't have the same issues their iGPU drivers do. It's really the only reason for all those hires.

You'll note I didn't say AMD employees.... :p;)
Prima.VeraOne stupid question, and apologies in advance for it... :D
Aren't the SHADER cores nVidia and AMD proprietary ONLY?? If so, how can Intel develop a new GPU without them?? Sorry, just asking....
Doing graphics is just doing math. That's why GPUs are so good for "mining". There's no patent on doing math equations that way, and where there is, Intel has licenses for most of the tech it needs but doesn't outright own. There's no story to be told about that side of this at all.
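As a toy illustration of that point (plain Python, with made-up numbers and a deliberately trivial "hash"; no real shader or mining code), shading pixels and grinding through nonces share the same shape of work: one pure arithmetic function mapped independently over a huge batch of inputs, which is exactly what GPUs parallelize.

```python
def shade(pixel):
    # a trivial "pixel shader": scale brightness by 1.2 and clamp to 8 bits
    r, g, b = pixel
    return tuple(min(255, int(c * 1.2)) for c in (r, g, b))

def try_nonce(nonce, target=1000):
    # a trivial "mining" check: hash a candidate and compare against a target
    return hash(("block-header", nonce)) % 2**16 < target

# Same data-parallel pattern in both cases: map a pure function over many inputs.
framebuffer = [(100, 150, 200), (255, 0, 30)]
shaded = [shade(p) for p in framebuffer]             # "graphics"
hits = [n for n in range(10000) if try_nonce(n)]     # "mining"
```

Neither loop depends on any other iteration, which is why the same silicon serves both workloads.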
#27
XiGMAKiD
I agree. With this discrete GPU, Intel can kill more birds with one stone (AI and cryptocurrency, with gaming as a side dish), assuming they can keep up not only the hardware development cycle but also the software support.
#28
DeathtoGnomes
XiGMAKiDI agree. With this discrete GPU, Intel can kill more birds with one stone (AI and cryptocurrency, with gaming as a side dish), assuming they can keep up not only the hardware development cycle but also the software support.
If there was a choice between AI and crypto, I'd think they would lean heavily toward AI. As the saying goes, two's company, three's a crowd; I hope Intel can stay away from crypto "stuff".
#29
bug
sergionographyThat would actually be good for both of them. Someone definitely needs to put cuda in its place.
Easier said than done. Simply because CUDA (as proprietary as it might be) works. I mean, CUDA-enabled programs usually beat OpenCL 2 implementations, and the lack of OpenCL 2 support is (was?) supposed to be a weakness for Nvidia.
#30
InVasMani
I hate the fact that x86 has all this licensing red tape locking out competition, while Intel can just jump right into GPU development.
#31
bug
InVasManiI'm glad you said usually cause sometimes it just can't do basic math correctly lol
Yeah, that was only on Volta and was probably a driver bug. Nothing to do with CUDA itself.
#32
InVasMani
3.5GB the way it's meant to be played...presenting the GTX970 4GB!
#33
Jism
InVasManiI hate the fact that x86 has all this licensing red tape locking out competition, while Intel can just jump right into GPU development.
They made a concept of their own: www.techpowerup.com/241669/intel-unveils-discrete-gpu-prototype-development

It's not infringing on Nvidia's or AMD's tech. It's a free market, and thus they're free to create a new GPU.

Intel has owned much of the market for many years with integrated GPUs, thanks to delivering complete motherboards and chips that don't need a discrete graphics card in the first place. They still hold a great portion of the desktop market. It's just not suitable for, or even comparable with, gaming in the first place.

It's just "2D" and some basic 3D. They have built some IGPs into their CPUs lately (even ones with Vega and HBM), but those are not strong enough to compete in the first place.
#34
medi01
W1zzardJust wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
A Larrabee type of fail? Certainly not. Not bringing exceptional performance? Well, yeah, but so what?
Fail, learn from it, improve is the normal cycle.
If they are serious about GPU presence, I simply don't see what could stop them.

If there is a company on the planet which can enter consumer GPU market, it's Intel.
InVasManiI hate the fact that x86 has all this licensing red tape locking out competition, while Intel can just jump right into GPU development.
I don't follow.
It's understandable to hate the "x86 licensing red tape" (not that we would have real competition even if there were no licensing problems).
But hating it because GPUs aren't similarly limited? Huh?

Intel actually paid nVidia and now pays AMD for GPU patents.
#35
InVasMani
I'm mostly concerned because we've seen Intel abuse its power in the past to lock out competition. I just don't feel like welcoming in trouble, personally. Intel has done some anti-competitive things to both Nvidia and AMD before, so it getting into their primary business is a pretty reasonable concern from a consumer standpoint. Yeah, it could bring short-lived increased competition, followed by a decade of the most incremental updates they can get away with while gradually raising prices. Then again, we've sort of reached that point now anyway, with Nvidia behaving in much the same way. I think the biggest concern is how it might cripple AMD at a point in time when they're really just becoming competitive again, when it's desperately needed for consumers. I view it as a double-edged sword that cuts both ways.
#36
Blueberries
InVasManiI think the biggest concern is how it might cripple AMD at a point in time when they're really just becoming competitive again, when it's desperately needed for consumers. I view it as a double-edged sword that cuts both ways.
The only thing competitive about AMD is their marketing team.
#37
bug
BlueberriesThe only thing competitive about AMD is their marketing team.
That's not true and you know it. At the same time, if every company could succeed only when the competition was playing nicely, we wouldn't have any successful companies at all. AMD needs to man up and play the hand they were dealt. Which they have done, lately.
#38
_UV_
W1zzardJust wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
The previous iteration wasn't a fail; they made big money, dropped support for products released less than 1-2 years earlier, and moved on. Now they're focused on the Internet of Things to make big money on fools who trust in their "technological superiority", so I'm waiting for these products to be rebranded and repurposed for the data center market.