Wednesday, August 15th 2018
Intel Teases Their Upcoming Graphics Cards for 2020
Right in time for SIGGRAPH, the world's leading conference for computer graphics, the team around Raja Koduri and Chris Hook has posted a teaser video on Twitter for Intel's upcoming graphics cards, which are scheduled to become available in 2020. The video is produced in a style typical of what Chris Hook used to release at AMD: it starts with a history lesson praising Intel's achievements in the graphics department, then promises that in 2020, Intel discrete graphics cards "will be set free, and that's just the beginning".
In the comments on the video, Chris Hook, who left AMD to join Intel as head of marketing for their graphics department, said: "Will take time and effort to be the first successful entrant into the dGPU segment in 25 years, but we have some incredible talent at Intel, and above all, a passion for discrete graphics."
You can find the video here.
80 Comments on Intel Teases Their Upcoming Graphics Cards for 2020
I'm very curious to know more about the architecture. That should clarify a lot of things.
I'm still rooting for VIA though
The growth is in deep learning, neural networks, and AI. Intel can make a fortune targeting those clients without gamers.
We'll have to wait and see if Intel makes a similar announcement at Gamescom.
The hype train has departed to a station called Disappointment Ave...
He is another one like Peter Molyneux. Lots of grand, waffling speeches, and very little real-life content to match up to it after the dust settles, yet the media hail him as a visionary.
ARM isn't really taking off, x86-64 is starting to look more difficult than it ever was, IoT is too small and insignificant (plus: full of privacy and security concerns), and everything else is just not interesting for the core business. Graphics, though... that is an area where the market is screaming for progress and where competition is lacking. And it's the one unicorn Intel really never could catch in the right way.
Raja probably convinced Intel to take another shot. Of all people. I don't have high hopes; all this is, is the right voice at the right time, with no product to really back it up.
AMD releases a new GPU. It's subpar like it always is.
Intel then releases their new GPU shortly after that (within a few months), and it's superior to AMD's but still falls slightly behind Nvidia's current offering at the time.
Nvidia releases their product within six months of Intel's and completely demolishes both AMD and Intel, like they always seem to do, but we still pay a slightly higher premium for their cards because Nvidia, and they continue to dominate the GPU marketplace.
- uses die space very efficiently
- can be fully utilized for every type of workload (and remains efficient doing so)
- can work within a limited TDP budget (~250-300W tops)
- can scale well
- does delta compression and efficiently uses memory (a minimal sketch of the idea follows this list)
- has a flexible clock range
- can mitigate heat/throttling issues while keeping performance
.... and another few dozen things we've not seen Intel do yet.
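For anyone unfamiliar with the delta compression item above, here is a minimal sketch of the idea, assuming a toy 4x4 tile format: mostly-uniform tiles are stored as one base value plus small per-pixel deltas. The names (`encode_block`, `decode_block`) and the int8-delta threshold are illustrative only; real GPU delta color compression schemes are far more elaborate and vendor-specific.

```python
import numpy as np

def encode_block(block: np.ndarray):
    """Encode a tile of pixel values as a base value plus small deltas.

    If every delta fits in 8 bits, the tile compresses; otherwise it is
    stored raw. This only illustrates the principle, not any real scheme.
    """
    base = block.flat[0]
    deltas = block.astype(np.int32) - int(base)
    if np.abs(deltas).max() < 128:               # all deltas fit in int8
        return ("delta", base, deltas.astype(np.int8))
    return ("raw", block)

def decode_block(encoded):
    kind = encoded[0]
    if kind == "delta":
        _, base, deltas = encoded
        return (deltas.astype(np.int32) + int(base)).astype(np.uint32)
    return encoded[1]

# A mostly-uniform 4x4 tile (common in rendered images) compresses well:
tile = np.full((4, 4), 0x80, dtype=np.uint32)
tile[1, 2] += 3
assert np.array_equal(decode_block(encode_block(tile)), tile)
print(encode_block(tile)[0])  # "delta": 1 base value + 16 one-byte deltas
```

The win is bandwidth, not capacity: a compressed tile needs far fewer bytes per memory transaction, which is why it shows up in the "efficiently uses memory" column.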
Intel has made a lot of expensive mistakes. For example, they bought McAfee. They also already tried and failed to make a dGPU. The problem is they are trotting out a lot of marketing BS now, and it's not credible. "In 2030 we are going to change the world!!111one" Sure you are.
They don't have a Steve Jobs to pull an iPhone out of his ass and save them; they have... Raja Koduri, and a lot of management issues. Maybe something has changed at Intel and they will be successful. If so, and the dGPU is better than everyone else's, I'll be right there standing in line with you waiting to get one. And maybe I'll see a unicorn today too.
Intel needed to promote Broadwell as a single-threaded specialty chip, like they did with the 7350K, posthumously. A 128 MB L4 cache that almost negates any calls to main memory should have been promoted as such; Intel's shortcoming, therefore, isn't an engineering one.
PS: they have this, which I find interesting. Anything you can sample with a Gaussian, you can use for centroid-domain antialiasing, which I find a good match for building a bilateral filter.
Any 2x antialiasing mode will automatically generate 3 taps. The intermediate boundary is the most prone to artifacting. Bilateral filters have no negative 'intermediary' lobes however much you sharpen, so they can seam any image together without incurring any 'shock' of their own. That makes them a good basis for postprocessing an image, which GPUs naturally do. Literally, if you are into type or video filtering, bilateral is a must for conserving edge detail.
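Since the edge-preserving property is the whole point here, a minimal grayscale bilateral filter sketch may help. Each output pixel is a weighted average of its neighborhood, where the weight is a spatial Gaussian multiplied by a range Gaussian over intensity differences, so pixels across an edge contribute almost nothing. The parameter names (`sigma_spatial`, `sigma_range`) and the square-window layout are illustrative choices, not a reference to any particular GPU implementation.

```python
import numpy as np

def bilateral_filter(img: np.ndarray, radius: int = 2,
                     sigma_spatial: float = 2.0,
                     sigma_range: float = 0.1) -> np.ndarray:
    """Edge-preserving smoothing of a grayscale image with values in [0, 1]."""
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    # The spatial Gaussian is the same for every pixel; compute it once.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_spatial**2))
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            # Range weight: penalize intensity differences, preserving edges.
            rng = np.exp(-((window - img[y, x])**2) / (2 * sigma_range**2))
            weights = spatial * rng
            out[y, x] = (weights * window).sum() / weights.sum()
    return out

# A noisy step edge stays sharp while the flat regions get smoothed:
gen = np.random.default_rng(0)
edge = np.hstack([np.zeros((8, 8)), np.ones((8, 8))])
noisy = edge + 0.05 * gen.standard_normal(edge.shape)
print(np.abs(bilateral_filter(noisy) - edge).mean())  # small residual
```

Because all weights are non-negative, the filter never rings: that is the "no negative lobes" point above, and it is why a bilateral pass can seam regions together without introducing overshoot of its own.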