
Intel Teases Their Upcoming Graphics Cards for 2020

Larrabee for Gaming, Part Deux?

It'd be nice to see another player in the gaming graphics arena, so please, by all means! More competition is good! :)

Bring S3, PowerVR/STMicro/VideoLogic back along with Hitachi SuperH!
 
Bring S3, PowerVR/STMicro/VideoLogic back along with Hitachi SuperH!

Nostalgia time. My first PC had a "Cirrus Logic".. I think with a whopping 1MB.. or maybe 4? And a Cyrix CPU. :)
 
Raja wasn't brought in on the ground floor. This had already been under development for a while before he left AMD.

That is likely, especially if the rumors about the cards being targeted at professionals pan out. But if Raja had nothing to do with developing the architecture, why bother bringing him aboard? It's not like his efforts at AMD led to the best GPUs they ever put out...

I'm very curious to know more about the architecture. That should clarify a lot of things.
 
Bring Voodoo back! Stupid Nvidia didn't even get SLI right.
 
Bring Voodoo back! Stupid Nvidia didn't even get SLI right.

..And their cringy ads to boot. :P

[attached image: rudepcad.jpg]
 
That's great news, actually. Competition is great, but I fear that Intel might eat into AMD's market share and actually make things worse, turning it into a two-player game again, just with different players. We certainly don't want that to happen. So, I hope that NVIDIA continues its success, that Intel does really well with its new graphics cards, and that AMD makes a huge comeback as well. If all that happens, it'll be a golden era for gaming graphics. Assuming Intel is targeting that and not professionals only. Fingers crossed.
 
That's great news, actually. Competition is great, but I fear that Intel might eat into AMD's market share and actually make things worse, turning it into a two-player game again, just with different players. We certainly don't want that to happen. So, I hope that NVIDIA continues its success, that Intel does really well with its new graphics cards, and that AMD makes a huge comeback as well. If all that happens, it'll be a golden era for gaming graphics. Assuming Intel is targeting that and not professionals only. Fingers crossed.

This has a big chance of turning into a two-player game again.
 
In fact, I wouldn't be surprised at all if Intel's card has no ROPs or TMUs (blocks that exist mostly for gaming/rendering). Think Xeon Phi, but instead of x86 it's a new SIMD architecture (a whole lot of shaders and not much else).

No matter how hard you try, if you don't include fixed-function hardware blocks like ROPs and TMUs, it will never succeed as a GPU, even if it's meant for professionals. Intel would be shooting themselves in the foot and making the same mistake yet again. They have to get it out of their heads that they can just turn CPUs into GPUs.
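To illustrate the point, here's a minimal sketch (my own illustrative code, not anything Intel has shown) of the bilinear filtering a fixed-function TMU performs on every texture fetch. Emulating even this simplest filter in shader code costs four memory reads plus address clamping and interpolation math per sample, which is why dropping TMUs cripples a chip for real-time rendering:

```c
/* A minimal sketch of software bilinear texture sampling: the work a
 * fixed-function TMU does per fetch. Texture size, layout, and all
 * names are illustrative, not any real GPU's design. */
#include <math.h>
#include <stdio.h>

#define TEX_W 4
#define TEX_H 4

/* Clamp-to-edge texel read; addressing is also TMU work in hardware. */
static float texel(const float *tex, int x, int y)
{
    if (x < 0) x = 0;
    if (x > TEX_W - 1) x = TEX_W - 1;
    if (y < 0) y = 0;
    if (y > TEX_H - 1) y = TEX_H - 1;
    return tex[y * TEX_W + x];
}

/* Bilinear sample at normalized coordinates (u, v) in [0, 1]:
 * four texel reads and two linear-interpolation stages. */
static float sample_bilinear(const float *tex, float u, float v)
{
    float fx = u * TEX_W - 0.5f;
    float fy = v * TEX_H - 0.5f;
    int x0 = (int)floorf(fx);
    int y0 = (int)floorf(fy);
    float tx = fx - (float)x0;
    float ty = fy - (float)y0;

    float c00 = texel(tex, x0,     y0);
    float c10 = texel(tex, x0 + 1, y0);
    float c01 = texel(tex, x0,     y0 + 1);
    float c11 = texel(tex, x0 + 1, y0 + 1);

    float top = c00 + (c10 - c00) * tx;   /* lerp along x, row y0   */
    float bot = c01 + (c11 - c01) * tx;   /* lerp along x, row y0+1 */
    return top + (bot - top) * ty;        /* lerp along y           */
}

int main(void)
{
    const float tex[TEX_W * TEX_H] = {
        0, 1, 2, 3,
        1, 2, 3, 4,
        2, 3, 4, 5,
        3, 4, 5, 6,
    };
    printf("sample(0.5, 0.5) = %.3f\n", sample_bilinear(tex, 0.5f, 0.5f));
    return 0;
}
```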
 
There have always been brilliant minds at Intel. That didn't stop them from utterly failing both times they tried to do a dedicated GPU.



Money doesn't solve everything. See Intel's woes with the 10nm process. Also, see how their first two attempts at dedicated GPUs turned out. They need people who can steer talent and spending in the right direction.

That being said, something's really not right here. It hasn't been that long since Raja defected from AMD to Intel, and building an architecture from the ground up takes a while. It seems Intel is in a hurry to ship this thing, and that's never a good sign.

Even if the hardware is done well, Intel still has to invest a lot in drivers. Their IGP drivers are the worst thing since square wheels, and they'll have significantly less than two years to make the new drivers work well (they need working hardware first). That's a big warning sign.


Again, we can't know these things: whether Koduri joined a project that was already underway, or whether they'll be able to release their cards in 2020. If they do release in 2020 and the cards don't suck, it means they were probably working on this for much longer than Koduri has been there. We just have to wait, but this is Intel we're talking about, and this time doesn't sound like the last time.
 
No matter how hard you try, if you don't include fixed-function hardware blocks like ROPs and TMUs, it will never succeed as a GPU, even if it's meant for professionals. Intel would be shooting themselves in the foot and making the same mistake yet again. They have to get it out of their heads that they can just turn CPUs into GPUs.
That's the point. Intel wants to be selling cards into the $2000+ market, which means compute products for corporate customers. What makes the most sense to me is packaging a monolithic SIMD architecture with an IGP embedded, so it's still technically a GPU but bad at real-time rendering (terrible for gaming, yet still very useful in the professional segment because of its performance at creating pre-renders).

The growth is in deep learning, neural networks, and AI. Intel can make a fortune targeting those clients without gamers.

We'll have to wait and see if Intel makes a similar announcement at Gamescom.
 
You can certainly tell Raja is "working" there...

The hype train has departed to a station called Disappointment Ave...

He is another one like Peter Molyneux: lots of grand, waffling speeches and very little real content to back them up once the dust settles, yet the media hail him as a visionary.
 
You can certainly tell Raja is "working" there...

The hype train has departed to a station called Disappointment Ave...

He is another one like Peter Molyneux: lots of grand, waffling speeches and very little real content to back them up once the dust settles, yet the media hail him as a visionary.
Raja isn't the problem; they also employed the marketing guy responsible for "POOR VOLTA", Chris Hook.
 
You can certainly tell Raja is "working" there...

The hype train has departed to a station called Disappointment Ave...

He is another one like Peter Molyneux: lots of grand, waffling speeches and very little real content to back them up once the dust settles, yet the media hail him as a visionary.
Raja isn't the problem; they also employed the marketing guy responsible for "POOR VOLTA", Chris Hook.

This. This is what you're seeing here, and Intel has already spent many years searching hard for its next big business move.

ARM isn't really taking off, x86-64 is starting to look more difficult than it ever was, IoT is too small and insignificant (plus it's full of privacy and security concerns), and everything else is just not interesting as a core business. Graphics, though... that is an area where the market is screaming for progress and where competition is lacking. And it's that unicorn Intel never really could catch in the right way.

Raja, of all people, probably convinced Intel to take another shot. I don't have high hopes; all this is, is the right voice at the right time, with no product to really back it up.
 
In 2020 they will, at best, have a mid-range GPU versus the competition of that era, and with mediocre drivers that will make their GPU seem even weaker and less efficient. Don't underestimate the driver side of GPUs: AMD was crippled for years until they managed to produce great drivers, and many people were buying NVIDIA because of drivers back then. For real competition from the blue side in the GPU market (gaming-wise, at least), we will have to wait until at least 2022-23, even in the mid-range performance class. They won't get into it by offering cheap GPUs, imho; that would ruin the company's image as a leader. Unless they get hurt heavily in the CPU market and desperately need GPU market share to save the day, as AMD did in the 2013-2017 period.
 
I foresee it happening like this:

AMD releases a new GPU. It's subpar like it always is.
Intel then releases their new GPU shortly after that (within a few months), and it's superior to AMD's but still falls slightly behind NVIDIA's current offering at the time.
NVIDIA releases their product within six months of Intel's and completely demolishes both AMD and Intel, like they always seem to do, but we still pay a slightly higher premium for their cards because, well, NVIDIA, and they continue to dominate the GPU marketplace.
 
In 2020 they will, at best, have a mid-range GPU versus the competition of that era, and with mediocre drivers that will make their GPU seem even weaker and less efficient. Don't underestimate the driver side of GPUs: AMD was crippled for years until they managed to produce great drivers, and many people were buying NVIDIA because of drivers back then. For real competition from the blue side in the GPU market (gaming-wise, at least), we will have to wait until at least 2022-23, even in the mid-range performance class. They won't get into it by offering cheap GPUs, imho; that would ruin the company's image as a leader. Unless they get hurt heavily in the CPU market and desperately need GPU market share to save the day, as AMD did in the 2013-2017 period.
i740 contradicts pretty much everything you just said. Just sayin'.
 
In 2020 they will, at best, have a mid-range GPU versus the competition of that era, and with mediocre drivers that will make their GPU seem even weaker and less efficient. Don't underestimate the driver side of GPUs: AMD was crippled for years until they managed to produce great drivers, and many people were buying NVIDIA because of drivers back then. For real competition from the blue side in the GPU market (gaming-wise, at least), we will have to wait until at least 2022-23, even in the mid-range performance class. They won't get into it by offering cheap GPUs, imho; that would ruin the company's image as a leader. Unless they get hurt heavily in the CPU market and desperately need GPU market share to save the day, as AMD did in the 2013-2017 period.

It's not just drivers you need; it's also an architecture that:

- uses die space very efficiently
- can be fully utilized for every type of workload (and remains efficient doing so)
- can work within a limited TDP budget (~250-300W tops)
- can scale well
- does delta compression and uses memory efficiently (see the sketch after this list)
- has a flexible clock range
- can mitigate heat/throttling issues while keeping performance

.... and another few dozen things we've not seen Intel do yet.
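Since delta colour compression comes up in the list above, here's a minimal sketch of the idea (my own illustrative code, not any vendor's actual scheme): store a block as one base value plus small per-pixel deltas, and fall back to the raw block when the deltas don't fit, saving memory bandwidth whenever neighbouring pixels are similar.

```c
/* A minimal sketch of lossless delta colour compression on an
 * 8-pixel block. Real GPU implementations work on 2D tiles with
 * several delta widths, but the bandwidth-saving idea is the same.
 * All names and sizes here are illustrative. */
#include <stdint.h>
#include <stdio.h>

#define BLOCK 8

/* Try to pack `px` as base + int8 deltas. Returns 1 on success. */
static int compress_block(const uint32_t px[BLOCK],
                          uint32_t *base, int8_t deltas[BLOCK])
{
    *base = px[0];
    for (int i = 0; i < BLOCK; i++) {
        int64_t d = (int64_t)px[i] - (int64_t)*base;
        if (d < -128 || d > 127)
            return 0;            /* deltas too big: store raw instead */
        deltas[i] = (int8_t)d;
    }
    return 1;                    /* 8*4 bytes shrink to 4 + 8 bytes */
}

static void decompress_block(uint32_t base, const int8_t deltas[BLOCK],
                             uint32_t out[BLOCK])
{
    for (int i = 0; i < BLOCK; i++)
        out[i] = (uint32_t)((int64_t)base + deltas[i]);
}

int main(void)
{
    /* Neighbouring pixels are usually similar, so deltas stay tiny. */
    const uint32_t px[BLOCK] = { 0xFF8040, 0xFF8041, 0xFF8042, 0xFF803F,
                                 0xFF8040, 0xFF8043, 0xFF803E, 0xFF8040 };
    uint32_t base, out[BLOCK];
    int8_t deltas[BLOCK];

    if (compress_block(px, &base, deltas)) {
        decompress_block(base, deltas, out);
        int ok = 1;
        for (int i = 0; i < BLOCK; i++)
            ok &= (out[i] == px[i]);
        printf("compressed 32 bytes to 12; roundtrip ok: %d\n", ok);
    }
    return 0;
}
```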
 
Let them fail then, if you think they suck so much. :p

But they shouldn't stop before they even start. What if AMD had such a defeatist attitude 5 years ago?

GPUs aren't Intel's core competency, no pun intended. And if they can't get their CPU manufacturing and design business in order, as indicated by their recent marketing failures, lack of 10nm progress, and even architectural design, then this is truly just hype. That's a realistic assessment, not an attitude, based on their trajectory, past and present.

Intel has made a lot of expensive mistakes. For example, they bought McAfee. They also already tried and failed to make a dGPU. The problem is they are trotting out a lot of marketing BS now, and it's not credible. "In 2030 we are going to change the world!!111one" Sure you are.

They don't have a Steve Jobs to pull an iPhone out of his ass and save them; they have... Raja Koduri, and a lot of management issues. Maybe something has changed at Intel and they will be successful. If so, and the dGPU is better than everyone else's, I'll be right there standing in line with you waiting to get one. And maybe I'll see a unicorn today, too.
 
GPUs aren't Intel's core competency, no pun intended. And if they can't get their CPU manufacturing and design business in order, as indicated by their recent marketing failures, lack of 10nm progress, and even architectural design, then this is truly just hype. That's a realistic assessment, not an attitude, based on their trajectory, past and present.

Intel has made a lot of expensive mistakes. For example, they bought McAfee. They also already tried and failed to make a dGPU. The problem is they are trotting out a lot of marketing BS now, and it's not credible. "In 2030 we are going to change the world!!111one" Sure you are.

They don't have a Steve Jobs to pull an iPhone out of his ass and save them; they have... Raja Koduri, and a lot of management issues. Maybe something has changed at Intel and they will be successful. If so, and the dGPU is better than everyone else's, I'll be right there standing in line with you waiting to get one. And maybe I'll see a unicorn today, too.
If we forget that they have been building integrated GPUs since forever, and that they know best where their manufacturing stands, you are spot on :wtf:
 
The only truth to Intel's failures is a marketing one. Hardware does just what it is told to do. If Intel is to succeed, they need better software support, and based on their software track record I'd say they will succeed on that front.
Intel needed to promote Broadwell as a single-threaded specialty chip, like they did with the 7350K, posthumously. A 128MB L4 cache that almost negates any calls to main memory should have been promoted as such; Intel's shortcoming isn't an engineering one.
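For what it's worth, the "negates calls to memory" claim is just the average-memory-access-time formula at work. Here's a back-of-envelope sketch with assumed, illustrative latencies (not measured Broadwell figures): once the working set fits in the 128MB L4, the hit rate approaches 1 and DRAM latency nearly drops out of the average.

```c
/* A back-of-envelope sketch of average memory access time:
 * t_avg = hit_rate * t_l4 + (1 - hit_rate) * t_dram.
 * The latencies below are illustrative ballpark figures only. */
#include <stdio.h>

int main(void)
{
    const double t_l4   = 40.0;   /* ns, assumed eDRAM access time */
    const double t_dram = 90.0;   /* ns, assumed DRAM access time  */

    /* Average access time as the L4 hit rate climbs. */
    for (double hit = 0.0; hit <= 1.0001; hit += 0.25) {
        double t_avg = hit * t_l4 + (1.0 - hit) * t_dram;
        printf("hit rate %.2f -> avg access %.1f ns\n", hit, t_avg);
    }
    return 0;
}
```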
 
Intel needs to provide a real benefit if they want people to buy their card, as NVIDIA holds the top performance crown and AMD (usually) holds the perf/dollar angle. I doubt Intel is going to price their card at low margins, so it won't be a mainstream line; the high-end segment seems the best place for them to test their new card while maintaining good margins.
 
Intel needs to provide a real benefit if they want people to buy their card, as NVIDIA holds the top performance crown and AMD (usually) holds the perf/dollar angle. I doubt Intel is going to price their card at low margins, so it won't be a mainstream line; the high-end segment seems the best place for them to test their new card while maintaining good margins.
What they have is good software engineering. If they can leverage that and show continuous support for features, they have a winner.
PS: they have this, which I find interesting. Anything you can sample with a Gaussian, you can use for centroid-domain antialiasing, which I find a good match for building a bilateral filter.
Any 2x antialiasing mode will automatically generate three taps, and the intermediate boundary is the most prone to artifacting. A bilateral filter has no negative 'intermediary' lobes however much you sharpen, so it can seam any image together without incurring any 'shock' of its own. That makes it a good basis for post-processing an image, which GPUs naturally do. If you are into type or video filtering, bilateral is a must for conserving edge detail.
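For anyone curious, here's a minimal sketch of the bilateral filter being described, on a 1D signal for brevity (sigma values and all names are illustrative): every weight is the product of a spatial Gaussian and a range (intensity-difference) Gaussian, so all weights stay non-negative, smooth regions get averaged, and hard edges survive.

```c
/* A minimal sketch of a 1D bilateral filter. Each output sample is a
 * normalized sum of neighbours weighted by BOTH spatial distance and
 * intensity difference; all weights are non-negative, which is the
 * "no negative lobes" property mentioned in the post above. */
#include <math.h>
#include <stdio.h>

#define N       12
#define RADIUS  3

static void bilateral_1d(const float *in, float *out, int n,
                         float sigma_s, float sigma_r)
{
    for (int i = 0; i < n; i++) {
        float acc = 0.0f, wsum = 0.0f;
        for (int k = -RADIUS; k <= RADIUS; k++) {
            int j = i + k;
            if (j < 0 || j >= n)
                continue;
            float ds = (float)k;              /* spatial distance   */
            float dr = in[j] - in[i];         /* intensity distance */
            float w = expf(-(ds * ds) / (2.0f * sigma_s * sigma_s))
                    * expf(-(dr * dr) / (2.0f * sigma_r * sigma_r));
            acc  += w * in[j];
            wsum += w;
        }
        out[i] = acc / wsum;   /* wsum > 0: the k=0 term weighs 1 */
    }
}

int main(void)
{
    /* Noisy step edge: values near 0.1, then a jump to values near 0.9. */
    const float in[N] = { 0.10f, 0.12f, 0.08f, 0.11f, 0.09f, 0.10f,
                          0.90f, 0.88f, 0.92f, 0.89f, 0.91f, 0.90f };
    float out[N];

    bilateral_1d(in, out, N, 1.5f, 0.1f);
    for (int i = 0; i < N; i++)
        printf("%.3f ", out[i]);   /* the edge at i = 5/6 stays sharp */
    printf("\n");
    return 0;
}
```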
 