
Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space

It's a compute card being set up as a general-purpose graphics card.
 
It's a compute card being set up as a general-purpose graphics card.

This may be somewhat of a spoiler, but all GPU architectures nowadays are first and foremost compute accelerators with some fixed-function graphics blocks bolted on, even the ones for mobile SoCs. Compute has priority over graphics; you just have to look at some block diagrams describing their architecture and you'll immediately realize this.
 
The fact that you'd like GPUs to be cheaper doesn't make them overpriced. That's just a price, on a free market; GPU makers can set it however they deem fit.

For a long time we had Nvidia's higher prices and AMD's lower ones. Nvidia was making money; AMD wasn't.
Now AMD has raised prices to Nvidia's level, which is just a confirmation of who was right all along.

Intel isn't exactly known for selling at break-even. Don't expect them to be the cheapest option. :)

Seriously, PC gaming is a cheap hobby anyway. Don't be such a miser...

Not only that, some people were turning up their noses at AMD cards because they were 'cheap', and some went as far as to put down others who couldn't afford Nvidia cards. I've lost count of the 'ooh, look at me flex, I have two Titan Xs in SLI and all I ever turn this PC on to do is browse the web and YouTube' posts.
 
So, a low-end start. I'm actually still curious about performance per watt and such, because if it can be scaled up..
 
All new GPUs will be measured against the 1080 Ti and the 2080 Ti.

All Intel really has to do is come out swinging with a GPU that can ray trace better than the 2080 Ti at 4K 60 fps and costs less than $1,000.
 
To all the "bring competition to AMD/Nvidia" prophets: buckle up for some serious disappointment. I have a feeling the purpose of this demo is to bring down expectations a bit; there have been some wild hopes and dreams pinned on Intel's discrete GPUs.
I think this GPU is too discreet for the demo to tell us anything.
 
All still non-existent GPUs: Nvidia's "+50%", AMD's 5600 XT and "Big Navi", and... this:

Price/performance in a real-world test, and general availability. Or it didn't happen.

My $0.02 on the game chosen: Intel wouldn't put a random game on show. They knew exactly what they were showing; they had at least a decently optimized driver for it, and I wouldn't exclude the possibility that the game itself was quietly patched for the occasion. Yes, a shadow of a doubt, except it's much more than a shadow. This was not a random game someone installed; it was pretty much the best-performing game, with the best and custom driver optimization, on a (granted) engineering sample, but no doubt hand-picked...

We can't say something like 'we witnessed games gain 20%+ from driver optimization' (as in the attached video) in this case... Games have done that, but that was something else entirely. You would never see the first appearance of a new architecture demoed on a game that natively performs badly on that architecture and doesn't have its own driver optimizations.

But: price/performance in a real-world test and general availability first, and then it happened...
 
All new GPUs will be measured against the 1080 Ti and the 2080 Ti.

All Intel really has to do is come out swinging with a GPU that can ray trace better than the 2080 Ti at 4K 60 fps and costs less than $1,000.
I don't think anyone is that pushed with RT yet to make it something people need to chase, and I don't think 4K60 at a grand is a great bar either. Those GPUs will likely be three and a half and two years old respectively by the time something launches, so I don't want to hear about "competes with", I want to hear about "destroys" - for a grand! I seriously doubt Nvidia are going to be idle either.

Really, there is nothing to see here. In fact, given what was demonstrated, I would want an eyewitness to confirm the dude didn't plug the HDMI into the mobo ><!

It is 2020 right? The year they enter the dGPU market? And we are impressed with a shroud? Okay...
 
Let's ask the obvious question: if Intel is struggling to keep up with demand for their CPUs, since all of their fabs are running at full capacity, then where is there room to make a GPU?
 
Let's ask the obvious question: if Intel is struggling to keep up with demand for their CPUs, since all of their fabs are running at full capacity, then where is there room to make a GPU?
They could take it to a different fab. Samsung, for example.
 
They could take it to a different fab. Samsung, for example.
Or TSMC, to push out AMD. :)
But I believe the Samsung partnership has already been leaked (if not officially announced).
Completely underwhelming.
Because you were expecting what? A 2080 Ti competitor? Even your beloved AMD can't make that.

Some people here thought (hoped?) it would take Intel 1-2 more years to come up with a working POC. But it's already here. :)
The fact that you think "overpriced" can be anything but an opinion, and trying to fight that opinion as if it's a fact, is just... silly, to say the least.
It can't be anything other than an opinion, because price perception is always subjective.
Something can be called "overpriced" when it's more expensive than the majority of alternative products. It's about comparing, about statistics.
If you have a single product of a kind - with no competition - it just can't be overpriced.
 
Oh hey we "Intel" are able to make a slick prototype that runs a game! We're still here....

How quaint!
Okay we see you.
 
Since AMD also decided to join Nvidia in overcharging for their low- and mid-range GPUs, this looks like a good time to break up this duopoly. AMD fans sure miss Raja Koduri's breathtaking prices.

Indeed. AMD just isn't cutting it in competition. Everything is going to RT, and AMD still releases non-RT parts, then charges a premium for them, which is ridiculous. Those GPUs were already obsolete the moment they launched. They should have cut the prices to make up for the devaluation.

The day the 104-series chips are back in the upper-midrange price bracket is the day I return to nVidia. At this point AMD can't make that happen. If Intel can, all the best to them.
 
Everything is going to RT, and AMD still releases non-RT parts, then charges a premium for them, which is ridiculous. Those GPUs were already obsolete the moment they launched. They should have cut the prices to make up for the devaluation.
As if the GTX 16 series that nVidia released has RT or Tensor cores in it.
Apparently nVidia believes that it's a waste of die space for current-gen mainstream GPUs.
 
Even for an early prototype it's just too underwhelming, and it's also obvious this is still nowhere near a real product that will be in the hands of people anytime soon.
All the coverage says Intel reps kept saying exactly that. It is a development vehicle - to get developers started on something actually Xe before any real product appears.

While Intel has iGPUs and drivers/software for them, what they are aiming for now is far higher. AMD and Nvidia have been doing this for decades and have a lot of built-up software support that Intel needs to somehow match. This will take a considerable amount of time. Intel is taking every chance to start that process before an actual product is out.
 
It's new and exciting.
I love the idea that something is at least being attempted.
Roll on Computex - hopefully more juicy details and demonstrations.
 
They could take it to a different fab. Samsung for example

Except the GPU was made in Intel's fabs, not someone else's. It would be very time-consuming to port it to another fab.
 
Only 8 PCIe lanes on the PCB and the complete lack of any details still aren't promising, but at least they're showing actual hardware now instead of just renders.

I seriously doubt it's saturating 8 lanes - why rig it up to 16 lanes when it won't use them?
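Rough numbers back that up. A back-of-the-envelope sketch (using the published PCIe 3.0 figures of 8 GT/s per lane with 128b/130b encoding; the card would presumably run Gen3) shows how much one-direction bandwidth even an x8 link gives a low-end card:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth per link width.
# Spec values: 8 GT/s per lane, 128b/130b line encoding.
GT_PER_S = 8.0          # giga-transfers per second per lane (PCIe 3.0)
ENCODING = 128 / 130    # 128b/130b encoding efficiency

def bandwidth_gb_s(lanes: int) -> float:
    """Raw one-direction bandwidth in GB/s for a PCIe 3.0 link."""
    return lanes * GT_PER_S * ENCODING / 8  # divide by 8: bits -> bytes

print(f"x8:  {bandwidth_gb_s(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {bandwidth_gb_s(16):.2f} GB/s")  # ~15.75 GB/s
```

Nearly 8 GB/s each way is plenty for an entry-level part that mostly keeps its working set in local memory, so an x8 edge connector is a cost saving, not a bottleneck.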
 
This is as 'new' for Intel as RDNA is new for AMD. It's just an iteration with fancy packaging. This will end up as a faster IGP.

It does not compete with dGPUs at all, currently and from what we've seen. Can this scale? Sure. Will it scale as well as AMD's or Nvidia's tech? Not by a long shot. The fact it can run some ancient game on low is... well... even Intel IGPs have done that for years. Being capable of 30-60 fps at 1080p low instead of 720p medium is not an achievement in 2020.

What they've shown is a new shroud, really, and lots of plans and powerpoint slides.
 