
Intel Unveils Discrete GPU Prototype Development

Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
I could see Intel really carving out a decent niche in the lower-mid to lower-enthusiast set of products.

1) Not quite as strong as Nvidia/AMD's halo products
2) Not quite as good price/perf as AMD's midrange
3) But industry-leading perf/watt.

For the time, the 10-25w IvyBridge-Broadwell had absolutely incredible perf/watt. But they just couldn't scale them up efficiently past even ~50w...
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
This isn't Raja's first step. He just joined the company last month, no way he did these designs in a month. Raja will have a hand in their next gen architecture and current / future drivers.
This is a hatchet job to see what they have, hence the FPGA interface. You may be right, but in timescale terms this prototype is actually on point for Raja to have had a hand in. But yes, I guess I'd agree someone at Intel could have had this idea a little bit longer than a month ago; possibly, but then the execution isn't all that.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Assuming they can throw enough money on the problem and come out with good GPUs, of course.

Given their track record, they failed to apply that strategy numerous times. Unless the money was thrown towards bribing OEMs.

Play fair dude, this is likely Raja's very first step towards something

Raja isn't a one-man army. He can't do miracles; people that are knowledgeable and skilled enough in this area are incredibly rare, most of them are working for AMD and Nvidia already, and the rest at Qualcomm and ARM. Intel is more than likely pursuing a compute-oriented GPU to compete against Nvidia in the datacenters.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Given their track record, they failed to apply that strategy numerous times. Unless the money was thrown towards bribing OEMs.



Raja isn't a one-man army. He can't do miracles; people that are knowledgeable and skilled enough in this area are incredibly rare, most of them are working for AMD and Nvidia already, and the rest at Qualcomm and ARM. Intel is more than likely pursuing a compute-oriented GPU to compete against Nvidia in the datacenters.
I disagree in part, because I think calling it a compute-oriented GPU could be both right and a simplification. It's right if they go the traditional discrete GPU route, but I'm half expecting something a bit more now I've seen this.

If they leverage the experience they have putting an FPGA to work in the data center, they could have an accelerator for the whole computer instead of just a graphics or compute accelerator.

After all, the ease of reconfiguring FPGAs makes them the ultimate possible accelerator of anything.

This could sidestep any graphics performance gap by increasing common adoption of FPGA APIs, obviously within DirectX.
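To make the "FPGA APIs" idea a bit more concrete: Intel's FPGA tooling already exposes boards through OpenCL, where they typically show up as accelerator-class devices alongside GPUs. The sketch below is just my own minimal host-side illustration of that model (enumerating GPUs and accelerators through one API), not anything taken from the prototype documents.

```c
/* Minimal OpenCL host-side sketch: enumerate GPU and ACCELERATOR devices
 * on every platform. FPGA OpenCL runtimes generally report boards as
 * CL_DEVICE_TYPE_ACCELERATOR, so an application written against a generic
 * compute API can treat "graphics" and "FPGA" devices uniformly.
 * Illustrative only; error handling trimmed. */
#include <stdio.h>
#include <CL/cl.h>

static void list_devices(cl_platform_id platform, cl_device_type type, const char *label)
{
    cl_uint count = 0;
    if (clGetDeviceIDs(platform, type, 0, NULL, &count) != CL_SUCCESS || count == 0)
        return;

    cl_device_id devices[8];
    if (count > 8) count = 8;
    clGetDeviceIDs(platform, type, count, devices, NULL);

    for (cl_uint i = 0; i < count; i++) {
        char name[256] = {0};
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("%s: %s\n", label, name);
    }
}

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; p++) {
        list_devices(platforms[p], CL_DEVICE_TYPE_GPU,         "GPU");
        list_devices(platforms[p], CL_DEVICE_TYPE_ACCELERATOR, "Accelerator (e.g. FPGA)");
    }
    return 0;
}
```

A runtime that treats "accelerator" and "GPU" this interchangeably is roughly what an FPGA-backed compute/graphics path would have to look like from the application's side.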

All chip companies are both diversifying and adopting more modular, many-accelerator designs; an FPGA usurps a lot of that in one package, so it's only a matter of time before we see them in consumer land.

All opinions on possibilities, too early to know tbh.
 
Joined
Feb 24, 2009
Messages
3,516 (0.61/day)
System Name Money Hole
Processor Core i7 970
Motherboard Asus P6T6 WS Revolution
Cooling Noctua UH-D14
Memory 2133Mhz 12GB (3x4GB) Mushkin 998991
Video Card(s) Sapphire Tri-X OC R9 290X
Storage Samsung 1TB 850 Evo
Display(s) 3x Acer KG240A 144hz
Case CM HAF 932
Audio Device(s) ADI (onboard)
Power Supply Enermax Revolution 85+ 1050w
Mouse Logitech G602
Keyboard Logitech G710+
Software Windows 10 Professional x64
Intel is a CPU company first.

The i740 was a terrible GPU even for its day. It would be like calling a GT 1030 a good midrange GPU.

Intel cannot make a GPU and never will, because it's a CPU company first.
 
Joined
Jul 5, 2013
Messages
27,752 (6.67/day)
It would be cool if we got a 3rd GPU player on the graphics market.
I think the best Intel can hope for is mid-tier standings. Of course that's a big market, and if they can get a GPU that provides respectable performance, people might take them a bit more seriously.

My "Yes" vote was reserved and contingent on the idea that Intel is going to be competitive in the entry and mid-tier levels, which currently they aren't.
 

bug

Joined
May 22, 2015
Messages
13,761 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I could see Intel really carving out a decent niche in the lower-mid to lower-enthusiast set of products.

1) Not quite as strong as Nvidia/AMD's halo products
2) Not quite as good price/perf as AMD's midrange
3) But industry-leading perf/watt.

For the time, the 10-25w IvyBridge-Broadwell had absolutely incredible perf/watt. But they just couldn't scale them up efficiently past even ~50w...
Those were probably never meant to scale past their current TDPs anyway. Intel has Iris Pro for those that need more juice. Almost as rare as hen's teeth as far as availability is concerned, but the option is out there.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I always knew Intel could get into the discrete graphics card market if it wanted to and have said so before. I’m also confident that they can give NVIDIA decent competition in time.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Those were probably never meant to scale past their current TDPs anyway. Intel has Iris Pro for those that need more juice. Almost as rare as hen's teeth as far as availability is concerned, but the option is out there.

Well, that's probably what Intel would say in damage control mode lol. But they would be lying....

There were side-by-side comparisons of a GTX 650 and Iris Pro graphics where they said "You can't tell the difference!" Then they announced they had a GPU with double the EUs coming out the next generation (clearly trying to imply that they would be in the midrange segment for the first time ever). They wanted to get to the level where they could be in most gaming laptops, but they couldn't, and so AMD is providing them with their Vega M products.


P.S. Also keep in mind that even to get to "GTX 650 levels," Intel had to include ludicrously expensive eDRAM. So sure, they beat AMD for a year by ~30%, but their product was using 14nm instead of 28nm, and it cost 2-3x as much to make lol. That's not sustainable....
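For a rough sense of that claim, take the ~30% performance edge and the midpoint of the "2-3x" cost figure quoted above; the arithmetic below is purely illustrative, not based on any real cost data.

```c
/* Back-of-the-envelope perf-per-manufacturing-dollar comparison using the
 * rough figures from the post above (~30% faster, ~2.5x the cost to make).
 * Purely illustrative; these are not real cost numbers. */
#include <stdio.h>

int main(void)
{
    double perf_ratio = 1.3;  /* ~30% faster than the competing part */
    double cost_ratio = 2.5;  /* midpoint of "2-3x as much to make" */

    /* Relative value delivered per manufacturing dollar. */
    printf("perf per dollar vs. competitor: %.2fx\n", perf_ratio / cost_ratio);
    return 0;
}
```

That works out to roughly 0.5x, i.e. about half the performance per dollar spent on silicon, which is the "not sustainable" part.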
 
Joined
Mar 24, 2012
Messages
533 (0.12/day)
There already are a lot more GPU makers than Nvidia and AMD, but none of them are for PCs.

ARM, Imagination Technologies, Qualcomm and Vivante all make GPUs, as well as, technically, S3/VIA...
I most likely missed some companies as well, but the problem is, none of them can keep up with Nvidia due to the amount of money Nvidia is throwing at R&D of new graphics architectures.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.

Sometimes I don't think it's about money. Companies like Qualcomm or Intel have the money needed for the required R&D, but the problem is Nvidia heavily focusing on something many hardware makers did not like: software.
 
Joined
Sep 15, 2016
Messages
484 (0.16/day)
Intel actually has a surprisingly good R&D department, so if they roll something out it's going to have a lot more engineering and thought put into it than just "similar performance at a reduced cost."

I'll agree with other comments in this thread that I'd be hard pressed to believe they will catch up to nVidia anytime soon but if they release something I can almost guarantee it will be something interesting and different.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Intel actually has a surprisingly good R&D department

When they aren't blowing 7 Billion Dollars on McAfee lol.

They could have used just THAT dumb expenditure to make an entire GPU line-up, or possibly even a new CPU architecture that wouldn't have been curb-stomped by Zen...
 
Joined
Jun 15, 2016
Messages
1,042 (0.34/day)
Location
Pristina
System Name My PC
Processor 4670K@4.4GHz
Motherboard Gryphon Z87
Cooling CM 212
Memory 2x8GB+2x4GB @2400GHz
Video Card(s) XFX Radeon RX 580 GTS Black Edition 1425MHz OC+, 8GB
Storage Intel 530 SSD 480GB + Intel 510 SSD 120GB + 2x500GB hdd raid 1
Display(s) HP envy 32 1440p
Case CM Mastercase 5
Audio Device(s) Sbz ZXR
Power Supply Antec 620W
Mouse G502
Keyboard G910
Software Win 10 pro
So many comments :p
 
Joined
Jun 21, 2011
Messages
165 (0.03/day)
There already are a lot more GPU makers than Nvidia and AMD, but none of them are for PCs.

ARM, Imagination Technologies, Qualcomm and Vivante all make GPUs, as well as, technically, S3/VIA...
I most likely missed some companies as well, but the problem is, none of them can keep up with Nvidia due to the amount of money Nvidia is throwing at R&D of new graphics architectures.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.

That's not 100% correct. All GPU IP can be used on a PC, it is not "x86-specific", and we've seen examples of x86 vendors licensing GPU IP for their low-budget kit (e.g. Intel & Imagination). But each GPU developer has their own reasons not to go into the PC market, and it's not for lack of budget. Qualcomm, ARM and Imagination could afford it (and God knows Imagination needed to diversify, but they opted for the easy way out: sell to the Chinese).

1 - It doesn't serve their strategy to diversify into the discrete desktop GPU market
2 - Their development focus is in low-power, mobile devices. Shifting to desktop discrete might upset the status quo and the investment outlook.
3 - Their GPU strategy is to supplement their SoC strategy in exclusivity (i.e. Adreno)
4 - reasons...

Basically you need someone with big balls and deep pockets to take on the powers that be...
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.46/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Two things strike me as odd in these documents:
1) Why a dual chip solution? Why separate the computer interfacing completely from the GPU? Was one chip borrowed from some other project (e.g. wifi controller) so they saved time/money by piggybacking on the basics of existing tech? This doesn't make sense by itself.

2) They are really focused on keeping power consumption to a minimum. This point is particularly interesting because it tells us where their focus is: portable/low power devices. My first thought was Atom+4K which they're already doing. Second thought is that Intel is very concerned about Vega M powered Ryzen chips. That doesn't explain why Intel is pursuing discrete graphics though.

Then it struck me: Intel wants to divorce themselves completely from NVIDIA which means no more Intel/NVIDIA Optimus systems. AMD sucks right now in the laptop GPUs market so, other than crawling back to NVIDIA, Intel's only option is to scale up their IGPs to match NVIDIA's in gaming/workstation laptops.

Circling back to #1, why the two chip solution? Intel owns the CPU and chipset. They're soon going to own the GPU too. Intel likely kept them separate so they could create a new interface (replacing the generic PCIE controller) tailored to the dual GPU configuration (IGP + IGP expansion card). In a corporate environment, IT could order 100 units of the same computer with 20 IGP expansion cards, then only add the card to computers that are destined for people that need it. Presumably there is no additional setup, and the operating system will only see one GPU with expanded capabilities. The IGP can enable/disable EU clusters as demand increases/decreases.
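To illustrate the enable/disable idea, and only as a conceptual sketch (the structure, names and thresholds below are invented, not anything from Intel's documents), a driver-side policy could look something like this:

```c
/* Conceptual sketch of the "IGP + IGP expansion card" idea: a driver-level
 * policy that powers EU clusters up or down with demand, using hysteresis
 * so it doesn't thrash. All names and thresholds are hypothetical; this
 * does not reflect any actual Intel driver interface. */

#define UP_THRESHOLD    85   /* % utilisation that un-gates another cluster */
#define DOWN_THRESHOLD  40   /* % utilisation that gates one off again */

typedef struct {
    int total_clusters;      /* clusters physically present (IGP + card) */
    int active_clusters;     /* clusters currently powered and scheduled */
} gpu_topology_t;

/* Called periodically with the average utilisation of the active clusters. */
static void rebalance(gpu_topology_t *gpu, int utilisation_pct)
{
    if (utilisation_pct > UP_THRESHOLD && gpu->active_clusters < gpu->total_clusters)
        gpu->active_clusters++;          /* bring another EU cluster online */
    else if (utilisation_pct < DOWN_THRESHOLD && gpu->active_clusters > 1)
        gpu->active_clusters--;          /* power-gate one to save energy */

    /* The OS still sees a single GPU; only its effective width changes. */
}
```

The gap between the two thresholds is just there so the policy doesn't flip clusters on and off every frame.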

It makes perfect sense. i7-8809G may have tested that new interface too. They could also package the MCM on a PCB to be hooked into desktop PCIE slots. I'm still thinking there has to be something unique about the PCIE implementation here--perhaps a dedicated low latency slot that removes most of the overhead by having a dedicated and constrained PCIE controller (only talks to one device).
 

bug

Joined
May 22, 2015
Messages
13,761 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
About that #1, it's a PoC, I wouldn't read too much into it just yet.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.46/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
If it was entirely benign, they wouldn't have mentioned it in the IEEE documents.
 
Joined
Jun 21, 2011
Messages
165 (0.03/day)
Two things strike me as odd in these documents:
1) Why a dual chip solution? Why separate the computer interfacing completely from the GPU? Was one chip borrowed from some other project (e.g. wifi controller) so they saved time/money by piggybacking on the basics of existing tech? This doesn't make sense by itself.

2) They are really focused on keeping power consumption to a minimum. This point is particularly interesting because it tells us where their focus is: portable/low power devices. My first thought was Atom+4K which they're already doing. Second thought is that Intel is very concerned about Vega M powered Ryzen chips. That doesn't explain why Intel is pursuing discrete graphics though.

Then it struck me: Intel wants to divorce themselves completely from NVIDIA which means no more Intel/NVIDIA Optimus systems. AMD sucks right now in the laptop GPUs market so, other than crawling back to NVIDIA, Intel's only option is to scale up their IGPs to match NVIDIA's in gaming/workstation laptops.

Circling back to #1, why the two chip solution? Intel owns the CPU and chipset. They're soon going to own the GPU too. Intel likely kept them separate so they could create a new interface (replacing the generic PCIE controller) tailored to the dual GPU configuration (IGP + IGP expansion card). In a corporate environment, IT could order 100 units of the same computer with 20 IGP expansion cards, then only add the card to computers that are destined for people that need it. Presumably there is no additional setup, and the operating system will only see one GPU with expanded capabilities. The IGP can enable/disable EU clusters as demand increases/decreases.

It makes perfect sense. i7-8809G may have tested that new interface too. They could also package the MCM on a PCB to be hooked into desktop PCIE slots. I'm still thinking there has to be something unique about the PCIE implementation here--perhaps a dedicated low latency slot that removes most of the overhead by having a dedicated and constrained PCIE controller (only talks to one device).

I guess the dual chip plays to Intel's own experience in CPU design, where they separate a number of functions from the core in what they call the "system agent" (or uncore), mostly power management and memory management.

In the old days you'd have CPU+North Bridge+South Bridge, then Intel moved the north bridge onto the CPU. CPUs now have their own memory controller, PCIe controller, etc... That's what Intel is used to doing.

In this case, from what they're showing here, it seems they are using a modular design for expedience to show the proof of concept. Whatever route they take down the line will determine if this will be a single piece of silicon or a bunch of chips soldered together.

However, don't cream your pants just yet. This PoC is not a new architecture but a tweaked Gen9LP chip (HD Graphics 5xx-series) attached to a system agent, and with integrated voltage regulators to better manage power and frequencies... they're just trying to optimize efficiency, rather than create an entirely new GPU design. Apparently they have low-power cores and *cough* "high-performing" *cough* cores (EUs), much like ARM has light- and heavy- workload cores. This design doesn't change anything architecturally (i.e. the GPUs are as crappy as before), but it does bring in power savings and extended battery life (where battery life is relevant).
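Purely as a sketch of what per-EU-group voltage/frequency management could look like (all names, tables and thresholds here are invented for illustration; this is not Intel's actual scheme):

```c
/* Hypothetical per-EU-group DVFS: pick a voltage/frequency operating point
 * from a table depending on whether the low-power or the high-performance
 * EU array should service the workload. Names and values are invented. */

typedef enum { EU_LOW_POWER, EU_HIGH_PERF } eu_group_t;

typedef struct {
    int mhz;        /* target frequency */
    int millivolts; /* voltage requested from the integrated regulator */
} opp_t;            /* "operating performance point" */

static const opp_t low_power_opps[] = { {300, 650}, {450, 700}, {600, 750} };
static const opp_t high_perf_opps[] = { {600, 750}, {900, 850}, {1150, 950} };

/* Pick an EU group and an operating point from a coarse load estimate (0-100). */
static opp_t select_opp(int load_pct, eu_group_t *group)
{
    if (load_pct < 30) {                  /* light work stays on the small EUs */
        *group = EU_LOW_POWER;
        return low_power_opps[load_pct / 15];   /* index 0 or 1 */
    }
    *group = EU_HIGH_PERF;                /* heavier work moves to the big EUs */
    int idx = (load_pct - 30) / 30;             /* 0, 1 or 2 */
    if (idx > 2) idx = 2;
    return high_perf_opps[idx];
}
```

The point is simply that on-die regulators let each EU group run at its own operating point, which is where the claimed power savings would come from.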

Superimpose this on the PCIe interface and you can optimize any type of interface: EMIB, add-in card, etc...

It's a bit "meh" to be honest, but Raja probably told people they needed to do *something* for IEEE.
 

iO

Joined
Jul 18, 2012
Messages
529 (0.12/day)
Location
Germany
Processor R7 5700x
Motherboard MSI B450i Gaming
Cooling Accelero Mono CPU Edition
Memory 16 GB VLP
Video Card(s) AMD RX 6700 XT Accelero Mono
Storage P34A80 512GB
Display(s) LG 27UM67 UHD
Case none
Power Supply Fractal Ion 650 SFX
It looks a lot more like a test vehicle for their iGPU in discrete form, which they might build every time they test and tune a new gen rather than fabbing the whole CPU...
 
Joined
Nov 29, 2016
Messages
670 (0.23/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
this is likely Raja's very first step towards something
You realize Raja is just tasked with picking up all the pieces and interesting tidbits Intel has developed and tried (and holds patents on) over the years, and will now be sorting through all that to see how it might string together into something that could be brought to a discrete GPU offering (without stepping on Nvidia/AMD proprietary technologies and ending up in litigation) and start making ROI on all of Intel's time and investment. (that's a nice run-on!)
 

GAR

Joined
Aug 20, 2008
Messages
138 (0.02/day)
Processor Intel Core i7 4770K @ 4.6ghz
Motherboard MSI Z87-GD65 Gaming
Cooling Corsair H100i
Memory 16GB Corsair Vengeance Pro 2400MHZ
Video Card(s) AMD Radeon R9 290X
Storage Samsung 500GB SSD
Display(s) Asus VG248QE 144HZ
Case Corsair Air 540
Audio Device(s) Creative Labs ZxR
Power Supply Corsair HX1050
Software Windows 8.1 Pro
Larrabee? Is that you? Back from the dead?
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
You realize Raja is just tasked with picking up all the pieces and interesting tidbits Intel has developed and tried (and holds patents on) over the years, and will now be sorting through all that to see how it might string together into something that could be brought to a discrete GPU offering (without stepping on Nvidia/AMD proprietary technologies and ending up in litigation) and start making ROI on all of Intel's time and investment. (that's a nice run-on!)
Read the thread, pal, and realise I know; I commented as much after that post.

I said it in that post; did you read even one line?

Play fair dude, this is likely Raja's very first step towards something,




"ie intels present best igpu with hybrid shaders and an fpga added to make up for all thats missing if they don't fit cpu cores and more importantly the rest of its supporting circuitry related to specific purposes ie power control but, the fpga mainly acts as an interface between intels proprietary inter chip interface type and pciex since it's clear the igpu was designed without pciex in mind it makes sense to test what they can expect before scaling up the design, they won't actually make many of these even for themselves, it's a stepping stone chip clear as day."


Also

The shop-bought ones are four or five years out imho, and certainly won't be a dual-chip solution, maybe MCM though.
Because that FPGA could possibly be a game changer if used to pump the GPU's core performance intelligently and/or to directly affect a new API for mainstream use-case acceleration. Intel already do FPGAs in servers, so they know it's got legs.

Later on I said this too,

I'm half expecting something a bit more now I've seen this.

If they leverage the experience they have putting an FPGA to work in the data center, they could have an accelerator for the whole computer instead of just a graphics or compute accelerator.

After all, the ease of reconfiguring FPGAs makes them the ultimate possible accelerator of anything.

This could sidestep any graphics performance gap by increasing common adoption of FPGA APIs, obviously within DirectX.

All chip companies are both diversifying and adopting more modular, many-accelerator designs; an FPGA usurps a lot of that in one package, so it's only a matter of time before we see them in consumer land.


So I clearly get that Raja grabbed whatever was near; it didn't have external connections, just inter-chip ones, so they added an FPGA, which coincidentally could be a viable co-processor itself.

Seems another news aggro site agrees with me, but they have a name for the purpose.

Edge computing, not gaming per se; that would be a shame.
 