Monday, December 13th 2021
Intel Envisions a Future in which Your Devices Share Hardware Resources
We've had remote gaming for several years now: your laptop with a basic iGPU can stream gameplay as it's rendered on your gaming desktop, either across your home network or across the Internet. We've also seen cloud gaming, where you pay a provider like NVIDIA GeForce NOW to host your digital game licenses and render your games in datacenters, streaming them to your device. What if this idea were turned on its head? What if your laptop held your software and games, and simply streamed close-to-metal data over the network to borrow the hardware resources of other machines? Intel thinks this is possible.
Intel today delivered a fascinating presentation titled "Powering Metaverses" on the sidelines of the Real-Time Conference 2021 (RTC 2021) virtual summit. Dubbed "resource abstraction," the technology Intel is working on intelligently senses compute resources available to a device across other devices on the network; accounts for network bandwidth and latency; and treats those resources as if they were local to the machine. The company showed a conceptual demo of a laptop with a basic iGPU dramatically improving gaming performance by roping in the hardware resources of a gaming desktop on the network, without the game actually being installed on that desktop. If latency-sensitive applications like games can be pulled off on such a system, it bodes well for applications that are less latency-sensitive, or even less bandwidth-sensitive. Resource abstraction will feature heavily as Intel steers toward Web 3.0 and metaverses. The concept video can be found here.
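Intel hasn't published an API for any of this, but conceptually, resource abstraction boils down to discovering devices on the network, measuring link quality, and scheduling work only where it pays off. The sketch below illustrates that idea in Python; the device list, the scoring rule, and every name in it are hypothetical assumptions, not anything shown at RTC 2021:

```python
# Minimal sketch of the scheduling idea behind "resource abstraction".
# Everything here is hypothetical: device discovery is faked with a static
# list, and the scoring rule (prefer compute, penalize latency) is an
# assumption of ours, not Intel's.
from dataclasses import dataclass

@dataclass
class RemoteDevice:
    name: str
    tflops: float          # advertised GPU compute
    bandwidth_mbps: float  # measured link throughput
    latency_ms: float      # measured round-trip time

def discover_devices() -> list[RemoteDevice]:
    # A real implementation would probe the LAN (mDNS, a custom protocol, ...);
    # here we just return a canned topology.
    return [
        RemoteDevice("gaming-desktop", tflops=20.0, bandwidth_mbps=940, latency_ms=1.5),
        RemoteDevice("htpc",           tflops=4.0,  bandwidth_mbps=300, latency_ms=4.0),
    ]

def score(dev: RemoteDevice, frame_budget_ms: float = 16.7) -> float:
    """Usable compute, discounted by how much of the frame budget the link eats."""
    if dev.latency_ms >= frame_budget_ms:
        return 0.0  # too far away to help a real-time workload at all
    link_penalty = 1.0 - dev.latency_ms / frame_budget_ms
    return dev.tflops * link_penalty

if __name__ == "__main__":
    best = max(discover_devices(), key=score)
    print(f"Offloading render work to {best.name} (score {score(best):.1f})")
```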
11 Comments on Intel Envisions a Future in which Your Devices Share Hardware Resources
The key word is conceptual. Even Thunderbolt seriously bottlenecks GPUs; you would need very expensive wired networking gear to pull this off, and that tends to generate quite a bit of heat too. Wireless definitely wouldn't work.
What I don't understand is why you wouldn't just stream from the desktop computer in the first place. Intel's approach has only one advantage over streaming: you wouldn't have to install the game on the desktop computer. Is that enough, considering all the downsides?
I don’t want to call this a dumb idea just yet, but I am definitely thinking it.
It'd be like GeForce NOW / Steam in-home streaming / Moonlight, but mixed with GPU mining, because there the program has to be installed on the desktop.
Their example is getting the GPU power without needing the game/program installed at the desktop end.
Edit: I can't reply lower down (server issue). No, we don't.
Steam in-home streaming requires Steam installed at both ends, and the game at the server end.
This is asking for it to be done the other way around, with a remote machine adding GPU power.
125 MB/s can stream 4K from Netflix... five times over. They're not using this as an HDMI cable here; they're using compressed video data of some kind.
In the realm of "fast enough," gigabit is somewhere between molasses and the heat death of the universe. We already have that; in-home streaming was solved by Valve eight years ago. We can't get two GPUs linked together through a high-speed interface inside one PC to sync properly, and you think Intel is going to achieve this with two entirely different computers, possibly running different architectures, linked over a network? Intel's paper is pure conjecture about "what could happen," and wouldn't be out of place in Disney's "Tomorrowland" from the '60s.
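As a back-of-envelope check on the bandwidth side of this argument, the sketch below compares how long one uncompressed 4K frame takes to cross various links; the link speeds, frame format, and 60 FPS budget are illustrative assumptions, not figures from Intel's presentation:

```python
# Rough numbers for the latency/bandwidth argument above.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at a 60 FPS target

links_mb_per_s = {
    "gigabit Ethernet": 125,    # theoretical maximum
    "10 GbE":           1250,
    "PCIe 4.0 x16":     31500,  # roughly what a local GPU gets
}

# One uncompressed 32-bit 3840x2160 frame is about 33 MB.
frame_4k_raw_mb = 3840 * 2160 * 4 / 1e6

for name, speed in links_mb_per_s.items():
    transfer_ms = frame_4k_raw_mb / speed * 1000
    print(f"{name:18s}: {transfer_ms:7.2f} ms per raw 4K frame "
          f"({transfer_ms / FRAME_BUDGET_MS * 100:6.1f}% of the frame budget)")
```

Gigabit needs roughly 265 ms to move a single raw 4K frame, which is why any scheme like this has to ship compressed data of some kind.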
So out of everyone on the internet, maybe 5% do, and all of them are people who know what that means, and who will usually have a dedicated gaming rig and a decent gaming laptop with Steam in-home streaming already installed.
Congratulations, Intel: you envisioned 2018. It's time to stop dreaming of the past and trying to rebrand or repackage it as your own.
> Steam in-home streaming requires Steam installed at both ends, and the game at the server end.
This is about NOT needing the games installed anywhere else, and not needing a server... just borrowing their GPU power.
> This is asking for it to be done the other way around, with a remote machine adding GPU power.
> 125 MB/s can stream 4K from Netflix... five times over. They're not using this as an HDMI cable here; they're using compressed video data of some kind.
Why reverse which system runs the software, exactly? It seems like they want the client to run just the GPU drivers against hardware sitting in a server, essentially. But why would they want that over the server running both and streaming to the client, which can already be done pretty well and easily?
Also, Intel's presentation slide be like... it's never too late to Mario Party! My whole perspective of the internet has forever changed...
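To put rough numbers on the question a few posts up about running the GPU remotely versus streaming finished frames, here is a crude model; every figure in it (round-trip time, call counts, encode cost) is invented for illustration, not taken from Intel's presentation:

```python
# Crude comparison of two remoting approaches: forwarding GPU API calls to a
# remote machine vs. rendering remotely and streaming encoded frames back.
# All numbers are made up for illustration.

LINK_RTT_MS = 1.0        # LAN round-trip time
CALLS_PER_FRAME = 2000   # draw calls a typical frame might issue
BATCHED_ROUNDTRIPS = 4   # sync points if the protocol batches aggressively
ENCODED_FRAME_MS = 8.0   # encode + transfer + decode for one streamed frame

naive_remoting_ms = CALLS_PER_FRAME * LINK_RTT_MS       # every call waits on the wire
batched_remoting_ms = BATCHED_ROUNDTRIPS * LINK_RTT_MS  # fire-and-forget, few syncs
streaming_ms = ENCODED_FRAME_MS

print(f"naive API remoting : {naive_remoting_ms:8.1f} ms/frame")
print(f"batched remoting   : {batched_remoting_ms:8.1f} ms/frame")
print(f"frame streaming    : {streaming_ms:8.1f} ms/frame")
```

Naive call-by-call forwarding is hopeless; whatever Intel has in mind would have to batch GPU work aggressively to compete with plain frame streaming.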
They never said they wanted to limit it to one GPU max.
This could be something they envision powering NVIDIA's GeForce NOW server racks, with the load spread across multiple machines; or a workplace where everyone gets ARM-based Apple products with no real horsepower but can remotely add grunt on demand from servers elsewhere in (or out of) the office.
I wouldn't mind having one big GPU in the house and the rest on iGPUs, knowing the GPU power could be streamed around as needed.
Too niche, perhaps, but as a concept I like shared resources.
It seems like they could've presented the idea better. I think things will head in this direction eventually, though, out of ease and necessity, as opposed to building a server bigger and bigger and ever more interconnected. Much like with yields on larger chip dies, you can only expand so much, and so easily, before alternatives start to make better sense.