I see a bunch of things in this.
For one, the tech works just fine. The incoming signal would be the equivalent of a live HD video stream, which we see all the time. The outgoing signal would be the incredibly basic and tiny inputs from your controller to the host. Not much else.
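Just to put rough numbers on that asymmetry (a back-of-envelope sketch, every figure here is an assumption, not a measurement):

```python
# Back-of-envelope bandwidth comparison for a cloud-gaming session.
# All numbers are assumed for illustration, not measured from any service.

# Downstream: assume a 720p60 encoded video stream at ~5 Mbps.
downstream_mbps = 5.0

# Upstream: controller state polled at 60 Hz, assumed ~32 bytes per packet
# (buttons, sticks, a timestamp, and a bit of header overhead).
packet_bytes = 32
polls_per_second = 60
upstream_mbps = packet_bytes * 8 * polls_per_second / 1_000_000

print(f"downstream ~{downstream_mbps:.1f} Mbps")
print(f"upstream   ~{upstream_mbps:.3f} Mbps")  # ~0.015 Mbps
```

So under those assumptions the upstream side is a few hundred times smaller than the video coming down, which is why the controller traffic is basically a rounding error.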
This was already done with a PS3 over wireless to stream Crysis, and it worked perfectly there on a 54 Mbps connection.
Also, this would likely be done with a server farm of highly optimized GPU clusters. You dial in for a game, the server fires up, and all it needs to do is stream the video out after it processes and renders everything.
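In rough terms, the server side of each session would just be a loop like the sketch below. This is only an illustration under my own assumptions (UDP for input, a stubbed-out renderer/encoder); every function name here is made up, not any real service's API.

```python
import socket
import time

def apply_input(game_state, packet):
    """Feed the latest controller packet into the simulation (stub)."""
    return game_state

def render_and_encode(game_state):
    """Render a frame on the GPU and encode it to video (stub)."""
    return b"\x00" * 1024  # placeholder for an encoded frame

def run_session(client_addr, frame_rate=60):
    # One game session: read tiny input packets, push big video frames out.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 7777))
    sock.settimeout(0.0)               # non-blocking, never stall the frame loop
    game_state = {}
    frame_time = 1.0 / frame_rate

    while True:
        start = time.monotonic()
        try:
            packet, _ = sock.recvfrom(64)    # tiny upstream controller packet
            game_state = apply_input(game_state, packet)
        except BlockingIOError:
            pass                             # no new input this frame
        frame = render_and_encode(game_state)
        sock.sendto(frame, client_addr)      # big downstream video frame
        time.sleep(max(0.0, frame_time - (time.monotonic() - start)))

# run_session(("203.0.113.10", 7778)) would start streaming to that client.
```

The point is that once the farm owns the rendering, the "console" in your living room shrinks to a decoder and a controller.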
This is a console killer, no doubt. There'd be no need for them beyond being glorified BD players and sad excuses for PCs with DLC.
Speaking of DLC, that is what will keep the PC alive when this is released. Nothing can compare to the freedom and moddability of a PC, and PCs will be around for the business sector anyway. On top of that, developers will be releasing their games to run on a server farm, and those builds will be easily portable to PC (compared to a whole farm, a single machine is no big deal).
I'd imagine they'll have issues with laggy input response, but at the millisecond level. There's also the occasional server crash interrupting your game (unless they have a kick-ass failover system like the ones I build). Then there's the lack of anything resembling personalization, unless they give you a vHDD to store things on that the server can access when you load a game. Even then, you'd get save games, but I can't imagine they'd let you download a 500 MB add-on for your Fallout 3, STALKER, CoD, etc.
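On the lag point, here's a rough latency budget for one input-to-screen round trip. All of these numbers are assumptions I'm plugging in for illustration; the real figures depend on your ISP, the data center, and the codec.

```python
# Rough end-to-end latency budget for a streamed frame, in milliseconds.
# Every number is an assumption for illustration, not a measurement.
budget_ms = {
    "controller input upstream": 15,   # client -> data center, one way
    "game simulation + render":  17,   # one frame at ~60 fps
    "video encode":               5,
    "video frame downstream":    15,   # data center -> client, one way
    "client decode + display":   10,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"  {stage}: {ms} ms")
print(f"total input-to-screen: ~{total} ms")   # ~62 ms in this sketch
```

A local PC still pays the simulation/render and display costs, so the streaming-specific penalty in this sketch is the network and codec legs, somewhere in the tens of milliseconds. Noticeable in a twitch shooter, probably not in much else.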
No worries for PC gaming. This might actually help it out, since it would get devs back to coding for x86 instead of consoles.