This is pretty cool to see! I personally don't think it will catch on too widely in my circles; 40-80ms of input delay is not a trivial number, and you can see it clearly in the video footage. It's definitely found a niche with this guy, though, and probably many others as well. Even if I share some of the sentiments from the other comment, it's pretty amazing what we're able to do with the tech.
On a side note, is that laptop connected over wifi? I've found that even "in-home" streaming from my computer to a laptop can get pretty choppy if I'm not sitting in the living room in front of the router (and sometimes even that isn't enough).
ISP network engineer here: I really don't see this ever being practical in terms of latency, even if the client-to-GPU distance works out to 5ms or less. That is still an eternity compared to the latency and performance of a GPU on a local PCI Express 3.0 x16 motherboard bus, adjacent to the CPU, RAM, and M.2 SSD. Just a distance like Seattle to Portland would be a latency killer. Even the routing/fiber path from Issaquah to Seattle would be too much, at 2.5ms.
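For scale, here's a back-of-the-envelope sketch of raw propagation delay over fiber: light in glass travels at roughly two-thirds of c, so about 200 km per millisecond one way. The distance figure is a rough assumption, and real fiber paths are longer and add router/queuing delay on top.

```python
# Rough fiber propagation delay; ignores routing, queuing, and serialization.
FIBER_KM_PER_MS = 200.0  # ~2/3 the speed of light in vacuum

def one_way_ms(distance_km):
    """One-way propagation delay over a straight fiber run."""
    return distance_km / FIBER_KM_PER_MS

def round_trip_ms(distance_km):
    """Round-trip propagation delay (there and back)."""
    return 2 * one_way_ms(distance_km)

# Seattle-Portland is roughly 280 km in a straight line (assumed figure).
print(round(round_trip_ms(280), 1))  # propagation alone, before any routing overhead
```

Even in this best case you're paying a couple of milliseconds before a single router has touched the packet, which is why the Issaquah-to-Seattle hop already costs 2.5ms in practice.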
It's not uncommon for screens to have input latencies in the 50-100ms range - John Carmack tweeted "I can send an IP packet to Europe faster than I can send a pixel to the screen. How f’d up is that?". [0]
Given that people are generally happy to accept that kind of latency for a frame getting from their GPU to photons, I don't see why an extra ~10ms of latency getting a frame back and forth from a local datacenter would be a dealbreaker except for the most demanding scenarios. It does require that the datacenter is close to the user though.
> The latency to the screen is fixed with no jitter. The human mind can adapt.
In addition, latency is cumulative and after a certain threshold becomes extremely annoying. Adding network delays to the already-present latency elsewhere in the system just makes you more likely to hit that threshold.
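To make the cumulative point concrete, here's a hypothetical end-to-end latency budget for one streamed frame. Every figure is an illustrative assumption, not a measurement:

```python
# Illustrative latency budget for cloud gaming; all numbers are assumptions.
budget_ms = {
    "input device + OS":        10,
    "game sim + render":        16,  # one frame at 60 fps
    "video encode":              5,
    "network round trip":       10,  # the part cloud streaming adds
    "video decode":              3,
    "display (scaler + panel)": 30,  # the pixel-to-photon cost
}
total = sum(budget_ms.values())
print(f"total: {total} ms")
```

Under these assumptions the network round trip is a small slice of the total, but it's stacked on top of everything else, so it's the slice most likely to push you over the annoyance threshold.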
I can ping google.com with 1-5ms latency from almost any locale I've been to, assuming fiber is installed. Google has a presence in so many areas that low latencies are trivial to get.
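If you want to sanity-check your own numbers without ICMP, a quick sketch that estimates RTT from TCP handshake time (the hostname and port are just examples; connect overhead makes this a slight overestimate of raw network latency):

```python
import socket
import time

def tcp_rtt_ms(host, port=443, samples=3):
    """Estimate round-trip time from TCP connect (handshake) duration.
    Takes the best of several samples to smooth out scheduling noise."""
    best = float("inf")
    for _ in range(samples):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=2):
            pass  # connection established; close immediately
        best = min(best, (time.monotonic() - start) * 1000.0)
    return best

# Example (requires network access):
# print(f"{tcp_rtt_ms('google.com'):.1f} ms")
```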
Sub-10-15ms is easily achievable with even a modest investment for gaming services. Azure or AWS could do this if they chose, but they would have to deploy many mirrors. Even in 'fast action' multiplayer games, as long as the total latency stays under 40-50ms you're good: not great, but completely playable. That's perfectly achievable in a world where we're moving to gigabit and other lower-latency last-mile solutions. Heck, even my U-verse VDSL isn't too bad on latency, and that's a last-gen technology.
OnLive launched when home bandwidth was slow and latencies were higher. I could see this service doing okay as people migrate to better internet connections. Also, hardware-accelerated MPEG-4 (H.264) decoding is on pretty much all PCs now, considering it launched with Sandy Bridge in '11. OnLive just launched 5 years too early.
The industry would have to change along with this. You'd probably need a slower tick rate on multiplayer game servers to make up for increased latency, and other things hardcore gamers won't like, but for most of us it won't matter. Hardcores won't be using this service anyway. Even if some multiplayer games can't be played like this, it's still an incredible value for single-player games. A gaming PC is still prohibitively expensive, and a hypothetical MS or Amazon version of OnLive could be cheaper than buying a console and buying games for it.
No it's not, something that's not fancy but that can drive a 1920x1080 display at excellent frame rates with all modern games can be built for less than $550 total.
The problem with "cloud" GPU services is that a huge number of random customers will be using less-than-optimal broadband connections: shitty ADSL2+ lines, last-mile WISPs with sub-10 Mbps PtMP connections, overcongested cable modem networks (hello Comcast). The ISP and last-mile connection are totally outside the control of the cloud gaming service, which results in a customer perception of a shitty gaming experience.
$550 gets you a very sorry gaming machine. I'm sorry but budget builds aren't worth anyone's time and if money is an issue it makes more sense to get a console.
$750 is going to be the minimum for a moderately spec'd machine that can actually play new games at 1080p, especially if they want it to last more than 12 months. Note that most people don't have a spare monitor, SSD, or Windows license; they need to buy everything again.
I see this as being the sad, yet inevitable, future of gaming. First we didn't own the game, we owned a licensed copy. Next we are going to not own a copy, or own a license, but instead rent time to play a game at exorbitant rates and it will be heralded as "amazing" since it will be "cheaper for you" since you don't need to buy your own hardware.
This has been tried before (it was called OnLive AFAIR) and it didn't catch on. The prospects are not that great for publishers, since the cost of investing in and maintaining such a solution would be enormous for what would essentially be a subpar experience compared to what we have right now.
I'd expect it to be a distributor providing this service. Think Steam as a subscription service. Play any game in the catalog, as much as you want, for a monthly fee, without any installations or driver hunts. Seems like a good proposition for the customer, provided it works and is cheap enough. Don't know how publishers and game devs would take it.