Here's a guy's Google Stadia review in one GIF.
https://twitter.com/i/status/1196488999524802562
(So much for "negative latency")
These kinds of issues always occur as soon as a new game/app/service debuts.
Then the developers iterate and the issue goes away.
Much as I wish this meant Google was destined for a massive failure and financial hit, they'll likely have the issue reduced or eliminated within the next couple of Agile sprints.
RATATATATATATATATATATABLAM
If there's nothing wrong with having to show an ID to buy a gun, there's nothing wrong with having to show an ID to vote.
For legal reasons, that's a joke.
You can't change the speed of light, and it takes time for a signal to get from one place to another. Latency at the user-interface level is intolerable: people are used to waiting for a response from a web page, not from mouse/keyboard/controller/graphics interaction. I don't believe there's any way Google or anybody else can make this work.
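To put a rough number on the speed-of-light point, here's a quick back-of-the-envelope sketch of the physical floor on round-trip time over fiber. The distances and the two-thirds-of-c fiber speed are my own illustrative assumptions, not anything from Google; real routes are longer and real latency is higher:

```python
# Back-of-the-envelope: physical lower bound on round-trip network latency.
# Assumptions (illustrative): light in fiber travels at roughly 2/3 the
# vacuum speed of light, and the route is a straight line. Real paths are
# longer, and routing, queuing, and encoding all stack on top of this.

C_VACUUM_KM_S = 299_792                 # speed of light in vacuum, km/s
C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3    # ~200,000 km/s in optical fiber

def min_round_trip_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fiber, in milliseconds."""
    return 2 * distance_km / C_FIBER_KM_S * 1000

for km in (100, 500, 2000):             # example player-to-datacenter distances
    print(f"{km:>5} km -> {min_round_trip_ms(km):5.1f} ms minimum RTT")
```

That's propagation alone; everything else in the pipeline (capture, encode, decode, display) gets added on top of it.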
Liberals never met a slippery slope they didn't grease.
-Me
I wish technology solved people issues. It seems to just reveal them.
-Also Me
Monitors with a 2-3 ms response time are serious purchases for serious gamers. They try to squeeze out every advantage they can, and will pay several hundred dollars to shave just 12 ms of rendering time. In comparison, best-case cloud gaming adds something like 150 ms of overhead; by the time it's all said and done you're looking at about 1/5th of a second or worse, and anything above 150 ms is perceptible to most people, which is before your PC even works to output it to a screen. So, like Gman pointed out, the simple latency of WAN networks due to distance makes centralized cloud gaming a thing for broke-azz gamers who have no choice, but few others, no matter how much they optimize. The only exception is if they deployed "clouds" to virtually every municipality, which might cut it some, but the frame still has to be rendered by the remote hardware, sent through interfaces, back through applications, back through cache, gfx, etc. You really can't trim the fat off that cow; it has to be processed twice even if your local graphics hardware isn't doing the advanced rendering.
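For a sense of where that overhead comes from, here's a minimal sketch of a motion-to-photon latency budget. The per-stage numbers are my own illustrative assumptions, chosen to land near the ~150 ms figure above; they are not measured Stadia values:

```python
# Illustrative cloud-gaming latency budget (motion-to-photon).
# Every per-stage figure below is an assumption for demonstration,
# not a measurement of any real service.

budget_ms = {
    "input capture + uplink":        20,
    "server game tick + render":     35,
    "video encode":                  15,
    "network transit (downlink)":    35,
    "client decode":                 15,
    "display scan-out":              30,
}

for stage, ms in budget_ms.items():
    print(f"{stage:<30} {ms:3d} ms")
print(f"{'total motion-to-photon':<30} {sum(budget_ms.values()):3d} ms")
```

The point of the sketch is that no single stage dominates, so even a "cloud in every municipality" only shrinks the transit rows; the frame still gets rendered, encoded, shipped, and decoded on every hop.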