Monitors with a 2-3ms response time are serious purchases for serious gamers. They try to squeeze out every advantage they can, and will pay several hundred dollars just to shave 12ms off the pipeline.

In comparison, best-case cloud gaming adds roughly 150ms of overhead; by the time it's all said and done you're looking at about a fifth of a second or worse, and anything above 150ms is perceptible to most people. And that's before your PC even does the work of putting the frame on a screen.

So like Gman pointed out, plain WAN latency from distance alone makes centralized cloud gaming a thing for broke-azz gamers who have no choice, and few others, no matter how much they optimize. The only exception is if they deployed "clouds" to virtually every municipality, which might cut it down some. But the frame still has to be rendered by the remote hardware, encoded, sent over the network, then decoded and pushed back through your local application, cache, graphics stack, etc., so you really can't trim the fat off that cow: every frame effectively gets processed twice, even if your local graphics hardware isn't the one doing the heavy rendering.
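To make the arithmetic concrete, here's a rough back-of-the-envelope budget in Python. Every component number is an illustrative assumption, not a measurement; it just shows how ~150ms of streaming overhead stacks on top of a local pipeline and lands you around a fifth of a second total.

```python
# Back-of-the-envelope latency budget. All numbers below are illustrative
# assumptions (not measurements) to show how cloud streaming overhead adds up.

LOCAL_PIPELINE_MS = {
    "input + game logic + local render": 40,
    "display (fast 2-3ms panel + scanout)": 10,
}

CLOUD_OVERHEAD_MS = {
    "capture + encode on server": 25,
    "WAN round trip": 60,
    "jitter / receive buffer": 30,
    "decode on client": 15,
    "local composition + output": 20,
}

local_total = sum(LOCAL_PIPELINE_MS.values())
overhead = sum(CLOUD_OVERHEAD_MS.values())
cloud_total = local_total + overhead

print(f"local : ~{local_total} ms input-to-photon")
print(f"cloud : ~{cloud_total} ms (+{overhead} ms overhead)")
```

With those made-up but plausible components, the local setup is around 50ms input-to-photon while the cloud path is around 200ms, which is where the "fifth of a second" figure comes from.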