(Warning, long read ahead)

So, roaming the Battlefield 4 forums, someone posted this:

So I see all of you Xbox One fans saying how the cloud is going to be some miracle that boosts the Xbox performance to PC performance levels or levels much higher than the PS4. Well I'm here to tell you that isn't going to happen.

Cloud computing is not going to be what you think or expect. I'll try to explain what cloud computing will be for you, so you have an idea of why the Xbox will not be some uber console like you think it will, despite what MS is promising.

So what is Microsoft's cloud computing for Xbox One?

Essentially, what cloud computing is, plain and simple, is that every player gets access to their own dedicated server. Imagine hooking your Xbox up to another computer across a really long network cable. In a way you'd have two machines working simultaneously. However, the rules of physics come into play. It takes a certain amount of time for data to travel to that other server, get processed, and come back. That time is longer than the time it takes the game to render an individual image (frame), so you can only use cloud computing for things that aren't needed every frame. That time is known as latency.
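To put rough numbers on that (every figure below is my own illustrative assumption, not a measured or published value), here is a quick back-of-the-envelope comparison of a frame budget against a cloud round trip:

    # Back-of-the-envelope latency math (all numbers assumed for illustration).
    FPS = 60
    frame_budget_ms = 1000.0 / FPS              # ~16.7 ms to produce each frame

    network_one_way_ms = 30.0                   # assumed internet latency, one way
    server_compute_ms = 5.0                     # assumed time the server spends working
    round_trip_ms = 2 * network_one_way_ms + server_compute_ms

    frames_late = round_trip_ms / frame_budget_ms
    print(f"Frame budget at {FPS} fps: {frame_budget_ms:.1f} ms")
    print(f"Cloud round trip: {round_trip_ms:.1f} ms (~{frames_late:.1f} frames)")
    # The answer arrives several frames after it was requested, which is why
    # per-frame work can't be sent to the cloud.

With those (assumed) numbers, the cloud's answer shows up roughly four frames after you asked for it.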

Essentially, cloud computing is a way to add additional CPU resources to your platform: you reduce some of the work you do locally and instead do that computation on the server.

Keep in mind, the server is not a physical server box as some of you may think. It is a virtual server (software) running in a server farm. This way the server can scale proportionally to the demand on it. It can grow to use more physical hardware as needed. This also helps explain the 300,000 servers Microsoft claims to have at release. They are all virtual and you can run multiple virtual servers on a single piece of server hardware. So there are not 300,000 computers sitting in a warehouse somewhere.

So now let's look at some of the problems with this arrangement.

The problem of Latency:

As mentioned previously, latency is the time it takes to communicate with the server, plus the time the server needs to calculate the problem, plus the time required for the server to send back the results.

Because of this, Microsoft has directly said that cloud computing on the Xbox One is aimed at "latency-insensitive computation". Right away that tells you that anything latency-sensitive is not going to be accelerated by the cloud. This is why the OnLive cloud gaming service went out of business. It too was a cloud computing gaming service. They couldn't solve the latency problem, because it cannot be solved.

So what types of things in gaming are sensitive to latency that would not be accelerated by the cloud?

A quote from Microsoft: "Things that I would call latency-sensitive would be reactions to animations in a shooter, reactions to hits and shots in a racing game, reactions to collisions. Those things you need to have happen immediately and on frame and in sync with your controller."

So hit detection, animations in shooters, collisions, etc. are all latency-sensitive. As you can see, most things common to FPS games and fast-paced action games are LATENCY SENSITIVE and cannot be accelerated by the cloud.

Things like rendering the image on the screen are also extremely latency-sensitive. The cloud is never going to be rendering the image on your TV; your console will be doing all of that work. The servers don't even have graphics cards, so they couldn't do it even if they wanted to. The console's graphics card will still do all the heavy lifting to calculate all the pixels and display the image.

Here is what the Microsoft servers will probably look like:
http://en.wikipedia.org/wiki/File:Bladecenter-front.jpg

They will go in cabinets like this:
http://en.wikipedia.org/wiki/File:UP...dorMagerit.jpg

The servers will be blades or another similar form factor so they can get a lot of them into a small space. There is no room for graphics cards and the servers are not designed for doing graphics.

So what types of things can the cloud accelerate, then?

If you've ever played an MMO, all of the game logic itself is calculated on servers. There's not a whole lot being computed locally. Essentially you send up your input (example: "Use Fireball"), the server calculates the result ("You dealt X points of damage"), and your game then plays the cool special effects, updates the health bar, plays the enemy hit animation, etc. This is a perfect use of cloud computing, and cloud computing is aimed at these types of games (MMO and RTS), not at FPS or fast-paced action games.
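As a rough sketch of that division of labor (the function names, message shapes, and damage formula below are all invented for illustration, not taken from any real game):

    # Toy sketch of server-authoritative game logic; every name and number
    # here is made up for illustration.

    def server_handle_action(action, caster_level):
        """Runs in the cloud: the authoritative game-logic calculation."""
        if action == "use_fireball":
            damage = 40 + 5 * caster_level      # assumed damage formula
            return {"event": "hit", "damage": damage}
        return {"event": "none"}

    def client_present_result(result):
        """Runs on the console: purely cosmetic presentation work."""
        if result["event"] == "hit":
            print(f"Play fireball VFX, enemy takes {result['damage']} damage")
            # ...update the health bar, play the enemy hit animation, etc.

    # Client sends input up, server computes, client presents the result.
    client_present_result(server_handle_action("use_fireball", caster_level=12))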

AI is another thing cloud computing can accelerate. Titanfall is going to be using cloud computing for its AI. The AI computation will run on the server, and the server only sends back the positions of the bots and their actions to the console. The console gets this data from the cloud but still has to generate the scene and render the screen.
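Neither Microsoft nor Respawn has published how Titanfall actually implements this, but an assumed sketch of the split might look like this, with the decision-making server-side and the console only receiving small position/action updates:

    # Assumed sketch of cloud-hosted AI: the server thinks, the console draws.

    def server_ai_tick(bots, players):
        """Runs in the cloud each tick: the (potentially expensive) decisions."""
        updates = []
        for bot in bots:
            target = min(players, key=lambda p: abs(p["x"] - bot["x"]))
            bot["x"] += 1 if target["x"] > bot["x"] else -1   # step toward nearest player
            updates.append({"id": bot["id"], "x": bot["x"], "action": "advance"})
        return updates                  # small payload: just positions and actions

    def console_render(updates):
        """Runs locally: the console still does all the drawing."""
        for u in updates:
            print(f"Draw bot {u['id']} at x={u['x']}, doing '{u['action']}'")

    bots = [{"id": 1, "x": 0}, {"id": 2, "x": 10}]
    players = [{"x": 5}]
    console_render(server_ai_tick(bots, players))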

Cloud computing may help multiplayer games support larger levels with more players than would be possible without it. The cloud can keep track of all the players and just send out position updates to players' consoles, so each console can render the world and scene for that point in time. This would take some burden off the consoles. It is similar to how multiplayer games already work on PC.
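To see why that scales (every size and rate below is an assumption, not any real game's numbers), the position-update payload a console has to receive stays tiny even with a lot of players:

    # Rough payload math for position updates (all sizes/rates assumed).
    BYTES_PER_PLAYER = 3 * 4 + 3 * 4 + 2    # xyz floats + velocity floats + player id
    SNAPSHOTS_PER_SEC = 20                  # assumed update rate sent to each console

    for player_count in (24, 64, 128):
        bytes_per_sec = player_count * BYTES_PER_PLAYER * SNAPSHOTS_PER_SEC
        print(f"{player_count} players -> ~{bytes_per_sec / 1024:.1f} KiB/s downstream")
    # Even 128 players is a trickle of data; the expensive part (rendering
    # them all) stays on the console either way.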

Things like these are perfect for cloud acceleration because they don't depend on latency: the console doesn't have to communicate back and forth with the server several times before it can use the data directly.


Can the cloud accelerate any graphical effects at all?

Some minor graphics effects can be accelerated. There are also some visual things in a video game world that don't necessarily need to be updated every frame or don't change that much in reaction to what's going on. One example of that might be lighting. Let’s say you’re looking at a forest scene and you need to calculate the light coming through the trees, or you’re going through a battlefield and have very dense volumetric fog that’s hugging the terrain. Those things often involve some complicated up-front calculations when you enter that world, but they don’t necessarily have to be updated every frame.

Also, think about a lighting technique like ambient occlusion, which gives you all the cracks and crevices and shadows that happen not just from direct light. There are a number of calculations that have to be done up front, and as the camera moves the effect will change. So when you walk into a room, it might be that for the first second or two the lighting is rendered at whatever fidelity the console can manage, but then, as the cloud catches up, the data comes back down to the console and you have incredibly realistic lighting.
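Mechanically, that "console first, cloud catches up" behavior is just an asynchronous request with a local fallback. Here is a minimal sketch, with all the names and timings invented for illustration:

    # Assumed sketch: the console renders with cheap local lighting until
    # higher-quality cloud-computed data arrives.
    import threading, time

    cloud_result = {"ao": None}         # slot the cloud "request" fills in later

    def cloud_compute_ao(scene):
        """Stands in for the round trip plus server-side computation."""
        time.sleep(1.2)                 # assumed latency; far longer than a frame
        cloud_result["ao"] = f"high-quality cloud AO for {scene}"

    def render_frame(scene):
        """The console renders every frame, falling back to cheap local AO."""
        ao = cloud_result["ao"] or f"cheap local AO for {scene}"
        print(f"frame rendered with: {ao}")

    threading.Thread(target=cloud_compute_ao, args=("forest",), daemon=True).start()
    for _ in range(4):                  # the first few frames use the local fallback;
        render_frame("forest")          # later ones pick up the cloud result
        time.sleep(0.5)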

But think of how this will look to the player. The scene will start off being rendered by the console alone, and then at some point, once the cloud catches up, the entire scene will change and look more realistic. It may look very strange if not done properly, like the level-of-detail pop-in in BF3 when you look around, but probably worse. And every time you change viewing angle it will happen all over again, because the console has to take over while the cloud recalculates and catches back up. Again, this is because of the latency issue.

So in an FPS this may be incredibly annoying, since you are constantly changing your viewing angle at high speed. I doubt these things will be cloud-accelerated in FPS games; the latency would just make it impractical.


So, in conclusion

*Cloud computing will only be beneficial for latency-insensitive computation: things that can be calculated in the background and don't have to be synced with the real-time action. This means cloud computing will have little effect in FPS games like BF4, which are very latency-sensitive.

*Cloud computing will never improve the graphics performance of the consoles. The cloud cannot, and never will, directly render any images shown on the screen in your living room.

*Cloud computing is not going to make the Xbox One faster than the PS4, especially in games like BF4 or any other FPS title.


References
http://arstechnica.com/gaming/2013/0...oud-computing/
http://www.ign.com/boards/threads/co...ing.453130243/
http://www.eurogamer.net/articles/di...ansform-gaming
http://www.theverge.com/2013/5/24/43...ne-requirement

I'm not a genius on these technical issues. If what this poster said is true, the PS4 is going to TRUMP the One in specs (specifically for FPSes). Now, I'm still getting an Xbox One either way, but I'd like to have more ammo to convince my friends to buy a One. And I want to know what I'm talking about when I tell people about the One.