
Xbox One "secret sauce" revealed. 40x more powerful than 360 when connected to the cloud.

AnthonyW86 said:
BenVTrigger said:
avais1993 said:
Can someone please explain how the cloud can make it more powerful?


Developers can offload things like physics, AI, and load times to the cloud to massively free up the CPU and GPU.

Theoretically, it could make things like 4K gaming a reality.

Sorry to burst your bubble here, but that would mean it would need a constant internet connection, and a fast one at that. Any hiccup and your retail-bought game becomes unplayable.

Just to clarify: cloud-based gaming can work, and system-based gaming (what we have now) works, but not a combination of both. Latency alone would make any optimisation based on the Xbox One's fixed specs impossible.

The difference with cloud-based processing is that latency is not as big a deal as it is with cloud-based streaming. One advantage MS has is that they have been investing in their cloud platform for years, while Sony bought theirs. Gaikai does not have anywhere close to the number of servers MS has. Right now I doubt Sony has the capital to keep up with the push MS is making on its server farms.

Why does this matter? Location. With over 300,000 servers and growing, the chance that MS will have servers local to your location is high. Local servers in your area mean pings will be low, and cloud processing will be fast enough to make a difference.
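A minimal sketch of the kind of latency check this argument implies, assuming hypothetical regional endpoints; the 50 ms threshold is purely illustrative, not anything Microsoft has published:

import socket
import time

def measure_rtt(host, port=443, timeout=2.0):
    """Round-trip time to a server via a TCP handshake, in milliseconds."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None  # unreachable: treat as unusable for offloading
    return (time.monotonic() - start) * 1000

# Hypothetical endpoints; a real title would get these from a matchmaking service.
SERVERS = ["compute-us-east.example.net", "compute-eu-west.example.net"]
MAX_USABLE_RTT_MS = 50  # illustrative threshold for latency-tolerant offload work

rtts = {s: measure_rtt(s) for s in SERVERS}
usable = [s for s, rtt in rtts.items() if rtt is not None and rtt < MAX_USABLE_RTT_MS]
print("offload candidates:", usable or "none - run everything locally")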



Imaginedvl said:
Scoobes said:
Pemalite said:
Scoobes said:
thranx said:
If I'm not mistaken, doesn't Diablo 3 for PC also use some assets from Blizzard's own servers rather than local ones? I forget exactly how it works though.

Yes, it did, but it also caused a lot of stuttering issues and lag, especially early on with the heavy server loads. This is for a game that is fairly light on a graphical level.

What Microsoft are claiming here just isn't going to be realistic for a long time.


Even after a year of release I still get stuttering, lag and rubber-banding BECAUSE Blizzard thought it was a great idea not to have a server nearby, so it's a 300ms hop to the USA instead and a "deal with it" attitude, because a local server isn't financially viable. (Yet ISPs here have offered Blizzard high-speed servers with fantastic CDNs for free, and they still refuse to budge.)

Cloud gaming of any form has never been a smooth experience at all for me.

Exactly why I don't see many devs making efficient use of this. What MS claim in the OP is typical marketing crap. All companies do it but this is pushing it.

No, it is not marketing crap. You are just generalizing "cloud" computing...
For you and many others it means playing games remotely (like Gaikai or OnLive) or having a CDN, which is TOTALLY different from what Microsoft is proposing.

If it is so hard to understand, just compare their VCPUs with small dedicated servers. Maybe you do not see any potential in that, but I see a LOT of potential and ideas for using this. I already gave a few examples in this thread, but haters will not even read them, as they want to stick to their own "idea" of what Microsoft is proposing... More convenient, I guess :)

The only thing Microsoft did wrong (like many other companies, and certainly not by "pushing it" like you said; I can show you some very good examples, even recent ones, from another "big" console company which are way more ridiculous...) is to take something about rendering as their example. While I think it may be used that way, it would be complex for a dev team to use it in an efficient way...

But using it to pre-compute the next room/area in a game, for instance... Or to process all the background tasks in a persistent world like Skyrim, etc...
Freeing up the local CPU for other stuff that way is actually easily doable...

I just do not get why people bash a company for adding more capabilities and possibilities (looking at all the sigs, I think I understand why, actually :))...
If you "don't" like it, just ignore it; it is still a plus in the end for future Xbox owners.

I haven't once mentioned Gaikai, so no, that's not what it means to me. The OP/thread title suggests the cloud will give 40x more power than the 360, based solely on the metric that each Xbone will have 3x the computing power of a single console available. Having that power is great, but when it's been used in the past (see Diablo 3 and Sim City; not exactly graphically heavy games) the inherent problems of such a system become apparent. The connection drops, the stuttering, the lag... all in what should have been single-player games. You can also see similar artifacts occurring in MMOs, which again are mostly poor on a graphical level. It's ludicrous to suggest this is going to make the system as a whole 40x more powerful than a 360.

It doesn't matter how much power you have available if the internet infrastructure isn't there. We may start to see more games take advantage of the tech, but it won't truly take off for a long time yet. Either way, it's not going to compare to having that power available locally in the system. It currently works for non-gaming tasks because latency isn't as big an issue.

So yes, this is marketing crap, in the same way all the "theoretical" values the console manufacturers give are over-inflated BS.
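A minimal sketch of the "pre-compute the next room/area" idea quoted above, with the cloud call simulated by a sleep; the point is only that the request runs asynchronously, so a slow or dropped connection degrades to local computation instead of stalling a frame:

import concurrent.futures
import time

def request_precompute(area_id):
    # Hypothetical cloud call: fetch precomputed data (AI paths, baked
    # lighting) for an area. The sleep stands in for network latency
    # plus server-side compute time.
    time.sleep(0.3)
    return {"area": area_id, "source": "cloud"}

def compute_locally(area_id):
    # Fallback: a cheaper approximation on the console's own CPU.
    return {"area": area_id, "source": "local"}

pool = concurrent.futures.ThreadPoolExecutor(max_workers=2)

# Kick off cloud precomputation for the NEXT area while the player is
# still in the current one; this work is not latency-sensitive.
future = pool.submit(request_precompute, "crypt_level_2")

time.sleep(1.0)  # stand-in for the player traversing the current area

# On transition, use the cloud result if it arrived, otherwise fall back.
try:
    data = future.result(timeout=0.0)
except concurrent.futures.TimeoutError:
    data = compute_locally("crypt_level_2")
print(data)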



Scoobes said:
[...]
So yes, this is marketing crap, in the same way all the "theoretical" values the console manufacturers give are over-inflated BS.

The point is that you completely discard the plus it would give to console devs by sticking to one or two details (like the title of the thread), and even if 80% of the devs out there don't use it, it is still a positive addition to the console toolkit.

Whatever makes you happy, if you think adding these capabilities is just marketing crap, good for you. I do not :)



Imaginedvl said:
[...]
Whatever makes you happy, if you think adding these capabilities is just marketing crap, good for you. I do not :)

I'm not discarding the pluses. I'm saying it's too early to be useful, and the claims made by MS are ridiculous and over-exaggerated (i.e. marketing crap), made to make it seem like this will compensate for having a relatively weak GPU.

The times when similar methods have been used (Diablo 3, Sim City, MMOs) have shown that these claims are premature.



So anything announced by Microsoft/Sony is too early to be useful. Do you realize what you are saying here?
And no, I will repeat it one more time: Diablo/SimCity are a different story. Just because we are talking about something in the "cloud", you are comparing this to anything with "cloud" somewhere in the name or the description.

And look at that: "relatively weak GPU"; here we go :)

Relatively weak compared to what? (Because I do not see the link between Microsoft's statement and the power of their GPU.)

Well, now I understand why you were bashing :) Another Microsoft hater just bashing and finding issues with everything they are doing. What a surprise; esp. lately...



Imaginedvl said:
[...]
Well, now I understand why you were bashing :) Another Microsoft hater just bashing and finding issues with everything they are doing. What a surprise; esp. lately...

How are Diablo and Sim City any different? They both offload processing to Blizzard/EA servers, and both are designed with this in mind. The servers are different, but otherwise how is that any different from what Microsoft is pushing here?

And the GPU is weak compared to most modern PC GPUs, and the PS4's GPU (itself mid-to-high range next to current PC GPUs) has 50% more raw power. The tech sites have the Xbox One at 12 CUs, 768 shader units and 1.2 TFLOPS (vs 18 CUs, 1,152 shaders and 1.8 TFLOPS in the PS4, on the same GPU architecture). MS glossed over the specs and are pushing this cloud-supplementing system instead. Typical PR for any company: quote your strengths. Unfortunately, this isn't as big a strength as they're making out.

I'm not bashing, just being realistic. If you want to believe Microsoft's PR, go ahead.

Edit: And I meant the tech is too early, not the announcement. The network infrastructure isn't good enough for most consumers.
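For reference, the arithmetic behind those numbers; the 800 MHz clock is the figure from the leaked specs at the time, so treat it as an assumption:

# Theoretical single-precision throughput for a GCN GPU:
# shaders x 2 ops/clock (fused multiply-add) x clock speed.
CLOCK_GHZ = 0.8  # assumed 800 MHz, per the leaked specs

def gcn_tflops(compute_units, clock_ghz=CLOCK_GHZ):
    shaders = compute_units * 64  # 64 shader units per GCN compute unit
    return shaders * 2 * clock_ghz / 1000

xbox_one = gcn_tflops(12)  # 12 CUs -> 768 shaders -> ~1.23 TFLOPS
ps4 = gcn_tflops(18)       # 18 CUs -> 1152 shaders -> ~1.84 TFLOPS
print(f"Xbox One: {xbox_one:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS")
print(f"PS4 advantage: {ps4 / xbox_one - 1:.0%}")  # -> 50%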



If processing is partly done in the cloud, how much bandwidth is this going to take? Kinda sucks for people in countries like Canada, where there's a monthly cap unless you get the higher-end packages.
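For a rough sense of scale, a back-of-the-envelope estimate; the 100 kbps offload rate and play time are purely illustrative assumptions, since Microsoft hasn't said how much data this kind of offloading actually moves:

# Estimated monthly data use for cloud-assisted play.
# All inputs are assumptions for illustration only.
OFFLOAD_KBPS = 100   # assumed average stream of offloaded state/results
HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30

seconds = HOURS_PER_DAY * DAYS_PER_MONTH * 3600
gigabytes = OFFLOAD_KBPS * 1000 / 8 * seconds / 1e9
print(f"~{gigabytes:.1f} GB/month")  # ~2.7 GB at these assumptions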



Scoobes said:
[...]
Edit: And I meant the tech is too early, not the announcement. The network infrastructure isn't good enough for most consumers.

You are constantly coming back to "Microsoft PR bullshit" and blah blah blah, and trying to bring other issues into the debate.

Now you are bringing the "rumored" better performance of the PS4 GPU into this discussion and using it as an excuse to bash MS... Seriously, MS is talking about the cloud and the integration of Azure VCPUs into the Xbox toolkit BECAUSE of "their weak GPU"? Are you reading what you are writing before posting this nonsense?

By the way, dedicated servers != cloud processing... I will not EVEN try to explain the difference to you.

It is alright, I'm done arguing with you. You know, just don't buy it lol...



Imaginedvl said:
[...]
By the way, dedicated servers != cloud processing... I will not EVEN try to explain the difference to you.

It is alright, I'm done arguing with you. You know, just don't buy it lol...

When did I say dedicated servers were the same as cloud processing? You keep putting words into my posts (first Gaikai, now this). You also haven't actually answered any of the questions in my previous post. Cloud processing has rarely been done for an application as latency-dependent as gaming, and when similar things have been tried they have suffered from issues beyond the developers' control (internet infrastructure). You've said nothing to counter this or to explain how MS will supposedly fix it.

And the GPU specs aren't rumoured: http://www.extremetech.com/gaming/156273-xbox-720-vs-ps4-vs-pc-how-the-hardware-specs-compare

"In terms of the GPU hardware, hard information was difficult to come by, but one of the engineers did let slip with a significant stat - 768 operations per clock. We know that both Xbox One and PlayStation 4 are based on Radeon GCN architecture and we also know that each compute unit is capable of 64 operations per clock. So, again through a process of extrapolation from the drip-feed of hard facts, the make-up of the One's GPU is confirmed - 12 compute units each capable of 64 ops/clock gives us the 768 total revealed by Microsoft and thus, by extension, the 1.2 teraflop graphics core. So that's another tick on the Durango leaked spec that has been transposed across to the final Xbox One architecture and the proof we need that PlayStation 4's 18 CU graphics core has 50 per cent more raw power than the GPU in the new Microsoft console."

You've said nothing to convince me that what MS have said isn't just normal PR speak. If you want to make a convincing argument, you need to actually back up your posts rather than just making a point and saying "I'm not even going to explain it to you". It makes you come off as both arrogant and ignorant. For now, I'll just have to assume you're both.



"Things that I would call latency-sensitive would be reactions to animations in a shooter, reactions to hits and shots in a racing game, reactions to collisions," Booty told Ars. "Those things you need to have happen immediately and on frame and in sync with your controller. There are some things in a video game world, though, that don't necessarily need to be updated every frame or don't change that much in reaction to what's going on."

"One example of that might be lighting," he continued. "Let’s say you’re looking at a forest scene and you need to calculate the light coming through the trees, or you’re going through a battlefield and have very dense volumetric fog that’s hugging the terrain. Those things often involve some complicated up-front calculations when you enter that world, but they don’t necessarily have to be updated every frame. Those are perfect candidates for the console to offload that to the cloud—the cloud can do the heavy lifting, because you’ve got the ability to throw multiple devices at the problem in the cloud."

Booty added that things like physics modeling, fluid dynamics, and cloth motion were all prime examples of effects that require a lot of up-front computation that could be handled in the cloud without adding any lag to the actual gameplay. And the server resources Microsoft is putting toward these calculations will be much greater than a local Xbox One could handle on its own. "A rule of thumb we like to use is that [for] every Xbox One available in your living room we’ll have three of those devices in the cloud available," he said.

Source: http://arstechnica.com/gaming/2013/05/how-the-xbox-one-draws-more-processing-power-from-cloud-computing/
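A minimal sketch of the split Booty describes, with the cloud side simulated by a worker thread: per-frame work (input, hits, collisions) stays local and synchronous, while slow-changing work (lighting, fog) is requested once and folded in whenever a result arrives, so a late or missing result never blocks a frame:

import queue
import threading
import time

results = queue.Queue()

def cloud_submit(task, payload):
    # Hypothetical cloud helper for heavy, latency-tolerant computation.
    # The sleep stands in for the network round trip plus server compute.
    def worker():
        time.sleep(0.5)
        results.put((task, f"precomputed {task} for {payload}"))
    threading.Thread(target=worker, daemon=True).start()

world_state = {"lighting": "rough local approximation"}

cloud_submit("lighting", "forest_scene")  # fire and forget on scene load

for frame in range(120):  # 2 seconds at 60 fps
    # Latency-sensitive work every frame, always local:
    # read input, step animations, resolve hits and collisions...

    # Latency-tolerant results get folded in whenever they show up.
    try:
        task, value = results.get_nowait()
        world_state[task] = value
    except queue.Empty:
        pass
    time.sleep(1 / 60)

print(world_state["lighting"])  # upgraded once the cloud result arrived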

The one thing I don't get is how MS will make a profit from this. Cloud resources cost a lot of money.