Will the Xbox One Cloud Underdeliver like The Cell Processor?



Definitely. MS is full of crap! — 407 (77.67%)

No way. The Cloud will kick ass! — 117 (22.33%)

Total: 524

The Cell was very real, and it delivered the most technically accomplished games of the generation.



bugrimmar said:

Obviously, the Xbox One Cloud is still in progress so we don't really know yet whether it will deliver on all of Microsoft's promises. What do you guys think? Will this live up to the hype or will it be this generation's "Power of teh CELL"?


Cloud this and Cloud that is a giant trap. I am not going to say they won't be able to do anything with it, but what I am going to say is this - ten years from now, one or two consoles down the road, what incentive do they have to keep Xbox One cloud services going and not dedicate those resources to their newest thing? Relying on any remote service means that when the plug is pulled, you lose something - like all the times EA has pulled servers for this game or that game on a console, and functions go away. If this cloud is so important and so great, what does losing it mean? A good number of people won't really care, but not everyone simply tosses their old machine and games every time something new comes along, nor does everyone stop playing them. Depending on their Cloud is putting an expiration date on your game.



Nintendo Network ID: jlrx81

Steam ID Nova Nightmare  - My 3DMark Can you beat that? Prove it!

Where is the "who knows" option?



yes all these reasons explain it..

None of those reasons explained anything except that you're a dirty rotten spambot...- axum



ICStats said:
Machiavellian said:
bugrimmar said:
87.5% of people so far believe that the cloud is garbage. I think so too :P

I had to comment.  Just because a lot of people think something is garbage does not make them right.  It just means that a lot of people could be wrong, mostly due to a lack of understanding of the technology.  Just from this thread and others I have read, the majority of people who make a comment actually have no idea about the technology, and they conflate dedicated servers with cloud compute.  Most people do not even know what cloud compute means or how it's implemented.  Most people do not understand the software that allows for cloud compute, or how distributed cloud processing works.  Lack of knowledge and ignorance does not help when forming a valid opinion about a piece of technology.

Better to be a fool and right, than to be an expert who's delusional.

Microsoft have said they will provision 3X the compute in the cloud for every Xbox One they make.  That's quite a lot of servers when you think the XB1 is expected to sell over 10 million a year, but they have got statistics on what % of people play at the same time and they don't have to provision for 100%.

So let's say fine, they have 3X the compute power in the cloud, and by this I mean 3X the CPU compute only, because I haven't heard and don't expect them to have 3X the GPU compute as well.  So if we measure in FLOPS, it's probably more like only 25% extra via the Cloud over local, which is still fewer FLOPS than the PS4 has available locally.

Anyway, that's just statistics and lies.  Fine, let's say that extra CPU time is available, so it's like having triple the CPU power of the PS4 at your disposal - but it's not the same tomatoes.  The problem is you have on average 8 Mbit/s download and 0.5 Mbit/s upload, so you can only download 1 MB/s OR upload 0.06 MB/s to these servers, and they should work for the lowest common denominator.  The XB1 has between 65,000 ~ 1,040,000 times more bandwidth locally, and the PS4 has almost 3 times more than that.  How is Microsoft going to put their cloud CPU power to use to make the graphics look better when very little can fit through this pipe?

There are going to be some uses, but spotty bandwidth, latency, and the need to support running offline will limit this severely compared to having extra compute power locally.

Last year alone, MS spent over a billion dollars on their datacenters to support the X1 and Office 365.  I think MS is putting their money where their mouth is.

I am not sure your calculations really work that way, because we really do not know how cloud compute can or will be used until we see development along those lines.  Graphics are an interesting subject because pieces and parts do not have to be rendered in the same conventional sense they are done today, just the final image.  I have heard a few different unconventional statements from developers on this, which will be very interesting if used.  I might throw up a post on the subject to get a discussion going.

Also, your analysis really only goes down one path, and that's bandwidth, but there are solutions to get around such problems, like having a hosted instance you sync to in the cloud, where you are not sending a bunch of data but telling the host what should be processed, having the cloud process those parts, and streaming the results.  As I mentioned in another post, you let the cloud process the environment and let the local system process the immediate stuff the user is interacting with, to reduce bandwidth or issues with cloud streaming.  I am sure there are a few things that can be streamed in advance, or on demand, that do not take a lot of bandwidth.

I believe there is too much concentration on the graphics part of a game without understanding all the other parts that go into a game.  Also, people are limiting the scope of what you can do to what they think is possible.  A lot of people do not create games, and I believe you are limiting your understanding based on limited info on the subject.  Even I do not create games, but I do develop software that interacts with cloud-based services, and some of the solutions I mentioned are how you get around some of the issues mentioned.  I am sure game developers will think up even more ways to get around those issues, as well as unconventional methods that solve those problems.

I see people dismissing graphics being rendered over the cloud, but what about some of the initiatives by Nvidia, with their GRID, and Intel?  Graphics can definitely be done in cloud space, and we may actually see such development down the line.
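For what it's worth, the bandwidth figures ICStats quotes can be sanity-checked with back-of-envelope arithmetic. A minimal sketch; the memory-bandwidth numbers (~68 GB/s for the XB1's DDR3, ~176 GB/s for the PS4's GDDR5) are my assumptions, not stated in the thread:

```python
# Back-of-envelope check of the bandwidth gap discussed above.
# Connection figures from the thread: 8 Mbit/s down, 0.5 Mbit/s up.
# Memory-bandwidth figures are assumed: XB1 DDR3 ~68 GB/s, PS4 GDDR5 ~176 GB/s.

download_mb_s = 8 / 8        # 8 Mbit/s   = 1 MB/s
upload_mb_s = 0.5 / 8        # 0.5 Mbit/s = 0.0625 MB/s

xb1_local_mb_s = 68 * 1000   # ~68 GB/s expressed in MB/s
ps4_local_mb_s = 176 * 1000  # ~176 GB/s expressed in MB/s

print(xb1_local_mb_s / download_mb_s)   # 68000.0   -> near the "65,000x" end
print(xb1_local_mb_s / upload_mb_s)     # 1088000.0 -> near the "1,040,000x" end
print(ps4_local_mb_s / xb1_local_mb_s)  # ~2.6      -> "almost 3 times more"
```

The ratios land close to the 65,000 ~ 1,040,000 range and the "almost 3 times" figure in the post, so the arithmetic at least hangs together.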



Machiavellian said:
ICStats said:
Machiavellian said:
bugrimmar said:
87.5% of people so far believe that the cloud is garbage. I think so too :P

I had to comment.  Just because a lot of people think something is garbage does not make them right.  It just means that a lot of people could be wrong, mostly due to a lack of understanding of the technology.  Just from this thread and others I have read, the majority of people who make a comment actually have no idea about the technology, and they conflate dedicated servers with cloud compute.  Most people do not even know what cloud compute means or how it's implemented.  Most people do not understand the software that allows for cloud compute, or how distributed cloud processing works.  Lack of knowledge and ignorance does not help when forming a valid opinion about a piece of technology.

Better to be a fool and right, than to be an expert who's delusional.

Microsoft have said they will provision 3X the compute in the cloud for every Xbox One they make.  That's quite a lot of servers when you think the XB1 is expected to sell over 10 million a year, but they have got statistics on what % of people play at the same time and they don't have to provision for 100%.

So let's say fine, they have 3X the compute power in the cloud, and by this I mean 3X the CPU compute only, because I haven't heard and don't expect them to have 3X the GPU compute as well.  So if we measure in FLOPS, it's probably more like only 25% extra via the Cloud over local, which is still fewer FLOPS than the PS4 has available locally.

Anyway, that's just statistics and lies.  Fine, let's say that extra CPU time is available, so it's like having triple the CPU power of the PS4 at your disposal - but it's not the same tomatoes.  The problem is you have on average 8 Mbit/s download and 0.5 Mbit/s upload, so you can only download 1 MB/s OR upload 0.06 MB/s to these servers, and they should work for the lowest common denominator.  The XB1 has between 65,000 ~ 1,040,000 times more bandwidth locally, and the PS4 has almost 3 times more than that.  How is Microsoft going to put their cloud CPU power to use to make the graphics look better when very little can fit through this pipe?

There are going to be some uses, but spotty bandwidth, latency, and the need to support running offline will limit this severely compared to having extra compute power locally.

Last year alone, MS spent over a billion dollars on their datacenters to support the X1 and Office 365.  I think MS is putting their money where their mouth is.

I am not sure your calculations really work that way, because we really do not know how cloud compute can or will be used until we see development along those lines.  Graphics are an interesting subject because pieces and parts do not have to be rendered in the same conventional sense they are done today, just the final image.  I have heard a few different unconventional statements from developers on this, which will be very interesting if used.  I might throw up a post on the subject to get a discussion going.

Also, your analysis really only goes down one path, and that's bandwidth, but there are solutions to get around such problems, like having a hosted instance you sync to in the cloud, where you are not sending a bunch of data but telling the host what should be processed, having the cloud process those parts, and streaming the results.  As I mentioned in another post, you let the cloud process the environment and let the local system process the immediate stuff the user is interacting with, to reduce bandwidth or issues with cloud streaming.  I am sure there are a few things that can be streamed in advance, or on demand, that do not take a lot of bandwidth.

I believe there is too much concentration on the graphics part of a game without understanding all the other parts that go into a game.  Also, people are limiting the scope of what you can do to what they think is possible.  A lot of people do not create games, and I believe you are limiting your understanding based on limited info on the subject.  Even I do not create games, but I do develop software that interacts with cloud-based services, and some of the solutions I mentioned are how you get around some of the issues mentioned.  I am sure game developers will think up even more ways to get around those issues, as well as unconventional methods that solve those problems.

I see people dismissing graphics being rendered over the cloud, but what about some of the initiatives by Nvidia, with their GRID, and Intel?  Graphics can definitely be done in cloud space, and we may actually see such development down the line.

nVidia has some very powerful servers with lots of Kepler boards and they are great for professional rendering for movie studios, who can afford to pay $20K+ for these things to improve productivity.

I don't think MS have GPUs in their cloud.  Xeon CPUs designed for the cloud don't even have embedded graphics.  Even though one or two cores on a Xeon server may have quadruple the power of the Xbox One CPU, that's actually not enough to draw much graphics.  The XB1 can render far more graphics on its own.

We shall see.  There are a few things that can be done, but not very impactful IMO, and take a lot of developer effort.  I mean, I'm a graphics developer.  If you asked me to come up with any ideas for using the cloud for graphics I could give you some, but if you asked me if I think it's a good idea I would say no way.

Devs will use the cloud for what is a good fit, not try to do something weird which would break the game if you had spotty connectivity or an offline user.



My 8th gen collection

ICStats said:

nVidia has some very powerful servers with lots of Kepler boards and they are great for professional rendering for movie studios, who can afford to pay $20K+ for these things to improve productivity.

I don't think MS have GPUs in their cloud.  Xeon CPUs designed for the cloud don't even have embedded graphics.  Even though one or two cores on a Xeon server may have quadruple the power of the Xbox One CPU, that's actually not enough to draw much graphics.  The XB1 can render far more graphics on its own.

We shall see.  There are a few things that can be done, but not very impactful IMO, and take a lot of developer effort.  I mean, I'm a graphics developer.  If you asked me to come up with any ideas for using the cloud for graphics I could give you some, but if you asked me if I think it's a good idea I would say no way.

Devs will use the cloud for what is a good fit, not try to do something weird which would break the game if you had spotty connectivity or an offline user.


You are right, to an extent, that Xeon doesn't have integrated graphics - mostly.
However, there are Xeons on Socket 1155 and 1150 that do have Intel graphics which can use OpenCL, like the Xeon E3-1245, but those are the kind of Xeons that wouldn't be used in a server; it would be the Socket 2011 variants.

The Azure CPUs are pretty horrible though.
Rumours suggest the CPUs are roughly around the level of an AMD Opteron 2347 HE @ 1.6 - 1.9 GHz.
So I would hazard a guess that an 8-core Azure set-up would be at most... 50-100% faster than an 8-core Jaguar, dependent on the application, of course.

A Core i7 3770K with 4 cores @ 3.5 GHz is roughly 3x faster than an Azure quad-core chunk in both single- and multi-threaded scenarios... so throw another 10-30% in favor of a newer-model Haswell.
If you intend to use the "Xbox Cloud" to assist in *anything* that is heavily single-threaded, it's literally going to fall over on its ass. Multi-threaded stuff will of course shine if you can have a massive number of cores at your disposal and can scale the compute to take advantage of it; however, that could end up being costly if you have 8 cores dedicated to a single game and a few million gamers decide to jump into a game at the same time, all requiring 8 cores each.
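The scaling worry above can be put in rough numbers. A sketch only; the per-core speedup midpoint and the peak player count are illustrative assumptions, not measured figures:

```python
# Rough cost-of-scale sketch for "8 cloud cores per game instance".
# Assumptions (illustrative only): an Azure core is ~1.5-2x a Jaguar core,
# per the rumoured Opteron-class figures above; peak concurrency of 2M players.

azure_per_jaguar = 1.75                    # midpoint of the 50-100% guess
aggregate_speedup = (8 * azure_per_jaguar) / 8
print(aggregate_speedup)                   # 1.75x - and ONLY if the work scales across cores

concurrent_players = 2_000_000
cores_per_instance = 8
print(concurrent_players * cores_per_instance)  # 16,000,000 server cores at peak
```

In other words, the aggregate win is modest unless the workload parallelises well, while the server-side core count needed at peak grows linearly with concurrent players.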



--::{PC Gaming Master Race}::--

I don't think so.



ICStats said:

nVidia has some very powerful servers with lots of Kepler boards and they are great for professional rendering for movie studios, who can afford to pay $20K+ for these things to improve productivity.

I don't think MS have GPUs in their cloud.  Xeon CPUs designed for the cloud don't even have embedded graphics.  Even though one or two cores on a Xeon server may have quadruple the power of the Xbox One CPU, that's actually not enough to draw much graphics.  The XB1 can render far more graphics on its own.

We shall see.  There are a few things that can be done, but not very impactful IMO, and take a lot of developer effort.  I mean, I'm a graphics developer.  If you asked me to come up with any ideas for using the cloud for graphics I could give you some, but if you asked me if I think it's a good idea I would say no way.

Devs will use the cloud for what is a good fit, not try to do something weird which would break the game if you had spotty connectivity or an offline user.


You do know that you do not have to have a GPU to render graphics.  Also, a lot of new graphical features can easily be done using CPU compute.  Using thousands of CPUs in the cloud to perform those calculations is called cloud compute.  Also, Intel Xeon chips actually do perform graphics calculations, and from a little research, their E3-1200 v3 chip is built off their Haswell design.  Even AMD offers graphics processing with their server Opteron chips.  It's a misconception to believe that Nvidia has the only option, or that their solution is the only one that can be leveraged.

The thing is, you are limiting your thinking to what is done today.  That's actually being very narrow in your perception.  New techniques are being developed every moment, and there will probably be plenty developed using cloud compute.  I see a lot of potential in this space because I have seen throughout my years how fast cloud-based tech has improved.  Now that companies like Intel, AMD, Nvidia and MS are leveraging their skills in this area, we will start to see a lot of innovative ways cloud compute can expand this space.



Machiavellian said:
ICStats said:

nVidia has some very powerful servers with lots of Kepler boards and they are great for professional rendering for movie studios, who can afford to pay $20K+ for these things to improve productivity.

I don't think MS have GPUs in their cloud.  Xeon CPUs designed for the cloud don't even have embedded graphics.  Even though one or two cores on a Xeon server may have quadruple the power of the Xbox One CPU, that's actually not enough to draw much graphics.  The XB1 can render far more graphics on its own.

We shall see.  There are a few things that can be done, but not very impactful IMO, and take a lot of developer effort.  I mean, I'm a graphics developer.  If you asked me to come up with any ideas for using the cloud for graphics I could give you some, but if you asked me if I think it's a good idea I would say no way.

Devs will use the cloud for what is a good fit, not try to do something weird which would break the game if you had spotty connectivity or an offline user.


You do know that you do not have to have a GPU to render graphics.  Also, a lot of new graphical features can easily be done using CPU compute.  Using thousands of CPUs in the cloud to perform those calculations is called cloud compute.  Also, Intel Xeon chips actually do perform graphics calculations, and from a little research, their E3-1200 v3 chip is built off their Haswell design.  Even AMD offers graphics processing with their server Opteron chips.  It's a misconception to believe that Nvidia has the only option, or that their solution is the only one that can be leveraged.

The thing is, you are limiting your thinking to what is done today.  That's actually being very narrow in your perception.  New techniques are being developed every moment, and there will probably be plenty developed using cloud compute.  I see a lot of potential in this space because I have seen throughout my years how fast cloud-based tech has improved.  Now that companies like Intel, AMD, Nvidia and MS are leveraging their skills in this area, we will start to see a lot of innovative ways cloud compute can expand this space.

You just don't realize how weak CPUs are at 3D rendering compared to GPUs today.  Think hundreds of times slower.  I could go into more detail but I don't think it's your area of expertise.

Plus, I'm not talking about the all-time future potential, I'm just talking about the current solution.  If MS says they have 3X the CPU power of the XB1 in the cloud, then that is not thousands of CPUs in the cloud.
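The "3X the CPU" claim can be put against the local GPU with a rough peak-FLOPS sketch. Both figures here are my assumptions, not from the thread: 8 Jaguar cores at 1.75 GHz doing 8 FLOPs per cycle, and ~1310 GFLOPS for the XB1's GPU:

```python
# "3X the XB1's CPU in the cloud" vs the GPU already in the box.
# Peak figures are assumptions: Jaguar at 8 FLOPs/cycle/core,
# XB1 GPU at ~1310 GFLOPS.

cpu_gflops = 8 * 1.75 * 8       # ~112 GFLOPS for the 8-core Jaguar
cloud_gflops = 3 * cpu_gflops   # ~336 GFLOPS of remote CPU compute
gpu_gflops = 1310

print(cloud_gflops)               # 336.0
print(gpu_gflops / cloud_gflops)  # ~3.9: the local GPU alone is ~4x the cloud CPU pool
```

And a peak-FLOPS view still flatters the CPU for rendering, since it ignores the GPU's fixed-function rasterisation hardware.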



My 8th gen collection