
Will the Xbox One Cloud Underdeliver like The Cell Processor?

Definitely. MS is full of crap! 407 votes (77.67%)
No way. The Cloud will kick ass! 117 votes (22.33%)
Total: 524
ICStats said:
Machiavellian said:
ICStats said:

nVidia has some very powerful servers with lots of Kepler boards and they are great for professional rendering for movie studios, who can afford to pay $20K+ for these things to improve productivity.

I don't think MS has GPUs in their cloud.  Xeon CPUs designed for the cloud don't even have embedded graphics.  Even though one or two cores on a Xeon server may have quadruple the power of the Xbox One CPU, that's actually not enough to draw much graphics.  The XB1 can render far more graphics on its own.

We shall see.  There are a few things that can be done, but not very impactful IMO, and take a lot of developer effort.  I mean, I'm a graphics developer.  If you asked me to come up with any ideas for using the cloud for graphics I could give you some, but if you asked me if I think it's a good idea I would say no way.

Devs will use the cloud for what is a good fit, not try to do something weird which would break the game if you had spotty connectivity or an offline user.


You do know that you do not have to have a GPU to render graphics.  Also, a lot of new graphical features can be easily done using CPU compute.  Using thousands of CPUs in the cloud to perform those calculations is called cloud compute.  Also, Intel Xeon chips actually do perform graphics calculations, and from a little research their E3-1200v3 chip is built off their Haswell design.  Even AMD offers graphics processing with their server Opteron chips.  It's a misconception to believe that Nvidia has the only option or that their solution is the only one that can be leveraged.

The thing is, you are limiting your thinking to what is done today.  That's actually being very narrow in your perception.  New techniques are being developed every moment, and there probably will be plenty developed using cloud compute.  I see a lot of potential in this space because I have seen throughout my years how fast cloud-based tech has improved.  Now that companies like Intel, AMD, Nvidia, and MS are leveraging their skills in this area, we will start to see a lot of innovative ways cloud compute can expand this space.

You just don't realize how weak CPUs are at 3D rendering compared to GPUs today.  Think hundreds of times slower.  I could go into more detail but I don't think it's your area of expertise.

Plus I'm not talking about the all-time future potential, I'm just talking about the current solution.  If MS says they have 3X the CPU power of the XB1 in the cloud, then that is not thousands of CPUs in the cloud.

I believe you are not getting how a cloud based compute system would work.  I believe you are getting confused with how Playstation Now works compared to what MS is doing. 

Here is an example of what Intel did with Wolfenstein a few years ago.

http://www.extremetech.com/gaming/24860-23754-revision

This takes just the rendering engine for the Wolfenstein game and runs it in the cloud.  This can be done with many different parts of a game where the developer could leverage such a setup, and if you include the size and capability of Azure, MS should be able to deliver around the world.

The thing is, technology is moving at a pace most gamers have no clue about.  Companies are not going into this space because the cloud cannot handle it.  Instead they are rushing into this space because it is a gold mine, and the company with the big pockets probably will be the one to come out on top.  This is not future potential, it's today's solution, because most companies that saw this coming were already investing in it, like Intel and MS.  MS did not just wake up on the release of the X1 and think about a cloud-based compute infrastructure.  They have been working on this since 2005.



Machiavellian said:
... /snip

You seem to not have any idea of the economics involved here.  Plus I'm not arguing that it's impossible to render in the cloud.  A lot is possible with enough resources.  You could put some SLI Titan boards on the servers and stream games with the highest PC fidelity today.  It would just be expensive.

If you listen carefully you would hear that the demo from Intel was running on four Larrabee-equipped servers.  Intel now sells Xeon Phi cards like that.  They cost thousands, and draw 300 watts of power each.  Very expensive to build and operate.  Microsoft could do that, but maybe Xbox Live Gold would need to be $50 a month instead of $5 a month.

With nVidia GRID, last year at nVidia's GPU conference you could play PS3-quality Street Fighter 4 in a GRID-powered cloud.  Nice tech, but the costs... you would likely need to charge users $1 an hour.  Would you pay that?

Just because it can be built doesn't mean it can be done economically and be a "goldmine".  Cloud compute isn't free - it has a setup cost and a running cost (space, electricity, bandwidth).  People pay to use it.  Given how much money Microsoft makes from their users on games and Xbox Live subscriptions, there are limits to how much cloud compute could be dedicated to each user, and so there are limits to what could be achieved in such a service.  I believe the compute available will be so limited that it won't be usable to enhance XB1 graphics in a meaningful way.

Also, your example is JUST like Playstation Now - it's a game fully rendered in the cloud and streamed to the client as a video.  He mentions the client is a "thin client", i.e. it's just something to display the video.  Same as Sony will stream PS Now to Vita TV, smart devices, etc., because all they need to do is display a video.  You don't need a $500 Xbox One to stream the video.

 




My 8th gen collection

ICStats said:
... /snip

You seem to not have any idea of the economics involved here.  Plus I'm not arguing that it's impossible, I'm arguing that it's not what MS have suggested.

If you listen carefully you would hear that the demo from Intel was running on four Larrabee-equipped servers.  Intel now sells Xeon Phi cards like that.  They cost thousands, and draw 300 watts of power each.  Far too expensive to build and operate.  Just because it can be built doesn't mean it can be done economically and be a "goldmine".  Cloud compute is very elastic, but it isn't free.  Xbox Live Gold would have to be 10 times more expensive to pay for that.

Also, your example is just like Playstation Now - it's a game fully rendered in the cloud and streamed to the client as a video.  He mentions the client is a "thin client", i.e. it's just something to display the video.  Same as Sony will stream PS Now to Vita TV, smart devices, etc., because all they need to do is display a video.

 You don't need a $500 Xbox One console to play Gaikai or OnLive on it.


@Bolded: Do you understand that since 2009, MS has spent more than $4 billion on their datacenters?  At this point in time MS has more than a million servers around the world, making them second only to Google.  Just in the last 2 years MS has spent more than $2.5 billion on building their cloud infrastructure.  It seems pretty obvious to me that MS is spending the money in this space.  As for them using Intel Knights Landing, Knights Corner, or even their Xeon Phi, who knows.  Also, who knows how their investment will be funded, but then again there are many ways to go about that.  Breaking even on revenue compared to server cost could also be an option, depending on what their goal is for getting developers using their servers.  I will not waste too much space speculating, as there are many creative ways to tier up a system like this.

I know exactly what the demo was running; I read that article a while ago, and I have kept up with Nvidia and AMD solutions as well.  What I am telling you is that MS has put the money into the infrastructure, and they have also built the platform, Orleans, which they spent over 3 years building to support their cloud compute platform.

As for the demo being the same as Playstation Now, yes and no.  Yes, the entire game is running server side, but how it's run is totally different from how Playstation Now operates.  Playstation Now runs the whole game in the cloud, compresses the output, and sends it to a supporting device.  In other words, it's one instance of the game running on one PS3-type server.  The Intel demo has 2 different parts of the game running in the cloud.  There is one server instance of the game that sends data to a second instance, which is the rendering engine.  In other words, the rendering engine for the graphics is running totally separate from the main engine.  The same would apply if the main instance of the game were running on the X1 while the rendering engine ran server side.  You could even have one host instance in the cloud that syncs up with the client instances of one to many X1 systems.  This way a lot of data does not have to be uploaded, just the user's position and what they are doing.  The host instance can then just send the relevant pieces of work out over the cloud for multiple servers to process the scene.  The rendering engine in the Intel demo is actually using cloud compute, as it takes the scene, splits it up into 32x32 or 64x64 pixel tiles, and distributes the work among each of their servers.  Each machine finishes its part of the frame and sends the data back to the client machine.
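The tile-split scheme described above can be sketched in a few lines.  This is a toy illustration, not Intel's or Microsoft's actual code; `render_tile` is a hypothetical stand-in for a server-side renderer:

```python
from concurrent.futures import ThreadPoolExecutor

TILE = 32  # tile edge in pixels, as in the 32x32 blocks described above

def render_tile(origin):
    """Stand-in for one server rendering a TILE x TILE block."""
    x0, y0 = origin
    # Fake "pixels": a real renderer would rasterize the scene here.
    return x0, y0, [(x0 + y0) % 256] * (TILE * TILE)

def render_frame(width, height, workers=4):
    """Split the frame into tiles, farm them out, reassemble."""
    origins = [(x, y) for y in range(0, height, TILE)
                      for x in range(0, width, TILE)]
    frame = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for x0, y0, pixels in pool.map(render_tile, origins):
            frame[(x0, y0)] = pixels  # client stitches tiles back together
    return frame

frame = render_frame(128, 64)  # 4 x 2 = 8 tiles
```

Each worker stands in for one server; the client only needs to stitch the finished tiles back into a frame, which is why little data has to travel upstream.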

This is just a graphics demonstration, but the same could be done with other parts of a game, like AI, lighting, and environmental effects.  Also, the cloud is great for other rendering techniques like point rendering and voxel rendering.



Machiavellian said:
ICStats said:

... /snip


@Bolded: Do you understand that since 2009, MS has spent more than $4 billion on their datacenters?  At this point in time MS has more than a million servers around the world, making them second only to Google.  Just in the last 2 years MS has spent more than $2.5 billion on building their cloud infrastructure.  It seems pretty obvious to me that MS is spending the money in this space.  ... /snip

All impressive numbers in the context of running a search engine, Outlook mail, live.com, office 365, etc. and many 3rd party services.

You could run some killer benchmark on 1 million servers.

Though in the context of power-assisting a big userbase of Xbox Ones, say 10 million units a year for the next 5 years, it's not so big.  It's not like some limitless resource when it's hammered by tens of millions of XB1 gamers and hundreds of millions of other MS service users.

MS is putting lots of investment into cloud to a) power all of its services like Bing, live maps, etc. [as Google does], plus b) sell hosting as a business [also as Google does].  You're jumping to the conclusion that because they've invested so much, they're about to revolutionize gaming too.

If you want to estimate how much cloud power XB1 users will get, look at how much cloud power costs and then try to estimate how much money MS is taking from users to pay for the cloud.

For example, look at Amazon's prices: http://aws.amazon.com/ec2/pricing/ .  Let's say you pay $50/year for XBL Gold, and you play Xbox 1 hour a day; then you can pay 14c per instance-hour.  On Amazon that will buy you 1 hyperthread of a 2.8GHz Xeon core.  Kind of weak.  But MS is building this themselves, so let's say it's 4X cheaper... we can get 4 hyperthreads of a Xeon core.  That actually is in the range of what MS have hinted.  Cool... so with that you can do some interesting stuff, but as far as graphics, please believe me that you can NOT even do PS2-level graphics.
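That back-of-envelope estimate can be written out.  The figures ($50/year, 1 hour of play per day, 14c per core-hour) are the assumptions from the paragraph above, not current prices:

```python
def core_hours_per_play_hour(sub_per_year=50.0, hours_per_day=1.0,
                             price_per_core_hour=0.14):
    """Xeon core-hours the subscription buys per hour of play,
    assuming the entire fee went to cloud compute (it wouldn't)."""
    dollars_per_play_hour = sub_per_year / (365 * hours_per_day)
    return dollars_per_play_hour / price_per_core_hour

retail = core_hours_per_play_hour()          # ~0.98: about one hyperthread
in_house = core_hours_per_play_hour(price_per_core_hour=0.14 / 4)
# At 4X cheaper, roughly 4 hyperthreads' worth, as hinted above.
```

The point of the arithmetic is that the per-user budget scales with the subscription fee divided by hours played, so playing more hours per day shrinks the compute available per hour.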

Set your XB1 Cloud graphics expectations no higher than this: http://www.youtube.com/watch?v=m0uVW2AZ1xY .  In other words don't expect anything because it would be a waste of time.




My 8th gen collection

ICStats said:
... /snip


@Bolded: What you are saying would probably make sense, except that MS has already restructured their pricing for the gaming market.  This is something that Respawn mentioned a few times: they asked MS for host services to be affordable for developers, and MS came back with a solution.  Also, why do you continue to use just one Xeon core when I have repeatedly mentioned that MS has built the Orleans platform, which allows them to take pieces of code and spread them around multiple resources, including datacenters?  Meaning one to as many as they feel is needed to perform the calculations.  It's the same thing as the Intel demo, how it takes a scene, splits it into 32x32 pixel tiles, and sends each part to be processed by individual servers or virtual resources.  It really seems you are stuck on one solution to the problem when there are many different solutions, and the one developed by MS isn't the one you are talking about.

As for supporting 10 million X1s a year, it seems MS spending on their infrastructure is probably going to match that number pretty easily.  If they continue to spend roughly $1.5 billion a quarter, they should easily be able to support the Xbox user base.  Also, the number of games that will be using cloud compute will be small in the first few years, probably relegated to first-party titles.

I actually said nothing about a revolution in gaming.  I am stating that MS is investing in this space because there is money to be made, first and foremost.  What I am really doing is dismissing all the naysayers who say that it's all PR.  Yes, there is some hyperbole about 3x the power of the X1 in the cloud, but the technology itself is sound.



Machiavellian said:

@Bolded: What you are saying would probably make sense, except that MS has already restructured their pricing for the gaming market.  This is something that Respawn mentioned a few times: they asked MS for host services to be affordable for developers, and MS came back with a solution.

Sure but until we get some different figures, I'm going to assume my estimate is in the right ballpark.

Machiavellian said:

Also, why do you continue to use just one Xeon core when I have repeatedly mentioned that MS has built the Orleans platform, which allows them to take pieces of code and spread them around multiple resources, including datacenters?  Meaning one to as many as they feel is needed to perform the calculations.  It's the same thing as the Intel demo, how it takes a scene, splits it into 32x32 pixel tiles, and sends each part to be processed by individual servers or virtual resources.  It really seems you are stuck on one solution to the problem when there are many different solutions, and the one developed by MS isn't the one you are talking about.

I'm talking about a unit of processing.  One Xeon core hour = a unit of compute.  How it's divided doesn't matter in this argument.  Could be 1 hour on 1 server, or 1 minute on 60 servers.

They can't use as many as they feel like, only as many as they have.  This is something you keep ignoring.  When you have 1 million beans and 10 million users, you can't give 10 beans to each user.  Even if Orleans technology could give 10 beans per user, you have only 0.1 beans to give.
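The bean arithmetic is just a fixed pool divided among users.  A short sketch makes the point; the 10% concurrency figure below is an illustrative assumption, not a known number:

```python
def beans_per_active_user(total_beans, installed_base, concurrency=1.0):
    """Share of a fixed resource pool per simultaneously active user.
    `concurrency` is the fraction of the base online at once."""
    return total_beans / (installed_base * concurrency)

# 1 million beans, 10 million users, everyone online at once: 0.1 each.
worst_case = beans_per_active_user(1_000_000, 10_000_000)
# If only 1 in 10 users is online at a time (assumed), that's 1 bean each.
typical = beans_per_active_user(1_000_000, 10_000_000, concurrency=0.1)
```

However the pool is scheduled, the total is fixed; concurrency only changes how thinly it is spread at any instant, not how much exists.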



My 8th gen collection

ICStats said:
... /snip

You are talking about a unit, which basically is a node in cloud terms.  That type of setup is not the infrastructure that MS has built.  Instead, think of each node and all of its cores as one huge resource pool.  Each task sent to the Azure cloud can leverage each individual core as a unit of processing, not just the entire node.  Each grain in the Orleans platform operates as a single-threaded execution, scheduled onto a small number of threads.  For X1 cloud compute you can have multiple X1s operating on one node, each using a number of cores to process their individual grain of code.
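Orleans' grain model (each grain handles one message at a time, with many grains multiplexed over a small number of threads) can be caricatured with Python's asyncio.  This is a sketch of the general actor idea, not Orleans' actual API:

```python
import asyncio

class Grain:
    """Toy actor: processes its mailbox one message at a time,
    while many grains interleave on a single event loop."""
    def __init__(self, name):
        self.name = name
        self.inbox = asyncio.Queue()
        self.results = []

    async def run(self, n_messages):
        for _ in range(n_messages):
            msg = await self.inbox.get()
            self.results.append(msg * 2)  # the grain's unit of work

async def main():
    grains = [Grain(f"g{i}") for i in range(3)]
    tasks = [asyncio.create_task(g.run(2)) for g in grains]
    for g in grains:              # dispatch two messages to each grain
        await g.inbox.put(1)
        await g.inbox.put(2)
    await asyncio.gather(*tasks)
    return {g.name: g.results for g in grains}

results = asyncio.run(main())
# {'g0': [2, 4], 'g1': [2, 4], 'g2': [2, 4]}
```

The design point: no grain ever runs its own code on two threads at once, yet the pool as a whole keeps every core busy by interleaving many grains.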



Machiavellian said:
... /snip

Irrelevant... I don't care how you split the beans.  I will assume you split them as efficiently as possible.  That doesn't give you infinite beans.  There's a finite total amount of work that X1 cloud can do.  What is that amount, per user?



My 8th gen collection

ICStats said:
... /snip

True, there are not infinite resources, but then again, the way MS is spending on new datacenters, they should be able to outpace X1 sales.  Anyway, I will concede this is all theory until we actually see a product on the market.  I see a lot of potential in this space, and I also see MS spending the money to make it happen.  That does not equal success, but it does mean that their claim isn't 100% PR BS.

I personally believe people are dismissing cloud compute for lack of knowledge in this area, as well as of some of the advancements in technology and software development that can make it happen.



ICStats said:

... /snip


The rumours I have seen suggest that Azure is using Bulldozer/Vishera-based Opterons operating at around 1.6-1.9GHz, where performance is pretty horrible in lightly threaded tasks.
If you have information to the contrary, I would be interested. :)



--::{PC Gaming Master Race}::--