
Will the Xbox One Cloud Underdeliver like The Cell Processor?

Definitely. MS is full of crap! 407 votes (77.67%)
No way. The Cloud will kick ass! 117 votes (22.33%)
Total: 524
Machiavellian said:
Subie_Greg said:
Underdeliver?

They promised 4K gaming on Xbox One with the cloud

yeah

I have been keeping up with X1 news and I am sure I never saw any statement from MS that promised 4K gaming. Now, if you said MS stated cloud processing would deliver 4x the performance of the X1, then you would be correct.

Nelson did in a Boogie2988 video, IIRC




87.5% of people so far believe that the cloud is garbage. I think so too :P



We will see.



It's not going to underperform, it's going to do nothing.



The Cell processor didn't underdeliver, it was simply super hard to work with. The ones who mastered it, like Naughty Dog, did stuff that looks like Uncharted 3 and The Last of Us with that processor, and seriously, no next-gen game has beaten The Last of Us graphics-wise yet.

The cloud won't underdeliver either; it'll work exactly like it's supposed to, meaning it won't increase the performance of games or their graphics one bit. It'll be awesome for multiplayer, though, and for other internet-based services like backing up files and such.




You didn't even do the "power of the Cell" line properly.


The Cell proved itself time and time again. May I add that the 360 had the better graphics card but still could not keep up with the PS3 when it came to graphics. I wonder why.

Must be t3h sect3tz source.

Onto the cloud. Not to "rain" on M$'s parade, but a "storm" is a-brewing and I don't want to be around when they get the backlash from it.



Nobody's perfect. I aint nobody!!!

Killzone 2. its not a fps. it a FIRST PERSON WAR SIMULATOR!!!! ..The true PLAYSTATION 3 launch date and market dominations is SEP 1st

Serious_frusting said:
You didn't even do the "power of the Cell" line properly.


The Cell proved itself time and time again. May I add that the 360 had the better graphics card but still could not keep up with the PS3 when it came to graphics. I wonder why.

Must be t3h sect3tz source.

Onto the cloud. Not to "rain" on M$'s parade, but a "storm" is a-brewing and I don't want to be around when they get the backlash from it.


We don't know if the Xbox 360 had the better graphics card. In terms of flops? Sure, but there is so much more to performance than that.
Let's take the Radeon X1900 XTX... It had double the pixel pipelines of the GeForce 7900 GTX and a higher GFLOP number, but in the large majority of games the difference was pretty negligible, and sometimes it even went in the GeForce's favor.
The Xbox GPU, however, did have a few extra techniques the PS3 couldn't do, like hardware-based tessellation and 3Dc compression.
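For scale, peak GFLOPs are just unit count x ops-per-clock x clock speed. Here's a quick sketch with round, purely illustrative numbers (not either card's exact shader configuration):

def peak_gflops(units, flops_per_unit_per_clock, clock_mhz):
    # Peak throughput = units * ops per clock * clock; /1000 turns MFLOPs into GFLOPs.
    return units * flops_per_unit_per_clock * clock_mhz / 1000.0

# Hypothetical round figures, NOT real specs for either card:
print(peak_gflops(48, 8, 650))  # "double the pipelines" part: ~250 GFLOPs on paper
print(peak_gflops(24, 8, 650))  # the competitor: ~125 GFLOPs, yet similar game results

Double the units looks like double the flops on paper, which is exactly the gap that didn't show up in actual games.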

The other thing to keep in mind is that the Xbox GPU was more or less a "stepping stone" between the X1900 series and the incredibly horrible 2900 series; if its architecture leaned closer to the 2900 series, its real-world performance could have been impacted rather heavily.

But let's be realistic.
Despite the Cell's claimed "supercomputing levels of performance" that people harped on about... it really didn't show in games: you didn't get 10x better graphics or framerates or resolutions or units/objects on screen.
It certainly wasn't the difference between the PC and PS3, that's for sure, and that's why the Cell didn't prove itself. (I even had debates with people on this forum who claimed it was faster than my 3930K @ 4.8GHz. Like that was ever possible.)

The Cell was mostly only used for framebuffer effects; it simply wasn't powerful enough to do anything more complex. Not to mention people seem to confuse "art" with "graphics fidelity", which are very different. (You can have a low-graphics game and make it look amazing with great art pre-baked into assets; Nintendo seem to be the masters in this regard.)



--::{PC Gaming Master Race}::--

Pemalite said:
fps_d0minat0r said:

I'm not denying what you do for a living and how helpful that is to your clients, I'm just saying MS haven't explicitly said how much they have invested in this. Like I said before, it might just be something done on a tiny scale.

Remember their very first video showing Natal's vision and how it actually turned out? That was because they led people to believe they were working on something far bigger than they could actually deliver.

And when do you think its "true test" will be? Don't you think it's unethical to market something when consumers have no idea when it will be coming?

Sony might as well start marketing Titanfall 2 for PS4, because I bet that will come out sooner than MS's cloud will have its "true test".


However, you can *work* it out to a rough estimate.

The "Xbox Cloud" runs on top of Azure, Microsoft has already made over a Billion dollars from Azure itself via Xbox, Office, Outlook and other business ventures.
Azure itself consists of a Million servers, in a datacenter you have roughly 50,000 - 100,000 servers so you are looking at anywhere from 10-20 data centers positioned all around the world.
Thus by extension... All the Yanks won't be getting the 1 million servers for gaming, they get a fraction of that because it's distributed world-wide, they also get less of that fraction as it's not all dedicated to Xbox.

Now, assuming every server costs roughly $1,000 (that's being conservative!), that's $1 billion spent right there, and that's before you actually build the complex (which can cost tens to hundreds of millions), provide dedicated high-speed fibre (think much faster than 1Gbps), provide relatively complex cooling (i.e. water, or even something like helium) and throw electricity at it.

The average data center's energy cost is roughly $10 million per megawatt, and at 250 megawatts that works out to about $2.5 billion, so it's not a stretch to assume the entire endeavour has cost upwards of $4.5 billion, and that's before you take into account the engineers, network administrators and other staff you need to hire, taxes, research and development... you name it.
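For what it's worth, the arithmetic checks out. A back-of-envelope sketch using only the rough figures quoted in this post (the per-site construction cost is my own illustrative guess, not a sourced number):

# Back-of-envelope check using the post's own rough figures.
servers_total  = 1_000_000
per_datacenter = (50_000, 100_000)                  # rough servers per facility
datacenters    = [servers_total // n for n in per_datacenter]
print(datacenters)                                  # [20, 10] -> "10 to 20 data centers"

hardware  = servers_total * 1_000                   # $1,000 per server   -> $1.0B
energy    = 250 * 10_000_000                        # 250 MW at $10M/MW   -> $2.5B
buildings = 15 * 100_000_000                        # ~15 sites at ~$100M each (illustrative)
print(hardware + energy + buildings)                # ~$5.0B, in line with "$4.5B+"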

Microsoft has invested a massive amount in its server infrastructure; the only company that competes with it is... Google, which probably has a few more.
However, those 1 million servers are not dedicated to Xbox; they're dedicated to whoever has their wallet open, so if Sony or Nintendo ever wished to... they could use the "Cloud" to perform calculations for their own platforms.

Exactly. Sony, Nintendo, EA, etc. can and do use these types of cloud services, like Azure, for game backend services.

Now, MS has likely spent time integrating some Xbox-specific features and SDK libraries on XB1 to make it easier to use, so it isn't nothing. It's going to add something to the XB1 experience.

However, when you think about how many servers there are and how many users there are, there isn't the server time to do graphics. This is the broken idea: that the cloud will somehow accelerate Xbox One graphics and make up for its weaker GPU. As far as I know, Azure servers don't even have GPUs (at least most of them don't), so you probably couldn't run graphics better than Quake 1, and that would be a horrible waste of resources.



My 8th gen collection

SvennoJ said:
Machiavellian said:
 

You are thinking about the solution wrong. The developer knows where you are, what you are doing and where you are going at all times within their game. A lot of stuff can be sent to the cloud before the gamer ever gets to a specific location or performs a specific action. That info can be sent to the cloud to be processed and streamed back to the user in advance. Depending on how the developer creates their environment and hosts an instance of their software in the cloud, a lot of complex calculations can be performed which do not require a lot of data to be streamed to the local machine. Hell, this is not something new. You can take predictive Google search as an example.

Also, another way to take care of latency is to have your game synced with the cloud server. Both run an instance of the game, or pieces of it, and the cloud service can stream just that bit of info back to the local client.

As an example, think of a game like the latest Assassin's Creed. When you are on the island walking through a jungle, all of the environmental effects can be offloaded to the cloud: wind, sand blowing, fog, you name it. Local things the user interacts with can be done client-side.

I'm very skeptical about your examples for cloud enhancements.

First of all, it's pretty expensive to run an entire instance of the same game for every single-player game being played out there.
All the predictable things can be precalculated at compile time and put on the disc; no need for the cloud.
More importantly, if you look at the specs for NVIDIA CloudLight, even the most basic light map enhancements already push the 1.5 Mbps recommendation MS set for cloud features (http://www.ppsloan.org/publications/Crassin13Cloud.pdf, page 8, table 2). Offloading volumetric fog and blowing sand for a 1080p game is a lot more data; it soon becomes less expensive to simply send the whole game image over a 5 Mbps H.265 stream, since you're running a game instance in the cloud anyway.
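To get a feel for how tight that 1.5 Mbps budget is, here's a quick illustrative calculation; the per-frame payload below is a made-up figure, not one from the CloudLight paper:

def stream_mbps(bytes_per_frame, fps=30):
    # Convert a per-frame payload into a sustained bitrate in Mbps.
    return bytes_per_frame * fps * 8 / 1_000_000

budget  = 1.5                          # MS's suggested cloud bandwidth, in Mbps
payload = 6_000                        # hypothetical 6 KB of effect data per frame
print(stream_mbps(payload))            # 1.44 Mbps: one modest effect nearly fills the budget
print(stream_mbps(payload) <= budget)  # True, but with almost no headroom left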

I believe you forget that MS already stated that every X1 game gets a dedicated server for free, which the developer can use as they please. Second is the cloud-based platform Orleans that MS has developed. Orleans allows pieces of code called grains to run server-side. These grains can be replicated so that multiple servers can operate on the same piece of code, and those grains can run anywhere in the world, computed in a datacenter local to where the gamer is located. This is how MS plans to tackle cloud compute, and it's a platform they have been building for over three years. Also, you really only need one server to run an instance of the game and send pieces of work to other servers, which process them and send the results back to the console. Think of the main dedicated server as a traffic controller directing multiple cars down different paths.

So, using Orleans, pieces of code running server-side can be crunching parts of a game constantly. If done correctly, the server-side code can already have the result computed, since a lot of stuff won't really change when processing environmental effects, and it can just stream that info to the gamer's console when they reach a certain area. The info can be streamed in advance so it's ready once the user gets to that section.
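As a minimal sketch of that pattern: Orleans itself is a .NET framework, so this toy Python version only illustrates the idea of a grain precomputing region state ahead of the player; every class and field name here is hypothetical, not real Orleans API:

class EnvironmentGrain:
    # Toy stand-in for an Orleans grain: a small server-side actor that owns
    # the environmental state (wind, fog, etc.) for one region of the world.
    def __init__(self, region_id):
        self.region_id = region_id
        self.result = None

    def precompute(self, weather_seed):
        # Expensive simulation done in the datacenter, before the player arrives.
        self.result = {"region": self.region_id, "wind": weather_seed * 0.7, "fog": 0.3}

    def stream_to_client(self):
        # By the time the console asks, the answer is already sitting in the grain.
        return self.result

grain = EnvironmentGrain("jungle_island_03")
grain.precompute(weather_seed=2.5)      # runs server-side, in advance
print(grain.stream_to_client())         # console receives the precomputed state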

Gamers have to stop thinking about the cloud rendering a scene instead of just performing calculations. Even rendering a scene can be done using similar techniques to Gaikai/PlayStation Now.

This stuff is how distributed cloud services work today and is really not that new. The thing is, it's new for game applications, which is why such services will take time as developers work out different solutions for how their games work.



If anyone believes, essentially, that the internet is going to make your games better, they are fooling themselves. If processing and rendering graphics could simply be offloaded to the internet (meaning the onboard hardware has less of a hassle running the game), then the shitty desktop I have lying around from 2001 should be able to run Crysis.