sasquatchmontana said:
SvennoJ said:
If it was a multiplatform game, both would use the cloud...
|
No they wouldn't, it just suits the comparison better. If this were a MP game that ran at 900p on XBone and 1080p on PS4, it would be used as evidence that the latter were more powerful. So if CD leveraged the cloud on XB1 and not on the PS4, it would be the same kind of proof.
Of course we could compare it to Infamous Second Son, where the difference in destruction is the same, but then I'd get some bullshit about "Second Son wasn't built for destruction so it doesn't need it" to prove a point, and that sucks.
Why would a developer make a multiplatform game and not use the servers on one platform? In the end the console still has the same resources to divide for rendering; off-loading the physics doesn't make it more powerful. Does streaming pre-baked lighting data from disk in AC Unity make the console more powerful? Did the pre-calculated destruction data in Portal 2 make consoles more powerful? Now it comes from the cloud in a dynamic way; the console is still the same.
SvennoJ said:
The cloud has nothing to do with debris staying or not. That's all in local memory, rendered locally; it's a choice by the developers to keep the debris, as the rest of the graphics will have to be toned down anyway to compensate for the many-pieces-falling-at-once scenarios. (Btw in the demos not all debris stays; the pile of rubble is quite small compared to the buildings.)
|
Crackdown 3’s multiplayer is all about unscripted, real-time destruction on an unseen scale. 2009’s Red Faction: Guerrilla gave us a glimpse of this level of destruction but few developers since have followed suit. The problem, explains Jones, is that destruction on this grand level is a big drain on physics and requires a large amount of processing power and memory. More than a single box can realistically provide. Reagent’s solution to this problem is to leverage the Xbox One’s cloud computing capabilities to provide the horsepower necessary to facilitate a 100 per cent destructible environment...Impressively, none of the debris disappears either. The evidence of your destruction persists for as long as the game lasts.
Debris already disappears while the building falls apart...

What they said was true for the 2014 prototype. Yet watch the E3 footage again: there is too little stuff left on the ground after a building collapses. Sure, what actually reaches the ground probably stays there until the end of the game.
SvennoJ said:
Yet it's not used since it limits the graphical fidelity compared to other games, hence a cel-shaded game is getting this tech.
|
Crackdown 3 is cel shaded because Crackdown is cel shaded. It always has been; even the CGI trailers are cel shaded. That has little to do with it.
That's exactly why Crackdown 3 is getting it. http://www.vg247.com/2013/08/12/a-fully-geo-mod-enabled-saints-row-is-literally-impossible-in-this-gen-says-volition/ Reason: “With the kind of competition that’s out there I think, I suspect it would almost be impossible to do it and still remain competitive visually.” With cel shading there are no worries about texturing all the new pieces, and the stylized lighting simplifies rendering.
SvennoJ said:
There are other ways to optimize total destruction. Red Faction Armageddon managed fine on 360, the Geomod engine should do fine scaled up to this gen.
|
Assuming this is dealt with by the most powerful CPU this console generation has to offer, at best we can hope for is 1/14th of the destruction of crackdown.
Except GPUs are perfectly capable of these kinds of physics tasks. Simple stats: the XB1 has 112 Gflops of total CPU power, while the PS4 has 533 Gflops of extra GPU power over the XB1. That's almost 5 extra XB1 CPUs' worth of compute hidden in the GPU. And don't forget the overhead of distributed general-purpose computing compared to local specialized hardware.
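The "5 extra CPUs" figure is just the ratio of the two Gflops numbers quoted in this thread (ballpark figures, not official spec sheets); a quick sketch of the arithmetic:

```python
# Rough flops comparison behind the "5 extra CPUs" claim.
# Figures are the ballpark numbers quoted in the thread, not official specs.
xb1_cpu_gflops = 112        # total XB1 CPU throughput (as quoted above)
ps4_extra_gpu_gflops = 533  # PS4 GPU advantage over the XB1 (as quoted above)

extra_cpus_equivalent = ps4_extra_gpu_gflops / xb1_cpu_gflops
print(round(extra_cpus_equivalent, 1))  # prints 4.8, i.e. almost five XB1 CPUs
```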
SvennoJ said:
What would be a problem is online multiplayer when every client has to synchronize all the destruction with the other clients. Dedicated servers are a better solution when a lot of moving parts are involved. Plus the server has the luxury to calculate ahead and spoon feed the clients within a set bandwidth limit. As soon as you fire a rocket, the outcome can already be calculated.
|
These "problems" are the same today. If I blow up a tank in a 32-player game of Battlefield, that info still needs to be synced, as does every bullet fired etc.
Without dedicated servers it quickly escalates with every extra player. Having one central server is far more efficient when a lot of data has to move between clients. It would be a cool challenge to use distributed computing in a peer-to-peer setup: each client is assigned a slice of the physics, calculates it, and shares the results with the others. The more players join, the bigger the mayhem can get. Yet bandwidth limits and latency quickly blow that up, as each console needs to communicate with every other connected player. Hence dedicated servers.
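The scaling problem above is easy to put in numbers: a full-mesh peer-to-peer setup needs a link between every pair of consoles, while a dedicated server needs one link per console. A minimal sketch (illustrative only):

```python
# Why full-mesh P2P sync blows up while a dedicated server scales linearly.
# With n players, P2P needs n*(n-1)/2 links; a central server needs n.
def p2p_links(n: int) -> int:
    """Connections in a full mesh of n peers."""
    return n * (n - 1) // 2

def server_links(n: int) -> int:
    """Connections with one dedicated server: one per client."""
    return n

for n in (4, 8, 16, 32):
    print(n, p2p_links(n), server_links(n))
# At 32 players: 496 P2P links versus 32 server links.
```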
SvennoJ said:
Anyway we don't know much about the single player atm: how much destruction it will allow, or whether the graphics will be enhanced compared to competitive multiplayer. Nor how well it will work over an average internet connection on a shared wifi router.
|
If there's an online caveat, then potentially the same. Even for game design, they could always unlock it upon the game's completion.
Everything else is the same as any other game's online performance.
Except a consistent 2-4 mbps requirement is a lot more than online games have required so far. At the beginning of the gen MS stated 1.5 mbps would be their optimal experience target. http://news.xbox.com/2013/06/connected Plus, can they smooth it out enough? If, for example, the bulk of the data is needed in the first 5 frames out of 30, you're effectively looking at a 24 mbps requirement, even though it's only needed for the first 160 ms. Just as frame-rate judder can break the performance of a game, spikes in data transfer can be just as off-putting.
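The burst figure comes from simple arithmetic: if a second's worth of data at 4 mbps all has to arrive inside 5 of the 30 frames, the link has to sustain six times the average rate during that window. A sketch with the thread's assumed numbers:

```python
# Burst-bandwidth arithmetic from the paragraph above (illustrative numbers).
avg_mbps = 4.0        # assumed sustained requirement from the thread
frames_per_sec = 30
burst_frames = 5      # bulk of the data arrives in the first 5 frames

burst_window_s = burst_frames / frames_per_sec       # window length, about 0.167 s
peak_mbps = avg_mbps * frames_per_sec / burst_frames  # rate needed during the burst
print(burst_window_s, peak_mbps)  # prints 0.16666666666666666 24.0
```

So an "average" 4 mbps stream with this kind of clumping behaves like a 24 mbps connection requirement for short stretches.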