
Forums - Microsoft Discussion - Microsoft shows off potential power of the Cloud at BUILD

fatslob-:O said:

The rest of the demos look like they have prefractured structures ... 

It's clear that these demos fake the destruction and you can clearly see the flaws. Don't just assume that they're on the same level, because they are NOWHERE NEAR the same. The cloud demo is doing MUCH MORE than those demos from either Nvidia or Havok. 


MS's demo appears to have prefractured destruction; just look at the balls that get blown up and then become uninteractive. Nvidia's APEX demos actually use real-time fracture, though.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

zarx said:
fatslob-:O said:

The rest of the demos look like they have prefractured structures ... 

It's clear that these demos fake the destruction and you can clearly see the flaws. Don't just assume that they're on the same level, because they are NOWHERE NEAR the same. The cloud demo is doing MUCH MORE than those demos from either Nvidia or Havok. 


MS's demo appears to have prefractured destruction; just look at the balls that get blown up and then become uninteractive. Nvidia's APEX demos actually use real-time fracture, though.

No no, the rubble is created in real time, as you can see in Microsoft's demo up at the top with its "chunk count". It's cool that Nvidia showed real-time fracturing in this demo, but the rest of their demos don't do it because it's computationally expensive, as seen in Microsoft's demo. Compared to the scale shown, it's clear that the MS demo is more impressive considering that it's handling collision detection with 30,000 procedurally generated chunks!
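For a sense of why that chunk count is expensive: a brute-force broad phase tests every pair of chunks, so the work grows quadratically. A back-of-the-envelope Python sketch (the 30,000 figure is from the demo; the 8-neighbour grid estimate is purely an illustrative assumption):

```python
def naive_pair_checks(n):
    """Pairwise overlap tests a brute-force broad phase performs per tick."""
    return n * (n - 1) // 2

def grid_pair_checks(n, avg_neighbours=8):
    """Rough test count with a uniform-grid broad phase (assumed neighbour count)."""
    return n * avg_neighbours // 2

chunks = 30_000
print(f"{naive_pair_checks(chunks):,} brute-force pair tests per tick")  # 449,985,000
print(f"{grid_pair_checks(chunks):,} tests with a spatial grid")         # 120,000
```

Even with a good spatial acceleration structure, 30,000 moving bodies is a heavy broad-phase and narrow-phase load every tick, which is the point being made about the cost.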



Why 32fps and not 60fps?



fatslob-:O said:

No no, the rubble is created in real time, as you can see in Microsoft's demo up at the top with its "chunk count". It's cool that Nvidia showed real-time fracturing in this demo, but the rest of their demos don't do it because it's computationally expensive, as seen in Microsoft's demo. Compared to the scale shown, it's clear that the MS demo is more impressive considering that it's handling collision detection with 30,000 procedurally generated chunks!


The Colosseum demo from last year also uses real-time fracture:

"The arena after serious destruction resulting in 20k compounds and 32k convexes. Throughout the simulation, the fracture time stays below 50ms"

"As mentioned earlier, the frame rates in our examples are determined by rendering, rigid body simulation and dust simulation, typically using about 15, 45 and 40 percent of the computation time, respectively."

"The simulation runs at over 30 fps including rigid body simulation, dust simulation and rendering until the end of the sequence where the original mesh is split up into 20k separate pieces."

http://matthias-mueller-fischer.ch/publications/fractureSG2013.pdf

If there are 52k pieces (20k compounds plus 32k convexes), that means a number of chunks in real time comparable to the MS demo.
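The paper's numbers can be turned into a per-frame time budget. A quick sketch, using only the figures quoted above (30 fps and the 15/45/40 percent computation-time split):

```python
FPS = 30                      # the paper reports "over 30 fps"
FRAME_BUDGET_MS = 1000 / FPS  # ~33.3 ms available per frame

# Computation-time shares quoted from the paper
shares = {"rendering": 0.15, "rigid body simulation": 0.45, "dust simulation": 0.40}

for name, share in shares.items():
    print(f"{name}: {share * FRAME_BUDGET_MS:.1f} ms per frame")
# rendering: 5.0 ms, rigid body simulation: 15.0 ms, dust simulation: 13.3 ms
```

So rigid-body simulation alone consumes roughly 15 ms of each ~33 ms frame, which is why the quoted sub-50ms fracture events are notable: they are a transient cost on top of an already tight budget.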




fatslob-:O said:

No no, the rubble is created in real time, as you can see in Microsoft's demo up at the top with its "chunk count". It's cool that Nvidia showed real-time fracturing in this demo, but the rest of their demos don't do it because it's computationally expensive, as seen in Microsoft's demo. Compared to the scale shown, it's clear that the MS demo is more impressive considering that it's handling collision detection with 30,000 procedurally generated chunks!

That would be its Achilles' heel, as impressive as it sounds. The data stream to update that kind of debris count at 32fps would easily exceed 60 Mbps. Plus the local machine still needs to have the spare overhead to render all that extra stuff. No wonder it's all simple grey blocks with flat lighting and no shadows.
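A rough check of that bandwidth claim. Assuming each chunk's state is streamed as an uncompressed position plus orientation quaternion in 32-bit floats (the 28-byte encoding is an assumption; a real protocol would compress and send deltas), with the 30,000 chunks and 32fps from the demo:

```python
CHUNKS = 30_000   # chunk count shown in the MS demo
FPS = 32          # update rate from the demo
BYTES_PER_CHUNK = 3 * 4 + 4 * 4  # xyz position + quaternion as 32-bit floats (assumed)

bits_per_second = CHUNKS * BYTES_PER_CHUNK * FPS * 8
print(f"~{bits_per_second / 1e6:.0f} Mbps uncompressed")  # ~215 Mbps
```

Even with aggressive compression and delta encoding, the naive figure of roughly 215 Mbps supports the point that streaming full debris state would far exceed a typical 60 Mbps connection.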



zarx said:
fatslob-:O said:

No no, the rubble is created in real time, as you can see in Microsoft's demo up at the top with its "chunk count". It's cool that Nvidia showed real-time fracturing in this demo, but the rest of their demos don't do it because it's computationally expensive, as seen in Microsoft's demo. Compared to the scale shown, it's clear that the MS demo is more impressive considering that it's handling collision detection with 30,000 procedurally generated chunks!


The Colosseum demo from last year also uses real-time fracture:

"The arena after serious destruction resulting in 20k compounds and 32k convexes. Throughout the simulation, the fracture time stays below 50ms"

"As mentioned earlier, the frame rates in our examples are determined by rendering, rigid body simulation and dust simulation, typically using about 15, 45 and 40 percent of the computation time, respectively."

"The simulation runs at over 30 fps including rigid body simulation, dust simulation and rendering until the end of the sequence where the original mesh is split up into 20k separate pieces."

http://matthias-mueller-fischer.ch/publications/fractureSG2013.pdf

If there are 52k pieces (20k compounds plus 32k convexes), that means a number of chunks in real time comparable to the MS demo.

Fracturing is one part, but the MS demo could also be doing some stress simulation ... 



SvennoJ said:
fatslob-:O said:

No no, the rubble is created in real time, as you can see in Microsoft's demo up at the top with its "chunk count". It's cool that Nvidia showed real-time fracturing in this demo, but the rest of their demos don't do it because it's computationally expensive, as seen in Microsoft's demo. Compared to the scale shown, it's clear that the MS demo is more impressive considering that it's handling collision detection with 30,000 procedurally generated chunks!

That would be its Achilles' heel, as impressive as it sounds. The data stream to update that kind of debris count at 32fps would easily exceed 60 Mbps. Plus the local machine still needs to have the spare overhead to render all that extra stuff. No wonder it's all simple grey blocks with flat lighting and no shadows.

Nobody knows if the demo itself is streamed or is using synchronized rendering ... I'd assume it's likely streamed ... 



I don't buy it, tbh. The demo was devoid of a LOT of information, and very, very specifically, they do not at ANY point say "GPU". Then he later goes on to say, "Even if we had multiple high end machines and these things were talking, they couldn't do the kind of..." So what you're saying there is, it's not possible even if there are multiple machines doing the computation? Like... you know... a cloud of computers?

Again, however, the key point for me is how they actively sidestep, at every given point, any mention of "CPU" or "GPU". If they were running the physics on the CPU, then the framerate would indeed be very stable until the physics started to kick it up a notch.

Having spent a lot of time messing with the libraries in the latest SDKs for using Azure beyond data storage and on to processing, I do not buy this example, not a single bit.

All signs point to the left frame being run on the CPU and the right on the GPU.



Can they do a test like this with a real game that's already online? That would be very helpful.



Too bad it's nearly impossible to use IRL when developers have no idea what a person's internet connection will be like. Plus, if games were built from the ground up with this in mind and a person loses internet while playing said game, they're going to face a ton of performance issues.



You're Gonna Carry That Weight.

Xbox One - PS4 - Wii U - PC