
Is cloud the new blast processing of this generation?

 

Poll: Is cloud pretty much a marketing term?

Yes... and I ain't falling for it. 83 votes (76.85%)
No... Cloud POWAH FTW!!! 13 votes (12.04%)
Yo Mama 12 votes (11.11%)

Total: 108 votes

vivster said:

All the CPU and RAM in the world won't help if the shaders are overworked. That's what I meant.

I never said that MS is responsible for these "gaming PCs". Just that they are helping to keep the average consumer dumb.


Well, CPUs are far more flexible and programmable than any GPU; that's where their strength lies, and they can indeed handle graphics duties with better image quality than ANY GPU.
However, because of a CPU's serial processing nature, it generally isn't very good at tasks that can be run highly parallel, such as graphics; that's where a graphics processor steps in, being a design with many thousands of cores that leverages the fact that graphics tasks are highly parallel.
Then again, it's all mathematics in the end; the CPU can just handle far more complex math than a GPU, whilst the GPU can handle far more math problems at once.
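
To make that concrete, here is a minimal C++ sketch of the idea: the per-pixel math is identical either way, and only the amount of parallelism changes. std::thread stands in for the GPU's thousands of cores, and all names and numbers here are illustrative.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// The same "dumb" per-pixel operation a shader core would run.
void shade_range(std::vector<float>& pixels, std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i)
        pixels[i] = std::sqrt(pixels[i]) * 0.5f;  // arbitrary per-pixel math
}

int main() {
    std::vector<float> framebuffer(1920 * 1080, 1.0f);

    // "CPU style": one core walks the whole framebuffer in order.
    shade_range(framebuffer, 0, framebuffer.size());

    // "GPU style": identical math, chopped into independent chunks that run
    // simultaneously; a real GPU uses thousands of cores, not a few threads.
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = framebuffer.size() / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == n) ? framebuffer.size() : begin + chunk;
        workers.emplace_back(shade_range, std::ref(framebuffer), begin, end);
    }
    for (auto& w : workers) w.join();
}
```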

This is why, after the 90s era, most games on the PC landscape shifted from "software" to "hardware" rendering (i.e. from being handled by the CPU to being run on the GPU): GPUs can leverage parallel processing better than any CPU can. That comes at the cost of the GPU being "stupid" in comparison, conserving transistor space for more "dumb cores" and better performance.

With consoles having so many CPU cores, however, developers will probably leverage a CPU core or two for some smaller framebuffer effects such as morphological anti-aliasing (which is essentially a blur filter).
Thus the CPU will still handle some graphics duties, and this is where Azure/the cloud can also step in.
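
As a rough illustration of the kind of framebuffer pass a CPU core could run (real MLAA classifies edge shapes before blending, so treat this as a heavily simplified sketch with made-up names):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Blend each pixel toward its right-hand neighbour wherever the
// luminance step is large enough to look like an aliased edge.
void smooth_edges(std::vector<float>& lum, int width, int height,
                  float threshold = 0.1f) {
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x + 1 < width; ++x) {
            const std::size_t i = static_cast<std::size_t>(y) * width + x;
            const float step = std::fabs(lum[i] - lum[i + 1]);
            if (step > threshold)
                lum[i] = 0.5f * (lum[i] + lum[i + 1]);  // the "blur"
        }
    }
}

int main() {
    std::vector<float> luminance(64 * 64, 0.0f);
    luminance[32 * 64 + 31] = 1.0f;  // a hard edge to smooth
    smooth_edges(luminance, 64, 64);
}
```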



--::{PC Gaming Master Race}::--


Oh yeah, cloud is going to be awesome. I mean PS Now of course, not the no-visible-results XB1 cloud.



My 8th gen collection

Pemalite said:
vivster said:

All the CPU and RAM in the world won't help if the shaders are overworked. That's what I meant.

I never said that MS is responsible for these "gaming PCs". Just that they are helping to keep the average consumer dumb.


Well, CPUs are far more flexible and programmable than any GPU; that's where their strength lies, and they can indeed handle graphics duties with better image quality than ANY GPU.
However, because of a CPU's serial processing nature, it generally isn't very good at tasks that can be run highly parallel, such as graphics; that's where a graphics processor steps in, being a design with many thousands of cores that leverages the fact that graphics tasks are highly parallel.
Then again, it's all mathematics in the end; the CPU can just handle far more complex math than a GPU, whilst the GPU can handle far more math problems at once.

This is why, after the 90s era, most games on the PC landscape shifted from "software" to "hardware" rendering (i.e. from being handled by the CPU to being run on the GPU): GPUs can leverage parallel processing better than any CPU can. That comes at the cost of the GPU being "stupid" in comparison, conserving transistor space for more "dumb cores" and better performance.

With consoles having so many CPU cores, however, developers will probably leverage a CPU core or two for some smaller framebuffer effects such as morphological anti-aliasing (which is essentially a blur filter).
Thus the CPU will still handle some graphics duties, and this is where Azure/the cloud can also step in.

Do you really believe that, though? The cloud may very well be able to process these things, but look at the delay. It will hardly be able to process graphical features and transport the results back in a timely manner to be presented on screen.
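
The back-of-the-envelope numbers behind that delay argument, with an assumed round-trip time (the RTT figure is a guess, not a measurement):

```cpp
#include <cstdio>

int main() {
    const double frame_budget_ms = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 fps
    const double rtt_ms = 60.0;                    // assumed cloud round trip

    std::printf("frame budget: %.1f ms, cloud RTT: %.1f ms\n",
                frame_budget_ms, rtt_ms);
    std::printf("a cloud result arrives ~%.0f frames late\n",
                rtt_ms / frame_budget_ms);         // roughly 4 frames behind
}
```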



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Pemalite said:
vivster said:

All the CPU and RAM in the world won't help if the shaders are overworked. That's what I meant.

I never said that MS is responsible for these "gaming PCs". Just that they are helping to keep the average consumer dumb.


Well, CPUs are far more flexible and programmable than any GPU; that's where their strength lies, and they can indeed handle graphics duties with better image quality than ANY GPU.
However, because of a CPU's serial processing nature, it generally isn't very good at tasks that can be run highly parallel, such as graphics; that's where a graphics processor steps in, being a design with many thousands of cores that leverages the fact that graphics tasks are highly parallel.
Then again, it's all mathematics in the end; the CPU can just handle far more complex math than a GPU, whilst the GPU can handle far more math problems at once.

This is why, after the 90s era, most games on the PC landscape shifted from "software" to "hardware" rendering (i.e. from being handled by the CPU to being run on the GPU): GPUs can leverage parallel processing better than any CPU can. That comes at the cost of the GPU being "stupid" in comparison, conserving transistor space for more "dumb cores" and better performance.

With consoles having so many CPU cores, however, developers will probably leverage a CPU core or two for some smaller framebuffer effects such as morphological anti-aliasing (which is essentially a blur filter).
Thus the CPU will still handle some graphics duties, and this is where Azure/the cloud can also step in.

Do you really believe that, though? The cloud may very well be able to process these things, but look at the delay. It will hardly be able to process graphical features and transport the results back in a timely manner to be presented on screen.


Of course I believe it.
Will it make graphics 10x better? Hardly.
Will it make textures sharper and cleaner? Nope.
Will it make the system run at a higher resolution? Not going to happen.

What *can* happen is that they use a technique similar to what CPUs use, namely "prediction": computing data ahead of time so it's ready when needed. It's going to be limited to tasks that aren't latency- or bandwidth-sensitive, of course.
And unlike the "Blast Processing" debacle, the processing power is indeed real; it's just limited in its use-case scenarios.
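
A minimal sketch of how that ahead-of-time approach could look, assuming work that isn't frame-latency-sensitive; fetch_from_cloud and ZonePredictor are hypothetical stand-ins, not any real API:

```cpp
#include <future>
#include <string>
#include <vector>

// Hypothetical stand-in for a network round trip that asks the server
// to prepare data for a zone the player hasn't reached yet.
std::vector<float> fetch_from_cloud(const std::string& zone_id) {
    (void)zone_id;
    return std::vector<float>(1024, 0.0f);
}

struct ZonePredictor {
    std::future<std::vector<float>> pending;

    // Issued seconds before the data is needed, so the round-trip
    // delay is hidden behind ordinary gameplay.
    void prefetch(const std::string& zone_id) {
        pending = std::async(std::launch::async, fetch_from_cloud, zone_id);
    }

    // By the time this is called the result has usually arrived,
    // so the frame doesn't stall waiting on the network.
    std::vector<float> take() { return pending.get(); }
};

int main() {
    ZonePredictor predictor;
    predictor.prefetch("zone_42");  // fired well ahead of time
    auto data = predictor.take();   // consumed much later
    (void)data;
}
```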



--::{PC Gaming Master Race}::--

ICStats said:
Oh yeah, cloud is going to be awesome. I mean PS Now of course, not the no-visible-results XB1 cloud.

The glorious X1 cloud will show its true power soon.........maybe!



       ---Member of the official Squeezol Fanclub---

Pemalite said:


Of course I believe it.
Will it make graphics 10x better? Hardly.
Will it make textures sharper and cleaner? Nope.
Will it make the system run at a higher resolution? Not going to happen.

What *can* happen is that they use a technique similar to what CPUs use, namely "prediction": computing data ahead of time so it's ready when needed. It's going to be limited to tasks that aren't latency- or bandwidth-sensitive, of course.
And unlike the "Blast Processing" debacle, the processing power is indeed real; it's just limited in its use-case scenarios.

And there we are at the core of the problem. That's not how it is advertised; it is advertised as giving the X1 a 3x faster CPU. The cloud can be used, but the offloaded tasks are either negligible or have nothing to do with visual fidelity. Take this for example:

What they are showing there implies that a machine using the cloud would be able to do this. That this would never be possible on a consumer X1 using the cloud over the internet should be clear to the tech-savvy consumer, but the uninformed will gobble it up. MS does very well at avoiding outright lies, though. It's always fun to hear them talk about the cloud, maneuvering through the difficult topics while still giving the impression that it will make games run faster.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Pemalite said:

Then again, it's all mathematics in the end; the CPU can just handle far more complex math than a GPU, whilst the GPU can handle far more math problems at once.

GPUs & CPUs can handle the same math; it's just a matter of how quickly you need the result. GPU "threads" each execute very slowly compared to a modern CPU core.

For example, a CPU core can execute several instructions from the same thread each cycle, perhaps including SIMD, at say 3 GHz. A single thread may hypothetically run at, say, 3 giga-instructions per second.

A GPU like GCN runs at, say, 1 GHz or less, so it's already 3 times slower per thread. Then add up to 40 warps sharing a core (similar to how CPU hyperthreads share a core), with each instruction being 'scalar' (no SIMD), and you will realize that the GPU may run a single thread hundreds of times slower than the CPU.

Therefore, if you need the answer to a single math problem quickly, the CPU will win every time. On the other hand, the GPU can run tens of thousands of threads at the same time, so if you need ten thousand answers, the GPU will win every time, and more power-efficiently too.
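
Plugging those hypothetical figures in confirms the per-thread gap (all numbers illustrative, per the hypotheticals above):

```cpp
#include <cstdio>

int main() {
    // Hypothetical figures from the post above, not measurements.
    const double cpu_thread_ips = 3e9;    // ~3 G instructions/s on one CPU thread
    const double gpu_clock_hz   = 1e9;    // GPU core at ~1 GHz
    const double warps_per_core = 40.0;   // warps taking turns on one core
    const double gpu_thread_ips = gpu_clock_hz / warps_per_core;  // ~25 M instr/s

    // On the order of a hundred times slower per thread (~120x here);
    // wider CPU issue or more warps pushes it into the hundreds.
    std::printf("CPU thread vs GPU thread: ~%.0fx\n",
                cpu_thread_ips / gpu_thread_ips);
}
```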

That's why doing graphics on the CPU is a terrible idea, one that becomes even more terrible over time. The future is more specialized silicon: graphics on GPU hardware, video encode/decode in hardware, computer vision in hardware, motion processing in hardware, etc.



My 8th gen collection

vivster said:
Pemalite said:


Of course I believe it.
Will it make graphics 10x better? Hardly.
Will it make textures sharper and cleaner? Nope.
Will it make the system run at a higher resolution? Not going to happen.

What *can* happen is that they use a technique similar to what CPUs use, namely "prediction": computing data ahead of time so it's ready when needed. It's going to be limited to tasks that aren't latency- or bandwidth-sensitive, of course.
And unlike the "Blast Processing" debacle, the processing power is indeed real; it's just limited in its use-case scenarios.

And there we are at the core of the problem. That's not how it is advertised; it is advertised as giving the X1 a 3x faster CPU. The cloud can be used, but the offloaded tasks are either negligible or have nothing to do with visual fidelity. Take this for example:

What they are showing there implies that a machine using the cloud would be able to do this. That this would never be possible on a consumer X1 using the cloud over the internet should be clear to the tech-savvy consumer, but the uninformed will gobble it up. MS does very well at avoiding outright lies, though. It's always fun to hear them talk about the cloud, maneuvering through the difficult topics while still giving the impression that it will make games run faster.


Why wouldn't this be possible on an X1? It wouldn't be unique to it, but if it's running on an x86 PC, it'll run on an X1.



youarebadatgames said:

Why wouldn't this be possible on an X1? It wouldn't be unique to it, but if it's running on an x86 PC, it'll run on an X1.

They did not specify how this particular demo works with the cloud. I highly doubt they used cloud servers over the internet; more likely they had servers on site. Since they said nothing about how it worked, it might as well have been an interactive video stream running entirely on cloud servers.

We will know when we see actual results on the X1 (which we won't).

Another fun thing would be knowing how they will accommodate people with no internet connection, or a shoddy one. Will games suddenly become unplayable when there is a little lag? Will the game just crash, or will it offload all the work back to the local CPU?
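
One plausible answer to the fallback question, sketched with hypothetical stand-ins (compute_in_cloud and compute_locally): race the cloud result against a frame-budget deadline and keep a cheaper local path ready.

```cpp
#include <chrono>
#include <future>
#include <thread>
#include <vector>

// Hypothetical stand-in for a cloud round trip that may be slow.
std::vector<float> compute_in_cloud() {
    std::this_thread::sleep_for(std::chrono::milliseconds(80));
    return std::vector<float>(256, 1.0f);  // "enhanced" result
}

// Cheaper on-console version of the same work.
std::vector<float> compute_locally() {
    return std::vector<float>(256, 0.5f);
}

int main() {
    auto cloud = std::async(std::launch::async, compute_in_cloud);

    std::vector<float> result;
    if (cloud.wait_for(std::chrono::milliseconds(30)) == std::future_status::ready)
        result = cloud.get();        // cloud answered within budget
    else
        result = compute_locally();  // lag: degrade gracefully instead of crashing

    // A real engine would reuse or cancel the pending request; note that
    // std::future's destructor waits for a std::async task to finish.
    (void)result;
}
```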



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

I skipped everything except the first few posts. Anyway, I'm going to say no, because the cloud actually exists and Microsoft built their console with the cloud in mind. Whether it does what they say it does remains to be seen.

Blast processing was totally imagined; if you break down a Sega Genesis, you'll never find a "Blast Processor". The term was made up after the SNES hit the market and Sega found out that they had a faster processor.

So, no.