vivster said:
Pemalite said:
vivster said:
All the CPU and RAM in the world won't help if the shaders are overworked. That's what I meant.
I never said that MS is responsible for these "gaming PCs". Just that they are helping to keep the average consumer dumb.
|
Well, CPUs are far more flexible and programmable than any GPU; that's where their strength lies, and they can indeed handle graphics duties with better image quality than ANY GPU. However, because of a CPU's serial processing nature, it generally isn't very good at tasks that can be run highly in parallel, such as graphics work. That's where a graphics processor steps in: it's a design with many thousands of cores, which leverages the fact that graphics tasks are highly parallel. Then again, it's all mathematics in the end; the CPU can handle a lot more complex math than a GPU, whilst the GPU can handle far more math operations at once.
This is why, after the 90's era, most games on the PC shifted from "software" to "hardware" rendering (i.e. from being handled by the CPU to being run on a GPU): GPUs can simply leverage parallel processing better than any CPU can. However, that comes at the cost of the GPU being "Stupid" in comparison, conserving transistor space so it can pack in more "dumb cores" for better throughput.
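To put the serial-versus-parallel point in concrete terms, here's a rough plain-C++ sketch. Ordinary threads stand in for a GPU's cores and shade_pixel is just a made-up per-pixel operation, so treat it as an illustration of the idea rather than how a real driver or GPU works:

```cpp
// Same per-pixel math, run serially (CPU-style) or spread across many
// independent workers (the way a GPU spreads it across thousands of cores).
#include <thread>
#include <vector>
#include <cstddef>

// Hypothetical per-pixel operation: brighten a pixel value.
static void shade_pixel(float& px) { px *= 1.1f; }

// CPU-style: one core walks the framebuffer one pixel at a time.
void shade_serial(std::vector<float>& framebuffer) {
    for (float& px : framebuffer) shade_pixel(px);
}

// GPU-style (approximated with threads): every pixel is independent,
// so the work splits into chunks that run at the same time.
void shade_parallel(std::vector<float>& framebuffer, std::size_t workers) {
    std::vector<std::thread> pool;
    std::size_t chunk = framebuffer.size() / workers; // assumes workers >= 1
    for (std::size_t w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = (w + 1 == workers) ? framebuffer.size() : begin + chunk;
        pool.emplace_back([&framebuffer, begin, end] {
            for (std::size_t i = begin; i < end; ++i) shade_pixel(framebuffer[i]);
        });
    }
    for (auto& t : pool) t.join();
}

int main() {
    std::vector<float> framebuffer(1920 * 1080, 0.5f); // fake 1080p single-channel buffer
    shade_serial(framebuffer);      // one core, one pixel at a time
    shade_parallel(framebuffer, 8); // same math spread across 8 workers
}
```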
With consoles, however, having so many CPU cores, developers will probably leverage a core or two for some smaller framebuffer effects such as morphological Anti-Aliasing (a post-process filter applied to the finished frame). Thus the CPU will still handle some graphics duties, and this is where Azure/the cloud can also step in.
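As a toy illustration of that kind of framebuffer pass (real morphological AA detects edges and blends along them; this is just a simple 3x3 box blur over a grayscale buffer, with made-up names, to show the per-pixel work a spare CPU core could chew through):

```cpp
// Toy CPU-side post-process pass: average each pixel with its 3x3
// neighbourhood, clamping at the image border.
#include <vector>
#include <algorithm>

std::vector<float> blur_pass(const std::vector<float>& src, int width, int height) {
    std::vector<float> dst(src.size());
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float sum = 0.0f;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int sx = std::clamp(x + dx, 0, width - 1);
                    int sy = std::clamp(y + dy, 0, height - 1);
                    sum += src[sy * width + sx];
                }
            }
            dst[y * width + x] = sum / 9.0f;
        }
    }
    return dst;
}
```

Calling something like blur_pass(frame, 1920, 1080) on the finished frame each vsync is the sort of job that could be pinned to a spare core while the GPU gets on with the next frame.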
|
Do you really believe that, though? The cloud may very well be able to process these things, but look at the delay. It will hardly be able to process graphical features and transport the results back in a timely manner to be presented on screen.
|
Of course I believe it.
Will it make graphics 10x better? Hardly.
Will it make textures sharper and cleaner? Nope.
Will it make the system run at a higher resolution? Not going to happen.
What *can* happen is that they use a technique similar to what CPUs use: "prediction", where the data is requested and computed ahead of time so it's ready when it's needed. It's going to be limited to tasks that aren't latency- or bandwidth-sensitive, of course.
And unlike the "Blast Processing" debacle, the processing power is indeed real; it's just limited in its use-case scenarios.
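To be clear about the "ahead of time" part: at 60 fps a frame budget is only about 16.7 ms, while a typical internet round trip is several times that, so the request has to go out well before the result is consumed. A minimal sketch of that idea, with a made-up cloud_compute() standing in for whatever the remote work would actually be, and the timings purely illustrative:

```cpp
// Fire off latency-tolerant work early, keep rendering, and pick up the
// answer later once it has had plenty of time to come back.
#include <future>
#include <chrono>
#include <thread>
#include <iostream>

// Pretend remote job: e.g. baking lighting data for an area the player
// has not reached yet. The 50 ms sleep fakes the network round trip.
int cloud_compute(int region_id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return region_id * 1000; // fake result
}

int main() {
    // Issue the request several frames before the data is needed...
    std::future<int> pending = std::async(std::launch::async, cloud_compute, 42);

    // ...keep rendering local frames in the meantime...
    for (int frame = 0; frame < 6; ++frame) {
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 fps frame
        std::cout << "rendered frame " << frame << "\n";
    }

    // ...and by the time the game actually asks for it, the answer is already waiting.
    std::cout << "cloud result ready: " << pending.get() << "\n";
}
```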