JaggedSac said:
That will only affect games that are developed for Natal, ie AI, physics, etc. It will not affect accuracy in any capacity and the input lag will most likely be improved since the 360 CPU is more powerful than a stand alone chip in Natal would be. The reason the chip was good was because developers could develop games as though Natal was just another controller. They would not have to worry about it eating up resources. Now they do. Having the software on the 360 can be a good thing. Specific games can have their own specific Natal software set that developers develop. It is a much more open platform now.
That's not necessarily true. A chip dedicated to processing the data the camera captures could very well be better than using 15-30% of the Xbox 360's CPU power. The chip would be designed for exactly the processing that data needs. CPUs are generalists, designed to be useful in almost any circumstance; a custom-designed chip would probably be lousy at most tasks, but excel at the one thing it's built for (processing Natal data).
An easy example is a GPU with an H.264 decoder versus one without. If your GPU has an H.264 decoder, CPU utilization while watching an HD video will probably stay relatively low, even on weaker CPUs. If your GPU doesn't have one, well, I hope you have a powerful CPU. The NVIDIA Tegra can handle 1080p H.264 decoding that an older desktop CPU would struggle with, yet in terms of raw general-purpose power, that older desktop CPU is probably stronger than the Tegra.
EDIT: On a side note, it was the cheaper, less powerful GPUs from NVIDIA that were the first to include an H.264 decoder, while the more powerful GPUs went without one.
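To make the offload idea concrete, here's a minimal sketch of the dispatch pattern being described: check for fixed-function hardware, use it when present, and fall back to the CPU path otherwise. Every name here is hypothetical, invented purely for illustration; real drivers expose this through their own APIs.

```python
# Hypothetical sketch of hardware-offload dispatch; not a real driver API.

def decode_frame_software(frame_bits):
    # Stand-in for a CPU-heavy software H.264 decode path.
    return {"frame": frame_bits, "path": "cpu"}

def decode_frame_hardware(frame_bits):
    # Stand-in for a fixed-function decoder: same output,
    # but almost no CPU work.
    return {"frame": frame_bits, "path": "gpu"}

def decode(frame_bits, has_hw_decoder):
    """Use the dedicated decoder when available, else burn CPU cycles."""
    if has_hw_decoder:
        return decode_frame_hardware(frame_bits)
    return decode_frame_software(frame_bits)

print(decode(b"\x00\x01", has_hw_decoder=True)["path"])   # -> gpu
print(decode(b"\x00\x01", has_hw_decoder=False)["path"])  # -> cpu
```

The point of the pattern is that the caller gets the same decoded frame either way; only the cost of producing it changes, which is exactly the trade-off in the GPU-decoder example above.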