shikamaru317 said:
Machiavellian said:

If we are looking at just using current CPU/GPU tech, I would agree. This has always been the advantage of consoles: their ability to marry custom designs we would not see in a traditional PC setup. Look at how MS created their own custom coprocessor for HoloLens, which is pretty much exactly what they could do for ray tracing. The HoloLens coprocessor handles all the 3D environment, tracking, and other calculations, and hands that data to the CPU already cooked. The same thing could be done with ray tracing, where the coprocessor would handle cooking the scene and hand it off to the GPU for final output. The cost of such a chip would be pretty cheap, or at least cheap enough to stay within the $500 budget.

The way I am looking at this is that MS has the tech, they now offer the software solution, and they have support from major engine developers and publishers. You have a good 2-to-3-year span for games to come out with the tech and showcase the work. I do not believe 4K 60fps provides enough wow factor to move the goal posts for MS over Sony. If you are going to do this, go big or go home.

Thing is, though, if they drop the AMD CPU + AMD GPU combo, they lose backwards compatibility. At this point they've built up a massive catalog of backwards-compatible titles for the 360, as well as a few for the OG Xbox, a library which will be even bigger in 2020 when Xbox Scarlett most likely releases. And if they go with AMD + AMD again, they will also have backwards compatibility with a lot of Xbox One games by default, because they've been working with devs for a while now to make their games forward compatible with future AMD + AMD Xbox hardware. I can't see them giving that up for a custom solution just for ray tracing. Sure, they could go the dual-chip route, with both an AMD APU and a custom chip designed around ray tracing, but history shows that dual-chip solutions usually don't work out well: they make development harder, and only a handful of skilled devs end up utilizing the second chip properly. A dual-chip solution would also be expensive. As it is, it's going to be hard to hit native 4K 60fps with even a small graphical upgrade over this gen for $500 in 2020; a dual-chip solution would either force them to sell at a loss or push the next Xbox above $500 at release.

Nah, I think they will just aim for a 2020 release at $500 with native 4K 60fps on most games, plus some small-to-medium graphical upgrades over the Xbox One X. Sure, there's not a big wow factor there, but PS5 won't have a big wow factor either, whether it releases in 2019 or 2020. Maintaining backwards compatibility is a must, as that will be a big selling point for the 35m+ Xbox One owners (a number which will most likely be over 50m by 2020). Their efforts to improve their first party for next gen with all these new studios will help win over new fans as well. Then, in say 2023, they can release their mid-gen upgrade and push graphics closer to wherever PC ultra is at in 2023 (which will most likely include some ray tracing).

MS would not need to ditch the AMD CPU + GPU combo. The good thing about a discrete coprocessor is that it can just plug in like an audio chip or any other specialized processor. This is where writing to an API alleviates all the extra coding a developer would otherwise need to do to work with such a setup. The API would decide what data to send to the coprocessor to crunch, let it cook the scene, then send the results back to the CPU/GPU for output. No need to break compatibility, since we are talking to the API and not the metal here. And since ray tracing is in DX12 with support for current-gen GPUs, compatibility is retained even if a coprocessor is not used.