trasharmdsister12 said:
Otter said:
" Backwards compatible games run natively on the Xbox Series X hardware, running with the full power of the CPU, GPU and the SSD. No boost mode"

What's the difference between running natively and boost mode?

The difference is that in a "boost mode" there's a virtual limit set on the hardware.

So for example, say a game on Xbox One X runs at a target of 4K and 60 FPS set by the developers. Xbox Series X will use as much of its CPU and GPU as it needs to hit that target. If the developer patches the game to support higher settings or raises 60 FPS to 120 FPS, then Xbox Series X will automatically scale up its CPU and GPU usage to reach that new target, right up until the hardware hits its natural limit.
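
To make that concrete, here's a minimal Python sketch of the idea (the budgets and per-frame cost are illustrative numbers I'm making up, not real specs): native BC just draws on however much of the new hardware's budget the developer's target demands, capped by the chip's natural limit.

```python
# Native BC, sketched: scale usage up to the developer's target,
# never past the hardware's natural limit. All numbers illustrative.
OLD_GPU_BUDGET = 6.0    # roughly One X-class TFLOPS
NEW_GPU_BUDGET = 12.0   # roughly Series X-class TFLOPS

def achievable_fps(budget_tflops: float, cost_per_frame: float, target_fps: float) -> float:
    """Use as much of the budget as the target demands, capped by what the chip can do."""
    return min(target_fps, budget_tflops / cost_per_frame)

COST = 0.09  # hypothetical GPU work per frame at the game's settings

print(achievable_fps(OLD_GPU_BUDGET, COST, 60))   # 60.0  -- original target on old hardware
print(achievable_fps(NEW_GPU_BUDGET, COST, 120))  # 120.0 -- a 120 FPS patch, still within budget
```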

In boost mode (as Sony has employed it on PS4 Pro - not sure this is the definition MS is using in their PR), the hardware is virtualized so that games run against a specific, fixed hardware profile. This can ease compatibility. For example, Sony's boost mode on PS4 Pro can run some PS4 games at better framerates even though they never got PS4 Pro patches. But that new performance profile doesn't take full advantage of the CPU or GPU: they're restricted to a certain frequency, or to a certain number of cores/resources, to parallel the earlier hardware. So even if the hardware is capable of a higher performance threshold, it won't be realized.
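
A toy model of that restriction, again with made-up numbers rather than real console specs: the virtualized profile pins the game to the old machine's thread count, so the only uplift comes from clocks and IPC, never from the extra silicon.

```python
# Boost-mode-style cap, sketched: the game only ever sees the old topology.
OLD_THREADS, OLD_CLOCK_GHZ = 8, 1.6   # old-console-like (illustrative)
NEW_THREADS, NEW_CLOCK_GHZ = 16, 3.5  # next-gen-like (illustrative)

def throughput(threads: int, clock_ghz: float, ipc_gain: float = 1.0) -> float:
    """Abstract 'work units per second' for a perfectly parallel workload."""
    return threads * clock_ghz * ipc_gain

boost = throughput(OLD_THREADS, NEW_CLOCK_GHZ)   # virtualized: old thread count, new clocks
native = throughput(NEW_THREADS, NEW_CLOCK_GHZ)  # native BC: the whole chip is on the table

print(boost)   # 28.0 -- better than the old console (12.8), but half the silicon idles
print(native)  # 56.0 -- the natural ceiling when nothing is fenced off
```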

So really the big difference in practice is when a developer goes back and adds a higher-framerate mode; for this example, RDR2 at 60 FPS. With a boost mode, the new console would limit its CPU and GPU resources to mimic the hardware setup of the older console (fewer CPU cores/threads and GPU compute units, so the code's parallelism behaves the same way), and the boost comes from frequency upgrades and other IPC improvements. That imposes a lower natural ceiling, because maybe half the CPU is just sitting idle. In the RDR2-at-60 case, this setup might top out at 50-55 FPS even though half the CPU goes unused. MS's approach is what could let them use the rest of that CPU to push the framerate to a solid 60, with only the parts of the CPU that aren't needed to hit that target sitting idle.
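
Putting rough numbers on that RDR2 example (a hypothetical serial/parallel split per frame, nothing measured): model a frame's CPU cost as a serial slice plus work that divides across however many threads the console exposes to the game.

```python
# Amdahl's-law-style frame time: serial work plus parallel work split over threads.
def fps(serial_ms: float, parallel_ms: float, threads: int, target: float = 60.0) -> float:
    frame_ms = serial_ms + parallel_ms / threads
    return min(target, 1000.0 / frame_ms)  # cap at the game's framerate target

SERIAL, PARALLEL = 6.0, 104.0  # hypothetical per-frame CPU cost in ms

print(round(fps(SERIAL, PARALLEL, 8)))   # ~53 FPS: boost-style cap, half the threads fenced off
print(round(fps(SERIAL, PARALLEL, 16)))  # 60 FPS: native BC holds the target with headroom
```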

I see, thanks for the detailed response!

I'm curious to see how this will compare to Sony's approach and whether it will make a meaningful difference to the quality of BC between the two systems.