Pemalite said:
zero129 said:

No, it isn't the Switch port, it's the PC version running lower than Low settings (using a few .ini tweaks) to show you that the same game can scale down to looking more than a generation apart, without a massive team, once the engine is already built to scale. It can also go beyond Ultra settings; with some .ini tweaks and a few texture mods it can look like an early next-gen game.

It's ironic that you're talking about cherry-picked photos when that's all you do in every thread: find photos with the smallest difference, claim they prove there is no difference, and call it a day. My pics clearly show there can be a massive difference, bigger than anything you showed between Infamous on PS3 and PS4.

.ini tweaks are just a list of options a game exposes; often, when you change something in a game's settings UI, it changes the corresponding entry in the .ini file.
And often a game's UI settings do not expose all of the extra options that exist in the .ini file.

And sometimes a game has a command-line console where you can take things even further, such as in id Tech, CryEngine and Source.
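To give a rough idea of what that looks like in practice, here's a sketch of the kind of thing you can do. The key names below are made up for illustration; they aren't any specific game's actual settings:

```ini
; Hypothetical example - these key names are illustrative only,
; not any specific game's real settings.
[Rendering]
TextureQuality=0          ; below the lowest value the UI slider exposes
FoliageDrawDistance=0.1   ; the in-game menu might stop at 0.5
ShadowMapResolution=256   ; the UI minimum might be 512
```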

goopy20 said:

It isn't about how graphics can scale, it's about parity and hitting performance targets on all platforms. Who would buy The Witcher 3 and play it at lower than the lowest settings? It's why PC games have minimum requirements in the first place. Sure, you can still play them on hardware below the minimum requirements, but then you will be spending money on a game that runs well below the quality bar the developer was aiming for.

There is actually a sub-group of PC gamers who try to run games on hardware that is below minimum requirements; it's a sizeable community.

The entire Oldblivion project (something I was a part of) was essentially an effort to rewrite The Elder Scrolls IV: Oblivion's shaders to be compatible with Shader Model 1.x. - Perfect for a GeForce 3, which is similar to the GPU found in the original Xbox.

We also did a similar thing for Fallout 3.

And another team created "Shadershock", which took a similar approach to run BioShock on older shader models.
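To give a rough idea of what those projects dealt with, here's a minimal sketch (not Oldblivion's actual code) of how a Direct3D 9 renderer can query the GPU's pixel shader support and pick a fallback path:

```cpp
#include <d3d9.h>
#include <cstdio>

// Minimal sketch, not Oldblivion's actual code: query the GPU's pixel
// shader capabilities through Direct3D 9 and select a render path.
void PickShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return;

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("Full Shader Model 2.0 path\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        printf("Rewritten Shader Model 1.x fallback path\n"); // GeForce 3 class
    else
        printf("Fixed-function fallback\n");
}
```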

System requirements are a guideline that lets a developer wash their hands legally if their game doesn't work on certain hardware; that is pretty much it. - In my 25-plus years of PC gaming, not once have I referred to a game's system requirements.

goopy20 said:

Sure, they could push Series X to the max with Halo Infinite while targeting 30fps/1440p. No doubt it would look spectacular, and the engine can easily do that. But why would they if they also want to sell the Xbox One and PC versions? You keep forgetting that the core game has to look and play identically on all platforms and hit 1080p and a steady 30fps on Xbox One.

Who says it has to look and play identical on all platforms?

Battlefield on 7th gen didn't. Multiplayer was very different on PC, with larger maps and higher player counts thanks to the greater amounts of CPU power and RAM... And it looked almost a generation ahead on PC.

Compare Minecraft RTX on PC to Minecraft on the 7th gen consoles: the game looks very different, plays very differently, and the 7th gen consoles even had world-size restrictions.

DonFerrari said:

Not really; a business decision doesn't necessarily have to be based on a technical limitation. They may just decide to cut the PS4 version, even if it's feasible, to entice more people to buy the newer console while sacrificing some of the software sales they would have had on PS4. Nothing in the trailer seemed impossible to dial down.

It usually takes a few years before "dialing things down" to slower hardware becomes impossible or extremely difficult; it takes time for developers to come to grips with a console platform's various hardware nuances, leverage them to the fullest, and build/upgrade their game engines to match. Being exclusive or not doesn't really change that.

It does mean that if a prior entry in a series used lots of baked assets and the successor uses lots of dynamic ones, a back-port to an older platform will look like a big visual regression compared to other games, as the dynamic assets get turned off.

Case in point: Black Ops 3 and Dragon Age: Inquisition on Xbox 360; the games looked flat, with a lack of shadowing and lighting to give scenes definition.
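As a hypothetical illustration (not any shipping engine's code) of why that happens: if a title budgets for dynamic lighting only and never ships baked data, the low-end path has nothing good to fall back on:

```cpp
// Hypothetical illustration, not any shipping engine's code: when a game
// relies entirely on dynamic lighting, a back-port has nothing baked to
// fall back on, and scenes lose the shadowing that gives them definition.
enum class LightingTier { DynamicGI, BakedLightmaps, VertexLitOnly };

LightingTier SelectLightingTier(bool gpuCanRunDynamicGI, bool bakedDataShipped)
{
    if (gpuCanRunDynamicGI)
        return LightingTier::DynamicGI;      // current-gen path
    if (bakedDataShipped)
        return LightingTier::BakedLightmaps; // classic 7th-gen-friendly path
    // If no lightmaps were ever baked, only a flat-looking fallback
    // remains - the effect seen in those Xbox 360 ports.
    return LightingTier::VertexLitOnly;
}
```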

The Switch seems to be handling back-ports very well, mostly because its hardware is very efficient at 720p and lower resolutions, thanks to the efficiency of nVidia's Maxwell architecture and its underlying technologies like tile-based rendering and delta colour compression.
Plus its hardware features match and even exceed the base Xbox One/PlayStation 4 consoles in a few areas thanks to being a more modern GPU; for example, the base PlayStation 4 and Xbox One support shader model 5 while the Switch supports shader model 6... So back-porting to the Switch is just easy, much easier than to a 7th gen device with an outdated and inefficient hardware feature set.

I know there's a subgroup who try to run games at lower than the lowest settings, just like there's a huge mod community that tries to push the visuals beyond what the developers intended. However, developers don't officially support those settings, as they typically don't want their games to look totally different across platforms. As soon as developers decide to build their game for multiple platforms, parity becomes a thing that affects the entire design and ambitions of the project before they even start making it. Unfortunately, that means the ambitions on the far more powerful platform will always be held back because of it.

There were some exceptions like BF3, and we've seen some atrocious down-ports like Shadow of Mordor, but in most cases you're getting a similar experience on base consoles as on a high-end PC, minus 120fps, 4K, and a bump in graphics settings. Xbox One will obviously hold things back in the first 2 years, but if we look beyond that and the whole Lockhart/Series X situation, it can only really work if Series X games are forced to run at 4K and 60fps.

What I really don't like about that is just how much computing power is essentially wasted in the pursuit of 4K and 60fps. I'm not saying that 4K doesn't bring better clarity, more detail, etc., but imo the turn to 4K is happening too soon relative to available computing power. Hell, 1440p hasn't even become mainstream in the PC space, and now we have this 4K "craze" that brings even the mighty RTX 2080 to its knees in current-gen games. It will massively slow the pursuit of better graphical fidelity, as GPUs in both PCs and consoles need to push four times the number of pixels.
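To put numbers on that four-times claim (standard resolution pixel counts, nothing game-specific), a quick sanity check:

```cpp
#include <cstdio>

int main() {
    // Pixel counts for common render resolutions.
    const double p1080 = 1920.0 * 1080;  // 2,073,600 pixels
    const double p1440 = 2560.0 * 1440;  // 3,686,400 pixels
    const double p2160 = 3840.0 * 2160;  // 8,294,400 pixels

    printf("4K vs 1080p: %.2fx the pixels\n", p2160 / p1080);  // 4.00x
    printf("4K vs 1440p: %.2fx the pixels\n", p2160 / p1440);  // 2.25x
    return 0;
}
```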

I mean, imagine what would be possible on Series X if, instead of sticking with base Xbox One settings and maxing out resolution, developers could focus on ramping visuals up to a maximum while keeping 1440p or even 1080p as the target rendering resolution. With Series X that won't be possible, as developers will already be doing that on the 4 TFLOPS Lockhart.