Wyrdness said:
Thunderbird77 said:

Do not quote me if that's the kind of things you're going to say.

You still didn't explain yourself. Why is there no relation between hardware power and framerate + resolution?

 

I think what they mean is in regards to the assets used in a game. For example, X1 games will have assets that are natively 900p while Wii U games will have assets that are natively 720p. This means running the former at 30fps takes significantly more power than running the latter at 60fps, because of the workload those assets create.

When I say assets I'm not talking strictly about resolution, since that's just the visual output; PCs were able to output at HD resolutions long before the 360 was even a concept. Assets are things like textures, character models and so on. Assets are created at a native resolution and look best at it, and hardware not suited to that native resolution would not be able to run them at the same fidelity without compromises.

Framerate and resolution are related to power. There are a few PS3 games that run at 1080p and 60fps, but those games use natively sub-HD assets and don't do as much as most other games. It would be a stretch to say PS3 hardware is comparable to the 8th gen because of those few games.

The PS3's 1080p games are nowhere near the graphical level of 8th-gen console games. What I said before still stands.