Otter said:
Native 4K resolution isn't required for HDR? And you can market a console as a 4K machine without all of its games being 4K; that's down to developers (most 360/PS3 games aren't 1080p). What I'm saying is that 4K comes at a huge cost to other graphical improvements like lighting, texture quality, pop-in, DoF, VFX/simulations, poly count and of course performance. I got a 4K TV because it was on sale; I do love technology and wanted to experience the next jump. I would say I have quite a well-trained eye, and honestly the difference in resolution isn't huge unless you have a huge 60"+ TV or sit right in front of your screen. I appreciate it, but I also see 1080p as a very pleasing resolution to the eye. Given how much pure processing power it takes to render an image natively in 4K (in layman's terms, we're talking 7-8 TFLOPs for PS4-quality visuals, and realistically the next consoles will be 10-12 TFLOPs), I just don't see it as a worthy investment when there are still many other improvements they could make that would leave an even bigger impression than a slightly crisper image. For example, I'd much rather see a 1440p checkerboard (or even 1080p) presentation that matches The Witcher 3's original trailer than a native 4K presentation of what the game actually looked like when it launched.
|
I will agree 4K is a resource eater. I believe it's more to do with the bandwidth than the TFLOPs, though I could be wrong there.
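To put a rough number on why 4K eats resources, here's a quick sketch of the raw pixel counts involved (just the per-frame shading load as a first approximation; this ignores bandwidth, which may well matter more, and the 50%-of-pixels figure for checkerboard is the commonly cited shading reduction, not an exact engine-specific number):

```python
# Compare per-frame pixel counts across common render resolutions.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = RESOLUTIONS["1080p"][0] * RESOLUTIONS["1080p"][1]  # 2,073,600 px

for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {px:,} pixels shaded per frame ({px / base:.2f}x 1080p)")

# Checkerboard 4K shades roughly half the pixels of native 4K per frame,
# then reconstructs the rest, which is why it lands near the cost of 1440p.
cb_4k = (3840 * 2160) // 2
print(f"4K checkerboard: ~{cb_4k:,} pixels shaded per frame ({cb_4k / base:.2f}x 1080p)")
```

Native 4K is a clean 4x the pixels of 1080p, so even before bandwidth enters the picture, the GPU is shading four times as many fragments per frame for the same scene.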
I brought up HDR because there is very little HDR content that isn't also 4K. It's a great complement to high pixel counts.
The reason I said 1080p has become obsolete is that TV brands rarely make 1080p panels anymore. The content will always be there for 1080p, but the industry is all about going forward, not backwards, and 1080p and 1440p are going backwards.
4K complements all those effects you mention quite well; they will look absolutely amazing when viewed on an extremely sharp image.
At the start of this generation I remember everyone bashing the Xbox One for not hitting 1080p regularly, so imagine the same thing happening when next gen arrives and games aren't hitting the TV standards.
Personally (I will be hated for this), for console games I'd much rather have 30fps at 4K in campaign modes and keep the must-have 60fps solely for MP. I was disappointed when 343 announced Halo 5's campaign would run at 60fps, because campaigns don't really need 60fps, and they could have done so much more with half the frames, just like the original games. The majority of the time when playing single-player games, it's like playing out a movie, and the visual quality matters more than the gameplay. For high-end PCs it's not an issue, and when using a mouse 60fps is quite necessary.
Next gen won't have issues running 4K as long as they're sticking with current game engines, though of course that will change sooner rather than later. This is why I think the XB1X is in a good situation even for next-gen consoles: when next gen arrives, it's pretty clear they will focus on 4K content as the primary selling point, and if that's true, all the X has to do is render those next-gen games at 1080p, which would help push next-gen games on the platform. So it's looking pretty future-proof right now.
Anyway, I love my 4K TV and PC setup. I'm happy running games between 40 and 60fps at 4K because the clarity is just so much easier on the eyes.