
Forums - Gaming - Native 4K or Checkerboard "uprendered" 4k

SvennoJ said:

Perhaps. With chroma subsampling you only need 15 bits per pixel instead of 30 for 10-bit color. If memory bandwidth is a bottleneck, saving half of it could help. All the pixels still need to be rendered, though...

Yeah, my assumption was that the renderer could natively work with YUV/HSV-type color and straight up not spend render time on info it doesn't want.
Plus, given the relatively limited memory of consoles, something like this seems like it could be relevant in some cases.
(more so in the case of Xbone, but even for PS4 Pro, memory bandwidth was the least-improved factor of everything that changed)

Anyhow, I was kind of spit-balling ideas to get an understanding of other possibilities for different approaches we might see in the future.
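For what it's worth, the 15-vs-30-bit figure checks out under 4:2:0 subsampling, where one pair of chroma samples is shared by each 2x2 block of pixels. A quick sketch of that arithmetic (just illustrative; this isn't any console's actual pipeline):

```python
# Average storage per pixel for 10-bit video.
# 4:4:4 (no subsampling): every pixel carries its own Y, U and V samples.
# 4:2:0: every pixel carries a Y sample, but one U+V sample pair is
# shared by each 2x2 block of four pixels.
def bits_per_pixel_444(bit_depth):
    return 3 * bit_depth                    # Y + U + V per pixel

def bits_per_pixel_420(bit_depth):
    return bit_depth + 2 * bit_depth / 4    # Y, plus shared U and V

print(bits_per_pixel_444(10))  # 30
print(bits_per_pixel_420(10))  # 15.0 -> half the framebuffer bandwidth
```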



Peh said:
Slimebeast said:

How can it look blurry? I understand it's blurry if the monitor upscales from 1080p to 4K (and uses an upscaling technique similar to those you linked to), but can't you prevent it from upscaling so that it remains the original sharp 1080p image?

Those pics don't work, except for one.

It's basically this:

http://www.red.com/learn/red-101/upscaled-1080P-vs-4K

I have no influence over what the monitor does with the image. I just imagine that the frame buffer in the monitor is always 4K whatever the size of the source frame is, and it fills in the missing dots via interpolation.

Damn it! A new plan I came up with recently was to get a 4K monitor next instead of a 3440x1440 one, so that I can play all my older games in 1080p without any scaling problems. Since 1080p is 1/4 of 4K in pixel count, it should show up perfectly on a 4K screen and look "native" if every original pixel is represented by 4.

But from your experience, the damn interpolation apparently interpolates, and thereby softens and blurs the image automatically!

Damn it, what am I gonna do now?

Decisions.
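The "every original pixel represented by 4" idea does hold mathematically, since 1080p-to-4K is an exact 2x integer scale; whether it stays sharp depends on the scaler doing nearest-neighbour duplication rather than an interpolating (blending) filter. A toy sketch, with made-up greyscale values:

```python
# Nearest-neighbour 2x upscale: each source pixel becomes a sharp 2x2
# block, with no blending between neighbours. Interpolating scalers
# (bilinear and friends) instead average adjacent pixels, which is what
# softens a hard edge.
def nearest_2x(image):
    out = []
    for row in image:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each column
        out.append(doubled)
        out.append(list(doubled))                   # duplicate each row
    return out

sharp_edge = [[0, 255],
              [0, 255]]
print(nearest_2x(sharp_edge))
# [[0, 0, 255, 255], [0, 0, 255, 255], [0, 0, 255, 255], [0, 0, 255, 255]]
```

The hard 0-to-255 edge survives intact; a bilinear scaler would insert intermediate values along it.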



Peh said:
Pemalite said:

By "depends on the panel" I mean that some TVs have such poor brightness and colour calibration that resolution is less of a factor in the overall presentation of an image. (In other words, resolution isn't everything.)

Secondly, some TVs have scalers of questionable quality that may add noise or artifacts to the displayed image; in more moderate scenarios you might get additional input lag.

Thirdly, some TVs will also do post-processing on the displayed image to bolster contrast, sharpness, or blur, or perhaps create an additional frame between two frames to simulate a higher refresh rate.

Then, lastly, there is the massive factor of panel size and viewing distance. There is a reason why PC gamers like the "smaller", more expensive details in games... because we are also some of the first to pick up on and notice them, due to how close we sit to our displays.

I'm not disputing the fact that native content is superior, though; there is just more to this issue than people realise.

Those pics were taken on a monitor, not a TV :) There is no post-processing of the image.

I was using it as a general example, of course.
Also, some monitors will upscale and add post-processing to the image, but that depends on the panel and the market it's geared towards (i.e. multi-purpose displays).




www.youtube.com/@Pemalite

Pemalite said:
Peh said:

What do you mean by "depends on the panel"?

I don't think a panel can get rid of the blurry picture by upscaling.

Nevertheless, I just checked again without using AA. The 4K image is as crisp as it can possibly be. The upscaled 1080p image is still blurry, and obviously fewer details are being shown.

1080 upscaled no AA

http://www.pic-upload.de/view-31680389/20160912_124137.jpg.html

4k no AA

http://www.pic-upload.de/view-31680399/20160912_124225.jpg.html

By "depends on the panel" I mean that some TVs have such poor brightness and colour calibration that resolution is less of a factor in the overall presentation of an image. (In other words, resolution isn't everything.)

Secondly, some TVs have scalers of questionable quality that may add noise or artifacts to the displayed image; in more moderate scenarios you might get additional input lag.

Thirdly, some TVs will also do post-processing on the displayed image to bolster contrast, sharpness, or blur, or perhaps create an additional frame between two frames to simulate a higher refresh rate.

Then, lastly, there is the massive factor of panel size and viewing distance. There is a reason why PC gamers like the "smaller", more expensive details in games... because we are also some of the first to pick up on and notice them, due to how close we sit to our displays.

I'm not disputing the fact that native content is superior, though; there is just more to this issue than people realise.

Hey Perma, what would you suppose the gains will be for me in IQ on that 65" 4K Sony TV, going from PS4 to Pro (this checkerboarding plus the HDR)?



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:
Pemalite said:

By "depends on the panel" I mean that some TVs have such poor brightness and colour calibration that resolution is less of a factor in the overall presentation of an image. (In other words, resolution isn't everything.)

Secondly, some TVs have scalers of questionable quality that may add noise or artifacts to the displayed image; in more moderate scenarios you might get additional input lag.

Thirdly, some TVs will also do post-processing on the displayed image to bolster contrast, sharpness, or blur, or perhaps create an additional frame between two frames to simulate a higher refresh rate.

Then, lastly, there is the massive factor of panel size and viewing distance. There is a reason why PC gamers like the "smaller", more expensive details in games... because we are also some of the first to pick up on and notice them, due to how close we sit to our displays.

I'm not disputing the fact that native content is superior, though; there is just more to this issue than people realise.

Hey Perma, what would you suppose the gains will be for me in IQ on that 65" 4K Sony TV, going from PS4 to Pro (this checkerboarding plus the HDR)?

That's something I would also like to know. Since I don't have a 4K TV yet, I really couldn't appreciate the PlayStation Meeting. But Digital Foundry was impressed as hell. They said that checkerboarding was really close to 4K, especially for such an underpowered machine.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.


It's a pretty neat trick if you have less than 10 TF or so to render an AAA-budget game at 4K with a decent frame rate...

 

I personally think 4K is the "retina"-level resolution for TVs (depending on how far away you sit and what size your TV is), and this type of technique becomes much more tolerable than it was at 1080p or less. It's not a simple blow-up of the image: it uses motion data and data from previous frames (which rendered the other set of checkerboard tiles, as the renderer alternates between the two sets each frame) to guess what's in the areas not being rendered in the current frame. My take is this can be pretty convincing in action, just a little less clean than true 4K.
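A toy sketch of that alternating-tile idea, at the single-pixel level rather than the 2x2-quad level real implementations use, and with a plain copy from the previous frame standing in for motion-based reprojection. All names here are illustrative, not from any real renderer:

```python
# Toy checkerboard reconstruction: each frame natively shades only half
# the pixels (alternating checkerboard sets by frame parity), and the
# gaps are filled from the previous frame's reconstructed output.
def checkerboard_frame(render, prev, width, height, frame_index):
    """render(x, y) shades one pixel; prev is last frame's full image."""
    parity = frame_index % 2
    out = [row[:] for row in prev]      # start from last reconstruction
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == parity:   # this frame's half of the board
                out[y][x] = render(x, y)   # freshly shaded
            # else: keep the value carried over from the previous frame
    return out
```

Over any two consecutive frames, every pixel gets freshly shaded exactly once, which is why a static scene converges to the full-resolution image while each individual frame only pays for half the shading work.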



eva01beserk said:
DonFerrari said:

Hey Perma, what would you suppose the gains will be for me in IQ on that 65" 4K Sony TV, going from PS4 to Pro (this checkerboarding plus the HDR)?

That's something I would also like to know. Since I don't have a 4K TV yet, I really couldn't appreciate the PlayStation Meeting. But Digital Foundry was impressed as hell. They said that checkerboarding was really close to 4K, especially for such an underpowered machine.

The difference should be substantial and certainly superior to upscaled 1080p content; that can't really be disputed, as twice the pixels of 1080p are being rendered here, after all.

Even though the pixel count is roughly in line with 2560x1440/2560x1600, the result is slightly better than even that, as there is no funky upscaling algorithm being used.
It's certainly not going to be 4K quality, but it's still superior to 1080p and a little better than 1440p/1600p.

The real question, though, is: is it "good enough"? For most gamers that would be a yes. Now we just need the games to see what the hardware is truly capable of.
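The pixel budgets being compared work out as follows (checkerboarding shades half of the 4K grid per frame):

```python
# Rough per-frame shading budgets discussed in the thread.
native_4k  = 3840 * 2160        # 8,294,400 pixels
checker_4k = native_4k // 2     # 4,147,200 shaded per checkerboard frame
res_1080p  = 1920 * 1080        # 2,073,600
res_1440p  = 2560 * 1440        # 3,686,400
res_1600p  = 2560 * 1600        # 4,096,000

print(checker_4k / res_1080p)   # 2.0    -> twice the pixels of 1080p
print(checker_4k / res_1440p)   # 1.125  -> a bit more than 1440p
print(checker_4k / res_1600p)   # 1.0125 -> just above 1600p
```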





alabtrosMyster said:

It's a pretty neat trick if you have less than 10 TF or so to render an AAA-budget game at 4K with a decent frame rate...

 

I personally think 4K is the "retina"-level resolution for TVs (depending on how far away you sit and what size your TV is), and this type of technique becomes much more tolerable than it was at 1080p or less. It's not a simple blow-up of the image: it uses motion data and data from previous frames (which rendered the other set of checkerboard tiles, as the renderer alternates between the two sets each frame) to guess what's in the areas not being rendered in the current frame. My take is this can be pretty convincing in action, just a little less clean than true 4K.

Meanwhile in Japan, available since last year:
http://www.sharp.co.jp/business/8k-display/products/lv85001_feature.html
An 85" LED TV for a measly 130k.

It depends on size and distance. Going by NHK research, "retina" level starts at about 60 cpd (120 pixels per degree), and beyond 155 cpd (310 pixels per degree) people can't distinguish it from the real thing. https://www.nhk.or.jp/strl/results/annual2010/2010_chapter1.pdf

cpd = cycles per degree; you need 2 pixels per cycle (on/off).

They also tested for the sense of being there, which is greater for a wider FOV. For 8K, the sweet spot between realness and immersion is at about 65 degrees.


Obviously VR has the greatest immersion, at 100 degrees FOV. Yet at the 120 pixels per degree "retina" level, you would need a 12K display for VR.
For a 65" 8K TV, you can sit at 4 ft for that retina-level resolution with just over 60 degrees FOV.
For a 65" 4K TV, you can sit at 8 ft for retina-level resolution with 32 degrees FOV.
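Those distance figures can be checked with a small calculation of the average pixels per degree over the horizontal field of view of a flat 16:9 panel (an approximation, since angular density varies slightly across a flat screen):

```python
import math

# Average pixels per degree and horizontal FOV for a flat 16:9 panel
# of a given diagonal, horizontal resolution, and viewing distance.
def ppd_and_fov(diag_in, h_res, distance_in):
    width_in = diag_in * 16 / math.hypot(16, 9)     # panel width, inches
    fov = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_res / fov, fov                         # (px/degree, degrees)

ppd, fov = ppd_and_fov(65, 3840, 8 * 12)   # 65" 4K viewed at 8 ft
# roughly 117 px/degree over a ~33 degree FOV, near the 120 ppd
# "retina" threshold quoted above
```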

VR is the future for the high-res, high-immersion experience. The display might need to be 8K or 12K; however, with foveated rendering that won't be much more demanding than current 1080p 3D rendering. It's already happening: http://www.extremetech.com/computing/209740-samsung-is-reportedly-working-on-an-11k-screen-claims-it-can-create-3d-illusions and http://www.theverge.com/2016/7/22/12260430/nvidia-foveated-rendering-vr-graphics-smi-eye-tracking-siggraph
Cram that screen into VR glasses with eye tracking, and there's no more need for TVs.

Sorry, got sidetracked. Checkerboard 4K is a nice stopgap for 4K TVs.



Pemalite said:
eva01beserk said:

That's something I would also like to know. Since I don't have a 4K TV yet, I really couldn't appreciate the PlayStation Meeting. But Digital Foundry was impressed as hell. They said that checkerboarding was really close to 4K, especially for such an underpowered machine.

The difference should be substantial and certainly superior to upscaled 1080p content; that can't really be disputed, as twice the pixels of 1080p are being rendered here, after all.

Even though the pixel count is roughly in line with 2560x1440/2560x1600, the result is slightly better than even that, as there is no funky upscaling algorithm being used.
It's certainly not going to be 4K quality, but it's still superior to 1080p and a little better than 1440p/1600p.

The real question, though, is: is it "good enough"? For most gamers that would be a yes. Now we just need the games to see what the hardware is truly capable of.

But how much am I losing by only having HDR10 (on the 65X850C) instead of the Dolby format? Since Sony is adhering to HDR10, the PS4 Pro will probably use it, right?




DonFerrari said:

But how much am I losing by only having HDR10 (on the 65X850C) instead of the Dolby format? Since Sony is adhering to HDR10, the PS4 Pro will probably use it, right?

The PS4 will be backing HDR10 and will not support Dolby Vision HDR; this seems to be an industry-wide thing at the moment.

Dolby Vision is obviously superior, with a higher baseline in many aspects, for example 12-bit colour rather than HDR10's 10-bit, which means better colour gradients and less colour banding. Dolby Vision is also more dynamic, in that colour and brightness levels can be adjusted for every scene, whereas HDR10 takes a more compromised approach.

As for how much you are losing? Well, you don't really lose anything.
If you are happy with your picture now with non-HDR content, then you will also be happy with either HDR10 or Dolby Vision, as both offer a marked improvement.
If your TV only supports HDR10 (which, from a quick glance, it does), then I wouldn't rush out and buy a new set with Dolby Vision support anyway, especially considering that Dolby Vision may be the next HD DVD.

In a few years, expect to see the successor to HDR10, a.k.a. "HDR12", which should bring HDR10 in line with Dolby Vision.
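The gradient/banding point comes down to the number of code values per channel: a smooth ramp quantized to 10 bits can only land on 1,024 distinct shades, versus 4,096 at 12 bits, so each visible "band" is four times narrower. A quick illustration:

```python
# Quantize a dense, smooth 0..1 luminance ramp to integer code values
# at a given bit depth, then count the distinct shades that survive.
ramp = [i / 99_999 for i in range(100_000)]

def quantize(x, bits):
    steps = (1 << bits) - 1        # highest code value at this depth
    return round(x * steps)

shades_10 = len({quantize(x, 10) for x in ramp})   # 1024 (HDR10)
shades_12 = len({quantize(x, 12) for x in ramp})   # 4096 (Dolby Vision)
print(shades_10, shades_12)
```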



