Peh said:
                               
Pemalite said:

The Switch's console generation is far from over; there are a few thousand games yet to be released. Every console has games that will have screen tearing. (Except Scorpio, via Freesync.)
Just saying it could have been easily avoided with a hardware solution, so you don't need to resort to things like double/triple-buffered v-sync, which destroys any semblance of responsiveness and brings other minor caveats.

Eh? What are you talking about? AFAIK Nintendo uses v-sync; there is no screen tearing on Nintendo consoles. Input lag is a different issue. You know that the TV also needs Freesync in order for it to work? How many TVs currently have this feature? Btw, the Scorpio isn't even out yet. I really don't know what kind of point you are trying to make here.

How sure are you about that? ;) Willing to make a bet? ;) Evidence is a-plenty.

*facepalm* Of course a TV also needs Freesync to work. I did state that earlier in the thread.

And the point I am trying to make is that there is a hardware solution to screen tearing; I'm not sure how I could be any more blatantly obvious than that.
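To put rough numbers on the buffered v-sync point above, here's a minimal sketch (Python, with a made-up 60Hz panel and a hypothetical 20ms frame time; not measurements from any real console) of why double-buffered v-sync snaps presentation to the refresh interval, while an adaptive-sync display simply presents whenever the frame is ready:

```python
import math

# Rough frame-pacing sketch: double-buffered v-sync vs. adaptive sync.
# All numbers are hypothetical, not profiling data from any console.

REFRESH_MS = 1000 / 60   # assumed 60 Hz panel
RENDER_MS = 20.0         # assumed GPU frame time (hardware capable of ~50 fps)

def vsync_double_buffered(frames):
    """Each finished frame waits for the next vblank, and the GPU stalls
    until the buffer swap, so a 20 ms frame ends up costing two vblanks."""
    present_times, gpu_free = [], 0.0
    for _ in range(frames):
        done = gpu_free + RENDER_MS
        vblank = math.ceil(done / REFRESH_MS) * REFRESH_MS  # snap to refresh
        present_times.append(vblank)
        gpu_free = vblank  # next frame can't start until the swap happens
    return present_times

def adaptive_sync(frames):
    """Freesync-style: the panel refreshes whenever a frame is ready."""
    return [(i + 1) * RENDER_MS for i in range(frames)]

print("v-sync presents at:  ", [round(t, 1) for t in vsync_double_buffered(6)])
print("adaptive presents at:", [round(t, 1) for t in adaptive_sync(6)])
# v-sync: a frame every ~33.3 ms (~30 fps, extra latency);
# adaptive: a frame every 20 ms (~50 fps) with no tearing.
```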

Peh said:
Well, that's some well-reasoned argument. I just don't know how I can argue against that...

It's because it's already been thoroughly debunked.
Prior examples mean something, right?


 

Peh said:

Read what I wrote:

"No, it washes out the textures even more and blurs the edges out. A result is a blurry image. A good use of AA is just to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, yet at the most cost of performance."

I was simply talking about how the best image quality can be achieved, not what the most efficient way is. There are obviously efficient ways to make the image appear with less aliasing, but even an efficient approach with good image quality comes at a performance cost. You seem to ignore this fact.

I don't know why you keep comparing it to the 3DS, because the AA on the 3DS just produces a blurry image.

The point I am trying to make is that you can retain a similar degree of effect but be vastly more efficient in your approach, so that it's possible on more anemic hardware like the Switch.

As for the 3DS... It has Anti-Aliasing. Its "blurriness" isn't the fault of Anti-Aliasing; as you stated prior, in 3D mode Anti-Aliasing is turned off and the image remains blurry.
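And since SSAA keeps coming up, here's a minimal sketch of the "render high, average down" idea (NumPy, with a made-up 2x factor and a random image standing in for a rendered frame; not how any particular console implements it). The 4x pixel count is exactly where the performance cost you mention comes from, which is why cheaper post-process approaches exist at all:

```python
import numpy as np

# Minimal SSAA sketch: render at 2x the target resolution, then average
# each 2x2 block down to one output pixel. Illustrative only.

SCALE = 2
TARGET_W, TARGET_H = 1280, 720  # hypothetical output resolution
hi_res = np.random.rand(TARGET_H * SCALE, TARGET_W * SCALE, 3)  # stand-in frame

# Box-filter downsample: reshape into 2x2 blocks and average them.
downsampled = hi_res.reshape(TARGET_H, SCALE, TARGET_W, SCALE, 3).mean(axis=(1, 3))

print(hi_res.shape, "->", downsampled.shape)  # (1440, 2560, 3) -> (720, 1280, 3)
print("pixels shaded per frame:", hi_res.shape[0] * hi_res.shape[1],
      "vs", downsampled.shape[0] * downsampled.shape[1])  # 4x the shading work
```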

 

Peh said:

So, the answer is no.

The issue is not "how many monitors can you display whatever resolution on", because the monitors you are using are still limited to 1440p or 1080p. What also matters is the PPI and the distance to a single monitor. Even if you can display 5760x1080, each display is still 1920x1080, which keeps aliasing just as obvious. A higher resolution at the same display size makes aliasing less noticeable, so weaker AA filters are needed for good image quality. So there are many factors to take into account.

Actually the answer is not "no". Read near the end of the paragraph.

I am well aware of all these factors. But a 27" 1440p monitor is certainly going to have a higher PPI than a 60" 4K TV.
The TV would have a PPI of 73.43, whilst the monitor would be 108.79 PPI. Viewing distance also plays a part, as you so eloquently state.
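Those figures come straight out of the standard diagonal-based PPI formula, if anyone wants to check them (a quick sketch, assuming 16:9 panels at the stated sizes; nothing console-specific):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size (standard formula)."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 60), 2))  # 60" 4K TV       -> ~73.43
print(round(ppi(2560, 1440, 27), 2))  # 27" 1440p panel -> ~108.79
```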

Aliasing exists at all resolutions. We are working with pixels, remember.

Peh said:

You missed the point. When a console is being developed, the company looks at statistics on how many and what kinds of devices their customer base has, and at how the future could look over the next 4-5 years (simply speaking, because that task is a bit more complicated). If the majority of people are using simple CRTs, be it NTSC, PAL or SECAM, and the next generation of TVs is nowhere in sight, I will focus on developing for those devices. (Not taking stupid design choices into consideration.)

Even if the LG 32fs4d, which I'm hearing of for the first time (which also doesn't matter), was available back in the '90s: how many customers do you think would have had this device, and would it have been worth developing for? You can answer that question on your own.

Rubbish.

During the CRT era we had a console known as the "Xbox", and it had a few games in high definition. This was before high-definition displays were commonplace.

A console manufacturer will provide the best hardware they can for any given price point, form factor and gimmick.
It is up to developers what they wish to do with it.

The Xbox 360 and Playstation 3 were supposed to be "High Definition" consoles. Yet many games were actually sub-HD.
The Wii was a sub-HD console in the era of HD.

We are transitioning out of the HD era, yet the Xbox One, Playstation 4 and Switch aren't able to achieve full high definition in every instance.

So building these boxes to match the displays simply hasn't occurred historically, has it?

CRT TVs usually had a resolution of 640x480; some had 800x600, some 720x576, some 1280x1024, some 1280x720, some 1920x1080.

I mean, the PS2 had component (RCA) output with a maximum resolution of 1080i, and many TVs, especially rear-projection TVs at the time, could resolve that resolution. But no game rendered natively at that, did it? Kinda throws a spanner in the works of your hypothesis that consoles are built for the display technology available, especially when the Xbox of the same generation had a game or two at 1080i/1080p.

 








www.youtube.com/@Pemalite