
Digital Foundry: Mario Kart 8 Deluxe: Switch vs 3DS/Wii U

Peh said:

Like I already said, we don't know why there is no AA at 1080p. Maybe it looked bad, maybe it impacted the performance too much to hold 60 fps, or maybe, because it is a racing game, no one would really be bothered.

Not knowing why there is no AA is ultimately irrelevant. Those are only excuses.
There is zero reason for games in 2017, regardless of platform, resolution or hardware capability, to have zero anti-aliasing.

We already know the Switch has underpowered hardware, but even then it's still capable of performing rudimentary anti-aliasing.

Peh said:

Image quality depends on the screen you are using (its resolution and PPI) and how far away from the screen you are. A lower resolution on a higher-resolution screen will always look bad because of upscaling. AA will make it worse, imo.

Good Anti-Aliasing never reduces image quality. It always improves it.
Nintendo's underpowered hardware isn't an excuse for omitting Anti-Aliasing. Work around it, no need to be apologetic and defend Nintendo's horrible decisions.
Other Switch games have Anti-Aliasing. Wii U games have Anti-Aliasing. Xbox 360 games have Anti-Aliasing. Playstation 3 games have Anti-Aliasing.
Wii has Anti-Aliasing, 3DS has Anti-Aliasing. Playstation 2 even used Anti-Aliasing... Excuses, excuses. No need to make them.

Peh said:

For PC gaming I am using a 27" 4K monitor (~50 cm away from it) and a 55" 4K TV at a distance of 3 meters. You don't really need AA anymore, because the pixels are so small that aliasing is hardly noticeable. FXAA pretty much does the job. But if there is still room for performance, I go with 2x or 4x MSAA / TXAA 1x max.

And yet, at 4K, I still opt for Anti-Aliasing. Real Anti-Aliasing, that is, because it actually works on the game's geometry.
Regardless, the Switch isn't powerful enough for 4K anyway; Nintendo didn't include hardware optimal for that resolution. Which is fine.

Peh said:

I don't know what screen you are looking at, but the N64 was made with CRTs in mind. And it really did its job on a lower-resolution TV. It looks like crap on modern TVs, though.

The Nintendo 64 being made for "CRTs" is a fallacy. It was never made for any particular display technology. In fact, CRTs could exceed the Nintendo 64's display output capabilities with ease; I did touch upon Nintendo's use of RCA/S-Video and its limited resolution earlier.

We had 1080p CRT displays back in the mid 90s, you know.
CRTs also had refresh rates that exceeded most Nintendo 64 games.
And they also had contrast ratios and colour depth that put early LCD panels to utter shame... And in many cases still have superior input latency to many displays today... But I digress.





www.youtube.com/@Pemalite

curl-6 said:
Peh said:

Well, that's great then, because it's a standard for Nintendo consoles to not have screen tearing at all. Something Sony and Microsoft still fail to accomplish nowadays as a standard for video games.

I take no AA:

http://www.nvidia.de/docs/IO/132426/txaa-updated.png

over screen tearing:

http://media.gamersnexus.net/images/media/2015/gpu/screen-tearing-blacklist.jpg

every day.

I hate screen tearing too, with a passion in fact, and I'm glad Nintendo does not tolerate it, but that doesn't change the fact that by foregoing any kind of AA Nintendo is, in this regard, falling behind a standard set by consoles that came out in 2005/2006.

They probably have their reasons. And I certainly don't make a big deal out of it if there is no AA in some of their games; it just doesn't bother me that much. As I stated above, if the use of AA tends to make the image appear blurry, then don't use it. I want a sharp, fluid image on my TV.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Peh said:
curl-6 said:

I hate screen tearing too, with a passion in fact, and I'm glad Nintendo does not tolerate it, but that doesn't change the fact that by foregoing any kind of AA Nintendo is, in this regard, falling behind a standard set by consoles that came out in 2005/2006.

They probably have their reasons. And I certainly don't make a big deal out of it if there is no AA in some of their games; it just doesn't bother me that much. As I stated above, if the use of AA tends to make the image appear blurry, then don't use it. I want a sharp, fluid image on my TV.

Ironically... Screen tearing could have been completely eliminated on Switch if Nintendo was a little more forward-thinking and went with a display that had a variable refresh rate that matched the GPU.

Scorpio will have Freesync, so we just need displays to catch up on that front for that Console.

Otherwise, it's not overly difficult to dial back an effect and use those freed-up resources to bolster Anti-Aliasing. In fact, that is something I find is generally preferred.
I don't think people fully comprehend how much of an improvement Anti-Aliasing can make to a scene and how little overhead it actually has on modern hardware.




www.youtube.com/@Pemalite

A PC Gaming Master Race is stalking Nintendo hardware every day

He seems to be ill.

 



                               

Pemalite said:
Peh said:

Like I already said, we don't know why there is no AA at 1080p. Maybe it looked bad, maybe it impacted the performance too much to hold 60 fps, or maybe, because it is a racing game, no one would really be bothered.

Not knowing why there is no AA is ultimately irrelevant. They are only excuses.
There is zero reason for games in 2017, regardless of platform, resolution or hardware capability, to have zero anti-aliasing.

We already know the Switch has underpowered hardware, but even then it's still capable of performing rudimentary anti-aliasing.

With the exception of FXAA, every single use of AA impacts both performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether.

Pemalite said:
Peh said:

Image quality depends on the screen you are using (its resolution and PPI) and how far away from the screen you are. A lower resolution on a higher-resolution screen will always look bad because of upscaling. AA will make it worse, imo.

Good Anti-Aliasing never reduces image quality. It always improves it.
Nintendo's underpowered hardware isn't an excuse for omitting Anti-Aliasing. Work around it, no need to be apologetic and defend Nintendo's horrible decisions.
Other Switch games have Anti-Aliasing. Wii U games have Anti-Aliasing. Xbox 360 games have Anti-Aliasing. Playstation 3 games have Anti-Aliasing.
Wii has Anti-Aliasing, 3DS has Anti-Aliasing. Playstation 2 even used Anti-Aliasing... Excuses, excuses. No need to make them.

No, it washes out the textures even more and blurs the edges. The result is a blurry image. A good use of AA is simply to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, but at the greatest performance cost.
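As a minimal sketch of that SSAA idea (assuming NumPy and a hypothetical render_fn that returns a float RGB frame at whatever resolution it is asked for), ordered-grid supersampling really is just "render big, box-filter down":

import numpy as np

def render_fn(w, h):
    # Stand-in renderer: random noise in place of a real game frame.
    return np.random.rand(h, w, 3)

def ssaa(render_fn, width, height, factor=2):
    # Render at factor x the target resolution...
    hi = render_fn(width * factor, height * factor)   # shape: (height*factor, width*factor, 3)
    # ...then average each factor x factor block of samples into one output pixel.
    hi = hi.reshape(height, factor, width, factor, 3)
    return hi.mean(axis=(1, 3))

frame = ssaa(render_fn, 1280, 720, factor=2)          # shades 2560x1440 internally, outputs a 720p frame

Rendering at 2x per axis means shading 4x the pixels, which is exactly why this gives the best result at the highest performance cost.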

I don't know about Wii games using AA. The 3DS can only use AA by disabling the 3D; the performance used to render the 2nd screen is redirected to AA rendering instead. So, they just went with it. The Xbox 360 and PS3 use AA, but there are still a lot of jaggies. If that is your argument for using AA, then it is not a good one.

Just as a side note: even though the PS3 and Xbox 360 were pretty powerful for their time, most of their power went into polygon count and textures, and the rest resulted in bad image quality: screen tearing and poor use of AA -> blur and jaggies. I don't know about the PS2.

Pemalite said:

Peh said:

For PC gaming I am using a 27" 4K monitor (~50 cm away from it) and a 55" 4K TV at a distance of 3 meters. You don't really need AA anymore, because the pixels are so small that aliasing is hardly noticeable. FXAA pretty much does the job. But if there is still room for performance, I go with 2x or 4x MSAA / TXAA 1x max.

And yet, at 4K, I still opt for Anti-Aliasing. Real Anti-Aliasing, that is, because it actually works on the game's geometry.
Regardless, the Switch isn't powerful enough for 4K anyway; Nintendo didn't include hardware optimal for that resolution. Which is fine.

Do you own any 4K devices and can you play at that resolution? Just a question. I just want to know if you actually have any experience with native 4K gaming on PC, for example.

Pemalite said:
Peh said:

I don't know what screen you are looking at, but the N64 was made with CRTs in mind. And it really did its job on a lower-resolution TV. It looks like crap on modern TVs, though.

The Nintendo 64 being made for "CRTs" is a fallacy. It was never made for any particular display technology. In fact, CRTs could exceed the Nintendo 64's display output capabilities with ease; I did touch upon Nintendo's use of RCA/S-Video and its limited resolution earlier.

We had 1080p CRT displays back in the mid 90s, you know.
CRTs also had refresh rates that exceeded most Nintendo 64 games.
And they also had contrast ratios and colour depth that put early LCD panels to utter shame... And in many cases still have superior input latency to many displays today... But I digress.

When I am talking about CRT, I am mainly talking about TVs, not monitors. Besides PAL, NTSC and SECAM there wasn't much else for TVs at that time. A home console is attached to a TV most of the time, so I don't know why you had to bring up higher-resolution CRTs which the console was not developed for. My point still stands.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Pemalite said:
Peh said:

They probably have their reasons. And I certainly don't make a big deal out of it if there is no AA in some of their games; it just doesn't bother me that much. As I stated above, if the use of AA tends to make the image appear blurry, then don't use it. I want a sharp, fluid image on my TV.

Ironically... Screen tearing could have been completely eliminated on Switch if Nintendo was a little more forward-thinking and went with a display that had a variable refresh rate that matched the GPU.

Huh? There is screen tearing on Switch?

I have only heard of issues where the screen will tear once while playing BotW due to some unknown cause, which also happened to me about 4 times over 100+ hours of play. But so far, there is no screen tearing like there commonly is on the other consoles from Sony and Microsoft.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Peh said:
Pemalite said:

Ironically... Screen tearing could have been completely eliminated on Switch if Nintendo was a little more forward-thinking and went with a display that had a variable refresh rate that matched the GPU.

Huh? There is screen tearing on Switch?

I have only heard of issues where the screen will tear once while playing BotW due to some unknown cause, which also happened to me about 4 times over 100+ hours of play. But so far, there is no screen tearing like there commonly is on the other consoles from Sony and Microsoft.

The Switch's console generation is far from over; there are a few thousand games yet to be released, and every console has games that will have screen tearing. (Except Scorpio, via FreeSync.)
I'm just saying it could have been easily avoided with a hardware solution, so you don't need to resort to things like double/triple-buffered v-sync, which destroys any semblance of responsiveness, among other minor caveats.
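As a rough back-of-the-envelope model of that trade-off (a sketch only: it assumes a 60 Hz panel and ignores triple buffering and real VRR range limits), a finished frame under double-buffered v-sync waits for the next vblank, while a variable-refresh display scans it out the moment it is ready:

import math

REFRESH_MS = 1000.0 / 60.0                       # ~16.7 ms between vblanks on a 60 Hz panel

def effective_frame_ms(render_ms, vsync=True):
    # With v-sync, presentation snaps up to the next vblank boundary; with VRR it does not.
    if vsync:
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    return render_ms

print(effective_frame_ms(20.0, vsync=True))      # ~33.3 ms -> the game drops to 30 fps, no tearing
print(effective_frame_ms(20.0, vsync=False))     # 20.0 ms -> ~50 fps on a VRR display, still no tearing

The same 20 ms frame costs a full extra refresh under strict 60 Hz v-sync, which is the responsiveness penalty being described; VRR avoids it without reintroducing tearing.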

Oneeee-Chan!!! said:

A PC Gaming Master Race is stalking Nintendo hardware every day

He seems to be ill.

1) I was alerted to this thread.

2) Being a part of the PC Gaming Master Race does not exclude me from being interested in or owning other platforms.

3) Resorting to argumentum ad hominem is stupid. Please don't do it.


Peh said:  

With the exception of FXAA, every single use of AA impacts both performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether.

Wrong.


Peh said:
Pemalite said:

Good Anti-Aliasing never reduces image quality. It always improves it.
Nintendo's underpowered hardware isn't an excuse for omitting Anti-Aliasing. Work around it, no need to be apologetic and defend Nintendo's horrible decisions.
Other Switch games have Anti-Aliasing. Wii U games have Anti-Aliasing. Xbox 360 games have Anti-Aliasing. Playstation 3 games have Anti-Aliasing.
Wii has Anti-Aliasing, 3DS has Anti-Aliasing. Playstation 2 even used Anti-Aliasing... Excuses, excuses. No need to make them.

No, it washes out the textures even more and blurs the edges. The result is a blurry image. A good use of AA is simply to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, but at the greatest performance cost.

I don't know about Wii games using AA. The 3DS can only use AA by disabling the 3D; the performance used to render the 2nd screen is redirected to AA rendering instead. So, they just went with it. The Xbox 360 and PS3 use AA, but there are still a lot of jaggies. If that is your argument for using AA, then it is not a good one.

Just as a side note: even though the PS3 and Xbox 360 were pretty powerful for their time, most of their power went into polygon count and textures, and the rest resulted in bad image quality: screen tearing and poor use of AA -> blur and jaggies. I don't know about the PS2.

Wrong. SSAA is not the only form of "good" Anti-Aliasing, and it is most certainly not a form of Anti-Aliasing I expect out of fixed hardware of moderate capabilities. I have already touched upon this in my prior posts, but I shall do so again.

There are forms of Anti-Aliasing which detect the edges of geometry, which is where aliasing typically occurs; they then sample those edges and apply various patterns/filters to the affected area.

Taking that same approach of detecting the edges of geometry, some methods of Anti-Aliasing will render those edges at a significantly higher resolution and downscale them. It's a more efficient form of SSAA.

Thus the Anti-Aliasing isn't working on the entire image at once, which makes it more palatable for low-end hardware like the Switch.
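A minimal sketch of that first family (detect luma edges, then filter only those pixels), assuming NumPy; the threshold and the simple box blur are illustrative stand-ins rather than any engine's actual filter:

import numpy as np

LUMA = np.array([0.299, 0.587, 0.114])

def edge_blend_aa(img, threshold=0.1):
    # img: float RGB frame in [0, 1], shape (H, W, 3).
    y = img @ LUMA                                           # perceived brightness per pixel
    # Contrast against the right-hand and lower neighbours marks likely geometry edges.
    dx = np.abs(np.roll(y, -1, axis=1) - y)
    dy = np.abs(np.roll(y, -1, axis=0) - y)
    edges = (np.maximum(dx, dy) > threshold)[..., None]
    # Cheap 3x3 box blur stands in for a proper directional/pattern filter.
    blurred = sum(np.roll(np.roll(img, i, 0), j, 1)
                  for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
    # Blend only where edges were detected; flat areas and textures stay untouched.
    return np.where(edges, blurred, img)

Because only the detected edge pixels are touched, the cost scales with edge coverage rather than with the whole frame, which is the efficiency point here.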

Again, there are no excuses for the Switch if even the paltry hardware of the 3DS can perform Anti-Aliasing.

Peh said:


Do you own any 4K devices and can you play at that resolution? Just a question. I just want to know if you actually have any experience with native 4K gaming on PC, for example.

I have actually had a triple 1440p setup and a triple 1080p setup... which is 7680x1440 and 5760x1080 respectively (the former being more pixels than 4K).
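For reference, the raw pixel counts behind that comparison (quick arithmetic only):

setups = {
    "triple 1080p (5760x1080)": 5760 * 1080,   #  6,220,800 pixels
    "4K UHD (3840x2160)":       3840 * 2160,   #  8,294,400 pixels
    "triple 1440p (7680x1440)": 7680 * 1440,   # 11,059,200 pixels
}
for name, pixels in setups.items():
    print(f"{name}: {pixels:,}")

So the triple-1440p setup exceeds 4K's pixel count, while the triple-1080p one sits at exactly 75% of it.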

I currently use a single 2560x1440 display as my primary driver, so I am certainly not a High-Definition/Full High-Definition peasant.

I have used professional 4k monitors and projectors for work purposes at my last job.

Peh said:
When I am talking about CRT, I am mainly talking about TVs, not monitors. Besides PAL, NTSC and SECAM there wasn't much else for TVs at that time. A home console is attached to a TV most of the time, so I don't know why you had to bring up higher-resolution CRTs which the console was not developed for. My point still stands.

There are High-Definition CRT TVs, so your point is moot.
Some of them, like the LG 32fs4d, even had HDMI.




www.youtube.com/@Pemalite

                               

Pemalite said:
Peh said:

Huh? There is screen tearing on Switch?

I have only heard of issues where the screen will tear once while playing BotW due to some unknown cause, which also happened to me about 4 times over 100+ hours of play. But so far, there is no screen tearing like there commonly is on the other consoles from Sony and Microsoft.

The Switch's console generation is far from over; there are a few thousand games yet to be released, and every console has games that will have screen tearing. (Except Scorpio, via FreeSync.)
I'm just saying it could have been easily avoided with a hardware solution, so you don't need to resort to things like double/triple-buffered v-sync, which destroys any semblance of responsiveness, among other minor caveats.

Eh? What are you talking about? AFAIK Nintendo uses v-sync; there is no screen tearing on Nintendo consoles. Input lag is a different issue. You know that the TV also needs FreeSync in order for it to work? How many TVs currently have this feature? Btw, Scorpio is not even out yet. I really don't know what kind of point you are trying to make here.

Pemalite said:
Peh said:  

With the exception of FXAA, every single use of AA impacts both performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether.

Wrong.

Well, that's some well-reasoned argument. I just don't know how I can argue against that...

Pemalite said:

Peh said:

No, it washes out the textures even more and blurs the edges. The result is a blurry image. A good use of AA is simply to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, but at the greatest performance cost.

I don't know about Wii games using AA. The 3DS can only use AA by disabling the 3D; the performance used to render the 2nd screen is redirected to AA rendering instead. So, they just went with it. The Xbox 360 and PS3 use AA, but there are still a lot of jaggies. If that is your argument for using AA, then it is not a good one.

Just as a side note: even though the PS3 and Xbox 360 were pretty powerful for their time, most of their power went into polygon count and textures, and the rest resulted in bad image quality: screen tearing and poor use of AA -> blur and jaggies. I don't know about the PS2.

Wrong. SSAA is not the only form of "good" Anti-Aliasing, and it is most certainly not a form of Anti-Aliasing I expect out of fixed hardware of moderate capabilities. I have already touched upon this in my prior posts, but I shall do so again.

There are forms of Anti-Aliasing which detect the edges of geometry, which is where aliasing typically occurs; they then sample those edges and apply various patterns/filters to the affected area.

Taking that same approach of detecting the edges of geometry, some methods of Anti-Aliasing will render those edges at a significantly higher resolution and downscale them. It's a more efficient form of SSAA.

Thus the Anti-Aliasing isn't working on the entire image at once, which makes it more palatable for low-end hardware like the Switch.

Again, there are no excuses for the Switch if even the paltry hardware of the 3DS can perform Anti-Aliasing.

Read what I wrote:

"No, it washes out the textures even more and blurs the edges. The result is a blurry image. A good use of AA is simply to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, but at the greatest performance cost."

I was simply talking about how the best image quality can be achieved, not what the most efficient way is. There are obviously efficient ways to make the image appear with less aliasing, but efficient and good image quality will still come with a performance cost. You seem to ignore this fact.

I don't know why you keep comparing to the 3DS, because the AA on the 3DS is just a blurry image.

Pemalite said:
Peh said:


Do you own any 4K devices and can you play at that resolution? Just a question. I just want to know if you actually have any experience with native 4K gaming on PC, for example.

I have actually had a triple 1440p setup and a triple 1080p setup... which is 7680x1440 and 5760x1080 respectively (the former being more pixels than 4K).

I currently use a single 2560x1440 display as my primary driver, so I am certainly not a High-Definition/Full High-Definition peasant.

I have used professional 4k monitors and projectors for work purposes at my last job.

So, the answer is no.

The issue is not "how many monitors can display whatever resolution", because the monitors you are using are still limited to 1440p or 1080p. What also matters is the PPI of, and distance to, a single monitor. Even if you can display 5760x1080, each individual display still stays at 1920x1080, making aliasing more obvious. A higher resolution at the same display size makes aliasing less noticeable, so weaker AA filters need to be applied to get good image quality. So, there are many factors to take into account.

Pemalite said:
Peh said:
When I am talking about CRT, I am mainly talking about TVs, not monitors. Besides PAL, NTSC and SECAM there wasn't much else for TVs at that time. A home console is attached to a TV most of the time, so I don't know why you had to bring up higher-resolution CRTs which the console was not developed for. My point still stands.

There are High-Definition CRT TVs, so your point is moot.
Some of them, like the LG 32fs4d, even had HDMI.

You missed the point. When a console is being developed, the company looks at statistics on how many and what kind of devices their customer base has, and at what the future could look like over the next 4-5 years (simply speaking, because that task is a bit more complicated). If the majority of people are using simple CRTs, be it NTSC, PAL or SECAM, and the next generation of TVs is nowhere in sight, I will focus on developing for those devices. (Not taking stupid design choices into consideration.)

Even if the LG 32fs4d, which I'm hearing of for the first time (which also doesn't matter), was available back in the 90s: how many customers do you think would have had this device, and would it have been worth developing for? You can answer that question on your own.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Peh said:
                               
Pemalite said:

The Switch's console generation is far from over; there are a few thousand games yet to be released, and every console has games that will have screen tearing. (Except Scorpio, via FreeSync.)
I'm just saying it could have been easily avoided with a hardware solution, so you don't need to resort to things like double/triple-buffered v-sync, which destroys any semblance of responsiveness, among other minor caveats.

Eh? What are you talking about? AFAIK Nintendo uses v-sync; there is no screen tearing on Nintendo consoles. Input lag is a different issue. You know that the TV also needs FreeSync in order for it to work? How many TVs currently have this feature? Btw, Scorpio is not even out yet. I really don't know what kind of point you are trying to make here.

How sure are you about that? ;) Willing to make a bet? ;) Evidence is aplenty.

*facepalm* Of course a TV also needs FreeSync for it to work. I did state that earlier in the thread.

And the point I am trying to make is that there is a hardware solution to screen tearing; I'm not sure how I could be any more blatantly obvious than that.

Peh said:
Well, that's some well-reasoned argument. I just don't know how I can argue against that...

It's because it's already been thoroughly debunked.
Prior examples mean something, right?


 

Peh said:

Read what I wrote:

"No, it washes out the textures even more and blurs the edges. The result is a blurry image. A good use of AA is simply to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, but at the greatest performance cost."

I was simply talking about how the best image quality can be achieved, not what the most efficient way is. There are obviously efficient ways to make the image appear with less aliasing, but efficient and good image quality will still come with a performance cost. You seem to ignore this fact.

I don't know why you keep comparing to the 3DS, because the AA on the 3DS is just a blurry image.

The point I am trying to make is that you can retain a similar degree of effect but be vastly more efficient in your approach, so that it's possible on more anaemic hardware like the Switch.

As for the 3DS... It has Anti-Aliasing. Its "blurriness" isn't the fault of Anti-Aliasing... As you stated prior, in 3D mode Anti-Aliasing is turned off and the image remains blurry.

 

Peh said:

So, the answer is no.

The issue is not "how many monitors can display whatever resolution", because the monitors you are using are still limited to 1440p or 1080p. What also matters is the PPI of, and distance to, a single monitor. Even if you can display 5760x1080, each individual display still stays at 1920x1080, making aliasing more obvious. A higher resolution at the same display size makes aliasing less noticeable, so weaker AA filters need to be applied to get good image quality. So, there are many factors to take into account.

Actually the answer is not "no". Read near the end of the paragraph.

I am well aware of all these factors. But a 27" 1440p monitor is certainly going to have a higher PPI than a 60" 4K TV:
the TV would have a PPI of 73.43, whilst the monitor would be 108.79 PPI. Viewing distance also plays a factor, as you so eloquently state.
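Those figures check out; here is the arithmetic as a small sketch (the pixels_per_degree helper is an illustrative assumption for folding viewing distance in):

import math

def ppi(h_res, v_res, diagonal_inches):
    # Pixels along the diagonal divided by the diagonal length in inches.
    return math.hypot(h_res, v_res) / diagonal_inches

def pixels_per_degree(ppi_value, distance_inches):
    # How many pixels a one-degree slice of your field of view spans at that distance.
    return ppi_value * 2 * distance_inches * math.tan(math.radians(0.5))

print(round(ppi(2560, 1440, 27), 2))   # 108.79 -> the 27" 1440p monitor
print(round(ppi(3840, 2160, 60), 2))   # 73.43  -> the 60" 4K TV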

Aliasing exists at all resolutions. We are working with pixels, remember.

Peh said:

You missed the point. When a console is being developed, the company looks at statistics on how many and what kind of devices their customer base has, and at what the future could look like over the next 4-5 years (simply speaking, because that task is a bit more complicated). If the majority of people are using simple CRTs, be it NTSC, PAL or SECAM, and the next generation of TVs is nowhere in sight, I will focus on developing for those devices. (Not taking stupid design choices into consideration.)

Even if the LG 32fs4d, which I'm hearing of for the first time (which also doesn't matter), was available back in the 90s: how many customers do you think would have had this device, and would it have been worth developing for? You can answer that question on your own.

Rubbish.

During the CRT era we had a console known as the "Xbox", and there were a few games in high definition. This was before high-definition displays were commonplace.

A console manufacturer will provide the best hardware they can for any given price point, form factor and gimmick.
It is up to developers what they wish to do with it.

The Xbox 360 and PlayStation 3 were supposed to be "High Definition" consoles, yet many games were actually sub-HD.
The Wii was a sub-HD console in the era of HD.

We are transitioning from the HD era, yet the Xbox One, PlayStation 4 and Switch aren't able to achieve Full HD in every instance.

So building these boxes to match the displays simply hasn't occurred historically, has it?

CRT TVs usually had a resolution of 640x480; some had 800x600, some had 720x576, some had 1280x1024, some had 1280x720, some had 1920x1080.

I mean, the PS2 had component output over RCA connectors, which supports a maximum resolution of 1080i, and many TVs, especially rear-projection TVs at the time, could resolve that resolution. But no game rendered natively at that, did they? Kinda throws a spanner in the works of your hypothesis that consoles are built for the display technology available. Especially when the Xbox of the same generation had a game or two at 1080i/1080p.

 








www.youtube.com/@Pemalite

Pemalite said:
Peh said:
                               

Eh? What are you talking about? AFAIK Nintendo uses v-sync; there is no screen tearing on Nintendo consoles. Input lag is a different issue. You know that the TV also needs FreeSync in order for it to work? How many TVs currently have this feature? Btw, Scorpio is not even out yet. I really don't know what kind of point you are trying to make here.

How sure are you about that? ;) Willing to make a bet? ;) Evidence is aplenty.

*facepalm* Of course a TV also needs FreeSync for it to work. I did state that earlier in the thread.

And the point I am trying to make is that there is a hardware solution to screen tearing; I'm not sure how I could be any more blatantly obvious than that.

Show me the evidence for the context I was talking about.

Dude, your FreeSync argument makes no sense at all.

Yes, FreeSync is a hardware solution; the console as well as the TV will need it. I say TV because I go by the majority of customers who actually attach a console to a TV. FreeSync is a solution invented by AMD, in contrast to G-Sync from Nvidia. Which company do you think the GPU in the Nintendo Switch comes from? Do I have to spell it out for you?

Should I also add a *facepalm* like you did?

 

Pemalite said:
Peh said:
Well, that's some well-reasoned argument. I just don't know how I can argue against that...

It's because it's already been thoroughly debunked.
Prior examples mean something, right?

It has? How about quoting that debunked segment?

Pemalite said:
Peh said:

Read what I wrote:

"No, it washes out the textures even more and blurs the edges. The result is a blurry image. A good use of AA is simply to render the image at a higher resolution and downscale it (SSAA). This results in the best image quality, but at the greatest performance cost."

I was simply talking about how the best image quality can be achieved, not what the most efficient way is. There are obviously efficient ways to make the image appear with less aliasing, but efficient and good image quality will still come with a performance cost. You seem to ignore this fact.

I don't know why you keep comparing to the 3DS, because the AA on the 3DS is just a blurry image.

The point I am trying to make is that you can retain a similar degree of effect but be vastly more efficient in your approach, so that it's possible on more anaemic hardware like the Switch.

As for the 3DS... It has Anti-Aliasing. Its "blurriness" isn't the fault of Anti-Aliasing... As you stated prior, in 3D mode Anti-Aliasing is turned off and the image remains blurry.

An effect that does not appear to be blurry? I am intrigued. Show me.

Pemalite said:
Peh said:

So, the answer is no.

The issue is not "how many monitors can display whatever resolution", because the monitors you are using are still limited to 1440p or 1080p. What also matters is the PPI of, and distance to, a single monitor. Even if you can display 5760x1080, each individual display still stays at 1920x1080, making aliasing more obvious. A higher resolution at the same display size makes aliasing less noticeable, so weaker AA filters need to be applied to get good image quality. So, there are many factors to take into account.

 1. Actually the answer is not "no". Read near the end of the paragraph.

I am well aware of all these factors. But a 27" 1440p monitor is certainly going to have a higher PPI than a 60" 4K TV:
the TV would have a PPI of 73.43, whilst the monitor would be 108.79 PPI. Viewing distance also plays a factor, as you so eloquently state.

Aliasing exists at all resolutions. We are working with pixels, remember.

Peh said:

You missed the point. When a console is being developed, the company looks at statistics on how many and what kind of devices their customer base has, and at what the future could look like over the next 4-5 years (simply speaking, because that task is a bit more complicated). If the majority of people are using simple CRTs, be it NTSC, PAL or SECAM, and the next generation of TVs is nowhere in sight, I will focus on developing for those devices. (Not taking stupid design choices into consideration.)

Even if the LG 32fs4d, which I'm hearing of for the first time (which also doesn't matter), was available back in the 90s: how many customers do you think would have had this device, and would it have been worth developing for? You can answer that question on your own.

2. Rubbish.

During the CRT era we had a console known as the "Xbox", and there were a few games in high definition. This was before high-definition displays were commonplace.

A console manufacturer will provide the best hardware they can for any given price point, form factor and gimmick.
It is up to developers what they wish to do with it.

The Xbox 360 and PlayStation 3 were supposed to be "High Definition" consoles, yet many games were actually sub-HD.
The Wii was a sub-HD console in the era of HD.

We are transitioning from the HD era, yet the Xbox One, PlayStation 4 and Switch aren't able to achieve Full HD in every instance.

So building these boxes to match the displays simply hasn't occurred historically, has it?

CRT TVs usually had a resolution of 640x480; some had 800x600, some had 720x576, some had 1280x1024, some had 1280x720, some had 1920x1080.

I mean, the PS2 had component output over RCA connectors, which supports a maximum resolution of 1080i, and many TVs, especially rear-projection TVs at the time, could resolve that resolution. But no game rendered natively at that, did they? Kinda throws a spanner in the works of your hypothesis that consoles are built for the display technology available. Especially when the Xbox of the same generation had a game or two at 1080i/1080p.

1. The main point is at what point you start to notice aliasing, and how strongly.

2. I didn't know that the Xbox and the N64 came out at the same time. What? They didn't? Colour me surprised. RCA with a maximum of 1080i? I know this connector as CINCH. And I have my fair share of doubt that it actually does 1080i. Care to show me the data sheet for that? Because I am unable to find it.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3