
Pemalite said:
Peh said:

Like I already said, we don't know why there is no AA at 1080p. Maybe it looked bad, maybe it impacted performance too much to hold 60 fps, or maybe, because it is a racing game, no one would really be bothered.

Not knowing why there is no AA is ultimately irrelevant. They are only excuses.
There is zero reason for games in 2017, regardless of platform, resolution or hardware capability, to have zero anti-aliasing.

We already know the Switch has underpowered hardware, but even then it's still capable of performing rudimentary anti-aliasing.

With the exception of FXAA, every single use of AA impacts both performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether.

Pemalite said:
Peh said:

Image quality depends on the screen you are using (its resolution and ppi) and on your distance from the screen. A smaller resolution on a higher resolution screen will always look bad because of the upscaling. AA will make it worse imo.

Good Anti-Aliasing never reduces image quality. It always improves it.
Nintendo's underpowered hardware isn't an excuse for omitting Anti-Aliasing. Work around it, no need to be apologetic and defend Nintendo's horrible decisions.
Other Switch games have Anti-Aliasing. Wii U games have Anti-Aliasing. Xbox 360 games have Anti-Aliasing. Playstation 3 games have Anti-Aliasing.
Wii has Anti-Aliasing, 3DS has Anti-Aliasing. Playstation 2 even used Anti-Aliasing... Excuses, excuses. No need to make them.

No, it washes out the textures even more and blurs out the edges. The result is a blurry image. A good use of AA is to render the image at a higher resolution and downscale it (SSAA). This gives the best image quality, but at the highest performance cost.
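Just to make that concrete, here is a rough sketch of the SSAA idea, assuming Python with numpy and a random array standing in for the supersampled frame (purely illustrative, not how any engine actually implements it):

import numpy as np

def downscale_2x(frame):
    # Average every 2x2 block of the supersampled frame, so each output pixel
    # combines four rendered samples; this averaging is what smooths the edges.
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# "Render" at 3840x2160 (a 2x supersampled stand-in for a 1080p target)...
frame = np.random.rand(2160, 3840, 3)
# ...then downscale to 1920x1080: four samples per output pixel, which is also
# why SSAA costs roughly 4x the shading and fill-rate work.
output = downscale_2x(frame)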

I don't know about Wii games using AA. The 3DS can only use AA by disabling the 3D: the performance that would go into rendering the second view for the 3D effect is put into AA instead, so they just went with it. The Xbox 360 and PS3 use AA, but the result is still a lot of jaggies. If that is your argument for using AA, then it is not a good one.

Just as a side note: even though the PS3 and Xbox 360 were pretty powerful for their time, most of their power went into polygon count and textures, and what was left went to poor image quality: screen tearing and bad use of AA -> blur and jaggies. I don't know about the PS2.

Pemalite said:

Peh said:

For PC gaming I am using a 27" 4k monitor (~50 cm away from it) and a 55" 4k TV at a distance of 3 meters. You don't really need AA anymore, because the pixels are so small that aliasing is hardly noticeable. FXAA pretty much gets the job done. But if there is still room for performance I go with 2x or 4x MSAA / TXAA 1x max.

And yet, at 4k, I still opt for Anti-Aliasing. Real Anti-Aliasing, that is, because it actually works on the game's geometry.
Regardless, the Switch isn't powerful enough for 4k anyway; Nintendo didn't include optimal hardware for that resolution. Which is fine.

Do you own 4k devices and can you play at that resolution? Just a question. I just want to know if you actually have any experience with native 4k gaming, on PC for example.

Pemalite said:
Peh said:

I don't know what screen you are looking at, but the N64 was made with CRTs in mind. And it really did its job on a lower resolution TV. It looks like crap on modern TVs, though.

The Nintendo 64 being made for "CRTs" is a fallacy. It was never made for any particular display technology. In fact, CRTs could exceed the Nintendo 64's display output capabilities with ease; I did touch upon Nintendo's use of RCA/S-Video and limited resolution prior.

We had 1080p CRT displays back in the mid '90s, you know.
CRTs also had refresh rates that exceeded most Nintendo 64 games.
And they also had contrast ratios and colour depth that put early LCD panels to utter shame... And in many cases they still have superior input latency to many displays today... But I digress.

When I am talking about CRTs, I am mainly talking about TVs, not monitors. Besides PAL, NTSC and SECAM there wasn't much else for TVs at the time. A home console is attached to a TV most of the time, so I don't know why you had to bring up higher resolution CRTs which the console was not developed for. My point still stands.



Intel Core i7 8700K | 32 GB DDR4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE | Crappy Monitor | HTC Vive Pro :3