Peh said:
Huh? There is screen tearing on the Switch? I have only heard of the screen tearing once in a while during BotW due to some unknown cause, which also happened to me about 4 times over 100+ hours of play. But so far there is no screen tearing in the way it is common on the other consoles from Sony and Microsoft. |
The Switch's console generation is far from over; there are still a few thousand games yet to be released. Every console has games that will exhibit screen tearing. (Except the Scorpio, via FreeSync.)
Just saying that it could have easily been avoided with a hardware solution, so you don't need to resort to things like double/triple-buffered v-sync, which destroys any semblance of responsiveness, among other minor caveats.
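To put rough numbers on that responsiveness cost, here is a back-of-the-envelope sketch in Python. The model is my own simplification (an assumption, not a measurement): it just assumes each extra buffer queued behind the v-sync flip can hold a frame back for up to one full refresh interval at 60 Hz.

```python
# Back-of-the-envelope only: assumed worst-case delay added by swap-queue
# depth under v-sync at a fixed 60 Hz refresh. Real pipelines vary a lot.
REFRESH_HZ = 60
FRAME_TIME_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh interval

def worst_case_added_latency_ms(queued_buffers: int) -> float:
    """Extra delay if each queued buffer can wait a full refresh interval."""
    return queued_buffers * FRAME_TIME_MS

print(f"Refresh interval:            {FRAME_TIME_MS:.1f} ms")
print(f"Double buffering (1 queued): {worst_case_added_latency_ms(1):.1f} ms added")
print(f"Triple buffering (2 queued): {worst_case_added_latency_ms(2):.1f} ms added")
```

Even under this crude model, the queue alone can add one or two whole refresh intervals on top of render time, which is exactly the sluggishness people associate with buffered v-sync.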
| Oneeee-Chan!!! said: A PC Gaming Master Race is stalking Nintendo hardware every day. He seems to be ill. |
1) I was alerted to this thread.
2) Being a part of the PC Gaming Master Race does not exclude me from being interested or owning other platforms.
3) Resorting to argumentum ad hominem is stupid. Please don't do it.
| Peh said:
With the exception of FXAA, every single use of AA does impact performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether. |
Wrong.
Peh said:
No, it washes out the textures even more and blurs the edges; the result is a blurry image. A good use of AA is just to render the image at a higher resolution and downscale it (SSAA). This gives the best image quality, yet at the greatest performance cost. I don't know about Wii games using AA. The 3DS can only use AA by disabling the 3D; the performance that would go into rendering the second image for 3D is put toward AA rendering instead, so they just went with it. The Xbox 360 and PS3 use AA, but there are still a lot of jaggies. If that is your argument for using AA, then it is not a good one. Just as a side note: even though the PS3 and Xbox 360 were pretty powerful for their time, most of their power went to polygon counts and textures, and the rest to bad image quality: screen tearing and bad use of AA -> blurriness and jaggies. I don't know about the PS2. |
Wrong. SSAA is not the only form of "good" Anti-Aliasing, and it is most certainly not the form of Anti-Aliasing I expect out of fixed hardware of moderate capabilities. I have already touched upon this in my prior posts, but I shall do so again.
There are forms of Anti-Aliasing which detect the edges of geometry, which is where aliasing typically occurs; they then sample those edges and apply various patterns/filters to the affected area.
Taking that same edge-detection approach, some methods of Anti-Aliasing will render only the edges of geometry at a significantly higher resolution and downscale them; it's a more efficient form of SSAA.
Thus the Anti-Aliasing isn't working on the entire image at once, which makes it far more palatable for low-end hardware like the Switch.
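For illustration only, here is a minimal Python/NumPy sketch of that edge-targeted, post-process idea. It is not any console's or engine's actual implementation; the luminance threshold and the simple box blur are my own assumptions, just to show that only detected edge pixels get touched while the rest of the frame is left alone.

```python
# Minimal sketch of post-process, edge-targeted anti-aliasing: find
# high-contrast (likely geometry) edges via a luminance difference test,
# then blend only those pixels with their neighbours instead of
# supersampling the whole frame.
import numpy as np

def edge_aa(image: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """image: float32 HxWx3 array in [0, 1]. Returns a softened copy."""
    # Approximate luminance (Rec. 709 weights).
    luma = image @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)

    # Contrast against the 4-neighbourhood; large deltas mark likely edges.
    delta = np.zeros_like(luma)
    delta[1:, :]  = np.maximum(delta[1:, :],  np.abs(luma[1:, :]  - luma[:-1, :]))
    delta[:-1, :] = np.maximum(delta[:-1, :], np.abs(luma[:-1, :] - luma[1:, :]))
    delta[:, 1:]  = np.maximum(delta[:, 1:],  np.abs(luma[:, 1:]  - luma[:, :-1]))
    delta[:, :-1] = np.maximum(delta[:, :-1], np.abs(luma[:, :-1] - luma[:, 1:]))
    edge_mask = delta > threshold

    # Cheap 3x3 box blur, applied only where the edge mask is set, so the
    # interior of textures is left untouched (unlike a full-screen blur).
    padded = np.pad(image, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = np.zeros_like(image)
    for dy in range(3):
        for dx in range(3):
            blurred += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    blurred /= 9.0

    out = image.copy()
    out[edge_mask] = blurred[edge_mask]
    return out
```

Production techniques (MLAA/SMAA-style filters, or edge-only supersampling) are far smarter about the blend patterns, but the cost structure is the same: the work scales with the number of edge pixels, not with the full frame.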
Again: there is no excuse for the Switch if even the paltry hardware of the 3DS can perform Anti-Aliasing.
| Peh said: Do you own 4k devices, and can you play at that resolution? Just a question. I just want to know if you actually have any experience with native 4k gaming on PC, for example. |
I have actually had a triple 1440P setup and a triple 1080P setup, which are 7680x1440 (more pixels than 4k) and 5760x1080 respectively; see the quick math below.
I currently use a single 2560x1440 display as my primary driver, so I am certainly not a High-Definition/Full High-Definition peasant.
I have used professional 4k monitors and projectors for work purposes at my last job.
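For reference, the quick pixel-count math behind the setups mentioned above, taking "4k" to mean UHD (3840x2160):

```python
# Pixel counts for the display setups mentioned above ("4k" taken as UHD).
setups = {
    "Triple 1080p (5760x1080)": 5760 * 1080,   #  6,220,800 pixels
    "UHD 4k (3840x2160)":       3840 * 2160,   #  8,294,400 pixels
    "Triple 1440p (7680x1440)": 7680 * 1440,   # 11,059,200 pixels
}
for name, pixels in setups.items():
    print(f"{name}: {pixels:>10,} pixels")
```

So the triple 1440P wall pushes roughly a third more pixels than a single UHD panel, while triple 1080P sits a bit below it.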
| Peh said: When I am talking about CRTs, I am mainly talking about TVs, not monitors. Besides PAL, NTSC and SECAM, there wasn't much else during that time for TVs. A home console is attached to a TV most of the time, so I don't know why you had to bring up higher-resolution CRTs which the console was not developed for. My point still stands. |
There are High-Definition CRT TVs, so your point is moot.
Some of them, like the LG 32fs4d, even had HDMI.

www.youtube.com/@Pemalite