
Forums - Nintendo Discussion - Digital Foundry: Mario Kart 8 Deluxe: Switch vs 3DS/Wii U

Peh said:
Pemalite said:

How sure are you about that? ;) Willing to make a bet? ;) There is evidence aplenty.

*facepalm* Of course a TV also needs Freesync to work. I did state that earlier in the thread.

And the point I am trying to make is that there is a hardware solution to screen tearing; I'm not sure I could be any more blatantly obvious than that.

Show me the evidence to the context I was talking about.

Keep in mind the context was "There is no screen tearing in Nintendo consoles."

But let's see you try disputing Digital Foundry, with Darksiders 2 on the Wii U.
http://www.eurogamer.net/articles/digitalfoundry-darksiders-2-on-wii-u-face-off


Peh said:

Yes, FreeSync is a hardware solution. The console as well as the TV will need it. I say TV because I go by the majority of customers who actually attach a console to a TV. FreeSync is a solution invented by AMD, in contrast to G-Sync from Nvidia. From which company, do you think, does the GPU in the Nintendo Switch come? Do I have to write it out for you?

Should I also add a *facepalm* like you did?

Again. I have elaborated that both the console and display need to support it to work. Not sure why you keep bringing that up.

And really? You think that having an Nvidia chip excludes you from having FreeSync? News flash: it doesn't.
FreeSync is an open standard. FreeSync is royalty-free. FreeSync is free to use.

VESA, the Video Electronics Standards Association, has adopted AMD's FreeSync, dubbed "Adaptive-Sync", and integrated it into the DisplayPort 1.2a, 1.3, 1.4, and newer standards.

Nintendo is free to use and license it. And so is Nvidia. Nintendo is paying Nvidia for its "semi-custom" SoC. A large part of the work is on the software side, as long as the display supports variable refresh rates.


Peh said:  

It has? How about using quotes for that debunked segment.

Fine.

Pemalite said:


TXAA will sample prior frames to try and improve the sampling...

https://developer.nvidia.com/postworks
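For what it's worth, the gist is simple enough to sketch. A minimal, purely illustrative Python version of the temporal accumulation idea: blend each new frame into a running history buffer, so edge pixels end up averaged across several frames' worth of samples. The names and the blend factor below are my own; real TXAA/TAA also jitters the camera, reprojects the history with motion vectors, and clamps it against the current frame to fight ghosting, all of which this toy skips.

import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    # Exponential moving average over frames. Small alpha means more
    # smoothing (and, in a real renderer, more ghosting on movement).
    if history is None:          # first frame: nothing to blend yet
        return current.copy()
    return (1.0 - alpha) * history + alpha * current

# Demo: successive noisy "renders" of the same scene converge to their mean.
frames = [np.random.rand(4, 4, 3) for _ in range(8)]
history = None
for frame in frames:
    history = temporal_accumulate(history, frame)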


Peh said:  

An effect that does not appear to be blurry? I am intrigued. Show me.

Turn 3D off and on.


Peh said:  

1. The main point is at what factor do you notice aliasing and at what strength.

Doesn't matter. Anti-aliasing is cheap. Use at least 2x.
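To put a number on that: the brute-force way to illustrate multi-sample AA is supersampling, i.e. render more samples per pixel and average them down. A rough numpy sketch of my own (note the caveat: rendering at double resolution on each axis is really 4 samples per pixel; hardware 2x MSAA gets away with 2 coverage samples per pixel, which is exactly why it's cheap):

import numpy as np

def downsample_2x(img):
    # img is a (2H, 2W, 3) float image rendered at double resolution.
    # Each output pixel is the box-filtered average of a 2x2 sample block.
    h, w = img.shape[0] // 2, img.shape[1] // 2
    blocks = img[:h * 2, :w * 2].reshape(h, 2, w, 2, 3)
    return blocks.mean(axis=(1, 3))

# Demo: pretend this 8x8 image was rendered at 2x the target resolution.
hi_res = np.random.rand(8, 8, 3)
aa_img = downsample_2x(hi_res)   # shape (4, 4, 3)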

Peh said:  

2. I didn't know that the Xbox and the N64 came out at the same time. What? They didn't? Colour me surprised. RCA maximum of 1080i? I know this connector as CINCH. And I have my fair share of doubt that it actually does 1080i. Care to show me the data sheet for that? Because I am unable to find it.

I never said they did come out at the same time.

Component and composite RCA. Learn the differences.





https://en.wikipedia.org/wiki/Component_video
https://en.wikipedia.org/wiki/Composite_video

Here are the original Xbox and PS2 Component RCA cables.
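The difference is easy to show with the actual signal math. Composite crams luma and chroma onto a single plug; component carries luma (Y) plus two colour-difference signals (Pb, Pr) on three separate plugs, which is what lets it scale to HD. Roughly, using the BT.601 coefficients (my own sketch of the conversion, not a signal-accurate model):

def rgb_to_ypbpr(r, g, b):
    # Gamma-corrected R'G'B' in [0, 1] to Y'PbPr. Y rides on one plug,
    # Pb and Pr on the other two. Composite instead modulates the chroma
    # onto a subcarrier and sums it with Y on a single wire.
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    pb = 0.564 * (b - y)                    # blue-difference chroma
    pr = 0.713 * (r - y)                    # red-difference chroma
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # pure white: Y = 1.0, Pb = Pr = 0.0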



--::{PC Gaming Master Race}::--

Pemalite said:



The VESA or the "Video Electronics Standards Association" standard has adopted AMD's Freesync dubbed "Adaptive Sync" and integrated it as part of the Displayport 1.2a, 1.3, 1.4 and newer standards.


Heck, it's part of the HDMI 2.1 spec.


  • Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.
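The mechanics are easy to simulate. With a fixed 60Hz scanout, a frame that misses the ~16.7ms window either tears or waits a whole extra refresh; with VRR, the display simply starts its refresh when the frame is ready. A toy Python timeline of my own making (it ignores that v-sync can also stall the start of the next frame):

import math

def present_times(frame_times, refresh_hz=60.0, vrr=False):
    # frame_times: seconds the GPU took to render each frame.
    # vrr=False models v-sync: presentation snaps to the next fixed tick.
    # vrr=True models adaptive sync: the display refreshes on demand.
    period = 1.0 / refresh_hz
    t, out = 0.0, []
    for ft in frame_times:
        t += ft  # the frame finishes rendering here
        out.append(t if vrr else math.ceil(t / period) * period)
    return out

# A 20ms frame at 60Hz: v-sync delays it to 33.3ms (a visible hitch),
# while VRR shows it at 20ms.
print(present_times([0.020]))            # [0.0333...]
print(present_times([0.020], vrr=True))  # [0.02]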


Pemalite said:
Peh said:

Show me the evidence to the context I was talking about.

1. Keep in mind the context was "There is no screen tearing in Nintendo consoles."

But let's see you try disputing Digital Foundry, with Darksiders 2 on the Wii U.
http://www.eurogamer.net/articles/digitalfoundry-darksiders-2-on-wii-u-face-off


Peh said:

Yes, FreeSync is a hardware solution. The console as well as the TV will need it. I say TV because I go by the majority of customers who actually attach a console to a TV. FreeSync is a solution invented by AMD, in contrast to G-Sync from Nvidia. From which company, do you think, does the GPU in the Nintendo Switch come? Do I have to write it out for you?

Should I also add a *facepalm* like you did?

2. Again. I have elaborated that both the console and display need to support it to work. Not sure why you keep bringing that up.

And really? You think that having an Nvidia chip excludes you from having FreeSync? News flash: it doesn't.
FreeSync is an open standard. FreeSync is royalty-free. FreeSync is free to use.

VESA, the Video Electronics Standards Association, has adopted AMD's FreeSync, dubbed "Adaptive-Sync", and integrated it into the DisplayPort 1.2a, 1.3, 1.4, and newer standards.

Nintendo is free to use and license it. And so is Nvidia. Nintendo is paying Nvidia for its "semi-custom" SoC. A large part of the work is on the software side, as long as the display supports variable refresh rates.


Peh said:  

It has? How about using quotes for that debunked segment.

3. Fine.

Pemalite said:


TXAA will sample prior frames to try and improve the sampling...

https://developer.nvidia.com/postworks


Peh said:  

An effect that does not appear to be blurry? I am intrigued. Show me.

4. Turn 3D off and on.


Peh said:  

1. The main point is at what factor do you notice aliasing and at what strength.

5. Doesn't matter. Anti-aliasing is cheap. Use at least 2x.

Peh said:  

2. I didn't know that the Xbox and the N64 came out at the same time. What? They didn't? Colour me surprised. RCA maximum of 1080i? I know this connector as CINCH. And I have my fair share of doubt that it actually does 1080i. Care to show me the data sheet for that? Because I am unable to find it.

6. I never said they did come out at the same time.

Component and composite RCA. Learn the differences.





https://en.wikipedia.org/wiki/Component_video
https://en.wikipedia.org/wiki/Composite_video

Here are the original Xbox and PS2 Component RCA cables.

1. I didn't know about Darksiders 2 on Wii U. Glad I didn't buy it. It's really the first game I've seen that actually has screen tearing on a Nintendo console. Are there others, or is this the only one?

2. Nvidia doing FreeSync is as likely as AMD doing PhysX rendering. Even if FreeSync is an open standard, do you really think that Nvidia will ditch the money and resources that went into G-Sync and go for FreeSync? That's why your point leads nowhere. I would still welcome it, but for now, that's just fiction.

I'm not 100% certain, but which console uses DisplayPort? Afaik, none does. I also doubt that a lot of TVs even have a DP interface built in. So why are you even bringing this up? Your argument just doesn't follow.

3. How does your answer have anything to do with my prior statement? Btw, TXAA has a higher impact on performance than FXAA because it also uses MSAA. Don't you mean TAA instead?

4. If you don't want to take this seriously, then why should I? That's a non sequitur. Go back to my original statement.

5. Besides the point. You obviously don't get what I am trying to say. Must be your hostility towards me.

6. Then why mention the Xbox at all? Do you actually remember what you are arguing against? Because I have a feeling that you either didn't understand what I was saying or you deliberately want to argue for the opposite of who knows what. Reread my original statement and see what I wrote there.

Great, composite video can even achieve 2560p. Didn't know about that. But it cannot do HDCP, so if the devices use this encryption then they won't work with this interface. And that is also beside the entire point I was going for. You wasted your time for actually nothing here. Why? What resolution is the N64 capable of, and what devices were present in the living room prior to its 1996 release (development time) and for the next 4-5 years, even before the Xbox launched? Again, do you remember what you are actually arguing against?

 



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Peh said:
                               
Pemalite said:

Not knowing why there is no AA is ultimately irrelevant. They are only excuses.
There is zero reason for games in 2017, regardless of platform, resolution, or hardware capability, to have zero anti-aliasing.

We already know the Switch has underpowered hardware, but even then it's still capable of performing rudimentary anti-aliasing.

With the exception of FXAA, every single use of AA does impact performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether.

FXAA is not the only cost-effective method of anti-aliasing these days. Plenty of games employ post-process or temporal AA techniques that allow for clean image quality while still maintaining stable framerates. 

Heck, let's even go back to Nintendo's own Wii U games and look at the improvement even simple AA can offer to image quality:

Both games are 720p, yet 3D World has significantly cleaner image quality, and still runs at a locked 60fps.
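For anyone curious what a cheap post-process pass actually does under the hood: FXAA-style filters measure local luma contrast and only blend pixels that sit on a detected edge, which is why flat areas stay sharp and the cost stays tiny. A heavily simplified numpy sketch of my own; the real FXAA performs directional edge searches rather than this naive contrast-gated blur:

import numpy as np

def simple_post_aa(img, threshold=0.1):
    # img: (H, W, 3) float image in [0, 1]. Pixels whose luma differs
    # strongly from their 4-neighbourhood average get blended toward it;
    # everything else is left untouched.
    luma = img @ np.array([0.299, 0.587, 0.114])
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
    lp = np.pad(luma, 1, mode='edge')
    nbr = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4
    lnbr = (lp[:-2, 1:-1] + lp[2:, 1:-1] + lp[1:-1, :-2] + lp[1:-1, 2:]) / 4
    edge = (np.abs(luma - lnbr) > threshold)[..., None]
    return np.where(edge, (img + nbr) / 2.0, img)

# Demo (a real input would be a rendered frame, not noise):
out = simple_post_aa(np.random.rand(16, 16, 3))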



Peh said:
Pemalite said:

Keep in mind the context was "There is no screen tearing in Nintendo consoles."

But let's see you try disputing Digital Foundry, with Darksiders 2 on the Wii U.
http://www.eurogamer.net/articles/digitalfoundry-darksiders-2-on-wii-u-face-off


Again. I have elaborated that both the console and display need to support it to work. Not sure why you keep bringing that up.

And really? You think that having an Nvidia chip excludes you from having FreeSync? News flash: it doesn't.
FreeSync is an open standard. FreeSync is royalty-free. FreeSync is free to use.

VESA, the Video Electronics Standards Association, has adopted AMD's FreeSync, dubbed "Adaptive-Sync", and integrated it into the DisplayPort 1.2a, 1.3, 1.4, and newer standards.

Nintendo is free to use and license it. And so is Nvidia. Nintendo is paying Nvidia for its "semi-custom" SoC. A large part of the work is on the software side, as long as the display supports variable refresh rates.


Fine.

https://developer.nvidia.com/postworks


Turn 3D off and on.


Doesn't matter. Anti-aliasing is cheap. Use at least 2x.

I never said they did come out at the same time.

Component and composite RCA. Learn the differences.





https://en.wikipedia.org/wiki/Component_video
https://en.wikipedia.org/wiki/Composite_video

Here are the original Xbox and PS2 Component RCA cables.

Screwed up the reply... need to rework.

15 hours later. Still no reworked reply.

That must have been one heck of a post...



curl-6 said:
Peh said:
                               

With the exception of FXAA, every single use of AA does impact performance and image quality. If the result is an unstable framerate and a blurry image, then it's better to avoid AA altogether.

FXAA is not the only cost-effective method of anti-aliasing these days. Plenty of games employ post-process or temporal AA techniques that allow for clean image quality while still maintaining stable framerates. 

Heck, let's even go back to Nintendo's own Wii U games and look at the improvement even simple AA can offer to image quality:

Both games are 720p, yet 3D World has significantly cleaner image quality, and still runs at a locked 60fps.

That's great. Now tell me why Nintendo didn't use AA for Mario Kart 8. Like Pemalite said, I don't wanna hear excuses.

Pemalite said:

Edited.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Hynad said:
Peh said:

Screwed up the reply... need to rework.

15 hours later. Still no reworked reply.

That must have been one heck of a post...

Is this post of yours really necessary? You know, I have a life besides VGChartz; I have to sleep, I have to work. I post when I have the time to do so.

Your post serves only to ridicule me. Stop that!



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Peh said:
curl-6 said:

FXAA is not the only cost-effective method of anti-aliasing these days. Plenty of games employ post-process or temporal AA techniques that allow for clean image quality while still maintaining stable framerates. 

Heck, let's even go back to Nintendo's own Wii U games and look at the improvement even simple AA can offer to image quality:

Both games are 720p, yet 3D World has significantly cleaner image quality, and still runs at a locked 60fps.

That's great. Now tell me why Nintendo didn't use AA for Mario Kart 8. Like Pemalite said, I don't wanna hear excuses.

I'm not privy to the reasoning behind EAD's decision, but that doesn't mean we should just automatically accept Nintendo not meeting a more-than-decade-old technical standard.



curl-6 said:
Peh said:

That's great. Now tell me why Nintendo didn't use AA for Mario Kart 8. Like Pemalite said, I don't wanna hear excuses.

I'm not privy to the reasoning behind EAD's decision, but that doesn't mean we should just automatically accept Nintendo not meeting a more-than-decade-old technical standard.

The issue here is that you pretend to know more than Nintendo does in this regard. It's their development team, and we both already know that Nintendo used AA in the past and obviously left it out of Mario Kart 8. If it were easy to implement, or if there were still resources left for them to use, then they would probably add it. But the fact is, they didn't. And you and I and even Pemalite don't know the real reasoning behind this decision. So we have to go with what they are offering us. Just accept it. Simple as that. If they decide to patch it in later and it doesn't look like shit, then great. I am not and never have argued against AA in general. I just don't like washed-out screens. And there are different methods of AA to achieve a visually good effect, but we still don't know why they opted out of it.



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Peh said:

1. I didn't know about Darksiders 2 on Wii U. Glad I didn't buy it. It's really the first game I've seen that actually has screen tearing on a Nintendo console. Are there others, or is this the only one?

 

There sure are more. But I have already proven you wrong on this point.
Feel free to do some of your own research on games with frame pacing/screen tearing/framerate drops etc. on the NES, SNES, N64, Gamecube, Wii, Wii U, Switch, Game Boy, Game Boy Colour, Game Boy Advance, Nintendo DS, and Nintendo 3DS, rather than moving the goalposts and wasting my time.

Peh said:

2. Nvidia doing FreeSync is as likely as AMD doing PhysX rendering. Even if FreeSync is an open standard, do you really think that Nvidia will ditch the money and resources that went into G-Sync and go for FreeSync? That's why your point leads nowhere. I would still welcome it, but for now, that's just fiction.

 

I don't think you get it. It's not up to Nvidia.

Peh said:

I'm not 100% certain, but which console uses DisplayPort? Afaik, none does. I also doubt that a lot of TVs even have a DP interface built in. So why are you even bringing this up? Your argument just doesn't follow.

 

Do you understand what a VESA standard is?
Regardless, a console doesn't need to have DisplayPort. HDMI supports variable refresh as well.

I have already proven you wrong on this point. So I think you are just arguing for arguing's sake.

Peh said:

3. How does your answer have anything to do with my prior statement? Btw, TXAA has a higher impact on performance than FXAA because it also uses MSAA. Don't you mean TAA instead?

No. I don't mean TAA instead.

Peh said:

5. Besides the point. You obviously don't get what I am trying to say. Must be your hostility towards me.

I treat everyone equally. I'm far from being hostile.


Peh said:

6. Then why mention the Xbox at all? Do you actually remember what you are arguing against? Because I have a feeling that you either didn't understand what I was saying or you deliberately want to argue for the opposite of who knows what. Reread my original statement and see what I wrote there.

It's called an e-x-a-m-p-l-e.

Peh said:

Great, composite video can even achieve 2560p. Didn't know about that. But it cannot do HDCP, so if the devices use this encryption then they won't work with this interface. And that is also beside the entire point I was going for. You wasted your time for actually nothing here. Why? What resolution is the N64 capable of, and what devices were present in the living room prior to its 1996 release (development time) and for the next 4-5 years, even before the Xbox launched? Again, do you remember what you are actually arguing against?

It wasn't beside the point. I just proved your point factually incorrect, namely that RCA can support high-definition resolutions. You asked the question, I answered the question, and now you are trying to undermine the evidence I provided by shifting the goalposts. Ain't happening.

If you go even further back, you stated that consoles are made to match the displays that are on the market. Again, that has been proven incorrect.

The bulk of Nintendo 64 games operated at 320x240. CRT TVs are often more than happy to resolve twice that resolution, minimum.

Heck. I only have to say the word "Wii" in the era of HD to prove your entire point incorrect.

So I think whatever argument you have on this point has been thoroughly and utterly debunked, and you should move on.



--::{PC Gaming Master Race}::--