potato_hamster said:
Where on earth do you get off saying "it's that easy"?
1. Does the user use a controller differently when haptic buttons are raised? You might have to rewrite half of your controls code if it isn't based on that alone.
2. Does the haptic screen register touches the same way?
3. How do you create the shape and height of the button?
4. Are the tolerances for a haptic button different?
5. Are gestures recognized the same way?
6. Are the same gestures possible on a haptic display?
7. It's not like they say "put a button at coordinates x, y on the controller" and the system goes: here, magic button! Oh look, it's pressed in exactly the way you expect it to be. You have to code that.
8. There is a 0% chance that any devkits being used to make games six months from launch aren't fully featured, unless Nintendo wants the release games to be total shit.
1-Uh, no?? It's used in exactly the same way: you can already tell whether a button is ergonomically well placed even in a mobile game, so this is exactly the same;
2-See above;
3-Devs never had any choice over the height and dimensions of buttons on regular controllers in the first place, so why would they suddenly want to choose that?
4-Nintendo can easily include options to adjust the haptic feel and let the player choose how to arrange buttons around the screen;
5-Yes, otherwise they wouldn't be compatible with regular devices;
6-Yes again;
7-Nintendo can give devs a software tool to place buttons; that's one of the components of a devkit...
8-This whole argument started because I pointed out that Nintendo might not want to hand out finalized devkits before the NX release: without the actual screen in hand, devs can't leak info/details to competitors that, if the NX is successful, would want to copy it...
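To make points 1, 3, 4 and 7 concrete: placing a "magic button" at coordinates x, y really is just hit-testing a touch against a region with some tolerance, the same as any on-screen button in a mobile game. The sketch below is purely hypothetical (neither the `HapticButton` class nor any of its fields are a real Nintendo API); it only illustrates that tolerance is one extra parameter, not a rewrite of your controls code.

```python
from dataclasses import dataclass

# Hypothetical sketch of how a devkit tool might let devs define a
# "raised" haptic button by coordinates. All names here are assumptions.

@dataclass
class HapticButton:
    x: float          # centre of the button on the screen (pixels)
    y: float
    radius: float     # visible/raised area of the button
    tolerance: float  # extra slack around the edge for imprecise presses

    def hit(self, touch_x: float, touch_y: float) -> bool:
        """A touch registers as a press if it lands within radius + tolerance."""
        dx, dy = touch_x - self.x, touch_y - self.y
        return dx * dx + dy * dy <= (self.radius + self.tolerance) ** 2

# Place an "A" button and test two touches against it.
a_button = HapticButton(x=600, y=300, radius=30, tolerance=10)
print(a_button.hit(605, 310))  # near the centre -> True
print(a_button.hit(700, 300))  # well outside    -> False
```

If the haptic screen registers touches the same way a normal touchscreen does (point 2), the only new knob is `tolerance`, which is exactly the kind of thing Nintendo could expose as a per-button or system-wide setting (point 4).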