
Forums - Gaming Discussion - Switch vs WiiU vs Xbox One vs PS4 (Last Update: January 12, 2017)

superchunk said:
curl-6 said:

"All major releases" seems a tad over-optimistic. Even Gamecube, which was stronger than PS2, missed out on a lot of PS2/Xbox multiplats. Wii U missed out on plenty on PS3/360 multiplats despite being fully capable of running them. N64 was stronger than PS1/Saturn yet still missed out on plenty of games.

So did N64. That is where the political angle comes in, when 3rd parties refused to work with Nintendo, not out of technical limitations.

In the end, the WiiU was not capable of running them, or rather the barrier to entry was too great. It would have required Nintendo or the devs to upgrade tooling to support the WiiU, as the middleware teams (except Unity) refused to do so.

NS is supported day one by all of these middleware devs.

Either way, I do recognize the real possibility that 3rd parties ignore Nintendo, but it would not be due to technical limitations, based on the support given by middleware and the tech NS is now assumed to have.

Ah, I misunderstood you, I thought you meant that you expected all major third party releases in 2017 would come to Switch.

Although technically speaking, I think it would depend on the game; for example, it would be a lot easier to port a game like Skyrim Remastered to Switch than a game like Witcher 3.



superchunk said:

RE active cooling

Here is a very detailed and direct referenced break-down of the Nintendo patent filed this past week.

https://www.reddit.com/r/NintendoSwitch/comments/5ii58o/nintendo_switch_patent_megathread/

I don't see why you ignore this as a source for active cooling outside of purposeful bias.

RE NS vs Shield TV

Why are you asking if the NS GPU is 1GHz? Shield TV is only 512 GFLOPS (FP32).

https://en.wikipedia.org/wiki/Tegra#Tegra_X1
http://www.neogaf.com/forum/showthread.php?t=1297191
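For a sanity check, the 512 GFLOPS FP32 figure falls out of simple arithmetic: the X1 has 256 CUDA cores, each able to issue one fused multiply-add (2 FLOPs) per clock. A minimal sketch, using the clocks discussed in this thread (the Pixel C clock comes up later):

```python
# Peak-FLOPS sanity check for the figures in this thread.
# peak GFLOPS = shader cores * ops per clock (FMA = 2 FLOPs) * clock in GHz
def peak_gflops(cores, ghz, ops_per_clock=2):
    return cores * ops_per_clock * ghz

# Tegra X1: 256 CUDA cores at 1 GHz -> 512 GFLOPS FP32
print(peak_gflops(256, 1.0))   # 512.0
# Same chip at the Pixel C's 850 MHz clock:
print(peak_gflops(256, 0.85))  # 435.2
```

This is a theoretical peak only; sustained clocks in a thermally constrained device will be lower.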

That is what I've had in the OP for the X1 and the original dev kit rumor all along. That is at least where NS started as a base discussion, before whatever tweaks they've done for the 500 man-years of effort Nvidia claimed.

Even if we base NS on the dev kit rumor, we see a lower CPU in NS, better RAM in NS, and an equivalent GPU in NS.

However, there is also https://blogs.nvidia.com/blog/2016/10/20/nintendo-switch/

This demonstrates what the X2 or P1 in the wiki above is discussing. It also, thus far, matches the latest confirmations in patents and confirmed things like USB-C power/adapter to dock, no fans/tech in the dock, shoulder buttons on the connector end of the Joy-Cons, etc. The only questionable data in that now is Maxwell vs Pascal - though there have been multiple sources on GAF saying that while the dev kit is Maxwell, they are still positive retail is actually Pascal.

This is where the Denver and some clock speeds in the OP come from and that is absolutely more powerful and very much plausible.

Patents =/= Source 

I don't care about just one stat like the amount of GFLOPS a machine can deliver, especially if it's just rumored. Clock rate also determines how fast the machine will perform, and if the Switch is clocked at less than half of the Tegra X1, there's almost no chance of it matching the SHIELD Android TV, which struggled with last-gen ports, seeing as how Nvidia is usually only able to fit 1 GPC in their mobile chips ...

I don't know how you sleep at night with these huge leaps in logic ... 

I have no idea what your compulsive obsession is with believing that the Switch will somehow come out on top of another similar unit which is actually released, has active cooling, has a constant power supply from the wall, and is in a bigger form factor that allows better ventilation of heat ... (Even if you believe that the Switch will get a fan to cool off the processor, it still won't have the benefit of a higher power consumption.)

The relatively ginormous Google Pixel C could only clock the Tegra X1's GPU to 850 MHz, so what hope does the Switch have of doing better?



fatslob-:O said:

Some GCN optimizations could potentially hurt Nvidia's microarchitecture ... (Tons of asymmetries when we're talking about GPU microarchitecture.)

Shading language optimizations will also be different. Shader Model 6 is a perfect fit for GCN, but I wonder what will happen to the Switch's performance as developers start doing those very hardcore shader optimizations you see on GCN ...

Exactly. ;)

Trumpstyle said:
Maybe I'm missing something, but the Nintendo Switch makes no sense to me. It has no hard drive, which means all games must sell through retail. This will kill the indie game market and make this console totally dependent on the big game studios.

But the console is too weak for AAA games. If NS has 500 gigaflops and 25GB/s of memory bandwidth, there is just no way any game developer wants to waste time on this console. It needs a minimum of 750 gigaflops and 50GB/s of memory bandwidth to even have a small chance of support.

You have MicroSD to expand the storage.

Flops and bandwidth are pointless metrics on their own.
Remember, if it is based on Maxwell/Pascal, then it is using colour compression, whereas the Xbox One and the PlayStation 4 do not.
Thus, that 25GB/s should stretch further in the real world than the raw number implies.

It will be fine for 720P with 1080P for lighter/older/simpler titles.
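To put a rough number on the colour compression point: if compression saves some fraction of colour traffic, the raw bandwidth behaves like a larger effective figure. The 20% saving below is purely illustrative - delta colour compression savings vary per scene and are not a fixed number:

```python
# Rough effective-bandwidth estimate. 25.6 GB/s is the rumored LPDDR4
# bandwidth discussed in this thread; the compression saving is a
# hypothetical, illustrative figure, NOT a measured one.
raw_gb_s = 25.6
assumed_compression_savings = 0.2  # hypothetical 20% saving on colour traffic

# If 20% of the bytes never need to move, the bus acts like a wider one:
effective = raw_gb_s / (1 - assumed_compression_savings)
print(round(effective, 1))  # 32.0
```

The real benefit depends entirely on workload, which is exactly why flops and bandwidth are pointless metrics on their own.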

superchunk said:

RE active cooling

Here is a very detailed and direct referenced break-down of the Nintendo patent filed this past week.

https://www.reddit.com/r/NintendoSwitch/comments/5ii58o/nintendo_switch_patent_megathread/

I don't see why you ignore this as a source for active cooling outside of purposeful bias.


Patents are just legal protection of "ideas".

Companies are constantly filing patent claims even when they have zero intention of using them; they have value during times of litigation, give you a leg-up in cross-patent agreements, and help in licensing deals.

A patent is not empirical evidence for anything.


superchunk said:

RE NS vs Shield TV

Why are you asking if the NS GPU is 1GHz? Shield TV is only 512 GFLOPS (FP32).

https://en.wikipedia.org/wiki/Tegra#Tegra_X1
http://www.neogaf.com/forum/showthread.php?t=1297191


Wut. Are you confused?


superchunk said:

So did N64. That is where the political angle comes in, when 3rd parties refused to work with Nintendo, not out of technical limitations.

The Nintendo 64 had technical limitations, though, which prevented a ton of games from going multiplat.
You were not going to get a game like Final Fantasy 8, which spanned 4 optical discs (2.8GB), on a cart with a maximum of 0.064GB, for instance.

PC was also pushing optical media heavily at the time.
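The capacity gap described above is stark in plain numbers (assuming roughly 700MB per CD, which is the standard figure):

```python
# Storage gap: Final Fantasy 8's four CDs vs the largest N64 cart.
cd_gb = 0.7          # ~700 MB per CD-ROM
n64_cart_gb = 0.064  # 64 MB, the largest N64 cart size

total_discs_gb = 4 * cd_gb
print(round(total_discs_gb, 1))                    # 2.8 (GB)
print(round(total_discs_gb / n64_cart_gb, 2))      # 43.75x the cart's capacity
```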

As for the Gamecube... We need to keep in mind it had the smallest market share, though only just behind the Xbox.
However... The difference was in development.
The original Xbox was basically a PC; it even used a Windows kernel, DirectX (plus another low-level API), and x86+nVidia hardware. That made the Xbox attractive to PC developers, as development was super easy, hence why it got gems like Morrowind.
The PS2 had the bulk of the market. It was where the money was.

With the Wii and Wii U, the hardware was a generation behind.

We also need to keep in mind that Nintendo hardware typically favours its own developers in terms of sales; Sony and Microsoft work with big 3rd party developers on development and advertising, which probably helps bolster that...

The difference with the Switch, though, is that Nintendo is leveraging nVidia's hardware and software stacks, and one would assume that Nintendo would also be able to leverage the ties that nVidia has formed with developers.




--::{PC Gaming Master Race}::--

Pemalite said:

Exactly. ;)

Another tidbit is that Shader Model 6 goes one step further than SPIR-V in terms of exposing more features from the GCN microarchitecture ...

WaveOnce and GlobalOrderedIncrement along with WaveGetOrderedIndex are pretty specific to AMD ... (DPP from GCN3 is also a good idea for implementing quad swizzles) 



fatslob-:O said:
Pemalite said:

Exactly. ;)

Another tidbit is that Shader Model 6 goes one step further than SPIR-V in terms of exposing more features from the GCN microarchitecture ...

WaveOnce and GlobalOrderedIncrement along with WaveGetOrderedIndex are pretty specific to AMD ... (DPP from GCN3 is also a good idea for implementing quad swizzles) 

Not many people on this forum would understand swizzles (aka re-arranging vector elements), let alone waves. It also shows how far your knowledge has come over the last few years.
Are you going to make a career out of it, if you haven't already? Or just stay an enthusiast?
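For anyone following along who hasn't met the term: a swizzle just re-orders (or repeats) the components of a short vector, e.g. shader syntax like `v.zyx` or `v.xxyy`. A minimal sketch outside of shader code:

```python
# A swizzle picks vector components by name, in any order, with repeats
# allowed. In HLSL/GLSL this is built-in syntax; here it's spelled out.
def swizzle(v, pattern, names="xyzw"):
    return [v[names.index(c)] for c in pattern]

v = [1.0, 2.0, 3.0, 4.0]   # components (x, y, z, w)
print(swizzle(v, "zyx"))   # [3.0, 2.0, 1.0]
print(swizzle(v, "xxyy"))  # [1.0, 1.0, 2.0, 2.0]
```

On real GPUs this re-arrangement is essentially free, which is why instructions that shuffle values across lanes get called "quad swizzles".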



--::{PC Gaming Master Race}::--

Pemalite said:

Not many people on this forum would understand swizzles (aka re-arranging vector elements), let alone waves. It also shows how far your knowledge has come over the last few years.
Are you going to make a career out of it, if you haven't already? Or just stay an enthusiast?

For now I'm following the books; there are many resources out there to learn digital circuit design ...

One day I want to be able to use HDLs like Verilog but I'm also interested in VLSI design too ...



superchunk said:
zwei said:
Wii U -> 352 GFlops. The table is wrong.

https://www.techpowerup.com/gpudb/1903/wii-u-gpu

Go read my linked table at the top of the OP, where there is documentation from many sources demonstrating that it is universally agreed that the WiiU was 176GF.

Maybe it's 176GF FP32 & 352GF FP16, and you're right: 176x2=352.



MEGADRIVE, SNES, SATURN, PS1, N64, DREAMCAST, PS2, GC, XBOX, X360, Wii, PS3, Wii U, PS4, XONE, GAME GEAR, GBP, GBA, NGAGE, GBAP, DS, PSP, 3DS, VITA.

fatslob-:O said:
Pemalite said:

Not many people on this forum would understand swizzles (aka re-arranging vector elements), let alone waves. It also shows how far your knowledge has come over the last few years.
Are you going to make a career out of it, if you haven't already? Or just stay an enthusiast?

For now I'm following the books; there are many resources out there to learn digital circuit design ...

One day I want to be able to use HDLs like Verilog but I'm also interested in VLSI design too ...

Playing with microcontrollers gave me a leg-up back in the 90's, and also got me hooked on hardware. Haha

zwei said:
superchunk said:

Go read my linked table at the top of the OP, where there is documentation from many sources demonstrating that it is universally agreed that the WiiU was 176GF.

Maybe it's 176GF FP32 & 352GF FP16, and you're right: 176x2=352.

VLIW should have a 1:1 ratio of FP32 to FP16 (or worse), due to the fact that VLIW doesn't support FP16 natively at the hardware level.

Graphics Core Next (excluding Vega) also has a 1:1 FP32 to FP16 ratio; the SoC in the PS4 Pro, being a semi-custom design, has double-rate FP16.
That isn't to say the WiiU can't have double-rate FP16, it is just unlikely... Besides, the die shots and the analysis of those dies have shown that it is a 176Gflop design for FP32/single precision... and that is the important part to take away from it all.

It was a similar case with nVidia: prior to Pascal, FP16 compute tasks were simply promoted to FP32, so it had a 1:1 ratio.
With Pascal, nVidia bundles two FP16 operations together and runs them through an FP32 unit for compute and storage.
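In numbers, using the 176GF figure from earlier in the thread (the 2:1 case below is the hypothetical double-rate FP16 scenario, not something the die shots support for the WiiU):

```python
# The FP32:FP16 ratios discussed above, made concrete.
fp32_gflops = 176.0  # the WiiU's agreed-upon FP32 figure

# FP16 promoted to FP32 (VLIW, pre-Pascal nVidia): no gain, 1:1 ratio.
promoted_fp16 = fp32_gflops * 1.0

# Hypothetical native double-rate FP16 (two halves packed per FP32 lane):
double_rate_fp16 = fp32_gflops * 2.0

print(promoted_fp16, double_rate_fp16)  # 176.0 352.0
```

That is where the 352 number comes from: it is the same silicon counted at half precision, not extra FP32 horsepower.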

176Gflop is what the Wii U has. And that is perfectly fine; even if it is lower than the other machines, it is still faster than the PS3's and Xbox 360's GPUs, and the games back that up with superior texturing, lighting and shader effects. It's not going to compete with the next-gen twins, but it is certainly the best-looking console of the last generation when games are made its way.



--::{PC Gaming Master Race}::--

Hopefully this puts a lot of rumors to bed. And I guess you can update the OP again.





http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-switch-spec-analysis

Yep. This is certainly a surprise to me. To those I've been arguing with lately: yep, I was wrong. I really expected it to be on par with a standard X1 and then up-ticked a bit, but nothing crazy. This is a little saddening.

Nevertheless, I'll still get one, as it is still an awesome little machine with what will be an excellent first-party lineup.