
Foxconn Leak was real, when Docked Switch is about 70% Xbone

SmileyAja said:

The Traktor speculation about the additional docks with GPUs is impossible. All the external GPU docks you see around use Thunderbolt 3, which is based on USB-C but improved and faster: 40 Gbps compared to 10 Gbps. The Switch won't be using Thunderbolt, and IIRC the nVidia tech isn't compatible with Thunderbolt anyway, so that's a no-go.

Now, the OP is in the ballpark, but he doesn't show his calculations. Here's a Reddit post that does a better job of explaining this kind of performance estimate:

https://www.reddit.com/r/NintendoSwitch/comments/5jlbub/clock_speeds_power_efficiency_and_why_the/

We'll probably be seeing closer to 750 GFLOPS though (as the P1 can do that max), but time will tell. This assumes a die shrink of Maxwell rather than full-on Pascal, as a "worst case" scenario.

It has been done: http://www.anandtech.com/show/9963/asus-booth-tour-at-ces-2016-10g-switches-external-gpu-dock-usb-c/2



“Simple minds have always confused great honesty with great rudeness.” - Sherlock Holmes, Elementary (2013).

"Did you guys expected some actual rational fact-based reasoning? ...you should already know I'm all about BS and fraudulence." - FunFan, VGchartz (2016)

SmileyAja said:

The Traktor speculation about the additional docks with GPUs is impossible. All the external GPU docks you see around use Thunderbolt 3, which is based on USB-C but improved and faster: 40 Gbps compared to 10 Gbps. The Switch won't be using Thunderbolt, and IIRC the nVidia tech isn't compatible with Thunderbolt anyway, so that's a no-go.

Now, the OP is in the ballpark, but he doesn't show his calculations. Here's a Reddit post that does a better job of explaining this kind of performance estimate:

https://www.reddit.com/r/NintendoSwitch/comments/5jlbub/clock_speeds_power_efficiency_and_why_the/

We'll probably be seeing closer to 750 GFLOPS though (as the P1 can do that max), but time will tell. This assumes a die shrink of Maxwell rather than full-on Pascal, as a "worst case" scenario.

Honestly, that's not a worst case; that's more like a best case.

Eurogamer says the dev kits are around 400 GFLOPS.

If Nintendo somehow doubled the power for the final product versus the dev kit, that's a "best case" scenario.

I still think the OP is wrong.

And even if he were right about it being 700 GFLOPS, that is still only HALF of an Xbox One Slim (not 70%).

Then factor in less memory, slower memory bandwidth, etc., and it's probably less than that.
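
For anyone who wants to check the arithmetic behind these figures, here's a rough Python sketch of the usual FP32 estimate (cores x clock x 2 for fused multiply-add). The core count and clocks are just the ones floated in this thread, not confirmed specs, and the 1310 GFLOPS Xbox One number is the commonly cited one.

def gflops(cuda_cores, clock_mhz):
    """FP32 GFLOPS assuming 2 FLOPs per core per cycle (fused multiply-add)."""
    return cuda_cores * (clock_mhz / 1000.0) * 2

XBOX_ONE_GFLOPS = 1310  # commonly cited figure for the original Xbox One

for label, cores, clock in [
    ("Eurogamer portable", 256, 307.2),
    ("Eurogamer docked", 256, 768.0),
    ("Leaked docked clock", 256, 921.0),  # per the Foxconn leak discussed here
]:
    g = gflops(cores, clock)
    print(f"{label}: {g:.0f} GFLOPS ({g / XBOX_ONE_GFLOPS:.0%} of Xbox One)")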



Mr Puggsly said:
SubiyaCryolite said:

At least "Teh Cell" eventually lived up to its expectations.

Uhm... no. Even at their best, PS3 games were on par with 360. Not significantly better, if at all; just on par.

Sony should have used a cheaper CPU and a better GPU. You know... like the PS4.

Uhhh, it's been a while, but I'm pretty certain nothing on 360 touches Naughty Dog's or Santa Monica's last efforts.



"We'll toss the dice however they fall,
And snuggle the girls be they short or tall,
Then follow young Mat whenever he calls,
To dance with Jak o' the Shadows."

Check out MyAnimeList and my Game Collection. Owner of the 5 millionth post.

FunFan said:
SmileyAja said:

The Traktor speculation about the additional docks with GPUs is impossible. All the external GPU docks you see around use Thunderbolt 3, which is based on USB-C but improved and faster: 40 Gbps compared to 10 Gbps. The Switch won't be using Thunderbolt, and IIRC the nVidia tech isn't compatible with Thunderbolt anyway, so that's a no-go.

Now, the OP is in the ballpark, but he doesn't show his calculations. Here's a Reddit post that does a better job of explaining this kind of performance estimate:

https://www.reddit.com/r/NintendoSwitch/comments/5jlbub/clock_speeds_power_efficiency_and_why_the/

We'll probably be seeing closer to 750 GFLOPS though (as the P1 can do that max), but time will tell. This assumes a die shrink of Maxwell rather than full-on Pascal, as a "worst case" scenario.

It has been done: http://www.anandtech.com/show/9963/asus-booth-tour-at-ces-2016-10g-switches-external-gpu-dock-usb-c/2

"The external GPU dock on display from ASUS is a little different again to the laptops and AIO mentioned in the previous paragraph, by taking the PCIe lanes and passing the data over a Type-C interface. Using a proprietary IC, ASUS is able to carry 32 Gbps of data over a passive Type-C cable (note, Thunderbolt 3 is limited to 20 Gbps over passive) to a dock that can decode the data. That being said, there were two data cables going from the dock to the laptop, making me think that this is actually two lots of 20Gbps maximum and the IC logic is there to reconstruct the bits from different PCIe lane data streams."



JRPGfan said:
SmileyAja said:

The Traktor speculation about the additional docks with GPUs is impossible. All the external GPU docks you see around use Thunderbolt 3, which is based on USB-C but improved and faster: 40 Gbps compared to 10 Gbps. The Switch won't be using Thunderbolt, and IIRC the nVidia tech isn't compatible with Thunderbolt anyway, so that's a no-go.

Now, the OP is in the ballpark, but he doesn't show his calculations. Here's a Reddit post that does a better job of explaining this kind of performance estimate:

https://www.reddit.com/r/NintendoSwitch/comments/5jlbub/clock_speeds_power_efficiency_and_why_the/

We'll probably be seeing closer to 750 GFLOPS though (as the P1 can do that max), but time will tell. This assumes a die shrink of Maxwell rather than full-on Pascal, as a "worst case" scenario.

Honestly, that's not a worst case; that's more like a best case.

Eurogamer says the dev kits are around 400 GFLOPS.

If Nintendo somehow doubled the power for the final product versus the dev kit, that's a "best case" scenario.

I still think the OP is wrong.

And even if he were right about it being 700 GFLOPS, that is still only HALF of an Xbox One Slim (not 70%).

Then factor in less memory, slower memory bandwidth, etc., and it's probably less than that.

Hence the quotation marks. Eurogamer was speculating about the X1; they calculated based on the clock speeds and an X1, which is highly unlikely given how well suited Pascal is to a hybrid form factor like the Switch, and the Foxconn leak implies as much. Not to mention Pascal is cheaper and should become cheaper as time goes on, allowing for somewhat deeper price cuts.

"There's an additional wrinkle to the story too, albeit one we should treat with caution as it is single-source in nature with a lot of additional speculation on our part. This relates to the idea that the Tegra X1 in the NX development hardware is apparently actively cooled, with audible fan noise. With that in mind, we can't help but wonder whether X1 is the final hardware we'll see in the NX. Could it actually be a placeholder for Tegra X2?"

"While we're confident that our reporting on Switch's clock-speeds is accurate, all of the questions we have concerning the leaked spec remain unanswered.... Performance at lower clocks could be boosted by a larger GPU (ie more CUDA cores), but this seems unlikely - even if Switch is using newer 16nm technology, actual transistor density isn't that different to Tegra X1's 20nm process - it's the FinFET '3D' transistors that make the difference. A larger GPU would result in a more expensive chip too, with only limited performance gains. And if Switch is using a more modern 16nm Tegra chip, we would expect Nintendo to follow Nvidia's lead in how the new process is utilised. However, the Tegra X2 features the same CUDA core count and apparently boosts GPU clocks by 50 per cent, the opposite direction taken by Nintendo."

Having less memory won't be an issue; it seems they're making as simple an OS as possible to keep its RAM consumption down, while the PS4 and XBONE use around half of their RAM (roughly 4 or 5 GB) on the OS. The Switch having around 3.5 GB available wouldn't sacrifice that much performance at all; playing games with 4 GB of RAM instead of 8 on PC, for example, perhaps costs around 5 frames, and that could easily be offset by the lower graphical fidelity and, to a lesser extent, further optimization to lessen the load on RAM. RAM speed, though, is a legitimate concern, since some of the fastest mobile RAM tops out at around 30 GB/s, roughly half of the XBONE's. Nintendo might have figured out a workaround for this, but I don't find that very likely.

750 GFLOPS is around half of the XBONE, but with the additional optimization software provided by nVidia you could achieve better performance. I'm not implying groundbreaking optimization magic, but it should still help. There's always the argument that GFLOPS aren't the only measure of graphics performance; IIRC there used to be people on Reddit mentioning AMD's TFLOP efficiency, which is worse than Pascal's because the architecture is older, so 800 GFLOPS on Pascal would be roughly a TFLOP in AMD terms? Not too sure here, it would be great if someone with knowledge in this area could fill me in.

Anyway, the Switch will be good enough for third parties. We have OBE1 saying RE7 with VR (though they're running into performance issues with VR, IIRC) and Assassin's Creed Egypt are coming on the same day as PS4 and XBONE, and he's been right about no Netflix etc. at launch, online free at launch but charged for later, a new ATLUS game, and a new Warriors crossover, and Reggie has hinted at Metroid. They managed to optimize GTA V for X360 and PS3 and Rise of the Tomb Raider for X360, and those ran well enough on systems that will be inferior to the Switch; if the sales are there, developers will put in the effort to do the same for the Switch with more modern titles. We'll see. I'm speculating as heavily as anyone else here, and there's a good chance I'll be eating my words one day.
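
On the memory bandwidth point specifically, the back-of-the-envelope arithmetic looks like this. The 64-bit LPDDR4 interface for the Switch is an assumption on my part, and the Xbox One's 32 MB of ESRAM (around 200 GB/s) is left out, which flatters the comparison a bit.

def bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    """Peak theoretical bandwidth in GB/s for a DDR-style memory interface."""
    return bus_width_bits / 8 * transfer_rate_mts / 1000

# Assumed Switch config: 64-bit LPDDR4-3200 (speculative)
print(f"Switch (assumed 64-bit LPDDR4-3200): {bandwidth_gbs(64, 3200):.1f} GB/s")
# Xbox One main memory: 256-bit DDR3-2133 (ESRAM not counted)
print(f"Xbox One (256-bit DDR3-2133):        {bandwidth_gbs(256, 2133):.1f} GB/s")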



Mr Puggsly said:
SubiyaCryolite said:

At least "Teh Cell" eventually lived up to its expectations.

Uhm... no. Even at their best, PS3 games were on par with 360. Not significantly better, if at all; just on par.

Sony should have used a cheaper CPU and a better GPU. You know... like the PS4.

I'm not sure where you are getting your information from. A lot of games are enhanced on PS3 over Xbox 360.

1. Many 360 games have much inferior sound. 360 games had to come on DVDs and didn't have the space of PS3 Blu-rays, so audio was more heavily compressed and only supported Dolby 5.1, compared to up to uncompressed 7.1 on PS3, often with additional sound layers.

2. Cutscenes on PS3 could be pre-rendered, high-quality 1080p movies, whereas on 360 they were often reduced-quality, heavily compressed videos or in-engine sequences.

3. PS3 supported a greater range of 1080p games.

4. PS3 had wider support for 3D games.

5. The Cell processor is less well utilised in multiplatform games, but games built from the ground up for it made good use of the Cell, and many of those games wouldn't have been possible on 360 at the same quality.

I own both a 360 and a PS3, and while the 360 was generally better for multiplatform games, exclusives often performed at a higher level on PS3 than similar games did on 360.



VGPolyglot said:
vivster said:
I don't play specs, I play games. And apparently I'm still playing 900p 30fps games in 2017.

I'm still playing 240p whatever-FPS games on the PS1 and N64 in 2017.

Looks like you desperately need to step your game up.

Btw, it's good to know, but I will still get BotW for Wii U. Congrats to those getting BotW along with the Switch; good news for them.



                          

"We all make choices, but in the end, our choices make us" - Andrew Ryan, Bioshock.

SmileyAja said:
FunFan said:

It has been done: http://www.anandtech.com/show/9963/asus-booth-tour-at-ces-2016-10g-switches-external-gpu-dock-usb-c/2

"The external GPU dock on display from ASUS is a little different again to the laptops and AIO mentioned in the previous paragraph, by taking the PCIe lanes and passing the data over a Type-C interface. Using a proprietary IC, ASUS is able to carry 32 Gbps of data over a passive Type-C cable (note, Thunderbolt 3 is limited to 20 Gbps over passive) to a dock that can decode the data. That being said, there were two data cables going from the dock to the laptop, making me think that this is actually two lots of 20Gbps maximum and the IC logic is there to reconstruct the bits from different PCIe lane data streams."

If AsusTek can do it, so can anyone else. Nintendo hasn't given us the specs, and the chances of it working just like any standard Type-C port are slim, not necessarily because Nintendo would use it for extra performance, but because of their aggressive anti-piracy stance.



“Simple minds have always confused great honesty with great rudeness.” - Sherlock Holmes, Elementary (2013).

"Did you guys expected some actual rational fact-based reasoning? ...you should already know I'm all about BS and fraudulence." - FunFan, VGchartz (2016)

Soundwave said:
JRPGfan said:

Am I the only one who thinks the OP makes no sense?
He's pulling numbers out of his arse without explaining where he gets them from.

Somehow 471 GFLOPS becomes 707 GFLOPS.
But maybe it has extra cores, and then with its GPU speed it could be 921 GFLOPS!
But then no, that's too much, it wouldn't run at full speed, so that 921 is actually just 710 GFLOPS.


What?

The OP doesn't explain anything.
I still think the most likely scenario is the one Eurogamer gave us:
150 GFLOPS (portable) and 397 GFLOPS (docked).

They at least explain why they believe it has 256 cores, what speed those run at, etc.

The OP just pulls numbers outta his butt :p

I was thinking the same thing, lol. The OP just randomly says it's 384 or 512 CUDA cores; the leaker never said that.

It's likely a 256 CUDA core part (just as the Tegra X1 is), but hopefully die-shrunk to 16nm, which would allow it to be more power efficient.

There are 128 cores per SM, so it's suggesting a custom-built Tegra in the Switch (3 SMs = 384 cores, 4 SMs = 512 cores).

That speculative stuff aside, is 4310 mAh even a common battery capacity? That guess was way too precise, so I'm leaning towards believing the Foxconn report now.
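
To spell out the arithmetic behind those 384/512-core guesses, here's a quick sketch: Maxwell-class SMs have 128 CUDA cores each, so the GFLOPS figures fall straight out of the SM count and the 921 MHz docked clock attributed to the leak (not a confirmed spec). The 707 GFLOPS figure quoted earlier corresponds to 3 SMs at that clock.

CORES_PER_SM = 128          # Maxwell-class SM
DOCKED_CLOCK_MHZ = 921.0    # leaked docked clock discussed in this thread

for sm_count in (2, 3, 4):
    cores = CORES_PER_SM * sm_count
    gflops = cores * (DOCKED_CLOCK_MHZ / 1000.0) * 2  # 2 FLOPs/core/cycle (FMA)
    print(f"{sm_count} SMs = {cores} cores -> {gflops:.0f} GFLOPS at {DOCKED_CLOCK_MHZ:.0f} MHz")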



Sooo... how much of the PS4 powa is it, percentage-wise?

While we are at it, PS360 too. Only then will I have a clear picture of how powafool the nintendo svvitch is.



Hunting Season is done...