Darc Requiem said:
spemanig said:
What does that mean in layman's terms, though? What would the X1 mean for the Switch as opposed to the X2 in terms of practical application?
|
I'm a layman myself. However, the X1 is 20nm, so it would run hotter and use more power. The X1 also has really limited memory bandwidth, which is the biggest issue I see. The Wii U has really limited bandwidth too (12.8 GB/s for the main memory), but at least it has embedded VRAM in the GPU to compensate, similar to how the Xbox One compensates for its main memory bandwidth (68 GB/s) with ESRAM. The X1 has LPDDR4 with 25.6 GB/s of bandwidth. That's 360/PS3-level bandwidth. Best case scenario is an X2 with GDDR5.
|
Yeah, I must correct my prior statements. Xbox One real-world parity is still possible, but only with modifications to beef up the X2. That's not out of the question (consoles have done it before), but it's no guarantee. An unmodified Tegra X2 gets you something like 60% of the Xbox One on the GFLOP scale, while an unmodified X1 gets you to 40%. So that lowers my cost concerns a bit, since the X2 isn't so outrageous :P . But it does put a lot more on Nintendo and Nvidia's modification and augmentation efforts. Tech that's weaker on paper can be made to outperform stronger tech in real-world performance; again, look at the Vita. But that puts a lot of burden on the work Nintendo and Nvidia have done to customize the Switch's chips.
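For what it's worth, the 40%/60% figures roughly check out against commonly cited FP32 numbers. A quick sketch, assuming ~1.31 TFLOPS for the Xbox One GPU, ~512 GFLOPS for the Tegra X1, and ~750 GFLOPS for the Tegra X2 (approximate public figures, not official specs):

```python
# Rough sanity check of the "40% / 60% of Xbox One" GFLOP claims.
# All figures are approximate FP32 throughput in GFLOPS.
XBOX_ONE_GFLOPS = 1310  # ~1.31 TFLOPS GPU
TEGRA_X1_GFLOPS = 512   # 256 cores x ~1.0 GHz x 2 ops/clock
TEGRA_X2_GFLOPS = 750   # 256 cores x ~1.46 GHz x 2 ops/clock

for name, gflops in [("Tegra X1", TEGRA_X1_GFLOPS),
                     ("Tegra X2", TEGRA_X2_GFLOPS)]:
    print(f"{name}: {gflops / XBOX_ONE_GFLOPS:.0%} of Xbox One")
```

That works out to roughly 39% for the X1 and 57% for the X2, so "40% and 60%" is in the right ballpark.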