
Who will provide the NX GPU?

 

Who is making the NX GPU?

nVidia: 187 votes (41.19%)
AMD: 210 votes (46.26%)
Silicon Graphics Inc: 16 votes (3.52%)
Sony (the power of the Cell!!!): 41 votes (9.03%)

Total: 454
Tenebrae77 said:
Soundwave said:

While Nvidia remained hush on anything NX related, one interesting tiny blip did come out of the Hot Chips conference aside from just a general idea of what the Tegra Parker (X2) can do. This slide mentions "sufficient thread count for automotive and gaming applications" for the upcoming Tegra X2/Parker ... now we know they're not making more Shield consoles, so likely this is referring to the Nintendo NX.

Because reasons.

lol, seeing Zero go on his Alt rampages is like walking into a public washroom and seeing some kid has shit all over the floor again. 

Poor mods always have to clean it up. 



dongo8 said:
JEMC said:

The problem with those articles is that they are based on nothing.

I don't know if you're a PC gamer or follow the news about new graphics cards, but the situation is similar: one site posts a rumor about an upcoming product, then another site writes an article about that product using the info from the first site as a source, after that another one does the same claiming "several sources" (the first and second sites), and from there it's a snowball rolling downhill.

What we know from the Eurogamer article is that Tegra X1 seems to be the processor that will power the NX, and that would make it more powerful than PS360 & Wii U, but far from the PS4/X1. And if they go with a Tegra X2, the situation will be better, but it won't change that much.

Found something that may hint a little more closely at the possible power of the chips...Most likely the ones in the NX will be semi-custom though, so who knows about the numbers regardless haha.

http://www.anandtech.com/show/9903/nvidia-announces-drive-px-2-pascal-power-for-selfdriving-cars

Looking at this link again, and seeing the setup of the Tegra X2 in the chart, it has me thinking that the NX portable could use a Tegra, while the SCD could potentially use one of the "unknown" Pascal cards for additional graphics processing.  Essentially, 50% of the setup above.  When going solo, the NX could operate with 650 (or less) GFLOPs that the Tegra could provide.  But when docked, the system would have an additional non-mobile graphics card.  No reason it would have to be a second Tegra.  A different card would make much more sense, and could boost a docked NX to 3 or 4 TFLOPs.
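For what it's worth, round numbers like these fall out of the standard peak-FLOPS formula: shader cores x 2 ops per clock (a fused multiply-add) x clock speed. A quick Python sketch, assuming the widely floated 256-core figure for Parker; the docked card's specs are purely hypothetical:

    # Peak FP32 throughput: cores x 2 ops/cycle (FMA) x clock in GHz.
    def fp32_gflops(cuda_cores, clock_ghz):
        return cuda_cores * 2 * clock_ghz

    tegra_x2 = fp32_gflops(256, 1.27)    # ~650 GFLOPS, the figure above
    docked_gpu = fp32_gflops(1280, 1.4)  # hypothetical mid-range Pascal card
    print("Tegra X2 alone: ~%.0f GFLOPS" % tegra_x2)                       # ~650
    print("Hypothetical docked card: ~%.1f TFLOPS" % (docked_gpu / 1000))  # ~3.6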



TheLastStarFighter said:
dongo8 said:

Found something that may hint a little more closely at the possible power of the chips...Most likely the ones in the NX will be semi-custom though, so who knows about the numbers regardless haha.

http://www.anandtech.com/show/9903/nvidia-announces-drive-px-2-pascal-power-for-selfdriving-cars

Looking at this link again, and seeing the setup of the Tegra X2 in the chart, it has me thinking that the NX portable could use a Tegra, while the SCD could potentially use one of the "unknown" Pascal cards for additional graphics processing.  Essentially, 50% of the setup above.  When going solo, the NX could operate with 650 (or less) GFLOPs that the Tegra could provide.  But when docked, the system would have an additional non-mobile graphics card.  No reason it would have to be a second Tegra.  A different card would make much more sense, and could boost a docked NX to 3 or 4 TFLOPs.

Something like that could be possible, but I'd probably lean towards it being unlikely.

For one the SCD would probably be quite expensive, Nvidia giving Nintendo Tegra tech that no one else is using for close to cost is one thing, but they're not going to give Nintendo their gaming desktop GPU tech at dramatically cheaper margins too. 

Second I guess what would the NX base unit really even accomplish if the GPU is like 2.5+ TFLOP? It would be like having two horses to pull a wagon and then throwing a dog down there too, lol. The Nvidia desktop GPU would be capable of running the games outright. 

Also scaling games would be difficult, that's a pretty huge gap. Keep in mind the portable NX is unlikely to be able to use the full 625 GFLOPS of the Tegra X2 as is; it would most likely run too hot. So you're talking about developers having to make the same game work on one config that's probably about 400 GFLOPS, and another config that's 3-4 TFLOPS ... that's a ridiculous gap in power.

You don't want the gap in power to be so large that the average user goes "holy FUCK! This game now looks like shit!" when they go from SCD home play to continuing to play on the road. It has to be somewhat seamless.



JRPGfan said:
Does anyone worry about the 50 GB/s memory bandwidth of this new Tegra X2 ?
It just seems low compared to the PS4's 176 GB/s.

It will be well suited to 720P.

But you need to keep in mind that bandwidth numbers cannot be directly compared, as Tegra employs tile-based rendering and has a slew of compression tricks to make better use of that 50GB/s.

But this is also a chip that isn't going to compete with the Playstation 4 or Xbox One anyway.

Miyamotoo said:

Not at all. The Xbox One has 68 GB/s while the Wii U has only 12.8 GB/s, so basically the Tegra X2 has about 4x the memory bandwidth of the Wii U.

This is one more reason why the NX will most likely have the Tegra X2 rather than the X1; the X1 has only 25 GB/s of memory bandwidth. Actually, the X2's memory bandwidth will probably be the biggest gain for the NX in comparison to the X1.



But we need to put things into perspective; it's useless grabbing various numbers and comparing the chips on those alone. It's not that black and white.

For example... With Maxwell (Tegra X1), nVidia introduced Delta Colour Compression and was able to get a 17-29% increase in effective bandwidth.
With Pascal, nVidia further optimised that technology and managed to get another 20%.

So their usable bandwidth would likely be (I'll use 20% for both) 30GB/s for the Maxwell-based Tegra X1 and 72GB/s for the Pascal-based Tegra X2; of course there is variance there and it can be higher/lower.

Also, nVidia can make better use of its bandwidth thanks to its tile-based approach; there is simply less waste.

Obviously it still can't hold a candle to the Playstation 4... And the Xbox One still has a big edge thanks to the eSRAM, so it's still not black and white.
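To make that compression arithmetic explicit, a small sketch; the 20% figures are just the rough gains quoted above, not measured values:

    # Apply successive delta-colour-compression gains to raw bandwidth.
    def effective_bw(raw_gb_s, gains):
        bw = raw_gb_s
        for g in gains:
            bw *= 1 + g
        return bw

    x1 = effective_bw(25.0, [0.20])        # Maxwell DCC only  -> ~30 GB/s
    x2 = effective_bw(50.0, [0.20, 0.20])  # Maxwell + Pascal  -> ~72 GB/s
    print("Tegra X1 effective: ~%.0f GB/s" % x1)
    print("Tegra X2 effective: ~%.0f GB/s" % x2)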

JEMC said:
The Tegra Parker/X2 isn't much of an improvement over the X1 in terms of graphics power, and most of the gains in compute (GFlops) are likely due to the much stronger CPUs.

In my opinion, if Nintendo goes with Nvidia, the X1 would be the better choice. That said, if Nintendo is serious about that Supplemental Compute Device, the beefier CPU of the X2 could come in handy to avoid some bottlenecks... but at that point memory bandwidth may be their biggest problem.


Or GPU clocks, we know Pascal is a clocking monster due to nVidia reworking the chip to achieve higher clockspeeds.
Plus FinFET has some favourable power characteristics to push that home. :)

TheLastStarFighter said:

Looking at this link again, and seeing the setup of the Tegra X2 in the chart, it has me thinking that the NX portable could use a Tegra, while the SCD could potentially use one of the "unknown" Pascal cards for additional graphics processing.  Essentially, 50% of the setup above.  When going solo, the NX could operate with 650 (or less) GFLOPs that the Tegra could provide.  But when docked, the system would have an additional non-mobile graphics card.  No reason it would have to be a second Tegra.  A different card would make much more sense, and could boost a docked NX to 3 or 4 TFLOPs.

I can see costs blowing out.
nVidia isn't exactly known for being cheap.

Not only that, but nVidia's Multi-GPU technology has never been that flexible as far as I know.

Tenebrae77 said:

"So basically based on nothing"

Like everything you say.

"Shitty home performance given all things but at least it's a reasonable upgrade on the Wii U and not the GameCube to Wii all over again."

HAHAHAHAHAHAHA

750 gflops is LESS  of a jump from wii u than wii was to GC. Go home, you're drunk. It's so pathetic that you actually think Nintendo's next home console will not be a lot more powerful than ps4, or moderately more powerful than ps4, or equal to ps4, or weaker than ps4, or weaker than X1, or only a little stronger than wii u (625-650 gflops).

Based on what? Gflops? Please. There is more to GPUs and graphics than single-precision compute.

Miyamotoo said:

Well to be fair, the Wii U also has eDRAM, which is similar to the eSRAM in the XB1.

The Wii U didn't have enough of the stuff. Pretty sure the latency might have been higher than the Xbox One's eSRAM as well.

Besides, the Wii U's main limiter was its GPU and CPU.

It's like a water pipe... Think of memory bandwidth as the size of the pipe and the CPU and GPU as the amount of water flowing through. If the CPU and GPU aren't big enough and can't fully utilise the size of the pipe, then the bigger pipe is wasted, isn't it?
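A minimal sketch of that pipe analogy, with entirely made-up numbers: a frame costs some amount of compute and some amount of memory traffic, and whichever side is slower sets the frame time, so widening the pipe does nothing for a compute-bound chip:

    # Toy bottleneck model: the slower of compute and memory limits the frame.
    def frame_ms(gflop_work, gflops, gb_traffic, gb_per_s):
        compute_ms = gflop_work / gflops * 1000.0
        memory_ms = gb_traffic / gb_per_s * 1000.0
        return max(compute_ms, memory_ms)

    # A small GPU stays at ~28 ms per frame even if bandwidth doubles:
    print(frame_ms(5.0, 176.0, 0.5, 35.0))  # 28.4 (compute-bound)
    print(frame_ms(5.0, 176.0, 0.5, 70.0))  # still 28.4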




www.youtube.com/@Pemalite

Soundwave said:
TheLastStarFighter said:

Looking at this link again, and seeing the setup of the Tegra X2 in the chart, it has me thinking that the NX portable could use a Tegra, while the SCD could potentially use one of the "unknown" Pascal cards for additional graphics processing.  Essentially, 50% of the setup above.  When going solo, the NX could operate with 650 (or less) GFLOPs that the Tegra could provide.  But when docked, the system would have an additional non-mobile graphics card.  No reason it would have to be a second Tegra.  A different card would make much more sense, and could boost a docked NX to 3 or 4 TFLOPs.

Something like that could be possible, but I'd probably lean towards it being unlikely.

For one the SCD would probably be quite expensive, Nvidia giving Nintendo Tegra tech that no one else is using for close to cost is one thing, but they're not going to give Nintendo their gaming desktop GPU tech at dramatically cheaper margins too. 

Second I guess what would the NX base unit really even accomplish if the GPU is like 2.5+ TFLOP? It would be like having two horses to pull a wagon and then throwing a dog down there too, lol. The Nvidia desktop GPU would be capable of running the games outright. 

Also scaling games would be difficult, that's a pretty huge gap. Keep in mind the portable NX is unlikely to be able to use the full 625 GFLOPS of the Tegra X2 as is; it would most likely run too hot. So you're talking about developers having to make the same game work on one config that's probably about 400 GFLOPS, and another config that's 3-4 TFLOPS ... that's a ridiculous gap in power.

You don't want the gap in power to be so large that the average user goes "holy FUCK! This game now looks like shit!" when they go from SCD home play to continuing to play on the road. It has to be somewhat seamless.

In practice it could be a 2 TFLOP home card. The Tegra still contains the CPU function. At home, 2 TFLOPs for the big HD TV; on the road, 500 GFLOPs for the small portable screen. It makes far more sense than two Tegras.



Pemalite said:

 

TheLastStarFighter said:

Looking at this link again, and seeing the setup of the Tegra X2 in the chart, it has me thinking that the NX portable could use a Tegra, while the SCD could potentially use one of the "unknown" Pascal cards for additional graphics processing.  Essentially, 50% of the setup above.  When going solo, the NX could operate with 650 (or less) GFLOPs that the Tegra could provide.  But when docked, the system would have an additional non-mobile graphics card.  No reason it would have to be a second Tegra.  A different card would make much more sense, and could boost a docked NX to 3 or 4 TFLOPs.

I can see costs blowing out.
nVidia isn't exactly known for being cheap.

Not only that, but nVidia's Multi-GPU technology has never been that flexible as far as I know.


It could be a mid-tier, 2 TFLOP card or so. It could function like the setup for cars referenced above, or like an Alienware laptop PC that shuts down the mobile card when connected to the SCD at home.



Pemalite said:
JRPGfan said:
Does anyone worry about the 50 GB/s memory bandwidth of this new Tegra X2 ?
It just seems low compared to the PS4's 176 GB/s.

It will be well suited to 720P.

But you need to keep in mind that bandwidth numbers cannot be directly compared, as Tegra employs tile-based rendering and has a slew of compression tricks to make better use of that 50GB/s.

But this is also a chip that isn't going to compete with the Playstation 4 or Xbox One anyway.

Miyamotoo said:

Not at all. The Xbox One has 68 GB/s while the Wii U has only 12.8 GB/s, so basically the Tegra X2 has about 4x the memory bandwidth of the Wii U.

This is one more reason why the NX will most likely have the Tegra X2 rather than the X1; the X1 has only 25 GB/s of memory bandwidth. Actually, the X2's memory bandwidth will probably be the biggest gain for the NX in comparison to the X1.



But we need to put things into perspective; it's useless grabbing various numbers and comparing the chips on those alone. It's not that black and white.

For example... With Maxwell (Tegra X1), nVidia introduced Delta Colour Compression and was able to get a 17-29% increase in effective bandwidth.
With Pascal, nVidia further optimised that technology and managed to get another 20%.

So their usable bandwidth would likely be (I'll use 20% for both) 30GB/s for the Maxwell-based Tegra X1 and 72GB/s for the Pascal-based Tegra X2; of course there is variance there and it can be higher/lower.

Also, nVidia can make better use of its bandwidth thanks to its tile-based approach; there is simply less waste.

Obviously it still can't hold a candle to the Playstation 4... And the Xbox One still has a big edge thanks to the eSRAM, so it's still not black and white.

JEMC said:
The Tegra Parker/X2 isn't much of an improvement over the X1 in terms of graphics power, and most of the gains in compute (GFlops) are likely due to the much stronger CPUs.

In my opinion, if Nintendo goes with Nvidia, the X1 would be the better choice. That said, if Nintendo is serious about that Supplemental Compute Device, the beefier CPU of the X2 could come in handy to avoid some bottlenecks... but at that point memory bandwidth may be their biggest problem.


Or GPU clocks, we know Pascal is a clocking monster due to nVidia reworking the chip to achieve higher clockspeeds.
Plus FinFET has some favourable power characteristics to push that home. :)

TheLastStarFighter said:

Looking at this link again, and seeing the setup of the Tegra X2 in the chart, it has me thinking that the NX portable could use a Tegra, while the SCD could potentially use one of the "unknown" Pascal cards for additional graphics processing.  Essentially, 50% of the setup above.  When going solo, the NX could operate with 650 (or less) GFLOPs that the Tegra could provide.  But when docked, the system would have an additional non-mobile graphics card.  No reason it would have to be a second Tegra.  A different card would make much more sense, and could boost a docked NX to 3 or 4 TFLOPs.

I can see costs blowing out.
nVidia isn't exactly known for being cheap.

Not only that, but nVidia's Multi-GPU technology has never been that flexible as far as I know.

Tenebrae77 said:

"So basically based on nothing"

Like everything you say.

"Shitty home performance given all things but at least it's a reasonable upgrade on the Wii U and not the GameCube to Wii all over again."

HAHAHAHAHAHAHA

750 gflops is LESS  of a jump from wii u than wii was to GC. Go home, you're drunk. It's so pathetic that you actually think Nintendo's next home console will not be a lot more powerful than ps4, or moderately more powerful than ps4, or equal to ps4, or weaker than ps4, or weaker than X1, or only a little stronger than wii u (625-650 gflops).

Based on what? Gflops? Please. There is more to GPUs and graphics than single-precision compute.

Miyamotoo said:

Well to be fair, the Wii U also has eDRAM, which is similar to the eSRAM in the XB1.

The Wii U didn't have enough of the stuff. Pretty sure the latency might have been higher than the Xbox One's eSRAM as well.

Besides, the Wii U's main limiter was its GPU and CPU.

It's like a water pipe... Think of memory bandwidth as the size of the pipe and the CPU and GPU as the amount of water flowing through. If the CPU and GPU aren't big enough and can't fully utilise the size of the pipe, then the bigger pipe is wasted, isn't it?

Assuming the home docked version of the NX can run at full clocks/cores, punching at 625 Nvidia GFLOPS with 50GB/sec memory bandwidth and all the bandwidth-saving techniques Nvidia has ...

Do you think PS4/XB1 ports would be possible at 1280x720? That would be 921,600 pixels to render instead of the 2,073,600 pixels of 1080p.

Let's even say that on the CPU side Nintendo is using something close/equal to the Jaguar core setup of the current consoles.

Just theoretically. It's possible Nintendo wouldn't even use that, but I'm pretty curious what a system under these conditions could pump out.

50GB/sec of memory bandwidth is quite a bit if you save 30-60% due to various Nvidia techniques and you're only rendering about half the pixels, no? The PS4's effective bandwidth from what I've heard is only 140GB/sec too. If Nvidia can save so much on the bandwidth side, it could have been a huge reason as to why Nintendo chose Nvidia.
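The pixel counts check out, though 720p is actually a bit less than half of 1080p (4/9 of the pixels). A quick sanity check, with the bandwidth saving set to an assumed 40% purely for illustration:

    p720 = 1280 * 720     # 921,600 pixels
    p1080 = 1920 * 1080   # 2,073,600 pixels
    print(p720 / p1080)   # 0.444... -> about 44% of the 1080p load

    # Rough traffic budget per pixel per frame at 60 fps (assumed 40% saving):
    effective_bw = 50e9 * 1.4
    print(effective_bw / (p720 * 60))  # ~1266 bytes/pixel/frame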



Soundwave said:

Assuming the home docked version of the NX can run at full clocks/cores, punching at 625 Nvidia GFLOPS with 50GB/sec memory bandwidth and all the bandwidth-saving techniques Nvidia has ...

Do you think PS4/XB1 ports would be possible at 1280x720? That would be 921,600 pixels to render instead of the 2,073,600 pixels of 1080p.

Let's even say that on the CPU side Nintendo is using something close/equal to the Jaguar core setup of the current consoles.

Just theoretically. It's possible Nintendo wouldn't even use that, but I'm pretty curious what a system under these conditions could pump out.

50GB/sec of memory bandwidth is quite a bit if you save 30-60% due to various Nvidia techniques and you're only rendering about half the pixels, no? The PS4's effective bandwidth from what I've heard is only 140GB/sec too. If Nvidia can save so much on the bandwidth side, it could have been a huge reason as to why Nintendo chose Nvidia.

I think Playstation 4 and Xbox One ports would be entirely possible at 720P.

But there will still be quality reductions even from the Xbox One versions; Tegra simply doesn't have the pixel or texturing fillrate, or even the geometry performance, to reach the same level of fidelity. Maybe in a couple of years when there is a push to 10nm chips.

TheLastStarFighter said:

It could be a mid-tier, 2 TFLOP card or so. It could function like the setup for cars referenced above, or like an Alienware laptop PC that shuts down the mobile card when connected to the SCD at home.

nVidia usually charges several hundred AUD for a mid-tier GPU.
Not to mention Nintendo still needs to include memory, Mosfets/VRM/general power delivery/cooling/case for that GPU.

And then Nintendo needs to provide the (what could be a fairly expensive) Tegra SoC, several gigabytes of memory for that, VRM/Mosfet/general power delivery, screen and supporting chips, battery and the case...
Consoles are cost-sensitive devices; you simply can't have everything.

We need to remember that nVidia's multi-GPU technology simply isn't the most flexible multi-GPU technology on Earth... The memory pools don't get combined; they get duplicated and need to be kept separate.
The faster GPUs will often run at the same speed as the slowest GPU.
And that ignores the issue that SLI requires the use of the same chip to begin with, unless you are going to make a GPU only perform physics calculations or some other compute task. (Perhaps Megatexturing?)

You also lose the ability to do Vsync and triple buffering in Alternate Frame Rendering mode. - You can also get microstutter and frame latency issues.

It would make more sense for nVidia to combine two Tegra SoCs, to be honest.

And having the mobile chip "switch off" when docked is just a waste of potential... Especially if you are a user who would never take the device out of the dock.
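The microstutter point is easy to see with a toy alternate-frame-rendering timeline: two GPUs that each take ~33 ms per frame but kick off ~10 ms apart deliver frames in an uneven short/long rhythm, even though the average frame rate looks fine. All numbers are illustrative:

    # Toy AFR schedule: even frames render on GPU 0, odd frames on GPU 1.
    render_ms = 33.0  # per-GPU render time
    offset_ms = 10.0  # stagger between the two GPUs' start times

    finish_times = []
    for frame in range(8):
        gpu = frame % 2
        start = (frame // 2) * render_ms + gpu * offset_ms
        finish_times.append(start + render_ms)

    gaps = [b - a for a, b in zip(finish_times, finish_times[1:])]
    print(gaps)  # alternates ~10 ms / ~23 ms instead of a steady ~16.5 ms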




www.youtube.com/@Pemalite

Pemalite said:
Soundwave said:

Assuming the home docked version of the NX can run at full clocks/cores, punching at 625 Nvidia GFLOPS with 50GB/sec memory bandwidth and all the bandwidth-saving techniques Nvidia has ...

Do you think PS4/XB1 ports would be possible at 1280x720? That would be 921,600 pixels to render instead of the 2,073,600 pixels of 1080p.

Let's even say that on the CPU side Nintendo is using something close/equal to the Jaguar core setup of the current consoles.

Just theoretically. It's possible Nintendo wouldn't even use that, but I'm pretty curious what a system under these conditions could pump out.

50GB/sec of memory bandwidth is quite a bit if you save 30-60% due to various Nvidia techniques and you're only rendering about half the pixels, no? The PS4's effective bandwidth from what I've heard is only 140GB/sec too. If Nvidia can save so much on the bandwidth side, it could have been a huge reason as to why Nintendo chose Nvidia.

I think Playstation 4 and Xbox One ports would be entirely possible at 720P.

But there will still be quality reductions even from the Xbox One versions; Tegra simply doesn't have the pixel or texturing fillrate, or even the geometry performance, to reach the same level of fidelity. Maybe in a couple of years when there is a push to 10nm chips.

TheLastStarFighter said:

It could be a mid-tier, 2 TFLOP card or so. It could function like the setup for cars referenced above, or like an Alienware laptop PC that shuts down the mobile card when connected to the SCD at home.

nVidia usually charges several hundred AUD for a mid-tier GPU.
Not to mention Nintendo still needs to include memory, Mosfets/VRM/general power delivery/cooling/case for that GPU.

And then Nintendo needs to provide the (what could be a fairly expensive) Tegra SoC, several gigabytes of memory for that, VRM/Mosfet/general power delivery, screen and supporting chips, battery and the case...
Consoles are cost-sensitive devices; you simply can't have everything.

We need to remember that nVidia's multi-GPU technology simply isn't the most flexible multi-GPU technology on Earth... The memory pools don't get combined; they get duplicated and need to be kept separate.
The faster GPUs will often run at the same speed as the slowest GPU.
And that ignores the issue that SLI requires the use of the same chip to begin with, unless you are going to make a GPU only perform physics calculations or some other compute task. (Perhaps Megatexturing?)

You also lose the ability to do Vsync and triple buffering in Alternate Frame Rendering mode. - You can also get microstutter and frame latency issues.

It would make more sense for nVidia to combine two Tegra SoCs, to be honest.

And having the mobile chip "switch off" when docked is just a waste of potential... Especially if you are a user who would never take the device out of the dock.

What do you think two Tegra X2's in unison could accomplish (Base NX + A Hypothetical Supplement Compute Device with a second SoC)? 

I'm kinda just curious to see how far a company could take these little chips, it's sorta fascinating. 

I think the SCD would simply be a second Tegra X2, or maybe the same chip with more CUDA cores or something. Nintendo won't want to pay for an entirely separate semi-custom design, and by putting the same GPU in the SCD, it could lower Nintendo's costs by increasing mass production of the same chip.

Actually, it is kinda interesting that the giant Drive PX 2 board already utilizes two Tegra X2s in tandem. I wonder if their automotive work has forced Nvidia to get used to multi-processor setups, and maybe that's also where Nintendo's idea for the Supplemental Compute Device comes from.
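Rough numbers for the two-X2 scenario, keeping in mind that multi-GPU setups rarely double performance; the scaling efficiencies below are pure guesses:

    single_gflops = 650  # rough Tegra X2 FP32 figure used earlier in the thread
    for efficiency in (1.0, 0.8, 0.6):
        combined = 2 * single_gflops * efficiency
        print("2x Tegra X2 at %.0f%% scaling: ~%.0f GFLOPS"
              % (efficiency * 100, combined))

At perfect scaling that lands near the Xbox One's ~1.3 TFLOPS, which lines up with the estimate in the reply below.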



Soundwave said:

What do you think two Tegra X2's in unison could accomplish (Base NX + A Hypothetical Supplement Compute Device with a second SoC)? 

I'm kinda just curious to see how far a company could take these little chips, it's sorta fascinating. 

I think the SCD would simply be a second Tegra X2, or maybe the same chip with more CUDA cores or something. Nintendo won't want to pay for an entirely separate semi-custom design, and by putting the same GPU in the SCD, it could lower Nintendo's costs by increasing mass production of the same chip.

Actually, it is kinda interesting that the giant Drive PX 2 board already utilizes two Tegra X2s in tandem. I wonder if their automotive work has forced Nvidia to get used to multi-processor setups, and maybe that's also where Nintendo's idea for the Supplemental Compute Device comes from.

I would place two Tegra X2s at roughly Xbox One levels of imagery, to be honest, but with 720P resolution.

I just want answers, and clear, concise ones, from Nintendo with a full NX reveal already. Haha




www.youtube.com/@Pemalite