
Forums - Nintendo Discussion - What If The NX Console Is Portable Too?

JustBeingReal said:
Soundwave said:


Pretty sure that is what AMD is aiming for with the 14nm FinFET process that starts next year. 

The Tegra X1 gets 500 GFLOPS at roughly 10 watts, but that's at 20nm, the rumor is they are moving to 14nm FinFET later this year, which will allow the X1 to be put into a mini-tablet form factor. 

The PowerVR GT7900 also gets 800 GFLOPS at about 10-12 watts, and that's at 14nm/16nm. 

I think what Nintendo will be using will be more akin to these new advances in mobile tech, not based on AMD's existing desktop/laptop processors. Those just don't have anywhere close to the power efficiency needed to reasonably power a portable, which is essential to the NX concept. 

IMO if AMD cannot give Nintendo similar performance-per-watt to what Nvidia and PowerVR (Apple) are getting, then they've made a huge mistake in choosing AMD again. 


You're doing it again, using Flop ratings to compare multiple different manufacturers' technology, when floating point calculations are not an industry-standard measuring metric.

No company measures them in the same way, so using them as if they were comparable is completely disingenuous.

No, AMD are not aiming for 70GFlops per watt. Going by the way AMD measures FLOPs, at 28nm they've achieved 819GFlops at 35 watts, taking everything across a Carrizo APU/SOC into account, which means they've achieved 23.4GFlops per watt. If that architecture can be shrunk to 14nm it would achieve 46.8GFlops per watt.

Perhaps they can optimize their design to gain more performance, but that would take time. 2016 probably isn't going to be when they achieve 70GFlops per watt at 14nm; maybe they'll get close to 60GFlops per watt, but this is based on how AMD measures floating point performance.
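As a quick sanity check on that arithmetic (a rough sketch: the straight 2x gain from 28nm to 14nm is the assumption being made here, not a datasheet figure):

```python
# Rough perf-per-watt arithmetic from the figures above.
CARRIZO_GFLOPS = 819   # AMD's quoted peak single-precision figure at 28nm
CARRIZO_WATTS = 35     # the 35W TDP configuration discussed

def gflops_per_watt(gflops: float, watts: float) -> float:
    return gflops / watts

perf_28nm = gflops_per_watt(CARRIZO_GFLOPS, CARRIZO_WATTS)
print(round(perf_28nm, 1))   # 23.4

# Assuming (optimistically) a straight 2x efficiency gain from 28nm -> 14nm:
perf_14nm = perf_28nm * 2
print(round(perf_14nm, 1))   # 46.8
```

Real node shrinks rarely deliver a clean doubling of efficiency, so treat the 46.8 figure as an upper bound on this line of reasoning.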

GFlops are not a standard for measuring total processor performance, so using PowerVR's, Nvidia's Tegra's or anyone else's tech as a comparison with GFlops as the metric makes absolutely no sense; the ratings are not comparable.

 

Flops are a marketing buzzterm, a fake figure to make it seem like you have X amount of performance. It's not accurate, it's not always used in exactly the same way, and it doesn't cover total processor performance.

Flops don't tell you Instructions Per Second (IPS), cache sizes, bandwidth, or latency between components inside a chip or between chips.

Let me ask you then: what's the maximum performance you think they can get in a 4-5 watt power envelope for a handheld launching fall 2016?

If it's not enough to at least be able to run Wii U ports... then I think Nintendo's whole conceptualization of this unified concept will probably have been a bad idea, but that's kind of becoming par for the course with their hardware designs of late. 



Soundwave said:
JustBeingReal said:


You're doing it again, using Flop ratings to compare multiple different manufacturers' technology, when floating point calculations are not an industry-standard measuring metric.

No company measures them in the same way, so using them as if they were comparable is completely disingenuous.

No, AMD are not aiming for 70GFlops per watt. Going by the way AMD measures FLOPs, at 28nm they've achieved 819GFlops at 35 watts, taking everything across a Carrizo APU/SOC into account, which means they've achieved 23.4GFlops per watt. If that architecture can be shrunk to 14nm it would achieve 46.8GFlops per watt.

Perhaps they can optimize their design to gain more performance, but that would take time. 2016 probably isn't going to be when they achieve 70GFlops per watt at 14nm; maybe they'll get close to 60GFlops per watt, but this is based on how AMD measures floating point performance.

GFlops are not a standard for measuring total processor performance, so using PowerVR's, Nvidia's Tegra's or anyone else's tech as a comparison with GFlops as the metric makes absolutely no sense; the ratings are not comparable.

 

Flops are a marketing buzzterm, a fake figure to make it seem like you have X amount of performance. It's not accurate, it's not always used in exactly the same way, and it doesn't cover total processor performance.

Flops don't tell you Instructions Per Second (IPS), cache sizes, bandwidth, or latency between components inside a chip or between chips.

Let me ask you then: what's the maximum performance you think they can get in a 4-5 watt power envelope for a handheld launching fall 2016?

If it's not enough to at least be able to run Wii U ports... then I think Nintendo's whole conceptualization of this unified concept will probably have been a bad idea, but that's kind of becoming par for the course with their hardware designs of late. 

Well, we know by AMD's own performance metrics that Carrizo, an APU/SOC packing 4 Excavator CPU cores at 2.1GHz plus an 8 CU, 32 TMU, 8 ROP GPU clocked at 800MHz, runs on 35 watts at 28nm and produces 819GFlops.
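For what it's worth, that 819GFlops figure falls out of the standard peak-FLOPS formula for AMD's GCN-style GPUs: shader count x 2 ops per cycle (a fused multiply-add counts as two) x clock. A quick check, assuming the usual 64 shaders per CU:

```python
# Peak single-precision FLOPS for a GCN-style GPU:
# shaders x 2 ops/cycle (FMA counts as two) x clock speed.
CUS = 8
SHADERS_PER_CU = 64   # standard for AMD GCN compute units
CLOCK_GHZ = 0.8       # 800 MHz

peak_gflops = CUS * SHADERS_PER_CU * 2 * CLOCK_GHZ
print(round(peak_gflops, 1))   # 819.2
```

Note this is a theoretical peak for the GPU alone; it says nothing about the CPU side or real sustained throughput.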

We can compare that to PS4 and XB1, or at least to Wii U's GPU; Excavator is substantially more capable than the PowerPC cores in Wii U.

At 5 watts the above AMD tech would produce about half the performance of Xbox 360, or about one third of Wii U's capabilities. Maybe at 14nm a Zen or K12 APU could produce about 2/3rds of Wii U's capabilities on 5 watts. If Nintendo wants Wii U levels of GPU performance with a more capable CPU, and efficiencies aren't quite where they need to be, they can always use a larger battery and a 10 watt part; AMD do make SOCs at that level that work for tablets, so there's no reason Nintendo can't use the same for a portable version of the Wii U's GamePad.

 

TBH though, a handheld doesn't need to be as powerful as Wii U, since the output isn't on a huge screen. It could even be 480p, i.e. 640×480, which equals 307,200 pixels, exactly 1/3rd of Wii U's 720p (921,600 pixel) output. Nintendo could scale resolution down to run on a 117GFlop version of AMD's Carrizo tech; if they wanted they could even reduce graphical details, and people probably wouldn't notice the difference because a smaller screen wouldn't show it.
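The pixel arithmetic there checks out:

```python
# 480p handheld output vs Wii U's 720p output.
handheld_pixels = 640 * 480    # 307,200
wii_u_pixels = 1280 * 720      # 921,600

print(handheld_pixels)                   # 307200
print(wii_u_pixels // handheld_pixels)   # 3 -> exactly one third the pixels
```

(The 117GFlop figure is presumably roughly one third of the ~352GFlops commonly attributed to Wii U's GPU, matching the one-third pixel count; that attribution is an assumption, not something stated in the thread.)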



Soundwave said:

I guess another way of doing it is what if Nintendo made a different console for different regional tastes?

I'm going to say Nintendo chooses to be a little bold and uses AMD's 14nm FinFET process, which is supposed to be firing on all cylinders by next year. So let's assume 70 GFLOPS/watt.

NX Pocket Handheld - 350 GFLOP. 960x540 4.88-inch LCD screen. $199.99. Standard Nintendo option, good for kids, people who want a DS/3DS successor. 3GB RAM. Cheap screen but does the job. 

NX Mobile Console (Japan) - 600 GFLOP (on battery); 900 GFLOP (plugged in). New Console Concept. Has a 1280x720 7-inch LCD screen. Can stream wirelessly to the TV via HDMI receiver (sold separately). Form factor may look like a Wii U controller or maybe a Surface tablet (kickstand display, play with controller). Not designed for pockets, but easy enough to take in a bag or carry from room to room. 6GB RAM. - $299.99 MSRP

NX Home Console (US/EU Markets) - 2TFLOP console (@28 watts), 1TB internal HDD, your standard Nintendo console. Games run at the full 1080P resolution for TV. 8GB RAM. About the size of the OG Wii (no disc drive). $299.99 MSRP.

All three versions could be sold in all markets of course, just the focus in the US would be the home console, in Japan the mobile console is the console made for Japanese tastes, and you have the standard Nintendo portable option for the typical kid market, budget parent, and the gamer who values portability/pocket-ability.

The only thing is I don't think the NX Pocket would be able to run all games (though at 350GFLOPS for only 540p render is pretty beastly still), but it would be able to run most third party games with scaled down effects and probably all Nintendo games at the lowered resolution, plus virtual console games and perhaps Android app ports. Ideal for getting kids with budget strict parents into the NX ecosystem and playing Splatoon 2/Mario Maker 2.0/Dragon Quest XI, then later on they can start bugging mom/dad for one of the console versions. 

More of this complete and utter nonsense. Games don't just "scale" like you think they do. It's not like PC games, where you can make the game run decently on a variety of hardware specs and just keep driving up the minimum requirements until the game runs okay. That is not how it works on consoles. The specs don't change. Video games cannot "just scale" on consoles. They never have and they never will.

But let's just assume it does.

Game engines still have to be optimized for each hardware spec, or mode. Every single one of them. Games have to be tested for each hardware spec, individually. Instead of each developer requiring one dev kit, they now need three. Now instead of taking an hour to make a simple adjustment and test it on PS4/XB1/Wii U, they need to test it on PS4/XB1/NXA/NXB/NXC/NXD. Wonderful! Awesome. Now the developers need to spend even more time testing things before their code is submitted. Did I mention this process can happen hundreds of times per day? Or at least it did. You just took the half hour it takes to make sure your code didn't break the build and turned it into an hour to 1.5 hour process. Never mind the added cost and time needed to test the game.

Do you ever want to see a third party game on a Nintendo console ever again? Because the cost of developing for all those different SKUs and modes drives development costs through the roof, and I do mean astronomically high.

This will never happen, unless you want the NX to fail harder than the Virtual Boy.



potato_hamster said:
Soundwave said:

I guess another way of doing it is what if Nintendo made a different console for different regional tastes?

I'm going to say Nintendo chooses to be a little bold and uses AMD's 14nm FinFET process, which is supposed to be firing on all cylinders by next year. So let's assume 70 GFLOPS/watt.

NX Pocket Handheld - 350 GFLOP. 960x540 4.88-inch LCD screen. $199.99. Standard Nintendo option, good for kids, people who want a DS/3DS successor. 3GB RAM. Cheap screen but does the job. 

NX Mobile Console (Japan) - 600 GFLOP (on battery); 900 GFLOP (plugged in). New Console Concept. Has a 1280x720 7-inch LCD screen. Can stream wirelessly to the TV via HDMI receiver (sold separately). Form factor may look like a Wii U controller or maybe a Surface tablet (kickstand display, play with controller). Not designed for pockets, but easy enough to take in a bag or carry from room to room. 6GB RAM. - $299.99 MSRP

NX Home Console (US/EU Markets) - 2TFLOP console (@28 watts), 1TB internal HDD, your standard Nintendo console. Games run at the full 1080P resolution for TV. 8GB RAM. About the size of the OG Wii (no disc drive). $299.99 MSRP.

All three versions could be sold in all markets of course, just the focus in the US would be the home console, in Japan the mobile console is the console made for Japanese tastes, and you have the standard Nintendo portable option for the typical kid market, budget parent, and the gamer who values portability/pocket-ability.

The only thing is I don't think the NX Pocket would be able to run all games (though at 350GFLOPS for only 540p render is pretty beastly still), but it would be able to run most third party games with scaled down effects and probably all Nintendo games at the lowered resolution, plus virtual console games and perhaps Android app ports. Ideal for getting kids with budget strict parents into the NX ecosystem and playing Splatoon 2/Mario Maker 2.0/Dragon Quest XI, then later on they can start bugging mom/dad for one of the console versions. 

More of this complete and utter nonsense. Games don't just "scale" like you think they do. It's not like PC games, where you can make the game run decently on a variety of hardware specs and just keep driving up the minimum requirements until the game runs okay. That is not how it works on consoles. The specs don't change. Video games cannot "just scale" on consoles. They never have and they never will.

But let's just assume it does.

Game engines still have to be optimized for each hardware spec, or mode. Every single one of them. Games have to be tested for each hardware spec, individually. Instead of each developer requiring one dev kit, they now need three. Now instead of taking an hour to make a simple adjustment and test it on PS4/XB1/Wii U, they need to test it on PS4/XB1/NXA/NXB/NXC/NXD. Wonderful! Awesome. Now the developers need to spend even more time testing things before their code is submitted. Did I mention this process can happen hundreds of times per day? Or at least it did. You just took the half hour it takes to make sure your code didn't break the build and turned it into an hour to 1.5 hour process. Never mind the added cost and time needed to test the game.

Do you ever want to see a third party game on a Nintendo console ever again? Because the cost of developing for all those different SKUs and modes drives development costs through the roof, and I do mean astronomically high.

This will never happen, unless you want the NX to fail harder than the Virtual Boy.


So do PC developers test every hardware config when debugging a game?

Why is it that PC hardware is so different from console hardware? Is there something magical in the console chips that makes them completely different?

If MGSV wins game of the year, that will be three straight years (The Last of Us, Destiny, and MGSV) of the GOTY winner being cross-gen between two completely separate hardware generations, let alone a unified family of devices with the same architecture. These are platforms in some cases separated by almost 10 years, with completely different CPUs, GPUs, memory structures, etc. 

By the way here's a really interesting look at how insanely scalable the Battlefield 4 engine is

https://www.youtube.com/watch?v=z0BpD2fylmk

You can scale it down all the way to basically make it look like a PS2/3DS game, or scale it back up so that it's too much to run on a PlayStation 4.

There's nothing magical or different about console chipsets these days; they're basically just watered-down PC or smartphone architectures, both of which are extremely scalable. Having low/medium/high settings is nothing new for developers. 
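A sketch of what low/medium/high scaling looks like in engine terms: one build, with per-tier presets picked at boot. All tier names and values below are invented for illustration:

```python
# One engine, one build: each hardware tier maps to a settings preset.
# Tier names and numbers here are hypothetical.
PRESETS = {
    "handheld": {"resolution": (960, 540),   "shadows": "off",  "texture_mb": 512},
    "mobile":   {"resolution": (1280, 720),  "shadows": "low",  "texture_mb": 1024},
    "console":  {"resolution": (1920, 1080), "shadows": "high", "texture_mb": 2048},
}

def settings_for(tier: str) -> dict:
    # The platform reports its tier at boot; the engine just looks it up.
    return PRESETS[tier]

print(settings_for("handheld")["resolution"])   # (960, 540)
```

This is the PC-style approach described above applied to a fixed set of known tiers; whether that covers the per-spec optimization consoles traditionally get is exactly what the two posters are arguing about.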

This will be easier for third parties than in the past, where if they wanted to make a Wii game, say Call of Duty for Wii... they had to use a completely separate team and a totally different engine. Want to make a DS/3DS game too? Well, that's another new engine that's going to require a completely different team. A scalable NX will be easier for third parties dealing with Nintendo, not harder. 



JustBeingReal said:
Soundwave said:
JustBeingReal said:


You're doing it again, using Flop ratings to compare multiple different manufacturers' technology, when floating point calculations are not an industry-standard measuring metric.

No company measures them in the same way, so using them as if they were comparable is completely disingenuous.

No, AMD are not aiming for 70GFlops per watt. Going by the way AMD measures FLOPs, at 28nm they've achieved 819GFlops at 35 watts, taking everything across a Carrizo APU/SOC into account, which means they've achieved 23.4GFlops per watt. If that architecture can be shrunk to 14nm it would achieve 46.8GFlops per watt.

Perhaps they can optimize their design to gain more performance, but that would take time. 2016 probably isn't going to be when they achieve 70GFlops per watt at 14nm; maybe they'll get close to 60GFlops per watt, but this is based on how AMD measures floating point performance.

GFlops are not a standard for measuring total processor performance, so using PowerVR's, Nvidia's Tegra's or anyone else's tech as a comparison with GFlops as the metric makes absolutely no sense; the ratings are not comparable.

 

Flops are a marketing buzzterm, a fake figure to make it seem like you have X amount of performance. It's not accurate, it's not always used in exactly the same way, and it doesn't cover total processor performance.

Flops don't tell you Instructions Per Second (IPS), cache sizes, bandwidth, or latency between components inside a chip or between chips.

Let me ask you then: what's the maximum performance you think they can get in a 4-5 watt power envelope for a handheld launching fall 2016?

If it's not enough to at least be able to run Wii U ports... then I think Nintendo's whole conceptualization of this unified concept will probably have been a bad idea, but that's kind of becoming par for the course with their hardware designs of late. 

Well, we know by AMD's own performance metrics that Carrizo, an APU/SOC packing 4 Excavator CPU cores at 2.1GHz plus an 8 CU, 32 TMU, 8 ROP GPU clocked at 800MHz, runs on 35 watts at 28nm and produces 819GFlops.

We can compare that to PS4 and XB1, or at least to Wii U's GPU; Excavator is substantially more capable than the PowerPC cores in Wii U.

At 5 watts the above AMD tech would produce about half the performance of Xbox 360, or about one third of Wii U's capabilities. Maybe at 14nm a Zen or K12 APU could produce about 2/3rds of Wii U's capabilities on 5 watts. If Nintendo wants Wii U levels of GPU performance with a more capable CPU, and efficiencies aren't quite where they need to be, they can always use a larger battery and a 10 watt part; AMD do make SOCs at that level that work for tablets, so there's no reason Nintendo can't use the same for a portable version of the Wii U's GamePad.

 

TBH though, a handheld doesn't need to be as powerful as Wii U, since the output isn't on a huge screen. It could even be 480p, i.e. 640×480, which equals 307,200 pixels, exactly 1/3rd of Wii U's 720p (921,600 pixel) output. Nintendo could scale resolution down to run on a 117GFlop version of AMD's Carrizo tech; if they wanted they could even reduce graphical details, and people probably wouldn't notice the difference because a smaller screen wouldn't show it.


I really wonder if AMD is the right company for this job if this is the direction Nintendo wants. It seems to me like PowerVR is able to get incredible performance in 10 watts or less, and Nvidia is doing OK too; Tegra X1 seems like a beast. But I guess maybe Nintendo wants to incorporate elements of the Wii U architecture/design, and that's why they kept AMD? 

I dunno if Carrizo is power efficient enough. I know AMD has the Mullins architecture; that would probably be more suitable for a scalable platform that has to include a portable (or in Nintendo's case, the portable is very likely the lead/most popular SKU). 

Do you think it would be possible to use a small amount of HBM2 RAM in a portable? Not for the general RAM, but Nintendo likes to have very high speed caches of eDRAM (the Wii U has 32MB of it); could they have something like 32 or 64MB of HBM2?




Mobile kind of does away with the power gap, even when you get to the expensive end like XB1/PS4 (mobile CPU says hi).



In this day and age, with the Internet, ignorance is a choice! And they're still choosing Ignorance! - Dr. Filthy Frank

Soundwave said:
potato_hamster said:
Soundwave said:

I guess another way of doing it is what if Nintendo made a different console for different regional tastes?

I'm going to say Nintendo chooses to be a little bold and uses AMD's 14nm FinFET process, which is supposed to be firing on all cylinders by next year. So let's assume 70 GFLOPS/watt.

NX Pocket Handheld - 350 GFLOP. 960x540 4.88-inch LCD screen. $199.99. Standard Nintendo option, good for kids, people who want a DS/3DS successor. 3GB RAM. Cheap screen but does the job. 

NX Mobile Console (Japan) - 600 GFLOP (on battery); 900 GFLOP (plugged in). New Console Concept. Has a 1280x720 7-inch LCD screen. Can stream wirelessly to the TV via HDMI receiver (sold separately). Form factor may look like a Wii U controller or maybe a Surface tablet (kickstand display, play with controller). Not designed for pockets, but easy enough to take in a bag or carry from room to room. 6GB RAM. - $299.99 MSRP

NX Home Console (US/EU Markets) - 2TFLOP console (@28 watts), 1TB internal HDD, your standard Nintendo console. Games run at the full 1080P resolution for TV. 8GB RAM. About the size of the OG Wii (no disc drive). $299.99 MSRP.

All three versions could be sold in all markets of course, just the focus in the US would be the home console, in Japan the mobile console is the console made for Japanese tastes, and you have the standard Nintendo portable option for the typical kid market, budget parent, and the gamer who values portability/pocket-ability.

The only thing is I don't think the NX Pocket would be able to run all games (though at 350GFLOPS for only 540p render is pretty beastly still), but it would be able to run most third party games with scaled down effects and probably all Nintendo games at the lowered resolution, plus virtual console games and perhaps Android app ports. Ideal for getting kids with budget strict parents into the NX ecosystem and playing Splatoon 2/Mario Maker 2.0/Dragon Quest XI, then later on they can start bugging mom/dad for one of the console versions. 

More of this complete and utter nonsense. Games don't just "scale" like you think they do. It's not like PC games, where you can make the game run decently on a variety of hardware specs and just keep driving up the minimum requirements until the game runs okay. That is not how it works on consoles. The specs don't change. Video games cannot "just scale" on consoles. They never have and they never will.

But let's just assume it does.

Game engines still have to be optimized for each hardware spec, or mode. Every single one of them. Games have to be tested for each hardware spec, individually. Instead of each developer requiring one dev kit, they now need three. Now instead of taking an hour to make a simple adjustment and test it on PS4/XB1/Wii U, they need to test it on PS4/XB1/NXA/NXB/NXC/NXD. Wonderful! Awesome. Now the developers need to spend even more time testing things before their code is submitted. Did I mention this process can happen hundreds of times per day? Or at least it did. You just took the half hour it takes to make sure your code didn't break the build and turned it into an hour to 1.5 hour process. Never mind the added cost and time needed to test the game.

Do you ever want to see a third party game on a Nintendo console ever again? Because the cost of developing for all those different SKUs and modes drives development costs through the roof, and I do mean astronomically high.

This will never happen, unless you want the NX to fail harder than the Virtual Boy.


So do PC developers test every hardware config when debugging a game?

Why is it that PC hardware is so different from console hardware? Is there something magical in the console chips that makes them completely different?

If MGSV wins game of the year, that will be three straight years (The Last of Us, Destiny, and MGSV) of the GOTY winner being cross-gen between two completely separate hardware generations, let alone a unified family of devices with the same architecture. These are platforms in some cases separated by almost 10 years, with completely different CPUs, GPUs, memory structures, etc. 

By the way here's a really interesting look at how insanely scalable the Battlefield 4 engine is

https://www.youtube.com/watch?v=z0BpD2fylmk

You can scale it down all the way to basically make it look like a PS2/3DS game, or scale it back up so that it's too much to run on a PlayStation 4.

There's nothing magical or different about console chipsets these days; they're basically just watered-down PC or smartphone architectures, both of which are extremely scalable. Having low/medium/high settings is nothing new for developers. 

This will be easier for third parties than in the past, where if they wanted to make a Wii game, say Call of Duty for Wii... they had to use a completely separate team and a totally different engine. Want to make a DS/3DS game too? Well, that's another new engine that's going to require a completely different team. A scalable NX will be easier for third parties dealing with Nintendo, not harder. 

In simplified terms:

PC game developers (I've made PC games as well) test for a handful of common GPUs and CPUs, design their engines to integrate with the graphics card APIs, and let AMD, Nvidia, or whoever deal with the "scaling". If something runs like shit? Blame the driver, or just raise the minimum spec. Problem solved. No time and effort is put into optimizing for specific hardware specifications; as long as it runs decently on what is considered to be a decent PC, it's fine. 

Consoles on the other hand? Console games go a layer deeper than that. Console development has the engine interacting directly with the hardware, or through a much more minimal, more open API. Game engines often encompass what drivers do for PCs, and this gives a lot of advantages when it comes to control. That control is needed for optimization, and that's what console video games are all about: optimization, optimization, optimization. Then more optimization. If there is a single unified spec, you optimize for that single hardware spec: you can work around specific bottlenecks, and you can arrange instructions to make processes run faster for a single GPU and CPU that runs at a specific speed and has specific cache sizes. When you are coding for specific quantities rather than general quantities you can get more and more out of them. That's why console games at the start of a generation look terrible compared to the games that come out at the end of a generation. That's why, if you could hypothetically take a console and a PC with the exact same hardware specs and fast forward five years into the generation, the console would run the same brand new game better than the PC.
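A toy illustration of that "coding for specific quantities" point: on a single-spec console, a tile or batch size can be baked in as a constant, whereas a PC path has to stay generic and pick it at runtime. The numbers are invented; this is a sketch of the idea, not real engine code:

```python
# Console-style path: the working-set size is a known constant, so the
# tile size is hard-coded (a compiler can unroll and prefetch around it).
FIXED_TILE = 64

def process_fixed(data):
    total = 0
    for i in range(0, len(data), FIXED_TILE):
        total += sum(data[i:i + FIXED_TILE])
    return total

# PC-style path: the tile size is chosen at runtime for whatever
# hardware the game happens to find itself on.
def process_generic(data, tile):
    total = 0
    for i in range(0, len(data), tile):
        total += sum(data[i:i + tile])
    return total

data = list(range(1000))
# Both give the same answer; the point is what the fixed version lets
# a compiler and a programmer assume about cache and scheduling.
assert process_fixed(data) == process_generic(data, 128) == sum(data)
```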

That is the heart of what differentiates consoles from PCs, and you just want to pretend it doesn't exist, that you can wipe it away and expect the games either to a) look as good and play nearly as well as they would if the NX were just one unified spec, or b) not drive up development costs as console developers have to develop for 3-4 different specs instead of 1. It does not work that way.

Consoles aren't scalable. That's all there is to it.



potato_hamster said:
Soundwave said:
potato_hamster said:
Soundwave said:

I guess another way of doing it is what if Nintendo made a different console for different regional tastes?

I'm going to say Nintendo chooses to be a little bold and uses AMD's 14nm FinFET process, which is supposed to be firing on all cylinders by next year. So let's assume 70 GFLOPS/watt.

NX Pocket Handheld - 350 GFLOP. 960x540 4.88-inch LCD screen. $199.99. Standard Nintendo option, good for kids, people who want a DS/3DS successor. 3GB RAM. Cheap screen but does the job. 

NX Mobile Console (Japan) - 600 GFLOP (on battery); 900 GFLOP (plugged in). New Console Concept. Has a 1280x720 7-inch LCD screen. Can stream wirelessly to the TV via HDMI receiver (sold separately). Form factor may look like a Wii U controller or maybe a Surface tablet (kickstand display, play with controller). Not designed for pockets, but easy enough to take in a bag or carry from room to room. 6GB RAM. - $299.99 MSRP

NX Home Console (US/EU Markets) - 2TFLOP console (@28 watts), 1TB internal HDD, your standard Nintendo console. Games run at the full 1080P resolution for TV. 8GB RAM. About the size of the OG Wii (no disc drive). $299.99 MSRP.

All three versions could be sold in all markets of course, just the focus in the US would be the home console, in Japan the mobile console is the console made for Japanese tastes, and you have the standard Nintendo portable option for the typical kid market, budget parent, and the gamer who values portability/pocket-ability.

The only thing is I don't think the NX Pocket would be able to run all games (though at 350GFLOPS for only 540p render is pretty beastly still), but it would be able to run most third party games with scaled down effects and probably all Nintendo games at the lowered resolution, plus virtual console games and perhaps Android app ports. Ideal for getting kids with budget strict parents into the NX ecosystem and playing Splatoon 2/Mario Maker 2.0/Dragon Quest XI, then later on they can start bugging mom/dad for one of the console versions. 

More of this complete and utter nonsense. Games don't just "scale" like you think they do. It's not like PC games, where you can make the game run decently on a variety of hardware specs and just keep driving up the minimum requirements until the game runs okay. That is not how it works on consoles. The specs don't change. Video games cannot "just scale" on consoles. They never have and they never will.

But let's just assume it does.

Game engines still have to be optimized for each hardware spec, or mode. Every single one of them. Games have to be tested for each hardware spec, individually. Instead of each developer requiring one dev kit, they now need three. Now instead of taking an hour to make a simple adjustment and test it on PS4/XB1/Wii U, they need to test it on PS4/XB1/NXA/NXB/NXC/NXD. Wonderful! Awesome. Now the developers need to spend even more time testing things before their code is submitted. Did I mention this process can happen hundreds of times per day? Or at least it did. You just took the half hour it takes to make sure your code didn't break the build and turned it into an hour to 1.5 hour process. Never mind the added cost and time needed to test the game.

Do you ever want to see a third party game on a Nintendo console ever again? Because the cost of developing for all those different SKUs and modes drives development costs through the roof, and I do mean astronomically high.

This will never happen, unless you want the NX to fail harder than the Virtual Boy.


So do PC developers test every hardware config when debugging a game?

Why is it that PC hardware is so different from console hardware? Is there something magical in the console chips that makes them completely different?

If MGSV wins game of the year, that will be three straight years (The Last of Us, Destiny, and MGSV) of the GOTY winner being cross-gen between two completely separate hardware generations, let alone a unified family of devices with the same architecture. These are platforms in some cases separated by almost 10 years, with completely different CPUs, GPUs, memory structures, etc. 

By the way here's a really interesting look at how insanely scalable the Battlefield 4 engine is

https://www.youtube.com/watch?v=z0BpD2fylmk

You can scale it down all the way to basically make it look like a PS2/3DS game, or scale it back up so that it's too much to run on a PlayStation 4.

There's nothing magical or different about console chipsets these days; they're basically just watered-down PC or smartphone architectures, both of which are extremely scalable. Having low/medium/high settings is nothing new for developers. 

This will be easier for third parties than in the past, where if they wanted to make a Wii game, say Call of Duty for Wii... they had to use a completely separate team and a totally different engine. Want to make a DS/3DS game too? Well, that's another new engine that's going to require a completely different team. A scalable NX will be easier for third parties dealing with Nintendo, not harder. 

In simplified terms:

PC game developers (I've made PC games as well) test for a handful of common GPUs and CPUs, design their engines to integrate with the graphics card APIs, and let AMD, Nvidia, or whoever deal with the "scaling". If something runs like shit? Blame the driver, or just raise the minimum spec. Problem solved. No time and effort is put into optimizing for specific hardware specifications; as long as it runs decently on what is considered to be a decent PC, it's fine. 

Consoles on the otherhand? Console games go a layer deeper than that. Console development has the engine interacting directly with the hardware, or through a much more minimal, more open API. Game engines often encompass what drivers do for PCs, and this gives a lot of advantages when it comes to control. That control is needed for Optimization, and that's what console video games are all about. - Optimization, optimization, optimization. Then more optimization. If there is a single unified spec, optimze for that single hardware spec, you can work around specific bottle necks, you can arrange instructions to make processes run faster for a single GPU and CPU that runs at a specific speed, has specific cache sizes. When you are coding for specific quanities rather than general quanities you can get more, and more out of them. That's why console games at the start of a generation look terrible compared to the games that come out at the end of a generation.  That's why if you could hypothetically take a console and a PC with the exact same hardware specs, and fast forward five years into its lifetime, the console will run the same brand new game better than the PC.

That is the heart of what differentiates consoles from PCs, and you just want to pretend it doesn't exist, that you can just wipe it away and expect the games to come out to either a)look as good, and play as nearly well as they would if the NX was just one unified spec or b) not drive up devlopment costs as console developers have to develop for 3-4 different specs instead of 1. It does not work that way.

Consoles aren't scalable. That's all there is to it.

I don't agree with that, I think we're just going to have to agree to disagree. Saying PC games don't run very well on a variety of different configurations is just flat out wrong IMO. Even if you say a "handful" of cards, you're talking like 6/7/8 different configurations right there because I could have a different CPU which each of those cards. I could have a different RAM amount. 

If anything a scalable architecture like the NX would be easier because developers would know exactly which spec each version has and what CPU, what RAM, etc. It's not like PC where I could have an Nvidia card, but a different CPU. A different amount of RAM. Etc. etc. etc. 

I also don't think there's anything special about the PS4 or X1 architectures. They're just watered down PCs in a box. The days of consoles being unique architectures completely seperate from PC cards is over.

Most likely developers simply won't develop for Nintendo period if their next console is just a standard console anyway. The reason NX may have appeal is that it might actually have a decent userbase because of the HANDHELD side of the Nintendo market (which right now is like 5x the size of their abysmall console audience) may have access to main games that third parties make. That's probably why Square-Enix jumped so eagerly to announce Dragon Quest XI for it before the console itself has even been formally unveiled. 

I doubt Square-Enix would give two shits if NX was just Nintendo's "me-too PS4" with a userbase of 0 and three years late to the party. They as a Japanese developer are jumping on board so early probably because the handheld can play the console games scaled back in some manner. 



Soundwave said:
JustBeingReal said:
Soundwave said:
JustBeingReal said:


You're doing it again, using FLOP ratings to compare multiple different manufacturers' technology, when floating-point calculations are not an industry-standard measuring metric.

No company measures them the same way, so using them as if they were comparable is completely disingenuous.

No, AMD is not aiming for 70GFlops per watt. Going by the way AMD measures FLOPs, at 28nm they've achieved 819GFlops over 35 watts, taking everything together across a Carrizo APU/SOC, which works out to 23.4GFlops per watt. If that architecture can be shrunk to 14nm it would achieve 46.8GFlops per watt.

Perhaps they can optimize their design to gain more performance, but that would take time. 2016 probably isn't going to be when they can achieve 70GFlops per watt at 14nm; maybe they'll get close to 60GFlops per watt, but this is based on how AMD measures floating-point performance.
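The per-watt arithmetic above can be checked with a quick sketch. The Carrizo figures (819 GFLOPS at 35 W) and the assumption that a 14nm shrink doubles efficiency are the poster's numbers, not official AMD measurements:

```python
# Sketch of the GFLOPS-per-watt arithmetic from the post above.
# 819 GFLOPS at 35 W and the "2x from the shrink" factor are the
# poster's assumptions, not vendor-confirmed figures.

def gflops_per_watt(gflops: float, watts: float) -> float:
    """Performance density in GFLOPS per watt."""
    return gflops / watts

carrizo_28nm = gflops_per_watt(819, 35)  # ~23.4 GFLOPS/W at 28nm
carrizo_14nm = carrizo_28nm * 2          # assumes a full 2x gain at 14nm

print(f"28nm: {carrizo_28nm:.1f} GFLOPS/W")           # 23.4
print(f"14nm (assumed 2x): {carrizo_14nm:.1f} GFLOPS/W")  # 46.8
```

Note this confirms the post's point that 70 GFLOPS/W would need more than a straight process shrink.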

GFlops are not a standard for measuring total processor performance, so using PowerVR, Nvidia's Tegra or anyone else's tech as a comparison with GFlops as the metric makes absolutely no sense; the ratings are not comparable.

 

Flops are a marketing buzz term, a fake figure to make it seem like you have X amount of performance. It's not accurate, it isn't always used in exactly the same way, and it doesn't cover total processor performance.

Flops don't tell you instructions per second (IPS), cache size, cache amount, bandwidth, or latency between components inside a chip or between chips.

Let me ask you then: what's the maximum performance you think they can get in a 4-5 watt power envelope for a handheld launching fall 2016?

If it's not enough to at least run Wii U ports ... I think Nintendo's whole unified-platform concept will probably have been a bad idea, but that's kind of becoming par for the course with their hardware designs of late.

Well, we know by AMD's own performance-measuring metrics that Carrizo, an APU/SOC that packs in 4 Excavator CPU cores at 2.1GHz and an 8 CU, 32 TMU, 8 ROP GPU clocked at 800MHz, runs on 35 watts at 28nm and produces 819GFlops.

We can compare that to PS4 and XB1, or at least to Wii U's GPU; Excavator is substantially more capable than the PowerPC cores in Wii U.

The above AMD tech would produce about half the performance of Xbox 360, or about one third of Wii U's capabilities, at 5 watts. Maybe at 14nm a Zen or K12 APU could produce about 2/3rds of Wii U's capabilities on 5 watts. If Nintendo wants Wii U levels of GPU performance with a more capable CPU, and efficiencies aren't quite where they need them to be, they can always use a larger battery and a 10-watt part; AMD does make SOCs that work at that level for tablets, so there's no reason Nintendo can't use the same for a portable version of the Wii U's GamePad.
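A rough check of the 5-watt claims above, assuming (naively) that performance scales linearly with power. The console GFLOPS figures used here are commonly cited theoretical peaks (Xbox 360's Xenos ~240 GFLOPS, Wii U's GPU commonly estimated at ~352 GFLOPS), not measured numbers:

```python
# Linear power-scaling sketch for the Carrizo-at-5-watts claim.
# Console peak figures are commonly cited estimates, not measurements.

CARRIZO_GFLOPS, CARRIZO_WATTS = 819, 35
XBOX360_GFLOPS = 240   # Xenos GPU, commonly cited peak
WIIU_GFLOPS = 352      # commonly cited estimate for Wii U's GPU

at_5w = CARRIZO_GFLOPS * 5 / CARRIZO_WATTS   # 117 GFLOPS

print(f"Carrizo scaled to 5 W: {at_5w:.0f} GFLOPS")
print(f"vs Xbox 360: {at_5w / XBOX360_GFLOPS:.0%}")  # roughly half
print(f"vs Wii U:    {at_5w / WIIU_GFLOPS:.0%}")     # roughly a third
```

Real chips don't scale linearly with power (clocks and voltage interact), so this is only the arithmetic behind the post, not a hardware prediction.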

 

TBH though, a handheld doesn't need to be as powerful as Wii U; the image it outputs isn't on a huge screen. It could even be 480p, so 640×480, which equals 307,200 pixels, exactly 1/3rd of Wii U's 720p (921,600 pixel) output. Nintendo can scale resolution down to run on a 117GFlop version of AMD's Carrizo tech. If they wanted they could even reduce graphical details, and people probably wouldn't notice the difference because a smaller screen wouldn't show it.
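The pixel-count arithmetic behind that "render at 480p" argument is easy to verify:

```python
# 480p vs 720p pixel counts, as used in the post above.

def pixels(width: int, height: int) -> int:
    """Total pixels for a given resolution."""
    return width * height

p480 = pixels(640, 480)    # 307,200 pixels
p720 = pixels(1280, 720)   # 921,600 pixels

print(p480, p720, p720 // p480)  # 307200 921600 3
```

So a 480p target really does carry exactly one third of the fill-rate and shading load of Wii U's 720p output, all else being equal.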


I really wonder if AMD is the right company for this job if this is the direction Nintendo wants. Seems to me like PowerVR is able to get incredible performance in 10 watts or less, and Nvidia is doing OK too; Tegra X1 seems like a beast. But I guess maybe Nintendo wants to incorporate elements of the Wii U architecture/design and that's why they kept AMD?

I dunno if Carrizo is power-efficient enough. I know AMD has the Mullins architecture, which would probably be more suitable for a scalable platform that has to include a portable (or in Nintendo's case, the portable is very likely the lead/most popular SKU).

Do you think it would be possible to use a small amount of HBM2 RAM in a portable? Not for the general RAM, but Nintendo likes to have very high speed caches of eDRAM (the Wii U has 32MB of it), could they have like 32 or 64MB of HBM2?


AMD can provide the whole SOC package with the level of performance needed for both a console and a handheld: a unified architecture that can scale from lower power requirements up to performance-PC level.

With their latest technology they're even efficient enough for a decent 480p handheld at 5 watts, it can of course scale up for higher resolutions.

Tegra X1 is weak in the CPU department, and I know you have some obsession with PowerVR, but it's not proven technology for a console.

AMD/ATI have been providing the GPU for Nintendo since the GameCube, so sticking with AMD makes sense from a compatibility point of view.

 

Carrizo is definitely efficient enough to put Wii U-level graphical quality natively at 480p in a handheld; a reduction in resolution is all that would be needed, and every other graphical feature and frame rate could be equalled. The console version could be whatever spec Nintendo wanted within their given power/wattage demands, be it a straight-up copy of Carrizo or multiples slapped together onto one die.

As for HBM, the whole point is that it offers both a big frame buffer and performance on the bandwidth side; you don't want to try to cram all of your data into a small cache if you don't have to. Nintendo would be better off going with a single unified memory pool and forgetting the Wii U or XB1 style memory layout.

A single 4GB pool of HBM could mimic the demands of Wii U's memory set-up easily.

 

Nintendo only went with high-speed caches in the past because that's what was available; going wide on a big pool of memory has always been the goal, because it's the most flexible system and you don't have the same limitations. Latency is vastly reduced because the chips are stacked, allowing for faster access to your chosen area of data.

If possible, a system with all HBM would be the most logical next step; if needed, carry on using DDR3 or 4 for system memory and use HBM as your VRAM cache like a PC does.

I'm not sure about the handheld side of things for memory, but it would definitely make sense to use all HBM if it's available in the quantities Nintendo needs.



Soundwave said:
potato_hamster said:
Soundwave said:
potato_hamster said:
Soundwave said:

I guess another way of doing it is what if Nintendo made a different console for different regional tastes?

I'm going to say Nintendo chooses to be a little bold and uses AMD's 14nm FinFET process, which is supposed to be firing on all cylinders by next year. So let's assume 70 GFLOPS/watt.

NX Pocket Handheld - 350 GFLOP. 960x540 4.88-inch LCD screen. $199.99. Standard Nintendo option, good for kids, people who want a DS/3DS successor. 3GB RAM. Cheap screen but does the job. 

NX Mobile Console (Japan) - 600 GFLOP (on battery); 900 GFLOP (plugged in). New Console Concept. Has a 1280x720 7-inch LCD screen. Can stream wirelessly to the TV via HDMI receiver (sold separately). Form factor may look like a Wii U controller or maybe a Surface tablet (kickstand display, play with controller). Not designed for pockets, but easy enough to take in a bag or carry from room to room. 6GB RAM. - $299.99 MSRP

NX Home Console (US/EU Markets) - 2TFLOP console (@28 watts), 1TB internal HDD, your standard Nintendo console. Games run at the full 1080P resolution for TV. 8GB RAM. About the size of the OG Wii (no disc drive). $299.99 MSRP.

All three versions could be sold in all markets of course, just the focus in the US would be the home console, in Japan the mobile console is the console made for Japanese tastes, and you have the standard Nintendo portable option for the typical kid market, budget parent, and the gamer who values portability/pocket-ability.

The only thing is I don't think the NX Pocket would be able to run all games (though 350 GFLOPS for only a 540p render is still pretty beastly), but it would be able to run most third-party games with scaled-down effects and probably all Nintendo games at the lowered resolution, plus Virtual Console games and perhaps Android app ports. Ideal for getting kids with budget-strict parents into the NX ecosystem and playing Splatoon 2/Mario Maker 2.0/Dragon Quest XI; then later on they can start bugging mom/dad for one of the console versions.

More of this complete and utter nonsense. Games don't just "scale" like you think they do. It's not like PC games, where you can just make the game run decently on a variety of hardware specs and keep driving up the minimum requirements until the game runs okay. That is not how it works. The specs don't change. Video games cannot "just scale" on consoles. They never have and they never will.

But let's just assume it does.

Game engines still have to be optimized for each hardware spec, or mode. Every single one of them. Games have to be tested for each hardware spec, individually. Instead of each developer requiring one dev kit, they now need three. Now instead of making a simple adjustment and testing it on PS4/XB1/Wii U, they need to test it on PS4/XB1/NXA/NXB/NXC/NXD. Wonderful! Awesome. Now the developers need to spend even more time testing things before their code is submitted. Did I mention this process can happen hundreds of times per day? Or at least it did. You just took a half-hour check to make sure your code didn't break the build and turned it into an hour to hour-and-a-half process. Never mind the added cost and time needed to test the game.

Do you ever want to see a third-party game on a Nintendo console ever again? Because the cost of developing for all those different SKUs and modes drives development costs through the roof, and I do mean astronomically high.

This will never happen, unless you want the NX to fail harder than the Virtual Boy.


So do PC developers test every hardware config when bug-testing a game?

Why is it that PC hardware is so different from console hardware, is there something magical in the console chips that make them completely different?

If MGSV wins game of the year, that will be three straight years (The Last of Us, Destiny, and MGSV) of the GOTY winner being cross-gen between two completely separate hardware generations, let alone a unified family of devices with the same architecture. These are platforms in some cases separated by almost 10 years, with completely different CPUs, GPUs, memory structures, etc.

By the way, here's a really interesting look at how insanely scalable the Battlefield 4 engine is:

https://www.youtube.com/watch?v=z0BpD2fylmk

You can scale it down all the way to basically make it look like a PS2/3DS game, and scale it back up so that it's too much to run on a PlayStation 4.

There's nothing magical or different about console chipsets these days, they're basically just watered down PCs or smartphone architectures, both of which are extremely scalable. Having low/medium/high settings is nothing new for developers. 

This will be easier for third parties than in the past, where if they wanted to make a Wii game, say Call of Duty for Wii ... they had to use a completely separate team and a totally different engine. Want to make a DS/3DS game too? Well, that's another new engine that's going to require a completely different team. A scalable NX will be easier for third parties dealing with Nintendo, not harder.

In simplified terms:

PC game developers (I've made PC games as well) test for a handful of common GPUs and CPUs, design their engines to integrate with the graphics card APIs, and let AMD, Nvidia, or whoever deal with the "scaling". If something runs like shit? Blame the driver, or just raise the minimum spec. Problem solved. No time and effort is put into optimizing for specific hardware configurations; as long as it runs decently on what is considered a decent PC, it's fine.

Consoles, on the other hand? Console games go a layer deeper than that. Console development has the engine interacting directly with the hardware, or through a much more minimal, more open API. Game engines often encompass what drivers do on PCs, and this gives a lot of advantages when it comes to control. That control is needed for optimization, and that's what console video games are all about: optimization, optimization, optimization. Then more optimization. If there is a single unified spec, you optimize for that single hardware spec; you can work around specific bottlenecks, and you can arrange instructions to make processes run faster for a single GPU and CPU that runs at a specific speed and has specific cache sizes. When you are coding for specific quantities rather than general quantities, you can get more and more out of them. That's why console games at the start of a generation look terrible compared to the games that come out at the end of a generation. That's why, if you could hypothetically take a console and a PC with the exact same hardware specs and fast-forward five years into their lifetimes, the console would run the same brand-new game better than the PC.

That is the heart of what differentiates consoles from PCs, and you just want to pretend it doesn't exist, that you can just wipe it away and expect the games either to a) look as good and play nearly as well as they would if the NX were just one unified spec, or b) not drive up development costs as console developers have to develop for 3-4 different specs instead of 1. It does not work that way.

Consoles aren't scalable. That's all there is to it.

I don't agree with that; I think we're just going to have to agree to disagree. Saying PC games don't run very well on a variety of different configurations is just flat-out wrong IMO. Even if you say a "handful" of cards, you're talking like 6/7/8 different configurations right there, because I could have a different CPU with each of those cards. I could have a different RAM amount.

If anything, a scalable architecture like the NX would be easier, because developers would know exactly which spec each version has: what CPU, what RAM, etc. It's not like PC, where I could have an Nvidia card but a different CPU, a different amount of RAM, etc.

I also don't think there's anything special about the PS4 or X1 architectures. They're just watered-down PCs in a box. The days of consoles being unique architectures completely separate from PC parts are over.

Most likely developers simply won't develop for Nintendo, period, if their next console is just a standard console anyway. The reason the NX may have appeal is that it might actually have a decent userbase, because the HANDHELD side of the Nintendo market (which right now is like 5x the size of their abysmal console audience) may have access to the main games that third parties make. That's probably why Square-Enix jumped so eagerly to announce Dragon Quest XI for it before the console itself has even been formally unveiled.

I doubt Square-Enix would give two shits if the NX was just Nintendo's "me-too PS4" with a userbase of 0, three years late to the party. As a Japanese developer, they are jumping on board so early probably because the handheld can play the console games, scaled back in some manner.

Now you've decided having different hardware specs and scalable engines is an advantage, that it would be easier to program for an NX than a PC. By that logic it must be dramatically easier to develop for a PS3 than a PC, right? That's just one spec compared to hundreds, if not thousands, after all. You have x86 architectures, x64 architectures, Intel and AMD processors; you have dual-core, quad-core and hex-core processors; you have GPUs by AMD and Nvidia. It's insanely complicated! PC game developers must have thousands of PCs and thousands of testers testing every configuration under the sun! It's the only way to know the game will work. It's not like they code for the operating system and the API and let the API handle the scaling of PC games for them. Not at all! It's so complicated! On the other hand, the PS3 is comparatively simple. One CPU, one GPU! A PowerPC Cell processor with 9 cores and processor-specific bottlenecks? No problem! It'll be a pleasure to code for compared to a PC!

Please. Stop embarrassing yourself. You have no idea how ridiculous you sound. You could not possibly be further from the truth.

Look, when it comes down to it, I have my name in over a dozen video games and you probably haven't even gotten a tour of a video game studio. Now I'm not saying that to be a dick; I'm saying it because you learn a lot about how video games are made by actually making video games. Agree to disagree all you want. The fact of the matter is that I am speaking from experience, and you don't understand where I'm coming from because you've never worked on a console video game or a PC video game to be able to compare the two. You can't imagine my perspective because you literally have no idea what it is like. Listen, just because you can't imagine something being a certain way, because your point of view is narrower than you realize, does not mean that it isn't that way.

Have you actually seen a dev kit in person before? Deployed a build to one? Had the opportunity to learn how dramatically different they are from retail consoles? Do you realize what a highly specialized piece of machinery a dev kit is now (even though you think it's just a "watered down PC in a box")? Do you know how much they cost? You really don't have the slightest idea, do you? Can you imagine how extraordinarily expensive and difficult it would be to create a dev kit that would be able to handle all of the different input schemes and "scale the hardware" accurately? It's either that or quite literally forcing developers to buy multiple dev kits for one platform. Those costs are passed directly to the developers. That is just one dramatically increased cost. Testing, certification, deployment time, etc. It all adds up. We haven't even considered the actual tools that Nintendo provides, which, if you ask anyone who has had to deal with them, are beyond terrible compared to Microsoft's and Sony's. It is harder to get less out of the Wii U than it is to get more out of the PS4.

If you want to discount that experience as "a matter of opinion" that's fine. You don't have to take my word for it.

At the end of the day, when the NX comes out and either a) doesn't have multiple hardware specs, or b) has less third-party support than the Wii U, you'll know why. The fact that Square-Enix has already announced they're developing a game for it should be more of an indicator that the NX is a me-too PS4 than less of one, because there's no way they're touching a console with multiple hardware configurations.