
Shin’en Multimedia: Wii U Is Most Definitely A Next-Generation Console

dahuman said:
TheJimbo1234 said:

Erm, but we know exactly what is in the Wii U, so why are you claiming we don't? It has already been stripped and tested, and it came out rather poorly.

o_O; we don't, and we still don't. There is a lot of speculation, like I mentioned earlier. Nobody is thinking it will run at Xbox One or PS4 levels, but we don't have enough data to know how much worse. How the parts interact with each other is unknown, and the GPU is nowhere near deciphered.


http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Yes, we do know what is in it, as that is not exactly hard to do. You buy one, strip it, then run the correct software on it. Job done. It's a 40nm GPU, which says enough, but when combined with benchmarks and power input, that gives you a perfectly good figure to work with.
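For context, the "figure to work with" from a teardown is usually a peak-throughput estimate: peak FLOPS = ALUs x 2 ops per clock x clock speed. A minimal Python sketch of that arithmetic is below; the ALU counts and clock are hypothetical placeholders of the kind floated in teardown threads, not confirmed Wii U specs.

def peak_gflops(alu_count, clock_mhz, ops_per_clock=2):
    # Theoretical peak shader throughput in GFLOPS (a multiply-add counts as 2 ops).
    return alu_count * ops_per_clock * clock_mhz / 1000.0

# Placeholder figures, NOT confirmed hardware specs:
print(peak_gflops(160, 550))  # ~176 GFLOPS
print(peak_gflops(320, 550))  # ~352 GFLOPS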



TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:

Erm, but we know exactly what is in the Wii U, so why are you claiming we don't? It has already been stripped and tested, and it came out rather poorly.

o_O; we don't, and we still don't. There is a lot of speculation, like I mentioned earlier. Nobody is thinking it will run at Xbox One or PS4 levels, but we don't have enough data to know how much worse. How the parts interact with each other is unknown, and the GPU is nowhere near deciphered.


http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Yes, we do know what is in it, as that is not exactly hard to do. You buy one, strip it, then run the correct software on it. Job done. It's a 40nm GPU, which says enough, but when combined with benchmarks and power input, that gives you a perfectly good figure to work with.

Only that thing is still making people scratch their heads on GAF and Beyond3D, because it's so customized that nobody knows WTF is really going on. Some devs are claiming that it has almost unlimited bandwidth for their needs on the device, or that it's really efficient and the watt/performance ratio is stupid good without even really pushing the system yet. I myself am very puzzled by it, because I did look at the Anandtech teardown and the real-time teardown on Twitch. If you look at the GPU die, my first reaction was literally "WTF?", because I've looked at plenty of AMD dies before, and the shit in the Wii U GPU is like ??? other than the obvious eDRAM areas and possible shader areas.



About the tessellator...
I've seen tech demos with flames being procedurally generated using tessellation; compared to "standard" flames made with 2D animations they of course look much better and more three-dimensional.
Now I've just seen some videos of Mario Kart 8, and I do think the flames coming out of the exhaust pipes could be made using tessellation. Do you agree?



freebs2 said:
About the tessellator...
I've seen tech demos with flames being procedurally generated using tessellation; compared to "standard" flames made with 2D animations they of course look much better and more three-dimensional.
Now I've just seen some videos of Mario Kart 8, and I do think the flames coming out of the exhaust pipes could be made using tessellation. Do you agree?


Doubtful, you don't need tessellation to make nice-looking flames; the tech demo was probably just showing a possibility.



dahuman said:
freebs2 said:
About the tessellator...
I've seen tech demos with flames being procedurally generated using tessellation; compared to "standard" flames made with 2D animations they of course look much better and more three-dimensional.
Now I've just seen some videos of Mario Kart 8, and I do think the flames coming out of the exhaust pipes could be made using tessellation. Do you agree?


Doubtful, you don't need tessellation to make nice-looking flames; the tech demo was probably just showing a possibility.

OK, I asked because normally flames look very fake. In Mario Kart they are probably just using a good-looking 2D animation.
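For a rough sense of why tessellated geometry reads as more three-dimensional than a 2D sprite, here is a minimal, purely conceptual Python sketch of geometry amplification via recursive subdivision plus displacement. It is not how the tech demo or Mario Kart 8 actually renders flames, and real GPU tessellation runs in hull/domain (tessellation control/evaluation) shaders on patches rather than recursively on the CPU.

import random

def subdivide(tri, levels):
    # Split one triangle into four, nudging each new midpoint with a bit of noise;
    # this is the basic idea behind tessellation + displacement faking volumetric detail.
    if levels == 0:
        return [tri]
    a, b, c = tri
    def midpoint(p, q):
        jitter = random.uniform(-0.05, 0.05)  # stand-in for a noise/displacement map
        return tuple((pi + qi) / 2 + jitter for pi, qi in zip(p, q))
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    out = []
    for t in [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]:
        out.extend(subdivide(t, levels - 1))
    return out

# One coarse triangle becomes 4**n displaced triangles; a 2D sprite stays a single quad.
flame_patch = subdivide(((0, 0, 0), (1, 0, 0), (0.5, 1, 0)), levels=3)
print(len(flame_patch))  # 64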



dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:

Erm, but we know exactly what is in the Wii U, so why are you claiming we don't? It has already been stripped and tested, and it came out rather poorly.

o_O; we don't, and we still don't. There is a lot of speculation, like I mentioned earlier. Nobody is thinking it will run at Xbox One or PS4 levels, but we don't have enough data to know how much worse. How the parts interact with each other is unknown, and the GPU is nowhere near deciphered.


http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Yes, we do know what is in it, as that is not exactly hard to do. You buy one, strip it, then run the correct software on it. Job done. It's a 40nm GPU, which says enough, but when combined with benchmarks and power input, that gives you a perfectly good figure to work with.

Only that thing is still making people scratch their heads on GAF and Beyond3D, because it's so customized that nobody knows WTF is really going on. Some devs are claiming that it has almost unlimited bandwidth for their needs on the device, or that it's really efficient and the watt/performance ratio is stupid good without even really pushing the system yet. I myself am very puzzled by it, because I did look at the Anandtech teardown and the real-time teardown on Twitch. If you look at the GPU die, my first reaction was literally "WTF?", because I've looked at plenty of AMD dies before, and the shit in the Wii U GPU is like ??? other than the obvious eDRAM areas and possible shader areas.


Yet anyone with the slightest bit of knowledge knows such claims are a) absurd and b) impossible. The PS4's choice and placement of RAM provides it with insane bandwidth - one that will likely never be fully used. The Wii U's? Erm, how is that the case with theirs? The RAM is slow, small, and generic. It certainly will have limited bandwidth. What is going on is the devs' choice of words, e.g. "unlimited bandwidth for their needs", aka low-spec games. If you are running Pong, then hell, 2GB might as well be 2TB as it will make no difference. But if you are running a state-of-the-art game such as Killzone: SF, then people would be saying otherwise. The fact that AMD have not commented on the GPU is a bad sign, as all companies like to brag about their gear - look at how they have used the PS4 and Xbox One to push their Jaguar range of processors. But when it comes to the Wii U? Silence. That isn't a good sign, and it is most likely because the answer would be "we did the best we could for a low-spec system".
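On the bandwidth point, peak memory bandwidth is just bus width times effective transfer rate. A quick Python sketch: the PS4 line matches Sony's published 176 GB/s figure, while the Wii U line assumes the 64-bit DDR3-1600 configuration reported in teardowns (not an official Nintendo spec).

def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    # Peak bandwidth in GB/s = bytes per transfer x million transfers per second / 1000.
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000.0

print(peak_bandwidth_gb_s(256, 5500))  # PS4, 256-bit GDDR5 at 5500 MT/s: 176.0 GB/s
print(peak_bandwidth_gb_s(64, 1600))   # Wii U DDR3-1600 on a 64-bit bus (as reported): 12.8 GB/s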



TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
 


http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Yes, we do know what is in it, as that is not exactly hard to do. You buy one, strip it, then run the correct software on it. Job done. It's a 40nm GPU, which says enough, but when combined with benchmarks and power input, that gives you a perfectly good figure to work with.

Only that thing is still making people scratch their heads on GAF and Beyond3D, because it's so customized that nobody knows WTF is really going on. Some devs are claiming that it has almost unlimited bandwidth for their needs on the device, or that it's really efficient and the watt/performance ratio is stupid good without even really pushing the system yet. I myself am very puzzled by it, because I did look at the Anandtech teardown and the real-time teardown on Twitch. If you look at the GPU die, my first reaction was literally "WTF?", because I've looked at plenty of AMD dies before, and the shit in the Wii U GPU is like ??? other than the obvious eDRAM areas and possible shader areas.


Yet anyone with the slightest bit of knowledge knows such claims are a) absurd and b) impossible. The PS4's choice and placement of RAM provides it with insane bandwidth - one that will likely never be fully used. The Wii U's? Erm, how is that the case with theirs? The RAM is slow, small, and generic. It certainly will have limited bandwidth. What is going on is the devs' choice of words, e.g. "unlimited bandwidth for their needs", aka low-spec games. If you are running Pong, then hell, 2GB might as well be 2TB as it will make no difference. But if you are running a state-of-the-art game such as Killzone: SF, then people would be saying otherwise. The fact that AMD have not commented on the GPU is a bad sign, as all companies like to brag about their gear - look at how they have used the PS4 and Xbox One to push their Jaguar range of processors. But when it comes to the Wii U? Silence. That isn't a good sign, and it is most likely because the answer would be "we did the best we could for a low-spec system".


Uh, I've been talking about the Wii U now for a while; it's not really about the PS4 right now in this discussion. I've already made it clear that it'd run shittier, we just don't know how much shittier, remember? The point is that the people actually working with the hardware are saying those things, so unless you are a dev and can provide me with actual benchmark numbers (which would be great, I'm dying to see them so I can actually estimate its real performance level and not second-guess within the wide margin of a giant fucking circle), those devs' comments are what we have to go on.

The majority of the bandwidth would be coming from the eDRAM, BTW; DDR3 just has decent latency for fetching small data quickly, and that performance will depend on how many cycles it takes to actually get where it needs to be. Your comment about bandwidth with a low-detail game also makes no sense: the bandwidth of the hardware itself will not change no matter what kind of game is being made. The more accurate reading would be that, for whatever the dev thinks the GPU can handle, the bandwidth is plenty to work with - that's how I'm taking it. 1000TB/s of bandwidth won't make a dick of difference if the GPU itself can't handle the RAM anyway.
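The "bandwidth won't help if the GPU can't use it" point is essentially the roofline argument: attainable throughput is capped by the lower of the compute peak and what the memory system can feed. A tiny Python sketch with made-up numbers, only to illustrate the shape of the argument:

def attainable_gflops(peak_gflops, bandwidth_gb_s, flops_per_byte):
    # Roofline-style cap: the smaller of the compute roof and the memory roof
    # (bandwidth x arithmetic intensity).
    return min(peak_gflops, bandwidth_gb_s * flops_per_byte)

# Made-up numbers: past a point, extra GB/s stops mattering for a modest GPU.
print(attainable_gflops(200, 12.8, 4))   # 51.2  -> memory-bound
print(attainable_gflops(200, 176.0, 4))  # 200   -> compute-bound; the extra bandwidth sits idle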

What I want are hard-number answers, not opinions or bullshit, and more than anything else, whether it has certain fixed functions, or what the fuck is hidden in those blocks in the GPU. The only things we know are "efficient" and "balanced." Nobody wants to share any numbers, and I'm not a goddamned Wii U dev, so WTF Nintendo? AMD is not commenting on the GPU because they are under NDA, BTW, and who knows what the hell Nintendo is thinking? Nobody but the top brass in their company.

Shadow Fall is not state-of-the-art, BTW; there is no such game yet on the PS4 or Xbox One, because they haven't pushed the hardware yet. It's just higher res and better lighting ATM if you remove your pony rainbow goggles: they are 7th-gen games running on a decent PC rig with DX11 features turned on. I hope you are not satisfied with just that level of graphics this early in the generation, at least. I'm counting on them to become much better so PC games can look even better.



dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
 


http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Yes, we do know what is in it, as that is not exactly hard to do. You buy one, strip it, then run the correct software on it. Job done. It's a 40nm GPU, which says enough, but when combined with benchmarks and power input, that gives you a perfectly good figure to work with.

Only that thing is still making people scratch their heads on GAF and Beyond3D, because it's so customized that nobody knows WTF is really going on. Some devs are claiming that it has almost unlimited bandwidth for their needs on the device, or that it's really efficient and the watt/performance ratio is stupid good without even really pushing the system yet. I myself am very puzzled by it, because I did look at the Anandtech teardown and the real-time teardown on Twitch. If you look at the GPU die, my first reaction was literally "WTF?", because I've looked at plenty of AMD dies before, and the shit in the Wii U GPU is like ??? other than the obvious eDRAM areas and possible shader areas.


Yet anyone with the slightest bit of knowledge knows such claims are a) absurd and b) impossible. The PS4's choice and placement of RAM provides it with insane bandwidth - one that will likely never be fully used. The Wii U's? Erm, how is that the case with theirs? The RAM is slow, small, and generic. It certainly will have limited bandwidth. What is going on is the devs' choice of words, e.g. "unlimited bandwidth for their needs", aka low-spec games. If you are running Pong, then hell, 2GB might as well be 2TB as it will make no difference. But if you are running a state-of-the-art game such as Killzone: SF, then people would be saying otherwise. The fact that AMD have not commented on the GPU is a bad sign, as all companies like to brag about their gear - look at how they have used the PS4 and Xbox One to push their Jaguar range of processors. But when it comes to the Wii U? Silence. That isn't a good sign, and it is most likely because the answer would be "we did the best we could for a low-spec system".


Uh, I've been talking about the Wii U now for a while; it's not really about the PS4 right now in this discussion. I've already made it clear that it'd run shittier, we just don't know how much shittier, remember? The point is that the people actually working with the hardware are saying those things, so unless you are a dev and can provide me with actual benchmark numbers (which would be great, I'm dying to see them so I can actually estimate its real performance level and not second-guess within the wide margin of a giant fucking circle), those devs' comments are what we have to go on.

The majority of the bandwidth would be coming from the eDRAM, BTW; DDR3 just has decent latency for fetching small data quickly, and that performance will depend on how many cycles it takes to actually get where it needs to be. Your comment about bandwidth with a low-detail game also makes no sense: the bandwidth of the hardware itself will not change no matter what kind of game is being made. The more accurate reading would be that, for whatever the dev thinks the GPU can handle, the bandwidth is plenty to work with - that's how I'm taking it. 1000TB/s of bandwidth won't make a dick of difference if the GPU itself can't handle the RAM anyway.

What I want are hard-number answers, not opinions or bullshit, and more than anything else, whether it has certain fixed functions, or what the fuck is hidden in those blocks in the GPU. The only things we know are "efficient" and "balanced." Nobody wants to share any numbers, and I'm not a goddamned Wii U dev, so WTF Nintendo? AMD is not commenting on the GPU because they are under NDA, BTW, and who knows what the hell Nintendo is thinking? Nobody but the top brass in their company.

Shadow Fall is not state-of-the-art, BTW; there is no such game yet on the PS4 or Xbox One, because they haven't pushed the hardware yet. It's just higher res and better lighting ATM if you remove your pony rainbow goggles: they are 7th-gen games running on a decent PC rig with DX11 features turned on. I hope you are not satisfied with just that level of graphics this early in the generation, at least. I'm counting on them to become much better so PC games can look even better.


And the Xbox One probably plays PC games, I'm guessing.

I remember the first Xbox being said to play PC games, so I assume the 360 & One do the same.



Kaizar said:
dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
 




Yet anyone with the slightest bit of knowledge knows such claims are a) absurd and b) impossible. The PS4's choice and placement of RAM provides it with insane bandwidth - one that will likely never be fully used. The Wii U's? Erm, how is that the case with theirs? The RAM is slow, small, and generic. It certainly will have limited bandwidth. What is going on is the devs' choice of words, e.g. "unlimited bandwidth for their needs", aka low-spec games. If you are running Pong, then hell, 2GB might as well be 2TB as it will make no difference. But if you are running a state-of-the-art game such as Killzone: SF, then people would be saying otherwise. The fact that AMD have not commented on the GPU is a bad sign, as all companies like to brag about their gear - look at how they have used the PS4 and Xbox One to push their Jaguar range of processors. But when it comes to the Wii U? Silence. That isn't a good sign, and it is most likely because the answer would be "we did the best we could for a low-spec system".


Uh, I've been talking about the Wii U now for a while; it's not really about the PS4 right now in this discussion. I've already made it clear that it'd run shittier, we just don't know how much shittier, remember? The point is that the people actually working with the hardware are saying those things, so unless you are a dev and can provide me with actual benchmark numbers (which would be great, I'm dying to see them so I can actually estimate its real performance level and not second-guess within the wide margin of a giant fucking circle), those devs' comments are what we have to go on.

The majority of the bandwidth would be coming from the eDRAM, BTW; DDR3 just has decent latency for fetching small data quickly, and that performance will depend on how many cycles it takes to actually get where it needs to be. Your comment about bandwidth with a low-detail game also makes no sense: the bandwidth of the hardware itself will not change no matter what kind of game is being made. The more accurate reading would be that, for whatever the dev thinks the GPU can handle, the bandwidth is plenty to work with - that's how I'm taking it. 1000TB/s of bandwidth won't make a dick of difference if the GPU itself can't handle the RAM anyway.

What I want are hard-number answers, not opinions or bullshit, and more than anything else, whether it has certain fixed functions, or what the fuck is hidden in those blocks in the GPU. The only things we know are "efficient" and "balanced." Nobody wants to share any numbers, and I'm not a goddamned Wii U dev, so WTF Nintendo? AMD is not commenting on the GPU because they are under NDA, BTW, and who knows what the hell Nintendo is thinking? Nobody but the top brass in their company.

Shadow Fall is not state-of-the-art, BTW; there is no such game yet on the PS4 or Xbox One, because they haven't pushed the hardware yet. It's just higher res and better lighting ATM if you remove your pony rainbow goggles: they are 7th-gen games running on a decent PC rig with DX11 features turned on. I hope you are not satisfied with just that level of graphics this early in the generation, at least. I'm counting on them to become much better so PC games can look even better.


And the Xbox One probably plays PC games, I'm guessing.

I remember the first Xbox being said to play PC games, so I assume the 360 & One do the same.

It probably can; the hardware is a PC and it has a Windows layer, so I don't see why not. The 360, not so much; the CPU is quite different.



dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
 


http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Yes, we do know what is in it, as that is not exactly hard to do. You buy one, strip it, then run the correct software on it. Job done. It's a 40nm GPU, which says enough, but when combined with benchmarks and power input, that gives you a perfectly good figure to work with.

Only that thing is still making people scratch their heads on GAF and Beyond3D, because it's so customized that nobody knows WTF is really going on. Some devs are claiming that it has almost unlimited bandwidth for their needs on the device, or that it's really efficient and the watt/performance ratio is stupid good without even really pushing the system yet. I myself am very puzzled by it, because I did look at the Anandtech teardown and the real-time teardown on Twitch. If you look at the GPU die, my first reaction was literally "WTF?", because I've looked at plenty of AMD dies before, and the shit in the Wii U GPU is like ??? other than the obvious eDRAM areas and possible shader areas.


Yet anyone with the slightest bit of knowledge knows such claims are a) absurd and b) impossible. The PS4's choice and placement of RAM provides it with insane bandwidth - one that will likely never be fully used. The Wii U's? Erm, how is that the case with theirs? The RAM is slow, small, and generic. It certainly will have limited bandwidth. What is going on is the devs' choice of words, e.g. "unlimited bandwidth for their needs", aka low-spec games. If you are running Pong, then hell, 2GB might as well be 2TB as it will make no difference. But if you are running a state-of-the-art game such as Killzone: SF, then people would be saying otherwise. The fact that AMD have not commented on the GPU is a bad sign, as all companies like to brag about their gear - look at how they have used the PS4 and Xbox One to push their Jaguar range of processors. But when it comes to the Wii U? Silence. That isn't a good sign, and it is most likely because the answer would be "we did the best we could for a low-spec system".


Uh, I've been talking about the Wii U now for a while; it's not really about the PS4 right now in this discussion. I've already made it clear that it'd run shittier, we just don't know how much shittier, remember? The point is that the people actually working with the hardware are saying those things, so unless you are a dev and can provide me with actual benchmark numbers (which would be great, I'm dying to see them so I can actually estimate its real performance level and not second-guess within the wide margin of a giant fucking circle), those devs' comments are what we have to go on.

The majority of the bandwidth would be coming from the eDRAM, BTW; DDR3 just has decent latency for fetching small data quickly, and that performance will depend on how many cycles it takes to actually get where it needs to be. Your comment about bandwidth with a low-detail game also makes no sense: the bandwidth of the hardware itself will not change no matter what kind of game is being made. The more accurate reading would be that, for whatever the dev thinks the GPU can handle, the bandwidth is plenty to work with - that's how I'm taking it. 1000TB/s of bandwidth won't make a dick of difference if the GPU itself can't handle the RAM anyway.

What I want are hard-number answers, not opinions or bullshit, and more than anything else, whether it has certain fixed functions, or what the fuck is hidden in those blocks in the GPU. The only things we know are "efficient" and "balanced." Nobody wants to share any numbers, and I'm not a goddamned Wii U dev, so WTF Nintendo? AMD is not commenting on the GPU because they are under NDA, BTW, and who knows what the hell Nintendo is thinking? Nobody but the top brass in their company.

Shadow Fall is not state-of-the-art, BTW; there is no such game yet on the PS4 or Xbox One, because they haven't pushed the hardware yet. It's just higher res and better lighting ATM if you remove your pony rainbow goggles: they are 7th-gen games running on a decent PC rig with DX11 features turned on. I hope you are not satisfied with just that level of graphics this early in the generation, at least. I'm counting on them to become much better so PC games can look even better.


...yet comments from devs have varied from "Yeah, it's great (and also we are huge Nintendo devs and without them we would be sunk)" to "It's awful (and we are not a Nintendo dev company and don't have to watch what we say)". Now which do you think are telling the truth?

The lack of numbers is, as I said, a bad sign, and it reinforces the latter comments from those devs. KZ:SF is good enough to start with. The DX11 features are next-gen (tessellation, material reflections, particles, etc.), and only about two PC games have these options due to everything being multiplatform nowadays.