
Will the PS4 outsell the XBox One and WiiU?

snowdog said:
goopy20 said:
Zero999 said:
snowdog said:
It isn't pointless to keep reminding you that there's over a third of the silicon that's a complete mystery. It's a fact, and a very important one at that. We really don't have much of a clue about Latte apart from two things - it has the same amount of eDRAM as the One's GPU and it has a DX11-equivalent feature set. And that's it.

We don't have a clue how many ROPs or ALUs it has. It's a completely custom GPU, and we don't even know what GPU it's based on because it's been customised so much that its origins are completely unrecognisable.

And it should also be noted that not a single developer, named or anonymous, has complained about the GPU or RAM...quite the opposite in fact.

Before he starts the secret sauce shit, remind him that WE don't know those details like flopage, but devs do.

Exactly, how can anyone seriously consider the idea of some mysterious custom-made GPU that nobody understands? It would be pretty bad on Nintendo's part to leave developers in the dark if they wanted to make a Wii U game. Those guys simply know how the Wii U works, and they are complaining about the slow CPU and RAM. Not in comparison to the PS4, but actually compared to the current gen consoles.



There isn't a single developer, named or anonymous, that's complained about the RAM. Developers have been praising it.

The Metro engine is a known CPU resource hog and the Wii U's architecture is completely different compared to the PS3 and 360. Once the PS4 and One are released developers will find it a great deal easier to port between the Wii U/PS4/One (which also have 'slow' CPUs) than it currently is to port between the Wii U/PS3/360.

As for EA and their stance on support, that's going to change before Christmas. Shareholders have already pressured them into supporting the console with a few titles by the looks of it, and when 3D Mario and Mario Kart 8 have been released you'll find EA adding even more support for the platform.

And regarding Espresso, it should be noted that it has a ridiculously short pipeline (only 4 stages!!!) so it will be a great deal more efficient than the PS4 and One CPUs. The PS4 CPU, for example, has 17 stages.

Are you now seriously saying the Wii U's CPU is better than those 8-core Jaguar APUs?

Anyway, I've got the patience of a Buddhist monk, but it just seems useless to argue with someone who doesn't consider actual scientific numbers as proof of anything. All I can say is take a look at the confirmed specs of all 3 consoles in the link and see where the Wii U stands. http://gamrconnect.vgchartz.com/thread.php?id=136756

Let's just agree that 3D Mario and Mario Kart will be awesome and sell loads of copies. But let's also agree Nintendo won't and can't depend on 3rd party support from developers that will push the PS4 to its limits. In other words, the kind of games the typical graphics whore/core gamer will care about.

 




I didn't say anything about it being better. It's going to be more efficient, which will lessen the gap between Espresso and the CPUs in the other two machines a little.
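To put a rough number on what a short pipeline buys you: the cost of a mispredicted branch is roughly the number of stages that have to be flushed and refilled, so the wasted cycles per instruction scale with pipeline depth. Using illustrative figures I'm making up purely for the example (20% of instructions are branches, a 5% miss rate) and the stage counts mentioned above:

$$\Delta\mathrm{CPI} \approx f_{branch} \times m \times P \;\Rightarrow\; 0.20 \times 0.05 \times 4 = 0.04 \ \text{(Espresso)} \qquad 0.20 \times 0.05 \times 17 = 0.17 \ \text{(Jaguar)}$$

That's only the per-clock efficiency side of it, mind; it says nothing about clock speed, core count or SIMD width, which is why 'more efficient' isn't the same thing as 'more powerful'.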

Jags are hardly powerhouses mate, and the same goes for Espresso too. None of the CPUs this gen are suitable for double precision floating point work; that's precisely why all 3 platform holders have chosen a similar console architecture, with CPUs that are great at general processing and GPUs doing the floating point work.
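Some rough peak numbers show why the floating point work gets pushed to the GPU. Assuming the commonly reported figures for the PS4 (8 Jaguar cores at ~1.6GHz, each capable of 8 single-precision FLOPs per cycle, versus 1152 GPU shader ALUs at ~800MHz doing a multiply-add per cycle):

$$8 \times 1.6\,\mathrm{GHz} \times 8 \approx 102\ \mathrm{GFLOPS\ (CPU)} \qquad 1152 \times 0.8\,\mathrm{GHz} \times 2 \approx 1843\ \mathrm{GFLOPS\ (GPU)}$$

Call it roughly an 18x gap in single precision throughput, and the same general pattern holds on all three machines, which is exactly why the heavy number crunching is meant to live on the GPU.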

If truth be told both Sony and Microsoft chose very odd architectures last gen.



goopy20 said:

Are you now seriously saying the Wii U's CPU is better than those 8-core Jaguar APUs?

Anyway, I've got the patience of a Buddhist monk, but it just seems useless to argue with someone who doesn't consider actual scientific numbers as proof of anything. All I can say is take a look at the confirmed specs of all 3 consoles in the link and see where the Wii U stands. http://gamrconnect.vgchartz.com/thread.php?id=136756

Let's just agree that 3D Mario and Mario Kart will be awesome and sell loads of copies. But let's also agree Nintendo won't and can't depend on 3rd party support from developers that will push the PS4 to its limits. In other words, the kind of games the typical graphics whore/core gamer will care about.

 

So unannounced flopage for the Wii U and Xbox One, as well as other stuff, now means confirmed? You've got problems.



snowdog said:
goopy20 said:
Zero999 said:
snowdog said:
It isn't pointless to keep reminding you that there's over a third of the silicon that's a complete mystery. It's a fact, and a very important one at that. We really don't have much of a clue about Latte apart from two things - it has the same amount of eDRAM as the One's GPU and it has a DX11-equivalent feature set. And that's it.

We don't have a clue how many ROPs or ALUs it has. It's a completely custom GPU, and we don't even know what GPU it's based on because it's been customised so much that its origins are completely unrecognisable.

And it should also be noted that not a single developer, named or anonymous, has complained about the GPU or RAM...quite the opposite in fact.

Before he starts the secret sauce shit, remind him that WE don't know those details like flopage, but devs do.

Exactly, how can anyone seriously consider the idea of some mysterious custom-made GPU that nobody understands? It would be pretty bad on Nintendo's part to leave developers in the dark if they wanted to make a Wii U game. Those guys simply know how the Wii U works, and they are complaining about the slow CPU and RAM. Not in comparison to the PS4, but actually compared to the current gen consoles.



There isn't a single developer, named or anonymous, that's complained about the RAM. Developers have been praising it.

The Metro engine is a known CPU resource hog and the Wii U's architecture is completely different compared to the PS3 and 360. Once the PS4 and One are released developers will find it a great deal easier to port between the Wii U/PS4/One (which also have 'slow' CPUs) than it currently is to port between the Wii U/PS3/360.

As for EA and their stance on support, that's going to change before Christmas. Shareholders have already pressured them into supporting the console with a few titles by the looks of it, and when 3D Mario and Mario Kart 8 have been released you'll find EA adding even more support for the platform.

And regarding Espresso, it should be noted that it has a ridiculously short pipeline (only 4 stages!!!) so it will be a great deal more efficient than the PS4 and One CPUs. The PS4 CPU, for example, has 17 stages.

I wouldn't take anything EA has said in the past few months seriously. They are clearly in bed with the idea of DRM and those who support it.



And it should also be pointed out that you shouldn't judge a console on specs alone; a few developers have noted that the GPU 'punches above its weight'. It wouldn't surprise me if the same turns out to be true of the CPU and RAM.

And even if their performance reflects the specs it still isn't going to have any problems handling down-ports from the PS4 and One.



snowdog said:
And it should also be pointed out that you shouldn't judge a console on specs alone; a few developers have noted that the GPU 'punches above its weight'. It wouldn't surprise me if the same turns out to be true of the CPU and RAM.

And even if their performance reflects the specs it still isn't going to have any problems handling down-ports from the PS4 and One.

The GPU does punch above its weight, but only compared to current gen consoles. People did actually figure out what is in the Wii U, and it has about 1.5 times the raw computing power of the 360's GPU and, of course, twice the memory. Not bad by today's standards, but a far cry from a true next-gen machine like the PS4.

the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. [Update: It's generally accepted that the PS3 graphics core is less capable than Xenos, so Wii U's GPU would be even more capable.] 1080p resolution is around 2.5x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.

We are talking about a 600% difference just from the GPU, and I am not even going to begin about the 8-core CPU and 7 gigs of GDDR5. All I know is that if developers can't show a big leap in graphics between both platforms, then they are doing a piss-poor job.
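For what it's worth, here are back-of-the-envelope numbers based on the figures quoted above, taking the 320 shader ALUs of a 4650/4670-class part at 550MHz for the Wii U, the commonly cited ~240 GFLOPS for Xenos, and 1152 ALUs at 800MHz for the PS4 (the Wii U ALU count being exactly the estimate this thread keeps arguing over):

$$320 \times 0.55\,\mathrm{GHz} \times 2 \approx 352\ \mathrm{GFLOPS\ (Wii\ U)} \approx 1.5\times \mathrm{Xenos} \qquad 1152 \times 0.8\,\mathrm{GHz} \times 2 \approx 1843\ \mathrm{GFLOPS\ (PS4)} \approx 5\times \mathrm{Wii\ U}$$

And on the resolution point, 1080p is exactly 2.25x the pixels of 720p: $1920 \times 1080 = 2{,}073{,}600$ versus $1280 \times 720 = 921{,}600$.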



goopy20 said:
snowdog said:
And it should also be pointed out that you shouldn't judge a console on specs alone; a few developers have noted that the GPU 'punches above its weight'. It wouldn't surprise me if the same turns out to be true of the CPU and RAM.

And even if their performance reflects the specs it still isn't going to have any problems handling down-ports from the PS4 and One.

The GPU does punch above its weight, but only compared to current gen consoles. People did actually figure out what is in the Wii U, and it has about 1.5 times the raw computing power of the 360's GPU and, of course, twice the memory. Not bad by today's standards, but a far cry from a true next-gen machine like the PS4.

the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. [Update: It's generally accepted that the PS3 graphics core is less capable than Xenos, so Wii U's GPU would be even more capable.] 1080p resolution is around 2.5x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.

We are talking about a 600% difference just from the GPU, and I am not even going to begin about the 8-core CPU and 7 gigs of GDDR5. All I know is that if developers can't show a big leap in graphics between both platforms, then they are doing a piss-poor job.

bolded 1: no

b2: no

b3: 4 times and more efficient memory.

b4: utter bullshit from people who just looked at the GPU and couldn't even figure out what 1/3 of it does.

b5: no, because of the reasons above.



“The Wii U GPU is several generations ahead of the current gen. It allows many things that were not possible on consoles before. If you develop for Wii U you have to take advantage of these possibilities, otherwise your performance is of course limited. Also your engine layout needs to be different. You need to take advantage of the large shared memory of the Wii U, the huge and very fast EDRAM section and the big CPU caches in the cores. Especially the workings of the CPU caches are very important to master. Otherwise you can lose a magnitude of power for cache relevant parts of your code. In the end the Wii U specs fit perfectly together and make a very efficient console when used right.”

Shin'en Multimedia.
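To make the cache point concrete, here's a minimal, generic C++ sketch of the kind of data-layout decision the quote is talking about. It has nothing to do with Shin'en's actual engine code, and the particle fields are invented purely for illustration: a struct-of-arrays layout only pulls the data the loop actually touches through the CPU caches, while an array-of-structs layout drags every unused field along with it.

#include <cstddef>
#include <vector>

// Array-of-structs: position, velocity, colour and lifetime share the same
// cache lines, so a position-only update still drags the unused fields
// through the cache.
struct ParticleAoS {
    float x, y, z;
    float vx, vy, vz;
    float colour[4];
    float lifetime;
};

void update_aos(std::vector<ParticleAoS>& particles, float dt) {
    for (auto& p : particles) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

// Struct-of-arrays: the position and velocity streams are contiguous, so every
// cache line fetched by the update loop is fully used, and the untouched
// colour/lifetime data never enters the cache at all.
struct ParticlesSoA {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
    std::vector<float> colour;    // not touched by the update loop
    std::vector<float> lifetime;  // not touched by the update loop
};

void update_soa(ParticlesSoA& particles, float dt) {
    const std::size_t n = particles.x.size();
    for (std::size_t i = 0; i < n; ++i) {
        particles.x[i] += particles.vx[i] * dt;
        particles.y[i] += particles.vy[i] * dt;
        particles.z[i] += particles.vz[i] * dt;
    }
}

The maths is identical in both versions; the second one just respects how cache lines get filled, which is the 'lose a magnitude of power for cache relevant parts of your code' point in the quote.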



Zero999 said:

“The Wii U GPU is several generations ahead of the current gen. It allows many things that were not possible on consoles before. If you develop for Wii U you have to take advantage of these possibilities, otherwise your performance is of course limited. Also your engine layout needs to be different. You need to take advantage of the large shared memory of the Wii U, the huge and very fast EDRAM section and the big CPU caches in the cores. Especially the workings of the CPU caches are very important to master. Otherwise you can lose a magnitude of power for cache relevant parts of your code. In the end the Wii U specs fit perfectly together and make a very efficient console when used right.”

Shin'en Multimedia.

Ok fine guys. The Wii U is a beast that will run something like Killzone 4 just fine. In 720p, because obviously the console was designed as a 720p machine, and it has nothing to do with a lack of power preventing it from running 1080p. All the AAA developers just don't know what the hell they are talking about. But listen to Shin'en Multimedia, who were able to pull off next gen graphics on the Wii U with their 9.99 DLC title Nano Assault Neo and... oh wait, they only made that one game... Seriously now, it's a good game, but how is anyone supposed to take their tech advice seriously when their game could very likely run fine on a PS2?

Can't we just agree that people will buy a Wii U for its exclusives, not for the graphics? What kind of next gen graphics do you guys seriously expect from a Mario Kart, and does it matter for games like that? You seem to forget that the Wii U already has a Mario game, which is actually pretty good and is sitting on an 84 Metacritic score. Those are the sort of games Nintendo needs more of, and no Nintendo fan is going to care if it uses tessellation, dynamic GI or any other fancy DX11 rendering effects. What Nintendo doesn't need are weak-sauce versions of core games, because people who care about graphics will notice the difference between 720p and 1080p or dynamic vs pre-baked lighting.



goopy20 said:
Zero999 said:

“The Wii U GPU is several generations ahead of the current gen. It allows many things that were not possible on consoles before. If you develop for Wii U you have to take advantage of these possibilities, otherwise your performance is of course limited. Also your engine layout needs to be different. You need to take advantage of the large shared memory of the Wii U, the huge and very fast EDRAM section and the big CPU caches in the cores. Especially the workings of the CPU caches are very important to master. Otherwise you can lose a magnitude of power for cache relevant parts of your code. In the end the Wii U specs fit perfectly together and make a very efficient console when used right.”

Shin'en Multimedia.

Ok fine guys. The Wii U is a beast that will run something like Killzone 4 just fine. In 720p, because obviously the console was designed as a 720p machine, and it has nothing to do with a lack of power preventing it from running 1080p. All the AAA developers just don't know what the hell they are talking about. But listen to Shin'en Multimedia, who were able to pull off next gen graphics on the Wii U with their 9.99 DLC title Nano Assault Neo and... oh wait, they only made that one game... Seriously now, it's a good game, but how is anyone supposed to take their tech advice seriously when their game could very likely run fine on a PS2?

Can't we just agree that people will buy a Wii U for its exclusives, not for the graphics? What kind of next gen graphics do you guys seriously expect from a Mario Kart, and does it matter for games like that? You seem to forget that the Wii U already has a Mario game, which is actually pretty good and is sitting on an 84 Metacritic score. Those are the sort of games Nintendo needs more of, and no Nintendo fan is going to care if it uses tessellation, dynamic GI or any other fancy DX11 rendering effects. What Nintendo doesn't need are weak-sauce versions of core games, because people who care about graphics will notice the difference between 720p and 1080p or dynamic vs pre-baked lighting.

bolded 1: EA and very few others talking shit out of their asses doesn't equal "All the AAA developers"

bolded 2: Apparently some people were blessed with a magical PS2.

bolded 3: people who care about GAMES won't give a shit about a hard-to-notice resolution difference that just serves to make games more expensive and inflate fanboys' egos.