
Forums - Nintendo Discussion - After seeing Bayonetta 2 and 'X' in action today...

 

Poll: The PS4's power seems...
- Better, but not THAT much better anymore...: 241 votes (15.42%)
- Are you crazy?! The PS4 is GOD!: 349 votes (22.33%)
- The Wii U is clearly unde...: 741 votes (47.41%)
- The PS4 is selling better...: 36 votes (2.30%)
- I think I'll be buying a...: 191 votes (12.22%)
Total: 1,558 votes
starworld said:
supernihilist said:
starworld said:

Your comparison is horrible; at least post HD pics. You posted small pics that are not even 420p, and one was from a cutscene. Post some 720p gameplay pics of X.


The first image is gameplay. Who cares about pic size? Both are clearly representative of the graphics engine that X uses. Same for GTAV, and I fail to see any meaningful difference. I actually think X looks better than GTAV, since the latter runs like crap at sub-HD.

GTAV is not sub-HD, it's 720p, and who knows if X is sub-HD, or whether it even runs better than GTAV. And you need HD pics to judge graphics; I can post small pics of any game and it will look great by hiding its flaws.


I posted GTAV bullshots to compensate, but even then X looks better.



ZyroXZ2 said:
fatslob-:O said:
ZyroXZ2 said:

But after releasing and receiving a patch, it's 1080p@30, up from 900p@30.  If the PS4 is so powerful, why did the game get scaled to HALF the framerate of the original claim?  I'm sure there are ALL kinds of excuses, but this is the kind of thing that Sony fanboys sweep under the rug.  If this was a Nintendo game: "HAH, Wii U is underpowered, weak, can't even do 60 fps, it's going to get blown away by the PS4/XBox1!!!".  This is the double standard that the console wars need to stop.  If people are going to fight over consoles, fight about it consistently and honestly.  If anything, fight over the games...

This is pure nonsense. The reason AC4 doesn't run at 60 fps is that Ubisoft didn't think it was worth the effort and probably didn't have enough time to optimize it, since it had to come out for launch. We already have the PS4 doing last-gen games like Tomb Raider, Trine 2, and Metal Gear Solid: Ground Zeroes at 1080p/60 fps. Trine 2 even supports 3D, and Ground Zeroes is sub-HD on last-gen consoles, so just from those three games this early in its life span we have solid proof the PS4 could have done it, while the Wii U is still running games 7-15 fps below current-gen versions without graphical upgrades.
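As a rough aside on what those resolution and framerate targets actually cost, here is a back-of-the-envelope pixel-throughput comparison. It counts only raw pixels per second and ignores per-pixel shading cost, CPU load, and everything else, so treat it as illustrative only:

```python
# Rough pixel-throughput comparison for the AC4 resolution/framerate debate.
# Only raw pixels per second are counted; shading cost per pixel, CPU load,
# and so on are ignored, so this is purely illustrative.
modes = {
    "900p @ 30 fps (AC4 on PS4 at launch)": (1600, 900, 30),
    "1080p @ 30 fps (AC4 on PS4, patched)": (1920, 1080, 30),
    "1080p @ 60 fps (the original claim)":  (1920, 1080, 60),
}

baseline = 1600 * 900 * 30  # pixels per second at 900p/30

for name, (width, height, fps) in modes.items():
    pixels_per_second = width * height * fps
    print(f"{name}: {pixels_per_second / 1e6:.1f} MPix/s "
          f"({pixels_per_second / baseline:.2f}x the 900p/30 baseline)")
```

That works out to roughly 43, 62, and 124 MPix/s respectively, so the claimed 1080p/60 target would have needed nearly 3x the raw pixel throughput of the launch configuration and double that of the patched one.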



supernihilist said:
starworld said:
supernihilist said:
starworld said:

Your comparison is horrible; at least post HD pics. You posted small pics that are not even 420p, and one was from a cutscene. Post some 720p gameplay pics of X.


The first image is gameplay. Who cares about pic size? Both are clearly representative of the graphics engine that X uses. Same for GTAV, and I fail to see any meaningful difference. I actually think X looks better than GTAV, since the latter runs like crap at sub-HD.

GTAV is not sub-HD, it's 720p, and who knows if X is sub-HD, or whether it even runs better than GTAV. And you need HD pics to judge graphics; I can post small pics of any game and it will look great by hiding its flaws.


I posted GTAV bullshots to compensate, but even then X looks better.


Well, I posted good quality gameplay shots of X, but I guess you think they look crappy.



ZyroXZ2 said:

Yup, you continue to have nothing to back yourself up, and you continue name-calling with no logic whatsoever.  You continue to hide behind the pretense that I don't understand, yet provide nothing that demonstrates your own understanding.  You're devolving PC gamers right now; we're the types who rely on numbers and figures to measure anything and everything, which is why we spend hours overclocking and benchmarking.

The one who's name-calling is you ... I've already covered these topics ages ago, so go and dig up some threads instead of asking me to waste time trying to explain them for you. If I'm devolving PC gamers and the like, then what does that make you for doing a worthless analysis that means nothing? Some PC gamers are smarter than others, and the ones who know something get their fix from either Tom's Hardware or AnandTech.

Weren't you just complaining about numbers not meaning everything, but now you're using power draw?  All measurements of power draw of all systems have been done AT THE SOCKET, which reduces how accurately they represent the system's overall power (essentially, you're also using numbers not wholly representative of the systems).  People are still GUESSING what the power draw of the GPU in the Wii U is.  Of course, with Nintendo's engineering target of low power and high efficiency, the system uses a meager 75-watt power brick, and draw at the socket on NSMBU and other early launch games has been measured at around 35 watts during gaming.  What makes it so hard to compare the two is that the PS4 IDLES at 90 watts.  The PS4 takes almost triple the power of the Wii U just to sit at the menu.  However, to address your comparison to the last generation:

I didn't say that numbers mean NOTHING. It's just that your interpretation is simply worthless due to the fact that you have such a limited understanding of how a GPU works; if you knew the basics, then you could have easily come to a simple conclusion. No, there are no guesses as to how much power the system draws: we ALREADY KNOW how much power the SoC itself draws, and that's 30 WATTS according to AnandTech, not counting the memory, flash storage, and the disc drive. 5 for the CPU and 25 for the GPU. Nintendo's engineering is crap for the most part, and there's nothing efficient about using a dead-ass 10-year-old CPU architecture carried over ever since the GameCube. Nintendo was NEVER into engineering, and they showed it by hiring other professionals such as SGI, IBM, and AMD to design their hardware. Idle power draw means NOTHING; what matters most is MAX DRAW.

http://www.eurogamer.net/articles/digitalfoundry-wii-u-is-the-green-console

Drawing less energy means NOTHING, and in fact it only supports the opposing side's argument that the Wii U is underpowered.

And then about the Wii U's power supply: http://www.cinemablend.com/games/Wii-U-Memory-Bandwidth-GPU-More-Powerful-Than-We-Thought-62437-p2.html

They only did that to reduce the chances of the power supply blowing a fuse or overheating.

While I take that CinemaBlend article with a grain of salt for its lack of proper references on 1080p games on the Wii U, the power supply numbers check out on a logical level.  If the Wii U has, say, up to 50 watts for games, and the PS4 has up to 150 watts (again, in your favor, I'm raising the PS4's numbers to give your argument the edge, and it still doesn't hold up), then the PS4, BY YOUR POWER DRAW LOGIC, is only three times as powerful as the Wii U.  WHOOO, such a BIG GAP, right?  On a side note unrelated to the argument, something I noticed about the Wii U is that its idle power and gaming power were very close when tested.  This leads me to believe that the Wii U is always running the CPU and GPU at a full power state even when not in use (or only marginally in use).  That's always bothered me in particular, since that lack of a lower power state goes against their engineering targets for "high efficiency"...

Again, it is a FACT that the Wii U only uses around 30 watts; just because you have an 80-watt power supply doesn't mean you get to draw all of it. It is the components that choose how much to draw from the power supply. It is a fact that the PS4 uses around 140 watts according to AnandTech. Again, this is why your interpretation is worthless ... It is not only power draw that determines how powerful a system is; it also comes down to factors such as manufacturing process nodes and newer architectures that give unmatched efficiency. Hence why I'm reluctant to explain anything: you simply don't understand.
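To put rough numbers on the point that raw wattage and performance don't scale together across different architectures and process nodes: the PS4's GPU has a published rating of about 1.84 TFLOPS, while the Wii U's GPU has never had an official figure; ~176 GFLOPS is only a common community estimate, and the ~25 W / ~100 W GPU power shares below are assumptions for illustration, not measured values. A minimal sketch of the performance-per-watt comparison under those assumptions:

```python
# Illustrative performance-per-watt comparison (figures are estimates and
# assumptions, not official specs; see the note above).
gpus = {
    # ~176 GFLOPS is a common community estimate for the Wii U GPU;
    # ~25 W is the GPU share of the SoC power quoted in the post above.
    "Wii U GPU (estimated)": {"gflops": 176.0, "watts": 25.0},
    # ~1843 GFLOPS is the widely published PS4 GPU rating; ~100 W as its
    # share of the ~140 W system draw is an assumption for illustration.
    "PS4 GPU": {"gflops": 1843.0, "watts": 100.0},
}

for name, spec in gpus.items():
    print(f"{name}: ~{spec['gflops'] / spec['watts']:.1f} GFLOPS per watt")
```

Under these (debatable) inputs, the PS4 GPU comes out around 2.5x more efficient per watt, which is the kind of gap a newer process node and architecture can open up independently of total power draw.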

And by the way: https://www.psu.com/a020967/PS4-Xbox-One-Assassins-Creed-IV-is-60fps-at-a-much-higher-resolution-than-current-gen?page=2

It says Tim Browne confirmed 60 fps.

The funny thing is that the article you linked sourced its claim from VG247, which was already debunked ...

You aren't going to win this. I keep coming with numbers and articles, and all you do is devolve into calling BS on everything and saying I don't understand.  I said I'd stop, and yet here I am kicking you while you're down.  I will stop from this point forward, regardless of your response, for your sake.

Coming up with numbers means nothing if your interpretation is worthless in the end ... You're not hurting anyone, especially given that you're still ignorant about hardware ...

The one who called BS was you, and in the end you simply didn't debunk anything ... Why don't you go listen to JoeTheBro instead for an explanation.



oniyide said:
curl-6 said:

None use better-than-PS360 textures despite having more than twice as much RAM available, and Blacklist's framerate takes a dive with just three or four enemies on screen, even though launch ports like Mass Effect 3 and Black Ops 2 on Wii U handle dozens of characters at a much higher framerate. From a technical point of view, they are lazy, no two ways about it.


So... you're mad that they don't have better textures than the PS360 version? Something that barely affects gameplay?

I'm not mad, just pointing out clear evidence of laziness that I'm confident will continue with  Watch Dogs.



starworld said:

Your comparison is horrible; at least post HD pics. You posted small pics that are not even 420p, and one was from a cutscene. Post some 720p gameplay pics of X.

There are none. Every pic of X on the net is a horribly compressed capture from a video stream, rendering any screenshot comparison unfair.



Hynad said:
curl-6 said:
fatslob-:O said:
curl-6 said:

You're right; there will be no excuses. Just legitimate reasons, like copy+pasting PS360 assets and doing bugger all optimization, just like they did with Blacklist, AC3, and AC4. And I'm hardly sour; I didn't buy any of Ubisoft's Wii U games because I don't support their laziness.

@Bold That sounds more like an excuse, seeing as how you don't know if they are reusing EVERY PS360 asset.

Seems like a safe bet looking at their past ports. I don't expect them to change their habits now.


Wait... So Ubisoft is making one version to be the main one that gets ported to the other platforms, right? They port it, say from the 360 to the PS3, and all is fine and dandy... But when they do the same with the Wii U, a console you believe to be more powerful than the HD twins, the porting process doesn't work as well?

By your logic, if the PS3, which has a very different CPU and GPU compared to the 360, can handle such a lazy asset-porting process, the Wii U should have no problem whatsoever, since it's supposedly more powerful than the PS3.


The PS3 had the same problems back in its early years, actually.

It's not a problem anymore because, after 7 years of practice, porting between PS3/360 has been refined to a science.

But Wii U porting doesn't have 7 years of dev experience and investment on its side.



fatslob-:O said:

The one who called BS was you, and in the end you simply didn't debunk anything ... Why don't you go listen to JoeTheBro instead for an explanation.

Oh, I'll still talk with ZyroXZ2, but I'm done with this discussion with him. I like debating these things, not arguing them. With lines like "I said I'd stop, and yet here I am kicking you while you're down.  I will stop from this point forward, regardless of your response, for your sake", he shows he's not expressing his points, but rather making it a personal battle lacking any commitment to the truth. You're far from innocent yourself, but at least I haven't noticed any childish remarks like this in your comments.



curl-6 said:
Hynad said:
curl-6 said:

Seems like a safe bet looking at their past ports. I don't expect them to change their habits now.


Wait... So Ubisoft is making one version to be the main one that gets ported to the other platforms, right? They port it, say from the 360 to the PS3, and all is fine and dandy... But when they do the same with the Wii U, a console you believe to be more powerful than the HD twins, the porting process doesn't work as well?

By your logic, if the PS3, which has a very different CPU and GPU compared to the 360, can handle such a lazy asset-porting process, the Wii U should have no problem whatsoever, since it's supposedly more powerful than the PS3.


The PS3 had the same problems back in its early years, actually.

It's not a problem anymore because, after 7 years of practice, porting between PS3/360 has been refined to a science.

But Wii U porting doesn't have 7 years of dev experience and investment on its side.

You state all this as if it were fact. You should really prepare yourself for disappointment if you believe the Wii U is an entire league more powerful than the HD twins.

Devs don't have much experience on the PS4, yet they ported the PC assets and improved many aspects of Tomb Raider... The scenario you're making up would only make sense if the Wii U didn't have much more horsepower than the HD twins. But the way you present all this indicates that you believe it's actually in another league... which so far has been proven false, even when you consider the exclusives.



curl-6 said:

The PS3 had the same problems back in its early years, actually.

It's not a problem anymore because, after 7 years of practice, porting between PS3/360 has been refined to a science.

But Wii U porting doesn't have 7 years of dev experience and investment on its side.

Does it really need 7 years of experience when almost everyone in the industry already knows what its hardware is and/or already has the tools?

Think about it, curl ... The IBM Espresso is just an overclocked GameCube processor with two more cores. The GPU is just an AMD Evergreen-family part with no architectural modifications to boot. How hard can it be to program for? If it has better support for GPGPU like you mentioned, then shouldn't it be much easier to program for, seeing as you can easily use middleware to expose its underlying functionality through a high-level abstraction layer?

After all, one of the goals of GPGPU is to make concurrent programming models simpler.
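For what it's worth, here is a minimal sketch of what that "expose the GPU through a high-level abstraction layer" idea looks like in practice, using OpenCL through the pyopencl bindings on a PC. Nothing here is Wii U-specific and the kernel is deliberately trivial; the point is just that the per-element work is written once and the runtime spreads it across the GPU's cores, with no explicit threads or locks in sight.

```python
import numpy as np
import pyopencl as cl

# Set up a context and command queue on whatever OpenCL device is available.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# Two input vectors to add element-wise on the GPU.
a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel describes the work for ONE element; the runtime launches one
# work-item per element and schedules them across the GPU's compute units.
program = cl.Program(ctx, """
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```

That is the sense in which GPGPU middleware simplifies concurrent programming: the concurrency is implicit in the data-parallel launch rather than managed by hand.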