
After seeing Bayonetta 2 and 'X' in action today...

 

Poll: The PS4's power seems...

Better, but not THAT much better anymore... 241 votes (15.42%)
Are you crazy?! The PS4 is GOD! 349 votes (22.33%)
The Wii U is clearly unde... 741 votes (47.41%)
The PS4 is selling better... 36 votes (2.30%)
I think I'll be buying a... 191 votes (12.22%)

Total: 1,558 votes

Well, I could be wrong, but I expect PS4 & XB1 games still have a long way to go technically.

1) Launch games are developed under severe time crunch, with tools not being great, SDKs not being stable, etc., so just from having more time and more mature tools, games will get much better than what you can see today.

2) Current games haven't even begun to use GPU compute.

3) Current game tech on PS4 & XB1 has been ported from PC, but PCs have very inefficient GPU drivers compared to the low-level access on PS4/X1. You can see in AMD's Mantle SDK how they're able to increase draw calls by 900% compared to D3D (see the sketch below). What you see done today on PC on equivalent GPUs is not the limit for PS4/X1.

Look at this: http://cdn4.wccftech.com/wp-content/uploads/2013/12/MantleBenefits1.jpg

Yes, PC games will "leverage optimization work from next gen game consoles".
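To make the draw-call point concrete, here's a minimal sketch of why call count matters on PC, assuming D3D11 and hypothetical scene data (the Object struct and buffer setup are illustrative, not from any real engine). Each API call pays a fixed CPU cost in the driver, which is the overhead Mantle and console-style low-level access cut down:

```cpp
#include <d3d11.h>
#include <vector>

// Hypothetical per-object record; real engines carry far more state.
struct Object { UINT indexCount; UINT startIndex; };

// Naive path: one draw call per object -> N trips through the driver
// per frame. On PC this can bottleneck the CPU before the GPU is busy.
void DrawNaive(ID3D11DeviceContext* ctx, const std::vector<Object>& objects) {
    for (const Object& obj : objects) {
        // (per-object constant buffer updates omitted for brevity)
        ctx->DrawIndexed(obj.indexCount, obj.startIndex, 0);
    }
}

// Batched path: identical meshes drawn via instancing -> one driver call,
// with per-instance transforms supplied in a second vertex buffer.
void DrawBatched(ID3D11DeviceContext* ctx, UINT indexCount, UINT instanceCount) {
    ctx->DrawIndexedInstanced(indexCount, instanceCount, 0, 0, 0);
}
```

Consoles (and Mantle) attack the same overhead from the other side: the submission path itself is so thin that even un-batched draws stay cheap, which is the kind of headroom that 900% figure points at.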

 



My 8th gen collection

ZyroXZ2 said:

I started to see AMD GPUs as good for one thing: bitcoin mining.

I may EVENTUALLY return to ATI/AMD GPUs, but I've gotten so absorbed in the power of NVIDIA GPUs that I haven't had an ATI/AMD GPU since my 8800GTS 512, lol

If I were you, dude, I would check out other virtual currencies too, like Litecoin.

The reason why NVIDIA is so shitty at mining has to do with their integer rotate performance: their GPUs have to emulate rotates with shifts and adds.

The reason why AMD is so good at mining is that they implemented instructions on their GPUs (a bit-align op that does a rotate in one instruction) that make mining faster.

Their integer performance is so abysmal that I would be surprised if hackers even considered an NVIDIA GPU.

Integer performance dominates the landscape in cryptography, therefore AMD GPUs are better password crackers LOL.
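For the curious, here's roughly what that looks like in code; a minimal sketch in plain C++ (not tuned miner code) of the rotate-heavy core of SHA-256, the hash Bitcoin mining grinds billions of times per second. Without a native rotate instruction, each rotr costs two shifts plus an OR, which is the shift-and-add penalty mentioned above:

```cpp
#include <cstdint>

// 32-bit right rotate. AMD GPUs of this era have a bit-align instruction
// that does this in one op; hardware without it emits shift, shift, OR.
static inline uint32_t rotr32(uint32_t x, unsigned n) {
    return (x >> n) | (x << (32 - n));
}

// The Sigma functions from SHA-256's compression loop: rotates dominate,
// and mining evaluates these on every one of billions of hash attempts.
static inline uint32_t big_sigma0(uint32_t x) {
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}
static inline uint32_t big_sigma1(uint32_t x) {
    return rotr32(x, 6) ^ rotr32(x, 11) ^ rotr32(x, 25);
}
```

Password cracking hammers the same integer primitives over huge candidate lists, which is why the same cards that mine well crack well.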



fatslob-:O said:
ZyroXZ2 said:

I started to see AMD GPUs as good for one thing: bitcoin mining.

I may EVENTUALLY return to ATI/AMD GPUs, but I've gotten so absorbed in the power of NVIDIA GPUs that I haven't had an ATI/AMD GPU since my 8800GTS 512, lol

If I were you, dude, I would check out other virtual currencies too, like Litecoin.

The reason why NVIDIA is so shitty at mining has to do with their integer rotate performance: their GPUs have to emulate rotates with shifts and adds.

The reason why AMD is so good at mining is that they implemented instructions on their GPUs (a bit-align op that does a rotate in one instruction) that make mining faster.

Their integer performance is so abysmal that I would be surprised if hackers even considered an NVIDIA GPU.

Integer performance dominates the landscape in cryptography, therefore AMD GPUs are better password crackers LOL.

Oh I know why AMD/ATI is better for mining, but that's not what I use my PC for... ANYMORE... BWHAHAHAHAHAH MWHAHAHAHAHAHAHAHAH >:D



Check out my entertainment gaming channel!
^^/

Eh, I will wait until E3, where Sony will showcase more PS4 games. What we have so far are just launch titles, nothing more.



                
       ---Member of the official Squeezol Fanclub---

fatslob-:O said:

Read the rest of the thread, Curl...

Sure, I didn't specify, but I thought people knew what I was going on about. Just because the Wii U supports GPGPU (well, every console with a DX9-capable card could do it, so that includes the PS3/360 as well) does not mean you will get acceptable performance, and that especially goes for cases where the fluid occupies a large voxel grid.

Wii U's GPU isn't DX9-level though; it's DX10/11 equivalent. Could it do it as well as PS4/Xbone? Of course not.

But the 7th gen situation, where there were tons of PS3/360 effects the Wii just couldn't do, has not been repeated, as this time the Wii U is much closer in functionality.



JoeTheBro said:
Kyuu said:
The gap between Wii U and PS3 is smaller than the gap between Wii and PS2, no?

Much smaller, but that doesn't really matter. The PS2 was much weaker than the XBOX and GameCube. It was a weak system.

lol no. 



:^)



Soundwave said:

Yeah, let's wait until we actually see some PS4 games that use the system fully... I recall a lot of "Haha, Perfect Dark Zero doesn't look that much better than Resident Evil 4!" early in the generation last time.


You forgot the main difference: the PS3 had exotic hardware, and it took developers years to use its full potential. And game budgets rose every year.

But the PS4 and One are practically (the PS4 even more than the One) just PCs in a small box. There is no exotic hardware that has to be understood. Technically there will not be much better graphics on these consoles than we have already seen with Ryse and Killzone. Game budgets have already peaked and will not rise much further, so no better graphics will come from that side either.

The Wii U has the most exotic console hardware this gen by far. So the gap between launch games and later games will probably be the largest on the Wii U.



Wasn't that impressed by X; the game looks like it would be possible on last gen to me.



curl-6 said:
fatslob-:O said:

Read the rest of the thread, Curl...

Sure, I didn't specify, but I thought people knew what I was going on about. Just because the Wii U supports GPGPU (well, every console with a DX9-capable card could do it, so that includes the PS3/360 as well) does not mean you will get acceptable performance, and that especially goes for cases where the fluid occupies a large voxel grid.

Wii U's GPU isn't DX9-level though; it's DX10/11 equivalent. Could it do it as well as PS4/Xbone? Of course not.

But the 7th gen situation, where there were tons of PS3/360 effects the Wii just couldn't do, has not been repeated, as this time the Wii U is much closer in functionality.

@Bold I already know that, but I'm going to have to stop your claims of GPGPU being a saviour or some ultra-awesome feature in their tracks; realistically, only the more powerful consoles like the PS4 and the X1 will seriously see the advantages.
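To give a sense of scale here, below is a minimal sketch in plain C++ (hypothetical function and grid size; a real implementation would run as a GPU compute kernel) of one diffusion step of a grid-based fluid sim, in the style of Stam's stable fluids. The work grows with the cube of the resolution, so a large voxel grid eats compute throughput fast, and that throughput is exactly where the PS4/X1 outclass the Wii U:

```cpp
#include <cstddef>
#include <vector>

// One Jacobi relaxation step for diffusion over an N x N x N voxel grid.
// Cost is O(N^3) per step; doubling the resolution means 8x the work.
void diffuse_step(const std::vector<float>& src, std::vector<float>& dst,
                  std::size_t N, float k) {
    auto idx = [N](std::size_t x, std::size_t y, std::size_t z) {
        return (z * N + y) * N + x;
    };
    for (std::size_t z = 1; z + 1 < N; ++z)
        for (std::size_t y = 1; y + 1 < N; ++y)
            for (std::size_t x = 1; x + 1 < N; ++x) {
                // Blend each cell with its six face neighbours at rate k.
                float nb = src[idx(x-1,y,z)] + src[idx(x+1,y,z)]
                         + src[idx(x,y-1,z)] + src[idx(x,y+1,z)]
                         + src[idx(x,y,z-1)] + src[idx(x,y,z+1)];
                dst[idx(x,y,z)] = (src[idx(x,y,z)] + k * nb) / (1.0f + 6.0f * k);
            }
}
```

On a GPU this loop becomes one compute dispatch per step, so raw shader throughput directly caps how big a grid you can afford per frame; that's the sense in which GPGPU helps every console but pays off most on the stronger ones.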