
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

I think you're getting it confused, D-Joe. They are saying the PS4 will be easier to develop for than the PS3 was. And if the PS4 is more powerful, it has an advantage for game development over the next Xbox. Either way, both consoles should be much easier to develop for than last gen. But yeah, Edge said it would be easier to optimize for the PS4; that doesn't mean it's some kind of huge gap like 360/PS3 was.



Before the PS3 everyone was nice to me :(


I didn't. You guys know the difference, but that doesn't mean the media does.



AMD delayed GCN2 to 2014... so the consoles will use GCN (HD 7000) for sure now.

http://www.tomshardware.com/news/Radeon-Delay-GeForce-HD8000,20979.html



ethomaz said:
AMD delayed GCN2 to 2014... so the consoles will use GCN (HD 7000) for sure now.

http://www.tomshardware.com/news/Radeon-Delay-GeForce-HD8000,20979.html


Unless the consoles are delayed too, but we haven't heard anything solid on GCN2, and I'm not sure it would even be worth doing performance-, power-draw-, or cost-wise.



Before the PS3 everyone was nice to me :(

Wow, found an excellent post by Timothy Lottes, the chap who invented FXAA anti-aliasing at Nvidia, regarding the issues of using slower DDR3 in consoles for the GPU and trying to make up for it with eDRAM, instead of going with the full-bus-width GDDR5 setup used in traditional GPUs (and rumoured for PS4). I said earlier that you cannot overcome memory bandwidth limitations to the GPU by going the DDR3 + eDRAM route, because if you could, GPU makers would be doing it. Lottes explains it on a technical level:

Lottes seems fearful of Microsoft using a large amount of DDR3 memory because it might limit memory bandwidth. On this issue he says:

“On this platform I’d be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of “ESRAM” sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA using only 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else to ESRAM would require tiling and resolves like on the Xbox360 (which would likely be a DMA copy on 720) or attempting to use the slow DDR3 as a render target.

I’d bet most titles attempting deferred shading will be stuck at 720p with only poor post process AA (like FXAA). If this GPU is pre-GCN with a serious performance gap to PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.”
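Lottes' 32MB figure checks out with some quick back-of-envelope math. Here's a rough sketch in Python (my own arithmetic, assuming 32-bit color plus a 32-bit depth/stencil buffer per MSAA sample):

MB = 1024 * 1024

def framebuffer_mb(width, height, msaa, bytes_per_sample=4 + 4):
    # 32-bit color + 32-bit depth/stencil per MSAA sample (assumed)
    return width * height * msaa * bytes_per_sample / MB

print(framebuffer_mb(1920, 1080, 2))  # ~31.6 MB -> 1080p 2xMSAA just fits in 32MB
print(framebuffer_mb(1280, 720, 4))   # ~28.1 MB -> 720p 4xMSAA fits
print(framebuffer_mb(1920, 1080, 4))  # ~63.3 MB -> needs tiling/resolves or DDR3 render targets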


Lottes seems to be less concerned about the rumored specs of the next PlayStation. Most rumors have been pointing at the PS4 having less RAM than the next Xbox (4GB vs 8GB), but that may not matter if Sony uses GDDR5 memory instead of the DDR3 Microsoft is rumored to be considering, since GDDR5 would deliver far more memory bandwidth.
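For a sense of scale, here's a rough peak-bandwidth comparison in Python. The data rates and bus widths are illustrative guesses on my part, not confirmed specs:

def peak_bandwidth_gbs(data_rate_gts, bus_width_bits):
    # peak GB/s = effective transfers per second * bytes moved per transfer
    return data_rate_gts * bus_width_bits / 8

print(peak_bandwidth_gbs(2.133, 256))  # DDR3-2133 on a 256-bit bus: ~68 GB/s
print(peak_bandwidth_gbs(5.5, 256))    # GDDR5 at 5.5 GT/s on a 256-bit bus: ~176 GB/s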

Lottes says:


“If PS4 has a real-time OS, with a libGCM style low level access to the GPU, then the PS4 1st party games will be years ahead of the PC simply because it opens up what is possible on the GPU. Note this won’t happen right away on launch, but once developers tool up for the platform, this will be the case.

As a PC guy who knows hardware to the metal, I spend most of my days in frustration knowing damn well what I could do with the hardware, but what I cannot do because Microsoft and IHVs won’t provide low-level GPU access in PC APIs. One simple example, drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API.”
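To put that draw-call claim in perspective, here's a hypothetical per-frame budget at 60 fps using Lottes' 10x-100x figure. The per-call costs below are my own assumptions, purely illustrative:

FRAME_US = 1e6 / 60  # ~16,667 microseconds of CPU time per frame at 60 fps

def max_draw_calls(overhead_us_per_call):
    return int(FRAME_US / overhead_us_per_call)

print(max_draw_calls(40))  # ~416 calls/frame if a PC API call costs ~40 us
print(max_draw_calls(1))   # ~16,666 calls/frame with ~1 us libGCM-style calls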

Source: http://gimmegimmegames.com/2013/01/creator-of-fxaa-gives-his-2-cents-on-the-power-of-the-next-playstation-and-xbox-consoles/?utm_source=zergnet.com&utm_medium=referral&utm_campaign=zergnet_42581

Looks like Sony is doing all the right things to fix the major issues of the PS3; I already linked other sources saying the PS4 will allow libGCM-style access to the metal of its hardware!





@BlueFalcon I think both consoles will use GCN.



ethomaz said:
@BlueFalcon I think both consoles will use GCN.

Ya, I agree, but we are getting some additional details regarding the compromises of the Xbox 720's DDR3 + eDRAM setup vs. the PS4's GDDR5. Also, it appears PS4's development tools will allow easy access to the 'metal' of the GPU hardware to extract maximum performance, without being obstructed the way the Xbox 720 appears to be by MS's API / tools (at least initially). I think Sony really took all the criticisms of the PS2/PS3 to heart this time. They are doing a full 180° turnaround in terms of console design -- spending more $$ on the GPU (instead of the CPU, as they did with the PS3), and making sure it's as easy to develop for as possible. Huge props, because on the PS3 they made it too difficult to develop / optimize for and spent $$ on the wrong components. A console's budget should be allocated more towards the GPU, since that's what renders the actual graphics, not the CPU. Adding faster GDDR5 memory to feed the GPU is already a win, since it's a lot closer to how high-end GPUs are designed on the PC.



BlueFalcon said:

Ya, I agree, but we are getting some additional details regarding the compromises of the Xbox 720's DDR3 + eDRAM setup vs. the PS4's GDDR5. Also, it appears PS4's development tools will allow easy access to the 'metal' of the GPU hardware to extract maximum performance, without being obstructed the way the Xbox 720 appears to be by MS's API / tools (at least initially). I think Sony really took all the criticisms of the PS2/PS3 to heart this time. They are doing a full 180° turnaround in terms of console design -- spending more $$ on the GPU (instead of the CPU, as they did with the PS3), and making sure it's as easy to develop for as possible. Huge props, because on the PS3 they made it too difficult to develop / optimize for and spent $$ on the wrong components. A console's budget should be allocated more towards the GPU, since that's what renders the actual graphics, not the CPU. Adding faster GDDR5 memory to feed the GPU is already a win, since it's a lot closer to how high-end GPUs are designed on the PC.

I agree... Orbis looks a lot closer to a PC than Durango... that's good, and easier for developers... no extra work needed to use special or fixed-function hardware.



Kotaku reports the next Xbox will ship with a 500GB hard drive, mandatory game installs to the hard drive, an always-on internet connection, and mandatory Kinect 2.0 integration in every Xbox 720. The hard drive is a mechanical one with maximum read speeds of 50 MB/sec.

The retail system specs appear not to have changed from what's been posted here. Notably, the shared DDR3 and the gimped SATA 2.0 connection for the hard drive remain. I was hoping at least one of those would be upgraded before launch. SATA 3.0 and the ability to swap in a 3rd-party hard drive would have been welcome, since SSD prices are sure to drop dramatically over the next 3-4 years, which would make a 512GB SSD swap a no-brainer for people who want to cut down on their load times.
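A rough load-time comparison in Python shows why the drive matters. The 4GB load size is hypothetical, with ~275 MB/s standing in for a typical SSD capped by SATA 2.0:

def load_seconds(size_gb, read_mb_per_s):
    return size_gb * 1024 / read_mb_per_s

print(load_seconds(4, 50))   # ~82 s from the stock 50 MB/s HDD
print(load_seconds(4, 275))  # ~15 s from an SSD, even behind SATA 2.0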

Here is the diagram outlining the differences between Kinect 1.0 and 2.0. 

Supposedly the retail specs of the console. Not seeing any changes from earlier diagrams linked here already.

http://kotaku.com/5982986/we-know-all-about-the-next-xbox-from-someone-who-says-theyve-got-one

The Data Move Engines could be used to accelerate certain graphical effects, but it remains to be seen how useful they will be.

Mandatory Kinect is a double-edged sword. On one hand, it gives developers a feature that's universal across all Xbox consoles, which would push for it to become more integrated into games. On the other hand, a lot of gamers simply do not want to pay for this feature or use it in games. With Kinect 2.0 more or less confirmed to be included in every next-gen Xbox, it would stand to reason that MS didn't have a lot of $ left to spend on a beefier GPU and the memory subsystem feeding it, without either taking larger losses on the hardware than MS is willing to accept or having to price the console much higher to compensate.



Interesting stuff, and mostly in line with what was theorized in the OP. I don't think I'll add any of its specifics since it doesn't really add anything, and we're probably less than a month away from a reveal anyway, where we'll surely see some of these details, especially the UI and the new Kinect.