
TRUE XB1 vs PS4 spec comparison

GribbleGrunger said:

From Gaf:

Xbone: 1.18 TF GPU (12 CUs) for games
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPS
Xbone: 2 ACE/ 16 queues

PS4: 1.84 TF GPU (18 CUs) for games +56%
PS4: 1152 Shaders +50%
PS4: 72 Texture units +50%
PS4: 32 ROPS +100%
PS4: 8 ACE/64 queues +300%

The games are going to be great on the X1 but you people really need to just accept it's weaker and move on.

Holy shit, I've never seen it written from a GPU-centric POV; the Xbone really is quite a bit weaker. MS really have their work cut out for them. Serves them right for focusing on TV, apps and Kinect, but who knows, maybe that's what the mass market will warm up to and MS will be laughing all the way to the bank.
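
For anyone wondering where those teraflop figures come from, it's just shader count x 2 FLOPs per clock x GPU clock. A rough sketch, assuming an ~800 MHz clock (which neither company had confirmed at the time):

```python
# Back-of-the-envelope check of the teraflop figures above, assuming GCN-style
# shaders doing one fused multiply-add (2 FLOPs) per clock and an ~800 MHz GPU
# clock; the clock is an assumption, not a confirmed spec.
def peak_tflops(shader_count: int, clock_ghz: float) -> float:
    """Peak single-precision throughput: shaders * 2 FLOPs/cycle * clock (GHz) / 1000."""
    return shader_count * 2 * clock_ghz / 1000.0

print(peak_tflops(1152, 0.8))  # PS4: ~1.84 TF, matches the quoted figure
print(peak_tflops(768, 0.8))   # Xbone: ~1.23 TF; the 1.18 TF in the quote
                               # presumably assumes a slightly different clock
                               # or some GPU time reserved for the OS
```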




Adinnieken said:

I think, though, in my original comment I acknowledged this. The real-world bandwidth that Microsoft has achieved is 70-80% of the theoretical maximum. And they state the reason why no system can achieve the theoretical maximum, not even Sony. I don't disagree that theoretical maximums aren't necessarily a figure we should use, but you can't say the PS4 will achieve 176GB/s and then say the Xbox One won't achieve its theoretical max when neither can reach it.

You are right, but the PS4 has a better chance of getting close to its peak bandwidth than the Xbone... in fact, the eSRAM + DDR3 setup has more bottlenecks than a single GDDR5 pool.
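
For context, the theoretical peaks everyone throws around fall straight out of bus width times transfer rate. A minimal sketch, assuming the commonly reported 256-bit buses and memory speeds (the eSRAM is a separate small pool and left out here):

```python
# Quick sketch of where the theoretical peaks come from, assuming 256-bit buses
# and the widely reported transfer rates (5.5 GT/s GDDR5, 2.133 GT/s DDR3).
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gts: float) -> float:
    """Theoretical peak GB/s = bus width in bytes * transfers per second (GT/s)."""
    return (bus_width_bits / 8) * transfer_rate_gts

for name, peak in [("PS4 GDDR5", peak_bandwidth_gbs(256, 5.5)),      # ~176 GB/s
                   ("Xbone DDR3", peak_bandwidth_gbs(256, 2.133))]:  # ~68 GB/s
    # Apply the 70-80% real-world range quoted above.
    print(f"{name}: {peak:.0f} GB/s peak, ~{0.7 * peak:.0f}-{0.8 * peak:.0f} GB/s realistic")
```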



MonstaMack said:

Truthfact: People bought 360s over the PS3 because of the graphics powah! /sarcasm

Truthfact: No one ultimately cares about graphics, look at Wii and 3DS sales compared to 360/PS3 and Vita sales.


True enough, though the interesting part of this scenario is that the PS4 is also the cheaper one, and price is most certainly part of why people buy one console over another, as well as power. It'll be interesting to see if Kinect 2.0 really hits it big and MS can still prevail.



...

ethomaz said:

Adinnieken said:

I think, though, in my original comment I acknowledged this. The real-world bandwidth that Microsoft has achieved is 70-80% of the theoretical maximum. And they state the reason why no system can achieve the theoretical maximum, not even Sony. I don't disagree that theoretical maximums aren't necessarily a figure we should use, but you can't say the PS4 will achieve 176GB/s and then say the Xbox One won't achieve its theoretical max when neither can reach it.

You are right, but the PS4 has a better chance of getting close to its peak bandwidth than the Xbone... in fact, the eSRAM + DDR3 setup has more bottlenecks than a single GDDR5 pool.

Why are you so worried about the eSRAM? It's 32 goddamn megabytes! It's so tiny that it's almost a non-factor in the system's bandwidth performance.



Adinnieken said:
kitler53 said:
you forgot to multiply all of xbone's stats by 3x for teh cloud.

The reality is we don't know what impact off-system computing will have.

"The Cloud" is nothing more than client/server computing, in some respects. To assume there isn't potential in that is to be ignorant of reality. After the PC was released, businesses started creating client/server applications where PCs were used essentially for the UI, and mainframes and UNIX servers did all the heavy processing. Eventually, with the advent of the web, you had lightweight applications that ran on the client and accessed a web server, which did all the heavy processing.

Heck, multiplayer games are nothing more than client/server computing.  The only difference is that the rendering takes place on the local client.  Regardless of whether it's a peer-to-peer or server-to-peer arrangement, one system does all the leg work in determining where everyone in the game is and how they're all meshing together. 

People like yourself may like to play down the impact off-system computing will have in the cloud, but the reality is it can be extremely powerful. We are beginning to run into a wall in terms of computing. They've developed single-atom circuits; they can't get any smaller than single atoms. Unless computing moves away from silicon, we will eventually hit a wall: devices won't get smaller, computing power won't grow exponentially every few years, and the only way to get more power into a machine will be to either build a bigger box or offload computational work.

Quantum computing is still decades away. Other materials that have better cooling properties than silicon and would allow smaller dies are still being developed. Even then, they won't be able to offer a circuit size smaller than the atomic level; they'll only allow engineers to develop dies with less space between circuits that still run cool, not to mention operate without shorting out.

So yeah. If for cost reasons you can't increase the size of your device, then in order to expand what it can do you will need to offload some of the computation onto servers. For consumers, that's the cloud. Businesses have the option of in-house servers as well as cloud-based servers.
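
To make the idea concrete, here's a minimal sketch of that offloading pattern, assuming a hypothetical server endpoint; nothing here is Microsoft's actual cloud API, it's just the generic client/server round-trip being described:

```python
# Minimal sketch of "the cloud" as plain client/server offloading. The endpoint
# and payload are hypothetical; this is NOT how Xbox Live Compute actually works,
# just the general pattern: the console keeps rendering locally and ships
# latency-tolerant work (persistent world state, background AI) to a server.
import json
import urllib.request

CLOUD_ENDPOINT = "http://example.com/world-sim"  # hypothetical server URL

def offload_world_update(world_state: dict) -> dict:
    """POST non-latency-critical work to a remote server and return its result."""
    payload = json.dumps(world_state).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=2.0) as response:
        return json.loads(response.read().decode("utf-8"))
```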


don't downplay the cloud.

"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."  ~microsoft.



Adinnieken said:

Quantum computing is still decades away. 

Sure?

http://www.wired.com/wiredenterprise/2013/06/d-wave-quantum-computer-usc/



kitler53 said:

don't downplay the cloud.

"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."  ~microsoft.

I realize you're trying to make a point, but I fail to see it.

If I make 300,000 servers available to developers, servers whose technical specifications we have no clue about, each server may offer 3x the power of an Xbox One.

I'm sorry, I worked in an environment where we routinely used client/server applications over HTTP. These weren't systems that spoke to the servers through a single hop; these were systems that encountered latency and existed in remote environments. The company had tried putting more power into those remote locations, which it could do at greater expense, but ultimately it relied on web-based applications.

We had applications running several million transactions per second on servers less powerful than a smartphone, and they were responsive as well. I'm sorry, but I've seen it. I've done it. I know it's possible. What I haven't seen firsthand is off-system rendering, but Nvidia's example on off-the-shelf hardware was impressive.

Not to mention, what do you really think most of your smartphone apps are? Do you honestly think a GPS application runs 100% locally? Unless you actually have a dedicated GPS unit, GPS applications rely on remote systems to provide your route data and directions.



dsgrue3 said:
Adinnieken said:

Quantum computing is still decades away. 

Sure?

http://www.wired.com/wiredenterprise/2013/06/d-wave-quantum-computer-usc/

Yep... even in spite of that article. Right now it's a bit like the ENIAC was back in the 1940s: a technological marvel, but it didn't do much, mostly research and learning.

It will be at least two decades before consumers see anything near them that uses quantum computing.



Adinnieken said:
dsgrue3 said:
Adinnieken said:

Quantum computing is still decades away. 

Sure?

http://www.wired.com/wiredenterprise/2013/06/d-wave-quantum-computer-usc/

Yep... even in spite of that article. Right now it's a bit like the ENIAC was back in the 1940s: a technological marvel, but it didn't do much, mostly research and learning.

It will be at least two decades before consumers see anything near them that uses quantum computing.

Very pessimistic take. The pace of technology is exponential; I'd be surprised if quantum computers weren't on the public market within a decade.

PS: Most smartphones have a GPS receiver.



dsgrue3 said:
Adinnieken said:
dsgrue3 said:
Adinnieken said:

Quantum computing is still decades away. 

Sure?

http://www.wired.com/wiredenterprise/2013/06/d-wave-quantum-computer-usc/

Yep... even in spite of that article. Right now it's a bit like the ENIAC was back in the 1940s: a technological marvel, but it didn't do much, mostly research and learning.

It will be at least two decades before consumers see anything near them that uses quantum computing.

Very pessimistic take. The pace of technology is exponential; I'd be surprised if quantum computers weren't on the public market within a decade.

PS: Most smartphones have a GPS receiver.

I doubt that. They're currently efficient only for a small subset of tasks, and it'll be difficult to leverage their power for most people.

In 10 years it'll still be in the domain of academia, I think.