
Forums - Microsoft Discussion - TRUE XB1 vs PS4 spec comparison

My husband is a developer working on a second-wave game for both consoles, and he would like me to pass along this message to you.

Audio processing does not eat up "an entire core worth of resource".
Additionally :
- full surround audio processing can easily be handled on a compute unit and passed along to the audio processor; a full library to support audio processing in compute is present in the SDK and has been for the past five revisions.
- ROPs "having little impact in the real world" is also wrong; they boost the ability to handle post-processing, depth of field, AA and, if ever needed, supersampling.

Of course, you're going to try and counter with your Google smarts, and I'm not going to bother replying, because as is evident from your conduct in this entire thread, you have already made up your mind, and nobody else, however more accurate, is going to mean a thing.

I'll just quietly take pleasure in knowing the true reality.



OdinHades said:
I don't think all that bandwidth gibberish will bring any major changes. At least when I upgrade my PC with a better graphics card, the differences are massive. But if I buy some faster RAM, I don't see any difference at all.

I just don't see how that tiny eSRAM should make any difference. Sure, it's not a bad solution, but neither is GDDR5. It's both good enough. But the PS4 still has the better GPU. I mean, come on. Even if the Xbone might have the upper hand in some weird areas nobody gives a crap about, overall the PS4 is still faster. Does that simple fact really hurt so much?

I know that I didn't care that the Xbox was faster than my GameCube back in the day. They were on the same level, and that's all that matters to me. I couldn't care less about a few more fps, a higher resolution or some details I will never look at anyway. Sure, I'm buying the PS4, but certainly not because of the hardware. It's all about the games, and Sony just has more games that interest me.

Man, I'm beginning to understand Ninty fans. Just screw all that hardware crap, get a console everyone knows is slower and just play some games. Memory bandwidth, dGPU, secret sauce, NDA, what the hell. You're not stopping until every person on the internet says the Xbox One is faster, are you? I'll ask again: why is it so goddamn important to you?

There's no use in trying to bicker with him. He won't listen to reason when he himself understands absolutely nothing about hardware. We all know that tiny-ass eSRAM won't fix the bandwidth issue, but that doesn't matter since the Xbone already has the weaker hardware to accommodate the lower bandwidth. I don't think the eSRAM is supposed to be used as main memory, just an extended form of cache. I mean, just exactly what are devs gonna do with 32 MB? Agreed on that last part about how hardware will mean less in trying to be a market leader.
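For scale, here's a quick back-of-envelope check on that 32 MB question. The resolutions and byte sizes are my own illustrative assumptions (a common 32-bit colour buffer plus a same-size depth/stencil buffer), not anything from a spec sheet:

```python
# Rough check of how much of a 32 MB scratchpad a 1080p render target uses.
def target_bytes(width, height, bytes_per_pixel):
    """Size of one render target in bytes."""
    return width * height * bytes_per_pixel

ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB on-chip memory

# One 32-bit (4 bytes/pixel) colour buffer at 1920x1080:
color_1080p = target_bytes(1920, 1080, 4)

print(color_1080p / (1024 * 1024))       # ~7.9 MB for the colour buffer alone
print((2 * color_1080p) / ESRAM_BYTES)   # colour + depth: roughly half the 32 MB
```

So a single 1080p colour target is already about a quarter of the eSRAM, which is why devs would treat it as a small working set for the busiest buffers rather than as main memory.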



Frequency said:
My husband is a developer working on a second-wave game for both consoles, and he would like me to pass along this message to you.

Audio processing does not eat up "an entire core worth of resource".
Additionally :
- full surround audio processing can easily be handled on a compute unit and passed along to the audio processor; a full library to support audio processing in compute is present in the SDK and has been for the past five revisions.
- ROPs "having little impact in the real world" is also wrong; they boost the ability to handle post-processing, depth of field, AA and, if ever needed, supersampling.

Of course, you're going to try and counter with your Google smarts, and I'm not going to bother replying, because as is evident from your conduct in this entire thread, you have already made up your mind, and nobody else, however more accurate, is going to mean a thing.

I'll just quietly take pleasure in knowing the true reality.

Having more ROPs will lead to a higher framerate and resolution too. :P I mean, just look at those BF4 rumours going around.



This thread is hilarious, and that fallen poster seems to be trying to justify to himself that the X1 is worth purchasing!
Lmfao!
No point in arguing with a brick wall!



Frequency said:
My husband is a developer working on a second-wave game for both consoles, and he would like me to pass along this message to you.

Audio processing does not eat up "an entire core worth of resource".
Additionally :
- full surround audio processing can easily be handled on a compute unit and passed along to the audio processor; a full library to support audio processing in compute is present in the SDK and has been for the past five revisions.
- ROPs "having little impact in the real world" is also wrong; they boost the ability to handle post-processing, depth of field, AA and, if ever needed, supersampling.

Of course, you're going to try and counter with your Google smarts, and I'm not going to bother replying, because as is evident from your conduct in this entire thread, you have already made up your mind, and nobody else, however more accurate, is going to mean a thing.

I'll just quietly take pleasure in knowing the true reality.

That sounds reasonable enough, although "he said, she said" could be the case without proof. I'm skeptical of the audio-related comment (can you point me to any PC games doing audio on the GPU?), but I'll give it a "shrug".

 

And my source (basically just the guy who helped design SHAPE on B3D) stated it was around a core, IIRC. BTW, he also confirmed the most powerful parts of SHAPE are reserved by Kinect, which saddens me, but what's left over is still roughly equivalent to a CPU core's worth of audio processing, according to him.

Sure, more ROPs are better. The question is whether 32 ROPs are twice as good (or 188% as good, clock-adjusted) as 16 ROPs for the average console game running at 1080p or below. I am sure the answer is "not even close".

I just didn't want people to be fooled by the huge discrepancy in that number. It's not likely to matter nearly as much as +88% suggests.
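For anyone wondering where the "188%, clock-adjusted" figure comes from, here's the arithmetic. The clocks (853 MHz X1, 800 MHz PS4) are the widely reported ones, so treat them as assumptions rather than official spec-sheet values:

```python
# Peak pixel fillrate scales with ROP count times GPU clock.
def fillrate_gpix(rops, clock_mhz):
    """Peak pixel fillrate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

x1  = fillrate_gpix(16, 853)   # X1:  16 ROPs at 853 MHz -> 13.648 Gpix/s
ps4 = fillrate_gpix(32, 800)   # PS4: 32 ROPs at 800 MHz -> 25.6 Gpix/s

print(round(ps4 / x1, 3))      # ~1.876, i.e. "188% as good", or +88%
```

That ratio is a theoretical peak; whether a game at 1080p or below ever comes close to being fillrate-bound is the real question.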

 

Can you ask your husband how the X1 performs versus the PS4 on multiplatform titles? I'd love to hear it.

 

From what I have heard from a guy who knows developers, the X1 is way behind before optimization, but after optimization the gap gets much smaller.

 

I'll just quietly take pleasure in knowing the true reality.

 

The true reality will be revealed by games, and those who paint the X1 as a weakling, or the PS4 as 60+% more powerful, are the ones who will be surprised. I think we all know that... or at least suspect it.



fallen said:

X1:

e) 1.75 GHz Jaguar (plus SHAPE audio chip, likely ~1 CPU core's worth of audio processing, plus cloud support, which we will ignore for now but which could offload things like AI from the X1 CPU in the future)

 

Reading shit again.

Counting cloud support as a hardware advantage is stupid. The PS4 and Wii U are connected to the internet too.



If the Xbone was truly more powerful, why haven't we seen a UE4 Elemental demo on it yet? (Or is it because the Xbone just isn't powerful enough to run it? I want no excuses at this point.)



I think we need to subtract 10% from the PS4's power for being an uncomfortable place



“It appeared that there had even been demonstrations to thank Big Brother for raising the chocolate ration to twenty grams a week. And only yesterday, he reflected, it had been announced that the ration was to be reduced to twenty grams a week. Was it possible that they could swallow that, after only twenty-four hours? Yes, they swallowed it.”

- George Orwell, ‘1984’

RenCutypoison said:
fallen said:

X1:

e) 1.75 GHz Jaguar (plus SHAPE audio chip, likely ~1 CPU core's worth of audio processing, plus cloud support, which we will ignore for now but which could offload things like AI from the X1 CPU in the future)

 

Reading shit again.

Counting cloud support as a hardware advantage is stupid. The PS4 and Wii U are connected to the internet too.

Can you point me to where Sony and (LOL) Nintendo execs mention their array of servers set up and offered to developers to help their consoles?

 

That's really the difference here. MS has their Azure cloud infrastructure ready to go, because it's part of their business. Someone will bring up Sony's Gaikai here, but that's completely different tech. It's more in line with what Microsoft calls "Rio" (a cloud streaming service, which they demonstrated by streaming Halo 4 to a Windows Phone).

 

I did not include the cloud in the comparison anyway; I simply mentioned it as an aside. We will see. But I also think it's silly to pretend it doesn't exist. I'd say the jury is out on how much it will help.

 

Even if it is simply used for dedicated servers for every multiplayer game, that alone saves CPU. I think AI might be another low-hanging fruit to offload from the local CPU, but again, without any proof yet, I don't include it in my spec comparison. The picture could change in 2-3 years, though.

 

 



fallen said:
RenCutypoison said:
fallen said:

X1:

e) 1.75 GHz Jaguar (plus SHAPE audio chip, likely ~1 CPU core's worth of audio processing, plus cloud support, which we will ignore for now but which could offload things like AI from the X1 CPU in the future)

 

Reading shit again.

Counting cloud support as a hardware advantage is stupid. The PS4 and Wii U are connected to the internet too.

Can you point me to where Sony and (LOL) Nintendo execs mention their array of servers set up and offered to developers to help their consoles?

 

That's really the difference here. MS has their Azure cloud infrastructure ready to go, because it's part of their business. Someone will bring up Sony's Gaikai here, but that's completely different tech. It's more in line with what Microsoft calls "Rio" (a cloud streaming service, which they demonstrated by streaming Halo 4 to a Windows Phone).

 

I did not include the cloud in the comparison anyway; I simply mentioned it as an aside. We will see. But I also think it's silly to pretend it doesn't exist. I'd say the jury is out on how much it will help.

 

Even if it is simply used for dedicated servers for every multiplayer game, that alone saves CPU. I think AI might be another low-hanging fruit to offload from the local CPU, but again, without any proof yet, I don't include it in my spec comparison. The picture could change in 2-3 years, though.

 

 

MS Azure won't do shit for graphics, just so you know. Otherwise they would be losing billions upon billions, and that would be the end of the Xbox division.