
Forums - Gaming - PS4 Pro Holding back Xbox One X???

taus90 said:

I wouldn't use the term "significant" for a slight bump in CPU clock speed, 2.1 GHz vs 2.3 GHz.

You do realise we went through the Pentium 4/NetBurst era, which showed the world that clock rates don't determine absolute performance, right?
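For scale, the clock figures being debated in this thread amount to a fairly small relative bump. A quick sketch (the 2.1 and 2.3 GHz values are taken from the post above, not from official spec sheets):

```python
# Relative CPU clock uplift between the two consoles, using the
# GHz figures quoted in the thread (treat them as approximate).
ps4_pro_cpu_ghz = 2.1
xbox_one_x_cpu_ghz = 2.3

uplift = (xbox_one_x_cpu_ghz - ps4_pro_cpu_ghz) / ps4_pro_cpu_ghz
print(f"Clock uplift: {uplift:.1%}")  # roughly 9.5%
```

A sub-10% clock delta on otherwise similar cores is the kind of difference both posters are arguing about.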





www.youtube.com/@Pemalite

Pemalite said:
taus90 said:

I wouldn't use the term "significant" for a slight bump in CPU clock speed, 2.1 GHz vs 2.3 GHz.

You do realise we went through the Pentium 4/NetBurst era, which showed the world that clock rates don't determine absolute performance, right?


What does it have to do with the post, other than proving my point? A slight increase in CPU clock speed doesn't improve theoretical performance in a closed-API environment, as simple as that. I'm not a hardware guy; I develop games and write mock code to test a console's performance ceiling. If people think the Xbox One X having a slightly higher clock speed than the Pro will give them 60 fps on a 4K buffer, they are kidding themselves. At best the Xbox One X will have 5 to 10 fps more if the target fidelity is on the level of the Pro.



taus90 said:
Pemalite said:

You do realise we went through the Pentium 4/NetBurst era, which showed the world that clock rates don't determine absolute performance, right?


What does it have to do with the post, other than proving my point? A slight increase in CPU clock speed doesn't improve theoretical performance in a closed-API environment, as simple as that. I'm not a hardware guy; I develop games and write mock code to test a console's performance ceiling. If people think the Xbox One X having a slightly higher clock speed than the Pro will give them 60 fps on a 4K buffer, they are kidding themselves. At best the Xbox One X will have 5 to 10 fps more if the target fidelity is on the level of the Pro.

It does the complete opposite of proving your point.

Have you forgotten all the improvements Microsoft has made in the name of efficiency? You can't take the clock rate of Scorpio's Jaguar cores and treat it as only a minor increase over the PlayStation 4's, as if clock rate were some kind of yardstick for gauging absolute performance. Because it's not.

Don't use clock rates as some kind of performance benchmark, just as flops aren't a true representation of a piece of hardware's performance either.





As Phil Spencer said, the PS4 Pro is competing with the Xbox One S, not the Xbox One X. Believe him... if you would.



Pemalite said:
taus90 said:

What does it have to do with the post, other than proving my point? A slight increase in CPU clock speed doesn't improve theoretical performance in a closed-API environment, as simple as that. I'm not a hardware guy; I develop games and write mock code to test a console's performance ceiling. If people think the Xbox One X having a slightly higher clock speed than the Pro will give them 60 fps on a 4K buffer, they are kidding themselves. At best the Xbox One X will have 5 to 10 fps more if the target fidelity is on the level of the Pro.

It does the complete opposite of proving your point.

Have you forgotten all the improvements Microsoft has made in the name of efficiency? You can't take the clock rate of Scorpio's Jaguar cores and treat it as only a minor increase over the PlayStation 4's, as if clock rate were some kind of yardstick for gauging absolute performance. Because it's not.

Don't use clock rates as some kind of performance benchmark, just as flops aren't a true representation of a piece of hardware's performance either.

Lol, the last optimization pack we got for our dev kit was back in November of 2015.

Yeah, sure, flops don't mean anything for general-purpose hardware and computing, but a game developer who writes low-level code and profiles graphics APIs won't even entertain you if you say clock speed and flops aren't important... flops, clock speed, and RAM bus all matter when chasing that elusive frame render budget.

So flops tell a developer what they can achieve on that specific hardware, designed for a specific task, which is gaming. No matter how much PC logic you apply to determining a piece of hardware's performance, it doesn't mean squat to developers developing games for that machine. The Xbox One X is a sports car with road-car tires.
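The "frame render budget" mentioned above is just the wall-clock time available per frame at a target frame rate, and the 1080p-to-4K jump is a fixed pixel-count multiplier. A minimal sketch of both:

```python
# Frame render budget: milliseconds available per frame at a target
# frame rate, plus the pixel-count jump from a 1080p buffer to a 4K one.

def frame_budget_ms(fps):
    return 1000.0 / fps

print(f"30 fps budget: {frame_budget_ms(30):.2f} ms")  # 33.33 ms
print(f"60 fps budget: {frame_budget_ms(60):.2f} ms")  # 16.67 ms

pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(f"4K draws {pixels_4k / pixels_1080p:.0f}x the pixels of 1080p")  # 4x
```

This is why "60 fps at 4K" is such a tall order: the budget per frame halves while the pixel work roughly quadruples.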



taus90 said:

Yeah, sure, flops don't mean anything for general-purpose hardware and computing, but a game developer who writes low-level code and profiles graphics APIs won't even entertain you if you say clock speed and flops aren't important... flops, clock speed, and RAM bus all matter when chasing that elusive frame render budget.

 

I never said flops weren't important. But using them alone as a determiner of absolute performance is highly inaccurate:
it's a theoretical number that real-world hardware often can't achieve.

For instance, a GPU with fewer flops can outperform a GPU with more flops. Need me to prove this?
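As an illustration of that claim, here is a back-of-the-envelope sketch using the commonly cited shader counts, ROP counts, and GPU clocks for the two consoles (treat these figures as assumptions, not measurements). Theoretical flops favour the Xbox One X, while pixel fillrate, driven by ROP count, goes the other way:

```python
# Theoretical single-precision FLOPS: shaders * 2 ops/cycle (FMA) * clock.
# Pixel fillrate: ROPs * clock. Figures below are the commonly cited
# specs for each console's GPU; treat them as assumptions.

def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

def gpixels(rops, clock_ghz):
    return rops * clock_ghz

gpus = {
    "PS4 Pro":    {"shaders": 2304, "rops": 64, "clock_ghz": 0.911},
    "Xbox One X": {"shaders": 2560, "rops": 32, "clock_ghz": 1.172},
}

for name, g in gpus.items():
    print(f"{name}: {tflops(g['shaders'], g['clock_ghz']):.1f} TFLOPS, "
          f"{gpixels(g['rops'], g['clock_ghz']):.1f} Gpixel/s fillrate")
```

Under these assumed specs the lower-flops GPU has the higher pixel fillrate, so a workload bound by fillrate rather than shader throughput could run better on the "weaker" machine; a single flops figure hides that.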

taus90 said:

So flops tell a developer what they can achieve on that specific hardware, designed for a specific task, which is gaming.

No, they don't. You know what does, though? Benchmarks, profiling, and testing, all done on dev kits. A flops figure doesn't mean a game can run at such-and-such a resolution.

And flops alone completely ignore the precision the floating-point operations are running at; they have nothing to do with integer performance, give you no idea of geometry performance, and tell you nothing of fillrate... and I can go on.

Megahertz and flops alone do not tell us what the hardware is capable of. They never have. Never will. And people should stop abusing those metrics.







taus90 said:

So flops tell a developer what they can achieve on that specific hardware, designed for a specific task, which is gaming.

No, they don't. You know what does, though? Benchmarks, profiling, and testing, all done on dev kits. A flops figure doesn't mean a game can run at such-and-such a resolution.

And flops alone completely ignore the precision the floating-point operations are running at; they have nothing to do with integer performance, give you no idea of geometry performance, and tell you nothing of fillrate... and I can go on.

Megahertz and flops alone do not tell us what the hardware is capable of. They never have. Never will. And people should stop abusing those metrics.

lol what!!!??

So you're saying flops aren't a barometer for me as a developer, and that I can render a 6 TFLOPS scene on a 4.2 TFLOPS GPU with almost the same CPU???

You do know developers code workarounds for those issues to get the performance... but hey, next time when I'm profiling and assembling 1080p/4K frame budgets for a new sequence, I'll remember to consider integer performance because you say so... I don't even know where and how to begin with that... and I'll also look into how geometry will affect... nope, I give up. Good luck with your PC logic. You enjoy your Kool-Aid and I'll enjoy mine. Thanks for the chuckles, though.



If anything, it's probably more about Sony's relationships with developers stopping the Xbox One X from getting exclusive content or noticeably better versions of games than about developers intentionally dumbing it down to keep the Pro from looking inferior.



Mystro-Sama said:
Honestly, the only thing holding back the X is a lack of exclusives.

True exclusives, that is. Of course, Microsoft wins on the technicality, because timed exclusivity counts as being a real exclusive.



A7XRayDog247 said:
If anything, it's probably more about Sony's relationships with developers stopping the Xbox One X from getting exclusive content or noticeably better versions of games than about developers intentionally dumbing it down to keep the Pro from looking inferior.

I don't know if this is the case. I think you should be more focused on Microsoft making games that look better on their platform. I mean, no one could fix Sony's bottleneck from last gen, so Sony made exclusives which displayed how powerful the PS3 was. Microsoft should do the same. The