
PS4 Pro is not powerful enough to run Destiny 2 at 60fps, says Bungie

WolfpackN64 said:
JRPGfan said:

I would say they're more like the CPUs you find in Ultrabooks.

Those weak, low-power CPUs from Intel.

Like an i5-5200U (the U part is the important part; it means it's one of those weak, low-power cores).

Despite how much crap people give the Jaguar CPU cores from AMD used in the consoles, they're probably better than any Atom CPU was.

The Core chips from Intel have a MUCH higher IPC. The Jaguar was certainly more performant than the Atom CPUs of yore, but the current Goldmont Atoms will be more performant.

In a console where you can dictate that everything should be multi-threaded, single-thread performance doesn't really matter.

So let's look at the IPC of both (multi-threaded, same clock speeds):

Back in 2012-2013, Ultrabooks were using CPUs like the i7-3537U (2c/4t, 2 GHz base, 3.1 GHz boost).

And let's find a Jaguar with a 2 GHz base:

Athlon 5350 (4c/4t, 2.05 GHz base, no boost).

Now we have two CPUs at roughly 2 GHz... let's see if the IPC (instructions per clock) really is vastly better when both run at 2 GHz.

Using a CPU benchmark that favors Intel (Cinebench R11.5 multi-threaded), but one that's easy to find in reviews:

Athlon 5350 scores 2.04 points.

i7-3537U scores 2.89 points.

That's a difference of about 40%.

It's definitely there, but it's not really massive.
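To make that arithmetic concrete, here's a quick back-of-the-envelope check of the "about 40%" figure. It only uses the Cinebench scores quoted above; the little script and the per-GHz normalisation are just illustrative, not extra benchmark data.

```python
# Sanity check on the "about 40%" figure, using only the scores quoted above.
# Assumes both chips ran near their ~2 GHz base clocks during the benchmark
# (the i7's 3.1 GHz boost is ignored, as in the comparison above).

athlon_5350_score = 2.04  # Cinebench R11.5 multi-threaded, 4c/4t @ 2.05 GHz
i7_3537u_score = 2.89     # Cinebench R11.5 multi-threaded, 2c/4t @ 2.0 GHz base

ratio = i7_3537u_score / athlon_5350_score
print(f"i7-3537U is ~{(ratio - 1) * 100:.0f}% faster multi-threaded")  # ~42%

# Normalised per GHz of base clock, the gap barely moves:
per_ghz_ratio = (i7_3537u_score / 2.0) / (athlon_5350_score / 2.05)
print(f"per GHz of base clock: ~{(per_ghz_ratio - 1) * 100:.0f}% faster")  # ~45%
```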



JRPGfan said:
WolfpackN64 said:

The Core chips from Intel have a MUCH higher IPC. The Jaguar was certainly more performant than the Atom CPUs of yore, but the current Goldmont Atoms will be more performant.

In a console where you can dictate that everything should be multi-threaded, single-thread performance doesn't really matter.

So let's look at the IPC of both (multi-threaded, same clock speeds):

Back in 2012-2013, Ultrabooks were using CPUs like the i7-3537U (2c/4t, 2 GHz base, 3.1 GHz boost).

And let's find a Jaguar with a 2 GHz base:

Athlon 5350 (4c/4t, 2.05 GHz base, no boost).

Now we have two CPUs at roughly 2 GHz... let's see if the IPC (instructions per clock) really is vastly better when both run at 2 GHz.

Using a CPU benchmark that favors Intel (Cinebench R11.5 multi-threaded), but one that's easy to find in reviews:

Athlon 5350 scores 2.04 points.

i7-3537U scores 2.89 points.

That's a difference of about 40%.

It's definitely there, but it's not really massive.

But the fact that more games make use of multiple threads doesn't mean single-thread performance is suddenly unimportant. Gaming workloads get divided over multiple cores, but there will always be a heavier load on the first few cores (and as technology progresses this will be flattened across even more cores), so the real-world gaming advantage of the Intel chip will be more than that 40% difference.
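A toy model of that point: if some slice of each frame is stuck on a single main thread, piling on more slow cores stops helping fairly quickly, and per-core speed sets the ceiling. All the millisecond figures below are invented purely for illustration; they aren't measurements of any real game.

```python
# Toy Amdahl's-law-style model of why single-thread speed still matters for games.
# The workload split and timings are assumptions for illustration, not real data.

def frame_time_ms(serial_ms, parallel_ms, cores, per_core_speed=1.0):
    """Frame time when serial_ms of work is stuck on the main thread and
    parallel_ms of work spreads perfectly across the available cores."""
    return (serial_ms + parallel_ms / cores) / per_core_speed

serial_ms, parallel_ms = 10.0, 30.0  # assume 10 ms main-thread work, 30 ms parallelisable work

# Many slow cores (Jaguar-like): more cores shrink the parallel part,
# but the 10 ms serial chunk never goes away.
for cores in (2, 4, 8):
    t = frame_time_ms(serial_ms, parallel_ms, cores)
    print(f"{cores} slow cores: {t:.1f} ms/frame (~{1000 / t:.0f} fps)")

# Fewer but faster cores: the serial chunk shrinks too.
t = frame_time_ms(serial_ms, parallel_ms, cores=4, per_core_speed=2.0)
print(f"4 cores at 2x single-thread speed: {t:.1f} ms/frame (~{1000 / t:.0f} fps)")
```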

Anyhow, Intel chips were out of the question for the Xbox One and PS4 anyway, since they would have had a pretty useless iGPU and would have cost too much.



...yet my 3+ year old gaming PC will run this at well over 60fps @ 1080p. I'll probably be at around 30fps @ 4K ultra settings, though.



Stop hating and start playing.

Well, that's what happens when you multiply your GPU power but keep the same piss-weak CPU that was underpowered in 2013.



Wait, what? An online multiplayer first-person shooter at 30fps in 2017 on the most powerful consoles? Is this a freaking joke? I thought Destiny 2 was going to be 60fps for sure, even on the base PS4... unbelievable.




Bungie made a choice to run a higher resolution (and other graphics stuff) at the expense of fps.

What I would like to hear / read one day is a proper breakdown of what would need to be sacrificed to achieve 60fps.

If achieving 60fps on PS4 and PS4 Pro would mean 900p and 1080p respectively, then I think a majority of gamers would probably prefer the resolution sacrifice to get the higher framerate. Not sure what the Xbox One situation would be to achieve 60fps - 760p? Would other sacrifices be needed? Lighting, particles, characters on screen, textures, V-sync? Where's the tipping point, and why is the fps sacrifice the better option?

I don't like them just saying it's not possible.
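For a rough sense of what those resolution drops actually buy on the GPU side, here's the bare pixel-count arithmetic for the resolutions floating around this thread. It's only a ratio of pixels; real GPU cost doesn't scale perfectly with pixel count, and it says nothing about the CPU side.

```python
# Pixel counts of the resolutions mentioned in this thread, relative to 1080p.
# Raw pixel ratios only -- a rough proxy for relative GPU load, nothing more.
resolutions = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / base_pixels:.2f}x the pixels of 1080p")

# e.g. 900p is ~0.69x and 720p ~0.44x of 1080p, which is roughly the GPU headroom a
# resolution drop frees up -- but only if the GPU, not the CPU, is the bottleneck.
```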



“The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.” - Bertrand Russell

"When the power of love overcomes the love of power, the world will know peace."

Jimi Hendrix

 

That would be an epic win for the Xbox S if it can run it at 60 fps.



 

 

binary solo said:
Bungie made a choice to run a higher resolution (and other graphics stuff) at the expense of fps.

What I would like to hear / read one day is a proper breakdown of what would need to be sacrificed to achieve 60fps.

If achieving 60fps on PS4 and PS4 Pro would mean 900p and 1080p respectively, then I think a majority of gamers would probably prefer the resolution sacrifice to get the higher framerate. Not sure what the Xbox One situation would be to achieve 60fps - 760p? Would other sacrifices be needed? Lighting, particles, characters on screen, textures, V-sync? Where's the tipping point, and why is the fps sacrifice the better option?

I don't like them just saying it's not possible.

It's not always a matter of just resolution; if your bottleneck is the CPU, then you can drop it to 480p and it still won't hit 60fps.
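A crude sketch of that bottleneck argument, with invented per-frame costs (these are not Destiny 2 measurements, just numbers picked to show the shape of the problem):

```python
# Crude model: a frame is done only when both the CPU and GPU work for it is done,
# and only the GPU cost scales with resolution. All timings are invented for illustration.

def fps(cpu_ms, gpu_ms_at_1080p, pixel_scale_vs_1080p):
    gpu_ms = gpu_ms_at_1080p * pixel_scale_vs_1080p  # GPU cost roughly tracks pixel count
    return 1000.0 / max(cpu_ms, gpu_ms)              # the slower side decides the framerate

cpu_ms = 28.0           # assumed per-frame CPU cost -> caps the game just above 30 fps
gpu_ms_at_1080p = 20.0  # assumed per-frame GPU cost at 1080p

for label, pixel_scale in [("1080p", 1.0), ("900p", 0.69), ("480p", 0.15)]:
    print(f"{label}: ~{fps(cpu_ms, gpu_ms_at_1080p, pixel_scale):.0f} fps")

# Dropping the resolution shrinks the GPU share, but the result stays pinned near ~36 fps
# because the CPU cost never gets cheaper -- exactly the point made above.
```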



It would be interesting to see if Scorpio will hit 60FPS. But I really don't see how they could implement 60FPS for one console without making it massively unfair for the player base playing on the weaker version of that console at 30FPS. That would be a technical nightmare for multiplayer.



 

Radek said:
I expect the PS4, PS4 Pro, Xbox One and even Scorpio will all run this game at 30 fps.

900p on Xbox One, 1080p on PS4, 1440p on PS4 Pro and 1800p or 4K on Scorpio.

The first game ran native 1080p on XB1, so I expect the same here, and 4K on Scorpio.