
No parity requirements on Scorpio versus Xbone


Pemalite said:
Angelv577 said:
Let's see what is going to happen, but I don't expect developers to choose 60fps in multiplayer for Scorpio while the base console stays at 30fps. I can see that in single player, which is what the PS4 Pro has tried to do in some games.

PC has no issues even if one gamer is at 20fps and another gamer is at 200fps.

I don't think Scorpio should have issues either. I just think developers will choose to keep parity anyway. Let's see what happens.



Azzanation said:

Erm.. PS3 says otherwise.

The PS3's GPU didn't have access to a tessellation unit. The Xbox 360 did via its TruForm-based geometry unit, but that relied on N-patches and could not hold a candle to what we have today.
Particles are bigger and more plentiful today as well. HBAO+ was usually mimicked and "baked" into the texture work last gen, which was free from a performance perspective but also required more development time. Even this generation of consoles struggles to use the best-quality ambient occlusion in games.

eva01beserk said:

I don't ever play multiplayer, so I would like to know: how big a difference could there be between a game at 30fps and one at 60fps competitively? Is it big enough for devs to say, "Hey, we need to separate X1 and Scorpio players because it's unfair"? If so, would they then need to run separate servers?

I would be lying if I said there wasn't a difference between 30 and 60fps competitively, because there is a difference.
How big exactly? That depends on the game and even on what role you are playing.

For instance, in Overwatch it's fine to play a character that doesn't rely on firing precision, like Mercy, Symmetra or Torbjorn. But a character like Widowmaker, Soldier: 76 or McCree really does perform better at 60fps.

But if you are playing competitively or professionally, then your approach to the game should be more than just a casual one, so you will tend to gravitate to the platform that performs best anyway. Like Scorpio.
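
To put rough numbers on that: at 30fps each frame takes about 33ms, versus about 17ms at 60fps, so the 60fps player's inputs get sampled and shown on screen noticeably sooner. A quick back-of-the-envelope sketch (illustrative only; real input lag also depends on the engine, display and network):

# Rough frame-time math for the 30 vs 60 fps gap (illustrative only).
def frame_time_ms(fps):
    """Time budget for a single frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# At 60fps the game samples input and refreshes the picture twice as often,
# which is where the aiming advantage for precision characters comes from.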

SvennoJ said:

Yeah, I was just thinking that. The original Xbox had a good CPU too, as did the Xbox 360; heck, even the PS2 was no slouch. It's not easy to emulate Xbox 360 games even though the architecture isn't all that different. It's this gen that introduced rather weak CPUs paired with much more powerful GPUs. And true, the PS3 originally wasn't even meant to have a GPU; it was supposed to use two Cell processors instead.

Xbox CPU: 3 GFLOPS
PS2 CPU: 6.2 GFLOPS

Xbox 360 CPU: 115 GFLOPS
PS3 CPU: 230 GFLOPS

Xbox One CPU: 112 GFLOPS (147 Scorpio)
PS4 CPU: 102 GFLOPS (136 Pro)

They're different processors of course, yet this gen wasn't any real step forward CPU-wise. Of course, last-gen consoles were sold at a loss and engineered to be close to the cutting edge, while this-gen consoles had to be cheaper and sold at around cost.
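
For reference, the gen-8 numbers in that list are just peak-throughput arithmetic: cores × clock × FLOPs per cycle. A rough sketch, assuming 8 single-precision FLOPs per cycle per Jaguar core (peak figures, not measured game performance):

# Back-of-the-envelope peak GFLOPS for the gen-8 CPUs quoted above.
# Assumes 8 single-precision FLOPs per cycle per Jaguar core.
def peak_gflops(cores, clock_ghz, flops_per_cycle=8):
    return cores * clock_ghz * flops_per_cycle

consoles = {
    "Xbox One (1.75 GHz)": (8, 1.75),
    "Scorpio (2.3 GHz)":   (8, 2.30),
    "PS4 (1.6 GHz)":       (8, 1.60),
    "PS4 Pro (2.13 GHz)":  (8, 2.13),
}

for name, (cores, clock) in consoles.items():
    print(f"{name}: ~{peak_gflops(cores, clock):.0f} GFLOPS peak")

# Xbox One ~112, Scorpio ~147, PS4 ~102, PS4 Pro ~136, matching the list above.

Those are peak figures only, which is exactly the problem: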

Using flops in that context doesn't tell us the performance of those chips.
Think about it: if the Cell had 230+ GFLOPS of real-world performance and the GPU had 400+ GFLOPS...
Then why does the Switch, which clearly has a CPU with fewer flops and a GPU with fewer flops, have games that look significantly superior? Think about it. Please. And stop using flops in the context you are using them in.

********

The original Xbox's CPU was already trending towards the low-end.
It had a Coppermine-based core with half the cache.

Intel had already moved on to Tualatin by the time the Xbox released, with CPUs at almost twice the clock speed (1.4GHz versus 733MHz),
4x the amount of cache, better prefetching, etc.
Intel also had Willamette Pentium 4 chips operating at 2GHz+.

And AMD had its Palomino-based Athlon XPs on the market with PR ratings of 2100+, based on the amazing K7 core; that was the fastest chip at the time.

So whilst the original Xbox was beastly for a console, it paled in comparison to what the PC was offering in 2001, let alone in 2004.

******

As for Cell: the only time that CPU showed any kind of decent performance was when using iterative-refinement floating point. Otherwise it was terrible. It performed terribly. It was terrible to program for.
The Cell was never going to be a GPU replacement either. It doesn't have even a hint of the silicon appropriate for taking over a GPU's job, nor does it have even a fraction of the performance for it.

The Cell processor was ~234 million transistors on a 221mm2 die at 90nm SOI.

That's roughly comparable to Intel's Pentium D, which released before the Cell.

However, Sony had to disable an SPE for yield reasons. It would likely have been cheaper to manufacture than the Pentium D for that very reason.

Can't forget that Core 2 released around the time of the PlayStation 3 either... and that was a superior CPU uArch.

The Cell was also shit at integers. Games and game engines don't just use floating point, you know.


*******

The Xbox 360's CPU isn't anything special either. Sure, three cores operating at 3.2GHz all those years ago might have sounded flashy to the uneducated, but it's an in-order design, like the original Intel Atom in phones and tablets.
It had a long pipeline, not much in the way of prefetching or branch prediction, and only an 8-way associative cache, with that cache running at only half the CPU clock rate, whereas in most x86 CPUs since the Pentium 3 era the cache runs at the same speed as the CPU clock.

PowerPC hasn't really been a contender against x86 for a very, very long time. IBM tends to go for extremely wide core designs, which appeal to professional markets; for everyone else, it's not ideal.

... And I could go on. But I think I have made my point.



--::{PC Gaming Master Race}::--

LudicrousSpeed said:
twintail said:

This quote claims 'Sony says'. Yet the author's own article has zero proof that Sony said anything. In fact, they have a quote from an ND programmer, who the author claims was speaking for Sony. But again, nothing in the article backs up this claim. In fact, there is zero indication of what the programmer was even talking about.

Later tweets from the same programmer actually seem to suggest that he was talking about UC4, which makes sense: the original PS4 version of UC4 MP is 60fps, so there obviously are no gains in the PS4 Pro version.

DigitalFoundry and the official Sony guide to the PS4 Pro both claim that frame rates can be increased.

https://blog.eu.playstation.com/2016/09/08/ps4-pro-the-ultimate-faq/ - 'render higher'

http://www.eurogamer.net/articles/digitalfoundry-2016-how-playstation-4k-neo-and-the-original-ps4-will-co-exist - 'meet or exceed'

There is no backtracking. Devs and publishers are choosing what to do themselves.

Here is another source for you: https://www.videogamer.com/news/multiplayer-games-will-not-run-at-a-faster-frame-rate-on-ps4- which is all the evidence I need to support my post. Not to mention reality is also on my side: to date we have not seen a single game target a higher frame rate in MP. Again, if I'm wrong, just name the game. If this is as obvious as you make it seem, then you should easily be able to whip out a list of games that prove me wrong, or concrete info Sony put out that actually says developers can give us higher frame rate settings in MP.

You seem hung up on the idea that I am saying this is a Sony policy. If it makes you feel better, I could just as easily say the Pro is not powerful enough to give us higher frame rate options, instead of pointing to Sony's parity policy, which numerous websites confirm exists and which I've only ever seen one person deny :)

They're not huge multiplayer games, but both Dark Souls 3 and Steep run at much higher framerates than their original PS4 versions, both of which target 30fps. Granted, Steep is all about co-op, so it's not a competitive game, but Dark Souls 3 definitely has PvP. Going through the list of Pro-supported games, there are very few big multiplayer titles that don't already target 60fps: basically the Ubisoft games (For Honor, Ghost Recon and The Division), and I don't think the CPU is a big enough bump to get 60fps in those titles, especially Ghost Recon, which drops frames quite a bit on the original PS4 version. You may be right and the policy may be in place, but it may not be. There are so few games that would actually put it into practice that it's difficult to tell.



Hiku said:

So for multiplayer, how could this work? What if a player on Xbox One playing COD at 30fps gets matched up against a player on Scorpio at 60fps...?

COD offers 60fps MP, and so do many MP games. COD on Scorpio will add a 4K mode and probably better textures and effects.

My guess is Scorpio won't boost the frame rate at all; however, it will most likely iron out the dips and boost only the resolution (4K) and effects. Maybe even add newer effects that can't be done on the older hardware.

To keep it fair, I believe they will keep frame rates the same for those big competitive games.

Also remember that PC multiplayer games all run at different frame rates too. It all depends on the owner's PC. It won't be anything new if Scorpio holds a frame rate advantage.
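
On the "separate servers" question raised earlier in the thread: if a developer did allow different frame rates and still wanted fair matches, it wouldn't need separate servers, just an extra matchmaking filter. A purely hypothetical sketch (none of these names reflect any real COD or Xbox Live API):

# Hypothetical sketch: split (or don't split) a matchmaking queue by frame-rate cap.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    platform: str   # e.g. "XB1" or "Scorpio" (assumed labels)
    fps_cap: int    # 30 or 60

def build_pools(queue, enforce_parity):
    """Group queued players into matchmaking pools."""
    pools = defaultdict(list)
    for p in queue:
        # With a parity rule everyone lands in one pool;
        # without it, players are split by their frame-rate cap.
        key = "all" if enforce_parity else f"{p.fps_cap}fps"
        pools[key].append(p.name)
    return dict(pools)

queue = [Player("player_a", "XB1", 30), Player("player_b", "Scorpio", 60)]
print(build_pools(queue, enforce_parity=True))   # {'all': ['player_a', 'player_b']}
print(build_pools(queue, enforce_parity=False))  # {'30fps': ['player_a'], '60fps': ['player_b']}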



Guy 'a' plays COD at 30fps; guy 'b' plays COD at 60fps. Guy 'a' isn't happy that guy 'b' has the advantage and does one of two things: drops his XB1 entirely, and all his friends with it, or buys a Scorpio in order to continue playing with his friends.

That's what MS are counting on.



 

The PS5 Exists. 


GribbleGrunger said:
Guy 'a' plays COD at 30fps; guy 'b' plays COD at 60fps. Guy 'a' isn't happy that guy 'b' has the advantage and does one of two things: drops his XB1 entirely, and all his friends with it, or buys a Scorpio in order to continue playing with his friends.

That's what MS are counting on.

Except for one small problem: Scorpio is a powerful machine, but not powerful enough to hit both 4K and 60 frames. And considering the Scorpio is advertised as a 4K machine, its main focus is running 4K, not 60 frames.

So, big disagree with your post.



We're expecting parity requirements? I mean... it's unlikely that a dev is going to deliberately screw up their own multiplayer base, but there has never been any indication that there would be parity rules.



Bet with Adamblaziken:

I bet that on launch the Nintendo Switch will have no built in in-game voice chat. He bets that it will. The winner gets six months of avatar control over the other user.

Doesn't the PS4 Pro's boost mode also allow higher fps? Or is that feature locked in multiplayer?



Conina said:
Doesn't the PS4 Pro's boost mode also allow higher fps? Or is that feature locked in multiplayer?

Boost mode will just help increase or smooth out framerates towards the existing target. Star Wars Battlefront, for instance, plays at a higher framerate under boost mode, but both it and the OG PS4 version target 60fps. What we're wondering is whether Sony has asked developers not to allow in multiplayer games the kind of huge framerate difference (i.e. games targeting 30fps on the OG PS4 targeting 60fps on Pro) that some of the singleplayer games have been getting (Snake Pass, RotR, The Surge, for example).

I'm not certain, to be honest. Such a clause was originally in Sony's blue papers, but those were from quite a while before the system's release and the policies may have changed. Since Sony doesn't comment, it's very difficult to tell. There are a few multiplayer games that feature much higher framerates (Dark Souls 3 and Steep both have unlocked framerates where the OG PS4 targets 30), and the few multiplayer games that would actually benefit from this feature are likely as hampered by the Pro's CPU as by any policy Sony might employ. It's still relatively rare for single-player games to receive a huge frame rate boost, after all, and we know there are no parity restrictions on them. Most multiplayer games target 60fps these days, save for some of the Ubisoft games like For Honor or Ghost Recon, and most notably Destiny 2, so it's difficult to tell whether it might be a policy choice or CPU power. Perhaps it will be more telling when some of the bigger games that will likely target 30fps on the original PS4, like Far Cry 5 or Red Dead 2, come out.
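
For what it's worth, "targeting" 30 or 60fps versus an "unlocked" framerate mostly comes down to whether the game sleeps out the rest of its frame budget. A minimal frame-limiter sketch, with made-up stand-in functions (this is not any real console SDK code):

# Minimal frame-limiter sketch: do the frame's work, then sleep out the rest
# of the budget. A higher cap (or no cap) only helps if the work actually fits.
import time

def simulate_and_render():
    time.sleep(0.010)  # pretend one frame costs 10 ms of CPU/GPU work

def run(frame_cap_fps, frames=5):
    budget = None if frame_cap_fps is None else 1.0 / frame_cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        simulate_and_render()
        if budget is not None:
            left = budget - (time.perf_counter() - start)
            if left > 0:
                time.sleep(left)  # hold the target framerate

run(30)    # base PS4: 33.3 ms budget per frame
run(60)    # Pro: 16.7 ms budget, only reachable if the frame's work fits
run(None)  # "unlocked", as in Dark Souls 3 or Steep on Pro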



Pemalite said:

Yes, I said they're different processors, but it's no secret that emulating 360 games is tricky. Unless MS is deliberately pacing BC games to gain more attention? :)

Anyway, how would you rate the increase in CPU capabilities? From Gen 6 to Gen 7 there is a theoretical max performance increase of about 38x. How much did the move to PowerPC limit that in practice? And how much more capable would you rate Gen 8 CPUs over Gen 7, keeping in mind only 6 cores (and now a bit of the 7th) are used for gaming?

But true, the PS3 was a beast at Folding@home. Not so much for games, although it still managed to surpass the 360 in first-party titles.
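
For the record, that 38x comes straight from the peak GFLOPS figures I listed earlier (peak flops only, which as you pointed out says little about real-world per-core performance):

# Generational peak-GFLOPS ratios, using the figures quoted earlier in the thread.
gen6 = {"Xbox": 3, "PS2": 6.2}
gen7 = {"Xbox 360": 115, "PS3": 230}
gen8 = {"Xbox One": 112, "PS4": 102}

print(f"Gen 6 -> Gen 7, Xbox: {gen7['Xbox 360'] / gen6['Xbox']:.0f}x")      # ~38x
print(f"Gen 6 -> Gen 7, PS:   {gen7['PS3'] / gen6['PS2']:.0f}x")            # ~37x
print(f"Gen 7 -> Gen 8, Xbox: {gen8['Xbox One'] / gen7['Xbox 360']:.2f}x")  # ~0.97x
print(f"Gen 7 -> Gen 8, PS:   {gen8['PS4'] / gen7['PS3']:.2f}x")            # ~0.44x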