
Switch TFLOPS: 1,024? 0,79? 0,393?

SKMBlake said:
The whole TFLOP discussion is pointless; the Tegra X1 doesn't work the same way as the AMD CPUs/GPUs in the PS4 and Xbox One. The Switch is way more powerful than what you'd expect from a 2015 mobile chip, but not powerful enough to match what we expect from this gen's home consoles. It's a 900p console; it can do better (MK8) and worse (ARK), but it can run current-gen games with good optimisation (Doom, Wolfenstein II, Warframe, Rocket League, South Park 2, Crash Bandicoot, DB FighterZ, NBA 2K)

Don't forget one of the most impressive conversions of the lot, Outlast II, which manages to look strikingly close to the PS4 version while still running at 1008p. It may not be as demanding a game as Doom or Wolfenstein II, but it's still very much a current-gen title in terms of its rendering tech, and so far I'd say it's the best-looking realistic-styled game on Switch.



Mr Puggsly said:

People often make that argument but I feel aspects like better physics, AI and world interactivity have more to do with the game design. Hence, developers can do plenty with the Jaguar CPUs in that regard but just don't bother. A lot of those aspects still feel like last gen content because that's just the way games are generally made.

Jaguar is pegged pretty hard as it is... I don't think people really comprehend how nasty those cores truly are.
It would take all eight of them to even match a low-clocked Intel or Ryzen dual core.

Mr Puggsly said:

I mean, look at Just Cause 4; that's a real showcase of what the Jaguar CPU is capable of.

Yeah, but the particle effects from explosions were dialed back in Just Cause 4 because of the physics... There is a reason even the Xbox One X struggled with Just Cause 3.
And even on the Xbox One X, despite its CPU and GPU advantage, it still drops frames.

Mr Puggsly said:

The X1X has 30% more CPU power and it appears to make a fairly big difference. So doubling the Jaguar's CPU power would be significant.

30% more compared to what? Because the CPU performance gap between the base Xbox One and Xbox One X is certainly larger than the 31% that the clock speed alone would imply. (Remember, certain CPU tasks are offloaded to the GPU's command processor on the Xbox One X.)
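For what it's worth, that 31% falls straight out of the published Jaguar clocks; a trivial sanity check:

```python
# Published Jaguar CPU clocks: 1.75 GHz (Xbox One) vs 2.3 GHz (Xbox One X).
base_clock_ghz = 1.75
x1x_clock_ghz = 2.3

uplift = (x1x_clock_ghz / base_clock_ghz - 1) * 100
print(f"Clock-for-clock uplift: {uplift:.0f}%")  # -> 31%
```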

Double the performance of Jaguar is still a very low ceiling... Remember, Jaguar was AMD's slowest CPU at a time when AMD's CPUs were the worst (relative to the competition) in the company's entire history.
Being such a small, narrow CPU core just doesn't give it much in the way of legs.

Mr Puggsly said:

I don't think there is a CPU/GPU imbalance on the PS4 and X1. They're designed to run technically demanding games at 30 fps. 60 fps is possible if developers build around it, and games can still look impressive at 60 fps.

It's certainly there. It's just not as noticeable as, say, on the PC, because games tend to be cut back in various areas to run better on the Jaguars.
But lots of games on the 8th-gen consoles have issues with sub-30fps performance, poor frame pacing and so on... and the bulk of that is attributable to the Jaguars being pretty terrible.

Mr Puggsly said:

Hypothetically, if the Switch had an upgrade that simply doubled GPU power, it would make a world of difference for resolution and performance in some games. Games like Wolfenstein II and Doom would actually stay at 720p!

Indeed. A Pascal-based Tegra could, in theory, offer that at the same power level.

Nintendo has a lot of options for where it can take the Switch going forward, so I'm interested to see what they do with the platform from a hardware perspective.




--::{PC Gaming Master Race}::--

Pemalite said:


I understand that, but in practice developers have done great things with these Jaguar CPUs. Frankly, developers seem to struggle with GPU bottlenecks more often, which is why dynamic resolution has become so common.

The minor performance drops in Just Cause 4 are likely optimization issues or a GPU bottleneck. I mean, the PS4 version is essentially running the same CPU with a stable 30 fps. Either way, it's an open-world game with AAA visuals and heavy use of physics; that's exactly the kind of experience people feel the Jaguar CPUs can't handle well.

Even if they're among the worst processors AMD had at the time, they've proven more capable than they're given credit for. I'm sure developers have to work harder to squeeze the most out of those CPUs, but that's generally how making console games goes.

A lot of games also had frame rate issues in the 7th gen, the 6th gen, the 5th gen... so what's your point? It's not necessarily just a CPU issue. Again, dynamic resolution is common because the issue isn't just the CPU.

 

Well, looking at the technically demanding Switch games, the primary issue seems to be a lack of GPU power. The Switch can run modern games fairly well, but it's often at sub-HD resolutions with reduced graphics. Therefore I'd argue the Switch actually has a bigger power imbalance than the X1 and PS4.

Hence, if the next Switch could only have a CPU or a GPU upgrade, it's the GPU upgrade that would create the more dramatic boost to presentation and performance.



Recently Completed
River City: Rival Showdown for 3DS (3/5) - River City: Tokyo Rumble for 3DS (4/5) - Zelda: BotW for Wii U (5/5) - Zelda: BotW for Switch (5/5) - Zelda: Link's Awakening for Switch (4/5) - Rage 2 for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III for iPad 4 (3/5) - Infinity Blade II for iPad 4 (4/5) - Infinity Blade for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) - Wolfenstein II for X1 (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Mr Puggsly said:

I understand that, but in practice developers have done great things with these Jaguar CPUs. Frankly, developers seem to struggle with GPU bottlenecks more often, which is why dynamic resolution has become so common.

It can be CPU bottlenecks that push dynamic resolution down as well, especially when the CPU cannot keep up with the number of draw calls.

Whether the CPU or GPU is the bottleneck will obviously vary depending on the game engine, resolution, framerate target, what is on screen and other factors... and will constantly switch between the two.
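As a rough illustration of how that plays out, here's a minimal dynamic-resolution controller; the logic and frame times are hypothetical, not taken from any particular engine:

```python
def next_render_scale(frame_ms, scale, target_ms=33.3,
                      min_scale=0.6, max_scale=1.0):
    """Nudge the render scale toward the frame-time budget.

    If the last frame blew the budget, drop resolution; if there is
    headroom, claw it back gently. Note a real engine also has to ask
    *where* the time went: a CPU-bound (draw-call) spike won't be
    fixed by rendering fewer pixels.
    """
    if frame_ms > target_ms:            # over budget: shrink
        scale *= target_ms / frame_ms
    elif frame_ms < 0.9 * target_ms:    # headroom: grow
        scale *= 1.02
    return max(min_scale, min(max_scale, scale))

scale = 1.0
for ms in (30.0, 36.0, 40.0, 31.0, 28.0):  # made-up frame times
    scale = next_render_scale(ms, scale)
    print(f"{ms:.1f} ms -> render scale {scale:.2f}")
```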

Mr Puggsly said:

The minor performance drops in Just Cause 4 are likely optimization issues or a GPU bottleneck. I mean, the PS4 version is essentially running the same CPU with a stable 30 fps. Either way, it's an open-world game with AAA visuals and heavy use of physics; that's exactly the kind of experience people feel the Jaguar CPUs can't handle well.

The PlayStation 4 Pro version is scaled back from the Xbox One X version...
But without proper profiling of the game and its engine, it's difficult to say with any quantifiable certainty whether the CPU or GPU is the main culprit; I wouldn't be surprised if it's both.

But the fact is, some CPU loads were removed or reduced in the jump from Just Cause 3 to Just Cause 4... And the developer also made their physics engine more heavily threaded to take advantage of Jaguar's core count.

With that said, the game does suffer from some technical limitations, such as short draw distances on shadows and objects that pop in frequently.

Mr Puggsly said:

Even if they're among the worst processors AMD had at the time, they've proven more capable than they're given credit for. I'm sure developers have to work harder to squeeze the most out of those CPUs, but that's generally how making console games goes.

I am not saying they are incapable... I am saying they are terrible relative to other hardware.
Compare Jaguar against, say, Cell or Xenon and its advantages become abundantly clear... But once you compare it to any modern x86 CPU, heck, even ARM CPUs, it really does look antiquated.

Next gen will be interesting... We could be looking at a 5-10x increase in CPU capability, which is larger than the CPU jump between the 7th and 8th gens.


Mr Puggsly said:

A lot of games also had frame rate issues in the 7th gen, the 6th gen, the 5th gen... so what's your point? It's not necessarily just a CPU issue. Again, dynamic resolution is common because the issue isn't just the CPU.

To be fair, the 5th and 6th gens had terrible CPUs as well.
The original Xbox probably got the most right on the CPU side of the equation during the 6th gen... But its Pentium III/Celeron hybrid chip @ 733MHz paled in comparison to the Athlon XP 1900+ or Pentium 4 1.8GHz chips available at the time.

Mr Puggsly said:

Well, looking at the technically demanding Switch games, the primary issue seems to be a lack of GPU power. The Switch can run modern games fairly well, but it's often at sub-HD resolutions with reduced graphics. Therefore I'd argue the Switch actually has a bigger power imbalance than the X1 and PS4.

Agreed. The Switch's lack of GPU power is a massive hindrance, but not only that... its DRAM bandwidth isn't helping matters either.

I will still argue that the Switch is a more balanced machine overall; its GPU capability isn't orders of magnitude greater than its CPU's. That doesn't mean it doesn't need more GPU though.

Mr Puggsly said:

Hence, if the next Switch could only have a CPU or a GPU upgrade, it's the GPU upgrade that would create the more dramatic boost to presentation and performance.

A lack of DRAM bandwidth will hinder fillrate, and thus resolution, though. With today's compression and culling technology, around 50GB/s of memory bandwidth is probably what you'd want to target in a Switch revision to hit higher resolutions with a step up in fidelity.
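The ~50GB/s figure roughly tracks the pixel-count jump from 720p to 1080p, if you naively assume bandwidth demand scales with pixels; a back-of-the-envelope sketch:

```python
# Switch's docked LPDDR4 bandwidth is 25.6 GB/s (known spec); demanding
# titles typically target 720p docked. Scale naively by pixel count:
switch_bw_gbs = 25.6
ratio = (1920 * 1080) / (1280 * 720)   # 2.25x the pixels

print(f"720p -> 1080p pixel ratio: {ratio:.2f}x")
print(f"Naive bandwidth demand:    ~{switch_bw_gbs * ratio:.0f} GB/s")
# Better compression/culling claws some of that back -- hence ~50 GB/s.
```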

It's not always about GPU brute force, otherwise AMD's GPUs would always win. :P



--::{PC Gaming Master Race}::--

HoloDust said:

As said previously, all games need to run in portable mode, so your base performance is set at under 200 GFLOPS.

The thing is, though, in portable mode the smaller screen means you can get away with a lot of cuts to make up the difference: lower screen resolution, ambient occlusion off, lower-res shadows, dialled-down draw distance, etc. On a 6-inch screen, cutbacks like this aren't as noticeable as they would be on an HDTV.



Evilms said:
393 GFLOPS docked
196 GFLOPS undocked

The Switch, as everybody knows, supports two kinds of FLOPS (the PS4 Pro can do this as well). The "standard" kind is FP32: there the Switch manages 153.6-192 GFLOPS in handheld mode (depending on clock) and 384 GFLOPS docked. The other kind, FP16, doubles those numbers.
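Those figures are just shader cores × clock × 2 ops per clock (fused multiply-add), doubled again for FP16. A quick sketch using the widely reported 307.2MHz handheld / 768MHz docked clocks (the post above uses the slightly lower 300/375/750MHz figures that also circulate):

```python
CUDA_CORES = 256  # Tegra X1 (Maxwell GM20B), as used in the Switch

def gflops(clock_mhz, fp16=False):
    """Theoretical throughput: cores x clock x 2 ops/clock (FMA)."""
    ops_per_clock = 2 * (2 if fp16 else 1)  # FP16 packs two ops per lane
    return CUDA_CORES * clock_mhz * 1e6 * ops_per_clock / 1e9

for mode, mhz in (("handheld", 307.2), ("docked", 768.0)):
    print(f"{mode:>8}: {gflops(mhz):6.1f} GFLOPS FP32 / "
          f"{gflops(mhz, fp16=True):6.1f} GFLOPS FP16")
```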

By the way, raw numbers aren't the whole story. The Switch uses a newer, more efficient GPU architecture than the PS3 and 360, so even though the handheld numbers look lower than PS3/360, actual performance is better. Furthermore, the Switch is basically a portable, and people who buy it don't care about specs; otherwise they would buy a PS4, or the state-of-the-art Xbox One X for the best 3rd-party game experience.



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Pemalite said:

Mr Puggsly said:

Most laptops on the market can't play games as well as the Xbox One, and the laptops that can cost significantly more than an Xbox One. For the record, the Xbox One X is pretty great.

It's not really fair to compare your laptop to a Switch, because the Switch is significantly smaller.

The Ryzen 2500U and 2700U are budget hardware that can match the Xbox One. (I have a 2700U notebook.)
Low-end laptops can match the Xbox One these days, with the exception of those running Intel "Decelerator" Graphics.

Low-end GeForce parts like the MX150 and GT 1030 can also turn in better results than the Xbox One.

But the fact that AMD's integrated graphics can beat the Xbox One is just a testament to how old the 8th gen is getting and how quickly GPU hardware has evolved.



Nate4Drake said:

I never said the PS3 is more powerful than the Switch. But I consider Uncharted 3 a more impressive game than anything on Switch from a tech point of view: geometry, detail, textures, scale, animations, etc. Where is the nonsense?

Uncharted 3 came out... when, exactly, during the PS3's lifecycle?
And the Switch is how old?

It's best to compare the visuals of games at similar points in a console's lifecycle, as developers learn the various nuances of the hardware over time... That's how you make a proper assessment.
Uncharted 3 was certainly a technical marvel given the constraints it was working within, though.

But the Switch's GPU hardware is capable of more effects than the DX9-class hardware in the PS3, so it will be interesting to see how developers leverage that GPU in the coming years.

The Ryzen 2700U is far weaker than the Xbox One: it runs Doom (Vulkan) at 30 fps at 900p with the lowest settings, while the Xbox One does 60 fps at 900p with medium-high settings. In The Witcher 3 the 2700U does 20 fps at the lowest settings at 720p... it's not even close.

(Doom video)

https://www.youtube.com/watch?v=NijqNFWlUug&t=492s

(Witcher 3 video)

https://youtu.be/Am7gzmMuaGI?t=180

I'm glad you accept that last-gen games (at least Uncharted 3) look better than what the Switch has. I remember we had a similar discussion about this, and you and John refused to accept that Halo 4 looked better than Doom on Switch (handheld).



6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Pemalite said:

Generally speaking, we know the PS4 and X1 are about equal on the CPU side, yet resolutions are lower on the X1 due to GPU limitations. It's evident, then, that low resolution is mostly a GPU issue.

As long as games keep pushing visual complexity on machines with limited capabilities, pop-in is always going to be an issue. At best, pop-in is less of an issue this gen than last, but I anticipate it will be an issue next gen as well. Frankly, I'm annoyed some modern games don't leverage the extra power of the X1X or Pro to improve draw distance.

The Pentium 3 was a pretty capable CPU though; developers seemed to struggle more with the OG Xbox's limited GPU power and RAM in later titles.

The Switch's ability to run games like Ark, Fortnite, Doom, and Wolfenstein II is impressive, especially at a solid frame rate. However, the massive visual compromises (480p, 360p and lower) tell me the CPU is much more capable than the GPU. So again, I feel the Switch is more imbalanced than the X1 and PS4.



Recently Completed
River City: Rival Showdown for 3DS (3/5) - River City: Tokyo Rumble for 3DS (4/5) - Zelda: BotW for Wii U (5/5) - Zelda: BotW for Switch (5/5) - Zelda: Link's Awakening for Switch (4/5) - Rage 2 for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III for iPad 4 (3/5) - Infinity Blade II for iPad 4 (4/5) - Infinity Blade for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) - Wolfenstein II for X1 (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Mr Puggsly said:

Pop-in will always be an issue if devs don't spend the available power in a "balanced" way. Of course, very weak hardware has far more problems, and if you wanted to eliminate pop-in entirely, the overall geometry, tessellation and IQ would end up so poor that devs prefer to keep the more noticeable pop-in; a clear example is The Legend of Zelda on Nintendo Switch.

So, what about pop-in when the PS5 and the next Xbox are out? It will still be an issue if developers aim for maximum geometry, detail, effects, good tessellation at medium/long distance, and big, complex environments, to the point where the frame rate would drop significantly with zero pop-in. It's all a matter of resource management, and it will always be an issue regardless of the power available. The good thing is that with the PS5 and the next Xbox you might get games with far better graphics, more advanced physics, AI, animations and effects, and further reduced pop-in, if devs want.

PS: Do you remember the kind of pop-in we had in racing games on the PSX, Sega Saturn and N64? :D

Last edited by Nate4Drake - on 23 January 2019

”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Trumpstyle said:

The Ryzen 2700U is far weaker than the Xbox One: it runs Doom (Vulkan) at 30 fps at 900p with the lowest settings, while the Xbox One does 60 fps at 900p with medium-high settings. In The Witcher 3 the 2700U does 20 fps at the lowest settings at 720p... it's not even close.

(Doom video)

https://www.youtube.com/watch?v=NijqNFWlUug&t=492s

(Witcher 3 video)

https://youtu.be/Am7gzmMuaGI?t=180

I haven't actually tried those two games on my Ryzen 2700U notebook.

But Overwatch, Battlefield 1 and Call of Duty: WW2 were hitting native 1080p... which is certainly a step up over what my base Xbox One can do... Or I can dial the resolution back to 75% scale (which fits the bandwidth better) and dial up some of the effects.

The thing with Ryzen notebooks is that Ryzen has a configurable TDP, and some notebook manufacturers limit it to 15W instead of 25W.
Some notebooks also come with only single-channel DDR4...
Some notebooks (like mine) allow the DDR4 to run at 2666MHz rather than 2400MHz.
And some aren't limited to 12-month-old driver sets. (I hacked mine.)


Trumpstyle said:
I'm glad you accept that last-gen games (at least Uncharted 3) look better than what the Switch has. I remember we had a similar discussion about this, and you and John refused to accept that Halo 4 looked better than Doom on Switch (handheld).

"Look better" is a subjective approach... I rather take a methodical one.

Doom on Switch uses techniques that Halo 4 doesn't, such as GPU-accelerated particle effects; its rendering is a step up over Halo 4's baked approach, and it shows.
From a graphics perspective, Doom on Switch beats Halo 4 on the 360 in terms of fidelity. On artistic style, Halo 4 probably wins.

Mr Puggsly said:

Generally speaking, we know the PS4 and X1 are about equal on the CPU side, yet resolutions are lower on the X1 due to GPU limitations. It's evident, then, that low resolution is mostly a GPU issue.

Indeed, the Xbox One is GPU limited.

But games that are CPU bound will occasionally pull ahead (Assassin's Creed at one point).
The difference between the Xbox One's and PlayStation 4's CPUs is pretty inconsequential, i.e. 1.75GHz vs 1.6GHz; 150MHz is stuff-all whichever way you cut the cake.

The Xbox One does have the DDR3-plus-eSRAM advantage, which reduces latencies and gives it a little extra kick in the CPU department.

However, on the GPU side the Xbox One is partly hampered by a lack of bandwidth... Even the PC's Radeon 7750 saw large reductions in performance moving from GDDR5 to DDR3. The Xbox One's GPU is a step up over that part, and it has to share its limited bandwidth with the CPU and other components, compounding the issue... Granted, that's mitigated somewhat by the 256-bit bus and the eSRAM, but not resolved entirely.

The ROP/CU/TMU reductions don't help matters either, of course.
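For reference, the main-memory bandwidth gap is simple arithmetic on the documented memory configurations (bus width × effective data rate):

```python
def peak_bandwidth_gbs(bus_bits, mega_transfers_per_s):
    """Peak DRAM bandwidth: bus width in bytes x transfers per second."""
    return bus_bits / 8 * mega_transfers_per_s * 1e6 / 1e9

print(f"Xbox One, 256-bit DDR3-2133:  {peak_bandwidth_gbs(256, 2133):5.1f} GB/s")
print(f"PS4,      256-bit GDDR5-5500: {peak_bandwidth_gbs(256, 5500):5.1f} GB/s")
# The Xbox One's 32 MB of eSRAM (up to ~204 GB/s) is what narrows that gap.
```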

Nate4Drake said:

Well, there are better ways to go about resolving pop-in, such as fading. But I digress; that is another discussion entirely, one that could take me many hours to elaborate on. :P
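A minimal sketch of the fading idea: instead of snapping objects in at a cutoff, cross-fade their alpha over a distance band (the distances here are made up):

```python
def fade_alpha(distance, fade_start=180.0, fade_end=200.0):
    """Alpha for distance-based fading (hypothetical 180-200 m band).

    Objects nearer than fade_start render fully opaque, objects past
    fade_end are culled, and anything in between blends, so nothing
    visibly pops.
    """
    if distance <= fade_start:
        return 1.0
    if distance >= fade_end:
        return 0.0
    return 1.0 - (distance - fade_start) / (fade_end - fade_start)

for d in (150.0, 185.0, 195.0, 210.0):
    print(f"{d:.0f} m -> alpha {fade_alpha(d):.2f}")
```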



--::{PC Gaming Master Race}::--