
Why Xbox One Will Catch Up To Playstation 4 in Performance

Norris2k said:
Michael-5 said:
 

 

2. Above 24 FPS, a human eye can't detect any changes to frame rate (first year chem/bio). You will see a smoother image, but never any screen tear or lag. So above 24 FPS, the difference is marginal. PS3 games lock in at either 30 or 60 FPS, XB1 does the same, and so does WiiU. Neither is performing better here; all are doing well.

There is nothing like that in the human eye or brain. The light signal is continuous, and there is no notion of FPS in the eye or brain. 24 FPS is just a convenient and very old standard for producing, with tricks and limitations, a non-interactive, good-enough-looking movie:

1 - Motion blur in the picture tricks the eye into seeing the animation as smoother than it is. The judder would be painfully noticeable if you were to watch 24 fps without blur. Don't expect a 3D game to match the real motion blur that comes from the exposure time when filming a real object.

2 - In an interactive game, if you press a button and the action shows up 1/30th or 1/60th of a second later, you will notice (rough numbers below). In a movie you don't have any reference to tell whether a picture is late or not. In a game you compare it to your own input.

3 - You only need 1 fps to display an unmoving wall, but the faster the movement, the more fps you need to make it look smooth. That's why fast camera pans in movies don't look so great.

4 - Even for a non-interactive, motion-blurred movie with no super fast pans... go to 48 fps and you will notice it a lot. In fact it will feel a little weird, not like a movie. It will feel more real.
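
To put rough numbers on points 2 and 4, here is a quick sketch in Python. It assumes a simplified model where input read at the start of one frame only shows up on screen at the end of the next; real pipelines vary.

# Frame time and a rough worst-case button-to-screen delay at common frame rates.
# Assumes input sampled at the start of one frame is only visible at the end of
# the next one (a simplified two-frame model).
for fps in (24, 30, 48, 60):
    frame_time_ms = 1000.0 / fps
    worst_delay_ms = 2 * frame_time_ms
    print(f"{fps:>2} fps: {frame_time_ms:5.1f} ms per frame, "
          f"~{worst_delay_ms:5.1f} ms worst-case button-to-screen delay")

At 30 fps that worst case is around 67 ms, versus around 33 ms at 60 fps, which is the kind of gap players notice against their own input.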

I dunno about that. I took a chemistry course in school, and one of the exam questions was to calculate the frame rate human eyes can see at, then do the same for wolves. Humans do make a distinction at 24 FPS. We can see above 24 FPS, but the difference is marginal.

Below 24 FPS, the screen flickers. (For wolves and dogs it's 60 FPS; that's why they rarely watch TV, it's always flickering.)




Normchacho said:
Michael-5 said:
Normchacho said:

So after reading much of this thread, this is how I think most people on here feel


Thanks for the new show recommendation. I watched that exact episode thanks to this comment, and now I'm in love with Rick & Morty.

It's such a good show! The guy who voiced the Meeseeks is the same guy who voices SpongeBob!

Haha man I love that show! Did you see the original (much more vulgar) pilot-like (I think) YouTube version?

He also voices this guy:




Michael-5 said:
Norris2k said:
Michael-5 said:
 

 

2. Above 24 FPS, a human eye can't detect any changes to frame rate (first year chem/bio). You will see a smoother image, but never any screen tear or lag. So above 24 FPS, the difference is marginal. PS3 games lock in at either 30 or 60 FPS, XB1 does the same, and so does WiiU. Neither is performing better here; all are doing well.

There is nothing like that in the human eye or brain. The light signal is continuous, and there is no notion of FPS in the eye or brain. 24 FPS is just a convenient and very old standard for producing, with tricks and limitations, a non-interactive, good-enough-looking movie:

1 - Motion blur in the picture tricks the eye into seeing the animation as smoother than it is. The judder would be painfully noticeable if you were to watch 24 fps without blur. Don't expect a 3D game to match the real motion blur that comes from the exposure time when filming a real object.

2 - In an interactive game, if you press a button and the action shows up 1/30th or 1/60th of a second later, you will notice. In a movie you don't have any reference to tell whether a picture is late or not. In a game you compare it to your own input.

3 - You only need 1 fps to display an unmoving wall, but the faster the movement, the more fps you need to make it look smooth. That's why fast camera pans in movies don't look so great.

4 - Even for a non-interactive, motion-blurred movie with no super fast pans... go to 48 fps and you will notice it a lot. In fact it will feel a little weird, not like a movie. It will feel more real.

I dunno about that. I took a chemistry course in school, and one of the exam questions was to calculate the frame rate human eyes can see at, then do the same for wolves. Humans do make a distinction at 24 FPS. We can see above 24 FPS, but the difference is marginal.

Below 24 FPS, the screen flickers. (For wolves and dogs it's 60 FPS; that's why they rarely watch TV, it's always flickering.)

Flickering is another, even worse problem. You would notice it so much at 24 FPS that every TV refreshes at least 50 or 60 times a second, and it's still visible. That's what the 50Hz or 60Hz on a television means. It got better with LCDs, but on a CRT you needed 80 to 120Hz for your eyes not to get tired by the flickering. I think video projectors run at only 48Hz with no visible flickering, but the room is dark. It really depends on the technology and conditions; anyway, there is no absolute "24 fps" value.

I think what you got in that course was just an oversimplification, or even a mistake. http://en.wikipedia.org/wiki/Flicker_%28screen%29
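
For reference, a tiny sketch converting the refresh rates mentioned above into the time between refreshes; the longer that gap on a display that goes dark between refreshes, the easier the flicker is to notice.

# Time between refreshes at the rates discussed above.
for hz in (24, 48, 50, 60, 80, 120):
    print(f"{hz:>3} Hz -> {1000.0 / hz:5.1f} ms between refreshes")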



Because of the highly expensive ESRAM, the X1 is a huge design failure. Even if they dropped Kinect to sell at a cheaper price, the much slower APU would still be much more expensive to produce, so they will never be able to be competitive in power and price.



etking said:
Because of the highly expensive ESRAM, the X1 is a huge design failure. Even if they dropped Kinect to sell at a cheaper price, the much slower APU would still be much more expensive to produce, so they will never be able to be competitive in power and price.

I agree about the design, but even if the APU is more expensive (it can't be that different; the impact on cost is mostly the yield rate, and that will improve), DDR3 is a lot less expensive (30~50%?), so it should more than compensate for the APU cost difference. To hit a significantly cheaper price they will still have to bleed more money, but they can afford to.

Beyond manufacturing cost, what is the point for a company like Microsoft of growing market share in the not-so-profitable (and ever less profitable) console market with inferior hardware, if it means throwing away most hope for a casual/Kinect, Wii-like market? A year ago they had the grand dream of a blue ocean with DRM, casual appeal, and 3rd-party exclusives on a powerful-enough console (compared to the Wii vs PS360 gap); I'm not sure what they will do.
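
As a rough sketch of that trade-off: every dollar figure below is a made-up placeholder, not a real BOM number, just to show how a memory saving and an APU premium would net out.

# Hypothetical bill-of-materials comparison: does cheaper memory offset a pricier APU?
# Every number below is a placeholder for illustration only.
gddr5_cost = 100.0          # hypothetical cost of the GDDR5 memory
ddr3_saving_pct = 0.40      # assume DDR3 is ~40% cheaper (the 30~50% guess above)
apu_premium = 25.0          # hypothetical extra cost of the larger ESRAM APU

memory_saving = gddr5_cost * ddr3_saving_pct
net = memory_saving - apu_premium
print(f"Memory saving: ${memory_saving:.2f}, APU premium: ${apu_premium:.2f}, "
      f"net per console: ${net:+.2f}")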




Wow!

Nice to see my article has gotten so much response.

Thanks guys, it makes me feel happy inside.



This whole argument is kind of a moot point. The difference in tech abilities between the X1 and the PS4 (the stronger of the two) is marginal enough that it ultimately comes down to the games and the experiences had with each.
Right now the X1 has the stronger library, a more robust user experience, and simply more capabilities than the PS4, if you don't treat pixel fill rate or frames per second as the only metrics.
America is the largest, most lucrative console market in the world, and the X1, priced $100 higher, is currently beating the PS4 in sales.
There's a reason for that.



No, no it won't. And that is ok. 360 was a little underpowered compared to PS3 and we did just fine.




 

sales2099 said:
No, no it won't. And that is ok. 360 was a little underpowered compared to PS3 and we did just fine.


The 360 was technically weaker, but the PS3 was such a nightmare to develop for that games often looked and performed better on the 360.

The Xbone and PS4 share a very similar architecture; the PS4 just has faster parts and better memory.



Norris2k said:
Michael-5 said:

I dunno about that. I took a chemistry course in school, and one of the exam questions was to calculate the frame rate human eyes can see at, then do the same for wolves. Humans do make a distinction at 24 FPS. We can see above 24 FPS, but the difference is marginal.

Below 24 FPS, the screen flickers. (For wolves and dogs it's 60 FPS; that's why they rarely watch TV, it's always flickering.)

Flickering is another, even worse problem. You would notice it so much at 24 FPS that every TV refreshes at least 50 or 60 times a second, and it's still visible. That's what the 50Hz or 60Hz on a television means. It got better with LCDs, but on a CRT you needed 80 to 120Hz for your eyes not to get tired by the flickering. I think video projectors run at only 48Hz with no visible flickering, but the room is dark. It really depends on the technology and conditions; anyway, there is no absolute "24 fps" value.

I think what you got in that course was just an oversimplification, or even a mistake. http://en.wikipedia.org/wiki/Flicker_%28screen%29

I'm telling you, as someone who's taken university-level chemistry/bio, there is a 24 FPS value. It's something to do with how our brain perceives data. I think what happens is that below 24 FPS our eyes can perceive each individual frame, so you see the switch between frames, and thus a flicker. Above 24 FPS you're no longer able to distinguish individual frames.

http://en.wikipedia.org/wiki/Frame_rate

Wikipedia claims human eyes only register 10-12 unique frames per second, but I recall calculating it at 24.

So a great example would be flashing blue and yellow light in alternating frames. At 24 FPS and lower, that's going to give you a seizure. Above 24 FPS, it's going to start to look green. We can see the extra frames, but our brain can't process them fast enough, so it mushes the two images together and makes green.
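
For anyone who wants to try it, here's a minimal sketch of that blue/yellow test. It assumes the pygame library; the usual photosensitivity warning applies at low frame rates, and your monitor's own refresh rate caps what you actually see.

# Alternate a blue frame and a yellow frame at a chosen FPS.
# At low rates you see flashing; at high rates the colors start to fuse.
import pygame

FPS = 60  # try lower values to see flashing instead of fusion

pygame.init()
screen = pygame.display.set_mode((320, 240))
clock = pygame.time.Clock()
colors = [(0, 0, 255), (255, 255, 0)]  # blue, yellow

frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill(colors[frame % 2])  # swap color every frame
    pygame.display.flip()
    frame += 1
    clock.tick(FPS)  # limit the loop to FPS frames per second

pygame.quit()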


