
Forums - Microsoft Discussion - Scarlett Will Prioritize Frame Rate Over Graphics


Do you prefer 60fps/4K with reduced visuals or 30fps/4K with increased visuals?

Yes!                  30 (40.00%)
No.                    5 (6.67%)
Depends on the game.  32 (42.67%)
I don't care.          8 (10.67%)
Total:                75
Dulfite said:
I want games to look and feel like blockbuster movies at theaters, and not like soap operas. When my TV is set to a higher fps, things I watch look weird and overly fluid, taking away from the epicness of it all. Graphics over frame rate for all the RPGs; frame rate for sports games.

100% agreed. It's the same reason 48fps failed for films. It no longer looks like an epic movie, but a cheap soap opera, no matter how much is spent on the effects.

Personally, if a game is aiming for realism, I prefer 30fps (actually, Infamous: SS at around 35-40fps looked pretty good, too). If it's a racing game, a twitch shooter, or has cartoony graphics, 60fps is good.



thismeintiel said:
Dulfite said:
I want games to look and feel like blockbuster movies at theaters, and not like soap operas. When my TV is set to a higher fps, things I watch look weird and overly fluid, taking away from the epicness of it all. Graphics over frame rate for all the RPGs; frame rate for sports games.

100% agreed. It's the same reason 48fps failed for films. It no longer looks like an epic movie, but a cheap soap opera, no matter how much is spent on the effects.

Personally, if a game is aiming for realism, I prefer 30fps (actually, Infamous: SS at around 35-40fps looked pretty good, too). If it's a racing game, a twitch shooter, or has cartoony graphics, 60fps is good.

Yep! I'll take Lord of the Rings over The Hobbit visually any day!



sethnintendo said:

How do you turn the settings up on a console? You're talking PC, right?

Well, I would assume that by next gen, the big 3 will give you options to tweak. You already have the option to adjust motion blur, chromatic aberration, and depth of field (in very few games, mind you).

Next gen should absolutely give you, the consumer, the option to tweak settings, but failing that, they should at least allow devs to tune the settings themselves to make sure a game runs at 1440p/60fps+ rather than faux-4K/30fps.

CGI knows what he's saying, and so do other analysts in the industry when it comes to the differences between 4K and 1440p.




chakkra said:

What about dynamic resolution? Can you tell when a game drops resolution during explosions and stuffs?

I can normally tell when a game shifts resolution during gameplay, though it becomes harder to notice because it usually happens in intense scenes, when you're distracted by playing the game itself. But to answer your question: it's difficult to tell the difference.

CGI-Quality said:

In my experience, there is not an eye-popping difference between 1440p and 4K in games. A difference, yes, but not that overblown. This comes from someone who games at 1440p/144Hz on PC and 4K (dynamic/native ~ this matters) 30-60fps on the Xbox One X. I generally prefer the former because the difference is not as big as one would expect, and PC games have better graphics anyway. Once higher-res monitors take advantage of higher refresh rates in abundance (I'm waiting on 8K/120-144Hz for PC, since that's when rendering will truly make the difference), that is when things will stomp 1440p.

That said, watching movies in 4K vs 1440p is where the difference grows on the eye, and HDR is a godsend, but for gaming the difference isn't as big when just talking those two resolutions.

That also depends on the size of your screen. I am guessing you game on a monitor?

On a 55-inch TV, 4K is quite noticeable. I have fiddled with resolution settings in games to compare, and I can easily see the improvement on my screen.

Most PC gamers use smaller monitors, around 22 to 32 inches. That size is perfect for 1080p and 1440p; on a 55-to-75-inch TV, however, you can easily see the difference between 4K and 1440p. I'm in the minority here, though, because the majority of PC gamers don't game on a TV; they normally opt for high refresh rates in monitors instead.

I will be buying a decent monitor next year as I am moving house, and I am looking at something good. If I stick to a 32-inch size, I probably won't need 4K and can drop to 1440p and raise frame rates on my rig instead.



For the studios owned by MS, sure, they can focus on 60fps. Third parties will focus on 30fps and better visuals, as usual.

And if MS focuses on 60fps, the visual gap to Sony's AAA games will be big.




Azzanation said:
chakkra said:

What about dynamic resolution? Can you tell when a game drops resolution during explosions and stuffs?

I can normally tell when a game shifts resolution during gameplay, though it becomes harder to notice because it usually happens in intense scenes, when you're distracted by playing the game itself. But to answer your question: it's difficult to tell the difference.

Yeah, that's why I always say that dynamic resolution is indeed the best option.
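For anyone curious how dynamic resolution works under the hood, here's a minimal sketch in Python. The function name, step size, and thresholds are made up for illustration and don't reflect any engine's actual API; the idea is just that the engine watches the frame time and nudges the render scale down when a frame blows its budget, back up when there's headroom.

```python
# Hypothetical dynamic-resolution controller (illustrative only).
def adjust_render_scale(scale, frame_ms, target_ms=16.7, step=0.05,
                        lo=0.5, hi=1.0):
    """Lower render scale when a frame runs long; raise it when there's headroom."""
    if frame_ms > target_ms:           # missed the frame budget: drop resolution
        scale -= step
    elif frame_ms < target_ms * 0.85:  # comfortable headroom: restore resolution
        scale += step
    return max(lo, min(hi, scale))     # clamp to the allowed range

# A heavy explosion frame at 22 ms pushes the scale down:
scale = adjust_render_scale(1.0, 22.0)   # -> 0.95
```

That's also why it's hardest to spot: the scale drops exactly when the screen is busiest.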



thismeintiel said:
Dulfite said:
I want games to look and feel like blockbuster movies at theaters, and not like soap operas. When my TV is set to a higher fps, things I watch look weird and overly fluid, taking away from the epicness of it all. Graphics over frame rate for all the RPGs; frame rate for sports games.

100% agreed. It's the same reason 48fps failed for films. It no longer looks like an epic movie, but a cheap soap opera, no matter how much is spent on the effects.

The only reason for this is that movies have always been 24fps and soap operas have always been 60fps. It's just how your brain has been conditioned.

60fps is objectively better.
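At least for responsiveness, the frame-time arithmetic backs this up. A quick back-of-the-envelope calculation (plain Python, nothing engine-specific):

```python
# Per-frame time budget at common frame rates: 1000 ms divided by fps.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 48, 60):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps means a new image every 33.3 ms; 60 fps halves that to 16.7 ms,
# so input is reflected on screen roughly twice as fast.
```

Whether the 24fps "film look" is worth that extra latency is the subjective part.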



CGI-Quality said:

The only reason you'd notice a difference is because of the lack of a 55" 1440p TV. Otherwise, in a like-for-like situation, the difference gets smaller (hence why many PC gamers opt for 1440/144Hz over 4K/60 on 27-32" monitors). There's not a huge difference between those resolutions. HDR makes a bigger difference.

Well, if we are comparing 32-inch monitors side by side, one at 4K and the other at 1440p, then there will be very little difference. However, when you jump to 55-inch screens and above, 1440p starts to look blurry or stretched; you will want to start hitting 4K on the bigger screens. 4K on a 4K TV is outstanding, and anything below starts to become noticeable to my eyes on big screens. The bigger the screen, the more pixels you will want. HDR makes a difference as well; HDR combined with native 4K is the obvious sweet spot for these new TVs.
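The pixel-density math is behind this. A quick sketch using the standard diagonal-PPI formula (the screen sizes are just the ones mentioned above):

```python
import math

# Pixels per inch: pixel diagonal divided by physical diagonal in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h in (("1440p", 2560, 1440), ("4K", 3840, 2160)):
    for size in (32, 55):
        print(f"{name} at {size}in: {ppi(w, h, size):.0f} ppi")
# At 55 inches, 1440p falls to ~53 ppi while 4K holds ~80 ppi,
# which is why 1440p that looks sharp on a 32-inch monitor (~92 ppi)
# starts to look soft on a big TV at the same viewing habits.
```

Viewing distance matters too, of course; you typically sit farther from a TV than from a monitor, which partly offsets the lower density.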



I prefer 120fps VR. Any word on VR headset plans for Scarlett?



Hard to sell frame rates to casuals. If the game looks dated, some will likely skip it.