
Rumor: Xbox "Lockhart" specs leaked, is $300

goopy20 said:
CGI-Quality said:

Yeah, the fake 4K UHDs use the 1080 floor. That's why I said that true 4K UHDs are what he would need to target.

I'm just curious what your thoughts are on using native 4K for next gen console games. Wouldn't that be a massive waste of resources compared to 1440p? I mean, there has to be a reason why almost nobody plays PC games in native 4K and why high-end PC monitors are mostly 1440p.

It has nothing to do with what makes sense and you know it. Staying at 1080p for much longer would be better, but the industry is moving to 4K and game devs will push that. It's no different than the many games of the last 15 years that can't hold 30 fps because devs pushed as much eye candy as possible.



Nu-13 said:
goopy20 said:

I'm just curious what your thoughts are on using native 4K for next gen console games. Wouldn't that be a massive waste of resources compared to 1440p? I mean, there has to be a reason why almost nobody plays PC games in native 4K and why high-end PC monitors are mostly 1440p.

It has nothing to do with what makes sense and you know it. Staying at 1080p for much longer would be better, but the industry is moving to 4K and game devs will push that. It's no different than the many games of the last 15 years that can't hold 30 fps because devs pushed as much eye candy as possible.

True, but they wouldn't be pushing eye candy, they would just be pushing resolution, which isn't really a big selling point for the masses. Like I said before, if resolution were so important, why did the X1X sell only a couple million units?



goopy20 said:
Nu-13 said:

It has nothing to do with what makes sense and you know it. Staying at 1080p for much longer would be better, but the industry is moving to 4K and game devs will push that. It's no different than the many games of the last 15 years that can't hold 30 fps because devs pushed as much eye candy as possible.

True, but they wouldn't be pushing eye candy, they would just be pushing resolution, which isn't really a big selling point for the masses. Like I said before, if resolution were so important, why did the X1X sell only a couple million units?

Because regardless of resolution, it was a $499 hardware revision of a system that wasn't that popular in the first place.



goopy20 said:
CGI-Quality said:

Yeah, the fake 4K UHDs use the 1080 floor. That's why I said that true 4K UHDs are what he would need to target.

I'm just curious what your thoughts are on using native 4K for next gen console games. Wouldn't that be a massive waste of resources compared to 1440p? I mean, there has to be a reason why almost nobody plays PC games in native 4K and why high-end PC monitors are mostly 1440p.

4K is going to be far more achievable next gen and won't come at such a big cost to visual fidelity.
The consoles will have the fillrate to handle it fine for the most part.

The 1440P versus 2160P question varies from person to person and game to game.
1440P is certainly a step up over 1080P: it's sharper, cleaner, and not as resource-heavy as 4K, which gives headroom to dial up the visuals, but aliasing/stair-stepping/shimmering will be more prevalent than at 4K.
Micro-details in texture work aren't likely to pop as much either.

Resolution and framerates will be entirely up to developers; those chasing the graphics dream will likely opt for 1440P + ray tracing @ 30fps, I would imagine.

The reasons why a lot of PC displays are only 1440P are many... It's a good price/performance level, which lets manufacturers focus on refresh rates and contrast ratios rather than pure pixel counts... Plus 2160P on a 24" panel is an insane pixel density; keeping that resolution for larger panels makes more sense. Display processing today is also well equipped to handle 1440P, which keeps input latency down, and you notice input latency more readily on PC because the mouse is such a precise form of control.

Also, plenty of gamers use 4K. (Steam isn't the entire PC community; it's just a large sample.)
Many PC gamers also render games at 4K and downsample them to lower resolutions like 1080P and 1440P.
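
To put rough numbers on the pixel loads being discussed, here's a quick back-of-the-envelope sketch. It's pure arithmetic; real GPU cost doesn't scale perfectly linearly with pixel count, so treat the ratios as rough:

```python
# Relative pixel load of the common gaming resolutions.
# Pure arithmetic; shading cost per pixel is assumed constant.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference load

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels/frame, "
          f"{pixels * 60 / 1e6:,.0f}M pixels/s @ 60fps, "
          f"{pixels / base:.2f}x the 1080p load")
```

The takeaway: 2160P pushes 4x the pixels of 1080P and 2.25x the pixels of 1440P, which is where the fillrate and headroom arguments above come from.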

Radek said:

1. Because 90% of PC gamers play on monitors, and a monitor would have to be at least 40 inches to fully utilize the extra pixels; that is hard to fit on most desks while still being able to use keyboard and mouse.

2. Because 90% of PC gamers want to play at 60 fps or more, so it's better to stick with 1440p, while 90% of console games run at 30 fps regardless of the resolution.

Apples and oranges: just because most PC players prefer 1440p at 60-144 fps doesn't mean console players don't want 4K at 30 fps on their 55" TVs.

Personally, I can tell the difference between 1440P and 2160P on a 27-31.5" panel, granted it's not going to be as big a perceivable difference as the move from 1080P to 1440P at the same panel size... For instance, I wanted to stab my eye with a pitchfork once when I had to use a 1080P 27" panel.

At around 21-24" you might as well stick with 1080P.
For 25-27" panels, 1440P is a really good fit.
At 32" and larger, 2160P is fantastic.

You can get 32" 1440P panels... And I have one, it has roughly the same pixel density as a 23-24" 1440P panel, so it's doable, but I would hope anyone purchasing such a panel is doing so not because of the resolution but because of the refresh rates instead...

PC displays are also a little different from televisions: there is less emphasis on post-processing and a larger emphasis on keeping input latency as low as possible... The venerable mouse is a highly accurate input method, so manufacturers will often prioritize things differently.





CGI-Quality said:
goopy20 said:

Well, obviously there are exceptions, but according to the Steam hardware survey, only 1.9% of its users are gaming at native 4K, while over 80% are using 1080p or 1440p. But my question was more related to game development. Wouldn't native 4K suck up way too many resources and limit a developer's ambitions on these next gen consoles?

I've already told people my stance on the Steam survey, so it's a waste of time throwing that out in a discussion with me. It's incomplete.

In regard to game development: no, 4K wouldn't be a waste of resources. In fact, in 2020, it is where things should be.

That doesn't make sense. Why could, or should, these next gen consoles aim for native 4K when even a 2080Ti can't hit 60fps in current gen games at that resolution? I'm not even talking about RT, as that would cut the framerate by another 50%. We would practically be playing the same games we're playing now lol.



goopy20 said:
CGI-Quality said:

I've already told people my stance on the Steam survey, so it's a waste of time throwing that out in a discussion with me. It's incomplete.

In regard to game development: no, 4K wouldn't be a waste of resources. In fact, in 2020, it is where things should be.

That doesn't make sense. Why could, or should, these next gen consoles aim for native 4K when even a 2080Ti can't hit 60fps in current gen games at that resolution? I'm not even talking about RT, as that would cut the framerate by another 50%. We would practically be playing the same games we're playing now lol.

An obvious problem with your argument and graphs is that you're looking at 4K resolution with the highest graphics settings.

Current gen consoles don't necessarily run games at the highest graphics settings; it's often more like a mix of low, medium and high. They lower the graphics settings primarily to maintain a high resolution.

Look at this video of RDR2, for example. The low and medium settings are probably more reflective of the console settings. The ultra settings are well beyond what we get on consoles and get about half the frame rate of low and medium. Meanwhile, high settings look good and sit comfortably at 60 fps.

Like you, I agree aiming for 4K can be a waste of resources, but aiming for ultra settings can also be a waste when high already looks comparable and runs much better. However, another compromise is ultra settings with a dynamic resolution. But if developers do aim for 4K/60 fps, it's evident 4K with high settings is fine in this scenario.
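
For anyone unfamiliar with the dynamic resolution compromise mentioned above, here's a minimal sketch of the general idea: keep the graphics settings fixed and scale the render resolution to hold a frame-time budget. All names and constants here are illustrative, not taken from any real engine:

```python
TARGET_MS = 16.7                 # frame-time budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp on the per-axis render scale

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    # GPU cost tracks pixel count, i.e. scale squared, so adjust
    # the per-axis scale by the square root of the over/undershoot.
    ideal = scale * (TARGET_MS / last_frame_ms) ** 0.5
    # Damp the change to avoid visible resolution "pumping".
    damped = scale + 0.25 * (ideal - scale)
    return max(MIN_SCALE, min(MAX_SCALE, damped))

# Example: a 20 ms frame at full resolution nudges the next frame
# down to ~98% per-axis (~96% of the pixels).
print(update_render_scale(1.0, 20.0))
```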




Nu-13 said:
victor83fernandes said:

1 - That's the USA; I live in Europe, where only the €600 model was available at launch.

2 - $600 is a bargain in 2020; my phone cost more than that, and I upgrade phones every 2 years. Minimum wage here is $1700 per month, so 600 is barely more than a third of a month's wages if you are on minimum, which I am not. And $600 as an investment for the next 7-8 years is not bad at all.

3 - Games have indeed jumped in base price. They used to be £39.99 here in the UK and €65 in Europe; now they are £45 in the UK and €70 in Europe. And when you mention DLC and microtransactions, the real price for a complete game is more like €100 now. Add paid online and you actually pay a minimum of €120 per game, depending on how long you stick with the online. Game prices have increased substantially; the increases are just hidden.

4 - Yes, resolution is important if all else is equal, but things can never be equal at the same price point: you either sacrifice resolution, or graphics, or framerate. On PC I have the choice; on consoles I don't, and most developers push resolution even if that means dropping framerate and effects like AA. For me, good AA and good effects have more impact on image quality than resolution. Hence why I am probably moving to PC in January.

5 - You still don't get it about 4K and HDR. PS4 Pro games look better on my projector without 4K and without HDR, and my TV is not cheap either; it's a quality Panasonic. Real-life experience is more important than numbers on a piece of paper. In fact, games look more beautiful on my brother's plasma, even at 720p, than on my 4K HDR set; colours just look better on the plasma.

6 - So you are saying the Switch is much worse graphically than I thought? Way to make my point that the Switch is far too underpowered for next gen. I'll probably just emulate it on my PC and play with much better graphics. Thanks, Nintendo, but the Xbox Series X is at least 30x more powerful; I might as well just sell my Switch and never go back to Nintendo. One more reason to go PC instead of consoles next gen.

No, it isn't, not for a videogame. It's very expensive, and so is $499.

Get a job; $500 is one week's work.



CGI-Quality said:
goopy20 said:

That doesn't make sense. Why could, or should, these next gen consoles aim for native 4K when even a 2080Ti can't hit 60fps in current gen games at that resolution? I'm not even talking about RT, as that would cut the framerate by another 50%. We would practically be playing the same games we're playing now lol.

I said nothing about 'aiming for' (you have a habit of making things up as you post, so quit doing it)! I said it's where we should be. We're obviously not there, but even though we aren't, it still isn't a waste of resources (especially when you only mentioned 4K and said nothing about 60fps, a convenient omission).

Besides, there are games that a 2080Ti can run at 60fps in 4K, so that's not a working example even if that were my argument.

It wasn't a trick question. I just asked, from a game development point of view, wouldn't native 4K suck up too many resources and limit developers' ambitions on next gen console games? Maybe for some people halving the overall visual fidelity in favor of 4K or a higher fps isn't a waste of resources, for example if you're gaming on a 144hz monitor. But we've already seen with the X1X that saying "this plays the same games, but better" just doesn't fire up consumers the way the promise of completely new games that take a dramatic leap over current gen titles does.

Obviously there are games that do run at 4K/60fps on a $1300 2080Ti, but there are already plenty that don't. So let's take RDR2 as an example: how is Rockstar supposed to push things even further with RDR3 if the game has to be designed from the ground up to run in native 4K with RT on PS5? Roughly speaking, the boost in resolution and RT (depending on how AMD's RT cores perform) would leave Rockstar with about the same resources for making the actual game as they had when making RDR2 for the PS4.
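
To make that budget argument concrete, here's a rough sketch of the arithmetic; every number in it is an illustrative assumption (the GPU uplift especially), not a benchmark:

```python
# Back-of-the-envelope budget math for the argument above.
# All three inputs are assumptions for illustration only.

gpu_uplift = 5.0   # assumed PS4 -> next-gen GPU multiplier
pixel_cost = 4.0   # 1080p -> native 4K pixel multiplier (exact)
rt_share = 0.5     # frame-time share eaten by RT (the 50% quoted above)

per_pixel_headroom = gpu_uplift / pixel_cost * (1 - rt_share)
print(f"Per-pixel budget vs a 1080p PS4 game: {per_pixel_headroom:.2f}x")
# ~0.62x under these assumptions, i.e. less per-pixel headroom than
# last gen, which is exactly the concern being raised here.
```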

So let me just ask you another question: what kind of leap in overall visual fidelity are you expecting from these next gen games?



Mr Puggsly said:
goopy20 said:

That doesn't make sense. Why could, or should, these next gen consoles aim for native 4K when even a 2080Ti can't hit 60fps in current gen games at that resolution? I'm not even talking about RT, as that would cut the framerate by another 50%. We would practically be playing the same games we're playing now lol.

An obvious problem with your argument and graphs is that you're looking at 4K resolution with the highest graphics settings.

Current gen consoles don't necessarily run games at the highest graphics settings; it's often more like a mix of low, medium and high. They lower the graphics settings primarily to maintain a high resolution.

Look at this video of RDR2, for example. The low and medium settings are probably more reflective of the console settings. The ultra settings are well beyond what we get on consoles and get about half the frame rate of low and medium. Meanwhile, high settings look good and sit comfortably at 60 fps.

Like you, I agree aiming for 4K can be a waste of resources, but aiming for ultra settings can also be a waste when high already looks comparable and runs much better. However, another compromise is ultra settings with a dynamic resolution. But if developers do aim for 4K/60 fps, it's evident 4K with high settings is fine in this scenario.

That's true, but I already said ultra settings are also an enormous waste of resources. You can take almost any current gen game and bring a 2080Ti to its knees by using ultra or insane settings. That's why developers usually steer clear of settings like that on consoles. They want to be as efficient as possible and won't use settings that eat up too many resources for a relatively small gain in visuals. It's why things like FXAA and checkerboard rendering exist.
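
For reference, the efficiency win from checkerboard rendering is easy to ballpark: shade only half the pixels each frame in a checkerboard pattern and reconstruct the rest from the previous frame. A toy sketch, where the reconstruction overhead is an assumed figure:

```python
# Toy cost comparison of native 4K vs checkerboarded 4K.
# The 10% reconstruction overhead is an illustrative assumption.

native = 3840 * 2160   # pixels shaded per frame at native 4K
shaded = native // 2   # checkerboard shades half of them
overhead = 0.10        # assumed cost of the reconstruction filter

effective = shaded * (1 + overhead)
print(f"Native 4K:       {native:,} shaded pixels/frame")
print(f"Checkerboard 4K: {effective:,.0f} effective cost "
      f"({effective / native:.0%} of native)")
```

Even with the assumed filter cost, that's roughly 55% of the native shading work for an image that targets the same output resolution.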



victor83fernandes said:
Nu-13 said:

No, it isn't, not for a videogame. It's very expensive, and so is $499.

Get a job; $500 is one week's work.

Congratulations on doing nothing to disprove me. $500 is very expensive for a videogame system.