
Reasons why next gen consoles don't have to be much more powerful

disolitude said:
Slimebeast said:

Next gen won't be 1080p at 60fps because it's a waste of resources.

Resolution and framerate determine only a fraction of image quality.

Compare this:

[embedded video]

to this:

[embedded video]

Both in 1080p. So what?

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

No way man.

Eyefinity is a niche product and will remain so in the coming generation.

The importance of good framerates in online multiplayer games on PC has been an issue for at least a decade.

Why would it change in the console world just because you discovered it now?

In console online multiplayer, framerate makes a marginal difference because there is already input lag, display lag and network lag, which combined amount to roughly half a second.

The point of the videos was to show that graphics quality isn't mainly determined by image resolution. Both games in the videos are rendered in 1080p, yet one looks fabulous and the other looks like shit.

It takes a 4.5 times increase in GPU power to render an image at 1080p@60fps compared to the current gen standard of 720p@30fps (and not all current games even hit 720p).

That's nearly the whole performance budget that next gen brings (rumoured to be 6-8x the Xbox 360).
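
A quick sanity check of that 4.5x figure, as a rough sketch assuming required GPU power scales linearly with pixels rendered per second (which ignores fixed per-frame costs):

```python
# Rough sketch: assume required GPU power scales linearly with
# pixels rendered per second (ignores fixed per-frame costs).
pixels_720p30 = 1280 * 720 * 30      # current gen standard: ~27.6M pixels/s
pixels_1080p60 = 1920 * 1080 * 60    # proposed target: ~124.4M pixels/s

print(pixels_1080p60 / pixels_720p30)  # -> 4.5
```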

For example, you would struggle to run Unreal Engine 4 with all its particle effects and whatnot on next gen consoles at 1080p@60fps. So developers will instead choose to render games at 720p@60fps (COD and racing games), 1080p@30fps, or 720p@30fps.



disolitude said:
Slimebeast said:

Next gen won't be 1080p at 60fps because it's a waste of resources.

Resolution and framerate determine only a fraction of image quality.

Compare this:

[embedded video]

to this:

[embedded video]

Both in 1080p. So what?

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

Agreed, 100%. I can't understand how someone can say that frame rate doesn't markedly contribute to visual quality.



CGI-Quality said:
disolitude said:
Slimebeast said:

Next gen won't be 1080p at 60fps because it's a waste of resources.

Resolution and framerate determine only a fraction of image quality.

Compare this:

[embedded video]

to this:

[embedded video]

Both in 1080p. So what?

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

Affirmative. Now the question is, as an owner of a real 120 Hz device myself: what monitor are you using?

I used to have an Acer GD235HZ which I bought in early 2011 but never used. I was hardcore into 3D back then, and projector 3D was much better quality. Then I saw these new 3D Vision 2 monitors with LightBoost; they looked pretty sweet, so I picked up a BenQ XL2420TX 3 months ago. Pretty sweet monitor, since it has HDMI 1.4 support for PS3 and 360 3D, plus full 1080p dual-link DVI support.

I'm considering selling my Dell monitors, getting 2 more of these and setting up 120 Hz 2D or 3D Surround... but to do that I'd have to build a brand new rig. My current platform can't deliver it... I need 3-way SLI with 680s lol.



disolitude said:

While we all wish for next gen consoles to be extremely powerful and cutting edge, here are a few logical reasons why this really doesn't have to be the case. 

1. 1080p resolution displays @ 60 Hz

This is the limit of 100% of TVs available on the market today and 95% of TVs that will be sold in the next 5 years. Even TVs that advertise 120 and 240 Hz don't actually accept an input signal higher than 60 Hz. Therefore 1080p@60fps is the optimal performance benchmark that next gen consoles need to aim for. 4K resolution gaming is not a reality for next gen consoles, no matter what Sony and Microsoft end up claiming on the box.
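
As a back-of-the-envelope check on why 1080p@60 Hz was the practical ceiling, here is a rough uncompressed-bandwidth estimate (assumes 24-bit colour and ignores blanking overhead, so real links need somewhat more):

```python
# Rough uncompressed video bandwidth: width x height x fps x bits/pixel.
# Ignores blanking overhead, so real HDMI links need somewhat more.
def gbit_per_second(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

print(f"1080p@60: {gbit_per_second(1920, 1080, 60):.1f} Gbit/s")  # ~3.0
print(f"4K@60:    {gbit_per_second(3840, 2160, 60):.1f} Gbit/s")  # ~11.9
# HDMI 1.4 (the standard of the day) carries roughly 8.2 Gbit/s of
# video data, which is why 4K was limited to 24/30 Hz at best.
```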

2. Gaming-optimized hardware and software

The hardware and software development tools that go into gaming consoles are highly optimized and let devs squeeze out as much performance as possible. A gaming machine doesn't have Windows or some other heavy-duty operating system running in the background, so more processing power can be devoted to game performance. Look at Halo 4 and the impressive visuals devs are getting out of the 7-year-old 360, which at this point has laughable specs compared to current PCs.

3. Game budgets 

AAA game development is already very expensive, and a massive leap in visual fidelity and production values is not something the industry as a whole can support.

4. Hardware cost

Looks like next gen will bundle motion sensors, tablets, waggle controllers and standard controllers. Trying to bundle all these technologies while also offering cutting-edge hardware and keeping the price at the $299-$399 level is not something 2 out of 3 console manufacturers can afford to do.

 

Don't get me wrong, I'm not saying that next gen consoles won't be more powerful than current gen consoles, far from it. However, we shouldn't be expecting cutting-edge, high-end-PC-like graphics performance on paper. When the consoles come out, games will look as good as they do on PC at 1080p, but on lesser hardware.

Remember, a console game running at 1080p@60fps looks exactly the same as a PC game running at 1080p@200fps on a 60 Hz display. It may actually look better at 60 fps, as you won't see any frame tearing.
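
To illustrate that claim, here is a minimal sketch (an illustration, not a real render loop) of how many distinct frames a fixed-refresh panel can actually show:

```python
# Minimal sketch: a fixed-refresh panel presents at most refresh_hz
# distinct images per second, no matter how fast the GPU renders.
def frames_visible_per_second(render_fps, refresh_hz=60, vsync=True):
    if vsync:
        # With vsync the GPU waits for the next refresh, so output
        # is capped at the panel's refresh rate.
        return min(render_fps, refresh_hz)
    # Without vsync the panel still refreshes only refresh_hz times;
    # extra frames get spliced into those refreshes, seen as tearing.
    return refresh_hz

print(frames_visible_per_second(200))  # 60: same as a 60 fps source
print(frames_visible_per_second(60))   # 60, with no wasted frames
```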


You are wrong.
We have only seen a few PC games running maxed out, and those were the first games on new engines: Battlefield 3, and maybe Crysis 2 with a patch.
Neither indicates how a completely maxed-out PC game, built with a full understanding of the engine, would look today.

PCs are at least 2 console gens ahead right now... Yes, I'm pretty sure we could achieve the power of 2 future console gens on PC right now if there were 1) enough money to build such a game and 2) enough people who would buy it.

When the next gen starts, we won't actually see what is possible at that time.
In 2013/14 we will (maybe) see slightly more beautiful games than The Last of Us, Beyond: Two Souls and Battlefield 3 on ultra settings, but probably not right when the consoles get released.
Yes, games like Uncharted 3 and Halo 4 look great. Now let the same games go all out on PC right now, using the best processor and the best graphics card available, and we would probably die of a heart attack, because it goes far beyond our imagination.

Next gen consoles have to be a lot more powerful than current consoles, otherwise they will be too far behind when they get released.
And I know no one can carry another "PS3 = $600/€600" console, especially not next gen when development costs double, but they have to find a way.
It's a weird situation: at the moment there isn't a single game that can max out top-notch PC hardware, and we already have games like Battlefield 3 on ultra settings. But I, for myself, would gladly pay 200-300 more for new consoles so that the leap in graphics, AI and other things is really big.



Slimebeast said:
disolitude said:
Slimebeast said:

Next gen won't be 1080p at 60fps because it's a waste of resources.

Resolution and framerate determine only a fraction of image quality.

Compare this:

[embedded video]

to this:

[embedded video]

Both in 1080p. So what?

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

No way man.

Eyefinity is a niche product and will remain so in the coming generation.

The importance of good framerates in online multiplayer games on PC has been an issue for at least a decade.

Why would it change in the console world just because you discovered it now?

In console online multiplayer, framerate makes a marginal difference because there is already input lag, display lag and network lag, which combined amount to roughly half a second.

The point of the videos was to show that graphics quality isn't mainly determined by image resolution. Both games in the videos are rendered in 1080p, yet one looks fabulous and the other looks like shit.

It takes a 4.5 times increase in GPU power to render an image at 1080p@60fps compared to the current gen standard of 720p@30fps (and not all current games even hit 720p).

That's nearly the whole performance budget that next gen brings (rumoured to be 6-8x the Xbox 360).

For example, you would struggle to run Unreal Engine 4 with all its particle effects and whatnot on next gen consoles at 1080p@60fps. So developers will instead choose to render games at 720p@60fps (COD and racing games), 1080p@30fps, or 720p@30fps.

Who says I've discovered that frame rate matters just now? Only in the last few years did LCD manufacturers release monitors that could deliver the kind of refresh rates CRTs used to have. Look up the Asus VG278HE: a 144 Hz monitor...

Regarding your video comparison argument, it doesn't stick because of the time difference. If you take 2 "current-looking" games and give one slightly better AA and more particle effects, but make the other run at 60 fps, the 60 fps one will look better 9 times out of 10. Only people who post screenshots on forums will be happy with the extra particle effects.

COD is already pretty much 720p@60 on consoles. Are you saying that next gen will be the same resolution, only with more smoke and particle effects? After the 8 years that have passed, that's the best they can do with hardware and processing?

Also, look at PC visuals today. Are you saying that we will magically see a brand new level of visual fidelity on the new consoles which today's PCs aren't able to deliver? All these high-profile cross-platform PC/console games look better on PC because they have higher-res textures, higher resolution, more anti-aliasing and better frame rates. There is no other magic sauce, despite what Epic Games says about the new engine they are trying to sell.



CGI-Quality said:
disolitude said:
CGI-Quality said:
disolitude said:

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

Affirmative. Now the question is, as an owner of a real 120 Hz device myself: what monitor are you using?

I used to have an Acer GD235HZ which I bought in early 2011 but never used. I was hardcore into 3D back then, and projector 3D was much better quality. Then I saw these new 3D Vision 2 monitors with LightBoost; they looked pretty sweet, so I picked up a BenQ XL2420TX 3 months ago. Pretty sweet monitor, since it has HDMI 1.4 support for PS3 and 360 3D, plus full 1080p dual-link DVI support.

I'm considering selling my Dell monitors, getting 2 more of these and setting up 120 Hz 2D or 3D Surround... but to do that I'd have to build a brand new rig. My current platform can't deliver it... I need 3-way SLI with 680s lol.

I had the BenQ XL2420T last month, but it had three dead pixels that I just couldn't shrug off. Returned it for the Samsung SA23700D. Best gaming monitor I could have chosen, even if the 3D isn't a friend of NVIDIA. :P

Btw, you thought I was crazy for running one 690 on a single monitor in 1080p; I have a friend who bought three, and plans to use all of them.....wait for it.....on just a 27" 1080p monitor (not even 120 Hz at that). So, in a sense, that's 6-way SLI in 1080p. What do you say to a person like that?


Lol, your friend is indeed crazy, because 3-way dual-GPU SLI doesn't work. Quad SLI is the max, and even that scales really poorly. The best he can do is run quad SLI and use the 3rd card for PhysX. But that's such a massive waste... 3-way GTX 680 SLI outperforms dual GTX 690s because 3-way SLI still scales decently (~250% performance vs. 1 card), whereas quad SLI just doesn't scale well at all. Drivers aren't there yet...
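
As a rough illustration of that scaling argument: the ~250% three-way figure comes from the post above, while the quad SLI number below is an assumed, hypothetical value chosen only to reflect "doesn't scale well":

```python
# Illustrative multi-GPU scaling comparison (not benchmarks).
single_gpu = 1.0

three_way_680 = 2.5 * single_gpu  # ~250% of one card, as claimed above
quad_sli_690 = 2.2 * single_gpu   # hypothetical poor 4-way scaling

print(f"3x GTX 680 (3-way SLI): {three_way_680:.1f}x a single card")
print(f"2x GTX 690 (quad SLI):  {quad_sli_690:.1f}x a single card")
```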

What killed every Samsung 3D monitor for me is the lack of VESA mounts... That is a very concerning trend that newer monitors tend to follow. I don't care how slim it is: no VESA mount, no buy.



disolitude said:
Slimebeast said:
disolitude said:
Slimebeast said:

Next gen won't be 1080p at 60fps because it's a waste of resources.

Resolution and framerate determine only a fraction of image quality.

Compare this:

[embedded video]

to this:

[embedded video]

Both in 1080p. So what?

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

No way man.

Eyefinity is a niche product and will remain so in the coming generation.

The importance of good framerates in online multiplayer games on PC has been an issue for at least a decade.

Why would it change in the console world just because you discovered it now?

In console online multiplayer, framerate makes a marginal difference because there is already input lag, display lag and network lag, which combined amount to roughly half a second.

The point of the videos was to show that graphics quality isn't mainly determined by image resolution. Both games in the videos are rendered in 1080p, yet one looks fabulous and the other looks like shit.

It takes a 4.5 times increase in GPU power to render an image at 1080p@60fps compared to the current gen standard of 720p@30fps (and not all current games even hit 720p).

That's nearly the whole performance budget that next gen brings (rumoured to be 6-8x the Xbox 360).

For example, you would struggle to run Unreal Engine 4 with all its particle effects and whatnot on next gen consoles at 1080p@60fps. So developers will instead choose to render games at 720p@60fps (COD and racing games), 1080p@30fps, or 720p@30fps.

Who says I've discovered that frame rate matters just now? Only in the last few years did LCD manufacturers release monitors that could deliver the kind of refresh rates CRTs used to have. Look up the Asus VG278HE: a 144 Hz monitor...

Regarding your video comparison argument, it doesn't stick because of the time difference. If you take 2 "current-looking" games and give one slightly better AA and more particle effects, but make the other run at 60 fps, the 60 fps one will look better 9 times out of 10. Only people who post screenshots on forums will be happy with the extra particle effects.

COD is already pretty much 720p@60 on consoles. Are you saying that next gen will be the same resolution, only with more smoke and particle effects? After the 8 years that have passed, that's the best they can do with hardware and processing?

Also, look at PC visuals today. Are you saying that we will magically see a brand new level of visual fidelity on the new consoles which today's PCs aren't able to deliver? All these high-profile cross-platform PC/console games look better on PC because they have higher-res textures, higher resolution, more anti-aliasing and better frame rates. There is no other magic sauce, despite what Epic Games says about the new engine they are trying to sell.

Framerates in PC gaming have been important for at least 10 years, with 60+ fps being the norm. And yet that hasn't spread to consoles.

Graphics aren't made of resolution, AA, AF and framerate alone; it's just that PC gamers are very familiar with those features because they're easy to tweak.

But true advancements in graphics quality come from a multitude of other techniques, such as increased polygon count, HDR lighting, increased texture resolution, increased draw distance, real-time dynamic lighting, ambient occlusion, bump mapping, normal maps, volumetric particle effects, lighting & shadow on particles, depth of field, tessellation, real-time reflections and countless more.

Those are the reasons why next gen graphics won't be focused on increased screen resolution and framerate.

And those are also the reasons why next gen consoles need to be as powerful as possible.



Slimebeast said:
disolitude said:
Slimebeast said:
disolitude said:
Slimebeast said:

Next gen won't be 1080p at 60fps because it's a waste of resources.

Resolution and framerate determine only a fraction of image quality.

Compare this:

[embedded video]

to this:

[embedded video]

Both in 1080p. So what?

Not sure if you're being funny, as one of those videos is 17 years old and the other is less than a year old. You should follow PC gaming trends more closely, as they usually signal where console gaming is headed. Current PC trends are 120-144 Hz monitors and Eyefinity/Surround: higher frame rates and higher resolutions...

As someone who has a true 120 Hz monitor in order to play twitch games on PC at well over 60 frames per second, I can tell you that frame rate makes a world of difference. I see 60 fps becoming the norm, even if 720p resolution is used. They can mask a lack of resolution with effects and AA, but not a lack of frame rate.

No way man.

Eyefinity is a niche product and will remain so in the coming generation.

The importance of good framerates in online multiplayer games on PC has been an issue for at least a decade.

Why would it change in the console world just because you discovered it now?

In console online multiplayer, framerate makes a marginal difference because there is already input lag, display lag and network lag, which combined amount to roughly half a second.

The point of the videos was to show that graphics quality isn't mainly determined by image resolution. Both games in the videos are rendered in 1080p, yet one looks fabulous and the other looks like shit.

It takes a 4.5 times increase in GPU power to render an image at 1080p@60fps compared to the current gen standard of 720p@30fps (and not all current games even hit 720p).

That's nearly the whole performance budget that next gen brings (rumoured to be 6-8x the Xbox 360).

For example, you would struggle to run Unreal Engine 4 with all its particle effects and whatnot on next gen consoles at 1080p@60fps. So developers will instead choose to render games at 720p@60fps (COD and racing games), 1080p@30fps, or 720p@30fps.

Who says I've discovered that frame rate matters just now? Only in the last few years did LCD manufacturers release monitors that could deliver the kind of refresh rates CRTs used to have. Look up the Asus VG278HE: a 144 Hz monitor...

Regarding your video comparison argument, it doesn't stick because of the time difference. If you take 2 "current-looking" games and give one slightly better AA and more particle effects, but make the other run at 60 fps, the 60 fps one will look better 9 times out of 10. Only people who post screenshots on forums will be happy with the extra particle effects.

COD is already pretty much 720p@60 on consoles. Are you saying that next gen will be the same resolution, only with more smoke and particle effects? After the 8 years that have passed, that's the best they can do with hardware and processing?

Also, look at PC visuals today. Are you saying that we will magically see a brand new level of visual fidelity on the new consoles which today's PCs aren't able to deliver? All these high-profile cross-platform PC/console games look better on PC because they have higher-res textures, higher resolution, more anti-aliasing and better frame rates. There is no other magic sauce, despite what Epic Games says about the new engine they are trying to sell.

Framerates in PC gaming have been important for at least 10 years, with 60+ fps being the norm. And yet that hasn't spread to consoles.

Graphics aren't made of resolution, AA, AF and framerate alone; it's just that PC gamers are very familiar with those features because they're easy to tweak.

But true advancements in graphics quality come from a multitude of other techniques, such as increased polygon count, HDR lighting, increased texture resolution, increased draw distance, real-time dynamic lighting, ambient occlusion, bump mapping, normal maps, volumetric particle effects, lighting & shadow on particles, depth of field, tessellation, real-time reflections and countless more.

Those are the reasons why next gen graphics won't be focused on increased screen resolution and framerate.

And those are also the reasons why next gen consoles need to be as powerful as possible.


You're just describing the settings that PC games have in the "advanced graphics" tab. You forgot the mighty ubersampling...

Most of those elements are either supported or not supported by the game engine or hardware. Where they are supported, they're implemented in a way that isn't too taxing on performance. Tessellation, for example, isn't supported on current gen consoles because the GPU doesn't support it, not because of a lack of performance...

I understand what you're saying, and I agree that visuals are advanced with techniques and tools other than resolution and frame rate. However, I disagree that we will see a massive leap in dynamic lighting and particle effects the day next gen consoles come out. Those things improve over time with better development techniques/tools. What we will see is a jump in resolution or frame rate, and possibly both.

Look at GTA IV vs GTA V, for example. From what we've seen, GTA IV looks much worse than GTA V because of the optimizations made to the game engine: draw distance, character models, textures and mapping are all much better in GTA V. Yet the game doesn't run at a lower resolution or half the frame rate. That comes from experience with the hardware and from optimizing the engine for the hardware at hand.

Same goes for next gen. Day 1 games will not have 10x better particle effects; they will have higher resolutions and frame rates. Slowly, over time, devs will optimize for the hardware to keep up with the latest technology and visual development techniques.



The next generation consoles ought to be quite efficient given the industry-wide move towards power-efficient designs: hardware vendors have hit their various thermal and power barriers and have spent half a decade improving efficiency more than outright performance. The PS3 and Xbox 360 were designed right on the cusp of the end of the GHz wars, and since then designers have been forced to target performance within a limited power budget, making great strides over the past 5-6 years.

I don't believe we'll see sub-100W consoles from Microsoft or Sony, so the consoles ought to have at least 6-7x the outright performance, with an additional efficiency gain on top: embedded memory, for instance, has improved to the point where it is a practical choice for 1080p games, and the designs won't be clocked past the point of efficiency. The Wii U was designed efficiently, and we ought to see even more efficient designs given a significantly greater power budget, so next generation performance shouldn't surprise anyone, so long as expectations are realistic.



Tease.

wfz said:
It's not just about resolution: AI, physics, etc. There are so many areas games can improve in. I look forward to the day when I feel a lot more immersion through my interaction and experiences with the terrain around me, rather than it all feeling like a bunch of pretty geometric shapes.


Well, if devs would invest the time, they could make AI at least as much better than 2005 AI as PS3 and 360 graphics got better from 2005 to 2012. I mean, average AI is as dumb as it was on the PS2, the original Xbox, etc. So despite the huge jump in processing power, AI is just stupid.

They didn't care about AI, and they won't in the future, unless you give them an AI core that can do nothing but calculate AI, so they are forced to use it. Otherwise they will simply use the power they have for more "WOW OMG WTF" stuff that the easily impressed customer wants.

When I play "newer" PS2 or Wii games, I don't really see that 360, PS3 or PC games have SIGNIFICANTLY better AI. (For example FIFA: the AI on every platform is as dumb as on the other platforms. Same with COD, etc. If the AI behaves smarter, it's because more is scripted, not because the NPC brain uses more CPU to decide what to do.)

No one cares, and they don't seem to see the issues; and as long as no one cares, they won't come up with smart solutions to fix issues they don't "see".