
Forums - Gaming - Have we hit the technology wall?

naznatips said:
rocketpig said:

You're completely ignoring the fact that this generation is barely getting rolling. Two years into the sixth generation, we were still seeing games that were marginally better than the PS1 generation and the Xbox/GC still hadn't even released.

There is still three to four years left in this gen and games will continue to look better and better.

PS. If you don't see huge advancements from GoWII (last major release in the previous gen) to Heavenly Sword, Gears, and Mass Effect, you aren't looking very hard. It's not only the graphics; the physics, environments, AI, and number of things onscreen are making huge gains. 


This generation may be young, but it's already run its course graphically. We already have Crysis beyond any possibility of running on a home console, and Mass Effect pushing the hardware past what it can really do. I don't think you should expect much, if any, graphical improvement in the future.

As far as AI leaps, what games are you talking about? A few big next-gen games come to mind for me: CoD4, Assassin's Creed, and Heavenly Sword. None of those have AI better than last gen; some are arguably much worse. I know AI should theoretically be better this generation, but it simply isn't. Not yet.


I'm sorry, but saying that the generation has run its course graphically is silly. Just because some games now hiccup with bad framerates, screen tearing, etc. in exchange for better graphics doesn't mean future improvements won't be made as developers get to grips with the hardware. Most companies are still on their first game of this generation. You can't expect them to suddenly stop improving when every generation in history has shown steady graphical improvement over its life. Do you seriously think that Gears 2 won't look better than the original?

As far as AI, I can't think of a game last generation that offers the AI immersion and intelligence of a game like Assassin's Creed. It's definitely not perfect but there is an improvement there, without a doubt. Older systems couldn't even handle the crowds of the newer games, much less the improved AI. 




Or check out my new webcomic: http://selfcentent.com/

rocketpig said:
naznatips said:
rocketpig said:

You're completely ignoring the fact that this generation is barely getting rolling. Two years into the sixth generation, we were still seeing games that were marginally better than the PS1 generation and the Xbox/GC still hadn't even released.

There is still three to four years left in this gen and games will continue to look better and better.

PS. If you don't see huge advancements from GoWII (last major release in the previous gen) to Heavenly Sword, Gears, and Mass Effect, you aren't looking very hard. It's not only the graphics; the physics, environments, AI, and number of things onscreen are making huge gains. 


This generation may be young, but it's already run its course graphically. We already have Crysis beyond any possibility of running on a home console, and Mass Effect pushing the hardware past what it can really do. I don't think you should expect much, if any, graphical improvement in the future.

As far as AI leaps, what games are you talking about? A few big next-gen games come to mind for me: CoD4, Assassin's Creed, and Heavenly Sword. None of those have AI better than last gen; some are arguably much worse. I know AI should theoretically be better this generation, but it simply isn't. Not yet.


I'm sorry, but saying that the generation has run its course graphically is silly. Just because some games now hiccup with bad framerates, screen tearing, etc. in exchange for better graphics doesn't mean future improvements won't be made as developers get to grips with the hardware. Most companies are still on their first game of this generation. You can't expect them to suddenly stop improving when every generation in history has shown steady graphical improvement over its life. Do you seriously think that Gears 2 won't look better than the original?

As far as AI, I can't think of a game last generation that offers the AI immersion and intelligence of a game like Assassin's Creed. It's definitely not perfect but there is an improvement there, without a doubt. Older systems couldn't even handle the crowds of the newer games, much less the improved AI. 


I think the point is there IS a lot of incremental improvement, and it will continue as developers master the new systems. However, the change from PS2 to PS3 or Xbox to 360 is nothing compared to the change from 2D to 3D environments or other leaps that new generations of systems have brought in the past. I look at a game like Virtua Fighter 5 and yeah, sure, it's pretty, but does it play any differently than VF4? Most, if not all, games so far have been the same games we've already played, just prettier, with more crowds, better lighting, etc.

The one game that in my mind might really seem 'next gen' is Force Unleashed. If you're not familiar with what LucasArts is doing with it, go to IGN and check it out. But still, although it'll make the world seem 'real', gameplay may remain essentially the same.



 

fazz said:

"It seems to me we are not far off where the hardware is no longer limiting games"

Wrong. PS3 and 360 are already limited. There are some games out there that can't run on their hardware because it's too weak (I won't name names). And seriously, PS3 and 360 are faaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaar below some technology that's already out there in other places. Don't think they are "next-gen" or "above next-gen", because they're not.

You don't have to worry, in years to come you'll see more and more games that will make you say "OMFG that's real time?!"


Yes and we all know that the CELL PROCESSOR can run circles around any computer.



 

mM
Gamerace said:

I think the point is there IS a lot of incremental improvement, and it will continue as developers master the new systems. However, the change from PS2 to PS3 or Xbox to 360 is nothing compared to the change from 2D to 3D environments or other leaps that new generations of systems have brought in the past. I look at a game like Virtua Fighter 5 and yeah, sure, it's pretty, but does it play any differently than VF4? Most, if not all, games so far have been the same games we've already played, just prettier, with more crowds, better lighting, etc.

The one game that in my mind might really seem 'next gen' is Force Unleashed. If you're not familiar with what LucasArts is doing with it, go to IGN and check it out. But still, although it'll make the world seem 'real', gameplay may remain essentially the same.


I'll agree that the jump isn't phenomenal for most games, but I keep hearing people bring up the PS2... Exactly what "huge leap" was made from gen five to six? Sure, the move from 2D to 3D was huge, but the last 2D consoles were three generations ago. What have they done for us lately if improved environments, graphics, and AI aren't moving the industry forward enough?

In short, if you're going to criticize this generation, you also have to criticize last generation.

As for Force Unleashed, it has real potential and it's stuff like that where I think we'll see the most progression. Assassin's Creed brought us incredible cityscapes, Mass Effect brought us a movie-like experience, and Force Unleashed has the potential to break open the physics with completely destructible surroundings.

Things are still moving forward, whether people want to admit it or not. It almost seems as if defending game progression in technology is an affront to some Wii fanboys based on the way they want technology to grind to a halt (as if suddenly graphics, AI, and physics don't matter anymore now that waggle exists). I just don't get it. Some people want different things from different games.





rocketpig said:

The entire dialogue & conversation system would suffer greatly if Mass Effect wasn't able to bump those great textures, use depth-of-field, etc. One thing people are forgetting is that these graphical achievements allow companies to stop using the bloody cutscene. The graphical engines are allowing gorgeous movie-like interactive situations instead of sitting there watching a cutscene.


But you are more or less watching real-time cutscenes in Mass Effect's conversation system. You pick a response, and it plays out. There are several actual cutscenes in Mass Effect that you can't skip as well. Companies are still using cutscenes. I really don't think bad textures would kill Mass Effect's story for me, especially considering you're constantly seeing unfinished ones for a few seconds at every cut from start to finish.

Bad voice acting might have, since there's so much of it, but so long as I knew what the textures were supposed to be, it would have been more or less the same game for me.



leo-j said:
fazz said:

"It seems to me we are not far off where the hardware is no longer limiting games"

Wrong. PS3 and 360 are already limited. There are some games out there that can't run on their hardware because it's too weak (I won't name names). And seriously, PS3 and 360 are faaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaar below some technology that's already out there in other places. Don't think they are "next-gen" or "above next-gen", because they're not.

You don't have to worry, in years to come you'll see more and more games that will make you say "OMFG that's real time?!"


Yes and we all know that the CELL PROCESSOR can run circles around any computer.


Too bad that the Cell is only one piece of the puzzle. There is no denying that top-end computers are capable of much more than either the PS3 or 360. Of course, they also cost roughly four times as much. 





BrainBoxLtd said:
rocketpig said:

The entire dialogue & conversation system would suffer greatly if Mass Effect wasn't able to bump those great textures, use depth-of-field, etc. One thing people are forgetting is that these graphical achievements allow companies to stop using the bloody cutscene. The graphical engines are allowing gorgeous movie-like interactive situations instead of sitting there watching a cutscene.


But you are more or less watching real-time cutscenes in Mass Effect's conversation system. You pick a response, and it plays out. There are several actual cutscenes in Mass Effect that you can't skip as well. Companies are still using cutscenes. I really don't think bad textures would kill Mass Effect's story for me, especially considering you're constantly seeing unfinished ones for a few seconds at every cut from start to finish.

Bad voice acting might have, since there's so much of it, but so long as I knew what the textures were supposed to be, it would have been more or less the same game for me.

Whether they look like cutscenes or not is irrelevant. You have to realize that even a Blu-ray disc probably wouldn't hold all of the pre-rendered video needed to play Mass Effect if the game engine didn't power the entire game. Not to mention the millions of dollars BioWare would have needed to put into the game to pre-render all that footage. Mass Effect wouldn't have nearly the impact if you forced lower resolutions into the storytelling. You may disagree with me, but I have a feeling that most people with 40" or larger HDTVs will agree with me on this one.





Consoles have sort of hit a wall. PCs have just smashed through one that's about 10 years old. If we're talking about tech jumps, for the most part you have to exclude consoles, because that isn't where the cutting edge is.

The 360 and PS3 use fairly old technology. Something like 2004/2005. Not ancient, but by tech standards 2-3 years is pretty old.

An article on why Crysis will never be on a console.
http://www.crysis-online.com/Articles/crysis-on-consoles-the-facts-of-the-matter.php

This level of computation is only now being explored. Dual- and quad-core processors, these new graphics cards coming out, maybe (mmmaybe) PhysX cards, and DirectX 10 are opening some new doors.

You thought 800 zombies on the screen at once was awesome. How about thousands?

Granted, it's still in its infancy. Crysis and the new Star Wars game on PC will nudge what's possible, but within 2 years I imagine consoles will seem archaic and quaint compared to the massive leaps these new processors are capable of.

Engines like euphoria are making animation a ton easier, which will mitigate other expenses. I'd look up more on what else this new tech will make easier, but I'm too lazy at the moment.

Ironically, the two biggest gaming tech jumps have been the Wii (horrible power-wise, but a revolutionary controller) and the PC (same old controller, insane power).



HECK NO!

Have you seen the Wii's graphics? We've got a long way to go.



rocketpig said:
BrainBoxLtd said:
rocketpig said:

The entire dialogue & conversation system would suffer greatly if Mass Effect wasn't able to bump those great textures, use depth-of-field, etc. One thing people are forgetting is that these graphical achievements allow companies to stop using the bloody cutscene. The graphical engines are allowing gorgeous movie-like interactive situations instead of sitting there watching a cutscene.


But you are more or less watching real-time cutscenes in Mass Effect's conversation system. You pick a response, and it plays out. There are several actual cutscenes in Mass Effect that you can't skip as well. Companies are still using cutscenes. I really don't think bad textures would kill Mass Effect's story for me, especially considering you're constantly seeing unfinished ones for a few seconds at every cut from start to finish.

Bad voice acting might have, since there's so much of it, but so long as I knew what the textures were supposed to be, it would have been more or less the same game for me.

Whether they look like cutscenes or not is irrelevant. You have to realize that even a Blu-ray disc probably wouldn't hold all of the pre-rendered video needed to play Mass Effect if the game engine didn't power the entire game. Not to mention the millions of dollars BioWare would have needed to put into the game to pre-render all that footage. Mass Effect wouldn't have nearly the impact if you forced lower resolutions into the storytelling. You may disagree with me, but I have a feeling that most people with 40" or larger HDTVs will agree with me on this one.


I have a 50" HD set. For me, the most impressive part of Mass Effect was the camera direction; it seemed to be one of the few games to actually direct the camera in dramatic ways. That's what stood out to me, not the textures and the lighting.