
Forums - Microsoft - Has the Xbox 360 reached its peak in terms of graphics? Please read!!

Ender said:
SMcc1887 said:
badgenome said:
The people who made Gears 2 say it hasn't, so I guess not.

The people who made Gears 2 also stated that they blew up over 5 X360s in the production of the game, so I think Gears 2 is the 360's max.

 

 

huh?

 

Yes, Epic confirmed that they busted over 5 Xboxes during the production of Gears 2, as the system occasionally was unable to cope with the game.



mrstickball said:
No.

The PlayStation 2 is just about tapped out... but it's taken 8 YEARS to do that. Why expect the X360 to have its power tapped in 3?

Are X360 games getting 'saturated' - to the point that graphics increases may be minimal? We're probably getting there in '09. But the fact is, if a team wants to improve on every other game, the horsepower still exists to do so.

It'd be incredibly asinine to say that it's reached its peak in 3 years... given that God of War 2 came out 6 years into the PS2's lifespan.

I thought the argument (in favor of 360 development) has always been that the 360 is tons easier to develop on since it has a PC-like architecture, unlike the PS2 and PS3, which are foreign.  So theoretically, wouldn't it take a shorter amount of time to max out a 360 than the PS2/3?

I don't think it's peaked yet, but I'd probably say Gears 2 will still be one of the top 5 best-looking 360 games when all is said and done.



SMcc1887 said:
Ender said:
SMcc1887 said:
badgenome said:
The people who made Gears 2 say it hasn't, so I guess not.

The people who made Gears 2 also stated that they blew up over 5 X360s in the production of the game, so I think Gears 2 is the 360's max.

 

 

huh?

 

Yes, Epic confirmed that they busted over 5 Xboxes during the production of Gears 2, as the system occasionally was unable to cope with the game.

 

What you are describing doesn't make any sense.  Simply using the hardware's processing power shouldn't make a system blow up.  That's like something out of an early '80s film.

 



Ronster316 said:
Barozi said:
Seraphic_Sixaxis said:
Ronster316 said:
Seraphic_Sixaxis said:
@Ron - Lol wtf, H-Telcom wouldn't do that... RoadRunner would though...

And wtf, I didn't even know the UK had AOL... 0_o

 

Yup, we have AOL......... and it was great.......... about 5 years ago. Then I think Carphone Warehouse took over it here in the UK and drove it into the ground; the only way you'd know it's AOL is because it's still called...... wait for it...... AOL lol

You used to be able to check people's profiles instantly; now you have to go somewhere like Bebo. It really is sad, because it genuinely was great many moons ago.

Ouch... but doesn't AOL stand for America Online or something? Since you're from the UK and all... lol.

Too bad, sounds like it was a top-class service for you guys too.

AOL has always been terrible in the States though... right from the damned get-go. ^_^

AOL was baaad here in Germany.

Was it ever at least half good at any stage in Germany? Or was it always Grade A poop?

Looks like AOL in Europe is on its last legs; hardly any of the people on my AOL buddy list come online anymore. Looks like everyone's jumped ship.

To be honest, I don't know anybody who still uses the AOL service these days.

Though I can remember 2 or 3 guys using it 5 years ago.... ^^



SMcc1887 said:

Can you put more textures on a Blu-ray disc? Sure. But if the PS3 can't actually render them, then the point is moot.

However, raw uncompressed data is more likely to have better-quality visuals, as you will probably see in 2-4 years' time (if the PS3 is still here XD)

 

You really don't understand computers outside of simple concepts, do you? We're not talking about streamed video to a display; we're talking about textures that need rendering. Compressed or uncompressed makes no difference to the visual quality. It makes a difference in how long the data takes to load and be prepared for use.

For the record, since compression makes no visual difference in regards to 3D, the Blu-ray's size means it can store vastly more compressed data.

Now how about actually doing some real research rather than PR research?
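To make that concrete, here is a minimal Python sketch of the lossless case (the data and names below are invented purely for illustration, not anything from a real game pipeline): a zlib round-trip hands back bit-identical bytes, so the rendered texture can't look any different; only the disc footprint and the load/decode time change.

import zlib

# Hypothetical stand-in for raw texture bytes (not a real game asset).
raw_texture = bytes(range(256)) * 4096        # roughly 1 MiB of sample data

compressed = zlib.compress(raw_texture, 9)    # smaller on disc...
restored = zlib.decompress(compressed)        # ...but bit-identical once loaded

assert restored == raw_texture                # same bytes, so the same pixels on screen
print("raw: %d bytes, compressed: %d bytes" % (len(raw_texture), len(compressed)))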

 



Squilliam: On Vgcharts it's a commonly accepted practice to twist the bounds of plausibility in order to support your argument or agenda, so I think it's pretty cool that this gives me the precedent to say whatever I damn well please.

DMeisterJ said:

 

I thought the argument (in favor of 360 development) has always been that the 360 is tons easier to develop on since it has a PC-like architecture, unlike the PS2 and PS3, which are foreign.  So theoretically, wouldn't it take a shorter amount of time to max out a 360 than the PS2/3?

I don't think it's peaked yet, but I'd probably say Gears 2 will still be one of the top 5 best-looking 360 games when all is said and done.

Ditto.



@Barozi, how ironic: you don't know of anyone who has used it for 5 years, and I said in an earlier post that it hasn't been any good for 5 years. Will the last person to leave AOL in Europe please turn out the lights lol



CaptDS9E said:
The PS2 has been around forever, and the newest games still improved. People couldn't believe how good God of War 2 looked on the system. There is always room for improvement. People figure out new ways of compression, new ways of manipulating processing power, etc.

 

This is dead-on true.  There are obviously some sort of bandwidth limits that keep a PS2 game from looking as good as a PS3/360 game can look if the same amount of effort is applied, but Shadow of the Colossus was another example of doing what people said couldn't be done on the PS2.

The question isn't "when does console X become graphically tapped out?", but "what is the rate of diminishing returns for graphical improvement?", i.e. how much harder does developer Y have to work to get incremental improvement Z out of that console?

The 360 and PS3 will continue to be pushed to new levels for years to come; it will just become more and more expensive, development-wise, to do so.
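To put toy numbers on the diminishing-returns point, here is a small Python sketch (the curve and figures are invented purely for illustration, not measurements of any console): if perceived quality grows roughly logarithmically with development effort, each extra notch of quality costs more effort than the one before it.

import math

def perceived_quality(effort):
    # Hypothetical model: perceived quality grows logarithmically with development effort.
    return math.log(1 + effort)

def effort_for_quality(level):
    # Inverse of the toy model: effort needed to reach a given quality level.
    return math.exp(level) - 1

previous = 0.0
for level in (1, 2, 3, 4):
    effort = effort_for_quality(level)
    assert abs(perceived_quality(effort) - level) < 1e-9   # sanity-check the inverse
    print("quality %d: total effort %.1f (+%.1f over the previous level)" % (level, effort, effort - previous))
    previous = effort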



SMcc1887 said:
badgenome said:
The people who made Gears 2 say it hasn't, so I guess not.

The people who made Gears 2 also stated that they blew up over 5 X360s in the production of the game, so I think Gears 2 is the 360's max.

 

LMAO. Tell me you didn't actually believe that.

 



badgenome said:
SMcc1887 said:
badgenome said:
The people who made Gears 2 say it hasn't, so I guess not.

The people who made Gears 2 also stated that they blew up over 5 X360s in the production of the game, so I think Gears 2 is the 360's max.

 

LMAO. Tell me you didn't actually believe that.

 

 

There are times when a developer can "blow up" the hardware by doing something the hardware designers never intended that causes a component to significantly overheat.  I've personally experienced this when I worked at another company years ago... we could burn out a modem chip by doing wonky things with some of the registers.  It's also happened with CPUs, etc.

So the above isn't outside the realm of possibility.  However... it's very rare for something like that to happen.  Most hardware designs take this sort of thing into account and include built-in limits to prevent catastrophic failures.