theprof00 said:
walsufnir said:
fillet said:

He did, and that wasn't a dig at zarx, it was very well put and respect to him for it.

But it's kinda obvious stuff really. Everyone here knows enough about development to understand that timing and hitting the expected target are critical for console gaming.

When games are "polished", ironing out frame rate issues is always going to be part of that; changing architecture would simply throw away all that consistency. That is obvious.

Yes, to some people it is crystal clear :) But keep in mind this doesn't apply to all users here. Sometimes stating the obvious is useless, but obviously prof's knowledge of technical things is so bad that he had to start this thread. I still wonder if there is more to it...

I'm asking for proof, not conjecture.

I know all of these reasons already. I know that changing something like the CPU/GPU is a huge deal. I know there are contracts in place. I know this would result in massive delays. I know this would wreck all the work developers have previously put into their games.

But is there PROOF?

Do we have a link, or someone in the industry saying quite clearly, "these contracts are all written long before production even begins", or "swapping a GPU for a completely different one would create massive problems, even six months out from release"? I know the evidence, and I actually work with computers.

If you know what a big deal it would be and the problems it would cause, why do you need proof? The proof is simple common sense. COULD a company do it? Sure, but it would not be pretty. WOULD a company do it? IIRC the only company to have done it is SEGA, and they don't make hardware anymore.