VanceIX said:

I think that both Sony and Microsoft themselves were uncertain about the final hardware specifications of their consoles until the last few months before release, which would lead to devs not having access to a solid development kit. I'm sure Sony wasn't positive about the unified GDDR5 memory until the last year, as just two or three years ago the cost would have been too prohibitive to include 8 GB of it. Also, the first-generation Kinect was released towards the end of last generation, which could mean that Kinect 2.0 was only recently finished and finalized. Finally, AMD's unified APUs were introduced only in 2011, which means that the decision to use unified chips was probably a recent one, and truly graphically competent APUs weren't even here until 2012-2013 (and, seeing as the development cycle for most triple-A games is around 2-3 years, a year or at most two probably wouldn't be enough for developers to transition console development).

This leads me to believe that devs have had much less time to fiddle around with development kits compared to previous generations. Thoughts? 


It's partly the developers' own fault.
Most developers are scrambling to rebuild their game engines and assets to take advantage of the new horsepower offered by the next-gen twins, which takes a ton of time; a few developers, however, didn't have to.
Frostbite, 4A, and CryEngine were already "next gen" with assets to match on the PC years before the new consoles dropped, which translated over to the PlayStation 4 and Xbox One almost instantly, with Battlefield 4 as a prime example; all that occurred was a slight downgrade in image quality (i.e. High rather than Ultra settings, and 1080p or lower).

If the developers behind Call of Duty, Assassin's Creed, Grand Theft Auto, etc. had had "next gen" assets and engines ready, they wouldn't have been caught off guard as severely; people are still waiting for GTA to arrive on the PC, Xbox One, and PlayStation 4 because Rockstar Games lacked that next-gen foresight.

As for GPUs, the PC has had single GPUs whose theoretical performance eclipses the PlayStation 4's GPU for roughly 5 years now (6-7 years for multi-GPU solutions); however, it wasn't until a couple of years ago that such performance levels became affordable for a cheap, low-cost device, not to mention that thermal and power constraints had to be factored in as well. A rough back-of-the-envelope comparison is below.
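For a ballpark sense of scale, assuming the usual theoretical throughput estimate of shaders x clock x 2 ops per cycle (FMA):

PlayStation 4 GPU: 1152 shaders x 800 MHz x 2 = ~1.84 TFLOPS
Radeon HD 5870 (late 2009, single GPU): 1600 shaders x 850 MHz x 2 = ~2.72 TFLOPS

These are paper numbers only and ignore architecture, memory bandwidth, and API overhead, but they show how a single desktop GPU from roughly five years earlier already exceeded the console on raw theoretical throughput.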

So really, it's not that developers were "caught off guard"; instead, they simply wanted to maximise profits from their current engines and assets for as long as they possibly could. Game engines and assets, after all, cost money to develop or license, which does not come free. (Hence the over-reliance on Unreal Engine 3, which looked ugly.)




www.youtube.com/@Pemalite