Shadow1980 said:
GProgrammer said:
Technological improvements are greatly slowing down; in 20 years' time the improvements will be far less than they were 20 years ago. E.g., take the earliest and the latest fastest common Intel processor of the i7 line:

Core i7-975 Extreme Edition, released June 2009: 6,199 PassMark CPU score
Core i7-7700K, released January 2017: 12,196 PassMark CPU score

In 8 years performance hasn't even doubled; maybe in 20 years' time performance will have doubled again. The fastest Intel processor from 20 years ago was the Pentium II 300. I don't know what its PassMark score is, probably about 20.
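For what it's worth, a quick back-of-the-envelope on those quoted PassMark scores (a sketch using the rounded ~8-year span from the quote, not exact release dates):

```python
import math

# PassMark CPU scores quoted above
i7_975 = 6199     # Core i7-975 Extreme Edition, June 2009
i7_7700k = 12196  # Core i7-7700K, January 2017
years = 8         # roughly 8 years between them, as stated

growth = i7_7700k / i7_975            # total multiple, just under 2x
cagr = growth ** (1 / years) - 1      # implied compound annual growth rate
doubling_time = math.log(2) / math.log(1 + cagr)

print(round(growth, 2))         # ~1.97x total
print(round(cagr * 100, 1))     # ~8.8% per year
print(round(doubling_time, 1))  # ~8.2 years to double at that rate
```

At roughly 9% a year, that's a doubling time of about 8 years, versus the roughly 18-to-24-month doubling the classic Moore's law era implied.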
Even GPU power growth has slowed. The N64's GPU was rated at 100 MFLOPs, while 10 years later the 360's GPU was rated at 240 GFLOPs, a 2,400x increase. Fast-forward another decade and you have the PS4 Pro with a 4.2 TFLOP GPU, only a 17.5x increase over the 360's GPU. I think the most powerful GPU today is rated at something like 12 TFLOPs, only a 50x increase over the 360's. To get the kind of increase we saw from the N64 to the 360, we would need to have seen something well past the 100 TFLOP mark. Now, there may have been GPUs out there a good bit more powerful than the N64's, and they were probably designed differently from the ones today. But still, we're talking about something that can be put in a roughly $400 box ($260 in 1996 dollars), and we're talking three orders of magnitude from '96 to '06 but only one order of magnitude from '06 to '16.
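The arithmetic above checks out; here it is as a quick script (the figures are the rounded ratings quoted in the post, not precise specs):

```python
import math

# Rough GPU ratings quoted above, in FLOPs
n64_gpu = 100e6       # N64, ~100 MFLOPs (1996)
x360_gpu = 240e9      # Xbox 360, ~240 GFLOPs (2005)
ps4_pro_gpu = 4.2e12  # PS4 Pro, ~4.2 TFLOPs (2016)
top_gpu = 12e12       # roughly the top GPU of today, ~12 TFLOPs

print(x360_gpu / n64_gpu)              # 2400.0x, N64 -> 360
print(ps4_pro_gpu / x360_gpu)          # 17.5x, 360 -> PS4 Pro
print(top_gpu / x360_gpu)              # 50.0x, 360 -> top GPU today
print(math.log10(x360_gpu / n64_gpu))  # ~3.4, i.e. three orders of magnitude
print(math.log10(top_gpu / x360_gpu))  # ~1.7, i.e. about one order of magnitude
```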
No matter what, be it population sizes or economies or compound interest or computing power, exponential growth into perpetuity is impossible. The growth curve flattens out and becomes sigmoid in the long run. Computers will likely continue to grow in power over time, but the rate of growth we've seen over the past few decades will not simply keep happening. Gordon Moore himself admitted "It can't continue forever. The nature of exponentials is that you push them out and eventually disaster happens." There are always limits to growth. Always.
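The sigmoid shape described above is exactly what a logistic curve gives you; here's an illustrative sketch (the ceiling, rate, and midpoint are made-up parameters for demonstration, not fitted to any real computing data):

```python
import math

def logistic(t, ceiling=1000.0, rate=0.5, midpoint=20.0):
    """Logistic curve: looks exponential early on, then flattens out
    as it approaches the ceiling (the limit to growth)."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Early in the curve, each step multiplies the value (exponential regime)...
early_ratio = logistic(5) / logistic(3)
# ...but far past the midpoint, growth has essentially stopped.
late_ratio = logistic(45) / logistic(43)

print(round(early_ratio, 2))  # ~2.72x over two steps early on
print(round(late_ratio, 4))   # ~1.0, the curve has flattened
```

The point is that the early stretch of a logistic curve is indistinguishable from pure exponential growth, which is why decades of Moore's law tell us little about whether the flattening has begun.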
What this means for video games is anyone's guess. Graphics, which have always been the primary beneficiary of increases in computing power throughout the history of video games, still have plenty of room for improvement. Better anti-aliasing (we're still seeing bad aliasing this gen), better draw distances (still a lot of texture pop-in this gen), better lighting, better textures, and better animation, all at higher resolutions and faster, more stable frame rates (with visually superb games running at 4K and 60 fps being a nearer-term goal), are still to come. But diminishing returns are a very real thing, and eventually it'll get to the point of squeezing blood out of the proverbial turnip.
This does have potential implications for the console market. It is a highly cyclical market dependent on regular hardware refreshes to keep total sales strong, hence the existence of discrete console generations (since 2002, combined PlayStation and Xbox sales in the U.S. have stayed above 9M units every year except 2005 and 2012, averaging over 10.6M annually over the past 15 years). But will the cycle continue at a healthy level if each generation's increase in power, and the benefits that come with it, are perceived as insufficient? Or will the advances still be enough to justify continuing the cycle? If not, then what?