HoloDust said:
They would've needed better cooling and probably a slightly better PSU, but that's not really what I meant. Those 2 disabled CUs are disabled for better yields, hence the lower price. If they went with a fully enabled GPU (basically a 7870 GHz Edition), they would have Epic's 2.5 TFLOPS GPU, but yields would be worse and the price would be higher.
The PSU is over-engineered anyway, as it should be; it would likely handle the extra power demands at the expense of some lifetime (capacitor aging reduces the maximum wattage a PSU can deliver over time).
The fan speed could likely be ramped up with just a firmware update to handle the increased TDP.
If the PS4's GPU had been clocked at 1 GHz, it would likely have meant all games hitting 1080p.
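For a rough sense of what that clock bump means in raw shader throughput, here's a quick back-of-the-envelope sketch using the standard peak formula (FLOPS = shaders x clock x 2, for one fused multiply-add per shader per cycle). The 18-CU PS4 and 20-CU 7870 GHz Edition configurations are the real specs; the 1 GHz PS4 is the hypothetical being discussed, and the result also lines up with HoloDust's ~2.5 TFLOPS figure for a fully enabled chip:

```python
# Peak single-precision throughput: shaders * clock(GHz) * 2 FLOPs (FMA) per cycle.
# Each GCN CU contains 64 shaders; the PS4 ships 18 of 20 CUs enabled.

def tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision TFLOPS."""
    return shaders * clock_ghz * 2 / 1000.0

configs = {
    "PS4 (18 CUs @ 0.8 GHz)":            (18 * 64, 0.8),
    "Hypothetical PS4 (18 CUs @ 1 GHz)": (18 * 64, 1.0),
    "7870 GHz Ed. (20 CUs @ 1 GHz)":     (20 * 64, 1.0),
}

for name, (shaders, clock) in configs.items():
    print(f"{name}: {tflops(shaders, clock):.2f} TFLOPS")

# PS4 (18 CUs @ 0.8 GHz):            1.84 TFLOPS
# Hypothetical PS4 (18 CUs @ 1 GHz): 2.30 TFLOPS  (+25% over the shipped clock)
# 7870 GHz Ed. (20 CUs @ 1 GHz):     2.56 TFLOPS  (the ~2.5 TFLOPS fully enabled part)
```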
As for FLOPS, there are more important things to worry about.
Slimebeast said:
I didn't say flops were directly proportional to performance, but they're still a good indication. For example, your X1950 example at almost 500 GFLOPS: it turns out it was at least 50% faster than an X360 at the time. So, not perfectly proportional, but in the same ballpark. Your post is nevertheless very interesting. Also, you seem to agree that with the PS4 and XBO we witnessed by far the weakest generational leap, relatively speaking. In the past a new console generation could be 20 times stronger; now it's only 6-8 times stronger, and next time the difference will be even smaller.
I disagree. I find flops to be a terrible indication for gauging performance, unless of course your workload is *only* single-precision floating-point heavy... like... Folding@Home.
Typically, newer GPUs with fewer GFLOPS can beat older GPUs with more GFLOPS.
Besides, GFLOPS ignores the GPU's geometry, memory, texturing, and ROP capabilities, among other things. If you exclude half of a processor from a comparison, how could the comparison ever be accurate?
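To illustrate with made-up numbers (both GPUs below are hypothetical; only the peak-throughput formulas are the standard ones): a card can win on raw GFLOPS and still lose on the pixel fill rate and memory bandwidth that actually bottleneck many games.

```python
# Illustrative only: two HYPOTHETICAL GPUs with invented specs.
# Standard peak formulas:
#   GFLOPS     = shaders * clock(GHz) * 2
#   pixel fill = ROPs * clock            (Gpixels/s)
#   bandwidth  = bus_bits / 8 * data rate per pin  (GB/s)

from dataclasses import dataclass

@dataclass
class GPU:
    name: str
    shaders: int
    rops: int
    clock_ghz: float
    bus_bits: int
    mem_gbps: float  # effective memory data rate per pin

    def gflops(self) -> float:
        return self.shaders * self.clock_ghz * 2

    def pixel_fill(self) -> float:
        return self.rops * self.clock_ghz

    def bandwidth(self) -> float:
        return self.bus_bits / 8 * self.mem_gbps

a = GPU("Hypothetical A (FLOPS-heavy)", shaders=1600, rops=16, clock_ghz=0.85, bus_bits=128, mem_gbps=4.0)
b = GPU("Hypothetical B (balanced)",    shaders=1024, rops=32, clock_ghz=1.00, bus_bits=256, mem_gbps=6.0)

for g in (a, b):
    print(f"{g.name}: {g.gflops():.0f} GFLOPS, "
          f"{g.pixel_fill():.1f} Gpix/s fill, {g.bandwidth():.0f} GB/s")

# Hypothetical A (FLOPS-heavy): 2720 GFLOPS, 13.6 Gpix/s fill, 64 GB/s
# Hypothetical B (balanced):    2048 GFLOPS, 32.0 Gpix/s fill, 192 GB/s
#   <- fewer FLOPS, yet likely faster in fill- or bandwidth-bound games
```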
And yes, I do agree that this generation's jump in graphics is pretty dismal; I've always hammered that tune. But you also need to keep in mind that games this generation use waaaaaaay more dynamic effects, which are simply more expensive, while last generation everything was pre-calculated/static/cheap.
For example, it was fairly common last generation to bake lighting detail into the textures, which freed up a ton of processing power for something else... People would of course walk away saying xxx game had impressive graphics, when really it was thanks to great art and assets rather than sheer dazzling effects all running at once thanks to some magical optimization or the power of the Cell/Cloud/Cats.
That has also impacted the kind of "jump" people are seeing this generation.
With that said... you reach a point where every time you double your "graphics" quality, you need hardware that is several times more capable.
This image explains it really well:
[image not preserved: shows a model rendered at successive 10x increases in polygon count]
With each 10x increase in polygons, you reach a point where the improvement isn't as dramatic anymore.
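One rough geometric way to see why the returns diminish (my own sketch, not from the original post): for a fixed surface tessellated into N triangles, average triangle area scales as 1/N, so average edge length scales as 1/sqrt(N). Each 10x jump in polygon count only makes the facets about 3.16x finer on screen:

```python
# Rough illustration of diminishing returns on polygon count (my own sketch).
# Average triangle area ~ 1/N, so average edge length ~ 1/sqrt(N):
# a 10x polygon increase shrinks visible facet size by only ~sqrt(10) = 3.16x.

import math

base_polys = 1_000
for step in range(4):
    n = base_polys * 10 ** step
    rel_edge = math.sqrt(base_polys / n)  # edge length relative to the base mesh
    print(f"{n:>9,} polys -> facet edges {rel_edge:.3f}x the base size")

#     1,000 polys -> facet edges 1.000x the base size
#    10,000 polys -> facet edges 0.316x the base size
#   100,000 polys -> facet edges 0.100x the base size
# 1,000,000 polys -> facet edges 0.032x the base size
```

Once the facets are already smaller than a pixel, another 10x buys you almost nothing visible on screen.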
--::{PC Gaming Master Race}::--