Slimebeast said:
shikamaru317 said:
A CD Projekt dev hinted that they were expecting more power from the XB1 and PS4, which is what led to the Witcher 3 downgrade. Likewise, a former Ubisoft dev who worked on Watch Dogs said that they were expecting more power as well, which is what led to the Watch Dogs downgrade.
There's also the fact that the PS4 and XB1 were equivalent to mid-range PCs at release: roughly 1300 GFLOPS and 1800 GFLOPS for the XB1 and PS4 respectively, while the best PC GPU in 2013 had 5000+ GFLOPS. Meanwhile, when the 360 released in 2005, it had a 240 GFLOPS GPU, while the best PC GPU at the time was around 260 GFLOPS, a much smaller gap.
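A quick back-of-envelope on those gaps (a sketch in Python using the rough figures above; exact numbers vary by source):
```python
# Back-of-envelope on the console-vs-PC gap quoted above
# (figures are approximate peak FLOPS; exact values vary by source).
xb360_2005 = 240    # GFLOPS, Xbox 360 GPU, 2005
pc_2005    = 260    # GFLOPS, best PC GPU of 2005 (approx.)
xb1_2013   = 1300   # GFLOPS, Xbox One
ps4_2013   = 1800   # GFLOPS, PS4
pc_2013    = 5000   # GFLOPS, best PC GPU of 2013 (approx.)

print(f"2005 PC lead over 360: {pc_2005 / xb360_2005:.2f}x")  # ~1.08x
print(f"2013 PC lead over PS4: {pc_2013 / ps4_2013:.2f}x")    # ~2.78x
print(f"2013 PC lead over XB1: {pc_2013 / xb1_2013:.2f}x")    # ~3.85x
```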
|
Ahh, this is interesting to hear. People were so upset by the downgrades, but it turns out it's not even those developers' fault lol
All those flops numbers are so revealing. The increase in hardware performance has really slowed down drastically.
|
The flop numbers are not revealing at all.
Anyone who thinks they can take a GPU from a decade ago and compare it to a modern GPU on flops alone is absolutely kidding themselves.
Even if the Xbox 360 and Xbox One had the EXACT same amount of flops, the Xbox One would be substantially faster.
Fact of the matter is, the Xbox 360's GPU isn't the same as any PC-derived GPU; it has characteristics of both the Radeon X19xx series and the extremely inefficient Radeon HD 29xx series, so it cannot be compared directly to any PC GPU.
The PS3's GPU, however, closely resembled the GeForce 7900 series, but with cut-down TMUs and ROPs; it wasn't high-end, but it was close enough.
However... it was also overshadowed by the nVidia GeForce 8000 series, which launched just before the PS3, if memory serves me right.
So when the PS3 launched, its GPU was already relegated to mid-range in terms of performance anyway, relative to the PC.
The Xbox One and PlayStation 4, though, were already using hardware that was almost a couple of years old; they were only mid-range, and they were also overshadowed by AMD's more efficient Graphics Core Next 1.2/Gen 2 GPU update.
With that said... if you wish to play the GFLOP game, the PC also had multi-GPU setups back then. Overclocking the X1950 XT, you could theoretically get to almost 500 GFLOPS... thus a couple of them in CrossFire would yield you almost a teraflop. This was a decade ago; it puts the Xbox 360's "240 GFLOP" GPU into perspective, doesn't it?
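Rough math on that (theoretical peak only; assumes the ~500 GFLOPS overclock claim above):
```python
# Peak-FLOPS math behind the "almost a teraflop" claim above.
# Note: real-world CrossFire scaling was well below 2x; this is
# theoretical peak only.
x1950xt_oc_gflops = 500              # ~500 GFLOPS overclocked, as claimed
pair = 2 * x1950xt_oc_gflops         # two cards in CrossFire
print(f"{pair} GFLOPS = {pair / 1000:.1f} TFLOPS peak")  # ~1.0 TFLOPS
```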
Still, it's not directly comparable, for obvious reasons.
HoloDust said:
CGI-Quality said:
I haven't heard those devs say any of that.
@Conina: That's what I was looking for. Thanks.
|
Not sure about CDPR and Ubi, but from what I remember, folks at Epic were saying before the PS4/XBO launched that they needed 2.5 TFLOPS consoles (which is what a fully operational PS4 GPU @ 1 GHz is rated at, but that would have meant a higher price for the PS4)... since neither console had enough juice, they put SVOGI on hold for UE4 at the time.
As for the weakest gen, I'd argue that the 4th gen was pretty weak... or at least very late to catch up with computers. The Genesis launched in '89 in NA, the SNES in '91, and both computers that had similar specs (the Amiga and Atari ST) launched in 1985.
|
Increasing the PS4's GPU clock to 1 GHz might not have cost a single cent extra.
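For reference, here's the GCN back-of-envelope behind that ~2.5 TFLOPS figure (a sketch; the retail PS4 reportedly enables 18 of the chip's 20 CUs, clocked at 800 MHz):
```python
# GCN peak FLOPS = CUs x 64 shaders/CU x 2 FLOPs per shader per clock x clock
def gcn_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(gcn_tflops(18, 0.8))  # retail PS4: 18 CUs @ 800 MHz -> 1.8432 TFLOPS
print(gcn_tflops(20, 1.0))  # fully enabled chip @ 1 GHz   -> 2.56 TFLOPS
```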