Teeqoz said:
Hynad said:
Skyrim and GTAV on PS3 and 360 aren't in the same ballpark as BOTW. It would have to run at sub-720p, with maybe a low-setting FXAA if any AA at all, far fewer individual grass strands, much more pop-in/a weaker draw distance, and who knows what would happen to the physics. xD Neither Skyrim nor GTA V had physics as intricate as what's found in BOTW, and that stuff is pretty CPU intensive. Let's not even mention how it would all perform. Skyrim wasn't much of an example there, and GTA V struggles quite a bit at times, while Nintendo games are consistently great performers.
|
If there's anything from BoTW that the PS360 wouldn't struggle with, it would be the physics. The PS3 especially has a powerful-as-fuck CPU; iirc, it beats the PS4's and XBO's CPUs. The only problem is that it's hard to optimize for. GTAV and Skyrim are bad examples, because they're multiplats and thus never got the same attention and optimization that BoTW has gotten by virtue of being an exclusive. (I mean, Bethesda is notoriously shit at optimization; the early PS3 version of Skyrim is infamous for it.)
I doubt the PS360 could run BoTW with the exact same visual fidelity as the Wii U, but I also doubt they would have to reduce visual fidelity as much as you make it seem here. It would probably be plagued by pop-in though, because the one place where the Wii U is substantially ahead of the PS360 is in the amount (and perhaps speed?) of the RAM.
|
The PlayStation 3's CPU isn't powerful. It never was, even at release, and it certainly isn't today. I apologise if you fell for Sony's marketing: it was powerful for a console, sure, but it pales in comparison to even today's tablet CPUs.
I bet all you're doing is taking the floating-point numbers and comparing them against Jaguar's? That's the wrong way to do it. A 1-teraflop GPU can beat a 2-teraflop GPU; raw flop counts are useless when comparing processors of completely different architectures, instruction sets, heck, even generations.
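To put some toy numbers on that: what actually matters is how much of the peak rate a workload can sustain. The utilization figures below are completely made up for illustration, not measurements of any real GPU.

```python
# Sketch: effective throughput = peak rate * fraction actually sustained.
# The peak and utilization numbers here are invented purely to show why
# a lower-peak part can win on a real workload.

def effective_tflops(peak_tflops, utilization):
    """Rough effective throughput in TFLOPS."""
    return peak_tflops * utilization

gpu_a = effective_tflops(1.0, 0.80)  # 1 TFLOP part, workload suits it well
gpu_b = effective_tflops(2.0, 0.35)  # 2 TFLOP part, workload suits it badly

print(gpu_a, gpu_b)  # 0.8 vs 0.7 -> the "weaker" part wins here
```

Same idea applies to CPUs: Cell's big peak number assumes you keep all the SPEs fed with exactly the kind of work they like, which real games rarely do.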
Not only that, but the PlayStation 3's CPU only excels at iterative-refinement floating point, whereas Jaguar is solid at everything.
Now, I'm not sure if you're aware, but games use all sorts of math, so you can't render a game strictly with iterative-refinement floating point; you may need your bog-standard single precision, double precision, integer math and more, all at once.
Essentially it's like a car race: Jaguar can maintain 100 km/h for the entire race regardless of conditions, whilst Cell does 50 km/h the whole way, with an occasional "boost" to 150 km/h when road conditions, fuel type, wind speed and direction all play to its core strengths.
Cell might have the higher top speed, but it will lose the race every single time.
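The race analogy above is just a time-weighted average, and you can sketch it in a few lines. The speeds and the 20% boost fraction are invented to match the analogy, nothing more.

```python
# Sketch of the car-race analogy: steady 100 km/h vs 50 km/h with
# occasional 150 km/h bursts. All numbers invented for illustration.

def average_speed(base, boost, boost_fraction):
    """Time-weighted average speed over the whole race (km/h)."""
    return base * (1 - boost_fraction) + boost * boost_fraction

jaguar = average_speed(100, 100, 0.0)   # steady the whole way
cell   = average_speed(50, 150, 0.20)   # boosted 20% of the time

print(jaguar, cell)  # 100.0 vs 70.0 -- the steady car wins
```

Even in this toy model, the bursty car would need to spend half the race in boost mode just to pull even, which is the point: peak numbers only matter if you can sustain them.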
Not only that, but Jaguar is smarter and more efficient: it has more SIMD instructions, better branch prediction, better bandwidth management and larger caches; it's out-of-order with a degree of cache coherency, and it has lower latency. Everything.
Cell has more in common with the first-generation Intel Atoms than it does with a modern out-of-order high-performance CPU architecture, and its performance meets those expectations as well.