selnor said:
I think NJ5's point was that those PS3 exclusive developers wouldn't know, because they have never made a game for the 360.
It's interesting to note, too, that people carp on about devs needing to use the PS3's multithreading capabilities to get more out of it. Everyone seems to forget that the 360 has 6 threads, ALL general-purpose computing. So far we have NOT seen a 360 game written and coded specifically to take advantage of the 360's 6-thread architecture. In fact, **Unreal Engine 3 was designed to run on a single-threaded CPU**. For example, Gears of War 1 runs fine at full specs on a single-core 3.2GHz PC with 1.5GB of RAM and an 8800GTX GPU.
Alan Wake is the first game in development to take advantage of multiple general-purpose cores. We know this from the Intel conference, where they stated that they use an entire core for physics, an entire core to prepare and stream the game-world info for the GPU, and a third core for the other mechanics. Remedy's words were that they simply cannot run Alan Wake on a single-core CPU. Finally, later in 2009, we will start to see the 360 stretch its legs as developers take advantage of its multithreading capabilities.
All the 360's top games for graphics have been made with the single-core wonder Unreal 3.
Mass Effect 1 = Unreal 3
Gears 1 + 2 = Unreal 3
Bioshock = Unreal 3
I really can't wait to see the 360 pushed with multithreaded games specifically built around general-purpose threads, 6 of them. :)
|
The bolded part is false, and all the logic that follows from it is therefore faulty. Just google "Unreal 3 engine multithreaded" and you'll find a plethora of articles explaining how UE3 was one of the first engines to rely heavily on multithreaded parallelization to take advantage of multicore CPUs.
You do realize that a multithreaded application of any sort can run on a single CPU? The CPU will simply timeshare between the threads, which is what single-core computers have always done. Of course, doing so incurs some overhead for context switching, and the code has to be more complex to allow for data sharing and synchronization between threads.
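To make the point concrete, here's a minimal sketch (in Python, purely for illustration; the names are hypothetical): two threads share one counter, and the program produces the same result whether the OS timeshares them on one core or runs them truly in parallel. The lock is the synchronization cost the post mentions.

```python
import threading

counter = 0
lock = threading.Lock()  # shared data needs synchronization between threads

def worker(iterations):
    global counter
    for _ in range(iterations):
        # The lock serializes access to the shared counter. On a single-core
        # machine the scheduler simply timeshares the two threads, paying a
        # context-switch overhead; the program still runs correctly.
        with lock:
            counter += 1

t1 = threading.Thread(target=worker, args=(10000,))
t2 = threading.Thread(target=worker, args=(10000,))
t1.start(); t2.start()
t1.join(); t2.join()
print(counter)  # 20000, regardless of how many cores the machine has
```

The same binary runs on one core or many; only the wall-clock time (and the context-switch overhead) differs.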
The guys at Remedy may be among the first to put a multicore CPU as a minimum requirement on the back of the box of their PC game (when was the Intel demo and talk? 2007?), but game engines have been taking advantage of multiple cores since at least 2005.
To be even more explicit, here's a quote from an interview with Tim Sweeney:
...
PCGH: It is well known that your engine supports multi-core CPUs. What is the maximum number of threads the engine can calculate? What is the performance gain when you play UT 3 with a quad-core CPU? Will the engine even support future CPUs with more than four cores?
Tim Sweeney: Unreal Engine 3's threading support is quite scalable. We run a primary thread for gameplay, and a secondary thread for rendering. On machines with more than two cores, we run additional threads to accelerate various computing tasks, including physics and data decompression. There are clear performance benefits to quad-core, and though we haven't looked beyond that yet, I expect further gains beyond quad-core in future games within the lifetime of Unreal Engine 3.
PCGH: Can UT 3 be played with full detail on a single core machine?
Tim Sweeney: You can play UT3 at any detail level on any machine; the dependent variable is the frame rate! If you have a fast GPU (and thus aren't GPU-bound), then you'll notice significant performance gains going from a single-core PC to a dual-core PC, and incremental improvements in going to quad-core, at a constant clock rate.
PCGH: Are there any things you learned while developing Gears of War for next gen consoles that you can now benefit from when finalizing UT 3 for the PC?
Tim Sweeney: The Gears of War experience on Xbox 360 taught us to optimize for multi-core, and to improve the low-level performance of the key engine systems. This has carried over very well to PC. The division of UE3's rendering and gameplay into separate threads, implemented originally for 360, has brought even more significant gains on PC where there is a more heavyweight hardware abstraction layer in DirectX, hence more CPU time spent in rendering relative to gameplay.
...
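The division Sweeney describes (a primary gameplay thread handing work to a separate rendering thread, with extra workers when more cores are available) can be sketched roughly like this. This is a toy producer/consumer illustration, not UE3's actual architecture or API; all names are made up.

```python
import queue
import threading

# Gameplay produces frame descriptions; rendering consumes them.
render_queue = queue.Queue()

def gameplay_thread(num_frames):
    for frame in range(num_frames):
        # ...simulate game logic here, then hand the frame off to the renderer
        render_queue.put(f"frame {frame}")
    render_queue.put(None)  # sentinel: no more frames

def render_thread(out):
    while True:
        frame = render_queue.get()
        if frame is None:
            break
        out.append(f"rendered {frame}")  # ...issue draw calls here

rendered = []
g = threading.Thread(target=gameplay_thread, args=(3,))
r = threading.Thread(target=render_thread, args=(rendered,))
g.start(); r.start()
g.join(); r.join()
print(rendered)  # → ['rendered frame 0', 'rendered frame 1', 'rendered frame 2']
```

On a dual-core machine the two threads genuinely overlap; on a single core they timeshare, which is why Sweeney can say UT3 runs at any detail level on any machine, with frame rate as the dependent variable.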