| WereKitten said: @jetrii
|
| About TPS vs FPS: Yes, I'm sure many other TPS swap models a la Uncharted. I only brought it up as an example because it's a game that I find beautiful to look at and that was still able to use such an obvious optimization. (Incidentally, I read that Naughty Dog is going to use the hyper-detailed Drake model for both cutscenes and gameplay in Uncharted 2... I wonder if it will make an obvious difference visually.)
|
| That was the whole core of my TPS/FPS idea anyway: a TPS can decide more easily on texture/geometry vs. performance tradeoffs because the viewed scene is much more "controlled". Controlled as in having a big character right in the middle, in a known and limited range of distances, and in having a much smaller logarithmic spread for the distances of other objects as well. It seems to me that FPS don't have this luxury, because by definition the camera is much freer, and the object right in the middle of the scene, i.e. the one the player is concentrating his/her attention on, is potentially the worst-looking one because of scaling.
|
| About processors: You seem to be knowledgeable on the subject, but I can't understand the whole ongoing Cell vs. GPUs issue. From what I read, my impression was that we're moving towards fading the distinction between CPUs and GPUs, or at least that's what I got from my reading about Larrabee, about GPGPUs, and about the idea that Cell itself could be responsible for the rendering. If this generation of consoles ends up with a longer lifespan, couldn't the whole CPU/GPU issue be moot by that point, with Sony sticking with an updated Cell and asking IBM to develop a Larrabee-like rendering subsystem based on a number of dedicated SPUs? Of course additional SPUs could "help in" if required, a bit like Guerrilla did with their deferred rendering techniques, using SPUs to get thousands of real light sources and complex post-processing. If this is the trend, Nvidia is basically the weakest player, because their GPGPUs won't be able to match the flexibility of "real" CPUs (Intel has Larrabee, AMD+ATI could very well move in the same direction, and IBM/Sony could as well with Cell).
|
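(Quick aside before I skip to the part I actually want to argue about: the distance-driven model swap you're describing really does boil down to something as simple as the rough sketch below. The names and thresholds are made up, it's just to show why a constrained camera makes the tradeoff easy to tune.)

```python
# Very rough sketch of distance-based model swapping in a third-person game.
# Everything here (names, thresholds) is invented for illustration -- it's not
# how Naughty Dog or anyone else actually does it.

CUTSCENE_DISTANCE = 3.0   # metres: camera close enough to justify the hero mesh
GAMEPLAY_DISTANCE = 15.0  # beyond this, even the gameplay mesh is overkill

def pick_character_model(camera_distance):
    """Pick a level of detail for the player character.

    In a TPS the camera-to-character distance stays inside a narrow,
    predictable band, so these thresholds can be tuned aggressively.
    """
    if camera_distance <= CUTSCENE_DISTANCE:
        return "hero_model_high"    # hyper-detailed cutscene mesh
    elif camera_distance <= GAMEPLAY_DISTANCE:
        return "hero_model_game"    # normal gameplay mesh
    else:
        return "hero_model_low"     # distant LOD

print(pick_character_model(2.5))   # hero_model_high
print(pick_character_model(8.0))   # hero_model_game
```

In an FPS there's no equivalent "sweet spot" distance to tune those thresholds around, which is exactly your point.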
I'll jump straight down to the CPU/GPU section, since that's the part that interests me the most. Don't get me wrong, I think you have a very valid point, but I think it's a rather objective discussion. I'm a little pressed for time, so I'd rather get to the part I like now and reply to the first part when I get home.
Although the lines between what GPUs and CPUs can do are blurring, they are still two very different chips. They're essentially trying to accomplish the same thing, but they do it in completely different ways. Modern GPUs use hundreds (1,000+ now) of stream processors with programmable shaders to get there, while Larrabee sticks a few dozen x86-based cores together, each with a 512-bit SIMD unit, and plans to achieve the same goal with software rendering.
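To be concrete about what "software rendering" means here: there's no fixed-function rasterization hardware involved, it's all ordinary code running on those x86 cores. A deliberately dumb toy sketch of the idea (my own illustration, not anything Intel has shown) looks like this:

```python
# Toy software rasterizer: fill one triangle by testing every pixel in its
# bounding box against three edge functions. This only illustrates the idea
# that "software rendering" = plain code doing what GPU hardware usually does.

def edge(ax, ay, bx, by, px, py):
    # Signed area test: which side of edge A->B the point P lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def draw_triangle(framebuffer, width, v0, v1, v2, color):
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    for y in range(min(ys), max(ys) + 1):
        for x in range(min(xs), max(xs) + 1):
            w0 = edge(v1[0], v1[1], v2[0], v2[1], x, y)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], x, y)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], x, y)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:  # inside, for this winding
                framebuffer[y * width + x] = color

fb = [0] * (64 * 64)
draw_triangle(fb, 64, (5, 5), (50, 10), (20, 55), 0xFFFFFF)
```

A real renderer would of course vectorize that inner loop 16 pixels at a time and spread screen tiles across cores, which is exactly where the 512-bit SIMD units come in.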
The problem I have with using Larrabee or Cell for graphics is that although they *can* render, and I'm sure Larrabee will do it effectively, all the numbers I've seen so far are very underwhelming.
Last I heard, Intel said each Larrabee core could handle up to 16 single-precision flops per clock (that's the 16 lanes of the 512-bit vector unit).
2 GHz × 16 flops/clock × 48 cores = 1,536 gigaflops for the highest-end Larrabee, under the best possible conditions.
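Back-of-the-envelope version of that, so the arithmetic is easy to check (the 2 GHz and 48-core figures are the rumored top-end numbers, nothing Intel has confirmed):

```python
# Sanity check on the Larrabee peak number. The 512-bit vector unit holds
# 512 / 32 = 16 single-precision floats, hence 16 flops per clock per core.
# 2 GHz and 48 cores are rumored top-end figures, not confirmed specs.
clock_hz = 2.0e9
flops_per_clock_per_core = 512 // 32   # 16 SP lanes
cores = 48

peak_gflops = clock_hz * flops_per_clock_per_core * cores / 1e9
print(peak_gflops)   # 1536.0
```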
Right now, a year-old Radeon HD 4870 reaches around 1,200 gigaflops, and ATI's best card passes the 2.4-teraflop barrier. A single 4870 can also handle double-precision floating-point operations roughly 16X faster than the Cell processor in the PS3, and over 2X faster than the latest PowerXCell 8i, which was released six months ago. It just seems to me that GPUs are advancing much more quickly than CPUs, and they're becoming more and more flexible thanks to OpenCL and DirectX 11.
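For the double-precision part, here are the ballpark peak numbers I'm working from, quoted from memory, so treat them as approximate rather than spec-sheet figures:

```python
# Approximate double-precision peak figures, from memory -- not official specs.
radeon_4870_dp_gflops   = 240.0  # ~1/5 of its single-precision rate
ps3_cell_dp_gflops      = 15.0   # the original SPEs are slow at double precision
powerxcell_8i_dp_gflops = 100.0  # the enhanced-DP Cell variant

print(radeon_4870_dp_gflops / ps3_cell_dp_gflops)       # ~16x
print(radeon_4870_dp_gflops / powerxcell_8i_dp_gflops)  # ~2.4x
```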
Also, keep in mind that Nvidia has been hiring x86 engineers like there's no tomorrow, and AMD already has a bunch of them. Things are only improving for the GPGPU fanboys!
I know this post is a mess, but I got lost in my own daydreams. Some people dream of driving Ferraris, others dream of having millions of dollars... Me, I dream of a GPU with 2,000 stream processors, GDDR6 memory, and the ability to cure cancer (which it would obviously also cause, due to overexposure to pure awesomeness).
Good news, everyone!
I've invented a device which makes you read this in your head, in my voice!