Hynad said:


The topic is about efficiency, not about scaling. I've already made my points, which you didn't disprove.

Specs for specs, consoles are more efficient. The OS footprint, among other things, is responsible for this. I gave you the point about Mantle: it has the potential to turn this around, but we'll have to wait until it comes out to really be the judge of this.


I did disprove it; the proof is in the history of prior game releases, which I covered in my earlier posts. If you need more information on what I have stated, feel free to use the world's largest repository of information known to mankind, which is available at www.google.com
The reason I brought up scaling is because you made it sound like you can only scale PC games downwards and not upwards; you can do both.

I'm not discounting that consoles are lighter, leaner machines later on in a generational cycle, but early on? Not so much. Oblivion, BioShock, Call of Duty 4 and Unreal Tournament 3 are prime examples of this, all of them perfectly happy running on GPUs and CPUs an entire generation older than the consoles.
There is actually a good explanation for this: early in a new generation, developers typically use high-level APIs while they shift their game engines to the new hardware and drop support for the old, and that incurs overhead. (Hence why Oblivion performs worse on the Xbox 360 than on a PC with half the GPU horsepower.)
As time goes on, developers get bolder and start writing to a low-level API; the boldest will go even closer to the metal than that.
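The high-level vs low-level API point mostly comes down to per-draw-call CPU overhead. A toy back-of-the-envelope sketch (the microsecond costs below are made-up illustrative numbers, not measured figures for any real driver or API):

```python
# Toy model of per-draw-call CPU overhead. The per-call costs are invented
# for illustration; real numbers vary by driver, API and hardware.
HIGH_LEVEL_US = 10  # assumed cost per call through a thick driver layer
LOW_LEVEL_US = 1    # assumed cost when submitting closer to the metal

def submit_cost_ms(draw_calls, per_call_us):
    """CPU milliseconds spent per frame just issuing draw calls."""
    return draw_calls * per_call_us / 1000

calls = 3000  # a plausible draw-call count for a busy scene
print(submit_cost_ms(calls, HIGH_LEVEL_US))  # 30.0 ms - blows a 33 ms (30 fps) frame budget
print(submit_cost_ms(calls, LOW_LEVEL_US))   # 3.0 ms - leaves room for game logic
```

The point of the toy numbers: shaving the per-call cost, which is what writing closer to the metal does, can turn a CPU-bound frame into one with headroom to spare, even on the same silicon.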

Later on in the generation it's a different story: PC games have higher image quality out of the box and no longer support older rendering paths such as DirectX 9, so they really can't be compared. Battlefield 3 is a prime example of this, with better textures, lighting, shadows, tessellation, larger maps and more players, among other things.

As for the OS: Windows XP, which was the current OS at the Xbox 360 and PlayStation 3's launch, is actually very lean. It can run rather happily in 70-90MB of RAM, which is competitive enough with the twins.
Windows 7/8 is even leaner than the Xbox One's and PlayStation 4's OS.

It's the APIs holding things back at the moment, which is soon to be rectified, at least for AMD users.

You edited your post.

Hynad said:

Build me a PC with 512MB of total RAM, with a CPU and GPU comparable to those found in the PS3 and 360, and then show me a direct-feed video of it. Otherwise, I don't see how you can make me believe you ran Crysis as well as on the 360 with such low PC specs.

You can't build a consumer-grade PC with 512MB of RAM, and why would you want to? You would have to do some severe stripping back of a Windows XP install to make that kind of RAM amount have any weight; you're forgetting a PC does more at any one time than a console.
Sitting at the desktop, mine is driving my keyboard's screen with detailed information on core temperatures, voltages, clock speeds and fan speeds for both the CPU and GPU.
It's also showing me information on the weather and on CPU, network and GPU usage, with some servers waiting to be "woken up" via another device to begin transcoding, just to name a few things. Once a console can do everything my PC can do, then it would be a viable comparison point.

And why should I find you anything? Go do it yourself. I'm not about to do someone who ridicules others a favour.

Also, a GeForce 7950 GX2 is faster than the GPU in the PlayStation 3 or Xbox 360; if you knew anything about hardware, you would know the obvious reason why.
Go ahead, do a YouTube search for someone running Crysis on a Radeon X1950 XT or a GeForce 7950 GT at 720p.

Edit: In fact, this has gone way off topic, so better to take it elsewhere.



--::{PC Gaming Master Race}::--