Intrinsic said:
sc94597 said:

1. If all that mattered was the ability to play games (and no other multimedia functions), then why would those features be advertised by the companies, and why would they even be factored into the cost of producing the system? Why didn't Sony just use more powerful hardware instead of spending money on software support for multimedia applications like Netflix? 

2. In your opening post you said nothing about performance or graphics. You mentioned "what is in the box," which was generally labeled "hardware."

 

 

  1. Can you show me one commercial Sony has made saying that the PS3 plays Netflix? Or that it's also a Blu-ray player? I am beginning to think you do not grasp what it means to refer to something's primary function. A gaming console's primary function is to play games. That is solely what it is designed for. Everything else it can do is just a bonus, and I strongly doubt anyone alive spends $400 on a console primarily to play Netflix. As for PCs: if all someone wants to do with their PC is browse, watch YouTube, check their mail, and edit the odd document or two, I strongly doubt they would buy anything beyond a basic laptop.
  2. You should read my opening post again. I am just not going to do this again with yet another person who chooses not to read the opening post properly, sees "PC vs. consoles", and jumps to conclusions.

1. Actually, many people bought PS3s solely as media devices. The combination of a (3D) Blu-ray player and third-party media apps made them very appealing to that audience. My point, however, was that the cost of implementing something must be justified for any company that wants to be profitable. If these things cannot be justified, they will not be implemented. By consequence, the decision to add these features increases the value of the platform, just as a secondary feature like backwards compatibility does. There is much more to the sales of a platform (and therefore its marginal value) than its games' performance. Many people prefer platforms with backwards compatibility, and some will only buy a platform that has it, because they don't want multiple devices and value the convenience (see your last category in the OP). Many gamers choose to do more with their gaming hobby; Twitch, Ustream, YouTube, and the plethora of social gaming prove this. A PC is the better option when it comes to social gaming, and for those who want to do intensive streaming it is usually the only option. All of these things have costs involved and will increase the price of a system. Consequently, your assumption of "what is in the box" being "equal" doesn't hold (which was my main point here). 



sc94597 said:

1. Actually, many people bought PS3s solely as media devices. The combination of a (3D) Blu-ray player and third-party media apps made them very appealing to that audience. My point, however, was that the cost of implementing something must be justified for any company that wants to be profitable. If these things cannot be justified, they will not be implemented. By consequence, the decision to add these features increases the value of the platform, just as a secondary feature like backwards compatibility does. There is much more to the sales of a platform (and therefore its marginal value) than its games' performance. Many people prefer platforms with backwards compatibility, and some will only buy a platform that has it, because they don't want multiple devices and value the convenience (see your last category in the OP). Many gamers choose to do more with their gaming hobby; Twitch, Ustream, YouTube, and the plethora of social gaming prove this. A PC is the better option when it comes to social gaming, and for those who want to do intensive streaming it is usually the only option. All of these things have costs involved and will increase the price of a system. Consequently, your assumption of "what is in the box" being "equal" doesn't hold (which was my main point here). 

OK, if I understand you correctly, your analysis of what is in the box seems to be based on how relevant those features are and on the reasons those features were put into the box.

That, however, has nothing to do with the main point of this thread, which is to measure the performance of a game across platforms/hardware.

Take note, I did not ask how many apps you can use on the platforms, which one browses the web better, which one plays Blu-rays better, which one is better at streaming, or any of the other stuff that may be tacked on to the overall use and performance of a platform.

I am solely referring to a game being played on specific hardware/a specific platform. Everything you are saying is true, and if this were a discussion about the overarching worth or value of any specific platform, it would be in the right place. But right now your arguments are just in the wrong place. 

Put simply, this is about running the same game across different platforms/hardware. Nothing more, nothing less.

Here is a question, one that I have already asked as an example in this thread. If I gave you a game (Call of Duty: AW) on PC and told you to find out two things: (1) which sub-$200 PC GPU runs it best out of every sub-$200 GPU out there, and (2) which sub-$600 PC GPU runs it best, what results would you end up with, and how would you best arrive at those results? 

What I am basically saying is that the method you use to arrive at the solutions to the above problem should also apply when anyone compares a game running on a PC to one running on a console. Unless, of course, you can tell me why anyone in their right mind would compare the same game running on a sub-$150 GPU to one running on a $500+ GPU.
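To make that method concrete, here is a minimal sketch (using made-up GPU names, prices, and frame rates rather than real benchmark data) of how such a comparison could be carried out: fix the game and settings, record an average frame rate per card, then take the best performer within each price bracket.

# Hypothetical benchmark table: (gpu_name, price_usd, avg_fps in CoD: AW at fixed settings).
benchmarks = [
    ("GPU A", 150, 42.0),
    ("GPU B", 190, 55.0),
    ("GPU C", 330, 78.0),
    ("GPU D", 550, 110.0),
]

def best_under(price_cap, results):
    """Return the (name, price, fps) entry with the highest average FPS at or below price_cap."""
    eligible = [r for r in results if r[1] <= price_cap]
    return max(eligible, key=lambda r: r[2]) if eligible else None

for cap in (200, 600):
    name, price, fps = best_under(cap, benchmarks)
    print(f"Best sub-${cap} option: {name} (${price}, {fps:.0f} fps average)")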

And I don't think anyone compares the PS4 to the PS3 to see which one will run the same game better. Yes, comparisons of the value of the console (which one has more games and apps for it) may be made, but that is not the point of this thread. 



Intrinsic said:

Here is a question, one that I have already asked as an example in this thread. If I gave you a game (Call of Duty: AW) on PC and told you to find out two things: (1) which sub-$200 PC GPU runs it best out of every sub-$200 GPU out there, and (2) which sub-$600 PC GPU runs it best, what results would you end up with, and how would you best arrive at those results? 

What I am basically saying is that the method you use to arrive at the solutions to the above problem should also apply when anyone compares a game running on a PC to one running on a console. Unless, of course, you can tell me why anyone in their right mind would compare the same game running on a sub-$150 GPU to one running on a $500+ GPU.

And I don't think anyone compares the PS4 to the PS3 to see which one will run the same game better. Yes, comparisons of the value of the console (which one has more games and apps for it) may be made, but that is not the point of this thread. 

In that case, I think this topic, which critiques a certain comparison, is too disconnected from the context of the discussion it is trying to analyze. People use both comparisons (the one you presented and the one I presented) in different contexts. Usually I see the "PC games can perform better anyway" argument only when there is a comparison between different consoles, such as PS4 vs. Wii U or PS3 vs. XB360. Within this framework it is fair to say that a PC game performs better (as a general statement), because the same logic was used for PS4 vs. Wii U or PS3 vs. 360. Then one might argue that the PS4 has a better cost per (say) teraflop than the Wii U, and the same argument can be made for PC vs. PS4 (after some point). So none of these comparisons are limited to the PC vs. console debate; they are found between console gamers as well.

Nevertheless, there is another context in which PC gamers and console gamers argue about performance: "which is the cheaper platform to play games on?" It is never really "which is the cheaper platform to run COD at 1080p with 2xAA on?" I don't think I've ever seen a discussion that limits itself to that. The other considerations are always there as well. The performance argument is just one of many, and it targets a specific audience of gamers anyway (hardcore graphics whores). The mother who is deciding whether to get her kid a console or a gaming PC (which would never happen in the first place) isn't thinking about 1080p with 16x MSAA, and neither is the bro-gamer, to be honest.

The entire discussion is relevant to the subset of the gaming population who will buy many games and who can be persuaded to spend $100 or $200 more for 60 fps or 1440p, just as they will spend $70 more for 1080p vs. 720p gaming (among other things). My main point: I think it is important to understand the context in which these comparisons are made. 



Pemalite said:
Xenobot said:
Intrinsic said:
Pemalite said:
 


Consoles also do less at any one time.


Say WHAAAAAAT?
Like when? Are you a computer or software engineer, to say that?

I studied electrical engineering and taught myself various programming languages, starting with Beginner's All-purpose Symbolic Instruction Code (BASIC) on the Commodore 64, then moved to more "full-featured" languages over the years; these days I mostly deal with Objective-C.

However, my argument is sound and logical, and I shall expand on my reasoning as to why.

Is a console running a full-featured, multitasking OS that has dozens of services running in the background and on the taskbar?
Are its games graphically at the same level as, or a better level than, PC games?

If you answered "No" to both of those questions, then you are correct.

Yes, a console can pull off better pictures with the hardware it has, but only because it does less of everything: resolution, framerates, textures, lighting, shadows, anti-aliasing, texture filtering, AI and character counts, geometry, OS, APIs, drivers, various services like Steam, etc.

So, Mr. Software Expert, please explain this to me:

Why does CoD: AW run smoothly at 720p/30 fps on the PS3, but you can't expect the same result from a Core i7, 4 GB of RAM, and a GeForce 7800 GTX (both the PS3 and the GF 7800 are based on Nvidia's G70)?

Is it because of background tasks like Steam?

Do Steam and OS services drain power from the graphics card?

If I turned off most OS services and background tasks, could I get 720p/30 fps in CoD: AW on a 7800 GTX?
You don't need antivirus, the printing service, the network service, the indexing service, or Explorer itself to run a game. You can turn them off.

And don't tell me it's DX11; the PS3 and PS4 versions run just fine without DX11.

The PC, PS4, and Xbox One all do the same things: same resolution, same textures, same lighting, same shadows, same anti-aliasing, same texture filtering, same AI and character counts, same geometry, the same or similar APIs, and similar services like Steam, PSN, and XBL. But PC OS and hardware are built for broad compatibility, while PS4/XO hardware and OS are highly optimized for gaming.



PC for well-optimised multiplats; consoles for exclusives and games that are just meant for consoles, like COD. That's just how I feel.



Xenobot said:

So, Mr. Software Expert, please explain this to me:

Why does CoD: AW run smoothly at 720p/30 fps on the PS3, but you can't expect the same result from a Core i7, 4 GB of RAM, and a GeForce 7800 GTX (both the PS3 and the GF 7800 are based on Nvidia's G70)?


Because the developers used the hardware differently on a console than on a PC.

Besides, the game was altered, compiled, and developed differently for the PC.
For instance, at a minimum, you couldn't run Call of Duty: Advanced Warfare on a GeForce 7800 GTX, because the game doesn't offer a DirectX 9 renderer to begin with.
The game won't even run on Windows XP, since it requires Windows Vista at minimum; it also requires a 64-bit system, so your argument is completely and utterly pointless.
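For what it's worth, a minimal, hypothetical sketch of that kind of gate (not taken from the actual game) could look like the following: check the OS family, version, and architecture before the renderer or the GPU is even considered.

import platform

def meets_minimum_os_requirements() -> bool:
    """Rough check for 'Windows newer than XP on a 64-bit machine' (assumed logic)."""
    if platform.system() != "Windows":
        return False
    # platform.release() returns strings such as "XP", "Vista", "7", "8", "10" on Windows.
    if platform.release() in ("XP", "2000"):
        return False
    # platform.machine() reports the processor architecture, e.g. "AMD64" on 64-bit Windows.
    return platform.machine().endswith("64")

if __name__ == "__main__":
    print("Minimum OS/architecture requirement met:", meets_minimum_os_requirements())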

Historically, every single Call of Duty game could happily run on PC hardware that was inferior to the consoles of its day, just not the latest "next-gen" Call of Duty.
For example:
https://www.youtube.com/watch?v=iDSmaN0qo74



--::{PC Gaming Master Race}::--