ishiki said:

PCs are more expensive.

The $3,000 argument is bogus unless you're running a multi-monitor setup, insist on supersampling and 16x MSAA, or just like to waste money.

Since about a year before the Xbox 360 was released, I've spent about $2,000 on PC parts and upgrades, and sold old components for about $400 total. I run games at settings and frame rates much higher than consoles (not quite maxed in games that are console ports).

The monitor and sound system aren't included, but I use the same monitor and sound system for both my PC and my consoles.

Since basically when the Xbox 360 was released, I've spent:
$900 on Xbox 360 (two broke)
$500 on PS3
$300 on Wii U

So yes, in that same period I've spent roughly twice as much on PC as on the Xbox 360, and about four times as much as on the PS3.
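For what it's worth, here's a quick sanity check on those ratios (a minimal sketch; the dollar figures are just the rough numbers quoted above, and whether you count the $400 recouped from selling old parts is a judgment call):

```python
# Rough spend figures quoted above (all approximate).
pc_gross = 2000   # PC parts and upgrades
pc_resale = 400   # recouped by selling old components
xbox = 900        # Xbox 360 (including replacements)
ps3 = 500

for label, pc in [("gross", pc_gross), ("net of resale", pc_gross - pc_resale)]:
    print(f"PC {label}: {pc / xbox:.1f}x Xbox, {pc / ps3:.1f}x PS3")
# gross: ~2.2x Xbox, 4.0x PS3 -- net of resale: ~1.8x Xbox, 3.2x PS3
```

So "2x the Xbox, 4x the PS3" holds up if you go by gross spend, and it's a bit less if you credit the resale money.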

But during that time I've gotten much better image quality, more resolution options (2560x1440 has been available forever), triple-monitor support, higher frame rates, and more customizability. And a lot of the time, users can fix major issues in games or outright improve them. Unfortunately, no single platform has every game.

Consoles are easier to set up!

Nope, it's not bogus. Why? Because we could be here talking about how good BF4 looks on the PS4 at 900p compared to 720p on the XB1, and some PC gamer will take that as a cue to dig up the Tom's Hardware benchmark that ran the game at 4K at 64 fps on ultra settings with an R9 295X2 (that's a $1,000+ GPU, by the way) or an Nvidia Titan Z ($3,000 GPU), and start throwing those screenshots around the thread as if that were the floor of PC performance, when in truth it's a very unrealistic ceiling that very, very few PC gamers can afford. We all know that with more money to spend you can get better performance out of a PC. But how does that make sense as a comparison to what is basically a sub-$200 GPU?

The whole point of this thread is to make a simple suggestion: if people want to compare how a game runs on PC to how it runs on a console, they should make that comparison using PC hardware similar to what you'll find in the console it's being compared to. There is zero sense in comparing the console against hardware that is marginally, let alone massively, more expensive, in the same way you wouldn't compare a $1,000 GPU against a $150 GPU when both are in PC setups. See the sketch below for what I mean.
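To make the suggestion concrete, here's a minimal sketch of what a price-matched comparison would look like. The GPU names are real, but every price and frame-rate figure is a made-up placeholder for illustration, not benchmark data:

```python
# Hypothetical benchmark table: (gpu, price_usd, avg_fps).
# Placeholder numbers for illustration only, not real results.
results = [
    ("Radeon R7 260X", 150, 30),
    ("GeForce GTX 750 Ti", 160, 32),
    ("Radeon R9 295X2", 1500, 95),
    ("GeForce Titan Z", 3000, 100),
]

console_gpu_budget = 200  # rough price class of a console's graphics hardware

# A fair console comparison only admits GPUs in the same price class.
fair = [r for r in results if r[1] <= console_gpu_budget]
for gpu, price, fps in fair:
    print(f"{gpu}: ${price}, {fps} fps ({fps / price:.2f} fps per dollar)")
```

Filter the halo cards out and the comparison tells you something useful; leave them in and all you've proven is that $3,000 of silicon beats $200 of silicon, which nobody disputes.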

Somehow, this point has gone right over some people's heads, which is strange, because it seems so obvious.

I could even make the same point with consoles removed from the equation entirely. If you're trying to find the best sub-$200 GPU on the market, does it make sense to throw a $1,000-$3,000 GPU into the testing?