Licence said:
|
An individualist more like, but carry on.
Intel Core i7 930 OC @ 4.0 GHz
XFX Double Dissipation Radeon HD 7950 384-bit 3GB GDDR5 OC @ 1150 MHz core + 1575 MHz x 4 memory
Triple-channel DDR3 12GB RAM @ 1600 MHz

Licence said:
|
Yeah, but it doesn't help that the Xbox One costs more for a lower resolution.
If they sacrificed Kinect and sold the Xbox One on its own at £279 compared to the PS4's £349, no one could complain.
Instead they bundle Kinect (which no one wants) at the expense of a weaker GPU (which no one wants) and price it £100 above the PS4. Very bad strategy.
I've always predicted one last 180 from MS, and that's to sell Xboxes without Kinect.
fps_d0minat0r said:
Yeah, but it doesn't help that the Xbox One costs more for a lower resolution. If they sacrificed Kinect and sold the Xbox One on its own at £279 compared to the PS4's £349, no one could complain. Instead they bundle Kinect (which no one wants) at the expense of a weaker GPU (which no one wants) and price it £100 above the PS4. Very bad strategy. I've always predicted one last 180 from MS, and that's to sell Xboxes without Kinect. |
Perhaps, we'll see how well the XBone does. I'm certainly not getting one, but I can understand that Kinect may be appealing. And I think the media-centre features can also sway some people who are not into hardcore gaming.
What if it's native 720p that's upscaled to 1080p?
allblue said:
I think people who believed microsoft's bullshit PR should be offended by their lies in claiming consoles power parity. |
Then you should not just trash one console but both, because BOTH consoles are weak compared to a PC, not just one.
And how many times did Mark Cerny talk about what a spectacular architecture the PS4 has (even though it's just a low-end laptop CPU and a mid-range GPU glued together)? It's standard PR speak.
| ViktorBKK said:
So I've been reading the forum the last couple of days, and some of the comments I see have been driving me nuts. Some people either need to get their eyes fixed, or they're just in denial. Regardless, I'll try to explain some things from a technical perspective.

1080p has been the standard among TV manufacturers for many years now. The vast majority of TVs sold at retail output natively at 1920x1080. In fact the standard has become so widely adopted that even PC monitor makers chose to implement it due to economies of scale. 1080p is by no means new, nor is it expensive: a 40" TV can be had for as little as 400 dollars or less, and some 1080p PC monitors go for as little as 100-150 bucks.

Now someone might argue: "But 720p is HD too, what's the big deal?" Well, here's the deal. If you have a 1080p monitor at home, try lowering the resolution to 1280x720. You'll notice that the image on screen is very blurry. Downscaling or upscaling by a non-integer factor produces a blurry image.

Let's explain that a little better. If I have a 1280x720 image and I want to upscale it to 1920x1080, I need to increase the length and width of my image by a factor of 1.5. So for every 1 pixel in length or width I need 1.5 pixels respectively. This is not achievable without loss in image quality. You can only upscale without quality loss if you increase your output by an integer factor. For example, 4K resolution is 3840x2160, exactly 2x the length and width of 1080p. Manufacturers opted for this new standard because they knew you could view current 1080p content without image quality loss.

That is why developers have been stressing the importance of rendering at native 1080p. The majority of monitors out there output at this resolution, and it will be a filthy mess if you try to render at a lower resolution and then upscale. And don't give me that "only gameplay matters" crap. If that's the case, then go play your NES. |
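The integer-factor argument in the quote above can be made concrete with a tiny Python sketch (mine, not from the original post): a source pixel maps cleanly onto the output only when the scale factor is a whole number.

```python
# Sketch of the scaling argument: upscaling preserves the pixel grid only
# when the scale factor is an integer, because each source pixel then maps
# to an exact NxN block of output pixels instead of being interpolated.
from fractions import Fraction

def scale_factor(src, dst):
    """Return the exact scale factor from src to dst resolution (width, height)."""
    fw = Fraction(dst[0], src[0])
    fh = Fraction(dst[1], src[1])
    assert fw == fh, "aspect ratios differ"
    return fw

# 720p -> 1080p: factor 3/2, so each source pixel must cover 1.5 output
# pixels; the scaler has to interpolate, which blurs the image.
print(scale_factor((1280, 720), (1920, 1080)))   # 3/2

# 1080p -> 4K UHD: factor exactly 2, so each pixel maps to a clean 2x2 block.
print(scale_factor((1920, 1080), (3840, 2160)))  # 2
```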
Worth noting: how much you notice depends a lot on viewing distance.
On a PC monitor you're sitting 1-3 feet away.
With an HDTV you're sitting much farther back, so it's harder to notice.
A 720p game would be intolerable on my PC, but I don't really notice on my TV.
1080p is better, of course.
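The viewing-distance point can be sketched with the standard visual-angle formula: what the eye resolves is pixels per degree, not raw resolution. The screen widths and distances below are illustrative assumptions, not measurements from the thread.

```python
# Rough sketch of why distance matters: the eye cares about pixels per
# degree of visual angle. Screen widths (inches) and viewing distances
# below are illustrative guesses for a 24" monitor and a 40" TV.
import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    """Horizontal pixels packed into one degree of visual angle at screen centre."""
    px_per_inch = h_pixels / screen_width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# 720p on a ~24" monitor (about 20.9" wide) viewed from 2 feet:
# relatively few pixels per degree, so the softness is obvious.
print(round(pixels_per_degree(1280, 20.9, 24), 1))

# 720p on a ~40" TV (about 34.9" wide) viewed from 8 feet:
# far more pixels per degree, so the same image looks much cleaner.
print(round(pixels_per_degree(1280, 34.9, 96), 1))
```

The same source resolution yields more than twice the angular pixel density from the couch, which matches the poster's experience.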
Some people have been missing the point here. This isn't an argument about whether 720p is enough resolution. The problem is that almost all screens on the market now are 1080p, and you can't display a 720p image on a 1080p screen without loss in image quality.
| curl-6 said: Meh, I had more fun on the 480p Wii than the 720p Xbox 360, and the most I've enjoyed any console was the SNES. |
That is only the case because 99% of the games ran at 60 fps (or 50 for us PAL people).
This is obviously the only reason.