
AC: Unity PS4 Leaked (aka I Hate Sub-1080p Blurriness)

Pemalite said:


Actually, if you're going to correct someone, try to correct them correctly.
2K is *not* officially 1080p. It is 2048 × 1152, or 2048 × 1556 (full aperture).
However, if you look at the industry trend, "2K" is mostly used to refer to 2560 × 1440, as it's a clean quadruple of 720p which allows for clean scaling, and monitor/TV manufacturers, heck, even phone/tablet makers, are advertising 2560 × 1440 as 2K.

Conversely, 4K is supposed to be 4096 × 2160, rather than the 3840 × 2160 standard the industry has settled on; 3840 × 2160 won out because it scales 1920 × 1080 content cleanly, retaining the same aspect ratio and being exactly quadruple 1080p's pixel count.

The more you know.

I didn't want to go into too much detail, but now I will :)
In the movie industry, 2K and 4K signify the horizontal resolution, indeed 2048 and 4096. For digital cinema, 2K is either 2048×858 (scope) or 1998×1080 (flat); 2048×1556 is a camera resolution. I've never heard of 2K being used for the 2560×1440 display resolution, that's 2.5K. But now I get what you mean with Quad HD, quad 720p, 1440p. Yep, that's possible next gen.

Marketing takes a run with these terms anyway: 2160p sounds too difficult, and 4K sounds four times better. Full HD is a silly term too, and now we get Quad Full HD or UHDTV. What are they going to make up for 8K? Super Hi-Vision, for now.

1920×1080 originated from the 80's analog 5:3 1125-line MUSE format. 1080 was the safe display area (minus the scan return) for digital TV, and 16:9 was agreed upon as the desired aspect ratio to fit most movies. (However, flat movies are 1.85:1, i.e. 1998×1080, and 1998 can't be divided evenly by 16.) DCI went with 2048, I assume because of digital camera specs. Yet why use 1080 and not 1152? Easier to crop to Blu-ray, I guess. Since film was mostly shot anamorphically, the horizontal resolution took priority anyway. Always fun when standards are made and then miss each other by an inch...
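To make the aspect-ratio arithmetic concrete, here is a rough back-of-the-envelope check in Python, only restating the numbers above:

    # 1.85:1 "flat" picture at 1080 lines of height
    width_flat = round(1080 * 1.85)      # 1998, and 1998 / 16 = 124.875 (not a whole number)
    # 16:9 at 1080 lines gives the familiar HD width instead
    width_hd = 1080 * 16 // 9            # 1920, which is exactly 16 * 120
    print(width_flat % 16, width_hd % 16)   # 14 and 0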

Another interesting fact: PAL movies used to waste less of your time. They all ran 4% faster to display 24 fps material at 25 Hz without any stutter, and the resulting difference in pitch was too small to be distracting. It was better than the 3:2 pulldown mess used for NTSC.
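For reference, the speed-up and pitch-shift math works out like this (a minimal sketch of the 24 fps to 25 fps conversion):

    import math

    film_fps, pal_fps = 24.0, 25.0
    speedup = pal_fps / film_fps - 1                    # ~0.042, i.e. roughly 4% faster
    pitch_shift = 12 * math.log2(pal_fps / film_fps)    # ~0.71 semitones higher in pitch
    print(f"{speedup:.1%} faster, {pitch_shift:.2f} semitones up")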



SvennoJ said:

 I've never heard of 2K being used for the 2560×1440 display resolution, that's 2.5K. But now I get what you mean with Quad HD, quad 720p, 1440p.


http://www.channelnews.com.au/news/2FM210DP-benq-launches-2k-quad-hd-monitor-for-499.aspx
http://www.trustedreviews.com/opinions/lg-g3-qhd-screen-the-case-for-and-against-2k-phone-displays
http://www.cnet.com/news/are-2k-smartphone-resolutions-overkill-maybe-not/
http://www.technologytell.com/gadgets/140339/unannounced-2k-resolution-samsung-tablet-crops-up-amoled-display-in-tow/
http://www.dailytech.com/Huawei+Boss+Explains+the+Nonsense+of+QHD+Screens+on+Smartphones+/article34885.htm
http://www.androidheadlines.com/2014/01/samsung-confirms-qhd-2k-uhd-4k-amoled-displays-works.html


Now you have.
More where that came from!



--::{PC Gaming Master Race}::--

Pemalite said:
ethomaz said:

I think your comment about that part doesn't make sense at all, because I don't expect a new generation before 2020 and I can't see anything less than 4K being the standard... PCs will be at 8K or more in 2020.

The second part...

You already had a transition generation from pre-HD to HD... sub-720p to 1080p... now it is time for 1080p to shine... 1080p is the standard for HDTV, and games need to be that resolution because we already passed the transition generation.

Resolution is one of the key factors affecting image quality, and if you are a PC user you know the last thing you drop in a game is resolution... everything else is second to resolution except framerate... effects, textures, AA, etc. will all be dropped before resolution for better image quality.

I don't want a console that doesn't evolve... I want a console that moves forward... that is why I chose the PS4.


The great thing about PC, though, is that there is no resolution standard; you get what you pay for. If you wanted, you could run resolutions that exceed 8K today.


Yes, if you want to play Tetris or Pong. Good luck getting a demanding game to run at 4K, let alone 8K, with any GPU on the market.



What if the game is actually fun, though? Can we buy it then?



Assassin's Creed 3 on my Wii U looks better than those screens. Are these just a bad batch of screenshots? Surely the game looks better when you see it running.



d21lewis said:
What if the game is actually fun, though? Can we buy it then?

No!! You can't buy a technically flawed game that has fun gameplay; it's against the hardcore gamers' code of conduct :)



Kane1389 said:
Pemalite said:

The great thing about PC, though, is that there is no resolution standard; you get what you pay for. If you wanted, you could run resolutions that exceed 8K today.


Yes, if you want to play Tetris or Pong. Good luck getting a demanding game to run at 4K, let alone 8K, with any GPU on the market.


I run triple 2560x1440 monitors for a total of 7680x1440 resolution.
That's more total pixels than 4K.
My secondary PC has 3x 1920x1080 monitors for a total resolution of 5760x1080.

It is easily doable with multiple GPUs, so no, you don't have to resort to only playing Tetris or Pong.
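For what it's worth, the raw pixel counts back that up (a quick sketch; the monitor setups are the ones listed above):

    eyefinity = 3 * 2560 * 1440          # 11,059,200 pixels across three 1440p monitors
    uhd_4k = 3840 * 2160                 #  8,294,400 pixels on a single 4K/UHD display
    print(eyefinity / uhd_4k)            # ~1.33, i.e. about a third more pixels than 4K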



--::{PC Gaming Master Race}::--

Pemalite said:
ethomaz said:

And yeah... "no sub-1080p" was a position I took even before the generation started, in 2012, because that is what I expected from the PS3 and it didn't deliver. Now with the PS4 it is the minimum, because you know the HDTV standard is 1080p and non-native resolutions look like shit.

Next generation I expect 4K minimum... I will buy a 4K TV in the next five years... my next console will need to be 4K.

I don't want to step back from what I had on PS360... I want to go forward and I support that... the PS4 at least is moving forward much better than the PS3 did, while I can't say the same thing about the competition.


Good luck with that.
If this is a "short" generation, there may not be enough technical progress in the PC space to make 4K a viable option for consoles in 4-5 years' time. Quad HD (2560×1440) could be a possibility; my bet is on 1080p being a target again.
Keep in mind that "big" GPU jumps are only happening once every 3-4 years now in the PC space, and consoles will probably use mid-range or low-end hardware anyway.
Right now you need to CrossFire/SLI two high-end cards to achieve a decent 4K experience, and with the baseline moved up thanks to the new consoles, I can see myself and others needing to throw even more hardware at the performance problem, which you can't do in a console because of cost.


Very good points. GPU performance used to double every 18-24 months, but now it takes about 3 years for that to happen. The HD 7970 came out in December 2011, and only now with the GTX 980 are we approaching 80-90% more performance than that card:

http://www.techpowerup.com/reviews/ASUS/GTX_980_STRIX_OC/25.html

The pace of progress is even slower in the 100-125W TDP mobile space. AMD just released the M295X, which is nowhere near 2X faster than the HD 7970M that debuted in May 2012. Going from 1080p to 4K essentially requires 3-4X the GPU power of an HD 7970/M295X. That means a 3-4X GPU increase is used up just moving to 4K, and that doesn't even take into consideration that games in 2019-2020 will be a lot more demanding in terms of lighting complexity, AI, NPCs, textures, tessellation, etc. That means the PS5 really needs a GPU at least 10X as powerful as the PS4's to even make 4K gaming viable. As a result, it's highly unlikely that the PS4/XB1 will have a short 4-5 year lifecycle.
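To put a rough number on that, here is a back-of-the-envelope sketch; the "future games" multiplier is an assumption chosen to illustrate how the 10X figure above decomposes, not a measured value:

    pixels_1080p = 1920 * 1080                     # 2,073,600
    pixels_4k = 3840 * 2160                        # 8,294,400
    res_factor = pixels_4k / pixels_1080p          # 4.0x from the resolution jump alone
    future_factor = 2.5                            # assumed extra cost of more demanding 2019-2020 games
    print(res_factor, res_factor * future_factor)  # ~4x and ~10x overall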

In upcoming games like The Witcher 3 or Kingdom Come: Deliverance, a single $550 GTX 980 will be a slideshow at 4K.

But the biggest reason 4K console gaming is more or less a gimmick is the requirement for a very large 4K TV. This article goes into a lot of detail about how, if you sit at a normal viewing distance of 9-10 feet, you need roughly an 84" 4K TV to resolve all the detail 4K offers. Firstly, that is a very large TV for most apartments/households to fit comfortably. Secondly, such TVs still cost far too much, and even in 4-5 years they are unlikely to cost less than $1500.

http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/
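The geometry behind that kind of claim is easy to sketch. Assuming the common 1-arcminute (20/20 acuity) rule of thumb, which is my assumption rather than the article's exact model, you can compute how large a single pixel appears from the couch:

    import math

    def pixel_arcminutes(diagonal_in, horiz_px, vert_px, distance_in):
        # angular size of one (square) pixel, in arcminutes
        width_in = diagonal_in * horiz_px / math.hypot(horiz_px, vert_px)
        pixel_pitch = width_in / horiz_px
        return math.degrees(math.atan2(pixel_pitch, distance_in)) * 60

    # 84" panel viewed from 9 feet (108 inches)
    print(pixel_arcminutes(84, 3840, 2160, 108))  # ~0.61 arcmin at 4K
    print(pixel_arcminutes(84, 1920, 1080, 108))  # ~1.21 arcmin at 1080p, around the acuity limit

Smaller screens or longer distances push the 1080p figure under 1 arcminute too, at which point the extra 4K pixels become hard to distinguish.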

For someone who wants to experience 4K gaming today, it's far better to just grab a 32" IPS BenQ monitor for $999:

http://www.bhphotovideo.com/c/product/1081065-REG/benq_bl3201ph_32_3840x2160_uhd_ips.html

The solution would then be to play most cross-platform titles like AC Unity on the PC and enjoy console exclusives on XB1/PS4. Otherwise, one is going to be waiting 7-8 years to experience 4K on a PS5. 

As far as AC Unity goes, the problem is the hype Ubisoft created. The game has flat shadows and a pre-baked lighting model that runs on the CPU instead of using DirectCompute on GCN, and it's very poorly optimized, calling for a GTX 680/HD 7970 as the minimum spec on the PC. Overall it pales in comparison to the gameplay footage and screenshots of, say, The Witcher 3. Ubisoft isn't particularly known for creating good-looking, well-optimized games, though. They should have focused on the gameplay and co-op aspects instead of hyping up graphics for Unity, because now everyone is just talking about the graphics, which is a lot of negative publicity for this title. Whoever handles PR/marketing for AC Unity should be fired for focusing on the wrong aspect to promote the game.



ethomaz said:

Resolution is one of the key factors affecting image quality, and if you are a PC user you know the last thing you drop in a game is resolution... everything else is second to resolution except framerate... effects, textures, AA, etc. will all be dropped before resolution for better image quality.

No, that's not true at all. Except for some strategy games, a PC game at 1080p or 1440p with maximum IQ and AA will almost always look better graphically than a game at 4K with Low or Medium settings, unless the PC version is broken such that there is no IQ change going from Medium to Very High, which happens often with half-assed console ports. Resolution does nothing to fix low-polygon character models, a broken shadow/lighting model, low-resolution textures, a lack of realistic physics effects, etc.

If you were a PC gamer, it was evident that Crysis 1 in 2007 at 1280x1024 fully maxed out looked better than any other PC game of that time at 2560x1600. If you take a game like Super Mario 64 or Quake 3 and play it at 8K, it's never going to look as good as Crysis 3 at 1280x1024. The reason PC games look so horrible at lower resolutions is the lack of proper scaling in LCDs when playing below native resolution. If, however, you fired up Crysis 3 at 1080p, it would look better than AC Unity on a 32" 4K monitor simply because the complexity of the graphics and the lighting model is far superior, CryEngine 3 being a superior game engine all around.

Pemalite said:

And whether consoles next gen will be capable of 4K gaming is, actually, up to the PC... And let's face it, the PC is slowing down in terms of performance increases; AMD, for instance, has pretty much been stagnant for 3 years, maybe 4 (remains to be seen).
Single GPUs today just don't have the grunt for 4K yet, and if AMD continues on a course of only 10-50% performance increases every 3+ years, then it's probably going to be well into 2020 before mid-range hardware is capable of 4K.

You are way off there. The HD 6970 came out in December 2010 and the 290X in November 2013: in 3 years AMD increased performance 2.3X. In the 4 years from the HD 5870, which came out in September 2009, AMD increased performance 2.95X:

http://www.computerbase.de/2013-12/grafikkarten-2013-vergleich/10/

In 2015 we will have an AMD GPU 2X faster than the HD 7970. Both AMD and NVIDIA now double GPU performance roughly every 3 years. I agree with the conclusion of your statement, though, because even if GPUs are 10X faster by 2020, games will also become 2-3X more complex. The law of diminishing returns also means that 10X the graphics horsepower will not produce a giant leap in graphics anymore.
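Those multipliers work out to a fairly steady yearly growth rate (a small sketch; the card dates and speed-ups are the figures quoted above):

    # implied compound annual growth from the figures above
    for label, factor, years in [("HD 6970 -> 290X", 2.3, 3), ("HD 5870 -> 290X", 2.95, 4)]:
        cagr = factor ** (1 / years) - 1
        print(f"{label}: {cagr:.0%} per year")   # ~32% and ~31% per year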

That's why Crytek says it's more difficult to wow gamers graphically now. Simply put, a 10X increase in graphics power from 2000 to 2010 is not the same as a 10X increase from 2010 to 2020. Graphics are already so good relative to the early 2000s that the next leap will require GPUs 50-100X more powerful. The first time you saw a game like Unreal 2 or Crysis and your jaw dropped... well, that's not going to happen anymore. Crysis 3, Metro: Last Light, Ryse: Son of Rome and Project CARS already look so good on the PC that improving beyond them requires an exponential increase in GPU power. I would even say that if the PS5 focused on 1080p 60 fps gaming for all games at max IQ by 2020, that would be far preferable to 4K games at low/medium details. Alas, marketing will try to spin 4K gaming as the future despite most people owning < 65" TVs.



BlueFalcon said:

You are way off there. The HD 6970 came out in December 2010 and the 290X in November 2013: in 3 years AMD increased performance 2.3X. In the 4 years from the HD 5870, which came out in September 2009, AMD increased performance 2.95X:

http://www.computerbase.de/2013-12/grafikkarten-2013-vergleich/10/


The Radeon 7970 was released in December 2011; we are one month away from hitting 3 years, and all that has been released since is the Radeon 290 and 290X, which didn't "replace" the 7970 per se, but slotted in a spot higher.

The Radeon R9 290 (keep in mind, I have four of them; prior to that I had three 7970s, and before that dual 6950s unlocked into 6970s) is anywhere from 10-50% faster than the Radeon 7970 depending on the benchmark, averaging closer to 30-40% overall. (I can't be bothered working out the exact percentages.)
http://anandtech.com/bench/product/1032?vs=1059

Please refrain from posting non-English links, as I am unable to read what they say; AnandTech, however, is generally regarded as highly reliable and accurate.

We will be passing the 3-year mark before the Radeon 300 series is released, with some rumours placing the 300 series in the second or third quarter of next year (I HOPE NOT!), hence my reasoning for saying "3+".
And even then, we are not guaranteed a significant increase at the high end.

However, what we can do is look at the increase that the jump from 40nm (TeraScale 2) to 28nm (Graphics Core Next) brought us, which was in the realm of 30-90% depending on the benchmark.
The difference this time around is that there is no architecture overhaul; what AMD will be doing is using the tech found in Tonga, a.k.a. Graphics Core Next "1.2". So one can assume the gains will be less than the jump between TeraScale and Graphics Core Next, although there is hopefully a node drop to 20nm.

With that in mind, what makes me think second half of 2015 for a release was the rumour of two new GPUs: one with a 350mm² die, which turned out to be Tonga, and another with a 500mm² die which has yet to be released and would be larger than the Radeon R9 290X; that may buy AMD a little time until the second half of 2015. Again, it's only a rumour and only half of it has come true, so take it as you will.



--::{PC Gaming Master Race}::--