Teddy said:

 You say it doesn't really matter what it's called, but that's what it is called. Your attitude is ignorance, and it would make a difference to you if you saw them side by side; it doesn't matter if you don't mind, that's what standards are for. Anyway, I can see where you are going with this, and that's your opinion, but you can't dictate your view just because you don't care; that's an argument from ignorance. We do care over here, as I have said: even 1080i is not good enough here, and that's what most TV transmission uses. SKY TV uses 1080i, while Talktalk uses 1080p. Just because something is x16 doesn't make it HD; it makes it higher resolution. Standards are standards, and your opinion that it doesn't matter what it is called is just ignorance, and ignorance shouldn't be the standard.

We obviously have very different opinions on this, so I'll leave it at that, but the bare minimum anyone can call 'HD' is 1280x720, which is 921,600 pixels. 1024x768 is 786,432 pixels, which is a lot less and not HD. My old CRT computer monitor in the 90s had a 1024x768 resolution, and it was a POS to be honest, haha. The XGA display standard was introduced in 1990, and 1024x768 later became the most common display resolution.
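For anyone who wants to check the math, here's a quick sketch in plain Python (the labels are just mine):

```python
# Compare pixel counts for the two resolutions under discussion.
resolutions = {
    "1280x720 (the usual HD floor)": (1280, 720),
    "1024x768 (XGA)": (1024, 768),
}

hd_floor = 1280 * 720  # 921,600 pixels

for label, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{label}: {pixels:,} pixels ({pixels / hd_floor:.0%} of 720p)")
```

So XGA comes in at roughly 85% of the 720p pixel count.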

Standards are for television manufacturers and the consumers who purchase them. The plasmas here that had either an XGA or a WXGA screen were called HD, so the standard has been blurry (definition-wise) for a while. Having said that, the terminology of HD doesn't seem to change over time; otherwise we wouldn't have things like Ultra HD. 1080p would just become the new SD (Standard Definition) and High Definition (HD) would be 4K (or maybe 1440p). It doesn't matter when the standard started, either; 1080i has been a thing since the 90's as well.

Honestly, this is like arguing whether it is right to write "color" or "colour." Standardization doesn't give us any more knowledge in this context. If we were buying televisions right now, sure, I would concede that using the standard terminology is important and useful, but what we are doing is talking about the difference (or multiple) between the rendering resolutions of two different versions of a game. In that context, HD in a colloquial sense is fitting. As for the ignorance statement: what is it that makes 720p and 1080p special? Just that they fit the ideal aspect ratio for a widescreen television. That's all. It tells us nothing about picture quality for something with a different aspect ratio. A DS game rendered at 1024 x 768 has similar image quality to an equally sized DS image stretched and reduced to 1280 x 720 (if we ignore the ugly stretching). Notice that a resolution like 1024 x 768 doesn't fit within the Standard Definition terminology either. So what would a manufacturer producing a 4:3 plasma television call the resolution capabilities of said television? It isn't that clear-cut and concise. It is a blurry label which only works in certain contexts. That is why I choose to ignore it when I describe a game's image quality as HD or not.
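To make the aspect-ratio point concrete, here's a minimal sketch (plain Python; the resolution list is just my example):

```python
from math import gcd

def aspect(w: int, h: int) -> str:
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

for w, h in [(1280, 720), (1920, 1080), (1024, 768)]:
    print(f"{w}x{h} -> {aspect(w, h)}, {w * h:,} pixels")

# 1280x720  -> 16:9, 921,600 pixels
# 1920x1080 -> 16:9, 2,073,600 pixels
# 1024x768  -> 4:3,  786,432 pixels
```

The only thing 720p and 1080p share that XGA lacks is the 16:9 ratio; by raw pixel count, XGA sits between SD and 720p.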

Also, you are ignoring the history of the term as used in video (rather than televisions), where it is even less stringent and more colloquial.

http://en.wikipedia.org/wiki/High-definition_video

"