
Forums - Gaming Discussion - Is 720p still acceptable to you?

 

Well, is it?

Yes: 571 (64.30%)
No: 317 (35.70%)
Total: 888
SvennoJ said:
TomaTito said:

Please inform us how that goes. How laggy is it? Compared to the GamePad or Vita (TV)?

The GamePad uses the 5 GHz band to reduce lag, which is also why it has such a short range.
I'm assuming Remote Play is just streamed through your network? Even from outside?

FYI, WiDi on my laptop and smartphone does lag noticeably.

I posted my findings in this thread:
http://gamrconnect.vgchartz.com/thread.php?id=215495&page=2

It is laggy over WiFi; neither my PS4 nor my laptop is wired. A direct WiFi connection is not (yet?) supported.
Still playable, but a bit blurry (max 10 Mbps). Definitely not Wii U GamePad quality.

Thanks for the heads up, already saw the great posts you made there.
Still not sure what protocol the GamePad uses, since Miracast/WiDi/etc. all have inherent lag.

-----

Did a little research (I'd tried before, around launch, but nothing came up) and only a few snippets turned up:

  • It uses a modified Wi-Fi protocol designed for low-latency transmission, establishing its connection with the console by using a variant of the WPS process, with proprietary transfer protocol and software co-developed with Broadcom [source]
  • Controller inputs themselves are beamed back to the Wii U via the same WiFi channel (180 times per second) [source]
  • And some actual unfiltered information [source]:
While some “journalists” reported that the Wii U gamepad is using the Miracast technology, a Wi-Fi standard, it turned out that this was never the case. Instead, Nintendo decided to reinvent four different protocols (video streaming, audio streaming, input streaming as well as a light request-reply RPC protocol), and embed them in a slightly obfuscated version of WPA2, sent over the air using 5GHz Wi-Fi 802.11n. A small ARM CPU is embedded in the Wii U Gamepad (codenamed DRC) and runs a realtime operating system to handle network communication. In the Wii U, another ARM CPU (codenamed DRH) does the same thing.
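To put that 180-times-per-second figure in perspective, here is a purely illustrative Python sketch; the address, port, and packet layout are all made up, since the real GamePad protocol is proprietary. At 180 Hz you are sending one tiny input report roughly every 5.6 ms.

import socket
import struct
import time

# Purely illustrative toy, NOT the real (proprietary) GamePad protocol:
# just a feel for what "inputs beamed back 180 times per second" means.
SEND_RATE_HZ = 180                      # figure from the snippet above
INTERVAL = 1.0 / SEND_RATE_HZ           # ~5.6 ms between input reports
CONSOLE_ADDR = ("192.168.1.10", 50022)  # made-up address and port

def pack_input_state(seq: int, buttons: int, stick_x: int, stick_y: int) -> bytes:
    """Hypothetical 10-byte report: sequence number, button bitmask, one stick."""
    return struct.pack("<IHhh", seq, buttons, stick_x, stick_y)

def stream_inputs(duration_s: float = 1.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seq = 0
    next_send = time.monotonic()
    end = next_send + duration_s
    while next_send < end:
        sock.sendto(pack_input_state(seq, 0x0001, 1200, -300), CONSOLE_ADDR)
        seq += 1
        next_send += INTERVAL                      # fixed 180 Hz cadence
        time.sleep(max(0.0, next_send - time.monotonic()))

if __name__ == "__main__":
    stream_inputs()  # sends ~180 tiny datagrams in one second

The point is just that the input path is cheap: a handful of bytes a few hundred times per second. The video stream going the other way is what actually needs the bandwidth; the input path just needs to be quick and regular.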

Sorry for going off-topic guys :)



@Twitter | Switch | Steam

You say tomato, I say tomato 

"¡Viva la Ñ!"


I can tolerate it, but I can definitely tell the difference between it and 1080p; 1080p is preferable. I switched between resolutions on my Xbox One and PS4 and it was like night and day.



I was gaming at higher than 720p in the mid '90s (1280x1024).

I don't find 1080p acceptable unless it's on a mobile device; 720p has no chance, and 1080p belongs in the last century.
Quad HD and Quad Full HD are where it's at.

I also find it hilarious that the same people on these forums who didn't care about resolution before these consoles launched are now championing 1080p like it's some amazing new revelation.



--::{PC Gaming Master Race}::--

Yerm said:
TH-Work said:

Why? It's not like any resolution makes a game better!

It's not a matter of game quality, it's a matter of product quality. 720p looks fine, but even CELL PHONES are starting to have higher resolutions.

And? Who cares? A lot of NES games are still better than games from today; resolution doesn't matter ;)



lol no




If the IQ (image quality) is good, sure.



TomaTito said:

Thanks for the heads up, already saw the great posts you made there.
Still not sure what protocol the GamePad uses, since Miracast/WiDi/etc. all have inherent lag.

-----

Did a little research (I'd tried before, around launch, but nothing came up) and only a few snippets turned up:

  • It uses a modified Wi-Fi protocol designed for low-latency transmission, establishing its connection with the console by using a variant of the WPS process, with proprietary transfer protocol and software co-developed with Broadcom [source]
  • Controller inputs themselves are beamed back to the Wii U via the same WiFi channel (180 times per second) [source]
  • And some actual unfiltered information [source]:
While some “journalists” reported that the Wii U gamepad is using the Miracast technology, a Wi-Fi standard, it turned out that this was never the case. Instead, Nintendo decided to reinvent four different protocols (video streaming, audio streaming, input streaming as well as a light request-reply RPC protocol), and embed them in a slightly obfuscated version of WPA2, sent over the air using 5GHz Wi-Fi 802.11n. A small ARM CPU is embedded in the Wii U Gamepad (codenamed DRC) and runs a realtime operating system to handle network communication. In the Wii U, another ARM CPU (codenamed DRH) does the same thing.

Sorry for going off-topic guys :)

It has its drawbacks though: the GamePad already starts to lose connection at the other end of my living room, and it won't work through the patio door. I have to move the Wii U outside if I want to sit on the deck with the GamePad, even though that's less than 8 ft from where it normally sits.

Still, it opens up possibilities for wireless VR headsets; perhaps the Wii U tech is fast enough for that. Streaming VR games over the network, however, will be impossible. Vita and PS TV should be a bit better since they support direct WiFi; perhaps that will come to Remote Play in a future update too. Too bad the PS4 doesn't have 5 GHz WiFi.

Anyway, the lag is a much bigger limiting factor than the 720p resolution.



Pemalite said:
I was gaming at higher than 720p in the mid '90s (1280x1024).

I don't find 1080p acceptable unless it's on a mobile device; 720p has no chance, and 1080p belongs in the last century.
Quad HD and Quad Full HD are where it's at.

I also find it hilarious that the same people on these forums who didn't care about resolution before these consoles launched are now championing 1080p like it's some amazing new revelation.

 

Can't speak for others, but I think this has to do with most people now having 1080p displays, with 4K still being too expensive and under-supported to justify buying in. I remember when I first played a game at 1080p and thought, "holy shit, this is what my TV can do," and from then on I was disappointed with all the 720p games last gen. Before that I didn't care about resolution whatsoever.

I'm good with 720p on my current screen (50"). I may get a 60"+ set in a few years, and at that point I would prefer 1080p or 4K, but 720p would still look "good".

I prefer graphical resources put into better models, lighting, large zones filled with content, etc.
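For what it's worth, here is a rough back-of-the-envelope sketch of that screen-size trade-off. It assumes the usual ~1 arcminute visual acuity (the 20/20 benchmark) and a 16:9 panel; the 50" size and 8 ft seating distance are just example numbers. It compares the angular size of one pixel against what the eye can resolve.

import math

# Back-of-the-envelope only: assumes ~1 arcminute visual acuity (the usual
# 20/20 benchmark) and a 16:9 panel; screen size and distance are examples.
def pixel_pitch_arcmin(diagonal_in: float, horizontal_px: int, distance_in: float) -> float:
    """Angular size of one pixel, in arcminutes."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # physical screen width
    pixel_in = width_in / horizontal_px              # physical pixel pitch
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

for label, px in [("720p", 1280), ("1080p", 1920)]:
    arcmin = pixel_pitch_arcmin(diagonal_in=50, horizontal_px=px, distance_in=8 * 12)
    verdict = "resolvable" if arcmin > 1.0 else "below the acuity limit"
    print(f'{label}: {arcmin:.2f} arcmin per pixel on a 50" screen at 8 ft ({verdict})')

By that rough measure, a 720p pixel on a 50" screen at about 8 ft is roughly 1.2 arcminutes (just resolvable), while a 1080p pixel is about 0.8 arcminutes (near the eye's limit), which is why screen size and seating distance matter so much in this debate.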



If you own a big-screen TV and can't see the difference between 720p and 1080p, I would say that you are lying.