
RDR2 and Destiny 2 on Stadia run at lower resolutions than they do on XB1 X

And you pay a subscription fee plus the full price of the game on supposedly better HW (that some swear would compete with the PS5 and the next Xbox), only to get an inferior result to the X1X and probably the PS4 Pro. No thanks. And all that for a small selection of older games.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Bofferbrauer2 said:
I wouldn't say that the resolution itself is worse... rather that the codec compresses the image so much that the end result looks like a much lower resolution than it should.

And that was obvious from the get-go, since the low internet speeds listed as necessary for 1080p60 and 4K made it very clear it would be hyper-compressed compared to industry standards.

Nope, you are wrong: the game is rendered at 1080p and then upscaled to 4K for streaming (and probably compressed after rendering)... it's a clusterfuck.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:
Bofferbrauer2 said:
I wouldn't say that the resolution itself is worse... rather that the codec compresses the image so much that the end result looks like a much lower resolution than it should.

And that was obvious from the get-go, since the low internet speeds listed as necessary for 1080p60 and 4K made it very clear it would be hyper-compressed compared to industry standards.

Nope, you are wrong: the game is rendered at 1080p and then upscaled to 4K for streaming (and probably compressed after rendering)... it's a clusterfuck.

@bolded: That's what I was alluding to: their compression is shit because they wanted to keep the required internet bandwidth low. Upscaling the 1080p image to 4K is just the cherry on top to achieve that goal. IIRC you needed 10 Mbit for 720p, yet according to them 30 Mbit already got you to 4K. It was clear to me that this would need some trickery like extreme (and inadequate) compression, and upscaling on top of that, as it seems.

I mean, these are the recommended bandwidths for 4K60:

VP9: 120 Mbit

AV1, HEVC: 40-160 Mbit

AVC: 240 Mbit

Keep in mind that even the minimum of 40 Mbit needs to be rock-solid at all times for this to work, which is rarely the case over the internet.
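For perspective, here's a rough back-of-the-envelope sketch (Python, just arithmetic) of how aggressive the compression has to be; the 30 Mbit figure is the one quoted above for 4K, and the raw video rate assumes plain 8-bit RGB at 3840x2160, 60 fps:

# Back-of-the-envelope: how much compression does 4K60 at a given bitrate imply?
width, height, fps = 3840, 2160, 60
bits_per_pixel_raw = 24  # 8-bit RGB, uncompressed

raw_mbit_per_s = width * height * fps * bits_per_pixel_raw / 1e6
print(f"Raw 4K60 video: {raw_mbit_per_s:,.0f} Mbit/s")  # ~11,944 Mbit/s

for name, mbit in [("Stadia (advertised)", 30),
                   ("VP9 recommendation", 120),
                   ("AVC recommendation", 240)]:
    print(f"{name}: {mbit} Mbit/s -> ~{raw_mbit_per_s / mbit:,.0f}:1 compression")

So 30 Mbit for 4K60 means roughly 400:1 compression, versus about 100:1 at the bitrate usually recommended for VP9. Something has to give, and apparently it's the image.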



DonFerrari said:
And you pay a subscription fee plus the full price of the game on supposedly better HW (that some swear would compete with the PS5 and the next Xbox), only to get an inferior result to the X1X and probably the PS4 Pro. No thanks. And all that for a small selection of older games.

Sometimes it looks worse than the base PS4 and XB1S do.
On top of that there's stuttering (the audio isn't synced and will hitch) and input lag (anywhere from about 0.5 s to a full 1 s).

You're getting a worse experience than on a console.
Some of this is fixable, if Google spends more on the servers and if people have good enough bandwidth and data caps.

However, what's not really fixable is the input delay (it varies case by case, depending on where you live).

Watch this: https://www.youtube.com/watch?v=o6pf988yFSc&feature=emb_logo

27 seconds in, he does some input lag testing... just the eyeball type.
However, it's so bad you don't really need equipment to measure it.

You can observe the delay between his key press and the action in-game.
It's long in this case, like 1+ seconds of delay.
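If anyone wants to put a number on that eyeball test, a crude way is to step through the recording frame by frame and count the frames between the key press and the on-screen reaction; the delay is just that frame count divided by the recording's frame rate. A minimal sketch (assuming you read the two frame numbers off manually and the capture is 30 fps):

# Rough input-lag estimate from frame counting in a recording.
# press_frame and reaction_frame are read off manually while scrubbing;
# video_fps is the frame rate of the recording (assumed 30 here).
def estimate_input_lag(press_frame, reaction_frame, video_fps=30):
    frames = reaction_frame - press_frame
    return frames / video_fps  # seconds of delay

# Example: the reaction shows up 32 frames after the press in a 30 fps capture
print(f"{estimate_input_lag(100, 132):.2f} s")  # ~1.07 s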


I don't think the results necessarily mean the GPUs are being split, as someone suggested. More likely the servers are severely heat constrained, rather than hardware constrained, and are running at lower clocks. Cheaper and simpler than spending a few million dollars on extra cooling... especially if you believe your userbase is too dumb to notice the difference.



JRPGfan said:
DonFerrari said:
And you pay a subscription fee plus the full price of the game on supposedly better HW (that some swear would compete with the PS5 and the next Xbox), only to get an inferior result to the X1X and probably the PS4 Pro. No thanks. And all that for a small selection of older games.

Sometimes it looks worse than the base PS4 and XB1S do.
On top of that there's stuttering (the audio isn't synced and will hitch) and input lag (anywhere from about 0.5 s to a full 1 s).

You're getting a worse experience than on a console.
Some of this is fixable, if Google spends more on the servers and if people have good enough bandwidth and data caps.

However, what's not really fixable is the input delay (it varies case by case, depending on where you live).

Watch this: https://www.youtube.com/watch?v=o6pf988yFSc&feature=emb_logo

27 seconds in, he does some input lag testing... just the eyeball type.
However, it's so bad you don't really need equipment to measure it.

You can observe the delay between his key press and the action in-game.
It's long in this case, like 1+ seconds of delay.

That does look unplayable and is a much bigger problem than resolution or low FPS.



Signature goes here!

TruckOSaurus said:
JRPGfan said:

Sometimes it looks worse than the base PS4 and XB1S do.
On top of that there's stuttering (the audio isn't synced and will hitch) and input lag (anywhere from about 0.5 s to a full 1 s).

You're getting a worse experience than on a console.
Some of this is fixable, if Google spends more on the servers and if people have good enough bandwidth and data caps.

However, what's not really fixable is the input delay (it varies case by case, depending on where you live).

Watch this: https://www.youtube.com/watch?v=o6pf988yFSc&feature=emb_logo

27 seconds in, he does some input lag testing... just the eyeball type.
However, it's so bad you don't really need equipment to measure it.

You can observe the delay between his key press and the action in-game.
It's long in this case, like 1+ seconds of delay.

That does look unplayable and is a much bigger problem than resolution or low FPS.

Look a bit further where he plays on his phone. It also says in the article that it wasn't a problem on his phone. When you get lower latency through your phone, there's something wrong with your internet connection...



Bofferbrauer2 said:
DonFerrari said:

Nope, you are wrong: the game is rendered at 1080p and then upscaled to 4K for streaming (and probably compressed after rendering)... it's a clusterfuck.

@bolded: That's what I was alluding to: their compression is shit because they wanted to keep the required internet bandwidth low. Upscaling the 1080p image to 4K is just the cherry on top to achieve that goal. IIRC you needed 10 Mbit for 720p, yet according to them 30 Mbit already got you to 4K. It was clear to me that this would need some trickery like extreme (and inadequate) compression, and upscaling on top of that, as it seems.

I mean, these are the recommended bandwidths for 4K60:

VP9: 120 Mbit

AV1, HEVC: 40-160 Mbit

AVC: 240 Mbit

Keep in mind that even the minimum of 40 Mbit needs to be rock-solid at all times for this to work, which is rarely the case over the internet.

Understood. It is so fucking idiotic to upscale and then compress =p

Stadia looks worse every time I look at it.

JRPGfan said:
DonFerrari said:
And you pay a subscription fee plus the full price of the game on supposedly better HW (that some swear would compete with the PS5 and the next Xbox), only to get an inferior result to the X1X and probably the PS4 Pro. No thanks. And all that for a small selection of older games.

Sometimes it looks worse than the base PS4 and XB1S do.
On top of that there's stuttering (the audio isn't synced and will hitch) and input lag (anywhere from about 0.5 s to a full 1 s).

You're getting a worse experience than on a console.
Some of this is fixable, if Google spends more on the servers and if people have good enough bandwidth and data caps.

However, what's not really fixable is the input delay (it varies case by case, depending on where you live).

Watch this: https://www.youtube.com/watch?v=o6pf988yFSc&feature=emb_logo

27 seconds in, he does some input lag testing... just the eyeball type.
However, it's so bad you don't really need equipment to measure it.

You can observe the delay between his key press and the action in-game.
It's long in this case, like 1+ seconds of delay.

Worse than a base console on theoretically next-gen-level HW is obtuse, kkkk, and yes, that isn't even the worst issue. But that's basically what we've been saying about "The Cloud" that was supposed to make the X1 much more powerful than the PS4: the latency and internet reliability just aren't there for it to be a good idea.

haxxiy said:
I don't think the results necessarily mean the GPUs are being split, as someone suggested. More likely the servers are severely heat constrained, rather than hardware constrained, and are running at lower clocks. Cheaper and simpler than spending a few million dollars on extra cooling... especially if you believe your userbase is too dumb to notice the difference.

That would be really obtuse of them. Letting the air cooling in the server room be the limiting factor is ridiculous; it would be better to put in cheaper HW and avoid issues with the HVAC while still delivering the same shitty performance.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

SanAndreasX said:
Stadia honestly looks more like a product for people in poorer countries where conventional console or PC hardware is too expensive due to local economics or trade issues.

Those kinds of places won't have the internet speeds Stadia needs to run.



Hiku said:
SvennoJ said:

Look a bit further where he plays on his phone. It also says in the article that it wasn't a problem on his phone. When you get lower latency through your phone, there's something wrong with your internet connection...

In the video it didn't say there wasn't a problem on his phone. It said it wasn't as present on mobile.
I was able to pause the video after he pressed a button and see no action in the game, so it's still significant. Just not as bad as 1+ seconds long.

The reason for this isn't necessarily the internet connection. There's both input lag and display lag going on simultaneously. And while testing on mobile he's using both a different screen and a different controller, which are connected differently.

I'm also not sure what the settings are for mobile. Stadia may be outputting at lower resolutions for phones to save bandwidth.

If the user also has to worry about the quality of their modem, router, distance, etc. to have a decent experience, then it will be even less likely to satisfy customers.
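Just to illustrate where all of that delay can stack up, here's a purely hypothetical latency budget; every number below is a made-up assumption for the sake of the example, not a measured Stadia figure:

# Illustrative end-to-end streaming latency budget (example values only, in ms)
budget_ms = {
    "controller / input sampling": 10,
    "upstream network":            20,
    "server render + encode":      30,
    "downstream network":          20,
    "client decode":               15,
    "display / TV processing":     40,
}
for stage, ms in budget_ms.items():
    print(f"{stage:30s} {ms:4d} ms")
print(f"{'total':30s} {sum(budget_ms.values()):4d} ms")  # 135 ms even in this optimistic example

Even with fairly generous assumptions you're well past what a local console adds, and any one of those stages (a bad router, a TV not in game mode, network congestion) can blow the total up much further.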



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."