Bofferbrauer2 said:
DonFerrari said:

Nope, you are wrong: the game is rendered at 1080p, then upscaled to 4K for streaming (and probably compressed after rendering)... it is a clusterfuck.

@bolded: That's what I was alluding to: their compression is shit because they wanted to keep internet bandwidth low. Upscaling a 1080p render to 4K is just the cherry on top toward that goal. IIRC you needed 10 Mbit for 720p, but with 30 Mbit you already reached 4K according to them. It was clear to me that this needed some trickery like extreme (and inadequate) compression - and upscaling on top of that, as it seems.

I mean, these are the recommended bandwidths for 4K60:

VP9: 120 Mbit

AV1, HEVC: 40-160 Mbit

AVC: 240 Mbit

Keep in mind that the minimum of 40 Mbit needs to be rock-solid at all times to work, which is rarely the case over the internet. A quick back-of-the-envelope comparison is sketched below.
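To put the gap in numbers, here's a minimal Python sketch; the recommended bitrates are the ones listed above, and the 30 Mbit "Stadia" figure is the one quoted earlier in the thread, not an official spec:

```python
# Rough bits-per-pixel comparison, using the recommended bitrates quoted
# above plus Stadia's claimed ~30 Mbit for 4K (a figure from this thread,
# not an official spec sheet).

RES_4K = 3840 * 2160      # transmitted resolution
RES_1080P = 1920 * 1080   # actual render resolution, per this thread
FPS = 60

def bits_per_pixel(mbit: float, pixels: int, fps: int = FPS) -> float:
    """Average encoded bits available per pixel per frame."""
    return mbit * 1_000_000 / (pixels * fps)

for codec, mbit in [("VP9", 120), ("AV1/HEVC min", 40),
                    ("AV1/HEVC max", 160), ("AVC", 240)]:
    print(f"{codec:>16}: {bits_per_pixel(mbit, RES_4K):.3f} bits/px at 4K60")

print(f"{'Stadia claimed':>16}: {bits_per_pixel(30, RES_4K):.3f} bits/px at 4K60")
# And since the source is upscaled 1080p, 3 out of every 4 transmitted
# pixels carry no new information in the first place.
```

The Stadia figure comes out at well under a fifth of even the low end of the recommended range, before accounting for the upscaled source.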

Understood. It is so fucking idiotic to upscale and then compress =p

Stadia looks worse every time I look at it.

JRPGfan said:
DonFerrari said:
And you pay a subscription fee + full price of the game on supposedly better HW (that some swear would compete with the PS5 and the next Xbox) to get an inferior result to the X1X and probably the PS4 Pro, no thanks. And all that for a small selection of older games.

Sometimes it looks worse than the base PS4 and XB1 do.
On top of that there's stuttering (the audio isn't synced and will hitch) and input lag (anywhere from about 0.5 s to a full 1 s).

You're getting a worse experience than a console gives.
Some of this is fixable, if Google spends more on the servers and if people have good enough bandwidth and data caps.

However, what's not really fixable is the input delay (it varies case by case, depending on where you live), as the rough latency budget sketched below illustrates.
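As a rough illustration of why (every number below is hypothetical, just to show where the time goes), end-to-end input lag is a sum of stages, and only some of them are under Google's control:

```python
# Hypothetical end-to-end latency budget for game streaming (all values
# illustrative, not measured). The network round trip depends on where
# you live relative to the datacenter and can't be engineered away.

budget_ms = {
    "controller/input capture": 8,
    "network uplink (input)":   20,   # half of a 40 ms round trip
    "game simulation + render": 16,   # one frame at 60 fps
    "video encode":             10,
    "network downlink (video)": 20,   # other half of the round trip
    "video decode":             5,
    "display scanout":          8,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<26} {ms:>4} ms")
print(f"{'total':<26} {total:>4} ms")
# Google can buy faster encoders and GPUs, but the two network legs are
# bounded by distance and route quality: the "not really fixable" part.
```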

Watch this: https://www.youtube.com/watch?v=o6pf988yFSc&feature=emb_logo

27 seconds in, he does some input lag testing... just the eyeball type.
However, it's so bad that you don't really need equipment to measure it.

You can observe the delay between his key press and the action in-game.
It's long in this case, like 1+ second of delay.
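The eyeball test in the video can be tightened up by counting frames in a screen recording; a minimal sketch, where the 60 fps capture rate and the frame numbers are made-up inputs for illustration:

```python
# Estimate input lag from a fixed-rate screen recording: find the frame
# where the key press becomes visible and the frame where the game
# reacts, then divide the gap by the capture rate.

CAPTURE_FPS = 60  # assumed recording frame rate

def input_lag_ms(press_frame: int, reaction_frame: int,
                 fps: int = CAPTURE_FPS) -> float:
    """Milliseconds elapsed between two frame indices of the recording."""
    return (reaction_frame - press_frame) / fps * 1000

# Hypothetical numbers: key press visible at frame 120, the character
# finally moves at frame 186, consistent with the "1+ second" delay
# observed in the video.
print(f"{input_lag_ms(120, 186):.0f} ms")  # prints "1100 ms"
```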

Worse than a base console for supposedly next-gen-level HW is obtuse kkkk, and yes, that isn't even the worst issue. But that is basically what we have been saying about "The Cloud" that would supposedly make the X1 much more powerful than the PS4: the latency and internet reliability just aren't there for it to be a good idea.

haxxiy said:
I don't think the results necessarily mean GPUs are being split, as someone suggested. It could instead be that the servers are severely heat-constrained rather than hardware-constrained, and running at lower clocks. Cheaper and simpler than spending a few million dollars on extra cooling... especially if you believe your userbase is too dumb to notice the difference.

That would be really obtuse of them. Making the server room's air cooling the limiting factor is ridiculous; it would be better to put in cheaper HW and not have issues with the HVAC while still delivering the same shitty performance.


