RDR2 and Destiny 2 on Stadia run at lower resolutions than they do on XB1 X

That's because Scorpio is a monster.



eva01beserk said:
SanAndreasX said:
Stadia honestly looks more like a product for people in poorer countries where conventional console or PC hardware is too expensive due to local economics or trade issues.

They would not be able to afford hardware, but somehow be able to afford a subscription service + crazy-fast internet?

Well, imported hardware prices scale according to nominal exchange rates, while services are priced locally. So it makes some sense.

Of course, the latency would be insane and there's still an expensive controller to buy so nah.



Mr Puggsly said:
RolStoppable said:
I play Switch games at 720p and 30 fps, so why would I be bothered by this?

People make a fuss about 1440p on Stadia, but that's nothing compared to the inability of Google to stream Nintendo games. Oh wait... Google's idea for that is YouTube, where people can watch others play Nintendo games. What a bunch of fools, they really don't get gaming.

Well, you've done a good job of making this topic about yourself.

But here's the rub: Stadia was marketed as having vastly superior specs to consoles, X1X included. In theory those specs create the expectation of maybe 4K and 60 fps.

This is actually good news for MS and Sony. Their streaming services weren't making big visual promises like Stadia. Therefore the disparity isn't going to be as big as expected.

Personally the resolution doesn't bug me, but the 30 fps cap is bullshit. It was marketed for its power, and it's not demonstrating what was advertised.

Early days, early games, and you conveniently ignore this:

Destiny 2 is perfectly playable on Stadia and delivers the 60fps experience that Xbox One X cannot

60 fps for a streaming service is pretty good, and that comes with lower latency than a lot of 30 fps games on console. (Yeah, latency is still 44 ms higher for this game compared to the console version; not bad, but still noticeable.)



SanAndreasX said:
Stadia honestly looks more like a product for people in poorer countries where conventional console or PC hardware is too expensive due to local economics or trade issues.

Those countries often lack fast internet as well, and if you can afford internet fast enough for Stadia in those countries, you can very well afford a console there, too.



Ok guys, I figured out the exact specs each user is getting from Google Stadia:

CPU: Intel Xeon, 4 cores / 8 threads, 2.7 GHz

GPU: 5.35-teraflop Vega

RAM: 8 GB HBM2 VRAM + 8 GB DDR4

The CPU is based on this slide, where several people on Twitter and Reddit said it's an Intel Xeon 4-core/8-thread CPU, going by the L2 + L3 cache numbers.

The amount of RAM is based on the slide above and this quote from Ars Technica: "a total of 16GB combined VRAM and system RAM clocked at up to 484 GB/s".

8 GB HBM2 + 8 GB DDR4 makes the most sense based on that quote.

https://arstechnica.com/gaming/2019/03/google-jumps-into-gaming-with-google-stadia-streaming-service/

We now know that Destiny 2 is running at 1080p/60fps at medium settings. Looking at a benchmark on Guru3D, Vega 56, which has 10.5 TF, gets 142 fps at 1080p medium settings; split that between 2 users and you get 71 fps. The professional Vega GPUs have 16 GB of HBM2 and 11.3 TF (air-cooled version). So I simply think Google is using a professional Vega GPU at 10.7 TF with 16 GB of HBM2 and splitting the GPU between 2 users.

Let's hope Eurogamer will do its job and ask Google exactly what hardware spec each user is getting, as it's kinda funny how people are paying money into Google Stadia and can't be sure what kind of hardware they're getting :) Note that I'm just guessing, but I'm sharing what I'm basing those guesses on.
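Just to spell out the math, here's a rough sketch in Python using only the numbers above; the two-users-per-GPU split is the guess being tested, not a confirmed spec.

```python
# Rough sketch of the guess above: one ~10.7 TF professional Vega card
# shared evenly between two Stadia users. All inputs are the figures
# quoted in the post, not confirmed hardware specs.
PRO_VEGA_TFLOPS = 10.7             # professional Vega card the post guesses Google is using
VEGA56_FPS_1080P_MEDIUM = 142      # Destiny 2 at 1080p medium, Guru3D benchmark cited above
USERS_PER_GPU = 2                  # the assumption being tested

per_user_tflops = PRO_VEGA_TFLOPS / USERS_PER_GPU
per_user_fps = VEGA56_FPS_1080P_MEDIUM / USERS_PER_GPU

print(f"Per-user compute: {per_user_tflops:.2f} TF")    # ~5.35 TF, matching the guessed spec
print(f"Estimated frame rate: {per_user_fps:.0f} fps")  # ~71 fps, leaving headroom for 60 fps
```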



"Donald Trump is the greatest president that god has ever created" - Trumpstyle

6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!


YouTube always makes my video uploads very blurry so I guess this is on brand.



Nintendo is selling their IPs to Microsoft and this is true because:

http://gamrconnect.vgchartz.com/thread.php?id=221391&page=1

And you pay a subscription fee + the full price of the game on supposedly better HW (that some swear would compete with PS5 and Xbox next) to get an inferior result to the X1X and probably the PS4 Pro, no thanks. And all that for a small selection of older games.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Bofferbrauer2 said:
I wouldn't say that the resolution is worse... but rather that the codec compresses the image so much that the end result looks like a much lower resolution than it should be.

And that was obvious from the get-go, since the low internet speeds listed as necessary for 1080p60 and 4K made it very clear it would be hyper-compressed compared to industry standards.

Nope, you are wrong: the game is rendered at 1080p and then upscaled to 4K for streaming (and probably compressed after rendering)... it is a clusterfuck.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

DonFerrari said:
Bofferbrauer2 said:
I wouldn't say that the resolution is worse... but rather that the codec compresses the image so much that the end result looks like a much lower resolution than it should be.

And that was obvious from the get-go, since the low internet speeds listed as necessary for 1080p60 and 4K made it very clear it would be hyper-compressed compared to industry standards.

Nope, you are wrong: the game is rendered at 1080p and then upscaled to 4K for streaming (and probably compressed after rendering)... it is a clusterfuck.

@bolded: That's what I was alluding to: their compression is shit because they wanted to keep internet bandwidth low. Upscaling a 1080p image to 4K is just the cherry on top to achieve that goal. IIRC you needed 10 Mbit/s for 720p, but with 30 Mbit/s you already reached 4K according to them. It was clear to me that this needed some trickery like extreme (and inadequate) compression, and upscaling on top of that, as it seems.

I mean, these are the recommended bandwidths for 4K60:

VP9: 120 Mbit/s

AV1, HEVC: 40-160 Mbit/s

AVC: 240 Mbit/s

Keep in mind that the minimum of 40 Mbit/s needs to be rock-solid at all times to work, which is rarely the case over the internet.
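To put those numbers in perspective, here's a quick back-of-the-envelope calculation in Python, assuming a constant-bitrate stream and decimal gigabytes:

```python
# Data use per hour for the recommended 4K60 bitrates listed above,
# assuming a constant-bitrate stream and decimal (SI) gigabytes.
recommended_mbit_per_s = {
    "VP9": 120,
    "AV1 / HEVC (low end)": 40,
    "AV1 / HEVC (high end)": 160,
    "AVC": 240,
}

for codec, mbit_s in recommended_mbit_per_s.items():
    gb_per_hour = mbit_s * 3600 / 8 / 1000  # Mbit/s -> GB per hour
    print(f"{codec}: {mbit_s} Mbit/s ~ {gb_per_hour:.0f} GB per hour")
```

Even the 40 Mbit/s low end works out to roughly 18 GB per hour of play.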



DonFerrari said:
And you pay a subscription fee + the full price of the game on supposedly better HW (that some swear would compete with PS5 and Xbox next) to get an inferior result to the X1X and probably the PS4 Pro, no thanks. And all that for a small selection of older games.

Sometimes it looks worse than the base PS4 and XB1 S do.
On top of that there's stuttering (audio isn't synced and will hitch) and input lag (anywhere from about 0.5 s to a full 1 s).

You're getting a worse experience than a console provides.
Some of this is fixable, if Google spends more on the servers and if people have good enough bandwidth and data caps.

However, what's not really fixable is the input delay (it varies case by case, depending on where you live).

Watch this: https://www.youtube.com/watch?v=o6pf988yFSc&feature=emb_logo

27 seconds into it, he does some input lag testing... just the eyeball type.
However, it's so bad you don't really need equipment to measure it.

You can observe the delay between his key press and the action in-game.
It's long in this case, like 1+ second of delay.
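If you want an actual number, that eyeball method boils down to counting frames between the key press and the first visible reaction, then converting at the video's frame rate. A quick sketch; the frame numbers below are made-up placeholders, not measurements from that video:

```python
# Convert a frame count from a recording into an input-delay estimate.
# The example frame numbers are placeholders, not measurements from the video.
VIDEO_FPS = 60  # frame rate of the capture

def input_delay_ms(press_frame: int, reaction_frame: int, fps: int = VIDEO_FPS) -> float:
    """Delay between the observed key press and the first frame showing a reaction."""
    return (reaction_frame - press_frame) / fps * 1000

print(input_delay_ms(press_frame=100, reaction_frame=160))  # 60 frames at 60 fps -> 1000.0 ms
```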

Last edited by JRPGfan - on 19 November 2019