If Google is splitting the GPU's performance in half (two users per GPU), we should see games running at 1080p/60fps, 1440p/30fps, or 4K/30fps with checkerboard rendering. That's relative to PS4 games running at 1080p/30fps. The GPU in Stadia should be somewhat more capable than those figures suggest, but they should be close enough.
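A quick sanity check of those figures: if you assume per-pixel rendering cost stays roughly constant across resolutions (a big simplification, since shading cost doesn't scale perfectly with pixel count), all three modes work out to about twice the pixel throughput of a PS4 running at 1080p/30fps:

```python
# Rough pixel-throughput comparison of the render targets mentioned above.
# Assumption: per-pixel cost is constant across resolutions (a simplification).

def pixels_per_second(width, height, fps, checkerboard=False):
    """Pixels shaded per second; checkerboard rendering shades roughly half."""
    px = width * height * fps
    return px // 2 if checkerboard else px

modes = {
    "PS4 1080p/30": pixels_per_second(1920, 1080, 30),
    "1080p/60":     pixels_per_second(1920, 1080, 60),
    "1440p/30":     pixels_per_second(2560, 1440, 30),
    "4K/30 CBR":    pixels_per_second(3840, 2160, 30, checkerboard=True),
}

baseline = modes["PS4 1080p/30"]
for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.1f} Mpx/s ({pps / baseline:.1f}x PS4)")
```

So "half a Stadia GPU" landing around 2x a base PS4 is at least internally consistent with the modes listed.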
The CPU should be 2-3x stronger than the PS4's CPU, depending on how well games are optimized for multiple cores, assuming it's a 4-core/8-thread Intel Xeon. We don't have a good way of verifying this, since the GPU will probably be the bottleneck in almost all scenarios. We should see it in Red Dead Redemption 2, though; performance should land around 52fps average in that game's more CPU-demanding scenes.
Eurogamer will probably have analyses of several games soon.
Virtualization doesn't work like that. The way instructions have to be assigned and executed in complex GPU pipelines, each with different burst times, data fetches, etc., means that each "half" would have far less computing power than is theoretically available. And it would probably be readily noticeable as frame-time spikes and latency, a bit like what happens with dual-GPU setups, but worse.
Not to mention that, commercially speaking, two GPUs half the size of a larger one would have been the cheaper alternative, considering how costs scale with die size. Of course, I'm not saying it's impossible that this happened; maybe Google did invest heavily in GPU resource scheduling and sharing (to little result), but that would seem sloppy and amateurish even by the standards of the Stadia launch.