When devs talk about the 30/60 fps thing, they mean they couldn't get all of the effects, textures and quality they wanted if they ran at 60fps. 60fps looks smoother, but dropping to 30 and getting a better-looking game is often preferable.
The difference between 30 and 60 is usually a tiny difference in perception, but it's double the workload on the graphics chip (GPU). 120Hz probably wouldn't look any different from 60Hz to 95% of people, but would again be double the work. The reason PCs have 120Hz monitors is for 3D glasses tech: the display alternates frames so each eye gets 60fps.
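To put rough numbers on the "double the workload" part, here's the frame-time budget at each rate. This is just the arithmetic, nothing console-specific:

```python
# At a given frame rate, the GPU has 1000/fps milliseconds to finish
# each frame before the next one is due on screen.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# 120 fps -> 8.3 ms per frame
```

Halving the frame time is why targeting 60fps means cutting effects: everything the GPU does per frame has to fit into half as many milliseconds.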
Console CPUs and GPUs are 'worse' than PC tech because they are fixed at a point in time (2005/2006) in order to maintain compatibility, and because their max power consumption is MUCH lower than a PC's (Wii = 18W, average PC = 300W, a gaming PC can be up to 500/600W real-world consumption). So naturally the Wii's CPU has to use a lot less power and therefore performs a lot worse, in the same way a netbook can't match a desktop.
Moore's Law, in PCs, is that the transistor count doubles every two years for CPUs and GPUs (I'm paraphrasing, but that's the effect). This means that if you have a 2-core CPU in 2006, you can probably have a 4-core CPU of the same kind in 2008 and an 8-core one in 2010 (again not exactly, because the cores are only part of the CPU). Same thing for the GPU. Performance scales relatively well with core and GPU-shader counts.
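If it helps, here's that doubling spelled out as a toy Python sketch. It assumes a clean, exact two-year doubling period, which real chips only roughly follow:

```python
# Idealised Moore's Law projection: double the core count every two years.
# Assumption: a doubling period of exactly 2 years, which is an approximation.
def projected_cores(start_cores, start_year, target_year, doubling_years=2):
    doublings = (target_year - start_year) // doubling_years
    return start_cores * 2 ** doublings

for year in (2006, 2008, 2010):
    print(year, projected_cores(2, 2006, year))
# 2006 2
# 2008 4
# 2010 8
```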
So, if a new console were made today, it could have about eight times the performance of the X360 or PS3 (both CPU and GPU) for the same power consumption. If companies were willing to let a console pull 300W, then it would be faster still, like a desktop PC.
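That "about eight times" figure is just the same doubling applied to how old the consoles are. Rough sketch, assuming "today" means roughly 2012, i.e. about six years after launch:

```python
# Assumption: launch in 2006 (PS3; the X360 was late 2005) and "today" = 2012.
launch_year = 2006
today = 2012
doublings = (today - launch_year) // 2   # three two-year doubling periods
print(2 ** doublings)                    # -> 8, the "about eight times" figure
```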
If you want any of that further explained, just ask; there's a lot to say about CPUs and GPUs in particular.