bdbdbd said:
On consoles the performance you get for the money is on a whole different level, but on PC you can get more power if you're willing to pay for it. I think an editor at a PC magazine who commented on the Xbox 360 launch back in the day nailed it: he said he didn't understand the fuss about the 360 being able to draw hundreds of characters on screen that all look the same, when a PC could draw hundreds of characters that all look different, though a GPU capable of doing that cost as much as a 360. What I really love about console hardware are the technical tweaks used to boost on-screen performance: the PS2 had insane VRAM bandwidth, the GameCube CPU used its L2 cache as a buffer to eliminate empty clock cycles, the Dreamcast didn't draw off-screen (or occluded) polygons, the 360 CPU was designed for low internal latencies, the Megadrive's DMA controller was interesting enough to get its own marketing term, and the SNES had a number of cheap special-purpose processors to boost the performance of the weak base hardware, to name a few.
Yeah, the Dreamcast used PowerVR tech that also made it to PC graphics cards as the Kyro. Don't know if you remember them, but a Kyro without hardware T&L could be faster than a GeForce. The Kyro was PowerVR Series 3 though, while the Dreamcast chip was Series 2.
The thing about the PC is that as a gaming platform it needed about 15 years to really get on top. It had its pros, like already decent 3D performance, at least once an FPU was standard. But back then it was expensive and became outdated quickly. That has basically changed within the last ten years: you can still play every game with a $100/€100 graphics card from 2010/11.
On the other hand, console hardware is about maxing out what you have. Always has been. I remember seeing screenshots, not even videos, of Donkey Kong Country and at first thinking it was a next-gen game, like many did back then. At that time I was already playing on PC as well.
Now I've come to a point where watching what AMD or Nvidia are doing is still interesting, but I just don't care anymore about the highest resolution or ultra-high quality settings.