kaneada said:
I'm glad I read that all the way through, because my immediate knee-jerk reaction was to flame you for your lack of understanding of what a 64-bit bus actually is. My second reaction was one of irony, because something reeked of satire. Hats off to you, because you could be a fanboy in a cheap Hollywood production.
Atari's math was off. Two 32-bit processors (named Tom and Jerry) do not equal a 64-bit bus. Second, the bus-width preaching of the old gaming days was nothing but a gimmick. Why didn't I buy one? Because my parents thought my SNES was sufficient and didn't think I needed another gaming console. Being 10 or 12 at the time, I wanted every video game system out there, because I played video games then like I drink now. ALL THE TIME!
We are at a place where people no longer discuss the number of bits, because it flat out doesn't matter to sales. Regarding the Jaguar, here is Wikipedia on it:
http://en.wikipedia.org/wiki/Atari_Jaguar
Flare II initially set to work designing two consoles for Atari Corp. One was a 32-bit architecture (codenamed "Panther"), and the other was a 64-bit system (codenamed "Jaguar"); however, work on the Jaguar design progressed faster than expected, and Atari Corp. canceled the Panther project to focus on the more promising 64-bit technology.
In a last-ditch effort to rescue the Jaguar, Atari Corp. tried to play down the other two consoles by proclaiming the Jaguar was the only "64-bit" system. This claim is questioned by some[9], because the CPU (68000) and GPU executed a 32-bit instruction set, but sent control signals to the 64-bit graphics co-processors (or "graphics accelerators"). Atari Corp.'s position was that the mere presence of 64-bit ALUs for graphics was sufficient to validate the claim. Design specs for the console allude to the GPU or DSP being capable of acting as a CPU, leaving the Motorola 68000 to read controller inputs. In practice, however, many developers used the Motorola 68000 to drive gameplay logic.
Processors
- "Tom" Chip, 26.59 MHz
- Graphics processing unit (GPU) – 32-bit RISC architecture, 4 KB internal cache, provides wide array of graphic effects
- Object Processor – 64-bit RISC architecture; programmable; can behave as a variety of graphic architectures
- Blitter – 64-bit RISC architecture; high-speed logic operations, z-buffering and Gouraud shading, with 64-bit internal registers
- DRAM controller, 32-bit memory management
Again, it was 64-bit the way the PC-Engine was 16-bit. It had a 64-bit processor in it and a 64-bit bus. Of course, most developers coded to the 16-bit processor in it, so it was moot. The number of bits doesn't mean much either. The Intellivision was a 16-bit system, but compare the 8-bit NES to it. The NES blows it away.
I understand what is going on here. The real takeaway is that screaming about specs is a joke when it comes to games.