disolitude said:
No it wasn't. The first true 64 bit processors were PC processor 3-4 years back. All this other mumbo jumbo with consoles being 64, 128 bits is not true. All these consoles were 32 bit processors with 2 or 4 x Floating Point Bus. 4 x 32 = 128...do the math :) PS2, dreamcast, xbox were all 32 bit processors... http://www.segatech.com/technical/cpu/index.html Read the very last paragraph on the page.
The Jaguar did have more than one 64-bit processor in it:
http://en.wikipedia.org/wiki/Atari_Jaguar
"Tom" Chip, 26.59 MHz
- Graphics processing unit (GPU) – 32-bit RISC architecture, 4 KB internal cache, provides wide array of graphic effects
- Object Processor – 64-bit RISC architecture; programmable; can behave as a variety of graphic architectures
- Blitter – 64-bit RISC architecture; high speed logic operations, z-buffering and Gouraud shading, with 64-bit internal registers.
- DRAM controller, 32-bit memory management
Anyhow, just as with the Intellivision, the number of bits isn't the key to power. Programmers mostly coded to the Jaguar's 16-bit 68000 chip anyway. And yes, the Intellivision was 16-bit:
http://en.wikipedia.org/wiki/Intellivision
Intellivision was the first 16-bit game console, though some people have mistakenly referred to it as a 10-bit system because the CPU's instruction set and game cartridges are 10 bits wide. The registers in the microprocessor, where the mathematical logic is processed, were 16 bits wide.
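That distinction is worth spelling out: it's the register width, not the instruction or cartridge width, that sets the range of values the CPU can actually do math on. A quick illustrative sketch (Python, just simulating a 16-bit register like the Intellivision's CP1610 has):

```python
# Register width determines the arithmetic range, regardless of how
# wide the instruction encoding is. Simulate a 16-bit register:
REG_MASK = 0xFFFF  # 16-bit register, as in the Intellivision's CP1610

def add16(a, b):
    """Add two values the way a 16-bit register would, wrapping on overflow."""
    return (a + b) & REG_MASK

print(add16(0xFFFF, 1))     # wraps around to 0
print(add16(30000, 10000))  # 40000 still fits in 16 bits
```

A 10-bit instruction encoding says nothing about this range, which is why calling the Intellivision a "10-bit system" misses the point.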
Also, the TurboGrafx-16 (PC Engine) has a 16-bit graphics processor but an 8-bit CPU:
http://en.wikipedia.org/wiki/Turbografx_16
The TurboGrafx-16 has an 8-bit CPU and a dual 16-bit GPU capable of displaying 32 sets of 15 colors at once out of 512.
Anyhow, the number of bits doesn't say much on its own.
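The "4 x 32 = 128" marketing math from the quote is easy to demonstrate: a bus that moves four 32-bit values at once is carrying 128 bits of data, but each lane is still 32-bit arithmetic. An illustrative Python sketch:

```python
import struct

# Four 32-bit floats packed together: 128 bits of data in flight,
# but every individual value is still a 32-bit quantity.
lanes = [1.5, 2.5, 3.5, 4.5]
packed = struct.pack('<4f', *lanes)      # 4 x 32 bits = 16 bytes
print(len(packed) * 8)                   # 128
print(struct.unpack('<4f', packed))      # four separate 32-bit values
```

So "128-bit console" can mean nothing more than a wide data path bolted onto a 32-bit processor, which is exactly the point the quoted post was making about the PS2, Dreamcast, and Xbox.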