richardhutnik said:
kaneada said:
 

I'm glad I read that all the way through, because my immediate knee-jerk reaction was to flame you for your lack of understanding of what a 64-bit bus actually is. The second reaction was one of irony, because something reeked of satire. Hats off to you, because you could be a fanboy in a cheap Hollywood production.

Atari's math was off. Two 32-bit processors (named Tom and Jerry) do not equal a 64-bit bus. Second, the bus-width preaching of the old gaming days was nothing but a gimmick. Why didn't I buy one? Because my parents thought my SNES was sufficient and didn't think I needed another gaming console. Being 10 or 12 at the time, I wanted every video game system out there, because I played video games then like I drink now. ALL THE TIME!

We are at a place where people no longer discuss the number of bits, because it flat out doesn't matter to sales. In regard to the Jaguar, here is Wikipedia on it:

http://en.wikipedia.org/wiki/Atari_Jaguar

Flare II initially set to work designing two consoles for Atari Corp. One was a 32-bit architecture (codenamed "Panther"), and the other was a 64-bit system (codenamed "Jaguar"); however, work on the Jaguar design progressed faster than expected, and Atari Corp. canceled the Panther project to focus on the more promising 64-bit technology.

In a last ditch effort to rescue the Jaguar, Atari Corp. tried to play down the other two consoles by proclaiming the Jaguar was the only "64-bit" system. This claim is questioned by some[9], because the CPU (68000) and GPU executed a 32-bit instruction-set, but sent control signals to the 64-bit graphics co-processors (or "graphics accelerators"). Atari Corp.'s position was that the mere presence of 64-bit ALUs for graphics was sufficient to validate the claim. Design specs for the console allude to the GPU or DSP being capable of acting as a CPU, leaving the Motorola 68000 to read controller inputs. In practice, however, many developers used the Motorola 68000 to drive gameplay logic.

Processors

  • "Tom" Chip, 26.59 MHz
    • Graphics processing unit (GPU) – 32-bit RISC architecture, 4 KB internal cache, provides wide array of graphic effects
    • Object Processor – 64-bit RISC architecture; programmable; can behave as a variety of graphic architectures
    • Blitter – 64-bit RISC architecture; high speed logic operations, z-buffering and Gouraud shading, with 64-bit internal registers.
    • DRAM controller, 32-bit memory management

Again, it was 64-bit the way the PC-Engine was 16-bit. It had a 64-bit processor in it and a 64-bit bus. Of course, most developers coded to the 16-bit processor in it, so it was moot. The number of bits doesn't mean much either. The Intellivision was a 16-bit system, but compare the 8-bit NES to it. The NES blows it away.

 

I understand what is going on here. The real takeaway is that screaming specs is a joke when it comes to games.

That's where I pulled my info. However, when you think about it, that was kind of a bottleneck in performance. If your primary processor can only send 32-bit instructions, having 64-bit accelerators doesn't matter. Ultimately the system was 32-bit, by my understanding of this document. The only real advantage you have there is parallelism, as theoretically you could send two 32-bit instructions on that bus, but ultimately that was a weak solution.
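
To make that parallelism point concrete, here's a minimal C sketch (my own illustration, not actual Jaguar code) of the one trick a wide bus buys you: a single 64-bit transfer can carry two independent 32-bit words, but each word is still only 32 bits to whatever consumes it.

    #include <stdint.h>
    #include <stdio.h>

    /* Pack two independent 32-bit words into one 64-bit bus transfer. */
    static uint64_t pack_bus_word(uint32_t hi, uint32_t lo)
    {
        return ((uint64_t)hi << 32) | lo;
    }

    /* Unpack on the receiving end: each half is still only 32 bits wide. */
    static void unpack_bus_word(uint64_t bus, uint32_t *hi, uint32_t *lo)
    {
        *hi = (uint32_t)(bus >> 32);
        *lo = (uint32_t)bus;
    }

    int main(void)
    {
        uint32_t a = 0xDEADBEEF, b = 0xCAFEBABE;
        uint64_t bus = pack_bus_word(a, b);   /* one 64-bit transfer... */

        uint32_t out_a, out_b;
        unpack_bus_word(bus, &out_a, &out_b); /* ...two 32-bit payloads */
        printf("%08X %08X\n", out_a, out_b);
        return 0;
    }

You move two words per transfer, but nothing downstream ever operates on more than 32 bits at once, which is why I call it a weak solution.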

Whether the system was 64-bit or not is kind of a joke of a question. If the mere presence of 64-bit anything on the board made it a 64-bit system, then I could just as easily argue that it is 32-bit due to the main CPU. That would be like saying the Genesis was only 8-bit because it used a Z80 for sound processing and backward compatibility. The point is that neither statement faithfully represents the system's actual capability (i.e. the Genesis used a Motorola 68000 for its primary processor, which is 16-bit).
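
To show why a helper chip's width doesn't define the system, here's a hypothetical C sketch of that Genesis-style split (not actual Genesis code, just the pattern): the wide main CPU runs the game and only drops small commands into a mailbox for the narrow sound processor.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical mailbox between a 16-bit "main" CPU and an 8-bit
       "sound" helper, in the spirit of the Genesis 68000/Z80 split. */
    static volatile uint8_t sound_mailbox = 0;

    /* Main CPU side: gameplay logic queues a one-byte sound command. */
    static void main_cpu_play_sfx(uint8_t sfx_id)
    {
        sound_mailbox = sfx_id; /* 8 bits is plenty for this job */
    }

    /* Helper side: poll the mailbox and "play" whatever is queued. */
    static void sound_cpu_poll(void)
    {
        if (sound_mailbox != 0) {
            printf("sound chip: playing sfx %u\n", (unsigned)sound_mailbox);
            sound_mailbox = 0;
        }
    }

    int main(void)
    {
        main_cpu_play_sfx(7); /* the wide CPU drives the game */
        sound_cpu_poll();     /* the narrow helper only does audio */
        return 0;
    }

The helper never sees anything wider than a byte, and nobody would call the system 8-bit because of it.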

Lastly, I did read all the way through this, so I know it was intended to be satirical. Therefore I wasn't actually flaming you for any lack of understanding, and I do think I mentioned that the system's specs were used as a gimmick, so I got your point.

-- Nothing is nicer than seeing your PS3 on an HDTV through an HDMI cable for the first time.