The_Liquid_Laser said:
Conina said:

No, it didn't. They have processors from the same CPU family, both based on the 6502... not the same CPU!

The Atari 2600 had a dumbed-down 6507, clocked at 1.19 MHz, with very limited memory access: the chip exposes only 13 address lines, so it can address just 8 KB, and the cartridge slot limited that further to 4 KB of addressable ROM.

The NES had a Ricoh 2A03, clocked at 1.79 MHz.

You also can't just ignore the different co-processors and suggest that both devices had the same limitations and that Nintendo was simply better at optimization.

Thank you so much for helping me with my argument, which is that the NES hardware was considered weak when it was released. In Japan it was released in 1983, while the Atari 2600 was released in the US in 1977. As your post points out, the NES processor was only 50% more powerful. But according to Moore's Law it should have been about 1600% more powerful after 6 years (four doublings at roughly 18 months each, i.e. about 16x).
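Just to make that arithmetic explicit, here is a rough back-of-the-envelope sketch in Python. The 18-month doubling period is my own assumption (Moore's Law is usually quoted at 18-24 months), and the clock figures are the ones quoted above; clock rate is only a crude stand-in for "power" here.

# Back-of-the-envelope check of the Moore's Law argument above.
# Assumption (not from the thread): capability doubles every 18 months.
atari_clock_mhz = 1.19   # Atari 2600 (6507), per the post above
nes_clock_mhz   = 1.79   # NES (Ricoh 2A03), per the post above
years_between   = 6      # 1977 Atari 2600 launch vs. 1983 Famicom launch

doublings = years_between * 12 / 18              # 4 doublings in 6 years
expected_factor = 2 ** doublings                 # ~16x, i.e. the "1600%" figure
actual_factor = nes_clock_mhz / atari_clock_mhz  # ~1.5x, i.e. "50% more"

print(f"Expected by Moore's Law: ~{expected_factor:.0f}x")
print(f"Actual clock ratio:      ~{actual_factor:.2f}x")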

Moore's law ain't about clock rates and is only indirectly about performance... but I know what you mean.

Of course there were "next-gen" CPUs already available; e.g. the Motorola 68000 was released in 1979. But it took years to get these technologies into consumer products back then, especially into home consoles, which were considered toys.

The 68000 was used almost exclusively in expensive workstations for the first few years (Sun, SGI, HP, Apollo), then it trickled down into cheaper products: the $2500 Macintosh in 1984, then the $1300 Amiga 1000 and the $800 Atari ST in 1985, then the $700 Amiga 500 in 1987, and finally the $189 SEGA Genesis in 1988/89. So it took almost a decade to reach consumer products below $500.

Nobody would have expected such a CPU in a $179 console in 1983, and Nintendo would have been crazy to plan for one in 1980-1982.

So what options other than a modified 6502 did they have in the planning phase of the NES? The Z80 and the Intel 8085 weren't that much faster (and cost a lot more).