Xen said: No, the Wii can't do the same - it's weaker than the original XB... have you seen Far Cry on the box, by the way?
|
Seriously, when does ignorance end? I'm sure it's been posted and discussed so many times. Are you that new to the system data? This is integral to the GC being more of a raw-power processor than the Xbox; check out RISC. Coupled with the fact that the GC was built to have no bottlenecks, it could pump out approximately 2.5 times the number-crunching ability. It's just that A) MS shaders are easier to use than TEV, and B) most developers don't know what the hell TEV is.
As for the Wii, it's built upon the GC with increased specs. It's far more powerful. It just suffers from the same lack of knowledge of TEV and of what the machine can do. The benefit the GC and Wii had over last gen is that they're RISC and have no bottlenecks. I believe the 360 now also uses RISC, since they switched to a PPC (same as GC/Wii) and have put effort into removing bottlenecks. The PS3 still has bottlenecks; it just has massive raw computing power. There's so much efficiency the GC/Wii have over the last gen that it's not funny. I just listed RISC and the catch-all.
Wii
Processors:
- CPU: PowerPC-based "Broadway" processor, made with a 90 nm SOI CMOS process, reportedly† clocked at 729 MHz[76]
- GPU: ATI "Hollywood" GPU made with a 90 nm CMOS process,[77] reportedly† clocked at 243 MHz[76]
Memory:
- 88 MB main memory (24 MB "internal" 1T-SRAM integrated into graphics package, 64 MB "external" GDDR3[78] SDRAM)
- 3 MB embedded GPU texture memory and framebuffer.
XBox
- CPU: 32-bit 733 MHz Custom Intel Coppermine-based processor in a Micro-PGA2 package. 180 nm process.[12]
- SSE floating point SIMD. 4 single-precision floating point numbers per clock cycle.
- MMX integer SIMD.
- 133 MHz 64-bit GTL+ front side bus to GPU.
- 32 KB L1 cache. 128 KB on-die L2 "Advanced Transfer Cache".
- Shared memory subsystem
- GPU and system chipset: 233 MHz "NV2A" ASIC. Co-developed by Microsoft and NVIDIA.
- Geometry engine: 115 million vertices/second, 125 million particles/second (peak)
- 4 pixel pipelines with 2 texture units each
- 932 megapixels/second (233 MHz x 4 pipelines), 1,864 megatexels/second (932 MP x 2 texture units) (peak)
- Peak triangle performance (pixel fillrate divided by 32 pixels per triangle): 29,125,000 32-pixel triangles/sec, raw or with 2 textures and lit
- 485,416 triangles per frame at 60fps
- 970,833 triangles per frame at 30fps
- 4 textures per pass, texture compression, full scene anti-aliasing (NV Quincunx, supersampling, multisampling)
- Bilinear, trilinear, and anisotropic texture filtering
- Similar to the GeForce 3 and GeForce 4 PC GPUs.
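The peak numbers in that Xbox list are just straight arithmetic from the quoted clock and pipeline counts. A quick sketch of how they fall out (assuming the 233 MHz core clock, 4 pixel pipelines, 2 texture units per pipeline, and an average 32-pixel triangle, as stated above):

```python
# Peak-throughput arithmetic behind the quoted Xbox NV2A figures.
clock_hz = 233_000_000          # 233 MHz core clock
pipelines = 4                   # pixel pipelines
tex_units_per_pipe = 2          # texture units per pipeline
pixels_per_triangle = 32        # assumed average triangle size

pixel_fill = clock_hz * pipelines             # 932,000,000 pixels/s
texel_fill = pixel_fill * tex_units_per_pipe  # 1,864,000,000 texels/s
tris_per_sec = pixel_fill // pixels_per_triangle  # 29,125,000 triangles/s

print(f"{pixel_fill // 1_000_000} Mpixels/s, {texel_fill // 1_000_000} Mtexels/s")
print(f"{tris_per_sec:,} 32-pixel triangles/s")
print(f"{tris_per_sec // 60:,} triangles/frame at 60 fps")  # 485,416
print(f"{tris_per_sec // 30:,} triangles/frame at 30 fps")  # 970,833
```

Of course these are theoretical peaks; real scenes never hit them.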
Anyways, I'm not a big fan of id games. I'm just not a big FPS fan at all.
Squilliam: On Vgcharts it's a commonly accepted practice to twist the bounds of plausibility in order to support your argument or agenda, so I think it's pretty cool that this gives me the precedent to say whatever I damn well please.