ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
The specs are incomplete, and some of them are assumptions rather than fact.
|
Did you read the article? Most of it is cold hard facts. At first some people said Eurogamer rushed the article, but it's going on two days now and nobody has corrected anything; even NeoGAF is accepting the 352 GFLOPS number now. Here's a quote I found interesting in the comment section (plus some quick math on that 352 figure, after the quote):
Am I the only one being shocked by the fact that the Wii U CPU is basically the same type that I had in my 1998 PowerMac G3?! Am I correct here, it's basically a triple-core overclocked PowerPC 750?! ROFLMAO! No wonder the poor ports and performance; the GPU on that thing is no powerhouse by any means, but that ancient CPU architecture is choking it blue. A 3DFX Voodoo2 would have been a better match. They'd be better off if they'd stuck a current low-end, dual-core Celeron in there instead, but I guess they got a ridiculously good deal on tons of left-over GC/Wii CPUs lying around the manufacturer's warehouse. Facepalm Nintendo, facepalm...
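For what it's worth, here's the quick math on where a figure like that usually comes from: peak GPU throughput is typically shader count x 2 FLOPs per clock (one multiply-add counts as two operations) x clock speed. The 320-shader / 550 MHz inputs below are my own guess at what's behind the number, just to show the derivation:

    320 shaders x 2 FLOPs per clock x 550,000,000 clocks/s
    = 352,000,000,000 FLOPs/s
    = 352 GFLOPS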
|
I read the article; there are plenty of unknown factors and guesstimates in it. And I'm not sure how a troll post helps your point?
|
I don't see how it's a troll post; it actually makes sense of why they put such a weak CPU in the machine. Anyway, I believe DF; they have an amazing track record, there's no reason to doubt them, and nobody has proved them wrong.
|
"I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm..." That's trolling. People who drop those kind of lines strike me as people who are not going to give an objective perspective on Nintendo hardware. Not to mention, it hasn't been proven that the Wii U CPU is a "triple core overclocked PowerPC 750", in fact it's most likely not. It might be based on it for backwards compatibility purposes, but newer chips are often "based" on older ones because they're a direct descendent, just with improvements that come over time like large/faster caches, more cores, etc.
Nobody has proven Digital Foundry's analysis right either. What will prove it one way or the other is the games that come out once developers get a handle on the system's innards.