S.T.A.G.E. said:
RazorDragon said:
S.T.A.G.E. said:

I was right when I said 8GB of RAM for next-gen consoles, and I'll be right again. I can't wait to win this, and the major difference between our arguments is that I actually listen to what devs say. Now I can stop talking.

My stipulation is that the loser must say in their sig: "I lost a bet that the Wii U would/wouldn't get third-party support. So-and-so was correct." Deal?

 

 

PS4's GPU has 1152 compute cores compared to 320 cores in the Wii U (PS4 = 18 CUs, 1 CU = 64 cores, so 18 × 64 = 1152)

but when you factor in:

-almost 4 times the physical number of cores (see the quick ratio check after this list)

-roughly fourteen times (14×) the memory bandwidth (176 GB/s GDDR5 vs. 12.8 GB/s DDR3)

-7 times the actual physical memory available to games (7GB vs. 1GB)

-the fact that those 1152 PS4 cores are likely GCN2 cores, while the Wii U's 320 are based on a core architecture from 2008,

-cores clocked faster (800MHz vs. 550MHz) EDIT: Fixed.

-Oh, and the Wii U's CPU cores are the same architecture as the Wii's, which in turn was the same as the GameCube's. The Wii U directly uses the same CPU cores as the Wii and GameCube, just more of them, clocked higher. This architecture was created in 1999.
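As a quick sanity check on those ratios, here's a back-of-the-envelope sketch in Python. The Wii U figures are the commonly cited unofficial estimates from die analysis, not official numbers, so treat the results as approximate:

# Back-of-the-envelope comparison of commonly cited PS4 vs. Wii U specs.
# Wii U figures are unofficial estimates, so the ratios are approximate.

ps4_cores = 18 * 64        # 18 CUs x 64 shader cores per CU = 1152
wiiu_cores = 320

ps4_bandwidth = 176.0      # GB/s, GDDR5
wiiu_bandwidth = 12.8      # GB/s, DDR3 (excluding the 32MB eDRAM)

ps4_game_ram = 7.0         # GB reportedly available to games
wiiu_game_ram = 1.0        # GB available to games

print(f"cores:     {ps4_cores / wiiu_cores:.1f}x")          # ~3.6x
print(f"bandwidth: {ps4_bandwidth / wiiu_bandwidth:.1f}x")  # ~13.8x
print(f"game RAM:  {ps4_game_ram / wiiu_game_ram:.0f}x")    # 7x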

 


Current top PC games like Crysis 3 can scale from as low as a GT 520 and a Pentium Dual-Core up to a GTX 680 and an i7 2600K to run at maximum settings.

The GTX 680 has 1536 shader units; the GT 520 has 48.

Bandwidth-wise, the difference between a GTX 680 (192.3 GB/s) and a GT 520 (14.4 GB/s) is similar to the Wii U/PS4 difference. And that's not counting the Wii U's 32MB of eDRAM, which should greatly improve effective bandwidth, judging by Wii U games looking better than their PS3/360 versions despite the main memory bandwidth seemingly being slower than both.
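To make that comparison concrete, here's a quick sketch putting the two gaps side by side, using the published spec-sheet numbers (Wii U bandwidth is the DDR3 figure and ignores the eDRAM):

# Compare the high-end/low-end gap Crysis 3 spans on PC with the
# PS4/Wii U gap. Spec-sheet values only; approximations throughout.

gtx680 = {"shaders": 1536, "bandwidth": 192.3}  # bandwidth in GB/s
gt520  = {"shaders": 48,   "bandwidth": 14.4}

ps4  = {"shaders": 1152, "bandwidth": 176.0}
wiiu = {"shaders": 320,  "bandwidth": 12.8}

for key in ("shaders", "bandwidth"):
    pc_gap      = gtx680[key] / gt520[key]
    console_gap = ps4[key] / wiiu[key]
    print(f"{key}: PC gap {pc_gap:.1f}x vs. console gap {console_gap:.1f}x")

# shaders:   PC gap 32.0x vs. console gap 3.6x
# bandwidth: PC gap 13.4x vs. console gap 13.8x

The bandwidth gaps really are nearly identical, while the shader gap on PC is almost ten times wider than the console one.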

RAM isn't the deciding factor. The Wii U won't be able to use textures or shadows at resolutions as high as the PS4's, so its RAM requirement will be much lower. Also, expect no PS4 game to use more than 4GB for graphics, as its GPU isn't high-end enough to push texture and shadow resolutions that would fill 7GB during real-time gameplay. Expect the extra RAM in the PS4 to be used to reduce loading times and pop-in instead.
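As a rough illustration of why texture resolution drives the RAM requirement, here's a hypothetical uncompressed-RGBA estimate. Real engines use compressed formats (DXT/BC and the like), which cut these figures by 4-8x, so the absolute numbers are illustrative only:

# Rough memory cost of an uncompressed RGBA8 texture with a full mip chain.
# Purely illustrative: real games use texture compression.

def texture_mb(size, bytes_per_pixel=4, mipmaps=True):
    base = size * size * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # mip chain adds ~1/3
    return total / (1024 * 1024)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mb(size):.0f} MB")

# 1024x1024: 5 MB,  2048x2048: 21 MB,  4096x4096: 85 MB

Halving the texture resolution quarters the memory cost, which is why a console targeting lower-resolution assets needs far less RAM for the same scene.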

Core clock comparisons are irrelevant because the architectures are different.

The Wii U's CPU isn't an overclocked Broadway core; it's a new tri-core CPU based on the PPC 750 architecture. The efficiency of the Wii U's CPU alone shows that it can't be three Wii CPUs overclocked and duct-taped together.


The Wii U is not impressive at all specs-wise... stop trying to sell it. It's last-gen tech with next-gen ideals, so let's see Nintendo prove that to us, shall we?


I never said it was impressive, nor did I try to sell it. Also, your definition of last-gen tech is incorrect. If it really were last-gen tech, it would be drawing more than the ~70W that current PS3 and Xbox 360 models draw. Instead, it uses only 32W under load and offers better performance than both of those consoles while being made on the same 40/45nm process. Current-gen tech has better efficiency thanks to newer architectures.

It's the same as comparing new Atom processors to old Core 2 Duos: the Atoms offer little performance improvement but consume ~70% less power, so calling Atom last-gen tech based on raw performance alone would be just as wrong. Better energy efficiency is one of the improvements that current- and next-gen technology offers over last-gen tech; it's not just about increasing raw performance.
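A crude way to frame that efficiency argument in performance per watt. The "perf" scores here are made-up normalized values (Wii U slightly ahead of PS3/360, per the argument above); only the wattage figures are real measurements:

# Illustrative perf-per-watt comparison. "perf" is a hypothetical
# normalized score; the wattages are the measured load figures.

systems = {
    "Xbox 360 (45nm)": {"perf": 1.00, "watts": 70},
    "PS3 (45nm)":      {"perf": 1.00, "watts": 70},
    "Wii U (45/40nm)": {"perf": 1.15, "watts": 32},
}

for name, s in systems.items():
    print(f"{name}: {s['perf'] / s['watts'] * 100:.1f} perf/W (x100)")

# Under these assumptions the Wii U delivers roughly 2.5x the performance
# per watt of the last-gen consoles on the same process node, which is
# what a newer architecture, not last-gen tech, looks like.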

Anyway, I was just showing that your point doesn't tell the whole truth. Current top PC games span a bigger gap between minimum and maximum hardware requirements than the gap between the Wii U and the PS4.