| fatslob-:O said: The Wii U's CPU isn't stronger at all! Its estimated performance comes in at around 15 GFLOPS and I'm willing to bet its integer performance isn't too hot either.
Core for core their features are similar, but performance is a different story altogether, to the point where the IBM Espresso doesn't come anywhere near the Jaguar cores.
The only benefit of the Wii U accessing eDRAM is increased CPU performance, that's it. Btw the Wii U doesn't even have GCN cores to begin with, so it can't support the hUMA model!
Actually, this is where ninjablade might be right. Somebody here named nylevia basically told us that the Wii U has around 150 to 200 shaders. Why do you find it silly that the Wii U only has 320 shaders (maybe even less than we suggest, because nylevia basically confirmed that it doesn't have over 200 shaders)? It's not that surprising, because COD Ghosts is still running at a sub-HD resolution, which only further proves ninjablade's point.
Look here, the Wii U's GPU is done on a 40nm process node, which is ancient by today's standards, but to suggest that it has more shaders while COD Ghosts is running at sub-HD resolutions is asinine. The Wii U has current-gen performance, deal with it!
|
Wow... I caught the attention of a fanboy... LMFAO
Wii U's CPU is stronger than Xbox 360's Xenon and PlayStation 3's CELL, and is that so hard to believe? We are talking strictly about CPU tasks, where both Xenon and CELL are failures, and Anand himself said it would have been better to put in a dual core from AMD or Intel than the mess that is in Xbox 360 and PlayStation 3.
15 GFLOPS assumes all cores have 256KB of L2 cache each; it would be 17.5 GFLOPS total since Core 0 has 512KB, Core 1 has 2MB and Core 2 has 512KB of L2 cache. Yet this is all mere speculation, since we don't know the effect of using eDRAM as L2 cache, nor whether those cores and that architecture were modified/customized at all. So we can only speculate, since Nintendo/IBM did not disclose anything about Espresso except that it is based on IBM's designs/architecture.
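Just to show where that ~15 GFLOPS ballpark usually comes from, here is a quick back-of-the-envelope sketch, assuming the commonly cited (not officially confirmed) figures of 3 cores at ~1.24 GHz with a paired-single FPU doing 4 flops per cycle:

```python
# Back-of-the-envelope peak-FLOPS estimate for Espresso.
# All numbers are assumptions/common estimates, not official Nintendo/IBM specs.
cores = 3
clock_hz = 1.24e9        # commonly cited clock, unconfirmed
flops_per_cycle = 4      # paired-single FPU: 2-wide multiply-add = 4 flops/cycle

peak_gflops = cores * clock_hz * flops_per_cycle / 1e9
print(f"Estimated peak: {peak_gflops:.1f} GFLOPS")  # ~14.9 GFLOPS
```

How much of that peak real code actually reaches is where the caches and the eDRAM come into play.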
Did you know that Jaguar IPC is mostly below Core 2 Duo IPC? Probably not... :3
So, another benefit over Xbox 360 and PlayStation 3: direct communication between CPU and GPU, which is practically the basis of hUMA, and your reaction is hilarious. I know that Wii U does not have Graphics Core Next (which, according to the agreement with Intel, can only be used with x86 architecture), so it can not use the hUMA model, but I did not say it has hUMA. I said it would have a similar effect to hUMA, that is, direct communication between CPU and GPU, which is primarily what hUMA achieves. So it is not hUMA, yet it does one (or more) of the things that hUMA does...
So you believe somebody who may actually not be right at all, who might be FOS and BSing you, and who could also be a fanboy spreading FUD! So he "confirmed" it, yet nobody on NeoGAF, Beyond3D or any other forum knows what they are looking at, and you only believe his information because you have a bias and/or an interest, and any information that does not fit your agenda is automatically denied.
I bolded the part where you failed hard and exposed yourself and your lack of knowledge!
Call of Duty: Black Ops 2 is a cheap port of the Xbox 360 version and it ran fine after a patch. Call of Duty: Ghosts is another cheap port of the Xbox 360 version, so you can not expect a higher resolution, just some minor details looking better, plus the teams doing those ports are just a handful of people. A cheap port running poorly on Wii U does not support his point/claim/"confirmation" at all, and you also ignore other games...
Need for Speed: Most Wanted on Wii U ran practically rock solid at 30 fps, has better lighting and some higher-quality details than the Xbox 360/PlayStation 3 versions.
Batman: Arkham Origins runs better on Wii U, with fewer bugs and glitches and just a few framerate drops in some areas, compared to Xbox 360/PlayStation 3.
Deus Ex: Human Revolution Director's Cut is better all around, while the Xbox 360/PlayStation 3 versions have framerate issues, even more so when using SmartGlass/Vita as a secondary screen, and then there is no AA of any kind.
All three examples above say otherwise, so how in the hell does a GPU with just 160/200 shaders at 550MHz, i.e. 176-220 GFLOPS, beat a GPU that is 240 GFLOPS? Look, you need to set aside your bias if you consider yourself professional and smart, because your reaction points to the opposite of that in a nanosecond.
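For reference, those GFLOPS figures are just shader count × 2 ops per cycle (a multiply-add) × clock; here is a quick sketch using the numbers being argued about (the shader counts are speculation, not confirmed specs):

```python
# GFLOPS = shader ALUs * 2 ops per cycle (multiply-add) * clock in GHz.
# The shader counts here are the rumoured figures under discussion, not confirmed.
clock_ghz = 0.55
for shaders in (160, 200, 320):
    print(f"{shaders} shaders -> {shaders * 2 * clock_ghz:.0f} GFLOPS")
# 160 -> 176, 200 -> 220, 320 -> 352; Xbox 360's Xenos is usually quoted at ~240.
```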
40nm is old, but it is not ancient like you claim, and it is still widely used in the industry; if it were ancient it would not be used at all, to any degree. >_>
You think Wii U has current-generation performance? Wow... Such a golden failure of a statement... Let's make a proper comparison.
PlayStation All-Stars Battle Royale, a Super Smash Bros clone (or at least inspired by it), is 720p/60fps.
Super Smash Bros 4 / for Wii U is 1080p/60fps, and that is confirmed by Nintendo.
Also, why does Bayonetta 2 look way better than Bayonetta, if Wii U has current-generation performance? And how the hell is X running on Wii U, how the hell is Mario Kart 8 running on Wii U, and why do they look so good?
Shin'en confirmed that the vehicle on the site is rendered in-game (read the replies):
We are happy to announce our next Wii U game "FAST Racing Neo" http://t.co/9MzGWkUil1
— Shin'en Multimedia (@ShinenGames) October 29, 2013
Also, explain to me why CryEngine 3.5/4 supports Wii U if it only has current-generation performance? It does not support Xbox 360/PlayStation 3:
http://www.crytek.com/news/crytek-announces-the-arrival-of-the-new-cryengine
A Radeon HD 6650M should fit in the Wii U; you will say it is not possible, yet it is possible. First of all, AMD produces their GPUs on a relatively cheap process/silicon, while Nintendo is producing their GPU "Latte" on a TSMC CMOS process/silicon that is more expensive and higher quality, so it should allow somewhat higher density, right? Also, the Radeon HD 6000 series was released in 2010, so I doubt that AMD did not find a way to increase density and possibly save some space in two years, and the 6000 series is used in the Trinity and Richland APUs, which are 32nm and may use an improved manufacturing technique.
Nintendo most likely customized the chip, and we know that they don't waste any kind of silicon; they are cheap asses when it comes to silicon and will try to get the most out of it for less and save a couple of cents if possible, while using a higher-quality process/silicon to potentially reduce power consumption and increase the efficiency of their hardware. Also, don't say it is the 4000 series, because the 5000 series is basically just the 4000 series at 40nm with DirectX 11 support implemented, and the 6000 series is basically an improved 5000 series with further changes.
http://www.techpowerup.com/gpudb/326/radeon-hd-6650m.html
http://www.notebookcheck.net/AMD-Radeon-HD-6650M.43962.0.html
If it were just 160 to 200 shaders, then Nintendo would not have put in 32MB of eDRAM at all, because it would be a waste of die space and money; we are talking about Nintendo here, not Microsoft or Sony! Nintendo saw the design mistakes made with Xbox 360 and PlayStation 3! You love to underestimate things, right? :P
Also, don't you dare call me a Nintendo fanboy, because I am not... I don't own any Nintendo games or platforms. ;)