
eyeofcore said:


Wow... I caught the attention of a fanboy... LMFAO

LOLOL you got yourself banned. (I should probably warn the mods at the IGN forums, but whatever. Oh geez, I just found out you hang out at AnandTech too! I hope the guys there don't rip you apart. You have my blessing for being brave enough to show that attitude over there LOL.)

eyeofcore said:


The Wii U's CPU is stronger than the Xbox 360's Xenon and the PlayStation 3's Cell, and is that so hard to believe? We are talking strictly about CPU tasks, where both Xenon and Cell are failures, and Anand himself said it would have been better to put in a dual core from AMD or Intel than the mess that is in the Xbox 360 and PlayStation 3.

Why is it so hard to believe that the Wii U's CPU is weak? CPU tasks? (Apparently you don't even know what the purpose of a CPU is!) The bigger failure is the IBM Espresso LOL. Apparently you don't know that Espresso uses the same cores as the GameCube LOL. Your system of choice got turned down by 4A Games. Bwahahaha

eyeofcore said:


15 GFLOPS if all cores have 256KB of L2 cache per core; it is 17.5 GFLOPS total since Core 0 has 512KB, Core 1 has 2MB and Core 2 has 512KB of L2 cache. Yet this is all mere speculation, since we don't know the effect of eDRAM as L2 cache, nor whether those cores and that architecture were modified/customized at all. So we can only speculate, since Nintendo/IBM did not disclose anything about Espresso except that it is based on IBM's designs/architecture.

You showed further ignorance of how caches work even AFTER Pemalite lectured you. Hehehe. You probably don't even know what a cache is for! A cache hides memory latency so the cores stay fed; it maintains performance, it doesn't add a single FLOP to the peak!
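For anyone following along, peak FLOPS falls straight out of cores x clock x FLOPs-per-cycle; cache size never enters it. A back-of-the-envelope sketch, assuming the commonly quoted 3 cores at ~1.24 GHz (the FLOPs-per-cycle figure is the unknown, which is exactly why estimates in this thread range so widely):

```python
# Back-of-the-envelope peak-FLOPS sketch (assumptions, not confirmed specs):
# Espresso is widely reported as 3 cores at ~1.24 GHz; the FLOPs-per-cycle
# figure depends on what you assume the FPU can issue per clock.
CORES = 3
CLOCK_GHZ = 1.24          # commonly quoted Espresso clock (assumption)

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    """Theoretical peak = cores * clock * FLOPs issued per cycle per core.
    Cache size appears nowhere in this formula; cache only affects how
    close real code gets to the peak."""
    return cores * clock_ghz * flops_per_cycle

for label, fpc in [("scalar FP, 1 FLOP/cycle", 1),
                   ("scalar FMA, 2 FLOPs/cycle", 2),
                   ("paired-singles FMA, 4 FLOPs/cycle", 4)]:
    print(f"{label}: {peak_gflops(CORES, CLOCK_GHZ, fpc):.1f} GFLOPS")
```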

eyeofcore said:


Did you know that Jaguar's IPC is mostly below Core 2 Duo's IPC? Probably not... :3

Did you know that the disaster that is the IBM Espresso has lower IPC than just about every processor on the market? (I'm willing to bet you will deny that fact. LOL) 5 GFLOPS at ~1.2 GHz is pathetic by today's standards.
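To spell out what IPC actually buys you: per-core throughput scales with IPC times clock, so a low clock has to be made up with a lot of IPC. A toy comparison with completely made-up IPC numbers, just to illustrate the trade-off (these are not measured values for any of these chips):

```python
# Relative per-core throughput ~ IPC x clock. The IPC values below are
# made-up placeholders purely to illustrate the relationship.
def relative_throughput(ipc, clock_ghz):
    return ipc * clock_ghz

cores = {
    "lower-IPC core at 3.2 GHz (Xenon-style)":      (0.5, 3.2),
    "higher-IPC core at 1.24 GHz (Espresso-style)": (1.0, 1.24),
}
for name, (ipc, clock) in cores.items():
    print(f"{name}: {relative_throughput(ipc, clock):.2f} (arbitrary units)")
```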

eyeofcore said:


So that's another benefit over the Xbox 360 and PlayStation 3: direct communication between CPU and GPU, which is practically the basis of hUMA, and your reaction is hilarious. I know the Wii U does not have Graphics Core Next (which can only be used with the x86 architecture according to the agreement with Intel), so it cannot use the hUMA model, and I did not say it has hUMA. I said it would have a similar effect to hUMA, namely direct communication between CPU and GPU, which is what hUMA primarily achieves. So it is not hUMA, yet it does one (or more) of the things that hUMA does...

Except the Wii U's CPU and GPU can't actually communicate coherently with each other. The caches are separate and the GPU doesn't provide any flat addressing, which leaves the Wii U incapable of any effect similar to hUMA. (Dat ignorance of yours showing LOL.) BTW, what makes this post even more hilarious is that GCN isn't exclusive to any CPU architecture. Why else would ARM partner up with AMD? (I'm willing to bet you don't know the answer.)
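For the lurkers, the practical difference a flat, hUMA-style address space makes is that the CPU can hand the GPU a pointer instead of staging a copy into GPU-visible memory. A toy model of that cost, with made-up buffer size and bandwidth numbers (nothing here is a Wii U spec):

```python
# Toy model of why a flat, shared address space matters (illustrative only;
# the numbers below are placeholders, not Wii U specifications).
BUFFER_MB = 64            # hypothetical CPU-generated buffer the GPU needs
COPY_BANDWIDTH_GBPS = 10  # hypothetical CPU-to-GPU copy bandwidth

def separate_address_spaces(buffer_mb, bw_gbps):
    """Without shared addressing, the CPU must copy the buffer into
    GPU-visible memory before the GPU can touch it."""
    return (buffer_mb / 1024) / bw_gbps * 1000  # milliseconds

def shared_address_space():
    """With hUMA-style flat addressing, the CPU just hands the GPU a
    pointer; no staging copy is needed."""
    return 0.0

print(f"copy cost, separate spaces: {separate_address_spaces(BUFFER_MB, COPY_BANDWIDTH_GBPS):.2f} ms")
print(f"copy cost, shared space:    {shared_address_space():.2f} ms")
```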

eyeofcore said:


So you believe somebody who may not actually be right at all, who might be full of it and BSing you, and who could be a fanboy spreading FUD! So he "confirmed" it, yet nobody on NeoGAF, Beyond3D or any other forum knows what they are looking at, and you only believe his information because you have a bias and/or an interest, and any information that does not fit your agenda is automatically denied.

That so-called "somebody" is a developer. If you're going to deny a developer's word, then I don't have anything more to say.

eyeofcore said:


I bolded the part where you failed hard and exposed yourself and your lack of knowledge!

You mean you bolded the part where you showed even more ignorance? 0.0

eyeofcore said:


Call of Duty: Black Ops 2 is a cheap port of the Xbox 360 version and it ran fine after a patch. Call of Duty: Ghosts is another cheap port of the Xbox 360 version, so you cannot expect a higher resolution beyond some minor details looking better, plus the teams doing those ports are just a handful of people. A cheap port running poorly on Wii U does not support his point/claim/"confirmation" at all, and you also ignore other games...

How can you say that when the Wii U itself is literally cheap hardware? That so-called "cheap port" is running on a system whose CPU architecture is nearly identical to the GameCube's, while sporting a weak GPU from the PC space. It's the Wii U's fault that it can't handle Black Ops 2 at 720p!

eyeofcore said:


Need for Speed: Most Wanted on Wii U ran practically rock solid at 30 fps, and has better lighting and somewhat higher quality than the Xbox 360/PlayStation 3 versions.

It doesn't change the fact that it's still closer to the PS3/360 versions than to the PC build!

eyeofcore said:


Batman: Arkham Origins runs better on Wii U, with fewer bugs and glitches and only a few framerate drops in some areas, compared to the Xbox 360/PlayStation 3.

Neither DF nor LOT has done the analysis, and I wouldn't be too sure about claiming the definitive version given that Arkham City ran worse on the Wii U, and what's more, Arkham Origins uses the same engine!

eyeofcore said:


Deus Ex: Human Revolution Director's Cut is all-around better, and the Xbox 360/PlayStation 3 versions have framerate issues, even more so when using SmartGlass/Vita as a secondary screen, plus there is no AA of any kind.

Again, neither DF nor LOT has done an analysis yet.

eyeofcore said:


All three examples above say otherwise, so how in the hell does just 160/200 shaders at 550 MHz, i.e. a GPU with 176-220 GFLOPS, beat a GPU that is 240 GFLOPS? Look, you need to set aside your bias if you consider yourself professional and smart, because your reaction points to the opposite of that in a nanosecond.

Those 3 examples turned into one because you couldn't provide an adequate source for your claims LOL. Look, you need to set aside your bias and deal with the fact that the Wii U is weak. If you're a professional, then you should know why the Wii U is failing to outdo the PS3/360 in a lot of games.
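For reference, here is where those GFLOPS figures come from, using the usual convention of 2 FLOPs (one multiply-add) per shader ALU per clock; the 48x5 ALU breakdown for Xenos is the commonly cited configuration:

```python
# Where the GFLOPS figures in this argument come from (standard convention:
# each shader ALU retires one multiply-add = 2 FLOPs per clock).
def gpu_gflops(alus, clock_mhz, flops_per_alu_per_clock=2):
    return alus * clock_mhz * flops_per_alu_per_clock / 1000.0

print("Latte @ 160 ALUs, 550 MHz:", gpu_gflops(160, 550))      # 176.0
print("Latte @ 200 ALUs, 550 MHz:", gpu_gflops(200, 550))      # 220.0
# Xenos (Xbox 360): 48 shader units, each a 5-wide vector ALU, at 500 MHz.
print("Xenos @ 48x5 ALUs, 500 MHz:", gpu_gflops(48 * 5, 500))  # 240.0
```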

eyeofcore said:


40nm is old, but it is not ancient like you claim, and it is still widely used in the industry; if it were ancient, it would not be used at all. >_>

If a cash-strapped company like AMD could make the jump, why is Nintendo so lazy about transitioning to a newer process? AMD and the smartphone manufacturers did it at the beginning of 2012! Deal with the fact that 40nm is pathetic by today's standards!

eyeofcore said:


You think Wii U has current-generation performance? Wow... What a golden failure of a statement... Let's make a proper comparison.

The real failure is the fact that you have shown absolutely no understanding thus far!

eyeofcore said:

 

PlayStation All-Stars Battle Royale is a Super Smash Bros. clone, or at least inspired by it, and it runs at 720p/60fps.

Super Smash Bros. for Wii U is 1080p/60fps, and that is confirmed by Nintendo.

Smash Bros. is only confirmed by Nintendo to be 1080p. http://www.eurogamer.net/articles/2013-06-11-super-smash-bros-for-wii-u-and-3ds-due-2014 The 60fps part doesn't have any backing from Nintendo themselves LOL. Uh oh, you might have to deal with 30fps! Bwahahaha
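Just to put the two claims in perspective, the raw pixel rates work out as follows (pure arithmetic, no hardware assumptions beyond resolution and framerate):

```python
# Raw pixel throughput implied by each claim.
def pixels_per_second(width, height, fps):
    return width * height * fps

ps_all_stars  = pixels_per_second(1280, 720, 60)   # 720p60 claim
smash_1080p60 = pixels_per_second(1920, 1080, 60)  # 1080p60 claim
smash_1080p30 = pixels_per_second(1920, 1080, 30)  # the "uh oh" scenario

print(f"720p60 : {ps_all_stars / 1e6:.1f} Mpixels/s")
print(f"1080p60: {smash_1080p60 / 1e6:.1f} Mpixels/s ({smash_1080p60 / ps_all_stars:.2f}x the 720p60 load)")
print(f"1080p30: {smash_1080p30 / 1e6:.1f} Mpixels/s ({smash_1080p30 / ps_all_stars:.2f}x)")
```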

eyeofcore said:


Also, why does Bayonetta 2 look way better than Bayonetta? If the Wii U only has current-generation performance, how the hell is X running on Wii U, how the hell is Mario Kart 8 running on Wii U, and why do they look so good?

They all look average at best. I guess you're so used to sub-HD that you're feeling the leap PS3/360 users felt back in 2006 LOL.

eyeofcore said:

 

http://fast.shinen.com/neo/

Shin'en confirmed that the vehicle on the site is rendered in-game (read the replies);

I wonder how the rest will turn out. Remember that Nano Assault Neo game? It turned out not to be so impressive if you ask anyone LOL.

eyeofcore said:

 

Also, explain to me why CryEngine 3.5/4 supports the Wii U if it only has current-generation performance? It does not support the Xbox 360/PlayStation 3;

http://www.crytek.com/news/crytek-announces-the-arrival-of-the-new-cryengine

The new unversioned CryEngine even supports the current-gen consoles! http://www.gamespot.com/articles/new-cryengine-revealed/1100-6413371/ So much for trying to set the Wii U apart from the current-gen consoles LOL.

eyeofcore said:


A Radeon HD 6650M should fit and be in the Wii U; you will say it is not possible, yet it is possible. First of all, AMD produces their GPUs on a relatively cheap process/silicon while Nintendo is producing their GPU "Latte" on a TSMC CMOS process/silicon that is more expensive and higher quality, so it should allow somewhat higher density, right? Also, the Radeon HD 6000 series was released in 2010, so I doubt AMD did not find a way to increase density and possibly save some space in two years, and the 6000 series is used in the Trinity and Richland APUs, which are 32nm and may use an improved manufacturing technique.

AMD fabs their GPUs at TSMC anyway, so if anything the quality isn't any different. Wow, you don't follow AMD like you claim in your YouTube videos! -_- And any density increase from design tweaks is small compared to a node transition. At best they can probably only squeeze out 10%.
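A rough sense of scale for that point: ideal geometric scaling from 40nm to 28nm roughly doubles density, while same-node design tweaks are in the ~10% range claimed above (the 10% is this post's estimate, not a measured number, and real shrinks never hit the ideal factor):

```python
# Rough scale comparison: same-node layout tweaks vs. a full node shrink.
# Ideal area scaling between nodes goes with the square of the feature-size
# ratio; real designs fall short of this, so treat it as an upper bound.
OLD_NODE_NM = 40
NEW_NODE_NM = 28
SAME_NODE_TWEAK = 0.10   # the ~10% figure claimed above (an assumption)

ideal_density_gain = (OLD_NODE_NM / NEW_NODE_NM) ** 2
print(f"Ideal 40nm -> 28nm density gain: {ideal_density_gain:.2f}x")
print(f"Same-node design tweaks:         {1 + SAME_NODE_TWEAK:.2f}x")
```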

eyeofcore said:


Nintendo most likely customized the chip, and we know they don't waste any kind of silicon; they are cheap when it comes to silicon and will try to get the most out of it for less and save a couple of cents wherever possible, while using a higher-quality process/silicon to potentially reduce power consumption and increase the efficiency of their hardware. Also, don't say it is the 4000 series, because the 5000 series is basically just the 4000 series at 40nm with DirectX 11 support added, and the 6000 series is basically an improved 5000 series.

http://www.techpowerup.com/gpudb/326/radeon-hd-6650m.html

http://www.notebookcheck.net/AMD-Radeon-HD-6650M.43962.0.html

Thanks for proving that Nintendo is cheap when it comes to process nodes. *rolls eyes* BTW, the Wii U's processing components probably draw less than 30 watts in total! http://www.anandtech.com/show/6465/nintendo-wii-u-teardown I assume the CPU takes about 5 watts while the GPU might take up to 25 watts. We PC users all know that a Radeon HD 5550's TDP is about 40 watts max! If we assume half of the shaders are non-functional, then that lines up with the power consumption of the Wii U, so ninjablade might be right after all! The Wii U is closer to the HD 5000 series than to the HD 6000 series. The Wii U IS looking to have less than 200 GFLOPS in total! Deal with it!
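Here is that budget written out as a back-of-the-envelope sketch, taking the assumptions above at face value (the ~33 W figure is the commonly cited whole-console draw from AnandTech's testing; the per-component split and the linear shader-to-power scaling are assumptions, not measurements):

```python
# Back-of-the-envelope power budget under the assumptions stated above.
# None of these splits are measured figures; they just show the reasoning.
SYSTEM_DRAW_W = 33        # approx. whole-console draw reported by AnandTech (treated as an assumption)
CPU_GUESS_W = 5           # assumed Espresso share
GPU_BUDGET_W = SYSTEM_DRAW_W - CPU_GUESS_W - 3   # leave ~3 W for RAM, I/O, etc.

HD5550_TDP_W = 40         # approximate desktop Radeon HD 5550 TDP
SHADER_FRACTION = 0.5     # "half the shaders disabled" assumption

# Crude linear scaling of GPU power with active shader count.
scaled_gpu_w = HD5550_TDP_W * SHADER_FRACTION

print(f"GPU power budget inside the console: ~{GPU_BUDGET_W} W")
print(f"HD 5550 scaled to half its shaders:  ~{scaled_gpu_w:.0f} W")
```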

eyeofcore said:


If it were just 160 to 200 shaders, then Nintendo would not have put in 32MB of eDRAM at all, because it would be a waste of die space and money; we are talking about Nintendo, not Microsoft or Sony! Nintendo saw the design mistakes of the Xbox 360 and PlayStation 3! You love to underestimate things, right? :P

Nintendo is more incompetent at designing hardware than you give them credit for! That 32MB of eDRAM isn't exactly a waste. What is a waste is that Nintendo cheaped out on the rest of the hardware, so don't try to deny it! Just because Nintendo saw their mistakes does not mean they learned from them! No, I don't love to underestimate; if anything I overestimated Nintendo, expecting them to have outright better hardware! Now you have just reinforced my view of how weak the Wii U truly is! You did not make your case better; in fact, those examples were used against you! I think I have solved the puzzle of the Wii U's hardware mysteries!

eyeofcore said:


Also, don't you dare call me a Nintendo fanboy, because I am not... I don't own any Nintendo games or platforms. ;)

SMH. Your YouTube channel says otherwise, and so do the users at AnandTech.