Pemalite said:
|
I love when people use great analogies like that for explaining things. The car one was also pretty good.
eyeofcore said:
How much faster is the Wii U's CPU than the Xbox 360 and PlayStation 3 CPUs on average, if you had to make a rough estimate? Also, how comparable is the Wii U's CPU, core for core, to the CPUs in the Xbox One and PlayStation 4? Do you have a rough idea of the benefits of the Wii U's CPU having direct access to the GPU's eDRAM? Could it have a similar effect to AMD's hUMA in a way, not the same implementation, but a similar effect on performance and programming, if it is easier than the regular route on UMA systems? StarCraft 2 is only CPU-bottlenecked because a lot of the work is done on the CPU and it only uses 2 cores; if it used 4 cores natively and properly it would run fine on the next-generation systems. Battlefield 4 is GPU-intensive primarily because of DirectX 11 effects and other things. I have been investigating the Wii U's GPU and I find it silly that it is supposedly just 320 shaders, basically something like a Radeon HD 5550, or, as some people insist, a ridiculous 160 shaders. I found that a GPU like the Radeon HD 6630M would fit, and while AMD's GPUs are made on a relatively cheap process at TSMC's fabs, the Wii U's GPU is also made at TSMC but on a CMOS process or something like that. So I assume it is on a better process/silicon that would allow higher density, among other things. |
The Wii U's CPU isn't stronger at all! Its estimated performance comes in at around 15 GFLOPS, and I'm willing to bet its integer performance isn't too hot either.
Core for core their features are similar, but performance is a different story altogether, to the point where the IBM Espresso doesn't come anywhere near the Jaguar cores.
The only benefit of the Wii U's CPU accessing the eDRAM is increased CPU performance, that is it. Btw the Wii U doesn't even have GCN cores to begin with, so it can't support the hUMA model!
Actually this is where ninjablade might be right. Somebody here named nylevia basically told us that the Wii U has around 150 to 200 shaders. Why do you find it silly that the Wii U only has 320 shaders (maybe even fewer than we suggest, since nylevia basically confirmed that it doesn't have over 200)? It's not that surprising, because COD Ghosts is still running at a sub-HD resolution, which only further proves ninjablade's point.
Look, the Wii U's GPU is made on a 40nm node, which is ancient by today's standards, but to suggest that it has more shaders while COD Ghosts runs at sub-HD resolutions is asinine. The Wii U has current-gen performance, deal with it!
fatslob-:O said: The Wii U's CPU isn't stronger at all! Its estimated performance comes in at around 15 GFLOPS, and I'm willing to bet its integer performance isn't too hot either.
Core for core their features are similar, but performance is a different story altogether, to the point where the IBM Espresso doesn't come anywhere near the Jaguar cores.
The only benefit of the Wii U's CPU accessing the eDRAM is increased CPU performance, that is it. Btw the Wii U doesn't even have GCN cores to begin with, so it can't support the hUMA model!
Actually this is where ninjablade might be right. Somebody here named nylevia basically told us that the Wii U has around 150 to 200 shaders. Why do you find it silly that the Wii U only has 320 shaders (maybe even fewer than we suggest, since nylevia basically confirmed that it doesn't have over 200)? It's not that surprising, because COD Ghosts is still running at a sub-HD resolution, which only further proves ninjablade's point.
Look, the Wii U's GPU is made on a 40nm node, which is ancient by today's standards, but to suggest that it has more shaders while COD Ghosts runs at sub-HD resolutions is asinine. The Wii U has current-gen performance, deal with it!
|
Wow... I caught the attention of a fanboy... LMFAO
The Wii U's CPU is stronger than the Xbox 360's Xenon and the PlayStation 3's CELL, and is that so hard to believe? We are talking strictly about CPU tasks, where both Xenon and CELL are failures, and Anand himself said it would have been better to put a dual-core from AMD or Intel in the Xbox 360 and PlayStation 3 than the mess that is in them.
It would be 15 GFLOPS if all cores had 256KB of L2 cache each, but 17.5 GFLOPS total since Core 0 has 512KB, Core 1 has 2MB and Core 2 has 512KB of L2 cache. Yet this is all mere speculation, since we don't know the effect of the eDRAM used as L2 cache, nor whether those cores and that architecture were modified/customized at all. So we can only speculate, since Nintendo/IBM did not disclose anything about Espresso except that it is based on IBM's designs/architecture.
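For reference, peak GFLOPS figures like the ~15 quoted in this thread normally come from cores x clock x FLOPs per cycle, and cache size does not enter that peak-throughput formula. Below is a minimal sketch, assuming Espresso's reported ~1.24GHz clock and a paired-singles fused multiply-add giving 4 single-precision FLOPs per cycle per core; that per-cycle figure is an assumption, not a confirmed spec.

```python
# Back-of-envelope peak GFLOPS estimate (a sketch, not a confirmed spec).
# Assumptions: 3 cores at ~1.243 GHz, paired-singles fused multiply-add
# giving 4 single-precision FLOPs per cycle per core.
cores = 3
clock_ghz = 1.243
flops_per_cycle = 4  # assumption: 2-wide paired singles x multiply-add

peak_gflops = cores * clock_ghz * flops_per_cycle
print(f"Estimated peak: {peak_gflops:.1f} GFLOPS")  # ~14.9 GFLOPS
```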
Did you know that Jaguar's IPC is mostly below Core 2 Duo's IPC? Probably not... :3
So that is another benefit over the Xbox 360 and PlayStation 3: direct communication between CPU and GPU, which is practically the basis of hUMA, and your reaction is hilarious. I know that the Wii U does not have Graphics Core Next (which can only be used with the x86 architecture, according to the agreement with Intel), so it cannot use the hUMA model, but I did not say it has hUMA. I said it would have a similar effect to hUMA, namely direct communication between CPU and GPU, which is what hUMA primarily achieves. So it is not hUMA, yet it does one (or more) of the things hUMA does...
So you believe somebody who may not be right at all, who might be full of it and BSing you, and who could also be a fanboy spreading FUD! So he "confirmed" it, yet nobody on NeoGAF, Beyond3D or any other forum knows what they are looking at, and you only believe his information because you have a bias and/or an interest, and any information that does not fit your agenda is automatically denied.
I bolded the part where you failed hard and exposed yourself and your lack of knowledge!
Call of Duty: Black Ops 2 is a cheap port of the Xbox 360 version and it ran fine after a patch. Call of Duty: Ghosts is another cheap port of the Xbox 360 version, so you cannot expect a higher resolution, just some minor details looking better, and the teams doing these ports are only a handful of people. A cheap port running poorly on Wii U does not support his point/claim/"confirmation" at all, and you ignore other games...
Need for Speed: Most Wanted on Wii U ran practically rock solid at 30fps, has better lighting and is higher quality in places than the Xbox 360/PlayStation 3 version.
Batman: Arkham Origins runs better on Wii U, with fewer bugs and glitches and only a few framerate drops in some areas, compared to the Xbox 360/PlayStation 3.
Deus Ex: Human Revolution Director's Cut is better all around, while the Xbox 360/PlayStation 3 versions have framerate issues, even more so when using SmartGlass/Vita as a secondary screen, plus there is no AA of any kind.
All three examples above say otherwise, so how in the hell does a GPU with just 160/200 shaders at 550MHz, i.e. 176-220 GFLOPS, beat a GPU that is 240 GFLOPS? Look, you need to set aside your bias if you consider yourself professional and smart, because your reaction points to the opposite of that in a nanosecond.
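For reference, the GPU numbers being thrown around come from the usual shaders x 2 FLOPs (multiply-add) x clock formula. A minimal sketch using the shader counts and clocks quoted in this thread (all of them rumoured, not confirmed):

```python
# Shader count -> peak GFLOPS, using the figures quoted in the thread.
def gpu_gflops(shaders, clock_ghz, flops_per_shader_cycle=2):
    # 2 FLOPs per shader per cycle = one multiply-add
    return shaders * clock_ghz * flops_per_shader_cycle

print(gpu_gflops(160, 0.55))  # 176.0  (low end of the rumoured Wii U range)
print(gpu_gflops(200, 0.55))  # 220.0  (high end of the rumoured Wii U range)
print(gpu_gflops(240, 0.50))  # 240.0  (Xbox 360 Xenos: 240 ALUs at 500 MHz)
print(gpu_gflops(320, 0.55))  # 352.0  (the 320-shader estimate)
```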
40nm is old, but it is not ancient as you claim, and it is still widely used in the industry; if it were ancient it would not be used at all to any degree. >_>
You think the Wii U has current-generation performance? Wow... Such a golden failure of a statement... Let's make a proper comparison.
PlayStation All-Stars Battle Royale, a Super Smash Bros. clone or at least inspired by it, is 720p at 60fps.
Super Smash Bros. for Wii U is 1080p at 60fps, and that is confirmed by Nintendo.
Also, why does Bayonetta 2 look way better than Bayonetta, if the Wii U has current-generation performance? How the hell is X running on Wii U, how the hell is Mario Kart 8 running on Wii U, and why does it look so good?
Shin'en confirmed that the vehicle on the site is rendered in-game (read the replies):
We are happy to announce our next Wii U game "FAST Racing Neo" http://t.co/9MzGWkUil1
— Shin'en Multimedia (@ShinenGames) October 29, 2013
Also, explain to me why CryEngine 3.5/4 supports the Wii U if it has current-generation performance. It does not support the Xbox 360/PlayStation 3;
http://www.crytek.com/news/crytek-announces-the-arrival-of-the-new-cryengine
A Radeon HD 6650M should fit in the Wii U. You will say it is not possible, yet it is. First of all, AMD produces its GPUs on a relatively cheap process/silicon, while Nintendo is producing its "Latte" GPU on a TSMC CMOS process that is more expensive and higher quality, so it should allow somewhat higher density, right? Also, the Radeon HD 6000 series was released in 2010, so I doubt AMD did not find a way to increase density and save some space in two years, and the 6000 series is used in the Trinity and Richland APUs, which are 32nm and may use an improved manufacturing technique.
Nintendo most likely customized the chip, and we know that they don't waste any silicon; they are cheapskates when it comes to silicon and will try to get the most out of it for less and save a couple of cents where possible, while using a higher-quality process/silicon to reduce power consumption and increase the efficiency of their hardware. Also, don't say it is the 4000 series, because the 5000 series is just the 4000 series at 40nm with DirectX 11 support added, and the 6000 series is basically an improved 5000 series with changes.
http://www.techpowerup.com/gpudb/326/radeon-hd-6650m.html
http://www.notebookcheck.net/AMD-Radeon-HD-6650M.43962.0.html
If it were just 160 to 200 shaders, then Nintendo would not have put in 32MB of eDRAM at all, because it would be a waste of die space and money; we are talking about Nintendo, not Microsoft or Sony! Nintendo saw the design mistakes of the Xbox 360 and PlayStation 3! You love to underestimate things, right? :P
Also, don't dare call me a Nintendo fanboy, because I am not... I don't own any games or platforms from Nintendo. ;)
~ Mod edit ~
This user has been moderated by TruckOSaurus for this post.
Pemalite said:
|
I'm quite a big AMD fan for many reasons: my first PC's chipset, an Intel 430TX, was quite a rip-off, while my first CPU, an AMD K6-166, was very good for its price, and what's more, at that time there were still buggy Pentiums around. Later I had a Duron, quite a bit better than a Celeron at the same price, then an Athlon XP when Intel was offering really disappointing P4s, and my current PC has a fairly power-frugal AMD CPU, while its HD 3300, besides being cool enough to need only passive cooling, was the most powerful onboard GPU of its time. So I'm willing to wait and see what AMD will be able to offer with its new APUs. BTW, I believe we need AMD to stay afloat if we want Intel to behave fairly enough; without competition it could become really ruthless...
eyeofcore said:
|
LOLOL you got yourself banned. (I should probably warn the mods at the IGN forums, but whatever. Oh geez, I just found out you hang out at AnandTech too! I hope the guys there won't rip into you. You have my blessing for being brave enough to even show your attitude there LOL.)
eyeofcore said:
|
Why is it so hard to believe that the Wii U's CPU is weak? CPU tasks? (Apparently you don't know what the purpose of a CPU even is!) The bigger failure is the IBM Espresso LOL. Apparently you don't know that the IBM Espresso is using the same cores as the GameCube LOL. Your system of choice got denied by 4A Games. Bwahahaha
eyeofcore said:
|
You further showed ignorance of how caches work, even AFTER Pemalite lectured you. Hehehe. You probably don't even know the purpose of a cache! Caches are used to maintain performance, not increase it!
eyeofcore said:
|
Did you know that the disaster that is the IBM Espresso has a lower IPC than every other processor on the market? (I'm willing to bet that you will deny that. LOL) 5 GFLOPS at 1.2GHz is pathetic by today's standards.
eyeofcore said:
|
Except the Wii U's CPU and GPU can't communicate in harmony. The caches must stay separate, and the GPU doesn't provide any flat addressing modes, which leaves the Wii U incapable of any effect similar to hUMA. (Dat ignorance of yours showing LOL.) BTW, what makes this post even more hilarious is that GCN isn't exclusive to any CPU architecture. Why would ARM partner up with AMD? (I'm willing to bet that you don't know the answer.)
eyeofcore said:
|
That so-called "somebody" is a developer. If you're going to deny a developer's words, then I don't have anything more to say.
eyeofcore said:
|
You mean you bolded the part where you showed even more ignorance? 0.0
eyeofcore said:
|
How can you say that when the Wii U is literally cheap hardware? That so-called "cheap port" is running on a system that has an almost identical CPU architecture to the GameCube while sporting a weak GPU from the PC space. It's the Wii U's fault that it can't handle BLOPS 2 at 720p!
eyeofcore said:
|
It doesn't change the fact that it's closer to the current consoles!
eyeofcore said:
|
Neither DF nor LoT did an analysis, plus I wouldn't be too sure about claiming the definitive version, given that Arkham City ran worse on the Wii U, and what's more, Arkham Origins is using the same engine!
eyeofcore said:
|
Again, neither DF nor LoT has done an analysis yet.
eyeofcore said:
|
Those 3 examples turned into one because you couldn't provide an adequate source for your claims LOL. Look, you need to set aside your bias and deal with the fact that the Wii U is weak. If you're a professional, then you should know why the Wii U is failing to outdo the PS360 in a lot of games.
eyeofcore said:
|
If a poor company like AMD could do it, why is Nintendo so lazy about transitioning to the process? AMD and the smartphone manufacturers did it at the beginning of 2012! Deal with the fact that 40nm is pathetic by today's standards, then!
eyeofcore said:
|
The failure is the fact that you have shown absolutely no understanding thus far!
eyeofcore said:
PlayStation All-Stars Battle Royale, a Super Smash Bros. clone or at least inspired by it, is 720p at 60fps. Super Smash Bros. for Wii U is 1080p at 60fps, and that is confirmed by Nintendo. |
Smash Bros. is only confirmed to be 1080p by Nintendo. http://www.eurogamer.net/articles/2013-06-11-super-smash-bros-for-wii-u-and-3ds-due-2014 That 60fps part doesn't have any backing from Nintendo themselves LOL. Uh oh, you might have to deal with 30fps! Bwahahaha
eyeofcore said:
|
They all look average at best. I guess you're so used to sub-HD that you're feeling the leap that PS360 users felt in 2006 LOL.
eyeofcore said:
Shin'en confirmed that the vehicle on the site is rendered in-game (read the replies): We are happy to announce our next Wii U game "FAST Racing Neo" http://t.co/9MzGWkUil1 |
I wonder how the rest will turn out. Remember that Nano Assault Neo game? It turned out to be not so impressive, if you ask anyone LOL.
eyeofcore said:
Also, explain to me why CryEngine 3.5/4 supports the Wii U if it has current-generation performance. It does not support the Xbox 360/PlayStation 3; http://www.crytek.com/news/crytek-announces-the-arrival-of-the-new-cryengine |
The unversioned CryEngine is even supported on the current-gen consoles! http://www.gamespot.com/articles/new-cryengine-revealed/1100-6413371/ So much for trying to differentiate the Wii U from the current-gen consoles LOL.
eyeofcore said:
|
AMD always fabs their GPUs at TSMC, so if anything the quality isn't any different. Wow, you don't follow AMD like you claim in your YouTube videos! -_- And the increased density is small compared to a node transition; at best they can probably squeeze out 10%.
eyeofcore said:
http://www.techpowerup.com/gpudb/326/radeon-hd-6650m.html http://www.notebookcheck.net/AMD-Radeon-HD-6650M.43962.0.html |
Thanks for proving that Nintendo is cheap when it comes to process nodes. *rolls eyes* BTW, the Wii U's processing components probably take less than 30 watts! http://www.anandtech.com/show/6465/nintendo-wii-u-teardown I assume that the CPU takes about 5 watts while the GPU might take up to around 25 watts. We PC users all know that a Radeon HD 5550's TDP is about 40 watts max! If we assume half of the shaders are non-functional, then that matches up exactly with the Wii U's power consumption, so ninjablade might be right after all! The Wii U is closer to the HD 5000 series than the HD 6000 series. The Wii U IS looking to have less than 200 GFLOPS in total! Deal with it!
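A minimal sketch of the reasoning in that last paragraph: it scales a desktop HD 5550's ~40W board power by the fraction of shaders assumed active. Linear power scaling with shader count is purely an illustrative assumption here; real GPU power does not scale that way.

```python
# Sketch of the post's reasoning: scale a desktop HD 5550's ~40 W board
# power by the fraction of shaders assumed active. Linear scaling is an
# assumption for illustration only, not how GPU power actually behaves.
hd5550_tdp_w = 40      # approximate desktop HD 5550 board power
hd5550_shaders = 320
assumed_active = 160   # the "half the shaders" guess from the post

scaled_power = hd5550_tdp_w * assumed_active / hd5550_shaders
print(f"~{scaled_power:.0f} W")  # ~20 W, near the ~25 W GPU budget guessed above
```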
eyeofcore said:
|
Nintendo is even more incompetent when it comes to designing hardware! That 32MB of eDRAM isn't exactly a waste. What is a waste is that Nintendo cheaped out on the hardware, so don't try to deny this! Just because Nintendo saw their mistakes does not mean they will learn from them! No, I don't love to underestimate; instead I overestimated, thinking Nintendo would have outright better hardware! Now you have only further changed my view of how weak the Wii U truly is! You did not make your case better; in fact, those cases were used against you! I think I have solved the puzzle of the Wii U's hardware mysteries!
eyeofcore said:
|
SMH. Your YouTube channel says very differently, and so do the users at AnandTech.
eyeofcore said: All three examples above say otherwise, so how in the hell does a GPU with just 160/200 shaders at 550MHz, i.e. 176-220 GFLOPS, beat a GPU that is 240 GFLOPS? Look, you need to set aside your bias if you consider yourself professional and smart, because your reaction points to the opposite of that in a nanosecond. 40nm is old, but it is not ancient as you claim, and it is still widely used in the industry; if it were ancient it would not be used at all to any degree. >_> |
Even if the Wii U GPU's gigaflop figure were lower than the Xbox 360's or PlayStation 3's GPU, it would still be faster; it's a pointless metric to use when comparing different architectures.
Basically, the Wii U's GPU can do *more* per gigaflop thanks to technologies such as better compression algorithms, better culling, more powerful geometry engines, you name it.
If a GPU only had to deal with floating point math, then it would be an accurate metric to gauge performance, but that's not reality.
As for 40nm vs 28nm, well, the laws of physics apply here; 28nm is pretty much superior.
With that said, 40nm is also stupidly cheap and very, very mature.
You can actually optimise the type of transistor for a given fabrication process in order to minimise leakage, which means you can pack more transistors into the same space without dropping down a lithographic node.
For example, the Radeon 290X has 43% more transistors than the Radeon 7970, yet its die size is only 20% larger on the same fabrication process.
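Using the figures quoted above (roughly 43% more transistors in roughly 20% more area on the same 28nm process), the implied density gain works out to around 19%. A quick check:

```python
# Implied transistor-density gain from the figures quoted above
# (~43% more transistors in ~20% more die area, same 28 nm process).
transistor_ratio = 1.43
area_ratio = 1.20

density_gain = transistor_ratio / area_ratio - 1
print(f"~{density_gain:.0%} higher density")  # ~19%
```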
--::{PC Gaming Master Race}::--
Pemalite said:
|
The Wii U may be superior due to the fact that it has more memory. BTW, the Radeon HD 5000 tessellators were awful. (Ugh, how painful that was.) Anyway, agreed with everything else.
fatslob-:O said: The Wii U may be superior due to the fact that it has more memory. BTW, the Radeon HD 5000 tessellators were awful. (Ugh, how painful that was.) Anyway, agreed with everything else. |
Still, the Radeon 5000-class tessellators are far better than the non-existent tessellator in the PlayStation 3 and the TruForm-based tessellator in the Xbox 360. :P
--::{PC Gaming Master Race}::--
Pemalite said:
|
Touché.
fatslob-:O said:
LOLOL you got yourself banned. (I should probably warn the mods at the IGN forums, but whatever. Oh geez, I just found out you hang out at AnandTech too! I hope the guys there won't rip into you. You have my blessing for being brave enough to even show your attitude there LOL.) | So...? Hahaha. You get excited over nothing. lol
Why is it so hard to believe that the Wii U's CPU is weak? CPU tasks? (Apparently you don't know what the purpose of a CPU even is!) The bigger failure is the IBM Espresso LOL. Apparently you don't know that the IBM Espresso is using the same cores as the GameCube LOL. Your system of choice got denied by 4A Games. Bwahahaha | *facepalm* You failed... Wow... Why is it so hard to believe that the Wii U's CPU is not weak?! I love how you force a premise that is not valid. As I said, a CPU task is a CPU task, and you don't know what I meant by that? Wow! A CPU primarily handles serial tasks, while it is not good at parallel tasks the way a GPU is. Actually, the bigger failures are the Xbox 360's Xenon and the PlayStation 3's CELL, which have 32-40 stage pipelines and thus a lot of "bubbles", and 1MB/768KB of cache for three cores is not enough; plus Gekko/Broadway/Espresso is out-of-order execution versus their in-order execution. Apparently you really love to force premises and try to lowball me without asking a question; just because something seems "apparent" does not mean it is true, and it only makes you an easier target to get shot down and pwned in debates, because assuming that a person surely does not know this or that can backfire, like the DRM thing with the Xbox One and Microsoft. Also, the GameCube and Wii/Wii U cores are not really the same: the GameCube's Gekko is the PPCDBK (an iteration of the PowerPC 750CX), while the Wii/Wii U's Broadway/Espresso is the PowerPC 750CL, and the Wii U's CPU is the first PowerPC 750 with multi-core support and other additions, so it is not really a plain PowerPC 750 design, since it borrowed a couple of small things from POWER7, e.g. the eDRAM implementation. My system of choice did not get denied by 4A Games; did 4A Games say that the PC (x86-64) is weak? No... LMAO
You further showed ignorance of how caches work, even AFTER Pemalite lectured you. Hehehe. You probably don't even know the purpose of a cache! Caches are used to maintain performance, not increase it! | You further decided to fail, and I can only laugh at you, since the purpose of a cache is to store information, e.g. code for things like AI and physics. Caches maintain performance and thus increase a CPU's performance in the long run compared to an equivalent CPU with less cache! I am not talking about GFLOPS performance, as you apparently think I am. lol
Did you know that the disaster that is the IBM Espresso has a lower IPC than every other processor on the market? (I'm willing to bet that you will deny that. LOL) 5 GFLOPS at 1.2GHz is pathetic by today's standards. | It is not a disaster for what it is... Espresso matches these in GFLOPS:
Core i5-480M: 2 cores / 4 threads, 32nm lithography, 2.66-2.9GHz clock, 3MB cache, 81mm^2 die, 328 million transistors, 35W TDP.
Pentium E5800: 2 cores, 45nm lithography, 3.2GHz clock, 2MB cache, 82mm^2 die, 228 million transistors, 65W TDP.
Are both of the above disasters for what they are, compared to Espresso, which has 3 cores, is produced on 45nm lithography, is clocked at 1.243GHz, has 3MB of cache, has a die size of just 27.7mm^2 with 60 million transistors, and has a TDP of around 10-15 watts? You deny that it is not a disaster for what it is, for the architecture it is, and for the performance per mm^2 it delivers on 45nm lithography.
Except the Wii U's CPU and GPU can't communicate in harmony. The caches must stay separate, and the GPU doesn't provide any flat addressing modes, which leaves the Wii U incapable of any effect similar to hUMA. (Dat ignorance of yours showing LOL.) BTW, what makes this post even more hilarious is that GCN isn't exclusive to any CPU architecture. Why would ARM partner up with AMD? (I'm willing to bet that you don't know the answer.) | Espresso has a separate eDRAM cache for each CPU core, and Latte has 32MB of eDRAM plus a separate 2MB of eDRAM. Quote from NeoGAF: "The interface running around the lower right corner of the die is the DDR3 memory interface (the DDR3 is known as MEM2)." Your responses are ignorant, arrogant and above all egotistic... that's kind of antisocial. The Graphics Core Next architecture acts similarly to x86, and I read on the AnandTech forums that it cannot be used in a hUMA/APU configuration with any CPU architecture other than x86; that is what I know. AMD partnered with ARM on research involving HSA, and also to diversify themselves, to increase their chances of profit and survival, and to expand their presence in the market.
That so-called "somebody" is a developer. If you're going to deny a developer's words, then I don't have anything more to say. | How can you be sure that this "developer" does not have an agenda, and even then, how can you be sure, since he is not an engineer? How can I, you or he be sure that the hardware is not partly locked off because the system software is not stable? Something like the Xbox One's Kinect situation, where Microsoft dedicated/allocated 10% of GPU power to it yet can't free it up for games because the system would be unstable. Remember when Cliff said that Unreal Engine 4 could not run on Wii U, yet a few days later walked his statement back? Yes? Did he develop for the Wii U? Can he confirm it? Did he develop a fully fledged game? Is he a programmer? Do developers have complete access to the hardware, or is it limited by firmware that could be updated? Countless factors.
You mean you bolded the part where you showed even more ignorance? 0.0 | Fail... You said that guy's point is valid because a cheap port does not look good on it or is not native 720p/HD. It is a cheap port of the Xbox 360 version, done by a handful of people, not a fully fledged team of 100+ who could make complete engine changes and optimizations wrapped around the Wii U's hardware and its virtues.
How can you say that when the Wii U is literally cheap hardware? That so-called "cheap port" is running on a system that has an almost identical CPU architecture to the GameCube while sporting a weak GPU from the PC space. It's the Wii U's fault that it can't handle BLOPS 2 at 720p! | Another fail. So you have something against cost-effective hardware? eSRAM/SRAM is six times larger than eDRAM, which has a bit higher latency and a bit lower bandwidth and performance, yet it is good enough at 45nm and better than the regular 90nm SRAM L2 cache in the Xbox 360/PlayStation 3. Also, when shrinking components the latency drops, and games optimized for the original latency of the Xbox 360/PlayStation 3's 90nm SRAM would break, so they have to change some things and give back some of the improvements of the die shrink in order not to break compatibility with the software. It is a cheap port no matter how you flip-flop it... It is not the Wii U's fault. So are you saying it is not Bethesda's fault that their games, like Skyrim, ran poorly on the PlayStation 3; it is the hardware's fault... right? *facepalm*
It doesn't change the fact that it's closer to the current consoles! | The fact is that it is better overall and is a good port considering that a handful of people made it. Also, do you know a game called Project CARS? It was set for PlayStation 3, Xbox 360, Wii U and PC. Now it is set for Wii U, Xbox One and PlayStation 4, so explain to me why they did not ditch the Wii U version... :3 Hint: the engine developer at SMS working on Project CARS also worked on the Need for Speed: Most Wanted U port, so he knows the capability of the Wii U's hardware, yet he decided not to ditch it, and reports on the Wii U build always mention DirectX 11-like features... DirectX 11-class features are taxing, yet they are being used in the Wii U version, and the Wii U version did not get ditched. This will be the first Wii U game that is fully next-generation.
Neither DF nor LoT did an analysis, plus I wouldn't be too sure about claiming the definitive version, given that Arkham City ran worse on the Wii U, and what's more, Arkham Origins is using the same engine! | Another failure... The Wii U edition of Arkham City used 2 cores, as the developers said at NeoGAF, and they said they barely used the 3rd core. I know it is using the same engine, yet that does not mean its successor, Arkham Origins, would run exactly the same as Arkham City. Developers need time to get to know the hardware; they learned what to do on Wii U and how to optimize for it, so they use what they learned. So by your logic, launch games on the Xbox 360 and PlayStation 3 got the most out of those systems, since you deny that developers can learn how to optimize for the hardware and resolve the issues. Should that also hold for the PlayStation 4 and Call of Duty: Ghosts, which has some framerate issues, so the successor to Ghosts should also run with framerate issues?
Again, neither DF nor LoT has done an analysis yet. | So you deny the reports from NeoGAF users... Reviewers touted the Wii U version as the definitive version. A while ago I stumbled upon a thread at NeoGAF: http://www.neogaf.com/forum/showthread.php?t=702931 SSAO is dialed back, and framerate/performance takes a hit in dual-screen mode in the PlayStation 3 version... The same is most likely true of the Xbox 360 version, since those versions were not designed with dual screens in mind, unlike the Wii U version.
Those 3 examples turned into one because you couldn't provide an adequate source for your claims LOL. Look, you need to set aside your bias and deal with the fact that the Wii U is weak. If you're a professional, then you should know why the Wii U is failing to outdo the PS360 in a lot of games. | I am sorry, but you are the one with a bias, and you need to set it aside; I don't have one, and that is just a premise you keep forcing. You can't deal with the fact that the Wii U is getting cheap ports of the Xbox 360 versions; is it that hard to accept the truth? Don't you remember when the PlayStation 3 got cheap ports? Are you that naive to think that developers can use up all the potential in one single freaking year and that they already know how to optimize for it? First of all, not many game developers had experience with the Wii's CPU, let alone a multi-core PowerPC 750CL, which is the first of its kind and thus practically unfamiliar to every developer in the world. eDRAM as CPU L2 cache is a completely new thing to these developers, and doing low-level API optimizations on a VLIW5/VLIW4 architecture is also completely unknown to them, let alone if it has its own custom OpenGL-like API from Nintendo, likely called GX2 as in the leaked dev kit information, plus it could have TEV units rather than TMUs to make things even worse for them. It is like the PlayStation 2: developers could not really wrap their heads around it in the first year and were practically writing zeros and ones, as I read on various forums. LMAO. At least the Wii U is still far simpler to program for than the Xbox 360 and PlayStation 3 nightmare, with its many serious bottlenecks and shortcomings.
If a poor company like AMD could do it, why is Nintendo so lazy about transitioning to the process? AMD and the smartphone manufacturers did it at the beginning of 2012! Deal with the fact that 40nm is pathetic by today's standards, then! | You need to deal with the fact that it is not pathetic even by today's standards. If it were 55nm it would be borderline, and 65nm would be pathetic by today's standards, while 40nm is now a mature and safe way to get larger chips fabbed. Can't you understand that? 32nm is nearly fully booked, and 28nm is booked to full capacity, primarily because of ARM chips, and it also comes down to the design. The CPU is 45nm and the GPU is 40nm, and when IBM makes 20nm fabs available for production chips, Nintendo will jump on it for their CPU around the time TSMC gets 20nm/16nm ready. Plus, did you forget about the eDRAM? From what I know, only Intel has managed eDRAM at 22nm, with the latest IGPs for Haswell, and eDRAM gets trickier to produce than eSRAM as the lithography gets smaller. When the IBM and TSMC 20nm fabs are ready, Nintendo will do what Microsoft did with the latest Xbox 360 chip: unify them into one and drastically reduce costs so that they can sell the Wii U for $250. I can only agree with you that Nintendo should have waited rather than jumped the gun early; they should have waited for 20nm.
The failure is the fact that you have shown absolutely no understanding thus far! | Well, what have your responses hit, other than the premise you keep trying to force?
Smash Bros. is only confirmed to be 1080p by Nintendo. http://www.eurogamer.net/articles/2013-06-11-super-smash-bros-for-wii-u-and-3ds-due-2014 That 60fps part doesn't have any backing from Nintendo themselves LOL. Uh oh, you might have to deal with 30fps! Bwahahaha | I don't have to deal with anything; just because they only said it is 1080p and did not comment on the framerate does not mean it is not 60fps. We are talking about a fighting game, and above all it is from Nintendo; Nintendo targets 60fps whenever possible, and that is their target for Mario Kart 8.
They all look average at best. I guess you're so used to sub-HD that you're feeling the leap that PS360 users felt in 2006 LOL. | I am sorry, but NeoGAF does not agree with you. :3 For your information, my only console was the original PlayStation, and since 2004 I have been gaming on my PC. I was at my sister's friend's house with her brother, who had a PlayStation 3 and played PES at 720p60fps, so I got my first feel of HD on a console in 2008-09. I then decided to up the resolution on my PC to 1280x1024 in Half-Life 2 and never regretted it, even with a performance hit that dragged me down to 30fps from a relatively stable 60.
I wonder how the rest will turn out. Remember that Nano Assault Neo game? It turned out to be not so impressive, if you ask anyone LOL. | Nano Assault Neo was impressive for using an engine that was not fully optimized for Wii U, a first-generation engine from the Wii adapted to run on Wii U, and it used only one core and a fraction of the Wii U's resources. The game is totally uncompressed and it is only 50MB. The visuals are impressive for the amount of resources it used, and it is 720p60fps. Don't underestimate the gods of the tech demo scene.
The unversioned CryEngine is even supported on the current-gen consoles! http://www.gamespot.com/articles/new-cryengine-revealed/1100-6413371/ So much for trying to differentiate the Wii U from the current-gen consoles LOL. | Yet you fail again: every engine can be downgraded/adapted to run on older and weaker hardware; Unreal Engine 2.5 worked on the Wii and 3DS. :) The Wii U is officially supported by Crytek's CryEngine 4, while the Xbox 360 and PlayStation 3 are not, and when the unversioned engine is brought to the Xbox 360 and PlayStation 3, it will be with features scaled down to DirectX 9/OpenGL 2.0, not DirectX 11/OpenGL 4.3... ;) So it is not the full CryEngine 3.5/4. :D
AMD always fabs their GPUs at TSMC, so if anything the quality isn't any different. Wow, you don't follow AMD like you claim in your YouTube videos! -_- And the increased density is small compared to a node transition; at best they can probably squeeze out 10%. | Hmmm... Care to tell me something I did not know? So you deny that the process/lithography quality can be improved or that higher-quality silicon can be used? Wow... Your forced premises and guesses are constant. I follow AMD and I follow their stock, yet it is not my primary focus compared to finding a job in this recession. Recent leaks point out that Kaveri/Steamroller will intentionally have lower floating-point yet higher integer performance, so floating-point processing will be offloaded to the GPU... This will hurt performance in older games. :/ So the Radeon HD 6570M is 104mm^2, so they could squeeze it under 100.71mm^2, to a die size of 93.6mm^2. Thanks for the information. The Chipworks guy said that the Latte GPU is heavily customized and that he could not tell what kind of AMD GPU it is, so it is supposedly an extremely customized design, by the way. You crushed your own theory, and that guy's theory that it is 160-200 shaders. LMAO
Thanks for proving that Nintendo is cheap when it comes to process nodes. *rolls eyes* BTW, the Wii U's processing components probably take less than 30 watts! http://www.anandtech.com/show/6465/nintendo-wii-u-teardown I assume that the CPU takes about 5 watts while the GPU might take up to around 25 watts. We PC users all know that a Radeon HD 5550's TDP is about 40 watts max! If we assume half of the shaders are non-functional, then that matches up exactly with the Wii U's power consumption, so ninjablade might be right after all! The Wii U is closer to the HD 5000 series than the HD 6000 series. The Wii U IS looking to have less than 200 GFLOPS in total! Deal with it! | Nope... It is 30 watts, at worst sometimes 35 watts, and the power supply is 75 watts at 90% efficiency, so it can handle 67.5 watts at most, and degradation of the power supply unit is negligible if it is not fully stressed, which would shorten its lifespan. How can we be certain that Nintendo did not lock off resources, just in case the system/OS is unstable? You need to deal with this... You are looking at PC GPUs, not mobile/embedded ones at all. Also, does that TDP include the GDDR3/GDDR5 memory and the board itself? If so, remove the GDDR3 (I know the 65nm 1GB parts use about 20 watts), or if it is GDDR5 (a 46nm 2GB configuration is 7.5-8 watts, so 512MB is about 1.75-2 watts), and the board itself could take a couple more watts. So look at the Radeon HD 6570M... a 30-watt rated TDP; if we remove the 1GB of 45nm GDDR3, power consumption goes down to 16 watts; then lower the clock from 600 to 550 and it goes down to about 13.6; then remove the board, which could use 1-2 watts, and it is 11.6-12.6; then add the 2-3 watts the 32MB of eDRAM uses and we are around 12.6-15.6. I might be wrong, you may say I am FOS, yet I might be right; or neither of us really knows what goes into a GPU's TDP figure, since GDDR3/GDDR5 consumes a noticeable amount of power and adds to the heat of the GPU.
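A minimal sketch reproducing the back-of-envelope arithmetic in that reply, using the post's own assumed figures (every number here is a guess from the post, not a measured value); it lands in roughly the same ~13-16W ballpark:

```python
# Reproduces the post's own back-of-envelope estimate for a hypothetical
# HD 6570M-class part; every figure below is the post's assumption.
rated_tdp_w = 30.0        # HD 6570M rated TDP, assumed to include memory
gddr3_w = 14.0            # assumed power of the 1 GB GDDR3 on the module
clock_scale = 550 / 600   # downclock from 600 MHz to 550 MHz
board_w = 1.5             # assumed board/VRM overhead
edram_w = 2.5             # assumed power of the 32 MB eDRAM

gpu_core_w = (rated_tdp_w - gddr3_w) * clock_scale - board_w + edram_w
print(f"~{gpu_core_w:.1f} W")  # ~15.7 W, near the post's 12.6-15.6 W range
```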
Nintendo is even more incompetent when it comes to designing hardware! That 32MB of eDRAM isn't exactly a waste. What is a waste is that Nintendo cheaped out on the hardware, so don't try to deny this! Just because Nintendo saw their mistakes does not mean they will learn from them! No, I don't love to underestimate; instead I overestimated, thinking Nintendo would have outright better hardware! Now you have only further changed my view of how weak the Wii U truly is! You did not make your case better; in fact, those cases were used against you! I think I have solved the puzzle of the Wii U's hardware mysteries! | Take a look at the GameCube, please... Look at the Xbox 360 and PlayStation 3... 'nuff said. It is a waste only if its potential cannot be saturated; also, why would I try to deny that they cheaped out on hardware? LMAO. So Nintendo did not learn from the Nintendo 64 to drop cartridges and make programming far easier? Is the GameCube a console with an architecture full of bottlenecks and cartridges for games? Nope... The GameCube has a simple architecture and a disc drive, though its own disc format was too small in capacity, so the Wii used fully fledged DVD discs. Yes, you do love to underestimate, don't deny it. I've seen people come and go and still underestimate a lot of things. You only set your expectations too high in the hardware department; look at the Wii. They did not try to build a powerhouse, so why would a successor that also has "Wii" in the name be a powerhouse? Use logic... Now, if the Wii U were weak, then Super Smash Bros. would not be 1080p60fps and Project CARS would have been cancelled for Wii U, yet it was not... You have not solved anything... That is the sad part. :/
SMH. Your YouTube channel says very differently, and so do the users at AnandTech. | What you see and what they say does not mean it is the truth, yet I won't stop you from fooling yourself, because that is your right! I deleted all of the videos on my channel; I decided to do a "restart". I am sick of fake console wars and fanboys; that war is futile and is only a war of attrition, so I will make some other content unrelated to that c***. You can call me a Nintendo fanboy, yet it will only make you look like a fool; I started gaming on PlayStation and I continue to game on my PC... My Steam library serves me well. :D |
Bolded is my response... Enjoy.