
Splinter Cell Blacklist Performance Comparison – PS3/Xbox360/Wii U

superchunk said:
ninjablade said:

(your general wii u hate)

I don't understand you.

Your hate is so strong that it ignores facts and doesn't even consider logical conclusions in game development.

1) This game is obviously, overall, the best graphically on Wii U.

2) This game has load times (in total time) not present in the other versions.

3) This game has GamePad gameplay that is arguably the best way to interact with the game.

4) This game was made on development engines that were created and optimized for PS360.

5) This game was then ported (well after the game started on PS360) by a separate team who had to learn the game code while tweaking it and the engine to work well with Wii U. (The engine was likely already tweaked somewhat by previous Wii U Ubisoft ports, but it still needed some work for this specific game.)

I don't see how this could possibly be negative for Wii U. Personally, I can't wait to see Watch_Dogs and how much better it is on Wii U vs PS360, since it is the exact opposite scenario: Wii U first (alongside the other next-gen consoles) on next-gen engines, while PS360 is late/rushed and down-ported.

BTW, I've always hated your biased and misinformed sig comparison. There is so much more to it than the raw numbers involved here, as well as you not respecting the raw number differences in Wii U's favor, especially GPU standards.

Wii U is 2x-3x better than PS360 in many ways, which is only 2x-3x less than Xbone, all the while still being far closer to Xbone due to the GPU's modern architecture (GPGPU).


I would just dismiss anything this guy says. Wii U specs still aren't known, but he is posting them as if they have been confirmed and trying to justify it by saying it's what someone else believes, so it must be true. I wouldn't take him seriously on anything.



dsgrue3 said:
Man, Wii U continues to struggle to maintain a competitive edge over 7th gen consoles.

Sad, really. Kinda like buying a new PC that can barely keep up with your old one from 7 years ago.



Not if you understand anything about programming, which you clearly don't.

If I took a 20-year-old PC game and ran it on my current-day PC, it wouldn't suddenly have 1080p graphics.

If a game isn't coded properly to take advantage of superior hardware, it's not going to run any better on superior hardware.

The saddest part is that this is obvious to anyone who knows anything about programming, which you clearly do not, yet you want to act like you do.

I also fail to see how a dev team taking a game that was coded for 7-year-old hardware and making it superior to that hardware's version, in a quarter of the time it took to make it for that 7-year-old hardware, is proof that the console is struggling, but I know you won't let facts get in the way of your blind hate.

What's sad is when you don't mind looking misinformed or unintelligent in order to hate on a console you don't like and know nothing about.



ListerOfSmeg said:

Not if you understand anything about programming, which you clearly don't.

If I took a 20-year-old PC game and ran it on my current-day PC, it wouldn't suddenly have 1080p graphics.

If a game isn't coded properly to take advantage of superior hardware, it's not going to run any better on superior hardware.

http://www.eduke32.com/

Or try some older games like Jedi Knight, System Shock 2, Titan Quest, Doom 3, GTA 3, Morrowind, Deus Ex, Ultima IX, which were barely able to run at 800x600 or 1024x768 on formerly "high-end" PCs... now you can run them in Full HD and higher with 16x AA.



Lens of Truth, lol, what a bunch of amateurs. Wait for the DF face-off.



Angelv577 said:
DieAppleDie said:
nobody cares about trophies
get real

Maybe not for Wii U-only owners, but it could matter for multi-console owners, and yes, I do care.

IRL, I know almost no one who cares about achievements/trophies. So in general I'd wager most peeps wouldn't make that a factor in whether or not to buy a game.



SubiyaCryolite said:
That's good to know. Part of me wonders why the PS3 had 7 active cores and the 360 had 6 active threads, each rated at 3.2GHz, if the Wii U's reported 1.2GHz tri-core CPU can handle 90% of the same games just fine. Seems like overkill/highly inefficient hardware from Sony and Microsoft.

Or it has absolutely nothing to do with the CPUs.

Next-gen hardware pushes nearly all the work onto the GPUs, and there the Wii U is a night-and-day difference in power/architecture/capabilities compared to the PS360. Sure, PS4/Xbone are far better, but at least they (the next-gen consoles) all share the same architecture and likely the same technology advances... just a difference in raw computing power.



curl-6 said:
unaveragejoe said:
curl-6 said:

It was most convenient for Nintendo to keep it PPC750-based though; that meant that to run Wii games they just had to use one of the three cores, downclock it to Wii speed, and use only 256KB of its L2 cache.

I do agree that it is an underestimated part though, and that with its GPGPU the console was clearly designed to share the load of traditionally CPU-assigned tasks.

From what I understand, the PowerPC 476FP could have been used. At the end of the day it is a custom Power-based chip, so maybe a little of this and a little of that could have been used. Also, in development of the Wii U they pondered a 1+1 design but chose not to. So I think the CPU they decided on was strong enough without the need for that when used properly.

If it's strong enough to run games like Bayonetta 2, X, and Pikmin 3, then it's strong enough for me.


Wii's CPU was strong enough on its own, and even though the Wii U CPU isn't just three overclocked Broadway processors like some people here think, even if it were, it would easily be better than the 360's CPU. The 729MHz Broadway core was almost as fast as one 3.2GHz core in Xenon, since it used a much shorter pipeline and an out-of-order design, compared to the PS3/360 processors that used an in-order design and a long pipeline (much like Atom and ARM11 processors). Wii U's CPU certainly isn't a problem; it is surely stronger than Xenon. Developers complaining about low performance on it were clearly using code optimized for the high-clock/low-IPC cores of the PS3/360, which obviously won't work well on a low-clock/high-IPC design.
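As a rough illustration of the clock-versus-IPC trade-off described above, here is a minimal sketch in C. The IPC (instructions per cycle) figures are made-up assumptions chosen only to show the shape of the argument; they are not measured values for Espresso or Xenon.

```c
/* Back-of-the-envelope sketch of the clock-vs-IPC argument.
 * The IPC values below are illustrative assumptions, NOT measured
 * figures for either CPU. */
#include <stdio.h>

int main(void)
{
    /* Assumed average IPC for typical game code on each core design. */
    const double espresso_clock_ghz = 1.24, espresso_ipc = 1.6; /* short pipeline, out-of-order */
    const double xenon_clock_ghz    = 3.20, xenon_ipc    = 0.6; /* long pipeline, in-order      */

    printf("Espresso core: ~%.2f billion instructions/s\n",
           espresso_clock_ghz * espresso_ipc);
    printf("Xenon core:    ~%.2f billion instructions/s\n",
           xenon_clock_ghz * xenon_ipc);
    return 0;
}
```

With those assumed numbers the two cores land in the same ballpark, which is the point being made: clock speed alone doesn't determine throughput.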



SubiyaCryolite said:
That's good to know. Part of me wonders why the PS3 had 7 active cores and the 360 had 6 active threads, each rated at 3.2GHz, if the Wii U's reported 1.2GHz tri-core CPU can handle 90% of the same games just fine. Seems like overkill/highly inefficient hardware from Sony and Microsoft.

The PS3 and 360 CPUs were designed more for brute force than efficiency. They get the job done by throwing tons of clock cycles at the task, and in the PS3's case, by having 7 SPUs to share the load.

Wii U's CPU, Espresso, is a very different beast. It's clocked low, but goes to great lengths to get as much performance as possible per clock cycle. It has 3 times as much on-board L2 cache memory as the 360, a shorter pipeline (which means fewer operations are wasted if it makes a mistake), a separate chip to handle audio, a GPGPU to help out as well, and out-of-order execution, so it's not restricted to doing things in a pre-set order and can instead do them in an order that's more efficient for the specific task.

Wii U's 1.2GHz core can keep up with its 3.2GHz competitors simply because it's built to focus on per-cycle efficiency.

EDIT: Ah Razordragon, great minds think alike! ;)
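To make the out-of-order point concrete, here's a tiny hypothetical C fragment (the function and data are invented purely for illustration): the second statement doesn't depend on the first, so an out-of-order core can work on the multiply while a slow, cache-missing load is still in flight, whereas a strictly in-order core has to sit and wait for the load to finish.

```c
/* Illustrative sketch of why out-of-order execution helps: the work on
 * 'unrelated' does not depend on the (potentially slow) memory load, so an
 * out-of-order core can overlap the two, while an in-order core stalls.
 * The function and values are hypothetical, for illustration only. */
#include <stdio.h>

static int lookup_and_compute(const int *table, int key, int a, int b)
{
    int looked_up = table[key];   /* long-latency if this misses the cache    */
    int unrelated = a * b + 7;    /* independent work an OoO core can execute
                                     while the load is still outstanding      */
    return looked_up + unrelated;
}

int main(void)
{
    int table[8] = {3, 1, 4, 1, 5, 9, 2, 6};
    printf("%d\n", lookup_and_compute(table, 5, 2, 10)); /* 9 + 27 = 36 */
    return 0;
}
```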



anthony64641 said:



Which it should be; the only thing that needs to be addressed is the loading issue. I don't think Ubisoft had the time to use the eDRAM sections, or they could have made those load times quicker.


eDRAM wouldn't have any effect on load times. Load times are determined by three main factors: I/O speed (disc/HDD etc.), RAM (size and speed), and the CPU, in that order. The data has to be read from the disc, stored in RAM, and usually decompressed or processed by the CPU and written back to main RAM. At 32MB, the eDRAM is too small to really hold much data, so it won't affect much.

The eDRAM's main use in the Wii U would be to hold the framebuffers: that's where the system stores the current frame as it is rendered (drawn) before it's displayed on screen, as well as the last frame (the one that is being displayed on screen, so that the system can keep sending that to the TV until the next frame is complete, instead of only updating part of the screen and causing a "tear"). As the Wii U has limited main RAM bandwidth (and framebuffers require a lot of bandwidth), the fact that the Wii U version has full VSYNC without tanking the framerate is a pretty good indicator that they are using the eDRAM, and probably in fact using triple buffering on Wii U.
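As a rough, assumption-laden sketch of the framebuffer point above: at an assumed 720p render target with 32-bit colour, even three colour buffers fit comfortably inside 32MB of eDRAM, while streaming them every frame adds up to bandwidth you'd rather not spend on the main RAM bus. The resolution, bit depth, buffer count, and frame rate below are illustrative assumptions, not confirmed figures for Blacklist.

```c
/* Rough framebuffer size/bandwidth arithmetic. All figures (720p, 32-bit
 * colour, triple buffering, 30fps) are assumptions for illustration only. */
#include <stdio.h>

int main(void)
{
    const double MiB = 1024.0 * 1024.0;
    const int width = 1280, height = 720;  /* assumed render resolution   */
    const int bytes_per_pixel = 4;         /* 32-bit colour               */
    const int buffers = 3;                 /* triple buffering, as argued */
    const int fps = 30;                    /* assumed target frame rate   */

    const double one_buffer = (double)width * height * bytes_per_pixel;

    printf("One colour buffer: %.1f MiB\n", one_buffer / MiB);
    printf("%d buffers total:   %.1f MiB of the 32 MiB eDRAM\n",
           buffers, buffers * one_buffer / MiB);
    /* Each frame is written once by the GPU and read once for scan-out;
     * real traffic is far higher once overdraw and depth buffers count. */
    printf("Write+scan-out traffic: ~%.0f MiB/s\n", one_buffer * 2 * fps / MiB);
    return 0;
}
```

That's also why the 32MB is no help for loading: it can hold a few render targets, but not the hundreds of megabytes of level data that have to come off the disc.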



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

Two times I posted, so........... whatever, man.