
Wii U graphics power finally revealed - "we can now finally rule out any next-gen pretensions for the Wii U"

S.T.A.G.E. said:
dahuman said:
DF jumped the gun to get hits on their site; at the time of their writing, and even right now, nothing is conclusive yet. They are good at analyzing "graphics," but they are no experts when it comes to analyzing hardware. I'll stick to my usual saying though: if you want some real fucking graphics, go PC or STFU about it, because all 8th gen consoles are already last gen or worse on the graphics level. They also don't understand why GCN doesn't mean as much for consoles; it was designed for PCs to start with.

 

Why should people forget that graphics are an issue for the inevitable change coming to Sony and Microsoft consoles? Sony is being directly threatened by Microsoft, and the two are playing a "who can do what the devs want best to have the best looking multiplats" game. Nintendo forgot that their hold on devs came directly from their handling of system power and from building a name that devs became dependent on, and when Sony came along, devs were free of it all. Consoles can impress graphically, but every gen we want to see a leap. Again, this leap isn't going to be as great, but Sony and MS are going to try as hard as possible to meet third parties' wishes, which will leave Nintendo in the dust where support for major titles is concerned; they will also probably take a hit on hardware to sell consoles. This might also screw up Nintendo's chances of ever getting another FF game in the coming gen.


I'll be honest, at this point I couldn't give less of a fuck about Final Fantasy or any SE game on consoles; I'm more interested in their handheld games right now. Basically, I'm seriously butthurt over what SE did during the 7th gen. I don't see the point in the race though: had they waited another year, just one fucking year, the graphical jump would have been monstrous, but they just had to rush it. I'd have been perfectly happy with 360 and PS3 level games for another year!



superchunk said:
Just to put it out there...

I do keep updating my comparison thread in case peeps forget.

http://gamrconnect.vgchartz.com/thread.php?id=136756

Southern Islands

The 192 GB/sec memory bandwidth sounds very suspect for PS4's GPU. That would mean 6GHz (effective) GDDR5, or the expensive and much more power-hungry 384-bit bus of the Tahiti XT chips. Neither 6GHz GDDR5 nor a 384-bit bus makes financial/architectural sense in the context of an HD7970M-class GPU. Also, if you widen the bus to 384-bit, you need more physical pins on the GPU package, which raises the price big time.

The HD7970M has 153.6 GB/sec of memory bandwidth, and that's the flagship HD7000 mobile GPU. Going above 154 GB/sec of memory bandwidth on a Pitcairn XT/Wimbledon XT GPU would be wasteful (20-30W of extra power consumption), a LOT more expensive (the GTX 670 has a 256-bit bus + 6GHz GDDR5 and that card costs $340+), and would add little to no performance since the actual GPU is the bottleneck. So it would defeat the purpose from the perspectives of performance/watt, performance scaling, and price. It would only make sense if the GPU was of HD7950 class or higher imo.

We'll see what happens, but to me a 192 GB/sec figure on a 256-bit PS4 GPU does not logically make sense.
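For reference, here's the back-of-the-envelope bandwidth math (a minimal sketch; the data rates and bus widths are the figures discussed above, not confirmed PS4 specs):

# memory bandwidth (GB/s) = bus width in bytes * effective data rate in GT/s
def bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    return (bus_width_bits / 8) * data_rate_gtps

print(bandwidth_gb_s(256, 6.0))  # 192.0 -> needs 6GHz GDDR5 on a 256-bit bus
print(bandwidth_gb_s(384, 4.0))  # 192.0 -> or a pricier 384-bit bus at 4GHz
print(bandwidth_gb_s(256, 4.8))  # 153.6 -> the HD7970M's actual figure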



On another note, analysis of the Wii U's GPU leads me to believe that it is composed of 1 billion transistors, which would make its GPU comfortably more powerful than the 360's and PS3's, but with only 1/2 and 1/4 the memory. What's the point, I ask? Little point if the power can't be used in-game due to low memory bandwidth. Let's say Nintendo is running games at 640x480 and scaling them up. Then it requires just ~2 MB of frame buffer memory, which is what it probably has via 2 MB of high-speed SRAM. That leaves 32 MB of eDRAM for caching textures and polygons and operating on them. Not enough.

32 MB is not enough to hold a full game's worth of assets. Thus it can add a few nice effects here and there, and that's it; normal mapping, for instance. More powerful than the 360, but not that much more powerful.
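As a rough sketch of the framebuffer arithmetic behind that ~2 MB figure (assuming 32-bit color and double buffering, ignoring depth and AA buffers; illustrative only):

# framebuffer footprint in MB for a given render resolution
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(framebuffer_mb(640, 480))    # ~2.3 MB for double-buffered 640x480
print(framebuffer_mb(1280, 720))   # ~7.0 MB at 720p
print(framebuffer_mb(1920, 1080))  # ~15.8 MB at 1080p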



superchunk said:
Just to put it out there...

I do keep updating my comparison thread in case peeps forget.

http://gamrconnect.vgchartz.com/thread.php?id=136756

And checking the prices: say all that is true. You will end up with people on here talking about THE POWA! And then average folks look at the $400-$500 range and go, "ARE YOU FREAKING NUTS!" Do people seriously expect to get sufficient sales at the $400-$500 price range to turn the current sagging generation's software sales around?



richardhutnik said:
And checking the prices: say all that is true. You will end up with people on here talking about THE POWA! And then average folks look at the $400-$500 range and go, "ARE YOU FREAKING NUTS!" Do people seriously expect to get sufficient sales at the $400-$500 price range to turn the current sagging generation's software sales around?

The 'average' folks are what's killing gaming in general: dumbed-down mainstream games with a focus on QTEs and cinematics over gameplay, short SP campaigns, truckloads of DLC/Season Passes, rehashed sequels, health regeneration, games that are way too easy, etc. All of this started once consoles went mainstream. If these so-called 'average' mainstream gamers stopped buying consoles because they are too expensive for them, only the hardcore gamers would remain. Ironically, console gaming would be better for it.

These 'average' consumers have no problem buying $300 tablets and iPhones on 2-year contracts but then complain about $400-450 prices for next gen consoles. If they don't want next gen consoles, let them leave, so developers can focus their efforts on fewer high quality games, as was the case in the past. Steam does just fine with a much smaller number of PC gamers (50 million users), which means you don't need a base of 240 million consoles to be profitable. If companies are spending $130+ million on developing a game and then $150 million on marketing it, they are doing it wrong. When small studios like Crytek and CD Projekt Red were able to make stunning games on small budgets (Crysis 1, The Witcher 2), there is no excuse why great games cannot be made for a fraction of COD: Black Ops 2's budget. The indie PC scene is thriving, which lends more evidence that you can make good games for 1/10th the cost. Trine 2 is an awesome game made by a small studio.

In the context of historical console pricing, $400-450 isn't that expensive assuming the hardware is next generation and there are new exciting games/IPs that take advantage of new technology.

http://www.gamasutra.com/view/news/177337/sizing_up_wii_us_price_tag_against_history.php#.URHOo6VZUeo

$400-450 would fall roughly at a mid-point relative to historical console prices adjusted for inflation.

If people don't want to pay $400-450+ for a next generation console, they can always wait 2 years for prices to drop. This is actually not a bad idea since bugs get worked out and the console's gaming library expands.

The funny part is the same people who complain about $400+ hardware don't talk about what is often a greater cost of console ownership over its useful life: software. For instance, games like COD: BO2 cost $59.99 and then come with $30-40 of DLC. Borderlands 2 was, what, $59.99 and had a $29.99 Season Pass?

The average person in NA doesn't even blink at $60 a month plus a $199-299 2-year smartphone plan but then complains about spending $400 on a console that will last 6-7 years with zero upgrades. The same people also pony up for Xbox Live Gold fees and amass a collection of 50+ videogames over those 6-7 years. If someone purchased an Xbox 360 on launch day and paid $40-50 a year for Xbox Live, by now they would have spent $280-350 on Xbox Live alone.
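Putting the post's own numbers together into a rough total cost of ownership (these are the figures quoted above, not verified prices):

# rough lifetime cost of a console using the figures from the post above
console_price = 400          # hypothetical next gen launch price
live_fee_per_year = 45       # midpoint of the $40-50/yr Xbox Live estimate
years = 7
games_bought = 50
avg_game_price = 60

total = console_price + live_fee_per_year * years + games_bought * avg_game_price
print(total)  # 3715 -> software and subscriptions dwarf the up-front hardware cost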

On another note, when the Xbox 360 250GB bundle (a 7+ year old console) is still selling for $299.99, it's mind-boggling that people are complaining about possibly paying $400-450 for next gen consoles.

http://www.bestbuy.com/site/Microsoft+-+Xbox+360+250GB+Bundle/6693058.p?id=1218782894450&skuId=6693058&st=xbox%20360%20250gb&cp=1&lp=2

I am guessing mathematics and capital budgeting decisions are not strong points of the average console gamer!

Furthermore, videogaming is among the cheapest hobbies compared to just about anything else out there. The average family spends more than $400 attending a single football game. What about learning to play sports like tennis, golf, or skiing/snowboarding? All way more expensive than videogaming. Gun ownership/shooting range, learning how to fly a plane, sailing, traveling, photography, wine tasting, china collecting? All more expensive than console gaming.



BlueFalcon said:
Soundwave said:
Iwata already said they don't see a future in pursuing higher and higher end graphics because the cost to develop games will outweigh the possible return in most cases.

If that were true, 3rd parties would have been more likely to abandon PS4/Xbox 720 and make games for the Wii U due to its lower costs. Instead they are already abandoning the Wii U and working on next gen games for PS4/Xbox 720. His theory isn't translating well to the real world.

If Wii U's games compared to PS4/720 look like Xbox 360 vs. PC today, people are going to think really hard before spending $300-350 on the Wii U:

http://www.youtube.com/watch?v=-a3eilZRlyk

Nintendo is going to need to drop the price to $229-249 by Q2 2014, I bet. By holiday 2014, I can see places like Costco having the Basic Wii U for $199, unless Nintendo launches an amazing 1st party line-up of games like the ones we saw during the SNES/N64 days.

I would imagine the gap is going to be larger than that Crysis 3 vid shows. While the textures are higher res on the PC and it has better lighting/shadowing, the poly count is probably about equal for both versions. However, I would imagine the PS4/NeXbox are going to be able to push a much higher number of polygons than the Wii U.



Some people have been pointing to the reference to R600 in the stuff from Marcan to conclude that the Wii U GPU is R600-based. There are a few problems with that theory.

For one, R600 was manufactured at 55nm at best. For another, the 3850 was a 190 mm^2 die, way too big. More importantly, the command he gave was a Linux command. Linux runs GPUs using drivers, and those drivers don't always have names matching those of the chip they're driving. For instance, the R600 graphics driver for X.org covers not just R600/700, but also Evergreen and Northern Islands chips. Evergreen and Northern Islands are 40nm chips. Given the way that drivers work, it seems likely that this means that register names, etc, also match.

The "earliest" chip to have 40 nm fabrication was an R700 chip, specifically the 4770. Of course, since it had 640 SPs, ran at 750 MHz, and drew 80 W of power, we can rule that chip out, too. No other R700 chips were 40 nm.

That brings us to Evergreen, which happens to have a suitable chip: the 5570, AKA Redwood LE. It has the same clock speed and an SP count matching the one currently speculated for the Wii U GPU, and a power draw maxing out at 39 W, a number that could probably be brought down by more modern alterations.

One of the oddities is that many are saying the Wii U GPU's memory bandwidth is 12.8 GB/s... this is strange, as that number is listed for the 5570 only when using DDR2 memory, whereas the GPU uses GDDR3 memory, which is listed with twice the memory bandwidth.

Of course, this isn't a 5570, either. As has been said so many times, this GPU is not a standard GPU by any measure. This leads me to suspect that it'll turn out to be closer to a more powerful GPU, but with a lowered clock and fewer SPs. Something like, say, the 6570M, which has 400 SPs, a clock speed of 650 MHz, and a power draw of 30 W. With a reduction from 400 SPs to 320 SPs and a reduction in clock speed from 650 MHz to 550 MHz, you would expect a much lower power draw, while still managing 352 GFLOPS.

But this is speculation on my part, based on mathematics rather than specific knowledge of how GPUs work, and what influences power - all I know is that underclocking often decreases power usage more than linearly, and that you'd expect a reduction in power usage with reduction in number of transistors.
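A minimal sketch of the GFLOPS arithmetic behind that 352 figure (assuming the usual 2 FLOPs per SP per clock for these parts; the 320 SP / 550 MHz configuration is the speculation above, not a confirmed spec):

# peak shader throughput in GFLOPS = SPs * 2 ops per clock * clock in MHz / 1000
def gflops(stream_processors, clock_mhz):
    return stream_processors * 2 * clock_mhz / 1000

print(gflops(400, 650))  # 520.0 -> a stock 6570M-class configuration
print(gflops(320, 550))  # 352.0 -> the cut-down config speculated for the Wii U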



zero129 said:
Question: even if the Xbox 8 doesn't have as much power on paper, if it has better looking 3rd party games, will you still go for the PS4?

Assuming both the Xbox 720 and PS4 share the same CPU, and the PS4 maintains at least a 50% lead with its AMD GPU, the above scenario is a near mathematical impossibility unless MS pays 3rd parties to purposely sabotage their ports to PS4. Since the GPUs are likely coming from the same AMD family (i.e., the HD7000 series), it's impossible to make up for the graphical performance gap. This would be no different than comparing two PC gaming systems with identical CPUs where one has a GPU that's 50% faster.
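For intuition, this is the kind of scaling a 50% GPU gap implies in purely GPU-bound scenes (a toy calculation under the assumptions above, ignoring CPU, memory, and API differences):

# in a purely GPU-limited scene, frame rate scales roughly with GPU throughput
def gpu_bound_fps(base_fps, gpu_speedup):
    return base_fps * gpu_speedup

print(gpu_bound_fps(30, 1.5))  # 45.0 -> the same scene runs ~50% faster
print(1920 * 1080 * 1.5)       # ~3.1M pixels -> or ~50% more pixels/effects at the same 30 fps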

Some possible scenarios in which Xbox 720's games could look better: the CPU in the Xbox 720 is much better, the OS on the PS4 is horribly inefficient and negates the processing power advantage of PS4's GPU, developers can't figure out how to recreate "DX11" effects in PS4's OS environment, or the PS4 ships with a severely crippled amount of system memory.

Considering Xbox Live fees vs. free PSN (or PS+ offering way more features, like free games, than Xbox Live), and if the PS4 has the superior GPU, MS is going to have to do something spectacular or market the **** out of their console to make it worth buying over the PS4 if their prices are similar. Without the PS4's 1 year delay and $600 price tag, the key advantages MS had with the 360 over the PS3 are completely gone.

Since the Xbox 360 is now getting outsold by the PS3, we already know that with equal prices and MS's thin 1st party variety, MS doesn't stand a chance against Sony if they just release the 720 without something that makes it stand out. It could be anything: a TV cable discount bundle, some cool live movie/TV show recording functionality, Kinect 2.0 bringing some revolutionary gaming experience to the living room, etc.



BlueFalcon said:

Assuming both the Xbox 720 and PS4 share the same CPU, and the PS4 maintains at least a 50% lead with its AMD GPU, the above scenario is a mathematical impossibility unless MS pays 3rd parties to purposely sabotage their ports to PS4. Since the GPUs are likely coming from the same AMD family (i.e., the HD7000 series), it's impossible to make up for the graphical performance gap. Some possible scenarios in which Xbox 720's games could look better: the CPU in the Xbox 720 is much better and/or the OS on the PS4 is horribly inefficient and negates the processing power advantage of PS4's GPU, or developers can't figure out how to recreate "DX11" effects in PS4's OS environment.

Considering Xbox Live fees vs. free PSN (or PS+ offering way more features, like free games, than Xbox Live), and if the PS4 has the superior GPU, MS is going to have to do something spectacular or market the **** out of their console to make it worth buying over the PS4 if their prices are similar. Without the PS4's 1 year delay and $600 price tag, the key advantages MS had with the 360 over the PS3 are completely gone.

Seems like the Nextbox has "special sauce" units to help free up the GPU... so less work for the GPU... in the end the performance is near the PS4's.

It's like this...

PS4 power: 5 tasks per cycle
720 power: 3 tasks per cycle

You have to run 5 tasks on both GPUs, but the Nextbox can do 2 of those tasks on the "special sauce" units... so in the end both give you the same performance. Of course that's a dummy example lol... and these tasks are not shader work at all (they can be graphics-related, just not shader-specific)...

Another point is that the PS4 will reserve only 896 shader units (1.4 TFLOPS) for graphics... the other 256 shader units (400 GFLOPS) will be free for developers to use for anything (graphics, GPGPU, etc.).

In the end, what will show how well the consoles perform is optimization and full use of the hardware.
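A quick sketch of the shader split math quoted above (assuming 1152 total shader units, roughly an 800 MHz clock, and 2 FLOPs per unit per clock; these follow the rumored figures in the post, not confirmed specs):

# peak throughput in TFLOPS = shader units * 2 ops per clock * clock in MHz / 1e6
def tflops(shader_units, clock_mhz):
    return shader_units * 2 * clock_mhz / 1e6

print(tflops(896, 800))   # ~1.43 -> the portion reserved for rendering
print(tflops(256, 800))   # ~0.41 -> ~400 GFLOPS left over for GPGPU or extra graphics
print(tflops(1152, 800))  # ~1.84 -> the full GPU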



thismeintiel said:
I would imagine the gap is going to be larger than that Crysis 3 vid shows. While the textures are higher res on the PC and it has better lighting/shadowing, the poly count is probably about equal for both versions. However, I would imagine the PS4/NeXbox are going to be able to push a much higher number of polygons than the Wii U.

Ya, the actual single player footage looks way better. Watch these 2 videos in 1080P. I expect PS4/720 to have real-time graphics at least at this level.

http://www.youtube.com/watch?v=iflmHVj1dzU

http://www.youtube.com/watch?v=OuXHlMy-hqM