
Apple A9X: The Mobile Processor That Outperforms a Wii U?

spemanig said:
Soundwave said:

I've had kids tell me straight up they like the iPad better than the 3DS, lol. My nephew and nieces when they come over, they want to play on the iPad first and foremost, they like it better than the 3DS. The games are easier to play and the larger display makes it easy to point and touch for kids. And when they're bored of the games they just switch on over to watching cartoons on it, it's an ideal device for a wide variety of media consumption. 

I don't really think these "poor kids" are being held hostage and being forced to play these tablet/phone games, lol. They genuinely like them. I know "hardcore game" fans cannot grasp this, but I really think it is the truth. We're the ones living in a bubble, not them. Most kids do not give a crap about the lack of buttons, if anything it makes the tablet more friendly and more approachable. When they get older and want "deeper" games, they start wanting a PlayStation or Xbox, as kids invariably get closer to 6th/7th grade and being "cool" becomes a life obsession. 

8th gen ports? I think they need 9th gen ports. As in Dragon Quest XI, not just X. Kingdom Hearts III. 

But what I think we'll actually get is another ho-hum Nintendo product, probably underpowered. They'll bundle it with Amiibo and try to sell it as a toy device. It'll have shared games, and it'll do OK on the market, nothing mind-blowing. They would need another Wiimote-type miracle revolution to significantly alter their fate now, I think. The games will be fun though, as they always are on every Nintendo hardware. 

When I was growing up, Mickey Mouse was basically irrelevant to me, and Disney as a whole was largely irrelevant. Super Mario and Bart Simpson and the Ninja Turtles were the hot new thing then; no kid gave a crap about Mickey Mouse. It wasn't until Disney made The Little Mermaid that a reinvention of the company began, one that led to things like Aladdin and The Lion King, and then they were relevant again, but not because of Mickey Mouse. 

If I'm a parent as well, I'm not stupid, I'm not buying them that Nintendo handheld. Why should I? So I can then be on the hook for my crying kids asking for $30-$40 games? Nah. Just giving them the old iPad and letting them play free games sounds a whole lot better. The kids aren't complaining and neither are the parents and I just don't think Nintendo has an answer to this. 


And I've had kids tell me they like the 3DS better than their smartphones. I work at a summer camp. Your anecdotes don't mean anything. I never said they were being held hostage or that they were "poor kids." I never said they didn't genuinely like them. I said that they aren't actively choosing the platform based on what they'd prefer, but on what their parents give them. I loved playing online browser games on my computer when I was a kid. That's what I played the most, far more than superior console games or handheld games. I don't think I had a "poor" gaming upbringing because of that, and I don't think kids who only have access to tablets and phones have it either. Doesn't change the fact that that's the only reason kids play those platforms more. Not because they like them more, but because parents do.

And you're right. Parents aren't stupid. They are giving their kids cheaper mobile experiences rather than handheld and console games. That doesn't mean either market is diminishing. There are still plenty of parents who definitely feel that a dedicated gaming toy is the right path to take, as proven by the wonderful 3DS sales. Nintendo doesn't need to kill smartphones. They just need to justify a parent buying both, which they already do fantastically now. They'll do even better with the NX and their new smart consoles.

DQ11 and KH3 are 8th gen games, not 9th gen. Don't know what you're on about there, so I'll just skip.

The unified platform is the "Wii-like miracle." Not merely because the games will work across two platforms, but because of the entire modernization of the console concept with these "smart consoles." I think it'll definitely do better than the 3DS. Significantly so. And it'll also have an everlasting lifespan thanks to the way they'll make frequent hardware upgrades, much like Apple does with its iPhones.

I don't know what alien era you grew up in, but I grew up right in the era of movies like TLK and TLM, and Mickey Mouse was still king of them all. He didn't need to be in the newest Disney movie to be, and I can't remember a single day in my entire life when he wasn't the king of relevancy, especially in my childhood. Everyone knows and loves Mickey.


I definitely think handhelds are declining, and I think there is some massive denial on this board about it, but it will become pretty much undeniable by everyone fairly soon. Nintendo being forced to make smartphone games is basically a tacit admission on their part that smartphones have taken a large chunk of their audience away from them; now they have to bend and go make these games to try and get a few of those people back (which I don't think will work either, though Nintendo will make a ton of money on smartphone apps; it may even become their no.1 revenue/profit source in 3-4 years). 

The unified platform is OK, but I don't think it's the magic cure-all for all of Nintendo's problems. It's probably the only option they have left if they want to continue with dedicated hardware; it's either that or ditch consoles to focus on handhelds entirely. I view that more as a necessity than a choice at this point. 

No publisher is capable of making 22-24 HD-quality games and maintaining a high level of quality for said games every year, certainly not Nintendo, and that's what would be required if they wanted to keep making discrete home consoles and portable consoles. They can't even support the Wii U and 3DS as it is. 

I grew up in the 80s/90s. I didn't give a shit about Mickey Mouse, a Mickey Mouse toy would be like no.50 on my want list. Sure I knew who Mickey Mouse was, but it was all about Transformers, GI Joe, Thundercats, Super Mario, Teenage Mutant Ninja Turtles, The Simpsons, Tim Burton's Batman, etc. etc. 




I was under the impression that the A8X chip from last year was more or less similar in performance to the PS3/360/Wii U. So I would expect the A9, or better yet the A9X, to beat them by a fair amount. Not quite PS4/Xbox One though, considering those are much better than their predecessors.



e=mc^2

Gaming on: PS4 Pro, Switch, SNES Mini, Wii U, PC (i5-7400, GTX 1060)

Solid-Stark said:
I was under the impression that the A8X chip from last year was more or less similar in performance to the PS3/360/Wii U. So I would expect the A9, or better yet the A9X, to beat them by a fair amount. Not quite PS4/Xbox One though, considering those are much better than their predecessors.


The question is: similar in performance in which way? Integer speed? FLOPS? IPC? Branch prediction? Memory copies?



invetedlotus123 said:
Gaming is actually getting kind of big on tablets. I hope that power makes developers port their games to the iPad. Heck, the iPad Pro could actually handle GTA V better than many PCs with proper optimization.


Playing GTA 5 on a touchscreen would be such ass though.



Soundwave said:

The whole userbase and development community shouldn't have to be penalized with weaker hardware because some old fart wants to play a DS game from 2005. 

What? lol. If someone was 12 in 2005, they'd be 22 years old today. Not really what I'd call an "old fart". haha



Solid-Stark said:
I was under the impression that the A8X chip from last year was more or less similar in performance to the PS3/360/Wii U. So I would expect the A9, or better yet the A9X, to beat them by a fair amount. Not quite PS4/Xbox One though, considering those are much better than their predecessors.


It probably can produce something on the screen that's close if you allow it to cheat on resolution. 

That said it's probably also a chip that would generate a lot of heat and would certainly require a very large battery. 

It would not be possible in something the size of the current 3DS XL, I don't think. Not without melting the inside of the casing. 

You need some legroom and quite possibly even a fan (just in case) to get that level of performance. 

For reference, the iPad Mini's battery is massive compared to the current 3DS battery, and even that, at say 8 watts of power draw, would only give you about 3 hours of battery life. 
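For the back-of-the-envelope version: battery life is just capacity divided by average draw. The pack sizes below are rough assumptions (roughly 24 Wh for an iPad Mini-class pack, which is what the 8 W / 3 h figure above implies, and roughly 6.5 Wh for a 3DS-class pack), not official specs.

```python
# Rough battery-life estimate: hours = capacity (Wh) / average draw (W).
# Capacities below are assumed ballpark figures, not official specs.

def battery_life_hours(capacity_wh: float, avg_draw_w: float) -> float:
    return capacity_wh / avg_draw_w

ipad_mini_wh = 24.0  # assumed iPad Mini-class pack, implied by the 8 W -> 3 h claim above
handheld_wh = 6.5    # assumed 3DS-class pack (~1750 mAh at 3.7 V)

print(battery_life_hours(ipad_mini_wh, 8.0))  # ~3.0 hours at an assumed 8 W draw
print(battery_life_hours(handheld_wh, 8.0))   # ~0.8 hours: why a console-class chip needs a much bigger battery
```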



Eddie_Raja said:
Pemalite said:


No. It doesn't.
You have this thing called "Prediction" where you predict the data you are going to require ahead of time; it's a technique which has been used to varying degrees for decades, and it's extremely effective on fixed-hardware devices such as consoles for obvious reasons.
Conversely, both devices will be streaming a significant amount of data not from RAM, but from mechanical and optical disc storage, which is stupidly slow; we saw that put to great use last generation.

The real limitation to the Xbox One is not Bandwidth, it's actually the reduced GPU resources used to draw all the pretty things on your screen.
Look at other GPU designs in the PC space as an example: AMD's Fury has an abundance of bandwidth, more than any other graphics card to ever exist, more than several GeForce cards combined... yet it sees minimal benefit from it. Why? Because there is not enough hardware to make use of such a wide and fast highway.

Haha, that extra bandwidth IS going to be used at 4K and will make a massive difference in a year or two when it is actually utilized. Just look at the 7970 vs the 680 as an example. At first everyone acted like the extra bandwidth in the 7970 was wasted, but once games actually started needing to feed that much data, the 680 fell behind, so far behind that a 7970 pretty much matches a 780 now.

Not to mention that the ESRAM isn't even much faster than the GDDR5 the PS4 is using. Usually this kind of special RAM is 2-4x faster (or more), but it's not even 50% faster. In fact it is slower than the 10MB of eDRAM in the Xbox 360.


I already game at resolutions higher than 4k. (7680x1440 eyefinity to be exact.)

And I have four Radeon R9 290's in crossfire. Before that, Triple Radeon 7970's. Before that... Radeon 6950's unlocked into 6970's.
And I will likely buy into AMD's Fury successor on the 1x node.
Guess where I see the largest performance increase? It's not from overclocking my GPU RAM. It's from the core clocks; I am compute- and ROP-bound before memory bandwidth becomes the limit.

If you are thinking the PlayStation 4 will be playing Crysis 4... at 4K, with everything dialed up to 11... you are dreaming. No console has the power to do that.

Also...

Check out the 4k benchmarks with nVidia's Titan X (336GB/s of bandwidth) vs AMD Fury (512GB/s of bandwidth)
Despite the nVidia Titan having 176GB/s less bandwidth than AMD's Fury... It still wins the vast majority of benchmarks.
But don't take my word for it...
http://anandtech.com/bench/product/1513?vs=1447

Again. You reach a point where more bandwidth does nothing, because the hardware cannot make use of it.
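If you want the back-of-the-envelope version of that, here's a rough roofline-style sketch. The compute ceiling, bandwidth figures, and FLOPs-per-byte below are made-up placeholders, not specs for any real card.

```python
# Roofline model: attainable throughput is capped by whichever is lower,
# the raw compute ceiling or (memory bandwidth x arithmetic intensity).
# All numbers below are illustrative placeholders, not real GPU specs.

def attainable_gflops(peak_gflops: float, bandwidth_gbs: float, flops_per_byte: float) -> float:
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

peak = 4000.0    # hypothetical compute ceiling, in GFLOPS
intensity = 8.0  # hypothetical workload: FLOPs performed per byte fetched

print(attainable_gflops(peak, 336.0, intensity))  # 2688 -> bandwidth-limited
print(attainable_gflops(peak, 512.0, intensity))  # 4000 -> compute-limited
print(attainable_gflops(peak, 800.0, intensity))  # still 4000 -> extra bandwidth does nothing
```

Once the bandwidth term climbs past the compute ceiling, adding more of it changes nothing; the shaders and ROPs become the wall.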





www.youtube.com/@Pemalite

Eddie_Raja said:
Pemalite said:


No. It doesn't.
You have this thing called "Prediction" where you predict the data you are going to require ahead of time; it's a technique which has been used to varying degrees for decades, and it's extremely effective on fixed-hardware devices such as consoles for obvious reasons.
Conversely, both devices will be streaming a significant amount of data not from RAM, but from mechanical and optical disc storage, which is stupidly slow; we saw that put to great use last generation.

The real limitation to the Xbox One is not Bandwidth, it's actually the reduced GPU resources used to draw all the pretty things on your screen.
Look at other GPU designs in the PC space as an example: AMD's Fury has an abundance of bandwidth, more than any other graphics card to ever exist, more than several GeForce cards combined... yet it sees minimal benefit from it. Why? Because there is not enough hardware to make use of such a wide and fast highway.

Haha, that extra bandwidth IS going to be used at 4K and will make a massive difference in a year or two when it is actually utilized. Just look at the 7970 vs the 680 as an example. At first everyone acted like the extra bandwidth in the 7970 was wasted, but once games actually started needing to feed that much data, the 680 fell behind, so far behind that a 7970 pretty much matches a 780 now.

Not to mention that the ESRAM isn't even much faster than the GDDR5 the PS4 is using. Usually this kind of special RAM is 2-4x faster (or more), but it's not even 50% faster. In fact it is slower than the 10MB of eDRAM in the Xbox 360.


Eddie, in which imaginary world of facts do you live, seriously? 4K on PS4? I mean, it's ok if you prefer Sony products but this is something that goes well beyond a normal fan.

But here is just some food for thought, which could also make a difference: how many wait states are involved in reads and writes for GDDR5 data, and how many for ESRAM? Do you think they are the same?
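To make the wait-state point concrete, here's a toy model of a single client doing dependent, non-overlapped accesses: peak bandwidth only counts during the burst cycles, while the wait states are dead time. The cycle counts below are invented purely for illustration (real GPUs also hide a lot of this latency by keeping many requests in flight), and the peak figures are just the commonly quoted ones.

```python
# Toy model: effective throughput of one memory client doing dependent,
# non-overlapped accesses. Peak bandwidth only applies during burst cycles;
# wait states (cycles spent waiting for data) are dead time.
# Cycle counts below are invented for illustration only.

def effective_bandwidth_gbs(peak_gbs: float, wait_cycles: int, burst_cycles: int) -> float:
    return peak_gbs * burst_cycles / (wait_cycles + burst_cycles)

print(effective_bandwidth_gbs(176.0, wait_cycles=40, burst_cycles=8))  # "GDDR5-like": ~29 GB/s
print(effective_bandwidth_gbs(109.0, wait_cycles=4,  burst_cycles=8))  # "ESRAM-like": ~73 GB/s
```

With numbers like these, the pool with the lower peak can still come out ahead for latency-sensitive access patterns, which is exactly why wait states are worth asking about.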



Pemalite said:
Eddie_Raja said:

Haha, that extra bandwidth IS going to be used at 4K and will make a massive difference in a year or two when it is actually utilized. Just look at the 7970 vs the 680 as an example. At first everyone acted like the extra bandwidth in the 7970 was wasted, but once games actually started needing to feed that much data, the 680 fell behind, so far behind that a 7970 pretty much matches a 780 now.

Not to mention that the ESRAM isn't even much faster than the GDDR5 the PS4 is using. Usually this kind of special RAM is 2-4x faster (or more), but it's not even 50% faster. In fact it is slower than the 10MB of eDRAM in the Xbox 360.


I already game at resolutions higher than 4k. (7680x1440 eyefinity to be exact.)

And I have four Radeon R9 290's in crossfire. Before that, Triple Radeon 7970's. Before that... Radeon 6950's unlocked into 6970's.
And I will likely buy into AMD's Fury successor on the 1x node.
Guess where I see the largest performance increase? It's not from overclocking my GPU RAM. It's from the core clocks; I am compute- and ROP-bound before memory bandwidth becomes the limit.

If you are thinking the PlayStation 4 will be playing Crysis 4... at 4K, with everything dialed up to 11... you are dreaming. No console has the power to do that.


Of course they can. It just would take several seconds to render a frame :)



Soundwave said:
Solid-Stark said:
I was under the impression that the A8X chip from last year was more or less similar in performance to the PS3/360/Wii U. So I would expect the A9, or better yet the A9X, to beat them by a fair amount. Not quite PS4/Xbox One though, considering those are much better than their predecessors.


It probably can produce something on the screen that's close if you allow it to cheat on resolution. 

That said it's probably also a chip that would generate a lot of heat and would certainly require a very large battery. 

It would not be possible in something the size of the current 3DS XL, I don't think. Not without melting the inside of the casing. 

You need some legroom and quite possibly even a fan (just in case) to get that level of performance. 

For reference, the iPad Mini's battery is massive compared to the current 3DS battery, and even that, at say 8 watts of power draw, would only give you about 3 hours of battery life. 


We would assume all other things equal, or so I thought the OP implied, as with the Nvidia Shield console, which has a Tegra X1 (better than the A8X) and, if I recall correctly, edges out the PS3/360. Though the K1 beat the 7th gen consoles in compute performance, it was still behind in actual game performance, but the X1 is a different story, I think. Therefore the A9X (better than the X1) should do it, no problem.
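For anyone wondering how these ballpark comparisons are usually made: peak FP32 throughput is just ALU count x 2 (for a fused multiply-add per cycle) x clock. The core counts and clocks below are the commonly cited estimates for each chip, so treat them as assumptions rather than official specs.

```python
# Ballpark peak-FP32 estimate: GFLOPS = ALUs * 2 (fused multiply-add per cycle) * clock (GHz).
# Core counts and clocks below are commonly cited estimates, not official specs.

def peak_gflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz

print(peak_gflops(256, 1.0))   # Tegra X1-class: ~512 GFLOPS (assumed 256 ALUs @ ~1 GHz)
print(peak_gflops(240, 0.5))   # Xbox 360 Xenos-class: ~240 GFLOPS (assumed 240 ALUs @ 500 MHz)
print(peak_gflops(160, 0.55))  # Wii U-class estimate: ~176 GFLOPS (assumed 160 ALUs @ 550 MHz)
```

Of course, peak GFLOPS says nothing about memory bandwidth, drivers, or thermals, which is why the K1 could win on paper and still lose in actual games.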



e=mc^2

Gaming on: PS4 Pro, Switch, SNES Mini, Wii U, PC (i5-7400, GTX 1060)