
Breaking News on PS4 - Kotaku reveals new info!!

So it's pretty much confirmed that next gen, for the first time ever, we'll be seeing the "homogenization" of the home console industry, where all major home consoles sport comparable specs and architecture based on slightly customized versions of off-the-shelf parts, rather than the radically different, proprietary hardware of just about every generation up to the current one.



On 2/24/13, MB1025 said:
You know I was always wondering why no one ever used the dollar sign for $ony, but then I realized they have no money so it would be pointless.

Chandler said:
pezus said:
Chandler said:
pezus said:
Chandler said:
That's good news. Now I can watch the glorious 180° people will make about a touchscreen on a Controller.

Yep

Sometimes I wonder if you're really a Vita owner


I don't know what the Vita has to do with this since it's not a controller and it would be kinda idiotic for Sony to release a handheld without a screen, but we are talking Sony so you never know, right? 

Vita is basically a controller though...with a screen. All things considered. It seems like this PS4 controller will have a touch pad similar to the touchpad on the back of the Vita, not with a screen like WiiU.

Also, you misspelled pezus ;o

 

Me misspelling you should give you a hint about how much I care about you and your opinion. And all things considered, your reasoning is pretty much insane. But I guess everything goes in the name of the good fight.

You care much more than you pretend. You posted a pic you took specifically for him after all.



mjk45 said:
theprof00 said:
hmm clickable controller?
I remember having that idea back at the ps3 launch.
My idea was basically a split middle section that allowed you to partially rotate the independent halves. The idea was to work with things like drifting in a racing game, throwing grenades/footballs with the left or right arm, and weapon selecting.
Anyway, sounds pretty great, and the RGB does indeed sound like some kind of PS Eye tracker.

The twistable controller has been done; it was called the neGcon. Namco made it mainly for racing games, and some people swore by it.

Huh, I guess it just goes to show that if you have an idea, somebody's probably already thought of it!



Chandler said:

 

Me misspelling you should give you a hint about how much I care about you and your opinion. And all things considered, your reasoning is pretty much insane. But I guess everything goes in the name of the good fight.

Keep the aggression down please; there's really no need for it.

The Vita could eventually act as a controller for the PS4, and while it wouldn't be THE main controller, it's not fair to call the idea insane or to suggest that peez is 'fighting the good fight' à la the Sony Defense Force. Slow your roll.

EDIT: wow, with a Layton theme... lot of animosity there



Honestly, if it comes from Kotaku, I don't even bother trying to believe it.

I'll wait for official confirmation.




LOL the controller thing is just too funny!

Other than that, this will be a monster... on the wallet, unless they're willing to lose a lot to make it affordable.

Anywho, looking forward to seeing this :D



    R.I.P Mr Iwata :'(

Wait, aren't dev kits packed with 2x the amount of RAM for debugging purposes? Don't get me wrong, more power to 'em, but I don't see this thing being sub-$400 at release. Guess Crytek won't be getting that 16-32 GB of RAM they wanted for next-gen machines after all.



Yes, dev kits almost always have more RAM than the consumer model, though I don't think it's 2x the amount these days (could be wrong though).

The 2.2 GB of graphics RAM might include the extra 256 MB as overhead for programmers/artists etc., which likely means a retail model with 2 GB of graphics RAM (GDDR5?).
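If you do the arithmetic on that rumor, it lines up. Just a back-of-envelope sketch; the 256 MB overhead figure is the speculation from above, not a confirmed spec:

```python
# Back-of-envelope check of the dev-kit RAM rumor.
# Assumption (from the post, unconfirmed): the 2.2 GB dev-kit figure
# includes ~256 MB of debug/tooling overhead a retail unit wouldn't ship with.
DEV_KIT_GRAPHICS_RAM_GB = 2.2
DEBUG_OVERHEAD_GB = 256 / 1024  # 256 MB expressed in GB

retail_estimate_gb = DEV_KIT_GRAPHICS_RAM_GB - DEBUG_OVERHEAD_GB
print(f"Estimated retail graphics RAM: {retail_estimate_gb:.2f} GB")
# → roughly 1.95 GB, consistent with a 2 GB retail figure
```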



I hope that Sony keeps its standard controller and makes the new controller optional. The WiiMote really hurt my Wii experience, because in most games motion controls simply don't work and end up hurting the overall experience.

The same goes for the Vita. Just play Assassin's Creed III: Liberation and you'll see how the added motion controls are simply annoying. Not only do they not offer any additional value to the game... they make the whole thing much more difficult. Just try to beat that puzzle within the time limit for full synch in Chichen Itza. Or have you ever tried to open those damn letters by using both touch screens, and it simply didn't work? Or the minigames where you have to hold the back camera towards a light source... it simply did not work until I turned the light off and it was totally dark. Or how they reworked pickpocketing.

AC3 Liberation was the game that clearly showed me that, for me, motion controls are only a gimmick, and for most games they simply can't offer a satisfying control input. I like them in games like Professor Layton or rail shooters... but keep them away from action adventures, or at least let me turn off this mess so that I don't have to cope with it.



dahuman said:
For my sanity, I really hope they are not using Bulldozer based cores, for fuck sake, that shit sucks.

Once you accept that none of the next-generation consoles will have an Intel Core i5/i7 due to cost and Intel's high profit margins, what's the next-best gaming processor in the world for a console that's expected to last 6-8 years? Bulldozer/Vishera. Say it isn't so?

Also, you forgot another key word: Context

On the PC, gamers care a lot about CPU performance because the most popular gaming genres are MMOs and RTS/strategy games, which are notoriously poorly multi-threaded and very CPU-limited. Furthermore, enthusiasts tend to spend $100-200 for small performance increases on the GPU side (like $100 extra to go from a GTX 670 to a GTX 680, or $200+ extra for 25% more performance to go from a GTX 660 Ti to a GTX 680, or an HD 7950 to an HD 7970 GHz Edition, etc.). For those users, especially if they overclock their CPUs, every little bit of performance and reduction in power at overclocked 4.5-5.0 GHz states matters.

None of these things apply to consoles: (1) console CPUs aren't overclocked, (2) consoles don't have RTS games or MMOs, (3) consoles won't use high-end or dual GPUs.

So why does context matter so much? Because when you pair Bulldozer/Vishera (FX-8150 through FX-8350) with a GTX 670 in a wide variety of "console-style" games, the performance difference between it and a Core i5/i7 is almost non-existent.

- In Alan Wake, ARMA II: Operation Arrowhead, Assassin's Creed 3, Battlefield 3, Crysis 2, F1 2012, Far Cry 3, GTA IV, Hitman: Absolution, Max Payne 3, Metro 2033, and Sleeping Dogs, you end up with this:

http://pctuning.tyden.cz/hardware/graficke-karty/25994-vliv-procesoru-na-vykon-ve-hrach-od-phenomu-po-core-i7?start=16

At 1920x1080 with anti-aliasing, when the FX-8120-8350 chips are paired with a GTX 670, they are only 2% behind in performance! In the type of games that are on consoles (non-MMO/RTS games), the console will be primarily GPU-limited. The context matters even more because it's almost a certainty that none of the next-generation consoles will have a GPU as powerful as a GTX 670. Essentially, if you were to put an FX-8000-series CPU with any GPU slower than a GTX 670, the console would be 95% GPU-limited for most of its life, assuming the games are running at 1920x1080 with some anti-aliasing that stresses the GPU.

Most professional reviews run CPU tests at useless resolutions like 1280x800 or 1680x1050 with no AA to show the differences in CPU speeds. That type of testing is absolutely meaningless if the consoles are targeting 1920x1080 with some AA, because the workload almost entirely shifts to the graphics sub-system. The slower the GPU in the consoles, the more the workload will expose the GPU bottleneck.
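To put rough numbers on the bottleneck argument, here's a toy model. All the per-frame timings below are made up purely for illustration; the only real point is the max(cpu, gpu) logic:

```python
# Toy model of a frame pipeline: the slower of the two stages sets the
# frame rate. Timings are invented for illustration, not benchmark data.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slowest stage paces the pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu_ms, slow_cpu_ms = 6.0, 8.0   # hypothetical i5 vs FX-8150 frame cost
gpu_1080p_aa_ms = 16.0                # GPU-heavy case: 1080p with MSAA

print(fps(fast_cpu_ms, gpu_1080p_aa_ms))  # 62.5 fps
print(fps(slow_cpu_ms, gpu_1080p_aa_ms))  # 62.5 fps — CPU gap invisible

# At a CPU-bound low resolution, the gap reappears:
gpu_720p_ms = 5.0
print(fps(fast_cpu_ms, gpu_720p_ms))  # ~166.7 fps
print(fps(slow_cpu_ms, gpu_720p_ms))  # 125.0 fps
```

That's the whole argument in four lines: once the GPU's frame cost dominates, swapping a faster CPU in changes nothing.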

The FX-8150 is fast enough to get you a 64 fps minimum and 81 fps average in Battlefield 3 at 1920x1080, 4xMSAA, Ultra quality settings with a GTX 690, barely behind a Core i5 2500K/2600K. Of course, we know the consoles will never have anything like a GTX 690 in them, which automatically means a GPU bottleneck with an FX-8150.

Conclusion: an FX-8000-series chip would not suck at all in a console -- it would actually be the 2nd-best choice after Intel's Core i5/i7s, far superior to any processor made by IBM or any Core i3 from Intel, and it would handily trounce an 8-core Jaguar by miles. If anything, you should hope and pray that the PS4 has an 8-core (4-module) Bulldozer/Vishera in it, not denounce it, because honestly that's the 2nd-fastest CPU you can get once you step outside Intel's offerings. Also, if you look at the FX-8320's retail price of $169 and what Sony could purchase this AMD processor for directly from AMD, it's pretty obvious Intel's offerings are out of the question, unless you're willing to pay $600-700 for a PS4 with a Core i7. But again, that wouldn't even matter unless the console had a GPU more powerful than a GTX 670...

I think console gamers need to reevaluate the context of this hardware more carefully. The PS3/360 rendered most games at 1280x720, and later even at lower resolutions (Black Ops 2 at 880x720, Uncharted 3 at 896x504, God of War 3 at 1152x640). Because of such low resolutions with minimal or no AA filters, the console's CPU speed mattered a lot more. Once you move to 1920x1080 and add 2-4x AA, in probably 90-95% of cases the console's GPU will become the bottleneck in non-MMO/non-RTS games, unless they pair an FX-8000-series chip with a GTX 680 or faster. Whichever console has the best graphics will be dictated by its GPU setup, not its CPU, assuming they don't put some slow, anemic CPU (aka Wii U) in them.
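For anyone curious, here's the pixel-count math behind that resolution jump, using the figures quoted above:

```python
# How much more raw pixel work 1080p demands versus the sub-HD
# resolutions from last gen (resolutions as cited in the post).
def pixels(width: int, height: int) -> int:
    return width * height

target = pixels(1920, 1080)  # 2,073,600 pixels per frame

sub_hd = {
    "Black Ops 2": (880, 720),
    "Uncharted 3": (896, 504),
    "God of War 3": (1152, 640),
}
for name, (w, h) in sub_hd.items():
    ratio = target / pixels(w, h)
    print(f"{name}: {ratio:.1f}x more pixels at 1080p")
# → e.g. Uncharted 3: 4.6x more pixels at 1080p
```

And that's before AA; add 2-4x MSAA on top of a ~3-5x pixel increase and it's easy to see why the workload shifts almost entirely onto the GPU.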