
Xbox One Just went Super Saiyan with Unlocked CPU Power

DonFerrari said:
Insidb said:
DonFerrari said:
Insidb said:

I have a feeling that this is a byproduct of MS being waaaaaaay better at marketing and knowing how to market. Massive publicly traded companies like them know how to feed the news cycle to drive market valuations up, regardless of the actual value of said news. Sony is just like, "We don't have to do ANYTHING and we're still winning? That's super-dee-duper cool. Back to pachinko!"


They didn't make as much fuss over the X360 at the time... I think it's more of a paid ad push to cover for their underpowered console... While their heads were always touting that there wasn't much difference, downplaying the gap in GPU power, at the same time we get reports of how much improvement is being made... they are just trying to make the console not look much weaker than the PS4. If the X1 was selling well and was stronger they wouldn't invest so much time in it.

I agree and think that they are one and the same. Perception is not reality; reality is reality. Perception, however, can influence real outcomes. As long as people believe that the gap is small and closing, they will have achieved their goal.


And we can't even judge MS for the practice; they are only working on perception without clearly lying, just a light spin and misdirection, all to downplay the gap. And in fact, for most gamers the gap won't matter too much.

Exactly.






Galahaddy said:


Did you know the original PS3 could run Linux and the Firefox browser when you hacked it? :)



Potential unlocked? :P



Norris2k said:

My main point is that virtualization and 3 OSs (including Windows) cost more than one native, plain old BSD (which is a very light OS). And that's a fact. So, Sony could also give one more CPU core... but! What I consider a "high cost" is not a CPU cost, but the loss of reactivity coming from no longer having 2 reserved cores for the OS and apps. I'm not exactly sure if they will run the OS and apps on one single core, which seems very unresponsive, or be able to borrow resources from the 7th core, which would lower the impact.

BTW, I'm not aware of a clear CPU bottleneck on the PS4. Only Ubisoft talked about that on AC... and it ended with about no difference. The GPU and memory bottleneck on the XBone is, on the contrary, pretty clear: most games look better and have a higher resolution on the PS4.
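For a sense of scale on the resolution point, here is a quick pixel count in Python (900p is just a common sub-1080p target used as an illustrative example here, not a figure from the post):

# How much more work per frame 1080p is versus a common sub-1080p target.
full_hd = 1920 * 1080       # 2,073,600 pixels
sub_hd = 1600 * 900         # 1,440,000 pixels (900p, illustrative example)
ratio = full_hd / sub_hd    # 1.44: 1080p shades 44% more pixels each frame
print(full_hd, sub_hd, round(ratio, 2))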

 


Of course only developers aiming for true next-gen will talk about that. Ubisoft bet on worlds with highly connected AIs and now they can't realize them as they wanted.

If PS4 developers only polish last-gen software they won't complain, as they will be happy with what the PS4 has to offer.




Sony's never confirmed the CPU clock rate for their processor. Anyone saying the X1 has a faster CPU doesn't have facts to back it up, just speculation.

I've read articles stating that the PS4 CPU base clock is 1.6GHz and clocks upwards based on need. Does that make it true? Hardly - there is no confirmed clock speed for the PS4, just generally accepted statements.



People rejected Kinect, so of course they are going to free up the system. The difference between the two systems is just as funny as this article. You could pause the TV with the same game running on both, point out the differences, and most people still could not tell them apart.

Xbox did a great job on games this year; let's hope it continues. Both systems will be optimized over time, it happens every gen.



mine said:
Norris2k said:

My main point is that virtualization and 3 OSs (including Windows) cost more than one native, plain old BSD (which is a very light OS). And that's a fact. So, Sony could also give one more CPU core... but! What I consider a "high cost" is not a CPU cost, but the loss of reactivity coming from no longer having 2 reserved cores for the OS and apps. I'm not exactly sure if they will run the OS and apps on one single core, which seems very unresponsive, or be able to borrow resources from the 7th core, which would lower the impact.

BTW, I'm not aware of a clear CPU bottleneck on the PS4. Only Ubisoft talked about that on AC... and it ended with about no difference. The GPU and memory bottleneck on the XBone is, on the contrary, pretty clear: most games look better and have a higher resolution on the PS4.


Of course only developers aiming for true next-gen will talk about that. Ubisoft bet on worlds with highly connected AIs and now they can't realize them as they wanted.

If PS4 developers only polish last-gen software they won't complain, as they will be happy with what the PS4 has to offer.

If we are saying that so far this generation has failed to go to a higher level than graphics improvements, I agree.

If you want to say that one more core for the XBone will lead to da real true next-gen experience, and that the PS4 will not be able to deliver it, I think you are wrong. First, one more weak CPU core for an already highly multithreaded program is not enough to make such a big difference. Second, it's not just about AI, and other improvements would take GPU power. Last, if it's just about giving one more core, whatever the impact is, Sony can patch that in anytime.
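To put rough numbers on that first point, here is a small Python sketch (assumptions, not from the post: games previously had 6 of the 8 Jaguar cores, the 7th is fully freed for games, clocks are unchanged, and the 80% parallel fraction is purely illustrative):

# Rough upper bound on what one extra Jaguar core buys.
extra_ceiling = 7 / 6 - 1            # ~16.7% more raw CPU capacity at best

# Amdahl's law: if only a fraction p of the CPU-side frame work scales
# across cores, the real gain from 6 -> 7 cores is smaller than that ceiling.
def speedup(p, cores):
    return 1 / ((1 - p) + p / cores)

p = 0.8                              # illustrative: 80% of the work parallelizes
gain = speedup(p, 7) / speedup(p, 6) - 1   # ~6% in that case

print(round(extra_ceiling * 100, 1), round(gain * 100, 1))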

I find your post quite biased, as you talk about "PS4 developers" only polishing. Most developers are multiplatform, so it's not a PS4/Sony problem. And so far MS's first party hasn't gotten any further into next gen than Sony's.



mine said:
JustBeingReal said:


See bolded comments of mine for my replies.


You just confirmed what I said. The PS4 is more "out of balance" than the XOne, and both are not as well balanced as the Wii U.

LMFAO, you haven't got a clue what you're talking about. I didn't confirm what you said; if you think that, you don't understand the topic at hand at all.

The PS4 is the most balanced of all 3 consoles: its GPU handles everything, and the RAM has just the right amount of bandwidth for the requirements of its CPU and GPU.

The XB1 lacks bandwidth to its main memory. A standard 7770 requires 72GB/s from its 1-2GB memory pool, while the XB1's 8GB of DDR3 clocks in at about 55GB/s of real-world bandwidth. An 8-core Jaguar CPU needs about 20GB/s of that, so the GPU only gets about 35GB/s from the largest pool of memory, a deficit of around 25GB/s. Now you may say it's got 140-150GB/s from its 32MB of eSRAM, and you'd be right, if the size of that memory pool were sufficient, but it's not; it's tiny.

The Wii U's main memory only supplies 12.8GB/s, while its GPU needs about 28.8GB/s of bandwidth from 512MB-1GB, so it's at a deficit of roughly 10GB/s from its 2GB of storage. Like the XB1, yes, it has a 32MB pool of cache, but it's tiny and it only runs at about 70GB/s. Once it's full it's not usable, and nothing more can be done until it's emptied for the next task, so developers are limited in the size of data they can put in it. It's fine for small render targets, some filtering, maybe selective effects or data storage, but not for full frame buffers and tonnes of physics or AI data for GPGPU calculation.
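For anyone who wants to sanity-check that arithmetic, here is a rough Python sketch (the per-component bandwidth requirements are the estimates quoted above, not official specs, and the shortfalls that fall out of these numbers differ slightly from the rounded figures in the post):

# Back-of-the-envelope check of the bandwidth figures quoted above.
# All "required" numbers are the post's estimates, not official specs.

def gpu_share(total_bw_gbs, cpu_need_gbs):
    # Bandwidth left over for the GPU once the CPU takes its share.
    return total_bw_gbs - cpu_need_gbs

# Xbox One: ~55 GB/s real-world DDR3, CPU assumed to need ~20 GB/s,
# 7770-class GPU assumed to want ~72 GB/s.
xb1_gpu_bw = gpu_share(55, 20)       # 35 GB/s left for the GPU
xb1_shortfall = 72 - xb1_gpu_bw      # 37 GB/s short of the quoted requirement

# Wii U: 12.8 GB/s main memory, GPU assumed to want ~28.8 GB/s.
wiiu_shortfall = 28.8 - 12.8         # 16 GB/s short of the quoted requirement

# Does a single 1080p colour target fit in a 32 MB scratch pool?
width, height, bytes_per_pixel = 1920, 1080, 4               # 32-bit colour
frame_buffer_mb = width * height * bytes_per_pixel / 2**20   # ~7.9 MB
# One target fits, but colour + depth + a few G-buffer targets quickly
# blow past 32 MB, which is the "it's tiny" argument above.

print(xb1_gpu_bw, xb1_shortfall, wiiu_shortfall, round(frame_buffer_mb, 1))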

No, the Wii U isn't the most balanced; the facts don't agree in the slightest.


You can't just move any code to the GPU. Even if AI is possible in theory, it's a big difference to have it working in reality. This is because you'll need to "do something" with the computational results by moving them back from the GPU to the CPU, which has to update the internal game world.

This has to be synchronized.

LOL, yes you can. AMD's done it: they make the GPU tech and have the code libraries, and it's perfectly useful for physics and AI calculations.

It's all math data; it can be handled on the GPU. You just need the software to run it, and AMD has that; they developed the libraries for it years ago. This whole back-and-forth problem is an invention you've made up.

And it is the synchronization job which is the biggest challenge a developer faces on unbalanced hardware designs - besides having to get the core running at all.

LOL, just stop kid, you haven't got a clue what you're talking about.

As long as you have compute queues available, developers can put their jobs in there; when compute units become available they run the math. It's that simple: you just need stream processors to perform the work.
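As a loose illustration of that queue idea, here is a toy Python sketch (plain CPU threads standing in for compute units; this is a conceptual model only, not actual GPU compute code, and the batch sizes are made up):

# Toy model of "drop jobs in a compute queue; whichever unit is free runs
# the math and posts a result back". Illustrative only: real GPGPU work
# goes through a graphics/compute API, not Python threads.
import queue
import threading

jobs = queue.Queue()      # work headed for the "compute units"
results = queue.Queue()   # results the game loop has to read back

def compute_unit():
    while True:
        job = jobs.get()
        if job is None:                         # sentinel: shut this unit down
            break
        results.put(sum(x * x for x in job))    # stand-in for physics/AI math

units = [threading.Thread(target=compute_unit) for _ in range(4)]
for u in units:
    u.start()

for batch in range(8):                          # queue up 8 batches of work
    jobs.put(list(range(batch * 1000, (batch + 1) * 1000)))
for _ in units:                                 # one sentinel per unit
    jobs.put(None)
for u in units:
    u.join()

# This read-back is the synchronization step the earlier post worries about:
# the simulation only moves on once the results are consumed.
while not results.empty():
    print(results.get())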

Developers do need to code for it, and it has to be a task their engine was written to perform on the hardware it's running on.

BTW: do you really think those people at Ubisoft aren't smart enough? That they wouldn't have the latest technology in compilers and tools for utilizing a GPU? That they didn't sign NDAs with every major GPU manufacturer and have support beyond everything any of us here will be given?

Did I say that Ubisoft aren't smart enough?

It's a matter of whether the developers are given the time or resources to add the features to their engine.

If a game is due out by a certain time they have to focus on that. It's like Call of Duty now: Assassin's Creed is a yearly release, so the devs are busy working on the game itself, creating new assets, all the models and environments. Unity was also built on an old engine, one which had most of its development on last gen, which wasn't GPGPU focused.

Look up AnvilNext. The fact that Ubisoft said themselves they're CPU-limited by their AI says it all, considering they had GPU resources left over, to the point where (as they said) they could run ACU at 100FPS, yet they're having issues even hitting 30FPS because of the AI.
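A bit of frame-time arithmetic shows why "the GPU could do 100FPS" and "we struggle to hit 30FPS" aren't contradictory (the 100FPS figure is the Ubisoft claim referenced above; the CPU AI time below is an assumed, illustrative number):

# The slower of the CPU and GPU stages sets the frame rate.
gpu_fps = 100                       # the quoted "ACU at 100FPS" GPU figure
gpu_ms = 1000 / gpu_fps             # 10 ms of GPU work per frame

cpu_ai_ms = 40                      # assumption: AI alone eats ~40 ms on the CPU
frame_ms = max(gpu_ms, cpu_ai_ms)   # the longer stage dominates each frame
fps = 1000 / frame_ms               # 25 FPS: CPU-bound, below the 30FPS target

print(round(gpu_ms, 1), frame_ms, round(fps, 1))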

It's a fact that GPUs can run large amounts of AI and physics; you ignoring that doesn't make it untrue.


Of course they have all of that. Every big player does.

And it says something about the current situation and the hurdles to overcome if Ubisoft wasn't able to implement the AC:U AI on the GPU side...

It's because the engine is old. It was built from the ground up for games like AC3, which were made with CPU-focused AI, and the environment has changed since.

PC could have run AI fine on the GPU, but developers were using the PS3 and 360 as their lead platforms, with AI on the CPU, as is evident from the situation we have with ACU.

I don't expect that to change soon, or maybe ever - the AI on GPUs shown by the vendors today is something totally different from the "AI" utilized in games on the CPU.

If all platforms, especially the market leader, have a GPU capable of it, then it makes perfect sense that the tech will get used. Given that the generation is early and Ubisoft were using an engine designed with 7th-gen tech in mind, I wouldn't expect this version of AnvilNext to use it, but they can definitely update it if the developers are given time.

AMD has the libraries for GPGPU physics and AI coding, and hell, Ubisoft showed that they can do physics coding. It wouldn't take long to add the feature to the engine: AI, physics, it's all math, a job that needs processing; they just need to incorporate AMD's libraries into their engine.

The fact that Ubisoft are more partnered with Nvidia probably has something to do with it.

Economics and politics have a huge effect here.

About the sales situation: it does matter from a developer's perspective what to bet their money on and what type of experiences they can deliver at which frame rate. They will try to give you what is possible WITHIN THE LIMITS of the hardware and may find some "workarounds" by using 30fps or a sub-1080p resolution. They did exactly that on the PS3 and X360. And they already do it on the PS4 and XOne...

The situation has changed from last generation; the ball is even more in Sony's court, and the fact that they have the most revolutionary technology, shared in some form by the other platforms, can only be a benefit to the industry as a whole.

It's a great thing that the PS4 is GPGPU-focused; it benefits all parties.

 

 

Did you notice that Microsoft was always stating that the resolution of XOne games is "enough", and that Sony is already "backpedaling" by no longer stating that the PS4 delivers 1080p/60? They just dropped those things silently.

Sony didn't stop talking about 1080p/60FPS. Naughty Dog made a statement about how the PS4 is fully capable of running some very impressive graphics at that resolution and frame rate; the platform just needs to be optimized for properly.

Hell, the fact that Uncharted 4 is already running at a locked 1080p 30FPS with the heavyweight code it has in pre-alpha says it all really. Between alpha and going gold a game's frame rate can often double, if not push further, with even more technical sparkle. This is what has happened with Bloodborne: it had a load of screen tearing (Uncharted 4 doesn't suffer from that) and ran between 10-20FPS when gameplay leaked around E3; now it's much better looking, runs at a locked 30FPS and has no tearing.

The Order is a similar deal: performance doubles as code gets refined to free up resources.

 

This is because those discussions hurt them both. Sony more than Microsoft.

Why?

Who's hurt? Sales are great, far better than at this time last generation for both companies.

People just have to keep their expectations in check.

Expecting that every developer will be able to achieve 1080p 60FPS with the best graphics around is going too far, but it can be done with the right amount of effort.

Expecting it in year 1 was going too far, because it requires optimization and learning the hardware; despite the fact that the platform holders are using x86 CPUs and PC GPU hardware, there's still a learning curve.

Hell, RISC had been the standard for console hardware for the longest time, and console development had been built around it for far longer than around x86, so the x86 console libraries aren't as mature.

Because Microsoft is an OS company which happens to sell hardware at subsidized prices (aka a "console") to have their OS and technology running in the living room.

But nothing will stop Microsoft from giving their OS away for FREE to other manufacturers as long as they manage to get some profit from it - or, as I expect, even more profit.

This only works if Microsoft actually starts to make games that they can sell to subsidize their free OS; right now there's no sign that they'll be giving away a free OS for the PC market.


The two new AMD designs - the x86 one of which was speculated to be going to Nintendo - are for Amazon (ARM) and Facebook (x86).

Guess what? Microsoft is already flirting with Facebook to get their "Games Windows" onto that true next-gen console, which will run circles around the XOne AND the PS4!!

Microsoft doesn't care if the Facebook console will be the next big console as long as they have a stake in it...

 

All of this has nothing to do with the topic at the beginning of this discussion.

You implied that the Wii U is the most balanced (it's not), and we're talking about AI processing on GPUs, which is absolutely usable. The only reason it hasn't been used yet is that game engines need to be coded to allow for it, but given that the PS4 is the market leader, features the tech, and every other 8th-gen console does too (though the Wii U and XB1 are less capable than PC and PS4 in that area), along with PC, we should see it used at some point in the not too distant future.

See reply in bold.




Ubisoft is so capable and knowledgeable that the same game runs considerably worse on AMD hardware than on a similar Nvidia GPU.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."