
AMD Jaguar Cores

DarkTemplar said:

 

Modern games are not CPU demanding. For instance, if you have code that does a lot of floating-point operations and it is running on the CPU, you are probably wasting resources, since the GPU is much better suited to that type of operation (and the game consequently becomes even less CPU demanding).

This is also why I think that 4 Jaguar cores, and even 3 PPC 750 cores, may be more than enough for next-gen games (btw, notice that the Wii U CPU was made for lazy developers, since one of its cores has more cache, enabling single-threaded code to run quite well on it).

And finally, since Sony and MS decided to use only half of the CPU (4 cores) for games, this may be a hint of an OS with lots of things running in the background (perhaps they are working on even more OS features than what we know so far).


Unfortunately, not all floating-point math can be done on the GPU; sometimes it has to be done in a serialised manner with other tasks due to various factors, which is what the CPU excels at. The GPU, however, kicks things into another gear when it comes to highly parallel work.
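To make that serial-vs-parallel distinction concrete, here's a minimal C++ sketch (a toy example of my own, not from any real engine): the first loop has a dependency chain between iterations, so it can't be spread across thousands of GPU threads, while the second is one-thread-per-element work that maps cleanly onto a compute shader.

```cpp
#include <cstddef>
#include <vector>

// Serially dependent: every step needs the previous result, so only one
// "thread" can make progress -- exactly the case a fast CPU core handles best.
float integrate_spring(float x, float v, float dt, int steps) {
    for (int i = 0; i < steps; ++i) {
        v += -10.0f * x * dt;  // acceleration depends on the current position
        x += v * dt;           // position depends on the velocity just computed
    }
    return x;
}

// Data parallel: each particle is independent of the others, so the same
// loop body could run as one GPU thread per particle with no changes.
void integrate_particles(std::vector<float>& xs, std::vector<float>& vs, float dt) {
    for (std::size_t i = 0; i < xs.size(); ++i) {
        vs[i] += -9.81f * dt;  // gravity
        xs[i] += vs[i] * dt;
    }
}
```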

But you are right that large chunks of the game can be offloaded onto the GPU, such as physics, though that's really up to the developer to choose.
For example, do they choose to go with amazing physics? Or lighting so real it's beyond stupid?
Conversely, if they do say... physics on the GPU, a CPU core or two can take over some GPU tasks, such as improving frame buffer effects or even Morphological Anti-Aliasing.

The previous generation was fairly restrictive in that regard: if you didn't do things in a certain way, you got poor performance. This time around, though, it's a different ball game.



--::{PC Gaming Master Race}::--


Although it's a really rough number, you could compare MIPS (million instructions per second):

Xbox360 IBM "Xenon" (Triple core) 19,200 MIPS at 3.2 GHz
PS3 Cell BE (PPE only) 10,240 MIPS at 3.2 GHz
AMD E-350 (Dual core) 10,000 MIPS at 1.6 GHz
AMD Phenom II X4 940 BE 42,820 MIPS at 3.0 GHz
Intel Core i7 2600K 128,300 MIPS at 3.4 GHz

Now, since Jaguar is rumored to be 20% faster per core than Brazos (E-350), an 8-core CPU would give you:

PS4/XOne Jaguar CPU: 48,000 MIPS at 1.6 GHz

That would make it in theory faster than an AMD X4 940, which is really not that bad, considering that they can offload tasks to the GPU easily.

In games today CPU speed just isn't that important, especially not if developers can optimize the way consoles allow (a one-chip design has way more benefits).
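The 48,000 figure follows from simple scaling of the numbers above; here is the back-of-the-envelope arithmetic spelled out (the 20% uplift is a rumor, so treat this as a rough estimate, not an official spec):

```cpp
#include <cstdio>

int main() {
    // E-350 "Brazos": 10,000 MIPS from 2 cores at 1.6 GHz.
    double per_brazos_core = 10000.0 / 2.0;          // 5,000 MIPS per core
    double per_jaguar_core = per_brazos_core * 1.2;  // rumored +20% per core
    double jaguar_8core    = per_jaguar_core * 8.0;  // PS4/XOne core count
    std::printf("Estimated 8-core Jaguar: %.0f MIPS\n", jaguar_8core);  // 48,000
}
```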



AnthonyW86 said:

Although it's a really rough number, you could compare MIPS (million instructions per second):

Xbox360 IBM "Xenon" (Triple core) 19,200 MIPS at 3.2 GHz
PS3 Cell BE (PPE only) 10,240 MIPS at 3.2 GHz
AMD E-350 (Dual core) 10,000 MIPS at 1.6 GHz
AMD Phenom II X4 940 BE 42,820 MIPS at 3.0 GHz
Intel Core i7 2600K 128,300 MIPS at 3.4 GHz

Now, since Jaguar is rumored to be 20% faster per core than Brazos (E-350), an 8-core CPU would give you:

PS4/XOne Jaguar CPU: 48,000 MIPS at 1.6 GHz

That would make it in theory faster than an AMD X4 940, which is really not that bad, considering that they can offload tasks to the GPU easily.

In games today CPU speed just isn't that important, especially not if developers can optimize the way consoles allow (a one-chip design has way more benefits).

That depends on the game.

I also expect that CPU requirements will increase a lot with next-gen games.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

AnthonyW86 said:

Although it's a really rough number, you could compare MIPS (million instructions per second):

Xbox360 IBM "Xenon" (Triple core) 19,200 MIPS at 3.2 GHz
PS3 Cell BE (PPE only) 10,240 MIPS at 3.2 GHz
AMD E-350 (Dual core) 10,000 MIPS at 1.6 GHz
AMD Phenom II X4 940 BE 42,820 MIPS at 3.0 GHz
Intel Core i7 2600K 128,300 MIPS at 3.4 GHz


Just realized that with the full CBE chip, Sony could theoretically pull 51,200 MIPS (presumably 5,120 MIPS per hardware thread: 10,240 from the dual-threaded PPE plus 5,120 from each of the eight SPEs). That's a monster for a chip designed eight years ago with the 90nm manufacturing process in mind. Too bad it was spread among ten physical threads with limited applicability.

IBM really does come up with some crazy designs from time to time. I wonder if they'll try again with Intel pushing for Xeon-based supercomputers and whatnot. Probably not, but it would be fun to see.




zarx said:

That depends on the game.

I also expect that CPU requirements will increase a lot with next-gen games.

Sorry, but no. Crysis 3 was tested with a GTX 680 and Metro even with a Titan, both way more powerful than the PS4 GPU. The faster the GPU, the bigger the CPU bottleneck will be. And even with those GPUs there isn't a real limitation until you go far below the AMD X4 980 (roughly 8-core Jaguar performance). At HD 7850 levels there is no CPU bottleneck. And since both consoles will use a one-chip design, any heavy task like physics can be offloaded to the GPU. 10% of the GPU is more GFLOPS than even the highest-end CPU can deliver.
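As a sanity check on that last claim (back-of-the-envelope, using the widely reported 1.84 TFLOPS figure for the PS4 GPU and the theoretical AVX peak of a high-end quad core of the era): a 10% slice of the GPU lands in the same ballpark as the CPU's theoretical peak, and since real CPU code rarely gets anywhere near its peak, the claim is plausible in practice.

```cpp
#include <cstdio>

int main() {
    // PS4 GPU: 1152 shader ALUs x 2 FLOPs (fused multiply-add) x 0.8 GHz.
    double gpu_gflops = 1152 * 2 * 0.8;     // ~1843 GFLOPS single precision
    double gpu_slice  = gpu_gflops * 0.10;  // ~184 GFLOPS for "10% of the GPU"

    // High-end quad core, e.g. 3.4 GHz with 8-wide AVX (one add + one mul
    // issued per cycle per core = 16 single-precision FLOPs per cycle).
    double cpu_gflops = 4 * 3.4 * 16;       // ~218 GFLOPS theoretical peak

    std::printf("10%% of PS4 GPU: ~%.0f GFLOPS, CPU peak: ~%.0f GFLOPS\n",
                gpu_slice, cpu_gflops);
}
```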




Man, wait till ethomaz sees this thread... If he does, a slew of graphs, charts, insider interviews, and console-vs-PC arguments about Jaguar being awesomely powerful will be inbound.

All you guys are in big trouble...



AnthonyW86 said:

Sorry, but no. Crysis 3 was tested with a GTX 680 and Metro even with a Titan, both way more powerful than the PS4 GPU. The faster the GPU, the bigger the CPU bottleneck will be. And even with those GPUs there isn't a real limitation until you go far below the AMD X4 980 (roughly 8-core Jaguar performance). At HD 7850 levels there is no CPU bottleneck. And since both consoles will use a one-chip design, any heavy task like physics can be offloaded to the GPU. 10% of the GPU is more GFLOPS than even the highest-end CPU can deliver.


It will still depend on the game, that is, on games where world interactivity and AI are limited by the current consoles. In the coming years devs will be able to vastly increase the number of AIs, animation blending/procedural animation, world interactivity, dynamic elements (especially weather effects), etc., which can increase the demands on the CPU by a lot. Most games will still hit a GPU bottleneck before a CPU one, but you can't make blanket statements.

And that GPU bottleneck also means many developers won't be offloading tasks like physics to the GPU anyway, because they will already be stressing the GPU with all the new graphical effects. Using 10% of the GPU for physics doesn't sound like much until you realise the GPU is already 100% utilised by the graphics pipeline, so physics will kill the framerate. GPU compute doesn't just let devs offload tasks for free, and it won't be the best choice in all cases, even for tasks that are well suited to GPU compute. It will always be a balancing act for developers deciding how to utilise the different processors for different tasks; next gen will give them a lot more flexibility, but there will still be scenarios that lean on the CPU more.
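To put rough numbers on that saturated-GPU point, here is a toy calculation (assuming rendering alone already fills a 30 fps frame budget; the 10% figure is just the one from the post above): skimming 10% of the GPU for physics pushes the frame past its budget, and the framerate drops.

```cpp
#include <cstdio>

int main() {
    // Assume rendering alone already fills a 30 fps frame budget.
    double frame_ms   = 1000.0 / 30.0;          // 33.3 ms of GPU time per frame
    double physics_ms = frame_ms * 0.10;        // hand 10% of the GPU to physics
    double new_frame  = frame_ms + physics_ms;  // the graphics work didn't shrink
    std::printf("%.1f ms -> %.1f ms per frame (%.1f fps -> %.1f fps)\n",
                frame_ms, new_frame, 1000.0 / frame_ms, 1000.0 / new_frame);
}
```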



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!