
HD console graphics performance comparison charts

Intrinsic said:
CrazyGPU said:

I agree that they could get better by coding, but as Cerny said, 1. time to triangle on PS4 is 1 to 2 months instead of 6 to 12 months on PS3. The architecture now is x86 just like the PC, and PS4 has a unified memory pool, so PS4 is easier to program for and very close to PC. The API is closer to the metal, but Mantle and DX12 are getting closer to the metal too. 2. I don't expect a big jump in performance in the years to come like PS3 software had; it was really hard to use the whole architecture of the Cell processor. 3. These APUs are much easier and PC-like. The only thing that could be a real deal would be 4. computing with the GPU, but I wonder if that would take too many GPU resources and kill performance. Also, 5. the Battlefield 4 devs said the CPU was at 95% on PS4, while an Intel i7 4770 runs the game at 30%, so not all code runs at 100%. And 6. if the CPU is not holding back performance, why do heavy games run at 900p/30fps when an HD 7850-70 with a good CPU can easily run at 1080p?

There is so much you are not getting here. I numbered points in your post so my explanation doesn't seem to be all over the place.

 

  1. Time to triangle means how much time it takes just to get your engine up and running on a system, not how much time it takes to build your game or clean up your code.
  2. Following from the above, the bump doesn't come from being able to run your engine on the hardware; it comes from optimizing your code. That doesn't just mean writing more efficient code, it means writing code that takes advantage of the specific hardware. Something no multiplat game is doing right now.
  3. Yes, and x86 means a lot of the code is cross-compatible. But console architecture is very, very different from that of PCs. PCs basically brute-force their way through everything. That's because of the ridiculous amount of overhead in them: no matter how close to the metal they are, the fact that they still run on an OS that must support legacy services means they are always working with one hand tied behind their backs. Maybe I have already said this (or maybe I thought about it but didn't type it): think of consoles as Formula One cars, designed through and through for one purpose and one purpose alone. Then think of PCs as Bugatti Veyrons, with an option to strap a rocket to the back if you want.
  4. Of the 18 active CUs in the PS4, only 6 of them allow for GPU compute. So every one that you use for GPU compute is one less you have to use for your graphics engine (see the rough numbers after this list).
  5. Ignore them, devs say that kind of thing all the time. If you need proof, just look at BF4, then look at KZ:SF and then Infamous SS. Look at The Order, then wait and see the stuff that will blow your mind in a few weeks at E3. Apparently 95% means different things to different devs. Besides, you also need to understand that every multiplat game you are seeing in this first year or so is nothing but a direct port of the game running on a PC. Third-party devs have not started making games specifically for the hardware in these consoles.
  6. Short answer: read above. Not optimized for consoles.
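
To put rough numbers on point 4, here's a minimal back-of-the-envelope sketch in Python. It assumes the 6-of-18 CU split claimed above (the poster's claim, not a confirmed spec) and the commonly reported 800 MHz PS4 GPU clock:

```python
# Rough GPU budget for the PS4 (GCN: 64 ALUs per CU,
# 2 FLOPs per ALU per cycle via fused multiply-add).
ALUS_PER_CU = 64
FLOPS_PER_CYCLE = 2     # an FMA counts as a multiply plus an add
CLOCK_GHZ = 0.8         # commonly reported PS4 GPU clock

total_cus = 18
compute_cus = 6         # the split claimed in point 4 above (unconfirmed)

gflops_per_cu = ALUS_PER_CU * FLOPS_PER_CYCLE * CLOCK_GHZ  # 102.4
total = total_cus * gflops_per_cu                          # ~1843 GFLOPS
reserved = compute_cus * gflops_per_cu                     # ~614 GFLOPS

print(f"Per CU: {gflops_per_cu:.1f} GFLOPS")
print(f"Whole GPU: {total:.0f} GFLOPS")
print(f"Dedicating all {compute_cus} compute-capable CUs to compute costs "
      f"graphics {reserved:.0f} GFLOPS ({reserved / total:.0%} of the budget)")
```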

 

1- I didn't say otherwise.

2- I don't think this gen's console hardware is that specific; I think it's very PC-like. The biggest changes being the unified memory and the 64 commands.

3- Good point, Windows is an overhead.

4- Good info, didn't know that.

5- To tell you the truth, I have Killzone and the character faces look like shit if you compare them with Battlefield 4's primary characters. Infamous SS has great lighting, but look at the trees: they're statues, they don't move at all. Then watch the palms on the beach during the storm in Battlefield 4. Particles, rain, vegetation blowing all over. I hope games get better.

6- I'm looking forward to what the Naughty Dog geniuses and Quantic Dream can do optimizing for PS4.



CrazyGPU said:

 I don't think this gen's console hardware is that specific; I think it's very PC-like. The biggest changes being the unified memory and the 64 commands.

 

That's the funny thing: "specific" is practically synonymous with consoles. The only thing these consoles have in common with PCs is that their processors are based on x86. And if you really want to go into that, that's basically just saying that you can run code similar to what you can run on a PC and have more than 4GB of RAM. But that is where the similarities end.

On PCs, devs use a very stripped-down version of Windows to compile and debug their games. On consoles, they also use a stripped-down version of Windows to compile their games, but the debugging is done on the "debug console". A game on PC has to take into account not only every single version of an AMD/Nvidia card and AMD/Intel CPU and their card-specific drivers, but also different generations of those GPUs/CPUs, plus different memory types, amounts, and frequencies. Or at least that's what they want you to believe. The truth is that they build their game with one, or at most two, setups in mind. This is why PCs basically brute-force their way through everything, and why some games perform better on one than the other even when technically they have the same amount of muscle.

I am not sure if you are mistaking the whole 64 commands thing to mean something else, though... The GPUs in the consoles are based on AMD's Southern Islands GCN architecture. Each compute unit basically has 64 shader cores, each core running one operation per cycle. It's the same thing with the GPUs in PCs.
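
As a rough illustration that the per-CU math is identical on both sides, here's a minimal sketch comparing the PS4's GPU with a desktop HD 7850 (both GCN). The clocks are the commonly cited reference figures, used here as assumptions:

```python
# Same GCN arithmetic on console and PC:
# GFLOPS = CUs * 64 ALUs * 2 FLOPs per cycle (FMA) * clock in GHz.
def gcn_gflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz

print(f"PS4 (18 CUs @ 0.80 GHz):     {gcn_gflops(18, 0.80):.0f} GFLOPS")  # ~1843
print(f"HD 7850 (16 CUs @ 0.86 GHz): {gcn_gflops(16, 0.86):.0f} GFLOPS")  # ~1761
```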

Anyway, just rest easy in the knowledge that you will see really great-looking stuff on consoles, as always.



fatslob-:O said:
curl-6 said:

Even the techheads can't agree on something so simple as the number of shader parts or which part of the GPU does what, not to mention what customizations may have been made.

You're posting guesswork as if it's fact.

Actually, some of those "techheads" do have a consensus on this information, and it's this: the best 40nm AMD GPU on the GFLOPS-per-watt front is the 1GB edition Radeon HD 5870, so even if Latte matched it on that front, at a TDP of 25 watts the Wii U would, in the best-case scenario, have only 362 GFLOPS. The catch is that lower-end GPUs such as Latte usually average a fair bit fewer GFLOPS per watt than the higher-end parts, so it's very likely it would end up at around 10 GFLOPS per watt, which would put the Wii U at about 250 GFLOPS. If there's anything we agree on, it's a RANGE: the Wii U most likely has fewer than 240 shaders!
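
For the curious, the arithmetic behind those two figures works out as below. The HD 5870's ~2720 GFLOPS peak and 188 W TDP are its published specs; the 25 W figure and the 10 GFLOPS/W estimate are the poster's own assumptions:

```python
# Reproducing the two Wii U estimates from GFLOPS-per-watt scaling.
HD5870_GFLOPS = 2720.0   # Radeon HD 5870 theoretical peak
HD5870_TDP_W = 188.0     # its TDP
LATTE_TDP_W = 25.0       # assumed Wii U GPU power budget (from the post)

best_case = LATTE_TDP_W * (HD5870_GFLOPS / HD5870_TDP_W)  # ~14.5 GFLOPS/W
print(f"Best case:  {best_case:.0f} GFLOPS")              # ~362

realistic = LATTE_TDP_W * 10.0  # the poster's lower estimate for small GPUs
print(f"Realistic:  {realistic:.0f} GFLOPS")              # 250
```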

We even had a developer here by the name of "Nyleveia", and he gave us a range of around 150 to 200 shaders. He basically confirmed that the Wii U has LESS than 200 shaders.

Anthony's assertion isn't even wrong in the slightest sense of the word ... 

Some of the techheads have reached a consensus. The debate itself still rages on.

TC posting speculation as fact is misleading and just generally poor form.



curl-6 said:

Some of the techheads have reached a consensus. The debate itself still rages on.

TC posting speculation as fact is misleading and just generally poor form.

Some of those "techheads" are actual techheads like Nyleveia, Pemalite, and myself ... The rest are a bunch of raging fanboys or haters from all sides. There's not much of a debate when we are the only ones around here with confirmed ranges or comprehensive theories based on other facts. TC has the right idea but it's incomplete and not totally speculation either. You really can't defy the laws of physics ...



fatslob-:O said:
curl-6 said:

Some of the techheads have reached a consensus. The debate itself still rages on.

TC posting speculation as fact is misleading and just generally poor form.

Some of those "techheads" are actual techheads like Nyleveia, Pemalite, and myself ... The rest are a bunch of raging fanboys or haters from all sides. There's not much of a debate when we are the only ones around here with confirmed ranges or comprehensive theories based on other facts. TC has the right idea but it's incomplete and not totally speculation either. You really can't defy the laws of physics ...

There's a difference between a range and a specific number. Besides Nintendo, the only people who know exactly and for sure are devs under NDA.

I just don't think we should pretend to be positive when we're not.



curl-6 said:

There's a difference between a range and a specific number. Besides Nintendo, the only people who know exactly and for sure are devs under NDA.

I just don't think we should pretend to be positive when we're not.

Yes, there is a difference between a range and a specific number, but it is constrained to a maximum of 200 shaders. Nyleveia is positive about that, but he couldn't give the actual number due to an NDA, so the OP is more in the right than in the wrong, because if that number were different, it would actually be lower rather than higher.
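
To see what that range means in raw throughput, here's a minimal sketch, assuming the ~550 MHz Latte clock commonly cited in die-analysis discussions (an assumption, not an official spec):

```python
# Translating the claimed 150-200 shader range into peak GFLOPS.
# GFLOPS = shaders * 2 FLOPs per cycle (FMA) * clock in GHz.
LATTE_CLOCK_GHZ = 0.55   # commonly cited figure, not official

for shaders in (150, 160, 200):
    print(f"{shaders} shaders -> {shaders * 2 * LATTE_CLOCK_GHZ:.0f} GFLOPS")
# 150 -> 165, 160 -> 176, 200 -> 220
```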



fatslob-:O said:
curl-6 said:

There's a difference between a range and a specific number. Besides Nintendo, the only people who know exactly and for sure are devs under NDA.

I just don't think we should pretend to be positive when we're not.

Yes, there is a difference between a range and a specific number, but it is constrained to a maximum of 200 shaders. Nyleveia is positive about that, but he couldn't give the actual number due to an NDA, so the OP is more in the right than in the wrong, because if that number were different, it would actually be lower rather than higher.

That's another thing; how do we know for sure this Nyleveia was a dev, and even if he/she was, how do we know he/she was telling the truth?

I just think we should be absolutely sure before we go presenting graphs as fact.



curl-6 said:

That's another thing; how do we know for sure this Nyleveia was a dev, and even if he/she was, how do we know he/she was telling the truth?

I just think we should be absolutely sure before we go presenting graphs as fact.

It's really hard to make an argument defending a higher count of functional units in Latte when it has to stand up against tough constraints like the laws of physics and real-life examples of its performance. Unless I'm seriously overlooking something in Latte's favour, such as a revolutionary ALU design or something entirely different like a DSP with powerful SIMD capabilities, but all of those things lack evidence, and I doubt AMD makes anything other than microprocessors and GPUs.

You are right about being absolutely sure about something before presenting the graphs, but a lot of these discussions take place in a different community rather than among the general public, so I would rather see some disputes between other users about the accuracy of the graph in question.



fatslob-:O said:

curl-6 said:

That's another thing; how do we know for sure this Nyleveia was a dev, and even if he/she was, how do we know he/she was telling the truth?

I just think we should be absolutely sure before we go presenting graphs as fact.

 

It's really hard to make an argument defending a higher count of functional units in Latte when it has to stand up against tough constraints like the laws of physics and real-life examples of its performance. Unless I'm seriously overlooking something in Latte's favour, such as a revolutionary ALU design or something entirely different like a DSP with powerful SIMD capabilities, but all of those things lack evidence, and I doubt AMD makes anything other than microprocessors and GPUs.

You are right about being absolutely sure about something before presenting the graphs, but a lot of these discussions take place in a different community rather than among the general public, so I would rather see some disputes between other users about the accuracy of the graph in question.

I honestly don't know if there are enough techheads on VGChartz to have a full-on discussion about the probabilities of Latte's performance, haha.

And we both know what would happen, extremists from both sides would turn it into a circus. XD



No source? Where'd the graphs come from?