
HD console graphics performance comparison charts

mornelithe said:
DonFerrari said:
mornelithe said:

Yeah, people argue that the fossil record is an elaborate hoax by scientists, also.


There is a member here who opened a thread to argue that evolution wasn't a scientific theory while creationism was.

My point exactly.

And there is another that says NPD is inaccurate because it is tracking X1 below VGC.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

curl-6 said:

Even the techheads can't agree on something so simple as the number of shader parts or which part of the GPU does what, not to mention what customizations may have been made.

You're posting guesswork as if it's fact.

Actually, some of those "techheads" do have a consensus on this, and it's the following: the best 40nm AMD GPU in terms of GFLOPS per watt is the 1GB edition of the Radeon HD 5870, so even if Latte matched it on that front, at a TDP of 25 watts the Wii U would have at most around 362 GFLOPS in the best-case scenario. The catch is that lower-end GPUs such as Latte usually have noticeably lower GFLOPS per watt than the higher-end parts, so it would very likely end up at around 10 GFLOPS per watt, putting the Wii U at roughly 250 GFLOPS. If there's anything we agree on, it's a RANGE, and that the Wii U most likely has fewer than 240 shaders!
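For reference, here's a minimal sketch of the arithmetic behind that estimate. The HD 5870 figures (roughly 2720 GFLOPS at a 188 W TDP) and the ~25 W budget assumed for Latte are taken from public spec listings and the post above, so treat the output as a ballpark, not a measurement:

```python
# Back-of-the-envelope GFLOPS estimate for the Wii U GPU ("Latte").
# Assumed inputs: HD 5870 peak throughput and TDP, plus a ~25 W power
# budget for Latte; none of these are confirmed Wii U figures.

HD5870_GFLOPS = 2720.0      # theoretical peak of the Radeon HD 5870
HD5870_TDP_W = 188.0        # its rated TDP in watts
LATTE_BUDGET_W = 25.0       # assumed power budget for the Wii U GPU

best_case_eff = HD5870_GFLOPS / HD5870_TDP_W   # ~14.5 GFLOPS per watt
likely_eff = 10.0                              # assumed for a lower-end part

print(f"Best case: ~{best_case_eff * LATTE_BUDGET_W:.0f} GFLOPS")  # ~362
print(f"Likely:    ~{likely_eff * LATTE_BUDGET_W:.0f} GFLOPS")     # ~250
```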

We even had a developer here by the name of "Nyleveia" who gave us a range of around 150 to 200 shaders. He basically confirmed that the Wii U has FEWER than 200 shaders.

Anthony's assertion isn't even wrong in the slightest sense of the word ... 



DonFerrari said:

And there is another that says NPD is inaccurate because it is tracking X1 below VGC.

Aye, but at least these folks are freely broadcasting that we should ignore anything and everything they say :)



mornelithe said:
DonFerrari said:

And there is another that says NPD is inaccurate because it is tracking X1 below VGC.

Aye, but at least these folks are freely broadcasting that we should ignore anything and everything they say :)


Yes, the best part about people being complete idiots is that we are safe; we can just ignore them after that.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:
mornelithe said:
DonFerrari said:

And there is another that says NPD is inaccurate because it is tracking X1 below VGC.

Aye, but at least these folks are freely broadcasting that we should ignore anything and everything they say :)


Yes, the best part about people being complete idiots is that we are safe; we can just ignore them after that.

On forums, yes, but when they translate that idiocy into voting power, it becomes an issue.




Performance graphs are great, and they show that theoretically the PS4 GPU is in line with a Radeon HD 7850, or near an HD 7870. It's weird that even with that amount of power, it's not able to run games like Battlefield 4 at 1080p 30 fps. Same with Watch Dogs. Those games run at 900p and upscale, and upscaling makes textures blurrier.

Now, could it be that the AMD Jaguar multicore CPU is holding back the performance of the GPU?

Sucker Punch said in a note in their own GDC 2014 post-mortem (regarding the CPU): "While the CPU has ended up working pretty well, it's still one of our main bottlenecks."

http://www.redgamingtech.com/infamous-second-son-post-mortem-part-2-ps4-performance-compute-particle-system/
Battlefield 4 uses 95% of the PS4's CPU power:
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/

I was thinking that with optimization, the PS4's "HD 7850 equivalent" GPU would get better graphics than its PC counterpart, but if the CPU is holding back performance, that might be why the PS4 can't achieve 1080p in most games like the PC card does, and why the Xbox One is closer to 720p because of its weaker GPU.

Cerny says that GPGPU on the PS4 can make up for the CPU's weakness, but if the graphics card is doing compute, would it be able to cope with the same graphical quality? After all, Nvidia cards take a hit when they run PhysX.

Also, this link from the forum says that memory can become a bottleneck too:

http://gamingbolt.com/crytek-8gb-ram-can-be-easily-filled-up-will-surely-be-limiting-factor-on-ps4xbox-one

Any PS4 dev here, or someone with deep knowledge, who can comment on this?
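For context, here's a minimal sketch of where the "in line with an HD 7850" comparison above comes from, using the standard peak-FLOPS formula for GCN-era parts (shader count × 2 ops per cycle × clock). The shader counts and clocks are assumptions from public spec listings, and theoretical peak throughput says nothing about real-world bottlenecks like the CPU or memory:

```python
# Theoretical peak GFLOPS for the GPUs discussed above.
# Shader counts and clocks are assumed from public spec listings.

def peak_gflops(shaders, clock_mhz):
    # 2 floating-point ops (a fused multiply-add) per shader per cycle
    return shaders * 2 * clock_mhz / 1000.0

gpus = {
    "PS4 (18 CUs)":      (1152, 800),
    "Radeon HD 7850":    (1024, 860),
    "Radeon HD 7870":    (1280, 1000),
    "Xbox One (12 CUs)": (768,  853),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name:18s} ~{peak_gflops(shaders, clock):.0f} GFLOPS")
```

On paper the PS4 lands between the two desktop cards, which is what the performance charts reflect; whether the CPU or memory is what actually holds back resolution is the open question in the post above.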




CrazyGPU said:

Performance graphs are great, and they show that theoretically the PS4 GPU is in line with a Radeon HD 7850, or near an HD 7870. It's weird that even with that amount of power, it's not able to run games like Battlefield 4 at 1080p 30 fps. Same with Watch Dogs. Those games run at 900p and upscale, and upscaling makes textures blurrier.

Now, could it be that the AMD Jaguar multicore CPU is holding back the performance of the GPU?

Sucker Punch said in a note in their own GDC 2014 post-mortem (regarding the CPU): "While the CPU has ended up working pretty well, it's still one of our main bottlenecks."

http://www.redgamingtech.com/infamous-second-son-post-mortem-part-2-ps4-performance-compute-particle-system/
Battlefield 4 uses 95% of the PS4's CPU power:
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/

I was thinking that with optimization, the PS4's "HD 7850 equivalent" GPU would get better graphics than its PC counterpart, but if the CPU is holding back performance, that might be why the PS4 can't achieve 1080p in most games like the PC card does, and why the Xbox One is closer to 720p because of its weaker GPU.

Cerny says that GPGPU on the PS4 can make up for the CPU's weakness, but if the graphics card is doing compute, would it be able to cope with the same graphical quality? After all, Nvidia cards take a hit when they run PhysX.

Also, this link from the forum says that memory can become a bottleneck too:

http://gamingbolt.com/crytek-8gb-ram-can-be-easily-filled-up-will-surely-be-limiting-factor-on-ps4xbox-one

Any PS4 dev here, or someone with deep knowledge, who can comment on this?


This is VGChartz, we are all devs and know exactly the specs and limitations of everything. I thought this was common knowledge.



Common knowledge is that the CPUs of these consoles are weak, and some people are making full use of them. But are they a limiting factor for GPU resolution? And will GPU compute solve that without taking a hit in GPU performance?



CrazyGPU said:

Performance graphs are great, and they show that theoretically the PS4 GPU is in line with a Radeon HD 7850, or near an HD 7870. It's weird that even with that amount of power, it's not able to run games like Battlefield 4 at 1080p 30 fps. Same with Watch Dogs. Those games run at 900p and upscale, and upscaling makes textures blurrier.

Now, could it be that the AMD Jaguar multicore CPU is holding back the performance of the GPU?

Sucker Punch said in a note in their own GDC 2014 post-mortem (regarding the CPU): "While the CPU has ended up working pretty well, it's still one of our main bottlenecks."

http://www.redgamingtech.com/infamous-second-son-post-mortem-part-2-ps4-performance-compute-particle-system/
Battlefield 4 uses 95% of the PS4's CPU power:
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/

I was thinking that with optimization, the PS4's "HD 7850 equivalent" GPU would get better graphics than its PC counterpart, but if the CPU is holding back performance, that might be why the PS4 can't achieve 1080p in most games like the PC card does, and why the Xbox One is closer to 720p because of its weaker GPU.

Cerny says that GPGPU on the PS4 can make up for the CPU's weakness, but if the graphics card is doing compute, would it be able to cope with the same graphical quality? After all, Nvidia cards take a hit when they run PhysX.

Also, this link from the forum says that memory can become a bottleneck too:

http://gamingbolt.com/crytek-8gb-ram-can-be-easily-filled-up-will-surely-be-limiting-factor-on-ps4xbox-one

Any PS4 dev here, or someone with deep knowledge, who can comment on this?


Memory is used to store all types of data. The weaknesses of AMD GPUs include mediocre tessellation performance, so devs are instead best advised to use a highly detailed mesh, because it's simply faster to pass more vertex data into the pipeline than to generate the extra quads on the fly with tessellation. The drawback to this is a clear increase in memory consumption. I think The Order: 1886 currently holds the record for the most vertex data, weighing in at just over 700MB for a level! Last-gen games don't even come close, with most levels using around 10MB of vertex data. Render targets have also seen substantial increases in memory consumption, with Killzone Shadow Fall leading the pack at 800MB, whereas games from last generation didn't even hit 100MB.

The biggest culprits for memory consumption are textures for the most part, and the situation doesn't improve with partially resident textures, as developers will just be more motivated to use even higher-resolution textures. The next thing to worry about for memory consumption is the desire to improve transparency: an A-buffer implementation of order-independent transparency adds a lot of memory overhead.

This isn't even accounting for the rest of the data, such as the in-game physics, the game data itself, music, and all the other things. I could definitely see why 8GB of memory may not be enough, but it would take a lot of these memory-hungry techniques and data sets at once to get there.
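To make the scale concrete, here's a minimal back-of-the-envelope sketch of why full-screen render targets and an A-buffer add up so quickly at 1080p. The formats, target counts, and fragment counts below are illustrative assumptions, not figures from the games mentioned above:

```python
# Rough memory estimates for full-screen render targets and A-buffer OIT.
# All formats and counts here are illustrative assumptions.

WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT

def mb(nbytes):
    return nbytes / (1024 * 1024)

# A deferred-shading G-buffer keeps several full-screen targets alive at once.
gbuffer_targets = [
    ("albedo    RGBA8",   4),   # bytes per pixel
    ("normals   RGBA16F", 8),
    ("material  RGBA8",   4),
    ("HDR light RGBA16F", 8),
    ("depth     D32",     4),
]
gbuffer_bytes = sum(bpp for _, bpp in gbuffer_targets) * PIXELS
print(f"G-buffer at 1080p: ~{mb(gbuffer_bytes):.0f} MB")

# A-buffer order-independent transparency stores a list of fragments per pixel.
BYTES_PER_FRAGMENT = 16         # packed color + depth + next-pointer
AVG_FRAGMENTS_PER_PIXEL = 8     # assumed average transparent overdraw
abuffer_bytes = PIXELS * AVG_FRAGMENTS_PER_PIXEL * BYTES_PER_FRAGMENT
print(f"A-buffer at 1080p: ~{mb(abuffer_bytes):.0f} MB")
```

Even with these modest assumptions the transparency buffer alone runs into the hundreds of megabytes, before textures, vertex data, or any of the other items listed above.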



Who gives a damn