drkohler said:
I particularly liked the "I have a computer science degree and I am a software engineer" and then he shows his memory throughput graphic, which puts him instantly into the "complete moron" category. |
This user said it best:
Adinnieken said:
The thing that has changed between this generation and the previous one is that the bottlenecks have become smaller and less obvious. They require a greater understanding of systems and of how data is used on those systems, rather than just looking at numbers. |
The thing is, memory bandwidth/amount of memory only comes into play when you bump up to a higher resolution. The Xbox One has more than enough capability at 1080p. Considering that the current gen cosnoels will most likely never exceed 1080p for games, it is safe to assume that both consoles will hit a peak eventually after optimization and such is factored in. Neither console needs more than 4GB of memory dedicated to video; anything more than 3 is absurd for games running at 1080p. Which is why it would be a joke to bring up the fact that one has more usable memory than the other, since in the end IT DOES NOT MATTER AT 1080P. PC benchmarks would verify that perfectly. The fact that 2GB of GDDR5 seems to handle 1080p games perfectly fine should be more than enough proof of that.

I don't get why people think the way they do about hardware. The fact that one produces more flops than the other has no real standing in games. Drivers have increased graphics card performance by large amounts, so if Microsoft can out-write Sony driver-wise, their graphics chip could end up being the more powerful one, even though on paper the one in the PS4 is more powerful. People have severely underestimated what a graphics driver rewrite can mean for the Xbox One. It has been rewritten, and that alone could boost the Xbox One's performance. The 50MHz bump, not so much.
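The "how much VRAM does 1080p really need" claim above can be sanity-checked with some back-of-the-envelope arithmetic. A rough sketch; the buffer counts and bytes-per-pixel below are illustrative assumptions, not figures from any real engine:

```python
# Rough estimate of raw render-target memory for a 1080p frame.
# Assumed pipeline: double-buffered color, a depth/stencil buffer,
# and a few G-buffer targets -- illustrative numbers only.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4  # 8-bit RGBA

def buffer_mib(count):
    """Size in MiB of `count` full-screen 32-bit buffers."""
    return count * WIDTH * HEIGHT * BYTES_PER_PIXEL / (1024 ** 2)

framebuffers = buffer_mib(2)  # front + back buffer
depth = buffer_mib(1)         # depth/stencil
gbuffer = buffer_mib(4)       # deferred-shading targets

print(f"Core render targets: {framebuffers + depth + gbuffer:.1f} MiB")
```

The render targets themselves come to only a few tens of MiB; it is textures and geometry that dominate actual VRAM use, which is why the practical ceiling at 1080p ends up well below the multi-GB totals being argued over.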
Egann said: I don't know why we're still having this discussion. We *still* haven't seen an actual gameplay screenshot from either system, because the demos for BOTH systems were actually high-end PCs. But here's something to put in your craw: all sources agree that the PS4 and XBox 1 have GPUs in the 1.*something* TeraFLOPS range, although until we actually get our hands on the hardware we won't know where each system stands. We know for a fact the Wii U runs at a third or less of this, at 0.35 teraFLOPS. Actual comparison screenshots (images omitted): Wii U vs. PS4/X1. Yes, I did steal S.T.A.G.E.'s sig because I'm just that lazy. Conclusion: whatever power difference exists between the PS4 and XB1, it's probably irrelevant. The Wii U is between a factor of three and eight weaker than either console, and yet the actual graphics pumped out are basically one or two notches better at best. |
The graphical difference between the PS4/Xbone and the Wii U is actually pretty huge :/
Wow, your maths about eSRAM show you really know your way around hardware.
errorpwns said: The thing is, memory bandwidth/amount of memory only comes into play when you bump up to a higher resolution. The Xbox One has more than enough capability at 1080p. Considering that the current gen cosnoels will most likely never exceed 1080p for games, it is safe to assume that both consoles will hit a peak eventually after optimization and such is factored in. Neither console needs more than 4GB of memory dedicated to video; anything more than 3 is absurd for games running at 1080p. Which is why it would be a joke to bring up the fact that one has more usable memory than the other, since in the end IT DOES NOT MATTER AT 1080P. PC benchmarks would verify that perfectly. The fact that 2GB of GDDR5 seems to handle 1080p games perfectly fine should be more than enough proof of that. I don't get why people think the way they do about hardware. The fact that one produces more flops than the other has no real standing in games. Drivers have increased graphics card performance by large amounts, so if Microsoft can out-write Sony driver-wise, their graphics chip could end up being the more powerful one, even though on paper the one in the PS4 is more powerful. People have severely underestimated what a graphics driver rewrite can mean for the Xbox One. It has been rewritten, and that alone could boost the Xbox One's performance. The 50MHz bump, not so much. |
Just found that typo really funny. Like the typo itself is dyslexic. xD
cosnoels
I don't understand why some people keep adding the eSRAM bandwidth to the DDR3 one for a total bandwidth, if they know nothing about technology then they shouldn't do something so stupid.
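The point about not adding the two bandwidth figures together can be made concrete. A minimal sketch, assuming the published 68.3 GB/s DDR3 peak and the 192 GB/s eSRAM peak figure (the two numbers that, summed naively, give the ~260 GB/s claim), and assuming the two pools are accessed serially rather than perfectly overlapped; the traffic split is an illustrative assumption:

```python
# Why peak bandwidths don't simply add: only the slice of the working
# set that fits in the 32 MB eSRAM can use the fast path, and real
# effective bandwidth depends on how traffic splits between the pools.

DDR3_BW = 68.3    # GB/s, Xbox One main memory (published peak)
ESRAM_BW = 192.0  # GB/s, eSRAM peak used in the "260 GB/s" sums

def effective_bw(esram_fraction):
    """Blended rate for moving 1 GB with a given traffic split,
    assuming the transfers happen serially (no overlap)."""
    time = esram_fraction / ESRAM_BW + (1 - esram_fraction) / DDR3_BW
    return 1 / time

# Even with half of all traffic hitting eSRAM, the effective figure is
# nowhere near the naive 68.3 + 192 = 260.3 GB/s sum:
print(f"{effective_bw(0.5):.1f} GB/s")
```

Real behavior sits somewhere between this serial model and the naive sum, since the buses can overlap to some degree, but the sum is a theoretical ceiling that requires saturating both pools simultaneously, not a number any workload actually sees.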
And I did so well staying away from poorly written and factually misleading articles/blogs like this for the past month and a half. I see I need to stay away a bit longer.
I am the Playstation Avenger.
CGI-Quality said:
http://www.pushsquare.com/news/2013/07/xbox_one_developer_concedes_that_ps4_is_more_powerful http://www.gamespot.com/news/just-cause-dev-ps4-more-powerful-than-xbox-one-right-now-6408781 http://www.computerandvideogames.com/408484/ps4-currently-beats-xbox-one-in-terms-of-raw-power-says-avalanche-technical-officer/ http://gamingeverything.com/oddworld-developer-says-ps4-is-a-bit-stronger-than-xbo-overall-multiplatform-games-could-be-slightly-better-on-sonys-machine/ I tried to find devs from different months, but like a lot of people, most devs I hear from have been clear that the PS4 is more powerful. But their word isn't all there is to go on: just the GPUs alone, 1.84 TFLOPS vs 1.29 TFLOPS. Plus, 8GB GDDR5 is stronger than 8GB DDR3, no matter how it's spun. None of it should be that big of a deal, but with an article being posted with blatant fallacies, things should be corrected. |
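For reference, the headline TFLOPS figures in that post follow from public shader counts and clocks: shader ALUs x 2 ops per clock (fused multiply-add) x core frequency. A quick sketch using the widely reported final specs; note the exact Xbox One figure depends on whether the pre- or post-upclock frequency is assumed:

```python
# Single-precision peak throughput: ALUs x 2 FLOPs/clock (FMA) x clock.

def tflops(shader_alus, clock_mhz):
    return shader_alus * 2 * clock_mhz * 1e6 / 1e12

ps4 = tflops(1152, 800)  # 18 CUs x 64 ALUs at 800 MHz
xb1 = tflops(768, 853)   # 12 CUs x 64 ALUs at 853 MHz (post-upclock)

print(f"PS4: {ps4:.2f} TFLOPS, Xbox One: {xb1:.2f} TFLOPS")
```

This yields roughly 1.84 vs 1.31 TFLOPS, so the ratio the poster is arguing from holds regardless of which small clock variant is plugged in: these are theoretical peaks, useful for comparing the two chips but not a direct measure of game performance.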
Thread should have ended here.
Well said CGI.
How on earth the Xbone has 260 GB/s of bandwidth is beyond me.
Only read the front page of comments plus the OP. As many have said, there's so much wrong info there based on abstract conclusions. I only had to look at the graph indicating the speed of the RAM before I skipped that section completely. Funny that he put his credentials up front, only to make himself look silly.
The gap is bigger this time than the PS360 gap, folks.
This is like saying 7770/7790 and 7850/7870 are the same.
e=mc^2
Gaming on: PS4 Pro, Switch, SNES Mini, Wii U, PC (i5-7400, GTX 1060)
dsgrue3 said: The fuck would someone go to school for CS if they wanted to concern themselves with hardware? That's straight up computer engineering. |
Obviously, CS curricula differ a lot depending on the country you study in.