
Of course Xbox One can do 1080p, 60fps, you guys are crazy

Panicnausia said:
megafenix said:
Panicnausia said:

This is not from Sony; it's a blog that was disproven months ago. The math doesn't even make sense. Try harder. I ask again: what does this have to do with the PS4 still being quite a bit more powerful? This doesn't change that.


it's not the only site, you know

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?print=1

"

Inside the PlayStation 4 With Mark Cerny

By Christian Nutt

The PlayStation 4 is due out this fall, and its technical specifications have been largely under wraps -- till now. While the company gave a presentation at GDC, the system's lead architect, Mark Cerny, hasn't talked publicly in any great depth about the platform since its unveiling this February.

Cerny approached Gamasutra in the hope of delivering a "no holds barred PlayStation 4 hardware exposé," he said, during the interview that resulted in this story. "That certainly is what we're here to do," said Cerny, before speaking to Gamasutra for well over an hour.

What follows is a total breakdown of the hardware from a developer's perspective: the chips on the board, and what they're capable of.

Questions on the UI and OS were off the table. What was up for discussion is what the system is capable of, and the thinking that led Cerny and his team to make the decisions they made about the components they chose and how they function together.

To get to the heart of this deeply technical discussion, Gamasutra was assisted by someone with an intimate knowledge of how console hardware really works: Mark DeLoura, THQ's former VP of tech and now senior adviser for digital media at the White House Office of Science and Technology Policy.

 

What Does 'Supercharged' Mean, Anyway?

"One thing we could have done is drop it down to 128-bit bus, which would drop the bandwidth to 88 gigabytes per second, and then have eDRAM on chip to bring the performance back up again," said Cerny. While that solution initially looked appealing to the team due to its ease of manufacturability, it was abandoned thanks to the complexity it would add for developers. "We did not want to create some kind of puzzle that the development community would have to solve in order to create their games. And so we stayed true to the philosophy of unified memory."

In fact, said Cerny, when he toured development studios asking what they wanted from the PlayStation 4, the "largest piece of feedback that we got is they wanted unified memory."

"I think you can appreciate how large our commitment to having a developer friendly architecture is in light of the fact that we could have made hardware with as much as a terabyte [Editor's note: 1000 gigabytes] of bandwidth to a small internal RAM, and still did not adopt that strategy," said Cerny. "I think that really shows our thinking the most clearly 

"

 

here too:

http://www.dualshockers.com/2013/06/29/mark-cerny-explains-how-the-ps4s-8-gb-gdrr5-ram-and-bus-work-and-why-they-were-chosen/

 

Don't know who the guy is?

http://en.wikipedia.org/wiki/Mark_Cerny

"

On February 20, 2013 at the global Sony PlayStation 4 unveiling event in New York City Mark Cerny was revealed as the console's lead architect. He also showed in-game footage of his own new game called Knack which he is developing for the console.[9] On September 21, 2013 Cerny was revealed to have been the lead architect of PlayStation Vita.[10]

"


Yeah, and nowhere does he say 32 MB of eSRAM at 1000 GB per second will give you a major system bump overall. The 32 MB of eDRAM or eSRAM could be 100000 GB per second; it's so tiny it still has to go through main RAM. Do you even know how RAM and the system work? Nowhere does he say any system has it, just that they could have. Try harder.

Nowhere does it say the X1 has 1000 GB/s of bandwidth, either...

Also, I ask again: how does this help the video card?


eSRAM and eDRAM are similar but different.

I am aware that Microsoft said the eSRAM has 200 GB/s, but that doesn't mean it is comparable to eDRAM at 200 GB/s, since eSRAM has advantages like no refresh cycles and lower latency, so in the end it could behave like eDRAM at 500 GB/s, for example.
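To be clear about what that claim would mean: the argument is that sustained bandwidth, not peak, is what counts, and that SRAM's lack of refresh and lower latency let it sustain more of its peak. A toy sketch of that argument only; the utilization figures are illustrative assumptions, not measurements, and the 500 GB/s figure above remains speculation:

```python
# Toy model: effective bandwidth = peak bandwidth x sustained utilization.
# The utilization figures are illustrative assumptions, not measurements.
ESRAM_PEAK = 200.0   # GB/s, the figure quoted above
ESRAM_UTIL = 0.90    # assumed high: no refresh stalls, low latency
EDRAM_UTIL = 0.60    # assumed lower: refresh cycles + higher latency

esram_effective = ESRAM_PEAK * ESRAM_UTIL           # 180.0 GB/s sustained
edram_peak_needed = esram_effective / EDRAM_UTIL    # 300.0 GB/s of eDRAM to match
print(esram_effective, edram_peak_needed)
```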

 

and don't forget this:

http://hdwarriors.com/why-the-wii-u-is-probably-more-capable-than-you-think-it-is/

"

The easiest way that I can explain this is that when you take each unit of time that the Wii U eDRAM can do work with separate tasks as compared to the 1 Gigabyte of slower RAM, the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM, regardless of the fact that the size and number applied makes the 1 Gigabyte of DDR3 RAM seem larger. These are units of both time and space. Fast eDRAM that can be used at a speed more useful to the CPU and GPU has certain advantages, that when exploited, give the console great gains in performance.

The eDRAM of the Wii U is embedded right onto the chip logic, which for most intents and purposes negates the classic In/Out bottleneck that developers have faced in the past as well. It reads and writes directly to all of the chips on the Multi-Chip Module as instructed.

"

The eDRAM is mainly for the GPU, not the CPU, and the OS runs in main RAM, not the eSRAM. And considering how tiny the internal caches are, like the local data, global data, and texture caches, 32 megabytes doesn't look that tiny.
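For scale, a quick check of what actually fits in 32 MB at 1080p; the G-buffer layout here is a hypothetical example, not any shipped game's:

```python
# How much of a 1080p frame fits in 32 MB? Sizes in MB.
# The G-buffer layout is a hypothetical example, not any shipped game's.
def target_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 * 1024)

one_rgba8 = target_mb(1920, 1080, 4)    # one colour target: ~7.9 MB
gbuffer = 4 * one_rgba8                 # four colour targets for deferred shading
depth = target_mb(1920, 1080, 4)        # 32-bit depth/stencil
print(one_rgba8, gbuffer + depth)       # ~7.9 MB and ~39.6 MB -- already over 32
```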



megafenix said:
[...]

None of that changes the fact that the PS4 still has more memory bandwidth and a better GPU... eSRAM only helps the slower DDR3 memory some; unified GDDR5 still mops the floor with it.

And let's not even get into the GPU, which is the single biggest factor. This "secret sauce" eSRAM talk is just nonsense spread by people in denial.

Keep posting that article; it doesn't change anything, nor does it prove anything. It is technically impossible for 32 MB of eSRAM to offset the memory and GPU advantage of the PS4.

If anything it is a barrier for developers, which is worse than any small performance gains it brings. As the PS3 can tell you.

Where is your proof that eSRAM could behave like 500 GB/s?? You saying it certainly doesn't make it so.

You cannot simply make up numbers, add them together, and call it fact. A system is only as fast as its weakest link, and the X1 is pretty unbalanced.
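The weakest-link point can be put as a back-of-the-envelope calculation: the frame rate is capped by whichever resource runs out first. The per-frame costs here are illustrative placeholders, not profiled numbers:

```python
# Weakest-link estimate: frame rate is capped by whichever resource runs out first.
# Per-frame costs are illustrative placeholders, not profiled numbers.
budget = {"compute_tflops": 1.31, "bandwidth_gbs": 68.0}        # available per second
cost_per_frame = {"compute_tflops": 0.030, "bandwidth_gbs": 2.0}

fps_cap = {k: budget[k] / cost_per_frame[k] for k in budget}
print(fps_cap)                  # compute would allow ~44 fps, bandwidth only 34
print(min(fps_cap.values()))    # the weakest link sets the pace: 34.0
```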

 

There is a reason the experts who make games say unified, faster memory is the way to go.



Panicnausia said:
[...]

Actually, it should be clear that the eSRAM has more bandwidth and is faster; the problem is that it is more difficult to code for and requires more thinking about how to use it to take advantage of it. Read Mark Cerny's quote, please.

GDDR5's advantage is that it is easier to code for. The bandwidth is good, but there is still the problem of higher latency and the fact that it is not located near any component. Still, since it is easier to code for, it is more likely developers will have no problem building good-looking games, compared to the difficulty that eSRAM would represent.
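One concrete example of the extra thinking eSRAM asks for: a render-target set that doesn't fit in 32 MB has to be split into tiles that do, with explicit copies in and out of DDR3 for each tile. A sketch of the general tiling idea only, not Microsoft's actual API; the 20 bytes per pixel is a hypothetical layout:

```python
# Sketch of scratchpad tiling: split a render-target set that is too big for
# the 32 MB eSRAM into horizontal tiles that fit, one copy in/out per tile.
# The general idea only, not Microsoft's actual API; 20 B/pixel is hypothetical
# (four RGBA8 colour targets + 32-bit depth).
ESRAM_BYTES = 32 * 1024 * 1024
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 20

rows_per_tile = ESRAM_BYTES // (WIDTH * BYTES_PER_PIXEL)  # 873 rows fit at once
num_tiles = -(-HEIGHT // rows_per_tile)                   # ceiling division -> 2 passes
print(rows_per_tile, num_tiles)
```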



megafenix said:
[...]


32 MB, no matter how fast it is, is another call stack the code has to run through. Any speed advantage is offset by that alone, plus having to code for it... Even with 32 MB of eSRAM, the X1 still has less memory bandwidth. eSRAM does NOT have more bandwidth; it gives the X1 a little more than DDR3 alone, but it still does not match unified GDDR5.

Again, experts say unified GDDR5 is better. There are numerous tech analyses of each system that detail why the PS4 is considerably more powerful. You simply cannot add the GB/s of the eSRAM to the overall memory bandwidth; it doesn't work like that. Everything you said here is FUD.

The GDDR5 in the PS4 is right next to the APU's unified architecture components, so stop spreading BS. Not that it would make any difference, LMAO.

Again, this does nothing to offset the biggest advantage... the GPU. No amount of eSRAM will help the GPU.
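The "you cannot simply add the numbers" point can be made precise. If accesses to the two pools are serialized, the effective bandwidth is a weighted harmonic mean set by where the traffic actually goes; truly concurrent access would land somewhere between this and the raw sum. The 30/70 traffic split below is an illustrative assumption:

```python
# If accesses to the two pools are serialized, effective bandwidth is a
# weighted harmonic mean, not a sum. The 30/70 traffic split is an
# illustrative assumption.
def combined_gbs(frac_fast, fast_gbs, slow_gbs):
    return 1 / (frac_fast / fast_gbs + (1 - frac_fast) / slow_gbs)

print(combined_gbs(0.30, 204.0, 68.0))   # ~85 GB/s, nowhere near 204 + 68 = 272
```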



Panicnausia said:

Keep posting that article; it doesn't change anything, nor does it prove anything. It is technically impossible for 32 MB of eSRAM to offset the memory and GPU advantage of the PS4.


Well, it's not just technically impossible, it is impossible.
eSRAM cannot offset the compute advantages that the PlayStation 4 has; it can, however, offset any bandwidth advantage. Also keep in mind that the Xbox One, by the very nature of having fewer compute resources, in turn doesn't need as much bandwidth for its pipelines to become fully saturated.
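That balance point can be quantified as bytes of bandwidth available per FLOP of shader work, using the widely published specs for each console; note that adding the eSRAM peak to DDR3 is optimistic, per the post above:

```python
# Bandwidth-to-compute balance: bytes of memory traffic available per FLOP.
# TFLOPS and GB/s are the widely published specs for each console.
def bytes_per_flop(gbs, tflops):
    return gbs / (tflops * 1000)

print(bytes_per_flop(176.0, 1.84))          # PS4:               ~0.096 B/FLOP
print(bytes_per_flop(68.0, 1.31))           # X1, DDR3 only:     ~0.052 B/FLOP
print(bytes_per_flop(68.0 + 204.0, 1.31))   # X1 + eSRAM peak:   ~0.208 B/FLOP (optimistic)
```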

The ROP advantage the PlayStation 4 has is another big factor; the Xbox One has the same number of ROPs that a GeForce 6800 and a Radeon X850 had 8-9 years ago. (Although the Xbox One's are probably at least twice as fast thanks to the clocks alone, let alone any efficiency gains that have been invented since then.)
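The fill-rate arithmetic behind that comparison, using the published ROP counts and clocks (the 6800 figure is for the 400 MHz Ultra):

```python
# Peak pixel fill rate = ROPs x core clock.
def gpixels_per_sec(rops, clock_mhz):
    return rops * clock_mhz / 1000

print(gpixels_per_sec(32, 800))   # PS4:                25.6 Gpixel/s
print(gpixels_per_sec(16, 853))   # Xbox One:           13.6 Gpixel/s
print(gpixels_per_sec(16, 400))   # GeForce 6800 Ultra:  6.4 Gpixel/s (~half the X1)
```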

They all pale in comparison to the almighty PC, however.



--::{PC Gaming Master Race}::--

Panicnausia said:
[...]


I am not saying GDDR5 is not the better option, quite the contrary, but I am also pointing out the advantages of the eSRAM along with its weaknesses.

GDDR5 is more convenient since it is easier to code for; eSRAM may have more bandwidth, faster access, no refresh cycles, and almost no latency, but it is still more difficult to take advantage of.

That's all I am saying. But you should be aware that games that really squeeze the eSRAM's capabilities can deliver great graphics. I don't want to say how just yet, since I want to talk about this for the Wii U first, so I will leave it for later.



The X1 can do 2160p at 30fps too. Theoretically.


Imagine Ryse 2 at 2160p... (at a 10fps average, with framerate drops to... well, Ryse drops to 16fps, so at 4K... 4fps?)

But it would look good on screenshots.
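The arithmetic behind the joke: if a game were purely fill-rate bound, frame rate would scale with the inverse of pixel count, and 2160p is exactly four times 1080p:

```python
# If a game were purely fill-rate bound, fps would scale with 1/pixel count.
# Crude extrapolation; Ryse actually renders at 900p, so 4K would be worse still.
scale = (3840 * 2160) / (1920 * 1080)   # 2160p = exactly 4x the pixels of 1080p
print(30 / scale)    # a locked 30 fps at 1080p -> 7.5 fps at 2160p
print(16 / scale)    # the 16 fps dip          -> 4 fps, the number above
```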



And a fat guy can run the 100m.

Does not mean anyone wants to see that happen.



This is the Game of Thrones

Where you either win

or you DIE

Panicnausia said:
selnor1983 said:
Panicnausia said:
selnor1983 said:
Ryse has the best launch game visuals. That's good enough for me. QB looks awesome and I'm sure Halo 5 will too.

I prefer to have a media device nowadays anyway, over mainly a games console.


No, Ryse isn't. The fact that KZ looks as good or better, is pushing 30-40% more pixels, is a lot more open, and has way more going on only underscores the limits the X1 has in reaching said visuals. Not to mention an FPS that dips into the low 20s.

The PS4 is also a media device and a better game system, and people will agree once sales numbers slowly trickle out.

The majority say it is, buddy.

A lot more detail in environments, animations, and characters. Also, the lighting is ridiculous.

Good luck with enjoying your PS4. You're doing a lot of justifying lately.

Majority???? Where? Digital Foundry disagrees. The tech proof is in the pudding; Ryse would have run at 1080p on PS4.

No, DF's technical analysis put Ryse over everything at launch. What you're referring to on NeoGAF was the writer saying he preferred the aesthetic of KZ: SF. Art style, to help you understand. He confirmed that when questioned.

The GAF forum posting you're trying to cling to is him outside of DF. His DF article is his job. Don't confuse the two. DF says Ryse > KZ. But I'm sure you'll cling to what you can.

Ryse is better in all categories. KZ's animation is actually pretty bad, as is the texture detail in comparison to Ryse.



Assassin's Creed IV is pretty good. Best iteration since II.