
Forums - Gaming Discussion - Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

superchunk said:
Badassbab said:
ESRAM on Xbox One needs updating. I'm pretty sure it's 109GB/s as standard due to the ~10% GPU speed increase, but has a theoretical peak of 192GB/s?

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

Others please correct me if I'm wrong, but a change in the clock speed of a CPU/GPU does not mean there is any change to the bus speed of the memory and/or its pipelines.

Allow me to correct myself-

http://gamrconnect.vgchartz.com/thread.php?id=167055 <<< Third inline attachment in ethomaz's OP.

MS's Hot Chips presentation says 109GB/s minimum and 204GB/s peak, taking inefficiencies into account.



Badassbab said:

Allow me to correct myself-

http://gamrconnect.vgchartz.com/thread.php?id=167055 <<< Third inline attachment in ethomaz's OP.

MS's Hot Chips presentation says 109GB/s minimum and 204GB/s peak, taking inefficiencies into account.

The issue is that that figure is for the eSRAM connection... which is only one small part of the whole equation.

I updated the 102 to 109 for the eSRAM. It is more accurate, but it shouldn't change anything in my various summaries for each section.
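For what it's worth, the 109GB/s figure can be sanity-checked from the published clock. A quick sketch, assuming the 853MHz post-upclock speed and a 128-byte-per-cycle interface (figures from the Digital Foundry interview linked above, not confirmed by this thread):

```python
# Sanity check on the eSRAM bandwidth numbers. Assumptions (from the
# Digital Foundry interview): 853 MHz eSRAM clock and 128 bytes moved
# per cycle in one direction.
clock_hz = 853e6
bytes_per_cycle = 128

one_way = clock_hz * bytes_per_cycle / 1e9   # GB/s, single direction
both_ways = 2 * one_way                      # simultaneous read + write

print(round(one_way))    # 109 GB/s minimum
print(round(both_ways))  # 218 GB/s theoretical; MS quotes ~204 GB/s
                         # once real-world overhead is factored in
```

The gap between the 218GB/s theoretical and the 204GB/s Hot Chips figure is exactly the "inefficiencies" caveat in the slide.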



superchunk said:
RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time due to Chipworks and NeoGaf threads discussion combined with various tech sites.

Not much of a confirmation or anything, but a while ago a dev made a thread here where we could ask him things, and ninjablade (who else could it be?) put this question to him. His answer and a bit more:

http://gamrconnect.vgchartz.com/post.php?id=5737901

Too bad he had to leave the site over that thread.




superchunk said:
RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time due to Chipworks and NeoGaf threads discussion combined with various tech sites.


How do you have 320 shaders when NeoGAF confirmed it was 160 shaders, and in the very same thread the Project CARS dev said it was 192 shaders? 320 shaders has already been ruled out by NeoGAF and other tech sites, unless you're not keeping up with the discussion and want to ignore the facts.



eyeofcore said:
drake4 said:
 

I'm still waiting for your source where the 160 shaders claim was debunked. I checked the NeoGAF thread where it was confirmed (http://www.neogaf.com/forum/showthread.php?t=710765) and there is no debunking, just everybody agreeing.

*facepalm*

The member who created that thread stated in the thread title that it is a rumor, said in the thread that it is a rumor, and confirmed it is a rumor, yet you state a rumor as fact; and the people who agreed are Sony/Microsoft fans with little to no knowledge of hardware, let alone software, who only validate the lowest common denominator.

A majority of people agreeing on something =/= fact.

Being in the majority saying one thing does not mean that you and they are right; you are basing all your claims on a rumor without any evidence or foundation, while the die shots contradict the claims you support.

The consensus at NeoGAF about the 160 shaders rumor is doubt, because of the games that have been or will be released on Wii U. Need for Speed: Most Wanted U has little to no framerate trouble compared to the PlayStation 3 and Xbox 360 versions, no screen tearing, far better lighting (especially at night), and high-quality textures. Trine 2: Director's Cut looks better and runs better, and the developers themselves said the Wii U version of Trine 2 would not be possible on Xbox 360 or PlayStation 3 hardware without a downgrade. That's one example...

Also, care to explain to me why Bayonetta 2 looks way better than the original Bayonetta, since Bayonetta 2 should not be possible on a 160 SPU GPU even if it's more efficient?


Read the thread correctly: it clearly says the 160 shaders part is confirmed and the downgrade part is the rumor.



drake4 said:
superchunk said:
RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time due to Chipworks and NeoGaf threads discussion combined with various tech sites.


How do you have 320 shaders when NeoGAF confirmed it was 160 shaders, and in the very same thread the Project CARS dev said it was 192 shaders? 320 shaders has already been ruled out by NeoGAF and other tech sites, unless you're not keeping up with the discussion and want to ignore the facts.

GAF's thread on the GPU has 320 in its OP.
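To put the disputed counts in perspective, here is what each rumored shader count would imply for raw throughput. Both inputs are assumptions: the widely reported (but unconfirmed) 550MHz Wii U GPU clock, and 2 FLOPs per shader per cycle (one multiply-add, standard for AMD VLIW parts of that era).

```python
# Raw-throughput implication of each rumored shader count.
# Assumptions: 550 MHz GPU clock (widely reported, unconfirmed)
# and 2 FLOPs per shader per cycle (one multiply-add).
CLOCK_HZ = 550e6
FLOPS_PER_SHADER = 2

for shaders in (160, 192, 320):
    gflops = shaders * FLOPS_PER_SHADER * CLOCK_HZ / 1e9
    print(f"{shaders} shaders -> {gflops:.1f} GFLOPS")
# 160 -> 176.0, 192 -> 211.2, 320 -> 352.0
```

That factor-of-two spread (176 vs 352 GFLOPS) is why the 160-vs-320 argument matters so much for comparisons against the Xbox 360's ~240 GFLOPS.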



drake4 said:
superchunk said:
RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time due to Chipworks and NeoGaf threads discussion combined with various tech sites.


How do you have 320 shaders when NeoGAF confirmed it was 160 shaders, and in the very same thread the Project CARS dev said it was 192 shaders? 320 shaders has already been ruled out by NeoGAF and other tech sites, unless you're not keeping up with the discussion and want to ignore the facts.

First you accuse me and now you accuse him... Nice job.

You need to stop claiming that Wii U's GPU has 160 shaders, because it does not. You are stating a rumor as fact, yet that rumor never had any source or evidence/proof, while I base my statement on the Chipworks die shot of Wii U's GPU. So is the Chipworks die shot fake? Are you going to say that?

What facts? Imaginary proof based on a rumor without any evidence or source at all, and that is a fact to you?!

The Project CARS developer did not say anything. The user who leaked the Project CARS material only leaked a log containing some information about the render code; that it mentions 192 threads does not mean the Wii U GPU has 192 shaders, only that it uses 192 threads. Either way, you have interpreted the Project CARS information wrongly. I think those threads probably relate to SRAM cells/cache: Bobcat, for example, has 16 SRAM cell/cache blocks and 20 SPUs/shaders, a ratio of 1.25. Take 192 × 1.25 and you get 240 SPUs, which would mean the same number of shaders as the Xbox 360. That is the theory.

Don't mix up threads with shaders.
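The cells-to-shaders ratio argument above works out like this. Note this is a sketch of the poster's own reasoning: the 16:20 Bobcat ratio and the 192-thread log figure are the post's assumptions, not confirmed hardware specs.

```python
# The ratio argument from the post above. Assumption (from the post,
# not a confirmed spec): AMD's Bobcat layout pairs 16 SRAM cell blocks
# with 20 SPUs, giving 20/16 = 1.25 shaders per SRAM block.
ratio = 20 / 16                  # 1.25
threads_in_log = 192             # figure quoted from the Project CARS log

estimated_spus = threads_in_log * ratio
print(estimated_spus)            # 240.0, the Xbox 360's shader count
```

Whether render-code "threads" actually map to SRAM blocks this way is itself unverified, which is the weak link in the theory.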



drake4 said:
eyeofcore said:
drake4 said:
 

I'm still waiting for your source where the 160 shaders claim was debunked. I checked the NeoGAF thread where it was confirmed (http://www.neogaf.com/forum/showthread.php?t=710765) and there is no debunking, just everybody agreeing.

*facepalm*

The member who created that thread stated in the thread title that it is a rumor, said in the thread that it is a rumor, and confirmed it is a rumor, yet you state a rumor as fact; and the people who agreed are Sony/Microsoft fans with little to no knowledge of hardware, let alone software, who only validate the lowest common denominator.

A majority of people agreeing on something =/= fact.

Being in the majority saying one thing does not mean that you and they are right; you are basing all your claims on a rumor without any evidence or foundation, while the die shots contradict the claims you support.

The consensus at NeoGAF about the 160 shaders rumor is doubt, because of the games that have been or will be released on Wii U. Need for Speed: Most Wanted U has little to no framerate trouble compared to the PlayStation 3 and Xbox 360 versions, no screen tearing, far better lighting (especially at night), and high-quality textures. Trine 2: Director's Cut looks better and runs better, and the developers themselves said the Wii U version of Trine 2 would not be possible on Xbox 360 or PlayStation 3 hardware without a downgrade. That's one example...

Also, care to explain to me why Bayonetta 2 looks way better than the original Bayonetta, since Bayonetta 2 should not be possible on a 160 SPU GPU even if it's more efficient?


Read the thread correctly: it clearly says the 160 shaders part is confirmed and the downgrade part is the rumor.

No. You are claiming a rumor as fact even though the individual who made that thread stated it was a rumor in the thread and in its title, and said that everyone should treat it as a rumor. He did not give any evidence or source; he only said that some person told him something, so it is an anonymous source that could be BS-ing everyone, including him.

Actually... you know what... I won't stop you from claiming a mere rumor as fact and going against the evidence at hand.

At least I have done research, while you base your "fact" on a rumor without any source or evidence to support it; the individual who created that thread said to treat it as a rumor, wrote as much in the thread, and put "rumor" in the thread title.

You are just stating a rumor as fact... It has been "confirmed", yet it is not actually confirmed.

Your confirmation is the confirmation of people who want to believe the rumor is true, yet the rumor itself has no confirmation of any kind: no evidence, no source, no real statement except from some anonymous individual who may or may not be full of it.



eyeofcore said:
drake4 said:
eyeofcore said:
drake4 said:
 

I'm still waiting for your source where the 160 shaders claim was debunked. I checked the NeoGAF thread where it was confirmed (http://www.neogaf.com/forum/showthread.php?t=710765) and there is no debunking, just everybody agreeing.

*facepalm*

The member who created that thread stated in the thread title that it is a rumor, said in the thread that it is a rumor, and confirmed it is a rumor, yet you state a rumor as fact; and the people who agreed are Sony/Microsoft fans with little to no knowledge of hardware, let alone software, who only validate the lowest common denominator.

A majority of people agreeing on something =/= fact.

Being in the majority saying one thing does not mean that you and they are right; you are basing all your claims on a rumor without any evidence or foundation, while the die shots contradict the claims you support.

The consensus at NeoGAF about the 160 shaders rumor is doubt, because of the games that have been or will be released on Wii U. Need for Speed: Most Wanted U has little to no framerate trouble compared to the PlayStation 3 and Xbox 360 versions, no screen tearing, far better lighting (especially at night), and high-quality textures. Trine 2: Director's Cut looks better and runs better, and the developers themselves said the Wii U version of Trine 2 would not be possible on Xbox 360 or PlayStation 3 hardware without a downgrade. That's one example...

Also, care to explain to me why Bayonetta 2 looks way better than the original Bayonetta, since Bayonetta 2 should not be possible on a 160 SPU GPU even if it's more efficient?


Read the thread correctly: it clearly says the 160 shaders part is confirmed and the downgrade part is the rumor.

No. You are claiming a rumor as fact even though the individual who made that thread stated it was a rumor in the thread and in its title, and said that everyone should treat it as a rumor. He did not give any evidence or source; he only said that some person told him something, so it is an anonymous source that could be BS-ing everyone, including him.

Actually... you know what... I won't stop you from claiming a mere rumor as fact and going against the evidence at hand.

At least I have done research, while you base your "fact" on a rumor without any source or evidence to support it; the individual who created that thread said to treat it as a rumor, wrote as much in the thread, and put "rumor" in the thread title.

You are just stating a rumor as fact... It has been "confirmed", yet it is not actually confirmed.

Your confirmation is the confirmation of people who want to believe the rumor is true, yet the rumor itself has no confirmation of any kind: no evidence, no source, no real statement except from some anonymous individual who may or may not be full of it.

http://www.neogaf.com/forum/showpost.php?p=88910693&postcount=6

http://www.neogaf.com/forum/showpost.php?p=88911841&postcount=60 (read the last line, dude)

http://www.neogaf.com/forum/showpost.php?p=89459977&postcount=536



drake4 said:
eyeofcore said:
drake4 said:
eyeofcore said:
drake4 said:
 

I'm still waiting for your source where the 160 shaders claim was debunked. I checked the NeoGAF thread where it was confirmed (http://www.neogaf.com/forum/showthread.php?t=710765) and there is no debunking, just everybody agreeing.

*facepalm*

The member who created that thread stated in the thread title that it is a rumor, said in the thread that it is a rumor, and confirmed it is a rumor, yet you state a rumor as fact; and the people who agreed are Sony/Microsoft fans with little to no knowledge of hardware, let alone software, who only validate the lowest common denominator.

A majority of people agreeing on something =/= fact.

Being in the majority saying one thing does not mean that you and they are right; you are basing all your claims on a rumor without any evidence or foundation, while the die shots contradict the claims you support.

The consensus at NeoGAF about the 160 shaders rumor is doubt, because of the games that have been or will be released on Wii U. Need for Speed: Most Wanted U has little to no framerate trouble compared to the PlayStation 3 and Xbox 360 versions, no screen tearing, far better lighting (especially at night), and high-quality textures. Trine 2: Director's Cut looks better and runs better, and the developers themselves said the Wii U version of Trine 2 would not be possible on Xbox 360 or PlayStation 3 hardware without a downgrade. That's one example...

Also, care to explain to me why Bayonetta 2 looks way better than the original Bayonetta, since Bayonetta 2 should not be possible on a 160 SPU GPU even if it's more efficient?


Read the thread correctly: it clearly says the 160 shaders part is confirmed and the downgrade part is the rumor.

No. You are claiming a rumor as fact even though the individual who made that thread stated it was a rumor in the thread and in its title, and said that everyone should treat it as a rumor. He did not give any evidence or source; he only said that some person told him something, so it is an anonymous source that could be BS-ing everyone, including him.

Actually... you know what... I won't stop you from claiming a mere rumor as fact and going against the evidence at hand.

At least I have done research, while you base your "fact" on a rumor without any source or evidence to support it; the individual who created that thread said to treat it as a rumor, wrote as much in the thread, and put "rumor" in the thread title.

You are just stating a rumor as fact... It has been "confirmed", yet it is not actually confirmed.

Your confirmation is the confirmation of people who want to believe the rumor is true, yet the rumor itself has no confirmation of any kind: no evidence, no source, no real statement except from some anonymous individual who may or may not be full of it.

http://www.neogaf.com/forum/showpost.php?p=88910693&postcount=6

http://www.neogaf.com/forum/showpost.php?p=88911841&postcount=60 (read the last line, dude)

Invalid and denied... That individual has no credibility, and he is the one who created that thread.

He and/or his source is ignoring the amount of SRAM cells, which indicates the amount of shaders according to AMD's design; his claims, and/or those of his "someone", contradict the design of the GPU and the very die shot they claim "confirms" their thesis. Bgassassin is probably spreading misinformation, and the moderators/admins have likely been deceived by fabricated information.

You will say I am in "denial", and that statement will be factually wrong.

J1 to J4 don't even look remotely like interpolators, and you can only have one interpolator, not four of them.

EDIT:
J1 to J4 look like TMUs or parts of CUs; I am using an AMD Radeon HD 4870 die shot for reference, along with what I dug up around the net.