
Forums - Microsoft Discussion - Penello specifies why X1vsPS4 performance difference is overstated

DD_Bwest said:
A few weeks... That's pretty laughable.

Not according to this guy:

Depending on the depth of info, he is most likely seeking approval from several divisions to release company confidential information. I know it sounds silly, but this is most likely happening. It goes on at my company when we release tech/patent info, and the lawyers do take a while.

http://www.neogaf.com/forum/showpost.php?p=81458017&postcount=850



Nsanity said:
DD_Bwest said:
A few weeks... That's pretty laughable.

Not according to this guy:

Depending on the depth of info, he is most likely seeking approval from several divisions to release company confidential information. I know it sounds silly, but this is most likely happening. It goes on at my company when we release tech/patent info, and the lawyers do take a while.

http://www.neogaf.com/forum/showpost.php?p=81458017&postcount=850

That sounds reasonable. He needs to get the info and has the headache of getting approval from numerous departments/groups (who're all likely butting heads) before going through the marketing dept to make sure it's all PR friendly.



Nsanity said:

Albert Penello:

I was quickly corrected (both on the forum and from people at the office) for writing the wrong number.

The challenge with the NeoGAF format is that, because threads move so fast, posts disappear or get buried so people aren't reading everything. Maybe this part got lost:

I've stated - there is no possible way for one single person to know every detail about our platform. That means I need to go get the answers to the questions you guys ask sometimes. There's a lot I know first hand, and a lot I need to get updated on.

So people understand - I'm not dodging any of the follow-up. I actually stated the other night - there are a handful of people who asked some really legitimate follow-ups to understand what I posted. And I honestly said - I'm not the guy to answer at that level of detail. Out of respect for the people who are genuinely curious to learn how we derived those numbers, and to get the most technically accurate answers - the best course is to have the answers come from those engineers directly.

So we're working now on the best format to do that.

I still stand by what I stated (except for the aforementioned 204/218). In fact, some really interesting threads were going back and forth giving me even more excruciating detail behind those numbers based on the questions people asked.

I doubt it will take the format of an AMA, but I've collected a bunch of the follow-up questions. It may take a few weeks, but we'll be following-up on this for sure. 

http://www.neogaf.com/forum/showpost.php?p=81372357&postcount=632

Can't believe guys in this thread were backing up some of Penello's false statements. Even after neutral guys on our site and GAF have said that some of the stuff he's saying is just wrong. Lost respect for a lot of posters here on the Chartz in this thread... at least on a technical level.

I'm glad the guy admitted he was wrong... seems like he legitimately cares about engaging with the GAF community. Looking forward to getting more details from the engineers!



Intel Core i7 3770K [3.5GHz]|MSI Big Bang Z77 Mpower|Corsair Vengeance DDR3-1866 2 x 4GB|MSI GeForce GTX 560 ti Twin Frozr 2|OCZ Vertex 4 128GB|Corsair HX750|Cooler Master CM 690II Advanced|

We will see once the console comes out, but I bet MS will be right and GAF/internet pseudo-experts will get owned.



Adinnieken said:
Ashadian said:

"The CPU contained within the PS4 is running stock speeds, 1.6GHz, compared to Microsoft's 1.75GHz after an upclock. 150MHz does not make up for the difference in number of CUs, and (aptly ignored by Albert) the difference in ROP units - hence why, even with the 150MHz upclock, the Xbox One is still way behind in terms of raw power. To see just how little difference 150MHz makes, go into your BIOS and bump your clock speed up by 150MHz; you will see framerates climb barely 2fps at best. Most modern CPUs clock at least 400MHz higher than stock on average, with some people bumping that to over 900MHz.

But seriously, before you try and tell me I'm wrong: he is comparing 1.6 to 1.75 and claiming it's 10% faster. Aside from being factually wrong (10% would be 1.76GHz), it also compares the CPUs on the most basic of levels, which is a stupid thing to do, even more so given that they're APUs - compare a 2GHz Celeron to a 2GHz Pentium to see why.

Last but not least, adding up the DDR3 and eSRAM speeds to get a higher number is universally seen as a ridiculous PR stunt by those with half a brain, because theoretical peaks for different RAM types DO NOT COMBINE - just like putting two 3GHz Xeons in a server does NOT mean the server is now running at 6GHz; it's running at 3GHz with additional cores. Adding the DDR3 and the tiny space allotted by the eSRAM does not work, even for "on paper" results - you cannot fill the 8GB of DDR3 from the eSRAM fast enough without the eSRAM being bottlenecked. The eSRAM could not be used for a large majority of game resources, whereas GDDR5 can be used for just about anything at the cost of slightly higher latency."
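For what it's worth, the clock-speed arithmetic in the quote above checks out; using the publicly reported figures (PS4 CPU at 1.6 GHz, Xbox One CPU at 1.75 GHz after the upclock):

```python
# Relative clock advantage is a ratio, not an absolute MHz delta.
ps4_cpu_ghz = 1.6    # PS4 CPU clock (reported stock speed)
xb1_cpu_ghz = 1.75   # Xbox One CPU clock after the 150 MHz upclock

uplift = (xb1_cpu_ghz / ps4_cpu_ghz - 1) * 100
print(f"Xbox One CPU clock advantage: {uplift:.3f}%")  # 9.375%

# A true 10% uplift over 1.6 GHz would require:
print(f"10% over 1.6 GHz = {ps4_cpu_ghz * 1.10:.2f} GHz")  # 1.76 GHz
```

So "near 10%" rounds up from the exact 9.375%, and the quote's 1.76 GHz figure for a true 10% uplift is correct.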

Actually, he doesn't ignore it.  He addressed it.  Each CU gets a 6% speed increase, not simply the entire GPU with the frequency increase.  So while there are fewer CUs, those CUs operate faster than the PS4's.  When 12 CUs are operating at 6% greater speed than the PS4's, that equates to a 72% speed improvement.

Earlier in that thread (not that post) he states near 10%.  He has also stated before greater than 9%.  If we want to get specific, it's 9.375%.  He also talks on a deeper level, about how the Xbox One's CPU has a 30GB/s transfer rate between the CPU and memory - a 50% speed improvement over the PS4.

You missed the part where he also specified that the eSRAM's speed was 204GB/s.  That still happens to be 16% faster than the GDDR5 memory in the PS4.  Don't assume my argument is that 32MB is the same as 8GB, but I wonder what memory speed is necessary for a modern GPU to adequately process a 1920x1080 image without bottlenecking or taking a performance hit.  The eSRAM is used both as a buffer for the DDR memory and as a cache to temporarily hold data that's needed quickly.
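The 16% figure above does follow from the peak numbers being argued in the thread (204 GB/s for the revised eSRAM figure vs the PS4's 176 GB/s GDDR5), though comparing peaks says nothing about the 32 MB capacity limit:

```python
esram_peak = 204.0   # GB/s, revised Xbox One eSRAM peak under discussion
gddr5_peak = 176.0   # GB/s, PS4 unified GDDR5 peak

advantage = (esram_peak / gddr5_peak - 1) * 100
print(f"eSRAM peak vs GDDR5 peak: +{advantage:.1f}%")  # ~+15.9%
```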

How did you get to this value? You speak as if you know about this kind of stuff - very opinionated, and very harsh on other users when they mess up facts. It's kinda odd that you would make such a crazily incorrect statement.




Jadedx said:
We will see once the console comes out, but I bet MS will be right and GAF/internet pseudo-experts will get owned.


Dude, the guy admitted that some of the information he posted was wrong... have you read the GAF thread?




No, I make it a point to not stick my head in a toilet.



Shinobi-san said:
Nsanity said:

Albert Penello:

I was quickly corrected (both on the forum and from people at the office) for writing the wrong number.

The challenge with the NeoGAF format is that, because threads move so fast, posts disappear or get buried so people aren't reading everything. Maybe this part got lost:

I've stated - there is no possible way for one single person to know every detail about our platform. That means I need to go get the answers to the questions you guys ask sometimes. There's a lot I know first hand, and a lot I need to get updated on.

So people understand - I'm not dodging any of the follow-up. I actually stated the other night - there are a handful of people who asked some really legitimate follow-ups to understand what I posted. And I honestly said - I'm not the guy to answer at that level of detail. Out of respect for the people who are genuinely curious to learn how we derived those numbers, and to get the most technically accurate answers - the best course is to have the answers come from those engineers directly.

So we're working now on the best format to do that.

I still stand by what I stated (except for the aforementioned 204/218). In fact, some really interesting threads were going back and forth giving me even more excruciating detail behind those numbers based on the questions people asked.

I doubt it will take the format of an AMA, but I've collected a bunch of the follow-up questions. It may take a few weeks, but we'll be following-up on this for sure. 

http://www.neogaf.com/forum/showpost.php?p=81372357&postcount=632

Can't believe guys in this thread were backing up some of Penello's false statements. Even after neutral guys on our site and GAF have said that some of the stuff he's saying is just wrong. Lost respect for a lot of posters here on the Chartz in this thread... at least on a technical level.

I'm glad the guy admitted he was wrong... seems like he legitimately cares about engaging with the GAF community. Looking forward to getting more details from the engineers!

Err, what was he wrong about?

 

The 218 vs 204 thing? I actually think 204 is correct. And if it was wrong, it was wrong to the negative of Xbox, which is the opposite of the hundreds of people saying he's lying to make Xbox look better.

 

The funny thing is the guys attacking him are usually horrendously wrong and have zero tech knowledge!



Captain_Tom said:

Exactly.  He can claim that 50% more cores doesn't net you 50% more power, but he is ignoring the fact that the PS4 also has 50% more ROPs/TMUs/etc.  As a matter of fact, the 7970 has double the cores of the 7850, and guess what?  It performs twice as well!

Then add in the fact that the PS4 has WAY more bandwidth and hUMA, and it is easy to see how it will perform twice as well, like some developers have directly suggested.  Get your heads out of the clouds, people...
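The raw-shader comparison both sides keep circling reduces to the usual GCN peak-FLOPS formula (CUs × 64 shaders × 2 ops/clock × clock). The figures below are the widely reported specs at the time, not official statements:

```python
def gcn_tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    """Peak single-precision throughput for a GCN-style GPU, in TFLOPS."""
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000.0

ps4 = gcn_tflops(cus=18, clock_ghz=0.800)   # ~1.84 TFLOPS
xb1 = gcn_tflops(cus=12, clock_ghz=0.853)   # ~1.31 TFLOPS (after upclock)
print(f"PS4 {ps4:.2f} TFLOPS vs Xbox One {xb1:.2f} TFLOPS "
      f"(+{(ps4 / xb1 - 1) * 100:.0f}% for PS4)")
```

On paper that is roughly a 40% shader advantage for the PS4, not 2x; whether extra ROPs, TMUs, and bandwidth stretch that further in real games is exactly what the thread is arguing about.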


It has almost 50%+ more of everything, except for a few things like the geometry engines, which are going to be a big part of next generation - everything will have depth, and hopefully no more flat, blurry ground.

The bandwidth advantage of the PS4 isn't as big as you think, either; the Xbox One has lower bandwidth requirements to begin with due to the slower GPU, and the eSRAM will give it that little extra boost.
Of course, ideally Microsoft should have gone with GDDR5, but it probably decided against it due to immediate costs (and possibly CPU performance, given the roughly 20% added latency?). GDDR5 doesn't enjoy the economies of scale that DDR3 does, and it also requires a more complex memory controller, which costs transistors; the transistor budget that could have been spent on the memory controller and GPU was pretty much all thrown at the eSRAM, and then some.

On the flip side, once low-end GPUs and IGPs start using GDDR5, it's going to be good news for Sony, because GDDR5 will get cheaper. High-end cards don't really sell much in terms of volume, so their shift to GDDR6 won't impact prices much.
Whereas DDR3 is going to get more costly from here on out - DDR3 prices have already increased over the past year, and that cost should jump for Microsoft as the PC shifts its focus to DDR4 production.
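For reference, the headline bandwidth numbers in this argument come straight from effective transfer rate × bus width, using the commonly cited memory configurations for each console:

```python
def bandwidth_gbs(mt_per_s, bus_bits):
    """Peak bandwidth = effective transfers/s x bus width in bytes."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

xb1_ddr3  = bandwidth_gbs(2133, 256)  # DDR3-2133, 256-bit -> ~68.3 GB/s
ps4_gddr5 = bandwidth_gbs(5500, 256)  # GDDR5 5.5 GT/s, 256-bit -> 176 GB/s
print(f"Xbox One DDR3: {xb1_ddr3:.1f} GB/s, PS4 GDDR5: {ps4_gddr5:.1f} GB/s")
```

The Xbox One's eSRAM peak is a separate figure for a separate 32 MB pool, which is why simply adding it to the DDR3 number is the move the earlier posts object to.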



--::{PC Gaming Master Race}::--

fallen said:
Shinobi-san said:

Can't believe guys in this thread were backing up some of Penello's false statements. Even after neutral guys on our site and GAF have said that some of the stuff he's saying is just wrong. Lost respect for a lot of posters here on the Chartz in this thread... at least on a technical level.

I'm glad the guy admitted he was wrong... seems like he legitimately cares about engaging with the GAF community. Looking forward to getting more details from the engineers!

Err, what was he wrong about?

 

The 218 vs 204 thing? I actually think 204 is correct. And if it was wrong, it was wrong to the negative of Xbox, which is the opposite of the hundreds of people saying he's lying to make Xbox look better.

 

The funny thing is the guys attacking him are usually horrendously wrong and have zero tech knowledge!

 

My biggest issue was the way in which he adds up the bandwidth of the system, which is just horribly incorrect.

Another incorrect statement he made was about having more CUs being inefficient. Which is, again, crazy wrong.

And lastly, he makes a 6% GPU clock speed increase seem like something... more? How a 6% GPU clock speed increase is anything other than exactly that is beyond me.

Honestly, these 3 things are so incorrect that I don't even...

And I didn't need GAF/VGChartz/other articles to tell me this. Any PC enthusiast is familiar with these things. None of this stuff is overly complicated. It's not like we're talking about undisclosed co-processors, etc. We're speaking about generic computing performance terms here...

I don't agree with people attacking the guy personally, but the statements he made are just off. The 204/218 thing is minor; I didn't really mind it. But the other things he really needs to back up. That's why he hasn't responded to most of the follow-up questions in the thread, even the decent replies with no insulting, etc. He very clearly states that he just gave out the information he got from the engineers. That's why he's sending all the legit questions off to get them answered. I like the guy - he seems genuine - but that doesn't make any of the incorrect statements he made any better.

And what I'm confused about, in this particular thread, is so-called "techies" defending what is clearly wrong info... never mind the correct stuff. Look at this whole thing objectively and I'm sure you will agree.


