
Forums - Microsoft - Is “GPU Acceleration” Xbox One Secret Sauce?

sales2099 said:
Eddie_Raja said:

I'm sure PS4 will sell more in geographical turf locations like mainland Europe and Japan, but NA and the UK are completely up for grabs. You're acting like PS4 is going to completely smash X1 in sales, and that the system has no redeemable qualities. Grow up.

Eating crow on the internet... yeah, pretty sure any reasonable person wouldn't care about being proved wrong in an arbitrary video game discussion and let that affect their self-esteem in the real world. Sigh.

?? Show me where I said any of that? You are way too sensitive. X1 will be successful, but the gap will be much bigger this gen than last. Sony beat MS at their worst last gen. Sony has a good chance to even win, or at least be on par, in the US.....


Just give up.  Sales2099 will die still defending MS even if they don't exist anymore...

These comments are meaningless and add nothing to the proceedings. I defend Xbox more so than the corporation itself, basically because the console now looks in great shape after all the reversals, and its current hate is unjustified.

Please don't confuse me with Sega fans.

Thread after thread I have seen you jump in and defend Xbox like it's your child.  Sometimes it gets to the point that you're the only one left defending them on things no unbiased person would.

I mean, I remember that you said you were happy MS spends hundreds of millions of dollars on keeping content off PlayStation and gimping their versions of multiplats instead of investing in new IPs and content.  That means you care more about a corporation's profit than you do about gaming, and it is just pathetic and so sickening to watch...



Prediction for console Lifetime sales:

Wii:100-120 million, PS3:80-110 million, 360:70-100 million

[Prediction Made 11/5/2009]

3DS: 65m, PSV: 22m, Wii U: 18-22m, PS4: 80-120m, X1: 35-55m

I guarantee the PS5 comes out only 5-6 years after the launch of the PS4.

[Prediction Made 6/18/2014]


Bringing this thread back on topic...

Going to drop a bombshell.

Not all GPUs are the same.

While the naming of DX11.2 is a bit of a head-scratcher (a GPU doesn't actually need Feature Level 11_1 capabilities, so GPUs that don't even have full 11_1 support can claim to be DX11.2 GPUs), note that probably the primary 3D feature of interest is indeed tiled resources, which comes in two tiers. I just want to draw the technically inclined to a nuance in the recent statement concerning support:

The Radeon™ HD 7000 series hardware architecture is fully DirectX 11.2-capable when used with a driver that enables this feature. AMD is planning to enable DirectX 11.2 with a driver update in the Windows® 8.1 launch timeframe in October, when DirectX® 11.2 ships. Today, AMD is the only GPU manufacturer to offer fully-compatible DirectX 11.1 support, and the only manufacturer to support Tiled Resources Tier-2 within a shipping product stack.

[SOURCE]

Only the Xbox One's GPU is GCN 1.1. The "Tahiti"/"Pitcairn" line of AMD GPUs is first-generation GCN (1.0), which is what the PS4's GPU is based on. The Xbox One's GPU is based on the "Bonaire" GPUs.  This is a simplification of the explanation.  There are differences between hardware DirectX support levels and API support levels.  For example, when you read on Wikipedia that an AMD or Nvidia card supports DirectX 11.1, it means it supports the DirectX 11.1 API; it may not be at the DirectX 11_1 feature-set level.

[SOURCE]

Why does this matter? It's only DirectX, right?

No.  GCN 1.0 GPUs only support Tier 1 tiled resources. GCN 1.1 GPUs support Tier 2 tiled resources.

What the hell are Tier 1 and Tier 2?  Well, the differences are these:

  • TIER2 supports MIN and MAX texture sampling modes that return the min or max of 4 neighboring texels. In the sample they use this when sampling a residency texture that tells the shader the highest-resolution mip level that can be used when sampling a particular tile. For TIER1 they emulate it with a Gather.
  • TIER1 doesn't support sampling from unmapped tiles, so you have to either avoid it in your shader or map all unloaded tiles to dummy tile data (the sample does the latter)
  • TIER1 doesn't support packed mips for texture arrays
  • TIER2 supports a new version of Texture2D.Sample that lets you clamp the mip level to a certain value. They use this to force the shader to sample from lower-resolution mip levels if the higher-resolution mip isn't currently resident in memory. For TIER1 they emulate this by computing what mip level would normally be used, comparing it with the mip level available in memory, and then falling back to SampleLevel if the mip level needs to be clamped. There's also another overload for Sample that returns a status variable that you can pass to a new "CheckAccessFullyMapped" intrinsic that tells you if the sample operation would access unmapped tiles. The docs don't say that these functions are restricted to TIER2, but I would assume that to be the case.

[SOURCE]

Basically, on Tier 1 hardware this means that a method similar to the one used for software virtual texture implementations needs to be used to determine page faults. But that's pretty much the most efficient way to do it, so the missing CheckAccessFullyMapped shouldn't hurt performance at all. The missing min/max filtering and the missing LOD clamp for the sampler mean that Tier 1 hardware needs quite a few extra ALU instructions in the pixel shader. It shouldn't be a big deal for basic use scenarios, but when combined with per-pixel displacement techniques (POM/QDM/etc.) the extra ALU cost will start to hurt. And obviously, if you use tiled resources for GPU SVO rendering, the extra ALU cost might hurt Tier 1 hardware even more.

Yes, the PS4 will have capabilities similar to the Xbox One's; the difference is that the Xbox One's GPU features full hardware support (Tier 2), whereas the PS4's is a combination of software and hardware.

[SOURCE]
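The min/max filtering difference mentioned above can also be sketched in scalar C++ (again hypothetical, not real shader code). A Tier 2 MIN sample returns the minimum of the 4 texels a bilinear fetch would touch; Tier 1 has no such mode, so the shader gathers all four residency values and reduces them itself:

```cpp
#include <algorithm>
#include <array>

// Tier 2: the sampler hardware performs this 4-way reduction for free
// as part of the MIN sampling mode.
int min_sample_tier2(const std::array<int, 4>& texels) {
    return *std::min_element(texels.begin(), texels.end());
}

// Tier 1: emulate via a Gather followed by an explicit min chain,
// paying the reduction as extra ALU instructions in the shader.
int min_sample_tier1_emulated(const std::array<int, 4>& gathered) {
    return std::min(std::min(gathered[0], gathered[1]),
                    std::min(gathered[2], gathered[3]));
}
```

Both paths produce the same value; the Tier 1 path just spends shader instructions on it, which is why the cost adds up under per-pixel displacement techniques.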

Bullshit, right?  Sony announced that the PS4 was DirectX 11.2+, and Microsoft announced that the Xbox One was DirectX 11.1+.  Microsoft also announced that Windows 8.1 and the Xbox One were the only systems that supported DX11.2.  Sony can claim DX11.2+ support because they can potentially build into their GPU any features they want to support.  Not only this, but even AMD was caught off guard when Microsoft released the full DirectX 11.2 feature set, which is why a driver update is necessary to support those features, not simply a DirectX update.

[SOURCE]  Beyond 3D

Note:  Dave Baumann is an employee at AMD.  The thread linked in this post was originally locked by the moderators.  They only reopened the thread to allow Dave to post.



Eddie_Raja said:
sales2099 said:
Eddie_Raja said:

I'm sure PS4 will sell more in geographical turf locations like mainland Europe and Japan, but NA and the UK are completely up for grabs. You're acting like PS4 is going to completely smash X1 in sales, and that the system has no redeemable qualities. Grow up.

Eating crow on the internet... yeah, pretty sure any reasonable person wouldn't care about being proved wrong in an arbitrary video game discussion and let that affect their self-esteem in the real world. Sigh.

?? Show me where I said any of that? You are way too sensitive. X1 will be successful, but the gap will be much bigger this gen than last. Sony beat MS at their worst last gen. Sony has a good chance to even win, or at least be on par, in the US.....


Just give up.  Sales2099 will die still defending MS even if they don't exist anymore...

These comments are meaningless and add nothing to the proceedings. I defend Xbox more so than the corporation itself, basically because the console now looks in great shape after all the reversals, and its current hate is unjustified.

Please don't confuse me with Sega fans.

Thread after thread I have seen you jump in and defend Xbox like it's your child.  Sometimes it gets to the point that you're the only one left defending them on things no unbiased person would.

I mean, I remember that you said you were happy MS spends hundreds of millions of dollars on keeping content off PlayStation and gimping their versions of multiplats instead of investing in new IPs and content.  That means you care more about a corporation's profit than you do about gaming, and it is just pathetic and so sickening to watch...

Thread after thread I see PS fans defend and brag about the PS4 like it's their own child that got accepted into the honours program at elementary school... but hey, nothing wrong with that in your book, right? There is nothing wrong with being biased... it's totally natural to have a preference in many things in life.

"That means you care more about a corporations profit than you do about gaming, and it is just pathetic and yet so sickening to watch..."

You REALLY need to get off the internet, or at the very least look at global news. Because if you think what I say about video games is pathetic and sickening to watch... then you are a very sheltered individual, and I hope you adequately adjust your definition of those two words.

You don't want to get personal and judge me. I'd very much like to stack myself up against you in the real world and see who's "better" in the eyes of society :). So, to repeat my original quote, either add something to the proceedings, or be quiet and move on.



Xbox: Best hardware, Game Pass best value, best BC, more 1st party genres and multiplayer titles. 

 

Adinnieken said:

Bringing this thread back on topic...

Going to drop a bombshell.

Not all GPUs are the same.

While the naming of DX11.2 is a bit of a head-scratcher (a GPU doesn't actually need Feature Level 11_1 capabilities, so GPUs that don't even have full 11_1 support can claim to be DX11.2 GPUs), note that probably the primary 3D feature of interest is indeed tiled resources, which comes in two tiers. I just want to draw the technically inclined to a nuance in the recent statement concerning support:

The Radeon™ HD 7000 series hardware architecture is fully DirectX 11.2-capable when used with a driver that enables this feature. AMD is planning to enable DirectX 11.2 with a driver update in the Windows® 8.1 launch timeframe in October, when DirectX® 11.2 ships. Today, AMD is the only GPU manufacturer to offer fully-compatible DirectX 11.1 support, and the only manufacturer to support Tiled Resources Tier-2 within a shipping product stack.

[SOURCE]

Only the Xbox One's GPU is GCN 1.1. The "Tahiti"/"Pitcairn" line of AMD GPUs is first-generation GCN (1.0), which is what the PS4's GPU is based on. The Xbox One's GPU is based on the "Bonaire" GPUs.  This is a simplification of the explanation.  There are differences between hardware DirectX support levels and API support levels.  For example, when you read on Wikipedia that an AMD or Nvidia card supports DirectX 11.1, it means it supports the DirectX 11.1 API; it may not be at the DirectX 11_1 feature-set level.

[SOURCE]

Why does this matter? It's only DirectX, right?

No.  GCN 1.0 GPUs only support Tier 1 tiled resources. GCN 1.1 GPUs support Tier 2 tiled resources.

What the hell are Tier 1 and Tier 2?  Well, the differences are these:

  • TIER2 supports MIN and MAX texture sampling modes that return the min or max of 4 neighboring texels. In the sample they use this when sampling a residency texture that tells the shader the highest-resolution mip level that can be used when sampling a particular tile. For TIER1 they emulate it with a Gather.
  • TIER1 doesn't support sampling from unmapped tiles, so you have to either avoid it in your shader or map all unloaded tiles to dummy tile data (the sample does the latter)
  • TIER1 doesn't support packed mips for texture arrays
  • TIER2 supports a new version of Texture2D.Sample that lets you clamp the mip level to a certain value. They use this to force the shader to sample from lower-resolution mip levels if the higher-resolution mip isn't currently resident in memory. For TIER1 they emulate this by computing what mip level would normally be used, comparing it with the mip level available in memory, and then falling back to SampleLevel if the mip level needs to be clamped. There's also another overload for Sample that returns a status variable that you can pass to a new "CheckAccessFullyMapped" intrinsic that tells you if the sample operation would access unmapped tiles. The docs don't say that these functions are restricted to TIER2, but I would assume that to be the case.

[SOURCE]

Basically, on Tier 1 hardware this means that a method similar to the one used for software virtual texture implementations needs to be used to determine page faults. But that's pretty much the most efficient way to do it, so the missing CheckAccessFullyMapped shouldn't hurt performance at all. The missing min/max filtering and the missing LOD clamp for the sampler mean that Tier 1 hardware needs quite a few extra ALU instructions in the pixel shader. It shouldn't be a big deal for basic use scenarios, but when combined with per-pixel displacement techniques (POM/QDM/etc.) the extra ALU cost will start to hurt. And obviously, if you use tiled resources for GPU SVO rendering, the extra ALU cost might hurt Tier 1 hardware even more.

Yes, the PS4 will have capabilities similar to the Xbox One's; the difference is that the Xbox One's GPU features full hardware support (Tier 2), whereas the PS4's is a combination of software and hardware.

[SOURCE]

Bullshit, right?  Sony announced that the PS4 was DirectX 11.2+, and Microsoft announced that the Xbox One was DirectX 11.1+.  Microsoft also announced that Windows 8.1 and the Xbox One were the only systems that supported DX11.2.  Sony can claim DX11.2+ support because they can potentially build into their GPU any features they want to support.  Not only this, but even AMD was caught off guard when Microsoft released the full DirectX 11.2 feature set, which is why a driver update is necessary to support those features, not simply a DirectX update.

[SOURCE]  Beyond 3D

Note:  Dave Baumann is an employee at AMD.  The thread linked in this post was originally locked by the moderators.  They only reopened the thread to allow Dave to post.

Once again your ignorance on the topic of hardware has taken over. Dude, go back to school and learn some actual coding. You realize that the word "tier" in this case refers to a feature that is part of the API, not levels of hardware support? Do you even know how PRTs/tiled resources work?

Oh, and by the way, the update for the hardware was to give it API compatibility, not to add support for those features. I'm surprised that you don't even know how an API works.



sales2099 said:
Eddie_Raja said:
sales2099 said:
Eddie_Raja said:

I'm sure PS4 will sell more in geographical turf locations like mainland Europe and Japan, but NA and the UK are completely up for grabs. You're acting like PS4 is going to completely smash X1 in sales, and that the system has no redeemable qualities. Grow up.

Eating crow on the internet... yeah, pretty sure any reasonable person wouldn't care about being proved wrong in an arbitrary video game discussion and let that affect their self-esteem in the real world. Sigh.

?? Show me where I said any of that? You are way too sensitive. X1 will be successful, but the gap will be much bigger this gen than last. Sony beat MS at their worst last gen. Sony has a good chance to even win, or at least be on par, in the US.....


Just give up.  Sales2099 will die still defending MS even if they don't exist anymore...

These comments are meaningless and add nothing to the proceedings. I defend Xbox more so than the corporation itself, basically because the console now looks in great shape after all the reversals, and its current hate is unjustified.

Please don't confuse me with Sega fans.

Thread after thread I have seen you jump in and defend Xbox like it's your child.  Sometimes it gets to the point that you're the only one left defending them on things no unbiased person would.

I mean, I remember that you said you were happy MS spends hundreds of millions of dollars on keeping content off PlayStation and gimping their versions of multiplats instead of investing in new IPs and content.  That means you care more about a corporation's profit than you do about gaming, and it is just pathetic and so sickening to watch...

Thread after thread I see PS fans defend and brag about the PS4 like it's their own child that got accepted into the honours program at elementary school... but hey, nothing wrong with that in your book, right? There is nothing wrong with being biased... it's totally natural to have a preference in many things in life.

"That means you care more about a corporations profit than you do about gaming, and it is just pathetic and yet so sickening to watch..."

You REALLY need to get off the internet, or at the very least look at global news. Because if you think what I say about video games is pathetic and sickening to watch... then you are a very sheltered individual, and I hope you adequately adjust your definition of those two words.

You don't want to get personal and judge me. I'd very much like to stack myself up against you in the real world and see who's "better" in the eyes of society :). So, to repeat my original quote, either add something to the proceedings, or be quiet and move on.

I never said you are a terrible person or that you are the worst person I have ever seen. Very far from it.

Yes, we do all have our preferences, and indeed there is nothing wrong with that.  I do love games.  However, I want others to enjoy as many great games as possible.  As such, timed exclusives and timed content make no sense to me and only make gaming worse.

I can accept exclusives because they only exist due to a company making them exist.  For instance, Heavy Rain is an amazing game that everyone should play.  It is unfortunate only PS3 owners can play it, but only Sony was willing to take a gamble on financing such a unique game (Quantic Dream went to MS first, and they declined).

However, paying to make other games have less content, or wait longer for it, benefits no one, and only adds net unhappiness to the world.  That is what MS would rather spend their money on than making more cool games for people to play.  If (I never said any names) you support that, you have issues...





Anyway, to put this thread back on topic, and to end the discussion as to why MS is not so dumb as to use the GPU in this way (and also why the article is a pile of dumbass shit), all in more layman's terms: GPU-accelerated video encoding is a terrible idea because it takes power away from games, as it uses the programmable pipelines (GPU power, in layman's terms) to run encoding code through whatever API does the job. Intel's way of doing it is much better, and that's most likely what the X1 and PS4 are both doing, which would be either a) putting a fixed-function pipeline right in a little corner of the GPU (low power consumption; its sole purpose would be to encode the video output for live streaming, or to save it to the HDD at a small, reasonable file size after compression for uploading to services like YouTube), or b) using a separate chip on the side for the same purpose. Using the actual GPU rendering pipelines for video encoding in this case would be absolutely idiotic (I'd rather they use them for good physics if nothing else; they'd be wasting the idea behind GCN otherwise), and I don't want to believe that MS is that stupid, nor do I think they are.

So that article is shit, based on good logic, but I've seen some pretty bad logic from corporations as well, so you never know......



dahuman said:

Anyway, to put this thread back on topic, and to end the discussion as to why MS is not so dumb as to use the GPU in this way (and also why the article is a pile of dumbass shit), all in more layman's terms: GPU-accelerated video encoding is a terrible idea because it takes power away from games, as it uses the programmable pipelines (GPU power, in layman's terms) to run encoding code through whatever API does the job. Intel's way of doing it is much better, and that's most likely what the X1 and PS4 are both doing, which would be either a) putting a fixed-function pipeline right in a little corner of the GPU (low power consumption; its sole purpose would be to encode the video output for live streaming, or to save it to the HDD at a small, reasonable file size after compression for uploading to services like YouTube), or b) using a separate chip on the side for the same purpose. Using the actual GPU rendering pipelines for video encoding in this case would be absolutely idiotic (I'd rather they use them for good physics if nothing else; they'd be wasting the idea behind GCN otherwise), and I don't want to believe that MS is that stupid, nor do I think they are.

So that article is shit, based on good logic, but I've seen some pretty bad logic from corporations as well, so you never know......

Good read and all, but from what I've heard, hasn't AMD always used fixed logic for video encoding via its UVD and VCE blocks?

GCN was a good case for physics and all, but I'd be willing to bet that it was more meant for tiled forward-plus rendering, as shown in that AMD Leo demo they showcased.



Eddie_Raja said:

I'm sure PS4 will sell more in geographical turf locations like mainland Europe and Japan, but NA and the UK are completely up for grabs. You're acting like PS4 is going to completely smash X1 in sales, and that the system has no redeemable qualities. Grow up.

Eating crow on the internet... yeah, pretty sure any reasonable person wouldn't care about being proved wrong in an arbitrary video game discussion and let that affect their self-esteem in the real world. Sigh.

?? Show me where I said any of that? You are way too sensitive. X1 will be successful, but the gap will be much bigger this gen than last. Sony beat MS at their worst last gen. Sony has a good chance to even win, or at least be on par, in the US.....


Just give up.  Sales2099 will die still defending MS even if they don't exist anymore...


You can say Sony beat MS because of the RROD, so don't act like the most well-known hardware failure was not an issue. That is most likely the only reason Sony caught up.



DJEVOLVE said:
Eddie_Raja said:

I'm sure PS4 will sell more in geographical turf locations like mainland Europe and Japan, but NA and the UK are completely up for grabs. You're acting like PS4 is going to completely smash X1 in sales, and that the system has no redeemable qualities. Grow up.

Eating crow on the internet... yeah, pretty sure any reasonable person wouldn't care about being proved wrong in an arbitrary video game discussion and let that affect their self-esteem in the real world. Sigh.

?? Show me where I said any of that? You are way too sensitive. X1 will be successful, but the gap will be much bigger this gen than last. Sony beat MS at their worst last gen. Sony has a good chance to even win, or at least be on par, in the US.....


Just give up.  Sales2099 will die still defending MS even if they don't exist anymore...


You can say Sony beat MS because of the RROD, so don't act like the most well-known hardware failure was not an issue. That is most likely the only reason Sony caught up.

Similarly, one could play the game you're playing and say the only reason the 360 got such a head start was that it released earlier and the PS3 launched at a much higher price... ¬_¬



fatslob-:O said:
dahuman said:

Anyway, to put this thread back on topic, and to end the discussion as to why MS is not so dumb as to use the GPU in this way (and also why the article is a pile of dumbass shit), all in more layman's terms: GPU-accelerated video encoding is a terrible idea because it takes power away from games, as it uses the programmable pipelines (GPU power, in layman's terms) to run encoding code through whatever API does the job. Intel's way of doing it is much better, and that's most likely what the X1 and PS4 are both doing, which would be either a) putting a fixed-function pipeline right in a little corner of the GPU (low power consumption; its sole purpose would be to encode the video output for live streaming, or to save it to the HDD at a small, reasonable file size after compression for uploading to services like YouTube), or b) using a separate chip on the side for the same purpose. Using the actual GPU rendering pipelines for video encoding in this case would be absolutely idiotic (I'd rather they use them for good physics if nothing else; they'd be wasting the idea behind GCN otherwise), and I don't want to believe that MS is that stupid, nor do I think they are.

So that article is shit, based on good logic, but I've seen some pretty bad logic from corporations as well, so you never know......

Good read and all, but from what I've heard, hasn't AMD always used fixed logic for video encoding via its UVD and VCE blocks?

GCN was a good case for physics and all, but I'd be willing to bet that it was more meant for tiled forward-plus rendering, as shown in that AMD Leo demo they showcased.


UVD is a decoder that works in conjunction with the GPU pipelines when doing post-processing work, and it's the reason I used Vista even though Vista was a terrible OS on so many fronts (I needed that EVR; CPUs used to be terrible at decoding HD video. These days, though, who gives a shit, lol; CPU decoding is just fine and often has better quality due to software). The encoding side of Avivo is also based on the stream processors, since it's done via all that parallel processing, but it's really limited by the hardware implementation (the reason GCN was introduced) for OpenCL. The point is that they both eat GPU power in some way, whereas Intel's way puts no load on the GPU or CPU if you use the Quick Sync block embedded in their iGPU for encoding. Think in terms of the Wii U transferring images to the Wii U GamePad: it has a dedicated encoder whose output goes over 5 GHz wireless right to the GamePad to get decoded. In the case of the X1 and PS4, it would be used for live streaming or YouTube, and the quality will suck unless they put in a chip that can do crazy quality at good bitrates.

PS: I'm not asking them to do crazy-ass physics; even dedicating a little bit to it can do wonders. If we are working with consoles, we can tax the hardware in different ways (GPU+CPU in conjunction; for calculation purposes, the bandwidth requirements are really not super high) since the resources won't change, whereas a PC can just brute-force everything.