
Forums - Microsoft Discussion - Is the Xbox 360 a Success?

 

Well, answer the damn question.

Yes it is. 333 (77.26%)

No it isn't. 68 (15.78%)

I love it when you recycle, d21! 30 (6.96%)

Total: 431

You have hardware and software. To make the hardware easier for developers to use, there is a layer that sits between the games and the hardware; it takes instructions from the game and turns them into video and sound.

This is the API: a set of instructions that can be used to draw what you see on your screen or produce what you hear through your speakers. It effectively means a developer doesn't have to tell the console hardware how to draw each pixel; in very simple terms, they can send it an instruction to draw a triangle at a certain position and tell it what to fill it with. The same goes for far more complex graphical effects. DirectX is one example of an API; OpenGL is another. As DirectX has been in use in Windows since 1995, it has by and large driven games development.

I hope that makes some sense. It doesn't cover all the nuances, but it gives you an idea of what people mean when they talk about APIs.
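To make the "draw a triangle at a certain position" idea concrete, here is a minimal sketch in C++ using legacy OpenGL with the GLUT toolkit (GLUT is my assumption; any windowing library would do). The application only issues high-level drawing instructions; the API and the driver do the per-pixel work on the hardware.

```cpp
// Minimal sketch: drawing a triangle through an API (legacy OpenGL + GLUT).
// The program never touches the GPU directly; it hands the driver a
// high-level "draw a triangle" instruction and the API does the rest.
// Assumes a freeglut installation; illustrative, not code from this thread.
#include <GL/glut.h>

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);            // "draw a triangle at these positions..."
    glColor3f(1.0f, 0.0f, 0.0f);      // "...and fill it with these colours"
    glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f);
    glVertex2f(0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f);
    glVertex2f(0.0f, 0.5f);
    glEnd();
    glFlush();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("API example");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```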




Of course it's a success, but I will never buy a Microsoft product no matter how good it is.



They did write off a huge amount of money due to RROD; I must admit I'm not convinced it's financially successful. I'm not a Wii fan, but to my eyes only Nintendo have made what I would call a financially successful console. They sold a low-cost console at a huge profit and also made a ton of money from software, and they put little investment into any sort of infrastructure like Xbox Live, so their costs were low. The Wii has run out of steam now, though.

Personally, I don't care about financial success. I'm happy to take any subsidies to my gaming habit from any company. I loved the original Xbox: truly fantastic hardware for its time, and the source of some of my best gaming memories thanks to some brilliant games, whereas the Wii has done next to nothing for me, with very few games worth playing. The Dreamcast was incredible but clearly a failed console.

For the next generation I'm happy to see all three companies fighting for market share and cutting their profits so I can play games on the cheap.



Pemalite said:
mjk45 said:

The truth is you are treating a software choice as if it were the hardware itself.


You're missing my point entirely and are now putting words in my mouth.

Developers want to make a game as cheaply and quickly as possible; designing a game directly for the plethora of hardware out there is time-consuming and expensive.
The solution? An API that sits between the hardware and the software with a common feature set. This is where Microsoft enters the arena.
Microsoft works with graphics companies to dictate that feature set on the PC, and even invents new technologies to go into it.

Console manufacturers pick up the PC GPUs that adhere to that feature set and whack them in their consoles, so Microsoft has a fairly large hand in dictating the graphical capabilities a console has.
That's why the GPU in the PS3 is a DirectX 9, Shader Model 3 specced PC part: because of Microsoft, DirectX, and nVidia.

nVidia's first GPU, for instance, didn't use polygon-based rasterisation (it rendered quadratic surfaces instead), and hardly any games supported it. nVidia's second GPU adhered to DirectX's polygon-based rasterisation, and so could run any game that adhered to the DirectX standard.

How you can flat-out deny that Microsoft has had any part in the development of GPU technology, I will never know. They have been doing this for almost two decades now.
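As a concrete illustration of that common feature set (a minimal sketch of my own, not code from anyone in this thread): on Windows, a game can ask Direct3D 9 which shader models the installed GPU exposes. A GeForce 7-class part, the family the RSX is derived from, reports Shader Model 3.0 here.

```cpp
// Minimal sketch: querying the Direct3D 9 feature set the GPU exposes.
// Windows-only; link against d3d9.lib. Illustrative, not from this thread.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // Shader Model support is part of the common feature set Microsoft
    // defines; a GeForce 7-class chip reports 3.0 for both stages.
    std::printf("Pixel shader %lu.%lu, vertex shader %lu.%lu\n",
                D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion),
                D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));

    d3d->Release();
    return 0;
}
```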

Where did I put words in your mouth? I stated that you are treating software as if it were the hardware, and that's my opinion; it's not putting words in your mouth. Yes, what you said would hold some water if they all used DirectX, but I would be arguing the same if the roles were reversed. It doesn't come down to MS's part in the evolution of DX and, through its dominance, its use in GPU tech in general, but to the assumption that Sony is beholden to it just because they use a custom card based off one with DX capabilities. It's a chain with so many links that you can find an argument that just about every player who made an impact on the industry is beholden to another, whether through pioneering work or through taking something someone else did and making it popular. Look, I'm not trying to downgrade MS or DX; both are great. It's just that I think it is a long bow to draw. So, as happens in most disagreements when both sides' minds are set in concrete, we will have to agree to disagree. Still, it was a nice argument that I enjoyed. Cheers.



Research shows video games help make you smarter, so why am I an idiot?

mjk45 said:

Where did I put words in your mouth? I stated that you are treating software as if it were the hardware, and that's my opinion; it's not putting words in your mouth. Yes, what you said would hold some water if they all used DirectX, but I would be arguing the same if the roles were reversed. It doesn't come down to MS's part in the evolution of DX and, through its dominance, its use in GPU tech in general, but to the assumption that Sony is beholden to it just because they use a custom card based off one with DX capabilities. It's a chain with so many links that you can find an argument that just about every player who made an impact on the industry is beholden to another, whether through pioneering work or through taking something someone else did and making it popular. Look, I'm not trying to downgrade MS or DX; both are great. It's just that I think it is a long bow to draw. So, as happens in most disagreements when both sides' minds are set in concrete, we will have to agree to disagree. Still, it was a nice argument that I enjoyed. Cheers.

I'm not treating software as hardware. What I am saying, and what you seem to keep missing, is that the software is influencing the hardware.

The other point is: the PS3 doesn't have to use DirectX. All of the GPU's features are exposed through OpenGL (an open-standard API), but the feature set and design of the GPU were built with DirectX as the priority, because that is what the majority of games on Windows, the platform the GPU was designed for, use.

I think you are making the mistake of thinking the PS3's GPU is unique and only found in the PS3, when in fact it's a GeForce 7 chip designed for the PC, Microsoft Windows, and thus DirectX. It's not a custom-designed GPU at all; the Xbox 360 is more "custom" than the PS3, in that its GPU is based on a hybrid design of the Radeon X18xx series and the Radeon HD 2000 series.

Let's go back a bit farther, to the era of DirectX 7 and hardware T&L.
Back then, ATI was pushing Microsoft for programmable pixel shaders; the pixel shaders in the Radeon 7000 series were actually rather flexible and powerful. nVidia, however, stuck to the DirectX 7 specification with the GeForce 2.
Microsoft ended up inventing HLSL (High Level Shader Language) and Shader Model 1. This was incompatible with ATI's implementation, it was the implementation nVidia ended up picking up in the GeForce 3, and ATI jumped ahead a little with the Radeon 8500.
So Microsoft helped influence the pixel shading in today's graphics chips, which even the PS3 uses.
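For a flavour of what HLSL looks like in practice, here is a minimal sketch (my own, assuming the Windows d3dcompiler library) that compiles a one-line HLSL pixel shader against the Shader Model 3 target, the model the GeForce 7/RSX generation was specced to.

```cpp
// Minimal sketch: compiling a trivial HLSL pixel shader for Shader Model 3.
// Windows-only; link against d3dcompiler.lib. Illustrative only.
#include <d3dcompiler.h>
#include <cstdio>

// A one-line HLSL pixel shader: paint every pixel solid red.
static const char kShader[] =
    "float4 main() : COLOR { return float4(1.0, 0.0, 0.0, 1.0); }";

int main() {
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    HRESULT hr = D3DCompile(kShader, sizeof(kShader) - 1, nullptr, nullptr,
                            nullptr, "main", "ps_3_0", 0, 0,
                            &bytecode, &errors);
    if (FAILED(hr)) {
        if (errors) std::printf("%s\n", (const char*)errors->GetBufferPointer());
        return 1;
    }
    std::printf("Compiled %zu bytes of GPU bytecode\n",
                bytecode->GetBufferSize());
    bytecode->Release();
    return 0;
}
```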

Then you have ATI's Truform technology. It originally debuted in the Radeon 8500 series, with a refinement in the Radeon 9000 series. Essentially, this was tessellation at its earliest, using N-patches to determine where to tessellate. For years ATI pushed for the inclusion of this technology in DirectX, but it still took 7+ years for it to be included, and by then DirectX's implementation was completely incompatible with ATI's.
nVidia, however, went with Quintic-RT patches, which were incompatible with both ATI's and DirectX's implementations.
DirectX changed, improved, and standardised the approaches both companies took to tessellation; the PC ultimately benefited greatly, and it will be technology the next consoles will be able to use.
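A sketch of how that standardisation looks from the programmer's side today (my illustration, not something from the post above): under Direct3D 11, a single feature-level check tells you whether the standardised tessellation stages (hull and domain shaders) are available, regardless of whether the GPU came from ATI/AMD or nVidia.

```cpp
// Minimal sketch: asking Direct3D 11 whether the standardised tessellation
// pipeline is available. Windows-only; link against d3d11.lib. Illustrative.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level = {};
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   0, nullptr, 0, D3D11_SDK_VERSION,
                                   &device, &level, nullptr);
    if (FAILED(hr)) return 1;

    // Feature level 11_0 or above guarantees the hull/domain shader stages,
    // i.e. the standardised tessellation DirectX 11 introduced.
    std::printf("Tessellation %s\n",
                level >= D3D_FEATURE_LEVEL_11_0 ? "supported" : "not supported");
    device->Release();
    return 0;
}
```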

So far you haven't really given any technical reasons why DirectX, even after decades of influencing hardware choices, played no part in the design of the DirectX 9 graphics chip in the PS3. Just the answer: "No, it hasn't."

Some reading:
http://en.wikipedia.org/wiki/High_Level_Shader_Language
http://en.wikipedia.org/wiki/Truform
http://en.wikipedia.org/wiki/Quintic



--::{PC Gaming Master Race}::--

Pemalite said:
mjk45 said:

Where did I put words in your mouth? I stated that you are treating software as if it were the hardware, and that's my opinion; it's not putting words in your mouth. Yes, what you said would hold some water if they all used DirectX, but I would be arguing the same if the roles were reversed. It doesn't come down to MS's part in the evolution of DX and, through its dominance, its use in GPU tech in general, but to the assumption that Sony is beholden to it just because they use a custom card based off one with DX capabilities. It's a chain with so many links that you can find an argument that just about every player who made an impact on the industry is beholden to another, whether through pioneering work or through taking something someone else did and making it popular. Look, I'm not trying to downgrade MS or DX; both are great. It's just that I think it is a long bow to draw. So, as happens in most disagreements when both sides' minds are set in concrete, we will have to agree to disagree. Still, it was a nice argument that I enjoyed. Cheers.

I'm not treating software as hardware. What I am saying, and what you seem to keep missing, is that the software is influencing the hardware.

The other point is: the PS3 doesn't have to use DirectX. All of the GPU's features are exposed through OpenGL (an open-standard API), but the feature set and design of the GPU were built with DirectX as the priority, because that is what the majority of games on Windows, the platform the GPU was designed for, use.

I think you are making the mistake of thinking the PS3's GPU is unique and only found in the PS3, when in fact it's a GeForce 7 chip designed for the PC, Microsoft Windows, and thus DirectX. It's not a custom-designed GPU at all; the Xbox 360 is more "custom" than the PS3, in that its GPU is based on a hybrid design of the Radeon X18xx series and the Radeon HD 2000 series.

Let's go back a bit farther, to the era of DirectX 7 and hardware T&L.
Back then, ATI was pushing Microsoft for programmable pixel shaders; the pixel shaders in the Radeon 7000 series were actually rather flexible and powerful. nVidia, however, stuck to the DirectX 7 specification with the GeForce 2.
Microsoft ended up inventing HLSL (High Level Shader Language) and Shader Model 1. This was incompatible with ATI's implementation, it was the implementation nVidia ended up picking up in the GeForce 3, and ATI jumped ahead a little with the Radeon 8500.
So Microsoft helped influence the pixel shading in today's graphics chips, which even the PS3 uses.

Then you have ATI's Truform technology. It originally debuted in the Radeon 8500 series, with a refinement in the Radeon 9000 series. Essentially, this was tessellation at its earliest, using N-patches to determine where to tessellate. For years ATI pushed for the inclusion of this technology in DirectX, but it still took 7+ years for it to be included, and by then DirectX's implementation was completely incompatible with ATI's.
nVidia, however, went with Quintic-RT patches, which were incompatible with both ATI's and DirectX's implementations.
DirectX changed, improved, and standardised the approaches both companies took to tessellation; the PC ultimately benefited greatly, and it will be technology the next consoles will be able to use.

So far you haven't really given any technical reasons why DirectX, even after decades of influencing hardware choices, played no part in the design of the DirectX 9 graphics chip in the PS3. Just the answer: "No, it hasn't."

Some reading:
http://en.wikipedia.org/wiki/High_Level_Shader_Language
http://en.wikipedia.org/wiki/Truform
http://en.wikipedia.org/wiki/Quintic

Look, and this is the last time I reply, since I said at the end of my last post that we will have to agree to disagree: if you read me correctly, I said it was a custom card based off one with DX capabilities, in this case a 7 series. And yes, it is custom, since they didn't put in a standard card. It is simply the matter that OpenGL works fine and does everything needed of it, and that is why it doesn't matter that the card was designed to work with the latest DX; it would only matter if DX had influenced the hardware to the point that it was the only viable option left. So to me it comes down not to DX's effect on GPU development, or to the technical merits of whatever API, but to whether that development had such an effect that Sony is beholden to it, and to me, unless they use it, my answer is no.



Research shows video games help make you smarter, so why am I an idiot?

TruckOSaurus said:
How the hell did the No option get 50 votes? Anyone considering the Xbox 360 a failure has serious mental issues.


Reported for insulting my brain.

mjk45 said:

Look, and this is the last time I reply, since I said at the end of my last post that we will have to agree to disagree: if you read me correctly, I said it was a custom card based off one with DX capabilities, in this case a 7 series. And yes, it is custom, since they didn't put in a standard card. It is simply the matter that OpenGL works fine and does everything needed of it, and that is why it doesn't matter that the card was designed to work with the latest DX; it would only matter if DX had influenced the hardware to the point that it was the only viable option left. So to me it comes down not to DX's effect on GPU development, or to the technical merits of whatever API, but to whether that development had such an effect that Sony is beholden to it, and to me, unless they use it, my answer is no.


But it's not a custom card at all. That is the point I'm making.
The RSX (the PS3's GPU) uses the G70 core, which is exactly the same as the PC version; the only differences are the memory bus (many graphics chips use different types of memory for more performance or a lower price) and the smaller fabrication process (so it runs cooler, uses less power, and is cheaper to make).

RSX: G70 core, 24 pixel shaders, 8 vertex shaders, 24 texture mapping units, 8 ROPs, core clock 500/550 MHz.
GeForce 7800 GTX: G70 core, 24 pixel shaders, 8 vertex shaders, 24 texture mapping units, 16 ROPs, core clock 430/550 MHz.

The only difference is that the G70 chip in the PS3 has the ROPs cut in half. ROP stands for "Raster Operations Pipeline" (the render output unit).
ROPs handle the final transition from the pixel pipeline to the display by building the pixel fragments generated by the pixel pipeline into complete pixels. The ROP unit also optimises the display image to save memory bandwidth, for example through depth compression and colour comparison, amongst other things.

I would assume Sony cut the ROPs in half to save money; it's not unusual for PC manufacturers to do the same to hit certain price points.

But regardless of what you say, the GPU in the PS3 is exactly the same as the PC part; hell, the core codename is exactly the same. How you can deny that beats me. Is this some blind-faith thing you have going on?



--::{PC Gaming Master Race}::--

Pemalite said:
mjk45 said:

Look, and this is the last time I reply, since I said at the end of my last post that we will have to agree to disagree: if you read me correctly, I said it was a custom card based off one with DX capabilities, in this case a 7 series. And yes, it is custom, since they didn't put in a standard card. It is simply the matter that OpenGL works fine and does everything needed of it, and that is why it doesn't matter that the card was designed to work with the latest DX; it would only matter if DX had influenced the hardware to the point that it was the only viable option left. So to me it comes down not to DX's effect on GPU development, or to the technical merits of whatever API, but to whether that development had such an effect that Sony is beholden to it, and to me, unless they use it, my answer is no.


But it's not a custom card at all. That is the point I'm making.
The RSX (the PS3's GPU) uses the G70 core, which is exactly the same as the PC version; the only differences are the memory bus (many graphics chips use different types of memory for more performance or a lower price) and the smaller fabrication process (so it runs cooler, uses less power, and is cheaper to make).

RSX: G70 core, 24 pixel shaders, 8 vertex shaders, 24 texture mapping units, 8 ROPs, core clock 500/550 MHz.
GeForce 7800 GTX: G70 core, 24 pixel shaders, 8 vertex shaders, 24 texture mapping units, 16 ROPs, core clock 430/550 MHz.

The only difference is that the G70 chip in the PS3 has the ROPs cut in half. ROP stands for "Raster Operations Pipeline" (the render output unit).
ROPs handle the final transition from the pixel pipeline to the display by building the pixel fragments generated by the pixel pipeline into complete pixels. The ROP unit also optimises the display image to save memory bandwidth, for example through depth compression and colour comparison, amongst other things.

I would assume Sony cut the ROPs in half to save money; it's not unusual for PC manufacturers to do the same to hit certain price points.

But regardless of what you say, the GPU in the PS3 is exactly the same as the PC part; hell, the core codename is exactly the same. How you can deny that beats me. Is this some blind-faith thing you have going on?

ROPs in this chip are tied to the number of memory chips: a 128-bit bus needs four 32-bit chips, and each ROP is tied to a 16-bit memory channel. Newer architectures decouple ROPs from channel width.

A 256-bit bus would have needed eight chips, so it's costlier, as you assumed.
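Running the numbers in that claim (a quick arithmetic sketch, not vendor data): with one ROP per 16-bit memory channel and 32-bit memory chips, the RSX's 128-bit bus comes out at exactly 8 ROPs and 4 chips, while a 7800 GTX-style 256-bit bus gives 16 ROPs and 8 chips.

```cpp
// Quick check of the ROP/memory-bus coupling described above
// (a sketch of the claim in this post, not vendor documentation).
#include <cstdio>

int main() {
    const int chip_width_bits = 32;   // width of one GDDR3 memory chip
    const int bits_per_rop    = 16;   // one ROP per 16-bit memory channel

    const int buses[] = {128, 256};   // RSX-style vs 7800 GTX-style bus
    for (int bus : buses) {
        std::printf("%3d-bit bus: %d memory chips, %2d ROPs\n",
                    bus, bus / chip_width_bits, bus / bits_per_rop);
    }
    return 0;
}
```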



Kynes said:
Pemalite said:
mjk45 said:

Look, and this is the last time I reply, since I said at the end of my last post that we will have to agree to disagree: if you read me correctly, I said it was a custom card based off one with DX capabilities, in this case a 7 series. And yes, it is custom, since they didn't put in a standard card. It is simply the matter that OpenGL works fine and does everything needed of it, and that is why it doesn't matter that the card was designed to work with the latest DX; it would only matter if DX had influenced the hardware to the point that it was the only viable option left. So to me it comes down not to DX's effect on GPU development, or to the technical merits of whatever API, but to whether that development had such an effect that Sony is beholden to it, and to me, unless they use it, my answer is no.


But it's not a custom card at all. That is the point I'm making.
The RSX (the PS3's GPU) uses the G70 core, which is exactly the same as the PC version; the only differences are the memory bus (many graphics chips use different types of memory for more performance or a lower price) and the smaller fabrication process (so it runs cooler, uses less power, and is cheaper to make).

RSX: G70 core, 24 pixel shaders, 8 vertex shaders, 24 texture mapping units, 8 ROPs, core clock 500/550 MHz.
GeForce 7800 GTX: G70 core, 24 pixel shaders, 8 vertex shaders, 24 texture mapping units, 16 ROPs, core clock 430/550 MHz.

The only difference is that the G70 chip in the PS3 has the ROPs cut in half. ROP stands for "Raster Operations Pipeline" (the render output unit).
ROPs handle the final transition from the pixel pipeline to the display by building the pixel fragments generated by the pixel pipeline into complete pixels. The ROP unit also optimises the display image to save memory bandwidth, for example through depth compression and colour comparison, amongst other things.

I would assume Sony cut the ROPs in half to save money; it's not unusual for PC manufacturers to do the same to hit certain price points.

But regardless of what you say, the GPU in the PS3 is exactly the same as the PC part; hell, the core codename is exactly the same. How you can deny that beats me. Is this some blind-faith thing you have going on?

ROPs in this chip are tied to the number of memory chips: a 128-bit bus needs four 32-bit chips, and each ROP is tied to a 16-bit memory channel. Newer architectures decouple ROPs from channel width.

A 256-bit bus would have needed eight chips, so it's costlier, as you assumed.

OK, I concede the custom bit; after all, all that means is that it was something made or tuned to the customer's specific needs. But that gets away from the rest of my reply, and that is the part that shows my view on the matter.



Research shows video games help make you smarter, so why am I an idiot?