
Forums - Nintendo - Does the Wii U support DX11?

JEMC said:
Soleron said:
TheShape31 said:

 Don't get people too excited with threads like this. DX10 is confirmed, but that's where Wii U's capabilities end.

It uses an R700 GPU, so DX11 is confirmed, or at least its feature set is present on the hardware.

Weren't the HD 5xxx series of cards the first ones to be DX 11? The HD 4xxx series, the ones that used the R700 chip, were DX10 cards.

Yep, I was thinking of DX10.1. But it has a tessellator, so there are only a few features missing.

DX10.1 contains a lot as well, but Nvidia never supported it so it didn't get traction. It should look better than DX10 anyway.
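
For anyone wondering what the DX10 / DX10.1 / DX11 distinction means in practice: on Windows, Direct3D 11 reports hardware capability as a "feature level", and an R700-class (HD 4xxx) part comes back as 10_1 while an HD 5xxx part comes back as 11_0. A minimal sketch using the standard D3D11 device-creation call (error handling trimmed):

    #include <cstdio>
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    int main() {
        // List the feature levels we care about, highest first.
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0,  // "DX11 class" (HD 5xxx and up)
            D3D_FEATURE_LEVEL_10_1,  // "DX10.1 class" (R700 / HD 4xxx)
            D3D_FEATURE_LEVEL_10_0,
        };
        ID3D11Device* dev = nullptr;
        ID3D11DeviceContext* ctx = nullptr;
        D3D_FEATURE_LEVEL got = {};

        // D3D11CreateDevice picks the highest level the GPU supports.
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            wanted, (UINT)(sizeof(wanted) / sizeof(wanted[0])),
            D3D11_SDK_VERSION, &dev, &got, &ctx);
        if (SUCCEEDED(hr))
            std::printf("Feature level: 0x%04x\n", got);
        if (ctx) ctx->Release();
        if (dev) dev->Release();
        return 0;
    }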




Well, the hardware would normally support DirectX, in the sense that an off-the-shelf version of it does, but DirectX is a Windows API, and since the Wii U won't be running Windows, none of the games will be written with DirectX; they'll use one of the newer OpenGL ES APIs instead. So the hardware will probably be optimized for that OpenGL use, and the DirectX-specific support will be cut since it would be vestigial.

The problem with saying which version it would support is that the chip won't be an off-the-shelf GPU but a modified version of one; if anything it'll be a hybrid that may or may not handle a version the stock part couldn't.
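
On the OpenGL side of the post above: the same hardware features the DX versions argue about show up as GL extensions instead, so the question a dev would actually ask is whether the driver exposes them. A rough desktop-GL sketch (assumes a GL context is already current, e.g. created via GLFW; the extension names are real ARB extensions, tessellation being core in GL 4.0):

    #include <cstdio>
    #include <cstring>
    #include <GL/gl.h>

    // Assumes a current OpenGL context. On older/compatibility
    // contexts the whole extension list is one big string.
    static bool has_ext(const char* name) {
        const char* exts = (const char*)glGetString(GL_EXTENSIONS);
        return exts && std::strstr(exts, name) != nullptr;
    }

    void report_gpu_features() {
        std::printf("Renderer:   %s\n", glGetString(GL_RENDERER));
        std::printf("GL version: %s\n", glGetString(GL_VERSION));
        // Tessellation is the headline "DX11-class" feature.
        std::printf("Tessellation:    %s\n",
            has_ext("GL_ARB_tessellation_shader") ? "yes" : "no");
        std::printf("Compute shaders: %s\n",
            has_ext("GL_ARB_compute_shader") ? "yes" : "no");
    }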



Soleron said:
JEMC said:
Soleron said:
TheShape31 said:

 Don't get people too excited with threads like this. DX10 is confirmed, but that's where Wii U's capabilities end.

It uses an R700 GPU, so DX11 is confirmed, or at least its feature set is present on the hardware.

Weren't the HD 5xxx series of cards the first ones to be DX 11? The HD 4xxx series, the ones that used the R700 chip, were DX10 cards.

Yep, I was thinking of DX10.1. But it has a tessellator, so there are only a few features missing.

DX10.1 contains a lot as well, but Nvidia never supported it so it didn't get traction. It should look better than DX10 anyway.

True. And since most devs don't use DX when developing for consoles, having DX11 matters a bit less if the hardware is good enough.

lilbroex said:
JEMC said:

Weren't the HD 5xxx series of cards the first ones to be DX 11? The HD 4xxx series, the ones that used the R700 chip, were DX10 cards.

That would only be true if it were using a stock GPU base like the 360 and PS3, but even then, those two had some augmentations to do things the stock cards could not. Nintendo has never used a stock CPU or GPU. They've always used completely custom-made components.

The Wii U can have whatever they specified.

Not only Nintendo, but also Microsoft and Sony use custom parts for their consoles. It's nothing new.

Also, you can only do so many modifications to a GPU before going with another GPU becomes a more viable, or even cheaper, option. And so far all rumors have agreed that it uses an R7xx-based GPU.




Custom logic used to be good, but modern graphics processors are so complex and the drivers so optimised that if you're the only one using it, it'd cost a few hundred million to make serious modifications.

This is why we're seeing consoles converge with PCs in hardware. Off-the-shelf parts from a year or two ago are the only things cheap enough to be in a $300 console.



Soleron said:
Custom logic used to be good, but modern graphics processors are so complex and the drivers so optimised that if you're the only one using it, it'd cost a few hundred million to make serious modifications.

This is why we're seeing consoles converge with PCs in hardware. Off-the-shelf parts from a year or two ago are the only things cheap enough to be in a $300 console.


Did you BS that up?

It's not so complex.

They're able to do it easily. Btw, graphics processors are sold high but cost low to produce. Just like graphics cards.



"Excuse me sir, I see you have a weapon. Why don't you put it down and let's settle this like gentlemen"  ~ max

ninetailschris said:
Soleron said:
Custom logic used to be good, but modern graphics processors are so complex and the drivers so optimised that if you're the only one using it, it'd cost a few hundred million to make serious modifications.

This is why we're seeing consoles converge with PCs in hardware. Off-the-shelf parts from a year or two ago are the only things cheap enough to be in a $300 console.


Did you BS that up?

It's not so complex.

They're able to do it easily. Btw, graphics processors are sold high but cost low to produce. Just like graphics cards.

I'm talking about design. Obviously once you have a fab ($5 billion) and a design ($500m++) each unit is cheap (~$50).

There are 1 billion transistors in R700. Even if you were only going to rewire 10% of that for performance or an extra feature, you'd be messing with the core of the design and have to get the chip re-laid out, qualified, masked and so on. That would require the work of 50+ engineers for more than a year, plus the fixed costs of getting a new chip fabbed, which are very high. You'd also have to redo the drivers to perform well with the new logic, and AMD's driver team is easily over 100 people.

If it's not so complex, why don't we see Intel, Qualcomm, ARM and so on actually make a high-end desktop GPU? Because they can't do it better than AMD or Nvidia at reasonable cost. In Intel's case they actually got the PS4 contract conditional on their GPU not sucking, and it sucked so badly that they cancelled the project and lost the PS4.

The final problem with custom logic is that all the devs have been coding for AMD and Nvidia plus DX9-11 for years. Even if your custom chip has amazing capabilities (see also: PS3 Cell, GameCube Flipper), no one will use it properly because the competition doesn't have it.



Soleron said:
ninetailschris said:
Soleron said:
Custom logic used to be good, but modern graphics processors are so complex and the drivers so optimised that if you're the only one using it, it'd cost a few hundred million to make serious modifications.

This is why we're seeing consoles converge with PCs in hardware. Off-the-shelf parts from a year or two ago are the only things cheap enough to be in a $300 console.


Did you BS that up?

It's not so complex.

They're able to do it easily. Btw, graphics processors are sold high but cost low to produce. Just like graphics cards.

I'm talking about design. Obviously once you have a fab ($5 billion) and a design ($500m++) each unit is cheap (~$50).

There are 1 billion transistors in R700. Even if you were only going to rewire 10% of that for performance or an extra feature, you'd be messing with the core of the design and have to get the chip re-laid out, qualified, masked and so on. That would require the work of 50+ engineers for more than a year, plus the fixed costs of getting a new chip fabbed, which are very high. You'd also have to redo the drivers to perform well with the new logic, and AMD's driver team is easily over 100 people.

If it's not so complex, why don't we see Intel, Qualcomm, ARM and so on actually make a high-end desktop GPU? Because they can't do it better than AMD or Nvidia at reasonable cost. In Intel's case they actually got the PS4 contract conditional on their GPU not sucking, and it sucked so badly that they cancelled the project and lost the PS4.

The final problem with custom logic is that all the devs have been coding for AMD and Nvidia plus DX9-11 for years. Even if your custom chip has amazing capabilities (see also: PS3 Cell, GameCube Flipper), no one will use it properly because the competition doesn't have it.

The latest batch of rumours suggests that if they're using an R700 then it's been heavily modified. The recent Eurogamer article even suggests it's actually based on a 7xxx series card, although I don't think the author is much of a tech head, so it wouldn't surprise me if he got R700 mixed up with the 7 series.



No, and neither will the PS4.



TheShape31 said:
"It’s great to have a massive processor that’s got a graphics pipeline that uses DX11, but..."

equals...

"It's great to have the latest and greatest, but that's not what Wii U is about."

Don't get people too excited with threads like this. DX10 is confirmed, but that's where Wii U's capabilities end.

A DX10-equivalent feature set has only been 'confirmed' by an anonymous source. And if he knew what he was talking about he wouldn't have referred to it as DX10 but as a DX10.1 feature set. Anyone with any experience working with graphics APIs would have.

We know that the GPGPU in the U supports compute shaders and tessellation, and that it's a modern Radeon HD. It'll have a DX11-equivalent feature set, no doubt in my mind about that. All of those 'specs' from that Eurogamer article were from an anonymous dev; it's not the first time Eurogamer have published a complete load of old bollocks about the U in an attempt to get more advertising revenue. I don't even check the links any more, I just read the articles copied and pasted into threads instead.
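
The compute shader and tessellation claims are at least checkable on PC-class hardware: in D3D11 terms, tessellation (hull/domain shaders) is only guaranteed at feature level 11_0, while compute shaders are an optional capability on 10.x-class chips that you have to query explicitly. A sketch using the real CheckFeatureSupport call, assuming dev is an ID3D11Device created as in the earlier snippet:

    #include <d3d11.h>

    // Tessellation requires feature level 11_0; compute shaders may
    // also exist on 10.x-class hardware as an optional feature.
    void check_dx11_class_features(ID3D11Device* dev) {
        const bool tessellation =
            dev->GetFeatureLevel() >= D3D_FEATURE_LEVEL_11_0;

        D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
        dev->CheckFeatureSupport(
            D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
            &opts, sizeof(opts));
        const bool compute_on_10x =
            opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x != 0;

        // A GPU offering both is what "DX11 feature set" usually
        // means in threads like this one.
        (void)tessellation;
        (void)compute_on_10x;
    }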



snowdog said:
TheShape31 said:
"It’s great to have a massive processor that’s got a graphics pipeline that uses DX11, but..."

equals...

"It's great to have the latest and greatest, but that's not what Wii U is about."

Don't get people too excited with threads like this. DX10 is confirmed, but that's where Wii U's capabilities end.

A DX10-equivalent feature set has only been 'confirmed' by an anonymous source. And if he knew what he was talking about he wouldn't have referred to it as DX10 but as a DX10.1 feature set. Anyone with any experience working with graphics APIs would have.

We know that the GPGPU in the U supports compute shaders and tessellation, and that it's a modern Radeon HD. It'll have a DX11-equivalent feature set, no doubt in my mind about that. All of those 'specs' from that Eurogamer article were from an anonymous dev; it's not the first time Eurogamer have published a complete load of old bollocks about the U in an attempt to get more advertising revenue. I don't even check the links any more, I just read the articles copied and pasted into threads instead.


Well spoken. I'm more interested in the processor the Wii U has. To date the only info we have is the info given by IBM themselves, who said it would be Power7-based. It seems odd given other reports.