
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

ethomaz said:
zarx said:
Oh that's a bit smaller than I thought for eDRAM, I guess it's been a while since I looked into it. I didn't think they had actually started making it at 40nm, http://am.renesas.com/products/soc/asic/cbic/ipcore/edram/ still lists 40nm as under development.

What? NEC and TSMC already had 40nm eDRAM in 2009.

The IBM POWER7 uses 45nm eDRAM, and IBM started manufacturing 32nm eDRAM (0.24mm² per Mbit) in February 2012.


Can't say I have followed eDRAM production at all, so when I went to the site of the company providing the eDRAM and saw that they still listed 40nm as under development, I just assumed they hadn't ramped up 40nm volume production yet. Oh well, live and learn.
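
As a quick back-of-the-envelope check on the 0.24mm² per Mbit figure quoted above, here is a rough sketch in Python; the pool sizes are illustrative examples, not confirmed specs:

DENSITY_MM2_PER_MBIT = 0.24   # the 32nm eDRAM density figure quoted above

for size_mb in (10, 32):      # 10 MB (Xbox 360-class) and 32 MB pools, purely for scale
    mbits = size_mb * 8                    # megabytes -> megabits
    area = mbits * DENSITY_MM2_PER_MBIT    # silicon area in mm^2
    print(f"{size_mb} MB of eDRAM ~= {area:.1f} mm^2 at 32nm")

Even a fairly large pool stays a small slab of silicon at that density, which is part of why eDRAM keeps showing up in console designs.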



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

zarx said:


Can't say I have followed eDRAM production at all, so when I went to the site of the company providing the eDRAM and saw that they still listed 40nm as under development, I just assumed they hadn't ramped up 40nm volume production yet. Oh well, live and learn.

If you want, you can order eDRAM with 24nm structures, no problem. Factories make what you want; it is simply a question of economics. The Xbox used 90nm eDRAM until recently because manufacturing on 90nm nodes has become dirt cheap: those lines can use low-price wafers and old machinery that is fully written off. While you will get more than twice the number of eDRAM dies out of a 40nm line, the price you pay is more than double the price on the "antique" machinery. Right now, 45nm lines are on the way out as more and more 28/32nm lines are put into use, so more common stuff is moving from 90nm to 45/40nm. Basically, factories have the choice to scrap their 45nm lines or keep using them at bargain prices (for PC north/southbridges or eDRAM, for example).
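
A toy calculation of the trade-off described above; all wafer prices and die counts are made-up placeholders, only the relationship between them matters:

def cost_per_die(wafer_price_usd, dies_per_wafer):
    return wafer_price_usd / dies_per_wafer

old_90nm = cost_per_die(wafer_price_usd=1000, dies_per_wafer=400)  # written-off "antique" line
new_40nm = cost_per_die(wafer_price_usd=3000, dies_per_wafer=900)  # modern line, >2x the dies

print(f"90nm: ${old_90nm:.2f}/die   40nm: ${new_40nm:.2f}/die")
# -> the old line can still win on price per die despite yielding far fewer dies per wafer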



JEMC said:
HoloDust said:

Well, the 7970M (the mobile, slightly downclocked version of the HD 7870) is rated at 75W (though some sources state up to 100W). How they achieve that in the mobile versions of their chips has always eluded me. But if that number is correct, CPU+GPU would give (95+75) a TDP of 170W, which I think would put it (when all the other stuff is added) well within the bounds of the original Xbox 360. But yeah, as you said, I wouldn't bet on a 7970/680 equivalent inside any next-gen console.

(Numbers taken from Anandtech)

                     HD 7970                        HD 7900M
Stream Processors    2048                           1280
Texture Units        128                            80
ROPs                 32                             32
Core Clock           925MHz                         850MHz
Memory Clock         1.375GHz (5.5GHz effective)    4.8GHz (effective)
Memory Bus Width     384-bit                        256-bit
Memory               3GB GDDR5                      2GB GDDR5

As you see, the cards are not exactly the same.
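
For a rough sense of what that table means in raw throughput: GCN parts do 2 single-precision FLOPs per stream processor per clock, so a quick sketch using the quoted numbers gives:

def gflops(stream_processors, core_clock_ghz):
    # GCN: 2 FLOPs (one multiply-add) per stream processor per clock
    return stream_processors * 2 * core_clock_ghz

print(f"HD 7970 : {gflops(2048, 0.925):.0f} GFLOPS")   # ~3789
print(f"HD 7970M: {gflops(1280, 0.850):.0f} GFLOPS")   # ~2176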

Forget GPUs with 384-bit paths instantly; even 256-bit paths are unlikely. Count the number of board layers any such PC graphics card has - way too expensive for a console. As I have explained before, a (simplified) AMD HD 7770 (which has a 128-bit bus) is cheap and more than powerful enough for 1080p/60Hz.
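
For reference, peak memory bandwidth is just bus width times effective data rate, which is why the narrower bus is less of a problem than it sounds. A sketch, using the stock memory speeds of the two cards:

def bandwidth_gb_s(bus_width_bits, effective_gt_s):
    # bytes per transfer = bus width / 8; GB/s = bytes per transfer * billions of transfers per second
    return bus_width_bits / 8 * effective_gt_s

print(f"384-bit @ 5.5 GT/s (HD 7970): {bandwidth_gb_s(384, 5.5):.0f} GB/s")   # ~264
print(f"128-bit @ 4.5 GT/s (HD 7770): {bandwidth_gb_s(128, 4.5):.0f} GB/s")   # ~72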



superchunk said:

Additionally, 8 cores is completely unnecessary for anything. Most PCs only have quad-cores for a reason; it's not a damn server. 4 or 5/6 cores is more than enough, even if you consider that Kinect very likely needs one core to itself.

I'm betting on a 4-core device, with maybe a 2nd standalone CPU just for Kinect and/or the OS. All 2011 (at best) tech and lower specs than current high-end PCs.

If MS really goes AMD, then an 8-core FX processor is almost dead certain. To understand why Kinect2 will use an even number of cores (my guess is it might be up to four if the two cameras are high resolution), you'd have to understand the design peculiarities of AMD's new multicore processors, which differ from Intel CPUs in one decisive way that, in the case of Kinect2, could actually turn out to be favourable for the AMD part.
The idea of adding another CPU is, mildly put, odd - for reasons already discussed in another thread; it would simply be an engineering nightmare to design. A separate CPU used just for the OS is an odd idea in itself, as _every_ piece of running software uses OS calls all the time.

Or written in simpler terms: (price of 2 processors with 4 cores doing separate stuff) >>>>> (price of one 8-core processor doing all the stuff).



regin2005 said:
a) While all these numbers are nice and all (8 cores this, 3GB DDR3 that), what REAL difference will these super-powered consoles make that the current consoles (PS3, X360) cannot?

b) Is the so-called power of the PS3 already tapped out?
c) Can visuals really get any better than they already are on 360/PS3? If the new consoles are at LEAST twice as strong, will anyone be able to tell the difference?

a) They can output prettier pictures at higher frame rates - no more frame drops and cheesy graphics.

b) Yes. If you want prettier pictures, you need more RAM than is available now. If you want more complex games, you need more RAM than is available now. By now, developers have pretty much figured out how to use the 512MB of memory efficiently without falling back to slideshows from streaming stuff too often.

c) Depends on your eyesight and the game played. A 1080p/60Hz game should look better than a 720p/30Hz game if stuff is rapidly flying around on screen. Higher resolution can mean better textures, and a better GPU can mean more effects; it all depends on whether developers go for it or not.
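
To put numbers on the 1080p/60 vs 720p/30 comparison, the raw pixel throughput works out as follows (simple arithmetic, ignoring overdraw and post-processing):

def pixels_per_second(width, height, fps):
    return width * height * fps

p1080_60 = pixels_per_second(1920, 1080, 60)
p720_30  = pixels_per_second(1280, 720, 30)

print(f"1080p/60: {p1080_60 / 1e6:.0f} Mpix/s")    # ~124
print(f" 720p/30: {p720_30 / 1e6:.0f} Mpix/s")     # ~28
print(f"ratio   : {p1080_60 / p720_30:.1f}x")      # 4.5x the pixels to shade per second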

The key is having more RAM. On the PlayStation side, no longer having the Cell PPU/SPU mystery will help weaker developers reach the "promised land" (1080p/60Hz).



HoloDust said:
JEMC said:

As far as I've seen, the HD 7770 is rated at 100W, and it doesn't give the performance of a GTX570 (the rumor talked about dev kits featuring the 8-core Intel CPU and a GTX570). To get that level of performance you'll need an HD 7850 or 7870, and those cards use 120-150W.


http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Radeon_HD_7xxxM_Series

As I said, I have no idea how they make a mobile version that is rated at only 75W (7970M vs HD 7870).

We were talking about the desktop parts, not the mobility ones.

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Southern_Islands_.28HD_7xxx.29_series

But it seems that I was wrong. The HD 7770 seems to use 130W, 30W more than I thought.

How they make the mobility cards is a mystery to me, but from some forums it seems that the 7970M performs between the HD 7770 and the HD 7850, closer to the latter.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

drkohler said:
JEMC said:

(Numbers taken from Anandtech)

                     HD 7970                        HD 7900M
Stream Processors    2048                           1280
Texture Units        128                            80
ROPs                 32                             32
Core Clock           925MHz                         850MHz
Memory Clock         1.375GHz (5.5GHz effective)    4.8GHz (effective)
Memory Bus Width     384-bit                        256-bit
Memory               3GB GDDR5                      2GB GDDR5

As you see, the cards are not exactly the same.

Forget GPUs with 384-bit paths instantly; even 256-bit paths are unlikely. Count the number of board layers any such PC graphics card has - way too expensive for a console. As I have explained before, a (simplified) AMD HD 7770 (which has a 128-bit bus) is cheap and more than powerful enough for 1080p/60Hz.

I was only comparing the two top-of-the-line GPUs from AMD: the desktop and mobility ones.

I agree that there is no need for 384-bit GPUs in consoles, mainly because I don't think they will use GDDR5 memory but DDR3, and with DDR3 being slower it wouldn't be worth it.
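
Rough numbers on why a wide bus buys little with DDR3; the data rates below are typical figures, not confirmed console specs:

def bandwidth_gb_s(bus_width_bits, effective_gt_s):
    return bus_width_bits / 8 * effective_gt_s

print(f"256-bit DDR3-2133        : {bandwidth_gb_s(256, 2.133):.0f} GB/s")  # ~68
print(f"256-bit GDDR5 @ 5.5 GT/s : {bandwidth_gb_s(256, 5.5):.0f} GB/s")    # ~176
print(f"384-bit DDR3-2133        : {bandwidth_gb_s(384, 2.133):.0f} GB/s")  # ~102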

And yes, an HD 7770 in a console would be powerful enough to play at 1080p (not sure about 60fps, as developers always try to put more on screen rather than making it run faster). But the rumor we started discussing talked about a GTX570, and the 7770 doesn't reach that performance.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

For me, the Wii U uses a low-clocked RV740 (~500-600MHz).
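
If it really is an RV740 (640 stream processors, 2 FLOPs per stream processor per clock), that clock range would translate roughly to the figures below; a sketch, with the stock HD 4770 clock included for reference:

def gflops(stream_processors, clock_ghz):
    # VLIW5 parts like the RV740 also count 2 FLOPs per stream processor per clock
    return stream_processors * 2 * clock_ghz

for clock_ghz in (0.50, 0.55, 0.60, 0.75):   # 0.75 GHz = stock HD 4770
    print(f"RV740 @ {clock_ghz * 1000:.0f} MHz: ~{gflops(640, clock_ghz):.0f} GFLOPS")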



JEMC said:
HoloDust said:

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Radeon_HD_7xxxM_Series

As I said, I have no idea how they make a mobile version that is rated at only 75W (7970M vs HD 7870).

We were talking about the desktop parts, not the mobility ones.

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Southern_Islands_.28HD_7xxx.29_series

But it seems that I was wrong. The HD 7770 seems to use 130W, 30W more than I thought.

How they make the mobility cards is a mystery to me, but from some forums it seems that the 7970M performs between the HD 7770 and the HD 7850, closer to the latter.

Yeah, it's hard to find direct comparisons between desktop and mobile parts - the only place I know of with something like that is CLBenchmark. It's not so much a graphics comparison per se, but it can give some insight: http://www.clbenchmark.com/compare.jsp?config_0=11973646&config_1=12713835

That said, I can see something like the 7970M (or similar) going into next-gen and fitting into the power envelope (if that TDP really is as low as 75W).
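
A very rough power-budget sketch of that scenario, assuming the 75W figure holds; the CPU and "everything else" numbers are purely illustrative guesses, not leaked specs:

budget_watts = {
    "GPU (7970M-class)": 75,   # the TDP figure under discussion
    "CPU":               35,   # illustrative guess
    "RAM/drives/misc":   30,   # illustrative guess
}
total = sum(budget_watts.values())
print(f"estimated draw: {total} W")   # comfortably under the ~200 W PSU class of a launch Xbox 360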



ethomaz said:

For me, the Wii U uses a low-clocked RV740 (~500-600MHz).

That would be an HD 4770, which is what most of us thought Nintendo was using, so yeah, it may be. Not too sure about the frequency, though. The retail card runs at 750MHz using ~95W, more than the whole Wii U.

Since they are probably using the same 40nm fab process, either they have done magic or the card runs slower than that.
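
One reason a downclock can close that gap: dynamic power scales roughly with frequency times voltage squared, and a lower clock usually allows a lower voltage too. A sketch; the voltages are illustrative, not measured values:

def relative_power(freq_mhz, volts, ref_freq_mhz=750.0, ref_volts=1.0):
    # dynamic power ~ frequency * voltage^2, normalised to the stock HD 4770 clock
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

print(f"600 MHz @ 0.90 V: {relative_power(600, 0.90):.0%} of stock dynamic power")  # ~65%
print(f"500 MHz @ 0.85 V: {relative_power(500, 0.85):.0%} of stock dynamic power")  # ~48%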

@HoloDust: Thanks for the link. Yes, it's hard to compare, not only because of the cards but also because the rest of the hardware is different.

About the 7970M being possible: well, your guess is as valid as mine. But that's the fun, discussing what's possible only to end up being completely wrong.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.