
Forums - Microsoft Discussion - [Unofficial Rumour] World Exclusive: Durango unveiled

1337 Gamer said:

Yes, you are correct to an extent. But you do realize that you have to quadruple the amount of needed memory, assuming of course that the PS3 really is going to use 4GB of GDDR5. Yes, the price has probably come down, but you're still probably looking at $80 to $100 for the memory alone. And you still need to factor in the CPU cost, cooling, the case, the power supply, and things of that nature. There is no way it can be done for $400.

I think you are overpricing the component parts... 4GB of DDR3 costs less than $20 at retail, and it's even cheaper to manufacture. GDDR5 can be expensive (or not), but I know it is DDR3-based... so the memory is in no way $80 to $100.

The Orbis will be released at $350 (that's my prediction).

The component parts are cheaper and not expensive to manufacture... that super PC you paid $800 for costs less than $400 to manufacture. What matters to Sony/Microsoft/Nintendo is the manufacturing cost, not the retail market price.




Just to make some cost estimate for Durango...

CPU: < $50
GPU: < $50
Special Sauce: < $30
RAM: < $20
Power: < $10
Thermal: < $10
PCB: < $10
BD Drive: < $30
Other components: < $150

Total cost: $360

Box, manual, controller, etc: +$50

Final cost for Microsoft... ~$410

P.S. Don't take this too seriously... it's just a vague estimate, and a little overpriced.
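For anyone who wants to check the arithmetic, the per-part ceilings above sum to $360 for the hardware and $410 with packaging (the figures are the poster's rough guesses, not confirmed prices):

```python
# Rough bill-of-materials ceilings from the estimate above (guesses, in USD)
parts = {
    "CPU": 50,
    "GPU": 50,
    "Special sauce": 30,
    "RAM": 20,
    "Power": 10,
    "Thermal": 10,
    "PCB": 10,
    "BD drive": 30,
    "Other components": 150,
}
hardware = sum(parts.values())
packaging = 50  # box, manual, controller, etc.
print(hardware)              # 360
print(hardware + packaging)  # 410
```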



CGI-Quality said:
ninjablade said:
CGI-Quality said:
ninjablade said:

The 360's RAM was not split; only the PS3's was. That's why some PS3 games like Skyrim and Fallout 3 had problems.

I never said the 360's RAM was split. 

You don't understand. Many developers have said the Wii U is on par with current gen, and current gen is already struggling to keep up: games like Crysis 1 and 2, AC3, and Far Cry 3 run like crap and run sub-HD, like COD Blops 2.

I do understand. Some developers say it is weaker than or equal to current-gen consoles; others say otherwise. What I gather is that the system has strengths and weaknesses (much like any console) but is still more powerful than the current HD devices, and while the successors to both of those will surely top it, the Wii U shouldn't follow in the Wii's footsteps, due in part to its architecture. Besides, the games you mentioned look and run fine on current-gen consoles (for the most part - save for Crysis 2, which was a mess, especially on PS3).

Still doesn't explain why you quoted the last post with that, though. And you didn't address it - I'm not sure how you got the idea that I said the 360's RAM was split.

Let's stop talking about the Wii U in this thread. Developers can say what they want; honestly, I see a system with advantages and disadvantages over current gen. Until I start seeing superior multiplatform games, I will not call it more powerful. Same thing for the PS3, and it's why I chose the 360 over it.



CGI-Quality said:
ninjablade said:
CGI-Quality said:
ninjablade said:

You don't understand. Many developers have said the Wii U is on par with current gen, and current gen is already struggling to keep up: games like Crysis 1 and 2, AC3, and Far Cry 3 run like crap and run sub-HD, like COD Blops 2.

I do understand. Some developers say it is weaker than or equal to current-gen consoles; others say otherwise. What I gather is that the system has strengths and weaknesses (much like any console) but is still more powerful than the current HD devices, and while the successors to both of those will surely top it, the Wii U shouldn't follow in the Wii's footsteps, due in part to its architecture. Besides, the games you mentioned look and run fine on current-gen consoles (for the most part - save for Crysis 2, which was a mess, especially on PS3).

Still doesn't explain why you quoted the last post with that, though. And you didn't address it - I'm not sure how you got the idea that I said the 360's RAM was split.

Let's stop talking about the Wii U in this thread. Developers can say what they want; honestly, I see a system with advantages and disadvantages over current gen. Until I start seeing superior multiplatform games, I will not call it more powerful. Same thing for the PS3, and it's why I chose the 360 over it.

You chose the PS3 over the 360 because of multiplats? If it's because it proved its superiority early on, I don't see how, as it did anything but that. However, if it's about exclusives, I can understand, but that's not what I gather from reading your post.

Though you are right, the topic is not about the PS3, 360, or Wii U. So, let's aim to correct that and move on. :)


I chose the 360 over the PS3, sorry.



thismeintiel said:
Well, a couple of them are MS-biased, so they may just be saying it will close the gap between the GPUs. However, if they are correct, they still don't take into account that the PS4 is supposed to have its own "secret sauce." So, there would still be a gap. It'll probably be like this gen, only the PS4 will be easier to develop for, so more devs can easily take advantage of it.

If this so-called "secret sauce" were so effective, why wouldn't PCs have a third type of component that alleviates the CPU/GPU workload with it? For example, we'd have the CPU/GPU plus some "helper module" that speeds up graphics and could be plugged into a PCIe slot.

An HD7850-7870-level GPU delivers 10-20 fps more in modern games than the HD7770 GHz Edition. When consoles generally target 30 fps, that kind of performance difference is huge.

 

Not to mention, people forget that for the GPU to work fast, it needs a lot of memory bandwidth. Take the rumored ~HD7770 GHz Edition level of performance in the Xbox 720: that card normally has 4500MHz effective GDDR5 over a 128-bit bus. Replace the GDDR5 with DDR3-2133 over the same 128-bit bus and your GPU's memory bandwidth falls from 72GB/sec to just 34GB/sec! Guess what happens? Your GPU performance will fall 40-50% with this reduction in memory bandwidth.
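The two bandwidth numbers fall straight out of the standard peak-bandwidth formula (effective transfer rate times bus width in bytes); a quick sketch:

```python
def peak_bandwidth_gbs(effective_mts, bus_bits):
    """Peak memory bandwidth in GB/s: million transfers/sec x bytes per transfer."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

# 4500 MT/s GDDR5 on a 128-bit bus (HD7770-class card)
print(peak_bandwidth_gbs(4500, 128))  # 72.0
# DDR3-2133 on the same 128-bit bus
print(peak_bandwidth_gbs(2133, 128))  # ~34.1
```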

This likely explains why MS is going to use eSRAM/eDRAM for the GPU: the 8GB of DDR3 is going to be shared with the GPU (i.e., the GPU is going to be severely memory-bandwidth bottlenecked).

In the simplest terms, you can take a 1.2 TFLOP GPU and compare it to a 1.84 TFLOP GPU on the same architecture, but if the first GPU's memory bandwidth just got neutered by DDR3, its performance will drop like a rock! Effectively, you are no longer comparing a 1.2 TFLOP GPU to a 1.84 TFLOP one, because the former can no longer work at full capacity - it's memory-bandwidth bottlenecked.

Example -- the GT 640's floating-point performance is fixed, but look at the dramatic effect on actual gaming performance of swapping DDR3 for GDDR5.

GT 640 DDR3 = 46 VP

GT 640 GDDR5 = 68 VP (+48% faster) by just swapping out DDR3 for GDDR5. 

http://alienbabeltech.com/abt/viewtopic.php?p=41174
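The quoted gain checks out arithmetically from the two scores above:

```python
ddr3_vp, gddr5_vp = 46, 68  # GT 640 scores quoted above
gain_pct = (gddr5_vp / ddr3_vp - 1) * 100
print(round(gain_pct))  # 48
```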

On paper, things are looking MUCH worse for the Xbox 720's GPU. Not only is its floating-point performance rumored to be 50% less than the PS4's, but the memory subsystem that feeds the GPU is dramatically inferior to the PS4's rumored 4GB GDDR5 setup. The difference between an HD7770 GHz Edition with DDR3 and an 85%-of-HD7870 part with GDDR5 is going to be more than double in graphical capability, not 50%, because memory bandwidth is required to keep the GPU working at full capacity (specifically, the memory bandwidth feeds the GPU's ROPs).

The Render Output Unit, often abbreviated "ROP" and sometimes called (perhaps more properly) the Raster Operations Pipeline, handles one of the final steps in the rendering process on modern 3D accelerator boards. If you neuter a GPU's memory bandwidth, you neuter the ROPs, and thus the final stage of the rendering process. It's like putting Toyota Prius tires on a 700hp rear-wheel-drive supercar. 32MB of eSRAM is not going to be the answer to this problem, because it's simply not large enough.




Looks like a nicely balanced arch, as per the 360. Good to see that unified RAM lives on, and it appears the ESRAM has DMA, which would be very useful. It's also actually big enough to hold HD frames, unlike the 360's EDRAM.

EDIT: it looks like there's a blitter on the northbridge; perhaps that will allow handoff to the ESRAM, which may have additional logic, as the 360's EDRAM had AA etc. Tight.
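The "big enough to hold HD frames" claim is easy to sanity-check: a 1080p render target is about 8MB per 32-bit buffer, so color plus depth fits comfortably in 32MB of ESRAM but not in the 360's 10MB of EDRAM. A back-of-the-envelope sketch, assuming plain uncompressed buffers and no MSAA:

```python
def buffer_mb(width, height, bytes_per_pixel):
    """Size of one uncompressed render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = buffer_mb(1920, 1080, 4)  # 32-bit color
depth = buffer_mb(1920, 1080, 4)  # 32-bit depth/stencil
print(round(color, 2))          # 7.91
print(round(color + depth, 2))  # 15.82 -- fits in 32MB, not in 10MB
```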



BlueFalcon said:
thismeintiel said:
Well, a couple of them are MS-biased, so they may just be saying it will close the gap between the GPUs. However, if they are correct, they still don't take into account that the PS4 is supposed to have its own "secret sauce." So, there would still be a gap. It'll probably be like this gen, only the PS4 will be easier to develop for, so more devs can easily take advantage of it.

If this so-called "secret sauce" were so effective, why wouldn't PCs have a third type of component that alleviates the CPU/GPU workload with it? For example, we'd have the CPU/GPU plus some "helper module" that speeds up graphics and could be plugged into a PCIe slot.

An HD7850-7870-level GPU delivers 10-20 fps more in modern games than the HD7770 GHz Edition. When consoles generally target 30 fps, that kind of performance difference is huge.

 

Not to mention, people forget that for the GPU to work fast, it needs a lot of memory bandwidth. Take the rumored ~HD7770 GHz Edition level of performance in the Xbox 720: that card normally has 4500MHz effective GDDR5 over a 128-bit bus. Replace the GDDR5 with DDR3-2133 over the same 128-bit bus and your GPU's memory bandwidth falls from 72GB/sec to just 34GB/sec! Guess what happens? Your GPU performance will fall 40-50% with this reduction in memory bandwidth.

This likely explains why MS is going to use eSRAM/eDRAM for the GPU: the 8GB of DDR3 is going to be shared with the GPU (i.e., the GPU is going to be severely memory-bandwidth bottlenecked).

In the simplest terms, you can take a 1.2 TFLOP GPU and compare it to a 1.84 TFLOP GPU, but if the first GPU's memory bandwidth just got neutered by DDR3, its performance will drop like a rock!

GT 640 DDR3 = 46 VP

GT 640 GDDR5 = 68 VP (+48% faster) by just swapping out DDR3 for GDDR5. 

http://alienbabeltech.com/abt/viewtopic.php?p=41174

On paper, things are looking MUCH worse for the Xbox 720's GPU. Not only is its floating-point performance rumored to be 50% less than the PS4's, but the memory subsystem that feeds the GPU is dramatically inferior to the PS4's rumored 4GB GDDR5 setup. The difference between an HD7770 GHz Edition with DDR3 and an 85%-of-HD7870 part with GDDR5 is going to be more than double in graphical capability, not 50%, because memory bandwidth is required to feed the GPU (specifically, the GPU's ROPs).

The Render Output Unit, often abbreviated "ROP" and sometimes called (perhaps more properly) the Raster Operations Pipeline, handles one of the final steps in the rendering process on modern 3D accelerator boards. If you neuter a GPU's memory bandwidth, you neuter the ROPs, and thus the final stage of the rendering process. It's like putting Toyota Prius tires on a 700hp rear-wheel-drive supercar. 

I remain skeptical as well, but this is supposed to be some kind of new top-secret technology from Microsoft. I'm very skeptical, though, because people with a strong bias will say anything.



ninjablade said:

I remain skeptical as well, but this is supposed to be some kind of new top-secret technology from Microsoft. I'm very skeptical, though, because people with a strong bias will say anything.

eSRAM does not solve the memory bandwidth bottleneck, since modern games use 1-2GB of VRAM and the eSRAM is just 32MB. At best, the eSRAM will be used to aid anti-aliasing or something minor, not to provide a major graphical speed-up of any kind. The Xbox 360 already had eDRAM, and it did little to give it superior graphics to the PS3 - and that's despite the 360's GPU being faster than the PS3's RSX to begin with.

Memory bandwidth for the GPU is like vital performance enhancing nutrients for a sports athlete. 

3DMark11

HD6670 GDDR3 = 1594 marks

HD6670 GDDR5 = 2479 marks (+55%)

http://www.overclockers.com/forums/showthread.php?t=710062

People keep talking about 8GB of system memory for the Xbox 720, but they conveniently ignore that it's DDR3, meaning the GPU is neutered because it has no fast memory dedicated to it. Also, if you look at the diagram, the DDR3 has to go through the northbridge to communicate with the GPU, which introduces additional latency. If the PS4 has GDDR5 and at least some of it is dedicated to the GPU, not only will it be miles faster than the Xbox 720's memory subsystem, but the GPU will be able to access it much more quickly (much like on a real modern graphics card). Now take a GPU with 50% less power and shared system DDR3, and the PS4's GPU + dedicated GDDR5 will mop the floor with it. There is a reason all high-end PC GPUs use dedicated GDDR5...
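The shared-memory penalty is easy to illustrate: whatever the CPU reads and writes comes straight out of the GPU's bandwidth budget. A toy model using the ~34GB/s DDR3 figure from earlier in the thread (the CPU traffic number is purely an assumption for illustration, not a leaked spec):

```python
shared_peak = 34.1   # 128-bit DDR3-2133 peak from the earlier posts, GB/s
cpu_traffic = 10.0   # hypothetical CPU read/write load, GB/s (assumption)
gpu_budget = shared_peak - cpu_traffic
print(round(gpu_budget, 1))  # 24.1 GB/s left for the GPU
# A GPU with dedicated GDDR5 keeps its full peak instead of sharing it
```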



Here is a good post from NeoGAF:

I don't think they are trying to compete in the tech arms race with Sony. They apparently spent less of their budget on the GPU: 12 CUs vs 18. And they made some performance sacrifices to the memory system to have the quantity required for heavy non-game functions. I don't think it's a PS2-vs-Xbox difference in aggregate performance. However, in practical terms, I doubt the two will be as closely matched as in the PS3/X360 generation. From what we "know" - that is, stuff DF confirmed through multiple independent sources - the PS4 easily seems stronger. I think the burden of proof lies with the people who claim the large difference in GPU logic is overcome by the move DMAs. Seems like space-age technology. Maybe MS did have an inside deal with AMD and its GPU tech is a few years ahead, but I doubt it.

I kind of agree with him. It just seems like proelite and agieses work for Microsoft; they're way too active in hyping up the 720.



ninjablade said:

I remain skeptical as well, but this is supposed to be some kind of new top-secret technology from Microsoft. I'm very skeptical, though, because people with a strong bias will say anything.


Looking at the arch, I am assuming that the ESRAM is intended as the rendering framebuffer, which is directly bussed to the CPU/HDMI out along with whatever the proprietary processing engines are. The GPU appears to have a direct, high-bandwidth bus, and I am assuming the proprietary engines are post-processing magic, possibly with DMA access to the ESRAM. The image suggests these may be integrated on the same die.

I am assuming this tight integration, taking the X360 arch as precedent. So overall, it looks to be a promising use of reasonably priced components.