derpysquirtle64 said:

It was inexcusable for nVidia not to lower GPU prices for the original Xbox when their manufacturing costs decreased after a couple of years. nVidia was just in it to get a lot of money from Microsoft, and they succeeded. This was one of the main reasons why Microsoft lost around 5 billion dollars on it.

As for the PS3, while the first part, that Sony initially intended to use two Cell CPUs, is true, I don't think the second one is.

Sony abandoned using Cell as a GPU very early on... because it would have been rubbish at the task. Cell is still a very general-purpose processor, and it lacks the fixed-function GPU hardware, like texture samplers, that accelerates various tasks.

Sony did hope that Toshiba would provide the GPU at some point, but that never materialized, so nVidia was given the task.

In short... people clung to Sony's statements from E3, where they essentially claimed that the PS3 could generate excellent real-time graphics using only the power of the Cell... Cell isn't that good, I'm afraid; history has more than proven that.

Bofferbrauer2 said:

Just to add something I wasn't sure about at that point: Vega 10's FP64 can only reach 1/16th of its FP32 rate, while Vega 20 can do 1/2, so the chip must have changed quite a bit under the hood. Double precision is also something that costs quite a lot of energy, hence why it got cut down in modern GPUs by both AMD and NVidia. NVidia nowadays mostly uses 1/32nd, and all RTX cards, including the Quadros, do so. The original Titan had 1/4, while the Volta-based Titan V has 1/2 like Vega 20, making the latter the probable target if the FP64 capability of the Radeon VII hasn't been cut down.

Not really. Graphics Core Next is extremely modular, remember; you can update part of a chip and leave the rest identical.
Besides... AMD already has a Vega GPU with 1/2-rate FP64 on the market right now (the Radeon Instinct MI50/MI60)... meaning that the design of Vega 7 isn't new anyway.

In short, Vega 7 is simply a GPU ported to 7nm with a few CUs disabled (64 down to 60) to improve yields, a doubling of DRAM capacity and bandwidth, and a big increase in clock rates. It was minimal effort by AMD... and because of that, it's unlikely it will beat a GeForce 1080 Ti.
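To put those FP64 fractions into actual numbers, here's a quick back-of-the-envelope sketch in Python. The CU counts and boost clocks are the publicly listed figures, and the 1/2-rate FP64 on Vega 20 is the uncut case described above, so treat the results as rough assumptions rather than benchmarks:

```python
# Rough peak-throughput sketch for the FP64 ratios discussed above.
# CU counts and boost clocks come from public spec sheets and launch
# coverage, so treat the exact figures as assumptions.

def peak_tflops(cus, clock_ghz, fp64_ratio, sp_per_cu=64):
    """Peak FP32/FP64 in TFLOPS: each stream processor retires one FMA (2 FLOPs) per cycle."""
    fp32 = cus * sp_per_cu * 2 * clock_ghz / 1000.0
    return fp32, fp32 * fp64_ratio

# Vega 10 (Vega 64): 64 CUs @ ~1.55 GHz boost, 1/16-rate FP64
fp32, fp64 = peak_tflops(64, 1.55, 1 / 16)
print(f"Vega 10: {fp32:.1f} TFLOPS FP32, {fp64:.2f} TFLOPS FP64")  # ~12.7 / ~0.79

# Vega 20 (Vega 7): 60 CUs @ ~1.8 GHz boost, 1/2-rate FP64 if left uncut
fp32, fp64 = peak_tflops(60, 1.80, 1 / 2)
print(f"Vega 20: {fp32:.1f} TFLOPS FP32, {fp64:.2f} TFLOPS FP64")  # ~13.8 / ~6.91

# The "doubling of DRAM and bandwidth": 2 HBM2 stacks -> 4 stacks
print(f"Vega 10 HBM2: {2 * 242} GB/s, Vega 20 HBM2: {4 * 256} GB/s")  # 484 vs 1024
```

Even if the consumer card ends up fused below 1/2, the point stands: the double-precision hardware is already part of Vega 20's design; it's just a question of how far AMD caps it.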

KingofTrolls said:

The collaboration is not the only hint we got on the topic, though. Forbes' leak simply points out that MS will not use Navi technology; it also points out that Vega would be a disappointment because resources were moved to Navi, which was kind of proven true. So far, the leak is accurate.

It doesn't matter if Microsoft doesn't use Navi technology; Navi isn't likely to be anything special anyway. AMD does have its next generation being designed, remember.
With that said... a leak posted by Forbes can and should be taken with a grain of salt.

HoloDust said:

2020 is the most likely date. Though, the X360 had ATI's custom GPU with unified shaders way before their desktop cards with the same feature hit the shelves. So something like this may happen again: AMD focusing on delivering next-gen GPUs for consoles, and only later releasing desktop cards with those next-gen features.

That just plays into my statement that GPUs and CPUs take years to design... and because of that, when you take a semi-custom approach, you have the opportunity to implement newer features that are slated for future GPU designs.
The Xbox 360 is, after all, the perfect example of that.


--::{PC Gaming Master Race}::--