Mifely said:
Squilliam said:
Mifely said:
Slimebeast said:

The "Cell myth"? You mean the fact that parallel multi-core architectures are the future is... a myth? Intel, AMD, Sun, IBM, and every other processor manufacturer in existence don't appear to agree with you. They seem to believe in "The Cell Myth," as you call it. They're all working on simplified, 16-core processors at this moment, to beat the heat problems that super-sized processors (like the P4) can't overcome with sheer muscle.

They aren't trying to beat the "heat problems"; the Core 2 processors actually run quite cool. They are conceding that in many cases parallel processors scale better than massively serial ones.

You're agreeing with me here that parallel architectures scale better.  Quad-core processors run at slower clock rates due to heat, and 8- and 16-core processors will run even slower, on a per-core basis.  The speed of a single core is really immaterial to all users but PC gamers interested in playing legacy games that require a fast CPU core -- although PC games are almost always pretty easy to tune to your CPU, and often GPU-bound.  There's only one group, then, that would actually not want to see a single core's clock rate drop... the group designing the next 360.  360 games don't tend to thread themselves magically to scale up to higher numbers of parallel cores.  If the next Xbox loses all forms of BC, I think 360 owners would be pretty upset that their games became outdated by 2010 or 2011.  Including the current 3-core CPU on the next version might be pretty expensive, as early as 2010 or 2011.

Quad cores run at a slower clock because they complete more operations per cycle and have complicated pipelines. People can overclock their quad-core processors to over 4 GHz on air cooling, and they are not heat-limited even then -- they just hit a wall. As for the group designing the nextbox: how can you arbitrarily decide what they would want when they haven't been public about their wishes and dislikes for the next processor? As far as I am aware, they are working on a custom processor for the nextbox.

GPUs suffer from the same heat troubles that CPUs do, and you can't ship a console with a set of huge fans, empty airspace in the case, and special cooling units. They go in the living room, not the special air-conditioned chamber of your home that you use as a server-room, and because you spent thousands on keeping it extra cool, you also use it for gaming. They aren't going to keep getting faster without going parallel.

What's your point? GPUs are already massively parallel in everything they do. G80: 120 ALUs vs. the GT200: 240 ALUs.

I was kinda asking you the same question.

They scale well with lithographic progress. A GPU with 240 ALUs is about twice as fast as one with 120 ALUs, assuming everything else is scaled as well. You said they aren't going to get any faster without going parallel, but they have been parallel since the early days of their production. Why are you making that point now, when the way forward has been clear for over a decade?
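
That "assuming everything else is scaled as well" caveat is the important part, and it can be sketched with Amdahl's law. The workload fractions below are made-up numbers for illustration, not measurements of any real GPU:

```python
def speedup(parallel_fraction, n_units):
    """Amdahl's law: overall speedup when only part of the work
    can be spread across n_units parallel units."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

# Perfectly parallel work: 240 units vs. 120 gives the ideal 2x.
ideal = speedup(1.0, 240) / speedup(1.0, 120)

# With even 1% serial work (setup, scheduling), the gain shrinks well
# below 2x, because the serial part doesn't benefit from extra ALUs.
realistic = speedup(0.99, 240) / speedup(0.99, 120)
```

So doubling the ALUs only doubles throughput when the rest of the chip (memory bandwidth, setup, scheduling) keeps pace.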

The fastest PC graphics cards on the market today... multiple GPUs. Increasing the number of GPUs available isn't really as difficult an architectural change as changing the CPU is, so if your argument is that MS can merely add GPUs to their existing, overheated 3-core design, and just shrink the CPU die to reduce heat... then that'd be a good argument... except that most console apps are limited by CPU power, not GPU power. They don't have the memory required (zillions of textures for huge multitexturing, super-complex geometry, etc. take a lot of memory -- much more than the PS3 or 360 are usually able to provide) to push their GPUs to their limits and still have a decent game. You *can* push the GPUs too hard... but as a general rule, game developers are constantly struggling to increase performance on both the CPU and GPU -- and I would say the CPU is usually the one they work harder to improve software performance on, not the GPU. The CPU has to feed the GPU -- even the fancy GPUs on the 360 and PS3 still need to have their animation blending and physics done on the CPU before the data is pushed up to the GPU, and that's a whole lot of work. There's no reason for more complex geometry on characters if you can't, for example, animate the bones in the skeleton to make it look realistic, and the big part of that expense is on the CPU, not the GPU.

I would say that consoles are very RAM-limited, firstly. The console developers could implement their software in more efficient ways if they weren't so constrained by RAM availability. But as a general rule, it's easier to scale a GPU than a CPU for the greatest effect. Furthermore, a current-generation Nvidia chip can do physics on die, and probably a lot of the SPE work anyway; as GPUs become more programmable we will see them take on a wider variety of tasks. For example, if a current Nvidia chip were installed in the PS3 instead of the RSX, it could do all the physics, it would not need vertex culling from the CPU, and it could possibly even do the sound as well.

Where the hardware is located is immaterial.  If Nvidia or Intel produce a chip that does physics, then it'll have to have access to all the collision geometry in the game, and thus it'll need a load of extra memory or... a speedy bus and a small local store, such that it can stream data through quickly.  Whether you call a processor a CPU, GPU, or SPU, it still produces heat.  Just because it's made by Nvidia and called a GPU has no bearing on what it can do, or on its heat production.
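
The "speedy bus and a small local store" pattern being described can be sketched in miniature. This is a toy model, not SPU code: the 256-element capacity and the doubling kernel are arbitrary stand-ins, and real hardware would use asynchronous DMA rather than list slicing:

```python
LOCAL_STORE_CAPACITY = 256  # pretend only this many elements fit on-chip

def stream_process(data, kernel, capacity=LOCAL_STORE_CAPACITY):
    """Process a buffer far larger than the local store by streaming:
    fetch a chunk that fits, compute on it, write results back, repeat."""
    results = []
    for start in range(0, len(data), capacity):
        local = data[start:start + capacity]      # "DMA" the chunk in
        results.extend(kernel(x) for x in local)  # compute on local data
    return results

# e.g. transforming collision geometry that would never fit on-chip at once
collision_geometry = list(range(1000))
transformed = stream_process(collision_geometry, lambda x: x * 2)
```

A real SPU would overlap the next fetch with the current computation (double buffering), which is what makes a small store workable at all.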

Actually, it does. The shaders on a GPU are becoming more programmable with every generation. The main point is: where do we (Microsoft/Intel/Sony/Nintendo) spend our transistor budget most efficiently? The local store is not an issue for GPUs -- they have them. As for the bus? A lot of GPU designs are calling for merging them into a current x86 CPU configuration -- Fusion from AMD, Larrabee from Intel. A GPU is roughly 70% processing units / 30% cache, whilst a CPU is 30% processing units and 70% cache. If they are releasing in 2010/2011 (depending on the design they choose), they need to have a rough draft of the hardware specifications so they can direct the developers.

If you doubt my claim, step back and think about it for a second. Why do you think ported games have, thus far, tended to be so much faster on the 360 than the PS3? The 360's GPU is perhaps *slightly* faster than the PS3's in certain circumstances (and the reverse may be true in others)... so why do you suppose the framerate is so drastically different? It's because the GPU's output depends upon the CPU that feeds it, and porting an app written to run on a CPU with 3 primary cores to one meant to run on a 2-hardware-thread CPU plus 6 SPUs is difficult, and publishers often push the ports out the door because they are tired of spending money on the framerate.

Perhaps the Cell isn't as good in real-world circumstances as they hyped it to be. The Cell is not a good processor; it's a fast processor, but not a good one. If it takes 100 hours to get 80% of the performance potential from one CPU, and 100 hours to get 30% of the performance out of a CPU that's theoretically twice as fast, then the latter is not a very good architecture in comparison.

The Cell is not an overly difficult processor to use.  It merely requires that its users be open-minded, and not bound by high-level programming concepts that look great on paper but have never performed well in performance software like games.  The Cell is very simple, and the difficulties it presents to programmers can be eased fairly easily by giving the SPUs a larger local store.  It's mostly thick-headedness that prevents developers from building new engines properly to use it -- or, in the case of devs who have a decent engine already, their reason is financial: it takes time and money to adapt.  The Cell itself is not the problem; pre-existing engines based upon non-parallel concepts are.  That'll have to change by the next gen, though.  Engines will need to be rewritten to perform well on architectures like the Cell whether devs like it or not.
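
The engine rewrite being argued for here is usually described as job-based decomposition: per-frame work split into independent jobs that any core (or SPU-like unit) can pick up. A minimal sketch, with a made-up `blend_animation` standing in for real per-character work:

```python
from concurrent.futures import ThreadPoolExecutor

def blend_animation(character_id):
    """Stand-in for per-character animation blending. The point is that
    each character's job is independent, so any available core can run it."""
    return character_id * 10

character_ids = list(range(8))

# A serial engine would loop over characters on one core; a job-based
# engine hands the same independent work to a pool of workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    blended = list(pool.map(blend_animation, character_ids))
```

`map` preserves input order, so the frame can consume the results exactly as a serial loop would -- which is why this restructuring can scale to more cores without changing game logic.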

Thick-headedness? Some developers have worked on it non-stop for over 3 years and still haven't cracked it yet. It IS difficult, and not worthwhile. If there is a difference between the PS3 and the Xbox 360, we probably won't see it in multiplatform games, and 1st parties probably won't be obviously different for at least another year.

In order to be significantly better as a platform, the 360 would have to change in a big way. Increasing the number of cores it has to, say, 6 or 8 is just going to make the chip crazy expensive and smouldering hot, if it's even possible. The PS3's Cell, on the other hand, can probably up the number of SPUs very easily with smaller die sizes. It *is* a forward-thinking architecture, no matter how difficult it is for the average game development studio to squeeze performance out of it.

The Cell is a scalable architecture. But your judgement that the 360 CPU would turn into an 8-core space heater is completely false. Microsoft has a number of options available to them for a 2011 release date. They control the Direct3D 11 development process, so they will know which GPUs are appropriate for their consoles. They will also know, before anyone else except the designers of these GPUs, exactly how programmable they will be.

Again, where do you get the idea that renaming a custom CPU a "GPU" gives it better heat characteristics, or somehow makes programming a console game simpler, thus making the transition to a new console by 2010 or 2011 easy?  GPUs are not the problem with typical console game development -- they're the problem with PC games.  Offloading work onto other processors, like a physics processor, only complicates and raises the expense of developing a console game.  Why would MS do the same thing to their console that Sony has done to the PS3, if their 360 design is so great due to its relative simplicity?

Their console is simple to develop for partly because they provide the best tools of any console maker -- development tools are one of their day jobs. I said they have a number of options available to them, and then I said that they will know ahead of almost everyone else how the next generation of GPUs will function. They have many options for CPUs and GPUs, and given plenty of lead-up time this time, they can make Xbox 360 games forwards compatible with the nextbox. They have options, and they will choose the ones that fit their goals best. Sony, on the other hand, is pretty much locked into using the Cell again.

They only rushed last time because Nvidia pulled production of the GPU in the Xbox 1.

They also have Intel, and with their huge x86 libraries they could easily make the switch from IBM. With Intel and Nvidia going head to head over the GPGPU market space, they could also get a cheap deal on Intel's "Larrabee," their rasterizing many-core x86 GPU. Every developer has at some point used x86; it's an even easier transition than Cell 1 to Cell 2.

Every PC developer has probably used x86 at some point.  But the number of developers in the console industry who worked on a PC game at some point in their career is pretty dang low.  Devs from that era are usually pretty valuable -- because they are in short supply.  The number of developers making console games dwarfs the number making PC games by a huge margin, and has for a decade.  Again, your statement here does not address the damage that changing an architecture so drastically would do to the 360 market.

Every PC developer has worked on x86 at some point. Every programmer that went to school probably worked on x86 at some point. There are a hundred times as many programmers in the world who can work on x86 competently as there are who could work on the Cell. Microsoft can leverage its own workforce better if the nextbox is x86. There are hundreds of engines, and libraries upon libraries of material, on the x86 design. Lastly: the PC gaming market > any one console, maybe even all 3.

Off the top of my head, developers with x86 experience plus console support:

Bungie, Ensemble, BioWare, all of EA, Valve, id, Epic, Rockstar, many Activision studios, Blizzard.

Lastly, they could seek out AMD for both their CPU and GPU designs.

As I said, this generation is built to last, at least as far as MS is concerned, for a pretty simple reason -- heat. I think 5-7 years per console generation is a dated concept, and if you did a little research, I think you might agree. If this issue were purely a business problem, Sony and MS might very well choose to release a new console by 2011 or 2012. As it stands, however, the PS3 and 360 are powerful machines, and they won't be overshadowed by successors anywhere near that soon. If there's a new console by 2012, it'll be from Nintendo. The PS3 and 360 will just be cheaper and slimmer by then.

It is a dated concept! I would say 5 years is about right for a console release, 4.5 years is possible too, early 2010 would do quite nicely.

Someone else in this thread tried to claim that because chip production processes will allow more transistors on a chip by 2010 or 2011, the 360's successor could be much more powerful by then.  This simply isn't true -- it doesn't just boil down to speed, even if the number of transistors on a chip were directly related to software performance, and things like memory latency didn't matter.  More transistors, and smaller die sizes, mean increased leakage and colossal extra heat production.  To make a machine faster, you can't rely on the huge number of transistors it takes to do decent branch prediction or out-of-order execution.  These machines go in the average joe's living room -- not some special cooling chamber deep underground.

It's *much* more heat-efficient to raise the number of cores and simplify them (see: the Cell processor).  There won't be a 32nm version of the 360 with 8 cores, or 16 cores, or one that suddenly incorporates out-of-order execution.  With the possible exception of the Cell, the chance that the next generation of consoles will be *both* BC and much faster than current machines is pretty unlikely.  Due to the probable lack of BC that comes with a radical architecture change, it'd be folly for MS to shun their growing 360 userbase and spend a load of $ on a new design by 2011.  It'll be a radically different machine the next time around -- and it just won't be in 2010 or 2011.

Processors will scale with smaller die sizes. If they were expected to produce huge heat blooms, then Intel/AMD/Nvidia/IBM's roadmaps for the future would look remarkably different. With the relative ease of porting between x86 and the Xbox 360, they could easily include BC in the next processor whatever they decide, so long as it IS faster than the current tri-core.

I wouldn't expect a new "high-end" console until at least 2013, and the ball is in Sony's court for making an easy upgrade on their existing hardware. MS is going to have to work for it, and it's going to be painful... especially when they tell the customer, "Sorry, backwards compatibility is totally impossible on the new platform" -- unless they're willing to include the entirety of the 360's triple-core CPU on the new design.

As a final statement, I'd like to mention that I just don't think MS is dumb when it comes to money.  Why on earth would they try to up the bar on Sony by 2011, when Nintendo is the one who owns the market?  The current 360 is perfectly capable of using a gimmicky pointing device as a controller, as is the PS3.  I think repackaging a smaller 360 with a new control device would be much more in line with typical MS strategy (IE, Silverlight, Zune, etc., anyone?).  That doesn't really qualify as the "next generation," however.

Sony tends to be the company that pushes the envelope, as opposed to being the money-seeking missile that M$ represents.  They appear to believe that the high-end will always have a market, and they're right... however, given the Wii's success, it's not the largest market... and MS usually isn't interested in niches if they don't already have the masses covered.

I'll reiterate and state that I'm standing by 2013, and Sony is still going to have an easier time of it, no matter what approach you take to the issue.

Tease.