I think both consoles will have great graphics. And both consoles will have great games for them.
brendude13 said: The GPU doesn't make a system, especially in the case of PS3 vs 360 and Cell vs Xenon.
|
Well, that used to be true. As time goes on, the CPU is tasked with less of the graphics rendering pipeline as more of the load shifts to the GPU. The Xbox 360's and PS3's GPUs weren't exactly known for their GPU compute, after all; it's amazing how 6 years of GPU evolution can change the landscape.
If you go back further, CPUs used to handle all the lighting in games; that work moved to the GPU when hardware TnL came along with the Geforce 256 and Radeon 7500.
With this console generation, things like physics, anti-aliasing (morphological) and some framebuffer effects, not to mention texture decompression (i.e. Rage), were all handled by the CPU. Suddenly, this next generation, the CPU is free of such burdens, which is a *good* thing, as it means more serialised processing time for tasks such as A.I.
It also means that the PS4 having a roughly 500-gigaflop advantage in GPU compute is *going* to make a difference in all aspects once we are a couple of years into the generation.
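For anyone wondering where that ~500-gigaflop gap comes from, here's a back-of-the-envelope sketch in Python. The shader counts and clocks below are the commonly cited launch specs, assumed here for illustration, not figures from this thread:

```python
def peak_gflops(shader_cores: int, clock_mhz: float, ops_per_cycle: int = 2) -> float:
    """Peak single-precision GFLOPS: cores x clock x 2 ops/cycle (multiply-add)."""
    return shader_cores * clock_mhz * ops_per_cycle / 1000.0

# Commonly cited specs (assumed): PS4 1152 shaders @ 800MHz, Xbox One 768 @ 853MHz.
ps4 = peak_gflops(1152, 800)   # ~1843 GFLOPS
xb1 = peak_gflops(768, 853)    # ~1310 GFLOPS
print(f"gap: {ps4 - xb1:.0f} GFLOPS")  # roughly 533
```

Keep in mind that's the peak-theoretical number; whether real compute workloads actually see that full gap is a separate question.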
--::{PC Gaming Master Race}::--
Adinnieken said:
|
Not to be arrogant, but I think Germans do it better ;) The Diplom degree was always seen as a very good degree, and for good reason you had no problem getting jobs abroad with it.
Ssliasil said: There is so much wrong with this post that I would have to rewrite the entire thing for you in order to make all the needed corrections. And quite frankly, I don't particularly want to do that. |
You took the words right out of my mouth. There's so much fallacy and misunderstanding that it would be like trying to explain algebra to a chimpanzee.
This image has to be my favorite part:
Whoever wrote this blog post is either a mad genius, or just plain crazy. :D
3DS Friend Code: 0645 - 5827 - 5788
WayForward Kickstarter is best kickstarter: http://www.kickstarter.com/projects/1236620800/shantae-half-genie-hero
the-pi-guy said: The U.S. has Computer Engineering. In fact according to this, it has different applications compared to Computer Science. http://www.eng.buffalo.edu/undergrad/academics/degrees/cs-vs-cen I suppose you might know better than a college what colleges offer. |
Or, as is more likely, things have changed.
There used to be no such thing as a "computer engineer". In fact, at one place of employment this was actually an issue, because we were calling ourselves engineers. In the past, if you wanted to specialize in computer engineering, you got an electrical engineering degree or, depending on how in-depth you wanted to get, a physics degree. As an example, Georgia Tech's computer engineering program is within the electrical engineering school. Same with Iowa State.
And to show what a joke "computer engineering" degrees once were, they were originally offered by schools like ITT and DeVry.
Heck, 20 years ago a computer science degree wasn't even specialized. If you were going into networking, programming, or systems you learned the same thing as everyone else. Today, there are various computer science degrees for each field of study.
Likewise, it would appear that schools have since developed computer engineering programs. GATech's definitely seems to be a solid engineering program.
Adinnieken said:
The actual problem with GDDR5 is that it is designed for larger blocks of data. The other problem is you can't simply update the memory; you have to clear it and rewrite it. This is great when you're dealing with graphics: the GPU takes the original data, and if it needs to replace it, so what if you lose the original? You have the updated, possibly final form of the data you want, and it generally takes up the entire block. |
Thanks for bringing in your knowledge and experiences to this discussion. Sometimes I don't know how to explain things as well as you have.
People, especially here, seem to be confused as to how things work. It's much more complicated than just 8+1.2 or 8+1.8. When things are no longer linear, it is much harder for most to follow. But the multiplatform developers are having their own challenges, so we really don't know yet exactly how things will work out.
As I said during the reveal, I am concerned that Sony may have a Cell-type blunder again. The unified GDDR memory might work for them, but I have yet to be shown anything from Sony that puts that concern to rest. They should let the public see some gameplay.
On another subject, just curious: what are you referring to with Microsoft abandoning GDDR memory? Was this an actual product or just development?
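Since the GDDR5 vs. DDR3 debate keeps coming up, it's worth noting the headline bandwidth numbers are just transfer rate times bus width. A quick sketch; the 5500 MT/s and 2133 MT/s figures are the commonly cited specs, assumed here for illustration, and this ignores the Xbox One's ESRAM entirely:

```python
def bandwidth_gbs(transfer_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers per second times bus width in bytes."""
    return transfer_mts * (bus_bits / 8) / 1000.0

print(bandwidth_gbs(5500, 256))  # PS4 GDDR5 on a 256-bit bus  -> 176.0 GB/s
print(bandwidth_gbs(2133, 256))  # Xbox One DDR3 on a 256-bit bus -> ~68.3 GB/s
```

Peak numbers only, of course; latency and how each system's caches and ESRAM are used matter just as much in practice.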
errorpwns said:
The thing is, the memory bandwidth/amount of memory only comes into play when you bump up to a higher resolution. The Xbox One has more than enough capability at 1080p. Considering that the current-gen consoles will most likely never exceed 1080p for games, it is safe to assume that both consoles will hit a peak eventually once optimization and such is factored in. Neither console needs more than 4GB of memory dedicated to video; anything more than 3 is absurd for games running at 1080p. Which is why it would be a joke to bring up the fact that one has more usable memory than the other, since in the end IT DOES NOT MATTER AT 1080P. PC benchmarks would verify that perfectly. The fact that 2GB of GDDR5 seems to handle 1080p in games perfectly fine should be more than enough proof of that. I don't get why people think the way they do about hardware. The fact that one produces more flops than the other has no real standing in games. Drivers have increased graphics card performance by large amounts, so if Microsoft can out-write Sony driver-wise, their graphics chip could end up being the more powerful one, even though on paper the one in the PS4 is more powerful. People have severely underestimated what a graphics driver rewrite can mean for the Xbox One. It has been rewritten, and that alone could boost the Xbox One's performance. The 50MHz clock bump, not as much. |
Thanks for bringing some thoughtful points to this discussion. It reminds me of how, during the '80s, Lotus was able to optimize a 4-cylinder engine that would outperform just about everything on the road.
And I don't even want to get into the difference between OpenGL and DirectX yet. I know Sony is working on their own version of a DirectX 11.1 equivalent, but as with many things, we have yet to see what it will be able to do.
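On the "1080p doesn't need that much memory" point above, the raw render targets really are tiny; it's textures and other assets that eat VRAM. A quick sanity check in Python, assuming a 32-bit RGBA buffer format for illustration:

```python
def buffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(buffer_mb(1920, 1080), 1))  # one 1080p RGBA8 buffer: ~7.9 MiB
# Even a dozen such buffers (color, depth, G-buffer layers) stays well under 100 MiB.
```

So the memory debate is really about texture budgets and streaming, not the framebuffer itself.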
This thread is kind of cute. Also fake, but still cute after all...
Like... mmm unicorns, for example.