
Xbox One vs. PS4 Graphics Showdown: The Rematch

I think both consoles will have great graphics, and both will have great games.



Donnie Darko 
brendude13 said:

The GPU doesn't make a system, especially in the case of PS3 vs 360 and Cell vs Xenon.

 


Well, that used to be true. As time goes on, the CPU is tasked with doing less of the graphics rendering pipeline as more of the load shifts to the GPU; the Xbox 360's and PS3's GPUs weren't exactly known for their compute capability, after all. Amazing how 6 years of GPU evolution can change the landscape.
If you go back further, CPUs used to handle all the lighting in games, which moved to the GPU when hardware TnL came along with the GeForce 256 and Radeon 7500.

With this console generation, things like physics, anti-aliasing (morphological) and some framebuffer effects, not to mention texture decompression (e.g. Rage), were all handled by the CPU. Suddenly, this next generation, the CPU is free of such burdens, which is a *good* thing, as it means more serialised processing time for tasks such as A.I.

It also means that the PS4 having a roughly 500 Gigaflop advantage in GPU compute is *going* to make a difference in all aspects once we are a couple of years into the generation.
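For anyone wondering where that figure comes from, here is the back-of-the-envelope maths in Python (the shader counts and clock speeds are the widely reported specs, so treat them as assumptions):

# Theoretical single-precision throughput = shaders x clock (GHz) x 2 ops per cycle (FMA)
ps4_gflops = 1152 * 0.800 * 2   # 18 CUs at 800MHz -> ~1843 GFLOPS
xb1_gflops = 768 * 0.853 * 2    # 12 CUs at 853MHz -> ~1310 GFLOPS
print(ps4_gflops - xb1_gflops)  # ~533 GFLOPS in the PS4's favour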



--::{PC Gaming Master Race}::--

Adinnieken said:
walsufnir said:

 

Which country are you talking about?


The US.


Not to be arrogant, but I think Germans do better. ;) The Diplom degree was always seen as a very good one, and there is a reason you had no problem getting jobs abroad with it.



Ssliasil said:
There is so much wrong with this post that I would have to rewrite the entire thing for you in order to make all the needed corrections. And quite frankly, I don't particularly want to do that.


You took the words right out of my mouth. So much fallacy and misunderstanding; it would be like trying to explain algebra to a chimpanzee.



This image has to be my favorite part:

Whoever wrote this blog post is either a mad genius, or just plain crazy. :D



3DS Friend Code: 0645 - 5827 - 5788
WayForward Kickstarter is best kickstarter: http://www.kickstarter.com/projects/1236620800/shantae-half-genie-hero

the-pi-guy said:

The U.S. has Computer Engineering.  In fact, according to this, it has different applications compared to Computer Science.  

http://www.eng.buffalo.edu/undergrad/academics/degrees/cs-vs-cen

I suppose you might know better than a college what colleges offer.  

Or, as is more likely, things have changed.

There used to be no such thing as a "computer engineer".  In fact, at one place of employment, this was actually an issue, because we were calling ourselves engineers.  In the past, if you wanted to specialize in computer engineering, you got an electrical engineering degree or, depending on how in-depth you wanted to get, a physics degree.  As an example, Georgia Tech's computer engineering program is within the electrical engineering school.  Same with Iowa State. 

And to show just how much of a joke "computer engineering" degrees were at one time, they were originally offered by schools like ITT and DeVry. 

Heck, 20 years ago a computer science degree wasn't even specialized.  If you were going into networking, programming, or systems, you learned the same thing as everyone else.  Today, there are various computer science degrees for each field of study. 

Likewise, it would appear that schools have since developed computer engineering programs.  GATech's definitely seems to be a solid one.



Adinnieken said:
S.T.A.G.E. said:
walsufnir said:
S.T.A.G.E. said:




So, you mean to tell me GDDR5 RAM cannot handle larger files and memory better than DDR3? Tell me it's so, so I can return to Best Buy and tell the sales associate I've been had.

 

I said not a single sentence you originally wrote was true. And it still isn't. Don't put words in my mouth that I didn't state.


Sorry if I put words in your mouth. 

The actual problem with GDDR5 is that it is designed for larger blocks of data.  The other problem is that you can't simply update the memory; you have to clear it and rewrite it.  This is great when you're dealing with graphics, because the GPU takes the original data and, if it needs to replace it, so what if you lose the original?  You have the updated, possibly final, form of the data you want, and it generally takes up the entire block.

The problem with GDDR5 is when you're dealing with the OS, and that's why I believe the PS4 does hard drive caching (virtual memory).  The OS and applications tend to use small blocks of data, and they often tend to append those blocks of data.  So, instead of replacing them, they simply add to them or subtract from them.

This post, for example, is an addition of memory (data) to the original block I created when I opened the tab.  As I type, this block gets seamlessly updated.  In GDDR memory that doesn't happen.  Every letter I write would require the memory to be cleared and the data to be reloaded from a cache into memory.  Not an impossible task, especially when you consider the speed of GDDR5 memory, but every letter typed requires access to the cache, and that takes away a clock cycle that could be used for something else.
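To show the write pattern I mean in rough Python (a toy model only; the "costs" are invented to illustrate the shape of the difference, not actual memory-controller behaviour):

# Toy model: appending in place vs clearing and rewriting a whole block
def append_in_place(block, data):
    block.extend(data)        # DDR-style: touch only the new bytes
    return len(data)          # cost ~ bytes actually written

def clear_and_rewrite(block, data):
    new_block = block + data  # rebuild the whole block from cache
    block[:] = new_block
    return len(new_block)     # cost ~ the entire block, every time

post = bytearray(b"post text so far")
print(append_in_place(post, b"!"))    # cost: 1
print(clear_and_rewrite(post, b"!"))  # cost: the full block length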

Sony wasn't the first company to use unified memory in a console.  Microsoft was.  Sony wasn't the first company to use GDDR memory for their unified memory either.  Microsoft was.  There was a reason why Microsoft abandoned GDDR memory.  In fact, on this front Sony is the LAST company to do both.  Even Nintendo came before them on unified memory. 

Thanks for bringing your knowledge and experience to this discussion.   Sometimes I don't know how to explain things as well as you have.

People, especially here, seem to be confused as to how things work.  It's much more complicated than just 8+1.2 or 8+1.8.  When things are no longer linear, it is much harder for most to follow. But the multiplatform developers are having their own challenges, so we really don't know yet exactly how things will work out.

As I said during the reveal, I am concerned that Sony may have a Cell-type blunder again.  The unified GDDR memory might work for them, but Sony has yet to show me that this concern is of no relevance.  They should let the public have some game-play.

But on another subject, just curious: what are you referring to with Microsoft abandoning GDDR memory?  Was this an actual product or just in development?



 

Really not sure I see any point of consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned. 

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

This is so wrong in so many ways, lol.



errorpwns said:
Adinnieken said:
errorpwns said:

The Cell had power, but the rest of the console was one big joke. 512MB of memory, and essentially only 256MB of it was allowed for graphics. The 360 had more available graphics memory, and in some multi-platform games that showed to its advantage. The graphics chip and memory bottlenecked the Cell. It's all about bottlenecks. I could put a couple of Titans into my system, but in the end my Athlon II X4 at 2.8GHz would bottleneck the graphics chips. I'd need to upgrade to a faster processor to see frame increases in some games. See why consoles have 8GB of memory now? They don't want to run into that terrible bottleneck. Even in '06, systems dedicated to gaming were getting 2-4GB of memory. So putting in 512MB was flawed from the beginning, especially when they planned a 7-8 year console cycle.  
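The maths behind a bottleneck is simple enough to sketch in Python (illustrative numbers, not benchmarks):

# Frame rate is gated by whichever side is slower (numbers invented for illustration)
cpu_ms = 16.0  # CPU time per frame
gpu_ms = 6.0   # GPU time per frame with a couple of Titans
fps = 1000 / max(cpu_ms, gpu_ms)
print(f"{fps:.1f} fps")  # 62.5 fps: a faster GPU alone changes nothing here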

The thing that has changed between this generation and the previous generation is that the bottlenecks have become smaller and less obvious.  Spotting them requires a greater understanding of systems and how data is used on those systems, rather than just looking at numbers.

What's the point of having GDDR5 memory when your memory block is 256MB wide and you only have a 4MB file to fill that space?  A block is an addressable area; only one thing can fit into a single block.  So having a larger block size, as GDDR5 does, becomes a disadvantage when you're dealing with smaller files.  Soon, most of your memory is consumed by small files, because these 256MB blocks are now occupied by sub-256MB files.  Memory density, if you will.  DDR3 memory has a smaller block size, I think 64MB?  So while it may take more blocks for a larger file, I'm not wasting as much memory on a smaller file.  That 4MB file is only wasting 60MB of memory in a block, not 252MB.
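Quick numbers for that in Python (the 256MB and 64MB figures are my assumptions above, not datasheet values):

# Internal fragmentation for a 4MB file under the assumed block sizes
file_mb = 4
for block_mb in (256, 64):  # assumed GDDR5 vs DDR3 block sizes
    wasted = block_mb - file_mb
    print(f"{block_mb}MB block: {wasted}MB wasted ({wasted / block_mb:.0%})")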

Likewise, as I said before, GDDR and DDR memory work differently.  In GDDR memory you need to flush the memory before writing to it; you can't append to it, whereas in DDR memory you can.  Which is why I believe the PS4 relies on virtual memory (an HDD cache file).  The cache file is used to read and write active data from/to memory as it's being used. 

The thing is, memory bandwidth and the amount of memory only come into play when you bump up to a higher resolution. The Xbox One has more than enough capability at 1080p. Considering that the current-gen consoles will most likely never exceed 1080p for games, it is safe to assume that both consoles will hit a peak eventually, once optimization and everything else is factored in. Neither console needs more than 4GB of memory dedicated to video; anything more than 3GB is absurd for games running at 1080p. Which is why it would be a joke to bring up the fact that one has more usable memory than the other, since in the end IT DOES NOT MATTER AT 1080P. PC benchmarks verify that perfectly. The fact that 2GB of GDDR5 handles games at 1080p perfectly fine should be more than enough proof of that.

I don't get why people think the way they do about hardware. The fact that one produces more flops than the other has no real standing in games. Drivers have increased graphics card performance by large amounts, so if Microsoft can out-write Sony driver-wise, their graphics chip could end up being the more powerful one, even though on paper the one in the PS4 is more powerful. People have severely underestimated what a graphics driver rewrite can mean for the Xbox One. It has been rewritten, and that alone could boost the Xbox One's performance. The 50MHz bump-up, not as much.
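To put rough numbers on why resolution alone doesn't eat memory, here's the render-target maths in Python (the triple-buffer setup is just an assumed example; real games spend most of their VRAM on textures and assets):

# Render-target sizes at 1080p (assumed setup: triple-buffered colour + one depth buffer)
width, height, bytes_per_pixel = 1920, 1080, 4        # 32-bit colour/depth
buffer_mb = width * height * bytes_per_pixel / 1024**2
print(f"one 1080p buffer: {buffer_mb:.1f} MB")        # ~7.9 MB
print(f"3 colour + 1 depth: {4 * buffer_mb:.1f} MB")  # ~31.6 MB, a sliver of 2GB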

Thanks for bringing some thoughtful points to this discussion.  It reminds me of how, during the 80's, Lotus was able to optimize a 4-cylinder engine that would outperform just about everything on the road.

And I don't even want to get into the difference between OpenGL and DirectX yet.  I know Sony is working on their own version of DirectX 11.1, but as with many things, we have yet to see what it will be able to do.



 

Really not sure I see any point of consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned. 

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

This thread is kind of cute. Also fake, but still cute after all...

Like... mmm unicorns, for example.