
I'm gonna say 32 GB or 64 GB; 131 GB seems way too high.

I've had 6 GB in my laptop for the last 4 years and it's only now starting to fall behind the norm.



Pemalite said:
CrazyGPU said:

If you analyze the trend over the years you get this table, which took me some time to make.

                     | PlayStation 1 | ×  | PlayStation 2 | ×  | PlayStation 3 | ×   | PlayStation 4
CPU and video memory | 3 MB          | 11 | 32 MB         | 16 | 512 MB        | 16  | 8192 MB
Bandwidth            | 0.132 GB/s    | 24 | 3.2 GB/s      | 7  | 22.4 GB/s     | 8   | 176 GB/s
Gpixels/s            | -             | -  | 2.35          | 2  | 4.4           | 6   | 28
Gtexels/s            | -             | -  | 1.2           | 11 | 13.2          | 4   | 55
CPU calculations     | 0.066 GIPS    | -  | 6.2 Gflops    | 35 | 218 Gflops    | 0.5 | 102 Gflops
GPU calculations     | -             | -  | -             | -  | 192 Gflops    | 10  | 1840 Gflops
Max resolution       | 640x480       |    | 1280x1024     |    | 1920x1080     |     | 1920x1080
Optical media        | 0.7 GB        |    | 8.5 GB        |    | 50 GB         |     | 50 GB

(Each "×" column shows the gain factor over the previous generation.)

CPU throughput is lower on the PS4 than on the PS3 if you could use all of the PS3's resources at once (you can't); in practice, the throughput is almost the same from PS3 to PS4. This should be at least 2 times better on the PS5.

GPU flops increased 10 times. If it now doubles every 3 years, we can expect a machine that is about 4 times as powerful. The new Maxwell and Hawaii cards reach 5 teraflops, more than twice what the PS4 has, and Maxwell consumes around 200 W. So it's not insane to believe we can have 8 teraflops in a 100 W envelope in six years.
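As a rough back-of-the-envelope sketch of that projection (Python; the 1.84 TFLOPS starting point is the PS4's quoted peak, while the 3-year doubling cadence and 6-year gap are this post's assumptions, not measured data):

```python
# Rough projection of console GPU throughput, assuming it doubles every
# 3 years; both the cadence and the 6-year gap are assumptions from the post.
ps4_tflops = 1.84          # PS4 GPU peak, single precision
doubling_period_years = 3  # assumed cadence
years_to_next_gen = 6      # assumed gap between PS4 and PS5

doublings = years_to_next_gen / doubling_period_years
projected = ps4_tflops * 2 ** doublings
print(f"Projected next-gen GPU: ~{projected:.1f} TFLOPS")  # ~7.4 TFLOPS, close to the 8 above
```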

Now, games cost 2 to 4 times more to make in Full HD than in SD. I wonder how many AAA companies could afford to make games if they cost 4 times what they do now. Maybe the PS5 generation will be held back by financials rather than by hardware.


The PS4's CPU is indeed faster than the PS3's Cell; I don't know how anyone even disputes this...


Flops are neither a definitive nor an accurate representation of a processor's performance, just like "MHz and GHz".
Processors do more than just floating point; games, and by extension game engines, use more than just floating point, which means that to get a proper picture you need to compare more than just floating-point throughput.
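A toy illustration of that point (Python, with made-up workloads rather than a real benchmark): a chip's peak FLOPS figure tells you nothing about how fast it gets through the integer-heavy half of a frame.

```python
import time

# Two workloads of the same size: one integer-heavy, one float-heavy.
# A peak FLOPS number only describes the second kind of work.
N = 2_000_000

def int_heavy():
    acc = 0
    for i in range(N):
        acc += (i * 7) ^ (i >> 3)   # integer multiply, shift, xor
    return acc

def float_heavy():
    acc = 0.0
    for i in range(N):
        acc += i * 0.5 + 1.25       # floating-point multiply-add
    return acc

for fn in (int_heavy, float_heavy):
    start = time.perf_counter()
    fn()
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f} s")
```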


Games cost nothing extra to be made in Full HD compared to SD.
It's just a resolution difference; developers simply have the game's engine render at a higher resolution, that's it.
The PC has only been doing it for decades. (We had Full HD in 1995, two decades ago.)
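A minimal sketch of that idea (hypothetical settings, not any real engine's API): the output resolution is a one-line configuration change, and none of the assets the game references have to change with it.

```python
# Hypothetical render settings: switching the preset changes only the size
# of the render target, not any of the game's assets.
RENDER_PRESETS = {
    "SD":      (640, 480),
    "Full HD": (1920, 1080),
}

def make_render_target(preset: str) -> dict:
    width, height = RENDER_PRESETS[preset]
    return {"width": width, "height": height, "format": "RGBA8"}

for preset in RENDER_PRESETS:
    rt = make_render_target(preset)
    print(preset, rt, f"{rt['width'] * rt['height']:,} pixels")
```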


The main issue that is going to influence what hardware goes into the next gen is cost, well and truly. Don't expect $1000 of PC hardware in a $400 box. I think for a complete picture of what will be in the next gen, we will need to take a pragmatic wait-and-see approach and watch which direction the PC heads in.

1- How does anyone even dispute this? Ask Ubisoft. In the benchmarks they showed at the game development conference in Europe, they show that the PS3's CPU is faster than the PS4's. Cerny designed the PS4 with powerful programmable graphics so that it can offload physics tasks to the GPU in a couple of years; there are plenty of interviews about that.

http://gamingbolt.com/ubisoft-discovers-that-ps4-gpu-is-twice-as-powerful-as-xbox-one-gpu

2- Games cost a lot more to develop in HD because of the time and resources involved; even Nintendo is having issues.

http://www.escapistmagazine.com/news/view/125716-Nintendo-Underestimated-the-Cost-of-Going-HD

3- Of course cost will matter, and consoles will not be able to reach PC level because of power consumption, high-end configurations, high prices and so on.

The computing power measure for games today is teraflops, like it or not, mostly because GPU calculations are measured that way. Mostly theoretical teraflops, like I wrote above, but some measure real-world teraflop capability, like the Ubisoft graphs.

http://gamingbolt.com/ubisoft-discovers-that-ps4-gpu-is-twice-as-powerful-as-xbox-one-gpu
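For reference, the "theoretical teraflops" headline number is usually just shader ALUs × clock × 2 ops per cycle (a fused multiply-add counted as two); plugging in the commonly quoted PS4 specs:

```python
# Theoretical peak = shader ALUs x clock (GHz) x 2 FLOPs per cycle (FMA).
ps4_shader_alus = 1152       # 18 compute units x 64 ALUs each
ps4_clock_ghz = 0.8          # 800 MHz
flops_per_alu_per_cycle = 2  # one fused multiply-add = 2 FLOPs

tflops = ps4_shader_alus * ps4_clock_ghz * flops_per_alu_per_cycle / 1000
print(f"PS4 theoretical peak: {tflops:.2f} TFLOPS")  # ~1.84, matching the table above
```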

Of course we do nothing with calculations alone; that's why I included bandwidth, Gpixels/s and Gtexels/s too.

But again, would somebody buy a console that only moves from 1080p to 1440p? I wouldn't buy anything that doesn't give you native 4K. Now, of course some will make 1440p games at 60 fps, but several companies will say, "Oh, I'd rather make it native 4K at 30 fps, because it looks so great." Same as today, with 800-900p at 60 fps versus 1080p at 30 fps in several games. Either way, you need 4 times the computing power.
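The "4 times" figure is simply the pixel-count ratio (a simplification, since not every rendering pass scales linearly with resolution):

```python
# 4K has exactly four times the pixels of 1080p, so at the same per-pixel
# cost the GPU workload grows by roughly 4x.
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd_4k = 3840 * 2160    # 8,294,400 pixels
print(uhd_4k / full_hd)  # 4.0
```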

Also, we have to see how many people buy 4K TVs. If very few do, then 1440p consoles can be a reality, but what does it change if you are looking at a 1080p display?

--------------------

Last generation, Nintendo made a cheap console, the Wii, with terrible hardware performance, SD resolution and motion control as its big innovation. It attracted all the casual gamers and won the last gen 100 million to 83.

It seems that nobody cares about the Wii Remote, Kinect or Move any more. Did anybody see a popular game built around those controls shown at the last E3?

Casual gamers moved to mobile devices, or still play on the Wii.

This new gen, gamers are choosing the best-performing console. So if Sony and Microsoft believe they need the fastest machine to win, maybe they will build it for native 4K. But who knows.

As the new consoles are x86 compatible, we might see a new race starting. We might even get to see a premium PlayStation with more hardware and a higher resolution, with games that remain compatible with the PS4 by running at a lower resolution, but that's just silly speculation.

 



archer9234 said:

I get where you're coming from, but this is wrong. While, yes, making the resolution change isn't a big deal, everything needed to go along with it is what causes the expense. You aren't going to like PS1 game models at 1080p. People are going to expect highly detailed models with cloth physics, lighting, maps that have a day/night cycle, bump mapping, volumetric shadows, etc.


The resolution doesn't cost a thing, not a cent.
It's the assets that cost, and those same higher-quality assets would also bring benefits even if a game weren't at "HD".

Conversely, games have historically always increased the quality of their "graphics" despite being limited to certain resolutions.
For instance, the jump between the original NES and the PS2 had the same resolution ceiling, yet there is a stark difference in graphics.

Conversely, running a PS3 at PS2 levels of resolution, there is still a stupidly large difference in fidelity.
Again, resolution itself costs nothing to implement; it's not the sole contributor to graphics, far from it, and people's focus on that "magical number" needs to change. I could argue that contrast plays a far more significant role than resolution in image quality.

The cost of a game's development is always going to increase because we, as consumers, demand ever-increasing levels of graphics quality, regardless of whether you run at 640x480 or 1920x1080; it's just easier for people to blame resolution.

CrazyGPU said:

1- How does anyone even dispute this? Ask Ubisoft. In the benchmarks they showed at the game development conference in Europe, they show that the PS3's CPU is faster than the PS4's. Cerny designed the PS4 with powerful programmable graphics so that it can offload physics tasks to the GPU in a couple of years; there are plenty of interviews about that.

http://gamingbolt.com/ubisoft-discovers-that-ps4-gpu-is-twice-as-powerful-as-xbox-one-gpu

No offense, but Ubisoft doesn't know which way it's going anymore; their insane, ludicrous claims in the past have been incredibly humorous.
Their recent track record even more so, and that link has no *real* technical information either.

As for Cell vs Jaguar:
Yes, Cell is going to be more powerful at iterative-refinement floating-point mathematics, but throw integer work at it and it suddenly turns into a 486 DX66. It's not going to have peak performance with everything, all the time, and that's why the Cell is slower.

If I were to create a proper analogy... think of them as cars travelling down a highway. The Jaguar will sit happily at 100 kilometres per hour constantly, never changing speed (as it's equally adept regardless of the math/instructions being used), while the Cell will travel down the highway at 50 kilometres per hour; sometimes, thanks to wind direction, air temperature, traffic levels, road and tyre quality and fuel levels, it can accelerate to 150 kilometres per hour, but because conditions change constantly it cannot maintain that speed for very long.
And because of that, the Jaguar will always reach the end of the highway first.

CrazyGPU said:

The computing power measure for games today is teraflops, like it or not, mostly because GPU calculations are measured that way. Mostly theoretical teraflops, like I wrote above, but some measure real-world teraflop capability, like the Ubisoft graphs.

There is so much wrong with this...

So are you saying that FLOPS is all you need for an ACCURATE and DEFINITIVE comparison between completely different processing architectures?
If you answered yes, then please leave this thread, no really, please leave and never speaketh to me again.



--::{PC Gaming Master Race}::--

Intrinsic said:
This has always kind of been the case, but more so now considering how PC-like what's inside the consoles is. The way I see it, whatever is a midrange CPU/GPU ($200-$250) in the year of the console's release is what will be in the next-gen consoles. So if by that time we have $200 GPUs capable of doing 4K at 60fps, then that is what we will have in the consoles. My money is still on them not even bothering with that and just limiting devs to internally rendering their games at no higher than 1440p, then having a special/custom upscaler chip that upscales that to 4K. So while the PS5/XB2 GPUs may have in excess of 5000 shader cores, Sony/MS will limit how devs use that power to guarantee 4K@60fps in every game, while in truth what we will all be playing will be upscaled 1440p at 60fps.

@Bolded: I've already read that several times, but I wouldn't be so sure about it.

Remember that we are talking about consoles here, which connect to TVs that will be either Full HD or 4K, not in between. With that in mind, why would developers target a resolution of 1440p and upscale that to 4K when it makes more sense to target Full HD resolution and have a perfect 4x scale to 4K?
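To illustrate the "perfect scale": 1080p to 4K is exactly 2x in each axis, so every source pixel maps onto a clean 2x2 block, while 1440p to 4K is a non-integer 1.5x and has to be filtered. A quick sketch:

```python
# Scale factors from two render resolutions to a 4K output.
def scale_factor(src, dst):
    return (dst[0] / src[0], dst[1] / src[1])

print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0) -> clean integer scale
print(scale_factor((2560, 1440), (3840, 2160)))  # (1.5, 1.5) -> needs filtering/blending
```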



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

@Bolded: I've already read that several times, but I wouldn't be so sure about it.

Remember that we are talking about consoles here, which connect to TVs that will be either Full HD or 4K, not in between. With that in mind, why would developers target a resolution of 1440p and upscale that to 4K when it makes more sense to target Full HD resolution and have a perfect 4x scale to 4K?

You have a point; I am just talking generally, though. But the point I am really trying to make is that I feel the way consoles will accomplish 4K is through hardware upscaling. The PS4 has about 1100 GPU shader cores. I can just see the PS5 having no more than 4000 to 5000 shader cores, but with games internally locked to 1080p@60fps and then upscaled to 4K for those who have TVs that support it.

I hope I am wrong though.



Intrinsic said:
JEMC said:

@Bolded: I've already read that several times, but I wouldn't be so sure about it.

Remember that we are talking about consoles here, which connect to TVs that will be either Full HD or 4K, not in between. With that in mind, why would developers target a resolution of 1440p and upscale that to 4K when it makes more sense to target Full HD resolution and have a perfect 4x scale to 4K?

You have a point; I am just talking generally, though. But the point I am really trying to make is that I feel the way consoles will accomplish 4K is through hardware upscaling. The PS4 has about 1100 GPU shader cores. I can just see the PS5 having no more than 4000 to 5000 shader cores, but with games internally locked to 1080p@60fps and then upscaled to 4K for those who have TVs that support it.

I hope I am wrong though.

Shaders aren't everything. Just look at Nvidia's 980/970 cards: both chips have less of... well, basically almost everything but ROPs, and yet they outperform the 780 Ti/780 respectively.

As for what will happen with the next consoles, time will tell, I guess.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Yeah, I can't see the next consoles hitting 4K, not natively. Like others have said, it will more likely still be 1080p, scaled up to 4K for those with compatible TVs, but with a fair amount of post-processing going on to really give the 1080p image a boost. As for the RAM in the next gen, I don't think it will be that much of a boost, if I'm honest. 32GB absolute max.

What they need to concentrate on is a better way of storing games in the first instance. This gen so far is a joke: games are supplied on Blu-ray but then the whole lot is copied to the hard drive. Space is running out left, right and centre when basically it's just laziness on the devs' part. There is no reason that I can think of for a game like The Last of Us, as an example, to drag all the movie files over to the hard drive. These could be played quite happily off the Blu-ray drive and save what, 25GB of space on the hard drive? I'm sure this is going on with every other game as well to some degree.



PREDICTIONS FOR END OF 2015: (Made Jan 1st 2015)

PS4 - 34M - XB1 - 21m - WII U -12M

JEMC said:

Shaders aren't everything. Just look at Nvidia's 980/970 cards: both chips have less of... well, basically almost everything but ROPs, and yet they outperform the 780 Ti/780 respectively.

As for what will happen with the next consoles, time will tell, I guess.

I know shaders aren't everything. But in case you haven't noticed, I am not trying to be very technical. I know all about ROPs, TMUs and everything else that goes into a GPU's performance. But in the spirit of keeping things simple, I am only focusing on the one part of the GPU that everyone talks about: its shader cores (stream processors on AMD, CUDA cores on Nvidia), which are used as the primary marker of a GPU's expected performance.



kristianity77 said:

What they need to concentrate on is a better way of storing games in the first instance. This gen so far is a joke: games are supplied on Blu-ray but then the whole lot is copied to the hard drive. Space is running out left, right and centre when basically it's just laziness on the devs' part. There is no reason that I can think of for a game like The Last of Us, as an example, to drag all the movie files over to the hard drive. These could be played quite happily off the Blu-ray drive and save what, 25GB of space on the hard drive? I'm sure this is going on with every other game as well to some degree.

Ideally, what any dev would want to do is render every cutscene in-game, in real time, rather than ship it as pre-rendered video. That way the game comes in an overall smaller package. For now, some devs (e.g. ND) render cutscenes in engine and then store them on the disc as video files, so that while a video file is playing for the gamer, the game loads the upcoming level in the background. It's pretty much a very good loading screen. Moving the entire game over to the HDD, including the video files, just helps streamline the dev process. Now everyone pretty much uses the Blu-ray drive as nothing but an initial storage and security-check medium.
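A toy sketch of that trick (hypothetical function names, nothing engine-specific): play the pre-rendered cutscene on one thread while the next level streams in on another, so the load time hides behind the video.

```python
import threading
import time

def play_cutscene(duration_s: float) -> None:
    print("Cutscene playing...")
    time.sleep(duration_s)   # stands in for video playback
    print("Cutscene finished")

def load_next_level() -> None:
    print("Streaming level data in the background...")
    time.sleep(2.0)          # stands in for disk I/O and decompression
    print("Level ready")

loader = threading.Thread(target=load_next_level)
loader.start()
play_cutscene(duration_s=3.0)
loader.join()                # by now the level should already be loaded
print("Transition straight into gameplay, no visible loading screen")
```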

With the next gen I would expect significant improvements in storage I/O, basically in how the storage medium is connected to system memory. I think that in the next 6 years NAND storage will be a lot cheaper and become the norm, so it wouldn't be out of the question to have 2TB SSDs in consoles by then. I also feel that consoles will completely ditch the SATA interface for data transfer.



It will probably have a little less RAM than the average contemporary gaming PC.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW!