Eurogamer: PS5 and Xbox Series X Spec Leak.


It's gonna be a fun year.



exclusive_console said:
On paper, the PS5's specs compared to the X are much lower than expected. I guess Sony are confident of 4K 60fps with those specs? With the X it seems Microsoft wants to get PC gamers on board and keep core Xbox gamers from switching to Sony. Though if the PS5 can undercut the X's price by $100, I think they don't have to worry.

No one should be confident in this, IMO. Developers will decide on a game-by-game basis, and I'm sure 7 times out of 10 graphical fidelity will be prioritised over resolution and frame rate. The only way we'll be getting consistent 4K/60fps is if the Xbox Series S is very weak and used as the base development system. Hopefully this is not the case and developers start from the PS5 as the base and worry about the Xbox S later.

I think we will see lots of 1440p/30fps and even some 1080p/30fps games where devs really want to push ray tracing.

I think Microsoft is aiming for that native 4K benchmark, as with the X1X, but framerate will vary.



setsunatenshi said:
HollyGamer said:

True, not all hardcore gamers are PC-only, but most of them are on PC, because the best graphics can only be achieved on a powerful PC; that is an irrefutable fact. And not all hardcore gamers build a $1500 rig; instead they buy a $500 GPU, on par with a $600 Xbox X, and just upgrade their existing PC instead of building a new one. I don't see any reason not to upgrade your PC, but loyal Xbox fans will probably buy an Xbox X instead of a PC, or go with both Xbox and PC.

30% will not be noticeable, especially when most Xbox games are made using the DirectX API, which is good at scaling across many platforms (consoles, the many PC variants on the market, future hardware compatibility) but bad at optimizing for a single device. It's the opposite with the PlayStation APIs, GNM and GNMX, which are bad for porting games and future hardware compatibility but good at optimizing for one single device.

Also, there are more factors than just the teraflop number, like memory bandwidth, GPU clock speed, etc. Some experts even say many game developers would choose a smaller CU count at a higher clock over a bigger CU count at a lower clock frequency. The PS5 is rumored to run at 2 GHz with a small CU count, which will benefit pixel rates and geometry shaders. On top of that, both are using the same memory speed (the Flute benchmark indicates the PS5 will be using 18 Gbps GDDR6 instead of the number in the OP), while the Xbox will be using 56 CUs at 1700 MHz to reach 12 teraflops, with 18 Gbps memory as well. This means the real gap will be even smaller than 30% in practice. This is a very different scenario from the PS4 and Xbox One comparison, where the hardware was also bottlenecked by memory bandwidth.

I think only a very small number of people can actually tell the difference, and you are one of them. Also, let's be honest: without going to Digital Foundry, nobody is able to pinpoint the difference, especially since Digital Foundry pauses the games, zooms in and out, and puts the pictures side by side to compare them. Nobody really cares that much. Probably only a hundred thousand people like you will care enough to watch Digital Foundry, see the difference, and buy an Xbox X; many will just buy a new GPU. And the people who go to Digital Foundry are niche, super-hardcore gamers.

Let's look at this gen and how powerful the Xbox One X is: it still wasn't able to change their sales numbers. Content, price, and brand are the most important things; power is also important, but only to a certain degree.

@bolded: that's absolutely not true, especially when you have ray tracing lowering the fps count so significantly. At similar graphics settings it can be the equivalent of playing at 30+ fps vs a locked 60 fps.

Both consoles will be aiming for the same performance by tweaking visual effects. We have seen plenty of examples with the PS4 and Xbox One where the Xbox had a lower resolution or lower graphical effects than the PS4. We might see the PS5 at 1900p while the Xbox X runs at native 4K.

We might see frame rate differences, but not as big as the PS4/Xbox One difference. Also, both will support variable refresh rate over HDMI 2.1, so a 10 to 20 fps difference will be hard to notice with VRR.

Both consoles will underperform when using ray tracing, so we will see hybrid, more efficient techniques combining screen-space and texture-based ray tracing.



HollyGamer said:
setsunatenshi said:

@bolded: that's absolutely not true, especially when you have ray tracing lowering the fps count so significantly. At similar graphics settings it can be the equivalent of playing at 30+ fps vs a locked 60 fps.

Both consoles will be aiming for the same performance by tweaking visual effects. We have seen plenty of examples with the PS4 and Xbox One where the Xbox had a lower resolution or lower graphical effects than the PS4. We might see the PS5 at 1900p while the Xbox X runs at native 4K.

We might see frame rate differences, but not as big as the PS4/Xbox One difference. Also, both will support variable refresh rate over HDMI 2.1, so a 10 to 20 fps difference will be hard to notice with VRR.

Both consoles will underperform when using ray tracing, so we will see hybrid, more efficient techniques combining screen-space and texture-based ray tracing.

With a much bigger power difference than the one between PS4/XB1, how can you say it won't be as big as then? If anything it will be much, much bigger.

And variable refresh rate only matters if your display is able to support it; I'm pretty sure my TV isn't, same for 99% of the people out there. You are making a lot of assumptions without anything to really back them up. History tells us that graphics get pushed ahead of frame rate the majority of the time; the idea would be to have hardware powerful enough not to have to compromise on either vs the competitor.



PS5 = 9.2 Tflops
XBSX = 13 Tflops

Even Digital Foundry is jumping in on it... well, tbh it doesn't matter that much.
I'm sure both will be more than capable of running 4K.

This could mean there's a $100 price difference between the two, as others have said.
I don't think it's a mistake if it's possible for the PS5 to hold the price advantage over the XBSX.



setsunatenshi said:
HollyGamer said:

Both consoles will be aiming for the same performance by tweaking visual effects. We have seen plenty of examples with the PS4 and Xbox One where the Xbox had a lower resolution or lower graphical effects than the PS4. We might see the PS5 at 1900p while the Xbox X runs at native 4K.

We might see frame rate differences, but not as big as the PS4/Xbox One difference. Also, both will support variable refresh rate over HDMI 2.1, so a 10 to 20 fps difference will be hard to notice with VRR.

Both consoles will underperform when using ray tracing, so we will see hybrid, more efficient techniques combining screen-space and texture-based ray tracing.

With a much bigger power difference than the one between PS4/XB1, how can you say it won't be as big as then? If anything it will be much, much bigger.

And variable refresh rate only matters if your display is able to support it; I'm pretty sure my TV isn't, same for 99% of the people out there. You are making a lot of assumptions without anything to really back them up. History tells us that graphics get pushed ahead of frame rate the majority of the time; the idea would be to have hardware powerful enough not to have to compromise on either vs the competitor.

#1
4.2 vs 6 Tflops = 30% difference.
9.2 vs 13 Tflops = ~29% difference.

It's about the same in terms of relative power.


#2
However, with diminishing returns, and consoles at 6 Tflops already being able to run most things at 4K anyway, I think the differences will be minor.
It matters more at lower resolutions. You can easily spot the difference between a 720p image and a 1080p one. It's a lot harder with 1440p/1800p vs 4K (2160p). Power matters less and less, in terms of sharpness, as resolution increases. You won't have one be blurry while the other is sharp; the differences won't be that big.

#3
"variable refresh rate only matters if your display is able to support it, i'm pretty sure my tv isn't, same for 99% of the people out there."
True... this will be the "next" big thing to get as a console gamer: a TV with variable refresh rate, so you avoid screen tearing and have less noticeable lag while playing.
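The arithmetic in #1 and #2 can be checked with a quick sketch (the teraflop figures are the thread's rumored numbers; "gap" here means the difference as a share of the larger figure, which is how the post computes it):

```python
# Gap between two throughput figures, expressed as a share of the larger one.
def gap_pct(low, high):
    return (high - low) / high * 100

print(round(gap_pct(4.2, 6.0), 1))   # 30.0  (PS4 Pro vs Xbox One X)
print(round(gap_pct(9.2, 13.0), 1))  # 29.2  (rumored PS5 vs XBSX)

# Pixel counts behind point #2: both jumps are the same 2.25x ratio,
# yet 720p -> 1080p is far easier to spot than 1440p -> 4K.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["1080p"] / pixels["720p"])  # 2.25
print(pixels["4K"] / pixels["1440p"])    # 2.25
```

So the relative gap really is about the same across the two generations, though the 9.2 vs 13 figure comes out closer to 29% than 28%.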



HollyGamer said:
Azzanation said:

Not all hardcore gamers are PC-only. There are plenty of hardcore console gamers who want the best visuals just as much as the next PC gamer; in fact, I know more PC gamers who play on PCs weaker than a Pro or an X, and there is plenty of info showing the average PC gamer does not have a high-end PC. Not everyone wants a PC, and that's coming from a PC gamer myself. I have plenty of friends who refuse to go out and build a $1500 AUD PC just to play games better on it than on a $600 AUD console.

As I mentioned before, games will always push the bar every new gen, and they will eventually utilise 100% of the hardware in the next machines, so a 30% advantage will be noticeable. A better argument would be: will console gamers care that much? Yes and no. Many console gamers this gen made a huge fuss over 900p vs 1080p. You might not care, but there will be others that do.

I don't need to zoom in to see the differences in games from a Pro to an X, or from an X to an ultra-settings PC version. I can tell; whether it affects my gameplay experience, it doesn't. Next gen is going to be about what you deem affordable, as it's heading in the direction of how much you want to spend for quality.

True, not all hardcore gamers are PC-only, but most of them are on PC, because the best graphics can only be achieved on a powerful PC; that is an irrefutable fact. And not all hardcore gamers build a $1500 rig; instead they buy a $500 GPU, on par with a $600 Xbox X, and just upgrade their existing PC instead of building a new one. I don't see any reason not to upgrade your PC, but loyal Xbox fans will probably buy an Xbox X instead of a PC, or go with both Xbox and PC.

30% will not be noticeable, especially when most Xbox games are made using the DirectX API, which is good at scaling across many platforms (consoles, the many PC variants on the market, future hardware compatibility) but bad at optimizing for a single device. It's the opposite with the PlayStation APIs, GNM and GNMX, which are bad for porting games and future hardware compatibility but good at optimizing for one single device.

Also, there are more factors than just the teraflop number, like memory bandwidth, GPU clock speed, etc. Some experts even say many game developers would choose a smaller CU count at a higher clock over a bigger CU count at a lower clock frequency. The PS5 is rumored to run at 2 GHz with a small CU count, which will benefit pixel rates and geometry shaders. On top of that, both are using the same memory speed (the Flute benchmark indicates the PS5 will be using 18 Gbps GDDR6 instead of the number in the OP), while the Xbox will be using 56 CUs at 1700 MHz to reach 12 teraflops, with 18 Gbps memory as well. This means the real gap will be even smaller than 30% in practice. This is a very different scenario from the PS4 and Xbox One comparison, where the hardware was also bottlenecked by memory bandwidth.

I think only a very small number of people can actually tell the difference, and you are one of them. Also, let's be honest: without going to Digital Foundry, nobody is able to pinpoint the difference, especially since Digital Foundry pauses the games, zooms in and out, and puts the pictures side by side to compare them. Nobody really cares that much. Probably only a hundred thousand people like you will care enough to watch Digital Foundry, see the difference, and buy an Xbox X; many will just buy a new GPU. And the people who go to Digital Foundry are niche, super-hardcore gamers.

Let's look at this gen and how powerful the Xbox One X is: it still wasn't able to change their sales numbers. Content, price, and brand are the most important things; power is also important, but only to a certain degree.
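The CU-count and clock figures in the post above can be sanity-checked with the usual AMD shader math. The 64 ALUs per CU and 2 FLOPs per clock are the standard GCN/RDNA convention, not confirmed specs, and the 36-CU figure is back-calculated from the rumored 9.2 TF at 2 GHz rather than stated anywhere in the thread:

```python
# Theoretical FP32 throughput from CU count and clock, in teraflops.
# Assumes 64 shader ALUs per CU, each doing 2 FLOPs (one FMA) per clock.
def tflops(cu_count, clock_ghz, alus_per_cu=64, flops_per_clock=2):
    return cu_count * alus_per_cu * flops_per_clock * clock_ghz / 1000.0

print(round(tflops(36, 2.0), 3))  # 9.216  -> matches the rumored ~9.2 TF PS5
print(round(tflops(56, 1.7), 3))  # 12.186 -> matches the rumored ~12 TF Xbox
```

The same teraflop total can come from many CU/clock combinations, which is the narrow-and-fast vs wide-and-slow debate in a nutshell: higher clocks also speed up fixed-function parts like rasterizers and geometry units, while extra CUs only add shader throughput.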

You must have missed the start of this gen. All this forum consisted of was links to DF videos showing slowed-down, zoomed-in, or paused footage of Xbone vs PS4 titles, and all we heard from users here was how massive the difference was, how easy it was to see, and how important a factor it was.

Microsoft has already fixed their content problem. Their brand has also been recovering ever since Papa Phil took over. We'll see on price.



exclusive_console said:
On paper, the PS5's specs compared to the X are much lower than expected. I guess Sony are confident of 4K 60fps with those specs? With the X it seems Microsoft wants to get PC gamers on board and keep core Xbox gamers from switching to Sony. Though if the PS5 can undercut the X's price by $100, I think they don't have to worry.

Forget about 60 FPS in all games. It will happen in some games where devs choose FPS over graphics, but there will be games at 30 FPS with a heavy focus on graphics.



JRPGfan said:
setsunatenshi said:

With a much bigger power difference than the one between PS4/XB1, how can you say it won't be as big as then? If anything it will be much, much bigger.

And variable refresh rate only matters if your display is able to support it; I'm pretty sure my TV isn't, same for 99% of the people out there. You are making a lot of assumptions without anything to really back them up. History tells us that graphics get pushed ahead of frame rate the majority of the time; the idea would be to have hardware powerful enough not to have to compromise on either vs the competitor.

#1
4.2 vs 6 Tflops = 30% difference.
9.2 vs 13 Tflops = ~29% difference.

It's about the same in terms of relative power.


#2
However, with diminishing returns, and consoles at 6 Tflops already being able to run most things at 4K anyway, I think the differences will be minor.
It matters more at lower resolutions. You can easily spot the difference between a 720p image and a 1080p one. It's a lot harder with 1440p/1800p vs 4K (2160p). Power matters less and less, in terms of sharpness, as resolution increases. You won't have one be blurry while the other is sharp; the differences won't be that big.

#3
"variable refresh rate only matters if your display is able to support it, i'm pretty sure my tv isn't, same for 99% of the people out there."
True... this will be the "next" big thing to get as a console gamer: a TV with variable refresh rate, so you avoid screen tearing and have less noticeable lag while playing.

1- You are disregarding the IPC improvements in the newest RDNA tech vs the extremely old architecture used in the PS4 generation (by today's standards). That by itself blows past the 30% paper difference between the two rumored specs. This is why people keep saying "flops" is not a good measure for comparing power.

2- Ray tracing will be a performance killer; you can look at current PC benchmarks to see that. The difference will be much more noticeable than the current gen's pixel-count gap, which, I agree, doesn't bother me much; I don't particularly mind the 1440p upscale vs 4K on the two consoles.

3- yep, exactly



I don't know if their info is accurate or not, though they must be pretty sure about it to put their credibility at risk, but I have to ask: why do some of you say it's wrong because it doesn't include ray tracing hardware?

This article/video only talks about memory bandwidth, compute units, the frequency they may run at, and the theoretical flops of it all, none of which has anything to do with ray tracing. So, again, why are some of you saying it's false because of it?



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.