
PS3-PS4 and X360-XOne last big performance spec leap

curl-6 said:
EricHiggin said:

PS3 should never have been as powerful and expensive as it ended up being at launch. It should have been closer to 125 GFLOPS and $400, which would make more sense based on console hardware history.

The PS3 as is was very close in power to the 360 despite launching a year later.

True, but the 360 was really, really high end hardware for a console when it launched. It probably didn't seem that way for MS since the OGXB launched a year after PS2 and was substantially more powerful than the PS2, so the gap between OGXB and 360 wasn't nearly as large as from PS2 to PS3.

I think 360 was around 180 GFLOPS or something like that, so PS3 could have gotten by with 125. The games early on would have been a little more harsh visually, but a reasonable $400 price would have sold way more units early on and through the lifespan.
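For reference, those GFLOPS figures come from a simple product, and the counting convention is exactly why quoted numbers for the same chip vary; a minimal sketch with Xenos-like numbers (ALU count and clock are public, the FLOPs-per-ALU figure is an assumption):

```cpp
#include <cstdio>

// Theoretical GFLOPS = ALUs x FLOPs per ALU per clock x clock (GHz).
// Changing the assumed FLOPs-per-ALU is why estimates for the 360's
// Xenos range anywhere from ~180 up to 240 GFLOPS.
int main() {
    const double alus          = 48.0; // Xenos unified shader ALUs
    const double clock_ghz     = 0.5;  // 500 MHz
    const double flops_per_alu = 10.0; // assumed: vec4 MAD (8) + scalar MAD (2)
    std::printf("theoretical: %.0f GFLOPS\n", alus * clock_ghz * flops_per_alu);
    return 0;
}
```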

360 also shouldn't have been as powerful as it was either, but XB was bound and determined to take more market share and PS most certainly did not want to give it up (plus many other reasons), so we ended up with a spec war, which was good then, sort of, but makes the recent and future gen leaps seem weak.



PS1   - ! - We must build a console that can alert our enemies.

PS2  - @- We must build a console that offers online living room gaming.

PS3   - #- We must build a console that’s powerful, social, costs and does everything.

PS4   - $- We must build a console that’s affordable, charges for services, and pumps out exclusives.

PRO  -%-We must build a console that's VR ready, checkerboard upscales, and sells but a fraction of the money printer.

PS5   - ^ -We must build a console that’s a generational cross product, with RT lighting, and price hiking.

PRO  -&- We must build a console that Super Res upscales and continues the cost increases.

Pemalite said:

Delta Colour Compression does reduce bandwidth demands though, which means there is more room for say... Texturing.

Correct, you can compress normal maps with texture compression, depending on the compression algorithm of course; not all texture compression schemes handle normal maps well.

Lightmaps are probably going to stay for the immediate future.

Shadow map compression is already happening in 3Dc+.

Technically you can compress normal maps with just about any compression algorithm that supports RGB channels ... (Which is just about every commercially available hardware accelerated texture compression algorithm on GPUs, although quality may be variable.)
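As a concrete illustration of the dedicated two-channel route (a minimal sketch assuming OpenGL 3.3+ with GLEW and pre-compressed BC5/RGTC2 block data, not any particular engine's code): store only X/Y of the unit normal and rebuild Z in the shader, which is the idea behind 3Dc-style normal map compression.

```cpp
#include <GL/glew.h>

// Upload a normal map stored as BC5 / RGTC2: two independently compressed
// channels, which suits the X/Y of a tangent-space normal far better than
// squeezing all three components through an RGB format.
GLuint uploadNormalMapBC5(const void* blocks, int w, int h, int byteSize) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RG_RGTC2,
                           w, h, 0, byteSize, blocks);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    return tex;
}

// Fragment-shader side (GLSL), rebuilding Z from the unit-length constraint:
const char* kReconstructZ = R"(
    vec2 xy = texture(normalMap, uv).rg * 2.0 - 1.0;
    float z = sqrt(max(0.0, 1.0 - dot(xy, xy)));
    vec3 n  = vec3(xy, z);
)";
```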

Lightmaps probably will be staying for the immediate future ...

Shadow map compression may very well be a real thing since shadow map texels are generated, but we're already applying a sort of shadow LOD optimization by doing cascaded shadow mapping ...
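The cascade part boils down to how you slice the view frustum; here is a minimal sketch of the commonly used "practical split scheme" that blends logarithmic and uniform splits (the lambda and cascade count are generic tuning values, not anything from this thread):

```cpp
#include <cmath>
#include <vector>

// Split [zNear, zFar] into 'count' cascades. lambda = 1 gives a purely
// logarithmic split (resolution concentrated near the camera); lambda = 0
// gives uniform slices.
std::vector<float> cascadeSplits(float zNear, float zFar, int count, float lambda) {
    std::vector<float> splits(count);
    for (int i = 1; i <= count; ++i) {
        float t        = static_cast<float>(i) / count;
        float logSplit = zNear * std::pow(zFar / zNear, t); // logarithmic
        float uniSplit = zNear + (zFar - zNear) * t;        // uniform
        splits[i - 1]  = lambda * logSplit + (1.0f - lambda) * uniSplit;
    }
    return splits;
}
// e.g. cascadeSplits(0.1f, 1000.0f, 4, 0.75f) yields progressively larger
// slices, spending shadow-map texels where the viewer actually is.
```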

Shadow mapping in general is a dumb idea. What we probably want are solutions based on hardware accelerated conservative rasterization, like Nvidia's frustum traced raster shadows research, or SDF based shadows (bonus is proper penumbra widening for more realistic soft shadowing, and it comes with ambient occlusion as well) ...

Or even filterable shadow maps like non-linearly quantized moment shadow maps ... 
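For the SDF route mentioned above, the penumbra widening falls out of tracking the closest miss along the shadow ray; a minimal sketch, with a hypothetical sceneSDF() standing in for whatever distance field the engine actually has:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };
static Vec3 operator*(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

float sceneSDF(Vec3 p); // assumed: signed distance to the nearest surface

// Sphere-trace from the shaded point toward the light, keeping the closest
// miss; k controls penumbra width (smaller k = softer, wider shadows).
float sdfSoftShadow(Vec3 p, Vec3 lightDir, float k) {
    float res = 1.0f;
    for (float t = 0.02f; t < 50.0f; ) {
        float d = sceneSDF(p + lightDir * t);
        if (d < 1e-4f) return 0.0f;     // ray hit geometry: fully occluded
        res = std::min(res, k * d / t); // closest-miss penumbra estimate
        t += d;
    }
    return res;                         // 1.0 = fully lit
}
```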

Pemalite said:

Well. There are caveats like with any technology.
AMD does have hardware support for Partially Resident Textures and the concept is sound.

*snip*

Even the latest doom game used Partially Resident Textures... And that game not only looked fantastic... But it ran silky smooth even on average hardware.

The last line is not true; what Doom uses is a texture atlas, like many other modern games ...

Partially resident textures were a good idea at first, but there was no way to update the tile mappings from the GPU, so you had to do it from the CPU, and that roundtrip causes a lot of overhead which can stall the game ...
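To make that roundtrip concrete, here is a sketch of the CPU-side call involved, using OpenGL's ARB_sparse_texture as one example of such an API: the GPU can sample a sparse texture freely, but only the CPU can change which tiles are resident, so per-frame feedback about needed tiles has to travel GPU to CPU and back.

```cpp
#include <GL/glew.h>

// Commit or evict one tile of a sparse (partially resident) texture.
// This is a CPU-side driver call; there is no GPU-initiated equivalent
// exposed in this API, which is the roundtrip being criticized above.
void commitTile(GLuint tex, int level, int tileX, int tileY,
                int tileW, int tileH, bool resident) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexPageCommitmentARB(GL_TEXTURE_2D, level,
                           tileX * tileW, tileY * tileH, 0, // x/y/z offset
                           tileW, tileH, 1,                 // region size
                           resident ? GL_TRUE : GL_FALSE);
}
```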



Bofferbrauer2 said:

Don't forget the PS4 Pro APU includes the CPU, and the Jaguar CPU is very small. A Ryzen-based CPU part would be much bigger even with just half the cores

5nm will be problematic unless the PS5 releases in 2021 at the earliest. While 5nm will probably start production in 2019-2020, the yield rates will be atrocious at first, so it's not really financially viable. As for EUV, well, its initial investment will be enormous and somehow the fabs have to make that money back, which means more expensive wafers and thus more expensive chips. That's actually the main reason EUV hasn't been implemented yet; the technical difficulties have largely been solved by now.

In any case, TDP will be a problem with Vega, maybe Navi can change that, but I doubt it personally

Ryzen will eventually get small too with more transistor shrinks ... 

I think PS5 will launch in 2021, and EUV research was already amortized, so there's not much need to make the money back from research, and it didn't cost tens of billions like you imagine ...

I also don't think we'll be getting Navi, it's probably going to be the architecture after that one if PS5 releases in 2021 ...



fatslob-:O said:

Technically you can compress normal maps with just about any compression algorithm that supports RGB channels ... (Which is just about every commercially available hardware accelerated texture compression algorithm on GPUs, although quality may be variable.)

 

Well. True. But as you said, quality varies.

fatslob-:O said:

Shadow mapping in general is a dumb idea. What we probably want are solutions based on hardware accelerated conservative rasterization, like Nvidia's frustum traced raster shadows research, or SDF based shadows (bonus is proper penumbra widening for more realistic soft shadowing, and it comes with ambient occlusion as well) ...

Or even filterable shadow maps like non-linearly quantized moment shadow maps ...

Well. It has its pros and cons like all things.

fatslob-:O said:
The last line is not true, what Doom uses is a texture atlas like many other modern games ...

https://youtu.be/la0O0SyM3gg?t=7m16s
 


http://renderingpipeline.com/2012/03/partially-resident-textures/
http://renderingpipeline.com/2012/03/partially-resident-textures-amd_sparse_texture/
https://en.wikipedia.org/wiki/Id_Tech_6

fatslob-:O said:

Partially resident textures were a good idea at first, but there was no way to update the tile mappings from the GPU, so you had to do it from the CPU, and that roundtrip causes a lot of overhead which can stall the game ...

AMD has hardware support for it AFAIK.






www.youtube.com/@Pemalite

EricHiggin said:
curl-6 said:

The PS3 as is was very close in power to the 360 despite launching a year later.

True, but the 360 was really, really high end hardware for a console when it launched. It probably didn't seem that way for MS since the OGXB launched a year after PS2 and was substantially more powerful than the PS2, so the gap between OGXB and 360 wasn't nearly as large as from PS2 to PS3.

I think 360 was around 180 GFLOPS or something like that, so PS3 could have gotten by with 125. The games early on would have been a little more harsh visually, but a reasonable $400 price would have sold way more units early on and through the lifespan.

360 also shouldn't have been as powerful as it was either, but XB was bound and determined to take more market share and PS most certainly did not want to give it up (plus many other reasons), so we ended up with a spec war, which was good then, sort of, but makes the recent and future gen leaps seem weak.

FLOPS aren't a very good measure of system power as they are just one metric of many that determine a system's performance.

As for PS3/360 being "too powerful", I'm actually glad they were as strong as they were, as it allowed for experiences like Uncharted 2 or Bioshock Infinite that really wouldn't have been as great or impactful as they were had the hardware been considerably weaker.



fatslob-:O said:
Bofferbrauer2 said:

Don't forget the PS4 Pro APU includes the CPU, and the Jaguar CPU is very small. A Ryzen-based CPU part would be much bigger even with just half the cores

5nm will be problematic unless the PS5 releases in 2021 at the earliest. While 5nm will probably start production in 2019-2020, the yield rates will be atrocious at first, so it's not really financially viable. As for EUV, well, its initial investment will be enormous and somehow the fabs have to make that money back, which means more expensive wafers and thus more expensive chips. That's actually the main reason EUV hasn't been implemented yet; the technical difficulties have largely been solved by now.

In any case, TDP will be a problem with Vega, maybe Navi can change that, but I doubt it personally

Ryzen will eventually get small too with more transistor shrinks ... 

I think PS5 will launch in 2021, and EUV research was already amortized, so there's not much need to make the money back from research, and it didn't cost tens of billions like you imagine ...

I also don't think we'll be getting Navi, it's probably going to be the architecture after that one if PS5 releases in 2021 ...

It's not even ready yet for volume production and keeps getting pushed back. Right now volume production is slated to start around 2019/2020 (Intel in 2019; GlobalFoundries, TSMC and Samsung in 2020), though even then the WPH (wafers per hour) rate is pretty low at less than 100, compared to almost 300 for 193nm immersion lithography. What's more, it's still about 20% more expensive to run than traditional 193nm lithography, which, compounded with the low output, makes it very expensive. It's far from amortized; the big fabs don't even have EUV ready yet. It's possible that you confuse it with DUV, which is Deep Ultraviolet Lithography and the 193nm processes actually in use by all the fabs right now (ultraviolet already starts at around 350nm).

Even then, from the roadmap of GlobalFoundries I can see that EUV will only be used in conjunction with DUV, and the reason is to reduce the multiple patterning necessary with 193nm. For 7nm, for instance, you'll need 4 mask patterns without EUV, which is pretty complex and prone to errors.

It also depends on whether ASML can actually deliver the necessary EUV lithography machines, as they are the sole producer of them. But they just reduced the outlook for 2017 from 12 machines to only 6-7. They shipped a grand total of 20 EUV lithography machines in the years 2010-2016, compared to 60 DUV machines in 2016 alone.
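Putting rough numbers on the throughput argument, using only the figures from this exchange (~100 WPH for EUV vs ~275 for 193nm immersion, and ~20% higher running cost); this is a back-of-the-envelope per-exposure comparison, before accounting for EUV replacing multi-patterning passes:

```cpp
#include <cstdio>

// Relative cost per wafer pass scales as (running cost) / (throughput).
// Figures are the rough ones quoted in this thread, not vendor data.
int main() {
    const double duv_wph = 275.0; // "almost 300" WPH, 193nm immersion
    const double euv_wph = 100.0; // "less than 100" WPH for EUV
    const double euv_run = 1.2;   // ~20% higher running cost
    const double rel = (euv_run / euv_wph) / (1.0 / duv_wph);
    std::printf("one EUV pass ~%.1fx the cost of one DUV pass\n", rel); // ~3.3x
    // Caveat: a single EUV layer can replace up to 4 DUV mask patterns at
    // 7nm (per the roadmap point above), which narrows the real-world gap.
    return 0;
}
```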



I hope that for the next few years console manufacturers just settle for what's out there and release some high quality games for the bloody things!



curl-6 said:
EricHiggin said:

True, but the 360 was really, really high end hardware for a console when it launched. It probably didn't seem that way for MS since the OGXB launched a year after PS2 and was substantially more powerful than the PS2, so the gap between OGXB and 360 wasn't nearly as large as from PS2 to PS3.

I think 360 was around 180 GFLOPS or something like that, so PS3 could have gotten by with 125. The games early on would have been a little more harsh visually, but a reasonable $400 price would have sold way more units early on and through the lifespan.

360 also shouldn't have been as powerful as it was either, but XB was bound and determined to take more market share and PS most certainly did not want to give it up (plus many other reasons), so we ended up with a spec war, which was good then, sort of, but makes the recent and future gen leaps seem weak.

FLOPS aren't a very good measure of system power as they are just one metric of many that determine a system's performance.

As for PS3/360 being "too powerful", I'm actually glad they were as strong as they were, as it allowed for experiences like Uncharted 2 or Bioshock Infinite that really wouldn't have been as great or impactful as they were had the hardware been considerably weaker.

Correct. FLOPS are just an indicator, like HP is for a vehicle. There are many other things to consider when analyzing specs in comparison to what is actually output on the screen. There really is no way to make a completely accurate comparison of the old consoles taking everything into account, so I just used FLOPS, as it was a simple way to indicate where PS5 could possibly end up. I had to change the past (PS3) to create a somewhat linear path to follow, though, so that in itself is a big indicator it's a wild guess. It's just as possible that PS5's specs, price, and launch date will be totally different. Nobody but PS knows.

PS3 and 360 being so powerful was good in some ways, but not so great in others. To name a few good things: obviously the res and effects were a major upgrade, along with the media capabilities; 360 was affordable; PS3 had free online; and it also allowed the gen to last 7 years. Whether or not PS3 and 360 could still have lasted 7 years with around 125 GFLOPS is also something to take into account. Both consoles most certainly would have been due for a next gen performance jump in 2013 if that were the case.

A few bad things: two really expensive consoles that were heavily subsidized (PS3 more so), which was not good for either company and led to "weaker", cheaper, "off the shelf" semi custom PC part consoles. XB rushing and poorly engineering the 360 was one of the causes of the red ring issue and huge losses, leading to the bulky design of the XB1. PS3's "it only does everything" strategy was taken up by XB, trying to make the XB1 a more balanced, all in one box instead of a focused gaming console. 360 charged for all of its online services, most likely to cover some of the console subsidy, and PS3 was crazy expensive due to their Cell/media box/BC/"we're PS so we can do whatever we want and they will buy into it" attitude, etc.

I'm not saying it was necessarily a bad thing that PS3 and 360 were as powerful as they were; I'm just saying that based on console history, it would have made more sense back then, now, and going forward if they had ended up being less powerful than they actually were. Instead of being a slightly lesser jump in performance, they were a greater leap. That is something that could not be sustained, and it could only lead to two things: future consoles seeming like weaker jumps but being affordable, or similar leaps in performance with even more insane launch prices, massive subsidies, and console case shells that would make the XB1 look like a toothpick.




Pemalite said:

Well. It has it's Pro's and Con's like all things.

Real shadow mapping is unbelievably scary as hell ... (Could be perspective/projective aliasing, erroneous self shadowing/acne, and it includes Peter Panning as well.)
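Those acne/Peter Panning failure modes are typically traded against each other with one knob: slope-scaled depth bias during the shadow pass. A minimal OpenGL sketch with typical starting values (the numbers are illustrative, not from this thread):

```cpp
#include <GL/glew.h>

// Bias depth while rendering into the shadow map. Too little bias gives
// acne (surfaces shadow themselves); too much detaches shadows from their
// casters, which is exactly the Peter Panning artifact mentioned above.
void beginShadowPass() {
    glEnable(GL_POLYGON_OFFSET_FILL);
    // factor scales with polygon slope; units adds a constant number of
    // minimum-resolvable depth steps. Tune per scene.
    glPolygonOffset(2.0f, 4.0f);
}

void endShadowPass() {
    glDisable(GL_POLYGON_OFFSET_FILL);
}
```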

Pemalite said:

https://youtu.be/la0O0SyM3gg?t=7m16s
 


http://renderingpipeline.com/2012/03/partially-resident-textures/
http://renderingpipeline.com/2012/03/partially-resident-textures-amd_sparse_texture/
https://en.wikipedia.org/wiki/Id_Tech_6

That's megatexturing (also known as virtual texturing), not partially resident textures ... 

Game developers don't like partially resident textures because they have no way of updating the tile mappings from the GPU ... (AFAIK only Intel Gen9 has that hardware feature, and it's not exposed in any APIs so far.)
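By contrast, software virtual texturing needs no special residency API at all: the page table is just an ordinary texture the engine updates however it likes. A simplified GLSL-style lookup, ignoring page borders and mip selection (names and layout are illustrative, not id Tech 6's actual scheme):

```cpp
// GLSL fragment snippet kept as a C++ raw string; the page table maps each
// virtual page to the atlas page currently caching it.
const char* kVirtualTextureLookup = R"(
    uniform sampler2D pageTable;     // one texel per virtual page
    uniform sampler2D physicalPages; // resident pages packed into an atlas
    uniform vec2 pageCount;          // virtual pages per axis
    uniform vec2 atlasPages;         // atlas pages per axis

    vec4 sampleVirtual(vec2 virtUV) {
        // Which atlas page caches this virtual page (stored normalized).
        vec2 page   = floor(texture(pageTable, virtUV).xy * atlasPages);
        // Where we are inside that page.
        vec2 inPage = fract(virtUV * pageCount);
        return texture(physicalPages, (page + inPage) / atlasPages);
    }
)";
```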

Pemalite said:

AMD has hardware support for it AFAIK. 

Yeah, but it still sucks since there's no way to update tile mappings from the GPU; you have to do it on the CPU, which will stall the game ...



Bofferbrauer2 said:

It's not even ready yet for volume production and keeps getting pushed back. Right now volume production is slated to start around 2019/2020 (Intel in 2019; GlobalFoundries, TSMC and Samsung in 2020), though even then the WPH (wafers per hour) rate is pretty low at less than 100, compared to almost 300 for 193nm immersion lithography. What's more, it's still about 20% more expensive to run than traditional 193nm lithography, which, compounded with the low output, makes it very expensive. It's far from amortized; the big fabs don't even have EUV ready yet. It's possible that you confuse it with DUV, which is Deep Ultraviolet Lithography and the 193nm processes actually in use by all the fabs right now (ultraviolet already starts at around 350nm).

Actually Samsung will use EUV in 2018 with their 7nm process technology ...  

And I'm not confused, far from it ...

Also, the wafer throughput has been increased to 125 WPH with ASML's NXE:3400B ...

Bofferbrauer2 said:

Even then, from the roadmap of GlobalFoundries I can see that EUV will only be used in conjunction with DUV, and the reason is to reduce the multiple patterning necessary with 193nm. For 7nm, for instance, you'll need 4 mask patterns without EUV, which is pretty complex and prone to errors.

It also depends on whether ASML can actually deliver the necessary EUV lithography machines, as they are the sole producer of them. But they just reduced the outlook for 2017 from 12 machines to only 6-7. They shipped a grand total of 20 EUV lithography machines in the years 2010-2016, compared to 60 DUV machines in 2016 alone.

That's because GlobalFoundries is full of goofs ...

ASML also shipped 3 NXE:3400B systems last quarter, with 8 additional orders, of which 6 came from one customer for use in both logic and DRAM ... (it's obviously Samsung)