
Forums - Gaming Discussion - (Rumor): PS5 new devkit named "Prospero", Microsoft developing high-tech camera for Xbox Next

I believe Cerny's "butterfly" is the key to deducing the power of the PS5 GPU. Double the PS4 Pro's compute units like a "four-winged butterfly" and you get 8.4 TF; multiply by 1.25 for RDNA's efficiency gain, and there it is, a 10.5 TF console.
In base PS4 mode the GPU works at only a quarter capacity; in PS4 Pro mode, for supported games, it runs at half; and for PS5 games the GPU is 100% active.
It will be a Godzilla + Mothra duo inside this console, taking Ryzen 3 into account.
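For what it's worth, the arithmetic behind this guess can be written out (a sketch that simply takes the 4.2 TF PS4 Pro figure and the claimed 25% RDNA uplift as given; none of these numbers are confirmed):

```python
ps4_pro_tf = 4.2              # PS4 Pro: 36 CUs at 911 MHz
doubled = 2 * ps4_pro_tf      # "four-winged butterfly": mirror the GPU again -> 8.4 TF
rdna_uplift = 1.25            # assumed perf-per-FLOP gain of RDNA over GCN
print(doubled * rdna_uplift)  # 10.5
```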



HollyGamer said:

Using different methods doesn't mean inventing new wheels; it means they use separate ways to render the "rays". Even the PS4 and Xbox have different methods of applying their APIs to games: Microsoft puts heavy emphasis on DirectX (for PC compatibility support), while Sony uses a low-level OpenGL.

The Playstation 4 and Xbox One certainly do use different APIs.
Microsoft and Sony won't even use DirectX or OpenGL for their low-level APIs, as those are inefficient high-level APIs; lower-budget/less demanding games will target OpenGL/DirectX due to how easy they are to interface and work with.

But the hardware still skins the cat exactly the same way. They aren't rendering things differently; the Playstation 4 GPU, despite not having DirectX, is still a DirectX-compliant part, and developers recognize and work with that.

HollyGamer said:

Cerny explaining how PS4 pro enhanced PS4 games: 

"First, we doubled the GPU size by essentially placing it next to a mirrored version of itself, sort of like the wings of a butterfly. That gives us an extremely clean way to support the existing 700 titles," Cerny explains, detailing how the Pro switches into its 'base' compatibility mode. "We just turn off half the GPU and run it at something quite close to the original GPU."

In Pro mode, the full GPU is active, and running at 911MHz - a 14 per cent bump in frequency, turning a 2x boost in GPU power into a 2.24x increase. However, the CPU doesn't receive the same increase in raw capabilities - and Sony believes that interoperability with the existing PS4 is the primary reason for sticking with the same, relatively modest Jaguar CPU clusters.

"For variable frame-rate games, we were looking to boost the frame-rate. But we also wanted interoperability. We want the 700 existing titles to work flawlessly," Mark Cerny explains. "That meant staying with eight Jaguar cores for the CPU and pushing the frequency as high as it would go on the new process technology, which turned out to be 2.1GHz. It's about 30 per cent higher than the 1.6GHz in the existing model."

Obviously a dumbed down explanation in order not to confuse those with less technical backgrounds.
In saying that, it isn't some magical "full metal optimization".

Playstation 5 will be an architectural deviation from the Playstation 4, thus software level control is going to play a key part.
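As an aside, the percentages Cerny quotes check out against the base PS4's clocks (800 MHz GPU, 1.6 GHz CPU), as a quick sanity check shows:

```python
ps4_gpu_mhz, pro_gpu_mhz = 800, 911
ps4_cpu_ghz, pro_cpu_ghz = 1.6, 2.1

gpu_bump = (pro_gpu_mhz / ps4_gpu_mhz - 1) * 100
cpu_bump = (pro_cpu_ghz / ps4_cpu_ghz - 1) * 100
print(round(gpu_bump))  # 14  ("a 14 per cent bump in frequency")
print(round(cpu_bump))  # 31  ("about 30 per cent higher")
```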

HollyGamer said:

Also, the previous PS5 APU benchmark leak (Oberon) indicates that the PS5 APU will run at multiple clock speeds based on which game is running inside the system:

800 MHz for PS4, 911 MHz for PS4 Pro, and 2 GHz for PS5 games. As we know, Sony does not have the benefit of Microsoft's API and software expertise when it comes to emulation.

Sony also doesn't need it. Sony is relying on the *nix community.
If you think only clock speed is needed to retain compatibility... Well. You are highly mistaken.

HollyGamer said:

That's why I mentioned above that they are using "their DirectX API" for ray-tracing support, because they have been working with Nvidia and AMD does not yet have the tech. Sony doesn't have DirectX, though; they can use OpenGL, or a modified version of OpenGL or Vulkan.

Please read my post within its appropriate context.

HollyGamer said:

I am talking about gaming PC GPUs available on the consumer market right now (as of right now PowerVR is not available for PC gaming and AMD does not yet have ray tracing). Microsoft is using its DirectX ray-tracing expertise with Nvidia to run on Scarlett; this is what might be inside Scarlett.

AMD does have Ray Tracing. AMD has had Ray Tracing capability for years, Ray Tracing is inherently limited by compute capabilities, you can do Ray Tracing on an Xbox 360, it just wouldn't be ideal...
nVidia's approach is different, they instead spent a large chunk of their transistor budget on specialized low-precision floating point cores to handle the task.

But that isn't the only approach that can be taken and we don't know yet if it's even the right answer, you can do integer ray tracing for example.
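To illustrate that ray tracing is "just" compute (a toy sketch, not any vendor's implementation): a ray-sphere intersection needs nothing beyond ordinary arithmetic that any shader core, or even a CPU, can execute.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for t: an ordinary quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# A ray down the -z axis hits a unit sphere centred 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Dedicated RT cores accelerate exactly this kind of intersection test (plus BVH traversal); they don't enable anything a compute unit couldn't already do, just much slower.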

Now would Microsoft be begging nVidia for Ray Tracing processing cores to put on their AMD hardware? Absolutely not.
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/

At the end of the day, the only evidence we have for anything is that RDNA2/Scarlett/Playstation 5 will feature Ray Tracing "cores". - Everything else is just assertions without evidence.
https://www.guru3d.com/news-story/geasrs-5-developper-mentions-dedicated-raytracing-cores-from-amd-in-next-gen-xbox.html

HollyGamer said:

That's why I mentioned above, they are using their "Their directX API"  for raytarcing support, because they have been working with Nvidia and AMD has not yet has the tech, Sony dont have direct X , they can use Opengl tho  support or modified version of Opengl and Vulcan 

Ultimately it doesn't matter. The Playstation 5 will be built to adhere to Microsoft's DirectX specification anyway; that is just reality, since Sony decided to adopt commodity PC components.

HollyGamer said:

They chose AMD's tessellation because AMD already had the tech. This time around AMD doesn't have the tech ready and available for 2020; RDNA, or even RDNA 2 (which probably isn't going inside the PS5 or Scarlett), isn't even confirmed to have a ray-tracing solution.

You are clearly missing the point again.
They could have opted for AMD's prior implementation of Tessellation rather than one that adhered to the Direct X specification, but they didn't, because what is the point? Why would you waste your time and money trying to reinvent the wheel when there is already a standardized design?

RDNA does have Ray Tracing; all GCN GPUs have Ray Tracing, they just don't have Ray Tracing "cores".
AMD has a patent on a Ray Tracing implementation, so if you think they have been standing around doing nothing for years... Well.
https://www.tomshardware.com/news/amd-patents-hybrid-ray-tracing-solution,39761.html

Will RDNA2 have Ray Tracing Cores? We don't know yet. We don't have the evidence to support nor deny such a thing at this point, so you are right there.

HollyGamer said:

The API is super relevant, because every console has its own API. Even the Dreamcast and PS2 ran on different APIs; hell, even the PS4 and Xbox run with different APIs. Maybe some instructions match the DirectX tessellator like you mentioned, but that doesn't mean all code and instructions run on the same API.

The API is irrelevant when the GPU designs are already built and adhere to industry standards, the GPU is still doing the same task.

HollyGamer said:

We still don't have any confirmation that RDNA 2 has a ray-tracing solution. If it does, then the PS5 and Scarlett might share the same solution. But like I said, if they had one there should have been a major leak, and forum dwellers wouldn't still be arguing over whether the PS5 has software-based or hardware ray tracing (back then they were arguing that Microsoft would have hardware support for ray tracing in the next-gen consoles and Sony would not).

Read Prior. RDNA has Ray Tracing support.

HollyGamer said:

We don't know how far the contract between Sony and AMD goes, nor how much money and time both companies are willing to invest in engineering their APUs. We have a long history from the PS3 and Xbox 360 era, when they even made special chips for their consoles; it became less custom in the PS4 and Xbox One era (due to cost), etc. It's possible to have one IP inside another IP; hell, even inside AMD's Navi alone there are different types of IP patented by AMD for their tech.

Those "special chips" were still built on established designs for another market (The PC).

HollyGamer said:

We have the proof, and it exists in the Surface that just launched today.

Citation Needed.
I would need to see an AMD chip with nVidia's tensor cores integrated to believe it.

HollyGamer said:

Can you give proof of this? Because if they had one, it should have been available on the RX 480 or RX 580 GPUs back then.

It's simple. The Xbox One X has checkerboard/sparse rendering support. - If it were Sony I.P., they wouldn't allow other vendors to use it in their hardware.
https://www.vgvids.com/exactly-xbox-one-x-checkerboard-rendering/

Rainbow 6 Siege also has Checkerboard rendering on PC. - https://www.gamasutra.com/view/news/316496/How_Rainbow_Six_Siege_was_rendered.php

You only need a couple of hardware features to have Checkerboard rendering support in hardware (ID Buffer for example.) - The rest is all in the software.
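To make that concrete, here is a toy sketch of the checkerboard idea in plain Python. Real implementations reproject the previous frame using motion vectors and the ID buffer rather than averaging neighbours; this only shows that the scheme halves per-frame shading work and fills the gaps in software.

```python
def checkerboard_frame(width, height, shade, parity):
    """Shade only the checkerboard half of the pixels for this frame."""
    frame = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == parity:
                frame[y][x] = shade(x, y)
    return frame

def reconstruct(frame):
    """Fill each skipped pixel from the average of its shaded horizontal neighbours."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            if out[y][x] is None:
                neighbours = [frame[y][nx] for nx in (x - 1, x + 1)
                              if 0 <= nx < w and frame[y][nx] is not None]
                out[y][x] = sum(neighbours) / len(neighbours)
    return out

# Only half the shader invocations per frame; the rest is interpolated.
frame = checkerboard_frame(4, 4, shade=lambda x, y: float(x), parity=0)
full = reconstruct(frame)
```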

HollyGamer said:

It's more about the implementation. The Microsoft kernel can be used to optimize code to the metal, but due to Microsoft policy (future backwards compatibility; the games should also come to Windows and Steam), it's impossible for them to run the games close to the metal.

No one builds games to the metal anymore; no one builds games in pure assembly anymore.
You can most certainly build games to the metal if you have access to it on any console, it's just pointless.

HollyGamer said:

For this you may be correct, but from what I have read, PowerVR's ray-tracing method is not that different from Nvidia's path-tracing method, so developers will not have difficulty implementing it in their games.

But we don't know if they are using PowerVR's I.P.



--::{PC Gaming Master Race}::--

Pemalite said:
The Playstation 4 and Xbox One certainly do use different APIs.

Microsoft and Sony won't even use DirectX or OpenGL for their low-level APIs, as those are inefficient high-level APIs; lower-budget/less demanding games will target OpenGL/DirectX due to how easy they are to interface and work with.

But the hardware still skins the cat exactly the same way. They aren't rendering things differently; the Playstation 4 GPU, despite not having DirectX, is still a DirectX-compliant part, and developers recognize and work with that.

And in reality the PS4 has its own tools and never used Xbox or Microsoft tools, because of legal issues etc. DirectX-compliant or not, it's not DirectX (far from it); they are just compatible and easy to port.

Even the PS4 and Xbox One have different methods for streaming assets from memory, with the PS4 using GDDR5 while the Xbox One used ESRAM and DDR3.

And we still don't know how far the customization goes for each console, and how big the differences are, even if they are using the exact same chip.

Obviously a dumbed down explanation in order not to confuse those with less technical backgrounds.
In saying that, it isn't some magical "full metal optimization".

Playstation 5 will be an architectural deviation from the Playstation 4, thus software level control is going to play a key part.

Of course it will use software as well, but it's more traditional and less complicated than you might think; it's not even on the same level as emulating console games on PC (PS2/PS3 emulation), it's more a matter of hardware compatibility. Both leaks confirmed that the PS5 APU will run at different clock speeds based on the content it is running. The logical explanation is that PS5 backwards compatibility requires the APU to run at a certain speed to match the console it emulates.

Sony also doesn't need it. Sony is relying on the *nix community.
If you think only clock speed is needed to retain compatibility... Well. You are highly mistaken.

They will combine software and hardware methods, but given what we know about how the PS4 Pro handles PS4 games, the benchmark leak, and the patent about backwards compatibility, in my opinion I'm pretty sure the PS5, or Sony in general, is handling backwards compatibility with a brute-force method.

AMD does have Ray Tracing. AMD has had Ray Tracing capability for years, Ray Tracing is inherently limited by compute capabilities, you can do Ray Tracing on an Xbox 360, it just wouldn't be ideal...
nVidia's approach is different, they instead spent a large chunk of their transistor budget on specialized low-precision floating point cores to handle the task.

But that isn't the only approach that can be taken and we don't know yet if it's even the right answer, you can do integer ray tracing for example.

Now would Microsoft be begging nVidia for Ray Tracing processing cores to put on their AMD hardware? Absolutely not.
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/

At the end of the day, the only evidence we have for anything is that RDNA2/Scarlett/Playstation 5 will feature Ray Tracing "cores". - Everything else is just assertions without evidence.

https://www.guru3d.com/news-story/geasrs-5-developper-mentions-dedicated-raytracing-cores-from-amd-in-next-gen-xbox.html

I know. Also, I never said Microsoft will fully copy Nvidia's methods; they are just using their experience applying their API to Nvidia RTX and creating their own solution. And there are many methods available for ray tracing.

The article you mentioned never confirmed that RDNA will have ray tracing; it just confirmed that Xbox Scarlett, or the next generation, will have ray tracing at the hardware level. It could be using AMD IP or even an outside solution (PowerVR, Microsoft, etc.).

Ultimately it doesn't matter. The Playstation 5 will be built to adhere to Microsoft's DirectX specification anyway; that is just reality, since Sony decided to adopt commodity PC components.

I get what you mean; it's still "DirectX compliant". But it's far from DirectX. It may have some similarities, but it's still different. You could say Vulkan can run games on PC that use the DirectX API, but in reality Vulkan is not DirectX. Or take a PC server that uses PC parts but in reality isn't a consumer PC. Even if Xbox Scarlett can run whatever the PS5 can run, in the end the results will decide which one is simpler, more effective, and does the job better. That explains why, even if they are using the same RDNA, it doesn't mean they are the same.

The same goes for the PC, PS4, and Xbox comparison. Indeed they use PC parts, but using PC parts does not make them equal to a desktop or laptop PC, or vice versa. A PC is a personal computer that uses the DirectX API to run games and is purpose-built for many aspects of life (work, entertainment, study, etc.), while a console is 90% for gaming and the rest is just entertainment.

You are clearly missing the point again.
They could have opted for AMD's prior implementation of Tessellation rather than one that adhered to the Direct X specification, but they didn't, because what is the point? Why would you waste your time and money trying to reinvent the wheel when there is already a standardized design?

RDNA does have Ray Tracing; all GCN GPUs have Ray Tracing, they just don't have Ray Tracing "cores".
AMD has a patent on a Ray Tracing implementation, so if you think they have been standing around doing nothing for years... Well.
https://www.tomshardware.com/news/amd-patents-hybrid-ray-tracing-solution,39761.html

Will RDNA2 have Ray Tracing Cores? We don't know yet. We don't have the evidence to support nor deny such a thing at this point, so you are right there.

What happened with the PS4 might be different with the PS5 in 2020; we are in a different situation today. We still don't know why Sony didn't use AMD's tessellation solution for the PS4, and no Sony engineer has explained it. They might opt for AMD's solution this time, or, if it doesn't meet their requirements (price, performance, time, etc.), Sony will probably use another method.

OK, like I said, I know GCN can do ray tracing; it just needs compute units to do the calculation. Even the PS3 can do ray tracing. The problem is that, at the current state, no RDNA design available on the market has a dedicated core to run a ray-tracing solution. Nor do we know whether RDNA 2 will use that patent you mentioned, or, even if RDNA 2 comes with dedicated ray tracing, whether the PS5 and Scarlett will be using RDNA 2 for their GPUs. But of course a custom RDNA for the PS5 or Scarlett could use some of the tech from RDNA 2.

The API is irrelevant when the GPU designs are already built and adhere to industry standards, the GPU is still doing the same task.

They are doing the same task... with different methods and solutions.

Read Prior. RDNA has Ray Tracing support.

Not with dedicated ray-tracing cores on the same level as RTX. Even GCN-based GPUs already support ray tracing.

Those "special chips" were still built on established designs for another market (The PC).

That's true, no one said otherwise. But not all custom chip designs come to market or make it to the consumer PC. We might have a PC GPU on the market that shares some similarities with a console GPU and can do the same, better, or worse, but not all customizations are needed for the PC. For example, we still haven't gotten a fully AMD APU motherboard with GDDR5 as main memory; we are still using DDR3 and, at newest, DDR4.

Citation Needed.
I would need to see an AMD chip with nVidia's tensor cores integrated to believe it.

OK, that's my mistake. It seems it's not dedicated tensor cores, or any dedicated cores, inside the Surface. But they do have a semi-custom design built for Surface: https://community.amd.com/community/amd-business/blog/2019/10/02/microsoft-takes-pole-position-in-laptops-based-on-amd-technology

It's simple. The Xbox One X has checkerboard/sparse rendering support. - If it were Sony I.P., they wouldn't allow other vendors to use it in their hardware.
https://www.vgvids.com/exactly-xbox-one-x-checkerboard-rendering/

Rainbow 6 Siege also has Checkerboard rendering on PC. - https://www.gamasutra.com/view/news/316496/How_Rainbow_Six_Siege_was_rendered.php

You only need a couple of hardware features to have Checkerboard rendering support in hardware (ID Buffer for example.) - The rest is all in the software.

I mean, checkerboard rendering can be emulated in software; no one can argue with that. Rainbow Six Siege is built on that tech, but the rendering is done with a brute-force method, like what usually happens on PC GPUs, where it can be enabled as we like. What I need is a built-in checkerboard instruction on Polaris, because at the software level even Nvidia GPUs can run Rainbow Six Siege's checkerboard rendering.


No one builds games to the metal anymore; no one builds games in pure assembly anymore.
You can most certainly build games to the metal if you have access to it on any console, it's just pointless.

First parties and exclusives do this, especially Sony. It's not pointless: middleware is good if you want a general look and a decent outcome at relatively cheap cost, but special optimization is better in every aspect except the time it consumes.

But we don't know if they are using PowerVR's I.P.

I am just guessing here; it's just my opinion and my deduction based on the name "Prospero". A wizard, and the PowerVR Wizard, fit the name and the current situation regarding ray tracing. But if AMD indeed has a real, concrete solution for dedicated RT cores, then they might be using AMD's.

Last edited by HollyGamer - on 05 October 2019

HollyGamer said:

 


OK, let's begin the discussion.

Nice post, but you missed a few things. First, Gonzalo showed a CPU clocked at 3.2 GHz; Komachi must have missed that, but I'm sure he has made several tweets about Gonzalo being 3.2 GHz, and the latest Gonzalo leak in March showed the GPU clocked at 1.8 GHz.

You missed the FLUTE leak:

Which again is a 3.2 GHz CPU, but FLUTE had more info: it showed 16 GB of GDDR6 VRAM in clamshell mode (16 chips). I assume Sony is doing clamshell because high-speed 2 GB VRAM chips are not ready for mass production. Its memory speed was 512+ GB/s (the benchmark was deleted).
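The quoted 512+ GB/s figure is at least self-consistent: 16 GDDR6 chips in clamshell would share a 256-bit bus (two chips per 32-bit channel), and peak bandwidth is just bus width times per-pin data rate. The 256-bit bus and 16 Gbps pins here are assumptions chosen to match the leak, not confirmed specs.

```python
def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
    return bus_width_bits * data_rate_gbps / 8

# 16 chips x 1 GB each = 16 GB; at 16 Gbps per pin on a 256-bit bus:
print(gddr6_bandwidth_gbs(256, 16))  # 512.0 GB/s
```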

Redgamingtech asked Komachi if he had more info on the OBERON leak: "BC1 is 40CU, BC2 is 18CU, and native is unknown. Perhaps BC means Back Compatible."

http://www.redgamingtech.com/playstation-5-gpu-specs-leak-2ghz-backwards-compatibility-mode/

Now, BC2 probably means PS4 backwards compatibility, but BC1 doesn't fit PS4 Pro BC (the PS4 Pro has 36 CUs), so maybe BC1 means a boost-mode BC similar to what the PS4 Pro has. And is it utilizing all CUs in the PS5? When Oberon leaked it didn't make any sense, as 2 GHz is just too high a clock speed, and 40 CUs is not possible based on the two Navi layout pictures AMD has released: one showing only 36/52 CUs, the other showing that 36/44/52 CUs are possible (I'm assuming Sony will disable 4 CUs).

But we had a verified insider saying that both next-gen consoles will be over 10 TF, and 40 CUs clocked at 2 GHz is 10.2 TF. Leaks suggest Navi 14 is 22 CUs, and that this is the full GPU, not a cut-down, which would mean those layout pictures AMD released are dummies/fakes and 36/40/44/48/52 CUs are all possible for next-gen. This GPU is supposed to be revealed tomorrow, so if Navi 14 is indeed 22 CUs and that is the full GPU, then the PS5 might indeed be 40 CUs clocked at 2 GHz. Still, we should be sceptical, as 40 CUs clocked at 2 GHz pulls about 250 W.
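The 10.2 TF figure follows directly from the standard GCN/RDNA FP32 throughput formula (64 shaders per CU, 2 FLOPs per shader per clock via fused multiply-add); the same formula reproduces the PS4 Pro's figure:

```python
def gcn_tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_shader=2):
    # FP32 throughput: CUs x shaders x 2 ops (FMA) per clock, in TFLOPS
    return cus * shaders_per_cu * flops_per_shader * clock_ghz / 1000

print(gcn_tflops(40, 2.0))    # 10.24 -> the "10.2 TF" figure
print(gcn_tflops(36, 0.911))  # ~4.2  -> PS4 Pro
```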

Funny thing is, GONZALO, FLUTE and OBERON all relate to Shakespeare.

Last edited by Trumpstyle - on 06 October 2019

6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Trumpstyle said:

Which again is a 3.2 GHz CPU, but FLUTE had more info: it showed 16 GB of GDDR6 VRAM in clamshell mode (16 chips). I assume Sony is doing clamshell because high-speed 2 GB VRAM chips are not ready for mass production. Its memory speed was 512+ GB/s (the benchmark was deleted).

16 Gigabit chips have been available for a while now.
https://news.samsung.com/global/samsung-electronics-starts-producing-industrys-first-16-gigabit-gddr6-for-advanced-graphics-systems
https://techreport.com/news/33129/samsung-fires-up-its-foundries-for-mass-production-of-gddr6-memory/

At the end of the day, the current "hardware" on the market in the form of dev kits or concept designs may not be representative of next-gen hardware.

Before this current generation of consoles there were all sorts of hardware configurations floating around in dev kits which weren't representative of the final console hardware; just keep that in mind.

Trumpstyle said:

Redgamingtech asked Komachi if he had more info on the OBERON leak: "BC1 is 40CU, BC2 is 18CU, and native is unknown. Perhaps BC means Back Compatible."

http://www.redgamingtech.com/playstation-5-gpu-specs-leak-2ghz-backwards-compatibility-mode/

Red Gaming Tech is probably an outlet we should steer clear of.




--::{PC Gaming Master Race}::--