chakkra said:
Pemalite said:
It's really too early to tell what the fundamental differences are between RDNA 1.0 and 2.0 from a low-level perspective anyway.
|
If we forget for a moment about PS5 vs XSX (which is the part that makes everybody dig in their heels) and just focus on RDNA2 vs RDNA1, there have been a few articles outlining the differences between them.
"Ever since AMD announced the RDNA2 architecture, they have reiterated a singular goal: they wanted to achieve a 50% jump in perf-per-watt over RDNA1. And that they would accomplish it entirely with architectural improvements, not process improvements..."
"Along with numerous optimizations to the power efficiency of their GPU architecture, RDNA2 also includes a much-needed update to the graphics side of AMD’s GPU architecture. RDNA (1), though a massive replumbing of the core compute architecture, did not include any graphics feature upgrades. As a result, AMD only offered a DirectX feature level 12_1 feature set – the same as the Radeon RX Vega series – at a time when NVIDIA was offering ray tracing and the other features that have since become DirectX 12 Ultimate"
https://www.anandtech.com/show/16202/amd-reveals-the-radeon-rx-6000-series-rdna2-starts-at-the-highend-coming-november-18th/2
|
I am talking low-level stuff, not the marketing fluff laid on top; that is for those who don't delve into the hardware nitty-gritty.
chakkra said:
And about Primitive Shaders and Mesh Shaders being exactly the same, not quite.
"A Mesh shader is a new type of shader that combines vertex and primitive processing. VS, HS, DS, and GS shader stages are replaced with Amplification Shader and Mesh Shader. Roughly, Mesh shaders replace VS+GS or DS+GS shaders and Amplification shaders replace VS+HS."
https://microsoft.github.io/DirectX-Specs/d3d/MeshShader.html
"Mesh shaders represent a radical simplification of the geometry pipeline. With a mesh shader enabled, all the shader stages and fixed-function features described above are swept away. Instead, we get a clean, straightforward pipeline using a compute-shader-like programming model. Importantly, this new pipeline is both highly flexible—enough to handle the existing geometry tasks in a typical game, plus enable new techniques that are challenging to do on the GPU today—and it looks like it should be quite performance-friendly, with no apparent architectural barriers to efficient GPU execution."
https://www.starcitizen.gr/2642867-2/
|
They are the same from a user standpoint. One is essentially a marketing term from nVidia, the other a marketing term from AMD.
With AMD's set-up you have to assemble the input in a pre-defined format (vertices plus vertex indices) sequentially, whereas with a mesh shader the input is entirely user-defined and thus not bound by the input-assembly stage.
From a developer/hardware point of view, the advantages of AMD's approach of in-driver shader transformation are limited compared to full mesh shader support, as programmability is sacrificed.
But for all intents and purposes, they are the same.
Are there more differences? Of course. But they set out to offer the same result, and end users aren't going to care about the finer points.
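The input-assembly difference described above can be sketched as a toy model in plain Python (this is not GPU code or a real graphics API; all function names here are illustrative). The legacy/primitive-shader path consumes a fixed vertex-buffer-plus-index-buffer format, while a mesh-shader-style path runs user code per workgroup that can generate geometry however it likes:

```python
def legacy_path(vertices, indices):
    """Fixed-function input assembly: the hardware walks the index
    buffer and hands the shader pre-assembled triangles."""
    return [tuple(vertices[i] for i in indices[t:t + 3])
            for t in range(0, len(indices), 3)]

def mesh_shader_path(emit_meshlet, workgroup_ids):
    """Mesh-shader-style: each workgroup runs user code that decides
    for itself what geometry to produce (procedural, compressed,
    pre-culled, ...) with no mandated input format."""
    meshlets = []
    for wid in workgroup_ids:
        meshlets.extend(emit_meshlet(wid))
    return meshlets

# A user-defined generator: one procedural triangle per workgroup,
# with no vertex or index buffers involved at all.
def procedural_strip(wid):
    x = float(wid)
    return [((x, 0.0), (x + 1.0, 0.0), (x, 1.0))]

tris_legacy = legacy_path([(0, 0), (1, 0), (0, 1)], [0, 1, 2])
tris_mesh = mesh_shader_path(procedural_strip, range(3))
```

The point of the sketch is only the shape of the two interfaces: `legacy_path` is locked to the vertices+indices format, while `mesh_shader_path` accepts arbitrary user logic.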
AsGryffynn said:
Pemalite said:
In short... Please take the 350w vs 320w PSU as irrelevant, because ultimately it is.
|
Not necessarily. Most of the loss in power does translate into... well... heat. If a lighter supply is feeding a more powerful machine, there's less heat generation from the PSU. In other words... if the PS5 does have a more powerful unit, it might be generating more heat.
|
I am not saying they are 400W units, just that the wattage a unit might have stickered onto the side is irrelevant.
AsGryffynn said:
Pemalite said:
In short... Please take the 350w vs 320w PSU as irrelevant, because ultimately it is.
|
Not necessarily. Most of the loss in power does translate into... well... heat. If a lighter supply is feeding a more powerful machine, there's less heat generation from the PSU. In other words... if the PS5 does have a more powerful unit, it might be generating more heat.
|
I already touched on thermodynamics and energy-conversion efficiency, which includes heat.
And it's not just about a more powerful PSU vs a weaker PSU. It comes down to energy-conversion efficiency: higher efficiency means less heat wasted during conversion.
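The arithmetic behind that point is simple: the heat dissipated inside a PSU is the wall draw minus the DC power delivered, and the wall draw is the DC load divided by efficiency. A minimal sketch (the numbers are illustrative, not measured figures for either console):

```python
def psu_heat_watts(dc_load_w, efficiency):
    """Heat dissipated inside the PSU for a given DC load.
    wall_draw = dc_load / efficiency; heat = wall_draw - dc_load."""
    wall_draw = dc_load_w / efficiency
    return wall_draw - dc_load_w

# The same 200 W DC load at two hypothetical efficiencies:
heat_90 = psu_heat_watts(200.0, 0.90)   # ~22 W wasted as heat
heat_85 = psu_heat_watts(200.0, 0.85)   # ~35 W wasted as heat
```

A five-point efficiency difference costs roughly 13 W of extra heat at this load, which is why conversion efficiency matters more than the rated wattage on the sticker.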
drkohler said:
Actually, the exact opposite could be true.
If a 350W ps operates in its "comfort zone" and a 320W ps operates slightly above its "comfort zone", then the latter ps generates more heat.
However, the whole thing depends on the voltage regulator circuitry. Contrary to what you seem to think, the ps does not feed "the machine"; it feeds the voltage regulator circuitry (where much more heat is generated than in the ps). The vrc is where money vs heat is traded off at design time (more phases = less heat, but higher cost).
Nothing has been revealed about where the ps's operate or how the vrcs are built, so any discussion about "this console is better than that one" is pointless.
|
My point from the very beginning has been that the hardware needs to actually be tested before anything definitive can be asserted.
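The "comfort zone" point above can be sketched numerically. Assuming a toy efficiency curve that peaks mid-load and falls off toward full load (a purely illustrative shape, not a datasheet curve for any real supply), the same system load can make the smaller unit waste more heat than the larger one:

```python
def efficiency(load_fraction):
    """Toy efficiency curve: peaks around 55% of rated load and
    drops toward the extremes. Illustrative only."""
    return 0.92 - 0.5 * (load_fraction - 0.55) ** 2

def psu_heat_w(dc_load_w, rated_w):
    """Heat dissipated in the PSU at a given DC load and rating."""
    eff = efficiency(dc_load_w / rated_w)
    return dc_load_w / eff - dc_load_w

# The same hypothetical 300 W system load on two supplies:
# the 350 W unit sits at ~86% of its rating, the 320 W unit at ~94%.
h_350 = psu_heat_w(300.0, 350.0)
h_320 = psu_heat_w(300.0, 320.0)
```

Under these assumed curves the 320 W unit, pushed further past its sweet spot, dissipates more heat than the 350 W unit despite being the "lighter" supply, which is exactly why the sticker wattage alone settles nothing.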
AsGryffynn said:
I'm going out on a limb and saying they are both operating at normal capacity.
Well, at that point we're dealing with internal components, so our guess will pretty much be moot unless we get someone like Louis Rossmann to tear both consoles down and look at them on a component level. I do assume that they are going to be fairly similar if not identical, however. If this is the case, then the ball falls back on the PSU's energy loss and not the heat generation of the SoC (there's a good reason why the CPU and PSU get the cooling on the spot, but then again, we're dealing with APUs, so presumably the heat is going somewhere else).
|
As years have gone by, power supplies have gotten more efficient... Many PC PSUs won't actually spin up their fan until the unit exceeds a certain temperature threshold anyway.
Traditionally, PCs have also used the PSU to exhaust the case's air as well.