fatslob-:O said:
Pemalite said:
nVidia does not have APUs. It's a marketing term strictly limited to AMD. nVidia does not have x86 SoCs, as nVidia does not have an x86 license.
|
You don't exactly need an x86 patent, but you might need the patents covering x86 extensions for it to be useful ... (someone needs to remind me when the patents for x86-64 and SSE2 expire, as patents only last two decades)
Nvidia already designed an x86 processor before, but then so did tons of others ...
|
You need an x86 license that allows you to use x86 extensions.
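The patent-term arithmetic mentioned above is simple to sketch. The filing years below are assumptions, not looked-up facts: both AMD64 and SSE2 shipped around 2000, so the underlying patents would presumably have been filed in the late 1990s.

```python
# Rough sketch of the "patents last 2 decades" arithmetic.
# US utility patents run 20 years from the filing date.
PATENT_TERM_YEARS = 20

# Assumed filing years (hypothetical -- chosen only to illustrate the math):
assumed_filing_years = {
    "x86-64 (AMD64)": 1999,  # assumption: filed before the 2000 spec release
    "SSE2": 1999,            # assumption: filed before the Pentium 4 launch
}

for name, filed in assumed_filing_years.items():
    expiry = filed + PATENT_TERM_YEARS
    print(f"{name}: expires around {expiry}")
```

Under those assumed filing dates the core patents would lapse around the end of the 2010s, which is presumably why the question keeps coming up.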
fatslob-:O said: It should be but since graphics programmers are obsessed with achieving real time physically based ray tracing, we probably won't see a resolution increase ... |
I think we're still a couple of generations away from that.
fatslob-:O said: Hopefully GCN is amenable to ISA extensions. It's almost as if Sony accounted for the fact that newer GCN microarchitectures would have changed microcodes in their shader compiler, and thus they got double-rate FP16 in the end with an ISA between GCN3/5 ... (It's good that Sony didn't lock their shader compiler down to a specific GPU ISA microcode, but they're still probably stuck with AMD GPUs that offer similar functionality) |
Graphics Core Next is an extremely modular design anyway, which is why AMD has been able to make constant iterative updates to parts of the architecture with minimal effort/cost whilst leaving the rest of the architecture identical to its prior versions.
It's also why they can take parts of newer architectures and add them to older designs, as they did with the Xbox One X and Playstation 4 Pro.
Vega is supposed to be the largest deviation from prior Graphics Core Next designs, with massive backend overhauls.
So if Sony wanted to, they could take a theoretical Graphics Core Next 6/7 design and regress say... The shaders to be compliant at a hardware level to Graphics Core Next 1.0 if they wanted.
But I believe that would be a silly approach to take. There are ways around such hardware incompatibilities using software-based approaches.
Bofferbrauer2 said:
1. The Xbox One X is still using Jaguar because it's an incremental update of the Xbox One. While I was hoping for Ryzen in the Xbox, I certainly didn't expect it. The next-gen consoles certainly won't use this Ryzen but a future generation of Ryzen; it will nonetheless be Ryzen, as AMD won't have a new architecture anytime soon
|
Clearly then my statement wasn't directly applicable to you. But you cannot deny that overwhelming numbers of people were fapping at the idea of Ryzen in the Xbox One X.
Bofferbrauer2 said:
4. Yes, Moore's Law applies to any kind of microchip. But at the time of the PS360's release, 1GiB of RAM was the usual amount on a gaming PC. I can still remember the outcry when it was revealed that the PS3 would only come with 256 MiB of RAM and how that wouldn't be enough to fuel Cell. The desaturated, grey-brownish coloured games that followed are a testimony that this just wasn't enough even at its inception. The same happened when the current gen launched (though to a much lesser degree) and again with their upgrades.
|
The Playstation 3 didn't come with 256MB of RAM. It came with 512MB in total: 256MB of XDR main memory plus 256MB of GDDR3 video memory. The Xbox 360 likewise had 512MB, as a unified pool.
Remember, the Xbox 360 released in 2005. Windows XP reigned supreme, and the Pentium 4/Pentium D/Athlon XP/Athlon 64 X2 were the CPUs of choice.
Low-end gaming rigs back then came with 512MB of RAM, mid-range 1GB, high-end 2GB. And games of that era reflected that, with titles like Oblivion, Half-Life 2 and Doom 3 working on 512MB systems.
But that is all irrelevant. It isn't representative of what we have today.
Today a low-end gaming rig would come with 8GB of RAM. That's system RAM. Then you can add the gigabytes of GDDR5 memory from the graphics card to that pool too.
Bofferbrauer2 said:
You need more RAM for 4K? Really? Tell that to the console manufacturers, because the PS4 Pro doesn't have more RAM and the additional RAM on the X is eaten up by its gargantuan OS (5 GiB for the OS on a console? Really, Microsoft???). (Before you make a comment about it: yes, I know higher screen resolutions need more (V)RAM.)
|
Yes, really. Higher resolutions, higher-quality textures, more and higher-quality meshes and so on all require more RAM.
There are multiple reasons why the Playstation 4 Pro and Xbox One X cannot reliably achieve 4K. RAM capacity, memory bandwidth, CPU performance and GPU performance are all factors.
The Xbox One X has a massive increase in available memory for games over the Xbox One, Playstation 4 and Playstation 4 Pro for a damn specific reason.
And it's great that you recognize that higher screen resolutions need more VRAM. Because VRAM and system RAM are the same pool on a console. ;)
Are you just arguing with me for the sake of arguing?
Bofferbrauer2 said:
The next-gen consoles won't be able to push 4K much better than the current upgrades; most games will just be upscaled or checkerboarded in some fashion unless they turn down the details compared to the PC releases.
|
I cannot in good conscience agree with this. If you think the capability of graphics hardware is going to remain constant for the next several years when there is industry-wide pressure to push for 4K, then you are highly mistaken.
Bofferbrauer2 said:
This is due to consoles having to choose mid-range graphics chips due to price and TDP constraints, and I doubt those will be much better at 4K than they are right now. Worse, as the consoles start to age they will inevitably drop back down to 1080p or even less in the most demanding third-party titles.
|
Correct. Consoles do have to choose cost-sensitive parts for obvious reasons.
However, today's high-end level of performance becomes tomorrow's mid-range, not always in sheer theoretical numbers like flops, but through efficiency gains from reworked/new architectures that bring improvements such as compression and culling. Which is why the Radeon RX 480, with roughly the same theoretical flops as the Radeon R9 390, can beat it, all with roughly 200mm2 taken off the die size, around a third less memory bandwidth and roughly 125W less power consumption.
GDDR6 should help bring such capabilities to the mid-range. I don't think you fully comprehend how long 3 years is in GPU development and what it means for GPU gains, especially as we look towards 12nm/10nm/7nm/5nm fabrication processes.