alexxonne said:
Pemalite said:

You actually know nothing.

You say you are a pc gamer, yet still don't understand lots of things.

If you are going to quote me, you need to actually quote my statements rather than drive your own false narrative with fake quotations.

I never said "You actually know nothing."

alexxonne said:

Legacy software is run by using the new routines based on cycles (MHz) to match the older hardware. When the legacy software is incompatible or unstable it just doesn't work. It is the sole reason modern CPUs can be 10 times more powerful yet they run old software like crap. You need to equalize processing cycles to achieve true legacy interpretation. And not just the cycles, the instruction sets and features, all of them interpreted for a true legacy solution. Having multi-core and multi-threaded processing helps, but it has been the sole reason for the difficulties in building emulation profiles to run 360 games on the Xbox One. They needed to alter game routines in order to run them on a lesser cycle-based processor, and test them on a one-by-one basis; it is a very time-consuming effort. As a PC gamer you should know that, VERY WELL.

Apparently you don't have an understanding of how backwards compatibility is achieved on the Xbox One.
If you think Microsoft is doing pure emulation... You are highly mistaken.

The rest of your ramble needs some citations, otherwise I am discarding the lot.

But hey, if you believe that MHz is the sole driver for performance... I am about to school you rather excessively on why the Pentium 4 and AMD Bulldozer flopped so hard despite having more MHz.
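As a rough illustration of why (the IPC and clock figures below are invented for the example, not measured numbers for those chips): effective throughput is roughly instructions-per-clock times clock, so a long-pipeline, high-clock design can still lose to a wider core at a lower clock.

```
# Rough sketch: effective throughput ~ instructions-per-clock (IPC) x clock.
# The IPC values below are invented for illustration, not real benchmarks.
def effective_perf(ipc, clock_ghz):
    """Approximate instructions per second, in billions."""
    return ipc * clock_ghz

deep_pipeline_high_clock = effective_perf(ipc=0.8, clock_ghz=3.8)  # long pipeline, high clock
wide_core_lower_clock    = effective_perf(ipc=1.6, clock_ghz=2.4)  # shorter pipeline, better IPC

print(deep_pipeline_high_clock)  # 3.04
print(wide_core_lower_clock)     # 3.84 -> wins despite ~1.4 GHz less
```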

alexxonne said:

GPU architecture design is dependent on patents, rights and other things. Microsoft needed a solution to survive their 2nd generation hardware because they failed with the OG Xbox, and then designed the 360 to emulate Xbox games by brute-force performance and command interpretation, not legacy. In ATI's favor (now AMD if you didn't know), the OG Xbox GPU was just a modified GeForce 3 card. Microsoft's NT-based OS for the OG Xbox essentially converted all of the games for it into simple DirectX-based routines (hence the X-box, dumbass). ATI managed to assimilate lots of the routines, but not all of them. They are the same problems and issues encountered when using and building emulators. You use a different and superior CPU to emulate a lesser CPU, but since processing cycles dominate half of the performance issues you need a faster (in cycles) processor and a better one (in architecture). Just try comparing an airplane vs a cargo ship. A cargo ship has more capacity while the airplane has less, but one carries its cargo faster. Old games were built with those differences in mind. Another issue is excessive performance; just take the PS2 for example. It doesn't like to read games from a hard drive, because it is just too fast compared to the optical drive. Lots of PS2 games were built using the native PS2 CD drive reading speed, so when you load games using a hard drive, most of them crash if you don't toggle the options to cap the transfer rate. Try reading about emulation and backwards compatibility. It's just common sense.

Jesus Christ.
Yes you are right, GPU architectures are laden with patents, rights and other things, but so what? Nothing here actually contradicts my statements.

alexxonne said:

Publisher rights for backwards compatibility are an issue from this generation, not older ones. This makes you a kid, by not knowing that. Not having your game running on the newer console back then (~2000-2008) was seen as a failure for not reaching the entire gaming market, so most publishers were very supportive and some were forced (business-wise) to help Sony and Microsoft by having their games running in BC mode. The PS3's sales failure at the beginning of the last generation (2006-2008) forced them to abandon BC and old game sales collapsed somehow, and the monetization scheme for BC changed into the "buy again" model this generation, mostly on Sony's side, while Microsoft absorbed their implementation. Essentially you buy the same game you have, but with a custom-made profile to run on the newer console. Other options such as porting and remastering were used, the latter being the better approach for that kind of compatibility.

Do not call me a kid. Consider this a warning.

Publisher rights for backwards compatibility were an issue for the older generations as well. It's an issue whenever a software approach to backwards compatibility is used... Which is why the entire Original Xbox library wasn't backwards compatible on the Xbox 360 or Xbox One; Microsoft needed publisher permission on both platforms.

Sony's false start with the Playstation 3 has nothing to do with any of that.

alexxonne said:

For true backwards compatibility you need either a previous-generation console CPU/GPU chip embedded alongside the new console architecture, or a legacy approach using a software interpreter that can translate command by command. An interpreter is what was used for PS1 BC on the PS2 and PS3. The code is already built, no need to re-invent the wheel. The same goes for the PS2, up to an extent. But the PS3 is a massive and complicated system that needs lots of brute-force processing. BC could have been announced for PS1/PS2 games without any issues; even the PS4 has a built-in PS2 emulator that they use for the PS2 Classics, on a game-by-game basis.

False.

Ironically... People like to assert that the Playstation 3 is this "massively complex machine" that requires "brute force processing" in order to emulate... Yet the Playstation 3 emulator RPCS3 is better than the Xbox 360 emulator not just in terms of compatibility... But performance as well!

Kinda' contradicts your entire position and all.

For true backwards compatibility you do not need the previous generation's CPU/GPU chip; it helps, but it's not a requirement.
A successor chip which retains the same instruction set is sufficient.

alexxonne said:

For a 4K gaming machine it will work great, but not for what we hoped for. This is almost a Slim PS5 solution, not a robust one-console solution. And until they finally launch the console, I will be hesitant about it because it simply doesn't make any sense. Sony built the PS5 for a $399 price point. Whether they lose some money or not by launch is another thing. But that is the reality of it. For a true PS4 successor Sony needed a machine in the same range as or above the Xbox SX. I hate to see resources and money spent on an audio chip that, no matter how great it can be, will not change a bit how games are played. Audio immersion is so diluted and diversified in gaming that by the time the PS5 arrives a gunshot, a punch, a crash, a scream will sound the same as on the Xbox SX, due to the use of the same crappy TV speakers everyone uses. 4K gaming is about 4K, not audio. That was a misfire from Cerny. Love the man, but he's out of his zone already.

Citations needed.

Don't take audio lightly, lots of people have decent audio setups and don't use crappy TV speakers.

Positional 3D audio can also assist those who are just using something as basic as headphones anyway.


Microsoft is taking 3D positional audio seriously as well... In fact, 3D positional audio took a backseat with the 7th and 8th gen consoles... The Original Xbox with its SoundStorm chip had impressive capabilities for the era.

alexxonne said:

Cerny was the underdog and lost, that was my point. He needed to counteract Microsoft's approach and the recent news of the new generation console, but simply stating teraflops aren't equal and aren't relevant just undermined his own credibility. It's the same failed PS4 Pro tactic as when he tried to prove that checkerboard rendering was of equal quality to native 4K rendering.

I don't give two shits about Phil, Cerny or whoever else does the PR for these companies.

Teraflops aren't equal or relevant, I have been saying that for years on these forums.

alexxonne said:

And as a PC gamer you should know very well that a teraflop is a theoretical performance measurement of Fp16/32 integers based on a specific hardware architecture, and will never be the same if the architecture changes. HOW THE HELL DO YOU NOT KNOW THIS? You need to study, and fast.

The teraflop figures propagated by various individuals are theoretical peak numbers.

Teraflops that are actually measured in real-world scenarios have a degree of legitimacy, depending on the task.

Teraflops aren't "FP16/32 integers". It's floating point, not integer. Wow.

A teraflop is a count of floating-point operations per second, and that is the same regardless of architecture, whether it's AMD, nVidia, IBM or Intel.
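For reference, the commonly quoted figure is just peak single-precision throughput: shader count × 2 FLOPs per clock (a fused multiply-add) × clock speed. A quick sketch of that arithmetic (the 2304-shader, 1.8 GHz part below is just an example input, not any particular product):

```
def peak_fp32_tflops(shader_cores, clock_ghz, flops_per_clock=2):
    """Theoretical peak FP32 throughput: cores x 2 FLOPs per clock (FMA) x clock."""
    return shader_cores * flops_per_clock * clock_ghz / 1000.0

# Same formula for any vendor; only the inputs change.
print(peak_fp32_tflops(2304, 1.8))  # ~8.29 "paper" TFLOPS for a 2304-shader part at 1.8 GHz
```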

alexxonne said:

True, teraflops don't mean a thing if the hardware architecture is different, but both consoles have already announced they will use the same RDNA 2 architecture, just different configurations (CUs and clocks). Any other difference would be less tangible.

Just because they are both RDNA2 doesn't mean their real-world teraflops are equivalent; there may be other bottlenecks in the design due to different clock rates, functional units, or external factors coming into play (i.e. bandwidth).
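To put rough numbers on that, using the publicly announced configurations (36 CUs at up to 2.23 GHz and 448 GB/s for the PS5, 52 CUs at 1.825 GHz and 560 GB/s on the fast memory pool for the Series X), here is the back-of-the-envelope arithmetic. Note how the bandwidth-per-teraflop ratios differ; that is exactly the kind of factor that stops paper TFLOPS mapping 1:1 to delivered performance.

```
def rdna_tflops(cus, clock_ghz):
    # 64 shaders per RDNA CU, 2 FLOPs per clock (FMA)
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = rdna_tflops(36, 2.23)    # variable clock, up to 2.23 GHz -> ~10.28 TFLOPS
xsx = rdna_tflops(52, 1.825)   # fixed clock -> ~12.15 TFLOPS

ps5_bw, xsx_bw = 448, 560      # announced GB/s (Series X figure is its fast 10 GB pool)

print(f"PS5: {ps5:.2f} TF, {ps5_bw / ps5:.1f} GB/s per TF")   # ~43.6 GB/s per TF
print(f"XSX: {xsx:.2f} TF, {xsx_bw / xsx:.1f} GB/s per TF")   # ~46.1 GB/s per TF
```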

alexxonne said:

Don't get me wrong, I LOVE Sony, but since the PS4 Pro I'm hesitant to buy anything because of the approach Sony has been using since then. But loving a brand involves criticism, not blindness.

Your love for any company is irrelevant... And with all due respect... I honestly don't care.

alexxonne said:

I recommend you read/watch about:

- Legacy software

- Emulation / Interpreters

- Fp16/32 integers calculation and benchmarking

- Backwards compatibility PS1/PS2/OG XBOX

- History of Game Consoles

- PC hardware (CPU, GPU, Memory)

you get it.

I think you may need to do some research if you think FP16/32 is integer.

And on the front of Emulation... You should probably look up Binary Translation, Abstraction, Virtualization, Code Morphing and so forth.
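To illustrate the difference at a toy scale (this is a made-up three-instruction machine, nothing to do with any real console ISA): an interpreter decodes every instruction each time it runs, while a binary translator converts a block once and then re-runs the translated form, which is a large part of why modern backwards compatibility isn't "pure emulation".

```
# Toy guest program for an invented 3-instruction machine.
PROGRAM = [("load", 5), ("add", 3), ("mul", 2)]

def interpret(program):
    """Interpreter: decode and dispatch every instruction on every run."""
    acc = 0
    for op, arg in program:
        if op == "load":
            acc = arg
        elif op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
    return acc

def translate(program):
    """Binary-translation sketch: convert the guest block to host code once,
    then re-run the translated form without per-instruction decoding."""
    body = ["def _block():", "    acc = 0"]
    for op, arg in program:
        body.append({"load": f"    acc = {arg}",
                     "add":  f"    acc += {arg}",
                     "mul":  f"    acc *= {arg}"}[op])
    body.append("    return acc")
    scope = {}
    exec("\n".join(body), scope)
    return scope["_block"]

print(interpret(PROGRAM))    # 16
print(translate(PROGRAM)())  # 16, but the decode cost was paid only once
```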

alexxonne said:

**As a side note... I don't know why Sony (Microsoft too) doesn't just let people decide what content they want. They can use their generic emulators (PS2/PS1) with the respective compatible games. An option for developers is to charge a premium for advanced solutions like internal resolution scaling or higher resolution texture packs. There are options. Crowdfunding a title so it can be compatible, or establishing funding goals to achieve compatibility options. There are lots of ideas and ways that can benefit users and developers alike. Imagine if you had the first Gran Turismo for PSX, with the option to be internally rendered at 4K with 4K textures, texture filters and AA options, all of it for just $1.99/2.99. If you don't pay then you run the game exactly as on the PS1, the vanilla version. These features normally are just options in the emulation profile and it doesn't involve lots of money or time. Let's say 100,000 people buy the upgraded option; surely it will pay for the effort and revitalize monetization of old games versus full remasters. Leave the user that bought the upgrade to turn the individual features on/off at will and decide what kind of experience they want. Maybe they can offer a backwards compatibility option at a price: if you own the game pay 3.99, full game 9.99. Take for example Code Veronica for PS4, it's just an emulated PS2 game with some advanced options turned on. I own the game for PS2, still I had to buy it (2.99 if I remember well). OK. Why not offer me for free, or at a very low price, the vanilla version with the low resolution and assets of the original, and/or offer me the advanced options as a side offer for an additional price? That is pro-consumer: you give power and options to the user, and with it additional income to the developers for those games not currently available.**

Because licenses.

All of what you propose (upscaling) is generally free on other platforms. (PC.)

DonFerrari said:

Small correction, Cerny confirmed some data in Spider-Man was duplicated over 100 times on the HDD. They were very explicit that game size would be reduced with the SSD solution.

Got a source?

EricHiggin said:
Pemalite said:

Citation needed.

Nope. AMD invented it for its notebook APUs. It's just better sharing of TDP between the CPU and GPU to drive up clocks... It's just a more refined version of the technology that was in Raven Ridge in 2018.

25:00 - 26:26

Cerny does say 'discrete GPU products around the time PS hardware comes out'. That would likely mean SmartShift isn't one of them, as it's already out.

Seems Cerny is stating that they bring forth "concepts", while AMD actually builds and invents the tech.
We actually saw that with Graphics Core Next when Sony pushed harder for higher ACE unit counts... Which assisted heavily with asynchronous compute.
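On the TDP-sharing point quoted above, here is a very rough conceptual sketch of the idea (this is not AMD's actual SmartShift algorithm, and all the wattage figures are made up): a shared power budget is shifted toward whichever block is busier, letting it clock higher while the other backs off.

```
# Conceptual sketch only -- not AMD's actual SmartShift algorithm or API.
TOTAL_BUDGET_W = 200             # hypothetical shared CPU+GPU budget
CPU_MAX_W, GPU_MAX_W = 60, 180   # hypothetical per-block ceilings

def split_budget(cpu_load, gpu_load):
    """Split the shared budget in proportion to load, clamped to each ceiling."""
    total_load = max(cpu_load + gpu_load, 1e-6)
    cpu_w = min(CPU_MAX_W, TOTAL_BUDGET_W * cpu_load / total_load)
    gpu_w = min(GPU_MAX_W, TOTAL_BUDGET_W - cpu_w)
    return cpu_w, gpu_w

print(split_budget(cpu_load=0.2, gpu_load=0.9))  # GPU-heavy scene -> GPU gets most of the budget
print(split_budget(cpu_load=0.8, gpu_load=0.4))  # CPU-heavy scene -> power shifts back to the CPU
```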

DonFerrari said:

The only question on the 'sprint all day' claim is how trustworthy Cerny is when he says the system was designed to always be in boost mode, that it won't overheat, that power consumption is always the same, and that the heat is already covered. Just that there will be some small percentage trade-off between the load on the CPU and GPU (saving like 10% power for a small decrease in frequency).

Also, the other aspect that needs more detail on the architecture is how much the extra frequency will help the PS5 against the choice of fewer CUs.

We simply don't know yet.
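The one bit we can already put rough numbers on is the power/frequency trade-off itself. As a rule-of-thumb approximation (not Sony's actual power model), dynamic power scales roughly with frequency times voltage squared, and voltage tends to track frequency, so power falls much faster than clock speed; that is how a small clock reduction can claw back something like 10% power.

```
# Rough approximation, not Sony's actual power model:
# treat power as scaling ~ frequency^3 (P ~ f * V^2, with V roughly tracking f).
def relative_power(freq_scale, exponent=3.0):
    return freq_scale ** exponent

for drop in (0.01, 0.02, 0.035):
    scale = 1.0 - drop
    print(f"{drop:.1%} lower clock -> ~{1 - relative_power(scale):.1%} less power")

# 1.0% lower clock -> ~3.0% less power
# 2.0% lower clock -> ~5.9% less power
# 3.5% lower clock -> ~10.1% less power
```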

Last edited by Pemalite - on 20 March 2020

--::{PC Gaming Master Race}::--