Pemalite said:
taus90 said:
What does it have to do with the post, other than proving my point? A slight increase in CPU clock speed doesn't improve theoretical performance in a closed-API environment, as simple as that. I'm not a hardware guy; I develop games, and I write mock code to test a console's performance ceiling. If people think the Xbox X having a slightly higher clock speed than the Pro will give them 60fps on a 4K buffer, they're kidding themselves. At best the Xbox X will have 5 to 10 fps more if the target fidelity is on the level of the Pro.
|
It does the complete opposite of proving your point.
Have you forgotten all the improvements Microsoft has made in the name of efficiency? You can't take the clock rate of Scorpio's Jaguar cores and treat it as only a minor increase over the PlayStation 4, as if it's some kind of denominator for gauging absolute performance. Because it's not.
Don't use clock rates as some kind of performance benchmark. FLOPS isn't a true representation of a piece of hardware's performance either.
|
lol, the last optimization pack we got for our dev kit was back in November of 2015.
Yeah, sure, FLOPS doesn't mean anything for general-purpose hardware and computing, but a game developer who writes low-level code and profiles graphics APIs won't even entertain you if you say clock speed and FLOPS aren't important... FLOPS, clock speed, and RAM bus width all matter when chasing that elusive frame render budget.
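To put rough numbers on that, here's a back-of-the-envelope sketch. The clocks are the commonly quoted figures (~2.13 GHz Jaguar on the Pro, ~2.3 GHz on the X, treat both as assumptions), and it assumes a fully CPU-bound frame, which is the best case for a clock bump:

```python
# Back-of-the-envelope: what an ~8% CPU clock bump buys on a CPU-bound frame.
# Clocks below are the commonly quoted figures, used here as assumptions.
PRO_CPU_GHZ = 2.13   # PS4 Pro Jaguar cores
X_CPU_GHZ = 2.30     # Xbox One X Jaguar cores

def projected_fps(base_fps: float) -> float:
    """Scale a fully CPU-bound frame time by the clock ratio (best case)."""
    frame_ms = 1000.0 / base_fps                     # frame render budget in ms
    scaled_ms = frame_ms * (PRO_CPU_GHZ / X_CPU_GHZ)  # shorter frame at higher clock
    return 1000.0 / scaled_ms

for target in (30.0, 60.0):
    gain = projected_fps(target) - target
    print(f"{target:.0f} fps on Pro -> {projected_fps(target):.1f} fps on X (+{gain:.1f})")

# Prints roughly +2.4 fps at 30 and +4.8 fps at 60 -- the same ballpark as
# "5 to 10 fps at best", and that's before the GPU becomes the wall.
```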
So FLOPS tells a developer what they can achieve on that specific hardware, designed for a specific task, which is gaming. No matter how much PC logic you apply to determining a piece of hardware's performance, it doesn't mean squat to the developers building the game for that machine. The Xbox One X is a sports car with road-car tires.
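And for anyone wondering where those TFLOPS figures even come from, it's just shader count × 2 ops per clock (FMA) × clock speed, nothing magic. The shader counts and GPU clocks below are the widely reported specs, used as assumptions:

```python
# Theoretical peak FP32 throughput: shaders * 2 ops/clock (FMA) * clock (GHz).
# Shader counts and clocks are the widely reported specs, used as assumptions.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

print(f"PS4 Pro:    {peak_tflops(2304, 0.911):.1f} TFLOPS")  # ~4.2
print(f"Xbox One X: {peak_tflops(2560, 1.172):.1f} TFLOPS")  # ~6.0

# This is a peak-rate ceiling, not delivered performance -- memory bandwidth
# and shader occupancy decide how much of it you actually see in a frame.
```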