Pemalite said:

It's not debatable because the games eventually proved it.

Not really, because technically demanding games released toward the end of the generation, like COD (I know you don't appreciate the series, but like it or not the technical leads behind it are highly talented), Mass Effect 3, Ghost Recon: Future Soldier, Max Payne 3, Sleeping Dogs, Borderlands 2, Arkham Origins, MGSV and especially Shadow of Mordor, all showed Sony's platform still struggling despite developers' competent efforts ... (the last example highlights the 360's most distinctive advantage, which most likely resides in hardware as well)

Nothing is the slam dunk you seem to think it is ... (technical superiority is about more than just graphics and higher-resolution textures)

Pemalite said:

If a port is shit, then it's shit. No point mincing words in order to avoid offending people.
I am Australian, we tend not to care.

I am apprised of how difficult it is to be a developer, but if the port is crap, it's crap. I am not going to do a song and dance and pretend it is good.

It's not even about paying lip service; that's just reality. Sometimes ports, even merely acceptable ones, simply aren't possible across divergent hardware designs ...

Pemalite said: 

 

Yeah. There is no point having this debate.
The Gamecube was superior to the Playstation 2 overall; the games have proved it, and that's the evidence. Trying to say otherwise is simply disingenuous and nonsensical.

The same applies the other way around. (You don't benchmark hardware on specific exclusives; reaching a fair consensus requires more than benchmarking a handful of technically impressive titles.) You really can't discount the many other cases just because a few games shine on a specific piece of hardware ...

And as for arguing otherwise, this former developer would seem to agree with me ... (along with his many other interesting subsequent posts)

We can make the same argument for previous generations of AMD vs Nvidia hardware, but nothing is going to change the fact that most code ran slower on the former despite its "few" key fast paths ... (what is and isn't superior is almost entirely situational, and I don't think you understand this)

Pemalite said: 

Tessellation is being used, it's there in most Frostbite games. It's even in PUBG.

If you have an AMD GPU you can even turn it off and see the increase in performance in most modern games, sometimes the visual reduction isn't very severe either. Heck even Overwatch saw a small uptick in performance when I turned the Tessellator off on the Core2Quad rig.

Not sure how a Maxwell Polymorph engine stacks up against the GCN 1.0 geometry engine, but I would assume the Polymorph engine is more capable.

Truform is also end of life and is no longer supported by AMD's drivers or even included at the silicon level... It was replaced by the Tessellator.
Relying on N-Patches to Tessellate placed too much of a strain on development.

Async Compute is one of the main focuses in engine/game development right now and holds a ton of promise.

Nah, tessellation is dead for the most part, just like geometry shaders are. Most AAA games don't even try to implement the feature, and what you said about most Frostbite games featuring it is not true, since the last Frostbite game to feature it was Star Wars Battlefront ...
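
To put the "don't even try to implement it" part in perspective, here's a rough D3D11-flavoured sketch (my own function name; all objects assumed created and compiled elsewhere) of the extra pipeline stages tessellation drags in on top of the usual vertex/pixel shader pair:

```cpp
#include <d3d11.h>

// Sketch only: binding the two extra programmable stages hardware
// tessellation requires, plus switching the input assembler over to
// patch primitives. Every tessellated asset then needs hull/domain
// shaders authored and maintained, which is a big part of why teams
// skip the feature entirely.
void BindTessellation(ID3D11DeviceContext* ctx,
                      ID3D11HullShader* hs,
                      ID3D11DomainShader* ds)
{
    ctx->HSSetShader(hs, nullptr, 0);   // runs per patch, outputs tess factors
    ctx->DSSetShader(ds, nullptr, 0);   // runs per tessellated vertex
    ctx->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
}
```

And that's before the artist-side work of making meshes patch-friendly, which is where most of the real cost sits ...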

And the small uptick you saw is probably within the margin of error, since tessellation isn't explicitly implemented in that game. BTW, the consoles are GCN 2 ...

Tessellation is just a bad idea in general. It will forever remain one of the biggest crap stains on real-time graphics technology; Microsoft and the industry screwed up horribly at the time, and now graphics programmers and hardware designers have to suffer for whoever wished for it. I seriously wish the industry had shown more foresight; with it, we wouldn't have ended up with a crowded x86 opcode space or Spectre/Meltdown either ... (Why couldn't we have had hardware for vertex compression, when tessellation is just a bad form of geometry compression?)
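
And to be concrete about what I mean by vertex compression, here's a minimal standalone sketch (the names and the 16-bit signed-normalized encoding are just my illustration, not any particular standard) of quantizing positions against a mesh bounding box, the way engines already do on the CPU before upload:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// Sketch: store each position component as a 16-bit signed-normalized
// integer relative to the mesh bounding box (half the memory of a float),
// then decode with a couple of multiply-adds. Hardware that consumed such
// formats natively is the kind of "vertex compression" I mean.
struct BBox { float min[3], max[3]; };

static int16_t quantize(float v, float lo, float hi) {
    float t = (hi > lo) ? (v - lo) / (hi - lo) : 0.0f;  // normalize to [0,1]
    float s = t * 2.0f - 1.0f;                          // remap to [-1,1]
    return (int16_t)std::lround(std::clamp(s, -1.0f, 1.0f) * 32767.0f);
}

static float dequantize(int16_t q, float lo, float hi) {
    float s = (float)q / 32767.0f;                      // back to [-1,1]
    return lo + (s * 0.5f + 0.5f) * (hi - lo);          // back to [lo,hi]
}

int main() {
    BBox box = {{-10.f, -10.f, -10.f}, {10.f, 10.f, 10.f}};
    float pos[3] = {1.25f, -3.5f, 7.0f};
    for (int i = 0; i < 3; ++i) {
        int16_t q = quantize(pos[i], box.min[i], box.max[i]);
        std::printf("%.4f -> %d -> %.4f\n",
                    pos[i], q, dequantize(q, box.min[i], box.max[i]));
    }
    return 0;
}
```

Half the bandwidth, a trivial decode, no new pipeline stages; that's the contrast with tessellation I'm drawing ...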

Hopefully async compute does gain traction, since it's also supported in Apple's Metal ... (even though they're designing their own GPUs, albeit not for desktop graphics)
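
For what async compute actually looks like on the host side, here's a rough D3D12-style sketch (function name mine; assumes an existing ID3D12Device*, error handling omitted) of the one structural change it asks for, a second queue the GPU can schedule alongside graphics:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch only: create a dedicated compute queue next to the usual graphics
// (direct) queue. Compute work submitted to computeQueue can overlap
// graphics work on directQueue; fences synchronize the two where needed.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& directQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&directQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
}
```

Metal expresses the same idea through separate command queues and encoders, so the concept should travel fine ...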