Vasto said:

I am done talking about this subject. It's the difference between performing better and purposely making something run worse on the competition.

http://www.gamespot.com/articles/watch-dogs-pc-patch-on-the-way-as-nvidia-and-amd-quarrel/1100-6419996/

"i done talking about this subject so heres a link to further push my opinion anyway"

Try actually reading, or turn off your selective reading for a while.

"I've heard that before from AMD and it’s a little mysterious to me. We don't and we never have restricted anyone from getting access as part of our agreements. Not with Watch Dogs and not with any other titles," he said. "Our agreements focus on interesting things we're going to do together to improve the experience for all PC gamers and of course for Nvidia customers. We don't have anything in there restricting anyone from accessing source code or binaries. Developers are free to give builds out to whoever they want. It's their product."

Put simply: AMD is crying wolf, and AMD fanboys are believing it and grabbing the pitchforks instead of asking the obvious question. Why did it take the game's release and a performance backlash for AMD to speak up? If their claims were true, they could have publicly said at any point, "Nvidia isn't letting Ubisoft give us access to the game to optimize for it." Instead they waited until it went retail, and until people with AMD cards had spat their dummy, to pass the blame to their competition. How predictable.

Ask yourself this: if Ubisoft were denied access to AMD codebases, why do the console versions exist? You know, since they're running on AMD GPUs. Unless you're trying to suggest Ubisoft would develop a game that works fine on AMD-based consoles and then decide "lol, let's deliberately break it for the PC" — which makes no sense and reeks of fanboy tinfoil hats — the logical explanation is that the Windows drivers for AMD cards are bad. That should surprise no one who has been in the PC gaming scene long enough to remember ATI's switch from AGP to PCIe.

Not to mention that at launch the game RAN LIKE GARBAGE ON ANY MACHINE with the settings maxed out. I have 780 Tis in SLI and it still chugged maxed out at first.