Out of curiosity, do European countries view the U.S. poorly?
I've heard that many countries around the world don't like us, but I'm not big into politics myself. I would think one of the reasons is the "ignorance" we have towards other countries. By that I mean that we don't know nearly as much about other countries as they know about us.
I couldn't tell you who the leader of Brazil is, but most people around the world seem to know who Trump is. Also, is America the only place in the world that makes movies? I know India has "Bollywood" or something like that, but I don't think they have any films that make it over here.
The influence of the US film industry is certainly massive, and it shouldn't necessarily be so. Similarly, I think Westernization and the spread of Western culture, clothing, values, etc. is, well, not exactly imperialism, but not something other countries should take as inevitable or necessary either.
I do know, though, that China has been making really massive, really high-budget films in recent years - films that gross as much there, or even more, as American movies gross in the US. How far those will spread in the future, the way Japanese and now Korean culture spread as those countries became more developed, is anyone's guess.
OT - also somewhat relevant to the historical roots of the issue, even though the OP compares it to Portuguese rather than Spanish colonization:
(Please do not take the image (very) seriously.)
A friendly reminder to everyone, anyway, that part or most of what you hear about things like the Spanish Inquisition, colonization, etc. is basically "fake news" dating back to the days when the British Empire and the Spanish Empire were fierce rivals and the former was trying to find ways to demonize the latter. It's called the "Spanish Black Legend", look it up.