
In American English, America just means the United States. The Americas refers to all of both continents.

I know some people are offended by the term meaning just the US, but it's got a long history of use here. During the revolution we were defining ourselves as American as opposed to English or European. That evolved into us calling ourselves Americans and our country America for short. It may not be politically correct, but I don't think we should have to change it after hundreds of years of use.