
You don't think it was the Americans alone that won the Revolutionary War, do you?
"the United States of America, claiming sovereignty and rejecting any allegiance to the British monarchy. In 1777 the Continentals captured a British army, leading to France entering the war on the side of the Americans in early 1778, and evening the military strength with Britain. Spain and the Dutch Republic – French allies – also went to war with Britain over the next two years.

Throughout the war, the British were able to use their naval superiority to capture and occupy coastal cities, but control of the countryside (where 90% of the population lived) largely eluded them due to their relatively small land army. French involvement proved decisive, with a French naval victory in the Chesapeake leading at Yorktown in 1781 to the surrender of a second British army. In 1783, the Treaty of Paris ended the war and recognized the sovereignty of the United States over the territory bounded by what is now Canada to the north, Florida to the south, and the Mississippi River to the west."

I realise they don't like putting that in Hollywood films due to anti-French attitudes, so people probably don't know the French saved their arses.

Still, no matter what de Gaulle or Sarkozy do, you can't hate an entire nation for over 60 years because of that. Truth is, they told you the war in Iraq was wrong, and you didn't like that, as it went against the ridiculous antics and attitude of Bush.