TheRealMafoo said:
Well, to be honest, he is more right than wrong. When the colonies created the US and became states, they did so with an expectation of what their states' rights would be. Those rules radically changed, and changed in a way that negatively impacted the southern states. It got to a point where they said "thanks, but no thanks. Good luck with that US thing, we are going to go another way." They left the group and started their own country. The US said "sorry, but we need you, so no" and invaded them. Think of it like joining a club where you have to pay $1,000 in dues, and after a few years the club moves in a direction you don't like. You then resign from the club, but the owner (who likes your $1,000) beats the shit out of you, says you can't leave, and demands the grand each year. That sums up the war. It was a states' rights thing, and had nothing to do with slavery (if anyone thinks it did).
That rule change was slavery, so I don't see how you can say slavery had nothing to do with it.