… and built its initial wealth on slavery revenue.
It’s a shame because there are a lot of other great things to be proud of when it comes to the US. I guess when people boast about US freedom, what they mean is democracy, and kicking off the end of the colonial era, inspiring a tidal wave of democratic uprisings around the world, which is accurate. I wish they didn’t use the word “freedom” for that.
That’s not all that exciting. All of Europe (and basically every other area of the world) was built on slave labor as well; that’s literally what the colonial period was about. Also, Vikings were primarily about capturing slaves, Rome and Greece were largely slave societies, and serfdom wasn’t significantly different from slavery.
Sure, but it still bothers me that the US is part of it and yet is often associated with freedom by American nationalists. The same way I’m annoyed that France (my native country; I’m a naturalized American) boasts of being the “pays des droits de l’homme” (“the country of human rights”), despite freedom of speech and of religion having gigantic asterisks, even though they feel like such basic human rights to me. It’s just like, if your national identity happens to not be the greatest at something, maybe don’t boast about being the best at it!
But anyway, this leads me to wonder… I feel like US slavery is discussed and depicted in the arts a lot more often than slavery elsewhere, and I genuinely wonder why that is. What do you think? Is it just that American culture chooses to address it head on when a lot of others don’t, or do you think there’s more to it?
So the US was born in a world where slavery was the norm, practiced slavery, and soon became (one of?) the first countries to formally abolish slavery, and fought a civil war with hundreds of thousands of casualties to back up that abolition.
Let’s look at this question another way: do you think if the USA had never been founded, that there would be more or less slavery in the world today?