I know there is regional variation in how the slave trade is taught, but when I was in school we had numerous, extended, and graphic discussions about the horrors of the slave trade, starting in elementary school and extending into college.
Without doxxing yourself, could you give an idea of where you went to school? I went to public school in the South, and other than it being mentioned, I didn’t learn much about slavery in school. I mean, we learned about the Underground Railroad and generally knew about the slave trade and that being a slave was about the worst thing humanly possible. But other than getting whipped, they didn’t talk much about the torture or punishments slaves went through. Civil rights I remember being discussed more in depth than slavery, but when I was a kid I attributed that to the fact that most of my teachers remembered the civil rights movement from when they were my age. Sorry, I’m high so I’m rambling now.
I grew up in California.
I’m not surprised about your experience though. I have also lived in the south and many of the southern states are still feeling the effects of decades of extensive lobbying on education by the Daughters of the Confederacy.
The DoC has historically pushed a narrative of slaves being happy and content overall, cared for by empathetic masters who valued their well-being. There are many monuments still standing that glorify the wartime deeds of “loyal” and happy slaves. It’s really insidious.