Quote:
Originally Posted by Low C Sharp
(Post 2096059)
To me, "The South won the peace" means that the majority-Northern federal government was willing to give the South near-complete autonomy in racial matters rather than enforcing the 14th and 15th Amendments (there or anywhere) following the end of Reconstruction. The South had another 90 years to largely go its own way before the federal government started enforcing those amendments in the early 1960s -- which is why Civil War rhetoric and imagery were so popular among segregationists at that time.
I suppose I really had Reconstruction in mind when I first read the "won the peace" claim, because I'm pretty sure from my study of history that Reconstruction really sucked for everyone. From post-Reconstruction up to the Civil Rights era, yes, the states were left to do what they wanted on racial matters.
I consider 1865 - 1965 rough times in the South, black or white. (I realize the economy is a separate issue, but so much of the race issue in the South was tied to the economy that you can't always separate it out.) There were more have-nots than haves in those days, no matter what the heritage might be. So, yes, the whites in the South were free to create untenable laws (Jim Crow, I'm looking at you), but I don't consider that winning the peace. I think it was more of a "holy-shit-we're-tired-of-dealing-with-this" from the northern states than any kind of southern victory. For too long, the attitude among poorer whites was (wrongly), "well, I may be poor, but at least I'm not a N*&^%." Logic would tell a person that such an attitude and a dollar will buy you a Coke, and not much else.
Also, I completely 100% cosign this:
Quote:
Honest answer: I believe that those who are not racist are either: 1. ignorant of the Southern history they supposedly honor, or 2. they know that the flag was used to terrorize fellow Americans, but they choose not to think about whether that part of its history has any relevance today.
With either option, the end result is ignorance.
I am writing as a person who used to idealize the South, and then I studied it. As a child, I thought it was all about getting to wear pretty dresses and flirt with suitors. (Thanks, Hollywood!) The more I studied Southern culture, the more I saw that turned my stomach. Read up on the slave trade, tour a plantation and include the slave quarters, read testimonials about life as a slave, read accounts of life as a sharecropper, and then tell me how wonderful it all was.
There are ideals that I still revere (gentility, hospitality, grace, and charity), but when those ideals are gained on the backs of suffering humans, the price is too high.
I realize that I can't experience what my Black friends have lived through (because, yes, some of my very best friends are Black :D), and I can never truly understand it. I can have great empathy for their experience. I can choose to respect the challenges they face that I will never encounter as a white person. I can open my eyes and acknowledge that just because I haven't experienced something doesn't mean it doesn't exist.