Why is Southern History so Romanticized?
The South is home to many wonderful people and a rich culture, and this post is not meant to attack anyone in the South. It is meant to open an honest debate about history and the myths of history. Below I write in general and regional terms, even though I know that some in the South did not share the prevailing public sentiment during the periods I mention. This is meant to spark debate about how we understand history, not to demean anyone.
With that said, the South has one of the most difficult and painful histories in the United States. From slavery, to Reconstruction, to the civil rights struggle, to today's racism, the South's history is ugly, and the region has historically been on the wrong side of every cultural issue this country has faced. Yet America as a whole, not just the South, has romanticized Southern history and culture. From today's Civil War movies that minimize slavery's place in Southern society, to The Birth of a Nation in 1915, to schoolbooks that gloss over what the Civil War and Reconstruction were actually about, America carries a distorted version of Southern history and culture. It is more like Gone with the Wind than the real, ugly history. The myth of the South and its "lost cause" ( http://en.wikipedia.org/wiki/Lost_Cause_of_the_Confederacy ) dominates the historical narrative.
Compare this to the history of the North. Thaddeus Stevens, Charles Sumner, and the other Radical Republicans stood for freedom and equality during the Civil War era, yet they are almost totally forgotten. Even the few who fought for freedom and are somewhat remembered, such as Frederick Douglass, are less well known than someone like Robert E. Lee. We have forgotten that many people in the North felt strongly about slavery and wanted to end it. They were a driving force in the creation of the Republican Party and the reason the South saw that party as such a threat. They also fought hard for equality after the Civil War ended. Yet they are nearly forgotten.
My question is: why? Why is the South so romanticized?