‘The Whole US is Southern’: How Our Troubled Racial History Went National


By Cynthia Tucker and Frye Gaillard

The US south has long been the epicenter of racism. But the cancer has taken root in every corner of America

In 1974, the great southern journalist John Egerton wrote a prescient book entitled The Americanization of Dixie: The Southernization of America.

In a series of connected but self-contained essays, he made the point that something fundamental was changing – both in his native south, and in the country as a whole. But even Egerton seemed not to be sure exactly how things would unfold.

He was, as those of us who knew him could attest, one of the great and gentle souls of his time, a man deeply committed to racial justice who wanted badly to believe that it would be a good thing if this troubled place in which he lived – this part of America that had once fought a war for the right to own slaves – could emerge from the strife of the civil rights years somehow chastened and wiser for the journey; if it could narrow its distance from the rest of the country and perhaps even lead it toward better days.

That was the hope. But Egerton, as was his habit, saw darker possibilities as well. Giving voice to his fears, he wrote:

“The South and the nation are not exchanging strengths as much as they are exchanging sins; more often than not, they are sharing and spreading the worst in each other, while the best languishes and withers.”
