It’s a common belief that America is taking a hard turn to the political right. Wrong. That is not what is taking place. America is going back to an earlier time when unions did not exist, when women did not have rights, when Blacks knew their place in society and that “place” was reinforced through violence with the backing of law. This is not a rightward political turn. This is the America we read about in the history books. This is an America that was ignorant, that was wracked by economic booms and busts, that was run by powerful white men who controlled the economy and the political process. This is an America where life for the vast majority of people was nasty, brutish, and short.
We are not experiencing a rightward political turn. To say that assumes we were always liberal or progressive. We were not. America has always been a country of screwball religious revivals, temperance movements, systematic slaughter of indigenous peoples, and enormous wealth built on slavery and wage slavery. What we are currently witnessing in our country is not a rightward political tilt. We are simply going back to our roots, to those things that made America great.