In what sense can we describe a nation as being “Christian”? By its leaders, its citizens, or perhaps its laws? By all of these measures, America would not be considered a Christian nation. And yet, I frequently hear Christians speak earnestly about how America is a Christian nation and how we need to get back to our Christian roots, almost as though the United States has a special place in God’s heart. For many evangelicals in America, their Christian faith is directly tied to their American identity.
In fact, I have many memories of saying the Pledge of Allegiance at my Christian elementary school, or singing the “Star Spangled Banner” in church. But the truth is, America is not a Christian nation, and I’m not sure it really ever has been a Christian nation.
It’s always difficult for me to hear certain Christians talk about the founding fathers as though they were proud, whole-hearted Christians. It’s often said that they stood on biblical or Christian principles. I have even heard some say that they built this nation on the teachings of Jesus and the laws found in the Pentateuch. This is simply not true, and sadly misinformed. The vast majority of the founding fathers were not Christian. Thomas Jefferson, who wrote the Declaration of Independence and was our third president, did not believe that Jesus was God, that he performed miracles, or that he rose from the dead. Does that sound like someone evangelicals would consider Christian? In fact, Jefferson literally modified the Bible, cutting out every reference to the miraculous and boiling it down to Jesus’ moral teaching. George Washington was an Anglican but had serious doubts about the traditional tenets of the Christian faith; whenever he referred to God, it was in deistic terms like “Supreme Being,” “Heaven,” and so on. Similarly, Benjamin Franklin recognized a deity but believed that Jesus was simply a moral teacher and not God. (Of all the founding fathers, the two we know were committed to orthodox Christian faith were John Witherspoon and John Jay.) The truth is that the founding fathers were far more influenced by the Enlightenment (particularly thinkers like John Locke and William Blackstone) than by any Christian distinctives.
Consider America’s laws: Jesus said that to look at a person lustfully is to commit an offense, yet there are no laws prohibiting lust in this country. Or what about the first of the Ten Commandments, which commands that all worship be given to Yahweh? Nothing in US law states that Americans must worship Yahweh. Or what about Paul’s teaching to avoid gossip, serve one another in love, and act in humility? These are all important practices of the Christian faith, and yet the American government does not mandate them. A church, on the other hand, will make sure its leaders and members are not lusting, worshiping other gods, and so on, and it will take disciplinary action if biblical principles are violated. This is not the case with the American government. An American citizen can worship whatever god he or she pleases, does not have to act in humility, and can lust after anyone or anything. The founding fathers were not thinking of Christian principles as they formed this nation’s laws. Rather, they built this country on ideas derived from Enlightenment philosophy, which placed a high emphasis on the individual.
Finally, what about the people of America? Haven’t Americans mostly been Christians? Perhaps partly. Christians did rise up and play a strong role in the issues surrounding slavery and the Civil War (on both sides), and church revivals in America ignited real Christian fervor. But that is not to say that most Americans were committed followers of Jesus, and that certainly has not been the case in the past sixty years or so.
So what happened? Why are so many Christians today fixated on reclaiming America as a Christian nation? I think Daryl Cornett explains it best: “The nineteenth century displayed a significant Christianization of the American people up through the Civil War, evidenced by revival and social reform. After the Civil War, steady decline in religious adherence was the impetus for evangelicals to mythologize American history and pine for a return to a golden age of Christian faith and virtue at its founding that never existed.”
All of this to say, I think it is high time evangelicals in America gave up the quest to reclaim America for something it never was: a Christian nation. Christianity was never meant to be spread through political power, and when it has been, the result does not reflect the self-sacrificial love of a dying savior. Christianity (or any religion) that is mandated, enforced, or politicized has always been and always will be a disaster. American Christians need to stop worrying so much about whether or not the Ten Commandments are displayed in courthouses and instead recall the words of Jesus on his way to the cross: “My kingdom is not of this world.”
Faith Colloquium : A Blog about Theology, Philosophy, Church, and Culture