Is America Post-Christian?

There is an ongoing debate as to whether America is a post-Christian nation. The question at hand: is America headed toward a fate similar to Europe's, or has it already arrived? Some would say no, based upon research reporting that well over 75% of Americans identify themselves as Christians (Pew Forum on Religion and Public Life, 2007). In other research, more than 40% of those surveyed said they were born-again or evangelical (USA Today, October 18, 2012). Those numbers would lead one to believe that Christianity in America is alive and well, and that the nation is not headed toward a Western European fate, where the presence of Christianity is, in most cases, a shadow of what it once was. By all accounts, the Christian church in those places is losing, if it has not already lost, its role in influencing the culture. So, should we ask this question about America, and does it matter?

I think it is an important question for us to ask and answer as it relates to America. If we don't ask and answer it, I believe we run the risk of becoming complacent with the status quo. In time, we can become satisfied with doing what we have been doing and grow more and more focused on ourselves and on the programming the church offers us to consume. Programming in and of itself can be a good and necessary thing, but it is not the main thing. The question of whether America is a post-Christian nation is important because in answering it, we come to understand the impact the Church (God's people) is having on the world. Direct observation of our culture would lead one to believe that our society is not 75% Christian. One only needs to read the paper, listen to any group of young adults, or observe the traffic at the local abortion clinic to realize that we are not living in a Christian society. There are more examples, of course, but I think a rational and objective look at our culture reveals an allegiance to things other than the biblical. One of the challenges in answering this question is how to go about it: on one hand, we have research that says America is Christian; on the other, we have evidence that it isn't behaving like it. So where does that leave us?

Some have recently attempted to tackle this question by dimensioning post-Christianity, thereby affixing a more objective measure to the concept. The Barna Research Group recently published a series of articles in which they not only defined the dimensions of post-Christianity in America but also measured it, comparing cities as well as regions across the country. They used 15 criteria to determine whether a person is post-Christian, noting that anyone affirming 60% (9 of 15) of the criteria would be considered post-Christian. The criteria included not believing in God, believing Jesus sinned, not giving money to a local church, not attending a small group, not sharing one's faith, and so forth. It is interesting that these criteria go much deeper than the cultural label that, I believe, many previous surveys merely measured. Self-identification can be different from actually demonstrating the reality of that identification. The full report from the Barna Group can be accessed at the following link:

According to this report from Barna, some areas of America are pervasively post-Christian, while others are less so. Not surprisingly, the Northeast region of America represents the strongest presence of post-Christian thought. It was also noteworthy that Buffalo ranked #8 on the list of top post-Christian cities. As a side note, Albany was #1; so, yes, New York is well represented, according to Barna, as post-Christian.

So what does that mean for those of us who live in places where the predominant thinking is post-Christian? I think it means that demonstrating the gospel is all the more important. How we live, how we love, and how we lead matter because the prevailing culture has tuned out the things of God. People need to see and experience something different from this world so that they will listen to the gospel. What you do and what I do matters. As we become more and more like Christ, those around us see and experience something radically different from what they see and experience elsewhere. Who we are becoming is vital as we endeavor to evangelize this region. To be sure, a clear proclamation of the gospel remains essential so that people have an opportunity to respond by faith, and we as God's people need to be able to articulate the gospel to others. In the same vein, our testimony, in how we live and who we are, bears witness, and in many cases is the only witness, to a people in a place that has in thought moved past Christianity. Simply stated, people see us more than they hear us. The question for us, then, is this: even though we may be living in a post-Christian culture, are we, as God's people, living like we have been transformed by God's grace and have His Spirit living in us?
