Positive News: Christianity is on the Decline in America



The shift in the religious landscape of the United States began in the 1980s. The country is the Christian hub of the West, with an unprecedented degree of church influence on politics. In 1990, 88 percent of Americans identified as Christians, but that percentage has been in decline ever since. By 2007 it was 77 percent, and by 2016 it reached a historic low of 69 percent, with only 62 percent belonging to church congregations.


This shift is heavily linked to the growing dissociation of religiosity from American national identity. Additionally, successive waves of multi-ethnic immigrants from all over the world have heightened the need for secularism, with neutral laws that are inclusive of all residents. According to the website www.religioninpublic.com , the share of unaffiliated Americans increased from 22.2 percent in 2008 to 29.5 percent in 2018. The study indicated that states like Hawaii and Wisconsin witnessed the biggest increases for the "nones," at 22.9 percent and 19.5 percent respectively. Alaska, Wyoming and South Dakota are the only states that experienced even slow net growth of Evangelicals, primarily due to the outward migration of their younger populations and the rapid increase in the number of the elderly, who tend to lean more religious.


The increasing tendency to intertwine religion and social conservatism with politics reflects a fading hope of restoring the wrong-headed, dogma-based policymaking that the public once accepted as normal. Numerous factors have driven this escalating change in the religious landscape across the United States.


First, desegregation transformed homogeneous urban populations in America and fostered a new set of norms that embraced inclusiveness. The Martin Luther King era discredited racial and religious superiority and made religion less of a priority for a significant share of the population that had begun to experience the modernization of suburban America.


Second, the irreversible redefinition of women's role in society. It began with the right to vote, secured when the 19th Amendment was ratified in 1920, and extended to the right to abortion in 1973 with Roe v. Wade. When the U.S. experienced an economic boom in the 1950s, there were far fewer concerns about the explicit role of women that the Bible dictates. Tens of millions of American women joined the labor force, making them comparable, though not yet equal, to men as breadwinners. Women now comprise 48 percent of the U.S. labor force. Such incremental changes have made the transformation of women's role in America irreversible, and it will likely become even more progressive in the future.


Third, and perhaps most important, is immigration. With tens of millions of foreign-born residents, Americans have been heavily influenced to adjust their beliefs and become more accommodating to an increasingly diverse society. Notably, religiosity among second-generation immigrants is significantly lower than among first-generation immigrants.



With two-thirds of the country’s population residing in cities, it is only a matter of time before America embraces its true identity as a secular, multiracial and multilingual country, one more appealing than its self-description as a “nation founded on Judeo-Christian principles” suggests.



Reporting for The Humanist Advocate



August 5, 2019