More Christians believe America is a ‘Christian nation’ that has not oppressed minorities and is chosen by God

An increasing number of American Christians believe strongly that the United States is a Christian nation, that it has not oppressed minorities, and that it has been specifically chosen by God to lead the world, according to a recent Barna…