On Obama's Speech About Christian America

I saw a YouTube clip of Barack Obama's speech where he boldly declares, "We are no longer a Christian nation. At least not just!"

About time, don't you think?

This isn't the 1970s, when hard-core fundamentalist Christians ruled America. This is not even the America of the late 1990s, when Presidents proactively led public prayers in front of the press and shied away from being too bold on touchy issues like abortion and gay marriage. This is the America of now. And the moment Americans decided to elect Obama (him being an African American), they should have known that change was on its way. Not because his motto was all about change, but because the choice reflected a flexibility that had begun to show in the people.

I can see why many Christians and churches will be tormented by this speech. I can already see the headlines in Christian papers for weeks to come, and the sermons blaring out against these "blasphemous" statements by Obama.

But I don't see a reason why Americans and Christians should keep thinking of America as a Christian nation. If anything, I feel very relieved as a churchgoer myself, and I completely agree with Obama on that statement. A significant part of America's population is atheist, Hindu, Muslim, Buddhist and so on. If you talk about justice (as the Bible does), then you need to admit that America is run by people other than Christians too.

Also, the reason I feel relieved (and I think the Church as a whole should too) is that what America does has for too long been treated as synonymous with what Christianity does. For example, when the USA invaded Afghanistan, many Muslims took it as a Christian man's war against a Muslim man. It was nothing like that. It only seemed that way because America was too "Christian" as a nation (at least as far as the label goes).

When I think of America now, I think of Hollywood, Las Vegas, superhero comics, Coca-Cola, Burger King, Apple and so on. I am sorry, but calling America The Christian Nation sounds more like an insult to the faith. And I am glad Obama got us past that phase. At least in words.

George Washington and the founders of the USA built America on Biblical principles. In that sense America will remain "Christian" as long as those principles are still endorsed by the White House. And I don't see why they won't continue to be.

But Christianity is a personal declaration. It is not even a religion. It cannot be a label for a country, or even for a family. Christianity is a faith that a person chooses for himself. You may wear it openly and spread the word, but ultimately it is up to you to decide whether you want to be a Christian or not.