
The Myth of America as a Christian Nation
The idea of America as a Christian nation is often invoked to suggest that America has always been a religious country, but that claim is not merely an oversimplification; it is historically inaccurate. The original charter for Virginia and the words of George Washington both show that the Christian nation story is a myth.