What does it mean to claim the US is a Christian nation, and what does the Constitution say?

Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and historians say that while the issue is complex, the founding documents prioritize religious freedom and do not create a Christian nation.
