Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and historians say that while the issue is complex, the founding documents prioritize religious freedom and do not create a Christian nation.
Does the U.S. Constitution establish Christianity as an official religion?
I do believe the U.S. was originally founded as a Christian nation. The word 'originally' is important here.
It is God who established Christianity as the religion of what would become the U.S. "Origin"
It was Christians who first came over from Europe and settled the land in 1620. Read the Mayflower Compact. Churches went up everywhere because the settlers were Christian. "Origin"
Our first constitution was the 'Articles of Confederation,' adopted by the 13 States. The power of government under that constitution rested more in the States than in the Federal government, and nearly every State identified Christianity as its religion in its own constitution, with some even requiring belief in it. The Articles of Confederation were approved in 1777. "Origin"
In 1787 a bloodless coup occurred in Philadelphia: the Articles of Confederation were overthrown, thrown in the trash, and a new Constitution was created. This gave more power to the Federal government, giving less weight to each State's declaration of the importance of Christianity. This is our present Constitution. Far from 'original.'
By 1787 the Enlightenment had affected the West, and its atheistic influence had reached some of those gathered in Philadelphia.
So, though there was no statement establishing Christianity as an official religion, the original constitution, the Articles of Confederation, supported the Christian faith, and the States declared their founding on the Christian faith. "In The Year Of Our Lord 1777"
Lees