US is a Christian nation

Jazzy

Well-known member
Valued Contributor
Joined
Feb 14, 2020
Messages
3,283
Location
Vermont
Gender
Female
Religious Affiliation
Charismatic
Marital Status
Single
Acceptance of the Trinity & Nicene Creed
Yes
Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and historians say that while the issue is complex, the founding documents prioritize religious freedom and do not create a Christian nation.

Does the U.S. Constitution establish Christianity as an official religion?
 

Albion

Well-known member
Valued Contributor
Joined
Sep 1, 2017
Messages
7,760
Gender
Male
Religious Affiliation
Anglican
Political Affiliation
Conservative
Marital Status
Married
Acceptance of the Trinity & Nicene Creed
Yes
Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and historians say that while the issue is complex, the founding documents prioritize religious freedom and do not create a Christian nation.

Does the U.S. Constitution establish Christianity as an official religion?
You answered the question yourself, so why ask the next one that calls it into doubt?

Raising the issue of an established religion is mainly a 'talking point' used by people who would prefer that religious liberty--which IS protected by the Constitution--be ended.

People of faith overwhelmingly affirm the idea of religious liberty and few such people today think that we ought to create an official religion.
 

Lees

Well-known member
Joined
Mar 16, 2022
Messages
2,182
Gender
Male
Religious Affiliation
Christian
Acceptance of the Trinity & Nicene Creed
Yes
Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and historians say that while the issue is complex, the founding documents prioritize religious freedom and do not create a Christian nation.

Does the U.S. Constitution establish Christianity as an official religion?

I do believe the U.S. was originally founded as a Christian nation. The word 'originally' is important here.

It was God who established Christianity as the religion of what would become the U.S. "Origin"

It was Christians who first came over from Europe and settled the land in 1620. Read the Mayflower Compact. Churches went up everywhere because the settlers were Christian. "Origin"

Our first Constitution was the Articles of Confederation, adopted by the 13 States. Under that Constitution the power of government rested more in the States than in the Federal government, and most every State identified Christianity as its religion; some even required belief in it in their State Constitutions. The Articles of Confederation were approved in 1777. "Origin"

In 1787 a bloodless coup occurred in Philadelphia and the Articles of Confederation were overthrown, thrown in the trash, and a new Constitution was created. This gave more power to the Federal government, giving less weight to each State's declarations of the importance of Christianity. This is our present Constitution. Far from 'original'.

By 1787 the Enlightenment had affected mankind in the West, and its atheistic effect had reached some of those in Philadelphia.

So, though there was no statement of Christianity as an official religion, the original Constitution, the Articles of Confederation, supported the Christian faith, and the States declared their founding on the Christian faith. "In The Year Of Our Lord 1777"

Lees
 

Hadassah

Well-known member
Joined
Feb 12, 2024
Messages
221
Gender
Female
Religious Affiliation
Catholic
Marital Status
Married
Acceptance of the Trinity & Nicene Creed
Yes
I think the country was lukewarm at its beginning. Freedom for whites, not for blacks and Native Americans. I think the idea of a "Christian" nation is relative. It depends on where you stand on human dignity and rights, and how you feel about bodily autonomy.

Christian charity and brotherly love were, for the majority, extended only to whites. Is that Christian? What would Jesus do or say? Christianity did not apply to some.

Think of it like when they lied about the Covid shots: you could not work or go into places without them, and you lost your livelihood and once-enjoyed freedoms. Now people are losing their lives, and have been since the forced "vaccination". Put that on steroids, and your body is subject to servitude without reprieve, your love means nothing, your husbands/wives are not yours, nor are your children, and your desires mean nothing.

" Christian" nation? It is perspective. Walk a mile in your brother's shoes and your view of history will change. It would seem might makes right, and he with it, determines the narrative.
 

NewCreation435

Well-known member
Valued Contributor
Joined
Jul 13, 2015
Messages
5,045
Gender
Male
Religious Affiliation
Christian
Political Affiliation
Conservative
Marital Status
Married
Acceptance of the Trinity & Nicene Creed
Yes
The Constitution is a reaction to what was happening in England before the Revolution, where you were expected to be part of the Church of England. They intentionally put provisions into the Constitution to try to encourage religious freedom. It is not a Christian nation now, nor was it really before.
 