Yes, America Had a Christian Founding
The Founding Fathers often cited the God of the Bible as the source of our rights.
Welcome to the 18th article in our new section on religion: Christian Nation.
Why does the U.S. have such a distinctive history? What is exceptional about us? How are we different from “normal countries” (Barack Obama’s term) — like those in Western Europe — which are mostly content to dissolve their sovereignty in the secretive, autocratic European Union?
If you ask most American liberals, and the historians they read, the answers are clear. We aren’t distinctive. We’re not exceptional, except that we embraced “diversity” through mass immigration before Europeans got the memo. We are a normal country, like France or the Netherlands. We’re just in denial about it. It’s the job of the Democratic Party, our history teachers, and the U.S. courts to banish our old superstitions. To accept that there is nothing providential about American history. And that there was nothing Christian about its founding.
Our founders’ refusal to let Congress establish a U.S. church was meant to launch a radical separation of church and state. The purpose of that, we’re told, was to protect unbelieving citizens from the menace of “theocracy.” Those who disagree? They want to remake America on the model of Iran or Saudi Arabia.
Our rights don’t come from a Creator. (You’ll notice Joe Biden couldn’t bring himself to mention one.) They emerge from a “liberal consensus,” which means that they’re ever-changing, and must be discerned by judges, then applied by the government. So we might have really had gun rights in 1791. But now we don’t anymore. Instead we have the right to protection from transphobic discrimination.
If that’s any consolation.