Christianity in America is shrinking, but religion will always thrive in one form or another. I believe it would be more dangerous to remove God from society. Without God, people will start worshiping man, and an imaginary God who points our hearts in the right direction is much better than a real person who will bring evil.
I know many here don't like Christianity, but I for one believe the world would be much worse without it. Christianity brought meaning and purpose, fought evil, and simply made people better. I'm never afraid of young people coming out of Bible study at a church, no matter what they look like.
A couple of you actually think the Native American way of life is best? That's got to be a joke.
Christians have made mistakes, but overall they have brought good.
Secularism has definitely brought more evil to the world.