Christianity has fallen on hard times in popular Western culture. I get it.
Christians are known more for what they oppose than for what they stand for, and for having perfected culture-war tactics; the grotesquely fearful and hateful versions of Christianity peddled by ambitious politicians don’t help the Christian image one bit.
There is plenty of bad press out there arguing that Christianity is more a cause of the world’s problems than a solution to them.
You can read the rest. Thoughts?