Is Christianity Really Dying in America?

The mainstream media is reporting recent survey results from the Pew Research Center as if they signal the end of Christianity in America. But is this really the case?