Is Christianity Really Dying in America? The mainstream media is reporting recent survey results by the Pew Research Center as if it's the end of Christianity in America. But is this really the case?