I know it sounds crazy, but as soon as Christians start telling non-Christians how to live their lives, we've lost the Christian faith.
I think Christians often miss the opportunity to hear what people are saying and seeing in the world around them.
My primary assessment is that American Christians tend to be incredibly self-indulgent, so they see the church as a place that exists to meet their needs and to let them express faith in a way that is meaningful to them.
Christians need to access the power of Jesus rather than treat Christianity as a religion. It is our Lord Jesus who changes us, and Christians need to actualize that and put it into practice.