Hi, I'm in the process of enrolling in a Christian college. It is mainly Church of Christ, I believe. I've been going to a Church of Christ for a few months now, but I'm not sure about some of their views. I'm just wondering if this is a good idea? I mean, they teach the gospel, obviously. Has anyone else been to a Christian college that had some different views but still felt like they weren't being deprived of what they needed for their faith? Thanks.