Alright, first off, I would like to say hello to everyone (again). I haven't been here in a while, so forgive me for that; I've been very busy with school.
As the title suggests, I want to get all of your opinions on my question: "Is the title 'Christian' as important as we make it out to be?"
Yes, being a "Christian" is something we can hang our hats on; it shows us, and others who know our religious standing, that we are "little Christs" following God's Word and accepting Jesus as our Savior.
But that's just it: if we do accept Jesus as our Savior, do we NEED to call ourselves Christians? I guess I should have put a little disclaimer up front... I am NOT saying I am ready to drop my title as a Christian or anything; it's just something I've been thinking about.
Being a Christian can have so many advantages: you can meet other Christians, grow and develop your spiritual lives together, and form close bonds in the Spirit. THAT is just awesome. But is it really the title that we latch onto in others, or the love for Jesus that we share with them?
The big issue I have is that people today are doing things, both positive AND negative, in the name of Christianity (at least it seems that way). They are Christians, so they should give money to the poor; they are Christians, so they should help the old lady cross the street. To me, it seems Christians are doing things because they think that's just what Christians do. Personally, I don't really like that, because it shouldn't be an obligation. Your title of Christian doesn't mean that you HAVE to do certain things... it may hint that you SHOULD, but the decision to help others should be yours, because you love them... you pity them... like Jesus did with us.
I'm rambling, and I don't even know if I have a real argument here, but what are your thoughts?