Christ_empowered
Member
- Oct 23, 2010
Much of Europe, Australia, New Zealand, etc. is now post-Christian. This is kinda sad. On the other hand, they have universal health care, a safety net, and more social services and often have lower crime rates than the United States.
What's going on here? Why is it that the United States has more people who identify as Christian, yet we have so many (possibly avoidable) social problems that aren't as big an issue in, say, the UK?
For those of you who live in these decidedly, obviously, blatantly post-Christian and highly secularized nations, what do you think about your society? Human nature is human nature, wherever you go. Having said that, I think some secularized nations are more livable for a lot of people than the United States.
Let me just say, also, that I --do-- like the United States. I don't hate the US, and I believe God has me here for a reason, just as He has all His other children where they are for any number of reasons that may or may not be apparent in this life, here on Earth. I get that.
What I don't get is...well, there's so much anger, hostility, and general wrath going on in the United States. Rich people want more and more, middle-class people are angry about the instability of being middle class and the lack of upward mobility, a lot of black people are angry about the cops and the lack of progress in race relations, conservatives are angry at liberals, liberals think they're pursuing the greater good, a lot of people are taking their issues out on the poor and immigrants, and...
I don't get it. Before I was on disability, I was in a neighboring state, living off my (loving, generous, kind) people. So, I went to church. A 17-year-old young lady talked about me "living off welfare," and it dawned on me recently: that's America, especially in the South.
Maybe I'm just upset because people put me through it, so I've seen the dark side of American culture. I dunno.
But...yeah...what are your thoughts?