brewmama
Member
- Mar 31, 2016
I don't think it's "liberal policies" that are ruining America. People point their fingers at the left for everything...meanwhile, the GOP is turning into a bunch of social Darwinists paying lip service to Christ and "traditional values."
Not to get too political, but in this matter you can't get away from it. If you don't know what I'm talking about, read Charles Murray's book Coming Apart, which demonstrates how the destruction of social mores and traditional values has devastated the working class, while the rich elites still more or less live by the older rules and therefore don't suffer the consequences. And that destruction is definitely the work of liberal and secular policies.