Stan1953
Member
This came up in another thread that I didn't want to derail, so I'm raising it here: why do some Christians think it is better to send their kids to Christian schools? Isn't the biblical imperative for Christians to be the salt of the earth and let our light shine? Doesn't that hold true for our kids as well?