Which political ideology do you most identify with?
What role do you think religion should play in education, and why?
People shouldn't teach that "Christianity is the United States' religion" when there are so many Americans who aren't Christians. Not being Christian doesn't make them any less American.