It should be taught to raise awareness among close-minded children, who know only what their parents have taught them, whether good or bad. Learning it is essential to our history as Americans, rather than living in denial and waiting until it is too late. People who have never been discriminated against in modern-day America may not understand how important it is to teach, because they assume racism no longer exists.