It is my opinion that you should only have to learn about the “isms” in the world once. Social studies and history teachers should spend one year teaching students about racism, sexism, and the rest, and call it good. I believe this because I had to read at least one book every summer and spend countless hours listening to lectures in class about American slavery. In one of my college English classes we just finished reading Frederick Douglass’s personal narrative.
In my opinion and personal experience, after the second time around people stop caring. I know that during those units I spent a lot of time thinking, “I get it. Slavery is bad. Can we move on, please?” It wasn’t that I thought it was no big deal, but it seemed that the schools were trying to make us feel guilty for the fact that we were white. I am not accountable for what southern slaveholders did over 100 years ago.
Feminism also drives me crazy. I understand that at one point in time women weren’t allowed to vote, work, or go to school. The fact of the matter is that 60% of the students at my college are female, and my dad’s team at the bank is made up almost entirely of women. The problem is that feminism has gone so far that it often turns into sexism against men. So many commercials nowadays feature independent women coming to the rescue of their stupid or incapable husbands.
I understand that in many places racism and sexism are still big problems. But instead of trying to make people feel guilty for things they were not involved in, or fighting in out-of-date ways for out-of-date causes, schools, parents, and everyone else should teach students about the places where slavery and oppression of women still exist, and about what people are currently doing to try to solve those problems.