This is an interesting topic. American schools do not teach much history, in my opinion, especially about cultures different from "the norm." It is really sad that I have only just begun to learn about other cultures through my GRADUATE education. And I am a SOCIAL WORKER too, which makes it even more ironic. Most social workers don't know much about cultures outside of their own, yet they are expected to provide "culturally competent services" to their clients.

I think our whole education system in general is really screwed up, but that is definitely an American culture thing too. We are only taught what our country wants us to know about people "different" than "us," and most of it centers on stereotypes. My school is excellent at teaching about the trauma related to the immigration experience here in the U.S. My grad school also did an excellent job through its cultural diversity courses.

I remember discussing the Holocaust in my ethics course about a month ago. I pointed out that in my elementary through high school education we were never really taught much about this tragedy. It wasn't until my professional education that I really began to learn about it. I think many of us can admit that we have taken our education for granted and, more often than not, have not questioned the things we are taught (or not taught) "by society."