Thursday, October 25, 2007
When do we as women, or as a society as a whole, decide to become body conscious? At what point in our lives do we suddenly go from having fun and not caring about anything to standing in front of the mirror for ten minutes scrutinizing every odd bump or non-symmetrical line on our bodies? Is it our parents' fault? Is it society's fault? The media? The opposite sex? Hormones? Genetics?
It frustrates me that there really isn't an answer. Yes, you can blame the media as a whole, but then why didn't our parents make sure to tell us not to listen to the media? My mom has always told me how beautiful and perfect she thinks I am. But why isn't that enough? Why, when she said that, did I think, "You have to say that, you're my mom"?
I have a 2-year-old. Yes, he is a boy, which makes it not AS big of an issue, but it's still there. How do we make sure that we teach our kids to have a positive image of themselves regardless of ANYTHING or anyone, when I myself cannot positively grasp my own individuality?