There is something very perverse about society, in the way women are expected to look...like prepubescent adolescents, with no hair anywhere, perfect skin and teeth, doleful eyes and Cheshire-cat white grins, a gap between their thighs and the ability to sigh when the moment is right, acting innocent with no ambition. Instead of focusing on achieving an education, girls are catty, more focused on what they weigh and how they look. Instead of building each other up with compliments, girls break each other down, in person and online. Hurtful words scar deep and implant themselves in the brain, where they remain for eternity.

Bodies are mangled to conform to societal norms. Hair is plucked, waxed and dyed in the hope of attracting a wealthy and handsome man. It is no longer enough to be yourself; you have to be an imitation of the unrealistic images you see on TV. Those images are photoshopped, altered, cropped, and yet everyone seems to be striving for that one thing: acceptance.

Women have been bought and sold on the idea that it is acceptable to pay money, real money, to sit on a bed of lights and burn their skin in order to look tan. We've been sold on the idea that it is okay to inject our bodies with toxins and implant objects into them to make our breasts, buttocks, lips and faces appear fuller and more desirable. Whatever happened to being naturally beautiful? When a little girl sees her mom putting on makeup in the mirror, what kind of message does that send?

No wonder this cycle of depression, eating disorders and plastic surgery appears never-ending. If people don't step up to break this cycle, it will continue for generations to come. My God, society, why did you ever allow this to be done? To make women hate their bodies, to compare themselves to their best friends, to feel ashamed.