In reply to the discussion: Women should "get their tits out" in order to be taken seriously?

In my lifetime, getting someone to pay attention to you has always been about flaunting what you've got. As a young woman, I felt stalked and harassed wherever I went; the only attention I got was for my body. I learned to hide it under loose clothing and go about without make-up or styled hair, trying to deflect attention. It seemed like that was the only value I had to others, even to other women, who constantly urged me to be more decorative.

As an older woman who no longer attracts that kind of attention, I find that some people take me seriously. Others are intimidated because I'm not passive and I don't respond to attempts to bully. Professionally, every year there are a few attempts to send a man in to intimidate me with his testosterone. It never works.
There are plenty of people in our culture who value women for who we are, but our culture as a whole still values body over mind and heart, and still sees women as somehow less than men, in my opinion.
If we were truly taken seriously, breasts out or breasts in... it wouldn't matter.