Before we get started, I want to lay out an exhaustive, complete list of every single person who gets a say in what I do with my body:
- Me
End of list.
Cool, now that we have that out of the way, let’s move on.
As a woman (or a man, or a gender non-conforming person), have you ever felt like everyone and their mother thinks they ought to have a say in what you do with your body? It’s become such an ingrained part of our culture that it can be difficult to even notice sometimes.
I did something crazy today: I wore red lipstick. I usually go with no lipstick or something neutral, so I got a lot of comments. Most of them were really positive, and I felt great about how I looked. Then one of my coworkers, a man about twenty years older than me, walked over to my desk and commented on it. Here’s how the conversation went:
Him: “Who are you trying to impress with that lipstick?”
Me: “No one. I just like the color and I thought it would be fun.”
Him: *gives me a level look* “Come on now. You ladies don’t wear lipstick like that unless you’re trying to impress someone.”
Author’s note: It was a Herculean effort of will not to immediately start yelling and flipping tables.
Me: “Nope. I just like the color.”
Him: “Well, I’m just saying you don’t need all that. I think you look better without it.”
Me: “Thank you?”
Author’s note: I think this was supposed to be a compliment, but basically what he just said was “You look less good now than you usually do,” which really doesn’t seem like a compliment at all.
Him: “Next time I see you, I don’t wanna see any more lipstick, okay?”
Me: *Stares blankly until he walks away, then immediately reapplies lipstick*
Can someone please explain to me why this is deemed acceptable? Ever? Unless the lipstick is MADE OF POISON and I am actually slowly killing myself by wearing it, no one gets a say in whether or not I wear it. And actually, even then, it’s my body and if I decide I want to slowly kill myself with poisoned lipstick, that’s my choice and everyone else can take a hike.
I am so tired of being told that my body isn’t my own. I am tired of seeing insane dress codes that punish young women for having female bodies. I am tired of being told to smile by random guys on the street. I’m tired of being told to cover up or strip down, wear my glasses more often or never wear my glasses at all, laugh more or laugh less. I’m tired of being told that my standards are too high because I don’t settle in any aspect of my life.
I’ve said it before, and I’m sure I’ll say it over and over again: if you want to live a happier, better life, you have to love yourself first. And part of loving yourself is taking ownership of yourself. It’s trusting yourself and the choices you make about your life and your body. It’s giving a big, giant middle finger to anyone who tries to make you doubt those things, whether that’s family, friend, or stranger. You are a rocking, kick-ass, awesome human being, and taking ownership of yourself is learning to trust that.
Pop quiz! Let’s see if you’ve been paying attention. Please answer the following multiple-choice question:
Which of these people gets a say in what you do with your body?
a) your neighbor with the annoying yappy dog who thinks your new bangs look weird
b) the scary homeless guy on the street who thinks your skirt is too short
c) your beautiful, confident, kick-ass self
d) your boyfriend who thinks you don’t shave your legs enough
e) all of the above
If you picked C, you’re well on your way to taking ownership of your body. Woohoo! Go you! If you picked D, you should probably dump your loser boyfriend. If you picked anything else…well, maybe go back to the beginning and read again.
Taking ownership of yourself can be scary and seem impossible, whether you’ve been through something traumatic or you’re just dealing with the daily reminders that our culture doesn’t think your body is your own. Trust me, I understand. But I’m here to tell you that taking back your body is worth it, because life on the other side is pretty damn great.