May 6, 2021

Why We Should Never Tell a Woman to Smile.

Aside from the fact that what a woman does with her body should be nobody’s business but her own, there are many reasons why telling a woman to smile is harmful.

In a culture that tells women how to look beautiful, telling a woman to smile can be devastating to her self-esteem.

She’s already been told that real women have thigh gaps and curves in certain places. She’s taught that she needs to wear makeup, have no wrinkles, and wear her hair long. Any woman who doesn’t fit that mold has probably hated herself for it at some point. She wasn’t gifted with the magic genes that bestow the perfect body.

I’ll let you in on a little secret, though: many women who are told to smile have “resting b*tch face.” For some people, the corners of their mouths and eyes turn up toward the sky. Those with resting b*tch face have features that turn down. Consequently, some women appear to be smiling all the time, and some appear to be frowning all the time.

My genetics gifted me with the features that turn down, so I often look like I’m scowling. I have to work a little harder to seem like I’m smiling and happy. I can’t count the number of times I’ve looked at myself in the mirror and wished that it were the other way around.

Why do I have to look so unhappy or unbeautiful to other people when I’m only being me?

So when a man tells a woman like me to smile, it simply points out another feature I hate about myself. It undoes years of work and therapy I’ve put into addressing this issue.

Or maybe a woman has crooked teeth and hates showing them. She might feel ugly for not smiling, or unattractive when her crooked teeth show in her smile.

I spent many years hating one tooth that was terribly out of alignment, and I never smiled with my teeth. When I got braces as an adult, it still took me a handful of years to smile with my teeth.

Sure, these are more superficial issues, but what about trauma and behavioral struggles?

For women who are working on building their autonomy and sovereignty, being told to smile can set them back in their therapy work. When a woman has been directed what to do by family members or a romantic interest, it can take years to undo that trauma. She will question herself often, ask for reassurance, and possibly hide to avoid making people unhappy or uncomfortable.

Depending on how far along she is in her journey, she can feel like she’s right back in that controlling relationship, all because she’s told to smile more. Think about how many times a young girl is forced to smile for family and for pictures. She has to do what her parents say, or she gets in trouble. Then she ends up in a controlling relationship with a man who tells her what to do with her body.

At some point, she fully dissociates because she doesn’t know what she’s supposed to feel anymore, especially if the relationship is physically abusive too.

It can take years and years of therapy to help the woman genuinely find herself in her body again.

The sad reality is that none of us knows what is going on in someone’s life, especially a stranger’s. So, men, why is it so important that a woman smile? Don’t you have more important things to worry about, something that won’t be detrimental to a woman’s healing process?

~

author: Ember James

Image: Thought Catalog/Flickr

Image: Quote Catalog

Editor: Naomi Boshari
