Does Weight Loss Improve Body Image?

There is a common belief that body image is correlated with one’s actual physical appearance. That is, many of us believe that if we somehow “improve” how we look – by losing weight, for example – we will be happier with our appearance. It makes sense, but it is not always true.

A Purdue University study found that Caucasian teenage girls who lose weight, moving from obesity into a healthier weight range, still see themselves as overweight even after they have shed the pounds. So losing weight doesn't always make us feel better about ourselves. Why is this? Here are a few reasons.

The wrong diet

Often, we believe that losing weight will make us feel happier and more confident. So, in a desperate attempt to shed the pounds, we undertake crash diets, begin counting calories, and put ourselves through punishing workouts. This might help us to lose weight, but because that kind of lifestyle is unsustainable, we feel constantly stressed about keeping the weight off. On the other hand, if we adopt truly healthy lifestyles and begin to accept our bodies at any weight, we will likely reach a healthy size and be able to maintain it without the stress.
