A quiet day at the spa is probably on everyone’s relaxation list. But let’s be…
In the United States, many women — especially Caucasian women — strive to achieve tanned skin. Dr. Deborah S. Sarnoff, a New York City dermatologist, told The Skin Cancer Foundation that this trend is actually pretty recent. According to her, “tanned skin is not, nor has it ever been, a universally accepted ideal.”
While Americans aren’t the only ones wanting to darken their complexion — some European and Brazilian women do too — we’re not the majority. Women throughout China, Korea, and Thailand don’t share the tanning trend — they actually strive to look paler or, as Sarnoff explained, “more pink in tone.” In India, too, some men and women turn to creams to lighten their natural skin tone.
For centuries, pale skin was thought to indicate high status because a tan would mean you spent a great deal of time outdoors, perhaps doing manual labor, whereas pale skin meant you were privileged enough to stay indoors. Needless to say, America’s glorification of fair skin was pretty cringe-worthy back then.
By the 1920s, however, Coco Chanel popularized tanning and, by the 1960s, being tan started to signify privilege because it meant you had the time — and the money — to travel. Whether Americans were obsessed with staying pale or, now, getting tan, attempting to change your skin color — especially to look wealthy — is a bit surreal.