Quote:
Originally Posted by susan314
Hmmn, now that I think about it, this whole conversation makes me think of a story that was relayed to me recently. My grandmother passed away on Easter, and at her funeral last week people were sharing lots of stories. One of her sisters mentioned that they tried as hard as possible to cover themselves and avoid getting tan while working the fields. (My grandmother was raised on a farm and worked the fields, tended the animals, etc.) Apparently back then, being tan was a dead giveaway that you came from a poorer family. I wonder when the perception changed that being tan was a "bad" thing to being a highly desired thing?
(Sorry, I know that last paragraph is straying a little from the topic at hand - all the talk about tanning reminded me of that story, and of course my grandmother has been on my mind lately with her recent passing.)
When most people, rich or poor, no longer had to work in the fields. After that, a tan became a sign of being healthy and physically fit.
In the early 20th century, being tan started to mean that you were well off (at least in Europe), and no one would mistake a golden tan from the French Riviera for a farmer's tan. By the '60s, tan was in and pale was out.
See: the Wikipedia article on sun tanning.