Quote:
Originally Posted by Drolefille
Once most people, rich or poor, no longer had to work in the fields, a tan became a sign of being healthy and physically fit.
By the early 20th century, being tan started to mean that you were better off (at least in Europe), and no one would confuse your golden tan from the French Riviera with a farmer's tan. By the '60s, tan was in and pale was out.
See: Wiki on Sun tanning 
Funny this topic came up. I was having a discussion about this very phenomenon with a white co-worker the other day. The more things change, the more they remain the same.