Blacks lightening their skin and whites tanning are two completely different things.
Whites began to tan not to look black but because white people with tanned skin were seen as wealthier. A tan symbolized an extravagant lifestyle, one that allowed you to go to beaches and take vacations.
Blacks lighten their skin because fairer-skinned people are seen as living better lives: more visible on television, more often considered attractive, etc.
I heard the opposite.
Back in the day, the paler you were, the higher your status. If you were white with a tan, that indicated you were a peasant out in the streets, or one of the workers around the house cutting the grass and trimming the hedges.
But it could be two different eras, and s**t has changed.