Changing gender roles in the workplace

Hello readers!

I came across this article in The Atlantic today, which I found really interesting. It talks about how the American workforce has been changing. Many men who used to hold "manly" jobs in manufacturing, factory work, and the like can no longer find that kind of work and are turning to healthcare-related roles historically associated with women (think nurse, dental assistant, stuff like that). How are these guys dealing with this career transition? What challenges are they facing? Are they satisfied with their choices? Check out the article and find out!

https://www.theatlantic.com/business/archive/2017/05/men-in-nursing/526623/?utm_source=atlfb

Albert Pignataro