Women in Healthcare: What’s New?
Healthcare is one of the few industries in which the workforce is predominantly female: even today, women account for 76% of healthcare workers in the U.S. While men have, unsurprisingly, been overrepresented in higher-paying healthcare roles, even this is beginning to…