Women have long cared for the sick in their communities as healers, medicine women, and midwives, but formal medical training was open only to men. It wasn’t until 1849 that ...