I was on board with this show until the doctor stated that because most doctors are men, female patients sometimes feel their health concerns are dismissed. She went on to say that because most doctors are white, black patients often feel their health concerns are trivialized.
Regardless of whether or not most doctors are white men, it is well documented that black people and women receive worse healthcare in all aspects. Their pain is not taken seriously, and their symptoms are frequently blamed on "nerves". I've experienced this myself, and I know it is a common occurrence. I think this doctor did everyone a huge disservice by brushing it off as a matter of how patients feel, instead of getting her facts right. :(