Why do producers think it's okay to make a show specifically about the sexuality and sexual business of young women? Why can't there be a show about the difficult lives of college girls who face adversity? Seems to me this society thinks there's nothing beyond appearance and genitals in women. SMH