Women's empowerment can be defined as promoting women's sense of self-worth, their ability to make their own choices, and their right to influence social change for themselves and others. In Western countries, female empowerment is often associated with specific historical phases of the women's rights movement.