People often think that feminism is about women proving to the world that they are superior. That idea is as silly as sexism itself. The fact of the matter is that feminism is about teaching the world that women should have the same fundamental human rights as men. Having different DNA and genitalia should never take away from someone's worth as a human being.
As a woman business owner myself, I found this fascinating, with powerful information for us women. Women In Business Infographic - How female-owned small businesses are faring #womeninbusiness