What is Feminism
Feminism, by definition, is the belief that there should be equality
between the sexes. Throughout history, in every country and
religion, sexism has existed in some shape or form. The idea that women
are worth less than men, that they should not be treated the same,
has been a persistent issue in society. Women have come a long way, from a
time when society insisted that women be silenced, to a time when
our society can proudly say it supports feminism and is working
towards equality.
A feminist is someone who supports feminism. While the idea of being a
feminist sounds straightforward, simply wanting equality, the word has been
twisted and misdefined.
Many anti-feminists have created a barrier around the word, spreading
the belief that feminists are anti-men. As a result, many women refrain
from calling themselves feminists because they do not believe in the
oppression of men.
Many people believe that equal rights have already been achieved, and that women should be grateful and not make a scene about the small things. Too many people, men and women alike, fail to realize how each small barrier adds up and closes in on women, making their problems feel insignificant and dismissed.
Women have come a long way; we have fought, but the battle is not yet
won. Modern-day sexism consists less of physical obstacles and more of
mental ones. These obstacles are harder to perceive, but they undermine
women substantially.
Even today, most people generally say they believe
in equality between men and women. The problem lies in people's mentality:
many still believe that there are certain jobs only men should do, or
that are made for men. For example, in one US survey, 39% of respondents
said they believe men make better political leaders, while the majority of the
rest (~60%) said that gender has nothing to do with what job
a person should do. These attitudes help explain why the
United States has not yet elected a female president, and only recently
elected its first female vice president, Kamala Harris.