
The Role of Women in the Last Centuries.

Throughout history, the role of women in society has always been diminished. Because of their supposedly weaker constitution, soft character, or poor education, women were confined to the house, the garden, and childcare for centuries. Fortunately, as time went by and certain key events occurred, women gained a more important and consolidated position in society; yet I still believe there is a long way to go before complete equality between the genders is achieved.

From the earliest times, women were expected to be caregivers, to bear and raise children, and to carry out the household duties for the family. That was essentially a woman's typical role; women even needed their husbands' permission to speak. They were seen as passive and weak, unable to accomplish, physically or mentally, all the tasks that men could. It was believed, even back in the 1600s, during the time of the Puritans, that women should have no rights outside the home; they could only teach their children about life, morals, and religion, and they had to depend on their husbands to provide for the family and take care of them. The role of women in society was shaped by religious beliefs and by the constant negative attitude that men held toward women. Women were treated as men's property, with no voice in their own fate. Men, in contrast, were seen as superior to women and therefore held absolute power, deciding everything that concerned their families' lives and future.

Between 1790 and 1860 the roles of women became more public, focusing less on domestic affairs and allowing women limited freedom in the political and economic spheres. Women, mainly lower-class women, started to work outside the home to supplement the household income. If they did not have children, they also helped their husbands at harvest time. Even so, by the late eighteenth century women still could not vote or retain property after marriage, and, to make matters worse, their husbands could legally beat them. Wives were subordinate to their husbands and were expected to stay quiet and out of sight.

However, by the mid-nineteenth century women were given more independence; they began to demand a greater political presence along with an elevated social status. Abigail Adams first introduced the idea of equal rights for women, but only after the Revolution did society's view of women begin to change. Women started to study and became teachers; later on they worked in factories. On the one hand, however, women remained bound by the cult of domesticity: once a working woman married, she was expected to give up whatever job she had in order to stay at home. It had long been culturally established that wives were the moral compass of the family; they also assumed leadership in morality and piety in society, and they were expected to raise competent daughters and virtuous, patriotic sons.

On the other hand, many women felt that the home was a cage and longed to escape it and establish their own place in society. As domestic feminism became moderately popular, women chose to have fewer children, and families consequently grew smaller, though closer and more affectionate. By 1860 women had not managed to rid themselves of their domestic burden; however, many decided not to marry at all in order to sustain a career.

The Industrial Revolution brought about the birth of factories, which provided young women with opportunities for employment. The hardships they encountered led many to become involved in the developing trade unions and in associations that sought to aid the less fortunate, promote religious ideals, eliminate prostitution, alleviate poverty, and achieve equality. They wanted to correct abuses and instill a code of morality in men. Women also became involved in abolition and in denouncing the abuse of their black counterparts, which propelled them into politics; they wanted the constitutional principle of equality among men applied to women as well.

Consequently, the way in which Western society sees women today is very much the result of the struggle that our ancestors carried out. Nowadays women have the same rights as men, and they have the capacity to lead their own lives and run their own businesses. They are able to pursue a career, speak their minds and be respected, and have a family as well. Society has, at least in words, accepted equality between men and women, so both can reach top positions as judges, business leaders, and politicians, and even a number of previously all-male professions are opening their ranks to women. However, the traditional view of women's position in society is so deeply ingrained that women are still seen as the housewives who look after their families, an image reinforced even in TV advertisements. Women are still regarded as the weaker sex, and whenever a woman enters a male environment, whether as a coworker or as a boss, men are reluctant to accept her opinions or authority, perhaps because they feel they would lose their manliness if they did.

In sum, no matter how enormous the changes in the world, and no matter what country or social system people live in, no one can deny women's importance in history. Although it is fair to say that women, in the broadest sense, now have more freedom and equal rights, and are even given priority in many cases, prejudices still remain. It is important to look back on history and value the actions women have taken over the past centuries to advance gender equity, respect, and acceptance. I can say that we are living in the age of the multifaceted, self-sufficient woman, though many struggles still lie ahead before we achieve full consciousness of total gender equality.