Saturday, September 20, 2008

Why Women in Hollywood? Start of the Issue


Women in Hollywood originated around the same time as Hollywood itself. People began moving to Hollywood in the late 19th and early 20th centuries because it offered plenty of unused land for movie production (Allen, 2008). D.W. Griffith shot the first movie made in Hollywood, In Old California, in 1910. Today Hollywood is a hubbub of activity, home to numerous studio lots for television and film.
Hollywood is not known only for its movies. A district of Los Angeles, it is also known for its award shows, shopping, and numerous clubs, along with the paparazzi that follow all three. It is a breeding ground for celebrities and the rich and famous.
Because of the culture of Hollywood, and the opportunities available to the up-and-coming, a great deal of media attention is focused there. Tabloids and magazines publish pictures of celebrities daily, and many of them portray these people, especially women, poorly. Lindsay Lohan and Britney Spears have been photographed numerous times getting out of cars without underwear on. Tabloids run pictures of celebrity women in bathing suits and then circle the areas of their bodies that are not perfect.
Movies and television shows are often no better in their portrayal of women. Many movies still depict women as delicate housewives or, if they do work, as unfit mothers. Two recent movies, The Women and The House Bunny, show women in an unflattering light. The former shows women in a department store advertising facelifts and joking about cooking, and that is just in the previews. The latter shows women in scantily clad outfits, pretending to be dumb to get the attention of their male counterparts.
Our population is already obsessed with how we look. We do not need these movies and magazines telling us that our bodies are ugly and that we should dumb ourselves down for the sake of getting a boyfriend. These portrayals present a negative image to younger generations, who see them and act accordingly. No wonder eating disorders and plastic surgery are so common in the United States. These celebrities, who have so much influence on the nation, are giving women everywhere a bad name and influencing the population negatively.
This issue is highly prevalent today, and I am personally sick of watching television and movies, or passing tabloid magazines while grocery shopping, that portray women so negatively. The women who allow themselves to be shown this way need to realize how many people they influence and refuse the roles that treat women unfairly, and the general public needs to stop buying the tabloids that take women back to the 1940s.
Sources cited:
Allen, R. "Sociology of Film." WSU, Fall 2008.
