Devchonka Posted June 7, 2010

Is there anyone else who feels this strongly about ALL the rape scenes in television and movies? It pisses me off so much. I wasn't raped myself, but I feel connected to fellow women and it just feels wrong. I feel like the rape of women is used to excite anger and pity for entertainment purposes. I feel like women are being used. Men don't know what that emotion is like, and I know women are not at all interested in seeing women get savagely raped, especially when the scenes last way too long: a struggling victim with fear in her eyes and one or more brutal men over her. Not to mention it is overwhelmingly men making the movies and writing the scripts... so who are they writing this for? Most of the time the scene doesn't contribute anything to the movie or the show anyway, not in a way that couldn't be portrayed some other way.

I just watched Play Dirty, and the scene caught me off guard because it had nothing to do with the movie aside from showing us that the men were brutal (as if the rest of the movie didn't). All I kept thinking was: HOW THE **** IS THIS ENTERTAINMENT? WHY ARE YOU SHOWING THIS TO ME?

I would love to see how many scenes of men being anally raped there are for every rape scene involving a woman, even on SVU, which I refuse to watch anymore. You know why there aren't any? Because men would never stomach constantly watching themselves being raped. THAT'S why it's only women or little boys and girls.

I mean, really? Does no one else find it disturbing? Men portraying women in inescapable situations where men are destroying their dignity and lives while overpowering them and enjoying the fear and sadness in their eyes? WHY IS THIS OKAY?