There's an ever-present complaint from non-Americans that US'uns have it backward... we're comfortable with violence in the media, but uncomfortable with sexuality. I'm not posting to debate which view is correct. Instead, I'd like to discuss WHY we are that way.
I would guess it goes back to the fact that this country was founded largely on religious grounds. That's where the whole "under God" thing comes from. Sex bad. But why would violence be acceptable to those same people? Religion certainly has a history of violence, but is American religion any more violent than European religion?
Another thought: maybe it's because we think violence can be faked, so it's OK. What I mean is, if we go to a movie and see someone take a bullet to the head, we can tell ourselves, "That's fine, it wasn't real." The guy gets back up after the camera stops. But when you go to a movie and see a naked woman, that's it. You've seen real nudity. She might put her clothes back on afterward, but the "damage" is done.
I don't know. Any thoughts?