Do you really think Hollywood is desensitizing the world's youth? (um, it's tough to tell without a tone of voice, but that's a legit question, not sarcasm) Do movies have that much influence on society, at least as far as violence is concerned?
I'm curious what other people think, since it's such a popular media topic. (One-line answer for me, which I can expand at will: no, I don't believe movies can be proven to be a major contributor to the "desensitization of youth," or to unbalance a balanced mind.) Sorry to try and start a philosophical discussion on a fluff thread, but I really am curious.
One more question: did anyone see From Hell? I wanted to, but I never got a chance. Is it worth the two hours?