Now, I'm pretty far left myself, but for some reason this is bothering me. It feels like exactly the kind of thing that makes people think journalism schools are liberal feeding grounds.
1. Am I nuts, or was it a bad idea for them to show us this?
2. Should I suggest a different movie about media bias, one that doesn't have a slant of its own?
3. What movies or books are good on fair media coverage and reporting?