I wonder what would happen if the press actually told the truth. A larger question, maybe, is why they wouldn't want to tell the truth. Who are they, and what is their purpose, if not to inform the general public about what is actually happening in our communities?
Is it the job of the news media to explain issues and inform people, or to manage the news according to its own agenda and ideology?
It appears that the media, as a group, has lost the ability to function as an independent source of information bound by journalistic integrity, with the moral obligation that comes with the freedom of the press the Constitution guarantees.
But as we all know, freedom doesn't mean license; it means the media has a responsibility to report the news, not make the news. There is a big difference, and it appears the guardians of our nation can no longer tell the difference.
It all depends on who your readers are: making the news rather than reporting the news.