If anyone has somehow decided that the media bias we see in all "news" outlets is something new, and that Facebook alone has corrupted the information business, they should check themselves into the nearest clinic to be examined.
Media bias is worse now, I believe, and I have been watching this stuff for decades. It is worse because there are so many different outlets bringing the "news" to everyone on a minute-by-minute basis. But media bias was a given even decades ago, when there were only three major television news outlets and just a handful of major print sources working to convince us they were actually telling the truth.
Remember Walter Cronkite's signoff at the end of his CBS newscast, "And that's the way it is"? Well, most likely it wasn't the way it was. The editors decided for you the way it was.
In reality, they believed that what they were doing was legitimate. They believed beyond a doubt that it was their job to decide what was important, and then to print that decision or hand the copy to the news anchor to broadcast as reality. Maybe it was, but then again, maybe not.
Many of the words were changed to protect the innocent and the guilty, depending on the editors' politics.
The Blurred Lines of Modern Media
Armstrong Williams / @Arightside
Facebook is once again under heavy fire for its seemingly insatiable appetite for slurping up and analyzing endless streams of user data. The social media giant is again playing defense over its powerful role in controlling the news that each of us sees on our Facebook feeds.
Like so many other modern sites, Facebook takes an algorithmic approach to news. It uses code to draw its own conclusions about what stories we want to see, thereby impacting how we look at certain issues.
The chorus of criticism is deafening. However, the mainstream media is being incredibly hypocritical by teaming up against Mark Zuckerberg and his company. After all, how is the media process of determining what people to feature and what topics to cover any different than what Facebook is doing?
If anything, the old way of choosing the news is likely more prone to political or situational bias and more susceptible to human error. After all, people are fallible, and we all make mistakes.
I know about this firsthand. As a syndicated columnist, my articles run in print in various publications across the world and appear on various websites. My reach is further expanded through my own social media channels, as well as through dedicated readers consistently sharing my content with an ever-widening audience by posting links to the stories in their social media feeds, or by emailing them to their family members and friends.
I will be the first to admit that the topics I explore are often determined by situations arising in my own life. It’s not an inherently good or bad thing; it just is.
If a friend were to be diagnosed with an illness, then without any conscious effort I would probably be more inclined to write pieces related to wellness or nutrition, or I might muse philosophically about the most important things in our lives.
While there are many similarities between the “curation” of news in newsrooms and around the table at editorial meetings, and the algorithmic curation of news by Facebook and other sites, there are also some notable differences.
Social media often forces us to experience a stronger media bias. This is likely due in large part to the fact that the goal is to drive clicks, to extend the time we spend within the confines of any one platform, and to increase the number of stories with which we interact.
There are clear financial incentives for this approach: Facebook benefits by collecting user data about our preferences. It also increases the likelihood of converting visitors into customers, because Facebook’s advertisers are able to bombard us with sales pitches. In addition, when we increase the amount of time we spend on any one platform, we boost its statistics, which lays the foundation for ever-rising advertising rates paid by corporations and brands eager to connect with captive audiences.
Also, let’s not forget that the “social” aspect of social media all but guarantees that we become more firmly ensconced in silos and echo chambers. If one person’s friends are talking about a specific topic, then Facebook is more likely to display content related to it. While for me, this ironically often means that I am presented with a diverse array of viewpoints and topics, for most people, the opposite seems to be true. For the average user, Facebook’s machines lead them to like-minded people talking in similar ways about the same topics as dictated by the platform.
Another clear differentiator is that social media is more likely than traditional media to commingle stories that are related thematically but may not be from the same news cycle or timeframe. How many times have we all experienced reading a story about a topic in the news, only to click on a related story that is linked below it or even hyperlinked from within the text itself?
A negative byproduct of this machine approach to sharing news is that it is sometimes very hard to quickly grasp the time difference between pieces that otherwise seem to be related. This can create confusion and compound issues that should actually be kept separate.
While reading recent coverage of the mass shooting at Marjory Stoneman Douglas High School, I suddenly found myself skimming a story about a mass stabbing in another school. When it showed up in my news feed, I mistakenly thought that it was a breaking story; however, it was actually an older incident that took place in 2014.
With access to more information than ever before, we must be ever vigilant of the drawbacks of modern media, and we must always be ready to read with a critical eye.