Category: Information / Topics: Bias, Distortion • Credibility • Information • Internet • Media • Perception • Trends
by Stu Johnson
Posted: October 2020
Where and how you get your information matters…
Two things hit my radar this past week that spoke to the contentious election season in which we find ourselves in 2020.
Note: See the follow-up to this article in the October 16 post, "The Facebook Threat."
Last Sunday, Ted Koppel did a piece on CBS Sunday Morning about the sources of information people use leading up to an election. He was dismayed by how many (he focused on Trump supporters) did not rely on traditional sources of news—radio, TV (even Fox), newspapers—but on other sources, especially social media. To me, this pointed to several things:
Over the past year or so there has been an increasing focus on the role that technology giants like Amazon, Facebook, and Google play on a global scale. This includes scrutiny of their handling of privacy, their predatory practices (buying out smaller companies to either secure or bury their apps), and their role in the toxic divisiveness seen today.
While Facebook and other platforms foster a sense of community not possible through earlier technologies, it is important to recognize that, because they run on business models that rely on advertising (as opposed to subscription or non-profit models), their true clients are not the users but the advertisers. And the sophistication of the technology pushes targeting to levels not possible with print and broadcasting. A few examples follow, then the warning I received this past week from Mozilla that relates to the election. I will use Facebook as my example, but all of the major social media platforms share these traits (and are owned by the biggest players, so far free from anti-trust action).
To increase exposure to advertising, the platform must gain your attention and hold it for as long as possible. This leads to features and content that grab you (admittedly with pleasure on the part of many users, because they are, in fact, useful and fun).
Driving the ability to target advertising is the network of users, which captures as much data as possible about you, your interests, and your purchasing habits. Combined with your network of friends, this creates a "filter bubble" that makes your experience on the platform unique to you. What you post may be visible only to your Friends, but it does not escape the eager grasp of the Facebook database.
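To make the idea concrete, here is a toy sketch (in no way Facebook's actual code; the scoring weights and data shapes are invented for illustration) of how a filter bubble can emerge when a feed is ranked by similarity to what you and your friends already like:

```python
# Toy illustration: rank posts by overlap with the user's interests
# plus how many friends engaged with each post. Content that matches
# your existing views rises; dissimilar views sink out of sight.
def rank_feed(posts, user_interests, friend_likes):
    def score(post):
        topic_match = len(set(post["topics"]) & set(user_interests))
        social_boost = friend_likes.get(post["id"], 0)
        return topic_match * 2 + social_boost
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": ["gardening"]},
    {"id": 2, "topics": ["politics", "camp_a"]},
    {"id": 3, "topics": ["politics", "camp_b"]},
]
feed = rank_feed(posts,
                 user_interests=["politics", "camp_a"],
                 friend_likes={2: 5})
print([p["id"] for p in feed])  # → [2, 3, 1]: "camp_a" content tops the feed
```

Run repeatedly over millions of users, even a crude loop like this steadily narrows what each person sees.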
That little Like button that seems so innocuous is actually a simple way for Facebook to expand the extent and detail of its data on you and other users. Tagging people in pictures is another. Artificial intelligence (AI) is increasing the ability to analyze behavior and lifestyle choices, looking for patterns, all to drive advertising targeted with ever more precision. Predictive analysis can look at the steps a buyer takes to decide on a big purchase (say, an expensive vehicle), then find other users in the network who might be nudged down a similar path through contacts, ads, content, and recommendations.
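A hedged sketch of that "find similar users" step: one common approach to such lookalike targeting is to represent each user as a vector of behavioral signals and measure similarity to a known buyer. The feature names and numbers below are entirely made up for illustration:

```python
# Hypothetical "lookalike" targeting: rank users by how closely their
# behavior vector resembles that of someone who just bought a car.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Invented features: [car-review visits, loan searches, auto-page likes]
known_buyer = [5, 3, 4]
users = {"ann": [4, 2, 5], "ben": [0, 0, 1], "cal": [5, 4, 3]}

# The most similar users become the audience for the targeted ads.
lookalikes = sorted(users, key=lambda u: cosine(users[u], known_buyer),
                    reverse=True)
print(lookalikes)  # → ['cal', 'ann', 'ben']
```

Real systems use far richer signals and machine-learned models, but the principle is the same: resemblance to past buyers determines who sees the ad next.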
The most insidious example related to political and social division is in the area of news feeds and recommendations. Realizing that people spend more time when exposed to negative rather than positive viewpoints—which increases potential advertising revenue—Facebook and the other platforms use algorithms that tailor recommendations to each user, clustering people into opposing camps rather than bringing them together into community (as Mark Zuckerberg would have you believe they do).
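The dynamic can be sketched in a few lines. This is illustrative only, with assumed reaction weights, not the platform's real formula; the point is that once "angry" reactions are weighted as strong engagement, divisive content wins the ranking automatically:

```python
# Illustrative engagement-maximizing ranker. If angry reactions predict
# longer sessions, weighting them highly promotes outrage by design.
ENGAGEMENT_WEIGHT = {"like": 1.0, "share": 2.0, "angry": 3.5}  # assumed values

def engagement_score(post):
    return sum(ENGAGEMENT_WEIGHT[r] * n for r, n in post["reactions"].items())

posts = [
    {"title": "Local charity hits its goal",
     "reactions": {"like": 40, "share": 2}},
    {"title": "Outrageous claim about the other side",
     "reactions": {"like": 5, "angry": 20, "share": 8}},
]
ranked = sorted(posts, key=engagement_score, reverse=True)
print(ranked[0]["title"])  # the divisive post outranks the pleasant one
```

No one has to program "divide people" explicitly; optimizing for time-on-site is enough.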
For many years I used Google Chrome as my default browser because it did the best job of giving me a faithful presentation for website code development. After learning more about the troubling practices of Big Tech, I switched to Firefox, the browser offered by Mozilla, a non-profit group concerned about privacy issues and best practices for making the Internet a good place to be. (Gallup added large technology companies to its list of institutions this year; they came in at number 10, with 32% expressing confidence.)
Following is a message that came this past week from Mozilla:
Facebook Groups pose a major threat in this election season.
They’ve become hidden breeding grounds for disinformation campaigns and organizing platforms for extremists. And Facebook’s own algorithmic recommendation engines actively grow these networks by promoting them to unsuspecting users – something the company has known since 2016.
With conspiracy theories, disinformation, and foreign influence running rampant in Facebook Groups, the company must turn off group recommendations until the U.S. election results are certified.
In recent days, the company acknowledged the role of Groups in spreading misinformation by discontinuing recommendations of health Groups to “prioritize connecting people with accurate health information.” While this is a good step, this isn’t a strategy - it’s a never-ending game of whack-a-mole with devastating consequences.
Facebook has known about this problem for years but ignored it, while extremism grew on the platform. In fact, the company began heavily promoting Groups for the last several years even though in 2016, researchers presented evidence to the company showing that “64% of all extremist group joins are due to [Facebook’s] recommendation tools…” in other words “[Facebook’s] recommendation systems grow the problem.”
Bad actors will use whichever Groups they can to plant disinformation, and algorithmic recommendations for Facebook users to join new Groups help grow a potential audience. With a critical election underway, Facebook must take this policy a step further and stop ALL group recommendations until election results are certified.
Read the Mozilla article, sign the petition, and find other related resources.
This article was also posted on SeniorLifestyle, which Stu edits. There you can also see Dan Seagren's article on
Search all articles by Stu Johnson
Stu Johnson is owner of Stuart Johnson & Associates, a communications consultancy in Wheaton, Illinois focused on "making information make sense."
• E-mail the author (moc.setaicossajs@uts). For web-based email, you may need to copy and paste the address yourself.
Posted: October 2020 Accessed 3,366 times
Go to the list of most recent InfoMatters Blogs
Search InfoMatters (You can expand the search to the entire site)