The appearance of Facebook whistleblower Frances Haugen on 60 Minutes this week raised the level of concern about the power and influence of Facebook, Google, Twitter and other social media platforms. Their ability to influence everything from clothing trends to election outcomes is a source of alarm on many levels.
Haugen’s complaints, filed with the Securities and Exchange Commission and revealed in an extensive series in The Wall Street Journal, demonstrate how the algorithms that determine our viewing paths (built from the personal preference data collected on each of us) increase ad revenue and swell the company’s bottom line. While few are surprised by her remarks, it is startling to hear how Facebook repeatedly and cavalierly chooses its profits over the well-being of its users.
Says Haugen, “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook, over and over again, chose to optimize for its own interests, like making more money. Facebook is picking out that content…that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.”
Particularly disturbing in Haugen’s report is the role Instagram, owned by Facebook, plays in the lives of teenage girls. When 60 Minutes correspondent Scott Pelley prompted, “One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse and 17% of teen girls say Instagram makes eating disorders worse,” Haugen responded, “and what’s super tragic is Facebook’s own research says as these young women begin to consume this eating disorder content, they get more and more depressed and it actually makes them use the app more and so they end up in this feedback cycle where they hate their bodies more and more.”
I have long been a proponent of free speech, and I have always looked cautiously at regulatory policies that can endanger it; the exception is when the public good is sacrificed to the bottom line. Earlier in my career I was involved in media justice, working against consolidation in the broadcast industry, which silenced marginalized voices—especially women, people with disabilities and people of color. Section 230 of the 1996 Communications Decency Act (CDA)—which has received much publicity in today’s debates—was initially passed to safeguard free speech. However, the original intent of the law has been twisted by the profits-first business model of social media giants like Facebook.
CDA Section 230 is now used to provide “safe harbor” for algorithms that favor anger, ethnic hatred and misinformation in an attempt to keep users “engaged” with the site—irrespective of the well-being of the user or society at large—for the purpose of adding to the corporate bottom line. Worldwide digital advertising spending stood at an estimated 378 billion U.S. dollars in 2020 and is forecast to grow in the coming years, reaching 646 billion U.S. dollars by 2024. Those advertising dollars form a very sweet pot indeed.
Much of Haugen’s on-air commentary centered on the role of misinformation in our political arena. The motivation behind algorithmic choices that favor corporate greed over national security applies here as well. According to Haugen, Facebook understood the danger of misinformation in the polarized 2020 election and turned on safety systems to reduce it. Then she added, “as soon as the election was over, they turned them back off or they changed the settings back to what they were before to prioritize growth over safety. And that really feels like a betrayal of democracy to me.” Nothing, it seems, is sacred to Facebook.
The internet was in its infancy and social media did not exist when many regulatory provisions were established. It is time for Congress to take a long, hard look at how social media has changed in the 21st century and reshape regulatory policy so that it is relevant to our day. For example, the controversial CDA Section 230 could be modified to include a “duty-of-care” provision that adds a “common good” mandate under which social media platforms must operate. Congress can amend the quarter-century-old law to more appropriately reflect today’s media landscape.
These and other guardrails are necessary for digital platforms like Google, Twitter and Facebook, which run largely unregulated through so much of our daily lives. “We the people” must challenge Congress to limit the power of these massive entities. There must be limitations placed on the opaque and underhanded methodologies used to drive traffic in ways that benefit profit margins rather than their often unwitting users.