The appearance of Facebook whistleblower Frances Haugen on 60 Minutes this week raised the level of concern about the power and influence of Facebook, Google, Twitter and other social media platforms. Their ability to influence everything from clothing trends to election outcomes is a source of alarm on many levels.

Haugen’s complaints, as filed with the Securities and Exchange Commission and revealed in an extensive series in The Wall Street Journal, demonstrate how the algorithms that determine our viewing paths (driven by the personal preference data these companies collect on each of us) increase ad revenue and swell their bottom line. While few are surprised by her remarks, it is startling to hear how Facebook repeatedly and cavalierly chooses its profits over the wellbeing of its users.
Says Haugen, “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook, over and over again, chose to optimize for its own interests, like making more money. Facebook is picking out that content…that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.”

Particularly disturbing in Haugen’s report is the role Instagram, owned by Facebook, plays in the lives of teenage girls. When 60 Minutes’ commentator Scott Pelley prompted, “One study says 13.5% of teen girls say Instagram makes thoughts of suicide worse and 17% of teen girls say Instagram makes eating disorders worse,” Haugen responded, “and what’s super tragic is Facebook’s own research says as these young women begin to consume this eating disorder content, they get more and more depressed and it actually makes them use the app more and so they end up in this feedback cycle where they hate their bodies more and more.”
I have long been a proponent of free speech, and I have always looked cautiously at regulatory policies that can endanger it, except when the public good is sacrificed to the bottom line. Earlier in my career I was involved in media justice, working against consolidation in the broadcast industry, which silenced marginalized voices, especially women, people with disabilities and people of color. Section 230 of the 1996 Communications Decency Act (CDA), which has received much publicity in today’s debates, was initially passed to safeguard free speech. However, the original intent of the law has been twisted by the profits-first business model of social media giants like Facebook.

CDA Section 230 is now used to provide “safe harbor” for algorithms that favor anger, ethnic hatred and misinformation in an attempt to keep users “engaged” in the site—irrespective of the well-being of the user or society at large—for the purpose of adding to the corporate bottom line. Digital advertising spending worldwide stood at an estimated 378 billion U.S. dollars in 2020, and that figure is forecast to keep climbing, reaching a total of 646 billion U.S. dollars by 2024. So, these advertising dollars form a very sweet pot indeed.
Many of Haugen’s on-air comments centered on the role of misinformation in our political arena. The motivation behind algorithmic choices that favor corporate greed over national security applies here as well. According to Haugen, Facebook understood the danger of misinformation in the polarized 2020 election and turned on safety systems to reduce it. Then she added, “as soon as the election was over, they turned them back off or they changed the settings back to what they were before to prioritize growth over safety. And that really feels like a betrayal of democracy to me.” Nothing, it seems, is sacred to Facebook.

The internet was in its infancy and social media did not exist when many regulatory provisions were established. It is time for Congress to take a long, hard look at how social media has changed in the 21st century and reshape regulatory policies to be relevant to our day. For example, the controversial CDA Section 230 could be modified to include a “duty-of-care” provision that adds a “common good” mandate under which social media platforms must operate. Congress can amend the quarter-century-old law to more appropriately reflect today’s media landscape.
These and other guardrails are necessary for digital platforms like Google, Twitter and Facebook, which operate in largely unregulated territory across so much of daily life. “We the people” must challenge Congress to limit the power of these massive entities that dominate so much of our attention. There must be limits placed on the opaque and underhanded methodologies used to drive traffic in ways that benefit only profit margins, not the platforms’ often unwitting users.
Until people realize that they are the product, handing over their information for free for social media to sell to its real customers, the advertisers, we can’t progress. It is a diabolical business model, resulting in few jobs in a booming industry since users do all the work. These distortions of decency are a greater threat to American democracy and economic opportunity for the masses than any Robber Baron monopoly ever was. Further, they have a stranglehold on millions of people’s brains and have rigged the outmoded rules to keep their power. Don’t use them!
Antitrust enforcement, restoration of reasonable taxes on the rich, and a shift to subscription models could all help, as would enforcement of penalties against hate crimes and legal action against perpetrators of harmful lies.
Bob, this is an important piece. FYI I highly recommend the movie “The Social Dilemma” if you haven’t seen it. Am dying to meet the person behind the making of the documentary, as he started some sort of institute and got people to speak out. I want to tell them about other, much better ways we can use the immense resources and talent that have been deployed to get people depressed and addicted to their phones—for example, what Just Results is doing: applying user experience techniques to digitizing administrative procedures that small businesses struggle with.
I will second Lara’s recommendation of “The Social Dilemma.” We watched it recently with both kids and it led to some great conversations. One of the creators of the film now has a podcast called “Your Undivided Attention.” One of the points they make in the film, which is essential for understanding the insidious nature of these platforms, is that it’s not so much that they are selling your data. What they are really selling is your attention. It’s fascinating (and scary) the way they describe the difference.
I don’t think censoring the platforms is really the best way to fix it. It’s really all about the algorithms. If we did nothing else but just return to a chronological feed, that would instantaneously make a difference. If YouTube stopped “suggesting” videos, that would make a huge difference, too. It’s the addictive nature that’s so destructive. And it’s addictive because of the algorithms.
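To make that distinction concrete, here is a minimal Python sketch contrasting a chronological feed with an engagement-ranked one. The posts, scores and ordering rules are entirely invented for illustration; this is not any platform’s actual ranking algorithm.

```python
# Toy posts: each has a timestamp (larger = newer) and a predicted
# engagement score. All names and numbers are made up for illustration.
POSTS = [
    {"title": "local news",     "ts": 3, "engagement": 10},
    {"title": "outrage bait",   "ts": 1, "engagement": 95},
    {"title": "friend's photo", "ts": 2, "engagement": 30},
]

def chronological_feed(posts):
    """Newest first: ordering depends only on when a post was made."""
    return sorted(posts, key=lambda p: p["ts"], reverse=True)

def engagement_feed(posts):
    """Highest predicted engagement first, regardless of recency."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

chron = [p["title"] for p in chronological_feed(POSTS)]
ranked = [p["title"] for p in engagement_feed(POSTS)]
print(chron)   # ['local news', "friend's photo", 'outrage bait']
print(ranked)  # ['outrage bait', "friend's photo", 'local news']
```

The two feeds contain identical posts; only the sort key differs. That one design choice is what decides whether the most provocative item or simply the most recent one lands at the top of the screen.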
An algorithm favoring “anger, ethnic hatred and misinformation in an attempt to keep users ‘engaged’ in the site”… I hadn’t heard of this. Kind of horrifying, but not surprising.
I’d like to learn more about the “duty of care” provision, as it sounds like that may be part of the answer. I’ll always remember my mom (1960’s?) standing up in church during an after-service congregational open discussion and asking, “Aren’t we just saying that there IS no freedom without responsibility?”