EP. 39 — MAKING OUTRAGE ADDICTIVE

(Transcripts may contain errors. Please check the corresponding audio before quoting in print.)

Weston Wamp: I'm Weston Wamp, and this is “Swamp Stories,” presented by Issue One. 

Weston Wamp: If you’re not paying for the product, then you are the product.

This adage applies to the offerings of technology companies that cost us nothing.

Or at least we’ve been led to believe that they’re free.

In reality, the website that began connecting college students nearly 20 years ago has fundamentally changed many of our daily lives. Many of the smartest computer programmers in the world have been charged with building algorithms to engage us and to addict us with the help of data that identifies who we are, how we think, and what we like to buy.

It turns out that they want us to spend as much time as possible on their platforms in order to rake in unprecedented amounts of ad revenue by delivering us — the buyers — to companies that are selling things. To do so, these companies’ business models rely on inflaming our emotions and creating insular communities of people who believe the same things. Within these bubbles, they elevate conversations with the most controversial and eye-popping content. 

So what’s the best way of keeping people glued? Division. Extremism. Hate. That’s often the content that attracts the most likes and shares — content that sells.

At the dawn of social media, the potential downside was rarely considered. These new platforms were going to usher in a new age of enlightened discourse, build new communities, and, in politics, create opportunities to speak to new voters in unconventional ways.

Not so much. 

This is Episode 39: Making Outrage Addictive

Weston Wamp: In order to assess the evolving impact of technology on our democracy, it’s necessary to understand how social media platforms — and the algorithms that power them — consume so much of our personal time. I spoke to a couple of the leading voices who are raising concerns and proposing solutions.

David Jay is the Chief Mobilization Officer at the Center for Humane Technology.

David Jay: We've seen tremendous innovation come out of the technology industry. A huge amount of that innovation has been focused on what's called engagement.

It's been thinking about, how do we get people to spend a little bit more time on our site? How do we get them to click the button that we want them to click? How do we get them to upload the videos we want them to upload?

Getting people to do what you want is as old as human history; persuasion is as old as human history. But we are a decade, a little more, into an unprecedented experiment in the automation of persuasion.

We are engaged in a race to the bottom of the brain stem, where the smartest minds of a generation, all over the world, are designing thousands and thousands, even millions, of experiments to figure out how to get a little bit further into everyone's mind and a little bit better at getting people to do the thing that we want.
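The "automation of persuasion" David describes is, mechanically, large-scale experimentation: platforms run enormous numbers of randomized tests and ship whichever variant produces more engagement. Below is a minimal sketch of a single such experiment; the variants, click rates, and user counts are all hypothetical, and real platforms use far more sophisticated statistics.

```python
import random

# Minimal sketch of one engagement experiment (all numbers hypothetical):
# show two variants of a feature to randomly assigned users and keep
# whichever produces more of the desired behavior (here, clicks).

def run_experiment(rate_a: float, rate_b: float, n_users: int = 10_000) -> dict:
    """Simulate an A/B test: each user sees one variant and clicks or not."""
    shows = {"A": 0, "B": 0}
    clicks = {"A": 0, "B": 0}
    for _ in range(n_users):
        variant = random.choice(["A", "B"])        # random assignment
        rate = rate_a if variant == "A" else rate_b
        shows[variant] += 1
        clicks[variant] += random.random() < rate  # True counts as 1
    return {v: clicks[v] / shows[v] for v in ("A", "B")}

# A platform runs thousands of these at once; the "winning" variant ships.
print(run_experiment(rate_a=0.030, rate_b=0.034))
# e.g., {'A': 0.0296, 'B': 0.0338} -> ship variant B
```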

Weston Wamp: Our brains have evolved slowly over millennia. Now, relatively suddenly, we are engaging with information through the internet in a way that we simply aren’t prepared for. We can’t evolve fast enough for the algorithms. 

And if any of this sounds overstated – click through on the next weekly usage report on your iPhone. Check the time you’ve spent on social media apps. Hours of our lives are being subjected to the experiment David described. 

As a father of four young children, I can’t help but fear that they’re growing up in a world where they won’t have a point of reference prior to the experiment.

Camille Carlton: When it comes to the struggles that especially younger generations feel on social media, it's not an issue of personal will or parenting. I mean, I see it with my nieces and nephews. I see it with cousins.

Weston Wamp: Camille Carlton is a colleague of David’s at the Center for Humane Technology. 

Camille Carlton: I just really want to emphasize that platforms are intentionally designing products to be addictive and to keep youth on. 

Weston Wamp: The success that tech companies have had in keeping young people on their platforms is stunning.

David Jay: So “influencer” is now the fourth-highest career aspiration among elementary school students.

The thing that we sit with, at the Center for Humane Technology, is that these tactics that are really good at drilling into the brain stem, really good at getting people's attention, and really good at shifting people's behavior are not only terrifying because technology that can shift what we think and how we behave is terrifying. They're terrifying because they have unintended consequences.

The same tactics that get someone to look at a screen all day may undermine their ability to form meaningful relationships with their peers. They may result in a feeling of social isolation. They may result in anxiety.

The same algorithms that amplify the most compelling content may make it so that cyberbullying within a school is heavily amplified. You get pile-ons against people who are seen as vulnerable, which can have really profound negative consequences.

Weston Wamp: The proverbial pile-on David references points to the way the algorithms prioritize certain types of content. And frankly, there’s growing evidence that the algorithms prefer problematic content.

David Jay: One good example of this is research by an ally of ours, Dr. Molly Crockett at Yale. Dr. Crockett looked at what emotional triggers were likely to get someone to hit a share button, both through, I believe, research on Twitter and through research in a lab.

What she found was that there were two emotions that were kind of neck and neck for the most likely to get someone to share something. The first was a sense of joy, the equivalent of sharing a cat video. The second was a sense of moral outrage, especially moral outrage against an out group.

Weston Wamp: When you combine an algorithm that rewards outrage and a business model that incentivizes creators based on popular content, you have a toxic situation.

David Jay: If I'm posting content and what I want to do is create content that's engaging, then if I post content that triggers a sense of moral outrage, especially against people who are seen as an other, that's the content that's going to get pushed to the top. So that's the content that I, as a creator, am going to get trained to create.

If I'm looking at content, then because content that triggers a sense of moral outrage is what's amplified, increasingly the only thing I see about an out group, say a political party that I don't agree with, is content that may or may not be true, but that triggers my sense of outrage about them.
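To make the loop David describes concrete, here is a toy sketch of engagement-based feed ranking. This is illustrative only, not any platform's actual code: real rankers use learned models, and the posts and weights below are invented. The point is simply that if predicted engagement drives ordering, and outrage predicts engagement, outrage-heavy content rises.

```python
# Toy sketch of engagement-based ranking (illustrative; the posts and
# weights are invented, and real platforms use learned models instead).

posts = [
    {"text": "Cute cat compilation",            "joy": 0.9, "outrage": 0.0},
    {"text": "Local park cleanup this weekend", "joy": 0.3, "outrage": 0.0},
    {"text": "You won't BELIEVE what THEY did", "joy": 0.0, "outrage": 0.9},
]

def predicted_engagement(post: dict) -> float:
    # Hypothetical weights: joy and outrage both predict shares, per the
    # research David cites, so a ranker optimizing engagement rewards both.
    return 1.0 * post["joy"] + 1.2 * post["outrage"]

feed = sorted(posts, key=predicted_engagement, reverse=True)
for post in feed:
    print(round(predicted_engagement(post), 2), post["text"])
# The outrage post ranks first, so creators are trained to make more of it.
```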

Weston Wamp: Unfortunately, it’s this social media dynamic that has fueled extremist groups in recent years.

David Jay: We have also seen from the Wall Street Journal that 64% of extremist group joins — and this was, I believe, several years ago — were due to group recommendations, because the groups that were highly engaging were the groups that triggered a sense of moral outrage.

There's a similar study that found that each word of moral outrage added to a tweet increased the rate of retweets by 17%.

It's not hard to imagine how an attention economy that rewards this sense of othering and outrage creates an environment in which finding shared understanding and operating effectively as a democracy becomes increasingly difficult. 
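The 17% figure David mentions is worth a back-of-the-envelope calculation. Assuming the effect is multiplicative and compounds with each added word (an assumption on our part, not a claim from the study), a few words of outrage meaningfully change a tweet's expected reach:

```python
# Back-of-the-envelope on the statistic David cites: if each moral-outrage
# word multiplies expected retweets by 1.17 (assuming, hypothetically, that
# the effect compounds per word), small wording changes add up quickly.

base_retweets = 100  # hypothetical baseline for a neutral tweet
for n_words in range(5):
    expected = base_retweets * (1.17 ** n_words)
    print(f"{n_words} outrage words -> ~{expected:.0f} expected retweets")
# 0 -> 100, 1 -> 117, 2 -> 137, 3 -> 160, 4 -> 187
```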

Weston Wamp: Nora Benavidez is a Senior Counsel with Free Press, where she manages the organization’s efforts around platform and media accountability to defend against digital threats to democracy. 

Nora Benavidez: With the digital age, of course, what we've seen is not just an increase in the opportunity to engage with each other and the ability to express ourselves, to connect you and me, or you and another person across the globe, but that there are other malign forces behind many of these forms of communication. And so social media in particular, we have seen, has been fertile ground for the spread of disinformation, misinformation, hateful content, junk news (that is, low-quality journalism), and everything in between. 

And what we've seen is that it isn't just that this type of content exists online; it is that this content is being fed to us for profit by social media companies. Their business model incentivizes engagement, and the most controversial content is engaged with the most. We know that extremist content gets engaged with six times more than non-extremist content on social media platforms.

Weston Wamp: One part of the problem, Nora points out, is that we don’t fully understand the business model of these giant social media companies. And given the stakes for civilization, the lack of transparency is chilling.

Nora Benavidez: In the mix with all of this, there is one, I think, defining feature, which is that we have no transparency into these practices. We don't fully grasp what Weston's feed looks like versus Nora's feed. And so I'm living my reality and it's wonderful and I love my Twitter feed, I love my Instagram feed. I have crafted those to be great places for me. I have no idea what yours look like.

And so to the question of how discourse is affected by the online world: it's affected in every way, because our information pathways are being shaped by what social media business practices look like. And it means that maybe sometimes we are very divided, and sometimes we may not be. I'm not ready to say that in all instances we are more divided than ever. I think that's hyperbole. But I do believe that we don't have an adequate sense of what each of us is experiencing and how that will then influence our attitudes, our voting preferences, what we think is the most pressing issue of the day.

Weston Wamp: The democratization of information has taken a dark turn toward the monetization of disinformation. We’ve all seen disinformation flow freely on social media, in part because it evokes moral outrage, which the algorithms like, and in part because the algorithms surround us with like-minded voices: echo chambers.

The psychological and cultural ramifications of social media rewiring our brains are alarming enough. But how this affects our politics, and ultimately our ability to self-govern, is a gravely important consideration. The commodification of controversy – of division, of hyperpartisanship – is changing our politics. 

As Camille Carlton explained, one encouraging trend is that tech companies are coming under increased pressure from both sides of the aisle in Washington.

Camille Carlton: Thanks to a lot of groundwork by researchers and whistleblowers like Frances Haugen, there's an appetite both within Congress and on the ground. Parents are asking for it. We're asking for this. So, I think we feel really hopeful about that.

Weston Wamp: Even legislatively, there is momentum behind solutions that will bring appropriate regulation to an industry that is fundamentally altering the lives of millions of Americans.

Camille Carlton: One of the really big themes that we're seeing, I think, right now in a lot of legislation that's popping up, is a call for just general transparency and understanding of the harms that tech platforms are causing.

So different pieces of legislation, they're looking at just starting to ask questions and give resources to say, “what are the harms of technology? How is social media affecting our kids? How is social media affecting our stability?” And just building that into the framework.

David Jay: That's where legislative frameworks like the KIDS Act and the Kids PRIVCY Act come in. They have some very solid, principled approaches to addressing the challenges that we've been talking about here, especially as they impact young people.

I think about things like preventing micro-targeted advertising from being aimed at young people. That's a really important example.

They also include components like funding the FTC to be able to better build out the capacity to regulate technology going forward. Because, I think, having that regulatory capacity where people can keep up with the speed of innovation is going to be really critical.

Weston Wamp: As Nora explained to me, finding solutions to these problems is going to be complex, ranging from education efforts to demands for public accountability. And, of course, we’re going to need more transparency.

Nora Benavidez: The path has to be interdisciplinary, it can't be a single sector or a single set of actors that are working on this alone. And so I often try to propose, we need direct interventions. The individual media literacy training that you or I might need to be able to gird against and identify misleading content online, we need that. We need that in our education system, we need that, of course, throughout life for older adults as well.

And then we need more systemic reforms and I typically think of those in a couple of different categories. One is what companies themselves can and should be doing. We are reaching what feels like a fever pitch of public pressure for social media and other tech companies to take action, to sort of move beyond what has been a very obvious lack of good faith in coming to the table to make reforms themselves.

And then we need to have social media companies show us the receipts to actually give us transparency into what they're doing, show us what their practices are, their business models, their moderation and enforcement practices.

Weston Wamp: On the next episode of “Swamp Stories,” we’ll speak with former Congressman Carlos Curbelo, a Miami native whose Cuban American roots and often contrarian perspective within conservative circles have made him one of the most interesting voices in American politics.

Weston Wamp: Thanks for listening to “Swamp Stories,” presented by Issue One, the country's leading political reform organization that unites Republicans, Democrats, and independents to fix our broken political system. Please subscribe to the podcast and share it with your friends. Even better, rate and review it on iTunes to help us reach more listeners. You can find out more at swampstories.org. I'm your host, Weston Wamp. A special thank you to executive producer Ethan Rome, senior producer Evan Ottenfeld, producer Sydney Richards, and editor Parker from ParkerPodcasting.com. “Swamp Stories” was recorded in Tennessee, edited in Texas, and can be found wherever you listen to podcasts.

