
The 2023 Social Media Summit@MIT Event Report


IS THE TIDE TURNING FOR MISINFORMATION?

Truth is making progress on social media. But with platforms regrouping and facing tough times, can the gains continue?

Tom Cunningham, Economist
Kate Klonick, Associate Professor of Law, St. John’s University
David Rand (Moderator), Professor, MIT Sloan, and Research Lead, Misinformation and Fake News, MIT IDE

While the threat of online misinformation remains serious, some experiments with intervention seem to be working and turning the tide. The proliferation of fake news peaked in 2016, and since then, growing concern has kept online misinformation in check. This panel, led by IDE leader David Rand, considered what should—and should not—be done to keep it there.

Greater awareness and vigilance are helping to keep social media truthful. Still, it seems unlikely that misinformation and fake news can be eliminated entirely, especially as platform companies face earnings pressure, panelists said. Debates now center on how social media platforms and others should combat the spread of misinformation and prevent backsliding.

Since the 2016 U.S. presidential elections, social media platforms have experimented with interventions to flag or remove false information. This type of content moderation has been effective and represents the best way forward, said Tom Cunningham.

Cunningham, who previously worked as an economist and data scientist for Facebook and Twitter, believes that content moderation can reduce misinformation. But there are tradeoffs, he added, such as inadvertently taking down accurate information and infringing on free speech. Identifying misinformation isn’t as straightforward as it might seem.

Rand agreed, noting that when fake news first began to spread on Facebook, he believed it would be easy to tackle. “There are things that are true and things that are false,” he recalled. “If the platform wanted to do something about it, it could just get rid of the false stuff.” However, Rand said that once he started working in the field, “I changed my mind. This is way more complicated than I’d appreciated.”

Rand now sees online information as a “continuum” of truth and falsity, with most content somewhere in the middle. “That doesn’t mean we’re in a post-truth world,” he said. “It’s just that accuracy is hard to measure with certainty.”

Research shows that most social media users don’t actually want to share misinformation, Rand explained. Instead, many are either confused or not paying enough attention.

Gold Standard<br />

Professional fact-checking by dedicated employees is “still the gold standard,” Rand said. But even fact-checkers don’t always agree on what’s true or false.

Cunningham suggested that decentralized social media platforms accelerate the spread of misinformation because they’re “bad at assessing the quality of content...people tend to click whatever is shiny, whatever looks good, but not what is good.” This can amplify sensational and superficial ideas. By contrast, centralized media—newspapers, radio, and television—have traditionally been more cautious, fearing they’ll lose readership and ad revenue, or risk litigation.

[Read more about decentralization on page 12.]

Kate Klonick noted that Facebook and Twitter are mostly centralized, which may be good for creating what she called “one place for intervention.” But over the next couple of years, she expects to see more decentralization, where “misinformation is breeding, and it’s going to be very difficult to track.”

Cunningham noted that while misinformation is still too prevalent, it has declined by a factor of five or 10 since peaking in 2016. The reason? Platforms, he said, have “added friction on politics, on sharing, on viral content, on certain types of reactions, on content from new producers, on content from fringe producers, and on content that looks like clickbait.” While this has significantly decreased the rate of misinformation, Cunningham added, “Anything that looks a little bit like misinformation is going to get suppressed as well.”

Platforms Under Pressure<br />

Cunningham also maintained that social-media companies should strive to be platforms that distribute high-quality content, not just platforms that maximize retention, likes, and time spent. And yet, in the current cost-cutting environment, many social media companies have been laying off their ethicists and fact-checkers.

Klonick said that while “lots of work has been done to raise awareness about malicious actors and misinformation,” the tradeoff may be increased public skepticism about what is or is not fake news.

She doubts that legal interventions will help to clarify online truth. “One of the very unpopular things that I have said over the last five years, but I think is ultimately true, is that in a free society, the law does not necessarily have a role to play in trying to determine truth,” Klonick explained.

Sometimes, she added, it’s best to let the private sector be the gatekeepers. Platforms know how to create friction and slow things down. “Markets,” Klonick said, “respond faster than the law to changing norms.”

For these reasons and more, Klonick doesn’t see legislative remedies coming to the rescue anytime soon. Policy proposals, she said, are “a disincentive, a stick...but they’re not that helpful.” What’s more, she said, substantial bans on certain types of content, based on subjective judgments, are “not going to be in line with First Amendment principles at all.”

The Cost of Regulation

In many countries, social media platforms already comply with content regulations, Cunningham noted. Much of that content would be removed anyway, he said: “They’re going to take down terrorist stuff, child porn; they’re going to take down hate speech.”

Before the current layoffs, platforms were aggressively addressing the problem. They were spending 5% of their overall costs on data scientists, content raters, and different structures for moderating content, Cunningham said, even when they were not legally required to do so. The reason wasn’t altruism or free speech, but pressure from advertisers.

Cunningham isn’t confident that most platform companies “have a super-clear North Star; they’re responsive to half a dozen different constituencies.” Those include the media, governments, advertisers, users, employees, and investors. “All of those,” Cunningham added, “have very strong opinions about what content should be on the platform.”

Ideally, social media platforms will discover new ways to keep the good content while eliminating the bad. But, Cunningham warned, “I don’t think that we should be crossing our fingers for that business model in the very short term.”

