With the Election Over, Facebook Gets Back to Spreading Misinformation

Authored by vanityfair.com and submitted by DonaldWillKillUsAll

In the run-up to last month’s election and in its immediate aftermath, Facebook broke from its typically laissez-faire approach to the content posted on its platform: The social network instituted a new political ad policy to combat lies and misinformation and began putting warning labels on Donald Trump’s conspicuously bogus and wildly reckless conspiracy posts. And, as those lies and delusions nevertheless spread through the bloodstream of American politics in the days after Trump lost to Joe Biden, the company tweaked its algorithms to elevate posts from trusted news sources over the hyper-partisan, less trustworthy ones that amplified falsehoods.

They were imperfect measures, but they were at least something, and a clear demonstration that Facebook can address issues plaguing its platform if it wants to. But after those baby steps forward during election season, Facebook has taken a big step back: Though it is continuing its policy against bogus political ads, at least for the Georgia runoffs, the social media giant confirmed to the New York Times that it did away this week with the algorithm modification that prioritized credible news sources, once again setting the stage for fake and hyper-partisan news to flourish. “This was a temporary change we made to help limit the spread of inaccurate claims about the election,” Facebook spokesman Joe Osborne told the Times’ Kevin Roose.

Osborne maintained that the company was “still ensuring that people see authoritative and informative news on Facebook, especially during major news cycles and around important global topics like elections, COVID-19, and climate change.” But it could become harder to do that if sources like Breitbart and Occupy Democrats, two outlets that saw their traffic drop after Facebook began assigning publishers “news ecosystem quality” scores, dominate newsfeeds. Recognizing this, some Facebook employees have reportedly pressed for the so-called “nicer newsfeed” to become permanent. But to higher-ups, the NEQ system and other changes that have been scaled back were “break glass” measures never meant to last.

There was always skepticism about how long these enhanced moderation efforts would last, but when it was reported earlier this month that Facebook had also altered its algorithms to better root out hate speech—a major issue on social media platforms—there was at least a sense that the company was perhaps beginning, ever so slightly, to inch ahead in the right direction.

Facebook has an incentive to keep such alterations temporary. As the Times reported in November, the measures it could take to limit harmful content on the platform might also limit its growth: In experiments Facebook conducted last month, posts users regarded in surveys as “bad for the world” tended to have a greater reach—and algorithmic changes that reduced the visibility of those posts also reduced users’ engagement with the platform, according to the Times. “The results were good,” a summary of the test read, “except that it led to a decrease in sessions.” The “nicer newsfeed” may be better for the world, in other words, but it may not be better for business.

The problem is complex, and not just for Facebook, as it tries to take over the world without becoming a supervillain. Mark Zuckerberg may have created a monster—or, as Elizabeth Warren has called it, a “disinformation for profit” machine—but taming it is no easy task. The burgeoning regulatory offensive against it could help. But so long as Facebook’s business model favors the angry echo chamber, Zuckerberg will be disinclined to institute change from within. “The question is, what have they learned from this election that should inform their policies in the future?” Vanita Gupta, CEO of the Leadership Conference on Civil and Human Rights, told the Times in November. “My worry is that they’ll revert all of these changes despite the fact that the conditions that brought them forward are still with us.”


Lyianx on December 21st, 2020 at 19:30 UTC »

Just don't get your 'news' from Facebook.

Thurmansherman on December 21st, 2020 at 18:21 UTC »

It blows my mind that people are still figuring out that what's good for the world won't be good for established business.

rpguy04 on December 21st, 2020 at 17:51 UTC »

Gets back... when did it ever stop?