Facebook reportedly had evidence that its algorithms were dividing people, but top executives killed or weakened proposed solutions

Authored by businessinsider.com and submitted by jigsawmap
Facebook's internal research found that it encouraged polarization, but Mark Zuckerberg and other top executives rejected ideas aimed at fixing the problem, The Wall Street Journal reported.

One report concluded that Facebook's algorithms "exploit the human brain's attraction to divisiveness," according to The Journal.

But Zuckerberg and Facebook's policy chief, Joel Kaplan, repeatedly nixed proposed solutions because they feared appearing biased against conservatives or simply lost interest in solving the problem, The Journal reported.

Facebook has come under increasing pressure to address toxic content and polarization on its platform during the coronavirus pandemic and before the 2020 presidential election.

Facebook had evidence that its algorithms encourage polarization and "exploit the human brain's attraction to divisiveness," but top executives including CEO Mark Zuckerberg killed or weakened proposed solutions, The Wall Street Journal reported on Tuesday.

The effort to better understand Facebook's effect on users' behavior was a response to the Cambridge Analytica scandal, and its internal researchers determined that, contrary to the company's mission of connecting the world, its products were having the opposite effect, according to the newspaper.

One 2016 report found that "64% of all extremist group joins are due to our recommendation tools," with most people joining at the suggestion of Facebook's "Groups You Should Join" and "Discover" algorithms. "Our recommendation systems grow the problem," the researchers said, according to The Journal.

The Journal reported that Facebook teams pitched multiple fixes, including limiting the spread of information from groups' most hyperactive and hyperpartisan users, suggesting a wider variety of groups than users might normally encounter, and creating subgroups for heated debates to prevent them from derailing entire groups.

But these proposals were often dismissed or significantly diluted by Zuckerberg and Facebook's policy chief, Joel Kaplan, according to the newspaper, which reported that Zuckerberg eventually lost interest in trying to address the polarization problem and was concerned about the potential to limit user growth.

In response to the pitch about limiting the spread of hyperactive users' posts, Zuckerberg agreed to a diluted version and asked the team to not bring something like that to him again, The Journal said.

The company's researchers also determined that because of a larger presence of far-right accounts and pages publishing content on Facebook, any changes — including apolitical tweaks, like reducing clickbait — would have disproportionately affected conservatives.

That worried Kaplan, who previously halted a project called "Common Ground" that aimed to encourage healthier political discourse on the platform.

Ultimately, many of the efforts weren't incorporated into Facebook's products, with managers telling employees in September 2018 that the company was pivoting "away from societal good to individual value," according to The Journal.

"We've learned a lot since 2016 and are not the same company today," a Facebook spokeswoman told the paper. "We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve."

Facebook has repeatedly been scrutinized by critics who say the company hasn't done enough to limit the spread of harmful content on its platform. That topic has come into sharper focus as coronavirus-related misinformation has run rampant on social media and as the 2020 presidential election approaches.

Peace_Pepper on May 26th, 2020 at 19:45 UTC »

A divisive or controversial post attracts more traffic, which in turn generates more money. That’s basic human nature which is exploited by all social media platforms.

ordinaryBiped on May 26th, 2020 at 18:59 UTC »

Daily reminder that capitalism is about maximizing profits, not improving society.

Hyper_Rico on May 26th, 2020 at 18:12 UTC »

Turns out rage and paranoia bring more attention (and consequently cash) than peace and love. The problem is they are optimizing for cash, and the algorithm works.