Facebook and Instagram parent company Meta on Tuesday said it had disrupted a disinformation campaign linked to Chinese law enforcement that the social media company described as the "largest known cross-platform covert influence operation in the world."
The company took down more than 7,700 accounts and 930 pages on Facebook. The influence network generated positive posts about China, focusing in particular on China's Xinjiang province, where the government's treatment of the Uyghur minority group has prompted international sanctions.
The network also attempted to spread negative commentary about the U.S. and disinformation in multiple languages about the origins of the Covid-19 pandemic, Meta said. The network has been present on nearly every popular social media platform, including Medium; Reddit; Tumblr; YouTube; and X, formerly known as Twitter, according to the company.
Meta began looking for signs of a Chinese influence operation on its own platforms after reports in 2022 highlighted how a disinformation campaign linked to the Chinese government targeted a human rights nongovernmental organization.
"These operations are big, but they're clumsy and what we're not seeing is any real sign that they're building authentic audiences on our platform or elsewhere on the internet," Meta's global lead for threat intelligence Ben Nimmo told CNBC's Eamon Javers.
Meta researchers were able to link this latest disinformation network to a prior influence campaign identified in 2019 and code-named Spamouflage.
"Taken together, we assess Spamouflage to be the largest known cross-platform covert influence operation to date," Meta said in its quarterly threat report. "Although the people behind this activity tried to conceal their identities and coordination, our investigation found links to individuals associated with Chinese law enforcement."
Meta also identified and disrupted other operations and published a more detailed analysis of a Russian disinformation campaign it identified shortly after the beginning of the 2022 war in Ukraine.
The disruptions come ahead of what will likely be a contentious election cycle. Concerns over the role of influence campaigns in past elections led social media platforms, including Meta, to institute stricter guidelines on both the kind of political content allowed and the labels they add to that content.
Influence campaigns have affected Meta users in the past, notably a Russia-backed campaign to inflame popular sentiment around the 2016 U.S. presidential election.
But this disinformation network, while prolific, was not effective, Meta cybersecurity executives said on a briefing call. The campaign's pages collectively had more than 500,000 followers, most of them inauthentic accounts from Bangladesh, Brazil and Vietnam.