U.S. Special Forces Want to Use Deepfakes for Psy-ops

Authored by theintercept.com and submitted by deepskydiver

U.S. Special Operations Command, responsible for some of the country’s most secretive military endeavors, is gearing up to conduct internet propaganda and deception campaigns using deepfake videos, according to federal contracting documents reviewed by The Intercept. The plans, which also describe hacking internet-connected devices to eavesdrop in order to assess foreign populations’ susceptibility to propaganda, come at a time of intense global debate over technologically sophisticated “disinformation” campaigns, their effectiveness, and the ethics of their use.

While the U.S. government routinely warns against the risk of deepfakes and is openly working to build tools to counter them, the document from Special Operations Command, or SOCOM, represents a nearly unprecedented instance of the American government, or any government, openly signaling its desire to use the highly controversial technology offensively. SOCOM’s next-generation propaganda aspirations are outlined in a procurement document that lists the capabilities it is seeking for the near future and solicits pitches from outside parties that believe they can build them.

“When it comes to disinformation, the Pentagon should not be fighting fire with fire,” Chris Meserole, head of the Brookings Institution’s Artificial Intelligence and Emerging Technology Initiative, told The Intercept. “At a time when digital propaganda is on the rise globally, the U.S. should be doing everything it can to strengthen democracy by building support for shared notions of truth and reality. Deepfakes do the opposite. By casting doubt on the credibility of all content and information, whether real or synthetic, they ultimately erode the foundation of democracy itself.”

“When it comes to disinformation, the Pentagon should not be fighting fire with fire.”

Meserole added, “If deepfakes are going to be leveraged for targeted military and intelligence operations, then their use needs to be subject to review and oversight.”

The pitch document, first published by SOCOM’s Directorate of Science and Technology in 2020, established a wish list of next-generation toys for the 21st-century special forces commando, a litany of gadgets and futuristic tools that will help the country’s most elite soldiers more effectively hunt and kill their targets using lasers, robots, holographs, and other sophisticated hardware.

Last October, SOCOM quietly released an updated version of its wish list with a new section: “Advanced technologies for use in Military Information Support Operations (MISO),” a Pentagon euphemism for its global propaganda and deception efforts. The added paragraph spells out SOCOM’s desire to obtain new and improved means of carrying out “influence operations, digital deception, communication disruption, and disinformation campaigns at the tactical edge and operational levels.” SOCOM is seeking “a next generation capability to collect disparate data through public and open source information streams such as social media, local media, etc. to enable MISO to craft and direct influence operations.”

SOCOM typically fights in the shadows, but its public reputation and global footprint loom large. Composed of elite units from the Army, Marine Corps, Navy, and Air Force, SOCOM leads the most sensitive military operations of the world’s most lethal nation. While American special forces are widely known for splashy exploits like the Navy SEALs’ killing of Osama bin Laden, their history is one of secret missions, subterfuge, sabotage, and disruption campaigns.

SOCOM’s “next generation” disinformation ambitions are only part of a long, vast history of deception efforts on the part of the U.S. military and intelligence apparatuses. Special Operations Command, which is accepting proposals on these capabilities through 2025, did not respond to a request for comment.

Though Special Operations Command has for years coordinated foreign “influence operations,” these deception campaigns have come under renewed scrutiny. In December, The Intercept reported that SOCOM had convinced Twitter, in violation of its internal policies, to permit a network of sham accounts that spread phony news items of dubious accuracy, including a claim that the Iranian government was stealing the organs of Afghan civilians. Though the Twitter-based propaganda offensive didn’t use deepfakes, researchers found that Pentagon contractors employed machine learning-generated avatars to lend the fake accounts a degree of realism.

Provocatively, the updated capability document reveals that SOCOM wants to boost these internet deception efforts with the use of “next generation” deepfake videos, an increasingly effective method of generating lifelike digital video forgeries using machine learning. Special forces would use this faked footage to “generate messages and influence operations via non-traditional channels,” the document adds.

While deepfakes have largely remained fodder for entertainment and pornography, the potential for more dire applications is real. At the onset of Russia’s invasion of Ukraine, a shoddy deepfake of Ukrainian President Volodymyr Zelenskyy ordering troops to surrender began circulating on social media channels. Ethical considerations aside, the legality of militarized deepfakes in a conflict, which remains an open question, is not addressed in the SOCOM document.

As with foreign governmental “disinformation” campaigns, the U.S. has spent the past several years warning against the potent national security threat represented by deepfakes. The use of deepfakes to deliberately deceive, government authorities warn regularly, could have a deeply destabilizing effect on civilian populations exposed to them.

At the federal level, however, the conversation has revolved exclusively around the menace foreign-made deepfakes might pose to the U.S., not the other way around. Previously reported contracting documents show SOCOM has sought technologies to detect deepfake-augmented internet campaigns, a tactic it now wants to unleash on its own.

Perhaps as provocative as the mention of deepfakes is the section that follows, which notes SOCOM wishes to finely tune its offensive propaganda seemingly by spying on the intended audience through their internet-connected devices. Described as a “next generation capability to ‘takeover’ Internet of Things (IoT) devices for collect [sic] data and information from local populaces to enable breakdown of what messaging might be popular and accepted through sifting of data once received,” the document says that the ability to eavesdrop on propaganda targets “would enable MISO to craft and promote messages that may be more readily received by local populace.” In 2017, WikiLeaks published pilfered CIA files that revealed a roughly similar capability to hijack household devices.

The technology behind deepfake videos first arrived in 2017, spurred by a combination of cheap, powerful computer hardware and research breakthroughs in machine learning. Deepfake videos are typically made by feeding images of an individual to a computer and using the resultant computerized analysis to essentially paste a highly lifelike simulacrum of that face onto another.
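For readers curious what that “analysis” looks like in practice, the approach popularized by early open-source face-swap tools pairs a single shared encoder with one decoder per identity: train both reconstruction paths, then swap faces by decoding person A’s frames with person B’s decoder. The sketch below is a minimal, illustrative assumption of that layout in PyTorch; it is not drawn from the SOCOM document, and every module size, name, and hyperparameter is a placeholder.

```python
# Illustrative sketch only: shared-encoder / per-identity-decoder autoencoder,
# the layout popularized by early open-source face-swap ("deepfake") tools.
# All sizes and hyperparameters are assumptions for demonstration.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a compact latent code shared across identities."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop for one specific identity from the shared latent code."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity
params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(faces_a: torch.Tensor, faces_b: torch.Tensor) -> float:
    """One optimization step on batches of 64x64 crops (values in [0, 1]) of each person."""
    optimizer.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    optimizer.step()
    return loss.item()

# After training, the "swap" is simply cross-decoding: encode a frame of person A,
# then reconstruct it with person B's decoder, e.g.
# fake_b = decoder_b(encoder(frame_of_person_a))
```

Real tools add face detection and alignment, perceptual or adversarial losses, and blending of the generated crop back into the source frame; the sketch shows only the core encode-and-cross-decode idea the article describes.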

“The capacity for societal harm is certainly there.”

Once the software has been sufficiently trained, its user can crank out realistic fabricated footage of a target saying or doing virtually anything. The technology’s ease of use and increasing accuracy have prompted fears of an era in which the global public can no longer believe what it sees with its own eyes.

Though major social platforms like Facebook have rules against deepfakes, given the inherently fluid and interconnected nature of the internet, Pentagon-disseminated deepfakes might also risk flowing back to the American homeland. “If it’s a nontraditional media environment, I could imagine the form of manipulation getting pretty far before getting stopped or rebuked by some sort of local authority,” Max Rizzuto, a deepfakes researcher with the Atlantic Council’s Digital Forensic Research Lab, told The Intercept. “The capacity for societal harm is certainly there.”

SOCOM’s interest in deploying deepfake disinformation campaigns follows recent years of international anxiety about forged videos and digital deception from international adversaries. Though there’s scant evidence Russia’s efforts to digitally sway the 2016 election had any meaningful effect, the Pentagon has expressed an interest in redoubling its digital propaganda capabilities, lest it fall behind, with SOCOM taking on a crucial role.

At an April 2018 hearing of the Senate Armed Services Committee, Gen. Kenneth Tovo of the Army Special Operations Command assured the assembled senators that American special forces were working to close the propaganda gap. “We have invested fairly heavily in our psy-op operators,” he said, “developing new capabilities, particularly to deal in the digital space, social media analysis and a variety of different tools that have been fielded by SOCOM that allow us to evaluate the social media space, evaluate the cyber domain, see trend analysis, where opinion is moving, and then how to potentially influence that environment with our own products.”

While military propaganda is as old as war itself, deepfakes have frequently been discussed as a sui generis technological danger, the existence of which poses a civilizational threat. At a 2018 Senate Intelligence Committee hearing discussing the nomination of William Evanina to run the National Counterintelligence and Security Center, Sen. Marco Rubio, R-Fla., said of deepfakes, “I believe this is the next wave of attacks against America and Western democracies.” Evanina, in response, reassured Rubio that the U.S. intelligence community was working to counter the threat of deepfakes.


Tiny_Package4931 on March 7th, 2023 at 02:22 UTC »

Deepfakes will be incorporated into modern psychological warfare operations just like other technologies were before, and it’s unlikely the technology will be blocked entirely.

I think it was in Indonesia where there was a covert operation to convince socialist insurgents that the woods they were operating in were haunted by a local myth about vampire women. They flew over the jungle with wailing, screaming, and other curses coming out of giant speakers.

More than likely, psyop groups will, within a few years, be making deepfakes of opposing forces’ military leadership, targeted to demoralize rank-and-file soldiers.

Imagine a deepfake of Taliban leadership with prostitutes in Qatar, or something along those lines. If it had been possible back during the Obama surge, we would absolutely have used it.

Suspicious_Loads on March 7th, 2023 at 00:37 UTC »

Shouldn't this be a CIA thing and not SOCOM? Special forces are uniformed for direct action right?

TheThirdDumpling on March 6th, 2023 at 21:38 UTC »

Decoupling from reality: is that really what global politics needs more of?