US officials worry about 'chilling effect' on combating election disinformation after order limiting Biden administration contact with social platforms
A federal judge's move to limit how some US agencies communicate with social media companies could have a "chilling effect" on how the federal government and states address election-related disinformation just as the 2024 election cycle gets underway, according to interviews with current and former US officials.

Last week's court order by a district judge in Louisiana blocking several federal agencies from communicating with social platforms about certain content has been portrayed as a fight over free speech on the internet amid allegations of government censorship.

As the Biden administration's appeal in the case slowly works its way through the courts, however, security experts, academics and current and former US officials worry that their own good-faith efforts to protect elections could end up in a future lawsuit, according to interviews with more than half a dozen sources. The resulting uncertainty, they say, risks slowing the government's and social media companies' ability to respond to election-related disinformation that appears on tech platforms.

"There will absolutely be a chilling effect at least when you consider that you'll probably have a more conservative approach to engagements [between federal officials and tech platforms] that are otherwise completely appropriate and on solid legal footing with or without the injunction," said Chris Krebs, former head of the US Cybersecurity and Infrastructure Security Agency (CISA). But Krebs expressed confidence that CISA, in particular, can still carry out its election security mission, adding: "Everything that CISA can do in election security survives the injunction."

Still, the new uncertainty from the order may only add to a growing list of concerning factors ahead of the 2024 election, as social media platforms cut staff and roll back policies, and as experts worry about the threat of a new wave of convincing AI-generated misinformation.

By disrupting the routine, close cooperation between government officials and social media companies that developed after Russia's efforts to meddle in the 2016 US election, the court order may undercut efforts to safeguard the 2024 elections or discourage tech platforms from taking greater action themselves to protect the democratic process, other legal and security experts told CNN.

There are early signs of this dynamic playing out. The injunction has already prompted some federal agencies to rethink their engagement with social media companies even if they are not among those named in the litigation. Last week, the State Department canceled a routine meeting on election security with Facebook, according to a person familiar with the matter.

The State Department is not one of the agencies involved in the case brought by Missouri and Louisiana. But the canceled meeting is an example of the type of "freezing effect" the injunction may have in the short run even if the order is ultimately overturned, said Katie Harbath, a former Facebook official who helped lead the company's global election efforts until 2021.

The injunction

The Louisiana district judge's sweeping order last week barred the Justice Department and the FBI, along with the Departments of Homeland Security and Health and Human Services, from meeting or communicating with social media companies for the purposes of "urging, encouraging, pressuring, or inducing in any manner for removal, deletion, suppression, or reduction of content containing protected free speech."

But evidence submitted in the court battle has shown that the public-private coordination on election security did not involve government pressure on the social media companies to remove or delete user content.

Rather, periodic meetings between the companies and the government were held to mutually share intelligence about possible foreign influence threats. If a government agency identified social media posts as potentially problematic, it did so by referencing the content as possible violations of the platforms' own terms of service; it did not make any requests for it to be removed and made no threats of punishment if the companies chose not to act. Lawyers for Twitter testified to that effect last month in a separate case involving former President Donald Trump.

On Wednesday, FBI Director Christopher Wray defended the close contacts between the US government and the social media companies.

"There is no serious dispute that foreign adversaries have and continue to attempt to interfere in our elections and that they use social media to do it," Wray testified to the House Judiciary Committee. "President Trump himself in 2018 declared a national emergency to that very effect, and the Senate Intelligence Committee — in a bipartisan, overwhelmingly bipartisan way — not only found the same thing but called for more information-sharing between us and the social media."

Even if the order is overturned, the ongoing litigation could prove to be its own source of electoral disruption, Harbath said. Not only will agencies and companies be stuck in limbo while they await a final decision — which might ultimately require the Supreme Court to get involved — but it is possible that a conclusive ruling may not arrive until sometime in 2024, Harbath said, further upending election security efforts in the middle of an election year.

The injunction does contain some exceptions allowing more limited contact between affected agencies and social media companies. For example, there are exceptions for communications involving threats to national or cyber security; foreign election meddling; public safety; and efforts to mislead voters about the election process.

Even so, the order is expected to affect tech platforms' and the federal government's ability to quickly share information that might help identify emerging security threats, current and former US officials told CNN.

Slowing down information sharing

Social media companies have met regularly with US government agencies including the FBI, the Department of Homeland Security and the Office of the Director of National Intelligence since at least 2018, according to evidence in the Louisiana case as well as separate statements by Yoel Roth, Twitter's former head of site integrity.

In a number of those meetings, US officials warned in general terms of the risk of "hack-and-leak operations" involving foreign agents and the selective release of hacked information for the purposes of disrupting US elections, with those expectations reinforced by past experience with Russian meddling efforts.

That type of coordination may now be slowed down in the face of the injunction.

For example, US officials and Big Tech firms hunting for foreign bots and trolls might need lawyers in the room when they interact, slowing down the process for information sharing. A CISA spokesperson declined to comment on how the injunction would affect the agency's work on election security. An FBI spokesperson did not respond to CNN's request for comment.

Rather than ensuring that work to protect elections remains unimpeded, the carveouts may actually create more hurdles for agencies, which may now be required to take extra care to determine whether, and which, exceptions apply in any given scenario, said Gowri Ramachandran, a senior counsel at the Brennan Center for Justice at New York University.

"It's kind of leaving the federal government in a position of having to try to figure out what they are and aren't allowed to say and do," Ramachandran said, "which is just, at the very least, a sort of added burden on them at a time when they're obviously preparing for 2024."

Take the exception allowing for communications with tech platforms about foreign election meddling. It baselessly presumes that the government will be able to determine that the source of a problematic post is a foreign actor intending to cause havoc in US elections, said Harbath.

The trouble is that in many situations, the dividing line between domestic speech and foreign influence is not immediately obvious. A domestically originated false narrative can be amplified by malicious foreign actors, or vice versa, said Ramachandran. And during Russia's attempted election meddling in 2016, disinformation agents posed as American social media users, but the nature of the deception did not become clear until much later, after a great deal of forensic effort.

"They had to do a lot of work to figure out and trace back the actual origins of where that content was," Harbath said. "Well, the government doesn't necessarily have those capabilities to do that back-end work that the social media companies do. But this injunction, if you're a very cautious lawyer, you're going to be like, 'Well, you can't tell me 100% that it's foreign, and it could be domestic and that could be against this injunction.'"

The impact of the injunction may not be limited to the federal government. It could "definitely have chilling effects" for state governments as well, according to Evelyn Douek, an assistant professor at Stanford Law School.

"More importantly, if First Amendment doctrine develops in the way this court seems to want it to, it could open state officials up to much more jawboning claims directly in the future."

Ramachandran agreed the litigation could potentially "open the door a little bit more" to lawsuits targeting state government interactions with social media companies, but added that there have been numerous such cases already at the state level and virtually none have succeeded so far.

State election officials told CNN this week that they were studying the temporary injunction carefully but that it doesn't directly prevent them from countering disinformation on their own.

A broader pullback

More concerning than the injunction are signs that social media platforms are scaling back efforts to counter disinformation ahead of the 2024 election, said Michigan Secretary of State Jocelyn Benson, highlighting YouTube's decision earlier this year to roll back a policy restricting false claims that the 2020 elections were stolen.

"When they reversed course ... that was the most alarming thing to me," Benson said of YouTube's policy update.

YouTube's decision appears to be part of a broader retrenchment by social media companies, said Harbath, who also pointed to a recent statement by Instagram CEO Adam Mosseri suggesting the company's new Twitter rival, Threads, will not actively encourage news and politics content due to perceived negativity and platform integrity challenges.

The bigger question, Harbath said, is whether executives at social media companies continue to believe it is worth investing in politics-related policies and content moderation when they represent added labor costs, litigation risk and public scrutiny for the platforms.

The injunction could give cover to social media companies that may already want to pull back on some of their platforms' rules or staff around election integrity, said Harbath.

Meta declined to comment on this story. Twitter and YouTube did not immediately respond to a request for comment.

Last fall and spring, Meta laid off several members of its platform integrity teams that investigate disinformation and coordinated troll and harassment campaigns, CNN previously reported.

The Meta layoffs are but one data point in a broader trend heading into the 2024 US election: an uncertain economy, new laws and lawsuits targeting platform decision-making, and years of criticism by policymakers and civil society groups have made content moderation far more of a chore.

"This can be a very easy thing for everyone to point to and be like, 'Listen, we're going to wait for it to work through the courts,'" said Harbath. "I can [almost] hear [Meta Global Affairs President] Nick Clegg saying that 'we're going to be cautious of what we do, because we wouldn't want to run afoul of the law.'"

-- CNN's Donie O'Sullivan contributed reporting.
