Disinformation Researchers Fret About Fallout From Judge's Order

Researchers and groups fighting hate speech, online abuse and disinformation say that a federal judge's decision this week to limit government communication with social media platforms could have the wide-ranging effect of hampering efforts to curb harmful material.

Alice E. Marwick, a researcher at the University of North Carolina at Chapel Hill, was among several disinformation specialists who said on Wednesday that the ruling could hinder efforts to stop false claims about vaccines and voter fraud.

She said the order followed other efforts, largely by Republicans, that are "part of an organized effort to push back against disinformation in general."

Judge Terry A. Doughty issued a preliminary injunction on Tuesday, saying that the Department of Health and Human Services, the Federal Bureau of Investigation and other government agencies must stop contacting social media companies to "urge, encourage, pressure or induce in any manner the removal, deletion or suppression of protected free speech content."

The ruling stemmed from a lawsuit filed by the attorneys general of Louisiana and Missouri, who accused Facebook, Twitter and other social media platforms of censoring right-leaning content, often in collusion with the government. They and other Republicans cheered the decision by the judge, of the U.S. District Court for the Western District of Louisiana, as a victory for the First Amendment.

Researchers said that the government's collaboration with social media companies was not a problem as long as the platforms were not forced to remove content. The government had, in fact, alerted companies to potentially harmful messages, such as lies about election fraud and misleading information about Covid-19. Most misinformation is flagged by researchers, nonprofits, or people and software at the platforms themselves.

"That's really the important distinction: The government should be able to inform social media companies about things they feel are harmful to the public," said Miriam Metzger, a professor of communication at the University of California, Santa Barbara, and a member of its Center for Information Technology and Society.

A bigger concern, researchers said, was a potential chilling effect. The judge's decision blocked certain government agencies from communicating with some research organizations, including the Stanford Internet Observatory and the Election Integrity Partnership, about removing social media content. Some of those groups had already been targeted in a Republican-led campaign against universities, think tanks and other research organizations.

Peers said such restrictions could discourage younger scholars from pursuing disinformation research and intimidate the donors who fund it.

Bond Benton, an associate professor of communication at Montclair State University who studies disinformation, described the ruling as a "potential Trojan horse." On paper, he said, it is limited to the government's relationship with social media platforms, but it carries a message that misinformation qualifies as speech and that removing it amounts to the suppression of speech.

Previously, Dr. Benton said, platforms could simply decline to host content they didn't want: "no shirt, no shoes, no service." The ruling, he said, will probably make platforms more cautious about doing so.

In recent years, platforms have relied increasingly on algorithms and automated tools to spot harmful content, limiting the effectiveness of complaints from people outside the companies. Academics and anti-disinformation organizations often complained that platforms were unresponsive to their concerns, said Viktorya Vilk, the director of digital safety and free expression at PEN America, a nonprofit that supports free speech.

Platforms are very good at ignoring requests from civil society groups for help, information or the escalation of individual cases, she said, but they are less willing to ignore the government.

Researchers who study disinformation worried that the ruling could give social media platforms cover to relax their vigilance ahead of the 2024 election. Some platforms have already scaled back efforts to combat misinformation. It was also unclear, researchers said, how new initiatives would fare, such as the White House task force on online harassment, which had been created partly in response to researchers' concerns and suggestions.

Imran Ahmed, the chief executive of the Center for Countering Digital Hate, said Tuesday's decision underscored other problems: the United States' "particularly fangless" approach to dangerous content compared with places like Australia and Europe, and the need to update rules governing the liability of social media platforms. Tuesday's ruling cited a 2021 report on online anti-vaccine activists, "The Disinformation Dozen," that the center had presented to the surgeon general's office.

Mr. Ahmed said it was "bananas" that Facebook could broadcast Nazi propaganda, empower stalkers and harassers, undermine public safety and facilitate extremism in the United States. The court's decision, he said, only exacerbates the sense of impunity under which social media companies operate, despite their being a primary vector for hate and misinformation in society.