February 6, 2024
Source: Stat News
Photo / Image Source: Unsplash
WASHINGTON — As social media sites were flooded with misleading posts about vaccine safety, mask effectiveness, Covid-19’s origins and federal shutdowns at the height of the pandemic, Biden officials urged platforms to pull down posts, delete accounts, and amplify correct information.
Now the Supreme Court could decide whether the government violated Americans’ First Amendment rights with those actions — and dictate a new era for what role, if any, officials can play in combating misinformation on social media.
The Supreme Court is set to hear arguments next month in a case that could have sweeping ramifications for federal health agencies’ communications in particular. Murthy v. Missouri alleges that federal officials coerced social media and search giants like Facebook, Twitter, YouTube, and Google to remove or downgrade posts that questioned vaccine safety, Covid’s origins, or shutdown measures. Biden lawyers argue that officials made requests but never forced companies.
Government defenders say that if the Court limits the government’s power, it could hamstring agencies scrambling to achieve higher vaccination rates and other critical public health initiatives. Critics argue that federal public health officials — already in the throes of national distrust and apathy — never should have tried to remove misleading posts in the first place.
“The best way is to have a very vigorous offensive social media strategy, which we didn’t have,” said Paul Mango, a Trump deputy chief of staff for the Health and Human Services Department who worked closely on Operation Warp Speed, the effort to speed Covid-19 vaccines and treatments to market. “Rather than trying to keep bad information off by suppression, why don’t we have a strategy that really is very aggressive at propagating accurate information?”
Though the Association of State and Territorial Health Officials is not taking a stance on the case or the government’s argument that it can ask sites to take posts down, its chief medical officer, Marcus Plescia, also said the best use of federal public health resources is counter-messaging.
“We really are limited to the extent that we can control misinformation,” said Plescia. “The number one [request from state officials] is we need good messaging that’s been tested, and that’s shown to be effective.”
For their part, social media executives like Meta CEO Mark Zuckerberg have said in the past that they made and altered their content moderation policies on their own. But the tech executives are unlikely to weigh in now, considering they are in the midst of two other firestorms over moderation. One is a suit against a Florida law that would effectively diminish platforms’ abilities to moderate false and misleading posts. Another is last week’s very public battering by senators demanding more content moderation to protect children’s safety on their platforms.
The recent hearing before the Senate Judiciary Committee, which also called TikTok, Snap and Discord executives to testify, stands in stark contrast to the coronavirus misinformation lawsuit, suggesting instead that tech companies aren’t doing enough to police their platforms. At one point, Sen. Josh Hawley (R-Mo.) urged Zuckerberg to stand up and apologize to families in the hearing room for damage caused by Facebook and Instagram use.
Senators from both parties seemed open to peeling back a federal protection of tech companies that host problematic or false content.
“It is now time to make sure that the people who are holding up the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is open to victims of social media,” Sen. Lindsey Graham (R-S.C.) said.
The Murthy quandary
Biden’s lawyers are set to argue that he and his officials can make the same type of demands.
Lower courts in this case ruled that the federal government can’t put any pressure on social media platforms to censor their content. Under that ruling, even public statements by the president about the teen mental health crisis could be construed as undue pressure, Solicitor General Elizabeth Prelogar argued in a legal filing.
For instance, under that ruling, a White House statement condemning the role social media plays in teens’ mental health and calling for potential legislative reform “might be viewed as coercion or significant encouragement under the Fifth Circuit’s novel understanding of those concepts,” she wrote.
But this case didn’t start with mental health, and much of it will likely rest on private rather than public comments from federal officials.
The lawsuit, started by then-Missouri Attorney General Eric Schmitt, reflects a growing trend of state attorneys general mounting politically divisive cases against the federal government. Another state, Louisiana, joined the suit along with three doctors who co-signed a paper on herd immunity, an anti-lockdown activist in Louisiana, and a conservative news site, The Gateway Pundit.
Federal officials began communicating with the social platforms in early 2021, according to court documents. Those communications included White House messages to one site saying to take a post down “ASAP” and “keep an eye out for tweets that fall in the same … genre” or instructions to another platform to “remove [an] account immediately.” CDC officials also regularly flagged posts to the companies and in one instance asked “what [was] being done on the amplification-side” to promote official messaging on coronavirus information.
Later, according to court documents, government officials began asking Facebook and others for data and the details of their moderation policies and standards. They held regular meetings, suggested changes and at least one company created a portal for government requests to be prioritized. After a Washington Post article detailing Facebook’s moderation struggle, an official wrote to the company that they felt Facebook was not “trying to solve the problem” and the White House was “[i]nternally … considering our options on what to do about it.”
In July 2021, federal officials took their frustrations to the public. Surgeon General Vivek Murthy said in a press briefing that “modern technology companies have enabled misinformation to poison our information environment, with little accountability to their users.”
He added, “We’re asking them to operate with greater transparency and accountability. We’re asking them to monitor misinformation more closely.” The same day, he issued his first formal advisory as surgeon general — on confronting health misinformation.
Despite a lower court ruling that those statements could be inappropriate pressure, experts who spoke to STAT said it’s hard to imagine the Supreme Court going that far.
“The government does, and should, have the ability to communicate with private entities about the dangers that exist,” said Clay Calvert, a senior fellow on technology policy at the American Enterprise Institute. “Why this case is so controversial is the inherently political divisiveness of the content in question — that divided Republicans and Democrats on matters like mask mandates and Covid vaccines.”
Where this leaves federal health officials
The overarching question before the court is whether these actions count as government coercion of a private company, which would be an overstep of its authority. Justice Department lawyers argue that while officials “frequently suggested” removal or downgrade of posts, they didn’t force companies — nor did companies always oblige.
An appeals court deemed some officials’ actions — particularly those of the White House — potentially coercive, but vastly whittled down a district court’s broad prohibition of government officials’ correspondence with social media companies. In doing so, it marked out some communications — particularly the CDC’s alerts on changing recommendations and its explainers on true vs. false information — as valid dispatches.
But that does not mean the CDC is in the clear when the Supreme Court considers the case in March. Justice Samuel Alito already signaled his apprehension when he dissented from the other justices’ decision to lift the ban before arguments are heard.
“At this time in the history of our country, what the Court has done, I fear, will be seen by some as giving the Government a green light to use heavy-handed tactics to skew the presentation of views on the medium that increasingly dominates the dissemination of news,” Alito wrote.
Even if the court rejects broader controls on federal communications with social media sites, the case could have extensive implications for effective messaging from federal health officials, legal experts say.
“It will have a chilling effect on the government … especially for the CDC,” said Dorit Reiss, a professor at UC Law San Francisco. “Because the line is fuzzy and because they don’t want to be accused of coercion, they’re not going to be sure when they can talk to social media.”