by Zachary Roth, Georgia Recorder [This article first appeared in the Georgia Recorder, republished with permission]
November 26, 2023
Back in July, Homeland Security Secretary Alejandro Mayorkas testified before the U.S. House Judiciary Committee.
A federal court had recently granted a preliminary injunction in Missouri v. Biden, finding that the Biden administration had violated the First Amendment by coercing social media companies to remove content, related both to elections and the COVID-19 vaccine, that it deemed false and harmful.
The ruling is being appealed to the Supreme Court, which last month temporarily blocked the order. But one committee member wanted to press the advantage.
“What is disinformation?” asked Rep. Mike Johnson, R-La.
“Disinformation is inaccurate information…” Mayorkas began.
“Who determines what’s inaccurate?” Johnson interjected almost immediately. “Who determines what’s false? You understand the problem here?”
Moments later, Mayorkas testified that the Cybersecurity and Infrastructure Security Agency, a unit of DHS whose activities were part of the case, focuses on fighting disinformation from foreign adversaries — speech that would likely enjoy fewer First Amendment protections than speech by Americans.
But Johnson was ready.
“No, sir,” Johnson said. “The court determined you and all of your cohorts made no distinction between domestic speech and foreign speech. So don’t stand there under oath and tell me that you only focused on … foreign actors. That’s not true.”
“I so, so regret that I’m out of time,” Johnson concluded.
Johnson’s forceful performance lit up right-wing media, burnishing his credentials as a conservative stalwart and a fiercely effective Republican partisan.
This was hardly the first time that Johnson had put the Biden administration on the defensive over its efforts to fight online disinformation — false or misleading information that is deliberately spread to advance a political or ideological goal. In fact, in recent years, there appear to have been few national issues on which Johnson, who in October was elected by the GOP as speaker of the House, has played a more prominent role.
Johnson’s efforts have largely succeeded. Since 2020, the federal government and social media companies have scaled back their work to counter online disinformation — thanks in part, experts say, to the furious pushback from Johnson and other leading Republicans.
Disinformation specialists increasingly worry that voters in next year’s election may have to contend with a barrage of lies flowing freely online.
“It’s going to be a wild ride,” said Sam Wineburg, a professor at the Stanford Graduate School of Education, who studies how people respond to disinformation. “We are going to see a deluge of misinformation and … a great deal of confusion online.”
A spokesman for the speaker’s office did not respond to a request for comment on the growing concerns about election-related disinformation.
Pulling back on countering disinformation
In 2020, amid a flurry of online lies about both the COVID-19 pandemic and the presidential election, social media companies stepped up their efforts to curb disinformation, including labeling some of President Donald Trump’s tweets as misleading.
These initiatives, especially the platforms’ work to limit the spread of stories about Hunter Biden’s laptop in the election’s final weeks, spurred a backlash from conservatives. Though information from the laptop hasn’t been shown to significantly implicate President Joe Biden, the authenticity of the laptop itself has since been confirmed.
The ensuing pressure campaign by congressional Republicans like Johnson, decrying what they call censorship and collusion by government and Big Tech, has led both the Biden administration and the platforms themselves to rein in their efforts to protect voters from disinformation.
Last year, DHS announced the Disinformation Governance Board, charged with coordinating efforts to identify and counter online disinformation. Weeks later, after a fierce Republican backlash, it shuttered the panel.
CISA, the DHS agency about which Johnson grilled Mayorkas in July, has stopped reaching out to social media companies to share information, NBC News recently reported. The same outlet also reported that the FBI recently put an indefinite hold on most briefings to social media companies about Russian, Iranian, and Chinese influence campaigns. The bureau’s interactions with the platforms now must be pre-approved by government lawyers.
Another DHS program, Rumor Control, which provided quick responses to election disinformation on election night and beforehand, also appears to be on the wane, NPR reported.
The government’s retreat has put more of the onus on social media companies to proactively monitor disinformation. But these efforts, too, are set to be less robust in 2024 than they were in 2020, as layoffs have thinned trust and safety teams.
Last year, Meta loosened constraints on political advertising to allow ads on Facebook and Instagram that question the legitimacy of the 2020 election, the Wall Street Journal recently reported. And X owner Elon Musk announced recently that he has eliminated the site’s elections integrity team, claiming it was “undermining election integrity.”
Experts suggest the more hands-off approach from the platforms is in part a response to the Republican outrage, which has left tech leaders reluctant to alienate powerful figures who may expand their control of the federal government just over a year from now.
“The platform leadership often felt that 2020 put them in a horrible position for potentially what the next administration might look like,” said Tim Harper, who runs the Elections and Democracy program at the Center for Democracy and Technology, which advocates for tech policies that promote democracy. “And so they’re a little bit more wary of what the political implications could be after the 2024 election.”
Meanwhile, technological advances, especially those involving generative AI, have made online disinformation much more sophisticated and harder to see through than in past elections.
“Think about where we were in 2016 and 2020,” said Wineburg, the Stanford disinformation expert. “The St. Petersburg labs spewing disinformation, often with spellings and tortured grammar — that’s gone. Just plug it into an LLM (a Large Language Model — an AI-driven algorithm that can convincingly generate human language text) and it’s going to look like it’s not only by a native speaker, but a native speaker who understands proper grammar.”
Johnson warned of ‘censorship industrial complex’
Since his ascension to the speaker job last month, Johnson’s leading role in the Republican pushback has received little attention. But, over the last year-and-a-half, he has used congressional hearings, legislation, and conservative media appearances to relentlessly warn about the dangers of what he has called the “censorship industrial complex.”
When Mayorkas announced the creation of the Disinformation Governance Board in April 2022, it was described as a working group with no operational authority. But conservatives were quick to paint it as an Orwellian scheme to censor political speech.
The far-right influencer Jack Posobiec, who has 1.7 million followers on X, called the panel a “Ministry of Truth” — language soon picked up by Johnson and other congressional Republicans.
“The Biden Administration’s decision to stand up a ‘Ministry of Truth,’ is dystopian in design, almost certainly unconstitutional, and clearly doomed from the start,” Johnson said in a press release announcing legislation he was introducing to defund the board. “The government has no role whatsoever in determining what constitutes truth or acceptable speech.”
A week later, in a sign that Republicans recognized the benefits of highlighting the issue, then-Speaker Kevin McCarthy introduced his own version of Johnson’s bill, with Johnson as a co-sponsor.
Johnson and other Republicans also seized on the news that the board would be run by Nina Jankowicz, a disinformation scholar. They noted that ahead of the 2020 election, Jankowicz had called the Hunter Biden laptop a “Russian influence op.”
“She’s supposed to be in charge of determining what is true and what is acceptable speech, and it’s just outrageous,” Johnson told Newsmax.
Jankowicz was targeted with abuse and death threats, and within weeks she had resigned. When, not long after, the board was shuttered entirely, it was perhaps Republicans’ most concrete victory on the issue.
But Johnson hasn’t let the subject drop. In March, he appeared on Fox News to discuss a subpoena issued to Jankowicz by a newly formed House subcommittee on weaponization of the federal government, chaired by Ohio Republican Rep. Jim Jordan, to which Johnson had been named.
“We have a lot of questions about the foundation of (the disinformation board),” Johnson said. “How were they going to determine what is so-called disinformation? What were they going to do with this? It’s pretty scary. We have to make sure that this never ever happens again.”
Later that same month, the subcommittee held a hearing aimed at promoting Missouri v. Biden, the lawsuit filed by Republican attorneys general in Missouri and Louisiana that challenged the Biden administration’s efforts to work with social media companies to control disinformation.
“The executive branch has undertaken a broad campaign to censor the American people,” Johnson declared. “That’s the headline. That’s the takeaway today.”
In May, at another hearing of the Select Subcommittee on the Weaponization of Government, this one focused on the FBI, Johnson said the bureau “worked with the social media platforms hand in hand, almost as partners over the last two election cycles, to censor and silence conservatives online that they disagreed with.”
At yet another hearing in July, not long after the federal court ruling that the bureau and other agencies had violated the First Amendment, Johnson grilled FBI director Christopher Wray.
Like Mayorkas, Wray said his agency focused on disinformation from foreign adversaries.
“That’s not accurate,” Johnson jumped in. “You need to read this court opinion because you’re in charge of enforcing it … (It) wasn’t just foreign adversaries, sir, it was American citizens. How do you answer for this?”
That same month, Johnson teamed up with Jordan and Sen. Rand Paul, R-Ky., to introduce more legislation on the subject. The Free Speech Protection Act would bar executive branch employees from censoring protected speech and impose “mandatory severe penalties” on those who do. The bill has not advanced in the House.
“The dystopian scheme by the government, Big Tech, academia, and many NGOs to censor American citizens and silence conservative voices is far-reaching and dangerous,” said Johnson.
The outrage generated by Johnson and his colleagues has had an impact. As the 2024 election looms, experts say, those looking to undermine elections by using false information to confuse voters online are in a stronger position than ever — not least because they will have had four more years to perfect their craft.
“There are people who are highly incentivized to harm our democracy by raising doubt about elections, and they have been actively organizing and raising money over the last three years,” David Becker, the founder and executive director of the Center for Election Innovation & Research, which works with election officials to improve election administration and build voter trust, recently told States Newsroom. “In 2020, those that were trying to undermine that election were largely making it up as they went along with crazy legal theories and chasing bizarre pieces of disinformation. They are going to be better prepared.”
Georgia Recorder is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501c(3) public charity. Georgia Recorder maintains editorial independence. Contact Editor John McCosh for questions: info@georgiarecorder.com. Follow Georgia Recorder on Facebook and Twitter.