Facebook’s misinformation problem has local election officials on edge
Voters cast their mail-in ballots at a ballot drop box outside the Maricopa County Recorder and Elections Department's southeast Mesa office during the Arizona state primary election in Mesa, Arizona, U.S., July 30, 2024.
Rebecca Noble | Reuters
Derek Bowens has never had such an important job. He’s the director of elections in Durham County, North Carolina, one of the most-populous areas of a state that’s increasingly viewed as crucial to the 2024 presidential contest.
So when a former precinct official emailed Bowens in July to warn him of a post containing voting misinformation that was spreading virally on Facebook, Bowens quickly recognized that he might be facing a crisis.
The post, written as if from an authority on the subject, said voters should request new ballots if a poll worker, or anyone else, writes anything on their form, because the markings would invalidate it. The same incorrect message spread on Facebook during the 2020 election, but at the time the platform flagged the content as “false information” and linked to a story from Facebook’s fact-checking partner, USA Today, that debunked the rumor.
Bowens said no such tag appeared on the post, which was widespread enough that the North Carolina State Board of Elections had to issue a press release on Aug. 2, informing voters that false “posts have been circulating for years and have resurfaced recently in many N.C. counties.”
“It was spreading and there wasn’t anything happening to stop it until our state put out a press release and we started engaging with our constituency on it,” Bowens told CNBC in an interview.
The elections board wrote a post on Facebook, telling voters to “steer clear of false and misleading information about elections,” with a link to its website. As of Wednesday, the post had eight comments and 50 shares. Meanwhile, multiple Facebook users in states like North Carolina, Mississippi and New Jersey continue to share the ballot misinformation without any notification that it’s false.
CNBC flagged posts with the false information to Meta. A company spokesperson said, “Meta has sent them to third-party fact-checkers for further review.”
Across the U.S., with 40 days until the Nov. 5 election, state and local officials say they are puzzled by what to expect from Facebook. As in the past two presidential election cycles, the spread of misinformation on the social network has threatened to disrupt voting in what’s expected to be another razor-thin contest decided by thousands of voters in a handful of states. Recently, a Facebook post containing a false claim that Haitian immigrants were eating pets in Springfield, Ohio, ballooned out of control and gained further traction after Republican nominee Donald Trump repeated the claim in a debate.
In 2016, Facebook was hammered by Russian operatives who pushed out false posts about Hillary Clinton to bolster Trump. In 2020, the site hosted rampant misinformation about politically charged issues like Covid treatments, masking and voter fraud.
The big difference this go-round is that Facebook has largely removed itself from the equation. In 2021, Meta began pushing political and civic content lower in its ranking algorithms, which contributed to a dramatic decline in news traffic for publishers last year. Earlier this year, Meta announced that it would deprioritize recommendations of political content on Instagram and its Twitter-like Threads service, a move the company said aligns more closely with what consumers want to see in their feeds.
Still, posts with false information can spread rapidly across wide swaths of users, along with comments that amplify the misinformation, and government agencies have little ability to counteract them because the agencies’ own accounts have such limited reach on the platform.
And while Facebook has lost some of its prominence due in part to the rise of TikTok, particularly among younger audiences, the site still had more than 200 million daily users in the U.S. and Canada at the end of last year, the last time it issued regional numbers. Facebook and Instagram both generally rank among the top 10 most-visited websites and most-popular apps in the U.S., according to the Pew Research Center and Similarweb.
Interviews with nearly a dozen regional and statewide government officials with election-related duties reveal the challenges they say they’re having using and monitoring Meta’s apps, as well as other social networking services like X, now owned by Elon Musk. The officials say they’re working overtime to ensure the safety and integrity of the election but say they’re receiving little effective help from the companies, which scaled back their trust and safety teams as part of broader cost-cutting efforts that began in 2022.
Meta ultimately cut 21,000 jobs, including in trust and safety and customer service, over multiple rounds of layoffs. As CNBC reported last year, the company scrapped a fact-checking tool that would have let news services like The Associated Press and Reuters, as well as credible experts, add comments at the top of questionable articles as a way to verify their trustworthiness. Reuters is still listed as a fact-checking partner, but an AP spokesperson said the news agency’s “fact-checking agreement with Meta ended back in January.”
The Meta spokesperson told CNBC in a statement that the company’s “integrity efforts continue to lead the industry and we have around 40,000 people globally working on safety and security — more than we had during the 2020 cycle.” The company says it now partners with about 100 third-party fact-checking groups across the globe “who review and rate viral misinformation in more than 60 languages.”
Challenges in Maricopa County
Like North Carolina, Arizona is one of the seven swing states expected to determine whether Trump or Vice President Kamala Harris, the Democratic nominee, wins the presidency.
That reality has put Taylor Kinnerup in the spotlight. Kinnerup is the communications director for the recorder’s office of Maricopa County, home to more than half of Arizona’s population.
Kinnerup and her colleagues use social media to distribute up-to-date information about election-related procedures, like when residents can mail in early ballots or where to find their voting center. It’s a particularly sensitive job following Trump’s false claims of voter fraud in Arizona in 2020, when the state went blue for the first time in a presidential contest since 1996.
Given Maricopa County’s high profile during the election season, the county often attracts attention from Facebook users across the country. Many of them, Kinnerup said, are older and still leave comments about debunked conspiracy theories, such as the false claim that Sharpie markers invalidate ballots.
Kinnerup said her team places “extreme emphasis on constant communication and transparency to the public,” actively sharing election-related content across Facebook and Instagram, particularly during peak hours when it’s more likely to reach voters.
A few months ago, Kinnerup discovered that her office’s Facebook and Instagram accounts were no longer linked, meaning she couldn’t access the apps using the same credentials, or automatically schedule a single post to go across both sites.
Ahead of the primary elections in July, Kinnerup said she struggled to resolve the account issues with Meta. She said she engaged in a monthslong email exchange with numerous representatives, but found there was “no way to really make progress.” When she did get a response, it was little more than a canned statement, Kinnerup said.
Meanwhile, Kinnerup is busy overseeing media and constituent tours of the county’s election facilities to help dispel false notions that the process is being rigged as her office continues to deal with the fallout of the 2020 election. Kinnerup said she led more than 20 such tours in June.
“I couldn’t be dealing with Meta every single day, because I had to be giving tours,” Kinnerup said. The time spent trying to find a fix “was a huge issue for me,” she said.
By the time Kinnerup resolved her account issues in mid-July, she said, she and her colleagues had wasted countless hours on the problem, leaving her team to feel “we were put in a position where the full message we were trying to get out wasn’t ever fully there.”
Even with her office’s Facebook and Instagram accounts working again, Kinnerup said its organic social media posts generate little engagement, and her team has turned to sponsored ads to help expand their reach across the platforms. The office has continued the facility tours, leading 25 this month.
Meta’s spokesperson said the company has been hosting training sessions for state and local officials since February, informing them of tools like voting alerts, which allow them to send messages to people in their area.
Former US President and Republican presidential candidate Donald Trump leaves at the end of a presidential debate with US Vice President and Democratic presidential candidate Kamala Harris at the National Constitution Center in Philadelphia, Pennsylvania, on September 10, 2024.
Saul Loeb | AFP | Getty Images
“There are multiple channels by which officials can reach us, including teams responsible for specific states and regions, and our ability to respond to them remains unchanged,” the spokesperson said.
Kinnerup said she was not “aware of any of this,” and in her year in the role has “never received any direct communication with Meta that I’m aware of.”
Bowens told CNBC in a follow-up email that he “was not aware of the sessions or the tools.”
Congress is well aware of potential problems. During a Senate hearing last week on election threats, Meta’s head of global affairs, Nick Clegg, fielded questions about the company’s election preparedness. Sen. Susan Collins, R-Maine, expressed concern about the safety and integrity of “down-ballot races at the state level, county level, local level.”
Intelligence agencies, Collins said, have told senators that bad actors from China could be focusing on disrupting regional races as opposed to the presidential election, and that state and regional officials “are far less likely to receive the kinds of briefings that we receive or to get information from Homeland Security or the FBI on how to be on alert.”
Clegg said Collins was “right to be concerned” and that Meta’s “vigilance needs to be constant.”
“It can’t just sort of peak at the time of the presidential elections,” Clegg said.
‘Three people will see it’
For Scott McDonell, the Dane County clerk in the swing state of Wisconsin, it’s been difficult to share accurate voting information on Facebook from his office’s official government account, which has only 608 followers. McDonell said his posts get very little traction compared with years past.
“If I link to a story about election security, three people will see it,” McDonell said. Posts that include pictures do marginally better, he said, because “Facebook likes pictures.”
“Don’t link to an article, that will go to zero,” he said.
McDonell said many of his colleagues have “gotten abused” so much on Facebook in recent years that they don’t post about elections anymore.
“Basically, your average county clerk is terrified of it, and they just do it to share baby photos,” McDonell said.
In Los Angeles County, Jeramy Gray, the chief deputy of the registrar-recorder/county clerk office, said small government offices often lack the resources needed to effectively utilize social media and to troubleshoot problems.
Meta “recently put a team together to assist” his office, Gray said, adding that the company appears to be the “most mature” of the big platforms even if it’s not a “model partner.”
“What I would like to see is just more engagement from them, at least three to four months from a large national election, for them to reach out to key stakeholders at the state and local level to really talk about what they can do or what they’re doing,” Gray said.
Bowens, in North Carolina’s Durham County, said the tech platforms could be far more helpful in assisting his office and others as they navigate confusion about what type of content is acceptable.
Bowens said he’s concerned about acting too aggressively because of potential censorship issues and recognizes there’s a gray area between misinformation and citizens exercising their First Amendment rights.
“You know, we’ve got a very diverse election system in this country,” Bowens said. “What was on that post may very well be true in another state. Therefore, is it misinformation?”