WASHINGTON — Facebook owner Meta has been quietly scaling back some of the safeguards designed to thwart voting disinformation or foreign interference in U.S. elections as November midterm voting approaches.
It’s a sharp departure from the social media giant’s multibillion-dollar effort to improve the accuracy of election-related posts and regain the trust of lawmakers and the public, who were outraged to learn that the company had exploited people’s data and allowed falsehoods to flourish on its site during the 2016 campaign.
The shift raises concerns about Meta’s priorities and about how some could exploit the world’s most popular social media platforms to spread misleading claims, launch fake accounts and rile up partisan extremists.
“They don’t talk about it,” said former Facebook policy director Katie Harbath, now CEO of the technology and policy firm Anchor Change. “Best-case scenario: they’re still doing a lot behind the scenes. Worst-case scenario: they pull back, and we don’t know how that will play out for the midterms on the platforms.”
Since last year, Meta has shut down an examination into how falsehoods are amplified in political ads on Facebook by indefinitely banishing the researchers behind it from the site.
CrowdTangle, the online tool the company offered hundreds of newsrooms and researchers to identify trending posts and misinformation on Facebook or Instagram, is now inoperable on some days.
Public communication about the company’s response to election misinformation has gone decidedly quiet. Between 2018 and 2020, the company released more than 30 statements detailing how it would clamp down on misinformation about U.S. elections, prevent foreign adversaries from running ads or posts around the vote, and crack down on divisive hate speech.
Senior executives hosted question-and-answer sessions with reporters about the new policies. CEO Mark Zuckerberg wrote posts on Facebook promising to stamp out false voting information and authored opinion pieces calling for more regulations to address foreign interference in US elections via social media.
But this year Meta released only a one-page document outlining plans for the fall election, although the potential threats to the vote remain clear. Several Republican candidates are pushing false claims about the US election on social media. In addition, Russia and China continue to wage aggressive social media propaganda campaigns aimed at further politically dividing the American public.
Meta says the election remains a priority and that policies developed in recent years around election disinformation or foreign interference are now firmly embedded in the company’s operations.
“With each election, we incorporate learnings into new processes and have established channels to share information with the government and our industry partners,” said Meta spokesman Tom Reynolds.
He declined to say how many employees will be involved in the project to protect US elections this year.
During the 2018 election cycle, the company offered tours and photos and produced a headcount for its election response war room. But The New York Times reported that the number of Meta employees working on this year’s election had been cut from 300 to 60, a figure Meta disputes.
Reynolds said Meta will pull hundreds of employees from 40 of the company’s other teams to monitor the upcoming vote alongside the election team, whose size the company has not specified.
The company is continuing many initiatives it developed to curb election misinformation, such as a fact-checking program launched in 2016 that enlists the help of news outlets to investigate the veracity of popular falsehoods spread on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.
This month, Meta also launched a new political ads feature that lets the public search for details on how advertisers target people based on their interests on Facebook and Instagram.
Yet Meta has stifled other efforts to identify election disinformation on its sites.
Meta has stopped making improvements to CrowdTangle, a tool it pitched to newsrooms around the world that provides data on trending social media posts. Journalists, fact-checkers and researchers used it to analyze Facebook content, including tracking popular misinformation and who is responsible for it.
That tool is now “dying,” former CrowdTangle CEO Brandon Silverman, who left Meta last year, told the Senate Judiciary Committee this spring.
Silverman told the AP that CrowdTangle had been working on upgrades that would make it easier to search the text of internet memes, which are often used to spread half-truths and evade the oversight of fact-checkers.
“There’s no real shortage of ways you can organize this data to make it useful to many different parts of the fact-checking community, newsrooms and the broader civil society,” Silverman said.
Not everyone at Meta agrees with this transparent approach, Silverman said. The company hasn’t released new updates or features to CrowdTangle in over a year, and there have been hours of outages in recent months.
Meta also ended efforts to investigate how misinformation travels through political ads.
The company has indefinitely suspended access to Facebook for two New York University researchers who it says were collecting unauthorized data from the platform. The move came hours after New York University professor Laura Edelson said she had shared plans with the company to investigate the spread of misinformation on the platform around the Jan. 6, 2021, attack on the U.S. Capitol, which is now the subject of a House investigation.
“What we found when we looked closely is that their systems were probably unsafe for many of their users,” Edelson said.
Privately, former and current Meta employees say the disclosure of these dangers around the US election has created a public and political backlash for the company.
Republicans routinely accuse Facebook of unfairly censoring conservatives, some of whom have been removed from the platform for breaking the company’s rules. Democrats, meanwhile, regularly complain that the tech company hasn’t gone far enough to curb misinformation.
“It’s something that’s so politically charged that they’re trying to back away from it rather than jump in head first,” said Harbath, Facebook’s former director of policy. “They just see it as a big old pile of headaches.”
Meanwhile, the possibility of U.S. regulation no longer looms over the company, with lawmakers unable to reach a consensus on how much oversight the multibillion-dollar company should be subject to.
Free from that threat, Meta’s leaders have devoted the company’s time, money and resources to a new project in recent months.
Zuckerberg embarked on this massive rebranding and reorganization of Facebook last October, when he changed the company’s name to Meta Platforms Inc. It plans to spend years and billions of dollars developing its social media platforms into a nascent virtual reality construct called a “metaverse” — sort of like the Internet brought to life, rendered in 3D.
Posts on his public Facebook page now focus on product announcements, praise for artificial intelligence and photos of him enjoying life. News about the run-up to the election is relegated to posts on the company’s blog that are not written by him.
In one of Zuckerberg’s posts last October, after a former Facebook employee leaked internal documents showing how the platform increased hate and misinformation, he defended the company. He also reminded his followers that he pushed Congress to modernize election regulations for the digital age.
“I know it’s disappointing to see the good work we do being mischaracterized, especially for those of you who make important contributions to safety, integrity, research and products,” he wrote on Oct. 5. “But I believe that in the long run, if we continue to try to do what’s right and provide experiences that improve people’s lives, it will be better for our community and our business.”
That was the last time he discussed the Menlo Park, California-based company’s election work in a public Facebook post.
Associated Press technology writer Barbara Ortutay contributed to this report.