Meta ends third-party fact-checking program as it prepares for Trump’s return



Facebook owner Meta has ended its third-party fact-checking program and will instead rely on its users to report misinformation, as the social media giant prepares for Donald Trump’s return as president.

The $1.6 trillion company said Tuesday that it “will allow greater expression by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal, high-risk violations” and “taking a more personalized approach to political content.”

“It’s time to go back to our roots of freedom of expression on Facebook and Instagram,” Mark Zuckerberg, Meta’s co-founder and chief executive, said in a video announcing the changes.

Trump had strongly criticized Zuckerberg during last year’s US presidential election campaign, warning that if Zuckerberg interfered in the 2024 election he “will spend the rest of his life in prison.”

But the Facebook founder has sought to rebuild relations with the president-elect since his victory in November, including visiting him at his Mar-a-Lago residence in Florida.

On Monday, Meta moved to make further inroads with the incoming US administration by appointing UFC chief executive and prominent Trump supporter Dana White to its board of directors.

White will sit on Meta’s board alongside another Trump ally, technology investor Marc Andreessen, who has long pushed the company to ease its censorship of online content.

Zuckerberg said the complexity of its content moderation system, which was expanded in December 2016 after Trump won the election for the first time, had introduced “a lot of errors and a lot of censorship.”

Starting in the US, Meta will move to a so-called “community notes” model, similar to the one used by Elon Musk’s X, which allows users to add context to controversial or misleading posts. Meta itself will not write community notes.

Meta said it had no “immediate plans” to end third-party fact-checking and introduce community notes outside the United States. It is unclear how the new system would comply with regulations such as the European Union’s Digital Services Act and the UK’s Online Safety Act, which require online platforms to put in place measures to tackle illicit content and protect users.

Zuckerberg added that Meta will also change its systems to “significantly reduce” the amount of content its automated filters remove from its platforms.

This includes lifting restrictions on topics such as immigration and gender, to focus its systems on “unlawful and high-risk abuses”, such as terrorism, child exploitation and fraud, as well as content related to suicide, self-harm and eating disorders.

He acknowledged that the changes would mean Meta would “catch less bad stuff,” but said the trade-off was worth it to reduce the number of “innocent people’s” posts removed.

The changes bring Zuckerberg into closer alignment with Musk, who scaled back content moderation after buying the social media platform, then called Twitter, in 2022.

“Just like on X, community notes will require agreement among people with a range of viewpoints to help prevent biased evaluations,” Meta said in a blog post.

“This is awesome,” Musk said in an X post referencing the Meta changes.

Joel Kaplan, a prominent Republican who was last week named to take over from Sir Nick Clegg as the company’s head of global affairs, told Fox News on Tuesday that third-party fact-checkers were “extremely biased.”

Referring to Trump’s return to the White House on January 20, Kaplan added: “We have a real opportunity now. We have a new administration and a new president coming in, who are great advocates of free speech and that makes a difference.”

As part of the changes announced on Tuesday, Meta also said it would move its US-based content moderation staff from California to Texas. “I think it will help us build the trust to do this work in places where there’s less concern about our teams being biased,” Zuckerberg said.

Meta’s changes have been criticized by online safety activists. Ian Russell, whose 14-year-old daughter Molly took her own life after viewing harmful content on sites including Instagram, said he was “appalled” by the plans.

“These moves could have serious consequences for many children and young people,” he said.

Zuckerberg first introduced a third-party fact-checking service as part of a set of measures in late 2016 designed to address criticism of rampant misinformation on Facebook.

He said at the time that the company needed “stronger detection” of misinformation and would work with the news industry to learn from the fact-checking systems used by journalists.

Meta said it now spends billions of dollars annually on its safety and security systems, and employs or contracts with tens of thousands of people around the world.

But on Tuesday, Zuckerberg blamed governments and “legacy media” for pushing his company to “impose more and more censorship.”

He said Meta would work with the Trump administration “to deter governments around the world that go after American companies and push for more oversight.”

He pointed to restrictive regulations in China and Latin America, as well as highlighting what he called the “ever-increasing number” of European laws that “institutionalize censorship and make it difficult to build anything innovative there.”

Meta shares were down 2 percent Tuesday morning to $616.11.


