Facebook, Alarmed by Discord Over Vote Count, Is Said to Be Taking Action

SAN FRANCISCO – Facebook plans to take new measures to make it harder to spread election misinformation on its platform, two people with knowledge of the plans said Thursday, as the outcome of the presidential race remained uncertain.

Facebook plans to add more "friction" – such as an extra click or two – before people can share posts and other content, said the people, who requested anonymity because they were not authorized to speak publicly. The company will also demote content in the news feed if it contains election-related misinformation, making it less visible, and will limit the distribution of election-related Facebook Live streams.

The measures, which could be rolled out as early as Thursday, are a response to rising unrest and social discord on Facebook after Tuesday's election. The people said users and Facebook groups had increasingly been coordinating potentially violent action around issues such as claims of election fraud. President Trump has falsely claimed on social media and in White House statements in recent days that the election was being "stolen" from him, even as the final outcome remained unclear.

The changes would be some of the most significant steps yet taken by Facebook, which has historically tried to make sharing information as easy as possible to increase engagement on its site. The moves would most likely be only temporary, the people with knowledge of the plans said, and are intended to cool tensions among angry Americans clashing on the network.

"As the vote count continues, we are seeing more reports of inaccurate claims about the election," Facebook said in a statement. As a result, "additional temporary steps are being taken".

Facebook has been more proactive in addressing misinformation in recent months, even as its chief executive, Mark Zuckerberg, has said he does not want to be the arbiter of truth. The company spent months preparing for the election, running through dozens of scenarios that could unfold on November 3 and afterward in case political candidates or others tried to use the platform to delegitimize the results. The new measures were part of that planning, the people said.

This week, Facebook also suspended political advertising indefinitely and added notifications at the top of the news feed saying that no winner had been declared in the election.

Other social media companies have also made changes to slow the flow of misinformation on their networks and highlight accurate information on their sites. Twitter, which Mr. Trump uses as a megaphone, had labeled 38 percent of his 29 tweets and retweets since early Tuesday with warnings that he was making misleading claims about the election process, according to a New York Times review. Last month, Twitter also made it harder to retweet posts or share links to articles that users had not read.

TikTok said it was expanding its fact-checking partnerships for election disinformation and updating its policies to better reflect what types of content are not allowed in the app. YouTube used its homepage to show people accurate information about the election.

Republicans and Democrats have long criticized Facebook and Mr. Zuckerberg for their stances on misinformation. Mr. Trump and other Republicans have accused Facebook of suppressing and censoring conservative speech, while Democrats have protested that tech companies are not doing enough to clean up the barrage of toxic misinformation online.

On Thursday, the company also removed a new Facebook group, Stop the Steal, which had more than 320,000 members, as part of a stepped-up campaign against election-related disinformation and calls to violence.

Facebook said the group was "organized around the delegitimization of the election process" and that some of its members had made calls for real-world violence.

Some of Facebook's new measures have precedents. In June, the company added more context to posts about the coronavirus and highlighted accurate information about Covid-19 from health officials to help curb the spread of falsehoods. WhatsApp and Messenger, two of Facebook's messaging apps, have limited the number of times a message can be forwarded, capping the sharing of private messages at five people at a time.