SAN FRANCISCO — It is the 11th hour before the presidential election. But Facebook and Twitter are still changing their minds.
With just a few weeks to go before the Nov. 3 vote, the social media companies are continuing to shift their policies and, in some cases, are entirely reversing what they will and won’t allow on their sites. On Friday, Twitter underlined just how fluid its policies were when it began letting users share links to an unsubstantiated New York Post article about Hunter Biden that it had previously blocked from its service.
The change was a 180-degree turn from Wednesday, when Twitter had banned the links to the article because the emails on which it was based may have been hacked and contained private information, both of which violated its policies. (Many questions remain about how the New York Post obtained the emails.)
Late Thursday, under pressure from Republicans who said Twitter was censoring them, the company began backtracking by revising one of its policies. It completed its about-face on Friday by lifting the ban on the New York Post story altogether, as the article had already spread widely across the internet.
Twitter’s flip-flop followed a spate of changes from Facebook, which over the past few weeks has said it would ban Holocaust denial content, ban more QAnon conspiracy pages and groups, ban anti-vaccination ads and suspend political advertising for an unspecified length of time after the election. All of those things had previously been allowed — until they weren’t.
The rapid-fire changes have made Twitter and Facebook the butt of jokes and invigorated efforts to regulate them. On Friday, Senator Josh Hawley, Republican of Missouri, said he wanted to subpoena Mark Zuckerberg, Facebook’s chief executive, to testify over the “censorship” of the New York Post article since the social network had also reduced the visibility of the piece. Kayleigh McEnany, the White House press secretary, said that Twitter was “against us.” And President Trump shared a satirical article on Twitter that mocked the company’s policies.
“Policies are a guide for action, but the platforms are not standing behind their policies,” said Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard’s Kennedy School. “They are merely reacting to public pressure and therefore will be susceptible to political influence for some time to come.”
A Twitter spokesman confirmed that the company would now allow the link to the New York Post article to be shared because the information had spread across the internet and could no longer be considered private. He declined further comment.
A Facebook spokesman, Andy Stone, said: “Meaningful events in the world have led us to change some of our policies, but not our principles.”
For nearly four years, the social media companies have had time to develop content policies to be ready for the 2020 election, especially after Russian operatives were found to have used the sites to sow discord in the 2016 election. But even with all the preparations, the volume of last-minute changes by Twitter and Facebook suggests that they still do not have a handle on the content flowing on their networks.
That raises questions, election experts said, about how Twitter and Facebook would deal with any interference on Election Day and in the days after. The race between Mr. Trump and his Democratic challenger, Joseph R. Biden Jr., has been unusually bitter, and the social media sites are set to play a significant role on Nov. 3 as distributors of information. Some people are already using the sites to call for election violence.
The chaotic environment could challenge the companies’ policies, said Graham Brookie, director of the Digital Forensic Research Lab, a center for the study of social media, disinformation and national security. “Everybody has a plan until they get punched in the face,” he said.
Other misinformation experts said Twitter and Facebook have had little choice but to make changes on the fly because of the often norm-breaking behavior of Mr. Trump, who uses social media as a megaphone.
Alex Stamos, director of the Stanford Internet Observatory and a former Facebook executive, noted that after Mr. Trump recently told his supporters to “go into the polls and watch very carefully,” some companies — like Facebook — created new policies forbidding political candidates from using their platforms to call for such action. The companies also prohibited candidates from claiming an election victory early, he said.
“These potential abuses were always covered by very broad policies, but I think it’s smart to commit themselves to specific actions,” Mr. Stamos said.
From the start, the New York Post article was problematic. It featured purported emails from Hunter Biden, a son of Joseph Biden, and discussed business in Ukraine. But the provenance of the emails was unclear, and the timing of their discovery so close to the election appeared suspicious.
So on Wednesday, Twitter blocked links to the article hours after it had been published. The company said sharing the article violated its policy that prohibits users from spreading hacked information. It also said the emails in the story contained private information, so sharing the piece would violate its privacy policies.
But after blocking the article, Twitter was blasted by Republicans for censorship. Many conservatives — including Representative Jim Jordan of Ohio and Ms. McEnany — reposted the piece to bait the company into taking down their tweets or locking their accounts.
Twitter soon said it could have done more to explain its decision. Jack Dorsey, Twitter’s chief executive, said late Wednesday that the company had not provided enough context to users when they were prevented from posting the links.
His reaction set off a scramble at Twitter. By late Thursday, Vijaya Gadde, Twitter’s top legal and policy official, said that the policy against sharing hacked materials would change and that the content would no longer be blocked unless it was clearly shared by the hackers or individuals working in concert with them. Instead, information gleaned from hacks would be marked with a warning label about its provenance, Ms. Gadde said.
The internal discussions continued. On Friday, Twitter users could freely post links to the New York Post article. The company had not added labels to tweets with the article, as it had said it would.
At Facebook, the recent policy changes have grabbed attention partly because the company said on Sept. 3 that it did not plan to make changes to its site until after the election. “To ensure there are clear and consistent rules, we are not planning to make further changes to our election-related policies between now and the official declaration of the result,” Mr. Zuckerberg wrote in a blog post at the time.
Yet just a few weeks later, the changes started coming rapidly. On Oct. 6, Facebook expanded its takedown of the QAnon conspiracy group. A day later, it said it would ban political advertising after the polls closed on Election Day, with the ban lasting an undetermined length of time.
Days later, Mr. Zuckerberg also said Facebook would no longer allow Holocaust deniers to post their views to the site. And less than 24 hours after that, the company said it would disallow advertising related to anti-vaccination theories.
Facebook’s Mr. Stone positioned the changes as a natural response to what it called “a historic election,” as well as the coronavirus pandemic and Black Lives Matter protests.
“We remain committed to free expression while also recognizing the current environment requires clearer guardrails to minimize harm,” he said.
But there is one change Facebook hasn’t made. After reducing the visibility of the New York Post article on Wednesday and saying the piece needed to be fact-checked, the social network has stuck by that decision.