Twitter flagged 1 in every 500 election tweets as 'potentially misleading' — but 1 in 4 people who saw them did so before they were flagged
- Twitter labelled 1 in 500 tweets related to the US election in the week before and after Election Day as containing "disputed" and "potentially misleading" content, it said Thursday.
- But around a quarter of people who saw these tweets did so before they were flagged, Twitter said.
- It also prevented users from retweeting, liking, and replying to 456 of these tweets, including some from President Donald Trump.
- In a blog post, Twitter said that encouraging users to share tweets with a comment — "quote tweeting" — rather than retweeting had slowed the spread of election misinformation.
Twitter labelled 300,000 tweets for being "disputed" and "potentially misleading" about the 2020 election, it said in a blog post Thursday.
But around a quarter of people who saw these tweets did so before they were flagged, Twitter said.
The platform said it flagged 1 in every 500 tweets about the election between October 27 and November 11. It flagged them under its Civic Integrity Policy, which states users can't use the site to manipulate elections or share misleading information.
Twitter estimated a 29% drop in the number of times these tweets were retweeted with a comment — or "quote tweeted" — which it partly credits to a prompt that warned people before they shared.
Twitter also added stricter warning messages to 456 of these tweets over the 16-day period to limit their spread on the platform, it said. Users had to read a disclaimer before accessing these tweets, and weren't able to retweet, like, or reply to them.
They included some tweets by President Donald Trump, such as one in which he, without evidence, accused the Democrats of trying to steal the election.
After Twitter flagged several of his tweets within a short window, Trump called the platform "out of control."
Twitter prompts telling US users that election results were likely to be delayed, and that voting by mail was safe and legitimate, were seen 389 million times, Twitter said. This meant the site "got ahead of potentially misleading information," it said.
During the election period, Twitter also encouraged users to quote tweet rather than retweet content by other users, which Twitter said "introduced some friction" to sharing.
This change led to a 23% drop in retweets and a 26% increase in quote tweets – marking a 20% overall drop in the number of tweets reshared.
This "slowed the spread of misleading information," Twitter said. It was keeping the change for now, it said.
However, not all of Twitter's changes had their desired impact.
Twitter temporarily stopped recommending tweets from accounts that users don't follow, both in timelines and through notifications.
Twitter hoped this would reduce the spread of misleading information, it said — but it didn't. Instead, Twitter said removing recommendations prevented people from finding new accounts and joining conversations.
It reversed the change Friday, it said.
Twitter is continuing to evaluate the changes, and will share a comprehensive report on its response to the election in early 2021, it said.
"We also want to be very clear that we do not see our job as done — our work here continues and our teams are learning and improving how we address these challenges," Kayvon Beykpour, Twitter's product lead, and Vijaya Gadde, its legal, policy, and trust and safety lead, said in the blog post.
In September, political scientists at six American universities published the first known study to examine the effects of Trump's active Twitter presence on public attitudes toward democracy.
Their research showed that Trump supporters were more likely to believe elections are rigged after reading his tweets alleging electoral fraud.