Twitter Suspended 44K Accounts for Promoting Terrorism, Violent Orgs in First Half of 2021

In the first six months of 2021, Twitter suspended 44,974 individual accounts for promoting terrorism or violent organizations, according to the social media platform's new transparency report.

Of those accounts, 93 percent were "proactively identified and actioned," the report said.

Twitter began releasing so-called transparency reports in 2012 to provide insight into certain platform data and enforcement actions. This newest update, covering January 1, 2021, through June 30, 2021, also documented millions of tweet removals.

The company's guidelines on violent organizations prohibit users from promoting terrorism and violent extremism. A group must meet certain criteria set by Twitter to be classified as a violent extremist group or organization, and a given group's actions both on and off the platform are considered in Twitter's assessment, according to the policy.

"There is no place on Twitter for violent organizations, including terrorist organizations, violent extremist groups, or individuals who affiliate with and promote their illicit activities. The violence that these groups engage in and/or promote jeopardizes the physical safety and well-being of those targeted," the policy read.

The potential links between social media giants like Twitter and the spread of terrorism-related or violent content were highlighted following the January 6, 2021, riot in which supporters of then-President Donald Trump stormed the U.S. Capitol building in an effort to stop the certification of Joe Biden's presidential victory. Some have accused the companies of enabling extremists to organize and promote such an event, or of failing to sufficiently quash the types of misinformation that contributed to it.

The January 6 House select committee subpoenaed Twitter, Reddit, and the parent companies of Facebook and Google earlier this month, demanding that the companies hand over a trove of records related to domestic terrorism, misinformation and efforts to overturn the 2020 presidential election results.

"Two key questions for the Select Committee are how the spread of misinformation and violent extremism contributed to the violent attack on our democracy, and what steps — if any — social media companies took to prevent their platforms from being breeding grounds for radicalizing people to violence," Congressman Bennie Thompson, the committee's chairman, said in a statement.

Twitter said in its transparency update that in the six-month report period, it also required users to remove a total of 4.7 million tweets that breached the platform's rules. Of those tweets, 68 percent received fewer than 100 impressions before they were removed, and another 24 percent received between 100 and 1,000 impressions before they were taken down.

"In total, impressions on these violative Tweets accounted for less than 0.1% of all impressions for all Tweets during that time period," the report said.

In addition to barring content that promotes or threatens terrorism and violence, Twitter's rules prohibit content that includes abuse, harassment, child sexual exploitation, suicide, self-harm, hateful conduct, sensitive media, and certain goods and services that may be illegal.

"Broadly, Twitter continues to see an overall downward trend in the number of violating accounts which is likely due to changing behaviors of these actors coupled with continued improvements of our defenses in this area," the report said.