X (formerly Twitter) is again making headlines over its handling of the Israel-Hamas war on the platform. After EU Commissioner Thierry Breton accused X of being a platform used to “disseminate illegal content and disinformation,” the Center for Countering Digital Hate (CCDH) is now doubling down on that criticism, accusing X of failing to remove hate speech.
CCDH released a report on Tuesday saying that X is not taking the necessary actions to combat antisemitism, Islamophobia, and other hate speech. The content in question emerged amid the ongoing war between Israel and Hamas.
CCDH said it reported 200 “hateful” posts to X moderators on October 31. However, about 98% of those posts remained on the platform afterward. The anti-hate group alleged that those posts violated X’s community rules.
X reportedly allows hate speech to circulate on the platform, CCDH claims
CCDH added that the hateful posts were collected from 101 separate X accounts, but only one account was suspended in the aftermath. The posts gathered a combined 24,043,693 views. CCDH also noted that 43 of the 101 accounts were verified by X, meaning they could benefit from algorithmic boosts. A study by NewsGuard previously found that most of the misinformation regarding the Israel-Hamas war comes from X-verified accounts.
In September, CCDH released a similar report about hate speech on X, but the platform alleged that CCDH had misrepresented the number of views that content received. X also filed a lawsuit against CCDH in July, accusing the anti-hate group of producing “flawed” studies about the platform.
In response, X’s head of business operations, Joe Benarroch, told The Verge that the firm had highlighted its “proactive measures” for users in a blog post. The post details X’s efforts to keep the platform safe amid the Israel-Hamas war. X stated that it had removed 3,000 accounts and 325,000 pieces of content that violated its community rules.
Benarroch also noted that most of X’s enforcement actions target individual posts by restricting their reach. Additionally, he questioned the methodology of CCDH’s study, saying CCDH only counts a post as “actioned” once the account behind it has been suspended. CCDH later disputed that characterization of its methodology.