With the midterm elections just a week away, Elon Musk's privately held Twitter is restricting employees' access to certain content moderation and policy enforcement tools, Bloomberg reports.
Most employees in Twitter's Trust and Safety division currently cannot police accounts that violate the platform's policies on hate speech and misinformation unless real-world harm is imminent, sources familiar with the matter told the news outlet.
According to Bloomberg, Twitter is relying on automated content moderation tools and third-party contractors to counter misinformation and offensive posts, while the company's employees review high-impact violations.
Yoel Roth, Twitter's head of safety and integrity, tweeted a response to Bloomberg's report late Monday night.
"This is exactly what we (or any company) should be doing in the midst of a corporate transition to reduce opportunities for insider risk," Roth said. "We're still enforcing our rules at scale."
The limited content moderation capabilities have raised concern among Trust and Safety employees that the company will be unable to enforce its policies effectively in the final week before the midterm elections on Nov. 8. According to Bloomberg, Trust and Safety employees are frequently charged with enforcing Twitter's misinformation and civic integrity policies.
After the deal closed on Friday, Musk said one of his first moves as the new owner of Twitter would be to form a "content moderation council," though he did not offer specifics. The tech billionaire added that he would not make any "major content decisions" or reinstate banned accounts before the council begins its work.