Elon Musk, the visionary behind X, has set his sights on addressing the contentious issue of “shadowbanning” within the social network formerly known as Twitter. Shadowbanning, the practice of covertly reducing a user’s visibility without outright banning them, has long been a source of concern, fueling debates about transparency and accountability on online platforms. But despite Musk’s promise of a swift fix, a former insider from the company’s Trust & Safety team has shed light on the formidable obstacles that lie ahead.
Musk, known for his ambitious ventures, recently responded to users on X, acknowledging the delay in addressing shadowbanning and shedding light on the complexities involved. He explained that the layers of “trust & safety” software are so convoluted that it often takes the company hours to trace why a given account was suspended or shadowbanned. To simplify that codebase, Musk disclosed, a comprehensive rewrite is already underway.

The scale of those obstacles becomes clearer in a thread posted by Yoel Roth, Twitter’s former Head of Trust and Safety, on the decentralized Twitter alternative Bluesky. Roth offers a deeper look at the challenges Musk’s company is grappling with. He notes that as social media platforms mature, they tend to move from tracking bans in basic spreadsheets or documents to storing enforcement metadata directly attached to user accounts.
Twitter, however, still keeps much of its enforcement metadata in free-text notes, Roth explains, which is what makes automated user notices so hard to build and what lies behind Musk’s complaint about parsing unstructured data. Roth emphasizes that moving this information into a structured format is a prerequisite for the automated account-status notifications Musk has promised.
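To make that distinction concrete, here is a minimal sketch: a free-text agent note is readable by a human but opaque to software, while a structured enforcement record can drive an automated notice. The field names, policy labels, and schema below are illustrative assumptions, not Twitter’s actual data model.

```python
from dataclasses import dataclass
from enum import Enum

# What Roth describes: enforcement recorded as a free-text agent note.
# A human can read it, but software cannot reliably tell a user why
# (or even whether) their reach was limited.
legacy_note = "2021-03-04 jsmith: limited reach per spam policy, see ticket 4521"

# Hypothetical structured equivalent (illustrative fields, not Twitter's schema).
class Action(Enum):
    SEARCH_DEBOOST = "search_deboost"
    REPLY_DEBOOST = "reply_deboost"
    SUSPENSION = "suspension"

@dataclass
class EnforcementRecord:
    account_id: int
    action: Action
    policy: str          # which rule was applied
    source: str          # model, heuristic, or manual review
    applied_at: str      # ISO 8601 timestamp

def notify(record: EnforcementRecord) -> str:
    """With structured metadata, an automated user notice is trivial to generate."""
    return (f"Your account is currently subject to {record.action.value} "
            f"under the '{record.policy}' policy (applied {record.applied_at}).")

record = EnforcementRecord(
    account_id=12345,
    action=Action.SEARCH_DEBOOST,
    policy="platform_manipulation",
    source="manual_review",
    applied_at="2021-03-04T18:22:00Z",
)
print(notify(record))
```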
Roth endorses the idea of rewriting the enforcement attribution code, noting that such a rewrite was already in progress before the acquisition, though the takeover and the restructuring that followed may have derailed its timeline. He also digs into the complexity of the codebase itself, particularly around spam, where numerous models and heuristics run concurrently. That complexity is essential for fighting spam, but it makes it difficult to pinpoint exactly why a given user was “shadowbanned” at a particular moment.
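The attribution problem Roth describes can be illustrated with a toy example: when several independent heuristics each contribute to a visibility decision, no single rule “caused” the outcome, so reconstructing why an account was limited at a given moment requires every rule to have logged its contribution. The heuristics, weights, and threshold below are invented for illustration and bear no relation to X’s actual anti-spam stack.

```python
# Illustrative sketch (not X's actual systems): several independent heuristics
# each score an account, and a combined threshold decides whether its
# visibility is reduced.
HEURISTICS = {
    "duplicate_content": lambda acct: 0.4 if acct["duplicate_posts"] > 10 else 0.0,
    "follow_churn":      lambda acct: 0.3 if acct["follows_per_day"] > 200 else 0.0,
    "link_spam":         lambda acct: 0.5 if acct["flagged_links"] > 3 else 0.0,
}

def visibility_decision(acct: dict) -> tuple[bool, dict]:
    # Run every heuristic and combine the scores; the decision emerges from
    # the sum, not from any single rule.
    scores = {name: rule(acct) for name, rule in HEURISTICS.items()}
    limited = sum(scores.values()) >= 0.6   # arbitrary illustrative threshold
    return limited, scores

account = {"duplicate_posts": 14, "follows_per_day": 250, "flagged_links": 1}
limited, scores = visibility_decision(account)
print(limited)   # True: two rules each contributed, neither alone crossed the bar
print(scores)    # {'duplicate_content': 0.4, 'follow_churn': 0.3, 'link_spam': 0.0}
```

Unless each rule records its contribution at the time it fires, answering “which rule shadowbanned this account last Tuesday?” means replaying or reverse-engineering the whole ensemble, which is the kind of hours-long forensic work Musk complained about.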
As Musk’s company works through the labyrinthine challenges Roth describes, it becomes clear that the promised technology for showing users their account status may not arrive as soon as hoped. Its success will hinge on reprioritizing and streamlining development around these hurdles, and the road ahead for X’s push to bring transparency to shadowbanning remains steep and uncertain.