As a research lab committed to social justice, equity, and informed public discourse, Radical Urban Lab has decided to leave the X platform (formerly Twitter). This decision comes after careful consideration and stems from our deep concerns about the platform’s role in amplifying disinformation, hate speech, and perpetuating a toxic digital environment.
At Radical Urban Lab, we reject the notion that technology is neutral. We recognize that all major tech platforms are designed to serve the interests of data capitalism, prioritizing profit through surveillance, algorithmic manipulation, and the commodification of user data. When digital platforms are in private hands, the integrity of public discourse is inevitably compromised. However, recent changes on X have pushed the platform beyond a critical threshold.
The platform’s shift in content moderation, weakened verification systems, and its disregard for accountability have transformed it into a space hostile to inclusive and constructive dialogue. Recent restrictions on the data streams available to watchdogs and researchers – especially through changes to its API – have not only diminished transparency but have also obstructed independent scrutiny, making it harder to track upticks in extremism and hate speech.
Compounding this issue is Elon Musk’s erratic and ideological behaviour, including his promotion of antisemitic conspiracy theories, ‘great replacement’ ideas, and white supremacist thinking. By pushing reactionary and far-right views into the mainstream, Musk has further contributed to the platform’s transformation into a space that normalizes neo-Nazi extremism.
In the name of ‘free speech’, neo-Nazi and fascist propagandists run rampant on the X platform, often boosted to the ‘For You’ tab or granted ‘verified status’, which both legitimizes and commodifies their violent ideological campaigns. X continues to profit through subscription fees from pro-Nazi accounts and by running advertisements on those accounts or adjacent to pro-Nazi content.
This supportive social media environment can lend a sense of validation to hate speech and help far-right and neo-Nazi groups recruit more people. The spread of fake news and Islamophobic narratives on X during the violent anti-immigration riots in the UK in 2024 is one example of this dynamic, which contributed to the escalation of violence towards ethnic minorities and migrants.
Furthermore, the introduction of the platform’s new AI tool, the chatbot Grok, deepens our concerns. Grok’s lack of transparency in data use, potential for bias, and capacity to amplify disinformation at scale make it a dangerous tool antagonistic to public reason. Grok risks becoming a weapon of psychological operations (PsyOps) warfare, used to manipulate public discourse, reinforce harmful ideologies, and distort facts.
We believe that any platform must be responsible for maintaining a balance between open dialogue and protecting the safety and dignity of its users. X has crossed this line, and we can no longer engage with a platform that perpetuates political bias and enables toxic, harmful and anti-social behaviour.
Radical Urban Lab will continue to push the boundaries of knowledge production on platforms and in spaces that prioritize accuracy, equity, and the collective well-being of all communities.