At a time when authoritarianism is tightening its grip and digital repression is becoming the global norm, conversations about platforms, algorithms, and social justice are not only more urgent than ever; they are also more precarious, more contested, and more necessary. The Platforms, Algorithms, and Social Justice workshop at the Massachusetts Institute of Technology emerged from these charged times, bringing together a community of critical scholars, practitioners, and students to confront the algorithmic architectures reshaping our public spheres. The workshop centered on reflections about who gets to be seen, heard, and remembered in a world governed by code and power.

From the silencing of dissent in Turkey to the erasure of activist voices across both the Global Majority and the Global North, the workshop cut across geographies and disciplines to foreground a pressing question: what does justice look like in a platformed world? Invited speakers brought diverse methods and sites to the table, ranging from influencer economies and wartime cultural moderation to corporate PR and algorithmic suppression under authoritarian regimes. Rather than a singular narrative, what emerged was a textured, messy, and global portrait of resistance, co-option, and survival.

Zoë Glatt, a postdoctoral researcher with the Social Media Collective at Microsoft, opened the event with a sharp critique of how platforms co-opt social justice narratives, repackaging them to serve their own brand image and strategic interests. Her talk, Please Mind the Gap, showed how industry campaigns like #YouTubeBlack or TikTok’s LGBTQ+ Visionary Voices List often mask the entrenched inequalities baked into platform economies. Drawing from her research, she argued that, far from disrupting the neoliberal status quo, these platforms prioritize brand-safe, non-disruptive creators, rendering structural inequalities both hyper-visible and unchangeable. Zoë pushed the audience to question how performative gestures of inclusion operate within advertising-driven ecosystems, and whether a hollow gesture is better than no gesture at all. Drawing on the power of absence, she highlighted YouTube’s failure to mark Black History Month in February 2025: a silence where celebration once stood, signaling a troubling political shift in the platform’s priorities.

Tom Divon, a PhD candidate at the Hebrew University of Jerusalem, delivered a searing examination of how activist creators contend with cultural moderation on TikTok and Instagram. His presentation, Misunderstood Communities, focused on Arab, Palestinian, and Israeli users entangled in moderation systems that flatten cultural specificity, particularly in moments of war and political unrest. He showed how these creators are caught in the “bear hug” of neoliberal content governance: platforms that claim to protect users instead apply such tight control that they suffocate users’ ability to communicate core aspects of their identity. A case in point: Arabic words are frequently flagged because their transliterations trigger moderation systems built on English-centric baselines. With rich ethnographic work, Tom highlighted the emotional and political exhaustion of being persistently misread. He closed by reflecting on how language and accent have long been weaponized against marginalized identities, referencing a press conference in which U.S. President Trump responded to Afghan journalist Nazira Karimi’s question by saying he couldn’t understand her because of her accent. The exchange, Tom suggested, exemplified how accented speech is often perceived as “other.” In that moment, a question about the future of the Afghan people was dismissed, and Karimi was reduced to a spectacle, her identity flattened into an exotic soundbite. Tom drew a chilling parallel to how platforms treat creators from marginalized communities: good for PR, marketable in their difference, but rarely truly listened to. Genuine understanding, Tom argued, requires effort and risk: costs that platforms are often unwilling to bear.

I, Nora Suren, a PhD candidate at the University of Massachusetts Amherst and a Visiting Fellow at MIT, closed the formal session with my presentation, Social Justice and Algorithmic Precarity, which turned the spotlight to Turkey’s increasingly surveilled digital landscape. Through my concept of alternative creators (women, LGBTQI+ individuals, and politically progressive voices who do not always identify as activists but consistently challenge dominant narratives), I examined how these users navigate dual forms of repression: top-down state censorship and algorithmic suppression. Rather than retreat, they remain visible through adaptive strategies like banal positivity, strategic in/visibility, and grassroots care work that sustains community in moments of silence and grief.

I showed how these creators engage in everyday acts of resistance that offer a critical lens on broader global patterns of platform governance and digital authoritarianism. Much like Tom’s creators caught in cultural misrecognition and Zoë’s creators stuck in marketable inclusion, these Turkish creators are simultaneously exploited for their symbolic capital and excluded from meaningful participation. I ended by reflecting on what it means to survive, not just politically but emotionally and relationally, when your voice is always at risk of being erased. In these spaces, justice is not a destination but a daily negotiation, fought through fragments, hashtags, and the refusal to disappear.

The roundtable that followed wove our threads into a thoughtful and urgent conversation. Adriana de Souza e Silva brought attention to the often-invisible dimensions of mobility and access in hybrid spaces, both physical and digital. Drawing from her research on mobile communication and urban space in Brazil, she underscored that platforms, much like cities, are simultaneously spaces of control and resistance, where people navigate infrastructural and algorithmic barriers in ways that are both deeply creative and profoundly unequal. Jonathan Corpus Ong called for care-driven, ethnographic approaches to digital injustice, urging us to consider the social identities, moral justifications, and labor behind harmful content, and to ask who gets to tell stories, and under what conditions. Then, TL Taylor brought much-needed messiness into the room, cautioning against overly purist or wholly critical approaches to platforms. She encouraged us to acknowledge not just the co-option and performativity within platforms, but also the moments when things do work: when people are able to build communities, gain visibility, and create meaningful change. As critical platform scholars, we often see through a filter of harm, but platforms can also do good.

Together, we painted a picture of chaotic, messy spaces: platforms that may seem insignificant to those who control them, but mean everything to those who live within them. Over time, these spaces have created an unspoken dependency, where platforms become the very means of survival for people navigating constant risk and precarity, arenas where justice is negotiated every day, often by those whose voices are most subject to erasure. We did not end with solutions, but with a commitment to carry the complexity, to build coalitions across difference, and to keep pushing for justice not only as a critique but as a practice. In times like these, we probably need daily reminders. Hold tight. Hold space. Hold each other.
