
On 7 January, Mark Zuckerberg announced sweeping changes to content moderation and fact-checking across Meta’s platforms – Facebook, Instagram and Threads – sparking widespread reaction and debate about the future of online discourse and the role of social networks in shaping global conversations.
This overhaul is just one part of a broader story: the dramatic changes expected across the societal landscape in the United States once Donald Trump takes up the reins of power for his second term as President on 20 January. While the immediate implications of Meta’s moves are tied to the US, these shifts are bound to have global repercussions.
It is a critical moment to reflect on the shifting landscape of social networks and social media, and the deeper questions these changes raise about trust, governance, and user empowerment. Meta’s trajectory highlights a troubling trend among platforms trying to balance openness with safety while navigating a complex political environment.
As someone who has used Meta’s services extensively since joining Facebook in 2007 – and later Instagram, Messenger, WhatsApp and, more recently, Threads – I’ve watched the company evolve from an essential connector of people into a global advertising-driven business spanning social networks and messaging apps. According to Statista, almost 3.3 billion people were active daily on at least one Meta platform during Q3 2024.
However, Meta’s record is marred by scandals and troubling practices. The Cambridge Analytica scandal of 2018 – in which personal data belonging to millions of Facebook users was collected by the British consulting firm for political advertising without their consent – left a lasting mark, creating a persistent sense of disquiet about Meta’s data practices.
But this past week’s developments have turned my unease into outright dismay, leading me to seriously reconsider my engagement with Meta platforms. More on that below.
A Troubling Shift
Meta’s new approach to moderation – starting in the US, replacing its third-party fact-checking programme with “Community Notes” reminiscent of the system on X – raises critical concerns. Combined with the removal of restrictions on certain attacks against marginalised groups, it signals a worrying shift in priorities.
As Casey Newton noted in Platformer last week, the company now permits “allegations of mental illness or abnormality when based on gender or sexual orientation” under its hateful conduct policy, framing this as part of “political and religious discourse.”
Meta’s decision to push political content to all Instagram and Threads users adds another layer of complexity. According to The Verge, Instagram boss Adam Mosseri framed this as part of a “more speech and fewer mistakes” strategy. Yet, it’s hard to ignore the potential for amplifying divisive rhetoric and further eroding trust in online spaces.
The Financial Times captured the heart of the issue in an article by Richard Waters: “It may simply be impossible to run a completely open, uncensored network, while at the same time presenting an environment where anyone can feel safe and at home.” This uncomfortable truth underscores the core tension within Meta’s policies.
The Role of the Oversight Board

One critical element of Meta’s moderation ecosystem has been the Oversight Board, an independent body set up in 2020 and tasked with reviewing content moderation decisions and providing guidance. The sweeping changes announced by Meta raise questions about the Board’s role moving forward.
Helle Thorning-Schmidt, one of four co-chairs of the Board and a former Prime Minister of Denmark, told the BBC in a recent interview that she is “very concerned” about the loosening of content standards and the broader implications for how content is moderated on Meta platforms.
The Oversight Board has been a key advocate for more transparent and principled decision-making in content governance, but these new policies appear to diminish its influence.
Political Expediency and a Global Context
The political undertones of Meta’s decisions are hard to ignore, particularly as the company positions itself for the second Trump presidency.
The New York Times described Meta’s recent actions as a “MAGA makeover,” noting Zuckerberg’s pragmatism in aligning with shifting political winds. While this alignment might secure short-term peace, it risks alienating users and advertisers alike, particularly those seeking a safer, more inclusive online experience.
This political positioning is not confined to the US. Meta’s policy changes echo broader societal shifts, according to CNN. Wired notes: “By abandoning fact-checkers and loosening its Hateful Conduct policy, Meta has made clear the future it wants for its platforms.”
However, these changes will not be without resistance outside the US. For example, if Meta wanted to implement its moderation changes in European Union countries, it would face stricter oversight. Under the EU Digital Services Act, platforms must submit risk assessments to the European Commission before making such changes, reports Euronews.
A Personal Turning Point
As I reflect on my own journey with Meta platforms, I find myself at a crossroads. I’ve enjoyed using Threads, Instagram, WhatsApp and Facebook to connect with others, but recent revelations have forced me to question whether these platforms align with my values. I went through a similar reckoning with X in 2023 – a platform I joined in 2006, soon after Twitter’s founding – and ultimately stopped posting there.
On Threads, I shared some initial thoughts about this dilemma, acknowledging the confusion I feel about balancing the benefits of these services with the growing concerns about their impact.

Exploring Alternatives: Bluesky and Beyond
In contrast to Threads, platforms like Bluesky offer a refreshing perspective. I joined Bluesky in May 2023 during its closed beta and have increasingly focused my attention there. As Mark Cuban recently noted, Bluesky stands out as “the only moderated social media platform with tools that allow the user to control their own experience.”
Its commitment to decentralisation and user empowerment feels like a step in the right direction, particularly as other networks grapple with fragmentation and instability.
Bluesky’s distinctive features – letting users subscribe to independent moderation services and choose the algorithms that rank their feeds – give people tangible ways to curate their own online experience, as the sketch below illustrates. This decentralised approach has already helped defuse contentious moderation disputes, fostering healthier dialogue than more centralised platforms typically manage.
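To make that concrete, here is a minimal sketch of how a developer (or a curious user with an app password) might pull posts ranked by a community-built feed generator rather than by Bluesky itself, using the @atproto/api TypeScript client. The feed URI, handle and environment variable are placeholders of my own, and the call shapes reflect my understanding of the current SDK rather than anything Bluesky prescribes.

```typescript
// A sketch only: the feed URI and handle below are placeholders, not real identifiers.
import { BskyAgent } from '@atproto/api'

// Any community-built feed generator can be plugged in here; its author, not
// Bluesky, decides how posts are selected and ranked.
const FEED_URI = 'at://did:plc:example/app.bsky.feed.generator/community-feed'

async function main(): Promise<void> {
  const agent = new BskyAgent({ service: 'https://bsky.social' })

  // Log in with an app password (never the main account password).
  await agent.login({
    identifier: 'alice.example.com',
    password: process.env.BSKY_APP_PASSWORD ?? '',
  })

  // Ask the chosen feed generator to return its view of the timeline.
  const res = await agent.app.bsky.feed.getFeed({ feed: FEED_URI, limit: 10 })

  // Print the handle and text of each post the independent algorithm surfaced.
  for (const item of res.data.feed) {
    const record = item.post.record as { text?: string }
    console.log(`@${item.post.author.handle}: ${record.text ?? ''}`)
  }
}

main().catch((err) => {
  console.error(err)
  process.exit(1)
})
```

The point is less the specific calls than the design: because feed selection and labelling live outside the core service, swapping in a different “algorithm” is a user choice rather than a platform decision.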
While Bluesky is not without its challenges, it represents a promising alternative in a landscape desperate for innovation and trustworthiness. Its growth following the US presidential election outcome last November underscores its potential to foster meaningful dialogue in a polarised world.
Platforms must invest in features that empower users to customise their experiences, such as independent moderation tools and user-defined algorithms. At the same time, smaller platforms like Bluesky must address the challenges that come with rapid growth while staying true to their core values.
The Road Ahead
As Meta doubles down on its controversial strategies, the broader social media ecosystem is facing a moment of reckoning. Decentralisation and user empowerment are no longer abstract ideals but critical components of a sustainable future for online platforms.
As individuals, our actions collectively shape the platforms we use. Transitioning to alternatives, advocating for transparency, and demanding accountability send powerful signals to Big Tech.
Recent history provides clear examples of user-driven change.
In 2007, widespread protests over privacy forced Facebook to rein in its Beacon advertising programme – first making it opt-in and eventually shutting it down in 2009 – and in 2010, similar backlash prompted a revision of its privacy settings to give users greater control.
Research also supports the impact of collective action. A study in the International Journal of Communication highlights how social media activism can drive organisational change, illustrating the strength of user advocacy.

And in 2020, the #StopHateForProfit campaign, organised by civil-rights groups, saw hundreds of advertisers pause their spending on Facebook alongside user boycotts, leading to increased scrutiny of the platform’s content moderation policies. The movement underscores how coordinated action can pressure platforms to reconsider their practices.
This article is a snapshot in time, reflecting on a pivotal moment of profound change in the social media landscape and the role users play in shaping its future. The story is still unfolding, and its long-term impact remains uncertain.
I encourage you to reflect on your own social media usage, consider alternative platforms, and prepare for the possibility of rapid changes. If you maintain a presence on Facebook, Instagram, and Threads, consider tailored exit strategies for each. Planning ahead will help ensure a smoother transition should the need arise.
Together, we have the power to shape the future of online discourse. We mustn’t leave it to Big Tech and the prevailing political climate; it requires active choices and collective action from all of us.
Additional Reading:
- Zuckerberg urges Trump to stop the EU from fining US tech companies (Politico Europe, 11 January 2025)
- Inside Mark Zuckerberg’s Sprint to Remake Meta for the Trump Era (New York Times, 10 January 2025)
- What Meta’s move to community moderation could mean for misinformation (The Conversation UK, 10 January 2025)
- The X-ification of Meta (Wired, 8 January 2025)
- Mastodon CEO calls Meta’s moderation changes ‘deeply troubling,’ warns users cross-posting from Threads (TechCrunch, 8 January 2025)
- The Good, The Bad, And The Stupid In Meta’s New Content Moderation Policies (Techdirt, 8 January 2025)