HOT TAKE: Content Moderation
There are a lot of hot takes on Meta’s decision to end fact-checking and further reduce content moderation. Here is my perspective as someone recruited by Meta in the wake of Cambridge Analytica to improve many things, especially how the company manages content-based risks to users.
1. It’s largely a safe move for Meta: they successfully navigated the year of elections with no big issues, confirming their decision to significantly reduce integrity teams. Advertisers want their ads in a moderated space, but since pretty much nowhere is moderated anymore, and since #RIPTwitter, there is no real advertising competition. People are addicted to the product.
2. BlueSky is going to have another big day. Check out The PostCard App as well for private photo/video sharing and encrypted messaging within a community of your choosing, https://lnkd.in/g9WVtiAm cc: Adam Voll
3. Because tech is ubiquitous in our daily lives, functioning like a public service, we tend to think of it in a vein similar to that of government. But corporations never have the same level of accountability as a government does. They can reverse decisions in an instant if there isn’t a regulatory or operational obstacle. So, while this is a big decision, it’s also a pendulum swing. Content moderation will continue to be a dynamic space in the years ahead.
COMING SOON: Over the past year I conducted independent research on global perspectives on content moderation, with the report out in a few weeks.
TL;DR: people want moderation even when they say they don’t, and they deeply distrust social media companies to be transparent or make good decisions.
People who identify as African want significantly more moderation, which matters for many reasons, not least that by 2050 one quarter of the world's population will be African.
Respondents from the Middle East and North Africa are highly polarized on the subject. Interestingly, Europeans don’t want quite as much moderation as one would imagine given the regulatory state: they use social media for their business needs, much less for news, and are less attached in general.
Content moderation will continue to evolve; today is not the end. The reality is that the reasons content moderation and fact-checking were established haven’t gone away. The challenges of online harm and violence will have to be managed one way or another. Some will tune out. Others will be impacted by the real-world harms amplified by social media. Unfortunately, we sometimes have to learn the lesson the hard way.